Dec 16 06:55:20 crc systemd[1]: Starting Kubernetes Kubelet...
Dec 16 06:55:20 crc restorecon[4813]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 16 06:55:20 crc restorecon[4813]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Dec 16 06:55:20 crc restorecon[4813]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:20 crc 
restorecon[4813]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 16 06:55:20 crc restorecon[4813]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 16 06:55:20 crc restorecon[4813]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 16 06:55:20 crc restorecon[4813]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 16 06:55:20 crc 
restorecon[4813]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 16 
06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 16 06:55:20 crc restorecon[4813]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 16 06:55:20 crc 
restorecon[4813]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 16 06:55:20 crc restorecon[4813]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 16 06:55:20 crc restorecon[4813]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 16 06:55:20 crc restorecon[4813]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 16 06:55:20 crc 
restorecon[4813]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 16 06:55:20 crc restorecon[4813]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:20 crc restorecon[4813]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:20
crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:20 crc restorecon[4813]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:20 crc restorecon[4813]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Dec 16 06:55:20 crc restorecon[4813]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 16 06:55:20 crc restorecon[4813]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 16 06:55:20 crc restorecon[4813]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 16 06:55:20 crc restorecon[4813]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 
06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 06:55:20 crc restorecon[4813]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 06:55:20 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 06:55:21 crc restorecon[4813]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 16 06:55:21 crc 
restorecon[4813]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc 
restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc 
restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:55:21 crc restorecon[4813]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 16 06:55:21 crc restorecon[4813]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 16 06:55:21 crc restorecon[4813]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 16 06:55:21 crc restorecon[4813]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 16 06:55:21 crc restorecon[4813]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 
crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc 
restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc 
restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc 
restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 16 06:55:21 crc restorecon[4813]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 16 06:55:21 crc 
restorecon[4813]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 16 06:55:21 crc restorecon[4813]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 16 06:55:21 crc restorecon[4813]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 16 06:55:21 crc restorecon[4813]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0
Dec 16 06:55:21 crc kubenswrapper[4823]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Dec 16 06:55:21 crc kubenswrapper[4823]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Dec 16 06:55:21 crc kubenswrapper[4823]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Dec 16 06:55:21 crc kubenswrapper[4823]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Dec 16 06:55:21 crc kubenswrapper[4823]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Dec 16 06:55:21 crc kubenswrapper[4823]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.610719 4823 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.613881 4823 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.613911 4823 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.613917 4823 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.613922 4823 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.613927 4823 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.613933 4823 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.613938 4823 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.613942 4823 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.613947 4823 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.613952 4823 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.613957 4823 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.613961 4823 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.613965 4823 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.613973 4823 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.613978 4823 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.613982 4823 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.613986 4823 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.613990 4823 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.613994 4823 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.613999 4823 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.614003 4823 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.614007 4823 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.614010 4823 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.614014 4823 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.614034 4823 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.614038 4823 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.614044 4823 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.614048 4823 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.614052 4823 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.614056 4823 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.614063 4823 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.614069 4823 feature_gate.go:330] unrecognized feature gate: OVNObservability
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.614073 4823 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.614079 4823 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.614084 4823 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.614090 4823 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.614095 4823 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.614101 4823 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.614107 4823 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.614116 4823 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.614120 4823 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.614125 4823 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.614130 4823 feature_gate.go:330] unrecognized feature gate: Example
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.614135 4823 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.614140 4823 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.614144 4823 feature_gate.go:330] unrecognized feature gate: NewOLM
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.614149 4823 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.614154 4823 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.614162 4823 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.614167 4823 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.614175 4823 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.614218 4823 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.614223 4823 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.614226 4823 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.614232 4823 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.614238 4823 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.614243 4823 feature_gate.go:330] unrecognized feature gate: SignatureStores
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.614249 4823 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.614252 4823 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.614256 4823 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.614260 4823 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.614266 4823 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.614279 4823 feature_gate.go:351] 
Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.614284 4823 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.614288 4823 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.614292 4823 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.614296 4823 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.614300 4823 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.614303 4823 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.614307 4823 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.614311 4823 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.614639 4823 flags.go:64] FLAG: --address="0.0.0.0" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.614651 4823 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.614664 4823 flags.go:64] FLAG: --anonymous-auth="true" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.614670 4823 flags.go:64] FLAG: --application-metrics-count-limit="100" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.614676 4823 flags.go:64] FLAG: --authentication-token-webhook="false" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.614681 4823 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.614688 4823 flags.go:64] FLAG: 
--authorization-mode="AlwaysAllow" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.614701 4823 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.614707 4823 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.614712 4823 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.614717 4823 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.614722 4823 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.614727 4823 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.614734 4823 flags.go:64] FLAG: --cgroup-root="" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.614739 4823 flags.go:64] FLAG: --cgroups-per-qos="true" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.614743 4823 flags.go:64] FLAG: --client-ca-file="" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.614752 4823 flags.go:64] FLAG: --cloud-config="" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.614757 4823 flags.go:64] FLAG: --cloud-provider="" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.614762 4823 flags.go:64] FLAG: --cluster-dns="[]" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.614769 4823 flags.go:64] FLAG: --cluster-domain="" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.614774 4823 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.614778 4823 flags.go:64] FLAG: --config-dir="" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.614783 4823 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 
06:55:21.614788 4823 flags.go:64] FLAG: --container-log-max-files="5" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.614799 4823 flags.go:64] FLAG: --container-log-max-size="10Mi" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.614804 4823 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.614808 4823 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.614815 4823 flags.go:64] FLAG: --containerd-namespace="k8s.io" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.614820 4823 flags.go:64] FLAG: --contention-profiling="false" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.614824 4823 flags.go:64] FLAG: --cpu-cfs-quota="true" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.614829 4823 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.614908 4823 flags.go:64] FLAG: --cpu-manager-policy="none" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.614935 4823 flags.go:64] FLAG: --cpu-manager-policy-options="" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.614951 4823 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.614960 4823 flags.go:64] FLAG: --enable-controller-attach-detach="true" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.614969 4823 flags.go:64] FLAG: --enable-debugging-handlers="true" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.614977 4823 flags.go:64] FLAG: --enable-load-reader="false" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.614985 4823 flags.go:64] FLAG: --enable-server="true" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.614993 4823 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.615010 4823 flags.go:64] FLAG: --event-burst="100" Dec 
16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.615018 4823 flags.go:64] FLAG: --event-qps="50" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.615062 4823 flags.go:64] FLAG: --event-storage-age-limit="default=0" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.615070 4823 flags.go:64] FLAG: --event-storage-event-limit="default=0" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.615077 4823 flags.go:64] FLAG: --eviction-hard="" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.615108 4823 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.615677 4823 flags.go:64] FLAG: --eviction-minimum-reclaim="" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.615723 4823 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.615744 4823 flags.go:64] FLAG: --eviction-soft="" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.615760 4823 flags.go:64] FLAG: --eviction-soft-grace-period="" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.615772 4823 flags.go:64] FLAG: --exit-on-lock-contention="false" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.615785 4823 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.615796 4823 flags.go:64] FLAG: --experimental-mounter-path="" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.615807 4823 flags.go:64] FLAG: --fail-cgroupv1="false" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.615817 4823 flags.go:64] FLAG: --fail-swap-on="true" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.615826 4823 flags.go:64] FLAG: --feature-gates="" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.615842 4823 flags.go:64] FLAG: --file-check-frequency="20s" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.615853 4823 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" 
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.615866 4823 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.615876 4823 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.615889 4823 flags.go:64] FLAG: --healthz-port="10248"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.615901 4823 flags.go:64] FLAG: --help="false"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.615911 4823 flags.go:64] FLAG: --hostname-override=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.615920 4823 flags.go:64] FLAG: --housekeeping-interval="10s"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.615930 4823 flags.go:64] FLAG: --http-check-frequency="20s"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.615939 4823 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.615949 4823 flags.go:64] FLAG: --image-credential-provider-config=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.615959 4823 flags.go:64] FLAG: --image-gc-high-threshold="85"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.615972 4823 flags.go:64] FLAG: --image-gc-low-threshold="80"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.615982 4823 flags.go:64] FLAG: --image-service-endpoint=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.615992 4823 flags.go:64] FLAG: --kernel-memcg-notification="false"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.616002 4823 flags.go:64] FLAG: --kube-api-burst="100"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.616011 4823 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.616052 4823 flags.go:64] FLAG: --kube-api-qps="50"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.616063 4823 flags.go:64] FLAG: --kube-reserved=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.616072 4823 flags.go:64] FLAG: --kube-reserved-cgroup=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.616082 4823 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.616093 4823 flags.go:64] FLAG: --kubelet-cgroups=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.616103 4823 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.616112 4823 flags.go:64] FLAG: --lock-file=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.616122 4823 flags.go:64] FLAG: --log-cadvisor-usage="false"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.616133 4823 flags.go:64] FLAG: --log-flush-frequency="5s"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.616144 4823 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.616163 4823 flags.go:64] FLAG: --log-json-split-stream="false"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.616173 4823 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.616182 4823 flags.go:64] FLAG: --log-text-split-stream="false"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.616191 4823 flags.go:64] FLAG: --logging-format="text"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.616201 4823 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.616211 4823 flags.go:64] FLAG: --make-iptables-util-chains="true"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.616223 4823 flags.go:64] FLAG: --manifest-url=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.616233 4823 flags.go:64] FLAG: --manifest-url-header=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.616249 4823 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.616260 4823 flags.go:64] FLAG: --max-open-files="1000000"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.616273 4823 flags.go:64] FLAG: --max-pods="110"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.616282 4823 flags.go:64] FLAG: --maximum-dead-containers="-1"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.616292 4823 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.616301 4823 flags.go:64] FLAG: --memory-manager-policy="None"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.616314 4823 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.616334 4823 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.616355 4823 flags.go:64] FLAG: --node-ip="192.168.126.11"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.616368 4823 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.616415 4823 flags.go:64] FLAG: --node-status-max-images="50"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.616427 4823 flags.go:64] FLAG: --node-status-update-frequency="10s"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.616438 4823 flags.go:64] FLAG: --oom-score-adj="-999"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.616448 4823 flags.go:64] FLAG: --pod-cidr=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.616457 4823 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.616476 4823 flags.go:64] FLAG: --pod-manifest-path=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.616486 4823 flags.go:64] FLAG: --pod-max-pids="-1"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.616496 4823 flags.go:64] FLAG: --pods-per-core="0"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.616505 4823 flags.go:64] FLAG: --port="10250"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.616514 4823 flags.go:64] FLAG: --protect-kernel-defaults="false"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.616523 4823 flags.go:64] FLAG: --provider-id=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.616532 4823 flags.go:64] FLAG: --qos-reserved=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.616542 4823 flags.go:64] FLAG: --read-only-port="10255"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.616551 4823 flags.go:64] FLAG: --register-node="true"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.616561 4823 flags.go:64] FLAG: --register-schedulable="true"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.616570 4823 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.616590 4823 flags.go:64] FLAG: --registry-burst="10"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.616600 4823 flags.go:64] FLAG: --registry-qps="5"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.616609 4823 flags.go:64] FLAG: --reserved-cpus=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.616620 4823 flags.go:64] FLAG: --reserved-memory=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.616633 4823 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.616644 4823 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.616654 4823 flags.go:64] FLAG: --rotate-certificates="false"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.616663 4823 flags.go:64] FLAG: --rotate-server-certificates="false"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.616672 4823 flags.go:64] FLAG: --runonce="false"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.616681 4823 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.616691 4823 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.616703 4823 flags.go:64] FLAG: --seccomp-default="false"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.616712 4823 flags.go:64] FLAG: --serialize-image-pulls="true"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.616721 4823 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.616730 4823 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.616741 4823 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.616751 4823 flags.go:64] FLAG: --storage-driver-password="root"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.616760 4823 flags.go:64] FLAG: --storage-driver-secure="false"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.616769 4823 flags.go:64] FLAG: --storage-driver-table="stats"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.616778 4823 flags.go:64] FLAG: --storage-driver-user="root"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.616787 4823 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.616797 4823 flags.go:64] FLAG: --sync-frequency="1m0s"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.616807 4823 flags.go:64] FLAG: --system-cgroups=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.616817 4823 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.616832 4823 flags.go:64] FLAG: --system-reserved-cgroup=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.616844 4823 flags.go:64] FLAG: --tls-cert-file=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.616854 4823 flags.go:64] FLAG: --tls-cipher-suites="[]"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.616871 4823 flags.go:64] FLAG: --tls-min-version=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.616881 4823 flags.go:64] FLAG: --tls-private-key-file=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.616892 4823 flags.go:64] FLAG: --topology-manager-policy="none"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.616902 4823 flags.go:64] FLAG: --topology-manager-policy-options=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.616911 4823 flags.go:64] FLAG: --topology-manager-scope="container"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.616921 4823 flags.go:64] FLAG: --v="2"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.616935 4823 flags.go:64] FLAG: --version="false"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.616948 4823 flags.go:64] FLAG: --vmodule=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.616960 4823 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.616969 4823 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.617344 4823 feature_gate.go:330] unrecognized feature gate: NewOLM
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.617357 4823 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.617369 4823 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.617377 4823 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.617386 4823 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.617394 4823 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.617402 4823 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.617412 4823 feature_gate.go:330] unrecognized feature gate: Example
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.617420 4823 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.617428 4823 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.617435 4823 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.617444 4823 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.617451 4823 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.617460 4823 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.617469 4823 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.617477 4823 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.617486 4823 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.617494 4823 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.617503 4823 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.617511 4823 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.617519 4823 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.617527 4823 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.617535 4823 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.617542 4823 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.617553 4823 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.617561 4823 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.617568 4823 feature_gate.go:330] unrecognized feature gate: SignatureStores
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.617577 4823 feature_gate.go:330] unrecognized feature gate: PinnedImages
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.617584 4823 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.617592 4823 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.617600 4823 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.617607 4823 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.617618 4823 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.617626 4823 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.617633 4823 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.617641 4823 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.617652 4823 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.617664 4823 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.617675 4823 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.617684 4823 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.617693 4823 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.617701 4823 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.617710 4823 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.617718 4823 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.617726 4823 feature_gate.go:330] unrecognized feature gate: OVNObservability
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.617734 4823 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.617742 4823 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.617750 4823 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.617757 4823 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.617765 4823 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.617773 4823 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.617781 4823 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.617789 4823 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.617797 4823 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.617804 4823 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.617812 4823 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.617820 4823 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.617828 4823 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.617836 4823 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.617844 4823 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.617852 4823 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.617860 4823 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.617872 4823 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.617884 4823 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.617897 4823 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.617916 4823 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.617935 4823 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.617946 4823 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.617957 4823 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.617967 4823 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.617976 4823 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.618049 4823 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.627283 4823 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.627325 4823 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.627416 4823 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.627427 4823 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.627434 4823 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.627439 4823 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.627444 4823 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.627449 4823 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.627453 4823 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.627459 4823 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.627463 4823 feature_gate.go:330] unrecognized feature gate: PinnedImages
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.627469 4823 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.627473 4823 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.627478 4823 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.627483 4823 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.627488 4823 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.627492 4823 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.627496 4823 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.627500 4823 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.627503 4823 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.627507 4823 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.627512 4823 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.627516 4823 feature_gate.go:330] unrecognized feature gate: NewOLM
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.627520 4823 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.627524 4823 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.627528 4823 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.627532 4823 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.627536 4823 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.627539 4823 feature_gate.go:330]
unrecognized feature gate: NetworkDiagnosticsConfig Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.627543 4823 feature_gate.go:330] unrecognized feature gate: Example Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.627547 4823 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.627554 4823 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.627558 4823 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.627562 4823 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.627566 4823 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.627570 4823 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.627574 4823 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.627580 4823 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.627584 4823 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.627588 4823 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.627592 4823 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.627597 4823 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.627602 4823 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.627607 4823 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.627612 4823 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.627616 4823 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.627620 4823 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.627624 4823 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.627628 4823 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.627632 4823 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.627637 4823 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.627641 4823 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.627646 4823 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.627652 4823 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.627657 4823 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.627662 4823 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.627666 4823 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.627670 4823 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.627675 4823 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.627679 4823 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.627683 4823 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.627686 4823 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.627690 4823 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.627694 4823 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.627698 4823 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.627701 4823 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.627705 4823 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.627710 4823 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.627714 4823 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.627717 4823 
feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.627721 4823 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.627725 4823 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.627729 4823 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.627736 4823 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.627886 4823 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.627892 4823 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.627896 4823 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.627901 4823 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.627905 4823 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.627908 4823 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.627914 4823 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 16 06:55:21 crc 
kubenswrapper[4823]: W1216 06:55:21.627917 4823 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.627923 4823 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.627927 4823 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.627931 4823 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.627935 4823 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.627939 4823 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.627943 4823 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.627947 4823 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.627951 4823 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.627955 4823 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.627960 4823 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.627965 4823 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.627968 4823 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.627972 4823 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.627976 4823 feature_gate.go:330] unrecognized 
feature gate: IngressControllerDynamicConfigurationManager Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.627980 4823 feature_gate.go:330] unrecognized feature gate: Example Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.627984 4823 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.627988 4823 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.627992 4823 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.627996 4823 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.628000 4823 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.628004 4823 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.628009 4823 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.628013 4823 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.628017 4823 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.628046 4823 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.628052 4823 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.628056 4823 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.628060 4823 feature_gate.go:330] unrecognized feature 
gate: ClusterAPIInstallIBMCloud Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.628064 4823 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.628069 4823 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.628073 4823 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.628077 4823 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.628081 4823 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.628086 4823 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.628091 4823 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.628096 4823 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.628100 4823 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.628105 4823 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.628109 4823 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.628113 4823 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.628118 4823 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.628122 4823 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.628127 4823 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.628131 4823 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.628136 4823 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.628140 4823 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.628145 4823 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.628150 4823 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.628155 4823 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.628159 4823 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.628163 4823 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.628167 4823 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.628171 4823 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.628174 4823 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.628178 4823 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.628182 4823 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.628187 4823 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.628192 4823 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.628196 4823 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.628199 4823 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.628203 4823 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.628207 4823 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.628210 4823 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.628217 4823 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.628404 4823 server.go:940] "Client rotation is on, will bootstrap in background" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.631123 4823 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.631212 4823 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.631774 4823 server.go:997] "Starting client certificate rotation" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.631798 4823 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.632284 4823 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2026-01-09 01:00:44.707036725 +0000 UTC Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.632427 4823 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 570h5m23.07461467s for next certificate rotation Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.636513 4823 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.643469 4823 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.654167 4823 log.go:25] "Validated CRI v1 runtime API" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.667510 4823 log.go:25] "Validated CRI v1 image API" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.669290 4823 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.672463 4823 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-12-16-06-46-34-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.672501 4823 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 
blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.693558 4823 manager.go:217] Machine: {Timestamp:2025-12-16 06:55:21.691381066 +0000 UTC m=+0.179947229 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654124544 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:b35231f6-d02a-487d-8117-57547d768cbe BootID:2caa91d7-bd83-4de0-9038-0514886c6d71 Filesystems:[{Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108169 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827060224 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:d3:b2:eb Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:d3:b2:eb Speed:-1 
Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:25:f7:d0 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:57:72:ff Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:f9:eb:4c Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:48:bc:87 Speed:-1 Mtu:1496} {Name:ens7.23 MacAddress:52:54:00:ea:87:7d Speed:-1 Mtu:1496} {Name:ens7.44 MacAddress:52:54:00:11:c2:b4 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:e6:b7:dd:6e:df:3b Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:72:df:c8:c8:41:46 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654124544 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified 
Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.694667 4823 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.695061 4823 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.695960 4823 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.696194 4823 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.696249 4823 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.696473 4823 topology_manager.go:138] "Creating topology manager with none policy"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.696483 4823 container_manager_linux.go:303] "Creating device plugin manager"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.696685 4823 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.696723 4823 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.697119 4823 state_mem.go:36] "Initialized new in-memory state store"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.697573 4823 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.698214 4823 kubelet.go:418] "Attempting to sync node with API server"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.698234 4823 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.698259 4823 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.698275 4823 kubelet.go:324] "Adding apiserver pod source"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.698287 4823 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.700635 4823 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.700989 4823 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.701839 4823 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.702442 4823 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.702466 4823 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.702474 4823 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.702481 4823 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.702492 4823 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.702500 4823 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.702508 4823 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.702519 4823 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.702528 4823 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.702539 4823 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.702548 4823 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.702556 4823 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.702712 4823 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.702940 4823 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.180:6443: connect: connection refused
Dec 16 06:55:21 crc kubenswrapper[4823]: E1216 06:55:21.703014 4823 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.180:6443: connect: connection refused" logger="UnhandledError"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.703139 4823 server.go:1280] "Started kubelet"
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.703314 4823 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.180:6443: connect: connection refused
Dec 16 06:55:21 crc kubenswrapper[4823]: E1216 06:55:21.703350 4823 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.180:6443: connect: connection refused" logger="UnhandledError"
Dec 16 06:55:21 crc systemd[1]: Started Kubernetes Kubelet.
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.706182 4823 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.706966 4823 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.707332 4823 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.714253 4823 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.714273 4823 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.180:6443: connect: connection refused
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.714329 4823 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.714751 4823 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-02 05:57:18.720168013 +0000 UTC
Dec 16 06:55:21 crc kubenswrapper[4823]: E1216 06:55:21.714916 4823 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.715034 4823 volume_manager.go:287] "The desired_state_of_world populator starts"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.715060 4823 volume_manager.go:289] "Starting Kubelet Volume Manager"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.715128 4823 server.go:460] "Adding debug handlers to kubelet server"
Dec 16 06:55:21 crc kubenswrapper[4823]: E1216 06:55:21.715090 4823 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.180:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.18819fb3624cf80e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-16 06:55:21.703118862 +0000 UTC m=+0.191684985,LastTimestamp:2025-12-16 06:55:21.703118862 +0000 UTC m=+0.191684985,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.715483 4823 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.715907 4823 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.180:6443: connect: connection refused
Dec 16 06:55:21 crc kubenswrapper[4823]: E1216 06:55:21.716003 4823 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.180:6443: connect: connection refused" logger="UnhandledError"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.717241 4823 factory.go:55] Registering systemd factory
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.717306 4823 factory.go:221] Registration of the systemd container factory successfully
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.718325 4823 factory.go:153] Registering CRI-O factory
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.718353 4823 factory.go:221] Registration of the crio container factory successfully
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.718441 4823 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.718472 4823 factory.go:103] Registering Raw factory
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.718493 4823 manager.go:1196] Started watching for new ooms in manager
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.723912 4823 manager.go:319] Starting recovery of all containers
Dec 16 06:55:21 crc kubenswrapper[4823]: E1216 06:55:21.724196 4823 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.180:6443: connect: connection refused" interval="200ms"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.727830 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.727880 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.727892 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.727949 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.727960 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.727972 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.727981 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.727992 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.728005 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.728016 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.728040 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.728051 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.728060 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.728076 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.728091 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.728104 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.728115 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.728127 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.728136 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.728147 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.728159 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.728169 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.728180 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.728190 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.728203 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.728213 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.728251 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.728262 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.728273 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.728285 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.728296 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.728306 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.728319 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.728331 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.728340 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.728351 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.728362 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.728372 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.728382 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.728394 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.728404 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.728414 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.728425 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.728439 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.728449 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.728460 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.728471 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.728483 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.728493 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.728504 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.728515 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.728526 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.728542 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.728553 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.728566 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.728577 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.728588 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.728597 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.728608 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.728618 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.728629 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.728641 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.728652 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.728664 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.728675 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.728685 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.728695 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.728707 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.728719 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.728730 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.728740 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.728750 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.728761 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.728772 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.728782 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.728795 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.728805 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.728817 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.728828 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.728839 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.728850 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.728861 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.728871 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.728882 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.728893 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.728934 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.728946 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.728958 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls"
seLinuxMountContext="" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.728968 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.728980 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.728991 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.729003 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.729013 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.729037 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 
06:55:21.729048 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.729057 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.729066 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.729078 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.729090 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.729100 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.729109 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.729118 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.729128 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.729137 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.729152 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.729162 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.729175 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" 
volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.729191 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.729201 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.729212 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.729731 4823 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.729754 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.729765 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.729776 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.729787 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.729796 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.729804 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.729813 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.729822 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" 
volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.729831 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.729839 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.729847 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.729855 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.729866 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.729875 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" 
volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.729884 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.729895 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.729904 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.729921 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.729931 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.729940 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.729949 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.729958 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.729969 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.729979 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.729987 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.729999 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" 
volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.730009 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.730032 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.730045 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.730055 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.730065 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.730079 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" 
volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.730093 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.730108 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.730122 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.730132 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.730142 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.730151 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" 
volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.730162 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.730174 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.730184 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.730196 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.730207 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.730219 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" 
volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.730232 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.730243 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.730252 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.730263 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.730275 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.730285 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" 
seLinuxMountContext="" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.730295 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.730305 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.730313 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.730323 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.730331 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.730340 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 
06:55:21.730349 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.730360 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.730369 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.730378 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.730388 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.730398 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.730408 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.730419 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.730427 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.730436 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.730445 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.730454 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.730462 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.730471 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.730513 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.730525 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.730535 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.730546 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.730557 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.730571 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.730580 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.730593 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.730607 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.730623 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.730635 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.730648 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.730661 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.730673 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.730693 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.730707 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.730719 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.730736 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.730749 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.730764 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.730778 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.730815 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.730833 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.730848 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.730863 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.730882 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.730896 4823 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext=""
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.730910 4823 reconstruct.go:97] "Volume reconstruction finished"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.730920 4823 reconciler.go:26] "Reconciler: start to sync state"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.740836 4823 manager.go:324] Recovery completed
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.761879 4823 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.764582 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.764650 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.764664 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.765903 4823 cpu_manager.go:225] "Starting CPU manager" policy="none"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.765926 4823 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.765946 4823 state_mem.go:36] "Initialized new in-memory state store"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.767301 4823 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.770236 4823 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.770302 4823 status_manager.go:217] "Starting to sync pod status with apiserver"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.770347 4823 kubelet.go:2335] "Starting kubelet main sync loop"
Dec 16 06:55:21 crc kubenswrapper[4823]: E1216 06:55:21.770415 4823 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Dec 16 06:55:21 crc kubenswrapper[4823]: W1216 06:55:21.783491 4823 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.180:6443: connect: connection refused
Dec 16 06:55:21 crc kubenswrapper[4823]: E1216 06:55:21.783620 4823 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.180:6443: connect: connection refused" logger="UnhandledError"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.784220 4823 policy_none.go:49] "None policy: Start"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.785545 4823 memory_manager.go:170] "Starting memorymanager" policy="None"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.785601 4823 state_mem.go:35] "Initializing new in-memory state store"
Dec 16 06:55:21 crc kubenswrapper[4823]: E1216 06:55:21.815067 4823 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.844489 4823 manager.go:334] "Starting Device Plugin manager"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.844568 4823 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.844581 4823 server.go:79] "Starting device plugin registration server"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.845150 4823 eviction_manager.go:189] "Eviction manager: starting control loop"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.845171 4823 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.845428 4823 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.845585 4823 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.845598 4823 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Dec 16 06:55:21 crc kubenswrapper[4823]: E1216 06:55:21.852464 4823 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.870831 4823 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc"]
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.870956 4823 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.872518 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.872558 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.872570 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.872690 4823 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.873219 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.873280 4823 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.873597 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.873658 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.873671 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.873841 4823 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.874055 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.874116 4823 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.874357 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.874381 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.874395 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.875282 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.875304 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.875315 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.875372 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.875394 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.875406 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.875562 4823 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.875752 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.875844 4823 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.876574 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.876662 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.876724 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.876888 4823 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.877017 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.877058 4823 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.877506 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.877579 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.877634 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.877909 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.877931 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.878050 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.878060 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.878156 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.878241 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.878496 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.878573 4823 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.879259 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.879278 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.879288 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 06:55:21 crc kubenswrapper[4823]: E1216 06:55:21.924934 4823 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.180:6443: connect: connection refused" interval="400ms"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.933244 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.933292 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.933321 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.933347 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.933373 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.933395 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.933419 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.933459 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.933496 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.933527 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.933552 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.933576 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.933597 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.933620 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.933642 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.945496 4823 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.946894 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.946927 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.946940 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 06:55:21 crc kubenswrapper[4823]: I1216 06:55:21.946979 4823 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Dec 16 06:55:21 crc kubenswrapper[4823]: E1216 06:55:21.947553 4823 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.180:6443: connect: connection refused" node="crc"
Dec 16 06:55:22 crc kubenswrapper[4823]: I1216 06:55:22.034893 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 16 06:55:22 crc kubenswrapper[4823]: I1216 06:55:22.034973 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 16 06:55:22 crc kubenswrapper[4823]: I1216 06:55:22.034998 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 16 06:55:22 crc kubenswrapper[4823]: I1216 06:55:22.035039 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Dec 16 06:55:22 crc kubenswrapper[4823]: I1216 06:55:22.035065 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 16 06:55:22 crc kubenswrapper[4823]: I1216 06:55:22.035086 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 16 06:55:22 crc kubenswrapper[4823]: I1216 06:55:22.035109 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 16 06:55:22 crc kubenswrapper[4823]: I1216 06:55:22.035132 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 16 06:55:22 crc kubenswrapper[4823]: I1216 06:55:22.035164 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 16 06:55:22 crc kubenswrapper[4823]: I1216 06:55:22.035187 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Dec 16 06:55:22 crc kubenswrapper[4823]: I1216 06:55:22.035211 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Dec 16 06:55:22 crc kubenswrapper[4823]: I1216 06:55:22.035223 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 16 06:55:22 crc kubenswrapper[4823]: I1216 06:55:22.035310 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 16 06:55:22 crc kubenswrapper[4823]: I1216 06:55:22.035260 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 16 06:55:22 crc kubenswrapper[4823]: I1216 06:55:22.035391 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 16 06:55:22 crc kubenswrapper[4823]: I1216 06:55:22.035379 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 16 06:55:22 crc kubenswrapper[4823]: I1216 06:55:22.035271 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Dec 16 06:55:22 crc kubenswrapper[4823]: I1216 06:55:22.035322 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 16 06:55:22 crc kubenswrapper[4823]: I1216 06:55:22.035420 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 16 06:55:22 crc kubenswrapper[4823]: I1216 06:55:22.035421 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Dec 16 06:55:22 crc kubenswrapper[4823]: I1216 06:55:22.035428 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 16 06:55:22 crc kubenswrapper[4823]: I1216 06:55:22.035440 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName:
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 16 06:55:22 crc kubenswrapper[4823]: I1216 06:55:22.035383 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 16 06:55:22 crc kubenswrapper[4823]: I1216 06:55:22.035347 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 16 06:55:22 crc kubenswrapper[4823]: I1216 06:55:22.035501 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 16 06:55:22 crc kubenswrapper[4823]: I1216 06:55:22.035544 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 16 06:55:22 crc kubenswrapper[4823]: I1216 06:55:22.035513 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 16 06:55:22 crc kubenswrapper[4823]: I1216 06:55:22.035546 
4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 16 06:55:22 crc kubenswrapper[4823]: I1216 06:55:22.035589 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 16 06:55:22 crc kubenswrapper[4823]: I1216 06:55:22.035514 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 06:55:22 crc kubenswrapper[4823]: I1216 06:55:22.148716 4823 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 06:55:22 crc kubenswrapper[4823]: I1216 06:55:22.149900 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:22 crc kubenswrapper[4823]: I1216 06:55:22.149945 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:22 crc kubenswrapper[4823]: I1216 06:55:22.149958 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:22 crc kubenswrapper[4823]: I1216 06:55:22.149989 4823 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 16 06:55:22 crc kubenswrapper[4823]: E1216 06:55:22.150580 4823 kubelet_node_status.go:99] "Unable to register node with API server" err="Post 
\"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.180:6443: connect: connection refused" node="crc" Dec 16 06:55:22 crc kubenswrapper[4823]: I1216 06:55:22.204168 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 16 06:55:22 crc kubenswrapper[4823]: I1216 06:55:22.211509 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Dec 16 06:55:22 crc kubenswrapper[4823]: I1216 06:55:22.233822 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 16 06:55:22 crc kubenswrapper[4823]: W1216 06:55:22.235136 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-d1bc41cc1d96d85fa2a7ee4ace616a7a305f197916c79d97bb9905132db79e52 WatchSource:0}: Error finding container d1bc41cc1d96d85fa2a7ee4ace616a7a305f197916c79d97bb9905132db79e52: Status 404 returned error can't find the container with id d1bc41cc1d96d85fa2a7ee4ace616a7a305f197916c79d97bb9905132db79e52 Dec 16 06:55:22 crc kubenswrapper[4823]: W1216 06:55:22.236528 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-e07ace9f0028f648ee8457cfd65a16c7d2cd6b3e80041cdd5d15c87f431eda7e WatchSource:0}: Error finding container e07ace9f0028f648ee8457cfd65a16c7d2cd6b3e80041cdd5d15c87f431eda7e: Status 404 returned error can't find the container with id e07ace9f0028f648ee8457cfd65a16c7d2cd6b3e80041cdd5d15c87f431eda7e Dec 16 06:55:22 crc kubenswrapper[4823]: I1216 06:55:22.244315 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 06:55:22 crc kubenswrapper[4823]: W1216 06:55:22.247892 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-0b80f3a99b88c9f623607c76424edce49aa8a9137cc5e229ec3d7c8eb993604e WatchSource:0}: Error finding container 0b80f3a99b88c9f623607c76424edce49aa8a9137cc5e229ec3d7c8eb993604e: Status 404 returned error can't find the container with id 0b80f3a99b88c9f623607c76424edce49aa8a9137cc5e229ec3d7c8eb993604e Dec 16 06:55:22 crc kubenswrapper[4823]: I1216 06:55:22.251142 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 16 06:55:22 crc kubenswrapper[4823]: W1216 06:55:22.287205 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-7c3e5377b263ad1e6545ef38d0fad4fad62bac6f3e805c284f584ffaedd6a578 WatchSource:0}: Error finding container 7c3e5377b263ad1e6545ef38d0fad4fad62bac6f3e805c284f584ffaedd6a578: Status 404 returned error can't find the container with id 7c3e5377b263ad1e6545ef38d0fad4fad62bac6f3e805c284f584ffaedd6a578 Dec 16 06:55:22 crc kubenswrapper[4823]: E1216 06:55:22.326061 4823 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.180:6443: connect: connection refused" interval="800ms" Dec 16 06:55:22 crc kubenswrapper[4823]: I1216 06:55:22.551071 4823 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 06:55:22 crc kubenswrapper[4823]: I1216 06:55:22.552777 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 16 06:55:22 crc kubenswrapper[4823]: I1216 06:55:22.552812 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:22 crc kubenswrapper[4823]: I1216 06:55:22.552822 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:22 crc kubenswrapper[4823]: I1216 06:55:22.552846 4823 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 16 06:55:22 crc kubenswrapper[4823]: E1216 06:55:22.553372 4823 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.180:6443: connect: connection refused" node="crc" Dec 16 06:55:22 crc kubenswrapper[4823]: W1216 06:55:22.622545 4823 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.180:6443: connect: connection refused Dec 16 06:55:22 crc kubenswrapper[4823]: E1216 06:55:22.622676 4823 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.180:6443: connect: connection refused" logger="UnhandledError" Dec 16 06:55:22 crc kubenswrapper[4823]: I1216 06:55:22.714995 4823 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 06:09:34.649634838 +0000 UTC Dec 16 06:55:22 crc kubenswrapper[4823]: I1216 06:55:22.715517 4823 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 767h14m11.934127692s for next certificate rotation Dec 16 06:55:22 crc kubenswrapper[4823]: I1216 06:55:22.715334 4823 
csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.180:6443: connect: connection refused Dec 16 06:55:22 crc kubenswrapper[4823]: W1216 06:55:22.721640 4823 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.180:6443: connect: connection refused Dec 16 06:55:22 crc kubenswrapper[4823]: E1216 06:55:22.721755 4823 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.180:6443: connect: connection refused" logger="UnhandledError" Dec 16 06:55:22 crc kubenswrapper[4823]: I1216 06:55:22.776008 4823 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="31d129a77e2eea43c6bd7305f37db4e804cc07e039ce90258fc9d239dbd12aa9" exitCode=0 Dec 16 06:55:22 crc kubenswrapper[4823]: I1216 06:55:22.776114 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"31d129a77e2eea43c6bd7305f37db4e804cc07e039ce90258fc9d239dbd12aa9"} Dec 16 06:55:22 crc kubenswrapper[4823]: I1216 06:55:22.776229 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"d1bc41cc1d96d85fa2a7ee4ace616a7a305f197916c79d97bb9905132db79e52"} Dec 16 06:55:22 crc kubenswrapper[4823]: I1216 06:55:22.776332 4823 kubelet_node_status.go:401] "Setting node annotation to enable 
volume controller attach/detach" Dec 16 06:55:22 crc kubenswrapper[4823]: I1216 06:55:22.777499 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"275328a704b475b037c82f92055f9ca7f83b9821325603c5e34090140cc486ce"} Dec 16 06:55:22 crc kubenswrapper[4823]: I1216 06:55:22.777528 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"7c3e5377b263ad1e6545ef38d0fad4fad62bac6f3e805c284f584ffaedd6a578"} Dec 16 06:55:22 crc kubenswrapper[4823]: I1216 06:55:22.777982 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:22 crc kubenswrapper[4823]: I1216 06:55:22.778005 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:22 crc kubenswrapper[4823]: I1216 06:55:22.778015 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:22 crc kubenswrapper[4823]: I1216 06:55:22.780350 4823 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="84f11d2687959c8525b131966a998af137caa69a480bb31a94e1615a3ccbea5d" exitCode=0 Dec 16 06:55:22 crc kubenswrapper[4823]: I1216 06:55:22.780392 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"84f11d2687959c8525b131966a998af137caa69a480bb31a94e1615a3ccbea5d"} Dec 16 06:55:22 crc kubenswrapper[4823]: I1216 06:55:22.780407 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"464e8f3f83c3ab3d666b3684b290c943ad753ce09c57ea2130ea19b21d3603b2"} Dec 16 06:55:22 crc kubenswrapper[4823]: I1216 06:55:22.780478 4823 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 06:55:22 crc kubenswrapper[4823]: I1216 06:55:22.781149 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:22 crc kubenswrapper[4823]: I1216 06:55:22.781171 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:22 crc kubenswrapper[4823]: I1216 06:55:22.781202 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:22 crc kubenswrapper[4823]: I1216 06:55:22.782073 4823 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 06:55:22 crc kubenswrapper[4823]: I1216 06:55:22.782787 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:22 crc kubenswrapper[4823]: I1216 06:55:22.782804 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:22 crc kubenswrapper[4823]: I1216 06:55:22.782813 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:22 crc kubenswrapper[4823]: I1216 06:55:22.784273 4823 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="173f7590ff6fb04354f0bebec8ba8bbe2d4254d9c535920b0d9adf13a8d72393" exitCode=0 Dec 16 06:55:22 crc kubenswrapper[4823]: I1216 06:55:22.784320 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" 
event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"173f7590ff6fb04354f0bebec8ba8bbe2d4254d9c535920b0d9adf13a8d72393"} Dec 16 06:55:22 crc kubenswrapper[4823]: I1216 06:55:22.784342 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"0b80f3a99b88c9f623607c76424edce49aa8a9137cc5e229ec3d7c8eb993604e"} Dec 16 06:55:22 crc kubenswrapper[4823]: I1216 06:55:22.784402 4823 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 06:55:22 crc kubenswrapper[4823]: I1216 06:55:22.785189 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:22 crc kubenswrapper[4823]: I1216 06:55:22.785210 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:22 crc kubenswrapper[4823]: I1216 06:55:22.785221 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:22 crc kubenswrapper[4823]: I1216 06:55:22.786490 4823 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="b3d6894d867564eea62c7d78dd6ca62a9913787ba06b786722e478785ba796ee" exitCode=0 Dec 16 06:55:22 crc kubenswrapper[4823]: I1216 06:55:22.786515 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"b3d6894d867564eea62c7d78dd6ca62a9913787ba06b786722e478785ba796ee"} Dec 16 06:55:22 crc kubenswrapper[4823]: I1216 06:55:22.786530 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"e07ace9f0028f648ee8457cfd65a16c7d2cd6b3e80041cdd5d15c87f431eda7e"} 
Dec 16 06:55:22 crc kubenswrapper[4823]: I1216 06:55:22.786600 4823 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 06:55:22 crc kubenswrapper[4823]: I1216 06:55:22.787501 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:22 crc kubenswrapper[4823]: I1216 06:55:22.787526 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:22 crc kubenswrapper[4823]: I1216 06:55:22.787534 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:22 crc kubenswrapper[4823]: W1216 06:55:22.986580 4823 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.180:6443: connect: connection refused Dec 16 06:55:22 crc kubenswrapper[4823]: E1216 06:55:22.986653 4823 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.180:6443: connect: connection refused" logger="UnhandledError" Dec 16 06:55:23 crc kubenswrapper[4823]: E1216 06:55:23.127091 4823 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.180:6443: connect: connection refused" interval="1.6s" Dec 16 06:55:23 crc kubenswrapper[4823]: W1216 06:55:23.352414 4823 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get 
"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.180:6443: connect: connection refused Dec 16 06:55:23 crc kubenswrapper[4823]: E1216 06:55:23.352946 4823 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.180:6443: connect: connection refused" logger="UnhandledError" Dec 16 06:55:23 crc kubenswrapper[4823]: I1216 06:55:23.355282 4823 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 06:55:23 crc kubenswrapper[4823]: I1216 06:55:23.360361 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:23 crc kubenswrapper[4823]: I1216 06:55:23.360431 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:23 crc kubenswrapper[4823]: I1216 06:55:23.360445 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:23 crc kubenswrapper[4823]: I1216 06:55:23.360492 4823 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 16 06:55:23 crc kubenswrapper[4823]: E1216 06:55:23.362364 4823 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.180:6443: connect: connection refused" node="crc" Dec 16 06:55:23 crc kubenswrapper[4823]: I1216 06:55:23.794654 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c84e7f6950c8d433823b7d5ae3d8a9ecaa70ab6845ce607b8bb93efa990325a5"} Dec 16 06:55:23 crc 
kubenswrapper[4823]: I1216 06:55:23.794717 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"464ab3cf14920ae3f947ce3dccacd7379d50380699008411d0a51bd83fe3e51c"} Dec 16 06:55:23 crc kubenswrapper[4823]: I1216 06:55:23.794735 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"a20f7e0e3c07f467643afdcc6f0e333353d2d10b6d0c0d7aff0d347d33a5d925"} Dec 16 06:55:23 crc kubenswrapper[4823]: I1216 06:55:23.794750 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c33a30aed9756ef012ff9046309ec07897de3b17707ea39c621534ed863c3adf"} Dec 16 06:55:23 crc kubenswrapper[4823]: I1216 06:55:23.796390 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"461527ca73f110395f3690515a2bba303b648297d2e25f5174a6dc4a9a5b591a"} Dec 16 06:55:23 crc kubenswrapper[4823]: I1216 06:55:23.796486 4823 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 06:55:23 crc kubenswrapper[4823]: I1216 06:55:23.797418 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:23 crc kubenswrapper[4823]: I1216 06:55:23.797459 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:23 crc kubenswrapper[4823]: I1216 06:55:23.797473 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:23 crc kubenswrapper[4823]: I1216 
06:55:23.797738 4823 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="e551b627e90b56385fe5618d3e66ea7376a2d574589583c005373aa29db1d410" exitCode=0 Dec 16 06:55:23 crc kubenswrapper[4823]: I1216 06:55:23.797799 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"e551b627e90b56385fe5618d3e66ea7376a2d574589583c005373aa29db1d410"} Dec 16 06:55:23 crc kubenswrapper[4823]: I1216 06:55:23.797943 4823 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 06:55:23 crc kubenswrapper[4823]: I1216 06:55:23.798689 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:23 crc kubenswrapper[4823]: I1216 06:55:23.798715 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:23 crc kubenswrapper[4823]: I1216 06:55:23.798724 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:23 crc kubenswrapper[4823]: I1216 06:55:23.800499 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"86b4d125316b63df68beb204f3618d0d82c2646f505297c24d07a584732e19f9"} Dec 16 06:55:23 crc kubenswrapper[4823]: I1216 06:55:23.800526 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"59b38847905d672f7a59b8a77a7def857a70111c48d5f7d06180f91a42ae79d5"} Dec 16 06:55:23 crc kubenswrapper[4823]: I1216 06:55:23.800539 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"cf31c05ee4aae2a94ca03ae1c3af6a8e748104346b05bc56a75bffd14c4ef993"} Dec 16 06:55:23 crc kubenswrapper[4823]: I1216 06:55:23.800621 4823 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 06:55:23 crc kubenswrapper[4823]: I1216 06:55:23.802615 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:23 crc kubenswrapper[4823]: I1216 06:55:23.802637 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:23 crc kubenswrapper[4823]: I1216 06:55:23.802649 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:23 crc kubenswrapper[4823]: I1216 06:55:23.807785 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"d706c1463d2c913afe727db1a87a2841698d91a586c84d6b55f02865e7f5584a"} Dec 16 06:55:23 crc kubenswrapper[4823]: I1216 06:55:23.807812 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"e7fa96323e345f8891f51500618a7b07d128d3e7256a3534d9dca75ff8ce9edf"} Dec 16 06:55:23 crc kubenswrapper[4823]: I1216 06:55:23.807824 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"59ddc7310ed1600d050e488762436763220f3c2946a4c2020346da4ccef46b1b"} Dec 16 06:55:23 crc kubenswrapper[4823]: I1216 06:55:23.807893 4823 kubelet_node_status.go:401] "Setting 
node annotation to enable volume controller attach/detach" Dec 16 06:55:23 crc kubenswrapper[4823]: I1216 06:55:23.808563 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:23 crc kubenswrapper[4823]: I1216 06:55:23.808586 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:23 crc kubenswrapper[4823]: I1216 06:55:23.808595 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:24 crc kubenswrapper[4823]: I1216 06:55:24.815677 4823 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 06:55:24 crc kubenswrapper[4823]: I1216 06:55:24.815620 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"9c9e6ec333ac552dc972c31cce23104cc27bcc2cf11bdc52d89ab36172915a41"} Dec 16 06:55:24 crc kubenswrapper[4823]: I1216 06:55:24.818405 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:24 crc kubenswrapper[4823]: I1216 06:55:24.818450 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:24 crc kubenswrapper[4823]: I1216 06:55:24.818462 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:24 crc kubenswrapper[4823]: I1216 06:55:24.821686 4823 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="631458d415d64c3cd4dfd24771644393aa6c01288de8f7a5395cf8888db67d2f" exitCode=0 Dec 16 06:55:24 crc kubenswrapper[4823]: I1216 06:55:24.821799 4823 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 
06:55:24 crc kubenswrapper[4823]: I1216 06:55:24.822329 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"631458d415d64c3cd4dfd24771644393aa6c01288de8f7a5395cf8888db67d2f"} Dec 16 06:55:24 crc kubenswrapper[4823]: I1216 06:55:24.822444 4823 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 06:55:24 crc kubenswrapper[4823]: I1216 06:55:24.823172 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:24 crc kubenswrapper[4823]: I1216 06:55:24.823198 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:24 crc kubenswrapper[4823]: I1216 06:55:24.823208 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:24 crc kubenswrapper[4823]: I1216 06:55:24.823711 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:24 crc kubenswrapper[4823]: I1216 06:55:24.823733 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:24 crc kubenswrapper[4823]: I1216 06:55:24.823742 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:24 crc kubenswrapper[4823]: I1216 06:55:24.912925 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 16 06:55:24 crc kubenswrapper[4823]: I1216 06:55:24.913204 4823 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 06:55:24 crc kubenswrapper[4823]: I1216 06:55:24.914654 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 16 06:55:24 crc kubenswrapper[4823]: I1216 06:55:24.914734 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:24 crc kubenswrapper[4823]: I1216 06:55:24.914749 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:24 crc kubenswrapper[4823]: I1216 06:55:24.962882 4823 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 06:55:24 crc kubenswrapper[4823]: I1216 06:55:24.964293 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:24 crc kubenswrapper[4823]: I1216 06:55:24.964344 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:24 crc kubenswrapper[4823]: I1216 06:55:24.964360 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:24 crc kubenswrapper[4823]: I1216 06:55:24.964396 4823 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 16 06:55:25 crc kubenswrapper[4823]: I1216 06:55:25.828984 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"f9c22b0787554b4bdf0b0068fc696b8515e2ee63affef23e5eb64f77bc32a624"} Dec 16 06:55:25 crc kubenswrapper[4823]: I1216 06:55:25.829048 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"50bb40e9d22674554f2a267df7c9f924a29131ac986877bf19572cc5992ba396"} Dec 16 06:55:25 crc kubenswrapper[4823]: I1216 06:55:25.829069 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"c40d3d73f70c8983adc8d076d89864e7224528dae97252396a1c34bd4cb804e6"} Dec 16 06:55:25 crc kubenswrapper[4823]: I1216 06:55:25.829086 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"1560fb0ba0c40a2797688b518c22c9164a8cbe9a265cb2ad95408ba86b0fb537"} Dec 16 06:55:25 crc kubenswrapper[4823]: I1216 06:55:25.829111 4823 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 16 06:55:25 crc kubenswrapper[4823]: I1216 06:55:25.829174 4823 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 06:55:25 crc kubenswrapper[4823]: I1216 06:55:25.830452 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:25 crc kubenswrapper[4823]: I1216 06:55:25.830489 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:25 crc kubenswrapper[4823]: I1216 06:55:25.830502 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:26 crc kubenswrapper[4823]: I1216 06:55:26.019698 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 06:55:26 crc kubenswrapper[4823]: I1216 06:55:26.836341 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"a32894b8f22c5335ed585c26fcab727324803d768943f40455a592025cbfd0ea"} Dec 16 06:55:26 crc kubenswrapper[4823]: I1216 06:55:26.836438 4823 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 16 06:55:26 crc kubenswrapper[4823]: I1216 06:55:26.836498 4823 kubelet_node_status.go:401] "Setting node 
annotation to enable volume controller attach/detach" Dec 16 06:55:26 crc kubenswrapper[4823]: I1216 06:55:26.836515 4823 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 06:55:26 crc kubenswrapper[4823]: I1216 06:55:26.837979 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:26 crc kubenswrapper[4823]: I1216 06:55:26.838005 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:26 crc kubenswrapper[4823]: I1216 06:55:26.838015 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:26 crc kubenswrapper[4823]: I1216 06:55:26.838027 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:26 crc kubenswrapper[4823]: I1216 06:55:26.838060 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:26 crc kubenswrapper[4823]: I1216 06:55:26.838095 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:27 crc kubenswrapper[4823]: I1216 06:55:27.838683 4823 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 06:55:27 crc kubenswrapper[4823]: I1216 06:55:27.840099 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:27 crc kubenswrapper[4823]: I1216 06:55:27.840163 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:27 crc kubenswrapper[4823]: I1216 06:55:27.840180 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:28 crc kubenswrapper[4823]: I1216 06:55:28.256490 
4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 16 06:55:28 crc kubenswrapper[4823]: I1216 06:55:28.256798 4823 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 06:55:28 crc kubenswrapper[4823]: I1216 06:55:28.258280 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:28 crc kubenswrapper[4823]: I1216 06:55:28.258325 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:28 crc kubenswrapper[4823]: I1216 06:55:28.258339 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:28 crc kubenswrapper[4823]: I1216 06:55:28.465470 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 06:55:28 crc kubenswrapper[4823]: I1216 06:55:28.465658 4823 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 16 06:55:28 crc kubenswrapper[4823]: I1216 06:55:28.465701 4823 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 06:55:28 crc kubenswrapper[4823]: I1216 06:55:28.466760 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:28 crc kubenswrapper[4823]: I1216 06:55:28.466804 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:28 crc kubenswrapper[4823]: I1216 06:55:28.466814 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:28 crc kubenswrapper[4823]: I1216 06:55:28.682303 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Dec 16 
06:55:28 crc kubenswrapper[4823]: I1216 06:55:28.840797 4823 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 06:55:28 crc kubenswrapper[4823]: I1216 06:55:28.841754 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:28 crc kubenswrapper[4823]: I1216 06:55:28.841793 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:28 crc kubenswrapper[4823]: I1216 06:55:28.841806 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:30 crc kubenswrapper[4823]: I1216 06:55:30.685110 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 16 06:55:30 crc kubenswrapper[4823]: I1216 06:55:30.685301 4823 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 06:55:30 crc kubenswrapper[4823]: I1216 06:55:30.686584 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:30 crc kubenswrapper[4823]: I1216 06:55:30.686631 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:30 crc kubenswrapper[4823]: I1216 06:55:30.686649 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:30 crc kubenswrapper[4823]: I1216 06:55:30.689206 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 16 06:55:30 crc kubenswrapper[4823]: I1216 06:55:30.846942 4823 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 06:55:30 crc kubenswrapper[4823]: I1216 06:55:30.847807 4823 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:30 crc kubenswrapper[4823]: I1216 06:55:30.847836 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:30 crc kubenswrapper[4823]: I1216 06:55:30.847846 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:30 crc kubenswrapper[4823]: I1216 06:55:30.930663 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 06:55:30 crc kubenswrapper[4823]: I1216 06:55:30.930858 4823 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 06:55:30 crc kubenswrapper[4823]: I1216 06:55:30.933091 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:30 crc kubenswrapper[4823]: I1216 06:55:30.933147 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:30 crc kubenswrapper[4823]: I1216 06:55:30.933161 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:31 crc kubenswrapper[4823]: I1216 06:55:31.446428 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 16 06:55:31 crc kubenswrapper[4823]: I1216 06:55:31.849022 4823 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 16 06:55:31 crc kubenswrapper[4823]: I1216 06:55:31.849095 4823 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 06:55:31 crc kubenswrapper[4823]: I1216 06:55:31.849945 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" 
Dec 16 06:55:31 crc kubenswrapper[4823]: I1216 06:55:31.849990 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:31 crc kubenswrapper[4823]: I1216 06:55:31.850001 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:31 crc kubenswrapper[4823]: E1216 06:55:31.852593 4823 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 16 06:55:32 crc kubenswrapper[4823]: I1216 06:55:32.192459 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 16 06:55:32 crc kubenswrapper[4823]: I1216 06:55:32.196267 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 16 06:55:32 crc kubenswrapper[4823]: I1216 06:55:32.852247 4823 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 06:55:32 crc kubenswrapper[4823]: I1216 06:55:32.853222 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:32 crc kubenswrapper[4823]: I1216 06:55:32.853252 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:32 crc kubenswrapper[4823]: I1216 06:55:32.853261 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:33 crc kubenswrapper[4823]: I1216 06:55:33.047951 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Dec 16 06:55:33 crc kubenswrapper[4823]: I1216 06:55:33.048189 4823 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 06:55:33 crc kubenswrapper[4823]: 
I1216 06:55:33.049400 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:33 crc kubenswrapper[4823]: I1216 06:55:33.049446 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:33 crc kubenswrapper[4823]: I1216 06:55:33.049455 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:33 crc kubenswrapper[4823]: I1216 06:55:33.716255 4823 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Dec 16 06:55:33 crc kubenswrapper[4823]: I1216 06:55:33.854432 4823 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 06:55:33 crc kubenswrapper[4823]: I1216 06:55:33.855374 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:33 crc kubenswrapper[4823]: I1216 06:55:33.855422 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:33 crc kubenswrapper[4823]: I1216 06:55:33.855433 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:34 crc kubenswrapper[4823]: I1216 06:55:34.195719 4823 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Dec 16 06:55:34 crc kubenswrapper[4823]: I1216 06:55:34.195776 4823 prober.go:107] "Probe failed" 
probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Dec 16 06:55:34 crc kubenswrapper[4823]: I1216 06:55:34.202754 4823 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Dec 16 06:55:34 crc kubenswrapper[4823]: I1216 06:55:34.202819 4823 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Dec 16 06:55:34 crc kubenswrapper[4823]: I1216 06:55:34.447179 4823 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 16 06:55:34 crc kubenswrapper[4823]: I1216 06:55:34.447255 4823 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 16 06:55:36 crc kubenswrapper[4823]: I1216 06:55:36.883238 4823 patch_prober.go:28] interesting pod/kube-apiserver-crc 
container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Dec 16 06:55:36 crc kubenswrapper[4823]: I1216 06:55:36.883369 4823 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Dec 16 06:55:38 crc kubenswrapper[4823]: I1216 06:55:38.468349 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 06:55:38 crc kubenswrapper[4823]: I1216 06:55:38.468518 4823 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 06:55:38 crc kubenswrapper[4823]: I1216 06:55:38.468817 4823 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Dec 16 06:55:38 crc kubenswrapper[4823]: I1216 06:55:38.468876 4823 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Dec 16 06:55:38 crc kubenswrapper[4823]: I1216 06:55:38.469644 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:38 crc kubenswrapper[4823]: I1216 06:55:38.469671 4823 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:38 crc kubenswrapper[4823]: I1216 06:55:38.469680 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:38 crc kubenswrapper[4823]: I1216 06:55:38.475362 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 06:55:38 crc kubenswrapper[4823]: I1216 06:55:38.871243 4823 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 06:55:38 crc kubenswrapper[4823]: I1216 06:55:38.871730 4823 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Dec 16 06:55:38 crc kubenswrapper[4823]: I1216 06:55:38.871861 4823 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Dec 16 06:55:38 crc kubenswrapper[4823]: I1216 06:55:38.872202 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:38 crc kubenswrapper[4823]: I1216 06:55:38.872480 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:38 crc kubenswrapper[4823]: I1216 06:55:38.872563 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:39 crc kubenswrapper[4823]: E1216 06:55:39.203321 4823 controller.go:145] "Failed to 
ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="3.2s" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.205045 4823 trace.go:236] Trace[1029496779]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (16-Dec-2025 06:55:25.956) (total time: 13248ms): Dec 16 06:55:39 crc kubenswrapper[4823]: Trace[1029496779]: ---"Objects listed" error: 13248ms (06:55:39.204) Dec 16 06:55:39 crc kubenswrapper[4823]: Trace[1029496779]: [13.248901657s] [13.248901657s] END Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.205067 4823 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.205658 4823 trace.go:236] Trace[1950597445]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (16-Dec-2025 06:55:25.195) (total time: 14010ms): Dec 16 06:55:39 crc kubenswrapper[4823]: Trace[1950597445]: ---"Objects listed" error: 14010ms (06:55:39.205) Dec 16 06:55:39 crc kubenswrapper[4823]: Trace[1950597445]: [14.010502212s] [14.010502212s] END Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.205844 4823 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.205869 4823 trace.go:236] Trace[2051605732]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (16-Dec-2025 06:55:24.612) (total time: 14593ms): Dec 16 06:55:39 crc kubenswrapper[4823]: Trace[2051605732]: ---"Objects listed" error: 14593ms (06:55:39.205) Dec 16 06:55:39 crc kubenswrapper[4823]: Trace[2051605732]: [14.593636328s] [14.593636328s] END Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.206024 4823 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 16 
06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.206078 4823 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.206958 4823 trace.go:236] Trace[1206876493]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (16-Dec-2025 06:55:24.810) (total time: 14396ms): Dec 16 06:55:39 crc kubenswrapper[4823]: Trace[1206876493]: ---"Objects listed" error: 14396ms (06:55:39.206) Dec 16 06:55:39 crc kubenswrapper[4823]: Trace[1206876493]: [14.396762231s] [14.396762231s] END Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.206985 4823 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 16 06:55:39 crc kubenswrapper[4823]: E1216 06:55:39.209057 4823 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.708415 4823 apiserver.go:52] "Watching apiserver" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.711348 4823 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.711671 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-hr8h5","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf"] Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.712128 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.712190 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:55:39 crc kubenswrapper[4823]: E1216 06:55:39.712259 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.712366 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.712373 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:55:39 crc kubenswrapper[4823]: E1216 06:55:39.712465 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.712599 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.712787 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:55:39 crc kubenswrapper[4823]: E1216 06:55:39.712856 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.713278 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-hr8h5" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.714832 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.715645 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.716235 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.717438 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.717528 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.717674 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.717878 4823 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.718798 4823 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.721967 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.722233 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.722359 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.722472 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.722883 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.736429 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.751894 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.761721 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.773187 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.792217 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.801399 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.808403 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.808450 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.808489 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 
06:55:39.808711 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.808741 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.809121 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.809141 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.809266 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). 
InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.809367 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.809380 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.809428 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.810079 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.810441 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.810493 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.810537 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.810561 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.811000 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.811511 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.811796 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.812052 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.812084 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.812405 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.812611 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.812645 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.812925 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.812963 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.813003 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.813266 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.813316 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.813336 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.813431 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.813555 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.813353 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.813605 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.813663 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.813697 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.813717 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.813682 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.813694 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.813780 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.813799 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.813815 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.814493 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.814518 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.813849 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.814823 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.814808 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.814077 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.814243 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.814434 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.814446 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.815125 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.814732 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.815478 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.815737 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.815160 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.816224 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.816340 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.816453 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.816551 4823 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.816650 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.816741 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.816839 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.816238 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.816390 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.816526 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.816911 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.816944 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.817278 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.817297 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.817316 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.817334 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.816921 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.817352 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.817370 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.817388 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.817404 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.817421 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod 
\"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.817438 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.817457 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.817473 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.817497 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.817515 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.817529 4823 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.817546 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.817564 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.817581 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.817599 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.817616 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" 
(UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.817632 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.817652 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.817669 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.817686 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.817703 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.817720 4823 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.817739 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.817768 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.817789 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.817808 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.817825 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: 
\"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.817842 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.817859 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.817876 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.817892 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.817910 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: 
\"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.817927 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.817944 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.817961 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.817976 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.817992 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.818008 4823 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.818040 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.818056 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.818071 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.818087 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.818102 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod 
\"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.818118 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.818137 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.818153 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.818170 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.818190 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.818210 
4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.818230 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.818253 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.818273 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.818294 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.818311 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod 
\"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.818328 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.818343 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.818359 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.818388 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.818403 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.818422 4823 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.818439 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.818454 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.818470 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.818487 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.818504 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " 
Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.818524 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.818540 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.818555 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.818572 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.818608 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.818637 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.818659 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.818678 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.818739 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.818762 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.818783 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: 
\"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.818802 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.818820 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.818842 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.818865 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.818886 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.818909 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.819132 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.819165 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.819188 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.819209 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.819236 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 16 06:55:39 crc kubenswrapper[4823]: 
I1216 06:55:39.819263 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.819283 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.819304 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.819325 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.819346 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.819367 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod 
\"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.819389 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.819409 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.819435 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.819459 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.819484 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.819502 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" 
(UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.819519 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.819536 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.819553 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.819570 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.819587 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.819603 4823 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.819619 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.819661 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.819676 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.819692 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.819708 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: 
\"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.819723 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.819741 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.819759 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.819824 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.819841 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.819858 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.819873 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.819892 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.819909 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.819927 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.819944 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 16 06:55:39 crc 
kubenswrapper[4823]: I1216 06:55:39.819963 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.819985 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.820014 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.820059 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.820224 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.820248 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: 
\"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.820266 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.820283 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.820431 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.820460 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.820484 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.820506 4823 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.820532 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.820557 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.820579 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.820668 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.820701 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod 
\"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.820721 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.820740 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.821311 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.821363 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.821387 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.821420 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.821450 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") "
Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.821488 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.821512 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.821539 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.821562 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.821586 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.821610 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.821633 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.821652 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.821678 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.821758 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.821786 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.828766 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.829322 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.829403 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.818479 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.818877 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.820298 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.820402 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.820440 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.821630 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.821923 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.822421 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.822711 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.822728 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.822975 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.823095 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.823509 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.823944 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.824236 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.824475 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.824663 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.824703 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.824993 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.825015 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.825080 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.825158 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.825302 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.825813 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.826262 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.826640 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.826704 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.826873 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.835906 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.826919 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.826900 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.827075 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.827393 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.827412 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.827455 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.827729 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.827905 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.827948 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.828433 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.828600 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.829257 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.829491 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.829895 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.829944 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.829959 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.830261 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.830437 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.830848 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.830942 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.831168 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.831493 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 06:55:39 crc kubenswrapper[4823]: E1216 06:55:39.831613 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:55:40.331567687 +0000 UTC m=+18.820133810 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.831744 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.832026 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.832628 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.832868 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.832868 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.829994 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.832861 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.833086 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.833181 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.833565 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.833484 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.833543 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.833620 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.833617 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.833646 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.831077 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.835441 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.835976 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.836676 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.837045 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.837069 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.837222 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.837359 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl".
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.837368 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.837378 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.837525 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.837707 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.837824 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/2d31d032-9142-4e26-9f06-e3a5ea73d530-hosts-file\") pod \"node-resolver-hr8h5\" (UID: \"2d31d032-9142-4e26-9f06-e3a5ea73d530\") " pod="openshift-dns/node-resolver-hr8h5" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.838280 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.838335 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.838391 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.838402 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.838407 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.838735 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.839137 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.839814 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.839799 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.840354 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.840407 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.840455 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.840517 4823 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.840733 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7sdtm\" (UniqueName: \"kubernetes.io/projected/2d31d032-9142-4e26-9f06-e3a5ea73d530-kube-api-access-7sdtm\") pod \"node-resolver-hr8h5\" (UID: \"2d31d032-9142-4e26-9f06-e3a5ea73d530\") " pod="openshift-dns/node-resolver-hr8h5" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.840773 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.840797 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.840827 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: 
\"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.840858 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.840885 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.841044 4823 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.841064 4823 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.840306 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). 
InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 06:55:39 crc kubenswrapper[4823]: E1216 06:55:39.844192 4823 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 16 06:55:39 crc kubenswrapper[4823]: E1216 06:55:39.844346 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-16 06:55:40.3443193 +0000 UTC m=+18.832885423 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.844419 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.844749 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.844756 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.844942 4823 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.845016 4823 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.845101 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.845170 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.845231 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.845298 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: 
\"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.845353 4823 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.845418 4823 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.845499 4823 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.845564 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.845642 4823 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.845726 4823 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.845796 4823 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.845864 4823 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.845922 4823 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.845987 4823 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.846081 4823 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.846157 4823 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.846235 4823 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.846299 4823 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Dec 16 
06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.846357 4823 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.846410 4823 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.846469 4823 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.846529 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.846587 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.846651 4823 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.846694 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " 
pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.846714 4823 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.846764 4823 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.846778 4823 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.846793 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.846804 4823 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.846815 4823 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.846826 4823 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: 
I1216 06:55:39.846838 4823 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.846850 4823 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.846861 4823 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.846872 4823 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.846886 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.846900 4823 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.846934 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.846945 4823 reconciler_common.go:293] "Volume detached for volume 
\"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.846956 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.846967 4823 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.846987 4823 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.847000 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.847042 4823 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.847052 4823 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.847062 4823 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: 
\"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.847073 4823 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.847083 4823 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.847094 4823 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.847104 4823 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.847114 4823 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.847125 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.847135 4823 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.847147 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.847158 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.847169 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.847179 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.847189 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.847199 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.847209 4823 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.847218 4823 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.847230 4823 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.847241 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.847250 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.847261 4823 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.847271 4823 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.847279 4823 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Dec 16 
06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.847291 4823 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.847301 4823 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.847310 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.847320 4823 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.847329 4823 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.847340 4823 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.847349 4823 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.847358 4823 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.847366 4823 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.847376 4823 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.847385 4823 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.847395 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.847410 4823 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.847419 4823 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.847428 4823 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc 
kubenswrapper[4823]: I1216 06:55:39.847439 4823 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.847448 4823 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.847457 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.847469 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.847479 4823 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.847487 4823 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.847496 4823 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.847505 4823 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.847515 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.847527 4823 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.847583 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.847594 4823 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.847603 4823 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.847614 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.847623 4823 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Dec 16 
06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.847632 4823 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.847642 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.847652 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.847663 4823 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.847674 4823 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.847684 4823 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.847695 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.847706 4823 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.847717 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.847727 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.847736 4823 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.847746 4823 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.847757 4823 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.847766 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.848759 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.844937 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:55:39 crc kubenswrapper[4823]: E1216 06:55:39.845005 4823 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 16 06:55:39 crc kubenswrapper[4823]: E1216 06:55:39.850701 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-16 06:55:40.350674029 +0000 UTC m=+18.839240332 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.850647 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.843924 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.845075 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.845172 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). 
InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.841116 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.845559 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.846302 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.846651 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.846425 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.847060 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.847359 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.851652 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.847438 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.848122 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.848148 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.848350 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.848403 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.848680 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.849172 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.849443 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:55:39 crc kubenswrapper[4823]: E1216 06:55:39.849519 4823 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.850042 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.850767 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.850889 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.841578 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.851047 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.851334 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.851941 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.851942 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.852215 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 06:55:39 crc kubenswrapper[4823]: E1216 06:55:39.852350 4823 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 16 06:55:39 crc kubenswrapper[4823]: E1216 06:55:39.852377 4823 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 06:55:39 crc kubenswrapper[4823]: E1216 06:55:39.852445 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-16 06:55:40.352424305 +0000 UTC m=+18.840990428 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.852349 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.852480 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.852504 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.851677 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.852618 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.852645 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.852922 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.852988 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.853489 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.853799 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.854100 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.854188 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.854337 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.855140 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.855357 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.858087 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.859106 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.859118 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.859431 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 16 06:55:39 crc kubenswrapper[4823]: E1216 06:55:39.859528 4823 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 16 06:55:39 crc kubenswrapper[4823]: E1216 06:55:39.859568 4823 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 16 06:55:39 crc kubenswrapper[4823]: E1216 06:55:39.859586 4823 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.859723 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod 
"31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.859950 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.860243 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.860266 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:55:39 crc kubenswrapper[4823]: E1216 06:55:39.860791 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-16 06:55:40.359643202 +0000 UTC m=+18.848209405 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.862542 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.863740 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.866211 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.866405 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hr8h5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d31d032-9142-4e26-9f06-e3a5ea73d530\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sdtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hr8h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.866582 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.866690 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.868356 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.868453 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.869194 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.871443 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.872396 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.872520 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.872956 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.873004 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.873180 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.873909 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.874042 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.875617 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.876244 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.876280 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.876438 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). 
InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.877097 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.877920 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.877952 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.880136 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.881615 4823 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="9c9e6ec333ac552dc972c31cce23104cc27bcc2cf11bdc52d89ab36172915a41" exitCode=255 Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.881666 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"9c9e6ec333ac552dc972c31cce23104cc27bcc2cf11bdc52d89ab36172915a41"} Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.890371 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.895564 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.896212 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-n248g"] Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.896767 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-964hc"] Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.897650 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.898154 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-n248g" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.898730 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-fv56f"] Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.899191 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.899455 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-964hc" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.899723 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-zwjhk"] Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.904313 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.904690 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.904910 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.905171 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.905342 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.905642 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.906116 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.906566 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.906692 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.907133 4823 scope.go:117] "RemoveContainer" containerID="9c9e6ec333ac552dc972c31cce23104cc27bcc2cf11bdc52d89ab36172915a41" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.907694 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.907829 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.907896 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.907762 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.908065 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.911005 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.911255 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.911301 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.911048 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.911691 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.911119 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.911895 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.911125 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.914454 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" 
(OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.915807 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.921678 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.933261 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.942494 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.948622 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/08e48f89-7095-4ea2-afb5-759591c2b0d4-host-cni-netd\") pod \"ovnkube-node-zwjhk\" (UID: \"08e48f89-7095-4ea2-afb5-759591c2b0d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.948658 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1b377757-dbc6-4d9c-9656-3ff65d7d113a-multus-cni-dir\") pod \"multus-n248g\" (UID: \"1b377757-dbc6-4d9c-9656-3ff65d7d113a\") " pod="openshift-multus/multus-n248g" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.948674 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/1b377757-dbc6-4d9c-9656-3ff65d7d113a-host-var-lib-cni-bin\") pod \"multus-n248g\" (UID: \"1b377757-dbc6-4d9c-9656-3ff65d7d113a\") " pod="openshift-multus/multus-n248g" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.948694 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/25dec47c-3043-486c-b371-2be103c214e3-mcd-auth-proxy-config\") pod \"machine-config-daemon-fv56f\" (UID: \"25dec47c-3043-486c-b371-2be103c214e3\") " pod="openshift-machine-config-operator/machine-config-daemon-fv56f" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.948715 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzgs7\" (UniqueName: \"kubernetes.io/projected/08e48f89-7095-4ea2-afb5-759591c2b0d4-kube-api-access-qzgs7\") pod \"ovnkube-node-zwjhk\" (UID: \"08e48f89-7095-4ea2-afb5-759591c2b0d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.948745 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.949222 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.949252 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.949287 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a95a09e1-8457-4d5e-b1b6-dd6f59a66c6c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-964hc\" (UID: \"a95a09e1-8457-4d5e-b1b6-dd6f59a66c6c\") " pod="openshift-multus/multus-additional-cni-plugins-964hc" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.949316 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/08e48f89-7095-4ea2-afb5-759591c2b0d4-run-ovn\") pod \"ovnkube-node-zwjhk\" (UID: \"08e48f89-7095-4ea2-afb5-759591c2b0d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.949333 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.949339 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/08e48f89-7095-4ea2-afb5-759591c2b0d4-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-zwjhk\" (UID: \"08e48f89-7095-4ea2-afb5-759591c2b0d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.949637 
4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1b377757-dbc6-4d9c-9656-3ff65d7d113a-etc-kubernetes\") pod \"multus-n248g\" (UID: \"1b377757-dbc6-4d9c-9656-3ff65d7d113a\") " pod="openshift-multus/multus-n248g" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.949698 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/25dec47c-3043-486c-b371-2be103c214e3-proxy-tls\") pod \"machine-config-daemon-fv56f\" (UID: \"25dec47c-3043-486c-b371-2be103c214e3\") " pod="openshift-machine-config-operator/machine-config-daemon-fv56f" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.949790 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a95a09e1-8457-4d5e-b1b6-dd6f59a66c6c-cni-binary-copy\") pod \"multus-additional-cni-plugins-964hc\" (UID: \"a95a09e1-8457-4d5e-b1b6-dd6f59a66c6c\") " pod="openshift-multus/multus-additional-cni-plugins-964hc" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.949844 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/08e48f89-7095-4ea2-afb5-759591c2b0d4-etc-openvswitch\") pod \"ovnkube-node-zwjhk\" (UID: \"08e48f89-7095-4ea2-afb5-759591c2b0d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.949872 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/08e48f89-7095-4ea2-afb5-759591c2b0d4-run-openvswitch\") pod \"ovnkube-node-zwjhk\" (UID: \"08e48f89-7095-4ea2-afb5-759591c2b0d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" Dec 16 06:55:39 crc 
kubenswrapper[4823]: I1216 06:55:39.949909 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/08e48f89-7095-4ea2-afb5-759591c2b0d4-node-log\") pod \"ovnkube-node-zwjhk\" (UID: \"08e48f89-7095-4ea2-afb5-759591c2b0d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.949939 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1b377757-dbc6-4d9c-9656-3ff65d7d113a-system-cni-dir\") pod \"multus-n248g\" (UID: \"1b377757-dbc6-4d9c-9656-3ff65d7d113a\") " pod="openshift-multus/multus-n248g" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.949965 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1b377757-dbc6-4d9c-9656-3ff65d7d113a-cnibin\") pod \"multus-n248g\" (UID: \"1b377757-dbc6-4d9c-9656-3ff65d7d113a\") " pod="openshift-multus/multus-n248g" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.949989 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1b377757-dbc6-4d9c-9656-3ff65d7d113a-os-release\") pod \"multus-n248g\" (UID: \"1b377757-dbc6-4d9c-9656-3ff65d7d113a\") " pod="openshift-multus/multus-n248g" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.950014 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1b377757-dbc6-4d9c-9656-3ff65d7d113a-host-run-netns\") pod \"multus-n248g\" (UID: \"1b377757-dbc6-4d9c-9656-3ff65d7d113a\") " pod="openshift-multus/multus-n248g" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.950071 4823 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2skkc\" (UniqueName: \"kubernetes.io/projected/1b377757-dbc6-4d9c-9656-3ff65d7d113a-kube-api-access-2skkc\") pod \"multus-n248g\" (UID: \"1b377757-dbc6-4d9c-9656-3ff65d7d113a\") " pod="openshift-multus/multus-n248g" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.950101 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/1b377757-dbc6-4d9c-9656-3ff65d7d113a-hostroot\") pod \"multus-n248g\" (UID: \"1b377757-dbc6-4d9c-9656-3ff65d7d113a\") " pod="openshift-multus/multus-n248g" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.950206 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/1b377757-dbc6-4d9c-9656-3ff65d7d113a-host-var-lib-cni-multus\") pod \"multus-n248g\" (UID: \"1b377757-dbc6-4d9c-9656-3ff65d7d113a\") " pod="openshift-multus/multus-n248g" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.950292 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a95a09e1-8457-4d5e-b1b6-dd6f59a66c6c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-964hc\" (UID: \"a95a09e1-8457-4d5e-b1b6-dd6f59a66c6c\") " pod="openshift-multus/multus-additional-cni-plugins-964hc" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.950352 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/08e48f89-7095-4ea2-afb5-759591c2b0d4-env-overrides\") pod \"ovnkube-node-zwjhk\" (UID: \"08e48f89-7095-4ea2-afb5-759591c2b0d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.950425 4823 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/08e48f89-7095-4ea2-afb5-759591c2b0d4-ovnkube-script-lib\") pod \"ovnkube-node-zwjhk\" (UID: \"08e48f89-7095-4ea2-afb5-759591c2b0d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.950460 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a95a09e1-8457-4d5e-b1b6-dd6f59a66c6c-cnibin\") pod \"multus-additional-cni-plugins-964hc\" (UID: \"a95a09e1-8457-4d5e-b1b6-dd6f59a66c6c\") " pod="openshift-multus/multus-additional-cni-plugins-964hc" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.950486 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/08e48f89-7095-4ea2-afb5-759591c2b0d4-var-lib-openvswitch\") pod \"ovnkube-node-zwjhk\" (UID: \"08e48f89-7095-4ea2-afb5-759591c2b0d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.950523 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/1b377757-dbc6-4d9c-9656-3ff65d7d113a-multus-daemon-config\") pod \"multus-n248g\" (UID: \"1b377757-dbc6-4d9c-9656-3ff65d7d113a\") " pod="openshift-multus/multus-n248g" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.950557 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7sdtm\" (UniqueName: \"kubernetes.io/projected/2d31d032-9142-4e26-9f06-e3a5ea73d530-kube-api-access-7sdtm\") pod \"node-resolver-hr8h5\" (UID: \"2d31d032-9142-4e26-9f06-e3a5ea73d530\") " pod="openshift-dns/node-resolver-hr8h5" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 
06:55:39.950592 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmpf8\" (UniqueName: \"kubernetes.io/projected/25dec47c-3043-486c-b371-2be103c214e3-kube-api-access-jmpf8\") pod \"machine-config-daemon-fv56f\" (UID: \"25dec47c-3043-486c-b371-2be103c214e3\") " pod="openshift-machine-config-operator/machine-config-daemon-fv56f" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.950656 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a95a09e1-8457-4d5e-b1b6-dd6f59a66c6c-system-cni-dir\") pod \"multus-additional-cni-plugins-964hc\" (UID: \"a95a09e1-8457-4d5e-b1b6-dd6f59a66c6c\") " pod="openshift-multus/multus-additional-cni-plugins-964hc" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.950683 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a95a09e1-8457-4d5e-b1b6-dd6f59a66c6c-os-release\") pod \"multus-additional-cni-plugins-964hc\" (UID: \"a95a09e1-8457-4d5e-b1b6-dd6f59a66c6c\") " pod="openshift-multus/multus-additional-cni-plugins-964hc" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.950718 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5scf\" (UniqueName: \"kubernetes.io/projected/a95a09e1-8457-4d5e-b1b6-dd6f59a66c6c-kube-api-access-r5scf\") pod \"multus-additional-cni-plugins-964hc\" (UID: \"a95a09e1-8457-4d5e-b1b6-dd6f59a66c6c\") " pod="openshift-multus/multus-additional-cni-plugins-964hc" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.950745 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/08e48f89-7095-4ea2-afb5-759591c2b0d4-ovn-node-metrics-cert\") pod 
\"ovnkube-node-zwjhk\" (UID: \"08e48f89-7095-4ea2-afb5-759591c2b0d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.951453 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/08e48f89-7095-4ea2-afb5-759591c2b0d4-host-kubelet\") pod \"ovnkube-node-zwjhk\" (UID: \"08e48f89-7095-4ea2-afb5-759591c2b0d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.951538 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/08e48f89-7095-4ea2-afb5-759591c2b0d4-log-socket\") pod \"ovnkube-node-zwjhk\" (UID: \"08e48f89-7095-4ea2-afb5-759591c2b0d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.951714 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1b377757-dbc6-4d9c-9656-3ff65d7d113a-cni-binary-copy\") pod \"multus-n248g\" (UID: \"1b377757-dbc6-4d9c-9656-3ff65d7d113a\") " pod="openshift-multus/multus-n248g" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.951800 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/1b377757-dbc6-4d9c-9656-3ff65d7d113a-host-run-k8s-cni-cncf-io\") pod \"multus-n248g\" (UID: \"1b377757-dbc6-4d9c-9656-3ff65d7d113a\") " pod="openshift-multus/multus-n248g" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.951832 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1b377757-dbc6-4d9c-9656-3ff65d7d113a-multus-conf-dir\") pod 
\"multus-n248g\" (UID: \"1b377757-dbc6-4d9c-9656-3ff65d7d113a\") " pod="openshift-multus/multus-n248g" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.951875 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/1b377757-dbc6-4d9c-9656-3ff65d7d113a-multus-socket-dir-parent\") pod \"multus-n248g\" (UID: \"1b377757-dbc6-4d9c-9656-3ff65d7d113a\") " pod="openshift-multus/multus-n248g" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.951916 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/2d31d032-9142-4e26-9f06-e3a5ea73d530-hosts-file\") pod \"node-resolver-hr8h5\" (UID: \"2d31d032-9142-4e26-9f06-e3a5ea73d530\") " pod="openshift-dns/node-resolver-hr8h5" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.951973 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/08e48f89-7095-4ea2-afb5-759591c2b0d4-systemd-units\") pod \"ovnkube-node-zwjhk\" (UID: \"08e48f89-7095-4ea2-afb5-759591c2b0d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.952006 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/08e48f89-7095-4ea2-afb5-759591c2b0d4-host-slash\") pod \"ovnkube-node-zwjhk\" (UID: \"08e48f89-7095-4ea2-afb5-759591c2b0d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.952079 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/08e48f89-7095-4ea2-afb5-759591c2b0d4-host-run-ovn-kubernetes\") pod \"ovnkube-node-zwjhk\" (UID: 
\"08e48f89-7095-4ea2-afb5-759591c2b0d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.952110 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/08e48f89-7095-4ea2-afb5-759591c2b0d4-host-cni-bin\") pod \"ovnkube-node-zwjhk\" (UID: \"08e48f89-7095-4ea2-afb5-759591c2b0d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.952138 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/08e48f89-7095-4ea2-afb5-759591c2b0d4-ovnkube-config\") pod \"ovnkube-node-zwjhk\" (UID: \"08e48f89-7095-4ea2-afb5-759591c2b0d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.952184 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/2d31d032-9142-4e26-9f06-e3a5ea73d530-hosts-file\") pod \"node-resolver-hr8h5\" (UID: \"2d31d032-9142-4e26-9f06-e3a5ea73d530\") " pod="openshift-dns/node-resolver-hr8h5" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.952264 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/25dec47c-3043-486c-b371-2be103c214e3-rootfs\") pod \"machine-config-daemon-fv56f\" (UID: \"25dec47c-3043-486c-b371-2be103c214e3\") " pod="openshift-machine-config-operator/machine-config-daemon-fv56f" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.952331 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/08e48f89-7095-4ea2-afb5-759591c2b0d4-run-systemd\") pod \"ovnkube-node-zwjhk\" (UID: 
\"08e48f89-7095-4ea2-afb5-759591c2b0d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.952433 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1b377757-dbc6-4d9c-9656-3ff65d7d113a-host-var-lib-kubelet\") pod \"multus-n248g\" (UID: \"1b377757-dbc6-4d9c-9656-3ff65d7d113a\") " pod="openshift-multus/multus-n248g" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.952466 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/1b377757-dbc6-4d9c-9656-3ff65d7d113a-host-run-multus-certs\") pod \"multus-n248g\" (UID: \"1b377757-dbc6-4d9c-9656-3ff65d7d113a\") " pod="openshift-multus/multus-n248g" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.952490 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/08e48f89-7095-4ea2-afb5-759591c2b0d4-host-run-netns\") pod \"ovnkube-node-zwjhk\" (UID: \"08e48f89-7095-4ea2-afb5-759591c2b0d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.952754 4823 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.952774 4823 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.952821 4823 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.952840 4823 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.952856 4823 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.952871 4823 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.952892 4823 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.952905 4823 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.952921 4823 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.952939 4823 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.952952 4823 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.952965 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.952979 4823 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.952997 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.953010 4823 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.953037 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.953051 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" 
(UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.953098 4823 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.953113 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.953127 4823 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.953138 4823 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.953153 4823 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.953168 4823 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.953179 4823 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.953194 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.953205 4823 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.953215 4823 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.953243 4823 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.953260 4823 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.953273 4823 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.953286 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Dec 16 
06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.953297 4823 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.953311 4823 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.953323 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.953334 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.953350 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.953362 4823 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.953375 4823 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 
06:55:39.953390 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.953406 4823 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.953418 4823 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.953430 4823 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.953442 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.953456 4823 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.953467 4823 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.953478 4823 reconciler_common.go:293] "Volume detached for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.953489 4823 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.953504 4823 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.954283 4823 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.954296 4823 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.954343 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.954362 4823 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.954376 4823 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.954389 4823 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.954381 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.956300 4823 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.956344 4823 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.956360 4823 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.956373 4823 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" 
DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.956393 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.956410 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.956432 4823 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.956451 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.956465 4823 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.956479 4823 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.956492 4823 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.956510 4823 reconciler_common.go:293] "Volume 
detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.956523 4823 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.956537 4823 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.956552 4823 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.956572 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.956592 4823 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.956605 4823 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.956618 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: 
\"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.956635 4823 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.956649 4823 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.956664 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.956682 4823 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.956701 4823 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.956716 4823 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.966300 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hr8h5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d31d032-9142-4e26-9f06-e3a5ea73d530\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sdtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hr8h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.970286 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7sdtm\" (UniqueName: \"kubernetes.io/projected/2d31d032-9142-4e26-9f06-e3a5ea73d530-kube-api-access-7sdtm\") pod \"node-resolver-hr8h5\" (UID: \"2d31d032-9142-4e26-9f06-e3a5ea73d530\") " pod="openshift-dns/node-resolver-hr8h5" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.977340 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.988911 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 16 06:55:39 crc kubenswrapper[4823]: I1216 06:55:39.997509 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"25dec47c-3043-486c-b371-2be103c214e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmpf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmpf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fv56f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.008434 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c915dd50-9820-494e-b47a-987257910a57\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c33a30aed9756ef012ff9046309ec07897de3b17707ea39c621534ed863c3adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://464ab3cf14920ae3f947ce3dccacd7379d50380699008411d0a51bd83fe3e51c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://a20f7e0e3c07f467643afdcc6f0e333353d2d10b6d0c0d7aff0d347d33a5d925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c9e6ec333ac552dc972c31cce23104cc27bcc2cf11bdc52d89ab36172915a41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c9e6ec333ac552dc972c31cce23104cc27bcc2cf11bdc52d89ab36172915a41\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"ed a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1853467785/tls.crt::/tmp/serving-cert-1853467785/tls.key\\\\\\\"\\\\nI1216 06:55:39.731946 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 06:55:39.736431 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 06:55:39.736560 1 
maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 06:55:39.736611 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 06:55:39.736722 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 06:55:39.741676 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 06:55:39.741701 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 06:55:39.741707 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 06:55:39.741713 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 06:55:39.741717 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 06:55:39.741721 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 06:55:39.741725 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 06:55:39.741774 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 06:55:39.744309 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1853467785/tls.crt::/tmp/serving-cert-1853467785/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765868124\\\\\\\\\\\\\\\" (2025-12-16 06:55:23 +0000 UTC to 2026-01-15 06:55:24 +0000 UTC (now=2025-12-16 06:55:39.744275203 +0000 UTC))\\\\\\\"\\\\nF1216 06:55:39.744562 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c84e7f6950c8d433823b7d5ae3d8a9ecaa70ab6845ce607b8bb93efa990325a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84f11d2687959c8525b131966a998af137caa69a480bb31a94e1615a3ccbea5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84f11d2687959c8525b131966a998af137caa69a480bb31a94e1615a3ccbea5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.020280 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.029779 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.030579 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.038042 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.042527 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.048210 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.053340 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-hr8h5" Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.056536 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.057422 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/08e48f89-7095-4ea2-afb5-759591c2b0d4-host-kubelet\") pod \"ovnkube-node-zwjhk\" (UID: \"08e48f89-7095-4ea2-afb5-759591c2b0d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.057462 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/08e48f89-7095-4ea2-afb5-759591c2b0d4-log-socket\") pod \"ovnkube-node-zwjhk\" (UID: \"08e48f89-7095-4ea2-afb5-759591c2b0d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.057483 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/1b377757-dbc6-4d9c-9656-3ff65d7d113a-cni-binary-copy\") pod \"multus-n248g\" (UID: \"1b377757-dbc6-4d9c-9656-3ff65d7d113a\") " pod="openshift-multus/multus-n248g" Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.057502 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/1b377757-dbc6-4d9c-9656-3ff65d7d113a-host-run-k8s-cni-cncf-io\") pod \"multus-n248g\" (UID: \"1b377757-dbc6-4d9c-9656-3ff65d7d113a\") " pod="openshift-multus/multus-n248g" Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.057517 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1b377757-dbc6-4d9c-9656-3ff65d7d113a-multus-conf-dir\") pod \"multus-n248g\" (UID: \"1b377757-dbc6-4d9c-9656-3ff65d7d113a\") " pod="openshift-multus/multus-n248g" Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.057537 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/1b377757-dbc6-4d9c-9656-3ff65d7d113a-multus-socket-dir-parent\") pod \"multus-n248g\" (UID: \"1b377757-dbc6-4d9c-9656-3ff65d7d113a\") " pod="openshift-multus/multus-n248g" Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.057551 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/08e48f89-7095-4ea2-afb5-759591c2b0d4-host-cni-bin\") pod \"ovnkube-node-zwjhk\" (UID: \"08e48f89-7095-4ea2-afb5-759591c2b0d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.057570 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/08e48f89-7095-4ea2-afb5-759591c2b0d4-ovnkube-config\") pod \"ovnkube-node-zwjhk\" (UID: 
\"08e48f89-7095-4ea2-afb5-759591c2b0d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.057621 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/08e48f89-7095-4ea2-afb5-759591c2b0d4-systemd-units\") pod \"ovnkube-node-zwjhk\" (UID: \"08e48f89-7095-4ea2-afb5-759591c2b0d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.057637 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/08e48f89-7095-4ea2-afb5-759591c2b0d4-host-slash\") pod \"ovnkube-node-zwjhk\" (UID: \"08e48f89-7095-4ea2-afb5-759591c2b0d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.057651 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/08e48f89-7095-4ea2-afb5-759591c2b0d4-host-run-ovn-kubernetes\") pod \"ovnkube-node-zwjhk\" (UID: \"08e48f89-7095-4ea2-afb5-759591c2b0d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.057672 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/25dec47c-3043-486c-b371-2be103c214e3-rootfs\") pod \"machine-config-daemon-fv56f\" (UID: \"25dec47c-3043-486c-b371-2be103c214e3\") " pod="openshift-machine-config-operator/machine-config-daemon-fv56f" Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.057694 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/08e48f89-7095-4ea2-afb5-759591c2b0d4-run-systemd\") pod \"ovnkube-node-zwjhk\" (UID: \"08e48f89-7095-4ea2-afb5-759591c2b0d4\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.057715 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/08e48f89-7095-4ea2-afb5-759591c2b0d4-host-run-netns\") pod \"ovnkube-node-zwjhk\" (UID: \"08e48f89-7095-4ea2-afb5-759591c2b0d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.057788 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1b377757-dbc6-4d9c-9656-3ff65d7d113a-host-var-lib-kubelet\") pod \"multus-n248g\" (UID: \"1b377757-dbc6-4d9c-9656-3ff65d7d113a\") " pod="openshift-multus/multus-n248g" Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.057809 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/1b377757-dbc6-4d9c-9656-3ff65d7d113a-host-run-multus-certs\") pod \"multus-n248g\" (UID: \"1b377757-dbc6-4d9c-9656-3ff65d7d113a\") " pod="openshift-multus/multus-n248g" Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.057848 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/08e48f89-7095-4ea2-afb5-759591c2b0d4-host-cni-netd\") pod \"ovnkube-node-zwjhk\" (UID: \"08e48f89-7095-4ea2-afb5-759591c2b0d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.057863 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1b377757-dbc6-4d9c-9656-3ff65d7d113a-multus-cni-dir\") pod \"multus-n248g\" (UID: \"1b377757-dbc6-4d9c-9656-3ff65d7d113a\") " pod="openshift-multus/multus-n248g" Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.057878 
4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1b377757-dbc6-4d9c-9656-3ff65d7d113a-host-var-lib-cni-bin\") pod \"multus-n248g\" (UID: \"1b377757-dbc6-4d9c-9656-3ff65d7d113a\") " pod="openshift-multus/multus-n248g" Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.057901 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/25dec47c-3043-486c-b371-2be103c214e3-mcd-auth-proxy-config\") pod \"machine-config-daemon-fv56f\" (UID: \"25dec47c-3043-486c-b371-2be103c214e3\") " pod="openshift-machine-config-operator/machine-config-daemon-fv56f" Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.057939 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qzgs7\" (UniqueName: \"kubernetes.io/projected/08e48f89-7095-4ea2-afb5-759591c2b0d4-kube-api-access-qzgs7\") pod \"ovnkube-node-zwjhk\" (UID: \"08e48f89-7095-4ea2-afb5-759591c2b0d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.057959 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/08e48f89-7095-4ea2-afb5-759591c2b0d4-run-ovn\") pod \"ovnkube-node-zwjhk\" (UID: \"08e48f89-7095-4ea2-afb5-759591c2b0d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.057979 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/08e48f89-7095-4ea2-afb5-759591c2b0d4-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-zwjhk\" (UID: \"08e48f89-7095-4ea2-afb5-759591c2b0d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 
06:55:40.058006 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a95a09e1-8457-4d5e-b1b6-dd6f59a66c6c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-964hc\" (UID: \"a95a09e1-8457-4d5e-b1b6-dd6f59a66c6c\") " pod="openshift-multus/multus-additional-cni-plugins-964hc" Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.058050 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/08e48f89-7095-4ea2-afb5-759591c2b0d4-run-openvswitch\") pod \"ovnkube-node-zwjhk\" (UID: \"08e48f89-7095-4ea2-afb5-759591c2b0d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.058079 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/08e48f89-7095-4ea2-afb5-759591c2b0d4-node-log\") pod \"ovnkube-node-zwjhk\" (UID: \"08e48f89-7095-4ea2-afb5-759591c2b0d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.058101 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1b377757-dbc6-4d9c-9656-3ff65d7d113a-etc-kubernetes\") pod \"multus-n248g\" (UID: \"1b377757-dbc6-4d9c-9656-3ff65d7d113a\") " pod="openshift-multus/multus-n248g" Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.058123 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/25dec47c-3043-486c-b371-2be103c214e3-proxy-tls\") pod \"machine-config-daemon-fv56f\" (UID: \"25dec47c-3043-486c-b371-2be103c214e3\") " pod="openshift-machine-config-operator/machine-config-daemon-fv56f" Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.058148 4823 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a95a09e1-8457-4d5e-b1b6-dd6f59a66c6c-cni-binary-copy\") pod \"multus-additional-cni-plugins-964hc\" (UID: \"a95a09e1-8457-4d5e-b1b6-dd6f59a66c6c\") " pod="openshift-multus/multus-additional-cni-plugins-964hc" Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.058168 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/08e48f89-7095-4ea2-afb5-759591c2b0d4-etc-openvswitch\") pod \"ovnkube-node-zwjhk\" (UID: \"08e48f89-7095-4ea2-afb5-759591c2b0d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.058189 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1b377757-dbc6-4d9c-9656-3ff65d7d113a-cnibin\") pod \"multus-n248g\" (UID: \"1b377757-dbc6-4d9c-9656-3ff65d7d113a\") " pod="openshift-multus/multus-n248g" Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.058203 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1b377757-dbc6-4d9c-9656-3ff65d7d113a-os-release\") pod \"multus-n248g\" (UID: \"1b377757-dbc6-4d9c-9656-3ff65d7d113a\") " pod="openshift-multus/multus-n248g" Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.058223 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1b377757-dbc6-4d9c-9656-3ff65d7d113a-host-run-netns\") pod \"multus-n248g\" (UID: \"1b377757-dbc6-4d9c-9656-3ff65d7d113a\") " pod="openshift-multus/multus-n248g" Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.058247 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/1b377757-dbc6-4d9c-9656-3ff65d7d113a-system-cni-dir\") pod \"multus-n248g\" (UID: \"1b377757-dbc6-4d9c-9656-3ff65d7d113a\") " pod="openshift-multus/multus-n248g" Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.058270 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/1b377757-dbc6-4d9c-9656-3ff65d7d113a-hostroot\") pod \"multus-n248g\" (UID: \"1b377757-dbc6-4d9c-9656-3ff65d7d113a\") " pod="openshift-multus/multus-n248g" Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.058302 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2skkc\" (UniqueName: \"kubernetes.io/projected/1b377757-dbc6-4d9c-9656-3ff65d7d113a-kube-api-access-2skkc\") pod \"multus-n248g\" (UID: \"1b377757-dbc6-4d9c-9656-3ff65d7d113a\") " pod="openshift-multus/multus-n248g" Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.058322 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/1b377757-dbc6-4d9c-9656-3ff65d7d113a-host-var-lib-cni-multus\") pod \"multus-n248g\" (UID: \"1b377757-dbc6-4d9c-9656-3ff65d7d113a\") " pod="openshift-multus/multus-n248g" Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.058343 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a95a09e1-8457-4d5e-b1b6-dd6f59a66c6c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-964hc\" (UID: \"a95a09e1-8457-4d5e-b1b6-dd6f59a66c6c\") " pod="openshift-multus/multus-additional-cni-plugins-964hc" Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.058360 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/08e48f89-7095-4ea2-afb5-759591c2b0d4-env-overrides\") pod \"ovnkube-node-zwjhk\" (UID: 
\"08e48f89-7095-4ea2-afb5-759591c2b0d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.058398 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/08e48f89-7095-4ea2-afb5-759591c2b0d4-ovnkube-script-lib\") pod \"ovnkube-node-zwjhk\" (UID: \"08e48f89-7095-4ea2-afb5-759591c2b0d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.058413 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/1b377757-dbc6-4d9c-9656-3ff65d7d113a-multus-daemon-config\") pod \"multus-n248g\" (UID: \"1b377757-dbc6-4d9c-9656-3ff65d7d113a\") " pod="openshift-multus/multus-n248g" Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.058432 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a95a09e1-8457-4d5e-b1b6-dd6f59a66c6c-cnibin\") pod \"multus-additional-cni-plugins-964hc\" (UID: \"a95a09e1-8457-4d5e-b1b6-dd6f59a66c6c\") " pod="openshift-multus/multus-additional-cni-plugins-964hc" Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.058449 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/08e48f89-7095-4ea2-afb5-759591c2b0d4-var-lib-openvswitch\") pod \"ovnkube-node-zwjhk\" (UID: \"08e48f89-7095-4ea2-afb5-759591c2b0d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.058465 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a95a09e1-8457-4d5e-b1b6-dd6f59a66c6c-os-release\") pod \"multus-additional-cni-plugins-964hc\" (UID: \"a95a09e1-8457-4d5e-b1b6-dd6f59a66c6c\") " 
pod="openshift-multus/multus-additional-cni-plugins-964hc" Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.058491 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5scf\" (UniqueName: \"kubernetes.io/projected/a95a09e1-8457-4d5e-b1b6-dd6f59a66c6c-kube-api-access-r5scf\") pod \"multus-additional-cni-plugins-964hc\" (UID: \"a95a09e1-8457-4d5e-b1b6-dd6f59a66c6c\") " pod="openshift-multus/multus-additional-cni-plugins-964hc" Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.058513 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/08e48f89-7095-4ea2-afb5-759591c2b0d4-ovn-node-metrics-cert\") pod \"ovnkube-node-zwjhk\" (UID: \"08e48f89-7095-4ea2-afb5-759591c2b0d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.058539 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmpf8\" (UniqueName: \"kubernetes.io/projected/25dec47c-3043-486c-b371-2be103c214e3-kube-api-access-jmpf8\") pod \"machine-config-daemon-fv56f\" (UID: \"25dec47c-3043-486c-b371-2be103c214e3\") " pod="openshift-machine-config-operator/machine-config-daemon-fv56f" Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.058562 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a95a09e1-8457-4d5e-b1b6-dd6f59a66c6c-system-cni-dir\") pod \"multus-additional-cni-plugins-964hc\" (UID: \"a95a09e1-8457-4d5e-b1b6-dd6f59a66c6c\") " pod="openshift-multus/multus-additional-cni-plugins-964hc" Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.058638 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a95a09e1-8457-4d5e-b1b6-dd6f59a66c6c-system-cni-dir\") pod 
\"multus-additional-cni-plugins-964hc\" (UID: \"a95a09e1-8457-4d5e-b1b6-dd6f59a66c6c\") " pod="openshift-multus/multus-additional-cni-plugins-964hc" Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.058679 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/08e48f89-7095-4ea2-afb5-759591c2b0d4-host-kubelet\") pod \"ovnkube-node-zwjhk\" (UID: \"08e48f89-7095-4ea2-afb5-759591c2b0d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.058698 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/08e48f89-7095-4ea2-afb5-759591c2b0d4-log-socket\") pod \"ovnkube-node-zwjhk\" (UID: \"08e48f89-7095-4ea2-afb5-759591c2b0d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.059087 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/08e48f89-7095-4ea2-afb5-759591c2b0d4-run-openvswitch\") pod \"ovnkube-node-zwjhk\" (UID: \"08e48f89-7095-4ea2-afb5-759591c2b0d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.059171 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/08e48f89-7095-4ea2-afb5-759591c2b0d4-host-run-netns\") pod \"ovnkube-node-zwjhk\" (UID: \"08e48f89-7095-4ea2-afb5-759591c2b0d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.059216 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/1b377757-dbc6-4d9c-9656-3ff65d7d113a-host-run-multus-certs\") pod \"multus-n248g\" (UID: \"1b377757-dbc6-4d9c-9656-3ff65d7d113a\") " 
pod="openshift-multus/multus-n248g" Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.059187 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1b377757-dbc6-4d9c-9656-3ff65d7d113a-host-var-lib-kubelet\") pod \"multus-n248g\" (UID: \"1b377757-dbc6-4d9c-9656-3ff65d7d113a\") " pod="openshift-multus/multus-n248g" Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.059269 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/1b377757-dbc6-4d9c-9656-3ff65d7d113a-host-run-k8s-cni-cncf-io\") pod \"multus-n248g\" (UID: \"1b377757-dbc6-4d9c-9656-3ff65d7d113a\") " pod="openshift-multus/multus-n248g" Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.059304 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1b377757-dbc6-4d9c-9656-3ff65d7d113a-multus-conf-dir\") pod \"multus-n248g\" (UID: \"1b377757-dbc6-4d9c-9656-3ff65d7d113a\") " pod="openshift-multus/multus-n248g" Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.059317 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/08e48f89-7095-4ea2-afb5-759591c2b0d4-host-cni-netd\") pod \"ovnkube-node-zwjhk\" (UID: \"08e48f89-7095-4ea2-afb5-759591c2b0d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.059377 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1b377757-dbc6-4d9c-9656-3ff65d7d113a-cni-binary-copy\") pod \"multus-n248g\" (UID: \"1b377757-dbc6-4d9c-9656-3ff65d7d113a\") " pod="openshift-multus/multus-n248g" Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.059411 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/08e48f89-7095-4ea2-afb5-759591c2b0d4-node-log\") pod \"ovnkube-node-zwjhk\" (UID: \"08e48f89-7095-4ea2-afb5-759591c2b0d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.059430 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/1b377757-dbc6-4d9c-9656-3ff65d7d113a-host-var-lib-cni-multus\") pod \"multus-n248g\" (UID: \"1b377757-dbc6-4d9c-9656-3ff65d7d113a\") " pod="openshift-multus/multus-n248g" Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.059464 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1b377757-dbc6-4d9c-9656-3ff65d7d113a-etc-kubernetes\") pod \"multus-n248g\" (UID: \"1b377757-dbc6-4d9c-9656-3ff65d7d113a\") " pod="openshift-multus/multus-n248g" Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.059497 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/08e48f89-7095-4ea2-afb5-759591c2b0d4-host-cni-bin\") pod \"ovnkube-node-zwjhk\" (UID: \"08e48f89-7095-4ea2-afb5-759591c2b0d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.059502 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/1b377757-dbc6-4d9c-9656-3ff65d7d113a-multus-socket-dir-parent\") pod \"multus-n248g\" (UID: \"1b377757-dbc6-4d9c-9656-3ff65d7d113a\") " pod="openshift-multus/multus-n248g" Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.059707 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1b377757-dbc6-4d9c-9656-3ff65d7d113a-host-var-lib-cni-bin\") pod \"multus-n248g\" 
(UID: \"1b377757-dbc6-4d9c-9656-3ff65d7d113a\") " pod="openshift-multus/multus-n248g" Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.059759 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/08e48f89-7095-4ea2-afb5-759591c2b0d4-host-slash\") pod \"ovnkube-node-zwjhk\" (UID: \"08e48f89-7095-4ea2-afb5-759591c2b0d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.059747 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a95a09e1-8457-4d5e-b1b6-dd6f59a66c6c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-964hc\" (UID: \"a95a09e1-8457-4d5e-b1b6-dd6f59a66c6c\") " pod="openshift-multus/multus-additional-cni-plugins-964hc" Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.059791 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/08e48f89-7095-4ea2-afb5-759591c2b0d4-var-lib-openvswitch\") pod \"ovnkube-node-zwjhk\" (UID: \"08e48f89-7095-4ea2-afb5-759591c2b0d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.059757 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/08e48f89-7095-4ea2-afb5-759591c2b0d4-systemd-units\") pod \"ovnkube-node-zwjhk\" (UID: \"08e48f89-7095-4ea2-afb5-759591c2b0d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.059824 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/08e48f89-7095-4ea2-afb5-759591c2b0d4-host-run-ovn-kubernetes\") pod \"ovnkube-node-zwjhk\" (UID: \"08e48f89-7095-4ea2-afb5-759591c2b0d4\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.059852 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/25dec47c-3043-486c-b371-2be103c214e3-rootfs\") pod \"machine-config-daemon-fv56f\" (UID: \"25dec47c-3043-486c-b371-2be103c214e3\") " pod="openshift-machine-config-operator/machine-config-daemon-fv56f" Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.059897 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/08e48f89-7095-4ea2-afb5-759591c2b0d4-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-zwjhk\" (UID: \"08e48f89-7095-4ea2-afb5-759591c2b0d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.059936 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/08e48f89-7095-4ea2-afb5-759591c2b0d4-run-ovn\") pod \"ovnkube-node-zwjhk\" (UID: \"08e48f89-7095-4ea2-afb5-759591c2b0d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.059984 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1b377757-dbc6-4d9c-9656-3ff65d7d113a-os-release\") pod \"multus-n248g\" (UID: \"1b377757-dbc6-4d9c-9656-3ff65d7d113a\") " pod="openshift-multus/multus-n248g" Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.060339 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a95a09e1-8457-4d5e-b1b6-dd6f59a66c6c-os-release\") pod \"multus-additional-cni-plugins-964hc\" (UID: \"a95a09e1-8457-4d5e-b1b6-dd6f59a66c6c\") " pod="openshift-multus/multus-additional-cni-plugins-964hc" Dec 16 06:55:40 crc 
kubenswrapper[4823]: I1216 06:55:40.060464 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1b377757-dbc6-4d9c-9656-3ff65d7d113a-system-cni-dir\") pod \"multus-n248g\" (UID: \"1b377757-dbc6-4d9c-9656-3ff65d7d113a\") " pod="openshift-multus/multus-n248g" Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.060478 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/08e48f89-7095-4ea2-afb5-759591c2b0d4-etc-openvswitch\") pod \"ovnkube-node-zwjhk\" (UID: \"08e48f89-7095-4ea2-afb5-759591c2b0d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.060392 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/08e48f89-7095-4ea2-afb5-759591c2b0d4-ovnkube-config\") pod \"ovnkube-node-zwjhk\" (UID: \"08e48f89-7095-4ea2-afb5-759591c2b0d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.060455 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a95a09e1-8457-4d5e-b1b6-dd6f59a66c6c-cnibin\") pod \"multus-additional-cni-plugins-964hc\" (UID: \"a95a09e1-8457-4d5e-b1b6-dd6f59a66c6c\") " pod="openshift-multus/multus-additional-cni-plugins-964hc" Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.060509 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/08e48f89-7095-4ea2-afb5-759591c2b0d4-run-systemd\") pod \"ovnkube-node-zwjhk\" (UID: \"08e48f89-7095-4ea2-afb5-759591c2b0d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.060536 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" 
(UniqueName: \"kubernetes.io/host-path/1b377757-dbc6-4d9c-9656-3ff65d7d113a-hostroot\") pod \"multus-n248g\" (UID: \"1b377757-dbc6-4d9c-9656-3ff65d7d113a\") " pod="openshift-multus/multus-n248g" Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.060576 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1b377757-dbc6-4d9c-9656-3ff65d7d113a-cnibin\") pod \"multus-n248g\" (UID: \"1b377757-dbc6-4d9c-9656-3ff65d7d113a\") " pod="openshift-multus/multus-n248g" Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.060681 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/25dec47c-3043-486c-b371-2be103c214e3-mcd-auth-proxy-config\") pod \"machine-config-daemon-fv56f\" (UID: \"25dec47c-3043-486c-b371-2be103c214e3\") " pod="openshift-machine-config-operator/machine-config-daemon-fv56f" Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.060391 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1b377757-dbc6-4d9c-9656-3ff65d7d113a-host-run-netns\") pod \"multus-n248g\" (UID: \"1b377757-dbc6-4d9c-9656-3ff65d7d113a\") " pod="openshift-multus/multus-n248g" Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.060822 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a95a09e1-8457-4d5e-b1b6-dd6f59a66c6c-cni-binary-copy\") pod \"multus-additional-cni-plugins-964hc\" (UID: \"a95a09e1-8457-4d5e-b1b6-dd6f59a66c6c\") " pod="openshift-multus/multus-additional-cni-plugins-964hc" Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.061040 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/08e48f89-7095-4ea2-afb5-759591c2b0d4-env-overrides\") pod \"ovnkube-node-zwjhk\" (UID: 
\"08e48f89-7095-4ea2-afb5-759591c2b0d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.061532 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1b377757-dbc6-4d9c-9656-3ff65d7d113a-multus-cni-dir\") pod \"multus-n248g\" (UID: \"1b377757-dbc6-4d9c-9656-3ff65d7d113a\") " pod="openshift-multus/multus-n248g" Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.061525 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/1b377757-dbc6-4d9c-9656-3ff65d7d113a-multus-daemon-config\") pod \"multus-n248g\" (UID: \"1b377757-dbc6-4d9c-9656-3ff65d7d113a\") " pod="openshift-multus/multus-n248g" Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.061559 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/08e48f89-7095-4ea2-afb5-759591c2b0d4-ovnkube-script-lib\") pod \"ovnkube-node-zwjhk\" (UID: \"08e48f89-7095-4ea2-afb5-759591c2b0d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.062202 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a95a09e1-8457-4d5e-b1b6-dd6f59a66c6c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-964hc\" (UID: \"a95a09e1-8457-4d5e-b1b6-dd6f59a66c6c\") " pod="openshift-multus/multus-additional-cni-plugins-964hc" Dec 16 06:55:40 crc kubenswrapper[4823]: W1216 06:55:40.065535 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-f985582e3396c62b8f5340be4b9a6c74e07db9674fed3f1a3d51c62cdf37450c WatchSource:0}: Error finding container 
f985582e3396c62b8f5340be4b9a6c74e07db9674fed3f1a3d51c62cdf37450c: Status 404 returned error can't find the container with id f985582e3396c62b8f5340be4b9a6c74e07db9674fed3f1a3d51c62cdf37450c Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.065819 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hr8h5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d31d032-9142-4e26-9f06-e3a5ea73d530\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sdtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hr8h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.076509 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/08e48f89-7095-4ea2-afb5-759591c2b0d4-ovn-node-metrics-cert\") pod \"ovnkube-node-zwjhk\" (UID: \"08e48f89-7095-4ea2-afb5-759591c2b0d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.078377 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/25dec47c-3043-486c-b371-2be103c214e3-proxy-tls\") pod \"machine-config-daemon-fv56f\" (UID: 
\"25dec47c-3043-486c-b371-2be103c214e3\") " pod="openshift-machine-config-operator/machine-config-daemon-fv56f" Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.082687 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmpf8\" (UniqueName: \"kubernetes.io/projected/25dec47c-3043-486c-b371-2be103c214e3-kube-api-access-jmpf8\") pod \"machine-config-daemon-fv56f\" (UID: \"25dec47c-3043-486c-b371-2be103c214e3\") " pod="openshift-machine-config-operator/machine-config-daemon-fv56f" Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.085713 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5scf\" (UniqueName: \"kubernetes.io/projected/a95a09e1-8457-4d5e-b1b6-dd6f59a66c6c-kube-api-access-r5scf\") pod \"multus-additional-cni-plugins-964hc\" (UID: \"a95a09e1-8457-4d5e-b1b6-dd6f59a66c6c\") " pod="openshift-multus/multus-additional-cni-plugins-964hc" Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.098767 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzgs7\" (UniqueName: \"kubernetes.io/projected/08e48f89-7095-4ea2-afb5-759591c2b0d4-kube-api-access-qzgs7\") pod \"ovnkube-node-zwjhk\" (UID: \"08e48f89-7095-4ea2-afb5-759591c2b0d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.116747 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n248g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b377757-dbc6-4d9c-9656-3ff65d7d113a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2skkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n248g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.124271 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2skkc\" (UniqueName: \"kubernetes.io/projected/1b377757-dbc6-4d9c-9656-3ff65d7d113a-kube-api-access-2skkc\") pod \"multus-n248g\" (UID: \"1b377757-dbc6-4d9c-9656-3ff65d7d113a\") " pod="openshift-multus/multus-n248g" Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.182395 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-964hc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a95a09e1-8457-4d5e-b1b6-dd6f59a66c6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-964hc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.211638 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08e48f89-7095-4ea2-afb5-759591c2b0d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zwjhk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.224465 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-n248g" Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.235763 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-964hc" Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.243251 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.252596 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.366428 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.375174 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.375254 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.375416 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.375457 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:55:40 crc kubenswrapper[4823]: E1216 06:55:40.378159 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:55:41.378119617 +0000 UTC m=+19.866686060 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:55:40 crc kubenswrapper[4823]: E1216 06:55:40.378187 4823 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 16 06:55:40 crc kubenswrapper[4823]: E1216 06:55:40.378274 4823 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 16 06:55:40 crc kubenswrapper[4823]: E1216 06:55:40.378304 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-16 06:55:41.378279552 +0000 UTC m=+19.866845675 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 16 06:55:40 crc kubenswrapper[4823]: E1216 06:55:40.378330 4823 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 16 06:55:40 crc kubenswrapper[4823]: E1216 06:55:40.378353 4823 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 16 06:55:40 crc kubenswrapper[4823]: E1216 06:55:40.378366 4823 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 06:55:40 crc kubenswrapper[4823]: E1216 06:55:40.378281 4823 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 16 06:55:40 crc kubenswrapper[4823]: E1216 06:55:40.378449 4823 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 16 06:55:40 crc kubenswrapper[4823]: E1216 06:55:40.378384 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-12-16 06:55:41.378348924 +0000 UTC m=+19.866915057 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 16 06:55:40 crc kubenswrapper[4823]: E1216 06:55:40.378477 4823 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 06:55:40 crc kubenswrapper[4823]: E1216 06:55:40.378522 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-16 06:55:41.378484919 +0000 UTC m=+19.867051042 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 06:55:40 crc kubenswrapper[4823]: E1216 06:55:40.378593 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-16 06:55:41.378568651 +0000 UTC m=+19.867134834 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.897154 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" event={"ID":"25dec47c-3043-486c-b371-2be103c214e3","Type":"ContainerStarted","Data":"78aed5cdf8d3a29dfffcdabfd18b38c0cf76225853078e01602198bee4994536"} Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.897677 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" event={"ID":"25dec47c-3043-486c-b371-2be103c214e3","Type":"ContainerStarted","Data":"512e2be9b1f2abff23016b09c065fc232e71d62a234f19773d7ead5be86bf63d"} Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.898944 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-hr8h5" event={"ID":"2d31d032-9142-4e26-9f06-e3a5ea73d530","Type":"ContainerStarted","Data":"502e1707b8f53950b8f10fa6b289c9d235d6596fb6eec7e4d69c3158823f8e2b"} Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.898977 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-hr8h5" event={"ID":"2d31d032-9142-4e26-9f06-e3a5ea73d530","Type":"ContainerStarted","Data":"39ea13f1987ce0d6ea7ad1ee85a0e6713732ee81b488d006a36175232bc4076f"} Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.905543 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"e388fa291257a537c122a1f3f0eb6a60eb74e4bcaaf32c4fe829be2da1bfdae9"} Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.905634 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"3b915197974524b79b6e5d494ff2bdd9cafaa0d0b2073dd72eb48d7d759b3896"} Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.906893 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"d66e8c1b1de0a70638be23e8c6868e99f3a1ed7b7e3a259af6a4444cefb8d86b"} Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.908616 4823 generic.go:334] "Generic (PLEG): container finished" podID="08e48f89-7095-4ea2-afb5-759591c2b0d4" containerID="1e3566766c9c24f5c12481b74ed8a3938b21bdb36c003146515c67170d74a71f" exitCode=0 Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.908686 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" event={"ID":"08e48f89-7095-4ea2-afb5-759591c2b0d4","Type":"ContainerDied","Data":"1e3566766c9c24f5c12481b74ed8a3938b21bdb36c003146515c67170d74a71f"} Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.908709 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" event={"ID":"08e48f89-7095-4ea2-afb5-759591c2b0d4","Type":"ContainerStarted","Data":"bc79567ff36b0deb62e06420030baecbdaf9941bbd19c29426be5e0056970c21"} Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.912206 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-n248g" 
event={"ID":"1b377757-dbc6-4d9c-9656-3ff65d7d113a","Type":"ContainerStarted","Data":"78e6c48a7009b10b70f11bfc96f18839781aeccd794f9ce7a4074271cfb9becc"} Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.912241 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-n248g" event={"ID":"1b377757-dbc6-4d9c-9656-3ff65d7d113a","Type":"ContainerStarted","Data":"73a7b14ac27f111b5bc97b0c7705f86e48d91dca5dfc0500788958232816bb32"} Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.915994 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c915dd50-9820-494e-b47a-987257910a57\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c33a30aed9756ef012ff9046309ec07897de3b17707ea39c621534ed863c3adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://464ab3cf14920ae3f947ce3dccacd7379d50380699008411d0a51bd83fe3e51c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://a20f7e0e3c07f467643afdcc6f0e333353d2d10b6d0c0d7aff0d347d33a5d925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c9e6ec333ac552dc972c31cce23104cc27bcc2cf11bdc52d89ab36172915a41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c9e6ec333ac552dc972c31cce23104cc27bcc2cf11bdc52d89ab36172915a41\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"ed a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1853467785/tls.crt::/tmp/serving-cert-1853467785/tls.key\\\\\\\"\\\\nI1216 06:55:39.731946 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 06:55:39.736431 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 06:55:39.736560 1 
maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 06:55:39.736611 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 06:55:39.736722 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 06:55:39.741676 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 06:55:39.741701 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 06:55:39.741707 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 06:55:39.741713 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 06:55:39.741717 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 06:55:39.741721 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 06:55:39.741725 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 06:55:39.741774 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 06:55:39.744309 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1853467785/tls.crt::/tmp/serving-cert-1853467785/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765868124\\\\\\\\\\\\\\\" (2025-12-16 06:55:23 +0000 UTC to 2026-01-15 06:55:24 +0000 UTC (now=2025-12-16 06:55:39.744275203 +0000 UTC))\\\\\\\"\\\\nF1216 06:55:39.744562 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c84e7f6950c8d433823b7d5ae3d8a9ecaa70ab6845ce607b8bb93efa990325a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84f11d2687959c8525b131966a998af137caa69a480bb31a94e1615a3ccbea5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84f11d2687959c8525b131966a998af137caa69a480bb31a94e1615a3ccbea5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.916743 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"63ba735c509770e9d43fe8152b2a42dc4123d0d797c7d41d5e3efe39a6cef74d"} Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.916785 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"e09486327bad82b741ca9260e306f84ed9f0fbc7058bb06267042b8d2cd5f818"} Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.916797 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"f985582e3396c62b8f5340be4b9a6c74e07db9674fed3f1a3d51c62cdf37450c"} Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.921427 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-964hc" event={"ID":"a95a09e1-8457-4d5e-b1b6-dd6f59a66c6c","Type":"ContainerStarted","Data":"4b305ee45caf9aff77bb738777dd9b2ebd2e3e63727b460eae22f8f66d1e6cec"} Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.921466 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-964hc" 
event={"ID":"a95a09e1-8457-4d5e-b1b6-dd6f59a66c6c","Type":"ContainerStarted","Data":"47062d92fe1df93c678fef27723b2f31d32f60860d5190f7e90beb2dae13d6f7"} Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.923462 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.925987 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"7ddfb7d6dae98465369260c8e9c64a2aa21dbf226e7898f6de0c36b3806bcae8"} Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.926844 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.943757 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hr8h5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d31d032-9142-4e26-9f06-e3a5ea73d530\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://502e1707b8f53950b8f10fa6b289c9d235d6596fb6eec7e4d69c3158823f8e2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sdtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hr8h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.959292 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n248g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b377757-dbc6-4d9c-9656-3ff65d7d113a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2skkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n248g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.977128 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-964hc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a95a09e1-8457-4d5e-b1b6-dd6f59a66c6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-964hc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 16 06:55:40 crc kubenswrapper[4823]: I1216 06:55:40.994875 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 16 06:55:41 crc kubenswrapper[4823]: I1216 06:55:41.012406 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Dec 16 06:55:41 crc kubenswrapper[4823]: I1216 06:55:41.030502 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 16 06:55:41 crc kubenswrapper[4823]: I1216 06:55:41.054051 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:41Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:41 crc kubenswrapper[4823]: I1216 06:55:41.097548 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"08e48f89-7095-4ea2-afb5-759591c2b0d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zwjhk\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:41Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:41 crc kubenswrapper[4823]: I1216 06:55:41.114940 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:41Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:41 crc kubenswrapper[4823]: I1216 06:55:41.134263 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:41Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:41 crc kubenswrapper[4823]: I1216 06:55:41.149581 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"25dec47c-3043-486c-b371-2be103c214e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmpf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmpf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fv56f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:41Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:41 crc kubenswrapper[4823]: I1216 06:55:41.174668 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e388fa291257a537c122a1f3f0eb6a60eb74e4bcaaf32c4fe829be2da1bfdae9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/ser
viceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:41Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:41 crc kubenswrapper[4823]: I1216 06:55:41.203119 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:41Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:41 crc kubenswrapper[4823]: I1216 06:55:41.236251 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"25dec47c-3043-486c-b371-2be103c214e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmpf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmpf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fv56f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:41Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:41 crc kubenswrapper[4823]: I1216 06:55:41.252121 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c915dd50-9820-494e-b47a-987257910a57\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c33a30aed9756ef012ff9046309ec07897de3b17707ea39c621534ed863c3adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://464ab3cf14920ae3f947ce3dccacd7379d50380699008411d0a51bd83fe3e51c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://a20f7e0e3c07f467643afdcc6f0e333353d2d10b6d0c0d7aff0d347d33a5d925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ddfb7d6dae98465369260c8e9c64a2aa21dbf226e7898f6de0c36b3806bcae8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c9e6ec333ac552dc972c31cce23104cc27bcc2cf11bdc52d89ab36172915a41\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"ed a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1853467785/tls.crt::/tmp/serving-cert-1853467785/tls.key\\\\\\\"\\\\nI1216 06:55:39.731946 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 06:55:39.736431 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 06:55:39.736560 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 06:55:39.736611 1 maxinflight.go:116] \\\\\\\"Set 
denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 06:55:39.736722 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 06:55:39.741676 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 06:55:39.741701 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 06:55:39.741707 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 06:55:39.741713 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 06:55:39.741717 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 06:55:39.741721 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 06:55:39.741725 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 06:55:39.741774 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 06:55:39.744309 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1853467785/tls.crt::/tmp/serving-cert-1853467785/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765868124\\\\\\\\\\\\\\\" (2025-12-16 06:55:23 +0000 UTC to 2026-01-15 06:55:24 +0000 UTC (now=2025-12-16 06:55:39.744275203 +0000 UTC))\\\\\\\"\\\\nF1216 06:55:39.744562 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c84e7f6950c8d433823b7d5ae3d8a9ecaa70ab6845ce607b8bb93efa990325a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84f11d2687959c8525b131966a998af137caa69a480bb31a94e1615a3ccbea5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84f11d2687959c8525b131966a998af137caa69a480bb31a94e1615a3ccbea5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:41Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:41 crc kubenswrapper[4823]: I1216 06:55:41.266137 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:41Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:41 crc kubenswrapper[4823]: I1216 06:55:41.277955 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63ba735c509770e9d43fe8152b2a42dc4123d0d797c7d41d5e3efe39a6cef74d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e09486327bad82b741ca9260e306f84ed9f0fbc7058bb06267042b8d2cd5f818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:41Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:41 crc kubenswrapper[4823]: I1216 06:55:41.296480 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:41Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:41 crc kubenswrapper[4823]: I1216 06:55:41.312824 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:41Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:41 crc kubenswrapper[4823]: I1216 06:55:41.327805 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hr8h5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d31d032-9142-4e26-9f06-e3a5ea73d530\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://502e1707b8f53950b8f10fa6b289c9d235d6596fb6eec7e4d69c3158823f8e2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sdtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hr8h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:41Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:41 crc kubenswrapper[4823]: I1216 06:55:41.345896 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n248g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b377757-dbc6-4d9c-9656-3ff65d7d113a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78e6c48a7009b10b70f11bfc96f18839781aeccd794f9ce7a4074271cfb9becc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2skkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n248g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:41Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:41 crc kubenswrapper[4823]: I1216 06:55:41.365224 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-964hc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a95a09e1-8457-4d5e-b1b6-dd6f59a66c6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b305ee45caf9aff77bb738777dd9b2ebd2e3e63727b460eae22f8f66d1e6cec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"ima
ge\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69
b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volum
eMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-964hc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:41Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:41 crc kubenswrapper[4823]: I1216 06:55:41.385116 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:55:41 crc kubenswrapper[4823]: E1216 06:55:41.385318 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:55:43.385290986 +0000 UTC m=+21.873857099 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:55:41 crc kubenswrapper[4823]: I1216 06:55:41.385422 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:55:41 crc kubenswrapper[4823]: I1216 06:55:41.385447 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:55:41 crc kubenswrapper[4823]: I1216 06:55:41.385477 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:55:41 crc kubenswrapper[4823]: I1216 06:55:41.385507 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:55:41 crc kubenswrapper[4823]: E1216 06:55:41.385606 4823 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 16 06:55:41 crc kubenswrapper[4823]: E1216 06:55:41.385626 4823 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 16 06:55:41 crc kubenswrapper[4823]: E1216 06:55:41.385648 4823 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 16 06:55:41 crc kubenswrapper[4823]: E1216 06:55:41.385670 4823 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 16 06:55:41 crc kubenswrapper[4823]: E1216 06:55:41.385684 4823 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 16 06:55:41 crc kubenswrapper[4823]: E1216 06:55:41.385685 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-16 06:55:43.385660267 +0000 UTC m=+21.874226390 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 16 06:55:41 crc kubenswrapper[4823]: E1216 06:55:41.385691 4823 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 16 06:55:41 crc kubenswrapper[4823]: E1216 06:55:41.385707 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-16 06:55:43.385699719 +0000 UTC m=+21.874265942 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 16 06:55:41 crc kubenswrapper[4823]: E1216 06:55:41.385709 4823 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 06:55:41 crc kubenswrapper[4823]: E1216 06:55:41.385727 4823 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 06:55:41 crc kubenswrapper[4823]: E1216 06:55:41.385763 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-16 06:55:43.38575468 +0000 UTC m=+21.874320803 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 06:55:41 crc kubenswrapper[4823]: E1216 06:55:41.385779 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-16 06:55:43.385772751 +0000 UTC m=+21.874338874 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 06:55:41 crc kubenswrapper[4823]: I1216 06:55:41.391678 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08e48f89-7095-4ea2-afb5-759591c2b0d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e3566766c9c24f5c12481b74ed8a3938b21bdb36c003146515c67170d74a71f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e3566766c9c24f5c12481b74ed8a3938b21bdb36c003146515c67170d74a71f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zwjhk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:41Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:41 crc kubenswrapper[4823]: I1216 06:55:41.495549 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 16 06:55:41 crc kubenswrapper[4823]: I1216 06:55:41.501468 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 16 06:55:41 crc kubenswrapper[4823]: I1216 06:55:41.506447 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Dec 16 06:55:41 crc kubenswrapper[4823]: I1216 06:55:41.522319 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"08e48f89-7095-4ea2-afb5-759591c2b0d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e3566766c9c24f5c12481b74ed8a3938b21bdb36c003146515c67170d74a71f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e3566766c9c24f5c12481b74ed8a3938b21bdb36c003146515c67170d74a71f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zwjhk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:41Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:41 crc kubenswrapper[4823]: I1216 06:55:41.538053 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e388fa291257a537c122a1f3f0eb6a60eb74e4bcaaf32c4fe829be2da1bfdae9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:41Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:41 crc kubenswrapper[4823]: I1216 06:55:41.564534 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:41Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:41 crc kubenswrapper[4823]: I1216 06:55:41.581310 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"25dec47c-3043-486c-b371-2be103c214e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmpf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmpf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fv56f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:41Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:41 crc kubenswrapper[4823]: I1216 06:55:41.593710 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c915dd50-9820-494e-b47a-987257910a57\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c33a30aed9756ef012ff9046309ec07897de3b17707ea39c621534ed863c3adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://464ab3cf14920ae3f947ce3dccacd7379d50380699008411d0a51bd83fe3e51c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://a20f7e0e3c07f467643afdcc6f0e333353d2d10b6d0c0d7aff0d347d33a5d925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ddfb7d6dae98465369260c8e9c64a2aa21dbf226e7898f6de0c36b3806bcae8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c9e6ec333ac552dc972c31cce23104cc27bcc2cf11bdc52d89ab36172915a41\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"ed a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1853467785/tls.crt::/tmp/serving-cert-1853467785/tls.key\\\\\\\"\\\\nI1216 06:55:39.731946 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 06:55:39.736431 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 06:55:39.736560 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 06:55:39.736611 1 maxinflight.go:116] \\\\\\\"Set 
denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 06:55:39.736722 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 06:55:39.741676 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 06:55:39.741701 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 06:55:39.741707 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 06:55:39.741713 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 06:55:39.741717 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 06:55:39.741721 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 06:55:39.741725 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 06:55:39.741774 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 06:55:39.744309 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1853467785/tls.crt::/tmp/serving-cert-1853467785/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765868124\\\\\\\\\\\\\\\" (2025-12-16 06:55:23 +0000 UTC to 2026-01-15 06:55:24 +0000 UTC (now=2025-12-16 06:55:39.744275203 +0000 UTC))\\\\\\\"\\\\nF1216 06:55:39.744562 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c84e7f6950c8d433823b7d5ae3d8a9ecaa70ab6845ce607b8bb93efa990325a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84f11d2687959c8525b131966a998af137caa69a480bb31a94e1615a3ccbea5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84f11d2687959c8525b131966a998af137caa69a480bb31a94e1615a3ccbea5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:41Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:41 crc kubenswrapper[4823]: I1216 06:55:41.613006 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-964hc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a95a09e1-8457-4d5e-b1b6-dd6f59a66c6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b305ee45caf9aff77bb738777dd9b2ebd2e3e63727b460eae22f8f66d1e6cec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\
"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d
742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-964hc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:41Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:41 crc kubenswrapper[4823]: I1216 06:55:41.625268 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:41Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:41 crc kubenswrapper[4823]: I1216 06:55:41.637973 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63ba735c509770e9d43fe8152b2a42dc4123d0d797c7d41d5e3efe39a6cef74d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e09486327bad82b741ca9260e306f84ed9f0fbc7058bb06267042b8d2cd5f818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:41Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:41 crc kubenswrapper[4823]: I1216 06:55:41.654259 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:41Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:41 crc kubenswrapper[4823]: I1216 06:55:41.667810 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:41Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:41 crc kubenswrapper[4823]: I1216 06:55:41.688731 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hr8h5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d31d032-9142-4e26-9f06-e3a5ea73d530\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://502e1707b8f53950b8f10fa6b289c9d235d6596fb6eec7e4d69c3158823f8e2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sdtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hr8h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:41Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:41 crc kubenswrapper[4823]: I1216 06:55:41.738948 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n248g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b377757-dbc6-4d9c-9656-3ff65d7d113a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78e6c48a7009b10b70f11bfc96f18839781aeccd794f9ce7a4074271cfb9becc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2skkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n248g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:41Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:41 crc kubenswrapper[4823]: I1216 06:55:41.759947 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:41Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:41 crc kubenswrapper[4823]: I1216 06:55:41.772387 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:55:41 crc kubenswrapper[4823]: E1216 06:55:41.772493 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 06:55:41 crc kubenswrapper[4823]: I1216 06:55:41.772797 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:55:41 crc kubenswrapper[4823]: E1216 06:55:41.772848 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 06:55:41 crc kubenswrapper[4823]: I1216 06:55:41.772355 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:55:41 crc kubenswrapper[4823]: E1216 06:55:41.773688 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 06:55:41 crc kubenswrapper[4823]: I1216 06:55:41.775804 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63ba735c509770e9d43fe8152b2a42dc4123d0d797c7d41d5e3efe39a6cef74d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":
\\\"cri-o://e09486327bad82b741ca9260e306f84ed9f0fbc7058bb06267042b8d2cd5f818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:41Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:41 crc kubenswrapper[4823]: I1216 06:55:41.778955 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Dec 16 06:55:41 crc kubenswrapper[4823]: I1216 06:55:41.779754 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Dec 16 06:55:41 crc kubenswrapper[4823]: I1216 06:55:41.780401 4823 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Dec 16 06:55:41 crc kubenswrapper[4823]: I1216 06:55:41.780996 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Dec 16 06:55:41 crc kubenswrapper[4823]: I1216 06:55:41.782372 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Dec 16 06:55:41 crc kubenswrapper[4823]: I1216 06:55:41.782878 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Dec 16 06:55:41 crc kubenswrapper[4823]: I1216 06:55:41.783878 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Dec 16 06:55:41 crc kubenswrapper[4823]: I1216 06:55:41.784434 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Dec 16 06:55:41 crc kubenswrapper[4823]: I1216 06:55:41.785497 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Dec 16 06:55:41 crc kubenswrapper[4823]: I1216 06:55:41.785976 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Dec 16 06:55:41 crc kubenswrapper[4823]: I1216 06:55:41.786860 4823 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Dec 16 06:55:41 crc kubenswrapper[4823]: I1216 06:55:41.787541 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Dec 16 06:55:41 crc kubenswrapper[4823]: I1216 06:55:41.787763 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:41Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:41 crc kubenswrapper[4823]: I1216 06:55:41.788069 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Dec 16 06:55:41 crc kubenswrapper[4823]: I1216 06:55:41.788957 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Dec 16 06:55:41 crc kubenswrapper[4823]: I1216 06:55:41.789478 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Dec 16 06:55:41 crc kubenswrapper[4823]: I1216 06:55:41.793389 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Dec 16 06:55:41 crc kubenswrapper[4823]: I1216 06:55:41.793952 
4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Dec 16 06:55:41 crc kubenswrapper[4823]: I1216 06:55:41.794359 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Dec 16 06:55:41 crc kubenswrapper[4823]: I1216 06:55:41.795274 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Dec 16 06:55:41 crc kubenswrapper[4823]: I1216 06:55:41.795860 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Dec 16 06:55:41 crc kubenswrapper[4823]: I1216 06:55:41.796711 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Dec 16 06:55:41 crc kubenswrapper[4823]: I1216 06:55:41.797267 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Dec 16 06:55:41 crc kubenswrapper[4823]: I1216 06:55:41.797683 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Dec 16 06:55:41 crc kubenswrapper[4823]: I1216 06:55:41.798645 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Dec 16 06:55:41 crc kubenswrapper[4823]: I1216 06:55:41.799023 
4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Dec 16 06:55:41 crc kubenswrapper[4823]: I1216 06:55:41.800012 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Dec 16 06:55:41 crc kubenswrapper[4823]: I1216 06:55:41.800782 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Dec 16 06:55:41 crc kubenswrapper[4823]: I1216 06:55:41.801610 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Dec 16 06:55:41 crc kubenswrapper[4823]: I1216 06:55:41.802194 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Dec 16 06:55:41 crc kubenswrapper[4823]: I1216 06:55:41.803015 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Dec 16 06:55:41 crc kubenswrapper[4823]: I1216 06:55:41.803124 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:41Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:41 crc kubenswrapper[4823]: I1216 06:55:41.803488 4823 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Dec 16 06:55:41 crc kubenswrapper[4823]: I1216 06:55:41.803611 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Dec 16 06:55:41 crc kubenswrapper[4823]: I1216 06:55:41.805208 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Dec 16 06:55:41 crc kubenswrapper[4823]: I1216 06:55:41.806073 4823 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Dec 16 06:55:41 crc kubenswrapper[4823]: I1216 06:55:41.806489 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Dec 16 06:55:41 crc kubenswrapper[4823]: I1216 06:55:41.808255 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Dec 16 06:55:41 crc kubenswrapper[4823]: I1216 06:55:41.809390 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Dec 16 06:55:41 crc kubenswrapper[4823]: I1216 06:55:41.809946 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Dec 16 06:55:41 crc kubenswrapper[4823]: I1216 06:55:41.810915 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Dec 16 06:55:41 crc kubenswrapper[4823]: I1216 06:55:41.811554 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Dec 16 06:55:41 crc kubenswrapper[4823]: I1216 06:55:41.814181 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Dec 16 06:55:41 crc kubenswrapper[4823]: I1216 06:55:41.814631 4823 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-dns/node-resolver-hr8h5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d31d032-9142-4e26-9f06-e3a5ea73d530\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://502e1707b8f53950b8f10fa6b289c9d235d6596fb6eec7e4d69c3158823f8e2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sdtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\
":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hr8h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:41Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:41 crc kubenswrapper[4823]: I1216 06:55:41.814750 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Dec 16 06:55:41 crc kubenswrapper[4823]: I1216 06:55:41.815728 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Dec 16 06:55:41 crc kubenswrapper[4823]: I1216 06:55:41.816325 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Dec 16 06:55:41 crc kubenswrapper[4823]: I1216 06:55:41.817140 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Dec 16 06:55:41 crc kubenswrapper[4823]: I1216 06:55:41.817661 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Dec 16 06:55:41 crc kubenswrapper[4823]: I1216 06:55:41.818547 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" 
path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Dec 16 06:55:41 crc kubenswrapper[4823]: I1216 06:55:41.819232 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Dec 16 06:55:41 crc kubenswrapper[4823]: I1216 06:55:41.820089 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Dec 16 06:55:41 crc kubenswrapper[4823]: I1216 06:55:41.820558 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Dec 16 06:55:41 crc kubenswrapper[4823]: I1216 06:55:41.821374 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Dec 16 06:55:41 crc kubenswrapper[4823]: I1216 06:55:41.821907 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Dec 16 06:55:41 crc kubenswrapper[4823]: I1216 06:55:41.822623 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Dec 16 06:55:41 crc kubenswrapper[4823]: I1216 06:55:41.823468 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Dec 16 06:55:41 crc kubenswrapper[4823]: I1216 06:55:41.837819 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n248g" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b377757-dbc6-4d9c-9656-3ff65d7d113a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78e6c48a7009b10b70f11bfc96f18839781aeccd794f9ce7a4074271cfb9becc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"moun
tPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2skkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n248g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:41Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:41 crc kubenswrapper[4823]: I1216 06:55:41.867561 4823 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-additional-cni-plugins-964hc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a95a09e1-8457-4d5e-b1b6-dd6f59a66c6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b305ee45caf9aff77bb738777dd9b2ebd2e3e63727b460eae22f8f66d1e6cec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"ima
ge\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69
b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volum
eMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-964hc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:41Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:41 crc kubenswrapper[4823]: I1216 06:55:41.906985 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08e48f89-7095-4ea2-afb5-759591c2b0d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e3566766c9c24f5c12481b74ed8a3938b21bdb36c003146515c67170d74a71f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://1e3566766c9c24f5c12481b74ed8a3938b21bdb36c003146515c67170d74a71f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zwjhk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:41Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:41 crc kubenswrapper[4823]: I1216 06:55:41.931495 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e388fa291257a537c122a1f3f0eb6a60eb74e4bcaaf32c4fe829be2da1bfdae9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:41Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:41 crc kubenswrapper[4823]: I1216 06:55:41.938262 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" event={"ID":"08e48f89-7095-4ea2-afb5-759591c2b0d4","Type":"ContainerStarted","Data":"b768d92db57918a41c53b83940957c0181865808eb2c795231a5f8dbbf2bde82"} Dec 16 06:55:41 crc kubenswrapper[4823]: I1216 06:55:41.938314 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" event={"ID":"08e48f89-7095-4ea2-afb5-759591c2b0d4","Type":"ContainerStarted","Data":"b3d0f51f7ac1df7a9ae61306ccbfe2c93b397b6ab9ae0a477ab4eb46248d36f1"} Dec 16 06:55:41 crc kubenswrapper[4823]: I1216 06:55:41.938326 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" event={"ID":"08e48f89-7095-4ea2-afb5-759591c2b0d4","Type":"ContainerStarted","Data":"c1c6cb023dcc6d1b149de201b108e3b1f3e32530dadc822aedede5552efc586b"} Dec 16 06:55:41 crc kubenswrapper[4823]: I1216 06:55:41.938335 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" event={"ID":"08e48f89-7095-4ea2-afb5-759591c2b0d4","Type":"ContainerStarted","Data":"304f9e1fee6b648f61b3b334284a954e8107ff70ca71d01b586d137490eeab20"} Dec 16 06:55:41 crc kubenswrapper[4823]: I1216 06:55:41.943685 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" event={"ID":"25dec47c-3043-486c-b371-2be103c214e3","Type":"ContainerStarted","Data":"f2bf5eb4e2f587f7084f731ca681a116313b1014b02cdb391b09fdcdd42600c4"} Dec 16 06:55:41 crc kubenswrapper[4823]: I1216 06:55:41.945166 4823 generic.go:334] "Generic (PLEG): container finished" podID="a95a09e1-8457-4d5e-b1b6-dd6f59a66c6c" containerID="4b305ee45caf9aff77bb738777dd9b2ebd2e3e63727b460eae22f8f66d1e6cec" exitCode=0 Dec 16 
06:55:41 crc kubenswrapper[4823]: I1216 06:55:41.945701 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-964hc" event={"ID":"a95a09e1-8457-4d5e-b1b6-dd6f59a66c6c","Type":"ContainerDied","Data":"4b305ee45caf9aff77bb738777dd9b2ebd2e3e63727b460eae22f8f66d1e6cec"} Dec 16 06:55:41 crc kubenswrapper[4823]: I1216 06:55:41.964860 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:41Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:41 crc kubenswrapper[4823]: I1216 06:55:41.987640 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"25dec47c-3043-486c-b371-2be103c214e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmpf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmpf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fv56f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:41Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:42 crc kubenswrapper[4823]: I1216 06:55:42.010896 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4c70365-dff4-4b29-af25-657fd9823db4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59ddc7310ed1600d050e488762436763220f3c2946a4c2020346da4ccef46b1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\
\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://275328a704b475b037c82f92055f9ca7f83b9821325603c5e34090140cc486ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7fa96323e345f8891f51500618a7b07d128d3e7256a3534d9dca75ff8ce9edf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d706c1463d2c913afe727db1a87a2841698d91a586c84d6b55f02865e7f5584a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578
bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:42Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:42 crc kubenswrapper[4823]: I1216 06:55:42.026916 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c915dd50-9820-494e-b47a-987257910a57\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c33a30aed9756ef012ff9046309ec07897de3b17707ea39c621534ed863c3adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://464ab3cf14920ae3f947ce3dccacd7379d50380699008411d0a51bd83fe3e51c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20f7e0e3c07f467643afdcc6f0e333353d2d10b6d0c0d7aff0d347d33a5d925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ddfb7d6dae98465369260c8e9c64a2aa21dbf226e7898f6de0c36b3806bcae8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c9e6ec333ac552dc972c31cce23104cc27bcc2cf11bdc52d89ab36172915a41\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"ed a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1853467785/tls.crt::/tmp/serving-cert-1853467785/tls.key\\\\\\\"\\\\nI1216 06:55:39.731946 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 06:55:39.736431 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 06:55:39.736560 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 06:55:39.736611 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 06:55:39.736722 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 06:55:39.741676 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 06:55:39.741701 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 06:55:39.741707 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 06:55:39.741713 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 06:55:39.741717 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 06:55:39.741721 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 06:55:39.741725 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 06:55:39.741774 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 06:55:39.744309 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" 
certName=\\\\\\\"serving-cert::/tmp/serving-cert-1853467785/tls.crt::/tmp/serving-cert-1853467785/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765868124\\\\\\\\\\\\\\\" (2025-12-16 06:55:23 +0000 UTC to 2026-01-15 06:55:24 +0000 UTC (now=2025-12-16 06:55:39.744275203 +0000 UTC))\\\\\\\"\\\\nF1216 06:55:39.744562 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c84e7f6950c8d433823b7d5ae3d8a9ecaa70ab6845ce607b8bb93efa990325a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84f11d2687959c8525b131966a998af137caa69a480bb31a94e1615a3ccbea5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84f11d2687959c8525b131966a998af137caa69a480bb31a94e1615a3ccbea5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:42Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:42 crc kubenswrapper[4823]: I1216 06:55:42.049261 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e388fa291257a537c122a1f3f0eb6a60eb74e4bcaaf32c4fe829be2da1bfdae9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:42Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:42 crc kubenswrapper[4823]: I1216 06:55:42.066538 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:42Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:42 crc kubenswrapper[4823]: I1216 06:55:42.083551 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"25dec47c-3043-486c-b371-2be103c214e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2bf5eb4e2f587f7084f731ca681a116313b1014b02cdb391b09fdcdd42600c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmpf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78aed5cdf8d3a29dfffcdabfd18b38c0cf762258
53078e01602198bee4994536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmpf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fv56f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:42Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:42 crc kubenswrapper[4823]: I1216 06:55:42.121101 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c915dd50-9820-494e-b47a-987257910a57\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c33a30aed9756ef012ff9046309ec07897de3b17707ea39c621534ed863c3adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://464ab3cf14920ae3f947ce3dccacd7379d50380699008411d0a51bd83fe3e51c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20f7e0e3c07f467643afdcc6f0e333353d2d10b6d0c0d7aff0d347d33a5d925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ddfb7d6dae98465369260c8e9c64a2aa21dbf226e7898f6de0c36b3806bcae8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c9e6ec333ac552dc972c31cce23104cc27bcc2cf11bdc52d89ab36172915a41\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"ed a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1853467785/tls.crt::/tmp/serving-cert-1853467785/tls.key\\\\\\\"\\\\nI1216 06:55:39.731946 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 06:55:39.736431 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 06:55:39.736560 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 06:55:39.736611 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 06:55:39.736722 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 06:55:39.741676 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 06:55:39.741701 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 06:55:39.741707 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 06:55:39.741713 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 06:55:39.741717 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 06:55:39.741721 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 06:55:39.741725 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 06:55:39.741774 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 06:55:39.744309 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" 
certName=\\\\\\\"serving-cert::/tmp/serving-cert-1853467785/tls.crt::/tmp/serving-cert-1853467785/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765868124\\\\\\\\\\\\\\\" (2025-12-16 06:55:23 +0000 UTC to 2026-01-15 06:55:24 +0000 UTC (now=2025-12-16 06:55:39.744275203 +0000 UTC))\\\\\\\"\\\\nF1216 06:55:39.744562 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c84e7f6950c8d433823b7d5ae3d8a9ecaa70ab6845ce607b8bb93efa990325a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84f11d2687959c8525b131966a998af137caa69a480bb31a94e1615a3ccbea5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84f11d2687959c8525b131966a998af137caa69a480bb31a94e1615a3ccbea5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:42Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:42 crc kubenswrapper[4823]: I1216 06:55:42.143446 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4c70365-dff4-4b29-af25-657fd9823db4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59ddc7310ed1600d050e488762436763220f3c2946a4c2020346da4ccef46b1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://275328a704b475b037c82f92055f9ca7f83b9821325603c5e34090140cc486ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7fa96323e345f8891f51500618a7b07d128d3e7256a3534d9dca75ff8ce9edf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d706c1463d2c913afe727db1a87a2841698d91a586c84d6b55f02865e7f5584a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:42Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:42 crc kubenswrapper[4823]: I1216 06:55:42.158282 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hr8h5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d31d032-9142-4e26-9f06-e3a5ea73d530\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://502e1707b8f53950b8f10fa6b289c9d235d6596fb6eec7e4d69c3158823f8e2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sdtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hr8h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:42Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:42 crc kubenswrapper[4823]: I1216 06:55:42.175663 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n248g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b377757-dbc6-4d9c-9656-3ff65d7d113a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78e6c48a7009b10b70f11bfc96f18839781aeccd794f9ce7a4074271cfb9becc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2skkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n248g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:42Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:42 crc kubenswrapper[4823]: I1216 06:55:42.194037 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-964hc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a95a09e1-8457-4d5e-b1b6-dd6f59a66c6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b305ee45caf9aff77bb738777dd9b2ebd2e3e63727b460eae22f8f66d1e6cec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b305ee45caf9aff77bb738777dd9b2ebd2e3e63727b460eae22f8f66d1e6cec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-964hc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:42Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:42 crc kubenswrapper[4823]: I1216 06:55:42.214512 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:42Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:42 crc kubenswrapper[4823]: I1216 06:55:42.228829 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63ba735c509770e9d43fe8152b2a42dc4123d0d797c7d41d5e3efe39a6cef74d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e09486327bad82b741ca9260e306f84ed9f0fbc7058bb06267042b8d2cd5f818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:42Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:42 crc kubenswrapper[4823]: I1216 06:55:42.248995 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:42Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:42 crc kubenswrapper[4823]: I1216 06:55:42.264089 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-bwcng"] Dec 16 06:55:42 crc kubenswrapper[4823]: I1216 06:55:42.264533 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-bwcng" Dec 16 06:55:42 crc kubenswrapper[4823]: I1216 06:55:42.266506 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 16 06:55:42 crc kubenswrapper[4823]: I1216 06:55:42.266670 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:42Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:42 crc kubenswrapper[4823]: I1216 06:55:42.266695 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 16 06:55:42 crc kubenswrapper[4823]: I1216 06:55:42.266873 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 16 06:55:42 crc kubenswrapper[4823]: I1216 06:55:42.269800 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 16 06:55:42 crc kubenswrapper[4823]: I1216 06:55:42.325647 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"08e48f89-7095-4ea2-afb5-759591c2b0d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e3566766c9c24f5c12481b74ed8a3938b21bdb36c003146515c67170d74a71f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e3566766c9c24f5c12481b74ed8a3938b21bdb36c003146515c67170d74a71f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zwjhk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:42Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:42 crc kubenswrapper[4823]: I1216 06:55:42.361773 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:42Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:42 crc kubenswrapper[4823]: I1216 06:55:42.399728 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63ba735c509770e9d43fe8152b2a42dc4123d0d797c7d41d5e3efe39a6cef74d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e09486327bad82b741ca9260e306f84ed9f0fbc7058bb06267042b8d2cd5f818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:42Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:42 crc kubenswrapper[4823]: I1216 06:55:42.402263 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8ff057ef-c324-4465-8b8d-c7b98c25b23c-host\") pod \"node-ca-bwcng\" (UID: \"8ff057ef-c324-4465-8b8d-c7b98c25b23c\") " pod="openshift-image-registry/node-ca-bwcng" Dec 16 06:55:42 crc kubenswrapper[4823]: I1216 06:55:42.402373 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8ff057ef-c324-4465-8b8d-c7b98c25b23c-serviceca\") pod \"node-ca-bwcng\" (UID: \"8ff057ef-c324-4465-8b8d-c7b98c25b23c\") " pod="openshift-image-registry/node-ca-bwcng" Dec 16 06:55:42 crc kubenswrapper[4823]: I1216 06:55:42.402397 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-b2bj4\" (UniqueName: \"kubernetes.io/projected/8ff057ef-c324-4465-8b8d-c7b98c25b23c-kube-api-access-b2bj4\") pod \"node-ca-bwcng\" (UID: \"8ff057ef-c324-4465-8b8d-c7b98c25b23c\") " pod="openshift-image-registry/node-ca-bwcng" Dec 16 06:55:42 crc kubenswrapper[4823]: I1216 06:55:42.409142 4823 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 06:55:42 crc kubenswrapper[4823]: I1216 06:55:42.411674 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:42 crc kubenswrapper[4823]: I1216 06:55:42.411714 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:42 crc kubenswrapper[4823]: I1216 06:55:42.411723 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:42 crc kubenswrapper[4823]: I1216 06:55:42.411839 4823 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 16 06:55:42 crc kubenswrapper[4823]: I1216 06:55:42.440745 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers 
with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:42Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:42 crc kubenswrapper[4823]: I1216 06:55:42.491982 4823 kubelet_node_status.go:115] "Node was previously registered" node="crc" Dec 16 06:55:42 crc kubenswrapper[4823]: I1216 06:55:42.492354 4823 kubelet_node_status.go:79] "Successfully registered node" node="crc" Dec 16 06:55:42 crc kubenswrapper[4823]: I1216 06:55:42.493549 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:42 crc kubenswrapper[4823]: I1216 06:55:42.493581 4823 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:42 crc kubenswrapper[4823]: I1216 06:55:42.493592 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:42 crc kubenswrapper[4823]: I1216 06:55:42.493610 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:42 crc kubenswrapper[4823]: I1216 06:55:42.493621 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:42Z","lastTransitionTime":"2025-12-16T06:55:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:55:42 crc kubenswrapper[4823]: I1216 06:55:42.503286 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8ff057ef-c324-4465-8b8d-c7b98c25b23c-host\") pod \"node-ca-bwcng\" (UID: \"8ff057ef-c324-4465-8b8d-c7b98c25b23c\") " pod="openshift-image-registry/node-ca-bwcng" Dec 16 06:55:42 crc kubenswrapper[4823]: I1216 06:55:42.503400 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8ff057ef-c324-4465-8b8d-c7b98c25b23c-serviceca\") pod \"node-ca-bwcng\" (UID: \"8ff057ef-c324-4465-8b8d-c7b98c25b23c\") " pod="openshift-image-registry/node-ca-bwcng" Dec 16 06:55:42 crc kubenswrapper[4823]: I1216 06:55:42.503426 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2bj4\" (UniqueName: \"kubernetes.io/projected/8ff057ef-c324-4465-8b8d-c7b98c25b23c-kube-api-access-b2bj4\") pod \"node-ca-bwcng\" (UID: \"8ff057ef-c324-4465-8b8d-c7b98c25b23c\") " 
pod="openshift-image-registry/node-ca-bwcng" Dec 16 06:55:42 crc kubenswrapper[4823]: I1216 06:55:42.505107 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8ff057ef-c324-4465-8b8d-c7b98c25b23c-host\") pod \"node-ca-bwcng\" (UID: \"8ff057ef-c324-4465-8b8d-c7b98c25b23c\") " pod="openshift-image-registry/node-ca-bwcng" Dec 16 06:55:42 crc kubenswrapper[4823]: I1216 06:55:42.505158 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8ff057ef-c324-4465-8b8d-c7b98c25b23c-serviceca\") pod \"node-ca-bwcng\" (UID: \"8ff057ef-c324-4465-8b8d-c7b98c25b23c\") " pod="openshift-image-registry/node-ca-bwcng" Dec 16 06:55:42 crc kubenswrapper[4823]: E1216 06:55:42.515254 4823 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:55:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:55:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:42Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:55:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:55:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2caa91d7-bd83-4de0-9038-0514886c6d71\\\",\\\"systemUUID\\\":\\\"b35231f6-d02a-487d-8117-57547d768cbe\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:42Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:42 crc kubenswrapper[4823]: I1216 06:55:42.519874 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:42Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:42 crc kubenswrapper[4823]: I1216 06:55:42.520056 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:42 crc kubenswrapper[4823]: I1216 06:55:42.520091 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:42 crc kubenswrapper[4823]: I1216 
06:55:42.520099 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:42 crc kubenswrapper[4823]: I1216 06:55:42.520115 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:42 crc kubenswrapper[4823]: I1216 06:55:42.520126 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:42Z","lastTransitionTime":"2025-12-16T06:55:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:55:42 crc kubenswrapper[4823]: E1216 06:55:42.533267 4823 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:55:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:55:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:42Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:55:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:55:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2caa91d7-bd83-4de0-9038-0514886c6d71\\\",\\\"systemUUID\\\":\\\"b35231f6-d02a-487d-8117-57547d768cbe\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:42Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:42 crc kubenswrapper[4823]: I1216 06:55:42.536816 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:42 crc kubenswrapper[4823]: I1216 06:55:42.536876 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:42 crc kubenswrapper[4823]: I1216 06:55:42.536892 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:42 crc kubenswrapper[4823]: I1216 06:55:42.536910 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:42 crc kubenswrapper[4823]: I1216 06:55:42.536922 4823 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:42Z","lastTransitionTime":"2025-12-16T06:55:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:55:42 crc kubenswrapper[4823]: I1216 06:55:42.552420 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2bj4\" (UniqueName: \"kubernetes.io/projected/8ff057ef-c324-4465-8b8d-c7b98c25b23c-kube-api-access-b2bj4\") pod \"node-ca-bwcng\" (UID: \"8ff057ef-c324-4465-8b8d-c7b98c25b23c\") " pod="openshift-image-registry/node-ca-bwcng" Dec 16 06:55:42 crc kubenswrapper[4823]: E1216 06:55:42.555383 4823 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:55:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:55:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:42Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:55:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:55:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2caa91d7-bd83-4de0-9038-0514886c6d71\\\",\\\"systemUUID\\\":\\\"b35231f6-d02a-487d-8117-57547d768cbe\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:42Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:42 crc kubenswrapper[4823]: I1216 06:55:42.562675 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:42 crc kubenswrapper[4823]: I1216 06:55:42.562715 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:42 crc kubenswrapper[4823]: I1216 06:55:42.562730 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:42 crc kubenswrapper[4823]: I1216 06:55:42.562745 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:42 crc kubenswrapper[4823]: I1216 06:55:42.562755 4823 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:42Z","lastTransitionTime":"2025-12-16T06:55:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:55:42 crc kubenswrapper[4823]: E1216 06:55:42.578127 4823 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:55:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:55:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:55:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:55:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2caa91d7-bd83-4de0-9038-0514886c6d71\\\",\\\"systemUUID\\\":\\\"b35231f6-d02a-487d-8117-57547d768cbe\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:42Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:42 crc kubenswrapper[4823]: I1216 06:55:42.582018 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-bwcng" Dec 16 06:55:42 crc kubenswrapper[4823]: I1216 06:55:42.582437 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hr8h5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d31d032-9142-4e26-9f06-e3a5ea73d530\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://502e1707b8f53950b8f10fa6b289c9d235d6596fb6eec7e4d69c3158823f8e2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sdtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hr8h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:42Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:42 crc kubenswrapper[4823]: I1216 06:55:42.587449 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:42 crc kubenswrapper[4823]: I1216 06:55:42.587514 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:42 crc kubenswrapper[4823]: I1216 06:55:42.587527 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:42 crc kubenswrapper[4823]: I1216 06:55:42.587550 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:42 crc kubenswrapper[4823]: I1216 06:55:42.587564 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:42Z","lastTransitionTime":"2025-12-16T06:55:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:55:42 crc kubenswrapper[4823]: E1216 06:55:42.603387 4823 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:55:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:55:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:55:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:55:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2caa91d7-bd83-4de0-9038-0514886c6d71\\\",\\\"systemUUID\\\":\\\"b35231f6-d02a-487d-8117-57547d768cbe\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:42Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:42 crc kubenswrapper[4823]: E1216 06:55:42.603553 4823 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 16 06:55:42 crc kubenswrapper[4823]: I1216 06:55:42.605868 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:42 crc kubenswrapper[4823]: I1216 06:55:42.605911 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:42 crc kubenswrapper[4823]: I1216 06:55:42.605929 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:42 crc kubenswrapper[4823]: I1216 06:55:42.605949 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:42 crc kubenswrapper[4823]: I1216 06:55:42.605962 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:42Z","lastTransitionTime":"2025-12-16T06:55:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:55:42 crc kubenswrapper[4823]: I1216 06:55:42.623499 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n248g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b377757-dbc6-4d9c-9656-3ff65d7d113a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78e6c48a7009b10b70f11bfc96f18839781aeccd794f9ce7a4074271cfb9becc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2skkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n248g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:42Z 
is after 2025-08-24T17:21:41Z" Dec 16 06:55:42 crc kubenswrapper[4823]: I1216 06:55:42.664449 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-964hc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a95a09e1-8457-4d5e-b1b6-dd6f59a66c6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b305ee45caf9aff77bb738777dd9b2ebd2e3e63727b460eae22f8f66d1e6cec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b305ee45caf9aff77bb738777dd9b2ebd2e3e63727b460eae22f8f66d1e6cec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-964hc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:42Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:42 crc kubenswrapper[4823]: I1216 06:55:42.711655 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:42 crc kubenswrapper[4823]: I1216 06:55:42.711705 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:42 crc kubenswrapper[4823]: I1216 06:55:42.711715 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:42 crc kubenswrapper[4823]: I1216 06:55:42.711735 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:42 crc kubenswrapper[4823]: I1216 06:55:42.711747 4823 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:42Z","lastTransitionTime":"2025-12-16T06:55:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:55:42 crc kubenswrapper[4823]: I1216 06:55:42.715549 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bwcng" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ff057ef-c324-4465-8b8d-c7b98c25b23c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:42Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2bj4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bwcng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:42Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:42 crc kubenswrapper[4823]: I1216 06:55:42.762639 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"08e48f89-7095-4ea2-afb5-759591c2b0d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e3566766c9c24f5c12481b74ed8a3938b21bdb36c003146515c67170d74a71f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e3566766c9c24f5c12481b74ed8a3938b21bdb36c003146515c67170d74a71f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zwjhk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:42Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:42 crc kubenswrapper[4823]: I1216 06:55:42.785232 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e388fa291257a537c122a1f3f0eb6a60eb74e4bcaaf32c4fe829be2da1bfdae9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:42Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:42 crc kubenswrapper[4823]: I1216 06:55:42.816192 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:42 crc kubenswrapper[4823]: I1216 06:55:42.816242 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:42 crc kubenswrapper[4823]: I1216 06:55:42.816258 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:42 crc kubenswrapper[4823]: I1216 06:55:42.816276 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:42 crc kubenswrapper[4823]: I1216 06:55:42.816288 4823 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:42Z","lastTransitionTime":"2025-12-16T06:55:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:55:42 crc kubenswrapper[4823]: I1216 06:55:42.820955 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:42Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:42 crc kubenswrapper[4823]: I1216 06:55:42.861255 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"25dec47c-3043-486c-b371-2be103c214e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2bf5eb4e2f587f7084f731ca681a116313b1014b02cdb391b09fdcdd42600c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmpf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78aed5cdf8d3a29dfffcdabfd18b38c0cf762258
53078e01602198bee4994536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmpf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fv56f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:42Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:42 crc kubenswrapper[4823]: I1216 06:55:42.902527 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c915dd50-9820-494e-b47a-987257910a57\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c33a30aed9756ef012ff9046309ec07897de3b17707ea39c621534ed863c3adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://464ab3cf14920ae3f947ce3dccacd7379d50380699008411d0a51bd83fe3e51c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20f7e0e3c07f467643afdcc6f0e333353d2d10b6d0c0d7aff0d347d33a5d925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ddfb7d6dae98465369260c8e9c64a2aa21dbf226e7898f6de0c36b3806bcae8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c9e6ec333ac552dc972c31cce23104cc27bcc2cf11bdc52d89ab36172915a41\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"ed a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1853467785/tls.crt::/tmp/serving-cert-1853467785/tls.key\\\\\\\"\\\\nI1216 06:55:39.731946 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 06:55:39.736431 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 06:55:39.736560 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 06:55:39.736611 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 06:55:39.736722 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 06:55:39.741676 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 06:55:39.741701 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 06:55:39.741707 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 06:55:39.741713 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 06:55:39.741717 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 06:55:39.741721 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 06:55:39.741725 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 06:55:39.741774 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 06:55:39.744309 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" 
certName=\\\\\\\"serving-cert::/tmp/serving-cert-1853467785/tls.crt::/tmp/serving-cert-1853467785/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765868124\\\\\\\\\\\\\\\" (2025-12-16 06:55:23 +0000 UTC to 2026-01-15 06:55:24 +0000 UTC (now=2025-12-16 06:55:39.744275203 +0000 UTC))\\\\\\\"\\\\nF1216 06:55:39.744562 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c84e7f6950c8d433823b7d5ae3d8a9ecaa70ab6845ce607b8bb93efa990325a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84f11d2687959c8525b131966a998af137caa69a480bb31a94e1615a3ccbea5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84f11d2687959c8525b131966a998af137caa69a480bb31a94e1615a3ccbea5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:42Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:42 crc kubenswrapper[4823]: I1216 06:55:42.924589 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:42 crc kubenswrapper[4823]: I1216 06:55:42.924637 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:42 crc kubenswrapper[4823]: I1216 06:55:42.924648 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:42 crc kubenswrapper[4823]: I1216 06:55:42.924667 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:42 crc kubenswrapper[4823]: I1216 06:55:42.924680 4823 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:42Z","lastTransitionTime":"2025-12-16T06:55:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:55:42 crc kubenswrapper[4823]: I1216 06:55:42.942738 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4c70365-dff4-4b29-af25-657fd9823db4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59ddc7310ed1600d050e488762436763220f3c2946a4c2020346da4ccef46b1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://275328a704b475b037c82f92055f9ca7f83b9821325603c5e34090140cc486ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7fa96323e345f8891f51500618a7b07d128d3e7256a3534d9dca75ff8ce9edf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://d706c1463d2c913afe727db1a87a2841698d91a586c84d6b55f02865e7f5584a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:42Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:42 crc kubenswrapper[4823]: I1216 06:55:42.951222 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-964hc" event={"ID":"a95a09e1-8457-4d5e-b1b6-dd6f59a66c6c","Type":"ContainerStarted","Data":"d9d77b125c0d081148c1892229c933414ac8e1b8e0371f0932512deba1107e94"} Dec 16 06:55:42 crc kubenswrapper[4823]: I1216 06:55:42.952320 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-image-registry/node-ca-bwcng" event={"ID":"8ff057ef-c324-4465-8b8d-c7b98c25b23c","Type":"ContainerStarted","Data":"4ed5f64917125d604abae45206362ca626fa0f205c3c004f955c34aa2322f4d8"} Dec 16 06:55:42 crc kubenswrapper[4823]: I1216 06:55:42.956741 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" event={"ID":"08e48f89-7095-4ea2-afb5-759591c2b0d4","Type":"ContainerStarted","Data":"0f2cea674c0936cca4e9e95dfb7cd15bf9ccb3fbf8b711f351f87b632aad65c6"} Dec 16 06:55:42 crc kubenswrapper[4823]: I1216 06:55:42.956796 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" event={"ID":"08e48f89-7095-4ea2-afb5-759591c2b0d4","Type":"ContainerStarted","Data":"055585c1217ea720b929c9da82e7ea662d6fd5634244d362b3b425faee380d92"} Dec 16 06:55:42 crc kubenswrapper[4823]: I1216 06:55:42.980245 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4c70365-dff4-4b29-af25-657fd9823db4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59ddc7310ed1600d050e488762436763220f3c2946a4c2020346da4ccef46b1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://275328a704b475b037c82f92055f9ca7f83b9821325603c5e34090140cc486ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7fa96323e345f8891f51500618a7b07d128d3e7256a3534d9dca75ff8ce9edf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d706c1463d2c913afe727db1a87a2841698d91a586c84d6b55f02865e7f5584a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:42Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:43 crc kubenswrapper[4823]: I1216 06:55:43.021661 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c915dd50-9820-494e-b47a-987257910a57\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c33a30aed9756ef012ff9046309ec07897de3b17707ea39c621534ed863c3adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://464ab3cf14920ae3f947ce3dccacd7379d50380699008411d0a51bd83fe3e51c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://a20f7e0e3c07f467643afdcc6f0e333353d2d10b6d0c0d7aff0d347d33a5d925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ddfb7d6dae98465369260c8e9c64a2aa21dbf226e7898f6de0c36b3806bcae8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c9e6ec333ac552dc972c31cce23104cc27bcc2cf11bdc52d89ab36172915a41\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"ed a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1853467785/tls.crt::/tmp/serving-cert-1853467785/tls.key\\\\\\\"\\\\nI1216 06:55:39.731946 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 06:55:39.736431 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 06:55:39.736560 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 06:55:39.736611 1 maxinflight.go:116] \\\\\\\"Set 
denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 06:55:39.736722 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 06:55:39.741676 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 06:55:39.741701 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 06:55:39.741707 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 06:55:39.741713 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 06:55:39.741717 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 06:55:39.741721 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 06:55:39.741725 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 06:55:39.741774 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 06:55:39.744309 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1853467785/tls.crt::/tmp/serving-cert-1853467785/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765868124\\\\\\\\\\\\\\\" (2025-12-16 06:55:23 +0000 UTC to 2026-01-15 06:55:24 +0000 UTC (now=2025-12-16 06:55:39.744275203 +0000 UTC))\\\\\\\"\\\\nF1216 06:55:39.744562 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c84e7f6950c8d433823b7d5ae3d8a9ecaa70ab6845ce607b8bb93efa990325a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84f11d2687959c8525b131966a998af137caa69a480bb31a94e1615a3ccbea5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84f11d2687959c8525b131966a998af137caa69a480bb31a94e1615a3ccbea5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:43Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:43 crc kubenswrapper[4823]: I1216 06:55:43.027958 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:43 crc kubenswrapper[4823]: I1216 06:55:43.028013 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:43 crc kubenswrapper[4823]: I1216 06:55:43.028053 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:43 crc kubenswrapper[4823]: I1216 06:55:43.028072 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:43 crc kubenswrapper[4823]: I1216 06:55:43.028083 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:43Z","lastTransitionTime":"2025-12-16T06:55:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:55:43 crc kubenswrapper[4823]: I1216 06:55:43.063460 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:43Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:43 crc kubenswrapper[4823]: I1216 06:55:43.086433 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Dec 16 06:55:43 crc kubenswrapper[4823]: I1216 06:55:43.099416 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Dec 16 06:55:43 crc kubenswrapper[4823]: I1216 06:55:43.101727 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63ba735c509770e9d43fe8152b2a42dc4123d0d797c7d41d5e3efe39a6cef74d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e09486327bad82b741ca9260e306f84ed9f0fbc7058bb06267042b8d2cd5f818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:43Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:43 crc kubenswrapper[4823]: I1216 06:55:43.122735 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Dec 16 06:55:43 crc kubenswrapper[4823]: I1216 06:55:43.131160 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:43 crc kubenswrapper[4823]: I1216 06:55:43.136109 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:43 crc kubenswrapper[4823]: I1216 06:55:43.136240 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:43 crc kubenswrapper[4823]: I1216 06:55:43.136265 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:43 crc kubenswrapper[4823]: I1216 06:55:43.136277 4823 setters.go:603] "Node became 
not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:43Z","lastTransitionTime":"2025-12-16T06:55:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:55:43 crc kubenswrapper[4823]: I1216 06:55:43.161452 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:43Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:43 crc kubenswrapper[4823]: I1216 06:55:43.200640 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:43Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:43 crc kubenswrapper[4823]: I1216 06:55:43.237795 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hr8h5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d31d032-9142-4e26-9f06-e3a5ea73d530\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://502e1707b8f53950b8f10fa6b289c9d235d6596fb6eec7e4d69c3158823f8e2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sdtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hr8h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:43Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:43 crc kubenswrapper[4823]: I1216 06:55:43.239237 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:43 crc kubenswrapper[4823]: I1216 06:55:43.239261 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:43 crc kubenswrapper[4823]: I1216 06:55:43.239274 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:43 crc kubenswrapper[4823]: I1216 06:55:43.239292 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:43 crc kubenswrapper[4823]: I1216 06:55:43.239304 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:43Z","lastTransitionTime":"2025-12-16T06:55:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:55:43 crc kubenswrapper[4823]: I1216 06:55:43.281547 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n248g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b377757-dbc6-4d9c-9656-3ff65d7d113a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78e6c48a7009b10b70f11bfc96f18839781aeccd794f9ce7a4074271cfb9becc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2skkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n248g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:43Z 
is after 2025-08-24T17:21:41Z" Dec 16 06:55:43 crc kubenswrapper[4823]: I1216 06:55:43.322061 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-964hc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a95a09e1-8457-4d5e-b1b6-dd6f59a66c6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b305ee45caf9aff77bb738777dd9b2ebd2e3e63727b460eae22f8f66d1e6cec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b305ee45caf9aff77bb738777dd9b2ebd2e3e63727b460eae22f8f66d1e6cec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-964hc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:43Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:43 crc kubenswrapper[4823]: I1216 06:55:43.341298 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:43 crc kubenswrapper[4823]: I1216 06:55:43.341341 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:43 crc kubenswrapper[4823]: I1216 06:55:43.341354 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:43 crc kubenswrapper[4823]: I1216 06:55:43.341371 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:43 crc kubenswrapper[4823]: I1216 06:55:43.341383 4823 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:43Z","lastTransitionTime":"2025-12-16T06:55:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:55:43 crc kubenswrapper[4823]: I1216 06:55:43.358915 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bwcng" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ff057ef-c324-4465-8b8d-c7b98c25b23c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:42Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2bj4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bwcng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:43Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:43 crc kubenswrapper[4823]: I1216 06:55:43.404770 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"08e48f89-7095-4ea2-afb5-759591c2b0d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e3566766c9c24f5c12481b74ed8a3938b21bdb36c003146515c67170d74a71f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e3566766c9c24f5c12481b74ed8a3938b21bdb36c003146515c67170d74a71f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zwjhk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:43Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:43 crc kubenswrapper[4823]: I1216 06:55:43.413245 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:55:43 crc kubenswrapper[4823]: I1216 06:55:43.413369 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:55:43 crc kubenswrapper[4823]: I1216 06:55:43.413388 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:55:43 crc 
kubenswrapper[4823]: I1216 06:55:43.413411 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:55:43 crc kubenswrapper[4823]: I1216 06:55:43.413429 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:55:43 crc kubenswrapper[4823]: E1216 06:55:43.413540 4823 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 16 06:55:43 crc kubenswrapper[4823]: E1216 06:55:43.413554 4823 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 16 06:55:43 crc kubenswrapper[4823]: E1216 06:55:43.413565 4823 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 06:55:43 crc kubenswrapper[4823]: E1216 06:55:43.413601 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2025-12-16 06:55:47.413588454 +0000 UTC m=+25.902154577 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 06:55:43 crc kubenswrapper[4823]: E1216 06:55:43.413652 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:55:47.413646096 +0000 UTC m=+25.902212219 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:55:43 crc kubenswrapper[4823]: E1216 06:55:43.413693 4823 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 16 06:55:43 crc kubenswrapper[4823]: E1216 06:55:43.413712 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-16 06:55:47.413706988 +0000 UTC m=+25.902273111 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 16 06:55:43 crc kubenswrapper[4823]: E1216 06:55:43.413748 4823 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 16 06:55:43 crc kubenswrapper[4823]: E1216 06:55:43.413767 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-16 06:55:47.41376209 +0000 UTC m=+25.902328213 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 16 06:55:43 crc kubenswrapper[4823]: E1216 06:55:43.413803 4823 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 16 06:55:43 crc kubenswrapper[4823]: E1216 06:55:43.413812 4823 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 16 06:55:43 crc kubenswrapper[4823]: E1216 06:55:43.413819 4823 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod 
openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 06:55:43 crc kubenswrapper[4823]: E1216 06:55:43.413837 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-16 06:55:47.413832672 +0000 UTC m=+25.902398795 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 06:55:43 crc kubenswrapper[4823]: I1216 06:55:43.437841 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e388fa291257a537c122a1f3f0eb6a60eb74e4bcaaf32c4fe829be2da1bfdae9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:43Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:43 crc kubenswrapper[4823]: I1216 06:55:43.443725 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:43 crc kubenswrapper[4823]: I1216 06:55:43.443790 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:43 crc kubenswrapper[4823]: I1216 06:55:43.443801 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:43 crc kubenswrapper[4823]: I1216 06:55:43.443817 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:43 crc kubenswrapper[4823]: I1216 06:55:43.443827 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:43Z","lastTransitionTime":"2025-12-16T06:55:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:55:43 crc kubenswrapper[4823]: I1216 06:55:43.478087 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:43Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:43 crc kubenswrapper[4823]: I1216 06:55:43.519845 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"25dec47c-3043-486c-b371-2be103c214e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2bf5eb4e2f587f7084f731ca681a116313b1014b02cdb391b09fdcdd42600c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmpf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78aed5cdf8d3a29dfffcdabfd18b38c0cf762258
53078e01602198bee4994536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmpf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fv56f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:43Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:43 crc kubenswrapper[4823]: I1216 06:55:43.546316 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:43 crc kubenswrapper[4823]: I1216 06:55:43.546354 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:43 crc kubenswrapper[4823]: I1216 06:55:43.546363 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:43 crc 
kubenswrapper[4823]: I1216 06:55:43.546376 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:43 crc kubenswrapper[4823]: I1216 06:55:43.546386 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:43Z","lastTransitionTime":"2025-12-16T06:55:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:55:43 crc kubenswrapper[4823]: I1216 06:55:43.566943 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3de3a2ad-dc6c-47ba-af8d-4f128e025aad\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c40d3d73f70c8983adc8d076d89864e7224528dae97252396a1c34bd4cb804e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50bb40e9d22674554f2a267df7c9f924a29131ac986877bf19572cc5992ba396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9c22b0787554b4bdf0b0068fc696b8515e2ee63affef23e5eb64f77bc32a624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true
,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a32894b8f22c5335ed585c26fcab727324803d768943f40455a592025cbfd0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1560fb0ba0c40a2797688b518c22c9164a8cbe9a265cb2ad95408ba86b0fb537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/
\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3d6894d867564eea62c7d78dd6ca62a9913787ba06b786722e478785ba796ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3d6894d867564eea62c7d78dd6ca62a9913787ba06b786722e478785ba796ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e551b627e90b56385fe5618d3e66ea7376a2d574589583c005373aa29db1d410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e551b627e90b56385fe5618d3e66ea7376a2d574589583c005373aa29db1d410\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://631458d415d64c3cd4dfd24771644393aa6c01288de8f7a5395cf8888db67d2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://631458d415d64c3cd4dfd24771644393aa6c01288de8f7a5395cf8888db67d2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:43Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:43 crc kubenswrapper[4823]: I1216 06:55:43.600314 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e388fa291257a537c122a1f3f0eb6a60eb74e4bcaaf32c4fe829be2da1bfdae9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:43Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:43 crc kubenswrapper[4823]: I1216 06:55:43.638796 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:43Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:43 crc kubenswrapper[4823]: I1216 06:55:43.648916 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:43 crc kubenswrapper[4823]: I1216 06:55:43.649293 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:43 crc kubenswrapper[4823]: I1216 06:55:43.649307 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:43 crc kubenswrapper[4823]: I1216 06:55:43.649329 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:43 crc kubenswrapper[4823]: I1216 06:55:43.649340 4823 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:43Z","lastTransitionTime":"2025-12-16T06:55:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:55:43 crc kubenswrapper[4823]: I1216 06:55:43.678344 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25dec47c-3043-486c-b371-2be103c214e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2bf5eb4e2f587f7084f731ca681a116313b1014b02cdb391b09fdcdd42600c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmpf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78aed5cdf8d3a29dfffcdabfd18b38c0cf76225853078e01602198bee4994536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmpf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fv56f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-12-16T06:55:43Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:43 crc kubenswrapper[4823]: I1216 06:55:43.722506 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c915dd50-9820-494e-b47a-987257910a57\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c33a30aed9756ef012ff9046309ec07897de3b17707ea39c621534ed863c3adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://464ab3cf14920ae3f947ce3dccacd7379d50380699008411d0a51bd83fe3e51c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://a20f7e0e3c07f467643afdcc6f0e333353d2d10b6d0c0d7aff0d347d33a5d925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ddfb7d6dae98465369260c8e9c64a2aa21dbf226e7898f6de0c36b3806bcae8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c9e6ec333ac552dc972c31cce23104cc27bcc2cf11bdc52d89ab36172915a41\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"ed a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1853467785/tls.crt::/tmp/serving-cert-1853467785/tls.key\\\\\\\"\\\\nI1216 06:55:39.731946 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 06:55:39.736431 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 06:55:39.736560 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 06:55:39.736611 1 maxinflight.go:116] \\\\\\\"Set 
denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 06:55:39.736722 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 06:55:39.741676 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 06:55:39.741701 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 06:55:39.741707 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 06:55:39.741713 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 06:55:39.741717 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 06:55:39.741721 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 06:55:39.741725 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 06:55:39.741774 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 06:55:39.744309 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1853467785/tls.crt::/tmp/serving-cert-1853467785/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765868124\\\\\\\\\\\\\\\" (2025-12-16 06:55:23 +0000 UTC to 2026-01-15 06:55:24 +0000 UTC (now=2025-12-16 06:55:39.744275203 +0000 UTC))\\\\\\\"\\\\nF1216 06:55:39.744562 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c84e7f6950c8d433823b7d5ae3d8a9ecaa70ab6845ce607b8bb93efa990325a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84f11d2687959c8525b131966a998af137caa69a480bb31a94e1615a3ccbea5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84f11d2687959c8525b131966a998af137caa69a480bb31a94e1615a3ccbea5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:43Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:43 crc kubenswrapper[4823]: I1216 06:55:43.751466 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:43 crc kubenswrapper[4823]: I1216 06:55:43.751509 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:43 crc kubenswrapper[4823]: I1216 06:55:43.751519 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:43 crc kubenswrapper[4823]: I1216 06:55:43.751533 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:43 crc kubenswrapper[4823]: I1216 06:55:43.751543 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:43Z","lastTransitionTime":"2025-12-16T06:55:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:55:43 crc kubenswrapper[4823]: I1216 06:55:43.759622 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4c70365-dff4-4b29-af25-657fd9823db4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59ddc7310ed1600d050e488762436763220f3c2946a4c2020346da4ccef46b1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://275328a704b
475b037c82f92055f9ca7f83b9821325603c5e34090140cc486ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7fa96323e345f8891f51500618a7b07d128d3e7256a3534d9dca75ff8ce9edf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d706c1463d2c913afe727db1a87a2841698d91a586c84d6b55f02865e7f5584a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:43Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:43 crc kubenswrapper[4823]: I1216 06:55:43.771372 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:55:43 crc kubenswrapper[4823]: I1216 06:55:43.771451 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:55:43 crc kubenswrapper[4823]: I1216 06:55:43.771460 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:55:43 crc kubenswrapper[4823]: E1216 06:55:43.771496 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 06:55:43 crc kubenswrapper[4823]: E1216 06:55:43.771605 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 06:55:43 crc kubenswrapper[4823]: E1216 06:55:43.771686 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 06:55:43 crc kubenswrapper[4823]: I1216 06:55:43.798818 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-964hc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a95a09e1-8457-4d5e-b1b6-dd6f59a66c6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b305ee45caf9aff77bb738777dd9b2ebd2e3e63727b460eae22f8f66d1e6cec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b305ee45caf9aff77bb738777dd9b2ebd2e3e63727b460eae22f8f66d1e6cec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d77b125c0d081148c1892229c933414ac8e1b8e0371f0932512deba1107e94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath
\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-964hc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:43Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:43 crc kubenswrapper[4823]: I1216 06:55:43.839579 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:43Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:43 crc kubenswrapper[4823]: I1216 06:55:43.853876 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:43 crc kubenswrapper[4823]: I1216 06:55:43.853929 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:43 crc kubenswrapper[4823]: I1216 06:55:43.853956 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:43 crc kubenswrapper[4823]: I1216 06:55:43.853975 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:43 crc kubenswrapper[4823]: I1216 06:55:43.853985 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:43Z","lastTransitionTime":"2025-12-16T06:55:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:55:43 crc kubenswrapper[4823]: I1216 06:55:43.879625 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63ba735c509770e9d43fe8152b2a42dc4123d0d797c7d41d5e3efe39a6cef74d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://e09486327bad82b741ca9260e306f84ed9f0fbc7058bb06267042b8d2cd5f818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:43Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:43 crc kubenswrapper[4823]: I1216 06:55:43.918851 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:43Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:43 crc kubenswrapper[4823]: I1216 06:55:43.958462 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:43 crc kubenswrapper[4823]: I1216 06:55:43.958520 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:43 crc kubenswrapper[4823]: I1216 06:55:43.958555 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:43 crc kubenswrapper[4823]: I1216 06:55:43.958576 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:43 crc kubenswrapper[4823]: I1216 06:55:43.958603 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:43Z","lastTransitionTime":"2025-12-16T06:55:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:55:43 crc kubenswrapper[4823]: I1216 06:55:43.959211 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:43Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:43 crc kubenswrapper[4823]: I1216 06:55:43.961806 4823 generic.go:334] "Generic (PLEG): container finished" podID="a95a09e1-8457-4d5e-b1b6-dd6f59a66c6c" containerID="d9d77b125c0d081148c1892229c933414ac8e1b8e0371f0932512deba1107e94" exitCode=0 Dec 16 06:55:43 crc kubenswrapper[4823]: I1216 06:55:43.961863 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-964hc" event={"ID":"a95a09e1-8457-4d5e-b1b6-dd6f59a66c6c","Type":"ContainerDied","Data":"d9d77b125c0d081148c1892229c933414ac8e1b8e0371f0932512deba1107e94"} Dec 16 06:55:43 crc kubenswrapper[4823]: I1216 06:55:43.963980 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" 
event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"72dfe197d43de2945a2ee22789ab86205c77588c4f4002e2473cef09e7b4b10b"} Dec 16 06:55:43 crc kubenswrapper[4823]: I1216 06:55:43.966055 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-bwcng" event={"ID":"8ff057ef-c324-4465-8b8d-c7b98c25b23c","Type":"ContainerStarted","Data":"ea36a181f0874fbab3022e6ce27567b65eb8c01cdb2925b08d9c0782af7e93c8"} Dec 16 06:55:43 crc kubenswrapper[4823]: E1216 06:55:43.997397 4823 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"etcd-crc\" already exists" pod="openshift-etcd/etcd-crc" Dec 16 06:55:44 crc kubenswrapper[4823]: I1216 06:55:44.017809 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hr8h5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d31d032-9142-4e26-9f06-e3a5ea73d530\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://502e1707b8f53950b8f10fa6b289c9d235d6596fb6eec7e4d69c3158823f8e2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39
aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sdtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hr8h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:44Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:44 crc kubenswrapper[4823]: I1216 06:55:44.060401 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n248g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b377757-dbc6-4d9c-9656-3ff65d7d113a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78e6c48a7009b10b70f11bfc96f18839781aeccd794f9ce7a4074271cfb9becc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2skkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n248g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:44Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:44 crc kubenswrapper[4823]: I1216 06:55:44.061525 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:44 crc 
kubenswrapper[4823]: I1216 06:55:44.061566 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:44 crc kubenswrapper[4823]: I1216 06:55:44.061578 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:44 crc kubenswrapper[4823]: I1216 06:55:44.061593 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:44 crc kubenswrapper[4823]: I1216 06:55:44.061862 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:44Z","lastTransitionTime":"2025-12-16T06:55:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:55:44 crc kubenswrapper[4823]: I1216 06:55:44.097593 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bwcng" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ff057ef-c324-4465-8b8d-c7b98c25b23c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:42Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2bj4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bwcng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:44Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:44 crc kubenswrapper[4823]: I1216 06:55:44.146789 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"08e48f89-7095-4ea2-afb5-759591c2b0d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e3566766c9c24f5c12481b74ed8a3938b21bdb36c003146515c67170d74a71f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e3566766c9c24f5c12481b74ed8a3938b21bdb36c003146515c67170d74a71f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zwjhk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:44Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:44 crc kubenswrapper[4823]: I1216 06:55:44.170445 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:44 crc kubenswrapper[4823]: I1216 06:55:44.170491 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:44 crc kubenswrapper[4823]: I1216 06:55:44.170503 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:44 crc kubenswrapper[4823]: I1216 06:55:44.170522 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:44 crc kubenswrapper[4823]: I1216 06:55:44.170535 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:44Z","lastTransitionTime":"2025-12-16T06:55:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:55:44 crc kubenswrapper[4823]: I1216 06:55:44.182787 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c915dd50-9820-494e-b47a-987257910a57\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c33a30aed9756ef012ff9046309ec07897de3b17707ea39c621534ed863c3adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://464ab3cf14920ae3f947ce3dccacd7379d50380699008411d0a51bd83fe3e51c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://a20f7e0e3c07f467643afdcc6f0e333353d2d10b6d0c0d7aff0d347d33a5d925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ddfb7d6dae98465369260c8e9c64a2aa21dbf226e7898f6de0c36b3806bcae8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c9e6ec333ac552dc972c31cce23104cc27bcc2cf11bdc52d89ab36172915a41\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"ed a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1853467785/tls.crt::/tmp/serving-cert-1853467785/tls.key\\\\\\\"\\\\nI1216 06:55:39.731946 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 06:55:39.736431 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 06:55:39.736560 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 06:55:39.736611 1 maxinflight.go:116] \\\\\\\"Set 
denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 06:55:39.736722 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 06:55:39.741676 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 06:55:39.741701 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 06:55:39.741707 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 06:55:39.741713 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 06:55:39.741717 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 06:55:39.741721 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 06:55:39.741725 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 06:55:39.741774 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 06:55:39.744309 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1853467785/tls.crt::/tmp/serving-cert-1853467785/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765868124\\\\\\\\\\\\\\\" (2025-12-16 06:55:23 +0000 UTC to 2026-01-15 06:55:24 +0000 UTC (now=2025-12-16 06:55:39.744275203 +0000 UTC))\\\\\\\"\\\\nF1216 06:55:39.744562 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c84e7f6950c8d433823b7d5ae3d8a9ecaa70ab6845ce607b8bb93efa990325a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84f11d2687959c8525b131966a998af137caa69a480bb31a94e1615a3ccbea5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84f11d2687959c8525b131966a998af137caa69a480bb31a94e1615a3ccbea5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:44Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:44 crc kubenswrapper[4823]: I1216 06:55:44.219315 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4c70365-dff4-4b29-af25-657fd9823db4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59ddc7310ed1600d050e488762436763220f3c2946a4c2020346da4ccef46b1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://275328a704b475b037c82f92055f9ca7f83b9821325603c5e34090140cc486ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7fa96323e345f8891f51500618a7b07d128d3e7256a3534d9dca75ff8ce9edf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d706c1463d2c913afe727db1a87a2841698d91a586c84d6b55f02865e7f5584a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:44Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:44 crc kubenswrapper[4823]: I1216 06:55:44.262693 4823 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72dfe197d43de2945a2ee22789ab86205c77588c4f4002e2473cef09e7b4b10b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:44Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:44 crc kubenswrapper[4823]: I1216 06:55:44.273170 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:44 crc kubenswrapper[4823]: I1216 06:55:44.273213 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:44 crc kubenswrapper[4823]: I1216 06:55:44.273226 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:44 crc kubenswrapper[4823]: I1216 06:55:44.273243 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:44 crc kubenswrapper[4823]: I1216 06:55:44.273263 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:44Z","lastTransitionTime":"2025-12-16T06:55:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:55:44 crc kubenswrapper[4823]: I1216 06:55:44.297098 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hr8h5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d31d032-9142-4e26-9f06-e3a5ea73d530\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://502e1707b8f53950b8f10fa6b289c9d235d6596fb6eec7e4d69c3158823f8e2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sdtm\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hr8h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:44Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:44 crc kubenswrapper[4823]: I1216 06:55:44.338039 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n248g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b377757-dbc6-4d9c-9656-3ff65d7d113a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78e6c48a7009b10b70f11bfc96f18839781aeccd794f9ce7a4074271cfb9becc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2skkc\\\",\\\"readO
nly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n248g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:44Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:44 crc kubenswrapper[4823]: I1216 06:55:44.376065 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:44 crc kubenswrapper[4823]: I1216 06:55:44.376101 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:44 crc kubenswrapper[4823]: I1216 06:55:44.376362 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:44 crc kubenswrapper[4823]: I1216 06:55:44.376392 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:44 crc kubenswrapper[4823]: I1216 06:55:44.376405 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:44Z","lastTransitionTime":"2025-12-16T06:55:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:55:44 crc kubenswrapper[4823]: I1216 06:55:44.380511 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-964hc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a95a09e1-8457-4d5e-b1b6-dd6f59a66c6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b305ee45caf9aff77bb738777dd9b2ebd2e3e63727b460eae22f8f66d1e6cec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b305ee45caf9aff77bb738777dd9b2ebd2e3e63727b460eae22f8f66d1e6cec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d77b125c0d081148c1892229c933414ac8e1b8e0371f0932512deba1107e94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9d77b125c0d081148c1892229c933414ac8e1b8e0371f0932512deba1107e94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-964hc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:44Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:44 crc kubenswrapper[4823]: I1216 06:55:44.419145 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:44Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:44 crc kubenswrapper[4823]: I1216 06:55:44.458781 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63ba735c509770e9d43fe8152b2a42dc4123d0d797c7d41d5e3efe39a6cef74d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e09486327bad82b741ca9260e306f84ed9f0fbc7058bb06267042b8d2cd5f818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:44Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:44 crc kubenswrapper[4823]: I1216 06:55:44.478699 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:44 crc kubenswrapper[4823]: I1216 06:55:44.478735 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:44 crc kubenswrapper[4823]: I1216 06:55:44.478745 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:44 crc kubenswrapper[4823]: I1216 06:55:44.478762 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:44 crc kubenswrapper[4823]: I1216 06:55:44.478775 4823 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:44Z","lastTransitionTime":"2025-12-16T06:55:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:55:44 crc kubenswrapper[4823]: I1216 06:55:44.498433 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:44Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:44 crc kubenswrapper[4823]: I1216 06:55:44.537009 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bwcng" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ff057ef-c324-4465-8b8d-c7b98c25b23c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea36a181f0874fbab3022e6ce27567b65eb8c01cdb2925b08d9c0782af7e93c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2bj4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bwcng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:44Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:44 crc kubenswrapper[4823]: I1216 06:55:44.581750 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:44 crc kubenswrapper[4823]: I1216 06:55:44.581787 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:44 crc kubenswrapper[4823]: I1216 06:55:44.581801 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:44 crc kubenswrapper[4823]: I1216 06:55:44.581819 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:44 crc kubenswrapper[4823]: I1216 06:55:44.581832 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:44Z","lastTransitionTime":"2025-12-16T06:55:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:55:44 crc kubenswrapper[4823]: I1216 06:55:44.588653 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08e48f89-7095-4ea2-afb5-759591c2b0d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e3566766c9c24f5c12481b74ed8a3938b21bdb36c003146515c67170d74a71f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e3566766c9c24f5c12481b74ed8a3938b21bdb36c003146515c67170d74a71f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zwjhk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:44Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:44 crc kubenswrapper[4823]: I1216 06:55:44.624227 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3de3a2ad-dc6c-47ba-af8d-4f128e025aad\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c40d3d73f70c8983adc8d076d89864e7224528dae97252396a1c34bd4cb804e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731
ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50bb40e9d22674554f2a267df7c9f924a29131ac986877bf19572cc5992ba396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9c22b0787554b4bdf0b0068fc696b8515e2ee63affef23e5eb64f77bc32a624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a32894b8f22c5335ed585c26fcab727324803d768943f40455a592025cbfd0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1560fb0ba0c40a2797688b518c22c9164a8cbe9a265cb2ad95408ba86b0fb537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3d6894d867564eea62c7d78dd6ca62a9913787ba06b786722e478785ba796ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3d6894d867564eea62c7d78dd6ca62a9913787ba06b786722e478785ba796ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e551b627e90b56385fe5618d3e66ea7376a2d574589583c005373aa29db1d410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e551b627e90b56385fe5618d3e66ea7376a2d574589583c005373aa29db1d410\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://631458d415d64c3cd4dfd24771644393aa6c01288de8f7a5395cf8888db67d2f\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://631458d415d64c3cd4dfd24771644393aa6c01288de8f7a5395cf8888db67d2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:44Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:44 crc kubenswrapper[4823]: I1216 06:55:44.658504 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e388fa291257a537c122a1f3f0eb6a60eb74e4bcaaf32c4fe829be2da1bfdae9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:44Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:44 crc kubenswrapper[4823]: I1216 06:55:44.683955 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:44 crc kubenswrapper[4823]: I1216 06:55:44.683988 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:44 crc kubenswrapper[4823]: I1216 06:55:44.683998 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:44 crc kubenswrapper[4823]: I1216 06:55:44.684011 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:44 crc kubenswrapper[4823]: I1216 06:55:44.684045 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:44Z","lastTransitionTime":"2025-12-16T06:55:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:55:44 crc kubenswrapper[4823]: I1216 06:55:44.697610 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:44Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:44 crc kubenswrapper[4823]: I1216 06:55:44.736275 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"25dec47c-3043-486c-b371-2be103c214e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2bf5eb4e2f587f7084f731ca681a116313b1014b02cdb391b09fdcdd42600c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmpf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78aed5cdf8d3a29dfffcdabfd18b38c0cf762258
53078e01602198bee4994536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmpf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fv56f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:44Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:44 crc kubenswrapper[4823]: I1216 06:55:44.786622 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:44 crc kubenswrapper[4823]: I1216 06:55:44.786653 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:44 crc kubenswrapper[4823]: I1216 06:55:44.786662 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:44 crc 
kubenswrapper[4823]: I1216 06:55:44.786675 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:44 crc kubenswrapper[4823]: I1216 06:55:44.786684 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:44Z","lastTransitionTime":"2025-12-16T06:55:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:55:44 crc kubenswrapper[4823]: I1216 06:55:44.888312 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:44 crc kubenswrapper[4823]: I1216 06:55:44.888355 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:44 crc kubenswrapper[4823]: I1216 06:55:44.888366 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:44 crc kubenswrapper[4823]: I1216 06:55:44.888382 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:44 crc kubenswrapper[4823]: I1216 06:55:44.888396 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:44Z","lastTransitionTime":"2025-12-16T06:55:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:55:44 crc kubenswrapper[4823]: I1216 06:55:44.971381 4823 generic.go:334] "Generic (PLEG): container finished" podID="a95a09e1-8457-4d5e-b1b6-dd6f59a66c6c" containerID="2048aefee6601e678e26821b3ed329aead74791b34654ef5dde9fe261e37b835" exitCode=0 Dec 16 06:55:44 crc kubenswrapper[4823]: I1216 06:55:44.971449 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-964hc" event={"ID":"a95a09e1-8457-4d5e-b1b6-dd6f59a66c6c","Type":"ContainerDied","Data":"2048aefee6601e678e26821b3ed329aead74791b34654ef5dde9fe261e37b835"} Dec 16 06:55:44 crc kubenswrapper[4823]: I1216 06:55:44.976718 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" event={"ID":"08e48f89-7095-4ea2-afb5-759591c2b0d4","Type":"ContainerStarted","Data":"c223f5207e2445fd4bfd1bfe2c1c1ca50b5211a92b02c21445534fbfa2709e28"} Dec 16 06:55:44 crc kubenswrapper[4823]: I1216 06:55:44.985861 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"25dec47c-3043-486c-b371-2be103c214e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2bf5eb4e2f587f7084f731ca681a116313b1014b02cdb391b09fdcdd42600c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmpf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78aed5cdf8d3a29dfffcdabfd18b38c0cf762258
53078e01602198bee4994536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmpf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fv56f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:44Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:44 crc kubenswrapper[4823]: I1216 06:55:44.998121 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:44 crc kubenswrapper[4823]: I1216 06:55:44.998198 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:44 crc kubenswrapper[4823]: I1216 06:55:44.998212 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:44 crc 
kubenswrapper[4823]: I1216 06:55:44.998230 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:44 crc kubenswrapper[4823]: I1216 06:55:44.998246 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:44Z","lastTransitionTime":"2025-12-16T06:55:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:55:45 crc kubenswrapper[4823]: I1216 06:55:45.012838 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3de3a2ad-dc6c-47ba-af8d-4f128e025aad\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c40d3d73f70c8983adc8d076d89864e7224528dae97252396a1c34bd4cb804e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50bb40e9d22674554f2a267df7c9f924a29131ac986877bf19572cc5992ba396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9c22b0787554b4bdf0b0068fc696b8515e2ee63affef23e5eb64f77bc32a624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true
,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a32894b8f22c5335ed585c26fcab727324803d768943f40455a592025cbfd0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1560fb0ba0c40a2797688b518c22c9164a8cbe9a265cb2ad95408ba86b0fb537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/
\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3d6894d867564eea62c7d78dd6ca62a9913787ba06b786722e478785ba796ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3d6894d867564eea62c7d78dd6ca62a9913787ba06b786722e478785ba796ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e551b627e90b56385fe5618d3e66ea7376a2d574589583c005373aa29db1d410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e551b627e90b56385fe5618d3e66ea7376a2d574589583c005373aa29db1d410\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://631458d415d64c3cd4dfd24771644393aa6c01288de8f7a5395cf8888db67d2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://631458d415d64c3cd4dfd24771644393aa6c01288de8f7a5395cf8888db67d2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:45Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:45 crc kubenswrapper[4823]: I1216 06:55:45.028447 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e388fa291257a537c122a1f3f0eb6a60eb74e4bcaaf32c4fe829be2da1bfdae9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:45Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:45 crc kubenswrapper[4823]: I1216 06:55:45.041788 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:45Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:45 crc kubenswrapper[4823]: I1216 06:55:45.058259 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c915dd50-9820-494e-b47a-987257910a57\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c33a30aed9756ef012ff9046309ec07897de3b17707ea39c621534ed863c3adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://464ab3cf14920ae3f947ce3dccacd7379d50380699008411d0a51bd83fe3e51c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20f7e0e3c07f467643afdcc6f0e333353d2d10b6d0c0d7aff0d347d33a5d925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ddfb7d6dae98465369260c8e9c64a2aa21dbf226e7898f6de0c36b3806bcae8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c9e6ec333ac552dc972c31cce23104cc27bcc2cf11bdc52d89ab36172915a41\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"ed a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1853467785/tls.crt::/tmp/serving-cert-1853467785/tls.key\\\\\\\"\\\\nI1216 06:55:39.731946 1 requestheader_controller.go:247] Loaded a new request header values for 
RequestHeaderAuthRequestController\\\\nI1216 06:55:39.736431 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 06:55:39.736560 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 06:55:39.736611 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 06:55:39.736722 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 06:55:39.741676 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 06:55:39.741701 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 06:55:39.741707 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 06:55:39.741713 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 06:55:39.741717 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 06:55:39.741721 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 06:55:39.741725 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 06:55:39.741774 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 06:55:39.744309 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1853467785/tls.crt::/tmp/serving-cert-1853467785/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765868124\\\\\\\\\\\\\\\" (2025-12-16 06:55:23 +0000 UTC to 2026-01-15 06:55:24 +0000 UTC (now=2025-12-16 06:55:39.744275203 +0000 UTC))\\\\\\\"\\\\nF1216 06:55:39.744562 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c84e7f6950c8d433823b7d5ae3d8a9ecaa70ab6845ce607b8bb93efa990325a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84f11d2687959c8525b131966a998af137caa69a480bb31a94e1615a3ccbea5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84f11d2687959c8525b131966a998af137caa69a480bb31a94e1615a3ccbea5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:45Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:45 crc kubenswrapper[4823]: I1216 06:55:45.071074 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4c70365-dff4-4b29-af25-657fd9823db4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59ddc7310ed1600d050e488762436763220f3c2946a4c2020346da4ccef46b1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://275328a704b475b037c82f92055f9ca7f83b9821325603c5e34090140cc486ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7fa96323e345f8891f51500618a7b07d128d3e7256a3534d9dca75ff8ce9edf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d706c1463d2c913afe727db1a87a2841698d91a586c84d6b55f02865e7f5584a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:45Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:45 crc kubenswrapper[4823]: I1216 06:55:45.086487 4823 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:45Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:45 crc kubenswrapper[4823]: I1216 06:55:45.098863 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72dfe197d43de2945a2ee22789ab86205c77588c4f4002e2473cef09e7b4b10b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-16T06:55:45Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:45 crc kubenswrapper[4823]: I1216 06:55:45.100774 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:45 crc kubenswrapper[4823]: I1216 06:55:45.100810 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:45 crc kubenswrapper[4823]: I1216 06:55:45.100828 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:45 crc kubenswrapper[4823]: I1216 06:55:45.100845 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:45 crc kubenswrapper[4823]: I1216 06:55:45.100856 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:45Z","lastTransitionTime":"2025-12-16T06:55:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:55:45 crc kubenswrapper[4823]: I1216 06:55:45.108935 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hr8h5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d31d032-9142-4e26-9f06-e3a5ea73d530\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://502e1707b8f53950b8f10fa6b289c9d235d6596fb6eec7e4d69c3158823f8e2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sdtm\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hr8h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:45Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:45 crc kubenswrapper[4823]: I1216 06:55:45.141742 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n248g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b377757-dbc6-4d9c-9656-3ff65d7d113a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78e6c48a7009b10b70f11bfc96f18839781aeccd794f9ce7a4074271cfb9becc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2skkc\\\",\\\"readO
nly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n248g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:45Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:45 crc kubenswrapper[4823]: I1216 06:55:45.181332 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-964hc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a95a09e1-8457-4d5e-b1b6-dd6f59a66c6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b305ee45caf9aff77bb738777dd9b2ebd2e3e63727b460eae22f8f66d1e6cec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b305ee45caf9aff77bb738777dd9b2ebd2e3e63727b460eae22f8f66d1e6cec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d77b125c0d081148c1892229c933414ac8e1b8e0371f0932512deba1107e94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9d77b125c0d081148c1892229c933414ac8e1b8e0371f0932512deba1107e94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2048aefee6601e678e26821b3ed329aead747
91b34654ef5dde9fe261e37b835\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2048aefee6601e678e26821b3ed329aead74791b34654ef5dde9fe261e37b835\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"
kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-964hc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:45Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:45 crc kubenswrapper[4823]: I1216 06:55:45.203001 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:45 crc kubenswrapper[4823]: I1216 06:55:45.203043 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:45 crc kubenswrapper[4823]: I1216 06:55:45.203054 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:45 crc kubenswrapper[4823]: I1216 06:55:45.203074 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:45 crc kubenswrapper[4823]: I1216 06:55:45.203083 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:45Z","lastTransitionTime":"2025-12-16T06:55:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:55:45 crc kubenswrapper[4823]: I1216 06:55:45.218826 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:45Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:45 crc kubenswrapper[4823]: I1216 06:55:45.258398 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63ba735c509770e9d43fe8152b2a42dc4123d0d797c7d41d5e3efe39a6cef74d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e09486327bad82b741ca9260e306f84ed9f0fbc7058bb06267042b8d2cd5f818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:45Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:45 crc kubenswrapper[4823]: I1216 06:55:45.296646 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bwcng" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ff057ef-c324-4465-8b8d-c7b98c25b23c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea36a181f0874fbab3022e6ce27567b65eb8c01cdb2925b08d9c0782af7e93c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2bj4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bwcng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:45Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:45 crc kubenswrapper[4823]: I1216 06:55:45.305221 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:45 crc kubenswrapper[4823]: I1216 06:55:45.305276 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:45 crc kubenswrapper[4823]: I1216 06:55:45.305288 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:45 crc kubenswrapper[4823]: I1216 06:55:45.305305 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:45 crc kubenswrapper[4823]: I1216 06:55:45.305316 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:45Z","lastTransitionTime":"2025-12-16T06:55:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:55:45 crc kubenswrapper[4823]: I1216 06:55:45.343269 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08e48f89-7095-4ea2-afb5-759591c2b0d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e3566766c9c24f5c12481b74ed8a3938b21bdb36c003146515c67170d74a71f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e3566766c9c24f5c12481b74ed8a3938b21bdb36c003146515c67170d74a71f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zwjhk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:45Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:45 crc kubenswrapper[4823]: I1216 06:55:45.408548 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:45 crc kubenswrapper[4823]: I1216 06:55:45.408606 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:45 crc kubenswrapper[4823]: I1216 06:55:45.408617 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:45 crc kubenswrapper[4823]: I1216 06:55:45.408631 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:45 crc kubenswrapper[4823]: I1216 06:55:45.408639 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:45Z","lastTransitionTime":"2025-12-16T06:55:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:55:45 crc kubenswrapper[4823]: I1216 06:55:45.511137 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:45 crc kubenswrapper[4823]: I1216 06:55:45.511183 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:45 crc kubenswrapper[4823]: I1216 06:55:45.511207 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:45 crc kubenswrapper[4823]: I1216 06:55:45.511228 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:45 crc kubenswrapper[4823]: I1216 06:55:45.511243 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:45Z","lastTransitionTime":"2025-12-16T06:55:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:55:45 crc kubenswrapper[4823]: I1216 06:55:45.613673 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:45 crc kubenswrapper[4823]: I1216 06:55:45.613716 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:45 crc kubenswrapper[4823]: I1216 06:55:45.613728 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:45 crc kubenswrapper[4823]: I1216 06:55:45.613746 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:45 crc kubenswrapper[4823]: I1216 06:55:45.613759 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:45Z","lastTransitionTime":"2025-12-16T06:55:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:55:45 crc kubenswrapper[4823]: I1216 06:55:45.716345 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:45 crc kubenswrapper[4823]: I1216 06:55:45.716400 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:45 crc kubenswrapper[4823]: I1216 06:55:45.716409 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:45 crc kubenswrapper[4823]: I1216 06:55:45.716425 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:45 crc kubenswrapper[4823]: I1216 06:55:45.716434 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:45Z","lastTransitionTime":"2025-12-16T06:55:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:55:45 crc kubenswrapper[4823]: I1216 06:55:45.771148 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:55:45 crc kubenswrapper[4823]: I1216 06:55:45.771224 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:55:45 crc kubenswrapper[4823]: I1216 06:55:45.771342 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:55:45 crc kubenswrapper[4823]: E1216 06:55:45.771347 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 06:55:45 crc kubenswrapper[4823]: E1216 06:55:45.771486 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 06:55:45 crc kubenswrapper[4823]: E1216 06:55:45.771598 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 06:55:45 crc kubenswrapper[4823]: I1216 06:55:45.819167 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:45 crc kubenswrapper[4823]: I1216 06:55:45.819211 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:45 crc kubenswrapper[4823]: I1216 06:55:45.819227 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:45 crc kubenswrapper[4823]: I1216 06:55:45.819245 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:45 crc kubenswrapper[4823]: I1216 06:55:45.819268 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:45Z","lastTransitionTime":"2025-12-16T06:55:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:55:45 crc kubenswrapper[4823]: I1216 06:55:45.922128 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:45 crc kubenswrapper[4823]: I1216 06:55:45.922166 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:45 crc kubenswrapper[4823]: I1216 06:55:45.922175 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:45 crc kubenswrapper[4823]: I1216 06:55:45.922191 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:45 crc kubenswrapper[4823]: I1216 06:55:45.922201 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:45Z","lastTransitionTime":"2025-12-16T06:55:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:55:45 crc kubenswrapper[4823]: I1216 06:55:45.982752 4823 generic.go:334] "Generic (PLEG): container finished" podID="a95a09e1-8457-4d5e-b1b6-dd6f59a66c6c" containerID="c574f5f8d3f82a7a21c7f8e49e7927387be9e4029e28ba265d39ea058fb6a60f" exitCode=0 Dec 16 06:55:45 crc kubenswrapper[4823]: I1216 06:55:45.982811 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-964hc" event={"ID":"a95a09e1-8457-4d5e-b1b6-dd6f59a66c6c","Type":"ContainerDied","Data":"c574f5f8d3f82a7a21c7f8e49e7927387be9e4029e28ba265d39ea058fb6a60f"} Dec 16 06:55:46 crc kubenswrapper[4823]: I1216 06:55:46.014917 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08e48f89-7095-4ea2-afb5-759591c2b0d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging 
kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa
41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\
\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,
\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e3566766c9c24f5c12481b74ed8a3938b21bdb36c003146515c67170d74a71f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e3566766c9c24f5c12481b74ed8a3938b21bdb36c003146515c67170d74a71f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zwjhk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:46Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:46 crc kubenswrapper[4823]: I1216 06:55:46.024223 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:46 crc kubenswrapper[4823]: I1216 06:55:46.024267 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:46 crc kubenswrapper[4823]: I1216 06:55:46.024311 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:46 crc kubenswrapper[4823]: I1216 06:55:46.024331 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:46 crc kubenswrapper[4823]: I1216 06:55:46.024342 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:46Z","lastTransitionTime":"2025-12-16T06:55:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:55:46 crc kubenswrapper[4823]: I1216 06:55:46.035991 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3de3a2ad-dc6c-47ba-af8d-4f128e025aad\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c40d3d73f70c8983adc8d076d89864e7224528dae97252396a1c34bd4cb804e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50bb40e9d22674554f2a267df7c9f924a29131ac986877bf19572cc5992ba396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9c22b0787554b4bdf0b0068fc696b8515e2ee63affef23e5eb64f77bc32a624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a32894b8f22c5335ed585c26fcab727324803d768943f40455a592025cbfd0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1560fb0ba0c40a2797688b518c22c9164a8cbe9a265cb2ad95408ba86b0fb537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3d6894d867564eea62c7d78dd6ca62a9913787ba06b786722e478785ba796ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3d6894d867564eea62c7d78dd6ca62a9913787ba06b786722e478785ba796ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e551b627e90b56385fe5618d3e66ea7376a2d574589583c005373aa29db1d410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e551b627e90b56385fe5618d3e66ea7376a2d574589583c005373aa29db1d410\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://631458d415d64c3cd4dfd24771644393aa6c01288de8f7a5395cf8888db67d2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://631458d415d64c3cd4dfd24771644393aa6c01288de8f7a5395cf8888db67d2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:46Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:46 crc kubenswrapper[4823]: I1216 06:55:46.050636 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e388fa291257a537c122a1f3f0eb6a60eb74e4bcaaf32c4fe829be2da1bfdae9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:46Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:46 crc kubenswrapper[4823]: I1216 06:55:46.065601 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:46Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:46 crc kubenswrapper[4823]: I1216 06:55:46.076277 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"25dec47c-3043-486c-b371-2be103c214e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2bf5eb4e2f587f7084f731ca681a116313b1014b02cdb391b09fdcdd42600c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmpf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78aed5cdf8d3a29dfffcdabfd18b38c0cf762258
53078e01602198bee4994536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmpf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fv56f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:46Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:46 crc kubenswrapper[4823]: I1216 06:55:46.090512 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c915dd50-9820-494e-b47a-987257910a57\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c33a30aed9756ef012ff9046309ec07897de3b17707ea39c621534ed863c3adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://464ab3cf14920ae3f947ce3dccacd7379d50380699008411d0a51bd83fe3e51c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20f7e0e3c07f467643afdcc6f0e333353d2d10b6d0c0d7aff0d347d33a5d925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ddfb7d6dae98465369260c8e9c64a2aa21dbf226e7898f6de0c36b3806bcae8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c9e6ec333ac552dc972c31cce23104cc27bcc2cf11bdc52d89ab36172915a41\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"ed a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1853467785/tls.crt::/tmp/serving-cert-1853467785/tls.key\\\\\\\"\\\\nI1216 06:55:39.731946 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 06:55:39.736431 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 06:55:39.736560 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 06:55:39.736611 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 06:55:39.736722 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 06:55:39.741676 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 06:55:39.741701 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 06:55:39.741707 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 06:55:39.741713 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 06:55:39.741717 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 06:55:39.741721 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 06:55:39.741725 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 06:55:39.741774 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 06:55:39.744309 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" 
certName=\\\\\\\"serving-cert::/tmp/serving-cert-1853467785/tls.crt::/tmp/serving-cert-1853467785/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765868124\\\\\\\\\\\\\\\" (2025-12-16 06:55:23 +0000 UTC to 2026-01-15 06:55:24 +0000 UTC (now=2025-12-16 06:55:39.744275203 +0000 UTC))\\\\\\\"\\\\nF1216 06:55:39.744562 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c84e7f6950c8d433823b7d5ae3d8a9ecaa70ab6845ce607b8bb93efa990325a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84f11d2687959c8525b131966a998af137caa69a480bb31a94e1615a3ccbea5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84f11d2687959c8525b131966a998af137caa69a480bb31a94e1615a3ccbea5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:46Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:46 crc kubenswrapper[4823]: I1216 06:55:46.109011 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4c70365-dff4-4b29-af25-657fd9823db4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59ddc7310ed1600d050e488762436763220f3c2946a4c2020346da4ccef46b1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://275328a704b475b037c82f92055f9ca7f83b9821325603c5e34090140cc486ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7fa96323e345f8891f51500618a7b07d128d3e7256a3534d9dca75ff8ce9edf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d706c1463d2c913afe727db1a87a2841698d91a586c84d6b55f02865e7f5584a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:46Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:46 crc kubenswrapper[4823]: I1216 06:55:46.124199 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n248g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b377757-dbc6-4d9c-9656-3ff65d7d113a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78e6c48a7009b10b70f11bfc96f18839781aeccd794f9ce7a4074271cfb9becc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2skkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n248g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:46Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:46 crc kubenswrapper[4823]: I1216 06:55:46.126876 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:46 crc 
kubenswrapper[4823]: I1216 06:55:46.126905 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:46 crc kubenswrapper[4823]: I1216 06:55:46.126915 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:46 crc kubenswrapper[4823]: I1216 06:55:46.126930 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:46 crc kubenswrapper[4823]: I1216 06:55:46.126941 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:46Z","lastTransitionTime":"2025-12-16T06:55:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:55:46 crc kubenswrapper[4823]: I1216 06:55:46.140360 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-964hc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a95a09e1-8457-4d5e-b1b6-dd6f59a66c6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b305ee45caf9aff77bb738777dd9b2ebd2e3e63727b460eae22f8f66d1e6cec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://4b305ee45caf9aff77bb738777dd9b2ebd2e3e63727b460eae22f8f66d1e6cec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d77b125c0d081148c1892229c933414ac8e1b8e0371f0932512deba1107e94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9d77b125c0d081148c1892229c933414ac8e1b8e0371f0932512deba1107e94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2048aefee6601e678e26821b3ed329aead74791b34654ef5dde9fe261e37b835\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2048aefee6601e678e26821b3ed329aead74791b34654ef5dde9fe261e37b835\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c574f5f8d3f82a7a21c7f8e49e7927387be9e4029e28ba265d39ea058fb6a60f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c574f5f8d3f82a7a21c7f8e49e7927387be9e4029e28ba265d39ea058fb6a60f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\
\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-964hc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:46Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:46 crc kubenswrapper[4823]: I1216 06:55:46.153607 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:46Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:46 crc kubenswrapper[4823]: I1216 06:55:46.165301 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63ba735c509770e9d43fe8152b2a42dc4123d0d797c7d41d5e3efe39a6cef74d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e09486327bad82b741ca9260e306f84ed9f0fbc7058bb06267042b8d2cd5f818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:46Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:46 crc kubenswrapper[4823]: I1216 06:55:46.178385 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:46Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:46 crc kubenswrapper[4823]: I1216 06:55:46.190767 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72dfe197d43de2945a2ee22789ab86205c77588c4f4002e2473cef09e7b4b10b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-16T06:55:46Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:46 crc kubenswrapper[4823]: I1216 06:55:46.208420 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hr8h5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d31d032-9142-4e26-9f06-e3a5ea73d530\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://502e1707b8f53950b8f10fa6b289c9d235d6596fb6eec7e4d69c3158823f8e2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-7sdtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hr8h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:46Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:46 crc kubenswrapper[4823]: I1216 06:55:46.223010 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bwcng" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ff057ef-c324-4465-8b8d-c7b98c25b23c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea36a181f0874fbab3022e6ce27567b65eb8c01cdb2925b08d9c07
82af7e93c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2bj4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bwcng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:46Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:46 crc kubenswrapper[4823]: I1216 06:55:46.229817 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:46 crc kubenswrapper[4823]: I1216 06:55:46.229868 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:46 crc kubenswrapper[4823]: I1216 06:55:46.229880 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" 
Dec 16 06:55:46 crc kubenswrapper[4823]: I1216 06:55:46.229897 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:46 crc kubenswrapper[4823]: I1216 06:55:46.229907 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:46Z","lastTransitionTime":"2025-12-16T06:55:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:55:46 crc kubenswrapper[4823]: I1216 06:55:46.334739 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:46 crc kubenswrapper[4823]: I1216 06:55:46.334790 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:46 crc kubenswrapper[4823]: I1216 06:55:46.334803 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:46 crc kubenswrapper[4823]: I1216 06:55:46.334819 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:46 crc kubenswrapper[4823]: I1216 06:55:46.334828 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:46Z","lastTransitionTime":"2025-12-16T06:55:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:55:46 crc kubenswrapper[4823]: I1216 06:55:46.436806 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:46 crc kubenswrapper[4823]: I1216 06:55:46.436850 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:46 crc kubenswrapper[4823]: I1216 06:55:46.436860 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:46 crc kubenswrapper[4823]: I1216 06:55:46.436876 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:46 crc kubenswrapper[4823]: I1216 06:55:46.436885 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:46Z","lastTransitionTime":"2025-12-16T06:55:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:55:46 crc kubenswrapper[4823]: I1216 06:55:46.539266 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:46 crc kubenswrapper[4823]: I1216 06:55:46.539324 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:46 crc kubenswrapper[4823]: I1216 06:55:46.539338 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:46 crc kubenswrapper[4823]: I1216 06:55:46.539355 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:46 crc kubenswrapper[4823]: I1216 06:55:46.539365 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:46Z","lastTransitionTime":"2025-12-16T06:55:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:55:46 crc kubenswrapper[4823]: I1216 06:55:46.641485 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:46 crc kubenswrapper[4823]: I1216 06:55:46.641788 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:46 crc kubenswrapper[4823]: I1216 06:55:46.641875 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:46 crc kubenswrapper[4823]: I1216 06:55:46.641956 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:46 crc kubenswrapper[4823]: I1216 06:55:46.642050 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:46Z","lastTransitionTime":"2025-12-16T06:55:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:55:46 crc kubenswrapper[4823]: I1216 06:55:46.744870 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:46 crc kubenswrapper[4823]: I1216 06:55:46.744914 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:46 crc kubenswrapper[4823]: I1216 06:55:46.744926 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:46 crc kubenswrapper[4823]: I1216 06:55:46.744943 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:46 crc kubenswrapper[4823]: I1216 06:55:46.744953 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:46Z","lastTransitionTime":"2025-12-16T06:55:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:55:46 crc kubenswrapper[4823]: I1216 06:55:46.848798 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:46 crc kubenswrapper[4823]: I1216 06:55:46.848844 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:46 crc kubenswrapper[4823]: I1216 06:55:46.848852 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:46 crc kubenswrapper[4823]: I1216 06:55:46.848869 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:46 crc kubenswrapper[4823]: I1216 06:55:46.848879 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:46Z","lastTransitionTime":"2025-12-16T06:55:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:55:46 crc kubenswrapper[4823]: I1216 06:55:46.956245 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:46 crc kubenswrapper[4823]: I1216 06:55:46.956297 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:46 crc kubenswrapper[4823]: I1216 06:55:46.956306 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:46 crc kubenswrapper[4823]: I1216 06:55:46.956321 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:46 crc kubenswrapper[4823]: I1216 06:55:46.956330 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:46Z","lastTransitionTime":"2025-12-16T06:55:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:55:46 crc kubenswrapper[4823]: I1216 06:55:46.988975 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-964hc" event={"ID":"a95a09e1-8457-4d5e-b1b6-dd6f59a66c6c","Type":"ContainerStarted","Data":"5e32bdec2a02013c087ccc13325d44061d85bcf9a352f2dc31ed506f11db6428"} Dec 16 06:55:47 crc kubenswrapper[4823]: I1216 06:55:47.014745 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:47Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:47 crc kubenswrapper[4823]: I1216 06:55:47.030570 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"25dec47c-3043-486c-b371-2be103c214e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2bf5eb4e2f587f7084f731ca681a116313b1014b02cdb391b09fdcdd42600c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmpf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78aed5cdf8d3a29dfffcdabfd18b38c0cf762258
53078e01602198bee4994536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmpf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fv56f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:47Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:47 crc kubenswrapper[4823]: I1216 06:55:47.050105 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3de3a2ad-dc6c-47ba-af8d-4f128e025aad\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c40d3d73f70c8983adc8d076d89864e7224528dae97252396a1c34bd4cb804e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50bb40e9d22674554f2a267df7c9f924a29131ac986877bf19572cc5992ba396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9c22b0787554b4bdf0b0068fc696b8515e2ee63affef23e5eb64f77bc32a624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a32894b8f22c5335ed585c26fcab727324803d768943f40455a592025cbfd0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1560fb0ba0c40a2797688b518c22c9164a8cbe9a265cb2ad95408ba86b0fb537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3d6894d867564eea62c7d78dd6ca62a9913787ba06b786722e478785ba796ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3d6894d867564eea62c7d78dd6ca62a9913787ba06b786722e478785ba796ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-16T06:55:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e551b627e90b56385fe5618d3e66ea7376a2d574589583c005373aa29db1d410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e551b627e90b56385fe5618d3e66ea7376a2d574589583c005373aa29db1d410\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://631458d415d64c3cd4dfd24771644393aa6c01288de8f7a5395cf8888db67d2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://631458d415d64c3cd4dfd24771644393aa6c01288de8f7a5395cf8888db67d2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:47Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:47 crc kubenswrapper[4823]: I1216 06:55:47.061794 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:47 crc kubenswrapper[4823]: I1216 06:55:47.062102 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:47 crc kubenswrapper[4823]: I1216 06:55:47.062114 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:47 crc kubenswrapper[4823]: I1216 06:55:47.062131 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:47 crc kubenswrapper[4823]: I1216 06:55:47.062141 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:47Z","lastTransitionTime":"2025-12-16T06:55:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:55:47 crc kubenswrapper[4823]: I1216 06:55:47.063820 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e388fa291257a537c122a1f3f0eb6a60eb74e4bcaaf32c4fe829be2da1bfdae9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:47Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:47 crc kubenswrapper[4823]: I1216 06:55:47.077688 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c915dd50-9820-494e-b47a-987257910a57\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c33a30aed9756ef012ff9046309ec07897de3b17707ea39c621534ed863c3adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://464ab3cf14920ae3f947ce3dccacd7379d50380699008411d0a51bd83fe3e51c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://a20f7e0e3c07f467643afdcc6f0e333353d2d10b6d0c0d7aff0d347d33a5d925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ddfb7d6dae98465369260c8e9c64a2aa21dbf226e7898f6de0c36b3806bcae8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c9e6ec333ac552dc972c31cce23104cc27bcc2cf11bdc52d89ab36172915a41\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"ed a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1853467785/tls.crt::/tmp/serving-cert-1853467785/tls.key\\\\\\\"\\\\nI1216 06:55:39.731946 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 06:55:39.736431 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 06:55:39.736560 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 06:55:39.736611 1 maxinflight.go:116] \\\\\\\"Set 
denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 06:55:39.736722 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 06:55:39.741676 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 06:55:39.741701 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 06:55:39.741707 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 06:55:39.741713 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 06:55:39.741717 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 06:55:39.741721 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 06:55:39.741725 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 06:55:39.741774 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 06:55:39.744309 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1853467785/tls.crt::/tmp/serving-cert-1853467785/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765868124\\\\\\\\\\\\\\\" (2025-12-16 06:55:23 +0000 UTC to 2026-01-15 06:55:24 +0000 UTC (now=2025-12-16 06:55:39.744275203 +0000 UTC))\\\\\\\"\\\\nF1216 06:55:39.744562 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c84e7f6950c8d433823b7d5ae3d8a9ecaa70ab6845ce607b8bb93efa990325a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84f11d2687959c8525b131966a998af137caa69a480bb31a94e1615a3ccbea5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84f11d2687959c8525b131966a998af137caa69a480bb31a94e1615a3ccbea5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:47Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:47 crc kubenswrapper[4823]: I1216 06:55:47.089645 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4c70365-dff4-4b29-af25-657fd9823db4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59ddc7310ed1600d050e488762436763220f3c2946a4c2020346da4ccef46b1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://275328a704b475b037c82f92055f9ca7f83b9821325603c5e34090140cc486ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7fa96323e345f8891f51500618a7b07d128d3e7256a3534d9dca75ff8ce9edf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d706c1463d2c913afe727db1a87a2841698d91a586c84d6b55f02865e7f5584a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:47Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:47 crc kubenswrapper[4823]: I1216 06:55:47.115771 4823 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63ba735c509770e9d43fe8152b2a42dc4123d0d797c7d41d5e3efe39a6cef74d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e09486327bad82b741ca9260e306f84ed9f0fbc7058bb06267042b8d2cd5f818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:47Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:47 crc kubenswrapper[4823]: I1216 06:55:47.130132 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:47Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:47 crc kubenswrapper[4823]: I1216 06:55:47.142853 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72dfe197d43de2945a2ee22789ab86205c77588c4f4002e2473cef09e7b4b10b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-16T06:55:47Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:47 crc kubenswrapper[4823]: I1216 06:55:47.154975 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hr8h5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d31d032-9142-4e26-9f06-e3a5ea73d530\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://502e1707b8f53950b8f10fa6b289c9d235d6596fb6eec7e4d69c3158823f8e2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-7sdtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hr8h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:47Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:47 crc kubenswrapper[4823]: I1216 06:55:47.167159 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:47 crc kubenswrapper[4823]: I1216 06:55:47.167190 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:47 crc kubenswrapper[4823]: I1216 06:55:47.167199 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:47 crc kubenswrapper[4823]: I1216 06:55:47.167212 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:47 crc kubenswrapper[4823]: I1216 06:55:47.167222 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:47Z","lastTransitionTime":"2025-12-16T06:55:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:55:47 crc kubenswrapper[4823]: I1216 06:55:47.171338 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n248g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b377757-dbc6-4d9c-9656-3ff65d7d113a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78e6c48a7009b10b70f11bfc96f18839781aeccd794f9ce7a4074271cfb9becc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2skkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n248g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:47Z 
is after 2025-08-24T17:21:41Z" Dec 16 06:55:47 crc kubenswrapper[4823]: I1216 06:55:47.185075 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-964hc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a95a09e1-8457-4d5e-b1b6-dd6f59a66c6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b305ee45caf9aff77bb738777dd9b2ebd2e3e63727b460eae22f8f66d1e6cec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b305ee45caf9aff77bb738777dd9b2ebd2e3e63727b460eae22f8f66d1e6cec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d77b125c0d081148c1892229c933414ac8e1b8e0371f0932512deba1107e94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9d77b125c0d081148c1892229c933414ac8e1b8e0371f0932512deba1107e94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2048aefee6601e678e26821b3ed329aead74791b34654ef5dde9fe261e37b835\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2048aefee6601e678e26821b3ed329aead74791b34654ef5dde9fe261e37b835\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c574f5f8d3f82a7a21c7f8e49e7927387be9e4029e28ba265d39ea058fb6a60f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c574f5f8d3f82a7a21c7f8e49e7927387be9e4029e28ba265d39ea058fb6a60f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e32bdec2a02013c087ccc13325d44061d85bcf9a352f2dc31ed506f11db6428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5s
cf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-964hc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:47Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:47 crc kubenswrapper[4823]: I1216 06:55:47.200200 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:47Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:47 crc kubenswrapper[4823]: I1216 06:55:47.217329 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bwcng" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ff057ef-c324-4465-8b8d-c7b98c25b23c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea36a181f0874fbab3022e6ce27567b65eb8c01cdb2925b08d9c0782af7e93c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2bj4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bwcng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:47Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:47 crc kubenswrapper[4823]: I1216 06:55:47.247071 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08e48f89-7095-4ea2-afb5-759591c2b0d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e3566766c9c24f5c12481b74ed8a3938b21bdb36c003146515c67170d74a71f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e3566766c9c24f5c12481b74ed8a3938b21bdb36c003146515c67170d74a71f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zwjhk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:47Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:47 crc kubenswrapper[4823]: I1216 06:55:47.269499 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:47 crc kubenswrapper[4823]: I1216 06:55:47.269551 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:47 crc kubenswrapper[4823]: I1216 06:55:47.269560 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:47 crc kubenswrapper[4823]: I1216 06:55:47.269575 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:47 crc kubenswrapper[4823]: I1216 06:55:47.269599 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:47Z","lastTransitionTime":"2025-12-16T06:55:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:55:47 crc kubenswrapper[4823]: I1216 06:55:47.371526 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:47 crc kubenswrapper[4823]: I1216 06:55:47.371551 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:47 crc kubenswrapper[4823]: I1216 06:55:47.371559 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:47 crc kubenswrapper[4823]: I1216 06:55:47.371587 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:47 crc kubenswrapper[4823]: I1216 06:55:47.371596 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:47Z","lastTransitionTime":"2025-12-16T06:55:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:55:47 crc kubenswrapper[4823]: I1216 06:55:47.457740 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:55:47 crc kubenswrapper[4823]: I1216 06:55:47.457856 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:55:47 crc kubenswrapper[4823]: I1216 06:55:47.457979 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:55:47 crc kubenswrapper[4823]: I1216 06:55:47.458008 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:55:47 crc kubenswrapper[4823]: E1216 06:55:47.458068 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:55:55.458050728 +0000 UTC m=+33.946616851 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:55:47 crc kubenswrapper[4823]: I1216 06:55:47.458090 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:55:47 crc kubenswrapper[4823]: E1216 06:55:47.458124 4823 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 16 06:55:47 crc kubenswrapper[4823]: E1216 06:55:47.458149 4823 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 16 06:55:47 crc kubenswrapper[4823]: E1216 06:55:47.458214 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-16 06:55:55.458199833 +0000 UTC m=+33.946765956 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 16 06:55:47 crc kubenswrapper[4823]: E1216 06:55:47.458235 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-16 06:55:55.458224314 +0000 UTC m=+33.946790437 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 16 06:55:47 crc kubenswrapper[4823]: E1216 06:55:47.458295 4823 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 16 06:55:47 crc kubenswrapper[4823]: E1216 06:55:47.458314 4823 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 16 06:55:47 crc kubenswrapper[4823]: E1216 06:55:47.458328 4823 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 06:55:47 crc kubenswrapper[4823]: E1216 06:55:47.458375 4823 projected.go:288] 
Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 16 06:55:47 crc kubenswrapper[4823]: E1216 06:55:47.458384 4823 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 16 06:55:47 crc kubenswrapper[4823]: E1216 06:55:47.458390 4823 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 06:55:47 crc kubenswrapper[4823]: E1216 06:55:47.458419 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-16 06:55:55.45841204 +0000 UTC m=+33.946978173 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 06:55:47 crc kubenswrapper[4823]: E1216 06:55:47.458471 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-16 06:55:55.458464841 +0000 UTC m=+33.947030964 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 06:55:47 crc kubenswrapper[4823]: I1216 06:55:47.473733 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:47 crc kubenswrapper[4823]: I1216 06:55:47.473766 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:47 crc kubenswrapper[4823]: I1216 06:55:47.473776 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:47 crc kubenswrapper[4823]: I1216 06:55:47.473793 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:47 crc kubenswrapper[4823]: I1216 06:55:47.473803 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:47Z","lastTransitionTime":"2025-12-16T06:55:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:55:47 crc kubenswrapper[4823]: I1216 06:55:47.576830 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:47 crc kubenswrapper[4823]: I1216 06:55:47.576867 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:47 crc kubenswrapper[4823]: I1216 06:55:47.576877 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:47 crc kubenswrapper[4823]: I1216 06:55:47.576895 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:47 crc kubenswrapper[4823]: I1216 06:55:47.577165 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:47Z","lastTransitionTime":"2025-12-16T06:55:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:55:47 crc kubenswrapper[4823]: I1216 06:55:47.679228 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:47 crc kubenswrapper[4823]: I1216 06:55:47.679252 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:47 crc kubenswrapper[4823]: I1216 06:55:47.679261 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:47 crc kubenswrapper[4823]: I1216 06:55:47.679275 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:47 crc kubenswrapper[4823]: I1216 06:55:47.679284 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:47Z","lastTransitionTime":"2025-12-16T06:55:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:55:47 crc kubenswrapper[4823]: I1216 06:55:47.770714 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:55:47 crc kubenswrapper[4823]: E1216 06:55:47.770837 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 06:55:47 crc kubenswrapper[4823]: I1216 06:55:47.771204 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:55:47 crc kubenswrapper[4823]: I1216 06:55:47.771234 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:55:47 crc kubenswrapper[4823]: E1216 06:55:47.771276 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 06:55:47 crc kubenswrapper[4823]: E1216 06:55:47.771370 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 06:55:47 crc kubenswrapper[4823]: I1216 06:55:47.781512 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:47 crc kubenswrapper[4823]: I1216 06:55:47.781636 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:47 crc kubenswrapper[4823]: I1216 06:55:47.781646 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:47 crc kubenswrapper[4823]: I1216 06:55:47.781662 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:47 crc kubenswrapper[4823]: I1216 06:55:47.781673 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:47Z","lastTransitionTime":"2025-12-16T06:55:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:55:47 crc kubenswrapper[4823]: I1216 06:55:47.884187 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:47 crc kubenswrapper[4823]: I1216 06:55:47.884220 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:47 crc kubenswrapper[4823]: I1216 06:55:47.884256 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:47 crc kubenswrapper[4823]: I1216 06:55:47.884272 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:47 crc kubenswrapper[4823]: I1216 06:55:47.884282 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:47Z","lastTransitionTime":"2025-12-16T06:55:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:55:48 crc kubenswrapper[4823]: I1216 06:55:48.020096 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:48 crc kubenswrapper[4823]: I1216 06:55:48.020137 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:48 crc kubenswrapper[4823]: I1216 06:55:48.020150 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:48 crc kubenswrapper[4823]: I1216 06:55:48.020166 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:48 crc kubenswrapper[4823]: I1216 06:55:48.020176 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:48Z","lastTransitionTime":"2025-12-16T06:55:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:55:48 crc kubenswrapper[4823]: I1216 06:55:48.025145 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" event={"ID":"08e48f89-7095-4ea2-afb5-759591c2b0d4","Type":"ContainerStarted","Data":"00935cc6d346797a92ef9e563af373278845a965676a245ecd98969758468383"} Dec 16 06:55:48 crc kubenswrapper[4823]: I1216 06:55:48.025995 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" Dec 16 06:55:48 crc kubenswrapper[4823]: I1216 06:55:48.029433 4823 generic.go:334] "Generic (PLEG): container finished" podID="a95a09e1-8457-4d5e-b1b6-dd6f59a66c6c" containerID="5e32bdec2a02013c087ccc13325d44061d85bcf9a352f2dc31ed506f11db6428" exitCode=0 Dec 16 06:55:48 crc kubenswrapper[4823]: I1216 06:55:48.029490 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-964hc" event={"ID":"a95a09e1-8457-4d5e-b1b6-dd6f59a66c6c","Type":"ContainerDied","Data":"5e32bdec2a02013c087ccc13325d44061d85bcf9a352f2dc31ed506f11db6428"} Dec 16 06:55:48 crc kubenswrapper[4823]: I1216 06:55:48.045190 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:48Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:48 crc kubenswrapper[4823]: I1216 06:55:48.052679 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" Dec 16 06:55:48 crc kubenswrapper[4823]: I1216 06:55:48.060520 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72dfe197d43de2945a2ee22789ab86205c77588c4f4002e2473cef09e7b4b10b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:48Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:48 crc kubenswrapper[4823]: I1216 06:55:48.074619 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hr8h5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d31d032-9142-4e26-9f06-e3a5ea73d530\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://502e1707b8f53950b8f10fa6b289c9d235d6596fb6eec7e4d69c3158823f8e2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sdtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hr8h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:48Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:48 crc kubenswrapper[4823]: I1216 06:55:48.089984 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n248g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b377757-dbc6-4d9c-9656-3ff65d7d113a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78e6c48a7009b10b70f11bfc96f18839781aeccd7
94f9ce7a4074271cfb9becc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubern
etes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2skkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n248g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:48Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:48 crc kubenswrapper[4823]: I1216 06:55:48.105273 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-964hc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a95a09e1-8457-4d5e-b1b6-dd6f59a66c6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b305ee45caf9aff77bb738777dd9b2ebd2e3e63727b460eae22f8f66d1e6cec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b305ee45caf9aff77bb738777dd9b2ebd2e3e63727b460eae22f8f66d1e6cec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d77b125c0d081148c1892229c933414ac8e1b8e0371f0932512deba1107e94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9d77b125c0d081148c1892229c933414ac8e1b8e0371f0932512deba1107e94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2048aefee6601e678e26821b3ed329aead747
91b34654ef5dde9fe261e37b835\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2048aefee6601e678e26821b3ed329aead74791b34654ef5dde9fe261e37b835\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c574f5f8d3f82a7a21c7f8e49e7927387be9e4029e28ba265d39ea058fb6a60f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c574f5f8d3f82a7a21c7f8e49e7927387be9e4029e28ba265d39ea058fb6a60f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-1
2-16T06:55:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e32bdec2a02013c087ccc13325d44061d85bcf9a352f2dc31ed506f11db6428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMou
nts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-964hc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:48Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:48 crc kubenswrapper[4823]: I1216 06:55:48.122220 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:48Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:48 crc kubenswrapper[4823]: I1216 06:55:48.127517 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:48 crc kubenswrapper[4823]: I1216 06:55:48.127552 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:48 crc kubenswrapper[4823]: I1216 06:55:48.127563 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:48 crc 
kubenswrapper[4823]: I1216 06:55:48.127581 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:48 crc kubenswrapper[4823]: I1216 06:55:48.127592 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:48Z","lastTransitionTime":"2025-12-16T06:55:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:55:48 crc kubenswrapper[4823]: I1216 06:55:48.136786 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63ba735c509770e9d43fe8152b2a42dc4123d0d797c7d41d5e3efe39a6cef74d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e09486327bad82b741ca9260e306f84ed9f0fbc7058bb06267042b8d2cd5f818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:48Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:48 crc kubenswrapper[4823]: I1216 
06:55:48.148035 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bwcng" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ff057ef-c324-4465-8b8d-c7b98c25b23c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea36a181f0874fbab3022e6ce27567b65eb8c01cdb2925b08d9c0782af7e93c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2bj4\\\",\\\"readOnly\\\":t
rue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bwcng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:48Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:48 crc kubenswrapper[4823]: I1216 06:55:48.166162 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08e48f89-7095-4ea2-afb5-759591c2b0d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3d0f51f7ac1df7a9ae61306ccbfe2c93b397b6ab9ae0a477ab4eb46248d36f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b768d92db57918a41c53b83940957c0181865808eb2c795231a5f8dbbf2bde82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f2cea674c0936cca4e9e95dfb7cd15bf9ccb3fbf8b711f351f87b632aad65c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://055585c1217ea720b929c9da82e7ea662d6fd5634244d362b3b425faee380d92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:42Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1c6cb023dcc6d1b149de201b108e3b1f3e32530dadc822aedede5552efc586b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://304f9e1fee6b648f61b3b334284a954e8107ff70ca71d01b586d137490eeab20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00935cc6d346797a92ef9e563af373278845a965676a245ecd98969758468383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c223f5207e2445fd4bfd1bfe2c1c1ca50b5211a92b02c21445534fbfa2709e28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e3566766c9c24f5c12481b74ed8a3938b21bdb36c003146515c67170d74a71f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e3566766c9c24f5c12481b74ed8a3938b21bdb36c003146515c67170d74a71f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zwjhk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:48Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:48 crc kubenswrapper[4823]: I1216 06:55:48.180191 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25dec47c-3043-486c-b371-2be103c214e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2bf5eb4e2f587f7084f731ca681a116313b1014b02cdb391b09fdcdd42600c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmpf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78aed5cdf8d3a29dfffcdabfd18b38c0cf76225853078e01602198bee4994536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmpf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fv56f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: 
Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:48Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:48 crc kubenswrapper[4823]: I1216 06:55:48.181517 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" Dec 16 06:55:48 crc kubenswrapper[4823]: I1216 06:55:48.199336 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3de3a2ad-dc6c-47ba-af8d-4f128e025aad\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c40d3d73f70c8983adc8d076d89864e7224528dae97252396a1c34bd4cb804e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\
"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50bb40e9d22674554f2a267df7c9f924a29131ac986877bf19572cc5992ba396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9c22b0787554b4bdf0b0068fc696b8515e2ee63affef23e5eb64f77bc32a624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static
-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a32894b8f22c5335ed585c26fcab727324803d768943f40455a592025cbfd0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1560fb0ba0c40a2797688b518c22c9164a8cbe9a265cb2ad95408ba86b0fb537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3d6894d867564eea62c7d78dd6ca62a9913787ba06b786722e478785ba796ee\\\",\
\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3d6894d867564eea62c7d78dd6ca62a9913787ba06b786722e478785ba796ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e551b627e90b56385fe5618d3e66ea7376a2d574589583c005373aa29db1d410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e551b627e90b56385fe5618d3e66ea7376a2d574589583c005373aa29db1d410\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://631458d415d64c3cd4dfd24771644393aa6c01288de8f7a5395cf8888db67d2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"et
cd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://631458d415d64c3cd4dfd24771644393aa6c01288de8f7a5395cf8888db67d2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:48Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:48 crc kubenswrapper[4823]: I1216 06:55:48.214294 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e388fa291257a537c122a1f3f0eb6a60eb74e4bcaaf32c4fe829be2da1bfdae9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:48Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:48 crc kubenswrapper[4823]: I1216 06:55:48.226644 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:48Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:48 crc kubenswrapper[4823]: I1216 06:55:48.235878 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:48 crc kubenswrapper[4823]: I1216 06:55:48.236224 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:48 crc kubenswrapper[4823]: I1216 06:55:48.236234 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:48 crc kubenswrapper[4823]: I1216 06:55:48.236249 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:48 crc kubenswrapper[4823]: I1216 06:55:48.236259 4823 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:48Z","lastTransitionTime":"2025-12-16T06:55:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:55:48 crc kubenswrapper[4823]: I1216 06:55:48.242595 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c915dd50-9820-494e-b47a-987257910a57\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c33a30aed9756ef012ff9046309ec07897de3b17707ea39c621534ed863c3adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://464ab3cf14920ae3f947ce3dccacd7379d50380699008411d0a51bd83fe3e51c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://a20f7e0e3c07f467643afdcc6f0e333353d2d10b6d0c0d7aff0d347d33a5d925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ddfb7d6dae98465369260c8e9c64a2aa21dbf226e7898f6de0c36b3806bcae8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c9e6ec333ac552dc972c31cce23104cc27bcc2cf11bdc52d89ab36172915a41\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"ed a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1853467785/tls.crt::/tmp/serving-cert-1853467785/tls.key\\\\\\\"\\\\nI1216 06:55:39.731946 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 06:55:39.736431 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 06:55:39.736560 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 06:55:39.736611 1 maxinflight.go:116] \\\\\\\"Set 
denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 06:55:39.736722 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 06:55:39.741676 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 06:55:39.741701 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 06:55:39.741707 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 06:55:39.741713 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 06:55:39.741717 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 06:55:39.741721 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 06:55:39.741725 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 06:55:39.741774 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 06:55:39.744309 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1853467785/tls.crt::/tmp/serving-cert-1853467785/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765868124\\\\\\\\\\\\\\\" (2025-12-16 06:55:23 +0000 UTC to 2026-01-15 06:55:24 +0000 UTC (now=2025-12-16 06:55:39.744275203 +0000 UTC))\\\\\\\"\\\\nF1216 06:55:39.744562 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c84e7f6950c8d433823b7d5ae3d8a9ecaa70ab6845ce607b8bb93efa990325a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84f11d2687959c8525b131966a998af137caa69a480bb31a94e1615a3ccbea5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84f11d2687959c8525b131966a998af137caa69a480bb31a94e1615a3ccbea5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:48Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:48 crc kubenswrapper[4823]: I1216 06:55:48.262124 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4c70365-dff4-4b29-af25-657fd9823db4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59ddc7310ed1600d050e488762436763220f3c2946a4c2020346da4ccef46b1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://275328a704b475b037c82f92055f9ca7f83b9821325603c5e34090140cc486ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7fa96323e345f8891f51500618a7b07d128d3e7256a3534d9dca75ff8ce9edf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d706c1463d2c913afe727db1a87a2841698d91a586c84d6b55f02865e7f5584a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:48Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:48 crc kubenswrapper[4823]: I1216 06:55:48.279909 4823 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08e48f89-7095-4ea2-afb5-759591c2b0d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3d0f51f7ac1df7a9ae61306ccbfe2c93b397b6ab9ae0a477ab4eb46248d36f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b768d92db57918a41c53b83940957c0181865808eb2c795231a5f8dbbf2bde82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f2cea674c0936cca4e9e95dfb7cd15bf9ccb3fbf8b711f351f87b632aad65c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://055585c1217ea720b929c9da82e7ea662d6fd5634244d362b3b425faee380d92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:42Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1c6cb023dcc6d1b149de201b108e3b1f3e32530dadc822aedede5552efc586b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://304f9e1fee6b648f61b3b334284a954e8107ff70ca71d01b586d137490eeab20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00935cc6d346797a92ef9e563af373278845a965676a245ecd98969758468383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c223f5207e2445fd4bfd1bfe2c1c1ca50b5211a92b02c21445534fbfa2709e28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e3566766c9c24f5c12481b74ed8a3938b21bdb36c003146515c67170d74a71f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e3566766c9c24f5c12481b74ed8a3938b21bdb36c003146515c67170d74a71f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zwjhk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:48Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:48 crc kubenswrapper[4823]: I1216 06:55:48.299200 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3de3a2ad-dc6c-47ba-af8d-4f128e025aad\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c40d3d73f70c8983adc8d076d89864e7224528dae97252396a1c34bd4cb804e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50bb40e9d22674554f2a267df7c9f924a29131ac986877bf19572cc5992ba396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9c22b0787554b4bdf0b0068fc696b8515e2ee63affef23e5eb64f77bc32a624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a32894b8f22c5335ed585c26fcab727324803d768943f40455a592025cbfd0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1560fb0ba0c40a2797688b518c22c9164a8cbe9a265cb2ad95408ba86b0fb537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.12
6.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3d6894d867564eea62c7d78dd6ca62a9913787ba06b786722e478785ba796ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3d6894d867564eea62c7d78dd6ca62a9913787ba06b786722e478785ba796ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e551b627e90b56385fe5618d3e66ea7376a2d574589583c005373aa29db1d410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e551b627e90b56385fe5618d3e66ea7376a2d574589583c005373aa29db1d410\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://631458d415d64c3cd4dfd24771644393aa6c01288de8f7a5395cf8888db67d2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://631458d415d64c3cd4dfd24771644393aa6c01288de8f7a5395cf8888db67d2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:48Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:48 crc kubenswrapper[4823]: I1216 06:55:48.314002 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e388fa291257a537c122a1f3f0eb6a60eb74e4bcaaf32c4fe829be2da1bfdae9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:48Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:48 crc kubenswrapper[4823]: I1216 06:55:48.327121 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:48Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:48 crc kubenswrapper[4823]: I1216 06:55:48.340827 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:48 crc kubenswrapper[4823]: I1216 06:55:48.340880 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:48 crc kubenswrapper[4823]: I1216 06:55:48.340895 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:48 crc kubenswrapper[4823]: I1216 06:55:48.340915 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:48 crc kubenswrapper[4823]: I1216 06:55:48.341333 4823 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:48Z","lastTransitionTime":"2025-12-16T06:55:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:55:48 crc kubenswrapper[4823]: I1216 06:55:48.341594 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25dec47c-3043-486c-b371-2be103c214e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2bf5eb4e2f587f7084f731ca681a116313b1014b02cdb391b09fdcdd42600c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmpf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78aed5cdf8d3a29dfffcdabfd18b38c0cf76225853078e01602198bee4994536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmpf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fv56f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-12-16T06:55:48Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:48 crc kubenswrapper[4823]: I1216 06:55:48.356910 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c915dd50-9820-494e-b47a-987257910a57\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c33a30aed9756ef012ff9046309ec07897de3b17707ea39c621534ed863c3adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://464ab3cf14920ae3f947ce3dccacd7379d50380699008411d0a51bd83fe3e51c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://a20f7e0e3c07f467643afdcc6f0e333353d2d10b6d0c0d7aff0d347d33a5d925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ddfb7d6dae98465369260c8e9c64a2aa21dbf226e7898f6de0c36b3806bcae8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c9e6ec333ac552dc972c31cce23104cc27bcc2cf11bdc52d89ab36172915a41\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"ed a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1853467785/tls.crt::/tmp/serving-cert-1853467785/tls.key\\\\\\\"\\\\nI1216 06:55:39.731946 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 06:55:39.736431 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 06:55:39.736560 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 06:55:39.736611 1 maxinflight.go:116] \\\\\\\"Set 
denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 06:55:39.736722 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 06:55:39.741676 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 06:55:39.741701 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 06:55:39.741707 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 06:55:39.741713 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 06:55:39.741717 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 06:55:39.741721 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 06:55:39.741725 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 06:55:39.741774 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 06:55:39.744309 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1853467785/tls.crt::/tmp/serving-cert-1853467785/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765868124\\\\\\\\\\\\\\\" (2025-12-16 06:55:23 +0000 UTC to 2026-01-15 06:55:24 +0000 UTC (now=2025-12-16 06:55:39.744275203 +0000 UTC))\\\\\\\"\\\\nF1216 06:55:39.744562 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c84e7f6950c8d433823b7d5ae3d8a9ecaa70ab6845ce607b8bb93efa990325a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84f11d2687959c8525b131966a998af137caa69a480bb31a94e1615a3ccbea5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84f11d2687959c8525b131966a998af137caa69a480bb31a94e1615a3ccbea5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:48Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:48 crc kubenswrapper[4823]: I1216 06:55:48.370765 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4c70365-dff4-4b29-af25-657fd9823db4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59ddc7310ed1600d050e488762436763220f3c2946a4c2020346da4ccef46b1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://275328a704b475b037c82f92055f9ca7f83b9821325603c5e34090140cc486ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7fa96323e345f8891f51500618a7b07d128d3e7256a3534d9dca75ff8ce9edf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d706c1463d2c913afe727db1a87a2841698d91a586c84d6b55f02865e7f5584a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:48Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:48 crc kubenswrapper[4823]: I1216 06:55:48.385447 4823 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-additional-cni-plugins-964hc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a95a09e1-8457-4d5e-b1b6-dd6f59a66c6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b305ee45caf9aff77bb738777dd9b2ebd2e3e63727b460eae22f8f66d1e6cec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b305ee45caf9aff77bb738777dd9b2ebd2e3e63727b460eae22f8f66d1e6cec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d77b125c0d081148c1892229c933414ac8e1b8e0371f0932512deba1107e94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9d77b125c0d081148c1892229c933414ac8e1b8e0371f0932512deba1107e94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2048aefee6601e678e26821b3ed329aead74791b34654ef5dde9fe261e37b835\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2048aefee6601e678e26821b3ed329aead74791b34654ef5dde9fe261e37b835\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c574f5f8d3f82a7a21c7f8e49e7927387be9e4029e28ba265d39ea058fb6a60f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c574f5f8d3f82a7a21c7f8e49e7927387be9e4029e28ba265d39ea058fb6a60f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e32bdec2a02013c087ccc13325d44061d85bcf9a352f2dc31ed506f11db6428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e32bdec2a02013c087ccc13325d44061d85bcf9a352f2dc31ed506f11db6428\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-964hc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:48Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:48 crc kubenswrapper[4823]: I1216 06:55:48.396781 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:48Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:48 crc kubenswrapper[4823]: I1216 06:55:48.406958 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63ba735c509770e9d43fe8152b2a42dc4123d0d797c7d41d5e3efe39a6cef74d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e09486327bad82b741ca9260e306f84ed9f0fbc7058bb06267042b8d2cd5f818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:48Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:48 crc kubenswrapper[4823]: I1216 06:55:48.418732 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:48Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:48 crc kubenswrapper[4823]: I1216 06:55:48.429205 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72dfe197d43de2945a2ee22789ab86205c77588c4f4002e2473cef09e7b4b10b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-16T06:55:48Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:48 crc kubenswrapper[4823]: I1216 06:55:48.439191 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hr8h5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d31d032-9142-4e26-9f06-e3a5ea73d530\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://502e1707b8f53950b8f10fa6b289c9d235d6596fb6eec7e4d69c3158823f8e2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-7sdtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hr8h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:48Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:48 crc kubenswrapper[4823]: I1216 06:55:48.443355 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:48 crc kubenswrapper[4823]: I1216 06:55:48.443383 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:48 crc kubenswrapper[4823]: I1216 06:55:48.443393 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:48 crc kubenswrapper[4823]: I1216 06:55:48.443408 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:48 crc kubenswrapper[4823]: I1216 06:55:48.443438 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:48Z","lastTransitionTime":"2025-12-16T06:55:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:55:48 crc kubenswrapper[4823]: I1216 06:55:48.449764 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n248g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b377757-dbc6-4d9c-9656-3ff65d7d113a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78e6c48a7009b10b70f11bfc96f18839781aeccd794f9ce7a4074271cfb9becc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2skkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n248g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:48Z 
is after 2025-08-24T17:21:41Z" Dec 16 06:55:48 crc kubenswrapper[4823]: I1216 06:55:48.458248 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bwcng" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ff057ef-c324-4465-8b8d-c7b98c25b23c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea36a181f0874fbab3022e6ce27567b65eb8c01cdb2925b08d9c0782af7e93c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2bj4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bwcng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:48Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:48 crc kubenswrapper[4823]: I1216 06:55:48.545404 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:48 crc kubenswrapper[4823]: I1216 06:55:48.545446 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:48 crc kubenswrapper[4823]: I1216 06:55:48.545460 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:48 crc kubenswrapper[4823]: I1216 06:55:48.545480 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:48 crc kubenswrapper[4823]: I1216 06:55:48.545498 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:48Z","lastTransitionTime":"2025-12-16T06:55:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:55:48 crc kubenswrapper[4823]: I1216 06:55:48.648234 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:48 crc kubenswrapper[4823]: I1216 06:55:48.648269 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:48 crc kubenswrapper[4823]: I1216 06:55:48.648277 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:48 crc kubenswrapper[4823]: I1216 06:55:48.648291 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:48 crc kubenswrapper[4823]: I1216 06:55:48.648299 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:48Z","lastTransitionTime":"2025-12-16T06:55:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:55:48 crc kubenswrapper[4823]: I1216 06:55:48.752368 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:48 crc kubenswrapper[4823]: I1216 06:55:48.752412 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:48 crc kubenswrapper[4823]: I1216 06:55:48.752424 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:48 crc kubenswrapper[4823]: I1216 06:55:48.752442 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:48 crc kubenswrapper[4823]: I1216 06:55:48.752454 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:48Z","lastTransitionTime":"2025-12-16T06:55:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:55:48 crc kubenswrapper[4823]: I1216 06:55:48.855263 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:48 crc kubenswrapper[4823]: I1216 06:55:48.855305 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:48 crc kubenswrapper[4823]: I1216 06:55:48.855316 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:48 crc kubenswrapper[4823]: I1216 06:55:48.855331 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:48 crc kubenswrapper[4823]: I1216 06:55:48.855342 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:48Z","lastTransitionTime":"2025-12-16T06:55:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:55:48 crc kubenswrapper[4823]: I1216 06:55:48.958005 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:48 crc kubenswrapper[4823]: I1216 06:55:48.958054 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:48 crc kubenswrapper[4823]: I1216 06:55:48.958067 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:48 crc kubenswrapper[4823]: I1216 06:55:48.958084 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:48 crc kubenswrapper[4823]: I1216 06:55:48.958095 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:48Z","lastTransitionTime":"2025-12-16T06:55:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:55:49 crc kubenswrapper[4823]: I1216 06:55:49.034138 4823 generic.go:334] "Generic (PLEG): container finished" podID="a95a09e1-8457-4d5e-b1b6-dd6f59a66c6c" containerID="91427f2f713f7e1c2ba16e87252e2fd0479a31e6f94884ad85fe9f99ab84bedd" exitCode=0 Dec 16 06:55:49 crc kubenswrapper[4823]: I1216 06:55:49.034879 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-964hc" event={"ID":"a95a09e1-8457-4d5e-b1b6-dd6f59a66c6c","Type":"ContainerDied","Data":"91427f2f713f7e1c2ba16e87252e2fd0479a31e6f94884ad85fe9f99ab84bedd"} Dec 16 06:55:49 crc kubenswrapper[4823]: I1216 06:55:49.035291 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" Dec 16 06:55:49 crc kubenswrapper[4823]: I1216 06:55:49.059673 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08e48f89-7095-4ea2-afb5-759591c2b0d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3d0f51f7ac1df7a9ae61306ccbfe2c93b397b6ab9ae0a477ab4eb46248d36f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b768d92db57918a41c53b83940957c0181865808eb2c795231a5f8dbbf2bde82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\"
:{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f2cea674c0936cca4e9e95dfb7cd15bf9ccb3fbf8b711f351f87b632aad65c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://055585c1217ea720b929c9da82e7ea662d6fd5634244d362b3b425faee380d92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d209
9482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1c6cb023dcc6d1b149de201b108e3b1f3e32530dadc822aedede5552efc586b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://304f9e1fee6b648f61b3b334284a954e8107ff70ca71d01b586d137490eeab20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174
f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00935cc6d346797a92ef9e563af373278845a965676a245ecd98969758468383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\
\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c223f5207e2445fd4bfd1bfe2c1c1ca50b5211a92b02c21445534fbfa2709e28\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e3566766c9c24f5c12481b74ed8a3938b21bdb36c003146515c67170d74a71f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e3566766c9c24f5c12481b74ed8a3938b21bdb36c003146515c67170d74a71f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\
\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zwjhk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:49Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:49 crc kubenswrapper[4823]: I1216 06:55:49.060405 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:49 crc kubenswrapper[4823]: I1216 06:55:49.060482 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:49 crc kubenswrapper[4823]: I1216 06:55:49.060491 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:49 crc kubenswrapper[4823]: I1216 06:55:49.060508 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:49 crc kubenswrapper[4823]: I1216 06:55:49.060518 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:49Z","lastTransitionTime":"2025-12-16T06:55:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:55:49 crc kubenswrapper[4823]: I1216 06:55:49.084393 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3de3a2ad-dc6c-47ba-af8d-4f128e025aad\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c40d3d73f70c8983adc8d076d89864e7224528dae97252396a1c34bd4cb804e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50bb40e9d22674554f2a267df7c9f924a29131ac986877bf19572cc5992ba396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9c22b0787554b4bdf0b0068fc696b8515e2ee63affef23e5eb64f77bc32a624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a32894b8f22c5335ed585c26fcab727324803d768943f40455a592025cbfd0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1560fb0ba0c40a2797688b518c22c9164a8cbe9a265cb2ad95408ba86b0fb537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3d6894d867564eea62c7d78dd6ca62a9913787ba06b786722e478785ba796ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3d6894d867564eea62c7d78dd6ca62a9913787ba06b786722e478785ba796ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e551b627e90b56385fe5618d3e66ea7376a2d574589583c005373aa29db1d410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e551b627e90b56385fe5618d3e66ea7376a2d574589583c005373aa29db1d410\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://631458d415d64c3cd4dfd24771644393aa6c01288de8f7a5395cf8888db67d2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://631458d415d64c3cd4dfd24771644393aa6c01288de8f7a5395cf8888db67d2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:49Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:49 crc kubenswrapper[4823]: I1216 06:55:49.096801 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e388fa291257a537c122a1f3f0eb6a60eb74e4bcaaf32c4fe829be2da1bfdae9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:49Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:49 crc kubenswrapper[4823]: I1216 06:55:49.115108 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:49Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:49 crc kubenswrapper[4823]: I1216 06:55:49.125375 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" Dec 16 06:55:49 crc kubenswrapper[4823]: I1216 06:55:49.129932 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"25dec47c-3043-486c-b371-2be103c214e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2bf5eb4e2f587f7084f731ca681a116313b1014b02cdb391b09fdcdd42600c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmpf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78aed5cdf8d3a29dfffcdabfd18b38c0cf762258
53078e01602198bee4994536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmpf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fv56f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:49Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:49 crc kubenswrapper[4823]: I1216 06:55:49.145695 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c915dd50-9820-494e-b47a-987257910a57\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c33a30aed9756ef012ff9046309ec07897de3b17707ea39c621534ed863c3adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://464ab3cf14920ae3f947ce3dccacd7379d50380699008411d0a51bd83fe3e51c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20f7e0e3c07f467643afdcc6f0e333353d2d10b6d0c0d7aff0d347d33a5d925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ddfb7d6dae98465369260c8e9c64a2aa21dbf226e7898f6de0c36b3806bcae8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c9e6ec333ac552dc972c31cce23104cc27bcc2cf11bdc52d89ab36172915a41\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"ed a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1853467785/tls.crt::/tmp/serving-cert-1853467785/tls.key\\\\\\\"\\\\nI1216 06:55:39.731946 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 06:55:39.736431 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 06:55:39.736560 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 06:55:39.736611 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 06:55:39.736722 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 06:55:39.741676 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 06:55:39.741701 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 06:55:39.741707 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 06:55:39.741713 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 06:55:39.741717 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 06:55:39.741721 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 06:55:39.741725 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 06:55:39.741774 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 06:55:39.744309 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" 
certName=\\\\\\\"serving-cert::/tmp/serving-cert-1853467785/tls.crt::/tmp/serving-cert-1853467785/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765868124\\\\\\\\\\\\\\\" (2025-12-16 06:55:23 +0000 UTC to 2026-01-15 06:55:24 +0000 UTC (now=2025-12-16 06:55:39.744275203 +0000 UTC))\\\\\\\"\\\\nF1216 06:55:39.744562 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c84e7f6950c8d433823b7d5ae3d8a9ecaa70ab6845ce607b8bb93efa990325a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84f11d2687959c8525b131966a998af137caa69a480bb31a94e1615a3ccbea5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84f11d2687959c8525b131966a998af137caa69a480bb31a94e1615a3ccbea5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:49Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:49 crc kubenswrapper[4823]: I1216 06:55:49.164206 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4c70365-dff4-4b29-af25-657fd9823db4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59ddc7310ed1600d050e488762436763220f3c2946a4c2020346da4ccef46b1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://275328a704b475b037c82f92055f9ca7f83b9821325603c5e34090140cc486ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7fa96323e345f8891f51500618a7b07d128d3e7256a3534d9dca75ff8ce9edf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d706c1463d2c913afe727db1a87a2841698d91a586c84d6b55f02865e7f5584a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:49Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:49 crc kubenswrapper[4823]: I1216 06:55:49.165080 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:49 crc kubenswrapper[4823]: I1216 06:55:49.165110 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:49 crc kubenswrapper[4823]: I1216 06:55:49.165122 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:49 crc kubenswrapper[4823]: I1216 06:55:49.165139 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:49 crc kubenswrapper[4823]: I1216 06:55:49.165153 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:49Z","lastTransitionTime":"2025-12-16T06:55:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:55:49 crc kubenswrapper[4823]: I1216 06:55:49.177352 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72dfe197d43de2945a2ee22789ab86205c77588c4f4002e2473cef09e7b4b10b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disable
d\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:49Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:49 crc kubenswrapper[4823]: I1216 06:55:49.188620 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hr8h5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d31d032-9142-4e26-9f06-e3a5ea73d530\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://502e1707b8f53950b8f10fa6b289c9d235d6596fb6eec7e4d69c3158823f8e2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sdtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hr8h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:49Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:49 crc kubenswrapper[4823]: I1216 06:55:49.201883 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n248g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b377757-dbc6-4d9c-9656-3ff65d7d113a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78e6c48a7009b10b70f11bfc96f18839781aeccd794f9ce7a4074271cfb9becc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2skkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n248g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:49Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:49 crc kubenswrapper[4823]: I1216 06:55:49.223361 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-964hc" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a95a09e1-8457-4d5e-b1b6-dd6f59a66c6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b305ee45caf9aff77bb738777dd9b2ebd2e3e63727b460eae22f8f66d1e6cec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b305ee45caf9aff77bb738777dd9b2ebd2e3e63727b460eae22f8f66d1e6cec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d77b125c0d081148c1892229c933414ac8e1b8e0371f0932512deba1107e94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9d77b125c0d081148c1892229c933414ac8e1b8e0371f0932512deba1107e94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2048aefee6601e678e26821b3ed329aead74791b34654ef5dde9fe261e37b835\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2048aefee6601e678e26821b3ed329aead74791b34654ef5dde9fe261e37b835\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c574f5f8d3f82a7a21c7f8e49e7927387be9e4029e28ba265d39ea058fb6a60f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c574f5f8d3f82a7a21c7f8e49e7927387be9e4029e28ba265d39ea058fb6a60f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e32bdec2a02013c087ccc13325d44061d85bcf9a352f2dc31ed506f11db6428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e32bdec2a02013c087ccc13325d44061d85bcf9a352f2dc31ed506f11db6428\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91427f2f713f7e1c2ba16e87252e2fd0479a31e6f94884ad85fe9f99ab84bedd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91427f2f713f7e1c2ba16e87252e2fd0479a31e6f94884ad85fe9f99ab84bedd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-964hc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:49Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:49 crc kubenswrapper[4823]: I1216 06:55:49.237631 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:49Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:49 crc kubenswrapper[4823]: I1216 06:55:49.251622 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63ba735c509770e9d43fe8152b2a42dc4123d0d797c7d41d5e3efe39a6cef74d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e09486327bad82b741ca9260e306f84ed9f0fbc7058bb06267042b8d2cd5f818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:49Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:49 crc kubenswrapper[4823]: I1216 06:55:49.263225 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:49Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:49 crc kubenswrapper[4823]: I1216 06:55:49.267876 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:49 crc kubenswrapper[4823]: I1216 06:55:49.267920 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:49 crc 
kubenswrapper[4823]: I1216 06:55:49.267930 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:49 crc kubenswrapper[4823]: I1216 06:55:49.267946 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:49 crc kubenswrapper[4823]: I1216 06:55:49.267956 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:49Z","lastTransitionTime":"2025-12-16T06:55:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:55:49 crc kubenswrapper[4823]: I1216 06:55:49.272380 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bwcng" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ff057ef-c324-4465-8b8d-c7b98c25b23c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea36a181f0874fbab3022e6ce27567b65eb8c01cdb2925b08d9c0782af7e93c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2bj4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bwcng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:49Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:49 crc kubenswrapper[4823]: I1216 06:55:49.284417 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e388fa291257a537c122a1f3f0eb6a60eb74e4bcaaf32c4fe829be2da1bfdae9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:49Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:49 crc kubenswrapper[4823]: I1216 06:55:49.296227 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:49Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:49 crc kubenswrapper[4823]: I1216 06:55:49.306071 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"25dec47c-3043-486c-b371-2be103c214e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2bf5eb4e2f587f7084f731ca681a116313b1014b02cdb391b09fdcdd42600c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmpf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78aed5cdf8d3a29dfffcdabfd18b38c0cf762258
53078e01602198bee4994536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmpf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fv56f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:49Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:49 crc kubenswrapper[4823]: I1216 06:55:49.327388 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3de3a2ad-dc6c-47ba-af8d-4f128e025aad\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c40d3d73f70c8983adc8d076d89864e7224528dae97252396a1c34bd4cb804e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50bb40e9d22674554f2a267df7c9f924a29131ac986877bf19572cc5992ba396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9c22b0787554b4bdf0b0068fc696b8515e2ee63affef23e5eb64f77bc32a624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a32894b8f22c5335ed585c26fcab727324803d768943f40455a592025cbfd0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1560fb0ba0c40a2797688b518c22c9164a8cbe9a265cb2ad95408ba86b0fb537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3d6894d867564eea62c7d78dd6ca62a9913787ba06b786722e478785ba796ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3d6894d867564eea62c7d78dd6ca62a9913787ba06b786722e478785ba796ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-16T06:55:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e551b627e90b56385fe5618d3e66ea7376a2d574589583c005373aa29db1d410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e551b627e90b56385fe5618d3e66ea7376a2d574589583c005373aa29db1d410\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://631458d415d64c3cd4dfd24771644393aa6c01288de8f7a5395cf8888db67d2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://631458d415d64c3cd4dfd24771644393aa6c01288de8f7a5395cf8888db67d2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:49Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:49 crc kubenswrapper[4823]: I1216 06:55:49.338234 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4c70365-dff4-4b29-af25-657fd9823db4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59ddc7310ed1600d050e488762436763220f3c2946a4c2020346da4ccef46b1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://275328a704b475b037c82f92055f9ca7f83b9821325603c5e34090140cc486ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7fa96323e345f8891f51500618a7b07d128d3e7256a3534d9dca75ff8ce9edf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:
23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d706c1463d2c913afe727db1a87a2841698d91a586c84d6b55f02865e7f5584a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:49Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:49 crc kubenswrapper[4823]: I1216 06:55:49.350558 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c915dd50-9820-494e-b47a-987257910a57\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c33a30aed9756ef012ff9046309ec07897de3b17707ea39c621534ed863c3adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://464ab3cf14920ae3f947ce3dccacd7379d50380699008411d0a51bd83fe3e51c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20f7e0e3c07f467643afdcc6f0e333353d2d10b6d0c0d7aff0d347d33a5d925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ddfb7d6dae98465369260c8e9c64a2aa21dbf226e7898f6de0c36b3806bcae8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c9e6ec333ac552dc972c31cce23104cc27bcc2cf11bdc52d89ab36172915a41\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"ed a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1853467785/tls.crt::/tmp/serving-cert-1853467785/tls.key\\\\\\\"\\\\nI1216 06:55:39.731946 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 06:55:39.736431 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 06:55:39.736560 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 06:55:39.736611 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 06:55:39.736722 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 06:55:39.741676 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 06:55:39.741701 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 06:55:39.741707 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 06:55:39.741713 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 06:55:39.741717 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 06:55:39.741721 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 06:55:39.741725 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 06:55:39.741774 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 06:55:39.744309 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" 
certName=\\\\\\\"serving-cert::/tmp/serving-cert-1853467785/tls.crt::/tmp/serving-cert-1853467785/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765868124\\\\\\\\\\\\\\\" (2025-12-16 06:55:23 +0000 UTC to 2026-01-15 06:55:24 +0000 UTC (now=2025-12-16 06:55:39.744275203 +0000 UTC))\\\\\\\"\\\\nF1216 06:55:39.744562 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c84e7f6950c8d433823b7d5ae3d8a9ecaa70ab6845ce607b8bb93efa990325a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84f11d2687959c8525b131966a998af137caa69a480bb31a94e1615a3ccbea5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84f11d2687959c8525b131966a998af137caa69a480bb31a94e1615a3ccbea5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:49Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:49 crc kubenswrapper[4823]: I1216 06:55:49.364781 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:49Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:49 crc kubenswrapper[4823]: I1216 06:55:49.370652 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:49 crc kubenswrapper[4823]: I1216 06:55:49.370689 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 16 06:55:49 crc kubenswrapper[4823]: I1216 06:55:49.370701 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:49 crc kubenswrapper[4823]: I1216 06:55:49.370716 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:49 crc kubenswrapper[4823]: I1216 06:55:49.370726 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:49Z","lastTransitionTime":"2025-12-16T06:55:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:55:49 crc kubenswrapper[4823]: I1216 06:55:49.379211 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63ba735c509770e9d43fe8152b2a42dc4123d0d797c7d41d5e3efe39a6cef74d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e09486327bad82b741ca9260e306f84ed9f0fbc7058bb06267042b8d2cd5f818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:49Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:49 crc kubenswrapper[4823]: I1216 06:55:49.391491 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:49Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:49 crc kubenswrapper[4823]: I1216 06:55:49.402684 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72dfe197d43de2945a2ee22789ab86205c77588c4f4002e2473cef09e7b4b10b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-16T06:55:49Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:49 crc kubenswrapper[4823]: I1216 06:55:49.412751 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hr8h5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d31d032-9142-4e26-9f06-e3a5ea73d530\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://502e1707b8f53950b8f10fa6b289c9d235d6596fb6eec7e4d69c3158823f8e2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-7sdtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hr8h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:49Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:49 crc kubenswrapper[4823]: I1216 06:55:49.426557 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n248g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b377757-dbc6-4d9c-9656-3ff65d7d113a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78e6c48a7009b10b70f11bfc96f18839781aeccd794f9ce7a4074271cfb9bec
c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-2skkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n248g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:49Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:49 crc kubenswrapper[4823]: I1216 06:55:49.441291 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-964hc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a95a09e1-8457-4d5e-b1b6-dd6f59a66c6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b305ee45caf9aff77bb738777dd9b2ebd2e3e63727b460eae22f8f66d1e6cec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b305ee45caf9aff77bb738777dd9b2ebd2e3e63727b460eae22f8f66d1e6cec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d77b125c0d081148c1892229c933414ac8e1b8e0371f0932512deba1107e94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9d77b125c0d081148c1892229c933414ac8e1b8e0371f0932512deba1107e94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2048aefee6601e678e26821b3ed329aead74791b34654ef5dde9fe261e37b835\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaea
d203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2048aefee6601e678e26821b3ed329aead74791b34654ef5dde9fe261e37b835\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c574f5f8d3f82a7a21c7f8e49e7927387be9e4029e28ba265d39ea058fb6a60f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c574f5f8d3f82a7a21c7f8e49e7927387be9e4029e28ba265d39ea058fb6a60f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"o
s-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e32bdec2a02013c087ccc13325d44061d85bcf9a352f2dc31ed506f11db6428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e32bdec2a02013c087ccc13325d44061d85bcf9a352f2dc31ed506f11db6428\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91427f2f713f7e1c2ba16e87252e2fd0479a31e6f94884ad85fe9f99ab84bedd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabo
uts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91427f2f713f7e1c2ba16e87252e2fd0479a31e6f94884ad85fe9f99ab84bedd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-964hc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:49Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:49 crc kubenswrapper[4823]: I1216 06:55:49.452535 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bwcng" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ff057ef-c324-4465-8b8d-c7b98c25b23c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea36a181f0874fbab3022e6ce27567b65eb8c01cdb2925b08d9c0782af7e93c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2bj4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bwcng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:49Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:49 crc kubenswrapper[4823]: I1216 06:55:49.472892 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:49 crc kubenswrapper[4823]: I1216 06:55:49.472951 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:49 crc kubenswrapper[4823]: I1216 06:55:49.472966 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:49 crc kubenswrapper[4823]: I1216 06:55:49.472990 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:49 crc kubenswrapper[4823]: I1216 06:55:49.473009 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:49Z","lastTransitionTime":"2025-12-16T06:55:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:55:49 crc kubenswrapper[4823]: I1216 06:55:49.475422 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08e48f89-7095-4ea2-afb5-759591c2b0d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3d0f51f7ac1df7a9ae61306ccbfe2c93b397b6ab9ae0a477ab4eb46248d36f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b768d92db57918a41c53b83940957c0181865808eb2c795231a5f8dbbf2bde82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f2cea674c0936cca4e9e95dfb7cd15bf9ccb3fbf8b711f351f87b632aad65c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://055585c1217ea720b929c9da82e7ea662d6fd5634244d362b3b425faee380d92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:42Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1c6cb023dcc6d1b149de201b108e3b1f3e32530dadc822aedede5552efc586b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://304f9e1fee6b648f61b3b334284a954e8107ff70ca71d01b586d137490eeab20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00935cc6d346797a92ef9e563af373278845a965676a245ecd98969758468383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c223f5207e2445fd4bfd1bfe2c1c1ca50b5211a92b02c21445534fbfa2709e28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e3566766c9c24f5c12481b74ed8a3938b21bdb36c003146515c67170d74a71f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e3566766c9c24f5c12481b74ed8a3938b21bdb36c003146515c67170d74a71f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zwjhk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:49Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:49 crc kubenswrapper[4823]: I1216 06:55:49.576148 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:49 crc kubenswrapper[4823]: I1216 06:55:49.576195 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:49 crc kubenswrapper[4823]: I1216 06:55:49.576205 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:49 crc kubenswrapper[4823]: I1216 06:55:49.576221 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:49 crc kubenswrapper[4823]: I1216 06:55:49.576244 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:49Z","lastTransitionTime":"2025-12-16T06:55:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:55:49 crc kubenswrapper[4823]: I1216 06:55:49.679115 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:49 crc kubenswrapper[4823]: I1216 06:55:49.679163 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:49 crc kubenswrapper[4823]: I1216 06:55:49.679175 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:49 crc kubenswrapper[4823]: I1216 06:55:49.679193 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:49 crc kubenswrapper[4823]: I1216 06:55:49.679201 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:49Z","lastTransitionTime":"2025-12-16T06:55:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:55:49 crc kubenswrapper[4823]: I1216 06:55:49.771289 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:55:49 crc kubenswrapper[4823]: I1216 06:55:49.771358 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:55:49 crc kubenswrapper[4823]: I1216 06:55:49.771427 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:55:49 crc kubenswrapper[4823]: E1216 06:55:49.771436 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 06:55:49 crc kubenswrapper[4823]: E1216 06:55:49.771596 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 06:55:49 crc kubenswrapper[4823]: E1216 06:55:49.771762 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 06:55:49 crc kubenswrapper[4823]: I1216 06:55:49.782404 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:49 crc kubenswrapper[4823]: I1216 06:55:49.782438 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:49 crc kubenswrapper[4823]: I1216 06:55:49.782448 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:49 crc kubenswrapper[4823]: I1216 06:55:49.782466 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:49 crc kubenswrapper[4823]: I1216 06:55:49.782477 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:49Z","lastTransitionTime":"2025-12-16T06:55:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:55:49 crc kubenswrapper[4823]: I1216 06:55:49.884679 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:49 crc kubenswrapper[4823]: I1216 06:55:49.884726 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:49 crc kubenswrapper[4823]: I1216 06:55:49.884734 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:49 crc kubenswrapper[4823]: I1216 06:55:49.884753 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:49 crc kubenswrapper[4823]: I1216 06:55:49.884762 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:49Z","lastTransitionTime":"2025-12-16T06:55:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:55:49 crc kubenswrapper[4823]: I1216 06:55:49.992359 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:49 crc kubenswrapper[4823]: I1216 06:55:49.992409 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:49 crc kubenswrapper[4823]: I1216 06:55:49.992419 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:49 crc kubenswrapper[4823]: I1216 06:55:49.992435 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:49 crc kubenswrapper[4823]: I1216 06:55:49.992444 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:49Z","lastTransitionTime":"2025-12-16T06:55:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:55:50 crc kubenswrapper[4823]: I1216 06:55:50.041932 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-964hc" event={"ID":"a95a09e1-8457-4d5e-b1b6-dd6f59a66c6c","Type":"ContainerStarted","Data":"0bb1ede33a8f71c9116cfb9401b1f78a6aca07d580e4abe93370b470d6b20284"} Dec 16 06:55:50 crc kubenswrapper[4823]: I1216 06:55:50.063852 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08e48f89-7095-4ea2-afb5-759591c2b0d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3d0f51f7ac1df7a9ae61306ccbfe2c93b397b6ab9ae0a477ab4eb46248d36f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b768d92db57918a41c53b83940957c0181865808eb2c795231a5f8dbbf2bde82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f2cea674c0936cca4e9e95dfb7cd15bf9ccb3fbf8b711f351f87b632aad65c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://055585c1217ea720b929c9da82e7ea662d6fd5634244d362b3b425faee380d92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:42Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1c6cb023dcc6d1b149de201b108e3b1f3e32530dadc822aedede5552efc586b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://304f9e1fee6b648f61b3b334284a954e8107ff70ca71d01b586d137490eeab20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00935cc6d346797a92ef9e563af373278845a965676a245ecd98969758468383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c223f5207e2445fd4bfd1bfe2c1c1ca50b5211a92b02c21445534fbfa2709e28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e3566766c9c24f5c12481b74ed8a3938b21bdb36c003146515c67170d74a71f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e3566766c9c24f5c12481b74ed8a3938b21bdb36c003146515c67170d74a71f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zwjhk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:50Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:50 crc kubenswrapper[4823]: I1216 06:55:50.092909 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3de3a2ad-dc6c-47ba-af8d-4f128e025aad\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c40d3d73f70c8983adc8d076d89864e7224528dae97252396a1c34bd4cb804e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50bb40e9d22674554f2a267df7c9f924a29131ac986877bf19572cc5992ba396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9c22b0787554b4bdf0b0068fc696b8515e2ee63affef23e5eb64f77bc32a624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a32894b8f22c5335ed585c26fcab727324803d768943f40455a592025cbfd0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1560fb0ba0c40a2797688b518c22c9164a8cbe9a265cb2ad95408ba86b0fb537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.12
6.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3d6894d867564eea62c7d78dd6ca62a9913787ba06b786722e478785ba796ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3d6894d867564eea62c7d78dd6ca62a9913787ba06b786722e478785ba796ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e551b627e90b56385fe5618d3e66ea7376a2d574589583c005373aa29db1d410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e551b627e90b56385fe5618d3e66ea7376a2d574589583c005373aa29db1d410\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://631458d415d64c3cd4dfd24771644393aa6c01288de8f7a5395cf8888db67d2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://631458d415d64c3cd4dfd24771644393aa6c01288de8f7a5395cf8888db67d2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:50Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:50 crc kubenswrapper[4823]: I1216 06:55:50.095819 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:50 crc kubenswrapper[4823]: I1216 06:55:50.095862 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:50 crc kubenswrapper[4823]: I1216 06:55:50.095893 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:50 crc kubenswrapper[4823]: I1216 06:55:50.095914 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Dec 16 06:55:50 crc kubenswrapper[4823]: I1216 06:55:50.095926 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:50Z","lastTransitionTime":"2025-12-16T06:55:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:55:50 crc kubenswrapper[4823]: I1216 06:55:50.108297 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e388fa291257a537c122a1f3f0eb6a60eb74e4bcaaf32c4fe829be2da1bfdae9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T
06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:50Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:50 crc kubenswrapper[4823]: I1216 06:55:50.120413 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:50Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:50 crc kubenswrapper[4823]: I1216 06:55:50.140170 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"25dec47c-3043-486c-b371-2be103c214e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2bf5eb4e2f587f7084f731ca681a116313b1014b02cdb391b09fdcdd42600c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmpf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78aed5cdf8d3a29dfffcdabfd18b38c0cf762258
53078e01602198bee4994536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmpf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fv56f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:50Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:50 crc kubenswrapper[4823]: I1216 06:55:50.157974 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c915dd50-9820-494e-b47a-987257910a57\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c33a30aed9756ef012ff9046309ec07897de3b17707ea39c621534ed863c3adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://464ab3cf14920ae3f947ce3dccacd7379d50380699008411d0a51bd83fe3e51c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20f7e0e3c07f467643afdcc6f0e333353d2d10b6d0c0d7aff0d347d33a5d925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ddfb7d6dae98465369260c8e9c64a2aa21dbf226e7898f6de0c36b3806bcae8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c9e6ec333ac552dc972c31cce23104cc27bcc2cf11bdc52d89ab36172915a41\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"ed a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1853467785/tls.crt::/tmp/serving-cert-1853467785/tls.key\\\\\\\"\\\\nI1216 06:55:39.731946 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 06:55:39.736431 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 06:55:39.736560 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 06:55:39.736611 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 06:55:39.736722 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 06:55:39.741676 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 06:55:39.741701 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 06:55:39.741707 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 06:55:39.741713 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 06:55:39.741717 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 06:55:39.741721 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 06:55:39.741725 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 06:55:39.741774 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 06:55:39.744309 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" 
certName=\\\\\\\"serving-cert::/tmp/serving-cert-1853467785/tls.crt::/tmp/serving-cert-1853467785/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765868124\\\\\\\\\\\\\\\" (2025-12-16 06:55:23 +0000 UTC to 2026-01-15 06:55:24 +0000 UTC (now=2025-12-16 06:55:39.744275203 +0000 UTC))\\\\\\\"\\\\nF1216 06:55:39.744562 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c84e7f6950c8d433823b7d5ae3d8a9ecaa70ab6845ce607b8bb93efa990325a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84f11d2687959c8525b131966a998af137caa69a480bb31a94e1615a3ccbea5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84f11d2687959c8525b131966a998af137caa69a480bb31a94e1615a3ccbea5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:50Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:50 crc kubenswrapper[4823]: I1216 06:55:50.169323 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4c70365-dff4-4b29-af25-657fd9823db4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59ddc7310ed1600d050e488762436763220f3c2946a4c2020346da4ccef46b1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://275328a704b475b037c82f92055f9ca7f83b9821325603c5e34090140cc486ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7fa96323e345f8891f51500618a7b07d128d3e7256a3534d9dca75ff8ce9edf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d706c1463d2c913afe727db1a87a2841698d91a586c84d6b55f02865e7f5584a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:50Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:50 crc kubenswrapper[4823]: I1216 06:55:50.187851 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:50Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:50 crc kubenswrapper[4823]: I1216 06:55:50.203495 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:50 crc kubenswrapper[4823]: I1216 06:55:50.203537 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:50 crc kubenswrapper[4823]: I1216 06:55:50.203547 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:50 crc 
kubenswrapper[4823]: I1216 06:55:50.203561 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:50 crc kubenswrapper[4823]: I1216 06:55:50.203571 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:50Z","lastTransitionTime":"2025-12-16T06:55:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:55:50 crc kubenswrapper[4823]: I1216 06:55:50.204760 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63ba735c509770e9d43fe8152b2a42dc4123d0d797c7d41d5e3efe39a6cef74d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e09486327bad82b741ca9260e306f84ed9f0fbc7058bb06267042b8d2cd5f818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:50Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:50 crc kubenswrapper[4823]: I1216 
06:55:50.230097 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:50Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:50 crc kubenswrapper[4823]: I1216 06:55:50.242855 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72dfe197d43de2945a2ee22789ab86205c77588c4f4002e2473cef09e7b4b10b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-16T06:55:50Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:50 crc kubenswrapper[4823]: I1216 06:55:50.254008 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hr8h5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d31d032-9142-4e26-9f06-e3a5ea73d530\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://502e1707b8f53950b8f10fa6b289c9d235d6596fb6eec7e4d69c3158823f8e2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-7sdtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hr8h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:50Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:50 crc kubenswrapper[4823]: I1216 06:55:50.266885 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n248g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b377757-dbc6-4d9c-9656-3ff65d7d113a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78e6c48a7009b10b70f11bfc96f18839781aeccd794f9ce7a4074271cfb9bec
c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-2skkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n248g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:50Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:50 crc kubenswrapper[4823]: I1216 06:55:50.283336 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-964hc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a95a09e1-8457-4d5e-b1b6-dd6f59a66c6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bb1ede33a8f71c9116cfb9401b1f78a6aca07d580e4abe
93370b470d6b20284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b305ee45caf9aff77bb738777dd9b2ebd2e3e63727b460eae22f8f66d1e6cec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b305ee45caf9aff77bb738777dd9b2ebd2e3e63727b460eae22f8f66d1e6cec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\
\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d77b125c0d081148c1892229c933414ac8e1b8e0371f0932512deba1107e94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9d77b125c0d081148c1892229c933414ac8e1b8e0371f0932512deba1107e94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2048aefee6601e678e26821b3ed329aead74791b34654ef5dde9fe261e37b835\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c
78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2048aefee6601e678e26821b3ed329aead74791b34654ef5dde9fe261e37b835\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c574f5f8d3f82a7a21c7f8e49e7927387be9e4029e28ba265d39ea058fb6a60f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c574f5f8d3f82a7a21c7f8e49e7927387be9e4029e28ba265d39ea058fb6a60f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e32bdec2a02013c087ccc13325d44061d85bcf9a352f2dc31ed506f11db6428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e32bdec2a02013c087ccc13325d44061d85bcf9a352f2dc31ed506f11db6428\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91427f2f713f7e1c2ba16e87252e2fd0479a31e6f94884ad85fe9f99ab84bedd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91427f2f713f7e1c2ba16e87252e2fd0479a31e6f94884ad85fe9f99ab84bedd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-964hc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:50Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:50 crc kubenswrapper[4823]: I1216 06:55:50.294192 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bwcng" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ff057ef-c324-4465-8b8d-c7b98c25b23c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea36a181f0874fbab3022e6ce27567b65eb8c01cdb2925b08d9c0782af7e93c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2bj4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bwcng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:50Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:50 crc kubenswrapper[4823]: I1216 06:55:50.306309 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:50 crc kubenswrapper[4823]: I1216 06:55:50.306364 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:50 crc kubenswrapper[4823]: I1216 06:55:50.306376 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:50 crc kubenswrapper[4823]: I1216 06:55:50.306405 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:50 crc kubenswrapper[4823]: I1216 06:55:50.306416 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:50Z","lastTransitionTime":"2025-12-16T06:55:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:55:50 crc kubenswrapper[4823]: I1216 06:55:50.415773 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:50 crc kubenswrapper[4823]: I1216 06:55:50.415827 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:50 crc kubenswrapper[4823]: I1216 06:55:50.415838 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:50 crc kubenswrapper[4823]: I1216 06:55:50.415855 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:50 crc kubenswrapper[4823]: I1216 06:55:50.415866 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:50Z","lastTransitionTime":"2025-12-16T06:55:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:55:50 crc kubenswrapper[4823]: I1216 06:55:50.517984 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:50 crc kubenswrapper[4823]: I1216 06:55:50.518035 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:50 crc kubenswrapper[4823]: I1216 06:55:50.518047 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:50 crc kubenswrapper[4823]: I1216 06:55:50.518081 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:50 crc kubenswrapper[4823]: I1216 06:55:50.518093 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:50Z","lastTransitionTime":"2025-12-16T06:55:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:55:50 crc kubenswrapper[4823]: I1216 06:55:50.620734 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:50 crc kubenswrapper[4823]: I1216 06:55:50.620778 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:50 crc kubenswrapper[4823]: I1216 06:55:50.620787 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:50 crc kubenswrapper[4823]: I1216 06:55:50.620803 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:50 crc kubenswrapper[4823]: I1216 06:55:50.620816 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:50Z","lastTransitionTime":"2025-12-16T06:55:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:55:50 crc kubenswrapper[4823]: I1216 06:55:50.725150 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:50 crc kubenswrapper[4823]: I1216 06:55:50.725180 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:50 crc kubenswrapper[4823]: I1216 06:55:50.725190 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:50 crc kubenswrapper[4823]: I1216 06:55:50.725206 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:50 crc kubenswrapper[4823]: I1216 06:55:50.725218 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:50Z","lastTransitionTime":"2025-12-16T06:55:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:55:50 crc kubenswrapper[4823]: I1216 06:55:50.827937 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:50 crc kubenswrapper[4823]: I1216 06:55:50.827978 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:50 crc kubenswrapper[4823]: I1216 06:55:50.827987 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:50 crc kubenswrapper[4823]: I1216 06:55:50.828001 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:50 crc kubenswrapper[4823]: I1216 06:55:50.828013 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:50Z","lastTransitionTime":"2025-12-16T06:55:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:55:50 crc kubenswrapper[4823]: I1216 06:55:50.930468 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:50 crc kubenswrapper[4823]: I1216 06:55:50.930504 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:50 crc kubenswrapper[4823]: I1216 06:55:50.930512 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:50 crc kubenswrapper[4823]: I1216 06:55:50.930526 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:50 crc kubenswrapper[4823]: I1216 06:55:50.930535 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:50Z","lastTransitionTime":"2025-12-16T06:55:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:55:50 crc kubenswrapper[4823]: I1216 06:55:50.934192 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 06:55:50 crc kubenswrapper[4823]: I1216 06:55:50.946695 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c915dd50-9820-494e-b47a-987257910a57\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c33a30aed9756ef012ff9046309ec07897de3b17707ea39c621534ed863c3adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\
\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://464ab3cf14920ae3f947ce3dccacd7379d50380699008411d0a51bd83fe3e51c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20f7e0e3c07f467643afdcc6f0e333353d2d10b6d0c0d7aff0d347d33a5d925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ddfb7d6dae98465369260c8e9c64a2aa21dbf226e7898f6de0c36b3806bcae8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc2
76e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c9e6ec333ac552dc972c31cce23104cc27bcc2cf11bdc52d89ab36172915a41\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"ed a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1853467785/tls.crt::/tmp/serving-cert-1853467785/tls.key\\\\\\\"\\\\nI1216 06:55:39.731946 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 06:55:39.736431 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 06:55:39.736560 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 06:55:39.736611 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 06:55:39.736722 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 06:55:39.741676 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 06:55:39.741701 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 06:55:39.741707 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 06:55:39.741713 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 06:55:39.741717 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 06:55:39.741721 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 06:55:39.741725 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 06:55:39.741774 1 genericapiserver.go:533] 
MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 06:55:39.744309 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1853467785/tls.crt::/tmp/serving-cert-1853467785/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765868124\\\\\\\\\\\\\\\" (2025-12-16 06:55:23 +0000 UTC to 2026-01-15 06:55:24 +0000 UTC (now=2025-12-16 06:55:39.744275203 +0000 UTC))\\\\\\\"\\\\nF1216 06:55:39.744562 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c84e7f6950c8d433823b7d5ae3d8a9ecaa70ab6845ce607b8bb93efa990325a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84f11d2687959c8525b131966a998af137caa69a480bb3
1a94e1615a3ccbea5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84f11d2687959c8525b131966a998af137caa69a480bb31a94e1615a3ccbea5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:50Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:50 crc kubenswrapper[4823]: I1216 06:55:50.959367 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4c70365-dff4-4b29-af25-657fd9823db4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59ddc7310ed1600d050e488762436763220f3c2946a4c2020346da4ccef46b1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://275328a704b475b037c82f92055f9ca7f83b9821325603c5e34090140cc486ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7fa96323e345f8891f51500618a7b07d128d3e7256a3534d9dca75ff8ce9edf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d706c1463d2c913afe727db1a87a2841698d91a586c84d6b55f02865e7f5584a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:50Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:50 crc kubenswrapper[4823]: I1216 06:55:50.973840 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-964hc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a95a09e1-8457-4d5e-b1b6-dd6f59a66c6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bb1ede33a8f71c9116cfb9401b1f78a6aca07d580e4abe93370b470d6b20284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b305ee45caf9aff77bb738777dd9b2ebd2e3e63727b460eae22f8f66d1e6cec\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b305ee45caf9aff77bb738777dd9b2ebd2e3e63727b460eae22f8f66d1e6cec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d77b125c0d081148c1892229c933414ac8e1b8e0371f0932512deba1107e94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9d77b125c0d081148c1892229c933414ac8e1b8e0371f0932512deba1107e94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:42Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2048aefee6601e678e26821b3ed329aead74791b34654ef5dde9fe261e37b835\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2048aefee6601e678e26821b3ed329aead74791b34654ef5dde9fe261e37b835\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c574f
5f8d3f82a7a21c7f8e49e7927387be9e4029e28ba265d39ea058fb6a60f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c574f5f8d3f82a7a21c7f8e49e7927387be9e4029e28ba265d39ea058fb6a60f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e32bdec2a02013c087ccc13325d44061d85bcf9a352f2dc31ed506f11db6428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e32bdec2a02013c087ccc13325d44061d85bcf9a352f2dc31ed506f11db6428\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:47Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91427f2f713f7e1c2ba16e87252e2fd0479a31e6f94884ad85fe9f99ab84bedd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91427f2f713f7e1c2ba16e87252e2fd0479a31e6f94884ad85fe9f99ab84bedd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-964hc\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:50Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:50 crc kubenswrapper[4823]: I1216 06:55:50.986543 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:50Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:50 crc kubenswrapper[4823]: I1216 06:55:50.998265 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63ba735c509770e9d43fe8152b2a42dc4123d0d797c7d41d5e3efe39a6cef74d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e09486327bad82b741ca9260e306f84ed9f0fbc7058bb06267042b8d2cd5f818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:50Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:51 crc kubenswrapper[4823]: I1216 06:55:51.008500 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:51Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:51 crc kubenswrapper[4823]: I1216 06:55:51.018539 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72dfe197d43de2945a2ee22789ab86205c77588c4f4002e2473cef09e7b4b10b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-16T06:55:51Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:51 crc kubenswrapper[4823]: I1216 06:55:51.031812 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hr8h5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d31d032-9142-4e26-9f06-e3a5ea73d530\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://502e1707b8f53950b8f10fa6b289c9d235d6596fb6eec7e4d69c3158823f8e2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-7sdtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hr8h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:51Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:51 crc kubenswrapper[4823]: I1216 06:55:51.034259 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:51 crc kubenswrapper[4823]: I1216 06:55:51.034306 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:51 crc kubenswrapper[4823]: I1216 06:55:51.034315 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:51 crc kubenswrapper[4823]: I1216 06:55:51.034331 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:51 crc kubenswrapper[4823]: I1216 06:55:51.034340 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:51Z","lastTransitionTime":"2025-12-16T06:55:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:55:51 crc kubenswrapper[4823]: I1216 06:55:51.045806 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n248g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b377757-dbc6-4d9c-9656-3ff65d7d113a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78e6c48a7009b10b70f11bfc96f18839781aeccd794f9ce7a4074271cfb9becc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2skkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n248g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:51Z 
is after 2025-08-24T17:21:41Z" Dec 16 06:55:51 crc kubenswrapper[4823]: I1216 06:55:51.046424 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zwjhk_08e48f89-7095-4ea2-afb5-759591c2b0d4/ovnkube-controller/0.log" Dec 16 06:55:51 crc kubenswrapper[4823]: I1216 06:55:51.049452 4823 generic.go:334] "Generic (PLEG): container finished" podID="08e48f89-7095-4ea2-afb5-759591c2b0d4" containerID="00935cc6d346797a92ef9e563af373278845a965676a245ecd98969758468383" exitCode=1 Dec 16 06:55:51 crc kubenswrapper[4823]: I1216 06:55:51.049493 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" event={"ID":"08e48f89-7095-4ea2-afb5-759591c2b0d4","Type":"ContainerDied","Data":"00935cc6d346797a92ef9e563af373278845a965676a245ecd98969758468383"} Dec 16 06:55:51 crc kubenswrapper[4823]: I1216 06:55:51.050374 4823 scope.go:117] "RemoveContainer" containerID="00935cc6d346797a92ef9e563af373278845a965676a245ecd98969758468383" Dec 16 06:55:51 crc kubenswrapper[4823]: I1216 06:55:51.056684 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bwcng" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ff057ef-c324-4465-8b8d-c7b98c25b23c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea36a181f0874fbab3022e6ce27567b65eb8c01cdb2925b08d9c0782af7e93c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2bj4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bwcng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:51Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:51 crc kubenswrapper[4823]: I1216 06:55:51.076666 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08e48f89-7095-4ea2-afb5-759591c2b0d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3d0f51f7ac1df7a9ae61306ccbfe2c93b397b6ab9ae0a477ab4eb46248d36f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b768d92db57918a41c53b83940957c0181865808eb2c795231a5f8dbbf2bde82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f2cea674c0936cca4e9e95dfb7cd15bf9ccb3fbf8b711f351f87b632aad65c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://055585c1217ea720b929c9da82e7ea662d6fd5634244d362b3b425faee380d92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:42Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1c6cb023dcc6d1b149de201b108e3b1f3e32530dadc822aedede5552efc586b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://304f9e1fee6b648f61b3b334284a954e8107ff70ca71d01b586d137490eeab20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00935cc6d346797a92ef9e563af373278845a965676a245ecd98969758468383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c223f5207e2445fd4bfd1bfe2c1c1ca50b5211a92b02c21445534fbfa2709e28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e3566766c9c24f5c12481b74ed8a3938b21bdb36c003146515c67170d74a71f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e3566766c9c24f5c12481b74ed8a3938b21bdb36c003146515c67170d74a71f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zwjhk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:51Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:51 crc kubenswrapper[4823]: I1216 06:55:51.095509 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3de3a2ad-dc6c-47ba-af8d-4f128e025aad\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c40d3d73f70c8983adc8d076d89864e7224528dae97252396a1c34bd4cb804e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50bb40e9d22674554f2a267df7c9f924a29131ac986877bf19572cc5992ba396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9c22b0787554b4bdf0b0068fc696b8515e2ee63affef23e5eb64f77bc32a624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a32894b8f22c5335ed585c26fcab727324803d768943f40455a592025cbfd0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1560fb0ba0c40a2797688b518c22c9164a8cbe9a265cb2ad95408ba86b0fb537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.12
6.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3d6894d867564eea62c7d78dd6ca62a9913787ba06b786722e478785ba796ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3d6894d867564eea62c7d78dd6ca62a9913787ba06b786722e478785ba796ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e551b627e90b56385fe5618d3e66ea7376a2d574589583c005373aa29db1d410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e551b627e90b56385fe5618d3e66ea7376a2d574589583c005373aa29db1d410\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://631458d415d64c3cd4dfd24771644393aa6c01288de8f7a5395cf8888db67d2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://631458d415d64c3cd4dfd24771644393aa6c01288de8f7a5395cf8888db67d2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:51Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:51 crc kubenswrapper[4823]: I1216 06:55:51.109865 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e388fa291257a537c122a1f3f0eb6a60eb74e4bcaaf32c4fe829be2da1bfdae9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:51Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:51 crc kubenswrapper[4823]: I1216 06:55:51.121412 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:51Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:51 crc kubenswrapper[4823]: I1216 06:55:51.131856 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"25dec47c-3043-486c-b371-2be103c214e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2bf5eb4e2f587f7084f731ca681a116313b1014b02cdb391b09fdcdd42600c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmpf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78aed5cdf8d3a29dfffcdabfd18b38c0cf762258
53078e01602198bee4994536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmpf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fv56f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:51Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:51 crc kubenswrapper[4823]: I1216 06:55:51.137069 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:51 crc kubenswrapper[4823]: I1216 06:55:51.137114 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:51 crc kubenswrapper[4823]: I1216 06:55:51.137127 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:51 crc 
kubenswrapper[4823]: I1216 06:55:51.137146 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:51 crc kubenswrapper[4823]: I1216 06:55:51.137160 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:51Z","lastTransitionTime":"2025-12-16T06:55:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:55:51 crc kubenswrapper[4823]: I1216 06:55:51.144480 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c915dd50-9820-494e-b47a-987257910a57\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c33a30aed9756ef012ff9046309ec07897de3b17707ea39c621534ed863c3adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://464ab3cf14920ae3f947ce3dccacd7379d50380699008411d0a51bd83fe3e51c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20f7e0e3c07f467643afdcc6f0e333353d2d10b6d0c0d7aff0d347d33a5d925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06
:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ddfb7d6dae98465369260c8e9c64a2aa21dbf226e7898f6de0c36b3806bcae8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c9e6ec333ac552dc972c31cce23104cc27bcc2cf11bdc52d89ab36172915a41\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"ed a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1853467785/tls.crt::/tmp/serving-cert-1853467785/tls.key\\\\\\\"\\\\nI1216 06:55:39.731946 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 06:55:39.736431 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 06:55:39.736560 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 06:55:39.736611 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 06:55:39.736722 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 06:55:39.741676 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 06:55:39.741701 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 06:55:39.741707 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 06:55:39.741713 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 06:55:39.741717 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 06:55:39.741721 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 06:55:39.741725 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 06:55:39.741774 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 06:55:39.744309 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1853467785/tls.crt::/tmp/serving-cert-1853467785/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765868124\\\\\\\\\\\\\\\" (2025-12-16 06:55:23 +0000 UTC to 2026-01-15 06:55:24 +0000 UTC (now=2025-12-16 06:55:39.744275203 +0000 UTC))\\\\\\\"\\\\nF1216 06:55:39.744562 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c84e7f6950c8d433823b7d5ae3d8a9ecaa70ab6845ce607b8bb93efa990325a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84f11d2687959c8525b131966a998af137caa69a480bb31a94e1615a3ccbea5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84f11d2687959c8525b131966a998af137caa69a480bb31a94e1615a3ccbea5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:51Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:51 crc kubenswrapper[4823]: I1216 06:55:51.156186 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4c70365-dff4-4b29-af25-657fd9823db4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59ddc7310ed1600d050e488762436763220f3c2946a4c2020346da4ccef46b1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad
6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://275328a704b475b037c82f92055f9ca7f83b9821325603c5e34090140cc486ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7fa96323e345f8891f51500618a7b07d128d3e7256a3534d9dca75ff8ce9edf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,
\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d706c1463d2c913afe727db1a87a2841698d91a586c84d6b55f02865e7f5584a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:51Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:51 crc kubenswrapper[4823]: I1216 06:55:51.167782 4823 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63ba735c509770e9d43fe8152b2a42dc4123d0d797c7d41d5e3efe39a6cef74d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e09486327bad82b741ca9260e306f84ed9f0fbc7058bb06267042b8d2cd5f818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:51Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:51 crc kubenswrapper[4823]: I1216 06:55:51.178884 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:51Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:51 crc kubenswrapper[4823]: I1216 06:55:51.188689 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72dfe197d43de2945a2ee22789ab86205c77588c4f4002e2473cef09e7b4b10b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-16T06:55:51Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:51 crc kubenswrapper[4823]: I1216 06:55:51.197890 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hr8h5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d31d032-9142-4e26-9f06-e3a5ea73d530\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://502e1707b8f53950b8f10fa6b289c9d235d6596fb6eec7e4d69c3158823f8e2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-7sdtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hr8h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:51Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:51 crc kubenswrapper[4823]: I1216 06:55:51.210335 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n248g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b377757-dbc6-4d9c-9656-3ff65d7d113a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78e6c48a7009b10b70f11bfc96f18839781aeccd794f9ce7a4074271cfb9bec
c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-2skkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n248g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:51Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:51 crc kubenswrapper[4823]: I1216 06:55:51.222409 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-964hc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a95a09e1-8457-4d5e-b1b6-dd6f59a66c6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bb1ede33a8f71c9116cfb9401b1f78a6aca07d580e4abe
93370b470d6b20284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b305ee45caf9aff77bb738777dd9b2ebd2e3e63727b460eae22f8f66d1e6cec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b305ee45caf9aff77bb738777dd9b2ebd2e3e63727b460eae22f8f66d1e6cec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\
\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d77b125c0d081148c1892229c933414ac8e1b8e0371f0932512deba1107e94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9d77b125c0d081148c1892229c933414ac8e1b8e0371f0932512deba1107e94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2048aefee6601e678e26821b3ed329aead74791b34654ef5dde9fe261e37b835\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c
78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2048aefee6601e678e26821b3ed329aead74791b34654ef5dde9fe261e37b835\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c574f5f8d3f82a7a21c7f8e49e7927387be9e4029e28ba265d39ea058fb6a60f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c574f5f8d3f82a7a21c7f8e49e7927387be9e4029e28ba265d39ea058fb6a60f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e32bdec2a02013c087ccc13325d44061d85bcf9a352f2dc31ed506f11db6428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e32bdec2a02013c087ccc13325d44061d85bcf9a352f2dc31ed506f11db6428\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91427f2f713f7e1c2ba16e87252e2fd0479a31e6f94884ad85fe9f99ab84bedd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91427f2f713f7e1c2ba16e87252e2fd0479a31e6f94884ad85fe9f99ab84bedd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-964hc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:51Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:51 crc kubenswrapper[4823]: I1216 06:55:51.233586 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:51Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:51 crc kubenswrapper[4823]: I1216 06:55:51.241582 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:51 crc kubenswrapper[4823]: I1216 06:55:51.241612 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 16 06:55:51 crc kubenswrapper[4823]: I1216 06:55:51.241622 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:51 crc kubenswrapper[4823]: I1216 06:55:51.241636 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:51 crc kubenswrapper[4823]: I1216 06:55:51.241646 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:51Z","lastTransitionTime":"2025-12-16T06:55:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:55:51 crc kubenswrapper[4823]: I1216 06:55:51.242311 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bwcng" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ff057ef-c324-4465-8b8d-c7b98c25b23c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea36a181f0874fbab3022e6ce27567b65eb8c01cdb2925b08d9c0782af7e93c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2bj4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bwcng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:51Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:51 crc kubenswrapper[4823]: I1216 06:55:51.257892 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08e48f89-7095-4ea2-afb5-759591c2b0d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3d0f51f7ac1df7a9ae61306ccbfe2c93b397b6ab9ae0a477ab4eb46248d36f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b768d92db57918a41c53b83940957c0181865808eb2c795231a5f8dbbf2bde82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f2cea674c0936cca4e9e95dfb7cd15bf9ccb3fbf8b711f351f87b632aad65c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://055585c1217ea720b929c9da82e7ea662d6fd5634244d362b3b425faee380d92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:42Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1c6cb023dcc6d1b149de201b108e3b1f3e32530dadc822aedede5552efc586b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://304f9e1fee6b648f61b3b334284a954e8107ff70ca71d01b586d137490eeab20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00935cc6d346797a92ef9e563af373278845a965676a245ecd98969758468383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00935cc6d346797a92ef9e563af373278845a965676a245ecd98969758468383\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T06:55:50Z\\\",\\\"message\\\":\\\":55:50.835539 6074 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1216 06:55:50.835626 6074 reflector.go:311] Stopping reflector *v1.Node (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI1216 06:55:50.835693 6074 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1216 06:55:50.836226 6074 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1216 06:55:50.836261 6074 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1216 06:55:50.836286 6074 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1216 06:55:50.836302 6074 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1216 06:55:50.836337 6074 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1216 06:55:50.836340 6074 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1216 06:55:50.836351 6074 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1216 06:55:50.836356 6074 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1216 06:55:50.836366 6074 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1216 06:55:50.836375 6074 factory.go:656] Stopping watch factory\\\\nI1216 06:55:50.836586 6074 ovnkube.go:599] Stopped ovnkube\\\\nI1216 
06\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c223f5207e2445fd4bfd1bfe2c1c1ca50b5211a92b02c21445534fbfa2709e28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e3566766c9c24f5c12481b74ed8a3938b21bdb36c003146515c67170d74a71f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e3566766c9c24f5c12481b74ed8a3938b21bdb36c003146515c67170d74a71f\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zwjhk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:51Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:51 crc kubenswrapper[4823]: I1216 06:55:51.269065 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:51Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:51 crc kubenswrapper[4823]: I1216 06:55:51.281006 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"25dec47c-3043-486c-b371-2be103c214e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2bf5eb4e2f587f7084f731ca681a116313b1014b02cdb391b09fdcdd42600c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmpf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78aed5cdf8d3a29dfffcdabfd18b38c0cf762258
53078e01602198bee4994536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmpf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fv56f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:51Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:51 crc kubenswrapper[4823]: I1216 06:55:51.302766 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3de3a2ad-dc6c-47ba-af8d-4f128e025aad\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c40d3d73f70c8983adc8d076d89864e7224528dae97252396a1c34bd4cb804e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50bb40e9d22674554f2a267df7c9f924a29131ac986877bf19572cc5992ba396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9c22b0787554b4bdf0b0068fc696b8515e2ee63affef23e5eb64f77bc32a624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a32894b8f22c5335ed585c26fcab727324803d768943f40455a592025cbfd0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1560fb0ba0c40a2797688b518c22c9164a8cbe9a265cb2ad95408ba86b0fb537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3d6894d867564eea62c7d78dd6ca62a9913787ba06b786722e478785ba796ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3d6894d867564eea62c7d78dd6ca62a9913787ba06b786722e478785ba796ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-16T06:55:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e551b627e90b56385fe5618d3e66ea7376a2d574589583c005373aa29db1d410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e551b627e90b56385fe5618d3e66ea7376a2d574589583c005373aa29db1d410\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://631458d415d64c3cd4dfd24771644393aa6c01288de8f7a5395cf8888db67d2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://631458d415d64c3cd4dfd24771644393aa6c01288de8f7a5395cf8888db67d2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:51Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:51 crc kubenswrapper[4823]: I1216 06:55:51.316356 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e388fa291257a537c122a1f3f0eb6a60eb74e4bcaaf32c4fe829be2da1bfdae9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:51Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:51 crc kubenswrapper[4823]: I1216 06:55:51.343280 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:51 crc kubenswrapper[4823]: I1216 06:55:51.343319 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:51 crc kubenswrapper[4823]: I1216 06:55:51.343328 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:51 crc kubenswrapper[4823]: I1216 06:55:51.343343 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:51 crc kubenswrapper[4823]: I1216 06:55:51.343356 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:51Z","lastTransitionTime":"2025-12-16T06:55:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:55:51 crc kubenswrapper[4823]: I1216 06:55:51.445426 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:51 crc kubenswrapper[4823]: I1216 06:55:51.445475 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:51 crc kubenswrapper[4823]: I1216 06:55:51.445484 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:51 crc kubenswrapper[4823]: I1216 06:55:51.445508 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:51 crc kubenswrapper[4823]: I1216 06:55:51.445519 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:51Z","lastTransitionTime":"2025-12-16T06:55:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:55:51 crc kubenswrapper[4823]: I1216 06:55:51.549557 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:51 crc kubenswrapper[4823]: I1216 06:55:51.549618 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:51 crc kubenswrapper[4823]: I1216 06:55:51.549628 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:51 crc kubenswrapper[4823]: I1216 06:55:51.549646 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:51 crc kubenswrapper[4823]: I1216 06:55:51.549658 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:51Z","lastTransitionTime":"2025-12-16T06:55:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:55:51 crc kubenswrapper[4823]: I1216 06:55:51.651658 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:51 crc kubenswrapper[4823]: I1216 06:55:51.651707 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:51 crc kubenswrapper[4823]: I1216 06:55:51.651718 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:51 crc kubenswrapper[4823]: I1216 06:55:51.651735 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:51 crc kubenswrapper[4823]: I1216 06:55:51.651751 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:51Z","lastTransitionTime":"2025-12-16T06:55:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:55:51 crc kubenswrapper[4823]: I1216 06:55:51.753576 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:51 crc kubenswrapper[4823]: I1216 06:55:51.753615 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:51 crc kubenswrapper[4823]: I1216 06:55:51.753624 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:51 crc kubenswrapper[4823]: I1216 06:55:51.753641 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:51 crc kubenswrapper[4823]: I1216 06:55:51.753650 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:51Z","lastTransitionTime":"2025-12-16T06:55:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:55:51 crc kubenswrapper[4823]: I1216 06:55:51.770802 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:55:51 crc kubenswrapper[4823]: I1216 06:55:51.770931 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:55:51 crc kubenswrapper[4823]: E1216 06:55:51.771044 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 06:55:51 crc kubenswrapper[4823]: I1216 06:55:51.771057 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:55:51 crc kubenswrapper[4823]: E1216 06:55:51.771169 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 06:55:51 crc kubenswrapper[4823]: E1216 06:55:51.771266 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 06:55:51 crc kubenswrapper[4823]: I1216 06:55:51.791297 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08e48f89-7095-4ea2-afb5-759591c2b0d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3d0f51f7ac1df7a9ae61306ccbfe2c93b397b6ab9ae0a477ab4eb46248d36f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b768d92db57918a41c53b83940957c0181865808eb2c795231a5f8dbbf2bde82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f2cea674c0936cca4e9e95dfb7cd15bf9ccb3fbf8b711f351f87b632aad65c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://055585c1217ea720b929c9da82e7ea662d6fd5634244d362b3b425faee380d92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:42Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1c6cb023dcc6d1b149de201b108e3b1f3e32530dadc822aedede5552efc586b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://304f9e1fee6b648f61b3b334284a954e8107ff70ca71d01b586d137490eeab20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00935cc6d346797a92ef9e563af373278845a965676a245ecd98969758468383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00935cc6d346797a92ef9e563af373278845a965676a245ecd98969758468383\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T06:55:50Z\\\",\\\"message\\\":\\\":55:50.835539 6074 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1216 06:55:50.835626 6074 reflector.go:311] Stopping reflector *v1.Node (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI1216 06:55:50.835693 6074 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1216 06:55:50.836226 6074 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1216 06:55:50.836261 6074 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1216 06:55:50.836286 6074 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1216 06:55:50.836302 6074 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1216 06:55:50.836337 6074 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1216 06:55:50.836340 6074 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1216 06:55:50.836351 6074 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1216 06:55:50.836356 6074 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1216 06:55:50.836366 6074 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1216 06:55:50.836375 6074 factory.go:656] Stopping watch factory\\\\nI1216 06:55:50.836586 6074 ovnkube.go:599] Stopped ovnkube\\\\nI1216 
06\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c223f5207e2445fd4bfd1bfe2c1c1ca50b5211a92b02c21445534fbfa2709e28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e3566766c9c24f5c12481b74ed8a3938b21bdb36c003146515c67170d74a71f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e3566766c9c24f5c12481b74ed8a3938b21bdb36c003146515c67170d74a71f\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zwjhk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:51Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:51 crc kubenswrapper[4823]: I1216 06:55:51.811384 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3de3a2ad-dc6c-47ba-af8d-4f128e025aad\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c40d3d73f70c8983adc8d076d89864e7224528dae97252396a1c34bd4cb804e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50bb40e9d22674554f2a267df7c9f924a29131ac986877bf19572cc5992ba396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9c22b0787554b4bdf0b0068fc696b8515e2ee63affef23e5eb64f77bc32a624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a32894b8f22c5335ed585c26fcab727324803d768943f40455a592025cbfd0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1560fb0ba0c40a2797688b518c22c9164a8cbe9a265cb2ad95408ba86b0fb537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3d6894d867564eea62c7d78dd6ca62a9913787ba06b786722e478785ba796ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3d6894d867564eea62c7d78dd6ca62a9913787ba06b786722e478785ba796ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-16T06:55:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e551b627e90b56385fe5618d3e66ea7376a2d574589583c005373aa29db1d410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e551b627e90b56385fe5618d3e66ea7376a2d574589583c005373aa29db1d410\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://631458d415d64c3cd4dfd24771644393aa6c01288de8f7a5395cf8888db67d2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://631458d415d64c3cd4dfd24771644393aa6c01288de8f7a5395cf8888db67d2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:51Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:51 crc kubenswrapper[4823]: I1216 06:55:51.825549 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e388fa291257a537c122a1f3f0eb6a60eb74e4bcaaf32c4fe829be2da1bfdae9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:51Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:51 crc kubenswrapper[4823]: I1216 06:55:51.836590 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:51Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:51 crc kubenswrapper[4823]: I1216 06:55:51.845747 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"25dec47c-3043-486c-b371-2be103c214e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2bf5eb4e2f587f7084f731ca681a116313b1014b02cdb391b09fdcdd42600c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmpf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78aed5cdf8d3a29dfffcdabfd18b38c0cf762258
53078e01602198bee4994536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmpf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fv56f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:51Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:51 crc kubenswrapper[4823]: I1216 06:55:51.855093 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:51 crc kubenswrapper[4823]: I1216 06:55:51.855120 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:51 crc kubenswrapper[4823]: I1216 06:55:51.855129 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:51 crc 
kubenswrapper[4823]: I1216 06:55:51.855143 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:51 crc kubenswrapper[4823]: I1216 06:55:51.855153 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:51Z","lastTransitionTime":"2025-12-16T06:55:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:55:51 crc kubenswrapper[4823]: I1216 06:55:51.865797 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c915dd50-9820-494e-b47a-987257910a57\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c33a30aed9756ef012ff9046309ec07897de3b17707ea39c621534ed863c3adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://464ab3cf14920ae3f947ce3dccacd7379d50380699008411d0a51bd83fe3e51c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20f7e0e3c07f467643afdcc6f0e333353d2d10b6d0c0d7aff0d347d33a5d925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06
:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ddfb7d6dae98465369260c8e9c64a2aa21dbf226e7898f6de0c36b3806bcae8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c9e6ec333ac552dc972c31cce23104cc27bcc2cf11bdc52d89ab36172915a41\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"ed a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1853467785/tls.crt::/tmp/serving-cert-1853467785/tls.key\\\\\\\"\\\\nI1216 06:55:39.731946 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 06:55:39.736431 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 06:55:39.736560 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 06:55:39.736611 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 06:55:39.736722 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 06:55:39.741676 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 06:55:39.741701 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 06:55:39.741707 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 06:55:39.741713 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 06:55:39.741717 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 06:55:39.741721 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 06:55:39.741725 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 06:55:39.741774 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 06:55:39.744309 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1853467785/tls.crt::/tmp/serving-cert-1853467785/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765868124\\\\\\\\\\\\\\\" (2025-12-16 06:55:23 +0000 UTC to 2026-01-15 06:55:24 +0000 UTC (now=2025-12-16 06:55:39.744275203 +0000 UTC))\\\\\\\"\\\\nF1216 06:55:39.744562 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c84e7f6950c8d433823b7d5ae3d8a9ecaa70ab6845ce607b8bb93efa990325a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84f11d2687959c8525b131966a998af137caa69a480bb31a94e1615a3ccbea5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84f11d2687959c8525b131966a998af137caa69a480bb31a94e1615a3ccbea5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:51Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:51 crc kubenswrapper[4823]: I1216 06:55:51.879198 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4c70365-dff4-4b29-af25-657fd9823db4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59ddc7310ed1600d050e488762436763220f3c2946a4c2020346da4ccef46b1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad
6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://275328a704b475b037c82f92055f9ca7f83b9821325603c5e34090140cc486ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7fa96323e345f8891f51500618a7b07d128d3e7256a3534d9dca75ff8ce9edf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,
\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d706c1463d2c913afe727db1a87a2841698d91a586c84d6b55f02865e7f5584a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:51Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:51 crc kubenswrapper[4823]: I1216 06:55:51.895175 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n248g" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b377757-dbc6-4d9c-9656-3ff65d7d113a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78e6c48a7009b10b70f11bfc96f18839781aeccd794f9ce7a4074271cfb9becc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin
\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2skkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n248g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:51Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:51 crc kubenswrapper[4823]: I1216 06:55:51.911764 4823 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-additional-cni-plugins-964hc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a95a09e1-8457-4d5e-b1b6-dd6f59a66c6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bb1ede33a8f71c9116cfb9401b1f78a6aca07d580e4abe93370b470d6b20284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"contai
nerID\\\":\\\"cri-o://4b305ee45caf9aff77bb738777dd9b2ebd2e3e63727b460eae22f8f66d1e6cec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b305ee45caf9aff77bb738777dd9b2ebd2e3e63727b460eae22f8f66d1e6cec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d77b125c0d081148c1892229c933414ac8e1b8e0371f0932512deba1107e94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9d77b125c0d081148c1892229c933414ac8e1b8e0371f0932512deba1107e94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:43Z\\\
",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2048aefee6601e678e26821b3ed329aead74791b34654ef5dde9fe261e37b835\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2048aefee6601e678e26821b3ed329aead74791b34654ef5dde9fe261e37b835\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c574f5f8d3f82a7a21c7f8e49e7927387be9e4029e28ba265d39ea058fb6a60f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c574f5f8d3f82a7a21c7f8e49e7927387be9e4029e28ba265d39ea058fb6a60f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e32bdec2a02013c087ccc13325d44061d85bcf9a352f2dc31ed506f11db6428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e32bdec2a02013c087ccc13325d44061d85bcf9a352f2dc31ed506f11db6428\\\",\
\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91427f2f713f7e1c2ba16e87252e2fd0479a31e6f94884ad85fe9f99ab84bedd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91427f2f713f7e1c2ba16e87252e2fd0479a31e6f94884ad85fe9f99ab84bedd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-964hc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:51Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:51 crc kubenswrapper[4823]: I1216 06:55:51.926769 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:51Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:51 crc kubenswrapper[4823]: I1216 06:55:51.941258 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63ba735c509770e9d43fe8152b2a42dc4123d0d797c7d41d5e3efe39a6cef74d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e09486327bad82b741ca9260e306f84ed9f0fbc7058bb06267042b8d2cd5f818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:51Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:51 crc kubenswrapper[4823]: I1216 06:55:51.954005 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:51Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:51 crc kubenswrapper[4823]: I1216 06:55:51.957397 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:51 crc kubenswrapper[4823]: I1216 06:55:51.957426 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:51 crc 
kubenswrapper[4823]: I1216 06:55:51.957437 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:51 crc kubenswrapper[4823]: I1216 06:55:51.957467 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:51 crc kubenswrapper[4823]: I1216 06:55:51.957479 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:51Z","lastTransitionTime":"2025-12-16T06:55:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:55:51 crc kubenswrapper[4823]: I1216 06:55:51.967633 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72dfe197d43de2945a2ee22789ab86205c77588c4f4002e2473cef09e7b4b10b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-16T06:55:51Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:51 crc kubenswrapper[4823]: I1216 06:55:51.977494 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hr8h5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d31d032-9142-4e26-9f06-e3a5ea73d530\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://502e1707b8f53950b8f10fa6b289c9d235d6596fb6eec7e4d69c3158823f8e2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-7sdtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hr8h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:51Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:51 crc kubenswrapper[4823]: I1216 06:55:51.986236 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bwcng" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ff057ef-c324-4465-8b8d-c7b98c25b23c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea36a181f0874fbab3022e6ce27567b65eb8c01cdb2925b08d9c07
82af7e93c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2bj4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bwcng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:51Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:52 crc kubenswrapper[4823]: I1216 06:55:52.054862 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zwjhk_08e48f89-7095-4ea2-afb5-759591c2b0d4/ovnkube-controller/0.log" Dec 16 06:55:52 crc kubenswrapper[4823]: I1216 06:55:52.057553 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" 
event={"ID":"08e48f89-7095-4ea2-afb5-759591c2b0d4","Type":"ContainerStarted","Data":"a17f6108d8783f74d8244ae56a7e5fb1c1835c7cf4e48c6b0aaa5e8ec4a5c843"} Dec 16 06:55:52 crc kubenswrapper[4823]: I1216 06:55:52.057927 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" Dec 16 06:55:52 crc kubenswrapper[4823]: I1216 06:55:52.059214 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:52 crc kubenswrapper[4823]: I1216 06:55:52.059246 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:52 crc kubenswrapper[4823]: I1216 06:55:52.059255 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:52 crc kubenswrapper[4823]: I1216 06:55:52.059268 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:52 crc kubenswrapper[4823]: I1216 06:55:52.059280 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:52Z","lastTransitionTime":"2025-12-16T06:55:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:55:52 crc kubenswrapper[4823]: I1216 06:55:52.085059 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3de3a2ad-dc6c-47ba-af8d-4f128e025aad\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c40d3d73f70c8983adc8d076d89864e7224528dae97252396a1c34bd4cb804e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50bb40e9d22674554f2a267df7c9f924a29131ac986877bf19572cc5992ba396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9c22b0787554b4bdf0b0068fc696b8515e2ee63affef23e5eb64f77bc32a624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a32894b8f22c5335ed585c26fcab727324803d768943f40455a592025cbfd0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1560fb0ba0c40a2797688b518c22c9164a8cbe9a265cb2ad95408ba86b0fb537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3d6894d867564eea62c7d78dd6ca62a9913787ba06b786722e478785ba796ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3d6894d867564eea62c7d78dd6ca62a9913787ba06b786722e478785ba796ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e551b627e90b56385fe5618d3e66ea7376a2d574589583c005373aa29db1d410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e551b627e90b56385fe5618d3e66ea7376a2d574589583c005373aa29db1d410\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://631458d415d64c3cd4dfd24771644393aa6c01288de8f7a5395cf8888db67d2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://631458d415d64c3cd4dfd24771644393aa6c01288de8f7a5395cf8888db67d2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:52Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:52 crc kubenswrapper[4823]: I1216 06:55:52.116808 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e388fa291257a537c122a1f3f0eb6a60eb74e4bcaaf32c4fe829be2da1bfdae9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:52Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:52 crc kubenswrapper[4823]: I1216 06:55:52.133165 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:52Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:52 crc kubenswrapper[4823]: I1216 06:55:52.146506 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"25dec47c-3043-486c-b371-2be103c214e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2bf5eb4e2f587f7084f731ca681a116313b1014b02cdb391b09fdcdd42600c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmpf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78aed5cdf8d3a29dfffcdabfd18b38c0cf762258
53078e01602198bee4994536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmpf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fv56f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:52Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:52 crc kubenswrapper[4823]: I1216 06:55:52.160574 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c915dd50-9820-494e-b47a-987257910a57\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c33a30aed9756ef012ff9046309ec07897de3b17707ea39c621534ed863c3adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://464ab3cf14920ae3f947ce3dccacd7379d50380699008411d0a51bd83fe3e51c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20f7e0e3c07f467643afdcc6f0e333353d2d10b6d0c0d7aff0d347d33a5d925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ddfb7d6dae98465369260c8e9c64a2aa21dbf226e7898f6de0c36b3806bcae8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c9e6ec333ac552dc972c31cce23104cc27bcc2cf11bdc52d89ab36172915a41\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T06:55:39Z\\\"
,\\\"message\\\":\\\"ed a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1853467785/tls.crt::/tmp/serving-cert-1853467785/tls.key\\\\\\\"\\\\nI1216 06:55:39.731946 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 06:55:39.736431 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 06:55:39.736560 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 06:55:39.736611 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 06:55:39.736722 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 06:55:39.741676 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 06:55:39.741701 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 06:55:39.741707 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 06:55:39.741713 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 06:55:39.741717 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 06:55:39.741721 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 06:55:39.741725 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 06:55:39.741774 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 06:55:39.744309 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1853467785/tls.crt::/tmp/serving-cert-1853467785/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] 
issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765868124\\\\\\\\\\\\\\\" (2025-12-16 06:55:23 +0000 UTC to 2026-01-15 06:55:24 +0000 UTC (now=2025-12-16 06:55:39.744275203 +0000 UTC))\\\\\\\"\\\\nF1216 06:55:39.744562 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c84e7f6950c8d433823b7d5ae3d8a9ecaa70ab6845ce607b8bb93efa990325a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84f11d2687959c8525b131966a998af137caa69a480bb31a94e1615a3ccbea5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84f11d2687959c8525b131966a998af137caa69a480bb31a94e1615a3ccbea5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:52Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:52 crc kubenswrapper[4823]: I1216 06:55:52.162319 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:52 crc kubenswrapper[4823]: I1216 06:55:52.162353 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:52 crc kubenswrapper[4823]: I1216 06:55:52.162364 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:52 crc kubenswrapper[4823]: I1216 06:55:52.162378 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:52 crc kubenswrapper[4823]: I1216 06:55:52.162389 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:52Z","lastTransitionTime":"2025-12-16T06:55:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:55:52 crc kubenswrapper[4823]: I1216 06:55:52.172388 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4c70365-dff4-4b29-af25-657fd9823db4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59ddc7310ed1600d050e488762436763220f3c2946a4c2020346da4ccef46b1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPat
h\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://275328a704b475b037c82f92055f9ca7f83b9821325603c5e34090140cc486ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7fa96323e345f8891f51500618a7b07d128d3e7256a3534d9dca75ff8ce9edf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d706c1463d2c913afe727db1a87a2841698d91a586c84d6b55f02865e7f5584a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed0828
7faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:52Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:52 crc kubenswrapper[4823]: I1216 06:55:52.182564 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:52Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:52 crc kubenswrapper[4823]: I1216 06:55:52.193224 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63ba735c509770e9d43fe8152b2a42dc4123d0d797c7d41d5e3efe39a6cef74d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e09486327bad82b741ca9260e306f84ed9f0fbc7058bb06267042b8d2cd5f818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:52Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:52 crc kubenswrapper[4823]: I1216 06:55:52.203995 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:52Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:52 crc kubenswrapper[4823]: I1216 06:55:52.213715 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72dfe197d43de2945a2ee22789ab86205c77588c4f4002e2473cef09e7b4b10b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-16T06:55:52Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:52 crc kubenswrapper[4823]: I1216 06:55:52.223258 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hr8h5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d31d032-9142-4e26-9f06-e3a5ea73d530\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://502e1707b8f53950b8f10fa6b289c9d235d6596fb6eec7e4d69c3158823f8e2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-7sdtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hr8h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:52Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:52 crc kubenswrapper[4823]: I1216 06:55:52.235696 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n248g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b377757-dbc6-4d9c-9656-3ff65d7d113a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78e6c48a7009b10b70f11bfc96f18839781aeccd794f9ce7a4074271cfb9bec
c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-2skkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n248g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:52Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:52 crc kubenswrapper[4823]: I1216 06:55:52.250227 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-964hc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a95a09e1-8457-4d5e-b1b6-dd6f59a66c6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bb1ede33a8f71c9116cfb9401b1f78a6aca07d580e4abe
93370b470d6b20284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b305ee45caf9aff77bb738777dd9b2ebd2e3e63727b460eae22f8f66d1e6cec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b305ee45caf9aff77bb738777dd9b2ebd2e3e63727b460eae22f8f66d1e6cec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\
\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d77b125c0d081148c1892229c933414ac8e1b8e0371f0932512deba1107e94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9d77b125c0d081148c1892229c933414ac8e1b8e0371f0932512deba1107e94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2048aefee6601e678e26821b3ed329aead74791b34654ef5dde9fe261e37b835\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c
78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2048aefee6601e678e26821b3ed329aead74791b34654ef5dde9fe261e37b835\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c574f5f8d3f82a7a21c7f8e49e7927387be9e4029e28ba265d39ea058fb6a60f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c574f5f8d3f82a7a21c7f8e49e7927387be9e4029e28ba265d39ea058fb6a60f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e32bdec2a02013c087ccc13325d44061d85bcf9a352f2dc31ed506f11db6428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e32bdec2a02013c087ccc13325d44061d85bcf9a352f2dc31ed506f11db6428\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91427f2f713f7e1c2ba16e87252e2fd0479a31e6f94884ad85fe9f99ab84bedd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91427f2f713f7e1c2ba16e87252e2fd0479a31e6f94884ad85fe9f99ab84bedd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-964hc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:52Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:52 crc kubenswrapper[4823]: I1216 06:55:52.260803 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bwcng" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ff057ef-c324-4465-8b8d-c7b98c25b23c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea36a181f0874fbab3022e6ce27567b65eb8c01cdb2925b08d9c0782af7e93c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2bj4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bwcng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:52Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:52 crc kubenswrapper[4823]: I1216 06:55:52.264621 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:52 crc kubenswrapper[4823]: I1216 06:55:52.264667 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:52 crc kubenswrapper[4823]: I1216 06:55:52.264677 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:52 crc kubenswrapper[4823]: I1216 06:55:52.264693 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:52 crc kubenswrapper[4823]: I1216 06:55:52.264704 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:52Z","lastTransitionTime":"2025-12-16T06:55:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:55:52 crc kubenswrapper[4823]: I1216 06:55:52.280133 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08e48f89-7095-4ea2-afb5-759591c2b0d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3d0f51f7ac1df7a9ae61306ccbfe2c93b397b6ab9ae0a477ab4eb46248d36f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b768d92db57918a41c53b83940957c0181865808eb2c795231a5f8dbbf2bde82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f2cea674c0936cca4e9e95dfb7cd15bf9ccb3fbf8b711f351f87b632aad65c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://055585c1217ea720b929c9da82e7ea662d6fd5634244d362b3b425faee380d92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:42Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1c6cb023dcc6d1b149de201b108e3b1f3e32530dadc822aedede5552efc586b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://304f9e1fee6b648f61b3b334284a954e8107ff70ca71d01b586d137490eeab20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a17f6108d8783f74d8244ae56a7e5fb1c1835c7cf4e48c6b0aaa5e8ec4a5c843\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00935cc6d346797a92ef9e563af373278845a965676a245ecd98969758468383\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T06:55:50Z\\\",\\\"message\\\":\\\":55:50.835539 6074 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1216 06:55:50.835626 6074 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1216 06:55:50.835693 6074 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) 
from k8s.io/client-go/informers/factory.go:160\\\\nI1216 06:55:50.836226 6074 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1216 06:55:50.836261 6074 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1216 06:55:50.836286 6074 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1216 06:55:50.836302 6074 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1216 06:55:50.836337 6074 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1216 06:55:50.836340 6074 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1216 06:55:50.836351 6074 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1216 06:55:50.836356 6074 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1216 06:55:50.836366 6074 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1216 06:55:50.836375 6074 factory.go:656] Stopping watch factory\\\\nI1216 06:55:50.836586 6074 ovnkube.go:599] Stopped ovnkube\\\\nI1216 
06\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:47Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c223f5207e2445fd4bfd1bfe2c1c1ca50b5211a92b02c21445534fbfa2709e28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e3566766c9c24f5c12481b74ed8a3938b21bdb36c003146515c67170d74a71f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e3566766c9c24f5c12481b74ed8a3938b21bdb36c003146515c67170d74a71f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zwjhk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:52Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:52 crc kubenswrapper[4823]: I1216 06:55:52.367437 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:52 crc kubenswrapper[4823]: I1216 06:55:52.367479 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:52 crc kubenswrapper[4823]: I1216 06:55:52.367487 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:52 crc kubenswrapper[4823]: I1216 06:55:52.367502 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:52 crc kubenswrapper[4823]: I1216 06:55:52.367511 4823 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:52Z","lastTransitionTime":"2025-12-16T06:55:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:55:52 crc kubenswrapper[4823]: I1216 06:55:52.470970 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:52 crc kubenswrapper[4823]: I1216 06:55:52.471019 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:52 crc kubenswrapper[4823]: I1216 06:55:52.471046 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:52 crc kubenswrapper[4823]: I1216 06:55:52.471061 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:52 crc kubenswrapper[4823]: I1216 06:55:52.471072 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:52Z","lastTransitionTime":"2025-12-16T06:55:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:55:52 crc kubenswrapper[4823]: I1216 06:55:52.537346 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v5mgh"] Dec 16 06:55:52 crc kubenswrapper[4823]: I1216 06:55:52.538278 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v5mgh" Dec 16 06:55:52 crc kubenswrapper[4823]: I1216 06:55:52.540780 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 16 06:55:52 crc kubenswrapper[4823]: I1216 06:55:52.542261 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 16 06:55:52 crc kubenswrapper[4823]: I1216 06:55:52.559625 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c915dd50-9820-494e-b47a-987257910a57\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c33a30aed9756ef012ff9046309ec07897de3b17707ea39c621534ed863c3adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\
\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://464ab3cf14920ae3f947ce3dccacd7379d50380699008411d0a51bd83fe3e51c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20f7e0e3c07f467643afdcc6f0e333353d2d10b6d0c0d7aff0d347d33a5d925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"
/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ddfb7d6dae98465369260c8e9c64a2aa21dbf226e7898f6de0c36b3806bcae8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c9e6ec333ac552dc972c31cce23104cc27bcc2cf11bdc52d89ab36172915a41\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"ed a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1853467785/tls.crt::/tmp/serving-cert-1853467785/tls.key\\\\\\\"\\\\nI1216 06:55:39.731946 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 06:55:39.736431 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 06:55:39.736560 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 06:55:39.736611 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 06:55:39.736722 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 06:55:39.741676 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 06:55:39.741701 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 06:55:39.741707 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 06:55:39.741713 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 06:55:39.741717 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 06:55:39.741721 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 06:55:39.741725 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 06:55:39.741774 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 06:55:39.744309 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1853467785/tls.crt::/tmp/serving-cert-1853467785/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765868124\\\\\\\\\\\\\\\" (2025-12-16 06:55:23 +0000 UTC to 2026-01-15 06:55:24 +0000 UTC (now=2025-12-16 06:55:39.744275203 +0000 UTC))\\\\\\\"\\\\nF1216 06:55:39.744562 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c84e7f6950c8d433823b7d5ae3d8a9ecaa70ab6845ce607b8bb93efa990325a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{
\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84f11d2687959c8525b131966a998af137caa69a480bb31a94e1615a3ccbea5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84f11d2687959c8525b131966a998af137caa69a480bb31a94e1615a3ccbea5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:52Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:52 crc kubenswrapper[4823]: I1216 06:55:52.573725 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:52 crc kubenswrapper[4823]: I1216 06:55:52.573766 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:52 crc kubenswrapper[4823]: 
I1216 06:55:52.573807 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:52 crc kubenswrapper[4823]: I1216 06:55:52.573834 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:52 crc kubenswrapper[4823]: I1216 06:55:52.573847 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:52Z","lastTransitionTime":"2025-12-16T06:55:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:55:52 crc kubenswrapper[4823]: I1216 06:55:52.576633 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4c70365-dff4-4b29-af25-657fd9823db4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59ddc7310ed1600d050e488762436763220f3c2946a4c2020346da4ccef46b1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://275328a704b475b037c82f92055f9ca7f83b9821325603c5e34090140cc486ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7fa96323e345f8891f51500618a7b07d128d3e7256a3534d9dca75ff8ce9edf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d706c1463d2c913afe727db1a87a2841698d91a586c84d6b55f02865e7f5584a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:52Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:52 crc kubenswrapper[4823]: I1216 06:55:52.592227 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-964hc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a95a09e1-8457-4d5e-b1b6-dd6f59a66c6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bb1ede33a8f71c9116cfb9401b1f78a6aca07d580e4abe93370b470d6b20284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b305ee45caf9aff77bb738777dd9b2ebd2e3e63727b460eae22f8f66d1e6cec\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b305ee45caf9aff77bb738777dd9b2ebd2e3e63727b460eae22f8f66d1e6cec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d77b125c0d081148c1892229c933414ac8e1b8e0371f0932512deba1107e94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9d77b125c0d081148c1892229c933414ac8e1b8e0371f0932512deba1107e94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:42Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2048aefee6601e678e26821b3ed329aead74791b34654ef5dde9fe261e37b835\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2048aefee6601e678e26821b3ed329aead74791b34654ef5dde9fe261e37b835\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c574f
5f8d3f82a7a21c7f8e49e7927387be9e4029e28ba265d39ea058fb6a60f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c574f5f8d3f82a7a21c7f8e49e7927387be9e4029e28ba265d39ea058fb6a60f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e32bdec2a02013c087ccc13325d44061d85bcf9a352f2dc31ed506f11db6428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e32bdec2a02013c087ccc13325d44061d85bcf9a352f2dc31ed506f11db6428\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:47Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91427f2f713f7e1c2ba16e87252e2fd0479a31e6f94884ad85fe9f99ab84bedd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91427f2f713f7e1c2ba16e87252e2fd0479a31e6f94884ad85fe9f99ab84bedd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-964hc\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:52Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:52 crc kubenswrapper[4823]: I1216 06:55:52.606843 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:52Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:52 crc kubenswrapper[4823]: I1216 06:55:52.618906 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63ba735c509770e9d43fe8152b2a42dc4123d0d797c7d41d5e3efe39a6cef74d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e09486327bad82b741ca9260e306f84ed9f0fbc7058bb06267042b8d2cd5f818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:52Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:52 crc kubenswrapper[4823]: I1216 06:55:52.625643 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/42574a3f-0701-4192-b16c-bdb9be6c2888-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-v5mgh\" (UID: \"42574a3f-0701-4192-b16c-bdb9be6c2888\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v5mgh" Dec 16 06:55:52 crc kubenswrapper[4823]: I1216 06:55:52.625739 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/42574a3f-0701-4192-b16c-bdb9be6c2888-env-overrides\") pod \"ovnkube-control-plane-749d76644c-v5mgh\" (UID: \"42574a3f-0701-4192-b16c-bdb9be6c2888\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v5mgh" Dec 16 06:55:52 crc kubenswrapper[4823]: 
I1216 06:55:52.625806 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t54x6\" (UniqueName: \"kubernetes.io/projected/42574a3f-0701-4192-b16c-bdb9be6c2888-kube-api-access-t54x6\") pod \"ovnkube-control-plane-749d76644c-v5mgh\" (UID: \"42574a3f-0701-4192-b16c-bdb9be6c2888\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v5mgh" Dec 16 06:55:52 crc kubenswrapper[4823]: I1216 06:55:52.625840 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/42574a3f-0701-4192-b16c-bdb9be6c2888-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-v5mgh\" (UID: \"42574a3f-0701-4192-b16c-bdb9be6c2888\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v5mgh" Dec 16 06:55:52 crc kubenswrapper[4823]: I1216 06:55:52.700413 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:52Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:52 crc kubenswrapper[4823]: I1216 06:55:52.701176 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:52 crc kubenswrapper[4823]: I1216 06:55:52.701215 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:52 crc kubenswrapper[4823]: I1216 06:55:52.701223 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:52 crc kubenswrapper[4823]: I1216 
06:55:52.701237 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:52 crc kubenswrapper[4823]: I1216 06:55:52.701246 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:52Z","lastTransitionTime":"2025-12-16T06:55:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:55:52 crc kubenswrapper[4823]: I1216 06:55:52.714313 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72dfe197d43de2945a2ee22789ab86205c77588c4f4002e2473cef09e7b4b10b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\
\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:52Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:52 crc kubenswrapper[4823]: I1216 06:55:52.724501 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hr8h5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d31d032-9142-4e26-9f06-e3a5ea73d530\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://502e1707b8f53950b8f10fa6b289c9d235d6596fb6eec7e4d69c3158823f8e2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sdtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hr8h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:52Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:52 crc kubenswrapper[4823]: I1216 06:55:52.737129 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n248g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b377757-dbc6-4d9c-9656-3ff65d7d113a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78e6c48a7009b10b70f11bfc96f18839781aeccd794f9ce7a4074271cfb9becc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2skkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n248g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:52Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:52 crc kubenswrapper[4823]: I1216 06:55:52.745669 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:52 crc kubenswrapper[4823]: I1216 06:55:52.745719 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:52 crc kubenswrapper[4823]: I1216 06:55:52.745732 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:52 crc kubenswrapper[4823]: I1216 06:55:52.745751 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:52 crc kubenswrapper[4823]: I1216 06:55:52.745765 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:52Z","lastTransitionTime":"2025-12-16T06:55:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:55:52 crc kubenswrapper[4823]: I1216 06:55:52.747776 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bwcng" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ff057ef-c324-4465-8b8d-c7b98c25b23c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea36a181f0874fbab3022e6ce27567b65eb8c01cdb2925b08d9c0782af7e93c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2bj4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bwcng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:52Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:52 crc kubenswrapper[4823]: E1216 06:55:52.758212 4823 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:55:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:55:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:52Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:55:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:55:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2caa91d7-bd83-4de0-9038-0514886c6d71\\\",\\\"systemUUID\\\":\\\"b35231f6-d02a-487d-8117-57547d768cbe\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:52Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:52 crc kubenswrapper[4823]: I1216 06:55:52.761162 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:52 crc kubenswrapper[4823]: I1216 06:55:52.761186 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:52 crc kubenswrapper[4823]: I1216 06:55:52.761193 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:52 crc kubenswrapper[4823]: I1216 06:55:52.761207 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:52 crc kubenswrapper[4823]: I1216 06:55:52.761215 4823 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:52Z","lastTransitionTime":"2025-12-16T06:55:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:55:52 crc kubenswrapper[4823]: I1216 06:55:52.769821 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08e48f89-7095-4ea2-afb5-759591c2b0d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3d0f51f7ac1df7a9ae61306ccbfe2c93b397b6ab9ae0a477ab4eb46248d36f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b768d92db57918a41c53b83940957c0181865808eb2c795231a5f8dbbf2bde82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f2cea674c0936cca4e9e95dfb7cd15bf9ccb3fbf8b711f351f87b632aad65c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://055585c1217ea720b929c9da82e7ea662d6fd5634244d362b3b425faee380d92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:42Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1c6cb023dcc6d1b149de201b108e3b1f3e32530dadc822aedede5552efc586b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://304f9e1fee6b648f61b3b334284a954e8107ff70ca71d01b586d137490eeab20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a17f6108d8783f74d8244ae56a7e5fb1c1835c7cf4e48c6b0aaa5e8ec4a5c843\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00935cc6d346797a92ef9e563af373278845a965676a245ecd98969758468383\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T06:55:50Z\\\",\\\"message\\\":\\\":55:50.835539 6074 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1216 06:55:50.835626 6074 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1216 06:55:50.835693 6074 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) 
from k8s.io/client-go/informers/factory.go:160\\\\nI1216 06:55:50.836226 6074 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1216 06:55:50.836261 6074 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1216 06:55:50.836286 6074 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1216 06:55:50.836302 6074 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1216 06:55:50.836337 6074 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1216 06:55:50.836340 6074 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1216 06:55:50.836351 6074 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1216 06:55:50.836356 6074 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1216 06:55:50.836366 6074 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1216 06:55:50.836375 6074 factory.go:656] Stopping watch factory\\\\nI1216 06:55:50.836586 6074 ovnkube.go:599] Stopped ovnkube\\\\nI1216 
06\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:47Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c223f5207e2445fd4bfd1bfe2c1c1ca50b5211a92b02c21445534fbfa2709e28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e3566766c9c24f5c12481b74ed8a3938b21bdb36c003146515c67170d74a71f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e3566766c9c24f5c12481b74ed8a3938b21bdb36c003146515c67170d74a71f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zwjhk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:52Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:52 crc kubenswrapper[4823]: E1216 06:55:52.773445 4823 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:55:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:55:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:55:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:55:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2caa91d7-bd83-4de0-9038-0514886c6d71\\\",\\\"systemUUID\\\":\\\"b35231f6-d02a-487d-8117-57547d768cbe\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:52Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:52 crc kubenswrapper[4823]: I1216 06:55:52.776368 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:52 crc kubenswrapper[4823]: I1216 06:55:52.776401 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:52 crc kubenswrapper[4823]: I1216 06:55:52.776411 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:52 crc kubenswrapper[4823]: I1216 06:55:52.776432 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:52 crc kubenswrapper[4823]: I1216 06:55:52.776451 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:52Z","lastTransitionTime":"2025-12-16T06:55:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:55:52 crc kubenswrapper[4823]: E1216 06:55:52.787731 4823 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:55:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:55:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:55:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:55:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2caa91d7-bd83-4de0-9038-0514886c6d71\\\",\\\"systemUUID\\\":\\\"b35231f6-d02a-487d-8117-57547d768cbe\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:52Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:52 crc kubenswrapper[4823]: I1216 06:55:52.789220 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3de3a2ad-dc6c-47ba-af8d-4f128e025aad\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c40d3d73f70c8983adc8d076d89864e7224528dae97252396a1c34bd4cb804e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"
mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50bb40e9d22674554f2a267df7c9f924a29131ac986877bf19572cc5992ba396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9c22b0787554b4bdf0b0068fc696b8515e2ee63affef23e5eb64f77bc32a624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"co
ntainerID\\\":\\\"cri-o://a32894b8f22c5335ed585c26fcab727324803d768943f40455a592025cbfd0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1560fb0ba0c40a2797688b518c22c9164a8cbe9a265cb2ad95408ba86b0fb537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3d6894d867564eea62c7d78dd6ca62a9913787ba06b786722e478785ba796ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3d6894d867564eea62c7d78dd6ca62a9913787ba06b786722e478785ba796ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e551b627e90b56385fe5618d3e66ea7376a2d574589583c005373aa29db1d410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e551b627e90b56385fe5618d3e66ea7376a2d574589583c005373aa29db1d410\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://631458d415d64c3cd4dfd24771644393aa6c01288de8f7a5395cf8888db67d2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCou
nt\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://631458d415d64c3cd4dfd24771644393aa6c01288de8f7a5395cf8888db67d2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:52Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:52 crc kubenswrapper[4823]: I1216 06:55:52.791517 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:52 crc kubenswrapper[4823]: I1216 06:55:52.791559 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:52 crc kubenswrapper[4823]: I1216 06:55:52.791569 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:52 crc kubenswrapper[4823]: I1216 06:55:52.791587 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:52 crc kubenswrapper[4823]: I1216 06:55:52.791602 4823 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:52Z","lastTransitionTime":"2025-12-16T06:55:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:55:52 crc kubenswrapper[4823]: I1216 06:55:52.800489 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/42574a3f-0701-4192-b16c-bdb9be6c2888-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-v5mgh\" (UID: \"42574a3f-0701-4192-b16c-bdb9be6c2888\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v5mgh" Dec 16 06:55:52 crc kubenswrapper[4823]: I1216 06:55:52.800563 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/42574a3f-0701-4192-b16c-bdb9be6c2888-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-v5mgh\" (UID: \"42574a3f-0701-4192-b16c-bdb9be6c2888\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v5mgh" Dec 16 06:55:52 crc kubenswrapper[4823]: I1216 06:55:52.800657 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/42574a3f-0701-4192-b16c-bdb9be6c2888-env-overrides\") pod \"ovnkube-control-plane-749d76644c-v5mgh\" (UID: \"42574a3f-0701-4192-b16c-bdb9be6c2888\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v5mgh" Dec 16 06:55:52 crc kubenswrapper[4823]: I1216 06:55:52.800733 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t54x6\" (UniqueName: \"kubernetes.io/projected/42574a3f-0701-4192-b16c-bdb9be6c2888-kube-api-access-t54x6\") pod 
\"ovnkube-control-plane-749d76644c-v5mgh\" (UID: \"42574a3f-0701-4192-b16c-bdb9be6c2888\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v5mgh" Dec 16 06:55:52 crc kubenswrapper[4823]: I1216 06:55:52.802060 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/42574a3f-0701-4192-b16c-bdb9be6c2888-env-overrides\") pod \"ovnkube-control-plane-749d76644c-v5mgh\" (UID: \"42574a3f-0701-4192-b16c-bdb9be6c2888\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v5mgh" Dec 16 06:55:52 crc kubenswrapper[4823]: I1216 06:55:52.802257 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/42574a3f-0701-4192-b16c-bdb9be6c2888-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-v5mgh\" (UID: \"42574a3f-0701-4192-b16c-bdb9be6c2888\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v5mgh" Dec 16 06:55:52 crc kubenswrapper[4823]: I1216 06:55:52.803519 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e388fa291257a537c122a1f3f0eb6a60eb74e4bcaaf32c4fe829be2da1bfdae9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:52Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:52 crc kubenswrapper[4823]: E1216 06:55:52.805375 4823 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:55:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:55:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:55:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:55:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2caa91d7-bd83-4de0-9038-0514886c6d71\\\",\\\"systemUUID\\\":\\\"b35231f6-d02a-487d-8117-57547d768cbe\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:52Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:52 crc kubenswrapper[4823]: I1216 06:55:52.809610 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:52 crc kubenswrapper[4823]: I1216 06:55:52.809653 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:52 crc kubenswrapper[4823]: I1216 06:55:52.809665 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:52 crc kubenswrapper[4823]: I1216 06:55:52.809681 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:52 crc kubenswrapper[4823]: I1216 06:55:52.809690 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:52Z","lastTransitionTime":"2025-12-16T06:55:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:55:52 crc kubenswrapper[4823]: I1216 06:55:52.809998 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/42574a3f-0701-4192-b16c-bdb9be6c2888-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-v5mgh\" (UID: \"42574a3f-0701-4192-b16c-bdb9be6c2888\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v5mgh" Dec 16 06:55:52 crc kubenswrapper[4823]: I1216 06:55:52.817228 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:52Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:52 crc kubenswrapper[4823]: I1216 06:55:52.818973 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t54x6\" (UniqueName: \"kubernetes.io/projected/42574a3f-0701-4192-b16c-bdb9be6c2888-kube-api-access-t54x6\") pod \"ovnkube-control-plane-749d76644c-v5mgh\" (UID: \"42574a3f-0701-4192-b16c-bdb9be6c2888\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v5mgh" Dec 16 06:55:52 crc kubenswrapper[4823]: E1216 06:55:52.821584 4823 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:55:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:55:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:55:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:55:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2caa91d7-bd83-4de0-9038-0514886c6d71\\\",\\\"systemUUID\\\":\\\"b35231f6-d02a-487d-8117-57547d768cbe\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:52Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:52 crc kubenswrapper[4823]: E1216 06:55:52.821712 4823 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 16 06:55:52 crc kubenswrapper[4823]: I1216 06:55:52.825106 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:52 crc kubenswrapper[4823]: I1216 06:55:52.825144 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:52 crc kubenswrapper[4823]: I1216 06:55:52.825157 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:52 crc kubenswrapper[4823]: I1216 06:55:52.825174 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:52 crc kubenswrapper[4823]: I1216 06:55:52.825185 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:52Z","lastTransitionTime":"2025-12-16T06:55:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:55:52 crc kubenswrapper[4823]: I1216 06:55:52.829681 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25dec47c-3043-486c-b371-2be103c214e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2bf5eb4e2f587f7084f731ca681a116313b1014b02cdb391b09fdcdd42600c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmpf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78aed5cdf8d3a29dfffcdabfd18b38c0cf76225853078e01602198bee4994536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmpf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fv56f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:52Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:52 crc kubenswrapper[4823]: I1216 06:55:52.841956 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v5mgh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"42574a3f-0701-4192-b16c-bdb9be6c2888\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t54x6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t54x6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-v5mgh\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:52Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:52 crc kubenswrapper[4823]: I1216 06:55:52.928116 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:52 crc kubenswrapper[4823]: I1216 06:55:52.928157 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:52 crc kubenswrapper[4823]: I1216 06:55:52.928167 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:52 crc kubenswrapper[4823]: I1216 06:55:52.928181 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:52 crc kubenswrapper[4823]: I1216 06:55:52.928191 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:52Z","lastTransitionTime":"2025-12-16T06:55:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:55:53 crc kubenswrapper[4823]: I1216 06:55:53.008362 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v5mgh" Dec 16 06:55:53 crc kubenswrapper[4823]: W1216 06:55:53.021708 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42574a3f_0701_4192_b16c_bdb9be6c2888.slice/crio-79a226ee1b34ce3fabfe11feac9b7299a7a4597e64edf8ebeda444379b3e8844 WatchSource:0}: Error finding container 79a226ee1b34ce3fabfe11feac9b7299a7a4597e64edf8ebeda444379b3e8844: Status 404 returned error can't find the container with id 79a226ee1b34ce3fabfe11feac9b7299a7a4597e64edf8ebeda444379b3e8844 Dec 16 06:55:53 crc kubenswrapper[4823]: I1216 06:55:53.031542 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:53 crc kubenswrapper[4823]: I1216 06:55:53.031581 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:53 crc kubenswrapper[4823]: I1216 06:55:53.031590 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:53 crc kubenswrapper[4823]: I1216 06:55:53.031607 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:53 crc kubenswrapper[4823]: I1216 06:55:53.031617 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:53Z","lastTransitionTime":"2025-12-16T06:55:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:55:53 crc kubenswrapper[4823]: I1216 06:55:53.061237 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v5mgh" event={"ID":"42574a3f-0701-4192-b16c-bdb9be6c2888","Type":"ContainerStarted","Data":"79a226ee1b34ce3fabfe11feac9b7299a7a4597e64edf8ebeda444379b3e8844"} Dec 16 06:55:53 crc kubenswrapper[4823]: I1216 06:55:53.063697 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zwjhk_08e48f89-7095-4ea2-afb5-759591c2b0d4/ovnkube-controller/1.log" Dec 16 06:55:53 crc kubenswrapper[4823]: I1216 06:55:53.064452 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zwjhk_08e48f89-7095-4ea2-afb5-759591c2b0d4/ovnkube-controller/0.log" Dec 16 06:55:53 crc kubenswrapper[4823]: I1216 06:55:53.067935 4823 generic.go:334] "Generic (PLEG): container finished" podID="08e48f89-7095-4ea2-afb5-759591c2b0d4" containerID="a17f6108d8783f74d8244ae56a7e5fb1c1835c7cf4e48c6b0aaa5e8ec4a5c843" exitCode=1 Dec 16 06:55:53 crc kubenswrapper[4823]: I1216 06:55:53.068050 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" event={"ID":"08e48f89-7095-4ea2-afb5-759591c2b0d4","Type":"ContainerDied","Data":"a17f6108d8783f74d8244ae56a7e5fb1c1835c7cf4e48c6b0aaa5e8ec4a5c843"} Dec 16 06:55:53 crc kubenswrapper[4823]: I1216 06:55:53.068275 4823 scope.go:117] "RemoveContainer" containerID="00935cc6d346797a92ef9e563af373278845a965676a245ecd98969758468383" Dec 16 06:55:53 crc kubenswrapper[4823]: I1216 06:55:53.069179 4823 scope.go:117] "RemoveContainer" containerID="a17f6108d8783f74d8244ae56a7e5fb1c1835c7cf4e48c6b0aaa5e8ec4a5c843" Dec 16 06:55:53 crc kubenswrapper[4823]: E1216 06:55:53.069432 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s 
restarting failed container=ovnkube-controller pod=ovnkube-node-zwjhk_openshift-ovn-kubernetes(08e48f89-7095-4ea2-afb5-759591c2b0d4)\"" pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" podUID="08e48f89-7095-4ea2-afb5-759591c2b0d4" Dec 16 06:55:53 crc kubenswrapper[4823]: I1216 06:55:53.084451 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-964hc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a95a09e1-8457-4d5e-b1b6-dd6f59a66c6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bb1ede33a8f71c9116cfb9401b1f78a6aca07d580e4abe93370b470d6b20284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T
06:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b305ee45caf9aff77bb738777dd9b2ebd2e3e63727b460eae22f8f66d1e6cec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b305ee45caf9aff77bb738777dd9b2ebd2e3e63727b460eae22f8f66d1e6cec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d77b125c0d081148c1892229c933414ac8e1b8e0371f0932512deba1107e94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c85
7df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9d77b125c0d081148c1892229c933414ac8e1b8e0371f0932512deba1107e94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2048aefee6601e678e26821b3ed329aead74791b34654ef5dde9fe261e37b835\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2048aefee6601e678e26821b3ed329aead74791b34654ef5dde9fe261e37b835\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"
},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c574f5f8d3f82a7a21c7f8e49e7927387be9e4029e28ba265d39ea058fb6a60f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c574f5f8d3f82a7a21c7f8e49e7927387be9e4029e28ba265d39ea058fb6a60f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e32bdec2a02013c087ccc13325d44061d85bcf9a352f2dc31ed506f11db6428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e32bdec2a02013c087ccc13325d44061d85bcf9a352f2dc31ed506f11db6428\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91427f2f713f7e1c2ba16e87252e2fd0479a31e6f94884ad85fe9f99ab84bedd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91427f2f713f7e1c2ba16e87252e2fd0479a31e6f94884ad85fe9f99ab84bedd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":
\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-964hc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:53Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:53 crc kubenswrapper[4823]: I1216 06:55:53.100541 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:53Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:53 crc kubenswrapper[4823]: I1216 06:55:53.115331 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63ba735c509770e9d43fe8152b2a42dc4123d0d797c7d41d5e3efe39a6cef74d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e09486327bad82b741ca9260e306f84ed9f0fbc7058bb06267042b8d2cd5f818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:53Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:53 crc kubenswrapper[4823]: I1216 06:55:53.128517 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:53Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:53 crc kubenswrapper[4823]: I1216 06:55:53.137241 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:53 crc kubenswrapper[4823]: I1216 06:55:53.137288 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:53 crc 
kubenswrapper[4823]: I1216 06:55:53.137299 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:53 crc kubenswrapper[4823]: I1216 06:55:53.137317 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:53 crc kubenswrapper[4823]: I1216 06:55:53.137328 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:53Z","lastTransitionTime":"2025-12-16T06:55:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:55:53 crc kubenswrapper[4823]: I1216 06:55:53.140098 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72dfe197d43de2945a2ee22789ab86205c77588c4f4002e2473cef09e7b4b10b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-16T06:55:53Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:53 crc kubenswrapper[4823]: I1216 06:55:53.150771 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hr8h5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d31d032-9142-4e26-9f06-e3a5ea73d530\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://502e1707b8f53950b8f10fa6b289c9d235d6596fb6eec7e4d69c3158823f8e2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-7sdtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hr8h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:53Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:53 crc kubenswrapper[4823]: I1216 06:55:53.163445 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n248g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b377757-dbc6-4d9c-9656-3ff65d7d113a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78e6c48a7009b10b70f11bfc96f18839781aeccd794f9ce7a4074271cfb9bec
c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-2skkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n248g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:53Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:53 crc kubenswrapper[4823]: I1216 06:55:53.175110 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bwcng" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ff057ef-c324-4465-8b8d-c7b98c25b23c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea36a181f0874fbab3022e6ce27567b65eb8c01cdb2925b08d9c0782af7e9
3c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2bj4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bwcng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:53Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:53 crc kubenswrapper[4823]: I1216 06:55:53.193240 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"08e48f89-7095-4ea2-afb5-759591c2b0d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3d0f51f7ac1df7a9ae61306ccbfe2c93b397b6ab9ae0a477ab4eb46248d36f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b768d92db57918a41c53b83940957c0181865808eb2c795231a5f8dbbf2bde82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f2cea674c0936cca4e9e95dfb7cd15bf9ccb3fbf8b711f351f87b632aad65c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://055585c1217ea720b929c9da82e7ea662d6fd5634244d362b3b425faee380d92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:42Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1c6cb023dcc6d1b149de201b108e3b1f3e32530dadc822aedede5552efc586b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://304f9e1fee6b648f61b3b334284a954e8107ff70ca71d01b586d137490eeab20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a17f6108d8783f74d8244ae56a7e5fb1c1835c7cf4e48c6b0aaa5e8ec4a5c843\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00935cc6d346797a92ef9e563af373278845a965676a245ecd98969758468383\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T06:55:50Z\\\",\\\"message\\\":\\\":55:50.835539 6074 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1216 06:55:50.835626 6074 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1216 06:55:50.835693 6074 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) 
from k8s.io/client-go/informers/factory.go:160\\\\nI1216 06:55:50.836226 6074 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1216 06:55:50.836261 6074 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1216 06:55:50.836286 6074 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1216 06:55:50.836302 6074 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1216 06:55:50.836337 6074 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1216 06:55:50.836340 6074 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1216 06:55:50.836351 6074 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1216 06:55:50.836356 6074 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1216 06:55:50.836366 6074 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1216 06:55:50.836375 6074 factory.go:656] Stopping watch factory\\\\nI1216 06:55:50.836586 6074 ovnkube.go:599] Stopped ovnkube\\\\nI1216 06\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:47Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a17f6108d8783f74d8244ae56a7e5fb1c1835c7cf4e48c6b0aaa5e8ec4a5c843\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T06:55:52Z\\\",\\\"message\\\":\\\"s{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1216 06:55:51.958057 6251 services_controller.go:444] Built service openshift-console/downloads LB per-node configs for network=default: []services.lbConfig(nil)\\\\nF1216 06:55:51.958062 6251 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, 
handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:51Z is after 2025-08-24T17:21:41Z]\\\\nI1216 06:55:51.958072 6251 obj_retry.go:303] Retry object setup: *v1.Pod openshift-image-registry/node-ca-bwcng\\\\nI1216 06:55:51.958052 6251 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath
\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c223f5207e2445fd4bfd1bfe2c1c1ca50b5211a92b02c21445534fbfa2709e28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access
-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e3566766c9c24f5c12481b74ed8a3938b21bdb36c003146515c67170d74a71f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e3566766c9c24f5c12481b74ed8a3938b21bdb36c003146515c67170d74a71f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zwjhk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:53Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:53 crc kubenswrapper[4823]: I1216 06:55:53.210466 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3de3a2ad-dc6c-47ba-af8d-4f128e025aad\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c40d3d73f70c8983adc8d076d89864e7224528dae97252396a1c34bd4cb804e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50bb40e9d22674554f2a267df7c9f924a29131ac986877bf19572cc5992ba396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9c22b0787554b4bdf0b0068fc696b8515e2ee63affef23e5eb64f77bc32a624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a32894b8f22c5335ed585c26fcab727324803d768943f40455a592025cbfd0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\
\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1560fb0ba0c40a2797688b518c22c9164a8cbe9a265cb2ad95408ba86b0fb537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3d6894d867564eea62c7d78dd6ca62a9913787ba06b786722e478785ba796ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3d6894d867564eea62c7d78dd6ca62a9913787ba06b786722e478785ba796ee\\\",\\\"exitCode\\\":0,\\\"fini
shedAt\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e551b627e90b56385fe5618d3e66ea7376a2d574589583c005373aa29db1d410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e551b627e90b56385fe5618d3e66ea7376a2d574589583c005373aa29db1d410\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://631458d415d64c3cd4dfd24771644393aa6c01288de8f7a5395cf8888db67d2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://631458d415d64c3cd4dfd24771644393aa6c01288de8f7a5395cf8888db67d2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"na
me\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:53Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:53 crc kubenswrapper[4823]: I1216 06:55:53.223317 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e388fa291257a537c122a1f3f0eb6a60eb74e4bcaaf32c4fe829be2da1bfdae9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,
\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:53Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:53 crc kubenswrapper[4823]: I1216 06:55:53.235653 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:53Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:53 crc kubenswrapper[4823]: I1216 06:55:53.239287 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:53 crc kubenswrapper[4823]: I1216 06:55:53.239328 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:53 crc kubenswrapper[4823]: I1216 06:55:53.239340 4823 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:53 crc kubenswrapper[4823]: I1216 06:55:53.239358 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:53 crc kubenswrapper[4823]: I1216 06:55:53.239371 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:53Z","lastTransitionTime":"2025-12-16T06:55:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:55:53 crc kubenswrapper[4823]: I1216 06:55:53.249840 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25dec47c-3043-486c-b371-2be103c214e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2bf5eb4e2f
587f7084f731ca681a116313b1014b02cdb391b09fdcdd42600c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmpf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78aed5cdf8d3a29dfffcdabfd18b38c0cf76225853078e01602198bee4994536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmpf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"s
tartTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fv56f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:53Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:53 crc kubenswrapper[4823]: I1216 06:55:53.261650 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v5mgh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42574a3f-0701-4192-b16c-bdb9be6c2888\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t54x6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t54x6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-v5mgh\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:53Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:53 crc kubenswrapper[4823]: I1216 06:55:53.276314 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c915dd50-9820-494e-b47a-987257910a57\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c33a30aed9756ef012ff9046309ec07897de3b17707ea39c621534ed863c3adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/et
c/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://464ab3cf14920ae3f947ce3dccacd7379d50380699008411d0a51bd83fe3e51c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20f7e0e3c07f467643afdcc6f0e333353d2d10b6d0c0d7aff0d347d33a5d925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ddfb7d6dae98465369260c8e9c64a2aa21dbf226e7898f6de0c36b3806bcae8\\\",\\\"image\\\":\\\"quay.io/crcont/openshif
t-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c9e6ec333ac552dc972c31cce23104cc27bcc2cf11bdc52d89ab36172915a41\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"ed a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1853467785/tls.crt::/tmp/serving-cert-1853467785/tls.key\\\\\\\"\\\\nI1216 06:55:39.731946 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 06:55:39.736431 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 06:55:39.736560 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 06:55:39.736611 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 06:55:39.736722 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 06:55:39.741676 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 06:55:39.741701 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 06:55:39.741707 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 06:55:39.741713 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 06:55:39.741717 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 06:55:39.741721 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 06:55:39.741725 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 
06:55:39.741774 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 06:55:39.744309 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1853467785/tls.crt::/tmp/serving-cert-1853467785/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765868124\\\\\\\\\\\\\\\" (2025-12-16 06:55:23 +0000 UTC to 2026-01-15 06:55:24 +0000 UTC (now=2025-12-16 06:55:39.744275203 +0000 UTC))\\\\\\\"\\\\nF1216 06:55:39.744562 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c84e7f6950c8d433823b7d5ae3d8a9ecaa70ab6845ce607b8bb93efa990325a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84f
11d2687959c8525b131966a998af137caa69a480bb31a94e1615a3ccbea5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84f11d2687959c8525b131966a998af137caa69a480bb31a94e1615a3ccbea5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:53Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:53 crc kubenswrapper[4823]: I1216 06:55:53.293701 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4c70365-dff4-4b29-af25-657fd9823db4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59ddc7310ed1600d050e488762436763220f3c2946a4c2020346da4ccef46b1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://275328a704b475b037c82f92055f9ca7f83b9821325603c5e34090140cc486ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7fa96323e345f8891f51500618a7b07d128d3e7256a3534d9dca75ff8ce9edf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d706c1463d2c913afe727db1a87a2841698d91a586c84d6b55f02865e7f5584a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:53Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:53 crc kubenswrapper[4823]: I1216 06:55:53.341599 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:53 crc kubenswrapper[4823]: I1216 06:55:53.341641 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:53 crc kubenswrapper[4823]: I1216 06:55:53.341653 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:53 crc kubenswrapper[4823]: I1216 06:55:53.341668 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:53 crc kubenswrapper[4823]: I1216 06:55:53.341678 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:53Z","lastTransitionTime":"2025-12-16T06:55:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:55:53 crc kubenswrapper[4823]: I1216 06:55:53.444446 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:53 crc kubenswrapper[4823]: I1216 06:55:53.444504 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:53 crc kubenswrapper[4823]: I1216 06:55:53.444516 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:53 crc kubenswrapper[4823]: I1216 06:55:53.444538 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:53 crc kubenswrapper[4823]: I1216 06:55:53.444549 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:53Z","lastTransitionTime":"2025-12-16T06:55:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:55:53 crc kubenswrapper[4823]: I1216 06:55:53.546771 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:53 crc kubenswrapper[4823]: I1216 06:55:53.546816 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:53 crc kubenswrapper[4823]: I1216 06:55:53.546830 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:53 crc kubenswrapper[4823]: I1216 06:55:53.546845 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:53 crc kubenswrapper[4823]: I1216 06:55:53.546855 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:53Z","lastTransitionTime":"2025-12-16T06:55:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:55:53 crc kubenswrapper[4823]: I1216 06:55:53.635886 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-8mn7l"] Dec 16 06:55:53 crc kubenswrapper[4823]: I1216 06:55:53.636451 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8mn7l" Dec 16 06:55:53 crc kubenswrapper[4823]: E1216 06:55:53.636716 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8mn7l" podUID="1e7dd738-a9b3-455c-93e0-3f0dc7327817" Dec 16 06:55:53 crc kubenswrapper[4823]: I1216 06:55:53.649344 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:53 crc kubenswrapper[4823]: I1216 06:55:53.649390 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:53 crc kubenswrapper[4823]: I1216 06:55:53.649403 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:53 crc kubenswrapper[4823]: I1216 06:55:53.649419 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:53 crc kubenswrapper[4823]: I1216 06:55:53.649433 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:53Z","lastTransitionTime":"2025-12-16T06:55:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:55:53 crc kubenswrapper[4823]: I1216 06:55:53.651509 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c915dd50-9820-494e-b47a-987257910a57\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c33a30aed9756ef012ff9046309ec07897de3b17707ea39c621534ed863c3adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://464ab3cf14920ae3f947ce3dccacd7379d50380699008411d0a51bd83fe3e51c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20f7e0e3c07f467643afdcc6f0e333353d2d10b6d0c0d7aff0d347d33a5d925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ddfb7d6dae98465369260c8e9c64a2aa21dbf226e7898f6de0c36b3806bcae8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c9e6ec333ac552dc972c31cce23104cc27bcc2cf11bdc52d89ab36172915a41\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"ed a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1853467785/tls.crt::/tmp/serving-cert-1853467785/tls.key\\\\\\\"\\\\nI1216 06:55:39.731946 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 06:55:39.736431 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 06:55:39.736560 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 06:55:39.736611 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 06:55:39.736722 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 06:55:39.741676 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 06:55:39.741701 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 06:55:39.741707 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 06:55:39.741713 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 06:55:39.741717 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 06:55:39.741721 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 06:55:39.741725 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 06:55:39.741774 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 06:55:39.744309 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" 
certName=\\\\\\\"serving-cert::/tmp/serving-cert-1853467785/tls.crt::/tmp/serving-cert-1853467785/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765868124\\\\\\\\\\\\\\\" (2025-12-16 06:55:23 +0000 UTC to 2026-01-15 06:55:24 +0000 UTC (now=2025-12-16 06:55:39.744275203 +0000 UTC))\\\\\\\"\\\\nF1216 06:55:39.744562 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c84e7f6950c8d433823b7d5ae3d8a9ecaa70ab6845ce607b8bb93efa990325a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84f11d2687959c8525b131966a998af137caa69a480bb31a94e1615a3ccbea5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"image
ID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84f11d2687959c8525b131966a998af137caa69a480bb31a94e1615a3ccbea5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:53Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:53 crc kubenswrapper[4823]: I1216 06:55:53.664164 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4c70365-dff4-4b29-af25-657fd9823db4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59ddc7310ed1600d050e488762436763220f3c2946a4c2020346da4ccef46b1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://275328a704b475b037c82f92055f9ca7f83b9821325603c5e34090140cc486ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7fa96323e345f8891f51500618a7b07d128d3e7256a3534d9dca75ff8ce9edf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d706c1463d2c913afe727db1a87a2841698d91a586c84d6b55f02865e7f5584a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:53Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:53 crc kubenswrapper[4823]: I1216 06:55:53.674235 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8mn7l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e7dd738-a9b3-455c-93e0-3f0dc7327817\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2crtc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2crtc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8mn7l\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:53Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:53 crc kubenswrapper[4823]: I1216 06:55:53.690778 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-964hc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a95a09e1-8457-4d5e-b1b6-dd6f59a66c6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bb1ede33a8f71c9116cfb9401b1f78a6aca07d580e4abe93370b470d6b20284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\
":{\\\"startedAt\\\":\\\"2025-12-16T06:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b305ee45caf9aff77bb738777dd9b2ebd2e3e63727b460eae22f8f66d1e6cec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b305ee45caf9aff77bb738777dd9b2ebd2e3e63727b460eae22f8f66d1e6cec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d77b125c0d081148c1892229c933414ac8e1b8e0371f0932512deba1107e94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9d77b125c0d081148c1892229c933414ac8e1b8e0371f0932512deba1107e94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2048aefee6601e678e26821b3ed329aead74791b34654ef5dde9fe261e37b835\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2048aefee6601e678e26821b3ed329aead74791b34654ef5dde9fe261e37b835\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c574f5f8d3f82a7a21c7f8e49e7927387be9e4029e28ba265d39ea058fb6a60f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c574f5f8d3f82a7a21c7f8e49e7927387be9e4029e28ba265d39ea058fb6a60f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e32bdec2a02013c087ccc13325d44061d85bcf9a352f2dc31ed506f11db6428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e32bdec2a02013c087ccc13325d44061d85bcf9a352f2dc31ed506f11db6428\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91427f2f713f7e1c2ba16e87252e2fd0479a31e6f94884ad85fe9f99ab84bedd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91427f2f713f7e1c2ba16e87252e2fd0479a31e6f94884ad85fe9f99ab84bedd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"sys
tem-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-964hc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:53Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:53 crc kubenswrapper[4823]: I1216 06:55:53.702346 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:53Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:53 crc kubenswrapper[4823]: I1216 06:55:53.710732 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1e7dd738-a9b3-455c-93e0-3f0dc7327817-metrics-certs\") pod \"network-metrics-daemon-8mn7l\" (UID: \"1e7dd738-a9b3-455c-93e0-3f0dc7327817\") " pod="openshift-multus/network-metrics-daemon-8mn7l" Dec 16 06:55:53 crc kubenswrapper[4823]: I1216 06:55:53.710784 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-2crtc\" (UniqueName: \"kubernetes.io/projected/1e7dd738-a9b3-455c-93e0-3f0dc7327817-kube-api-access-2crtc\") pod \"network-metrics-daemon-8mn7l\" (UID: \"1e7dd738-a9b3-455c-93e0-3f0dc7327817\") " pod="openshift-multus/network-metrics-daemon-8mn7l" Dec 16 06:55:53 crc kubenswrapper[4823]: I1216 06:55:53.715199 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63ba735c509770e9d43fe8152b2a42dc4123d0d797c7d41d5e3efe39a6cef74d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/k
ubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e09486327bad82b741ca9260e306f84ed9f0fbc7058bb06267042b8d2cd5f818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:53Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:53 crc kubenswrapper[4823]: I1216 06:55:53.726018 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:53Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:53 crc kubenswrapper[4823]: I1216 06:55:53.735978 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72dfe197d43de2945a2ee22789ab86205c77588c4f4002e2473cef09e7b4b10b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-16T06:55:53Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:53 crc kubenswrapper[4823]: I1216 06:55:53.748550 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hr8h5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d31d032-9142-4e26-9f06-e3a5ea73d530\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://502e1707b8f53950b8f10fa6b289c9d235d6596fb6eec7e4d69c3158823f8e2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-7sdtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hr8h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:53Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:53 crc kubenswrapper[4823]: I1216 06:55:53.751885 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:53 crc kubenswrapper[4823]: I1216 06:55:53.751930 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:53 crc kubenswrapper[4823]: I1216 06:55:53.751945 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:53 crc kubenswrapper[4823]: I1216 06:55:53.751966 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:53 crc kubenswrapper[4823]: I1216 06:55:53.751981 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:53Z","lastTransitionTime":"2025-12-16T06:55:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:55:53 crc kubenswrapper[4823]: I1216 06:55:53.761986 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n248g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b377757-dbc6-4d9c-9656-3ff65d7d113a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78e6c48a7009b10b70f11bfc96f18839781aeccd794f9ce7a4074271cfb9becc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2skkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n248g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:53Z 
is after 2025-08-24T17:21:41Z" Dec 16 06:55:53 crc kubenswrapper[4823]: I1216 06:55:53.771126 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bwcng" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ff057ef-c324-4465-8b8d-c7b98c25b23c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea36a181f0874fbab3022e6ce27567b65eb8c01cdb2925b08d9c0782af7e93c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2bj4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bwcng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:53Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:53 crc kubenswrapper[4823]: I1216 06:55:53.771338 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:55:53 crc kubenswrapper[4823]: I1216 06:55:53.771424 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:55:53 crc kubenswrapper[4823]: I1216 06:55:53.771330 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:55:53 crc kubenswrapper[4823]: E1216 06:55:53.771571 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 06:55:53 crc kubenswrapper[4823]: E1216 06:55:53.771674 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 06:55:53 crc kubenswrapper[4823]: E1216 06:55:53.771871 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 06:55:53 crc kubenswrapper[4823]: I1216 06:55:53.793180 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"08e48f89-7095-4ea2-afb5-759591c2b0d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3d0f51f7ac1df7a9ae61306ccbfe2c93b397b6ab9ae0a477ab4eb46248d36f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b768d92db57918a41c53b83940957c0181865808eb2c795231a5f8dbbf2bde82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f2cea674c0936cca4e9e95dfb7cd15bf9ccb3fbf8b711f351f87b632aad65c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://055585c1217ea720b929c9da82e7ea662d6fd5634244d362b3b425faee380d92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:42Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1c6cb023dcc6d1b149de201b108e3b1f3e32530dadc822aedede5552efc586b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://304f9e1fee6b648f61b3b334284a954e8107ff70ca71d01b586d137490eeab20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a17f6108d8783f74d8244ae56a7e5fb1c1835c7cf4e48c6b0aaa5e8ec4a5c843\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00935cc6d346797a92ef9e563af373278845a965676a245ecd98969758468383\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T06:55:50Z\\\",\\\"message\\\":\\\":55:50.835539 6074 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1216 06:55:50.835626 6074 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1216 06:55:50.835693 6074 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) 
from k8s.io/client-go/informers/factory.go:160\\\\nI1216 06:55:50.836226 6074 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1216 06:55:50.836261 6074 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1216 06:55:50.836286 6074 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1216 06:55:50.836302 6074 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1216 06:55:50.836337 6074 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1216 06:55:50.836340 6074 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1216 06:55:50.836351 6074 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1216 06:55:50.836356 6074 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1216 06:55:50.836366 6074 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1216 06:55:50.836375 6074 factory.go:656] Stopping watch factory\\\\nI1216 06:55:50.836586 6074 ovnkube.go:599] Stopped ovnkube\\\\nI1216 06\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:47Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a17f6108d8783f74d8244ae56a7e5fb1c1835c7cf4e48c6b0aaa5e8ec4a5c843\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T06:55:52Z\\\",\\\"message\\\":\\\"s{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1216 06:55:51.958057 6251 services_controller.go:444] Built service openshift-console/downloads LB per-node configs for network=default: []services.lbConfig(nil)\\\\nF1216 06:55:51.958062 6251 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, 
handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:51Z is after 2025-08-24T17:21:41Z]\\\\nI1216 06:55:51.958072 6251 obj_retry.go:303] Retry object setup: *v1.Pod openshift-image-registry/node-ca-bwcng\\\\nI1216 06:55:51.958052 6251 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath
\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c223f5207e2445fd4bfd1bfe2c1c1ca50b5211a92b02c21445534fbfa2709e28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access
-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e3566766c9c24f5c12481b74ed8a3938b21bdb36c003146515c67170d74a71f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e3566766c9c24f5c12481b74ed8a3938b21bdb36c003146515c67170d74a71f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zwjhk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:53Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:53 crc kubenswrapper[4823]: I1216 06:55:53.811879 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/1e7dd738-a9b3-455c-93e0-3f0dc7327817-metrics-certs\") pod \"network-metrics-daemon-8mn7l\" (UID: \"1e7dd738-a9b3-455c-93e0-3f0dc7327817\") " pod="openshift-multus/network-metrics-daemon-8mn7l" Dec 16 06:55:53 crc kubenswrapper[4823]: I1216 06:55:53.811952 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2crtc\" (UniqueName: \"kubernetes.io/projected/1e7dd738-a9b3-455c-93e0-3f0dc7327817-kube-api-access-2crtc\") pod \"network-metrics-daemon-8mn7l\" (UID: \"1e7dd738-a9b3-455c-93e0-3f0dc7327817\") " pod="openshift-multus/network-metrics-daemon-8mn7l" Dec 16 06:55:53 crc kubenswrapper[4823]: E1216 06:55:53.812329 4823 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 16 06:55:53 crc kubenswrapper[4823]: E1216 06:55:53.812381 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1e7dd738-a9b3-455c-93e0-3f0dc7327817-metrics-certs podName:1e7dd738-a9b3-455c-93e0-3f0dc7327817 nodeName:}" failed. No retries permitted until 2025-12-16 06:55:54.312368272 +0000 UTC m=+32.800934395 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1e7dd738-a9b3-455c-93e0-3f0dc7327817-metrics-certs") pod "network-metrics-daemon-8mn7l" (UID: "1e7dd738-a9b3-455c-93e0-3f0dc7327817") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 16 06:55:53 crc kubenswrapper[4823]: I1216 06:55:53.812722 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3de3a2ad-dc6c-47ba-af8d-4f128e025aad\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c40d3d73f70c8983adc8d076d89864e7224528dae97252396a1c34bd4cb804e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernete
s/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50bb40e9d22674554f2a267df7c9f924a29131ac986877bf19572cc5992ba396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9c22b0787554b4bdf0b0068fc696b8515e2ee63affef23e5eb64f77bc32a624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a32894
b8f22c5335ed585c26fcab727324803d768943f40455a592025cbfd0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1560fb0ba0c40a2797688b518c22c9164a8cbe9a265cb2ad95408ba86b0fb537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3d6894d867564eea62c7d78dd6ca62a9913787ba06b786722e478785ba796ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6
a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3d6894d867564eea62c7d78dd6ca62a9913787ba06b786722e478785ba796ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e551b627e90b56385fe5618d3e66ea7376a2d574589583c005373aa29db1d410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e551b627e90b56385fe5618d3e66ea7376a2d574589583c005373aa29db1d410\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://631458d415d64c3cd4dfd24771644393aa6c01288de8f7a5395cf8888db67d2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\
\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://631458d415d64c3cd4dfd24771644393aa6c01288de8f7a5395cf8888db67d2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:53Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:53 crc kubenswrapper[4823]: I1216 06:55:53.830480 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2crtc\" (UniqueName: \"kubernetes.io/projected/1e7dd738-a9b3-455c-93e0-3f0dc7327817-kube-api-access-2crtc\") pod \"network-metrics-daemon-8mn7l\" (UID: \"1e7dd738-a9b3-455c-93e0-3f0dc7327817\") " pod="openshift-multus/network-metrics-daemon-8mn7l" Dec 16 06:55:53 crc kubenswrapper[4823]: I1216 06:55:53.854975 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:53 crc kubenswrapper[4823]: I1216 06:55:53.855009 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:53 crc kubenswrapper[4823]: I1216 06:55:53.855018 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Dec 16 06:55:53 crc kubenswrapper[4823]: I1216 06:55:53.855054 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:53 crc kubenswrapper[4823]: I1216 06:55:53.855074 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:53Z","lastTransitionTime":"2025-12-16T06:55:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:55:53 crc kubenswrapper[4823]: I1216 06:55:53.858960 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e388fa291257a537c122a1f3f0eb6a60eb74e4bcaaf32c4fe829be2da1bfdae9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\
":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:53Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:53 crc kubenswrapper[4823]: I1216 06:55:53.898654 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:53Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:53 crc kubenswrapper[4823]: I1216 06:55:53.938202 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"25dec47c-3043-486c-b371-2be103c214e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2bf5eb4e2f587f7084f731ca681a116313b1014b02cdb391b09fdcdd42600c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmpf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78aed5cdf8d3a29dfffcdabfd18b38c0cf762258
53078e01602198bee4994536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmpf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fv56f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:53Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:53 crc kubenswrapper[4823]: I1216 06:55:53.957295 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:53 crc kubenswrapper[4823]: I1216 06:55:53.957591 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:53 crc kubenswrapper[4823]: I1216 06:55:53.957671 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:53 crc 
kubenswrapper[4823]: I1216 06:55:53.957760 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:53 crc kubenswrapper[4823]: I1216 06:55:53.957833 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:53Z","lastTransitionTime":"2025-12-16T06:55:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:55:53 crc kubenswrapper[4823]: I1216 06:55:53.980457 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v5mgh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42574a3f-0701-4192-b16c-bdb9be6c2888\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t54x6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t54x6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-v5mgh\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:53Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:54 crc kubenswrapper[4823]: I1216 06:55:54.060297 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:54 crc kubenswrapper[4823]: I1216 06:55:54.060559 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:54 crc kubenswrapper[4823]: I1216 06:55:54.060675 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:54 crc kubenswrapper[4823]: I1216 06:55:54.060760 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:54 crc kubenswrapper[4823]: I1216 06:55:54.060835 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:54Z","lastTransitionTime":"2025-12-16T06:55:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:55:54 crc kubenswrapper[4823]: I1216 06:55:54.072395 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v5mgh" event={"ID":"42574a3f-0701-4192-b16c-bdb9be6c2888","Type":"ContainerStarted","Data":"f3e305106870ce1b62af84310a83d4ba0d529312470315b95ae7e41e4f0d378e"} Dec 16 06:55:54 crc kubenswrapper[4823]: I1216 06:55:54.072457 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v5mgh" event={"ID":"42574a3f-0701-4192-b16c-bdb9be6c2888","Type":"ContainerStarted","Data":"d3a26eba39f7b07ed59337e21910571235eaa797f43231f4962869caf58a9515"} Dec 16 06:55:54 crc kubenswrapper[4823]: I1216 06:55:54.075915 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zwjhk_08e48f89-7095-4ea2-afb5-759591c2b0d4/ovnkube-controller/1.log" Dec 16 06:55:54 crc kubenswrapper[4823]: I1216 06:55:54.079182 4823 scope.go:117] "RemoveContainer" containerID="a17f6108d8783f74d8244ae56a7e5fb1c1835c7cf4e48c6b0aaa5e8ec4a5c843" Dec 16 06:55:54 crc kubenswrapper[4823]: E1216 06:55:54.079331 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-zwjhk_openshift-ovn-kubernetes(08e48f89-7095-4ea2-afb5-759591c2b0d4)\"" pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" podUID="08e48f89-7095-4ea2-afb5-759591c2b0d4" Dec 16 06:55:54 crc kubenswrapper[4823]: I1216 06:55:54.091105 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"08e48f89-7095-4ea2-afb5-759591c2b0d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3d0f51f7ac1df7a9ae61306ccbfe2c93b397b6ab9ae0a477ab4eb46248d36f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b768d92db57918a41c53b83940957c0181865808eb2c795231a5f8dbbf2bde82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f2cea674c0936cca4e9e95dfb7cd15bf9ccb3fbf8b711f351f87b632aad65c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://055585c1217ea720b929c9da82e7ea662d6fd5634244d362b3b425faee380d92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:42Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1c6cb023dcc6d1b149de201b108e3b1f3e32530dadc822aedede5552efc586b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://304f9e1fee6b648f61b3b334284a954e8107ff70ca71d01b586d137490eeab20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a17f6108d8783f74d8244ae56a7e5fb1c1835c7cf4e48c6b0aaa5e8ec4a5c843\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00935cc6d346797a92ef9e563af373278845a965676a245ecd98969758468383\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T06:55:50Z\\\",\\\"message\\\":\\\":55:50.835539 6074 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1216 06:55:50.835626 6074 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1216 06:55:50.835693 6074 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) 
from k8s.io/client-go/informers/factory.go:160\\\\nI1216 06:55:50.836226 6074 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1216 06:55:50.836261 6074 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1216 06:55:50.836286 6074 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1216 06:55:50.836302 6074 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1216 06:55:50.836337 6074 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1216 06:55:50.836340 6074 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1216 06:55:50.836351 6074 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1216 06:55:50.836356 6074 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1216 06:55:50.836366 6074 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1216 06:55:50.836375 6074 factory.go:656] Stopping watch factory\\\\nI1216 06:55:50.836586 6074 ovnkube.go:599] Stopped ovnkube\\\\nI1216 06\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:47Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a17f6108d8783f74d8244ae56a7e5fb1c1835c7cf4e48c6b0aaa5e8ec4a5c843\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T06:55:52Z\\\",\\\"message\\\":\\\"s{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1216 06:55:51.958057 6251 services_controller.go:444] Built service openshift-console/downloads LB per-node configs for network=default: []services.lbConfig(nil)\\\\nF1216 06:55:51.958062 6251 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, 
handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:51Z is after 2025-08-24T17:21:41Z]\\\\nI1216 06:55:51.958072 6251 obj_retry.go:303] Retry object setup: *v1.Pod openshift-image-registry/node-ca-bwcng\\\\nI1216 06:55:51.958052 6251 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath
\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c223f5207e2445fd4bfd1bfe2c1c1ca50b5211a92b02c21445534fbfa2709e28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access
-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e3566766c9c24f5c12481b74ed8a3938b21bdb36c003146515c67170d74a71f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e3566766c9c24f5c12481b74ed8a3938b21bdb36c003146515c67170d74a71f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zwjhk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:54Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:54 crc kubenswrapper[4823]: I1216 06:55:54.103828 4823 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:54Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:54 crc kubenswrapper[4823]: I1216 06:55:54.114080 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"25dec47c-3043-486c-b371-2be103c214e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2bf5eb4e2f587f7084f731ca681a116313b1014b02cdb391b09fdcdd42600c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmpf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78aed5cdf8d3a29dfffcdabfd18b38c0cf762258
53078e01602198bee4994536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmpf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fv56f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:54Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:54 crc kubenswrapper[4823]: I1216 06:55:54.138817 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v5mgh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"42574a3f-0701-4192-b16c-bdb9be6c2888\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3a26eba39f7b07ed59337e21910571235eaa797f43231f4962869caf58a9515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t54x6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e305106870ce1b62af84310a83d4ba0d529
312470315b95ae7e41e4f0d378e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t54x6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-v5mgh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:54Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:54 crc kubenswrapper[4823]: I1216 06:55:54.162866 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:54 crc kubenswrapper[4823]: I1216 06:55:54.162961 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:54 crc kubenswrapper[4823]: I1216 06:55:54.162985 4823 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:54 crc kubenswrapper[4823]: I1216 06:55:54.163053 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:54 crc kubenswrapper[4823]: I1216 06:55:54.163079 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:54Z","lastTransitionTime":"2025-12-16T06:55:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:55:54 crc kubenswrapper[4823]: I1216 06:55:54.195908 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3de3a2ad-dc6c-47ba-af8d-4f128e025aad\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c40d3d73f70c8983adc8d076d89864e7224528dae97252396a1c34bd4cb804e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e
33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50bb40e9d22674554f2a267df7c9f924a29131ac986877bf19572cc5992ba396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9c22b0787554b4bdf0b0068fc696b8515e2ee63affef23e5eb64f77bc32a624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866
be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a32894b8f22c5335ed585c26fcab727324803d768943f40455a592025cbfd0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1560fb0ba0c40a2797688b518c22c9164a8cbe9a265cb2ad95408ba86b0fb537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/stati
c-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3d6894d867564eea62c7d78dd6ca62a9913787ba06b786722e478785ba796ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3d6894d867564eea62c7d78dd6ca62a9913787ba06b786722e478785ba796ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e551b627e90b56385fe5618d3e66ea7376a2d574589583c005373aa29db1d410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e551b627e90b56385fe5618d3e66ea7376a2d574589583c005373aa29db1d410\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://631458d415d64c3cd4dfd24771644393aa6c01288de8
f7a5395cf8888db67d2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://631458d415d64c3cd4dfd24771644393aa6c01288de8f7a5395cf8888db67d2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:54Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:54 crc kubenswrapper[4823]: I1216 06:55:54.220311 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e388fa291257a537c122a1f3f0eb6a60eb74e4bcaaf32c4fe829be2da1bfdae9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:54Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:54 crc kubenswrapper[4823]: I1216 06:55:54.258087 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8mn7l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e7dd738-a9b3-455c-93e0-3f0dc7327817\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2crtc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2crtc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8mn7l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:54Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:54 crc 
kubenswrapper[4823]: I1216 06:55:54.265896 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:54 crc kubenswrapper[4823]: I1216 06:55:54.265951 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:54 crc kubenswrapper[4823]: I1216 06:55:54.265963 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:54 crc kubenswrapper[4823]: I1216 06:55:54.265982 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:54 crc kubenswrapper[4823]: I1216 06:55:54.265994 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:54Z","lastTransitionTime":"2025-12-16T06:55:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:55:54 crc kubenswrapper[4823]: I1216 06:55:54.300105 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c915dd50-9820-494e-b47a-987257910a57\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c33a30aed9756ef012ff9046309ec07897de3b17707ea39c621534ed863c3adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://464ab3cf14920ae3f947ce3dccacd7379d50380699008411d0a51bd83fe3e51c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20f7e0e3c07f467643afdcc6f0e333353d2d10b6d0c0d7aff0d347d33a5d925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ddfb7d6dae98465369260c8e9c64a2aa21dbf226e7898f6de0c36b3806bcae8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c9e6ec333ac552dc972c31cce23104cc27bcc2cf11bdc52d89ab36172915a41\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"ed a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1853467785/tls.crt::/tmp/serving-cert-1853467785/tls.key\\\\\\\"\\\\nI1216 06:55:39.731946 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 06:55:39.736431 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 06:55:39.736560 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 06:55:39.736611 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 06:55:39.736722 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 06:55:39.741676 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 06:55:39.741701 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 06:55:39.741707 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 06:55:39.741713 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 06:55:39.741717 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 06:55:39.741721 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 06:55:39.741725 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 06:55:39.741774 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 06:55:39.744309 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" 
certName=\\\\\\\"serving-cert::/tmp/serving-cert-1853467785/tls.crt::/tmp/serving-cert-1853467785/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765868124\\\\\\\\\\\\\\\" (2025-12-16 06:55:23 +0000 UTC to 2026-01-15 06:55:24 +0000 UTC (now=2025-12-16 06:55:39.744275203 +0000 UTC))\\\\\\\"\\\\nF1216 06:55:39.744562 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c84e7f6950c8d433823b7d5ae3d8a9ecaa70ab6845ce607b8bb93efa990325a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84f11d2687959c8525b131966a998af137caa69a480bb31a94e1615a3ccbea5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"image
ID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84f11d2687959c8525b131966a998af137caa69a480bb31a94e1615a3ccbea5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:54Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:54 crc kubenswrapper[4823]: I1216 06:55:54.315756 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1e7dd738-a9b3-455c-93e0-3f0dc7327817-metrics-certs\") pod \"network-metrics-daemon-8mn7l\" (UID: \"1e7dd738-a9b3-455c-93e0-3f0dc7327817\") " pod="openshift-multus/network-metrics-daemon-8mn7l" Dec 16 06:55:54 crc kubenswrapper[4823]: E1216 06:55:54.315937 4823 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 16 06:55:54 crc kubenswrapper[4823]: E1216 06:55:54.316191 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1e7dd738-a9b3-455c-93e0-3f0dc7327817-metrics-certs 
podName:1e7dd738-a9b3-455c-93e0-3f0dc7327817 nodeName:}" failed. No retries permitted until 2025-12-16 06:55:55.316169224 +0000 UTC m=+33.804735357 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1e7dd738-a9b3-455c-93e0-3f0dc7327817-metrics-certs") pod "network-metrics-daemon-8mn7l" (UID: "1e7dd738-a9b3-455c-93e0-3f0dc7327817") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 16 06:55:54 crc kubenswrapper[4823]: I1216 06:55:54.339645 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4c70365-dff4-4b29-af25-657fd9823db4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59ddc7310ed1600d050e488762436763220f3c2946a4c2020346da4ccef46b1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"
cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://275328a704b475b037c82f92055f9ca7f83b9821325603c5e34090140cc486ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7fa96323e345f8891f51500618a7b07d128d3e7256a3534d9dca75ff8ce9edf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc
/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d706c1463d2c913afe727db1a87a2841698d91a586c84d6b55f02865e7f5584a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:54Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:54 crc kubenswrapper[4823]: I1216 06:55:54.369613 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:54 crc kubenswrapper[4823]: I1216 06:55:54.369689 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:54 crc kubenswrapper[4823]: I1216 
06:55:54.369708 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:54 crc kubenswrapper[4823]: I1216 06:55:54.369735 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:54 crc kubenswrapper[4823]: I1216 06:55:54.369752 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:54Z","lastTransitionTime":"2025-12-16T06:55:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:55:54 crc kubenswrapper[4823]: I1216 06:55:54.381387 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63ba735c509770e9d43fe8152b2a42dc4123d0d797c7d41d5e3efe39a6cef74d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e09486327bad82b741ca9260e306f84ed9f0fbc7058bb06267042b8d2cd5f818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:54Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:54 crc kubenswrapper[4823]: I1216 06:55:54.420676 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:54Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:54 crc kubenswrapper[4823]: I1216 06:55:54.458769 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72dfe197d43de2945a2ee22789ab86205c77588c4f4002e2473cef09e7b4b10b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-16T06:55:54Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:54 crc kubenswrapper[4823]: I1216 06:55:54.473291 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:54 crc kubenswrapper[4823]: I1216 06:55:54.473335 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:54 crc kubenswrapper[4823]: I1216 06:55:54.473344 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:54 crc kubenswrapper[4823]: I1216 06:55:54.473361 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:54 crc kubenswrapper[4823]: I1216 06:55:54.473372 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:54Z","lastTransitionTime":"2025-12-16T06:55:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:55:54 crc kubenswrapper[4823]: I1216 06:55:54.499929 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hr8h5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d31d032-9142-4e26-9f06-e3a5ea73d530\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://502e1707b8f53950b8f10fa6b289c9d235d6596fb6eec7e4d69c3158823f8e2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sdtm\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hr8h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:54Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:54 crc kubenswrapper[4823]: I1216 06:55:54.541542 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n248g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b377757-dbc6-4d9c-9656-3ff65d7d113a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78e6c48a7009b10b70f11bfc96f18839781aeccd794f9ce7a4074271cfb9becc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2skkc\\\",\\\"readO
nly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n248g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:54Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:54 crc kubenswrapper[4823]: I1216 06:55:54.575867 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:54 crc kubenswrapper[4823]: I1216 06:55:54.575906 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:54 crc kubenswrapper[4823]: I1216 06:55:54.575915 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:54 crc kubenswrapper[4823]: I1216 06:55:54.575929 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:54 crc kubenswrapper[4823]: I1216 06:55:54.575939 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:54Z","lastTransitionTime":"2025-12-16T06:55:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:55:54 crc kubenswrapper[4823]: I1216 06:55:54.583487 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-964hc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a95a09e1-8457-4d5e-b1b6-dd6f59a66c6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bb1ede33a8f71c9116cfb9401b1f78a6aca07d580e4abe93370b470d6b20284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b305ee45caf9aff77bb738777dd9b2ebd2e3e63727b460eae22f8f66d1e6cec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b305ee45caf9aff77bb738777dd9b2ebd2e3e63727b460eae22f8f66d1e6cec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d77b125c0d081148c1892229c933414ac8e1b8e0371f0932512deba1107e94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://d9d77b125c0d081148c1892229c933414ac8e1b8e0371f0932512deba1107e94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2048aefee6601e678e26821b3ed329aead74791b34654ef5dde9fe261e37b835\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2048aefee6601e678e26821b3ed329aead74791b34654ef5dde9fe261e37b835\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c574f5f8d3f82a7a21c7f8e49e7927387be9e4029e28ba265d39ea058fb6a60f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c574f5f8d3f82a7a21c7f8e49e7927387be9e4029e28ba265d39ea058fb6a60f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e32bdec2a02013c087ccc13325d44061d85bcf9a352f2dc31ed506f11db6428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e32bdec2a02013c087ccc13325d44061d85bcf9a352f2dc31ed506f11db6428\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91427f2f713f7e1c2ba16e87252e2fd0479a31e6f94884ad85fe9f99ab84bedd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91427f2f713f7e1c2ba16e87252e2fd0479a31e6f94884ad85fe9f99ab84bedd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-964hc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:54Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:54 crc kubenswrapper[4823]: I1216 06:55:54.618328 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:54Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:54 crc kubenswrapper[4823]: I1216 06:55:54.657473 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bwcng" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ff057ef-c324-4465-8b8d-c7b98c25b23c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea36a181f0874fbab3022e6ce27567b65eb8c01cdb2925b08d9c0782af7e93c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2bj4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bwcng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:54Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:54 crc kubenswrapper[4823]: I1216 06:55:54.678150 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:54 crc kubenswrapper[4823]: I1216 06:55:54.678200 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:54 crc kubenswrapper[4823]: I1216 06:55:54.678212 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:54 crc kubenswrapper[4823]: I1216 06:55:54.678225 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:54 crc kubenswrapper[4823]: I1216 06:55:54.678233 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:54Z","lastTransitionTime":"2025-12-16T06:55:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:55:54 crc kubenswrapper[4823]: I1216 06:55:54.699218 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72dfe197d43de2945a2ee22789ab86205c77588c4f4002e2473cef09e7b4b10b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:54Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:54 crc kubenswrapper[4823]: I1216 06:55:54.737697 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hr8h5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d31d032-9142-4e26-9f06-e3a5ea73d530\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://502e1707b8f53950b8f10fa6b289c9d235d6596fb6eec7e4d69c3158823f8e2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sdtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hr8h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:54Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:54 crc kubenswrapper[4823]: I1216 06:55:54.779139 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n248g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b377757-dbc6-4d9c-9656-3ff65d7d113a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78e6c48a7009b10b70f11bfc96f18839781aeccd794f9ce7a4074271cfb9becc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2skkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n248g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:54Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:54 crc kubenswrapper[4823]: I1216 06:55:54.780927 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:54 crc 
kubenswrapper[4823]: I1216 06:55:54.780970 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:54 crc kubenswrapper[4823]: I1216 06:55:54.780984 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:54 crc kubenswrapper[4823]: I1216 06:55:54.781001 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:54 crc kubenswrapper[4823]: I1216 06:55:54.781013 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:54Z","lastTransitionTime":"2025-12-16T06:55:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:55:54 crc kubenswrapper[4823]: I1216 06:55:54.819907 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-964hc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a95a09e1-8457-4d5e-b1b6-dd6f59a66c6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bb1ede33a8f71c9116cfb9401b1f78a6aca07d580e4abe93370b470d6b20284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b305ee45caf9aff77bb738777dd9b2ebd2e3e63727b460eae22f8f66d1e6cec\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b305ee45caf9aff77bb738777dd9b2ebd2e3e63727b460eae22f8f66d1e6cec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d77b125c0d081148c1892229c933414ac8e1b8e0371f0932512deba1107e94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9d77b125c0d081148c1892229c933414ac8e1b8e0371f0932512deba1107e94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:42Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2048aefee6601e678e26821b3ed329aead74791b34654ef5dde9fe261e37b835\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2048aefee6601e678e26821b3ed329aead74791b34654ef5dde9fe261e37b835\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c574f
5f8d3f82a7a21c7f8e49e7927387be9e4029e28ba265d39ea058fb6a60f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c574f5f8d3f82a7a21c7f8e49e7927387be9e4029e28ba265d39ea058fb6a60f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e32bdec2a02013c087ccc13325d44061d85bcf9a352f2dc31ed506f11db6428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e32bdec2a02013c087ccc13325d44061d85bcf9a352f2dc31ed506f11db6428\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:47Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91427f2f713f7e1c2ba16e87252e2fd0479a31e6f94884ad85fe9f99ab84bedd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91427f2f713f7e1c2ba16e87252e2fd0479a31e6f94884ad85fe9f99ab84bedd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-964hc\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:54Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:54 crc kubenswrapper[4823]: I1216 06:55:54.862185 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:54Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:54 crc kubenswrapper[4823]: I1216 06:55:54.883550 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:54 crc kubenswrapper[4823]: I1216 06:55:54.883789 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:54 crc kubenswrapper[4823]: I1216 06:55:54.883851 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:54 crc kubenswrapper[4823]: I1216 06:55:54.883914 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:54 crc kubenswrapper[4823]: I1216 06:55:54.883969 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:54Z","lastTransitionTime":"2025-12-16T06:55:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:55:54 crc kubenswrapper[4823]: I1216 06:55:54.898151 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63ba735c509770e9d43fe8152b2a42dc4123d0d797c7d41d5e3efe39a6cef74d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://e09486327bad82b741ca9260e306f84ed9f0fbc7058bb06267042b8d2cd5f818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:54Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:54 crc kubenswrapper[4823]: I1216 06:55:54.938721 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:54Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:54 crc kubenswrapper[4823]: I1216 06:55:54.976705 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bwcng" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ff057ef-c324-4465-8b8d-c7b98c25b23c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea36a181f0874fbab3022e6ce27567b65eb8c01cdb2925b08d9c0782af7e93c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2bj4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bwcng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:54Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:54 crc kubenswrapper[4823]: I1216 06:55:54.986604 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:54 crc kubenswrapper[4823]: I1216 06:55:54.986650 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:54 crc kubenswrapper[4823]: I1216 06:55:54.986661 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:54 crc kubenswrapper[4823]: I1216 06:55:54.986676 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:54 crc kubenswrapper[4823]: I1216 06:55:54.986685 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:54Z","lastTransitionTime":"2025-12-16T06:55:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:55:55 crc kubenswrapper[4823]: I1216 06:55:55.026717 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08e48f89-7095-4ea2-afb5-759591c2b0d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3d0f51f7ac1df7a9ae61306ccbfe2c93b397b6ab9ae0a477ab4eb46248d36f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b768d92db57918a41c53b83940957c0181865808eb2c795231a5f8dbbf2bde82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f2cea674c0936cca4e9e95dfb7cd15bf9ccb3fbf8b711f351f87b632aad65c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://055585c1217ea720b929c9da82e7ea662d6fd5634244d362b3b425faee380d92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:42Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1c6cb023dcc6d1b149de201b108e3b1f3e32530dadc822aedede5552efc586b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://304f9e1fee6b648f61b3b334284a954e8107ff70ca71d01b586d137490eeab20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a17f6108d8783f74d8244ae56a7e5fb1c1835c7cf4e48c6b0aaa5e8ec4a5c843\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a17f6108d8783f74d8244ae56a7e5fb1c1835c7cf4e48c6b0aaa5e8ec4a5c843\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T06:55:52Z\\\",\\\"message\\\":\\\"s{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1216 06:55:51.958057 6251 services_controller.go:444] Built service openshift-console/downloads LB per-node configs for network=default: []services.lbConfig(nil)\\\\nF1216 06:55:51.958062 6251 ovnkube.go:137] failed to run ovnkube: [failed to start network 
controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:51Z is after 2025-08-24T17:21:41Z]\\\\nI1216 06:55:51.958072 6251 obj_retry.go:303] Retry object setup: *v1.Pod openshift-image-registry/node-ca-bwcng\\\\nI1216 06:55:51.958052 6251 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:51Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zwjhk_openshift-ovn-kubernetes(08e48f89-7095-4ea2-afb5-759591c2b0d4)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c223f5207e2445fd4bfd1bfe2c1c1ca50b5211a92b02c21445534fbfa2709e28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e3566766c9c24f5c12481b74ed8a3938b21bdb36c003146515c67170d74a71f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e3566766c9c24f5c1
2481b74ed8a3938b21bdb36c003146515c67170d74a71f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zwjhk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:55Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:55 crc kubenswrapper[4823]: I1216 06:55:55.059434 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v5mgh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"42574a3f-0701-4192-b16c-bdb9be6c2888\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3a26eba39f7b07ed59337e21910571235eaa797f43231f4962869caf58a9515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t54x6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e305106870ce1b62af84310a83d4ba0d529
312470315b95ae7e41e4f0d378e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t54x6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-v5mgh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:55Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:55 crc kubenswrapper[4823]: I1216 06:55:55.089071 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:55 crc kubenswrapper[4823]: I1216 06:55:55.089121 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:55 crc kubenswrapper[4823]: I1216 06:55:55.089134 4823 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:55 crc kubenswrapper[4823]: I1216 06:55:55.089158 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:55 crc kubenswrapper[4823]: I1216 06:55:55.089170 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:55Z","lastTransitionTime":"2025-12-16T06:55:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:55:55 crc kubenswrapper[4823]: I1216 06:55:55.107724 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3de3a2ad-dc6c-47ba-af8d-4f128e025aad\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c40d3d73f70c8983adc8d076d89864e7224528dae97252396a1c34bd4cb804e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e
33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50bb40e9d22674554f2a267df7c9f924a29131ac986877bf19572cc5992ba396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9c22b0787554b4bdf0b0068fc696b8515e2ee63affef23e5eb64f77bc32a624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866
be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a32894b8f22c5335ed585c26fcab727324803d768943f40455a592025cbfd0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1560fb0ba0c40a2797688b518c22c9164a8cbe9a265cb2ad95408ba86b0fb537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/stati
c-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3d6894d867564eea62c7d78dd6ca62a9913787ba06b786722e478785ba796ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3d6894d867564eea62c7d78dd6ca62a9913787ba06b786722e478785ba796ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e551b627e90b56385fe5618d3e66ea7376a2d574589583c005373aa29db1d410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e551b627e90b56385fe5618d3e66ea7376a2d574589583c005373aa29db1d410\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://631458d415d64c3cd4dfd24771644393aa6c01288de8
f7a5395cf8888db67d2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://631458d415d64c3cd4dfd24771644393aa6c01288de8f7a5395cf8888db67d2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:55Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:55 crc kubenswrapper[4823]: I1216 06:55:55.143781 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e388fa291257a537c122a1f3f0eb6a60eb74e4bcaaf32c4fe829be2da1bfdae9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:55Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:55 crc kubenswrapper[4823]: I1216 06:55:55.184352 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:55Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:55 crc kubenswrapper[4823]: I1216 06:55:55.192209 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:55 crc kubenswrapper[4823]: I1216 06:55:55.192284 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:55 crc kubenswrapper[4823]: I1216 06:55:55.192306 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:55 crc kubenswrapper[4823]: I1216 06:55:55.192336 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:55 crc kubenswrapper[4823]: I1216 06:55:55.192360 4823 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:55Z","lastTransitionTime":"2025-12-16T06:55:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:55:55 crc kubenswrapper[4823]: I1216 06:55:55.219782 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25dec47c-3043-486c-b371-2be103c214e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2bf5eb4e2f587f7084f731ca681a116313b1014b02cdb391b09fdcdd42600c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmpf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78aed5cdf8d3a29dfffcdabfd18b38c0cf76225853078e01602198bee4994536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmpf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fv56f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-12-16T06:55:55Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:55 crc kubenswrapper[4823]: I1216 06:55:55.264147 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c915dd50-9820-494e-b47a-987257910a57\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c33a30aed9756ef012ff9046309ec07897de3b17707ea39c621534ed863c3adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"
name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://464ab3cf14920ae3f947ce3dccacd7379d50380699008411d0a51bd83fe3e51c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20f7e0e3c07f467643afdcc6f0e333353d2d10b6d0c0d7aff0d347d33a5d925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ddfb7d6dae98465369260c8e9c64a2aa21dbf226e7898f6de0c36b3806bcae8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc
478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c9e6ec333ac552dc972c31cce23104cc27bcc2cf11bdc52d89ab36172915a41\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"ed a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1853467785/tls.crt::/tmp/serving-cert-1853467785/tls.key\\\\\\\"\\\\nI1216 06:55:39.731946 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 06:55:39.736431 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 06:55:39.736560 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 06:55:39.736611 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 06:55:39.736722 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 06:55:39.741676 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 06:55:39.741701 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 06:55:39.741707 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 06:55:39.741713 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 06:55:39.741717 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 06:55:39.741721 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 06:55:39.741725 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 06:55:39.741774 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 06:55:39.744309 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" 
certName=\\\\\\\"serving-cert::/tmp/serving-cert-1853467785/tls.crt::/tmp/serving-cert-1853467785/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765868124\\\\\\\\\\\\\\\" (2025-12-16 06:55:23 +0000 UTC to 2026-01-15 06:55:24 +0000 UTC (now=2025-12-16 06:55:39.744275203 +0000 UTC))\\\\\\\"\\\\nF1216 06:55:39.744562 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c84e7f6950c8d433823b7d5ae3d8a9ecaa70ab6845ce607b8bb93efa990325a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84f11d2687959c8525b131966a998af137caa69a480bb31a94e1615a3ccbea5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"image
ID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84f11d2687959c8525b131966a998af137caa69a480bb31a94e1615a3ccbea5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:55Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:55 crc kubenswrapper[4823]: I1216 06:55:55.294949 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:55 crc kubenswrapper[4823]: I1216 06:55:55.295001 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:55 crc kubenswrapper[4823]: I1216 06:55:55.295012 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:55 crc kubenswrapper[4823]: I1216 06:55:55.295055 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:55 crc kubenswrapper[4823]: I1216 06:55:55.295068 4823 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:55Z","lastTransitionTime":"2025-12-16T06:55:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:55:55 crc kubenswrapper[4823]: I1216 06:55:55.298649 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4c70365-dff4-4b29-af25-657fd9823db4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59ddc7310ed1600d050e488762436763220f3c2946a4c2020346da4ccef46b1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://275328a704b475b037c82f92055f9ca7f83b9821325603c5e34090140cc486ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7fa96323e345f8891f51500618a7b07d128d3e7256a3534d9dca75ff8ce9edf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://d706c1463d2c913afe727db1a87a2841698d91a586c84d6b55f02865e7f5584a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:55Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:55 crc kubenswrapper[4823]: I1216 06:55:55.325592 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1e7dd738-a9b3-455c-93e0-3f0dc7327817-metrics-certs\") pod \"network-metrics-daemon-8mn7l\" (UID: \"1e7dd738-a9b3-455c-93e0-3f0dc7327817\") " pod="openshift-multus/network-metrics-daemon-8mn7l" Dec 16 06:55:55 crc kubenswrapper[4823]: E1216 06:55:55.325767 4823 secret.go:188] Couldn't 
get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 16 06:55:55 crc kubenswrapper[4823]: E1216 06:55:55.325877 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1e7dd738-a9b3-455c-93e0-3f0dc7327817-metrics-certs podName:1e7dd738-a9b3-455c-93e0-3f0dc7327817 nodeName:}" failed. No retries permitted until 2025-12-16 06:55:57.325848743 +0000 UTC m=+35.814414906 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1e7dd738-a9b3-455c-93e0-3f0dc7327817-metrics-certs") pod "network-metrics-daemon-8mn7l" (UID: "1e7dd738-a9b3-455c-93e0-3f0dc7327817") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 16 06:55:55 crc kubenswrapper[4823]: I1216 06:55:55.343178 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8mn7l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e7dd738-a9b3-455c-93e0-3f0dc7327817\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2crtc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2crtc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8mn7l\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:55Z is after 2025-08-24T17:21:41Z" Dec 16 06:55:55 crc kubenswrapper[4823]: I1216 06:55:55.398564 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:55 crc kubenswrapper[4823]: I1216 06:55:55.398611 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:55 crc kubenswrapper[4823]: I1216 06:55:55.398620 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:55 crc kubenswrapper[4823]: I1216 06:55:55.398637 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:55 crc kubenswrapper[4823]: I1216 06:55:55.398651 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:55Z","lastTransitionTime":"2025-12-16T06:55:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:55:55 crc kubenswrapper[4823]: I1216 06:55:55.502052 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:55 crc kubenswrapper[4823]: I1216 06:55:55.502103 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:55 crc kubenswrapper[4823]: I1216 06:55:55.502119 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:55 crc kubenswrapper[4823]: I1216 06:55:55.502136 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:55 crc kubenswrapper[4823]: I1216 06:55:55.502148 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:55Z","lastTransitionTime":"2025-12-16T06:55:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:55:55 crc kubenswrapper[4823]: I1216 06:55:55.526985 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:55:55 crc kubenswrapper[4823]: E1216 06:55:55.527144 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-16 06:56:11.527120431 +0000 UTC m=+50.015686554 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:55:55 crc kubenswrapper[4823]: I1216 06:55:55.527236 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:55:55 crc kubenswrapper[4823]: I1216 06:55:55.527268 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:55:55 crc kubenswrapper[4823]: I1216 06:55:55.527297 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:55:55 crc kubenswrapper[4823]: I1216 06:55:55.527328 4823 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:55:55 crc kubenswrapper[4823]: E1216 06:55:55.527375 4823 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 16 06:55:55 crc kubenswrapper[4823]: E1216 06:55:55.527478 4823 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 16 06:55:55 crc kubenswrapper[4823]: E1216 06:55:55.527495 4823 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 16 06:55:55 crc kubenswrapper[4823]: E1216 06:55:55.527508 4823 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 06:55:55 crc kubenswrapper[4823]: E1216 06:55:55.527527 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-16 06:56:11.527509474 +0000 UTC m=+50.016075667 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 16 06:55:55 crc kubenswrapper[4823]: E1216 06:55:55.527547 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-16 06:56:11.527538065 +0000 UTC m=+50.016104188 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 06:55:55 crc kubenswrapper[4823]: E1216 06:55:55.527542 4823 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 16 06:55:55 crc kubenswrapper[4823]: E1216 06:55:55.527598 4823 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 16 06:55:55 crc kubenswrapper[4823]: E1216 06:55:55.527647 4823 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 16 06:55:55 crc kubenswrapper[4823]: E1216 06:55:55.527664 4823 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod 
openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 06:55:55 crc kubenswrapper[4823]: E1216 06:55:55.527665 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-16 06:56:11.527641978 +0000 UTC m=+50.016208101 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 16 06:55:55 crc kubenswrapper[4823]: E1216 06:55:55.527742 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-16 06:56:11.52771962 +0000 UTC m=+50.016285743 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 06:55:55 crc kubenswrapper[4823]: I1216 06:55:55.605208 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:55 crc kubenswrapper[4823]: I1216 06:55:55.605269 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:55 crc kubenswrapper[4823]: I1216 06:55:55.605294 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:55 crc kubenswrapper[4823]: I1216 06:55:55.605325 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:55 crc kubenswrapper[4823]: I1216 06:55:55.605346 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:55Z","lastTransitionTime":"2025-12-16T06:55:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:55:55 crc kubenswrapper[4823]: I1216 06:55:55.707793 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:55 crc kubenswrapper[4823]: I1216 06:55:55.707870 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:55 crc kubenswrapper[4823]: I1216 06:55:55.707892 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:55 crc kubenswrapper[4823]: I1216 06:55:55.707927 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:55 crc kubenswrapper[4823]: I1216 06:55:55.707951 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:55Z","lastTransitionTime":"2025-12-16T06:55:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:55:55 crc kubenswrapper[4823]: I1216 06:55:55.771599 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:55:55 crc kubenswrapper[4823]: I1216 06:55:55.771653 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:55:55 crc kubenswrapper[4823]: I1216 06:55:55.771655 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8mn7l" Dec 16 06:55:55 crc kubenswrapper[4823]: E1216 06:55:55.771729 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 06:55:55 crc kubenswrapper[4823]: I1216 06:55:55.771739 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:55:55 crc kubenswrapper[4823]: E1216 06:55:55.771913 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 06:55:55 crc kubenswrapper[4823]: E1216 06:55:55.772222 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 06:55:55 crc kubenswrapper[4823]: E1216 06:55:55.772275 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8mn7l" podUID="1e7dd738-a9b3-455c-93e0-3f0dc7327817" Dec 16 06:55:55 crc kubenswrapper[4823]: I1216 06:55:55.810972 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:55 crc kubenswrapper[4823]: I1216 06:55:55.811037 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:55 crc kubenswrapper[4823]: I1216 06:55:55.811053 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:55 crc kubenswrapper[4823]: I1216 06:55:55.811072 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:55 crc kubenswrapper[4823]: I1216 06:55:55.811084 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:55Z","lastTransitionTime":"2025-12-16T06:55:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:55:55 crc kubenswrapper[4823]: I1216 06:55:55.913575 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:55 crc kubenswrapper[4823]: I1216 06:55:55.913620 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:55 crc kubenswrapper[4823]: I1216 06:55:55.913632 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:55 crc kubenswrapper[4823]: I1216 06:55:55.913655 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:55 crc kubenswrapper[4823]: I1216 06:55:55.913667 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:55Z","lastTransitionTime":"2025-12-16T06:55:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:55:56 crc kubenswrapper[4823]: I1216 06:55:56.016610 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:56 crc kubenswrapper[4823]: I1216 06:55:56.016669 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:56 crc kubenswrapper[4823]: I1216 06:55:56.016687 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:56 crc kubenswrapper[4823]: I1216 06:55:56.016713 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:56 crc kubenswrapper[4823]: I1216 06:55:56.016733 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:56Z","lastTransitionTime":"2025-12-16T06:55:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:55:56 crc kubenswrapper[4823]: I1216 06:55:56.119899 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:56 crc kubenswrapper[4823]: I1216 06:55:56.119954 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:56 crc kubenswrapper[4823]: I1216 06:55:56.119963 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:56 crc kubenswrapper[4823]: I1216 06:55:56.119980 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:56 crc kubenswrapper[4823]: I1216 06:55:56.119989 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:56Z","lastTransitionTime":"2025-12-16T06:55:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:55:56 crc kubenswrapper[4823]: I1216 06:55:56.222463 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:56 crc kubenswrapper[4823]: I1216 06:55:56.222520 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:56 crc kubenswrapper[4823]: I1216 06:55:56.222532 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:56 crc kubenswrapper[4823]: I1216 06:55:56.222550 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:56 crc kubenswrapper[4823]: I1216 06:55:56.222563 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:56Z","lastTransitionTime":"2025-12-16T06:55:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:55:56 crc kubenswrapper[4823]: I1216 06:55:56.324972 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:56 crc kubenswrapper[4823]: I1216 06:55:56.325013 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:56 crc kubenswrapper[4823]: I1216 06:55:56.325058 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:56 crc kubenswrapper[4823]: I1216 06:55:56.325081 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:56 crc kubenswrapper[4823]: I1216 06:55:56.325093 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:56Z","lastTransitionTime":"2025-12-16T06:55:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:55:56 crc kubenswrapper[4823]: I1216 06:55:56.427804 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:56 crc kubenswrapper[4823]: I1216 06:55:56.427878 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:56 crc kubenswrapper[4823]: I1216 06:55:56.427890 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:56 crc kubenswrapper[4823]: I1216 06:55:56.427906 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:56 crc kubenswrapper[4823]: I1216 06:55:56.427917 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:56Z","lastTransitionTime":"2025-12-16T06:55:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:55:56 crc kubenswrapper[4823]: I1216 06:55:56.529822 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:56 crc kubenswrapper[4823]: I1216 06:55:56.529866 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:56 crc kubenswrapper[4823]: I1216 06:55:56.529874 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:56 crc kubenswrapper[4823]: I1216 06:55:56.529889 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:56 crc kubenswrapper[4823]: I1216 06:55:56.529897 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:56Z","lastTransitionTime":"2025-12-16T06:55:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:55:56 crc kubenswrapper[4823]: I1216 06:55:56.633145 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:56 crc kubenswrapper[4823]: I1216 06:55:56.633209 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:56 crc kubenswrapper[4823]: I1216 06:55:56.633233 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:56 crc kubenswrapper[4823]: I1216 06:55:56.633264 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:56 crc kubenswrapper[4823]: I1216 06:55:56.633289 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:56Z","lastTransitionTime":"2025-12-16T06:55:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:55:56 crc kubenswrapper[4823]: I1216 06:55:56.735857 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:56 crc kubenswrapper[4823]: I1216 06:55:56.735895 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:56 crc kubenswrapper[4823]: I1216 06:55:56.735906 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:56 crc kubenswrapper[4823]: I1216 06:55:56.735923 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:56 crc kubenswrapper[4823]: I1216 06:55:56.735935 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:56Z","lastTransitionTime":"2025-12-16T06:55:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:55:56 crc kubenswrapper[4823]: I1216 06:55:56.838926 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:56 crc kubenswrapper[4823]: I1216 06:55:56.838966 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:56 crc kubenswrapper[4823]: I1216 06:55:56.838982 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:56 crc kubenswrapper[4823]: I1216 06:55:56.839006 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:56 crc kubenswrapper[4823]: I1216 06:55:56.839058 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:56Z","lastTransitionTime":"2025-12-16T06:55:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:55:56 crc kubenswrapper[4823]: I1216 06:55:56.947306 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:56 crc kubenswrapper[4823]: I1216 06:55:56.947368 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:56 crc kubenswrapper[4823]: I1216 06:55:56.947385 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:56 crc kubenswrapper[4823]: I1216 06:55:56.947413 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:56 crc kubenswrapper[4823]: I1216 06:55:56.947436 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:56Z","lastTransitionTime":"2025-12-16T06:55:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:55:57 crc kubenswrapper[4823]: I1216 06:55:57.049561 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:57 crc kubenswrapper[4823]: I1216 06:55:57.049607 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:57 crc kubenswrapper[4823]: I1216 06:55:57.049621 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:57 crc kubenswrapper[4823]: I1216 06:55:57.049641 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:57 crc kubenswrapper[4823]: I1216 06:55:57.049655 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:57Z","lastTransitionTime":"2025-12-16T06:55:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:55:57 crc kubenswrapper[4823]: I1216 06:55:57.152908 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:57 crc kubenswrapper[4823]: I1216 06:55:57.152965 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:57 crc kubenswrapper[4823]: I1216 06:55:57.152979 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:57 crc kubenswrapper[4823]: I1216 06:55:57.153005 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:57 crc kubenswrapper[4823]: I1216 06:55:57.153049 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:57Z","lastTransitionTime":"2025-12-16T06:55:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:55:57 crc kubenswrapper[4823]: I1216 06:55:57.256239 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:57 crc kubenswrapper[4823]: I1216 06:55:57.256305 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:57 crc kubenswrapper[4823]: I1216 06:55:57.256323 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:57 crc kubenswrapper[4823]: I1216 06:55:57.256348 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:57 crc kubenswrapper[4823]: I1216 06:55:57.256369 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:57Z","lastTransitionTime":"2025-12-16T06:55:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:55:57 crc kubenswrapper[4823]: I1216 06:55:57.349283 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1e7dd738-a9b3-455c-93e0-3f0dc7327817-metrics-certs\") pod \"network-metrics-daemon-8mn7l\" (UID: \"1e7dd738-a9b3-455c-93e0-3f0dc7327817\") " pod="openshift-multus/network-metrics-daemon-8mn7l" Dec 16 06:55:57 crc kubenswrapper[4823]: E1216 06:55:57.349482 4823 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 16 06:55:57 crc kubenswrapper[4823]: E1216 06:55:57.349567 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1e7dd738-a9b3-455c-93e0-3f0dc7327817-metrics-certs podName:1e7dd738-a9b3-455c-93e0-3f0dc7327817 nodeName:}" failed. No retries permitted until 2025-12-16 06:56:01.349546715 +0000 UTC m=+39.838112838 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1e7dd738-a9b3-455c-93e0-3f0dc7327817-metrics-certs") pod "network-metrics-daemon-8mn7l" (UID: "1e7dd738-a9b3-455c-93e0-3f0dc7327817") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 16 06:55:57 crc kubenswrapper[4823]: I1216 06:55:57.359088 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:57 crc kubenswrapper[4823]: I1216 06:55:57.359322 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:57 crc kubenswrapper[4823]: I1216 06:55:57.359413 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:57 crc kubenswrapper[4823]: I1216 06:55:57.359491 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:57 crc kubenswrapper[4823]: I1216 06:55:57.359557 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:57Z","lastTransitionTime":"2025-12-16T06:55:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:55:57 crc kubenswrapper[4823]: I1216 06:55:57.463016 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:57 crc kubenswrapper[4823]: I1216 06:55:57.463741 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:57 crc kubenswrapper[4823]: I1216 06:55:57.463902 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:57 crc kubenswrapper[4823]: I1216 06:55:57.464079 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:57 crc kubenswrapper[4823]: I1216 06:55:57.464273 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:57Z","lastTransitionTime":"2025-12-16T06:55:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:55:57 crc kubenswrapper[4823]: I1216 06:55:57.567571 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:57 crc kubenswrapper[4823]: I1216 06:55:57.567627 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:57 crc kubenswrapper[4823]: I1216 06:55:57.567639 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:57 crc kubenswrapper[4823]: I1216 06:55:57.567666 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:57 crc kubenswrapper[4823]: I1216 06:55:57.567679 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:57Z","lastTransitionTime":"2025-12-16T06:55:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:55:57 crc kubenswrapper[4823]: I1216 06:55:57.670795 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:57 crc kubenswrapper[4823]: I1216 06:55:57.670829 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:57 crc kubenswrapper[4823]: I1216 06:55:57.670839 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:57 crc kubenswrapper[4823]: I1216 06:55:57.670865 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:57 crc kubenswrapper[4823]: I1216 06:55:57.670878 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:57Z","lastTransitionTime":"2025-12-16T06:55:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:55:57 crc kubenswrapper[4823]: I1216 06:55:57.771005 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:55:57 crc kubenswrapper[4823]: I1216 06:55:57.771097 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:55:57 crc kubenswrapper[4823]: I1216 06:55:57.771106 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:55:57 crc kubenswrapper[4823]: I1216 06:55:57.771121 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8mn7l" Dec 16 06:55:57 crc kubenswrapper[4823]: E1216 06:55:57.771587 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 06:55:57 crc kubenswrapper[4823]: E1216 06:55:57.771592 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 06:55:57 crc kubenswrapper[4823]: E1216 06:55:57.771644 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 06:55:57 crc kubenswrapper[4823]: E1216 06:55:57.771778 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8mn7l" podUID="1e7dd738-a9b3-455c-93e0-3f0dc7327817" Dec 16 06:55:57 crc kubenswrapper[4823]: I1216 06:55:57.773418 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:57 crc kubenswrapper[4823]: I1216 06:55:57.773445 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:57 crc kubenswrapper[4823]: I1216 06:55:57.773455 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:57 crc kubenswrapper[4823]: I1216 06:55:57.773470 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:57 crc kubenswrapper[4823]: I1216 06:55:57.773480 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:57Z","lastTransitionTime":"2025-12-16T06:55:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:55:57 crc kubenswrapper[4823]: I1216 06:55:57.876067 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:57 crc kubenswrapper[4823]: I1216 06:55:57.876107 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:57 crc kubenswrapper[4823]: I1216 06:55:57.876119 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:57 crc kubenswrapper[4823]: I1216 06:55:57.876136 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:57 crc kubenswrapper[4823]: I1216 06:55:57.876146 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:57Z","lastTransitionTime":"2025-12-16T06:55:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:55:57 crc kubenswrapper[4823]: I1216 06:55:57.979074 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:57 crc kubenswrapper[4823]: I1216 06:55:57.979119 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:57 crc kubenswrapper[4823]: I1216 06:55:57.979133 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:57 crc kubenswrapper[4823]: I1216 06:55:57.979152 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:57 crc kubenswrapper[4823]: I1216 06:55:57.979162 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:57Z","lastTransitionTime":"2025-12-16T06:55:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:55:58 crc kubenswrapper[4823]: I1216 06:55:58.082480 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:58 crc kubenswrapper[4823]: I1216 06:55:58.082552 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:58 crc kubenswrapper[4823]: I1216 06:55:58.082578 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:58 crc kubenswrapper[4823]: I1216 06:55:58.082608 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:58 crc kubenswrapper[4823]: I1216 06:55:58.082630 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:58Z","lastTransitionTime":"2025-12-16T06:55:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:55:58 crc kubenswrapper[4823]: I1216 06:55:58.185688 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:58 crc kubenswrapper[4823]: I1216 06:55:58.185821 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:58 crc kubenswrapper[4823]: I1216 06:55:58.185845 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:58 crc kubenswrapper[4823]: I1216 06:55:58.185886 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:58 crc kubenswrapper[4823]: I1216 06:55:58.185905 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:58Z","lastTransitionTime":"2025-12-16T06:55:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:55:58 crc kubenswrapper[4823]: I1216 06:55:58.288157 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:58 crc kubenswrapper[4823]: I1216 06:55:58.288185 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:58 crc kubenswrapper[4823]: I1216 06:55:58.288193 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:58 crc kubenswrapper[4823]: I1216 06:55:58.288207 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:58 crc kubenswrapper[4823]: I1216 06:55:58.288216 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:58Z","lastTransitionTime":"2025-12-16T06:55:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:55:58 crc kubenswrapper[4823]: I1216 06:55:58.391436 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:58 crc kubenswrapper[4823]: I1216 06:55:58.391486 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:58 crc kubenswrapper[4823]: I1216 06:55:58.391496 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:58 crc kubenswrapper[4823]: I1216 06:55:58.391513 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:58 crc kubenswrapper[4823]: I1216 06:55:58.391523 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:58Z","lastTransitionTime":"2025-12-16T06:55:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:55:58 crc kubenswrapper[4823]: I1216 06:55:58.493797 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:58 crc kubenswrapper[4823]: I1216 06:55:58.493823 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:58 crc kubenswrapper[4823]: I1216 06:55:58.493835 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:58 crc kubenswrapper[4823]: I1216 06:55:58.493849 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:58 crc kubenswrapper[4823]: I1216 06:55:58.493857 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:58Z","lastTransitionTime":"2025-12-16T06:55:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:55:58 crc kubenswrapper[4823]: I1216 06:55:58.602154 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:58 crc kubenswrapper[4823]: I1216 06:55:58.602229 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:58 crc kubenswrapper[4823]: I1216 06:55:58.602251 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:58 crc kubenswrapper[4823]: I1216 06:55:58.602280 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:58 crc kubenswrapper[4823]: I1216 06:55:58.602301 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:58Z","lastTransitionTime":"2025-12-16T06:55:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:55:58 crc kubenswrapper[4823]: I1216 06:55:58.705466 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:58 crc kubenswrapper[4823]: I1216 06:55:58.705535 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:58 crc kubenswrapper[4823]: I1216 06:55:58.705559 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:58 crc kubenswrapper[4823]: I1216 06:55:58.705583 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:58 crc kubenswrapper[4823]: I1216 06:55:58.705601 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:58Z","lastTransitionTime":"2025-12-16T06:55:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:55:58 crc kubenswrapper[4823]: I1216 06:55:58.809705 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:58 crc kubenswrapper[4823]: I1216 06:55:58.809789 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:58 crc kubenswrapper[4823]: I1216 06:55:58.809809 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:58 crc kubenswrapper[4823]: I1216 06:55:58.809844 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:58 crc kubenswrapper[4823]: I1216 06:55:58.809869 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:58Z","lastTransitionTime":"2025-12-16T06:55:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:55:58 crc kubenswrapper[4823]: I1216 06:55:58.912680 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:58 crc kubenswrapper[4823]: I1216 06:55:58.912740 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:58 crc kubenswrapper[4823]: I1216 06:55:58.912758 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:58 crc kubenswrapper[4823]: I1216 06:55:58.912783 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:58 crc kubenswrapper[4823]: I1216 06:55:58.912801 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:58Z","lastTransitionTime":"2025-12-16T06:55:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:55:59 crc kubenswrapper[4823]: I1216 06:55:59.016185 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:59 crc kubenswrapper[4823]: I1216 06:55:59.016220 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:59 crc kubenswrapper[4823]: I1216 06:55:59.016228 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:59 crc kubenswrapper[4823]: I1216 06:55:59.016241 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:59 crc kubenswrapper[4823]: I1216 06:55:59.016252 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:59Z","lastTransitionTime":"2025-12-16T06:55:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:55:59 crc kubenswrapper[4823]: I1216 06:55:59.119361 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:59 crc kubenswrapper[4823]: I1216 06:55:59.119394 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:59 crc kubenswrapper[4823]: I1216 06:55:59.119403 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:59 crc kubenswrapper[4823]: I1216 06:55:59.119419 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:59 crc kubenswrapper[4823]: I1216 06:55:59.119432 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:59Z","lastTransitionTime":"2025-12-16T06:55:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:55:59 crc kubenswrapper[4823]: I1216 06:55:59.221694 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:59 crc kubenswrapper[4823]: I1216 06:55:59.221736 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:59 crc kubenswrapper[4823]: I1216 06:55:59.221746 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:59 crc kubenswrapper[4823]: I1216 06:55:59.221763 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:59 crc kubenswrapper[4823]: I1216 06:55:59.221773 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:59Z","lastTransitionTime":"2025-12-16T06:55:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:55:59 crc kubenswrapper[4823]: I1216 06:55:59.324716 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:59 crc kubenswrapper[4823]: I1216 06:55:59.324761 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:59 crc kubenswrapper[4823]: I1216 06:55:59.324771 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:59 crc kubenswrapper[4823]: I1216 06:55:59.324789 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:59 crc kubenswrapper[4823]: I1216 06:55:59.324799 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:59Z","lastTransitionTime":"2025-12-16T06:55:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:55:59 crc kubenswrapper[4823]: I1216 06:55:59.427705 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:59 crc kubenswrapper[4823]: I1216 06:55:59.427734 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:59 crc kubenswrapper[4823]: I1216 06:55:59.427746 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:59 crc kubenswrapper[4823]: I1216 06:55:59.427762 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:59 crc kubenswrapper[4823]: I1216 06:55:59.427771 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:59Z","lastTransitionTime":"2025-12-16T06:55:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:55:59 crc kubenswrapper[4823]: I1216 06:55:59.530646 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:59 crc kubenswrapper[4823]: I1216 06:55:59.530712 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:59 crc kubenswrapper[4823]: I1216 06:55:59.530722 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:59 crc kubenswrapper[4823]: I1216 06:55:59.530740 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:59 crc kubenswrapper[4823]: I1216 06:55:59.530752 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:59Z","lastTransitionTime":"2025-12-16T06:55:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:55:59 crc kubenswrapper[4823]: I1216 06:55:59.633339 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:59 crc kubenswrapper[4823]: I1216 06:55:59.633398 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:59 crc kubenswrapper[4823]: I1216 06:55:59.633411 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:59 crc kubenswrapper[4823]: I1216 06:55:59.633429 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:59 crc kubenswrapper[4823]: I1216 06:55:59.633441 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:59Z","lastTransitionTime":"2025-12-16T06:55:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:55:59 crc kubenswrapper[4823]: I1216 06:55:59.737086 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:59 crc kubenswrapper[4823]: I1216 06:55:59.737131 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:59 crc kubenswrapper[4823]: I1216 06:55:59.737140 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:59 crc kubenswrapper[4823]: I1216 06:55:59.737158 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:59 crc kubenswrapper[4823]: I1216 06:55:59.737168 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:59Z","lastTransitionTime":"2025-12-16T06:55:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:55:59 crc kubenswrapper[4823]: I1216 06:55:59.770756 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:55:59 crc kubenswrapper[4823]: I1216 06:55:59.770814 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8mn7l" Dec 16 06:55:59 crc kubenswrapper[4823]: I1216 06:55:59.770808 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:55:59 crc kubenswrapper[4823]: I1216 06:55:59.770772 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:55:59 crc kubenswrapper[4823]: E1216 06:55:59.770943 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 06:55:59 crc kubenswrapper[4823]: E1216 06:55:59.771054 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 06:55:59 crc kubenswrapper[4823]: E1216 06:55:59.771104 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8mn7l" podUID="1e7dd738-a9b3-455c-93e0-3f0dc7327817" Dec 16 06:55:59 crc kubenswrapper[4823]: E1216 06:55:59.771139 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 06:55:59 crc kubenswrapper[4823]: I1216 06:55:59.840510 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:59 crc kubenswrapper[4823]: I1216 06:55:59.840563 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:59 crc kubenswrapper[4823]: I1216 06:55:59.840575 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:59 crc kubenswrapper[4823]: I1216 06:55:59.840628 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:59 crc kubenswrapper[4823]: I1216 06:55:59.840663 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:59Z","lastTransitionTime":"2025-12-16T06:55:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:55:59 crc kubenswrapper[4823]: I1216 06:55:59.943337 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:55:59 crc kubenswrapper[4823]: I1216 06:55:59.943403 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:55:59 crc kubenswrapper[4823]: I1216 06:55:59.943424 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:55:59 crc kubenswrapper[4823]: I1216 06:55:59.943455 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:55:59 crc kubenswrapper[4823]: I1216 06:55:59.943477 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:55:59Z","lastTransitionTime":"2025-12-16T06:55:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:00 crc kubenswrapper[4823]: I1216 06:56:00.046857 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:00 crc kubenswrapper[4823]: I1216 06:56:00.046901 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:00 crc kubenswrapper[4823]: I1216 06:56:00.046916 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:00 crc kubenswrapper[4823]: I1216 06:56:00.046936 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:00 crc kubenswrapper[4823]: I1216 06:56:00.046950 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:00Z","lastTransitionTime":"2025-12-16T06:56:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:00 crc kubenswrapper[4823]: I1216 06:56:00.150441 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:00 crc kubenswrapper[4823]: I1216 06:56:00.150484 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:00 crc kubenswrapper[4823]: I1216 06:56:00.150502 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:00 crc kubenswrapper[4823]: I1216 06:56:00.150522 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:00 crc kubenswrapper[4823]: I1216 06:56:00.150535 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:00Z","lastTransitionTime":"2025-12-16T06:56:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:00 crc kubenswrapper[4823]: I1216 06:56:00.253466 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:00 crc kubenswrapper[4823]: I1216 06:56:00.253507 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:00 crc kubenswrapper[4823]: I1216 06:56:00.253516 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:00 crc kubenswrapper[4823]: I1216 06:56:00.253531 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:00 crc kubenswrapper[4823]: I1216 06:56:00.253540 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:00Z","lastTransitionTime":"2025-12-16T06:56:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:00 crc kubenswrapper[4823]: I1216 06:56:00.355625 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:00 crc kubenswrapper[4823]: I1216 06:56:00.355657 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:00 crc kubenswrapper[4823]: I1216 06:56:00.355666 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:00 crc kubenswrapper[4823]: I1216 06:56:00.355680 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:00 crc kubenswrapper[4823]: I1216 06:56:00.355690 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:00Z","lastTransitionTime":"2025-12-16T06:56:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:00 crc kubenswrapper[4823]: I1216 06:56:00.458470 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:00 crc kubenswrapper[4823]: I1216 06:56:00.458508 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:00 crc kubenswrapper[4823]: I1216 06:56:00.458518 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:00 crc kubenswrapper[4823]: I1216 06:56:00.458533 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:00 crc kubenswrapper[4823]: I1216 06:56:00.458541 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:00Z","lastTransitionTime":"2025-12-16T06:56:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:00 crc kubenswrapper[4823]: I1216 06:56:00.561275 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:00 crc kubenswrapper[4823]: I1216 06:56:00.561355 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:00 crc kubenswrapper[4823]: I1216 06:56:00.561379 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:00 crc kubenswrapper[4823]: I1216 06:56:00.561415 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:00 crc kubenswrapper[4823]: I1216 06:56:00.561443 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:00Z","lastTransitionTime":"2025-12-16T06:56:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:00 crc kubenswrapper[4823]: I1216 06:56:00.663661 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:00 crc kubenswrapper[4823]: I1216 06:56:00.663713 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:00 crc kubenswrapper[4823]: I1216 06:56:00.663724 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:00 crc kubenswrapper[4823]: I1216 06:56:00.663743 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:00 crc kubenswrapper[4823]: I1216 06:56:00.663754 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:00Z","lastTransitionTime":"2025-12-16T06:56:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:00 crc kubenswrapper[4823]: I1216 06:56:00.766218 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:00 crc kubenswrapper[4823]: I1216 06:56:00.766255 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:00 crc kubenswrapper[4823]: I1216 06:56:00.766267 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:00 crc kubenswrapper[4823]: I1216 06:56:00.766285 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:00 crc kubenswrapper[4823]: I1216 06:56:00.766296 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:00Z","lastTransitionTime":"2025-12-16T06:56:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:00 crc kubenswrapper[4823]: I1216 06:56:00.871520 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:00 crc kubenswrapper[4823]: I1216 06:56:00.871581 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:00 crc kubenswrapper[4823]: I1216 06:56:00.871596 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:00 crc kubenswrapper[4823]: I1216 06:56:00.871619 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:00 crc kubenswrapper[4823]: I1216 06:56:00.871646 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:00Z","lastTransitionTime":"2025-12-16T06:56:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:00 crc kubenswrapper[4823]: I1216 06:56:00.974254 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:00 crc kubenswrapper[4823]: I1216 06:56:00.974290 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:00 crc kubenswrapper[4823]: I1216 06:56:00.974301 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:00 crc kubenswrapper[4823]: I1216 06:56:00.974317 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:00 crc kubenswrapper[4823]: I1216 06:56:00.974327 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:00Z","lastTransitionTime":"2025-12-16T06:56:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:01 crc kubenswrapper[4823]: I1216 06:56:01.076661 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:01 crc kubenswrapper[4823]: I1216 06:56:01.076688 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:01 crc kubenswrapper[4823]: I1216 06:56:01.076696 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:01 crc kubenswrapper[4823]: I1216 06:56:01.076710 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:01 crc kubenswrapper[4823]: I1216 06:56:01.076719 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:01Z","lastTransitionTime":"2025-12-16T06:56:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:01 crc kubenswrapper[4823]: I1216 06:56:01.179366 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:01 crc kubenswrapper[4823]: I1216 06:56:01.179408 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:01 crc kubenswrapper[4823]: I1216 06:56:01.179418 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:01 crc kubenswrapper[4823]: I1216 06:56:01.179433 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:01 crc kubenswrapper[4823]: I1216 06:56:01.179443 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:01Z","lastTransitionTime":"2025-12-16T06:56:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:01 crc kubenswrapper[4823]: I1216 06:56:01.281112 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:01 crc kubenswrapper[4823]: I1216 06:56:01.281155 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:01 crc kubenswrapper[4823]: I1216 06:56:01.281165 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:01 crc kubenswrapper[4823]: I1216 06:56:01.281180 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:01 crc kubenswrapper[4823]: I1216 06:56:01.281190 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:01Z","lastTransitionTime":"2025-12-16T06:56:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:01 crc kubenswrapper[4823]: I1216 06:56:01.384375 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:01 crc kubenswrapper[4823]: I1216 06:56:01.384432 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:01 crc kubenswrapper[4823]: I1216 06:56:01.384448 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:01 crc kubenswrapper[4823]: I1216 06:56:01.384471 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:01 crc kubenswrapper[4823]: I1216 06:56:01.384488 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:01Z","lastTransitionTime":"2025-12-16T06:56:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:01 crc kubenswrapper[4823]: I1216 06:56:01.395622 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1e7dd738-a9b3-455c-93e0-3f0dc7327817-metrics-certs\") pod \"network-metrics-daemon-8mn7l\" (UID: \"1e7dd738-a9b3-455c-93e0-3f0dc7327817\") " pod="openshift-multus/network-metrics-daemon-8mn7l" Dec 16 06:56:01 crc kubenswrapper[4823]: E1216 06:56:01.395799 4823 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 16 06:56:01 crc kubenswrapper[4823]: E1216 06:56:01.395880 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1e7dd738-a9b3-455c-93e0-3f0dc7327817-metrics-certs podName:1e7dd738-a9b3-455c-93e0-3f0dc7327817 nodeName:}" failed. No retries permitted until 2025-12-16 06:56:09.395858997 +0000 UTC m=+47.884425130 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1e7dd738-a9b3-455c-93e0-3f0dc7327817-metrics-certs") pod "network-metrics-daemon-8mn7l" (UID: "1e7dd738-a9b3-455c-93e0-3f0dc7327817") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 16 06:56:01 crc kubenswrapper[4823]: I1216 06:56:01.487048 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:01 crc kubenswrapper[4823]: I1216 06:56:01.487096 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:01 crc kubenswrapper[4823]: I1216 06:56:01.487104 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:01 crc kubenswrapper[4823]: I1216 06:56:01.487125 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:01 crc kubenswrapper[4823]: I1216 06:56:01.487136 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:01Z","lastTransitionTime":"2025-12-16T06:56:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:01 crc kubenswrapper[4823]: I1216 06:56:01.590625 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:01 crc kubenswrapper[4823]: I1216 06:56:01.590660 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:01 crc kubenswrapper[4823]: I1216 06:56:01.590672 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:01 crc kubenswrapper[4823]: I1216 06:56:01.590691 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:01 crc kubenswrapper[4823]: I1216 06:56:01.590703 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:01Z","lastTransitionTime":"2025-12-16T06:56:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:01 crc kubenswrapper[4823]: I1216 06:56:01.693215 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:01 crc kubenswrapper[4823]: I1216 06:56:01.693261 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:01 crc kubenswrapper[4823]: I1216 06:56:01.693279 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:01 crc kubenswrapper[4823]: I1216 06:56:01.693300 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:01 crc kubenswrapper[4823]: I1216 06:56:01.693316 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:01Z","lastTransitionTime":"2025-12-16T06:56:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:56:01 crc kubenswrapper[4823]: I1216 06:56:01.770936 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:56:01 crc kubenswrapper[4823]: I1216 06:56:01.770952 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:56:01 crc kubenswrapper[4823]: E1216 06:56:01.771134 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 06:56:01 crc kubenswrapper[4823]: E1216 06:56:01.771230 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 06:56:01 crc kubenswrapper[4823]: I1216 06:56:01.771244 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:56:01 crc kubenswrapper[4823]: E1216 06:56:01.771454 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 06:56:01 crc kubenswrapper[4823]: I1216 06:56:01.771241 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8mn7l" Dec 16 06:56:01 crc kubenswrapper[4823]: E1216 06:56:01.771660 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8mn7l" podUID="1e7dd738-a9b3-455c-93e0-3f0dc7327817" Dec 16 06:56:01 crc kubenswrapper[4823]: I1216 06:56:01.786737 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:01Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:01 crc kubenswrapper[4823]: I1216 06:56:01.795968 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:01 crc kubenswrapper[4823]: I1216 06:56:01.796060 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:01 crc kubenswrapper[4823]: I1216 06:56:01.796072 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:01 crc kubenswrapper[4823]: I1216 06:56:01.796130 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:01 crc kubenswrapper[4823]: I1216 06:56:01.796146 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:01Z","lastTransitionTime":"2025-12-16T06:56:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:56:01 crc kubenswrapper[4823]: I1216 06:56:01.801911 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63ba735c509770e9d43fe8152b2a42dc4123d0d797c7d41d5e3efe39a6cef74d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://e09486327bad82b741ca9260e306f84ed9f0fbc7058bb06267042b8d2cd5f818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:01Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:01 crc kubenswrapper[4823]: I1216 06:56:01.814299 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:01Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:01 crc kubenswrapper[4823]: I1216 06:56:01.824006 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72dfe197d43de2945a2ee22789ab86205c77588c4f4002e2473cef09e7b4b10b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-16T06:56:01Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:01 crc kubenswrapper[4823]: I1216 06:56:01.835063 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hr8h5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d31d032-9142-4e26-9f06-e3a5ea73d530\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://502e1707b8f53950b8f10fa6b289c9d235d6596fb6eec7e4d69c3158823f8e2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-7sdtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hr8h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:01Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:01 crc kubenswrapper[4823]: I1216 06:56:01.847858 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n248g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b377757-dbc6-4d9c-9656-3ff65d7d113a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78e6c48a7009b10b70f11bfc96f18839781aeccd794f9ce7a4074271cfb9bec
c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-2skkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n248g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:01Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:01 crc kubenswrapper[4823]: I1216 06:56:01.863111 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-964hc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a95a09e1-8457-4d5e-b1b6-dd6f59a66c6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bb1ede33a8f71c9116cfb9401b1f78a6aca07d580e4abe
93370b470d6b20284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b305ee45caf9aff77bb738777dd9b2ebd2e3e63727b460eae22f8f66d1e6cec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b305ee45caf9aff77bb738777dd9b2ebd2e3e63727b460eae22f8f66d1e6cec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\
\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d77b125c0d081148c1892229c933414ac8e1b8e0371f0932512deba1107e94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9d77b125c0d081148c1892229c933414ac8e1b8e0371f0932512deba1107e94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2048aefee6601e678e26821b3ed329aead74791b34654ef5dde9fe261e37b835\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c
78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2048aefee6601e678e26821b3ed329aead74791b34654ef5dde9fe261e37b835\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c574f5f8d3f82a7a21c7f8e49e7927387be9e4029e28ba265d39ea058fb6a60f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c574f5f8d3f82a7a21c7f8e49e7927387be9e4029e28ba265d39ea058fb6a60f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e32bdec2a02013c087ccc13325d44061d85bcf9a352f2dc31ed506f11db6428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e32bdec2a02013c087ccc13325d44061d85bcf9a352f2dc31ed506f11db6428\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91427f2f713f7e1c2ba16e87252e2fd0479a31e6f94884ad85fe9f99ab84bedd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91427f2f713f7e1c2ba16e87252e2fd0479a31e6f94884ad85fe9f99ab84bedd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-964hc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:01Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:01 crc kubenswrapper[4823]: I1216 06:56:01.875420 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bwcng" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ff057ef-c324-4465-8b8d-c7b98c25b23c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea36a181f0874fbab3022e6ce27567b65eb8c01cdb2925b08d9c0782af7e93c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2bj4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bwcng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:01Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:01 crc kubenswrapper[4823]: I1216 06:56:01.893543 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08e48f89-7095-4ea2-afb5-759591c2b0d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3d0f51f7ac1df7a9ae61306ccbfe2c93b397b6ab9ae0a477ab4eb46248d36f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b768d92db57918a41c53b83940957c0181865808eb2c795231a5f8dbbf2bde82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f2cea674c0936cca4e9e95dfb7cd15bf9ccb3fbf8b711f351f87b632aad65c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://055585c1217ea720b929c9da82e7ea662d6fd5634244d362b3b425faee380d92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:42Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1c6cb023dcc6d1b149de201b108e3b1f3e32530dadc822aedede5552efc586b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://304f9e1fee6b648f61b3b334284a954e8107ff70ca71d01b586d137490eeab20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a17f6108d8783f74d8244ae56a7e5fb1c1835c7cf4e48c6b0aaa5e8ec4a5c843\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a17f6108d8783f74d8244ae56a7e5fb1c1835c7cf4e48c6b0aaa5e8ec4a5c843\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T06:55:52Z\\\",\\\"message\\\":\\\"s{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1216 06:55:51.958057 6251 services_controller.go:444] Built service openshift-console/downloads LB per-node configs for network=default: []services.lbConfig(nil)\\\\nF1216 06:55:51.958062 6251 ovnkube.go:137] failed to run ovnkube: [failed to start network 
controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:51Z is after 2025-08-24T17:21:41Z]\\\\nI1216 06:55:51.958072 6251 obj_retry.go:303] Retry object setup: *v1.Pod openshift-image-registry/node-ca-bwcng\\\\nI1216 06:55:51.958052 6251 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:51Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zwjhk_openshift-ovn-kubernetes(08e48f89-7095-4ea2-afb5-759591c2b0d4)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c223f5207e2445fd4bfd1bfe2c1c1ca50b5211a92b02c21445534fbfa2709e28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e3566766c9c24f5c12481b74ed8a3938b21bdb36c003146515c67170d74a71f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e3566766c9c24f5c1
2481b74ed8a3938b21bdb36c003146515c67170d74a71f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zwjhk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:01Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:01 crc kubenswrapper[4823]: I1216 06:56:01.898008 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:01 crc kubenswrapper[4823]: I1216 06:56:01.898049 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:01 crc kubenswrapper[4823]: I1216 06:56:01.898060 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:01 crc kubenswrapper[4823]: I1216 06:56:01.898075 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:01 crc kubenswrapper[4823]: I1216 06:56:01.898085 4823 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:01Z","lastTransitionTime":"2025-12-16T06:56:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:56:01 crc kubenswrapper[4823]: I1216 06:56:01.909450 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e388fa291257a537c122a1f3f0eb6a60eb74e4bcaaf32c4fe829be2da1bfdae9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:01Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:01 crc kubenswrapper[4823]: I1216 06:56:01.923864 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:01Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:01 crc kubenswrapper[4823]: I1216 06:56:01.937014 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"25dec47c-3043-486c-b371-2be103c214e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2bf5eb4e2f587f7084f731ca681a116313b1014b02cdb391b09fdcdd42600c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmpf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78aed5cdf8d3a29dfffcdabfd18b38c0cf762258
53078e01602198bee4994536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmpf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fv56f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:01Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:01 crc kubenswrapper[4823]: I1216 06:56:01.956984 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v5mgh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"42574a3f-0701-4192-b16c-bdb9be6c2888\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3a26eba39f7b07ed59337e21910571235eaa797f43231f4962869caf58a9515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t54x6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e305106870ce1b62af84310a83d4ba0d529
312470315b95ae7e41e4f0d378e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t54x6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-v5mgh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:01Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:01 crc kubenswrapper[4823]: I1216 06:56:01.977480 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3de3a2ad-dc6c-47ba-af8d-4f128e025aad\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c40d3d73f70c8983adc8d076d89864e7224528dae97252396a1c34bd4cb804e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50bb40e9d22674554f2a267df7c9f924a29131ac986877bf19572cc5992ba396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9c22b0787554b4bdf0b0068fc696b8515e2ee63affef23e5eb64f77bc32a624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a32894b8f22c5335ed585c26fcab727324803d768943f40455a592025cbfd0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1560fb0ba0c40a2797688b518c22c9164a8cbe9a265cb2ad95408ba86b0fb537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3d6894d867564eea62c7d78dd6ca62a9913787ba06b786722e478785ba796ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3d6894d867564eea62c7d78dd6ca62a9913787ba06b786722e478785ba796ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-16T06:55:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e551b627e90b56385fe5618d3e66ea7376a2d574589583c005373aa29db1d410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e551b627e90b56385fe5618d3e66ea7376a2d574589583c005373aa29db1d410\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://631458d415d64c3cd4dfd24771644393aa6c01288de8f7a5395cf8888db67d2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://631458d415d64c3cd4dfd24771644393aa6c01288de8f7a5395cf8888db67d2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:01Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:01 crc kubenswrapper[4823]: I1216 06:56:01.992564 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4c70365-dff4-4b29-af25-657fd9823db4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59ddc7310ed1600d050e488762436763220f3c2946a4c2020346da4ccef46b1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://275328a704b475b037c82f92055f9ca7f83b9821325603c5e34090140cc486ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7fa96323e345f8891f51500618a7b07d128d3e7256a3534d9dca75ff8ce9edf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:
23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d706c1463d2c913afe727db1a87a2841698d91a586c84d6b55f02865e7f5584a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:01Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:02 crc kubenswrapper[4823]: I1216 06:56:02.003130 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:02 crc kubenswrapper[4823]: I1216 06:56:02.003298 
4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:02 crc kubenswrapper[4823]: I1216 06:56:02.003314 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:02 crc kubenswrapper[4823]: I1216 06:56:02.003336 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:02 crc kubenswrapper[4823]: I1216 06:56:02.003351 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:02Z","lastTransitionTime":"2025-12-16T06:56:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:56:02 crc kubenswrapper[4823]: I1216 06:56:02.003985 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8mn7l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e7dd738-a9b3-455c-93e0-3f0dc7327817\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2crtc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2crtc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8mn7l\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:02Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:02 crc kubenswrapper[4823]: I1216 06:56:02.018104 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c915dd50-9820-494e-b47a-987257910a57\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c33a30aed9756ef012ff9046309ec07897de3b17707ea39c621534ed863c3adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://464ab3cf14920ae3f947ce3dccacd7379d50380699008411d0a51bd83fe3e51c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20f7e0e3c07f467643afdcc6f0e333353d2d10b6d0c0d7aff0d347d33a5d925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ddfb7d6dae98465369260c8e9c64a2aa21dbf226e7898f6de0c36b3806bcae8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cl
uster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c9e6ec333ac552dc972c31cce23104cc27bcc2cf11bdc52d89ab36172915a41\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"ed a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1853467785/tls.crt::/tmp/serving-cert-1853467785/tls.key\\\\\\\"\\\\nI1216 06:55:39.731946 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 06:55:39.736431 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 06:55:39.736560 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 06:55:39.736611 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 06:55:39.736722 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 06:55:39.741676 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 06:55:39.741701 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 06:55:39.741707 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 06:55:39.741713 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 06:55:39.741717 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 06:55:39.741721 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 06:55:39.741725 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 
06:55:39.741774 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 06:55:39.744309 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1853467785/tls.crt::/tmp/serving-cert-1853467785/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765868124\\\\\\\\\\\\\\\" (2025-12-16 06:55:23 +0000 UTC to 2026-01-15 06:55:24 +0000 UTC (now=2025-12-16 06:55:39.744275203 +0000 UTC))\\\\\\\"\\\\nF1216 06:55:39.744562 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c84e7f6950c8d433823b7d5ae3d8a9ecaa70ab6845ce607b8bb93efa990325a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84f
11d2687959c8525b131966a998af137caa69a480bb31a94e1615a3ccbea5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84f11d2687959c8525b131966a998af137caa69a480bb31a94e1615a3ccbea5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:02Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:02 crc kubenswrapper[4823]: I1216 06:56:02.105190 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:02 crc kubenswrapper[4823]: I1216 06:56:02.105248 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:02 crc kubenswrapper[4823]: I1216 06:56:02.105260 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:02 crc kubenswrapper[4823]: I1216 06:56:02.105276 4823 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:02 crc kubenswrapper[4823]: I1216 06:56:02.105330 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:02Z","lastTransitionTime":"2025-12-16T06:56:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:56:02 crc kubenswrapper[4823]: I1216 06:56:02.208641 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:02 crc kubenswrapper[4823]: I1216 06:56:02.208674 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:02 crc kubenswrapper[4823]: I1216 06:56:02.208684 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:02 crc kubenswrapper[4823]: I1216 06:56:02.208698 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:02 crc kubenswrapper[4823]: I1216 06:56:02.208708 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:02Z","lastTransitionTime":"2025-12-16T06:56:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:02 crc kubenswrapper[4823]: I1216 06:56:02.311662 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:02 crc kubenswrapper[4823]: I1216 06:56:02.311744 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:02 crc kubenswrapper[4823]: I1216 06:56:02.311768 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:02 crc kubenswrapper[4823]: I1216 06:56:02.311805 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:02 crc kubenswrapper[4823]: I1216 06:56:02.311832 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:02Z","lastTransitionTime":"2025-12-16T06:56:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:02 crc kubenswrapper[4823]: I1216 06:56:02.414793 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:02 crc kubenswrapper[4823]: I1216 06:56:02.414838 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:02 crc kubenswrapper[4823]: I1216 06:56:02.414850 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:02 crc kubenswrapper[4823]: I1216 06:56:02.414867 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:02 crc kubenswrapper[4823]: I1216 06:56:02.414878 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:02Z","lastTransitionTime":"2025-12-16T06:56:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:02 crc kubenswrapper[4823]: I1216 06:56:02.517711 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:02 crc kubenswrapper[4823]: I1216 06:56:02.517795 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:02 crc kubenswrapper[4823]: I1216 06:56:02.517821 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:02 crc kubenswrapper[4823]: I1216 06:56:02.517845 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:02 crc kubenswrapper[4823]: I1216 06:56:02.517862 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:02Z","lastTransitionTime":"2025-12-16T06:56:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:02 crc kubenswrapper[4823]: I1216 06:56:02.621111 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:02 crc kubenswrapper[4823]: I1216 06:56:02.621189 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:02 crc kubenswrapper[4823]: I1216 06:56:02.621211 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:02 crc kubenswrapper[4823]: I1216 06:56:02.621235 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:02 crc kubenswrapper[4823]: I1216 06:56:02.621250 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:02Z","lastTransitionTime":"2025-12-16T06:56:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:02 crc kubenswrapper[4823]: I1216 06:56:02.723777 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:02 crc kubenswrapper[4823]: I1216 06:56:02.723809 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:02 crc kubenswrapper[4823]: I1216 06:56:02.723817 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:02 crc kubenswrapper[4823]: I1216 06:56:02.723832 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:02 crc kubenswrapper[4823]: I1216 06:56:02.723951 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:02Z","lastTransitionTime":"2025-12-16T06:56:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:02 crc kubenswrapper[4823]: I1216 06:56:02.826594 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:02 crc kubenswrapper[4823]: I1216 06:56:02.826626 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:02 crc kubenswrapper[4823]: I1216 06:56:02.826635 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:02 crc kubenswrapper[4823]: I1216 06:56:02.826650 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:02 crc kubenswrapper[4823]: I1216 06:56:02.826661 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:02Z","lastTransitionTime":"2025-12-16T06:56:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:02 crc kubenswrapper[4823]: I1216 06:56:02.929366 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:02 crc kubenswrapper[4823]: I1216 06:56:02.929396 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:02 crc kubenswrapper[4823]: I1216 06:56:02.929405 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:02 crc kubenswrapper[4823]: I1216 06:56:02.929419 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:02 crc kubenswrapper[4823]: I1216 06:56:02.929427 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:02Z","lastTransitionTime":"2025-12-16T06:56:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:03 crc kubenswrapper[4823]: I1216 06:56:03.032525 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:03 crc kubenswrapper[4823]: I1216 06:56:03.032577 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:03 crc kubenswrapper[4823]: I1216 06:56:03.032594 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:03 crc kubenswrapper[4823]: I1216 06:56:03.032619 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:03 crc kubenswrapper[4823]: I1216 06:56:03.032636 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:03Z","lastTransitionTime":"2025-12-16T06:56:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:03 crc kubenswrapper[4823]: I1216 06:56:03.040043 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:03 crc kubenswrapper[4823]: I1216 06:56:03.040097 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:03 crc kubenswrapper[4823]: I1216 06:56:03.040111 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:03 crc kubenswrapper[4823]: I1216 06:56:03.040131 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:03 crc kubenswrapper[4823]: I1216 06:56:03.040146 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:03Z","lastTransitionTime":"2025-12-16T06:56:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:03 crc kubenswrapper[4823]: E1216 06:56:03.057969 4823 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:56:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:56:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:56:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:56:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:56:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:56:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:56:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:56:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2caa91d7-bd83-4de0-9038-0514886c6d71\\\",\\\"systemUUID\\\":\\\"b35231f6-d02a-487d-8117-57547d768cbe\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:03Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:03 crc kubenswrapper[4823]: I1216 06:56:03.063107 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:03 crc kubenswrapper[4823]: I1216 06:56:03.063373 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:03 crc kubenswrapper[4823]: I1216 06:56:03.063505 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:03 crc kubenswrapper[4823]: I1216 06:56:03.063639 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:03 crc kubenswrapper[4823]: I1216 06:56:03.063781 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:03Z","lastTransitionTime":"2025-12-16T06:56:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:03 crc kubenswrapper[4823]: E1216 06:56:03.079739 4823 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:56:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:56:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:56:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:56:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:56:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:56:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:56:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:56:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2caa91d7-bd83-4de0-9038-0514886c6d71\\\",\\\"systemUUID\\\":\\\"b35231f6-d02a-487d-8117-57547d768cbe\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:03Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:03 crc kubenswrapper[4823]: I1216 06:56:03.083946 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:03 crc kubenswrapper[4823]: I1216 06:56:03.083990 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:03 crc kubenswrapper[4823]: I1216 06:56:03.084003 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:03 crc kubenswrapper[4823]: I1216 06:56:03.084024 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:03 crc kubenswrapper[4823]: I1216 06:56:03.084048 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:03Z","lastTransitionTime":"2025-12-16T06:56:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:03 crc kubenswrapper[4823]: E1216 06:56:03.099374 4823 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:56:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:56:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:56:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:56:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:56:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:56:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:56:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:56:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2caa91d7-bd83-4de0-9038-0514886c6d71\\\",\\\"systemUUID\\\":\\\"b35231f6-d02a-487d-8117-57547d768cbe\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:03Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:03 crc kubenswrapper[4823]: I1216 06:56:03.109469 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:03 crc kubenswrapper[4823]: I1216 06:56:03.109517 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:03 crc kubenswrapper[4823]: I1216 06:56:03.109529 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:03 crc kubenswrapper[4823]: I1216 06:56:03.109546 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:03 crc kubenswrapper[4823]: I1216 06:56:03.109556 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:03Z","lastTransitionTime":"2025-12-16T06:56:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:03 crc kubenswrapper[4823]: I1216 06:56:03.128497 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:03 crc kubenswrapper[4823]: I1216 06:56:03.128531 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:03 crc kubenswrapper[4823]: I1216 06:56:03.128560 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:03 crc kubenswrapper[4823]: I1216 06:56:03.128578 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:03 crc kubenswrapper[4823]: I1216 06:56:03.128589 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:03Z","lastTransitionTime":"2025-12-16T06:56:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:03 crc kubenswrapper[4823]: E1216 06:56:03.141171 4823 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 16 06:56:03 crc kubenswrapper[4823]: I1216 06:56:03.143702 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:03 crc kubenswrapper[4823]: I1216 06:56:03.143763 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:03 crc kubenswrapper[4823]: I1216 06:56:03.143777 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:03 crc kubenswrapper[4823]: I1216 06:56:03.143797 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:03 crc kubenswrapper[4823]: I1216 06:56:03.143811 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:03Z","lastTransitionTime":"2025-12-16T06:56:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:03 crc kubenswrapper[4823]: I1216 06:56:03.246318 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:03 crc kubenswrapper[4823]: I1216 06:56:03.246404 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:03 crc kubenswrapper[4823]: I1216 06:56:03.246437 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:03 crc kubenswrapper[4823]: I1216 06:56:03.246466 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:03 crc kubenswrapper[4823]: I1216 06:56:03.246486 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:03Z","lastTransitionTime":"2025-12-16T06:56:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:03 crc kubenswrapper[4823]: I1216 06:56:03.348991 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:03 crc kubenswrapper[4823]: I1216 06:56:03.349053 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:03 crc kubenswrapper[4823]: I1216 06:56:03.349064 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:03 crc kubenswrapper[4823]: I1216 06:56:03.349083 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:03 crc kubenswrapper[4823]: I1216 06:56:03.349095 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:03Z","lastTransitionTime":"2025-12-16T06:56:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:03 crc kubenswrapper[4823]: I1216 06:56:03.452903 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:03 crc kubenswrapper[4823]: I1216 06:56:03.452953 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:03 crc kubenswrapper[4823]: I1216 06:56:03.452964 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:03 crc kubenswrapper[4823]: I1216 06:56:03.452983 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:03 crc kubenswrapper[4823]: I1216 06:56:03.452993 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:03Z","lastTransitionTime":"2025-12-16T06:56:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:03 crc kubenswrapper[4823]: I1216 06:56:03.555522 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:03 crc kubenswrapper[4823]: I1216 06:56:03.555584 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:03 crc kubenswrapper[4823]: I1216 06:56:03.555604 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:03 crc kubenswrapper[4823]: I1216 06:56:03.555628 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:03 crc kubenswrapper[4823]: I1216 06:56:03.555645 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:03Z","lastTransitionTime":"2025-12-16T06:56:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:03 crc kubenswrapper[4823]: I1216 06:56:03.658186 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:03 crc kubenswrapper[4823]: I1216 06:56:03.658250 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:03 crc kubenswrapper[4823]: I1216 06:56:03.658275 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:03 crc kubenswrapper[4823]: I1216 06:56:03.658296 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:03 crc kubenswrapper[4823]: I1216 06:56:03.658306 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:03Z","lastTransitionTime":"2025-12-16T06:56:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:03 crc kubenswrapper[4823]: I1216 06:56:03.760537 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:03 crc kubenswrapper[4823]: I1216 06:56:03.760592 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:03 crc kubenswrapper[4823]: I1216 06:56:03.760600 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:03 crc kubenswrapper[4823]: I1216 06:56:03.760616 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:03 crc kubenswrapper[4823]: I1216 06:56:03.760627 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:03Z","lastTransitionTime":"2025-12-16T06:56:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:56:03 crc kubenswrapper[4823]: I1216 06:56:03.770928 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:56:03 crc kubenswrapper[4823]: I1216 06:56:03.770984 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8mn7l" Dec 16 06:56:03 crc kubenswrapper[4823]: I1216 06:56:03.771013 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:56:03 crc kubenswrapper[4823]: I1216 06:56:03.770927 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:56:03 crc kubenswrapper[4823]: E1216 06:56:03.771153 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 06:56:03 crc kubenswrapper[4823]: E1216 06:56:03.771325 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8mn7l" podUID="1e7dd738-a9b3-455c-93e0-3f0dc7327817" Dec 16 06:56:03 crc kubenswrapper[4823]: E1216 06:56:03.771516 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 06:56:03 crc kubenswrapper[4823]: E1216 06:56:03.771702 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 06:56:03 crc kubenswrapper[4823]: I1216 06:56:03.863731 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:03 crc kubenswrapper[4823]: I1216 06:56:03.863789 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:03 crc kubenswrapper[4823]: I1216 06:56:03.863806 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:03 crc kubenswrapper[4823]: I1216 06:56:03.863824 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:03 crc kubenswrapper[4823]: I1216 06:56:03.863838 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:03Z","lastTransitionTime":"2025-12-16T06:56:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:03 crc kubenswrapper[4823]: I1216 06:56:03.966653 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:03 crc kubenswrapper[4823]: I1216 06:56:03.966726 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:03 crc kubenswrapper[4823]: I1216 06:56:03.966746 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:03 crc kubenswrapper[4823]: I1216 06:56:03.966773 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:03 crc kubenswrapper[4823]: I1216 06:56:03.966792 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:03Z","lastTransitionTime":"2025-12-16T06:56:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:04 crc kubenswrapper[4823]: I1216 06:56:04.070086 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:04 crc kubenswrapper[4823]: I1216 06:56:04.070128 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:04 crc kubenswrapper[4823]: I1216 06:56:04.070137 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:04 crc kubenswrapper[4823]: I1216 06:56:04.070152 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:04 crc kubenswrapper[4823]: I1216 06:56:04.070160 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:04Z","lastTransitionTime":"2025-12-16T06:56:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:04 crc kubenswrapper[4823]: I1216 06:56:04.173305 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:04 crc kubenswrapper[4823]: I1216 06:56:04.173365 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:04 crc kubenswrapper[4823]: I1216 06:56:04.173380 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:04 crc kubenswrapper[4823]: I1216 06:56:04.173399 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:04 crc kubenswrapper[4823]: I1216 06:56:04.173412 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:04Z","lastTransitionTime":"2025-12-16T06:56:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:04 crc kubenswrapper[4823]: I1216 06:56:04.275843 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:04 crc kubenswrapper[4823]: I1216 06:56:04.275884 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:04 crc kubenswrapper[4823]: I1216 06:56:04.275892 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:04 crc kubenswrapper[4823]: I1216 06:56:04.275908 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:04 crc kubenswrapper[4823]: I1216 06:56:04.275917 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:04Z","lastTransitionTime":"2025-12-16T06:56:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:04 crc kubenswrapper[4823]: I1216 06:56:04.379234 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:04 crc kubenswrapper[4823]: I1216 06:56:04.379918 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:04 crc kubenswrapper[4823]: I1216 06:56:04.379973 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:04 crc kubenswrapper[4823]: I1216 06:56:04.379999 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:04 crc kubenswrapper[4823]: I1216 06:56:04.380012 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:04Z","lastTransitionTime":"2025-12-16T06:56:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:04 crc kubenswrapper[4823]: I1216 06:56:04.482886 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:04 crc kubenswrapper[4823]: I1216 06:56:04.482932 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:04 crc kubenswrapper[4823]: I1216 06:56:04.482944 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:04 crc kubenswrapper[4823]: I1216 06:56:04.482960 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:04 crc kubenswrapper[4823]: I1216 06:56:04.482973 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:04Z","lastTransitionTime":"2025-12-16T06:56:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:04 crc kubenswrapper[4823]: I1216 06:56:04.586375 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:04 crc kubenswrapper[4823]: I1216 06:56:04.586440 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:04 crc kubenswrapper[4823]: I1216 06:56:04.586450 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:04 crc kubenswrapper[4823]: I1216 06:56:04.586468 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:04 crc kubenswrapper[4823]: I1216 06:56:04.586479 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:04Z","lastTransitionTime":"2025-12-16T06:56:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:04 crc kubenswrapper[4823]: I1216 06:56:04.688948 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:04 crc kubenswrapper[4823]: I1216 06:56:04.688989 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:04 crc kubenswrapper[4823]: I1216 06:56:04.689006 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:04 crc kubenswrapper[4823]: I1216 06:56:04.689043 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:04 crc kubenswrapper[4823]: I1216 06:56:04.689056 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:04Z","lastTransitionTime":"2025-12-16T06:56:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:04 crc kubenswrapper[4823]: I1216 06:56:04.791583 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:04 crc kubenswrapper[4823]: I1216 06:56:04.791629 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:04 crc kubenswrapper[4823]: I1216 06:56:04.791642 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:04 crc kubenswrapper[4823]: I1216 06:56:04.791658 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:04 crc kubenswrapper[4823]: I1216 06:56:04.791670 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:04Z","lastTransitionTime":"2025-12-16T06:56:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:04 crc kubenswrapper[4823]: I1216 06:56:04.893928 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:04 crc kubenswrapper[4823]: I1216 06:56:04.893967 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:04 crc kubenswrapper[4823]: I1216 06:56:04.893975 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:04 crc kubenswrapper[4823]: I1216 06:56:04.893990 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:04 crc kubenswrapper[4823]: I1216 06:56:04.894000 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:04Z","lastTransitionTime":"2025-12-16T06:56:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:04 crc kubenswrapper[4823]: I1216 06:56:04.996425 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:04 crc kubenswrapper[4823]: I1216 06:56:04.996465 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:04 crc kubenswrapper[4823]: I1216 06:56:04.996476 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:04 crc kubenswrapper[4823]: I1216 06:56:04.996492 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:04 crc kubenswrapper[4823]: I1216 06:56:04.996502 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:04Z","lastTransitionTime":"2025-12-16T06:56:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:05 crc kubenswrapper[4823]: I1216 06:56:05.101457 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:05 crc kubenswrapper[4823]: I1216 06:56:05.101505 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:05 crc kubenswrapper[4823]: I1216 06:56:05.101523 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:05 crc kubenswrapper[4823]: I1216 06:56:05.101542 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:05 crc kubenswrapper[4823]: I1216 06:56:05.101561 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:05Z","lastTransitionTime":"2025-12-16T06:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:05 crc kubenswrapper[4823]: I1216 06:56:05.206144 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:05 crc kubenswrapper[4823]: I1216 06:56:05.206182 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:05 crc kubenswrapper[4823]: I1216 06:56:05.206192 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:05 crc kubenswrapper[4823]: I1216 06:56:05.206208 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:05 crc kubenswrapper[4823]: I1216 06:56:05.206219 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:05Z","lastTransitionTime":"2025-12-16T06:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:05 crc kubenswrapper[4823]: I1216 06:56:05.309359 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:05 crc kubenswrapper[4823]: I1216 06:56:05.309407 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:05 crc kubenswrapper[4823]: I1216 06:56:05.309419 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:05 crc kubenswrapper[4823]: I1216 06:56:05.309437 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:05 crc kubenswrapper[4823]: I1216 06:56:05.309450 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:05Z","lastTransitionTime":"2025-12-16T06:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:05 crc kubenswrapper[4823]: I1216 06:56:05.411603 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:05 crc kubenswrapper[4823]: I1216 06:56:05.411650 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:05 crc kubenswrapper[4823]: I1216 06:56:05.411662 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:05 crc kubenswrapper[4823]: I1216 06:56:05.411679 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:05 crc kubenswrapper[4823]: I1216 06:56:05.411688 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:05Z","lastTransitionTime":"2025-12-16T06:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:05 crc kubenswrapper[4823]: I1216 06:56:05.514105 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:05 crc kubenswrapper[4823]: I1216 06:56:05.514150 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:05 crc kubenswrapper[4823]: I1216 06:56:05.514159 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:05 crc kubenswrapper[4823]: I1216 06:56:05.514174 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:05 crc kubenswrapper[4823]: I1216 06:56:05.514183 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:05Z","lastTransitionTime":"2025-12-16T06:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:05 crc kubenswrapper[4823]: I1216 06:56:05.617144 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:05 crc kubenswrapper[4823]: I1216 06:56:05.617197 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:05 crc kubenswrapper[4823]: I1216 06:56:05.617211 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:05 crc kubenswrapper[4823]: I1216 06:56:05.617233 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:05 crc kubenswrapper[4823]: I1216 06:56:05.617247 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:05Z","lastTransitionTime":"2025-12-16T06:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:05 crc kubenswrapper[4823]: I1216 06:56:05.719581 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:05 crc kubenswrapper[4823]: I1216 06:56:05.719626 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:05 crc kubenswrapper[4823]: I1216 06:56:05.719636 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:05 crc kubenswrapper[4823]: I1216 06:56:05.719651 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:05 crc kubenswrapper[4823]: I1216 06:56:05.719665 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:05Z","lastTransitionTime":"2025-12-16T06:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:56:05 crc kubenswrapper[4823]: I1216 06:56:05.770872 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8mn7l" Dec 16 06:56:05 crc kubenswrapper[4823]: I1216 06:56:05.770982 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:56:05 crc kubenswrapper[4823]: E1216 06:56:05.771054 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8mn7l" podUID="1e7dd738-a9b3-455c-93e0-3f0dc7327817" Dec 16 06:56:05 crc kubenswrapper[4823]: I1216 06:56:05.771103 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:56:05 crc kubenswrapper[4823]: I1216 06:56:05.771044 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:56:05 crc kubenswrapper[4823]: E1216 06:56:05.771184 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 06:56:05 crc kubenswrapper[4823]: E1216 06:56:05.771261 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 06:56:05 crc kubenswrapper[4823]: E1216 06:56:05.771924 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 06:56:05 crc kubenswrapper[4823]: I1216 06:56:05.772086 4823 scope.go:117] "RemoveContainer" containerID="a17f6108d8783f74d8244ae56a7e5fb1c1835c7cf4e48c6b0aaa5e8ec4a5c843" Dec 16 06:56:05 crc kubenswrapper[4823]: I1216 06:56:05.823277 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:05 crc kubenswrapper[4823]: I1216 06:56:05.823580 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:05 crc kubenswrapper[4823]: I1216 06:56:05.823592 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:05 crc kubenswrapper[4823]: I1216 06:56:05.823610 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:05 crc kubenswrapper[4823]: I1216 06:56:05.823622 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:05Z","lastTransitionTime":"2025-12-16T06:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:05 crc kubenswrapper[4823]: I1216 06:56:05.925496 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:05 crc kubenswrapper[4823]: I1216 06:56:05.925532 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:05 crc kubenswrapper[4823]: I1216 06:56:05.925543 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:05 crc kubenswrapper[4823]: I1216 06:56:05.925560 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:05 crc kubenswrapper[4823]: I1216 06:56:05.925574 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:05Z","lastTransitionTime":"2025-12-16T06:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:06 crc kubenswrapper[4823]: I1216 06:56:06.027971 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:06 crc kubenswrapper[4823]: I1216 06:56:06.028620 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:06 crc kubenswrapper[4823]: I1216 06:56:06.028646 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:06 crc kubenswrapper[4823]: I1216 06:56:06.028667 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:06 crc kubenswrapper[4823]: I1216 06:56:06.028682 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:06Z","lastTransitionTime":"2025-12-16T06:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:06 crc kubenswrapper[4823]: I1216 06:56:06.120933 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zwjhk_08e48f89-7095-4ea2-afb5-759591c2b0d4/ovnkube-controller/1.log" Dec 16 06:56:06 crc kubenswrapper[4823]: I1216 06:56:06.123169 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" event={"ID":"08e48f89-7095-4ea2-afb5-759591c2b0d4","Type":"ContainerStarted","Data":"0ac6ba9c9aa8e7822590e52644019034f71acb4dbc336efacda606a8be00a05c"} Dec 16 06:56:06 crc kubenswrapper[4823]: I1216 06:56:06.124135 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" Dec 16 06:56:06 crc kubenswrapper[4823]: I1216 06:56:06.130224 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:06 crc kubenswrapper[4823]: I1216 06:56:06.130258 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:06 crc kubenswrapper[4823]: I1216 06:56:06.130269 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:06 crc kubenswrapper[4823]: I1216 06:56:06.130286 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:06 crc kubenswrapper[4823]: I1216 06:56:06.130297 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:06Z","lastTransitionTime":"2025-12-16T06:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:06 crc kubenswrapper[4823]: I1216 06:56:06.137554 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c915dd50-9820-494e-b47a-987257910a57\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c33a30aed9756ef012ff9046309ec07897de3b17707ea39c621534ed863c3adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://464ab3cf14920ae3f947ce3dccacd7379d50380699008411d0a51bd83fe3e51c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20f7e0e3c07f467643afdcc6f0e333353d2d10b6d0c0d7aff0d347d33a5d925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ddfb7d6dae98465369260c8e9c64a2aa21dbf226e7898f6de0c36b3806bcae8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c9e6ec333ac552dc972c31cce23104cc27bcc2cf11bdc52d89ab36172915a41\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"ed a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1853467785/tls.crt::/tmp/serving-cert-1853467785/tls.key\\\\\\\"\\\\nI1216 06:55:39.731946 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 06:55:39.736431 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 06:55:39.736560 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 06:55:39.736611 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 06:55:39.736722 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 06:55:39.741676 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 06:55:39.741701 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 06:55:39.741707 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 06:55:39.741713 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 06:55:39.741717 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 06:55:39.741721 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 06:55:39.741725 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 06:55:39.741774 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 06:55:39.744309 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" 
certName=\\\\\\\"serving-cert::/tmp/serving-cert-1853467785/tls.crt::/tmp/serving-cert-1853467785/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765868124\\\\\\\\\\\\\\\" (2025-12-16 06:55:23 +0000 UTC to 2026-01-15 06:55:24 +0000 UTC (now=2025-12-16 06:55:39.744275203 +0000 UTC))\\\\\\\"\\\\nF1216 06:55:39.744562 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c84e7f6950c8d433823b7d5ae3d8a9ecaa70ab6845ce607b8bb93efa990325a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84f11d2687959c8525b131966a998af137caa69a480bb31a94e1615a3ccbea5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"image
ID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84f11d2687959c8525b131966a998af137caa69a480bb31a94e1615a3ccbea5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:06Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:06 crc kubenswrapper[4823]: I1216 06:56:06.149944 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4c70365-dff4-4b29-af25-657fd9823db4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59ddc7310ed1600d050e488762436763220f3c2946a4c2020346da4ccef46b1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://275328a704b475b037c82f92055f9ca7f83b9821325603c5e34090140cc486ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7fa96323e345f8891f51500618a7b07d128d3e7256a3534d9dca75ff8ce9edf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d706c1463d2c913afe727db1a87a2841698d91a586c84d6b55f02865e7f5584a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:06Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:06 crc kubenswrapper[4823]: I1216 06:56:06.165728 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8mn7l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e7dd738-a9b3-455c-93e0-3f0dc7327817\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2crtc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2crtc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8mn7l\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:06Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:06 crc kubenswrapper[4823]: I1216 06:56:06.182152 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-964hc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a95a09e1-8457-4d5e-b1b6-dd6f59a66c6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bb1ede33a8f71c9116cfb9401b1f78a6aca07d580e4abe93370b470d6b20284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\
":{\\\"startedAt\\\":\\\"2025-12-16T06:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b305ee45caf9aff77bb738777dd9b2ebd2e3e63727b460eae22f8f66d1e6cec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b305ee45caf9aff77bb738777dd9b2ebd2e3e63727b460eae22f8f66d1e6cec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d77b125c0d081148c1892229c933414ac8e1b8e0371f0932512deba1107e94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9d77b125c0d081148c1892229c933414ac8e1b8e0371f0932512deba1107e94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2048aefee6601e678e26821b3ed329aead74791b34654ef5dde9fe261e37b835\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2048aefee6601e678e26821b3ed329aead74791b34654ef5dde9fe261e37b835\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c574f5f8d3f82a7a21c7f8e49e7927387be9e4029e28ba265d39ea058fb6a60f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c574f5f8d3f82a7a21c7f8e49e7927387be9e4029e28ba265d39ea058fb6a60f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e32bdec2a02013c087ccc13325d44061d85bcf9a352f2dc31ed506f11db6428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e32bdec2a02013c087ccc13325d44061d85bcf9a352f2dc31ed506f11db6428\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91427f2f713f7e1c2ba16e87252e2fd0479a31e6f94884ad85fe9f99ab84bedd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91427f2f713f7e1c2ba16e87252e2fd0479a31e6f94884ad85fe9f99ab84bedd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"sys
tem-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-964hc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:06Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:06 crc kubenswrapper[4823]: I1216 06:56:06.195209 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:06Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:06 crc kubenswrapper[4823]: I1216 06:56:06.210934 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63ba735c509770e9d43fe8152b2a42dc4123d0d797c7d41d5e3efe39a6cef74d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e09486327bad82b741ca9260e306f84ed9f0fbc7058bb06267042b8d2cd5f818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:06Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:06 crc kubenswrapper[4823]: I1216 06:56:06.225311 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:06Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:06 crc kubenswrapper[4823]: I1216 06:56:06.233123 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:06 crc kubenswrapper[4823]: I1216 06:56:06.233164 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:06 crc 
kubenswrapper[4823]: I1216 06:56:06.233173 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:06 crc kubenswrapper[4823]: I1216 06:56:06.233192 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:06 crc kubenswrapper[4823]: I1216 06:56:06.233204 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:06Z","lastTransitionTime":"2025-12-16T06:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:56:06 crc kubenswrapper[4823]: I1216 06:56:06.240128 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72dfe197d43de2945a2ee22789ab86205c77588c4f4002e2473cef09e7b4b10b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-16T06:56:06Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:06 crc kubenswrapper[4823]: I1216 06:56:06.264423 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hr8h5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d31d032-9142-4e26-9f06-e3a5ea73d530\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://502e1707b8f53950b8f10fa6b289c9d235d6596fb6eec7e4d69c3158823f8e2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-7sdtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hr8h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:06Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:06 crc kubenswrapper[4823]: I1216 06:56:06.277884 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n248g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b377757-dbc6-4d9c-9656-3ff65d7d113a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78e6c48a7009b10b70f11bfc96f18839781aeccd794f9ce7a4074271cfb9bec
c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-2skkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n248g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:06Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:06 crc kubenswrapper[4823]: I1216 06:56:06.297320 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bwcng" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ff057ef-c324-4465-8b8d-c7b98c25b23c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea36a181f0874fbab3022e6ce27567b65eb8c01cdb2925b08d9c0782af7e9
3c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2bj4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bwcng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:06Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:06 crc kubenswrapper[4823]: I1216 06:56:06.323438 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"08e48f89-7095-4ea2-afb5-759591c2b0d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3d0f51f7ac1df7a9ae61306ccbfe2c93b397b6ab9ae0a477ab4eb46248d36f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b768d92db57918a41c53b83940957c0181865808eb2c795231a5f8dbbf2bde82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f2cea674c0936cca4e9e95dfb7cd15bf9ccb3fbf8b711f351f87b632aad65c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://055585c1217ea720b929c9da82e7ea662d6fd5634244d362b3b425faee380d92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:42Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1c6cb023dcc6d1b149de201b108e3b1f3e32530dadc822aedede5552efc586b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://304f9e1fee6b648f61b3b334284a954e8107ff70ca71d01b586d137490eeab20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ac6ba9c9aa8e7822590e52644019034f71acb4dbc336efacda606a8be00a05c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a17f6108d8783f74d8244ae56a7e5fb1c1835c7cf4e48c6b0aaa5e8ec4a5c843\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T06:55:52Z\\\",\\\"message\\\":\\\"s{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1216 06:55:51.958057 6251 services_controller.go:444] Built service openshift-console/downloads LB per-node configs for network=default: []services.lbConfig(nil)\\\\nF1216 06:55:51.958062 6251 ovnkube.go:137] failed to run ovnkube: [failed to start network 
controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:51Z is after 2025-08-24T17:21:41Z]\\\\nI1216 06:55:51.958072 6251 obj_retry.go:303] Retry object setup: *v1.Pod openshift-image-registry/node-ca-bwcng\\\\nI1216 06:55:51.958052 6251 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer 
Row:map[external_ids:{GoMap:map[k8s.ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:51Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:56:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"
/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c223f5207e2445fd4bfd1bfe2c1c1ca50b5211a92b02c21445534fbfa2709e28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e3566766c9c24f5c12481b74ed8a3938b21bdb36c003146515c67170d74a71f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e3566766c9c24f5c12481b74ed8a3938b21bdb36c003146515c67170d74a71f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zwjhk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:06Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:06 crc kubenswrapper[4823]: I1216 06:56:06.336279 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:06 crc kubenswrapper[4823]: I1216 06:56:06.336331 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:06 crc kubenswrapper[4823]: I1216 06:56:06.336342 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:06 crc kubenswrapper[4823]: I1216 06:56:06.336359 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:06 crc kubenswrapper[4823]: I1216 06:56:06.336371 4823 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:06Z","lastTransitionTime":"2025-12-16T06:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:56:06 crc kubenswrapper[4823]: I1216 06:56:06.347683 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3de3a2ad-dc6c-47ba-af8d-4f128e025aad\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c40d3d73f70c8983adc8d076d89864e7224528dae97252396a1c34bd4cb804e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"sta
rtedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50bb40e9d22674554f2a267df7c9f924a29131ac986877bf19572cc5992ba396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9c22b0787554b4bdf0b0068fc696b8515e2ee63affef23e5eb64f77bc32a624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kub
ernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a32894b8f22c5335ed585c26fcab727324803d768943f40455a592025cbfd0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1560fb0ba0c40a2797688b518c22c9164a8cbe9a265cb2ad95408ba86b0fb537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3d6894d867564eea62c7d78dd6ca62a9913787ba06b786722e47878
5ba796ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3d6894d867564eea62c7d78dd6ca62a9913787ba06b786722e478785ba796ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e551b627e90b56385fe5618d3e66ea7376a2d574589583c005373aa29db1d410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e551b627e90b56385fe5618d3e66ea7376a2d574589583c005373aa29db1d410\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://631458d415d64c3cd4dfd24771644393aa6c01288de8f7a5395cf8888db67d2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://631458d415d64c3cd4dfd24771644393aa6c01288de8f7a5395cf8888db67d2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:06Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:06 crc kubenswrapper[4823]: I1216 06:56:06.362123 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e388fa291257a537c122a1f3f0eb6a60eb74e4bcaaf32c4fe829be2da1bfdae9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:06Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:06 crc kubenswrapper[4823]: I1216 06:56:06.374691 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:06Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:06 crc kubenswrapper[4823]: I1216 06:56:06.386167 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"25dec47c-3043-486c-b371-2be103c214e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2bf5eb4e2f587f7084f731ca681a116313b1014b02cdb391b09fdcdd42600c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmpf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78aed5cdf8d3a29dfffcdabfd18b38c0cf762258
53078e01602198bee4994536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmpf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fv56f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:06Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:06 crc kubenswrapper[4823]: I1216 06:56:06.396901 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v5mgh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"42574a3f-0701-4192-b16c-bdb9be6c2888\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3a26eba39f7b07ed59337e21910571235eaa797f43231f4962869caf58a9515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t54x6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e305106870ce1b62af84310a83d4ba0d529
312470315b95ae7e41e4f0d378e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t54x6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-v5mgh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:06Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:06 crc kubenswrapper[4823]: I1216 06:56:06.439501 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:06 crc kubenswrapper[4823]: I1216 06:56:06.439557 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:06 crc kubenswrapper[4823]: I1216 06:56:06.439566 4823 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:06 crc kubenswrapper[4823]: I1216 06:56:06.439583 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:06 crc kubenswrapper[4823]: I1216 06:56:06.439600 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:06Z","lastTransitionTime":"2025-12-16T06:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:56:06 crc kubenswrapper[4823]: I1216 06:56:06.542346 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:06 crc kubenswrapper[4823]: I1216 06:56:06.542383 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:06 crc kubenswrapper[4823]: I1216 06:56:06.542393 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:06 crc kubenswrapper[4823]: I1216 06:56:06.542408 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:06 crc kubenswrapper[4823]: I1216 06:56:06.542421 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:06Z","lastTransitionTime":"2025-12-16T06:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:06 crc kubenswrapper[4823]: I1216 06:56:06.644497 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:06 crc kubenswrapper[4823]: I1216 06:56:06.644532 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:06 crc kubenswrapper[4823]: I1216 06:56:06.644542 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:06 crc kubenswrapper[4823]: I1216 06:56:06.644558 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:06 crc kubenswrapper[4823]: I1216 06:56:06.644570 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:06Z","lastTransitionTime":"2025-12-16T06:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:06 crc kubenswrapper[4823]: I1216 06:56:06.746679 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:06 crc kubenswrapper[4823]: I1216 06:56:06.746719 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:06 crc kubenswrapper[4823]: I1216 06:56:06.746729 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:06 crc kubenswrapper[4823]: I1216 06:56:06.746743 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:06 crc kubenswrapper[4823]: I1216 06:56:06.746754 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:06Z","lastTransitionTime":"2025-12-16T06:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:06 crc kubenswrapper[4823]: I1216 06:56:06.848975 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:06 crc kubenswrapper[4823]: I1216 06:56:06.849041 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:06 crc kubenswrapper[4823]: I1216 06:56:06.849052 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:06 crc kubenswrapper[4823]: I1216 06:56:06.849070 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:06 crc kubenswrapper[4823]: I1216 06:56:06.849081 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:06Z","lastTransitionTime":"2025-12-16T06:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:06 crc kubenswrapper[4823]: I1216 06:56:06.952325 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:06 crc kubenswrapper[4823]: I1216 06:56:06.952367 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:06 crc kubenswrapper[4823]: I1216 06:56:06.952380 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:06 crc kubenswrapper[4823]: I1216 06:56:06.952400 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:06 crc kubenswrapper[4823]: I1216 06:56:06.952413 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:06Z","lastTransitionTime":"2025-12-16T06:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:07 crc kubenswrapper[4823]: I1216 06:56:07.055564 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:07 crc kubenswrapper[4823]: I1216 06:56:07.055610 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:07 crc kubenswrapper[4823]: I1216 06:56:07.055621 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:07 crc kubenswrapper[4823]: I1216 06:56:07.055637 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:07 crc kubenswrapper[4823]: I1216 06:56:07.055651 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:07Z","lastTransitionTime":"2025-12-16T06:56:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:07 crc kubenswrapper[4823]: I1216 06:56:07.132225 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zwjhk_08e48f89-7095-4ea2-afb5-759591c2b0d4/ovnkube-controller/2.log" Dec 16 06:56:07 crc kubenswrapper[4823]: I1216 06:56:07.133749 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zwjhk_08e48f89-7095-4ea2-afb5-759591c2b0d4/ovnkube-controller/1.log" Dec 16 06:56:07 crc kubenswrapper[4823]: I1216 06:56:07.138384 4823 generic.go:334] "Generic (PLEG): container finished" podID="08e48f89-7095-4ea2-afb5-759591c2b0d4" containerID="0ac6ba9c9aa8e7822590e52644019034f71acb4dbc336efacda606a8be00a05c" exitCode=1 Dec 16 06:56:07 crc kubenswrapper[4823]: I1216 06:56:07.138459 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" event={"ID":"08e48f89-7095-4ea2-afb5-759591c2b0d4","Type":"ContainerDied","Data":"0ac6ba9c9aa8e7822590e52644019034f71acb4dbc336efacda606a8be00a05c"} Dec 16 06:56:07 crc kubenswrapper[4823]: I1216 06:56:07.138559 4823 scope.go:117] "RemoveContainer" containerID="a17f6108d8783f74d8244ae56a7e5fb1c1835c7cf4e48c6b0aaa5e8ec4a5c843" Dec 16 06:56:07 crc kubenswrapper[4823]: I1216 06:56:07.140541 4823 scope.go:117] "RemoveContainer" containerID="0ac6ba9c9aa8e7822590e52644019034f71acb4dbc336efacda606a8be00a05c" Dec 16 06:56:07 crc kubenswrapper[4823]: E1216 06:56:07.140935 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-zwjhk_openshift-ovn-kubernetes(08e48f89-7095-4ea2-afb5-759591c2b0d4)\"" pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" podUID="08e48f89-7095-4ea2-afb5-759591c2b0d4" Dec 16 06:56:07 crc kubenswrapper[4823]: I1216 06:56:07.158561 4823 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:07 crc kubenswrapper[4823]: I1216 06:56:07.158604 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:07 crc kubenswrapper[4823]: I1216 06:56:07.158612 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:07 crc kubenswrapper[4823]: I1216 06:56:07.158629 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:07 crc kubenswrapper[4823]: I1216 06:56:07.158640 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:07Z","lastTransitionTime":"2025-12-16T06:56:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:07 crc kubenswrapper[4823]: I1216 06:56:07.178456 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08e48f89-7095-4ea2-afb5-759591c2b0d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3d0f51f7ac1df7a9ae61306ccbfe2c93b397b6ab9ae0a477ab4eb46248d36f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b768d92db57918a41c53b83940957c0181865808eb2c795231a5f8dbbf2bde82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f2cea674c0936cca4e9e95dfb7cd15bf9ccb3fbf8b711f351f87b632aad65c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://055585c1217ea720b929c9da82e7ea662d6fd5634244d362b3b425faee380d92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:42Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1c6cb023dcc6d1b149de201b108e3b1f3e32530dadc822aedede5552efc586b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://304f9e1fee6b648f61b3b334284a954e8107ff70ca71d01b586d137490eeab20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ac6ba9c9aa8e7822590e52644019034f71acb4dbc336efacda606a8be00a05c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a17f6108d8783f74d8244ae56a7e5fb1c1835c7cf4e48c6b0aaa5e8ec4a5c843\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T06:55:52Z\\\",\\\"message\\\":\\\"s{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1216 06:55:51.958057 6251 services_controller.go:444] Built service openshift-console/downloads LB per-node configs for network=default: []services.lbConfig(nil)\\\\nF1216 06:55:51.958062 6251 ovnkube.go:137] failed to run ovnkube: [failed to start network 
controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:55:51Z is after 2025-08-24T17:21:41Z]\\\\nI1216 06:55:51.958072 6251 obj_retry.go:303] Retry object setup: *v1.Pod openshift-image-registry/node-ca-bwcng\\\\nI1216 06:55:51.958052 6251 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:51Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ac6ba9c9aa8e7822590e52644019034f71acb4dbc336efacda606a8be00a05c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T06:56:07Z\\\",\\\"message\\\":\\\"06:56:06.827886 6449 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-964hc\\\\nI1216 06:56:06.827888 6449 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-node-zwjhk in node crc\\\\nF1216 06:56:06.827887 6449 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller 
initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:06Z is after 2025-08-24T17:21:41Z]\\\\nI1216 06:56:06.827901 6449 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-additional-cni-plugins-964hc in node crc\\\\nI1216 06:56:06.827899 6449 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-zwjhk after 0 fail\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:56:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c223f5207e2445fd4bfd1bfe2c1c1ca50b5211a92b02c21445534fbfa2709e28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\
"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e3566766c9c24f5c12481b74ed8a3938b21bdb36c003146515c67170d74a71f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e3566766c9c24f5c12481b74ed8a3938b21bdb36c003146515c67170d74a71f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zwjhk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:07Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:07 crc kubenswrapper[4823]: I1216 06:56:07.196349 4823 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-fv56f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25dec47c-3043-486c-b371-2be103c214e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2bf5eb4e2f587f7084f731ca681a116313b1014b02cdb391b09fdcdd42600c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmpf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78aed5cdf8d3a29dfffcdabfd18b38c0cf76225853078e01602198bee4994536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmpf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fv56f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:07Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:07 crc kubenswrapper[4823]: I1216 06:56:07.207198 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v5mgh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"42574a3f-0701-4192-b16c-bdb9be6c2888\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3a26eba39f7b07ed59337e21910571235eaa797f43231f4962869caf58a9515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t54x6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e305106870ce1b62af84310a83d4ba0d529
312470315b95ae7e41e4f0d378e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t54x6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-v5mgh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:07Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:07 crc kubenswrapper[4823]: I1216 06:56:07.224592 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3de3a2ad-dc6c-47ba-af8d-4f128e025aad\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c40d3d73f70c8983adc8d076d89864e7224528dae97252396a1c34bd4cb804e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50bb40e9d22674554f2a267df7c9f924a29131ac986877bf19572cc5992ba396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9c22b0787554b4bdf0b0068fc696b8515e2ee63affef23e5eb64f77bc32a624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a32894b8f22c5335ed585c26fcab727324803d768943f40455a592025cbfd0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1560fb0ba0c40a2797688b518c22c9164a8cbe9a265cb2ad95408ba86b0fb537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3d6894d867564eea62c7d78dd6ca62a9913787ba06b786722e478785ba796ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3d6894d867564eea62c7d78dd6ca62a9913787ba06b786722e478785ba796ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-16T06:55:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e551b627e90b56385fe5618d3e66ea7376a2d574589583c005373aa29db1d410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e551b627e90b56385fe5618d3e66ea7376a2d574589583c005373aa29db1d410\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://631458d415d64c3cd4dfd24771644393aa6c01288de8f7a5395cf8888db67d2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://631458d415d64c3cd4dfd24771644393aa6c01288de8f7a5395cf8888db67d2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:07Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:07 crc kubenswrapper[4823]: I1216 06:56:07.236398 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e388fa291257a537c122a1f3f0eb6a60eb74e4bcaaf32c4fe829be2da1bfdae9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:07Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:07 crc kubenswrapper[4823]: I1216 06:56:07.249126 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:07Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:07 crc kubenswrapper[4823]: I1216 06:56:07.261531 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:07 crc kubenswrapper[4823]: I1216 06:56:07.261578 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:07 crc kubenswrapper[4823]: I1216 06:56:07.261594 4823 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:07 crc kubenswrapper[4823]: I1216 06:56:07.261616 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:07 crc kubenswrapper[4823]: I1216 06:56:07.261630 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:07Z","lastTransitionTime":"2025-12-16T06:56:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:56:07 crc kubenswrapper[4823]: I1216 06:56:07.265817 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c915dd50-9820-494e-b47a-987257910a57\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c33a30aed9756ef012ff9046309ec07897de3b17707ea39c621534ed863c3adf\\\",\\\"image\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://464ab3cf14920ae3f947ce3dccacd7379d50380699008411d0a51bd83fe3e51c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20f7e0e3c07f467643afdcc6f0e333353d2d10b6d0c0d7aff0d347d33a5d925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ddfb7d6dae98465369260c8e9c64a2aa21dbf226e7898f6de0c36b3806bcae8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c9e6ec333ac552dc972c31cce23104cc27bcc2cf11bdc52d89ab36172915a41\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"ed a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1853467785/tls.crt::/tmp/serving-cert-1853467785/tls.key\\\\\\\"\\\\nI1216 06:55:39.731946 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 06:55:39.736431 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 06:55:39.736560 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 06:55:39.736611 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 06:55:39.736722 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 06:55:39.741676 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 06:55:39.741701 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 06:55:39.741707 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 06:55:39.741713 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 06:55:39.741717 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 06:55:39.741721 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 06:55:39.741725 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 06:55:39.741774 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 06:55:39.744309 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1853467785/tls.crt::/tmp/serving-cert-1853467785/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765868124\\\\\\\\\\\\\\\" (2025-12-16 06:55:23 +0000 UTC to 2026-01-15 06:55:24 +0000 UTC (now=2025-12-16 06:55:39.744275203 +0000 UTC))\\\\\\\"\\\\nF1216 06:55:39.744562 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c84e7f6950c8d433823b7d5ae3d8a9ecaa70ab6845ce607b8bb93efa990325a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84f11d2687959c8525b131966a998af137caa69a480bb31a94e1615a3ccbea5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84f11d2687959c8525b131966a998af137caa69a480bb31a94e1615a3ccbea5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:07Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:07 crc kubenswrapper[4823]: I1216 06:56:07.278854 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4c70365-dff4-4b29-af25-657fd9823db4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59ddc7310ed1600d050e488762436763220f3c2946a4c2020346da4ccef46b1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad
6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://275328a704b475b037c82f92055f9ca7f83b9821325603c5e34090140cc486ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7fa96323e345f8891f51500618a7b07d128d3e7256a3534d9dca75ff8ce9edf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,
\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d706c1463d2c913afe727db1a87a2841698d91a586c84d6b55f02865e7f5584a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:07Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:07 crc kubenswrapper[4823]: I1216 06:56:07.289597 4823 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/network-metrics-daemon-8mn7l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e7dd738-a9b3-455c-93e0-3f0dc7327817\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2crtc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2crtc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8mn7l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:07Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:07 crc 
kubenswrapper[4823]: I1216 06:56:07.301610 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:07Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:07 crc kubenswrapper[4823]: I1216 06:56:07.320751 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72dfe197d43de2945a2ee22789ab86205c77588c4f4002e2473cef09e7b4b10b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-16T06:56:07Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:07 crc kubenswrapper[4823]: I1216 06:56:07.330590 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hr8h5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d31d032-9142-4e26-9f06-e3a5ea73d530\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://502e1707b8f53950b8f10fa6b289c9d235d6596fb6eec7e4d69c3158823f8e2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-7sdtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hr8h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:07Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:07 crc kubenswrapper[4823]: I1216 06:56:07.342562 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n248g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b377757-dbc6-4d9c-9656-3ff65d7d113a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78e6c48a7009b10b70f11bfc96f18839781aeccd794f9ce7a4074271cfb9bec
c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-2skkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n248g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:07Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:07 crc kubenswrapper[4823]: I1216 06:56:07.353969 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-964hc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a95a09e1-8457-4d5e-b1b6-dd6f59a66c6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bb1ede33a8f71c9116cfb9401b1f78a6aca07d580e4abe
93370b470d6b20284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b305ee45caf9aff77bb738777dd9b2ebd2e3e63727b460eae22f8f66d1e6cec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b305ee45caf9aff77bb738777dd9b2ebd2e3e63727b460eae22f8f66d1e6cec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\
\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d77b125c0d081148c1892229c933414ac8e1b8e0371f0932512deba1107e94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9d77b125c0d081148c1892229c933414ac8e1b8e0371f0932512deba1107e94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2048aefee6601e678e26821b3ed329aead74791b34654ef5dde9fe261e37b835\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c
78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2048aefee6601e678e26821b3ed329aead74791b34654ef5dde9fe261e37b835\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c574f5f8d3f82a7a21c7f8e49e7927387be9e4029e28ba265d39ea058fb6a60f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c574f5f8d3f82a7a21c7f8e49e7927387be9e4029e28ba265d39ea058fb6a60f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e32bdec2a02013c087ccc13325d44061d85bcf9a352f2dc31ed506f11db6428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e32bdec2a02013c087ccc13325d44061d85bcf9a352f2dc31ed506f11db6428\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91427f2f713f7e1c2ba16e87252e2fd0479a31e6f94884ad85fe9f99ab84bedd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91427f2f713f7e1c2ba16e87252e2fd0479a31e6f94884ad85fe9f99ab84bedd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-964hc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:07Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:07 crc kubenswrapper[4823]: I1216 06:56:07.364221 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:07 crc kubenswrapper[4823]: I1216 06:56:07.364269 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:07 crc kubenswrapper[4823]: I1216 06:56:07.364281 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:07 crc kubenswrapper[4823]: I1216 06:56:07.364300 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:07 crc kubenswrapper[4823]: I1216 06:56:07.364312 4823 setters.go:603] 
"Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:07Z","lastTransitionTime":"2025-12-16T06:56:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:56:07 crc kubenswrapper[4823]: I1216 06:56:07.365110 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:07Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:07 crc kubenswrapper[4823]: I1216 06:56:07.379251 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63ba735c509770e9d43fe8152b2a42dc4123d0d797c7d41d5e3efe39a6cef74d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e09486327bad82b741ca9260e306f84ed9f0fbc7058bb06267042b8d2cd5f818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:07Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:07 crc kubenswrapper[4823]: I1216 06:56:07.388609 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bwcng" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ff057ef-c324-4465-8b8d-c7b98c25b23c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea36a181f0874fbab3022e6ce27567b65eb8c01cdb2925b08d9c0782af7e93c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2bj4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bwcng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:07Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:07 crc kubenswrapper[4823]: I1216 06:56:07.466728 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:07 crc kubenswrapper[4823]: I1216 06:56:07.466761 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:07 crc kubenswrapper[4823]: I1216 06:56:07.466770 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:07 crc kubenswrapper[4823]: I1216 06:56:07.466787 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:07 crc kubenswrapper[4823]: I1216 06:56:07.466842 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:07Z","lastTransitionTime":"2025-12-16T06:56:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:07 crc kubenswrapper[4823]: I1216 06:56:07.570402 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:07 crc kubenswrapper[4823]: I1216 06:56:07.570479 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:07 crc kubenswrapper[4823]: I1216 06:56:07.570495 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:07 crc kubenswrapper[4823]: I1216 06:56:07.570523 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:07 crc kubenswrapper[4823]: I1216 06:56:07.570540 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:07Z","lastTransitionTime":"2025-12-16T06:56:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:07 crc kubenswrapper[4823]: I1216 06:56:07.673820 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:07 crc kubenswrapper[4823]: I1216 06:56:07.673886 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:07 crc kubenswrapper[4823]: I1216 06:56:07.673903 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:07 crc kubenswrapper[4823]: I1216 06:56:07.673931 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:07 crc kubenswrapper[4823]: I1216 06:56:07.673950 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:07Z","lastTransitionTime":"2025-12-16T06:56:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:56:07 crc kubenswrapper[4823]: I1216 06:56:07.770756 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:56:07 crc kubenswrapper[4823]: I1216 06:56:07.770811 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:56:07 crc kubenswrapper[4823]: I1216 06:56:07.770811 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:56:07 crc kubenswrapper[4823]: E1216 06:56:07.770914 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 06:56:07 crc kubenswrapper[4823]: I1216 06:56:07.770970 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8mn7l" Dec 16 06:56:07 crc kubenswrapper[4823]: E1216 06:56:07.771062 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 06:56:07 crc kubenswrapper[4823]: E1216 06:56:07.771194 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8mn7l" podUID="1e7dd738-a9b3-455c-93e0-3f0dc7327817" Dec 16 06:56:07 crc kubenswrapper[4823]: E1216 06:56:07.771275 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 06:56:07 crc kubenswrapper[4823]: I1216 06:56:07.776774 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:07 crc kubenswrapper[4823]: I1216 06:56:07.776811 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:07 crc kubenswrapper[4823]: I1216 06:56:07.776822 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:07 crc kubenswrapper[4823]: I1216 06:56:07.776837 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:07 crc kubenswrapper[4823]: I1216 06:56:07.776847 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:07Z","lastTransitionTime":"2025-12-16T06:56:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:07 crc kubenswrapper[4823]: I1216 06:56:07.879289 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:07 crc kubenswrapper[4823]: I1216 06:56:07.879325 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:07 crc kubenswrapper[4823]: I1216 06:56:07.879341 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:07 crc kubenswrapper[4823]: I1216 06:56:07.879374 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:07 crc kubenswrapper[4823]: I1216 06:56:07.879395 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:07Z","lastTransitionTime":"2025-12-16T06:56:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:07 crc kubenswrapper[4823]: I1216 06:56:07.982298 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:07 crc kubenswrapper[4823]: I1216 06:56:07.982387 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:07 crc kubenswrapper[4823]: I1216 06:56:07.982404 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:07 crc kubenswrapper[4823]: I1216 06:56:07.982426 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:07 crc kubenswrapper[4823]: I1216 06:56:07.982443 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:07Z","lastTransitionTime":"2025-12-16T06:56:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:08 crc kubenswrapper[4823]: I1216 06:56:08.085518 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:08 crc kubenswrapper[4823]: I1216 06:56:08.085573 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:08 crc kubenswrapper[4823]: I1216 06:56:08.085583 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:08 crc kubenswrapper[4823]: I1216 06:56:08.085600 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:08 crc kubenswrapper[4823]: I1216 06:56:08.085610 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:08Z","lastTransitionTime":"2025-12-16T06:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:08 crc kubenswrapper[4823]: I1216 06:56:08.143421 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zwjhk_08e48f89-7095-4ea2-afb5-759591c2b0d4/ovnkube-controller/2.log" Dec 16 06:56:08 crc kubenswrapper[4823]: I1216 06:56:08.146537 4823 scope.go:117] "RemoveContainer" containerID="0ac6ba9c9aa8e7822590e52644019034f71acb4dbc336efacda606a8be00a05c" Dec 16 06:56:08 crc kubenswrapper[4823]: E1216 06:56:08.146675 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-zwjhk_openshift-ovn-kubernetes(08e48f89-7095-4ea2-afb5-759591c2b0d4)\"" pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" podUID="08e48f89-7095-4ea2-afb5-759591c2b0d4" Dec 16 06:56:08 crc kubenswrapper[4823]: I1216 06:56:08.157811 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:08Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:08 crc kubenswrapper[4823]: I1216 06:56:08.171491 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63ba735c509770e9d43fe8152b2a42dc4123d0d797c7d41d5e3efe39a6cef74d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e09486327bad82b741ca9260e306f84ed9f0fbc7058bb06267042b8d2cd5f818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:08Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:08 crc kubenswrapper[4823]: I1216 06:56:08.184057 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:08Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:08 crc kubenswrapper[4823]: I1216 06:56:08.187663 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:08 crc kubenswrapper[4823]: I1216 06:56:08.187711 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:08 crc 
kubenswrapper[4823]: I1216 06:56:08.187725 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:08 crc kubenswrapper[4823]: I1216 06:56:08.187744 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:08 crc kubenswrapper[4823]: I1216 06:56:08.187785 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:08Z","lastTransitionTime":"2025-12-16T06:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:56:08 crc kubenswrapper[4823]: I1216 06:56:08.194605 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72dfe197d43de2945a2ee22789ab86205c77588c4f4002e2473cef09e7b4b10b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-16T06:56:08Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:08 crc kubenswrapper[4823]: I1216 06:56:08.205752 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hr8h5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d31d032-9142-4e26-9f06-e3a5ea73d530\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://502e1707b8f53950b8f10fa6b289c9d235d6596fb6eec7e4d69c3158823f8e2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-7sdtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hr8h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:08Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:08 crc kubenswrapper[4823]: I1216 06:56:08.223779 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n248g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b377757-dbc6-4d9c-9656-3ff65d7d113a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78e6c48a7009b10b70f11bfc96f18839781aeccd794f9ce7a4074271cfb9bec
c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-2skkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n248g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:08Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:08 crc kubenswrapper[4823]: I1216 06:56:08.245805 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-964hc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a95a09e1-8457-4d5e-b1b6-dd6f59a66c6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bb1ede33a8f71c9116cfb9401b1f78a6aca07d580e4abe
93370b470d6b20284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b305ee45caf9aff77bb738777dd9b2ebd2e3e63727b460eae22f8f66d1e6cec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b305ee45caf9aff77bb738777dd9b2ebd2e3e63727b460eae22f8f66d1e6cec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\
\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d77b125c0d081148c1892229c933414ac8e1b8e0371f0932512deba1107e94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9d77b125c0d081148c1892229c933414ac8e1b8e0371f0932512deba1107e94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2048aefee6601e678e26821b3ed329aead74791b34654ef5dde9fe261e37b835\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c
78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2048aefee6601e678e26821b3ed329aead74791b34654ef5dde9fe261e37b835\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c574f5f8d3f82a7a21c7f8e49e7927387be9e4029e28ba265d39ea058fb6a60f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c574f5f8d3f82a7a21c7f8e49e7927387be9e4029e28ba265d39ea058fb6a60f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e32bdec2a02013c087ccc13325d44061d85bcf9a352f2dc31ed506f11db6428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e32bdec2a02013c087ccc13325d44061d85bcf9a352f2dc31ed506f11db6428\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91427f2f713f7e1c2ba16e87252e2fd0479a31e6f94884ad85fe9f99ab84bedd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91427f2f713f7e1c2ba16e87252e2fd0479a31e6f94884ad85fe9f99ab84bedd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-964hc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:08Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:08 crc kubenswrapper[4823]: I1216 06:56:08.257769 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bwcng" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ff057ef-c324-4465-8b8d-c7b98c25b23c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea36a181f0874fbab3022e6ce27567b65eb8c01cdb2925b08d9c0782af7e93c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2bj4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bwcng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:08Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:08 crc kubenswrapper[4823]: I1216 06:56:08.275538 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08e48f89-7095-4ea2-afb5-759591c2b0d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3d0f51f7ac1df7a9ae61306ccbfe2c93b397b6ab9ae0a477ab4eb46248d36f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b768d92db57918a41c53b83940957c0181865808eb2c795231a5f8dbbf2bde82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f2cea674c0936cca4e9e95dfb7cd15bf9ccb3fbf8b711f351f87b632aad65c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://055585c1217ea720b929c9da82e7ea662d6fd5634244d362b3b425faee380d92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:42Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1c6cb023dcc6d1b149de201b108e3b1f3e32530dadc822aedede5552efc586b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://304f9e1fee6b648f61b3b334284a954e8107ff70ca71d01b586d137490eeab20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ac6ba9c9aa8e7822590e52644019034f71acb4dbc336efacda606a8be00a05c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ac6ba9c9aa8e7822590e52644019034f71acb4dbc336efacda606a8be00a05c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T06:56:07Z\\\",\\\"message\\\":\\\"06:56:06.827886 6449 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-964hc\\\\nI1216 06:56:06.827888 6449 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-node-zwjhk in node crc\\\\nF1216 06:56:06.827887 6449 ovnkube.go:137] failed to run ovnkube: [failed to start network 
controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:06Z is after 2025-08-24T17:21:41Z]\\\\nI1216 06:56:06.827901 6449 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-additional-cni-plugins-964hc in node crc\\\\nI1216 06:56:06.827899 6449 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-zwjhk after 0 fail\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:56:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zwjhk_openshift-ovn-kubernetes(08e48f89-7095-4ea2-afb5-759591c2b0d4)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c223f5207e2445fd4bfd1bfe2c1c1ca50b5211a92b02c21445534fbfa2709e28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e3566766c9c24f5c12481b74ed8a3938b21bdb36c003146515c67170d74a71f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e3566766c9c24f5c1
2481b74ed8a3938b21bdb36c003146515c67170d74a71f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zwjhk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:08Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:08 crc kubenswrapper[4823]: I1216 06:56:08.288425 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e388fa291257a537c122a1f3f0eb6a60eb74e4bcaaf32c4fe829be2da1bfdae9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:08Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:08 crc kubenswrapper[4823]: I1216 06:56:08.291057 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:08 crc kubenswrapper[4823]: I1216 06:56:08.291111 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:08 crc kubenswrapper[4823]: I1216 06:56:08.291132 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:08 crc kubenswrapper[4823]: I1216 06:56:08.291153 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:08 crc kubenswrapper[4823]: I1216 06:56:08.291164 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:08Z","lastTransitionTime":"2025-12-16T06:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:08 crc kubenswrapper[4823]: I1216 06:56:08.300197 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:08Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:08 crc kubenswrapper[4823]: I1216 06:56:08.310976 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"25dec47c-3043-486c-b371-2be103c214e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2bf5eb4e2f587f7084f731ca681a116313b1014b02cdb391b09fdcdd42600c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmpf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78aed5cdf8d3a29dfffcdabfd18b38c0cf762258
53078e01602198bee4994536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmpf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fv56f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:08Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:08 crc kubenswrapper[4823]: I1216 06:56:08.323806 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v5mgh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"42574a3f-0701-4192-b16c-bdb9be6c2888\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3a26eba39f7b07ed59337e21910571235eaa797f43231f4962869caf58a9515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t54x6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e305106870ce1b62af84310a83d4ba0d529
312470315b95ae7e41e4f0d378e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t54x6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-v5mgh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:08Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:08 crc kubenswrapper[4823]: I1216 06:56:08.342433 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3de3a2ad-dc6c-47ba-af8d-4f128e025aad\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c40d3d73f70c8983adc8d076d89864e7224528dae97252396a1c34bd4cb804e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50bb40e9d22674554f2a267df7c9f924a29131ac986877bf19572cc5992ba396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9c22b0787554b4bdf0b0068fc696b8515e2ee63affef23e5eb64f77bc32a624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a32894b8f22c5335ed585c26fcab727324803d768943f40455a592025cbfd0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1560fb0ba0c40a2797688b518c22c9164a8cbe9a265cb2ad95408ba86b0fb537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3d6894d867564eea62c7d78dd6ca62a9913787ba06b786722e478785ba796ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3d6894d867564eea62c7d78dd6ca62a9913787ba06b786722e478785ba796ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-16T06:55:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e551b627e90b56385fe5618d3e66ea7376a2d574589583c005373aa29db1d410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e551b627e90b56385fe5618d3e66ea7376a2d574589583c005373aa29db1d410\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://631458d415d64c3cd4dfd24771644393aa6c01288de8f7a5395cf8888db67d2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://631458d415d64c3cd4dfd24771644393aa6c01288de8f7a5395cf8888db67d2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:08Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:08 crc kubenswrapper[4823]: I1216 06:56:08.358692 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4c70365-dff4-4b29-af25-657fd9823db4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59ddc7310ed1600d050e488762436763220f3c2946a4c2020346da4ccef46b1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://275328a704b475b037c82f92055f9ca7f83b9821325603c5e34090140cc486ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7fa96323e345f8891f51500618a7b07d128d3e7256a3534d9dca75ff8ce9edf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:
23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d706c1463d2c913afe727db1a87a2841698d91a586c84d6b55f02865e7f5584a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:08Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:08 crc kubenswrapper[4823]: I1216 06:56:08.367880 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8mn7l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e7dd738-a9b3-455c-93e0-3f0dc7327817\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2crtc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2crtc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8mn7l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:08Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:08 crc 
kubenswrapper[4823]: I1216 06:56:08.380967 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c915dd50-9820-494e-b47a-987257910a57\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c33a30aed9756ef012ff9046309ec07897de3b17707ea39c621534ed863c3adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://464ab3cf14920a
e3f947ce3dccacd7379d50380699008411d0a51bd83fe3e51c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20f7e0e3c07f467643afdcc6f0e333353d2d10b6d0c0d7aff0d347d33a5d925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ddfb7d6dae98465369260c8e9c64a2aa21dbf226e7898f6de0c36b3806bcae8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://9c9e6ec333ac552dc972c31cce23104cc27bcc2cf11bdc52d89ab36172915a41\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"ed a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1853467785/tls.crt::/tmp/serving-cert-1853467785/tls.key\\\\\\\"\\\\nI1216 06:55:39.731946 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 06:55:39.736431 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 06:55:39.736560 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 06:55:39.736611 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 06:55:39.736722 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 06:55:39.741676 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 06:55:39.741701 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 06:55:39.741707 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 06:55:39.741713 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 06:55:39.741717 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 06:55:39.741721 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 06:55:39.741725 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 06:55:39.741774 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 06:55:39.744309 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" 
certName=\\\\\\\"serving-cert::/tmp/serving-cert-1853467785/tls.crt::/tmp/serving-cert-1853467785/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765868124\\\\\\\\\\\\\\\" (2025-12-16 06:55:23 +0000 UTC to 2026-01-15 06:55:24 +0000 UTC (now=2025-12-16 06:55:39.744275203 +0000 UTC))\\\\\\\"\\\\nF1216 06:55:39.744562 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c84e7f6950c8d433823b7d5ae3d8a9ecaa70ab6845ce607b8bb93efa990325a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84f11d2687959c8525b131966a998af137caa69a480bb31a94e1615a3ccbea5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"image
ID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84f11d2687959c8525b131966a998af137caa69a480bb31a94e1615a3ccbea5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:08Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:08 crc kubenswrapper[4823]: I1216 06:56:08.393168 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:08 crc kubenswrapper[4823]: I1216 06:56:08.393401 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:08 crc kubenswrapper[4823]: I1216 06:56:08.393470 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:08 crc kubenswrapper[4823]: I1216 06:56:08.393531 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:08 crc kubenswrapper[4823]: I1216 06:56:08.393586 4823 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:08Z","lastTransitionTime":"2025-12-16T06:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:56:08 crc kubenswrapper[4823]: I1216 06:56:08.496070 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:08 crc kubenswrapper[4823]: I1216 06:56:08.496115 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:08 crc kubenswrapper[4823]: I1216 06:56:08.496126 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:08 crc kubenswrapper[4823]: I1216 06:56:08.496142 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:08 crc kubenswrapper[4823]: I1216 06:56:08.496152 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:08Z","lastTransitionTime":"2025-12-16T06:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:08 crc kubenswrapper[4823]: I1216 06:56:08.598388 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:08 crc kubenswrapper[4823]: I1216 06:56:08.598422 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:08 crc kubenswrapper[4823]: I1216 06:56:08.598429 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:08 crc kubenswrapper[4823]: I1216 06:56:08.598444 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:08 crc kubenswrapper[4823]: I1216 06:56:08.598453 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:08Z","lastTransitionTime":"2025-12-16T06:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:08 crc kubenswrapper[4823]: I1216 06:56:08.700085 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:08 crc kubenswrapper[4823]: I1216 06:56:08.700143 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:08 crc kubenswrapper[4823]: I1216 06:56:08.700158 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:08 crc kubenswrapper[4823]: I1216 06:56:08.700182 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:08 crc kubenswrapper[4823]: I1216 06:56:08.700199 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:08Z","lastTransitionTime":"2025-12-16T06:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:08 crc kubenswrapper[4823]: I1216 06:56:08.803520 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:08 crc kubenswrapper[4823]: I1216 06:56:08.803561 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:08 crc kubenswrapper[4823]: I1216 06:56:08.803570 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:08 crc kubenswrapper[4823]: I1216 06:56:08.803585 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:08 crc kubenswrapper[4823]: I1216 06:56:08.803594 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:08Z","lastTransitionTime":"2025-12-16T06:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:08 crc kubenswrapper[4823]: I1216 06:56:08.906386 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:08 crc kubenswrapper[4823]: I1216 06:56:08.906439 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:08 crc kubenswrapper[4823]: I1216 06:56:08.906450 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:08 crc kubenswrapper[4823]: I1216 06:56:08.906465 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:08 crc kubenswrapper[4823]: I1216 06:56:08.906475 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:08Z","lastTransitionTime":"2025-12-16T06:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:09 crc kubenswrapper[4823]: I1216 06:56:09.009362 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:09 crc kubenswrapper[4823]: I1216 06:56:09.009412 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:09 crc kubenswrapper[4823]: I1216 06:56:09.009424 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:09 crc kubenswrapper[4823]: I1216 06:56:09.009441 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:09 crc kubenswrapper[4823]: I1216 06:56:09.009451 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:09Z","lastTransitionTime":"2025-12-16T06:56:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:09 crc kubenswrapper[4823]: I1216 06:56:09.111588 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:09 crc kubenswrapper[4823]: I1216 06:56:09.111634 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:09 crc kubenswrapper[4823]: I1216 06:56:09.111645 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:09 crc kubenswrapper[4823]: I1216 06:56:09.111661 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:09 crc kubenswrapper[4823]: I1216 06:56:09.111669 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:09Z","lastTransitionTime":"2025-12-16T06:56:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:09 crc kubenswrapper[4823]: I1216 06:56:09.214182 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:09 crc kubenswrapper[4823]: I1216 06:56:09.214231 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:09 crc kubenswrapper[4823]: I1216 06:56:09.214245 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:09 crc kubenswrapper[4823]: I1216 06:56:09.214263 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:09 crc kubenswrapper[4823]: I1216 06:56:09.214283 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:09Z","lastTransitionTime":"2025-12-16T06:56:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:09 crc kubenswrapper[4823]: I1216 06:56:09.316441 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:09 crc kubenswrapper[4823]: I1216 06:56:09.316474 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:09 crc kubenswrapper[4823]: I1216 06:56:09.316483 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:09 crc kubenswrapper[4823]: I1216 06:56:09.316500 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:09 crc kubenswrapper[4823]: I1216 06:56:09.316510 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:09Z","lastTransitionTime":"2025-12-16T06:56:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:09 crc kubenswrapper[4823]: I1216 06:56:09.419266 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:09 crc kubenswrapper[4823]: I1216 06:56:09.419299 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:09 crc kubenswrapper[4823]: I1216 06:56:09.419307 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:09 crc kubenswrapper[4823]: I1216 06:56:09.419320 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:09 crc kubenswrapper[4823]: I1216 06:56:09.419329 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:09Z","lastTransitionTime":"2025-12-16T06:56:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:09 crc kubenswrapper[4823]: I1216 06:56:09.489715 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1e7dd738-a9b3-455c-93e0-3f0dc7327817-metrics-certs\") pod \"network-metrics-daemon-8mn7l\" (UID: \"1e7dd738-a9b3-455c-93e0-3f0dc7327817\") " pod="openshift-multus/network-metrics-daemon-8mn7l" Dec 16 06:56:09 crc kubenswrapper[4823]: E1216 06:56:09.489908 4823 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 16 06:56:09 crc kubenswrapper[4823]: E1216 06:56:09.489974 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1e7dd738-a9b3-455c-93e0-3f0dc7327817-metrics-certs podName:1e7dd738-a9b3-455c-93e0-3f0dc7327817 nodeName:}" failed. No retries permitted until 2025-12-16 06:56:25.489957257 +0000 UTC m=+63.978523380 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1e7dd738-a9b3-455c-93e0-3f0dc7327817-metrics-certs") pod "network-metrics-daemon-8mn7l" (UID: "1e7dd738-a9b3-455c-93e0-3f0dc7327817") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 16 06:56:09 crc kubenswrapper[4823]: I1216 06:56:09.522136 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:09 crc kubenswrapper[4823]: I1216 06:56:09.522176 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:09 crc kubenswrapper[4823]: I1216 06:56:09.522187 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:09 crc kubenswrapper[4823]: I1216 06:56:09.522208 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:09 crc kubenswrapper[4823]: I1216 06:56:09.522221 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:09Z","lastTransitionTime":"2025-12-16T06:56:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:09 crc kubenswrapper[4823]: I1216 06:56:09.625516 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:09 crc kubenswrapper[4823]: I1216 06:56:09.625553 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:09 crc kubenswrapper[4823]: I1216 06:56:09.625565 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:09 crc kubenswrapper[4823]: I1216 06:56:09.625580 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:09 crc kubenswrapper[4823]: I1216 06:56:09.625589 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:09Z","lastTransitionTime":"2025-12-16T06:56:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:09 crc kubenswrapper[4823]: I1216 06:56:09.727690 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:09 crc kubenswrapper[4823]: I1216 06:56:09.727728 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:09 crc kubenswrapper[4823]: I1216 06:56:09.727737 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:09 crc kubenswrapper[4823]: I1216 06:56:09.727751 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:09 crc kubenswrapper[4823]: I1216 06:56:09.727759 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:09Z","lastTransitionTime":"2025-12-16T06:56:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:56:09 crc kubenswrapper[4823]: I1216 06:56:09.770765 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:56:09 crc kubenswrapper[4823]: I1216 06:56:09.770863 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8mn7l" Dec 16 06:56:09 crc kubenswrapper[4823]: E1216 06:56:09.770880 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 06:56:09 crc kubenswrapper[4823]: E1216 06:56:09.771010 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8mn7l" podUID="1e7dd738-a9b3-455c-93e0-3f0dc7327817" Dec 16 06:56:09 crc kubenswrapper[4823]: I1216 06:56:09.771112 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:56:09 crc kubenswrapper[4823]: E1216 06:56:09.771171 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 06:56:09 crc kubenswrapper[4823]: I1216 06:56:09.771315 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:56:09 crc kubenswrapper[4823]: E1216 06:56:09.771457 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 06:56:09 crc kubenswrapper[4823]: I1216 06:56:09.830800 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:09 crc kubenswrapper[4823]: I1216 06:56:09.830833 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:09 crc kubenswrapper[4823]: I1216 06:56:09.830842 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:09 crc kubenswrapper[4823]: I1216 06:56:09.830857 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:09 crc kubenswrapper[4823]: I1216 06:56:09.830865 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:09Z","lastTransitionTime":"2025-12-16T06:56:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:09 crc kubenswrapper[4823]: I1216 06:56:09.933643 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:09 crc kubenswrapper[4823]: I1216 06:56:09.933682 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:09 crc kubenswrapper[4823]: I1216 06:56:09.933691 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:09 crc kubenswrapper[4823]: I1216 06:56:09.933709 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:09 crc kubenswrapper[4823]: I1216 06:56:09.933720 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:09Z","lastTransitionTime":"2025-12-16T06:56:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:10 crc kubenswrapper[4823]: I1216 06:56:10.036061 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:10 crc kubenswrapper[4823]: I1216 06:56:10.036096 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:10 crc kubenswrapper[4823]: I1216 06:56:10.036104 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:10 crc kubenswrapper[4823]: I1216 06:56:10.036118 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:10 crc kubenswrapper[4823]: I1216 06:56:10.036127 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:10Z","lastTransitionTime":"2025-12-16T06:56:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:10 crc kubenswrapper[4823]: I1216 06:56:10.138711 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:10 crc kubenswrapper[4823]: I1216 06:56:10.138763 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:10 crc kubenswrapper[4823]: I1216 06:56:10.138775 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:10 crc kubenswrapper[4823]: I1216 06:56:10.138793 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:10 crc kubenswrapper[4823]: I1216 06:56:10.138803 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:10Z","lastTransitionTime":"2025-12-16T06:56:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:10 crc kubenswrapper[4823]: I1216 06:56:10.244957 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:10 crc kubenswrapper[4823]: I1216 06:56:10.245063 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:10 crc kubenswrapper[4823]: I1216 06:56:10.245082 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:10 crc kubenswrapper[4823]: I1216 06:56:10.245100 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:10 crc kubenswrapper[4823]: I1216 06:56:10.245110 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:10Z","lastTransitionTime":"2025-12-16T06:56:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:10 crc kubenswrapper[4823]: I1216 06:56:10.348885 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:10 crc kubenswrapper[4823]: I1216 06:56:10.348947 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:10 crc kubenswrapper[4823]: I1216 06:56:10.348961 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:10 crc kubenswrapper[4823]: I1216 06:56:10.348980 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:10 crc kubenswrapper[4823]: I1216 06:56:10.348993 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:10Z","lastTransitionTime":"2025-12-16T06:56:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:10 crc kubenswrapper[4823]: I1216 06:56:10.452089 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:10 crc kubenswrapper[4823]: I1216 06:56:10.452146 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:10 crc kubenswrapper[4823]: I1216 06:56:10.452160 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:10 crc kubenswrapper[4823]: I1216 06:56:10.452180 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:10 crc kubenswrapper[4823]: I1216 06:56:10.452194 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:10Z","lastTransitionTime":"2025-12-16T06:56:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:10 crc kubenswrapper[4823]: I1216 06:56:10.554822 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:10 crc kubenswrapper[4823]: I1216 06:56:10.554859 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:10 crc kubenswrapper[4823]: I1216 06:56:10.554869 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:10 crc kubenswrapper[4823]: I1216 06:56:10.554884 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:10 crc kubenswrapper[4823]: I1216 06:56:10.554907 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:10Z","lastTransitionTime":"2025-12-16T06:56:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:10 crc kubenswrapper[4823]: I1216 06:56:10.657527 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:10 crc kubenswrapper[4823]: I1216 06:56:10.657576 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:10 crc kubenswrapper[4823]: I1216 06:56:10.657589 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:10 crc kubenswrapper[4823]: I1216 06:56:10.657604 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:10 crc kubenswrapper[4823]: I1216 06:56:10.657615 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:10Z","lastTransitionTime":"2025-12-16T06:56:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:10 crc kubenswrapper[4823]: I1216 06:56:10.759835 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:10 crc kubenswrapper[4823]: I1216 06:56:10.759876 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:10 crc kubenswrapper[4823]: I1216 06:56:10.759886 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:10 crc kubenswrapper[4823]: I1216 06:56:10.759899 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:10 crc kubenswrapper[4823]: I1216 06:56:10.759909 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:10Z","lastTransitionTime":"2025-12-16T06:56:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:10 crc kubenswrapper[4823]: I1216 06:56:10.862678 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:10 crc kubenswrapper[4823]: I1216 06:56:10.862730 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:10 crc kubenswrapper[4823]: I1216 06:56:10.862748 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:10 crc kubenswrapper[4823]: I1216 06:56:10.862767 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:10 crc kubenswrapper[4823]: I1216 06:56:10.862779 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:10Z","lastTransitionTime":"2025-12-16T06:56:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:10 crc kubenswrapper[4823]: I1216 06:56:10.965454 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:10 crc kubenswrapper[4823]: I1216 06:56:10.965497 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:10 crc kubenswrapper[4823]: I1216 06:56:10.965507 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:10 crc kubenswrapper[4823]: I1216 06:56:10.965520 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:10 crc kubenswrapper[4823]: I1216 06:56:10.965529 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:10Z","lastTransitionTime":"2025-12-16T06:56:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:11 crc kubenswrapper[4823]: I1216 06:56:11.068398 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:11 crc kubenswrapper[4823]: I1216 06:56:11.068689 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:11 crc kubenswrapper[4823]: I1216 06:56:11.068806 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:11 crc kubenswrapper[4823]: I1216 06:56:11.068893 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:11 crc kubenswrapper[4823]: I1216 06:56:11.068986 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:11Z","lastTransitionTime":"2025-12-16T06:56:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:11 crc kubenswrapper[4823]: I1216 06:56:11.170992 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:11 crc kubenswrapper[4823]: I1216 06:56:11.171056 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:11 crc kubenswrapper[4823]: I1216 06:56:11.171068 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:11 crc kubenswrapper[4823]: I1216 06:56:11.171085 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:11 crc kubenswrapper[4823]: I1216 06:56:11.171098 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:11Z","lastTransitionTime":"2025-12-16T06:56:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:11 crc kubenswrapper[4823]: I1216 06:56:11.274366 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:11 crc kubenswrapper[4823]: I1216 06:56:11.274414 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:11 crc kubenswrapper[4823]: I1216 06:56:11.274431 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:11 crc kubenswrapper[4823]: I1216 06:56:11.274449 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:11 crc kubenswrapper[4823]: I1216 06:56:11.274462 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:11Z","lastTransitionTime":"2025-12-16T06:56:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:11 crc kubenswrapper[4823]: I1216 06:56:11.377037 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:11 crc kubenswrapper[4823]: I1216 06:56:11.377495 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:11 crc kubenswrapper[4823]: I1216 06:56:11.377592 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:11 crc kubenswrapper[4823]: I1216 06:56:11.377699 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:11 crc kubenswrapper[4823]: I1216 06:56:11.377778 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:11Z","lastTransitionTime":"2025-12-16T06:56:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:11 crc kubenswrapper[4823]: I1216 06:56:11.480073 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:11 crc kubenswrapper[4823]: I1216 06:56:11.480108 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:11 crc kubenswrapper[4823]: I1216 06:56:11.480119 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:11 crc kubenswrapper[4823]: I1216 06:56:11.480136 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:11 crc kubenswrapper[4823]: I1216 06:56:11.480147 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:11Z","lastTransitionTime":"2025-12-16T06:56:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:11 crc kubenswrapper[4823]: I1216 06:56:11.582612 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:11 crc kubenswrapper[4823]: I1216 06:56:11.583586 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:11 crc kubenswrapper[4823]: I1216 06:56:11.583678 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:11 crc kubenswrapper[4823]: I1216 06:56:11.583768 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:11 crc kubenswrapper[4823]: I1216 06:56:11.583846 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:11Z","lastTransitionTime":"2025-12-16T06:56:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:11 crc kubenswrapper[4823]: I1216 06:56:11.613455 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:56:11 crc kubenswrapper[4823]: I1216 06:56:11.613579 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:56:11 crc kubenswrapper[4823]: I1216 06:56:11.613602 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:56:11 crc kubenswrapper[4823]: E1216 06:56:11.613693 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:56:43.613657986 +0000 UTC m=+82.102224139 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:56:11 crc kubenswrapper[4823]: E1216 06:56:11.613707 4823 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 16 06:56:11 crc kubenswrapper[4823]: E1216 06:56:11.613780 4823 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 16 06:56:11 crc kubenswrapper[4823]: E1216 06:56:11.613808 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-16 06:56:43.6137938 +0000 UTC m=+82.102360003 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 16 06:56:11 crc kubenswrapper[4823]: E1216 06:56:11.613833 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-12-16 06:56:43.613818571 +0000 UTC m=+82.102384694 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 16 06:56:11 crc kubenswrapper[4823]: I1216 06:56:11.613865 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:56:11 crc kubenswrapper[4823]: I1216 06:56:11.613913 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:56:11 crc kubenswrapper[4823]: E1216 06:56:11.613944 4823 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 16 06:56:11 crc kubenswrapper[4823]: E1216 06:56:11.613964 4823 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 16 06:56:11 crc kubenswrapper[4823]: E1216 06:56:11.613977 4823 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 06:56:11 crc kubenswrapper[4823]: E1216 06:56:11.614042 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-16 06:56:43.614012027 +0000 UTC m=+82.102578150 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 06:56:11 crc kubenswrapper[4823]: E1216 06:56:11.614073 4823 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 16 06:56:11 crc kubenswrapper[4823]: E1216 06:56:11.614090 4823 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 16 06:56:11 crc kubenswrapper[4823]: E1216 06:56:11.614103 4823 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 06:56:11 crc kubenswrapper[4823]: E1216 06:56:11.614145 4823 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-16 06:56:43.614133291 +0000 UTC m=+82.102699494 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 06:56:11 crc kubenswrapper[4823]: I1216 06:56:11.686787 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:11 crc kubenswrapper[4823]: I1216 06:56:11.687115 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:11 crc kubenswrapper[4823]: I1216 06:56:11.687511 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:11 crc kubenswrapper[4823]: I1216 06:56:11.687847 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:11 crc kubenswrapper[4823]: I1216 06:56:11.688132 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:11Z","lastTransitionTime":"2025-12-16T06:56:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:56:11 crc kubenswrapper[4823]: I1216 06:56:11.771137 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:56:11 crc kubenswrapper[4823]: E1216 06:56:11.771293 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 06:56:11 crc kubenswrapper[4823]: I1216 06:56:11.771619 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8mn7l" Dec 16 06:56:11 crc kubenswrapper[4823]: E1216 06:56:11.771706 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8mn7l" podUID="1e7dd738-a9b3-455c-93e0-3f0dc7327817" Dec 16 06:56:11 crc kubenswrapper[4823]: I1216 06:56:11.771787 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:56:11 crc kubenswrapper[4823]: E1216 06:56:11.771847 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 06:56:11 crc kubenswrapper[4823]: I1216 06:56:11.771905 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:56:11 crc kubenswrapper[4823]: E1216 06:56:11.771961 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 06:56:11 crc kubenswrapper[4823]: I1216 06:56:11.790744 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:11 crc kubenswrapper[4823]: I1216 06:56:11.790801 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:11 crc kubenswrapper[4823]: I1216 06:56:11.790813 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:11 crc kubenswrapper[4823]: I1216 06:56:11.790829 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:11 crc kubenswrapper[4823]: I1216 06:56:11.790838 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:11Z","lastTransitionTime":"2025-12-16T06:56:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:11 crc kubenswrapper[4823]: I1216 06:56:11.798484 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08e48f89-7095-4ea2-afb5-759591c2b0d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3d0f51f7ac1df7a9ae61306ccbfe2c93b397b6ab9ae0a477ab4eb46248d36f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b768d92db57918a41c53b83940957c0181865808eb2c795231a5f8dbbf2bde82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f2cea674c0936cca4e9e95dfb7cd15bf9ccb3fbf8b711f351f87b632aad65c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://055585c1217ea720b929c9da82e7ea662d6fd5634244d362b3b425faee380d92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:42Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1c6cb023dcc6d1b149de201b108e3b1f3e32530dadc822aedede5552efc586b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://304f9e1fee6b648f61b3b334284a954e8107ff70ca71d01b586d137490eeab20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ac6ba9c9aa8e7822590e52644019034f71acb4dbc336efacda606a8be00a05c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ac6ba9c9aa8e7822590e52644019034f71acb4dbc336efacda606a8be00a05c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T06:56:07Z\\\",\\\"message\\\":\\\"06:56:06.827886 6449 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-964hc\\\\nI1216 06:56:06.827888 6449 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-node-zwjhk in node crc\\\\nF1216 06:56:06.827887 6449 ovnkube.go:137] failed to run ovnkube: [failed to start network 
controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:06Z is after 2025-08-24T17:21:41Z]\\\\nI1216 06:56:06.827901 6449 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-additional-cni-plugins-964hc in node crc\\\\nI1216 06:56:06.827899 6449 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-zwjhk after 0 fail\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:56:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zwjhk_openshift-ovn-kubernetes(08e48f89-7095-4ea2-afb5-759591c2b0d4)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c223f5207e2445fd4bfd1bfe2c1c1ca50b5211a92b02c21445534fbfa2709e28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e3566766c9c24f5c12481b74ed8a3938b21bdb36c003146515c67170d74a71f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e3566766c9c24f5c1
2481b74ed8a3938b21bdb36c003146515c67170d74a71f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zwjhk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:11Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:11 crc kubenswrapper[4823]: I1216 06:56:11.821276 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3de3a2ad-dc6c-47ba-af8d-4f128e025aad\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c40d3d73f70c8983adc8d076d89864e7224528dae97252396a1c34bd4cb804e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50bb40e9d22674554f2a267df7c9f924a29131ac986877bf19572cc5992ba396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9c22b0787554b4bdf0b0068fc696b8515e2ee63affef23e5eb64f77bc32a624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a32894b8f22c5335ed585c26fcab727324803d768943f40455a592025cbfd0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1560fb0ba0c40a2797688b518c22c9164a8cbe9a265cb2ad95408ba86b0fb537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3d6894d867564eea62c7d78dd6ca62a9913787ba06b786722e478785ba796ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3d6894d867564eea62c7d78dd6ca62a9913787ba06b786722e478785ba796ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-16T06:55:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e551b627e90b56385fe5618d3e66ea7376a2d574589583c005373aa29db1d410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e551b627e90b56385fe5618d3e66ea7376a2d574589583c005373aa29db1d410\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://631458d415d64c3cd4dfd24771644393aa6c01288de8f7a5395cf8888db67d2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://631458d415d64c3cd4dfd24771644393aa6c01288de8f7a5395cf8888db67d2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:11Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:11 crc kubenswrapper[4823]: I1216 06:56:11.833995 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e388fa291257a537c122a1f3f0eb6a60eb74e4bcaaf32c4fe829be2da1bfdae9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:11Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:11 crc kubenswrapper[4823]: I1216 06:56:11.845951 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:11Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:11 crc kubenswrapper[4823]: I1216 06:56:11.861043 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"25dec47c-3043-486c-b371-2be103c214e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2bf5eb4e2f587f7084f731ca681a116313b1014b02cdb391b09fdcdd42600c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmpf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78aed5cdf8d3a29dfffcdabfd18b38c0cf762258
53078e01602198bee4994536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmpf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fv56f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:11Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:11 crc kubenswrapper[4823]: I1216 06:56:11.872748 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v5mgh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"42574a3f-0701-4192-b16c-bdb9be6c2888\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3a26eba39f7b07ed59337e21910571235eaa797f43231f4962869caf58a9515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t54x6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e305106870ce1b62af84310a83d4ba0d529
312470315b95ae7e41e4f0d378e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t54x6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-v5mgh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:11Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:11 crc kubenswrapper[4823]: I1216 06:56:11.886497 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c915dd50-9820-494e-b47a-987257910a57\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c33a30aed9756ef012ff9046309ec07897de3b17707ea39c621534ed863c3adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://464ab3cf14920ae3f947ce3dccacd7379d50380699008411d0a51bd83fe3e51c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20f7e0e3c07f467643afdcc6f0e333353d2d10b6d0c0d7aff0d347d33a5d925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ddfb7d6dae98465369260c8e9c64a2aa21dbf226e7898f6de0c36b3806bcae8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c9e6ec333ac552dc972c31cce23104cc27bcc2cf11bdc52d89ab36172915a41\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T06:55:39Z\\\"
,\\\"message\\\":\\\"ed a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1853467785/tls.crt::/tmp/serving-cert-1853467785/tls.key\\\\\\\"\\\\nI1216 06:55:39.731946 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 06:55:39.736431 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 06:55:39.736560 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 06:55:39.736611 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 06:55:39.736722 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 06:55:39.741676 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 06:55:39.741701 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 06:55:39.741707 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 06:55:39.741713 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 06:55:39.741717 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 06:55:39.741721 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 06:55:39.741725 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 06:55:39.741774 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 06:55:39.744309 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1853467785/tls.crt::/tmp/serving-cert-1853467785/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] 
issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765868124\\\\\\\\\\\\\\\" (2025-12-16 06:55:23 +0000 UTC to 2026-01-15 06:55:24 +0000 UTC (now=2025-12-16 06:55:39.744275203 +0000 UTC))\\\\\\\"\\\\nF1216 06:55:39.744562 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c84e7f6950c8d433823b7d5ae3d8a9ecaa70ab6845ce607b8bb93efa990325a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84f11d2687959c8525b131966a998af137caa69a480bb31a94e1615a3ccbea5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84f11d2687959c8525b131966a998af137caa69a480bb31a94e1615a3ccbea5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:11Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:11 crc kubenswrapper[4823]: I1216 06:56:11.893348 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:11 crc kubenswrapper[4823]: I1216 06:56:11.893391 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:11 crc kubenswrapper[4823]: I1216 06:56:11.893399 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:11 crc kubenswrapper[4823]: I1216 06:56:11.893449 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:11 crc kubenswrapper[4823]: I1216 06:56:11.893459 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:11Z","lastTransitionTime":"2025-12-16T06:56:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:56:11 crc kubenswrapper[4823]: I1216 06:56:11.896908 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4c70365-dff4-4b29-af25-657fd9823db4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59ddc7310ed1600d050e488762436763220f3c2946a4c2020346da4ccef46b1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPat
h\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://275328a704b475b037c82f92055f9ca7f83b9821325603c5e34090140cc486ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7fa96323e345f8891f51500618a7b07d128d3e7256a3534d9dca75ff8ce9edf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d706c1463d2c913afe727db1a87a2841698d91a586c84d6b55f02865e7f5584a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed0828
7faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:11Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:11 crc kubenswrapper[4823]: I1216 06:56:11.904873 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8mn7l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e7dd738-a9b3-455c-93e0-3f0dc7327817\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2crtc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2crtc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8mn7l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:11Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:11 crc 
kubenswrapper[4823]: I1216 06:56:11.916237 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-964hc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a95a09e1-8457-4d5e-b1b6-dd6f59a66c6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bb1ede33a8f71c9116cfb9401b1f78a6aca07d580e4abe93370b470d6b20284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.12
6.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b305ee45caf9aff77bb738777dd9b2ebd2e3e63727b460eae22f8f66d1e6cec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b305ee45caf9aff77bb738777dd9b2ebd2e3e63727b460eae22f8f66d1e6cec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d77b125c0d081148c1892229c933414ac8e1b8e0371f0932512deba1107e94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9d77b125c0d081148c1892229c93
3414ac8e1b8e0371f0932512deba1107e94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2048aefee6601e678e26821b3ed329aead74791b34654ef5dde9fe261e37b835\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2048aefee6601e678e26821b3ed329aead74791b34654ef5dde9fe261e37b835\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c574f5f8d3f82a7a21c7f8e49e7927387be9e4029e28ba265d39ea058fb6a60f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c574f5f8d3f82a7a21c7f8e49e7927387be9e4029e28ba265d39ea058fb6a60f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e32bdec2a02013c087ccc13325d44061d85bcf9a352f2dc31ed506f11db6428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\
":{\\\"containerID\\\":\\\"cri-o://5e32bdec2a02013c087ccc13325d44061d85bcf9a352f2dc31ed506f11db6428\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91427f2f713f7e1c2ba16e87252e2fd0479a31e6f94884ad85fe9f99ab84bedd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91427f2f713f7e1c2ba16e87252e2fd0479a31e6f94884ad85fe9f99ab84bedd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-964hc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:11Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:11 crc kubenswrapper[4823]: I1216 06:56:11.926346 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:11Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:11 crc kubenswrapper[4823]: I1216 06:56:11.937764 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63ba735c509770e9d43fe8152b2a42dc4123d0d797c7d41d5e3efe39a6cef74d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e09486327bad82b741ca9260e306f84ed9f0fbc7058bb06267042b8d2cd5f818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:11Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:11 crc kubenswrapper[4823]: I1216 06:56:11.948797 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:11Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:11 crc kubenswrapper[4823]: I1216 06:56:11.958847 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72dfe197d43de2945a2ee22789ab86205c77588c4f4002e2473cef09e7b4b10b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-16T06:56:11Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:11 crc kubenswrapper[4823]: I1216 06:56:11.969057 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hr8h5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d31d032-9142-4e26-9f06-e3a5ea73d530\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://502e1707b8f53950b8f10fa6b289c9d235d6596fb6eec7e4d69c3158823f8e2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-7sdtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hr8h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:11Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:11 crc kubenswrapper[4823]: I1216 06:56:11.981938 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n248g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b377757-dbc6-4d9c-9656-3ff65d7d113a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78e6c48a7009b10b70f11bfc96f18839781aeccd794f9ce7a4074271cfb9bec
c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-2skkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n248g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:11Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:11 crc kubenswrapper[4823]: I1216 06:56:11.991160 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bwcng" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ff057ef-c324-4465-8b8d-c7b98c25b23c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea36a181f0874fbab3022e6ce27567b65eb8c01cdb2925b08d9c0782af7e9
3c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2bj4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bwcng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:11Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:11 crc kubenswrapper[4823]: I1216 06:56:11.995486 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:11 crc kubenswrapper[4823]: I1216 06:56:11.995534 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:11 crc kubenswrapper[4823]: I1216 06:56:11.995547 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 
06:56:11 crc kubenswrapper[4823]: I1216 06:56:11.995568 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:11 crc kubenswrapper[4823]: I1216 06:56:11.995580 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:11Z","lastTransitionTime":"2025-12-16T06:56:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:56:12 crc kubenswrapper[4823]: I1216 06:56:12.098529 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:12 crc kubenswrapper[4823]: I1216 06:56:12.098607 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:12 crc kubenswrapper[4823]: I1216 06:56:12.098624 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:12 crc kubenswrapper[4823]: I1216 06:56:12.098652 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:12 crc kubenswrapper[4823]: I1216 06:56:12.098671 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:12Z","lastTransitionTime":"2025-12-16T06:56:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:12 crc kubenswrapper[4823]: I1216 06:56:12.201540 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:12 crc kubenswrapper[4823]: I1216 06:56:12.202353 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:12 crc kubenswrapper[4823]: I1216 06:56:12.202447 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:12 crc kubenswrapper[4823]: I1216 06:56:12.202583 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:12 crc kubenswrapper[4823]: I1216 06:56:12.202787 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:12Z","lastTransitionTime":"2025-12-16T06:56:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:12 crc kubenswrapper[4823]: I1216 06:56:12.305468 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:12 crc kubenswrapper[4823]: I1216 06:56:12.305751 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:12 crc kubenswrapper[4823]: I1216 06:56:12.305837 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:12 crc kubenswrapper[4823]: I1216 06:56:12.305970 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:12 crc kubenswrapper[4823]: I1216 06:56:12.306086 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:12Z","lastTransitionTime":"2025-12-16T06:56:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:12 crc kubenswrapper[4823]: I1216 06:56:12.409168 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:12 crc kubenswrapper[4823]: I1216 06:56:12.409219 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:12 crc kubenswrapper[4823]: I1216 06:56:12.409230 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:12 crc kubenswrapper[4823]: I1216 06:56:12.409248 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:12 crc kubenswrapper[4823]: I1216 06:56:12.409258 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:12Z","lastTransitionTime":"2025-12-16T06:56:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:12 crc kubenswrapper[4823]: I1216 06:56:12.512996 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:12 crc kubenswrapper[4823]: I1216 06:56:12.513460 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:12 crc kubenswrapper[4823]: I1216 06:56:12.513570 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:12 crc kubenswrapper[4823]: I1216 06:56:12.513746 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:12 crc kubenswrapper[4823]: I1216 06:56:12.513840 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:12Z","lastTransitionTime":"2025-12-16T06:56:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:12 crc kubenswrapper[4823]: I1216 06:56:12.616642 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:12 crc kubenswrapper[4823]: I1216 06:56:12.617778 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:12 crc kubenswrapper[4823]: I1216 06:56:12.617845 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:12 crc kubenswrapper[4823]: I1216 06:56:12.617940 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:12 crc kubenswrapper[4823]: I1216 06:56:12.618075 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:12Z","lastTransitionTime":"2025-12-16T06:56:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:12 crc kubenswrapper[4823]: I1216 06:56:12.721074 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:12 crc kubenswrapper[4823]: I1216 06:56:12.721131 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:12 crc kubenswrapper[4823]: I1216 06:56:12.721142 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:12 crc kubenswrapper[4823]: I1216 06:56:12.721165 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:12 crc kubenswrapper[4823]: I1216 06:56:12.721180 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:12Z","lastTransitionTime":"2025-12-16T06:56:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:12 crc kubenswrapper[4823]: I1216 06:56:12.823959 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:12 crc kubenswrapper[4823]: I1216 06:56:12.824001 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:12 crc kubenswrapper[4823]: I1216 06:56:12.824011 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:12 crc kubenswrapper[4823]: I1216 06:56:12.824053 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:12 crc kubenswrapper[4823]: I1216 06:56:12.824068 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:12Z","lastTransitionTime":"2025-12-16T06:56:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:12 crc kubenswrapper[4823]: I1216 06:56:12.926870 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:12 crc kubenswrapper[4823]: I1216 06:56:12.926922 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:12 crc kubenswrapper[4823]: I1216 06:56:12.926939 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:12 crc kubenswrapper[4823]: I1216 06:56:12.926956 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:12 crc kubenswrapper[4823]: I1216 06:56:12.926969 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:12Z","lastTransitionTime":"2025-12-16T06:56:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:13 crc kubenswrapper[4823]: I1216 06:56:13.030357 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:13 crc kubenswrapper[4823]: I1216 06:56:13.030752 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:13 crc kubenswrapper[4823]: I1216 06:56:13.030849 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:13 crc kubenswrapper[4823]: I1216 06:56:13.030943 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:13 crc kubenswrapper[4823]: I1216 06:56:13.031065 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:13Z","lastTransitionTime":"2025-12-16T06:56:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:13 crc kubenswrapper[4823]: I1216 06:56:13.136596 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:13 crc kubenswrapper[4823]: I1216 06:56:13.136651 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:13 crc kubenswrapper[4823]: I1216 06:56:13.136659 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:13 crc kubenswrapper[4823]: I1216 06:56:13.136678 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:13 crc kubenswrapper[4823]: I1216 06:56:13.136688 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:13Z","lastTransitionTime":"2025-12-16T06:56:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:13 crc kubenswrapper[4823]: I1216 06:56:13.150141 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:13 crc kubenswrapper[4823]: I1216 06:56:13.150209 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:13 crc kubenswrapper[4823]: I1216 06:56:13.150227 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:13 crc kubenswrapper[4823]: I1216 06:56:13.150249 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:13 crc kubenswrapper[4823]: I1216 06:56:13.150260 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:13Z","lastTransitionTime":"2025-12-16T06:56:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:13 crc kubenswrapper[4823]: E1216 06:56:13.165212 4823 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:56:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:56:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:56:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:56:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:56:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:56:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:56:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:56:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2caa91d7-bd83-4de0-9038-0514886c6d71\\\",\\\"systemUUID\\\":\\\"b35231f6-d02a-487d-8117-57547d768cbe\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:13Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:13 crc kubenswrapper[4823]: I1216 06:56:13.169337 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:13 crc kubenswrapper[4823]: I1216 06:56:13.169395 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:13 crc kubenswrapper[4823]: I1216 06:56:13.169409 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:13 crc kubenswrapper[4823]: I1216 06:56:13.169431 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:13 crc kubenswrapper[4823]: I1216 06:56:13.169444 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:13Z","lastTransitionTime":"2025-12-16T06:56:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:13 crc kubenswrapper[4823]: E1216 06:56:13.183967 4823 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:56:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:56:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:56:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:56:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:56:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:56:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:56:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:56:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2caa91d7-bd83-4de0-9038-0514886c6d71\\\",\\\"systemUUID\\\":\\\"b35231f6-d02a-487d-8117-57547d768cbe\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:13Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:13 crc kubenswrapper[4823]: I1216 06:56:13.188916 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:13 crc kubenswrapper[4823]: I1216 06:56:13.188955 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:13 crc kubenswrapper[4823]: I1216 06:56:13.188970 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:13 crc kubenswrapper[4823]: I1216 06:56:13.188988 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:13 crc kubenswrapper[4823]: I1216 06:56:13.188998 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:13Z","lastTransitionTime":"2025-12-16T06:56:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:13 crc kubenswrapper[4823]: E1216 06:56:13.202044 4823 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:56:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:56:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:56:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:56:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:56:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:56:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:56:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:56:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2caa91d7-bd83-4de0-9038-0514886c6d71\\\",\\\"systemUUID\\\":\\\"b35231f6-d02a-487d-8117-57547d768cbe\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:13Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:13 crc kubenswrapper[4823]: I1216 06:56:13.206867 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:13 crc kubenswrapper[4823]: I1216 06:56:13.207162 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:13 crc kubenswrapper[4823]: I1216 06:56:13.207270 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:13 crc kubenswrapper[4823]: I1216 06:56:13.207366 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:13 crc kubenswrapper[4823]: I1216 06:56:13.207453 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:13Z","lastTransitionTime":"2025-12-16T06:56:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:13 crc kubenswrapper[4823]: E1216 06:56:13.220879 4823 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:56:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:56:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:56:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:56:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:56:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:56:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:56:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:56:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2caa91d7-bd83-4de0-9038-0514886c6d71\\\",\\\"systemUUID\\\":\\\"b35231f6-d02a-487d-8117-57547d768cbe\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:13Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:13 crc kubenswrapper[4823]: I1216 06:56:13.224803 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:13 crc kubenswrapper[4823]: I1216 06:56:13.224836 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:13 crc kubenswrapper[4823]: I1216 06:56:13.224848 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:13 crc kubenswrapper[4823]: I1216 06:56:13.224864 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:13 crc kubenswrapper[4823]: I1216 06:56:13.224874 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:13Z","lastTransitionTime":"2025-12-16T06:56:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:13 crc kubenswrapper[4823]: E1216 06:56:13.236590 4823 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:56:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:56:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:56:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:56:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:56:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:56:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:56:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:56:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2caa91d7-bd83-4de0-9038-0514886c6d71\\\",\\\"systemUUID\\\":\\\"b35231f6-d02a-487d-8117-57547d768cbe\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:13Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:13 crc kubenswrapper[4823]: E1216 06:56:13.236713 4823 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 16 06:56:13 crc kubenswrapper[4823]: I1216 06:56:13.238840 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:13 crc kubenswrapper[4823]: I1216 06:56:13.238867 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:13 crc kubenswrapper[4823]: I1216 06:56:13.238879 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:13 crc kubenswrapper[4823]: I1216 06:56:13.238895 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:13 crc kubenswrapper[4823]: I1216 06:56:13.238909 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:13Z","lastTransitionTime":"2025-12-16T06:56:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:56:13 crc kubenswrapper[4823]: I1216 06:56:13.341309 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:13 crc kubenswrapper[4823]: I1216 06:56:13.341358 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:13 crc kubenswrapper[4823]: I1216 06:56:13.341373 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:13 crc kubenswrapper[4823]: I1216 06:56:13.341390 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:13 crc kubenswrapper[4823]: I1216 06:56:13.341402 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:13Z","lastTransitionTime":"2025-12-16T06:56:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:13 crc kubenswrapper[4823]: I1216 06:56:13.443887 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:13 crc kubenswrapper[4823]: I1216 06:56:13.443942 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:13 crc kubenswrapper[4823]: I1216 06:56:13.443951 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:13 crc kubenswrapper[4823]: I1216 06:56:13.443966 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:13 crc kubenswrapper[4823]: I1216 06:56:13.443978 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:13Z","lastTransitionTime":"2025-12-16T06:56:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:13 crc kubenswrapper[4823]: I1216 06:56:13.546682 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:13 crc kubenswrapper[4823]: I1216 06:56:13.546734 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:13 crc kubenswrapper[4823]: I1216 06:56:13.546747 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:13 crc kubenswrapper[4823]: I1216 06:56:13.546765 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:13 crc kubenswrapper[4823]: I1216 06:56:13.546777 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:13Z","lastTransitionTime":"2025-12-16T06:56:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:13 crc kubenswrapper[4823]: I1216 06:56:13.649983 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:13 crc kubenswrapper[4823]: I1216 06:56:13.650050 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:13 crc kubenswrapper[4823]: I1216 06:56:13.650062 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:13 crc kubenswrapper[4823]: I1216 06:56:13.650077 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:13 crc kubenswrapper[4823]: I1216 06:56:13.650086 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:13Z","lastTransitionTime":"2025-12-16T06:56:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:13 crc kubenswrapper[4823]: I1216 06:56:13.753798 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:13 crc kubenswrapper[4823]: I1216 06:56:13.753843 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:13 crc kubenswrapper[4823]: I1216 06:56:13.753854 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:13 crc kubenswrapper[4823]: I1216 06:56:13.753872 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:13 crc kubenswrapper[4823]: I1216 06:56:13.753884 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:13Z","lastTransitionTime":"2025-12-16T06:56:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:56:13 crc kubenswrapper[4823]: I1216 06:56:13.771417 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:56:13 crc kubenswrapper[4823]: I1216 06:56:13.771485 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8mn7l" Dec 16 06:56:13 crc kubenswrapper[4823]: I1216 06:56:13.771521 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:56:13 crc kubenswrapper[4823]: I1216 06:56:13.771501 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:56:13 crc kubenswrapper[4823]: E1216 06:56:13.771615 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 06:56:13 crc kubenswrapper[4823]: E1216 06:56:13.771653 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 06:56:13 crc kubenswrapper[4823]: E1216 06:56:13.771695 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8mn7l" podUID="1e7dd738-a9b3-455c-93e0-3f0dc7327817" Dec 16 06:56:13 crc kubenswrapper[4823]: E1216 06:56:13.771731 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 06:56:13 crc kubenswrapper[4823]: I1216 06:56:13.856133 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:13 crc kubenswrapper[4823]: I1216 06:56:13.856178 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:13 crc kubenswrapper[4823]: I1216 06:56:13.856194 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:13 crc kubenswrapper[4823]: I1216 06:56:13.856211 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:13 crc kubenswrapper[4823]: I1216 06:56:13.856222 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:13Z","lastTransitionTime":"2025-12-16T06:56:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:13 crc kubenswrapper[4823]: I1216 06:56:13.958889 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:13 crc kubenswrapper[4823]: I1216 06:56:13.958934 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:13 crc kubenswrapper[4823]: I1216 06:56:13.958944 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:13 crc kubenswrapper[4823]: I1216 06:56:13.958959 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:13 crc kubenswrapper[4823]: I1216 06:56:13.958968 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:13Z","lastTransitionTime":"2025-12-16T06:56:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:14 crc kubenswrapper[4823]: I1216 06:56:14.061950 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:14 crc kubenswrapper[4823]: I1216 06:56:14.061991 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:14 crc kubenswrapper[4823]: I1216 06:56:14.062000 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:14 crc kubenswrapper[4823]: I1216 06:56:14.062017 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:14 crc kubenswrapper[4823]: I1216 06:56:14.062042 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:14Z","lastTransitionTime":"2025-12-16T06:56:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:14 crc kubenswrapper[4823]: I1216 06:56:14.168500 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:14 crc kubenswrapper[4823]: I1216 06:56:14.168652 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:14 crc kubenswrapper[4823]: I1216 06:56:14.168669 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:14 crc kubenswrapper[4823]: I1216 06:56:14.168688 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:14 crc kubenswrapper[4823]: I1216 06:56:14.168699 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:14Z","lastTransitionTime":"2025-12-16T06:56:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:14 crc kubenswrapper[4823]: I1216 06:56:14.271001 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:14 crc kubenswrapper[4823]: I1216 06:56:14.271063 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:14 crc kubenswrapper[4823]: I1216 06:56:14.271078 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:14 crc kubenswrapper[4823]: I1216 06:56:14.271095 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:14 crc kubenswrapper[4823]: I1216 06:56:14.271107 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:14Z","lastTransitionTime":"2025-12-16T06:56:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:14 crc kubenswrapper[4823]: I1216 06:56:14.374330 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:14 crc kubenswrapper[4823]: I1216 06:56:14.374373 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:14 crc kubenswrapper[4823]: I1216 06:56:14.374383 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:14 crc kubenswrapper[4823]: I1216 06:56:14.374400 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:14 crc kubenswrapper[4823]: I1216 06:56:14.374412 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:14Z","lastTransitionTime":"2025-12-16T06:56:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:14 crc kubenswrapper[4823]: I1216 06:56:14.477729 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:14 crc kubenswrapper[4823]: I1216 06:56:14.477764 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:14 crc kubenswrapper[4823]: I1216 06:56:14.477773 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:14 crc kubenswrapper[4823]: I1216 06:56:14.477788 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:14 crc kubenswrapper[4823]: I1216 06:56:14.477800 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:14Z","lastTransitionTime":"2025-12-16T06:56:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:14 crc kubenswrapper[4823]: I1216 06:56:14.581102 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:14 crc kubenswrapper[4823]: I1216 06:56:14.581154 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:14 crc kubenswrapper[4823]: I1216 06:56:14.581168 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:14 crc kubenswrapper[4823]: I1216 06:56:14.581193 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:14 crc kubenswrapper[4823]: I1216 06:56:14.581207 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:14Z","lastTransitionTime":"2025-12-16T06:56:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:14 crc kubenswrapper[4823]: I1216 06:56:14.683622 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:14 crc kubenswrapper[4823]: I1216 06:56:14.683675 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:14 crc kubenswrapper[4823]: I1216 06:56:14.683687 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:14 crc kubenswrapper[4823]: I1216 06:56:14.683705 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:14 crc kubenswrapper[4823]: I1216 06:56:14.683718 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:14Z","lastTransitionTime":"2025-12-16T06:56:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:14 crc kubenswrapper[4823]: I1216 06:56:14.786702 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:14 crc kubenswrapper[4823]: I1216 06:56:14.786756 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:14 crc kubenswrapper[4823]: I1216 06:56:14.786765 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:14 crc kubenswrapper[4823]: I1216 06:56:14.786783 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:14 crc kubenswrapper[4823]: I1216 06:56:14.786794 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:14Z","lastTransitionTime":"2025-12-16T06:56:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:14 crc kubenswrapper[4823]: I1216 06:56:14.889041 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:14 crc kubenswrapper[4823]: I1216 06:56:14.889358 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:14 crc kubenswrapper[4823]: I1216 06:56:14.889443 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:14 crc kubenswrapper[4823]: I1216 06:56:14.889513 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:14 crc kubenswrapper[4823]: I1216 06:56:14.889575 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:14Z","lastTransitionTime":"2025-12-16T06:56:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:14 crc kubenswrapper[4823]: I1216 06:56:14.917419 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 16 06:56:14 crc kubenswrapper[4823]: I1216 06:56:14.927265 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Dec 16 06:56:14 crc kubenswrapper[4823]: I1216 06:56:14.932294 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c915dd50-9820-494e-b47a-987257910a57\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c33a30aed9756ef012ff9046309ec07897de3b17707ea39c621534ed863c3adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":
true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://464ab3cf14920ae3f947ce3dccacd7379d50380699008411d0a51bd83fe3e51c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20f7e0e3c07f467643afdcc6f0e333353d2d10b6d0c0d7aff0d347d33a5d925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"cont
ainerID\\\":\\\"cri-o://7ddfb7d6dae98465369260c8e9c64a2aa21dbf226e7898f6de0c36b3806bcae8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c9e6ec333ac552dc972c31cce23104cc27bcc2cf11bdc52d89ab36172915a41\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"ed a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1853467785/tls.crt::/tmp/serving-cert-1853467785/tls.key\\\\\\\"\\\\nI1216 06:55:39.731946 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 06:55:39.736431 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 06:55:39.736560 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 06:55:39.736611 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 06:55:39.736722 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 06:55:39.741676 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 06:55:39.741701 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 06:55:39.741707 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 06:55:39.741713 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 06:55:39.741717 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 06:55:39.741721 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' 
detected.\\\\nW1216 06:55:39.741725 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 06:55:39.741774 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 06:55:39.744309 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1853467785/tls.crt::/tmp/serving-cert-1853467785/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765868124\\\\\\\\\\\\\\\" (2025-12-16 06:55:23 +0000 UTC to 2026-01-15 06:55:24 +0000 UTC (now=2025-12-16 06:55:39.744275203 +0000 UTC))\\\\\\\"\\\\nF1216 06:55:39.744562 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c84e7f6950c8d433823b7d5ae3d8a9ecaa70ab6845ce607b8bb93efa990325a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.16
8.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84f11d2687959c8525b131966a998af137caa69a480bb31a94e1615a3ccbea5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84f11d2687959c8525b131966a998af137caa69a480bb31a94e1615a3ccbea5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:14Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:14 crc kubenswrapper[4823]: I1216 06:56:14.945424 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4c70365-dff4-4b29-af25-657fd9823db4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59ddc7310ed1600d050e488762436763220f3c2946a4c2020346da4ccef46b1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://275328a704b475b037c82f92055f9ca7f83b9821325603c5e34090140cc486ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7fa96323e345f8891f51500618a7b07d128d3e7256a3534d9dca75ff8ce9edf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d706c1463d2c913afe727db1a87a2841698d91a586c84d6b55f02865e7f5584a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:14Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:14 crc kubenswrapper[4823]: I1216 06:56:14.955704 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8mn7l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e7dd738-a9b3-455c-93e0-3f0dc7327817\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2crtc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2crtc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8mn7l\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:14Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:14 crc kubenswrapper[4823]: I1216 06:56:14.966171 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72dfe197d43de2945a2ee22789ab86205c77588c4f4002e2473cef09e7b4b10b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:14Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:14 crc kubenswrapper[4823]: I1216 06:56:14.975600 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hr8h5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d31d032-9142-4e26-9f06-e3a5ea73d530\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://502e1707b8f53950b8f10fa6b289c9d235d6596fb6eec7e4d69c3158823f8e2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshi
ft-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sdtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hr8h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:14Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:14 crc kubenswrapper[4823]: I1216 06:56:14.987350 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n248g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b377757-dbc6-4d9c-9656-3ff65d7d113a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78e6c48a7009b10b70f11bfc96f18839781aeccd794f9ce7a4074271cfb9becc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2skkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n248g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:14Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:14 crc kubenswrapper[4823]: I1216 06:56:14.991550 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:14 crc 
kubenswrapper[4823]: I1216 06:56:14.991573 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:14 crc kubenswrapper[4823]: I1216 06:56:14.991582 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:14 crc kubenswrapper[4823]: I1216 06:56:14.991598 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:14 crc kubenswrapper[4823]: I1216 06:56:14.991628 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:14Z","lastTransitionTime":"2025-12-16T06:56:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:56:14 crc kubenswrapper[4823]: I1216 06:56:14.999574 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-964hc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a95a09e1-8457-4d5e-b1b6-dd6f59a66c6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bb1ede33a8f71c9116cfb9401b1f78a6aca07d580e4abe93370b470d6b20284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b305ee45caf9aff77bb738777dd9b2ebd2e3e63727b460eae22f8f66d1e6cec\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b305ee45caf9aff77bb738777dd9b2ebd2e3e63727b460eae22f8f66d1e6cec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d77b125c0d081148c1892229c933414ac8e1b8e0371f0932512deba1107e94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9d77b125c0d081148c1892229c933414ac8e1b8e0371f0932512deba1107e94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:42Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2048aefee6601e678e26821b3ed329aead74791b34654ef5dde9fe261e37b835\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2048aefee6601e678e26821b3ed329aead74791b34654ef5dde9fe261e37b835\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c574f
5f8d3f82a7a21c7f8e49e7927387be9e4029e28ba265d39ea058fb6a60f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c574f5f8d3f82a7a21c7f8e49e7927387be9e4029e28ba265d39ea058fb6a60f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e32bdec2a02013c087ccc13325d44061d85bcf9a352f2dc31ed506f11db6428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e32bdec2a02013c087ccc13325d44061d85bcf9a352f2dc31ed506f11db6428\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:47Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91427f2f713f7e1c2ba16e87252e2fd0479a31e6f94884ad85fe9f99ab84bedd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91427f2f713f7e1c2ba16e87252e2fd0479a31e6f94884ad85fe9f99ab84bedd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-964hc\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:14Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:15 crc kubenswrapper[4823]: I1216 06:56:15.009510 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:15Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:15 crc kubenswrapper[4823]: I1216 06:56:15.019122 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63ba735c509770e9d43fe8152b2a42dc4123d0d797c7d41d5e3efe39a6cef74d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e09486327bad82b741ca9260e306f84ed9f0fbc7058bb06267042b8d2cd5f818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:15Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:15 crc kubenswrapper[4823]: I1216 06:56:15.029967 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:15Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:15 crc kubenswrapper[4823]: I1216 06:56:15.039943 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bwcng" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ff057ef-c324-4465-8b8d-c7b98c25b23c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea36a181f0874fbab3022e6ce27567b65eb8c01cdb2925b08d9c0782af7e93c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2bj4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bwcng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:15Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:15 crc kubenswrapper[4823]: I1216 06:56:15.056178 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08e48f89-7095-4ea2-afb5-759591c2b0d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3d0f51f7ac1df7a9ae61306ccbfe2c93b397b6ab9ae0a477ab4eb46248d36f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b768d92db57918a41c53b83940957c0181865808eb2c795231a5f8dbbf2bde82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f2cea674c0936cca4e9e95dfb7cd15bf9ccb3fbf8b711f351f87b632aad65c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://055585c1217ea720b929c9da82e7ea662d6fd5634244d362b3b425faee380d92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:42Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1c6cb023dcc6d1b149de201b108e3b1f3e32530dadc822aedede5552efc586b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://304f9e1fee6b648f61b3b334284a954e8107ff70ca71d01b586d137490eeab20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ac6ba9c9aa8e7822590e52644019034f71acb4dbc336efacda606a8be00a05c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ac6ba9c9aa8e7822590e52644019034f71acb4dbc336efacda606a8be00a05c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T06:56:07Z\\\",\\\"message\\\":\\\"06:56:06.827886 6449 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-964hc\\\\nI1216 06:56:06.827888 6449 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-node-zwjhk in node crc\\\\nF1216 06:56:06.827887 6449 ovnkube.go:137] failed to run ovnkube: [failed to start network 
controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:06Z is after 2025-08-24T17:21:41Z]\\\\nI1216 06:56:06.827901 6449 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-additional-cni-plugins-964hc in node crc\\\\nI1216 06:56:06.827899 6449 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-zwjhk after 0 fail\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:56:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zwjhk_openshift-ovn-kubernetes(08e48f89-7095-4ea2-afb5-759591c2b0d4)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c223f5207e2445fd4bfd1bfe2c1c1ca50b5211a92b02c21445534fbfa2709e28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e3566766c9c24f5c12481b74ed8a3938b21bdb36c003146515c67170d74a71f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e3566766c9c24f5c1
2481b74ed8a3938b21bdb36c003146515c67170d74a71f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zwjhk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:15Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:15 crc kubenswrapper[4823]: I1216 06:56:15.067662 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v5mgh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"42574a3f-0701-4192-b16c-bdb9be6c2888\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3a26eba39f7b07ed59337e21910571235eaa797f43231f4962869caf58a9515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t54x6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e305106870ce1b62af84310a83d4ba0d529
312470315b95ae7e41e4f0d378e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t54x6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-v5mgh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:15Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:15 crc kubenswrapper[4823]: I1216 06:56:15.087784 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3de3a2ad-dc6c-47ba-af8d-4f128e025aad\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c40d3d73f70c8983adc8d076d89864e7224528dae97252396a1c34bd4cb804e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50bb40e9d22674554f2a267df7c9f924a29131ac986877bf19572cc5992ba396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9c22b0787554b4bdf0b0068fc696b8515e2ee63affef23e5eb64f77bc32a624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a32894b8f22c5335ed585c26fcab727324803d768943f40455a592025cbfd0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1560fb0ba0c40a2797688b518c22c9164a8cbe9a265cb2ad95408ba86b0fb537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3d6894d867564eea62c7d78dd6ca62a9913787ba06b786722e478785ba796ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3d6894d867564eea62c7d78dd6ca62a9913787ba06b786722e478785ba796ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-16T06:55:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e551b627e90b56385fe5618d3e66ea7376a2d574589583c005373aa29db1d410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e551b627e90b56385fe5618d3e66ea7376a2d574589583c005373aa29db1d410\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://631458d415d64c3cd4dfd24771644393aa6c01288de8f7a5395cf8888db67d2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://631458d415d64c3cd4dfd24771644393aa6c01288de8f7a5395cf8888db67d2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:15Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:15 crc kubenswrapper[4823]: I1216 06:56:15.094531 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:15 crc kubenswrapper[4823]: I1216 06:56:15.094588 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:15 crc kubenswrapper[4823]: I1216 06:56:15.094598 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:15 crc kubenswrapper[4823]: I1216 06:56:15.094620 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:15 crc kubenswrapper[4823]: I1216 06:56:15.094631 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:15Z","lastTransitionTime":"2025-12-16T06:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:15 crc kubenswrapper[4823]: I1216 06:56:15.100342 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e388fa291257a537c122a1f3f0eb6a60eb74e4bcaaf32c4fe829be2da1bfdae9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:15Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:15 crc kubenswrapper[4823]: I1216 06:56:15.111387 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:15Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:15 crc kubenswrapper[4823]: I1216 06:56:15.121126 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"25dec47c-3043-486c-b371-2be103c214e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2bf5eb4e2f587f7084f731ca681a116313b1014b02cdb391b09fdcdd42600c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmpf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78aed5cdf8d3a29dfffcdabfd18b38c0cf762258
53078e01602198bee4994536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmpf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fv56f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:15Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:15 crc kubenswrapper[4823]: I1216 06:56:15.196470 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:15 crc kubenswrapper[4823]: I1216 06:56:15.196511 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:15 crc kubenswrapper[4823]: I1216 06:56:15.196523 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:15 crc 
kubenswrapper[4823]: I1216 06:56:15.196538 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:15 crc kubenswrapper[4823]: I1216 06:56:15.196549 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:15Z","lastTransitionTime":"2025-12-16T06:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:56:15 crc kubenswrapper[4823]: I1216 06:56:15.299887 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:15 crc kubenswrapper[4823]: I1216 06:56:15.299938 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:15 crc kubenswrapper[4823]: I1216 06:56:15.299977 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:15 crc kubenswrapper[4823]: I1216 06:56:15.299995 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:15 crc kubenswrapper[4823]: I1216 06:56:15.300008 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:15Z","lastTransitionTime":"2025-12-16T06:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:15 crc kubenswrapper[4823]: I1216 06:56:15.402763 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:15 crc kubenswrapper[4823]: I1216 06:56:15.402820 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:15 crc kubenswrapper[4823]: I1216 06:56:15.402835 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:15 crc kubenswrapper[4823]: I1216 06:56:15.402858 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:15 crc kubenswrapper[4823]: I1216 06:56:15.402871 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:15Z","lastTransitionTime":"2025-12-16T06:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:15 crc kubenswrapper[4823]: I1216 06:56:15.504762 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:15 crc kubenswrapper[4823]: I1216 06:56:15.504816 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:15 crc kubenswrapper[4823]: I1216 06:56:15.504829 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:15 crc kubenswrapper[4823]: I1216 06:56:15.504846 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:15 crc kubenswrapper[4823]: I1216 06:56:15.504857 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:15Z","lastTransitionTime":"2025-12-16T06:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:15 crc kubenswrapper[4823]: I1216 06:56:15.611216 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:15 crc kubenswrapper[4823]: I1216 06:56:15.611256 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:15 crc kubenswrapper[4823]: I1216 06:56:15.611270 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:15 crc kubenswrapper[4823]: I1216 06:56:15.611286 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:15 crc kubenswrapper[4823]: I1216 06:56:15.611296 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:15Z","lastTransitionTime":"2025-12-16T06:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:15 crc kubenswrapper[4823]: I1216 06:56:15.713422 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:15 crc kubenswrapper[4823]: I1216 06:56:15.713491 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:15 crc kubenswrapper[4823]: I1216 06:56:15.713509 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:15 crc kubenswrapper[4823]: I1216 06:56:15.713538 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:15 crc kubenswrapper[4823]: I1216 06:56:15.713556 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:15Z","lastTransitionTime":"2025-12-16T06:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:56:15 crc kubenswrapper[4823]: I1216 06:56:15.770549 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8mn7l" Dec 16 06:56:15 crc kubenswrapper[4823]: I1216 06:56:15.770588 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:56:15 crc kubenswrapper[4823]: I1216 06:56:15.770645 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:56:15 crc kubenswrapper[4823]: I1216 06:56:15.770709 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:56:15 crc kubenswrapper[4823]: E1216 06:56:15.770707 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8mn7l" podUID="1e7dd738-a9b3-455c-93e0-3f0dc7327817" Dec 16 06:56:15 crc kubenswrapper[4823]: E1216 06:56:15.770807 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 06:56:15 crc kubenswrapper[4823]: E1216 06:56:15.770976 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 06:56:15 crc kubenswrapper[4823]: E1216 06:56:15.771127 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 06:56:15 crc kubenswrapper[4823]: I1216 06:56:15.815765 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:15 crc kubenswrapper[4823]: I1216 06:56:15.815817 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:15 crc kubenswrapper[4823]: I1216 06:56:15.815832 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:15 crc kubenswrapper[4823]: I1216 06:56:15.815854 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:15 crc kubenswrapper[4823]: I1216 06:56:15.815870 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:15Z","lastTransitionTime":"2025-12-16T06:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:15 crc kubenswrapper[4823]: I1216 06:56:15.918435 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:15 crc kubenswrapper[4823]: I1216 06:56:15.918475 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:15 crc kubenswrapper[4823]: I1216 06:56:15.918486 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:15 crc kubenswrapper[4823]: I1216 06:56:15.918503 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:15 crc kubenswrapper[4823]: I1216 06:56:15.918512 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:15Z","lastTransitionTime":"2025-12-16T06:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:16 crc kubenswrapper[4823]: I1216 06:56:16.021298 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:16 crc kubenswrapper[4823]: I1216 06:56:16.021339 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:16 crc kubenswrapper[4823]: I1216 06:56:16.021352 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:16 crc kubenswrapper[4823]: I1216 06:56:16.021368 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:16 crc kubenswrapper[4823]: I1216 06:56:16.021380 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:16Z","lastTransitionTime":"2025-12-16T06:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:16 crc kubenswrapper[4823]: I1216 06:56:16.124466 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:16 crc kubenswrapper[4823]: I1216 06:56:16.124517 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:16 crc kubenswrapper[4823]: I1216 06:56:16.124532 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:16 crc kubenswrapper[4823]: I1216 06:56:16.124551 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:16 crc kubenswrapper[4823]: I1216 06:56:16.124562 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:16Z","lastTransitionTime":"2025-12-16T06:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:16 crc kubenswrapper[4823]: I1216 06:56:16.227373 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:16 crc kubenswrapper[4823]: I1216 06:56:16.227401 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:16 crc kubenswrapper[4823]: I1216 06:56:16.227409 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:16 crc kubenswrapper[4823]: I1216 06:56:16.227422 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:16 crc kubenswrapper[4823]: I1216 06:56:16.227433 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:16Z","lastTransitionTime":"2025-12-16T06:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:16 crc kubenswrapper[4823]: I1216 06:56:16.329700 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:16 crc kubenswrapper[4823]: I1216 06:56:16.329756 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:16 crc kubenswrapper[4823]: I1216 06:56:16.329813 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:16 crc kubenswrapper[4823]: I1216 06:56:16.329833 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:16 crc kubenswrapper[4823]: I1216 06:56:16.329844 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:16Z","lastTransitionTime":"2025-12-16T06:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:16 crc kubenswrapper[4823]: I1216 06:56:16.432323 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:16 crc kubenswrapper[4823]: I1216 06:56:16.432696 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:16 crc kubenswrapper[4823]: I1216 06:56:16.432830 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:16 crc kubenswrapper[4823]: I1216 06:56:16.432971 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:16 crc kubenswrapper[4823]: I1216 06:56:16.433141 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:16Z","lastTransitionTime":"2025-12-16T06:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:16 crc kubenswrapper[4823]: I1216 06:56:16.536546 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:16 crc kubenswrapper[4823]: I1216 06:56:16.536961 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:16 crc kubenswrapper[4823]: I1216 06:56:16.537205 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:16 crc kubenswrapper[4823]: I1216 06:56:16.537667 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:16 crc kubenswrapper[4823]: I1216 06:56:16.537999 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:16Z","lastTransitionTime":"2025-12-16T06:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:16 crc kubenswrapper[4823]: I1216 06:56:16.640562 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:16 crc kubenswrapper[4823]: I1216 06:56:16.640612 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:16 crc kubenswrapper[4823]: I1216 06:56:16.640631 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:16 crc kubenswrapper[4823]: I1216 06:56:16.640650 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:16 crc kubenswrapper[4823]: I1216 06:56:16.640664 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:16Z","lastTransitionTime":"2025-12-16T06:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:16 crc kubenswrapper[4823]: I1216 06:56:16.743675 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:16 crc kubenswrapper[4823]: I1216 06:56:16.743716 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:16 crc kubenswrapper[4823]: I1216 06:56:16.743726 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:16 crc kubenswrapper[4823]: I1216 06:56:16.743742 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:16 crc kubenswrapper[4823]: I1216 06:56:16.743751 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:16Z","lastTransitionTime":"2025-12-16T06:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:16 crc kubenswrapper[4823]: I1216 06:56:16.846542 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:16 crc kubenswrapper[4823]: I1216 06:56:16.846578 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:16 crc kubenswrapper[4823]: I1216 06:56:16.846588 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:16 crc kubenswrapper[4823]: I1216 06:56:16.846636 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:16 crc kubenswrapper[4823]: I1216 06:56:16.846648 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:16Z","lastTransitionTime":"2025-12-16T06:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:16 crc kubenswrapper[4823]: I1216 06:56:16.949378 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:16 crc kubenswrapper[4823]: I1216 06:56:16.949419 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:16 crc kubenswrapper[4823]: I1216 06:56:16.949429 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:16 crc kubenswrapper[4823]: I1216 06:56:16.949444 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:16 crc kubenswrapper[4823]: I1216 06:56:16.949455 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:16Z","lastTransitionTime":"2025-12-16T06:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:17 crc kubenswrapper[4823]: I1216 06:56:17.052289 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:17 crc kubenswrapper[4823]: I1216 06:56:17.052339 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:17 crc kubenswrapper[4823]: I1216 06:56:17.052352 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:17 crc kubenswrapper[4823]: I1216 06:56:17.052373 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:17 crc kubenswrapper[4823]: I1216 06:56:17.052387 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:17Z","lastTransitionTime":"2025-12-16T06:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:17 crc kubenswrapper[4823]: I1216 06:56:17.155186 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:17 crc kubenswrapper[4823]: I1216 06:56:17.155506 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:17 crc kubenswrapper[4823]: I1216 06:56:17.155619 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:17 crc kubenswrapper[4823]: I1216 06:56:17.155751 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:17 crc kubenswrapper[4823]: I1216 06:56:17.155923 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:17Z","lastTransitionTime":"2025-12-16T06:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:17 crc kubenswrapper[4823]: I1216 06:56:17.258624 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:17 crc kubenswrapper[4823]: I1216 06:56:17.258684 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:17 crc kubenswrapper[4823]: I1216 06:56:17.258708 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:17 crc kubenswrapper[4823]: I1216 06:56:17.258732 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:17 crc kubenswrapper[4823]: I1216 06:56:17.258747 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:17Z","lastTransitionTime":"2025-12-16T06:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:17 crc kubenswrapper[4823]: I1216 06:56:17.360957 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:17 crc kubenswrapper[4823]: I1216 06:56:17.360990 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:17 crc kubenswrapper[4823]: I1216 06:56:17.360998 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:17 crc kubenswrapper[4823]: I1216 06:56:17.361011 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:17 crc kubenswrapper[4823]: I1216 06:56:17.361042 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:17Z","lastTransitionTime":"2025-12-16T06:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:17 crc kubenswrapper[4823]: I1216 06:56:17.463633 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:17 crc kubenswrapper[4823]: I1216 06:56:17.463664 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:17 crc kubenswrapper[4823]: I1216 06:56:17.463672 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:17 crc kubenswrapper[4823]: I1216 06:56:17.463707 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:17 crc kubenswrapper[4823]: I1216 06:56:17.463717 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:17Z","lastTransitionTime":"2025-12-16T06:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:17 crc kubenswrapper[4823]: I1216 06:56:17.566195 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:17 crc kubenswrapper[4823]: I1216 06:56:17.566260 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:17 crc kubenswrapper[4823]: I1216 06:56:17.566271 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:17 crc kubenswrapper[4823]: I1216 06:56:17.566284 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:17 crc kubenswrapper[4823]: I1216 06:56:17.566293 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:17Z","lastTransitionTime":"2025-12-16T06:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:17 crc kubenswrapper[4823]: I1216 06:56:17.668500 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:17 crc kubenswrapper[4823]: I1216 06:56:17.668823 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:17 crc kubenswrapper[4823]: I1216 06:56:17.668906 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:17 crc kubenswrapper[4823]: I1216 06:56:17.669017 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:17 crc kubenswrapper[4823]: I1216 06:56:17.669163 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:17Z","lastTransitionTime":"2025-12-16T06:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:56:17 crc kubenswrapper[4823]: I1216 06:56:17.770600 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8mn7l" Dec 16 06:56:17 crc kubenswrapper[4823]: I1216 06:56:17.770629 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:56:17 crc kubenswrapper[4823]: I1216 06:56:17.770653 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:56:17 crc kubenswrapper[4823]: E1216 06:56:17.770781 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8mn7l" podUID="1e7dd738-a9b3-455c-93e0-3f0dc7327817" Dec 16 06:56:17 crc kubenswrapper[4823]: I1216 06:56:17.770602 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:56:17 crc kubenswrapper[4823]: E1216 06:56:17.771046 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 06:56:17 crc kubenswrapper[4823]: E1216 06:56:17.771149 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 06:56:17 crc kubenswrapper[4823]: E1216 06:56:17.771247 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 06:56:17 crc kubenswrapper[4823]: I1216 06:56:17.772471 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:17 crc kubenswrapper[4823]: I1216 06:56:17.772611 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:17 crc kubenswrapper[4823]: I1216 06:56:17.772722 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:17 crc kubenswrapper[4823]: I1216 06:56:17.772912 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:17 crc kubenswrapper[4823]: I1216 06:56:17.773101 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:17Z","lastTransitionTime":"2025-12-16T06:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:17 crc kubenswrapper[4823]: I1216 06:56:17.876823 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:17 crc kubenswrapper[4823]: I1216 06:56:17.877212 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:17 crc kubenswrapper[4823]: I1216 06:56:17.877394 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:17 crc kubenswrapper[4823]: I1216 06:56:17.877480 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:17 crc kubenswrapper[4823]: I1216 06:56:17.877564 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:17Z","lastTransitionTime":"2025-12-16T06:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:17 crc kubenswrapper[4823]: I1216 06:56:17.980348 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:17 crc kubenswrapper[4823]: I1216 06:56:17.980385 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:17 crc kubenswrapper[4823]: I1216 06:56:17.980393 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:17 crc kubenswrapper[4823]: I1216 06:56:17.980406 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:17 crc kubenswrapper[4823]: I1216 06:56:17.980414 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:17Z","lastTransitionTime":"2025-12-16T06:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:18 crc kubenswrapper[4823]: I1216 06:56:18.082836 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:18 crc kubenswrapper[4823]: I1216 06:56:18.083141 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:18 crc kubenswrapper[4823]: I1216 06:56:18.083237 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:18 crc kubenswrapper[4823]: I1216 06:56:18.083331 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:18 crc kubenswrapper[4823]: I1216 06:56:18.083431 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:18Z","lastTransitionTime":"2025-12-16T06:56:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:18 crc kubenswrapper[4823]: I1216 06:56:18.186823 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:18 crc kubenswrapper[4823]: I1216 06:56:18.186866 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:18 crc kubenswrapper[4823]: I1216 06:56:18.186876 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:18 crc kubenswrapper[4823]: I1216 06:56:18.186892 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:18 crc kubenswrapper[4823]: I1216 06:56:18.186903 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:18Z","lastTransitionTime":"2025-12-16T06:56:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:18 crc kubenswrapper[4823]: I1216 06:56:18.289412 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:18 crc kubenswrapper[4823]: I1216 06:56:18.289460 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:18 crc kubenswrapper[4823]: I1216 06:56:18.289471 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:18 crc kubenswrapper[4823]: I1216 06:56:18.289491 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:18 crc kubenswrapper[4823]: I1216 06:56:18.289503 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:18Z","lastTransitionTime":"2025-12-16T06:56:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:18 crc kubenswrapper[4823]: I1216 06:56:18.391936 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:18 crc kubenswrapper[4823]: I1216 06:56:18.391999 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:18 crc kubenswrapper[4823]: I1216 06:56:18.392010 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:18 crc kubenswrapper[4823]: I1216 06:56:18.392036 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:18 crc kubenswrapper[4823]: I1216 06:56:18.392045 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:18Z","lastTransitionTime":"2025-12-16T06:56:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:18 crc kubenswrapper[4823]: I1216 06:56:18.494810 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:18 crc kubenswrapper[4823]: I1216 06:56:18.495466 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:18 crc kubenswrapper[4823]: I1216 06:56:18.495617 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:18 crc kubenswrapper[4823]: I1216 06:56:18.495818 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:18 crc kubenswrapper[4823]: I1216 06:56:18.495966 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:18Z","lastTransitionTime":"2025-12-16T06:56:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:18 crc kubenswrapper[4823]: I1216 06:56:18.598728 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:18 crc kubenswrapper[4823]: I1216 06:56:18.598780 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:18 crc kubenswrapper[4823]: I1216 06:56:18.598788 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:18 crc kubenswrapper[4823]: I1216 06:56:18.598806 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:18 crc kubenswrapper[4823]: I1216 06:56:18.598816 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:18Z","lastTransitionTime":"2025-12-16T06:56:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:18 crc kubenswrapper[4823]: I1216 06:56:18.701229 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:18 crc kubenswrapper[4823]: I1216 06:56:18.701295 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:18 crc kubenswrapper[4823]: I1216 06:56:18.701319 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:18 crc kubenswrapper[4823]: I1216 06:56:18.701351 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:18 crc kubenswrapper[4823]: I1216 06:56:18.701377 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:18Z","lastTransitionTime":"2025-12-16T06:56:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:18 crc kubenswrapper[4823]: I1216 06:56:18.804416 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:18 crc kubenswrapper[4823]: I1216 06:56:18.804979 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:18 crc kubenswrapper[4823]: I1216 06:56:18.805070 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:18 crc kubenswrapper[4823]: I1216 06:56:18.805136 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:18 crc kubenswrapper[4823]: I1216 06:56:18.805252 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:18Z","lastTransitionTime":"2025-12-16T06:56:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:18 crc kubenswrapper[4823]: I1216 06:56:18.907892 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:18 crc kubenswrapper[4823]: I1216 06:56:18.907928 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:18 crc kubenswrapper[4823]: I1216 06:56:18.907939 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:18 crc kubenswrapper[4823]: I1216 06:56:18.907956 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:18 crc kubenswrapper[4823]: I1216 06:56:18.907968 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:18Z","lastTransitionTime":"2025-12-16T06:56:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:19 crc kubenswrapper[4823]: I1216 06:56:19.010355 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:19 crc kubenswrapper[4823]: I1216 06:56:19.010404 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:19 crc kubenswrapper[4823]: I1216 06:56:19.010416 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:19 crc kubenswrapper[4823]: I1216 06:56:19.010433 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:19 crc kubenswrapper[4823]: I1216 06:56:19.010444 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:19Z","lastTransitionTime":"2025-12-16T06:56:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:19 crc kubenswrapper[4823]: I1216 06:56:19.112814 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:19 crc kubenswrapper[4823]: I1216 06:56:19.112859 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:19 crc kubenswrapper[4823]: I1216 06:56:19.112870 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:19 crc kubenswrapper[4823]: I1216 06:56:19.112886 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:19 crc kubenswrapper[4823]: I1216 06:56:19.112894 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:19Z","lastTransitionTime":"2025-12-16T06:56:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:19 crc kubenswrapper[4823]: I1216 06:56:19.215450 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:19 crc kubenswrapper[4823]: I1216 06:56:19.215493 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:19 crc kubenswrapper[4823]: I1216 06:56:19.215502 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:19 crc kubenswrapper[4823]: I1216 06:56:19.215518 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:19 crc kubenswrapper[4823]: I1216 06:56:19.215527 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:19Z","lastTransitionTime":"2025-12-16T06:56:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:19 crc kubenswrapper[4823]: I1216 06:56:19.317875 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:19 crc kubenswrapper[4823]: I1216 06:56:19.318203 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:19 crc kubenswrapper[4823]: I1216 06:56:19.318307 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:19 crc kubenswrapper[4823]: I1216 06:56:19.318398 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:19 crc kubenswrapper[4823]: I1216 06:56:19.318488 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:19Z","lastTransitionTime":"2025-12-16T06:56:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:19 crc kubenswrapper[4823]: I1216 06:56:19.421398 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:19 crc kubenswrapper[4823]: I1216 06:56:19.421459 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:19 crc kubenswrapper[4823]: I1216 06:56:19.421468 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:19 crc kubenswrapper[4823]: I1216 06:56:19.421486 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:19 crc kubenswrapper[4823]: I1216 06:56:19.421500 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:19Z","lastTransitionTime":"2025-12-16T06:56:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:19 crc kubenswrapper[4823]: I1216 06:56:19.523900 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:19 crc kubenswrapper[4823]: I1216 06:56:19.523949 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:19 crc kubenswrapper[4823]: I1216 06:56:19.523960 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:19 crc kubenswrapper[4823]: I1216 06:56:19.523979 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:19 crc kubenswrapper[4823]: I1216 06:56:19.523992 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:19Z","lastTransitionTime":"2025-12-16T06:56:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:19 crc kubenswrapper[4823]: I1216 06:56:19.626682 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:19 crc kubenswrapper[4823]: I1216 06:56:19.626723 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:19 crc kubenswrapper[4823]: I1216 06:56:19.626734 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:19 crc kubenswrapper[4823]: I1216 06:56:19.626759 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:19 crc kubenswrapper[4823]: I1216 06:56:19.626777 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:19Z","lastTransitionTime":"2025-12-16T06:56:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:19 crc kubenswrapper[4823]: I1216 06:56:19.729070 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:19 crc kubenswrapper[4823]: I1216 06:56:19.729138 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:19 crc kubenswrapper[4823]: I1216 06:56:19.729149 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:19 crc kubenswrapper[4823]: I1216 06:56:19.729167 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:19 crc kubenswrapper[4823]: I1216 06:56:19.729179 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:19Z","lastTransitionTime":"2025-12-16T06:56:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:56:19 crc kubenswrapper[4823]: I1216 06:56:19.770901 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:56:19 crc kubenswrapper[4823]: E1216 06:56:19.771113 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 06:56:19 crc kubenswrapper[4823]: I1216 06:56:19.771325 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8mn7l" Dec 16 06:56:19 crc kubenswrapper[4823]: I1216 06:56:19.771340 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:56:19 crc kubenswrapper[4823]: E1216 06:56:19.771496 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8mn7l" podUID="1e7dd738-a9b3-455c-93e0-3f0dc7327817" Dec 16 06:56:19 crc kubenswrapper[4823]: I1216 06:56:19.771537 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:56:19 crc kubenswrapper[4823]: E1216 06:56:19.771888 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 06:56:19 crc kubenswrapper[4823]: E1216 06:56:19.771971 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 06:56:19 crc kubenswrapper[4823]: I1216 06:56:19.772315 4823 scope.go:117] "RemoveContainer" containerID="0ac6ba9c9aa8e7822590e52644019034f71acb4dbc336efacda606a8be00a05c" Dec 16 06:56:19 crc kubenswrapper[4823]: E1216 06:56:19.772524 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-zwjhk_openshift-ovn-kubernetes(08e48f89-7095-4ea2-afb5-759591c2b0d4)\"" pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" podUID="08e48f89-7095-4ea2-afb5-759591c2b0d4" Dec 16 06:56:19 crc kubenswrapper[4823]: I1216 06:56:19.831332 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:19 crc kubenswrapper[4823]: I1216 06:56:19.831376 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:19 crc kubenswrapper[4823]: I1216 06:56:19.831387 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:19 crc kubenswrapper[4823]: I1216 06:56:19.831405 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:19 crc kubenswrapper[4823]: I1216 06:56:19.831417 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:19Z","lastTransitionTime":"2025-12-16T06:56:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:19 crc kubenswrapper[4823]: I1216 06:56:19.933670 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:19 crc kubenswrapper[4823]: I1216 06:56:19.933717 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:19 crc kubenswrapper[4823]: I1216 06:56:19.933734 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:19 crc kubenswrapper[4823]: I1216 06:56:19.933758 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:19 crc kubenswrapper[4823]: I1216 06:56:19.933782 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:19Z","lastTransitionTime":"2025-12-16T06:56:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:20 crc kubenswrapper[4823]: I1216 06:56:20.039302 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:20 crc kubenswrapper[4823]: I1216 06:56:20.039358 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:20 crc kubenswrapper[4823]: I1216 06:56:20.039375 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:20 crc kubenswrapper[4823]: I1216 06:56:20.039394 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:20 crc kubenswrapper[4823]: I1216 06:56:20.039405 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:20Z","lastTransitionTime":"2025-12-16T06:56:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:20 crc kubenswrapper[4823]: I1216 06:56:20.141959 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:20 crc kubenswrapper[4823]: I1216 06:56:20.141991 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:20 crc kubenswrapper[4823]: I1216 06:56:20.142035 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:20 crc kubenswrapper[4823]: I1216 06:56:20.142051 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:20 crc kubenswrapper[4823]: I1216 06:56:20.142077 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:20Z","lastTransitionTime":"2025-12-16T06:56:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:20 crc kubenswrapper[4823]: I1216 06:56:20.244184 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:20 crc kubenswrapper[4823]: I1216 06:56:20.244221 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:20 crc kubenswrapper[4823]: I1216 06:56:20.244230 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:20 crc kubenswrapper[4823]: I1216 06:56:20.244245 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:20 crc kubenswrapper[4823]: I1216 06:56:20.244257 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:20Z","lastTransitionTime":"2025-12-16T06:56:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:20 crc kubenswrapper[4823]: I1216 06:56:20.346452 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:20 crc kubenswrapper[4823]: I1216 06:56:20.346494 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:20 crc kubenswrapper[4823]: I1216 06:56:20.346504 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:20 crc kubenswrapper[4823]: I1216 06:56:20.346525 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:20 crc kubenswrapper[4823]: I1216 06:56:20.346536 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:20Z","lastTransitionTime":"2025-12-16T06:56:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:20 crc kubenswrapper[4823]: I1216 06:56:20.449295 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:20 crc kubenswrapper[4823]: I1216 06:56:20.449330 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:20 crc kubenswrapper[4823]: I1216 06:56:20.449339 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:20 crc kubenswrapper[4823]: I1216 06:56:20.449355 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:20 crc kubenswrapper[4823]: I1216 06:56:20.449365 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:20Z","lastTransitionTime":"2025-12-16T06:56:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:20 crc kubenswrapper[4823]: I1216 06:56:20.552621 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:20 crc kubenswrapper[4823]: I1216 06:56:20.552675 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:20 crc kubenswrapper[4823]: I1216 06:56:20.552696 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:20 crc kubenswrapper[4823]: I1216 06:56:20.552726 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:20 crc kubenswrapper[4823]: I1216 06:56:20.552749 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:20Z","lastTransitionTime":"2025-12-16T06:56:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:20 crc kubenswrapper[4823]: I1216 06:56:20.654908 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:20 crc kubenswrapper[4823]: I1216 06:56:20.654953 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:20 crc kubenswrapper[4823]: I1216 06:56:20.654961 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:20 crc kubenswrapper[4823]: I1216 06:56:20.654978 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:20 crc kubenswrapper[4823]: I1216 06:56:20.654988 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:20Z","lastTransitionTime":"2025-12-16T06:56:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:20 crc kubenswrapper[4823]: I1216 06:56:20.756945 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:20 crc kubenswrapper[4823]: I1216 06:56:20.756997 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:20 crc kubenswrapper[4823]: I1216 06:56:20.757008 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:20 crc kubenswrapper[4823]: I1216 06:56:20.757041 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:20 crc kubenswrapper[4823]: I1216 06:56:20.757054 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:20Z","lastTransitionTime":"2025-12-16T06:56:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:20 crc kubenswrapper[4823]: I1216 06:56:20.859492 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:20 crc kubenswrapper[4823]: I1216 06:56:20.859528 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:20 crc kubenswrapper[4823]: I1216 06:56:20.859537 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:20 crc kubenswrapper[4823]: I1216 06:56:20.859550 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:20 crc kubenswrapper[4823]: I1216 06:56:20.859562 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:20Z","lastTransitionTime":"2025-12-16T06:56:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:20 crc kubenswrapper[4823]: I1216 06:56:20.962527 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:20 crc kubenswrapper[4823]: I1216 06:56:20.962576 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:20 crc kubenswrapper[4823]: I1216 06:56:20.962587 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:20 crc kubenswrapper[4823]: I1216 06:56:20.962614 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:20 crc kubenswrapper[4823]: I1216 06:56:20.962624 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:20Z","lastTransitionTime":"2025-12-16T06:56:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:21 crc kubenswrapper[4823]: I1216 06:56:21.065316 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:21 crc kubenswrapper[4823]: I1216 06:56:21.065372 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:21 crc kubenswrapper[4823]: I1216 06:56:21.065384 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:21 crc kubenswrapper[4823]: I1216 06:56:21.065401 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:21 crc kubenswrapper[4823]: I1216 06:56:21.065416 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:21Z","lastTransitionTime":"2025-12-16T06:56:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:21 crc kubenswrapper[4823]: I1216 06:56:21.168019 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:21 crc kubenswrapper[4823]: I1216 06:56:21.168128 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:21 crc kubenswrapper[4823]: I1216 06:56:21.168151 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:21 crc kubenswrapper[4823]: I1216 06:56:21.168185 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:21 crc kubenswrapper[4823]: I1216 06:56:21.168211 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:21Z","lastTransitionTime":"2025-12-16T06:56:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:21 crc kubenswrapper[4823]: I1216 06:56:21.271530 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:21 crc kubenswrapper[4823]: I1216 06:56:21.271591 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:21 crc kubenswrapper[4823]: I1216 06:56:21.271647 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:21 crc kubenswrapper[4823]: I1216 06:56:21.271671 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:21 crc kubenswrapper[4823]: I1216 06:56:21.271684 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:21Z","lastTransitionTime":"2025-12-16T06:56:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:21 crc kubenswrapper[4823]: I1216 06:56:21.374294 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:21 crc kubenswrapper[4823]: I1216 06:56:21.374349 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:21 crc kubenswrapper[4823]: I1216 06:56:21.374362 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:21 crc kubenswrapper[4823]: I1216 06:56:21.374381 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:21 crc kubenswrapper[4823]: I1216 06:56:21.374400 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:21Z","lastTransitionTime":"2025-12-16T06:56:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:21 crc kubenswrapper[4823]: I1216 06:56:21.478230 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:21 crc kubenswrapper[4823]: I1216 06:56:21.478302 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:21 crc kubenswrapper[4823]: I1216 06:56:21.478321 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:21 crc kubenswrapper[4823]: I1216 06:56:21.478353 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:21 crc kubenswrapper[4823]: I1216 06:56:21.478377 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:21Z","lastTransitionTime":"2025-12-16T06:56:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:21 crc kubenswrapper[4823]: I1216 06:56:21.582020 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:21 crc kubenswrapper[4823]: I1216 06:56:21.582094 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:21 crc kubenswrapper[4823]: I1216 06:56:21.582107 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:21 crc kubenswrapper[4823]: I1216 06:56:21.582126 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:21 crc kubenswrapper[4823]: I1216 06:56:21.582138 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:21Z","lastTransitionTime":"2025-12-16T06:56:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:21 crc kubenswrapper[4823]: I1216 06:56:21.686050 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:21 crc kubenswrapper[4823]: I1216 06:56:21.686135 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:21 crc kubenswrapper[4823]: I1216 06:56:21.686154 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:21 crc kubenswrapper[4823]: I1216 06:56:21.686178 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:21 crc kubenswrapper[4823]: I1216 06:56:21.686199 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:21Z","lastTransitionTime":"2025-12-16T06:56:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:56:21 crc kubenswrapper[4823]: I1216 06:56:21.770691 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8mn7l" Dec 16 06:56:21 crc kubenswrapper[4823]: E1216 06:56:21.770847 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8mn7l" podUID="1e7dd738-a9b3-455c-93e0-3f0dc7327817" Dec 16 06:56:21 crc kubenswrapper[4823]: I1216 06:56:21.770866 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:56:21 crc kubenswrapper[4823]: E1216 06:56:21.770937 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 06:56:21 crc kubenswrapper[4823]: I1216 06:56:21.771246 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:56:21 crc kubenswrapper[4823]: E1216 06:56:21.771317 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 06:56:21 crc kubenswrapper[4823]: I1216 06:56:21.771376 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:56:21 crc kubenswrapper[4823]: E1216 06:56:21.771433 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 06:56:21 crc kubenswrapper[4823]: I1216 06:56:21.789976 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:21 crc kubenswrapper[4823]: I1216 06:56:21.790764 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:21Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:21 crc kubenswrapper[4823]: I1216 06:56:21.791058 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:21 crc kubenswrapper[4823]: I1216 06:56:21.791085 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:21 crc kubenswrapper[4823]: I1216 06:56:21.791110 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:21 crc kubenswrapper[4823]: I1216 06:56:21.791125 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:21Z","lastTransitionTime":"2025-12-16T06:56:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:21 crc kubenswrapper[4823]: I1216 06:56:21.808331 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63ba735c509770e9d43fe8152b2a42dc4123d0d797c7d41d5e3efe39a6cef74d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e09486327bad82b741ca9260e306f84ed9f0fbc7058bb06267042b8d2cd5f818\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:21Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:21 crc kubenswrapper[4823]: I1216 06:56:21.822707 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:21Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:21 crc kubenswrapper[4823]: I1216 06:56:21.837049 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72dfe197d43de2945a2ee22789ab86205c77588c4f4002e2473cef09e7b4b10b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-16T06:56:21Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:21 crc kubenswrapper[4823]: I1216 06:56:21.850440 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hr8h5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d31d032-9142-4e26-9f06-e3a5ea73d530\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://502e1707b8f53950b8f10fa6b289c9d235d6596fb6eec7e4d69c3158823f8e2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-7sdtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hr8h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:21Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:21 crc kubenswrapper[4823]: I1216 06:56:21.864893 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n248g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b377757-dbc6-4d9c-9656-3ff65d7d113a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78e6c48a7009b10b70f11bfc96f18839781aeccd794f9ce7a4074271cfb9bec
c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-2skkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n248g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:21Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:21 crc kubenswrapper[4823]: I1216 06:56:21.884837 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-964hc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a95a09e1-8457-4d5e-b1b6-dd6f59a66c6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bb1ede33a8f71c9116cfb9401b1f78a6aca07d580e4abe
93370b470d6b20284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b305ee45caf9aff77bb738777dd9b2ebd2e3e63727b460eae22f8f66d1e6cec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b305ee45caf9aff77bb738777dd9b2ebd2e3e63727b460eae22f8f66d1e6cec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\
\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d77b125c0d081148c1892229c933414ac8e1b8e0371f0932512deba1107e94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9d77b125c0d081148c1892229c933414ac8e1b8e0371f0932512deba1107e94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2048aefee6601e678e26821b3ed329aead74791b34654ef5dde9fe261e37b835\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c
78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2048aefee6601e678e26821b3ed329aead74791b34654ef5dde9fe261e37b835\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c574f5f8d3f82a7a21c7f8e49e7927387be9e4029e28ba265d39ea058fb6a60f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c574f5f8d3f82a7a21c7f8e49e7927387be9e4029e28ba265d39ea058fb6a60f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e32bdec2a02013c087ccc13325d44061d85bcf9a352f2dc31ed506f11db6428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e32bdec2a02013c087ccc13325d44061d85bcf9a352f2dc31ed506f11db6428\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91427f2f713f7e1c2ba16e87252e2fd0479a31e6f94884ad85fe9f99ab84bedd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91427f2f713f7e1c2ba16e87252e2fd0479a31e6f94884ad85fe9f99ab84bedd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-964hc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:21Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:21 crc kubenswrapper[4823]: I1216 06:56:21.895416 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:21 crc kubenswrapper[4823]: I1216 06:56:21.895482 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:21 crc kubenswrapper[4823]: I1216 06:56:21.895496 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:21 crc kubenswrapper[4823]: I1216 06:56:21.895535 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:21 crc kubenswrapper[4823]: I1216 06:56:21.895550 4823 setters.go:603] 
"Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:21Z","lastTransitionTime":"2025-12-16T06:56:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:56:21 crc kubenswrapper[4823]: I1216 06:56:21.898951 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bwcng" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ff057ef-c324-4465-8b8d-c7b98c25b23c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea36a181f0874fbab3022e6ce27567b65eb8c01cdb2925b08d9c0782af7e93c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2bj4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bwcng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:21Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:21 crc kubenswrapper[4823]: I1216 06:56:21.912528 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f23abdf3-bae6-4239-b0b0-2cb717be2ffb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:56:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:56:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf31c05ee4aae2a94ca03ae1c3af6a8e748104346b05bc56a75bffd14c4ef993\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59b38847905d672f7a59b8a77a7def857a70111c48d5f7d06180f91a42ae79d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86b4d125316b63df68beb204f3618d0d82c2646f505297c24d07a584732e19f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31d129a77e2eea43c6bd7305f37db4e804cc07e039ce90258fc9d239dbd12aa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://31d129a77e2eea43c6bd7305f37db4e804cc07e039ce90258fc9d239dbd12aa9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:21Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:21 crc kubenswrapper[4823]: I1216 06:56:21.933944 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08e48f89-7095-4ea2-afb5-759591c2b0d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3d0f51f7ac1df7a9ae61306ccbfe2c93b397b6ab9ae0a477ab4eb46248d36f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b768d92db57918a41c53b83940957c0181865808eb2c795231a5f8dbbf2bde82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f2cea674c0936cca4e9e95dfb7cd15bf9ccb3fbf8b711f351f87b632aad65c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://055585c1217ea720b929c9da82e7ea662d6fd5634244d362b3b425faee380d92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1c6cb023dcc6d1b149de201b108e3b1f3e32530dadc822aedede5552efc586b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://304f9e1fee6b648f61b3b334284a954e8107ff70ca71d01b586d137490eeab20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ac6ba9c9aa8e7822590e52644019034f71acb4dbc336efacda606a8be00a05c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ac6ba9c9aa8e7822590e52644019034f71acb4dbc336efacda606a8be00a05c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T06:56:07Z\\\",\\\"message\\\":\\\"06:56:06.827886 6449 obj_retry.go:365] Adding new object: *v1.Pod 
openshift-multus/multus-additional-cni-plugins-964hc\\\\nI1216 06:56:06.827888 6449 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-node-zwjhk in node crc\\\\nF1216 06:56:06.827887 6449 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:06Z is after 2025-08-24T17:21:41Z]\\\\nI1216 06:56:06.827901 6449 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-additional-cni-plugins-964hc in node crc\\\\nI1216 06:56:06.827899 6449 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-zwjhk after 0 fail\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:56:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zwjhk_openshift-ovn-kubernetes(08e48f89-7095-4ea2-afb5-759591c2b0d4)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c223f5207e2445fd4bfd1bfe2c1c1ca50b5211a92b02c21445534fbfa2709e28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e3566766c9c24f5c12481b74ed8a3938b21bdb36c003146515c67170d74a71f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e3566766c9c24f5c1
2481b74ed8a3938b21bdb36c003146515c67170d74a71f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zwjhk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:21Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:21 crc kubenswrapper[4823]: I1216 06:56:21.956233 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3de3a2ad-dc6c-47ba-af8d-4f128e025aad\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c40d3d73f70c8983adc8d076d89864e7224528dae97252396a1c34bd4cb804e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50bb40e9d22674554f2a267df7c9f924a29131ac986877bf19572cc5992ba396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9c22b0787554b4bdf0b0068fc696b8515e2ee63affef23e5eb64f77bc32a624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a32894b8f22c5335ed585c26fcab727324803d768943f40455a592025cbfd0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1560fb0ba0c40a2797688b518c22c9164a8cbe9a265cb2ad95408ba86b0fb537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3d6894d867564eea62c7d78dd6ca62a9913787ba06b786722e478785ba796ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3d6894d867564eea62c7d78dd6ca62a9913787ba06b786722e478785ba796ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-16T06:55:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e551b627e90b56385fe5618d3e66ea7376a2d574589583c005373aa29db1d410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e551b627e90b56385fe5618d3e66ea7376a2d574589583c005373aa29db1d410\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://631458d415d64c3cd4dfd24771644393aa6c01288de8f7a5395cf8888db67d2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://631458d415d64c3cd4dfd24771644393aa6c01288de8f7a5395cf8888db67d2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:21Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:21 crc kubenswrapper[4823]: I1216 06:56:21.971891 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e388fa291257a537c122a1f3f0eb6a60eb74e4bcaaf32c4fe829be2da1bfdae9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:21Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:21 crc kubenswrapper[4823]: I1216 06:56:21.984667 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:21Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:21 crc kubenswrapper[4823]: I1216 06:56:21.997569 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"25dec47c-3043-486c-b371-2be103c214e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2bf5eb4e2f587f7084f731ca681a116313b1014b02cdb391b09fdcdd42600c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmpf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78aed5cdf8d3a29dfffcdabfd18b38c0cf762258
53078e01602198bee4994536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmpf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fv56f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:21Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:21 crc kubenswrapper[4823]: I1216 06:56:21.998637 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:21 crc kubenswrapper[4823]: I1216 06:56:21.998688 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:21 crc kubenswrapper[4823]: I1216 06:56:21.998701 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:21 crc 
kubenswrapper[4823]: I1216 06:56:21.998720 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:21 crc kubenswrapper[4823]: I1216 06:56:21.998732 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:21Z","lastTransitionTime":"2025-12-16T06:56:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:56:22 crc kubenswrapper[4823]: I1216 06:56:22.009562 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v5mgh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42574a3f-0701-4192-b16c-bdb9be6c2888\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3a26eba39f7b07ed59337e21910571235eaa797f43231f4962869caf58a9515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t54x6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e305106870ce1b62af84310a83d4ba0d529312470315b95ae7e41e4f0d378e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t54x6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16
T06:55:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-v5mgh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:22Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:22 crc kubenswrapper[4823]: I1216 06:56:22.023227 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c915dd50-9820-494e-b47a-987257910a57\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c33a30aed9756ef012ff9046309ec07897de3b17707ea39c621534ed863c3adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"starte
d\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://464ab3cf14920ae3f947ce3dccacd7379d50380699008411d0a51bd83fe3e51c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20f7e0e3c07f467643afdcc6f0e333353d2d10b6d0c0d7aff0d347d33a5d925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\
\"containerID\\\":\\\"cri-o://7ddfb7d6dae98465369260c8e9c64a2aa21dbf226e7898f6de0c36b3806bcae8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c9e6ec333ac552dc972c31cce23104cc27bcc2cf11bdc52d89ab36172915a41\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"ed a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1853467785/tls.crt::/tmp/serving-cert-1853467785/tls.key\\\\\\\"\\\\nI1216 06:55:39.731946 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 06:55:39.736431 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 06:55:39.736560 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 06:55:39.736611 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 06:55:39.736722 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 06:55:39.741676 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 06:55:39.741701 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 06:55:39.741707 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 06:55:39.741713 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 06:55:39.741717 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 06:55:39.741721 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 06:55:39.741725 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 06:55:39.741774 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 06:55:39.744309 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1853467785/tls.crt::/tmp/serving-cert-1853467785/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765868124\\\\\\\\\\\\\\\" (2025-12-16 06:55:23 +0000 UTC to 2026-01-15 06:55:24 +0000 UTC (now=2025-12-16 06:55:39.744275203 +0000 UTC))\\\\\\\"\\\\nF1216 06:55:39.744562 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c84e7f6950c8d433823b7d5ae3d8a9ecaa70ab6845ce607b8bb93efa990325a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\
"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84f11d2687959c8525b131966a998af137caa69a480bb31a94e1615a3ccbea5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84f11d2687959c8525b131966a998af137caa69a480bb31a94e1615a3ccbea5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:22Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:22 crc kubenswrapper[4823]: I1216 06:56:22.034371 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4c70365-dff4-4b29-af25-657fd9823db4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59ddc7310ed1600d050e488762436763220f3c2946a4c2020346da4ccef46b1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://275328a704b475b037c82f92055f9ca7f83b9821325603c5e34090140cc486ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7fa96323e345f8891f51500618a7b07d128d3e7256a3534d9dca75ff8ce9edf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d706c1463d2c913afe727db1a87a2841698d91a586c84d6b55f02865e7f5584a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:22Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:22 crc kubenswrapper[4823]: I1216 06:56:22.045468 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8mn7l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e7dd738-a9b3-455c-93e0-3f0dc7327817\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2crtc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2crtc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8mn7l\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:22Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:22 crc kubenswrapper[4823]: I1216 06:56:22.101062 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:22 crc kubenswrapper[4823]: I1216 06:56:22.101106 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:22 crc kubenswrapper[4823]: I1216 06:56:22.101123 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:22 crc kubenswrapper[4823]: I1216 06:56:22.101145 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:22 crc kubenswrapper[4823]: I1216 06:56:22.101158 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:22Z","lastTransitionTime":"2025-12-16T06:56:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:22 crc kubenswrapper[4823]: I1216 06:56:22.203468 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:22 crc kubenswrapper[4823]: I1216 06:56:22.203526 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:22 crc kubenswrapper[4823]: I1216 06:56:22.203539 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:22 crc kubenswrapper[4823]: I1216 06:56:22.203556 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:22 crc kubenswrapper[4823]: I1216 06:56:22.203569 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:22Z","lastTransitionTime":"2025-12-16T06:56:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:22 crc kubenswrapper[4823]: I1216 06:56:22.306464 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:22 crc kubenswrapper[4823]: I1216 06:56:22.306516 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:22 crc kubenswrapper[4823]: I1216 06:56:22.306531 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:22 crc kubenswrapper[4823]: I1216 06:56:22.306548 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:22 crc kubenswrapper[4823]: I1216 06:56:22.306558 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:22Z","lastTransitionTime":"2025-12-16T06:56:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:22 crc kubenswrapper[4823]: I1216 06:56:22.409506 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:22 crc kubenswrapper[4823]: I1216 06:56:22.409554 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:22 crc kubenswrapper[4823]: I1216 06:56:22.409564 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:22 crc kubenswrapper[4823]: I1216 06:56:22.409581 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:22 crc kubenswrapper[4823]: I1216 06:56:22.409592 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:22Z","lastTransitionTime":"2025-12-16T06:56:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:22 crc kubenswrapper[4823]: I1216 06:56:22.511745 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:22 crc kubenswrapper[4823]: I1216 06:56:22.511788 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:22 crc kubenswrapper[4823]: I1216 06:56:22.511797 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:22 crc kubenswrapper[4823]: I1216 06:56:22.511812 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:22 crc kubenswrapper[4823]: I1216 06:56:22.511824 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:22Z","lastTransitionTime":"2025-12-16T06:56:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:22 crc kubenswrapper[4823]: I1216 06:56:22.614475 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:22 crc kubenswrapper[4823]: I1216 06:56:22.614526 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:22 crc kubenswrapper[4823]: I1216 06:56:22.614538 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:22 crc kubenswrapper[4823]: I1216 06:56:22.614558 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:22 crc kubenswrapper[4823]: I1216 06:56:22.614570 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:22Z","lastTransitionTime":"2025-12-16T06:56:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:22 crc kubenswrapper[4823]: I1216 06:56:22.717578 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:22 crc kubenswrapper[4823]: I1216 06:56:22.717627 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:22 crc kubenswrapper[4823]: I1216 06:56:22.717640 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:22 crc kubenswrapper[4823]: I1216 06:56:22.717664 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:22 crc kubenswrapper[4823]: I1216 06:56:22.717679 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:22Z","lastTransitionTime":"2025-12-16T06:56:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:22 crc kubenswrapper[4823]: I1216 06:56:22.820098 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:22 crc kubenswrapper[4823]: I1216 06:56:22.820175 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:22 crc kubenswrapper[4823]: I1216 06:56:22.820192 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:22 crc kubenswrapper[4823]: I1216 06:56:22.820218 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:22 crc kubenswrapper[4823]: I1216 06:56:22.820275 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:22Z","lastTransitionTime":"2025-12-16T06:56:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:22 crc kubenswrapper[4823]: I1216 06:56:22.923618 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:22 crc kubenswrapper[4823]: I1216 06:56:22.923662 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:22 crc kubenswrapper[4823]: I1216 06:56:22.923673 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:22 crc kubenswrapper[4823]: I1216 06:56:22.923697 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:22 crc kubenswrapper[4823]: I1216 06:56:22.923711 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:22Z","lastTransitionTime":"2025-12-16T06:56:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:23 crc kubenswrapper[4823]: I1216 06:56:23.027145 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:23 crc kubenswrapper[4823]: I1216 06:56:23.027183 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:23 crc kubenswrapper[4823]: I1216 06:56:23.027192 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:23 crc kubenswrapper[4823]: I1216 06:56:23.027207 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:23 crc kubenswrapper[4823]: I1216 06:56:23.027217 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:23Z","lastTransitionTime":"2025-12-16T06:56:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:23 crc kubenswrapper[4823]: I1216 06:56:23.130879 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:23 crc kubenswrapper[4823]: I1216 06:56:23.130961 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:23 crc kubenswrapper[4823]: I1216 06:56:23.130986 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:23 crc kubenswrapper[4823]: I1216 06:56:23.131019 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:23 crc kubenswrapper[4823]: I1216 06:56:23.131092 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:23Z","lastTransitionTime":"2025-12-16T06:56:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:23 crc kubenswrapper[4823]: I1216 06:56:23.234639 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:23 crc kubenswrapper[4823]: I1216 06:56:23.234690 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:23 crc kubenswrapper[4823]: I1216 06:56:23.234704 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:23 crc kubenswrapper[4823]: I1216 06:56:23.234721 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:23 crc kubenswrapper[4823]: I1216 06:56:23.234735 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:23Z","lastTransitionTime":"2025-12-16T06:56:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:23 crc kubenswrapper[4823]: I1216 06:56:23.306553 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:23 crc kubenswrapper[4823]: I1216 06:56:23.306614 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:23 crc kubenswrapper[4823]: I1216 06:56:23.306630 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:23 crc kubenswrapper[4823]: I1216 06:56:23.306650 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:23 crc kubenswrapper[4823]: I1216 06:56:23.306664 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:23Z","lastTransitionTime":"2025-12-16T06:56:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:23 crc kubenswrapper[4823]: E1216 06:56:23.321432 4823 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:56:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:56:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:56:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:56:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:56:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:56:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:56:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:56:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2caa91d7-bd83-4de0-9038-0514886c6d71\\\",\\\"systemUUID\\\":\\\"b35231f6-d02a-487d-8117-57547d768cbe\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:23Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:23 crc kubenswrapper[4823]: I1216 06:56:23.326243 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:23 crc kubenswrapper[4823]: I1216 06:56:23.326283 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:23 crc kubenswrapper[4823]: I1216 06:56:23.326298 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:23 crc kubenswrapper[4823]: I1216 06:56:23.326322 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:23 crc kubenswrapper[4823]: I1216 06:56:23.326341 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:23Z","lastTransitionTime":"2025-12-16T06:56:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:23 crc kubenswrapper[4823]: E1216 06:56:23.342126 4823 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:56:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:56:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:56:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:56:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:56:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:56:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:56:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:56:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2caa91d7-bd83-4de0-9038-0514886c6d71\\\",\\\"systemUUID\\\":\\\"b35231f6-d02a-487d-8117-57547d768cbe\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:23Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:23 crc kubenswrapper[4823]: I1216 06:56:23.347178 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:23 crc kubenswrapper[4823]: I1216 06:56:23.347226 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:23 crc kubenswrapper[4823]: I1216 06:56:23.347237 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:23 crc kubenswrapper[4823]: I1216 06:56:23.347254 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:23 crc kubenswrapper[4823]: I1216 06:56:23.347265 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:23Z","lastTransitionTime":"2025-12-16T06:56:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:23 crc kubenswrapper[4823]: E1216 06:56:23.364000 4823 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:56:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:56:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:56:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:56:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:56:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:56:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:56:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:56:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2caa91d7-bd83-4de0-9038-0514886c6d71\\\",\\\"systemUUID\\\":\\\"b35231f6-d02a-487d-8117-57547d768cbe\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:23Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:23 crc kubenswrapper[4823]: I1216 06:56:23.371011 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:23 crc kubenswrapper[4823]: I1216 06:56:23.371160 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:23 crc kubenswrapper[4823]: I1216 06:56:23.371183 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:23 crc kubenswrapper[4823]: I1216 06:56:23.371261 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:23 crc kubenswrapper[4823]: I1216 06:56:23.371291 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:23Z","lastTransitionTime":"2025-12-16T06:56:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:23 crc kubenswrapper[4823]: E1216 06:56:23.392767 4823 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:56:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:56:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:56:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:56:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:56:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:56:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:56:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:56:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2caa91d7-bd83-4de0-9038-0514886c6d71\\\",\\\"systemUUID\\\":\\\"b35231f6-d02a-487d-8117-57547d768cbe\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:23Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:23 crc kubenswrapper[4823]: I1216 06:56:23.398369 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:23 crc kubenswrapper[4823]: I1216 06:56:23.398415 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:23 crc kubenswrapper[4823]: I1216 06:56:23.398434 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:23 crc kubenswrapper[4823]: I1216 06:56:23.398456 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:23 crc kubenswrapper[4823]: I1216 06:56:23.398471 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:23Z","lastTransitionTime":"2025-12-16T06:56:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:23 crc kubenswrapper[4823]: E1216 06:56:23.420720 4823 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:56:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:56:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:56:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:56:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:56:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:56:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:56:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:56:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2caa91d7-bd83-4de0-9038-0514886c6d71\\\",\\\"systemUUID\\\":\\\"b35231f6-d02a-487d-8117-57547d768cbe\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:23Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:23 crc kubenswrapper[4823]: E1216 06:56:23.421238 4823 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 16 06:56:23 crc kubenswrapper[4823]: I1216 06:56:23.424454 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:23 crc kubenswrapper[4823]: I1216 06:56:23.424517 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:23 crc kubenswrapper[4823]: I1216 06:56:23.424536 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:23 crc kubenswrapper[4823]: I1216 06:56:23.424562 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:23 crc kubenswrapper[4823]: I1216 06:56:23.424582 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:23Z","lastTransitionTime":"2025-12-16T06:56:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:56:23 crc kubenswrapper[4823]: I1216 06:56:23.527053 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:23 crc kubenswrapper[4823]: I1216 06:56:23.527091 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:23 crc kubenswrapper[4823]: I1216 06:56:23.527102 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:23 crc kubenswrapper[4823]: I1216 06:56:23.527118 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:23 crc kubenswrapper[4823]: I1216 06:56:23.527129 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:23Z","lastTransitionTime":"2025-12-16T06:56:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:23 crc kubenswrapper[4823]: I1216 06:56:23.629588 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:23 crc kubenswrapper[4823]: I1216 06:56:23.629643 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:23 crc kubenswrapper[4823]: I1216 06:56:23.629654 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:23 crc kubenswrapper[4823]: I1216 06:56:23.629671 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:23 crc kubenswrapper[4823]: I1216 06:56:23.629682 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:23Z","lastTransitionTime":"2025-12-16T06:56:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:23 crc kubenswrapper[4823]: I1216 06:56:23.733214 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:23 crc kubenswrapper[4823]: I1216 06:56:23.733270 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:23 crc kubenswrapper[4823]: I1216 06:56:23.733284 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:23 crc kubenswrapper[4823]: I1216 06:56:23.733308 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:23 crc kubenswrapper[4823]: I1216 06:56:23.733325 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:23Z","lastTransitionTime":"2025-12-16T06:56:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:56:23 crc kubenswrapper[4823]: I1216 06:56:23.771069 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:56:23 crc kubenswrapper[4823]: I1216 06:56:23.771285 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:56:23 crc kubenswrapper[4823]: I1216 06:56:23.771179 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8mn7l" Dec 16 06:56:23 crc kubenswrapper[4823]: I1216 06:56:23.771179 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:56:23 crc kubenswrapper[4823]: E1216 06:56:23.771556 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 06:56:23 crc kubenswrapper[4823]: E1216 06:56:23.771697 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 06:56:23 crc kubenswrapper[4823]: E1216 06:56:23.771791 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8mn7l" podUID="1e7dd738-a9b3-455c-93e0-3f0dc7327817" Dec 16 06:56:23 crc kubenswrapper[4823]: E1216 06:56:23.771856 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 06:56:23 crc kubenswrapper[4823]: I1216 06:56:23.836244 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:23 crc kubenswrapper[4823]: I1216 06:56:23.836283 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:23 crc kubenswrapper[4823]: I1216 06:56:23.836298 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:23 crc kubenswrapper[4823]: I1216 06:56:23.836316 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:23 crc kubenswrapper[4823]: I1216 06:56:23.836327 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:23Z","lastTransitionTime":"2025-12-16T06:56:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:23 crc kubenswrapper[4823]: I1216 06:56:23.939598 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:23 crc kubenswrapper[4823]: I1216 06:56:23.939671 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:23 crc kubenswrapper[4823]: I1216 06:56:23.939691 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:23 crc kubenswrapper[4823]: I1216 06:56:23.939721 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:23 crc kubenswrapper[4823]: I1216 06:56:23.939744 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:23Z","lastTransitionTime":"2025-12-16T06:56:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:24 crc kubenswrapper[4823]: I1216 06:56:24.042624 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:24 crc kubenswrapper[4823]: I1216 06:56:24.042678 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:24 crc kubenswrapper[4823]: I1216 06:56:24.042688 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:24 crc kubenswrapper[4823]: I1216 06:56:24.042705 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:24 crc kubenswrapper[4823]: I1216 06:56:24.042713 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:24Z","lastTransitionTime":"2025-12-16T06:56:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:24 crc kubenswrapper[4823]: I1216 06:56:24.145690 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:24 crc kubenswrapper[4823]: I1216 06:56:24.146203 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:24 crc kubenswrapper[4823]: I1216 06:56:24.146351 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:24 crc kubenswrapper[4823]: I1216 06:56:24.146513 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:24 crc kubenswrapper[4823]: I1216 06:56:24.146658 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:24Z","lastTransitionTime":"2025-12-16T06:56:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:24 crc kubenswrapper[4823]: I1216 06:56:24.249112 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:24 crc kubenswrapper[4823]: I1216 06:56:24.249150 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:24 crc kubenswrapper[4823]: I1216 06:56:24.249160 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:24 crc kubenswrapper[4823]: I1216 06:56:24.249175 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:24 crc kubenswrapper[4823]: I1216 06:56:24.249187 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:24Z","lastTransitionTime":"2025-12-16T06:56:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:24 crc kubenswrapper[4823]: I1216 06:56:24.351889 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:24 crc kubenswrapper[4823]: I1216 06:56:24.352004 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:24 crc kubenswrapper[4823]: I1216 06:56:24.352014 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:24 crc kubenswrapper[4823]: I1216 06:56:24.352043 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:24 crc kubenswrapper[4823]: I1216 06:56:24.352055 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:24Z","lastTransitionTime":"2025-12-16T06:56:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:24 crc kubenswrapper[4823]: I1216 06:56:24.454706 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:24 crc kubenswrapper[4823]: I1216 06:56:24.454755 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:24 crc kubenswrapper[4823]: I1216 06:56:24.454767 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:24 crc kubenswrapper[4823]: I1216 06:56:24.454787 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:24 crc kubenswrapper[4823]: I1216 06:56:24.454800 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:24Z","lastTransitionTime":"2025-12-16T06:56:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:24 crc kubenswrapper[4823]: I1216 06:56:24.557556 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:24 crc kubenswrapper[4823]: I1216 06:56:24.557599 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:24 crc kubenswrapper[4823]: I1216 06:56:24.557610 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:24 crc kubenswrapper[4823]: I1216 06:56:24.557632 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:24 crc kubenswrapper[4823]: I1216 06:56:24.557644 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:24Z","lastTransitionTime":"2025-12-16T06:56:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:24 crc kubenswrapper[4823]: I1216 06:56:24.660228 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:24 crc kubenswrapper[4823]: I1216 06:56:24.660608 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:24 crc kubenswrapper[4823]: I1216 06:56:24.660712 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:24 crc kubenswrapper[4823]: I1216 06:56:24.660824 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:24 crc kubenswrapper[4823]: I1216 06:56:24.660923 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:24Z","lastTransitionTime":"2025-12-16T06:56:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:24 crc kubenswrapper[4823]: I1216 06:56:24.763846 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:24 crc kubenswrapper[4823]: I1216 06:56:24.764180 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:24 crc kubenswrapper[4823]: I1216 06:56:24.764290 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:24 crc kubenswrapper[4823]: I1216 06:56:24.764378 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:24 crc kubenswrapper[4823]: I1216 06:56:24.764450 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:24Z","lastTransitionTime":"2025-12-16T06:56:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:24 crc kubenswrapper[4823]: I1216 06:56:24.866590 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:24 crc kubenswrapper[4823]: I1216 06:56:24.866632 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:24 crc kubenswrapper[4823]: I1216 06:56:24.866644 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:24 crc kubenswrapper[4823]: I1216 06:56:24.866660 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:24 crc kubenswrapper[4823]: I1216 06:56:24.866672 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:24Z","lastTransitionTime":"2025-12-16T06:56:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:24 crc kubenswrapper[4823]: I1216 06:56:24.969057 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:24 crc kubenswrapper[4823]: I1216 06:56:24.969129 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:24 crc kubenswrapper[4823]: I1216 06:56:24.969152 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:24 crc kubenswrapper[4823]: I1216 06:56:24.969182 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:24 crc kubenswrapper[4823]: I1216 06:56:24.969204 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:24Z","lastTransitionTime":"2025-12-16T06:56:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:25 crc kubenswrapper[4823]: I1216 06:56:25.071951 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:25 crc kubenswrapper[4823]: I1216 06:56:25.071988 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:25 crc kubenswrapper[4823]: I1216 06:56:25.071996 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:25 crc kubenswrapper[4823]: I1216 06:56:25.072011 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:25 crc kubenswrapper[4823]: I1216 06:56:25.072037 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:25Z","lastTransitionTime":"2025-12-16T06:56:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:25 crc kubenswrapper[4823]: I1216 06:56:25.174844 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:25 crc kubenswrapper[4823]: I1216 06:56:25.174892 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:25 crc kubenswrapper[4823]: I1216 06:56:25.174904 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:25 crc kubenswrapper[4823]: I1216 06:56:25.174926 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:25 crc kubenswrapper[4823]: I1216 06:56:25.174939 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:25Z","lastTransitionTime":"2025-12-16T06:56:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:25 crc kubenswrapper[4823]: I1216 06:56:25.277504 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:25 crc kubenswrapper[4823]: I1216 06:56:25.277542 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:25 crc kubenswrapper[4823]: I1216 06:56:25.277553 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:25 crc kubenswrapper[4823]: I1216 06:56:25.277568 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:25 crc kubenswrapper[4823]: I1216 06:56:25.277580 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:25Z","lastTransitionTime":"2025-12-16T06:56:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:25 crc kubenswrapper[4823]: I1216 06:56:25.380352 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:25 crc kubenswrapper[4823]: I1216 06:56:25.380413 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:25 crc kubenswrapper[4823]: I1216 06:56:25.380433 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:25 crc kubenswrapper[4823]: I1216 06:56:25.380458 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:25 crc kubenswrapper[4823]: I1216 06:56:25.380479 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:25Z","lastTransitionTime":"2025-12-16T06:56:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:25 crc kubenswrapper[4823]: I1216 06:56:25.484045 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:25 crc kubenswrapper[4823]: I1216 06:56:25.484096 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:25 crc kubenswrapper[4823]: I1216 06:56:25.484108 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:25 crc kubenswrapper[4823]: I1216 06:56:25.484127 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:25 crc kubenswrapper[4823]: I1216 06:56:25.484141 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:25Z","lastTransitionTime":"2025-12-16T06:56:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:25 crc kubenswrapper[4823]: I1216 06:56:25.561802 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1e7dd738-a9b3-455c-93e0-3f0dc7327817-metrics-certs\") pod \"network-metrics-daemon-8mn7l\" (UID: \"1e7dd738-a9b3-455c-93e0-3f0dc7327817\") " pod="openshift-multus/network-metrics-daemon-8mn7l" Dec 16 06:56:25 crc kubenswrapper[4823]: E1216 06:56:25.562008 4823 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 16 06:56:25 crc kubenswrapper[4823]: E1216 06:56:25.562148 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1e7dd738-a9b3-455c-93e0-3f0dc7327817-metrics-certs podName:1e7dd738-a9b3-455c-93e0-3f0dc7327817 nodeName:}" failed. No retries permitted until 2025-12-16 06:56:57.562122371 +0000 UTC m=+96.050688524 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1e7dd738-a9b3-455c-93e0-3f0dc7327817-metrics-certs") pod "network-metrics-daemon-8mn7l" (UID: "1e7dd738-a9b3-455c-93e0-3f0dc7327817") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 16 06:56:25 crc kubenswrapper[4823]: I1216 06:56:25.586372 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:25 crc kubenswrapper[4823]: I1216 06:56:25.586408 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:25 crc kubenswrapper[4823]: I1216 06:56:25.586416 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:25 crc kubenswrapper[4823]: I1216 06:56:25.586429 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:25 crc kubenswrapper[4823]: I1216 06:56:25.586441 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:25Z","lastTransitionTime":"2025-12-16T06:56:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:25 crc kubenswrapper[4823]: I1216 06:56:25.689332 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:25 crc kubenswrapper[4823]: I1216 06:56:25.689393 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:25 crc kubenswrapper[4823]: I1216 06:56:25.689404 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:25 crc kubenswrapper[4823]: I1216 06:56:25.689423 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:25 crc kubenswrapper[4823]: I1216 06:56:25.689436 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:25Z","lastTransitionTime":"2025-12-16T06:56:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:56:25 crc kubenswrapper[4823]: I1216 06:56:25.771167 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:56:25 crc kubenswrapper[4823]: I1216 06:56:25.771323 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:56:25 crc kubenswrapper[4823]: E1216 06:56:25.771513 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 06:56:25 crc kubenswrapper[4823]: I1216 06:56:25.771803 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8mn7l" Dec 16 06:56:25 crc kubenswrapper[4823]: I1216 06:56:25.771859 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:56:25 crc kubenswrapper[4823]: E1216 06:56:25.771945 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8mn7l" podUID="1e7dd738-a9b3-455c-93e0-3f0dc7327817" Dec 16 06:56:25 crc kubenswrapper[4823]: E1216 06:56:25.772548 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 06:56:25 crc kubenswrapper[4823]: E1216 06:56:25.772642 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 06:56:25 crc kubenswrapper[4823]: I1216 06:56:25.792528 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:25 crc kubenswrapper[4823]: I1216 06:56:25.792591 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:25 crc kubenswrapper[4823]: I1216 06:56:25.792605 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:25 crc kubenswrapper[4823]: I1216 06:56:25.792624 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:25 crc kubenswrapper[4823]: I1216 06:56:25.792992 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:25Z","lastTransitionTime":"2025-12-16T06:56:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:25 crc kubenswrapper[4823]: I1216 06:56:25.895722 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:25 crc kubenswrapper[4823]: I1216 06:56:25.895767 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:25 crc kubenswrapper[4823]: I1216 06:56:25.895777 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:25 crc kubenswrapper[4823]: I1216 06:56:25.895795 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:25 crc kubenswrapper[4823]: I1216 06:56:25.895806 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:25Z","lastTransitionTime":"2025-12-16T06:56:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:25 crc kubenswrapper[4823]: I1216 06:56:25.998657 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:25 crc kubenswrapper[4823]: I1216 06:56:25.998708 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:25 crc kubenswrapper[4823]: I1216 06:56:25.998719 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:25 crc kubenswrapper[4823]: I1216 06:56:25.998734 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:25 crc kubenswrapper[4823]: I1216 06:56:25.998744 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:25Z","lastTransitionTime":"2025-12-16T06:56:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:26 crc kubenswrapper[4823]: I1216 06:56:26.102284 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:26 crc kubenswrapper[4823]: I1216 06:56:26.102321 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:26 crc kubenswrapper[4823]: I1216 06:56:26.102331 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:26 crc kubenswrapper[4823]: I1216 06:56:26.102347 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:26 crc kubenswrapper[4823]: I1216 06:56:26.102358 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:26Z","lastTransitionTime":"2025-12-16T06:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:26 crc kubenswrapper[4823]: I1216 06:56:26.205375 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:26 crc kubenswrapper[4823]: I1216 06:56:26.205413 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:26 crc kubenswrapper[4823]: I1216 06:56:26.205424 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:26 crc kubenswrapper[4823]: I1216 06:56:26.205450 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:26 crc kubenswrapper[4823]: I1216 06:56:26.205466 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:26Z","lastTransitionTime":"2025-12-16T06:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:26 crc kubenswrapper[4823]: I1216 06:56:26.308382 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:26 crc kubenswrapper[4823]: I1216 06:56:26.308411 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:26 crc kubenswrapper[4823]: I1216 06:56:26.308419 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:26 crc kubenswrapper[4823]: I1216 06:56:26.308433 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:26 crc kubenswrapper[4823]: I1216 06:56:26.308444 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:26Z","lastTransitionTime":"2025-12-16T06:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:26 crc kubenswrapper[4823]: I1216 06:56:26.411308 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:26 crc kubenswrapper[4823]: I1216 06:56:26.411406 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:26 crc kubenswrapper[4823]: I1216 06:56:26.411433 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:26 crc kubenswrapper[4823]: I1216 06:56:26.411475 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:26 crc kubenswrapper[4823]: I1216 06:56:26.411503 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:26Z","lastTransitionTime":"2025-12-16T06:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:26 crc kubenswrapper[4823]: I1216 06:56:26.514341 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:26 crc kubenswrapper[4823]: I1216 06:56:26.514385 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:26 crc kubenswrapper[4823]: I1216 06:56:26.514398 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:26 crc kubenswrapper[4823]: I1216 06:56:26.514416 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:26 crc kubenswrapper[4823]: I1216 06:56:26.514430 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:26Z","lastTransitionTime":"2025-12-16T06:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:26 crc kubenswrapper[4823]: I1216 06:56:26.617138 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:26 crc kubenswrapper[4823]: I1216 06:56:26.617189 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:26 crc kubenswrapper[4823]: I1216 06:56:26.617201 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:26 crc kubenswrapper[4823]: I1216 06:56:26.617222 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:26 crc kubenswrapper[4823]: I1216 06:56:26.617235 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:26Z","lastTransitionTime":"2025-12-16T06:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:26 crc kubenswrapper[4823]: I1216 06:56:26.719440 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:26 crc kubenswrapper[4823]: I1216 06:56:26.719488 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:26 crc kubenswrapper[4823]: I1216 06:56:26.719500 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:26 crc kubenswrapper[4823]: I1216 06:56:26.719519 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:26 crc kubenswrapper[4823]: I1216 06:56:26.719533 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:26Z","lastTransitionTime":"2025-12-16T06:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:26 crc kubenswrapper[4823]: I1216 06:56:26.822681 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:26 crc kubenswrapper[4823]: I1216 06:56:26.823203 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:26 crc kubenswrapper[4823]: I1216 06:56:26.823221 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:26 crc kubenswrapper[4823]: I1216 06:56:26.823246 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:26 crc kubenswrapper[4823]: I1216 06:56:26.823259 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:26Z","lastTransitionTime":"2025-12-16T06:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:26 crc kubenswrapper[4823]: I1216 06:56:26.925158 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:26 crc kubenswrapper[4823]: I1216 06:56:26.925203 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:26 crc kubenswrapper[4823]: I1216 06:56:26.925214 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:26 crc kubenswrapper[4823]: I1216 06:56:26.925231 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:26 crc kubenswrapper[4823]: I1216 06:56:26.925242 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:26Z","lastTransitionTime":"2025-12-16T06:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:27 crc kubenswrapper[4823]: I1216 06:56:27.028486 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:27 crc kubenswrapper[4823]: I1216 06:56:27.028539 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:27 crc kubenswrapper[4823]: I1216 06:56:27.028554 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:27 crc kubenswrapper[4823]: I1216 06:56:27.028573 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:27 crc kubenswrapper[4823]: I1216 06:56:27.028587 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:27Z","lastTransitionTime":"2025-12-16T06:56:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:27 crc kubenswrapper[4823]: I1216 06:56:27.131012 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:27 crc kubenswrapper[4823]: I1216 06:56:27.131080 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:27 crc kubenswrapper[4823]: I1216 06:56:27.131089 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:27 crc kubenswrapper[4823]: I1216 06:56:27.131108 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:27 crc kubenswrapper[4823]: I1216 06:56:27.131118 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:27Z","lastTransitionTime":"2025-12-16T06:56:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:27 crc kubenswrapper[4823]: I1216 06:56:27.233087 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:27 crc kubenswrapper[4823]: I1216 06:56:27.233126 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:27 crc kubenswrapper[4823]: I1216 06:56:27.233137 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:27 crc kubenswrapper[4823]: I1216 06:56:27.233151 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:27 crc kubenswrapper[4823]: I1216 06:56:27.233160 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:27Z","lastTransitionTime":"2025-12-16T06:56:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:27 crc kubenswrapper[4823]: I1216 06:56:27.335250 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:27 crc kubenswrapper[4823]: I1216 06:56:27.335300 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:27 crc kubenswrapper[4823]: I1216 06:56:27.335312 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:27 crc kubenswrapper[4823]: I1216 06:56:27.335330 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:27 crc kubenswrapper[4823]: I1216 06:56:27.335341 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:27Z","lastTransitionTime":"2025-12-16T06:56:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:27 crc kubenswrapper[4823]: I1216 06:56:27.437376 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:27 crc kubenswrapper[4823]: I1216 06:56:27.437426 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:27 crc kubenswrapper[4823]: I1216 06:56:27.437439 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:27 crc kubenswrapper[4823]: I1216 06:56:27.437457 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:27 crc kubenswrapper[4823]: I1216 06:56:27.437470 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:27Z","lastTransitionTime":"2025-12-16T06:56:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:27 crc kubenswrapper[4823]: I1216 06:56:27.540278 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:27 crc kubenswrapper[4823]: I1216 06:56:27.540329 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:27 crc kubenswrapper[4823]: I1216 06:56:27.540338 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:27 crc kubenswrapper[4823]: I1216 06:56:27.540354 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:27 crc kubenswrapper[4823]: I1216 06:56:27.540363 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:27Z","lastTransitionTime":"2025-12-16T06:56:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:27 crc kubenswrapper[4823]: I1216 06:56:27.642946 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:27 crc kubenswrapper[4823]: I1216 06:56:27.643007 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:27 crc kubenswrapper[4823]: I1216 06:56:27.643041 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:27 crc kubenswrapper[4823]: I1216 06:56:27.643063 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:27 crc kubenswrapper[4823]: I1216 06:56:27.643078 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:27Z","lastTransitionTime":"2025-12-16T06:56:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:27 crc kubenswrapper[4823]: I1216 06:56:27.745725 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:27 crc kubenswrapper[4823]: I1216 06:56:27.745763 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:27 crc kubenswrapper[4823]: I1216 06:56:27.745772 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:27 crc kubenswrapper[4823]: I1216 06:56:27.745786 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:27 crc kubenswrapper[4823]: I1216 06:56:27.745795 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:27Z","lastTransitionTime":"2025-12-16T06:56:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:56:27 crc kubenswrapper[4823]: I1216 06:56:27.771463 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:56:27 crc kubenswrapper[4823]: I1216 06:56:27.771486 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8mn7l" Dec 16 06:56:27 crc kubenswrapper[4823]: I1216 06:56:27.771512 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:56:27 crc kubenswrapper[4823]: I1216 06:56:27.771490 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:56:27 crc kubenswrapper[4823]: E1216 06:56:27.771631 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 06:56:27 crc kubenswrapper[4823]: E1216 06:56:27.771737 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8mn7l" podUID="1e7dd738-a9b3-455c-93e0-3f0dc7327817" Dec 16 06:56:27 crc kubenswrapper[4823]: E1216 06:56:27.771881 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 06:56:27 crc kubenswrapper[4823]: E1216 06:56:27.772014 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 06:56:27 crc kubenswrapper[4823]: I1216 06:56:27.848393 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:27 crc kubenswrapper[4823]: I1216 06:56:27.848442 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:27 crc kubenswrapper[4823]: I1216 06:56:27.848455 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:27 crc kubenswrapper[4823]: I1216 06:56:27.848473 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:27 crc kubenswrapper[4823]: I1216 06:56:27.848486 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:27Z","lastTransitionTime":"2025-12-16T06:56:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:27 crc kubenswrapper[4823]: I1216 06:56:27.951050 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:27 crc kubenswrapper[4823]: I1216 06:56:27.951100 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:27 crc kubenswrapper[4823]: I1216 06:56:27.951111 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:27 crc kubenswrapper[4823]: I1216 06:56:27.951133 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:27 crc kubenswrapper[4823]: I1216 06:56:27.951144 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:27Z","lastTransitionTime":"2025-12-16T06:56:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:28 crc kubenswrapper[4823]: I1216 06:56:28.053447 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:28 crc kubenswrapper[4823]: I1216 06:56:28.053497 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:28 crc kubenswrapper[4823]: I1216 06:56:28.053509 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:28 crc kubenswrapper[4823]: I1216 06:56:28.053533 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:28 crc kubenswrapper[4823]: I1216 06:56:28.053543 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:28Z","lastTransitionTime":"2025-12-16T06:56:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:28 crc kubenswrapper[4823]: I1216 06:56:28.156541 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:28 crc kubenswrapper[4823]: I1216 06:56:28.156581 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:28 crc kubenswrapper[4823]: I1216 06:56:28.156596 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:28 crc kubenswrapper[4823]: I1216 06:56:28.156622 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:28 crc kubenswrapper[4823]: I1216 06:56:28.156639 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:28Z","lastTransitionTime":"2025-12-16T06:56:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:28 crc kubenswrapper[4823]: I1216 06:56:28.258473 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:28 crc kubenswrapper[4823]: I1216 06:56:28.258504 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:28 crc kubenswrapper[4823]: I1216 06:56:28.258513 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:28 crc kubenswrapper[4823]: I1216 06:56:28.258526 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:28 crc kubenswrapper[4823]: I1216 06:56:28.258535 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:28Z","lastTransitionTime":"2025-12-16T06:56:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:28 crc kubenswrapper[4823]: I1216 06:56:28.365605 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:28 crc kubenswrapper[4823]: I1216 06:56:28.365653 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:28 crc kubenswrapper[4823]: I1216 06:56:28.365665 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:28 crc kubenswrapper[4823]: I1216 06:56:28.365684 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:28 crc kubenswrapper[4823]: I1216 06:56:28.365696 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:28Z","lastTransitionTime":"2025-12-16T06:56:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:28 crc kubenswrapper[4823]: I1216 06:56:28.468634 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:28 crc kubenswrapper[4823]: I1216 06:56:28.468669 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:28 crc kubenswrapper[4823]: I1216 06:56:28.468676 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:28 crc kubenswrapper[4823]: I1216 06:56:28.468689 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:28 crc kubenswrapper[4823]: I1216 06:56:28.468698 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:28Z","lastTransitionTime":"2025-12-16T06:56:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:28 crc kubenswrapper[4823]: I1216 06:56:28.571625 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:28 crc kubenswrapper[4823]: I1216 06:56:28.571680 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:28 crc kubenswrapper[4823]: I1216 06:56:28.571691 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:28 crc kubenswrapper[4823]: I1216 06:56:28.571711 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:28 crc kubenswrapper[4823]: I1216 06:56:28.571724 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:28Z","lastTransitionTime":"2025-12-16T06:56:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:28 crc kubenswrapper[4823]: I1216 06:56:28.674552 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:28 crc kubenswrapper[4823]: I1216 06:56:28.674600 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:28 crc kubenswrapper[4823]: I1216 06:56:28.674611 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:28 crc kubenswrapper[4823]: I1216 06:56:28.674628 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:28 crc kubenswrapper[4823]: I1216 06:56:28.674639 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:28Z","lastTransitionTime":"2025-12-16T06:56:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:28 crc kubenswrapper[4823]: I1216 06:56:28.777355 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:28 crc kubenswrapper[4823]: I1216 06:56:28.777392 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:28 crc kubenswrapper[4823]: I1216 06:56:28.777402 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:28 crc kubenswrapper[4823]: I1216 06:56:28.777415 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:28 crc kubenswrapper[4823]: I1216 06:56:28.777424 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:28Z","lastTransitionTime":"2025-12-16T06:56:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:28 crc kubenswrapper[4823]: I1216 06:56:28.879661 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:28 crc kubenswrapper[4823]: I1216 06:56:28.879700 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:28 crc kubenswrapper[4823]: I1216 06:56:28.879709 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:28 crc kubenswrapper[4823]: I1216 06:56:28.879725 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:28 crc kubenswrapper[4823]: I1216 06:56:28.879735 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:28Z","lastTransitionTime":"2025-12-16T06:56:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:28 crc kubenswrapper[4823]: I1216 06:56:28.982255 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:28 crc kubenswrapper[4823]: I1216 06:56:28.982299 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:28 crc kubenswrapper[4823]: I1216 06:56:28.982310 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:28 crc kubenswrapper[4823]: I1216 06:56:28.982328 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:28 crc kubenswrapper[4823]: I1216 06:56:28.982341 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:28Z","lastTransitionTime":"2025-12-16T06:56:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:29 crc kubenswrapper[4823]: I1216 06:56:29.084504 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:29 crc kubenswrapper[4823]: I1216 06:56:29.084546 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:29 crc kubenswrapper[4823]: I1216 06:56:29.084556 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:29 crc kubenswrapper[4823]: I1216 06:56:29.084572 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:29 crc kubenswrapper[4823]: I1216 06:56:29.084585 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:29Z","lastTransitionTime":"2025-12-16T06:56:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:29 crc kubenswrapper[4823]: I1216 06:56:29.186855 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:29 crc kubenswrapper[4823]: I1216 06:56:29.186903 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:29 crc kubenswrapper[4823]: I1216 06:56:29.186913 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:29 crc kubenswrapper[4823]: I1216 06:56:29.186933 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:29 crc kubenswrapper[4823]: I1216 06:56:29.186945 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:29Z","lastTransitionTime":"2025-12-16T06:56:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:29 crc kubenswrapper[4823]: I1216 06:56:29.220787 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-n248g_1b377757-dbc6-4d9c-9656-3ff65d7d113a/kube-multus/0.log" Dec 16 06:56:29 crc kubenswrapper[4823]: I1216 06:56:29.220846 4823 generic.go:334] "Generic (PLEG): container finished" podID="1b377757-dbc6-4d9c-9656-3ff65d7d113a" containerID="78e6c48a7009b10b70f11bfc96f18839781aeccd794f9ce7a4074271cfb9becc" exitCode=1 Dec 16 06:56:29 crc kubenswrapper[4823]: I1216 06:56:29.220883 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-n248g" event={"ID":"1b377757-dbc6-4d9c-9656-3ff65d7d113a","Type":"ContainerDied","Data":"78e6c48a7009b10b70f11bfc96f18839781aeccd794f9ce7a4074271cfb9becc"} Dec 16 06:56:29 crc kubenswrapper[4823]: I1216 06:56:29.221340 4823 scope.go:117] "RemoveContainer" containerID="78e6c48a7009b10b70f11bfc96f18839781aeccd794f9ce7a4074271cfb9becc" Dec 16 06:56:29 crc kubenswrapper[4823]: I1216 06:56:29.240856 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3de3a2ad-dc6c-47ba-af8d-4f128e025aad\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c40d3d73f70c8983adc8d076d89864e7224528dae97252396a1c34bd4cb804e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50bb40e9d22674554f2a267df7c9f924a29131ac986877bf19572cc5992ba396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9c22b0787554b4bdf0b0068fc696b8515e2ee63affef23e5eb64f77bc32a624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a32894b8f22c5335ed585c26fcab727324803d768943f40455a592025cbfd0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1560fb0ba0c40a2797688b518c22c9164a8cbe9a265cb2ad95408ba86b0fb537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3d6894d867564eea62c7d78dd6ca62a9913787ba06b786722e478785ba796ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3d6894d867564eea62c7d78dd6ca62a9913787ba06b786722e478785ba796ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-16T06:55:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e551b627e90b56385fe5618d3e66ea7376a2d574589583c005373aa29db1d410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e551b627e90b56385fe5618d3e66ea7376a2d574589583c005373aa29db1d410\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://631458d415d64c3cd4dfd24771644393aa6c01288de8f7a5395cf8888db67d2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://631458d415d64c3cd4dfd24771644393aa6c01288de8f7a5395cf8888db67d2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:29Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:29 crc kubenswrapper[4823]: I1216 06:56:29.255221 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e388fa291257a537c122a1f3f0eb6a60eb74e4bcaaf32c4fe829be2da1bfdae9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:29Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:29 crc kubenswrapper[4823]: I1216 06:56:29.268096 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:29Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:29 crc kubenswrapper[4823]: I1216 06:56:29.280743 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"25dec47c-3043-486c-b371-2be103c214e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2bf5eb4e2f587f7084f731ca681a116313b1014b02cdb391b09fdcdd42600c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmpf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78aed5cdf8d3a29dfffcdabfd18b38c0cf762258
53078e01602198bee4994536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmpf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fv56f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:29Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:29 crc kubenswrapper[4823]: I1216 06:56:29.289780 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:29 crc kubenswrapper[4823]: I1216 06:56:29.289823 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:29 crc kubenswrapper[4823]: I1216 06:56:29.289865 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:29 crc 
kubenswrapper[4823]: I1216 06:56:29.289894 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:29 crc kubenswrapper[4823]: I1216 06:56:29.289906 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:29Z","lastTransitionTime":"2025-12-16T06:56:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:56:29 crc kubenswrapper[4823]: I1216 06:56:29.292679 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v5mgh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42574a3f-0701-4192-b16c-bdb9be6c2888\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3a26eba39f7b07ed59337e21910571235eaa797f43231f4962869caf58a9515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t54x6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e305106870ce1b62af84310a83d4ba0d529312470315b95ae7e41e4f0d378e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t54x6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16
T06:55:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-v5mgh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:29Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:29 crc kubenswrapper[4823]: I1216 06:56:29.305863 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c915dd50-9820-494e-b47a-987257910a57\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c33a30aed9756ef012ff9046309ec07897de3b17707ea39c621534ed863c3adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"starte
d\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://464ab3cf14920ae3f947ce3dccacd7379d50380699008411d0a51bd83fe3e51c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20f7e0e3c07f467643afdcc6f0e333353d2d10b6d0c0d7aff0d347d33a5d925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\
\"containerID\\\":\\\"cri-o://7ddfb7d6dae98465369260c8e9c64a2aa21dbf226e7898f6de0c36b3806bcae8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c9e6ec333ac552dc972c31cce23104cc27bcc2cf11bdc52d89ab36172915a41\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"ed a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1853467785/tls.crt::/tmp/serving-cert-1853467785/tls.key\\\\\\\"\\\\nI1216 06:55:39.731946 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 06:55:39.736431 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 06:55:39.736560 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 06:55:39.736611 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 06:55:39.736722 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 06:55:39.741676 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 06:55:39.741701 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 06:55:39.741707 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 06:55:39.741713 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 06:55:39.741717 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 06:55:39.741721 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 06:55:39.741725 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 06:55:39.741774 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 06:55:39.744309 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1853467785/tls.crt::/tmp/serving-cert-1853467785/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765868124\\\\\\\\\\\\\\\" (2025-12-16 06:55:23 +0000 UTC to 2026-01-15 06:55:24 +0000 UTC (now=2025-12-16 06:55:39.744275203 +0000 UTC))\\\\\\\"\\\\nF1216 06:55:39.744562 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c84e7f6950c8d433823b7d5ae3d8a9ecaa70ab6845ce607b8bb93efa990325a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\
"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84f11d2687959c8525b131966a998af137caa69a480bb31a94e1615a3ccbea5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84f11d2687959c8525b131966a998af137caa69a480bb31a94e1615a3ccbea5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:29Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:29 crc kubenswrapper[4823]: I1216 06:56:29.318535 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4c70365-dff4-4b29-af25-657fd9823db4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59ddc7310ed1600d050e488762436763220f3c2946a4c2020346da4ccef46b1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://275328a704b475b037c82f92055f9ca7f83b9821325603c5e34090140cc486ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7fa96323e345f8891f51500618a7b07d128d3e7256a3534d9dca75ff8ce9edf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d706c1463d2c913afe727db1a87a2841698d91a586c84d6b55f02865e7f5584a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:29Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:29 crc kubenswrapper[4823]: I1216 06:56:29.329917 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8mn7l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e7dd738-a9b3-455c-93e0-3f0dc7327817\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2crtc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2crtc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8mn7l\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:29Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:29 crc kubenswrapper[4823]: I1216 06:56:29.341015 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hr8h5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d31d032-9142-4e26-9f06-e3a5ea73d530\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://502e1707b8f53950b8f10fa6b289c9d235d6596fb6eec7e4d69c3158823f8e2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T
06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sdtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hr8h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:29Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:29 crc kubenswrapper[4823]: I1216 06:56:29.352732 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n248g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b377757-dbc6-4d9c-9656-3ff65d7d113a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:56:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:56:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78e6c48a7009b10b70f11bfc96f18839781aeccd794f9ce7a4074271cfb9becc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78e6c48a7009b10b70f11bfc96f18839781aeccd794f9ce7a4074271cfb9becc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T06:56:28Z\\\",\\\"message\\\":\\\"2025-12-16T06:55:43+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_dcb33422-1d47-454e-9c64-2a3d50e3c0a5\\\\n2025-12-16T06:55:43+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_dcb33422-1d47-454e-9c64-2a3d50e3c0a5 to /host/opt/cni/bin/\\\\n2025-12-16T06:55:43Z [verbose] multus-daemon started\\\\n2025-12-16T06:55:43Z [verbose] Readiness Indicator file check\\\\n2025-12-16T06:56:28Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2skkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n248g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:29Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:29 crc kubenswrapper[4823]: I1216 06:56:29.367631 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-964hc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a95a09e1-8457-4d5e-b1b6-dd6f59a66c6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bb1ede33a8f71c9116cfb9401b1f78a6aca07d580e4abe93370b470d6b20284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"
kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b305ee45caf9aff77bb738777dd9b2ebd2e3e63727b460eae22f8f66d1e6cec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b305ee45caf9aff77bb738777dd9b2ebd2e3e63727b460eae22f8f66d1e6cec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d77b125c0d081148c1892229c933414ac8e1b8e0371f0932512deba1107e94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddf
bb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9d77b125c0d081148c1892229c933414ac8e1b8e0371f0932512deba1107e94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2048aefee6601e678e26821b3ed329aead74791b34654ef5dde9fe261e37b835\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2048aefee6601e678e26821b3ed329aead74791b34654ef5dde9fe261e37b835\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:44Z\\\",\\\"reason\\
\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c574f5f8d3f82a7a21c7f8e49e7927387be9e4029e28ba265d39ea058fb6a60f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c574f5f8d3f82a7a21c7f8e49e7927387be9e4029e28ba265d39ea058fb6a60f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e32bdec2a02013c087ccc13325d44061d85bcf9a352f2dc31ed506f11db6428\\\",\\\"image\\\":\\\"quay.io/o
penshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e32bdec2a02013c087ccc13325d44061d85bcf9a352f2dc31ed506f11db6428\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91427f2f713f7e1c2ba16e87252e2fd0479a31e6f94884ad85fe9f99ab84bedd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91427f2f713f7e1c2ba16e87252e2fd0479a31e6f94884ad85fe9f99ab84bedd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mount
Path\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-964hc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:29Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:29 crc kubenswrapper[4823]: I1216 06:56:29.379272 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:29Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:29 crc kubenswrapper[4823]: I1216 06:56:29.392542 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:29 crc kubenswrapper[4823]: I1216 06:56:29.392578 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:29 crc kubenswrapper[4823]: I1216 06:56:29.392590 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:29 crc 
kubenswrapper[4823]: I1216 06:56:29.392605 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:29 crc kubenswrapper[4823]: I1216 06:56:29.392615 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:29Z","lastTransitionTime":"2025-12-16T06:56:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:56:29 crc kubenswrapper[4823]: I1216 06:56:29.392818 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63ba735c509770e9d43fe8152b2a42dc4123d0d797c7d41d5e3efe39a6cef74d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e09486327bad82b741ca9260e306f84ed9f0fbc7058bb06267042b8d2cd5f818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:29Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:29 crc kubenswrapper[4823]: I1216 
06:56:29.404082 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:29Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:29 crc kubenswrapper[4823]: I1216 06:56:29.416252 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72dfe197d43de2945a2ee22789ab86205c77588c4f4002e2473cef09e7b4b10b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-16T06:56:29Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:29 crc kubenswrapper[4823]: I1216 06:56:29.425798 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bwcng" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ff057ef-c324-4465-8b8d-c7b98c25b23c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea36a181f0874fbab3022e6ce27567b65eb8c01cdb2925b08d9c0782af7e93c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2bj4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bwcng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:29Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:29 crc kubenswrapper[4823]: I1216 06:56:29.435839 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f23abdf3-bae6-4239-b0b0-2cb717be2ffb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:56:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:56:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf31c05ee4aae2a94ca03ae1c3af6a8e748104346b05bc56a75bffd14c4ef993\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59b38847905d672f7a59b8a77a7def857a70111c48d5f7d06180f91a42ae79d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86b4d125316b63df68beb204f3618d0d82c2646f505297c24d07a584732e19f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31d129a77e2eea43c6bd7305f37db4e804cc07e039ce90258fc9d239dbd12aa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://31d129a77e2eea43c6bd7305f37db4e804cc07e039ce90258fc9d239dbd12aa9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:29Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:29 crc kubenswrapper[4823]: I1216 06:56:29.453520 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08e48f89-7095-4ea2-afb5-759591c2b0d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3d0f51f7ac1df7a9ae61306ccbfe2c93b397b6ab9ae0a477ab4eb46248d36f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b768d92db57918a41c53b83940957c0181865808eb2c795231a5f8dbbf2bde82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f2cea674c0936cca4e9e95dfb7cd15bf9ccb3fbf8b711f351f87b632aad65c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://055585c1217ea720b929c9da82e7ea662d6fd5634244d362b3b425faee380d92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1c6cb023dcc6d1b149de201b108e3b1f3e32530dadc822aedede5552efc586b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://304f9e1fee6b648f61b3b334284a954e8107ff70ca71d01b586d137490eeab20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ac6ba9c9aa8e7822590e52644019034f71acb4dbc336efacda606a8be00a05c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ac6ba9c9aa8e7822590e52644019034f71acb4dbc336efacda606a8be00a05c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T06:56:07Z\\\",\\\"message\\\":\\\"06:56:06.827886 6449 obj_retry.go:365] Adding new object: *v1.Pod 
openshift-multus/multus-additional-cni-plugins-964hc\\\\nI1216 06:56:06.827888 6449 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-node-zwjhk in node crc\\\\nF1216 06:56:06.827887 6449 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:06Z is after 2025-08-24T17:21:41Z]\\\\nI1216 06:56:06.827901 6449 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-additional-cni-plugins-964hc in node crc\\\\nI1216 06:56:06.827899 6449 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-zwjhk after 0 fail\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:56:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zwjhk_openshift-ovn-kubernetes(08e48f89-7095-4ea2-afb5-759591c2b0d4)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c223f5207e2445fd4bfd1bfe2c1c1ca50b5211a92b02c21445534fbfa2709e28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e3566766c9c24f5c12481b74ed8a3938b21bdb36c003146515c67170d74a71f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e3566766c9c24f5c1
2481b74ed8a3938b21bdb36c003146515c67170d74a71f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zwjhk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:29Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:29 crc kubenswrapper[4823]: I1216 06:56:29.495337 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:29 crc kubenswrapper[4823]: I1216 06:56:29.495373 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:29 crc kubenswrapper[4823]: I1216 06:56:29.495381 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:29 crc kubenswrapper[4823]: I1216 06:56:29.495396 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:29 crc kubenswrapper[4823]: I1216 06:56:29.495405 4823 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:29Z","lastTransitionTime":"2025-12-16T06:56:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:56:29 crc kubenswrapper[4823]: I1216 06:56:29.597775 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:29 crc kubenswrapper[4823]: I1216 06:56:29.597832 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:29 crc kubenswrapper[4823]: I1216 06:56:29.597840 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:29 crc kubenswrapper[4823]: I1216 06:56:29.597855 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:29 crc kubenswrapper[4823]: I1216 06:56:29.597865 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:29Z","lastTransitionTime":"2025-12-16T06:56:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:29 crc kubenswrapper[4823]: I1216 06:56:29.701939 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:29 crc kubenswrapper[4823]: I1216 06:56:29.701982 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:29 crc kubenswrapper[4823]: I1216 06:56:29.701994 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:29 crc kubenswrapper[4823]: I1216 06:56:29.702011 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:29 crc kubenswrapper[4823]: I1216 06:56:29.702045 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:29Z","lastTransitionTime":"2025-12-16T06:56:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:56:29 crc kubenswrapper[4823]: I1216 06:56:29.774382 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:56:29 crc kubenswrapper[4823]: I1216 06:56:29.774430 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8mn7l" Dec 16 06:56:29 crc kubenswrapper[4823]: I1216 06:56:29.774514 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:56:29 crc kubenswrapper[4823]: I1216 06:56:29.774656 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:56:29 crc kubenswrapper[4823]: E1216 06:56:29.774641 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 06:56:29 crc kubenswrapper[4823]: E1216 06:56:29.774828 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 06:56:29 crc kubenswrapper[4823]: E1216 06:56:29.775206 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 06:56:29 crc kubenswrapper[4823]: E1216 06:56:29.775311 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8mn7l" podUID="1e7dd738-a9b3-455c-93e0-3f0dc7327817" Dec 16 06:56:29 crc kubenswrapper[4823]: I1216 06:56:29.809503 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:29 crc kubenswrapper[4823]: I1216 06:56:29.809541 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:29 crc kubenswrapper[4823]: I1216 06:56:29.809550 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:29 crc kubenswrapper[4823]: I1216 06:56:29.809564 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:29 crc kubenswrapper[4823]: I1216 06:56:29.809574 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:29Z","lastTransitionTime":"2025-12-16T06:56:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:29 crc kubenswrapper[4823]: I1216 06:56:29.911759 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:29 crc kubenswrapper[4823]: I1216 06:56:29.911801 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:29 crc kubenswrapper[4823]: I1216 06:56:29.911814 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:29 crc kubenswrapper[4823]: I1216 06:56:29.911832 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:29 crc kubenswrapper[4823]: I1216 06:56:29.911846 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:29Z","lastTransitionTime":"2025-12-16T06:56:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:30 crc kubenswrapper[4823]: I1216 06:56:30.016253 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:30 crc kubenswrapper[4823]: I1216 06:56:30.016305 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:30 crc kubenswrapper[4823]: I1216 06:56:30.016316 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:30 crc kubenswrapper[4823]: I1216 06:56:30.016473 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:30 crc kubenswrapper[4823]: I1216 06:56:30.016493 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:30Z","lastTransitionTime":"2025-12-16T06:56:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:30 crc kubenswrapper[4823]: I1216 06:56:30.126311 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:30 crc kubenswrapper[4823]: I1216 06:56:30.126353 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:30 crc kubenswrapper[4823]: I1216 06:56:30.126363 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:30 crc kubenswrapper[4823]: I1216 06:56:30.126379 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:30 crc kubenswrapper[4823]: I1216 06:56:30.126390 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:30Z","lastTransitionTime":"2025-12-16T06:56:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:30 crc kubenswrapper[4823]: I1216 06:56:30.227258 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-n248g_1b377757-dbc6-4d9c-9656-3ff65d7d113a/kube-multus/0.log" Dec 16 06:56:30 crc kubenswrapper[4823]: I1216 06:56:30.227331 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-n248g" event={"ID":"1b377757-dbc6-4d9c-9656-3ff65d7d113a","Type":"ContainerStarted","Data":"93cfb9ff0c194231a3f99afaf3fb482684347346a20315de6c6513dc5dde8966"} Dec 16 06:56:30 crc kubenswrapper[4823]: I1216 06:56:30.228605 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:30 crc kubenswrapper[4823]: I1216 06:56:30.228660 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:30 crc kubenswrapper[4823]: I1216 06:56:30.228676 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:30 crc kubenswrapper[4823]: I1216 06:56:30.228696 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:30 crc kubenswrapper[4823]: I1216 06:56:30.228709 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:30Z","lastTransitionTime":"2025-12-16T06:56:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:30 crc kubenswrapper[4823]: I1216 06:56:30.250159 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08e48f89-7095-4ea2-afb5-759591c2b0d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3d0f51f7ac1df7a9ae61306ccbfe2c93b397b6ab9ae0a477ab4eb46248d36f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b768d92db57918a41c53b83940957c0181865808eb2c795231a5f8dbbf2bde82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f2cea674c0936cca4e9e95dfb7cd15bf9ccb3fbf8b711f351f87b632aad65c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://055585c1217ea720b929c9da82e7ea662d6fd5634244d362b3b425faee380d92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:42Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1c6cb023dcc6d1b149de201b108e3b1f3e32530dadc822aedede5552efc586b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://304f9e1fee6b648f61b3b334284a954e8107ff70ca71d01b586d137490eeab20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ac6ba9c9aa8e7822590e52644019034f71acb4dbc336efacda606a8be00a05c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ac6ba9c9aa8e7822590e52644019034f71acb4dbc336efacda606a8be00a05c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T06:56:07Z\\\",\\\"message\\\":\\\"06:56:06.827886 6449 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-964hc\\\\nI1216 06:56:06.827888 6449 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-node-zwjhk in node crc\\\\nF1216 06:56:06.827887 6449 ovnkube.go:137] failed to run ovnkube: [failed to start network 
controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:06Z is after 2025-08-24T17:21:41Z]\\\\nI1216 06:56:06.827901 6449 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-additional-cni-plugins-964hc in node crc\\\\nI1216 06:56:06.827899 6449 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-zwjhk after 0 fail\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:56:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zwjhk_openshift-ovn-kubernetes(08e48f89-7095-4ea2-afb5-759591c2b0d4)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c223f5207e2445fd4bfd1bfe2c1c1ca50b5211a92b02c21445534fbfa2709e28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e3566766c9c24f5c12481b74ed8a3938b21bdb36c003146515c67170d74a71f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e3566766c9c24f5c1
2481b74ed8a3938b21bdb36c003146515c67170d74a71f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zwjhk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:30Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:30 crc kubenswrapper[4823]: I1216 06:56:30.264667 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f23abdf3-bae6-4239-b0b0-2cb717be2ffb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:56:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:56:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf31c05ee4aae2a94ca03ae1c3af6a8e748104346b05bc56a75bffd14c4ef993\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59b38847905d672f7a59b8a77a7def857a70111c48d5f7d06180f91a42ae79d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86b4d125316b63df68beb204f3618d0d82c2646f505297c24d07a584732e19f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31d129a77e2eea43c6bd7305f37db4e804cc07e039ce90258fc9d239dbd12aa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://31d129a77e2eea43c6bd7305f37db4e804cc07e039ce90258fc9d239dbd12aa9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:30Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:30 crc kubenswrapper[4823]: I1216 06:56:30.283985 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e388fa291257a537c122a1f3f0eb6a60eb74e4bcaaf32c4fe829be2da1bfdae9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:30Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:30 crc kubenswrapper[4823]: I1216 06:56:30.296408 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:30Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:30 crc kubenswrapper[4823]: I1216 06:56:30.310737 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"25dec47c-3043-486c-b371-2be103c214e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2bf5eb4e2f587f7084f731ca681a116313b1014b02cdb391b09fdcdd42600c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmpf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78aed5cdf8d3a29dfffcdabfd18b38c0cf762258
53078e01602198bee4994536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmpf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fv56f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:30Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:30 crc kubenswrapper[4823]: I1216 06:56:30.328318 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v5mgh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"42574a3f-0701-4192-b16c-bdb9be6c2888\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3a26eba39f7b07ed59337e21910571235eaa797f43231f4962869caf58a9515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t54x6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e305106870ce1b62af84310a83d4ba0d529
312470315b95ae7e41e4f0d378e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t54x6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-v5mgh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:30Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:30 crc kubenswrapper[4823]: I1216 06:56:30.333828 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:30 crc kubenswrapper[4823]: I1216 06:56:30.333882 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:30 crc kubenswrapper[4823]: I1216 06:56:30.333894 4823 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:30 crc kubenswrapper[4823]: I1216 06:56:30.333913 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:30 crc kubenswrapper[4823]: I1216 06:56:30.333925 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:30Z","lastTransitionTime":"2025-12-16T06:56:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:56:30 crc kubenswrapper[4823]: I1216 06:56:30.352313 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3de3a2ad-dc6c-47ba-af8d-4f128e025aad\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c40d3d73f70c8983adc8d076d89864e7224528dae97252396a1c34bd4cb804e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e
33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50bb40e9d22674554f2a267df7c9f924a29131ac986877bf19572cc5992ba396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9c22b0787554b4bdf0b0068fc696b8515e2ee63affef23e5eb64f77bc32a624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866
be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a32894b8f22c5335ed585c26fcab727324803d768943f40455a592025cbfd0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1560fb0ba0c40a2797688b518c22c9164a8cbe9a265cb2ad95408ba86b0fb537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/stati
c-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3d6894d867564eea62c7d78dd6ca62a9913787ba06b786722e478785ba796ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3d6894d867564eea62c7d78dd6ca62a9913787ba06b786722e478785ba796ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e551b627e90b56385fe5618d3e66ea7376a2d574589583c005373aa29db1d410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e551b627e90b56385fe5618d3e66ea7376a2d574589583c005373aa29db1d410\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://631458d415d64c3cd4dfd24771644393aa6c01288de8
f7a5395cf8888db67d2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://631458d415d64c3cd4dfd24771644393aa6c01288de8f7a5395cf8888db67d2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:30Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:30 crc kubenswrapper[4823]: I1216 06:56:30.369926 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4c70365-dff4-4b29-af25-657fd9823db4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59ddc7310ed1600d050e488762436763220f3c2946a4c2020346da4ccef46b1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://275328a704b475b037c82f92055f9ca7f83b9821325603c5e34090140cc486ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7fa96323e345f8891f51500618a7b07d128d3e7256a3534d9dca75ff8ce9edf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d706c1463d2c913afe727db1a87a2841698d91a586c84d6b55f02865e7f5584a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:30Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:30 crc kubenswrapper[4823]: I1216 06:56:30.383584 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8mn7l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e7dd738-a9b3-455c-93e0-3f0dc7327817\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2crtc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2crtc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8mn7l\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:30Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:30 crc kubenswrapper[4823]: I1216 06:56:30.399001 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c915dd50-9820-494e-b47a-987257910a57\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c33a30aed9756ef012ff9046309ec07897de3b17707ea39c621534ed863c3adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://464ab3cf14920ae3f947ce3dccacd7379d50380699008411d0a51bd83fe3e51c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20f7e0e3c07f467643afdcc6f0e333353d2d10b6d0c0d7aff0d347d33a5d925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ddfb7d6dae98465369260c8e9c64a2aa21dbf226e7898f6de0c36b3806bcae8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cl
uster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c9e6ec333ac552dc972c31cce23104cc27bcc2cf11bdc52d89ab36172915a41\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"ed a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1853467785/tls.crt::/tmp/serving-cert-1853467785/tls.key\\\\\\\"\\\\nI1216 06:55:39.731946 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 06:55:39.736431 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 06:55:39.736560 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 06:55:39.736611 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 06:55:39.736722 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 06:55:39.741676 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 06:55:39.741701 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 06:55:39.741707 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 06:55:39.741713 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 06:55:39.741717 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 06:55:39.741721 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 06:55:39.741725 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 
06:55:39.741774 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 06:55:39.744309 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1853467785/tls.crt::/tmp/serving-cert-1853467785/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765868124\\\\\\\\\\\\\\\" (2025-12-16 06:55:23 +0000 UTC to 2026-01-15 06:55:24 +0000 UTC (now=2025-12-16 06:55:39.744275203 +0000 UTC))\\\\\\\"\\\\nF1216 06:55:39.744562 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c84e7f6950c8d433823b7d5ae3d8a9ecaa70ab6845ce607b8bb93efa990325a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84f
11d2687959c8525b131966a998af137caa69a480bb31a94e1615a3ccbea5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84f11d2687959c8525b131966a998af137caa69a480bb31a94e1615a3ccbea5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:30Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:30 crc kubenswrapper[4823]: I1216 06:56:30.412722 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:30Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:30 crc kubenswrapper[4823]: I1216 06:56:30.427083 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63ba735c509770e9d43fe8152b2a42dc4123d0d797c7d41d5e3efe39a6cef74d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e09486327bad82b741ca9260e306f84ed9f0fbc7058bb06267042b8d2cd5f818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:30Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:30 crc kubenswrapper[4823]: I1216 06:56:30.436522 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:30 crc kubenswrapper[4823]: I1216 06:56:30.436562 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:30 crc kubenswrapper[4823]: I1216 06:56:30.436574 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:30 crc kubenswrapper[4823]: I1216 06:56:30.436589 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:30 crc kubenswrapper[4823]: I1216 06:56:30.436603 4823 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:30Z","lastTransitionTime":"2025-12-16T06:56:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:56:30 crc kubenswrapper[4823]: I1216 06:56:30.443333 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:30Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:30 crc kubenswrapper[4823]: I1216 06:56:30.459943 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72dfe197d43de2945a2ee22789ab86205c77588c4f4002e2473cef09e7b4b10b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-16T06:56:30Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:30 crc kubenswrapper[4823]: I1216 06:56:30.474896 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hr8h5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d31d032-9142-4e26-9f06-e3a5ea73d530\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://502e1707b8f53950b8f10fa6b289c9d235d6596fb6eec7e4d69c3158823f8e2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-7sdtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hr8h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:30Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:30 crc kubenswrapper[4823]: I1216 06:56:30.491454 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n248g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b377757-dbc6-4d9c-9656-3ff65d7d113a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:56:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:56:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93cfb9ff0c194231a3f99afaf3fb482684347346a20315de6c6513dc5dde896
6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78e6c48a7009b10b70f11bfc96f18839781aeccd794f9ce7a4074271cfb9becc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T06:56:28Z\\\",\\\"message\\\":\\\"2025-12-16T06:55:43+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_dcb33422-1d47-454e-9c64-2a3d50e3c0a5\\\\n2025-12-16T06:55:43+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_dcb33422-1d47-454e-9c64-2a3d50e3c0a5 to /host/opt/cni/bin/\\\\n2025-12-16T06:55:43Z [verbose] multus-daemon started\\\\n2025-12-16T06:55:43Z [verbose] Readiness Indicator file check\\\\n2025-12-16T06:56:28Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2skkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n248g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:30Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:30 crc kubenswrapper[4823]: I1216 06:56:30.507821 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-964hc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a95a09e1-8457-4d5e-b1b6-dd6f59a66c6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bb1ede33a8f71c9116cfb9401b1f78a6aca07d580e4abe93370b470d6b20284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"
imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b305ee45caf9aff77bb738777dd9b2ebd2e3e63727b460eae22f8f66d1e6cec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b305ee45caf9aff77bb738777dd9b2ebd2e3e63727b460eae22f8f66d1e6cec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\
"containerID\\\":\\\"cri-o://d9d77b125c0d081148c1892229c933414ac8e1b8e0371f0932512deba1107e94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9d77b125c0d081148c1892229c933414ac8e1b8e0371f0932512deba1107e94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2048aefee6601e678e26821b3ed329aead74791b34654ef5dde9fe261e37b835\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\
"containerID\\\":\\\"cri-o://2048aefee6601e678e26821b3ed329aead74791b34654ef5dde9fe261e37b835\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c574f5f8d3f82a7a21c7f8e49e7927387be9e4029e28ba265d39ea058fb6a60f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c574f5f8d3f82a7a21c7f8e49e7927387be9e4029e28ba265d39ea058fb6a60f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":tru
e,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e32bdec2a02013c087ccc13325d44061d85bcf9a352f2dc31ed506f11db6428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e32bdec2a02013c087ccc13325d44061d85bcf9a352f2dc31ed506f11db6428\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91427f2f713f7e1c2ba16e87252e2fd0479a31e6f94884ad85fe9f99ab84bedd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91427f2f713f7e1c2ba16e87252e2fd0479a31e6f94884ad85fe9f99ab84bedd\\\",\\\"
exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-964hc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:30Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:30 crc kubenswrapper[4823]: I1216 06:56:30.520109 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bwcng" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ff057ef-c324-4465-8b8d-c7b98c25b23c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea36a181f0874fbab3022e6ce27567b65eb8c01cdb2925b08d9c0782af7e93c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2bj4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bwcng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:30Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:30 crc kubenswrapper[4823]: I1216 06:56:30.538722 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:30 crc kubenswrapper[4823]: I1216 06:56:30.538896 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:30 crc kubenswrapper[4823]: I1216 06:56:30.538979 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:30 crc kubenswrapper[4823]: I1216 06:56:30.539076 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:30 crc kubenswrapper[4823]: I1216 06:56:30.539158 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:30Z","lastTransitionTime":"2025-12-16T06:56:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:30 crc kubenswrapper[4823]: I1216 06:56:30.642174 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:30 crc kubenswrapper[4823]: I1216 06:56:30.642230 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:30 crc kubenswrapper[4823]: I1216 06:56:30.642241 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:30 crc kubenswrapper[4823]: I1216 06:56:30.642257 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:30 crc kubenswrapper[4823]: I1216 06:56:30.642268 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:30Z","lastTransitionTime":"2025-12-16T06:56:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:30 crc kubenswrapper[4823]: I1216 06:56:30.745836 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:30 crc kubenswrapper[4823]: I1216 06:56:30.745916 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:30 crc kubenswrapper[4823]: I1216 06:56:30.745937 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:30 crc kubenswrapper[4823]: I1216 06:56:30.745964 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:30 crc kubenswrapper[4823]: I1216 06:56:30.745989 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:30Z","lastTransitionTime":"2025-12-16T06:56:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:30 crc kubenswrapper[4823]: I1216 06:56:30.848825 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:30 crc kubenswrapper[4823]: I1216 06:56:30.848873 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:30 crc kubenswrapper[4823]: I1216 06:56:30.848882 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:30 crc kubenswrapper[4823]: I1216 06:56:30.848899 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:30 crc kubenswrapper[4823]: I1216 06:56:30.848910 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:30Z","lastTransitionTime":"2025-12-16T06:56:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:30 crc kubenswrapper[4823]: I1216 06:56:30.951207 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:30 crc kubenswrapper[4823]: I1216 06:56:30.951257 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:30 crc kubenswrapper[4823]: I1216 06:56:30.951266 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:30 crc kubenswrapper[4823]: I1216 06:56:30.951285 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:30 crc kubenswrapper[4823]: I1216 06:56:30.951297 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:30Z","lastTransitionTime":"2025-12-16T06:56:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:31 crc kubenswrapper[4823]: I1216 06:56:31.053966 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:31 crc kubenswrapper[4823]: I1216 06:56:31.054012 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:31 crc kubenswrapper[4823]: I1216 06:56:31.054036 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:31 crc kubenswrapper[4823]: I1216 06:56:31.054052 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:31 crc kubenswrapper[4823]: I1216 06:56:31.054063 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:31Z","lastTransitionTime":"2025-12-16T06:56:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:31 crc kubenswrapper[4823]: I1216 06:56:31.156414 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:31 crc kubenswrapper[4823]: I1216 06:56:31.156471 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:31 crc kubenswrapper[4823]: I1216 06:56:31.156482 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:31 crc kubenswrapper[4823]: I1216 06:56:31.156500 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:31 crc kubenswrapper[4823]: I1216 06:56:31.156510 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:31Z","lastTransitionTime":"2025-12-16T06:56:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:31 crc kubenswrapper[4823]: I1216 06:56:31.258734 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:31 crc kubenswrapper[4823]: I1216 06:56:31.258776 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:31 crc kubenswrapper[4823]: I1216 06:56:31.258784 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:31 crc kubenswrapper[4823]: I1216 06:56:31.258803 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:31 crc kubenswrapper[4823]: I1216 06:56:31.258813 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:31Z","lastTransitionTime":"2025-12-16T06:56:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:31 crc kubenswrapper[4823]: I1216 06:56:31.361969 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:31 crc kubenswrapper[4823]: I1216 06:56:31.362039 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:31 crc kubenswrapper[4823]: I1216 06:56:31.362053 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:31 crc kubenswrapper[4823]: I1216 06:56:31.362076 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:31 crc kubenswrapper[4823]: I1216 06:56:31.362090 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:31Z","lastTransitionTime":"2025-12-16T06:56:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:31 crc kubenswrapper[4823]: I1216 06:56:31.465499 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:31 crc kubenswrapper[4823]: I1216 06:56:31.465546 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:31 crc kubenswrapper[4823]: I1216 06:56:31.465556 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:31 crc kubenswrapper[4823]: I1216 06:56:31.465571 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:31 crc kubenswrapper[4823]: I1216 06:56:31.465581 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:31Z","lastTransitionTime":"2025-12-16T06:56:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:31 crc kubenswrapper[4823]: I1216 06:56:31.567806 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:31 crc kubenswrapper[4823]: I1216 06:56:31.567860 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:31 crc kubenswrapper[4823]: I1216 06:56:31.567878 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:31 crc kubenswrapper[4823]: I1216 06:56:31.567898 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:31 crc kubenswrapper[4823]: I1216 06:56:31.567912 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:31Z","lastTransitionTime":"2025-12-16T06:56:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:31 crc kubenswrapper[4823]: I1216 06:56:31.670703 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:31 crc kubenswrapper[4823]: I1216 06:56:31.670743 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:31 crc kubenswrapper[4823]: I1216 06:56:31.670754 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:31 crc kubenswrapper[4823]: I1216 06:56:31.670769 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:31 crc kubenswrapper[4823]: I1216 06:56:31.670779 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:31Z","lastTransitionTime":"2025-12-16T06:56:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:56:31 crc kubenswrapper[4823]: I1216 06:56:31.771306 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:56:31 crc kubenswrapper[4823]: E1216 06:56:31.771477 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 06:56:31 crc kubenswrapper[4823]: I1216 06:56:31.771538 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8mn7l" Dec 16 06:56:31 crc kubenswrapper[4823]: I1216 06:56:31.771553 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:56:31 crc kubenswrapper[4823]: E1216 06:56:31.771813 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 06:56:31 crc kubenswrapper[4823]: E1216 06:56:31.771598 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8mn7l" podUID="1e7dd738-a9b3-455c-93e0-3f0dc7327817" Dec 16 06:56:31 crc kubenswrapper[4823]: I1216 06:56:31.771908 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:56:31 crc kubenswrapper[4823]: E1216 06:56:31.772010 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 06:56:31 crc kubenswrapper[4823]: I1216 06:56:31.772467 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:31 crc kubenswrapper[4823]: I1216 06:56:31.772492 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:31 crc kubenswrapper[4823]: I1216 06:56:31.772500 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:31 crc kubenswrapper[4823]: I1216 06:56:31.772511 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:31 crc kubenswrapper[4823]: I1216 06:56:31.772520 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:31Z","lastTransitionTime":"2025-12-16T06:56:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:31 crc kubenswrapper[4823]: I1216 06:56:31.783317 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f23abdf3-bae6-4239-b0b0-2cb717be2ffb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:56:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:56:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf31c05ee4aae2a94ca03ae1c3af6a8e748104346b05bc56a75bffd14c4ef993\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59b38847905d672f7a59b8a77a7def
857a70111c48d5f7d06180f91a42ae79d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86b4d125316b63df68beb204f3618d0d82c2646f505297c24d07a584732e19f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31d129a77e2eea43c6bd7305f37db4e804cc07e039ce90258fc9d239dbd12aa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31d129a77e2eea43c6bd7305f37db4e804cc07e039ce90258fc9d239dbd12aa9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:31Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:31 crc kubenswrapper[4823]: I1216 06:56:31.799854 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"08e48f89-7095-4ea2-afb5-759591c2b0d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3d0f51f7ac1df7a9ae61306ccbfe2c93b397b6ab9ae0a477ab4eb46248d36f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b768d92db57918a41c53b83940957c0181865808eb2c795231a5f8dbbf2bde82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f2cea674c0936cca4e9e95dfb7cd15bf9ccb3fbf8b711f351f87b632aad65c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://055585c1217ea720b929c9da82e7ea662d6fd5634244d362b3b425faee380d92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:42Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1c6cb023dcc6d1b149de201b108e3b1f3e32530dadc822aedede5552efc586b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://304f9e1fee6b648f61b3b334284a954e8107ff70ca71d01b586d137490eeab20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ac6ba9c9aa8e7822590e52644019034f71acb4dbc336efacda606a8be00a05c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ac6ba9c9aa8e7822590e52644019034f71acb4dbc336efacda606a8be00a05c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T06:56:07Z\\\",\\\"message\\\":\\\"06:56:06.827886 6449 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-964hc\\\\nI1216 06:56:06.827888 6449 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-node-zwjhk in node crc\\\\nF1216 06:56:06.827887 6449 ovnkube.go:137] failed to run ovnkube: [failed to start network 
controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:06Z is after 2025-08-24T17:21:41Z]\\\\nI1216 06:56:06.827901 6449 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-additional-cni-plugins-964hc in node crc\\\\nI1216 06:56:06.827899 6449 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-zwjhk after 0 fail\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:56:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zwjhk_openshift-ovn-kubernetes(08e48f89-7095-4ea2-afb5-759591c2b0d4)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c223f5207e2445fd4bfd1bfe2c1c1ca50b5211a92b02c21445534fbfa2709e28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e3566766c9c24f5c12481b74ed8a3938b21bdb36c003146515c67170d74a71f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e3566766c9c24f5c1
2481b74ed8a3938b21bdb36c003146515c67170d74a71f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zwjhk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:31Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:31 crc kubenswrapper[4823]: I1216 06:56:31.817182 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3de3a2ad-dc6c-47ba-af8d-4f128e025aad\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c40d3d73f70c8983adc8d076d89864e7224528dae97252396a1c34bd4cb804e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50bb40e9d22674554f2a267df7c9f924a29131ac986877bf19572cc5992ba396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9c22b0787554b4bdf0b0068fc696b8515e2ee63affef23e5eb64f77bc32a624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a32894b8f22c5335ed585c26fcab727324803d768943f40455a592025cbfd0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1560fb0ba0c40a2797688b518c22c9164a8cbe9a265cb2ad95408ba86b0fb537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3d6894d867564eea62c7d78dd6ca62a9913787ba06b786722e478785ba796ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3d6894d867564eea62c7d78dd6ca62a9913787ba06b786722e478785ba796ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-16T06:55:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e551b627e90b56385fe5618d3e66ea7376a2d574589583c005373aa29db1d410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e551b627e90b56385fe5618d3e66ea7376a2d574589583c005373aa29db1d410\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://631458d415d64c3cd4dfd24771644393aa6c01288de8f7a5395cf8888db67d2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://631458d415d64c3cd4dfd24771644393aa6c01288de8f7a5395cf8888db67d2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:31Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:31 crc kubenswrapper[4823]: I1216 06:56:31.831451 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e388fa291257a537c122a1f3f0eb6a60eb74e4bcaaf32c4fe829be2da1bfdae9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:31Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:31 crc kubenswrapper[4823]: I1216 06:56:31.847144 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:31Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:31 crc kubenswrapper[4823]: I1216 06:56:31.859010 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"25dec47c-3043-486c-b371-2be103c214e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2bf5eb4e2f587f7084f731ca681a116313b1014b02cdb391b09fdcdd42600c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmpf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78aed5cdf8d3a29dfffcdabfd18b38c0cf762258
53078e01602198bee4994536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmpf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fv56f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:31Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:31 crc kubenswrapper[4823]: I1216 06:56:31.869555 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v5mgh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"42574a3f-0701-4192-b16c-bdb9be6c2888\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3a26eba39f7b07ed59337e21910571235eaa797f43231f4962869caf58a9515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t54x6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e305106870ce1b62af84310a83d4ba0d529
312470315b95ae7e41e4f0d378e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t54x6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-v5mgh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:31Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:31 crc kubenswrapper[4823]: I1216 06:56:31.874357 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:31 crc kubenswrapper[4823]: I1216 06:56:31.874416 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:31 crc kubenswrapper[4823]: I1216 06:56:31.874428 4823 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:31 crc kubenswrapper[4823]: I1216 06:56:31.874450 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:31 crc kubenswrapper[4823]: I1216 06:56:31.874466 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:31Z","lastTransitionTime":"2025-12-16T06:56:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:56:31 crc kubenswrapper[4823]: I1216 06:56:31.882368 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c915dd50-9820-494e-b47a-987257910a57\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c33a30aed9756ef012ff9046309ec07897de3b17707ea39c621534ed863c3adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35
825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://464ab3cf14920ae3f947ce3dccacd7379d50380699008411d0a51bd83fe3e51c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20f7e0e3c07f467643afdcc6f0e333353d2d10b6d0c0d7aff0d347d33a5d925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ddfb7d6dae98465369260c8e9c64a2aa21dbf226e7898f6de0c36b3806bcae8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c9e6ec333ac552dc972c31cce23104cc27bcc2cf11bdc52d89ab36172915a41\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"ed a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1853467785/tls.crt::/tmp/serving-cert-1853467785/tls.key\\\\\\\"\\\\nI1216 06:55:39.731946 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 06:55:39.736431 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 06:55:39.736560 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 06:55:39.736611 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 06:55:39.736722 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 06:55:39.741676 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 06:55:39.741701 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 06:55:39.741707 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 
06:55:39.741713 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 06:55:39.741717 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 06:55:39.741721 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 06:55:39.741725 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 06:55:39.741774 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 06:55:39.744309 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1853467785/tls.crt::/tmp/serving-cert-1853467785/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765868124\\\\\\\\\\\\\\\" (2025-12-16 06:55:23 +0000 UTC to 2026-01-15 06:55:24 +0000 UTC (now=2025-12-16 06:55:39.744275203 +0000 UTC))\\\\\\\"\\\\nF1216 06:55:39.744562 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c84e7f6950c8d433823b7d5ae3d8a9ecaa70ab6845ce607b8bb93efa990325a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84f11d2687959c8525b131966a998af137caa69a480bb31a94e1615a3ccbea5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84f11d2687959c8525b131966a998af137caa69a480bb31a94e1615a3ccbea5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:31Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:31 crc kubenswrapper[4823]: I1216 06:56:31.894558 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4c70365-dff4-4b29-af25-657fd9823db4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59ddc7310ed1600d050e488762436763220f3c2946a4c2020346da4ccef46b1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad
6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://275328a704b475b037c82f92055f9ca7f83b9821325603c5e34090140cc486ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7fa96323e345f8891f51500618a7b07d128d3e7256a3534d9dca75ff8ce9edf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,
\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d706c1463d2c913afe727db1a87a2841698d91a586c84d6b55f02865e7f5584a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:31Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:31 crc kubenswrapper[4823]: I1216 06:56:31.906099 4823 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/network-metrics-daemon-8mn7l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e7dd738-a9b3-455c-93e0-3f0dc7327817\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2crtc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2crtc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8mn7l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:31Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:31 crc 
kubenswrapper[4823]: I1216 06:56:31.923365 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-964hc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a95a09e1-8457-4d5e-b1b6-dd6f59a66c6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bb1ede33a8f71c9116cfb9401b1f78a6aca07d580e4abe93370b470d6b20284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.12
6.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b305ee45caf9aff77bb738777dd9b2ebd2e3e63727b460eae22f8f66d1e6cec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b305ee45caf9aff77bb738777dd9b2ebd2e3e63727b460eae22f8f66d1e6cec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d77b125c0d081148c1892229c933414ac8e1b8e0371f0932512deba1107e94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9d77b125c0d081148c1892229c93
3414ac8e1b8e0371f0932512deba1107e94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2048aefee6601e678e26821b3ed329aead74791b34654ef5dde9fe261e37b835\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2048aefee6601e678e26821b3ed329aead74791b34654ef5dde9fe261e37b835\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c574f5f8d3f82a7a21c7f8e49e7927387be9e4029e28ba265d39ea058fb6a60f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c574f5f8d3f82a7a21c7f8e49e7927387be9e4029e28ba265d39ea058fb6a60f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e32bdec2a02013c087ccc13325d44061d85bcf9a352f2dc31ed506f11db6428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\
":{\\\"containerID\\\":\\\"cri-o://5e32bdec2a02013c087ccc13325d44061d85bcf9a352f2dc31ed506f11db6428\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91427f2f713f7e1c2ba16e87252e2fd0479a31e6f94884ad85fe9f99ab84bedd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91427f2f713f7e1c2ba16e87252e2fd0479a31e6f94884ad85fe9f99ab84bedd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-964hc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:31Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:31 crc kubenswrapper[4823]: I1216 06:56:31.937139 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:31Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:31 crc kubenswrapper[4823]: I1216 06:56:31.949982 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63ba735c509770e9d43fe8152b2a42dc4123d0d797c7d41d5e3efe39a6cef74d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e09486327bad82b741ca9260e306f84ed9f0fbc7058bb06267042b8d2cd5f818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:31Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:31 crc kubenswrapper[4823]: I1216 06:56:31.962073 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:31Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:31 crc kubenswrapper[4823]: I1216 06:56:31.974723 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72dfe197d43de2945a2ee22789ab86205c77588c4f4002e2473cef09e7b4b10b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-16T06:56:31Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:31 crc kubenswrapper[4823]: I1216 06:56:31.977590 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:31 crc kubenswrapper[4823]: I1216 06:56:31.977643 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:31 crc kubenswrapper[4823]: I1216 06:56:31.977654 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:31 crc kubenswrapper[4823]: I1216 06:56:31.977671 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:31 crc kubenswrapper[4823]: I1216 06:56:31.977682 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:31Z","lastTransitionTime":"2025-12-16T06:56:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:31 crc kubenswrapper[4823]: I1216 06:56:31.986320 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hr8h5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d31d032-9142-4e26-9f06-e3a5ea73d530\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://502e1707b8f53950b8f10fa6b289c9d235d6596fb6eec7e4d69c3158823f8e2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sdtm\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hr8h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:31Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:32 crc kubenswrapper[4823]: I1216 06:56:32.002853 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n248g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b377757-dbc6-4d9c-9656-3ff65d7d113a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:56:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:56:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93cfb9ff0c194231a3f99afaf3fb482684347346a20315de6c6513dc5dde8966\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78e6c48a7009b10b70f11bfc96f18839781aeccd794f9ce7a4074271cfb9becc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T06:56:28Z\\\",\\\"message\\\":\\\"2025-12-16T06:55:43+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_dcb33422-1d47-454e-9c64-2a3d50e3c0a5\\\\n2025-12-16T06:55:43+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_dcb33422-1d47-454e-9c64-2a3d50e3c0a5 to /host/opt/cni/bin/\\\\n2025-12-16T06:55:43Z [verbose] multus-daemon started\\\\n2025-12-16T06:55:43Z [verbose] Readiness Indicator file check\\\\n2025-12-16T06:56:28Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2skkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n248g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:32Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:32 crc kubenswrapper[4823]: I1216 06:56:32.016470 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bwcng" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ff057ef-c324-4465-8b8d-c7b98c25b23c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea36a181f0874fbab3022e6ce27567b65eb8c01cdb2925b08d9c0782af7e93c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2bj4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bwcng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:32Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:32 crc kubenswrapper[4823]: I1216 06:56:32.080636 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:32 crc kubenswrapper[4823]: I1216 06:56:32.080686 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:32 crc kubenswrapper[4823]: I1216 06:56:32.080698 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:32 crc kubenswrapper[4823]: I1216 06:56:32.080719 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 
06:56:32 crc kubenswrapper[4823]: I1216 06:56:32.080730 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:32Z","lastTransitionTime":"2025-12-16T06:56:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:56:32 crc kubenswrapper[4823]: I1216 06:56:32.184517 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:32 crc kubenswrapper[4823]: I1216 06:56:32.184578 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:32 crc kubenswrapper[4823]: I1216 06:56:32.184590 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:32 crc kubenswrapper[4823]: I1216 06:56:32.184609 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:32 crc kubenswrapper[4823]: I1216 06:56:32.184623 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:32Z","lastTransitionTime":"2025-12-16T06:56:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:32 crc kubenswrapper[4823]: I1216 06:56:32.288202 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:32 crc kubenswrapper[4823]: I1216 06:56:32.288250 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:32 crc kubenswrapper[4823]: I1216 06:56:32.288263 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:32 crc kubenswrapper[4823]: I1216 06:56:32.288292 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:32 crc kubenswrapper[4823]: I1216 06:56:32.288306 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:32Z","lastTransitionTime":"2025-12-16T06:56:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:32 crc kubenswrapper[4823]: I1216 06:56:32.390595 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:32 crc kubenswrapper[4823]: I1216 06:56:32.390673 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:32 crc kubenswrapper[4823]: I1216 06:56:32.390692 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:32 crc kubenswrapper[4823]: I1216 06:56:32.390720 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:32 crc kubenswrapper[4823]: I1216 06:56:32.390737 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:32Z","lastTransitionTime":"2025-12-16T06:56:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:32 crc kubenswrapper[4823]: I1216 06:56:32.493081 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:32 crc kubenswrapper[4823]: I1216 06:56:32.493136 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:32 crc kubenswrapper[4823]: I1216 06:56:32.493148 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:32 crc kubenswrapper[4823]: I1216 06:56:32.493195 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:32 crc kubenswrapper[4823]: I1216 06:56:32.493206 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:32Z","lastTransitionTime":"2025-12-16T06:56:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:32 crc kubenswrapper[4823]: I1216 06:56:32.596002 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:32 crc kubenswrapper[4823]: I1216 06:56:32.596100 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:32 crc kubenswrapper[4823]: I1216 06:56:32.596114 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:32 crc kubenswrapper[4823]: I1216 06:56:32.596140 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:32 crc kubenswrapper[4823]: I1216 06:56:32.596153 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:32Z","lastTransitionTime":"2025-12-16T06:56:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:32 crc kubenswrapper[4823]: I1216 06:56:32.698343 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:32 crc kubenswrapper[4823]: I1216 06:56:32.698392 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:32 crc kubenswrapper[4823]: I1216 06:56:32.698406 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:32 crc kubenswrapper[4823]: I1216 06:56:32.698425 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:32 crc kubenswrapper[4823]: I1216 06:56:32.698436 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:32Z","lastTransitionTime":"2025-12-16T06:56:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:32 crc kubenswrapper[4823]: I1216 06:56:32.772111 4823 scope.go:117] "RemoveContainer" containerID="0ac6ba9c9aa8e7822590e52644019034f71acb4dbc336efacda606a8be00a05c" Dec 16 06:56:32 crc kubenswrapper[4823]: I1216 06:56:32.800918 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:32 crc kubenswrapper[4823]: I1216 06:56:32.801358 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:32 crc kubenswrapper[4823]: I1216 06:56:32.801373 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:32 crc kubenswrapper[4823]: I1216 06:56:32.801393 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:32 crc kubenswrapper[4823]: I1216 06:56:32.801405 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:32Z","lastTransitionTime":"2025-12-16T06:56:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:32 crc kubenswrapper[4823]: I1216 06:56:32.904325 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:32 crc kubenswrapper[4823]: I1216 06:56:32.904374 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:32 crc kubenswrapper[4823]: I1216 06:56:32.904386 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:32 crc kubenswrapper[4823]: I1216 06:56:32.904411 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:32 crc kubenswrapper[4823]: I1216 06:56:32.904426 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:32Z","lastTransitionTime":"2025-12-16T06:56:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:33 crc kubenswrapper[4823]: I1216 06:56:33.007072 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:33 crc kubenswrapper[4823]: I1216 06:56:33.007122 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:33 crc kubenswrapper[4823]: I1216 06:56:33.007135 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:33 crc kubenswrapper[4823]: I1216 06:56:33.007155 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:33 crc kubenswrapper[4823]: I1216 06:56:33.007168 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:33Z","lastTransitionTime":"2025-12-16T06:56:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:33 crc kubenswrapper[4823]: I1216 06:56:33.133619 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:33 crc kubenswrapper[4823]: I1216 06:56:33.133660 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:33 crc kubenswrapper[4823]: I1216 06:56:33.133671 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:33 crc kubenswrapper[4823]: I1216 06:56:33.133690 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:33 crc kubenswrapper[4823]: I1216 06:56:33.133703 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:33Z","lastTransitionTime":"2025-12-16T06:56:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:33 crc kubenswrapper[4823]: I1216 06:56:33.236680 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:33 crc kubenswrapper[4823]: I1216 06:56:33.236726 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:33 crc kubenswrapper[4823]: I1216 06:56:33.236736 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:33 crc kubenswrapper[4823]: I1216 06:56:33.236754 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:33 crc kubenswrapper[4823]: I1216 06:56:33.236767 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:33Z","lastTransitionTime":"2025-12-16T06:56:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:33 crc kubenswrapper[4823]: I1216 06:56:33.242856 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zwjhk_08e48f89-7095-4ea2-afb5-759591c2b0d4/ovnkube-controller/2.log" Dec 16 06:56:33 crc kubenswrapper[4823]: I1216 06:56:33.245473 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" event={"ID":"08e48f89-7095-4ea2-afb5-759591c2b0d4","Type":"ContainerStarted","Data":"0002b85e32dbacab67cb93ff21bf26130cbf98d65e88b13cfc00dc301fd4a3ec"} Dec 16 06:56:33 crc kubenswrapper[4823]: I1216 06:56:33.245993 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" Dec 16 06:56:33 crc kubenswrapper[4823]: I1216 06:56:33.261300 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f23abdf3-bae6-4239-b0b0-2cb717be2ffb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:56:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:56:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf31c05ee4aae2a94ca03ae1c3af6a8e748104346b05bc56a75bffd14c4ef993\\\",\\\"image\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59b38847905d672f7a59b8a77a7def857a70111c48d5f7d06180f91a42ae79d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86b4d125316b63df68beb204f3618d0d82c2646f505297c24d07a584732e19f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"start
ed\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31d129a77e2eea43c6bd7305f37db4e804cc07e039ce90258fc9d239dbd12aa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31d129a77e2eea43c6bd7305f37db4e804cc07e039ce90258fc9d239dbd12aa9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:33Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:33 crc kubenswrapper[4823]: I1216 06:56:33.288936 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"08e48f89-7095-4ea2-afb5-759591c2b0d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3d0f51f7ac1df7a9ae61306ccbfe2c93b397b6ab9ae0a477ab4eb46248d36f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b768d92db57918a41c53b83940957c0181865808eb2c795231a5f8dbbf2bde82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f2cea674c0936cca4e9e95dfb7cd15bf9ccb3fbf8b711f351f87b632aad65c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://055585c1217ea720b929c9da82e7ea662d6fd5634244d362b3b425faee380d92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:42Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1c6cb023dcc6d1b149de201b108e3b1f3e32530dadc822aedede5552efc586b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://304f9e1fee6b648f61b3b334284a954e8107ff70ca71d01b586d137490eeab20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0002b85e32dbacab67cb93ff21bf26130cbf98d65e88b13cfc00dc301fd4a3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ac6ba9c9aa8e7822590e52644019034f71acb4dbc336efacda606a8be00a05c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T06:56:07Z\\\",\\\"message\\\":\\\"06:56:06.827886 6449 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-964hc\\\\nI1216 06:56:06.827888 6449 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-node-zwjhk in node crc\\\\nF1216 06:56:06.827887 6449 ovnkube.go:137] failed to run ovnkube: [failed to start network 
controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:06Z is after 2025-08-24T17:21:41Z]\\\\nI1216 06:56:06.827901 6449 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-additional-cni-plugins-964hc in node crc\\\\nI1216 06:56:06.827899 6449 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-zwjhk after 0 
fail\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:56:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\
\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c223f5207e2445fd4bfd1bfe2c1c1ca50b5211a92b02c21445534fbfa2709e28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e3566766c9c24f5c12481b74ed8a3938b21bdb36c003146515c67170d74a71f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready
\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e3566766c9c24f5c12481b74ed8a3938b21bdb36c003146515c67170d74a71f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zwjhk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:33Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:33 crc kubenswrapper[4823]: I1216 06:56:33.302563 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"25dec47c-3043-486c-b371-2be103c214e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2bf5eb4e2f587f7084f731ca681a116313b1014b02cdb391b09fdcdd42600c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmpf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78aed5cdf8d3a29dfffcdabfd18b38c0cf762258
53078e01602198bee4994536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmpf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fv56f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:33Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:33 crc kubenswrapper[4823]: I1216 06:56:33.314064 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v5mgh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"42574a3f-0701-4192-b16c-bdb9be6c2888\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3a26eba39f7b07ed59337e21910571235eaa797f43231f4962869caf58a9515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t54x6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e305106870ce1b62af84310a83d4ba0d529
312470315b95ae7e41e4f0d378e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t54x6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-v5mgh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:33Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:33 crc kubenswrapper[4823]: I1216 06:56:33.332926 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3de3a2ad-dc6c-47ba-af8d-4f128e025aad\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c40d3d73f70c8983adc8d076d89864e7224528dae97252396a1c34bd4cb804e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50bb40e9d22674554f2a267df7c9f924a29131ac986877bf19572cc5992ba396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9c22b0787554b4bdf0b0068fc696b8515e2ee63affef23e5eb64f77bc32a624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a32894b8f22c5335ed585c26fcab727324803d768943f40455a592025cbfd0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1560fb0ba0c40a2797688b518c22c9164a8cbe9a265cb2ad95408ba86b0fb537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3d6894d867564eea62c7d78dd6ca62a9913787ba06b786722e478785ba796ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3d6894d867564eea62c7d78dd6ca62a9913787ba06b786722e478785ba796ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-16T06:55:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e551b627e90b56385fe5618d3e66ea7376a2d574589583c005373aa29db1d410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e551b627e90b56385fe5618d3e66ea7376a2d574589583c005373aa29db1d410\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://631458d415d64c3cd4dfd24771644393aa6c01288de8f7a5395cf8888db67d2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://631458d415d64c3cd4dfd24771644393aa6c01288de8f7a5395cf8888db67d2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:33Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:33 crc kubenswrapper[4823]: I1216 06:56:33.338676 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:33 crc kubenswrapper[4823]: I1216 06:56:33.338714 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:33 crc kubenswrapper[4823]: I1216 06:56:33.338724 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:33 crc kubenswrapper[4823]: I1216 06:56:33.338739 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:33 crc kubenswrapper[4823]: I1216 06:56:33.338750 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:33Z","lastTransitionTime":"2025-12-16T06:56:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:33 crc kubenswrapper[4823]: I1216 06:56:33.351803 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e388fa291257a537c122a1f3f0eb6a60eb74e4bcaaf32c4fe829be2da1bfdae9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:33Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:33 crc kubenswrapper[4823]: I1216 06:56:33.382498 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:33Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:33 crc kubenswrapper[4823]: I1216 06:56:33.416893 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c915dd50-9820-494e-b47a-987257910a57\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c33a30aed9756ef012ff9046309ec07897de3b17707ea39c621534ed863c3adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://464ab3cf14920ae3f947ce3dccacd7379d50380699008411d0a51bd83fe3e51c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20f7e0e3c07f467643afdcc6f0e333353d2d10b6d0c0d7aff0d347d33a5d925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ddfb7d6dae98465369260c8e9c64a2aa21dbf226e7898f6de0c36b3806bcae8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c9e6ec333ac552dc972c31cce23104cc27bcc2cf11bdc52d89ab36172915a41\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T06:55:39Z\\\"
,\\\"message\\\":\\\"ed a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1853467785/tls.crt::/tmp/serving-cert-1853467785/tls.key\\\\\\\"\\\\nI1216 06:55:39.731946 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 06:55:39.736431 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 06:55:39.736560 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 06:55:39.736611 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 06:55:39.736722 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 06:55:39.741676 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 06:55:39.741701 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 06:55:39.741707 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 06:55:39.741713 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 06:55:39.741717 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 06:55:39.741721 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 06:55:39.741725 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 06:55:39.741774 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 06:55:39.744309 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1853467785/tls.crt::/tmp/serving-cert-1853467785/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] 
issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765868124\\\\\\\\\\\\\\\" (2025-12-16 06:55:23 +0000 UTC to 2026-01-15 06:55:24 +0000 UTC (now=2025-12-16 06:55:39.744275203 +0000 UTC))\\\\\\\"\\\\nF1216 06:55:39.744562 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c84e7f6950c8d433823b7d5ae3d8a9ecaa70ab6845ce607b8bb93efa990325a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84f11d2687959c8525b131966a998af137caa69a480bb31a94e1615a3ccbea5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84f11d2687959c8525b131966a998af137caa69a480bb31a94e1615a3ccbea5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:33Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:33 crc kubenswrapper[4823]: I1216 06:56:33.440724 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:33 crc kubenswrapper[4823]: I1216 06:56:33.440764 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:33 crc kubenswrapper[4823]: I1216 06:56:33.440775 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:33 crc kubenswrapper[4823]: I1216 06:56:33.440790 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:33 crc kubenswrapper[4823]: I1216 06:56:33.440801 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:33Z","lastTransitionTime":"2025-12-16T06:56:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:56:33 crc kubenswrapper[4823]: I1216 06:56:33.466272 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4c70365-dff4-4b29-af25-657fd9823db4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59ddc7310ed1600d050e488762436763220f3c2946a4c2020346da4ccef46b1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPat
h\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://275328a704b475b037c82f92055f9ca7f83b9821325603c5e34090140cc486ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7fa96323e345f8891f51500618a7b07d128d3e7256a3534d9dca75ff8ce9edf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d706c1463d2c913afe727db1a87a2841698d91a586c84d6b55f02865e7f5584a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed0828
7faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:33Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:33 crc kubenswrapper[4823]: I1216 06:56:33.483552 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:33 crc kubenswrapper[4823]: I1216 06:56:33.483596 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:33 crc kubenswrapper[4823]: I1216 06:56:33.483609 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:33 crc kubenswrapper[4823]: I1216 06:56:33.483628 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:33 crc 
kubenswrapper[4823]: I1216 06:56:33.483641 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:33Z","lastTransitionTime":"2025-12-16T06:56:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:56:33 crc kubenswrapper[4823]: I1216 06:56:33.488554 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8mn7l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e7dd738-a9b3-455c-93e0-3f0dc7327817\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2crtc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2crtc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8mn7l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:33Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:33 crc 
kubenswrapper[4823]: E1216 06:56:33.510597 4823 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:56:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:56:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:56:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:56:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:56:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:56:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:56:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:56:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2caa91d7-bd83-4de0-9038-0514886c6d71\\\",\\\"systemUUID\\\":\\\"b35231f6-d02a-487d-8117-57547d768cbe\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:33Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:33 crc kubenswrapper[4823]: I1216 06:56:33.511486 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:33Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:33 crc kubenswrapper[4823]: I1216 06:56:33.519459 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:33 crc kubenswrapper[4823]: I1216 06:56:33.519490 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:33 crc kubenswrapper[4823]: I1216 06:56:33.519499 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:33 crc kubenswrapper[4823]: I1216 06:56:33.519514 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:33 crc kubenswrapper[4823]: I1216 06:56:33.519524 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:33Z","lastTransitionTime":"2025-12-16T06:56:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:56:33 crc kubenswrapper[4823]: I1216 06:56:33.527995 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72dfe197d43de2945a2ee22789ab86205c77588c4f4002e2473cef09e7b4b10b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:33Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:33 crc kubenswrapper[4823]: E1216 06:56:33.535127 4823 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:56:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:56:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:56:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:56:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:56:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:56:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:56:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:56:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2caa91d7-bd83-4de0-9038-0514886c6d71\\\",\\\"systemUUID\\\":\\\"b35231f6-d02a-487d-8117-57547d768cbe\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:33Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:33 crc kubenswrapper[4823]: I1216 06:56:33.538086 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:33 crc kubenswrapper[4823]: I1216 06:56:33.538174 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:33 crc kubenswrapper[4823]: I1216 06:56:33.538187 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:33 crc kubenswrapper[4823]: I1216 06:56:33.538203 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:33 crc kubenswrapper[4823]: I1216 06:56:33.538213 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:33Z","lastTransitionTime":"2025-12-16T06:56:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:33 crc kubenswrapper[4823]: I1216 06:56:33.538266 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hr8h5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d31d032-9142-4e26-9f06-e3a5ea73d530\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://502e1707b8f53950b8f10fa6b289c9d235d6596fb6eec7e4d69c3158823f8e2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sdtm\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hr8h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:33Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:33 crc kubenswrapper[4823]: E1216 06:56:33.547951 4823 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:56:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:56:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:56:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:56:33Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:56:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:56:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:56:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:56:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2caa91d7-bd83-4de0-9038-0514886c6d71\\\",\\\"systemUUID\\\":\\\"b35231f6-d02a-487d-8117-57547d768cbe\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:33Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:33 crc kubenswrapper[4823]: I1216 06:56:33.549646 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n248g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b377757-dbc6-4d9c-9656-3ff65d7d113a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:56:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:56:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93cfb9ff0c194231a3f99afaf3fb482684347346a20315de6c6513dc5dde8966\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78e6c48a7009b10b70f11bfc96f18839781aeccd794f9ce7a4074271cfb9becc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T06:56:28Z\\\",\\\"message\\\":\\\"2025-12-16T06:55:43+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_dcb33422-1d47-454e-9c64-2a3d50e3c0a5\\\\n2025-12-16T06:55:43+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_dcb33422-1d47-454e-9c64-2a3d50e3c0a5 to /host/opt/cni/bin/\\\\n2025-12-16T06:55:43Z [verbose] multus-daemon started\\\\n2025-12-16T06:55:43Z [verbose] 
Readiness Indicator file check\\\\n2025-12-16T06:56:28Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2skkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n248g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:33Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:33 crc kubenswrapper[4823]: I1216 06:56:33.551060 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:33 crc kubenswrapper[4823]: I1216 06:56:33.551088 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:33 crc kubenswrapper[4823]: I1216 06:56:33.551100 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:33 crc kubenswrapper[4823]: I1216 06:56:33.551117 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:33 crc kubenswrapper[4823]: I1216 06:56:33.551129 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:33Z","lastTransitionTime":"2025-12-16T06:56:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:33 crc kubenswrapper[4823]: E1216 06:56:33.561119 4823 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:56:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:56:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:56:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:56:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:56:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:56:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:56:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:56:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2caa91d7-bd83-4de0-9038-0514886c6d71\\\",\\\"systemUUID\\\":\\\"b35231f6-d02a-487d-8117-57547d768cbe\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:33Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:33 crc kubenswrapper[4823]: I1216 06:56:33.563486 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-964hc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a95a09e1-8457-4d5e-b1b6-dd6f59a66c6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bb1ede33a8f71c9116cfb9401b1f78a6aca07d580e4abe93370b470d6b20284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCou
nt\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b305ee45caf9aff77bb738777dd9b2ebd2e3e63727b460eae22f8f66d1e6cec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b305ee45caf9aff77bb738777dd9b2ebd2e3e63727b460eae22f8f66d1e6cec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d77b125c0d081148c1892229c933414ac8e1b8e0371f0932512deba1107e94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageI
D\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9d77b125c0d081148c1892229c933414ac8e1b8e0371f0932512deba1107e94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2048aefee6601e678e26821b3ed329aead74791b34654ef5dde9fe261e37b835\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2048aefee6601e678e26821b3ed329aead74791b34654ef5dde9fe261e37b835\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:44Z\\\"}},\\
\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c574f5f8d3f82a7a21c7f8e49e7927387be9e4029e28ba265d39ea058fb6a60f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c574f5f8d3f82a7a21c7f8e49e7927387be9e4029e28ba265d39ea058fb6a60f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e32bdec2a02013c087ccc13325d44061d85bcf9a352f2dc31ed506f11db6428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742f
d0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e32bdec2a02013c087ccc13325d44061d85bcf9a352f2dc31ed506f11db6428\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91427f2f713f7e1c2ba16e87252e2fd0479a31e6f94884ad85fe9f99ab84bedd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91427f2f713f7e1c2ba16e87252e2fd0479a31e6f94884ad85fe9f99ab84bedd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mou
ntPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-964hc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:33Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:33 crc kubenswrapper[4823]: I1216 06:56:33.563894 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:33 crc kubenswrapper[4823]: I1216 06:56:33.564054 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:33 crc kubenswrapper[4823]: I1216 06:56:33.564068 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:33 crc kubenswrapper[4823]: I1216 06:56:33.564088 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:33 crc kubenswrapper[4823]: I1216 06:56:33.564098 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:33Z","lastTransitionTime":"2025-12-16T06:56:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:33 crc kubenswrapper[4823]: I1216 06:56:33.576810 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:33Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:33 crc kubenswrapper[4823]: E1216 06:56:33.576929 4823 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:56:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:56:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:56:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:56:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:56:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:56:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:56:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:56:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2caa91d7-bd83-4de0-9038-0514886c6d71\\\",\\\"systemUUID\\\":\\\"b35231f6-d02a-487d-8117-57547d768cbe\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:33Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:33 crc kubenswrapper[4823]: E1216 06:56:33.577086 4823 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 16 06:56:33 crc kubenswrapper[4823]: I1216 06:56:33.578707 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:33 crc kubenswrapper[4823]: I1216 06:56:33.578738 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:33 crc kubenswrapper[4823]: I1216 06:56:33.578746 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:33 crc kubenswrapper[4823]: I1216 06:56:33.578760 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:33 crc kubenswrapper[4823]: I1216 06:56:33.578769 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:33Z","lastTransitionTime":"2025-12-16T06:56:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:33 crc kubenswrapper[4823]: I1216 06:56:33.593004 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63ba735c509770e9d43fe8152b2a42dc4123d0d797c7d41d5e3efe39a6cef74d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e09486327bad82b741ca9260e306f84ed9f0fbc7058bb06267042b8d2cd5f818\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:33Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:33 crc kubenswrapper[4823]: I1216 06:56:33.605861 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bwcng" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ff057ef-c324-4465-8b8d-c7b98c25b23c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea36a181f0874fbab3022e6ce27567b65eb8c01cdb2925b08d9c0782af7e93c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2bj4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bwcng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:33Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:33 crc kubenswrapper[4823]: I1216 06:56:33.681242 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:33 crc kubenswrapper[4823]: I1216 06:56:33.681277 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:33 crc kubenswrapper[4823]: I1216 06:56:33.681287 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:33 crc kubenswrapper[4823]: I1216 06:56:33.681304 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:33 crc kubenswrapper[4823]: I1216 06:56:33.681316 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:33Z","lastTransitionTime":"2025-12-16T06:56:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:56:33 crc kubenswrapper[4823]: I1216 06:56:33.771247 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:56:33 crc kubenswrapper[4823]: I1216 06:56:33.771293 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:56:33 crc kubenswrapper[4823]: I1216 06:56:33.771332 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:56:33 crc kubenswrapper[4823]: I1216 06:56:33.771307 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8mn7l" Dec 16 06:56:33 crc kubenswrapper[4823]: E1216 06:56:33.771394 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 06:56:33 crc kubenswrapper[4823]: E1216 06:56:33.771511 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8mn7l" podUID="1e7dd738-a9b3-455c-93e0-3f0dc7327817" Dec 16 06:56:33 crc kubenswrapper[4823]: E1216 06:56:33.771536 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 06:56:33 crc kubenswrapper[4823]: E1216 06:56:33.771585 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 06:56:33 crc kubenswrapper[4823]: I1216 06:56:33.783819 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:33 crc kubenswrapper[4823]: I1216 06:56:33.783855 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:33 crc kubenswrapper[4823]: I1216 06:56:33.783870 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:33 crc kubenswrapper[4823]: I1216 06:56:33.783889 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:33 crc kubenswrapper[4823]: I1216 06:56:33.783902 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:33Z","lastTransitionTime":"2025-12-16T06:56:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:33 crc kubenswrapper[4823]: I1216 06:56:33.886573 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:33 crc kubenswrapper[4823]: I1216 06:56:33.886618 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:33 crc kubenswrapper[4823]: I1216 06:56:33.886630 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:33 crc kubenswrapper[4823]: I1216 06:56:33.886646 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:33 crc kubenswrapper[4823]: I1216 06:56:33.886656 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:33Z","lastTransitionTime":"2025-12-16T06:56:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:33 crc kubenswrapper[4823]: I1216 06:56:33.990397 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:33 crc kubenswrapper[4823]: I1216 06:56:33.990455 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:33 crc kubenswrapper[4823]: I1216 06:56:33.990470 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:33 crc kubenswrapper[4823]: I1216 06:56:33.990490 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:33 crc kubenswrapper[4823]: I1216 06:56:33.990504 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:33Z","lastTransitionTime":"2025-12-16T06:56:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:34 crc kubenswrapper[4823]: I1216 06:56:34.093981 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:34 crc kubenswrapper[4823]: I1216 06:56:34.094080 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:34 crc kubenswrapper[4823]: I1216 06:56:34.094096 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:34 crc kubenswrapper[4823]: I1216 06:56:34.094122 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:34 crc kubenswrapper[4823]: I1216 06:56:34.094138 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:34Z","lastTransitionTime":"2025-12-16T06:56:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:34 crc kubenswrapper[4823]: I1216 06:56:34.197011 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:34 crc kubenswrapper[4823]: I1216 06:56:34.197072 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:34 crc kubenswrapper[4823]: I1216 06:56:34.197081 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:34 crc kubenswrapper[4823]: I1216 06:56:34.197100 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:34 crc kubenswrapper[4823]: I1216 06:56:34.197110 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:34Z","lastTransitionTime":"2025-12-16T06:56:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:34 crc kubenswrapper[4823]: I1216 06:56:34.251541 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zwjhk_08e48f89-7095-4ea2-afb5-759591c2b0d4/ovnkube-controller/3.log" Dec 16 06:56:34 crc kubenswrapper[4823]: I1216 06:56:34.252224 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zwjhk_08e48f89-7095-4ea2-afb5-759591c2b0d4/ovnkube-controller/2.log" Dec 16 06:56:34 crc kubenswrapper[4823]: I1216 06:56:34.255411 4823 generic.go:334] "Generic (PLEG): container finished" podID="08e48f89-7095-4ea2-afb5-759591c2b0d4" containerID="0002b85e32dbacab67cb93ff21bf26130cbf98d65e88b13cfc00dc301fd4a3ec" exitCode=1 Dec 16 06:56:34 crc kubenswrapper[4823]: I1216 06:56:34.255505 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" event={"ID":"08e48f89-7095-4ea2-afb5-759591c2b0d4","Type":"ContainerDied","Data":"0002b85e32dbacab67cb93ff21bf26130cbf98d65e88b13cfc00dc301fd4a3ec"} Dec 16 06:56:34 crc kubenswrapper[4823]: I1216 06:56:34.255629 4823 scope.go:117] "RemoveContainer" containerID="0ac6ba9c9aa8e7822590e52644019034f71acb4dbc336efacda606a8be00a05c" Dec 16 06:56:34 crc kubenswrapper[4823]: I1216 06:56:34.256185 4823 scope.go:117] "RemoveContainer" containerID="0002b85e32dbacab67cb93ff21bf26130cbf98d65e88b13cfc00dc301fd4a3ec" Dec 16 06:56:34 crc kubenswrapper[4823]: E1216 06:56:34.256439 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-zwjhk_openshift-ovn-kubernetes(08e48f89-7095-4ea2-afb5-759591c2b0d4)\"" pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" podUID="08e48f89-7095-4ea2-afb5-759591c2b0d4" Dec 16 06:56:34 crc kubenswrapper[4823]: I1216 06:56:34.277540 4823 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c915dd50-9820-494e-b47a-987257910a57\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c33a30aed9756ef012ff9046309ec07897de3b17707ea39c621534ed863c3adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://464ab3cf14920ae3f947ce3dccacd7379d50380699008411d0a51bd83fe3e51c\\\",\\\"image\\\":\\\"quay.io/crcont/o
penshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20f7e0e3c07f467643afdcc6f0e333353d2d10b6d0c0d7aff0d347d33a5d925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ddfb7d6dae98465369260c8e9c64a2aa21dbf226e7898f6de0c36b3806bcae8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c9e6ec333ac552dc972c31cce23104cc27bcc2cf11
bdc52d89ab36172915a41\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"ed a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1853467785/tls.crt::/tmp/serving-cert-1853467785/tls.key\\\\\\\"\\\\nI1216 06:55:39.731946 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 06:55:39.736431 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 06:55:39.736560 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 06:55:39.736611 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 06:55:39.736722 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 06:55:39.741676 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 06:55:39.741701 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 06:55:39.741707 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 06:55:39.741713 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 06:55:39.741717 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 06:55:39.741721 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 06:55:39.741725 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 06:55:39.741774 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 06:55:39.744309 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1853467785/tls.crt::/tmp/serving-cert-1853467785/tls.key\\\\\\\" 
certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765868124\\\\\\\\\\\\\\\" (2025-12-16 06:55:23 +0000 UTC to 2026-01-15 06:55:24 +0000 UTC (now=2025-12-16 06:55:39.744275203 +0000 UTC))\\\\\\\"\\\\nF1216 06:55:39.744562 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c84e7f6950c8d433823b7d5ae3d8a9ecaa70ab6845ce607b8bb93efa990325a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84f11d2687959c8525b131966a998af137caa69a480bb31a94e1615a3ccbea5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de259
7126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84f11d2687959c8525b131966a998af137caa69a480bb31a94e1615a3ccbea5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:34Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:34 crc kubenswrapper[4823]: I1216 06:56:34.294626 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4c70365-dff4-4b29-af25-657fd9823db4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59ddc7310ed1600d050e488762436763220f3c2946a4c2020346da4ccef46b1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://275328a704b475b037c82f92055f9ca7f83b9821325603c5e34090140cc486ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7fa96323e345f8891f51500618a7b07d128d3e7256a3534d9dca75ff8ce9edf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d706c1463d2c913afe727db1a87a2841698d91a586c84d6b55f02865e7f5584a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:34Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:34 crc kubenswrapper[4823]: I1216 06:56:34.303662 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:34 crc kubenswrapper[4823]: I1216 06:56:34.303711 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:34 crc kubenswrapper[4823]: I1216 06:56:34.303724 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:34 crc kubenswrapper[4823]: I1216 06:56:34.303744 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:34 crc kubenswrapper[4823]: I1216 06:56:34.303758 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:34Z","lastTransitionTime":"2025-12-16T06:56:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:56:34 crc kubenswrapper[4823]: I1216 06:56:34.311214 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8mn7l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e7dd738-a9b3-455c-93e0-3f0dc7327817\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2crtc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2crtc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8mn7l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:34Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:34 crc 
kubenswrapper[4823]: I1216 06:56:34.322970 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72dfe197d43de2945a2ee22789ab86205c77588c4f4002e2473cef09e7b4b10b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:34Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:34 crc kubenswrapper[4823]: I1216 06:56:34.332622 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hr8h5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d31d032-9142-4e26-9f06-e3a5ea73d530\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://502e1707b8f53950b8f10fa6b289c9d235d6596fb6eec7e4d69c3158823f8e2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"202
5-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sdtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hr8h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:34Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:34 crc kubenswrapper[4823]: I1216 06:56:34.343974 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n248g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b377757-dbc6-4d9c-9656-3ff65d7d113a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:56:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:56:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93cfb9ff0c194231a3f99afaf3fb482684347346a20315de6c6513dc5dde8966\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78e6c48a7009b10b70f11bfc96f18839781aeccd794f9ce7a4074271cfb9becc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T06:56:28Z\\\",\\\"message\\\":\\\"2025-12-16T06:55:43+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_dcb33422-1d47-454e-9c64-2a3d50e3c0a5\\\\n2025-12-16T06:55:43+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_dcb33422-1d47-454e-9c64-2a3d50e3c0a5 to /host/opt/cni/bin/\\\\n2025-12-16T06:55:43Z [verbose] multus-daemon started\\\\n2025-12-16T06:55:43Z [verbose] 
Readiness Indicator file check\\\\n2025-12-16T06:56:28Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2skkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n248g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:34Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:34 crc kubenswrapper[4823]: I1216 06:56:34.358764 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-964hc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a95a09e1-8457-4d5e-b1b6-dd6f59a66c6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bb
1ede33a8f71c9116cfb9401b1f78a6aca07d580e4abe93370b470d6b20284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b305ee45caf9aff77bb738777dd9b2ebd2e3e63727b460eae22f8f66d1e6cec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b305ee45caf9aff77bb738777dd9b2ebd2e3e63727b460eae22f8f66d1e6cec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d77b125c0d081148c1892229c933414ac8e1b8e0371f0932512deba1107e94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9d77b125c0d081148c1892229c933414ac8e1b8e0371f0932512deba1107e94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2048aefee6601e678e26821b3ed329aead74791b34654ef5dde9fe261e37b835\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203b
b2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2048aefee6601e678e26821b3ed329aead74791b34654ef5dde9fe261e37b835\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c574f5f8d3f82a7a21c7f8e49e7927387be9e4029e28ba265d39ea058fb6a60f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c574f5f8d3f82a7a21c7f8e49e7927387be9e4029e28ba265d39ea058fb6a60f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-rel
ease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e32bdec2a02013c087ccc13325d44061d85bcf9a352f2dc31ed506f11db6428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e32bdec2a02013c087ccc13325d44061d85bcf9a352f2dc31ed506f11db6428\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91427f2f713f7e1c2ba16e87252e2fd0479a31e6f94884ad85fe9f99ab84bedd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91427f2f713f7e1c2ba16e87252e2fd0479a31e6f94884ad85fe9f99ab84bedd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-964hc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:34Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:34 crc kubenswrapper[4823]: I1216 06:56:34.375353 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:34Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:34 crc kubenswrapper[4823]: I1216 06:56:34.390399 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63ba735c509770e9d43fe8152b2a42dc4123d0d797c7d41d5e3efe39a6cef74d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e09486327bad82b741ca9260e306f84ed9f0fbc7058bb06267042b8d2cd5f818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:34Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:34 crc kubenswrapper[4823]: I1216 06:56:34.402766 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:34Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:34 crc kubenswrapper[4823]: I1216 06:56:34.406338 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:34 crc kubenswrapper[4823]: I1216 06:56:34.406380 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:34 crc 
kubenswrapper[4823]: I1216 06:56:34.406390 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:34 crc kubenswrapper[4823]: I1216 06:56:34.406405 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:34 crc kubenswrapper[4823]: I1216 06:56:34.406414 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:34Z","lastTransitionTime":"2025-12-16T06:56:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:56:34 crc kubenswrapper[4823]: I1216 06:56:34.412338 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bwcng" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ff057ef-c324-4465-8b8d-c7b98c25b23c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea36a181f0874fbab3022e6ce27567b65eb8c01cdb2925b08d9c0782af7e93c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2bj4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bwcng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:34Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:34 crc kubenswrapper[4823]: I1216 06:56:34.423349 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f23abdf3-bae6-4239-b0b0-2cb717be2ffb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:56:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:56:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf31c05ee4aae2a94ca03ae1c3af6a8e748104346b05bc56a75bffd14c4ef993\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de259
7126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59b38847905d672f7a59b8a77a7def857a70111c48d5f7d06180f91a42ae79d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86b4d125316b63df68beb204f3618d0d82c2646f505297c24d07a584732e19f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/
etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31d129a77e2eea43c6bd7305f37db4e804cc07e039ce90258fc9d239dbd12aa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31d129a77e2eea43c6bd7305f37db4e804cc07e039ce90258fc9d239dbd12aa9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:34Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:34 crc kubenswrapper[4823]: I1216 06:56:34.442578 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"08e48f89-7095-4ea2-afb5-759591c2b0d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3d0f51f7ac1df7a9ae61306ccbfe2c93b397b6ab9ae0a477ab4eb46248d36f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b768d92db57918a41c53b83940957c0181865808eb2c795231a5f8dbbf2bde82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f2cea674c0936cca4e9e95dfb7cd15bf9ccb3fbf8b711f351f87b632aad65c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://055585c1217ea720b929c9da82e7ea662d6fd5634244d362b3b425faee380d92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:42Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1c6cb023dcc6d1b149de201b108e3b1f3e32530dadc822aedede5552efc586b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://304f9e1fee6b648f61b3b334284a954e8107ff70ca71d01b586d137490eeab20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0002b85e32dbacab67cb93ff21bf26130cbf98d65e88b13cfc00dc301fd4a3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ac6ba9c9aa8e7822590e52644019034f71acb4dbc336efacda606a8be00a05c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T06:56:07Z\\\",\\\"message\\\":\\\"06:56:06.827886 6449 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-964hc\\\\nI1216 06:56:06.827888 6449 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-node-zwjhk in node crc\\\\nF1216 06:56:06.827887 6449 ovnkube.go:137] failed to run ovnkube: [failed to start network 
controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:06Z is after 2025-08-24T17:21:41Z]\\\\nI1216 06:56:06.827901 6449 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-additional-cni-plugins-964hc in node crc\\\\nI1216 06:56:06.827899 6449 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-zwjhk after 0 fail\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:56:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0002b85e32dbacab67cb93ff21bf26130cbf98d65e88b13cfc00dc301fd4a3ec\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T06:56:34Z\\\",\\\"message\\\":\\\"ces_controller.go:443] Built service openshift-config-operator/metrics LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.4.161\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1216 06:56:33.955106 6853 services_controller.go:454] Service 
openshift-operator-lifecycle-manager/package-server-manager-metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI1216 06:56:33.954736 6853 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/network-metrics-daemon-8mn7l\\\\nI1216 06:56:33.954801 6853 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-n248g in node crc\\\\nI1216 06:56:33.955141 6853 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-n248g after 0 failed attempt(s)\\\\nF1216 06:56:33.955147 6853 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initia\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"
host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c223f5207e2445fd4bfd1bfe2c1c1ca50b5211a92b02c21445534fbfa2709e28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\"
,\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e3566766c9c24f5c12481b74ed8a3938b21bdb36c003146515c67170d74a71f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e3566766c9c24f5c12481b74ed8a3938b21bdb36c003146515c67170d74a71f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zwjhk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:34Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:34 crc kubenswrapper[4823]: I1216 06:56:34.453306 4823 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v5mgh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42574a3f-0701-4192-b16c-bdb9be6c2888\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3a26eba39f7b07ed59337e21910571235eaa797f43231f4962869caf58a9515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t54x6\\\",\\\"readOnly\\\":true,\\\"recursiveReadO
nly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e305106870ce1b62af84310a83d4ba0d529312470315b95ae7e41e4f0d378e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t54x6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-v5mgh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:34Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:34 crc kubenswrapper[4823]: I1216 06:56:34.472518 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3de3a2ad-dc6c-47ba-af8d-4f128e025aad\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c40d3d73f70c8983adc8d076d89864e7224528dae97252396a1c34bd4cb804e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50bb40e9d22674554f2a267df7c9f924a29131ac986877bf19572cc5992ba396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9c22b0787554b4bdf0b0068fc696b8515e2ee63affef23e5eb64f77bc32a624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a32894b8f22c5335ed585c26fcab727324803d768943f40455a592025cbfd0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1560fb0ba0c40a2797688b518c22c9164a8cbe9a265cb2ad95408ba86b0fb537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3d6894d867564eea62c7d78dd6ca62a9913787ba06b786722e478785ba796ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3d6894d867564eea62c7d78dd6ca62a9913787ba06b786722e478785ba796ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-16T06:55:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e551b627e90b56385fe5618d3e66ea7376a2d574589583c005373aa29db1d410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e551b627e90b56385fe5618d3e66ea7376a2d574589583c005373aa29db1d410\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://631458d415d64c3cd4dfd24771644393aa6c01288de8f7a5395cf8888db67d2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://631458d415d64c3cd4dfd24771644393aa6c01288de8f7a5395cf8888db67d2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:34Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:34 crc kubenswrapper[4823]: I1216 06:56:34.487056 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e388fa291257a537c122a1f3f0eb6a60eb74e4bcaaf32c4fe829be2da1bfdae9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:34Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:34 crc kubenswrapper[4823]: I1216 06:56:34.503067 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:34Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:34 crc kubenswrapper[4823]: I1216 06:56:34.509103 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:34 crc kubenswrapper[4823]: I1216 06:56:34.509145 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:34 crc kubenswrapper[4823]: I1216 06:56:34.509157 4823 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:34 crc kubenswrapper[4823]: I1216 06:56:34.509175 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:34 crc kubenswrapper[4823]: I1216 06:56:34.509189 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:34Z","lastTransitionTime":"2025-12-16T06:56:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:56:34 crc kubenswrapper[4823]: I1216 06:56:34.514088 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25dec47c-3043-486c-b371-2be103c214e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2bf5eb4e2f
587f7084f731ca681a116313b1014b02cdb391b09fdcdd42600c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmpf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78aed5cdf8d3a29dfffcdabfd18b38c0cf76225853078e01602198bee4994536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmpf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"s
tartTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fv56f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:34Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:34 crc kubenswrapper[4823]: I1216 06:56:34.611284 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:34 crc kubenswrapper[4823]: I1216 06:56:34.611320 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:34 crc kubenswrapper[4823]: I1216 06:56:34.611331 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:34 crc kubenswrapper[4823]: I1216 06:56:34.611346 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:34 crc kubenswrapper[4823]: I1216 06:56:34.611358 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:34Z","lastTransitionTime":"2025-12-16T06:56:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:34 crc kubenswrapper[4823]: I1216 06:56:34.713615 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:34 crc kubenswrapper[4823]: I1216 06:56:34.713663 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:34 crc kubenswrapper[4823]: I1216 06:56:34.713673 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:34 crc kubenswrapper[4823]: I1216 06:56:34.713722 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:34 crc kubenswrapper[4823]: I1216 06:56:34.713738 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:34Z","lastTransitionTime":"2025-12-16T06:56:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:34 crc kubenswrapper[4823]: I1216 06:56:34.816114 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:34 crc kubenswrapper[4823]: I1216 06:56:34.816169 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:34 crc kubenswrapper[4823]: I1216 06:56:34.816187 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:34 crc kubenswrapper[4823]: I1216 06:56:34.816212 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:34 crc kubenswrapper[4823]: I1216 06:56:34.816231 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:34Z","lastTransitionTime":"2025-12-16T06:56:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:34 crc kubenswrapper[4823]: I1216 06:56:34.918630 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:34 crc kubenswrapper[4823]: I1216 06:56:34.918672 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:34 crc kubenswrapper[4823]: I1216 06:56:34.918683 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:34 crc kubenswrapper[4823]: I1216 06:56:34.918700 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:34 crc kubenswrapper[4823]: I1216 06:56:34.918712 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:34Z","lastTransitionTime":"2025-12-16T06:56:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:35 crc kubenswrapper[4823]: I1216 06:56:35.021006 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:35 crc kubenswrapper[4823]: I1216 06:56:35.021130 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:35 crc kubenswrapper[4823]: I1216 06:56:35.021158 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:35 crc kubenswrapper[4823]: I1216 06:56:35.021183 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:35 crc kubenswrapper[4823]: I1216 06:56:35.021202 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:35Z","lastTransitionTime":"2025-12-16T06:56:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:35 crc kubenswrapper[4823]: I1216 06:56:35.123939 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:35 crc kubenswrapper[4823]: I1216 06:56:35.123989 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:35 crc kubenswrapper[4823]: I1216 06:56:35.124004 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:35 crc kubenswrapper[4823]: I1216 06:56:35.124051 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:35 crc kubenswrapper[4823]: I1216 06:56:35.124064 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:35Z","lastTransitionTime":"2025-12-16T06:56:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:35 crc kubenswrapper[4823]: I1216 06:56:35.227142 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:35 crc kubenswrapper[4823]: I1216 06:56:35.227194 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:35 crc kubenswrapper[4823]: I1216 06:56:35.227211 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:35 crc kubenswrapper[4823]: I1216 06:56:35.227236 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:35 crc kubenswrapper[4823]: I1216 06:56:35.227253 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:35Z","lastTransitionTime":"2025-12-16T06:56:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:35 crc kubenswrapper[4823]: I1216 06:56:35.261262 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zwjhk_08e48f89-7095-4ea2-afb5-759591c2b0d4/ovnkube-controller/3.log" Dec 16 06:56:35 crc kubenswrapper[4823]: I1216 06:56:35.265413 4823 scope.go:117] "RemoveContainer" containerID="0002b85e32dbacab67cb93ff21bf26130cbf98d65e88b13cfc00dc301fd4a3ec" Dec 16 06:56:35 crc kubenswrapper[4823]: E1216 06:56:35.265636 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-zwjhk_openshift-ovn-kubernetes(08e48f89-7095-4ea2-afb5-759591c2b0d4)\"" pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" podUID="08e48f89-7095-4ea2-afb5-759591c2b0d4" Dec 16 06:56:35 crc kubenswrapper[4823]: I1216 06:56:35.279754 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f23abdf3-bae6-4239-b0b0-2cb717be2ffb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:56:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:56:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf31c05ee4aae2a94ca03ae1c3af6a8e748104346b05bc56a75bffd14c4ef993\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59b38847905d672f7a59b8a77a7def857a70111c48d5f7d06180f91a42ae79d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86b4d125316b63df68beb204f3618d0d82c2646f505297c24d07a584732e19f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31d129a77e2eea43c6bd7305f37db4e804cc07e039ce90258fc9d239dbd12aa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://31d129a77e2eea43c6bd7305f37db4e804cc07e039ce90258fc9d239dbd12aa9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:35Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:35 crc kubenswrapper[4823]: I1216 06:56:35.301657 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08e48f89-7095-4ea2-afb5-759591c2b0d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3d0f51f7ac1df7a9ae61306ccbfe2c93b397b6ab9ae0a477ab4eb46248d36f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b768d92db57918a41c53b83940957c0181865808eb2c795231a5f8dbbf2bde82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f2cea674c0936cca4e9e95dfb7cd15bf9ccb3fbf8b711f351f87b632aad65c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://055585c1217ea720b929c9da82e7ea662d6fd5634244d362b3b425faee380d92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1c6cb023dcc6d1b149de201b108e3b1f3e32530dadc822aedede5552efc586b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://304f9e1fee6b648f61b3b334284a954e8107ff70ca71d01b586d137490eeab20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0002b85e32dbacab67cb93ff21bf26130cbf98d65e88b13cfc00dc301fd4a3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0002b85e32dbacab67cb93ff21bf26130cbf98d65e88b13cfc00dc301fd4a3ec\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T06:56:34Z\\\",\\\"message\\\":\\\"ces_controller.go:443] Built service openshift-config-operator/metrics LB cluster-wide 
configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.4.161\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1216 06:56:33.955106 6853 services_controller.go:454] Service openshift-operator-lifecycle-manager/package-server-manager-metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI1216 06:56:33.954736 6853 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/network-metrics-daemon-8mn7l\\\\nI1216 06:56:33.954801 6853 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-n248g in node crc\\\\nI1216 06:56:33.955141 6853 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-n248g after 0 failed attempt(s)\\\\nF1216 06:56:33.955147 6853 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initia\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:56:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zwjhk_openshift-ovn-kubernetes(08e48f89-7095-4ea2-afb5-759591c2b0d4)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c223f5207e2445fd4bfd1bfe2c1c1ca50b5211a92b02c21445534fbfa2709e28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e3566766c9c24f5c12481b74ed8a3938b21bdb36c003146515c67170d74a71f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e3566766c9c24f5c1
2481b74ed8a3938b21bdb36c003146515c67170d74a71f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zwjhk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:35Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:35 crc kubenswrapper[4823]: I1216 06:56:35.314654 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v5mgh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"42574a3f-0701-4192-b16c-bdb9be6c2888\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3a26eba39f7b07ed59337e21910571235eaa797f43231f4962869caf58a9515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t54x6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e305106870ce1b62af84310a83d4ba0d529
312470315b95ae7e41e4f0d378e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t54x6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-v5mgh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:35Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:35 crc kubenswrapper[4823]: I1216 06:56:35.329825 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:35 crc kubenswrapper[4823]: I1216 06:56:35.329863 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:35 crc kubenswrapper[4823]: I1216 06:56:35.329877 4823 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:35 crc kubenswrapper[4823]: I1216 06:56:35.329895 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:35 crc kubenswrapper[4823]: I1216 06:56:35.329909 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:35Z","lastTransitionTime":"2025-12-16T06:56:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:56:35 crc kubenswrapper[4823]: I1216 06:56:35.338813 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3de3a2ad-dc6c-47ba-af8d-4f128e025aad\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c40d3d73f70c8983adc8d076d89864e7224528dae97252396a1c34bd4cb804e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e
33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50bb40e9d22674554f2a267df7c9f924a29131ac986877bf19572cc5992ba396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9c22b0787554b4bdf0b0068fc696b8515e2ee63affef23e5eb64f77bc32a624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866
be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a32894b8f22c5335ed585c26fcab727324803d768943f40455a592025cbfd0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1560fb0ba0c40a2797688b518c22c9164a8cbe9a265cb2ad95408ba86b0fb537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/stati
c-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3d6894d867564eea62c7d78dd6ca62a9913787ba06b786722e478785ba796ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3d6894d867564eea62c7d78dd6ca62a9913787ba06b786722e478785ba796ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e551b627e90b56385fe5618d3e66ea7376a2d574589583c005373aa29db1d410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e551b627e90b56385fe5618d3e66ea7376a2d574589583c005373aa29db1d410\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://631458d415d64c3cd4dfd24771644393aa6c01288de8
f7a5395cf8888db67d2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://631458d415d64c3cd4dfd24771644393aa6c01288de8f7a5395cf8888db67d2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:35Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:35 crc kubenswrapper[4823]: I1216 06:56:35.359347 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e388fa291257a537c122a1f3f0eb6a60eb74e4bcaaf32c4fe829be2da1bfdae9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:35Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:35 crc kubenswrapper[4823]: I1216 06:56:35.372569 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:35Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:35 crc kubenswrapper[4823]: I1216 06:56:35.385957 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"25dec47c-3043-486c-b371-2be103c214e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2bf5eb4e2f587f7084f731ca681a116313b1014b02cdb391b09fdcdd42600c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmpf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78aed5cdf8d3a29dfffcdabfd18b38c0cf762258
53078e01602198bee4994536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmpf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fv56f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:35Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:35 crc kubenswrapper[4823]: I1216 06:56:35.401720 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c915dd50-9820-494e-b47a-987257910a57\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c33a30aed9756ef012ff9046309ec07897de3b17707ea39c621534ed863c3adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://464ab3cf14920ae3f947ce3dccacd7379d50380699008411d0a51bd83fe3e51c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20f7e0e3c07f467643afdcc6f0e333353d2d10b6d0c0d7aff0d347d33a5d925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ddfb7d6dae98465369260c8e9c64a2aa21dbf226e7898f6de0c36b3806bcae8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c9e6ec333ac552dc972c31cce23104cc27bcc2cf11bdc52d89ab36172915a41\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T06:55:39Z\\\"
,\\\"message\\\":\\\"ed a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1853467785/tls.crt::/tmp/serving-cert-1853467785/tls.key\\\\\\\"\\\\nI1216 06:55:39.731946 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 06:55:39.736431 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 06:55:39.736560 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 06:55:39.736611 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 06:55:39.736722 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 06:55:39.741676 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 06:55:39.741701 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 06:55:39.741707 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 06:55:39.741713 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 06:55:39.741717 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 06:55:39.741721 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 06:55:39.741725 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 06:55:39.741774 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 06:55:39.744309 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1853467785/tls.crt::/tmp/serving-cert-1853467785/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] 
issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765868124\\\\\\\\\\\\\\\" (2025-12-16 06:55:23 +0000 UTC to 2026-01-15 06:55:24 +0000 UTC (now=2025-12-16 06:55:39.744275203 +0000 UTC))\\\\\\\"\\\\nF1216 06:55:39.744562 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c84e7f6950c8d433823b7d5ae3d8a9ecaa70ab6845ce607b8bb93efa990325a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84f11d2687959c8525b131966a998af137caa69a480bb31a94e1615a3ccbea5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84f11d2687959c8525b131966a998af137caa69a480bb31a94e1615a3ccbea5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:35Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:35 crc kubenswrapper[4823]: I1216 06:56:35.416692 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4c70365-dff4-4b29-af25-657fd9823db4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59ddc7310ed1600d050e488762436763220f3c2946a4c2020346da4ccef46b1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://275328a704b475b037c82f92055f9ca7f83b9821325603c5e34090140cc486ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7fa96323e345f8891f51500618a7b07d128d3e7256a3534d9dca75ff8ce9edf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d706c1463d2c913afe727db1a87a2841698d91a586c84d6b55f02865e7f5584a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:35Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:35 crc kubenswrapper[4823]: I1216 06:56:35.427640 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8mn7l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e7dd738-a9b3-455c-93e0-3f0dc7327817\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2crtc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2crtc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8mn7l\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:35Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:35 crc kubenswrapper[4823]: I1216 06:56:35.432878 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:35 crc kubenswrapper[4823]: I1216 06:56:35.432910 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:35 crc kubenswrapper[4823]: I1216 06:56:35.432921 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:35 crc kubenswrapper[4823]: I1216 06:56:35.432938 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:35 crc kubenswrapper[4823]: I1216 06:56:35.432950 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:35Z","lastTransitionTime":"2025-12-16T06:56:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:35 crc kubenswrapper[4823]: I1216 06:56:35.438626 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72dfe197d43de2945a2ee22789ab86205c77588c4f4002e2473cef09e7b4b10b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:35Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:35 crc kubenswrapper[4823]: I1216 06:56:35.448318 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hr8h5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d31d032-9142-4e26-9f06-e3a5ea73d530\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://502e1707b8f53950b8f10fa6b289c9d235d6596fb6eec7e4d69c3158823f8e2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sdtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hr8h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:35Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:35 crc kubenswrapper[4823]: I1216 06:56:35.460430 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n248g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b377757-dbc6-4d9c-9656-3ff65d7d113a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:56:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:56:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93cfb9ff0c194231a3f99afaf3fb482684347346a20315de6c6513dc5dde8966\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78e6c48a7009b10b70f11bfc96f18839781aeccd794f9ce7a4074271cfb9becc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T06:56:28Z\\\",\\\"message\\\":\\\"2025-12-16T06:55:43+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_dcb33422-1d47-454e-9c64-2a3d50e3c0a5\\\\n2025-12-16T06:55:43+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_dcb33422-1d47-454e-9c64-2a3d50e3c0a5 to /host/opt/cni/bin/\\\\n2025-12-16T06:55:43Z [verbose] multus-daemon started\\\\n2025-12-16T06:55:43Z [verbose] 
Readiness Indicator file check\\\\n2025-12-16T06:56:28Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2skkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n248g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:35Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:35 crc kubenswrapper[4823]: I1216 06:56:35.476117 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-964hc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a95a09e1-8457-4d5e-b1b6-dd6f59a66c6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bb
1ede33a8f71c9116cfb9401b1f78a6aca07d580e4abe93370b470d6b20284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b305ee45caf9aff77bb738777dd9b2ebd2e3e63727b460eae22f8f66d1e6cec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b305ee45caf9aff77bb738777dd9b2ebd2e3e63727b460eae22f8f66d1e6cec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d77b125c0d081148c1892229c933414ac8e1b8e0371f0932512deba1107e94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9d77b125c0d081148c1892229c933414ac8e1b8e0371f0932512deba1107e94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2048aefee6601e678e26821b3ed329aead74791b34654ef5dde9fe261e37b835\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203b
b2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2048aefee6601e678e26821b3ed329aead74791b34654ef5dde9fe261e37b835\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c574f5f8d3f82a7a21c7f8e49e7927387be9e4029e28ba265d39ea058fb6a60f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c574f5f8d3f82a7a21c7f8e49e7927387be9e4029e28ba265d39ea058fb6a60f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-rel
ease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e32bdec2a02013c087ccc13325d44061d85bcf9a352f2dc31ed506f11db6428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e32bdec2a02013c087ccc13325d44061d85bcf9a352f2dc31ed506f11db6428\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91427f2f713f7e1c2ba16e87252e2fd0479a31e6f94884ad85fe9f99ab84bedd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91427f2f713f7e1c2ba16e87252e2fd0479a31e6f94884ad85fe9f99ab84bedd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-964hc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:35Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:35 crc kubenswrapper[4823]: I1216 06:56:35.488255 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:35Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:35 crc kubenswrapper[4823]: I1216 06:56:35.500245 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63ba735c509770e9d43fe8152b2a42dc4123d0d797c7d41d5e3efe39a6cef74d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e09486327bad82b741ca9260e306f84ed9f0fbc7058bb06267042b8d2cd5f818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:35Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:35 crc kubenswrapper[4823]: I1216 06:56:35.511143 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:35Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:35 crc kubenswrapper[4823]: I1216 06:56:35.519699 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bwcng" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ff057ef-c324-4465-8b8d-c7b98c25b23c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea36a181f0874fbab3022e6ce27567b65eb8c01cdb2925b08d9c0782af7e93c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2bj4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bwcng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:35Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:35 crc kubenswrapper[4823]: I1216 06:56:35.535180 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:35 crc kubenswrapper[4823]: I1216 06:56:35.535219 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:35 crc kubenswrapper[4823]: I1216 06:56:35.535227 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:35 crc kubenswrapper[4823]: I1216 06:56:35.535245 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:35 crc kubenswrapper[4823]: I1216 06:56:35.535256 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:35Z","lastTransitionTime":"2025-12-16T06:56:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:35 crc kubenswrapper[4823]: I1216 06:56:35.637556 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:35 crc kubenswrapper[4823]: I1216 06:56:35.637591 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:35 crc kubenswrapper[4823]: I1216 06:56:35.637601 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:35 crc kubenswrapper[4823]: I1216 06:56:35.637616 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:35 crc kubenswrapper[4823]: I1216 06:56:35.637626 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:35Z","lastTransitionTime":"2025-12-16T06:56:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:35 crc kubenswrapper[4823]: I1216 06:56:35.740001 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:35 crc kubenswrapper[4823]: I1216 06:56:35.740057 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:35 crc kubenswrapper[4823]: I1216 06:56:35.740072 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:35 crc kubenswrapper[4823]: I1216 06:56:35.740088 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:35 crc kubenswrapper[4823]: I1216 06:56:35.740100 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:35Z","lastTransitionTime":"2025-12-16T06:56:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:56:35 crc kubenswrapper[4823]: I1216 06:56:35.770834 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:56:35 crc kubenswrapper[4823]: I1216 06:56:35.770836 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:56:35 crc kubenswrapper[4823]: I1216 06:56:35.770890 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:56:35 crc kubenswrapper[4823]: I1216 06:56:35.770904 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8mn7l" Dec 16 06:56:35 crc kubenswrapper[4823]: E1216 06:56:35.771060 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 06:56:35 crc kubenswrapper[4823]: E1216 06:56:35.771138 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 06:56:35 crc kubenswrapper[4823]: E1216 06:56:35.771222 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 06:56:35 crc kubenswrapper[4823]: E1216 06:56:35.771389 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8mn7l" podUID="1e7dd738-a9b3-455c-93e0-3f0dc7327817" Dec 16 06:56:35 crc kubenswrapper[4823]: I1216 06:56:35.843951 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:35 crc kubenswrapper[4823]: I1216 06:56:35.843994 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:35 crc kubenswrapper[4823]: I1216 06:56:35.844006 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:35 crc kubenswrapper[4823]: I1216 06:56:35.844045 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:35 crc kubenswrapper[4823]: I1216 06:56:35.844056 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:35Z","lastTransitionTime":"2025-12-16T06:56:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:35 crc kubenswrapper[4823]: I1216 06:56:35.946553 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:35 crc kubenswrapper[4823]: I1216 06:56:35.946593 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:35 crc kubenswrapper[4823]: I1216 06:56:35.946604 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:35 crc kubenswrapper[4823]: I1216 06:56:35.946622 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:35 crc kubenswrapper[4823]: I1216 06:56:35.946633 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:35Z","lastTransitionTime":"2025-12-16T06:56:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:36 crc kubenswrapper[4823]: I1216 06:56:36.049458 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:36 crc kubenswrapper[4823]: I1216 06:56:36.049485 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:36 crc kubenswrapper[4823]: I1216 06:56:36.049493 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:36 crc kubenswrapper[4823]: I1216 06:56:36.049506 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:36 crc kubenswrapper[4823]: I1216 06:56:36.049515 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:36Z","lastTransitionTime":"2025-12-16T06:56:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:36 crc kubenswrapper[4823]: I1216 06:56:36.152214 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:36 crc kubenswrapper[4823]: I1216 06:56:36.152256 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:36 crc kubenswrapper[4823]: I1216 06:56:36.152267 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:36 crc kubenswrapper[4823]: I1216 06:56:36.152281 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:36 crc kubenswrapper[4823]: I1216 06:56:36.152290 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:36Z","lastTransitionTime":"2025-12-16T06:56:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:36 crc kubenswrapper[4823]: I1216 06:56:36.254578 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:36 crc kubenswrapper[4823]: I1216 06:56:36.254623 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:36 crc kubenswrapper[4823]: I1216 06:56:36.254635 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:36 crc kubenswrapper[4823]: I1216 06:56:36.254650 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:36 crc kubenswrapper[4823]: I1216 06:56:36.254661 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:36Z","lastTransitionTime":"2025-12-16T06:56:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:36 crc kubenswrapper[4823]: I1216 06:56:36.356992 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:36 crc kubenswrapper[4823]: I1216 06:56:36.357039 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:36 crc kubenswrapper[4823]: I1216 06:56:36.357049 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:36 crc kubenswrapper[4823]: I1216 06:56:36.357062 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:36 crc kubenswrapper[4823]: I1216 06:56:36.357070 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:36Z","lastTransitionTime":"2025-12-16T06:56:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:36 crc kubenswrapper[4823]: I1216 06:56:36.459743 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:36 crc kubenswrapper[4823]: I1216 06:56:36.459773 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:36 crc kubenswrapper[4823]: I1216 06:56:36.459781 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:36 crc kubenswrapper[4823]: I1216 06:56:36.459794 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:36 crc kubenswrapper[4823]: I1216 06:56:36.459802 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:36Z","lastTransitionTime":"2025-12-16T06:56:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:36 crc kubenswrapper[4823]: I1216 06:56:36.562646 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:36 crc kubenswrapper[4823]: I1216 06:56:36.562684 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:36 crc kubenswrapper[4823]: I1216 06:56:36.562693 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:36 crc kubenswrapper[4823]: I1216 06:56:36.562707 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:36 crc kubenswrapper[4823]: I1216 06:56:36.562716 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:36Z","lastTransitionTime":"2025-12-16T06:56:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:36 crc kubenswrapper[4823]: I1216 06:56:36.665196 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:36 crc kubenswrapper[4823]: I1216 06:56:36.665240 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:36 crc kubenswrapper[4823]: I1216 06:56:36.665251 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:36 crc kubenswrapper[4823]: I1216 06:56:36.665267 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:36 crc kubenswrapper[4823]: I1216 06:56:36.665281 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:36Z","lastTransitionTime":"2025-12-16T06:56:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:36 crc kubenswrapper[4823]: I1216 06:56:36.768752 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:36 crc kubenswrapper[4823]: I1216 06:56:36.768814 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:36 crc kubenswrapper[4823]: I1216 06:56:36.768850 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:36 crc kubenswrapper[4823]: I1216 06:56:36.768875 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:36 crc kubenswrapper[4823]: I1216 06:56:36.768892 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:36Z","lastTransitionTime":"2025-12-16T06:56:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:36 crc kubenswrapper[4823]: I1216 06:56:36.785893 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Dec 16 06:56:36 crc kubenswrapper[4823]: I1216 06:56:36.871940 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:36 crc kubenswrapper[4823]: I1216 06:56:36.872065 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:36 crc kubenswrapper[4823]: I1216 06:56:36.872077 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:36 crc kubenswrapper[4823]: I1216 06:56:36.872094 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:36 crc kubenswrapper[4823]: I1216 06:56:36.872107 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:36Z","lastTransitionTime":"2025-12-16T06:56:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:36 crc kubenswrapper[4823]: I1216 06:56:36.973938 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:36 crc kubenswrapper[4823]: I1216 06:56:36.973977 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:36 crc kubenswrapper[4823]: I1216 06:56:36.973991 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:36 crc kubenswrapper[4823]: I1216 06:56:36.974006 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:36 crc kubenswrapper[4823]: I1216 06:56:36.974017 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:36Z","lastTransitionTime":"2025-12-16T06:56:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:37 crc kubenswrapper[4823]: I1216 06:56:37.076540 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:37 crc kubenswrapper[4823]: I1216 06:56:37.076585 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:37 crc kubenswrapper[4823]: I1216 06:56:37.076596 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:37 crc kubenswrapper[4823]: I1216 06:56:37.076610 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:37 crc kubenswrapper[4823]: I1216 06:56:37.076619 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:37Z","lastTransitionTime":"2025-12-16T06:56:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:37 crc kubenswrapper[4823]: I1216 06:56:37.179468 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:37 crc kubenswrapper[4823]: I1216 06:56:37.179984 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:37 crc kubenswrapper[4823]: I1216 06:56:37.180013 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:37 crc kubenswrapper[4823]: I1216 06:56:37.180080 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:37 crc kubenswrapper[4823]: I1216 06:56:37.180098 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:37Z","lastTransitionTime":"2025-12-16T06:56:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:37 crc kubenswrapper[4823]: I1216 06:56:37.283136 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:37 crc kubenswrapper[4823]: I1216 06:56:37.283198 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:37 crc kubenswrapper[4823]: I1216 06:56:37.283215 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:37 crc kubenswrapper[4823]: I1216 06:56:37.283239 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:37 crc kubenswrapper[4823]: I1216 06:56:37.283257 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:37Z","lastTransitionTime":"2025-12-16T06:56:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:37 crc kubenswrapper[4823]: I1216 06:56:37.385549 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:37 crc kubenswrapper[4823]: I1216 06:56:37.385595 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:37 crc kubenswrapper[4823]: I1216 06:56:37.385607 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:37 crc kubenswrapper[4823]: I1216 06:56:37.385625 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:37 crc kubenswrapper[4823]: I1216 06:56:37.385638 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:37Z","lastTransitionTime":"2025-12-16T06:56:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:37 crc kubenswrapper[4823]: I1216 06:56:37.488174 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:37 crc kubenswrapper[4823]: I1216 06:56:37.488231 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:37 crc kubenswrapper[4823]: I1216 06:56:37.488242 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:37 crc kubenswrapper[4823]: I1216 06:56:37.488260 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:37 crc kubenswrapper[4823]: I1216 06:56:37.488274 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:37Z","lastTransitionTime":"2025-12-16T06:56:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:37 crc kubenswrapper[4823]: I1216 06:56:37.591495 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:37 crc kubenswrapper[4823]: I1216 06:56:37.591548 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:37 crc kubenswrapper[4823]: I1216 06:56:37.591560 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:37 crc kubenswrapper[4823]: I1216 06:56:37.591578 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:37 crc kubenswrapper[4823]: I1216 06:56:37.591596 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:37Z","lastTransitionTime":"2025-12-16T06:56:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:37 crc kubenswrapper[4823]: I1216 06:56:37.694176 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:37 crc kubenswrapper[4823]: I1216 06:56:37.694229 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:37 crc kubenswrapper[4823]: I1216 06:56:37.694242 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:37 crc kubenswrapper[4823]: I1216 06:56:37.694261 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:37 crc kubenswrapper[4823]: I1216 06:56:37.694277 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:37Z","lastTransitionTime":"2025-12-16T06:56:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:56:37 crc kubenswrapper[4823]: I1216 06:56:37.771372 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:56:37 crc kubenswrapper[4823]: I1216 06:56:37.771441 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8mn7l" Dec 16 06:56:37 crc kubenswrapper[4823]: E1216 06:56:37.771541 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 06:56:37 crc kubenswrapper[4823]: I1216 06:56:37.771592 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:56:37 crc kubenswrapper[4823]: E1216 06:56:37.771686 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 06:56:37 crc kubenswrapper[4823]: I1216 06:56:37.771383 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:56:37 crc kubenswrapper[4823]: E1216 06:56:37.771757 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 06:56:37 crc kubenswrapper[4823]: E1216 06:56:37.771805 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8mn7l" podUID="1e7dd738-a9b3-455c-93e0-3f0dc7327817" Dec 16 06:56:37 crc kubenswrapper[4823]: I1216 06:56:37.797209 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:37 crc kubenswrapper[4823]: I1216 06:56:37.797254 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:37 crc kubenswrapper[4823]: I1216 06:56:37.797265 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:37 crc kubenswrapper[4823]: I1216 06:56:37.797286 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:37 crc kubenswrapper[4823]: I1216 06:56:37.797299 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:37Z","lastTransitionTime":"2025-12-16T06:56:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:37 crc kubenswrapper[4823]: I1216 06:56:37.899750 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:37 crc kubenswrapper[4823]: I1216 06:56:37.899797 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:37 crc kubenswrapper[4823]: I1216 06:56:37.899817 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:37 crc kubenswrapper[4823]: I1216 06:56:37.899836 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:37 crc kubenswrapper[4823]: I1216 06:56:37.899848 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:37Z","lastTransitionTime":"2025-12-16T06:56:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:38 crc kubenswrapper[4823]: I1216 06:56:38.002503 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:38 crc kubenswrapper[4823]: I1216 06:56:38.002539 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:38 crc kubenswrapper[4823]: I1216 06:56:38.002547 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:38 crc kubenswrapper[4823]: I1216 06:56:38.002561 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:38 crc kubenswrapper[4823]: I1216 06:56:38.002571 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:38Z","lastTransitionTime":"2025-12-16T06:56:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:38 crc kubenswrapper[4823]: I1216 06:56:38.105222 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:38 crc kubenswrapper[4823]: I1216 06:56:38.105266 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:38 crc kubenswrapper[4823]: I1216 06:56:38.105277 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:38 crc kubenswrapper[4823]: I1216 06:56:38.105293 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:38 crc kubenswrapper[4823]: I1216 06:56:38.105305 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:38Z","lastTransitionTime":"2025-12-16T06:56:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:38 crc kubenswrapper[4823]: I1216 06:56:38.208115 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:38 crc kubenswrapper[4823]: I1216 06:56:38.208168 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:38 crc kubenswrapper[4823]: I1216 06:56:38.208179 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:38 crc kubenswrapper[4823]: I1216 06:56:38.208197 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:38 crc kubenswrapper[4823]: I1216 06:56:38.208208 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:38Z","lastTransitionTime":"2025-12-16T06:56:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:38 crc kubenswrapper[4823]: I1216 06:56:38.310210 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:38 crc kubenswrapper[4823]: I1216 06:56:38.310246 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:38 crc kubenswrapper[4823]: I1216 06:56:38.310256 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:38 crc kubenswrapper[4823]: I1216 06:56:38.310273 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:38 crc kubenswrapper[4823]: I1216 06:56:38.310283 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:38Z","lastTransitionTime":"2025-12-16T06:56:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:38 crc kubenswrapper[4823]: I1216 06:56:38.412603 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:38 crc kubenswrapper[4823]: I1216 06:56:38.412645 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:38 crc kubenswrapper[4823]: I1216 06:56:38.412657 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:38 crc kubenswrapper[4823]: I1216 06:56:38.412679 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:38 crc kubenswrapper[4823]: I1216 06:56:38.412690 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:38Z","lastTransitionTime":"2025-12-16T06:56:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:38 crc kubenswrapper[4823]: I1216 06:56:38.514773 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:38 crc kubenswrapper[4823]: I1216 06:56:38.514809 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:38 crc kubenswrapper[4823]: I1216 06:56:38.514820 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:38 crc kubenswrapper[4823]: I1216 06:56:38.514838 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:38 crc kubenswrapper[4823]: I1216 06:56:38.514850 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:38Z","lastTransitionTime":"2025-12-16T06:56:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:38 crc kubenswrapper[4823]: I1216 06:56:38.617187 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:38 crc kubenswrapper[4823]: I1216 06:56:38.617253 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:38 crc kubenswrapper[4823]: I1216 06:56:38.617270 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:38 crc kubenswrapper[4823]: I1216 06:56:38.617309 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:38 crc kubenswrapper[4823]: I1216 06:56:38.617322 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:38Z","lastTransitionTime":"2025-12-16T06:56:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:38 crc kubenswrapper[4823]: I1216 06:56:38.720471 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:38 crc kubenswrapper[4823]: I1216 06:56:38.720908 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:38 crc kubenswrapper[4823]: I1216 06:56:38.721018 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:38 crc kubenswrapper[4823]: I1216 06:56:38.721211 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:38 crc kubenswrapper[4823]: I1216 06:56:38.721241 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:38Z","lastTransitionTime":"2025-12-16T06:56:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:38 crc kubenswrapper[4823]: I1216 06:56:38.825109 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:38 crc kubenswrapper[4823]: I1216 06:56:38.825166 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:38 crc kubenswrapper[4823]: I1216 06:56:38.825182 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:38 crc kubenswrapper[4823]: I1216 06:56:38.825204 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:38 crc kubenswrapper[4823]: I1216 06:56:38.825217 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:38Z","lastTransitionTime":"2025-12-16T06:56:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:38 crc kubenswrapper[4823]: I1216 06:56:38.928330 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:38 crc kubenswrapper[4823]: I1216 06:56:38.928381 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:38 crc kubenswrapper[4823]: I1216 06:56:38.928392 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:38 crc kubenswrapper[4823]: I1216 06:56:38.928411 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:38 crc kubenswrapper[4823]: I1216 06:56:38.928421 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:38Z","lastTransitionTime":"2025-12-16T06:56:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:39 crc kubenswrapper[4823]: I1216 06:56:39.031839 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:39 crc kubenswrapper[4823]: I1216 06:56:39.031919 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:39 crc kubenswrapper[4823]: I1216 06:56:39.031945 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:39 crc kubenswrapper[4823]: I1216 06:56:39.031981 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:39 crc kubenswrapper[4823]: I1216 06:56:39.032004 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:39Z","lastTransitionTime":"2025-12-16T06:56:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:39 crc kubenswrapper[4823]: I1216 06:56:39.135667 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:39 crc kubenswrapper[4823]: I1216 06:56:39.135812 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:39 crc kubenswrapper[4823]: I1216 06:56:39.135844 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:39 crc kubenswrapper[4823]: I1216 06:56:39.135876 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:39 crc kubenswrapper[4823]: I1216 06:56:39.135897 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:39Z","lastTransitionTime":"2025-12-16T06:56:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:39 crc kubenswrapper[4823]: I1216 06:56:39.238485 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:39 crc kubenswrapper[4823]: I1216 06:56:39.238651 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:39 crc kubenswrapper[4823]: I1216 06:56:39.238672 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:39 crc kubenswrapper[4823]: I1216 06:56:39.238703 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:39 crc kubenswrapper[4823]: I1216 06:56:39.238724 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:39Z","lastTransitionTime":"2025-12-16T06:56:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:39 crc kubenswrapper[4823]: I1216 06:56:39.341879 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:39 crc kubenswrapper[4823]: I1216 06:56:39.341932 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:39 crc kubenswrapper[4823]: I1216 06:56:39.341946 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:39 crc kubenswrapper[4823]: I1216 06:56:39.341967 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:39 crc kubenswrapper[4823]: I1216 06:56:39.341979 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:39Z","lastTransitionTime":"2025-12-16T06:56:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:39 crc kubenswrapper[4823]: I1216 06:56:39.444506 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:39 crc kubenswrapper[4823]: I1216 06:56:39.444544 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:39 crc kubenswrapper[4823]: I1216 06:56:39.444554 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:39 crc kubenswrapper[4823]: I1216 06:56:39.444569 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:39 crc kubenswrapper[4823]: I1216 06:56:39.444579 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:39Z","lastTransitionTime":"2025-12-16T06:56:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:39 crc kubenswrapper[4823]: I1216 06:56:39.546962 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:39 crc kubenswrapper[4823]: I1216 06:56:39.546991 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:39 crc kubenswrapper[4823]: I1216 06:56:39.547002 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:39 crc kubenswrapper[4823]: I1216 06:56:39.547017 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:39 crc kubenswrapper[4823]: I1216 06:56:39.547040 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:39Z","lastTransitionTime":"2025-12-16T06:56:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:39 crc kubenswrapper[4823]: I1216 06:56:39.650087 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:39 crc kubenswrapper[4823]: I1216 06:56:39.650144 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:39 crc kubenswrapper[4823]: I1216 06:56:39.650156 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:39 crc kubenswrapper[4823]: I1216 06:56:39.650176 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:39 crc kubenswrapper[4823]: I1216 06:56:39.650188 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:39Z","lastTransitionTime":"2025-12-16T06:56:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:39 crc kubenswrapper[4823]: I1216 06:56:39.753656 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:39 crc kubenswrapper[4823]: I1216 06:56:39.753758 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:39 crc kubenswrapper[4823]: I1216 06:56:39.753776 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:39 crc kubenswrapper[4823]: I1216 06:56:39.753837 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:39 crc kubenswrapper[4823]: I1216 06:56:39.753857 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:39Z","lastTransitionTime":"2025-12-16T06:56:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:56:39 crc kubenswrapper[4823]: I1216 06:56:39.771009 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:56:39 crc kubenswrapper[4823]: I1216 06:56:39.771136 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:56:39 crc kubenswrapper[4823]: I1216 06:56:39.771093 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8mn7l" Dec 16 06:56:39 crc kubenswrapper[4823]: E1216 06:56:39.771263 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 06:56:39 crc kubenswrapper[4823]: E1216 06:56:39.771538 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8mn7l" podUID="1e7dd738-a9b3-455c-93e0-3f0dc7327817" Dec 16 06:56:39 crc kubenswrapper[4823]: I1216 06:56:39.771582 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:56:39 crc kubenswrapper[4823]: E1216 06:56:39.771578 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 06:56:39 crc kubenswrapper[4823]: E1216 06:56:39.771768 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 06:56:39 crc kubenswrapper[4823]: I1216 06:56:39.857076 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:39 crc kubenswrapper[4823]: I1216 06:56:39.857121 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:39 crc kubenswrapper[4823]: I1216 06:56:39.857131 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:39 crc kubenswrapper[4823]: I1216 06:56:39.857146 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:39 crc kubenswrapper[4823]: I1216 06:56:39.857157 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:39Z","lastTransitionTime":"2025-12-16T06:56:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:39 crc kubenswrapper[4823]: I1216 06:56:39.961409 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:39 crc kubenswrapper[4823]: I1216 06:56:39.961515 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:39 crc kubenswrapper[4823]: I1216 06:56:39.961540 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:39 crc kubenswrapper[4823]: I1216 06:56:39.961580 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:39 crc kubenswrapper[4823]: I1216 06:56:39.961604 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:39Z","lastTransitionTime":"2025-12-16T06:56:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:40 crc kubenswrapper[4823]: I1216 06:56:40.064915 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:40 crc kubenswrapper[4823]: I1216 06:56:40.064986 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:40 crc kubenswrapper[4823]: I1216 06:56:40.065002 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:40 crc kubenswrapper[4823]: I1216 06:56:40.065062 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:40 crc kubenswrapper[4823]: I1216 06:56:40.065081 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:40Z","lastTransitionTime":"2025-12-16T06:56:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:40 crc kubenswrapper[4823]: I1216 06:56:40.168774 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:40 crc kubenswrapper[4823]: I1216 06:56:40.168846 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:40 crc kubenswrapper[4823]: I1216 06:56:40.168859 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:40 crc kubenswrapper[4823]: I1216 06:56:40.168879 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:40 crc kubenswrapper[4823]: I1216 06:56:40.168890 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:40Z","lastTransitionTime":"2025-12-16T06:56:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:40 crc kubenswrapper[4823]: I1216 06:56:40.272209 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:40 crc kubenswrapper[4823]: I1216 06:56:40.272275 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:40 crc kubenswrapper[4823]: I1216 06:56:40.272291 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:40 crc kubenswrapper[4823]: I1216 06:56:40.272318 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:40 crc kubenswrapper[4823]: I1216 06:56:40.272332 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:40Z","lastTransitionTime":"2025-12-16T06:56:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:40 crc kubenswrapper[4823]: I1216 06:56:40.375104 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:40 crc kubenswrapper[4823]: I1216 06:56:40.375155 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:40 crc kubenswrapper[4823]: I1216 06:56:40.375168 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:40 crc kubenswrapper[4823]: I1216 06:56:40.375186 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:40 crc kubenswrapper[4823]: I1216 06:56:40.375199 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:40Z","lastTransitionTime":"2025-12-16T06:56:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:40 crc kubenswrapper[4823]: I1216 06:56:40.478099 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:40 crc kubenswrapper[4823]: I1216 06:56:40.478202 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:40 crc kubenswrapper[4823]: I1216 06:56:40.478230 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:40 crc kubenswrapper[4823]: I1216 06:56:40.478266 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:40 crc kubenswrapper[4823]: I1216 06:56:40.478292 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:40Z","lastTransitionTime":"2025-12-16T06:56:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:40 crc kubenswrapper[4823]: I1216 06:56:40.581430 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:40 crc kubenswrapper[4823]: I1216 06:56:40.581490 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:40 crc kubenswrapper[4823]: I1216 06:56:40.581505 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:40 crc kubenswrapper[4823]: I1216 06:56:40.581523 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:40 crc kubenswrapper[4823]: I1216 06:56:40.581536 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:40Z","lastTransitionTime":"2025-12-16T06:56:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:40 crc kubenswrapper[4823]: I1216 06:56:40.685255 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:40 crc kubenswrapper[4823]: I1216 06:56:40.685339 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:40 crc kubenswrapper[4823]: I1216 06:56:40.685358 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:40 crc kubenswrapper[4823]: I1216 06:56:40.685390 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:40 crc kubenswrapper[4823]: I1216 06:56:40.685412 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:40Z","lastTransitionTime":"2025-12-16T06:56:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:40 crc kubenswrapper[4823]: I1216 06:56:40.788795 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:40 crc kubenswrapper[4823]: I1216 06:56:40.788872 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:40 crc kubenswrapper[4823]: I1216 06:56:40.788892 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:40 crc kubenswrapper[4823]: I1216 06:56:40.788924 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:40 crc kubenswrapper[4823]: I1216 06:56:40.788945 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:40Z","lastTransitionTime":"2025-12-16T06:56:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:40 crc kubenswrapper[4823]: I1216 06:56:40.892086 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:40 crc kubenswrapper[4823]: I1216 06:56:40.892150 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:40 crc kubenswrapper[4823]: I1216 06:56:40.892171 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:40 crc kubenswrapper[4823]: I1216 06:56:40.892196 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:40 crc kubenswrapper[4823]: I1216 06:56:40.892217 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:40Z","lastTransitionTime":"2025-12-16T06:56:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:40 crc kubenswrapper[4823]: I1216 06:56:40.994966 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:40 crc kubenswrapper[4823]: I1216 06:56:40.995037 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:40 crc kubenswrapper[4823]: I1216 06:56:40.995051 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:40 crc kubenswrapper[4823]: I1216 06:56:40.995069 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:40 crc kubenswrapper[4823]: I1216 06:56:40.995082 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:40Z","lastTransitionTime":"2025-12-16T06:56:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:41 crc kubenswrapper[4823]: I1216 06:56:41.097950 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:41 crc kubenswrapper[4823]: I1216 06:56:41.097990 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:41 crc kubenswrapper[4823]: I1216 06:56:41.098001 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:41 crc kubenswrapper[4823]: I1216 06:56:41.098043 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:41 crc kubenswrapper[4823]: I1216 06:56:41.098056 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:41Z","lastTransitionTime":"2025-12-16T06:56:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:41 crc kubenswrapper[4823]: I1216 06:56:41.200741 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:41 crc kubenswrapper[4823]: I1216 06:56:41.200792 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:41 crc kubenswrapper[4823]: I1216 06:56:41.200805 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:41 crc kubenswrapper[4823]: I1216 06:56:41.200822 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:41 crc kubenswrapper[4823]: I1216 06:56:41.200834 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:41Z","lastTransitionTime":"2025-12-16T06:56:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:41 crc kubenswrapper[4823]: I1216 06:56:41.302670 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:41 crc kubenswrapper[4823]: I1216 06:56:41.302738 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:41 crc kubenswrapper[4823]: I1216 06:56:41.302747 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:41 crc kubenswrapper[4823]: I1216 06:56:41.302763 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:41 crc kubenswrapper[4823]: I1216 06:56:41.302771 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:41Z","lastTransitionTime":"2025-12-16T06:56:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:41 crc kubenswrapper[4823]: I1216 06:56:41.406314 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:41 crc kubenswrapper[4823]: I1216 06:56:41.406388 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:41 crc kubenswrapper[4823]: I1216 06:56:41.406400 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:41 crc kubenswrapper[4823]: I1216 06:56:41.406422 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:41 crc kubenswrapper[4823]: I1216 06:56:41.406435 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:41Z","lastTransitionTime":"2025-12-16T06:56:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:41 crc kubenswrapper[4823]: I1216 06:56:41.509143 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:41 crc kubenswrapper[4823]: I1216 06:56:41.509193 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:41 crc kubenswrapper[4823]: I1216 06:56:41.509203 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:41 crc kubenswrapper[4823]: I1216 06:56:41.509226 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:41 crc kubenswrapper[4823]: I1216 06:56:41.509238 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:41Z","lastTransitionTime":"2025-12-16T06:56:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:41 crc kubenswrapper[4823]: I1216 06:56:41.612455 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:41 crc kubenswrapper[4823]: I1216 06:56:41.612495 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:41 crc kubenswrapper[4823]: I1216 06:56:41.612693 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:41 crc kubenswrapper[4823]: I1216 06:56:41.612713 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:41 crc kubenswrapper[4823]: I1216 06:56:41.612723 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:41Z","lastTransitionTime":"2025-12-16T06:56:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:41 crc kubenswrapper[4823]: I1216 06:56:41.714939 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:41 crc kubenswrapper[4823]: I1216 06:56:41.714998 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:41 crc kubenswrapper[4823]: I1216 06:56:41.715011 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:41 crc kubenswrapper[4823]: I1216 06:56:41.715109 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:41 crc kubenswrapper[4823]: I1216 06:56:41.715128 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:41Z","lastTransitionTime":"2025-12-16T06:56:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:56:41 crc kubenswrapper[4823]: I1216 06:56:41.770828 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8mn7l" Dec 16 06:56:41 crc kubenswrapper[4823]: I1216 06:56:41.770836 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:56:41 crc kubenswrapper[4823]: I1216 06:56:41.771014 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:56:41 crc kubenswrapper[4823]: I1216 06:56:41.771091 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:56:41 crc kubenswrapper[4823]: E1216 06:56:41.771189 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8mn7l" podUID="1e7dd738-a9b3-455c-93e0-3f0dc7327817" Dec 16 06:56:41 crc kubenswrapper[4823]: E1216 06:56:41.771272 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 06:56:41 crc kubenswrapper[4823]: E1216 06:56:41.771412 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 06:56:41 crc kubenswrapper[4823]: E1216 06:56:41.771510 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 06:56:41 crc kubenswrapper[4823]: I1216 06:56:41.790224 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:41Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:41 crc kubenswrapper[4823]: I1216 06:56:41.803828 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63ba735c509770e9d43fe8152b2a42dc4123d0d797c7d41d5e3efe39a6cef74d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e09486327bad82b741ca9260e306f84ed9f0fbc7058bb06267042b8d2cd5f818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:41Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:41 crc kubenswrapper[4823]: I1216 06:56:41.817047 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:41Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:41 crc kubenswrapper[4823]: I1216 06:56:41.818846 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:41 crc kubenswrapper[4823]: I1216 06:56:41.818939 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:41 crc 
kubenswrapper[4823]: I1216 06:56:41.818955 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:41 crc kubenswrapper[4823]: I1216 06:56:41.818980 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:41 crc kubenswrapper[4823]: I1216 06:56:41.819189 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:41Z","lastTransitionTime":"2025-12-16T06:56:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:56:41 crc kubenswrapper[4823]: I1216 06:56:41.832745 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72dfe197d43de2945a2ee22789ab86205c77588c4f4002e2473cef09e7b4b10b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-16T06:56:41Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:41 crc kubenswrapper[4823]: I1216 06:56:41.847294 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hr8h5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d31d032-9142-4e26-9f06-e3a5ea73d530\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://502e1707b8f53950b8f10fa6b289c9d235d6596fb6eec7e4d69c3158823f8e2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-7sdtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hr8h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:41Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:41 crc kubenswrapper[4823]: I1216 06:56:41.863877 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n248g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b377757-dbc6-4d9c-9656-3ff65d7d113a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:56:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:56:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93cfb9ff0c194231a3f99afaf3fb482684347346a20315de6c6513dc5dde896
6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78e6c48a7009b10b70f11bfc96f18839781aeccd794f9ce7a4074271cfb9becc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T06:56:28Z\\\",\\\"message\\\":\\\"2025-12-16T06:55:43+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_dcb33422-1d47-454e-9c64-2a3d50e3c0a5\\\\n2025-12-16T06:55:43+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_dcb33422-1d47-454e-9c64-2a3d50e3c0a5 to /host/opt/cni/bin/\\\\n2025-12-16T06:55:43Z [verbose] multus-daemon started\\\\n2025-12-16T06:55:43Z [verbose] Readiness Indicator file check\\\\n2025-12-16T06:56:28Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2skkc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n248g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:41Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:41 crc kubenswrapper[4823]: I1216 06:56:41.882372 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-964hc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a95a09e1-8457-4d5e-b1b6-dd6f59a66c6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bb1ede33a8f71c9116cfb9401b1f78a6aca07d580e4abe93370b470d6b20284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"
imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b305ee45caf9aff77bb738777dd9b2ebd2e3e63727b460eae22f8f66d1e6cec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b305ee45caf9aff77bb738777dd9b2ebd2e3e63727b460eae22f8f66d1e6cec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\
"containerID\\\":\\\"cri-o://d9d77b125c0d081148c1892229c933414ac8e1b8e0371f0932512deba1107e94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9d77b125c0d081148c1892229c933414ac8e1b8e0371f0932512deba1107e94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2048aefee6601e678e26821b3ed329aead74791b34654ef5dde9fe261e37b835\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\
"containerID\\\":\\\"cri-o://2048aefee6601e678e26821b3ed329aead74791b34654ef5dde9fe261e37b835\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c574f5f8d3f82a7a21c7f8e49e7927387be9e4029e28ba265d39ea058fb6a60f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c574f5f8d3f82a7a21c7f8e49e7927387be9e4029e28ba265d39ea058fb6a60f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":tru
e,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e32bdec2a02013c087ccc13325d44061d85bcf9a352f2dc31ed506f11db6428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e32bdec2a02013c087ccc13325d44061d85bcf9a352f2dc31ed506f11db6428\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91427f2f713f7e1c2ba16e87252e2fd0479a31e6f94884ad85fe9f99ab84bedd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91427f2f713f7e1c2ba16e87252e2fd0479a31e6f94884ad85fe9f99ab84bedd\\\",\\\"
exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5scf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-964hc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:41Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:41 crc kubenswrapper[4823]: I1216 06:56:41.894678 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8316198-2610-4b50-9e00-f54f70c31372\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://461527ca73f110395f3690515a2bba303b648297d2e25f5174a6dc4a9a5b591a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://173f7590ff6fb04354f0bebec8ba8bbe2d4254d9c535920b0d9adf13a8d72393\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://173f7590ff6fb04354f0bebec8ba8bbe2d4254d9c535920b0d9adf13a8d72393\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:41Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:41 crc kubenswrapper[4823]: I1216 06:56:41.906598 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bwcng" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ff057ef-c324-4465-8b8d-c7b98c25b23c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea36a181f0874fbab3022e6ce27567b65eb8c01cdb2925b08d9c0782af7e93c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2bj4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bwcng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:41Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:41 crc kubenswrapper[4823]: I1216 06:56:41.922276 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:41 crc kubenswrapper[4823]: I1216 06:56:41.922310 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:41 crc kubenswrapper[4823]: I1216 06:56:41.922321 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:41 crc kubenswrapper[4823]: I1216 06:56:41.922336 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:41 crc kubenswrapper[4823]: I1216 06:56:41.922348 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:41Z","lastTransitionTime":"2025-12-16T06:56:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:41 crc kubenswrapper[4823]: I1216 06:56:41.930042 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08e48f89-7095-4ea2-afb5-759591c2b0d4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3d0f51f7ac1df7a9ae61306ccbfe2c93b397b6ab9ae0a477ab4eb46248d36f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b768d92db57918a41c53b83940957c0181865808eb2c795231a5f8dbbf2bde82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f2cea674c0936cca4e9e95dfb7cd15bf9ccb3fbf8b711f351f87b632aad65c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://055585c1217ea720b929c9da82e7ea662d6fd5634244d362b3b425faee380d92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:42Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1c6cb023dcc6d1b149de201b108e3b1f3e32530dadc822aedede5552efc586b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://304f9e1fee6b648f61b3b334284a954e8107ff70ca71d01b586d137490eeab20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0002b85e32dbacab67cb93ff21bf26130cbf98d65e88b13cfc00dc301fd4a3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0002b85e32dbacab67cb93ff21bf26130cbf98d65e88b13cfc00dc301fd4a3ec\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T06:56:34Z\\\",\\\"message\\\":\\\"ces_controller.go:443] Built service openshift-config-operator/metrics LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.4.161\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, 
nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1216 06:56:33.955106 6853 services_controller.go:454] Service openshift-operator-lifecycle-manager/package-server-manager-metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI1216 06:56:33.954736 6853 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/network-metrics-daemon-8mn7l\\\\nI1216 06:56:33.954801 6853 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-n248g in node crc\\\\nI1216 06:56:33.955141 6853 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-n248g after 0 failed attempt(s)\\\\nF1216 06:56:33.955147 6853 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initia\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:56:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zwjhk_openshift-ovn-kubernetes(08e48f89-7095-4ea2-afb5-759591c2b0d4)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c223f5207e2445fd4bfd1bfe2c1c1ca50b5211a92b02c21445534fbfa2709e28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e3566766c9c24f5c12481b74ed8a3938b21bdb36c003146515c67170d74a71f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e3566766c9c24f5c1
2481b74ed8a3938b21bdb36c003146515c67170d74a71f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzgs7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zwjhk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:41Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:41 crc kubenswrapper[4823]: I1216 06:56:41.943163 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f23abdf3-bae6-4239-b0b0-2cb717be2ffb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:56:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:56:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf31c05ee4aae2a94ca03ae1c3af6a8e748104346b05bc56a75bffd14c4ef993\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59b38847905d672f7a59b8a77a7def857a70111c48d5f7d06180f91a42ae79d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86b4d125316b63df68beb204f3618d0d82c2646f505297c24d07a584732e19f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31d129a77e2eea43c6bd7305f37db4e804cc07e039ce90258fc9d239dbd12aa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://31d129a77e2eea43c6bd7305f37db4e804cc07e039ce90258fc9d239dbd12aa9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:41Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:41 crc kubenswrapper[4823]: I1216 06:56:41.958520 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e388fa291257a537c122a1f3f0eb6a60eb74e4bcaaf32c4fe829be2da1bfdae9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:41Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:41 crc kubenswrapper[4823]: I1216 06:56:41.972196 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:41Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:41 crc kubenswrapper[4823]: I1216 06:56:41.984517 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"25dec47c-3043-486c-b371-2be103c214e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2bf5eb4e2f587f7084f731ca681a116313b1014b02cdb391b09fdcdd42600c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmpf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78aed5cdf8d3a29dfffcdabfd18b38c0cf762258
53078e01602198bee4994536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmpf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fv56f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:41Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:41 crc kubenswrapper[4823]: I1216 06:56:41.995387 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v5mgh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"42574a3f-0701-4192-b16c-bdb9be6c2888\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3a26eba39f7b07ed59337e21910571235eaa797f43231f4962869caf58a9515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t54x6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e305106870ce1b62af84310a83d4ba0d529
312470315b95ae7e41e4f0d378e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t54x6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-v5mgh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:41Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:42 crc kubenswrapper[4823]: I1216 06:56:42.014359 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3de3a2ad-dc6c-47ba-af8d-4f128e025aad\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c40d3d73f70c8983adc8d076d89864e7224528dae97252396a1c34bd4cb804e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50bb40e9d22674554f2a267df7c9f924a29131ac986877bf19572cc5992ba396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9c22b0787554b4bdf0b0068fc696b8515e2ee63affef23e5eb64f77bc32a624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a32894b8f22c5335ed585c26fcab727324803d768943f40455a592025cbfd0ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1560fb0ba0c40a2797688b518c22c9164a8cbe9a265cb2ad95408ba86b0fb537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3d6894d867564eea62c7d78dd6ca62a9913787ba06b786722e478785ba796ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3d6894d867564eea62c7d78dd6ca62a9913787ba06b786722e478785ba796ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-16T06:55:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e551b627e90b56385fe5618d3e66ea7376a2d574589583c005373aa29db1d410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e551b627e90b56385fe5618d3e66ea7376a2d574589583c005373aa29db1d410\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://631458d415d64c3cd4dfd24771644393aa6c01288de8f7a5395cf8888db67d2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://631458d415d64c3cd4dfd24771644393aa6c01288de8f7a5395cf8888db67d2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:42Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:42 crc kubenswrapper[4823]: I1216 06:56:42.024491 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:42 crc kubenswrapper[4823]: I1216 06:56:42.024840 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:42 crc kubenswrapper[4823]: I1216 06:56:42.024872 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:42 crc kubenswrapper[4823]: I1216 06:56:42.024895 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:42 crc kubenswrapper[4823]: I1216 06:56:42.024909 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:42Z","lastTransitionTime":"2025-12-16T06:56:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:42 crc kubenswrapper[4823]: I1216 06:56:42.034149 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4c70365-dff4-4b29-af25-657fd9823db4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59ddc7310ed1600d050e488762436763220f3c2946a4c2020346da4ccef46b1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://275328a704b
475b037c82f92055f9ca7f83b9821325603c5e34090140cc486ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7fa96323e345f8891f51500618a7b07d128d3e7256a3534d9dca75ff8ce9edf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d706c1463d2c913afe727db1a87a2841698d91a586c84d6b55f02865e7f5584a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:42Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:42 crc kubenswrapper[4823]: I1216 06:56:42.045338 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8mn7l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e7dd738-a9b3-455c-93e0-3f0dc7327817\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2crtc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2crtc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8mn7l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:42Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:42 crc 
kubenswrapper[4823]: I1216 06:56:42.059840 4823 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c915dd50-9820-494e-b47a-987257910a57\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c33a30aed9756ef012ff9046309ec07897de3b17707ea39c621534ed863c3adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://464ab3cf14920a
e3f947ce3dccacd7379d50380699008411d0a51bd83fe3e51c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20f7e0e3c07f467643afdcc6f0e333353d2d10b6d0c0d7aff0d347d33a5d925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ddfb7d6dae98465369260c8e9c64a2aa21dbf226e7898f6de0c36b3806bcae8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://9c9e6ec333ac552dc972c31cce23104cc27bcc2cf11bdc52d89ab36172915a41\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T06:55:39Z\\\",\\\"message\\\":\\\"ed a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1853467785/tls.crt::/tmp/serving-cert-1853467785/tls.key\\\\\\\"\\\\nI1216 06:55:39.731946 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 06:55:39.736431 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 06:55:39.736560 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 06:55:39.736611 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 06:55:39.736722 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 06:55:39.741676 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 06:55:39.741701 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 06:55:39.741707 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 06:55:39.741713 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 06:55:39.741717 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 06:55:39.741721 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 06:55:39.741725 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 06:55:39.741774 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 06:55:39.744309 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" 
certName=\\\\\\\"serving-cert::/tmp/serving-cert-1853467785/tls.crt::/tmp/serving-cert-1853467785/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765868124\\\\\\\\\\\\\\\" (2025-12-16 06:55:23 +0000 UTC to 2026-01-15 06:55:24 +0000 UTC (now=2025-12-16 06:55:39.744275203 +0000 UTC))\\\\\\\"\\\\nF1216 06:55:39.744562 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c84e7f6950c8d433823b7d5ae3d8a9ecaa70ab6845ce607b8bb93efa990325a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:55:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84f11d2687959c8525b131966a998af137caa69a480bb31a94e1615a3ccbea5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"image
ID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84f11d2687959c8525b131966a998af137caa69a480bb31a94e1615a3ccbea5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:55:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:55:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:55:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:56:42Z is after 2025-08-24T17:21:41Z" Dec 16 06:56:42 crc kubenswrapper[4823]: I1216 06:56:42.127068 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:42 crc kubenswrapper[4823]: I1216 06:56:42.127105 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:42 crc kubenswrapper[4823]: I1216 06:56:42.127115 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:42 crc kubenswrapper[4823]: I1216 06:56:42.127129 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:42 crc kubenswrapper[4823]: I1216 06:56:42.127140 4823 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:42Z","lastTransitionTime":"2025-12-16T06:56:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:56:42 crc kubenswrapper[4823]: I1216 06:56:42.228747 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:42 crc kubenswrapper[4823]: I1216 06:56:42.228782 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:42 crc kubenswrapper[4823]: I1216 06:56:42.228793 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:42 crc kubenswrapper[4823]: I1216 06:56:42.228810 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:42 crc kubenswrapper[4823]: I1216 06:56:42.228828 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:42Z","lastTransitionTime":"2025-12-16T06:56:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:42 crc kubenswrapper[4823]: I1216 06:56:42.330977 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:42 crc kubenswrapper[4823]: I1216 06:56:42.331010 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:42 crc kubenswrapper[4823]: I1216 06:56:42.331019 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:42 crc kubenswrapper[4823]: I1216 06:56:42.331071 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:42 crc kubenswrapper[4823]: I1216 06:56:42.331082 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:42Z","lastTransitionTime":"2025-12-16T06:56:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:42 crc kubenswrapper[4823]: I1216 06:56:42.433254 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:42 crc kubenswrapper[4823]: I1216 06:56:42.433299 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:42 crc kubenswrapper[4823]: I1216 06:56:42.433309 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:42 crc kubenswrapper[4823]: I1216 06:56:42.433333 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:42 crc kubenswrapper[4823]: I1216 06:56:42.433348 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:42Z","lastTransitionTime":"2025-12-16T06:56:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:42 crc kubenswrapper[4823]: I1216 06:56:42.535951 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:42 crc kubenswrapper[4823]: I1216 06:56:42.535984 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:42 crc kubenswrapper[4823]: I1216 06:56:42.535994 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:42 crc kubenswrapper[4823]: I1216 06:56:42.536009 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:42 crc kubenswrapper[4823]: I1216 06:56:42.536018 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:42Z","lastTransitionTime":"2025-12-16T06:56:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:42 crc kubenswrapper[4823]: I1216 06:56:42.638919 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:42 crc kubenswrapper[4823]: I1216 06:56:42.638966 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:42 crc kubenswrapper[4823]: I1216 06:56:42.638977 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:42 crc kubenswrapper[4823]: I1216 06:56:42.638994 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:42 crc kubenswrapper[4823]: I1216 06:56:42.639005 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:42Z","lastTransitionTime":"2025-12-16T06:56:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:42 crc kubenswrapper[4823]: I1216 06:56:42.740969 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:42 crc kubenswrapper[4823]: I1216 06:56:42.741018 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:42 crc kubenswrapper[4823]: I1216 06:56:42.741069 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:42 crc kubenswrapper[4823]: I1216 06:56:42.741088 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:42 crc kubenswrapper[4823]: I1216 06:56:42.741102 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:42Z","lastTransitionTime":"2025-12-16T06:56:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:42 crc kubenswrapper[4823]: I1216 06:56:42.844110 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:42 crc kubenswrapper[4823]: I1216 06:56:42.844160 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:42 crc kubenswrapper[4823]: I1216 06:56:42.844171 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:42 crc kubenswrapper[4823]: I1216 06:56:42.844189 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:42 crc kubenswrapper[4823]: I1216 06:56:42.844201 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:42Z","lastTransitionTime":"2025-12-16T06:56:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:42 crc kubenswrapper[4823]: I1216 06:56:42.946855 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:42 crc kubenswrapper[4823]: I1216 06:56:42.946920 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:42 crc kubenswrapper[4823]: I1216 06:56:42.946938 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:42 crc kubenswrapper[4823]: I1216 06:56:42.946966 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:42 crc kubenswrapper[4823]: I1216 06:56:42.946983 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:42Z","lastTransitionTime":"2025-12-16T06:56:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:43 crc kubenswrapper[4823]: I1216 06:56:43.050365 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:43 crc kubenswrapper[4823]: I1216 06:56:43.050447 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:43 crc kubenswrapper[4823]: I1216 06:56:43.050466 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:43 crc kubenswrapper[4823]: I1216 06:56:43.050498 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:43 crc kubenswrapper[4823]: I1216 06:56:43.050515 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:43Z","lastTransitionTime":"2025-12-16T06:56:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:43 crc kubenswrapper[4823]: I1216 06:56:43.153843 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:43 crc kubenswrapper[4823]: I1216 06:56:43.153904 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:43 crc kubenswrapper[4823]: I1216 06:56:43.153920 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:43 crc kubenswrapper[4823]: I1216 06:56:43.153941 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:43 crc kubenswrapper[4823]: I1216 06:56:43.153954 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:43Z","lastTransitionTime":"2025-12-16T06:56:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:43 crc kubenswrapper[4823]: I1216 06:56:43.256587 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:43 crc kubenswrapper[4823]: I1216 06:56:43.256648 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:43 crc kubenswrapper[4823]: I1216 06:56:43.256667 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:43 crc kubenswrapper[4823]: I1216 06:56:43.256692 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:43 crc kubenswrapper[4823]: I1216 06:56:43.256707 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:43Z","lastTransitionTime":"2025-12-16T06:56:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:43 crc kubenswrapper[4823]: I1216 06:56:43.359579 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:43 crc kubenswrapper[4823]: I1216 06:56:43.359621 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:43 crc kubenswrapper[4823]: I1216 06:56:43.359630 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:43 crc kubenswrapper[4823]: I1216 06:56:43.359648 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:43 crc kubenswrapper[4823]: I1216 06:56:43.359659 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:43Z","lastTransitionTime":"2025-12-16T06:56:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:43 crc kubenswrapper[4823]: I1216 06:56:43.462756 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:43 crc kubenswrapper[4823]: I1216 06:56:43.462818 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:43 crc kubenswrapper[4823]: I1216 06:56:43.462840 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:43 crc kubenswrapper[4823]: I1216 06:56:43.462868 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:43 crc kubenswrapper[4823]: I1216 06:56:43.462907 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:43Z","lastTransitionTime":"2025-12-16T06:56:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:43 crc kubenswrapper[4823]: I1216 06:56:43.565815 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:43 crc kubenswrapper[4823]: I1216 06:56:43.565900 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:43 crc kubenswrapper[4823]: I1216 06:56:43.565944 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:43 crc kubenswrapper[4823]: I1216 06:56:43.565979 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:43 crc kubenswrapper[4823]: I1216 06:56:43.566001 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:43Z","lastTransitionTime":"2025-12-16T06:56:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:56:43 crc kubenswrapper[4823]: I1216 06:56:43.579859 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:56:43 crc kubenswrapper[4823]: I1216 06:56:43.580451 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:56:43 crc kubenswrapper[4823]: I1216 06:56:43.580471 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:56:43 crc kubenswrapper[4823]: I1216 06:56:43.580486 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:56:43 crc kubenswrapper[4823]: I1216 06:56:43.580498 4823 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:56:43Z","lastTransitionTime":"2025-12-16T06:56:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:56:43 crc kubenswrapper[4823]: I1216 06:56:43.632488 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-lfpq5"] Dec 16 06:56:43 crc kubenswrapper[4823]: I1216 06:56:43.632979 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lfpq5" Dec 16 06:56:43 crc kubenswrapper[4823]: I1216 06:56:43.634699 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 16 06:56:43 crc kubenswrapper[4823]: I1216 06:56:43.634812 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 16 06:56:43 crc kubenswrapper[4823]: I1216 06:56:43.635227 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 16 06:56:43 crc kubenswrapper[4823]: I1216 06:56:43.635404 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 16 06:56:43 crc kubenswrapper[4823]: I1216 06:56:43.671960 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:56:43 crc kubenswrapper[4823]: I1216 06:56:43.672109 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:56:43 crc kubenswrapper[4823]: I1216 06:56:43.672170 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod 
\"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:56:43 crc kubenswrapper[4823]: E1216 06:56:43.672236 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:57:47.672212311 +0000 UTC m=+146.160778434 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:56:43 crc kubenswrapper[4823]: E1216 06:56:43.672267 4823 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 16 06:56:43 crc kubenswrapper[4823]: I1216 06:56:43.672276 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:56:43 crc kubenswrapper[4823]: E1216 06:56:43.672309 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-12-16 06:57:47.672299244 +0000 UTC m=+146.160865367 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 16 06:56:43 crc kubenswrapper[4823]: I1216 06:56:43.672327 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:56:43 crc kubenswrapper[4823]: E1216 06:56:43.672337 4823 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 16 06:56:43 crc kubenswrapper[4823]: E1216 06:56:43.672363 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-16 06:57:47.672356446 +0000 UTC m=+146.160922569 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 16 06:56:43 crc kubenswrapper[4823]: E1216 06:56:43.672415 4823 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 16 06:56:43 crc kubenswrapper[4823]: E1216 06:56:43.672416 4823 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 16 06:56:43 crc kubenswrapper[4823]: E1216 06:56:43.672431 4823 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 16 06:56:43 crc kubenswrapper[4823]: E1216 06:56:43.672428 4823 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 16 06:56:43 crc kubenswrapper[4823]: E1216 06:56:43.672442 4823 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 06:56:43 crc kubenswrapper[4823]: E1216 06:56:43.672450 4823 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 06:56:43 crc kubenswrapper[4823]: E1216 06:56:43.672469 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-16 06:57:47.672462508 +0000 UTC m=+146.161028631 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 06:56:43 crc kubenswrapper[4823]: E1216 06:56:43.672480 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-16 06:57:47.672474759 +0000 UTC m=+146.161040882 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 06:56:43 crc kubenswrapper[4823]: I1216 06:56:43.676800 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-hr8h5" podStartSLOduration=64.676785511 podStartE2EDuration="1m4.676785511s" podCreationTimestamp="2025-12-16 06:55:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:56:43.663563075 +0000 UTC m=+82.152129198" watchObservedRunningTime="2025-12-16 06:56:43.676785511 +0000 UTC m=+82.165351634" Dec 16 06:56:43 crc kubenswrapper[4823]: I1216 06:56:43.695470 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-n248g" podStartSLOduration=64.695442908 podStartE2EDuration="1m4.695442908s" podCreationTimestamp="2025-12-16 06:55:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:56:43.676712219 +0000 UTC m=+82.165278352" watchObservedRunningTime="2025-12-16 06:56:43.695442908 +0000 UTC m=+82.184009071" Dec 16 06:56:43 crc kubenswrapper[4823]: I1216 06:56:43.708538 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-964hc" podStartSLOduration=64.7085192 podStartE2EDuration="1m4.7085192s" podCreationTimestamp="2025-12-16 06:55:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 
06:56:43.695360895 +0000 UTC m=+82.183927028" watchObservedRunningTime="2025-12-16 06:56:43.7085192 +0000 UTC m=+82.197085323" Dec 16 06:56:43 crc kubenswrapper[4823]: I1216 06:56:43.708842 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=7.70883526 podStartE2EDuration="7.70883526s" podCreationTimestamp="2025-12-16 06:56:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:56:43.707651471 +0000 UTC m=+82.196217614" watchObservedRunningTime="2025-12-16 06:56:43.70883526 +0000 UTC m=+82.197401383" Dec 16 06:56:43 crc kubenswrapper[4823]: I1216 06:56:43.768150 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-bwcng" podStartSLOduration=64.768126919 podStartE2EDuration="1m4.768126919s" podCreationTimestamp="2025-12-16 06:55:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:56:43.768105798 +0000 UTC m=+82.256671921" watchObservedRunningTime="2025-12-16 06:56:43.768126919 +0000 UTC m=+82.256693042" Dec 16 06:56:43 crc kubenswrapper[4823]: I1216 06:56:43.770622 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:56:43 crc kubenswrapper[4823]: E1216 06:56:43.770747 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 06:56:43 crc kubenswrapper[4823]: I1216 06:56:43.770763 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:56:43 crc kubenswrapper[4823]: I1216 06:56:43.770827 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:56:43 crc kubenswrapper[4823]: E1216 06:56:43.770913 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 06:56:43 crc kubenswrapper[4823]: I1216 06:56:43.771056 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8mn7l" Dec 16 06:56:43 crc kubenswrapper[4823]: E1216 06:56:43.771254 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 06:56:43 crc kubenswrapper[4823]: E1216 06:56:43.771342 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8mn7l" podUID="1e7dd738-a9b3-455c-93e0-3f0dc7327817" Dec 16 06:56:43 crc kubenswrapper[4823]: I1216 06:56:43.772985 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4aaf25b3-fc80-407e-a7d5-c173436753e5-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-lfpq5\" (UID: \"4aaf25b3-fc80-407e-a7d5-c173436753e5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lfpq5" Dec 16 06:56:43 crc kubenswrapper[4823]: I1216 06:56:43.773147 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4aaf25b3-fc80-407e-a7d5-c173436753e5-service-ca\") pod \"cluster-version-operator-5c965bbfc6-lfpq5\" (UID: \"4aaf25b3-fc80-407e-a7d5-c173436753e5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lfpq5" Dec 16 06:56:43 crc kubenswrapper[4823]: I1216 06:56:43.773235 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/4aaf25b3-fc80-407e-a7d5-c173436753e5-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-lfpq5\" (UID: \"4aaf25b3-fc80-407e-a7d5-c173436753e5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lfpq5" Dec 16 06:56:43 crc kubenswrapper[4823]: I1216 06:56:43.773365 4823 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4aaf25b3-fc80-407e-a7d5-c173436753e5-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-lfpq5\" (UID: \"4aaf25b3-fc80-407e-a7d5-c173436753e5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lfpq5" Dec 16 06:56:43 crc kubenswrapper[4823]: I1216 06:56:43.773410 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/4aaf25b3-fc80-407e-a7d5-c173436753e5-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-lfpq5\" (UID: \"4aaf25b3-fc80-407e-a7d5-c173436753e5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lfpq5" Dec 16 06:56:43 crc kubenswrapper[4823]: I1216 06:56:43.786059 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=29.786039081 podStartE2EDuration="29.786039081s" podCreationTimestamp="2025-12-16 06:56:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:56:43.78573034 +0000 UTC m=+82.274296473" watchObservedRunningTime="2025-12-16 06:56:43.786039081 +0000 UTC m=+82.274605204" Dec 16 06:56:43 crc kubenswrapper[4823]: I1216 06:56:43.829606 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v5mgh" podStartSLOduration=63.829586699000004 podStartE2EDuration="1m3.829586699s" podCreationTimestamp="2025-12-16 06:55:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:56:43.82868541 +0000 UTC m=+82.317251533" watchObservedRunningTime="2025-12-16 06:56:43.829586699 
+0000 UTC m=+82.318152822" Dec 16 06:56:43 crc kubenswrapper[4823]: I1216 06:56:43.852961 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=60.852940541 podStartE2EDuration="1m0.852940541s" podCreationTimestamp="2025-12-16 06:55:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:56:43.852702793 +0000 UTC m=+82.341268926" watchObservedRunningTime="2025-12-16 06:56:43.852940541 +0000 UTC m=+82.341506674" Dec 16 06:56:43 crc kubenswrapper[4823]: I1216 06:56:43.873799 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4aaf25b3-fc80-407e-a7d5-c173436753e5-service-ca\") pod \"cluster-version-operator-5c965bbfc6-lfpq5\" (UID: \"4aaf25b3-fc80-407e-a7d5-c173436753e5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lfpq5" Dec 16 06:56:43 crc kubenswrapper[4823]: I1216 06:56:43.873839 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/4aaf25b3-fc80-407e-a7d5-c173436753e5-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-lfpq5\" (UID: \"4aaf25b3-fc80-407e-a7d5-c173436753e5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lfpq5" Dec 16 06:56:43 crc kubenswrapper[4823]: I1216 06:56:43.873864 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4aaf25b3-fc80-407e-a7d5-c173436753e5-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-lfpq5\" (UID: \"4aaf25b3-fc80-407e-a7d5-c173436753e5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lfpq5" Dec 16 06:56:43 crc kubenswrapper[4823]: I1216 06:56:43.873881 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/4aaf25b3-fc80-407e-a7d5-c173436753e5-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-lfpq5\" (UID: \"4aaf25b3-fc80-407e-a7d5-c173436753e5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lfpq5" Dec 16 06:56:43 crc kubenswrapper[4823]: I1216 06:56:43.873921 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4aaf25b3-fc80-407e-a7d5-c173436753e5-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-lfpq5\" (UID: \"4aaf25b3-fc80-407e-a7d5-c173436753e5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lfpq5" Dec 16 06:56:43 crc kubenswrapper[4823]: I1216 06:56:43.873957 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/4aaf25b3-fc80-407e-a7d5-c173436753e5-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-lfpq5\" (UID: \"4aaf25b3-fc80-407e-a7d5-c173436753e5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lfpq5" Dec 16 06:56:43 crc kubenswrapper[4823]: I1216 06:56:43.874268 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/4aaf25b3-fc80-407e-a7d5-c173436753e5-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-lfpq5\" (UID: \"4aaf25b3-fc80-407e-a7d5-c173436753e5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lfpq5" Dec 16 06:56:43 crc kubenswrapper[4823]: I1216 06:56:43.876454 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4aaf25b3-fc80-407e-a7d5-c173436753e5-service-ca\") pod \"cluster-version-operator-5c965bbfc6-lfpq5\" (UID: \"4aaf25b3-fc80-407e-a7d5-c173436753e5\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lfpq5" Dec 16 06:56:43 crc kubenswrapper[4823]: I1216 06:56:43.880998 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4aaf25b3-fc80-407e-a7d5-c173436753e5-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-lfpq5\" (UID: \"4aaf25b3-fc80-407e-a7d5-c173436753e5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lfpq5" Dec 16 06:56:43 crc kubenswrapper[4823]: I1216 06:56:43.895516 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4aaf25b3-fc80-407e-a7d5-c173436753e5-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-lfpq5\" (UID: \"4aaf25b3-fc80-407e-a7d5-c173436753e5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lfpq5" Dec 16 06:56:43 crc kubenswrapper[4823]: I1216 06:56:43.916303 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podStartSLOduration=64.916282163 podStartE2EDuration="1m4.916282163s" podCreationTimestamp="2025-12-16 06:55:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:56:43.901190945 +0000 UTC m=+82.389757098" watchObservedRunningTime="2025-12-16 06:56:43.916282163 +0000 UTC m=+82.404848286" Dec 16 06:56:43 crc kubenswrapper[4823]: I1216 06:56:43.932232 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=64.93221038 podStartE2EDuration="1m4.93221038s" podCreationTimestamp="2025-12-16 06:55:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:56:43.916282183 +0000 UTC m=+82.404848316" 
watchObservedRunningTime="2025-12-16 06:56:43.93221038 +0000 UTC m=+82.420776503" Dec 16 06:56:43 crc kubenswrapper[4823]: I1216 06:56:43.932363 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=62.932359545 podStartE2EDuration="1m2.932359545s" podCreationTimestamp="2025-12-16 06:55:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:56:43.931865168 +0000 UTC m=+82.420431291" watchObservedRunningTime="2025-12-16 06:56:43.932359545 +0000 UTC m=+82.420925668" Dec 16 06:56:43 crc kubenswrapper[4823]: I1216 06:56:43.948496 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lfpq5" Dec 16 06:56:43 crc kubenswrapper[4823]: W1216 06:56:43.963940 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4aaf25b3_fc80_407e_a7d5_c173436753e5.slice/crio-33f8e760bf8aff909356ced73e519599e0c098e8b93a2560def5be94b95450b1 WatchSource:0}: Error finding container 33f8e760bf8aff909356ced73e519599e0c098e8b93a2560def5be94b95450b1: Status 404 returned error can't find the container with id 33f8e760bf8aff909356ced73e519599e0c098e8b93a2560def5be94b95450b1 Dec 16 06:56:44 crc kubenswrapper[4823]: I1216 06:56:44.294069 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lfpq5" event={"ID":"4aaf25b3-fc80-407e-a7d5-c173436753e5","Type":"ContainerStarted","Data":"33f8e760bf8aff909356ced73e519599e0c098e8b93a2560def5be94b95450b1"} Dec 16 06:56:45 crc kubenswrapper[4823]: I1216 06:56:45.298544 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lfpq5" 
event={"ID":"4aaf25b3-fc80-407e-a7d5-c173436753e5","Type":"ContainerStarted","Data":"dfa2aa55e661b5503d9597cc149f539b0a67fe735ae6f472a9884dfc0957f71b"} Dec 16 06:56:45 crc kubenswrapper[4823]: I1216 06:56:45.314152 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lfpq5" podStartSLOduration=66.314132641 podStartE2EDuration="1m6.314132641s" podCreationTimestamp="2025-12-16 06:55:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:56:45.312055162 +0000 UTC m=+83.800621285" watchObservedRunningTime="2025-12-16 06:56:45.314132641 +0000 UTC m=+83.802698764" Dec 16 06:56:45 crc kubenswrapper[4823]: I1216 06:56:45.771568 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:56:45 crc kubenswrapper[4823]: I1216 06:56:45.771740 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8mn7l" Dec 16 06:56:45 crc kubenswrapper[4823]: E1216 06:56:45.771914 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 06:56:45 crc kubenswrapper[4823]: I1216 06:56:45.771938 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:56:45 crc kubenswrapper[4823]: I1216 06:56:45.772509 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:56:45 crc kubenswrapper[4823]: E1216 06:56:45.772694 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 06:56:45 crc kubenswrapper[4823]: E1216 06:56:45.772707 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 06:56:45 crc kubenswrapper[4823]: E1216 06:56:45.772936 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8mn7l" podUID="1e7dd738-a9b3-455c-93e0-3f0dc7327817" Dec 16 06:56:45 crc kubenswrapper[4823]: I1216 06:56:45.773168 4823 scope.go:117] "RemoveContainer" containerID="0002b85e32dbacab67cb93ff21bf26130cbf98d65e88b13cfc00dc301fd4a3ec" Dec 16 06:56:45 crc kubenswrapper[4823]: E1216 06:56:45.773434 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-zwjhk_openshift-ovn-kubernetes(08e48f89-7095-4ea2-afb5-759591c2b0d4)\"" pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" podUID="08e48f89-7095-4ea2-afb5-759591c2b0d4" Dec 16 06:56:47 crc kubenswrapper[4823]: I1216 06:56:47.771464 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:56:47 crc kubenswrapper[4823]: I1216 06:56:47.771477 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:56:47 crc kubenswrapper[4823]: E1216 06:56:47.772101 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 06:56:47 crc kubenswrapper[4823]: I1216 06:56:47.771605 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8mn7l" Dec 16 06:56:47 crc kubenswrapper[4823]: I1216 06:56:47.771496 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:56:47 crc kubenswrapper[4823]: E1216 06:56:47.772311 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8mn7l" podUID="1e7dd738-a9b3-455c-93e0-3f0dc7327817" Dec 16 06:56:47 crc kubenswrapper[4823]: E1216 06:56:47.772453 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 06:56:47 crc kubenswrapper[4823]: E1216 06:56:47.772711 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 06:56:49 crc kubenswrapper[4823]: I1216 06:56:49.771561 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:56:49 crc kubenswrapper[4823]: I1216 06:56:49.771623 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8mn7l" Dec 16 06:56:49 crc kubenswrapper[4823]: I1216 06:56:49.771771 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:56:49 crc kubenswrapper[4823]: E1216 06:56:49.771766 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 06:56:49 crc kubenswrapper[4823]: I1216 06:56:49.771812 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:56:49 crc kubenswrapper[4823]: E1216 06:56:49.771941 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 06:56:49 crc kubenswrapper[4823]: E1216 06:56:49.772064 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 06:56:49 crc kubenswrapper[4823]: E1216 06:56:49.772138 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8mn7l" podUID="1e7dd738-a9b3-455c-93e0-3f0dc7327817" Dec 16 06:56:51 crc kubenswrapper[4823]: I1216 06:56:51.771262 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:56:51 crc kubenswrapper[4823]: I1216 06:56:51.772121 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:56:51 crc kubenswrapper[4823]: E1216 06:56:51.775218 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 06:56:51 crc kubenswrapper[4823]: I1216 06:56:51.775618 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:56:51 crc kubenswrapper[4823]: E1216 06:56:51.776482 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 06:56:51 crc kubenswrapper[4823]: E1216 06:56:51.776785 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 06:56:51 crc kubenswrapper[4823]: I1216 06:56:51.778419 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8mn7l" Dec 16 06:56:51 crc kubenswrapper[4823]: E1216 06:56:51.778728 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8mn7l" podUID="1e7dd738-a9b3-455c-93e0-3f0dc7327817" Dec 16 06:56:53 crc kubenswrapper[4823]: I1216 06:56:53.770713 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:56:53 crc kubenswrapper[4823]: I1216 06:56:53.770833 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:56:53 crc kubenswrapper[4823]: I1216 06:56:53.770733 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:56:53 crc kubenswrapper[4823]: E1216 06:56:53.770994 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 06:56:53 crc kubenswrapper[4823]: I1216 06:56:53.770849 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8mn7l" Dec 16 06:56:53 crc kubenswrapper[4823]: E1216 06:56:53.770904 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 06:56:53 crc kubenswrapper[4823]: E1216 06:56:53.771739 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8mn7l" podUID="1e7dd738-a9b3-455c-93e0-3f0dc7327817" Dec 16 06:56:53 crc kubenswrapper[4823]: E1216 06:56:53.771154 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 06:56:55 crc kubenswrapper[4823]: I1216 06:56:55.771556 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:56:55 crc kubenswrapper[4823]: I1216 06:56:55.771683 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:56:55 crc kubenswrapper[4823]: E1216 06:56:55.771807 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 06:56:55 crc kubenswrapper[4823]: I1216 06:56:55.771845 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8mn7l" Dec 16 06:56:55 crc kubenswrapper[4823]: I1216 06:56:55.771888 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:56:55 crc kubenswrapper[4823]: E1216 06:56:55.772144 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8mn7l" podUID="1e7dd738-a9b3-455c-93e0-3f0dc7327817" Dec 16 06:56:55 crc kubenswrapper[4823]: E1216 06:56:55.772313 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 06:56:55 crc kubenswrapper[4823]: E1216 06:56:55.772317 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 06:56:57 crc kubenswrapper[4823]: I1216 06:56:57.648304 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1e7dd738-a9b3-455c-93e0-3f0dc7327817-metrics-certs\") pod \"network-metrics-daemon-8mn7l\" (UID: \"1e7dd738-a9b3-455c-93e0-3f0dc7327817\") " pod="openshift-multus/network-metrics-daemon-8mn7l" Dec 16 06:56:57 crc kubenswrapper[4823]: E1216 06:56:57.648618 4823 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 16 06:56:57 crc kubenswrapper[4823]: E1216 06:56:57.648733 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1e7dd738-a9b3-455c-93e0-3f0dc7327817-metrics-certs podName:1e7dd738-a9b3-455c-93e0-3f0dc7327817 nodeName:}" failed. No retries permitted until 2025-12-16 06:58:01.648700694 +0000 UTC m=+160.137266857 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1e7dd738-a9b3-455c-93e0-3f0dc7327817-metrics-certs") pod "network-metrics-daemon-8mn7l" (UID: "1e7dd738-a9b3-455c-93e0-3f0dc7327817") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 16 06:56:57 crc kubenswrapper[4823]: I1216 06:56:57.771153 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:56:57 crc kubenswrapper[4823]: I1216 06:56:57.771258 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:56:57 crc kubenswrapper[4823]: I1216 06:56:57.771169 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:56:57 crc kubenswrapper[4823]: E1216 06:56:57.771376 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 06:56:57 crc kubenswrapper[4823]: E1216 06:56:57.771482 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 06:56:57 crc kubenswrapper[4823]: I1216 06:56:57.771589 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8mn7l" Dec 16 06:56:57 crc kubenswrapper[4823]: E1216 06:56:57.771623 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 06:56:57 crc kubenswrapper[4823]: E1216 06:56:57.771692 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8mn7l" podUID="1e7dd738-a9b3-455c-93e0-3f0dc7327817" Dec 16 06:56:59 crc kubenswrapper[4823]: I1216 06:56:59.771348 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:56:59 crc kubenswrapper[4823]: I1216 06:56:59.771447 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8mn7l" Dec 16 06:56:59 crc kubenswrapper[4823]: I1216 06:56:59.771376 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:56:59 crc kubenswrapper[4823]: E1216 06:56:59.771631 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8mn7l" podUID="1e7dd738-a9b3-455c-93e0-3f0dc7327817" Dec 16 06:56:59 crc kubenswrapper[4823]: I1216 06:56:59.771674 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:56:59 crc kubenswrapper[4823]: E1216 06:56:59.771564 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 06:56:59 crc kubenswrapper[4823]: E1216 06:56:59.771875 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 06:56:59 crc kubenswrapper[4823]: E1216 06:56:59.771937 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 06:57:00 crc kubenswrapper[4823]: I1216 06:57:00.771714 4823 scope.go:117] "RemoveContainer" containerID="0002b85e32dbacab67cb93ff21bf26130cbf98d65e88b13cfc00dc301fd4a3ec" Dec 16 06:57:00 crc kubenswrapper[4823]: E1216 06:57:00.771928 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-zwjhk_openshift-ovn-kubernetes(08e48f89-7095-4ea2-afb5-759591c2b0d4)\"" pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" podUID="08e48f89-7095-4ea2-afb5-759591c2b0d4" Dec 16 06:57:01 crc kubenswrapper[4823]: I1216 06:57:01.772389 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:57:01 crc kubenswrapper[4823]: I1216 06:57:01.772511 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:57:01 crc kubenswrapper[4823]: I1216 06:57:01.772709 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8mn7l" Dec 16 06:57:01 crc kubenswrapper[4823]: I1216 06:57:01.772716 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:57:01 crc kubenswrapper[4823]: E1216 06:57:01.773802 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8mn7l" podUID="1e7dd738-a9b3-455c-93e0-3f0dc7327817" Dec 16 06:57:01 crc kubenswrapper[4823]: E1216 06:57:01.773864 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 06:57:01 crc kubenswrapper[4823]: E1216 06:57:01.773977 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 06:57:01 crc kubenswrapper[4823]: E1216 06:57:01.773980 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 06:57:03 crc kubenswrapper[4823]: I1216 06:57:03.771606 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:57:03 crc kubenswrapper[4823]: I1216 06:57:03.771664 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:57:03 crc kubenswrapper[4823]: E1216 06:57:03.771816 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 06:57:03 crc kubenswrapper[4823]: I1216 06:57:03.771894 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8mn7l" Dec 16 06:57:03 crc kubenswrapper[4823]: I1216 06:57:03.771940 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:57:03 crc kubenswrapper[4823]: E1216 06:57:03.771954 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 06:57:03 crc kubenswrapper[4823]: E1216 06:57:03.772093 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8mn7l" podUID="1e7dd738-a9b3-455c-93e0-3f0dc7327817" Dec 16 06:57:03 crc kubenswrapper[4823]: E1216 06:57:03.772111 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 06:57:05 crc kubenswrapper[4823]: I1216 06:57:05.771387 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:57:05 crc kubenswrapper[4823]: I1216 06:57:05.771446 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:57:05 crc kubenswrapper[4823]: I1216 06:57:05.771496 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:57:05 crc kubenswrapper[4823]: I1216 06:57:05.771384 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8mn7l" Dec 16 06:57:05 crc kubenswrapper[4823]: E1216 06:57:05.771799 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 06:57:05 crc kubenswrapper[4823]: E1216 06:57:05.771727 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 06:57:05 crc kubenswrapper[4823]: E1216 06:57:05.771884 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 06:57:05 crc kubenswrapper[4823]: E1216 06:57:05.771975 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8mn7l" podUID="1e7dd738-a9b3-455c-93e0-3f0dc7327817" Dec 16 06:57:07 crc kubenswrapper[4823]: I1216 06:57:07.771041 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:57:07 crc kubenswrapper[4823]: I1216 06:57:07.771072 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8mn7l" Dec 16 06:57:07 crc kubenswrapper[4823]: I1216 06:57:07.771166 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:57:07 crc kubenswrapper[4823]: E1216 06:57:07.771384 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 06:57:07 crc kubenswrapper[4823]: I1216 06:57:07.771458 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:57:07 crc kubenswrapper[4823]: E1216 06:57:07.771730 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8mn7l" podUID="1e7dd738-a9b3-455c-93e0-3f0dc7327817" Dec 16 06:57:07 crc kubenswrapper[4823]: E1216 06:57:07.772215 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 06:57:07 crc kubenswrapper[4823]: E1216 06:57:07.772550 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 06:57:09 crc kubenswrapper[4823]: I1216 06:57:09.771438 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:57:09 crc kubenswrapper[4823]: I1216 06:57:09.771483 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:57:09 crc kubenswrapper[4823]: I1216 06:57:09.771640 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8mn7l" Dec 16 06:57:09 crc kubenswrapper[4823]: I1216 06:57:09.771601 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:57:09 crc kubenswrapper[4823]: E1216 06:57:09.771718 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 06:57:09 crc kubenswrapper[4823]: E1216 06:57:09.771824 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 06:57:09 crc kubenswrapper[4823]: E1216 06:57:09.771934 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8mn7l" podUID="1e7dd738-a9b3-455c-93e0-3f0dc7327817" Dec 16 06:57:09 crc kubenswrapper[4823]: E1216 06:57:09.772330 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 06:57:11 crc kubenswrapper[4823]: I1216 06:57:11.770777 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8mn7l" Dec 16 06:57:11 crc kubenswrapper[4823]: I1216 06:57:11.770831 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:57:11 crc kubenswrapper[4823]: I1216 06:57:11.770876 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:57:11 crc kubenswrapper[4823]: E1216 06:57:11.772529 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8mn7l" podUID="1e7dd738-a9b3-455c-93e0-3f0dc7327817" Dec 16 06:57:11 crc kubenswrapper[4823]: I1216 06:57:11.772583 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:57:11 crc kubenswrapper[4823]: E1216 06:57:11.772995 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 06:57:11 crc kubenswrapper[4823]: E1216 06:57:11.773105 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 06:57:11 crc kubenswrapper[4823]: E1216 06:57:11.773234 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 06:57:11 crc kubenswrapper[4823]: I1216 06:57:11.773470 4823 scope.go:117] "RemoveContainer" containerID="0002b85e32dbacab67cb93ff21bf26130cbf98d65e88b13cfc00dc301fd4a3ec" Dec 16 06:57:11 crc kubenswrapper[4823]: E1216 06:57:11.773717 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-zwjhk_openshift-ovn-kubernetes(08e48f89-7095-4ea2-afb5-759591c2b0d4)\"" pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" podUID="08e48f89-7095-4ea2-afb5-759591c2b0d4" Dec 16 06:57:13 crc kubenswrapper[4823]: I1216 06:57:13.771429 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:57:13 crc kubenswrapper[4823]: I1216 06:57:13.771620 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:57:13 crc kubenswrapper[4823]: E1216 06:57:13.771650 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 06:57:13 crc kubenswrapper[4823]: I1216 06:57:13.771813 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:57:13 crc kubenswrapper[4823]: I1216 06:57:13.771837 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8mn7l" Dec 16 06:57:13 crc kubenswrapper[4823]: E1216 06:57:13.771956 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 06:57:13 crc kubenswrapper[4823]: E1216 06:57:13.772262 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8mn7l" podUID="1e7dd738-a9b3-455c-93e0-3f0dc7327817" Dec 16 06:57:13 crc kubenswrapper[4823]: E1216 06:57:13.772296 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 06:57:15 crc kubenswrapper[4823]: I1216 06:57:15.411365 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-n248g_1b377757-dbc6-4d9c-9656-3ff65d7d113a/kube-multus/1.log" Dec 16 06:57:15 crc kubenswrapper[4823]: I1216 06:57:15.411851 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-n248g_1b377757-dbc6-4d9c-9656-3ff65d7d113a/kube-multus/0.log" Dec 16 06:57:15 crc kubenswrapper[4823]: I1216 06:57:15.411894 4823 generic.go:334] "Generic (PLEG): container finished" podID="1b377757-dbc6-4d9c-9656-3ff65d7d113a" containerID="93cfb9ff0c194231a3f99afaf3fb482684347346a20315de6c6513dc5dde8966" exitCode=1 Dec 16 06:57:15 crc kubenswrapper[4823]: I1216 06:57:15.411927 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-n248g" event={"ID":"1b377757-dbc6-4d9c-9656-3ff65d7d113a","Type":"ContainerDied","Data":"93cfb9ff0c194231a3f99afaf3fb482684347346a20315de6c6513dc5dde8966"} Dec 16 06:57:15 crc kubenswrapper[4823]: I1216 06:57:15.411969 4823 scope.go:117] "RemoveContainer" containerID="78e6c48a7009b10b70f11bfc96f18839781aeccd794f9ce7a4074271cfb9becc" Dec 16 06:57:15 crc kubenswrapper[4823]: I1216 06:57:15.412293 4823 scope.go:117] "RemoveContainer" containerID="93cfb9ff0c194231a3f99afaf3fb482684347346a20315de6c6513dc5dde8966" Dec 16 06:57:15 crc kubenswrapper[4823]: E1216 06:57:15.412454 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-n248g_openshift-multus(1b377757-dbc6-4d9c-9656-3ff65d7d113a)\"" pod="openshift-multus/multus-n248g" podUID="1b377757-dbc6-4d9c-9656-3ff65d7d113a" Dec 16 06:57:15 crc kubenswrapper[4823]: I1216 06:57:15.771391 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:57:15 crc kubenswrapper[4823]: I1216 06:57:15.771401 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:57:15 crc kubenswrapper[4823]: I1216 06:57:15.771425 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:57:15 crc kubenswrapper[4823]: E1216 06:57:15.772247 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 06:57:15 crc kubenswrapper[4823]: I1216 06:57:15.771471 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8mn7l" Dec 16 06:57:15 crc kubenswrapper[4823]: E1216 06:57:15.772318 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 06:57:15 crc kubenswrapper[4823]: E1216 06:57:15.772363 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8mn7l" podUID="1e7dd738-a9b3-455c-93e0-3f0dc7327817" Dec 16 06:57:15 crc kubenswrapper[4823]: E1216 06:57:15.772005 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 06:57:16 crc kubenswrapper[4823]: I1216 06:57:16.417735 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-n248g_1b377757-dbc6-4d9c-9656-3ff65d7d113a/kube-multus/1.log" Dec 16 06:57:17 crc kubenswrapper[4823]: I1216 06:57:17.771258 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:57:17 crc kubenswrapper[4823]: I1216 06:57:17.771380 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:57:17 crc kubenswrapper[4823]: E1216 06:57:17.771498 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 06:57:17 crc kubenswrapper[4823]: I1216 06:57:17.771582 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8mn7l" Dec 16 06:57:17 crc kubenswrapper[4823]: I1216 06:57:17.771582 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:57:17 crc kubenswrapper[4823]: E1216 06:57:17.771847 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 06:57:17 crc kubenswrapper[4823]: E1216 06:57:17.771947 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 06:57:17 crc kubenswrapper[4823]: E1216 06:57:17.771995 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8mn7l" podUID="1e7dd738-a9b3-455c-93e0-3f0dc7327817" Dec 16 06:57:19 crc kubenswrapper[4823]: I1216 06:57:19.770781 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:57:19 crc kubenswrapper[4823]: I1216 06:57:19.770873 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8mn7l" Dec 16 06:57:19 crc kubenswrapper[4823]: E1216 06:57:19.771119 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 06:57:19 crc kubenswrapper[4823]: I1216 06:57:19.771161 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:57:19 crc kubenswrapper[4823]: E1216 06:57:19.771342 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8mn7l" podUID="1e7dd738-a9b3-455c-93e0-3f0dc7327817" Dec 16 06:57:19 crc kubenswrapper[4823]: E1216 06:57:19.771478 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 06:57:19 crc kubenswrapper[4823]: I1216 06:57:19.772168 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:57:19 crc kubenswrapper[4823]: E1216 06:57:19.772384 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 06:57:21 crc kubenswrapper[4823]: E1216 06:57:21.739780 4823 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Dec 16 06:57:21 crc kubenswrapper[4823]: I1216 06:57:21.770626 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8mn7l" Dec 16 06:57:21 crc kubenswrapper[4823]: I1216 06:57:21.770626 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:57:21 crc kubenswrapper[4823]: I1216 06:57:21.770667 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:57:21 crc kubenswrapper[4823]: I1216 06:57:21.770855 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:57:21 crc kubenswrapper[4823]: E1216 06:57:21.772332 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8mn7l" podUID="1e7dd738-a9b3-455c-93e0-3f0dc7327817" Dec 16 06:57:21 crc kubenswrapper[4823]: E1216 06:57:21.772394 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 06:57:21 crc kubenswrapper[4823]: E1216 06:57:21.772460 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 06:57:21 crc kubenswrapper[4823]: E1216 06:57:21.772589 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 06:57:21 crc kubenswrapper[4823]: E1216 06:57:21.889867 4823 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 16 06:57:23 crc kubenswrapper[4823]: I1216 06:57:23.770992 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8mn7l" Dec 16 06:57:23 crc kubenswrapper[4823]: I1216 06:57:23.771069 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:57:23 crc kubenswrapper[4823]: I1216 06:57:23.771069 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:57:23 crc kubenswrapper[4823]: E1216 06:57:23.771189 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8mn7l" podUID="1e7dd738-a9b3-455c-93e0-3f0dc7327817" Dec 16 06:57:23 crc kubenswrapper[4823]: I1216 06:57:23.771266 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:57:23 crc kubenswrapper[4823]: E1216 06:57:23.771275 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 06:57:23 crc kubenswrapper[4823]: E1216 06:57:23.771351 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 06:57:23 crc kubenswrapper[4823]: E1216 06:57:23.771437 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 06:57:25 crc kubenswrapper[4823]: I1216 06:57:25.771669 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:57:25 crc kubenswrapper[4823]: I1216 06:57:25.771738 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:57:25 crc kubenswrapper[4823]: I1216 06:57:25.771821 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8mn7l" Dec 16 06:57:25 crc kubenswrapper[4823]: E1216 06:57:25.771924 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 06:57:25 crc kubenswrapper[4823]: E1216 06:57:25.772007 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 06:57:25 crc kubenswrapper[4823]: E1216 06:57:25.772588 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8mn7l" podUID="1e7dd738-a9b3-455c-93e0-3f0dc7327817" Dec 16 06:57:25 crc kubenswrapper[4823]: I1216 06:57:25.772850 4823 scope.go:117] "RemoveContainer" containerID="0002b85e32dbacab67cb93ff21bf26130cbf98d65e88b13cfc00dc301fd4a3ec" Dec 16 06:57:25 crc kubenswrapper[4823]: I1216 06:57:25.773435 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:57:25 crc kubenswrapper[4823]: E1216 06:57:25.774291 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 06:57:26 crc kubenswrapper[4823]: I1216 06:57:26.460902 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zwjhk_08e48f89-7095-4ea2-afb5-759591c2b0d4/ovnkube-controller/3.log" Dec 16 06:57:26 crc kubenswrapper[4823]: I1216 06:57:26.464625 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" event={"ID":"08e48f89-7095-4ea2-afb5-759591c2b0d4","Type":"ContainerStarted","Data":"cc8a43f71797e49e9a777ee909b45ae50a30e76da5c4cd4c8ee62cd48a7917ee"} Dec 16 06:57:26 crc kubenswrapper[4823]: I1216 06:57:26.465577 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" Dec 16 06:57:26 crc kubenswrapper[4823]: I1216 06:57:26.501918 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" podStartSLOduration=107.501892999 podStartE2EDuration="1m47.501892999s" podCreationTimestamp="2025-12-16 06:55:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:57:26.500245535 +0000 UTC m=+124.988811668" watchObservedRunningTime="2025-12-16 06:57:26.501892999 +0000 UTC m=+124.990459122" Dec 16 06:57:26 crc kubenswrapper[4823]: I1216 06:57:26.819461 4823 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-multus/network-metrics-daemon-8mn7l"] Dec 16 06:57:26 crc kubenswrapper[4823]: I1216 06:57:26.819595 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8mn7l" Dec 16 06:57:26 crc kubenswrapper[4823]: E1216 06:57:26.819714 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8mn7l" podUID="1e7dd738-a9b3-455c-93e0-3f0dc7327817" Dec 16 06:57:26 crc kubenswrapper[4823]: E1216 06:57:26.891681 4823 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 16 06:57:27 crc kubenswrapper[4823]: I1216 06:57:27.771538 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:57:27 crc kubenswrapper[4823]: I1216 06:57:27.771583 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:57:27 crc kubenswrapper[4823]: I1216 06:57:27.771880 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:57:27 crc kubenswrapper[4823]: I1216 06:57:27.771967 4823 scope.go:117] "RemoveContainer" containerID="93cfb9ff0c194231a3f99afaf3fb482684347346a20315de6c6513dc5dde8966" Dec 16 06:57:27 crc kubenswrapper[4823]: E1216 06:57:27.772124 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 06:57:27 crc kubenswrapper[4823]: E1216 06:57:27.772389 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 06:57:27 crc kubenswrapper[4823]: E1216 06:57:27.772571 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 06:57:28 crc kubenswrapper[4823]: I1216 06:57:28.476134 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-n248g_1b377757-dbc6-4d9c-9656-3ff65d7d113a/kube-multus/1.log" Dec 16 06:57:28 crc kubenswrapper[4823]: I1216 06:57:28.476725 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-n248g" event={"ID":"1b377757-dbc6-4d9c-9656-3ff65d7d113a","Type":"ContainerStarted","Data":"90088e0c301e42cba0bff78d22324f5a77b817c3f63e352985dd26abb4706970"} Dec 16 06:57:28 crc kubenswrapper[4823]: I1216 06:57:28.770630 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8mn7l" Dec 16 06:57:28 crc kubenswrapper[4823]: E1216 06:57:28.770861 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8mn7l" podUID="1e7dd738-a9b3-455c-93e0-3f0dc7327817" Dec 16 06:57:29 crc kubenswrapper[4823]: I1216 06:57:29.771865 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:57:29 crc kubenswrapper[4823]: I1216 06:57:29.771960 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:57:29 crc kubenswrapper[4823]: I1216 06:57:29.771865 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:57:29 crc kubenswrapper[4823]: E1216 06:57:29.772166 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 06:57:29 crc kubenswrapper[4823]: E1216 06:57:29.772334 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 06:57:29 crc kubenswrapper[4823]: E1216 06:57:29.772515 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 06:57:30 crc kubenswrapper[4823]: I1216 06:57:30.771352 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8mn7l" Dec 16 06:57:30 crc kubenswrapper[4823]: E1216 06:57:30.771627 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8mn7l" podUID="1e7dd738-a9b3-455c-93e0-3f0dc7327817" Dec 16 06:57:31 crc kubenswrapper[4823]: I1216 06:57:31.770763 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:57:31 crc kubenswrapper[4823]: I1216 06:57:31.770813 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:57:31 crc kubenswrapper[4823]: E1216 06:57:31.772098 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 06:57:31 crc kubenswrapper[4823]: I1216 06:57:31.772287 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:57:31 crc kubenswrapper[4823]: E1216 06:57:31.772446 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 06:57:31 crc kubenswrapper[4823]: E1216 06:57:31.772593 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 06:57:32 crc kubenswrapper[4823]: I1216 06:57:32.771211 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8mn7l" Dec 16 06:57:32 crc kubenswrapper[4823]: I1216 06:57:32.774637 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 16 06:57:32 crc kubenswrapper[4823]: I1216 06:57:32.774638 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 16 06:57:33 crc kubenswrapper[4823]: I1216 06:57:33.771331 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:57:33 crc kubenswrapper[4823]: I1216 06:57:33.771383 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:57:33 crc kubenswrapper[4823]: I1216 06:57:33.771792 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:57:33 crc kubenswrapper[4823]: I1216 06:57:33.774114 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 16 06:57:33 crc kubenswrapper[4823]: I1216 06:57:33.774209 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 16 06:57:33 crc kubenswrapper[4823]: I1216 06:57:33.774216 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 16 06:57:33 crc kubenswrapper[4823]: I1216 06:57:33.774690 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.148172 4823 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.210073 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-vmqj6"] Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.221690 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-5rs6p"] Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.222065 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-bh4xp"] Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.222377 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-plnfh"] Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.222623 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-s6mwx"] Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.222684 4823 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-vmqj6" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.223153 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-t9ztt"] Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.223430 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-dmbvr"] Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.223894 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dmbvr" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.224573 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5rs6p" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.224902 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-bh4xp" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.225212 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-plnfh" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.225507 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-s6mwx" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.225776 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-t9ztt" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.247826 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.247931 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.248187 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.248277 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.248341 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.248399 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.248540 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.248660 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.248683 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.248795 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 16 
06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.248903 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.248964 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.249218 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.249309 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.249380 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.249439 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.249523 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.249668 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.249885 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.249223 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.250147 4823 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-route-controller-manager"/"client-ca" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.250296 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.250471 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.250936 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.251189 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.251311 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.252215 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.255923 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.256426 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.266265 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.266738 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.266955 4823 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.267078 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.267181 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.267318 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.267473 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.267659 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.267900 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.271867 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.272277 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.272305 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.272461 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.272481 4823 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.272585 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.272655 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.272716 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.276370 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.277718 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.278381 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.284411 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-ss5zz"] Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.288574 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-t7pwj"] Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.289094 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-t7pwj" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.289513 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-ss5zz" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.291661 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.294963 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-jjrb4"] Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.295786 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.296242 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.297183 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-dpsl6"] Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.297248 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.297526 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.297681 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.297791 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.297936 4823 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-dns-operator"/"metrics-tls" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.297984 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-fxqpl"] Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.298090 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.298458 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.298647 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5dtcg"] Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.298657 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jjrb4" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.298826 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fxqpl" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.298835 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-dpsl6" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.299705 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-k2ljf"] Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.299812 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5dtcg" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.300520 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5dbcj"] Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.300777 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-k2ljf" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.301230 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.301827 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-c4c5h"] Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.305355 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-c4c5h" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.307302 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5dbcj" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.313316 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.314064 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.313696 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.313729 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.314577 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.313748 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.313790 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.313805 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.313901 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.314453 4823 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.315317 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.315374 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.315436 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.315570 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.315638 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.315702 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.315754 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.315773 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.315799 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.315863 4823 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-image-registry"/"image-registry-operator-tls" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.320171 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8xpl2"] Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.320886 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8xpl2" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.321116 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.344343 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.346406 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-jsk55"] Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.350223 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-rtddw"] Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.350673 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-jsk55" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.366950 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d29bdc9-59b6-460e-a725-2f731de32ec3-config\") pod \"authentication-operator-69f744f599-ss5zz\" (UID: \"6d29bdc9-59b6-460e-a725-2f731de32ec3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ss5zz" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.367006 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2dea4f36-2ae5-4363-a65c-0b7346f02661-serving-cert\") pod \"route-controller-manager-6576b87f9c-plnfh\" (UID: \"2dea4f36-2ae5-4363-a65c-0b7346f02661\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-plnfh" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.367094 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6d29bdc9-59b6-460e-a725-2f731de32ec3-service-ca-bundle\") pod \"authentication-operator-69f744f599-ss5zz\" (UID: \"6d29bdc9-59b6-460e-a725-2f731de32ec3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ss5zz" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.367132 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8aaa63b4-9b41-442f-b9ea-672885a486bd-client-ca\") pod \"controller-manager-879f6c89f-vmqj6\" (UID: \"8aaa63b4-9b41-442f-b9ea-672885a486bd\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vmqj6" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.367162 4823 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/150075c3-d2eb-4c87-8b80-cd1d063e7d4c-config\") pod \"machine-api-operator-5694c8668f-bh4xp\" (UID: \"150075c3-d2eb-4c87-8b80-cd1d063e7d4c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-bh4xp" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.367206 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdw6v\" (UniqueName: \"kubernetes.io/projected/b7af89fb-e572-4c0d-a269-a65d03ac6e0e-kube-api-access-qdw6v\") pod \"apiserver-76f77b778f-s6mwx\" (UID: \"b7af89fb-e572-4c0d-a269-a65d03ac6e0e\") " pod="openshift-apiserver/apiserver-76f77b778f-s6mwx" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.367242 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rs7zv\" (UniqueName: \"kubernetes.io/projected/142b25e7-9ad6-4a22-8f1c-8bd280329db9-kube-api-access-rs7zv\") pod \"machine-approver-56656f9798-dmbvr\" (UID: \"142b25e7-9ad6-4a22-8f1c-8bd280329db9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dmbvr" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.367269 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dxv6\" (UniqueName: \"kubernetes.io/projected/593f793a-bb15-4f83-8454-e3a1ced41667-kube-api-access-5dxv6\") pod \"cluster-samples-operator-665b6dd947-dpsl6\" (UID: \"593f793a-bb15-4f83-8454-e3a1ced41667\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-dpsl6" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.367292 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6d29bdc9-59b6-460e-a725-2f731de32ec3-serving-cert\") pod 
\"authentication-operator-69f744f599-ss5zz\" (UID: \"6d29bdc9-59b6-460e-a725-2f731de32ec3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ss5zz" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.367319 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpf95\" (UniqueName: \"kubernetes.io/projected/6d29bdc9-59b6-460e-a725-2f731de32ec3-kube-api-access-fpf95\") pod \"authentication-operator-69f744f599-ss5zz\" (UID: \"6d29bdc9-59b6-460e-a725-2f731de32ec3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ss5zz" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.367342 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/142b25e7-9ad6-4a22-8f1c-8bd280329db9-auth-proxy-config\") pod \"machine-approver-56656f9798-dmbvr\" (UID: \"142b25e7-9ad6-4a22-8f1c-8bd280329db9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dmbvr" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.367363 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ca0a37ba-9a04-4d90-8ee8-6797791303a4-encryption-config\") pod \"apiserver-7bbb656c7d-5rs6p\" (UID: \"ca0a37ba-9a04-4d90-8ee8-6797791303a4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5rs6p" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.367403 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/b7af89fb-e572-4c0d-a269-a65d03ac6e0e-audit\") pod \"apiserver-76f77b778f-s6mwx\" (UID: \"b7af89fb-e572-4c0d-a269-a65d03ac6e0e\") " pod="openshift-apiserver/apiserver-76f77b778f-s6mwx" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.367426 4823 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/b7af89fb-e572-4c0d-a269-a65d03ac6e0e-image-import-ca\") pod \"apiserver-76f77b778f-s6mwx\" (UID: \"b7af89fb-e572-4c0d-a269-a65d03ac6e0e\") " pod="openshift-apiserver/apiserver-76f77b778f-s6mwx" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.367444 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8aaa63b4-9b41-442f-b9ea-672885a486bd-config\") pod \"controller-manager-879f6c89f-vmqj6\" (UID: \"8aaa63b4-9b41-442f-b9ea-672885a486bd\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vmqj6" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.367464 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8aaa63b4-9b41-442f-b9ea-672885a486bd-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-vmqj6\" (UID: \"8aaa63b4-9b41-442f-b9ea-672885a486bd\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vmqj6" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.367485 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ca0a37ba-9a04-4d90-8ee8-6797791303a4-etcd-client\") pod \"apiserver-7bbb656c7d-5rs6p\" (UID: \"ca0a37ba-9a04-4d90-8ee8-6797791303a4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5rs6p" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.367510 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7hcz\" (UniqueName: \"kubernetes.io/projected/dfc57533-6490-47c8-8188-ad895f04811c-kube-api-access-k7hcz\") pod \"migrator-59844c95c7-jjrb4\" (UID: \"dfc57533-6490-47c8-8188-ad895f04811c\") " 
pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jjrb4" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.367541 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b7af89fb-e572-4c0d-a269-a65d03ac6e0e-etcd-client\") pod \"apiserver-76f77b778f-s6mwx\" (UID: \"b7af89fb-e572-4c0d-a269-a65d03ac6e0e\") " pod="openshift-apiserver/apiserver-76f77b778f-s6mwx" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.367578 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/410cd31d-8e32-4101-b933-2c7bd673c17f-auth-proxy-config\") pod \"machine-config-operator-74547568cd-fxqpl\" (UID: \"410cd31d-8e32-4101-b933-2c7bd673c17f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fxqpl" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.367596 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5v27\" (UniqueName: \"kubernetes.io/projected/410cd31d-8e32-4101-b933-2c7bd673c17f-kube-api-access-b5v27\") pod \"machine-config-operator-74547568cd-fxqpl\" (UID: \"410cd31d-8e32-4101-b933-2c7bd673c17f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fxqpl" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.367619 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2dea4f36-2ae5-4363-a65c-0b7346f02661-client-ca\") pod \"route-controller-manager-6576b87f9c-plnfh\" (UID: \"2dea4f36-2ae5-4363-a65c-0b7346f02661\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-plnfh" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.367641 4823 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfcbs\" (UniqueName: \"kubernetes.io/projected/3ec12da7-6ed9-4798-ba75-1b0c160dd126-kube-api-access-xfcbs\") pod \"openshift-apiserver-operator-796bbdcf4f-t9ztt\" (UID: \"3ec12da7-6ed9-4798-ba75-1b0c160dd126\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-t9ztt" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.367661 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ca0a37ba-9a04-4d90-8ee8-6797791303a4-audit-policies\") pod \"apiserver-7bbb656c7d-5rs6p\" (UID: \"ca0a37ba-9a04-4d90-8ee8-6797791303a4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5rs6p" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.367682 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22nz8\" (UniqueName: \"kubernetes.io/projected/ca0a37ba-9a04-4d90-8ee8-6797791303a4-kube-api-access-22nz8\") pod \"apiserver-7bbb656c7d-5rs6p\" (UID: \"ca0a37ba-9a04-4d90-8ee8-6797791303a4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5rs6p" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.367698 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/142b25e7-9ad6-4a22-8f1c-8bd280329db9-config\") pod \"machine-approver-56656f9798-dmbvr\" (UID: \"142b25e7-9ad6-4a22-8f1c-8bd280329db9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dmbvr" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.367718 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cgdjh\" (UniqueName: \"kubernetes.io/projected/b76da243-83e6-4503-be17-ef252bff5a98-kube-api-access-cgdjh\") pod \"dns-operator-744455d44c-t7pwj\" (UID: 
\"b76da243-83e6-4503-be17-ef252bff5a98\") " pod="openshift-dns-operator/dns-operator-744455d44c-t7pwj" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.367736 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ca0a37ba-9a04-4d90-8ee8-6797791303a4-audit-dir\") pod \"apiserver-7bbb656c7d-5rs6p\" (UID: \"ca0a37ba-9a04-4d90-8ee8-6797791303a4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5rs6p" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.367754 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6d29bdc9-59b6-460e-a725-2f731de32ec3-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-ss5zz\" (UID: \"6d29bdc9-59b6-460e-a725-2f731de32ec3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ss5zz" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.367771 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ca0a37ba-9a04-4d90-8ee8-6797791303a4-serving-cert\") pod \"apiserver-7bbb656c7d-5rs6p\" (UID: \"ca0a37ba-9a04-4d90-8ee8-6797791303a4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5rs6p" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.367792 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b7af89fb-e572-4c0d-a269-a65d03ac6e0e-serving-cert\") pod \"apiserver-76f77b778f-s6mwx\" (UID: \"b7af89fb-e572-4c0d-a269-a65d03ac6e0e\") " pod="openshift-apiserver/apiserver-76f77b778f-s6mwx" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.367822 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/b7af89fb-e572-4c0d-a269-a65d03ac6e0e-encryption-config\") pod \"apiserver-76f77b778f-s6mwx\" (UID: \"b7af89fb-e572-4c0d-a269-a65d03ac6e0e\") " pod="openshift-apiserver/apiserver-76f77b778f-s6mwx" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.367855 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/410cd31d-8e32-4101-b933-2c7bd673c17f-proxy-tls\") pod \"machine-config-operator-74547568cd-fxqpl\" (UID: \"410cd31d-8e32-4101-b933-2c7bd673c17f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fxqpl" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.367872 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b76da243-83e6-4503-be17-ef252bff5a98-metrics-tls\") pod \"dns-operator-744455d44c-t7pwj\" (UID: \"b76da243-83e6-4503-be17-ef252bff5a98\") " pod="openshift-dns-operator/dns-operator-744455d44c-t7pwj" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.367888 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ec12da7-6ed9-4798-ba75-1b0c160dd126-config\") pod \"openshift-apiserver-operator-796bbdcf4f-t9ztt\" (UID: \"3ec12da7-6ed9-4798-ba75-1b0c160dd126\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-t9ztt" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.367906 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/142b25e7-9ad6-4a22-8f1c-8bd280329db9-machine-approver-tls\") pod \"machine-approver-56656f9798-dmbvr\" (UID: \"142b25e7-9ad6-4a22-8f1c-8bd280329db9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dmbvr" Dec 16 06:57:34 crc 
kubenswrapper[4823]: I1216 06:57:34.367923 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ca0a37ba-9a04-4d90-8ee8-6797791303a4-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-5rs6p\" (UID: \"ca0a37ba-9a04-4d90-8ee8-6797791303a4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5rs6p" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.367941 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8aaa63b4-9b41-442f-b9ea-672885a486bd-serving-cert\") pod \"controller-manager-879f6c89f-vmqj6\" (UID: \"8aaa63b4-9b41-442f-b9ea-672885a486bd\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vmqj6" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.367966 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gbws\" (UniqueName: \"kubernetes.io/projected/150075c3-d2eb-4c87-8b80-cd1d063e7d4c-kube-api-access-7gbws\") pod \"machine-api-operator-5694c8668f-bh4xp\" (UID: \"150075c3-d2eb-4c87-8b80-cd1d063e7d4c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-bh4xp" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.367983 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b7af89fb-e572-4c0d-a269-a65d03ac6e0e-trusted-ca-bundle\") pod \"apiserver-76f77b778f-s6mwx\" (UID: \"b7af89fb-e572-4c0d-a269-a65d03ac6e0e\") " pod="openshift-apiserver/apiserver-76f77b778f-s6mwx" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.368020 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/150075c3-d2eb-4c87-8b80-cd1d063e7d4c-images\") pod 
\"machine-api-operator-5694c8668f-bh4xp\" (UID: \"150075c3-d2eb-4c87-8b80-cd1d063e7d4c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-bh4xp" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.368139 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7af89fb-e572-4c0d-a269-a65d03ac6e0e-config\") pod \"apiserver-76f77b778f-s6mwx\" (UID: \"b7af89fb-e572-4c0d-a269-a65d03ac6e0e\") " pod="openshift-apiserver/apiserver-76f77b778f-s6mwx" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.368163 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b7af89fb-e572-4c0d-a269-a65d03ac6e0e-audit-dir\") pod \"apiserver-76f77b778f-s6mwx\" (UID: \"b7af89fb-e572-4c0d-a269-a65d03ac6e0e\") " pod="openshift-apiserver/apiserver-76f77b778f-s6mwx" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.368182 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b7af89fb-e572-4c0d-a269-a65d03ac6e0e-etcd-serving-ca\") pod \"apiserver-76f77b778f-s6mwx\" (UID: \"b7af89fb-e572-4c0d-a269-a65d03ac6e0e\") " pod="openshift-apiserver/apiserver-76f77b778f-s6mwx" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.368204 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2dea4f36-2ae5-4363-a65c-0b7346f02661-config\") pod \"route-controller-manager-6576b87f9c-plnfh\" (UID: \"2dea4f36-2ae5-4363-a65c-0b7346f02661\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-plnfh" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.368224 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" 
(UniqueName: \"kubernetes.io/configmap/410cd31d-8e32-4101-b933-2c7bd673c17f-images\") pod \"machine-config-operator-74547568cd-fxqpl\" (UID: \"410cd31d-8e32-4101-b933-2c7bd673c17f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fxqpl" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.368256 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3ec12da7-6ed9-4798-ba75-1b0c160dd126-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-t9ztt\" (UID: \"3ec12da7-6ed9-4798-ba75-1b0c160dd126\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-t9ztt" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.368287 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/593f793a-bb15-4f83-8454-e3a1ced41667-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-dpsl6\" (UID: \"593f793a-bb15-4f83-8454-e3a1ced41667\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-dpsl6" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.368318 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ws45\" (UniqueName: \"kubernetes.io/projected/2dea4f36-2ae5-4363-a65c-0b7346f02661-kube-api-access-4ws45\") pod \"route-controller-manager-6576b87f9c-plnfh\" (UID: \"2dea4f36-2ae5-4363-a65c-0b7346f02661\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-plnfh" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.368343 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/b7af89fb-e572-4c0d-a269-a65d03ac6e0e-node-pullsecrets\") pod \"apiserver-76f77b778f-s6mwx\" (UID: 
\"b7af89fb-e572-4c0d-a269-a65d03ac6e0e\") " pod="openshift-apiserver/apiserver-76f77b778f-s6mwx" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.368364 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9w6b\" (UniqueName: \"kubernetes.io/projected/8aaa63b4-9b41-442f-b9ea-672885a486bd-kube-api-access-h9w6b\") pod \"controller-manager-879f6c89f-vmqj6\" (UID: \"8aaa63b4-9b41-442f-b9ea-672885a486bd\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vmqj6" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.368392 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca0a37ba-9a04-4d90-8ee8-6797791303a4-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-5rs6p\" (UID: \"ca0a37ba-9a04-4d90-8ee8-6797791303a4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5rs6p" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.368415 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/150075c3-d2eb-4c87-8b80-cd1d063e7d4c-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-bh4xp\" (UID: \"150075c3-d2eb-4c87-8b80-cd1d063e7d4c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-bh4xp" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.389457 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bdlsz"] Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.389743 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-s6mwx"] Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.389762 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-machine-api/machine-api-operator-5694c8668f-bh4xp"] Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.389773 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-rmx2d"] Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.390053 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l9qfp"] Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.400127 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rtddw" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.400495 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bdlsz" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.400711 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-rmx2d" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.400975 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l9qfp" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.401208 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qqqjd"] Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.401848 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-jmp8w"] Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.402234 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-jmp8w" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.402797 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qqqjd" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.402889 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.403034 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.403156 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.403178 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-plnfh"] Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.403277 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.403669 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.403948 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.403951 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.404167 4823 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.404855 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-g4ltp"] Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.406557 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.409325 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-bx552"] Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.410175 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-g4ltp" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.412736 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-bx552" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.413301 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.413353 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-9lh5d"] Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.414986 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-9lh5d" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.415553 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.415672 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2glj6"] Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.416399 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2glj6" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.417108 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.420822 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-rmzm5"] Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.421477 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-rmzm5" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.422493 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-mgbxj"] Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.427493 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-vnw5c"] Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.428437 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xkdq6"] Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.428886 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-lpkx6"] Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.429464 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-lpkx6" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.429549 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.429791 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-mgbxj" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.429819 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.430150 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-vnw5c" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.430233 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xkdq6" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.434592 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-4wgv6"] Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.435132 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-6xfbm"] Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.435217 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-4wgv6" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.436169 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-thj57"] Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.436245 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-6xfbm" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.437137 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-llvdl"] Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.437721 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-llvdl" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.437970 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d9xfk"] Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.438554 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.439206 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.439322 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.445532 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.445758 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.445885 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.446016 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.446253 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 16 06:57:34 crc kubenswrapper[4823]: 
I1216 06:57:34.446291 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.446585 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.459643 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.460212 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431125-j4w8x"] Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.460239 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-thj57" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.460995 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-vmqj6"] Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.461016 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-t9ztt"] Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.461045 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-5rs6p"] Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.461056 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5dbcj"] Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.461068 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-jsk55"] Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.461077 4823 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-fxqpl"] Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.461048 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d9xfk" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.461346 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431125-j4w8x" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.468914 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-k2ljf"] Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.469717 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljnjf\" (UniqueName: \"kubernetes.io/projected/37c33a89-18c7-457a-a8ce-85c7721719fc-kube-api-access-ljnjf\") pod \"cluster-image-registry-operator-dc59b4c8b-c4c5h\" (UID: \"37c33a89-18c7-457a-a8ce-85c7721719fc\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-c4c5h" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.469781 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9a66a086-81cf-498e-aada-d06b41019b1f-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-bdlsz\" (UID: \"9a66a086-81cf-498e-aada-d06b41019b1f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bdlsz" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.469812 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/150075c3-d2eb-4c87-8b80-cd1d063e7d4c-config\") pod \"machine-api-operator-5694c8668f-bh4xp\" (UID: 
\"150075c3-d2eb-4c87-8b80-cd1d063e7d4c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-bh4xp" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.469843 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8aaa63b4-9b41-442f-b9ea-672885a486bd-client-ca\") pod \"controller-manager-879f6c89f-vmqj6\" (UID: \"8aaa63b4-9b41-442f-b9ea-672885a486bd\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vmqj6" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.469863 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdw6v\" (UniqueName: \"kubernetes.io/projected/b7af89fb-e572-4c0d-a269-a65d03ac6e0e-kube-api-access-qdw6v\") pod \"apiserver-76f77b778f-s6mwx\" (UID: \"b7af89fb-e572-4c0d-a269-a65d03ac6e0e\") " pod="openshift-apiserver/apiserver-76f77b778f-s6mwx" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.469879 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rs7zv\" (UniqueName: \"kubernetes.io/projected/142b25e7-9ad6-4a22-8f1c-8bd280329db9-kube-api-access-rs7zv\") pod \"machine-approver-56656f9798-dmbvr\" (UID: \"142b25e7-9ad6-4a22-8f1c-8bd280329db9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dmbvr" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.469896 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dxv6\" (UniqueName: \"kubernetes.io/projected/593f793a-bb15-4f83-8454-e3a1ced41667-kube-api-access-5dxv6\") pod \"cluster-samples-operator-665b6dd947-dpsl6\" (UID: \"593f793a-bb15-4f83-8454-e3a1ced41667\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-dpsl6" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.469914 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/6d29bdc9-59b6-460e-a725-2f731de32ec3-serving-cert\") pod \"authentication-operator-69f744f599-ss5zz\" (UID: \"6d29bdc9-59b6-460e-a725-2f731de32ec3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ss5zz" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.469935 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fpf95\" (UniqueName: \"kubernetes.io/projected/6d29bdc9-59b6-460e-a725-2f731de32ec3-kube-api-access-fpf95\") pod \"authentication-operator-69f744f599-ss5zz\" (UID: \"6d29bdc9-59b6-460e-a725-2f731de32ec3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ss5zz" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.469952 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a66a086-81cf-498e-aada-d06b41019b1f-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-bdlsz\" (UID: \"9a66a086-81cf-498e-aada-d06b41019b1f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bdlsz" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.469975 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xs46m\" (UniqueName: \"kubernetes.io/projected/979cabf2-8f74-4c88-92e8-baaffd74d816-kube-api-access-xs46m\") pod \"openshift-config-operator-7777fb866f-jsk55\" (UID: \"979cabf2-8f74-4c88-92e8-baaffd74d816\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-jsk55" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.469993 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/b7af89fb-e572-4c0d-a269-a65d03ac6e0e-audit\") pod \"apiserver-76f77b778f-s6mwx\" (UID: \"b7af89fb-e572-4c0d-a269-a65d03ac6e0e\") " 
pod="openshift-apiserver/apiserver-76f77b778f-s6mwx" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.470011 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/b7af89fb-e572-4c0d-a269-a65d03ac6e0e-image-import-ca\") pod \"apiserver-76f77b778f-s6mwx\" (UID: \"b7af89fb-e572-4c0d-a269-a65d03ac6e0e\") " pod="openshift-apiserver/apiserver-76f77b778f-s6mwx" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.470044 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/37c33a89-18c7-457a-a8ce-85c7721719fc-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-c4c5h\" (UID: \"37c33a89-18c7-457a-a8ce-85c7721719fc\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-c4c5h" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.470069 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/142b25e7-9ad6-4a22-8f1c-8bd280329db9-auth-proxy-config\") pod \"machine-approver-56656f9798-dmbvr\" (UID: \"142b25e7-9ad6-4a22-8f1c-8bd280329db9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dmbvr" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.470087 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ca0a37ba-9a04-4d90-8ee8-6797791303a4-encryption-config\") pod \"apiserver-7bbb656c7d-5rs6p\" (UID: \"ca0a37ba-9a04-4d90-8ee8-6797791303a4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5rs6p" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.470111 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/37c33a89-18c7-457a-a8ce-85c7721719fc-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-c4c5h\" (UID: \"37c33a89-18c7-457a-a8ce-85c7721719fc\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-c4c5h" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.470131 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mjv2\" (UniqueName: \"kubernetes.io/projected/dc6756f9-a794-4667-8143-bd14bedd0cc3-kube-api-access-9mjv2\") pod \"machine-config-controller-84d6567774-rtddw\" (UID: \"dc6756f9-a794-4667-8143-bd14bedd0cc3\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rtddw" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.470160 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed1826b1-19af-4d50-b293-5c22c03fbfe7-config\") pod \"kube-controller-manager-operator-78b949d7b-l9qfp\" (UID: \"ed1826b1-19af-4d50-b293-5c22c03fbfe7\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l9qfp" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.470185 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8aaa63b4-9b41-442f-b9ea-672885a486bd-config\") pod \"controller-manager-879f6c89f-vmqj6\" (UID: \"8aaa63b4-9b41-442f-b9ea-672885a486bd\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vmqj6" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.470211 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbrsc\" (UniqueName: \"kubernetes.io/projected/9a66a086-81cf-498e-aada-d06b41019b1f-kube-api-access-wbrsc\") pod \"kube-storage-version-migrator-operator-b67b599dd-bdlsz\" (UID: 
\"9a66a086-81cf-498e-aada-d06b41019b1f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bdlsz" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.470233 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ed1826b1-19af-4d50-b293-5c22c03fbfe7-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-l9qfp\" (UID: \"ed1826b1-19af-4d50-b293-5c22c03fbfe7\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l9qfp" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.470262 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7hcz\" (UniqueName: \"kubernetes.io/projected/dfc57533-6490-47c8-8188-ad895f04811c-kube-api-access-k7hcz\") pod \"migrator-59844c95c7-jjrb4\" (UID: \"dfc57533-6490-47c8-8188-ad895f04811c\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jjrb4" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.470281 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/8428636f-bcf3-4698-b121-1649ff94810f-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-5dbcj\" (UID: \"8428636f-bcf3-4698-b121-1649ff94810f\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5dbcj" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.470310 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8aaa63b4-9b41-442f-b9ea-672885a486bd-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-vmqj6\" (UID: \"8aaa63b4-9b41-442f-b9ea-672885a486bd\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vmqj6" Dec 16 06:57:34 crc 
kubenswrapper[4823]: I1216 06:57:34.470327 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ca0a37ba-9a04-4d90-8ee8-6797791303a4-etcd-client\") pod \"apiserver-7bbb656c7d-5rs6p\" (UID: \"ca0a37ba-9a04-4d90-8ee8-6797791303a4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5rs6p" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.470347 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b7af89fb-e572-4c0d-a269-a65d03ac6e0e-etcd-client\") pod \"apiserver-76f77b778f-s6mwx\" (UID: \"b7af89fb-e572-4c0d-a269-a65d03ac6e0e\") " pod="openshift-apiserver/apiserver-76f77b778f-s6mwx" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.470365 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xfcbs\" (UniqueName: \"kubernetes.io/projected/3ec12da7-6ed9-4798-ba75-1b0c160dd126-kube-api-access-xfcbs\") pod \"openshift-apiserver-operator-796bbdcf4f-t9ztt\" (UID: \"3ec12da7-6ed9-4798-ba75-1b0c160dd126\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-t9ztt" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.470385 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/410cd31d-8e32-4101-b933-2c7bd673c17f-auth-proxy-config\") pod \"machine-config-operator-74547568cd-fxqpl\" (UID: \"410cd31d-8e32-4101-b933-2c7bd673c17f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fxqpl" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.470403 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5v27\" (UniqueName: \"kubernetes.io/projected/410cd31d-8e32-4101-b933-2c7bd673c17f-kube-api-access-b5v27\") pod \"machine-config-operator-74547568cd-fxqpl\" (UID: 
\"410cd31d-8e32-4101-b933-2c7bd673c17f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fxqpl" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.470427 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2dea4f36-2ae5-4363-a65c-0b7346f02661-client-ca\") pod \"route-controller-manager-6576b87f9c-plnfh\" (UID: \"2dea4f36-2ae5-4363-a65c-0b7346f02661\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-plnfh" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.470456 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ca0a37ba-9a04-4d90-8ee8-6797791303a4-audit-policies\") pod \"apiserver-7bbb656c7d-5rs6p\" (UID: \"ca0a37ba-9a04-4d90-8ee8-6797791303a4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5rs6p" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.470478 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22nz8\" (UniqueName: \"kubernetes.io/projected/ca0a37ba-9a04-4d90-8ee8-6797791303a4-kube-api-access-22nz8\") pod \"apiserver-7bbb656c7d-5rs6p\" (UID: \"ca0a37ba-9a04-4d90-8ee8-6797791303a4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5rs6p" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.470495 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dc6756f9-a794-4667-8143-bd14bedd0cc3-proxy-tls\") pod \"machine-config-controller-84d6567774-rtddw\" (UID: \"dc6756f9-a794-4667-8143-bd14bedd0cc3\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rtddw" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.470516 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/6d29bdc9-59b6-460e-a725-2f731de32ec3-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-ss5zz\" (UID: \"6d29bdc9-59b6-460e-a725-2f731de32ec3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ss5zz" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.470536 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/142b25e7-9ad6-4a22-8f1c-8bd280329db9-config\") pod \"machine-approver-56656f9798-dmbvr\" (UID: \"142b25e7-9ad6-4a22-8f1c-8bd280329db9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dmbvr" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.470553 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cgdjh\" (UniqueName: \"kubernetes.io/projected/b76da243-83e6-4503-be17-ef252bff5a98-kube-api-access-cgdjh\") pod \"dns-operator-744455d44c-t7pwj\" (UID: \"b76da243-83e6-4503-be17-ef252bff5a98\") " pod="openshift-dns-operator/dns-operator-744455d44c-t7pwj" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.470572 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ca0a37ba-9a04-4d90-8ee8-6797791303a4-audit-dir\") pod \"apiserver-7bbb656c7d-5rs6p\" (UID: \"ca0a37ba-9a04-4d90-8ee8-6797791303a4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5rs6p" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.470589 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ca0a37ba-9a04-4d90-8ee8-6797791303a4-serving-cert\") pod \"apiserver-7bbb656c7d-5rs6p\" (UID: \"ca0a37ba-9a04-4d90-8ee8-6797791303a4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5rs6p" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.470604 4823 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b7af89fb-e572-4c0d-a269-a65d03ac6e0e-serving-cert\") pod \"apiserver-76f77b778f-s6mwx\" (UID: \"b7af89fb-e572-4c0d-a269-a65d03ac6e0e\") " pod="openshift-apiserver/apiserver-76f77b778f-s6mwx" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.470623 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/dc6756f9-a794-4667-8143-bd14bedd0cc3-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-rtddw\" (UID: \"dc6756f9-a794-4667-8143-bd14bedd0cc3\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rtddw" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.470652 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b7af89fb-e572-4c0d-a269-a65d03ac6e0e-encryption-config\") pod \"apiserver-76f77b778f-s6mwx\" (UID: \"b7af89fb-e572-4c0d-a269-a65d03ac6e0e\") " pod="openshift-apiserver/apiserver-76f77b778f-s6mwx" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.470670 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7hjh\" (UniqueName: \"kubernetes.io/projected/8428636f-bcf3-4698-b121-1649ff94810f-kube-api-access-h7hjh\") pod \"package-server-manager-789f6589d5-5dbcj\" (UID: \"8428636f-bcf3-4698-b121-1649ff94810f\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5dbcj" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.470686 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/410cd31d-8e32-4101-b933-2c7bd673c17f-proxy-tls\") pod \"machine-config-operator-74547568cd-fxqpl\" (UID: \"410cd31d-8e32-4101-b933-2c7bd673c17f\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fxqpl" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.470703 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b76da243-83e6-4503-be17-ef252bff5a98-metrics-tls\") pod \"dns-operator-744455d44c-t7pwj\" (UID: \"b76da243-83e6-4503-be17-ef252bff5a98\") " pod="openshift-dns-operator/dns-operator-744455d44c-t7pwj" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.470720 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ec12da7-6ed9-4798-ba75-1b0c160dd126-config\") pod \"openshift-apiserver-operator-796bbdcf4f-t9ztt\" (UID: \"3ec12da7-6ed9-4798-ba75-1b0c160dd126\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-t9ztt" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.470746 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/142b25e7-9ad6-4a22-8f1c-8bd280329db9-machine-approver-tls\") pod \"machine-approver-56656f9798-dmbvr\" (UID: \"142b25e7-9ad6-4a22-8f1c-8bd280329db9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dmbvr" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.470763 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ca0a37ba-9a04-4d90-8ee8-6797791303a4-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-5rs6p\" (UID: \"ca0a37ba-9a04-4d90-8ee8-6797791303a4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5rs6p" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.470780 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/37c33a89-18c7-457a-a8ce-85c7721719fc-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-c4c5h\" (UID: \"37c33a89-18c7-457a-a8ce-85c7721719fc\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-c4c5h" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.470799 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8aaa63b4-9b41-442f-b9ea-672885a486bd-serving-cert\") pod \"controller-manager-879f6c89f-vmqj6\" (UID: \"8aaa63b4-9b41-442f-b9ea-672885a486bd\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vmqj6" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.470825 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b7af89fb-e572-4c0d-a269-a65d03ac6e0e-trusted-ca-bundle\") pod \"apiserver-76f77b778f-s6mwx\" (UID: \"b7af89fb-e572-4c0d-a269-a65d03ac6e0e\") " pod="openshift-apiserver/apiserver-76f77b778f-s6mwx" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.473604 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gbws\" (UniqueName: \"kubernetes.io/projected/150075c3-d2eb-4c87-8b80-cd1d063e7d4c-kube-api-access-7gbws\") pod \"machine-api-operator-5694c8668f-bh4xp\" (UID: \"150075c3-d2eb-4c87-8b80-cd1d063e7d4c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-bh4xp" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.473659 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/979cabf2-8f74-4c88-92e8-baaffd74d816-available-featuregates\") pod \"openshift-config-operator-7777fb866f-jsk55\" (UID: \"979cabf2-8f74-4c88-92e8-baaffd74d816\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-jsk55" Dec 16 06:57:34 crc 
kubenswrapper[4823]: I1216 06:57:34.473688 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctp67\" (UniqueName: \"kubernetes.io/projected/a4d1fadc-5068-4443-8ba2-9bbd80233db2-kube-api-access-ctp67\") pod \"downloads-7954f5f757-k2ljf\" (UID: \"a4d1fadc-5068-4443-8ba2-9bbd80233db2\") " pod="openshift-console/downloads-7954f5f757-k2ljf" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.473715 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/150075c3-d2eb-4c87-8b80-cd1d063e7d4c-images\") pod \"machine-api-operator-5694c8668f-bh4xp\" (UID: \"150075c3-d2eb-4c87-8b80-cd1d063e7d4c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-bh4xp" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.473740 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7af89fb-e572-4c0d-a269-a65d03ac6e0e-config\") pod \"apiserver-76f77b778f-s6mwx\" (UID: \"b7af89fb-e572-4c0d-a269-a65d03ac6e0e\") " pod="openshift-apiserver/apiserver-76f77b778f-s6mwx" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.473954 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b7af89fb-e572-4c0d-a269-a65d03ac6e0e-audit-dir\") pod \"apiserver-76f77b778f-s6mwx\" (UID: \"b7af89fb-e572-4c0d-a269-a65d03ac6e0e\") " pod="openshift-apiserver/apiserver-76f77b778f-s6mwx" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.473975 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b7af89fb-e572-4c0d-a269-a65d03ac6e0e-etcd-serving-ca\") pod \"apiserver-76f77b778f-s6mwx\" (UID: \"b7af89fb-e572-4c0d-a269-a65d03ac6e0e\") " pod="openshift-apiserver/apiserver-76f77b778f-s6mwx" Dec 16 06:57:34 crc 
kubenswrapper[4823]: I1216 06:57:34.473998 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2dea4f36-2ae5-4363-a65c-0b7346f02661-config\") pod \"route-controller-manager-6576b87f9c-plnfh\" (UID: \"2dea4f36-2ae5-4363-a65c-0b7346f02661\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-plnfh" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.474034 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/410cd31d-8e32-4101-b933-2c7bd673c17f-images\") pod \"machine-config-operator-74547568cd-fxqpl\" (UID: \"410cd31d-8e32-4101-b933-2c7bd673c17f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fxqpl" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.474246 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3ec12da7-6ed9-4798-ba75-1b0c160dd126-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-t9ztt\" (UID: \"3ec12da7-6ed9-4798-ba75-1b0c160dd126\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-t9ztt" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.474274 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/979cabf2-8f74-4c88-92e8-baaffd74d816-serving-cert\") pod \"openshift-config-operator-7777fb866f-jsk55\" (UID: \"979cabf2-8f74-4c88-92e8-baaffd74d816\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-jsk55" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.474301 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/593f793a-bb15-4f83-8454-e3a1ced41667-samples-operator-tls\") pod 
\"cluster-samples-operator-665b6dd947-dpsl6\" (UID: \"593f793a-bb15-4f83-8454-e3a1ced41667\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-dpsl6" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.474323 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ws45\" (UniqueName: \"kubernetes.io/projected/2dea4f36-2ae5-4363-a65c-0b7346f02661-kube-api-access-4ws45\") pod \"route-controller-manager-6576b87f9c-plnfh\" (UID: \"2dea4f36-2ae5-4363-a65c-0b7346f02661\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-plnfh" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.474346 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/b7af89fb-e572-4c0d-a269-a65d03ac6e0e-node-pullsecrets\") pod \"apiserver-76f77b778f-s6mwx\" (UID: \"b7af89fb-e572-4c0d-a269-a65d03ac6e0e\") " pod="openshift-apiserver/apiserver-76f77b778f-s6mwx" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.474533 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h9w6b\" (UniqueName: \"kubernetes.io/projected/8aaa63b4-9b41-442f-b9ea-672885a486bd-kube-api-access-h9w6b\") pod \"controller-manager-879f6c89f-vmqj6\" (UID: \"8aaa63b4-9b41-442f-b9ea-672885a486bd\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vmqj6" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.474552 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ed1826b1-19af-4d50-b293-5c22c03fbfe7-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-l9qfp\" (UID: \"ed1826b1-19af-4d50-b293-5c22c03fbfe7\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l9qfp" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 
06:57:34.474575 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/150075c3-d2eb-4c87-8b80-cd1d063e7d4c-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-bh4xp\" (UID: \"150075c3-d2eb-4c87-8b80-cd1d063e7d4c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-bh4xp" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.474598 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca0a37ba-9a04-4d90-8ee8-6797791303a4-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-5rs6p\" (UID: \"ca0a37ba-9a04-4d90-8ee8-6797791303a4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5rs6p" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.474743 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d29bdc9-59b6-460e-a725-2f731de32ec3-config\") pod \"authentication-operator-69f744f599-ss5zz\" (UID: \"6d29bdc9-59b6-460e-a725-2f731de32ec3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ss5zz" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.474764 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2dea4f36-2ae5-4363-a65c-0b7346f02661-serving-cert\") pod \"route-controller-manager-6576b87f9c-plnfh\" (UID: \"2dea4f36-2ae5-4363-a65c-0b7346f02661\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-plnfh" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.474784 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6d29bdc9-59b6-460e-a725-2f731de32ec3-service-ca-bundle\") pod \"authentication-operator-69f744f599-ss5zz\" (UID: 
\"6d29bdc9-59b6-460e-a725-2f731de32ec3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ss5zz" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.479841 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/150075c3-d2eb-4c87-8b80-cd1d063e7d4c-config\") pod \"machine-api-operator-5694c8668f-bh4xp\" (UID: \"150075c3-d2eb-4c87-8b80-cd1d063e7d4c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-bh4xp" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.479960 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8aaa63b4-9b41-442f-b9ea-672885a486bd-client-ca\") pod \"controller-manager-879f6c89f-vmqj6\" (UID: \"8aaa63b4-9b41-442f-b9ea-672885a486bd\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vmqj6" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.480677 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.486106 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6d29bdc9-59b6-460e-a725-2f731de32ec3-serving-cert\") pod \"authentication-operator-69f744f599-ss5zz\" (UID: \"6d29bdc9-59b6-460e-a725-2f731de32ec3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ss5zz" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.486646 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-jjrb4"] Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.490015 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b7af89fb-e572-4c0d-a269-a65d03ac6e0e-encryption-config\") pod 
\"apiserver-76f77b778f-s6mwx\" (UID: \"b7af89fb-e572-4c0d-a269-a65d03ac6e0e\") " pod="openshift-apiserver/apiserver-76f77b778f-s6mwx" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.494355 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/b7af89fb-e572-4c0d-a269-a65d03ac6e0e-audit\") pod \"apiserver-76f77b778f-s6mwx\" (UID: \"b7af89fb-e572-4c0d-a269-a65d03ac6e0e\") " pod="openshift-apiserver/apiserver-76f77b778f-s6mwx" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.495325 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/b7af89fb-e572-4c0d-a269-a65d03ac6e0e-image-import-ca\") pod \"apiserver-76f77b778f-s6mwx\" (UID: \"b7af89fb-e572-4c0d-a269-a65d03ac6e0e\") " pod="openshift-apiserver/apiserver-76f77b778f-s6mwx" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.496004 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/142b25e7-9ad6-4a22-8f1c-8bd280329db9-auth-proxy-config\") pod \"machine-approver-56656f9798-dmbvr\" (UID: \"142b25e7-9ad6-4a22-8f1c-8bd280329db9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dmbvr" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.496322 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-rmx2d"] Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.499713 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-vnw5c"] Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.501564 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ca0a37ba-9a04-4d90-8ee8-6797791303a4-encryption-config\") pod \"apiserver-7bbb656c7d-5rs6p\" (UID: 
\"ca0a37ba-9a04-4d90-8ee8-6797791303a4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5rs6p" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.503232 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8aaa63b4-9b41-442f-b9ea-672885a486bd-config\") pod \"controller-manager-879f6c89f-vmqj6\" (UID: \"8aaa63b4-9b41-442f-b9ea-672885a486bd\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vmqj6" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.504285 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-c4c5h"] Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.505787 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2dea4f36-2ae5-4363-a65c-0b7346f02661-client-ca\") pod \"route-controller-manager-6576b87f9c-plnfh\" (UID: \"2dea4f36-2ae5-4363-a65c-0b7346f02661\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-plnfh" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.507863 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8aaa63b4-9b41-442f-b9ea-672885a486bd-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-vmqj6\" (UID: \"8aaa63b4-9b41-442f-b9ea-672885a486bd\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vmqj6" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.509248 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/142b25e7-9ad6-4a22-8f1c-8bd280329db9-config\") pod \"machine-approver-56656f9798-dmbvr\" (UID: \"142b25e7-9ad6-4a22-8f1c-8bd280329db9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dmbvr" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.509344 
4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.509862 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ca0a37ba-9a04-4d90-8ee8-6797791303a4-audit-policies\") pod \"apiserver-7bbb656c7d-5rs6p\" (UID: \"ca0a37ba-9a04-4d90-8ee8-6797791303a4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5rs6p" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.513812 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ca0a37ba-9a04-4d90-8ee8-6797791303a4-audit-dir\") pod \"apiserver-7bbb656c7d-5rs6p\" (UID: \"ca0a37ba-9a04-4d90-8ee8-6797791303a4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5rs6p" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.513955 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b7af89fb-e572-4c0d-a269-a65d03ac6e0e-audit-dir\") pod \"apiserver-76f77b778f-s6mwx\" (UID: \"b7af89fb-e572-4c0d-a269-a65d03ac6e0e\") " pod="openshift-apiserver/apiserver-76f77b778f-s6mwx" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.513663 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ca0a37ba-9a04-4d90-8ee8-6797791303a4-etcd-client\") pod \"apiserver-7bbb656c7d-5rs6p\" (UID: \"ca0a37ba-9a04-4d90-8ee8-6797791303a4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5rs6p" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.514662 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b7af89fb-e572-4c0d-a269-a65d03ac6e0e-etcd-serving-ca\") pod \"apiserver-76f77b778f-s6mwx\" (UID: 
\"b7af89fb-e572-4c0d-a269-a65d03ac6e0e\") " pod="openshift-apiserver/apiserver-76f77b778f-s6mwx" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.514746 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6d29bdc9-59b6-460e-a725-2f731de32ec3-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-ss5zz\" (UID: \"6d29bdc9-59b6-460e-a725-2f731de32ec3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ss5zz" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.514984 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7af89fb-e572-4c0d-a269-a65d03ac6e0e-config\") pod \"apiserver-76f77b778f-s6mwx\" (UID: \"b7af89fb-e572-4c0d-a269-a65d03ac6e0e\") " pod="openshift-apiserver/apiserver-76f77b778f-s6mwx" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.515439 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca0a37ba-9a04-4d90-8ee8-6797791303a4-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-5rs6p\" (UID: \"ca0a37ba-9a04-4d90-8ee8-6797791303a4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5rs6p" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.515490 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d29bdc9-59b6-460e-a725-2f731de32ec3-config\") pod \"authentication-operator-69f744f599-ss5zz\" (UID: \"6d29bdc9-59b6-460e-a725-2f731de32ec3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ss5zz" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.515501 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8xpl2"] Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.517982 
4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b7af89fb-e572-4c0d-a269-a65d03ac6e0e-trusted-ca-bundle\") pod \"apiserver-76f77b778f-s6mwx\" (UID: \"b7af89fb-e572-4c0d-a269-a65d03ac6e0e\") " pod="openshift-apiserver/apiserver-76f77b778f-s6mwx" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.521876 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/410cd31d-8e32-4101-b933-2c7bd673c17f-auth-proxy-config\") pod \"machine-config-operator-74547568cd-fxqpl\" (UID: \"410cd31d-8e32-4101-b933-2c7bd673c17f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fxqpl" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.522180 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.523301 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/b7af89fb-e572-4c0d-a269-a65d03ac6e0e-node-pullsecrets\") pod \"apiserver-76f77b778f-s6mwx\" (UID: \"b7af89fb-e572-4c0d-a269-a65d03ac6e0e\") " pod="openshift-apiserver/apiserver-76f77b778f-s6mwx" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.523957 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-ss5zz"] Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.524065 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5dtcg"] Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.524098 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/410cd31d-8e32-4101-b933-2c7bd673c17f-images\") pod 
\"machine-config-operator-74547568cd-fxqpl\" (UID: \"410cd31d-8e32-4101-b933-2c7bd673c17f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fxqpl" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.525066 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ca0a37ba-9a04-4d90-8ee8-6797791303a4-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-5rs6p\" (UID: \"ca0a37ba-9a04-4d90-8ee8-6797791303a4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5rs6p" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.525096 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2dea4f36-2ae5-4363-a65c-0b7346f02661-config\") pod \"route-controller-manager-6576b87f9c-plnfh\" (UID: \"2dea4f36-2ae5-4363-a65c-0b7346f02661\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-plnfh" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.525590 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ec12da7-6ed9-4798-ba75-1b0c160dd126-config\") pod \"openshift-apiserver-operator-796bbdcf4f-t9ztt\" (UID: \"3ec12da7-6ed9-4798-ba75-1b0c160dd126\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-t9ztt" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.525806 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/150075c3-d2eb-4c87-8b80-cd1d063e7d4c-images\") pod \"machine-api-operator-5694c8668f-bh4xp\" (UID: \"150075c3-d2eb-4c87-8b80-cd1d063e7d4c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-bh4xp" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.528573 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/ca0a37ba-9a04-4d90-8ee8-6797791303a4-serving-cert\") pod \"apiserver-7bbb656c7d-5rs6p\" (UID: \"ca0a37ba-9a04-4d90-8ee8-6797791303a4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5rs6p" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.528757 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6d29bdc9-59b6-460e-a725-2f731de32ec3-service-ca-bundle\") pod \"authentication-operator-69f744f599-ss5zz\" (UID: \"6d29bdc9-59b6-460e-a725-2f731de32ec3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ss5zz" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.529116 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b7af89fb-e572-4c0d-a269-a65d03ac6e0e-serving-cert\") pod \"apiserver-76f77b778f-s6mwx\" (UID: \"b7af89fb-e572-4c0d-a269-a65d03ac6e0e\") " pod="openshift-apiserver/apiserver-76f77b778f-s6mwx" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.529206 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b7af89fb-e572-4c0d-a269-a65d03ac6e0e-etcd-client\") pod \"apiserver-76f77b778f-s6mwx\" (UID: \"b7af89fb-e572-4c0d-a269-a65d03ac6e0e\") " pod="openshift-apiserver/apiserver-76f77b778f-s6mwx" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.529859 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/150075c3-d2eb-4c87-8b80-cd1d063e7d4c-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-bh4xp\" (UID: \"150075c3-d2eb-4c87-8b80-cd1d063e7d4c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-bh4xp" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.530276 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/3ec12da7-6ed9-4798-ba75-1b0c160dd126-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-t9ztt\" (UID: \"3ec12da7-6ed9-4798-ba75-1b0c160dd126\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-t9ztt" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.530448 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-g4ltp"] Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.532658 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.532795 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/593f793a-bb15-4f83-8454-e3a1ced41667-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-dpsl6\" (UID: \"593f793a-bb15-4f83-8454-e3a1ced41667\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-dpsl6" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.534089 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-jmp8w"] Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.534651 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b76da243-83e6-4503-be17-ef252bff5a98-metrics-tls\") pod \"dns-operator-744455d44c-t7pwj\" (UID: \"b76da243-83e6-4503-be17-ef252bff5a98\") " pod="openshift-dns-operator/dns-operator-744455d44c-t7pwj" Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.535364 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/142b25e7-9ad6-4a22-8f1c-8bd280329db9-machine-approver-tls\") pod \"machine-approver-56656f9798-dmbvr\" (UID: 
\"142b25e7-9ad6-4a22-8f1c-8bd280329db9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dmbvr"
Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.545511 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2glj6"]
Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.547201 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/410cd31d-8e32-4101-b933-2c7bd673c17f-proxy-tls\") pod \"machine-config-operator-74547568cd-fxqpl\" (UID: \"410cd31d-8e32-4101-b933-2c7bd673c17f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fxqpl"
Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.534982 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8aaa63b4-9b41-442f-b9ea-672885a486bd-serving-cert\") pod \"controller-manager-879f6c89f-vmqj6\" (UID: \"8aaa63b4-9b41-442f-b9ea-672885a486bd\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vmqj6"
Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.547281 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qqqjd"]
Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.548520 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-thj57"]
Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.548987 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.549535 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2dea4f36-2ae5-4363-a65c-0b7346f02661-serving-cert\") pod \"route-controller-manager-6576b87f9c-plnfh\" (UID: \"2dea4f36-2ae5-4363-a65c-0b7346f02661\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-plnfh"
Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.549664 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-6bwk5"]
Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.550661 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-6bwk5"
Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.551248 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-c69wg"]
Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.552573 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-c69wg"
Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.552617 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-4wgv6"]
Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.553979 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-9lh5d"]
Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.554821 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-dpsl6"]
Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.556128 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-llvdl"]
Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.557317 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l9qfp"]
Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.558334 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-mgbxj"]
Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.560150 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bdlsz"]
Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.561887 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-rmzm5"]
Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.564141 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-rtddw"]
Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.564203 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-6xfbm"]
Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.565689 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.566229 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-t7pwj"]
Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.567636 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431125-j4w8x"]
Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.568983 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-c69wg"]
Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.570158 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xkdq6"]
Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.571151 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-bx552"]
Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.572396 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d9xfk"]
Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.573504 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-4bxzg"]
Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.574803 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-4bxzg"]
Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.575056 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-4bxzg"
Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.576096 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dc6756f9-a794-4667-8143-bd14bedd0cc3-proxy-tls\") pod \"machine-config-controller-84d6567774-rtddw\" (UID: \"dc6756f9-a794-4667-8143-bd14bedd0cc3\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rtddw"
Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.576194 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/dc6756f9-a794-4667-8143-bd14bedd0cc3-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-rtddw\" (UID: \"dc6756f9-a794-4667-8143-bd14bedd0cc3\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rtddw"
Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.576230 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7hjh\" (UniqueName: \"kubernetes.io/projected/8428636f-bcf3-4698-b121-1649ff94810f-kube-api-access-h7hjh\") pod \"package-server-manager-789f6589d5-5dbcj\" (UID: \"8428636f-bcf3-4698-b121-1649ff94810f\") " 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5dbcj"
Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.576266 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/37c33a89-18c7-457a-a8ce-85c7721719fc-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-c4c5h\" (UID: \"37c33a89-18c7-457a-a8ce-85c7721719fc\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-c4c5h"
Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.576310 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ctp67\" (UniqueName: \"kubernetes.io/projected/a4d1fadc-5068-4443-8ba2-9bbd80233db2-kube-api-access-ctp67\") pod \"downloads-7954f5f757-k2ljf\" (UID: \"a4d1fadc-5068-4443-8ba2-9bbd80233db2\") " pod="openshift-console/downloads-7954f5f757-k2ljf"
Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.576333 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/979cabf2-8f74-4c88-92e8-baaffd74d816-available-featuregates\") pod \"openshift-config-operator-7777fb866f-jsk55\" (UID: \"979cabf2-8f74-4c88-92e8-baaffd74d816\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-jsk55"
Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.576353 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/979cabf2-8f74-4c88-92e8-baaffd74d816-serving-cert\") pod \"openshift-config-operator-7777fb866f-jsk55\" (UID: \"979cabf2-8f74-4c88-92e8-baaffd74d816\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-jsk55"
Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.577230 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/979cabf2-8f74-4c88-92e8-baaffd74d816-available-featuregates\") pod \"openshift-config-operator-7777fb866f-jsk55\" (UID: \"979cabf2-8f74-4c88-92e8-baaffd74d816\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-jsk55"
Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.577409 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ed1826b1-19af-4d50-b293-5c22c03fbfe7-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-l9qfp\" (UID: \"ed1826b1-19af-4d50-b293-5c22c03fbfe7\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l9qfp"
Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.577453 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljnjf\" (UniqueName: \"kubernetes.io/projected/37c33a89-18c7-457a-a8ce-85c7721719fc-kube-api-access-ljnjf\") pod \"cluster-image-registry-operator-dc59b4c8b-c4c5h\" (UID: \"37c33a89-18c7-457a-a8ce-85c7721719fc\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-c4c5h"
Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.577476 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9a66a086-81cf-498e-aada-d06b41019b1f-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-bdlsz\" (UID: \"9a66a086-81cf-498e-aada-d06b41019b1f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bdlsz"
Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.577531 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a66a086-81cf-498e-aada-d06b41019b1f-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-bdlsz\" (UID: \"9a66a086-81cf-498e-aada-d06b41019b1f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bdlsz"
Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.577849 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/dc6756f9-a794-4667-8143-bd14bedd0cc3-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-rtddw\" (UID: \"dc6756f9-a794-4667-8143-bd14bedd0cc3\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rtddw"
Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.578393 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a66a086-81cf-498e-aada-d06b41019b1f-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-bdlsz\" (UID: \"9a66a086-81cf-498e-aada-d06b41019b1f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bdlsz"
Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.579889 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/37c33a89-18c7-457a-a8ce-85c7721719fc-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-c4c5h\" (UID: \"37c33a89-18c7-457a-a8ce-85c7721719fc\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-c4c5h"
Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.577552 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xs46m\" (UniqueName: \"kubernetes.io/projected/979cabf2-8f74-4c88-92e8-baaffd74d816-kube-api-access-xs46m\") pod \"openshift-config-operator-7777fb866f-jsk55\" (UID: \"979cabf2-8f74-4c88-92e8-baaffd74d816\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-jsk55"
Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.579972 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/37c33a89-18c7-457a-a8ce-85c7721719fc-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-c4c5h\" (UID: \"37c33a89-18c7-457a-a8ce-85c7721719fc\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-c4c5h"
Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.580000 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/37c33a89-18c7-457a-a8ce-85c7721719fc-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-c4c5h\" (UID: \"37c33a89-18c7-457a-a8ce-85c7721719fc\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-c4c5h"
Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.580073 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mjv2\" (UniqueName: \"kubernetes.io/projected/dc6756f9-a794-4667-8143-bd14bedd0cc3-kube-api-access-9mjv2\") pod \"machine-config-controller-84d6567774-rtddw\" (UID: \"dc6756f9-a794-4667-8143-bd14bedd0cc3\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rtddw"
Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.580104 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wbrsc\" (UniqueName: \"kubernetes.io/projected/9a66a086-81cf-498e-aada-d06b41019b1f-kube-api-access-wbrsc\") pod \"kube-storage-version-migrator-operator-b67b599dd-bdlsz\" (UID: \"9a66a086-81cf-498e-aada-d06b41019b1f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bdlsz"
Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.580129 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/ed1826b1-19af-4d50-b293-5c22c03fbfe7-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-l9qfp\" (UID: \"ed1826b1-19af-4d50-b293-5c22c03fbfe7\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l9qfp"
Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.580155 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed1826b1-19af-4d50-b293-5c22c03fbfe7-config\") pod \"kube-controller-manager-operator-78b949d7b-l9qfp\" (UID: \"ed1826b1-19af-4d50-b293-5c22c03fbfe7\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l9qfp"
Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.580178 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/8428636f-bcf3-4698-b121-1649ff94810f-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-5dbcj\" (UID: \"8428636f-bcf3-4698-b121-1649ff94810f\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5dbcj"
Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.580556 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/979cabf2-8f74-4c88-92e8-baaffd74d816-serving-cert\") pod \"openshift-config-operator-7777fb866f-jsk55\" (UID: \"979cabf2-8f74-4c88-92e8-baaffd74d816\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-jsk55"
Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.580854 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dc6756f9-a794-4667-8143-bd14bedd0cc3-proxy-tls\") pod \"machine-config-controller-84d6567774-rtddw\" (UID: \"dc6756f9-a794-4667-8143-bd14bedd0cc3\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rtddw"
Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.581410 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9a66a086-81cf-498e-aada-d06b41019b1f-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-bdlsz\" (UID: \"9a66a086-81cf-498e-aada-d06b41019b1f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bdlsz"
Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.581776 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed1826b1-19af-4d50-b293-5c22c03fbfe7-config\") pod \"kube-controller-manager-operator-78b949d7b-l9qfp\" (UID: \"ed1826b1-19af-4d50-b293-5c22c03fbfe7\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l9qfp"
Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.583710 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ed1826b1-19af-4d50-b293-5c22c03fbfe7-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-l9qfp\" (UID: \"ed1826b1-19af-4d50-b293-5c22c03fbfe7\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l9qfp"
Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.583725 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/8428636f-bcf3-4698-b121-1649ff94810f-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-5dbcj\" (UID: \"8428636f-bcf3-4698-b121-1649ff94810f\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5dbcj"
Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.585006 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/37c33a89-18c7-457a-a8ce-85c7721719fc-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-c4c5h\" (UID: \"37c33a89-18c7-457a-a8ce-85c7721719fc\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-c4c5h"
Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.586408 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.606131 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.633681 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.646873 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.666953 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.685882 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.704935 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.726339 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.745756 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.765193 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.785092 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.806013 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.826613 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.853837 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.866047 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.886776 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.905262 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.926617 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.947776 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.965881 4823 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Dec 16 06:57:34 crc kubenswrapper[4823]: I1216 06:57:34.985800 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.005846 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.024932 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.045795 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.066153 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.086055 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.105358 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.125722 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.146201 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.166911 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.186272 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.207954 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.226249 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.247111 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.266180 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.287507 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.314396 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.325966 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.346678 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.365894 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.387236 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.405464 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.427225 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.444137 4823 request.go:700] Waited for 1.007550035s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/configmaps?fieldSelector=metadata.name%3Dv4-0-config-system-service-ca&limit=500&resourceVersion=0
Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.446557 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.527700 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.528034 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.537695 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.540777 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.575272 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.586706 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.607304 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.627104 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0a48b03b-402f-48b1-a3b7-52690850de42-installation-pull-secrets\") pod \"image-registry-697d97f7c8-rmx2d\" (UID: \"0a48b03b-402f-48b1-a3b7-52690850de42\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmx2d"
Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.627245 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0a48b03b-402f-48b1-a3b7-52690850de42-ca-trust-extracted\") pod \"image-registry-697d97f7c8-rmx2d\" (UID: \"0a48b03b-402f-48b1-a3b7-52690850de42\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmx2d"
Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.627291 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/ab7ea4d1-0209-45ec-a40f-3a7bb6bc20ac-profile-collector-cert\") pod \"olm-operator-6b444d44fb-5dtcg\" (UID: \"ab7ea4d1-0209-45ec-a40f-3a7bb6bc20ac\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5dtcg"
Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.627400 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rmx2d\" (UID: \"0a48b03b-402f-48b1-a3b7-52690850de42\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmx2d"
Dec 16 06:57:35 crc 
kubenswrapper[4823]: I1216 06:57:35.627456 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0a48b03b-402f-48b1-a3b7-52690850de42-registry-tls\") pod \"image-registry-697d97f7c8-rmx2d\" (UID: \"0a48b03b-402f-48b1-a3b7-52690850de42\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmx2d"
Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.627537 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfwqg\" (UniqueName: \"kubernetes.io/projected/ab7ea4d1-0209-45ec-a40f-3a7bb6bc20ac-kube-api-access-kfwqg\") pod \"olm-operator-6b444d44fb-5dtcg\" (UID: \"ab7ea4d1-0209-45ec-a40f-3a7bb6bc20ac\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5dtcg"
Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.627631 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g474n\" (UniqueName: \"kubernetes.io/projected/0a48b03b-402f-48b1-a3b7-52690850de42-kube-api-access-g474n\") pod \"image-registry-697d97f7c8-rmx2d\" (UID: \"0a48b03b-402f-48b1-a3b7-52690850de42\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmx2d"
Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.627674 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0a48b03b-402f-48b1-a3b7-52690850de42-trusted-ca\") pod \"image-registry-697d97f7c8-rmx2d\" (UID: \"0a48b03b-402f-48b1-a3b7-52690850de42\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmx2d"
Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.627954 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0a48b03b-402f-48b1-a3b7-52690850de42-registry-certificates\") pod \"image-registry-697d97f7c8-rmx2d\" (UID: \"0a48b03b-402f-48b1-a3b7-52690850de42\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmx2d"
Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.628059 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9125b2ca-3f9a-47d3-8422-cae6f85f36d1-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-8xpl2\" (UID: \"9125b2ca-3f9a-47d3-8422-cae6f85f36d1\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8xpl2"
Dec 16 06:57:35 crc kubenswrapper[4823]: E1216 06:57:35.628098 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:57:36.128075438 +0000 UTC m=+134.616641781 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rmx2d" (UID: "0a48b03b-402f-48b1-a3b7-52690850de42") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.628221 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zd6lp\" (UniqueName: \"kubernetes.io/projected/9125b2ca-3f9a-47d3-8422-cae6f85f36d1-kube-api-access-zd6lp\") pod \"openshift-controller-manager-operator-756b6f6bc6-8xpl2\" (UID: \"9125b2ca-3f9a-47d3-8422-cae6f85f36d1\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8xpl2"
Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.628340 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0a48b03b-402f-48b1-a3b7-52690850de42-bound-sa-token\") pod \"image-registry-697d97f7c8-rmx2d\" (UID: \"0a48b03b-402f-48b1-a3b7-52690850de42\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmx2d"
Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.628492 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9125b2ca-3f9a-47d3-8422-cae6f85f36d1-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-8xpl2\" (UID: \"9125b2ca-3f9a-47d3-8422-cae6f85f36d1\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8xpl2"
Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.628668 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/ab7ea4d1-0209-45ec-a40f-3a7bb6bc20ac-srv-cert\") pod \"olm-operator-6b444d44fb-5dtcg\" (UID: \"ab7ea4d1-0209-45ec-a40f-3a7bb6bc20ac\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5dtcg"
Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.630331 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.646190 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.666541 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.686420 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.707050 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.725996 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.730934 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " 
Dec 16 06:57:35 crc kubenswrapper[4823]: E1216 06:57:35.731280 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:57:36.231224851 +0000 UTC m=+134.719791024 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.731491 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whxb2\" (UniqueName: \"kubernetes.io/projected/10d2c2db-5e69-4c14-92be-c6c7f6c9b04a-kube-api-access-whxb2\") pod \"service-ca-operator-777779d784-jmp8w\" (UID: \"10d2c2db-5e69-4c14-92be-c6c7f6c9b04a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-jmp8w"
Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.731746 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/60b58907-b6e9-4a6d-b442-9c79d839bac9-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-6xfbm\" (UID: \"60b58907-b6e9-4a6d-b442-9c79d839bac9\") " pod="openshift-authentication/oauth-openshift-558db77b4-6xfbm"
Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.731958 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2s8mw\" (UniqueName: \"kubernetes.io/projected/7933e531-8015-4c1a-ba84-fe22aa094da0-kube-api-access-2s8mw\") pod \"ingress-operator-5b745b69d9-g4ltp\" (UID: 
\"7933e531-8015-4c1a-ba84-fe22aa094da0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-g4ltp" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.732162 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f6f3b211-3f2a-4f8e-8d2a-e68e8aaad955-serving-cert\") pod \"console-operator-58897d9998-vnw5c\" (UID: \"f6f3b211-3f2a-4f8e-8d2a-e68e8aaad955\") " pod="openshift-console-operator/console-operator-58897d9998-vnw5c" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.732218 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3266e10d-4bcd-4db0-aa15-53be39ecf437-serving-cert\") pod \"etcd-operator-b45778765-4wgv6\" (UID: \"3266e10d-4bcd-4db0-aa15-53be39ecf437\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4wgv6" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.732276 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10d2c2db-5e69-4c14-92be-c6c7f6c9b04a-config\") pod \"service-ca-operator-777779d784-jmp8w\" (UID: \"10d2c2db-5e69-4c14-92be-c6c7f6c9b04a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-jmp8w" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.732334 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dca532ee-e66a-411a-afcc-646f96a22a62-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-thj57\" (UID: \"dca532ee-e66a-411a-afcc-646f96a22a62\") " pod="openshift-marketplace/marketplace-operator-79b997595-thj57" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.732443 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/10d2c2db-5e69-4c14-92be-c6c7f6c9b04a-serving-cert\") pod \"service-ca-operator-777779d784-jmp8w\" (UID: \"10d2c2db-5e69-4c14-92be-c6c7f6c9b04a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-jmp8w" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.732667 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9qsr\" (UniqueName: \"kubernetes.io/projected/7b26d5bf-8529-4934-9b4f-96792ae9e62f-kube-api-access-l9qsr\") pod \"machine-config-server-6bwk5\" (UID: \"7b26d5bf-8529-4934-9b4f-96792ae9e62f\") " pod="openshift-machine-config-operator/machine-config-server-6bwk5" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.732755 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/60b58907-b6e9-4a6d-b442-9c79d839bac9-audit-policies\") pod \"oauth-openshift-558db77b4-6xfbm\" (UID: \"60b58907-b6e9-4a6d-b442-9c79d839bac9\") " pod="openshift-authentication/oauth-openshift-558db77b4-6xfbm" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.732892 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/843619ff-c0f4-4389-b3ef-62c282b20303-service-ca-bundle\") pod \"router-default-5444994796-lpkx6\" (UID: \"843619ff-c0f4-4389-b3ef-62c282b20303\") " pod="openshift-ingress/router-default-5444994796-lpkx6" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.733094 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjzsr\" (UniqueName: \"kubernetes.io/projected/553e9a81-ff35-4dd6-bbb7-d55749d88e45-kube-api-access-hjzsr\") pod \"multus-admission-controller-857f4d67dd-9lh5d\" (UID: \"553e9a81-ff35-4dd6-bbb7-d55749d88e45\") " 
pod="openshift-multus/multus-admission-controller-857f4d67dd-9lh5d" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.733245 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/7c559cce-7066-4c52-ad84-9c748243f010-profile-collector-cert\") pod \"catalog-operator-68c6474976-llvdl\" (UID: \"7c559cce-7066-4c52-ad84-9c748243f010\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-llvdl" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.734059 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0a48b03b-402f-48b1-a3b7-52690850de42-registry-tls\") pod \"image-registry-697d97f7c8-rmx2d\" (UID: \"0a48b03b-402f-48b1-a3b7-52690850de42\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmx2d" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.734134 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/553e9a81-ff35-4dd6-bbb7-d55749d88e45-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-9lh5d\" (UID: \"553e9a81-ff35-4dd6-bbb7-d55749d88e45\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-9lh5d" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.734231 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/58900f48-68af-4712-bb30-9d832c26ce02-socket-dir\") pod \"csi-hostpathplugin-4bxzg\" (UID: \"58900f48-68af-4712-bb30-9d832c26ce02\") " pod="hostpath-provisioner/csi-hostpathplugin-4bxzg" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.734379 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: 
\"kubernetes.io/secret/7b26d5bf-8529-4934-9b4f-96792ae9e62f-certs\") pod \"machine-config-server-6bwk5\" (UID: \"7b26d5bf-8529-4934-9b4f-96792ae9e62f\") " pod="openshift-machine-config-operator/machine-config-server-6bwk5" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.734434 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kr2rl\" (UniqueName: \"kubernetes.io/projected/3266e10d-4bcd-4db0-aa15-53be39ecf437-kube-api-access-kr2rl\") pod \"etcd-operator-b45778765-4wgv6\" (UID: \"3266e10d-4bcd-4db0-aa15-53be39ecf437\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4wgv6" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.734490 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/843619ff-c0f4-4389-b3ef-62c282b20303-metrics-certs\") pod \"router-default-5444994796-lpkx6\" (UID: \"843619ff-c0f4-4389-b3ef-62c282b20303\") " pod="openshift-ingress/router-default-5444994796-lpkx6" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.734542 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/f4a7b98f-600d-49af-9917-29b38e8877a4-tmpfs\") pod \"packageserver-d55dfcdfc-d9xfk\" (UID: \"f4a7b98f-600d-49af-9917-29b38e8877a4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d9xfk" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.734718 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/58900f48-68af-4712-bb30-9d832c26ce02-csi-data-dir\") pod \"csi-hostpathplugin-4bxzg\" (UID: \"58900f48-68af-4712-bb30-9d832c26ce02\") " pod="hostpath-provisioner/csi-hostpathplugin-4bxzg" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.735477 4823 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1c5f775-cbe4-44d3-8ef3-19dd69e27b47-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-2glj6\" (UID: \"a1c5f775-cbe4-44d3-8ef3-19dd69e27b47\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2glj6" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.735579 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b6b1d27b-235a-4b1e-adaa-512f3ae25954-secret-volume\") pod \"collect-profiles-29431125-j4w8x\" (UID: \"b6b1d27b-235a-4b1e-adaa-512f3ae25954\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431125-j4w8x" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.735753 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxqth\" (UniqueName: \"kubernetes.io/projected/e2376625-99df-4517-a64a-6c8b1d7edc20-kube-api-access-pxqth\") pod \"service-ca-9c57cc56f-mgbxj\" (UID: \"e2376625-99df-4517-a64a-6c8b1d7edc20\") " pod="openshift-service-ca/service-ca-9c57cc56f-mgbxj" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.736000 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0a48b03b-402f-48b1-a3b7-52690850de42-trusted-ca\") pod \"image-registry-697d97f7c8-rmx2d\" (UID: \"0a48b03b-402f-48b1-a3b7-52690850de42\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmx2d" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.736156 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/dca532ee-e66a-411a-afcc-646f96a22a62-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-thj57\" (UID: 
\"dca532ee-e66a-411a-afcc-646f96a22a62\") " pod="openshift-marketplace/marketplace-operator-79b997595-thj57" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.736330 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c46vr\" (UniqueName: \"kubernetes.io/projected/7c559cce-7066-4c52-ad84-9c748243f010-kube-api-access-c46vr\") pod \"catalog-operator-68c6474976-llvdl\" (UID: \"7c559cce-7066-4c52-ad84-9c748243f010\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-llvdl" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.736965 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0a48b03b-402f-48b1-a3b7-52690850de42-registry-certificates\") pod \"image-registry-697d97f7c8-rmx2d\" (UID: \"0a48b03b-402f-48b1-a3b7-52690850de42\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmx2d" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.737095 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vbdm\" (UniqueName: \"kubernetes.io/projected/f6f3b211-3f2a-4f8e-8d2a-e68e8aaad955-kube-api-access-8vbdm\") pod \"console-operator-58897d9998-vnw5c\" (UID: \"f6f3b211-3f2a-4f8e-8d2a-e68e8aaad955\") " pod="openshift-console-operator/console-operator-58897d9998-vnw5c" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.737259 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f4a7b98f-600d-49af-9917-29b38e8877a4-webhook-cert\") pod \"packageserver-d55dfcdfc-d9xfk\" (UID: \"f4a7b98f-600d-49af-9917-29b38e8877a4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d9xfk" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.737635 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0a48b03b-402f-48b1-a3b7-52690850de42-trusted-ca\") pod \"image-registry-697d97f7c8-rmx2d\" (UID: \"0a48b03b-402f-48b1-a3b7-52690850de42\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmx2d" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.738275 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zd6lp\" (UniqueName: \"kubernetes.io/projected/9125b2ca-3f9a-47d3-8422-cae6f85f36d1-kube-api-access-zd6lp\") pod \"openshift-controller-manager-operator-756b6f6bc6-8xpl2\" (UID: \"9125b2ca-3f9a-47d3-8422-cae6f85f36d1\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8xpl2" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.738426 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0a48b03b-402f-48b1-a3b7-52690850de42-registry-certificates\") pod \"image-registry-697d97f7c8-rmx2d\" (UID: \"0a48b03b-402f-48b1-a3b7-52690850de42\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmx2d" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.738897 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a1c5f775-cbe4-44d3-8ef3-19dd69e27b47-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-2glj6\" (UID: \"a1c5f775-cbe4-44d3-8ef3-19dd69e27b47\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2glj6" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.739099 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/60b58907-b6e9-4a6d-b442-9c79d839bac9-v4-0-config-user-template-provider-selection\") pod 
\"oauth-openshift-558db77b4-6xfbm\" (UID: \"60b58907-b6e9-4a6d-b442-9c79d839bac9\") " pod="openshift-authentication/oauth-openshift-558db77b4-6xfbm" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.739788 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/eff38aa7-d0b3-455b-b0ca-1034fc06a182-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-xkdq6\" (UID: \"eff38aa7-d0b3-455b-b0ca-1034fc06a182\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xkdq6" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.739921 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blnkq\" (UniqueName: \"kubernetes.io/projected/dca532ee-e66a-411a-afcc-646f96a22a62-kube-api-access-blnkq\") pod \"marketplace-operator-79b997595-thj57\" (UID: \"dca532ee-e66a-411a-afcc-646f96a22a62\") " pod="openshift-marketplace/marketplace-operator-79b997595-thj57" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.740102 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/ab7ea4d1-0209-45ec-a40f-3a7bb6bc20ac-srv-cert\") pod \"olm-operator-6b444d44fb-5dtcg\" (UID: \"ab7ea4d1-0209-45ec-a40f-3a7bb6bc20ac\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5dtcg" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.740169 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a1c5f775-cbe4-44d3-8ef3-19dd69e27b47-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-2glj6\" (UID: \"a1c5f775-cbe4-44d3-8ef3-19dd69e27b47\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2glj6" Dec 16 06:57:35 
crc kubenswrapper[4823]: I1216 06:57:35.740222 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/e2376625-99df-4517-a64a-6c8b1d7edc20-signing-cabundle\") pod \"service-ca-9c57cc56f-mgbxj\" (UID: \"e2376625-99df-4517-a64a-6c8b1d7edc20\") " pod="openshift-service-ca/service-ca-9c57cc56f-mgbxj" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.740268 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f28cb7aa-489c-4015-860c-9d925da5f135-metrics-tls\") pod \"dns-default-rmzm5\" (UID: \"f28cb7aa-489c-4015-860c-9d925da5f135\") " pod="openshift-dns/dns-default-rmzm5" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.740318 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/60b58907-b6e9-4a6d-b442-9c79d839bac9-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-6xfbm\" (UID: \"60b58907-b6e9-4a6d-b442-9c79d839bac9\") " pod="openshift-authentication/oauth-openshift-558db77b4-6xfbm" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.740376 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e1c6d0f7-5a86-49fb-870d-991796812348-console-config\") pod \"console-f9d7485db-bx552\" (UID: \"e1c6d0f7-5a86-49fb-870d-991796812348\") " pod="openshift-console/console-f9d7485db-bx552" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.740423 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/843619ff-c0f4-4389-b3ef-62c282b20303-default-certificate\") pod \"router-default-5444994796-lpkx6\" (UID: 
\"843619ff-c0f4-4389-b3ef-62c282b20303\") " pod="openshift-ingress/router-default-5444994796-lpkx6" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.740523 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e1c6d0f7-5a86-49fb-870d-991796812348-oauth-serving-cert\") pod \"console-f9d7485db-bx552\" (UID: \"e1c6d0f7-5a86-49fb-870d-991796812348\") " pod="openshift-console/console-f9d7485db-bx552" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.740646 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/3266e10d-4bcd-4db0-aa15-53be39ecf437-etcd-ca\") pod \"etcd-operator-b45778765-4wgv6\" (UID: \"3266e10d-4bcd-4db0-aa15-53be39ecf437\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4wgv6" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.740750 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2jjc\" (UniqueName: \"kubernetes.io/projected/eff38aa7-d0b3-455b-b0ca-1034fc06a182-kube-api-access-m2jjc\") pod \"control-plane-machine-set-operator-78cbb6b69f-xkdq6\" (UID: \"eff38aa7-d0b3-455b-b0ca-1034fc06a182\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xkdq6" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.740809 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e1c6d0f7-5a86-49fb-870d-991796812348-console-oauth-config\") pod \"console-f9d7485db-bx552\" (UID: \"e1c6d0f7-5a86-49fb-870d-991796812348\") " pod="openshift-console/console-f9d7485db-bx552" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.740918 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/0a48b03b-402f-48b1-a3b7-52690850de42-ca-trust-extracted\") pod \"image-registry-697d97f7c8-rmx2d\" (UID: \"0a48b03b-402f-48b1-a3b7-52690850de42\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmx2d" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.740957 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7933e531-8015-4c1a-ba84-fe22aa094da0-bound-sa-token\") pod \"ingress-operator-5b745b69d9-g4ltp\" (UID: \"7933e531-8015-4c1a-ba84-fe22aa094da0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-g4ltp" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.740984 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0a48b03b-402f-48b1-a3b7-52690850de42-installation-pull-secrets\") pod \"image-registry-697d97f7c8-rmx2d\" (UID: \"0a48b03b-402f-48b1-a3b7-52690850de42\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmx2d" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.740968 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0a48b03b-402f-48b1-a3b7-52690850de42-registry-tls\") pod \"image-registry-697d97f7c8-rmx2d\" (UID: \"0a48b03b-402f-48b1-a3b7-52690850de42\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmx2d" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.741011 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wzzs\" (UniqueName: \"kubernetes.io/projected/f28cb7aa-489c-4015-860c-9d925da5f135-kube-api-access-8wzzs\") pod \"dns-default-rmzm5\" (UID: \"f28cb7aa-489c-4015-860c-9d925da5f135\") " pod="openshift-dns/dns-default-rmzm5" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.741065 4823 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/60b58907-b6e9-4a6d-b442-9c79d839bac9-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-6xfbm\" (UID: \"60b58907-b6e9-4a6d-b442-9c79d839bac9\") " pod="openshift-authentication/oauth-openshift-558db77b4-6xfbm" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.741096 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/ab7ea4d1-0209-45ec-a40f-3a7bb6bc20ac-profile-collector-cert\") pod \"olm-operator-6b444d44fb-5dtcg\" (UID: \"ab7ea4d1-0209-45ec-a40f-3a7bb6bc20ac\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5dtcg" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.741137 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7a5ef820-b734-4f0a-92f9-ffbfb5fd0d77-cert\") pod \"ingress-canary-c69wg\" (UID: \"7a5ef820-b734-4f0a-92f9-ffbfb5fd0d77\") " pod="openshift-ingress-canary/ingress-canary-c69wg" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.741157 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3266e10d-4bcd-4db0-aa15-53be39ecf437-etcd-client\") pod \"etcd-operator-b45778765-4wgv6\" (UID: \"3266e10d-4bcd-4db0-aa15-53be39ecf437\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4wgv6" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.741241 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/60b58907-b6e9-4a6d-b442-9c79d839bac9-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-6xfbm\" (UID: \"60b58907-b6e9-4a6d-b442-9c79d839bac9\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-6xfbm" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.741291 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/05f64e2f-791a-463e-8ec4-340732e212ee-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-qqqjd\" (UID: \"05f64e2f-791a-463e-8ec4-340732e212ee\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qqqjd" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.741313 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwbjs\" (UniqueName: \"kubernetes.io/projected/e1c6d0f7-5a86-49fb-870d-991796812348-kube-api-access-rwbjs\") pod \"console-f9d7485db-bx552\" (UID: \"e1c6d0f7-5a86-49fb-870d-991796812348\") " pod="openshift-console/console-f9d7485db-bx552" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.741319 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0a48b03b-402f-48b1-a3b7-52690850de42-ca-trust-extracted\") pod \"image-registry-697d97f7c8-rmx2d\" (UID: \"0a48b03b-402f-48b1-a3b7-52690850de42\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmx2d" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.741332 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/3266e10d-4bcd-4db0-aa15-53be39ecf437-etcd-service-ca\") pod \"etcd-operator-b45778765-4wgv6\" (UID: \"3266e10d-4bcd-4db0-aa15-53be39ecf437\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4wgv6" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.741377 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6frp\" (UniqueName: 
\"kubernetes.io/projected/60b58907-b6e9-4a6d-b442-9c79d839bac9-kube-api-access-d6frp\") pod \"oauth-openshift-558db77b4-6xfbm\" (UID: \"60b58907-b6e9-4a6d-b442-9c79d839bac9\") " pod="openshift-authentication/oauth-openshift-558db77b4-6xfbm" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.741545 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7933e531-8015-4c1a-ba84-fe22aa094da0-metrics-tls\") pod \"ingress-operator-5b745b69d9-g4ltp\" (UID: \"7933e531-8015-4c1a-ba84-fe22aa094da0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-g4ltp" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.741605 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rmx2d\" (UID: \"0a48b03b-402f-48b1-a3b7-52690850de42\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmx2d" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.741767 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/58900f48-68af-4712-bb30-9d832c26ce02-mountpoint-dir\") pod \"csi-hostpathplugin-4bxzg\" (UID: \"58900f48-68af-4712-bb30-9d832c26ce02\") " pod="hostpath-provisioner/csi-hostpathplugin-4bxzg" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.741963 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f4a7b98f-600d-49af-9917-29b38e8877a4-apiservice-cert\") pod \"packageserver-d55dfcdfc-d9xfk\" (UID: \"f4a7b98f-600d-49af-9917-29b38e8877a4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d9xfk" Dec 16 06:57:35 crc 
kubenswrapper[4823]: E1216 06:57:35.742046 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:57:36.242004445 +0000 UTC m=+134.730570748 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rmx2d" (UID: "0a48b03b-402f-48b1-a3b7-52690850de42") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.742136 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7c559cce-7066-4c52-ad84-9c748243f010-srv-cert\") pod \"catalog-operator-68c6474976-llvdl\" (UID: \"7c559cce-7066-4c52-ad84-9c748243f010\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-llvdl" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.742224 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b6b1d27b-235a-4b1e-adaa-512f3ae25954-config-volume\") pod \"collect-profiles-29431125-j4w8x\" (UID: \"b6b1d27b-235a-4b1e-adaa-512f3ae25954\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431125-j4w8x" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.742395 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kttpx\" (UniqueName: \"kubernetes.io/projected/b6b1d27b-235a-4b1e-adaa-512f3ae25954-kube-api-access-kttpx\") pod \"collect-profiles-29431125-j4w8x\" 
(UID: \"b6b1d27b-235a-4b1e-adaa-512f3ae25954\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431125-j4w8x" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.742463 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f6f3b211-3f2a-4f8e-8d2a-e68e8aaad955-trusted-ca\") pod \"console-operator-58897d9998-vnw5c\" (UID: \"f6f3b211-3f2a-4f8e-8d2a-e68e8aaad955\") " pod="openshift-console-operator/console-operator-58897d9998-vnw5c" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.742495 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/60b58907-b6e9-4a6d-b442-9c79d839bac9-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-6xfbm\" (UID: \"60b58907-b6e9-4a6d-b442-9c79d839bac9\") " pod="openshift-authentication/oauth-openshift-558db77b4-6xfbm" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.742520 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d68sv\" (UniqueName: \"kubernetes.io/projected/843619ff-c0f4-4389-b3ef-62c282b20303-kube-api-access-d68sv\") pod \"router-default-5444994796-lpkx6\" (UID: \"843619ff-c0f4-4389-b3ef-62c282b20303\") " pod="openshift-ingress/router-default-5444994796-lpkx6" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.742555 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/60b58907-b6e9-4a6d-b442-9c79d839bac9-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-6xfbm\" (UID: \"60b58907-b6e9-4a6d-b442-9c79d839bac9\") " pod="openshift-authentication/oauth-openshift-558db77b4-6xfbm" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.742582 4823 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfwqg\" (UniqueName: \"kubernetes.io/projected/ab7ea4d1-0209-45ec-a40f-3a7bb6bc20ac-kube-api-access-kfwqg\") pod \"olm-operator-6b444d44fb-5dtcg\" (UID: \"ab7ea4d1-0209-45ec-a40f-3a7bb6bc20ac\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5dtcg" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.742601 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hblgp\" (UniqueName: \"kubernetes.io/projected/7a5ef820-b734-4f0a-92f9-ffbfb5fd0d77-kube-api-access-hblgp\") pod \"ingress-canary-c69wg\" (UID: \"7a5ef820-b734-4f0a-92f9-ffbfb5fd0d77\") " pod="openshift-ingress-canary/ingress-canary-c69wg" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.742622 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e1c6d0f7-5a86-49fb-870d-991796812348-trusted-ca-bundle\") pod \"console-f9d7485db-bx552\" (UID: \"e1c6d0f7-5a86-49fb-870d-991796812348\") " pod="openshift-console/console-f9d7485db-bx552" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.742639 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/60b58907-b6e9-4a6d-b442-9c79d839bac9-audit-dir\") pod \"oauth-openshift-558db77b4-6xfbm\" (UID: \"60b58907-b6e9-4a6d-b442-9c79d839bac9\") " pod="openshift-authentication/oauth-openshift-558db77b4-6xfbm" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.742684 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e1c6d0f7-5a86-49fb-870d-991796812348-console-serving-cert\") pod \"console-f9d7485db-bx552\" (UID: \"e1c6d0f7-5a86-49fb-870d-991796812348\") " 
pod="openshift-console/console-f9d7485db-bx552" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.742961 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/843619ff-c0f4-4389-b3ef-62c282b20303-stats-auth\") pod \"router-default-5444994796-lpkx6\" (UID: \"843619ff-c0f4-4389-b3ef-62c282b20303\") " pod="openshift-ingress/router-default-5444994796-lpkx6" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.743055 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g474n\" (UniqueName: \"kubernetes.io/projected/0a48b03b-402f-48b1-a3b7-52690850de42-kube-api-access-g474n\") pod \"image-registry-697d97f7c8-rmx2d\" (UID: \"0a48b03b-402f-48b1-a3b7-52690850de42\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmx2d" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.743084 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/7b26d5bf-8529-4934-9b4f-96792ae9e62f-node-bootstrap-token\") pod \"machine-config-server-6bwk5\" (UID: \"7b26d5bf-8529-4934-9b4f-96792ae9e62f\") " pod="openshift-machine-config-operator/machine-config-server-6bwk5" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.743108 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/60b58907-b6e9-4a6d-b442-9c79d839bac9-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-6xfbm\" (UID: \"60b58907-b6e9-4a6d-b442-9c79d839bac9\") " pod="openshift-authentication/oauth-openshift-558db77b4-6xfbm" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.743134 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/f6f3b211-3f2a-4f8e-8d2a-e68e8aaad955-config\") pod \"console-operator-58897d9998-vnw5c\" (UID: \"f6f3b211-3f2a-4f8e-8d2a-e68e8aaad955\") " pod="openshift-console-operator/console-operator-58897d9998-vnw5c" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.743154 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7933e531-8015-4c1a-ba84-fe22aa094da0-trusted-ca\") pod \"ingress-operator-5b745b69d9-g4ltp\" (UID: \"7933e531-8015-4c1a-ba84-fe22aa094da0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-g4ltp" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.743187 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/58900f48-68af-4712-bb30-9d832c26ce02-registration-dir\") pod \"csi-hostpathplugin-4bxzg\" (UID: \"58900f48-68af-4712-bb30-9d832c26ce02\") " pod="hostpath-provisioner/csi-hostpathplugin-4bxzg" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.743210 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/60b58907-b6e9-4a6d-b442-9c79d839bac9-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-6xfbm\" (UID: \"60b58907-b6e9-4a6d-b442-9c79d839bac9\") " pod="openshift-authentication/oauth-openshift-558db77b4-6xfbm" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.743259 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5tr4r\" (UniqueName: \"kubernetes.io/projected/58900f48-68af-4712-bb30-9d832c26ce02-kube-api-access-5tr4r\") pod \"csi-hostpathplugin-4bxzg\" (UID: \"58900f48-68af-4712-bb30-9d832c26ce02\") " pod="hostpath-provisioner/csi-hostpathplugin-4bxzg" Dec 16 06:57:35 crc kubenswrapper[4823]: 
I1216 06:57:35.743280 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e1c6d0f7-5a86-49fb-870d-991796812348-service-ca\") pod \"console-f9d7485db-bx552\" (UID: \"e1c6d0f7-5a86-49fb-870d-991796812348\") " pod="openshift-console/console-f9d7485db-bx552" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.743299 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9125b2ca-3f9a-47d3-8422-cae6f85f36d1-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-8xpl2\" (UID: \"9125b2ca-3f9a-47d3-8422-cae6f85f36d1\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8xpl2" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.743379 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0a48b03b-402f-48b1-a3b7-52690850de42-bound-sa-token\") pod \"image-registry-697d97f7c8-rmx2d\" (UID: \"0a48b03b-402f-48b1-a3b7-52690850de42\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmx2d" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.743400 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgkh4\" (UniqueName: \"kubernetes.io/projected/f4a7b98f-600d-49af-9917-29b38e8877a4-kube-api-access-qgkh4\") pod \"packageserver-d55dfcdfc-d9xfk\" (UID: \"f4a7b98f-600d-49af-9917-29b38e8877a4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d9xfk" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.743418 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/60b58907-b6e9-4a6d-b442-9c79d839bac9-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-6xfbm\" (UID: \"60b58907-b6e9-4a6d-b442-9c79d839bac9\") " pod="openshift-authentication/oauth-openshift-558db77b4-6xfbm" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.743440 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3266e10d-4bcd-4db0-aa15-53be39ecf437-config\") pod \"etcd-operator-b45778765-4wgv6\" (UID: \"3266e10d-4bcd-4db0-aa15-53be39ecf437\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4wgv6" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.743464 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/58900f48-68af-4712-bb30-9d832c26ce02-plugins-dir\") pod \"csi-hostpathplugin-4bxzg\" (UID: \"58900f48-68af-4712-bb30-9d832c26ce02\") " pod="hostpath-provisioner/csi-hostpathplugin-4bxzg" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.743481 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/e2376625-99df-4517-a64a-6c8b1d7edc20-signing-key\") pod \"service-ca-9c57cc56f-mgbxj\" (UID: \"e2376625-99df-4517-a64a-6c8b1d7edc20\") " pod="openshift-service-ca/service-ca-9c57cc56f-mgbxj" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.743501 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9125b2ca-3f9a-47d3-8422-cae6f85f36d1-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-8xpl2\" (UID: \"9125b2ca-3f9a-47d3-8422-cae6f85f36d1\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8xpl2" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.743521 4823 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05f64e2f-791a-463e-8ec4-340732e212ee-config\") pod \"kube-apiserver-operator-766d6c64bb-qqqjd\" (UID: \"05f64e2f-791a-463e-8ec4-340732e212ee\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qqqjd" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.743539 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/05f64e2f-791a-463e-8ec4-340732e212ee-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-qqqjd\" (UID: \"05f64e2f-791a-463e-8ec4-340732e212ee\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qqqjd" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.743556 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f28cb7aa-489c-4015-860c-9d925da5f135-config-volume\") pod \"dns-default-rmzm5\" (UID: \"f28cb7aa-489c-4015-860c-9d925da5f135\") " pod="openshift-dns/dns-default-rmzm5" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.743572 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/60b58907-b6e9-4a6d-b442-9c79d839bac9-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-6xfbm\" (UID: \"60b58907-b6e9-4a6d-b442-9c79d839bac9\") " pod="openshift-authentication/oauth-openshift-558db77b4-6xfbm" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.745273 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9125b2ca-3f9a-47d3-8422-cae6f85f36d1-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-8xpl2\" (UID: 
\"9125b2ca-3f9a-47d3-8422-cae6f85f36d1\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8xpl2" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.745945 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.748435 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/ab7ea4d1-0209-45ec-a40f-3a7bb6bc20ac-profile-collector-cert\") pod \"olm-operator-6b444d44fb-5dtcg\" (UID: \"ab7ea4d1-0209-45ec-a40f-3a7bb6bc20ac\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5dtcg" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.749091 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/ab7ea4d1-0209-45ec-a40f-3a7bb6bc20ac-srv-cert\") pod \"olm-operator-6b444d44fb-5dtcg\" (UID: \"ab7ea4d1-0209-45ec-a40f-3a7bb6bc20ac\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5dtcg" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.750401 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9125b2ca-3f9a-47d3-8422-cae6f85f36d1-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-8xpl2\" (UID: \"9125b2ca-3f9a-47d3-8422-cae6f85f36d1\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8xpl2" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.752296 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0a48b03b-402f-48b1-a3b7-52690850de42-installation-pull-secrets\") pod \"image-registry-697d97f7c8-rmx2d\" (UID: \"0a48b03b-402f-48b1-a3b7-52690850de42\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-rmx2d" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.766007 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.806205 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.826470 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.845123 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:57:35 crc kubenswrapper[4823]: E1216 06:57:35.845405 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:57:36.345357344 +0000 UTC m=+134.833923467 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.845503 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pxqth\" (UniqueName: \"kubernetes.io/projected/e2376625-99df-4517-a64a-6c8b1d7edc20-kube-api-access-pxqth\") pod \"service-ca-9c57cc56f-mgbxj\" (UID: \"e2376625-99df-4517-a64a-6c8b1d7edc20\") " pod="openshift-service-ca/service-ca-9c57cc56f-mgbxj" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.845594 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/dca532ee-e66a-411a-afcc-646f96a22a62-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-thj57\" (UID: \"dca532ee-e66a-411a-afcc-646f96a22a62\") " pod="openshift-marketplace/marketplace-operator-79b997595-thj57" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.845643 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c46vr\" (UniqueName: \"kubernetes.io/projected/7c559cce-7066-4c52-ad84-9c748243f010-kube-api-access-c46vr\") pod \"catalog-operator-68c6474976-llvdl\" (UID: \"7c559cce-7066-4c52-ad84-9c748243f010\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-llvdl" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.845761 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vbdm\" (UniqueName: 
\"kubernetes.io/projected/f6f3b211-3f2a-4f8e-8d2a-e68e8aaad955-kube-api-access-8vbdm\") pod \"console-operator-58897d9998-vnw5c\" (UID: \"f6f3b211-3f2a-4f8e-8d2a-e68e8aaad955\") " pod="openshift-console-operator/console-operator-58897d9998-vnw5c" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.846084 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f4a7b98f-600d-49af-9917-29b38e8877a4-webhook-cert\") pod \"packageserver-d55dfcdfc-d9xfk\" (UID: \"f4a7b98f-600d-49af-9917-29b38e8877a4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d9xfk" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.846285 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a1c5f775-cbe4-44d3-8ef3-19dd69e27b47-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-2glj6\" (UID: \"a1c5f775-cbe4-44d3-8ef3-19dd69e27b47\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2glj6" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.846326 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/60b58907-b6e9-4a6d-b442-9c79d839bac9-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-6xfbm\" (UID: \"60b58907-b6e9-4a6d-b442-9c79d839bac9\") " pod="openshift-authentication/oauth-openshift-558db77b4-6xfbm" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.846381 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/eff38aa7-d0b3-455b-b0ca-1034fc06a182-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-xkdq6\" (UID: \"eff38aa7-d0b3-455b-b0ca-1034fc06a182\") " 
pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xkdq6" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.846409 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-blnkq\" (UniqueName: \"kubernetes.io/projected/dca532ee-e66a-411a-afcc-646f96a22a62-kube-api-access-blnkq\") pod \"marketplace-operator-79b997595-thj57\" (UID: \"dca532ee-e66a-411a-afcc-646f96a22a62\") " pod="openshift-marketplace/marketplace-operator-79b997595-thj57" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.846768 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.847088 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a1c5f775-cbe4-44d3-8ef3-19dd69e27b47-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-2glj6\" (UID: \"a1c5f775-cbe4-44d3-8ef3-19dd69e27b47\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2glj6" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.847121 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/e2376625-99df-4517-a64a-6c8b1d7edc20-signing-cabundle\") pod \"service-ca-9c57cc56f-mgbxj\" (UID: \"e2376625-99df-4517-a64a-6c8b1d7edc20\") " pod="openshift-service-ca/service-ca-9c57cc56f-mgbxj" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.847146 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f28cb7aa-489c-4015-860c-9d925da5f135-metrics-tls\") pod \"dns-default-rmzm5\" (UID: \"f28cb7aa-489c-4015-860c-9d925da5f135\") " pod="openshift-dns/dns-default-rmzm5" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.847177 4823 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/60b58907-b6e9-4a6d-b442-9c79d839bac9-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-6xfbm\" (UID: \"60b58907-b6e9-4a6d-b442-9c79d839bac9\") " pod="openshift-authentication/oauth-openshift-558db77b4-6xfbm" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.847213 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e1c6d0f7-5a86-49fb-870d-991796812348-console-config\") pod \"console-f9d7485db-bx552\" (UID: \"e1c6d0f7-5a86-49fb-870d-991796812348\") " pod="openshift-console/console-f9d7485db-bx552" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.847237 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/843619ff-c0f4-4389-b3ef-62c282b20303-default-certificate\") pod \"router-default-5444994796-lpkx6\" (UID: \"843619ff-c0f4-4389-b3ef-62c282b20303\") " pod="openshift-ingress/router-default-5444994796-lpkx6" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.847327 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e1c6d0f7-5a86-49fb-870d-991796812348-oauth-serving-cert\") pod \"console-f9d7485db-bx552\" (UID: \"e1c6d0f7-5a86-49fb-870d-991796812348\") " pod="openshift-console/console-f9d7485db-bx552" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.847359 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/3266e10d-4bcd-4db0-aa15-53be39ecf437-etcd-ca\") pod \"etcd-operator-b45778765-4wgv6\" (UID: \"3266e10d-4bcd-4db0-aa15-53be39ecf437\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4wgv6" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 
06:57:35.847399 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2jjc\" (UniqueName: \"kubernetes.io/projected/eff38aa7-d0b3-455b-b0ca-1034fc06a182-kube-api-access-m2jjc\") pod \"control-plane-machine-set-operator-78cbb6b69f-xkdq6\" (UID: \"eff38aa7-d0b3-455b-b0ca-1034fc06a182\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xkdq6" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.847430 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e1c6d0f7-5a86-49fb-870d-991796812348-console-oauth-config\") pod \"console-f9d7485db-bx552\" (UID: \"e1c6d0f7-5a86-49fb-870d-991796812348\") " pod="openshift-console/console-f9d7485db-bx552" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.847476 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7933e531-8015-4c1a-ba84-fe22aa094da0-bound-sa-token\") pod \"ingress-operator-5b745b69d9-g4ltp\" (UID: \"7933e531-8015-4c1a-ba84-fe22aa094da0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-g4ltp" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.847500 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wzzs\" (UniqueName: \"kubernetes.io/projected/f28cb7aa-489c-4015-860c-9d925da5f135-kube-api-access-8wzzs\") pod \"dns-default-rmzm5\" (UID: \"f28cb7aa-489c-4015-860c-9d925da5f135\") " pod="openshift-dns/dns-default-rmzm5" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.847524 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/60b58907-b6e9-4a6d-b442-9c79d839bac9-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-6xfbm\" (UID: \"60b58907-b6e9-4a6d-b442-9c79d839bac9\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-6xfbm" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.847555 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7a5ef820-b734-4f0a-92f9-ffbfb5fd0d77-cert\") pod \"ingress-canary-c69wg\" (UID: \"7a5ef820-b734-4f0a-92f9-ffbfb5fd0d77\") " pod="openshift-ingress-canary/ingress-canary-c69wg" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.847577 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3266e10d-4bcd-4db0-aa15-53be39ecf437-etcd-client\") pod \"etcd-operator-b45778765-4wgv6\" (UID: \"3266e10d-4bcd-4db0-aa15-53be39ecf437\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4wgv6" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.847599 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/60b58907-b6e9-4a6d-b442-9c79d839bac9-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-6xfbm\" (UID: \"60b58907-b6e9-4a6d-b442-9c79d839bac9\") " pod="openshift-authentication/oauth-openshift-558db77b4-6xfbm" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.847627 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/05f64e2f-791a-463e-8ec4-340732e212ee-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-qqqjd\" (UID: \"05f64e2f-791a-463e-8ec4-340732e212ee\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qqqjd" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.847650 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwbjs\" (UniqueName: \"kubernetes.io/projected/e1c6d0f7-5a86-49fb-870d-991796812348-kube-api-access-rwbjs\") pod \"console-f9d7485db-bx552\" 
(UID: \"e1c6d0f7-5a86-49fb-870d-991796812348\") " pod="openshift-console/console-f9d7485db-bx552" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.847673 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/3266e10d-4bcd-4db0-aa15-53be39ecf437-etcd-service-ca\") pod \"etcd-operator-b45778765-4wgv6\" (UID: \"3266e10d-4bcd-4db0-aa15-53be39ecf437\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4wgv6" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.847695 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6frp\" (UniqueName: \"kubernetes.io/projected/60b58907-b6e9-4a6d-b442-9c79d839bac9-kube-api-access-d6frp\") pod \"oauth-openshift-558db77b4-6xfbm\" (UID: \"60b58907-b6e9-4a6d-b442-9c79d839bac9\") " pod="openshift-authentication/oauth-openshift-558db77b4-6xfbm" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.847728 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7933e531-8015-4c1a-ba84-fe22aa094da0-metrics-tls\") pod \"ingress-operator-5b745b69d9-g4ltp\" (UID: \"7933e531-8015-4c1a-ba84-fe22aa094da0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-g4ltp" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.847759 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rmx2d\" (UID: \"0a48b03b-402f-48b1-a3b7-52690850de42\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmx2d" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.847782 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: 
\"kubernetes.io/secret/7c559cce-7066-4c52-ad84-9c748243f010-srv-cert\") pod \"catalog-operator-68c6474976-llvdl\" (UID: \"7c559cce-7066-4c52-ad84-9c748243f010\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-llvdl" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.847800 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b6b1d27b-235a-4b1e-adaa-512f3ae25954-config-volume\") pod \"collect-profiles-29431125-j4w8x\" (UID: \"b6b1d27b-235a-4b1e-adaa-512f3ae25954\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431125-j4w8x" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.847829 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/58900f48-68af-4712-bb30-9d832c26ce02-mountpoint-dir\") pod \"csi-hostpathplugin-4bxzg\" (UID: \"58900f48-68af-4712-bb30-9d832c26ce02\") " pod="hostpath-provisioner/csi-hostpathplugin-4bxzg" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.847847 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f4a7b98f-600d-49af-9917-29b38e8877a4-apiservice-cert\") pod \"packageserver-d55dfcdfc-d9xfk\" (UID: \"f4a7b98f-600d-49af-9917-29b38e8877a4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d9xfk" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.847868 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kttpx\" (UniqueName: \"kubernetes.io/projected/b6b1d27b-235a-4b1e-adaa-512f3ae25954-kube-api-access-kttpx\") pod \"collect-profiles-29431125-j4w8x\" (UID: \"b6b1d27b-235a-4b1e-adaa-512f3ae25954\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431125-j4w8x" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.847885 4823 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f6f3b211-3f2a-4f8e-8d2a-e68e8aaad955-trusted-ca\") pod \"console-operator-58897d9998-vnw5c\" (UID: \"f6f3b211-3f2a-4f8e-8d2a-e68e8aaad955\") " pod="openshift-console-operator/console-operator-58897d9998-vnw5c" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.848013 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/60b58907-b6e9-4a6d-b442-9c79d839bac9-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-6xfbm\" (UID: \"60b58907-b6e9-4a6d-b442-9c79d839bac9\") " pod="openshift-authentication/oauth-openshift-558db77b4-6xfbm" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.848051 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d68sv\" (UniqueName: \"kubernetes.io/projected/843619ff-c0f4-4389-b3ef-62c282b20303-kube-api-access-d68sv\") pod \"router-default-5444994796-lpkx6\" (UID: \"843619ff-c0f4-4389-b3ef-62c282b20303\") " pod="openshift-ingress/router-default-5444994796-lpkx6" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.848107 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/60b58907-b6e9-4a6d-b442-9c79d839bac9-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-6xfbm\" (UID: \"60b58907-b6e9-4a6d-b442-9c79d839bac9\") " pod="openshift-authentication/oauth-openshift-558db77b4-6xfbm" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.848142 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hblgp\" (UniqueName: \"kubernetes.io/projected/7a5ef820-b734-4f0a-92f9-ffbfb5fd0d77-kube-api-access-hblgp\") pod \"ingress-canary-c69wg\" (UID: \"7a5ef820-b734-4f0a-92f9-ffbfb5fd0d77\") " 
pod="openshift-ingress-canary/ingress-canary-c69wg" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.848160 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e1c6d0f7-5a86-49fb-870d-991796812348-trusted-ca-bundle\") pod \"console-f9d7485db-bx552\" (UID: \"e1c6d0f7-5a86-49fb-870d-991796812348\") " pod="openshift-console/console-f9d7485db-bx552" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.848178 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/60b58907-b6e9-4a6d-b442-9c79d839bac9-audit-dir\") pod \"oauth-openshift-558db77b4-6xfbm\" (UID: \"60b58907-b6e9-4a6d-b442-9c79d839bac9\") " pod="openshift-authentication/oauth-openshift-558db77b4-6xfbm" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.848206 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e1c6d0f7-5a86-49fb-870d-991796812348-console-serving-cert\") pod \"console-f9d7485db-bx552\" (UID: \"e1c6d0f7-5a86-49fb-870d-991796812348\") " pod="openshift-console/console-f9d7485db-bx552" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.848226 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/843619ff-c0f4-4389-b3ef-62c282b20303-stats-auth\") pod \"router-default-5444994796-lpkx6\" (UID: \"843619ff-c0f4-4389-b3ef-62c282b20303\") " pod="openshift-ingress/router-default-5444994796-lpkx6" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.848255 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/7b26d5bf-8529-4934-9b4f-96792ae9e62f-node-bootstrap-token\") pod \"machine-config-server-6bwk5\" (UID: \"7b26d5bf-8529-4934-9b4f-96792ae9e62f\") " 
pod="openshift-machine-config-operator/machine-config-server-6bwk5" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.848276 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/60b58907-b6e9-4a6d-b442-9c79d839bac9-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-6xfbm\" (UID: \"60b58907-b6e9-4a6d-b442-9c79d839bac9\") " pod="openshift-authentication/oauth-openshift-558db77b4-6xfbm" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.848297 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6f3b211-3f2a-4f8e-8d2a-e68e8aaad955-config\") pod \"console-operator-58897d9998-vnw5c\" (UID: \"f6f3b211-3f2a-4f8e-8d2a-e68e8aaad955\") " pod="openshift-console-operator/console-operator-58897d9998-vnw5c" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.848313 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7933e531-8015-4c1a-ba84-fe22aa094da0-trusted-ca\") pod \"ingress-operator-5b745b69d9-g4ltp\" (UID: \"7933e531-8015-4c1a-ba84-fe22aa094da0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-g4ltp" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.848337 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/58900f48-68af-4712-bb30-9d832c26ce02-registration-dir\") pod \"csi-hostpathplugin-4bxzg\" (UID: \"58900f48-68af-4712-bb30-9d832c26ce02\") " pod="hostpath-provisioner/csi-hostpathplugin-4bxzg" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.848353 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/60b58907-b6e9-4a6d-b442-9c79d839bac9-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-6xfbm\" (UID: \"60b58907-b6e9-4a6d-b442-9c79d839bac9\") " pod="openshift-authentication/oauth-openshift-558db77b4-6xfbm" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.848383 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5tr4r\" (UniqueName: \"kubernetes.io/projected/58900f48-68af-4712-bb30-9d832c26ce02-kube-api-access-5tr4r\") pod \"csi-hostpathplugin-4bxzg\" (UID: \"58900f48-68af-4712-bb30-9d832c26ce02\") " pod="hostpath-provisioner/csi-hostpathplugin-4bxzg" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.848401 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e1c6d0f7-5a86-49fb-870d-991796812348-service-ca\") pod \"console-f9d7485db-bx552\" (UID: \"e1c6d0f7-5a86-49fb-870d-991796812348\") " pod="openshift-console/console-f9d7485db-bx552" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.848441 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qgkh4\" (UniqueName: \"kubernetes.io/projected/f4a7b98f-600d-49af-9917-29b38e8877a4-kube-api-access-qgkh4\") pod \"packageserver-d55dfcdfc-d9xfk\" (UID: \"f4a7b98f-600d-49af-9917-29b38e8877a4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d9xfk" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.848458 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/60b58907-b6e9-4a6d-b442-9c79d839bac9-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-6xfbm\" (UID: \"60b58907-b6e9-4a6d-b442-9c79d839bac9\") " pod="openshift-authentication/oauth-openshift-558db77b4-6xfbm" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.848463 4823 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/e2376625-99df-4517-a64a-6c8b1d7edc20-signing-cabundle\") pod \"service-ca-9c57cc56f-mgbxj\" (UID: \"e2376625-99df-4517-a64a-6c8b1d7edc20\") " pod="openshift-service-ca/service-ca-9c57cc56f-mgbxj" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.848479 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3266e10d-4bcd-4db0-aa15-53be39ecf437-config\") pod \"etcd-operator-b45778765-4wgv6\" (UID: \"3266e10d-4bcd-4db0-aa15-53be39ecf437\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4wgv6" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.848538 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/58900f48-68af-4712-bb30-9d832c26ce02-plugins-dir\") pod \"csi-hostpathplugin-4bxzg\" (UID: \"58900f48-68af-4712-bb30-9d832c26ce02\") " pod="hostpath-provisioner/csi-hostpathplugin-4bxzg" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.848576 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/e2376625-99df-4517-a64a-6c8b1d7edc20-signing-key\") pod \"service-ca-9c57cc56f-mgbxj\" (UID: \"e2376625-99df-4517-a64a-6c8b1d7edc20\") " pod="openshift-service-ca/service-ca-9c57cc56f-mgbxj" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.848603 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05f64e2f-791a-463e-8ec4-340732e212ee-config\") pod \"kube-apiserver-operator-766d6c64bb-qqqjd\" (UID: \"05f64e2f-791a-463e-8ec4-340732e212ee\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qqqjd" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.848631 4823 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/05f64e2f-791a-463e-8ec4-340732e212ee-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-qqqjd\" (UID: \"05f64e2f-791a-463e-8ec4-340732e212ee\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qqqjd" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.848657 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f28cb7aa-489c-4015-860c-9d925da5f135-config-volume\") pod \"dns-default-rmzm5\" (UID: \"f28cb7aa-489c-4015-860c-9d925da5f135\") " pod="openshift-dns/dns-default-rmzm5" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.848683 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/60b58907-b6e9-4a6d-b442-9c79d839bac9-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-6xfbm\" (UID: \"60b58907-b6e9-4a6d-b442-9c79d839bac9\") " pod="openshift-authentication/oauth-openshift-558db77b4-6xfbm" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.848726 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whxb2\" (UniqueName: \"kubernetes.io/projected/10d2c2db-5e69-4c14-92be-c6c7f6c9b04a-kube-api-access-whxb2\") pod \"service-ca-operator-777779d784-jmp8w\" (UID: \"10d2c2db-5e69-4c14-92be-c6c7f6c9b04a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-jmp8w" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.848756 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2s8mw\" (UniqueName: \"kubernetes.io/projected/7933e531-8015-4c1a-ba84-fe22aa094da0-kube-api-access-2s8mw\") pod \"ingress-operator-5b745b69d9-g4ltp\" (UID: \"7933e531-8015-4c1a-ba84-fe22aa094da0\") " 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-g4ltp" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.848784 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/60b58907-b6e9-4a6d-b442-9c79d839bac9-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-6xfbm\" (UID: \"60b58907-b6e9-4a6d-b442-9c79d839bac9\") " pod="openshift-authentication/oauth-openshift-558db77b4-6xfbm" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.848813 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f6f3b211-3f2a-4f8e-8d2a-e68e8aaad955-serving-cert\") pod \"console-operator-58897d9998-vnw5c\" (UID: \"f6f3b211-3f2a-4f8e-8d2a-e68e8aaad955\") " pod="openshift-console-operator/console-operator-58897d9998-vnw5c" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.848839 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3266e10d-4bcd-4db0-aa15-53be39ecf437-serving-cert\") pod \"etcd-operator-b45778765-4wgv6\" (UID: \"3266e10d-4bcd-4db0-aa15-53be39ecf437\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4wgv6" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.848870 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10d2c2db-5e69-4c14-92be-c6c7f6c9b04a-config\") pod \"service-ca-operator-777779d784-jmp8w\" (UID: \"10d2c2db-5e69-4c14-92be-c6c7f6c9b04a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-jmp8w" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.848895 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dca532ee-e66a-411a-afcc-646f96a22a62-marketplace-trusted-ca\") 
pod \"marketplace-operator-79b997595-thj57\" (UID: \"dca532ee-e66a-411a-afcc-646f96a22a62\") " pod="openshift-marketplace/marketplace-operator-79b997595-thj57" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.848921 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/10d2c2db-5e69-4c14-92be-c6c7f6c9b04a-serving-cert\") pod \"service-ca-operator-777779d784-jmp8w\" (UID: \"10d2c2db-5e69-4c14-92be-c6c7f6c9b04a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-jmp8w" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.848950 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9qsr\" (UniqueName: \"kubernetes.io/projected/7b26d5bf-8529-4934-9b4f-96792ae9e62f-kube-api-access-l9qsr\") pod \"machine-config-server-6bwk5\" (UID: \"7b26d5bf-8529-4934-9b4f-96792ae9e62f\") " pod="openshift-machine-config-operator/machine-config-server-6bwk5" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.848980 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/60b58907-b6e9-4a6d-b442-9c79d839bac9-audit-policies\") pod \"oauth-openshift-558db77b4-6xfbm\" (UID: \"60b58907-b6e9-4a6d-b442-9c79d839bac9\") " pod="openshift-authentication/oauth-openshift-558db77b4-6xfbm" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.849007 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/843619ff-c0f4-4389-b3ef-62c282b20303-service-ca-bundle\") pod \"router-default-5444994796-lpkx6\" (UID: \"843619ff-c0f4-4389-b3ef-62c282b20303\") " pod="openshift-ingress/router-default-5444994796-lpkx6" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.849062 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" 
(UniqueName: \"kubernetes.io/secret/7c559cce-7066-4c52-ad84-9c748243f010-profile-collector-cert\") pod \"catalog-operator-68c6474976-llvdl\" (UID: \"7c559cce-7066-4c52-ad84-9c748243f010\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-llvdl" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.849092 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjzsr\" (UniqueName: \"kubernetes.io/projected/553e9a81-ff35-4dd6-bbb7-d55749d88e45-kube-api-access-hjzsr\") pod \"multus-admission-controller-857f4d67dd-9lh5d\" (UID: \"553e9a81-ff35-4dd6-bbb7-d55749d88e45\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-9lh5d" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.849103 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3266e10d-4bcd-4db0-aa15-53be39ecf437-config\") pod \"etcd-operator-b45778765-4wgv6\" (UID: \"3266e10d-4bcd-4db0-aa15-53be39ecf437\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4wgv6" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.849123 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/553e9a81-ff35-4dd6-bbb7-d55749d88e45-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-9lh5d\" (UID: \"553e9a81-ff35-4dd6-bbb7-d55749d88e45\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-9lh5d" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.849156 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/58900f48-68af-4712-bb30-9d832c26ce02-socket-dir\") pod \"csi-hostpathplugin-4bxzg\" (UID: \"58900f48-68af-4712-bb30-9d832c26ce02\") " pod="hostpath-provisioner/csi-hostpathplugin-4bxzg" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.849186 4823 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/843619ff-c0f4-4389-b3ef-62c282b20303-metrics-certs\") pod \"router-default-5444994796-lpkx6\" (UID: \"843619ff-c0f4-4389-b3ef-62c282b20303\") " pod="openshift-ingress/router-default-5444994796-lpkx6" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.849213 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/7b26d5bf-8529-4934-9b4f-96792ae9e62f-certs\") pod \"machine-config-server-6bwk5\" (UID: \"7b26d5bf-8529-4934-9b4f-96792ae9e62f\") " pod="openshift-machine-config-operator/machine-config-server-6bwk5" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.849243 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kr2rl\" (UniqueName: \"kubernetes.io/projected/3266e10d-4bcd-4db0-aa15-53be39ecf437-kube-api-access-kr2rl\") pod \"etcd-operator-b45778765-4wgv6\" (UID: \"3266e10d-4bcd-4db0-aa15-53be39ecf437\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4wgv6" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.849281 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/f4a7b98f-600d-49af-9917-29b38e8877a4-tmpfs\") pod \"packageserver-d55dfcdfc-d9xfk\" (UID: \"f4a7b98f-600d-49af-9917-29b38e8877a4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d9xfk" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.849308 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/58900f48-68af-4712-bb30-9d832c26ce02-csi-data-dir\") pod \"csi-hostpathplugin-4bxzg\" (UID: \"58900f48-68af-4712-bb30-9d832c26ce02\") " pod="hostpath-provisioner/csi-hostpathplugin-4bxzg" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.849337 4823 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1c5f775-cbe4-44d3-8ef3-19dd69e27b47-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-2glj6\" (UID: \"a1c5f775-cbe4-44d3-8ef3-19dd69e27b47\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2glj6" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.849368 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b6b1d27b-235a-4b1e-adaa-512f3ae25954-secret-volume\") pod \"collect-profiles-29431125-j4w8x\" (UID: \"b6b1d27b-235a-4b1e-adaa-512f3ae25954\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431125-j4w8x" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.850459 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/60b58907-b6e9-4a6d-b442-9c79d839bac9-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-6xfbm\" (UID: \"60b58907-b6e9-4a6d-b442-9c79d839bac9\") " pod="openshift-authentication/oauth-openshift-558db77b4-6xfbm" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.851508 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6f3b211-3f2a-4f8e-8d2a-e68e8aaad955-config\") pod \"console-operator-58897d9998-vnw5c\" (UID: \"f6f3b211-3f2a-4f8e-8d2a-e68e8aaad955\") " pod="openshift-console-operator/console-operator-58897d9998-vnw5c" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.851601 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/60b58907-b6e9-4a6d-b442-9c79d839bac9-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-6xfbm\" (UID: \"60b58907-b6e9-4a6d-b442-9c79d839bac9\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-6xfbm" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.851681 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e1c6d0f7-5a86-49fb-870d-991796812348-trusted-ca-bundle\") pod \"console-f9d7485db-bx552\" (UID: \"e1c6d0f7-5a86-49fb-870d-991796812348\") " pod="openshift-console/console-f9d7485db-bx552" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.851759 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/60b58907-b6e9-4a6d-b442-9c79d839bac9-audit-dir\") pod \"oauth-openshift-558db77b4-6xfbm\" (UID: \"60b58907-b6e9-4a6d-b442-9c79d839bac9\") " pod="openshift-authentication/oauth-openshift-558db77b4-6xfbm" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.851981 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/60b58907-b6e9-4a6d-b442-9c79d839bac9-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-6xfbm\" (UID: \"60b58907-b6e9-4a6d-b442-9c79d839bac9\") " pod="openshift-authentication/oauth-openshift-558db77b4-6xfbm" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.852259 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e1c6d0f7-5a86-49fb-870d-991796812348-console-config\") pod \"console-f9d7485db-bx552\" (UID: \"e1c6d0f7-5a86-49fb-870d-991796812348\") " pod="openshift-console/console-f9d7485db-bx552" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.852650 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b6b1d27b-235a-4b1e-adaa-512f3ae25954-secret-volume\") pod \"collect-profiles-29431125-j4w8x\" (UID: 
\"b6b1d27b-235a-4b1e-adaa-512f3ae25954\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431125-j4w8x" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.852707 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a1c5f775-cbe4-44d3-8ef3-19dd69e27b47-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-2glj6\" (UID: \"a1c5f775-cbe4-44d3-8ef3-19dd69e27b47\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2glj6" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.852707 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f6f3b211-3f2a-4f8e-8d2a-e68e8aaad955-trusted-ca\") pod \"console-operator-58897d9998-vnw5c\" (UID: \"f6f3b211-3f2a-4f8e-8d2a-e68e8aaad955\") " pod="openshift-console-operator/console-operator-58897d9998-vnw5c" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.849215 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/58900f48-68af-4712-bb30-9d832c26ce02-mountpoint-dir\") pod \"csi-hostpathplugin-4bxzg\" (UID: \"58900f48-68af-4712-bb30-9d832c26ce02\") " pod="hostpath-provisioner/csi-hostpathplugin-4bxzg" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.853790 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/eff38aa7-d0b3-455b-b0ca-1034fc06a182-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-xkdq6\" (UID: \"eff38aa7-d0b3-455b-b0ca-1034fc06a182\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xkdq6" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.855000 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/f28cb7aa-489c-4015-860c-9d925da5f135-metrics-tls\") pod \"dns-default-rmzm5\" (UID: \"f28cb7aa-489c-4015-860c-9d925da5f135\") " pod="openshift-dns/dns-default-rmzm5" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.855207 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/60b58907-b6e9-4a6d-b442-9c79d839bac9-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-6xfbm\" (UID: \"60b58907-b6e9-4a6d-b442-9c79d839bac9\") " pod="openshift-authentication/oauth-openshift-558db77b4-6xfbm" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.855340 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7c559cce-7066-4c52-ad84-9c748243f010-srv-cert\") pod \"catalog-operator-68c6474976-llvdl\" (UID: \"7c559cce-7066-4c52-ad84-9c748243f010\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-llvdl" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.855445 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/58900f48-68af-4712-bb30-9d832c26ce02-csi-data-dir\") pod \"csi-hostpathplugin-4bxzg\" (UID: \"58900f48-68af-4712-bb30-9d832c26ce02\") " pod="hostpath-provisioner/csi-hostpathplugin-4bxzg" Dec 16 06:57:35 crc kubenswrapper[4823]: E1216 06:57:35.855472 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:57:36.355453186 +0000 UTC m=+134.844019309 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rmx2d" (UID: "0a48b03b-402f-48b1-a3b7-52690850de42") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.855794 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/f4a7b98f-600d-49af-9917-29b38e8877a4-tmpfs\") pod \"packageserver-d55dfcdfc-d9xfk\" (UID: \"f4a7b98f-600d-49af-9917-29b38e8877a4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d9xfk" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.856240 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e1c6d0f7-5a86-49fb-870d-991796812348-console-serving-cert\") pod \"console-f9d7485db-bx552\" (UID: \"e1c6d0f7-5a86-49fb-870d-991796812348\") " pod="openshift-console/console-f9d7485db-bx552" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.856383 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e1c6d0f7-5a86-49fb-870d-991796812348-console-oauth-config\") pod \"console-f9d7485db-bx552\" (UID: \"e1c6d0f7-5a86-49fb-870d-991796812348\") " pod="openshift-console/console-f9d7485db-bx552" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.856436 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1c5f775-cbe4-44d3-8ef3-19dd69e27b47-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-2glj6\" (UID: \"a1c5f775-cbe4-44d3-8ef3-19dd69e27b47\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2glj6" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.856442 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/58900f48-68af-4712-bb30-9d832c26ce02-plugins-dir\") pod \"csi-hostpathplugin-4bxzg\" (UID: \"58900f48-68af-4712-bb30-9d832c26ce02\") " pod="hostpath-provisioner/csi-hostpathplugin-4bxzg" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.856450 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/58900f48-68af-4712-bb30-9d832c26ce02-socket-dir\") pod \"csi-hostpathplugin-4bxzg\" (UID: \"58900f48-68af-4712-bb30-9d832c26ce02\") " pod="hostpath-provisioner/csi-hostpathplugin-4bxzg" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.856563 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/58900f48-68af-4712-bb30-9d832c26ce02-registration-dir\") pod \"csi-hostpathplugin-4bxzg\" (UID: \"58900f48-68af-4712-bb30-9d832c26ce02\") " pod="hostpath-provisioner/csi-hostpathplugin-4bxzg" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.856590 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/60b58907-b6e9-4a6d-b442-9c79d839bac9-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-6xfbm\" (UID: \"60b58907-b6e9-4a6d-b442-9c79d839bac9\") " pod="openshift-authentication/oauth-openshift-558db77b4-6xfbm" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.857240 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10d2c2db-5e69-4c14-92be-c6c7f6c9b04a-config\") pod \"service-ca-operator-777779d784-jmp8w\" (UID: \"10d2c2db-5e69-4c14-92be-c6c7f6c9b04a\") " 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-jmp8w" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.857463 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/10d2c2db-5e69-4c14-92be-c6c7f6c9b04a-serving-cert\") pod \"service-ca-operator-777779d784-jmp8w\" (UID: \"10d2c2db-5e69-4c14-92be-c6c7f6c9b04a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-jmp8w" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.857724 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/60b58907-b6e9-4a6d-b442-9c79d839bac9-audit-policies\") pod \"oauth-openshift-558db77b4-6xfbm\" (UID: \"60b58907-b6e9-4a6d-b442-9c79d839bac9\") " pod="openshift-authentication/oauth-openshift-558db77b4-6xfbm" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.857851 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7933e531-8015-4c1a-ba84-fe22aa094da0-trusted-ca\") pod \"ingress-operator-5b745b69d9-g4ltp\" (UID: \"7933e531-8015-4c1a-ba84-fe22aa094da0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-g4ltp" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.857975 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05f64e2f-791a-463e-8ec4-340732e212ee-config\") pod \"kube-apiserver-operator-766d6c64bb-qqqjd\" (UID: \"05f64e2f-791a-463e-8ec4-340732e212ee\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qqqjd" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.858630 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e1c6d0f7-5a86-49fb-870d-991796812348-service-ca\") pod \"console-f9d7485db-bx552\" (UID: 
\"e1c6d0f7-5a86-49fb-870d-991796812348\") " pod="openshift-console/console-f9d7485db-bx552" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.858706 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/843619ff-c0f4-4389-b3ef-62c282b20303-service-ca-bundle\") pod \"router-default-5444994796-lpkx6\" (UID: \"843619ff-c0f4-4389-b3ef-62c282b20303\") " pod="openshift-ingress/router-default-5444994796-lpkx6" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.858882 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/60b58907-b6e9-4a6d-b442-9c79d839bac9-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-6xfbm\" (UID: \"60b58907-b6e9-4a6d-b442-9c79d839bac9\") " pod="openshift-authentication/oauth-openshift-558db77b4-6xfbm" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.859246 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/843619ff-c0f4-4389-b3ef-62c282b20303-metrics-certs\") pod \"router-default-5444994796-lpkx6\" (UID: \"843619ff-c0f4-4389-b3ef-62c282b20303\") " pod="openshift-ingress/router-default-5444994796-lpkx6" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.859419 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f28cb7aa-489c-4015-860c-9d925da5f135-config-volume\") pod \"dns-default-rmzm5\" (UID: \"f28cb7aa-489c-4015-860c-9d925da5f135\") " pod="openshift-dns/dns-default-rmzm5" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.859626 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/60b58907-b6e9-4a6d-b442-9c79d839bac9-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-6xfbm\" 
(UID: \"60b58907-b6e9-4a6d-b442-9c79d839bac9\") " pod="openshift-authentication/oauth-openshift-558db77b4-6xfbm" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.859674 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/60b58907-b6e9-4a6d-b442-9c79d839bac9-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-6xfbm\" (UID: \"60b58907-b6e9-4a6d-b442-9c79d839bac9\") " pod="openshift-authentication/oauth-openshift-558db77b4-6xfbm" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.861112 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/7c559cce-7066-4c52-ad84-9c748243f010-profile-collector-cert\") pod \"catalog-operator-68c6474976-llvdl\" (UID: \"7c559cce-7066-4c52-ad84-9c748243f010\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-llvdl" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.861364 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/553e9a81-ff35-4dd6-bbb7-d55749d88e45-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-9lh5d\" (UID: \"553e9a81-ff35-4dd6-bbb7-d55749d88e45\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-9lh5d" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.861406 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e1c6d0f7-5a86-49fb-870d-991796812348-oauth-serving-cert\") pod \"console-f9d7485db-bx552\" (UID: \"e1c6d0f7-5a86-49fb-870d-991796812348\") " pod="openshift-console/console-f9d7485db-bx552" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.861496 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: 
\"kubernetes.io/secret/e2376625-99df-4517-a64a-6c8b1d7edc20-signing-key\") pod \"service-ca-9c57cc56f-mgbxj\" (UID: \"e2376625-99df-4517-a64a-6c8b1d7edc20\") " pod="openshift-service-ca/service-ca-9c57cc56f-mgbxj" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.861779 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/60b58907-b6e9-4a6d-b442-9c79d839bac9-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-6xfbm\" (UID: \"60b58907-b6e9-4a6d-b442-9c79d839bac9\") " pod="openshift-authentication/oauth-openshift-558db77b4-6xfbm" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.862458 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f6f3b211-3f2a-4f8e-8d2a-e68e8aaad955-serving-cert\") pod \"console-operator-58897d9998-vnw5c\" (UID: \"f6f3b211-3f2a-4f8e-8d2a-e68e8aaad955\") " pod="openshift-console-operator/console-operator-58897d9998-vnw5c" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.862813 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/05f64e2f-791a-463e-8ec4-340732e212ee-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-qqqjd\" (UID: \"05f64e2f-791a-463e-8ec4-340732e212ee\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qqqjd" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.863216 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/843619ff-c0f4-4389-b3ef-62c282b20303-stats-auth\") pod \"router-default-5444994796-lpkx6\" (UID: \"843619ff-c0f4-4389-b3ef-62c282b20303\") " pod="openshift-ingress/router-default-5444994796-lpkx6" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.864164 4823 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/843619ff-c0f4-4389-b3ef-62c282b20303-default-certificate\") pod \"router-default-5444994796-lpkx6\" (UID: \"843619ff-c0f4-4389-b3ef-62c282b20303\") " pod="openshift-ingress/router-default-5444994796-lpkx6" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.864742 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7933e531-8015-4c1a-ba84-fe22aa094da0-metrics-tls\") pod \"ingress-operator-5b745b69d9-g4ltp\" (UID: \"7933e531-8015-4c1a-ba84-fe22aa094da0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-g4ltp" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.866058 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/60b58907-b6e9-4a6d-b442-9c79d839bac9-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-6xfbm\" (UID: \"60b58907-b6e9-4a6d-b442-9c79d839bac9\") " pod="openshift-authentication/oauth-openshift-558db77b4-6xfbm" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.866255 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.866628 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/60b58907-b6e9-4a6d-b442-9c79d839bac9-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-6xfbm\" (UID: \"60b58907-b6e9-4a6d-b442-9c79d839bac9\") " pod="openshift-authentication/oauth-openshift-558db77b4-6xfbm" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.871901 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/3266e10d-4bcd-4db0-aa15-53be39ecf437-etcd-ca\") pod \"etcd-operator-b45778765-4wgv6\" (UID: 
\"3266e10d-4bcd-4db0-aa15-53be39ecf437\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4wgv6" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.885436 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.891013 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/3266e10d-4bcd-4db0-aa15-53be39ecf437-etcd-service-ca\") pod \"etcd-operator-b45778765-4wgv6\" (UID: \"3266e10d-4bcd-4db0-aa15-53be39ecf437\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4wgv6" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.906210 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.926402 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.931616 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3266e10d-4bcd-4db0-aa15-53be39ecf437-etcd-client\") pod \"etcd-operator-b45778765-4wgv6\" (UID: \"3266e10d-4bcd-4db0-aa15-53be39ecf437\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4wgv6" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.946540 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.950669 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:57:35 crc kubenswrapper[4823]: E1216 06:57:35.950809 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:57:36.450781034 +0000 UTC m=+134.939347157 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.951220 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3266e10d-4bcd-4db0-aa15-53be39ecf437-serving-cert\") pod \"etcd-operator-b45778765-4wgv6\" (UID: \"3266e10d-4bcd-4db0-aa15-53be39ecf437\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4wgv6" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.951944 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rmx2d\" (UID: \"0a48b03b-402f-48b1-a3b7-52690850de42\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmx2d" Dec 16 06:57:35 crc kubenswrapper[4823]: E1216 06:57:35.952567 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-16 06:57:36.452544333 +0000 UTC m=+134.941110496 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rmx2d" (UID: "0a48b03b-402f-48b1-a3b7-52690850de42") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.966345 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 16 06:57:35 crc kubenswrapper[4823]: I1216 06:57:35.986266 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 16 06:57:36 crc kubenswrapper[4823]: I1216 06:57:36.006258 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 16 06:57:36 crc kubenswrapper[4823]: I1216 06:57:36.019828 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/dca532ee-e66a-411a-afcc-646f96a22a62-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-thj57\" (UID: \"dca532ee-e66a-411a-afcc-646f96a22a62\") " pod="openshift-marketplace/marketplace-operator-79b997595-thj57" Dec 16 06:57:36 crc kubenswrapper[4823]: I1216 06:57:36.032480 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 16 06:57:36 crc kubenswrapper[4823]: I1216 06:57:36.034523 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/dca532ee-e66a-411a-afcc-646f96a22a62-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-thj57\" (UID: \"dca532ee-e66a-411a-afcc-646f96a22a62\") " pod="openshift-marketplace/marketplace-operator-79b997595-thj57" Dec 16 06:57:36 crc kubenswrapper[4823]: I1216 06:57:36.045924 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 16 06:57:36 crc kubenswrapper[4823]: I1216 06:57:36.052850 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:57:36 crc kubenswrapper[4823]: E1216 06:57:36.053358 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:57:36.553327876 +0000 UTC m=+135.041893999 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:57:36 crc kubenswrapper[4823]: I1216 06:57:36.053510 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rmx2d\" (UID: \"0a48b03b-402f-48b1-a3b7-52690850de42\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmx2d" Dec 16 06:57:36 crc kubenswrapper[4823]: E1216 06:57:36.054157 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:57:36.554128974 +0000 UTC m=+135.042695097 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rmx2d" (UID: "0a48b03b-402f-48b1-a3b7-52690850de42") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:57:36 crc kubenswrapper[4823]: I1216 06:57:36.066432 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 16 06:57:36 crc kubenswrapper[4823]: I1216 06:57:36.073149 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f4a7b98f-600d-49af-9917-29b38e8877a4-apiservice-cert\") pod \"packageserver-d55dfcdfc-d9xfk\" (UID: \"f4a7b98f-600d-49af-9917-29b38e8877a4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d9xfk" Dec 16 06:57:36 crc kubenswrapper[4823]: I1216 06:57:36.079388 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f4a7b98f-600d-49af-9917-29b38e8877a4-webhook-cert\") pod \"packageserver-d55dfcdfc-d9xfk\" (UID: \"f4a7b98f-600d-49af-9917-29b38e8877a4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d9xfk" Dec 16 06:57:36 crc kubenswrapper[4823]: I1216 06:57:36.086158 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 16 06:57:36 crc kubenswrapper[4823]: I1216 06:57:36.090803 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b6b1d27b-235a-4b1e-adaa-512f3ae25954-config-volume\") pod \"collect-profiles-29431125-j4w8x\" (UID: \"b6b1d27b-235a-4b1e-adaa-512f3ae25954\") 
" pod="openshift-operator-lifecycle-manager/collect-profiles-29431125-j4w8x" Dec 16 06:57:36 crc kubenswrapper[4823]: I1216 06:57:36.106092 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 16 06:57:36 crc kubenswrapper[4823]: I1216 06:57:36.141316 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dxv6\" (UniqueName: \"kubernetes.io/projected/593f793a-bb15-4f83-8454-e3a1ced41667-kube-api-access-5dxv6\") pod \"cluster-samples-operator-665b6dd947-dpsl6\" (UID: \"593f793a-bb15-4f83-8454-e3a1ced41667\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-dpsl6" Dec 16 06:57:36 crc kubenswrapper[4823]: I1216 06:57:36.155121 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:57:36 crc kubenswrapper[4823]: E1216 06:57:36.155362 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:57:36.655329211 +0000 UTC m=+135.143895334 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:57:36 crc kubenswrapper[4823]: I1216 06:57:36.155746 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rmx2d\" (UID: \"0a48b03b-402f-48b1-a3b7-52690850de42\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmx2d" Dec 16 06:57:36 crc kubenswrapper[4823]: E1216 06:57:36.156221 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:57:36.65620168 +0000 UTC m=+135.144767973 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rmx2d" (UID: "0a48b03b-402f-48b1-a3b7-52690850de42") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:57:36 crc kubenswrapper[4823]: I1216 06:57:36.164490 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdw6v\" (UniqueName: \"kubernetes.io/projected/b7af89fb-e572-4c0d-a269-a65d03ac6e0e-kube-api-access-qdw6v\") pod \"apiserver-76f77b778f-s6mwx\" (UID: \"b7af89fb-e572-4c0d-a269-a65d03ac6e0e\") " pod="openshift-apiserver/apiserver-76f77b778f-s6mwx" Dec 16 06:57:36 crc kubenswrapper[4823]: I1216 06:57:36.182125 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rs7zv\" (UniqueName: \"kubernetes.io/projected/142b25e7-9ad6-4a22-8f1c-8bd280329db9-kube-api-access-rs7zv\") pod \"machine-approver-56656f9798-dmbvr\" (UID: \"142b25e7-9ad6-4a22-8f1c-8bd280329db9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dmbvr" Dec 16 06:57:36 crc kubenswrapper[4823]: I1216 06:57:36.200944 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpf95\" (UniqueName: \"kubernetes.io/projected/6d29bdc9-59b6-460e-a725-2f731de32ec3-kube-api-access-fpf95\") pod \"authentication-operator-69f744f599-ss5zz\" (UID: \"6d29bdc9-59b6-460e-a725-2f731de32ec3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ss5zz" Dec 16 06:57:36 crc kubenswrapper[4823]: I1216 06:57:36.216905 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-ss5zz" Dec 16 06:57:36 crc kubenswrapper[4823]: I1216 06:57:36.221166 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7hcz\" (UniqueName: \"kubernetes.io/projected/dfc57533-6490-47c8-8188-ad895f04811c-kube-api-access-k7hcz\") pod \"migrator-59844c95c7-jjrb4\" (UID: \"dfc57533-6490-47c8-8188-ad895f04811c\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jjrb4" Dec 16 06:57:36 crc kubenswrapper[4823]: I1216 06:57:36.230536 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jjrb4" Dec 16 06:57:36 crc kubenswrapper[4823]: I1216 06:57:36.244663 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfcbs\" (UniqueName: \"kubernetes.io/projected/3ec12da7-6ed9-4798-ba75-1b0c160dd126-kube-api-access-xfcbs\") pod \"openshift-apiserver-operator-796bbdcf4f-t9ztt\" (UID: \"3ec12da7-6ed9-4798-ba75-1b0c160dd126\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-t9ztt" Dec 16 06:57:36 crc kubenswrapper[4823]: I1216 06:57:36.257217 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:57:36 crc kubenswrapper[4823]: E1216 06:57:36.257406 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:57:36.757379326 +0000 UTC m=+135.245945449 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:57:36 crc kubenswrapper[4823]: I1216 06:57:36.257733 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rmx2d\" (UID: \"0a48b03b-402f-48b1-a3b7-52690850de42\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmx2d" Dec 16 06:57:36 crc kubenswrapper[4823]: E1216 06:57:36.258367 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:57:36.758343139 +0000 UTC m=+135.246909252 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rmx2d" (UID: "0a48b03b-402f-48b1-a3b7-52690850de42") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:57:36 crc kubenswrapper[4823]: I1216 06:57:36.263000 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-22nz8\" (UniqueName: \"kubernetes.io/projected/ca0a37ba-9a04-4d90-8ee8-6797791303a4-kube-api-access-22nz8\") pod \"apiserver-7bbb656c7d-5rs6p\" (UID: \"ca0a37ba-9a04-4d90-8ee8-6797791303a4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5rs6p" Dec 16 06:57:36 crc kubenswrapper[4823]: I1216 06:57:36.279464 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5v27\" (UniqueName: \"kubernetes.io/projected/410cd31d-8e32-4101-b933-2c7bd673c17f-kube-api-access-b5v27\") pod \"machine-config-operator-74547568cd-fxqpl\" (UID: \"410cd31d-8e32-4101-b933-2c7bd673c17f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fxqpl" Dec 16 06:57:36 crc kubenswrapper[4823]: I1216 06:57:36.302913 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ws45\" (UniqueName: \"kubernetes.io/projected/2dea4f36-2ae5-4363-a65c-0b7346f02661-kube-api-access-4ws45\") pod \"route-controller-manager-6576b87f9c-plnfh\" (UID: \"2dea4f36-2ae5-4363-a65c-0b7346f02661\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-plnfh" Dec 16 06:57:36 crc kubenswrapper[4823]: I1216 06:57:36.324509 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cgdjh\" (UniqueName: 
\"kubernetes.io/projected/b76da243-83e6-4503-be17-ef252bff5a98-kube-api-access-cgdjh\") pod \"dns-operator-744455d44c-t7pwj\" (UID: \"b76da243-83e6-4503-be17-ef252bff5a98\") " pod="openshift-dns-operator/dns-operator-744455d44c-t7pwj" Dec 16 06:57:36 crc kubenswrapper[4823]: I1216 06:57:36.345358 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gbws\" (UniqueName: \"kubernetes.io/projected/150075c3-d2eb-4c87-8b80-cd1d063e7d4c-kube-api-access-7gbws\") pod \"machine-api-operator-5694c8668f-bh4xp\" (UID: \"150075c3-d2eb-4c87-8b80-cd1d063e7d4c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-bh4xp" Dec 16 06:57:36 crc kubenswrapper[4823]: I1216 06:57:36.352792 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-dpsl6" Dec 16 06:57:36 crc kubenswrapper[4823]: I1216 06:57:36.359955 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:57:36 crc kubenswrapper[4823]: E1216 06:57:36.360155 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:57:36.860127666 +0000 UTC m=+135.348693789 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:57:36 crc kubenswrapper[4823]: I1216 06:57:36.360437 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rmx2d\" (UID: \"0a48b03b-402f-48b1-a3b7-52690850de42\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmx2d" Dec 16 06:57:36 crc kubenswrapper[4823]: E1216 06:57:36.361134 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:57:36.861124999 +0000 UTC m=+135.349691122 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rmx2d" (UID: "0a48b03b-402f-48b1-a3b7-52690850de42") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:57:36 crc kubenswrapper[4823]: I1216 06:57:36.362463 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9w6b\" (UniqueName: \"kubernetes.io/projected/8aaa63b4-9b41-442f-b9ea-672885a486bd-kube-api-access-h9w6b\") pod \"controller-manager-879f6c89f-vmqj6\" (UID: \"8aaa63b4-9b41-442f-b9ea-672885a486bd\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vmqj6" Dec 16 06:57:36 crc kubenswrapper[4823]: I1216 06:57:36.367352 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Dec 16 06:57:36 crc kubenswrapper[4823]: I1216 06:57:36.373391 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-vmqj6" Dec 16 06:57:36 crc kubenswrapper[4823]: I1216 06:57:36.385572 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 16 06:57:36 crc kubenswrapper[4823]: I1216 06:57:36.397153 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5rs6p" Dec 16 06:57:36 crc kubenswrapper[4823]: I1216 06:57:36.397546 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/7b26d5bf-8529-4934-9b4f-96792ae9e62f-certs\") pod \"machine-config-server-6bwk5\" (UID: \"7b26d5bf-8529-4934-9b4f-96792ae9e62f\") " pod="openshift-machine-config-operator/machine-config-server-6bwk5" Dec 16 06:57:36 crc kubenswrapper[4823]: I1216 06:57:36.397694 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dmbvr" Dec 16 06:57:36 crc kubenswrapper[4823]: I1216 06:57:36.408460 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 16 06:57:36 crc kubenswrapper[4823]: I1216 06:57:36.413042 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-bh4xp" Dec 16 06:57:36 crc kubenswrapper[4823]: I1216 06:57:36.417736 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/7b26d5bf-8529-4934-9b4f-96792ae9e62f-node-bootstrap-token\") pod \"machine-config-server-6bwk5\" (UID: \"7b26d5bf-8529-4934-9b4f-96792ae9e62f\") " pod="openshift-machine-config-operator/machine-config-server-6bwk5" Dec 16 06:57:36 crc kubenswrapper[4823]: I1216 06:57:36.426594 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 16 06:57:36 crc kubenswrapper[4823]: I1216 06:57:36.428345 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-ss5zz"] Dec 16 06:57:36 crc kubenswrapper[4823]: I1216 06:57:36.431082 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"cert\" (UniqueName: \"kubernetes.io/secret/7a5ef820-b734-4f0a-92f9-ffbfb5fd0d77-cert\") pod \"ingress-canary-c69wg\" (UID: \"7a5ef820-b734-4f0a-92f9-ffbfb5fd0d77\") " pod="openshift-ingress-canary/ingress-canary-c69wg" Dec 16 06:57:36 crc kubenswrapper[4823]: W1216 06:57:36.439735 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d29bdc9_59b6_460e_a725_2f731de32ec3.slice/crio-678136236f14a9a3904a206b785e07c5d973b7315fcf86bed6fc704626f88d2f WatchSource:0}: Error finding container 678136236f14a9a3904a206b785e07c5d973b7315fcf86bed6fc704626f88d2f: Status 404 returned error can't find the container with id 678136236f14a9a3904a206b785e07c5d973b7315fcf86bed6fc704626f88d2f Dec 16 06:57:36 crc kubenswrapper[4823]: I1216 06:57:36.444438 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 16 06:57:36 crc kubenswrapper[4823]: I1216 06:57:36.446590 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-plnfh" Dec 16 06:57:36 crc kubenswrapper[4823]: I1216 06:57:36.458687 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-s6mwx" Dec 16 06:57:36 crc kubenswrapper[4823]: I1216 06:57:36.461583 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:57:36 crc kubenswrapper[4823]: E1216 06:57:36.461725 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:57:36.961700715 +0000 UTC m=+135.450266838 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:57:36 crc kubenswrapper[4823]: I1216 06:57:36.464456 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rmx2d\" (UID: \"0a48b03b-402f-48b1-a3b7-52690850de42\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmx2d" Dec 16 06:57:36 crc kubenswrapper[4823]: E1216 06:57:36.464962 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: 
nodeName:}" failed. No retries permitted until 2025-12-16 06:57:36.964949645 +0000 UTC m=+135.453515768 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rmx2d" (UID: "0a48b03b-402f-48b1-a3b7-52690850de42") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:57:36 crc kubenswrapper[4823]: I1216 06:57:36.465102 4823 request.go:700] Waited for 1.912143093s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-ingress-canary/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&limit=500&resourceVersion=0 Dec 16 06:57:36 crc kubenswrapper[4823]: I1216 06:57:36.467054 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 16 06:57:36 crc kubenswrapper[4823]: I1216 06:57:36.472569 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-t9ztt" Dec 16 06:57:36 crc kubenswrapper[4823]: I1216 06:57:36.475055 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-jjrb4"] Dec 16 06:57:36 crc kubenswrapper[4823]: I1216 06:57:36.485769 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 16 06:57:36 crc kubenswrapper[4823]: I1216 06:57:36.503520 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-t7pwj" Dec 16 06:57:36 crc kubenswrapper[4823]: I1216 06:57:36.513295 4823 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Dec 16 06:57:36 crc kubenswrapper[4823]: I1216 06:57:36.526270 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Dec 16 06:57:36 crc kubenswrapper[4823]: I1216 06:57:36.554816 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fxqpl" Dec 16 06:57:36 crc kubenswrapper[4823]: I1216 06:57:36.561872 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Dec 16 06:57:36 crc kubenswrapper[4823]: I1216 06:57:36.570071 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:57:36 crc kubenswrapper[4823]: E1216 06:57:36.571873 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:57:37.07173962 +0000 UTC m=+135.560305743 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:57:36 crc kubenswrapper[4823]: I1216 06:57:36.572641 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rmx2d\" (UID: \"0a48b03b-402f-48b1-a3b7-52690850de42\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmx2d" Dec 16 06:57:36 crc kubenswrapper[4823]: E1216 06:57:36.573734 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:57:37.073725177 +0000 UTC m=+135.562291300 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rmx2d" (UID: "0a48b03b-402f-48b1-a3b7-52690850de42") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:57:36 crc kubenswrapper[4823]: I1216 06:57:36.593145 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7hjh\" (UniqueName: \"kubernetes.io/projected/8428636f-bcf3-4698-b121-1649ff94810f-kube-api-access-h7hjh\") pod \"package-server-manager-789f6589d5-5dbcj\" (UID: \"8428636f-bcf3-4698-b121-1649ff94810f\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5dbcj" Dec 16 06:57:36 crc kubenswrapper[4823]: I1216 06:57:36.602824 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dmbvr" event={"ID":"142b25e7-9ad6-4a22-8f1c-8bd280329db9","Type":"ContainerStarted","Data":"3f947846ff19d1caf119015e2bc77b717b2ba3ee5c14f3278f796543f83b3e78"} Dec 16 06:57:36 crc kubenswrapper[4823]: I1216 06:57:36.605466 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-ss5zz" event={"ID":"6d29bdc9-59b6-460e-a725-2f731de32ec3","Type":"ContainerStarted","Data":"678136236f14a9a3904a206b785e07c5d973b7315fcf86bed6fc704626f88d2f"} Dec 16 06:57:36 crc kubenswrapper[4823]: I1216 06:57:36.607668 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jjrb4" event={"ID":"dfc57533-6490-47c8-8188-ad895f04811c","Type":"ContainerStarted","Data":"de0970be7eee31e2838a63a6ce1df04b1685acdbc0f719f1e117a4d7639fa095"} Dec 16 06:57:36 crc kubenswrapper[4823]: I1216 
06:57:36.624727 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ed1826b1-19af-4d50-b293-5c22c03fbfe7-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-l9qfp\" (UID: \"ed1826b1-19af-4d50-b293-5c22c03fbfe7\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l9qfp" Dec 16 06:57:36 crc kubenswrapper[4823]: I1216 06:57:36.624863 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctp67\" (UniqueName: \"kubernetes.io/projected/a4d1fadc-5068-4443-8ba2-9bbd80233db2-kube-api-access-ctp67\") pod \"downloads-7954f5f757-k2ljf\" (UID: \"a4d1fadc-5068-4443-8ba2-9bbd80233db2\") " pod="openshift-console/downloads-7954f5f757-k2ljf" Dec 16 06:57:36 crc kubenswrapper[4823]: I1216 06:57:36.642400 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljnjf\" (UniqueName: \"kubernetes.io/projected/37c33a89-18c7-457a-a8ce-85c7721719fc-kube-api-access-ljnjf\") pod \"cluster-image-registry-operator-dc59b4c8b-c4c5h\" (UID: \"37c33a89-18c7-457a-a8ce-85c7721719fc\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-c4c5h" Dec 16 06:57:36 crc kubenswrapper[4823]: I1216 06:57:36.673180 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-k2ljf" Dec 16 06:57:36 crc kubenswrapper[4823]: I1216 06:57:36.673670 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:57:36 crc kubenswrapper[4823]: E1216 06:57:36.673783 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:57:37.173755855 +0000 UTC m=+135.662321978 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:57:36 crc kubenswrapper[4823]: I1216 06:57:36.673957 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rmx2d\" (UID: \"0a48b03b-402f-48b1-a3b7-52690850de42\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmx2d" Dec 16 06:57:36 crc kubenswrapper[4823]: E1216 06:57:36.674551 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: 
nodeName:}" failed. No retries permitted until 2025-12-16 06:57:37.174531131 +0000 UTC m=+135.663097264 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rmx2d" (UID: "0a48b03b-402f-48b1-a3b7-52690850de42") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:57:36 crc kubenswrapper[4823]: I1216 06:57:36.684482 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-dpsl6"] Dec 16 06:57:36 crc kubenswrapper[4823]: I1216 06:57:36.690073 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5dbcj" Dec 16 06:57:36 crc kubenswrapper[4823]: I1216 06:57:36.692741 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xs46m\" (UniqueName: \"kubernetes.io/projected/979cabf2-8f74-4c88-92e8-baaffd74d816-kube-api-access-xs46m\") pod \"openshift-config-operator-7777fb866f-jsk55\" (UID: \"979cabf2-8f74-4c88-92e8-baaffd74d816\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-jsk55" Dec 16 06:57:36 crc kubenswrapper[4823]: I1216 06:57:36.704780 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-jsk55" Dec 16 06:57:36 crc kubenswrapper[4823]: I1216 06:57:36.705221 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mjv2\" (UniqueName: \"kubernetes.io/projected/dc6756f9-a794-4667-8143-bd14bedd0cc3-kube-api-access-9mjv2\") pod \"machine-config-controller-84d6567774-rtddw\" (UID: \"dc6756f9-a794-4667-8143-bd14bedd0cc3\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rtddw" Dec 16 06:57:36 crc kubenswrapper[4823]: I1216 06:57:36.708222 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/37c33a89-18c7-457a-a8ce-85c7721719fc-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-c4c5h\" (UID: \"37c33a89-18c7-457a-a8ce-85c7721719fc\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-c4c5h" Dec 16 06:57:36 crc kubenswrapper[4823]: I1216 06:57:36.708949 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-vmqj6"] Dec 16 06:57:36 crc kubenswrapper[4823]: I1216 06:57:36.712839 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rtddw" Dec 16 06:57:36 crc kubenswrapper[4823]: I1216 06:57:36.728141 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbrsc\" (UniqueName: \"kubernetes.io/projected/9a66a086-81cf-498e-aada-d06b41019b1f-kube-api-access-wbrsc\") pod \"kube-storage-version-migrator-operator-b67b599dd-bdlsz\" (UID: \"9a66a086-81cf-498e-aada-d06b41019b1f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bdlsz" Dec 16 06:57:36 crc kubenswrapper[4823]: I1216 06:57:36.737685 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l9qfp" Dec 16 06:57:36 crc kubenswrapper[4823]: I1216 06:57:36.756670 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zd6lp\" (UniqueName: \"kubernetes.io/projected/9125b2ca-3f9a-47d3-8422-cae6f85f36d1-kube-api-access-zd6lp\") pod \"openshift-controller-manager-operator-756b6f6bc6-8xpl2\" (UID: \"9125b2ca-3f9a-47d3-8422-cae6f85f36d1\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8xpl2" Dec 16 06:57:36 crc kubenswrapper[4823]: I1216 06:57:36.771679 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfwqg\" (UniqueName: \"kubernetes.io/projected/ab7ea4d1-0209-45ec-a40f-3a7bb6bc20ac-kube-api-access-kfwqg\") pod \"olm-operator-6b444d44fb-5dtcg\" (UID: \"ab7ea4d1-0209-45ec-a40f-3a7bb6bc20ac\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5dtcg" Dec 16 06:57:36 crc kubenswrapper[4823]: I1216 06:57:36.776288 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:57:36 crc kubenswrapper[4823]: E1216 06:57:36.776863 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:57:37.276847006 +0000 UTC m=+135.765413129 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:57:36 crc kubenswrapper[4823]: I1216 06:57:36.792207 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g474n\" (UniqueName: \"kubernetes.io/projected/0a48b03b-402f-48b1-a3b7-52690850de42-kube-api-access-g474n\") pod \"image-registry-697d97f7c8-rmx2d\" (UID: \"0a48b03b-402f-48b1-a3b7-52690850de42\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmx2d" Dec 16 06:57:36 crc kubenswrapper[4823]: I1216 06:57:36.814508 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0a48b03b-402f-48b1-a3b7-52690850de42-bound-sa-token\") pod \"image-registry-697d97f7c8-rmx2d\" (UID: \"0a48b03b-402f-48b1-a3b7-52690850de42\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmx2d" Dec 16 06:57:36 crc kubenswrapper[4823]: I1216 06:57:36.842685 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c46vr\" (UniqueName: \"kubernetes.io/projected/7c559cce-7066-4c52-ad84-9c748243f010-kube-api-access-c46vr\") pod \"catalog-operator-68c6474976-llvdl\" (UID: \"7c559cce-7066-4c52-ad84-9c748243f010\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-llvdl" Dec 16 06:57:36 crc kubenswrapper[4823]: I1216 06:57:36.883696 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") 
pod \"image-registry-697d97f7c8-rmx2d\" (UID: \"0a48b03b-402f-48b1-a3b7-52690850de42\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmx2d" Dec 16 06:57:36 crc kubenswrapper[4823]: E1216 06:57:36.884652 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:57:37.384621525 +0000 UTC m=+135.873187648 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rmx2d" (UID: "0a48b03b-402f-48b1-a3b7-52690850de42") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:57:36 crc kubenswrapper[4823]: I1216 06:57:36.889132 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxqth\" (UniqueName: \"kubernetes.io/projected/e2376625-99df-4517-a64a-6c8b1d7edc20-kube-api-access-pxqth\") pod \"service-ca-9c57cc56f-mgbxj\" (UID: \"e2376625-99df-4517-a64a-6c8b1d7edc20\") " pod="openshift-service-ca/service-ca-9c57cc56f-mgbxj" Dec 16 06:57:36 crc kubenswrapper[4823]: I1216 06:57:36.917154 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a1c5f775-cbe4-44d3-8ef3-19dd69e27b47-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-2glj6\" (UID: \"a1c5f775-cbe4-44d3-8ef3-19dd69e27b47\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2glj6" Dec 16 06:57:36 crc kubenswrapper[4823]: I1216 06:57:36.924081 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-blnkq\" (UniqueName: 
\"kubernetes.io/projected/dca532ee-e66a-411a-afcc-646f96a22a62-kube-api-access-blnkq\") pod \"marketplace-operator-79b997595-thj57\" (UID: \"dca532ee-e66a-411a-afcc-646f96a22a62\") " pod="openshift-marketplace/marketplace-operator-79b997595-thj57" Dec 16 06:57:36 crc kubenswrapper[4823]: I1216 06:57:36.946710 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kttpx\" (UniqueName: \"kubernetes.io/projected/b6b1d27b-235a-4b1e-adaa-512f3ae25954-kube-api-access-kttpx\") pod \"collect-profiles-29431125-j4w8x\" (UID: \"b6b1d27b-235a-4b1e-adaa-512f3ae25954\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431125-j4w8x" Dec 16 06:57:36 crc kubenswrapper[4823]: I1216 06:57:36.960202 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whxb2\" (UniqueName: \"kubernetes.io/projected/10d2c2db-5e69-4c14-92be-c6c7f6c9b04a-kube-api-access-whxb2\") pod \"service-ca-operator-777779d784-jmp8w\" (UID: \"10d2c2db-5e69-4c14-92be-c6c7f6c9b04a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-jmp8w" Dec 16 06:57:36 crc kubenswrapper[4823]: I1216 06:57:36.970549 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5dtcg" Dec 16 06:57:36 crc kubenswrapper[4823]: I1216 06:57:36.981778 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-c4c5h" Dec 16 06:57:36 crc kubenswrapper[4823]: I1216 06:57:36.986770 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:57:36 crc kubenswrapper[4823]: E1216 06:57:36.987324 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:57:37.487265231 +0000 UTC m=+135.975831354 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:57:36 crc kubenswrapper[4823]: I1216 06:57:36.987633 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rmx2d\" (UID: \"0a48b03b-402f-48b1-a3b7-52690850de42\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmx2d" Dec 16 06:57:36 crc kubenswrapper[4823]: E1216 06:57:36.988384 4823 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:57:37.488373998 +0000 UTC m=+135.976940121 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rmx2d" (UID: "0a48b03b-402f-48b1-a3b7-52690850de42") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:57:36 crc kubenswrapper[4823]: I1216 06:57:36.991942 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwbjs\" (UniqueName: \"kubernetes.io/projected/e1c6d0f7-5a86-49fb-870d-991796812348-kube-api-access-rwbjs\") pod \"console-f9d7485db-bx552\" (UID: \"e1c6d0f7-5a86-49fb-870d-991796812348\") " pod="openshift-console/console-f9d7485db-bx552" Dec 16 06:57:36 crc kubenswrapper[4823]: I1216 06:57:36.997288 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8xpl2" Dec 16 06:57:37 crc kubenswrapper[4823]: I1216 06:57:37.006146 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kr2rl\" (UniqueName: \"kubernetes.io/projected/3266e10d-4bcd-4db0-aa15-53be39ecf437-kube-api-access-kr2rl\") pod \"etcd-operator-b45778765-4wgv6\" (UID: \"3266e10d-4bcd-4db0-aa15-53be39ecf437\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4wgv6" Dec 16 06:57:37 crc kubenswrapper[4823]: I1216 06:57:37.021728 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bdlsz" Dec 16 06:57:37 crc kubenswrapper[4823]: I1216 06:57:37.038135 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7933e531-8015-4c1a-ba84-fe22aa094da0-bound-sa-token\") pod \"ingress-operator-5b745b69d9-g4ltp\" (UID: \"7933e531-8015-4c1a-ba84-fe22aa094da0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-g4ltp" Dec 16 06:57:37 crc kubenswrapper[4823]: I1216 06:57:37.038620 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-5rs6p"] Dec 16 06:57:37 crc kubenswrapper[4823]: I1216 06:57:37.044084 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-jmp8w" Dec 16 06:57:37 crc kubenswrapper[4823]: I1216 06:57:37.045755 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-plnfh"] Dec 16 06:57:37 crc kubenswrapper[4823]: I1216 06:57:37.046476 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2jjc\" (UniqueName: \"kubernetes.io/projected/eff38aa7-d0b3-455b-b0ca-1034fc06a182-kube-api-access-m2jjc\") pod \"control-plane-machine-set-operator-78cbb6b69f-xkdq6\" (UID: \"eff38aa7-d0b3-455b-b0ca-1034fc06a182\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xkdq6" Dec 16 06:57:37 crc kubenswrapper[4823]: I1216 06:57:37.056333 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-bh4xp"] Dec 16 06:57:37 crc kubenswrapper[4823]: I1216 06:57:37.063343 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9qsr\" (UniqueName: 
\"kubernetes.io/projected/7b26d5bf-8529-4934-9b4f-96792ae9e62f-kube-api-access-l9qsr\") pod \"machine-config-server-6bwk5\" (UID: \"7b26d5bf-8529-4934-9b4f-96792ae9e62f\") " pod="openshift-machine-config-operator/machine-config-server-6bwk5" Dec 16 06:57:37 crc kubenswrapper[4823]: I1216 06:57:37.071133 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-bx552" Dec 16 06:57:37 crc kubenswrapper[4823]: I1216 06:57:37.084209 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjzsr\" (UniqueName: \"kubernetes.io/projected/553e9a81-ff35-4dd6-bbb7-d55749d88e45-kube-api-access-hjzsr\") pod \"multus-admission-controller-857f4d67dd-9lh5d\" (UID: \"553e9a81-ff35-4dd6-bbb7-d55749d88e45\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-9lh5d" Dec 16 06:57:37 crc kubenswrapper[4823]: I1216 06:57:37.087149 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2glj6" Dec 16 06:57:37 crc kubenswrapper[4823]: I1216 06:57:37.088544 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:57:37 crc kubenswrapper[4823]: E1216 06:57:37.088741 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:57:37.588713476 +0000 UTC m=+136.077279599 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:57:37 crc kubenswrapper[4823]: I1216 06:57:37.088962 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rmx2d\" (UID: \"0a48b03b-402f-48b1-a3b7-52690850de42\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmx2d" Dec 16 06:57:37 crc kubenswrapper[4823]: E1216 06:57:37.089598 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:57:37.589577815 +0000 UTC m=+136.078143938 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rmx2d" (UID: "0a48b03b-402f-48b1-a3b7-52690850de42") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:57:37 crc kubenswrapper[4823]: I1216 06:57:37.101101 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hblgp\" (UniqueName: \"kubernetes.io/projected/7a5ef820-b734-4f0a-92f9-ffbfb5fd0d77-kube-api-access-hblgp\") pod \"ingress-canary-c69wg\" (UID: \"7a5ef820-b734-4f0a-92f9-ffbfb5fd0d77\") " pod="openshift-ingress-canary/ingress-canary-c69wg" Dec 16 06:57:37 crc kubenswrapper[4823]: I1216 06:57:37.109016 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-mgbxj" Dec 16 06:57:37 crc kubenswrapper[4823]: I1216 06:57:37.122872 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xkdq6" Dec 16 06:57:37 crc kubenswrapper[4823]: I1216 06:57:37.124738 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2s8mw\" (UniqueName: \"kubernetes.io/projected/7933e531-8015-4c1a-ba84-fe22aa094da0-kube-api-access-2s8mw\") pod \"ingress-operator-5b745b69d9-g4ltp\" (UID: \"7933e531-8015-4c1a-ba84-fe22aa094da0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-g4ltp" Dec 16 06:57:37 crc kubenswrapper[4823]: I1216 06:57:37.137107 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-llvdl" Dec 16 06:57:37 crc kubenswrapper[4823]: I1216 06:57:37.142326 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgkh4\" (UniqueName: \"kubernetes.io/projected/f4a7b98f-600d-49af-9917-29b38e8877a4-kube-api-access-qgkh4\") pod \"packageserver-d55dfcdfc-d9xfk\" (UID: \"f4a7b98f-600d-49af-9917-29b38e8877a4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d9xfk" Dec 16 06:57:37 crc kubenswrapper[4823]: I1216 06:57:37.144227 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-t9ztt"] Dec 16 06:57:37 crc kubenswrapper[4823]: I1216 06:57:37.144616 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-4wgv6" Dec 16 06:57:37 crc kubenswrapper[4823]: I1216 06:57:37.149289 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-s6mwx"] Dec 16 06:57:37 crc kubenswrapper[4823]: I1216 06:57:37.153136 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-thj57" Dec 16 06:57:37 crc kubenswrapper[4823]: I1216 06:57:37.158266 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d9xfk" Dec 16 06:57:37 crc kubenswrapper[4823]: I1216 06:57:37.163735 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5tr4r\" (UniqueName: \"kubernetes.io/projected/58900f48-68af-4712-bb30-9d832c26ce02-kube-api-access-5tr4r\") pod \"csi-hostpathplugin-4bxzg\" (UID: \"58900f48-68af-4712-bb30-9d832c26ce02\") " pod="hostpath-provisioner/csi-hostpathplugin-4bxzg" Dec 16 06:57:37 crc kubenswrapper[4823]: I1216 06:57:37.168510 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431125-j4w8x" Dec 16 06:57:37 crc kubenswrapper[4823]: I1216 06:57:37.177806 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-6bwk5" Dec 16 06:57:37 crc kubenswrapper[4823]: I1216 06:57:37.182752 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/05f64e2f-791a-463e-8ec4-340732e212ee-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-qqqjd\" (UID: \"05f64e2f-791a-463e-8ec4-340732e212ee\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qqqjd" Dec 16 06:57:37 crc kubenswrapper[4823]: I1216 06:57:37.186349 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-c69wg" Dec 16 06:57:37 crc kubenswrapper[4823]: I1216 06:57:37.190745 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:57:37 crc kubenswrapper[4823]: E1216 06:57:37.190944 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:57:37.690919027 +0000 UTC m=+136.179485150 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:57:37 crc kubenswrapper[4823]: I1216 06:57:37.191135 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rmx2d\" (UID: \"0a48b03b-402f-48b1-a3b7-52690850de42\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmx2d" Dec 16 06:57:37 crc kubenswrapper[4823]: E1216 06:57:37.191480 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: 
nodeName:}" failed. No retries permitted until 2025-12-16 06:57:37.691473056 +0000 UTC m=+136.180039179 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rmx2d" (UID: "0a48b03b-402f-48b1-a3b7-52690850de42") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:57:37 crc kubenswrapper[4823]: I1216 06:57:37.200827 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6frp\" (UniqueName: \"kubernetes.io/projected/60b58907-b6e9-4a6d-b442-9c79d839bac9-kube-api-access-d6frp\") pod \"oauth-openshift-558db77b4-6xfbm\" (UID: \"60b58907-b6e9-4a6d-b442-9c79d839bac9\") " pod="openshift-authentication/oauth-openshift-558db77b4-6xfbm" Dec 16 06:57:37 crc kubenswrapper[4823]: I1216 06:57:37.207719 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l9qfp"] Dec 16 06:57:37 crc kubenswrapper[4823]: I1216 06:57:37.209034 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-rtddw"] Dec 16 06:57:37 crc kubenswrapper[4823]: I1216 06:57:37.210181 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-4bxzg" Dec 16 06:57:37 crc kubenswrapper[4823]: I1216 06:57:37.230346 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wzzs\" (UniqueName: \"kubernetes.io/projected/f28cb7aa-489c-4015-860c-9d925da5f135-kube-api-access-8wzzs\") pod \"dns-default-rmzm5\" (UID: \"f28cb7aa-489c-4015-860c-9d925da5f135\") " pod="openshift-dns/dns-default-rmzm5" Dec 16 06:57:37 crc kubenswrapper[4823]: I1216 06:57:37.235786 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-t7pwj"] Dec 16 06:57:37 crc kubenswrapper[4823]: I1216 06:57:37.243224 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d68sv\" (UniqueName: \"kubernetes.io/projected/843619ff-c0f4-4389-b3ef-62c282b20303-kube-api-access-d68sv\") pod \"router-default-5444994796-lpkx6\" (UID: \"843619ff-c0f4-4389-b3ef-62c282b20303\") " pod="openshift-ingress/router-default-5444994796-lpkx6" Dec 16 06:57:37 crc kubenswrapper[4823]: I1216 06:57:37.245912 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-fxqpl"] Dec 16 06:57:37 crc kubenswrapper[4823]: I1216 06:57:37.292129 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:57:37 crc kubenswrapper[4823]: E1216 06:57:37.292426 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-16 06:57:37.792381283 +0000 UTC m=+136.280947406 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:57:37 crc kubenswrapper[4823]: I1216 06:57:37.292562 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rmx2d\" (UID: \"0a48b03b-402f-48b1-a3b7-52690850de42\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmx2d" Dec 16 06:57:37 crc kubenswrapper[4823]: E1216 06:57:37.292974 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:57:37.792959502 +0000 UTC m=+136.281525615 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rmx2d" (UID: "0a48b03b-402f-48b1-a3b7-52690850de42") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:57:37 crc kubenswrapper[4823]: I1216 06:57:37.333492 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-k2ljf"] Dec 16 06:57:37 crc kubenswrapper[4823]: I1216 06:57:37.336358 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-jsk55"] Dec 16 06:57:37 crc kubenswrapper[4823]: I1216 06:57:37.349954 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5dbcj"] Dec 16 06:57:37 crc kubenswrapper[4823]: I1216 06:57:37.351722 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vbdm\" (UniqueName: \"kubernetes.io/projected/f6f3b211-3f2a-4f8e-8d2a-e68e8aaad955-kube-api-access-8vbdm\") pod \"console-operator-58897d9998-vnw5c\" (UID: \"f6f3b211-3f2a-4f8e-8d2a-e68e8aaad955\") " pod="openshift-console-operator/console-operator-58897d9998-vnw5c" Dec 16 06:57:37 crc kubenswrapper[4823]: I1216 06:57:37.352911 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qqqjd" Dec 16 06:57:37 crc kubenswrapper[4823]: I1216 06:57:37.360712 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-g4ltp" Dec 16 06:57:37 crc kubenswrapper[4823]: W1216 06:57:37.361094 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod150075c3_d2eb_4c87_8b80_cd1d063e7d4c.slice/crio-31c826bd538137684b71ab73fd2d60ce9c0926ae6d65198b25f5368d13ebc3f4 WatchSource:0}: Error finding container 31c826bd538137684b71ab73fd2d60ce9c0926ae6d65198b25f5368d13ebc3f4: Status 404 returned error can't find the container with id 31c826bd538137684b71ab73fd2d60ce9c0926ae6d65198b25f5368d13ebc3f4 Dec 16 06:57:37 crc kubenswrapper[4823]: I1216 06:57:37.378899 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-9lh5d" Dec 16 06:57:37 crc kubenswrapper[4823]: I1216 06:57:37.393766 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:57:37 crc kubenswrapper[4823]: I1216 06:57:37.393996 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-rmzm5" Dec 16 06:57:37 crc kubenswrapper[4823]: E1216 06:57:37.394084 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:57:37.894055215 +0000 UTC m=+136.382621348 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:57:37 crc kubenswrapper[4823]: I1216 06:57:37.394337 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rmx2d\" (UID: \"0a48b03b-402f-48b1-a3b7-52690850de42\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmx2d" Dec 16 06:57:37 crc kubenswrapper[4823]: E1216 06:57:37.394993 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:57:37.894954206 +0000 UTC m=+136.383520329 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rmx2d" (UID: "0a48b03b-402f-48b1-a3b7-52690850de42") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:57:37 crc kubenswrapper[4823]: I1216 06:57:37.401362 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-lpkx6" Dec 16 06:57:37 crc kubenswrapper[4823]: I1216 06:57:37.416915 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-vnw5c" Dec 16 06:57:37 crc kubenswrapper[4823]: I1216 06:57:37.430959 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-6xfbm" Dec 16 06:57:37 crc kubenswrapper[4823]: W1216 06:57:37.440746 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3ec12da7_6ed9_4798_ba75_1b0c160dd126.slice/crio-bdaf9a5ccb9dec5d8cfad24c4aa4c75d2c0f8c953dd67130afcd9d6ce513de61 WatchSource:0}: Error finding container bdaf9a5ccb9dec5d8cfad24c4aa4c75d2c0f8c953dd67130afcd9d6ce513de61: Status 404 returned error can't find the container with id bdaf9a5ccb9dec5d8cfad24c4aa4c75d2c0f8c953dd67130afcd9d6ce513de61 Dec 16 06:57:37 crc kubenswrapper[4823]: W1216 06:57:37.446673 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb7af89fb_e572_4c0d_a269_a65d03ac6e0e.slice/crio-270dfa070e748e9b6c843d835789d8e37ba9f456e1246a2de144dbc5590d4145 WatchSource:0}: Error finding container 270dfa070e748e9b6c843d835789d8e37ba9f456e1246a2de144dbc5590d4145: Status 404 returned error can't find the container with id 270dfa070e748e9b6c843d835789d8e37ba9f456e1246a2de144dbc5590d4145 Dec 16 06:57:37 crc kubenswrapper[4823]: W1216 06:57:37.447492 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc6756f9_a794_4667_8143_bd14bedd0cc3.slice/crio-14c488c527c800d19d1da62b70b336e3ef0ab2c73575884e3250e3038d7dc084 WatchSource:0}: Error finding container 14c488c527c800d19d1da62b70b336e3ef0ab2c73575884e3250e3038d7dc084: Status 404 
returned error can't find the container with id 14c488c527c800d19d1da62b70b336e3ef0ab2c73575884e3250e3038d7dc084 Dec 16 06:57:37 crc kubenswrapper[4823]: I1216 06:57:37.495485 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:57:37 crc kubenswrapper[4823]: E1216 06:57:37.495736 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:57:37.995710668 +0000 UTC m=+136.484276801 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:57:37 crc kubenswrapper[4823]: I1216 06:57:37.495845 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rmx2d\" (UID: \"0a48b03b-402f-48b1-a3b7-52690850de42\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmx2d" Dec 16 06:57:37 crc kubenswrapper[4823]: E1216 06:57:37.496177 4823 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:57:37.996165984 +0000 UTC m=+136.484732107 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rmx2d" (UID: "0a48b03b-402f-48b1-a3b7-52690850de42") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:57:37 crc kubenswrapper[4823]: W1216 06:57:37.529690 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod410cd31d_8e32_4101_b933_2c7bd673c17f.slice/crio-65078c4c018eea89028713fcbbe8a9f6de9910b161403314676e5211780159c8 WatchSource:0}: Error finding container 65078c4c018eea89028713fcbbe8a9f6de9910b161403314676e5211780159c8: Status 404 returned error can't find the container with id 65078c4c018eea89028713fcbbe8a9f6de9910b161403314676e5211780159c8 Dec 16 06:57:37 crc kubenswrapper[4823]: W1216 06:57:37.531107 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb76da243_83e6_4503_be17_ef252bff5a98.slice/crio-94a64ee30415b35f0d6dec7a1f3c35a134ffcb296ddfa0a39b7cf58bbbb4aeed WatchSource:0}: Error finding container 94a64ee30415b35f0d6dec7a1f3c35a134ffcb296ddfa0a39b7cf58bbbb4aeed: Status 404 returned error can't find the container with id 94a64ee30415b35f0d6dec7a1f3c35a134ffcb296ddfa0a39b7cf58bbbb4aeed Dec 16 06:57:37 crc kubenswrapper[4823]: W1216 06:57:37.531600 4823 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda4d1fadc_5068_4443_8ba2_9bbd80233db2.slice/crio-582c08a53218edbee548584c078aa46df432f239f3b5cd69a39662b29fc434f7 WatchSource:0}: Error finding container 582c08a53218edbee548584c078aa46df432f239f3b5cd69a39662b29fc434f7: Status 404 returned error can't find the container with id 582c08a53218edbee548584c078aa46df432f239f3b5cd69a39662b29fc434f7 Dec 16 06:57:37 crc kubenswrapper[4823]: I1216 06:57:37.597198 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:57:37 crc kubenswrapper[4823]: E1216 06:57:37.597706 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:57:38.097666411 +0000 UTC m=+136.586232534 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:57:37 crc kubenswrapper[4823]: I1216 06:57:37.617680 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5rs6p" event={"ID":"ca0a37ba-9a04-4d90-8ee8-6797791303a4","Type":"ContainerStarted","Data":"e489ae3616370eb47ba317a8563f8cdac28afdfc853906a00a2c1d3500a8c21b"} Dec 16 06:57:37 crc kubenswrapper[4823]: I1216 06:57:37.619568 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-dpsl6" event={"ID":"593f793a-bb15-4f83-8454-e3a1ced41667","Type":"ContainerStarted","Data":"52791a97b6c9d4e1f7e229886e3b53aa8f52a025399599d4e3fe912f8bdc9658"} Dec 16 06:57:37 crc kubenswrapper[4823]: I1216 06:57:37.620834 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l9qfp" event={"ID":"ed1826b1-19af-4d50-b293-5c22c03fbfe7","Type":"ContainerStarted","Data":"80b5803b1b47746d5ab0636da349e3e61cf31982162c393dc0234bb80776220b"} Dec 16 06:57:37 crc kubenswrapper[4823]: I1216 06:57:37.621997 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-plnfh" event={"ID":"2dea4f36-2ae5-4363-a65c-0b7346f02661","Type":"ContainerStarted","Data":"c15ceb2510f7cae18f2baf442ccc0b6f0266ade5b99fa67a6094fb205da607b6"} Dec 16 06:57:37 crc kubenswrapper[4823]: I1216 06:57:37.623314 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fxqpl" event={"ID":"410cd31d-8e32-4101-b933-2c7bd673c17f","Type":"ContainerStarted","Data":"65078c4c018eea89028713fcbbe8a9f6de9910b161403314676e5211780159c8"} Dec 16 06:57:37 crc kubenswrapper[4823]: I1216 06:57:37.624711 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-jsk55" event={"ID":"979cabf2-8f74-4c88-92e8-baaffd74d816","Type":"ContainerStarted","Data":"312daf3f5ccf64de84fd50b6326762d64f1ddd25ee77ad615d1a0f4db6a2c879"} Dec 16 06:57:37 crc kubenswrapper[4823]: I1216 06:57:37.626080 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-vmqj6" event={"ID":"8aaa63b4-9b41-442f-b9ea-672885a486bd","Type":"ContainerStarted","Data":"8797894f3a0c32ef8d0aaacde20ea2b04c1c9017f1fd9a184b071121941c3b9b"} Dec 16 06:57:37 crc kubenswrapper[4823]: I1216 06:57:37.627199 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dmbvr" event={"ID":"142b25e7-9ad6-4a22-8f1c-8bd280329db9","Type":"ContainerStarted","Data":"756a8a3fe0fd97e43ce5b5685e2ad7f6b96c87e5bfbf4ecf04b4e05233b12b6b"} Dec 16 06:57:37 crc kubenswrapper[4823]: I1216 06:57:37.628368 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-t7pwj" event={"ID":"b76da243-83e6-4503-be17-ef252bff5a98","Type":"ContainerStarted","Data":"94a64ee30415b35f0d6dec7a1f3c35a134ffcb296ddfa0a39b7cf58bbbb4aeed"} Dec 16 06:57:37 crc kubenswrapper[4823]: I1216 06:57:37.630305 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-t9ztt" event={"ID":"3ec12da7-6ed9-4798-ba75-1b0c160dd126","Type":"ContainerStarted","Data":"bdaf9a5ccb9dec5d8cfad24c4aa4c75d2c0f8c953dd67130afcd9d6ce513de61"} Dec 16 06:57:37 crc kubenswrapper[4823]: I1216 
06:57:37.631735 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5dbcj" event={"ID":"8428636f-bcf3-4698-b121-1649ff94810f","Type":"ContainerStarted","Data":"c4d0faac461b7e4aa84a9dd9e3cdd88682abdbb2d361ddb121bbbbede26d5d52"} Dec 16 06:57:37 crc kubenswrapper[4823]: I1216 06:57:37.634384 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jjrb4" event={"ID":"dfc57533-6490-47c8-8188-ad895f04811c","Type":"ContainerStarted","Data":"ee51c8366a44b5e9041ced4e54d4dca6dd000717a4ed353df6b1d6a226ad0891"} Dec 16 06:57:37 crc kubenswrapper[4823]: I1216 06:57:37.635186 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-k2ljf" event={"ID":"a4d1fadc-5068-4443-8ba2-9bbd80233db2","Type":"ContainerStarted","Data":"582c08a53218edbee548584c078aa46df432f239f3b5cd69a39662b29fc434f7"} Dec 16 06:57:37 crc kubenswrapper[4823]: I1216 06:57:37.635998 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rtddw" event={"ID":"dc6756f9-a794-4667-8143-bd14bedd0cc3","Type":"ContainerStarted","Data":"14c488c527c800d19d1da62b70b336e3ef0ab2c73575884e3250e3038d7dc084"} Dec 16 06:57:37 crc kubenswrapper[4823]: I1216 06:57:37.636931 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-ss5zz" event={"ID":"6d29bdc9-59b6-460e-a725-2f731de32ec3","Type":"ContainerStarted","Data":"5caf327b602c009c08b9427ed2ca532a92100fb385e55c88fec03be96a63f1f5"} Dec 16 06:57:37 crc kubenswrapper[4823]: I1216 06:57:37.641387 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-s6mwx" event={"ID":"b7af89fb-e572-4c0d-a269-a65d03ac6e0e","Type":"ContainerStarted","Data":"270dfa070e748e9b6c843d835789d8e37ba9f456e1246a2de144dbc5590d4145"} Dec 
16 06:57:37 crc kubenswrapper[4823]: I1216 06:57:37.643041 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-bh4xp" event={"ID":"150075c3-d2eb-4c87-8b80-cd1d063e7d4c","Type":"ContainerStarted","Data":"31c826bd538137684b71ab73fd2d60ce9c0926ae6d65198b25f5368d13ebc3f4"} Dec 16 06:57:37 crc kubenswrapper[4823]: I1216 06:57:37.699198 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rmx2d\" (UID: \"0a48b03b-402f-48b1-a3b7-52690850de42\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmx2d" Dec 16 06:57:37 crc kubenswrapper[4823]: E1216 06:57:37.700582 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:57:38.200560095 +0000 UTC m=+136.689126398 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rmx2d" (UID: "0a48b03b-402f-48b1-a3b7-52690850de42") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:57:37 crc kubenswrapper[4823]: I1216 06:57:37.801366 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:57:37 crc kubenswrapper[4823]: E1216 06:57:37.801715 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:57:38.301676669 +0000 UTC m=+136.790242802 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:57:37 crc kubenswrapper[4823]: I1216 06:57:37.802230 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rmx2d\" (UID: \"0a48b03b-402f-48b1-a3b7-52690850de42\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmx2d" Dec 16 06:57:37 crc kubenswrapper[4823]: E1216 06:57:37.802679 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:57:38.302666362 +0000 UTC m=+136.791232485 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rmx2d" (UID: "0a48b03b-402f-48b1-a3b7-52690850de42") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:57:37 crc kubenswrapper[4823]: I1216 06:57:37.906111 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:57:37 crc kubenswrapper[4823]: E1216 06:57:37.906613 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:57:38.406584331 +0000 UTC m=+136.895150454 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:57:38 crc kubenswrapper[4823]: I1216 06:57:38.007650 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rmx2d\" (UID: \"0a48b03b-402f-48b1-a3b7-52690850de42\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmx2d" Dec 16 06:57:38 crc kubenswrapper[4823]: E1216 06:57:38.008540 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:57:38.508523493 +0000 UTC m=+136.997089616 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rmx2d" (UID: "0a48b03b-402f-48b1-a3b7-52690850de42") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:57:38 crc kubenswrapper[4823]: W1216 06:57:38.087380 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b26d5bf_8529_4934_9b4f_96792ae9e62f.slice/crio-45a83f60c04a2b755df9a482a321e198726611bd812291e2b0aac82cb446610b WatchSource:0}: Error finding container 45a83f60c04a2b755df9a482a321e198726611bd812291e2b0aac82cb446610b: Status 404 returned error can't find the container with id 45a83f60c04a2b755df9a482a321e198726611bd812291e2b0aac82cb446610b Dec 16 06:57:38 crc kubenswrapper[4823]: I1216 06:57:38.110992 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:57:38 crc kubenswrapper[4823]: E1216 06:57:38.111637 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:57:38.611607684 +0000 UTC m=+137.100173817 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:57:38 crc kubenswrapper[4823]: I1216 06:57:38.213374 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rmx2d\" (UID: \"0a48b03b-402f-48b1-a3b7-52690850de42\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmx2d" Dec 16 06:57:38 crc kubenswrapper[4823]: E1216 06:57:38.214109 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:57:38.714086373 +0000 UTC m=+137.202652506 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rmx2d" (UID: "0a48b03b-402f-48b1-a3b7-52690850de42") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:57:38 crc kubenswrapper[4823]: I1216 06:57:38.219353 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-ss5zz" podStartSLOduration=119.219324771 podStartE2EDuration="1m59.219324771s" podCreationTimestamp="2025-12-16 06:55:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:57:38.217710456 +0000 UTC m=+136.706276579" watchObservedRunningTime="2025-12-16 06:57:38.219324771 +0000 UTC m=+136.707890894" Dec 16 06:57:38 crc kubenswrapper[4823]: I1216 06:57:38.295008 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2glj6"] Dec 16 06:57:38 crc kubenswrapper[4823]: I1216 06:57:38.316203 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:57:38 crc kubenswrapper[4823]: E1216 06:57:38.316528 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-16 06:57:38.816513372 +0000 UTC m=+137.305079495 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:57:38 crc kubenswrapper[4823]: I1216 06:57:38.424948 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rmx2d\" (UID: \"0a48b03b-402f-48b1-a3b7-52690850de42\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmx2d" Dec 16 06:57:38 crc kubenswrapper[4823]: E1216 06:57:38.425564 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:57:38.925537653 +0000 UTC m=+137.414103776 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rmx2d" (UID: "0a48b03b-402f-48b1-a3b7-52690850de42") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:57:38 crc kubenswrapper[4823]: I1216 06:57:38.527276 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:57:38 crc kubenswrapper[4823]: E1216 06:57:38.527445 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:57:39.027418113 +0000 UTC m=+137.515984236 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:57:38 crc kubenswrapper[4823]: I1216 06:57:38.527581 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rmx2d\" (UID: \"0a48b03b-402f-48b1-a3b7-52690850de42\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmx2d" Dec 16 06:57:38 crc kubenswrapper[4823]: E1216 06:57:38.527977 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:57:39.027969051 +0000 UTC m=+137.516535174 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rmx2d" (UID: "0a48b03b-402f-48b1-a3b7-52690850de42") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:57:38 crc kubenswrapper[4823]: W1216 06:57:38.595129 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda1c5f775_cbe4_44d3_8ef3_19dd69e27b47.slice/crio-f67bf29a5dda904c914ed1c9faeb38e12da1eab3dda1644f9537e191174d4a4c WatchSource:0}: Error finding container f67bf29a5dda904c914ed1c9faeb38e12da1eab3dda1644f9537e191174d4a4c: Status 404 returned error can't find the container with id f67bf29a5dda904c914ed1c9faeb38e12da1eab3dda1644f9537e191174d4a4c Dec 16 06:57:38 crc kubenswrapper[4823]: I1216 06:57:38.628616 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:57:38 crc kubenswrapper[4823]: E1216 06:57:38.629098 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:57:39.129074215 +0000 UTC m=+137.617640328 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:57:38 crc kubenswrapper[4823]: I1216 06:57:38.657695 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-6bwk5" event={"ID":"7b26d5bf-8529-4934-9b4f-96792ae9e62f","Type":"ContainerStarted","Data":"45a83f60c04a2b755df9a482a321e198726611bd812291e2b0aac82cb446610b"} Dec 16 06:57:38 crc kubenswrapper[4823]: I1216 06:57:38.661657 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2glj6" event={"ID":"a1c5f775-cbe4-44d3-8ef3-19dd69e27b47","Type":"ContainerStarted","Data":"f67bf29a5dda904c914ed1c9faeb38e12da1eab3dda1644f9537e191174d4a4c"} Dec 16 06:57:38 crc kubenswrapper[4823]: I1216 06:57:38.662934 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-vmqj6" event={"ID":"8aaa63b4-9b41-442f-b9ea-672885a486bd","Type":"ContainerStarted","Data":"d9fb43391b95eb85ec0a1c9227538d6a4d9cdb8d0245123f19590848e7efefb8"} Dec 16 06:57:38 crc kubenswrapper[4823]: I1216 06:57:38.664338 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-vmqj6" Dec 16 06:57:38 crc kubenswrapper[4823]: I1216 06:57:38.666367 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5dbcj" 
event={"ID":"8428636f-bcf3-4698-b121-1649ff94810f","Type":"ContainerStarted","Data":"0a6e07ef51e886b3be3b798cc386a2ab049c1e029f59ed54e8bc1fe01f5b8f9d"} Dec 16 06:57:38 crc kubenswrapper[4823]: I1216 06:57:38.668932 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-lpkx6" event={"ID":"843619ff-c0f4-4389-b3ef-62c282b20303","Type":"ContainerStarted","Data":"7dc7e71d87b81d0fed7e71bd3dfabc6f9035e63f906b208566fbd1e7619ee20f"} Dec 16 06:57:38 crc kubenswrapper[4823]: I1216 06:57:38.684385 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-vmqj6" Dec 16 06:57:38 crc kubenswrapper[4823]: I1216 06:57:38.730142 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rmx2d\" (UID: \"0a48b03b-402f-48b1-a3b7-52690850de42\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmx2d" Dec 16 06:57:38 crc kubenswrapper[4823]: E1216 06:57:38.730731 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:57:39.230718128 +0000 UTC m=+137.719284251 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rmx2d" (UID: "0a48b03b-402f-48b1-a3b7-52690850de42") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:57:38 crc kubenswrapper[4823]: I1216 06:57:38.840640 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:57:38 crc kubenswrapper[4823]: E1216 06:57:38.841165 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:57:39.341143556 +0000 UTC m=+137.829709679 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:57:38 crc kubenswrapper[4823]: I1216 06:57:38.901345 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-vmqj6" podStartSLOduration=119.901320658 podStartE2EDuration="1m59.901320658s" podCreationTimestamp="2025-12-16 06:55:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:57:38.898817884 +0000 UTC m=+137.387384007" watchObservedRunningTime="2025-12-16 06:57:38.901320658 +0000 UTC m=+137.389886781" Dec 16 06:57:38 crc kubenswrapper[4823]: I1216 06:57:38.943422 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rmx2d\" (UID: \"0a48b03b-402f-48b1-a3b7-52690850de42\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmx2d" Dec 16 06:57:38 crc kubenswrapper[4823]: E1216 06:57:38.944273 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:57:39.444256968 +0000 UTC m=+137.932823091 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rmx2d" (UID: "0a48b03b-402f-48b1-a3b7-52690850de42") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:57:39 crc kubenswrapper[4823]: I1216 06:57:39.045114 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:57:39 crc kubenswrapper[4823]: E1216 06:57:39.045493 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:57:39.545470845 +0000 UTC m=+138.034036968 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:57:39 crc kubenswrapper[4823]: I1216 06:57:39.146848 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rmx2d\" (UID: \"0a48b03b-402f-48b1-a3b7-52690850de42\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmx2d" Dec 16 06:57:39 crc kubenswrapper[4823]: E1216 06:57:39.147467 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:57:39.647446948 +0000 UTC m=+138.136013071 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rmx2d" (UID: "0a48b03b-402f-48b1-a3b7-52690850de42") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:57:39 crc kubenswrapper[4823]: I1216 06:57:39.248246 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:57:39 crc kubenswrapper[4823]: E1216 06:57:39.249243 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:57:39.749220815 +0000 UTC m=+138.237786938 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:57:39 crc kubenswrapper[4823]: I1216 06:57:39.352112 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rmx2d\" (UID: \"0a48b03b-402f-48b1-a3b7-52690850de42\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmx2d" Dec 16 06:57:39 crc kubenswrapper[4823]: E1216 06:57:39.352694 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:57:39.852668338 +0000 UTC m=+138.341234531 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rmx2d" (UID: "0a48b03b-402f-48b1-a3b7-52690850de42") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:57:39 crc kubenswrapper[4823]: I1216 06:57:39.389988 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-thj57"] Dec 16 06:57:39 crc kubenswrapper[4823]: I1216 06:57:39.454290 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:57:39 crc kubenswrapper[4823]: E1216 06:57:39.454686 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:57:39.954664381 +0000 UTC m=+138.443230504 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:57:39 crc kubenswrapper[4823]: I1216 06:57:39.559379 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rmx2d\" (UID: \"0a48b03b-402f-48b1-a3b7-52690850de42\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmx2d" Dec 16 06:57:39 crc kubenswrapper[4823]: E1216 06:57:39.559928 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:57:40.059906925 +0000 UTC m=+138.548473048 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rmx2d" (UID: "0a48b03b-402f-48b1-a3b7-52690850de42") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:57:39 crc kubenswrapper[4823]: I1216 06:57:39.661633 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:57:39 crc kubenswrapper[4823]: E1216 06:57:39.662223 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:57:40.162200898 +0000 UTC m=+138.650767011 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:57:39 crc kubenswrapper[4823]: I1216 06:57:39.662291 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rmx2d\" (UID: \"0a48b03b-402f-48b1-a3b7-52690850de42\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmx2d" Dec 16 06:57:39 crc kubenswrapper[4823]: E1216 06:57:39.662607 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:57:40.162599842 +0000 UTC m=+138.651165965 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rmx2d" (UID: "0a48b03b-402f-48b1-a3b7-52690850de42") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:57:39 crc kubenswrapper[4823]: I1216 06:57:39.684445 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-vnw5c"] Dec 16 06:57:39 crc kubenswrapper[4823]: I1216 06:57:39.721975 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-c4c5h"] Dec 16 06:57:39 crc kubenswrapper[4823]: I1216 06:57:39.764931 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:57:39 crc kubenswrapper[4823]: E1216 06:57:39.765789 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:57:40.265761135 +0000 UTC m=+138.754327258 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:57:39 crc kubenswrapper[4823]: I1216 06:57:39.846895 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-thj57" event={"ID":"dca532ee-e66a-411a-afcc-646f96a22a62","Type":"ContainerStarted","Data":"827f50daeb452bc4e72dd43443df535abd767d609a2be17165da72edc21f52ab"} Dec 16 06:57:39 crc kubenswrapper[4823]: I1216 06:57:39.846957 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-rmzm5"] Dec 16 06:57:39 crc kubenswrapper[4823]: I1216 06:57:39.871613 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rmx2d\" (UID: \"0a48b03b-402f-48b1-a3b7-52690850de42\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmx2d" Dec 16 06:57:39 crc kubenswrapper[4823]: E1216 06:57:39.871990 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:57:40.371977331 +0000 UTC m=+138.860543454 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rmx2d" (UID: "0a48b03b-402f-48b1-a3b7-52690850de42") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:57:39 crc kubenswrapper[4823]: I1216 06:57:39.889557 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xkdq6"] Dec 16 06:57:39 crc kubenswrapper[4823]: I1216 06:57:39.910593 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-lpkx6" podStartSLOduration=120.910570664 podStartE2EDuration="2m0.910570664s" podCreationTimestamp="2025-12-16 06:55:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:57:39.909187188 +0000 UTC m=+138.397753311" watchObservedRunningTime="2025-12-16 06:57:39.910570664 +0000 UTC m=+138.399136787" Dec 16 06:57:39 crc kubenswrapper[4823]: I1216 06:57:39.929725 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8xpl2"] Dec 16 06:57:39 crc kubenswrapper[4823]: I1216 06:57:39.938824 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-c69wg"] Dec 16 06:57:39 crc kubenswrapper[4823]: I1216 06:57:39.958609 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-t9ztt" event={"ID":"3ec12da7-6ed9-4798-ba75-1b0c160dd126","Type":"ContainerStarted","Data":"ddb27ae9b12d197b534eb14b421daaa578aaee8ece1bac31d73c1ba3dd330eb9"} Dec 16 06:57:39 crc 
kubenswrapper[4823]: I1216 06:57:39.972539 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:57:39 crc kubenswrapper[4823]: E1216 06:57:39.973085 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:57:40.473042904 +0000 UTC m=+138.961609027 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:57:39 crc kubenswrapper[4823]: I1216 06:57:39.973423 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rmx2d\" (UID: \"0a48b03b-402f-48b1-a3b7-52690850de42\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmx2d" Dec 16 06:57:39 crc kubenswrapper[4823]: E1216 06:57:39.975359 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-16 06:57:40.475342371 +0000 UTC m=+138.963908494 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rmx2d" (UID: "0a48b03b-402f-48b1-a3b7-52690850de42") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:57:40 crc kubenswrapper[4823]: I1216 06:57:40.003118 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-bh4xp" event={"ID":"150075c3-d2eb-4c87-8b80-cd1d063e7d4c","Type":"ContainerStarted","Data":"a103a9d48e38c68d76b5f8160d0ee8aac93a65ec8b5866e9aedaf352b1b2c348"} Dec 16 06:57:40 crc kubenswrapper[4823]: I1216 06:57:40.013611 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431125-j4w8x"] Dec 16 06:57:40 crc kubenswrapper[4823]: I1216 06:57:40.018524 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bdlsz"] Dec 16 06:57:40 crc kubenswrapper[4823]: I1216 06:57:40.019485 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5dbcj" Dec 16 06:57:40 crc kubenswrapper[4823]: I1216 06:57:40.039006 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-4bxzg"] Dec 16 06:57:40 crc kubenswrapper[4823]: I1216 06:57:40.049644 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-mgbxj"] Dec 16 06:57:40 crc kubenswrapper[4823]: I1216 06:57:40.057687 4823 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-console/console-f9d7485db-bx552"] Dec 16 06:57:40 crc kubenswrapper[4823]: I1216 06:57:40.077679 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:57:40 crc kubenswrapper[4823]: E1216 06:57:40.079247 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:57:40.579228739 +0000 UTC m=+139.067794862 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:57:40 crc kubenswrapper[4823]: I1216 06:57:40.085075 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-6xfbm"] Dec 16 06:57:40 crc kubenswrapper[4823]: I1216 06:57:40.099136 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qqqjd"] Dec 16 06:57:40 crc kubenswrapper[4823]: I1216 06:57:40.105252 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-llvdl"] Dec 16 06:57:40 crc kubenswrapper[4823]: I1216 06:57:40.125688 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-multus/multus-admission-controller-857f4d67dd-9lh5d"] Dec 16 06:57:40 crc kubenswrapper[4823]: I1216 06:57:40.129273 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-g4ltp"] Dec 16 06:57:40 crc kubenswrapper[4823]: I1216 06:57:40.136417 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-t9ztt" podStartSLOduration=121.135970815 podStartE2EDuration="2m1.135970815s" podCreationTimestamp="2025-12-16 06:55:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:57:40.076346052 +0000 UTC m=+138.564912175" watchObservedRunningTime="2025-12-16 06:57:40.135970815 +0000 UTC m=+138.624536938" Dec 16 06:57:40 crc kubenswrapper[4823]: I1216 06:57:40.147470 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d9xfk"] Dec 16 06:57:40 crc kubenswrapper[4823]: I1216 06:57:40.148975 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5dtcg"] Dec 16 06:57:40 crc kubenswrapper[4823]: I1216 06:57:40.153284 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2glj6" podStartSLOduration=121.153266199 podStartE2EDuration="2m1.153266199s" podCreationTimestamp="2025-12-16 06:55:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:57:40.147503385 +0000 UTC m=+138.636069508" watchObservedRunningTime="2025-12-16 06:57:40.153266199 +0000 UTC m=+138.641832322" Dec 16 06:57:40 crc kubenswrapper[4823]: I1216 06:57:40.153716 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-etcd-operator/etcd-operator-b45778765-4wgv6"] Dec 16 06:57:40 crc kubenswrapper[4823]: I1216 06:57:40.160733 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rtddw" event={"ID":"dc6756f9-a794-4667-8143-bd14bedd0cc3","Type":"ContainerStarted","Data":"65f8a25fda4f8c7a19248cf12b3bae26bb33773b918568e3a3f4b026f312e584"} Dec 16 06:57:40 crc kubenswrapper[4823]: I1216 06:57:40.169759 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-jmp8w"] Dec 16 06:57:40 crc kubenswrapper[4823]: I1216 06:57:40.179174 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rmx2d\" (UID: \"0a48b03b-402f-48b1-a3b7-52690850de42\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmx2d" Dec 16 06:57:40 crc kubenswrapper[4823]: I1216 06:57:40.189087 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-jsk55" event={"ID":"979cabf2-8f74-4c88-92e8-baaffd74d816","Type":"ContainerStarted","Data":"cb4cbfd988decff4efdd83632792ede7591bb93fb23cd852fc6d53f86540cab9"} Dec 16 06:57:40 crc kubenswrapper[4823]: E1216 06:57:40.189368 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:57:40.689353808 +0000 UTC m=+139.177919931 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rmx2d" (UID: "0a48b03b-402f-48b1-a3b7-52690850de42") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:57:40 crc kubenswrapper[4823]: I1216 06:57:40.190912 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5dbcj" podStartSLOduration=121.190885439 podStartE2EDuration="2m1.190885439s" podCreationTimestamp="2025-12-16 06:55:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:57:40.188335153 +0000 UTC m=+138.676901276" watchObservedRunningTime="2025-12-16 06:57:40.190885439 +0000 UTC m=+138.679451562" Dec 16 06:57:40 crc kubenswrapper[4823]: I1216 06:57:40.243446 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-6bwk5" podStartSLOduration=6.243426754 podStartE2EDuration="6.243426754s" podCreationTimestamp="2025-12-16 06:57:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:57:40.243243158 +0000 UTC m=+138.731809281" watchObservedRunningTime="2025-12-16 06:57:40.243426754 +0000 UTC m=+138.731992877" Dec 16 06:57:40 crc kubenswrapper[4823]: W1216 06:57:40.243928 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod05f64e2f_791a_463e_8ec4_340732e212ee.slice/crio-f4583ccea4386601f148f3953f02fe772ea1a947a810ad32f1cbaded52017d5b WatchSource:0}: Error finding 
container f4583ccea4386601f148f3953f02fe772ea1a947a810ad32f1cbaded52017d5b: Status 404 returned error can't find the container with id f4583ccea4386601f148f3953f02fe772ea1a947a810ad32f1cbaded52017d5b Dec 16 06:57:40 crc kubenswrapper[4823]: W1216 06:57:40.270761 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod553e9a81_ff35_4dd6_bbb7_d55749d88e45.slice/crio-91e5d68c94f5354d904f0378c214f3f88ada2c93598821e0547f2bfc66813f47 WatchSource:0}: Error finding container 91e5d68c94f5354d904f0378c214f3f88ada2c93598821e0547f2bfc66813f47: Status 404 returned error can't find the container with id 91e5d68c94f5354d904f0378c214f3f88ada2c93598821e0547f2bfc66813f47 Dec 16 06:57:40 crc kubenswrapper[4823]: I1216 06:57:40.271621 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-k2ljf" event={"ID":"a4d1fadc-5068-4443-8ba2-9bbd80233db2","Type":"ContainerStarted","Data":"e405db2792a7a53b67ccbb1ca8805111d5e618226b246051457a8413b0c1fcc7"} Dec 16 06:57:40 crc kubenswrapper[4823]: I1216 06:57:40.272853 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-k2ljf" Dec 16 06:57:40 crc kubenswrapper[4823]: I1216 06:57:40.286910 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:57:40 crc kubenswrapper[4823]: E1216 06:57:40.288940 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-16 06:57:40.788907169 +0000 UTC m=+139.277473292 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:57:40 crc kubenswrapper[4823]: I1216 06:57:40.302045 4823 patch_prober.go:28] interesting pod/downloads-7954f5f757-k2ljf container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body= Dec 16 06:57:40 crc kubenswrapper[4823]: I1216 06:57:40.302094 4823 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-k2ljf" podUID="a4d1fadc-5068-4443-8ba2-9bbd80233db2" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" Dec 16 06:57:40 crc kubenswrapper[4823]: I1216 06:57:40.332139 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rtddw" podStartSLOduration=121.332076287 podStartE2EDuration="2m1.332076287s" podCreationTimestamp="2025-12-16 06:55:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:57:40.331326991 +0000 UTC m=+138.819893114" watchObservedRunningTime="2025-12-16 06:57:40.332076287 +0000 UTC m=+138.820642410" Dec 16 06:57:40 crc kubenswrapper[4823]: I1216 06:57:40.333155 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-plnfh" Dec 16 06:57:40 crc kubenswrapper[4823]: I1216 06:57:40.335410 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jjrb4" podStartSLOduration=121.334917053 podStartE2EDuration="2m1.334917053s" podCreationTimestamp="2025-12-16 06:55:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:57:40.273113546 +0000 UTC m=+138.761679679" watchObservedRunningTime="2025-12-16 06:57:40.334917053 +0000 UTC m=+138.823483166" Dec 16 06:57:40 crc kubenswrapper[4823]: I1216 06:57:40.362510 4823 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-plnfh container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body= Dec 16 06:57:40 crc kubenswrapper[4823]: I1216 06:57:40.362575 4823 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-plnfh" podUID="2dea4f36-2ae5-4363-a65c-0b7346f02661" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" Dec 16 06:57:40 crc kubenswrapper[4823]: I1216 06:57:40.404151 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-lpkx6" Dec 16 06:57:40 crc kubenswrapper[4823]: I1216 06:57:40.459908 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-k2ljf" podStartSLOduration=121.459880772 podStartE2EDuration="2m1.459880772s" podCreationTimestamp="2025-12-16 06:55:39 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:57:40.373613199 +0000 UTC m=+138.862179332" watchObservedRunningTime="2025-12-16 06:57:40.459880772 +0000 UTC m=+138.948446895" Dec 16 06:57:40 crc kubenswrapper[4823]: I1216 06:57:40.460242 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rmx2d\" (UID: \"0a48b03b-402f-48b1-a3b7-52690850de42\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmx2d" Dec 16 06:57:40 crc kubenswrapper[4823]: E1216 06:57:40.466240 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:57:40.966224966 +0000 UTC m=+139.454791089 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rmx2d" (UID: "0a48b03b-402f-48b1-a3b7-52690850de42") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:57:40 crc kubenswrapper[4823]: I1216 06:57:40.466603 4823 patch_prober.go:28] interesting pod/router-default-5444994796-lpkx6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 16 06:57:40 crc kubenswrapper[4823]: [-]has-synced failed: reason withheld Dec 16 06:57:40 crc kubenswrapper[4823]: [+]process-running ok Dec 16 06:57:40 crc kubenswrapper[4823]: healthz check failed Dec 16 06:57:40 crc kubenswrapper[4823]: I1216 06:57:40.466636 4823 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lpkx6" podUID="843619ff-c0f4-4389-b3ef-62c282b20303" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 16 06:57:40 crc kubenswrapper[4823]: I1216 06:57:40.523413 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dmbvr" podStartSLOduration=121.523395147 podStartE2EDuration="2m1.523395147s" podCreationTimestamp="2025-12-16 06:55:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:57:40.475815219 +0000 UTC m=+138.964381342" watchObservedRunningTime="2025-12-16 06:57:40.523395147 +0000 UTC m=+139.011961270" Dec 16 06:57:40 crc kubenswrapper[4823]: I1216 06:57:40.563782 4823 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:57:40 crc kubenswrapper[4823]: E1216 06:57:40.564897 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:57:41.064876857 +0000 UTC m=+139.553442980 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:57:40 crc kubenswrapper[4823]: I1216 06:57:40.638950 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-plnfh" podStartSLOduration=120.638932287 podStartE2EDuration="2m0.638932287s" podCreationTimestamp="2025-12-16 06:55:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:57:40.524730851 +0000 UTC m=+139.013296974" watchObservedRunningTime="2025-12-16 06:57:40.638932287 +0000 UTC m=+139.127498410" Dec 16 06:57:40 crc kubenswrapper[4823]: I1216 06:57:40.666230 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rmx2d\" (UID: \"0a48b03b-402f-48b1-a3b7-52690850de42\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmx2d" Dec 16 06:57:40 crc kubenswrapper[4823]: E1216 06:57:40.669017 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:57:41.169001922 +0000 UTC m=+139.657568045 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rmx2d" (UID: "0a48b03b-402f-48b1-a3b7-52690850de42") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:57:40 crc kubenswrapper[4823]: I1216 06:57:40.757401 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l9qfp" podStartSLOduration=121.757377997 podStartE2EDuration="2m1.757377997s" podCreationTimestamp="2025-12-16 06:55:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:57:40.754412426 +0000 UTC m=+139.242978549" watchObservedRunningTime="2025-12-16 06:57:40.757377997 +0000 UTC m=+139.245944120" Dec 16 06:57:40 crc kubenswrapper[4823]: I1216 06:57:40.773668 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:57:40 crc kubenswrapper[4823]: E1216 06:57:40.774179 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:57:41.274156623 +0000 UTC m=+139.762722756 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:57:40 crc kubenswrapper[4823]: I1216 06:57:40.887798 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rmx2d\" (UID: \"0a48b03b-402f-48b1-a3b7-52690850de42\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmx2d" Dec 16 06:57:40 crc kubenswrapper[4823]: E1216 06:57:40.888170 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:57:41.388157432 +0000 UTC m=+139.876723555 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rmx2d" (UID: "0a48b03b-402f-48b1-a3b7-52690850de42") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:57:40 crc kubenswrapper[4823]: I1216 06:57:40.993562 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:57:40 crc kubenswrapper[4823]: E1216 06:57:40.995854 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:57:41.495829868 +0000 UTC m=+139.984396001 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:57:41 crc kubenswrapper[4823]: I1216 06:57:41.122161 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rmx2d\" (UID: \"0a48b03b-402f-48b1-a3b7-52690850de42\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmx2d" Dec 16 06:57:41 crc kubenswrapper[4823]: E1216 06:57:41.122531 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:57:41.622516385 +0000 UTC m=+140.111082508 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rmx2d" (UID: "0a48b03b-402f-48b1-a3b7-52690850de42") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:57:41 crc kubenswrapper[4823]: I1216 06:57:41.230613 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:57:41 crc kubenswrapper[4823]: E1216 06:57:41.231873 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:57:41.731847576 +0000 UTC m=+140.220413699 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:57:41 crc kubenswrapper[4823]: I1216 06:57:41.232094 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rmx2d\" (UID: \"0a48b03b-402f-48b1-a3b7-52690850de42\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmx2d" Dec 16 06:57:41 crc kubenswrapper[4823]: E1216 06:57:41.232552 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:57:41.73254326 +0000 UTC m=+140.221109383 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rmx2d" (UID: "0a48b03b-402f-48b1-a3b7-52690850de42") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:57:41 crc kubenswrapper[4823]: I1216 06:57:41.333121 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:57:41 crc kubenswrapper[4823]: E1216 06:57:41.333988 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:57:41.833966895 +0000 UTC m=+140.322533018 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:57:41 crc kubenswrapper[4823]: I1216 06:57:41.414810 4823 patch_prober.go:28] interesting pod/router-default-5444994796-lpkx6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 16 06:57:41 crc kubenswrapper[4823]: [-]has-synced failed: reason withheld Dec 16 06:57:41 crc kubenswrapper[4823]: [+]process-running ok Dec 16 06:57:41 crc kubenswrapper[4823]: healthz check failed Dec 16 06:57:41 crc kubenswrapper[4823]: I1216 06:57:41.415308 4823 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lpkx6" podUID="843619ff-c0f4-4389-b3ef-62c282b20303" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 16 06:57:41 crc kubenswrapper[4823]: I1216 06:57:41.441674 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rmx2d\" (UID: \"0a48b03b-402f-48b1-a3b7-52690850de42\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmx2d" Dec 16 06:57:41 crc kubenswrapper[4823]: E1216 06:57:41.442511 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-16 06:57:41.942495249 +0000 UTC m=+140.431061372 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rmx2d" (UID: "0a48b03b-402f-48b1-a3b7-52690850de42") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:57:41 crc kubenswrapper[4823]: I1216 06:57:41.555817 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:57:41 crc kubenswrapper[4823]: E1216 06:57:41.556544 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:57:42.056523799 +0000 UTC m=+140.545089922 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:57:41 crc kubenswrapper[4823]: I1216 06:57:41.658567 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rmx2d\" (UID: \"0a48b03b-402f-48b1-a3b7-52690850de42\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmx2d" Dec 16 06:57:41 crc kubenswrapper[4823]: E1216 06:57:41.658983 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:57:42.158967149 +0000 UTC m=+140.647533272 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rmx2d" (UID: "0a48b03b-402f-48b1-a3b7-52690850de42") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:57:41 crc kubenswrapper[4823]: I1216 06:57:41.699842 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d9xfk" event={"ID":"f4a7b98f-600d-49af-9917-29b38e8877a4","Type":"ContainerStarted","Data":"12f59be41d51d6918c7d0493df3f5b174c7db30bff98647f40635f27c27ee2b5"} Dec 16 06:57:41 crc kubenswrapper[4823]: I1216 06:57:41.706463 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-t7pwj" event={"ID":"b76da243-83e6-4503-be17-ef252bff5a98","Type":"ContainerStarted","Data":"39b845228fd4117e4eb0554efebab2ed11f958a1d0768d96533c338cc7e43182"} Dec 16 06:57:41 crc kubenswrapper[4823]: I1216 06:57:41.706514 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-t7pwj" event={"ID":"b76da243-83e6-4503-be17-ef252bff5a98","Type":"ContainerStarted","Data":"ae1b84df3e3f175a4711aeedc1c08b958676ee4bf76d6ee723194de329789965"} Dec 16 06:57:41 crc kubenswrapper[4823]: I1216 06:57:41.742597 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-t7pwj" podStartSLOduration=122.742577822 podStartE2EDuration="2m2.742577822s" podCreationTimestamp="2025-12-16 06:55:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:57:41.740591435 +0000 UTC m=+140.229157578" watchObservedRunningTime="2025-12-16 
06:57:41.742577822 +0000 UTC m=+140.231143935" Dec 16 06:57:41 crc kubenswrapper[4823]: I1216 06:57:41.762611 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:57:41 crc kubenswrapper[4823]: E1216 06:57:41.764206 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:57:42.26417443 +0000 UTC m=+140.752740553 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:57:41 crc kubenswrapper[4823]: I1216 06:57:41.868637 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jjrb4" event={"ID":"dfc57533-6490-47c8-8188-ad895f04811c","Type":"ContainerStarted","Data":"6e7c2028a024ae699bd4fabaf698d5b19b5a257d1a9b88ed983ec678c6bf21fa"} Dec 16 06:57:41 crc kubenswrapper[4823]: I1216 06:57:41.868937 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-6bwk5" event={"ID":"7b26d5bf-8529-4934-9b4f-96792ae9e62f","Type":"ContainerStarted","Data":"c496680936c957a15382fdbdf2cf8f06e31354c382b6c81de97e4d23aa7cdaeb"} Dec 16 06:57:41 crc 
kubenswrapper[4823]: I1216 06:57:41.869068 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-6xfbm" event={"ID":"60b58907-b6e9-4a6d-b442-9c79d839bac9","Type":"ContainerStarted","Data":"d27c550266359180bf82b7adc42e7107fc9b7a4bf212f5a3366fd5fbb7fce0a7"} Dec 16 06:57:41 crc kubenswrapper[4823]: I1216 06:57:41.871277 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rmx2d\" (UID: \"0a48b03b-402f-48b1-a3b7-52690850de42\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmx2d" Dec 16 06:57:41 crc kubenswrapper[4823]: E1216 06:57:41.873665 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:57:42.373644747 +0000 UTC m=+140.862210940 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rmx2d" (UID: "0a48b03b-402f-48b1-a3b7-52690850de42") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:57:41 crc kubenswrapper[4823]: I1216 06:57:41.908196 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-bh4xp" event={"ID":"150075c3-d2eb-4c87-8b80-cd1d063e7d4c","Type":"ContainerStarted","Data":"d44a4f14bae54a789e0ffb5fddc90ee83c785aa74ea448fa4a04028e62625bd7"} Dec 16 06:57:41 crc kubenswrapper[4823]: I1216 06:57:41.916714 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-bx552" event={"ID":"e1c6d0f7-5a86-49fb-870d-991796812348","Type":"ContainerStarted","Data":"edb82a2a1350064ecff291dc689957660c671b14b7e3ebff243b4f9a4e7e3c89"} Dec 16 06:57:41 crc kubenswrapper[4823]: I1216 06:57:41.973535 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xkdq6" event={"ID":"eff38aa7-d0b3-455b-b0ca-1034fc06a182","Type":"ContainerStarted","Data":"0fe7efe07defcd977dc618b36177728e0ce9cafee74bc3578b6c52ddb0862630"} Dec 16 06:57:41 crc kubenswrapper[4823]: I1216 06:57:41.974259 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:57:41 crc kubenswrapper[4823]: E1216 06:57:41.975632 4823 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:57:42.47560865 +0000 UTC m=+140.964174813 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:57:42 crc kubenswrapper[4823]: I1216 06:57:42.037302 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5dbcj" event={"ID":"8428636f-bcf3-4698-b121-1649ff94810f","Type":"ContainerStarted","Data":"9747708ba513f258a6a004eaf67b4844c99aaf48ed70147be8dd42052f422815"} Dec 16 06:57:42 crc kubenswrapper[4823]: I1216 06:57:42.076513 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rmx2d\" (UID: \"0a48b03b-402f-48b1-a3b7-52690850de42\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmx2d" Dec 16 06:57:42 crc kubenswrapper[4823]: E1216 06:57:42.076931 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:57:42.576917411 +0000 UTC m=+141.065483534 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rmx2d" (UID: "0a48b03b-402f-48b1-a3b7-52690850de42") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:57:42 crc kubenswrapper[4823]: I1216 06:57:42.078395 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5dtcg" event={"ID":"ab7ea4d1-0209-45ec-a40f-3a7bb6bc20ac","Type":"ContainerStarted","Data":"053ea411fa5ce5e84bf6f376034ae4b0b6dc2a9f5eceaba3c37a501bf9e22d12"} Dec 16 06:57:42 crc kubenswrapper[4823]: I1216 06:57:42.103153 4823 generic.go:334] "Generic (PLEG): container finished" podID="979cabf2-8f74-4c88-92e8-baaffd74d816" containerID="cb4cbfd988decff4efdd83632792ede7591bb93fb23cd852fc6d53f86540cab9" exitCode=0 Dec 16 06:57:42 crc kubenswrapper[4823]: I1216 06:57:42.103232 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-jsk55" event={"ID":"979cabf2-8f74-4c88-92e8-baaffd74d816","Type":"ContainerDied","Data":"cb4cbfd988decff4efdd83632792ede7591bb93fb23cd852fc6d53f86540cab9"} Dec 16 06:57:42 crc kubenswrapper[4823]: I1216 06:57:42.132720 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-vnw5c" event={"ID":"f6f3b211-3f2a-4f8e-8d2a-e68e8aaad955","Type":"ContainerStarted","Data":"c364b07d759faa288ed650f9f4ee594d8197300045c9a07a1c4ee5dcf80664dd"} Dec 16 06:57:42 crc kubenswrapper[4823]: I1216 06:57:42.132782 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-vnw5c" 
event={"ID":"f6f3b211-3f2a-4f8e-8d2a-e68e8aaad955","Type":"ContainerStarted","Data":"87b49201eac808a03487d4b0ca3deee0a30ad53f57337d0fe95dc7854d9c6bdb"} Dec 16 06:57:42 crc kubenswrapper[4823]: I1216 06:57:42.133788 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-vnw5c" Dec 16 06:57:42 crc kubenswrapper[4823]: I1216 06:57:42.137699 4823 patch_prober.go:28] interesting pod/console-operator-58897d9998-vnw5c container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.24:8443/readyz\": dial tcp 10.217.0.24:8443: connect: connection refused" start-of-body= Dec 16 06:57:42 crc kubenswrapper[4823]: I1216 06:57:42.137747 4823 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-vnw5c" podUID="f6f3b211-3f2a-4f8e-8d2a-e68e8aaad955" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.24:8443/readyz\": dial tcp 10.217.0.24:8443: connect: connection refused" Dec 16 06:57:42 crc kubenswrapper[4823]: I1216 06:57:42.143791 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-rmzm5" event={"ID":"f28cb7aa-489c-4015-860c-9d925da5f135","Type":"ContainerStarted","Data":"adec861bbc3c7f02944ead65534a8e6618a1399e418e175c42416f5424a85eee"} Dec 16 06:57:42 crc kubenswrapper[4823]: I1216 06:57:42.143833 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-rmzm5" event={"ID":"f28cb7aa-489c-4015-860c-9d925da5f135","Type":"ContainerStarted","Data":"0709d30233dbbe38f98a9b5cd576afe93e9e0a171066d682f7ee3ccd2955a1bb"} Dec 16 06:57:42 crc kubenswrapper[4823]: I1216 06:57:42.150925 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-g4ltp" 
event={"ID":"7933e531-8015-4c1a-ba84-fe22aa094da0","Type":"ContainerStarted","Data":"3095745e92d36025f29cc7ae22fecdf7f23e098228e89d02d4d5a3f4e4f2c85b"} Dec 16 06:57:42 crc kubenswrapper[4823]: I1216 06:57:42.161340 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-4wgv6" event={"ID":"3266e10d-4bcd-4db0-aa15-53be39ecf437","Type":"ContainerStarted","Data":"21cb8944951e9475be5befa9f838524a3dd42f8a4a643c8a9c514760cc1b410f"} Dec 16 06:57:42 crc kubenswrapper[4823]: I1216 06:57:42.174041 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-thj57" event={"ID":"dca532ee-e66a-411a-afcc-646f96a22a62","Type":"ContainerStarted","Data":"2d358bd6f6c0e8e78ad8d528e42077f33fdee2245475689910ade600668ec0c7"} Dec 16 06:57:42 crc kubenswrapper[4823]: I1216 06:57:42.175080 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-thj57" Dec 16 06:57:42 crc kubenswrapper[4823]: I1216 06:57:42.178854 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:57:42 crc kubenswrapper[4823]: E1216 06:57:42.179404 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:57:42.67937608 +0000 UTC m=+141.167942203 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:57:42 crc kubenswrapper[4823]: I1216 06:57:42.179714 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rmx2d\" (UID: \"0a48b03b-402f-48b1-a3b7-52690850de42\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmx2d" Dec 16 06:57:42 crc kubenswrapper[4823]: E1216 06:57:42.186040 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:57:42.686011624 +0000 UTC m=+141.174577747 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rmx2d" (UID: "0a48b03b-402f-48b1-a3b7-52690850de42") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:57:42 crc kubenswrapper[4823]: I1216 06:57:42.186437 4823 generic.go:334] "Generic (PLEG): container finished" podID="ca0a37ba-9a04-4d90-8ee8-6797791303a4" containerID="1dca2cdbbcf648ec2ee0380ca0ab9d4bc720823ad8a8f0eed4a240c7e749e2b2" exitCode=0 Dec 16 06:57:42 crc kubenswrapper[4823]: I1216 06:57:42.186499 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5rs6p" event={"ID":"ca0a37ba-9a04-4d90-8ee8-6797791303a4","Type":"ContainerDied","Data":"1dca2cdbbcf648ec2ee0380ca0ab9d4bc720823ad8a8f0eed4a240c7e749e2b2"} Dec 16 06:57:42 crc kubenswrapper[4823]: I1216 06:57:42.186530 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5rs6p" event={"ID":"ca0a37ba-9a04-4d90-8ee8-6797791303a4","Type":"ContainerStarted","Data":"5b2d6bd1185da0350c5fc6bbaa0e616d020a190daff1af851f292dce9def1ca6"} Dec 16 06:57:42 crc kubenswrapper[4823]: I1216 06:57:42.201829 4823 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-thj57 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.27:8080/healthz\": dial tcp 10.217.0.27:8080: connect: connection refused" start-of-body= Dec 16 06:57:42 crc kubenswrapper[4823]: I1216 06:57:42.201904 4823 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-thj57" podUID="dca532ee-e66a-411a-afcc-646f96a22a62" containerName="marketplace-operator" 
probeResult="failure" output="Get \"http://10.217.0.27:8080/healthz\": dial tcp 10.217.0.27:8080: connect: connection refused" Dec 16 06:57:42 crc kubenswrapper[4823]: I1216 06:57:42.220640 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-mgbxj" event={"ID":"e2376625-99df-4517-a64a-6c8b1d7edc20","Type":"ContainerStarted","Data":"ae4a6b333a602fcc15a17968a3ac7a2ace55000efb8d333848527c7c052d4231"} Dec 16 06:57:42 crc kubenswrapper[4823]: I1216 06:57:42.239539 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qqqjd" event={"ID":"05f64e2f-791a-463e-8ec4-340732e212ee","Type":"ContainerStarted","Data":"f4583ccea4386601f148f3953f02fe772ea1a947a810ad32f1cbaded52017d5b"} Dec 16 06:57:42 crc kubenswrapper[4823]: I1216 06:57:42.262296 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-4bxzg" event={"ID":"58900f48-68af-4712-bb30-9d832c26ce02","Type":"ContainerStarted","Data":"55d53dfe0eedd5bebe4d5b5e3c800f871f9f9ff0c4cb2ea45f72c22d6db53fc9"} Dec 16 06:57:42 crc kubenswrapper[4823]: I1216 06:57:42.281768 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:57:42 crc kubenswrapper[4823]: E1216 06:57:42.282754 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:57:42.782721089 +0000 UTC m=+141.271287272 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:57:42 crc kubenswrapper[4823]: I1216 06:57:42.296035 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-plnfh" event={"ID":"2dea4f36-2ae5-4363-a65c-0b7346f02661","Type":"ContainerStarted","Data":"233be53319aaf06d5c772b9fdc34ca8a3943eb013859ee9f72f645896163fc7d"} Dec 16 06:57:42 crc kubenswrapper[4823]: I1216 06:57:42.303780 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-plnfh" Dec 16 06:57:42 crc kubenswrapper[4823]: I1216 06:57:42.314302 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2glj6" event={"ID":"a1c5f775-cbe4-44d3-8ef3-19dd69e27b47","Type":"ContainerStarted","Data":"4f9e2eeb8b11fd96dd969b864b8ea94559990ffaed3276e4ad9c588e4dbe6a7a"} Dec 16 06:57:42 crc kubenswrapper[4823]: I1216 06:57:42.333874 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-9lh5d" event={"ID":"553e9a81-ff35-4dd6-bbb7-d55749d88e45","Type":"ContainerStarted","Data":"91e5d68c94f5354d904f0378c214f3f88ada2c93598821e0547f2bfc66813f47"} Dec 16 06:57:42 crc kubenswrapper[4823]: I1216 06:57:42.383270 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rmx2d\" (UID: \"0a48b03b-402f-48b1-a3b7-52690850de42\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmx2d" Dec 16 06:57:42 crc kubenswrapper[4823]: E1216 06:57:42.386554 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:57:42.886534855 +0000 UTC m=+141.375100978 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rmx2d" (UID: "0a48b03b-402f-48b1-a3b7-52690850de42") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:57:42 crc kubenswrapper[4823]: I1216 06:57:42.411556 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-c69wg" event={"ID":"7a5ef820-b734-4f0a-92f9-ffbfb5fd0d77","Type":"ContainerStarted","Data":"d799e3d3996fb36c76a86d31a24b6e6aa7a8ee3ea25c1e972e123015e2282b0e"} Dec 16 06:57:42 crc kubenswrapper[4823]: I1216 06:57:42.430303 4823 patch_prober.go:28] interesting pod/router-default-5444994796-lpkx6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 16 06:57:42 crc kubenswrapper[4823]: [-]has-synced failed: reason withheld Dec 16 06:57:42 crc kubenswrapper[4823]: [+]process-running ok Dec 16 06:57:42 crc kubenswrapper[4823]: healthz check failed Dec 16 06:57:42 crc kubenswrapper[4823]: I1216 06:57:42.430370 4823 prober.go:107] "Probe 
failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lpkx6" podUID="843619ff-c0f4-4389-b3ef-62c282b20303" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 16 06:57:42 crc kubenswrapper[4823]: I1216 06:57:42.439481 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431125-j4w8x" event={"ID":"b6b1d27b-235a-4b1e-adaa-512f3ae25954","Type":"ContainerStarted","Data":"4a8ec26b08c3b8c285f3f70aa106e1f078903dbf59cafd7a4233de40cde93cfd"} Dec 16 06:57:42 crc kubenswrapper[4823]: I1216 06:57:42.478523 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dmbvr" event={"ID":"142b25e7-9ad6-4a22-8f1c-8bd280329db9","Type":"ContainerStarted","Data":"c9653b1b5b580871060a113b549ee8d63b5a83f57deb86da24b8e83b72c68df3"} Dec 16 06:57:42 crc kubenswrapper[4823]: I1216 06:57:42.500906 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-dpsl6" event={"ID":"593f793a-bb15-4f83-8454-e3a1ced41667","Type":"ContainerStarted","Data":"f56b3277d15b8e510be8cd9c311e1e0a89fe3df1d99a30d357c2d574067bc784"} Dec 16 06:57:42 crc kubenswrapper[4823]: I1216 06:57:42.500991 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-dpsl6" event={"ID":"593f793a-bb15-4f83-8454-e3a1ced41667","Type":"ContainerStarted","Data":"8a8593f9c3bc8e0468e770761fa602df1afba579eafc442f99556adbe11f5465"} Dec 16 06:57:42 crc kubenswrapper[4823]: I1216 06:57:42.513185 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-jmp8w" event={"ID":"10d2c2db-5e69-4c14-92be-c6c7f6c9b04a","Type":"ContainerStarted","Data":"621ff57d2739d946afa27d2af727af169aa0ae5241bc1702ce11a0d5874bede1"} Dec 16 06:57:42 crc 
kubenswrapper[4823]: I1216 06:57:42.516021 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:57:42 crc kubenswrapper[4823]: E1216 06:57:42.524133 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:57:43.024097699 +0000 UTC m=+141.512663822 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:57:42 crc kubenswrapper[4823]: I1216 06:57:42.524433 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rmx2d\" (UID: \"0a48b03b-402f-48b1-a3b7-52690850de42\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmx2d" Dec 16 06:57:42 crc kubenswrapper[4823]: E1216 06:57:42.530507 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-16 06:57:43.030477085 +0000 UTC m=+141.519043208 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rmx2d" (UID: "0a48b03b-402f-48b1-a3b7-52690850de42") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:57:42 crc kubenswrapper[4823]: I1216 06:57:42.553078 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l9qfp" event={"ID":"ed1826b1-19af-4d50-b293-5c22c03fbfe7","Type":"ContainerStarted","Data":"9bc64d086ae6f1cabef47bab47f65682b87176657678a27f621d0b75431e4432"} Dec 16 06:57:42 crc kubenswrapper[4823]: I1216 06:57:42.594832 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bdlsz" event={"ID":"9a66a086-81cf-498e-aada-d06b41019b1f","Type":"ContainerStarted","Data":"388ddb91c24a819e0c7c33d3a8a462e975f4ac20c5c6a16bc19eadc262974218"} Dec 16 06:57:42 crc kubenswrapper[4823]: I1216 06:57:42.607251 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8xpl2" event={"ID":"9125b2ca-3f9a-47d3-8422-cae6f85f36d1","Type":"ContainerStarted","Data":"269b8c8dc4f599e5631562b7a1d0edfad0a59f94c9d3c6fc8406adc24871f995"} Dec 16 06:57:42 crc kubenswrapper[4823]: I1216 06:57:42.627394 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:57:42 crc kubenswrapper[4823]: E1216 06:57:42.631080 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:57:43.131051401 +0000 UTC m=+141.619617524 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:57:42 crc kubenswrapper[4823]: I1216 06:57:42.716393 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-bh4xp" podStartSLOduration=123.716368381 podStartE2EDuration="2m3.716368381s" podCreationTimestamp="2025-12-16 06:55:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:57:42.713882757 +0000 UTC m=+141.202448880" watchObservedRunningTime="2025-12-16 06:57:42.716368381 +0000 UTC m=+141.204934514" Dec 16 06:57:42 crc kubenswrapper[4823]: I1216 06:57:42.718257 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fxqpl" event={"ID":"410cd31d-8e32-4101-b933-2c7bd673c17f","Type":"ContainerStarted","Data":"3b2512ed5acebc5da8813bc99d0bcd076fba97bb71448c38bf3cb571c667d867"} Dec 16 06:57:42 crc kubenswrapper[4823]: I1216 06:57:42.718326 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fxqpl" event={"ID":"410cd31d-8e32-4101-b933-2c7bd673c17f","Type":"ContainerStarted","Data":"03da9df2bb802f40a865dc989ccc37444481eb7b69de99534ae7240ad5640625"} Dec 16 06:57:42 crc kubenswrapper[4823]: I1216 06:57:42.746131 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rmx2d\" (UID: \"0a48b03b-402f-48b1-a3b7-52690850de42\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmx2d" Dec 16 06:57:42 crc kubenswrapper[4823]: E1216 06:57:42.746900 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:57:43.246880461 +0000 UTC m=+141.735446584 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rmx2d" (UID: "0a48b03b-402f-48b1-a3b7-52690850de42") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:57:42 crc kubenswrapper[4823]: I1216 06:57:42.778418 4823 generic.go:334] "Generic (PLEG): container finished" podID="b7af89fb-e572-4c0d-a269-a65d03ac6e0e" containerID="168c83434f245cd6d553a8ee128c8eb540ad4f2961bf72c40aef4158414298e6" exitCode=0 Dec 16 06:57:42 crc kubenswrapper[4823]: I1216 06:57:42.778798 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-s6mwx" event={"ID":"b7af89fb-e572-4c0d-a269-a65d03ac6e0e","Type":"ContainerDied","Data":"168c83434f245cd6d553a8ee128c8eb540ad4f2961bf72c40aef4158414298e6"} Dec 16 06:57:42 crc kubenswrapper[4823]: I1216 06:57:42.817608 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-llvdl" event={"ID":"7c559cce-7066-4c52-ad84-9c748243f010","Type":"ContainerStarted","Data":"32d4a341770a720bddcaac2ac591f27a852aa506a5bb0866710433ae92c3219e"} Dec 16 06:57:42 crc kubenswrapper[4823]: I1216 06:57:42.819187 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-llvdl" Dec 16 06:57:42 crc kubenswrapper[4823]: I1216 06:57:42.820063 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bdlsz" podStartSLOduration=123.82000199 podStartE2EDuration="2m3.82000199s" podCreationTimestamp="2025-12-16 06:55:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:57:42.817368841 +0000 UTC m=+141.305934964" watchObservedRunningTime="2025-12-16 06:57:42.82000199 +0000 UTC m=+141.308568113" Dec 16 06:57:42 crc kubenswrapper[4823]: I1216 06:57:42.820971 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8xpl2" podStartSLOduration=123.820963993 podStartE2EDuration="2m3.820963993s" podCreationTimestamp="2025-12-16 06:55:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:57:42.779267645 +0000 UTC m=+141.267833788" watchObservedRunningTime="2025-12-16 06:57:42.820963993 +0000 UTC m=+141.309530116" Dec 16 06:57:42 crc kubenswrapper[4823]: I1216 06:57:42.834194 4823 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-llvdl container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.31:8443/healthz\": dial tcp 10.217.0.31:8443: connect: connection refused" start-of-body= Dec 16 06:57:42 crc kubenswrapper[4823]: I1216 06:57:42.834929 4823 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-llvdl" podUID="7c559cce-7066-4c52-ad84-9c748243f010" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.31:8443/healthz\": dial tcp 10.217.0.31:8443: connect: connection refused" Dec 16 06:57:42 crc kubenswrapper[4823]: I1216 06:57:42.847195 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:57:42 crc kubenswrapper[4823]: E1216 06:57:42.847522 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:57:43.347465397 +0000 UTC m=+141.836031520 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:57:42 crc kubenswrapper[4823]: I1216 06:57:42.847688 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rmx2d\" (UID: \"0a48b03b-402f-48b1-a3b7-52690850de42\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmx2d" Dec 16 06:57:42 crc kubenswrapper[4823]: E1216 06:57:42.848356 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:57:43.348333707 +0000 UTC m=+141.836899830 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rmx2d" (UID: "0a48b03b-402f-48b1-a3b7-52690850de42") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:57:42 crc kubenswrapper[4823]: I1216 06:57:42.849279 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-c4c5h" event={"ID":"37c33a89-18c7-457a-a8ce-85c7721719fc","Type":"ContainerStarted","Data":"1d7359850860dda69ac38cc8d7af7816c57a916ad3394fc71cb20473104cbcc2"} Dec 16 06:57:42 crc kubenswrapper[4823]: I1216 06:57:42.849335 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-c4c5h" event={"ID":"37c33a89-18c7-457a-a8ce-85c7721719fc","Type":"ContainerStarted","Data":"4679c617a3f8c64d582c663e409b867d38463a9a18fc4be1483afa615dbf06ed"} Dec 16 06:57:42 crc kubenswrapper[4823]: I1216 06:57:42.876536 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-lpkx6" event={"ID":"843619ff-c0f4-4389-b3ef-62c282b20303","Type":"ContainerStarted","Data":"0c6b0615ed937c72bcd0cfd9e776296040c8cb453fcebcbb06c7e0dbb17a9563"} Dec 16 06:57:42 crc kubenswrapper[4823]: I1216 06:57:42.904531 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-jmp8w" podStartSLOduration=122.904511444 podStartE2EDuration="2m2.904511444s" podCreationTimestamp="2025-12-16 06:55:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:57:42.860900902 +0000 UTC m=+141.349467025" 
watchObservedRunningTime="2025-12-16 06:57:42.904511444 +0000 UTC m=+141.393077557" Dec 16 06:57:42 crc kubenswrapper[4823]: I1216 06:57:42.906589 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-c69wg" podStartSLOduration=8.906579003000001 podStartE2EDuration="8.906579003s" podCreationTimestamp="2025-12-16 06:57:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:57:42.903293692 +0000 UTC m=+141.391859805" watchObservedRunningTime="2025-12-16 06:57:42.906579003 +0000 UTC m=+141.395145126" Dec 16 06:57:42 crc kubenswrapper[4823]: I1216 06:57:42.907435 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rtddw" event={"ID":"dc6756f9-a794-4667-8143-bd14bedd0cc3","Type":"ContainerStarted","Data":"469a7daf61efab16a22fd899c4022f4764d17a8027ad0da734746fa542754494"} Dec 16 06:57:42 crc kubenswrapper[4823]: I1216 06:57:42.918464 4823 patch_prober.go:28] interesting pod/downloads-7954f5f757-k2ljf container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body= Dec 16 06:57:42 crc kubenswrapper[4823]: I1216 06:57:42.918638 4823 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-k2ljf" podUID="a4d1fadc-5068-4443-8ba2-9bbd80233db2" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" Dec 16 06:57:42 crc kubenswrapper[4823]: I1216 06:57:42.951531 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:57:42 crc kubenswrapper[4823]: E1216 06:57:42.952337 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:57:43.452313818 +0000 UTC m=+141.940879941 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:57:42 crc kubenswrapper[4823]: I1216 06:57:42.953643 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qqqjd" podStartSLOduration=123.953622271 podStartE2EDuration="2m3.953622271s" podCreationTimestamp="2025-12-16 06:55:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:57:42.951622244 +0000 UTC m=+141.440188397" watchObservedRunningTime="2025-12-16 06:57:42.953622271 +0000 UTC m=+141.442188394" Dec 16 06:57:43 crc kubenswrapper[4823]: I1216 06:57:43.055175 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rmx2d\" (UID: 
\"0a48b03b-402f-48b1-a3b7-52690850de42\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmx2d" Dec 16 06:57:43 crc kubenswrapper[4823]: E1216 06:57:43.059290 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:57:43.55927431 +0000 UTC m=+142.047840433 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rmx2d" (UID: "0a48b03b-402f-48b1-a3b7-52690850de42") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:57:43 crc kubenswrapper[4823]: I1216 06:57:43.067161 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29431125-j4w8x" podStartSLOduration=123.062997785 podStartE2EDuration="2m3.062997785s" podCreationTimestamp="2025-12-16 06:55:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:57:43.062890561 +0000 UTC m=+141.551456684" watchObservedRunningTime="2025-12-16 06:57:43.062997785 +0000 UTC m=+141.551563908" Dec 16 06:57:43 crc kubenswrapper[4823]: I1216 06:57:43.086999 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-thj57" podStartSLOduration=124.086971165 podStartE2EDuration="2m4.086971165s" podCreationTimestamp="2025-12-16 06:55:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 
06:57:43.030454426 +0000 UTC m=+141.519020579" watchObservedRunningTime="2025-12-16 06:57:43.086971165 +0000 UTC m=+141.575537288" Dec 16 06:57:43 crc kubenswrapper[4823]: I1216 06:57:43.156839 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:57:43 crc kubenswrapper[4823]: E1216 06:57:43.157343 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:57:43.65732207 +0000 UTC m=+142.145888193 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:57:43 crc kubenswrapper[4823]: I1216 06:57:43.166315 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-mgbxj" podStartSLOduration=123.166290282 podStartE2EDuration="2m3.166290282s" podCreationTimestamp="2025-12-16 06:55:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:57:43.163107786 +0000 UTC m=+141.651673909" watchObservedRunningTime="2025-12-16 06:57:43.166290282 +0000 UTC m=+141.654856405" Dec 16 06:57:43 crc kubenswrapper[4823]: I1216 
06:57:43.199872 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-dpsl6" podStartSLOduration=124.199846605 podStartE2EDuration="2m4.199846605s" podCreationTimestamp="2025-12-16 06:55:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:57:43.19878539 +0000 UTC m=+141.687351513" watchObservedRunningTime="2025-12-16 06:57:43.199846605 +0000 UTC m=+141.688412728" Dec 16 06:57:43 crc kubenswrapper[4823]: I1216 06:57:43.262762 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rmx2d\" (UID: \"0a48b03b-402f-48b1-a3b7-52690850de42\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmx2d" Dec 16 06:57:43 crc kubenswrapper[4823]: E1216 06:57:43.263622 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:57:43.763603669 +0000 UTC m=+142.252169792 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rmx2d" (UID: "0a48b03b-402f-48b1-a3b7-52690850de42") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:57:43 crc kubenswrapper[4823]: I1216 06:57:43.369755 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:57:43 crc kubenswrapper[4823]: E1216 06:57:43.370271 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:57:43.870242169 +0000 UTC m=+142.358808282 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:57:43 crc kubenswrapper[4823]: I1216 06:57:43.370917 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rmx2d\" (UID: \"0a48b03b-402f-48b1-a3b7-52690850de42\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmx2d" Dec 16 06:57:43 crc kubenswrapper[4823]: I1216 06:57:43.386099 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5rs6p" podStartSLOduration=124.386075553 podStartE2EDuration="2m4.386075553s" podCreationTimestamp="2025-12-16 06:55:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:57:43.320739167 +0000 UTC m=+141.809305290" watchObservedRunningTime="2025-12-16 06:57:43.386075553 +0000 UTC m=+141.874641676" Dec 16 06:57:43 crc kubenswrapper[4823]: E1216 06:57:43.387564 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:57:43.887531222 +0000 UTC m=+142.376097345 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rmx2d" (UID: "0a48b03b-402f-48b1-a3b7-52690850de42") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:57:43 crc kubenswrapper[4823]: I1216 06:57:43.412911 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-bx552" podStartSLOduration=124.412880208 podStartE2EDuration="2m4.412880208s" podCreationTimestamp="2025-12-16 06:55:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:57:43.38270732 +0000 UTC m=+141.871273443" watchObservedRunningTime="2025-12-16 06:57:43.412880208 +0000 UTC m=+141.901446321" Dec 16 06:57:43 crc kubenswrapper[4823]: I1216 06:57:43.443494 4823 patch_prober.go:28] interesting pod/router-default-5444994796-lpkx6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 16 06:57:43 crc kubenswrapper[4823]: [-]has-synced failed: reason withheld Dec 16 06:57:43 crc kubenswrapper[4823]: [+]process-running ok Dec 16 06:57:43 crc kubenswrapper[4823]: healthz check failed Dec 16 06:57:43 crc kubenswrapper[4823]: I1216 06:57:43.443597 4823 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lpkx6" podUID="843619ff-c0f4-4389-b3ef-62c282b20303" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 16 06:57:43 crc kubenswrapper[4823]: I1216 06:57:43.449388 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-console-operator/console-operator-58897d9998-vnw5c" podStartSLOduration=124.44937413 podStartE2EDuration="2m4.44937413s" podCreationTimestamp="2025-12-16 06:55:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:57:43.447807388 +0000 UTC m=+141.936373511" watchObservedRunningTime="2025-12-16 06:57:43.44937413 +0000 UTC m=+141.937940253" Dec 16 06:57:43 crc kubenswrapper[4823]: I1216 06:57:43.479798 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:57:43 crc kubenswrapper[4823]: E1216 06:57:43.480196 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:57:43.98017944 +0000 UTC m=+142.468745563 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:57:43 crc kubenswrapper[4823]: I1216 06:57:43.496224 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-llvdl" podStartSLOduration=124.496203392 podStartE2EDuration="2m4.496203392s" podCreationTimestamp="2025-12-16 06:55:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:57:43.495090914 +0000 UTC m=+141.983657037" watchObservedRunningTime="2025-12-16 06:57:43.496203392 +0000 UTC m=+141.984769515" Dec 16 06:57:43 crc kubenswrapper[4823]: I1216 06:57:43.581918 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rmx2d\" (UID: \"0a48b03b-402f-48b1-a3b7-52690850de42\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmx2d" Dec 16 06:57:43 crc kubenswrapper[4823]: E1216 06:57:43.582257 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:57:44.082244757 +0000 UTC m=+142.570810880 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rmx2d" (UID: "0a48b03b-402f-48b1-a3b7-52690850de42") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:57:43 crc kubenswrapper[4823]: I1216 06:57:43.593524 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-c4c5h" podStartSLOduration=124.593507058 podStartE2EDuration="2m4.593507058s" podCreationTimestamp="2025-12-16 06:55:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:57:43.548699334 +0000 UTC m=+142.037265467" watchObservedRunningTime="2025-12-16 06:57:43.593507058 +0000 UTC m=+142.082073171" Dec 16 06:57:43 crc kubenswrapper[4823]: I1216 06:57:43.682894 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:57:43 crc kubenswrapper[4823]: E1216 06:57:43.683865 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:57:44.183848898 +0000 UTC m=+142.672415011 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:57:43 crc kubenswrapper[4823]: I1216 06:57:43.785755 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rmx2d\" (UID: \"0a48b03b-402f-48b1-a3b7-52690850de42\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmx2d" Dec 16 06:57:43 crc kubenswrapper[4823]: E1216 06:57:43.786270 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:57:44.286245855 +0000 UTC m=+142.774811978 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rmx2d" (UID: "0a48b03b-402f-48b1-a3b7-52690850de42") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:57:43 crc kubenswrapper[4823]: I1216 06:57:43.886390 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:57:43 crc kubenswrapper[4823]: E1216 06:57:43.886619 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:57:44.386579843 +0000 UTC m=+142.875145966 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:57:43 crc kubenswrapper[4823]: I1216 06:57:43.886755 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rmx2d\" (UID: \"0a48b03b-402f-48b1-a3b7-52690850de42\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmx2d" Dec 16 06:57:43 crc kubenswrapper[4823]: E1216 06:57:43.887531 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:57:44.387523435 +0000 UTC m=+142.876089558 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rmx2d" (UID: "0a48b03b-402f-48b1-a3b7-52690850de42") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:57:43 crc kubenswrapper[4823]: I1216 06:57:43.922144 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5dtcg" event={"ID":"ab7ea4d1-0209-45ec-a40f-3a7bb6bc20ac","Type":"ContainerStarted","Data":"9e98f3d34c2326733aaa57c33e898a54e56bb6bab14920fd021f71a407f57586"} Dec 16 06:57:43 crc kubenswrapper[4823]: I1216 06:57:43.922755 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5dtcg" Dec 16 06:57:43 crc kubenswrapper[4823]: I1216 06:57:43.926520 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-s6mwx" event={"ID":"b7af89fb-e572-4c0d-a269-a65d03ac6e0e","Type":"ContainerStarted","Data":"e9a1fc2752215289248a74a2a9b3f53caf76273e5ec8279e61741fe0042ee0b1"} Dec 16 06:57:43 crc kubenswrapper[4823]: I1216 06:57:43.926582 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-s6mwx" event={"ID":"b7af89fb-e572-4c0d-a269-a65d03ac6e0e","Type":"ContainerStarted","Data":"d11176c9b8a3c78b1686d3838ff43d3577d217354b17cf041a1da1b942bc1f74"} Dec 16 06:57:43 crc kubenswrapper[4823]: I1216 06:57:43.928741 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-6xfbm" event={"ID":"60b58907-b6e9-4a6d-b442-9c79d839bac9","Type":"ContainerStarted","Data":"2111ccddf541caab762abd8a48bce78742d90876655f80bec411b32630271e9e"} Dec 16 06:57:43 
crc kubenswrapper[4823]: I1216 06:57:43.928965 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-6xfbm" Dec 16 06:57:43 crc kubenswrapper[4823]: I1216 06:57:43.930522 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-4bxzg" event={"ID":"58900f48-68af-4712-bb30-9d832c26ce02","Type":"ContainerStarted","Data":"77b39e513aa9d970142bb365c52d850c06a995f00284998503f2594d4d033d27"} Dec 16 06:57:43 crc kubenswrapper[4823]: I1216 06:57:43.932434 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-llvdl" event={"ID":"7c559cce-7066-4c52-ad84-9c748243f010","Type":"ContainerStarted","Data":"c3e1f3389ee48407861c98e86eb7aacc3759772103004d0c47cdd63245961f33"} Dec 16 06:57:43 crc kubenswrapper[4823]: I1216 06:57:43.938406 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-c69wg" event={"ID":"7a5ef820-b734-4f0a-92f9-ffbfb5fd0d77","Type":"ContainerStarted","Data":"9b1486ae91515fff77ca294d2b78a2c970f6708c60a2c2a44e1b87144e05b53c"} Dec 16 06:57:43 crc kubenswrapper[4823]: I1216 06:57:43.947041 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431125-j4w8x" event={"ID":"b6b1d27b-235a-4b1e-adaa-512f3ae25954","Type":"ContainerStarted","Data":"0f59ba2538eb734d4a4b11eddc447e57ec3828b2088c0ac2a4cc3536f3e69b67"} Dec 16 06:57:43 crc kubenswrapper[4823]: I1216 06:57:43.950304 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-bx552" event={"ID":"e1c6d0f7-5a86-49fb-870d-991796812348","Type":"ContainerStarted","Data":"f6e12b2fac1db30605e7b6326bc5d0014d7736cae9a8b5fca123cb2fdf5c6913"} Dec 16 06:57:43 crc kubenswrapper[4823]: I1216 06:57:43.954581 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fxqpl" podStartSLOduration=124.954565658 podStartE2EDuration="2m4.954565658s" podCreationTimestamp="2025-12-16 06:55:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:57:43.595858527 +0000 UTC m=+142.084424650" watchObservedRunningTime="2025-12-16 06:57:43.954565658 +0000 UTC m=+142.443131781" Dec 16 06:57:43 crc kubenswrapper[4823]: I1216 06:57:43.955903 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5dtcg" podStartSLOduration=124.955898703 podStartE2EDuration="2m4.955898703s" podCreationTimestamp="2025-12-16 06:55:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:57:43.951375951 +0000 UTC m=+142.439942074" watchObservedRunningTime="2025-12-16 06:57:43.955898703 +0000 UTC m=+142.444464826" Dec 16 06:57:43 crc kubenswrapper[4823]: I1216 06:57:43.962036 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d9xfk" event={"ID":"f4a7b98f-600d-49af-9917-29b38e8877a4","Type":"ContainerStarted","Data":"a774844babed0a2ce6d022ecdeba251d43a7ba65a6b1219b39051b1252e1a087"} Dec 16 06:57:43 crc kubenswrapper[4823]: I1216 06:57:43.962127 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d9xfk" Dec 16 06:57:43 crc kubenswrapper[4823]: I1216 06:57:43.965540 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-jmp8w" event={"ID":"10d2c2db-5e69-4c14-92be-c6c7f6c9b04a","Type":"ContainerStarted","Data":"149455378f20aa7dfb2000bfd3a37126f532ca79ac31cce09d3adf2bbe1a0629"} Dec 16 06:57:43 crc 
kubenswrapper[4823]: I1216 06:57:43.969200 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-rmzm5" event={"ID":"f28cb7aa-489c-4015-860c-9d925da5f135","Type":"ContainerStarted","Data":"fd611262092f3985b55e52554da1a4ae33b35cbd756aa3b42ab638610e55f3b4"} Dec 16 06:57:43 crc kubenswrapper[4823]: I1216 06:57:43.969952 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-rmzm5" Dec 16 06:57:43 crc kubenswrapper[4823]: I1216 06:57:43.974009 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5dtcg" Dec 16 06:57:43 crc kubenswrapper[4823]: I1216 06:57:43.976034 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-g4ltp" event={"ID":"7933e531-8015-4c1a-ba84-fe22aa094da0","Type":"ContainerStarted","Data":"15b74dbf8c9a0fb6dca4821229748a615ae7df717b2fb4317bb7ed9ea1d4c77e"} Dec 16 06:57:43 crc kubenswrapper[4823]: I1216 06:57:43.976108 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-g4ltp" event={"ID":"7933e531-8015-4c1a-ba84-fe22aa094da0","Type":"ContainerStarted","Data":"a2ac598dc4bed81c283fe3f6a7d556f2457c9721ae165165e2ba93b9dabd43ce"} Dec 16 06:57:43 crc kubenswrapper[4823]: I1216 06:57:43.979655 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-4wgv6" event={"ID":"3266e10d-4bcd-4db0-aa15-53be39ecf437","Type":"ContainerStarted","Data":"de3e33adb594d138027417a73f1aff7fb13d5d40dbde76e0a3b54e0ed3203354"} Dec 16 06:57:43 crc kubenswrapper[4823]: I1216 06:57:43.983843 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-mgbxj" event={"ID":"e2376625-99df-4517-a64a-6c8b1d7edc20","Type":"ContainerStarted","Data":"2a487c3e1cd594ba8c95f1f46833e5c5883592c722bee99b7e55df82f3782fd9"} Dec 
16 06:57:43 crc kubenswrapper[4823]: I1216 06:57:43.988210 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qqqjd" event={"ID":"05f64e2f-791a-463e-8ec4-340732e212ee","Type":"ContainerStarted","Data":"06b5a691a524d8c930c28b61b4acb863e1649be232337907c5daef827cb2b843"} Dec 16 06:57:43 crc kubenswrapper[4823]: I1216 06:57:43.988353 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:57:43 crc kubenswrapper[4823]: E1216 06:57:43.988500 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:57:44.488471343 +0000 UTC m=+142.977037466 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:57:43 crc kubenswrapper[4823]: I1216 06:57:43.989260 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rmx2d\" (UID: \"0a48b03b-402f-48b1-a3b7-52690850de42\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmx2d" Dec 16 06:57:43 crc kubenswrapper[4823]: E1216 06:57:43.990519 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:57:44.490502352 +0000 UTC m=+142.979068475 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rmx2d" (UID: "0a48b03b-402f-48b1-a3b7-52690850de42") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:57:43 crc kubenswrapper[4823]: I1216 06:57:43.998695 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xkdq6" event={"ID":"eff38aa7-d0b3-455b-b0ca-1034fc06a182","Type":"ContainerStarted","Data":"4ad426ce520a0b388bf004948f1ad32b9024bb0efe7a1cd8e3017434f79c0dde"} Dec 16 06:57:44 crc kubenswrapper[4823]: I1216 06:57:44.003622 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-llvdl" Dec 16 06:57:44 crc kubenswrapper[4823]: I1216 06:57:44.017203 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8xpl2" event={"ID":"9125b2ca-3f9a-47d3-8422-cae6f85f36d1","Type":"ContainerStarted","Data":"781a4b67bf77786ca477247b4a36d320ffcc5e6cb5ec936ee89764666333c7a2"} Dec 16 06:57:44 crc kubenswrapper[4823]: I1216 06:57:44.035724 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-9lh5d" event={"ID":"553e9a81-ff35-4dd6-bbb7-d55749d88e45","Type":"ContainerStarted","Data":"d4510acf0b63123580af6f3e81af39e025f50cf53f732b1d314eb93b397c508a"} Dec 16 06:57:44 crc kubenswrapper[4823]: I1216 06:57:44.035787 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-9lh5d" 
event={"ID":"553e9a81-ff35-4dd6-bbb7-d55749d88e45","Type":"ContainerStarted","Data":"3901887023adbb3c42cfff3a45d7d153720e0ba1bce46236b1dfb82f83287706"} Dec 16 06:57:44 crc kubenswrapper[4823]: I1216 06:57:44.038408 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-jsk55" event={"ID":"979cabf2-8f74-4c88-92e8-baaffd74d816","Type":"ContainerStarted","Data":"4ee3ef1fbd23a8dc5905a4586999d4fe5b24a15917da1fe504d22f7a4a388e9f"} Dec 16 06:57:44 crc kubenswrapper[4823]: I1216 06:57:44.038799 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-jsk55" Dec 16 06:57:44 crc kubenswrapper[4823]: I1216 06:57:44.048280 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bdlsz" event={"ID":"9a66a086-81cf-498e-aada-d06b41019b1f","Type":"ContainerStarted","Data":"a0dc1cfcbacf29abdb49fffb0e6a5258b380967c8f9b247b8517cc81d072f775"} Dec 16 06:57:44 crc kubenswrapper[4823]: I1216 06:57:44.049474 4823 patch_prober.go:28] interesting pod/downloads-7954f5f757-k2ljf container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body= Dec 16 06:57:44 crc kubenswrapper[4823]: I1216 06:57:44.049649 4823 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-k2ljf" podUID="a4d1fadc-5068-4443-8ba2-9bbd80233db2" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" Dec 16 06:57:44 crc kubenswrapper[4823]: I1216 06:57:44.056465 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-thj57" Dec 16 
06:57:44 crc kubenswrapper[4823]: I1216 06:57:44.061720 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-6xfbm" podStartSLOduration=125.061688156 podStartE2EDuration="2m5.061688156s" podCreationTimestamp="2025-12-16 06:55:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:57:44.003957716 +0000 UTC m=+142.492523839" watchObservedRunningTime="2025-12-16 06:57:44.061688156 +0000 UTC m=+142.550254279" Dec 16 06:57:44 crc kubenswrapper[4823]: I1216 06:57:44.063401 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-s6mwx" podStartSLOduration=125.063396173 podStartE2EDuration="2m5.063396173s" podCreationTimestamp="2025-12-16 06:55:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:57:44.061264851 +0000 UTC m=+142.549830994" watchObservedRunningTime="2025-12-16 06:57:44.063396173 +0000 UTC m=+142.551962296" Dec 16 06:57:44 crc kubenswrapper[4823]: I1216 06:57:44.067477 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-vnw5c" Dec 16 06:57:44 crc kubenswrapper[4823]: I1216 06:57:44.091242 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:57:44 crc kubenswrapper[4823]: I1216 06:57:44.093103 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-4wgv6" podStartSLOduration=125.093080455 
podStartE2EDuration="2m5.093080455s" podCreationTimestamp="2025-12-16 06:55:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:57:44.091213292 +0000 UTC m=+142.579779415" watchObservedRunningTime="2025-12-16 06:57:44.093080455 +0000 UTC m=+142.581646578" Dec 16 06:57:44 crc kubenswrapper[4823]: E1216 06:57:44.093190 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:57:44.593141677 +0000 UTC m=+143.081707800 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:57:44 crc kubenswrapper[4823]: I1216 06:57:44.189528 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-9lh5d" podStartSLOduration=125.189502561 podStartE2EDuration="2m5.189502561s" podCreationTimestamp="2025-12-16 06:55:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:57:44.189093837 +0000 UTC m=+142.677659960" watchObservedRunningTime="2025-12-16 06:57:44.189502561 +0000 UTC m=+142.678068684" Dec 16 06:57:44 crc kubenswrapper[4823]: I1216 06:57:44.197959 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rmx2d\" (UID: \"0a48b03b-402f-48b1-a3b7-52690850de42\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmx2d" Dec 16 06:57:44 crc kubenswrapper[4823]: E1216 06:57:44.198502 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:57:44.698475964 +0000 UTC m=+143.187042087 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rmx2d" (UID: "0a48b03b-402f-48b1-a3b7-52690850de42") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:57:44 crc kubenswrapper[4823]: I1216 06:57:44.213112 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d9xfk" podStartSLOduration=124.213066487 podStartE2EDuration="2m4.213066487s" podCreationTimestamp="2025-12-16 06:55:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:57:44.212391764 +0000 UTC m=+142.700957907" watchObservedRunningTime="2025-12-16 06:57:44.213066487 +0000 UTC m=+142.701632620" Dec 16 06:57:44 crc kubenswrapper[4823]: I1216 06:57:44.230000 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-g4ltp" podStartSLOduration=125.229970608 podStartE2EDuration="2m5.229970608s" podCreationTimestamp="2025-12-16 06:55:39 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:57:44.229827423 +0000 UTC m=+142.718393556" watchObservedRunningTime="2025-12-16 06:57:44.229970608 +0000 UTC m=+142.718536731" Dec 16 06:57:44 crc kubenswrapper[4823]: I1216 06:57:44.302672 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:57:44 crc kubenswrapper[4823]: E1216 06:57:44.303075 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:57:44.802971272 +0000 UTC m=+143.291537395 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 16 06:57:44 crc kubenswrapper[4823]: I1216 06:57:44.303326 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rmx2d\" (UID: \"0a48b03b-402f-48b1-a3b7-52690850de42\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmx2d"
Dec 16 06:57:44 crc kubenswrapper[4823]: E1216 06:57:44.304317 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:57:44.804306147 +0000 UTC m=+143.292872280 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rmx2d" (UID: "0a48b03b-402f-48b1-a3b7-52690850de42") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 16 06:57:44 crc kubenswrapper[4823]: I1216 06:57:44.310142 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xkdq6" podStartSLOduration=125.310121444 podStartE2EDuration="2m5.310121444s" podCreationTimestamp="2025-12-16 06:55:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:57:44.307859267 +0000 UTC m=+142.796425390" watchObservedRunningTime="2025-12-16 06:57:44.310121444 +0000 UTC m=+142.798687567"
Dec 16 06:57:44 crc kubenswrapper[4823]: I1216 06:57:44.312147 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-jsk55" podStartSLOduration=125.312138891 podStartE2EDuration="2m5.312138891s" podCreationTimestamp="2025-12-16 06:55:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:57:44.288857725 +0000 UTC m=+142.777423848" watchObservedRunningTime="2025-12-16 06:57:44.312138891 +0000 UTC m=+142.800705014"
Dec 16 06:57:44 crc kubenswrapper[4823]: I1216 06:57:44.405147 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 16 06:57:44 crc kubenswrapper[4823]: E1216 06:57:44.405388 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:57:44.905340539 +0000 UTC m=+143.393906662 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 16 06:57:44 crc kubenswrapper[4823]: I1216 06:57:44.405712 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rmx2d\" (UID: \"0a48b03b-402f-48b1-a3b7-52690850de42\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmx2d"
Dec 16 06:57:44 crc kubenswrapper[4823]: E1216 06:57:44.406351 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:57:44.906338063 +0000 UTC m=+143.394904186 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rmx2d" (UID: "0a48b03b-402f-48b1-a3b7-52690850de42") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 16 06:57:44 crc kubenswrapper[4823]: I1216 06:57:44.407575 4823 patch_prober.go:28] interesting pod/router-default-5444994796-lpkx6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Dec 16 06:57:44 crc kubenswrapper[4823]: [-]has-synced failed: reason withheld
Dec 16 06:57:44 crc kubenswrapper[4823]: [+]process-running ok
Dec 16 06:57:44 crc kubenswrapper[4823]: healthz check failed
Dec 16 06:57:44 crc kubenswrapper[4823]: I1216 06:57:44.407630 4823 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lpkx6" podUID="843619ff-c0f4-4389-b3ef-62c282b20303" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Dec 16 06:57:44 crc kubenswrapper[4823]: I1216 06:57:44.424321 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-rmzm5" podStartSLOduration=10.424293128 podStartE2EDuration="10.424293128s" podCreationTimestamp="2025-12-16 06:57:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:57:44.380378736 +0000 UTC m=+142.868944859" watchObservedRunningTime="2025-12-16 06:57:44.424293128 +0000 UTC m=+142.912859241"
Dec 16 06:57:44 crc kubenswrapper[4823]: I1216 06:57:44.507356 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 16 06:57:44 crc kubenswrapper[4823]: E1216 06:57:44.507809 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:57:45.007784637 +0000 UTC m=+143.496350760 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 16 06:57:44 crc kubenswrapper[4823]: I1216 06:57:44.608891 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rmx2d\" (UID: \"0a48b03b-402f-48b1-a3b7-52690850de42\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmx2d"
Dec 16 06:57:44 crc kubenswrapper[4823]: E1216 06:57:44.609704 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:57:45.109682768 +0000 UTC m=+143.598248891 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rmx2d" (UID: "0a48b03b-402f-48b1-a3b7-52690850de42") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 16 06:57:44 crc kubenswrapper[4823]: I1216 06:57:44.709913 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 16 06:57:44 crc kubenswrapper[4823]: E1216 06:57:44.710041 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:57:45.210007445 +0000 UTC m=+143.698573568 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 16 06:57:44 crc kubenswrapper[4823]: I1216 06:57:44.710351 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rmx2d\" (UID: \"0a48b03b-402f-48b1-a3b7-52690850de42\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmx2d"
Dec 16 06:57:44 crc kubenswrapper[4823]: E1216 06:57:44.710856 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:57:45.210830874 +0000 UTC m=+143.699396997 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rmx2d" (UID: "0a48b03b-402f-48b1-a3b7-52690850de42") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 16 06:57:44 crc kubenswrapper[4823]: I1216 06:57:44.808117 4823 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock"
Dec 16 06:57:44 crc kubenswrapper[4823]: I1216 06:57:44.811491 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 16 06:57:44 crc kubenswrapper[4823]: E1216 06:57:44.811757 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:57:45.31171353 +0000 UTC m=+143.800279643 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 16 06:57:44 crc kubenswrapper[4823]: I1216 06:57:44.812090 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rmx2d\" (UID: \"0a48b03b-402f-48b1-a3b7-52690850de42\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmx2d"
Dec 16 06:57:44 crc kubenswrapper[4823]: E1216 06:57:44.812554 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:57:45.312531397 +0000 UTC m=+143.801097530 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rmx2d" (UID: "0a48b03b-402f-48b1-a3b7-52690850de42") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 16 06:57:44 crc kubenswrapper[4823]: I1216 06:57:44.913400 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 16 06:57:44 crc kubenswrapper[4823]: E1216 06:57:44.913624 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:57:45.413568558 +0000 UTC m=+143.902134751 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 16 06:57:44 crc kubenswrapper[4823]: I1216 06:57:44.914144 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rmx2d\" (UID: \"0a48b03b-402f-48b1-a3b7-52690850de42\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmx2d"
Dec 16 06:57:44 crc kubenswrapper[4823]: E1216 06:57:44.914643 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:57:45.414632625 +0000 UTC m=+143.903198928 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rmx2d" (UID: "0a48b03b-402f-48b1-a3b7-52690850de42") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 16 06:57:44 crc kubenswrapper[4823]: I1216 06:57:44.929512 4823 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-6xfbm container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.34:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 16 06:57:44 crc kubenswrapper[4823]: I1216 06:57:44.929605 4823 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-6xfbm" podUID="60b58907-b6e9-4a6d-b442-9c79d839bac9" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.34:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Dec 16 06:57:44 crc kubenswrapper[4823]: I1216 06:57:44.962539 4823 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-d9xfk container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.39:5443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 16 06:57:44 crc kubenswrapper[4823]: I1216 06:57:44.962654 4823 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d9xfk" podUID="f4a7b98f-600d-49af-9917-29b38e8877a4" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.39:5443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Dec 16 06:57:45 crc kubenswrapper[4823]: I1216 06:57:45.000009 4823 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-12-16T06:57:44.80816551Z","Handler":null,"Name":""}
Dec 16 06:57:45 crc kubenswrapper[4823]: I1216 06:57:45.003881 4823 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0
Dec 16 06:57:45 crc kubenswrapper[4823]: I1216 06:57:45.003918 4823 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock
Dec 16 06:57:45 crc kubenswrapper[4823]: I1216 06:57:45.015269 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 16 06:57:45 crc kubenswrapper[4823]: I1216 06:57:45.068084 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Dec 16 06:57:45 crc kubenswrapper[4823]: I1216 06:57:45.069855 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-4bxzg" event={"ID":"58900f48-68af-4712-bb30-9d832c26ce02","Type":"ContainerStarted","Data":"5c3508837a6c60a28970895961d14d3e7c4c57015c85c46e3010b5ce3f176ee0"}
Dec 16 06:57:45 crc kubenswrapper[4823]: I1216 06:57:45.081690 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-jsk55"
Dec 16 06:57:45 crc kubenswrapper[4823]: I1216 06:57:45.102409 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d9xfk"
Dec 16 06:57:45 crc kubenswrapper[4823]: I1216 06:57:45.117534 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rmx2d\" (UID: \"0a48b03b-402f-48b1-a3b7-52690850de42\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmx2d"
Dec 16 06:57:45 crc kubenswrapper[4823]: I1216 06:57:45.238726 4823 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Dec 16 06:57:45 crc kubenswrapper[4823]: I1216 06:57:45.238783 4823 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rmx2d\" (UID: \"0a48b03b-402f-48b1-a3b7-52690850de42\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-rmx2d"
Dec 16 06:57:45 crc kubenswrapper[4823]: I1216 06:57:45.391515 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rmx2d\" (UID: \"0a48b03b-402f-48b1-a3b7-52690850de42\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmx2d"
Dec 16 06:57:45 crc kubenswrapper[4823]: I1216 06:57:45.407378 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-shg64"]
Dec 16 06:57:45 crc kubenswrapper[4823]: I1216 06:57:45.408715 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-shg64"
Dec 16 06:57:45 crc kubenswrapper[4823]: I1216 06:57:45.415568 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Dec 16 06:57:45 crc kubenswrapper[4823]: I1216 06:57:45.415933 4823 patch_prober.go:28] interesting pod/router-default-5444994796-lpkx6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Dec 16 06:57:45 crc kubenswrapper[4823]: [-]has-synced failed: reason withheld
Dec 16 06:57:45 crc kubenswrapper[4823]: [+]process-running ok
Dec 16 06:57:45 crc kubenswrapper[4823]: healthz check failed
Dec 16 06:57:45 crc kubenswrapper[4823]: I1216 06:57:45.415980 4823 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lpkx6" podUID="843619ff-c0f4-4389-b3ef-62c282b20303" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Dec 16 06:57:45 crc kubenswrapper[4823]: I1216 06:57:45.424443 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-shg64"]
Dec 16 06:57:45 crc kubenswrapper[4823]: I1216 06:57:45.431248 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-rmx2d"
Dec 16 06:57:45 crc kubenswrapper[4823]: I1216 06:57:45.533472 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fg8h\" (UniqueName: \"kubernetes.io/projected/726c1e86-0af2-45b0-bc89-af72df38eff8-kube-api-access-8fg8h\") pod \"certified-operators-shg64\" (UID: \"726c1e86-0af2-45b0-bc89-af72df38eff8\") " pod="openshift-marketplace/certified-operators-shg64"
Dec 16 06:57:45 crc kubenswrapper[4823]: I1216 06:57:45.533555 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/726c1e86-0af2-45b0-bc89-af72df38eff8-utilities\") pod \"certified-operators-shg64\" (UID: \"726c1e86-0af2-45b0-bc89-af72df38eff8\") " pod="openshift-marketplace/certified-operators-shg64"
Dec 16 06:57:45 crc kubenswrapper[4823]: I1216 06:57:45.533586 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/726c1e86-0af2-45b0-bc89-af72df38eff8-catalog-content\") pod \"certified-operators-shg64\" (UID: \"726c1e86-0af2-45b0-bc89-af72df38eff8\") " pod="openshift-marketplace/certified-operators-shg64"
Dec 16 06:57:45 crc kubenswrapper[4823]: I1216 06:57:45.566937 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-hhp8l"]
Dec 16 06:57:45 crc kubenswrapper[4823]: I1216 06:57:45.568140 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hhp8l"
Dec 16 06:57:45 crc kubenswrapper[4823]: I1216 06:57:45.573431 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Dec 16 06:57:45 crc kubenswrapper[4823]: I1216 06:57:45.597458 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-6xfbm"
Dec 16 06:57:45 crc kubenswrapper[4823]: I1216 06:57:45.614567 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hhp8l"]
Dec 16 06:57:45 crc kubenswrapper[4823]: I1216 06:57:45.639211 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fg8h\" (UniqueName: \"kubernetes.io/projected/726c1e86-0af2-45b0-bc89-af72df38eff8-kube-api-access-8fg8h\") pod \"certified-operators-shg64\" (UID: \"726c1e86-0af2-45b0-bc89-af72df38eff8\") " pod="openshift-marketplace/certified-operators-shg64"
Dec 16 06:57:45 crc kubenswrapper[4823]: I1216 06:57:45.639278 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/726c1e86-0af2-45b0-bc89-af72df38eff8-utilities\") pod \"certified-operators-shg64\" (UID: \"726c1e86-0af2-45b0-bc89-af72df38eff8\") " pod="openshift-marketplace/certified-operators-shg64"
Dec 16 06:57:45 crc kubenswrapper[4823]: I1216 06:57:45.639311 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/726c1e86-0af2-45b0-bc89-af72df38eff8-catalog-content\") pod \"certified-operators-shg64\" (UID: \"726c1e86-0af2-45b0-bc89-af72df38eff8\") " pod="openshift-marketplace/certified-operators-shg64"
Dec 16 06:57:45 crc kubenswrapper[4823]: I1216 06:57:45.640259 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/726c1e86-0af2-45b0-bc89-af72df38eff8-utilities\") pod \"certified-operators-shg64\" (UID: \"726c1e86-0af2-45b0-bc89-af72df38eff8\") " pod="openshift-marketplace/certified-operators-shg64"
Dec 16 06:57:45 crc kubenswrapper[4823]: I1216 06:57:45.640265 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/726c1e86-0af2-45b0-bc89-af72df38eff8-catalog-content\") pod \"certified-operators-shg64\" (UID: \"726c1e86-0af2-45b0-bc89-af72df38eff8\") " pod="openshift-marketplace/certified-operators-shg64"
Dec 16 06:57:45 crc kubenswrapper[4823]: I1216 06:57:45.663014 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fg8h\" (UniqueName: \"kubernetes.io/projected/726c1e86-0af2-45b0-bc89-af72df38eff8-kube-api-access-8fg8h\") pod \"certified-operators-shg64\" (UID: \"726c1e86-0af2-45b0-bc89-af72df38eff8\") " pod="openshift-marketplace/certified-operators-shg64"
Dec 16 06:57:45 crc kubenswrapper[4823]: I1216 06:57:45.739386 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-shg64"
Dec 16 06:57:45 crc kubenswrapper[4823]: I1216 06:57:45.741837 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ppqn\" (UniqueName: \"kubernetes.io/projected/ca6b042f-7b3a-4204-90a8-d6a2c29fd271-kube-api-access-6ppqn\") pod \"community-operators-hhp8l\" (UID: \"ca6b042f-7b3a-4204-90a8-d6a2c29fd271\") " pod="openshift-marketplace/community-operators-hhp8l"
Dec 16 06:57:45 crc kubenswrapper[4823]: I1216 06:57:45.741907 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca6b042f-7b3a-4204-90a8-d6a2c29fd271-catalog-content\") pod \"community-operators-hhp8l\" (UID: \"ca6b042f-7b3a-4204-90a8-d6a2c29fd271\") " pod="openshift-marketplace/community-operators-hhp8l"
Dec 16 06:57:45 crc kubenswrapper[4823]: I1216 06:57:45.741945 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca6b042f-7b3a-4204-90a8-d6a2c29fd271-utilities\") pod \"community-operators-hhp8l\" (UID: \"ca6b042f-7b3a-4204-90a8-d6a2c29fd271\") " pod="openshift-marketplace/community-operators-hhp8l"
Dec 16 06:57:45 crc kubenswrapper[4823]: I1216 06:57:45.766597 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-c98pm"]
Dec 16 06:57:45 crc kubenswrapper[4823]: I1216 06:57:45.767973 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-c98pm"
Dec 16 06:57:45 crc kubenswrapper[4823]: I1216 06:57:45.801057 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes"
Dec 16 06:57:45 crc kubenswrapper[4823]: I1216 06:57:45.801772 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-c98pm"]
Dec 16 06:57:45 crc kubenswrapper[4823]: I1216 06:57:45.845472 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca6b042f-7b3a-4204-90a8-d6a2c29fd271-utilities\") pod \"community-operators-hhp8l\" (UID: \"ca6b042f-7b3a-4204-90a8-d6a2c29fd271\") " pod="openshift-marketplace/community-operators-hhp8l"
Dec 16 06:57:45 crc kubenswrapper[4823]: I1216 06:57:45.845525 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28fa6f6b-657a-4fe7-993f-6c97d5e53b3c-utilities\") pod \"certified-operators-c98pm\" (UID: \"28fa6f6b-657a-4fe7-993f-6c97d5e53b3c\") " pod="openshift-marketplace/certified-operators-c98pm"
Dec 16 06:57:45 crc kubenswrapper[4823]: I1216 06:57:45.845550 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75jkw\" (UniqueName: \"kubernetes.io/projected/28fa6f6b-657a-4fe7-993f-6c97d5e53b3c-kube-api-access-75jkw\") pod \"certified-operators-c98pm\" (UID: \"28fa6f6b-657a-4fe7-993f-6c97d5e53b3c\") " pod="openshift-marketplace/certified-operators-c98pm"
Dec 16 06:57:45 crc kubenswrapper[4823]: I1216 06:57:45.845606 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ppqn\" (UniqueName: \"kubernetes.io/projected/ca6b042f-7b3a-4204-90a8-d6a2c29fd271-kube-api-access-6ppqn\") pod \"community-operators-hhp8l\" (UID: \"ca6b042f-7b3a-4204-90a8-d6a2c29fd271\") " pod="openshift-marketplace/community-operators-hhp8l"
Dec 16 06:57:45 crc kubenswrapper[4823]: I1216 06:57:45.845640 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28fa6f6b-657a-4fe7-993f-6c97d5e53b3c-catalog-content\") pod \"certified-operators-c98pm\" (UID: \"28fa6f6b-657a-4fe7-993f-6c97d5e53b3c\") " pod="openshift-marketplace/certified-operators-c98pm"
Dec 16 06:57:45 crc kubenswrapper[4823]: I1216 06:57:45.845666 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca6b042f-7b3a-4204-90a8-d6a2c29fd271-catalog-content\") pod \"community-operators-hhp8l\" (UID: \"ca6b042f-7b3a-4204-90a8-d6a2c29fd271\") " pod="openshift-marketplace/community-operators-hhp8l"
Dec 16 06:57:45 crc kubenswrapper[4823]: I1216 06:57:45.846177 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca6b042f-7b3a-4204-90a8-d6a2c29fd271-catalog-content\") pod \"community-operators-hhp8l\" (UID: \"ca6b042f-7b3a-4204-90a8-d6a2c29fd271\") " pod="openshift-marketplace/community-operators-hhp8l"
Dec 16 06:57:45 crc kubenswrapper[4823]: I1216 06:57:45.846396 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca6b042f-7b3a-4204-90a8-d6a2c29fd271-utilities\") pod \"community-operators-hhp8l\" (UID: \"ca6b042f-7b3a-4204-90a8-d6a2c29fd271\") " pod="openshift-marketplace/community-operators-hhp8l"
Dec 16 06:57:45 crc kubenswrapper[4823]: I1216 06:57:45.876728 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ppqn\" (UniqueName: \"kubernetes.io/projected/ca6b042f-7b3a-4204-90a8-d6a2c29fd271-kube-api-access-6ppqn\") pod \"community-operators-hhp8l\" (UID: \"ca6b042f-7b3a-4204-90a8-d6a2c29fd271\") " pod="openshift-marketplace/community-operators-hhp8l"
Dec 16 06:57:45 crc kubenswrapper[4823]: I1216 06:57:45.881189 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hhp8l"
Dec 16 06:57:45 crc kubenswrapper[4823]: I1216 06:57:45.947071 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28fa6f6b-657a-4fe7-993f-6c97d5e53b3c-catalog-content\") pod \"certified-operators-c98pm\" (UID: \"28fa6f6b-657a-4fe7-993f-6c97d5e53b3c\") " pod="openshift-marketplace/certified-operators-c98pm"
Dec 16 06:57:45 crc kubenswrapper[4823]: I1216 06:57:45.947549 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28fa6f6b-657a-4fe7-993f-6c97d5e53b3c-utilities\") pod \"certified-operators-c98pm\" (UID: \"28fa6f6b-657a-4fe7-993f-6c97d5e53b3c\") " pod="openshift-marketplace/certified-operators-c98pm"
Dec 16 06:57:45 crc kubenswrapper[4823]: I1216 06:57:45.947576 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75jkw\" (UniqueName: \"kubernetes.io/projected/28fa6f6b-657a-4fe7-993f-6c97d5e53b3c-kube-api-access-75jkw\") pod \"certified-operators-c98pm\" (UID: \"28fa6f6b-657a-4fe7-993f-6c97d5e53b3c\") " pod="openshift-marketplace/certified-operators-c98pm"
Dec 16 06:57:45 crc kubenswrapper[4823]: I1216 06:57:45.948497 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28fa6f6b-657a-4fe7-993f-6c97d5e53b3c-catalog-content\") pod \"certified-operators-c98pm\" (UID: \"28fa6f6b-657a-4fe7-993f-6c97d5e53b3c\") " pod="openshift-marketplace/certified-operators-c98pm"
Dec 16 06:57:45 crc kubenswrapper[4823]: I1216 06:57:45.948630 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28fa6f6b-657a-4fe7-993f-6c97d5e53b3c-utilities\") pod \"certified-operators-c98pm\" (UID: \"28fa6f6b-657a-4fe7-993f-6c97d5e53b3c\") " pod="openshift-marketplace/certified-operators-c98pm"
Dec 16 06:57:45 crc kubenswrapper[4823]: I1216 06:57:45.948877 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Dec 16 06:57:45 crc kubenswrapper[4823]: I1216 06:57:45.949688 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Dec 16 06:57:45 crc kubenswrapper[4823]: I1216 06:57:45.958616 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt"
Dec 16 06:57:45 crc kubenswrapper[4823]: I1216 06:57:45.958641 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n"
Dec 16 06:57:45 crc kubenswrapper[4823]: I1216 06:57:45.987733 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-5zgrf"]
Dec 16 06:57:45 crc kubenswrapper[4823]: I1216 06:57:45.988234 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75jkw\" (UniqueName: \"kubernetes.io/projected/28fa6f6b-657a-4fe7-993f-6c97d5e53b3c-kube-api-access-75jkw\") pod \"certified-operators-c98pm\" (UID: \"28fa6f6b-657a-4fe7-993f-6c97d5e53b3c\") " pod="openshift-marketplace/certified-operators-c98pm"
Dec 16 06:57:45 crc kubenswrapper[4823]: I1216 06:57:45.992683 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5zgrf"
Dec 16 06:57:46 crc kubenswrapper[4823]: I1216 06:57:46.052134 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Dec 16 06:57:46 crc kubenswrapper[4823]: I1216 06:57:46.106302 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-4bxzg" event={"ID":"58900f48-68af-4712-bb30-9d832c26ce02","Type":"ContainerStarted","Data":"17685cbeaded111c5a9cd206c3880057eca8a69dd3c7efb8627184eb3697f31e"}
Dec 16 06:57:46 crc kubenswrapper[4823]: I1216 06:57:46.106354 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-4bxzg" event={"ID":"58900f48-68af-4712-bb30-9d832c26ce02","Type":"ContainerStarted","Data":"4d72ec5b07ccfc59cf6234e88f39e888a8b85a5a36079b9c910b39f173affb85"}
Dec 16 06:57:46 crc kubenswrapper[4823]: I1216 06:57:46.146984 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-c98pm" Dec 16 06:57:46 crc kubenswrapper[4823]: I1216 06:57:46.151097 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jndrd\" (UniqueName: \"kubernetes.io/projected/9e8470ac-325b-46ce-ac5d-6cbafc3c6164-kube-api-access-jndrd\") pod \"community-operators-5zgrf\" (UID: \"9e8470ac-325b-46ce-ac5d-6cbafc3c6164\") " pod="openshift-marketplace/community-operators-5zgrf" Dec 16 06:57:46 crc kubenswrapper[4823]: I1216 06:57:46.151766 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e8470ac-325b-46ce-ac5d-6cbafc3c6164-utilities\") pod \"community-operators-5zgrf\" (UID: \"9e8470ac-325b-46ce-ac5d-6cbafc3c6164\") " pod="openshift-marketplace/community-operators-5zgrf" Dec 16 06:57:46 crc kubenswrapper[4823]: I1216 06:57:46.151802 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e8470ac-325b-46ce-ac5d-6cbafc3c6164-catalog-content\") pod \"community-operators-5zgrf\" (UID: \"9e8470ac-325b-46ce-ac5d-6cbafc3c6164\") " pod="openshift-marketplace/community-operators-5zgrf" Dec 16 06:57:46 crc kubenswrapper[4823]: I1216 06:57:46.151888 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fd73cf97-a298-4436-8568-ea2cbd08c6f6-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"fd73cf97-a298-4436-8568-ea2cbd08c6f6\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 16 06:57:46 crc kubenswrapper[4823]: I1216 06:57:46.151928 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/fd73cf97-a298-4436-8568-ea2cbd08c6f6-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"fd73cf97-a298-4436-8568-ea2cbd08c6f6\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 16 06:57:46 crc kubenswrapper[4823]: I1216 06:57:46.151611 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-4bxzg" podStartSLOduration=12.151571319 podStartE2EDuration="12.151571319s" podCreationTimestamp="2025-12-16 06:57:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:57:46.136117948 +0000 UTC m=+144.624684081" watchObservedRunningTime="2025-12-16 06:57:46.151571319 +0000 UTC m=+144.640137452" Dec 16 06:57:46 crc kubenswrapper[4823]: I1216 06:57:46.151697 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5zgrf"] Dec 16 06:57:46 crc kubenswrapper[4823]: I1216 06:57:46.265309 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fd73cf97-a298-4436-8568-ea2cbd08c6f6-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"fd73cf97-a298-4436-8568-ea2cbd08c6f6\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 16 06:57:46 crc kubenswrapper[4823]: I1216 06:57:46.265497 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fd73cf97-a298-4436-8568-ea2cbd08c6f6-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"fd73cf97-a298-4436-8568-ea2cbd08c6f6\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 16 06:57:46 crc kubenswrapper[4823]: I1216 06:57:46.265654 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jndrd\" (UniqueName: 
\"kubernetes.io/projected/9e8470ac-325b-46ce-ac5d-6cbafc3c6164-kube-api-access-jndrd\") pod \"community-operators-5zgrf\" (UID: \"9e8470ac-325b-46ce-ac5d-6cbafc3c6164\") " pod="openshift-marketplace/community-operators-5zgrf" Dec 16 06:57:46 crc kubenswrapper[4823]: I1216 06:57:46.265705 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e8470ac-325b-46ce-ac5d-6cbafc3c6164-utilities\") pod \"community-operators-5zgrf\" (UID: \"9e8470ac-325b-46ce-ac5d-6cbafc3c6164\") " pod="openshift-marketplace/community-operators-5zgrf" Dec 16 06:57:46 crc kubenswrapper[4823]: I1216 06:57:46.265728 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e8470ac-325b-46ce-ac5d-6cbafc3c6164-catalog-content\") pod \"community-operators-5zgrf\" (UID: \"9e8470ac-325b-46ce-ac5d-6cbafc3c6164\") " pod="openshift-marketplace/community-operators-5zgrf" Dec 16 06:57:46 crc kubenswrapper[4823]: I1216 06:57:46.269738 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e8470ac-325b-46ce-ac5d-6cbafc3c6164-catalog-content\") pod \"community-operators-5zgrf\" (UID: \"9e8470ac-325b-46ce-ac5d-6cbafc3c6164\") " pod="openshift-marketplace/community-operators-5zgrf" Dec 16 06:57:46 crc kubenswrapper[4823]: I1216 06:57:46.269805 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e8470ac-325b-46ce-ac5d-6cbafc3c6164-utilities\") pod \"community-operators-5zgrf\" (UID: \"9e8470ac-325b-46ce-ac5d-6cbafc3c6164\") " pod="openshift-marketplace/community-operators-5zgrf" Dec 16 06:57:46 crc kubenswrapper[4823]: I1216 06:57:46.270981 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/fd73cf97-a298-4436-8568-ea2cbd08c6f6-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"fd73cf97-a298-4436-8568-ea2cbd08c6f6\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 16 06:57:46 crc kubenswrapper[4823]: I1216 06:57:46.318189 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fd73cf97-a298-4436-8568-ea2cbd08c6f6-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"fd73cf97-a298-4436-8568-ea2cbd08c6f6\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 16 06:57:46 crc kubenswrapper[4823]: I1216 06:57:46.320587 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jndrd\" (UniqueName: \"kubernetes.io/projected/9e8470ac-325b-46ce-ac5d-6cbafc3c6164-kube-api-access-jndrd\") pod \"community-operators-5zgrf\" (UID: \"9e8470ac-325b-46ce-ac5d-6cbafc3c6164\") " pod="openshift-marketplace/community-operators-5zgrf" Dec 16 06:57:46 crc kubenswrapper[4823]: I1216 06:57:46.336748 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 16 06:57:46 crc kubenswrapper[4823]: I1216 06:57:46.365423 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5zgrf" Dec 16 06:57:46 crc kubenswrapper[4823]: I1216 06:57:46.413350 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5rs6p" Dec 16 06:57:46 crc kubenswrapper[4823]: I1216 06:57:46.414005 4823 patch_prober.go:28] interesting pod/router-default-5444994796-lpkx6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 16 06:57:46 crc kubenswrapper[4823]: [-]has-synced failed: reason withheld Dec 16 06:57:46 crc kubenswrapper[4823]: [+]process-running ok Dec 16 06:57:46 crc kubenswrapper[4823]: healthz check failed Dec 16 06:57:46 crc kubenswrapper[4823]: I1216 06:57:46.414082 4823 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lpkx6" podUID="843619ff-c0f4-4389-b3ef-62c282b20303" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 16 06:57:46 crc kubenswrapper[4823]: I1216 06:57:46.414496 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5rs6p" Dec 16 06:57:46 crc kubenswrapper[4823]: I1216 06:57:46.441386 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5rs6p" Dec 16 06:57:46 crc kubenswrapper[4823]: I1216 06:57:46.460296 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-s6mwx" Dec 16 06:57:46 crc kubenswrapper[4823]: I1216 06:57:46.470796 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-s6mwx" Dec 16 06:57:46 crc kubenswrapper[4823]: I1216 06:57:46.501549 4823 patch_prober.go:28] interesting 
pod/apiserver-76f77b778f-s6mwx container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Dec 16 06:57:46 crc kubenswrapper[4823]: [+]log ok Dec 16 06:57:46 crc kubenswrapper[4823]: [+]etcd ok Dec 16 06:57:46 crc kubenswrapper[4823]: [+]poststarthook/start-apiserver-admission-initializer ok Dec 16 06:57:46 crc kubenswrapper[4823]: [+]poststarthook/generic-apiserver-start-informers ok Dec 16 06:57:46 crc kubenswrapper[4823]: [+]poststarthook/max-in-flight-filter ok Dec 16 06:57:46 crc kubenswrapper[4823]: [+]poststarthook/storage-object-count-tracker-hook ok Dec 16 06:57:46 crc kubenswrapper[4823]: [+]poststarthook/image.openshift.io-apiserver-caches ok Dec 16 06:57:46 crc kubenswrapper[4823]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Dec 16 06:57:46 crc kubenswrapper[4823]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Dec 16 06:57:46 crc kubenswrapper[4823]: [+]poststarthook/project.openshift.io-projectcache ok Dec 16 06:57:46 crc kubenswrapper[4823]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Dec 16 06:57:46 crc kubenswrapper[4823]: [+]poststarthook/openshift.io-startinformers ok Dec 16 06:57:46 crc kubenswrapper[4823]: [+]poststarthook/openshift.io-restmapperupdater ok Dec 16 06:57:46 crc kubenswrapper[4823]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Dec 16 06:57:46 crc kubenswrapper[4823]: livez check failed Dec 16 06:57:46 crc kubenswrapper[4823]: I1216 06:57:46.501625 4823 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-s6mwx" podUID="b7af89fb-e572-4c0d-a269-a65d03ac6e0e" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 16 06:57:46 crc kubenswrapper[4823]: I1216 06:57:46.546578 4823 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-marketplace/certified-operators-shg64"] Dec 16 06:57:46 crc kubenswrapper[4823]: I1216 06:57:46.548242 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-rmx2d"] Dec 16 06:57:46 crc kubenswrapper[4823]: W1216 06:57:46.577008 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod726c1e86_0af2_45b0_bc89_af72df38eff8.slice/crio-72e74b472ba819377be581c2983f0ab4ebb75afeec2d7810f57b32c397c3e2d4 WatchSource:0}: Error finding container 72e74b472ba819377be581c2983f0ab4ebb75afeec2d7810f57b32c397c3e2d4: Status 404 returned error can't find the container with id 72e74b472ba819377be581c2983f0ab4ebb75afeec2d7810f57b32c397c3e2d4 Dec 16 06:57:46 crc kubenswrapper[4823]: W1216 06:57:46.577533 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a48b03b_402f_48b1_a3b7_52690850de42.slice/crio-ccfba756a351719dde65ce1613b1a618a93494ddad1f764acf048b5f5aafbe29 WatchSource:0}: Error finding container ccfba756a351719dde65ce1613b1a618a93494ddad1f764acf048b5f5aafbe29: Status 404 returned error can't find the container with id ccfba756a351719dde65ce1613b1a618a93494ddad1f764acf048b5f5aafbe29 Dec 16 06:57:46 crc kubenswrapper[4823]: I1216 06:57:46.675181 4823 patch_prober.go:28] interesting pod/downloads-7954f5f757-k2ljf container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body= Dec 16 06:57:46 crc kubenswrapper[4823]: I1216 06:57:46.675240 4823 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-k2ljf" podUID="a4d1fadc-5068-4443-8ba2-9bbd80233db2" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: 
connect: connection refused" Dec 16 06:57:46 crc kubenswrapper[4823]: I1216 06:57:46.675523 4823 patch_prober.go:28] interesting pod/downloads-7954f5f757-k2ljf container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body= Dec 16 06:57:46 crc kubenswrapper[4823]: I1216 06:57:46.675542 4823 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-k2ljf" podUID="a4d1fadc-5068-4443-8ba2-9bbd80233db2" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" Dec 16 06:57:46 crc kubenswrapper[4823]: I1216 06:57:46.817475 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-c98pm"] Dec 16 06:57:46 crc kubenswrapper[4823]: I1216 06:57:46.848017 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5zgrf"] Dec 16 06:57:46 crc kubenswrapper[4823]: I1216 06:57:46.877996 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hhp8l"] Dec 16 06:57:46 crc kubenswrapper[4823]: W1216 06:57:46.943402 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e8470ac_325b_46ce_ac5d_6cbafc3c6164.slice/crio-4bc2d23b3e0977eadcd44ac5cc6ebbbd6914d1a2aa8b8fe248afc384ef3a8998 WatchSource:0}: Error finding container 4bc2d23b3e0977eadcd44ac5cc6ebbbd6914d1a2aa8b8fe248afc384ef3a8998: Status 404 returned error can't find the container with id 4bc2d23b3e0977eadcd44ac5cc6ebbbd6914d1a2aa8b8fe248afc384ef3a8998 Dec 16 06:57:46 crc kubenswrapper[4823]: W1216 06:57:46.945450 4823 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca6b042f_7b3a_4204_90a8_d6a2c29fd271.slice/crio-6470b7a85105c21d08d83f55a8f594c5024f0886ea1ebda8fa08330d9a52a595 WatchSource:0}: Error finding container 6470b7a85105c21d08d83f55a8f594c5024f0886ea1ebda8fa08330d9a52a595: Status 404 returned error can't find the container with id 6470b7a85105c21d08d83f55a8f594c5024f0886ea1ebda8fa08330d9a52a595 Dec 16 06:57:46 crc kubenswrapper[4823]: I1216 06:57:46.960543 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 16 06:57:47 crc kubenswrapper[4823]: W1216 06:57:47.035392 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podfd73cf97_a298_4436_8568_ea2cbd08c6f6.slice/crio-051c1ddb7fdaf2c6a279a13e6b21834a6f11bbee32422b2bf1fa9f5d449b3d46 WatchSource:0}: Error finding container 051c1ddb7fdaf2c6a279a13e6b21834a6f11bbee32422b2bf1fa9f5d449b3d46: Status 404 returned error can't find the container with id 051c1ddb7fdaf2c6a279a13e6b21834a6f11bbee32422b2bf1fa9f5d449b3d46 Dec 16 06:57:47 crc kubenswrapper[4823]: I1216 06:57:47.072511 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-bx552" Dec 16 06:57:47 crc kubenswrapper[4823]: I1216 06:57:47.072872 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-bx552" Dec 16 06:57:47 crc kubenswrapper[4823]: I1216 06:57:47.074392 4823 patch_prober.go:28] interesting pod/console-f9d7485db-bx552 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.42:8443/health\": dial tcp 10.217.0.42:8443: connect: connection refused" start-of-body= Dec 16 06:57:47 crc kubenswrapper[4823]: I1216 06:57:47.074495 4823 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-bx552" podUID="e1c6d0f7-5a86-49fb-870d-991796812348" 
containerName="console" probeResult="failure" output="Get \"https://10.217.0.42:8443/health\": dial tcp 10.217.0.42:8443: connect: connection refused" Dec 16 06:57:47 crc kubenswrapper[4823]: I1216 06:57:47.114090 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"fd73cf97-a298-4436-8568-ea2cbd08c6f6","Type":"ContainerStarted","Data":"051c1ddb7fdaf2c6a279a13e6b21834a6f11bbee32422b2bf1fa9f5d449b3d46"} Dec 16 06:57:47 crc kubenswrapper[4823]: I1216 06:57:47.121962 4823 generic.go:334] "Generic (PLEG): container finished" podID="726c1e86-0af2-45b0-bc89-af72df38eff8" containerID="3ec50673f7e7ef8e7eae49643e2ec546d3ef9485b6c7c61987a52e803118a429" exitCode=0 Dec 16 06:57:47 crc kubenswrapper[4823]: I1216 06:57:47.122068 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-shg64" event={"ID":"726c1e86-0af2-45b0-bc89-af72df38eff8","Type":"ContainerDied","Data":"3ec50673f7e7ef8e7eae49643e2ec546d3ef9485b6c7c61987a52e803118a429"} Dec 16 06:57:47 crc kubenswrapper[4823]: I1216 06:57:47.122101 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-shg64" event={"ID":"726c1e86-0af2-45b0-bc89-af72df38eff8","Type":"ContainerStarted","Data":"72e74b472ba819377be581c2983f0ab4ebb75afeec2d7810f57b32c397c3e2d4"} Dec 16 06:57:47 crc kubenswrapper[4823]: I1216 06:57:47.123985 4823 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 16 06:57:47 crc kubenswrapper[4823]: I1216 06:57:47.133818 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hhp8l" event={"ID":"ca6b042f-7b3a-4204-90a8-d6a2c29fd271","Type":"ContainerStarted","Data":"6470b7a85105c21d08d83f55a8f594c5024f0886ea1ebda8fa08330d9a52a595"} Dec 16 06:57:47 crc kubenswrapper[4823]: I1216 06:57:47.135814 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-5zgrf" event={"ID":"9e8470ac-325b-46ce-ac5d-6cbafc3c6164","Type":"ContainerStarted","Data":"4bc2d23b3e0977eadcd44ac5cc6ebbbd6914d1a2aa8b8fe248afc384ef3a8998"} Dec 16 06:57:47 crc kubenswrapper[4823]: I1216 06:57:47.145440 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c98pm" event={"ID":"28fa6f6b-657a-4fe7-993f-6c97d5e53b3c","Type":"ContainerStarted","Data":"32ee30f88508d04fa1a8fb55016cf0250c2057be86416af7b71b89bb593021f0"} Dec 16 06:57:47 crc kubenswrapper[4823]: I1216 06:57:47.155821 4823 generic.go:334] "Generic (PLEG): container finished" podID="b6b1d27b-235a-4b1e-adaa-512f3ae25954" containerID="0f59ba2538eb734d4a4b11eddc447e57ec3828b2088c0ac2a4cc3536f3e69b67" exitCode=0 Dec 16 06:57:47 crc kubenswrapper[4823]: I1216 06:57:47.155881 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431125-j4w8x" event={"ID":"b6b1d27b-235a-4b1e-adaa-512f3ae25954","Type":"ContainerDied","Data":"0f59ba2538eb734d4a4b11eddc447e57ec3828b2088c0ac2a4cc3536f3e69b67"} Dec 16 06:57:47 crc kubenswrapper[4823]: I1216 06:57:47.158959 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-rmx2d" event={"ID":"0a48b03b-402f-48b1-a3b7-52690850de42","Type":"ContainerStarted","Data":"83a366cf745862e6cccd999622a3fc3bfdc1e8d365d834c238e2aeb8c9f6df71"} Dec 16 06:57:47 crc kubenswrapper[4823]: I1216 06:57:47.158992 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-rmx2d" event={"ID":"0a48b03b-402f-48b1-a3b7-52690850de42","Type":"ContainerStarted","Data":"ccfba756a351719dde65ce1613b1a618a93494ddad1f764acf048b5f5aafbe29"} Dec 16 06:57:47 crc kubenswrapper[4823]: I1216 06:57:47.175425 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5rs6p" Dec 16 
06:57:47 crc kubenswrapper[4823]: I1216 06:57:47.215418 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-rmx2d" podStartSLOduration=128.215397539 podStartE2EDuration="2m8.215397539s" podCreationTimestamp="2025-12-16 06:55:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:57:47.214574581 +0000 UTC m=+145.703140724" watchObservedRunningTime="2025-12-16 06:57:47.215397539 +0000 UTC m=+145.703963662" Dec 16 06:57:47 crc kubenswrapper[4823]: I1216 06:57:47.406204 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-lpkx6" Dec 16 06:57:47 crc kubenswrapper[4823]: I1216 06:57:47.421473 4823 patch_prober.go:28] interesting pod/router-default-5444994796-lpkx6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 16 06:57:47 crc kubenswrapper[4823]: [-]has-synced failed: reason withheld Dec 16 06:57:47 crc kubenswrapper[4823]: [+]process-running ok Dec 16 06:57:47 crc kubenswrapper[4823]: healthz check failed Dec 16 06:57:47 crc kubenswrapper[4823]: I1216 06:57:47.421555 4823 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lpkx6" podUID="843619ff-c0f4-4389-b3ef-62c282b20303" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 16 06:57:47 crc kubenswrapper[4823]: I1216 06:57:47.552390 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-5x67m"] Dec 16 06:57:47 crc kubenswrapper[4823]: I1216 06:57:47.553954 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5x67m" Dec 16 06:57:47 crc kubenswrapper[4823]: I1216 06:57:47.557036 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 16 06:57:47 crc kubenswrapper[4823]: I1216 06:57:47.567176 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5x67m"] Dec 16 06:57:47 crc kubenswrapper[4823]: I1216 06:57:47.608964 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64ggh\" (UniqueName: \"kubernetes.io/projected/7220fce6-80f1-4da7-9a90-f106616709ae-kube-api-access-64ggh\") pod \"redhat-marketplace-5x67m\" (UID: \"7220fce6-80f1-4da7-9a90-f106616709ae\") " pod="openshift-marketplace/redhat-marketplace-5x67m" Dec 16 06:57:47 crc kubenswrapper[4823]: I1216 06:57:47.609044 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7220fce6-80f1-4da7-9a90-f106616709ae-utilities\") pod \"redhat-marketplace-5x67m\" (UID: \"7220fce6-80f1-4da7-9a90-f106616709ae\") " pod="openshift-marketplace/redhat-marketplace-5x67m" Dec 16 06:57:47 crc kubenswrapper[4823]: I1216 06:57:47.609282 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7220fce6-80f1-4da7-9a90-f106616709ae-catalog-content\") pod \"redhat-marketplace-5x67m\" (UID: \"7220fce6-80f1-4da7-9a90-f106616709ae\") " pod="openshift-marketplace/redhat-marketplace-5x67m" Dec 16 06:57:47 crc kubenswrapper[4823]: I1216 06:57:47.710619 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7220fce6-80f1-4da7-9a90-f106616709ae-catalog-content\") pod \"redhat-marketplace-5x67m\" (UID: 
\"7220fce6-80f1-4da7-9a90-f106616709ae\") " pod="openshift-marketplace/redhat-marketplace-5x67m" Dec 16 06:57:47 crc kubenswrapper[4823]: I1216 06:57:47.710704 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:57:47 crc kubenswrapper[4823]: I1216 06:57:47.710749 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:57:47 crc kubenswrapper[4823]: I1216 06:57:47.710791 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:57:47 crc kubenswrapper[4823]: I1216 06:57:47.710812 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-64ggh\" (UniqueName: \"kubernetes.io/projected/7220fce6-80f1-4da7-9a90-f106616709ae-kube-api-access-64ggh\") pod \"redhat-marketplace-5x67m\" (UID: \"7220fce6-80f1-4da7-9a90-f106616709ae\") " pod="openshift-marketplace/redhat-marketplace-5x67m" Dec 16 06:57:47 crc kubenswrapper[4823]: I1216 06:57:47.710833 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/7220fce6-80f1-4da7-9a90-f106616709ae-utilities\") pod \"redhat-marketplace-5x67m\" (UID: \"7220fce6-80f1-4da7-9a90-f106616709ae\") " pod="openshift-marketplace/redhat-marketplace-5x67m" Dec 16 06:57:47 crc kubenswrapper[4823]: I1216 06:57:47.710852 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:57:47 crc kubenswrapper[4823]: I1216 06:57:47.712967 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7220fce6-80f1-4da7-9a90-f106616709ae-utilities\") pod \"redhat-marketplace-5x67m\" (UID: \"7220fce6-80f1-4da7-9a90-f106616709ae\") " pod="openshift-marketplace/redhat-marketplace-5x67m" Dec 16 06:57:47 crc kubenswrapper[4823]: I1216 06:57:47.713231 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7220fce6-80f1-4da7-9a90-f106616709ae-catalog-content\") pod \"redhat-marketplace-5x67m\" (UID: \"7220fce6-80f1-4da7-9a90-f106616709ae\") " pod="openshift-marketplace/redhat-marketplace-5x67m" Dec 16 06:57:47 crc kubenswrapper[4823]: I1216 06:57:47.713281 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:57:47 crc kubenswrapper[4823]: I1216 06:57:47.722348 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:57:47 crc kubenswrapper[4823]: I1216 06:57:47.728570 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:57:47 crc kubenswrapper[4823]: I1216 06:57:47.729099 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:57:47 crc kubenswrapper[4823]: I1216 06:57:47.738576 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-64ggh\" (UniqueName: \"kubernetes.io/projected/7220fce6-80f1-4da7-9a90-f106616709ae-kube-api-access-64ggh\") pod \"redhat-marketplace-5x67m\" (UID: \"7220fce6-80f1-4da7-9a90-f106616709ae\") " pod="openshift-marketplace/redhat-marketplace-5x67m" Dec 16 06:57:47 crc kubenswrapper[4823]: I1216 06:57:47.877012 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5x67m" Dec 16 06:57:47 crc kubenswrapper[4823]: I1216 06:57:47.885894 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:57:47 crc kubenswrapper[4823]: I1216 06:57:47.896213 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:57:47 crc kubenswrapper[4823]: I1216 06:57:47.903830 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:57:47 crc kubenswrapper[4823]: I1216 06:57:47.960501 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-vdctr"] Dec 16 06:57:47 crc kubenswrapper[4823]: I1216 06:57:47.961649 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vdctr" Dec 16 06:57:47 crc kubenswrapper[4823]: I1216 06:57:47.974138 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vdctr"] Dec 16 06:57:48 crc kubenswrapper[4823]: I1216 06:57:48.021915 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af944c3a-0424-490f-b445-b2ee72a3af0c-catalog-content\") pod \"redhat-marketplace-vdctr\" (UID: \"af944c3a-0424-490f-b445-b2ee72a3af0c\") " pod="openshift-marketplace/redhat-marketplace-vdctr" Dec 16 06:57:48 crc kubenswrapper[4823]: I1216 06:57:48.022490 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af944c3a-0424-490f-b445-b2ee72a3af0c-utilities\") pod \"redhat-marketplace-vdctr\" (UID: \"af944c3a-0424-490f-b445-b2ee72a3af0c\") " pod="openshift-marketplace/redhat-marketplace-vdctr" Dec 16 06:57:48 crc kubenswrapper[4823]: I1216 06:57:48.022558 4823 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnrv2\" (UniqueName: \"kubernetes.io/projected/af944c3a-0424-490f-b445-b2ee72a3af0c-kube-api-access-qnrv2\") pod \"redhat-marketplace-vdctr\" (UID: \"af944c3a-0424-490f-b445-b2ee72a3af0c\") " pod="openshift-marketplace/redhat-marketplace-vdctr" Dec 16 06:57:48 crc kubenswrapper[4823]: I1216 06:57:48.124052 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af944c3a-0424-490f-b445-b2ee72a3af0c-catalog-content\") pod \"redhat-marketplace-vdctr\" (UID: \"af944c3a-0424-490f-b445-b2ee72a3af0c\") " pod="openshift-marketplace/redhat-marketplace-vdctr" Dec 16 06:57:48 crc kubenswrapper[4823]: I1216 06:57:48.124125 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af944c3a-0424-490f-b445-b2ee72a3af0c-utilities\") pod \"redhat-marketplace-vdctr\" (UID: \"af944c3a-0424-490f-b445-b2ee72a3af0c\") " pod="openshift-marketplace/redhat-marketplace-vdctr" Dec 16 06:57:48 crc kubenswrapper[4823]: I1216 06:57:48.124180 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qnrv2\" (UniqueName: \"kubernetes.io/projected/af944c3a-0424-490f-b445-b2ee72a3af0c-kube-api-access-qnrv2\") pod \"redhat-marketplace-vdctr\" (UID: \"af944c3a-0424-490f-b445-b2ee72a3af0c\") " pod="openshift-marketplace/redhat-marketplace-vdctr" Dec 16 06:57:48 crc kubenswrapper[4823]: I1216 06:57:48.125872 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af944c3a-0424-490f-b445-b2ee72a3af0c-catalog-content\") pod \"redhat-marketplace-vdctr\" (UID: \"af944c3a-0424-490f-b445-b2ee72a3af0c\") " pod="openshift-marketplace/redhat-marketplace-vdctr" Dec 16 06:57:48 crc kubenswrapper[4823]: I1216 06:57:48.126188 4823 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af944c3a-0424-490f-b445-b2ee72a3af0c-utilities\") pod \"redhat-marketplace-vdctr\" (UID: \"af944c3a-0424-490f-b445-b2ee72a3af0c\") " pod="openshift-marketplace/redhat-marketplace-vdctr" Dec 16 06:57:48 crc kubenswrapper[4823]: I1216 06:57:48.148980 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnrv2\" (UniqueName: \"kubernetes.io/projected/af944c3a-0424-490f-b445-b2ee72a3af0c-kube-api-access-qnrv2\") pod \"redhat-marketplace-vdctr\" (UID: \"af944c3a-0424-490f-b445-b2ee72a3af0c\") " pod="openshift-marketplace/redhat-marketplace-vdctr" Dec 16 06:57:48 crc kubenswrapper[4823]: I1216 06:57:48.183382 4823 generic.go:334] "Generic (PLEG): container finished" podID="ca6b042f-7b3a-4204-90a8-d6a2c29fd271" containerID="71d023451ad9f232f51c8bea4b92138bd63ce3588c3f86c74475eb36b93a867d" exitCode=0 Dec 16 06:57:48 crc kubenswrapper[4823]: I1216 06:57:48.183458 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hhp8l" event={"ID":"ca6b042f-7b3a-4204-90a8-d6a2c29fd271","Type":"ContainerDied","Data":"71d023451ad9f232f51c8bea4b92138bd63ce3588c3f86c74475eb36b93a867d"} Dec 16 06:57:48 crc kubenswrapper[4823]: I1216 06:57:48.188107 4823 generic.go:334] "Generic (PLEG): container finished" podID="9e8470ac-325b-46ce-ac5d-6cbafc3c6164" containerID="3dad0a00dbe2891d542c2a68d616b2402c380a18e5be93fdfc977e5da88fdff5" exitCode=0 Dec 16 06:57:48 crc kubenswrapper[4823]: I1216 06:57:48.188185 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5zgrf" event={"ID":"9e8470ac-325b-46ce-ac5d-6cbafc3c6164","Type":"ContainerDied","Data":"3dad0a00dbe2891d542c2a68d616b2402c380a18e5be93fdfc977e5da88fdff5"} Dec 16 06:57:48 crc kubenswrapper[4823]: I1216 06:57:48.198674 4823 generic.go:334] "Generic (PLEG): container finished" podID="28fa6f6b-657a-4fe7-993f-6c97d5e53b3c" 
containerID="2a74898e74c7e810093cab0f1d311901b1a1948664709995aa7195175bc61a8b" exitCode=0 Dec 16 06:57:48 crc kubenswrapper[4823]: I1216 06:57:48.198773 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c98pm" event={"ID":"28fa6f6b-657a-4fe7-993f-6c97d5e53b3c","Type":"ContainerDied","Data":"2a74898e74c7e810093cab0f1d311901b1a1948664709995aa7195175bc61a8b"} Dec 16 06:57:48 crc kubenswrapper[4823]: I1216 06:57:48.204114 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"fd73cf97-a298-4436-8568-ea2cbd08c6f6","Type":"ContainerStarted","Data":"324ae8baa96ae35ff968950d66a9cd130527eb45d418d8c1f0934763fd3a89f3"} Dec 16 06:57:48 crc kubenswrapper[4823]: I1216 06:57:48.205159 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-rmx2d" Dec 16 06:57:48 crc kubenswrapper[4823]: I1216 06:57:48.229667 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5x67m"] Dec 16 06:57:48 crc kubenswrapper[4823]: I1216 06:57:48.236376 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" Dec 16 06:57:48 crc kubenswrapper[4823]: I1216 06:57:48.299123 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vdctr" Dec 16 06:57:48 crc kubenswrapper[4823]: I1216 06:57:48.302751 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=3.302703692 podStartE2EDuration="3.302703692s" podCreationTimestamp="2025-12-16 06:57:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:57:48.29910977 +0000 UTC m=+146.787675893" watchObservedRunningTime="2025-12-16 06:57:48.302703692 +0000 UTC m=+146.791269815" Dec 16 06:57:48 crc kubenswrapper[4823]: I1216 06:57:48.419631 4823 patch_prober.go:28] interesting pod/router-default-5444994796-lpkx6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 16 06:57:48 crc kubenswrapper[4823]: [-]has-synced failed: reason withheld Dec 16 06:57:48 crc kubenswrapper[4823]: [+]process-running ok Dec 16 06:57:48 crc kubenswrapper[4823]: healthz check failed Dec 16 06:57:48 crc kubenswrapper[4823]: I1216 06:57:48.419726 4823 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lpkx6" podUID="843619ff-c0f4-4389-b3ef-62c282b20303" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 16 06:57:48 crc kubenswrapper[4823]: I1216 06:57:48.561913 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-2jtdw"] Dec 16 06:57:48 crc kubenswrapper[4823]: I1216 06:57:48.563682 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2jtdw" Dec 16 06:57:48 crc kubenswrapper[4823]: I1216 06:57:48.567310 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 16 06:57:48 crc kubenswrapper[4823]: I1216 06:57:48.585330 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2jtdw"] Dec 16 06:57:48 crc kubenswrapper[4823]: I1216 06:57:48.634321 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431125-j4w8x" Dec 16 06:57:48 crc kubenswrapper[4823]: I1216 06:57:48.673475 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6mmc\" (UniqueName: \"kubernetes.io/projected/dfaef15c-ea70-4287-bf78-7e99f0fd0626-kube-api-access-b6mmc\") pod \"redhat-operators-2jtdw\" (UID: \"dfaef15c-ea70-4287-bf78-7e99f0fd0626\") " pod="openshift-marketplace/redhat-operators-2jtdw" Dec 16 06:57:48 crc kubenswrapper[4823]: I1216 06:57:48.673558 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dfaef15c-ea70-4287-bf78-7e99f0fd0626-utilities\") pod \"redhat-operators-2jtdw\" (UID: \"dfaef15c-ea70-4287-bf78-7e99f0fd0626\") " pod="openshift-marketplace/redhat-operators-2jtdw" Dec 16 06:57:48 crc kubenswrapper[4823]: I1216 06:57:48.673606 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dfaef15c-ea70-4287-bf78-7e99f0fd0626-catalog-content\") pod \"redhat-operators-2jtdw\" (UID: \"dfaef15c-ea70-4287-bf78-7e99f0fd0626\") " pod="openshift-marketplace/redhat-operators-2jtdw" Dec 16 06:57:48 crc kubenswrapper[4823]: I1216 06:57:48.752125 4823 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-marketplace/redhat-marketplace-vdctr"] Dec 16 06:57:48 crc kubenswrapper[4823]: I1216 06:57:48.774307 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kttpx\" (UniqueName: \"kubernetes.io/projected/b6b1d27b-235a-4b1e-adaa-512f3ae25954-kube-api-access-kttpx\") pod \"b6b1d27b-235a-4b1e-adaa-512f3ae25954\" (UID: \"b6b1d27b-235a-4b1e-adaa-512f3ae25954\") " Dec 16 06:57:48 crc kubenswrapper[4823]: I1216 06:57:48.774377 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b6b1d27b-235a-4b1e-adaa-512f3ae25954-secret-volume\") pod \"b6b1d27b-235a-4b1e-adaa-512f3ae25954\" (UID: \"b6b1d27b-235a-4b1e-adaa-512f3ae25954\") " Dec 16 06:57:48 crc kubenswrapper[4823]: I1216 06:57:48.774612 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b6b1d27b-235a-4b1e-adaa-512f3ae25954-config-volume\") pod \"b6b1d27b-235a-4b1e-adaa-512f3ae25954\" (UID: \"b6b1d27b-235a-4b1e-adaa-512f3ae25954\") " Dec 16 06:57:48 crc kubenswrapper[4823]: I1216 06:57:48.774867 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b6mmc\" (UniqueName: \"kubernetes.io/projected/dfaef15c-ea70-4287-bf78-7e99f0fd0626-kube-api-access-b6mmc\") pod \"redhat-operators-2jtdw\" (UID: \"dfaef15c-ea70-4287-bf78-7e99f0fd0626\") " pod="openshift-marketplace/redhat-operators-2jtdw" Dec 16 06:57:48 crc kubenswrapper[4823]: I1216 06:57:48.774903 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dfaef15c-ea70-4287-bf78-7e99f0fd0626-utilities\") pod \"redhat-operators-2jtdw\" (UID: \"dfaef15c-ea70-4287-bf78-7e99f0fd0626\") " pod="openshift-marketplace/redhat-operators-2jtdw" Dec 16 06:57:48 crc kubenswrapper[4823]: I1216 06:57:48.774930 4823 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dfaef15c-ea70-4287-bf78-7e99f0fd0626-catalog-content\") pod \"redhat-operators-2jtdw\" (UID: \"dfaef15c-ea70-4287-bf78-7e99f0fd0626\") " pod="openshift-marketplace/redhat-operators-2jtdw" Dec 16 06:57:48 crc kubenswrapper[4823]: I1216 06:57:48.775505 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dfaef15c-ea70-4287-bf78-7e99f0fd0626-catalog-content\") pod \"redhat-operators-2jtdw\" (UID: \"dfaef15c-ea70-4287-bf78-7e99f0fd0626\") " pod="openshift-marketplace/redhat-operators-2jtdw" Dec 16 06:57:48 crc kubenswrapper[4823]: I1216 06:57:48.778122 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6b1d27b-235a-4b1e-adaa-512f3ae25954-config-volume" (OuterVolumeSpecName: "config-volume") pod "b6b1d27b-235a-4b1e-adaa-512f3ae25954" (UID: "b6b1d27b-235a-4b1e-adaa-512f3ae25954"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:57:48 crc kubenswrapper[4823]: I1216 06:57:48.778638 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dfaef15c-ea70-4287-bf78-7e99f0fd0626-utilities\") pod \"redhat-operators-2jtdw\" (UID: \"dfaef15c-ea70-4287-bf78-7e99f0fd0626\") " pod="openshift-marketplace/redhat-operators-2jtdw" Dec 16 06:57:48 crc kubenswrapper[4823]: I1216 06:57:48.785422 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6b1d27b-235a-4b1e-adaa-512f3ae25954-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "b6b1d27b-235a-4b1e-adaa-512f3ae25954" (UID: "b6b1d27b-235a-4b1e-adaa-512f3ae25954"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 06:57:48 crc kubenswrapper[4823]: I1216 06:57:48.785522 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6b1d27b-235a-4b1e-adaa-512f3ae25954-kube-api-access-kttpx" (OuterVolumeSpecName: "kube-api-access-kttpx") pod "b6b1d27b-235a-4b1e-adaa-512f3ae25954" (UID: "b6b1d27b-235a-4b1e-adaa-512f3ae25954"). InnerVolumeSpecName "kube-api-access-kttpx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:57:48 crc kubenswrapper[4823]: I1216 06:57:48.798533 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6mmc\" (UniqueName: \"kubernetes.io/projected/dfaef15c-ea70-4287-bf78-7e99f0fd0626-kube-api-access-b6mmc\") pod \"redhat-operators-2jtdw\" (UID: \"dfaef15c-ea70-4287-bf78-7e99f0fd0626\") " pod="openshift-marketplace/redhat-operators-2jtdw" Dec 16 06:57:48 crc kubenswrapper[4823]: W1216 06:57:48.799664 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf944c3a_0424_490f_b445_b2ee72a3af0c.slice/crio-d23649003adb8d2768086f8360b07f8324e52d201d0d9bcff7f95c1ea98f3ac3 WatchSource:0}: Error finding container d23649003adb8d2768086f8360b07f8324e52d201d0d9bcff7f95c1ea98f3ac3: Status 404 returned error can't find the container with id d23649003adb8d2768086f8360b07f8324e52d201d0d9bcff7f95c1ea98f3ac3 Dec 16 06:57:48 crc kubenswrapper[4823]: I1216 06:57:48.876848 4823 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b6b1d27b-235a-4b1e-adaa-512f3ae25954-config-volume\") on node \"crc\" DevicePath \"\"" Dec 16 06:57:48 crc kubenswrapper[4823]: I1216 06:57:48.877299 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kttpx\" (UniqueName: \"kubernetes.io/projected/b6b1d27b-235a-4b1e-adaa-512f3ae25954-kube-api-access-kttpx\") on node \"crc\" DevicePath \"\"" 
Dec 16 06:57:48 crc kubenswrapper[4823]: I1216 06:57:48.877318 4823 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b6b1d27b-235a-4b1e-adaa-512f3ae25954-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 16 06:57:48 crc kubenswrapper[4823]: I1216 06:57:48.890842 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2jtdw" Dec 16 06:57:48 crc kubenswrapper[4823]: I1216 06:57:48.954902 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-tls89"] Dec 16 06:57:48 crc kubenswrapper[4823]: E1216 06:57:48.955186 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6b1d27b-235a-4b1e-adaa-512f3ae25954" containerName="collect-profiles" Dec 16 06:57:48 crc kubenswrapper[4823]: I1216 06:57:48.955203 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6b1d27b-235a-4b1e-adaa-512f3ae25954" containerName="collect-profiles" Dec 16 06:57:48 crc kubenswrapper[4823]: I1216 06:57:48.955306 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6b1d27b-235a-4b1e-adaa-512f3ae25954" containerName="collect-profiles" Dec 16 06:57:48 crc kubenswrapper[4823]: I1216 06:57:48.956153 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tls89" Dec 16 06:57:48 crc kubenswrapper[4823]: I1216 06:57:48.967537 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tls89"] Dec 16 06:57:49 crc kubenswrapper[4823]: I1216 06:57:49.081358 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9n9fs\" (UniqueName: \"kubernetes.io/projected/f658a770-db01-4cb4-8d83-e6dc10513860-kube-api-access-9n9fs\") pod \"redhat-operators-tls89\" (UID: \"f658a770-db01-4cb4-8d83-e6dc10513860\") " pod="openshift-marketplace/redhat-operators-tls89" Dec 16 06:57:49 crc kubenswrapper[4823]: I1216 06:57:49.081465 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f658a770-db01-4cb4-8d83-e6dc10513860-utilities\") pod \"redhat-operators-tls89\" (UID: \"f658a770-db01-4cb4-8d83-e6dc10513860\") " pod="openshift-marketplace/redhat-operators-tls89" Dec 16 06:57:49 crc kubenswrapper[4823]: I1216 06:57:49.081498 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f658a770-db01-4cb4-8d83-e6dc10513860-catalog-content\") pod \"redhat-operators-tls89\" (UID: \"f658a770-db01-4cb4-8d83-e6dc10513860\") " pod="openshift-marketplace/redhat-operators-tls89" Dec 16 06:57:49 crc kubenswrapper[4823]: I1216 06:57:49.182937 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f658a770-db01-4cb4-8d83-e6dc10513860-utilities\") pod \"redhat-operators-tls89\" (UID: \"f658a770-db01-4cb4-8d83-e6dc10513860\") " pod="openshift-marketplace/redhat-operators-tls89" Dec 16 06:57:49 crc kubenswrapper[4823]: I1216 06:57:49.183006 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f658a770-db01-4cb4-8d83-e6dc10513860-catalog-content\") pod \"redhat-operators-tls89\" (UID: \"f658a770-db01-4cb4-8d83-e6dc10513860\") " pod="openshift-marketplace/redhat-operators-tls89" Dec 16 06:57:49 crc kubenswrapper[4823]: I1216 06:57:49.183071 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9n9fs\" (UniqueName: \"kubernetes.io/projected/f658a770-db01-4cb4-8d83-e6dc10513860-kube-api-access-9n9fs\") pod \"redhat-operators-tls89\" (UID: \"f658a770-db01-4cb4-8d83-e6dc10513860\") " pod="openshift-marketplace/redhat-operators-tls89" Dec 16 06:57:49 crc kubenswrapper[4823]: I1216 06:57:49.186488 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f658a770-db01-4cb4-8d83-e6dc10513860-catalog-content\") pod \"redhat-operators-tls89\" (UID: \"f658a770-db01-4cb4-8d83-e6dc10513860\") " pod="openshift-marketplace/redhat-operators-tls89" Dec 16 06:57:49 crc kubenswrapper[4823]: I1216 06:57:49.186526 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f658a770-db01-4cb4-8d83-e6dc10513860-utilities\") pod \"redhat-operators-tls89\" (UID: \"f658a770-db01-4cb4-8d83-e6dc10513860\") " pod="openshift-marketplace/redhat-operators-tls89" Dec 16 06:57:49 crc kubenswrapper[4823]: I1216 06:57:49.217273 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9n9fs\" (UniqueName: \"kubernetes.io/projected/f658a770-db01-4cb4-8d83-e6dc10513860-kube-api-access-9n9fs\") pod \"redhat-operators-tls89\" (UID: \"f658a770-db01-4cb4-8d83-e6dc10513860\") " pod="openshift-marketplace/redhat-operators-tls89" Dec 16 06:57:49 crc kubenswrapper[4823]: I1216 06:57:49.241933 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" 
event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"11e4f43deeb8b14bf7c9b00dd467c367934dad0aae1981ca51456ccacd723e04"} Dec 16 06:57:49 crc kubenswrapper[4823]: I1216 06:57:49.242464 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"0075b76a2e9ffa9ab46a8bd974431cbcc2a1f52579d11195cf47ab96a5e1f02e"} Dec 16 06:57:49 crc kubenswrapper[4823]: I1216 06:57:49.246723 4823 generic.go:334] "Generic (PLEG): container finished" podID="af944c3a-0424-490f-b445-b2ee72a3af0c" containerID="cdd3d4b6177b6d1fc58f7786a38e95541617a2ae1c29265bce4c69df97cf2d25" exitCode=0 Dec 16 06:57:49 crc kubenswrapper[4823]: I1216 06:57:49.247300 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vdctr" event={"ID":"af944c3a-0424-490f-b445-b2ee72a3af0c","Type":"ContainerDied","Data":"cdd3d4b6177b6d1fc58f7786a38e95541617a2ae1c29265bce4c69df97cf2d25"} Dec 16 06:57:49 crc kubenswrapper[4823]: I1216 06:57:49.247372 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vdctr" event={"ID":"af944c3a-0424-490f-b445-b2ee72a3af0c","Type":"ContainerStarted","Data":"d23649003adb8d2768086f8360b07f8324e52d201d0d9bcff7f95c1ea98f3ac3"} Dec 16 06:57:49 crc kubenswrapper[4823]: I1216 06:57:49.255864 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431125-j4w8x" event={"ID":"b6b1d27b-235a-4b1e-adaa-512f3ae25954","Type":"ContainerDied","Data":"4a8ec26b08c3b8c285f3f70aa106e1f078903dbf59cafd7a4233de40cde93cfd"} Dec 16 06:57:49 crc kubenswrapper[4823]: I1216 06:57:49.255910 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4a8ec26b08c3b8c285f3f70aa106e1f078903dbf59cafd7a4233de40cde93cfd" Dec 16 06:57:49 crc kubenswrapper[4823]: 
I1216 06:57:49.255920 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431125-j4w8x" Dec 16 06:57:49 crc kubenswrapper[4823]: I1216 06:57:49.266705 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"d0bc6c131d0629ffab5b56159ca3fd44f82d56323b4d48883fd23cbcfdaec51a"} Dec 16 06:57:49 crc kubenswrapper[4823]: I1216 06:57:49.266872 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"7f89e747327bab2b71fe65ca9602f3953ea01f3e8fdeb7f5ae4b32749fa9deeb"} Dec 16 06:57:49 crc kubenswrapper[4823]: I1216 06:57:49.270964 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:57:49 crc kubenswrapper[4823]: I1216 06:57:49.273067 4823 generic.go:334] "Generic (PLEG): container finished" podID="fd73cf97-a298-4436-8568-ea2cbd08c6f6" containerID="324ae8baa96ae35ff968950d66a9cd130527eb45d418d8c1f0934763fd3a89f3" exitCode=0 Dec 16 06:57:49 crc kubenswrapper[4823]: I1216 06:57:49.273123 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"fd73cf97-a298-4436-8568-ea2cbd08c6f6","Type":"ContainerDied","Data":"324ae8baa96ae35ff968950d66a9cd130527eb45d418d8c1f0934763fd3a89f3"} Dec 16 06:57:49 crc kubenswrapper[4823]: I1216 06:57:49.278712 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"8a2635650a0856d8dfde14f8caadc84bfead23a6b832a9ab879ffece77809f61"} Dec 16 06:57:49 crc 
kubenswrapper[4823]: I1216 06:57:49.278827 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"1b782484500d2e194f4ef01a81901de47801976d6878d1acf0565c3f93e6288d"} Dec 16 06:57:49 crc kubenswrapper[4823]: I1216 06:57:49.290170 4823 generic.go:334] "Generic (PLEG): container finished" podID="7220fce6-80f1-4da7-9a90-f106616709ae" containerID="979af8dd52b3e813e13be8d2fb702c3658663c32c080557a3e993767ce88614d" exitCode=0 Dec 16 06:57:49 crc kubenswrapper[4823]: I1216 06:57:49.292574 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5x67m" event={"ID":"7220fce6-80f1-4da7-9a90-f106616709ae","Type":"ContainerDied","Data":"979af8dd52b3e813e13be8d2fb702c3658663c32c080557a3e993767ce88614d"} Dec 16 06:57:49 crc kubenswrapper[4823]: I1216 06:57:49.292649 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5x67m" event={"ID":"7220fce6-80f1-4da7-9a90-f106616709ae","Type":"ContainerStarted","Data":"c8f683aceaef60e0c777286f257a90f2218cd1d4782db8f98de8d422b6e377cf"} Dec 16 06:57:49 crc kubenswrapper[4823]: I1216 06:57:49.297446 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tls89" Dec 16 06:57:49 crc kubenswrapper[4823]: I1216 06:57:49.408494 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2jtdw"] Dec 16 06:57:49 crc kubenswrapper[4823]: I1216 06:57:49.419128 4823 patch_prober.go:28] interesting pod/router-default-5444994796-lpkx6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 16 06:57:49 crc kubenswrapper[4823]: [-]has-synced failed: reason withheld Dec 16 06:57:49 crc kubenswrapper[4823]: [+]process-running ok Dec 16 06:57:49 crc kubenswrapper[4823]: healthz check failed Dec 16 06:57:49 crc kubenswrapper[4823]: I1216 06:57:49.419535 4823 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lpkx6" podUID="843619ff-c0f4-4389-b3ef-62c282b20303" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 16 06:57:50 crc kubenswrapper[4823]: I1216 06:57:50.039546 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tls89"] Dec 16 06:57:50 crc kubenswrapper[4823]: I1216 06:57:50.313390 4823 generic.go:334] "Generic (PLEG): container finished" podID="dfaef15c-ea70-4287-bf78-7e99f0fd0626" containerID="71620843587c34b5bf2f91b1d6da411bf4275fedd8c4728f8e223f33d9f89d7b" exitCode=0 Dec 16 06:57:50 crc kubenswrapper[4823]: I1216 06:57:50.314586 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2jtdw" event={"ID":"dfaef15c-ea70-4287-bf78-7e99f0fd0626","Type":"ContainerDied","Data":"71620843587c34b5bf2f91b1d6da411bf4275fedd8c4728f8e223f33d9f89d7b"} Dec 16 06:57:50 crc kubenswrapper[4823]: I1216 06:57:50.314617 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2jtdw" 
event={"ID":"dfaef15c-ea70-4287-bf78-7e99f0fd0626","Type":"ContainerStarted","Data":"d568a9f1d174fc7d1f8cb49d0fe6abaafef83aa45f554f004b62d43bb5155cbd"} Dec 16 06:57:50 crc kubenswrapper[4823]: I1216 06:57:50.325045 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tls89" event={"ID":"f658a770-db01-4cb4-8d83-e6dc10513860","Type":"ContainerStarted","Data":"6836c95e4b4b6dd3dfe6800dc3876bf08648b3fbc717d4ba326de4e9e9877b6c"} Dec 16 06:57:50 crc kubenswrapper[4823]: I1216 06:57:50.406360 4823 patch_prober.go:28] interesting pod/router-default-5444994796-lpkx6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 16 06:57:50 crc kubenswrapper[4823]: [-]has-synced failed: reason withheld Dec 16 06:57:50 crc kubenswrapper[4823]: [+]process-running ok Dec 16 06:57:50 crc kubenswrapper[4823]: healthz check failed Dec 16 06:57:50 crc kubenswrapper[4823]: I1216 06:57:50.406432 4823 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lpkx6" podUID="843619ff-c0f4-4389-b3ef-62c282b20303" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 16 06:57:50 crc kubenswrapper[4823]: I1216 06:57:50.704134 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 16 06:57:50 crc kubenswrapper[4823]: I1216 06:57:50.845218 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fd73cf97-a298-4436-8568-ea2cbd08c6f6-kubelet-dir\") pod \"fd73cf97-a298-4436-8568-ea2cbd08c6f6\" (UID: \"fd73cf97-a298-4436-8568-ea2cbd08c6f6\") " Dec 16 06:57:50 crc kubenswrapper[4823]: I1216 06:57:50.845304 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fd73cf97-a298-4436-8568-ea2cbd08c6f6-kube-api-access\") pod \"fd73cf97-a298-4436-8568-ea2cbd08c6f6\" (UID: \"fd73cf97-a298-4436-8568-ea2cbd08c6f6\") " Dec 16 06:57:50 crc kubenswrapper[4823]: I1216 06:57:50.847003 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fd73cf97-a298-4436-8568-ea2cbd08c6f6-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "fd73cf97-a298-4436-8568-ea2cbd08c6f6" (UID: "fd73cf97-a298-4436-8568-ea2cbd08c6f6"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 06:57:50 crc kubenswrapper[4823]: I1216 06:57:50.858517 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd73cf97-a298-4436-8568-ea2cbd08c6f6-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "fd73cf97-a298-4436-8568-ea2cbd08c6f6" (UID: "fd73cf97-a298-4436-8568-ea2cbd08c6f6"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:57:50 crc kubenswrapper[4823]: I1216 06:57:50.949309 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fd73cf97-a298-4436-8568-ea2cbd08c6f6-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 16 06:57:50 crc kubenswrapper[4823]: I1216 06:57:50.949384 4823 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fd73cf97-a298-4436-8568-ea2cbd08c6f6-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 16 06:57:51 crc kubenswrapper[4823]: I1216 06:57:51.374836 4823 generic.go:334] "Generic (PLEG): container finished" podID="f658a770-db01-4cb4-8d83-e6dc10513860" containerID="05d0aa39fba1d1a8df245990a2dbde6985a8939f63c5225d847104d6171d7cec" exitCode=0 Dec 16 06:57:51 crc kubenswrapper[4823]: I1216 06:57:51.374921 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tls89" event={"ID":"f658a770-db01-4cb4-8d83-e6dc10513860","Type":"ContainerDied","Data":"05d0aa39fba1d1a8df245990a2dbde6985a8939f63c5225d847104d6171d7cec"} Dec 16 06:57:51 crc kubenswrapper[4823]: I1216 06:57:51.386075 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"fd73cf97-a298-4436-8568-ea2cbd08c6f6","Type":"ContainerDied","Data":"051c1ddb7fdaf2c6a279a13e6b21834a6f11bbee32422b2bf1fa9f5d449b3d46"} Dec 16 06:57:51 crc kubenswrapper[4823]: I1216 06:57:51.386130 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 16 06:57:51 crc kubenswrapper[4823]: I1216 06:57:51.386151 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="051c1ddb7fdaf2c6a279a13e6b21834a6f11bbee32422b2bf1fa9f5d449b3d46" Dec 16 06:57:51 crc kubenswrapper[4823]: I1216 06:57:51.406806 4823 patch_prober.go:28] interesting pod/router-default-5444994796-lpkx6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 16 06:57:51 crc kubenswrapper[4823]: [-]has-synced failed: reason withheld Dec 16 06:57:51 crc kubenswrapper[4823]: [+]process-running ok Dec 16 06:57:51 crc kubenswrapper[4823]: healthz check failed Dec 16 06:57:51 crc kubenswrapper[4823]: I1216 06:57:51.406882 4823 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lpkx6" podUID="843619ff-c0f4-4389-b3ef-62c282b20303" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 16 06:57:51 crc kubenswrapper[4823]: I1216 06:57:51.467148 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-s6mwx" Dec 16 06:57:51 crc kubenswrapper[4823]: I1216 06:57:51.479634 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-s6mwx" Dec 16 06:57:51 crc kubenswrapper[4823]: I1216 06:57:51.708216 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 16 06:57:51 crc kubenswrapper[4823]: E1216 06:57:51.708768 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd73cf97-a298-4436-8568-ea2cbd08c6f6" containerName="pruner" Dec 16 06:57:51 crc kubenswrapper[4823]: I1216 06:57:51.708782 4823 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="fd73cf97-a298-4436-8568-ea2cbd08c6f6" containerName="pruner" Dec 16 06:57:51 crc kubenswrapper[4823]: I1216 06:57:51.708892 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd73cf97-a298-4436-8568-ea2cbd08c6f6" containerName="pruner" Dec 16 06:57:51 crc kubenswrapper[4823]: I1216 06:57:51.713426 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 16 06:57:51 crc kubenswrapper[4823]: I1216 06:57:51.728453 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 16 06:57:51 crc kubenswrapper[4823]: I1216 06:57:51.729082 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 16 06:57:51 crc kubenswrapper[4823]: I1216 06:57:51.736249 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 16 06:57:51 crc kubenswrapper[4823]: I1216 06:57:51.767156 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b1234e73-c354-413c-a717-f53ee37431d7-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"b1234e73-c354-413c-a717-f53ee37431d7\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 16 06:57:51 crc kubenswrapper[4823]: I1216 06:57:51.767649 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b1234e73-c354-413c-a717-f53ee37431d7-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"b1234e73-c354-413c-a717-f53ee37431d7\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 16 06:57:51 crc kubenswrapper[4823]: I1216 06:57:51.869526 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/b1234e73-c354-413c-a717-f53ee37431d7-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"b1234e73-c354-413c-a717-f53ee37431d7\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 16 06:57:51 crc kubenswrapper[4823]: I1216 06:57:51.869593 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b1234e73-c354-413c-a717-f53ee37431d7-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"b1234e73-c354-413c-a717-f53ee37431d7\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 16 06:57:51 crc kubenswrapper[4823]: I1216 06:57:51.869696 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b1234e73-c354-413c-a717-f53ee37431d7-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"b1234e73-c354-413c-a717-f53ee37431d7\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 16 06:57:51 crc kubenswrapper[4823]: I1216 06:57:51.887911 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b1234e73-c354-413c-a717-f53ee37431d7-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"b1234e73-c354-413c-a717-f53ee37431d7\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 16 06:57:52 crc kubenswrapper[4823]: I1216 06:57:52.059874 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 16 06:57:52 crc kubenswrapper[4823]: I1216 06:57:52.396326 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-rmzm5" Dec 16 06:57:52 crc kubenswrapper[4823]: I1216 06:57:52.423321 4823 patch_prober.go:28] interesting pod/router-default-5444994796-lpkx6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 16 06:57:52 crc kubenswrapper[4823]: [-]has-synced failed: reason withheld Dec 16 06:57:52 crc kubenswrapper[4823]: [+]process-running ok Dec 16 06:57:52 crc kubenswrapper[4823]: healthz check failed Dec 16 06:57:52 crc kubenswrapper[4823]: I1216 06:57:52.423428 4823 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lpkx6" podUID="843619ff-c0f4-4389-b3ef-62c282b20303" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 16 06:57:52 crc kubenswrapper[4823]: I1216 06:57:52.647655 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 16 06:57:53 crc kubenswrapper[4823]: I1216 06:57:53.405492 4823 patch_prober.go:28] interesting pod/router-default-5444994796-lpkx6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 16 06:57:53 crc kubenswrapper[4823]: [-]has-synced failed: reason withheld Dec 16 06:57:53 crc kubenswrapper[4823]: [+]process-running ok Dec 16 06:57:53 crc kubenswrapper[4823]: healthz check failed Dec 16 06:57:53 crc kubenswrapper[4823]: I1216 06:57:53.405960 4823 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lpkx6" podUID="843619ff-c0f4-4389-b3ef-62c282b20303" 
containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 16 06:57:53 crc kubenswrapper[4823]: I1216 06:57:53.443384 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"b1234e73-c354-413c-a717-f53ee37431d7","Type":"ContainerStarted","Data":"1ef2de70968eb69a0dc7ccc1c5d87e2505404da17bef05336b00e58316d3731d"} Dec 16 06:57:54 crc kubenswrapper[4823]: I1216 06:57:54.410654 4823 patch_prober.go:28] interesting pod/router-default-5444994796-lpkx6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 16 06:57:54 crc kubenswrapper[4823]: [-]has-synced failed: reason withheld Dec 16 06:57:54 crc kubenswrapper[4823]: [+]process-running ok Dec 16 06:57:54 crc kubenswrapper[4823]: healthz check failed Dec 16 06:57:54 crc kubenswrapper[4823]: I1216 06:57:54.410762 4823 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lpkx6" podUID="843619ff-c0f4-4389-b3ef-62c282b20303" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 16 06:57:54 crc kubenswrapper[4823]: I1216 06:57:54.491272 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"b1234e73-c354-413c-a717-f53ee37431d7","Type":"ContainerStarted","Data":"c0a8f510e4f96bea1e8be1c59254820805e345b4b24aa159a5ad702769fc1481"} Dec 16 06:57:54 crc kubenswrapper[4823]: I1216 06:57:54.511344 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=3.511307872 podStartE2EDuration="3.511307872s" podCreationTimestamp="2025-12-16 06:57:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-12-16 06:57:54.508724595 +0000 UTC m=+152.997290718" watchObservedRunningTime="2025-12-16 06:57:54.511307872 +0000 UTC m=+152.999873995" Dec 16 06:57:55 crc kubenswrapper[4823]: I1216 06:57:55.405923 4823 patch_prober.go:28] interesting pod/router-default-5444994796-lpkx6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 16 06:57:55 crc kubenswrapper[4823]: [-]has-synced failed: reason withheld Dec 16 06:57:55 crc kubenswrapper[4823]: [+]process-running ok Dec 16 06:57:55 crc kubenswrapper[4823]: healthz check failed Dec 16 06:57:55 crc kubenswrapper[4823]: I1216 06:57:55.406457 4823 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lpkx6" podUID="843619ff-c0f4-4389-b3ef-62c282b20303" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 16 06:57:55 crc kubenswrapper[4823]: I1216 06:57:55.525584 4823 generic.go:334] "Generic (PLEG): container finished" podID="b1234e73-c354-413c-a717-f53ee37431d7" containerID="c0a8f510e4f96bea1e8be1c59254820805e345b4b24aa159a5ad702769fc1481" exitCode=0 Dec 16 06:57:55 crc kubenswrapper[4823]: I1216 06:57:55.525650 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"b1234e73-c354-413c-a717-f53ee37431d7","Type":"ContainerDied","Data":"c0a8f510e4f96bea1e8be1c59254820805e345b4b24aa159a5ad702769fc1481"} Dec 16 06:57:56 crc kubenswrapper[4823]: I1216 06:57:56.406211 4823 patch_prober.go:28] interesting pod/router-default-5444994796-lpkx6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 16 06:57:56 crc kubenswrapper[4823]: [-]has-synced failed: reason withheld Dec 16 06:57:56 crc 
kubenswrapper[4823]: [+]process-running ok Dec 16 06:57:56 crc kubenswrapper[4823]: healthz check failed Dec 16 06:57:56 crc kubenswrapper[4823]: I1216 06:57:56.406278 4823 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lpkx6" podUID="843619ff-c0f4-4389-b3ef-62c282b20303" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 16 06:57:56 crc kubenswrapper[4823]: I1216 06:57:56.683459 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-k2ljf" Dec 16 06:57:57 crc kubenswrapper[4823]: I1216 06:57:57.072966 4823 patch_prober.go:28] interesting pod/console-f9d7485db-bx552 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.42:8443/health\": dial tcp 10.217.0.42:8443: connect: connection refused" start-of-body= Dec 16 06:57:57 crc kubenswrapper[4823]: I1216 06:57:57.073319 4823 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-bx552" podUID="e1c6d0f7-5a86-49fb-870d-991796812348" containerName="console" probeResult="failure" output="Get \"https://10.217.0.42:8443/health\": dial tcp 10.217.0.42:8443: connect: connection refused" Dec 16 06:57:57 crc kubenswrapper[4823]: I1216 06:57:57.409069 4823 patch_prober.go:28] interesting pod/router-default-5444994796-lpkx6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 16 06:57:57 crc kubenswrapper[4823]: [+]has-synced ok Dec 16 06:57:57 crc kubenswrapper[4823]: [+]process-running ok Dec 16 06:57:57 crc kubenswrapper[4823]: healthz check failed Dec 16 06:57:57 crc kubenswrapper[4823]: I1216 06:57:57.409143 4823 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lpkx6" 
podUID="843619ff-c0f4-4389-b3ef-62c282b20303" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 16 06:57:58 crc kubenswrapper[4823]: I1216 06:57:58.134473 4823 patch_prober.go:28] interesting pod/machine-config-daemon-fv56f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 06:57:58 crc kubenswrapper[4823]: I1216 06:57:58.134543 4823 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 06:57:58 crc kubenswrapper[4823]: I1216 06:57:58.406484 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-lpkx6" Dec 16 06:57:58 crc kubenswrapper[4823]: I1216 06:57:58.408938 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-lpkx6" Dec 16 06:58:01 crc kubenswrapper[4823]: I1216 06:58:01.684092 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1e7dd738-a9b3-455c-93e0-3f0dc7327817-metrics-certs\") pod \"network-metrics-daemon-8mn7l\" (UID: \"1e7dd738-a9b3-455c-93e0-3f0dc7327817\") " pod="openshift-multus/network-metrics-daemon-8mn7l" Dec 16 06:58:01 crc kubenswrapper[4823]: I1216 06:58:01.695615 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1e7dd738-a9b3-455c-93e0-3f0dc7327817-metrics-certs\") pod \"network-metrics-daemon-8mn7l\" (UID: \"1e7dd738-a9b3-455c-93e0-3f0dc7327817\") " 
pod="openshift-multus/network-metrics-daemon-8mn7l" Dec 16 06:58:01 crc kubenswrapper[4823]: I1216 06:58:01.885090 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8mn7l" Dec 16 06:58:05 crc kubenswrapper[4823]: I1216 06:58:05.229770 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 16 06:58:05 crc kubenswrapper[4823]: I1216 06:58:05.351977 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b1234e73-c354-413c-a717-f53ee37431d7-kube-api-access\") pod \"b1234e73-c354-413c-a717-f53ee37431d7\" (UID: \"b1234e73-c354-413c-a717-f53ee37431d7\") " Dec 16 06:58:05 crc kubenswrapper[4823]: I1216 06:58:05.352358 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b1234e73-c354-413c-a717-f53ee37431d7-kubelet-dir\") pod \"b1234e73-c354-413c-a717-f53ee37431d7\" (UID: \"b1234e73-c354-413c-a717-f53ee37431d7\") " Dec 16 06:58:05 crc kubenswrapper[4823]: I1216 06:58:05.352525 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b1234e73-c354-413c-a717-f53ee37431d7-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "b1234e73-c354-413c-a717-f53ee37431d7" (UID: "b1234e73-c354-413c-a717-f53ee37431d7"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 06:58:05 crc kubenswrapper[4823]: I1216 06:58:05.353019 4823 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b1234e73-c354-413c-a717-f53ee37431d7-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 16 06:58:05 crc kubenswrapper[4823]: I1216 06:58:05.358644 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1234e73-c354-413c-a717-f53ee37431d7-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "b1234e73-c354-413c-a717-f53ee37431d7" (UID: "b1234e73-c354-413c-a717-f53ee37431d7"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:58:05 crc kubenswrapper[4823]: I1216 06:58:05.437324 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-rmx2d" Dec 16 06:58:05 crc kubenswrapper[4823]: I1216 06:58:05.456011 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b1234e73-c354-413c-a717-f53ee37431d7-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 16 06:58:05 crc kubenswrapper[4823]: I1216 06:58:05.650201 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"b1234e73-c354-413c-a717-f53ee37431d7","Type":"ContainerDied","Data":"1ef2de70968eb69a0dc7ccc1c5d87e2505404da17bef05336b00e58316d3731d"} Dec 16 06:58:05 crc kubenswrapper[4823]: I1216 06:58:05.650892 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1ef2de70968eb69a0dc7ccc1c5d87e2505404da17bef05336b00e58316d3731d" Dec 16 06:58:05 crc kubenswrapper[4823]: I1216 06:58:05.650300 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 16 06:58:07 crc kubenswrapper[4823]: I1216 06:58:07.442905 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-bx552" Dec 16 06:58:07 crc kubenswrapper[4823]: I1216 06:58:07.449391 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-bx552" Dec 16 06:58:16 crc kubenswrapper[4823]: I1216 06:58:16.781711 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5dbcj" Dec 16 06:58:21 crc kubenswrapper[4823]: E1216 06:58:21.130192 4823 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Dec 16 06:58:21 crc kubenswrapper[4823]: E1216 06:58:21.130691 4823 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-75jkw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-c98pm_openshift-marketplace(28fa6f6b-657a-4fe7-993f-6c97d5e53b3c): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 16 06:58:21 crc kubenswrapper[4823]: E1216 06:58:21.131958 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-c98pm" podUID="28fa6f6b-657a-4fe7-993f-6c97d5e53b3c" Dec 16 06:58:22 crc 
kubenswrapper[4823]: E1216 06:58:22.601141 4823 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Dec 16 06:58:22 crc kubenswrapper[4823]: E1216 06:58:22.601652 4823 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jndrd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
community-operators-5zgrf_openshift-marketplace(9e8470ac-325b-46ce-ac5d-6cbafc3c6164): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 16 06:58:22 crc kubenswrapper[4823]: E1216 06:58:22.602829 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-5zgrf" podUID="9e8470ac-325b-46ce-ac5d-6cbafc3c6164" Dec 16 06:58:22 crc kubenswrapper[4823]: E1216 06:58:22.676953 4823 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Dec 16 06:58:22 crc kubenswrapper[4823]: E1216 06:58:22.677134 4823 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8fg8h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-shg64_openshift-marketplace(726c1e86-0af2-45b0-bc89-af72df38eff8): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 16 06:58:22 crc kubenswrapper[4823]: E1216 06:58:22.678354 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-shg64" podUID="726c1e86-0af2-45b0-bc89-af72df38eff8" Dec 16 06:58:25 crc 
kubenswrapper[4823]: I1216 06:58:25.512824 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 16 06:58:25 crc kubenswrapper[4823]: E1216 06:58:25.514045 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1234e73-c354-413c-a717-f53ee37431d7" containerName="pruner" Dec 16 06:58:25 crc kubenswrapper[4823]: I1216 06:58:25.514070 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1234e73-c354-413c-a717-f53ee37431d7" containerName="pruner" Dec 16 06:58:25 crc kubenswrapper[4823]: I1216 06:58:25.514301 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1234e73-c354-413c-a717-f53ee37431d7" containerName="pruner" Dec 16 06:58:25 crc kubenswrapper[4823]: I1216 06:58:25.515137 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 16 06:58:25 crc kubenswrapper[4823]: I1216 06:58:25.519386 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 16 06:58:25 crc kubenswrapper[4823]: I1216 06:58:25.522896 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 16 06:58:25 crc kubenswrapper[4823]: I1216 06:58:25.523627 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 16 06:58:25 crc kubenswrapper[4823]: I1216 06:58:25.563154 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/17c65b6f-7fb6-494a-9f3e-43a79690df3d-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"17c65b6f-7fb6-494a-9f3e-43a79690df3d\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 16 06:58:25 crc kubenswrapper[4823]: I1216 06:58:25.563571 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/17c65b6f-7fb6-494a-9f3e-43a79690df3d-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"17c65b6f-7fb6-494a-9f3e-43a79690df3d\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 16 06:58:25 crc kubenswrapper[4823]: I1216 06:58:25.664059 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/17c65b6f-7fb6-494a-9f3e-43a79690df3d-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"17c65b6f-7fb6-494a-9f3e-43a79690df3d\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 16 06:58:25 crc kubenswrapper[4823]: I1216 06:58:25.664622 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/17c65b6f-7fb6-494a-9f3e-43a79690df3d-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"17c65b6f-7fb6-494a-9f3e-43a79690df3d\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 16 06:58:25 crc kubenswrapper[4823]: I1216 06:58:25.664238 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/17c65b6f-7fb6-494a-9f3e-43a79690df3d-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"17c65b6f-7fb6-494a-9f3e-43a79690df3d\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 16 06:58:25 crc kubenswrapper[4823]: I1216 06:58:25.687446 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/17c65b6f-7fb6-494a-9f3e-43a79690df3d-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"17c65b6f-7fb6-494a-9f3e-43a79690df3d\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 16 06:58:25 crc kubenswrapper[4823]: E1216 06:58:25.832526 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with 
ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-5zgrf" podUID="9e8470ac-325b-46ce-ac5d-6cbafc3c6164" Dec 16 06:58:25 crc kubenswrapper[4823]: E1216 06:58:25.832526 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-c98pm" podUID="28fa6f6b-657a-4fe7-993f-6c97d5e53b3c" Dec 16 06:58:25 crc kubenswrapper[4823]: E1216 06:58:25.832532 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-shg64" podUID="726c1e86-0af2-45b0-bc89-af72df38eff8" Dec 16 06:58:25 crc kubenswrapper[4823]: I1216 06:58:25.845841 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 16 06:58:25 crc kubenswrapper[4823]: E1216 06:58:25.909214 4823 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Dec 16 06:58:25 crc kubenswrapper[4823]: E1216 06:58:25.909503 4823 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-b6mmc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]Containe
rResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-2jtdw_openshift-marketplace(dfaef15c-ea70-4287-bf78-7e99f0fd0626): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 16 06:58:25 crc kubenswrapper[4823]: E1216 06:58:25.911108 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-2jtdw" podUID="dfaef15c-ea70-4287-bf78-7e99f0fd0626" Dec 16 06:58:26 crc kubenswrapper[4823]: E1216 06:58:26.919524 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-2jtdw" podUID="dfaef15c-ea70-4287-bf78-7e99f0fd0626" Dec 16 06:58:26 crc kubenswrapper[4823]: E1216 06:58:26.995774 4823 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Dec 16 06:58:26 crc kubenswrapper[4823]: E1216 06:58:26.996531 4823 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-64ggh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-5x67m_openshift-marketplace(7220fce6-80f1-4da7-9a90-f106616709ae): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 16 06:58:26 crc kubenswrapper[4823]: E1216 06:58:26.998010 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-5x67m" podUID="7220fce6-80f1-4da7-9a90-f106616709ae" Dec 16 06:58:27 crc 
kubenswrapper[4823]: E1216 06:58:27.008843 4823 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Dec 16 06:58:27 crc kubenswrapper[4823]: E1216 06:58:27.009071 4823 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6ppqn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
community-operators-hhp8l_openshift-marketplace(ca6b042f-7b3a-4204-90a8-d6a2c29fd271): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 16 06:58:27 crc kubenswrapper[4823]: E1216 06:58:27.010308 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-hhp8l" podUID="ca6b042f-7b3a-4204-90a8-d6a2c29fd271" Dec 16 06:58:27 crc kubenswrapper[4823]: E1216 06:58:27.026460 4823 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Dec 16 06:58:27 crc kubenswrapper[4823]: E1216 06:58:27.026633 4823 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qnrv2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-vdctr_openshift-marketplace(af944c3a-0424-490f-b445-b2ee72a3af0c): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 16 06:58:27 crc kubenswrapper[4823]: E1216 06:58:27.027867 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-vdctr" podUID="af944c3a-0424-490f-b445-b2ee72a3af0c" Dec 16 06:58:27 crc 
kubenswrapper[4823]: E1216 06:58:27.064386 4823 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Dec 16 06:58:27 crc kubenswrapper[4823]: E1216 06:58:27.064726 4823 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9n9fs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-operators-tls89_openshift-marketplace(f658a770-db01-4cb4-8d83-e6dc10513860): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 16 06:58:27 crc kubenswrapper[4823]: E1216 06:58:27.065943 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-tls89" podUID="f658a770-db01-4cb4-8d83-e6dc10513860" Dec 16 06:58:27 crc kubenswrapper[4823]: I1216 06:58:27.351041 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-8mn7l"] Dec 16 06:58:27 crc kubenswrapper[4823]: I1216 06:58:27.415828 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 16 06:58:27 crc kubenswrapper[4823]: W1216 06:58:27.424534 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod17c65b6f_7fb6_494a_9f3e_43a79690df3d.slice/crio-5179ab4d66e8b2aa3f0fb438d63858fa68a75290209b85bcf42be3aab0a79aa9 WatchSource:0}: Error finding container 5179ab4d66e8b2aa3f0fb438d63858fa68a75290209b85bcf42be3aab0a79aa9: Status 404 returned error can't find the container with id 5179ab4d66e8b2aa3f0fb438d63858fa68a75290209b85bcf42be3aab0a79aa9 Dec 16 06:58:27 crc kubenswrapper[4823]: I1216 06:58:27.800014 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"17c65b6f-7fb6-494a-9f3e-43a79690df3d","Type":"ContainerStarted","Data":"91b04e3c1f03c7694d9d5ff01d74bf228312be756ab137c0369670277f4c2919"} Dec 16 06:58:27 crc kubenswrapper[4823]: I1216 06:58:27.800493 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" 
event={"ID":"17c65b6f-7fb6-494a-9f3e-43a79690df3d","Type":"ContainerStarted","Data":"5179ab4d66e8b2aa3f0fb438d63858fa68a75290209b85bcf42be3aab0a79aa9"} Dec 16 06:58:27 crc kubenswrapper[4823]: I1216 06:58:27.804710 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-8mn7l" event={"ID":"1e7dd738-a9b3-455c-93e0-3f0dc7327817","Type":"ContainerStarted","Data":"3ab20bf836a7c1dd9f1bf4310012c32d8a2a994aaa9bfebc77a89d5ce53e9688"} Dec 16 06:58:27 crc kubenswrapper[4823]: I1216 06:58:27.804763 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-8mn7l" event={"ID":"1e7dd738-a9b3-455c-93e0-3f0dc7327817","Type":"ContainerStarted","Data":"0d4c85bb9361a573243526f1ad50d40402a56dced3bb0135b69e6a2857e23c89"} Dec 16 06:58:27 crc kubenswrapper[4823]: E1216 06:58:27.806997 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-vdctr" podUID="af944c3a-0424-490f-b445-b2ee72a3af0c" Dec 16 06:58:27 crc kubenswrapper[4823]: E1216 06:58:27.807363 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-hhp8l" podUID="ca6b042f-7b3a-4204-90a8-d6a2c29fd271" Dec 16 06:58:27 crc kubenswrapper[4823]: E1216 06:58:27.807466 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-tls89" podUID="f658a770-db01-4cb4-8d83-e6dc10513860" Dec 16 06:58:27 crc 
kubenswrapper[4823]: E1216 06:58:27.807536 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-5x67m" podUID="7220fce6-80f1-4da7-9a90-f106616709ae" Dec 16 06:58:27 crc kubenswrapper[4823]: I1216 06:58:27.817882 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=2.817854235 podStartE2EDuration="2.817854235s" podCreationTimestamp="2025-12-16 06:58:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:58:27.81387634 +0000 UTC m=+186.302442473" watchObservedRunningTime="2025-12-16 06:58:27.817854235 +0000 UTC m=+186.306420358" Dec 16 06:58:27 crc kubenswrapper[4823]: I1216 06:58:27.902257 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:58:28 crc kubenswrapper[4823]: I1216 06:58:28.134282 4823 patch_prober.go:28] interesting pod/machine-config-daemon-fv56f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 06:58:28 crc kubenswrapper[4823]: I1216 06:58:28.134389 4823 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 06:58:28 crc kubenswrapper[4823]: I1216 06:58:28.812624 4823 generic.go:334] "Generic (PLEG): 
container finished" podID="17c65b6f-7fb6-494a-9f3e-43a79690df3d" containerID="91b04e3c1f03c7694d9d5ff01d74bf228312be756ab137c0369670277f4c2919" exitCode=0 Dec 16 06:58:28 crc kubenswrapper[4823]: I1216 06:58:28.813108 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"17c65b6f-7fb6-494a-9f3e-43a79690df3d","Type":"ContainerDied","Data":"91b04e3c1f03c7694d9d5ff01d74bf228312be756ab137c0369670277f4c2919"} Dec 16 06:58:28 crc kubenswrapper[4823]: I1216 06:58:28.818240 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-8mn7l" event={"ID":"1e7dd738-a9b3-455c-93e0-3f0dc7327817","Type":"ContainerStarted","Data":"1c22e4c01f9a8ffcd7410af42dc4c2135716b730cd6b4f0d5d150428291a261a"} Dec 16 06:58:28 crc kubenswrapper[4823]: I1216 06:58:28.854779 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-8mn7l" podStartSLOduration=169.854726824 podStartE2EDuration="2m49.854726824s" podCreationTimestamp="2025-12-16 06:55:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:58:28.847882753 +0000 UTC m=+187.336448886" watchObservedRunningTime="2025-12-16 06:58:28.854726824 +0000 UTC m=+187.343292947" Dec 16 06:58:30 crc kubenswrapper[4823]: I1216 06:58:30.122843 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 16 06:58:30 crc kubenswrapper[4823]: I1216 06:58:30.242050 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/17c65b6f-7fb6-494a-9f3e-43a79690df3d-kube-api-access\") pod \"17c65b6f-7fb6-494a-9f3e-43a79690df3d\" (UID: \"17c65b6f-7fb6-494a-9f3e-43a79690df3d\") " Dec 16 06:58:30 crc kubenswrapper[4823]: I1216 06:58:30.242237 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/17c65b6f-7fb6-494a-9f3e-43a79690df3d-kubelet-dir\") pod \"17c65b6f-7fb6-494a-9f3e-43a79690df3d\" (UID: \"17c65b6f-7fb6-494a-9f3e-43a79690df3d\") " Dec 16 06:58:30 crc kubenswrapper[4823]: I1216 06:58:30.242384 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/17c65b6f-7fb6-494a-9f3e-43a79690df3d-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "17c65b6f-7fb6-494a-9f3e-43a79690df3d" (UID: "17c65b6f-7fb6-494a-9f3e-43a79690df3d"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 06:58:30 crc kubenswrapper[4823]: I1216 06:58:30.242951 4823 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/17c65b6f-7fb6-494a-9f3e-43a79690df3d-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 16 06:58:30 crc kubenswrapper[4823]: I1216 06:58:30.249097 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17c65b6f-7fb6-494a-9f3e-43a79690df3d-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "17c65b6f-7fb6-494a-9f3e-43a79690df3d" (UID: "17c65b6f-7fb6-494a-9f3e-43a79690df3d"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:58:30 crc kubenswrapper[4823]: I1216 06:58:30.344997 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/17c65b6f-7fb6-494a-9f3e-43a79690df3d-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 16 06:58:30 crc kubenswrapper[4823]: I1216 06:58:30.499138 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 16 06:58:30 crc kubenswrapper[4823]: E1216 06:58:30.499451 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17c65b6f-7fb6-494a-9f3e-43a79690df3d" containerName="pruner" Dec 16 06:58:30 crc kubenswrapper[4823]: I1216 06:58:30.499479 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="17c65b6f-7fb6-494a-9f3e-43a79690df3d" containerName="pruner" Dec 16 06:58:30 crc kubenswrapper[4823]: I1216 06:58:30.499635 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="17c65b6f-7fb6-494a-9f3e-43a79690df3d" containerName="pruner" Dec 16 06:58:30 crc kubenswrapper[4823]: I1216 06:58:30.500151 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 16 06:58:30 crc kubenswrapper[4823]: I1216 06:58:30.511075 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 16 06:58:30 crc kubenswrapper[4823]: I1216 06:58:30.547281 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fa03a70b-1afb-4d1b-8bec-7a302a382b7d-kubelet-dir\") pod \"installer-9-crc\" (UID: \"fa03a70b-1afb-4d1b-8bec-7a302a382b7d\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 16 06:58:30 crc kubenswrapper[4823]: I1216 06:58:30.547339 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fa03a70b-1afb-4d1b-8bec-7a302a382b7d-kube-api-access\") pod \"installer-9-crc\" (UID: \"fa03a70b-1afb-4d1b-8bec-7a302a382b7d\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 16 06:58:30 crc kubenswrapper[4823]: I1216 06:58:30.547555 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/fa03a70b-1afb-4d1b-8bec-7a302a382b7d-var-lock\") pod \"installer-9-crc\" (UID: \"fa03a70b-1afb-4d1b-8bec-7a302a382b7d\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 16 06:58:30 crc kubenswrapper[4823]: I1216 06:58:30.649102 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fa03a70b-1afb-4d1b-8bec-7a302a382b7d-kubelet-dir\") pod \"installer-9-crc\" (UID: \"fa03a70b-1afb-4d1b-8bec-7a302a382b7d\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 16 06:58:30 crc kubenswrapper[4823]: I1216 06:58:30.649175 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/fa03a70b-1afb-4d1b-8bec-7a302a382b7d-kube-api-access\") pod \"installer-9-crc\" (UID: \"fa03a70b-1afb-4d1b-8bec-7a302a382b7d\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 16 06:58:30 crc kubenswrapper[4823]: I1216 06:58:30.649232 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/fa03a70b-1afb-4d1b-8bec-7a302a382b7d-var-lock\") pod \"installer-9-crc\" (UID: \"fa03a70b-1afb-4d1b-8bec-7a302a382b7d\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 16 06:58:30 crc kubenswrapper[4823]: I1216 06:58:30.649301 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fa03a70b-1afb-4d1b-8bec-7a302a382b7d-kubelet-dir\") pod \"installer-9-crc\" (UID: \"fa03a70b-1afb-4d1b-8bec-7a302a382b7d\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 16 06:58:30 crc kubenswrapper[4823]: I1216 06:58:30.649330 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/fa03a70b-1afb-4d1b-8bec-7a302a382b7d-var-lock\") pod \"installer-9-crc\" (UID: \"fa03a70b-1afb-4d1b-8bec-7a302a382b7d\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 16 06:58:30 crc kubenswrapper[4823]: I1216 06:58:30.666359 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fa03a70b-1afb-4d1b-8bec-7a302a382b7d-kube-api-access\") pod \"installer-9-crc\" (UID: \"fa03a70b-1afb-4d1b-8bec-7a302a382b7d\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 16 06:58:30 crc kubenswrapper[4823]: I1216 06:58:30.817540 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 16 06:58:30 crc kubenswrapper[4823]: I1216 06:58:30.831371 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"17c65b6f-7fb6-494a-9f3e-43a79690df3d","Type":"ContainerDied","Data":"5179ab4d66e8b2aa3f0fb438d63858fa68a75290209b85bcf42be3aab0a79aa9"} Dec 16 06:58:30 crc kubenswrapper[4823]: I1216 06:58:30.831423 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5179ab4d66e8b2aa3f0fb438d63858fa68a75290209b85bcf42be3aab0a79aa9" Dec 16 06:58:30 crc kubenswrapper[4823]: I1216 06:58:30.831444 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 16 06:58:31 crc kubenswrapper[4823]: I1216 06:58:31.222821 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 16 06:58:31 crc kubenswrapper[4823]: I1216 06:58:31.837885 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"fa03a70b-1afb-4d1b-8bec-7a302a382b7d","Type":"ContainerStarted","Data":"eea4b81cf12a679633c6ece2bf1aeaa5831801a214e1bd4f0daf014b20cc7157"} Dec 16 06:58:31 crc kubenswrapper[4823]: I1216 06:58:31.838323 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"fa03a70b-1afb-4d1b-8bec-7a302a382b7d","Type":"ContainerStarted","Data":"9256445e61a3048aa3b4f5328a9770835773848cf6eb4e55a305421bd422fdf3"} Dec 16 06:58:31 crc kubenswrapper[4823]: I1216 06:58:31.859103 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=1.8590775449999999 podStartE2EDuration="1.859077545s" podCreationTimestamp="2025-12-16 06:58:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:58:31.854280073 +0000 UTC m=+190.342846196" watchObservedRunningTime="2025-12-16 06:58:31.859077545 +0000 UTC m=+190.347643678" Dec 16 06:58:38 crc kubenswrapper[4823]: I1216 06:58:38.884862 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5zgrf" event={"ID":"9e8470ac-325b-46ce-ac5d-6cbafc3c6164","Type":"ContainerStarted","Data":"6e52bd1627de5b7422160a9fe8351580ee5cb8c76a14232c9a60b4287f2c07b3"} Dec 16 06:58:39 crc kubenswrapper[4823]: I1216 06:58:39.903906 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c98pm" event={"ID":"28fa6f6b-657a-4fe7-993f-6c97d5e53b3c","Type":"ContainerStarted","Data":"21d29dea4eafb3a0b6eee47ea02ac5561a08f9dfb0279bd28d47b6148c5a4cdd"} Dec 16 06:58:39 crc kubenswrapper[4823]: I1216 06:58:39.908176 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2jtdw" event={"ID":"dfaef15c-ea70-4287-bf78-7e99f0fd0626","Type":"ContainerStarted","Data":"360f902d076aa1b5fe2bef2e3bb3cadf362e0101fc89fdb607cb555c8bd359a3"} Dec 16 06:58:39 crc kubenswrapper[4823]: I1216 06:58:39.910922 4823 generic.go:334] "Generic (PLEG): container finished" podID="9e8470ac-325b-46ce-ac5d-6cbafc3c6164" containerID="6e52bd1627de5b7422160a9fe8351580ee5cb8c76a14232c9a60b4287f2c07b3" exitCode=0 Dec 16 06:58:39 crc kubenswrapper[4823]: I1216 06:58:39.910989 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5zgrf" event={"ID":"9e8470ac-325b-46ce-ac5d-6cbafc3c6164","Type":"ContainerDied","Data":"6e52bd1627de5b7422160a9fe8351580ee5cb8c76a14232c9a60b4287f2c07b3"} Dec 16 06:58:40 crc kubenswrapper[4823]: I1216 06:58:40.919589 4823 generic.go:334] "Generic (PLEG): container finished" podID="28fa6f6b-657a-4fe7-993f-6c97d5e53b3c" 
containerID="21d29dea4eafb3a0b6eee47ea02ac5561a08f9dfb0279bd28d47b6148c5a4cdd" exitCode=0 Dec 16 06:58:40 crc kubenswrapper[4823]: I1216 06:58:40.919640 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c98pm" event={"ID":"28fa6f6b-657a-4fe7-993f-6c97d5e53b3c","Type":"ContainerDied","Data":"21d29dea4eafb3a0b6eee47ea02ac5561a08f9dfb0279bd28d47b6148c5a4cdd"} Dec 16 06:58:40 crc kubenswrapper[4823]: I1216 06:58:40.920429 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c98pm" event={"ID":"28fa6f6b-657a-4fe7-993f-6c97d5e53b3c","Type":"ContainerStarted","Data":"6309251f3188d55bf3a3266a0e518a0db145309612c14f3387a166ec4dd26685"} Dec 16 06:58:40 crc kubenswrapper[4823]: I1216 06:58:40.922673 4823 generic.go:334] "Generic (PLEG): container finished" podID="726c1e86-0af2-45b0-bc89-af72df38eff8" containerID="4cfa95a0be96439067a0ce9b8781685fef6113e36ead9b993293993cdcdbf8d5" exitCode=0 Dec 16 06:58:40 crc kubenswrapper[4823]: I1216 06:58:40.922715 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-shg64" event={"ID":"726c1e86-0af2-45b0-bc89-af72df38eff8","Type":"ContainerDied","Data":"4cfa95a0be96439067a0ce9b8781685fef6113e36ead9b993293993cdcdbf8d5"} Dec 16 06:58:40 crc kubenswrapper[4823]: I1216 06:58:40.928533 4823 generic.go:334] "Generic (PLEG): container finished" podID="7220fce6-80f1-4da7-9a90-f106616709ae" containerID="1130ff670ba478cd411804e58471dbf3533596ffc3a765ae7dfe735ca9c7ee15" exitCode=0 Dec 16 06:58:40 crc kubenswrapper[4823]: I1216 06:58:40.928614 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5x67m" event={"ID":"7220fce6-80f1-4da7-9a90-f106616709ae","Type":"ContainerDied","Data":"1130ff670ba478cd411804e58471dbf3533596ffc3a765ae7dfe735ca9c7ee15"} Dec 16 06:58:40 crc kubenswrapper[4823]: I1216 06:58:40.937638 4823 generic.go:334] "Generic (PLEG): container 
finished" podID="dfaef15c-ea70-4287-bf78-7e99f0fd0626" containerID="360f902d076aa1b5fe2bef2e3bb3cadf362e0101fc89fdb607cb555c8bd359a3" exitCode=0 Dec 16 06:58:40 crc kubenswrapper[4823]: I1216 06:58:40.937710 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2jtdw" event={"ID":"dfaef15c-ea70-4287-bf78-7e99f0fd0626","Type":"ContainerDied","Data":"360f902d076aa1b5fe2bef2e3bb3cadf362e0101fc89fdb607cb555c8bd359a3"} Dec 16 06:58:40 crc kubenswrapper[4823]: I1216 06:58:40.946669 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5zgrf" event={"ID":"9e8470ac-325b-46ce-ac5d-6cbafc3c6164","Type":"ContainerStarted","Data":"b760dd048b51445623df227447fed49cd5df67f7449ce4484c7fef86250b0d92"} Dec 16 06:58:40 crc kubenswrapper[4823]: I1216 06:58:40.950008 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-c98pm" podStartSLOduration=3.743234486 podStartE2EDuration="55.949987519s" podCreationTimestamp="2025-12-16 06:57:45 +0000 UTC" firstStartedPulling="2025-12-16 06:57:48.200826362 +0000 UTC m=+146.689392485" lastFinishedPulling="2025-12-16 06:58:40.407579395 +0000 UTC m=+198.896145518" observedRunningTime="2025-12-16 06:58:40.940094914 +0000 UTC m=+199.428661037" watchObservedRunningTime="2025-12-16 06:58:40.949987519 +0000 UTC m=+199.438553642" Dec 16 06:58:41 crc kubenswrapper[4823]: I1216 06:58:41.019079 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-5zgrf" podStartSLOduration=3.740400625 podStartE2EDuration="56.019054751s" podCreationTimestamp="2025-12-16 06:57:45 +0000 UTC" firstStartedPulling="2025-12-16 06:57:48.190967459 +0000 UTC m=+146.679533582" lastFinishedPulling="2025-12-16 06:58:40.469621585 +0000 UTC m=+198.958187708" observedRunningTime="2025-12-16 06:58:41.01693543 +0000 UTC m=+199.505501553" 
watchObservedRunningTime="2025-12-16 06:58:41.019054751 +0000 UTC m=+199.507620904" Dec 16 06:58:41 crc kubenswrapper[4823]: I1216 06:58:41.955054 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-shg64" event={"ID":"726c1e86-0af2-45b0-bc89-af72df38eff8","Type":"ContainerStarted","Data":"b819c7fa993cb265b0407c30f4dc9ee842607e33640cd31326a01c69a6282451"} Dec 16 06:58:41 crc kubenswrapper[4823]: I1216 06:58:41.957572 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5x67m" event={"ID":"7220fce6-80f1-4da7-9a90-f106616709ae","Type":"ContainerStarted","Data":"a34010f49d4634fa3e7a9d8c1e3ddb232092355af14f8a3d3f92fb6bb4779033"} Dec 16 06:58:41 crc kubenswrapper[4823]: I1216 06:58:41.960678 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2jtdw" event={"ID":"dfaef15c-ea70-4287-bf78-7e99f0fd0626","Type":"ContainerStarted","Data":"d244bf80e24864179e3b869c5e18a7ed9bb9718ac4a051319b4f010b759cf384"} Dec 16 06:58:42 crc kubenswrapper[4823]: I1216 06:58:42.008975 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-shg64" podStartSLOduration=2.669971114 podStartE2EDuration="57.008938807s" podCreationTimestamp="2025-12-16 06:57:45 +0000 UTC" firstStartedPulling="2025-12-16 06:57:47.123731124 +0000 UTC m=+145.612297247" lastFinishedPulling="2025-12-16 06:58:41.462698817 +0000 UTC m=+199.951264940" observedRunningTime="2025-12-16 06:58:41.980483152 +0000 UTC m=+200.469049275" watchObservedRunningTime="2025-12-16 06:58:42.008938807 +0000 UTC m=+200.497504930" Dec 16 06:58:42 crc kubenswrapper[4823]: I1216 06:58:42.012558 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-5x67m" podStartSLOduration=2.794348908 podStartE2EDuration="55.012547456s" podCreationTimestamp="2025-12-16 06:57:47 +0000 UTC" 
firstStartedPulling="2025-12-16 06:57:49.29956511 +0000 UTC m=+147.788131233" lastFinishedPulling="2025-12-16 06:58:41.517763658 +0000 UTC m=+200.006329781" observedRunningTime="2025-12-16 06:58:42.011514592 +0000 UTC m=+200.500080715" watchObservedRunningTime="2025-12-16 06:58:42.012547456 +0000 UTC m=+200.501113579" Dec 16 06:58:42 crc kubenswrapper[4823]: I1216 06:58:42.798679 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-2jtdw" podStartSLOduration=3.779265024 podStartE2EDuration="54.798658183s" podCreationTimestamp="2025-12-16 06:57:48 +0000 UTC" firstStartedPulling="2025-12-16 06:57:50.318921609 +0000 UTC m=+148.807487732" lastFinishedPulling="2025-12-16 06:58:41.338314768 +0000 UTC m=+199.826880891" observedRunningTime="2025-12-16 06:58:42.036283976 +0000 UTC m=+200.524850099" watchObservedRunningTime="2025-12-16 06:58:42.798658183 +0000 UTC m=+201.287224296" Dec 16 06:58:42 crc kubenswrapper[4823]: I1216 06:58:42.978393 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tls89" event={"ID":"f658a770-db01-4cb4-8d83-e6dc10513860","Type":"ContainerStarted","Data":"6f78f1142f7e79c8c4a764df250255e93523ee22c7bf60d944ac018ed366566e"} Dec 16 06:58:43 crc kubenswrapper[4823]: I1216 06:58:43.984439 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vdctr" event={"ID":"af944c3a-0424-490f-b445-b2ee72a3af0c","Type":"ContainerStarted","Data":"d9309e9b8bd4c700c73627946665b9a73b24d9473ff8969ffeb5b3fce67e1374"} Dec 16 06:58:43 crc kubenswrapper[4823]: I1216 06:58:43.986691 4823 generic.go:334] "Generic (PLEG): container finished" podID="f658a770-db01-4cb4-8d83-e6dc10513860" containerID="6f78f1142f7e79c8c4a764df250255e93523ee22c7bf60d944ac018ed366566e" exitCode=0 Dec 16 06:58:43 crc kubenswrapper[4823]: I1216 06:58:43.986730 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-tls89" event={"ID":"f658a770-db01-4cb4-8d83-e6dc10513860","Type":"ContainerDied","Data":"6f78f1142f7e79c8c4a764df250255e93523ee22c7bf60d944ac018ed366566e"} Dec 16 06:58:44 crc kubenswrapper[4823]: I1216 06:58:44.995859 4823 generic.go:334] "Generic (PLEG): container finished" podID="af944c3a-0424-490f-b445-b2ee72a3af0c" containerID="d9309e9b8bd4c700c73627946665b9a73b24d9473ff8969ffeb5b3fce67e1374" exitCode=0 Dec 16 06:58:44 crc kubenswrapper[4823]: I1216 06:58:44.996130 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vdctr" event={"ID":"af944c3a-0424-490f-b445-b2ee72a3af0c","Type":"ContainerDied","Data":"d9309e9b8bd4c700c73627946665b9a73b24d9473ff8969ffeb5b3fce67e1374"} Dec 16 06:58:45 crc kubenswrapper[4823]: I1216 06:58:45.740776 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-shg64" Dec 16 06:58:45 crc kubenswrapper[4823]: I1216 06:58:45.741204 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-shg64" Dec 16 06:58:45 crc kubenswrapper[4823]: I1216 06:58:45.951105 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-shg64" Dec 16 06:58:46 crc kubenswrapper[4823]: I1216 06:58:46.039454 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-shg64" Dec 16 06:58:46 crc kubenswrapper[4823]: I1216 06:58:46.148859 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-c98pm" Dec 16 06:58:46 crc kubenswrapper[4823]: I1216 06:58:46.148904 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-c98pm" Dec 16 06:58:46 crc kubenswrapper[4823]: I1216 06:58:46.192387 4823 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-c98pm" Dec 16 06:58:46 crc kubenswrapper[4823]: I1216 06:58:46.366378 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-5zgrf" Dec 16 06:58:46 crc kubenswrapper[4823]: I1216 06:58:46.366439 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-5zgrf" Dec 16 06:58:46 crc kubenswrapper[4823]: I1216 06:58:46.425657 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-5zgrf" Dec 16 06:58:47 crc kubenswrapper[4823]: I1216 06:58:47.051925 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-5zgrf" Dec 16 06:58:47 crc kubenswrapper[4823]: I1216 06:58:47.061508 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-c98pm" Dec 16 06:58:47 crc kubenswrapper[4823]: I1216 06:58:47.877704 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-5x67m" Dec 16 06:58:47 crc kubenswrapper[4823]: I1216 06:58:47.878685 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-5x67m" Dec 16 06:58:47 crc kubenswrapper[4823]: I1216 06:58:47.921253 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-5x67m" Dec 16 06:58:48 crc kubenswrapper[4823]: I1216 06:58:48.020951 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5zgrf"] Dec 16 06:58:48 crc kubenswrapper[4823]: I1216 06:58:48.052449 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-marketplace-5x67m" Dec 16 06:58:48 crc kubenswrapper[4823]: I1216 06:58:48.891411 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-2jtdw" Dec 16 06:58:48 crc kubenswrapper[4823]: I1216 06:58:48.891478 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-2jtdw" Dec 16 06:58:48 crc kubenswrapper[4823]: I1216 06:58:48.933787 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-2jtdw" Dec 16 06:58:49 crc kubenswrapper[4823]: I1216 06:58:49.020552 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-5zgrf" podUID="9e8470ac-325b-46ce-ac5d-6cbafc3c6164" containerName="registry-server" containerID="cri-o://b760dd048b51445623df227447fed49cd5df67f7449ce4484c7fef86250b0d92" gracePeriod=2 Dec 16 06:58:49 crc kubenswrapper[4823]: I1216 06:58:49.116967 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-2jtdw" Dec 16 06:58:50 crc kubenswrapper[4823]: I1216 06:58:50.417791 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-c98pm"] Dec 16 06:58:50 crc kubenswrapper[4823]: I1216 06:58:50.418316 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-c98pm" podUID="28fa6f6b-657a-4fe7-993f-6c97d5e53b3c" containerName="registry-server" containerID="cri-o://6309251f3188d55bf3a3266a0e518a0db145309612c14f3387a166ec4dd26685" gracePeriod=2 Dec 16 06:58:50 crc kubenswrapper[4823]: I1216 06:58:50.903801 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5zgrf" Dec 16 06:58:50 crc kubenswrapper[4823]: I1216 06:58:50.983217 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e8470ac-325b-46ce-ac5d-6cbafc3c6164-utilities\") pod \"9e8470ac-325b-46ce-ac5d-6cbafc3c6164\" (UID: \"9e8470ac-325b-46ce-ac5d-6cbafc3c6164\") " Dec 16 06:58:50 crc kubenswrapper[4823]: I1216 06:58:50.983326 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jndrd\" (UniqueName: \"kubernetes.io/projected/9e8470ac-325b-46ce-ac5d-6cbafc3c6164-kube-api-access-jndrd\") pod \"9e8470ac-325b-46ce-ac5d-6cbafc3c6164\" (UID: \"9e8470ac-325b-46ce-ac5d-6cbafc3c6164\") " Dec 16 06:58:50 crc kubenswrapper[4823]: I1216 06:58:50.983348 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e8470ac-325b-46ce-ac5d-6cbafc3c6164-catalog-content\") pod \"9e8470ac-325b-46ce-ac5d-6cbafc3c6164\" (UID: \"9e8470ac-325b-46ce-ac5d-6cbafc3c6164\") " Dec 16 06:58:50 crc kubenswrapper[4823]: I1216 06:58:50.984117 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e8470ac-325b-46ce-ac5d-6cbafc3c6164-utilities" (OuterVolumeSpecName: "utilities") pod "9e8470ac-325b-46ce-ac5d-6cbafc3c6164" (UID: "9e8470ac-325b-46ce-ac5d-6cbafc3c6164"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 06:58:50 crc kubenswrapper[4823]: I1216 06:58:50.990767 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e8470ac-325b-46ce-ac5d-6cbafc3c6164-kube-api-access-jndrd" (OuterVolumeSpecName: "kube-api-access-jndrd") pod "9e8470ac-325b-46ce-ac5d-6cbafc3c6164" (UID: "9e8470ac-325b-46ce-ac5d-6cbafc3c6164"). InnerVolumeSpecName "kube-api-access-jndrd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:58:51 crc kubenswrapper[4823]: I1216 06:58:51.004071 4823 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e8470ac-325b-46ce-ac5d-6cbafc3c6164-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 06:58:51 crc kubenswrapper[4823]: I1216 06:58:51.004111 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jndrd\" (UniqueName: \"kubernetes.io/projected/9e8470ac-325b-46ce-ac5d-6cbafc3c6164-kube-api-access-jndrd\") on node \"crc\" DevicePath \"\"" Dec 16 06:58:51 crc kubenswrapper[4823]: I1216 06:58:51.032554 4823 generic.go:334] "Generic (PLEG): container finished" podID="9e8470ac-325b-46ce-ac5d-6cbafc3c6164" containerID="b760dd048b51445623df227447fed49cd5df67f7449ce4484c7fef86250b0d92" exitCode=0 Dec 16 06:58:51 crc kubenswrapper[4823]: I1216 06:58:51.032617 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5zgrf" event={"ID":"9e8470ac-325b-46ce-ac5d-6cbafc3c6164","Type":"ContainerDied","Data":"b760dd048b51445623df227447fed49cd5df67f7449ce4484c7fef86250b0d92"} Dec 16 06:58:51 crc kubenswrapper[4823]: I1216 06:58:51.032648 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5zgrf" event={"ID":"9e8470ac-325b-46ce-ac5d-6cbafc3c6164","Type":"ContainerDied","Data":"4bc2d23b3e0977eadcd44ac5cc6ebbbd6914d1a2aa8b8fe248afc384ef3a8998"} Dec 16 06:58:51 crc kubenswrapper[4823]: I1216 06:58:51.032667 4823 scope.go:117] "RemoveContainer" containerID="b760dd048b51445623df227447fed49cd5df67f7449ce4484c7fef86250b0d92" Dec 16 06:58:51 crc kubenswrapper[4823]: I1216 06:58:51.032807 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5zgrf" Dec 16 06:58:51 crc kubenswrapper[4823]: I1216 06:58:51.042008 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vdctr" event={"ID":"af944c3a-0424-490f-b445-b2ee72a3af0c","Type":"ContainerStarted","Data":"50d2f42221943fc2437f32b83283a4a1f7032f7b643f87d130aec0566786bcd8"} Dec 16 06:58:51 crc kubenswrapper[4823]: I1216 06:58:51.042488 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e8470ac-325b-46ce-ac5d-6cbafc3c6164-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9e8470ac-325b-46ce-ac5d-6cbafc3c6164" (UID: "9e8470ac-325b-46ce-ac5d-6cbafc3c6164"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 06:58:51 crc kubenswrapper[4823]: I1216 06:58:51.044983 4823 generic.go:334] "Generic (PLEG): container finished" podID="28fa6f6b-657a-4fe7-993f-6c97d5e53b3c" containerID="6309251f3188d55bf3a3266a0e518a0db145309612c14f3387a166ec4dd26685" exitCode=0 Dec 16 06:58:51 crc kubenswrapper[4823]: I1216 06:58:51.045105 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c98pm" event={"ID":"28fa6f6b-657a-4fe7-993f-6c97d5e53b3c","Type":"ContainerDied","Data":"6309251f3188d55bf3a3266a0e518a0db145309612c14f3387a166ec4dd26685"} Dec 16 06:58:51 crc kubenswrapper[4823]: I1216 06:58:51.046849 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hhp8l" event={"ID":"ca6b042f-7b3a-4204-90a8-d6a2c29fd271","Type":"ContainerStarted","Data":"ceedf2c81a1224da174afd8fc392df551e5d91fc2406e342e80a98e7eea3fd1b"} Dec 16 06:58:51 crc kubenswrapper[4823]: I1216 06:58:51.048625 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tls89" 
event={"ID":"f658a770-db01-4cb4-8d83-e6dc10513860","Type":"ContainerStarted","Data":"02fdc5933120c2a4e49cf90b63a971410755c06f057e0a6b76618dfa66622202"} Dec 16 06:58:51 crc kubenswrapper[4823]: I1216 06:58:51.058243 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-vdctr" podStartSLOduration=2.542040029 podStartE2EDuration="1m4.058195374s" podCreationTimestamp="2025-12-16 06:57:47 +0000 UTC" firstStartedPulling="2025-12-16 06:57:49.2501178 +0000 UTC m=+147.738683923" lastFinishedPulling="2025-12-16 06:58:50.766273145 +0000 UTC m=+209.254839268" observedRunningTime="2025-12-16 06:58:51.057593394 +0000 UTC m=+209.546159517" watchObservedRunningTime="2025-12-16 06:58:51.058195374 +0000 UTC m=+209.546761497" Dec 16 06:58:51 crc kubenswrapper[4823]: I1216 06:58:51.064178 4823 scope.go:117] "RemoveContainer" containerID="6e52bd1627de5b7422160a9fe8351580ee5cb8c76a14232c9a60b4287f2c07b3" Dec 16 06:58:51 crc kubenswrapper[4823]: I1216 06:58:51.075839 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-tls89" podStartSLOduration=3.782687813 podStartE2EDuration="1m3.075717109s" podCreationTimestamp="2025-12-16 06:57:48 +0000 UTC" firstStartedPulling="2025-12-16 06:57:51.377313135 +0000 UTC m=+149.865879258" lastFinishedPulling="2025-12-16 06:58:50.670342431 +0000 UTC m=+209.158908554" observedRunningTime="2025-12-16 06:58:51.073379213 +0000 UTC m=+209.561945336" watchObservedRunningTime="2025-12-16 06:58:51.075717109 +0000 UTC m=+209.564283232" Dec 16 06:58:51 crc kubenswrapper[4823]: I1216 06:58:51.090379 4823 scope.go:117] "RemoveContainer" containerID="3dad0a00dbe2891d542c2a68d616b2402c380a18e5be93fdfc977e5da88fdff5" Dec 16 06:58:51 crc kubenswrapper[4823]: I1216 06:58:51.105735 4823 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/9e8470ac-325b-46ce-ac5d-6cbafc3c6164-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 06:58:51 crc kubenswrapper[4823]: I1216 06:58:51.109744 4823 scope.go:117] "RemoveContainer" containerID="b760dd048b51445623df227447fed49cd5df67f7449ce4484c7fef86250b0d92" Dec 16 06:58:51 crc kubenswrapper[4823]: E1216 06:58:51.110330 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b760dd048b51445623df227447fed49cd5df67f7449ce4484c7fef86250b0d92\": container with ID starting with b760dd048b51445623df227447fed49cd5df67f7449ce4484c7fef86250b0d92 not found: ID does not exist" containerID="b760dd048b51445623df227447fed49cd5df67f7449ce4484c7fef86250b0d92" Dec 16 06:58:51 crc kubenswrapper[4823]: I1216 06:58:51.110370 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b760dd048b51445623df227447fed49cd5df67f7449ce4484c7fef86250b0d92"} err="failed to get container status \"b760dd048b51445623df227447fed49cd5df67f7449ce4484c7fef86250b0d92\": rpc error: code = NotFound desc = could not find container \"b760dd048b51445623df227447fed49cd5df67f7449ce4484c7fef86250b0d92\": container with ID starting with b760dd048b51445623df227447fed49cd5df67f7449ce4484c7fef86250b0d92 not found: ID does not exist" Dec 16 06:58:51 crc kubenswrapper[4823]: I1216 06:58:51.110555 4823 scope.go:117] "RemoveContainer" containerID="6e52bd1627de5b7422160a9fe8351580ee5cb8c76a14232c9a60b4287f2c07b3" Dec 16 06:58:51 crc kubenswrapper[4823]: E1216 06:58:51.110932 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e52bd1627de5b7422160a9fe8351580ee5cb8c76a14232c9a60b4287f2c07b3\": container with ID starting with 6e52bd1627de5b7422160a9fe8351580ee5cb8c76a14232c9a60b4287f2c07b3 not found: ID does not exist" containerID="6e52bd1627de5b7422160a9fe8351580ee5cb8c76a14232c9a60b4287f2c07b3" Dec 
16 06:58:51 crc kubenswrapper[4823]: I1216 06:58:51.110984 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e52bd1627de5b7422160a9fe8351580ee5cb8c76a14232c9a60b4287f2c07b3"} err="failed to get container status \"6e52bd1627de5b7422160a9fe8351580ee5cb8c76a14232c9a60b4287f2c07b3\": rpc error: code = NotFound desc = could not find container \"6e52bd1627de5b7422160a9fe8351580ee5cb8c76a14232c9a60b4287f2c07b3\": container with ID starting with 6e52bd1627de5b7422160a9fe8351580ee5cb8c76a14232c9a60b4287f2c07b3 not found: ID does not exist" Dec 16 06:58:51 crc kubenswrapper[4823]: I1216 06:58:51.111034 4823 scope.go:117] "RemoveContainer" containerID="3dad0a00dbe2891d542c2a68d616b2402c380a18e5be93fdfc977e5da88fdff5" Dec 16 06:58:51 crc kubenswrapper[4823]: E1216 06:58:51.111554 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3dad0a00dbe2891d542c2a68d616b2402c380a18e5be93fdfc977e5da88fdff5\": container with ID starting with 3dad0a00dbe2891d542c2a68d616b2402c380a18e5be93fdfc977e5da88fdff5 not found: ID does not exist" containerID="3dad0a00dbe2891d542c2a68d616b2402c380a18e5be93fdfc977e5da88fdff5" Dec 16 06:58:51 crc kubenswrapper[4823]: I1216 06:58:51.111591 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3dad0a00dbe2891d542c2a68d616b2402c380a18e5be93fdfc977e5da88fdff5"} err="failed to get container status \"3dad0a00dbe2891d542c2a68d616b2402c380a18e5be93fdfc977e5da88fdff5\": rpc error: code = NotFound desc = could not find container \"3dad0a00dbe2891d542c2a68d616b2402c380a18e5be93fdfc977e5da88fdff5\": container with ID starting with 3dad0a00dbe2891d542c2a68d616b2402c380a18e5be93fdfc977e5da88fdff5 not found: ID does not exist" Dec 16 06:58:51 crc kubenswrapper[4823]: I1216 06:58:51.306328 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-c98pm" Dec 16 06:58:51 crc kubenswrapper[4823]: I1216 06:58:51.368262 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5zgrf"] Dec 16 06:58:51 crc kubenswrapper[4823]: I1216 06:58:51.370829 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-5zgrf"] Dec 16 06:58:51 crc kubenswrapper[4823]: I1216 06:58:51.410421 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28fa6f6b-657a-4fe7-993f-6c97d5e53b3c-utilities\") pod \"28fa6f6b-657a-4fe7-993f-6c97d5e53b3c\" (UID: \"28fa6f6b-657a-4fe7-993f-6c97d5e53b3c\") " Dec 16 06:58:51 crc kubenswrapper[4823]: I1216 06:58:51.410513 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-75jkw\" (UniqueName: \"kubernetes.io/projected/28fa6f6b-657a-4fe7-993f-6c97d5e53b3c-kube-api-access-75jkw\") pod \"28fa6f6b-657a-4fe7-993f-6c97d5e53b3c\" (UID: \"28fa6f6b-657a-4fe7-993f-6c97d5e53b3c\") " Dec 16 06:58:51 crc kubenswrapper[4823]: I1216 06:58:51.410802 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28fa6f6b-657a-4fe7-993f-6c97d5e53b3c-catalog-content\") pod \"28fa6f6b-657a-4fe7-993f-6c97d5e53b3c\" (UID: \"28fa6f6b-657a-4fe7-993f-6c97d5e53b3c\") " Dec 16 06:58:51 crc kubenswrapper[4823]: I1216 06:58:51.411428 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28fa6f6b-657a-4fe7-993f-6c97d5e53b3c-utilities" (OuterVolumeSpecName: "utilities") pod "28fa6f6b-657a-4fe7-993f-6c97d5e53b3c" (UID: "28fa6f6b-657a-4fe7-993f-6c97d5e53b3c"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 06:58:51 crc kubenswrapper[4823]: I1216 06:58:51.412107 4823 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28fa6f6b-657a-4fe7-993f-6c97d5e53b3c-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 06:58:51 crc kubenswrapper[4823]: I1216 06:58:51.415673 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28fa6f6b-657a-4fe7-993f-6c97d5e53b3c-kube-api-access-75jkw" (OuterVolumeSpecName: "kube-api-access-75jkw") pod "28fa6f6b-657a-4fe7-993f-6c97d5e53b3c" (UID: "28fa6f6b-657a-4fe7-993f-6c97d5e53b3c"). InnerVolumeSpecName "kube-api-access-75jkw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:58:51 crc kubenswrapper[4823]: I1216 06:58:51.513812 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-75jkw\" (UniqueName: \"kubernetes.io/projected/28fa6f6b-657a-4fe7-993f-6c97d5e53b3c-kube-api-access-75jkw\") on node \"crc\" DevicePath \"\"" Dec 16 06:58:51 crc kubenswrapper[4823]: I1216 06:58:51.779255 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e8470ac-325b-46ce-ac5d-6cbafc3c6164" path="/var/lib/kubelet/pods/9e8470ac-325b-46ce-ac5d-6cbafc3c6164/volumes" Dec 16 06:58:51 crc kubenswrapper[4823]: I1216 06:58:51.879178 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28fa6f6b-657a-4fe7-993f-6c97d5e53b3c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "28fa6f6b-657a-4fe7-993f-6c97d5e53b3c" (UID: "28fa6f6b-657a-4fe7-993f-6c97d5e53b3c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 06:58:51 crc kubenswrapper[4823]: I1216 06:58:51.920740 4823 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28fa6f6b-657a-4fe7-993f-6c97d5e53b3c-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 06:58:52 crc kubenswrapper[4823]: I1216 06:58:52.057570 4823 generic.go:334] "Generic (PLEG): container finished" podID="ca6b042f-7b3a-4204-90a8-d6a2c29fd271" containerID="ceedf2c81a1224da174afd8fc392df551e5d91fc2406e342e80a98e7eea3fd1b" exitCode=0 Dec 16 06:58:52 crc kubenswrapper[4823]: I1216 06:58:52.057650 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hhp8l" event={"ID":"ca6b042f-7b3a-4204-90a8-d6a2c29fd271","Type":"ContainerDied","Data":"ceedf2c81a1224da174afd8fc392df551e5d91fc2406e342e80a98e7eea3fd1b"} Dec 16 06:58:52 crc kubenswrapper[4823]: I1216 06:58:52.064656 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c98pm" event={"ID":"28fa6f6b-657a-4fe7-993f-6c97d5e53b3c","Type":"ContainerDied","Data":"32ee30f88508d04fa1a8fb55016cf0250c2057be86416af7b71b89bb593021f0"} Dec 16 06:58:52 crc kubenswrapper[4823]: I1216 06:58:52.064664 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-c98pm" Dec 16 06:58:52 crc kubenswrapper[4823]: I1216 06:58:52.064947 4823 scope.go:117] "RemoveContainer" containerID="6309251f3188d55bf3a3266a0e518a0db145309612c14f3387a166ec4dd26685" Dec 16 06:58:52 crc kubenswrapper[4823]: I1216 06:58:52.082104 4823 scope.go:117] "RemoveContainer" containerID="21d29dea4eafb3a0b6eee47ea02ac5561a08f9dfb0279bd28d47b6148c5a4cdd" Dec 16 06:58:52 crc kubenswrapper[4823]: I1216 06:58:52.100470 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-c98pm"] Dec 16 06:58:52 crc kubenswrapper[4823]: I1216 06:58:52.106735 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-c98pm"] Dec 16 06:58:52 crc kubenswrapper[4823]: I1216 06:58:52.114988 4823 scope.go:117] "RemoveContainer" containerID="2a74898e74c7e810093cab0f1d311901b1a1948664709995aa7195175bc61a8b" Dec 16 06:58:53 crc kubenswrapper[4823]: I1216 06:58:53.778908 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28fa6f6b-657a-4fe7-993f-6c97d5e53b3c" path="/var/lib/kubelet/pods/28fa6f6b-657a-4fe7-993f-6c97d5e53b3c/volumes" Dec 16 06:58:54 crc kubenswrapper[4823]: I1216 06:58:54.077861 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hhp8l" event={"ID":"ca6b042f-7b3a-4204-90a8-d6a2c29fd271","Type":"ContainerStarted","Data":"c6318e6e99455f620a46c95ed584b312dbf4dc6f97a4c7b09556b4bdec832edc"} Dec 16 06:58:54 crc kubenswrapper[4823]: I1216 06:58:54.101753 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-hhp8l" podStartSLOduration=4.264482351 podStartE2EDuration="1m9.101725874s" podCreationTimestamp="2025-12-16 06:57:45 +0000 UTC" firstStartedPulling="2025-12-16 06:57:48.186450606 +0000 UTC m=+146.675016729" lastFinishedPulling="2025-12-16 06:58:53.023694129 +0000 UTC 
m=+211.512260252" observedRunningTime="2025-12-16 06:58:54.100222684 +0000 UTC m=+212.588788807" watchObservedRunningTime="2025-12-16 06:58:54.101725874 +0000 UTC m=+212.590291997" Dec 16 06:58:55 crc kubenswrapper[4823]: I1216 06:58:55.882091 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-hhp8l" Dec 16 06:58:55 crc kubenswrapper[4823]: I1216 06:58:55.882559 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-hhp8l" Dec 16 06:58:55 crc kubenswrapper[4823]: I1216 06:58:55.922858 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-hhp8l" Dec 16 06:58:58 crc kubenswrapper[4823]: I1216 06:58:58.133676 4823 patch_prober.go:28] interesting pod/machine-config-daemon-fv56f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 06:58:58 crc kubenswrapper[4823]: I1216 06:58:58.133767 4823 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 06:58:58 crc kubenswrapper[4823]: I1216 06:58:58.133832 4823 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" Dec 16 06:58:58 crc kubenswrapper[4823]: I1216 06:58:58.134464 4823 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"78aed5cdf8d3a29dfffcdabfd18b38c0cf76225853078e01602198bee4994536"} 
pod="openshift-machine-config-operator/machine-config-daemon-fv56f" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 16 06:58:58 crc kubenswrapper[4823]: I1216 06:58:58.134520 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" containerName="machine-config-daemon" containerID="cri-o://78aed5cdf8d3a29dfffcdabfd18b38c0cf76225853078e01602198bee4994536" gracePeriod=600 Dec 16 06:58:58 crc kubenswrapper[4823]: I1216 06:58:58.300374 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-vdctr" Dec 16 06:58:58 crc kubenswrapper[4823]: I1216 06:58:58.300850 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-vdctr" Dec 16 06:58:58 crc kubenswrapper[4823]: I1216 06:58:58.340579 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-vdctr" Dec 16 06:58:59 crc kubenswrapper[4823]: I1216 06:58:59.153245 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-vdctr" Dec 16 06:58:59 crc kubenswrapper[4823]: I1216 06:58:59.299059 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-tls89" Dec 16 06:58:59 crc kubenswrapper[4823]: I1216 06:58:59.299115 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-tls89" Dec 16 06:58:59 crc kubenswrapper[4823]: I1216 06:58:59.339251 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-tls89" Dec 16 06:59:00 crc kubenswrapper[4823]: I1216 06:59:00.122632 4823 generic.go:334] "Generic (PLEG): container finished" 
podID="25dec47c-3043-486c-b371-2be103c214e3" containerID="78aed5cdf8d3a29dfffcdabfd18b38c0cf76225853078e01602198bee4994536" exitCode=0 Dec 16 06:59:00 crc kubenswrapper[4823]: I1216 06:59:00.122698 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" event={"ID":"25dec47c-3043-486c-b371-2be103c214e3","Type":"ContainerDied","Data":"78aed5cdf8d3a29dfffcdabfd18b38c0cf76225853078e01602198bee4994536"} Dec 16 06:59:00 crc kubenswrapper[4823]: I1216 06:59:00.123232 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" event={"ID":"25dec47c-3043-486c-b371-2be103c214e3","Type":"ContainerStarted","Data":"3c937cb280cb5355361e27a4b204cc11ced2636f489f0b890dda44110baac59b"} Dec 16 06:59:00 crc kubenswrapper[4823]: I1216 06:59:00.222612 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-tls89" Dec 16 06:59:01 crc kubenswrapper[4823]: I1216 06:59:01.421420 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vdctr"] Dec 16 06:59:01 crc kubenswrapper[4823]: I1216 06:59:01.422151 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-vdctr" podUID="af944c3a-0424-490f-b445-b2ee72a3af0c" containerName="registry-server" containerID="cri-o://50d2f42221943fc2437f32b83283a4a1f7032f7b643f87d130aec0566786bcd8" gracePeriod=2 Dec 16 06:59:01 crc kubenswrapper[4823]: I1216 06:59:01.802450 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vdctr" Dec 16 06:59:01 crc kubenswrapper[4823]: I1216 06:59:01.882099 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qnrv2\" (UniqueName: \"kubernetes.io/projected/af944c3a-0424-490f-b445-b2ee72a3af0c-kube-api-access-qnrv2\") pod \"af944c3a-0424-490f-b445-b2ee72a3af0c\" (UID: \"af944c3a-0424-490f-b445-b2ee72a3af0c\") " Dec 16 06:59:01 crc kubenswrapper[4823]: I1216 06:59:01.882150 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af944c3a-0424-490f-b445-b2ee72a3af0c-utilities\") pod \"af944c3a-0424-490f-b445-b2ee72a3af0c\" (UID: \"af944c3a-0424-490f-b445-b2ee72a3af0c\") " Dec 16 06:59:01 crc kubenswrapper[4823]: I1216 06:59:01.882224 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af944c3a-0424-490f-b445-b2ee72a3af0c-catalog-content\") pod \"af944c3a-0424-490f-b445-b2ee72a3af0c\" (UID: \"af944c3a-0424-490f-b445-b2ee72a3af0c\") " Dec 16 06:59:01 crc kubenswrapper[4823]: I1216 06:59:01.883395 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af944c3a-0424-490f-b445-b2ee72a3af0c-utilities" (OuterVolumeSpecName: "utilities") pod "af944c3a-0424-490f-b445-b2ee72a3af0c" (UID: "af944c3a-0424-490f-b445-b2ee72a3af0c"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 06:59:01 crc kubenswrapper[4823]: I1216 06:59:01.883798 4823 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af944c3a-0424-490f-b445-b2ee72a3af0c-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 06:59:01 crc kubenswrapper[4823]: I1216 06:59:01.887588 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af944c3a-0424-490f-b445-b2ee72a3af0c-kube-api-access-qnrv2" (OuterVolumeSpecName: "kube-api-access-qnrv2") pod "af944c3a-0424-490f-b445-b2ee72a3af0c" (UID: "af944c3a-0424-490f-b445-b2ee72a3af0c"). InnerVolumeSpecName "kube-api-access-qnrv2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:59:01 crc kubenswrapper[4823]: I1216 06:59:01.901091 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af944c3a-0424-490f-b445-b2ee72a3af0c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "af944c3a-0424-490f-b445-b2ee72a3af0c" (UID: "af944c3a-0424-490f-b445-b2ee72a3af0c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 06:59:01 crc kubenswrapper[4823]: I1216 06:59:01.985206 4823 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af944c3a-0424-490f-b445-b2ee72a3af0c-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 06:59:01 crc kubenswrapper[4823]: I1216 06:59:01.985316 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qnrv2\" (UniqueName: \"kubernetes.io/projected/af944c3a-0424-490f-b445-b2ee72a3af0c-kube-api-access-qnrv2\") on node \"crc\" DevicePath \"\"" Dec 16 06:59:02 crc kubenswrapper[4823]: I1216 06:59:02.136558 4823 generic.go:334] "Generic (PLEG): container finished" podID="af944c3a-0424-490f-b445-b2ee72a3af0c" containerID="50d2f42221943fc2437f32b83283a4a1f7032f7b643f87d130aec0566786bcd8" exitCode=0 Dec 16 06:59:02 crc kubenswrapper[4823]: I1216 06:59:02.136627 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vdctr" event={"ID":"af944c3a-0424-490f-b445-b2ee72a3af0c","Type":"ContainerDied","Data":"50d2f42221943fc2437f32b83283a4a1f7032f7b643f87d130aec0566786bcd8"} Dec 16 06:59:02 crc kubenswrapper[4823]: I1216 06:59:02.136640 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vdctr" Dec 16 06:59:02 crc kubenswrapper[4823]: I1216 06:59:02.136661 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vdctr" event={"ID":"af944c3a-0424-490f-b445-b2ee72a3af0c","Type":"ContainerDied","Data":"d23649003adb8d2768086f8360b07f8324e52d201d0d9bcff7f95c1ea98f3ac3"} Dec 16 06:59:02 crc kubenswrapper[4823]: I1216 06:59:02.136680 4823 scope.go:117] "RemoveContainer" containerID="50d2f42221943fc2437f32b83283a4a1f7032f7b643f87d130aec0566786bcd8" Dec 16 06:59:02 crc kubenswrapper[4823]: I1216 06:59:02.157120 4823 scope.go:117] "RemoveContainer" containerID="d9309e9b8bd4c700c73627946665b9a73b24d9473ff8969ffeb5b3fce67e1374" Dec 16 06:59:02 crc kubenswrapper[4823]: I1216 06:59:02.171444 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vdctr"] Dec 16 06:59:02 crc kubenswrapper[4823]: I1216 06:59:02.177181 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-vdctr"] Dec 16 06:59:02 crc kubenswrapper[4823]: I1216 06:59:02.194657 4823 scope.go:117] "RemoveContainer" containerID="cdd3d4b6177b6d1fc58f7786a38e95541617a2ae1c29265bce4c69df97cf2d25" Dec 16 06:59:02 crc kubenswrapper[4823]: I1216 06:59:02.210998 4823 scope.go:117] "RemoveContainer" containerID="50d2f42221943fc2437f32b83283a4a1f7032f7b643f87d130aec0566786bcd8" Dec 16 06:59:02 crc kubenswrapper[4823]: E1216 06:59:02.212161 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50d2f42221943fc2437f32b83283a4a1f7032f7b643f87d130aec0566786bcd8\": container with ID starting with 50d2f42221943fc2437f32b83283a4a1f7032f7b643f87d130aec0566786bcd8 not found: ID does not exist" containerID="50d2f42221943fc2437f32b83283a4a1f7032f7b643f87d130aec0566786bcd8" Dec 16 06:59:02 crc kubenswrapper[4823]: I1216 06:59:02.212213 4823 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50d2f42221943fc2437f32b83283a4a1f7032f7b643f87d130aec0566786bcd8"} err="failed to get container status \"50d2f42221943fc2437f32b83283a4a1f7032f7b643f87d130aec0566786bcd8\": rpc error: code = NotFound desc = could not find container \"50d2f42221943fc2437f32b83283a4a1f7032f7b643f87d130aec0566786bcd8\": container with ID starting with 50d2f42221943fc2437f32b83283a4a1f7032f7b643f87d130aec0566786bcd8 not found: ID does not exist" Dec 16 06:59:02 crc kubenswrapper[4823]: I1216 06:59:02.212254 4823 scope.go:117] "RemoveContainer" containerID="d9309e9b8bd4c700c73627946665b9a73b24d9473ff8969ffeb5b3fce67e1374" Dec 16 06:59:02 crc kubenswrapper[4823]: E1216 06:59:02.212664 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9309e9b8bd4c700c73627946665b9a73b24d9473ff8969ffeb5b3fce67e1374\": container with ID starting with d9309e9b8bd4c700c73627946665b9a73b24d9473ff8969ffeb5b3fce67e1374 not found: ID does not exist" containerID="d9309e9b8bd4c700c73627946665b9a73b24d9473ff8969ffeb5b3fce67e1374" Dec 16 06:59:02 crc kubenswrapper[4823]: I1216 06:59:02.212687 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9309e9b8bd4c700c73627946665b9a73b24d9473ff8969ffeb5b3fce67e1374"} err="failed to get container status \"d9309e9b8bd4c700c73627946665b9a73b24d9473ff8969ffeb5b3fce67e1374\": rpc error: code = NotFound desc = could not find container \"d9309e9b8bd4c700c73627946665b9a73b24d9473ff8969ffeb5b3fce67e1374\": container with ID starting with d9309e9b8bd4c700c73627946665b9a73b24d9473ff8969ffeb5b3fce67e1374 not found: ID does not exist" Dec 16 06:59:02 crc kubenswrapper[4823]: I1216 06:59:02.212699 4823 scope.go:117] "RemoveContainer" containerID="cdd3d4b6177b6d1fc58f7786a38e95541617a2ae1c29265bce4c69df97cf2d25" Dec 16 06:59:02 crc kubenswrapper[4823]: E1216 
06:59:02.213275 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cdd3d4b6177b6d1fc58f7786a38e95541617a2ae1c29265bce4c69df97cf2d25\": container with ID starting with cdd3d4b6177b6d1fc58f7786a38e95541617a2ae1c29265bce4c69df97cf2d25 not found: ID does not exist" containerID="cdd3d4b6177b6d1fc58f7786a38e95541617a2ae1c29265bce4c69df97cf2d25" Dec 16 06:59:02 crc kubenswrapper[4823]: I1216 06:59:02.213336 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cdd3d4b6177b6d1fc58f7786a38e95541617a2ae1c29265bce4c69df97cf2d25"} err="failed to get container status \"cdd3d4b6177b6d1fc58f7786a38e95541617a2ae1c29265bce4c69df97cf2d25\": rpc error: code = NotFound desc = could not find container \"cdd3d4b6177b6d1fc58f7786a38e95541617a2ae1c29265bce4c69df97cf2d25\": container with ID starting with cdd3d4b6177b6d1fc58f7786a38e95541617a2ae1c29265bce4c69df97cf2d25 not found: ID does not exist" Dec 16 06:59:03 crc kubenswrapper[4823]: I1216 06:59:03.784873 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af944c3a-0424-490f-b445-b2ee72a3af0c" path="/var/lib/kubelet/pods/af944c3a-0424-490f-b445-b2ee72a3af0c/volumes" Dec 16 06:59:03 crc kubenswrapper[4823]: I1216 06:59:03.828373 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tls89"] Dec 16 06:59:03 crc kubenswrapper[4823]: I1216 06:59:03.829163 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-tls89" podUID="f658a770-db01-4cb4-8d83-e6dc10513860" containerName="registry-server" containerID="cri-o://02fdc5933120c2a4e49cf90b63a971410755c06f057e0a6b76618dfa66622202" gracePeriod=2 Dec 16 06:59:04 crc kubenswrapper[4823]: I1216 06:59:04.166431 4823 generic.go:334] "Generic (PLEG): container finished" podID="f658a770-db01-4cb4-8d83-e6dc10513860" 
containerID="02fdc5933120c2a4e49cf90b63a971410755c06f057e0a6b76618dfa66622202" exitCode=0 Dec 16 06:59:04 crc kubenswrapper[4823]: I1216 06:59:04.166964 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tls89" event={"ID":"f658a770-db01-4cb4-8d83-e6dc10513860","Type":"ContainerDied","Data":"02fdc5933120c2a4e49cf90b63a971410755c06f057e0a6b76618dfa66622202"} Dec 16 06:59:04 crc kubenswrapper[4823]: I1216 06:59:04.257857 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tls89" Dec 16 06:59:04 crc kubenswrapper[4823]: I1216 06:59:04.323097 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9n9fs\" (UniqueName: \"kubernetes.io/projected/f658a770-db01-4cb4-8d83-e6dc10513860-kube-api-access-9n9fs\") pod \"f658a770-db01-4cb4-8d83-e6dc10513860\" (UID: \"f658a770-db01-4cb4-8d83-e6dc10513860\") " Dec 16 06:59:04 crc kubenswrapper[4823]: I1216 06:59:04.323238 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f658a770-db01-4cb4-8d83-e6dc10513860-catalog-content\") pod \"f658a770-db01-4cb4-8d83-e6dc10513860\" (UID: \"f658a770-db01-4cb4-8d83-e6dc10513860\") " Dec 16 06:59:04 crc kubenswrapper[4823]: I1216 06:59:04.323273 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f658a770-db01-4cb4-8d83-e6dc10513860-utilities\") pod \"f658a770-db01-4cb4-8d83-e6dc10513860\" (UID: \"f658a770-db01-4cb4-8d83-e6dc10513860\") " Dec 16 06:59:04 crc kubenswrapper[4823]: I1216 06:59:04.324465 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f658a770-db01-4cb4-8d83-e6dc10513860-utilities" (OuterVolumeSpecName: "utilities") pod "f658a770-db01-4cb4-8d83-e6dc10513860" (UID: 
"f658a770-db01-4cb4-8d83-e6dc10513860"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 06:59:04 crc kubenswrapper[4823]: I1216 06:59:04.324801 4823 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f658a770-db01-4cb4-8d83-e6dc10513860-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 06:59:04 crc kubenswrapper[4823]: I1216 06:59:04.333562 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f658a770-db01-4cb4-8d83-e6dc10513860-kube-api-access-9n9fs" (OuterVolumeSpecName: "kube-api-access-9n9fs") pod "f658a770-db01-4cb4-8d83-e6dc10513860" (UID: "f658a770-db01-4cb4-8d83-e6dc10513860"). InnerVolumeSpecName "kube-api-access-9n9fs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:59:04 crc kubenswrapper[4823]: I1216 06:59:04.426276 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9n9fs\" (UniqueName: \"kubernetes.io/projected/f658a770-db01-4cb4-8d83-e6dc10513860-kube-api-access-9n9fs\") on node \"crc\" DevicePath \"\"" Dec 16 06:59:04 crc kubenswrapper[4823]: I1216 06:59:04.452247 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f658a770-db01-4cb4-8d83-e6dc10513860-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f658a770-db01-4cb4-8d83-e6dc10513860" (UID: "f658a770-db01-4cb4-8d83-e6dc10513860"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 06:59:04 crc kubenswrapper[4823]: I1216 06:59:04.527321 4823 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f658a770-db01-4cb4-8d83-e6dc10513860-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 06:59:05 crc kubenswrapper[4823]: I1216 06:59:05.175867 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tls89" event={"ID":"f658a770-db01-4cb4-8d83-e6dc10513860","Type":"ContainerDied","Data":"6836c95e4b4b6dd3dfe6800dc3876bf08648b3fbc717d4ba326de4e9e9877b6c"} Dec 16 06:59:05 crc kubenswrapper[4823]: I1216 06:59:05.177431 4823 scope.go:117] "RemoveContainer" containerID="02fdc5933120c2a4e49cf90b63a971410755c06f057e0a6b76618dfa66622202" Dec 16 06:59:05 crc kubenswrapper[4823]: I1216 06:59:05.175929 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tls89" Dec 16 06:59:05 crc kubenswrapper[4823]: I1216 06:59:05.196896 4823 scope.go:117] "RemoveContainer" containerID="6f78f1142f7e79c8c4a764df250255e93523ee22c7bf60d944ac018ed366566e" Dec 16 06:59:05 crc kubenswrapper[4823]: I1216 06:59:05.214357 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tls89"] Dec 16 06:59:05 crc kubenswrapper[4823]: I1216 06:59:05.220792 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-tls89"] Dec 16 06:59:05 crc kubenswrapper[4823]: I1216 06:59:05.228475 4823 scope.go:117] "RemoveContainer" containerID="05d0aa39fba1d1a8df245990a2dbde6985a8939f63c5225d847104d6171d7cec" Dec 16 06:59:05 crc kubenswrapper[4823]: I1216 06:59:05.779735 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f658a770-db01-4cb4-8d83-e6dc10513860" path="/var/lib/kubelet/pods/f658a770-db01-4cb4-8d83-e6dc10513860/volumes" Dec 16 06:59:05 crc 
kubenswrapper[4823]: I1216 06:59:05.933397 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-hhp8l" Dec 16 06:59:06 crc kubenswrapper[4823]: I1216 06:59:06.017834 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-6xfbm"] Dec 16 06:59:09 crc kubenswrapper[4823]: I1216 06:59:09.302219 4823 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 16 06:59:09 crc kubenswrapper[4823]: I1216 06:59:09.303434 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://c33a30aed9756ef012ff9046309ec07897de3b17707ea39c621534ed863c3adf" gracePeriod=15 Dec 16 06:59:09 crc kubenswrapper[4823]: I1216 06:59:09.303538 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://c84e7f6950c8d433823b7d5ae3d8a9ecaa70ab6845ce607b8bb93efa990325a5" gracePeriod=15 Dec 16 06:59:09 crc kubenswrapper[4823]: I1216 06:59:09.303626 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://464ab3cf14920ae3f947ce3dccacd7379d50380699008411d0a51bd83fe3e51c" gracePeriod=15 Dec 16 06:59:09 crc kubenswrapper[4823]: I1216 06:59:09.303734 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" 
containerID="cri-o://a20f7e0e3c07f467643afdcc6f0e333353d2d10b6d0c0d7aff0d347d33a5d925" gracePeriod=15 Dec 16 06:59:09 crc kubenswrapper[4823]: I1216 06:59:09.303831 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://7ddfb7d6dae98465369260c8e9c64a2aa21dbf226e7898f6de0c36b3806bcae8" gracePeriod=15 Dec 16 06:59:09 crc kubenswrapper[4823]: I1216 06:59:09.304615 4823 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 16 06:59:09 crc kubenswrapper[4823]: E1216 06:59:09.304925 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28fa6f6b-657a-4fe7-993f-6c97d5e53b3c" containerName="extract-content" Dec 16 06:59:09 crc kubenswrapper[4823]: I1216 06:59:09.304948 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="28fa6f6b-657a-4fe7-993f-6c97d5e53b3c" containerName="extract-content" Dec 16 06:59:09 crc kubenswrapper[4823]: E1216 06:59:09.304959 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28fa6f6b-657a-4fe7-993f-6c97d5e53b3c" containerName="extract-utilities" Dec 16 06:59:09 crc kubenswrapper[4823]: I1216 06:59:09.304966 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="28fa6f6b-657a-4fe7-993f-6c97d5e53b3c" containerName="extract-utilities" Dec 16 06:59:09 crc kubenswrapper[4823]: E1216 06:59:09.304977 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e8470ac-325b-46ce-ac5d-6cbafc3c6164" containerName="extract-content" Dec 16 06:59:09 crc kubenswrapper[4823]: I1216 06:59:09.304983 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e8470ac-325b-46ce-ac5d-6cbafc3c6164" containerName="extract-content" Dec 16 06:59:09 crc kubenswrapper[4823]: E1216 06:59:09.304995 4823 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="28fa6f6b-657a-4fe7-993f-6c97d5e53b3c" containerName="registry-server" Dec 16 06:59:09 crc kubenswrapper[4823]: I1216 06:59:09.305002 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="28fa6f6b-657a-4fe7-993f-6c97d5e53b3c" containerName="registry-server" Dec 16 06:59:09 crc kubenswrapper[4823]: E1216 06:59:09.305011 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 16 06:59:09 crc kubenswrapper[4823]: I1216 06:59:09.305019 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 16 06:59:09 crc kubenswrapper[4823]: E1216 06:59:09.305042 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 16 06:59:09 crc kubenswrapper[4823]: I1216 06:59:09.305053 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 16 06:59:09 crc kubenswrapper[4823]: E1216 06:59:09.305067 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af944c3a-0424-490f-b445-b2ee72a3af0c" containerName="extract-content" Dec 16 06:59:09 crc kubenswrapper[4823]: I1216 06:59:09.305077 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="af944c3a-0424-490f-b445-b2ee72a3af0c" containerName="extract-content" Dec 16 06:59:09 crc kubenswrapper[4823]: E1216 06:59:09.305086 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e8470ac-325b-46ce-ac5d-6cbafc3c6164" containerName="extract-utilities" Dec 16 06:59:09 crc kubenswrapper[4823]: I1216 06:59:09.305092 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e8470ac-325b-46ce-ac5d-6cbafc3c6164" containerName="extract-utilities" Dec 16 06:59:09 crc kubenswrapper[4823]: E1216 06:59:09.305102 4823 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="af944c3a-0424-490f-b445-b2ee72a3af0c" containerName="extract-utilities" Dec 16 06:59:09 crc kubenswrapper[4823]: I1216 06:59:09.305109 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="af944c3a-0424-490f-b445-b2ee72a3af0c" containerName="extract-utilities" Dec 16 06:59:09 crc kubenswrapper[4823]: E1216 06:59:09.305120 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e8470ac-325b-46ce-ac5d-6cbafc3c6164" containerName="registry-server" Dec 16 06:59:09 crc kubenswrapper[4823]: I1216 06:59:09.305127 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e8470ac-325b-46ce-ac5d-6cbafc3c6164" containerName="registry-server" Dec 16 06:59:09 crc kubenswrapper[4823]: E1216 06:59:09.305139 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 16 06:59:09 crc kubenswrapper[4823]: I1216 06:59:09.305147 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 16 06:59:09 crc kubenswrapper[4823]: E1216 06:59:09.305158 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 16 06:59:09 crc kubenswrapper[4823]: I1216 06:59:09.305164 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 16 06:59:09 crc kubenswrapper[4823]: E1216 06:59:09.305175 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f658a770-db01-4cb4-8d83-e6dc10513860" containerName="extract-utilities" Dec 16 06:59:09 crc kubenswrapper[4823]: I1216 06:59:09.305181 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="f658a770-db01-4cb4-8d83-e6dc10513860" containerName="extract-utilities" Dec 16 06:59:09 crc kubenswrapper[4823]: E1216 06:59:09.305189 4823 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f658a770-db01-4cb4-8d83-e6dc10513860" containerName="registry-server" Dec 16 06:59:09 crc kubenswrapper[4823]: I1216 06:59:09.305195 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="f658a770-db01-4cb4-8d83-e6dc10513860" containerName="registry-server" Dec 16 06:59:09 crc kubenswrapper[4823]: E1216 06:59:09.305201 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af944c3a-0424-490f-b445-b2ee72a3af0c" containerName="registry-server" Dec 16 06:59:09 crc kubenswrapper[4823]: I1216 06:59:09.305207 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="af944c3a-0424-490f-b445-b2ee72a3af0c" containerName="registry-server" Dec 16 06:59:09 crc kubenswrapper[4823]: E1216 06:59:09.305215 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 16 06:59:09 crc kubenswrapper[4823]: I1216 06:59:09.305220 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 16 06:59:09 crc kubenswrapper[4823]: E1216 06:59:09.305228 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f658a770-db01-4cb4-8d83-e6dc10513860" containerName="extract-content" Dec 16 06:59:09 crc kubenswrapper[4823]: I1216 06:59:09.305234 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="f658a770-db01-4cb4-8d83-e6dc10513860" containerName="extract-content" Dec 16 06:59:09 crc kubenswrapper[4823]: E1216 06:59:09.305243 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 16 06:59:09 crc kubenswrapper[4823]: I1216 06:59:09.305249 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 16 06:59:09 crc kubenswrapper[4823]: I1216 06:59:09.305374 4823 
memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 16 06:59:09 crc kubenswrapper[4823]: I1216 06:59:09.305386 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 16 06:59:09 crc kubenswrapper[4823]: I1216 06:59:09.305400 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="28fa6f6b-657a-4fe7-993f-6c97d5e53b3c" containerName="registry-server" Dec 16 06:59:09 crc kubenswrapper[4823]: I1216 06:59:09.305409 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="f658a770-db01-4cb4-8d83-e6dc10513860" containerName="registry-server" Dec 16 06:59:09 crc kubenswrapper[4823]: I1216 06:59:09.305420 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 16 06:59:09 crc kubenswrapper[4823]: I1216 06:59:09.305429 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="af944c3a-0424-490f-b445-b2ee72a3af0c" containerName="registry-server" Dec 16 06:59:09 crc kubenswrapper[4823]: I1216 06:59:09.305438 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 16 06:59:09 crc kubenswrapper[4823]: I1216 06:59:09.305446 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e8470ac-325b-46ce-ac5d-6cbafc3c6164" containerName="registry-server" Dec 16 06:59:09 crc kubenswrapper[4823]: I1216 06:59:09.305454 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 16 06:59:09 crc kubenswrapper[4823]: I1216 06:59:09.305464 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-cert-syncer" Dec 16 06:59:09 crc kubenswrapper[4823]: E1216 06:59:09.305557 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 16 06:59:09 crc kubenswrapper[4823]: I1216 06:59:09.305563 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 16 06:59:09 crc kubenswrapper[4823]: I1216 06:59:09.307082 4823 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 16 06:59:09 crc kubenswrapper[4823]: I1216 06:59:09.307802 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 16 06:59:09 crc kubenswrapper[4823]: I1216 06:59:09.313051 4823 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="f4b27818a5e8e43d0dc095d08835c792" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" Dec 16 06:59:09 crc kubenswrapper[4823]: I1216 06:59:09.400545 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 06:59:09 crc kubenswrapper[4823]: I1216 06:59:09.400621 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 16 06:59:09 crc kubenswrapper[4823]: I1216 
06:59:09.400651 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 16 06:59:09 crc kubenswrapper[4823]: I1216 06:59:09.400683 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 16 06:59:09 crc kubenswrapper[4823]: I1216 06:59:09.400715 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 06:59:09 crc kubenswrapper[4823]: I1216 06:59:09.400766 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 16 06:59:09 crc kubenswrapper[4823]: I1216 06:59:09.400838 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 06:59:09 crc 
kubenswrapper[4823]: I1216 06:59:09.400925 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 16 06:59:09 crc kubenswrapper[4823]: I1216 06:59:09.502319 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 06:59:09 crc kubenswrapper[4823]: I1216 06:59:09.502381 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 16 06:59:09 crc kubenswrapper[4823]: I1216 06:59:09.502414 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 06:59:09 crc kubenswrapper[4823]: I1216 06:59:09.502453 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 16 06:59:09 crc kubenswrapper[4823]: I1216 06:59:09.502464 
4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 16 06:59:09 crc kubenswrapper[4823]: I1216 06:59:09.502506 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 06:59:09 crc kubenswrapper[4823]: I1216 06:59:09.502538 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 16 06:59:09 crc kubenswrapper[4823]: I1216 06:59:09.502477 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 16 06:59:09 crc kubenswrapper[4823]: I1216 06:59:09.502594 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 16 06:59:09 crc kubenswrapper[4823]: I1216 06:59:09.502507 4823 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 16 06:59:09 crc kubenswrapper[4823]: I1216 06:59:09.502632 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 06:59:09 crc kubenswrapper[4823]: I1216 06:59:09.502654 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 16 06:59:09 crc kubenswrapper[4823]: I1216 06:59:09.502687 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 06:59:09 crc kubenswrapper[4823]: I1216 06:59:09.502465 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 06:59:09 crc kubenswrapper[4823]: I1216 06:59:09.502766 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod 
\"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 16 06:59:09 crc kubenswrapper[4823]: I1216 06:59:09.502889 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 16 06:59:10 crc kubenswrapper[4823]: I1216 06:59:10.205616 4823 generic.go:334] "Generic (PLEG): container finished" podID="fa03a70b-1afb-4d1b-8bec-7a302a382b7d" containerID="eea4b81cf12a679633c6ece2bf1aeaa5831801a214e1bd4f0daf014b20cc7157" exitCode=0 Dec 16 06:59:10 crc kubenswrapper[4823]: I1216 06:59:10.205690 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"fa03a70b-1afb-4d1b-8bec-7a302a382b7d","Type":"ContainerDied","Data":"eea4b81cf12a679633c6ece2bf1aeaa5831801a214e1bd4f0daf014b20cc7157"} Dec 16 06:59:10 crc kubenswrapper[4823]: I1216 06:59:10.207078 4823 status_manager.go:851] "Failed to get status for pod" podUID="fa03a70b-1afb-4d1b-8bec-7a302a382b7d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.180:6443: connect: connection refused" Dec 16 06:59:10 crc kubenswrapper[4823]: I1216 06:59:10.210110 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 16 06:59:10 crc kubenswrapper[4823]: I1216 06:59:10.212660 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 16 
06:59:10 crc kubenswrapper[4823]: I1216 06:59:10.213834 4823 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="7ddfb7d6dae98465369260c8e9c64a2aa21dbf226e7898f6de0c36b3806bcae8" exitCode=0 Dec 16 06:59:10 crc kubenswrapper[4823]: I1216 06:59:10.213887 4823 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c84e7f6950c8d433823b7d5ae3d8a9ecaa70ab6845ce607b8bb93efa990325a5" exitCode=0 Dec 16 06:59:10 crc kubenswrapper[4823]: I1216 06:59:10.213908 4823 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="464ab3cf14920ae3f947ce3dccacd7379d50380699008411d0a51bd83fe3e51c" exitCode=0 Dec 16 06:59:10 crc kubenswrapper[4823]: I1216 06:59:10.213928 4823 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="a20f7e0e3c07f467643afdcc6f0e333353d2d10b6d0c0d7aff0d347d33a5d925" exitCode=2 Dec 16 06:59:10 crc kubenswrapper[4823]: I1216 06:59:10.213959 4823 scope.go:117] "RemoveContainer" containerID="9c9e6ec333ac552dc972c31cce23104cc27bcc2cf11bdc52d89ab36172915a41" Dec 16 06:59:11 crc kubenswrapper[4823]: I1216 06:59:11.222595 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 16 06:59:11 crc kubenswrapper[4823]: I1216 06:59:11.667861 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 16 06:59:11 crc kubenswrapper[4823]: I1216 06:59:11.669716 4823 status_manager.go:851] "Failed to get status for pod" podUID="fa03a70b-1afb-4d1b-8bec-7a302a382b7d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.180:6443: connect: connection refused" Dec 16 06:59:11 crc kubenswrapper[4823]: I1216 06:59:11.673553 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 16 06:59:11 crc kubenswrapper[4823]: I1216 06:59:11.674311 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 06:59:11 crc kubenswrapper[4823]: I1216 06:59:11.675068 4823 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.180:6443: connect: connection refused" Dec 16 06:59:11 crc kubenswrapper[4823]: I1216 06:59:11.675455 4823 status_manager.go:851] "Failed to get status for pod" podUID="fa03a70b-1afb-4d1b-8bec-7a302a382b7d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.180:6443: connect: connection refused" Dec 16 06:59:11 crc kubenswrapper[4823]: I1216 06:59:11.739683 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 16 
06:59:11 crc kubenswrapper[4823]: I1216 06:59:11.739886 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 06:59:11 crc kubenswrapper[4823]: I1216 06:59:11.739883 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/fa03a70b-1afb-4d1b-8bec-7a302a382b7d-var-lock\") pod \"fa03a70b-1afb-4d1b-8bec-7a302a382b7d\" (UID: \"fa03a70b-1afb-4d1b-8bec-7a302a382b7d\") " Dec 16 06:59:11 crc kubenswrapper[4823]: I1216 06:59:11.739995 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa03a70b-1afb-4d1b-8bec-7a302a382b7d-var-lock" (OuterVolumeSpecName: "var-lock") pod "fa03a70b-1afb-4d1b-8bec-7a302a382b7d" (UID: "fa03a70b-1afb-4d1b-8bec-7a302a382b7d"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 06:59:11 crc kubenswrapper[4823]: I1216 06:59:11.740134 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fa03a70b-1afb-4d1b-8bec-7a302a382b7d-kubelet-dir\") pod \"fa03a70b-1afb-4d1b-8bec-7a302a382b7d\" (UID: \"fa03a70b-1afb-4d1b-8bec-7a302a382b7d\") " Dec 16 06:59:11 crc kubenswrapper[4823]: I1216 06:59:11.740227 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa03a70b-1afb-4d1b-8bec-7a302a382b7d-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "fa03a70b-1afb-4d1b-8bec-7a302a382b7d" (UID: "fa03a70b-1afb-4d1b-8bec-7a302a382b7d"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 06:59:11 crc kubenswrapper[4823]: I1216 06:59:11.740405 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fa03a70b-1afb-4d1b-8bec-7a302a382b7d-kube-api-access\") pod \"fa03a70b-1afb-4d1b-8bec-7a302a382b7d\" (UID: \"fa03a70b-1afb-4d1b-8bec-7a302a382b7d\") " Dec 16 06:59:11 crc kubenswrapper[4823]: I1216 06:59:11.740539 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 16 06:59:11 crc kubenswrapper[4823]: I1216 06:59:11.740642 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 06:59:11 crc kubenswrapper[4823]: I1216 06:59:11.740787 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 16 06:59:11 crc kubenswrapper[4823]: I1216 06:59:11.740898 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 06:59:11 crc kubenswrapper[4823]: I1216 06:59:11.741672 4823 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 16 06:59:11 crc kubenswrapper[4823]: I1216 06:59:11.741723 4823 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 16 06:59:11 crc kubenswrapper[4823]: I1216 06:59:11.741748 4823 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Dec 16 06:59:11 crc kubenswrapper[4823]: I1216 06:59:11.741770 4823 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/fa03a70b-1afb-4d1b-8bec-7a302a382b7d-var-lock\") on node \"crc\" DevicePath \"\"" Dec 16 06:59:11 crc kubenswrapper[4823]: I1216 06:59:11.741792 4823 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fa03a70b-1afb-4d1b-8bec-7a302a382b7d-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 16 06:59:11 crc kubenswrapper[4823]: I1216 06:59:11.747497 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa03a70b-1afb-4d1b-8bec-7a302a382b7d-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "fa03a70b-1afb-4d1b-8bec-7a302a382b7d" (UID: "fa03a70b-1afb-4d1b-8bec-7a302a382b7d"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:59:11 crc kubenswrapper[4823]: I1216 06:59:11.776192 4823 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.180:6443: connect: connection refused" Dec 16 06:59:11 crc kubenswrapper[4823]: I1216 06:59:11.776699 4823 status_manager.go:851] "Failed to get status for pod" podUID="fa03a70b-1afb-4d1b-8bec-7a302a382b7d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.180:6443: connect: connection refused" Dec 16 06:59:11 crc kubenswrapper[4823]: I1216 06:59:11.782577 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Dec 16 06:59:11 crc kubenswrapper[4823]: I1216 06:59:11.843470 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fa03a70b-1afb-4d1b-8bec-7a302a382b7d-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 16 06:59:12 crc kubenswrapper[4823]: I1216 06:59:12.233662 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 16 06:59:12 crc kubenswrapper[4823]: I1216 06:59:12.236376 4823 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c33a30aed9756ef012ff9046309ec07897de3b17707ea39c621534ed863c3adf" exitCode=0 Dec 16 06:59:12 crc kubenswrapper[4823]: I1216 06:59:12.236511 4823 scope.go:117] "RemoveContainer" 
containerID="7ddfb7d6dae98465369260c8e9c64a2aa21dbf226e7898f6de0c36b3806bcae8" Dec 16 06:59:12 crc kubenswrapper[4823]: I1216 06:59:12.236646 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 06:59:12 crc kubenswrapper[4823]: I1216 06:59:12.238434 4823 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.180:6443: connect: connection refused" Dec 16 06:59:12 crc kubenswrapper[4823]: I1216 06:59:12.239336 4823 status_manager.go:851] "Failed to get status for pod" podUID="fa03a70b-1afb-4d1b-8bec-7a302a382b7d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.180:6443: connect: connection refused" Dec 16 06:59:12 crc kubenswrapper[4823]: I1216 06:59:12.239917 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"fa03a70b-1afb-4d1b-8bec-7a302a382b7d","Type":"ContainerDied","Data":"9256445e61a3048aa3b4f5328a9770835773848cf6eb4e55a305421bd422fdf3"} Dec 16 06:59:12 crc kubenswrapper[4823]: I1216 06:59:12.240053 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 16 06:59:12 crc kubenswrapper[4823]: I1216 06:59:12.240184 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9256445e61a3048aa3b4f5328a9770835773848cf6eb4e55a305421bd422fdf3" Dec 16 06:59:12 crc kubenswrapper[4823]: I1216 06:59:12.244743 4823 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.180:6443: connect: connection refused" Dec 16 06:59:12 crc kubenswrapper[4823]: I1216 06:59:12.245109 4823 status_manager.go:851] "Failed to get status for pod" podUID="fa03a70b-1afb-4d1b-8bec-7a302a382b7d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.180:6443: connect: connection refused" Dec 16 06:59:12 crc kubenswrapper[4823]: I1216 06:59:12.251143 4823 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.180:6443: connect: connection refused" Dec 16 06:59:12 crc kubenswrapper[4823]: I1216 06:59:12.251872 4823 status_manager.go:851] "Failed to get status for pod" podUID="fa03a70b-1afb-4d1b-8bec-7a302a382b7d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.180:6443: connect: connection refused" Dec 16 06:59:12 crc kubenswrapper[4823]: I1216 06:59:12.257144 4823 scope.go:117] "RemoveContainer" 
containerID="c84e7f6950c8d433823b7d5ae3d8a9ecaa70ab6845ce607b8bb93efa990325a5" Dec 16 06:59:12 crc kubenswrapper[4823]: I1216 06:59:12.274867 4823 scope.go:117] "RemoveContainer" containerID="464ab3cf14920ae3f947ce3dccacd7379d50380699008411d0a51bd83fe3e51c" Dec 16 06:59:12 crc kubenswrapper[4823]: I1216 06:59:12.299610 4823 scope.go:117] "RemoveContainer" containerID="a20f7e0e3c07f467643afdcc6f0e333353d2d10b6d0c0d7aff0d347d33a5d925" Dec 16 06:59:12 crc kubenswrapper[4823]: I1216 06:59:12.323813 4823 scope.go:117] "RemoveContainer" containerID="c33a30aed9756ef012ff9046309ec07897de3b17707ea39c621534ed863c3adf" Dec 16 06:59:12 crc kubenswrapper[4823]: I1216 06:59:12.348966 4823 scope.go:117] "RemoveContainer" containerID="84f11d2687959c8525b131966a998af137caa69a480bb31a94e1615a3ccbea5d" Dec 16 06:59:12 crc kubenswrapper[4823]: I1216 06:59:12.376033 4823 scope.go:117] "RemoveContainer" containerID="7ddfb7d6dae98465369260c8e9c64a2aa21dbf226e7898f6de0c36b3806bcae8" Dec 16 06:59:12 crc kubenswrapper[4823]: E1216 06:59:12.376778 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ddfb7d6dae98465369260c8e9c64a2aa21dbf226e7898f6de0c36b3806bcae8\": container with ID starting with 7ddfb7d6dae98465369260c8e9c64a2aa21dbf226e7898f6de0c36b3806bcae8 not found: ID does not exist" containerID="7ddfb7d6dae98465369260c8e9c64a2aa21dbf226e7898f6de0c36b3806bcae8" Dec 16 06:59:12 crc kubenswrapper[4823]: I1216 06:59:12.376837 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ddfb7d6dae98465369260c8e9c64a2aa21dbf226e7898f6de0c36b3806bcae8"} err="failed to get container status \"7ddfb7d6dae98465369260c8e9c64a2aa21dbf226e7898f6de0c36b3806bcae8\": rpc error: code = NotFound desc = could not find container \"7ddfb7d6dae98465369260c8e9c64a2aa21dbf226e7898f6de0c36b3806bcae8\": container with ID starting with 
7ddfb7d6dae98465369260c8e9c64a2aa21dbf226e7898f6de0c36b3806bcae8 not found: ID does not exist" Dec 16 06:59:12 crc kubenswrapper[4823]: I1216 06:59:12.376876 4823 scope.go:117] "RemoveContainer" containerID="c84e7f6950c8d433823b7d5ae3d8a9ecaa70ab6845ce607b8bb93efa990325a5" Dec 16 06:59:12 crc kubenswrapper[4823]: E1216 06:59:12.377437 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c84e7f6950c8d433823b7d5ae3d8a9ecaa70ab6845ce607b8bb93efa990325a5\": container with ID starting with c84e7f6950c8d433823b7d5ae3d8a9ecaa70ab6845ce607b8bb93efa990325a5 not found: ID does not exist" containerID="c84e7f6950c8d433823b7d5ae3d8a9ecaa70ab6845ce607b8bb93efa990325a5" Dec 16 06:59:12 crc kubenswrapper[4823]: I1216 06:59:12.377463 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c84e7f6950c8d433823b7d5ae3d8a9ecaa70ab6845ce607b8bb93efa990325a5"} err="failed to get container status \"c84e7f6950c8d433823b7d5ae3d8a9ecaa70ab6845ce607b8bb93efa990325a5\": rpc error: code = NotFound desc = could not find container \"c84e7f6950c8d433823b7d5ae3d8a9ecaa70ab6845ce607b8bb93efa990325a5\": container with ID starting with c84e7f6950c8d433823b7d5ae3d8a9ecaa70ab6845ce607b8bb93efa990325a5 not found: ID does not exist" Dec 16 06:59:12 crc kubenswrapper[4823]: I1216 06:59:12.377476 4823 scope.go:117] "RemoveContainer" containerID="464ab3cf14920ae3f947ce3dccacd7379d50380699008411d0a51bd83fe3e51c" Dec 16 06:59:12 crc kubenswrapper[4823]: E1216 06:59:12.377856 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"464ab3cf14920ae3f947ce3dccacd7379d50380699008411d0a51bd83fe3e51c\": container with ID starting with 464ab3cf14920ae3f947ce3dccacd7379d50380699008411d0a51bd83fe3e51c not found: ID does not exist" containerID="464ab3cf14920ae3f947ce3dccacd7379d50380699008411d0a51bd83fe3e51c" Dec 16 06:59:12 crc 
kubenswrapper[4823]: I1216 06:59:12.377886 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"464ab3cf14920ae3f947ce3dccacd7379d50380699008411d0a51bd83fe3e51c"} err="failed to get container status \"464ab3cf14920ae3f947ce3dccacd7379d50380699008411d0a51bd83fe3e51c\": rpc error: code = NotFound desc = could not find container \"464ab3cf14920ae3f947ce3dccacd7379d50380699008411d0a51bd83fe3e51c\": container with ID starting with 464ab3cf14920ae3f947ce3dccacd7379d50380699008411d0a51bd83fe3e51c not found: ID does not exist" Dec 16 06:59:12 crc kubenswrapper[4823]: I1216 06:59:12.377903 4823 scope.go:117] "RemoveContainer" containerID="a20f7e0e3c07f467643afdcc6f0e333353d2d10b6d0c0d7aff0d347d33a5d925" Dec 16 06:59:12 crc kubenswrapper[4823]: E1216 06:59:12.378263 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a20f7e0e3c07f467643afdcc6f0e333353d2d10b6d0c0d7aff0d347d33a5d925\": container with ID starting with a20f7e0e3c07f467643afdcc6f0e333353d2d10b6d0c0d7aff0d347d33a5d925 not found: ID does not exist" containerID="a20f7e0e3c07f467643afdcc6f0e333353d2d10b6d0c0d7aff0d347d33a5d925" Dec 16 06:59:12 crc kubenswrapper[4823]: I1216 06:59:12.378285 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a20f7e0e3c07f467643afdcc6f0e333353d2d10b6d0c0d7aff0d347d33a5d925"} err="failed to get container status \"a20f7e0e3c07f467643afdcc6f0e333353d2d10b6d0c0d7aff0d347d33a5d925\": rpc error: code = NotFound desc = could not find container \"a20f7e0e3c07f467643afdcc6f0e333353d2d10b6d0c0d7aff0d347d33a5d925\": container with ID starting with a20f7e0e3c07f467643afdcc6f0e333353d2d10b6d0c0d7aff0d347d33a5d925 not found: ID does not exist" Dec 16 06:59:12 crc kubenswrapper[4823]: I1216 06:59:12.378296 4823 scope.go:117] "RemoveContainer" containerID="c33a30aed9756ef012ff9046309ec07897de3b17707ea39c621534ed863c3adf" Dec 16 
06:59:12 crc kubenswrapper[4823]: E1216 06:59:12.378632 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c33a30aed9756ef012ff9046309ec07897de3b17707ea39c621534ed863c3adf\": container with ID starting with c33a30aed9756ef012ff9046309ec07897de3b17707ea39c621534ed863c3adf not found: ID does not exist" containerID="c33a30aed9756ef012ff9046309ec07897de3b17707ea39c621534ed863c3adf" Dec 16 06:59:12 crc kubenswrapper[4823]: I1216 06:59:12.378678 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c33a30aed9756ef012ff9046309ec07897de3b17707ea39c621534ed863c3adf"} err="failed to get container status \"c33a30aed9756ef012ff9046309ec07897de3b17707ea39c621534ed863c3adf\": rpc error: code = NotFound desc = could not find container \"c33a30aed9756ef012ff9046309ec07897de3b17707ea39c621534ed863c3adf\": container with ID starting with c33a30aed9756ef012ff9046309ec07897de3b17707ea39c621534ed863c3adf not found: ID does not exist" Dec 16 06:59:12 crc kubenswrapper[4823]: I1216 06:59:12.378711 4823 scope.go:117] "RemoveContainer" containerID="84f11d2687959c8525b131966a998af137caa69a480bb31a94e1615a3ccbea5d" Dec 16 06:59:12 crc kubenswrapper[4823]: E1216 06:59:12.379158 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84f11d2687959c8525b131966a998af137caa69a480bb31a94e1615a3ccbea5d\": container with ID starting with 84f11d2687959c8525b131966a998af137caa69a480bb31a94e1615a3ccbea5d not found: ID does not exist" containerID="84f11d2687959c8525b131966a998af137caa69a480bb31a94e1615a3ccbea5d" Dec 16 06:59:12 crc kubenswrapper[4823]: I1216 06:59:12.379255 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84f11d2687959c8525b131966a998af137caa69a480bb31a94e1615a3ccbea5d"} err="failed to get container status 
\"84f11d2687959c8525b131966a998af137caa69a480bb31a94e1615a3ccbea5d\": rpc error: code = NotFound desc = could not find container \"84f11d2687959c8525b131966a998af137caa69a480bb31a94e1615a3ccbea5d\": container with ID starting with 84f11d2687959c8525b131966a998af137caa69a480bb31a94e1615a3ccbea5d not found: ID does not exist" Dec 16 06:59:14 crc kubenswrapper[4823]: E1216 06:59:14.335417 4823 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.180:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 16 06:59:14 crc kubenswrapper[4823]: I1216 06:59:14.336593 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 16 06:59:14 crc kubenswrapper[4823]: E1216 06:59:14.358669 4823 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.180:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.18819fe98d9edff1 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-16 06:59:14.358140913 +0000 UTC m=+232.846707036,LastTimestamp:2025-12-16 06:59:14.358140913 +0000 UTC m=+232.846707036,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 16 06:59:15 crc kubenswrapper[4823]: I1216 06:59:15.261839 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"bd36e2afeb0241905f24bb9c075785a72a957b030006c865b30a399c07a52366"} Dec 16 06:59:15 crc kubenswrapper[4823]: I1216 06:59:15.262459 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"0d445067cb903419fe0f40b5a29f481f32ba732c08b3ed61a87881cbe2765e5a"} Dec 16 06:59:15 crc kubenswrapper[4823]: I1216 06:59:15.263166 4823 status_manager.go:851] "Failed to get status for pod" podUID="fa03a70b-1afb-4d1b-8bec-7a302a382b7d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.180:6443: connect: connection refused" Dec 16 06:59:15 crc kubenswrapper[4823]: E1216 06:59:15.263186 4823 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.180:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 16 06:59:16 crc kubenswrapper[4823]: E1216 06:59:16.186244 4823 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:59:16Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:59:16Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:59:16Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:59:16Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.180:6443: connect: connection refused" Dec 16 06:59:16 crc kubenswrapper[4823]: E1216 06:59:16.187169 4823 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.180:6443: connect: connection refused" Dec 16 06:59:16 crc kubenswrapper[4823]: E1216 06:59:16.187691 4823 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.180:6443: connect: connection refused" Dec 16 06:59:16 crc kubenswrapper[4823]: E1216 06:59:16.188148 4823 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.180:6443: connect: connection refused" Dec 16 
06:59:16 crc kubenswrapper[4823]: E1216 06:59:16.188692 4823 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.180:6443: connect: connection refused" Dec 16 06:59:16 crc kubenswrapper[4823]: E1216 06:59:16.188743 4823 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 16 06:59:16 crc kubenswrapper[4823]: E1216 06:59:16.634731 4823 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.180:6443: connect: connection refused" Dec 16 06:59:16 crc kubenswrapper[4823]: E1216 06:59:16.635344 4823 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.180:6443: connect: connection refused" Dec 16 06:59:16 crc kubenswrapper[4823]: E1216 06:59:16.635918 4823 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.180:6443: connect: connection refused" Dec 16 06:59:16 crc kubenswrapper[4823]: E1216 06:59:16.636606 4823 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.180:6443: connect: connection refused" Dec 16 06:59:16 crc kubenswrapper[4823]: E1216 06:59:16.637556 4823 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.180:6443: connect: connection refused" Dec 16 06:59:16 crc kubenswrapper[4823]: I1216 
06:59:16.637618 4823 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Dec 16 06:59:16 crc kubenswrapper[4823]: E1216 06:59:16.638149 4823 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.180:6443: connect: connection refused" interval="200ms" Dec 16 06:59:16 crc kubenswrapper[4823]: E1216 06:59:16.840169 4823 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.180:6443: connect: connection refused" interval="400ms" Dec 16 06:59:17 crc kubenswrapper[4823]: E1216 06:59:17.068834 4823 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.180:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.18819fe98d9edff1 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-16 06:59:14.358140913 +0000 UTC m=+232.846707036,LastTimestamp:2025-12-16 06:59:14.358140913 +0000 UTC m=+232.846707036,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 16 06:59:17 crc kubenswrapper[4823]: E1216 06:59:17.241624 4823 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.180:6443: connect: connection refused" interval="800ms" Dec 16 06:59:18 crc kubenswrapper[4823]: E1216 06:59:18.043001 4823 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.180:6443: connect: connection refused" interval="1.6s" Dec 16 06:59:19 crc kubenswrapper[4823]: E1216 06:59:19.643844 4823 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.180:6443: connect: connection refused" interval="3.2s" Dec 16 06:59:20 crc kubenswrapper[4823]: I1216 06:59:20.771159 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 06:59:20 crc kubenswrapper[4823]: I1216 06:59:20.772674 4823 status_manager.go:851] "Failed to get status for pod" podUID="fa03a70b-1afb-4d1b-8bec-7a302a382b7d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.180:6443: connect: connection refused" Dec 16 06:59:20 crc kubenswrapper[4823]: I1216 06:59:20.791307 4823 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c915dd50-9820-494e-b47a-987257910a57" Dec 16 06:59:20 crc kubenswrapper[4823]: I1216 06:59:20.791353 4823 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c915dd50-9820-494e-b47a-987257910a57" Dec 16 06:59:20 crc kubenswrapper[4823]: E1216 06:59:20.791724 4823 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.180:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 06:59:20 crc kubenswrapper[4823]: I1216 06:59:20.792370 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 06:59:21 crc kubenswrapper[4823]: I1216 06:59:21.301791 4823 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="40c6b712635fce2df44f90cec6b83f8d6756db83a7b33473e9dc7c30e5ef52e5" exitCode=0 Dec 16 06:59:21 crc kubenswrapper[4823]: I1216 06:59:21.301895 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"40c6b712635fce2df44f90cec6b83f8d6756db83a7b33473e9dc7c30e5ef52e5"} Dec 16 06:59:21 crc kubenswrapper[4823]: I1216 06:59:21.302303 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"9211afa2a73b37f8f3f9d7081c4ed8634fc1c1f35ea0ea7357a0e70378a8b61b"} Dec 16 06:59:21 crc kubenswrapper[4823]: I1216 06:59:21.302736 4823 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c915dd50-9820-494e-b47a-987257910a57" Dec 16 06:59:21 crc kubenswrapper[4823]: I1216 06:59:21.302755 4823 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c915dd50-9820-494e-b47a-987257910a57" Dec 16 06:59:21 crc kubenswrapper[4823]: I1216 06:59:21.303627 4823 status_manager.go:851] "Failed to get status for pod" podUID="fa03a70b-1afb-4d1b-8bec-7a302a382b7d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.180:6443: connect: connection refused" Dec 16 06:59:21 crc kubenswrapper[4823]: E1216 06:59:21.303742 4823 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial 
tcp 38.102.83.180:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 06:59:22 crc kubenswrapper[4823]: I1216 06:59:22.192859 4823 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Readiness probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Dec 16 06:59:22 crc kubenswrapper[4823]: I1216 06:59:22.193355 4823 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Dec 16 06:59:22 crc kubenswrapper[4823]: I1216 06:59:22.321326 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 16 06:59:22 crc kubenswrapper[4823]: I1216 06:59:22.321399 4823 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="275328a704b475b037c82f92055f9ca7f83b9821325603c5e34090140cc486ce" exitCode=1 Dec 16 06:59:22 crc kubenswrapper[4823]: I1216 06:59:22.321492 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"275328a704b475b037c82f92055f9ca7f83b9821325603c5e34090140cc486ce"} Dec 16 06:59:22 crc kubenswrapper[4823]: I1216 06:59:22.322206 4823 scope.go:117] "RemoveContainer" containerID="275328a704b475b037c82f92055f9ca7f83b9821325603c5e34090140cc486ce" Dec 16 06:59:22 crc kubenswrapper[4823]: I1216 06:59:22.330294 4823 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"f27bbdd6c5d20fcc31497f67bc86153a19ac5fd1865503a93505d45c3372ed29"} Dec 16 06:59:22 crc kubenswrapper[4823]: I1216 06:59:22.330364 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"a4fcd921c675935638c02ced3a43e650e58cdd814d39b5c5be615f64d9120ee1"} Dec 16 06:59:22 crc kubenswrapper[4823]: I1216 06:59:22.330382 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"6685e670d81069ded0a092c775b5c1cca497aee24dd5da3d1ccdc0849275079e"} Dec 16 06:59:22 crc kubenswrapper[4823]: I1216 06:59:22.330395 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"ace9ea8dc4766f9d3f06b45f0a860ff23bb5cc7d8b96f25278e2193935cfcdd7"} Dec 16 06:59:23 crc kubenswrapper[4823]: I1216 06:59:23.338720 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 16 06:59:23 crc kubenswrapper[4823]: I1216 06:59:23.338836 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"c8093f0d82491f272752a1dfb36232583a0034440d1c9fd4811e26b5c0ee880f"} Dec 16 06:59:23 crc kubenswrapper[4823]: I1216 06:59:23.342443 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"74d008e835fc566b55f2095227f803dc1de25592bbf37ae5318a7cf36c0869db"} Dec 16 06:59:23 crc kubenswrapper[4823]: I1216 06:59:23.342756 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 06:59:23 crc kubenswrapper[4823]: I1216 06:59:23.342858 4823 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c915dd50-9820-494e-b47a-987257910a57" Dec 16 06:59:23 crc kubenswrapper[4823]: I1216 06:59:23.342897 4823 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c915dd50-9820-494e-b47a-987257910a57" Dec 16 06:59:25 crc kubenswrapper[4823]: I1216 06:59:25.793524 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 06:59:25 crc kubenswrapper[4823]: I1216 06:59:25.794433 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 06:59:25 crc kubenswrapper[4823]: I1216 06:59:25.800268 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 06:59:28 crc kubenswrapper[4823]: I1216 06:59:28.358660 4823 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 06:59:29 crc kubenswrapper[4823]: I1216 06:59:29.378109 4823 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c915dd50-9820-494e-b47a-987257910a57" Dec 16 06:59:29 crc kubenswrapper[4823]: I1216 06:59:29.378159 4823 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c915dd50-9820-494e-b47a-987257910a57" Dec 16 06:59:29 crc kubenswrapper[4823]: I1216 06:59:29.383072 4823 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 06:59:29 crc kubenswrapper[4823]: I1216 06:59:29.386321 4823 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="6e3745b7-cfaf-4470-9590-8c8b06635bc2" Dec 16 06:59:30 crc kubenswrapper[4823]: I1216 06:59:30.383165 4823 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c915dd50-9820-494e-b47a-987257910a57" Dec 16 06:59:30 crc kubenswrapper[4823]: I1216 06:59:30.383198 4823 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c915dd50-9820-494e-b47a-987257910a57" Dec 16 06:59:30 crc kubenswrapper[4823]: I1216 06:59:30.685860 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 16 06:59:30 crc kubenswrapper[4823]: I1216 06:59:30.689486 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 16 06:59:31 crc kubenswrapper[4823]: I1216 06:59:31.061087 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-6xfbm" podUID="60b58907-b6e9-4a6d-b442-9c79d839bac9" containerName="oauth-openshift" containerID="cri-o://2111ccddf541caab762abd8a48bce78742d90876655f80bec411b32630271e9e" gracePeriod=15 Dec 16 06:59:31 crc kubenswrapper[4823]: I1216 06:59:31.388962 4823 generic.go:334] "Generic (PLEG): container finished" podID="60b58907-b6e9-4a6d-b442-9c79d839bac9" containerID="2111ccddf541caab762abd8a48bce78742d90876655f80bec411b32630271e9e" exitCode=0 Dec 16 06:59:31 crc kubenswrapper[4823]: I1216 06:59:31.389887 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-authentication/oauth-openshift-558db77b4-6xfbm" event={"ID":"60b58907-b6e9-4a6d-b442-9c79d839bac9","Type":"ContainerDied","Data":"2111ccddf541caab762abd8a48bce78742d90876655f80bec411b32630271e9e"} Dec 16 06:59:31 crc kubenswrapper[4823]: I1216 06:59:31.389923 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 16 06:59:31 crc kubenswrapper[4823]: I1216 06:59:31.452944 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-6xfbm" Dec 16 06:59:31 crc kubenswrapper[4823]: I1216 06:59:31.536948 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/60b58907-b6e9-4a6d-b442-9c79d839bac9-v4-0-config-system-trusted-ca-bundle\") pod \"60b58907-b6e9-4a6d-b442-9c79d839bac9\" (UID: \"60b58907-b6e9-4a6d-b442-9c79d839bac9\") " Dec 16 06:59:31 crc kubenswrapper[4823]: I1216 06:59:31.536997 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/60b58907-b6e9-4a6d-b442-9c79d839bac9-v4-0-config-user-template-login\") pod \"60b58907-b6e9-4a6d-b442-9c79d839bac9\" (UID: \"60b58907-b6e9-4a6d-b442-9c79d839bac9\") " Dec 16 06:59:31 crc kubenswrapper[4823]: I1216 06:59:31.537048 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/60b58907-b6e9-4a6d-b442-9c79d839bac9-audit-dir\") pod \"60b58907-b6e9-4a6d-b442-9c79d839bac9\" (UID: \"60b58907-b6e9-4a6d-b442-9c79d839bac9\") " Dec 16 06:59:31 crc kubenswrapper[4823]: I1216 06:59:31.537081 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/60b58907-b6e9-4a6d-b442-9c79d839bac9-v4-0-config-system-ocp-branding-template\") pod \"60b58907-b6e9-4a6d-b442-9c79d839bac9\" (UID: \"60b58907-b6e9-4a6d-b442-9c79d839bac9\") " Dec 16 06:59:31 crc kubenswrapper[4823]: I1216 06:59:31.537114 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/60b58907-b6e9-4a6d-b442-9c79d839bac9-v4-0-config-user-idp-0-file-data\") pod \"60b58907-b6e9-4a6d-b442-9c79d839bac9\" (UID: \"60b58907-b6e9-4a6d-b442-9c79d839bac9\") " Dec 16 06:59:31 crc kubenswrapper[4823]: I1216 06:59:31.537152 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/60b58907-b6e9-4a6d-b442-9c79d839bac9-v4-0-config-system-serving-cert\") pod \"60b58907-b6e9-4a6d-b442-9c79d839bac9\" (UID: \"60b58907-b6e9-4a6d-b442-9c79d839bac9\") " Dec 16 06:59:31 crc kubenswrapper[4823]: I1216 06:59:31.537177 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/60b58907-b6e9-4a6d-b442-9c79d839bac9-v4-0-config-user-template-provider-selection\") pod \"60b58907-b6e9-4a6d-b442-9c79d839bac9\" (UID: \"60b58907-b6e9-4a6d-b442-9c79d839bac9\") " Dec 16 06:59:31 crc kubenswrapper[4823]: I1216 06:59:31.537182 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/60b58907-b6e9-4a6d-b442-9c79d839bac9-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "60b58907-b6e9-4a6d-b442-9c79d839bac9" (UID: "60b58907-b6e9-4a6d-b442-9c79d839bac9"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 06:59:31 crc kubenswrapper[4823]: I1216 06:59:31.537212 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/60b58907-b6e9-4a6d-b442-9c79d839bac9-v4-0-config-user-template-error\") pod \"60b58907-b6e9-4a6d-b442-9c79d839bac9\" (UID: \"60b58907-b6e9-4a6d-b442-9c79d839bac9\") " Dec 16 06:59:31 crc kubenswrapper[4823]: I1216 06:59:31.537245 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/60b58907-b6e9-4a6d-b442-9c79d839bac9-v4-0-config-system-service-ca\") pod \"60b58907-b6e9-4a6d-b442-9c79d839bac9\" (UID: \"60b58907-b6e9-4a6d-b442-9c79d839bac9\") " Dec 16 06:59:31 crc kubenswrapper[4823]: I1216 06:59:31.537269 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6frp\" (UniqueName: \"kubernetes.io/projected/60b58907-b6e9-4a6d-b442-9c79d839bac9-kube-api-access-d6frp\") pod \"60b58907-b6e9-4a6d-b442-9c79d839bac9\" (UID: \"60b58907-b6e9-4a6d-b442-9c79d839bac9\") " Dec 16 06:59:31 crc kubenswrapper[4823]: I1216 06:59:31.537290 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/60b58907-b6e9-4a6d-b442-9c79d839bac9-v4-0-config-system-session\") pod \"60b58907-b6e9-4a6d-b442-9c79d839bac9\" (UID: \"60b58907-b6e9-4a6d-b442-9c79d839bac9\") " Dec 16 06:59:31 crc kubenswrapper[4823]: I1216 06:59:31.537335 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/60b58907-b6e9-4a6d-b442-9c79d839bac9-v4-0-config-system-router-certs\") pod \"60b58907-b6e9-4a6d-b442-9c79d839bac9\" (UID: \"60b58907-b6e9-4a6d-b442-9c79d839bac9\") " Dec 16 06:59:31 crc kubenswrapper[4823]: I1216 
06:59:31.537355 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/60b58907-b6e9-4a6d-b442-9c79d839bac9-audit-policies\") pod \"60b58907-b6e9-4a6d-b442-9c79d839bac9\" (UID: \"60b58907-b6e9-4a6d-b442-9c79d839bac9\") "
Dec 16 06:59:31 crc kubenswrapper[4823]: I1216 06:59:31.537379 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/60b58907-b6e9-4a6d-b442-9c79d839bac9-v4-0-config-system-cliconfig\") pod \"60b58907-b6e9-4a6d-b442-9c79d839bac9\" (UID: \"60b58907-b6e9-4a6d-b442-9c79d839bac9\") "
Dec 16 06:59:31 crc kubenswrapper[4823]: I1216 06:59:31.537741 4823 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/60b58907-b6e9-4a6d-b442-9c79d839bac9-audit-dir\") on node \"crc\" DevicePath \"\""
Dec 16 06:59:31 crc kubenswrapper[4823]: I1216 06:59:31.538317 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60b58907-b6e9-4a6d-b442-9c79d839bac9-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "60b58907-b6e9-4a6d-b442-9c79d839bac9" (UID: "60b58907-b6e9-4a6d-b442-9c79d839bac9"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 16 06:59:31 crc kubenswrapper[4823]: I1216 06:59:31.540071 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60b58907-b6e9-4a6d-b442-9c79d839bac9-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "60b58907-b6e9-4a6d-b442-9c79d839bac9" (UID: "60b58907-b6e9-4a6d-b442-9c79d839bac9"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 16 06:59:31 crc kubenswrapper[4823]: I1216 06:59:31.540426 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60b58907-b6e9-4a6d-b442-9c79d839bac9-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "60b58907-b6e9-4a6d-b442-9c79d839bac9" (UID: "60b58907-b6e9-4a6d-b442-9c79d839bac9"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 16 06:59:31 crc kubenswrapper[4823]: I1216 06:59:31.545529 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60b58907-b6e9-4a6d-b442-9c79d839bac9-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "60b58907-b6e9-4a6d-b442-9c79d839bac9" (UID: "60b58907-b6e9-4a6d-b442-9c79d839bac9"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 06:59:31 crc kubenswrapper[4823]: I1216 06:59:31.546459 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60b58907-b6e9-4a6d-b442-9c79d839bac9-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "60b58907-b6e9-4a6d-b442-9c79d839bac9" (UID: "60b58907-b6e9-4a6d-b442-9c79d839bac9"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 06:59:31 crc kubenswrapper[4823]: I1216 06:59:31.546823 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60b58907-b6e9-4a6d-b442-9c79d839bac9-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "60b58907-b6e9-4a6d-b442-9c79d839bac9" (UID: "60b58907-b6e9-4a6d-b442-9c79d839bac9"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 16 06:59:31 crc kubenswrapper[4823]: I1216 06:59:31.547108 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60b58907-b6e9-4a6d-b442-9c79d839bac9-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "60b58907-b6e9-4a6d-b442-9c79d839bac9" (UID: "60b58907-b6e9-4a6d-b442-9c79d839bac9"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 06:59:31 crc kubenswrapper[4823]: I1216 06:59:31.547719 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60b58907-b6e9-4a6d-b442-9c79d839bac9-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "60b58907-b6e9-4a6d-b442-9c79d839bac9" (UID: "60b58907-b6e9-4a6d-b442-9c79d839bac9"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 06:59:31 crc kubenswrapper[4823]: I1216 06:59:31.548234 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60b58907-b6e9-4a6d-b442-9c79d839bac9-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "60b58907-b6e9-4a6d-b442-9c79d839bac9" (UID: "60b58907-b6e9-4a6d-b442-9c79d839bac9"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 06:59:31 crc kubenswrapper[4823]: I1216 06:59:31.548780 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60b58907-b6e9-4a6d-b442-9c79d839bac9-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "60b58907-b6e9-4a6d-b442-9c79d839bac9" (UID: "60b58907-b6e9-4a6d-b442-9c79d839bac9"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 06:59:31 crc kubenswrapper[4823]: I1216 06:59:31.549043 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60b58907-b6e9-4a6d-b442-9c79d839bac9-kube-api-access-d6frp" (OuterVolumeSpecName: "kube-api-access-d6frp") pod "60b58907-b6e9-4a6d-b442-9c79d839bac9" (UID: "60b58907-b6e9-4a6d-b442-9c79d839bac9"). InnerVolumeSpecName "kube-api-access-d6frp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 06:59:31 crc kubenswrapper[4823]: I1216 06:59:31.555731 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60b58907-b6e9-4a6d-b442-9c79d839bac9-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "60b58907-b6e9-4a6d-b442-9c79d839bac9" (UID: "60b58907-b6e9-4a6d-b442-9c79d839bac9"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 06:59:31 crc kubenswrapper[4823]: I1216 06:59:31.555865 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60b58907-b6e9-4a6d-b442-9c79d839bac9-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "60b58907-b6e9-4a6d-b442-9c79d839bac9" (UID: "60b58907-b6e9-4a6d-b442-9c79d839bac9"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 06:59:31 crc kubenswrapper[4823]: I1216 06:59:31.638941 4823 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/60b58907-b6e9-4a6d-b442-9c79d839bac9-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\""
Dec 16 06:59:31 crc kubenswrapper[4823]: I1216 06:59:31.639328 4823 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/60b58907-b6e9-4a6d-b442-9c79d839bac9-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 16 06:59:31 crc kubenswrapper[4823]: I1216 06:59:31.639406 4823 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/60b58907-b6e9-4a6d-b442-9c79d839bac9-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\""
Dec 16 06:59:31 crc kubenswrapper[4823]: I1216 06:59:31.639467 4823 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/60b58907-b6e9-4a6d-b442-9c79d839bac9-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\""
Dec 16 06:59:31 crc kubenswrapper[4823]: I1216 06:59:31.639530 4823 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/60b58907-b6e9-4a6d-b442-9c79d839bac9-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\""
Dec 16 06:59:31 crc kubenswrapper[4823]: I1216 06:59:31.639586 4823 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/60b58907-b6e9-4a6d-b442-9c79d839bac9-v4-0-config-system-session\") on node \"crc\" DevicePath \"\""
Dec 16 06:59:31 crc kubenswrapper[4823]: I1216 06:59:31.639677 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6frp\" (UniqueName: \"kubernetes.io/projected/60b58907-b6e9-4a6d-b442-9c79d839bac9-kube-api-access-d6frp\") on node \"crc\" DevicePath \"\""
Dec 16 06:59:31 crc kubenswrapper[4823]: I1216 06:59:31.639755 4823 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/60b58907-b6e9-4a6d-b442-9c79d839bac9-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\""
Dec 16 06:59:31 crc kubenswrapper[4823]: I1216 06:59:31.639826 4823 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/60b58907-b6e9-4a6d-b442-9c79d839bac9-audit-policies\") on node \"crc\" DevicePath \"\""
Dec 16 06:59:31 crc kubenswrapper[4823]: I1216 06:59:31.639904 4823 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/60b58907-b6e9-4a6d-b442-9c79d839bac9-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\""
Dec 16 06:59:31 crc kubenswrapper[4823]: I1216 06:59:31.639968 4823 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/60b58907-b6e9-4a6d-b442-9c79d839bac9-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 16 06:59:31 crc kubenswrapper[4823]: I1216 06:59:31.640053 4823 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/60b58907-b6e9-4a6d-b442-9c79d839bac9-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\""
Dec 16 06:59:31 crc kubenswrapper[4823]: I1216 06:59:31.640146 4823 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/60b58907-b6e9-4a6d-b442-9c79d839bac9-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\""
Dec 16 06:59:31 crc kubenswrapper[4823]: I1216 06:59:31.790921 4823 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="6e3745b7-cfaf-4470-9590-8c8b06635bc2"
Dec 16 06:59:32 crc kubenswrapper[4823]: I1216 06:59:32.197848 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 16 06:59:32 crc kubenswrapper[4823]: I1216 06:59:32.398046 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-6xfbm" event={"ID":"60b58907-b6e9-4a6d-b442-9c79d839bac9","Type":"ContainerDied","Data":"d27c550266359180bf82b7adc42e7107fc9b7a4bf212f5a3366fd5fbb7fce0a7"}
Dec 16 06:59:32 crc kubenswrapper[4823]: I1216 06:59:32.398148 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-6xfbm"
Dec 16 06:59:32 crc kubenswrapper[4823]: I1216 06:59:32.398149 4823 scope.go:117] "RemoveContainer" containerID="2111ccddf541caab762abd8a48bce78742d90876655f80bec411b32630271e9e"
Dec 16 06:59:34 crc kubenswrapper[4823]: I1216 06:59:34.852854 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Dec 16 06:59:36 crc kubenswrapper[4823]: I1216 06:59:36.113984 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Dec 16 06:59:37 crc kubenswrapper[4823]: I1216 06:59:37.260906 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Dec 16 06:59:37 crc kubenswrapper[4823]: I1216 06:59:37.326993 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Dec 16 06:59:39 crc kubenswrapper[4823]: I1216 06:59:39.261903 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Dec 16 06:59:39 crc kubenswrapper[4823]: I1216 06:59:39.264215 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Dec 16 06:59:39 crc kubenswrapper[4823]: I1216 06:59:39.763099 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Dec 16 06:59:39 crc kubenswrapper[4823]: I1216 06:59:39.934734 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Dec 16 06:59:40 crc kubenswrapper[4823]: I1216 06:59:40.071130 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Dec 16 06:59:40 crc kubenswrapper[4823]: I1216 06:59:40.364520 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Dec 16 06:59:40 crc kubenswrapper[4823]: I1216 06:59:40.492050 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Dec 16 06:59:40 crc kubenswrapper[4823]: I1216 06:59:40.528188 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Dec 16 06:59:40 crc kubenswrapper[4823]: I1216 06:59:40.735964 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Dec 16 06:59:40 crc kubenswrapper[4823]: I1216 06:59:40.857188 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Dec 16 06:59:40 crc kubenswrapper[4823]: I1216 06:59:40.919702 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Dec 16 06:59:41 crc kubenswrapper[4823]: I1216 06:59:41.406222 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Dec 16 06:59:41 crc kubenswrapper[4823]: I1216 06:59:41.637992 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Dec 16 06:59:41 crc kubenswrapper[4823]: I1216 06:59:41.750491 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Dec 16 06:59:41 crc kubenswrapper[4823]: I1216 06:59:41.896785 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Dec 16 06:59:41 crc kubenswrapper[4823]: I1216 06:59:41.950571 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Dec 16 06:59:42 crc kubenswrapper[4823]: I1216 06:59:42.089543 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Dec 16 06:59:42 crc kubenswrapper[4823]: I1216 06:59:42.158925 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Dec 16 06:59:42 crc kubenswrapper[4823]: I1216 06:59:42.267035 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Dec 16 06:59:42 crc kubenswrapper[4823]: I1216 06:59:42.409537 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Dec 16 06:59:42 crc kubenswrapper[4823]: I1216 06:59:42.473529 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Dec 16 06:59:42 crc kubenswrapper[4823]: I1216 06:59:42.597554 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Dec 16 06:59:42 crc kubenswrapper[4823]: I1216 06:59:42.694975 4823 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Dec 16 06:59:42 crc kubenswrapper[4823]: I1216 06:59:42.699980 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-authentication/oauth-openshift-558db77b4-6xfbm"]
Dec 16 06:59:42 crc kubenswrapper[4823]: I1216 06:59:42.700078 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Dec 16 06:59:42 crc kubenswrapper[4823]: I1216 06:59:42.708845 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 16 06:59:42 crc kubenswrapper[4823]: I1216 06:59:42.725994 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Dec 16 06:59:42 crc kubenswrapper[4823]: I1216 06:59:42.731526 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Dec 16 06:59:42 crc kubenswrapper[4823]: I1216 06:59:42.734900 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=14.734870418 podStartE2EDuration="14.734870418s" podCreationTimestamp="2025-12-16 06:59:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:59:42.731260129 +0000 UTC m=+261.219826292" watchObservedRunningTime="2025-12-16 06:59:42.734870418 +0000 UTC m=+261.223436571"
Dec 16 06:59:42 crc kubenswrapper[4823]: I1216 06:59:42.898504 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Dec 16 06:59:42 crc kubenswrapper[4823]: I1216 06:59:42.905876 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Dec 16 06:59:42 crc kubenswrapper[4823]: I1216 06:59:42.925618 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Dec 16 06:59:43 crc kubenswrapper[4823]: I1216 06:59:43.019047 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Dec 16 06:59:43 crc kubenswrapper[4823]: I1216 06:59:43.045163 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Dec 16 06:59:43 crc kubenswrapper[4823]: I1216 06:59:43.251853 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Dec 16 06:59:43 crc kubenswrapper[4823]: I1216 06:59:43.310861 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Dec 16 06:59:43 crc kubenswrapper[4823]: I1216 06:59:43.325156 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Dec 16 06:59:43 crc kubenswrapper[4823]: I1216 06:59:43.484621 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Dec 16 06:59:43 crc kubenswrapper[4823]: I1216 06:59:43.732754 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Dec 16 06:59:43 crc kubenswrapper[4823]: I1216 06:59:43.779884 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60b58907-b6e9-4a6d-b442-9c79d839bac9" path="/var/lib/kubelet/pods/60b58907-b6e9-4a6d-b442-9c79d839bac9/volumes"
Dec 16 06:59:43 crc kubenswrapper[4823]: I1216 06:59:43.933866 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Dec 16 06:59:43 crc kubenswrapper[4823]: I1216 06:59:43.948451 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Dec 16 06:59:44 crc kubenswrapper[4823]: I1216 06:59:44.113400 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Dec 16 06:59:44 crc kubenswrapper[4823]: I1216 06:59:44.155135 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Dec 16 06:59:44 crc kubenswrapper[4823]: I1216 06:59:44.216012 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Dec 16 06:59:44 crc kubenswrapper[4823]: I1216 06:59:44.227705 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Dec 16 06:59:44 crc kubenswrapper[4823]: I1216 06:59:44.229307 4823 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Dec 16 06:59:44 crc kubenswrapper[4823]: I1216 06:59:44.361253 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Dec 16 06:59:44 crc kubenswrapper[4823]: I1216 06:59:44.430601 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Dec 16 06:59:44 crc kubenswrapper[4823]: I1216 06:59:44.469015 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Dec 16 06:59:44 crc kubenswrapper[4823]: I1216 06:59:44.469707 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl"
Dec 16 06:59:44 crc kubenswrapper[4823]: I1216 06:59:44.529135 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Dec 16 06:59:44 crc kubenswrapper[4823]: I1216 06:59:44.533485 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Dec 16 06:59:44 crc kubenswrapper[4823]: I1216 06:59:44.547198 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Dec 16 06:59:44 crc kubenswrapper[4823]: I1216 06:59:44.622333 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Dec 16 06:59:44 crc kubenswrapper[4823]: I1216 06:59:44.650207 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Dec 16 06:59:44 crc kubenswrapper[4823]: I1216 06:59:44.827660 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Dec 16 06:59:44 crc kubenswrapper[4823]: I1216 06:59:44.884484 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Dec 16 06:59:44 crc kubenswrapper[4823]: I1216 06:59:44.928571 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Dec 16 06:59:44 crc kubenswrapper[4823]: I1216 06:59:44.944465 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Dec 16 06:59:44 crc kubenswrapper[4823]: I1216 06:59:44.964083 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Dec 16 06:59:45 crc kubenswrapper[4823]: I1216 06:59:45.036377 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Dec 16 06:59:45 crc kubenswrapper[4823]: I1216 06:59:45.235212 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Dec 16 06:59:45 crc kubenswrapper[4823]: I1216 06:59:45.333689 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Dec 16 06:59:45 crc kubenswrapper[4823]: I1216 06:59:45.388311 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Dec 16 06:59:45 crc kubenswrapper[4823]: I1216 06:59:45.437936 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Dec 16 06:59:45 crc kubenswrapper[4823]: I1216 06:59:45.478123 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Dec 16 06:59:45 crc kubenswrapper[4823]: I1216 06:59:45.579404 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Dec 16 06:59:45 crc kubenswrapper[4823]: I1216 06:59:45.648261 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Dec 16 06:59:45 crc kubenswrapper[4823]: I1216 06:59:45.717753 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Dec 16 06:59:45 crc kubenswrapper[4823]: I1216 06:59:45.772358 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Dec 16 06:59:45 crc kubenswrapper[4823]: I1216 06:59:45.804972 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Dec 16 06:59:45 crc kubenswrapper[4823]: I1216 06:59:45.820342 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Dec 16 06:59:45 crc kubenswrapper[4823]: I1216 06:59:45.859050 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Dec 16 06:59:45 crc kubenswrapper[4823]: I1216 06:59:45.881237 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Dec 16 06:59:46 crc kubenswrapper[4823]: I1216 06:59:46.012082 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Dec 16 06:59:46 crc kubenswrapper[4823]: I1216 06:59:46.069693 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Dec 16 06:59:46 crc kubenswrapper[4823]: I1216 06:59:46.085233 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Dec 16 06:59:46 crc kubenswrapper[4823]: I1216 06:59:46.085514 4823 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Dec 16 06:59:46 crc kubenswrapper[4823]: I1216 06:59:46.086552 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Dec 16 06:59:46 crc kubenswrapper[4823]: I1216 06:59:46.097671 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Dec 16 06:59:46 crc kubenswrapper[4823]: I1216 06:59:46.132915 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Dec 16 06:59:46 crc kubenswrapper[4823]: I1216 06:59:46.316441 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Dec 16 06:59:46 crc kubenswrapper[4823]: I1216 06:59:46.463821 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Dec 16 06:59:46 crc kubenswrapper[4823]: I1216 06:59:46.465452 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Dec 16 06:59:46 crc kubenswrapper[4823]: I1216 06:59:46.483081 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Dec 16 06:59:46 crc kubenswrapper[4823]: I1216 06:59:46.515461 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Dec 16 06:59:46 crc kubenswrapper[4823]: I1216 06:59:46.588903 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Dec 16 06:59:46 crc kubenswrapper[4823]: I1216 06:59:46.670441 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-55889b984c-zdsfp"]
Dec 16 06:59:46 crc kubenswrapper[4823]: E1216 06:59:46.670748 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa03a70b-1afb-4d1b-8bec-7a302a382b7d" containerName="installer"
Dec 16 06:59:46 crc kubenswrapper[4823]: I1216 06:59:46.670777 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa03a70b-1afb-4d1b-8bec-7a302a382b7d" containerName="installer"
Dec 16 06:59:46 crc kubenswrapper[4823]: E1216 06:59:46.670803 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60b58907-b6e9-4a6d-b442-9c79d839bac9" containerName="oauth-openshift"
Dec 16 06:59:46 crc kubenswrapper[4823]: I1216 06:59:46.670812 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="60b58907-b6e9-4a6d-b442-9c79d839bac9" containerName="oauth-openshift"
Dec 16 06:59:46 crc kubenswrapper[4823]: I1216 06:59:46.670945 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="60b58907-b6e9-4a6d-b442-9c79d839bac9" containerName="oauth-openshift"
Dec 16 06:59:46 crc kubenswrapper[4823]: I1216 06:59:46.670965 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa03a70b-1afb-4d1b-8bec-7a302a382b7d" containerName="installer"
Dec 16 06:59:46 crc kubenswrapper[4823]: I1216 06:59:46.671535 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-55889b984c-zdsfp"
Dec 16 06:59:46 crc kubenswrapper[4823]: I1216 06:59:46.677594 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Dec 16 06:59:46 crc kubenswrapper[4823]: I1216 06:59:46.677600 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Dec 16 06:59:46 crc kubenswrapper[4823]: I1216 06:59:46.677739 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Dec 16 06:59:46 crc kubenswrapper[4823]: I1216 06:59:46.677750 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Dec 16 06:59:46 crc kubenswrapper[4823]: I1216 06:59:46.678158 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Dec 16 06:59:46 crc kubenswrapper[4823]: I1216 06:59:46.678338 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Dec 16 06:59:46 crc kubenswrapper[4823]: I1216 06:59:46.678444 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Dec 16 06:59:46 crc kubenswrapper[4823]: I1216 06:59:46.678476 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Dec 16 06:59:46 crc kubenswrapper[4823]: I1216 06:59:46.678879 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Dec 16 06:59:46 crc kubenswrapper[4823]: I1216 06:59:46.678893 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Dec 16 06:59:46 crc kubenswrapper[4823]: I1216 06:59:46.687741 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Dec 16 06:59:46 crc kubenswrapper[4823]: I1216 06:59:46.687922 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Dec 16 06:59:46 crc kubenswrapper[4823]: I1216 06:59:46.698872 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Dec 16 06:59:46 crc kubenswrapper[4823]: I1216 06:59:46.701606 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Dec 16 06:59:46 crc kubenswrapper[4823]: I1216 06:59:46.709938 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Dec 16 06:59:46 crc kubenswrapper[4823]: I1216 06:59:46.720829 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Dec 16 06:59:46 crc kubenswrapper[4823]: I1216 06:59:46.766632 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/6b7c5267-dad3-4e0e-b043-f1f3db56db69-v4-0-config-user-template-error\") pod \"oauth-openshift-55889b984c-zdsfp\" (UID: \"6b7c5267-dad3-4e0e-b043-f1f3db56db69\") " pod="openshift-authentication/oauth-openshift-55889b984c-zdsfp"
Dec 16 06:59:46 crc kubenswrapper[4823]: I1216 06:59:46.766753 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6b7c5267-dad3-4e0e-b043-f1f3db56db69-audit-dir\") pod \"oauth-openshift-55889b984c-zdsfp\" (UID: \"6b7c5267-dad3-4e0e-b043-f1f3db56db69\") " pod="openshift-authentication/oauth-openshift-55889b984c-zdsfp"
Dec 16 06:59:46 crc kubenswrapper[4823]: I1216 06:59:46.766787 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/6b7c5267-dad3-4e0e-b043-f1f3db56db69-v4-0-config-system-serving-cert\") pod \"oauth-openshift-55889b984c-zdsfp\" (UID: \"6b7c5267-dad3-4e0e-b043-f1f3db56db69\") " pod="openshift-authentication/oauth-openshift-55889b984c-zdsfp"
Dec 16 06:59:46 crc kubenswrapper[4823]: I1216 06:59:46.766897 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/6b7c5267-dad3-4e0e-b043-f1f3db56db69-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-55889b984c-zdsfp\" (UID: \"6b7c5267-dad3-4e0e-b043-f1f3db56db69\") " pod="openshift-authentication/oauth-openshift-55889b984c-zdsfp"
Dec 16 06:59:46 crc kubenswrapper[4823]: I1216 06:59:46.766969 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/6b7c5267-dad3-4e0e-b043-f1f3db56db69-v4-0-config-system-session\") pod \"oauth-openshift-55889b984c-zdsfp\" (UID: \"6b7c5267-dad3-4e0e-b043-f1f3db56db69\") " pod="openshift-authentication/oauth-openshift-55889b984c-zdsfp"
Dec 16 06:59:46 crc kubenswrapper[4823]: I1216 06:59:46.766999 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6b7c5267-dad3-4e0e-b043-f1f3db56db69-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-55889b984c-zdsfp\" (UID: \"6b7c5267-dad3-4e0e-b043-f1f3db56db69\") " pod="openshift-authentication/oauth-openshift-55889b984c-zdsfp"
Dec 16 06:59:46 crc kubenswrapper[4823]: I1216 06:59:46.767040 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/6b7c5267-dad3-4e0e-b043-f1f3db56db69-v4-0-config-system-router-certs\") pod \"oauth-openshift-55889b984c-zdsfp\" (UID: \"6b7c5267-dad3-4e0e-b043-f1f3db56db69\") " pod="openshift-authentication/oauth-openshift-55889b984c-zdsfp"
Dec 16 06:59:46 crc kubenswrapper[4823]: I1216 06:59:46.767062 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/6b7c5267-dad3-4e0e-b043-f1f3db56db69-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-55889b984c-zdsfp\" (UID: \"6b7c5267-dad3-4e0e-b043-f1f3db56db69\") " pod="openshift-authentication/oauth-openshift-55889b984c-zdsfp"
Dec 16 06:59:46 crc kubenswrapper[4823]: I1216 06:59:46.767083 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/6b7c5267-dad3-4e0e-b043-f1f3db56db69-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-55889b984c-zdsfp\" (UID: \"6b7c5267-dad3-4e0e-b043-f1f3db56db69\") " pod="openshift-authentication/oauth-openshift-55889b984c-zdsfp"
Dec 16 06:59:46 crc kubenswrapper[4823]: I1216 06:59:46.767108 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/6b7c5267-dad3-4e0e-b043-f1f3db56db69-v4-0-config-system-cliconfig\") pod \"oauth-openshift-55889b984c-zdsfp\" (UID: \"6b7c5267-dad3-4e0e-b043-f1f3db56db69\") " pod="openshift-authentication/oauth-openshift-55889b984c-zdsfp"
Dec 16 06:59:46 crc kubenswrapper[4823]: I1216 06:59:46.767128 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/6b7c5267-dad3-4e0e-b043-f1f3db56db69-v4-0-config-system-service-ca\") pod \"oauth-openshift-55889b984c-zdsfp\" (UID: \"6b7c5267-dad3-4e0e-b043-f1f3db56db69\") " pod="openshift-authentication/oauth-openshift-55889b984c-zdsfp"
Dec 16 06:59:46 crc kubenswrapper[4823]: I1216 06:59:46.767169 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6b7c5267-dad3-4e0e-b043-f1f3db56db69-audit-policies\") pod \"oauth-openshift-55889b984c-zdsfp\" (UID: \"6b7c5267-dad3-4e0e-b043-f1f3db56db69\") " pod="openshift-authentication/oauth-openshift-55889b984c-zdsfp"
Dec 16 06:59:46 crc kubenswrapper[4823]: I1216 06:59:46.767194 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/6b7c5267-dad3-4e0e-b043-f1f3db56db69-v4-0-config-user-template-login\") pod \"oauth-openshift-55889b984c-zdsfp\" (UID: \"6b7c5267-dad3-4e0e-b043-f1f3db56db69\") " pod="openshift-authentication/oauth-openshift-55889b984c-zdsfp"
Dec 16 06:59:46 crc kubenswrapper[4823]: I1216 06:59:46.767216 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxwkk\" (UniqueName: \"kubernetes.io/projected/6b7c5267-dad3-4e0e-b043-f1f3db56db69-kube-api-access-qxwkk\") pod \"oauth-openshift-55889b984c-zdsfp\" (UID: \"6b7c5267-dad3-4e0e-b043-f1f3db56db69\") " pod="openshift-authentication/oauth-openshift-55889b984c-zdsfp"
Dec 16 06:59:46 crc kubenswrapper[4823]: I1216
06:59:46.830995 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 16 06:59:46 crc kubenswrapper[4823]: I1216 06:59:46.868739 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/6b7c5267-dad3-4e0e-b043-f1f3db56db69-v4-0-config-user-template-error\") pod \"oauth-openshift-55889b984c-zdsfp\" (UID: \"6b7c5267-dad3-4e0e-b043-f1f3db56db69\") " pod="openshift-authentication/oauth-openshift-55889b984c-zdsfp" Dec 16 06:59:46 crc kubenswrapper[4823]: I1216 06:59:46.868794 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6b7c5267-dad3-4e0e-b043-f1f3db56db69-audit-dir\") pod \"oauth-openshift-55889b984c-zdsfp\" (UID: \"6b7c5267-dad3-4e0e-b043-f1f3db56db69\") " pod="openshift-authentication/oauth-openshift-55889b984c-zdsfp" Dec 16 06:59:46 crc kubenswrapper[4823]: I1216 06:59:46.868816 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/6b7c5267-dad3-4e0e-b043-f1f3db56db69-v4-0-config-system-serving-cert\") pod \"oauth-openshift-55889b984c-zdsfp\" (UID: \"6b7c5267-dad3-4e0e-b043-f1f3db56db69\") " pod="openshift-authentication/oauth-openshift-55889b984c-zdsfp" Dec 16 06:59:46 crc kubenswrapper[4823]: I1216 06:59:46.868851 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/6b7c5267-dad3-4e0e-b043-f1f3db56db69-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-55889b984c-zdsfp\" (UID: \"6b7c5267-dad3-4e0e-b043-f1f3db56db69\") " pod="openshift-authentication/oauth-openshift-55889b984c-zdsfp" Dec 16 06:59:46 crc kubenswrapper[4823]: I1216 06:59:46.868879 4823 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6b7c5267-dad3-4e0e-b043-f1f3db56db69-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-55889b984c-zdsfp\" (UID: \"6b7c5267-dad3-4e0e-b043-f1f3db56db69\") " pod="openshift-authentication/oauth-openshift-55889b984c-zdsfp" Dec 16 06:59:46 crc kubenswrapper[4823]: I1216 06:59:46.868901 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/6b7c5267-dad3-4e0e-b043-f1f3db56db69-v4-0-config-system-session\") pod \"oauth-openshift-55889b984c-zdsfp\" (UID: \"6b7c5267-dad3-4e0e-b043-f1f3db56db69\") " pod="openshift-authentication/oauth-openshift-55889b984c-zdsfp" Dec 16 06:59:46 crc kubenswrapper[4823]: I1216 06:59:46.868920 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/6b7c5267-dad3-4e0e-b043-f1f3db56db69-v4-0-config-system-router-certs\") pod \"oauth-openshift-55889b984c-zdsfp\" (UID: \"6b7c5267-dad3-4e0e-b043-f1f3db56db69\") " pod="openshift-authentication/oauth-openshift-55889b984c-zdsfp" Dec 16 06:59:46 crc kubenswrapper[4823]: I1216 06:59:46.868946 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/6b7c5267-dad3-4e0e-b043-f1f3db56db69-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-55889b984c-zdsfp\" (UID: \"6b7c5267-dad3-4e0e-b043-f1f3db56db69\") " pod="openshift-authentication/oauth-openshift-55889b984c-zdsfp" Dec 16 06:59:46 crc kubenswrapper[4823]: I1216 06:59:46.868963 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/6b7c5267-dad3-4e0e-b043-f1f3db56db69-v4-0-config-user-idp-0-file-data\") 
pod \"oauth-openshift-55889b984c-zdsfp\" (UID: \"6b7c5267-dad3-4e0e-b043-f1f3db56db69\") " pod="openshift-authentication/oauth-openshift-55889b984c-zdsfp" Dec 16 06:59:46 crc kubenswrapper[4823]: I1216 06:59:46.868987 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/6b7c5267-dad3-4e0e-b043-f1f3db56db69-v4-0-config-system-cliconfig\") pod \"oauth-openshift-55889b984c-zdsfp\" (UID: \"6b7c5267-dad3-4e0e-b043-f1f3db56db69\") " pod="openshift-authentication/oauth-openshift-55889b984c-zdsfp" Dec 16 06:59:46 crc kubenswrapper[4823]: I1216 06:59:46.869008 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/6b7c5267-dad3-4e0e-b043-f1f3db56db69-v4-0-config-system-service-ca\") pod \"oauth-openshift-55889b984c-zdsfp\" (UID: \"6b7c5267-dad3-4e0e-b043-f1f3db56db69\") " pod="openshift-authentication/oauth-openshift-55889b984c-zdsfp" Dec 16 06:59:46 crc kubenswrapper[4823]: I1216 06:59:46.869065 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6b7c5267-dad3-4e0e-b043-f1f3db56db69-audit-policies\") pod \"oauth-openshift-55889b984c-zdsfp\" (UID: \"6b7c5267-dad3-4e0e-b043-f1f3db56db69\") " pod="openshift-authentication/oauth-openshift-55889b984c-zdsfp" Dec 16 06:59:46 crc kubenswrapper[4823]: I1216 06:59:46.869093 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/6b7c5267-dad3-4e0e-b043-f1f3db56db69-v4-0-config-user-template-login\") pod \"oauth-openshift-55889b984c-zdsfp\" (UID: \"6b7c5267-dad3-4e0e-b043-f1f3db56db69\") " pod="openshift-authentication/oauth-openshift-55889b984c-zdsfp" Dec 16 06:59:46 crc kubenswrapper[4823]: I1216 06:59:46.869111 4823 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-qxwkk\" (UniqueName: \"kubernetes.io/projected/6b7c5267-dad3-4e0e-b043-f1f3db56db69-kube-api-access-qxwkk\") pod \"oauth-openshift-55889b984c-zdsfp\" (UID: \"6b7c5267-dad3-4e0e-b043-f1f3db56db69\") " pod="openshift-authentication/oauth-openshift-55889b984c-zdsfp" Dec 16 06:59:46 crc kubenswrapper[4823]: I1216 06:59:46.869332 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6b7c5267-dad3-4e0e-b043-f1f3db56db69-audit-dir\") pod \"oauth-openshift-55889b984c-zdsfp\" (UID: \"6b7c5267-dad3-4e0e-b043-f1f3db56db69\") " pod="openshift-authentication/oauth-openshift-55889b984c-zdsfp" Dec 16 06:59:46 crc kubenswrapper[4823]: I1216 06:59:46.870632 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6b7c5267-dad3-4e0e-b043-f1f3db56db69-audit-policies\") pod \"oauth-openshift-55889b984c-zdsfp\" (UID: \"6b7c5267-dad3-4e0e-b043-f1f3db56db69\") " pod="openshift-authentication/oauth-openshift-55889b984c-zdsfp" Dec 16 06:59:46 crc kubenswrapper[4823]: I1216 06:59:46.870663 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6b7c5267-dad3-4e0e-b043-f1f3db56db69-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-55889b984c-zdsfp\" (UID: \"6b7c5267-dad3-4e0e-b043-f1f3db56db69\") " pod="openshift-authentication/oauth-openshift-55889b984c-zdsfp" Dec 16 06:59:46 crc kubenswrapper[4823]: I1216 06:59:46.870970 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/6b7c5267-dad3-4e0e-b043-f1f3db56db69-v4-0-config-system-cliconfig\") pod \"oauth-openshift-55889b984c-zdsfp\" (UID: \"6b7c5267-dad3-4e0e-b043-f1f3db56db69\") " 
pod="openshift-authentication/oauth-openshift-55889b984c-zdsfp" Dec 16 06:59:46 crc kubenswrapper[4823]: I1216 06:59:46.871636 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/6b7c5267-dad3-4e0e-b043-f1f3db56db69-v4-0-config-system-service-ca\") pod \"oauth-openshift-55889b984c-zdsfp\" (UID: \"6b7c5267-dad3-4e0e-b043-f1f3db56db69\") " pod="openshift-authentication/oauth-openshift-55889b984c-zdsfp" Dec 16 06:59:46 crc kubenswrapper[4823]: I1216 06:59:46.877602 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 16 06:59:46 crc kubenswrapper[4823]: I1216 06:59:46.878423 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/6b7c5267-dad3-4e0e-b043-f1f3db56db69-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-55889b984c-zdsfp\" (UID: \"6b7c5267-dad3-4e0e-b043-f1f3db56db69\") " pod="openshift-authentication/oauth-openshift-55889b984c-zdsfp" Dec 16 06:59:46 crc kubenswrapper[4823]: I1216 06:59:46.878468 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/6b7c5267-dad3-4e0e-b043-f1f3db56db69-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-55889b984c-zdsfp\" (UID: \"6b7c5267-dad3-4e0e-b043-f1f3db56db69\") " pod="openshift-authentication/oauth-openshift-55889b984c-zdsfp" Dec 16 06:59:46 crc kubenswrapper[4823]: I1216 06:59:46.880808 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/6b7c5267-dad3-4e0e-b043-f1f3db56db69-v4-0-config-user-template-error\") pod \"oauth-openshift-55889b984c-zdsfp\" (UID: \"6b7c5267-dad3-4e0e-b043-f1f3db56db69\") " 
pod="openshift-authentication/oauth-openshift-55889b984c-zdsfp" Dec 16 06:59:46 crc kubenswrapper[4823]: I1216 06:59:46.881898 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 16 06:59:46 crc kubenswrapper[4823]: I1216 06:59:46.884908 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/6b7c5267-dad3-4e0e-b043-f1f3db56db69-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-55889b984c-zdsfp\" (UID: \"6b7c5267-dad3-4e0e-b043-f1f3db56db69\") " pod="openshift-authentication/oauth-openshift-55889b984c-zdsfp" Dec 16 06:59:46 crc kubenswrapper[4823]: I1216 06:59:46.886537 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/6b7c5267-dad3-4e0e-b043-f1f3db56db69-v4-0-config-system-serving-cert\") pod \"oauth-openshift-55889b984c-zdsfp\" (UID: \"6b7c5267-dad3-4e0e-b043-f1f3db56db69\") " pod="openshift-authentication/oauth-openshift-55889b984c-zdsfp" Dec 16 06:59:46 crc kubenswrapper[4823]: I1216 06:59:46.889586 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/6b7c5267-dad3-4e0e-b043-f1f3db56db69-v4-0-config-system-router-certs\") pod \"oauth-openshift-55889b984c-zdsfp\" (UID: \"6b7c5267-dad3-4e0e-b043-f1f3db56db69\") " pod="openshift-authentication/oauth-openshift-55889b984c-zdsfp" Dec 16 06:59:46 crc kubenswrapper[4823]: I1216 06:59:46.899725 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxwkk\" (UniqueName: \"kubernetes.io/projected/6b7c5267-dad3-4e0e-b043-f1f3db56db69-kube-api-access-qxwkk\") pod \"oauth-openshift-55889b984c-zdsfp\" (UID: \"6b7c5267-dad3-4e0e-b043-f1f3db56db69\") " 
pod="openshift-authentication/oauth-openshift-55889b984c-zdsfp" Dec 16 06:59:46 crc kubenswrapper[4823]: I1216 06:59:46.900744 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/6b7c5267-dad3-4e0e-b043-f1f3db56db69-v4-0-config-user-template-login\") pod \"oauth-openshift-55889b984c-zdsfp\" (UID: \"6b7c5267-dad3-4e0e-b043-f1f3db56db69\") " pod="openshift-authentication/oauth-openshift-55889b984c-zdsfp" Dec 16 06:59:46 crc kubenswrapper[4823]: I1216 06:59:46.907621 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/6b7c5267-dad3-4e0e-b043-f1f3db56db69-v4-0-config-system-session\") pod \"oauth-openshift-55889b984c-zdsfp\" (UID: \"6b7c5267-dad3-4e0e-b043-f1f3db56db69\") " pod="openshift-authentication/oauth-openshift-55889b984c-zdsfp" Dec 16 06:59:46 crc kubenswrapper[4823]: I1216 06:59:46.908000 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 16 06:59:46 crc kubenswrapper[4823]: I1216 06:59:46.984889 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 16 06:59:46 crc kubenswrapper[4823]: I1216 06:59:46.991785 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-55889b984c-zdsfp" Dec 16 06:59:47 crc kubenswrapper[4823]: I1216 06:59:47.017733 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 16 06:59:47 crc kubenswrapper[4823]: I1216 06:59:47.062938 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 16 06:59:47 crc kubenswrapper[4823]: I1216 06:59:47.063608 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 16 06:59:47 crc kubenswrapper[4823]: I1216 06:59:47.065381 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 16 06:59:47 crc kubenswrapper[4823]: I1216 06:59:47.072400 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 16 06:59:47 crc kubenswrapper[4823]: I1216 06:59:47.085801 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 16 06:59:47 crc kubenswrapper[4823]: I1216 06:59:47.172956 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 16 06:59:47 crc kubenswrapper[4823]: I1216 06:59:47.182049 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 16 06:59:47 crc kubenswrapper[4823]: I1216 06:59:47.295253 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Dec 16 06:59:47 crc kubenswrapper[4823]: I1216 06:59:47.350901 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 16 
06:59:47 crc kubenswrapper[4823]: I1216 06:59:47.532812 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 16 06:59:47 crc kubenswrapper[4823]: I1216 06:59:47.587627 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 16 06:59:47 crc kubenswrapper[4823]: I1216 06:59:47.673958 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 16 06:59:47 crc kubenswrapper[4823]: I1216 06:59:47.702666 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 16 06:59:47 crc kubenswrapper[4823]: I1216 06:59:47.702815 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 16 06:59:47 crc kubenswrapper[4823]: I1216 06:59:47.735868 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 16 06:59:47 crc kubenswrapper[4823]: I1216 06:59:47.751789 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 16 06:59:47 crc kubenswrapper[4823]: I1216 06:59:47.855307 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 16 06:59:48 crc kubenswrapper[4823]: I1216 06:59:48.185018 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 16 06:59:48 crc kubenswrapper[4823]: I1216 06:59:48.235107 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 16 06:59:48 crc kubenswrapper[4823]: I1216 06:59:48.274754 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" 
Dec 16 06:59:48 crc kubenswrapper[4823]: I1216 06:59:48.339653 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 16 06:59:48 crc kubenswrapper[4823]: I1216 06:59:48.371688 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 16 06:59:48 crc kubenswrapper[4823]: I1216 06:59:48.478121 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 16 06:59:48 crc kubenswrapper[4823]: I1216 06:59:48.524893 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 16 06:59:48 crc kubenswrapper[4823]: I1216 06:59:48.588173 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 16 06:59:48 crc kubenswrapper[4823]: I1216 06:59:48.596838 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 16 06:59:48 crc kubenswrapper[4823]: I1216 06:59:48.635463 4823 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 16 06:59:48 crc kubenswrapper[4823]: I1216 06:59:48.683266 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 16 06:59:48 crc kubenswrapper[4823]: I1216 06:59:48.739447 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 16 06:59:48 crc kubenswrapper[4823]: I1216 06:59:48.763623 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 16 06:59:48 crc kubenswrapper[4823]: I1216 06:59:48.935688 4823 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 16 06:59:48 crc kubenswrapper[4823]: I1216 06:59:48.957318 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 16 06:59:49 crc kubenswrapper[4823]: I1216 06:59:49.019284 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 16 06:59:49 crc kubenswrapper[4823]: I1216 06:59:49.044250 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 16 06:59:49 crc kubenswrapper[4823]: I1216 06:59:49.104890 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 16 06:59:49 crc kubenswrapper[4823]: I1216 06:59:49.186874 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 16 06:59:49 crc kubenswrapper[4823]: I1216 06:59:49.221205 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 16 06:59:49 crc kubenswrapper[4823]: I1216 06:59:49.286169 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 16 06:59:49 crc kubenswrapper[4823]: I1216 06:59:49.294175 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 16 06:59:49 crc kubenswrapper[4823]: I1216 06:59:49.297654 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 16 06:59:49 crc kubenswrapper[4823]: I1216 06:59:49.338908 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 16 06:59:49 crc 
kubenswrapper[4823]: I1216 06:59:49.380394 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 16 06:59:49 crc kubenswrapper[4823]: I1216 06:59:49.471318 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 16 06:59:49 crc kubenswrapper[4823]: I1216 06:59:49.493219 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 16 06:59:49 crc kubenswrapper[4823]: I1216 06:59:49.610614 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 16 06:59:49 crc kubenswrapper[4823]: I1216 06:59:49.642291 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 16 06:59:49 crc kubenswrapper[4823]: I1216 06:59:49.717996 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 16 06:59:49 crc kubenswrapper[4823]: I1216 06:59:49.926129 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 16 06:59:50 crc kubenswrapper[4823]: I1216 06:59:50.037460 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 16 06:59:50 crc kubenswrapper[4823]: I1216 06:59:50.061167 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Dec 16 06:59:50 crc kubenswrapper[4823]: I1216 06:59:50.068181 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 16 06:59:50 crc kubenswrapper[4823]: I1216 06:59:50.114278 4823 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-console"/"default-dockercfg-chnjx" Dec 16 06:59:50 crc kubenswrapper[4823]: I1216 06:59:50.174308 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 16 06:59:50 crc kubenswrapper[4823]: I1216 06:59:50.215346 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 16 06:59:50 crc kubenswrapper[4823]: I1216 06:59:50.276394 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 16 06:59:50 crc kubenswrapper[4823]: I1216 06:59:50.357814 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 16 06:59:50 crc kubenswrapper[4823]: I1216 06:59:50.373800 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 16 06:59:50 crc kubenswrapper[4823]: I1216 06:59:50.425873 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 16 06:59:50 crc kubenswrapper[4823]: I1216 06:59:50.490565 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Dec 16 06:59:50 crc kubenswrapper[4823]: I1216 06:59:50.527238 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 16 06:59:50 crc kubenswrapper[4823]: I1216 06:59:50.572260 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 16 06:59:50 crc kubenswrapper[4823]: I1216 06:59:50.597807 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 16 06:59:50 crc kubenswrapper[4823]: I1216 06:59:50.654321 4823 
reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 16 06:59:50 crc kubenswrapper[4823]: I1216 06:59:50.678816 4823 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 16 06:59:50 crc kubenswrapper[4823]: I1216 06:59:50.679096 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://bd36e2afeb0241905f24bb9c075785a72a957b030006c865b30a399c07a52366" gracePeriod=5 Dec 16 06:59:50 crc kubenswrapper[4823]: I1216 06:59:50.707453 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 16 06:59:50 crc kubenswrapper[4823]: I1216 06:59:50.892790 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 16 06:59:51 crc kubenswrapper[4823]: I1216 06:59:51.080129 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 16 06:59:51 crc kubenswrapper[4823]: I1216 06:59:51.116419 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 16 06:59:51 crc kubenswrapper[4823]: I1216 06:59:51.135273 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Dec 16 06:59:51 crc kubenswrapper[4823]: I1216 06:59:51.180282 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 16 06:59:51 crc kubenswrapper[4823]: I1216 06:59:51.214342 4823 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Dec 16 06:59:51 crc kubenswrapper[4823]: I1216 06:59:51.246619 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 16 06:59:51 crc kubenswrapper[4823]: I1216 06:59:51.459550 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Dec 16 06:59:51 crc kubenswrapper[4823]: I1216 06:59:51.463519 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 16 06:59:51 crc kubenswrapper[4823]: I1216 06:59:51.479363 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 16 06:59:51 crc kubenswrapper[4823]: I1216 06:59:51.562269 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 16 06:59:51 crc kubenswrapper[4823]: I1216 06:59:51.583630 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 16 06:59:51 crc kubenswrapper[4823]: I1216 06:59:51.665063 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 16 06:59:51 crc kubenswrapper[4823]: I1216 06:59:51.695750 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 16 06:59:51 crc kubenswrapper[4823]: I1216 06:59:51.745044 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 16 06:59:51 crc kubenswrapper[4823]: I1216 06:59:51.873975 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 16 06:59:51 crc kubenswrapper[4823]: 
I1216 06:59:51.876497 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 16 06:59:51 crc kubenswrapper[4823]: I1216 06:59:51.974319 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 16 06:59:52 crc kubenswrapper[4823]: I1216 06:59:52.009631 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 16 06:59:52 crc kubenswrapper[4823]: I1216 06:59:52.196458 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 16 06:59:52 crc kubenswrapper[4823]: I1216 06:59:52.197485 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 16 06:59:52 crc kubenswrapper[4823]: I1216 06:59:52.356109 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 16 06:59:52 crc kubenswrapper[4823]: I1216 06:59:52.434327 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 16 06:59:52 crc kubenswrapper[4823]: I1216 06:59:52.516960 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 16 06:59:52 crc kubenswrapper[4823]: I1216 06:59:52.537773 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 16 06:59:52 crc kubenswrapper[4823]: I1216 06:59:52.552233 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 16 06:59:52 crc kubenswrapper[4823]: I1216 06:59:52.603245 4823 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-etcd-operator"/"etcd-operator-config" Dec 16 06:59:52 crc kubenswrapper[4823]: I1216 06:59:52.624191 4823 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Dec 16 06:59:52 crc kubenswrapper[4823]: I1216 06:59:52.776385 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 16 06:59:52 crc kubenswrapper[4823]: I1216 06:59:52.795679 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 16 06:59:52 crc kubenswrapper[4823]: I1216 06:59:52.828242 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 16 06:59:53 crc kubenswrapper[4823]: I1216 06:59:53.001188 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 16 06:59:53 crc kubenswrapper[4823]: I1216 06:59:53.009400 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 16 06:59:53 crc kubenswrapper[4823]: I1216 06:59:53.090211 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 16 06:59:53 crc kubenswrapper[4823]: I1216 06:59:53.135923 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Dec 16 06:59:53 crc kubenswrapper[4823]: I1216 06:59:53.260700 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 16 06:59:53 crc kubenswrapper[4823]: I1216 06:59:53.385041 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 16 06:59:53 crc kubenswrapper[4823]: I1216 06:59:53.425790 4823 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 16 06:59:53 crc kubenswrapper[4823]: I1216 06:59:53.431626 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Dec 16 06:59:53 crc kubenswrapper[4823]: I1216 06:59:53.730261 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 16 06:59:53 crc kubenswrapper[4823]: I1216 06:59:53.845773 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 16 06:59:53 crc kubenswrapper[4823]: I1216 06:59:53.868897 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 16 06:59:53 crc kubenswrapper[4823]: I1216 06:59:53.995804 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 16 06:59:54 crc kubenswrapper[4823]: I1216 06:59:54.061753 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 16 06:59:54 crc kubenswrapper[4823]: I1216 06:59:54.153679 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 16 06:59:54 crc kubenswrapper[4823]: I1216 06:59:54.155855 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 16 06:59:54 crc kubenswrapper[4823]: I1216 06:59:54.213563 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Dec 16 06:59:54 crc kubenswrapper[4823]: I1216 06:59:54.318956 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 16 
06:59:54 crc kubenswrapper[4823]: I1216 06:59:54.329159 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 16 06:59:54 crc kubenswrapper[4823]: I1216 06:59:54.429262 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 16 06:59:54 crc kubenswrapper[4823]: I1216 06:59:54.461153 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 16 06:59:54 crc kubenswrapper[4823]: I1216 06:59:54.570165 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 16 06:59:54 crc kubenswrapper[4823]: I1216 06:59:54.612655 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 16 06:59:54 crc kubenswrapper[4823]: I1216 06:59:54.616397 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 16 06:59:54 crc kubenswrapper[4823]: I1216 06:59:54.632070 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 16 06:59:54 crc kubenswrapper[4823]: I1216 06:59:54.644227 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-55889b984c-zdsfp"] Dec 16 06:59:54 crc kubenswrapper[4823]: I1216 06:59:54.694087 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 16 06:59:54 crc kubenswrapper[4823]: I1216 06:59:54.725546 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 16 06:59:54 crc kubenswrapper[4823]: I1216 06:59:54.746447 4823 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 16 06:59:54 crc kubenswrapper[4823]: I1216 06:59:54.761558 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 16 06:59:54 crc kubenswrapper[4823]: I1216 06:59:54.978485 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 16 06:59:55 crc kubenswrapper[4823]: I1216 06:59:54.999292 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 16 06:59:55 crc kubenswrapper[4823]: I1216 06:59:55.002081 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 16 06:59:55 crc kubenswrapper[4823]: I1216 06:59:55.006774 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 16 06:59:55 crc kubenswrapper[4823]: I1216 06:59:55.104947 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 16 06:59:55 crc kubenswrapper[4823]: E1216 06:59:55.117271 4823 log.go:32] "RunPodSandbox from runtime service failed" err=< Dec 16 06:59:55 crc kubenswrapper[4823]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-55889b984c-zdsfp_openshift-authentication_6b7c5267-dad3-4e0e-b043-f1f3db56db69_0(86afe16873385f81bc3f305e735c8131422b4a99d7896e7b072000789d3a0083): error adding pod openshift-authentication_oauth-openshift-55889b984c-zdsfp to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"86afe16873385f81bc3f305e735c8131422b4a99d7896e7b072000789d3a0083" Netns:"/var/run/netns/e1728fd5-04db-4135-a27d-2091969a5015" IfName:"eth0" 
Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-55889b984c-zdsfp;K8S_POD_INFRA_CONTAINER_ID=86afe16873385f81bc3f305e735c8131422b4a99d7896e7b072000789d3a0083;K8S_POD_UID=6b7c5267-dad3-4e0e-b043-f1f3db56db69" Path:"" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-55889b984c-zdsfp] networking: Multus: [openshift-authentication/oauth-openshift-55889b984c-zdsfp/6b7c5267-dad3-4e0e-b043-f1f3db56db69]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod oauth-openshift-55889b984c-zdsfp in out of cluster comm: pod "oauth-openshift-55889b984c-zdsfp" not found Dec 16 06:59:55 crc kubenswrapper[4823]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 16 06:59:55 crc kubenswrapper[4823]: > Dec 16 06:59:55 crc kubenswrapper[4823]: E1216 06:59:55.117420 4823 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Dec 16 06:59:55 crc kubenswrapper[4823]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-55889b984c-zdsfp_openshift-authentication_6b7c5267-dad3-4e0e-b043-f1f3db56db69_0(86afe16873385f81bc3f305e735c8131422b4a99d7896e7b072000789d3a0083): error adding pod openshift-authentication_oauth-openshift-55889b984c-zdsfp to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"86afe16873385f81bc3f305e735c8131422b4a99d7896e7b072000789d3a0083" Netns:"/var/run/netns/e1728fd5-04db-4135-a27d-2091969a5015" IfName:"eth0" 
Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-55889b984c-zdsfp;K8S_POD_INFRA_CONTAINER_ID=86afe16873385f81bc3f305e735c8131422b4a99d7896e7b072000789d3a0083;K8S_POD_UID=6b7c5267-dad3-4e0e-b043-f1f3db56db69" Path:"" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-55889b984c-zdsfp] networking: Multus: [openshift-authentication/oauth-openshift-55889b984c-zdsfp/6b7c5267-dad3-4e0e-b043-f1f3db56db69]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod oauth-openshift-55889b984c-zdsfp in out of cluster comm: pod "oauth-openshift-55889b984c-zdsfp" not found Dec 16 06:59:55 crc kubenswrapper[4823]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 16 06:59:55 crc kubenswrapper[4823]: > pod="openshift-authentication/oauth-openshift-55889b984c-zdsfp" Dec 16 06:59:55 crc kubenswrapper[4823]: E1216 06:59:55.117451 4823 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Dec 16 06:59:55 crc kubenswrapper[4823]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-55889b984c-zdsfp_openshift-authentication_6b7c5267-dad3-4e0e-b043-f1f3db56db69_0(86afe16873385f81bc3f305e735c8131422b4a99d7896e7b072000789d3a0083): error adding pod openshift-authentication_oauth-openshift-55889b984c-zdsfp to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"86afe16873385f81bc3f305e735c8131422b4a99d7896e7b072000789d3a0083" 
Netns:"/var/run/netns/e1728fd5-04db-4135-a27d-2091969a5015" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-55889b984c-zdsfp;K8S_POD_INFRA_CONTAINER_ID=86afe16873385f81bc3f305e735c8131422b4a99d7896e7b072000789d3a0083;K8S_POD_UID=6b7c5267-dad3-4e0e-b043-f1f3db56db69" Path:"" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-55889b984c-zdsfp] networking: Multus: [openshift-authentication/oauth-openshift-55889b984c-zdsfp/6b7c5267-dad3-4e0e-b043-f1f3db56db69]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod oauth-openshift-55889b984c-zdsfp in out of cluster comm: pod "oauth-openshift-55889b984c-zdsfp" not found Dec 16 06:59:55 crc kubenswrapper[4823]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 16 06:59:55 crc kubenswrapper[4823]: > pod="openshift-authentication/oauth-openshift-55889b984c-zdsfp" Dec 16 06:59:55 crc kubenswrapper[4823]: E1216 06:59:55.117561 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"oauth-openshift-55889b984c-zdsfp_openshift-authentication(6b7c5267-dad3-4e0e-b043-f1f3db56db69)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"oauth-openshift-55889b984c-zdsfp_openshift-authentication(6b7c5267-dad3-4e0e-b043-f1f3db56db69)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-55889b984c-zdsfp_openshift-authentication_6b7c5267-dad3-4e0e-b043-f1f3db56db69_0(86afe16873385f81bc3f305e735c8131422b4a99d7896e7b072000789d3a0083): error adding pod 
openshift-authentication_oauth-openshift-55889b984c-zdsfp to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"86afe16873385f81bc3f305e735c8131422b4a99d7896e7b072000789d3a0083\\\" Netns:\\\"/var/run/netns/e1728fd5-04db-4135-a27d-2091969a5015\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-55889b984c-zdsfp;K8S_POD_INFRA_CONTAINER_ID=86afe16873385f81bc3f305e735c8131422b4a99d7896e7b072000789d3a0083;K8S_POD_UID=6b7c5267-dad3-4e0e-b043-f1f3db56db69\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-55889b984c-zdsfp] networking: Multus: [openshift-authentication/oauth-openshift-55889b984c-zdsfp/6b7c5267-dad3-4e0e-b043-f1f3db56db69]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod oauth-openshift-55889b984c-zdsfp in out of cluster comm: pod \\\"oauth-openshift-55889b984c-zdsfp\\\" not found\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-authentication/oauth-openshift-55889b984c-zdsfp" podUID="6b7c5267-dad3-4e0e-b043-f1f3db56db69" Dec 16 06:59:55 crc kubenswrapper[4823]: I1216 06:59:55.196220 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 16 06:59:55 crc kubenswrapper[4823]: I1216 06:59:55.219238 4823 reflector.go:368] Caches populated 
for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 16 06:59:55 crc kubenswrapper[4823]: I1216 06:59:55.257679 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 16 06:59:55 crc kubenswrapper[4823]: I1216 06:59:55.348802 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 16 06:59:55 crc kubenswrapper[4823]: I1216 06:59:55.497676 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 16 06:59:55 crc kubenswrapper[4823]: I1216 06:59:55.548702 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-55889b984c-zdsfp" Dec 16 06:59:55 crc kubenswrapper[4823]: I1216 06:59:55.549386 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-55889b984c-zdsfp" Dec 16 06:59:55 crc kubenswrapper[4823]: I1216 06:59:55.573872 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Dec 16 06:59:55 crc kubenswrapper[4823]: I1216 06:59:55.651672 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Dec 16 06:59:55 crc kubenswrapper[4823]: I1216 06:59:55.732578 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 16 06:59:56 crc kubenswrapper[4823]: I1216 06:59:56.286460 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 16 06:59:56 crc kubenswrapper[4823]: I1216 06:59:56.286573 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 16 06:59:56 crc kubenswrapper[4823]: I1216 06:59:56.432040 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 16 06:59:56 crc kubenswrapper[4823]: I1216 06:59:56.432101 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 16 06:59:56 crc kubenswrapper[4823]: I1216 06:59:56.432205 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 06:59:56 crc kubenswrapper[4823]: I1216 06:59:56.432261 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 06:59:56 crc kubenswrapper[4823]: I1216 06:59:56.432225 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 16 06:59:56 crc kubenswrapper[4823]: I1216 06:59:56.432311 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 06:59:56 crc kubenswrapper[4823]: I1216 06:59:56.432396 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 16 06:59:56 crc kubenswrapper[4823]: I1216 06:59:56.432438 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 16 06:59:56 crc kubenswrapper[4823]: I1216 06:59:56.433133 4823 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Dec 16 06:59:56 crc kubenswrapper[4823]: I1216 06:59:56.433157 4823 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" 
Dec 16 06:59:56 crc kubenswrapper[4823]: I1216 06:59:56.433171 4823 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 16 06:59:56 crc kubenswrapper[4823]: I1216 06:59:56.433155 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 06:59:56 crc kubenswrapper[4823]: I1216 06:59:56.443976 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 06:59:56 crc kubenswrapper[4823]: I1216 06:59:56.534619 4823 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 16 06:59:56 crc kubenswrapper[4823]: I1216 06:59:56.534666 4823 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Dec 16 06:59:56 crc kubenswrapper[4823]: I1216 06:59:56.556706 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 16 06:59:56 crc kubenswrapper[4823]: I1216 06:59:56.556779 4823 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="bd36e2afeb0241905f24bb9c075785a72a957b030006c865b30a399c07a52366" exitCode=137 Dec 16 06:59:56 crc kubenswrapper[4823]: I1216 06:59:56.556844 4823 scope.go:117] "RemoveContainer" containerID="bd36e2afeb0241905f24bb9c075785a72a957b030006c865b30a399c07a52366" Dec 16 06:59:56 crc kubenswrapper[4823]: I1216 06:59:56.556887 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 16 06:59:56 crc kubenswrapper[4823]: I1216 06:59:56.582466 4823 scope.go:117] "RemoveContainer" containerID="bd36e2afeb0241905f24bb9c075785a72a957b030006c865b30a399c07a52366" Dec 16 06:59:56 crc kubenswrapper[4823]: I1216 06:59:56.582581 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Dec 16 06:59:56 crc kubenswrapper[4823]: E1216 06:59:56.583302 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd36e2afeb0241905f24bb9c075785a72a957b030006c865b30a399c07a52366\": container with ID starting with bd36e2afeb0241905f24bb9c075785a72a957b030006c865b30a399c07a52366 not found: ID does not exist" containerID="bd36e2afeb0241905f24bb9c075785a72a957b030006c865b30a399c07a52366" Dec 16 06:59:56 crc kubenswrapper[4823]: I1216 06:59:56.583460 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd36e2afeb0241905f24bb9c075785a72a957b030006c865b30a399c07a52366"} err="failed to get container status \"bd36e2afeb0241905f24bb9c075785a72a957b030006c865b30a399c07a52366\": rpc error: code = NotFound desc = could not find container \"bd36e2afeb0241905f24bb9c075785a72a957b030006c865b30a399c07a52366\": container with ID starting with bd36e2afeb0241905f24bb9c075785a72a957b030006c865b30a399c07a52366 not found: ID does not exist" Dec 16 06:59:56 crc kubenswrapper[4823]: I1216 06:59:56.955291 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 16 06:59:57 crc kubenswrapper[4823]: I1216 06:59:57.029512 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 16 06:59:57 crc kubenswrapper[4823]: I1216 06:59:57.492275 4823 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 16 06:59:57 crc kubenswrapper[4823]: I1216 06:59:57.597754 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 16 06:59:57 crc kubenswrapper[4823]: I1216 06:59:57.621365 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-55889b984c-zdsfp"] Dec 16 06:59:57 crc kubenswrapper[4823]: I1216 06:59:57.638668 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 16 06:59:57 crc kubenswrapper[4823]: I1216 06:59:57.780152 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Dec 16 06:59:57 crc kubenswrapper[4823]: I1216 06:59:57.862636 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 16 06:59:58 crc kubenswrapper[4823]: I1216 06:59:58.573932 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-55889b984c-zdsfp" event={"ID":"6b7c5267-dad3-4e0e-b043-f1f3db56db69","Type":"ContainerStarted","Data":"a5d2afc013b1fd611972ead84f0a1fe055677971bc4d84aa9ee4c687e3f70d5a"} Dec 16 06:59:58 crc kubenswrapper[4823]: I1216 06:59:58.574014 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-55889b984c-zdsfp" event={"ID":"6b7c5267-dad3-4e0e-b043-f1f3db56db69","Type":"ContainerStarted","Data":"56a93f042b2f683cdbfbc8ad190772597b21c5a8269055c4bdd4345b7ab791fc"} Dec 16 06:59:58 crc kubenswrapper[4823]: I1216 06:59:58.574319 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-55889b984c-zdsfp" Dec 16 06:59:58 crc kubenswrapper[4823]: I1216 
06:59:58.582542 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-55889b984c-zdsfp" Dec 16 06:59:58 crc kubenswrapper[4823]: I1216 06:59:58.619290 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-55889b984c-zdsfp" podStartSLOduration=52.619245372 podStartE2EDuration="52.619245372s" podCreationTimestamp="2025-12-16 06:59:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:59:58.613479902 +0000 UTC m=+277.102046085" watchObservedRunningTime="2025-12-16 06:59:58.619245372 +0000 UTC m=+277.107811535" Dec 16 07:00:00 crc kubenswrapper[4823]: I1216 07:00:00.186662 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431140-v9vtd"] Dec 16 07:00:00 crc kubenswrapper[4823]: E1216 07:00:00.186947 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 16 07:00:00 crc kubenswrapper[4823]: I1216 07:00:00.186990 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 16 07:00:00 crc kubenswrapper[4823]: I1216 07:00:00.187116 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 16 07:00:00 crc kubenswrapper[4823]: I1216 07:00:00.187665 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431140-v9vtd" Dec 16 07:00:00 crc kubenswrapper[4823]: I1216 07:00:00.191739 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 16 07:00:00 crc kubenswrapper[4823]: I1216 07:00:00.191803 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 16 07:00:00 crc kubenswrapper[4823]: I1216 07:00:00.198186 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431140-v9vtd"] Dec 16 07:00:00 crc kubenswrapper[4823]: I1216 07:00:00.292720 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdsfs\" (UniqueName: \"kubernetes.io/projected/fb000cb4-8b05-4ee9-b6dd-c8797099232b-kube-api-access-vdsfs\") pod \"collect-profiles-29431140-v9vtd\" (UID: \"fb000cb4-8b05-4ee9-b6dd-c8797099232b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431140-v9vtd" Dec 16 07:00:00 crc kubenswrapper[4823]: I1216 07:00:00.293304 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fb000cb4-8b05-4ee9-b6dd-c8797099232b-secret-volume\") pod \"collect-profiles-29431140-v9vtd\" (UID: \"fb000cb4-8b05-4ee9-b6dd-c8797099232b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431140-v9vtd" Dec 16 07:00:00 crc kubenswrapper[4823]: I1216 07:00:00.293372 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fb000cb4-8b05-4ee9-b6dd-c8797099232b-config-volume\") pod \"collect-profiles-29431140-v9vtd\" (UID: \"fb000cb4-8b05-4ee9-b6dd-c8797099232b\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29431140-v9vtd" Dec 16 07:00:00 crc kubenswrapper[4823]: I1216 07:00:00.394493 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vdsfs\" (UniqueName: \"kubernetes.io/projected/fb000cb4-8b05-4ee9-b6dd-c8797099232b-kube-api-access-vdsfs\") pod \"collect-profiles-29431140-v9vtd\" (UID: \"fb000cb4-8b05-4ee9-b6dd-c8797099232b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431140-v9vtd" Dec 16 07:00:00 crc kubenswrapper[4823]: I1216 07:00:00.394596 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fb000cb4-8b05-4ee9-b6dd-c8797099232b-secret-volume\") pod \"collect-profiles-29431140-v9vtd\" (UID: \"fb000cb4-8b05-4ee9-b6dd-c8797099232b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431140-v9vtd" Dec 16 07:00:00 crc kubenswrapper[4823]: I1216 07:00:00.394665 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fb000cb4-8b05-4ee9-b6dd-c8797099232b-config-volume\") pod \"collect-profiles-29431140-v9vtd\" (UID: \"fb000cb4-8b05-4ee9-b6dd-c8797099232b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431140-v9vtd" Dec 16 07:00:00 crc kubenswrapper[4823]: I1216 07:00:00.396207 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fb000cb4-8b05-4ee9-b6dd-c8797099232b-config-volume\") pod \"collect-profiles-29431140-v9vtd\" (UID: \"fb000cb4-8b05-4ee9-b6dd-c8797099232b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431140-v9vtd" Dec 16 07:00:00 crc kubenswrapper[4823]: I1216 07:00:00.404506 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/fb000cb4-8b05-4ee9-b6dd-c8797099232b-secret-volume\") pod \"collect-profiles-29431140-v9vtd\" (UID: \"fb000cb4-8b05-4ee9-b6dd-c8797099232b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431140-v9vtd" Dec 16 07:00:00 crc kubenswrapper[4823]: I1216 07:00:00.413965 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdsfs\" (UniqueName: \"kubernetes.io/projected/fb000cb4-8b05-4ee9-b6dd-c8797099232b-kube-api-access-vdsfs\") pod \"collect-profiles-29431140-v9vtd\" (UID: \"fb000cb4-8b05-4ee9-b6dd-c8797099232b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431140-v9vtd" Dec 16 07:00:00 crc kubenswrapper[4823]: I1216 07:00:00.508661 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431140-v9vtd" Dec 16 07:00:00 crc kubenswrapper[4823]: I1216 07:00:00.704660 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431140-v9vtd"] Dec 16 07:00:01 crc kubenswrapper[4823]: I1216 07:00:01.609006 4823 generic.go:334] "Generic (PLEG): container finished" podID="fb000cb4-8b05-4ee9-b6dd-c8797099232b" containerID="12be26a45e3fc2e61b7ddf0faab639b15d425610e1a28f68a3617f4f7dfa5a3a" exitCode=0 Dec 16 07:00:01 crc kubenswrapper[4823]: I1216 07:00:01.609098 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431140-v9vtd" event={"ID":"fb000cb4-8b05-4ee9-b6dd-c8797099232b","Type":"ContainerDied","Data":"12be26a45e3fc2e61b7ddf0faab639b15d425610e1a28f68a3617f4f7dfa5a3a"} Dec 16 07:00:01 crc kubenswrapper[4823]: I1216 07:00:01.609146 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431140-v9vtd" 
event={"ID":"fb000cb4-8b05-4ee9-b6dd-c8797099232b","Type":"ContainerStarted","Data":"d32d78dc5bc6797b67e691937211489a79bf3bb4047ee1d1e640bb1bb323386c"} Dec 16 07:00:02 crc kubenswrapper[4823]: I1216 07:00:02.841181 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431140-v9vtd" Dec 16 07:00:02 crc kubenswrapper[4823]: I1216 07:00:02.940176 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vdsfs\" (UniqueName: \"kubernetes.io/projected/fb000cb4-8b05-4ee9-b6dd-c8797099232b-kube-api-access-vdsfs\") pod \"fb000cb4-8b05-4ee9-b6dd-c8797099232b\" (UID: \"fb000cb4-8b05-4ee9-b6dd-c8797099232b\") " Dec 16 07:00:02 crc kubenswrapper[4823]: I1216 07:00:02.940229 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fb000cb4-8b05-4ee9-b6dd-c8797099232b-config-volume\") pod \"fb000cb4-8b05-4ee9-b6dd-c8797099232b\" (UID: \"fb000cb4-8b05-4ee9-b6dd-c8797099232b\") " Dec 16 07:00:02 crc kubenswrapper[4823]: I1216 07:00:02.940310 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fb000cb4-8b05-4ee9-b6dd-c8797099232b-secret-volume\") pod \"fb000cb4-8b05-4ee9-b6dd-c8797099232b\" (UID: \"fb000cb4-8b05-4ee9-b6dd-c8797099232b\") " Dec 16 07:00:02 crc kubenswrapper[4823]: I1216 07:00:02.941108 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb000cb4-8b05-4ee9-b6dd-c8797099232b-config-volume" (OuterVolumeSpecName: "config-volume") pod "fb000cb4-8b05-4ee9-b6dd-c8797099232b" (UID: "fb000cb4-8b05-4ee9-b6dd-c8797099232b"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:00:02 crc kubenswrapper[4823]: I1216 07:00:02.946847 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb000cb4-8b05-4ee9-b6dd-c8797099232b-kube-api-access-vdsfs" (OuterVolumeSpecName: "kube-api-access-vdsfs") pod "fb000cb4-8b05-4ee9-b6dd-c8797099232b" (UID: "fb000cb4-8b05-4ee9-b6dd-c8797099232b"). InnerVolumeSpecName "kube-api-access-vdsfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:00:02 crc kubenswrapper[4823]: I1216 07:00:02.946906 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb000cb4-8b05-4ee9-b6dd-c8797099232b-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "fb000cb4-8b05-4ee9-b6dd-c8797099232b" (UID: "fb000cb4-8b05-4ee9-b6dd-c8797099232b"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:00:03 crc kubenswrapper[4823]: I1216 07:00:03.042180 4823 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fb000cb4-8b05-4ee9-b6dd-c8797099232b-config-volume\") on node \"crc\" DevicePath \"\"" Dec 16 07:00:03 crc kubenswrapper[4823]: I1216 07:00:03.042419 4823 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fb000cb4-8b05-4ee9-b6dd-c8797099232b-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 16 07:00:03 crc kubenswrapper[4823]: I1216 07:00:03.042430 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vdsfs\" (UniqueName: \"kubernetes.io/projected/fb000cb4-8b05-4ee9-b6dd-c8797099232b-kube-api-access-vdsfs\") on node \"crc\" DevicePath \"\"" Dec 16 07:00:03 crc kubenswrapper[4823]: I1216 07:00:03.624832 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431140-v9vtd" 
event={"ID":"fb000cb4-8b05-4ee9-b6dd-c8797099232b","Type":"ContainerDied","Data":"d32d78dc5bc6797b67e691937211489a79bf3bb4047ee1d1e640bb1bb323386c"} Dec 16 07:00:03 crc kubenswrapper[4823]: I1216 07:00:03.624907 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d32d78dc5bc6797b67e691937211489a79bf3bb4047ee1d1e640bb1bb323386c" Dec 16 07:00:03 crc kubenswrapper[4823]: I1216 07:00:03.624983 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431140-v9vtd" Dec 16 07:00:13 crc kubenswrapper[4823]: I1216 07:00:13.687583 4823 generic.go:334] "Generic (PLEG): container finished" podID="dca532ee-e66a-411a-afcc-646f96a22a62" containerID="2d358bd6f6c0e8e78ad8d528e42077f33fdee2245475689910ade600668ec0c7" exitCode=0 Dec 16 07:00:13 crc kubenswrapper[4823]: I1216 07:00:13.687673 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-thj57" event={"ID":"dca532ee-e66a-411a-afcc-646f96a22a62","Type":"ContainerDied","Data":"2d358bd6f6c0e8e78ad8d528e42077f33fdee2245475689910ade600668ec0c7"} Dec 16 07:00:13 crc kubenswrapper[4823]: I1216 07:00:13.688876 4823 scope.go:117] "RemoveContainer" containerID="2d358bd6f6c0e8e78ad8d528e42077f33fdee2245475689910ade600668ec0c7" Dec 16 07:00:14 crc kubenswrapper[4823]: I1216 07:00:14.697015 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-thj57" event={"ID":"dca532ee-e66a-411a-afcc-646f96a22a62","Type":"ContainerStarted","Data":"ac61fd69402f631638df30793e222270a5947ed323cb9551aba3e12a30f6fc8d"} Dec 16 07:00:14 crc kubenswrapper[4823]: I1216 07:00:14.698052 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-thj57" Dec 16 07:00:14 crc kubenswrapper[4823]: I1216 07:00:14.703313 4823 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-thj57" Dec 16 07:00:17 crc kubenswrapper[4823]: I1216 07:00:17.869410 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-vmqj6"] Dec 16 07:00:17 crc kubenswrapper[4823]: I1216 07:00:17.870254 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-vmqj6" podUID="8aaa63b4-9b41-442f-b9ea-672885a486bd" containerName="controller-manager" containerID="cri-o://d9fb43391b95eb85ec0a1c9227538d6a4d9cdb8d0245123f19590848e7efefb8" gracePeriod=30 Dec 16 07:00:17 crc kubenswrapper[4823]: I1216 07:00:17.969236 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-plnfh"] Dec 16 07:00:17 crc kubenswrapper[4823]: I1216 07:00:17.969579 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-plnfh" podUID="2dea4f36-2ae5-4363-a65c-0b7346f02661" containerName="route-controller-manager" containerID="cri-o://233be53319aaf06d5c772b9fdc34ca8a3943eb013859ee9f72f645896163fc7d" gracePeriod=30 Dec 16 07:00:18 crc kubenswrapper[4823]: I1216 07:00:18.409772 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-vmqj6" Dec 16 07:00:18 crc kubenswrapper[4823]: I1216 07:00:18.465524 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-plnfh" Dec 16 07:00:18 crc kubenswrapper[4823]: I1216 07:00:18.487376 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8aaa63b4-9b41-442f-b9ea-672885a486bd-serving-cert\") pod \"8aaa63b4-9b41-442f-b9ea-672885a486bd\" (UID: \"8aaa63b4-9b41-442f-b9ea-672885a486bd\") " Dec 16 07:00:18 crc kubenswrapper[4823]: I1216 07:00:18.487436 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8aaa63b4-9b41-442f-b9ea-672885a486bd-client-ca\") pod \"8aaa63b4-9b41-442f-b9ea-672885a486bd\" (UID: \"8aaa63b4-9b41-442f-b9ea-672885a486bd\") " Dec 16 07:00:18 crc kubenswrapper[4823]: I1216 07:00:18.487498 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h9w6b\" (UniqueName: \"kubernetes.io/projected/8aaa63b4-9b41-442f-b9ea-672885a486bd-kube-api-access-h9w6b\") pod \"8aaa63b4-9b41-442f-b9ea-672885a486bd\" (UID: \"8aaa63b4-9b41-442f-b9ea-672885a486bd\") " Dec 16 07:00:18 crc kubenswrapper[4823]: I1216 07:00:18.487610 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8aaa63b4-9b41-442f-b9ea-672885a486bd-proxy-ca-bundles\") pod \"8aaa63b4-9b41-442f-b9ea-672885a486bd\" (UID: \"8aaa63b4-9b41-442f-b9ea-672885a486bd\") " Dec 16 07:00:18 crc kubenswrapper[4823]: I1216 07:00:18.487639 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8aaa63b4-9b41-442f-b9ea-672885a486bd-config\") pod \"8aaa63b4-9b41-442f-b9ea-672885a486bd\" (UID: \"8aaa63b4-9b41-442f-b9ea-672885a486bd\") " Dec 16 07:00:18 crc kubenswrapper[4823]: I1216 07:00:18.489083 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/configmap/8aaa63b4-9b41-442f-b9ea-672885a486bd-config" (OuterVolumeSpecName: "config") pod "8aaa63b4-9b41-442f-b9ea-672885a486bd" (UID: "8aaa63b4-9b41-442f-b9ea-672885a486bd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:00:18 crc kubenswrapper[4823]: I1216 07:00:18.489553 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8aaa63b4-9b41-442f-b9ea-672885a486bd-client-ca" (OuterVolumeSpecName: "client-ca") pod "8aaa63b4-9b41-442f-b9ea-672885a486bd" (UID: "8aaa63b4-9b41-442f-b9ea-672885a486bd"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:00:18 crc kubenswrapper[4823]: I1216 07:00:18.490165 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8aaa63b4-9b41-442f-b9ea-672885a486bd-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "8aaa63b4-9b41-442f-b9ea-672885a486bd" (UID: "8aaa63b4-9b41-442f-b9ea-672885a486bd"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:00:18 crc kubenswrapper[4823]: I1216 07:00:18.496379 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8aaa63b4-9b41-442f-b9ea-672885a486bd-kube-api-access-h9w6b" (OuterVolumeSpecName: "kube-api-access-h9w6b") pod "8aaa63b4-9b41-442f-b9ea-672885a486bd" (UID: "8aaa63b4-9b41-442f-b9ea-672885a486bd"). InnerVolumeSpecName "kube-api-access-h9w6b". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:00:18 crc kubenswrapper[4823]: I1216 07:00:18.496416 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8aaa63b4-9b41-442f-b9ea-672885a486bd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8aaa63b4-9b41-442f-b9ea-672885a486bd" (UID: "8aaa63b4-9b41-442f-b9ea-672885a486bd"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:00:18 crc kubenswrapper[4823]: I1216 07:00:18.588646 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2dea4f36-2ae5-4363-a65c-0b7346f02661-serving-cert\") pod \"2dea4f36-2ae5-4363-a65c-0b7346f02661\" (UID: \"2dea4f36-2ae5-4363-a65c-0b7346f02661\") " Dec 16 07:00:18 crc kubenswrapper[4823]: I1216 07:00:18.588774 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2dea4f36-2ae5-4363-a65c-0b7346f02661-config\") pod \"2dea4f36-2ae5-4363-a65c-0b7346f02661\" (UID: \"2dea4f36-2ae5-4363-a65c-0b7346f02661\") " Dec 16 07:00:18 crc kubenswrapper[4823]: I1216 07:00:18.588867 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4ws45\" (UniqueName: \"kubernetes.io/projected/2dea4f36-2ae5-4363-a65c-0b7346f02661-kube-api-access-4ws45\") pod \"2dea4f36-2ae5-4363-a65c-0b7346f02661\" (UID: \"2dea4f36-2ae5-4363-a65c-0b7346f02661\") " Dec 16 07:00:18 crc kubenswrapper[4823]: I1216 07:00:18.588891 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2dea4f36-2ae5-4363-a65c-0b7346f02661-client-ca\") pod \"2dea4f36-2ae5-4363-a65c-0b7346f02661\" (UID: \"2dea4f36-2ae5-4363-a65c-0b7346f02661\") " Dec 16 07:00:18 crc kubenswrapper[4823]: I1216 07:00:18.589122 4823 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8aaa63b4-9b41-442f-b9ea-672885a486bd-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 16 07:00:18 crc kubenswrapper[4823]: I1216 07:00:18.589135 4823 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8aaa63b4-9b41-442f-b9ea-672885a486bd-config\") on node \"crc\" DevicePath \"\"" Dec 16 07:00:18 crc 
kubenswrapper[4823]: I1216 07:00:18.589144 4823 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8aaa63b4-9b41-442f-b9ea-672885a486bd-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 07:00:18 crc kubenswrapper[4823]: I1216 07:00:18.589152 4823 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8aaa63b4-9b41-442f-b9ea-672885a486bd-client-ca\") on node \"crc\" DevicePath \"\"" Dec 16 07:00:18 crc kubenswrapper[4823]: I1216 07:00:18.589161 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h9w6b\" (UniqueName: \"kubernetes.io/projected/8aaa63b4-9b41-442f-b9ea-672885a486bd-kube-api-access-h9w6b\") on node \"crc\" DevicePath \"\"" Dec 16 07:00:18 crc kubenswrapper[4823]: I1216 07:00:18.589843 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2dea4f36-2ae5-4363-a65c-0b7346f02661-client-ca" (OuterVolumeSpecName: "client-ca") pod "2dea4f36-2ae5-4363-a65c-0b7346f02661" (UID: "2dea4f36-2ae5-4363-a65c-0b7346f02661"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:00:18 crc kubenswrapper[4823]: I1216 07:00:18.589880 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2dea4f36-2ae5-4363-a65c-0b7346f02661-config" (OuterVolumeSpecName: "config") pod "2dea4f36-2ae5-4363-a65c-0b7346f02661" (UID: "2dea4f36-2ae5-4363-a65c-0b7346f02661"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:00:18 crc kubenswrapper[4823]: I1216 07:00:18.594035 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2dea4f36-2ae5-4363-a65c-0b7346f02661-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "2dea4f36-2ae5-4363-a65c-0b7346f02661" (UID: "2dea4f36-2ae5-4363-a65c-0b7346f02661"). 
InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:00:18 crc kubenswrapper[4823]: I1216 07:00:18.594083 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2dea4f36-2ae5-4363-a65c-0b7346f02661-kube-api-access-4ws45" (OuterVolumeSpecName: "kube-api-access-4ws45") pod "2dea4f36-2ae5-4363-a65c-0b7346f02661" (UID: "2dea4f36-2ae5-4363-a65c-0b7346f02661"). InnerVolumeSpecName "kube-api-access-4ws45". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:00:18 crc kubenswrapper[4823]: I1216 07:00:18.690929 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4ws45\" (UniqueName: \"kubernetes.io/projected/2dea4f36-2ae5-4363-a65c-0b7346f02661-kube-api-access-4ws45\") on node \"crc\" DevicePath \"\"" Dec 16 07:00:18 crc kubenswrapper[4823]: I1216 07:00:18.690982 4823 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2dea4f36-2ae5-4363-a65c-0b7346f02661-client-ca\") on node \"crc\" DevicePath \"\"" Dec 16 07:00:18 crc kubenswrapper[4823]: I1216 07:00:18.690997 4823 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2dea4f36-2ae5-4363-a65c-0b7346f02661-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 07:00:18 crc kubenswrapper[4823]: I1216 07:00:18.691015 4823 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2dea4f36-2ae5-4363-a65c-0b7346f02661-config\") on node \"crc\" DevicePath \"\"" Dec 16 07:00:18 crc kubenswrapper[4823]: I1216 07:00:18.721946 4823 generic.go:334] "Generic (PLEG): container finished" podID="8aaa63b4-9b41-442f-b9ea-672885a486bd" containerID="d9fb43391b95eb85ec0a1c9227538d6a4d9cdb8d0245123f19590848e7efefb8" exitCode=0 Dec 16 07:00:18 crc kubenswrapper[4823]: I1216 07:00:18.722055 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-879f6c89f-vmqj6" event={"ID":"8aaa63b4-9b41-442f-b9ea-672885a486bd","Type":"ContainerDied","Data":"d9fb43391b95eb85ec0a1c9227538d6a4d9cdb8d0245123f19590848e7efefb8"} Dec 16 07:00:18 crc kubenswrapper[4823]: I1216 07:00:18.722092 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-vmqj6" event={"ID":"8aaa63b4-9b41-442f-b9ea-672885a486bd","Type":"ContainerDied","Data":"8797894f3a0c32ef8d0aaacde20ea2b04c1c9017f1fd9a184b071121941c3b9b"} Dec 16 07:00:18 crc kubenswrapper[4823]: I1216 07:00:18.722111 4823 scope.go:117] "RemoveContainer" containerID="d9fb43391b95eb85ec0a1c9227538d6a4d9cdb8d0245123f19590848e7efefb8" Dec 16 07:00:18 crc kubenswrapper[4823]: I1216 07:00:18.722247 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-vmqj6" Dec 16 07:00:18 crc kubenswrapper[4823]: I1216 07:00:18.727350 4823 generic.go:334] "Generic (PLEG): container finished" podID="2dea4f36-2ae5-4363-a65c-0b7346f02661" containerID="233be53319aaf06d5c772b9fdc34ca8a3943eb013859ee9f72f645896163fc7d" exitCode=0 Dec 16 07:00:18 crc kubenswrapper[4823]: I1216 07:00:18.727380 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-plnfh" event={"ID":"2dea4f36-2ae5-4363-a65c-0b7346f02661","Type":"ContainerDied","Data":"233be53319aaf06d5c772b9fdc34ca8a3943eb013859ee9f72f645896163fc7d"} Dec 16 07:00:18 crc kubenswrapper[4823]: I1216 07:00:18.727401 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-plnfh" event={"ID":"2dea4f36-2ae5-4363-a65c-0b7346f02661","Type":"ContainerDied","Data":"c15ceb2510f7cae18f2baf442ccc0b6f0266ade5b99fa67a6094fb205da607b6"} Dec 16 07:00:18 crc kubenswrapper[4823]: I1216 07:00:18.727493 4823 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-plnfh" Dec 16 07:00:18 crc kubenswrapper[4823]: I1216 07:00:18.744238 4823 scope.go:117] "RemoveContainer" containerID="d9fb43391b95eb85ec0a1c9227538d6a4d9cdb8d0245123f19590848e7efefb8" Dec 16 07:00:18 crc kubenswrapper[4823]: E1216 07:00:18.744919 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9fb43391b95eb85ec0a1c9227538d6a4d9cdb8d0245123f19590848e7efefb8\": container with ID starting with d9fb43391b95eb85ec0a1c9227538d6a4d9cdb8d0245123f19590848e7efefb8 not found: ID does not exist" containerID="d9fb43391b95eb85ec0a1c9227538d6a4d9cdb8d0245123f19590848e7efefb8" Dec 16 07:00:18 crc kubenswrapper[4823]: I1216 07:00:18.745041 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9fb43391b95eb85ec0a1c9227538d6a4d9cdb8d0245123f19590848e7efefb8"} err="failed to get container status \"d9fb43391b95eb85ec0a1c9227538d6a4d9cdb8d0245123f19590848e7efefb8\": rpc error: code = NotFound desc = could not find container \"d9fb43391b95eb85ec0a1c9227538d6a4d9cdb8d0245123f19590848e7efefb8\": container with ID starting with d9fb43391b95eb85ec0a1c9227538d6a4d9cdb8d0245123f19590848e7efefb8 not found: ID does not exist" Dec 16 07:00:18 crc kubenswrapper[4823]: I1216 07:00:18.745088 4823 scope.go:117] "RemoveContainer" containerID="233be53319aaf06d5c772b9fdc34ca8a3943eb013859ee9f72f645896163fc7d" Dec 16 07:00:18 crc kubenswrapper[4823]: I1216 07:00:18.761401 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-vmqj6"] Dec 16 07:00:18 crc kubenswrapper[4823]: I1216 07:00:18.763175 4823 scope.go:117] "RemoveContainer" containerID="233be53319aaf06d5c772b9fdc34ca8a3943eb013859ee9f72f645896163fc7d" Dec 16 07:00:18 crc kubenswrapper[4823]: E1216 07:00:18.764068 4823 
log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"233be53319aaf06d5c772b9fdc34ca8a3943eb013859ee9f72f645896163fc7d\": container with ID starting with 233be53319aaf06d5c772b9fdc34ca8a3943eb013859ee9f72f645896163fc7d not found: ID does not exist" containerID="233be53319aaf06d5c772b9fdc34ca8a3943eb013859ee9f72f645896163fc7d" Dec 16 07:00:18 crc kubenswrapper[4823]: I1216 07:00:18.764296 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"233be53319aaf06d5c772b9fdc34ca8a3943eb013859ee9f72f645896163fc7d"} err="failed to get container status \"233be53319aaf06d5c772b9fdc34ca8a3943eb013859ee9f72f645896163fc7d\": rpc error: code = NotFound desc = could not find container \"233be53319aaf06d5c772b9fdc34ca8a3943eb013859ee9f72f645896163fc7d\": container with ID starting with 233be53319aaf06d5c772b9fdc34ca8a3943eb013859ee9f72f645896163fc7d not found: ID does not exist" Dec 16 07:00:18 crc kubenswrapper[4823]: I1216 07:00:18.771736 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-vmqj6"] Dec 16 07:00:18 crc kubenswrapper[4823]: I1216 07:00:18.800896 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-plnfh"] Dec 16 07:00:18 crc kubenswrapper[4823]: I1216 07:00:18.812258 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-plnfh"] Dec 16 07:00:19 crc kubenswrapper[4823]: I1216 07:00:19.609288 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6856fbf746-bpv9j"] Dec 16 07:00:19 crc kubenswrapper[4823]: E1216 07:00:19.610073 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2dea4f36-2ae5-4363-a65c-0b7346f02661" containerName="route-controller-manager" Dec 
16 07:00:19 crc kubenswrapper[4823]: I1216 07:00:19.610091 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="2dea4f36-2ae5-4363-a65c-0b7346f02661" containerName="route-controller-manager" Dec 16 07:00:19 crc kubenswrapper[4823]: E1216 07:00:19.610100 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8aaa63b4-9b41-442f-b9ea-672885a486bd" containerName="controller-manager" Dec 16 07:00:19 crc kubenswrapper[4823]: I1216 07:00:19.610107 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="8aaa63b4-9b41-442f-b9ea-672885a486bd" containerName="controller-manager" Dec 16 07:00:19 crc kubenswrapper[4823]: E1216 07:00:19.610124 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb000cb4-8b05-4ee9-b6dd-c8797099232b" containerName="collect-profiles" Dec 16 07:00:19 crc kubenswrapper[4823]: I1216 07:00:19.610131 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb000cb4-8b05-4ee9-b6dd-c8797099232b" containerName="collect-profiles" Dec 16 07:00:19 crc kubenswrapper[4823]: I1216 07:00:19.610271 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="8aaa63b4-9b41-442f-b9ea-672885a486bd" containerName="controller-manager" Dec 16 07:00:19 crc kubenswrapper[4823]: I1216 07:00:19.610293 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb000cb4-8b05-4ee9-b6dd-c8797099232b" containerName="collect-profiles" Dec 16 07:00:19 crc kubenswrapper[4823]: I1216 07:00:19.610303 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="2dea4f36-2ae5-4363-a65c-0b7346f02661" containerName="route-controller-manager" Dec 16 07:00:19 crc kubenswrapper[4823]: I1216 07:00:19.610782 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6856fbf746-bpv9j" Dec 16 07:00:19 crc kubenswrapper[4823]: I1216 07:00:19.618744 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 16 07:00:19 crc kubenswrapper[4823]: I1216 07:00:19.618791 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 16 07:00:19 crc kubenswrapper[4823]: I1216 07:00:19.618985 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 16 07:00:19 crc kubenswrapper[4823]: I1216 07:00:19.619101 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6c5c8764-tc5cz"] Dec 16 07:00:19 crc kubenswrapper[4823]: I1216 07:00:19.619189 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 16 07:00:19 crc kubenswrapper[4823]: I1216 07:00:19.619306 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 16 07:00:19 crc kubenswrapper[4823]: I1216 07:00:19.619884 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6c5c8764-tc5cz" Dec 16 07:00:19 crc kubenswrapper[4823]: I1216 07:00:19.621770 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 16 07:00:19 crc kubenswrapper[4823]: I1216 07:00:19.623212 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 16 07:00:19 crc kubenswrapper[4823]: I1216 07:00:19.623704 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 16 07:00:19 crc kubenswrapper[4823]: I1216 07:00:19.624420 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 16 07:00:19 crc kubenswrapper[4823]: I1216 07:00:19.625061 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 16 07:00:19 crc kubenswrapper[4823]: I1216 07:00:19.625157 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 16 07:00:19 crc kubenswrapper[4823]: I1216 07:00:19.625729 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 16 07:00:19 crc kubenswrapper[4823]: I1216 07:00:19.633511 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6856fbf746-bpv9j"] Dec 16 07:00:19 crc kubenswrapper[4823]: I1216 07:00:19.637789 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 16 07:00:19 crc kubenswrapper[4823]: I1216 07:00:19.650136 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6c5c8764-tc5cz"] Dec 16 07:00:19 crc 
kubenswrapper[4823]: I1216 07:00:19.705919 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b41c9e5b-8caa-409f-8c9a-4db1b0e42f4f-config\") pod \"route-controller-manager-6856fbf746-bpv9j\" (UID: \"b41c9e5b-8caa-409f-8c9a-4db1b0e42f4f\") " pod="openshift-route-controller-manager/route-controller-manager-6856fbf746-bpv9j" Dec 16 07:00:19 crc kubenswrapper[4823]: I1216 07:00:19.706014 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xpc9\" (UniqueName: \"kubernetes.io/projected/797b30e3-f538-443c-a0c7-af5ccdd1b120-kube-api-access-9xpc9\") pod \"controller-manager-6c5c8764-tc5cz\" (UID: \"797b30e3-f538-443c-a0c7-af5ccdd1b120\") " pod="openshift-controller-manager/controller-manager-6c5c8764-tc5cz" Dec 16 07:00:19 crc kubenswrapper[4823]: I1216 07:00:19.706080 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/797b30e3-f538-443c-a0c7-af5ccdd1b120-config\") pod \"controller-manager-6c5c8764-tc5cz\" (UID: \"797b30e3-f538-443c-a0c7-af5ccdd1b120\") " pod="openshift-controller-manager/controller-manager-6c5c8764-tc5cz" Dec 16 07:00:19 crc kubenswrapper[4823]: I1216 07:00:19.706289 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/797b30e3-f538-443c-a0c7-af5ccdd1b120-serving-cert\") pod \"controller-manager-6c5c8764-tc5cz\" (UID: \"797b30e3-f538-443c-a0c7-af5ccdd1b120\") " pod="openshift-controller-manager/controller-manager-6c5c8764-tc5cz" Dec 16 07:00:19 crc kubenswrapper[4823]: I1216 07:00:19.706360 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b41c9e5b-8caa-409f-8c9a-4db1b0e42f4f-serving-cert\") pod 
\"route-controller-manager-6856fbf746-bpv9j\" (UID: \"b41c9e5b-8caa-409f-8c9a-4db1b0e42f4f\") " pod="openshift-route-controller-manager/route-controller-manager-6856fbf746-bpv9j" Dec 16 07:00:19 crc kubenswrapper[4823]: I1216 07:00:19.706453 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/797b30e3-f538-443c-a0c7-af5ccdd1b120-proxy-ca-bundles\") pod \"controller-manager-6c5c8764-tc5cz\" (UID: \"797b30e3-f538-443c-a0c7-af5ccdd1b120\") " pod="openshift-controller-manager/controller-manager-6c5c8764-tc5cz" Dec 16 07:00:19 crc kubenswrapper[4823]: I1216 07:00:19.706529 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/797b30e3-f538-443c-a0c7-af5ccdd1b120-client-ca\") pod \"controller-manager-6c5c8764-tc5cz\" (UID: \"797b30e3-f538-443c-a0c7-af5ccdd1b120\") " pod="openshift-controller-manager/controller-manager-6c5c8764-tc5cz" Dec 16 07:00:19 crc kubenswrapper[4823]: I1216 07:00:19.706660 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b41c9e5b-8caa-409f-8c9a-4db1b0e42f4f-client-ca\") pod \"route-controller-manager-6856fbf746-bpv9j\" (UID: \"b41c9e5b-8caa-409f-8c9a-4db1b0e42f4f\") " pod="openshift-route-controller-manager/route-controller-manager-6856fbf746-bpv9j" Dec 16 07:00:19 crc kubenswrapper[4823]: I1216 07:00:19.706720 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcscc\" (UniqueName: \"kubernetes.io/projected/b41c9e5b-8caa-409f-8c9a-4db1b0e42f4f-kube-api-access-jcscc\") pod \"route-controller-manager-6856fbf746-bpv9j\" (UID: \"b41c9e5b-8caa-409f-8c9a-4db1b0e42f4f\") " pod="openshift-route-controller-manager/route-controller-manager-6856fbf746-bpv9j" Dec 16 07:00:19 crc 
kubenswrapper[4823]: I1216 07:00:19.780337 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2dea4f36-2ae5-4363-a65c-0b7346f02661" path="/var/lib/kubelet/pods/2dea4f36-2ae5-4363-a65c-0b7346f02661/volumes" Dec 16 07:00:19 crc kubenswrapper[4823]: I1216 07:00:19.780903 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8aaa63b4-9b41-442f-b9ea-672885a486bd" path="/var/lib/kubelet/pods/8aaa63b4-9b41-442f-b9ea-672885a486bd/volumes" Dec 16 07:00:19 crc kubenswrapper[4823]: I1216 07:00:19.807795 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b41c9e5b-8caa-409f-8c9a-4db1b0e42f4f-config\") pod \"route-controller-manager-6856fbf746-bpv9j\" (UID: \"b41c9e5b-8caa-409f-8c9a-4db1b0e42f4f\") " pod="openshift-route-controller-manager/route-controller-manager-6856fbf746-bpv9j" Dec 16 07:00:19 crc kubenswrapper[4823]: I1216 07:00:19.807897 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xpc9\" (UniqueName: \"kubernetes.io/projected/797b30e3-f538-443c-a0c7-af5ccdd1b120-kube-api-access-9xpc9\") pod \"controller-manager-6c5c8764-tc5cz\" (UID: \"797b30e3-f538-443c-a0c7-af5ccdd1b120\") " pod="openshift-controller-manager/controller-manager-6c5c8764-tc5cz" Dec 16 07:00:19 crc kubenswrapper[4823]: I1216 07:00:19.807943 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/797b30e3-f538-443c-a0c7-af5ccdd1b120-config\") pod \"controller-manager-6c5c8764-tc5cz\" (UID: \"797b30e3-f538-443c-a0c7-af5ccdd1b120\") " pod="openshift-controller-manager/controller-manager-6c5c8764-tc5cz" Dec 16 07:00:19 crc kubenswrapper[4823]: I1216 07:00:19.807991 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/797b30e3-f538-443c-a0c7-af5ccdd1b120-serving-cert\") pod 
\"controller-manager-6c5c8764-tc5cz\" (UID: \"797b30e3-f538-443c-a0c7-af5ccdd1b120\") " pod="openshift-controller-manager/controller-manager-6c5c8764-tc5cz" Dec 16 07:00:19 crc kubenswrapper[4823]: I1216 07:00:19.808012 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b41c9e5b-8caa-409f-8c9a-4db1b0e42f4f-serving-cert\") pod \"route-controller-manager-6856fbf746-bpv9j\" (UID: \"b41c9e5b-8caa-409f-8c9a-4db1b0e42f4f\") " pod="openshift-route-controller-manager/route-controller-manager-6856fbf746-bpv9j" Dec 16 07:00:19 crc kubenswrapper[4823]: I1216 07:00:19.808065 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/797b30e3-f538-443c-a0c7-af5ccdd1b120-proxy-ca-bundles\") pod \"controller-manager-6c5c8764-tc5cz\" (UID: \"797b30e3-f538-443c-a0c7-af5ccdd1b120\") " pod="openshift-controller-manager/controller-manager-6c5c8764-tc5cz" Dec 16 07:00:19 crc kubenswrapper[4823]: I1216 07:00:19.808110 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/797b30e3-f538-443c-a0c7-af5ccdd1b120-client-ca\") pod \"controller-manager-6c5c8764-tc5cz\" (UID: \"797b30e3-f538-443c-a0c7-af5ccdd1b120\") " pod="openshift-controller-manager/controller-manager-6c5c8764-tc5cz" Dec 16 07:00:19 crc kubenswrapper[4823]: I1216 07:00:19.808156 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b41c9e5b-8caa-409f-8c9a-4db1b0e42f4f-client-ca\") pod \"route-controller-manager-6856fbf746-bpv9j\" (UID: \"b41c9e5b-8caa-409f-8c9a-4db1b0e42f4f\") " pod="openshift-route-controller-manager/route-controller-manager-6856fbf746-bpv9j" Dec 16 07:00:19 crc kubenswrapper[4823]: I1216 07:00:19.808182 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-jcscc\" (UniqueName: \"kubernetes.io/projected/b41c9e5b-8caa-409f-8c9a-4db1b0e42f4f-kube-api-access-jcscc\") pod \"route-controller-manager-6856fbf746-bpv9j\" (UID: \"b41c9e5b-8caa-409f-8c9a-4db1b0e42f4f\") " pod="openshift-route-controller-manager/route-controller-manager-6856fbf746-bpv9j" Dec 16 07:00:19 crc kubenswrapper[4823]: I1216 07:00:19.809594 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b41c9e5b-8caa-409f-8c9a-4db1b0e42f4f-config\") pod \"route-controller-manager-6856fbf746-bpv9j\" (UID: \"b41c9e5b-8caa-409f-8c9a-4db1b0e42f4f\") " pod="openshift-route-controller-manager/route-controller-manager-6856fbf746-bpv9j" Dec 16 07:00:19 crc kubenswrapper[4823]: I1216 07:00:19.810019 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/797b30e3-f538-443c-a0c7-af5ccdd1b120-client-ca\") pod \"controller-manager-6c5c8764-tc5cz\" (UID: \"797b30e3-f538-443c-a0c7-af5ccdd1b120\") " pod="openshift-controller-manager/controller-manager-6c5c8764-tc5cz" Dec 16 07:00:19 crc kubenswrapper[4823]: I1216 07:00:19.810254 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/797b30e3-f538-443c-a0c7-af5ccdd1b120-config\") pod \"controller-manager-6c5c8764-tc5cz\" (UID: \"797b30e3-f538-443c-a0c7-af5ccdd1b120\") " pod="openshift-controller-manager/controller-manager-6c5c8764-tc5cz" Dec 16 07:00:19 crc kubenswrapper[4823]: I1216 07:00:19.810335 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/797b30e3-f538-443c-a0c7-af5ccdd1b120-proxy-ca-bundles\") pod \"controller-manager-6c5c8764-tc5cz\" (UID: \"797b30e3-f538-443c-a0c7-af5ccdd1b120\") " pod="openshift-controller-manager/controller-manager-6c5c8764-tc5cz" Dec 16 07:00:19 crc kubenswrapper[4823]: I1216 07:00:19.811785 
4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b41c9e5b-8caa-409f-8c9a-4db1b0e42f4f-client-ca\") pod \"route-controller-manager-6856fbf746-bpv9j\" (UID: \"b41c9e5b-8caa-409f-8c9a-4db1b0e42f4f\") " pod="openshift-route-controller-manager/route-controller-manager-6856fbf746-bpv9j" Dec 16 07:00:19 crc kubenswrapper[4823]: I1216 07:00:19.816581 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/797b30e3-f538-443c-a0c7-af5ccdd1b120-serving-cert\") pod \"controller-manager-6c5c8764-tc5cz\" (UID: \"797b30e3-f538-443c-a0c7-af5ccdd1b120\") " pod="openshift-controller-manager/controller-manager-6c5c8764-tc5cz" Dec 16 07:00:19 crc kubenswrapper[4823]: I1216 07:00:19.821130 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b41c9e5b-8caa-409f-8c9a-4db1b0e42f4f-serving-cert\") pod \"route-controller-manager-6856fbf746-bpv9j\" (UID: \"b41c9e5b-8caa-409f-8c9a-4db1b0e42f4f\") " pod="openshift-route-controller-manager/route-controller-manager-6856fbf746-bpv9j" Dec 16 07:00:19 crc kubenswrapper[4823]: I1216 07:00:19.836925 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xpc9\" (UniqueName: \"kubernetes.io/projected/797b30e3-f538-443c-a0c7-af5ccdd1b120-kube-api-access-9xpc9\") pod \"controller-manager-6c5c8764-tc5cz\" (UID: \"797b30e3-f538-443c-a0c7-af5ccdd1b120\") " pod="openshift-controller-manager/controller-manager-6c5c8764-tc5cz" Dec 16 07:00:19 crc kubenswrapper[4823]: I1216 07:00:19.844902 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcscc\" (UniqueName: \"kubernetes.io/projected/b41c9e5b-8caa-409f-8c9a-4db1b0e42f4f-kube-api-access-jcscc\") pod \"route-controller-manager-6856fbf746-bpv9j\" (UID: \"b41c9e5b-8caa-409f-8c9a-4db1b0e42f4f\") " 
pod="openshift-route-controller-manager/route-controller-manager-6856fbf746-bpv9j" Dec 16 07:00:19 crc kubenswrapper[4823]: I1216 07:00:19.931398 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6856fbf746-bpv9j" Dec 16 07:00:19 crc kubenswrapper[4823]: I1216 07:00:19.940205 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6c5c8764-tc5cz" Dec 16 07:00:20 crc kubenswrapper[4823]: I1216 07:00:20.206281 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6c5c8764-tc5cz"] Dec 16 07:00:20 crc kubenswrapper[4823]: I1216 07:00:20.266940 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6856fbf746-bpv9j"] Dec 16 07:00:20 crc kubenswrapper[4823]: I1216 07:00:20.744550 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6c5c8764-tc5cz" event={"ID":"797b30e3-f538-443c-a0c7-af5ccdd1b120","Type":"ContainerStarted","Data":"344edf62872d17bca919576499ab7012a3e9093cb097b6d3b060a6bbd612e7d1"} Dec 16 07:00:20 crc kubenswrapper[4823]: I1216 07:00:20.745117 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6c5c8764-tc5cz" Dec 16 07:00:20 crc kubenswrapper[4823]: I1216 07:00:20.745139 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6c5c8764-tc5cz" event={"ID":"797b30e3-f538-443c-a0c7-af5ccdd1b120","Type":"ContainerStarted","Data":"31cb08a14c8824a61d85c270eca8119769525ee5f2d7a1af54b65cd889e9109c"} Dec 16 07:00:20 crc kubenswrapper[4823]: I1216 07:00:20.747146 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6856fbf746-bpv9j" 
event={"ID":"b41c9e5b-8caa-409f-8c9a-4db1b0e42f4f","Type":"ContainerStarted","Data":"3f27aec771626ccf8c7c967660c12b47e8ccc62c8daaae066f5b1a666c08b5de"} Dec 16 07:00:20 crc kubenswrapper[4823]: I1216 07:00:20.747207 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6856fbf746-bpv9j" event={"ID":"b41c9e5b-8caa-409f-8c9a-4db1b0e42f4f","Type":"ContainerStarted","Data":"48b0ccc468bfe4d35fbf00deafc346108d960d3ef9652133debea329d40ca4ff"} Dec 16 07:00:20 crc kubenswrapper[4823]: I1216 07:00:20.748191 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6856fbf746-bpv9j" Dec 16 07:00:20 crc kubenswrapper[4823]: I1216 07:00:20.753388 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6c5c8764-tc5cz" Dec 16 07:00:20 crc kubenswrapper[4823]: I1216 07:00:20.776580 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6c5c8764-tc5cz" podStartSLOduration=2.776549149 podStartE2EDuration="2.776549149s" podCreationTimestamp="2025-12-16 07:00:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 07:00:20.774843938 +0000 UTC m=+299.263410061" watchObservedRunningTime="2025-12-16 07:00:20.776549149 +0000 UTC m=+299.265115282" Dec 16 07:00:20 crc kubenswrapper[4823]: I1216 07:00:20.860683 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6856fbf746-bpv9j" podStartSLOduration=2.860660267 podStartE2EDuration="2.860660267s" podCreationTimestamp="2025-12-16 07:00:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 
07:00:20.854277469 +0000 UTC m=+299.342843592" watchObservedRunningTime="2025-12-16 07:00:20.860660267 +0000 UTC m=+299.349226390" Dec 16 07:00:20 crc kubenswrapper[4823]: I1216 07:00:20.952739 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6856fbf746-bpv9j" Dec 16 07:00:57 crc kubenswrapper[4823]: I1216 07:00:57.840382 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6c5c8764-tc5cz"] Dec 16 07:00:57 crc kubenswrapper[4823]: I1216 07:00:57.841491 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6c5c8764-tc5cz" podUID="797b30e3-f538-443c-a0c7-af5ccdd1b120" containerName="controller-manager" containerID="cri-o://344edf62872d17bca919576499ab7012a3e9093cb097b6d3b060a6bbd612e7d1" gracePeriod=30 Dec 16 07:00:57 crc kubenswrapper[4823]: I1216 07:00:57.868344 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6856fbf746-bpv9j"] Dec 16 07:00:57 crc kubenswrapper[4823]: I1216 07:00:57.868677 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6856fbf746-bpv9j" podUID="b41c9e5b-8caa-409f-8c9a-4db1b0e42f4f" containerName="route-controller-manager" containerID="cri-o://3f27aec771626ccf8c7c967660c12b47e8ccc62c8daaae066f5b1a666c08b5de" gracePeriod=30 Dec 16 07:00:58 crc kubenswrapper[4823]: I1216 07:00:58.234716 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6c5c8764-tc5cz" Dec 16 07:00:58 crc kubenswrapper[4823]: I1216 07:00:58.305136 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6856fbf746-bpv9j" Dec 16 07:00:58 crc kubenswrapper[4823]: I1216 07:00:58.385255 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/797b30e3-f538-443c-a0c7-af5ccdd1b120-config\") pod \"797b30e3-f538-443c-a0c7-af5ccdd1b120\" (UID: \"797b30e3-f538-443c-a0c7-af5ccdd1b120\") " Dec 16 07:00:58 crc kubenswrapper[4823]: I1216 07:00:58.385311 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/797b30e3-f538-443c-a0c7-af5ccdd1b120-proxy-ca-bundles\") pod \"797b30e3-f538-443c-a0c7-af5ccdd1b120\" (UID: \"797b30e3-f538-443c-a0c7-af5ccdd1b120\") " Dec 16 07:00:58 crc kubenswrapper[4823]: I1216 07:00:58.385412 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/797b30e3-f538-443c-a0c7-af5ccdd1b120-client-ca\") pod \"797b30e3-f538-443c-a0c7-af5ccdd1b120\" (UID: \"797b30e3-f538-443c-a0c7-af5ccdd1b120\") " Dec 16 07:00:58 crc kubenswrapper[4823]: I1216 07:00:58.385479 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/797b30e3-f538-443c-a0c7-af5ccdd1b120-serving-cert\") pod \"797b30e3-f538-443c-a0c7-af5ccdd1b120\" (UID: \"797b30e3-f538-443c-a0c7-af5ccdd1b120\") " Dec 16 07:00:58 crc kubenswrapper[4823]: I1216 07:00:58.385543 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xpc9\" (UniqueName: \"kubernetes.io/projected/797b30e3-f538-443c-a0c7-af5ccdd1b120-kube-api-access-9xpc9\") pod \"797b30e3-f538-443c-a0c7-af5ccdd1b120\" (UID: \"797b30e3-f538-443c-a0c7-af5ccdd1b120\") " Dec 16 07:00:58 crc kubenswrapper[4823]: I1216 07:00:58.386298 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/configmap/797b30e3-f538-443c-a0c7-af5ccdd1b120-client-ca" (OuterVolumeSpecName: "client-ca") pod "797b30e3-f538-443c-a0c7-af5ccdd1b120" (UID: "797b30e3-f538-443c-a0c7-af5ccdd1b120"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:00:58 crc kubenswrapper[4823]: I1216 07:00:58.386448 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/797b30e3-f538-443c-a0c7-af5ccdd1b120-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "797b30e3-f538-443c-a0c7-af5ccdd1b120" (UID: "797b30e3-f538-443c-a0c7-af5ccdd1b120"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:00:58 crc kubenswrapper[4823]: I1216 07:00:58.386744 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/797b30e3-f538-443c-a0c7-af5ccdd1b120-config" (OuterVolumeSpecName: "config") pod "797b30e3-f538-443c-a0c7-af5ccdd1b120" (UID: "797b30e3-f538-443c-a0c7-af5ccdd1b120"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:00:58 crc kubenswrapper[4823]: I1216 07:00:58.392001 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/797b30e3-f538-443c-a0c7-af5ccdd1b120-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "797b30e3-f538-443c-a0c7-af5ccdd1b120" (UID: "797b30e3-f538-443c-a0c7-af5ccdd1b120"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:00:58 crc kubenswrapper[4823]: I1216 07:00:58.392032 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/797b30e3-f538-443c-a0c7-af5ccdd1b120-kube-api-access-9xpc9" (OuterVolumeSpecName: "kube-api-access-9xpc9") pod "797b30e3-f538-443c-a0c7-af5ccdd1b120" (UID: "797b30e3-f538-443c-a0c7-af5ccdd1b120"). InnerVolumeSpecName "kube-api-access-9xpc9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:00:58 crc kubenswrapper[4823]: I1216 07:00:58.486815 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jcscc\" (UniqueName: \"kubernetes.io/projected/b41c9e5b-8caa-409f-8c9a-4db1b0e42f4f-kube-api-access-jcscc\") pod \"b41c9e5b-8caa-409f-8c9a-4db1b0e42f4f\" (UID: \"b41c9e5b-8caa-409f-8c9a-4db1b0e42f4f\") " Dec 16 07:00:58 crc kubenswrapper[4823]: I1216 07:00:58.486974 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b41c9e5b-8caa-409f-8c9a-4db1b0e42f4f-serving-cert\") pod \"b41c9e5b-8caa-409f-8c9a-4db1b0e42f4f\" (UID: \"b41c9e5b-8caa-409f-8c9a-4db1b0e42f4f\") " Dec 16 07:00:58 crc kubenswrapper[4823]: I1216 07:00:58.487074 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b41c9e5b-8caa-409f-8c9a-4db1b0e42f4f-config\") pod \"b41c9e5b-8caa-409f-8c9a-4db1b0e42f4f\" (UID: \"b41c9e5b-8caa-409f-8c9a-4db1b0e42f4f\") " Dec 16 07:00:58 crc kubenswrapper[4823]: I1216 07:00:58.487188 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b41c9e5b-8caa-409f-8c9a-4db1b0e42f4f-client-ca\") pod \"b41c9e5b-8caa-409f-8c9a-4db1b0e42f4f\" (UID: \"b41c9e5b-8caa-409f-8c9a-4db1b0e42f4f\") " Dec 16 07:00:58 crc kubenswrapper[4823]: I1216 07:00:58.487547 4823 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/797b30e3-f538-443c-a0c7-af5ccdd1b120-config\") on node \"crc\" DevicePath \"\"" Dec 16 07:00:58 crc kubenswrapper[4823]: I1216 07:00:58.487573 4823 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/797b30e3-f538-443c-a0c7-af5ccdd1b120-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 16 07:00:58 crc 
kubenswrapper[4823]: I1216 07:00:58.487592 4823 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/797b30e3-f538-443c-a0c7-af5ccdd1b120-client-ca\") on node \"crc\" DevicePath \"\"" Dec 16 07:00:58 crc kubenswrapper[4823]: I1216 07:00:58.487607 4823 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/797b30e3-f538-443c-a0c7-af5ccdd1b120-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 07:00:58 crc kubenswrapper[4823]: I1216 07:00:58.487623 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xpc9\" (UniqueName: \"kubernetes.io/projected/797b30e3-f538-443c-a0c7-af5ccdd1b120-kube-api-access-9xpc9\") on node \"crc\" DevicePath \"\"" Dec 16 07:00:58 crc kubenswrapper[4823]: I1216 07:00:58.488426 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b41c9e5b-8caa-409f-8c9a-4db1b0e42f4f-client-ca" (OuterVolumeSpecName: "client-ca") pod "b41c9e5b-8caa-409f-8c9a-4db1b0e42f4f" (UID: "b41c9e5b-8caa-409f-8c9a-4db1b0e42f4f"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:00:58 crc kubenswrapper[4823]: I1216 07:00:58.488647 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b41c9e5b-8caa-409f-8c9a-4db1b0e42f4f-config" (OuterVolumeSpecName: "config") pod "b41c9e5b-8caa-409f-8c9a-4db1b0e42f4f" (UID: "b41c9e5b-8caa-409f-8c9a-4db1b0e42f4f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:00:58 crc kubenswrapper[4823]: I1216 07:00:58.490868 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b41c9e5b-8caa-409f-8c9a-4db1b0e42f4f-kube-api-access-jcscc" (OuterVolumeSpecName: "kube-api-access-jcscc") pod "b41c9e5b-8caa-409f-8c9a-4db1b0e42f4f" (UID: "b41c9e5b-8caa-409f-8c9a-4db1b0e42f4f"). 
InnerVolumeSpecName "kube-api-access-jcscc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:00:58 crc kubenswrapper[4823]: I1216 07:00:58.490878 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b41c9e5b-8caa-409f-8c9a-4db1b0e42f4f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "b41c9e5b-8caa-409f-8c9a-4db1b0e42f4f" (UID: "b41c9e5b-8caa-409f-8c9a-4db1b0e42f4f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:00:58 crc kubenswrapper[4823]: I1216 07:00:58.588543 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jcscc\" (UniqueName: \"kubernetes.io/projected/b41c9e5b-8caa-409f-8c9a-4db1b0e42f4f-kube-api-access-jcscc\") on node \"crc\" DevicePath \"\"" Dec 16 07:00:58 crc kubenswrapper[4823]: I1216 07:00:58.588602 4823 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b41c9e5b-8caa-409f-8c9a-4db1b0e42f4f-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 07:00:58 crc kubenswrapper[4823]: I1216 07:00:58.588615 4823 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b41c9e5b-8caa-409f-8c9a-4db1b0e42f4f-config\") on node \"crc\" DevicePath \"\"" Dec 16 07:00:58 crc kubenswrapper[4823]: I1216 07:00:58.588625 4823 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b41c9e5b-8caa-409f-8c9a-4db1b0e42f4f-client-ca\") on node \"crc\" DevicePath \"\"" Dec 16 07:00:58 crc kubenswrapper[4823]: I1216 07:00:58.980172 4823 generic.go:334] "Generic (PLEG): container finished" podID="797b30e3-f538-443c-a0c7-af5ccdd1b120" containerID="344edf62872d17bca919576499ab7012a3e9093cb097b6d3b060a6bbd612e7d1" exitCode=0 Dec 16 07:00:58 crc kubenswrapper[4823]: I1216 07:00:58.980264 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6c5c8764-tc5cz" Dec 16 07:00:58 crc kubenswrapper[4823]: I1216 07:00:58.980262 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6c5c8764-tc5cz" event={"ID":"797b30e3-f538-443c-a0c7-af5ccdd1b120","Type":"ContainerDied","Data":"344edf62872d17bca919576499ab7012a3e9093cb097b6d3b060a6bbd612e7d1"} Dec 16 07:00:58 crc kubenswrapper[4823]: I1216 07:00:58.980382 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6c5c8764-tc5cz" event={"ID":"797b30e3-f538-443c-a0c7-af5ccdd1b120","Type":"ContainerDied","Data":"31cb08a14c8824a61d85c270eca8119769525ee5f2d7a1af54b65cd889e9109c"} Dec 16 07:00:58 crc kubenswrapper[4823]: I1216 07:00:58.980404 4823 scope.go:117] "RemoveContainer" containerID="344edf62872d17bca919576499ab7012a3e9093cb097b6d3b060a6bbd612e7d1" Dec 16 07:00:58 crc kubenswrapper[4823]: I1216 07:00:58.981823 4823 generic.go:334] "Generic (PLEG): container finished" podID="b41c9e5b-8caa-409f-8c9a-4db1b0e42f4f" containerID="3f27aec771626ccf8c7c967660c12b47e8ccc62c8daaae066f5b1a666c08b5de" exitCode=0 Dec 16 07:00:58 crc kubenswrapper[4823]: I1216 07:00:58.981868 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6856fbf746-bpv9j" event={"ID":"b41c9e5b-8caa-409f-8c9a-4db1b0e42f4f","Type":"ContainerDied","Data":"3f27aec771626ccf8c7c967660c12b47e8ccc62c8daaae066f5b1a666c08b5de"} Dec 16 07:00:58 crc kubenswrapper[4823]: I1216 07:00:58.981902 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6856fbf746-bpv9j" Dec 16 07:00:58 crc kubenswrapper[4823]: I1216 07:00:58.981904 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6856fbf746-bpv9j" event={"ID":"b41c9e5b-8caa-409f-8c9a-4db1b0e42f4f","Type":"ContainerDied","Data":"48b0ccc468bfe4d35fbf00deafc346108d960d3ef9652133debea329d40ca4ff"} Dec 16 07:00:58 crc kubenswrapper[4823]: I1216 07:00:58.997875 4823 scope.go:117] "RemoveContainer" containerID="344edf62872d17bca919576499ab7012a3e9093cb097b6d3b060a6bbd612e7d1" Dec 16 07:00:59 crc kubenswrapper[4823]: E1216 07:00:59.011549 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"344edf62872d17bca919576499ab7012a3e9093cb097b6d3b060a6bbd612e7d1\": container with ID starting with 344edf62872d17bca919576499ab7012a3e9093cb097b6d3b060a6bbd612e7d1 not found: ID does not exist" containerID="344edf62872d17bca919576499ab7012a3e9093cb097b6d3b060a6bbd612e7d1" Dec 16 07:00:59 crc kubenswrapper[4823]: I1216 07:00:59.011625 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"344edf62872d17bca919576499ab7012a3e9093cb097b6d3b060a6bbd612e7d1"} err="failed to get container status \"344edf62872d17bca919576499ab7012a3e9093cb097b6d3b060a6bbd612e7d1\": rpc error: code = NotFound desc = could not find container \"344edf62872d17bca919576499ab7012a3e9093cb097b6d3b060a6bbd612e7d1\": container with ID starting with 344edf62872d17bca919576499ab7012a3e9093cb097b6d3b060a6bbd612e7d1 not found: ID does not exist" Dec 16 07:00:59 crc kubenswrapper[4823]: I1216 07:00:59.011681 4823 scope.go:117] "RemoveContainer" containerID="3f27aec771626ccf8c7c967660c12b47e8ccc62c8daaae066f5b1a666c08b5de" Dec 16 07:00:59 crc kubenswrapper[4823]: I1216 07:00:59.018272 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-6856fbf746-bpv9j"] Dec 16 07:00:59 crc kubenswrapper[4823]: I1216 07:00:59.022295 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6856fbf746-bpv9j"] Dec 16 07:00:59 crc kubenswrapper[4823]: I1216 07:00:59.038298 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6c5c8764-tc5cz"] Dec 16 07:00:59 crc kubenswrapper[4823]: I1216 07:00:59.042202 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6c5c8764-tc5cz"] Dec 16 07:00:59 crc kubenswrapper[4823]: I1216 07:00:59.043437 4823 scope.go:117] "RemoveContainer" containerID="3f27aec771626ccf8c7c967660c12b47e8ccc62c8daaae066f5b1a666c08b5de" Dec 16 07:00:59 crc kubenswrapper[4823]: E1216 07:00:59.043949 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f27aec771626ccf8c7c967660c12b47e8ccc62c8daaae066f5b1a666c08b5de\": container with ID starting with 3f27aec771626ccf8c7c967660c12b47e8ccc62c8daaae066f5b1a666c08b5de not found: ID does not exist" containerID="3f27aec771626ccf8c7c967660c12b47e8ccc62c8daaae066f5b1a666c08b5de" Dec 16 07:00:59 crc kubenswrapper[4823]: I1216 07:00:59.044183 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f27aec771626ccf8c7c967660c12b47e8ccc62c8daaae066f5b1a666c08b5de"} err="failed to get container status \"3f27aec771626ccf8c7c967660c12b47e8ccc62c8daaae066f5b1a666c08b5de\": rpc error: code = NotFound desc = could not find container \"3f27aec771626ccf8c7c967660c12b47e8ccc62c8daaae066f5b1a666c08b5de\": container with ID starting with 3f27aec771626ccf8c7c967660c12b47e8ccc62c8daaae066f5b1a666c08b5de not found: ID does not exist" Dec 16 07:00:59 crc kubenswrapper[4823]: I1216 07:00:59.639726 4823 kubelet.go:2421] "SyncLoop 
ADD" source="api" pods=["openshift-controller-manager/controller-manager-77d55c7d9c-fxzmr"] Dec 16 07:00:59 crc kubenswrapper[4823]: E1216 07:00:59.640329 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="797b30e3-f538-443c-a0c7-af5ccdd1b120" containerName="controller-manager" Dec 16 07:00:59 crc kubenswrapper[4823]: I1216 07:00:59.640345 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="797b30e3-f538-443c-a0c7-af5ccdd1b120" containerName="controller-manager" Dec 16 07:00:59 crc kubenswrapper[4823]: E1216 07:00:59.640368 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b41c9e5b-8caa-409f-8c9a-4db1b0e42f4f" containerName="route-controller-manager" Dec 16 07:00:59 crc kubenswrapper[4823]: I1216 07:00:59.640373 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="b41c9e5b-8caa-409f-8c9a-4db1b0e42f4f" containerName="route-controller-manager" Dec 16 07:00:59 crc kubenswrapper[4823]: I1216 07:00:59.640495 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="797b30e3-f538-443c-a0c7-af5ccdd1b120" containerName="controller-manager" Dec 16 07:00:59 crc kubenswrapper[4823]: I1216 07:00:59.640507 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="b41c9e5b-8caa-409f-8c9a-4db1b0e42f4f" containerName="route-controller-manager" Dec 16 07:00:59 crc kubenswrapper[4823]: I1216 07:00:59.640993 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-77d55c7d9c-fxzmr" Dec 16 07:00:59 crc kubenswrapper[4823]: I1216 07:00:59.643704 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 16 07:00:59 crc kubenswrapper[4823]: I1216 07:00:59.643731 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 16 07:00:59 crc kubenswrapper[4823]: I1216 07:00:59.643847 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 16 07:00:59 crc kubenswrapper[4823]: I1216 07:00:59.643704 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 16 07:00:59 crc kubenswrapper[4823]: I1216 07:00:59.644250 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7bc548b56d-x2smm"] Dec 16 07:00:59 crc kubenswrapper[4823]: I1216 07:00:59.645185 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 16 07:00:59 crc kubenswrapper[4823]: I1216 07:00:59.645715 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7bc548b56d-x2smm" Dec 16 07:00:59 crc kubenswrapper[4823]: I1216 07:00:59.649281 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 16 07:00:59 crc kubenswrapper[4823]: I1216 07:00:59.649688 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 16 07:00:59 crc kubenswrapper[4823]: I1216 07:00:59.650094 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 16 07:00:59 crc kubenswrapper[4823]: I1216 07:00:59.650185 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 16 07:00:59 crc kubenswrapper[4823]: I1216 07:00:59.650633 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 16 07:00:59 crc kubenswrapper[4823]: I1216 07:00:59.650895 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 16 07:00:59 crc kubenswrapper[4823]: I1216 07:00:59.651011 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 16 07:00:59 crc kubenswrapper[4823]: I1216 07:00:59.656108 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 16 07:00:59 crc kubenswrapper[4823]: I1216 07:00:59.656225 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7bc548b56d-x2smm"] Dec 16 07:00:59 crc kubenswrapper[4823]: I1216 07:00:59.659411 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-controller-manager/controller-manager-77d55c7d9c-fxzmr"] Dec 16 07:00:59 crc kubenswrapper[4823]: I1216 07:00:59.780423 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="797b30e3-f538-443c-a0c7-af5ccdd1b120" path="/var/lib/kubelet/pods/797b30e3-f538-443c-a0c7-af5ccdd1b120/volumes" Dec 16 07:00:59 crc kubenswrapper[4823]: I1216 07:00:59.781174 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b41c9e5b-8caa-409f-8c9a-4db1b0e42f4f" path="/var/lib/kubelet/pods/b41c9e5b-8caa-409f-8c9a-4db1b0e42f4f/volumes" Dec 16 07:00:59 crc kubenswrapper[4823]: I1216 07:00:59.802929 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/819a28d4-4973-433b-90da-d0235e69bb0d-client-ca\") pod \"route-controller-manager-7bc548b56d-x2smm\" (UID: \"819a28d4-4973-433b-90da-d0235e69bb0d\") " pod="openshift-route-controller-manager/route-controller-manager-7bc548b56d-x2smm" Dec 16 07:00:59 crc kubenswrapper[4823]: I1216 07:00:59.803123 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/819a28d4-4973-433b-90da-d0235e69bb0d-serving-cert\") pod \"route-controller-manager-7bc548b56d-x2smm\" (UID: \"819a28d4-4973-433b-90da-d0235e69bb0d\") " pod="openshift-route-controller-manager/route-controller-manager-7bc548b56d-x2smm" Dec 16 07:00:59 crc kubenswrapper[4823]: I1216 07:00:59.803164 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cd4a50ea-aae8-4d0a-995b-b3a32aabbd83-proxy-ca-bundles\") pod \"controller-manager-77d55c7d9c-fxzmr\" (UID: \"cd4a50ea-aae8-4d0a-995b-b3a32aabbd83\") " pod="openshift-controller-manager/controller-manager-77d55c7d9c-fxzmr" Dec 16 07:00:59 crc kubenswrapper[4823]: I1216 07:00:59.803194 4823 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cd4a50ea-aae8-4d0a-995b-b3a32aabbd83-client-ca\") pod \"controller-manager-77d55c7d9c-fxzmr\" (UID: \"cd4a50ea-aae8-4d0a-995b-b3a32aabbd83\") " pod="openshift-controller-manager/controller-manager-77d55c7d9c-fxzmr" Dec 16 07:00:59 crc kubenswrapper[4823]: I1216 07:00:59.803242 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/819a28d4-4973-433b-90da-d0235e69bb0d-config\") pod \"route-controller-manager-7bc548b56d-x2smm\" (UID: \"819a28d4-4973-433b-90da-d0235e69bb0d\") " pod="openshift-route-controller-manager/route-controller-manager-7bc548b56d-x2smm" Dec 16 07:00:59 crc kubenswrapper[4823]: I1216 07:00:59.803416 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9j7c\" (UniqueName: \"kubernetes.io/projected/cd4a50ea-aae8-4d0a-995b-b3a32aabbd83-kube-api-access-p9j7c\") pod \"controller-manager-77d55c7d9c-fxzmr\" (UID: \"cd4a50ea-aae8-4d0a-995b-b3a32aabbd83\") " pod="openshift-controller-manager/controller-manager-77d55c7d9c-fxzmr" Dec 16 07:00:59 crc kubenswrapper[4823]: I1216 07:00:59.803556 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd4a50ea-aae8-4d0a-995b-b3a32aabbd83-config\") pod \"controller-manager-77d55c7d9c-fxzmr\" (UID: \"cd4a50ea-aae8-4d0a-995b-b3a32aabbd83\") " pod="openshift-controller-manager/controller-manager-77d55c7d9c-fxzmr" Dec 16 07:00:59 crc kubenswrapper[4823]: I1216 07:00:59.803783 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wnzg\" (UniqueName: \"kubernetes.io/projected/819a28d4-4973-433b-90da-d0235e69bb0d-kube-api-access-5wnzg\") pod 
\"route-controller-manager-7bc548b56d-x2smm\" (UID: \"819a28d4-4973-433b-90da-d0235e69bb0d\") " pod="openshift-route-controller-manager/route-controller-manager-7bc548b56d-x2smm" Dec 16 07:00:59 crc kubenswrapper[4823]: I1216 07:00:59.803848 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cd4a50ea-aae8-4d0a-995b-b3a32aabbd83-serving-cert\") pod \"controller-manager-77d55c7d9c-fxzmr\" (UID: \"cd4a50ea-aae8-4d0a-995b-b3a32aabbd83\") " pod="openshift-controller-manager/controller-manager-77d55c7d9c-fxzmr" Dec 16 07:00:59 crc kubenswrapper[4823]: I1216 07:00:59.905480 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/819a28d4-4973-433b-90da-d0235e69bb0d-client-ca\") pod \"route-controller-manager-7bc548b56d-x2smm\" (UID: \"819a28d4-4973-433b-90da-d0235e69bb0d\") " pod="openshift-route-controller-manager/route-controller-manager-7bc548b56d-x2smm" Dec 16 07:00:59 crc kubenswrapper[4823]: I1216 07:00:59.905541 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/819a28d4-4973-433b-90da-d0235e69bb0d-serving-cert\") pod \"route-controller-manager-7bc548b56d-x2smm\" (UID: \"819a28d4-4973-433b-90da-d0235e69bb0d\") " pod="openshift-route-controller-manager/route-controller-manager-7bc548b56d-x2smm" Dec 16 07:00:59 crc kubenswrapper[4823]: I1216 07:00:59.905571 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cd4a50ea-aae8-4d0a-995b-b3a32aabbd83-proxy-ca-bundles\") pod \"controller-manager-77d55c7d9c-fxzmr\" (UID: \"cd4a50ea-aae8-4d0a-995b-b3a32aabbd83\") " pod="openshift-controller-manager/controller-manager-77d55c7d9c-fxzmr" Dec 16 07:00:59 crc kubenswrapper[4823]: I1216 07:00:59.905597 4823 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cd4a50ea-aae8-4d0a-995b-b3a32aabbd83-client-ca\") pod \"controller-manager-77d55c7d9c-fxzmr\" (UID: \"cd4a50ea-aae8-4d0a-995b-b3a32aabbd83\") " pod="openshift-controller-manager/controller-manager-77d55c7d9c-fxzmr" Dec 16 07:00:59 crc kubenswrapper[4823]: I1216 07:00:59.905621 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/819a28d4-4973-433b-90da-d0235e69bb0d-config\") pod \"route-controller-manager-7bc548b56d-x2smm\" (UID: \"819a28d4-4973-433b-90da-d0235e69bb0d\") " pod="openshift-route-controller-manager/route-controller-manager-7bc548b56d-x2smm" Dec 16 07:00:59 crc kubenswrapper[4823]: I1216 07:00:59.905652 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p9j7c\" (UniqueName: \"kubernetes.io/projected/cd4a50ea-aae8-4d0a-995b-b3a32aabbd83-kube-api-access-p9j7c\") pod \"controller-manager-77d55c7d9c-fxzmr\" (UID: \"cd4a50ea-aae8-4d0a-995b-b3a32aabbd83\") " pod="openshift-controller-manager/controller-manager-77d55c7d9c-fxzmr" Dec 16 07:00:59 crc kubenswrapper[4823]: I1216 07:00:59.905674 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd4a50ea-aae8-4d0a-995b-b3a32aabbd83-config\") pod \"controller-manager-77d55c7d9c-fxzmr\" (UID: \"cd4a50ea-aae8-4d0a-995b-b3a32aabbd83\") " pod="openshift-controller-manager/controller-manager-77d55c7d9c-fxzmr" Dec 16 07:00:59 crc kubenswrapper[4823]: I1216 07:00:59.905712 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5wnzg\" (UniqueName: \"kubernetes.io/projected/819a28d4-4973-433b-90da-d0235e69bb0d-kube-api-access-5wnzg\") pod \"route-controller-manager-7bc548b56d-x2smm\" (UID: \"819a28d4-4973-433b-90da-d0235e69bb0d\") " 
pod="openshift-route-controller-manager/route-controller-manager-7bc548b56d-x2smm" Dec 16 07:00:59 crc kubenswrapper[4823]: I1216 07:00:59.905729 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cd4a50ea-aae8-4d0a-995b-b3a32aabbd83-serving-cert\") pod \"controller-manager-77d55c7d9c-fxzmr\" (UID: \"cd4a50ea-aae8-4d0a-995b-b3a32aabbd83\") " pod="openshift-controller-manager/controller-manager-77d55c7d9c-fxzmr" Dec 16 07:00:59 crc kubenswrapper[4823]: I1216 07:00:59.908384 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/819a28d4-4973-433b-90da-d0235e69bb0d-config\") pod \"route-controller-manager-7bc548b56d-x2smm\" (UID: \"819a28d4-4973-433b-90da-d0235e69bb0d\") " pod="openshift-route-controller-manager/route-controller-manager-7bc548b56d-x2smm" Dec 16 07:00:59 crc kubenswrapper[4823]: I1216 07:00:59.908439 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cd4a50ea-aae8-4d0a-995b-b3a32aabbd83-client-ca\") pod \"controller-manager-77d55c7d9c-fxzmr\" (UID: \"cd4a50ea-aae8-4d0a-995b-b3a32aabbd83\") " pod="openshift-controller-manager/controller-manager-77d55c7d9c-fxzmr" Dec 16 07:00:59 crc kubenswrapper[4823]: I1216 07:00:59.908607 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd4a50ea-aae8-4d0a-995b-b3a32aabbd83-config\") pod \"controller-manager-77d55c7d9c-fxzmr\" (UID: \"cd4a50ea-aae8-4d0a-995b-b3a32aabbd83\") " pod="openshift-controller-manager/controller-manager-77d55c7d9c-fxzmr" Dec 16 07:00:59 crc kubenswrapper[4823]: I1216 07:00:59.908665 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cd4a50ea-aae8-4d0a-995b-b3a32aabbd83-proxy-ca-bundles\") pod 
\"controller-manager-77d55c7d9c-fxzmr\" (UID: \"cd4a50ea-aae8-4d0a-995b-b3a32aabbd83\") " pod="openshift-controller-manager/controller-manager-77d55c7d9c-fxzmr" Dec 16 07:00:59 crc kubenswrapper[4823]: I1216 07:00:59.909237 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/819a28d4-4973-433b-90da-d0235e69bb0d-client-ca\") pod \"route-controller-manager-7bc548b56d-x2smm\" (UID: \"819a28d4-4973-433b-90da-d0235e69bb0d\") " pod="openshift-route-controller-manager/route-controller-manager-7bc548b56d-x2smm" Dec 16 07:00:59 crc kubenswrapper[4823]: I1216 07:00:59.911990 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cd4a50ea-aae8-4d0a-995b-b3a32aabbd83-serving-cert\") pod \"controller-manager-77d55c7d9c-fxzmr\" (UID: \"cd4a50ea-aae8-4d0a-995b-b3a32aabbd83\") " pod="openshift-controller-manager/controller-manager-77d55c7d9c-fxzmr" Dec 16 07:00:59 crc kubenswrapper[4823]: I1216 07:00:59.913901 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/819a28d4-4973-433b-90da-d0235e69bb0d-serving-cert\") pod \"route-controller-manager-7bc548b56d-x2smm\" (UID: \"819a28d4-4973-433b-90da-d0235e69bb0d\") " pod="openshift-route-controller-manager/route-controller-manager-7bc548b56d-x2smm" Dec 16 07:00:59 crc kubenswrapper[4823]: I1216 07:00:59.926536 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9j7c\" (UniqueName: \"kubernetes.io/projected/cd4a50ea-aae8-4d0a-995b-b3a32aabbd83-kube-api-access-p9j7c\") pod \"controller-manager-77d55c7d9c-fxzmr\" (UID: \"cd4a50ea-aae8-4d0a-995b-b3a32aabbd83\") " pod="openshift-controller-manager/controller-manager-77d55c7d9c-fxzmr" Dec 16 07:00:59 crc kubenswrapper[4823]: I1216 07:00:59.928663 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-5wnzg\" (UniqueName: \"kubernetes.io/projected/819a28d4-4973-433b-90da-d0235e69bb0d-kube-api-access-5wnzg\") pod \"route-controller-manager-7bc548b56d-x2smm\" (UID: \"819a28d4-4973-433b-90da-d0235e69bb0d\") " pod="openshift-route-controller-manager/route-controller-manager-7bc548b56d-x2smm" Dec 16 07:00:59 crc kubenswrapper[4823]: I1216 07:00:59.963959 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-77d55c7d9c-fxzmr" Dec 16 07:00:59 crc kubenswrapper[4823]: I1216 07:00:59.977746 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7bc548b56d-x2smm" Dec 16 07:01:00 crc kubenswrapper[4823]: I1216 07:01:00.361613 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7bc548b56d-x2smm"] Dec 16 07:01:00 crc kubenswrapper[4823]: I1216 07:01:00.417140 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-77d55c7d9c-fxzmr"] Dec 16 07:01:00 crc kubenswrapper[4823]: W1216 07:01:00.421936 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd4a50ea_aae8_4d0a_995b_b3a32aabbd83.slice/crio-4ab82a5f6acc70d7409fa54425043df79a81dc0d020f7c322a4fddd52b96a46b WatchSource:0}: Error finding container 4ab82a5f6acc70d7409fa54425043df79a81dc0d020f7c322a4fddd52b96a46b: Status 404 returned error can't find the container with id 4ab82a5f6acc70d7409fa54425043df79a81dc0d020f7c322a4fddd52b96a46b Dec 16 07:01:00 crc kubenswrapper[4823]: I1216 07:01:00.998294 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-77d55c7d9c-fxzmr" 
event={"ID":"cd4a50ea-aae8-4d0a-995b-b3a32aabbd83","Type":"ContainerStarted","Data":"e528d7404244c99078cc2aa19698c713959440b6df3ef9a6b10f41275b6830b3"} Dec 16 07:01:00 crc kubenswrapper[4823]: I1216 07:01:00.998743 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-77d55c7d9c-fxzmr" event={"ID":"cd4a50ea-aae8-4d0a-995b-b3a32aabbd83","Type":"ContainerStarted","Data":"4ab82a5f6acc70d7409fa54425043df79a81dc0d020f7c322a4fddd52b96a46b"} Dec 16 07:01:00 crc kubenswrapper[4823]: I1216 07:01:00.998763 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-77d55c7d9c-fxzmr" Dec 16 07:01:01 crc kubenswrapper[4823]: I1216 07:01:01.002659 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7bc548b56d-x2smm" event={"ID":"819a28d4-4973-433b-90da-d0235e69bb0d","Type":"ContainerStarted","Data":"69968d36cda73d88866dcfd72a33d0c0856be99d83373b1f97a42b68d9aa2936"} Dec 16 07:01:01 crc kubenswrapper[4823]: I1216 07:01:01.002749 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7bc548b56d-x2smm" event={"ID":"819a28d4-4973-433b-90da-d0235e69bb0d","Type":"ContainerStarted","Data":"9df79f9d3bbc329490d017be6c7001cd6b4fda19abde4e7ac9e24a3381721b3d"} Dec 16 07:01:01 crc kubenswrapper[4823]: I1216 07:01:01.002922 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7bc548b56d-x2smm" Dec 16 07:01:01 crc kubenswrapper[4823]: I1216 07:01:01.007431 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-77d55c7d9c-fxzmr" Dec 16 07:01:01 crc kubenswrapper[4823]: I1216 07:01:01.022360 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-controller-manager/controller-manager-77d55c7d9c-fxzmr" podStartSLOduration=4.022332863 podStartE2EDuration="4.022332863s" podCreationTimestamp="2025-12-16 07:00:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 07:01:01.019185678 +0000 UTC m=+339.507751811" watchObservedRunningTime="2025-12-16 07:01:01.022332863 +0000 UTC m=+339.510898986" Dec 16 07:01:01 crc kubenswrapper[4823]: I1216 07:01:01.049472 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7bc548b56d-x2smm" podStartSLOduration=4.049443599 podStartE2EDuration="4.049443599s" podCreationTimestamp="2025-12-16 07:00:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 07:01:01.044790549 +0000 UTC m=+339.533356662" watchObservedRunningTime="2025-12-16 07:01:01.049443599 +0000 UTC m=+339.538009722" Dec 16 07:01:01 crc kubenswrapper[4823]: I1216 07:01:01.278743 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7bc548b56d-x2smm" Dec 16 07:01:17 crc kubenswrapper[4823]: I1216 07:01:17.433196 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-w7fhp"] Dec 16 07:01:17 crc kubenswrapper[4823]: I1216 07:01:17.435057 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-w7fhp" Dec 16 07:01:17 crc kubenswrapper[4823]: I1216 07:01:17.459733 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-w7fhp"] Dec 16 07:01:17 crc kubenswrapper[4823]: I1216 07:01:17.568150 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-w7fhp\" (UID: \"29c504a7-d523-47ef-a21e-bfc13454ff42\") " pod="openshift-image-registry/image-registry-66df7c8f76-w7fhp" Dec 16 07:01:17 crc kubenswrapper[4823]: I1216 07:01:17.568252 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkfkk\" (UniqueName: \"kubernetes.io/projected/29c504a7-d523-47ef-a21e-bfc13454ff42-kube-api-access-fkfkk\") pod \"image-registry-66df7c8f76-w7fhp\" (UID: \"29c504a7-d523-47ef-a21e-bfc13454ff42\") " pod="openshift-image-registry/image-registry-66df7c8f76-w7fhp" Dec 16 07:01:17 crc kubenswrapper[4823]: I1216 07:01:17.568308 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/29c504a7-d523-47ef-a21e-bfc13454ff42-bound-sa-token\") pod \"image-registry-66df7c8f76-w7fhp\" (UID: \"29c504a7-d523-47ef-a21e-bfc13454ff42\") " pod="openshift-image-registry/image-registry-66df7c8f76-w7fhp" Dec 16 07:01:17 crc kubenswrapper[4823]: I1216 07:01:17.568325 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/29c504a7-d523-47ef-a21e-bfc13454ff42-ca-trust-extracted\") pod \"image-registry-66df7c8f76-w7fhp\" (UID: \"29c504a7-d523-47ef-a21e-bfc13454ff42\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-w7fhp" Dec 16 07:01:17 crc kubenswrapper[4823]: I1216 07:01:17.568346 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/29c504a7-d523-47ef-a21e-bfc13454ff42-registry-certificates\") pod \"image-registry-66df7c8f76-w7fhp\" (UID: \"29c504a7-d523-47ef-a21e-bfc13454ff42\") " pod="openshift-image-registry/image-registry-66df7c8f76-w7fhp" Dec 16 07:01:17 crc kubenswrapper[4823]: I1216 07:01:17.568372 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/29c504a7-d523-47ef-a21e-bfc13454ff42-trusted-ca\") pod \"image-registry-66df7c8f76-w7fhp\" (UID: \"29c504a7-d523-47ef-a21e-bfc13454ff42\") " pod="openshift-image-registry/image-registry-66df7c8f76-w7fhp" Dec 16 07:01:17 crc kubenswrapper[4823]: I1216 07:01:17.568398 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/29c504a7-d523-47ef-a21e-bfc13454ff42-installation-pull-secrets\") pod \"image-registry-66df7c8f76-w7fhp\" (UID: \"29c504a7-d523-47ef-a21e-bfc13454ff42\") " pod="openshift-image-registry/image-registry-66df7c8f76-w7fhp" Dec 16 07:01:17 crc kubenswrapper[4823]: I1216 07:01:17.568416 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/29c504a7-d523-47ef-a21e-bfc13454ff42-registry-tls\") pod \"image-registry-66df7c8f76-w7fhp\" (UID: \"29c504a7-d523-47ef-a21e-bfc13454ff42\") " pod="openshift-image-registry/image-registry-66df7c8f76-w7fhp" Dec 16 07:01:17 crc kubenswrapper[4823]: I1216 07:01:17.589016 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-w7fhp\" (UID: \"29c504a7-d523-47ef-a21e-bfc13454ff42\") " pod="openshift-image-registry/image-registry-66df7c8f76-w7fhp" Dec 16 07:01:17 crc kubenswrapper[4823]: I1216 07:01:17.669506 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/29c504a7-d523-47ef-a21e-bfc13454ff42-trusted-ca\") pod \"image-registry-66df7c8f76-w7fhp\" (UID: \"29c504a7-d523-47ef-a21e-bfc13454ff42\") " pod="openshift-image-registry/image-registry-66df7c8f76-w7fhp" Dec 16 07:01:17 crc kubenswrapper[4823]: I1216 07:01:17.669567 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/29c504a7-d523-47ef-a21e-bfc13454ff42-installation-pull-secrets\") pod \"image-registry-66df7c8f76-w7fhp\" (UID: \"29c504a7-d523-47ef-a21e-bfc13454ff42\") " pod="openshift-image-registry/image-registry-66df7c8f76-w7fhp" Dec 16 07:01:17 crc kubenswrapper[4823]: I1216 07:01:17.669590 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/29c504a7-d523-47ef-a21e-bfc13454ff42-registry-tls\") pod \"image-registry-66df7c8f76-w7fhp\" (UID: \"29c504a7-d523-47ef-a21e-bfc13454ff42\") " pod="openshift-image-registry/image-registry-66df7c8f76-w7fhp" Dec 16 07:01:17 crc kubenswrapper[4823]: I1216 07:01:17.669641 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fkfkk\" (UniqueName: \"kubernetes.io/projected/29c504a7-d523-47ef-a21e-bfc13454ff42-kube-api-access-fkfkk\") pod \"image-registry-66df7c8f76-w7fhp\" (UID: \"29c504a7-d523-47ef-a21e-bfc13454ff42\") " pod="openshift-image-registry/image-registry-66df7c8f76-w7fhp" Dec 16 07:01:17 crc kubenswrapper[4823]: I1216 07:01:17.669701 4823 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/29c504a7-d523-47ef-a21e-bfc13454ff42-bound-sa-token\") pod \"image-registry-66df7c8f76-w7fhp\" (UID: \"29c504a7-d523-47ef-a21e-bfc13454ff42\") " pod="openshift-image-registry/image-registry-66df7c8f76-w7fhp" Dec 16 07:01:17 crc kubenswrapper[4823]: I1216 07:01:17.669722 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/29c504a7-d523-47ef-a21e-bfc13454ff42-ca-trust-extracted\") pod \"image-registry-66df7c8f76-w7fhp\" (UID: \"29c504a7-d523-47ef-a21e-bfc13454ff42\") " pod="openshift-image-registry/image-registry-66df7c8f76-w7fhp" Dec 16 07:01:17 crc kubenswrapper[4823]: I1216 07:01:17.669748 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/29c504a7-d523-47ef-a21e-bfc13454ff42-registry-certificates\") pod \"image-registry-66df7c8f76-w7fhp\" (UID: \"29c504a7-d523-47ef-a21e-bfc13454ff42\") " pod="openshift-image-registry/image-registry-66df7c8f76-w7fhp" Dec 16 07:01:17 crc kubenswrapper[4823]: I1216 07:01:17.670954 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/29c504a7-d523-47ef-a21e-bfc13454ff42-ca-trust-extracted\") pod \"image-registry-66df7c8f76-w7fhp\" (UID: \"29c504a7-d523-47ef-a21e-bfc13454ff42\") " pod="openshift-image-registry/image-registry-66df7c8f76-w7fhp" Dec 16 07:01:17 crc kubenswrapper[4823]: I1216 07:01:17.670977 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/29c504a7-d523-47ef-a21e-bfc13454ff42-trusted-ca\") pod \"image-registry-66df7c8f76-w7fhp\" (UID: \"29c504a7-d523-47ef-a21e-bfc13454ff42\") " pod="openshift-image-registry/image-registry-66df7c8f76-w7fhp" Dec 16 
07:01:17 crc kubenswrapper[4823]: I1216 07:01:17.671229 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/29c504a7-d523-47ef-a21e-bfc13454ff42-registry-certificates\") pod \"image-registry-66df7c8f76-w7fhp\" (UID: \"29c504a7-d523-47ef-a21e-bfc13454ff42\") " pod="openshift-image-registry/image-registry-66df7c8f76-w7fhp" Dec 16 07:01:17 crc kubenswrapper[4823]: I1216 07:01:17.676857 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/29c504a7-d523-47ef-a21e-bfc13454ff42-registry-tls\") pod \"image-registry-66df7c8f76-w7fhp\" (UID: \"29c504a7-d523-47ef-a21e-bfc13454ff42\") " pod="openshift-image-registry/image-registry-66df7c8f76-w7fhp" Dec 16 07:01:17 crc kubenswrapper[4823]: I1216 07:01:17.678696 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/29c504a7-d523-47ef-a21e-bfc13454ff42-installation-pull-secrets\") pod \"image-registry-66df7c8f76-w7fhp\" (UID: \"29c504a7-d523-47ef-a21e-bfc13454ff42\") " pod="openshift-image-registry/image-registry-66df7c8f76-w7fhp" Dec 16 07:01:17 crc kubenswrapper[4823]: I1216 07:01:17.689457 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/29c504a7-d523-47ef-a21e-bfc13454ff42-bound-sa-token\") pod \"image-registry-66df7c8f76-w7fhp\" (UID: \"29c504a7-d523-47ef-a21e-bfc13454ff42\") " pod="openshift-image-registry/image-registry-66df7c8f76-w7fhp" Dec 16 07:01:17 crc kubenswrapper[4823]: I1216 07:01:17.689676 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkfkk\" (UniqueName: \"kubernetes.io/projected/29c504a7-d523-47ef-a21e-bfc13454ff42-kube-api-access-fkfkk\") pod \"image-registry-66df7c8f76-w7fhp\" (UID: \"29c504a7-d523-47ef-a21e-bfc13454ff42\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-w7fhp" Dec 16 07:01:17 crc kubenswrapper[4823]: I1216 07:01:17.753935 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-w7fhp" Dec 16 07:01:18 crc kubenswrapper[4823]: I1216 07:01:18.301550 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-w7fhp"] Dec 16 07:01:18 crc kubenswrapper[4823]: W1216 07:01:18.306764 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod29c504a7_d523_47ef_a21e_bfc13454ff42.slice/crio-398ed81ec6f3aa048488c97835ec31fbcd037dc1bd1ea9251f54106d49c433ef WatchSource:0}: Error finding container 398ed81ec6f3aa048488c97835ec31fbcd037dc1bd1ea9251f54106d49c433ef: Status 404 returned error can't find the container with id 398ed81ec6f3aa048488c97835ec31fbcd037dc1bd1ea9251f54106d49c433ef Dec 16 07:01:19 crc kubenswrapper[4823]: I1216 07:01:19.120328 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-w7fhp" event={"ID":"29c504a7-d523-47ef-a21e-bfc13454ff42","Type":"ContainerStarted","Data":"b64efc620130aae1bd170ab20559f643f3ff5cb51aabf59d3e868ccd3145c5b8"} Dec 16 07:01:19 crc kubenswrapper[4823]: I1216 07:01:19.120805 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-w7fhp" event={"ID":"29c504a7-d523-47ef-a21e-bfc13454ff42","Type":"ContainerStarted","Data":"398ed81ec6f3aa048488c97835ec31fbcd037dc1bd1ea9251f54106d49c433ef"} Dec 16 07:01:19 crc kubenswrapper[4823]: I1216 07:01:19.120843 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-w7fhp" Dec 16 07:01:19 crc kubenswrapper[4823]: I1216 07:01:19.153861 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-image-registry/image-registry-66df7c8f76-w7fhp" podStartSLOduration=2.153822415 podStartE2EDuration="2.153822415s" podCreationTimestamp="2025-12-16 07:01:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 07:01:19.147707431 +0000 UTC m=+357.636273564" watchObservedRunningTime="2025-12-16 07:01:19.153822415 +0000 UTC m=+357.642388548" Dec 16 07:01:25 crc kubenswrapper[4823]: I1216 07:01:25.595010 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-shg64"] Dec 16 07:01:25 crc kubenswrapper[4823]: I1216 07:01:25.596408 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-shg64" podUID="726c1e86-0af2-45b0-bc89-af72df38eff8" containerName="registry-server" containerID="cri-o://b819c7fa993cb265b0407c30f4dc9ee842607e33640cd31326a01c69a6282451" gracePeriod=30 Dec 16 07:01:25 crc kubenswrapper[4823]: I1216 07:01:25.617390 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hhp8l"] Dec 16 07:01:25 crc kubenswrapper[4823]: I1216 07:01:25.617766 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-hhp8l" podUID="ca6b042f-7b3a-4204-90a8-d6a2c29fd271" containerName="registry-server" containerID="cri-o://c6318e6e99455f620a46c95ed584b312dbf4dc6f97a4c7b09556b4bdec832edc" gracePeriod=30 Dec 16 07:01:25 crc kubenswrapper[4823]: I1216 07:01:25.643462 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-thj57"] Dec 16 07:01:25 crc kubenswrapper[4823]: I1216 07:01:25.644723 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-thj57" podUID="dca532ee-e66a-411a-afcc-646f96a22a62" 
containerName="marketplace-operator" containerID="cri-o://ac61fd69402f631638df30793e222270a5947ed323cb9551aba3e12a30f6fc8d" gracePeriod=30 Dec 16 07:01:25 crc kubenswrapper[4823]: I1216 07:01:25.670028 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5x67m"] Dec 16 07:01:25 crc kubenswrapper[4823]: I1216 07:01:25.684649 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-5x67m" podUID="7220fce6-80f1-4da7-9a90-f106616709ae" containerName="registry-server" containerID="cri-o://a34010f49d4634fa3e7a9d8c1e3ddb232092355af14f8a3d3f92fb6bb4779033" gracePeriod=30 Dec 16 07:01:25 crc kubenswrapper[4823]: I1216 07:01:25.685622 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-j5hk4"] Dec 16 07:01:25 crc kubenswrapper[4823]: I1216 07:01:25.704419 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-j5hk4" Dec 16 07:01:25 crc kubenswrapper[4823]: I1216 07:01:25.709246 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2jtdw"] Dec 16 07:01:25 crc kubenswrapper[4823]: I1216 07:01:25.709495 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-2jtdw" podUID="dfaef15c-ea70-4287-bf78-7e99f0fd0626" containerName="registry-server" containerID="cri-o://d244bf80e24864179e3b869c5e18a7ed9bb9718ac4a051319b4f010b759cf384" gracePeriod=30 Dec 16 07:01:25 crc kubenswrapper[4823]: I1216 07:01:25.719053 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-j5hk4"] Dec 16 07:01:25 crc kubenswrapper[4823]: E1216 07:01:25.744078 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if 
PID of b819c7fa993cb265b0407c30f4dc9ee842607e33640cd31326a01c69a6282451 is running failed: container process not found" containerID="b819c7fa993cb265b0407c30f4dc9ee842607e33640cd31326a01c69a6282451" cmd=["grpc_health_probe","-addr=:50051"] Dec 16 07:01:25 crc kubenswrapper[4823]: E1216 07:01:25.745019 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b819c7fa993cb265b0407c30f4dc9ee842607e33640cd31326a01c69a6282451 is running failed: container process not found" containerID="b819c7fa993cb265b0407c30f4dc9ee842607e33640cd31326a01c69a6282451" cmd=["grpc_health_probe","-addr=:50051"] Dec 16 07:01:25 crc kubenswrapper[4823]: E1216 07:01:25.745561 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b819c7fa993cb265b0407c30f4dc9ee842607e33640cd31326a01c69a6282451 is running failed: container process not found" containerID="b819c7fa993cb265b0407c30f4dc9ee842607e33640cd31326a01c69a6282451" cmd=["grpc_health_probe","-addr=:50051"] Dec 16 07:01:25 crc kubenswrapper[4823]: E1216 07:01:25.745662 4823 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b819c7fa993cb265b0407c30f4dc9ee842607e33640cd31326a01c69a6282451 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/certified-operators-shg64" podUID="726c1e86-0af2-45b0-bc89-af72df38eff8" containerName="registry-server" Dec 16 07:01:25 crc kubenswrapper[4823]: I1216 07:01:25.769230 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/105c5da3-e305-4f41-968c-19466291e660-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-j5hk4\" (UID: \"105c5da3-e305-4f41-968c-19466291e660\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-j5hk4" Dec 16 07:01:25 crc kubenswrapper[4823]: I1216 07:01:25.769931 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/105c5da3-e305-4f41-968c-19466291e660-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-j5hk4\" (UID: \"105c5da3-e305-4f41-968c-19466291e660\") " pod="openshift-marketplace/marketplace-operator-79b997595-j5hk4" Dec 16 07:01:25 crc kubenswrapper[4823]: I1216 07:01:25.770247 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8c9rd\" (UniqueName: \"kubernetes.io/projected/105c5da3-e305-4f41-968c-19466291e660-kube-api-access-8c9rd\") pod \"marketplace-operator-79b997595-j5hk4\" (UID: \"105c5da3-e305-4f41-968c-19466291e660\") " pod="openshift-marketplace/marketplace-operator-79b997595-j5hk4" Dec 16 07:01:25 crc kubenswrapper[4823]: I1216 07:01:25.871528 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8c9rd\" (UniqueName: \"kubernetes.io/projected/105c5da3-e305-4f41-968c-19466291e660-kube-api-access-8c9rd\") pod \"marketplace-operator-79b997595-j5hk4\" (UID: \"105c5da3-e305-4f41-968c-19466291e660\") " pod="openshift-marketplace/marketplace-operator-79b997595-j5hk4" Dec 16 07:01:25 crc kubenswrapper[4823]: I1216 07:01:25.872494 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/105c5da3-e305-4f41-968c-19466291e660-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-j5hk4\" (UID: \"105c5da3-e305-4f41-968c-19466291e660\") " pod="openshift-marketplace/marketplace-operator-79b997595-j5hk4" Dec 16 07:01:25 crc kubenswrapper[4823]: I1216 07:01:25.872682 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/105c5da3-e305-4f41-968c-19466291e660-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-j5hk4\" (UID: \"105c5da3-e305-4f41-968c-19466291e660\") " pod="openshift-marketplace/marketplace-operator-79b997595-j5hk4" Dec 16 07:01:25 crc kubenswrapper[4823]: I1216 07:01:25.874285 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/105c5da3-e305-4f41-968c-19466291e660-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-j5hk4\" (UID: \"105c5da3-e305-4f41-968c-19466291e660\") " pod="openshift-marketplace/marketplace-operator-79b997595-j5hk4" Dec 16 07:01:25 crc kubenswrapper[4823]: I1216 07:01:25.881008 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/105c5da3-e305-4f41-968c-19466291e660-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-j5hk4\" (UID: \"105c5da3-e305-4f41-968c-19466291e660\") " pod="openshift-marketplace/marketplace-operator-79b997595-j5hk4" Dec 16 07:01:25 crc kubenswrapper[4823]: E1216 07:01:25.886910 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c6318e6e99455f620a46c95ed584b312dbf4dc6f97a4c7b09556b4bdec832edc is running failed: container process not found" containerID="c6318e6e99455f620a46c95ed584b312dbf4dc6f97a4c7b09556b4bdec832edc" cmd=["grpc_health_probe","-addr=:50051"] Dec 16 07:01:25 crc kubenswrapper[4823]: E1216 07:01:25.887843 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c6318e6e99455f620a46c95ed584b312dbf4dc6f97a4c7b09556b4bdec832edc is running failed: container process not found" 
containerID="c6318e6e99455f620a46c95ed584b312dbf4dc6f97a4c7b09556b4bdec832edc" cmd=["grpc_health_probe","-addr=:50051"] Dec 16 07:01:25 crc kubenswrapper[4823]: E1216 07:01:25.888239 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c6318e6e99455f620a46c95ed584b312dbf4dc6f97a4c7b09556b4bdec832edc is running failed: container process not found" containerID="c6318e6e99455f620a46c95ed584b312dbf4dc6f97a4c7b09556b4bdec832edc" cmd=["grpc_health_probe","-addr=:50051"] Dec 16 07:01:25 crc kubenswrapper[4823]: E1216 07:01:25.888331 4823 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c6318e6e99455f620a46c95ed584b312dbf4dc6f97a4c7b09556b4bdec832edc is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/community-operators-hhp8l" podUID="ca6b042f-7b3a-4204-90a8-d6a2c29fd271" containerName="registry-server" Dec 16 07:01:25 crc kubenswrapper[4823]: I1216 07:01:25.893540 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8c9rd\" (UniqueName: \"kubernetes.io/projected/105c5da3-e305-4f41-968c-19466291e660-kube-api-access-8c9rd\") pod \"marketplace-operator-79b997595-j5hk4\" (UID: \"105c5da3-e305-4f41-968c-19466291e660\") " pod="openshift-marketplace/marketplace-operator-79b997595-j5hk4" Dec 16 07:01:26 crc kubenswrapper[4823]: I1216 07:01:26.171050 4823 generic.go:334] "Generic (PLEG): container finished" podID="dfaef15c-ea70-4287-bf78-7e99f0fd0626" containerID="d244bf80e24864179e3b869c5e18a7ed9bb9718ac4a051319b4f010b759cf384" exitCode=0 Dec 16 07:01:26 crc kubenswrapper[4823]: I1216 07:01:26.171164 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2jtdw" 
event={"ID":"dfaef15c-ea70-4287-bf78-7e99f0fd0626","Type":"ContainerDied","Data":"d244bf80e24864179e3b869c5e18a7ed9bb9718ac4a051319b4f010b759cf384"} Dec 16 07:01:26 crc kubenswrapper[4823]: I1216 07:01:26.172448 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-j5hk4" Dec 16 07:01:26 crc kubenswrapper[4823]: I1216 07:01:26.173821 4823 generic.go:334] "Generic (PLEG): container finished" podID="dca532ee-e66a-411a-afcc-646f96a22a62" containerID="ac61fd69402f631638df30793e222270a5947ed323cb9551aba3e12a30f6fc8d" exitCode=0 Dec 16 07:01:26 crc kubenswrapper[4823]: I1216 07:01:26.173862 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-thj57" event={"ID":"dca532ee-e66a-411a-afcc-646f96a22a62","Type":"ContainerDied","Data":"ac61fd69402f631638df30793e222270a5947ed323cb9551aba3e12a30f6fc8d"} Dec 16 07:01:26 crc kubenswrapper[4823]: I1216 07:01:26.173947 4823 scope.go:117] "RemoveContainer" containerID="2d358bd6f6c0e8e78ad8d528e42077f33fdee2245475689910ade600668ec0c7" Dec 16 07:01:26 crc kubenswrapper[4823]: I1216 07:01:26.176543 4823 generic.go:334] "Generic (PLEG): container finished" podID="726c1e86-0af2-45b0-bc89-af72df38eff8" containerID="b819c7fa993cb265b0407c30f4dc9ee842607e33640cd31326a01c69a6282451" exitCode=0 Dec 16 07:01:26 crc kubenswrapper[4823]: I1216 07:01:26.176568 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-shg64" event={"ID":"726c1e86-0af2-45b0-bc89-af72df38eff8","Type":"ContainerDied","Data":"b819c7fa993cb265b0407c30f4dc9ee842607e33640cd31326a01c69a6282451"} Dec 16 07:01:26 crc kubenswrapper[4823]: I1216 07:01:26.179638 4823 generic.go:334] "Generic (PLEG): container finished" podID="ca6b042f-7b3a-4204-90a8-d6a2c29fd271" containerID="c6318e6e99455f620a46c95ed584b312dbf4dc6f97a4c7b09556b4bdec832edc" exitCode=0 Dec 16 07:01:26 crc kubenswrapper[4823]: I1216 
07:01:26.179697 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hhp8l" event={"ID":"ca6b042f-7b3a-4204-90a8-d6a2c29fd271","Type":"ContainerDied","Data":"c6318e6e99455f620a46c95ed584b312dbf4dc6f97a4c7b09556b4bdec832edc"} Dec 16 07:01:26 crc kubenswrapper[4823]: I1216 07:01:26.188242 4823 generic.go:334] "Generic (PLEG): container finished" podID="7220fce6-80f1-4da7-9a90-f106616709ae" containerID="a34010f49d4634fa3e7a9d8c1e3ddb232092355af14f8a3d3f92fb6bb4779033" exitCode=0 Dec 16 07:01:26 crc kubenswrapper[4823]: I1216 07:01:26.188302 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5x67m" event={"ID":"7220fce6-80f1-4da7-9a90-f106616709ae","Type":"ContainerDied","Data":"a34010f49d4634fa3e7a9d8c1e3ddb232092355af14f8a3d3f92fb6bb4779033"} Dec 16 07:01:26 crc kubenswrapper[4823]: I1216 07:01:26.261547 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-shg64" Dec 16 07:01:26 crc kubenswrapper[4823]: I1216 07:01:26.337509 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5x67m" Dec 16 07:01:26 crc kubenswrapper[4823]: I1216 07:01:26.370106 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hhp8l" Dec 16 07:01:26 crc kubenswrapper[4823]: I1216 07:01:26.377359 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-thj57" Dec 16 07:01:26 crc kubenswrapper[4823]: I1216 07:01:26.379238 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7220fce6-80f1-4da7-9a90-f106616709ae-utilities\") pod \"7220fce6-80f1-4da7-9a90-f106616709ae\" (UID: \"7220fce6-80f1-4da7-9a90-f106616709ae\") " Dec 16 07:01:26 crc kubenswrapper[4823]: I1216 07:01:26.379291 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8fg8h\" (UniqueName: \"kubernetes.io/projected/726c1e86-0af2-45b0-bc89-af72df38eff8-kube-api-access-8fg8h\") pod \"726c1e86-0af2-45b0-bc89-af72df38eff8\" (UID: \"726c1e86-0af2-45b0-bc89-af72df38eff8\") " Dec 16 07:01:26 crc kubenswrapper[4823]: I1216 07:01:26.379353 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-64ggh\" (UniqueName: \"kubernetes.io/projected/7220fce6-80f1-4da7-9a90-f106616709ae-kube-api-access-64ggh\") pod \"7220fce6-80f1-4da7-9a90-f106616709ae\" (UID: \"7220fce6-80f1-4da7-9a90-f106616709ae\") " Dec 16 07:01:26 crc kubenswrapper[4823]: I1216 07:01:26.379394 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/726c1e86-0af2-45b0-bc89-af72df38eff8-utilities\") pod \"726c1e86-0af2-45b0-bc89-af72df38eff8\" (UID: \"726c1e86-0af2-45b0-bc89-af72df38eff8\") " Dec 16 07:01:26 crc kubenswrapper[4823]: I1216 07:01:26.379444 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/726c1e86-0af2-45b0-bc89-af72df38eff8-catalog-content\") pod \"726c1e86-0af2-45b0-bc89-af72df38eff8\" (UID: \"726c1e86-0af2-45b0-bc89-af72df38eff8\") " Dec 16 07:01:26 crc kubenswrapper[4823]: I1216 07:01:26.379535 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7220fce6-80f1-4da7-9a90-f106616709ae-catalog-content\") pod \"7220fce6-80f1-4da7-9a90-f106616709ae\" (UID: \"7220fce6-80f1-4da7-9a90-f106616709ae\") " Dec 16 07:01:26 crc kubenswrapper[4823]: I1216 07:01:26.380999 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7220fce6-80f1-4da7-9a90-f106616709ae-utilities" (OuterVolumeSpecName: "utilities") pod "7220fce6-80f1-4da7-9a90-f106616709ae" (UID: "7220fce6-80f1-4da7-9a90-f106616709ae"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:01:26 crc kubenswrapper[4823]: I1216 07:01:26.382703 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/726c1e86-0af2-45b0-bc89-af72df38eff8-utilities" (OuterVolumeSpecName: "utilities") pod "726c1e86-0af2-45b0-bc89-af72df38eff8" (UID: "726c1e86-0af2-45b0-bc89-af72df38eff8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:01:26 crc kubenswrapper[4823]: I1216 07:01:26.383945 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7220fce6-80f1-4da7-9a90-f106616709ae-kube-api-access-64ggh" (OuterVolumeSpecName: "kube-api-access-64ggh") pod "7220fce6-80f1-4da7-9a90-f106616709ae" (UID: "7220fce6-80f1-4da7-9a90-f106616709ae"). InnerVolumeSpecName "kube-api-access-64ggh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:01:26 crc kubenswrapper[4823]: I1216 07:01:26.384303 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/726c1e86-0af2-45b0-bc89-af72df38eff8-kube-api-access-8fg8h" (OuterVolumeSpecName: "kube-api-access-8fg8h") pod "726c1e86-0af2-45b0-bc89-af72df38eff8" (UID: "726c1e86-0af2-45b0-bc89-af72df38eff8"). InnerVolumeSpecName "kube-api-access-8fg8h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:01:26 crc kubenswrapper[4823]: I1216 07:01:26.409725 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2jtdw" Dec 16 07:01:26 crc kubenswrapper[4823]: I1216 07:01:26.442372 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7220fce6-80f1-4da7-9a90-f106616709ae-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7220fce6-80f1-4da7-9a90-f106616709ae" (UID: "7220fce6-80f1-4da7-9a90-f106616709ae"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:01:26 crc kubenswrapper[4823]: I1216 07:01:26.451953 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/726c1e86-0af2-45b0-bc89-af72df38eff8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "726c1e86-0af2-45b0-bc89-af72df38eff8" (UID: "726c1e86-0af2-45b0-bc89-af72df38eff8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:01:26 crc kubenswrapper[4823]: I1216 07:01:26.480486 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dfaef15c-ea70-4287-bf78-7e99f0fd0626-catalog-content\") pod \"dfaef15c-ea70-4287-bf78-7e99f0fd0626\" (UID: \"dfaef15c-ea70-4287-bf78-7e99f0fd0626\") " Dec 16 07:01:26 crc kubenswrapper[4823]: I1216 07:01:26.480576 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-blnkq\" (UniqueName: \"kubernetes.io/projected/dca532ee-e66a-411a-afcc-646f96a22a62-kube-api-access-blnkq\") pod \"dca532ee-e66a-411a-afcc-646f96a22a62\" (UID: \"dca532ee-e66a-411a-afcc-646f96a22a62\") " Dec 16 07:01:26 crc kubenswrapper[4823]: I1216 07:01:26.480641 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dca532ee-e66a-411a-afcc-646f96a22a62-marketplace-trusted-ca\") pod \"dca532ee-e66a-411a-afcc-646f96a22a62\" (UID: \"dca532ee-e66a-411a-afcc-646f96a22a62\") " Dec 16 07:01:26 crc kubenswrapper[4823]: I1216 07:01:26.480681 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/dca532ee-e66a-411a-afcc-646f96a22a62-marketplace-operator-metrics\") pod \"dca532ee-e66a-411a-afcc-646f96a22a62\" (UID: \"dca532ee-e66a-411a-afcc-646f96a22a62\") " Dec 16 07:01:26 crc kubenswrapper[4823]: I1216 07:01:26.480705 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b6mmc\" (UniqueName: \"kubernetes.io/projected/dfaef15c-ea70-4287-bf78-7e99f0fd0626-kube-api-access-b6mmc\") pod \"dfaef15c-ea70-4287-bf78-7e99f0fd0626\" (UID: \"dfaef15c-ea70-4287-bf78-7e99f0fd0626\") " Dec 16 07:01:26 crc kubenswrapper[4823]: I1216 07:01:26.480733 4823 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca6b042f-7b3a-4204-90a8-d6a2c29fd271-catalog-content\") pod \"ca6b042f-7b3a-4204-90a8-d6a2c29fd271\" (UID: \"ca6b042f-7b3a-4204-90a8-d6a2c29fd271\") " Dec 16 07:01:26 crc kubenswrapper[4823]: I1216 07:01:26.480799 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dfaef15c-ea70-4287-bf78-7e99f0fd0626-utilities\") pod \"dfaef15c-ea70-4287-bf78-7e99f0fd0626\" (UID: \"dfaef15c-ea70-4287-bf78-7e99f0fd0626\") " Dec 16 07:01:26 crc kubenswrapper[4823]: I1216 07:01:26.480827 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca6b042f-7b3a-4204-90a8-d6a2c29fd271-utilities\") pod \"ca6b042f-7b3a-4204-90a8-d6a2c29fd271\" (UID: \"ca6b042f-7b3a-4204-90a8-d6a2c29fd271\") " Dec 16 07:01:26 crc kubenswrapper[4823]: I1216 07:01:26.480854 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ppqn\" (UniqueName: \"kubernetes.io/projected/ca6b042f-7b3a-4204-90a8-d6a2c29fd271-kube-api-access-6ppqn\") pod \"ca6b042f-7b3a-4204-90a8-d6a2c29fd271\" (UID: \"ca6b042f-7b3a-4204-90a8-d6a2c29fd271\") " Dec 16 07:01:26 crc kubenswrapper[4823]: I1216 07:01:26.481114 4823 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7220fce6-80f1-4da7-9a90-f106616709ae-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 07:01:26 crc kubenswrapper[4823]: I1216 07:01:26.481127 4823 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7220fce6-80f1-4da7-9a90-f106616709ae-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 07:01:26 crc kubenswrapper[4823]: I1216 07:01:26.481136 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8fg8h\" 
(UniqueName: \"kubernetes.io/projected/726c1e86-0af2-45b0-bc89-af72df38eff8-kube-api-access-8fg8h\") on node \"crc\" DevicePath \"\"" Dec 16 07:01:26 crc kubenswrapper[4823]: I1216 07:01:26.481147 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-64ggh\" (UniqueName: \"kubernetes.io/projected/7220fce6-80f1-4da7-9a90-f106616709ae-kube-api-access-64ggh\") on node \"crc\" DevicePath \"\"" Dec 16 07:01:26 crc kubenswrapper[4823]: I1216 07:01:26.481155 4823 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/726c1e86-0af2-45b0-bc89-af72df38eff8-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 07:01:26 crc kubenswrapper[4823]: I1216 07:01:26.481164 4823 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/726c1e86-0af2-45b0-bc89-af72df38eff8-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 07:01:26 crc kubenswrapper[4823]: I1216 07:01:26.481833 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dca532ee-e66a-411a-afcc-646f96a22a62-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "dca532ee-e66a-411a-afcc-646f96a22a62" (UID: "dca532ee-e66a-411a-afcc-646f96a22a62"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:01:26 crc kubenswrapper[4823]: I1216 07:01:26.483149 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dfaef15c-ea70-4287-bf78-7e99f0fd0626-utilities" (OuterVolumeSpecName: "utilities") pod "dfaef15c-ea70-4287-bf78-7e99f0fd0626" (UID: "dfaef15c-ea70-4287-bf78-7e99f0fd0626"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:01:26 crc kubenswrapper[4823]: I1216 07:01:26.483933 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca6b042f-7b3a-4204-90a8-d6a2c29fd271-utilities" (OuterVolumeSpecName: "utilities") pod "ca6b042f-7b3a-4204-90a8-d6a2c29fd271" (UID: "ca6b042f-7b3a-4204-90a8-d6a2c29fd271"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:01:26 crc kubenswrapper[4823]: I1216 07:01:26.485781 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dca532ee-e66a-411a-afcc-646f96a22a62-kube-api-access-blnkq" (OuterVolumeSpecName: "kube-api-access-blnkq") pod "dca532ee-e66a-411a-afcc-646f96a22a62" (UID: "dca532ee-e66a-411a-afcc-646f96a22a62"). InnerVolumeSpecName "kube-api-access-blnkq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:01:26 crc kubenswrapper[4823]: I1216 07:01:26.486416 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dca532ee-e66a-411a-afcc-646f96a22a62-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "dca532ee-e66a-411a-afcc-646f96a22a62" (UID: "dca532ee-e66a-411a-afcc-646f96a22a62"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:01:26 crc kubenswrapper[4823]: I1216 07:01:26.487331 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dfaef15c-ea70-4287-bf78-7e99f0fd0626-kube-api-access-b6mmc" (OuterVolumeSpecName: "kube-api-access-b6mmc") pod "dfaef15c-ea70-4287-bf78-7e99f0fd0626" (UID: "dfaef15c-ea70-4287-bf78-7e99f0fd0626"). InnerVolumeSpecName "kube-api-access-b6mmc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:01:26 crc kubenswrapper[4823]: I1216 07:01:26.487676 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca6b042f-7b3a-4204-90a8-d6a2c29fd271-kube-api-access-6ppqn" (OuterVolumeSpecName: "kube-api-access-6ppqn") pod "ca6b042f-7b3a-4204-90a8-d6a2c29fd271" (UID: "ca6b042f-7b3a-4204-90a8-d6a2c29fd271"). InnerVolumeSpecName "kube-api-access-6ppqn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:01:26 crc kubenswrapper[4823]: I1216 07:01:26.543075 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca6b042f-7b3a-4204-90a8-d6a2c29fd271-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ca6b042f-7b3a-4204-90a8-d6a2c29fd271" (UID: "ca6b042f-7b3a-4204-90a8-d6a2c29fd271"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:01:26 crc kubenswrapper[4823]: I1216 07:01:26.582255 4823 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dca532ee-e66a-411a-afcc-646f96a22a62-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 16 07:01:26 crc kubenswrapper[4823]: I1216 07:01:26.582294 4823 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/dca532ee-e66a-411a-afcc-646f96a22a62-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 16 07:01:26 crc kubenswrapper[4823]: I1216 07:01:26.582305 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b6mmc\" (UniqueName: \"kubernetes.io/projected/dfaef15c-ea70-4287-bf78-7e99f0fd0626-kube-api-access-b6mmc\") on node \"crc\" DevicePath \"\"" Dec 16 07:01:26 crc kubenswrapper[4823]: I1216 07:01:26.582317 4823 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/ca6b042f-7b3a-4204-90a8-d6a2c29fd271-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 16 07:01:26 crc kubenswrapper[4823]: I1216 07:01:26.582330 4823 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dfaef15c-ea70-4287-bf78-7e99f0fd0626-utilities\") on node \"crc\" DevicePath \"\""
Dec 16 07:01:26 crc kubenswrapper[4823]: I1216 07:01:26.582338 4823 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca6b042f-7b3a-4204-90a8-d6a2c29fd271-utilities\") on node \"crc\" DevicePath \"\""
Dec 16 07:01:26 crc kubenswrapper[4823]: I1216 07:01:26.582347 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ppqn\" (UniqueName: \"kubernetes.io/projected/ca6b042f-7b3a-4204-90a8-d6a2c29fd271-kube-api-access-6ppqn\") on node \"crc\" DevicePath \"\""
Dec 16 07:01:26 crc kubenswrapper[4823]: I1216 07:01:26.582358 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-blnkq\" (UniqueName: \"kubernetes.io/projected/dca532ee-e66a-411a-afcc-646f96a22a62-kube-api-access-blnkq\") on node \"crc\" DevicePath \"\""
Dec 16 07:01:26 crc kubenswrapper[4823]: I1216 07:01:26.620345 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dfaef15c-ea70-4287-bf78-7e99f0fd0626-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dfaef15c-ea70-4287-bf78-7e99f0fd0626" (UID: "dfaef15c-ea70-4287-bf78-7e99f0fd0626"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 16 07:01:26 crc kubenswrapper[4823]: I1216 07:01:26.683963 4823 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dfaef15c-ea70-4287-bf78-7e99f0fd0626-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 16 07:01:26 crc kubenswrapper[4823]: I1216 07:01:26.690420 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-j5hk4"]
Dec 16 07:01:27 crc kubenswrapper[4823]: I1216 07:01:27.194473 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-j5hk4" event={"ID":"105c5da3-e305-4f41-968c-19466291e660","Type":"ContainerStarted","Data":"6790b2a9d573cf3b769fda05936af3af9d35026ecfa5cbd4657fd4a548214e37"}
Dec 16 07:01:27 crc kubenswrapper[4823]: I1216 07:01:27.194547 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-j5hk4" event={"ID":"105c5da3-e305-4f41-968c-19466291e660","Type":"ContainerStarted","Data":"cbf330eb05bc36c9a46fba80a84029d2de80fa027e357c9e8f56cc5c7514e8aa"}
Dec 16 07:01:27 crc kubenswrapper[4823]: I1216 07:01:27.194702 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-j5hk4"
Dec 16 07:01:27 crc kubenswrapper[4823]: I1216 07:01:27.196691 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-thj57" event={"ID":"dca532ee-e66a-411a-afcc-646f96a22a62","Type":"ContainerDied","Data":"827f50daeb452bc4e72dd43443df535abd767d609a2be17165da72edc21f52ab"}
Dec 16 07:01:27 crc kubenswrapper[4823]: I1216 07:01:27.196740 4823 scope.go:117] "RemoveContainer" containerID="ac61fd69402f631638df30793e222270a5947ed323cb9551aba3e12a30f6fc8d"
Dec 16 07:01:27 crc kubenswrapper[4823]: I1216 07:01:27.196704 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-thj57"
Dec 16 07:01:27 crc kubenswrapper[4823]: I1216 07:01:27.197929 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-j5hk4"
Dec 16 07:01:27 crc kubenswrapper[4823]: I1216 07:01:27.200195 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-shg64" event={"ID":"726c1e86-0af2-45b0-bc89-af72df38eff8","Type":"ContainerDied","Data":"72e74b472ba819377be581c2983f0ab4ebb75afeec2d7810f57b32c397c3e2d4"}
Dec 16 07:01:27 crc kubenswrapper[4823]: I1216 07:01:27.200282 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-shg64"
Dec 16 07:01:27 crc kubenswrapper[4823]: I1216 07:01:27.215625 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hhp8l"
Dec 16 07:01:27 crc kubenswrapper[4823]: I1216 07:01:27.215624 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hhp8l" event={"ID":"ca6b042f-7b3a-4204-90a8-d6a2c29fd271","Type":"ContainerDied","Data":"6470b7a85105c21d08d83f55a8f594c5024f0886ea1ebda8fa08330d9a52a595"}
Dec 16 07:01:27 crc kubenswrapper[4823]: I1216 07:01:27.217376 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-j5hk4" podStartSLOduration=2.216914343 podStartE2EDuration="2.216914343s" podCreationTimestamp="2025-12-16 07:01:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 07:01:27.215756899 +0000 UTC m=+365.704323072" watchObservedRunningTime="2025-12-16 07:01:27.216914343 +0000 UTC m=+365.705480476"
Dec 16 07:01:27 crc kubenswrapper[4823]: I1216 07:01:27.223867 4823 scope.go:117] "RemoveContainer" containerID="b819c7fa993cb265b0407c30f4dc9ee842607e33640cd31326a01c69a6282451"
Dec 16 07:01:27 crc kubenswrapper[4823]: I1216 07:01:27.228120 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5x67m" event={"ID":"7220fce6-80f1-4da7-9a90-f106616709ae","Type":"ContainerDied","Data":"c8f683aceaef60e0c777286f257a90f2218cd1d4782db8f98de8d422b6e377cf"}
Dec 16 07:01:27 crc kubenswrapper[4823]: I1216 07:01:27.228963 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5x67m"
Dec 16 07:01:27 crc kubenswrapper[4823]: I1216 07:01:27.240528 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2jtdw" event={"ID":"dfaef15c-ea70-4287-bf78-7e99f0fd0626","Type":"ContainerDied","Data":"d568a9f1d174fc7d1f8cb49d0fe6abaafef83aa45f554f004b62d43bb5155cbd"}
Dec 16 07:01:27 crc kubenswrapper[4823]: I1216 07:01:27.240632 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2jtdw"
Dec 16 07:01:27 crc kubenswrapper[4823]: I1216 07:01:27.278441 4823 scope.go:117] "RemoveContainer" containerID="4cfa95a0be96439067a0ce9b8781685fef6113e36ead9b993293993cdcdbf8d5"
Dec 16 07:01:27 crc kubenswrapper[4823]: I1216 07:01:27.301255 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-thj57"]
Dec 16 07:01:27 crc kubenswrapper[4823]: I1216 07:01:27.303889 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-thj57"]
Dec 16 07:01:27 crc kubenswrapper[4823]: I1216 07:01:27.336203 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hhp8l"]
Dec 16 07:01:27 crc kubenswrapper[4823]: I1216 07:01:27.337709 4823 scope.go:117] "RemoveContainer" containerID="3ec50673f7e7ef8e7eae49643e2ec546d3ef9485b6c7c61987a52e803118a429"
Dec 16 07:01:27 crc kubenswrapper[4823]: I1216 07:01:27.338585 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-hhp8l"]
Dec 16 07:01:27 crc kubenswrapper[4823]: I1216 07:01:27.362861 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-shg64"]
Dec 16 07:01:27 crc kubenswrapper[4823]: I1216 07:01:27.365832 4823 scope.go:117] "RemoveContainer" containerID="c6318e6e99455f620a46c95ed584b312dbf4dc6f97a4c7b09556b4bdec832edc"
Dec 16 07:01:27 crc kubenswrapper[4823]: I1216 07:01:27.368429 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-shg64"]
Dec 16 07:01:27 crc kubenswrapper[4823]: I1216 07:01:27.375553 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2jtdw"]
Dec 16 07:01:27 crc kubenswrapper[4823]: I1216 07:01:27.382241 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-2jtdw"]
Dec 16 07:01:27 crc kubenswrapper[4823]: I1216 07:01:27.391460 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5x67m"]
Dec 16 07:01:27 crc kubenswrapper[4823]: I1216 07:01:27.392819 4823 scope.go:117] "RemoveContainer" containerID="ceedf2c81a1224da174afd8fc392df551e5d91fc2406e342e80a98e7eea3fd1b"
Dec 16 07:01:27 crc kubenswrapper[4823]: I1216 07:01:27.397582 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-5x67m"]
Dec 16 07:01:27 crc kubenswrapper[4823]: I1216 07:01:27.413092 4823 scope.go:117] "RemoveContainer" containerID="71d023451ad9f232f51c8bea4b92138bd63ce3588c3f86c74475eb36b93a867d"
Dec 16 07:01:27 crc kubenswrapper[4823]: I1216 07:01:27.441612 4823 scope.go:117] "RemoveContainer" containerID="a34010f49d4634fa3e7a9d8c1e3ddb232092355af14f8a3d3f92fb6bb4779033"
Dec 16 07:01:27 crc kubenswrapper[4823]: I1216 07:01:27.457578 4823 scope.go:117] "RemoveContainer" containerID="1130ff670ba478cd411804e58471dbf3533596ffc3a765ae7dfe735ca9c7ee15"
Dec 16 07:01:27 crc kubenswrapper[4823]: I1216 07:01:27.471287 4823 scope.go:117] "RemoveContainer" containerID="979af8dd52b3e813e13be8d2fb702c3658663c32c080557a3e993767ce88614d"
Dec 16 07:01:27 crc kubenswrapper[4823]: I1216 07:01:27.487757 4823 scope.go:117] "RemoveContainer" containerID="d244bf80e24864179e3b869c5e18a7ed9bb9718ac4a051319b4f010b759cf384"
Dec 16 07:01:27 crc kubenswrapper[4823]: I1216 07:01:27.514459 4823 scope.go:117] "RemoveContainer" containerID="360f902d076aa1b5fe2bef2e3bb3cadf362e0101fc89fdb607cb555c8bd359a3"
Dec 16 07:01:27 crc kubenswrapper[4823]: I1216 07:01:27.532461 4823 scope.go:117] "RemoveContainer" containerID="71620843587c34b5bf2f91b1d6da411bf4275fedd8c4728f8e223f33d9f89d7b"
Dec 16 07:01:27 crc kubenswrapper[4823]: I1216 07:01:27.779240 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7220fce6-80f1-4da7-9a90-f106616709ae" path="/var/lib/kubelet/pods/7220fce6-80f1-4da7-9a90-f106616709ae/volumes"
Dec 16 07:01:27 crc kubenswrapper[4823]: I1216 07:01:27.779988 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="726c1e86-0af2-45b0-bc89-af72df38eff8" path="/var/lib/kubelet/pods/726c1e86-0af2-45b0-bc89-af72df38eff8/volumes"
Dec 16 07:01:27 crc kubenswrapper[4823]: I1216 07:01:27.780693 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca6b042f-7b3a-4204-90a8-d6a2c29fd271" path="/var/lib/kubelet/pods/ca6b042f-7b3a-4204-90a8-d6a2c29fd271/volumes"
Dec 16 07:01:27 crc kubenswrapper[4823]: I1216 07:01:27.782104 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dca532ee-e66a-411a-afcc-646f96a22a62" path="/var/lib/kubelet/pods/dca532ee-e66a-411a-afcc-646f96a22a62/volumes"
Dec 16 07:01:27 crc kubenswrapper[4823]: I1216 07:01:27.782655 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dfaef15c-ea70-4287-bf78-7e99f0fd0626" path="/var/lib/kubelet/pods/dfaef15c-ea70-4287-bf78-7e99f0fd0626/volumes"
Dec 16 07:01:27 crc kubenswrapper[4823]: I1216 07:01:27.814743 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-c8h86"]
Dec 16 07:01:27 crc kubenswrapper[4823]: E1216 07:01:27.814993 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfaef15c-ea70-4287-bf78-7e99f0fd0626" containerName="extract-utilities"
Dec 16 07:01:27 crc kubenswrapper[4823]: I1216 07:01:27.815007 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfaef15c-ea70-4287-bf78-7e99f0fd0626" containerName="extract-utilities"
Dec 16 07:01:27 crc kubenswrapper[4823]: E1216 07:01:27.815015 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="726c1e86-0af2-45b0-bc89-af72df38eff8" containerName="extract-utilities"
Dec 16 07:01:27 crc kubenswrapper[4823]: I1216 07:01:27.815025 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="726c1e86-0af2-45b0-bc89-af72df38eff8" containerName="extract-utilities"
Dec 16 07:01:27 crc kubenswrapper[4823]: E1216 07:01:27.815051 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7220fce6-80f1-4da7-9a90-f106616709ae" containerName="extract-content"
Dec 16 07:01:27 crc kubenswrapper[4823]: I1216 07:01:27.815060 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="7220fce6-80f1-4da7-9a90-f106616709ae" containerName="extract-content"
Dec 16 07:01:27 crc kubenswrapper[4823]: E1216 07:01:27.815073 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="726c1e86-0af2-45b0-bc89-af72df38eff8" containerName="registry-server"
Dec 16 07:01:27 crc kubenswrapper[4823]: I1216 07:01:27.815079 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="726c1e86-0af2-45b0-bc89-af72df38eff8" containerName="registry-server"
Dec 16 07:01:27 crc kubenswrapper[4823]: E1216 07:01:27.815089 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7220fce6-80f1-4da7-9a90-f106616709ae" containerName="extract-utilities"
Dec 16 07:01:27 crc kubenswrapper[4823]: I1216 07:01:27.815095 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="7220fce6-80f1-4da7-9a90-f106616709ae" containerName="extract-utilities"
Dec 16 07:01:27 crc kubenswrapper[4823]: E1216 07:01:27.815106 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfaef15c-ea70-4287-bf78-7e99f0fd0626" containerName="registry-server"
Dec 16 07:01:27 crc kubenswrapper[4823]: I1216 07:01:27.815111 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfaef15c-ea70-4287-bf78-7e99f0fd0626" containerName="registry-server"
Dec 16 07:01:27 crc kubenswrapper[4823]: E1216 07:01:27.815120 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dca532ee-e66a-411a-afcc-646f96a22a62" containerName="marketplace-operator"
Dec 16 07:01:27 crc kubenswrapper[4823]: I1216 07:01:27.815125 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="dca532ee-e66a-411a-afcc-646f96a22a62" containerName="marketplace-operator"
Dec 16 07:01:27 crc kubenswrapper[4823]: E1216 07:01:27.815137 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfaef15c-ea70-4287-bf78-7e99f0fd0626" containerName="extract-content"
Dec 16 07:01:27 crc kubenswrapper[4823]: I1216 07:01:27.815144 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfaef15c-ea70-4287-bf78-7e99f0fd0626" containerName="extract-content"
Dec 16 07:01:27 crc kubenswrapper[4823]: E1216 07:01:27.815151 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7220fce6-80f1-4da7-9a90-f106616709ae" containerName="registry-server"
Dec 16 07:01:27 crc kubenswrapper[4823]: I1216 07:01:27.815157 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="7220fce6-80f1-4da7-9a90-f106616709ae" containerName="registry-server"
Dec 16 07:01:27 crc kubenswrapper[4823]: E1216 07:01:27.815165 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca6b042f-7b3a-4204-90a8-d6a2c29fd271" containerName="extract-utilities"
Dec 16 07:01:27 crc kubenswrapper[4823]: I1216 07:01:27.815171 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca6b042f-7b3a-4204-90a8-d6a2c29fd271" containerName="extract-utilities"
Dec 16 07:01:27 crc kubenswrapper[4823]: E1216 07:01:27.815177 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca6b042f-7b3a-4204-90a8-d6a2c29fd271" containerName="registry-server"
Dec 16 07:01:27 crc kubenswrapper[4823]: I1216 07:01:27.815183 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca6b042f-7b3a-4204-90a8-d6a2c29fd271" containerName="registry-server"
Dec 16 07:01:27 crc kubenswrapper[4823]: E1216 07:01:27.815193 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="726c1e86-0af2-45b0-bc89-af72df38eff8" containerName="extract-content"
Dec 16 07:01:27 crc kubenswrapper[4823]: I1216 07:01:27.815200 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="726c1e86-0af2-45b0-bc89-af72df38eff8" containerName="extract-content"
Dec 16 07:01:27 crc kubenswrapper[4823]: E1216 07:01:27.815209 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dca532ee-e66a-411a-afcc-646f96a22a62" containerName="marketplace-operator"
Dec 16 07:01:27 crc kubenswrapper[4823]: I1216 07:01:27.815215 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="dca532ee-e66a-411a-afcc-646f96a22a62" containerName="marketplace-operator"
Dec 16 07:01:27 crc kubenswrapper[4823]: E1216 07:01:27.815224 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca6b042f-7b3a-4204-90a8-d6a2c29fd271" containerName="extract-content"
Dec 16 07:01:27 crc kubenswrapper[4823]: I1216 07:01:27.815232 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca6b042f-7b3a-4204-90a8-d6a2c29fd271" containerName="extract-content"
Dec 16 07:01:27 crc kubenswrapper[4823]: I1216 07:01:27.815341 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfaef15c-ea70-4287-bf78-7e99f0fd0626" containerName="registry-server"
Dec 16 07:01:27 crc kubenswrapper[4823]: I1216 07:01:27.815356 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="dca532ee-e66a-411a-afcc-646f96a22a62" containerName="marketplace-operator"
Dec 16 07:01:27 crc kubenswrapper[4823]: I1216 07:01:27.815365 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="726c1e86-0af2-45b0-bc89-af72df38eff8" containerName="registry-server"
Dec 16 07:01:27 crc kubenswrapper[4823]: I1216 07:01:27.815374 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="7220fce6-80f1-4da7-9a90-f106616709ae" containerName="registry-server"
Dec 16 07:01:27 crc kubenswrapper[4823]: I1216 07:01:27.815384 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="dca532ee-e66a-411a-afcc-646f96a22a62" containerName="marketplace-operator"
Dec 16 07:01:27 crc kubenswrapper[4823]: I1216 07:01:27.815397 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca6b042f-7b3a-4204-90a8-d6a2c29fd271" containerName="registry-server"
Dec 16 07:01:27 crc kubenswrapper[4823]: I1216 07:01:27.816369 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c8h86"
Dec 16 07:01:27 crc kubenswrapper[4823]: I1216 07:01:27.821451 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Dec 16 07:01:27 crc kubenswrapper[4823]: I1216 07:01:27.830222 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-c8h86"]
Dec 16 07:01:27 crc kubenswrapper[4823]: I1216 07:01:27.916378 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnjg8\" (UniqueName: \"kubernetes.io/projected/d1a83968-f624-48a1-a47a-3ad405b3b53c-kube-api-access-pnjg8\") pod \"redhat-marketplace-c8h86\" (UID: \"d1a83968-f624-48a1-a47a-3ad405b3b53c\") " pod="openshift-marketplace/redhat-marketplace-c8h86"
Dec 16 07:01:27 crc kubenswrapper[4823]: I1216 07:01:27.916487 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d1a83968-f624-48a1-a47a-3ad405b3b53c-catalog-content\") pod \"redhat-marketplace-c8h86\" (UID: \"d1a83968-f624-48a1-a47a-3ad405b3b53c\") " pod="openshift-marketplace/redhat-marketplace-c8h86"
Dec 16 07:01:27 crc kubenswrapper[4823]: I1216 07:01:27.916529 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d1a83968-f624-48a1-a47a-3ad405b3b53c-utilities\") pod \"redhat-marketplace-c8h86\" (UID: \"d1a83968-f624-48a1-a47a-3ad405b3b53c\") " pod="openshift-marketplace/redhat-marketplace-c8h86"
Dec 16 07:01:28 crc kubenswrapper[4823]: I1216 07:01:28.015321 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-wnh5n"]
Dec 16 07:01:28 crc kubenswrapper[4823]: I1216 07:01:28.017015 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wnh5n"
Dec 16 07:01:28 crc kubenswrapper[4823]: I1216 07:01:28.018499 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d1a83968-f624-48a1-a47a-3ad405b3b53c-catalog-content\") pod \"redhat-marketplace-c8h86\" (UID: \"d1a83968-f624-48a1-a47a-3ad405b3b53c\") " pod="openshift-marketplace/redhat-marketplace-c8h86"
Dec 16 07:01:28 crc kubenswrapper[4823]: I1216 07:01:28.018560 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d1a83968-f624-48a1-a47a-3ad405b3b53c-utilities\") pod \"redhat-marketplace-c8h86\" (UID: \"d1a83968-f624-48a1-a47a-3ad405b3b53c\") " pod="openshift-marketplace/redhat-marketplace-c8h86"
Dec 16 07:01:28 crc kubenswrapper[4823]: I1216 07:01:28.018604 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pnjg8\" (UniqueName: \"kubernetes.io/projected/d1a83968-f624-48a1-a47a-3ad405b3b53c-kube-api-access-pnjg8\") pod \"redhat-marketplace-c8h86\" (UID: \"d1a83968-f624-48a1-a47a-3ad405b3b53c\") " pod="openshift-marketplace/redhat-marketplace-c8h86"
Dec 16 07:01:28 crc kubenswrapper[4823]: I1216 07:01:28.019017 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d1a83968-f624-48a1-a47a-3ad405b3b53c-catalog-content\") pod \"redhat-marketplace-c8h86\" (UID: \"d1a83968-f624-48a1-a47a-3ad405b3b53c\") " pod="openshift-marketplace/redhat-marketplace-c8h86"
Dec 16 07:01:28 crc kubenswrapper[4823]: I1216 07:01:28.019308 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Dec 16 07:01:28 crc kubenswrapper[4823]: I1216 07:01:28.019749 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d1a83968-f624-48a1-a47a-3ad405b3b53c-utilities\") pod \"redhat-marketplace-c8h86\" (UID: \"d1a83968-f624-48a1-a47a-3ad405b3b53c\") " pod="openshift-marketplace/redhat-marketplace-c8h86"
Dec 16 07:01:28 crc kubenswrapper[4823]: I1216 07:01:28.033000 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wnh5n"]
Dec 16 07:01:28 crc kubenswrapper[4823]: I1216 07:01:28.046127 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnjg8\" (UniqueName: \"kubernetes.io/projected/d1a83968-f624-48a1-a47a-3ad405b3b53c-kube-api-access-pnjg8\") pod \"redhat-marketplace-c8h86\" (UID: \"d1a83968-f624-48a1-a47a-3ad405b3b53c\") " pod="openshift-marketplace/redhat-marketplace-c8h86"
Dec 16 07:01:28 crc kubenswrapper[4823]: I1216 07:01:28.119902 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7615bc49-d6d1-4933-8ef3-f7d871a8d4b8-utilities\") pod \"redhat-operators-wnh5n\" (UID: \"7615bc49-d6d1-4933-8ef3-f7d871a8d4b8\") " pod="openshift-marketplace/redhat-operators-wnh5n"
Dec 16 07:01:28 crc kubenswrapper[4823]: I1216 07:01:28.119968 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7615bc49-d6d1-4933-8ef3-f7d871a8d4b8-catalog-content\") pod \"redhat-operators-wnh5n\" (UID: \"7615bc49-d6d1-4933-8ef3-f7d871a8d4b8\") " pod="openshift-marketplace/redhat-operators-wnh5n"
Dec 16 07:01:28 crc kubenswrapper[4823]: I1216 07:01:28.120288 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swgrm\" (UniqueName: \"kubernetes.io/projected/7615bc49-d6d1-4933-8ef3-f7d871a8d4b8-kube-api-access-swgrm\") pod \"redhat-operators-wnh5n\" (UID: \"7615bc49-d6d1-4933-8ef3-f7d871a8d4b8\") " pod="openshift-marketplace/redhat-operators-wnh5n"
Dec 16 07:01:28 crc kubenswrapper[4823]: I1216 07:01:28.134288 4823 patch_prober.go:28] interesting pod/machine-config-daemon-fv56f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 16 07:01:28 crc kubenswrapper[4823]: I1216 07:01:28.134358 4823 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 16 07:01:28 crc kubenswrapper[4823]: I1216 07:01:28.149263 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c8h86"
Dec 16 07:01:28 crc kubenswrapper[4823]: I1216 07:01:28.221572 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swgrm\" (UniqueName: \"kubernetes.io/projected/7615bc49-d6d1-4933-8ef3-f7d871a8d4b8-kube-api-access-swgrm\") pod \"redhat-operators-wnh5n\" (UID: \"7615bc49-d6d1-4933-8ef3-f7d871a8d4b8\") " pod="openshift-marketplace/redhat-operators-wnh5n"
Dec 16 07:01:28 crc kubenswrapper[4823]: I1216 07:01:28.222216 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7615bc49-d6d1-4933-8ef3-f7d871a8d4b8-utilities\") pod \"redhat-operators-wnh5n\" (UID: \"7615bc49-d6d1-4933-8ef3-f7d871a8d4b8\") " pod="openshift-marketplace/redhat-operators-wnh5n"
Dec 16 07:01:28 crc kubenswrapper[4823]: I1216 07:01:28.222273 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7615bc49-d6d1-4933-8ef3-f7d871a8d4b8-catalog-content\") pod \"redhat-operators-wnh5n\" (UID: \"7615bc49-d6d1-4933-8ef3-f7d871a8d4b8\") " pod="openshift-marketplace/redhat-operators-wnh5n"
Dec 16 07:01:28 crc kubenswrapper[4823]: I1216 07:01:28.223017 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7615bc49-d6d1-4933-8ef3-f7d871a8d4b8-utilities\") pod \"redhat-operators-wnh5n\" (UID: \"7615bc49-d6d1-4933-8ef3-f7d871a8d4b8\") " pod="openshift-marketplace/redhat-operators-wnh5n"
Dec 16 07:01:28 crc kubenswrapper[4823]: I1216 07:01:28.223149 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7615bc49-d6d1-4933-8ef3-f7d871a8d4b8-catalog-content\") pod \"redhat-operators-wnh5n\" (UID: \"7615bc49-d6d1-4933-8ef3-f7d871a8d4b8\") " pod="openshift-marketplace/redhat-operators-wnh5n"
Dec 16 07:01:28 crc kubenswrapper[4823]: I1216 07:01:28.246247 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swgrm\" (UniqueName: \"kubernetes.io/projected/7615bc49-d6d1-4933-8ef3-f7d871a8d4b8-kube-api-access-swgrm\") pod \"redhat-operators-wnh5n\" (UID: \"7615bc49-d6d1-4933-8ef3-f7d871a8d4b8\") " pod="openshift-marketplace/redhat-operators-wnh5n"
Dec 16 07:01:28 crc kubenswrapper[4823]: I1216 07:01:28.343661 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wnh5n"
Dec 16 07:01:28 crc kubenswrapper[4823]: I1216 07:01:28.561315 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-c8h86"]
Dec 16 07:01:28 crc kubenswrapper[4823]: I1216 07:01:28.730521 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wnh5n"]
Dec 16 07:01:28 crc kubenswrapper[4823]: W1216 07:01:28.775930 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7615bc49_d6d1_4933_8ef3_f7d871a8d4b8.slice/crio-b31798cd8f9678604c4edcd532be5391d4603876033da52bcc5ac94ab72b550c WatchSource:0}: Error finding container b31798cd8f9678604c4edcd532be5391d4603876033da52bcc5ac94ab72b550c: Status 404 returned error can't find the container with id b31798cd8f9678604c4edcd532be5391d4603876033da52bcc5ac94ab72b550c
Dec 16 07:01:29 crc kubenswrapper[4823]: I1216 07:01:29.259632 4823 generic.go:334] "Generic (PLEG): container finished" podID="7615bc49-d6d1-4933-8ef3-f7d871a8d4b8" containerID="32f415f2e535a507304e783254e75fbdc7f77f560339008ae57b29825c693488" exitCode=0
Dec 16 07:01:29 crc kubenswrapper[4823]: I1216 07:01:29.260514 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wnh5n" event={"ID":"7615bc49-d6d1-4933-8ef3-f7d871a8d4b8","Type":"ContainerDied","Data":"32f415f2e535a507304e783254e75fbdc7f77f560339008ae57b29825c693488"}
Dec 16 07:01:29 crc kubenswrapper[4823]: I1216 07:01:29.260545 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wnh5n" event={"ID":"7615bc49-d6d1-4933-8ef3-f7d871a8d4b8","Type":"ContainerStarted","Data":"b31798cd8f9678604c4edcd532be5391d4603876033da52bcc5ac94ab72b550c"}
Dec 16 07:01:29 crc kubenswrapper[4823]: I1216 07:01:29.262725 4823 generic.go:334] "Generic (PLEG): container finished" podID="d1a83968-f624-48a1-a47a-3ad405b3b53c" containerID="42f06f0b22f43a147afb6e257ac9ddbbfbed2d9a3e5994d796860f32994dd3b1" exitCode=0
Dec 16 07:01:29 crc kubenswrapper[4823]: I1216 07:01:29.262758 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c8h86" event={"ID":"d1a83968-f624-48a1-a47a-3ad405b3b53c","Type":"ContainerDied","Data":"42f06f0b22f43a147afb6e257ac9ddbbfbed2d9a3e5994d796860f32994dd3b1"}
Dec 16 07:01:29 crc kubenswrapper[4823]: I1216 07:01:29.262785 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c8h86" event={"ID":"d1a83968-f624-48a1-a47a-3ad405b3b53c","Type":"ContainerStarted","Data":"4181be5711ee8bf9df27273ef8afaf1fba5243b02e741f4793bc78ac0ac3011c"}
Dec 16 07:01:30 crc kubenswrapper[4823]: I1216 07:01:30.213146 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-4f255"]
Dec 16 07:01:30 crc kubenswrapper[4823]: I1216 07:01:30.214728 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4f255"
Dec 16 07:01:30 crc kubenswrapper[4823]: I1216 07:01:30.220697 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Dec 16 07:01:30 crc kubenswrapper[4823]: I1216 07:01:30.226264 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4f255"]
Dec 16 07:01:30 crc kubenswrapper[4823]: I1216 07:01:30.271961 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wnh5n" event={"ID":"7615bc49-d6d1-4933-8ef3-f7d871a8d4b8","Type":"ContainerStarted","Data":"27e7f2b7d104711988aa1d0e6776b104285acada681ffe7f4ea5f978bc7b7bdb"}
Dec 16 07:01:30 crc kubenswrapper[4823]: I1216 07:01:30.352701 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af52958a-a702-46cf-b108-0d6f3227d7f5-catalog-content\") pod \"community-operators-4f255\" (UID: \"af52958a-a702-46cf-b108-0d6f3227d7f5\") " pod="openshift-marketplace/community-operators-4f255"
Dec 16 07:01:30 crc kubenswrapper[4823]: I1216 07:01:30.352786 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af52958a-a702-46cf-b108-0d6f3227d7f5-utilities\") pod \"community-operators-4f255\" (UID: \"af52958a-a702-46cf-b108-0d6f3227d7f5\") " pod="openshift-marketplace/community-operators-4f255"
Dec 16 07:01:30 crc kubenswrapper[4823]: I1216 07:01:30.352846 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27t9g\" (UniqueName: \"kubernetes.io/projected/af52958a-a702-46cf-b108-0d6f3227d7f5-kube-api-access-27t9g\") pod \"community-operators-4f255\" (UID: \"af52958a-a702-46cf-b108-0d6f3227d7f5\") " pod="openshift-marketplace/community-operators-4f255"
Dec 16 07:01:30 crc kubenswrapper[4823]: I1216 07:01:30.415783 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-46558"]
Dec 16 07:01:30 crc kubenswrapper[4823]: I1216 07:01:30.417185 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-46558"
Dec 16 07:01:30 crc kubenswrapper[4823]: I1216 07:01:30.419425 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Dec 16 07:01:30 crc kubenswrapper[4823]: I1216 07:01:30.424443 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-46558"]
Dec 16 07:01:30 crc kubenswrapper[4823]: I1216 07:01:30.453915 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27t9g\" (UniqueName: \"kubernetes.io/projected/af52958a-a702-46cf-b108-0d6f3227d7f5-kube-api-access-27t9g\") pod \"community-operators-4f255\" (UID: \"af52958a-a702-46cf-b108-0d6f3227d7f5\") " pod="openshift-marketplace/community-operators-4f255"
Dec 16 07:01:30 crc kubenswrapper[4823]: I1216 07:01:30.454007 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af52958a-a702-46cf-b108-0d6f3227d7f5-catalog-content\") pod \"community-operators-4f255\" (UID: \"af52958a-a702-46cf-b108-0d6f3227d7f5\") " pod="openshift-marketplace/community-operators-4f255"
Dec 16 07:01:30 crc kubenswrapper[4823]: I1216 07:01:30.454065 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af52958a-a702-46cf-b108-0d6f3227d7f5-utilities\") pod \"community-operators-4f255\" (UID: \"af52958a-a702-46cf-b108-0d6f3227d7f5\") " pod="openshift-marketplace/community-operators-4f255"
Dec 16 07:01:30 crc kubenswrapper[4823]: I1216 07:01:30.454665 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af52958a-a702-46cf-b108-0d6f3227d7f5-utilities\") pod \"community-operators-4f255\" (UID: \"af52958a-a702-46cf-b108-0d6f3227d7f5\") " pod="openshift-marketplace/community-operators-4f255"
Dec 16 07:01:30 crc kubenswrapper[4823]: I1216 07:01:30.455246 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af52958a-a702-46cf-b108-0d6f3227d7f5-catalog-content\") pod \"community-operators-4f255\" (UID: \"af52958a-a702-46cf-b108-0d6f3227d7f5\") " pod="openshift-marketplace/community-operators-4f255"
Dec 16 07:01:30 crc kubenswrapper[4823]: I1216 07:01:30.497491 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27t9g\" (UniqueName: \"kubernetes.io/projected/af52958a-a702-46cf-b108-0d6f3227d7f5-kube-api-access-27t9g\") pod \"community-operators-4f255\" (UID: \"af52958a-a702-46cf-b108-0d6f3227d7f5\") " pod="openshift-marketplace/community-operators-4f255"
Dec 16 07:01:30 crc kubenswrapper[4823]: I1216 07:01:30.540859 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4f255"
Dec 16 07:01:30 crc kubenswrapper[4823]: I1216 07:01:30.556048 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2582ab05-12e8-48c6-ac08-2673b110e34f-catalog-content\") pod \"certified-operators-46558\" (UID: \"2582ab05-12e8-48c6-ac08-2673b110e34f\") " pod="openshift-marketplace/certified-operators-46558"
Dec 16 07:01:30 crc kubenswrapper[4823]: I1216 07:01:30.556363 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2582ab05-12e8-48c6-ac08-2673b110e34f-utilities\") pod \"certified-operators-46558\" (UID: \"2582ab05-12e8-48c6-ac08-2673b110e34f\") " pod="openshift-marketplace/certified-operators-46558"
Dec 16 07:01:30 crc kubenswrapper[4823]: I1216 07:01:30.556418 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2g9s\" (UniqueName: \"kubernetes.io/projected/2582ab05-12e8-48c6-ac08-2673b110e34f-kube-api-access-j2g9s\") pod \"certified-operators-46558\" (UID: \"2582ab05-12e8-48c6-ac08-2673b110e34f\") " pod="openshift-marketplace/certified-operators-46558"
Dec 16 07:01:30 crc kubenswrapper[4823]: I1216 07:01:30.658254 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2582ab05-12e8-48c6-ac08-2673b110e34f-utilities\") pod \"certified-operators-46558\" (UID: \"2582ab05-12e8-48c6-ac08-2673b110e34f\") " pod="openshift-marketplace/certified-operators-46558"
Dec 16 07:01:30 crc kubenswrapper[4823]: I1216 07:01:30.658355 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2g9s\" (UniqueName: \"kubernetes.io/projected/2582ab05-12e8-48c6-ac08-2673b110e34f-kube-api-access-j2g9s\") pod \"certified-operators-46558\" (UID: \"2582ab05-12e8-48c6-ac08-2673b110e34f\") " pod="openshift-marketplace/certified-operators-46558"
Dec 16 07:01:30 crc kubenswrapper[4823]: I1216 07:01:30.658392 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2582ab05-12e8-48c6-ac08-2673b110e34f-catalog-content\") pod \"certified-operators-46558\" (UID: \"2582ab05-12e8-48c6-ac08-2673b110e34f\") " pod="openshift-marketplace/certified-operators-46558"
Dec 16 07:01:30 crc kubenswrapper[4823]: I1216 07:01:30.659128 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2582ab05-12e8-48c6-ac08-2673b110e34f-catalog-content\") pod \"certified-operators-46558\" (UID: \"2582ab05-12e8-48c6-ac08-2673b110e34f\") " pod="openshift-marketplace/certified-operators-46558"
Dec 16 07:01:30 crc kubenswrapper[4823]: I1216 07:01:30.659259 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2582ab05-12e8-48c6-ac08-2673b110e34f-utilities\") pod \"certified-operators-46558\" (UID: \"2582ab05-12e8-48c6-ac08-2673b110e34f\") " pod="openshift-marketplace/certified-operators-46558"
Dec 16 07:01:30 crc kubenswrapper[4823]: I1216 07:01:30.681738 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2g9s\" (UniqueName: \"kubernetes.io/projected/2582ab05-12e8-48c6-ac08-2673b110e34f-kube-api-access-j2g9s\") pod \"certified-operators-46558\" (UID: \"2582ab05-12e8-48c6-ac08-2673b110e34f\") " pod="openshift-marketplace/certified-operators-46558"
Dec 16 07:01:30 crc kubenswrapper[4823]: I1216 07:01:30.743626 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-46558" Dec 16 07:01:30 crc kubenswrapper[4823]: I1216 07:01:30.953965 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4f255"] Dec 16 07:01:30 crc kubenswrapper[4823]: W1216 07:01:30.987289 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf52958a_a702_46cf_b108_0d6f3227d7f5.slice/crio-ed18e7986e45a46db8ff1b975fa4253c553e091212ecbbff70730221a8927f9b WatchSource:0}: Error finding container ed18e7986e45a46db8ff1b975fa4253c553e091212ecbbff70730221a8927f9b: Status 404 returned error can't find the container with id ed18e7986e45a46db8ff1b975fa4253c553e091212ecbbff70730221a8927f9b Dec 16 07:01:31 crc kubenswrapper[4823]: I1216 07:01:31.168839 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-46558"] Dec 16 07:01:31 crc kubenswrapper[4823]: W1216 07:01:31.194523 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2582ab05_12e8_48c6_ac08_2673b110e34f.slice/crio-d2907a7666442738fb1e35566f886c679af2b1884ae18ee6397ee1a74dff254d WatchSource:0}: Error finding container d2907a7666442738fb1e35566f886c679af2b1884ae18ee6397ee1a74dff254d: Status 404 returned error can't find the container with id d2907a7666442738fb1e35566f886c679af2b1884ae18ee6397ee1a74dff254d Dec 16 07:01:31 crc kubenswrapper[4823]: I1216 07:01:31.280750 4823 generic.go:334] "Generic (PLEG): container finished" podID="d1a83968-f624-48a1-a47a-3ad405b3b53c" containerID="78c86b5e53ecc46d0de498cf410d16fe85b2ea2dcb76ad2f20980c0f066b0978" exitCode=0 Dec 16 07:01:31 crc kubenswrapper[4823]: I1216 07:01:31.280869 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c8h86" 
event={"ID":"d1a83968-f624-48a1-a47a-3ad405b3b53c","Type":"ContainerDied","Data":"78c86b5e53ecc46d0de498cf410d16fe85b2ea2dcb76ad2f20980c0f066b0978"} Dec 16 07:01:31 crc kubenswrapper[4823]: I1216 07:01:31.286285 4823 generic.go:334] "Generic (PLEG): container finished" podID="af52958a-a702-46cf-b108-0d6f3227d7f5" containerID="cc92b6c7602edd8095f5057c411e2bf5a4368647e2660dc2b4ae8c911a811d65" exitCode=0 Dec 16 07:01:31 crc kubenswrapper[4823]: I1216 07:01:31.286526 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4f255" event={"ID":"af52958a-a702-46cf-b108-0d6f3227d7f5","Type":"ContainerDied","Data":"cc92b6c7602edd8095f5057c411e2bf5a4368647e2660dc2b4ae8c911a811d65"} Dec 16 07:01:31 crc kubenswrapper[4823]: I1216 07:01:31.286754 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4f255" event={"ID":"af52958a-a702-46cf-b108-0d6f3227d7f5","Type":"ContainerStarted","Data":"ed18e7986e45a46db8ff1b975fa4253c553e091212ecbbff70730221a8927f9b"} Dec 16 07:01:31 crc kubenswrapper[4823]: I1216 07:01:31.289521 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-46558" event={"ID":"2582ab05-12e8-48c6-ac08-2673b110e34f","Type":"ContainerStarted","Data":"d2907a7666442738fb1e35566f886c679af2b1884ae18ee6397ee1a74dff254d"} Dec 16 07:01:31 crc kubenswrapper[4823]: I1216 07:01:31.292158 4823 generic.go:334] "Generic (PLEG): container finished" podID="7615bc49-d6d1-4933-8ef3-f7d871a8d4b8" containerID="27e7f2b7d104711988aa1d0e6776b104285acada681ffe7f4ea5f978bc7b7bdb" exitCode=0 Dec 16 07:01:31 crc kubenswrapper[4823]: I1216 07:01:31.292196 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wnh5n" event={"ID":"7615bc49-d6d1-4933-8ef3-f7d871a8d4b8","Type":"ContainerDied","Data":"27e7f2b7d104711988aa1d0e6776b104285acada681ffe7f4ea5f978bc7b7bdb"} Dec 16 07:01:32 crc kubenswrapper[4823]: I1216 
07:01:32.302828 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wnh5n" event={"ID":"7615bc49-d6d1-4933-8ef3-f7d871a8d4b8","Type":"ContainerStarted","Data":"afeafbbdeac3a0e0ef257c3ee163d991b8485baa999b04f963b22fe10b562171"} Dec 16 07:01:32 crc kubenswrapper[4823]: I1216 07:01:32.305162 4823 generic.go:334] "Generic (PLEG): container finished" podID="2582ab05-12e8-48c6-ac08-2673b110e34f" containerID="edd736760019adaae61ac6a7093c875f8ffe8a251c9828e5feb4f522b717ad49" exitCode=0 Dec 16 07:01:32 crc kubenswrapper[4823]: I1216 07:01:32.305243 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-46558" event={"ID":"2582ab05-12e8-48c6-ac08-2673b110e34f","Type":"ContainerDied","Data":"edd736760019adaae61ac6a7093c875f8ffe8a251c9828e5feb4f522b717ad49"} Dec 16 07:01:32 crc kubenswrapper[4823]: I1216 07:01:32.321756 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-wnh5n" podStartSLOduration=2.843018786 podStartE2EDuration="5.321724179s" podCreationTimestamp="2025-12-16 07:01:27 +0000 UTC" firstStartedPulling="2025-12-16 07:01:29.261676451 +0000 UTC m=+367.750242564" lastFinishedPulling="2025-12-16 07:01:31.740381834 +0000 UTC m=+370.228947957" observedRunningTime="2025-12-16 07:01:32.319666386 +0000 UTC m=+370.808232509" watchObservedRunningTime="2025-12-16 07:01:32.321724179 +0000 UTC m=+370.810290302" Dec 16 07:01:33 crc kubenswrapper[4823]: I1216 07:01:33.317071 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c8h86" event={"ID":"d1a83968-f624-48a1-a47a-3ad405b3b53c","Type":"ContainerStarted","Data":"b422b566248d4bad4ace7a097243d1ca965136c6aad18e0826e6a4fdfdb7e6ae"} Dec 16 07:01:33 crc kubenswrapper[4823]: I1216 07:01:33.318982 4823 generic.go:334] "Generic (PLEG): container finished" podID="af52958a-a702-46cf-b108-0d6f3227d7f5" 
containerID="4770362034bafdeedc02490ef46fcaf61f520c1a895229ec8bf02a2d742d0951" exitCode=0 Dec 16 07:01:33 crc kubenswrapper[4823]: I1216 07:01:33.319099 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4f255" event={"ID":"af52958a-a702-46cf-b108-0d6f3227d7f5","Type":"ContainerDied","Data":"4770362034bafdeedc02490ef46fcaf61f520c1a895229ec8bf02a2d742d0951"} Dec 16 07:01:33 crc kubenswrapper[4823]: I1216 07:01:33.322386 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-46558" event={"ID":"2582ab05-12e8-48c6-ac08-2673b110e34f","Type":"ContainerStarted","Data":"cfb2ee15e82cf7c20bc5af81c358ad8bbfc29d27050c7143860ada6e53cc5c52"} Dec 16 07:01:33 crc kubenswrapper[4823]: I1216 07:01:33.341089 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-c8h86" podStartSLOduration=3.428436622 podStartE2EDuration="6.341057521s" podCreationTimestamp="2025-12-16 07:01:27 +0000 UTC" firstStartedPulling="2025-12-16 07:01:29.276662402 +0000 UTC m=+367.765228535" lastFinishedPulling="2025-12-16 07:01:32.189283311 +0000 UTC m=+370.677849434" observedRunningTime="2025-12-16 07:01:33.337888155 +0000 UTC m=+371.826454278" watchObservedRunningTime="2025-12-16 07:01:33.341057521 +0000 UTC m=+371.829623644" Dec 16 07:01:34 crc kubenswrapper[4823]: I1216 07:01:34.330449 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4f255" event={"ID":"af52958a-a702-46cf-b108-0d6f3227d7f5","Type":"ContainerStarted","Data":"350f5db595cd158265896244b10201e5df3b0e55245ead8eb062c0a3146bc51c"} Dec 16 07:01:34 crc kubenswrapper[4823]: I1216 07:01:34.332852 4823 generic.go:334] "Generic (PLEG): container finished" podID="2582ab05-12e8-48c6-ac08-2673b110e34f" containerID="cfb2ee15e82cf7c20bc5af81c358ad8bbfc29d27050c7143860ada6e53cc5c52" exitCode=0 Dec 16 07:01:34 crc kubenswrapper[4823]: I1216 
07:01:34.332928 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-46558" event={"ID":"2582ab05-12e8-48c6-ac08-2673b110e34f","Type":"ContainerDied","Data":"cfb2ee15e82cf7c20bc5af81c358ad8bbfc29d27050c7143860ada6e53cc5c52"} Dec 16 07:01:34 crc kubenswrapper[4823]: I1216 07:01:34.359699 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-4f255" podStartSLOduration=1.887899386 podStartE2EDuration="4.359618949s" podCreationTimestamp="2025-12-16 07:01:30 +0000 UTC" firstStartedPulling="2025-12-16 07:01:31.289125307 +0000 UTC m=+369.777691430" lastFinishedPulling="2025-12-16 07:01:33.76084487 +0000 UTC m=+372.249410993" observedRunningTime="2025-12-16 07:01:34.350559906 +0000 UTC m=+372.839126029" watchObservedRunningTime="2025-12-16 07:01:34.359618949 +0000 UTC m=+372.848185082" Dec 16 07:01:35 crc kubenswrapper[4823]: I1216 07:01:35.350430 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-46558" event={"ID":"2582ab05-12e8-48c6-ac08-2673b110e34f","Type":"ContainerStarted","Data":"7b490d45c334c214d50165a4990724c173e50ba6bf67588c1d66547df3a57688"} Dec 16 07:01:35 crc kubenswrapper[4823]: I1216 07:01:35.374509 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-46558" podStartSLOduration=2.805870476 podStartE2EDuration="5.374469776s" podCreationTimestamp="2025-12-16 07:01:30 +0000 UTC" firstStartedPulling="2025-12-16 07:01:32.306950304 +0000 UTC m=+370.795516427" lastFinishedPulling="2025-12-16 07:01:34.875549604 +0000 UTC m=+373.364115727" observedRunningTime="2025-12-16 07:01:35.367102124 +0000 UTC m=+373.855668267" watchObservedRunningTime="2025-12-16 07:01:35.374469776 +0000 UTC m=+373.863035909" Dec 16 07:01:37 crc kubenswrapper[4823]: I1216 07:01:37.761974 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-image-registry/image-registry-66df7c8f76-w7fhp" Dec 16 07:01:37 crc kubenswrapper[4823]: I1216 07:01:37.831396 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-rmx2d"] Dec 16 07:01:38 crc kubenswrapper[4823]: I1216 07:01:38.150259 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-c8h86" Dec 16 07:01:38 crc kubenswrapper[4823]: I1216 07:01:38.150700 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-c8h86" Dec 16 07:01:38 crc kubenswrapper[4823]: I1216 07:01:38.195981 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-c8h86" Dec 16 07:01:38 crc kubenswrapper[4823]: I1216 07:01:38.343936 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-wnh5n" Dec 16 07:01:38 crc kubenswrapper[4823]: I1216 07:01:38.344013 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-wnh5n" Dec 16 07:01:38 crc kubenswrapper[4823]: I1216 07:01:38.387590 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-wnh5n" Dec 16 07:01:38 crc kubenswrapper[4823]: I1216 07:01:38.415302 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-c8h86" Dec 16 07:01:38 crc kubenswrapper[4823]: I1216 07:01:38.431765 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-wnh5n" Dec 16 07:01:40 crc kubenswrapper[4823]: I1216 07:01:40.541244 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-4f255" Dec 16 07:01:40 crc kubenswrapper[4823]: I1216 07:01:40.541893 
4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-4f255" Dec 16 07:01:40 crc kubenswrapper[4823]: I1216 07:01:40.589412 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-4f255" Dec 16 07:01:40 crc kubenswrapper[4823]: I1216 07:01:40.744430 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-46558" Dec 16 07:01:40 crc kubenswrapper[4823]: I1216 07:01:40.744519 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-46558" Dec 16 07:01:40 crc kubenswrapper[4823]: I1216 07:01:40.806108 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-46558" Dec 16 07:01:41 crc kubenswrapper[4823]: I1216 07:01:41.428247 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-4f255" Dec 16 07:01:41 crc kubenswrapper[4823]: I1216 07:01:41.439972 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-46558" Dec 16 07:01:58 crc kubenswrapper[4823]: I1216 07:01:58.134092 4823 patch_prober.go:28] interesting pod/machine-config-daemon-fv56f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 07:01:58 crc kubenswrapper[4823]: I1216 07:01:58.134738 4823 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" Dec 16 07:02:02 crc kubenswrapper[4823]: I1216 07:02:02.879146 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-rmx2d" podUID="0a48b03b-402f-48b1-a3b7-52690850de42" containerName="registry" containerID="cri-o://83a366cf745862e6cccd999622a3fc3bfdc1e8d365d834c238e2aeb8c9f6df71" gracePeriod=30 Dec 16 07:02:03 crc kubenswrapper[4823]: I1216 07:02:03.241478 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-rmx2d" Dec 16 07:02:03 crc kubenswrapper[4823]: I1216 07:02:03.387809 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0a48b03b-402f-48b1-a3b7-52690850de42-registry-tls\") pod \"0a48b03b-402f-48b1-a3b7-52690850de42\" (UID: \"0a48b03b-402f-48b1-a3b7-52690850de42\") " Dec 16 07:02:03 crc kubenswrapper[4823]: I1216 07:02:03.387936 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0a48b03b-402f-48b1-a3b7-52690850de42-ca-trust-extracted\") pod \"0a48b03b-402f-48b1-a3b7-52690850de42\" (UID: \"0a48b03b-402f-48b1-a3b7-52690850de42\") " Dec 16 07:02:03 crc kubenswrapper[4823]: I1216 07:02:03.389398 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0a48b03b-402f-48b1-a3b7-52690850de42-registry-certificates\") pod \"0a48b03b-402f-48b1-a3b7-52690850de42\" (UID: \"0a48b03b-402f-48b1-a3b7-52690850de42\") " Dec 16 07:02:03 crc kubenswrapper[4823]: I1216 07:02:03.389616 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"0a48b03b-402f-48b1-a3b7-52690850de42\" (UID: \"0a48b03b-402f-48b1-a3b7-52690850de42\") " Dec 16 07:02:03 crc kubenswrapper[4823]: I1216 07:02:03.389660 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g474n\" (UniqueName: \"kubernetes.io/projected/0a48b03b-402f-48b1-a3b7-52690850de42-kube-api-access-g474n\") pod \"0a48b03b-402f-48b1-a3b7-52690850de42\" (UID: \"0a48b03b-402f-48b1-a3b7-52690850de42\") " Dec 16 07:02:03 crc kubenswrapper[4823]: I1216 07:02:03.389731 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0a48b03b-402f-48b1-a3b7-52690850de42-installation-pull-secrets\") pod \"0a48b03b-402f-48b1-a3b7-52690850de42\" (UID: \"0a48b03b-402f-48b1-a3b7-52690850de42\") " Dec 16 07:02:03 crc kubenswrapper[4823]: I1216 07:02:03.389791 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0a48b03b-402f-48b1-a3b7-52690850de42-trusted-ca\") pod \"0a48b03b-402f-48b1-a3b7-52690850de42\" (UID: \"0a48b03b-402f-48b1-a3b7-52690850de42\") " Dec 16 07:02:03 crc kubenswrapper[4823]: I1216 07:02:03.390367 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a48b03b-402f-48b1-a3b7-52690850de42-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "0a48b03b-402f-48b1-a3b7-52690850de42" (UID: "0a48b03b-402f-48b1-a3b7-52690850de42"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:02:03 crc kubenswrapper[4823]: I1216 07:02:03.390702 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0a48b03b-402f-48b1-a3b7-52690850de42-bound-sa-token\") pod \"0a48b03b-402f-48b1-a3b7-52690850de42\" (UID: \"0a48b03b-402f-48b1-a3b7-52690850de42\") " Dec 16 07:02:03 crc kubenswrapper[4823]: I1216 07:02:03.390897 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a48b03b-402f-48b1-a3b7-52690850de42-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "0a48b03b-402f-48b1-a3b7-52690850de42" (UID: "0a48b03b-402f-48b1-a3b7-52690850de42"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:02:03 crc kubenswrapper[4823]: I1216 07:02:03.391168 4823 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0a48b03b-402f-48b1-a3b7-52690850de42-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 16 07:02:03 crc kubenswrapper[4823]: I1216 07:02:03.391190 4823 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0a48b03b-402f-48b1-a3b7-52690850de42-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 16 07:02:03 crc kubenswrapper[4823]: I1216 07:02:03.396474 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a48b03b-402f-48b1-a3b7-52690850de42-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "0a48b03b-402f-48b1-a3b7-52690850de42" (UID: "0a48b03b-402f-48b1-a3b7-52690850de42"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:02:03 crc kubenswrapper[4823]: I1216 07:02:03.397214 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a48b03b-402f-48b1-a3b7-52690850de42-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "0a48b03b-402f-48b1-a3b7-52690850de42" (UID: "0a48b03b-402f-48b1-a3b7-52690850de42"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:02:03 crc kubenswrapper[4823]: I1216 07:02:03.397252 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a48b03b-402f-48b1-a3b7-52690850de42-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "0a48b03b-402f-48b1-a3b7-52690850de42" (UID: "0a48b03b-402f-48b1-a3b7-52690850de42"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:02:03 crc kubenswrapper[4823]: I1216 07:02:03.397881 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a48b03b-402f-48b1-a3b7-52690850de42-kube-api-access-g474n" (OuterVolumeSpecName: "kube-api-access-g474n") pod "0a48b03b-402f-48b1-a3b7-52690850de42" (UID: "0a48b03b-402f-48b1-a3b7-52690850de42"). InnerVolumeSpecName "kube-api-access-g474n". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:02:03 crc kubenswrapper[4823]: I1216 07:02:03.399790 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "0a48b03b-402f-48b1-a3b7-52690850de42" (UID: "0a48b03b-402f-48b1-a3b7-52690850de42"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 16 07:02:03 crc kubenswrapper[4823]: I1216 07:02:03.409216 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a48b03b-402f-48b1-a3b7-52690850de42-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "0a48b03b-402f-48b1-a3b7-52690850de42" (UID: "0a48b03b-402f-48b1-a3b7-52690850de42"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:02:03 crc kubenswrapper[4823]: I1216 07:02:03.492685 4823 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0a48b03b-402f-48b1-a3b7-52690850de42-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 16 07:02:03 crc kubenswrapper[4823]: I1216 07:02:03.492739 4823 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0a48b03b-402f-48b1-a3b7-52690850de42-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 16 07:02:03 crc kubenswrapper[4823]: I1216 07:02:03.492758 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g474n\" (UniqueName: \"kubernetes.io/projected/0a48b03b-402f-48b1-a3b7-52690850de42-kube-api-access-g474n\") on node \"crc\" DevicePath \"\"" Dec 16 07:02:03 crc kubenswrapper[4823]: I1216 07:02:03.492773 4823 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0a48b03b-402f-48b1-a3b7-52690850de42-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 16 07:02:03 crc kubenswrapper[4823]: I1216 07:02:03.492782 4823 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0a48b03b-402f-48b1-a3b7-52690850de42-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 16 07:02:03 crc kubenswrapper[4823]: I1216 07:02:03.543076 4823 generic.go:334] "Generic (PLEG): container 
finished" podID="0a48b03b-402f-48b1-a3b7-52690850de42" containerID="83a366cf745862e6cccd999622a3fc3bfdc1e8d365d834c238e2aeb8c9f6df71" exitCode=0 Dec 16 07:02:03 crc kubenswrapper[4823]: I1216 07:02:03.543143 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-rmx2d" Dec 16 07:02:03 crc kubenswrapper[4823]: I1216 07:02:03.543165 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-rmx2d" event={"ID":"0a48b03b-402f-48b1-a3b7-52690850de42","Type":"ContainerDied","Data":"83a366cf745862e6cccd999622a3fc3bfdc1e8d365d834c238e2aeb8c9f6df71"} Dec 16 07:02:03 crc kubenswrapper[4823]: I1216 07:02:03.543225 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-rmx2d" event={"ID":"0a48b03b-402f-48b1-a3b7-52690850de42","Type":"ContainerDied","Data":"ccfba756a351719dde65ce1613b1a618a93494ddad1f764acf048b5f5aafbe29"} Dec 16 07:02:03 crc kubenswrapper[4823]: I1216 07:02:03.543253 4823 scope.go:117] "RemoveContainer" containerID="83a366cf745862e6cccd999622a3fc3bfdc1e8d365d834c238e2aeb8c9f6df71" Dec 16 07:02:03 crc kubenswrapper[4823]: I1216 07:02:03.563865 4823 scope.go:117] "RemoveContainer" containerID="83a366cf745862e6cccd999622a3fc3bfdc1e8d365d834c238e2aeb8c9f6df71" Dec 16 07:02:03 crc kubenswrapper[4823]: E1216 07:02:03.564468 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83a366cf745862e6cccd999622a3fc3bfdc1e8d365d834c238e2aeb8c9f6df71\": container with ID starting with 83a366cf745862e6cccd999622a3fc3bfdc1e8d365d834c238e2aeb8c9f6df71 not found: ID does not exist" containerID="83a366cf745862e6cccd999622a3fc3bfdc1e8d365d834c238e2aeb8c9f6df71" Dec 16 07:02:03 crc kubenswrapper[4823]: I1216 07:02:03.564502 4823 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"83a366cf745862e6cccd999622a3fc3bfdc1e8d365d834c238e2aeb8c9f6df71"} err="failed to get container status \"83a366cf745862e6cccd999622a3fc3bfdc1e8d365d834c238e2aeb8c9f6df71\": rpc error: code = NotFound desc = could not find container \"83a366cf745862e6cccd999622a3fc3bfdc1e8d365d834c238e2aeb8c9f6df71\": container with ID starting with 83a366cf745862e6cccd999622a3fc3bfdc1e8d365d834c238e2aeb8c9f6df71 not found: ID does not exist" Dec 16 07:02:03 crc kubenswrapper[4823]: I1216 07:02:03.573214 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-rmx2d"] Dec 16 07:02:03 crc kubenswrapper[4823]: I1216 07:02:03.576749 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-rmx2d"] Dec 16 07:02:03 crc kubenswrapper[4823]: I1216 07:02:03.778942 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a48b03b-402f-48b1-a3b7-52690850de42" path="/var/lib/kubelet/pods/0a48b03b-402f-48b1-a3b7-52690850de42/volumes" Dec 16 07:02:28 crc kubenswrapper[4823]: I1216 07:02:28.134125 4823 patch_prober.go:28] interesting pod/machine-config-daemon-fv56f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 07:02:28 crc kubenswrapper[4823]: I1216 07:02:28.134940 4823 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 07:02:28 crc kubenswrapper[4823]: I1216 07:02:28.135008 4823 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-fv56f" Dec 16 07:02:28 crc kubenswrapper[4823]: I1216 07:02:28.135967 4823 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3c937cb280cb5355361e27a4b204cc11ced2636f489f0b890dda44110baac59b"} pod="openshift-machine-config-operator/machine-config-daemon-fv56f" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 16 07:02:28 crc kubenswrapper[4823]: I1216 07:02:28.136073 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" containerName="machine-config-daemon" containerID="cri-o://3c937cb280cb5355361e27a4b204cc11ced2636f489f0b890dda44110baac59b" gracePeriod=600 Dec 16 07:02:28 crc kubenswrapper[4823]: I1216 07:02:28.712819 4823 generic.go:334] "Generic (PLEG): container finished" podID="25dec47c-3043-486c-b371-2be103c214e3" containerID="3c937cb280cb5355361e27a4b204cc11ced2636f489f0b890dda44110baac59b" exitCode=0 Dec 16 07:02:28 crc kubenswrapper[4823]: I1216 07:02:28.712945 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" event={"ID":"25dec47c-3043-486c-b371-2be103c214e3","Type":"ContainerDied","Data":"3c937cb280cb5355361e27a4b204cc11ced2636f489f0b890dda44110baac59b"} Dec 16 07:02:28 crc kubenswrapper[4823]: I1216 07:02:28.713446 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" event={"ID":"25dec47c-3043-486c-b371-2be103c214e3","Type":"ContainerStarted","Data":"a74b2e27a775c5c30f13e66fdd69513578a6852a8acac9edf5ac9cd313af855e"} Dec 16 07:02:28 crc kubenswrapper[4823]: I1216 07:02:28.713484 4823 scope.go:117] "RemoveContainer" 
containerID="78aed5cdf8d3a29dfffcdabfd18b38c0cf76225853078e01602198bee4994536" Dec 16 07:04:28 crc kubenswrapper[4823]: I1216 07:04:28.134913 4823 patch_prober.go:28] interesting pod/machine-config-daemon-fv56f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 07:04:28 crc kubenswrapper[4823]: I1216 07:04:28.137180 4823 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 07:04:58 crc kubenswrapper[4823]: I1216 07:04:58.134809 4823 patch_prober.go:28] interesting pod/machine-config-daemon-fv56f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 07:04:58 crc kubenswrapper[4823]: I1216 07:04:58.135603 4823 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 07:05:28 crc kubenswrapper[4823]: I1216 07:05:28.134492 4823 patch_prober.go:28] interesting pod/machine-config-daemon-fv56f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 07:05:28 crc kubenswrapper[4823]: I1216 07:05:28.136059 4823 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 07:05:28 crc kubenswrapper[4823]: I1216 07:05:28.136184 4823 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" Dec 16 07:05:28 crc kubenswrapper[4823]: I1216 07:05:28.137267 4823 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a74b2e27a775c5c30f13e66fdd69513578a6852a8acac9edf5ac9cd313af855e"} pod="openshift-machine-config-operator/machine-config-daemon-fv56f" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 16 07:05:28 crc kubenswrapper[4823]: I1216 07:05:28.137394 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" containerName="machine-config-daemon" containerID="cri-o://a74b2e27a775c5c30f13e66fdd69513578a6852a8acac9edf5ac9cd313af855e" gracePeriod=600 Dec 16 07:05:28 crc kubenswrapper[4823]: I1216 07:05:28.956739 4823 generic.go:334] "Generic (PLEG): container finished" podID="25dec47c-3043-486c-b371-2be103c214e3" containerID="a74b2e27a775c5c30f13e66fdd69513578a6852a8acac9edf5ac9cd313af855e" exitCode=0 Dec 16 07:05:28 crc kubenswrapper[4823]: I1216 07:05:28.956837 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" event={"ID":"25dec47c-3043-486c-b371-2be103c214e3","Type":"ContainerDied","Data":"a74b2e27a775c5c30f13e66fdd69513578a6852a8acac9edf5ac9cd313af855e"} Dec 16 07:05:28 crc kubenswrapper[4823]: I1216 
07:05:28.957458 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" event={"ID":"25dec47c-3043-486c-b371-2be103c214e3","Type":"ContainerStarted","Data":"b2c9252cf9a9f07ff8ef785fb449e76ce7b1db97459e712ab514f7ac68ccf0cb"} Dec 16 07:05:28 crc kubenswrapper[4823]: I1216 07:05:28.957502 4823 scope.go:117] "RemoveContainer" containerID="3c937cb280cb5355361e27a4b204cc11ced2636f489f0b890dda44110baac59b" Dec 16 07:07:28 crc kubenswrapper[4823]: I1216 07:07:28.134618 4823 patch_prober.go:28] interesting pod/machine-config-daemon-fv56f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 07:07:28 crc kubenswrapper[4823]: I1216 07:07:28.135532 4823 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 07:07:49 crc kubenswrapper[4823]: I1216 07:07:49.945235 4823 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 16 07:07:58 crc kubenswrapper[4823]: I1216 07:07:58.134377 4823 patch_prober.go:28] interesting pod/machine-config-daemon-fv56f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 07:07:58 crc kubenswrapper[4823]: I1216 07:07:58.135182 4823 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 07:08:28 crc kubenswrapper[4823]: I1216 07:08:28.134398 4823 patch_prober.go:28] interesting pod/machine-config-daemon-fv56f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 07:08:28 crc kubenswrapper[4823]: I1216 07:08:28.135196 4823 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 07:08:28 crc kubenswrapper[4823]: I1216 07:08:28.135286 4823 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" Dec 16 07:08:28 crc kubenswrapper[4823]: I1216 07:08:28.136344 4823 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b2c9252cf9a9f07ff8ef785fb449e76ce7b1db97459e712ab514f7ac68ccf0cb"} pod="openshift-machine-config-operator/machine-config-daemon-fv56f" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 16 07:08:28 crc kubenswrapper[4823]: I1216 07:08:28.136477 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" containerName="machine-config-daemon" containerID="cri-o://b2c9252cf9a9f07ff8ef785fb449e76ce7b1db97459e712ab514f7ac68ccf0cb" gracePeriod=600 Dec 16 07:08:29 crc kubenswrapper[4823]: I1216 
07:08:29.198483 4823 generic.go:334] "Generic (PLEG): container finished" podID="25dec47c-3043-486c-b371-2be103c214e3" containerID="b2c9252cf9a9f07ff8ef785fb449e76ce7b1db97459e712ab514f7ac68ccf0cb" exitCode=0 Dec 16 07:08:29 crc kubenswrapper[4823]: I1216 07:08:29.198604 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" event={"ID":"25dec47c-3043-486c-b371-2be103c214e3","Type":"ContainerDied","Data":"b2c9252cf9a9f07ff8ef785fb449e76ce7b1db97459e712ab514f7ac68ccf0cb"} Dec 16 07:08:29 crc kubenswrapper[4823]: I1216 07:08:29.199156 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" event={"ID":"25dec47c-3043-486c-b371-2be103c214e3","Type":"ContainerStarted","Data":"48219d3c0e584aed9d175a58b4a139883d9d4f8a627e33b1552f22d85e485c5c"} Dec 16 07:08:29 crc kubenswrapper[4823]: I1216 07:08:29.199218 4823 scope.go:117] "RemoveContainer" containerID="a74b2e27a775c5c30f13e66fdd69513578a6852a8acac9edf5ac9cd313af855e" Dec 16 07:09:42 crc kubenswrapper[4823]: I1216 07:09:42.361010 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-vnkvw"] Dec 16 07:09:42 crc kubenswrapper[4823]: E1216 07:09:42.361968 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a48b03b-402f-48b1-a3b7-52690850de42" containerName="registry" Dec 16 07:09:42 crc kubenswrapper[4823]: I1216 07:09:42.361988 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a48b03b-402f-48b1-a3b7-52690850de42" containerName="registry" Dec 16 07:09:42 crc kubenswrapper[4823]: I1216 07:09:42.362160 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a48b03b-402f-48b1-a3b7-52690850de42" containerName="registry" Dec 16 07:09:42 crc kubenswrapper[4823]: I1216 07:09:42.364823 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vnkvw" Dec 16 07:09:42 crc kubenswrapper[4823]: I1216 07:09:42.368254 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vnkvw"] Dec 16 07:09:42 crc kubenswrapper[4823]: I1216 07:09:42.468658 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0c9b09f-c898-4625-bf97-f912b3bc7bed-utilities\") pod \"redhat-marketplace-vnkvw\" (UID: \"e0c9b09f-c898-4625-bf97-f912b3bc7bed\") " pod="openshift-marketplace/redhat-marketplace-vnkvw" Dec 16 07:09:42 crc kubenswrapper[4823]: I1216 07:09:42.468979 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x56mv\" (UniqueName: \"kubernetes.io/projected/e0c9b09f-c898-4625-bf97-f912b3bc7bed-kube-api-access-x56mv\") pod \"redhat-marketplace-vnkvw\" (UID: \"e0c9b09f-c898-4625-bf97-f912b3bc7bed\") " pod="openshift-marketplace/redhat-marketplace-vnkvw" Dec 16 07:09:42 crc kubenswrapper[4823]: I1216 07:09:42.469067 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0c9b09f-c898-4625-bf97-f912b3bc7bed-catalog-content\") pod \"redhat-marketplace-vnkvw\" (UID: \"e0c9b09f-c898-4625-bf97-f912b3bc7bed\") " pod="openshift-marketplace/redhat-marketplace-vnkvw" Dec 16 07:09:42 crc kubenswrapper[4823]: I1216 07:09:42.570571 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0c9b09f-c898-4625-bf97-f912b3bc7bed-catalog-content\") pod \"redhat-marketplace-vnkvw\" (UID: \"e0c9b09f-c898-4625-bf97-f912b3bc7bed\") " pod="openshift-marketplace/redhat-marketplace-vnkvw" Dec 16 07:09:42 crc kubenswrapper[4823]: I1216 07:09:42.570816 4823 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0c9b09f-c898-4625-bf97-f912b3bc7bed-utilities\") pod \"redhat-marketplace-vnkvw\" (UID: \"e0c9b09f-c898-4625-bf97-f912b3bc7bed\") " pod="openshift-marketplace/redhat-marketplace-vnkvw" Dec 16 07:09:42 crc kubenswrapper[4823]: I1216 07:09:42.570875 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x56mv\" (UniqueName: \"kubernetes.io/projected/e0c9b09f-c898-4625-bf97-f912b3bc7bed-kube-api-access-x56mv\") pod \"redhat-marketplace-vnkvw\" (UID: \"e0c9b09f-c898-4625-bf97-f912b3bc7bed\") " pod="openshift-marketplace/redhat-marketplace-vnkvw" Dec 16 07:09:42 crc kubenswrapper[4823]: I1216 07:09:42.571451 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0c9b09f-c898-4625-bf97-f912b3bc7bed-utilities\") pod \"redhat-marketplace-vnkvw\" (UID: \"e0c9b09f-c898-4625-bf97-f912b3bc7bed\") " pod="openshift-marketplace/redhat-marketplace-vnkvw" Dec 16 07:09:42 crc kubenswrapper[4823]: I1216 07:09:42.571620 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0c9b09f-c898-4625-bf97-f912b3bc7bed-catalog-content\") pod \"redhat-marketplace-vnkvw\" (UID: \"e0c9b09f-c898-4625-bf97-f912b3bc7bed\") " pod="openshift-marketplace/redhat-marketplace-vnkvw" Dec 16 07:09:42 crc kubenswrapper[4823]: I1216 07:09:42.588719 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x56mv\" (UniqueName: \"kubernetes.io/projected/e0c9b09f-c898-4625-bf97-f912b3bc7bed-kube-api-access-x56mv\") pod \"redhat-marketplace-vnkvw\" (UID: \"e0c9b09f-c898-4625-bf97-f912b3bc7bed\") " pod="openshift-marketplace/redhat-marketplace-vnkvw" Dec 16 07:09:42 crc kubenswrapper[4823]: I1216 07:09:42.701310 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vnkvw" Dec 16 07:09:42 crc kubenswrapper[4823]: I1216 07:09:42.905268 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vnkvw"] Dec 16 07:09:43 crc kubenswrapper[4823]: I1216 07:09:43.746299 4823 generic.go:334] "Generic (PLEG): container finished" podID="e0c9b09f-c898-4625-bf97-f912b3bc7bed" containerID="7af93d466cc57d1592a85a38d677ec7addca8548b67b235eb22afaaea6f9d50d" exitCode=0 Dec 16 07:09:43 crc kubenswrapper[4823]: I1216 07:09:43.746372 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vnkvw" event={"ID":"e0c9b09f-c898-4625-bf97-f912b3bc7bed","Type":"ContainerDied","Data":"7af93d466cc57d1592a85a38d677ec7addca8548b67b235eb22afaaea6f9d50d"} Dec 16 07:09:43 crc kubenswrapper[4823]: I1216 07:09:43.746564 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vnkvw" event={"ID":"e0c9b09f-c898-4625-bf97-f912b3bc7bed","Type":"ContainerStarted","Data":"7d8aff3b3f15e1121e24bc5a60fe6c4ea7caa7409696158e977b9e4f91da4965"} Dec 16 07:09:43 crc kubenswrapper[4823]: I1216 07:09:43.749460 4823 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 16 07:09:44 crc kubenswrapper[4823]: I1216 07:09:44.755535 4823 generic.go:334] "Generic (PLEG): container finished" podID="e0c9b09f-c898-4625-bf97-f912b3bc7bed" containerID="870bb3487bd8cc710ac63b02b221a161ec71ac17b3eaa88ebee803518b886a56" exitCode=0 Dec 16 07:09:44 crc kubenswrapper[4823]: I1216 07:09:44.755595 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vnkvw" event={"ID":"e0c9b09f-c898-4625-bf97-f912b3bc7bed","Type":"ContainerDied","Data":"870bb3487bd8cc710ac63b02b221a161ec71ac17b3eaa88ebee803518b886a56"} Dec 16 07:09:45 crc kubenswrapper[4823]: I1216 07:09:45.762775 4823 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/redhat-marketplace-vnkvw" event={"ID":"e0c9b09f-c898-4625-bf97-f912b3bc7bed","Type":"ContainerStarted","Data":"860af0e052dca4ff79ba718659c0107936f18859d2250e32b74ae08fb9bae392"} Dec 16 07:09:45 crc kubenswrapper[4823]: I1216 07:09:45.784317 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-vnkvw" podStartSLOduration=2.320607796 podStartE2EDuration="3.78429616s" podCreationTimestamp="2025-12-16 07:09:42 +0000 UTC" firstStartedPulling="2025-12-16 07:09:43.749254521 +0000 UTC m=+862.237820644" lastFinishedPulling="2025-12-16 07:09:45.212942885 +0000 UTC m=+863.701509008" observedRunningTime="2025-12-16 07:09:45.782759662 +0000 UTC m=+864.271325795" watchObservedRunningTime="2025-12-16 07:09:45.78429616 +0000 UTC m=+864.272862283" Dec 16 07:09:48 crc kubenswrapper[4823]: I1216 07:09:48.340741 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8kwpm"] Dec 16 07:09:48 crc kubenswrapper[4823]: I1216 07:09:48.343104 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8kwpm" Dec 16 07:09:48 crc kubenswrapper[4823]: I1216 07:09:48.358934 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8kwpm"] Dec 16 07:09:48 crc kubenswrapper[4823]: I1216 07:09:48.446291 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0072530b-0b12-4bb7-943e-48eae2c7f6b1-catalog-content\") pod \"certified-operators-8kwpm\" (UID: \"0072530b-0b12-4bb7-943e-48eae2c7f6b1\") " pod="openshift-marketplace/certified-operators-8kwpm" Dec 16 07:09:48 crc kubenswrapper[4823]: I1216 07:09:48.446533 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t69lv\" (UniqueName: \"kubernetes.io/projected/0072530b-0b12-4bb7-943e-48eae2c7f6b1-kube-api-access-t69lv\") pod \"certified-operators-8kwpm\" (UID: \"0072530b-0b12-4bb7-943e-48eae2c7f6b1\") " pod="openshift-marketplace/certified-operators-8kwpm" Dec 16 07:09:48 crc kubenswrapper[4823]: I1216 07:09:48.446627 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0072530b-0b12-4bb7-943e-48eae2c7f6b1-utilities\") pod \"certified-operators-8kwpm\" (UID: \"0072530b-0b12-4bb7-943e-48eae2c7f6b1\") " pod="openshift-marketplace/certified-operators-8kwpm" Dec 16 07:09:48 crc kubenswrapper[4823]: I1216 07:09:48.548087 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0072530b-0b12-4bb7-943e-48eae2c7f6b1-catalog-content\") pod \"certified-operators-8kwpm\" (UID: \"0072530b-0b12-4bb7-943e-48eae2c7f6b1\") " pod="openshift-marketplace/certified-operators-8kwpm" Dec 16 07:09:48 crc kubenswrapper[4823]: I1216 07:09:48.548148 4823 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-t69lv\" (UniqueName: \"kubernetes.io/projected/0072530b-0b12-4bb7-943e-48eae2c7f6b1-kube-api-access-t69lv\") pod \"certified-operators-8kwpm\" (UID: \"0072530b-0b12-4bb7-943e-48eae2c7f6b1\") " pod="openshift-marketplace/certified-operators-8kwpm" Dec 16 07:09:48 crc kubenswrapper[4823]: I1216 07:09:48.548187 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0072530b-0b12-4bb7-943e-48eae2c7f6b1-utilities\") pod \"certified-operators-8kwpm\" (UID: \"0072530b-0b12-4bb7-943e-48eae2c7f6b1\") " pod="openshift-marketplace/certified-operators-8kwpm" Dec 16 07:09:48 crc kubenswrapper[4823]: I1216 07:09:48.548693 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0072530b-0b12-4bb7-943e-48eae2c7f6b1-utilities\") pod \"certified-operators-8kwpm\" (UID: \"0072530b-0b12-4bb7-943e-48eae2c7f6b1\") " pod="openshift-marketplace/certified-operators-8kwpm" Dec 16 07:09:48 crc kubenswrapper[4823]: I1216 07:09:48.548842 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0072530b-0b12-4bb7-943e-48eae2c7f6b1-catalog-content\") pod \"certified-operators-8kwpm\" (UID: \"0072530b-0b12-4bb7-943e-48eae2c7f6b1\") " pod="openshift-marketplace/certified-operators-8kwpm" Dec 16 07:09:48 crc kubenswrapper[4823]: I1216 07:09:48.576770 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t69lv\" (UniqueName: \"kubernetes.io/projected/0072530b-0b12-4bb7-943e-48eae2c7f6b1-kube-api-access-t69lv\") pod \"certified-operators-8kwpm\" (UID: \"0072530b-0b12-4bb7-943e-48eae2c7f6b1\") " pod="openshift-marketplace/certified-operators-8kwpm" Dec 16 07:09:48 crc kubenswrapper[4823]: I1216 07:09:48.674914 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8kwpm" Dec 16 07:09:48 crc kubenswrapper[4823]: I1216 07:09:48.967579 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8kwpm"] Dec 16 07:09:49 crc kubenswrapper[4823]: I1216 07:09:49.788273 4823 generic.go:334] "Generic (PLEG): container finished" podID="0072530b-0b12-4bb7-943e-48eae2c7f6b1" containerID="982010b8dc16a609d26bf1dd138aea9ffc8b78bc2d465a95c082745265672b08" exitCode=0 Dec 16 07:09:49 crc kubenswrapper[4823]: I1216 07:09:49.788381 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8kwpm" event={"ID":"0072530b-0b12-4bb7-943e-48eae2c7f6b1","Type":"ContainerDied","Data":"982010b8dc16a609d26bf1dd138aea9ffc8b78bc2d465a95c082745265672b08"} Dec 16 07:09:49 crc kubenswrapper[4823]: I1216 07:09:49.788589 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8kwpm" event={"ID":"0072530b-0b12-4bb7-943e-48eae2c7f6b1","Type":"ContainerStarted","Data":"be9a9fb50a6ac2395cb131565bc55cd975bb8dc582a1f04992fcc404fcbcdfc1"} Dec 16 07:09:50 crc kubenswrapper[4823]: I1216 07:09:50.797188 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8kwpm" event={"ID":"0072530b-0b12-4bb7-943e-48eae2c7f6b1","Type":"ContainerStarted","Data":"a04f9d76b9f339563b4ce37495a41a4fe1695829cede773aec44299f249f0119"} Dec 16 07:09:51 crc kubenswrapper[4823]: I1216 07:09:51.807393 4823 generic.go:334] "Generic (PLEG): container finished" podID="0072530b-0b12-4bb7-943e-48eae2c7f6b1" containerID="a04f9d76b9f339563b4ce37495a41a4fe1695829cede773aec44299f249f0119" exitCode=0 Dec 16 07:09:51 crc kubenswrapper[4823]: I1216 07:09:51.807755 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8kwpm" 
event={"ID":"0072530b-0b12-4bb7-943e-48eae2c7f6b1","Type":"ContainerDied","Data":"a04f9d76b9f339563b4ce37495a41a4fe1695829cede773aec44299f249f0119"} Dec 16 07:09:52 crc kubenswrapper[4823]: I1216 07:09:52.701808 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-vnkvw" Dec 16 07:09:52 crc kubenswrapper[4823]: I1216 07:09:52.701866 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-vnkvw" Dec 16 07:09:52 crc kubenswrapper[4823]: I1216 07:09:52.758986 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-vnkvw" Dec 16 07:09:52 crc kubenswrapper[4823]: I1216 07:09:52.815634 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8kwpm" event={"ID":"0072530b-0b12-4bb7-943e-48eae2c7f6b1","Type":"ContainerStarted","Data":"19445955a19d79bd4aa355c80ad116ad4917b0604bcdaac3f552b71233212e1f"} Dec 16 07:09:52 crc kubenswrapper[4823]: I1216 07:09:52.834437 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-8kwpm" podStartSLOduration=2.374830951 podStartE2EDuration="4.834417189s" podCreationTimestamp="2025-12-16 07:09:48 +0000 UTC" firstStartedPulling="2025-12-16 07:09:49.794336655 +0000 UTC m=+868.282902778" lastFinishedPulling="2025-12-16 07:09:52.253922883 +0000 UTC m=+870.742489016" observedRunningTime="2025-12-16 07:09:52.831326451 +0000 UTC m=+871.319892584" watchObservedRunningTime="2025-12-16 07:09:52.834417189 +0000 UTC m=+871.322983312" Dec 16 07:09:52 crc kubenswrapper[4823]: I1216 07:09:52.855224 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-vnkvw" Dec 16 07:09:55 crc kubenswrapper[4823]: I1216 07:09:55.132122 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-vnkvw"] Dec 16 07:09:55 crc kubenswrapper[4823]: I1216 07:09:55.134643 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-vnkvw" podUID="e0c9b09f-c898-4625-bf97-f912b3bc7bed" containerName="registry-server" containerID="cri-o://860af0e052dca4ff79ba718659c0107936f18859d2250e32b74ae08fb9bae392" gracePeriod=2 Dec 16 07:09:55 crc kubenswrapper[4823]: I1216 07:09:55.487510 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vnkvw" Dec 16 07:09:55 crc kubenswrapper[4823]: I1216 07:09:55.637548 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x56mv\" (UniqueName: \"kubernetes.io/projected/e0c9b09f-c898-4625-bf97-f912b3bc7bed-kube-api-access-x56mv\") pod \"e0c9b09f-c898-4625-bf97-f912b3bc7bed\" (UID: \"e0c9b09f-c898-4625-bf97-f912b3bc7bed\") " Dec 16 07:09:55 crc kubenswrapper[4823]: I1216 07:09:55.637885 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0c9b09f-c898-4625-bf97-f912b3bc7bed-utilities\") pod \"e0c9b09f-c898-4625-bf97-f912b3bc7bed\" (UID: \"e0c9b09f-c898-4625-bf97-f912b3bc7bed\") " Dec 16 07:09:55 crc kubenswrapper[4823]: I1216 07:09:55.637967 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0c9b09f-c898-4625-bf97-f912b3bc7bed-catalog-content\") pod \"e0c9b09f-c898-4625-bf97-f912b3bc7bed\" (UID: \"e0c9b09f-c898-4625-bf97-f912b3bc7bed\") " Dec 16 07:09:55 crc kubenswrapper[4823]: I1216 07:09:55.639374 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0c9b09f-c898-4625-bf97-f912b3bc7bed-utilities" (OuterVolumeSpecName: "utilities") pod "e0c9b09f-c898-4625-bf97-f912b3bc7bed" (UID: 
"e0c9b09f-c898-4625-bf97-f912b3bc7bed"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:09:55 crc kubenswrapper[4823]: I1216 07:09:55.659883 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0c9b09f-c898-4625-bf97-f912b3bc7bed-kube-api-access-x56mv" (OuterVolumeSpecName: "kube-api-access-x56mv") pod "e0c9b09f-c898-4625-bf97-f912b3bc7bed" (UID: "e0c9b09f-c898-4625-bf97-f912b3bc7bed"). InnerVolumeSpecName "kube-api-access-x56mv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:09:55 crc kubenswrapper[4823]: I1216 07:09:55.665451 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0c9b09f-c898-4625-bf97-f912b3bc7bed-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e0c9b09f-c898-4625-bf97-f912b3bc7bed" (UID: "e0c9b09f-c898-4625-bf97-f912b3bc7bed"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:09:55 crc kubenswrapper[4823]: I1216 07:09:55.739574 4823 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0c9b09f-c898-4625-bf97-f912b3bc7bed-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 07:09:55 crc kubenswrapper[4823]: I1216 07:09:55.739614 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x56mv\" (UniqueName: \"kubernetes.io/projected/e0c9b09f-c898-4625-bf97-f912b3bc7bed-kube-api-access-x56mv\") on node \"crc\" DevicePath \"\"" Dec 16 07:09:55 crc kubenswrapper[4823]: I1216 07:09:55.739630 4823 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0c9b09f-c898-4625-bf97-f912b3bc7bed-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 07:09:55 crc kubenswrapper[4823]: I1216 07:09:55.834518 4823 generic.go:334] "Generic (PLEG): container finished" 
podID="e0c9b09f-c898-4625-bf97-f912b3bc7bed" containerID="860af0e052dca4ff79ba718659c0107936f18859d2250e32b74ae08fb9bae392" exitCode=0
Dec 16 07:09:55 crc kubenswrapper[4823]: I1216 07:09:55.834572 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vnkvw" event={"ID":"e0c9b09f-c898-4625-bf97-f912b3bc7bed","Type":"ContainerDied","Data":"860af0e052dca4ff79ba718659c0107936f18859d2250e32b74ae08fb9bae392"}
Dec 16 07:09:55 crc kubenswrapper[4823]: I1216 07:09:55.834606 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vnkvw" event={"ID":"e0c9b09f-c898-4625-bf97-f912b3bc7bed","Type":"ContainerDied","Data":"7d8aff3b3f15e1121e24bc5a60fe6c4ea7caa7409696158e977b9e4f91da4965"}
Dec 16 07:09:55 crc kubenswrapper[4823]: I1216 07:09:55.834611 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vnkvw"
Dec 16 07:09:55 crc kubenswrapper[4823]: I1216 07:09:55.834628 4823 scope.go:117] "RemoveContainer" containerID="860af0e052dca4ff79ba718659c0107936f18859d2250e32b74ae08fb9bae392"
Dec 16 07:09:55 crc kubenswrapper[4823]: I1216 07:09:55.851851 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vnkvw"]
Dec 16 07:09:55 crc kubenswrapper[4823]: I1216 07:09:55.851918 4823 scope.go:117] "RemoveContainer" containerID="870bb3487bd8cc710ac63b02b221a161ec71ac17b3eaa88ebee803518b886a56"
Dec 16 07:09:55 crc kubenswrapper[4823]: I1216 07:09:55.862585 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-vnkvw"]
Dec 16 07:09:55 crc kubenswrapper[4823]: I1216 07:09:55.875293 4823 scope.go:117] "RemoveContainer" containerID="7af93d466cc57d1592a85a38d677ec7addca8548b67b235eb22afaaea6f9d50d"
Dec 16 07:09:55 crc kubenswrapper[4823]: I1216 07:09:55.907552 4823 scope.go:117] "RemoveContainer" containerID="860af0e052dca4ff79ba718659c0107936f18859d2250e32b74ae08fb9bae392"
Dec 16 07:09:55 crc kubenswrapper[4823]: E1216 07:09:55.908037 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"860af0e052dca4ff79ba718659c0107936f18859d2250e32b74ae08fb9bae392\": container with ID starting with 860af0e052dca4ff79ba718659c0107936f18859d2250e32b74ae08fb9bae392 not found: ID does not exist" containerID="860af0e052dca4ff79ba718659c0107936f18859d2250e32b74ae08fb9bae392"
Dec 16 07:09:55 crc kubenswrapper[4823]: I1216 07:09:55.908074 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"860af0e052dca4ff79ba718659c0107936f18859d2250e32b74ae08fb9bae392"} err="failed to get container status \"860af0e052dca4ff79ba718659c0107936f18859d2250e32b74ae08fb9bae392\": rpc error: code = NotFound desc = could not find container \"860af0e052dca4ff79ba718659c0107936f18859d2250e32b74ae08fb9bae392\": container with ID starting with 860af0e052dca4ff79ba718659c0107936f18859d2250e32b74ae08fb9bae392 not found: ID does not exist"
Dec 16 07:09:55 crc kubenswrapper[4823]: I1216 07:09:55.908097 4823 scope.go:117] "RemoveContainer" containerID="870bb3487bd8cc710ac63b02b221a161ec71ac17b3eaa88ebee803518b886a56"
Dec 16 07:09:55 crc kubenswrapper[4823]: E1216 07:09:55.908519 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"870bb3487bd8cc710ac63b02b221a161ec71ac17b3eaa88ebee803518b886a56\": container with ID starting with 870bb3487bd8cc710ac63b02b221a161ec71ac17b3eaa88ebee803518b886a56 not found: ID does not exist" containerID="870bb3487bd8cc710ac63b02b221a161ec71ac17b3eaa88ebee803518b886a56"
Dec 16 07:09:55 crc kubenswrapper[4823]: I1216 07:09:55.908551 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"870bb3487bd8cc710ac63b02b221a161ec71ac17b3eaa88ebee803518b886a56"} err="failed to get container status \"870bb3487bd8cc710ac63b02b221a161ec71ac17b3eaa88ebee803518b886a56\": rpc error: code = NotFound desc = could not find container \"870bb3487bd8cc710ac63b02b221a161ec71ac17b3eaa88ebee803518b886a56\": container with ID starting with 870bb3487bd8cc710ac63b02b221a161ec71ac17b3eaa88ebee803518b886a56 not found: ID does not exist"
Dec 16 07:09:55 crc kubenswrapper[4823]: I1216 07:09:55.908569 4823 scope.go:117] "RemoveContainer" containerID="7af93d466cc57d1592a85a38d677ec7addca8548b67b235eb22afaaea6f9d50d"
Dec 16 07:09:55 crc kubenswrapper[4823]: E1216 07:09:55.908817 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7af93d466cc57d1592a85a38d677ec7addca8548b67b235eb22afaaea6f9d50d\": container with ID starting with 7af93d466cc57d1592a85a38d677ec7addca8548b67b235eb22afaaea6f9d50d not found: ID does not exist" containerID="7af93d466cc57d1592a85a38d677ec7addca8548b67b235eb22afaaea6f9d50d"
Dec 16 07:09:55 crc kubenswrapper[4823]: I1216 07:09:55.908859 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7af93d466cc57d1592a85a38d677ec7addca8548b67b235eb22afaaea6f9d50d"} err="failed to get container status \"7af93d466cc57d1592a85a38d677ec7addca8548b67b235eb22afaaea6f9d50d\": rpc error: code = NotFound desc = could not find container \"7af93d466cc57d1592a85a38d677ec7addca8548b67b235eb22afaaea6f9d50d\": container with ID starting with 7af93d466cc57d1592a85a38d677ec7addca8548b67b235eb22afaaea6f9d50d not found: ID does not exist"
Dec 16 07:09:57 crc kubenswrapper[4823]: I1216 07:09:57.786639 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0c9b09f-c898-4625-bf97-f912b3bc7bed" path="/var/lib/kubelet/pods/e0c9b09f-c898-4625-bf97-f912b3bc7bed/volumes"
Dec 16 07:09:58 crc kubenswrapper[4823]: I1216 07:09:58.675130 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8kwpm"
Dec 16 07:09:58 crc kubenswrapper[4823]: I1216 07:09:58.675631 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-8kwpm"
Dec 16 07:09:58 crc kubenswrapper[4823]: I1216 07:09:58.736241 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8kwpm"
Dec 16 07:09:58 crc kubenswrapper[4823]: I1216 07:09:58.899119 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8kwpm"
Dec 16 07:10:00 crc kubenswrapper[4823]: I1216 07:10:00.125482 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8kwpm"]
Dec 16 07:10:01 crc kubenswrapper[4823]: I1216 07:10:01.870256 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-8kwpm" podUID="0072530b-0b12-4bb7-943e-48eae2c7f6b1" containerName="registry-server" containerID="cri-o://19445955a19d79bd4aa355c80ad116ad4917b0604bcdaac3f552b71233212e1f" gracePeriod=2
Dec 16 07:10:02 crc kubenswrapper[4823]: I1216 07:10:02.572363 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-69dj9"]
Dec 16 07:10:02 crc kubenswrapper[4823]: E1216 07:10:02.572894 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0c9b09f-c898-4625-bf97-f912b3bc7bed" containerName="extract-utilities"
Dec 16 07:10:02 crc kubenswrapper[4823]: I1216 07:10:02.572914 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0c9b09f-c898-4625-bf97-f912b3bc7bed" containerName="extract-utilities"
Dec 16 07:10:02 crc kubenswrapper[4823]: E1216 07:10:02.572929 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0c9b09f-c898-4625-bf97-f912b3bc7bed" containerName="registry-server"
Dec 16 07:10:02 crc kubenswrapper[4823]: I1216 07:10:02.572937 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0c9b09f-c898-4625-bf97-f912b3bc7bed" containerName="registry-server"
Dec 16 07:10:02 crc kubenswrapper[4823]: E1216 07:10:02.572975 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0c9b09f-c898-4625-bf97-f912b3bc7bed" containerName="extract-content"
Dec 16 07:10:02 crc kubenswrapper[4823]: I1216 07:10:02.572984 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0c9b09f-c898-4625-bf97-f912b3bc7bed" containerName="extract-content"
Dec 16 07:10:02 crc kubenswrapper[4823]: I1216 07:10:02.573135 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0c9b09f-c898-4625-bf97-f912b3bc7bed" containerName="registry-server"
Dec 16 07:10:02 crc kubenswrapper[4823]: I1216 07:10:02.575003 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-69dj9"
Dec 16 07:10:02 crc kubenswrapper[4823]: I1216 07:10:02.582438 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-69dj9"]
Dec 16 07:10:02 crc kubenswrapper[4823]: I1216 07:10:02.652841 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ee8d317-17b9-40a4-981c-39fc5c2aff0e-catalog-content\") pod \"community-operators-69dj9\" (UID: \"6ee8d317-17b9-40a4-981c-39fc5c2aff0e\") " pod="openshift-marketplace/community-operators-69dj9"
Dec 16 07:10:02 crc kubenswrapper[4823]: I1216 07:10:02.652912 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ee8d317-17b9-40a4-981c-39fc5c2aff0e-utilities\") pod \"community-operators-69dj9\" (UID: \"6ee8d317-17b9-40a4-981c-39fc5c2aff0e\") " pod="openshift-marketplace/community-operators-69dj9"
Dec 16 07:10:02 crc kubenswrapper[4823]: I1216 07:10:02.653054 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c42f7\" (UniqueName: \"kubernetes.io/projected/6ee8d317-17b9-40a4-981c-39fc5c2aff0e-kube-api-access-c42f7\") pod \"community-operators-69dj9\" (UID: \"6ee8d317-17b9-40a4-981c-39fc5c2aff0e\") " pod="openshift-marketplace/community-operators-69dj9"
Dec 16 07:10:02 crc kubenswrapper[4823]: I1216 07:10:02.754094 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ee8d317-17b9-40a4-981c-39fc5c2aff0e-utilities\") pod \"community-operators-69dj9\" (UID: \"6ee8d317-17b9-40a4-981c-39fc5c2aff0e\") " pod="openshift-marketplace/community-operators-69dj9"
Dec 16 07:10:02 crc kubenswrapper[4823]: I1216 07:10:02.754155 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c42f7\" (UniqueName: \"kubernetes.io/projected/6ee8d317-17b9-40a4-981c-39fc5c2aff0e-kube-api-access-c42f7\") pod \"community-operators-69dj9\" (UID: \"6ee8d317-17b9-40a4-981c-39fc5c2aff0e\") " pod="openshift-marketplace/community-operators-69dj9"
Dec 16 07:10:02 crc kubenswrapper[4823]: I1216 07:10:02.754232 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ee8d317-17b9-40a4-981c-39fc5c2aff0e-catalog-content\") pod \"community-operators-69dj9\" (UID: \"6ee8d317-17b9-40a4-981c-39fc5c2aff0e\") " pod="openshift-marketplace/community-operators-69dj9"
Dec 16 07:10:02 crc kubenswrapper[4823]: I1216 07:10:02.754705 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ee8d317-17b9-40a4-981c-39fc5c2aff0e-catalog-content\") pod \"community-operators-69dj9\" (UID: \"6ee8d317-17b9-40a4-981c-39fc5c2aff0e\") " pod="openshift-marketplace/community-operators-69dj9"
Dec 16 07:10:02 crc kubenswrapper[4823]: I1216 07:10:02.754707 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ee8d317-17b9-40a4-981c-39fc5c2aff0e-utilities\") pod \"community-operators-69dj9\" (UID: \"6ee8d317-17b9-40a4-981c-39fc5c2aff0e\") " pod="openshift-marketplace/community-operators-69dj9"
Dec 16 07:10:02 crc kubenswrapper[4823]: I1216 07:10:02.772788 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8kwpm"
Dec 16 07:10:02 crc kubenswrapper[4823]: I1216 07:10:02.774335 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c42f7\" (UniqueName: \"kubernetes.io/projected/6ee8d317-17b9-40a4-981c-39fc5c2aff0e-kube-api-access-c42f7\") pod \"community-operators-69dj9\" (UID: \"6ee8d317-17b9-40a4-981c-39fc5c2aff0e\") " pod="openshift-marketplace/community-operators-69dj9"
Dec 16 07:10:02 crc kubenswrapper[4823]: I1216 07:10:02.854698 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0072530b-0b12-4bb7-943e-48eae2c7f6b1-utilities\") pod \"0072530b-0b12-4bb7-943e-48eae2c7f6b1\" (UID: \"0072530b-0b12-4bb7-943e-48eae2c7f6b1\") "
Dec 16 07:10:02 crc kubenswrapper[4823]: I1216 07:10:02.854748 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0072530b-0b12-4bb7-943e-48eae2c7f6b1-catalog-content\") pod \"0072530b-0b12-4bb7-943e-48eae2c7f6b1\" (UID: \"0072530b-0b12-4bb7-943e-48eae2c7f6b1\") "
Dec 16 07:10:02 crc kubenswrapper[4823]: I1216 07:10:02.854824 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t69lv\" (UniqueName: \"kubernetes.io/projected/0072530b-0b12-4bb7-943e-48eae2c7f6b1-kube-api-access-t69lv\") pod \"0072530b-0b12-4bb7-943e-48eae2c7f6b1\" (UID: \"0072530b-0b12-4bb7-943e-48eae2c7f6b1\") "
Dec 16 07:10:02 crc kubenswrapper[4823]: I1216 07:10:02.855927 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0072530b-0b12-4bb7-943e-48eae2c7f6b1-utilities" (OuterVolumeSpecName: "utilities") pod "0072530b-0b12-4bb7-943e-48eae2c7f6b1" (UID: "0072530b-0b12-4bb7-943e-48eae2c7f6b1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 16 07:10:02 crc kubenswrapper[4823]: I1216 07:10:02.859254 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0072530b-0b12-4bb7-943e-48eae2c7f6b1-kube-api-access-t69lv" (OuterVolumeSpecName: "kube-api-access-t69lv") pod "0072530b-0b12-4bb7-943e-48eae2c7f6b1" (UID: "0072530b-0b12-4bb7-943e-48eae2c7f6b1"). InnerVolumeSpecName "kube-api-access-t69lv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 07:10:02 crc kubenswrapper[4823]: I1216 07:10:02.877312 4823 generic.go:334] "Generic (PLEG): container finished" podID="0072530b-0b12-4bb7-943e-48eae2c7f6b1" containerID="19445955a19d79bd4aa355c80ad116ad4917b0604bcdaac3f552b71233212e1f" exitCode=0
Dec 16 07:10:02 crc kubenswrapper[4823]: I1216 07:10:02.877354 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8kwpm" event={"ID":"0072530b-0b12-4bb7-943e-48eae2c7f6b1","Type":"ContainerDied","Data":"19445955a19d79bd4aa355c80ad116ad4917b0604bcdaac3f552b71233212e1f"}
Dec 16 07:10:02 crc kubenswrapper[4823]: I1216 07:10:02.877380 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8kwpm" event={"ID":"0072530b-0b12-4bb7-943e-48eae2c7f6b1","Type":"ContainerDied","Data":"be9a9fb50a6ac2395cb131565bc55cd975bb8dc582a1f04992fcc404fcbcdfc1"}
Dec 16 07:10:02 crc kubenswrapper[4823]: I1216 07:10:02.877397 4823 scope.go:117] "RemoveContainer" containerID="19445955a19d79bd4aa355c80ad116ad4917b0604bcdaac3f552b71233212e1f"
Dec 16 07:10:02 crc kubenswrapper[4823]: I1216 07:10:02.877399 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8kwpm"
Dec 16 07:10:02 crc kubenswrapper[4823]: I1216 07:10:02.897515 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-69dj9"
Dec 16 07:10:02 crc kubenswrapper[4823]: I1216 07:10:02.903130 4823 scope.go:117] "RemoveContainer" containerID="a04f9d76b9f339563b4ce37495a41a4fe1695829cede773aec44299f249f0119"
Dec 16 07:10:02 crc kubenswrapper[4823]: I1216 07:10:02.920307 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0072530b-0b12-4bb7-943e-48eae2c7f6b1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0072530b-0b12-4bb7-943e-48eae2c7f6b1" (UID: "0072530b-0b12-4bb7-943e-48eae2c7f6b1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 16 07:10:02 crc kubenswrapper[4823]: I1216 07:10:02.928924 4823 scope.go:117] "RemoveContainer" containerID="982010b8dc16a609d26bf1dd138aea9ffc8b78bc2d465a95c082745265672b08"
Dec 16 07:10:02 crc kubenswrapper[4823]: I1216 07:10:02.943295 4823 scope.go:117] "RemoveContainer" containerID="19445955a19d79bd4aa355c80ad116ad4917b0604bcdaac3f552b71233212e1f"
Dec 16 07:10:02 crc kubenswrapper[4823]: E1216 07:10:02.943687 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19445955a19d79bd4aa355c80ad116ad4917b0604bcdaac3f552b71233212e1f\": container with ID starting with 19445955a19d79bd4aa355c80ad116ad4917b0604bcdaac3f552b71233212e1f not found: ID does not exist" containerID="19445955a19d79bd4aa355c80ad116ad4917b0604bcdaac3f552b71233212e1f"
Dec 16 07:10:02 crc kubenswrapper[4823]: I1216 07:10:02.943717 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19445955a19d79bd4aa355c80ad116ad4917b0604bcdaac3f552b71233212e1f"} err="failed to get container status \"19445955a19d79bd4aa355c80ad116ad4917b0604bcdaac3f552b71233212e1f\": rpc error: code = NotFound desc = could not find container \"19445955a19d79bd4aa355c80ad116ad4917b0604bcdaac3f552b71233212e1f\": container with ID starting with 19445955a19d79bd4aa355c80ad116ad4917b0604bcdaac3f552b71233212e1f not found: ID does not exist"
Dec 16 07:10:02 crc kubenswrapper[4823]: I1216 07:10:02.943739 4823 scope.go:117] "RemoveContainer" containerID="a04f9d76b9f339563b4ce37495a41a4fe1695829cede773aec44299f249f0119"
Dec 16 07:10:02 crc kubenswrapper[4823]: E1216 07:10:02.943948 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a04f9d76b9f339563b4ce37495a41a4fe1695829cede773aec44299f249f0119\": container with ID starting with a04f9d76b9f339563b4ce37495a41a4fe1695829cede773aec44299f249f0119 not found: ID does not exist" containerID="a04f9d76b9f339563b4ce37495a41a4fe1695829cede773aec44299f249f0119"
Dec 16 07:10:02 crc kubenswrapper[4823]: I1216 07:10:02.943969 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a04f9d76b9f339563b4ce37495a41a4fe1695829cede773aec44299f249f0119"} err="failed to get container status \"a04f9d76b9f339563b4ce37495a41a4fe1695829cede773aec44299f249f0119\": rpc error: code = NotFound desc = could not find container \"a04f9d76b9f339563b4ce37495a41a4fe1695829cede773aec44299f249f0119\": container with ID starting with a04f9d76b9f339563b4ce37495a41a4fe1695829cede773aec44299f249f0119 not found: ID does not exist"
Dec 16 07:10:02 crc kubenswrapper[4823]: I1216 07:10:02.943981 4823 scope.go:117] "RemoveContainer" containerID="982010b8dc16a609d26bf1dd138aea9ffc8b78bc2d465a95c082745265672b08"
Dec 16 07:10:02 crc kubenswrapper[4823]: E1216 07:10:02.944192 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"982010b8dc16a609d26bf1dd138aea9ffc8b78bc2d465a95c082745265672b08\": container with ID starting with 982010b8dc16a609d26bf1dd138aea9ffc8b78bc2d465a95c082745265672b08 not found: ID does not exist" containerID="982010b8dc16a609d26bf1dd138aea9ffc8b78bc2d465a95c082745265672b08"
Dec 16 07:10:02 crc kubenswrapper[4823]: I1216 07:10:02.944208 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"982010b8dc16a609d26bf1dd138aea9ffc8b78bc2d465a95c082745265672b08"} err="failed to get container status \"982010b8dc16a609d26bf1dd138aea9ffc8b78bc2d465a95c082745265672b08\": rpc error: code = NotFound desc = could not find container \"982010b8dc16a609d26bf1dd138aea9ffc8b78bc2d465a95c082745265672b08\": container with ID starting with 982010b8dc16a609d26bf1dd138aea9ffc8b78bc2d465a95c082745265672b08 not found: ID does not exist"
Dec 16 07:10:02 crc kubenswrapper[4823]: I1216 07:10:02.955762 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t69lv\" (UniqueName: \"kubernetes.io/projected/0072530b-0b12-4bb7-943e-48eae2c7f6b1-kube-api-access-t69lv\") on node \"crc\" DevicePath \"\""
Dec 16 07:10:02 crc kubenswrapper[4823]: I1216 07:10:02.955802 4823 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0072530b-0b12-4bb7-943e-48eae2c7f6b1-utilities\") on node \"crc\" DevicePath \"\""
Dec 16 07:10:02 crc kubenswrapper[4823]: I1216 07:10:02.955816 4823 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0072530b-0b12-4bb7-943e-48eae2c7f6b1-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 16 07:10:03 crc kubenswrapper[4823]: I1216 07:10:03.189839 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-69dj9"]
Dec 16 07:10:03 crc kubenswrapper[4823]: I1216 07:10:03.244340 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8kwpm"]
Dec 16 07:10:03 crc kubenswrapper[4823]: I1216 07:10:03.249338 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-8kwpm"]
Dec 16 07:10:03 crc kubenswrapper[4823]: E1216 07:10:03.306078 4823 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0072530b_0b12_4bb7_943e_48eae2c7f6b1.slice/crio-be9a9fb50a6ac2395cb131565bc55cd975bb8dc582a1f04992fcc404fcbcdfc1\": RecentStats: unable to find data in memory cache]"
Dec 16 07:10:03 crc kubenswrapper[4823]: I1216 07:10:03.782876 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0072530b-0b12-4bb7-943e-48eae2c7f6b1" path="/var/lib/kubelet/pods/0072530b-0b12-4bb7-943e-48eae2c7f6b1/volumes"
Dec 16 07:10:03 crc kubenswrapper[4823]: I1216 07:10:03.887963 4823 generic.go:334] "Generic (PLEG): container finished" podID="6ee8d317-17b9-40a4-981c-39fc5c2aff0e" containerID="6a450840db26581160bd5fd9ad06784fbe5e45050fe50fe1d5f772562f5e48dd" exitCode=0
Dec 16 07:10:03 crc kubenswrapper[4823]: I1216 07:10:03.888096 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-69dj9" event={"ID":"6ee8d317-17b9-40a4-981c-39fc5c2aff0e","Type":"ContainerDied","Data":"6a450840db26581160bd5fd9ad06784fbe5e45050fe50fe1d5f772562f5e48dd"}
Dec 16 07:10:03 crc kubenswrapper[4823]: I1216 07:10:03.888135 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-69dj9" event={"ID":"6ee8d317-17b9-40a4-981c-39fc5c2aff0e","Type":"ContainerStarted","Data":"cc461351d189980fb5ecb8bf36180c0a6897e5ccc3dca26e91c6ae232d8ff904"}
Dec 16 07:10:04 crc kubenswrapper[4823]: I1216 07:10:04.908890 4823 generic.go:334] "Generic (PLEG): container finished" podID="6ee8d317-17b9-40a4-981c-39fc5c2aff0e" containerID="4bc5a9554d7ceefbc60cd63c5e2f3eb6f3449dfa41cd164afc32647a8072ba7b" exitCode=0
Dec 16 07:10:04 crc kubenswrapper[4823]: I1216 07:10:04.908968 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-69dj9" event={"ID":"6ee8d317-17b9-40a4-981c-39fc5c2aff0e","Type":"ContainerDied","Data":"4bc5a9554d7ceefbc60cd63c5e2f3eb6f3449dfa41cd164afc32647a8072ba7b"}
Dec 16 07:10:05 crc kubenswrapper[4823]: I1216 07:10:05.923937 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-69dj9" event={"ID":"6ee8d317-17b9-40a4-981c-39fc5c2aff0e","Type":"ContainerStarted","Data":"58b7a6e9a0418351a37633a521ea82d3a6ff6992b0d650b39fcc6574f9eaf125"}
Dec 16 07:10:05 crc kubenswrapper[4823]: I1216 07:10:05.948411 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-69dj9" podStartSLOduration=2.385783842 podStartE2EDuration="3.948396214s" podCreationTimestamp="2025-12-16 07:10:02 +0000 UTC" firstStartedPulling="2025-12-16 07:10:03.891847892 +0000 UTC m=+882.380414055" lastFinishedPulling="2025-12-16 07:10:05.454460264 +0000 UTC m=+883.943026427" observedRunningTime="2025-12-16 07:10:05.945186992 +0000 UTC m=+884.433753115" watchObservedRunningTime="2025-12-16 07:10:05.948396214 +0000 UTC m=+884.436962337"
Dec 16 07:10:12 crc kubenswrapper[4823]: I1216 07:10:12.898471 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-69dj9"
Dec 16 07:10:12 crc kubenswrapper[4823]: I1216 07:10:12.898964 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-69dj9"
Dec 16 07:10:12 crc kubenswrapper[4823]: I1216 07:10:12.967576 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-69dj9"
Dec 16 07:10:13 crc kubenswrapper[4823]: I1216 07:10:13.010201 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-69dj9"
Dec 16 07:10:13 crc kubenswrapper[4823]: I1216 07:10:13.199199 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-69dj9"]
Dec 16 07:10:14 crc kubenswrapper[4823]: I1216 07:10:14.972171 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-69dj9" podUID="6ee8d317-17b9-40a4-981c-39fc5c2aff0e" containerName="registry-server" containerID="cri-o://58b7a6e9a0418351a37633a521ea82d3a6ff6992b0d650b39fcc6574f9eaf125" gracePeriod=2
Dec 16 07:10:15 crc kubenswrapper[4823]: I1216 07:10:15.321170 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-69dj9"
Dec 16 07:10:15 crc kubenswrapper[4823]: I1216 07:10:15.515731 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ee8d317-17b9-40a4-981c-39fc5c2aff0e-catalog-content\") pod \"6ee8d317-17b9-40a4-981c-39fc5c2aff0e\" (UID: \"6ee8d317-17b9-40a4-981c-39fc5c2aff0e\") "
Dec 16 07:10:15 crc kubenswrapper[4823]: I1216 07:10:15.515829 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c42f7\" (UniqueName: \"kubernetes.io/projected/6ee8d317-17b9-40a4-981c-39fc5c2aff0e-kube-api-access-c42f7\") pod \"6ee8d317-17b9-40a4-981c-39fc5c2aff0e\" (UID: \"6ee8d317-17b9-40a4-981c-39fc5c2aff0e\") "
Dec 16 07:10:15 crc kubenswrapper[4823]: I1216 07:10:15.515924 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ee8d317-17b9-40a4-981c-39fc5c2aff0e-utilities\") pod \"6ee8d317-17b9-40a4-981c-39fc5c2aff0e\" (UID: \"6ee8d317-17b9-40a4-981c-39fc5c2aff0e\") "
Dec 16 07:10:15 crc kubenswrapper[4823]: I1216 07:10:15.517254 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ee8d317-17b9-40a4-981c-39fc5c2aff0e-utilities" (OuterVolumeSpecName: "utilities") pod "6ee8d317-17b9-40a4-981c-39fc5c2aff0e" (UID: "6ee8d317-17b9-40a4-981c-39fc5c2aff0e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 16 07:10:15 crc kubenswrapper[4823]: I1216 07:10:15.523935 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ee8d317-17b9-40a4-981c-39fc5c2aff0e-kube-api-access-c42f7" (OuterVolumeSpecName: "kube-api-access-c42f7") pod "6ee8d317-17b9-40a4-981c-39fc5c2aff0e" (UID: "6ee8d317-17b9-40a4-981c-39fc5c2aff0e"). InnerVolumeSpecName "kube-api-access-c42f7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 07:10:15 crc kubenswrapper[4823]: I1216 07:10:15.564341 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ee8d317-17b9-40a4-981c-39fc5c2aff0e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6ee8d317-17b9-40a4-981c-39fc5c2aff0e" (UID: "6ee8d317-17b9-40a4-981c-39fc5c2aff0e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 16 07:10:15 crc kubenswrapper[4823]: I1216 07:10:15.617174 4823 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ee8d317-17b9-40a4-981c-39fc5c2aff0e-utilities\") on node \"crc\" DevicePath \"\""
Dec 16 07:10:15 crc kubenswrapper[4823]: I1216 07:10:15.617202 4823 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ee8d317-17b9-40a4-981c-39fc5c2aff0e-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 16 07:10:15 crc kubenswrapper[4823]: I1216 07:10:15.617212 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c42f7\" (UniqueName: \"kubernetes.io/projected/6ee8d317-17b9-40a4-981c-39fc5c2aff0e-kube-api-access-c42f7\") on node \"crc\" DevicePath \"\""
Dec 16 07:10:15 crc kubenswrapper[4823]: I1216 07:10:15.987569 4823 generic.go:334] "Generic (PLEG): container finished" podID="6ee8d317-17b9-40a4-981c-39fc5c2aff0e" containerID="58b7a6e9a0418351a37633a521ea82d3a6ff6992b0d650b39fcc6574f9eaf125" exitCode=0
Dec 16 07:10:15 crc kubenswrapper[4823]: I1216 07:10:15.987720 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-69dj9" event={"ID":"6ee8d317-17b9-40a4-981c-39fc5c2aff0e","Type":"ContainerDied","Data":"58b7a6e9a0418351a37633a521ea82d3a6ff6992b0d650b39fcc6574f9eaf125"}
Dec 16 07:10:15 crc kubenswrapper[4823]: I1216 07:10:15.987819 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-69dj9" event={"ID":"6ee8d317-17b9-40a4-981c-39fc5c2aff0e","Type":"ContainerDied","Data":"cc461351d189980fb5ecb8bf36180c0a6897e5ccc3dca26e91c6ae232d8ff904"}
Dec 16 07:10:15 crc kubenswrapper[4823]: I1216 07:10:15.987906 4823 scope.go:117] "RemoveContainer" containerID="58b7a6e9a0418351a37633a521ea82d3a6ff6992b0d650b39fcc6574f9eaf125"
Dec 16 07:10:15 crc kubenswrapper[4823]: I1216 07:10:15.987645 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-69dj9"
Dec 16 07:10:16 crc kubenswrapper[4823]: I1216 07:10:16.009346 4823 scope.go:117] "RemoveContainer" containerID="4bc5a9554d7ceefbc60cd63c5e2f3eb6f3449dfa41cd164afc32647a8072ba7b"
Dec 16 07:10:16 crc kubenswrapper[4823]: I1216 07:10:16.015047 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-69dj9"]
Dec 16 07:10:16 crc kubenswrapper[4823]: I1216 07:10:16.027339 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-69dj9"]
Dec 16 07:10:16 crc kubenswrapper[4823]: I1216 07:10:16.039378 4823 scope.go:117] "RemoveContainer" containerID="6a450840db26581160bd5fd9ad06784fbe5e45050fe50fe1d5f772562f5e48dd"
Dec 16 07:10:16 crc kubenswrapper[4823]: I1216 07:10:16.058836 4823 scope.go:117] "RemoveContainer" containerID="58b7a6e9a0418351a37633a521ea82d3a6ff6992b0d650b39fcc6574f9eaf125"
Dec 16 07:10:16 crc kubenswrapper[4823]: E1216 07:10:16.059501 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58b7a6e9a0418351a37633a521ea82d3a6ff6992b0d650b39fcc6574f9eaf125\": container with ID starting with 58b7a6e9a0418351a37633a521ea82d3a6ff6992b0d650b39fcc6574f9eaf125 not found: ID does not exist" containerID="58b7a6e9a0418351a37633a521ea82d3a6ff6992b0d650b39fcc6574f9eaf125"
Dec 16 07:10:16 crc kubenswrapper[4823]: I1216 07:10:16.059569 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58b7a6e9a0418351a37633a521ea82d3a6ff6992b0d650b39fcc6574f9eaf125"} err="failed to get container status \"58b7a6e9a0418351a37633a521ea82d3a6ff6992b0d650b39fcc6574f9eaf125\": rpc error: code = NotFound desc = could not find container \"58b7a6e9a0418351a37633a521ea82d3a6ff6992b0d650b39fcc6574f9eaf125\": container with ID starting with 58b7a6e9a0418351a37633a521ea82d3a6ff6992b0d650b39fcc6574f9eaf125 not found: ID does not exist"
Dec 16 07:10:16 crc kubenswrapper[4823]: I1216 07:10:16.059620 4823 scope.go:117] "RemoveContainer" containerID="4bc5a9554d7ceefbc60cd63c5e2f3eb6f3449dfa41cd164afc32647a8072ba7b"
Dec 16 07:10:16 crc kubenswrapper[4823]: E1216 07:10:16.060093 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4bc5a9554d7ceefbc60cd63c5e2f3eb6f3449dfa41cd164afc32647a8072ba7b\": container with ID starting with 4bc5a9554d7ceefbc60cd63c5e2f3eb6f3449dfa41cd164afc32647a8072ba7b not found: ID does not exist" containerID="4bc5a9554d7ceefbc60cd63c5e2f3eb6f3449dfa41cd164afc32647a8072ba7b"
Dec 16 07:10:16 crc kubenswrapper[4823]: I1216 07:10:16.060159 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4bc5a9554d7ceefbc60cd63c5e2f3eb6f3449dfa41cd164afc32647a8072ba7b"} err="failed to get container status \"4bc5a9554d7ceefbc60cd63c5e2f3eb6f3449dfa41cd164afc32647a8072ba7b\": rpc error: code = NotFound desc = could not find container \"4bc5a9554d7ceefbc60cd63c5e2f3eb6f3449dfa41cd164afc32647a8072ba7b\": container with ID starting with 4bc5a9554d7ceefbc60cd63c5e2f3eb6f3449dfa41cd164afc32647a8072ba7b not found: ID does not exist"
Dec 16 07:10:16 crc kubenswrapper[4823]: I1216 07:10:16.060205 4823 scope.go:117] "RemoveContainer" containerID="6a450840db26581160bd5fd9ad06784fbe5e45050fe50fe1d5f772562f5e48dd"
Dec 16 07:10:16 crc kubenswrapper[4823]: E1216 07:10:16.060600 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a450840db26581160bd5fd9ad06784fbe5e45050fe50fe1d5f772562f5e48dd\": container with ID starting with 6a450840db26581160bd5fd9ad06784fbe5e45050fe50fe1d5f772562f5e48dd not found: ID does not exist" containerID="6a450840db26581160bd5fd9ad06784fbe5e45050fe50fe1d5f772562f5e48dd"
Dec 16 07:10:16 crc kubenswrapper[4823]: I1216 07:10:16.060660 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a450840db26581160bd5fd9ad06784fbe5e45050fe50fe1d5f772562f5e48dd"} err="failed to get container status \"6a450840db26581160bd5fd9ad06784fbe5e45050fe50fe1d5f772562f5e48dd\": rpc error: code = NotFound desc = could not find container \"6a450840db26581160bd5fd9ad06784fbe5e45050fe50fe1d5f772562f5e48dd\": container with ID starting with 6a450840db26581160bd5fd9ad06784fbe5e45050fe50fe1d5f772562f5e48dd not found: ID does not exist"
Dec 16 07:10:17 crc kubenswrapper[4823]: I1216 07:10:17.780585 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ee8d317-17b9-40a4-981c-39fc5c2aff0e" path="/var/lib/kubelet/pods/6ee8d317-17b9-40a4-981c-39fc5c2aff0e/volumes"
Dec 16 07:10:20 crc kubenswrapper[4823]: I1216 07:10:20.508741 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-7vbh4"]
Dec 16 07:10:20 crc kubenswrapper[4823]: E1216 07:10:20.509366 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0072530b-0b12-4bb7-943e-48eae2c7f6b1" containerName="registry-server"
Dec 16 07:10:20 crc kubenswrapper[4823]: I1216 07:10:20.509387 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="0072530b-0b12-4bb7-943e-48eae2c7f6b1" containerName="registry-server"
Dec 16 07:10:20 crc kubenswrapper[4823]: E1216 07:10:20.509409 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ee8d317-17b9-40a4-981c-39fc5c2aff0e" containerName="extract-utilities"
Dec 16 07:10:20 crc kubenswrapper[4823]: I1216 07:10:20.509422 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ee8d317-17b9-40a4-981c-39fc5c2aff0e" containerName="extract-utilities"
Dec 16 07:10:20 crc kubenswrapper[4823]: E1216 07:10:20.509443 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ee8d317-17b9-40a4-981c-39fc5c2aff0e" containerName="registry-server"
Dec 16 07:10:20 crc kubenswrapper[4823]: I1216 07:10:20.509458 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ee8d317-17b9-40a4-981c-39fc5c2aff0e" containerName="registry-server"
Dec 16 07:10:20 crc kubenswrapper[4823]: E1216 07:10:20.509478 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ee8d317-17b9-40a4-981c-39fc5c2aff0e" containerName="extract-content"
Dec 16 07:10:20 crc kubenswrapper[4823]: I1216 07:10:20.509492 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ee8d317-17b9-40a4-981c-39fc5c2aff0e" containerName="extract-content"
Dec 16 07:10:20 crc kubenswrapper[4823]: E1216 07:10:20.509509 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0072530b-0b12-4bb7-943e-48eae2c7f6b1" containerName="extract-content"
Dec 16 07:10:20 crc kubenswrapper[4823]: I1216 07:10:20.509521 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="0072530b-0b12-4bb7-943e-48eae2c7f6b1" containerName="extract-content"
Dec 16 07:10:20 crc kubenswrapper[4823]: E1216 07:10:20.509542 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0072530b-0b12-4bb7-943e-48eae2c7f6b1" containerName="extract-utilities"
Dec 16 07:10:20 crc kubenswrapper[4823]: I1216 07:10:20.509555 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="0072530b-0b12-4bb7-943e-48eae2c7f6b1" containerName="extract-utilities"
Dec 16 07:10:20 crc kubenswrapper[4823]: I1216 07:10:20.509761 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ee8d317-17b9-40a4-981c-39fc5c2aff0e" containerName="registry-server"
Dec 16 07:10:20 crc kubenswrapper[4823]: I1216 07:10:20.509791 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="0072530b-0b12-4bb7-943e-48eae2c7f6b1" containerName="registry-server"
Dec 16 07:10:20 crc kubenswrapper[4823]: I1216 07:10:20.510453 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-7vbh4"
Dec 16 07:10:20 crc kubenswrapper[4823]: I1216 07:10:20.513777 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage"
Dec 16 07:10:20 crc kubenswrapper[4823]: I1216 07:10:20.514000 4823 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-7fjh7"
Dec 16 07:10:20 crc kubenswrapper[4823]: I1216 07:10:20.516493 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt"
Dec 16 07:10:20 crc kubenswrapper[4823]: I1216 07:10:20.517619 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-7vbh4"]
Dec 16 07:10:20 crc kubenswrapper[4823]: I1216 07:10:20.517657 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt"
Dec 16 07:10:20 crc kubenswrapper[4823]: I1216 07:10:20.587787 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/55889c97-d986-4a35-bbcf-af45ac5f9fe8-node-mnt\") pod \"crc-storage-crc-7vbh4\" (UID: \"55889c97-d986-4a35-bbcf-af45ac5f9fe8\") " pod="crc-storage/crc-storage-crc-7vbh4"
Dec 16 07:10:20 crc kubenswrapper[4823]: I1216 07:10:20.587950 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/55889c97-d986-4a35-bbcf-af45ac5f9fe8-crc-storage\") pod \"crc-storage-crc-7vbh4\" (UID: \"55889c97-d986-4a35-bbcf-af45ac5f9fe8\") " pod="crc-storage/crc-storage-crc-7vbh4"
Dec 16 07:10:20 crc kubenswrapper[4823]: I1216 07:10:20.588007 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlcs4\" (UniqueName: \"kubernetes.io/projected/55889c97-d986-4a35-bbcf-af45ac5f9fe8-kube-api-access-mlcs4\") pod \"crc-storage-crc-7vbh4\" (UID:
\"55889c97-d986-4a35-bbcf-af45ac5f9fe8\") " pod="crc-storage/crc-storage-crc-7vbh4" Dec 16 07:10:20 crc kubenswrapper[4823]: I1216 07:10:20.655552 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-zwjhk"] Dec 16 07:10:20 crc kubenswrapper[4823]: I1216 07:10:20.655900 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" podUID="08e48f89-7095-4ea2-afb5-759591c2b0d4" containerName="ovn-controller" containerID="cri-o://304f9e1fee6b648f61b3b334284a954e8107ff70ca71d01b586d137490eeab20" gracePeriod=30 Dec 16 07:10:20 crc kubenswrapper[4823]: I1216 07:10:20.655986 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" podUID="08e48f89-7095-4ea2-afb5-759591c2b0d4" containerName="nbdb" containerID="cri-o://0f2cea674c0936cca4e9e95dfb7cd15bf9ccb3fbf8b711f351f87b632aad65c6" gracePeriod=30 Dec 16 07:10:20 crc kubenswrapper[4823]: I1216 07:10:20.656110 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" podUID="08e48f89-7095-4ea2-afb5-759591c2b0d4" containerName="northd" containerID="cri-o://055585c1217ea720b929c9da82e7ea662d6fd5634244d362b3b425faee380d92" gracePeriod=30 Dec 16 07:10:20 crc kubenswrapper[4823]: I1216 07:10:20.656099 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" podUID="08e48f89-7095-4ea2-afb5-759591c2b0d4" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://b768d92db57918a41c53b83940957c0181865808eb2c795231a5f8dbbf2bde82" gracePeriod=30 Dec 16 07:10:20 crc kubenswrapper[4823]: I1216 07:10:20.656170 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" podUID="08e48f89-7095-4ea2-afb5-759591c2b0d4" containerName="sbdb" 
containerID="cri-o://c223f5207e2445fd4bfd1bfe2c1c1ca50b5211a92b02c21445534fbfa2709e28" gracePeriod=30 Dec 16 07:10:20 crc kubenswrapper[4823]: I1216 07:10:20.656039 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" podUID="08e48f89-7095-4ea2-afb5-759591c2b0d4" containerName="ovn-acl-logging" containerID="cri-o://c1c6cb023dcc6d1b149de201b108e3b1f3e32530dadc822aedede5552efc586b" gracePeriod=30 Dec 16 07:10:20 crc kubenswrapper[4823]: I1216 07:10:20.656010 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" podUID="08e48f89-7095-4ea2-afb5-759591c2b0d4" containerName="kube-rbac-proxy-node" containerID="cri-o://b3d0f51f7ac1df7a9ae61306ccbfe2c93b397b6ab9ae0a477ab4eb46248d36f1" gracePeriod=30 Dec 16 07:10:20 crc kubenswrapper[4823]: I1216 07:10:20.689561 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/55889c97-d986-4a35-bbcf-af45ac5f9fe8-node-mnt\") pod \"crc-storage-crc-7vbh4\" (UID: \"55889c97-d986-4a35-bbcf-af45ac5f9fe8\") " pod="crc-storage/crc-storage-crc-7vbh4" Dec 16 07:10:20 crc kubenswrapper[4823]: I1216 07:10:20.689613 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/55889c97-d986-4a35-bbcf-af45ac5f9fe8-crc-storage\") pod \"crc-storage-crc-7vbh4\" (UID: \"55889c97-d986-4a35-bbcf-af45ac5f9fe8\") " pod="crc-storage/crc-storage-crc-7vbh4" Dec 16 07:10:20 crc kubenswrapper[4823]: I1216 07:10:20.689646 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mlcs4\" (UniqueName: \"kubernetes.io/projected/55889c97-d986-4a35-bbcf-af45ac5f9fe8-kube-api-access-mlcs4\") pod \"crc-storage-crc-7vbh4\" (UID: \"55889c97-d986-4a35-bbcf-af45ac5f9fe8\") " pod="crc-storage/crc-storage-crc-7vbh4" Dec 16 07:10:20 crc 
kubenswrapper[4823]: I1216 07:10:20.689988 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/55889c97-d986-4a35-bbcf-af45ac5f9fe8-node-mnt\") pod \"crc-storage-crc-7vbh4\" (UID: \"55889c97-d986-4a35-bbcf-af45ac5f9fe8\") " pod="crc-storage/crc-storage-crc-7vbh4" Dec 16 07:10:20 crc kubenswrapper[4823]: I1216 07:10:20.690778 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/55889c97-d986-4a35-bbcf-af45ac5f9fe8-crc-storage\") pod \"crc-storage-crc-7vbh4\" (UID: \"55889c97-d986-4a35-bbcf-af45ac5f9fe8\") " pod="crc-storage/crc-storage-crc-7vbh4" Dec 16 07:10:20 crc kubenswrapper[4823]: I1216 07:10:20.692752 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" podUID="08e48f89-7095-4ea2-afb5-759591c2b0d4" containerName="ovnkube-controller" containerID="cri-o://cc8a43f71797e49e9a777ee909b45ae50a30e76da5c4cd4c8ee62cd48a7917ee" gracePeriod=30 Dec 16 07:10:20 crc kubenswrapper[4823]: I1216 07:10:20.722160 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlcs4\" (UniqueName: \"kubernetes.io/projected/55889c97-d986-4a35-bbcf-af45ac5f9fe8-kube-api-access-mlcs4\") pod \"crc-storage-crc-7vbh4\" (UID: \"55889c97-d986-4a35-bbcf-af45ac5f9fe8\") " pod="crc-storage/crc-storage-crc-7vbh4" Dec 16 07:10:20 crc kubenswrapper[4823]: I1216 07:10:20.864323 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-7vbh4" Dec 16 07:10:20 crc kubenswrapper[4823]: E1216 07:10:20.893131 4823 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-7vbh4_crc-storage_55889c97-d986-4a35-bbcf-af45ac5f9fe8_0(d4a43ed653eaa9d8c2fc5af6b68bd1cdf91612879afd64dde486d00309281911): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 16 07:10:20 crc kubenswrapper[4823]: E1216 07:10:20.893189 4823 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-7vbh4_crc-storage_55889c97-d986-4a35-bbcf-af45ac5f9fe8_0(d4a43ed653eaa9d8c2fc5af6b68bd1cdf91612879afd64dde486d00309281911): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-7vbh4" Dec 16 07:10:20 crc kubenswrapper[4823]: E1216 07:10:20.893209 4823 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-7vbh4_crc-storage_55889c97-d986-4a35-bbcf-af45ac5f9fe8_0(d4a43ed653eaa9d8c2fc5af6b68bd1cdf91612879afd64dde486d00309281911): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="crc-storage/crc-storage-crc-7vbh4" Dec 16 07:10:20 crc kubenswrapper[4823]: E1216 07:10:20.893255 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-7vbh4_crc-storage(55889c97-d986-4a35-bbcf-af45ac5f9fe8)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-7vbh4_crc-storage(55889c97-d986-4a35-bbcf-af45ac5f9fe8)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-7vbh4_crc-storage_55889c97-d986-4a35-bbcf-af45ac5f9fe8_0(d4a43ed653eaa9d8c2fc5af6b68bd1cdf91612879afd64dde486d00309281911): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="crc-storage/crc-storage-crc-7vbh4" podUID="55889c97-d986-4a35-bbcf-af45ac5f9fe8" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.010424 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zwjhk_08e48f89-7095-4ea2-afb5-759591c2b0d4/ovnkube-controller/3.log" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.012842 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zwjhk_08e48f89-7095-4ea2-afb5-759591c2b0d4/ovn-acl-logging/0.log" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.013308 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zwjhk_08e48f89-7095-4ea2-afb5-759591c2b0d4/ovn-controller/0.log" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.013957 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.016210 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-n248g_1b377757-dbc6-4d9c-9656-3ff65d7d113a/kube-multus/2.log" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.017450 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-n248g_1b377757-dbc6-4d9c-9656-3ff65d7d113a/kube-multus/1.log" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.017485 4823 generic.go:334] "Generic (PLEG): container finished" podID="1b377757-dbc6-4d9c-9656-3ff65d7d113a" containerID="90088e0c301e42cba0bff78d22324f5a77b817c3f63e352985dd26abb4706970" exitCode=2 Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.017539 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-n248g" event={"ID":"1b377757-dbc6-4d9c-9656-3ff65d7d113a","Type":"ContainerDied","Data":"90088e0c301e42cba0bff78d22324f5a77b817c3f63e352985dd26abb4706970"} Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.017575 4823 scope.go:117] "RemoveContainer" containerID="93cfb9ff0c194231a3f99afaf3fb482684347346a20315de6c6513dc5dde8966" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.017998 4823 scope.go:117] "RemoveContainer" containerID="90088e0c301e42cba0bff78d22324f5a77b817c3f63e352985dd26abb4706970" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.026059 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zwjhk_08e48f89-7095-4ea2-afb5-759591c2b0d4/ovnkube-controller/3.log" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.028360 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zwjhk_08e48f89-7095-4ea2-afb5-759591c2b0d4/ovn-acl-logging/0.log" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.028999 4823 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zwjhk_08e48f89-7095-4ea2-afb5-759591c2b0d4/ovn-controller/0.log" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.029552 4823 generic.go:334] "Generic (PLEG): container finished" podID="08e48f89-7095-4ea2-afb5-759591c2b0d4" containerID="cc8a43f71797e49e9a777ee909b45ae50a30e76da5c4cd4c8ee62cd48a7917ee" exitCode=0 Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.029594 4823 generic.go:334] "Generic (PLEG): container finished" podID="08e48f89-7095-4ea2-afb5-759591c2b0d4" containerID="c223f5207e2445fd4bfd1bfe2c1c1ca50b5211a92b02c21445534fbfa2709e28" exitCode=0 Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.029603 4823 generic.go:334] "Generic (PLEG): container finished" podID="08e48f89-7095-4ea2-afb5-759591c2b0d4" containerID="0f2cea674c0936cca4e9e95dfb7cd15bf9ccb3fbf8b711f351f87b632aad65c6" exitCode=0 Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.029611 4823 generic.go:334] "Generic (PLEG): container finished" podID="08e48f89-7095-4ea2-afb5-759591c2b0d4" containerID="055585c1217ea720b929c9da82e7ea662d6fd5634244d362b3b425faee380d92" exitCode=0 Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.029617 4823 generic.go:334] "Generic (PLEG): container finished" podID="08e48f89-7095-4ea2-afb5-759591c2b0d4" containerID="b768d92db57918a41c53b83940957c0181865808eb2c795231a5f8dbbf2bde82" exitCode=0 Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.029627 4823 generic.go:334] "Generic (PLEG): container finished" podID="08e48f89-7095-4ea2-afb5-759591c2b0d4" containerID="b3d0f51f7ac1df7a9ae61306ccbfe2c93b397b6ab9ae0a477ab4eb46248d36f1" exitCode=0 Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.029633 4823 generic.go:334] "Generic (PLEG): container finished" podID="08e48f89-7095-4ea2-afb5-759591c2b0d4" containerID="c1c6cb023dcc6d1b149de201b108e3b1f3e32530dadc822aedede5552efc586b" exitCode=143 Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.029640 4823 
generic.go:334] "Generic (PLEG): container finished" podID="08e48f89-7095-4ea2-afb5-759591c2b0d4" containerID="304f9e1fee6b648f61b3b334284a954e8107ff70ca71d01b586d137490eeab20" exitCode=143 Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.029648 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.029649 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" event={"ID":"08e48f89-7095-4ea2-afb5-759591c2b0d4","Type":"ContainerDied","Data":"cc8a43f71797e49e9a777ee909b45ae50a30e76da5c4cd4c8ee62cd48a7917ee"} Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.029699 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-7vbh4" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.029703 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" event={"ID":"08e48f89-7095-4ea2-afb5-759591c2b0d4","Type":"ContainerDied","Data":"c223f5207e2445fd4bfd1bfe2c1c1ca50b5211a92b02c21445534fbfa2709e28"} Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.029718 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" event={"ID":"08e48f89-7095-4ea2-afb5-759591c2b0d4","Type":"ContainerDied","Data":"0f2cea674c0936cca4e9e95dfb7cd15bf9ccb3fbf8b711f351f87b632aad65c6"} Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.029734 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" event={"ID":"08e48f89-7095-4ea2-afb5-759591c2b0d4","Type":"ContainerDied","Data":"055585c1217ea720b929c9da82e7ea662d6fd5634244d362b3b425faee380d92"} Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.029743 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" 
event={"ID":"08e48f89-7095-4ea2-afb5-759591c2b0d4","Type":"ContainerDied","Data":"b768d92db57918a41c53b83940957c0181865808eb2c795231a5f8dbbf2bde82"} Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.029753 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" event={"ID":"08e48f89-7095-4ea2-afb5-759591c2b0d4","Type":"ContainerDied","Data":"b3d0f51f7ac1df7a9ae61306ccbfe2c93b397b6ab9ae0a477ab4eb46248d36f1"} Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.029764 4823 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cc8a43f71797e49e9a777ee909b45ae50a30e76da5c4cd4c8ee62cd48a7917ee"} Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.029775 4823 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0002b85e32dbacab67cb93ff21bf26130cbf98d65e88b13cfc00dc301fd4a3ec"} Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.029780 4823 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c223f5207e2445fd4bfd1bfe2c1c1ca50b5211a92b02c21445534fbfa2709e28"} Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.029786 4823 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0f2cea674c0936cca4e9e95dfb7cd15bf9ccb3fbf8b711f351f87b632aad65c6"} Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.029793 4823 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"055585c1217ea720b929c9da82e7ea662d6fd5634244d362b3b425faee380d92"} Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.029798 4823 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b768d92db57918a41c53b83940957c0181865808eb2c795231a5f8dbbf2bde82"} Dec 16 07:10:21 crc 
kubenswrapper[4823]: I1216 07:10:21.029803 4823 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b3d0f51f7ac1df7a9ae61306ccbfe2c93b397b6ab9ae0a477ab4eb46248d36f1"} Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.029807 4823 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c1c6cb023dcc6d1b149de201b108e3b1f3e32530dadc822aedede5552efc586b"} Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.029812 4823 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"304f9e1fee6b648f61b3b334284a954e8107ff70ca71d01b586d137490eeab20"} Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.029817 4823 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1e3566766c9c24f5c12481b74ed8a3938b21bdb36c003146515c67170d74a71f"} Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.029824 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" event={"ID":"08e48f89-7095-4ea2-afb5-759591c2b0d4","Type":"ContainerDied","Data":"c1c6cb023dcc6d1b149de201b108e3b1f3e32530dadc822aedede5552efc586b"} Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.029832 4823 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cc8a43f71797e49e9a777ee909b45ae50a30e76da5c4cd4c8ee62cd48a7917ee"} Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.029838 4823 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0002b85e32dbacab67cb93ff21bf26130cbf98d65e88b13cfc00dc301fd4a3ec"} Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.029843 4823 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"c223f5207e2445fd4bfd1bfe2c1c1ca50b5211a92b02c21445534fbfa2709e28"} Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.029848 4823 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0f2cea674c0936cca4e9e95dfb7cd15bf9ccb3fbf8b711f351f87b632aad65c6"} Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.029853 4823 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"055585c1217ea720b929c9da82e7ea662d6fd5634244d362b3b425faee380d92"} Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.029858 4823 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b768d92db57918a41c53b83940957c0181865808eb2c795231a5f8dbbf2bde82"} Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.029863 4823 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b3d0f51f7ac1df7a9ae61306ccbfe2c93b397b6ab9ae0a477ab4eb46248d36f1"} Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.029869 4823 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c1c6cb023dcc6d1b149de201b108e3b1f3e32530dadc822aedede5552efc586b"} Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.029874 4823 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"304f9e1fee6b648f61b3b334284a954e8107ff70ca71d01b586d137490eeab20"} Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.029879 4823 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1e3566766c9c24f5c12481b74ed8a3938b21bdb36c003146515c67170d74a71f"} Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.029887 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" event={"ID":"08e48f89-7095-4ea2-afb5-759591c2b0d4","Type":"ContainerDied","Data":"304f9e1fee6b648f61b3b334284a954e8107ff70ca71d01b586d137490eeab20"} Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.029897 4823 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cc8a43f71797e49e9a777ee909b45ae50a30e76da5c4cd4c8ee62cd48a7917ee"} Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.029906 4823 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0002b85e32dbacab67cb93ff21bf26130cbf98d65e88b13cfc00dc301fd4a3ec"} Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.029914 4823 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c223f5207e2445fd4bfd1bfe2c1c1ca50b5211a92b02c21445534fbfa2709e28"} Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.029921 4823 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0f2cea674c0936cca4e9e95dfb7cd15bf9ccb3fbf8b711f351f87b632aad65c6"} Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.029928 4823 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"055585c1217ea720b929c9da82e7ea662d6fd5634244d362b3b425faee380d92"} Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.029934 4823 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b768d92db57918a41c53b83940957c0181865808eb2c795231a5f8dbbf2bde82"} Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.029942 4823 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b3d0f51f7ac1df7a9ae61306ccbfe2c93b397b6ab9ae0a477ab4eb46248d36f1"} Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 
07:10:21.029949 4823 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c1c6cb023dcc6d1b149de201b108e3b1f3e32530dadc822aedede5552efc586b"} Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.029956 4823 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"304f9e1fee6b648f61b3b334284a954e8107ff70ca71d01b586d137490eeab20"} Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.029963 4823 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1e3566766c9c24f5c12481b74ed8a3938b21bdb36c003146515c67170d74a71f"} Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.029971 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwjhk" event={"ID":"08e48f89-7095-4ea2-afb5-759591c2b0d4","Type":"ContainerDied","Data":"bc79567ff36b0deb62e06420030baecbdaf9941bbd19c29426be5e0056970c21"} Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.029980 4823 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cc8a43f71797e49e9a777ee909b45ae50a30e76da5c4cd4c8ee62cd48a7917ee"} Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.029987 4823 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0002b85e32dbacab67cb93ff21bf26130cbf98d65e88b13cfc00dc301fd4a3ec"} Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.029993 4823 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c223f5207e2445fd4bfd1bfe2c1c1ca50b5211a92b02c21445534fbfa2709e28"} Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.029999 4823 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"0f2cea674c0936cca4e9e95dfb7cd15bf9ccb3fbf8b711f351f87b632aad65c6"} Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.030003 4823 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"055585c1217ea720b929c9da82e7ea662d6fd5634244d362b3b425faee380d92"} Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.030008 4823 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b768d92db57918a41c53b83940957c0181865808eb2c795231a5f8dbbf2bde82"} Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.030013 4823 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b3d0f51f7ac1df7a9ae61306ccbfe2c93b397b6ab9ae0a477ab4eb46248d36f1"} Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.030018 4823 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c1c6cb023dcc6d1b149de201b108e3b1f3e32530dadc822aedede5552efc586b"} Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.030038 4823 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"304f9e1fee6b648f61b3b334284a954e8107ff70ca71d01b586d137490eeab20"} Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.030043 4823 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1e3566766c9c24f5c12481b74ed8a3938b21bdb36c003146515c67170d74a71f"} Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.030542 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-7vbh4" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.062308 4823 scope.go:117] "RemoveContainer" containerID="cc8a43f71797e49e9a777ee909b45ae50a30e76da5c4cd4c8ee62cd48a7917ee" Dec 16 07:10:21 crc kubenswrapper[4823]: E1216 07:10:21.078168 4823 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-7vbh4_crc-storage_55889c97-d986-4a35-bbcf-af45ac5f9fe8_0(122696c90c6d02b91f551007232291316b26a01b2db6c8ce2dff0e517a8e3e43): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 16 07:10:21 crc kubenswrapper[4823]: E1216 07:10:21.078227 4823 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-7vbh4_crc-storage_55889c97-d986-4a35-bbcf-af45ac5f9fe8_0(122696c90c6d02b91f551007232291316b26a01b2db6c8ce2dff0e517a8e3e43): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-7vbh4" Dec 16 07:10:21 crc kubenswrapper[4823]: E1216 07:10:21.078247 4823 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-7vbh4_crc-storage_55889c97-d986-4a35-bbcf-af45ac5f9fe8_0(122696c90c6d02b91f551007232291316b26a01b2db6c8ce2dff0e517a8e3e43): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="crc-storage/crc-storage-crc-7vbh4" Dec 16 07:10:21 crc kubenswrapper[4823]: E1216 07:10:21.078287 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-7vbh4_crc-storage(55889c97-d986-4a35-bbcf-af45ac5f9fe8)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-7vbh4_crc-storage(55889c97-d986-4a35-bbcf-af45ac5f9fe8)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-7vbh4_crc-storage_55889c97-d986-4a35-bbcf-af45ac5f9fe8_0(122696c90c6d02b91f551007232291316b26a01b2db6c8ce2dff0e517a8e3e43): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="crc-storage/crc-storage-crc-7vbh4" podUID="55889c97-d986-4a35-bbcf-af45ac5f9fe8" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.080729 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-whkwc"] Dec 16 07:10:21 crc kubenswrapper[4823]: E1216 07:10:21.080952 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08e48f89-7095-4ea2-afb5-759591c2b0d4" containerName="ovn-controller" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.080965 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="08e48f89-7095-4ea2-afb5-759591c2b0d4" containerName="ovn-controller" Dec 16 07:10:21 crc kubenswrapper[4823]: E1216 07:10:21.080977 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08e48f89-7095-4ea2-afb5-759591c2b0d4" containerName="northd" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.080984 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="08e48f89-7095-4ea2-afb5-759591c2b0d4" containerName="northd" Dec 16 07:10:21 crc kubenswrapper[4823]: E1216 07:10:21.080992 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08e48f89-7095-4ea2-afb5-759591c2b0d4" containerName="nbdb" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 
07:10:21.080998 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="08e48f89-7095-4ea2-afb5-759591c2b0d4" containerName="nbdb" Dec 16 07:10:21 crc kubenswrapper[4823]: E1216 07:10:21.081006 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08e48f89-7095-4ea2-afb5-759591c2b0d4" containerName="ovnkube-controller" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.081012 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="08e48f89-7095-4ea2-afb5-759591c2b0d4" containerName="ovnkube-controller" Dec 16 07:10:21 crc kubenswrapper[4823]: E1216 07:10:21.081061 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08e48f89-7095-4ea2-afb5-759591c2b0d4" containerName="ovnkube-controller" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.081073 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="08e48f89-7095-4ea2-afb5-759591c2b0d4" containerName="ovnkube-controller" Dec 16 07:10:21 crc kubenswrapper[4823]: E1216 07:10:21.081082 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08e48f89-7095-4ea2-afb5-759591c2b0d4" containerName="ovnkube-controller" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.081089 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="08e48f89-7095-4ea2-afb5-759591c2b0d4" containerName="ovnkube-controller" Dec 16 07:10:21 crc kubenswrapper[4823]: E1216 07:10:21.081098 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08e48f89-7095-4ea2-afb5-759591c2b0d4" containerName="ovn-acl-logging" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.081104 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="08e48f89-7095-4ea2-afb5-759591c2b0d4" containerName="ovn-acl-logging" Dec 16 07:10:21 crc kubenswrapper[4823]: E1216 07:10:21.081112 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08e48f89-7095-4ea2-afb5-759591c2b0d4" containerName="sbdb" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.081118 
4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="08e48f89-7095-4ea2-afb5-759591c2b0d4" containerName="sbdb" Dec 16 07:10:21 crc kubenswrapper[4823]: E1216 07:10:21.081138 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08e48f89-7095-4ea2-afb5-759591c2b0d4" containerName="kube-rbac-proxy-node" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.081144 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="08e48f89-7095-4ea2-afb5-759591c2b0d4" containerName="kube-rbac-proxy-node" Dec 16 07:10:21 crc kubenswrapper[4823]: E1216 07:10:21.081156 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08e48f89-7095-4ea2-afb5-759591c2b0d4" containerName="kubecfg-setup" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.081162 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="08e48f89-7095-4ea2-afb5-759591c2b0d4" containerName="kubecfg-setup" Dec 16 07:10:21 crc kubenswrapper[4823]: E1216 07:10:21.081170 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08e48f89-7095-4ea2-afb5-759591c2b0d4" containerName="kube-rbac-proxy-ovn-metrics" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.081176 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="08e48f89-7095-4ea2-afb5-759591c2b0d4" containerName="kube-rbac-proxy-ovn-metrics" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.081283 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="08e48f89-7095-4ea2-afb5-759591c2b0d4" containerName="kube-rbac-proxy-ovn-metrics" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.081297 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="08e48f89-7095-4ea2-afb5-759591c2b0d4" containerName="northd" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.081310 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="08e48f89-7095-4ea2-afb5-759591c2b0d4" containerName="ovnkube-controller" Dec 16 07:10:21 crc kubenswrapper[4823]: 
I1216 07:10:21.081315 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="08e48f89-7095-4ea2-afb5-759591c2b0d4" containerName="ovnkube-controller" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.081321 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="08e48f89-7095-4ea2-afb5-759591c2b0d4" containerName="ovn-controller" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.081328 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="08e48f89-7095-4ea2-afb5-759591c2b0d4" containerName="ovnkube-controller" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.081335 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="08e48f89-7095-4ea2-afb5-759591c2b0d4" containerName="nbdb" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.081355 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="08e48f89-7095-4ea2-afb5-759591c2b0d4" containerName="ovn-acl-logging" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.081362 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="08e48f89-7095-4ea2-afb5-759591c2b0d4" containerName="sbdb" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.081369 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="08e48f89-7095-4ea2-afb5-759591c2b0d4" containerName="kube-rbac-proxy-node" Dec 16 07:10:21 crc kubenswrapper[4823]: E1216 07:10:21.081464 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08e48f89-7095-4ea2-afb5-759591c2b0d4" containerName="ovnkube-controller" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.081486 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="08e48f89-7095-4ea2-afb5-759591c2b0d4" containerName="ovnkube-controller" Dec 16 07:10:21 crc kubenswrapper[4823]: E1216 07:10:21.081494 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08e48f89-7095-4ea2-afb5-759591c2b0d4" containerName="ovnkube-controller" Dec 16 07:10:21 crc 
kubenswrapper[4823]: I1216 07:10:21.081500 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="08e48f89-7095-4ea2-afb5-759591c2b0d4" containerName="ovnkube-controller" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.081600 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="08e48f89-7095-4ea2-afb5-759591c2b0d4" containerName="ovnkube-controller" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.081608 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="08e48f89-7095-4ea2-afb5-759591c2b0d4" containerName="ovnkube-controller" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.083536 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-whkwc" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.093213 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/08e48f89-7095-4ea2-afb5-759591c2b0d4-host-var-lib-cni-networks-ovn-kubernetes\") pod \"08e48f89-7095-4ea2-afb5-759591c2b0d4\" (UID: \"08e48f89-7095-4ea2-afb5-759591c2b0d4\") " Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.093257 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/08e48f89-7095-4ea2-afb5-759591c2b0d4-etc-openvswitch\") pod \"08e48f89-7095-4ea2-afb5-759591c2b0d4\" (UID: \"08e48f89-7095-4ea2-afb5-759591c2b0d4\") " Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.093288 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/08e48f89-7095-4ea2-afb5-759591c2b0d4-run-systemd\") pod \"08e48f89-7095-4ea2-afb5-759591c2b0d4\" (UID: \"08e48f89-7095-4ea2-afb5-759591c2b0d4\") " Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.093307 4823 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/08e48f89-7095-4ea2-afb5-759591c2b0d4-host-cni-netd\") pod \"08e48f89-7095-4ea2-afb5-759591c2b0d4\" (UID: \"08e48f89-7095-4ea2-afb5-759591c2b0d4\") " Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.093327 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/08e48f89-7095-4ea2-afb5-759591c2b0d4-run-openvswitch\") pod \"08e48f89-7095-4ea2-afb5-759591c2b0d4\" (UID: \"08e48f89-7095-4ea2-afb5-759591c2b0d4\") " Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.093355 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/08e48f89-7095-4ea2-afb5-759591c2b0d4-host-slash\") pod \"08e48f89-7095-4ea2-afb5-759591c2b0d4\" (UID: \"08e48f89-7095-4ea2-afb5-759591c2b0d4\") " Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.093381 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/08e48f89-7095-4ea2-afb5-759591c2b0d4-env-overrides\") pod \"08e48f89-7095-4ea2-afb5-759591c2b0d4\" (UID: \"08e48f89-7095-4ea2-afb5-759591c2b0d4\") " Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.093396 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/08e48f89-7095-4ea2-afb5-759591c2b0d4-host-run-netns\") pod \"08e48f89-7095-4ea2-afb5-759591c2b0d4\" (UID: \"08e48f89-7095-4ea2-afb5-759591c2b0d4\") " Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.093426 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/08e48f89-7095-4ea2-afb5-759591c2b0d4-ovnkube-config\") pod \"08e48f89-7095-4ea2-afb5-759591c2b0d4\" (UID: 
\"08e48f89-7095-4ea2-afb5-759591c2b0d4\") " Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.093450 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/08e48f89-7095-4ea2-afb5-759591c2b0d4-ovnkube-script-lib\") pod \"08e48f89-7095-4ea2-afb5-759591c2b0d4\" (UID: \"08e48f89-7095-4ea2-afb5-759591c2b0d4\") " Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.093483 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/08e48f89-7095-4ea2-afb5-759591c2b0d4-systemd-units\") pod \"08e48f89-7095-4ea2-afb5-759591c2b0d4\" (UID: \"08e48f89-7095-4ea2-afb5-759591c2b0d4\") " Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.093500 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/08e48f89-7095-4ea2-afb5-759591c2b0d4-host-run-ovn-kubernetes\") pod \"08e48f89-7095-4ea2-afb5-759591c2b0d4\" (UID: \"08e48f89-7095-4ea2-afb5-759591c2b0d4\") " Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.093517 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/08e48f89-7095-4ea2-afb5-759591c2b0d4-run-ovn\") pod \"08e48f89-7095-4ea2-afb5-759591c2b0d4\" (UID: \"08e48f89-7095-4ea2-afb5-759591c2b0d4\") " Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.093531 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/08e48f89-7095-4ea2-afb5-759591c2b0d4-node-log\") pod \"08e48f89-7095-4ea2-afb5-759591c2b0d4\" (UID: \"08e48f89-7095-4ea2-afb5-759591c2b0d4\") " Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.093544 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/08e48f89-7095-4ea2-afb5-759591c2b0d4-log-socket\") pod \"08e48f89-7095-4ea2-afb5-759591c2b0d4\" (UID: \"08e48f89-7095-4ea2-afb5-759591c2b0d4\") " Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.093557 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/08e48f89-7095-4ea2-afb5-759591c2b0d4-host-cni-bin\") pod \"08e48f89-7095-4ea2-afb5-759591c2b0d4\" (UID: \"08e48f89-7095-4ea2-afb5-759591c2b0d4\") " Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.093580 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/08e48f89-7095-4ea2-afb5-759591c2b0d4-var-lib-openvswitch\") pod \"08e48f89-7095-4ea2-afb5-759591c2b0d4\" (UID: \"08e48f89-7095-4ea2-afb5-759591c2b0d4\") " Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.093607 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qzgs7\" (UniqueName: \"kubernetes.io/projected/08e48f89-7095-4ea2-afb5-759591c2b0d4-kube-api-access-qzgs7\") pod \"08e48f89-7095-4ea2-afb5-759591c2b0d4\" (UID: \"08e48f89-7095-4ea2-afb5-759591c2b0d4\") " Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.093721 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/08e48f89-7095-4ea2-afb5-759591c2b0d4-host-kubelet\") pod \"08e48f89-7095-4ea2-afb5-759591c2b0d4\" (UID: \"08e48f89-7095-4ea2-afb5-759591c2b0d4\") " Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.093817 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/08e48f89-7095-4ea2-afb5-759591c2b0d4-ovn-node-metrics-cert\") pod \"08e48f89-7095-4ea2-afb5-759591c2b0d4\" (UID: \"08e48f89-7095-4ea2-afb5-759591c2b0d4\") " Dec 16 07:10:21 crc 
kubenswrapper[4823]: I1216 07:10:21.093957 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/61bfc4d2-409e-41f4-a92d-cce68ecfa1a6-host-run-netns\") pod \"ovnkube-node-whkwc\" (UID: \"61bfc4d2-409e-41f4-a92d-cce68ecfa1a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-whkwc" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.093982 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/61bfc4d2-409e-41f4-a92d-cce68ecfa1a6-var-lib-openvswitch\") pod \"ovnkube-node-whkwc\" (UID: \"61bfc4d2-409e-41f4-a92d-cce68ecfa1a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-whkwc" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.094002 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/61bfc4d2-409e-41f4-a92d-cce68ecfa1a6-host-slash\") pod \"ovnkube-node-whkwc\" (UID: \"61bfc4d2-409e-41f4-a92d-cce68ecfa1a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-whkwc" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.094040 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/61bfc4d2-409e-41f4-a92d-cce68ecfa1a6-node-log\") pod \"ovnkube-node-whkwc\" (UID: \"61bfc4d2-409e-41f4-a92d-cce68ecfa1a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-whkwc" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.094045 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/08e48f89-7095-4ea2-afb5-759591c2b0d4-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "08e48f89-7095-4ea2-afb5-759591c2b0d4" (UID: "08e48f89-7095-4ea2-afb5-759591c2b0d4"). InnerVolumeSpecName "host-cni-bin". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.094100 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/08e48f89-7095-4ea2-afb5-759591c2b0d4-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "08e48f89-7095-4ea2-afb5-759591c2b0d4" (UID: "08e48f89-7095-4ea2-afb5-759591c2b0d4"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.094102 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/08e48f89-7095-4ea2-afb5-759591c2b0d4-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "08e48f89-7095-4ea2-afb5-759591c2b0d4" (UID: "08e48f89-7095-4ea2-afb5-759591c2b0d4"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.094120 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/08e48f89-7095-4ea2-afb5-759591c2b0d4-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "08e48f89-7095-4ea2-afb5-759591c2b0d4" (UID: "08e48f89-7095-4ea2-afb5-759591c2b0d4"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.094142 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/08e48f89-7095-4ea2-afb5-759591c2b0d4-log-socket" (OuterVolumeSpecName: "log-socket") pod "08e48f89-7095-4ea2-afb5-759591c2b0d4" (UID: "08e48f89-7095-4ea2-afb5-759591c2b0d4"). InnerVolumeSpecName "log-socket". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.094058 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/61bfc4d2-409e-41f4-a92d-cce68ecfa1a6-etc-openvswitch\") pod \"ovnkube-node-whkwc\" (UID: \"61bfc4d2-409e-41f4-a92d-cce68ecfa1a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-whkwc" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.094149 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/08e48f89-7095-4ea2-afb5-759591c2b0d4-node-log" (OuterVolumeSpecName: "node-log") pod "08e48f89-7095-4ea2-afb5-759591c2b0d4" (UID: "08e48f89-7095-4ea2-afb5-759591c2b0d4"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.094193 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/61bfc4d2-409e-41f4-a92d-cce68ecfa1a6-ovnkube-config\") pod \"ovnkube-node-whkwc\" (UID: \"61bfc4d2-409e-41f4-a92d-cce68ecfa1a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-whkwc" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.094217 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/61bfc4d2-409e-41f4-a92d-cce68ecfa1a6-host-run-ovn-kubernetes\") pod \"ovnkube-node-whkwc\" (UID: \"61bfc4d2-409e-41f4-a92d-cce68ecfa1a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-whkwc" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.094228 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/08e48f89-7095-4ea2-afb5-759591c2b0d4-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod 
"08e48f89-7095-4ea2-afb5-759591c2b0d4" (UID: "08e48f89-7095-4ea2-afb5-759591c2b0d4"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.094238 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/61bfc4d2-409e-41f4-a92d-cce68ecfa1a6-ovnkube-script-lib\") pod \"ovnkube-node-whkwc\" (UID: \"61bfc4d2-409e-41f4-a92d-cce68ecfa1a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-whkwc" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.094251 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/08e48f89-7095-4ea2-afb5-759591c2b0d4-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "08e48f89-7095-4ea2-afb5-759591c2b0d4" (UID: "08e48f89-7095-4ea2-afb5-759591c2b0d4"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.094263 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wskjn\" (UniqueName: \"kubernetes.io/projected/61bfc4d2-409e-41f4-a92d-cce68ecfa1a6-kube-api-access-wskjn\") pod \"ovnkube-node-whkwc\" (UID: \"61bfc4d2-409e-41f4-a92d-cce68ecfa1a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-whkwc" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.094270 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/08e48f89-7095-4ea2-afb5-759591c2b0d4-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "08e48f89-7095-4ea2-afb5-759591c2b0d4" (UID: "08e48f89-7095-4ea2-afb5-759591c2b0d4"). InnerVolumeSpecName "etc-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.094305 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/61bfc4d2-409e-41f4-a92d-cce68ecfa1a6-run-openvswitch\") pod \"ovnkube-node-whkwc\" (UID: \"61bfc4d2-409e-41f4-a92d-cce68ecfa1a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-whkwc" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.094322 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/61bfc4d2-409e-41f4-a92d-cce68ecfa1a6-env-overrides\") pod \"ovnkube-node-whkwc\" (UID: \"61bfc4d2-409e-41f4-a92d-cce68ecfa1a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-whkwc" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.094342 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/61bfc4d2-409e-41f4-a92d-cce68ecfa1a6-ovn-node-metrics-cert\") pod \"ovnkube-node-whkwc\" (UID: \"61bfc4d2-409e-41f4-a92d-cce68ecfa1a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-whkwc" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.094404 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/61bfc4d2-409e-41f4-a92d-cce68ecfa1a6-log-socket\") pod \"ovnkube-node-whkwc\" (UID: \"61bfc4d2-409e-41f4-a92d-cce68ecfa1a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-whkwc" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.094423 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/61bfc4d2-409e-41f4-a92d-cce68ecfa1a6-run-systemd\") pod \"ovnkube-node-whkwc\" (UID: 
\"61bfc4d2-409e-41f4-a92d-cce68ecfa1a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-whkwc" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.094452 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/61bfc4d2-409e-41f4-a92d-cce68ecfa1a6-host-cni-bin\") pod \"ovnkube-node-whkwc\" (UID: \"61bfc4d2-409e-41f4-a92d-cce68ecfa1a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-whkwc" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.094482 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/61bfc4d2-409e-41f4-a92d-cce68ecfa1a6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-whkwc\" (UID: \"61bfc4d2-409e-41f4-a92d-cce68ecfa1a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-whkwc" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.094498 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/61bfc4d2-409e-41f4-a92d-cce68ecfa1a6-run-ovn\") pod \"ovnkube-node-whkwc\" (UID: \"61bfc4d2-409e-41f4-a92d-cce68ecfa1a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-whkwc" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.094523 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/61bfc4d2-409e-41f4-a92d-cce68ecfa1a6-host-kubelet\") pod \"ovnkube-node-whkwc\" (UID: \"61bfc4d2-409e-41f4-a92d-cce68ecfa1a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-whkwc" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.094540 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/61bfc4d2-409e-41f4-a92d-cce68ecfa1a6-host-cni-netd\") pod \"ovnkube-node-whkwc\" (UID: \"61bfc4d2-409e-41f4-a92d-cce68ecfa1a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-whkwc" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.094561 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/61bfc4d2-409e-41f4-a92d-cce68ecfa1a6-systemd-units\") pod \"ovnkube-node-whkwc\" (UID: \"61bfc4d2-409e-41f4-a92d-cce68ecfa1a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-whkwc" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.094578 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08e48f89-7095-4ea2-afb5-759591c2b0d4-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "08e48f89-7095-4ea2-afb5-759591c2b0d4" (UID: "08e48f89-7095-4ea2-afb5-759591c2b0d4"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.094601 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/08e48f89-7095-4ea2-afb5-759591c2b0d4-host-slash" (OuterVolumeSpecName: "host-slash") pod "08e48f89-7095-4ea2-afb5-759591c2b0d4" (UID: "08e48f89-7095-4ea2-afb5-759591c2b0d4"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.094623 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/08e48f89-7095-4ea2-afb5-759591c2b0d4-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "08e48f89-7095-4ea2-afb5-759591c2b0d4" (UID: "08e48f89-7095-4ea2-afb5-759591c2b0d4"). InnerVolumeSpecName "host-cni-netd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.094636 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/08e48f89-7095-4ea2-afb5-759591c2b0d4-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "08e48f89-7095-4ea2-afb5-759591c2b0d4" (UID: "08e48f89-7095-4ea2-afb5-759591c2b0d4"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.094644 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/08e48f89-7095-4ea2-afb5-759591c2b0d4-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "08e48f89-7095-4ea2-afb5-759591c2b0d4" (UID: "08e48f89-7095-4ea2-afb5-759591c2b0d4"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.094926 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08e48f89-7095-4ea2-afb5-759591c2b0d4-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "08e48f89-7095-4ea2-afb5-759591c2b0d4" (UID: "08e48f89-7095-4ea2-afb5-759591c2b0d4"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.095227 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/08e48f89-7095-4ea2-afb5-759591c2b0d4-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "08e48f89-7095-4ea2-afb5-759591c2b0d4" (UID: "08e48f89-7095-4ea2-afb5-759591c2b0d4"). InnerVolumeSpecName "host-kubelet". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.095234 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08e48f89-7095-4ea2-afb5-759591c2b0d4-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "08e48f89-7095-4ea2-afb5-759591c2b0d4" (UID: "08e48f89-7095-4ea2-afb5-759591c2b0d4"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.095412 4823 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/08e48f89-7095-4ea2-afb5-759591c2b0d4-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.095436 4823 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/08e48f89-7095-4ea2-afb5-759591c2b0d4-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.095453 4823 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/08e48f89-7095-4ea2-afb5-759591c2b0d4-run-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.095466 4823 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/08e48f89-7095-4ea2-afb5-759591c2b0d4-systemd-units\") on node \"crc\" DevicePath \"\"" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.095479 4823 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/08e48f89-7095-4ea2-afb5-759591c2b0d4-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.095499 4823 
reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/08e48f89-7095-4ea2-afb5-759591c2b0d4-node-log\") on node \"crc\" DevicePath \"\"" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.095516 4823 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/08e48f89-7095-4ea2-afb5-759591c2b0d4-log-socket\") on node \"crc\" DevicePath \"\"" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.095558 4823 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/08e48f89-7095-4ea2-afb5-759591c2b0d4-host-cni-bin\") on node \"crc\" DevicePath \"\"" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.095581 4823 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/08e48f89-7095-4ea2-afb5-759591c2b0d4-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.102238 4823 scope.go:117] "RemoveContainer" containerID="0002b85e32dbacab67cb93ff21bf26130cbf98d65e88b13cfc00dc301fd4a3ec" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.102716 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08e48f89-7095-4ea2-afb5-759591c2b0d4-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "08e48f89-7095-4ea2-afb5-759591c2b0d4" (UID: "08e48f89-7095-4ea2-afb5-759591c2b0d4"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.102879 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08e48f89-7095-4ea2-afb5-759591c2b0d4-kube-api-access-qzgs7" (OuterVolumeSpecName: "kube-api-access-qzgs7") pod "08e48f89-7095-4ea2-afb5-759591c2b0d4" (UID: "08e48f89-7095-4ea2-afb5-759591c2b0d4"). 
InnerVolumeSpecName "kube-api-access-qzgs7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.117465 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/08e48f89-7095-4ea2-afb5-759591c2b0d4-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "08e48f89-7095-4ea2-afb5-759591c2b0d4" (UID: "08e48f89-7095-4ea2-afb5-759591c2b0d4"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.133771 4823 scope.go:117] "RemoveContainer" containerID="c223f5207e2445fd4bfd1bfe2c1c1ca50b5211a92b02c21445534fbfa2709e28" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.146355 4823 scope.go:117] "RemoveContainer" containerID="0f2cea674c0936cca4e9e95dfb7cd15bf9ccb3fbf8b711f351f87b632aad65c6" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.158523 4823 scope.go:117] "RemoveContainer" containerID="055585c1217ea720b929c9da82e7ea662d6fd5634244d362b3b425faee380d92" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.172120 4823 scope.go:117] "RemoveContainer" containerID="b768d92db57918a41c53b83940957c0181865808eb2c795231a5f8dbbf2bde82" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.189398 4823 scope.go:117] "RemoveContainer" containerID="b3d0f51f7ac1df7a9ae61306ccbfe2c93b397b6ab9ae0a477ab4eb46248d36f1" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.198396 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/61bfc4d2-409e-41f4-a92d-cce68ecfa1a6-host-kubelet\") pod \"ovnkube-node-whkwc\" (UID: \"61bfc4d2-409e-41f4-a92d-cce68ecfa1a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-whkwc" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.198457 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/61bfc4d2-409e-41f4-a92d-cce68ecfa1a6-host-cni-netd\") pod \"ovnkube-node-whkwc\" (UID: \"61bfc4d2-409e-41f4-a92d-cce68ecfa1a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-whkwc" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.198518 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/61bfc4d2-409e-41f4-a92d-cce68ecfa1a6-systemd-units\") pod \"ovnkube-node-whkwc\" (UID: \"61bfc4d2-409e-41f4-a92d-cce68ecfa1a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-whkwc" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.198551 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/61bfc4d2-409e-41f4-a92d-cce68ecfa1a6-host-run-netns\") pod \"ovnkube-node-whkwc\" (UID: \"61bfc4d2-409e-41f4-a92d-cce68ecfa1a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-whkwc" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.198578 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/61bfc4d2-409e-41f4-a92d-cce68ecfa1a6-var-lib-openvswitch\") pod \"ovnkube-node-whkwc\" (UID: \"61bfc4d2-409e-41f4-a92d-cce68ecfa1a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-whkwc" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.198598 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/61bfc4d2-409e-41f4-a92d-cce68ecfa1a6-host-slash\") pod \"ovnkube-node-whkwc\" (UID: \"61bfc4d2-409e-41f4-a92d-cce68ecfa1a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-whkwc" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.198623 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/61bfc4d2-409e-41f4-a92d-cce68ecfa1a6-etc-openvswitch\") pod \"ovnkube-node-whkwc\" (UID: \"61bfc4d2-409e-41f4-a92d-cce68ecfa1a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-whkwc" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.198619 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/61bfc4d2-409e-41f4-a92d-cce68ecfa1a6-host-kubelet\") pod \"ovnkube-node-whkwc\" (UID: \"61bfc4d2-409e-41f4-a92d-cce68ecfa1a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-whkwc" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.198674 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/61bfc4d2-409e-41f4-a92d-cce68ecfa1a6-host-slash\") pod \"ovnkube-node-whkwc\" (UID: \"61bfc4d2-409e-41f4-a92d-cce68ecfa1a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-whkwc" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.198709 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/61bfc4d2-409e-41f4-a92d-cce68ecfa1a6-node-log\") pod \"ovnkube-node-whkwc\" (UID: \"61bfc4d2-409e-41f4-a92d-cce68ecfa1a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-whkwc" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.198722 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/61bfc4d2-409e-41f4-a92d-cce68ecfa1a6-host-cni-netd\") pod \"ovnkube-node-whkwc\" (UID: \"61bfc4d2-409e-41f4-a92d-cce68ecfa1a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-whkwc" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.198721 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/61bfc4d2-409e-41f4-a92d-cce68ecfa1a6-var-lib-openvswitch\") pod \"ovnkube-node-whkwc\" (UID: 
\"61bfc4d2-409e-41f4-a92d-cce68ecfa1a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-whkwc" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.198657 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/61bfc4d2-409e-41f4-a92d-cce68ecfa1a6-systemd-units\") pod \"ovnkube-node-whkwc\" (UID: \"61bfc4d2-409e-41f4-a92d-cce68ecfa1a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-whkwc" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.198647 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/61bfc4d2-409e-41f4-a92d-cce68ecfa1a6-host-run-netns\") pod \"ovnkube-node-whkwc\" (UID: \"61bfc4d2-409e-41f4-a92d-cce68ecfa1a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-whkwc" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.198685 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/61bfc4d2-409e-41f4-a92d-cce68ecfa1a6-etc-openvswitch\") pod \"ovnkube-node-whkwc\" (UID: \"61bfc4d2-409e-41f4-a92d-cce68ecfa1a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-whkwc" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.198644 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/61bfc4d2-409e-41f4-a92d-cce68ecfa1a6-node-log\") pod \"ovnkube-node-whkwc\" (UID: \"61bfc4d2-409e-41f4-a92d-cce68ecfa1a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-whkwc" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.198848 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/61bfc4d2-409e-41f4-a92d-cce68ecfa1a6-ovnkube-config\") pod \"ovnkube-node-whkwc\" (UID: \"61bfc4d2-409e-41f4-a92d-cce68ecfa1a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-whkwc" Dec 16 07:10:21 crc 
kubenswrapper[4823]: I1216 07:10:21.198871 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/61bfc4d2-409e-41f4-a92d-cce68ecfa1a6-host-run-ovn-kubernetes\") pod \"ovnkube-node-whkwc\" (UID: \"61bfc4d2-409e-41f4-a92d-cce68ecfa1a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-whkwc" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.198894 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/61bfc4d2-409e-41f4-a92d-cce68ecfa1a6-ovnkube-script-lib\") pod \"ovnkube-node-whkwc\" (UID: \"61bfc4d2-409e-41f4-a92d-cce68ecfa1a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-whkwc" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.198900 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/61bfc4d2-409e-41f4-a92d-cce68ecfa1a6-host-run-ovn-kubernetes\") pod \"ovnkube-node-whkwc\" (UID: \"61bfc4d2-409e-41f4-a92d-cce68ecfa1a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-whkwc" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.198915 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wskjn\" (UniqueName: \"kubernetes.io/projected/61bfc4d2-409e-41f4-a92d-cce68ecfa1a6-kube-api-access-wskjn\") pod \"ovnkube-node-whkwc\" (UID: \"61bfc4d2-409e-41f4-a92d-cce68ecfa1a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-whkwc" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.199015 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/61bfc4d2-409e-41f4-a92d-cce68ecfa1a6-run-openvswitch\") pod \"ovnkube-node-whkwc\" (UID: \"61bfc4d2-409e-41f4-a92d-cce68ecfa1a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-whkwc" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 
07:10:21.199122 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/61bfc4d2-409e-41f4-a92d-cce68ecfa1a6-env-overrides\") pod \"ovnkube-node-whkwc\" (UID: \"61bfc4d2-409e-41f4-a92d-cce68ecfa1a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-whkwc" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.199142 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/61bfc4d2-409e-41f4-a92d-cce68ecfa1a6-ovn-node-metrics-cert\") pod \"ovnkube-node-whkwc\" (UID: \"61bfc4d2-409e-41f4-a92d-cce68ecfa1a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-whkwc" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.199225 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/61bfc4d2-409e-41f4-a92d-cce68ecfa1a6-log-socket\") pod \"ovnkube-node-whkwc\" (UID: \"61bfc4d2-409e-41f4-a92d-cce68ecfa1a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-whkwc" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.199277 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/61bfc4d2-409e-41f4-a92d-cce68ecfa1a6-run-systemd\") pod \"ovnkube-node-whkwc\" (UID: \"61bfc4d2-409e-41f4-a92d-cce68ecfa1a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-whkwc" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.199301 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/61bfc4d2-409e-41f4-a92d-cce68ecfa1a6-host-cni-bin\") pod \"ovnkube-node-whkwc\" (UID: \"61bfc4d2-409e-41f4-a92d-cce68ecfa1a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-whkwc" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.199344 4823 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/61bfc4d2-409e-41f4-a92d-cce68ecfa1a6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-whkwc\" (UID: \"61bfc4d2-409e-41f4-a92d-cce68ecfa1a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-whkwc" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.199371 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/61bfc4d2-409e-41f4-a92d-cce68ecfa1a6-run-ovn\") pod \"ovnkube-node-whkwc\" (UID: \"61bfc4d2-409e-41f4-a92d-cce68ecfa1a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-whkwc" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.199428 4823 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/08e48f89-7095-4ea2-afb5-759591c2b0d4-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.199442 4823 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/08e48f89-7095-4ea2-afb5-759591c2b0d4-run-systemd\") on node \"crc\" DevicePath \"\"" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.199458 4823 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/08e48f89-7095-4ea2-afb5-759591c2b0d4-host-cni-netd\") on node \"crc\" DevicePath \"\"" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.199471 4823 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/08e48f89-7095-4ea2-afb5-759591c2b0d4-host-slash\") on node \"crc\" DevicePath \"\"" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.199484 4823 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/08e48f89-7095-4ea2-afb5-759591c2b0d4-env-overrides\") on node \"crc\" DevicePath 
\"\"" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.199495 4823 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/08e48f89-7095-4ea2-afb5-759591c2b0d4-host-run-netns\") on node \"crc\" DevicePath \"\"" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.199510 4823 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/08e48f89-7095-4ea2-afb5-759591c2b0d4-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.199523 4823 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/08e48f89-7095-4ea2-afb5-759591c2b0d4-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.199537 4823 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/08e48f89-7095-4ea2-afb5-759591c2b0d4-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.199550 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qzgs7\" (UniqueName: \"kubernetes.io/projected/08e48f89-7095-4ea2-afb5-759591c2b0d4-kube-api-access-qzgs7\") on node \"crc\" DevicePath \"\"" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.199562 4823 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/08e48f89-7095-4ea2-afb5-759591c2b0d4-host-kubelet\") on node \"crc\" DevicePath \"\"" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.199536 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/61bfc4d2-409e-41f4-a92d-cce68ecfa1a6-run-openvswitch\") pod \"ovnkube-node-whkwc\" (UID: \"61bfc4d2-409e-41f4-a92d-cce68ecfa1a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-whkwc" Dec 
16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.199607 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/61bfc4d2-409e-41f4-a92d-cce68ecfa1a6-run-systemd\") pod \"ovnkube-node-whkwc\" (UID: \"61bfc4d2-409e-41f4-a92d-cce68ecfa1a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-whkwc" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.199643 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/61bfc4d2-409e-41f4-a92d-cce68ecfa1a6-host-cni-bin\") pod \"ovnkube-node-whkwc\" (UID: \"61bfc4d2-409e-41f4-a92d-cce68ecfa1a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-whkwc" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.199675 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/61bfc4d2-409e-41f4-a92d-cce68ecfa1a6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-whkwc\" (UID: \"61bfc4d2-409e-41f4-a92d-cce68ecfa1a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-whkwc" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.199678 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/61bfc4d2-409e-41f4-a92d-cce68ecfa1a6-ovnkube-script-lib\") pod \"ovnkube-node-whkwc\" (UID: \"61bfc4d2-409e-41f4-a92d-cce68ecfa1a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-whkwc" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.199506 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/61bfc4d2-409e-41f4-a92d-cce68ecfa1a6-log-socket\") pod \"ovnkube-node-whkwc\" (UID: \"61bfc4d2-409e-41f4-a92d-cce68ecfa1a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-whkwc" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.199717 4823 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/61bfc4d2-409e-41f4-a92d-cce68ecfa1a6-run-ovn\") pod \"ovnkube-node-whkwc\" (UID: \"61bfc4d2-409e-41f4-a92d-cce68ecfa1a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-whkwc" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.199885 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/61bfc4d2-409e-41f4-a92d-cce68ecfa1a6-ovnkube-config\") pod \"ovnkube-node-whkwc\" (UID: \"61bfc4d2-409e-41f4-a92d-cce68ecfa1a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-whkwc" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.200120 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/61bfc4d2-409e-41f4-a92d-cce68ecfa1a6-env-overrides\") pod \"ovnkube-node-whkwc\" (UID: \"61bfc4d2-409e-41f4-a92d-cce68ecfa1a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-whkwc" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.205637 4823 scope.go:117] "RemoveContainer" containerID="c1c6cb023dcc6d1b149de201b108e3b1f3e32530dadc822aedede5552efc586b" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.207482 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/61bfc4d2-409e-41f4-a92d-cce68ecfa1a6-ovn-node-metrics-cert\") pod \"ovnkube-node-whkwc\" (UID: \"61bfc4d2-409e-41f4-a92d-cce68ecfa1a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-whkwc" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.216906 4823 scope.go:117] "RemoveContainer" containerID="304f9e1fee6b648f61b3b334284a954e8107ff70ca71d01b586d137490eeab20" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.218425 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wskjn\" (UniqueName: 
\"kubernetes.io/projected/61bfc4d2-409e-41f4-a92d-cce68ecfa1a6-kube-api-access-wskjn\") pod \"ovnkube-node-whkwc\" (UID: \"61bfc4d2-409e-41f4-a92d-cce68ecfa1a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-whkwc" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.234250 4823 scope.go:117] "RemoveContainer" containerID="1e3566766c9c24f5c12481b74ed8a3938b21bdb36c003146515c67170d74a71f" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.246047 4823 scope.go:117] "RemoveContainer" containerID="cc8a43f71797e49e9a777ee909b45ae50a30e76da5c4cd4c8ee62cd48a7917ee" Dec 16 07:10:21 crc kubenswrapper[4823]: E1216 07:10:21.246449 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc8a43f71797e49e9a777ee909b45ae50a30e76da5c4cd4c8ee62cd48a7917ee\": container with ID starting with cc8a43f71797e49e9a777ee909b45ae50a30e76da5c4cd4c8ee62cd48a7917ee not found: ID does not exist" containerID="cc8a43f71797e49e9a777ee909b45ae50a30e76da5c4cd4c8ee62cd48a7917ee" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.246497 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc8a43f71797e49e9a777ee909b45ae50a30e76da5c4cd4c8ee62cd48a7917ee"} err="failed to get container status \"cc8a43f71797e49e9a777ee909b45ae50a30e76da5c4cd4c8ee62cd48a7917ee\": rpc error: code = NotFound desc = could not find container \"cc8a43f71797e49e9a777ee909b45ae50a30e76da5c4cd4c8ee62cd48a7917ee\": container with ID starting with cc8a43f71797e49e9a777ee909b45ae50a30e76da5c4cd4c8ee62cd48a7917ee not found: ID does not exist" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.246525 4823 scope.go:117] "RemoveContainer" containerID="0002b85e32dbacab67cb93ff21bf26130cbf98d65e88b13cfc00dc301fd4a3ec" Dec 16 07:10:21 crc kubenswrapper[4823]: E1216 07:10:21.246860 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"0002b85e32dbacab67cb93ff21bf26130cbf98d65e88b13cfc00dc301fd4a3ec\": container with ID starting with 0002b85e32dbacab67cb93ff21bf26130cbf98d65e88b13cfc00dc301fd4a3ec not found: ID does not exist" containerID="0002b85e32dbacab67cb93ff21bf26130cbf98d65e88b13cfc00dc301fd4a3ec" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.246918 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0002b85e32dbacab67cb93ff21bf26130cbf98d65e88b13cfc00dc301fd4a3ec"} err="failed to get container status \"0002b85e32dbacab67cb93ff21bf26130cbf98d65e88b13cfc00dc301fd4a3ec\": rpc error: code = NotFound desc = could not find container \"0002b85e32dbacab67cb93ff21bf26130cbf98d65e88b13cfc00dc301fd4a3ec\": container with ID starting with 0002b85e32dbacab67cb93ff21bf26130cbf98d65e88b13cfc00dc301fd4a3ec not found: ID does not exist" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.246955 4823 scope.go:117] "RemoveContainer" containerID="c223f5207e2445fd4bfd1bfe2c1c1ca50b5211a92b02c21445534fbfa2709e28" Dec 16 07:10:21 crc kubenswrapper[4823]: E1216 07:10:21.247448 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c223f5207e2445fd4bfd1bfe2c1c1ca50b5211a92b02c21445534fbfa2709e28\": container with ID starting with c223f5207e2445fd4bfd1bfe2c1c1ca50b5211a92b02c21445534fbfa2709e28 not found: ID does not exist" containerID="c223f5207e2445fd4bfd1bfe2c1c1ca50b5211a92b02c21445534fbfa2709e28" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.247476 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c223f5207e2445fd4bfd1bfe2c1c1ca50b5211a92b02c21445534fbfa2709e28"} err="failed to get container status \"c223f5207e2445fd4bfd1bfe2c1c1ca50b5211a92b02c21445534fbfa2709e28\": rpc error: code = NotFound desc = could not find container \"c223f5207e2445fd4bfd1bfe2c1c1ca50b5211a92b02c21445534fbfa2709e28\": container with ID 
starting with c223f5207e2445fd4bfd1bfe2c1c1ca50b5211a92b02c21445534fbfa2709e28 not found: ID does not exist" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.247494 4823 scope.go:117] "RemoveContainer" containerID="0f2cea674c0936cca4e9e95dfb7cd15bf9ccb3fbf8b711f351f87b632aad65c6" Dec 16 07:10:21 crc kubenswrapper[4823]: E1216 07:10:21.247784 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f2cea674c0936cca4e9e95dfb7cd15bf9ccb3fbf8b711f351f87b632aad65c6\": container with ID starting with 0f2cea674c0936cca4e9e95dfb7cd15bf9ccb3fbf8b711f351f87b632aad65c6 not found: ID does not exist" containerID="0f2cea674c0936cca4e9e95dfb7cd15bf9ccb3fbf8b711f351f87b632aad65c6" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.247822 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f2cea674c0936cca4e9e95dfb7cd15bf9ccb3fbf8b711f351f87b632aad65c6"} err="failed to get container status \"0f2cea674c0936cca4e9e95dfb7cd15bf9ccb3fbf8b711f351f87b632aad65c6\": rpc error: code = NotFound desc = could not find container \"0f2cea674c0936cca4e9e95dfb7cd15bf9ccb3fbf8b711f351f87b632aad65c6\": container with ID starting with 0f2cea674c0936cca4e9e95dfb7cd15bf9ccb3fbf8b711f351f87b632aad65c6 not found: ID does not exist" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.247856 4823 scope.go:117] "RemoveContainer" containerID="055585c1217ea720b929c9da82e7ea662d6fd5634244d362b3b425faee380d92" Dec 16 07:10:21 crc kubenswrapper[4823]: E1216 07:10:21.248146 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"055585c1217ea720b929c9da82e7ea662d6fd5634244d362b3b425faee380d92\": container with ID starting with 055585c1217ea720b929c9da82e7ea662d6fd5634244d362b3b425faee380d92 not found: ID does not exist" containerID="055585c1217ea720b929c9da82e7ea662d6fd5634244d362b3b425faee380d92" Dec 16 
07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.248185 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"055585c1217ea720b929c9da82e7ea662d6fd5634244d362b3b425faee380d92"} err="failed to get container status \"055585c1217ea720b929c9da82e7ea662d6fd5634244d362b3b425faee380d92\": rpc error: code = NotFound desc = could not find container \"055585c1217ea720b929c9da82e7ea662d6fd5634244d362b3b425faee380d92\": container with ID starting with 055585c1217ea720b929c9da82e7ea662d6fd5634244d362b3b425faee380d92 not found: ID does not exist" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.248205 4823 scope.go:117] "RemoveContainer" containerID="b768d92db57918a41c53b83940957c0181865808eb2c795231a5f8dbbf2bde82" Dec 16 07:10:21 crc kubenswrapper[4823]: E1216 07:10:21.248448 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b768d92db57918a41c53b83940957c0181865808eb2c795231a5f8dbbf2bde82\": container with ID starting with b768d92db57918a41c53b83940957c0181865808eb2c795231a5f8dbbf2bde82 not found: ID does not exist" containerID="b768d92db57918a41c53b83940957c0181865808eb2c795231a5f8dbbf2bde82" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.248482 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b768d92db57918a41c53b83940957c0181865808eb2c795231a5f8dbbf2bde82"} err="failed to get container status \"b768d92db57918a41c53b83940957c0181865808eb2c795231a5f8dbbf2bde82\": rpc error: code = NotFound desc = could not find container \"b768d92db57918a41c53b83940957c0181865808eb2c795231a5f8dbbf2bde82\": container with ID starting with b768d92db57918a41c53b83940957c0181865808eb2c795231a5f8dbbf2bde82 not found: ID does not exist" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.248516 4823 scope.go:117] "RemoveContainer" 
containerID="b3d0f51f7ac1df7a9ae61306ccbfe2c93b397b6ab9ae0a477ab4eb46248d36f1" Dec 16 07:10:21 crc kubenswrapper[4823]: E1216 07:10:21.248760 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3d0f51f7ac1df7a9ae61306ccbfe2c93b397b6ab9ae0a477ab4eb46248d36f1\": container with ID starting with b3d0f51f7ac1df7a9ae61306ccbfe2c93b397b6ab9ae0a477ab4eb46248d36f1 not found: ID does not exist" containerID="b3d0f51f7ac1df7a9ae61306ccbfe2c93b397b6ab9ae0a477ab4eb46248d36f1" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.248790 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3d0f51f7ac1df7a9ae61306ccbfe2c93b397b6ab9ae0a477ab4eb46248d36f1"} err="failed to get container status \"b3d0f51f7ac1df7a9ae61306ccbfe2c93b397b6ab9ae0a477ab4eb46248d36f1\": rpc error: code = NotFound desc = could not find container \"b3d0f51f7ac1df7a9ae61306ccbfe2c93b397b6ab9ae0a477ab4eb46248d36f1\": container with ID starting with b3d0f51f7ac1df7a9ae61306ccbfe2c93b397b6ab9ae0a477ab4eb46248d36f1 not found: ID does not exist" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.248813 4823 scope.go:117] "RemoveContainer" containerID="c1c6cb023dcc6d1b149de201b108e3b1f3e32530dadc822aedede5552efc586b" Dec 16 07:10:21 crc kubenswrapper[4823]: E1216 07:10:21.249074 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1c6cb023dcc6d1b149de201b108e3b1f3e32530dadc822aedede5552efc586b\": container with ID starting with c1c6cb023dcc6d1b149de201b108e3b1f3e32530dadc822aedede5552efc586b not found: ID does not exist" containerID="c1c6cb023dcc6d1b149de201b108e3b1f3e32530dadc822aedede5552efc586b" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.249104 4823 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"c1c6cb023dcc6d1b149de201b108e3b1f3e32530dadc822aedede5552efc586b"} err="failed to get container status \"c1c6cb023dcc6d1b149de201b108e3b1f3e32530dadc822aedede5552efc586b\": rpc error: code = NotFound desc = could not find container \"c1c6cb023dcc6d1b149de201b108e3b1f3e32530dadc822aedede5552efc586b\": container with ID starting with c1c6cb023dcc6d1b149de201b108e3b1f3e32530dadc822aedede5552efc586b not found: ID does not exist" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.249138 4823 scope.go:117] "RemoveContainer" containerID="304f9e1fee6b648f61b3b334284a954e8107ff70ca71d01b586d137490eeab20" Dec 16 07:10:21 crc kubenswrapper[4823]: E1216 07:10:21.249444 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"304f9e1fee6b648f61b3b334284a954e8107ff70ca71d01b586d137490eeab20\": container with ID starting with 304f9e1fee6b648f61b3b334284a954e8107ff70ca71d01b586d137490eeab20 not found: ID does not exist" containerID="304f9e1fee6b648f61b3b334284a954e8107ff70ca71d01b586d137490eeab20" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.249483 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"304f9e1fee6b648f61b3b334284a954e8107ff70ca71d01b586d137490eeab20"} err="failed to get container status \"304f9e1fee6b648f61b3b334284a954e8107ff70ca71d01b586d137490eeab20\": rpc error: code = NotFound desc = could not find container \"304f9e1fee6b648f61b3b334284a954e8107ff70ca71d01b586d137490eeab20\": container with ID starting with 304f9e1fee6b648f61b3b334284a954e8107ff70ca71d01b586d137490eeab20 not found: ID does not exist" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.249498 4823 scope.go:117] "RemoveContainer" containerID="1e3566766c9c24f5c12481b74ed8a3938b21bdb36c003146515c67170d74a71f" Dec 16 07:10:21 crc kubenswrapper[4823]: E1216 07:10:21.249753 4823 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"1e3566766c9c24f5c12481b74ed8a3938b21bdb36c003146515c67170d74a71f\": container with ID starting with 1e3566766c9c24f5c12481b74ed8a3938b21bdb36c003146515c67170d74a71f not found: ID does not exist" containerID="1e3566766c9c24f5c12481b74ed8a3938b21bdb36c003146515c67170d74a71f" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.249775 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e3566766c9c24f5c12481b74ed8a3938b21bdb36c003146515c67170d74a71f"} err="failed to get container status \"1e3566766c9c24f5c12481b74ed8a3938b21bdb36c003146515c67170d74a71f\": rpc error: code = NotFound desc = could not find container \"1e3566766c9c24f5c12481b74ed8a3938b21bdb36c003146515c67170d74a71f\": container with ID starting with 1e3566766c9c24f5c12481b74ed8a3938b21bdb36c003146515c67170d74a71f not found: ID does not exist" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.249788 4823 scope.go:117] "RemoveContainer" containerID="cc8a43f71797e49e9a777ee909b45ae50a30e76da5c4cd4c8ee62cd48a7917ee" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.250080 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc8a43f71797e49e9a777ee909b45ae50a30e76da5c4cd4c8ee62cd48a7917ee"} err="failed to get container status \"cc8a43f71797e49e9a777ee909b45ae50a30e76da5c4cd4c8ee62cd48a7917ee\": rpc error: code = NotFound desc = could not find container \"cc8a43f71797e49e9a777ee909b45ae50a30e76da5c4cd4c8ee62cd48a7917ee\": container with ID starting with cc8a43f71797e49e9a777ee909b45ae50a30e76da5c4cd4c8ee62cd48a7917ee not found: ID does not exist" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.250106 4823 scope.go:117] "RemoveContainer" containerID="0002b85e32dbacab67cb93ff21bf26130cbf98d65e88b13cfc00dc301fd4a3ec" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.250700 4823 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0002b85e32dbacab67cb93ff21bf26130cbf98d65e88b13cfc00dc301fd4a3ec"} err="failed to get container status \"0002b85e32dbacab67cb93ff21bf26130cbf98d65e88b13cfc00dc301fd4a3ec\": rpc error: code = NotFound desc = could not find container \"0002b85e32dbacab67cb93ff21bf26130cbf98d65e88b13cfc00dc301fd4a3ec\": container with ID starting with 0002b85e32dbacab67cb93ff21bf26130cbf98d65e88b13cfc00dc301fd4a3ec not found: ID does not exist" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.250771 4823 scope.go:117] "RemoveContainer" containerID="c223f5207e2445fd4bfd1bfe2c1c1ca50b5211a92b02c21445534fbfa2709e28" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.251086 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c223f5207e2445fd4bfd1bfe2c1c1ca50b5211a92b02c21445534fbfa2709e28"} err="failed to get container status \"c223f5207e2445fd4bfd1bfe2c1c1ca50b5211a92b02c21445534fbfa2709e28\": rpc error: code = NotFound desc = could not find container \"c223f5207e2445fd4bfd1bfe2c1c1ca50b5211a92b02c21445534fbfa2709e28\": container with ID starting with c223f5207e2445fd4bfd1bfe2c1c1ca50b5211a92b02c21445534fbfa2709e28 not found: ID does not exist" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.251126 4823 scope.go:117] "RemoveContainer" containerID="0f2cea674c0936cca4e9e95dfb7cd15bf9ccb3fbf8b711f351f87b632aad65c6" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.251401 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f2cea674c0936cca4e9e95dfb7cd15bf9ccb3fbf8b711f351f87b632aad65c6"} err="failed to get container status \"0f2cea674c0936cca4e9e95dfb7cd15bf9ccb3fbf8b711f351f87b632aad65c6\": rpc error: code = NotFound desc = could not find container \"0f2cea674c0936cca4e9e95dfb7cd15bf9ccb3fbf8b711f351f87b632aad65c6\": container with ID starting with 
0f2cea674c0936cca4e9e95dfb7cd15bf9ccb3fbf8b711f351f87b632aad65c6 not found: ID does not exist" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.251427 4823 scope.go:117] "RemoveContainer" containerID="055585c1217ea720b929c9da82e7ea662d6fd5634244d362b3b425faee380d92" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.251663 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"055585c1217ea720b929c9da82e7ea662d6fd5634244d362b3b425faee380d92"} err="failed to get container status \"055585c1217ea720b929c9da82e7ea662d6fd5634244d362b3b425faee380d92\": rpc error: code = NotFound desc = could not find container \"055585c1217ea720b929c9da82e7ea662d6fd5634244d362b3b425faee380d92\": container with ID starting with 055585c1217ea720b929c9da82e7ea662d6fd5634244d362b3b425faee380d92 not found: ID does not exist" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.251704 4823 scope.go:117] "RemoveContainer" containerID="b768d92db57918a41c53b83940957c0181865808eb2c795231a5f8dbbf2bde82" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.252012 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b768d92db57918a41c53b83940957c0181865808eb2c795231a5f8dbbf2bde82"} err="failed to get container status \"b768d92db57918a41c53b83940957c0181865808eb2c795231a5f8dbbf2bde82\": rpc error: code = NotFound desc = could not find container \"b768d92db57918a41c53b83940957c0181865808eb2c795231a5f8dbbf2bde82\": container with ID starting with b768d92db57918a41c53b83940957c0181865808eb2c795231a5f8dbbf2bde82 not found: ID does not exist" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.252065 4823 scope.go:117] "RemoveContainer" containerID="b3d0f51f7ac1df7a9ae61306ccbfe2c93b397b6ab9ae0a477ab4eb46248d36f1" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.252351 4823 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b3d0f51f7ac1df7a9ae61306ccbfe2c93b397b6ab9ae0a477ab4eb46248d36f1"} err="failed to get container status \"b3d0f51f7ac1df7a9ae61306ccbfe2c93b397b6ab9ae0a477ab4eb46248d36f1\": rpc error: code = NotFound desc = could not find container \"b3d0f51f7ac1df7a9ae61306ccbfe2c93b397b6ab9ae0a477ab4eb46248d36f1\": container with ID starting with b3d0f51f7ac1df7a9ae61306ccbfe2c93b397b6ab9ae0a477ab4eb46248d36f1 not found: ID does not exist" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.252480 4823 scope.go:117] "RemoveContainer" containerID="c1c6cb023dcc6d1b149de201b108e3b1f3e32530dadc822aedede5552efc586b" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.252765 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1c6cb023dcc6d1b149de201b108e3b1f3e32530dadc822aedede5552efc586b"} err="failed to get container status \"c1c6cb023dcc6d1b149de201b108e3b1f3e32530dadc822aedede5552efc586b\": rpc error: code = NotFound desc = could not find container \"c1c6cb023dcc6d1b149de201b108e3b1f3e32530dadc822aedede5552efc586b\": container with ID starting with c1c6cb023dcc6d1b149de201b108e3b1f3e32530dadc822aedede5552efc586b not found: ID does not exist" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.252787 4823 scope.go:117] "RemoveContainer" containerID="304f9e1fee6b648f61b3b334284a954e8107ff70ca71d01b586d137490eeab20" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.252999 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"304f9e1fee6b648f61b3b334284a954e8107ff70ca71d01b586d137490eeab20"} err="failed to get container status \"304f9e1fee6b648f61b3b334284a954e8107ff70ca71d01b586d137490eeab20\": rpc error: code = NotFound desc = could not find container \"304f9e1fee6b648f61b3b334284a954e8107ff70ca71d01b586d137490eeab20\": container with ID starting with 304f9e1fee6b648f61b3b334284a954e8107ff70ca71d01b586d137490eeab20 not found: ID does not 
exist" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.253043 4823 scope.go:117] "RemoveContainer" containerID="1e3566766c9c24f5c12481b74ed8a3938b21bdb36c003146515c67170d74a71f" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.253229 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e3566766c9c24f5c12481b74ed8a3938b21bdb36c003146515c67170d74a71f"} err="failed to get container status \"1e3566766c9c24f5c12481b74ed8a3938b21bdb36c003146515c67170d74a71f\": rpc error: code = NotFound desc = could not find container \"1e3566766c9c24f5c12481b74ed8a3938b21bdb36c003146515c67170d74a71f\": container with ID starting with 1e3566766c9c24f5c12481b74ed8a3938b21bdb36c003146515c67170d74a71f not found: ID does not exist" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.253276 4823 scope.go:117] "RemoveContainer" containerID="cc8a43f71797e49e9a777ee909b45ae50a30e76da5c4cd4c8ee62cd48a7917ee" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.253485 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc8a43f71797e49e9a777ee909b45ae50a30e76da5c4cd4c8ee62cd48a7917ee"} err="failed to get container status \"cc8a43f71797e49e9a777ee909b45ae50a30e76da5c4cd4c8ee62cd48a7917ee\": rpc error: code = NotFound desc = could not find container \"cc8a43f71797e49e9a777ee909b45ae50a30e76da5c4cd4c8ee62cd48a7917ee\": container with ID starting with cc8a43f71797e49e9a777ee909b45ae50a30e76da5c4cd4c8ee62cd48a7917ee not found: ID does not exist" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.253512 4823 scope.go:117] "RemoveContainer" containerID="0002b85e32dbacab67cb93ff21bf26130cbf98d65e88b13cfc00dc301fd4a3ec" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.253697 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0002b85e32dbacab67cb93ff21bf26130cbf98d65e88b13cfc00dc301fd4a3ec"} err="failed to get container status 
\"0002b85e32dbacab67cb93ff21bf26130cbf98d65e88b13cfc00dc301fd4a3ec\": rpc error: code = NotFound desc = could not find container \"0002b85e32dbacab67cb93ff21bf26130cbf98d65e88b13cfc00dc301fd4a3ec\": container with ID starting with 0002b85e32dbacab67cb93ff21bf26130cbf98d65e88b13cfc00dc301fd4a3ec not found: ID does not exist" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.253718 4823 scope.go:117] "RemoveContainer" containerID="c223f5207e2445fd4bfd1bfe2c1c1ca50b5211a92b02c21445534fbfa2709e28" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.253908 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c223f5207e2445fd4bfd1bfe2c1c1ca50b5211a92b02c21445534fbfa2709e28"} err="failed to get container status \"c223f5207e2445fd4bfd1bfe2c1c1ca50b5211a92b02c21445534fbfa2709e28\": rpc error: code = NotFound desc = could not find container \"c223f5207e2445fd4bfd1bfe2c1c1ca50b5211a92b02c21445534fbfa2709e28\": container with ID starting with c223f5207e2445fd4bfd1bfe2c1c1ca50b5211a92b02c21445534fbfa2709e28 not found: ID does not exist" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.253936 4823 scope.go:117] "RemoveContainer" containerID="0f2cea674c0936cca4e9e95dfb7cd15bf9ccb3fbf8b711f351f87b632aad65c6" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.254203 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f2cea674c0936cca4e9e95dfb7cd15bf9ccb3fbf8b711f351f87b632aad65c6"} err="failed to get container status \"0f2cea674c0936cca4e9e95dfb7cd15bf9ccb3fbf8b711f351f87b632aad65c6\": rpc error: code = NotFound desc = could not find container \"0f2cea674c0936cca4e9e95dfb7cd15bf9ccb3fbf8b711f351f87b632aad65c6\": container with ID starting with 0f2cea674c0936cca4e9e95dfb7cd15bf9ccb3fbf8b711f351f87b632aad65c6 not found: ID does not exist" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.254246 4823 scope.go:117] "RemoveContainer" 
containerID="055585c1217ea720b929c9da82e7ea662d6fd5634244d362b3b425faee380d92" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.254486 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"055585c1217ea720b929c9da82e7ea662d6fd5634244d362b3b425faee380d92"} err="failed to get container status \"055585c1217ea720b929c9da82e7ea662d6fd5634244d362b3b425faee380d92\": rpc error: code = NotFound desc = could not find container \"055585c1217ea720b929c9da82e7ea662d6fd5634244d362b3b425faee380d92\": container with ID starting with 055585c1217ea720b929c9da82e7ea662d6fd5634244d362b3b425faee380d92 not found: ID does not exist" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.254511 4823 scope.go:117] "RemoveContainer" containerID="b768d92db57918a41c53b83940957c0181865808eb2c795231a5f8dbbf2bde82" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.254696 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b768d92db57918a41c53b83940957c0181865808eb2c795231a5f8dbbf2bde82"} err="failed to get container status \"b768d92db57918a41c53b83940957c0181865808eb2c795231a5f8dbbf2bde82\": rpc error: code = NotFound desc = could not find container \"b768d92db57918a41c53b83940957c0181865808eb2c795231a5f8dbbf2bde82\": container with ID starting with b768d92db57918a41c53b83940957c0181865808eb2c795231a5f8dbbf2bde82 not found: ID does not exist" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.254721 4823 scope.go:117] "RemoveContainer" containerID="b3d0f51f7ac1df7a9ae61306ccbfe2c93b397b6ab9ae0a477ab4eb46248d36f1" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.254917 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3d0f51f7ac1df7a9ae61306ccbfe2c93b397b6ab9ae0a477ab4eb46248d36f1"} err="failed to get container status \"b3d0f51f7ac1df7a9ae61306ccbfe2c93b397b6ab9ae0a477ab4eb46248d36f1\": rpc error: code = NotFound desc = could 
not find container \"b3d0f51f7ac1df7a9ae61306ccbfe2c93b397b6ab9ae0a477ab4eb46248d36f1\": container with ID starting with b3d0f51f7ac1df7a9ae61306ccbfe2c93b397b6ab9ae0a477ab4eb46248d36f1 not found: ID does not exist" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.254963 4823 scope.go:117] "RemoveContainer" containerID="c1c6cb023dcc6d1b149de201b108e3b1f3e32530dadc822aedede5552efc586b" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.255250 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1c6cb023dcc6d1b149de201b108e3b1f3e32530dadc822aedede5552efc586b"} err="failed to get container status \"c1c6cb023dcc6d1b149de201b108e3b1f3e32530dadc822aedede5552efc586b\": rpc error: code = NotFound desc = could not find container \"c1c6cb023dcc6d1b149de201b108e3b1f3e32530dadc822aedede5552efc586b\": container with ID starting with c1c6cb023dcc6d1b149de201b108e3b1f3e32530dadc822aedede5552efc586b not found: ID does not exist" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.255290 4823 scope.go:117] "RemoveContainer" containerID="304f9e1fee6b648f61b3b334284a954e8107ff70ca71d01b586d137490eeab20" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.255507 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"304f9e1fee6b648f61b3b334284a954e8107ff70ca71d01b586d137490eeab20"} err="failed to get container status \"304f9e1fee6b648f61b3b334284a954e8107ff70ca71d01b586d137490eeab20\": rpc error: code = NotFound desc = could not find container \"304f9e1fee6b648f61b3b334284a954e8107ff70ca71d01b586d137490eeab20\": container with ID starting with 304f9e1fee6b648f61b3b334284a954e8107ff70ca71d01b586d137490eeab20 not found: ID does not exist" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.255545 4823 scope.go:117] "RemoveContainer" containerID="1e3566766c9c24f5c12481b74ed8a3938b21bdb36c003146515c67170d74a71f" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 
07:10:21.255765 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e3566766c9c24f5c12481b74ed8a3938b21bdb36c003146515c67170d74a71f"} err="failed to get container status \"1e3566766c9c24f5c12481b74ed8a3938b21bdb36c003146515c67170d74a71f\": rpc error: code = NotFound desc = could not find container \"1e3566766c9c24f5c12481b74ed8a3938b21bdb36c003146515c67170d74a71f\": container with ID starting with 1e3566766c9c24f5c12481b74ed8a3938b21bdb36c003146515c67170d74a71f not found: ID does not exist" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.255800 4823 scope.go:117] "RemoveContainer" containerID="cc8a43f71797e49e9a777ee909b45ae50a30e76da5c4cd4c8ee62cd48a7917ee" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.256018 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc8a43f71797e49e9a777ee909b45ae50a30e76da5c4cd4c8ee62cd48a7917ee"} err="failed to get container status \"cc8a43f71797e49e9a777ee909b45ae50a30e76da5c4cd4c8ee62cd48a7917ee\": rpc error: code = NotFound desc = could not find container \"cc8a43f71797e49e9a777ee909b45ae50a30e76da5c4cd4c8ee62cd48a7917ee\": container with ID starting with cc8a43f71797e49e9a777ee909b45ae50a30e76da5c4cd4c8ee62cd48a7917ee not found: ID does not exist" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.256062 4823 scope.go:117] "RemoveContainer" containerID="0002b85e32dbacab67cb93ff21bf26130cbf98d65e88b13cfc00dc301fd4a3ec" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.256317 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0002b85e32dbacab67cb93ff21bf26130cbf98d65e88b13cfc00dc301fd4a3ec"} err="failed to get container status \"0002b85e32dbacab67cb93ff21bf26130cbf98d65e88b13cfc00dc301fd4a3ec\": rpc error: code = NotFound desc = could not find container \"0002b85e32dbacab67cb93ff21bf26130cbf98d65e88b13cfc00dc301fd4a3ec\": container with ID starting with 
0002b85e32dbacab67cb93ff21bf26130cbf98d65e88b13cfc00dc301fd4a3ec not found: ID does not exist" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.256358 4823 scope.go:117] "RemoveContainer" containerID="c223f5207e2445fd4bfd1bfe2c1c1ca50b5211a92b02c21445534fbfa2709e28" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.256617 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c223f5207e2445fd4bfd1bfe2c1c1ca50b5211a92b02c21445534fbfa2709e28"} err="failed to get container status \"c223f5207e2445fd4bfd1bfe2c1c1ca50b5211a92b02c21445534fbfa2709e28\": rpc error: code = NotFound desc = could not find container \"c223f5207e2445fd4bfd1bfe2c1c1ca50b5211a92b02c21445534fbfa2709e28\": container with ID starting with c223f5207e2445fd4bfd1bfe2c1c1ca50b5211a92b02c21445534fbfa2709e28 not found: ID does not exist" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.256642 4823 scope.go:117] "RemoveContainer" containerID="0f2cea674c0936cca4e9e95dfb7cd15bf9ccb3fbf8b711f351f87b632aad65c6" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.256818 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f2cea674c0936cca4e9e95dfb7cd15bf9ccb3fbf8b711f351f87b632aad65c6"} err="failed to get container status \"0f2cea674c0936cca4e9e95dfb7cd15bf9ccb3fbf8b711f351f87b632aad65c6\": rpc error: code = NotFound desc = could not find container \"0f2cea674c0936cca4e9e95dfb7cd15bf9ccb3fbf8b711f351f87b632aad65c6\": container with ID starting with 0f2cea674c0936cca4e9e95dfb7cd15bf9ccb3fbf8b711f351f87b632aad65c6 not found: ID does not exist" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.256838 4823 scope.go:117] "RemoveContainer" containerID="055585c1217ea720b929c9da82e7ea662d6fd5634244d362b3b425faee380d92" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.257268 4823 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"055585c1217ea720b929c9da82e7ea662d6fd5634244d362b3b425faee380d92"} err="failed to get container status \"055585c1217ea720b929c9da82e7ea662d6fd5634244d362b3b425faee380d92\": rpc error: code = NotFound desc = could not find container \"055585c1217ea720b929c9da82e7ea662d6fd5634244d362b3b425faee380d92\": container with ID starting with 055585c1217ea720b929c9da82e7ea662d6fd5634244d362b3b425faee380d92 not found: ID does not exist" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.257300 4823 scope.go:117] "RemoveContainer" containerID="b768d92db57918a41c53b83940957c0181865808eb2c795231a5f8dbbf2bde82" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.257533 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b768d92db57918a41c53b83940957c0181865808eb2c795231a5f8dbbf2bde82"} err="failed to get container status \"b768d92db57918a41c53b83940957c0181865808eb2c795231a5f8dbbf2bde82\": rpc error: code = NotFound desc = could not find container \"b768d92db57918a41c53b83940957c0181865808eb2c795231a5f8dbbf2bde82\": container with ID starting with b768d92db57918a41c53b83940957c0181865808eb2c795231a5f8dbbf2bde82 not found: ID does not exist" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.257555 4823 scope.go:117] "RemoveContainer" containerID="b3d0f51f7ac1df7a9ae61306ccbfe2c93b397b6ab9ae0a477ab4eb46248d36f1" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.257854 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3d0f51f7ac1df7a9ae61306ccbfe2c93b397b6ab9ae0a477ab4eb46248d36f1"} err="failed to get container status \"b3d0f51f7ac1df7a9ae61306ccbfe2c93b397b6ab9ae0a477ab4eb46248d36f1\": rpc error: code = NotFound desc = could not find container \"b3d0f51f7ac1df7a9ae61306ccbfe2c93b397b6ab9ae0a477ab4eb46248d36f1\": container with ID starting with b3d0f51f7ac1df7a9ae61306ccbfe2c93b397b6ab9ae0a477ab4eb46248d36f1 not found: ID does not 
exist" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.257883 4823 scope.go:117] "RemoveContainer" containerID="c1c6cb023dcc6d1b149de201b108e3b1f3e32530dadc822aedede5552efc586b" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.258166 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1c6cb023dcc6d1b149de201b108e3b1f3e32530dadc822aedede5552efc586b"} err="failed to get container status \"c1c6cb023dcc6d1b149de201b108e3b1f3e32530dadc822aedede5552efc586b\": rpc error: code = NotFound desc = could not find container \"c1c6cb023dcc6d1b149de201b108e3b1f3e32530dadc822aedede5552efc586b\": container with ID starting with c1c6cb023dcc6d1b149de201b108e3b1f3e32530dadc822aedede5552efc586b not found: ID does not exist" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.258196 4823 scope.go:117] "RemoveContainer" containerID="304f9e1fee6b648f61b3b334284a954e8107ff70ca71d01b586d137490eeab20" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.258409 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"304f9e1fee6b648f61b3b334284a954e8107ff70ca71d01b586d137490eeab20"} err="failed to get container status \"304f9e1fee6b648f61b3b334284a954e8107ff70ca71d01b586d137490eeab20\": rpc error: code = NotFound desc = could not find container \"304f9e1fee6b648f61b3b334284a954e8107ff70ca71d01b586d137490eeab20\": container with ID starting with 304f9e1fee6b648f61b3b334284a954e8107ff70ca71d01b586d137490eeab20 not found: ID does not exist" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.258445 4823 scope.go:117] "RemoveContainer" containerID="1e3566766c9c24f5c12481b74ed8a3938b21bdb36c003146515c67170d74a71f" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.258737 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e3566766c9c24f5c12481b74ed8a3938b21bdb36c003146515c67170d74a71f"} err="failed to get container status 
\"1e3566766c9c24f5c12481b74ed8a3938b21bdb36c003146515c67170d74a71f\": rpc error: code = NotFound desc = could not find container \"1e3566766c9c24f5c12481b74ed8a3938b21bdb36c003146515c67170d74a71f\": container with ID starting with 1e3566766c9c24f5c12481b74ed8a3938b21bdb36c003146515c67170d74a71f not found: ID does not exist" Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.372174 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-zwjhk"] Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.376603 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-zwjhk"] Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.404117 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-whkwc" Dec 16 07:10:21 crc kubenswrapper[4823]: W1216 07:10:21.427235 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod61bfc4d2_409e_41f4_a92d_cce68ecfa1a6.slice/crio-c92ca9376ddb52b083a4427e8543887f93db6759c69e00a99aa31d860fc2ff06 WatchSource:0}: Error finding container c92ca9376ddb52b083a4427e8543887f93db6759c69e00a99aa31d860fc2ff06: Status 404 returned error can't find the container with id c92ca9376ddb52b083a4427e8543887f93db6759c69e00a99aa31d860fc2ff06 Dec 16 07:10:21 crc kubenswrapper[4823]: I1216 07:10:21.779257 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08e48f89-7095-4ea2-afb5-759591c2b0d4" path="/var/lib/kubelet/pods/08e48f89-7095-4ea2-afb5-759591c2b0d4/volumes" Dec 16 07:10:22 crc kubenswrapper[4823]: I1216 07:10:22.036484 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-n248g_1b377757-dbc6-4d9c-9656-3ff65d7d113a/kube-multus/2.log" Dec 16 07:10:22 crc kubenswrapper[4823]: I1216 07:10:22.036591 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-n248g" event={"ID":"1b377757-dbc6-4d9c-9656-3ff65d7d113a","Type":"ContainerStarted","Data":"9a333bd41e10c80f838248ebba7427375e7ce7fb5703ca8b576f3a64747def2a"} Dec 16 07:10:22 crc kubenswrapper[4823]: I1216 07:10:22.038766 4823 generic.go:334] "Generic (PLEG): container finished" podID="61bfc4d2-409e-41f4-a92d-cce68ecfa1a6" containerID="94bb798aa94f983bbf7b96887ba0f9bf1d7fad37866ffb5cdbce5f8218e7487e" exitCode=0 Dec 16 07:10:22 crc kubenswrapper[4823]: I1216 07:10:22.038841 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whkwc" event={"ID":"61bfc4d2-409e-41f4-a92d-cce68ecfa1a6","Type":"ContainerDied","Data":"94bb798aa94f983bbf7b96887ba0f9bf1d7fad37866ffb5cdbce5f8218e7487e"} Dec 16 07:10:22 crc kubenswrapper[4823]: I1216 07:10:22.038890 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whkwc" event={"ID":"61bfc4d2-409e-41f4-a92d-cce68ecfa1a6","Type":"ContainerStarted","Data":"c92ca9376ddb52b083a4427e8543887f93db6759c69e00a99aa31d860fc2ff06"} Dec 16 07:10:23 crc kubenswrapper[4823]: I1216 07:10:23.049098 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whkwc" event={"ID":"61bfc4d2-409e-41f4-a92d-cce68ecfa1a6","Type":"ContainerStarted","Data":"b88a6fdaf271b5d2b28f200b699ff7659097e37a3a7388a4591beb03f4514c58"} Dec 16 07:10:23 crc kubenswrapper[4823]: I1216 07:10:23.049614 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whkwc" event={"ID":"61bfc4d2-409e-41f4-a92d-cce68ecfa1a6","Type":"ContainerStarted","Data":"9d5906979e86177f416f2bae0e900e62bd9e5029afd7878e91f8d8ba1b2c1812"} Dec 16 07:10:23 crc kubenswrapper[4823]: I1216 07:10:23.049625 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whkwc" 
event={"ID":"61bfc4d2-409e-41f4-a92d-cce68ecfa1a6","Type":"ContainerStarted","Data":"5e72bed073b1dfe52f6aea93066a1b751454926122340a8e2634dad9541873f7"} Dec 16 07:10:23 crc kubenswrapper[4823]: I1216 07:10:23.049634 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whkwc" event={"ID":"61bfc4d2-409e-41f4-a92d-cce68ecfa1a6","Type":"ContainerStarted","Data":"ad320d588e490bbddb303e349dc45fa568d0679e80593f25d6e1232c714cc269"} Dec 16 07:10:23 crc kubenswrapper[4823]: I1216 07:10:23.049643 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whkwc" event={"ID":"61bfc4d2-409e-41f4-a92d-cce68ecfa1a6","Type":"ContainerStarted","Data":"a80774c68bc249db6204cdace83de52650c9b19e7c548828a4decfd66604eb5a"} Dec 16 07:10:23 crc kubenswrapper[4823]: I1216 07:10:23.049651 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whkwc" event={"ID":"61bfc4d2-409e-41f4-a92d-cce68ecfa1a6","Type":"ContainerStarted","Data":"f8757cec15aee6fe13c16db6842945d588e95d8f663a2068322d5158aa9e4a38"} Dec 16 07:10:25 crc kubenswrapper[4823]: I1216 07:10:25.065337 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whkwc" event={"ID":"61bfc4d2-409e-41f4-a92d-cce68ecfa1a6","Type":"ContainerStarted","Data":"bdb4f18e53b3ae3c98981b5ce3141575b7dac5b42da268d45a6edb3a221c4460"} Dec 16 07:10:28 crc kubenswrapper[4823]: I1216 07:10:28.083390 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whkwc" event={"ID":"61bfc4d2-409e-41f4-a92d-cce68ecfa1a6","Type":"ContainerStarted","Data":"e00bdd7e6bc383c1ca981c1c71b927078a7fc092cffe3b9bdba8db3207c80aad"} Dec 16 07:10:28 crc kubenswrapper[4823]: I1216 07:10:28.083738 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-whkwc" Dec 16 07:10:28 crc kubenswrapper[4823]: I1216 07:10:28.083752 
4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-whkwc" Dec 16 07:10:28 crc kubenswrapper[4823]: I1216 07:10:28.083763 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-whkwc" Dec 16 07:10:28 crc kubenswrapper[4823]: I1216 07:10:28.113377 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-whkwc" podStartSLOduration=7.113358019 podStartE2EDuration="7.113358019s" podCreationTimestamp="2025-12-16 07:10:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 07:10:28.109909331 +0000 UTC m=+906.598475454" watchObservedRunningTime="2025-12-16 07:10:28.113358019 +0000 UTC m=+906.601924152" Dec 16 07:10:28 crc kubenswrapper[4823]: I1216 07:10:28.114864 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-whkwc" Dec 16 07:10:28 crc kubenswrapper[4823]: I1216 07:10:28.121104 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-whkwc" Dec 16 07:10:28 crc kubenswrapper[4823]: I1216 07:10:28.133801 4823 patch_prober.go:28] interesting pod/machine-config-daemon-fv56f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 07:10:28 crc kubenswrapper[4823]: I1216 07:10:28.133868 4823 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 
16 07:10:34 crc kubenswrapper[4823]: I1216 07:10:34.771385 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-7vbh4"
Dec 16 07:10:34 crc kubenswrapper[4823]: I1216 07:10:34.772793 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-7vbh4"
Dec 16 07:10:34 crc kubenswrapper[4823]: I1216 07:10:34.977940 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-7vbh4"]
Dec 16 07:10:34 crc kubenswrapper[4823]: W1216 07:10:34.983868 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod55889c97_d986_4a35_bbcf_af45ac5f9fe8.slice/crio-8775d38b01ce9ecb4875ef1351e47267dc4f3f8680319e69d600d50b8351733b WatchSource:0}: Error finding container 8775d38b01ce9ecb4875ef1351e47267dc4f3f8680319e69d600d50b8351733b: Status 404 returned error can't find the container with id 8775d38b01ce9ecb4875ef1351e47267dc4f3f8680319e69d600d50b8351733b
Dec 16 07:10:35 crc kubenswrapper[4823]: I1216 07:10:35.120835 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-7vbh4" event={"ID":"55889c97-d986-4a35-bbcf-af45ac5f9fe8","Type":"ContainerStarted","Data":"8775d38b01ce9ecb4875ef1351e47267dc4f3f8680319e69d600d50b8351733b"}
Dec 16 07:10:37 crc kubenswrapper[4823]: I1216 07:10:37.133581 4823 generic.go:334] "Generic (PLEG): container finished" podID="55889c97-d986-4a35-bbcf-af45ac5f9fe8" containerID="63017cb666caabb666f002c8b5a3b7bf50de129e0f932a0af8fdbeda724d4412" exitCode=0
Dec 16 07:10:37 crc kubenswrapper[4823]: I1216 07:10:37.133660 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-7vbh4" event={"ID":"55889c97-d986-4a35-bbcf-af45ac5f9fe8","Type":"ContainerDied","Data":"63017cb666caabb666f002c8b5a3b7bf50de129e0f932a0af8fdbeda724d4412"}
Dec 16 07:10:38 crc kubenswrapper[4823]: I1216 07:10:38.376760 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-7vbh4"
Dec 16 07:10:38 crc kubenswrapper[4823]: I1216 07:10:38.419421 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/55889c97-d986-4a35-bbcf-af45ac5f9fe8-node-mnt\") pod \"55889c97-d986-4a35-bbcf-af45ac5f9fe8\" (UID: \"55889c97-d986-4a35-bbcf-af45ac5f9fe8\") "
Dec 16 07:10:38 crc kubenswrapper[4823]: I1216 07:10:38.419483 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/55889c97-d986-4a35-bbcf-af45ac5f9fe8-crc-storage\") pod \"55889c97-d986-4a35-bbcf-af45ac5f9fe8\" (UID: \"55889c97-d986-4a35-bbcf-af45ac5f9fe8\") "
Dec 16 07:10:38 crc kubenswrapper[4823]: I1216 07:10:38.419507 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mlcs4\" (UniqueName: \"kubernetes.io/projected/55889c97-d986-4a35-bbcf-af45ac5f9fe8-kube-api-access-mlcs4\") pod \"55889c97-d986-4a35-bbcf-af45ac5f9fe8\" (UID: \"55889c97-d986-4a35-bbcf-af45ac5f9fe8\") "
Dec 16 07:10:38 crc kubenswrapper[4823]: I1216 07:10:38.419549 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/55889c97-d986-4a35-bbcf-af45ac5f9fe8-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "55889c97-d986-4a35-bbcf-af45ac5f9fe8" (UID: "55889c97-d986-4a35-bbcf-af45ac5f9fe8"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 16 07:10:38 crc kubenswrapper[4823]: I1216 07:10:38.419666 4823 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/55889c97-d986-4a35-bbcf-af45ac5f9fe8-node-mnt\") on node \"crc\" DevicePath \"\""
Dec 16 07:10:38 crc kubenswrapper[4823]: I1216 07:10:38.425012 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55889c97-d986-4a35-bbcf-af45ac5f9fe8-kube-api-access-mlcs4" (OuterVolumeSpecName: "kube-api-access-mlcs4") pod "55889c97-d986-4a35-bbcf-af45ac5f9fe8" (UID: "55889c97-d986-4a35-bbcf-af45ac5f9fe8"). InnerVolumeSpecName "kube-api-access-mlcs4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 07:10:38 crc kubenswrapper[4823]: I1216 07:10:38.435933 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55889c97-d986-4a35-bbcf-af45ac5f9fe8-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "55889c97-d986-4a35-bbcf-af45ac5f9fe8" (UID: "55889c97-d986-4a35-bbcf-af45ac5f9fe8"). InnerVolumeSpecName "crc-storage". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 16 07:10:38 crc kubenswrapper[4823]: I1216 07:10:38.521118 4823 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/55889c97-d986-4a35-bbcf-af45ac5f9fe8-crc-storage\") on node \"crc\" DevicePath \"\""
Dec 16 07:10:38 crc kubenswrapper[4823]: I1216 07:10:38.521164 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mlcs4\" (UniqueName: \"kubernetes.io/projected/55889c97-d986-4a35-bbcf-af45ac5f9fe8-kube-api-access-mlcs4\") on node \"crc\" DevicePath \"\""
Dec 16 07:10:39 crc kubenswrapper[4823]: I1216 07:10:39.148220 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-7vbh4" event={"ID":"55889c97-d986-4a35-bbcf-af45ac5f9fe8","Type":"ContainerDied","Data":"8775d38b01ce9ecb4875ef1351e47267dc4f3f8680319e69d600d50b8351733b"}
Dec 16 07:10:39 crc kubenswrapper[4823]: I1216 07:10:39.148265 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8775d38b01ce9ecb4875ef1351e47267dc4f3f8680319e69d600d50b8351733b"
Dec 16 07:10:39 crc kubenswrapper[4823]: I1216 07:10:39.148319 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-7vbh4"
Dec 16 07:10:46 crc kubenswrapper[4823]: I1216 07:10:46.117211 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8czvjc"]
Dec 16 07:10:46 crc kubenswrapper[4823]: E1216 07:10:46.118890 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55889c97-d986-4a35-bbcf-af45ac5f9fe8" containerName="storage"
Dec 16 07:10:46 crc kubenswrapper[4823]: I1216 07:10:46.118998 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="55889c97-d986-4a35-bbcf-af45ac5f9fe8" containerName="storage"
Dec 16 07:10:46 crc kubenswrapper[4823]: I1216 07:10:46.119204 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="55889c97-d986-4a35-bbcf-af45ac5f9fe8" containerName="storage"
Dec 16 07:10:46 crc kubenswrapper[4823]: I1216 07:10:46.120079 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8czvjc"
Dec 16 07:10:46 crc kubenswrapper[4823]: I1216 07:10:46.122557 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Dec 16 07:10:46 crc kubenswrapper[4823]: I1216 07:10:46.131151 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8czvjc"]
Dec 16 07:10:46 crc kubenswrapper[4823]: I1216 07:10:46.214674 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2a1962cd-dfaf-404b-8feb-44ee984181a7-bundle\") pod \"98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8czvjc\" (UID: \"2a1962cd-dfaf-404b-8feb-44ee984181a7\") " pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8czvjc"
Dec 16 07:10:46 crc kubenswrapper[4823]: I1216 07:10:46.214883 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sr8qm\" (UniqueName: \"kubernetes.io/projected/2a1962cd-dfaf-404b-8feb-44ee984181a7-kube-api-access-sr8qm\") pod \"98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8czvjc\" (UID: \"2a1962cd-dfaf-404b-8feb-44ee984181a7\") " pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8czvjc"
Dec 16 07:10:46 crc kubenswrapper[4823]: I1216 07:10:46.215349 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2a1962cd-dfaf-404b-8feb-44ee984181a7-util\") pod \"98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8czvjc\" (UID: \"2a1962cd-dfaf-404b-8feb-44ee984181a7\") " pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8czvjc"
Dec 16 07:10:46 crc kubenswrapper[4823]: I1216 07:10:46.316314 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2a1962cd-dfaf-404b-8feb-44ee984181a7-util\") pod \"98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8czvjc\" (UID: \"2a1962cd-dfaf-404b-8feb-44ee984181a7\") " pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8czvjc"
Dec 16 07:10:46 crc kubenswrapper[4823]: I1216 07:10:46.316593 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2a1962cd-dfaf-404b-8feb-44ee984181a7-bundle\") pod \"98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8czvjc\" (UID: \"2a1962cd-dfaf-404b-8feb-44ee984181a7\") " pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8czvjc"
Dec 16 07:10:46 crc kubenswrapper[4823]: I1216 07:10:46.317047 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sr8qm\" (UniqueName: \"kubernetes.io/projected/2a1962cd-dfaf-404b-8feb-44ee984181a7-kube-api-access-sr8qm\") pod \"98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8czvjc\" (UID: \"2a1962cd-dfaf-404b-8feb-44ee984181a7\") " pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8czvjc"
Dec 16 07:10:46 crc kubenswrapper[4823]: I1216 07:10:46.316979 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2a1962cd-dfaf-404b-8feb-44ee984181a7-util\") pod \"98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8czvjc\" (UID: \"2a1962cd-dfaf-404b-8feb-44ee984181a7\") " pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8czvjc"
Dec 16 07:10:46 crc kubenswrapper[4823]: I1216 07:10:46.316994 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2a1962cd-dfaf-404b-8feb-44ee984181a7-bundle\") pod \"98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8czvjc\" (UID: \"2a1962cd-dfaf-404b-8feb-44ee984181a7\") " pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8czvjc"
Dec 16 07:10:46 crc kubenswrapper[4823]: I1216 07:10:46.336746 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sr8qm\" (UniqueName: \"kubernetes.io/projected/2a1962cd-dfaf-404b-8feb-44ee984181a7-kube-api-access-sr8qm\") pod \"98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8czvjc\" (UID: \"2a1962cd-dfaf-404b-8feb-44ee984181a7\") " pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8czvjc"
Dec 16 07:10:46 crc kubenswrapper[4823]: I1216 07:10:46.478728 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8czvjc"
Dec 16 07:10:46 crc kubenswrapper[4823]: I1216 07:10:46.689445 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8czvjc"]
Dec 16 07:10:47 crc kubenswrapper[4823]: I1216 07:10:47.190983 4823 generic.go:334] "Generic (PLEG): container finished" podID="2a1962cd-dfaf-404b-8feb-44ee984181a7" containerID="6af2a204b3bab07fbe14d28f63207423bd7064848e31598b79278eda4cd7444c" exitCode=0
Dec 16 07:10:47 crc kubenswrapper[4823]: I1216 07:10:47.191067 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8czvjc" event={"ID":"2a1962cd-dfaf-404b-8feb-44ee984181a7","Type":"ContainerDied","Data":"6af2a204b3bab07fbe14d28f63207423bd7064848e31598b79278eda4cd7444c"}
Dec 16 07:10:47 crc kubenswrapper[4823]: I1216 07:10:47.191286 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8czvjc" event={"ID":"2a1962cd-dfaf-404b-8feb-44ee984181a7","Type":"ContainerStarted","Data":"8f7f1020c6cc6074d5101a20e498c7b5b19aa8def37f7a81a4f66d4f441ad252"}
Dec 16 07:10:48 crc kubenswrapper[4823]: I1216 07:10:48.483110 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-smd5r"]
Dec 16 07:10:48 crc kubenswrapper[4823]: I1216 07:10:48.494298 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-smd5r"
Dec 16 07:10:48 crc kubenswrapper[4823]: I1216 07:10:48.494696 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-smd5r"]
Dec 16 07:10:48 crc kubenswrapper[4823]: I1216 07:10:48.547103 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/930a1160-d9da-4174-b2c0-d38587589fe7-catalog-content\") pod \"redhat-operators-smd5r\" (UID: \"930a1160-d9da-4174-b2c0-d38587589fe7\") " pod="openshift-marketplace/redhat-operators-smd5r"
Dec 16 07:10:48 crc kubenswrapper[4823]: I1216 07:10:48.547523 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/930a1160-d9da-4174-b2c0-d38587589fe7-utilities\") pod \"redhat-operators-smd5r\" (UID: \"930a1160-d9da-4174-b2c0-d38587589fe7\") " pod="openshift-marketplace/redhat-operators-smd5r"
Dec 16 07:10:48 crc kubenswrapper[4823]: I1216 07:10:48.547635 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4l62\" (UniqueName: \"kubernetes.io/projected/930a1160-d9da-4174-b2c0-d38587589fe7-kube-api-access-p4l62\") pod \"redhat-operators-smd5r\" (UID: \"930a1160-d9da-4174-b2c0-d38587589fe7\") " pod="openshift-marketplace/redhat-operators-smd5r"
Dec 16 07:10:48 crc kubenswrapper[4823]: I1216 07:10:48.649005 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/930a1160-d9da-4174-b2c0-d38587589fe7-catalog-content\") pod \"redhat-operators-smd5r\" (UID: \"930a1160-d9da-4174-b2c0-d38587589fe7\") " pod="openshift-marketplace/redhat-operators-smd5r"
Dec 16 07:10:48 crc kubenswrapper[4823]: I1216 07:10:48.649115 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/930a1160-d9da-4174-b2c0-d38587589fe7-utilities\") pod \"redhat-operators-smd5r\" (UID: \"930a1160-d9da-4174-b2c0-d38587589fe7\") " pod="openshift-marketplace/redhat-operators-smd5r"
Dec 16 07:10:48 crc kubenswrapper[4823]: I1216 07:10:48.649139 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4l62\" (UniqueName: \"kubernetes.io/projected/930a1160-d9da-4174-b2c0-d38587589fe7-kube-api-access-p4l62\") pod \"redhat-operators-smd5r\" (UID: \"930a1160-d9da-4174-b2c0-d38587589fe7\") " pod="openshift-marketplace/redhat-operators-smd5r"
Dec 16 07:10:48 crc kubenswrapper[4823]: I1216 07:10:48.649711 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/930a1160-d9da-4174-b2c0-d38587589fe7-utilities\") pod \"redhat-operators-smd5r\" (UID: \"930a1160-d9da-4174-b2c0-d38587589fe7\") " pod="openshift-marketplace/redhat-operators-smd5r"
Dec 16 07:10:48 crc kubenswrapper[4823]: I1216 07:10:48.649716 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/930a1160-d9da-4174-b2c0-d38587589fe7-catalog-content\") pod \"redhat-operators-smd5r\" (UID: \"930a1160-d9da-4174-b2c0-d38587589fe7\") " pod="openshift-marketplace/redhat-operators-smd5r"
Dec 16 07:10:48 crc kubenswrapper[4823]: I1216 07:10:48.668091 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4l62\" (UniqueName: \"kubernetes.io/projected/930a1160-d9da-4174-b2c0-d38587589fe7-kube-api-access-p4l62\") pod \"redhat-operators-smd5r\" (UID: \"930a1160-d9da-4174-b2c0-d38587589fe7\") " pod="openshift-marketplace/redhat-operators-smd5r"
Dec 16 07:10:48 crc kubenswrapper[4823]: I1216 07:10:48.821885 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-smd5r"
Dec 16 07:10:49 crc kubenswrapper[4823]: I1216 07:10:49.024404 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-smd5r"]
Dec 16 07:10:49 crc kubenswrapper[4823]: W1216 07:10:49.042717 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod930a1160_d9da_4174_b2c0_d38587589fe7.slice/crio-774fb78556cb785c7a1286e827892084acb7768a356d365c050a3098d9d0dc80 WatchSource:0}: Error finding container 774fb78556cb785c7a1286e827892084acb7768a356d365c050a3098d9d0dc80: Status 404 returned error can't find the container with id 774fb78556cb785c7a1286e827892084acb7768a356d365c050a3098d9d0dc80
Dec 16 07:10:49 crc kubenswrapper[4823]: I1216 07:10:49.203854 4823 generic.go:334] "Generic (PLEG): container finished" podID="2a1962cd-dfaf-404b-8feb-44ee984181a7" containerID="fb062579bd72a62543caf8e16fe65e6777ab19efaceba39c06890840a094c0a5" exitCode=0
Dec 16 07:10:49 crc kubenswrapper[4823]: I1216 07:10:49.203948 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8czvjc" event={"ID":"2a1962cd-dfaf-404b-8feb-44ee984181a7","Type":"ContainerDied","Data":"fb062579bd72a62543caf8e16fe65e6777ab19efaceba39c06890840a094c0a5"}
Dec 16 07:10:49 crc kubenswrapper[4823]: I1216 07:10:49.207341 4823 generic.go:334] "Generic (PLEG): container finished" podID="930a1160-d9da-4174-b2c0-d38587589fe7" containerID="effdd05bfc083af34c8b5c19ec787742c20f1d2edd7949bedc8efd0065d0a230" exitCode=0
Dec 16 07:10:49 crc kubenswrapper[4823]: I1216 07:10:49.207381 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-smd5r" event={"ID":"930a1160-d9da-4174-b2c0-d38587589fe7","Type":"ContainerDied","Data":"effdd05bfc083af34c8b5c19ec787742c20f1d2edd7949bedc8efd0065d0a230"}
Dec 16 07:10:49 crc kubenswrapper[4823]: I1216 07:10:49.207406 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-smd5r" event={"ID":"930a1160-d9da-4174-b2c0-d38587589fe7","Type":"ContainerStarted","Data":"774fb78556cb785c7a1286e827892084acb7768a356d365c050a3098d9d0dc80"}
Dec 16 07:10:50 crc kubenswrapper[4823]: I1216 07:10:50.214322 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-smd5r" event={"ID":"930a1160-d9da-4174-b2c0-d38587589fe7","Type":"ContainerStarted","Data":"dcf1995626e9241beb797658679e16d7bdd2ab851162b73a7ab360c7145513f4"}
Dec 16 07:10:50 crc kubenswrapper[4823]: I1216 07:10:50.218732 4823 generic.go:334] "Generic (PLEG): container finished" podID="2a1962cd-dfaf-404b-8feb-44ee984181a7" containerID="967cec468aa3947a1333da8bfa0fc79cb97a327ed1e984a611f997f9e8070a90" exitCode=0
Dec 16 07:10:50 crc kubenswrapper[4823]: I1216 07:10:50.218772 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8czvjc" event={"ID":"2a1962cd-dfaf-404b-8feb-44ee984181a7","Type":"ContainerDied","Data":"967cec468aa3947a1333da8bfa0fc79cb97a327ed1e984a611f997f9e8070a90"}
Dec 16 07:10:51 crc kubenswrapper[4823]: I1216 07:10:51.227961 4823 generic.go:334] "Generic (PLEG): container finished" podID="930a1160-d9da-4174-b2c0-d38587589fe7" containerID="dcf1995626e9241beb797658679e16d7bdd2ab851162b73a7ab360c7145513f4" exitCode=0
Dec 16 07:10:51 crc kubenswrapper[4823]: I1216 07:10:51.228099 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-smd5r" event={"ID":"930a1160-d9da-4174-b2c0-d38587589fe7","Type":"ContainerDied","Data":"dcf1995626e9241beb797658679e16d7bdd2ab851162b73a7ab360c7145513f4"}
Dec 16 07:10:51 crc kubenswrapper[4823]: I1216 07:10:51.427812 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-whkwc"
Dec 16 07:10:51 crc kubenswrapper[4823]: I1216 07:10:51.438096 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8czvjc"
Dec 16 07:10:51 crc kubenswrapper[4823]: I1216 07:10:51.487102 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2a1962cd-dfaf-404b-8feb-44ee984181a7-bundle\") pod \"2a1962cd-dfaf-404b-8feb-44ee984181a7\" (UID: \"2a1962cd-dfaf-404b-8feb-44ee984181a7\") "
Dec 16 07:10:51 crc kubenswrapper[4823]: I1216 07:10:51.487167 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sr8qm\" (UniqueName: \"kubernetes.io/projected/2a1962cd-dfaf-404b-8feb-44ee984181a7-kube-api-access-sr8qm\") pod \"2a1962cd-dfaf-404b-8feb-44ee984181a7\" (UID: \"2a1962cd-dfaf-404b-8feb-44ee984181a7\") "
Dec 16 07:10:51 crc kubenswrapper[4823]: I1216 07:10:51.487209 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2a1962cd-dfaf-404b-8feb-44ee984181a7-util\") pod \"2a1962cd-dfaf-404b-8feb-44ee984181a7\" (UID: \"2a1962cd-dfaf-404b-8feb-44ee984181a7\") "
Dec 16 07:10:51 crc kubenswrapper[4823]: I1216 07:10:51.488546 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a1962cd-dfaf-404b-8feb-44ee984181a7-bundle" (OuterVolumeSpecName: "bundle") pod "2a1962cd-dfaf-404b-8feb-44ee984181a7" (UID: "2a1962cd-dfaf-404b-8feb-44ee984181a7"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 16 07:10:51 crc kubenswrapper[4823]: I1216 07:10:51.492681 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a1962cd-dfaf-404b-8feb-44ee984181a7-kube-api-access-sr8qm" (OuterVolumeSpecName: "kube-api-access-sr8qm") pod "2a1962cd-dfaf-404b-8feb-44ee984181a7" (UID: "2a1962cd-dfaf-404b-8feb-44ee984181a7"). InnerVolumeSpecName "kube-api-access-sr8qm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 07:10:51 crc kubenswrapper[4823]: I1216 07:10:51.516530 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a1962cd-dfaf-404b-8feb-44ee984181a7-util" (OuterVolumeSpecName: "util") pod "2a1962cd-dfaf-404b-8feb-44ee984181a7" (UID: "2a1962cd-dfaf-404b-8feb-44ee984181a7"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 16 07:10:51 crc kubenswrapper[4823]: I1216 07:10:51.589389 4823 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2a1962cd-dfaf-404b-8feb-44ee984181a7-util\") on node \"crc\" DevicePath \"\""
Dec 16 07:10:51 crc kubenswrapper[4823]: I1216 07:10:51.589793 4823 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2a1962cd-dfaf-404b-8feb-44ee984181a7-bundle\") on node \"crc\" DevicePath \"\""
Dec 16 07:10:51 crc kubenswrapper[4823]: I1216 07:10:51.589816 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sr8qm\" (UniqueName: \"kubernetes.io/projected/2a1962cd-dfaf-404b-8feb-44ee984181a7-kube-api-access-sr8qm\") on node \"crc\" DevicePath \"\""
Dec 16 07:10:52 crc kubenswrapper[4823]: I1216 07:10:52.237015 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8czvjc" event={"ID":"2a1962cd-dfaf-404b-8feb-44ee984181a7","Type":"ContainerDied","Data":"8f7f1020c6cc6074d5101a20e498c7b5b19aa8def37f7a81a4f66d4f441ad252"}
Dec 16 07:10:52 crc kubenswrapper[4823]: I1216 07:10:52.237096 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8czvjc"
Dec 16 07:10:52 crc kubenswrapper[4823]: I1216 07:10:52.237102 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8f7f1020c6cc6074d5101a20e498c7b5b19aa8def37f7a81a4f66d4f441ad252"
Dec 16 07:10:52 crc kubenswrapper[4823]: I1216 07:10:52.242480 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-smd5r" event={"ID":"930a1160-d9da-4174-b2c0-d38587589fe7","Type":"ContainerStarted","Data":"943c99483c4459aa62cfbdaed3b0caeffe9e262da3e4d70299f4e4f30038cf5d"}
Dec 16 07:10:52 crc kubenswrapper[4823]: I1216 07:10:52.276594 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-smd5r" podStartSLOduration=1.769393031 podStartE2EDuration="4.276575169s" podCreationTimestamp="2025-12-16 07:10:48 +0000 UTC" firstStartedPulling="2025-12-16 07:10:49.210054377 +0000 UTC m=+927.698620500" lastFinishedPulling="2025-12-16 07:10:51.717236515 +0000 UTC m=+930.205802638" observedRunningTime="2025-12-16 07:10:52.27248441 +0000 UTC m=+930.761050543" watchObservedRunningTime="2025-12-16 07:10:52.276575169 +0000 UTC m=+930.765141302"
Dec 16 07:10:56 crc kubenswrapper[4823]: I1216 07:10:56.170017 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-6769fb99d-npfvt"]
Dec 16 07:10:56 crc kubenswrapper[4823]: E1216 07:10:56.170619 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a1962cd-dfaf-404b-8feb-44ee984181a7" containerName="extract"
Dec 16 07:10:56 crc kubenswrapper[4823]: I1216 07:10:56.170637 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a1962cd-dfaf-404b-8feb-44ee984181a7" containerName="extract"
Dec 16 07:10:56 crc kubenswrapper[4823]: E1216 07:10:56.170647 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a1962cd-dfaf-404b-8feb-44ee984181a7" containerName="util"
Dec 16 07:10:56 crc kubenswrapper[4823]: I1216 07:10:56.170655 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a1962cd-dfaf-404b-8feb-44ee984181a7" containerName="util"
Dec 16 07:10:56 crc kubenswrapper[4823]: E1216 07:10:56.170673 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a1962cd-dfaf-404b-8feb-44ee984181a7" containerName="pull"
Dec 16 07:10:56 crc kubenswrapper[4823]: I1216 07:10:56.170681 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a1962cd-dfaf-404b-8feb-44ee984181a7" containerName="pull"
Dec 16 07:10:56 crc kubenswrapper[4823]: I1216 07:10:56.170788 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a1962cd-dfaf-404b-8feb-44ee984181a7" containerName="extract"
Dec 16 07:10:56 crc kubenswrapper[4823]: I1216 07:10:56.171278 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-6769fb99d-npfvt"
Dec 16 07:10:56 crc kubenswrapper[4823]: I1216 07:10:56.173712 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt"
Dec 16 07:10:56 crc kubenswrapper[4823]: I1216 07:10:56.173772 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-2b8df"
Dec 16 07:10:56 crc kubenswrapper[4823]: I1216 07:10:56.180088 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt"
Dec 16 07:10:56 crc kubenswrapper[4823]: I1216 07:10:56.181730 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-6769fb99d-npfvt"]
Dec 16 07:10:56 crc kubenswrapper[4823]: I1216 07:10:56.241576 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lspv\" (UniqueName: \"kubernetes.io/projected/0b896898-046a-48fa-bf48-cab19132c8e2-kube-api-access-7lspv\") pod \"nmstate-operator-6769fb99d-npfvt\" (UID: \"0b896898-046a-48fa-bf48-cab19132c8e2\") " pod="openshift-nmstate/nmstate-operator-6769fb99d-npfvt"
Dec 16 07:10:56 crc kubenswrapper[4823]: I1216 07:10:56.343329 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7lspv\" (UniqueName: \"kubernetes.io/projected/0b896898-046a-48fa-bf48-cab19132c8e2-kube-api-access-7lspv\") pod \"nmstate-operator-6769fb99d-npfvt\" (UID: \"0b896898-046a-48fa-bf48-cab19132c8e2\") " pod="openshift-nmstate/nmstate-operator-6769fb99d-npfvt"
Dec 16 07:10:56 crc kubenswrapper[4823]: I1216 07:10:56.363554 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lspv\" (UniqueName: \"kubernetes.io/projected/0b896898-046a-48fa-bf48-cab19132c8e2-kube-api-access-7lspv\") pod \"nmstate-operator-6769fb99d-npfvt\" (UID: \"0b896898-046a-48fa-bf48-cab19132c8e2\") " pod="openshift-nmstate/nmstate-operator-6769fb99d-npfvt"
Dec 16 07:10:56 crc kubenswrapper[4823]: I1216 07:10:56.485335 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-6769fb99d-npfvt"
Dec 16 07:10:56 crc kubenswrapper[4823]: I1216 07:10:56.702642 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-6769fb99d-npfvt"]
Dec 16 07:10:57 crc kubenswrapper[4823]: I1216 07:10:57.267857 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-6769fb99d-npfvt" event={"ID":"0b896898-046a-48fa-bf48-cab19132c8e2","Type":"ContainerStarted","Data":"7340cc8595d38c29430bb77fba8ebb72a8b8bf12bad67dbd09416ca2d6a8fdef"}
Dec 16 07:10:58 crc kubenswrapper[4823]: I1216 07:10:58.133872 4823 patch_prober.go:28] interesting pod/machine-config-daemon-fv56f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 16 07:10:58 crc kubenswrapper[4823]: I1216 07:10:58.134292 4823 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 16 07:10:58 crc kubenswrapper[4823]: I1216 07:10:58.823084 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-smd5r"
Dec 16 07:10:58 crc kubenswrapper[4823]: I1216 07:10:58.823197 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-smd5r"
Dec 16 07:10:58 crc kubenswrapper[4823]: I1216 07:10:58.855910 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-smd5r"
Dec 16 07:10:59 crc kubenswrapper[4823]: I1216 07:10:59.329574 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-smd5r"
Dec 16 07:11:00 crc kubenswrapper[4823]: I1216 07:11:00.270532 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-smd5r"]
Dec 16 07:11:00 crc kubenswrapper[4823]: I1216 07:11:00.286286 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-6769fb99d-npfvt" event={"ID":"0b896898-046a-48fa-bf48-cab19132c8e2","Type":"ContainerStarted","Data":"c9746523c142a0f2ec1a309d71af81bf3ad6d330aad6f424a89fc2bc0b24dc07"}
Dec 16 07:11:00 crc kubenswrapper[4823]: I1216 07:11:00.315270 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-6769fb99d-npfvt" podStartSLOduration=1.673152889 podStartE2EDuration="4.315236135s" podCreationTimestamp="2025-12-16 07:10:56 +0000 UTC" firstStartedPulling="2025-12-16 07:10:56.715666484 +0000 UTC m=+935.204232607" lastFinishedPulling="2025-12-16 07:10:59.35774973 +0000 UTC m=+937.846315853" observedRunningTime="2025-12-16 07:11:00.308561203 +0000 UTC m=+938.797127376" watchObservedRunningTime="2025-12-16 07:11:00.315236135 +0000 UTC m=+938.803802258"
Dec 16 07:11:01 crc kubenswrapper[4823]: I1216 07:11:01.297581 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-smd5r" podUID="930a1160-d9da-4174-b2c0-d38587589fe7" containerName="registry-server" containerID="cri-o://943c99483c4459aa62cfbdaed3b0caeffe9e262da3e4d70299f4e4f30038cf5d" gracePeriod=2
Dec 16 07:11:01 crc kubenswrapper[4823]: I1216 07:11:01.619011 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-smd5r"
Dec 16 07:11:01 crc kubenswrapper[4823]: I1216 07:11:01.706188 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p4l62\" (UniqueName: \"kubernetes.io/projected/930a1160-d9da-4174-b2c0-d38587589fe7-kube-api-access-p4l62\") pod \"930a1160-d9da-4174-b2c0-d38587589fe7\" (UID: \"930a1160-d9da-4174-b2c0-d38587589fe7\") "
Dec 16 07:11:01 crc kubenswrapper[4823]: I1216 07:11:01.706243 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/930a1160-d9da-4174-b2c0-d38587589fe7-utilities\") pod \"930a1160-d9da-4174-b2c0-d38587589fe7\" (UID: \"930a1160-d9da-4174-b2c0-d38587589fe7\") "
Dec 16 07:11:01 crc kubenswrapper[4823]: I1216 07:11:01.706346 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/930a1160-d9da-4174-b2c0-d38587589fe7-catalog-content\") pod \"930a1160-d9da-4174-b2c0-d38587589fe7\" (UID: \"930a1160-d9da-4174-b2c0-d38587589fe7\") "
Dec 16 07:11:01 crc kubenswrapper[4823]: I1216 07:11:01.707227 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/930a1160-d9da-4174-b2c0-d38587589fe7-utilities" (OuterVolumeSpecName: "utilities") pod "930a1160-d9da-4174-b2c0-d38587589fe7" (UID: "930a1160-d9da-4174-b2c0-d38587589fe7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 16 07:11:01 crc kubenswrapper[4823]: I1216 07:11:01.712761 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/930a1160-d9da-4174-b2c0-d38587589fe7-kube-api-access-p4l62" (OuterVolumeSpecName: "kube-api-access-p4l62") pod "930a1160-d9da-4174-b2c0-d38587589fe7" (UID: "930a1160-d9da-4174-b2c0-d38587589fe7"). InnerVolumeSpecName "kube-api-access-p4l62". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 07:11:01 crc kubenswrapper[4823]: I1216 07:11:01.807417 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p4l62\" (UniqueName: \"kubernetes.io/projected/930a1160-d9da-4174-b2c0-d38587589fe7-kube-api-access-p4l62\") on node \"crc\" DevicePath \"\""
Dec 16 07:11:01 crc kubenswrapper[4823]: I1216 07:11:01.807444 4823 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/930a1160-d9da-4174-b2c0-d38587589fe7-utilities\") on node \"crc\" DevicePath \"\""
Dec 16 07:11:01 crc kubenswrapper[4823]: I1216 07:11:01.829255 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/930a1160-d9da-4174-b2c0-d38587589fe7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "930a1160-d9da-4174-b2c0-d38587589fe7" (UID: "930a1160-d9da-4174-b2c0-d38587589fe7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 16 07:11:01 crc kubenswrapper[4823]: I1216 07:11:01.907987 4823 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/930a1160-d9da-4174-b2c0-d38587589fe7-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 16 07:11:02 crc kubenswrapper[4823]: I1216 07:11:02.305459 4823 generic.go:334] "Generic (PLEG): container finished" podID="930a1160-d9da-4174-b2c0-d38587589fe7" containerID="943c99483c4459aa62cfbdaed3b0caeffe9e262da3e4d70299f4e4f30038cf5d" exitCode=0
Dec 16 07:11:02 crc kubenswrapper[4823]: I1216 07:11:02.305516 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-smd5r" event={"ID":"930a1160-d9da-4174-b2c0-d38587589fe7","Type":"ContainerDied","Data":"943c99483c4459aa62cfbdaed3b0caeffe9e262da3e4d70299f4e4f30038cf5d"}
Dec 16 07:11:02 crc kubenswrapper[4823]: I1216 07:11:02.305535 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-smd5r"
Dec 16 07:11:02 crc kubenswrapper[4823]: I1216 07:11:02.305563 4823 scope.go:117] "RemoveContainer" containerID="943c99483c4459aa62cfbdaed3b0caeffe9e262da3e4d70299f4e4f30038cf5d"
Dec 16 07:11:02 crc kubenswrapper[4823]: I1216 07:11:02.305548 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-smd5r" event={"ID":"930a1160-d9da-4174-b2c0-d38587589fe7","Type":"ContainerDied","Data":"774fb78556cb785c7a1286e827892084acb7768a356d365c050a3098d9d0dc80"}
Dec 16 07:11:02 crc kubenswrapper[4823]: I1216 07:11:02.321145 4823 scope.go:117] "RemoveContainer" containerID="dcf1995626e9241beb797658679e16d7bdd2ab851162b73a7ab360c7145513f4"
Dec 16 07:11:02 crc kubenswrapper[4823]: I1216 07:11:02.338884 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-smd5r"]
Dec 16 07:11:02 crc kubenswrapper[4823]: I1216 07:11:02.342571 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-smd5r"]
Dec 16 07:11:02 crc kubenswrapper[4823]: I1216 07:11:02.342653 4823 scope.go:117] "RemoveContainer" containerID="effdd05bfc083af34c8b5c19ec787742c20f1d2edd7949bedc8efd0065d0a230"
Dec 16 07:11:02 crc kubenswrapper[4823]: I1216 07:11:02.371182 4823 scope.go:117] "RemoveContainer" containerID="943c99483c4459aa62cfbdaed3b0caeffe9e262da3e4d70299f4e4f30038cf5d"
Dec 16 07:11:02 crc kubenswrapper[4823]: E1216 07:11:02.371694 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"943c99483c4459aa62cfbdaed3b0caeffe9e262da3e4d70299f4e4f30038cf5d\": container with ID starting with 943c99483c4459aa62cfbdaed3b0caeffe9e262da3e4d70299f4e4f30038cf5d not found: ID does not exist" containerID="943c99483c4459aa62cfbdaed3b0caeffe9e262da3e4d70299f4e4f30038cf5d"
Dec 16 07:11:02 crc kubenswrapper[4823]: I1216 07:11:02.371739 4823
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"943c99483c4459aa62cfbdaed3b0caeffe9e262da3e4d70299f4e4f30038cf5d"} err="failed to get container status \"943c99483c4459aa62cfbdaed3b0caeffe9e262da3e4d70299f4e4f30038cf5d\": rpc error: code = NotFound desc = could not find container \"943c99483c4459aa62cfbdaed3b0caeffe9e262da3e4d70299f4e4f30038cf5d\": container with ID starting with 943c99483c4459aa62cfbdaed3b0caeffe9e262da3e4d70299f4e4f30038cf5d not found: ID does not exist" Dec 16 07:11:02 crc kubenswrapper[4823]: I1216 07:11:02.371766 4823 scope.go:117] "RemoveContainer" containerID="dcf1995626e9241beb797658679e16d7bdd2ab851162b73a7ab360c7145513f4" Dec 16 07:11:02 crc kubenswrapper[4823]: E1216 07:11:02.372107 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dcf1995626e9241beb797658679e16d7bdd2ab851162b73a7ab360c7145513f4\": container with ID starting with dcf1995626e9241beb797658679e16d7bdd2ab851162b73a7ab360c7145513f4 not found: ID does not exist" containerID="dcf1995626e9241beb797658679e16d7bdd2ab851162b73a7ab360c7145513f4" Dec 16 07:11:02 crc kubenswrapper[4823]: I1216 07:11:02.372140 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dcf1995626e9241beb797658679e16d7bdd2ab851162b73a7ab360c7145513f4"} err="failed to get container status \"dcf1995626e9241beb797658679e16d7bdd2ab851162b73a7ab360c7145513f4\": rpc error: code = NotFound desc = could not find container \"dcf1995626e9241beb797658679e16d7bdd2ab851162b73a7ab360c7145513f4\": container with ID starting with dcf1995626e9241beb797658679e16d7bdd2ab851162b73a7ab360c7145513f4 not found: ID does not exist" Dec 16 07:11:02 crc kubenswrapper[4823]: I1216 07:11:02.372166 4823 scope.go:117] "RemoveContainer" containerID="effdd05bfc083af34c8b5c19ec787742c20f1d2edd7949bedc8efd0065d0a230" Dec 16 07:11:02 crc kubenswrapper[4823]: E1216 
07:11:02.372584 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"effdd05bfc083af34c8b5c19ec787742c20f1d2edd7949bedc8efd0065d0a230\": container with ID starting with effdd05bfc083af34c8b5c19ec787742c20f1d2edd7949bedc8efd0065d0a230 not found: ID does not exist" containerID="effdd05bfc083af34c8b5c19ec787742c20f1d2edd7949bedc8efd0065d0a230" Dec 16 07:11:02 crc kubenswrapper[4823]: I1216 07:11:02.372626 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"effdd05bfc083af34c8b5c19ec787742c20f1d2edd7949bedc8efd0065d0a230"} err="failed to get container status \"effdd05bfc083af34c8b5c19ec787742c20f1d2edd7949bedc8efd0065d0a230\": rpc error: code = NotFound desc = could not find container \"effdd05bfc083af34c8b5c19ec787742c20f1d2edd7949bedc8efd0065d0a230\": container with ID starting with effdd05bfc083af34c8b5c19ec787742c20f1d2edd7949bedc8efd0065d0a230 not found: ID does not exist" Dec 16 07:11:03 crc kubenswrapper[4823]: I1216 07:11:03.777258 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="930a1160-d9da-4174-b2c0-d38587589fe7" path="/var/lib/kubelet/pods/930a1160-d9da-4174-b2c0-d38587589fe7/volumes" Dec 16 07:11:06 crc kubenswrapper[4823]: I1216 07:11:06.213981 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-7f7f7578db-prt24"] Dec 16 07:11:06 crc kubenswrapper[4823]: E1216 07:11:06.214279 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="930a1160-d9da-4174-b2c0-d38587589fe7" containerName="registry-server" Dec 16 07:11:06 crc kubenswrapper[4823]: I1216 07:11:06.214295 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="930a1160-d9da-4174-b2c0-d38587589fe7" containerName="registry-server" Dec 16 07:11:06 crc kubenswrapper[4823]: E1216 07:11:06.214317 4823 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="930a1160-d9da-4174-b2c0-d38587589fe7" containerName="extract-content" Dec 16 07:11:06 crc kubenswrapper[4823]: I1216 07:11:06.214325 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="930a1160-d9da-4174-b2c0-d38587589fe7" containerName="extract-content" Dec 16 07:11:06 crc kubenswrapper[4823]: E1216 07:11:06.214337 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="930a1160-d9da-4174-b2c0-d38587589fe7" containerName="extract-utilities" Dec 16 07:11:06 crc kubenswrapper[4823]: I1216 07:11:06.214347 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="930a1160-d9da-4174-b2c0-d38587589fe7" containerName="extract-utilities" Dec 16 07:11:06 crc kubenswrapper[4823]: I1216 07:11:06.214494 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="930a1160-d9da-4174-b2c0-d38587589fe7" containerName="registry-server" Dec 16 07:11:06 crc kubenswrapper[4823]: I1216 07:11:06.215200 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f7f7578db-prt24" Dec 16 07:11:06 crc kubenswrapper[4823]: I1216 07:11:06.219327 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-7mc6h" Dec 16 07:11:06 crc kubenswrapper[4823]: I1216 07:11:06.226718 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-f8fb84555-4krvw"] Dec 16 07:11:06 crc kubenswrapper[4823]: I1216 07:11:06.227611 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-f8fb84555-4krvw" Dec 16 07:11:06 crc kubenswrapper[4823]: I1216 07:11:06.230265 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Dec 16 07:11:06 crc kubenswrapper[4823]: I1216 07:11:06.235716 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f7f7578db-prt24"] Dec 16 07:11:06 crc kubenswrapper[4823]: I1216 07:11:06.278632 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-kpqll"] Dec 16 07:11:06 crc kubenswrapper[4823]: I1216 07:11:06.280075 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-kpqll" Dec 16 07:11:06 crc kubenswrapper[4823]: I1216 07:11:06.291109 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-f8fb84555-4krvw"] Dec 16 07:11:06 crc kubenswrapper[4823]: I1216 07:11:06.369073 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrs5j\" (UniqueName: \"kubernetes.io/projected/55511fba-6d01-4f25-af58-a1ea0e39bb95-kube-api-access-rrs5j\") pod \"nmstate-metrics-7f7f7578db-prt24\" (UID: \"55511fba-6d01-4f25-af58-a1ea0e39bb95\") " pod="openshift-nmstate/nmstate-metrics-7f7f7578db-prt24" Dec 16 07:11:06 crc kubenswrapper[4823]: I1216 07:11:06.369176 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/f13f0b02-300a-4220-9342-e1cae01493b3-tls-key-pair\") pod \"nmstate-webhook-f8fb84555-4krvw\" (UID: \"f13f0b02-300a-4220-9342-e1cae01493b3\") " pod="openshift-nmstate/nmstate-webhook-f8fb84555-4krvw" Dec 16 07:11:06 crc kubenswrapper[4823]: I1216 07:11:06.369247 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpg9b\" 
(UniqueName: \"kubernetes.io/projected/f13f0b02-300a-4220-9342-e1cae01493b3-kube-api-access-qpg9b\") pod \"nmstate-webhook-f8fb84555-4krvw\" (UID: \"f13f0b02-300a-4220-9342-e1cae01493b3\") " pod="openshift-nmstate/nmstate-webhook-f8fb84555-4krvw" Dec 16 07:11:06 crc kubenswrapper[4823]: I1216 07:11:06.419526 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6ff7998486-shs4v"] Dec 16 07:11:06 crc kubenswrapper[4823]: I1216 07:11:06.420478 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-shs4v" Dec 16 07:11:06 crc kubenswrapper[4823]: I1216 07:11:06.428601 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6ff7998486-shs4v"] Dec 16 07:11:06 crc kubenswrapper[4823]: I1216 07:11:06.429799 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Dec 16 07:11:06 crc kubenswrapper[4823]: I1216 07:11:06.430062 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Dec 16 07:11:06 crc kubenswrapper[4823]: I1216 07:11:06.430409 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-dqcjz" Dec 16 07:11:06 crc kubenswrapper[4823]: I1216 07:11:06.470706 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/3ce6c2b4-7bea-47c9-bb3e-dd48f14696d9-nmstate-lock\") pod \"nmstate-handler-kpqll\" (UID: \"3ce6c2b4-7bea-47c9-bb3e-dd48f14696d9\") " pod="openshift-nmstate/nmstate-handler-kpqll" Dec 16 07:11:06 crc kubenswrapper[4823]: I1216 07:11:06.470793 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4slqj\" (UniqueName: 
\"kubernetes.io/projected/3ce6c2b4-7bea-47c9-bb3e-dd48f14696d9-kube-api-access-4slqj\") pod \"nmstate-handler-kpqll\" (UID: \"3ce6c2b4-7bea-47c9-bb3e-dd48f14696d9\") " pod="openshift-nmstate/nmstate-handler-kpqll" Dec 16 07:11:06 crc kubenswrapper[4823]: I1216 07:11:06.470838 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/3ce6c2b4-7bea-47c9-bb3e-dd48f14696d9-dbus-socket\") pod \"nmstate-handler-kpqll\" (UID: \"3ce6c2b4-7bea-47c9-bb3e-dd48f14696d9\") " pod="openshift-nmstate/nmstate-handler-kpqll" Dec 16 07:11:06 crc kubenswrapper[4823]: I1216 07:11:06.470871 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/f13f0b02-300a-4220-9342-e1cae01493b3-tls-key-pair\") pod \"nmstate-webhook-f8fb84555-4krvw\" (UID: \"f13f0b02-300a-4220-9342-e1cae01493b3\") " pod="openshift-nmstate/nmstate-webhook-f8fb84555-4krvw" Dec 16 07:11:06 crc kubenswrapper[4823]: I1216 07:11:06.470898 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrs5j\" (UniqueName: \"kubernetes.io/projected/55511fba-6d01-4f25-af58-a1ea0e39bb95-kube-api-access-rrs5j\") pod \"nmstate-metrics-7f7f7578db-prt24\" (UID: \"55511fba-6d01-4f25-af58-a1ea0e39bb95\") " pod="openshift-nmstate/nmstate-metrics-7f7f7578db-prt24" Dec 16 07:11:06 crc kubenswrapper[4823]: I1216 07:11:06.470950 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/3ce6c2b4-7bea-47c9-bb3e-dd48f14696d9-ovs-socket\") pod \"nmstate-handler-kpqll\" (UID: \"3ce6c2b4-7bea-47c9-bb3e-dd48f14696d9\") " pod="openshift-nmstate/nmstate-handler-kpqll" Dec 16 07:11:06 crc kubenswrapper[4823]: I1216 07:11:06.472631 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qpg9b\" 
(UniqueName: \"kubernetes.io/projected/f13f0b02-300a-4220-9342-e1cae01493b3-kube-api-access-qpg9b\") pod \"nmstate-webhook-f8fb84555-4krvw\" (UID: \"f13f0b02-300a-4220-9342-e1cae01493b3\") " pod="openshift-nmstate/nmstate-webhook-f8fb84555-4krvw" Dec 16 07:11:06 crc kubenswrapper[4823]: I1216 07:11:06.486920 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/f13f0b02-300a-4220-9342-e1cae01493b3-tls-key-pair\") pod \"nmstate-webhook-f8fb84555-4krvw\" (UID: \"f13f0b02-300a-4220-9342-e1cae01493b3\") " pod="openshift-nmstate/nmstate-webhook-f8fb84555-4krvw" Dec 16 07:11:06 crc kubenswrapper[4823]: I1216 07:11:06.490415 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpg9b\" (UniqueName: \"kubernetes.io/projected/f13f0b02-300a-4220-9342-e1cae01493b3-kube-api-access-qpg9b\") pod \"nmstate-webhook-f8fb84555-4krvw\" (UID: \"f13f0b02-300a-4220-9342-e1cae01493b3\") " pod="openshift-nmstate/nmstate-webhook-f8fb84555-4krvw" Dec 16 07:11:06 crc kubenswrapper[4823]: I1216 07:11:06.492541 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrs5j\" (UniqueName: \"kubernetes.io/projected/55511fba-6d01-4f25-af58-a1ea0e39bb95-kube-api-access-rrs5j\") pod \"nmstate-metrics-7f7f7578db-prt24\" (UID: \"55511fba-6d01-4f25-af58-a1ea0e39bb95\") " pod="openshift-nmstate/nmstate-metrics-7f7f7578db-prt24" Dec 16 07:11:06 crc kubenswrapper[4823]: I1216 07:11:06.537579 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f7f7578db-prt24" Dec 16 07:11:06 crc kubenswrapper[4823]: I1216 07:11:06.547208 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-f8fb84555-4krvw" Dec 16 07:11:06 crc kubenswrapper[4823]: I1216 07:11:06.573540 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/3ce6c2b4-7bea-47c9-bb3e-dd48f14696d9-dbus-socket\") pod \"nmstate-handler-kpqll\" (UID: \"3ce6c2b4-7bea-47c9-bb3e-dd48f14696d9\") " pod="openshift-nmstate/nmstate-handler-kpqll" Dec 16 07:11:06 crc kubenswrapper[4823]: I1216 07:11:06.573587 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4mvn\" (UniqueName: \"kubernetes.io/projected/fa79bc1b-809d-4838-b3b1-be4a70b7fefa-kube-api-access-l4mvn\") pod \"nmstate-console-plugin-6ff7998486-shs4v\" (UID: \"fa79bc1b-809d-4838-b3b1-be4a70b7fefa\") " pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-shs4v" Dec 16 07:11:06 crc kubenswrapper[4823]: I1216 07:11:06.573626 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/3ce6c2b4-7bea-47c9-bb3e-dd48f14696d9-ovs-socket\") pod \"nmstate-handler-kpqll\" (UID: \"3ce6c2b4-7bea-47c9-bb3e-dd48f14696d9\") " pod="openshift-nmstate/nmstate-handler-kpqll" Dec 16 07:11:06 crc kubenswrapper[4823]: I1216 07:11:06.573661 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/fa79bc1b-809d-4838-b3b1-be4a70b7fefa-plugin-serving-cert\") pod \"nmstate-console-plugin-6ff7998486-shs4v\" (UID: \"fa79bc1b-809d-4838-b3b1-be4a70b7fefa\") " pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-shs4v" Dec 16 07:11:06 crc kubenswrapper[4823]: I1216 07:11:06.573679 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/3ce6c2b4-7bea-47c9-bb3e-dd48f14696d9-nmstate-lock\") pod 
\"nmstate-handler-kpqll\" (UID: \"3ce6c2b4-7bea-47c9-bb3e-dd48f14696d9\") " pod="openshift-nmstate/nmstate-handler-kpqll" Dec 16 07:11:06 crc kubenswrapper[4823]: I1216 07:11:06.573710 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/fa79bc1b-809d-4838-b3b1-be4a70b7fefa-nginx-conf\") pod \"nmstate-console-plugin-6ff7998486-shs4v\" (UID: \"fa79bc1b-809d-4838-b3b1-be4a70b7fefa\") " pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-shs4v" Dec 16 07:11:06 crc kubenswrapper[4823]: I1216 07:11:06.573725 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/3ce6c2b4-7bea-47c9-bb3e-dd48f14696d9-ovs-socket\") pod \"nmstate-handler-kpqll\" (UID: \"3ce6c2b4-7bea-47c9-bb3e-dd48f14696d9\") " pod="openshift-nmstate/nmstate-handler-kpqll" Dec 16 07:11:06 crc kubenswrapper[4823]: I1216 07:11:06.573731 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4slqj\" (UniqueName: \"kubernetes.io/projected/3ce6c2b4-7bea-47c9-bb3e-dd48f14696d9-kube-api-access-4slqj\") pod \"nmstate-handler-kpqll\" (UID: \"3ce6c2b4-7bea-47c9-bb3e-dd48f14696d9\") " pod="openshift-nmstate/nmstate-handler-kpqll" Dec 16 07:11:06 crc kubenswrapper[4823]: I1216 07:11:06.573844 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/3ce6c2b4-7bea-47c9-bb3e-dd48f14696d9-nmstate-lock\") pod \"nmstate-handler-kpqll\" (UID: \"3ce6c2b4-7bea-47c9-bb3e-dd48f14696d9\") " pod="openshift-nmstate/nmstate-handler-kpqll" Dec 16 07:11:06 crc kubenswrapper[4823]: I1216 07:11:06.573876 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/3ce6c2b4-7bea-47c9-bb3e-dd48f14696d9-dbus-socket\") pod \"nmstate-handler-kpqll\" (UID: 
\"3ce6c2b4-7bea-47c9-bb3e-dd48f14696d9\") " pod="openshift-nmstate/nmstate-handler-kpqll" Dec 16 07:11:06 crc kubenswrapper[4823]: I1216 07:11:06.594269 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4slqj\" (UniqueName: \"kubernetes.io/projected/3ce6c2b4-7bea-47c9-bb3e-dd48f14696d9-kube-api-access-4slqj\") pod \"nmstate-handler-kpqll\" (UID: \"3ce6c2b4-7bea-47c9-bb3e-dd48f14696d9\") " pod="openshift-nmstate/nmstate-handler-kpqll" Dec 16 07:11:06 crc kubenswrapper[4823]: I1216 07:11:06.596526 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-7df677848d-qdcsc"] Dec 16 07:11:06 crc kubenswrapper[4823]: I1216 07:11:06.597172 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7df677848d-qdcsc" Dec 16 07:11:06 crc kubenswrapper[4823]: I1216 07:11:06.606097 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7df677848d-qdcsc"] Dec 16 07:11:06 crc kubenswrapper[4823]: I1216 07:11:06.619980 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-kpqll" Dec 16 07:11:06 crc kubenswrapper[4823]: I1216 07:11:06.677256 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/fa79bc1b-809d-4838-b3b1-be4a70b7fefa-nginx-conf\") pod \"nmstate-console-plugin-6ff7998486-shs4v\" (UID: \"fa79bc1b-809d-4838-b3b1-be4a70b7fefa\") " pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-shs4v" Dec 16 07:11:06 crc kubenswrapper[4823]: I1216 07:11:06.677589 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4mvn\" (UniqueName: \"kubernetes.io/projected/fa79bc1b-809d-4838-b3b1-be4a70b7fefa-kube-api-access-l4mvn\") pod \"nmstate-console-plugin-6ff7998486-shs4v\" (UID: \"fa79bc1b-809d-4838-b3b1-be4a70b7fefa\") " pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-shs4v" Dec 16 07:11:06 crc kubenswrapper[4823]: I1216 07:11:06.677717 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/fa79bc1b-809d-4838-b3b1-be4a70b7fefa-plugin-serving-cert\") pod \"nmstate-console-plugin-6ff7998486-shs4v\" (UID: \"fa79bc1b-809d-4838-b3b1-be4a70b7fefa\") " pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-shs4v" Dec 16 07:11:06 crc kubenswrapper[4823]: I1216 07:11:06.678433 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/fa79bc1b-809d-4838-b3b1-be4a70b7fefa-nginx-conf\") pod \"nmstate-console-plugin-6ff7998486-shs4v\" (UID: \"fa79bc1b-809d-4838-b3b1-be4a70b7fefa\") " pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-shs4v" Dec 16 07:11:06 crc kubenswrapper[4823]: I1216 07:11:06.684816 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/fa79bc1b-809d-4838-b3b1-be4a70b7fefa-plugin-serving-cert\") 
pod \"nmstate-console-plugin-6ff7998486-shs4v\" (UID: \"fa79bc1b-809d-4838-b3b1-be4a70b7fefa\") " pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-shs4v" Dec 16 07:11:06 crc kubenswrapper[4823]: I1216 07:11:06.696179 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4mvn\" (UniqueName: \"kubernetes.io/projected/fa79bc1b-809d-4838-b3b1-be4a70b7fefa-kube-api-access-l4mvn\") pod \"nmstate-console-plugin-6ff7998486-shs4v\" (UID: \"fa79bc1b-809d-4838-b3b1-be4a70b7fefa\") " pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-shs4v" Dec 16 07:11:06 crc kubenswrapper[4823]: I1216 07:11:06.740503 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-shs4v" Dec 16 07:11:06 crc kubenswrapper[4823]: I1216 07:11:06.778499 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e75f821b-bfd9-41d2-9600-ece2337a3b70-oauth-serving-cert\") pod \"console-7df677848d-qdcsc\" (UID: \"e75f821b-bfd9-41d2-9600-ece2337a3b70\") " pod="openshift-console/console-7df677848d-qdcsc" Dec 16 07:11:06 crc kubenswrapper[4823]: I1216 07:11:06.778580 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xx4m4\" (UniqueName: \"kubernetes.io/projected/e75f821b-bfd9-41d2-9600-ece2337a3b70-kube-api-access-xx4m4\") pod \"console-7df677848d-qdcsc\" (UID: \"e75f821b-bfd9-41d2-9600-ece2337a3b70\") " pod="openshift-console/console-7df677848d-qdcsc" Dec 16 07:11:06 crc kubenswrapper[4823]: I1216 07:11:06.778603 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e75f821b-bfd9-41d2-9600-ece2337a3b70-service-ca\") pod \"console-7df677848d-qdcsc\" (UID: \"e75f821b-bfd9-41d2-9600-ece2337a3b70\") " 
pod="openshift-console/console-7df677848d-qdcsc" Dec 16 07:11:06 crc kubenswrapper[4823]: I1216 07:11:06.778650 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e75f821b-bfd9-41d2-9600-ece2337a3b70-console-config\") pod \"console-7df677848d-qdcsc\" (UID: \"e75f821b-bfd9-41d2-9600-ece2337a3b70\") " pod="openshift-console/console-7df677848d-qdcsc" Dec 16 07:11:06 crc kubenswrapper[4823]: I1216 07:11:06.778676 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e75f821b-bfd9-41d2-9600-ece2337a3b70-trusted-ca-bundle\") pod \"console-7df677848d-qdcsc\" (UID: \"e75f821b-bfd9-41d2-9600-ece2337a3b70\") " pod="openshift-console/console-7df677848d-qdcsc" Dec 16 07:11:06 crc kubenswrapper[4823]: I1216 07:11:06.778730 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e75f821b-bfd9-41d2-9600-ece2337a3b70-console-serving-cert\") pod \"console-7df677848d-qdcsc\" (UID: \"e75f821b-bfd9-41d2-9600-ece2337a3b70\") " pod="openshift-console/console-7df677848d-qdcsc" Dec 16 07:11:06 crc kubenswrapper[4823]: I1216 07:11:06.778761 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e75f821b-bfd9-41d2-9600-ece2337a3b70-console-oauth-config\") pod \"console-7df677848d-qdcsc\" (UID: \"e75f821b-bfd9-41d2-9600-ece2337a3b70\") " pod="openshift-console/console-7df677848d-qdcsc" Dec 16 07:11:06 crc kubenswrapper[4823]: I1216 07:11:06.809544 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-f8fb84555-4krvw"] Dec 16 07:11:06 crc kubenswrapper[4823]: I1216 07:11:06.880163 4823 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e75f821b-bfd9-41d2-9600-ece2337a3b70-console-config\") pod \"console-7df677848d-qdcsc\" (UID: \"e75f821b-bfd9-41d2-9600-ece2337a3b70\") " pod="openshift-console/console-7df677848d-qdcsc" Dec 16 07:11:06 crc kubenswrapper[4823]: I1216 07:11:06.880235 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e75f821b-bfd9-41d2-9600-ece2337a3b70-trusted-ca-bundle\") pod \"console-7df677848d-qdcsc\" (UID: \"e75f821b-bfd9-41d2-9600-ece2337a3b70\") " pod="openshift-console/console-7df677848d-qdcsc" Dec 16 07:11:06 crc kubenswrapper[4823]: I1216 07:11:06.880300 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e75f821b-bfd9-41d2-9600-ece2337a3b70-console-serving-cert\") pod \"console-7df677848d-qdcsc\" (UID: \"e75f821b-bfd9-41d2-9600-ece2337a3b70\") " pod="openshift-console/console-7df677848d-qdcsc" Dec 16 07:11:06 crc kubenswrapper[4823]: I1216 07:11:06.880358 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e75f821b-bfd9-41d2-9600-ece2337a3b70-console-oauth-config\") pod \"console-7df677848d-qdcsc\" (UID: \"e75f821b-bfd9-41d2-9600-ece2337a3b70\") " pod="openshift-console/console-7df677848d-qdcsc" Dec 16 07:11:06 crc kubenswrapper[4823]: I1216 07:11:06.880382 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e75f821b-bfd9-41d2-9600-ece2337a3b70-oauth-serving-cert\") pod \"console-7df677848d-qdcsc\" (UID: \"e75f821b-bfd9-41d2-9600-ece2337a3b70\") " pod="openshift-console/console-7df677848d-qdcsc" Dec 16 07:11:06 crc kubenswrapper[4823]: I1216 07:11:06.880414 4823 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-xx4m4\" (UniqueName: \"kubernetes.io/projected/e75f821b-bfd9-41d2-9600-ece2337a3b70-kube-api-access-xx4m4\") pod \"console-7df677848d-qdcsc\" (UID: \"e75f821b-bfd9-41d2-9600-ece2337a3b70\") " pod="openshift-console/console-7df677848d-qdcsc" Dec 16 07:11:06 crc kubenswrapper[4823]: I1216 07:11:06.880446 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e75f821b-bfd9-41d2-9600-ece2337a3b70-service-ca\") pod \"console-7df677848d-qdcsc\" (UID: \"e75f821b-bfd9-41d2-9600-ece2337a3b70\") " pod="openshift-console/console-7df677848d-qdcsc" Dec 16 07:11:06 crc kubenswrapper[4823]: I1216 07:11:06.881423 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e75f821b-bfd9-41d2-9600-ece2337a3b70-trusted-ca-bundle\") pod \"console-7df677848d-qdcsc\" (UID: \"e75f821b-bfd9-41d2-9600-ece2337a3b70\") " pod="openshift-console/console-7df677848d-qdcsc" Dec 16 07:11:06 crc kubenswrapper[4823]: I1216 07:11:06.881521 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e75f821b-bfd9-41d2-9600-ece2337a3b70-service-ca\") pod \"console-7df677848d-qdcsc\" (UID: \"e75f821b-bfd9-41d2-9600-ece2337a3b70\") " pod="openshift-console/console-7df677848d-qdcsc" Dec 16 07:11:06 crc kubenswrapper[4823]: I1216 07:11:06.881632 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e75f821b-bfd9-41d2-9600-ece2337a3b70-console-config\") pod \"console-7df677848d-qdcsc\" (UID: \"e75f821b-bfd9-41d2-9600-ece2337a3b70\") " pod="openshift-console/console-7df677848d-qdcsc" Dec 16 07:11:06 crc kubenswrapper[4823]: I1216 07:11:06.882833 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" 
(UniqueName: \"kubernetes.io/configmap/e75f821b-bfd9-41d2-9600-ece2337a3b70-oauth-serving-cert\") pod \"console-7df677848d-qdcsc\" (UID: \"e75f821b-bfd9-41d2-9600-ece2337a3b70\") " pod="openshift-console/console-7df677848d-qdcsc" Dec 16 07:11:06 crc kubenswrapper[4823]: I1216 07:11:06.885147 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e75f821b-bfd9-41d2-9600-ece2337a3b70-console-oauth-config\") pod \"console-7df677848d-qdcsc\" (UID: \"e75f821b-bfd9-41d2-9600-ece2337a3b70\") " pod="openshift-console/console-7df677848d-qdcsc" Dec 16 07:11:06 crc kubenswrapper[4823]: I1216 07:11:06.885670 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e75f821b-bfd9-41d2-9600-ece2337a3b70-console-serving-cert\") pod \"console-7df677848d-qdcsc\" (UID: \"e75f821b-bfd9-41d2-9600-ece2337a3b70\") " pod="openshift-console/console-7df677848d-qdcsc" Dec 16 07:11:06 crc kubenswrapper[4823]: I1216 07:11:06.896534 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xx4m4\" (UniqueName: \"kubernetes.io/projected/e75f821b-bfd9-41d2-9600-ece2337a3b70-kube-api-access-xx4m4\") pod \"console-7df677848d-qdcsc\" (UID: \"e75f821b-bfd9-41d2-9600-ece2337a3b70\") " pod="openshift-console/console-7df677848d-qdcsc" Dec 16 07:11:06 crc kubenswrapper[4823]: I1216 07:11:06.920730 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6ff7998486-shs4v"] Dec 16 07:11:06 crc kubenswrapper[4823]: W1216 07:11:06.925059 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa79bc1b_809d_4838_b3b1_be4a70b7fefa.slice/crio-997cd4e05289110af88f328af43dcd494038dc7578291d410a5d67a5431ea167 WatchSource:0}: Error finding container 
997cd4e05289110af88f328af43dcd494038dc7578291d410a5d67a5431ea167: Status 404 returned error can't find the container with id 997cd4e05289110af88f328af43dcd494038dc7578291d410a5d67a5431ea167 Dec 16 07:11:06 crc kubenswrapper[4823]: I1216 07:11:06.930096 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7df677848d-qdcsc" Dec 16 07:11:07 crc kubenswrapper[4823]: I1216 07:11:07.047593 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f7f7578db-prt24"] Dec 16 07:11:07 crc kubenswrapper[4823]: W1216 07:11:07.086508 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55511fba_6d01_4f25_af58_a1ea0e39bb95.slice/crio-41857e7ece7e7dc1bc0d497b005572b458b3ce1d34e2fbb43da6105fe4fa14cf WatchSource:0}: Error finding container 41857e7ece7e7dc1bc0d497b005572b458b3ce1d34e2fbb43da6105fe4fa14cf: Status 404 returned error can't find the container with id 41857e7ece7e7dc1bc0d497b005572b458b3ce1d34e2fbb43da6105fe4fa14cf Dec 16 07:11:07 crc kubenswrapper[4823]: I1216 07:11:07.339116 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f7f7578db-prt24" event={"ID":"55511fba-6d01-4f25-af58-a1ea0e39bb95","Type":"ContainerStarted","Data":"41857e7ece7e7dc1bc0d497b005572b458b3ce1d34e2fbb43da6105fe4fa14cf"} Dec 16 07:11:07 crc kubenswrapper[4823]: I1216 07:11:07.340513 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-f8fb84555-4krvw" event={"ID":"f13f0b02-300a-4220-9342-e1cae01493b3","Type":"ContainerStarted","Data":"5e5319a10e49696359095aa0dc7773ec6ec3278ba89cb116f15298e4d9cc4476"} Dec 16 07:11:07 crc kubenswrapper[4823]: I1216 07:11:07.341857 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-kpqll" 
event={"ID":"3ce6c2b4-7bea-47c9-bb3e-dd48f14696d9","Type":"ContainerStarted","Data":"af1ff3d696632829ccb21d1e28fae4b207d3e64d71f593b1160e989102227516"} Dec 16 07:11:07 crc kubenswrapper[4823]: I1216 07:11:07.342842 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-shs4v" event={"ID":"fa79bc1b-809d-4838-b3b1-be4a70b7fefa","Type":"ContainerStarted","Data":"997cd4e05289110af88f328af43dcd494038dc7578291d410a5d67a5431ea167"} Dec 16 07:11:07 crc kubenswrapper[4823]: I1216 07:11:07.372169 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7df677848d-qdcsc"] Dec 16 07:11:07 crc kubenswrapper[4823]: W1216 07:11:07.376146 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode75f821b_bfd9_41d2_9600_ece2337a3b70.slice/crio-3f6170a0303c98aa1b4583255fa918ae19c86e6ac750f0d4ec2e17611f11c488 WatchSource:0}: Error finding container 3f6170a0303c98aa1b4583255fa918ae19c86e6ac750f0d4ec2e17611f11c488: Status 404 returned error can't find the container with id 3f6170a0303c98aa1b4583255fa918ae19c86e6ac750f0d4ec2e17611f11c488 Dec 16 07:11:08 crc kubenswrapper[4823]: I1216 07:11:08.350321 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7df677848d-qdcsc" event={"ID":"e75f821b-bfd9-41d2-9600-ece2337a3b70","Type":"ContainerStarted","Data":"a34d10ab05d9b6e005758b9dad389b9146ce66adeb613aab9e89351ec92108cd"} Dec 16 07:11:08 crc kubenswrapper[4823]: I1216 07:11:08.350891 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7df677848d-qdcsc" event={"ID":"e75f821b-bfd9-41d2-9600-ece2337a3b70","Type":"ContainerStarted","Data":"3f6170a0303c98aa1b4583255fa918ae19c86e6ac750f0d4ec2e17611f11c488"} Dec 16 07:11:08 crc kubenswrapper[4823]: I1216 07:11:08.375860 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-console/console-7df677848d-qdcsc" podStartSLOduration=2.375835107 podStartE2EDuration="2.375835107s" podCreationTimestamp="2025-12-16 07:11:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 07:11:08.369978962 +0000 UTC m=+946.858545115" watchObservedRunningTime="2025-12-16 07:11:08.375835107 +0000 UTC m=+946.864401230" Dec 16 07:11:10 crc kubenswrapper[4823]: I1216 07:11:10.365369 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f7f7578db-prt24" event={"ID":"55511fba-6d01-4f25-af58-a1ea0e39bb95","Type":"ContainerStarted","Data":"4b6c03dbb3f5bd71bf8cdd753483eb04487e4ecfb83e9eafa6ae6c0dc0559116"} Dec 16 07:11:10 crc kubenswrapper[4823]: I1216 07:11:10.367609 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-f8fb84555-4krvw" event={"ID":"f13f0b02-300a-4220-9342-e1cae01493b3","Type":"ContainerStarted","Data":"82c52e0435f2e591a3a4bc799b8faa12019f792c37f4d26d7ef65f33c0165244"} Dec 16 07:11:10 crc kubenswrapper[4823]: I1216 07:11:10.367729 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-f8fb84555-4krvw" Dec 16 07:11:10 crc kubenswrapper[4823]: I1216 07:11:10.369105 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-kpqll" event={"ID":"3ce6c2b4-7bea-47c9-bb3e-dd48f14696d9","Type":"ContainerStarted","Data":"e59b70b886bef91891a2bff87df758bbdd9fd6d2a405246ba0b69c1c1a6c6362"} Dec 16 07:11:10 crc kubenswrapper[4823]: I1216 07:11:10.369193 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-kpqll" Dec 16 07:11:10 crc kubenswrapper[4823]: I1216 07:11:10.370368 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-shs4v" 
event={"ID":"fa79bc1b-809d-4838-b3b1-be4a70b7fefa","Type":"ContainerStarted","Data":"7adf398372b34c7c677d20fe8d5a2a7cc4eace80616f70a448412e246cf73e3e"} Dec 16 07:11:10 crc kubenswrapper[4823]: I1216 07:11:10.384484 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-f8fb84555-4krvw" podStartSLOduration=1.596621356 podStartE2EDuration="4.384465998s" podCreationTimestamp="2025-12-16 07:11:06 +0000 UTC" firstStartedPulling="2025-12-16 07:11:06.800347076 +0000 UTC m=+945.288913199" lastFinishedPulling="2025-12-16 07:11:09.588191708 +0000 UTC m=+948.076757841" observedRunningTime="2025-12-16 07:11:10.383542789 +0000 UTC m=+948.872108912" watchObservedRunningTime="2025-12-16 07:11:10.384465998 +0000 UTC m=+948.873032121" Dec 16 07:11:10 crc kubenswrapper[4823]: I1216 07:11:10.406545 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-kpqll" podStartSLOduration=1.489571551 podStartE2EDuration="4.406528548s" podCreationTimestamp="2025-12-16 07:11:06 +0000 UTC" firstStartedPulling="2025-12-16 07:11:06.658510057 +0000 UTC m=+945.147076180" lastFinishedPulling="2025-12-16 07:11:09.575467054 +0000 UTC m=+948.064033177" observedRunningTime="2025-12-16 07:11:10.402621604 +0000 UTC m=+948.891187727" watchObservedRunningTime="2025-12-16 07:11:10.406528548 +0000 UTC m=+948.895094671" Dec 16 07:11:10 crc kubenswrapper[4823]: I1216 07:11:10.418773 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-shs4v" podStartSLOduration=1.774105777 podStartE2EDuration="4.418753516s" podCreationTimestamp="2025-12-16 07:11:06 +0000 UTC" firstStartedPulling="2025-12-16 07:11:06.926280191 +0000 UTC m=+945.414846314" lastFinishedPulling="2025-12-16 07:11:09.57092793 +0000 UTC m=+948.059494053" observedRunningTime="2025-12-16 07:11:10.417506346 +0000 UTC m=+948.906072479" 
watchObservedRunningTime="2025-12-16 07:11:10.418753516 +0000 UTC m=+948.907319649" Dec 16 07:11:12 crc kubenswrapper[4823]: I1216 07:11:12.385983 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f7f7578db-prt24" event={"ID":"55511fba-6d01-4f25-af58-a1ea0e39bb95","Type":"ContainerStarted","Data":"0e0b7952a7e0577cb3c8a834af876a853e204ec3627c36ab5c56898bcc6ab83e"} Dec 16 07:11:12 crc kubenswrapper[4823]: I1216 07:11:12.405766 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-7f7f7578db-prt24" podStartSLOduration=1.606836822 podStartE2EDuration="6.405749401s" podCreationTimestamp="2025-12-16 07:11:06 +0000 UTC" firstStartedPulling="2025-12-16 07:11:07.095281853 +0000 UTC m=+945.583847976" lastFinishedPulling="2025-12-16 07:11:11.894194432 +0000 UTC m=+950.382760555" observedRunningTime="2025-12-16 07:11:12.405257605 +0000 UTC m=+950.893823738" watchObservedRunningTime="2025-12-16 07:11:12.405749401 +0000 UTC m=+950.894315524" Dec 16 07:11:16 crc kubenswrapper[4823]: I1216 07:11:16.649784 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-kpqll" Dec 16 07:11:16 crc kubenswrapper[4823]: I1216 07:11:16.931101 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-7df677848d-qdcsc" Dec 16 07:11:16 crc kubenswrapper[4823]: I1216 07:11:16.931466 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7df677848d-qdcsc" Dec 16 07:11:16 crc kubenswrapper[4823]: I1216 07:11:16.939657 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7df677848d-qdcsc" Dec 16 07:11:17 crc kubenswrapper[4823]: I1216 07:11:17.416757 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7df677848d-qdcsc" Dec 16 07:11:17 crc 
kubenswrapper[4823]: I1216 07:11:17.496498 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-bx552"] Dec 16 07:11:26 crc kubenswrapper[4823]: I1216 07:11:26.555492 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-f8fb84555-4krvw" Dec 16 07:11:28 crc kubenswrapper[4823]: I1216 07:11:28.134425 4823 patch_prober.go:28] interesting pod/machine-config-daemon-fv56f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 07:11:28 crc kubenswrapper[4823]: I1216 07:11:28.134771 4823 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 07:11:28 crc kubenswrapper[4823]: I1216 07:11:28.134823 4823 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" Dec 16 07:11:28 crc kubenswrapper[4823]: I1216 07:11:28.135530 4823 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"48219d3c0e584aed9d175a58b4a139883d9d4f8a627e33b1552f22d85e485c5c"} pod="openshift-machine-config-operator/machine-config-daemon-fv56f" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 16 07:11:28 crc kubenswrapper[4823]: I1216 07:11:28.135590 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" 
containerName="machine-config-daemon" containerID="cri-o://48219d3c0e584aed9d175a58b4a139883d9d4f8a627e33b1552f22d85e485c5c" gracePeriod=600 Dec 16 07:11:28 crc kubenswrapper[4823]: I1216 07:11:28.486484 4823 generic.go:334] "Generic (PLEG): container finished" podID="25dec47c-3043-486c-b371-2be103c214e3" containerID="48219d3c0e584aed9d175a58b4a139883d9d4f8a627e33b1552f22d85e485c5c" exitCode=0 Dec 16 07:11:28 crc kubenswrapper[4823]: I1216 07:11:28.486532 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" event={"ID":"25dec47c-3043-486c-b371-2be103c214e3","Type":"ContainerDied","Data":"48219d3c0e584aed9d175a58b4a139883d9d4f8a627e33b1552f22d85e485c5c"} Dec 16 07:11:28 crc kubenswrapper[4823]: I1216 07:11:28.486569 4823 scope.go:117] "RemoveContainer" containerID="b2c9252cf9a9f07ff8ef785fb449e76ce7b1db97459e712ab514f7ac68ccf0cb" Dec 16 07:11:29 crc kubenswrapper[4823]: I1216 07:11:29.499299 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" event={"ID":"25dec47c-3043-486c-b371-2be103c214e3","Type":"ContainerStarted","Data":"c07a7c4faebf9ec795cca9e8449add482643e386f41ece163e5f5944f0d37df3"} Dec 16 07:11:40 crc kubenswrapper[4823]: I1216 07:11:40.959774 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4xv55w"] Dec 16 07:11:40 crc kubenswrapper[4823]: I1216 07:11:40.962549 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4xv55w" Dec 16 07:11:40 crc kubenswrapper[4823]: I1216 07:11:40.965855 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 16 07:11:40 crc kubenswrapper[4823]: I1216 07:11:40.967763 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4xv55w"] Dec 16 07:11:41 crc kubenswrapper[4823]: I1216 07:11:41.132635 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3dd04c34-7f81-4773-a28b-0660e24aeb5d-bundle\") pod \"5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4xv55w\" (UID: \"3dd04c34-7f81-4773-a28b-0660e24aeb5d\") " pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4xv55w" Dec 16 07:11:41 crc kubenswrapper[4823]: I1216 07:11:41.133658 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4fnj\" (UniqueName: \"kubernetes.io/projected/3dd04c34-7f81-4773-a28b-0660e24aeb5d-kube-api-access-j4fnj\") pod \"5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4xv55w\" (UID: \"3dd04c34-7f81-4773-a28b-0660e24aeb5d\") " pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4xv55w" Dec 16 07:11:41 crc kubenswrapper[4823]: I1216 07:11:41.133930 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3dd04c34-7f81-4773-a28b-0660e24aeb5d-util\") pod \"5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4xv55w\" (UID: \"3dd04c34-7f81-4773-a28b-0660e24aeb5d\") " pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4xv55w" Dec 16 07:11:41 crc kubenswrapper[4823]: 
I1216 07:11:41.234905 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3dd04c34-7f81-4773-a28b-0660e24aeb5d-util\") pod \"5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4xv55w\" (UID: \"3dd04c34-7f81-4773-a28b-0660e24aeb5d\") " pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4xv55w" Dec 16 07:11:41 crc kubenswrapper[4823]: I1216 07:11:41.235272 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3dd04c34-7f81-4773-a28b-0660e24aeb5d-bundle\") pod \"5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4xv55w\" (UID: \"3dd04c34-7f81-4773-a28b-0660e24aeb5d\") " pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4xv55w" Dec 16 07:11:41 crc kubenswrapper[4823]: I1216 07:11:41.235400 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4fnj\" (UniqueName: \"kubernetes.io/projected/3dd04c34-7f81-4773-a28b-0660e24aeb5d-kube-api-access-j4fnj\") pod \"5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4xv55w\" (UID: \"3dd04c34-7f81-4773-a28b-0660e24aeb5d\") " pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4xv55w" Dec 16 07:11:41 crc kubenswrapper[4823]: I1216 07:11:41.235455 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3dd04c34-7f81-4773-a28b-0660e24aeb5d-util\") pod \"5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4xv55w\" (UID: \"3dd04c34-7f81-4773-a28b-0660e24aeb5d\") " pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4xv55w" Dec 16 07:11:41 crc kubenswrapper[4823]: I1216 07:11:41.235760 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/3dd04c34-7f81-4773-a28b-0660e24aeb5d-bundle\") pod \"5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4xv55w\" (UID: \"3dd04c34-7f81-4773-a28b-0660e24aeb5d\") " pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4xv55w" Dec 16 07:11:41 crc kubenswrapper[4823]: I1216 07:11:41.254165 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4fnj\" (UniqueName: \"kubernetes.io/projected/3dd04c34-7f81-4773-a28b-0660e24aeb5d-kube-api-access-j4fnj\") pod \"5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4xv55w\" (UID: \"3dd04c34-7f81-4773-a28b-0660e24aeb5d\") " pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4xv55w" Dec 16 07:11:41 crc kubenswrapper[4823]: I1216 07:11:41.288517 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4xv55w" Dec 16 07:11:41 crc kubenswrapper[4823]: I1216 07:11:41.493636 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4xv55w"] Dec 16 07:11:41 crc kubenswrapper[4823]: I1216 07:11:41.565289 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4xv55w" event={"ID":"3dd04c34-7f81-4773-a28b-0660e24aeb5d","Type":"ContainerStarted","Data":"82362ebd9f367b8bdabe7cc048d5f11724243cbb48b7f9140879e6a8054e0e23"} Dec 16 07:11:42 crc kubenswrapper[4823]: I1216 07:11:42.560858 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-bx552" podUID="e1c6d0f7-5a86-49fb-870d-991796812348" containerName="console" containerID="cri-o://f6e12b2fac1db30605e7b6326bc5d0014d7736cae9a8b5fca123cb2fdf5c6913" gracePeriod=15 Dec 16 07:11:42 crc kubenswrapper[4823]: I1216 07:11:42.571578 4823 
generic.go:334] "Generic (PLEG): container finished" podID="3dd04c34-7f81-4773-a28b-0660e24aeb5d" containerID="5ddfe17a7c423e2e855a820febdfe9715a3b5858903b64ba22dea5cfdfc01bf6" exitCode=0 Dec 16 07:11:42 crc kubenswrapper[4823]: I1216 07:11:42.571688 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4xv55w" event={"ID":"3dd04c34-7f81-4773-a28b-0660e24aeb5d","Type":"ContainerDied","Data":"5ddfe17a7c423e2e855a820febdfe9715a3b5858903b64ba22dea5cfdfc01bf6"} Dec 16 07:11:42 crc kubenswrapper[4823]: I1216 07:11:42.907219 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-bx552_e1c6d0f7-5a86-49fb-870d-991796812348/console/0.log" Dec 16 07:11:42 crc kubenswrapper[4823]: I1216 07:11:42.907281 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-bx552" Dec 16 07:11:43 crc kubenswrapper[4823]: I1216 07:11:43.060813 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rwbjs\" (UniqueName: \"kubernetes.io/projected/e1c6d0f7-5a86-49fb-870d-991796812348-kube-api-access-rwbjs\") pod \"e1c6d0f7-5a86-49fb-870d-991796812348\" (UID: \"e1c6d0f7-5a86-49fb-870d-991796812348\") " Dec 16 07:11:43 crc kubenswrapper[4823]: I1216 07:11:43.060904 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e1c6d0f7-5a86-49fb-870d-991796812348-service-ca\") pod \"e1c6d0f7-5a86-49fb-870d-991796812348\" (UID: \"e1c6d0f7-5a86-49fb-870d-991796812348\") " Dec 16 07:11:43 crc kubenswrapper[4823]: I1216 07:11:43.060981 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e1c6d0f7-5a86-49fb-870d-991796812348-trusted-ca-bundle\") pod \"e1c6d0f7-5a86-49fb-870d-991796812348\" (UID: 
\"e1c6d0f7-5a86-49fb-870d-991796812348\") " Dec 16 07:11:43 crc kubenswrapper[4823]: I1216 07:11:43.061011 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e1c6d0f7-5a86-49fb-870d-991796812348-oauth-serving-cert\") pod \"e1c6d0f7-5a86-49fb-870d-991796812348\" (UID: \"e1c6d0f7-5a86-49fb-870d-991796812348\") " Dec 16 07:11:43 crc kubenswrapper[4823]: I1216 07:11:43.061091 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e1c6d0f7-5a86-49fb-870d-991796812348-console-oauth-config\") pod \"e1c6d0f7-5a86-49fb-870d-991796812348\" (UID: \"e1c6d0f7-5a86-49fb-870d-991796812348\") " Dec 16 07:11:43 crc kubenswrapper[4823]: I1216 07:11:43.061185 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e1c6d0f7-5a86-49fb-870d-991796812348-console-serving-cert\") pod \"e1c6d0f7-5a86-49fb-870d-991796812348\" (UID: \"e1c6d0f7-5a86-49fb-870d-991796812348\") " Dec 16 07:11:43 crc kubenswrapper[4823]: I1216 07:11:43.061242 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e1c6d0f7-5a86-49fb-870d-991796812348-console-config\") pod \"e1c6d0f7-5a86-49fb-870d-991796812348\" (UID: \"e1c6d0f7-5a86-49fb-870d-991796812348\") " Dec 16 07:11:43 crc kubenswrapper[4823]: I1216 07:11:43.061877 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1c6d0f7-5a86-49fb-870d-991796812348-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "e1c6d0f7-5a86-49fb-870d-991796812348" (UID: "e1c6d0f7-5a86-49fb-870d-991796812348"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:11:43 crc kubenswrapper[4823]: I1216 07:11:43.061896 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1c6d0f7-5a86-49fb-870d-991796812348-service-ca" (OuterVolumeSpecName: "service-ca") pod "e1c6d0f7-5a86-49fb-870d-991796812348" (UID: "e1c6d0f7-5a86-49fb-870d-991796812348"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:11:43 crc kubenswrapper[4823]: I1216 07:11:43.062015 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1c6d0f7-5a86-49fb-870d-991796812348-console-config" (OuterVolumeSpecName: "console-config") pod "e1c6d0f7-5a86-49fb-870d-991796812348" (UID: "e1c6d0f7-5a86-49fb-870d-991796812348"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:11:43 crc kubenswrapper[4823]: I1216 07:11:43.062668 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1c6d0f7-5a86-49fb-870d-991796812348-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "e1c6d0f7-5a86-49fb-870d-991796812348" (UID: "e1c6d0f7-5a86-49fb-870d-991796812348"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:11:43 crc kubenswrapper[4823]: I1216 07:11:43.075564 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1c6d0f7-5a86-49fb-870d-991796812348-kube-api-access-rwbjs" (OuterVolumeSpecName: "kube-api-access-rwbjs") pod "e1c6d0f7-5a86-49fb-870d-991796812348" (UID: "e1c6d0f7-5a86-49fb-870d-991796812348"). InnerVolumeSpecName "kube-api-access-rwbjs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:11:43 crc kubenswrapper[4823]: I1216 07:11:43.076581 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1c6d0f7-5a86-49fb-870d-991796812348-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "e1c6d0f7-5a86-49fb-870d-991796812348" (UID: "e1c6d0f7-5a86-49fb-870d-991796812348"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:11:43 crc kubenswrapper[4823]: I1216 07:11:43.077235 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1c6d0f7-5a86-49fb-870d-991796812348-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "e1c6d0f7-5a86-49fb-870d-991796812348" (UID: "e1c6d0f7-5a86-49fb-870d-991796812348"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:11:43 crc kubenswrapper[4823]: I1216 07:11:43.162664 4823 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e1c6d0f7-5a86-49fb-870d-991796812348-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 07:11:43 crc kubenswrapper[4823]: I1216 07:11:43.162737 4823 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e1c6d0f7-5a86-49fb-870d-991796812348-console-config\") on node \"crc\" DevicePath \"\"" Dec 16 07:11:43 crc kubenswrapper[4823]: I1216 07:11:43.162766 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rwbjs\" (UniqueName: \"kubernetes.io/projected/e1c6d0f7-5a86-49fb-870d-991796812348-kube-api-access-rwbjs\") on node \"crc\" DevicePath \"\"" Dec 16 07:11:43 crc kubenswrapper[4823]: I1216 07:11:43.162795 4823 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/e1c6d0f7-5a86-49fb-870d-991796812348-service-ca\") on node \"crc\" DevicePath \"\"" Dec 16 07:11:43 crc kubenswrapper[4823]: I1216 07:11:43.162820 4823 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e1c6d0f7-5a86-49fb-870d-991796812348-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 07:11:43 crc kubenswrapper[4823]: I1216 07:11:43.162843 4823 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e1c6d0f7-5a86-49fb-870d-991796812348-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 16 07:11:43 crc kubenswrapper[4823]: I1216 07:11:43.162866 4823 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e1c6d0f7-5a86-49fb-870d-991796812348-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:11:43 crc kubenswrapper[4823]: I1216 07:11:43.583244 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-bx552_e1c6d0f7-5a86-49fb-870d-991796812348/console/0.log" Dec 16 07:11:43 crc kubenswrapper[4823]: I1216 07:11:43.583329 4823 generic.go:334] "Generic (PLEG): container finished" podID="e1c6d0f7-5a86-49fb-870d-991796812348" containerID="f6e12b2fac1db30605e7b6326bc5d0014d7736cae9a8b5fca123cb2fdf5c6913" exitCode=2 Dec 16 07:11:43 crc kubenswrapper[4823]: I1216 07:11:43.583378 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-bx552" event={"ID":"e1c6d0f7-5a86-49fb-870d-991796812348","Type":"ContainerDied","Data":"f6e12b2fac1db30605e7b6326bc5d0014d7736cae9a8b5fca123cb2fdf5c6913"} Dec 16 07:11:43 crc kubenswrapper[4823]: I1216 07:11:43.583421 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-bx552" 
event={"ID":"e1c6d0f7-5a86-49fb-870d-991796812348","Type":"ContainerDied","Data":"edb82a2a1350064ecff291dc689957660c671b14b7e3ebff243b4f9a4e7e3c89"} Dec 16 07:11:43 crc kubenswrapper[4823]: I1216 07:11:43.583453 4823 scope.go:117] "RemoveContainer" containerID="f6e12b2fac1db30605e7b6326bc5d0014d7736cae9a8b5fca123cb2fdf5c6913" Dec 16 07:11:43 crc kubenswrapper[4823]: I1216 07:11:43.583651 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-bx552" Dec 16 07:11:43 crc kubenswrapper[4823]: I1216 07:11:43.609528 4823 scope.go:117] "RemoveContainer" containerID="f6e12b2fac1db30605e7b6326bc5d0014d7736cae9a8b5fca123cb2fdf5c6913" Dec 16 07:11:43 crc kubenswrapper[4823]: E1216 07:11:43.609959 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6e12b2fac1db30605e7b6326bc5d0014d7736cae9a8b5fca123cb2fdf5c6913\": container with ID starting with f6e12b2fac1db30605e7b6326bc5d0014d7736cae9a8b5fca123cb2fdf5c6913 not found: ID does not exist" containerID="f6e12b2fac1db30605e7b6326bc5d0014d7736cae9a8b5fca123cb2fdf5c6913" Dec 16 07:11:43 crc kubenswrapper[4823]: I1216 07:11:43.610099 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6e12b2fac1db30605e7b6326bc5d0014d7736cae9a8b5fca123cb2fdf5c6913"} err="failed to get container status \"f6e12b2fac1db30605e7b6326bc5d0014d7736cae9a8b5fca123cb2fdf5c6913\": rpc error: code = NotFound desc = could not find container \"f6e12b2fac1db30605e7b6326bc5d0014d7736cae9a8b5fca123cb2fdf5c6913\": container with ID starting with f6e12b2fac1db30605e7b6326bc5d0014d7736cae9a8b5fca123cb2fdf5c6913 not found: ID does not exist" Dec 16 07:11:43 crc kubenswrapper[4823]: I1216 07:11:43.623247 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-bx552"] Dec 16 07:11:43 crc kubenswrapper[4823]: I1216 07:11:43.628324 4823 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-bx552"] Dec 16 07:11:43 crc kubenswrapper[4823]: I1216 07:11:43.778772 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1c6d0f7-5a86-49fb-870d-991796812348" path="/var/lib/kubelet/pods/e1c6d0f7-5a86-49fb-870d-991796812348/volumes" Dec 16 07:11:44 crc kubenswrapper[4823]: I1216 07:11:44.589995 4823 generic.go:334] "Generic (PLEG): container finished" podID="3dd04c34-7f81-4773-a28b-0660e24aeb5d" containerID="450928111aad32dce19619bf427f9ba3c55c5dfd4e1aac9d6bed48ae165bea7a" exitCode=0 Dec 16 07:11:44 crc kubenswrapper[4823]: I1216 07:11:44.590076 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4xv55w" event={"ID":"3dd04c34-7f81-4773-a28b-0660e24aeb5d","Type":"ContainerDied","Data":"450928111aad32dce19619bf427f9ba3c55c5dfd4e1aac9d6bed48ae165bea7a"} Dec 16 07:11:45 crc kubenswrapper[4823]: I1216 07:11:45.598119 4823 generic.go:334] "Generic (PLEG): container finished" podID="3dd04c34-7f81-4773-a28b-0660e24aeb5d" containerID="0e26f4c75c4dd2850a502dfd59dc7973165a4de5f5689fd51c20872f0c1203aa" exitCode=0 Dec 16 07:11:45 crc kubenswrapper[4823]: I1216 07:11:45.598162 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4xv55w" event={"ID":"3dd04c34-7f81-4773-a28b-0660e24aeb5d","Type":"ContainerDied","Data":"0e26f4c75c4dd2850a502dfd59dc7973165a4de5f5689fd51c20872f0c1203aa"} Dec 16 07:11:46 crc kubenswrapper[4823]: I1216 07:11:46.826257 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4xv55w" Dec 16 07:11:46 crc kubenswrapper[4823]: I1216 07:11:46.915053 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j4fnj\" (UniqueName: \"kubernetes.io/projected/3dd04c34-7f81-4773-a28b-0660e24aeb5d-kube-api-access-j4fnj\") pod \"3dd04c34-7f81-4773-a28b-0660e24aeb5d\" (UID: \"3dd04c34-7f81-4773-a28b-0660e24aeb5d\") " Dec 16 07:11:46 crc kubenswrapper[4823]: I1216 07:11:46.915120 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3dd04c34-7f81-4773-a28b-0660e24aeb5d-util\") pod \"3dd04c34-7f81-4773-a28b-0660e24aeb5d\" (UID: \"3dd04c34-7f81-4773-a28b-0660e24aeb5d\") " Dec 16 07:11:46 crc kubenswrapper[4823]: I1216 07:11:46.915274 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3dd04c34-7f81-4773-a28b-0660e24aeb5d-bundle\") pod \"3dd04c34-7f81-4773-a28b-0660e24aeb5d\" (UID: \"3dd04c34-7f81-4773-a28b-0660e24aeb5d\") " Dec 16 07:11:46 crc kubenswrapper[4823]: I1216 07:11:46.916651 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3dd04c34-7f81-4773-a28b-0660e24aeb5d-bundle" (OuterVolumeSpecName: "bundle") pod "3dd04c34-7f81-4773-a28b-0660e24aeb5d" (UID: "3dd04c34-7f81-4773-a28b-0660e24aeb5d"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:11:46 crc kubenswrapper[4823]: I1216 07:11:46.920317 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3dd04c34-7f81-4773-a28b-0660e24aeb5d-kube-api-access-j4fnj" (OuterVolumeSpecName: "kube-api-access-j4fnj") pod "3dd04c34-7f81-4773-a28b-0660e24aeb5d" (UID: "3dd04c34-7f81-4773-a28b-0660e24aeb5d"). InnerVolumeSpecName "kube-api-access-j4fnj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:11:47 crc kubenswrapper[4823]: I1216 07:11:47.017073 4823 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3dd04c34-7f81-4773-a28b-0660e24aeb5d-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:11:47 crc kubenswrapper[4823]: I1216 07:11:47.017162 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j4fnj\" (UniqueName: \"kubernetes.io/projected/3dd04c34-7f81-4773-a28b-0660e24aeb5d-kube-api-access-j4fnj\") on node \"crc\" DevicePath \"\"" Dec 16 07:11:47 crc kubenswrapper[4823]: I1216 07:11:47.029781 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3dd04c34-7f81-4773-a28b-0660e24aeb5d-util" (OuterVolumeSpecName: "util") pod "3dd04c34-7f81-4773-a28b-0660e24aeb5d" (UID: "3dd04c34-7f81-4773-a28b-0660e24aeb5d"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:11:47 crc kubenswrapper[4823]: I1216 07:11:47.117767 4823 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3dd04c34-7f81-4773-a28b-0660e24aeb5d-util\") on node \"crc\" DevicePath \"\"" Dec 16 07:11:47 crc kubenswrapper[4823]: I1216 07:11:47.609920 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4xv55w" event={"ID":"3dd04c34-7f81-4773-a28b-0660e24aeb5d","Type":"ContainerDied","Data":"82362ebd9f367b8bdabe7cc048d5f11724243cbb48b7f9140879e6a8054e0e23"} Dec 16 07:11:47 crc kubenswrapper[4823]: I1216 07:11:47.610267 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="82362ebd9f367b8bdabe7cc048d5f11724243cbb48b7f9140879e6a8054e0e23" Dec 16 07:11:47 crc kubenswrapper[4823]: I1216 07:11:47.609981 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4xv55w" Dec 16 07:11:55 crc kubenswrapper[4823]: I1216 07:11:55.917039 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-5d4c58b6db-plllb"] Dec 16 07:11:55 crc kubenswrapper[4823]: E1216 07:11:55.917863 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3dd04c34-7f81-4773-a28b-0660e24aeb5d" containerName="util" Dec 16 07:11:55 crc kubenswrapper[4823]: I1216 07:11:55.917878 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="3dd04c34-7f81-4773-a28b-0660e24aeb5d" containerName="util" Dec 16 07:11:55 crc kubenswrapper[4823]: E1216 07:11:55.917892 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3dd04c34-7f81-4773-a28b-0660e24aeb5d" containerName="pull" Dec 16 07:11:55 crc kubenswrapper[4823]: I1216 07:11:55.917900 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="3dd04c34-7f81-4773-a28b-0660e24aeb5d" containerName="pull" Dec 16 07:11:55 crc kubenswrapper[4823]: E1216 07:11:55.917915 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1c6d0f7-5a86-49fb-870d-991796812348" containerName="console" Dec 16 07:11:55 crc kubenswrapper[4823]: I1216 07:11:55.917923 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1c6d0f7-5a86-49fb-870d-991796812348" containerName="console" Dec 16 07:11:55 crc kubenswrapper[4823]: E1216 07:11:55.917938 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3dd04c34-7f81-4773-a28b-0660e24aeb5d" containerName="extract" Dec 16 07:11:55 crc kubenswrapper[4823]: I1216 07:11:55.917948 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="3dd04c34-7f81-4773-a28b-0660e24aeb5d" containerName="extract" Dec 16 07:11:55 crc kubenswrapper[4823]: I1216 07:11:55.918099 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1c6d0f7-5a86-49fb-870d-991796812348" 
containerName="console" Dec 16 07:11:55 crc kubenswrapper[4823]: I1216 07:11:55.918123 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="3dd04c34-7f81-4773-a28b-0660e24aeb5d" containerName="extract" Dec 16 07:11:55 crc kubenswrapper[4823]: I1216 07:11:55.918576 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-5d4c58b6db-plllb" Dec 16 07:11:55 crc kubenswrapper[4823]: I1216 07:11:55.920891 4823 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Dec 16 07:11:55 crc kubenswrapper[4823]: I1216 07:11:55.921010 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Dec 16 07:11:55 crc kubenswrapper[4823]: I1216 07:11:55.922051 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Dec 16 07:11:55 crc kubenswrapper[4823]: I1216 07:11:55.922197 4823 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Dec 16 07:11:55 crc kubenswrapper[4823]: I1216 07:11:55.922269 4823 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-mrhsj" Dec 16 07:11:55 crc kubenswrapper[4823]: I1216 07:11:55.947296 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-5d4c58b6db-plllb"] Dec 16 07:11:56 crc kubenswrapper[4823]: I1216 07:11:56.024617 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0c6a98e7-03f3-402e-ae6e-18c7b2a09ead-apiservice-cert\") pod \"metallb-operator-controller-manager-5d4c58b6db-plllb\" (UID: \"0c6a98e7-03f3-402e-ae6e-18c7b2a09ead\") " pod="metallb-system/metallb-operator-controller-manager-5d4c58b6db-plllb" Dec 
16 07:11:56 crc kubenswrapper[4823]: I1216 07:11:56.024671 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27psk\" (UniqueName: \"kubernetes.io/projected/0c6a98e7-03f3-402e-ae6e-18c7b2a09ead-kube-api-access-27psk\") pod \"metallb-operator-controller-manager-5d4c58b6db-plllb\" (UID: \"0c6a98e7-03f3-402e-ae6e-18c7b2a09ead\") " pod="metallb-system/metallb-operator-controller-manager-5d4c58b6db-plllb" Dec 16 07:11:56 crc kubenswrapper[4823]: I1216 07:11:56.024934 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0c6a98e7-03f3-402e-ae6e-18c7b2a09ead-webhook-cert\") pod \"metallb-operator-controller-manager-5d4c58b6db-plllb\" (UID: \"0c6a98e7-03f3-402e-ae6e-18c7b2a09ead\") " pod="metallb-system/metallb-operator-controller-manager-5d4c58b6db-plllb" Dec 16 07:11:56 crc kubenswrapper[4823]: I1216 07:11:56.126494 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0c6a98e7-03f3-402e-ae6e-18c7b2a09ead-webhook-cert\") pod \"metallb-operator-controller-manager-5d4c58b6db-plllb\" (UID: \"0c6a98e7-03f3-402e-ae6e-18c7b2a09ead\") " pod="metallb-system/metallb-operator-controller-manager-5d4c58b6db-plllb" Dec 16 07:11:56 crc kubenswrapper[4823]: I1216 07:11:56.126578 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0c6a98e7-03f3-402e-ae6e-18c7b2a09ead-apiservice-cert\") pod \"metallb-operator-controller-manager-5d4c58b6db-plllb\" (UID: \"0c6a98e7-03f3-402e-ae6e-18c7b2a09ead\") " pod="metallb-system/metallb-operator-controller-manager-5d4c58b6db-plllb" Dec 16 07:11:56 crc kubenswrapper[4823]: I1216 07:11:56.126603 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27psk\" (UniqueName: 
\"kubernetes.io/projected/0c6a98e7-03f3-402e-ae6e-18c7b2a09ead-kube-api-access-27psk\") pod \"metallb-operator-controller-manager-5d4c58b6db-plllb\" (UID: \"0c6a98e7-03f3-402e-ae6e-18c7b2a09ead\") " pod="metallb-system/metallb-operator-controller-manager-5d4c58b6db-plllb" Dec 16 07:11:56 crc kubenswrapper[4823]: I1216 07:11:56.137077 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0c6a98e7-03f3-402e-ae6e-18c7b2a09ead-apiservice-cert\") pod \"metallb-operator-controller-manager-5d4c58b6db-plllb\" (UID: \"0c6a98e7-03f3-402e-ae6e-18c7b2a09ead\") " pod="metallb-system/metallb-operator-controller-manager-5d4c58b6db-plllb" Dec 16 07:11:56 crc kubenswrapper[4823]: I1216 07:11:56.137182 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0c6a98e7-03f3-402e-ae6e-18c7b2a09ead-webhook-cert\") pod \"metallb-operator-controller-manager-5d4c58b6db-plllb\" (UID: \"0c6a98e7-03f3-402e-ae6e-18c7b2a09ead\") " pod="metallb-system/metallb-operator-controller-manager-5d4c58b6db-plllb" Dec 16 07:11:56 crc kubenswrapper[4823]: I1216 07:11:56.142605 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27psk\" (UniqueName: \"kubernetes.io/projected/0c6a98e7-03f3-402e-ae6e-18c7b2a09ead-kube-api-access-27psk\") pod \"metallb-operator-controller-manager-5d4c58b6db-plllb\" (UID: \"0c6a98e7-03f3-402e-ae6e-18c7b2a09ead\") " pod="metallb-system/metallb-operator-controller-manager-5d4c58b6db-plllb" Dec 16 07:11:56 crc kubenswrapper[4823]: I1216 07:11:56.174969 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-86d44cc785-ftsr4"] Dec 16 07:11:56 crc kubenswrapper[4823]: I1216 07:11:56.175735 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-86d44cc785-ftsr4" Dec 16 07:11:56 crc kubenswrapper[4823]: I1216 07:11:56.178359 4823 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Dec 16 07:11:56 crc kubenswrapper[4823]: I1216 07:11:56.179049 4823 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Dec 16 07:11:56 crc kubenswrapper[4823]: I1216 07:11:56.179654 4823 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-hnf9m" Dec 16 07:11:56 crc kubenswrapper[4823]: I1216 07:11:56.207799 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-86d44cc785-ftsr4"] Dec 16 07:11:56 crc kubenswrapper[4823]: I1216 07:11:56.236461 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-5d4c58b6db-plllb" Dec 16 07:11:56 crc kubenswrapper[4823]: I1216 07:11:56.329321 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7e3777ec-c803-4417-8381-86fb3ad02265-apiservice-cert\") pod \"metallb-operator-webhook-server-86d44cc785-ftsr4\" (UID: \"7e3777ec-c803-4417-8381-86fb3ad02265\") " pod="metallb-system/metallb-operator-webhook-server-86d44cc785-ftsr4" Dec 16 07:11:56 crc kubenswrapper[4823]: I1216 07:11:56.329678 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzbm9\" (UniqueName: \"kubernetes.io/projected/7e3777ec-c803-4417-8381-86fb3ad02265-kube-api-access-hzbm9\") pod \"metallb-operator-webhook-server-86d44cc785-ftsr4\" (UID: \"7e3777ec-c803-4417-8381-86fb3ad02265\") " pod="metallb-system/metallb-operator-webhook-server-86d44cc785-ftsr4" Dec 16 07:11:56 crc kubenswrapper[4823]: 
I1216 07:11:56.329759 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7e3777ec-c803-4417-8381-86fb3ad02265-webhook-cert\") pod \"metallb-operator-webhook-server-86d44cc785-ftsr4\" (UID: \"7e3777ec-c803-4417-8381-86fb3ad02265\") " pod="metallb-system/metallb-operator-webhook-server-86d44cc785-ftsr4" Dec 16 07:11:56 crc kubenswrapper[4823]: I1216 07:11:56.430554 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7e3777ec-c803-4417-8381-86fb3ad02265-webhook-cert\") pod \"metallb-operator-webhook-server-86d44cc785-ftsr4\" (UID: \"7e3777ec-c803-4417-8381-86fb3ad02265\") " pod="metallb-system/metallb-operator-webhook-server-86d44cc785-ftsr4" Dec 16 07:11:56 crc kubenswrapper[4823]: I1216 07:11:56.430618 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7e3777ec-c803-4417-8381-86fb3ad02265-apiservice-cert\") pod \"metallb-operator-webhook-server-86d44cc785-ftsr4\" (UID: \"7e3777ec-c803-4417-8381-86fb3ad02265\") " pod="metallb-system/metallb-operator-webhook-server-86d44cc785-ftsr4" Dec 16 07:11:56 crc kubenswrapper[4823]: I1216 07:11:56.430649 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hzbm9\" (UniqueName: \"kubernetes.io/projected/7e3777ec-c803-4417-8381-86fb3ad02265-kube-api-access-hzbm9\") pod \"metallb-operator-webhook-server-86d44cc785-ftsr4\" (UID: \"7e3777ec-c803-4417-8381-86fb3ad02265\") " pod="metallb-system/metallb-operator-webhook-server-86d44cc785-ftsr4" Dec 16 07:11:56 crc kubenswrapper[4823]: I1216 07:11:56.436578 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7e3777ec-c803-4417-8381-86fb3ad02265-apiservice-cert\") pod 
\"metallb-operator-webhook-server-86d44cc785-ftsr4\" (UID: \"7e3777ec-c803-4417-8381-86fb3ad02265\") " pod="metallb-system/metallb-operator-webhook-server-86d44cc785-ftsr4" Dec 16 07:11:56 crc kubenswrapper[4823]: I1216 07:11:56.456852 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7e3777ec-c803-4417-8381-86fb3ad02265-webhook-cert\") pod \"metallb-operator-webhook-server-86d44cc785-ftsr4\" (UID: \"7e3777ec-c803-4417-8381-86fb3ad02265\") " pod="metallb-system/metallb-operator-webhook-server-86d44cc785-ftsr4" Dec 16 07:11:56 crc kubenswrapper[4823]: I1216 07:11:56.457353 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzbm9\" (UniqueName: \"kubernetes.io/projected/7e3777ec-c803-4417-8381-86fb3ad02265-kube-api-access-hzbm9\") pod \"metallb-operator-webhook-server-86d44cc785-ftsr4\" (UID: \"7e3777ec-c803-4417-8381-86fb3ad02265\") " pod="metallb-system/metallb-operator-webhook-server-86d44cc785-ftsr4" Dec 16 07:11:56 crc kubenswrapper[4823]: I1216 07:11:56.505515 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-86d44cc785-ftsr4" Dec 16 07:11:56 crc kubenswrapper[4823]: I1216 07:11:56.681923 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-5d4c58b6db-plllb"] Dec 16 07:11:56 crc kubenswrapper[4823]: W1216 07:11:56.682136 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c6a98e7_03f3_402e_ae6e_18c7b2a09ead.slice/crio-05600ed42fec31fd298b55273d4268bd034d68e7de1aa3fa224fe44d3c19db08 WatchSource:0}: Error finding container 05600ed42fec31fd298b55273d4268bd034d68e7de1aa3fa224fe44d3c19db08: Status 404 returned error can't find the container with id 05600ed42fec31fd298b55273d4268bd034d68e7de1aa3fa224fe44d3c19db08 Dec 16 07:11:56 crc kubenswrapper[4823]: I1216 07:11:56.943812 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-86d44cc785-ftsr4"] Dec 16 07:11:56 crc kubenswrapper[4823]: W1216 07:11:56.947116 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7e3777ec_c803_4417_8381_86fb3ad02265.slice/crio-33e5168730142239061e65a16296922f629955b076e483c89fc414d8b5a8bb7f WatchSource:0}: Error finding container 33e5168730142239061e65a16296922f629955b076e483c89fc414d8b5a8bb7f: Status 404 returned error can't find the container with id 33e5168730142239061e65a16296922f629955b076e483c89fc414d8b5a8bb7f Dec 16 07:11:57 crc kubenswrapper[4823]: I1216 07:11:57.659575 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-86d44cc785-ftsr4" event={"ID":"7e3777ec-c803-4417-8381-86fb3ad02265","Type":"ContainerStarted","Data":"33e5168730142239061e65a16296922f629955b076e483c89fc414d8b5a8bb7f"} Dec 16 07:11:57 crc kubenswrapper[4823]: I1216 07:11:57.660958 4823 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="metallb-system/metallb-operator-controller-manager-5d4c58b6db-plllb" event={"ID":"0c6a98e7-03f3-402e-ae6e-18c7b2a09ead","Type":"ContainerStarted","Data":"05600ed42fec31fd298b55273d4268bd034d68e7de1aa3fa224fe44d3c19db08"} Dec 16 07:12:00 crc kubenswrapper[4823]: I1216 07:12:00.698537 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-5d4c58b6db-plllb" event={"ID":"0c6a98e7-03f3-402e-ae6e-18c7b2a09ead","Type":"ContainerStarted","Data":"9d356019cc450d79ecc78fbdd01650c4971785b90ec2207b1d39fb4fe42a8cda"} Dec 16 07:12:00 crc kubenswrapper[4823]: I1216 07:12:00.699003 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-5d4c58b6db-plllb" Dec 16 07:12:00 crc kubenswrapper[4823]: I1216 07:12:00.725753 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-5d4c58b6db-plllb" podStartSLOduration=2.7278411030000003 podStartE2EDuration="5.725733212s" podCreationTimestamp="2025-12-16 07:11:55 +0000 UTC" firstStartedPulling="2025-12-16 07:11:56.684350838 +0000 UTC m=+995.172916961" lastFinishedPulling="2025-12-16 07:11:59.682242957 +0000 UTC m=+998.170809070" observedRunningTime="2025-12-16 07:12:00.721301493 +0000 UTC m=+999.209867636" watchObservedRunningTime="2025-12-16 07:12:00.725733212 +0000 UTC m=+999.214299345" Dec 16 07:12:01 crc kubenswrapper[4823]: I1216 07:12:01.706082 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-86d44cc785-ftsr4" event={"ID":"7e3777ec-c803-4417-8381-86fb3ad02265","Type":"ContainerStarted","Data":"6e279ff9f29e217f63b95372c9af538a40bd14e0ee69ce8ac0896e1d616a0204"} Dec 16 07:12:01 crc kubenswrapper[4823]: I1216 07:12:01.724885 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-86d44cc785-ftsr4" 
podStartSLOduration=1.637426182 podStartE2EDuration="5.724842203s" podCreationTimestamp="2025-12-16 07:11:56 +0000 UTC" firstStartedPulling="2025-12-16 07:11:56.949546232 +0000 UTC m=+995.438112355" lastFinishedPulling="2025-12-16 07:12:01.036962253 +0000 UTC m=+999.525528376" observedRunningTime="2025-12-16 07:12:01.721940392 +0000 UTC m=+1000.210506515" watchObservedRunningTime="2025-12-16 07:12:01.724842203 +0000 UTC m=+1000.213408326" Dec 16 07:12:02 crc kubenswrapper[4823]: I1216 07:12:02.714178 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-86d44cc785-ftsr4" Dec 16 07:12:16 crc kubenswrapper[4823]: I1216 07:12:16.512868 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-86d44cc785-ftsr4" Dec 16 07:12:36 crc kubenswrapper[4823]: I1216 07:12:36.240340 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-5d4c58b6db-plllb" Dec 16 07:12:36 crc kubenswrapper[4823]: I1216 07:12:36.941794 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-sdgmf"] Dec 16 07:12:36 crc kubenswrapper[4823]: I1216 07:12:36.944905 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-sdgmf" Dec 16 07:12:36 crc kubenswrapper[4823]: I1216 07:12:36.947163 4823 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-n5j45" Dec 16 07:12:36 crc kubenswrapper[4823]: I1216 07:12:36.947434 4823 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Dec 16 07:12:36 crc kubenswrapper[4823]: I1216 07:12:36.949054 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Dec 16 07:12:36 crc kubenswrapper[4823]: I1216 07:12:36.951589 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7784b6fcf-jm7hm"] Dec 16 07:12:36 crc kubenswrapper[4823]: I1216 07:12:36.952508 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-jm7hm" Dec 16 07:12:36 crc kubenswrapper[4823]: I1216 07:12:36.954214 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7784b6fcf-jm7hm"] Dec 16 07:12:36 crc kubenswrapper[4823]: I1216 07:12:36.957334 4823 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Dec 16 07:12:37 crc kubenswrapper[4823]: I1216 07:12:37.019798 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-5bddd4b946-cn66r"] Dec 16 07:12:37 crc kubenswrapper[4823]: I1216 07:12:37.020774 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-5bddd4b946-cn66r" Dec 16 07:12:37 crc kubenswrapper[4823]: I1216 07:12:37.022633 4823 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Dec 16 07:12:37 crc kubenswrapper[4823]: I1216 07:12:37.036835 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-dftnv"] Dec 16 07:12:37 crc kubenswrapper[4823]: I1216 07:12:37.038729 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-dftnv" Dec 16 07:12:37 crc kubenswrapper[4823]: I1216 07:12:37.046647 4823 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-wvjmn" Dec 16 07:12:37 crc kubenswrapper[4823]: I1216 07:12:37.046826 4823 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Dec 16 07:12:37 crc kubenswrapper[4823]: I1216 07:12:37.046950 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Dec 16 07:12:37 crc kubenswrapper[4823]: I1216 07:12:37.046960 4823 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Dec 16 07:12:37 crc kubenswrapper[4823]: I1216 07:12:37.047573 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-5bddd4b946-cn66r"] Dec 16 07:12:37 crc kubenswrapper[4823]: I1216 07:12:37.082565 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6f9ee161-6a5b-47a4-b15e-d3f9d1d7a068-cert\") pod \"frr-k8s-webhook-server-7784b6fcf-jm7hm\" (UID: \"6f9ee161-6a5b-47a4-b15e-d3f9d1d7a068\") " pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-jm7hm" Dec 16 07:12:37 crc kubenswrapper[4823]: I1216 07:12:37.082795 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"metrics-certs\" (UniqueName: \"kubernetes.io/secret/69276924-096b-4e93-9397-08095f966062-metrics-certs\") pod \"frr-k8s-sdgmf\" (UID: \"69276924-096b-4e93-9397-08095f966062\") " pod="metallb-system/frr-k8s-sdgmf" Dec 16 07:12:37 crc kubenswrapper[4823]: I1216 07:12:37.082902 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zw68\" (UniqueName: \"kubernetes.io/projected/6f9ee161-6a5b-47a4-b15e-d3f9d1d7a068-kube-api-access-5zw68\") pod \"frr-k8s-webhook-server-7784b6fcf-jm7hm\" (UID: \"6f9ee161-6a5b-47a4-b15e-d3f9d1d7a068\") " pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-jm7hm" Dec 16 07:12:37 crc kubenswrapper[4823]: I1216 07:12:37.083076 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/69276924-096b-4e93-9397-08095f966062-frr-startup\") pod \"frr-k8s-sdgmf\" (UID: \"69276924-096b-4e93-9397-08095f966062\") " pod="metallb-system/frr-k8s-sdgmf" Dec 16 07:12:37 crc kubenswrapper[4823]: I1216 07:12:37.083445 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/69276924-096b-4e93-9397-08095f966062-frr-conf\") pod \"frr-k8s-sdgmf\" (UID: \"69276924-096b-4e93-9397-08095f966062\") " pod="metallb-system/frr-k8s-sdgmf" Dec 16 07:12:37 crc kubenswrapper[4823]: I1216 07:12:37.083483 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/69276924-096b-4e93-9397-08095f966062-frr-sockets\") pod \"frr-k8s-sdgmf\" (UID: \"69276924-096b-4e93-9397-08095f966062\") " pod="metallb-system/frr-k8s-sdgmf" Dec 16 07:12:37 crc kubenswrapper[4823]: I1216 07:12:37.083519 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: 
\"kubernetes.io/empty-dir/69276924-096b-4e93-9397-08095f966062-reloader\") pod \"frr-k8s-sdgmf\" (UID: \"69276924-096b-4e93-9397-08095f966062\") " pod="metallb-system/frr-k8s-sdgmf" Dec 16 07:12:37 crc kubenswrapper[4823]: I1216 07:12:37.083553 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/69276924-096b-4e93-9397-08095f966062-metrics\") pod \"frr-k8s-sdgmf\" (UID: \"69276924-096b-4e93-9397-08095f966062\") " pod="metallb-system/frr-k8s-sdgmf" Dec 16 07:12:37 crc kubenswrapper[4823]: I1216 07:12:37.083602 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tddld\" (UniqueName: \"kubernetes.io/projected/69276924-096b-4e93-9397-08095f966062-kube-api-access-tddld\") pod \"frr-k8s-sdgmf\" (UID: \"69276924-096b-4e93-9397-08095f966062\") " pod="metallb-system/frr-k8s-sdgmf" Dec 16 07:12:37 crc kubenswrapper[4823]: I1216 07:12:37.184512 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2620ac46-bf4c-4672-aee2-17d87685b2b9-metrics-certs\") pod \"controller-5bddd4b946-cn66r\" (UID: \"2620ac46-bf4c-4672-aee2-17d87685b2b9\") " pod="metallb-system/controller-5bddd4b946-cn66r" Dec 16 07:12:37 crc kubenswrapper[4823]: I1216 07:12:37.184596 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/69276924-096b-4e93-9397-08095f966062-metrics-certs\") pod \"frr-k8s-sdgmf\" (UID: \"69276924-096b-4e93-9397-08095f966062\") " pod="metallb-system/frr-k8s-sdgmf" Dec 16 07:12:37 crc kubenswrapper[4823]: I1216 07:12:37.184620 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/1b2d484a-d9e3-4272-a080-a0439423997a-memberlist\") pod \"speaker-dftnv\" 
(UID: \"1b2d484a-d9e3-4272-a080-a0439423997a\") " pod="metallb-system/speaker-dftnv" Dec 16 07:12:37 crc kubenswrapper[4823]: I1216 07:12:37.184639 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zw68\" (UniqueName: \"kubernetes.io/projected/6f9ee161-6a5b-47a4-b15e-d3f9d1d7a068-kube-api-access-5zw68\") pod \"frr-k8s-webhook-server-7784b6fcf-jm7hm\" (UID: \"6f9ee161-6a5b-47a4-b15e-d3f9d1d7a068\") " pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-jm7hm" Dec 16 07:12:37 crc kubenswrapper[4823]: I1216 07:12:37.184669 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2620ac46-bf4c-4672-aee2-17d87685b2b9-cert\") pod \"controller-5bddd4b946-cn66r\" (UID: \"2620ac46-bf4c-4672-aee2-17d87685b2b9\") " pod="metallb-system/controller-5bddd4b946-cn66r" Dec 16 07:12:37 crc kubenswrapper[4823]: I1216 07:12:37.184689 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/69276924-096b-4e93-9397-08095f966062-frr-startup\") pod \"frr-k8s-sdgmf\" (UID: \"69276924-096b-4e93-9397-08095f966062\") " pod="metallb-system/frr-k8s-sdgmf" Dec 16 07:12:37 crc kubenswrapper[4823]: I1216 07:12:37.184703 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2r27\" (UniqueName: \"kubernetes.io/projected/1b2d484a-d9e3-4272-a080-a0439423997a-kube-api-access-f2r27\") pod \"speaker-dftnv\" (UID: \"1b2d484a-d9e3-4272-a080-a0439423997a\") " pod="metallb-system/speaker-dftnv" Dec 16 07:12:37 crc kubenswrapper[4823]: I1216 07:12:37.184723 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/69276924-096b-4e93-9397-08095f966062-frr-conf\") pod \"frr-k8s-sdgmf\" (UID: \"69276924-096b-4e93-9397-08095f966062\") " 
pod="metallb-system/frr-k8s-sdgmf" Dec 16 07:12:37 crc kubenswrapper[4823]: I1216 07:12:37.184738 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/69276924-096b-4e93-9397-08095f966062-frr-sockets\") pod \"frr-k8s-sdgmf\" (UID: \"69276924-096b-4e93-9397-08095f966062\") " pod="metallb-system/frr-k8s-sdgmf" Dec 16 07:12:37 crc kubenswrapper[4823]: I1216 07:12:37.184758 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1b2d484a-d9e3-4272-a080-a0439423997a-metrics-certs\") pod \"speaker-dftnv\" (UID: \"1b2d484a-d9e3-4272-a080-a0439423997a\") " pod="metallb-system/speaker-dftnv" Dec 16 07:12:37 crc kubenswrapper[4823]: I1216 07:12:37.184774 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/69276924-096b-4e93-9397-08095f966062-reloader\") pod \"frr-k8s-sdgmf\" (UID: \"69276924-096b-4e93-9397-08095f966062\") " pod="metallb-system/frr-k8s-sdgmf" Dec 16 07:12:37 crc kubenswrapper[4823]: I1216 07:12:37.184797 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vs9tt\" (UniqueName: \"kubernetes.io/projected/2620ac46-bf4c-4672-aee2-17d87685b2b9-kube-api-access-vs9tt\") pod \"controller-5bddd4b946-cn66r\" (UID: \"2620ac46-bf4c-4672-aee2-17d87685b2b9\") " pod="metallb-system/controller-5bddd4b946-cn66r" Dec 16 07:12:37 crc kubenswrapper[4823]: I1216 07:12:37.184811 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/1b2d484a-d9e3-4272-a080-a0439423997a-metallb-excludel2\") pod \"speaker-dftnv\" (UID: \"1b2d484a-d9e3-4272-a080-a0439423997a\") " pod="metallb-system/speaker-dftnv" Dec 16 07:12:37 crc kubenswrapper[4823]: I1216 07:12:37.184827 
4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/69276924-096b-4e93-9397-08095f966062-metrics\") pod \"frr-k8s-sdgmf\" (UID: \"69276924-096b-4e93-9397-08095f966062\") " pod="metallb-system/frr-k8s-sdgmf" Dec 16 07:12:37 crc kubenswrapper[4823]: I1216 07:12:37.184854 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tddld\" (UniqueName: \"kubernetes.io/projected/69276924-096b-4e93-9397-08095f966062-kube-api-access-tddld\") pod \"frr-k8s-sdgmf\" (UID: \"69276924-096b-4e93-9397-08095f966062\") " pod="metallb-system/frr-k8s-sdgmf" Dec 16 07:12:37 crc kubenswrapper[4823]: I1216 07:12:37.184871 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6f9ee161-6a5b-47a4-b15e-d3f9d1d7a068-cert\") pod \"frr-k8s-webhook-server-7784b6fcf-jm7hm\" (UID: \"6f9ee161-6a5b-47a4-b15e-d3f9d1d7a068\") " pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-jm7hm" Dec 16 07:12:37 crc kubenswrapper[4823]: I1216 07:12:37.185513 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/69276924-096b-4e93-9397-08095f966062-metrics\") pod \"frr-k8s-sdgmf\" (UID: \"69276924-096b-4e93-9397-08095f966062\") " pod="metallb-system/frr-k8s-sdgmf" Dec 16 07:12:37 crc kubenswrapper[4823]: I1216 07:12:37.185714 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/69276924-096b-4e93-9397-08095f966062-frr-conf\") pod \"frr-k8s-sdgmf\" (UID: \"69276924-096b-4e93-9397-08095f966062\") " pod="metallb-system/frr-k8s-sdgmf" Dec 16 07:12:37 crc kubenswrapper[4823]: I1216 07:12:37.185773 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/69276924-096b-4e93-9397-08095f966062-frr-sockets\") pod 
\"frr-k8s-sdgmf\" (UID: \"69276924-096b-4e93-9397-08095f966062\") " pod="metallb-system/frr-k8s-sdgmf" Dec 16 07:12:37 crc kubenswrapper[4823]: I1216 07:12:37.186005 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/69276924-096b-4e93-9397-08095f966062-reloader\") pod \"frr-k8s-sdgmf\" (UID: \"69276924-096b-4e93-9397-08095f966062\") " pod="metallb-system/frr-k8s-sdgmf" Dec 16 07:12:37 crc kubenswrapper[4823]: I1216 07:12:37.186985 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/69276924-096b-4e93-9397-08095f966062-frr-startup\") pod \"frr-k8s-sdgmf\" (UID: \"69276924-096b-4e93-9397-08095f966062\") " pod="metallb-system/frr-k8s-sdgmf" Dec 16 07:12:37 crc kubenswrapper[4823]: I1216 07:12:37.190551 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/69276924-096b-4e93-9397-08095f966062-metrics-certs\") pod \"frr-k8s-sdgmf\" (UID: \"69276924-096b-4e93-9397-08095f966062\") " pod="metallb-system/frr-k8s-sdgmf" Dec 16 07:12:37 crc kubenswrapper[4823]: I1216 07:12:37.190834 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6f9ee161-6a5b-47a4-b15e-d3f9d1d7a068-cert\") pod \"frr-k8s-webhook-server-7784b6fcf-jm7hm\" (UID: \"6f9ee161-6a5b-47a4-b15e-d3f9d1d7a068\") " pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-jm7hm" Dec 16 07:12:37 crc kubenswrapper[4823]: I1216 07:12:37.202932 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zw68\" (UniqueName: \"kubernetes.io/projected/6f9ee161-6a5b-47a4-b15e-d3f9d1d7a068-kube-api-access-5zw68\") pod \"frr-k8s-webhook-server-7784b6fcf-jm7hm\" (UID: \"6f9ee161-6a5b-47a4-b15e-d3f9d1d7a068\") " pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-jm7hm" Dec 16 07:12:37 crc kubenswrapper[4823]: I1216 
07:12:37.210921 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tddld\" (UniqueName: \"kubernetes.io/projected/69276924-096b-4e93-9397-08095f966062-kube-api-access-tddld\") pod \"frr-k8s-sdgmf\" (UID: \"69276924-096b-4e93-9397-08095f966062\") " pod="metallb-system/frr-k8s-sdgmf" Dec 16 07:12:37 crc kubenswrapper[4823]: I1216 07:12:37.271386 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-sdgmf" Dec 16 07:12:37 crc kubenswrapper[4823]: I1216 07:12:37.282607 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-jm7hm" Dec 16 07:12:37 crc kubenswrapper[4823]: I1216 07:12:37.285962 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2620ac46-bf4c-4672-aee2-17d87685b2b9-metrics-certs\") pod \"controller-5bddd4b946-cn66r\" (UID: \"2620ac46-bf4c-4672-aee2-17d87685b2b9\") " pod="metallb-system/controller-5bddd4b946-cn66r" Dec 16 07:12:37 crc kubenswrapper[4823]: I1216 07:12:37.286056 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/1b2d484a-d9e3-4272-a080-a0439423997a-memberlist\") pod \"speaker-dftnv\" (UID: \"1b2d484a-d9e3-4272-a080-a0439423997a\") " pod="metallb-system/speaker-dftnv" Dec 16 07:12:37 crc kubenswrapper[4823]: I1216 07:12:37.286131 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2620ac46-bf4c-4672-aee2-17d87685b2b9-cert\") pod \"controller-5bddd4b946-cn66r\" (UID: \"2620ac46-bf4c-4672-aee2-17d87685b2b9\") " pod="metallb-system/controller-5bddd4b946-cn66r" Dec 16 07:12:37 crc kubenswrapper[4823]: I1216 07:12:37.286171 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f2r27\" (UniqueName: 
\"kubernetes.io/projected/1b2d484a-d9e3-4272-a080-a0439423997a-kube-api-access-f2r27\") pod \"speaker-dftnv\" (UID: \"1b2d484a-d9e3-4272-a080-a0439423997a\") " pod="metallb-system/speaker-dftnv" Dec 16 07:12:37 crc kubenswrapper[4823]: I1216 07:12:37.286226 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1b2d484a-d9e3-4272-a080-a0439423997a-metrics-certs\") pod \"speaker-dftnv\" (UID: \"1b2d484a-d9e3-4272-a080-a0439423997a\") " pod="metallb-system/speaker-dftnv" Dec 16 07:12:37 crc kubenswrapper[4823]: E1216 07:12:37.286253 4823 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 16 07:12:37 crc kubenswrapper[4823]: I1216 07:12:37.286274 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vs9tt\" (UniqueName: \"kubernetes.io/projected/2620ac46-bf4c-4672-aee2-17d87685b2b9-kube-api-access-vs9tt\") pod \"controller-5bddd4b946-cn66r\" (UID: \"2620ac46-bf4c-4672-aee2-17d87685b2b9\") " pod="metallb-system/controller-5bddd4b946-cn66r" Dec 16 07:12:37 crc kubenswrapper[4823]: E1216 07:12:37.286356 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1b2d484a-d9e3-4272-a080-a0439423997a-memberlist podName:1b2d484a-d9e3-4272-a080-a0439423997a nodeName:}" failed. No retries permitted until 2025-12-16 07:12:37.786300522 +0000 UTC m=+1036.274866645 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/1b2d484a-d9e3-4272-a080-a0439423997a-memberlist") pod "speaker-dftnv" (UID: "1b2d484a-d9e3-4272-a080-a0439423997a") : secret "metallb-memberlist" not found Dec 16 07:12:37 crc kubenswrapper[4823]: I1216 07:12:37.286385 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/1b2d484a-d9e3-4272-a080-a0439423997a-metallb-excludel2\") pod \"speaker-dftnv\" (UID: \"1b2d484a-d9e3-4272-a080-a0439423997a\") " pod="metallb-system/speaker-dftnv" Dec 16 07:12:37 crc kubenswrapper[4823]: I1216 07:12:37.287735 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/1b2d484a-d9e3-4272-a080-a0439423997a-metallb-excludel2\") pod \"speaker-dftnv\" (UID: \"1b2d484a-d9e3-4272-a080-a0439423997a\") " pod="metallb-system/speaker-dftnv" Dec 16 07:12:37 crc kubenswrapper[4823]: I1216 07:12:37.290770 4823 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Dec 16 07:12:37 crc kubenswrapper[4823]: I1216 07:12:37.290891 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2620ac46-bf4c-4672-aee2-17d87685b2b9-metrics-certs\") pod \"controller-5bddd4b946-cn66r\" (UID: \"2620ac46-bf4c-4672-aee2-17d87685b2b9\") " pod="metallb-system/controller-5bddd4b946-cn66r" Dec 16 07:12:37 crc kubenswrapper[4823]: I1216 07:12:37.295791 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1b2d484a-d9e3-4272-a080-a0439423997a-metrics-certs\") pod \"speaker-dftnv\" (UID: \"1b2d484a-d9e3-4272-a080-a0439423997a\") " pod="metallb-system/speaker-dftnv" Dec 16 07:12:37 crc kubenswrapper[4823]: I1216 07:12:37.301751 4823 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2620ac46-bf4c-4672-aee2-17d87685b2b9-cert\") pod \"controller-5bddd4b946-cn66r\" (UID: \"2620ac46-bf4c-4672-aee2-17d87685b2b9\") " pod="metallb-system/controller-5bddd4b946-cn66r" Dec 16 07:12:37 crc kubenswrapper[4823]: I1216 07:12:37.305593 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vs9tt\" (UniqueName: \"kubernetes.io/projected/2620ac46-bf4c-4672-aee2-17d87685b2b9-kube-api-access-vs9tt\") pod \"controller-5bddd4b946-cn66r\" (UID: \"2620ac46-bf4c-4672-aee2-17d87685b2b9\") " pod="metallb-system/controller-5bddd4b946-cn66r" Dec 16 07:12:37 crc kubenswrapper[4823]: I1216 07:12:37.308628 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2r27\" (UniqueName: \"kubernetes.io/projected/1b2d484a-d9e3-4272-a080-a0439423997a-kube-api-access-f2r27\") pod \"speaker-dftnv\" (UID: \"1b2d484a-d9e3-4272-a080-a0439423997a\") " pod="metallb-system/speaker-dftnv" Dec 16 07:12:37 crc kubenswrapper[4823]: I1216 07:12:37.370980 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-5bddd4b946-cn66r" Dec 16 07:12:37 crc kubenswrapper[4823]: I1216 07:12:37.568352 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-5bddd4b946-cn66r"] Dec 16 07:12:37 crc kubenswrapper[4823]: I1216 07:12:37.701548 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7784b6fcf-jm7hm"] Dec 16 07:12:37 crc kubenswrapper[4823]: W1216 07:12:37.715836 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6f9ee161_6a5b_47a4_b15e_d3f9d1d7a068.slice/crio-1f88f16ed431bd6d6e23b1248155214ac2b2468eb624fadffdfc4ef3dddac988 WatchSource:0}: Error finding container 1f88f16ed431bd6d6e23b1248155214ac2b2468eb624fadffdfc4ef3dddac988: Status 404 returned error can't find the container with id 1f88f16ed431bd6d6e23b1248155214ac2b2468eb624fadffdfc4ef3dddac988 Dec 16 07:12:37 crc kubenswrapper[4823]: I1216 07:12:37.804350 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/1b2d484a-d9e3-4272-a080-a0439423997a-memberlist\") pod \"speaker-dftnv\" (UID: \"1b2d484a-d9e3-4272-a080-a0439423997a\") " pod="metallb-system/speaker-dftnv" Dec 16 07:12:37 crc kubenswrapper[4823]: I1216 07:12:37.808964 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/1b2d484a-d9e3-4272-a080-a0439423997a-memberlist\") pod \"speaker-dftnv\" (UID: \"1b2d484a-d9e3-4272-a080-a0439423997a\") " pod="metallb-system/speaker-dftnv" Dec 16 07:12:37 crc kubenswrapper[4823]: I1216 07:12:37.917603 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-jm7hm" event={"ID":"6f9ee161-6a5b-47a4-b15e-d3f9d1d7a068","Type":"ContainerStarted","Data":"1f88f16ed431bd6d6e23b1248155214ac2b2468eb624fadffdfc4ef3dddac988"} Dec 16 
07:12:37 crc kubenswrapper[4823]: I1216 07:12:37.919637 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-5bddd4b946-cn66r" event={"ID":"2620ac46-bf4c-4672-aee2-17d87685b2b9","Type":"ContainerStarted","Data":"a2bdda1c80ad2250d7e737bab308797aa3e1a03e9c096c8185029cf716593773"} Dec 16 07:12:37 crc kubenswrapper[4823]: I1216 07:12:37.919679 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-5bddd4b946-cn66r" event={"ID":"2620ac46-bf4c-4672-aee2-17d87685b2b9","Type":"ContainerStarted","Data":"47a87dd08cbe92813cc15127e11a80266f896e9265a28ffd964293e85c1a85de"} Dec 16 07:12:37 crc kubenswrapper[4823]: I1216 07:12:37.919691 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-5bddd4b946-cn66r" event={"ID":"2620ac46-bf4c-4672-aee2-17d87685b2b9","Type":"ContainerStarted","Data":"699da4d8e286f966e5a2a84d2386fd5fa69a8823534598d70d9cb32ab2188393"} Dec 16 07:12:37 crc kubenswrapper[4823]: I1216 07:12:37.919738 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-5bddd4b946-cn66r" Dec 16 07:12:37 crc kubenswrapper[4823]: I1216 07:12:37.920775 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-sdgmf" event={"ID":"69276924-096b-4e93-9397-08095f966062","Type":"ContainerStarted","Data":"ace08bfa0a110ce6ee17446f8ae33a421d20614d05d72491677469449c302b93"} Dec 16 07:12:37 crc kubenswrapper[4823]: I1216 07:12:37.937389 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-5bddd4b946-cn66r" podStartSLOduration=0.937368744 podStartE2EDuration="937.368744ms" podCreationTimestamp="2025-12-16 07:12:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 07:12:37.935904798 +0000 UTC m=+1036.424470931" watchObservedRunningTime="2025-12-16 07:12:37.937368744 +0000 UTC 
m=+1036.425934867" Dec 16 07:12:37 crc kubenswrapper[4823]: I1216 07:12:37.982429 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-dftnv" Dec 16 07:12:38 crc kubenswrapper[4823]: I1216 07:12:38.930386 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-dftnv" event={"ID":"1b2d484a-d9e3-4272-a080-a0439423997a","Type":"ContainerStarted","Data":"cad7504d65275e5214c9e15b1ab2669faac9b312103fb93958ac694d472b303f"} Dec 16 07:12:38 crc kubenswrapper[4823]: I1216 07:12:38.930749 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-dftnv" event={"ID":"1b2d484a-d9e3-4272-a080-a0439423997a","Type":"ContainerStarted","Data":"6d1930657279acda29e3bf450266b52ba6e000a6a73c21f06e158b2d5f2fa296"} Dec 16 07:12:38 crc kubenswrapper[4823]: I1216 07:12:38.930761 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-dftnv" event={"ID":"1b2d484a-d9e3-4272-a080-a0439423997a","Type":"ContainerStarted","Data":"422af6b487b8310a1ba0bab0d3f7f1a0a85878aa16e872a88af07a06af9accb3"} Dec 16 07:12:38 crc kubenswrapper[4823]: I1216 07:12:38.930905 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-dftnv" Dec 16 07:12:38 crc kubenswrapper[4823]: I1216 07:12:38.956593 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-dftnv" podStartSLOduration=1.9565778759999999 podStartE2EDuration="1.956577876s" podCreationTimestamp="2025-12-16 07:12:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 07:12:38.955872954 +0000 UTC m=+1037.444439077" watchObservedRunningTime="2025-12-16 07:12:38.956577876 +0000 UTC m=+1037.445143999" Dec 16 07:12:44 crc kubenswrapper[4823]: I1216 07:12:44.996801 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-jm7hm" event={"ID":"6f9ee161-6a5b-47a4-b15e-d3f9d1d7a068","Type":"ContainerStarted","Data":"0b88f1fa326977d8315d3a76278d223bedd144af6dff7bf520bfb0af0f29b74b"} Dec 16 07:12:44 crc kubenswrapper[4823]: I1216 07:12:44.997445 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-jm7hm" Dec 16 07:12:44 crc kubenswrapper[4823]: I1216 07:12:44.999098 4823 generic.go:334] "Generic (PLEG): container finished" podID="69276924-096b-4e93-9397-08095f966062" containerID="d14420f9685e6cdc7ea590284944dc15a2779fe2b3fb09c5e442086b4e11982c" exitCode=0 Dec 16 07:12:44 crc kubenswrapper[4823]: I1216 07:12:44.999147 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-sdgmf" event={"ID":"69276924-096b-4e93-9397-08095f966062","Type":"ContainerDied","Data":"d14420f9685e6cdc7ea590284944dc15a2779fe2b3fb09c5e442086b4e11982c"} Dec 16 07:12:45 crc kubenswrapper[4823]: I1216 07:12:45.071450 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-jm7hm" podStartSLOduration=2.660269084 podStartE2EDuration="9.071426906s" podCreationTimestamp="2025-12-16 07:12:36 +0000 UTC" firstStartedPulling="2025-12-16 07:12:37.718444194 +0000 UTC m=+1036.207010317" lastFinishedPulling="2025-12-16 07:12:44.129602006 +0000 UTC m=+1042.618168139" observedRunningTime="2025-12-16 07:12:45.018158822 +0000 UTC m=+1043.506724945" watchObservedRunningTime="2025-12-16 07:12:45.071426906 +0000 UTC m=+1043.559993029" Dec 16 07:12:46 crc kubenswrapper[4823]: I1216 07:12:46.006605 4823 generic.go:334] "Generic (PLEG): container finished" podID="69276924-096b-4e93-9397-08095f966062" containerID="ea44392b0d0df2237678786d3a7be3e794134a8cf23a971849ef3c9267e1008d" exitCode=0 Dec 16 07:12:46 crc kubenswrapper[4823]: I1216 07:12:46.006860 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="metallb-system/frr-k8s-sdgmf" event={"ID":"69276924-096b-4e93-9397-08095f966062","Type":"ContainerDied","Data":"ea44392b0d0df2237678786d3a7be3e794134a8cf23a971849ef3c9267e1008d"} Dec 16 07:12:47 crc kubenswrapper[4823]: I1216 07:12:47.016721 4823 generic.go:334] "Generic (PLEG): container finished" podID="69276924-096b-4e93-9397-08095f966062" containerID="7f389950c8d1bdc3301ed5250f1d3e6b117f93861788a7e2af7fedfe37dffaab" exitCode=0 Dec 16 07:12:47 crc kubenswrapper[4823]: I1216 07:12:47.016789 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-sdgmf" event={"ID":"69276924-096b-4e93-9397-08095f966062","Type":"ContainerDied","Data":"7f389950c8d1bdc3301ed5250f1d3e6b117f93861788a7e2af7fedfe37dffaab"} Dec 16 07:12:47 crc kubenswrapper[4823]: I1216 07:12:47.377624 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-5bddd4b946-cn66r" Dec 16 07:12:48 crc kubenswrapper[4823]: I1216 07:12:48.026236 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-sdgmf" event={"ID":"69276924-096b-4e93-9397-08095f966062","Type":"ContainerStarted","Data":"b637a5a609b45ec331113152656d5946a5c205a1f25b3cb62af0cf08273abd4d"} Dec 16 07:12:48 crc kubenswrapper[4823]: I1216 07:12:48.026589 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-sdgmf" event={"ID":"69276924-096b-4e93-9397-08095f966062","Type":"ContainerStarted","Data":"6c27b11ad5c6bd06b7322f9249d7bac2588674c0ccc8b5779c41ebce6dadfcf9"} Dec 16 07:12:48 crc kubenswrapper[4823]: I1216 07:12:48.026607 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-sdgmf" event={"ID":"69276924-096b-4e93-9397-08095f966062","Type":"ContainerStarted","Data":"1ff5659c1b518cbb58ec0198c096bbd14024b511c26e7c0700b76d7eaf4ece6b"} Dec 16 07:12:48 crc kubenswrapper[4823]: I1216 07:12:48.026619 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-sdgmf" 
event={"ID":"69276924-096b-4e93-9397-08095f966062","Type":"ContainerStarted","Data":"8b27f3815db189febfe41aff04c2346c5280af647ca331a43ef56540eb3f7ce5"} Dec 16 07:12:49 crc kubenswrapper[4823]: I1216 07:12:49.039218 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-sdgmf" event={"ID":"69276924-096b-4e93-9397-08095f966062","Type":"ContainerStarted","Data":"d07ba9ba840dcf2d82076e986d09232ffb3e3bf3921b5ee2d864a06b3a1788af"} Dec 16 07:12:49 crc kubenswrapper[4823]: I1216 07:12:49.039263 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-sdgmf" event={"ID":"69276924-096b-4e93-9397-08095f966062","Type":"ContainerStarted","Data":"85ab299eb08d6c24444644abd72cd1851f2368f34d07784607d752921d38add5"} Dec 16 07:12:49 crc kubenswrapper[4823]: I1216 07:12:49.040256 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-sdgmf" Dec 16 07:12:49 crc kubenswrapper[4823]: I1216 07:12:49.081343 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-sdgmf" podStartSLOduration=6.414284577 podStartE2EDuration="13.081311201s" podCreationTimestamp="2025-12-16 07:12:36 +0000 UTC" firstStartedPulling="2025-12-16 07:12:37.443347508 +0000 UTC m=+1035.931913631" lastFinishedPulling="2025-12-16 07:12:44.110374132 +0000 UTC m=+1042.598940255" observedRunningTime="2025-12-16 07:12:49.071866533 +0000 UTC m=+1047.560468857" watchObservedRunningTime="2025-12-16 07:12:49.081311201 +0000 UTC m=+1047.569877364" Dec 16 07:12:52 crc kubenswrapper[4823]: I1216 07:12:52.271588 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-sdgmf" Dec 16 07:12:52 crc kubenswrapper[4823]: I1216 07:12:52.330498 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-sdgmf" Dec 16 07:12:57 crc kubenswrapper[4823]: I1216 07:12:57.274484 4823 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="metallb-system/frr-k8s-sdgmf" Dec 16 07:12:57 crc kubenswrapper[4823]: I1216 07:12:57.293587 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-jm7hm" Dec 16 07:12:57 crc kubenswrapper[4823]: I1216 07:12:57.986344 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-dftnv" Dec 16 07:12:59 crc kubenswrapper[4823]: I1216 07:12:59.731282 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a9dmv6"] Dec 16 07:12:59 crc kubenswrapper[4823]: I1216 07:12:59.733012 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a9dmv6" Dec 16 07:12:59 crc kubenswrapper[4823]: I1216 07:12:59.735622 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 16 07:12:59 crc kubenswrapper[4823]: I1216 07:12:59.752538 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a9dmv6"] Dec 16 07:12:59 crc kubenswrapper[4823]: I1216 07:12:59.792918 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/168f9fd9-a3ba-4664-8d70-2b46c6c66071-util\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a9dmv6\" (UID: \"168f9fd9-a3ba-4664-8d70-2b46c6c66071\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a9dmv6" Dec 16 07:12:59 crc kubenswrapper[4823]: I1216 07:12:59.792979 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/168f9fd9-a3ba-4664-8d70-2b46c6c66071-bundle\") 
pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a9dmv6\" (UID: \"168f9fd9-a3ba-4664-8d70-2b46c6c66071\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a9dmv6" Dec 16 07:12:59 crc kubenswrapper[4823]: I1216 07:12:59.793306 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skrtw\" (UniqueName: \"kubernetes.io/projected/168f9fd9-a3ba-4664-8d70-2b46c6c66071-kube-api-access-skrtw\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a9dmv6\" (UID: \"168f9fd9-a3ba-4664-8d70-2b46c6c66071\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a9dmv6" Dec 16 07:12:59 crc kubenswrapper[4823]: I1216 07:12:59.894813 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/168f9fd9-a3ba-4664-8d70-2b46c6c66071-bundle\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a9dmv6\" (UID: \"168f9fd9-a3ba-4664-8d70-2b46c6c66071\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a9dmv6" Dec 16 07:12:59 crc kubenswrapper[4823]: I1216 07:12:59.894913 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-skrtw\" (UniqueName: \"kubernetes.io/projected/168f9fd9-a3ba-4664-8d70-2b46c6c66071-kube-api-access-skrtw\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a9dmv6\" (UID: \"168f9fd9-a3ba-4664-8d70-2b46c6c66071\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a9dmv6" Dec 16 07:12:59 crc kubenswrapper[4823]: I1216 07:12:59.895017 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/168f9fd9-a3ba-4664-8d70-2b46c6c66071-util\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a9dmv6\" (UID: 
\"168f9fd9-a3ba-4664-8d70-2b46c6c66071\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a9dmv6" Dec 16 07:12:59 crc kubenswrapper[4823]: I1216 07:12:59.895547 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/168f9fd9-a3ba-4664-8d70-2b46c6c66071-util\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a9dmv6\" (UID: \"168f9fd9-a3ba-4664-8d70-2b46c6c66071\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a9dmv6" Dec 16 07:12:59 crc kubenswrapper[4823]: I1216 07:12:59.895650 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/168f9fd9-a3ba-4664-8d70-2b46c6c66071-bundle\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a9dmv6\" (UID: \"168f9fd9-a3ba-4664-8d70-2b46c6c66071\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a9dmv6" Dec 16 07:12:59 crc kubenswrapper[4823]: I1216 07:12:59.910825 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-skrtw\" (UniqueName: \"kubernetes.io/projected/168f9fd9-a3ba-4664-8d70-2b46c6c66071-kube-api-access-skrtw\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a9dmv6\" (UID: \"168f9fd9-a3ba-4664-8d70-2b46c6c66071\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a9dmv6" Dec 16 07:13:00 crc kubenswrapper[4823]: I1216 07:13:00.050373 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a9dmv6" Dec 16 07:13:00 crc kubenswrapper[4823]: I1216 07:13:00.260201 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a9dmv6"] Dec 16 07:13:00 crc kubenswrapper[4823]: W1216 07:13:00.263888 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod168f9fd9_a3ba_4664_8d70_2b46c6c66071.slice/crio-99855e065cbf28a3f201bc72183147b58f779bbafa2cae2c2e013e812ae328eb WatchSource:0}: Error finding container 99855e065cbf28a3f201bc72183147b58f779bbafa2cae2c2e013e812ae328eb: Status 404 returned error can't find the container with id 99855e065cbf28a3f201bc72183147b58f779bbafa2cae2c2e013e812ae328eb Dec 16 07:13:01 crc kubenswrapper[4823]: I1216 07:13:01.111347 4823 generic.go:334] "Generic (PLEG): container finished" podID="168f9fd9-a3ba-4664-8d70-2b46c6c66071" containerID="4f0d26e8d135d926f436690677a3618171d63b50c132687e266e539ca675af7e" exitCode=0 Dec 16 07:13:01 crc kubenswrapper[4823]: I1216 07:13:01.111394 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a9dmv6" event={"ID":"168f9fd9-a3ba-4664-8d70-2b46c6c66071","Type":"ContainerDied","Data":"4f0d26e8d135d926f436690677a3618171d63b50c132687e266e539ca675af7e"} Dec 16 07:13:01 crc kubenswrapper[4823]: I1216 07:13:01.111418 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a9dmv6" event={"ID":"168f9fd9-a3ba-4664-8d70-2b46c6c66071","Type":"ContainerStarted","Data":"99855e065cbf28a3f201bc72183147b58f779bbafa2cae2c2e013e812ae328eb"} Dec 16 07:13:04 crc kubenswrapper[4823]: I1216 07:13:04.145585 4823 generic.go:334] "Generic (PLEG): container finished" 
podID="168f9fd9-a3ba-4664-8d70-2b46c6c66071" containerID="1fede90337e4b2c3dc887db793f49c2b09c7e3af0324a3b1b9f3bbb045069605" exitCode=0 Dec 16 07:13:04 crc kubenswrapper[4823]: I1216 07:13:04.145674 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a9dmv6" event={"ID":"168f9fd9-a3ba-4664-8d70-2b46c6c66071","Type":"ContainerDied","Data":"1fede90337e4b2c3dc887db793f49c2b09c7e3af0324a3b1b9f3bbb045069605"} Dec 16 07:13:05 crc kubenswrapper[4823]: I1216 07:13:05.157049 4823 generic.go:334] "Generic (PLEG): container finished" podID="168f9fd9-a3ba-4664-8d70-2b46c6c66071" containerID="705e1c1d77320ccb7f85ad3c2562ef93e362ba88f127aca6ea03edb31ff2c228" exitCode=0 Dec 16 07:13:05 crc kubenswrapper[4823]: I1216 07:13:05.157097 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a9dmv6" event={"ID":"168f9fd9-a3ba-4664-8d70-2b46c6c66071","Type":"ContainerDied","Data":"705e1c1d77320ccb7f85ad3c2562ef93e362ba88f127aca6ea03edb31ff2c228"} Dec 16 07:13:06 crc kubenswrapper[4823]: I1216 07:13:06.420451 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a9dmv6" Dec 16 07:13:06 crc kubenswrapper[4823]: I1216 07:13:06.497537 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/168f9fd9-a3ba-4664-8d70-2b46c6c66071-bundle\") pod \"168f9fd9-a3ba-4664-8d70-2b46c6c66071\" (UID: \"168f9fd9-a3ba-4664-8d70-2b46c6c66071\") " Dec 16 07:13:06 crc kubenswrapper[4823]: I1216 07:13:06.497612 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-skrtw\" (UniqueName: \"kubernetes.io/projected/168f9fd9-a3ba-4664-8d70-2b46c6c66071-kube-api-access-skrtw\") pod \"168f9fd9-a3ba-4664-8d70-2b46c6c66071\" (UID: \"168f9fd9-a3ba-4664-8d70-2b46c6c66071\") " Dec 16 07:13:06 crc kubenswrapper[4823]: I1216 07:13:06.497696 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/168f9fd9-a3ba-4664-8d70-2b46c6c66071-util\") pod \"168f9fd9-a3ba-4664-8d70-2b46c6c66071\" (UID: \"168f9fd9-a3ba-4664-8d70-2b46c6c66071\") " Dec 16 07:13:06 crc kubenswrapper[4823]: I1216 07:13:06.498775 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/168f9fd9-a3ba-4664-8d70-2b46c6c66071-bundle" (OuterVolumeSpecName: "bundle") pod "168f9fd9-a3ba-4664-8d70-2b46c6c66071" (UID: "168f9fd9-a3ba-4664-8d70-2b46c6c66071"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:13:06 crc kubenswrapper[4823]: I1216 07:13:06.502915 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/168f9fd9-a3ba-4664-8d70-2b46c6c66071-kube-api-access-skrtw" (OuterVolumeSpecName: "kube-api-access-skrtw") pod "168f9fd9-a3ba-4664-8d70-2b46c6c66071" (UID: "168f9fd9-a3ba-4664-8d70-2b46c6c66071"). InnerVolumeSpecName "kube-api-access-skrtw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:13:06 crc kubenswrapper[4823]: I1216 07:13:06.512550 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/168f9fd9-a3ba-4664-8d70-2b46c6c66071-util" (OuterVolumeSpecName: "util") pod "168f9fd9-a3ba-4664-8d70-2b46c6c66071" (UID: "168f9fd9-a3ba-4664-8d70-2b46c6c66071"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:13:06 crc kubenswrapper[4823]: I1216 07:13:06.598799 4823 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/168f9fd9-a3ba-4664-8d70-2b46c6c66071-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:13:06 crc kubenswrapper[4823]: I1216 07:13:06.598848 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-skrtw\" (UniqueName: \"kubernetes.io/projected/168f9fd9-a3ba-4664-8d70-2b46c6c66071-kube-api-access-skrtw\") on node \"crc\" DevicePath \"\"" Dec 16 07:13:06 crc kubenswrapper[4823]: I1216 07:13:06.598860 4823 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/168f9fd9-a3ba-4664-8d70-2b46c6c66071-util\") on node \"crc\" DevicePath \"\"" Dec 16 07:13:07 crc kubenswrapper[4823]: I1216 07:13:07.170080 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a9dmv6" event={"ID":"168f9fd9-a3ba-4664-8d70-2b46c6c66071","Type":"ContainerDied","Data":"99855e065cbf28a3f201bc72183147b58f779bbafa2cae2c2e013e812ae328eb"} Dec 16 07:13:07 crc kubenswrapper[4823]: I1216 07:13:07.170118 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="99855e065cbf28a3f201bc72183147b58f779bbafa2cae2c2e013e812ae328eb" Dec 16 07:13:07 crc kubenswrapper[4823]: I1216 07:13:07.170161 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a9dmv6" Dec 16 07:13:10 crc kubenswrapper[4823]: I1216 07:13:10.057440 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-csldr"] Dec 16 07:13:10 crc kubenswrapper[4823]: E1216 07:13:10.057934 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="168f9fd9-a3ba-4664-8d70-2b46c6c66071" containerName="util" Dec 16 07:13:10 crc kubenswrapper[4823]: I1216 07:13:10.057947 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="168f9fd9-a3ba-4664-8d70-2b46c6c66071" containerName="util" Dec 16 07:13:10 crc kubenswrapper[4823]: E1216 07:13:10.057959 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="168f9fd9-a3ba-4664-8d70-2b46c6c66071" containerName="extract" Dec 16 07:13:10 crc kubenswrapper[4823]: I1216 07:13:10.057965 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="168f9fd9-a3ba-4664-8d70-2b46c6c66071" containerName="extract" Dec 16 07:13:10 crc kubenswrapper[4823]: E1216 07:13:10.057981 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="168f9fd9-a3ba-4664-8d70-2b46c6c66071" containerName="pull" Dec 16 07:13:10 crc kubenswrapper[4823]: I1216 07:13:10.057987 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="168f9fd9-a3ba-4664-8d70-2b46c6c66071" containerName="pull" Dec 16 07:13:10 crc kubenswrapper[4823]: I1216 07:13:10.058096 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="168f9fd9-a3ba-4664-8d70-2b46c6c66071" containerName="extract" Dec 16 07:13:10 crc kubenswrapper[4823]: I1216 07:13:10.058483 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-csldr" Dec 16 07:13:10 crc kubenswrapper[4823]: I1216 07:13:10.061217 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt" Dec 16 07:13:10 crc kubenswrapper[4823]: I1216 07:13:10.061372 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt" Dec 16 07:13:10 crc kubenswrapper[4823]: I1216 07:13:10.071859 4823 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager-operator"/"cert-manager-operator-controller-manager-dockercfg-92fb8" Dec 16 07:13:10 crc kubenswrapper[4823]: I1216 07:13:10.076036 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-csldr"] Dec 16 07:13:10 crc kubenswrapper[4823]: I1216 07:13:10.142397 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/898756ec-c5a5-462f-95e2-8c3897718314-tmp\") pod \"cert-manager-operator-controller-manager-64cf6dff88-csldr\" (UID: \"898756ec-c5a5-462f-95e2-8c3897718314\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-csldr" Dec 16 07:13:10 crc kubenswrapper[4823]: I1216 07:13:10.142500 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hplx\" (UniqueName: \"kubernetes.io/projected/898756ec-c5a5-462f-95e2-8c3897718314-kube-api-access-7hplx\") pod \"cert-manager-operator-controller-manager-64cf6dff88-csldr\" (UID: \"898756ec-c5a5-462f-95e2-8c3897718314\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-csldr" Dec 16 07:13:10 crc kubenswrapper[4823]: I1216 07:13:10.243220 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-7hplx\" (UniqueName: \"kubernetes.io/projected/898756ec-c5a5-462f-95e2-8c3897718314-kube-api-access-7hplx\") pod \"cert-manager-operator-controller-manager-64cf6dff88-csldr\" (UID: \"898756ec-c5a5-462f-95e2-8c3897718314\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-csldr" Dec 16 07:13:10 crc kubenswrapper[4823]: I1216 07:13:10.243319 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/898756ec-c5a5-462f-95e2-8c3897718314-tmp\") pod \"cert-manager-operator-controller-manager-64cf6dff88-csldr\" (UID: \"898756ec-c5a5-462f-95e2-8c3897718314\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-csldr" Dec 16 07:13:10 crc kubenswrapper[4823]: I1216 07:13:10.243856 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/898756ec-c5a5-462f-95e2-8c3897718314-tmp\") pod \"cert-manager-operator-controller-manager-64cf6dff88-csldr\" (UID: \"898756ec-c5a5-462f-95e2-8c3897718314\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-csldr" Dec 16 07:13:10 crc kubenswrapper[4823]: I1216 07:13:10.270807 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hplx\" (UniqueName: \"kubernetes.io/projected/898756ec-c5a5-462f-95e2-8c3897718314-kube-api-access-7hplx\") pod \"cert-manager-operator-controller-manager-64cf6dff88-csldr\" (UID: \"898756ec-c5a5-462f-95e2-8c3897718314\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-csldr" Dec 16 07:13:10 crc kubenswrapper[4823]: I1216 07:13:10.372074 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-csldr" Dec 16 07:13:10 crc kubenswrapper[4823]: I1216 07:13:10.858406 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-csldr"] Dec 16 07:13:10 crc kubenswrapper[4823]: W1216 07:13:10.867270 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod898756ec_c5a5_462f_95e2_8c3897718314.slice/crio-02fdc53676400a8e9331f1faff3a72b5d2f76b3fbdec9ebe727e7013859bee2b WatchSource:0}: Error finding container 02fdc53676400a8e9331f1faff3a72b5d2f76b3fbdec9ebe727e7013859bee2b: Status 404 returned error can't find the container with id 02fdc53676400a8e9331f1faff3a72b5d2f76b3fbdec9ebe727e7013859bee2b Dec 16 07:13:11 crc kubenswrapper[4823]: I1216 07:13:11.193303 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-csldr" event={"ID":"898756ec-c5a5-462f-95e2-8c3897718314","Type":"ContainerStarted","Data":"02fdc53676400a8e9331f1faff3a72b5d2f76b3fbdec9ebe727e7013859bee2b"} Dec 16 07:13:18 crc kubenswrapper[4823]: I1216 07:13:18.254999 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-csldr" event={"ID":"898756ec-c5a5-462f-95e2-8c3897718314","Type":"ContainerStarted","Data":"0f3b088406f15edab38d1637983bdcb2668bf6e61547b496f8c685f5a39817eb"} Dec 16 07:13:18 crc kubenswrapper[4823]: I1216 07:13:18.282950 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-csldr" podStartSLOduration=1.492220406 podStartE2EDuration="8.282929698s" podCreationTimestamp="2025-12-16 07:13:10 +0000 UTC" firstStartedPulling="2025-12-16 07:13:10.869194446 +0000 UTC m=+1069.357760569" 
lastFinishedPulling="2025-12-16 07:13:17.659903738 +0000 UTC m=+1076.148469861" observedRunningTime="2025-12-16 07:13:18.27980126 +0000 UTC m=+1076.768367393" watchObservedRunningTime="2025-12-16 07:13:18.282929698 +0000 UTC m=+1076.771495841" Dec 16 07:13:21 crc kubenswrapper[4823]: I1216 07:13:21.904889 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-f4fb5df64-ss7qt"] Dec 16 07:13:21 crc kubenswrapper[4823]: I1216 07:13:21.906666 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-f4fb5df64-ss7qt" Dec 16 07:13:21 crc kubenswrapper[4823]: I1216 07:13:21.910038 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Dec 16 07:13:21 crc kubenswrapper[4823]: I1216 07:13:21.910116 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Dec 16 07:13:21 crc kubenswrapper[4823]: I1216 07:13:21.910551 4823 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-b7bkc" Dec 16 07:13:21 crc kubenswrapper[4823]: I1216 07:13:21.918738 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-f4fb5df64-ss7qt"] Dec 16 07:13:21 crc kubenswrapper[4823]: I1216 07:13:21.993544 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zsnff\" (UniqueName: \"kubernetes.io/projected/35cdac9f-df23-48a6-93e5-83ff4cca639e-kube-api-access-zsnff\") pod \"cert-manager-webhook-f4fb5df64-ss7qt\" (UID: \"35cdac9f-df23-48a6-93e5-83ff4cca639e\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-ss7qt" Dec 16 07:13:21 crc kubenswrapper[4823]: I1216 07:13:21.993626 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/35cdac9f-df23-48a6-93e5-83ff4cca639e-bound-sa-token\") pod \"cert-manager-webhook-f4fb5df64-ss7qt\" (UID: \"35cdac9f-df23-48a6-93e5-83ff4cca639e\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-ss7qt" Dec 16 07:13:22 crc kubenswrapper[4823]: I1216 07:13:22.095236 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zsnff\" (UniqueName: \"kubernetes.io/projected/35cdac9f-df23-48a6-93e5-83ff4cca639e-kube-api-access-zsnff\") pod \"cert-manager-webhook-f4fb5df64-ss7qt\" (UID: \"35cdac9f-df23-48a6-93e5-83ff4cca639e\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-ss7qt" Dec 16 07:13:22 crc kubenswrapper[4823]: I1216 07:13:22.095305 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/35cdac9f-df23-48a6-93e5-83ff4cca639e-bound-sa-token\") pod \"cert-manager-webhook-f4fb5df64-ss7qt\" (UID: \"35cdac9f-df23-48a6-93e5-83ff4cca639e\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-ss7qt" Dec 16 07:13:22 crc kubenswrapper[4823]: I1216 07:13:22.117421 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/35cdac9f-df23-48a6-93e5-83ff4cca639e-bound-sa-token\") pod \"cert-manager-webhook-f4fb5df64-ss7qt\" (UID: \"35cdac9f-df23-48a6-93e5-83ff4cca639e\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-ss7qt" Dec 16 07:13:22 crc kubenswrapper[4823]: I1216 07:13:22.120444 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zsnff\" (UniqueName: \"kubernetes.io/projected/35cdac9f-df23-48a6-93e5-83ff4cca639e-kube-api-access-zsnff\") pod \"cert-manager-webhook-f4fb5df64-ss7qt\" (UID: \"35cdac9f-df23-48a6-93e5-83ff4cca639e\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-ss7qt" Dec 16 07:13:22 crc kubenswrapper[4823]: I1216 07:13:22.227243 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-f4fb5df64-ss7qt" Dec 16 07:13:22 crc kubenswrapper[4823]: I1216 07:13:22.675327 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-f4fb5df64-ss7qt"] Dec 16 07:13:22 crc kubenswrapper[4823]: W1216 07:13:22.687777 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod35cdac9f_df23_48a6_93e5_83ff4cca639e.slice/crio-33dbe74840d417f252ea2c07440e21ab3a0fdb524d8d69d4911b72bda4a12297 WatchSource:0}: Error finding container 33dbe74840d417f252ea2c07440e21ab3a0fdb524d8d69d4911b72bda4a12297: Status 404 returned error can't find the container with id 33dbe74840d417f252ea2c07440e21ab3a0fdb524d8d69d4911b72bda4a12297 Dec 16 07:13:22 crc kubenswrapper[4823]: I1216 07:13:22.862998 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-855d9ccff4-slqx8"] Dec 16 07:13:22 crc kubenswrapper[4823]: I1216 07:13:22.863893 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-855d9ccff4-slqx8" Dec 16 07:13:22 crc kubenswrapper[4823]: I1216 07:13:22.867154 4823 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-kdqrx" Dec 16 07:13:22 crc kubenswrapper[4823]: I1216 07:13:22.872329 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-855d9ccff4-slqx8"] Dec 16 07:13:23 crc kubenswrapper[4823]: I1216 07:13:23.004752 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/080d4cb1-f1c1-4fe3-ab3b-2c2e621d5a59-bound-sa-token\") pod \"cert-manager-cainjector-855d9ccff4-slqx8\" (UID: \"080d4cb1-f1c1-4fe3-ab3b-2c2e621d5a59\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-slqx8" Dec 16 07:13:23 crc kubenswrapper[4823]: I1216 07:13:23.005067 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-db974\" (UniqueName: \"kubernetes.io/projected/080d4cb1-f1c1-4fe3-ab3b-2c2e621d5a59-kube-api-access-db974\") pod \"cert-manager-cainjector-855d9ccff4-slqx8\" (UID: \"080d4cb1-f1c1-4fe3-ab3b-2c2e621d5a59\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-slqx8" Dec 16 07:13:23 crc kubenswrapper[4823]: I1216 07:13:23.106264 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-db974\" (UniqueName: \"kubernetes.io/projected/080d4cb1-f1c1-4fe3-ab3b-2c2e621d5a59-kube-api-access-db974\") pod \"cert-manager-cainjector-855d9ccff4-slqx8\" (UID: \"080d4cb1-f1c1-4fe3-ab3b-2c2e621d5a59\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-slqx8" Dec 16 07:13:23 crc kubenswrapper[4823]: I1216 07:13:23.106329 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/080d4cb1-f1c1-4fe3-ab3b-2c2e621d5a59-bound-sa-token\") pod \"cert-manager-cainjector-855d9ccff4-slqx8\" (UID: \"080d4cb1-f1c1-4fe3-ab3b-2c2e621d5a59\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-slqx8" Dec 16 07:13:23 crc kubenswrapper[4823]: I1216 07:13:23.125325 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-db974\" (UniqueName: \"kubernetes.io/projected/080d4cb1-f1c1-4fe3-ab3b-2c2e621d5a59-kube-api-access-db974\") pod \"cert-manager-cainjector-855d9ccff4-slqx8\" (UID: \"080d4cb1-f1c1-4fe3-ab3b-2c2e621d5a59\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-slqx8" Dec 16 07:13:23 crc kubenswrapper[4823]: I1216 07:13:23.126624 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/080d4cb1-f1c1-4fe3-ab3b-2c2e621d5a59-bound-sa-token\") pod \"cert-manager-cainjector-855d9ccff4-slqx8\" (UID: \"080d4cb1-f1c1-4fe3-ab3b-2c2e621d5a59\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-slqx8" Dec 16 07:13:23 crc kubenswrapper[4823]: I1216 07:13:23.180626 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-855d9ccff4-slqx8" Dec 16 07:13:23 crc kubenswrapper[4823]: I1216 07:13:23.289699 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-f4fb5df64-ss7qt" event={"ID":"35cdac9f-df23-48a6-93e5-83ff4cca639e","Type":"ContainerStarted","Data":"33dbe74840d417f252ea2c07440e21ab3a0fdb524d8d69d4911b72bda4a12297"} Dec 16 07:13:23 crc kubenswrapper[4823]: I1216 07:13:23.582598 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-855d9ccff4-slqx8"] Dec 16 07:13:23 crc kubenswrapper[4823]: W1216 07:13:23.595550 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod080d4cb1_f1c1_4fe3_ab3b_2c2e621d5a59.slice/crio-d6edeabad51af48185ad9e2ee579eab944a0b5b0568ae3a9e33eb55f1bb69b28 WatchSource:0}: Error finding container d6edeabad51af48185ad9e2ee579eab944a0b5b0568ae3a9e33eb55f1bb69b28: Status 404 returned error can't find the container with id d6edeabad51af48185ad9e2ee579eab944a0b5b0568ae3a9e33eb55f1bb69b28 Dec 16 07:13:24 crc kubenswrapper[4823]: I1216 07:13:24.299950 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-855d9ccff4-slqx8" event={"ID":"080d4cb1-f1c1-4fe3-ab3b-2c2e621d5a59","Type":"ContainerStarted","Data":"d6edeabad51af48185ad9e2ee579eab944a0b5b0568ae3a9e33eb55f1bb69b28"} Dec 16 07:13:28 crc kubenswrapper[4823]: I1216 07:13:28.133526 4823 patch_prober.go:28] interesting pod/machine-config-daemon-fv56f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 07:13:28 crc kubenswrapper[4823]: I1216 07:13:28.133851 4823 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 07:13:30 crc kubenswrapper[4823]: I1216 07:13:30.338385 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-f4fb5df64-ss7qt" event={"ID":"35cdac9f-df23-48a6-93e5-83ff4cca639e","Type":"ContainerStarted","Data":"45d03fb77d06576273c5c0aa0a81bcbc27fe63ad728d9765d52a94ee0c35409e"} Dec 16 07:13:30 crc kubenswrapper[4823]: I1216 07:13:30.338852 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-f4fb5df64-ss7qt" Dec 16 07:13:30 crc kubenswrapper[4823]: I1216 07:13:30.340511 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-855d9ccff4-slqx8" event={"ID":"080d4cb1-f1c1-4fe3-ab3b-2c2e621d5a59","Type":"ContainerStarted","Data":"db1afbef3349fe7171f20d8386924923d5a7eb9aae8eb6b13b147141123ae485"} Dec 16 07:13:30 crc kubenswrapper[4823]: I1216 07:13:30.362813 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-f4fb5df64-ss7qt" podStartSLOduration=2.385496053 podStartE2EDuration="9.362789269s" podCreationTimestamp="2025-12-16 07:13:21 +0000 UTC" firstStartedPulling="2025-12-16 07:13:22.691674928 +0000 UTC m=+1081.180241051" lastFinishedPulling="2025-12-16 07:13:29.668968154 +0000 UTC m=+1088.157534267" observedRunningTime="2025-12-16 07:13:30.355363686 +0000 UTC m=+1088.843929819" watchObservedRunningTime="2025-12-16 07:13:30.362789269 +0000 UTC m=+1088.851355432" Dec 16 07:13:34 crc kubenswrapper[4823]: I1216 07:13:34.356607 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-855d9ccff4-slqx8" podStartSLOduration=6.270839103 
podStartE2EDuration="12.356588948s" podCreationTimestamp="2025-12-16 07:13:22 +0000 UTC" firstStartedPulling="2025-12-16 07:13:23.59758589 +0000 UTC m=+1082.086152033" lastFinishedPulling="2025-12-16 07:13:29.683335755 +0000 UTC m=+1088.171901878" observedRunningTime="2025-12-16 07:13:30.375955573 +0000 UTC m=+1088.864521706" watchObservedRunningTime="2025-12-16 07:13:34.356588948 +0000 UTC m=+1092.845155071" Dec 16 07:13:34 crc kubenswrapper[4823]: I1216 07:13:34.360616 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-86cb77c54b-knsxz"] Dec 16 07:13:34 crc kubenswrapper[4823]: I1216 07:13:34.361607 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-86cb77c54b-knsxz" Dec 16 07:13:34 crc kubenswrapper[4823]: I1216 07:13:34.364870 4823 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-2xq7c" Dec 16 07:13:34 crc kubenswrapper[4823]: I1216 07:13:34.383627 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-86cb77c54b-knsxz"] Dec 16 07:13:34 crc kubenswrapper[4823]: I1216 07:13:34.475702 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rll5v\" (UniqueName: \"kubernetes.io/projected/33843cca-2433-4a8e-8835-46959d61e521-kube-api-access-rll5v\") pod \"cert-manager-86cb77c54b-knsxz\" (UID: \"33843cca-2433-4a8e-8835-46959d61e521\") " pod="cert-manager/cert-manager-86cb77c54b-knsxz" Dec 16 07:13:34 crc kubenswrapper[4823]: I1216 07:13:34.475766 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/33843cca-2433-4a8e-8835-46959d61e521-bound-sa-token\") pod \"cert-manager-86cb77c54b-knsxz\" (UID: \"33843cca-2433-4a8e-8835-46959d61e521\") " pod="cert-manager/cert-manager-86cb77c54b-knsxz" Dec 16 07:13:34 crc kubenswrapper[4823]: I1216 
07:13:34.576946 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rll5v\" (UniqueName: \"kubernetes.io/projected/33843cca-2433-4a8e-8835-46959d61e521-kube-api-access-rll5v\") pod \"cert-manager-86cb77c54b-knsxz\" (UID: \"33843cca-2433-4a8e-8835-46959d61e521\") " pod="cert-manager/cert-manager-86cb77c54b-knsxz" Dec 16 07:13:34 crc kubenswrapper[4823]: I1216 07:13:34.576994 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/33843cca-2433-4a8e-8835-46959d61e521-bound-sa-token\") pod \"cert-manager-86cb77c54b-knsxz\" (UID: \"33843cca-2433-4a8e-8835-46959d61e521\") " pod="cert-manager/cert-manager-86cb77c54b-knsxz" Dec 16 07:13:34 crc kubenswrapper[4823]: I1216 07:13:34.595188 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/33843cca-2433-4a8e-8835-46959d61e521-bound-sa-token\") pod \"cert-manager-86cb77c54b-knsxz\" (UID: \"33843cca-2433-4a8e-8835-46959d61e521\") " pod="cert-manager/cert-manager-86cb77c54b-knsxz" Dec 16 07:13:34 crc kubenswrapper[4823]: I1216 07:13:34.595553 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rll5v\" (UniqueName: \"kubernetes.io/projected/33843cca-2433-4a8e-8835-46959d61e521-kube-api-access-rll5v\") pod \"cert-manager-86cb77c54b-knsxz\" (UID: \"33843cca-2433-4a8e-8835-46959d61e521\") " pod="cert-manager/cert-manager-86cb77c54b-knsxz" Dec 16 07:13:34 crc kubenswrapper[4823]: I1216 07:13:34.678138 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-86cb77c54b-knsxz" Dec 16 07:13:34 crc kubenswrapper[4823]: I1216 07:13:34.868922 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-86cb77c54b-knsxz"] Dec 16 07:13:35 crc kubenswrapper[4823]: I1216 07:13:35.376715 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-86cb77c54b-knsxz" event={"ID":"33843cca-2433-4a8e-8835-46959d61e521","Type":"ContainerStarted","Data":"ddb1a1814e3db7490a5b78232e1d123e2e6c752b978e1cb10bff8c48b984fe14"} Dec 16 07:13:36 crc kubenswrapper[4823]: I1216 07:13:36.385769 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-86cb77c54b-knsxz" event={"ID":"33843cca-2433-4a8e-8835-46959d61e521","Type":"ContainerStarted","Data":"6a3dd112eacfb5a696f0587edda28d1f0f870acfc9ca6e9bfdd9bd67a1e3c87c"} Dec 16 07:13:36 crc kubenswrapper[4823]: I1216 07:13:36.402939 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-86cb77c54b-knsxz" podStartSLOduration=2.4029212810000002 podStartE2EDuration="2.402921281s" podCreationTimestamp="2025-12-16 07:13:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 07:13:36.399473212 +0000 UTC m=+1094.888039345" watchObservedRunningTime="2025-12-16 07:13:36.402921281 +0000 UTC m=+1094.891487414" Dec 16 07:13:37 crc kubenswrapper[4823]: I1216 07:13:37.230684 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-f4fb5df64-ss7qt" Dec 16 07:13:46 crc kubenswrapper[4823]: I1216 07:13:46.435472 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-5b7rt"] Dec 16 07:13:46 crc kubenswrapper[4823]: I1216 07:13:46.436735 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-5b7rt" Dec 16 07:13:46 crc kubenswrapper[4823]: I1216 07:13:46.438533 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Dec 16 07:13:46 crc kubenswrapper[4823]: I1216 07:13:46.439113 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Dec 16 07:13:46 crc kubenswrapper[4823]: I1216 07:13:46.439344 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-gcl8x" Dec 16 07:13:46 crc kubenswrapper[4823]: I1216 07:13:46.444592 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-5b7rt"] Dec 16 07:13:46 crc kubenswrapper[4823]: I1216 07:13:46.532634 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bcpkz\" (UniqueName: \"kubernetes.io/projected/bae4e226-369c-445b-96d3-267079b07732-kube-api-access-bcpkz\") pod \"openstack-operator-index-5b7rt\" (UID: \"bae4e226-369c-445b-96d3-267079b07732\") " pod="openstack-operators/openstack-operator-index-5b7rt" Dec 16 07:13:46 crc kubenswrapper[4823]: I1216 07:13:46.634421 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bcpkz\" (UniqueName: \"kubernetes.io/projected/bae4e226-369c-445b-96d3-267079b07732-kube-api-access-bcpkz\") pod \"openstack-operator-index-5b7rt\" (UID: \"bae4e226-369c-445b-96d3-267079b07732\") " pod="openstack-operators/openstack-operator-index-5b7rt" Dec 16 07:13:46 crc kubenswrapper[4823]: I1216 07:13:46.653941 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bcpkz\" (UniqueName: \"kubernetes.io/projected/bae4e226-369c-445b-96d3-267079b07732-kube-api-access-bcpkz\") pod \"openstack-operator-index-5b7rt\" (UID: 
\"bae4e226-369c-445b-96d3-267079b07732\") " pod="openstack-operators/openstack-operator-index-5b7rt" Dec 16 07:13:46 crc kubenswrapper[4823]: I1216 07:13:46.756099 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-5b7rt" Dec 16 07:13:47 crc kubenswrapper[4823]: I1216 07:13:47.171719 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-5b7rt"] Dec 16 07:13:47 crc kubenswrapper[4823]: I1216 07:13:47.474628 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-5b7rt" event={"ID":"bae4e226-369c-445b-96d3-267079b07732","Type":"ContainerStarted","Data":"a0feb84201d2721430cbf52276bc95566b5bac0a4e85541213a394452e7ee0c0"} Dec 16 07:13:48 crc kubenswrapper[4823]: I1216 07:13:48.482632 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-5b7rt" event={"ID":"bae4e226-369c-445b-96d3-267079b07732","Type":"ContainerStarted","Data":"3f366365171a2feae6ec0385dc4b1327fb6fe20efd16cf1e29fec3ec712526aa"} Dec 16 07:13:48 crc kubenswrapper[4823]: I1216 07:13:48.501716 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-5b7rt" podStartSLOduration=1.577751375 podStartE2EDuration="2.501695661s" podCreationTimestamp="2025-12-16 07:13:46 +0000 UTC" firstStartedPulling="2025-12-16 07:13:47.175893001 +0000 UTC m=+1105.664459124" lastFinishedPulling="2025-12-16 07:13:48.099837287 +0000 UTC m=+1106.588403410" observedRunningTime="2025-12-16 07:13:48.498266313 +0000 UTC m=+1106.986832456" watchObservedRunningTime="2025-12-16 07:13:48.501695661 +0000 UTC m=+1106.990261784" Dec 16 07:13:56 crc kubenswrapper[4823]: I1216 07:13:56.757361 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-5b7rt" Dec 16 07:13:56 crc kubenswrapper[4823]: 
I1216 07:13:56.757842 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-5b7rt" Dec 16 07:13:56 crc kubenswrapper[4823]: I1216 07:13:56.786662 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-5b7rt" Dec 16 07:13:57 crc kubenswrapper[4823]: I1216 07:13:57.576697 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-5b7rt" Dec 16 07:13:58 crc kubenswrapper[4823]: I1216 07:13:58.134182 4823 patch_prober.go:28] interesting pod/machine-config-daemon-fv56f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 07:13:58 crc kubenswrapper[4823]: I1216 07:13:58.134270 4823 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 07:14:05 crc kubenswrapper[4823]: I1216 07:14:05.121685 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/d0d3c6809aa44224aab298086e27935b9dda9ef79c520e38a6c6b641af9r57m"] Dec 16 07:14:05 crc kubenswrapper[4823]: I1216 07:14:05.122998 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/d0d3c6809aa44224aab298086e27935b9dda9ef79c520e38a6c6b641af9r57m" Dec 16 07:14:05 crc kubenswrapper[4823]: I1216 07:14:05.124941 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-k6cjr" Dec 16 07:14:05 crc kubenswrapper[4823]: I1216 07:14:05.135901 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/d0d3c6809aa44224aab298086e27935b9dda9ef79c520e38a6c6b641af9r57m"] Dec 16 07:14:05 crc kubenswrapper[4823]: I1216 07:14:05.225361 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/40af155f-2129-4d23-ab43-96419168bfc8-bundle\") pod \"d0d3c6809aa44224aab298086e27935b9dda9ef79c520e38a6c6b641af9r57m\" (UID: \"40af155f-2129-4d23-ab43-96419168bfc8\") " pod="openstack-operators/d0d3c6809aa44224aab298086e27935b9dda9ef79c520e38a6c6b641af9r57m" Dec 16 07:14:05 crc kubenswrapper[4823]: I1216 07:14:05.225466 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/40af155f-2129-4d23-ab43-96419168bfc8-util\") pod \"d0d3c6809aa44224aab298086e27935b9dda9ef79c520e38a6c6b641af9r57m\" (UID: \"40af155f-2129-4d23-ab43-96419168bfc8\") " pod="openstack-operators/d0d3c6809aa44224aab298086e27935b9dda9ef79c520e38a6c6b641af9r57m" Dec 16 07:14:05 crc kubenswrapper[4823]: I1216 07:14:05.225519 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49mf5\" (UniqueName: \"kubernetes.io/projected/40af155f-2129-4d23-ab43-96419168bfc8-kube-api-access-49mf5\") pod \"d0d3c6809aa44224aab298086e27935b9dda9ef79c520e38a6c6b641af9r57m\" (UID: \"40af155f-2129-4d23-ab43-96419168bfc8\") " pod="openstack-operators/d0d3c6809aa44224aab298086e27935b9dda9ef79c520e38a6c6b641af9r57m" Dec 16 07:14:05 crc kubenswrapper[4823]: I1216 
07:14:05.326597 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/40af155f-2129-4d23-ab43-96419168bfc8-bundle\") pod \"d0d3c6809aa44224aab298086e27935b9dda9ef79c520e38a6c6b641af9r57m\" (UID: \"40af155f-2129-4d23-ab43-96419168bfc8\") " pod="openstack-operators/d0d3c6809aa44224aab298086e27935b9dda9ef79c520e38a6c6b641af9r57m" Dec 16 07:14:05 crc kubenswrapper[4823]: I1216 07:14:05.326715 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/40af155f-2129-4d23-ab43-96419168bfc8-util\") pod \"d0d3c6809aa44224aab298086e27935b9dda9ef79c520e38a6c6b641af9r57m\" (UID: \"40af155f-2129-4d23-ab43-96419168bfc8\") " pod="openstack-operators/d0d3c6809aa44224aab298086e27935b9dda9ef79c520e38a6c6b641af9r57m" Dec 16 07:14:05 crc kubenswrapper[4823]: I1216 07:14:05.326770 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49mf5\" (UniqueName: \"kubernetes.io/projected/40af155f-2129-4d23-ab43-96419168bfc8-kube-api-access-49mf5\") pod \"d0d3c6809aa44224aab298086e27935b9dda9ef79c520e38a6c6b641af9r57m\" (UID: \"40af155f-2129-4d23-ab43-96419168bfc8\") " pod="openstack-operators/d0d3c6809aa44224aab298086e27935b9dda9ef79c520e38a6c6b641af9r57m" Dec 16 07:14:05 crc kubenswrapper[4823]: I1216 07:14:05.327399 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/40af155f-2129-4d23-ab43-96419168bfc8-util\") pod \"d0d3c6809aa44224aab298086e27935b9dda9ef79c520e38a6c6b641af9r57m\" (UID: \"40af155f-2129-4d23-ab43-96419168bfc8\") " pod="openstack-operators/d0d3c6809aa44224aab298086e27935b9dda9ef79c520e38a6c6b641af9r57m" Dec 16 07:14:05 crc kubenswrapper[4823]: I1216 07:14:05.327618 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/40af155f-2129-4d23-ab43-96419168bfc8-bundle\") pod \"d0d3c6809aa44224aab298086e27935b9dda9ef79c520e38a6c6b641af9r57m\" (UID: \"40af155f-2129-4d23-ab43-96419168bfc8\") " pod="openstack-operators/d0d3c6809aa44224aab298086e27935b9dda9ef79c520e38a6c6b641af9r57m" Dec 16 07:14:05 crc kubenswrapper[4823]: I1216 07:14:05.357992 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49mf5\" (UniqueName: \"kubernetes.io/projected/40af155f-2129-4d23-ab43-96419168bfc8-kube-api-access-49mf5\") pod \"d0d3c6809aa44224aab298086e27935b9dda9ef79c520e38a6c6b641af9r57m\" (UID: \"40af155f-2129-4d23-ab43-96419168bfc8\") " pod="openstack-operators/d0d3c6809aa44224aab298086e27935b9dda9ef79c520e38a6c6b641af9r57m" Dec 16 07:14:05 crc kubenswrapper[4823]: I1216 07:14:05.438529 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/d0d3c6809aa44224aab298086e27935b9dda9ef79c520e38a6c6b641af9r57m" Dec 16 07:14:05 crc kubenswrapper[4823]: I1216 07:14:05.626863 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/d0d3c6809aa44224aab298086e27935b9dda9ef79c520e38a6c6b641af9r57m"] Dec 16 07:14:06 crc kubenswrapper[4823]: I1216 07:14:06.605512 4823 generic.go:334] "Generic (PLEG): container finished" podID="40af155f-2129-4d23-ab43-96419168bfc8" containerID="daba17b34dc843490771c5568e6e4c2f9dd69afb13b061a02cc48130f946606e" exitCode=0 Dec 16 07:14:06 crc kubenswrapper[4823]: I1216 07:14:06.605666 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d0d3c6809aa44224aab298086e27935b9dda9ef79c520e38a6c6b641af9r57m" event={"ID":"40af155f-2129-4d23-ab43-96419168bfc8","Type":"ContainerDied","Data":"daba17b34dc843490771c5568e6e4c2f9dd69afb13b061a02cc48130f946606e"} Dec 16 07:14:06 crc kubenswrapper[4823]: I1216 07:14:06.605913 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/d0d3c6809aa44224aab298086e27935b9dda9ef79c520e38a6c6b641af9r57m" event={"ID":"40af155f-2129-4d23-ab43-96419168bfc8","Type":"ContainerStarted","Data":"e2439dc9cffa356c0abcebf4734c975989a691cb42b05fcb07845b23a8666566"} Dec 16 07:14:08 crc kubenswrapper[4823]: I1216 07:14:08.624468 4823 generic.go:334] "Generic (PLEG): container finished" podID="40af155f-2129-4d23-ab43-96419168bfc8" containerID="a55819043643f0a85bc9f4ff60fd814d0be71cee61f8c5461ea7cca050397e67" exitCode=0 Dec 16 07:14:08 crc kubenswrapper[4823]: I1216 07:14:08.624607 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d0d3c6809aa44224aab298086e27935b9dda9ef79c520e38a6c6b641af9r57m" event={"ID":"40af155f-2129-4d23-ab43-96419168bfc8","Type":"ContainerDied","Data":"a55819043643f0a85bc9f4ff60fd814d0be71cee61f8c5461ea7cca050397e67"} Dec 16 07:14:09 crc kubenswrapper[4823]: I1216 07:14:09.636175 4823 generic.go:334] "Generic (PLEG): container finished" podID="40af155f-2129-4d23-ab43-96419168bfc8" containerID="12ca5d345fec1f179dffde21229de42b2680cbcc2558af178345e55da8b8aa90" exitCode=0 Dec 16 07:14:09 crc kubenswrapper[4823]: I1216 07:14:09.636234 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d0d3c6809aa44224aab298086e27935b9dda9ef79c520e38a6c6b641af9r57m" event={"ID":"40af155f-2129-4d23-ab43-96419168bfc8","Type":"ContainerDied","Data":"12ca5d345fec1f179dffde21229de42b2680cbcc2558af178345e55da8b8aa90"} Dec 16 07:14:10 crc kubenswrapper[4823]: I1216 07:14:10.920762 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/d0d3c6809aa44224aab298086e27935b9dda9ef79c520e38a6c6b641af9r57m" Dec 16 07:14:11 crc kubenswrapper[4823]: I1216 07:14:11.113796 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-49mf5\" (UniqueName: \"kubernetes.io/projected/40af155f-2129-4d23-ab43-96419168bfc8-kube-api-access-49mf5\") pod \"40af155f-2129-4d23-ab43-96419168bfc8\" (UID: \"40af155f-2129-4d23-ab43-96419168bfc8\") " Dec 16 07:14:11 crc kubenswrapper[4823]: I1216 07:14:11.113859 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/40af155f-2129-4d23-ab43-96419168bfc8-bundle\") pod \"40af155f-2129-4d23-ab43-96419168bfc8\" (UID: \"40af155f-2129-4d23-ab43-96419168bfc8\") " Dec 16 07:14:11 crc kubenswrapper[4823]: I1216 07:14:11.114006 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/40af155f-2129-4d23-ab43-96419168bfc8-util\") pod \"40af155f-2129-4d23-ab43-96419168bfc8\" (UID: \"40af155f-2129-4d23-ab43-96419168bfc8\") " Dec 16 07:14:11 crc kubenswrapper[4823]: I1216 07:14:11.114697 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40af155f-2129-4d23-ab43-96419168bfc8-bundle" (OuterVolumeSpecName: "bundle") pod "40af155f-2129-4d23-ab43-96419168bfc8" (UID: "40af155f-2129-4d23-ab43-96419168bfc8"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:14:11 crc kubenswrapper[4823]: I1216 07:14:11.124529 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40af155f-2129-4d23-ab43-96419168bfc8-kube-api-access-49mf5" (OuterVolumeSpecName: "kube-api-access-49mf5") pod "40af155f-2129-4d23-ab43-96419168bfc8" (UID: "40af155f-2129-4d23-ab43-96419168bfc8"). InnerVolumeSpecName "kube-api-access-49mf5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:14:11 crc kubenswrapper[4823]: I1216 07:14:11.141593 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40af155f-2129-4d23-ab43-96419168bfc8-util" (OuterVolumeSpecName: "util") pod "40af155f-2129-4d23-ab43-96419168bfc8" (UID: "40af155f-2129-4d23-ab43-96419168bfc8"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:14:11 crc kubenswrapper[4823]: I1216 07:14:11.215441 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-49mf5\" (UniqueName: \"kubernetes.io/projected/40af155f-2129-4d23-ab43-96419168bfc8-kube-api-access-49mf5\") on node \"crc\" DevicePath \"\"" Dec 16 07:14:11 crc kubenswrapper[4823]: I1216 07:14:11.215471 4823 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/40af155f-2129-4d23-ab43-96419168bfc8-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:14:11 crc kubenswrapper[4823]: I1216 07:14:11.215481 4823 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/40af155f-2129-4d23-ab43-96419168bfc8-util\") on node \"crc\" DevicePath \"\"" Dec 16 07:14:11 crc kubenswrapper[4823]: I1216 07:14:11.650367 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d0d3c6809aa44224aab298086e27935b9dda9ef79c520e38a6c6b641af9r57m" event={"ID":"40af155f-2129-4d23-ab43-96419168bfc8","Type":"ContainerDied","Data":"e2439dc9cffa356c0abcebf4734c975989a691cb42b05fcb07845b23a8666566"} Dec 16 07:14:11 crc kubenswrapper[4823]: I1216 07:14:11.650426 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e2439dc9cffa356c0abcebf4734c975989a691cb42b05fcb07845b23a8666566" Dec 16 07:14:11 crc kubenswrapper[4823]: I1216 07:14:11.650503 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/d0d3c6809aa44224aab298086e27935b9dda9ef79c520e38a6c6b641af9r57m" Dec 16 07:14:17 crc kubenswrapper[4823]: I1216 07:14:17.176956 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-69fc74c8bb-vqqz2"] Dec 16 07:14:17 crc kubenswrapper[4823]: E1216 07:14:17.177542 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40af155f-2129-4d23-ab43-96419168bfc8" containerName="pull" Dec 16 07:14:17 crc kubenswrapper[4823]: I1216 07:14:17.177559 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="40af155f-2129-4d23-ab43-96419168bfc8" containerName="pull" Dec 16 07:14:17 crc kubenswrapper[4823]: E1216 07:14:17.177582 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40af155f-2129-4d23-ab43-96419168bfc8" containerName="util" Dec 16 07:14:17 crc kubenswrapper[4823]: I1216 07:14:17.177592 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="40af155f-2129-4d23-ab43-96419168bfc8" containerName="util" Dec 16 07:14:17 crc kubenswrapper[4823]: E1216 07:14:17.177611 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40af155f-2129-4d23-ab43-96419168bfc8" containerName="extract" Dec 16 07:14:17 crc kubenswrapper[4823]: I1216 07:14:17.177620 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="40af155f-2129-4d23-ab43-96419168bfc8" containerName="extract" Dec 16 07:14:17 crc kubenswrapper[4823]: I1216 07:14:17.177757 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="40af155f-2129-4d23-ab43-96419168bfc8" containerName="extract" Dec 16 07:14:17 crc kubenswrapper[4823]: I1216 07:14:17.178306 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-69fc74c8bb-vqqz2" Dec 16 07:14:17 crc kubenswrapper[4823]: I1216 07:14:17.180770 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-2t6dz" Dec 16 07:14:17 crc kubenswrapper[4823]: I1216 07:14:17.223071 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-69fc74c8bb-vqqz2"] Dec 16 07:14:17 crc kubenswrapper[4823]: I1216 07:14:17.366510 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4p9v\" (UniqueName: \"kubernetes.io/projected/2f5d7edb-4be1-4e7b-bf43-3ceb6c77289d-kube-api-access-z4p9v\") pod \"openstack-operator-controller-operator-69fc74c8bb-vqqz2\" (UID: \"2f5d7edb-4be1-4e7b-bf43-3ceb6c77289d\") " pod="openstack-operators/openstack-operator-controller-operator-69fc74c8bb-vqqz2" Dec 16 07:14:17 crc kubenswrapper[4823]: I1216 07:14:17.467445 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4p9v\" (UniqueName: \"kubernetes.io/projected/2f5d7edb-4be1-4e7b-bf43-3ceb6c77289d-kube-api-access-z4p9v\") pod \"openstack-operator-controller-operator-69fc74c8bb-vqqz2\" (UID: \"2f5d7edb-4be1-4e7b-bf43-3ceb6c77289d\") " pod="openstack-operators/openstack-operator-controller-operator-69fc74c8bb-vqqz2" Dec 16 07:14:17 crc kubenswrapper[4823]: I1216 07:14:17.487128 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4p9v\" (UniqueName: \"kubernetes.io/projected/2f5d7edb-4be1-4e7b-bf43-3ceb6c77289d-kube-api-access-z4p9v\") pod \"openstack-operator-controller-operator-69fc74c8bb-vqqz2\" (UID: \"2f5d7edb-4be1-4e7b-bf43-3ceb6c77289d\") " pod="openstack-operators/openstack-operator-controller-operator-69fc74c8bb-vqqz2" Dec 16 07:14:17 crc kubenswrapper[4823]: I1216 07:14:17.500971 4823 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-69fc74c8bb-vqqz2" Dec 16 07:14:17 crc kubenswrapper[4823]: I1216 07:14:17.913427 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-69fc74c8bb-vqqz2"] Dec 16 07:14:18 crc kubenswrapper[4823]: I1216 07:14:18.700345 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-69fc74c8bb-vqqz2" event={"ID":"2f5d7edb-4be1-4e7b-bf43-3ceb6c77289d","Type":"ContainerStarted","Data":"493e29b503aca36915f2f9e8c318126a183e3a59fcd2bfdddf6927e5359fe042"} Dec 16 07:14:23 crc kubenswrapper[4823]: I1216 07:14:23.734552 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-69fc74c8bb-vqqz2" event={"ID":"2f5d7edb-4be1-4e7b-bf43-3ceb6c77289d","Type":"ContainerStarted","Data":"9f6be6c413d37bfdb5f403a46193b00c724a5b352f1a79d36941a50e2a62bcae"} Dec 16 07:14:23 crc kubenswrapper[4823]: I1216 07:14:23.735151 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-69fc74c8bb-vqqz2" Dec 16 07:14:23 crc kubenswrapper[4823]: I1216 07:14:23.796077 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-69fc74c8bb-vqqz2" podStartSLOduration=2.122640505 podStartE2EDuration="6.796059518s" podCreationTimestamp="2025-12-16 07:14:17 +0000 UTC" firstStartedPulling="2025-12-16 07:14:17.920439399 +0000 UTC m=+1136.409005522" lastFinishedPulling="2025-12-16 07:14:22.593858412 +0000 UTC m=+1141.082424535" observedRunningTime="2025-12-16 07:14:23.789584575 +0000 UTC m=+1142.278150708" watchObservedRunningTime="2025-12-16 07:14:23.796059518 +0000 UTC m=+1142.284625641" Dec 16 07:14:27 crc kubenswrapper[4823]: I1216 07:14:27.503275 
4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-69fc74c8bb-vqqz2" Dec 16 07:14:28 crc kubenswrapper[4823]: I1216 07:14:28.134760 4823 patch_prober.go:28] interesting pod/machine-config-daemon-fv56f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 07:14:28 crc kubenswrapper[4823]: I1216 07:14:28.134877 4823 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 07:14:28 crc kubenswrapper[4823]: I1216 07:14:28.134960 4823 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" Dec 16 07:14:28 crc kubenswrapper[4823]: I1216 07:14:28.136069 4823 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c07a7c4faebf9ec795cca9e8449add482643e386f41ece163e5f5944f0d37df3"} pod="openshift-machine-config-operator/machine-config-daemon-fv56f" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 16 07:14:28 crc kubenswrapper[4823]: I1216 07:14:28.136189 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" containerName="machine-config-daemon" containerID="cri-o://c07a7c4faebf9ec795cca9e8449add482643e386f41ece163e5f5944f0d37df3" gracePeriod=600 Dec 16 07:14:28 crc kubenswrapper[4823]: I1216 07:14:28.763791 
4823 generic.go:334] "Generic (PLEG): container finished" podID="25dec47c-3043-486c-b371-2be103c214e3" containerID="c07a7c4faebf9ec795cca9e8449add482643e386f41ece163e5f5944f0d37df3" exitCode=0 Dec 16 07:14:28 crc kubenswrapper[4823]: I1216 07:14:28.763873 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" event={"ID":"25dec47c-3043-486c-b371-2be103c214e3","Type":"ContainerDied","Data":"c07a7c4faebf9ec795cca9e8449add482643e386f41ece163e5f5944f0d37df3"} Dec 16 07:14:28 crc kubenswrapper[4823]: I1216 07:14:28.764124 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" event={"ID":"25dec47c-3043-486c-b371-2be103c214e3","Type":"ContainerStarted","Data":"76342a6438b46c6d8e5101ee8ceb1df808db353230663e448e28ebb26272e882"} Dec 16 07:14:28 crc kubenswrapper[4823]: I1216 07:14:28.764148 4823 scope.go:117] "RemoveContainer" containerID="48219d3c0e584aed9d175a58b4a139883d9d4f8a627e33b1552f22d85e485c5c" Dec 16 07:15:00 crc kubenswrapper[4823]: I1216 07:15:00.127434 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431155-4qg92"] Dec 16 07:15:00 crc kubenswrapper[4823]: I1216 07:15:00.129067 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431155-4qg92" Dec 16 07:15:00 crc kubenswrapper[4823]: I1216 07:15:00.131069 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 16 07:15:00 crc kubenswrapper[4823]: I1216 07:15:00.131581 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 16 07:15:00 crc kubenswrapper[4823]: I1216 07:15:00.149935 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431155-4qg92"] Dec 16 07:15:00 crc kubenswrapper[4823]: I1216 07:15:00.230049 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1bb3b1a1-ff5f-4119-8b6e-4ccc834a34d6-config-volume\") pod \"collect-profiles-29431155-4qg92\" (UID: \"1bb3b1a1-ff5f-4119-8b6e-4ccc834a34d6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431155-4qg92" Dec 16 07:15:00 crc kubenswrapper[4823]: I1216 07:15:00.230109 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1bb3b1a1-ff5f-4119-8b6e-4ccc834a34d6-secret-volume\") pod \"collect-profiles-29431155-4qg92\" (UID: \"1bb3b1a1-ff5f-4119-8b6e-4ccc834a34d6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431155-4qg92" Dec 16 07:15:00 crc kubenswrapper[4823]: I1216 07:15:00.230297 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t292p\" (UniqueName: \"kubernetes.io/projected/1bb3b1a1-ff5f-4119-8b6e-4ccc834a34d6-kube-api-access-t292p\") pod \"collect-profiles-29431155-4qg92\" (UID: \"1bb3b1a1-ff5f-4119-8b6e-4ccc834a34d6\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29431155-4qg92" Dec 16 07:15:00 crc kubenswrapper[4823]: I1216 07:15:00.330794 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t292p\" (UniqueName: \"kubernetes.io/projected/1bb3b1a1-ff5f-4119-8b6e-4ccc834a34d6-kube-api-access-t292p\") pod \"collect-profiles-29431155-4qg92\" (UID: \"1bb3b1a1-ff5f-4119-8b6e-4ccc834a34d6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431155-4qg92" Dec 16 07:15:00 crc kubenswrapper[4823]: I1216 07:15:00.330846 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1bb3b1a1-ff5f-4119-8b6e-4ccc834a34d6-config-volume\") pod \"collect-profiles-29431155-4qg92\" (UID: \"1bb3b1a1-ff5f-4119-8b6e-4ccc834a34d6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431155-4qg92" Dec 16 07:15:00 crc kubenswrapper[4823]: I1216 07:15:00.330863 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1bb3b1a1-ff5f-4119-8b6e-4ccc834a34d6-secret-volume\") pod \"collect-profiles-29431155-4qg92\" (UID: \"1bb3b1a1-ff5f-4119-8b6e-4ccc834a34d6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431155-4qg92" Dec 16 07:15:00 crc kubenswrapper[4823]: I1216 07:15:00.331754 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1bb3b1a1-ff5f-4119-8b6e-4ccc834a34d6-config-volume\") pod \"collect-profiles-29431155-4qg92\" (UID: \"1bb3b1a1-ff5f-4119-8b6e-4ccc834a34d6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431155-4qg92" Dec 16 07:15:00 crc kubenswrapper[4823]: I1216 07:15:00.336933 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/1bb3b1a1-ff5f-4119-8b6e-4ccc834a34d6-secret-volume\") pod \"collect-profiles-29431155-4qg92\" (UID: \"1bb3b1a1-ff5f-4119-8b6e-4ccc834a34d6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431155-4qg92" Dec 16 07:15:00 crc kubenswrapper[4823]: I1216 07:15:00.358005 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t292p\" (UniqueName: \"kubernetes.io/projected/1bb3b1a1-ff5f-4119-8b6e-4ccc834a34d6-kube-api-access-t292p\") pod \"collect-profiles-29431155-4qg92\" (UID: \"1bb3b1a1-ff5f-4119-8b6e-4ccc834a34d6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431155-4qg92" Dec 16 07:15:00 crc kubenswrapper[4823]: I1216 07:15:00.446422 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431155-4qg92" Dec 16 07:15:00 crc kubenswrapper[4823]: I1216 07:15:00.984425 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431155-4qg92"] Dec 16 07:15:01 crc kubenswrapper[4823]: I1216 07:15:01.032493 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431155-4qg92" event={"ID":"1bb3b1a1-ff5f-4119-8b6e-4ccc834a34d6","Type":"ContainerStarted","Data":"485f1234eedde1243278aa9c55ed0c4cca31c00aee8f234f5af7c308c21a09d9"} Dec 16 07:15:01 crc kubenswrapper[4823]: I1216 07:15:01.327457 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-5f98b4754f-l6tn9"] Dec 16 07:15:01 crc kubenswrapper[4823]: I1216 07:15:01.328836 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-5f98b4754f-l6tn9" Dec 16 07:15:01 crc kubenswrapper[4823]: I1216 07:15:01.331612 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-xjrnb" Dec 16 07:15:01 crc kubenswrapper[4823]: I1216 07:15:01.357920 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-95949466-l2h76"] Dec 16 07:15:01 crc kubenswrapper[4823]: I1216 07:15:01.358869 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-95949466-l2h76" Dec 16 07:15:01 crc kubenswrapper[4823]: I1216 07:15:01.362900 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-qjg4f" Dec 16 07:15:01 crc kubenswrapper[4823]: I1216 07:15:01.396221 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-95949466-l2h76"] Dec 16 07:15:01 crc kubenswrapper[4823]: I1216 07:15:01.406428 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-5f98b4754f-l6tn9"] Dec 16 07:15:01 crc kubenswrapper[4823]: I1216 07:15:01.419996 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-767f9d7567-tg8ww"] Dec 16 07:15:01 crc kubenswrapper[4823]: I1216 07:15:01.421062 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-767f9d7567-tg8ww" Dec 16 07:15:01 crc kubenswrapper[4823]: I1216 07:15:01.424439 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-gm8kl" Dec 16 07:15:01 crc kubenswrapper[4823]: I1216 07:15:01.446915 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6k2q\" (UniqueName: \"kubernetes.io/projected/e58b0dc3-aa85-4623-bc8b-d2e1dc73dca1-kube-api-access-q6k2q\") pod \"cinder-operator-controller-manager-5f98b4754f-l6tn9\" (UID: \"e58b0dc3-aa85-4623-bc8b-d2e1dc73dca1\") " pod="openstack-operators/cinder-operator-controller-manager-5f98b4754f-l6tn9" Dec 16 07:15:01 crc kubenswrapper[4823]: I1216 07:15:01.453523 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-66f8b87655-fvsjg"] Dec 16 07:15:01 crc kubenswrapper[4823]: I1216 07:15:01.454570 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-66f8b87655-fvsjg" Dec 16 07:15:01 crc kubenswrapper[4823]: I1216 07:15:01.456564 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-wft52" Dec 16 07:15:01 crc kubenswrapper[4823]: I1216 07:15:01.457916 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-767f9d7567-tg8ww"] Dec 16 07:15:01 crc kubenswrapper[4823]: I1216 07:15:01.469159 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-59b8dcb766-26qs6"] Dec 16 07:15:01 crc kubenswrapper[4823]: I1216 07:15:01.470007 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-59b8dcb766-26qs6" Dec 16 07:15:01 crc kubenswrapper[4823]: I1216 07:15:01.475058 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-66f8b87655-fvsjg"] Dec 16 07:15:01 crc kubenswrapper[4823]: I1216 07:15:01.488083 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-59b8dcb766-26qs6"] Dec 16 07:15:01 crc kubenswrapper[4823]: I1216 07:15:01.493263 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6ccf486b9-8mh84"] Dec 16 07:15:01 crc kubenswrapper[4823]: I1216 07:15:01.493964 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6ccf486b9-8mh84" Dec 16 07:15:01 crc kubenswrapper[4823]: I1216 07:15:01.497571 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-wvwrg" Dec 16 07:15:01 crc kubenswrapper[4823]: I1216 07:15:01.500057 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-wdqz2" Dec 16 07:15:01 crc kubenswrapper[4823]: I1216 07:15:01.502993 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-84b495f78-8rx8h"] Dec 16 07:15:01 crc kubenswrapper[4823]: I1216 07:15:01.504308 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-84b495f78-8rx8h" Dec 16 07:15:01 crc kubenswrapper[4823]: I1216 07:15:01.514863 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-9g47f" Dec 16 07:15:01 crc kubenswrapper[4823]: I1216 07:15:01.519641 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Dec 16 07:15:01 crc kubenswrapper[4823]: I1216 07:15:01.551736 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vn5nd\" (UniqueName: \"kubernetes.io/projected/62d59368-9ca6-4327-a979-c4c31903630c-kube-api-access-vn5nd\") pod \"glance-operator-controller-manager-767f9d7567-tg8ww\" (UID: \"62d59368-9ca6-4327-a979-c4c31903630c\") " pod="openstack-operators/glance-operator-controller-manager-767f9d7567-tg8ww" Dec 16 07:15:01 crc kubenswrapper[4823]: I1216 07:15:01.551796 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5b9cd\" (UniqueName: \"kubernetes.io/projected/ae33bf2e-0415-4ba8-9508-f7c36182aec8-kube-api-access-5b9cd\") pod \"heat-operator-controller-manager-59b8dcb766-26qs6\" (UID: \"ae33bf2e-0415-4ba8-9508-f7c36182aec8\") " pod="openstack-operators/heat-operator-controller-manager-59b8dcb766-26qs6" Dec 16 07:15:01 crc kubenswrapper[4823]: I1216 07:15:01.551836 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7f6072b1-7137-4564-9000-aa50b569ceac-cert\") pod \"infra-operator-controller-manager-84b495f78-8rx8h\" (UID: \"7f6072b1-7137-4564-9000-aa50b569ceac\") " pod="openstack-operators/infra-operator-controller-manager-84b495f78-8rx8h" Dec 16 07:15:01 crc kubenswrapper[4823]: I1216 07:15:01.551875 4823 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-q6k2q\" (UniqueName: \"kubernetes.io/projected/e58b0dc3-aa85-4623-bc8b-d2e1dc73dca1-kube-api-access-q6k2q\") pod \"cinder-operator-controller-manager-5f98b4754f-l6tn9\" (UID: \"e58b0dc3-aa85-4623-bc8b-d2e1dc73dca1\") " pod="openstack-operators/cinder-operator-controller-manager-5f98b4754f-l6tn9" Dec 16 07:15:01 crc kubenswrapper[4823]: I1216 07:15:01.551918 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmqnt\" (UniqueName: \"kubernetes.io/projected/62b57d47-be40-449a-8503-b86187f19914-kube-api-access-xmqnt\") pod \"horizon-operator-controller-manager-6ccf486b9-8mh84\" (UID: \"62b57d47-be40-449a-8503-b86187f19914\") " pod="openstack-operators/horizon-operator-controller-manager-6ccf486b9-8mh84" Dec 16 07:15:01 crc kubenswrapper[4823]: I1216 07:15:01.551944 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88l6p\" (UniqueName: \"kubernetes.io/projected/37b11baa-1136-4fea-869d-e3d8f98bca83-kube-api-access-88l6p\") pod \"designate-operator-controller-manager-66f8b87655-fvsjg\" (UID: \"37b11baa-1136-4fea-869d-e3d8f98bca83\") " pod="openstack-operators/designate-operator-controller-manager-66f8b87655-fvsjg" Dec 16 07:15:01 crc kubenswrapper[4823]: I1216 07:15:01.551970 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwsd4\" (UniqueName: \"kubernetes.io/projected/e5260194-8fc8-4615-bfd5-98210220f074-kube-api-access-fwsd4\") pod \"barbican-operator-controller-manager-95949466-l2h76\" (UID: \"e5260194-8fc8-4615-bfd5-98210220f074\") " pod="openstack-operators/barbican-operator-controller-manager-95949466-l2h76" Dec 16 07:15:01 crc kubenswrapper[4823]: I1216 07:15:01.551997 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flwht\" 
(UniqueName: \"kubernetes.io/projected/7f6072b1-7137-4564-9000-aa50b569ceac-kube-api-access-flwht\") pod \"infra-operator-controller-manager-84b495f78-8rx8h\" (UID: \"7f6072b1-7137-4564-9000-aa50b569ceac\") " pod="openstack-operators/infra-operator-controller-manager-84b495f78-8rx8h" Dec 16 07:15:01 crc kubenswrapper[4823]: I1216 07:15:01.560190 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6ccf486b9-8mh84"] Dec 16 07:15:01 crc kubenswrapper[4823]: I1216 07:15:01.563088 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-f458558d7-629tj"] Dec 16 07:15:01 crc kubenswrapper[4823]: I1216 07:15:01.564126 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-f458558d7-629tj" Dec 16 07:15:01 crc kubenswrapper[4823]: I1216 07:15:01.566688 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-sqq62" Dec 16 07:15:01 crc kubenswrapper[4823]: I1216 07:15:01.576133 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-84b495f78-8rx8h"] Dec 16 07:15:01 crc kubenswrapper[4823]: I1216 07:15:01.605092 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-5c7cbf548f-ffflh"] Dec 16 07:15:01 crc kubenswrapper[4823]: I1216 07:15:01.605942 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-5c7cbf548f-ffflh" Dec 16 07:15:01 crc kubenswrapper[4823]: I1216 07:15:01.613040 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-jrwvs" Dec 16 07:15:01 crc kubenswrapper[4823]: I1216 07:15:01.614048 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6k2q\" (UniqueName: \"kubernetes.io/projected/e58b0dc3-aa85-4623-bc8b-d2e1dc73dca1-kube-api-access-q6k2q\") pod \"cinder-operator-controller-manager-5f98b4754f-l6tn9\" (UID: \"e58b0dc3-aa85-4623-bc8b-d2e1dc73dca1\") " pod="openstack-operators/cinder-operator-controller-manager-5f98b4754f-l6tn9" Dec 16 07:15:01 crc kubenswrapper[4823]: I1216 07:15:01.618402 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-f458558d7-629tj"] Dec 16 07:15:01 crc kubenswrapper[4823]: I1216 07:15:01.644099 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-5c7cbf548f-ffflh"] Dec 16 07:15:01 crc kubenswrapper[4823]: I1216 07:15:01.652204 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-f76f4954c-btpw8"] Dec 16 07:15:01 crc kubenswrapper[4823]: I1216 07:15:01.653338 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-f76f4954c-btpw8" Dec 16 07:15:01 crc kubenswrapper[4823]: I1216 07:15:01.655411 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88l6p\" (UniqueName: \"kubernetes.io/projected/37b11baa-1136-4fea-869d-e3d8f98bca83-kube-api-access-88l6p\") pod \"designate-operator-controller-manager-66f8b87655-fvsjg\" (UID: \"37b11baa-1136-4fea-869d-e3d8f98bca83\") " pod="openstack-operators/designate-operator-controller-manager-66f8b87655-fvsjg" Dec 16 07:15:01 crc kubenswrapper[4823]: I1216 07:15:01.655455 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwsd4\" (UniqueName: \"kubernetes.io/projected/e5260194-8fc8-4615-bfd5-98210220f074-kube-api-access-fwsd4\") pod \"barbican-operator-controller-manager-95949466-l2h76\" (UID: \"e5260194-8fc8-4615-bfd5-98210220f074\") " pod="openstack-operators/barbican-operator-controller-manager-95949466-l2h76" Dec 16 07:15:01 crc kubenswrapper[4823]: I1216 07:15:01.655608 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flwht\" (UniqueName: \"kubernetes.io/projected/7f6072b1-7137-4564-9000-aa50b569ceac-kube-api-access-flwht\") pod \"infra-operator-controller-manager-84b495f78-8rx8h\" (UID: \"7f6072b1-7137-4564-9000-aa50b569ceac\") " pod="openstack-operators/infra-operator-controller-manager-84b495f78-8rx8h" Dec 16 07:15:01 crc kubenswrapper[4823]: I1216 07:15:01.655653 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czrvz\" (UniqueName: \"kubernetes.io/projected/862263d5-cd38-4867-a8ce-6a82d3170f48-kube-api-access-czrvz\") pod \"keystone-operator-controller-manager-5c7cbf548f-ffflh\" (UID: \"862263d5-cd38-4867-a8ce-6a82d3170f48\") " pod="openstack-operators/keystone-operator-controller-manager-5c7cbf548f-ffflh" Dec 16 07:15:01 crc 
kubenswrapper[4823]: I1216 07:15:01.655746 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vn5nd\" (UniqueName: \"kubernetes.io/projected/62d59368-9ca6-4327-a979-c4c31903630c-kube-api-access-vn5nd\") pod \"glance-operator-controller-manager-767f9d7567-tg8ww\" (UID: \"62d59368-9ca6-4327-a979-c4c31903630c\") " pod="openstack-operators/glance-operator-controller-manager-767f9d7567-tg8ww" Dec 16 07:15:01 crc kubenswrapper[4823]: I1216 07:15:01.655789 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5b9cd\" (UniqueName: \"kubernetes.io/projected/ae33bf2e-0415-4ba8-9508-f7c36182aec8-kube-api-access-5b9cd\") pod \"heat-operator-controller-manager-59b8dcb766-26qs6\" (UID: \"ae33bf2e-0415-4ba8-9508-f7c36182aec8\") " pod="openstack-operators/heat-operator-controller-manager-59b8dcb766-26qs6" Dec 16 07:15:01 crc kubenswrapper[4823]: I1216 07:15:01.655838 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7f6072b1-7137-4564-9000-aa50b569ceac-cert\") pod \"infra-operator-controller-manager-84b495f78-8rx8h\" (UID: \"7f6072b1-7137-4564-9000-aa50b569ceac\") " pod="openstack-operators/infra-operator-controller-manager-84b495f78-8rx8h" Dec 16 07:15:01 crc kubenswrapper[4823]: I1216 07:15:01.655883 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4h92\" (UniqueName: \"kubernetes.io/projected/8a46fccc-0870-4aed-96db-064958d3f0c3-kube-api-access-v4h92\") pod \"ironic-operator-controller-manager-f458558d7-629tj\" (UID: \"8a46fccc-0870-4aed-96db-064958d3f0c3\") " pod="openstack-operators/ironic-operator-controller-manager-f458558d7-629tj" Dec 16 07:15:01 crc kubenswrapper[4823]: I1216 07:15:01.655926 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmqnt\" (UniqueName: 
\"kubernetes.io/projected/62b57d47-be40-449a-8503-b86187f19914-kube-api-access-xmqnt\") pod \"horizon-operator-controller-manager-6ccf486b9-8mh84\" (UID: \"62b57d47-be40-449a-8503-b86187f19914\") " pod="openstack-operators/horizon-operator-controller-manager-6ccf486b9-8mh84" Dec 16 07:15:01 crc kubenswrapper[4823]: I1216 07:15:01.659182 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-tz65z" Dec 16 07:15:01 crc kubenswrapper[4823]: E1216 07:15:01.662163 4823 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 16 07:15:01 crc kubenswrapper[4823]: E1216 07:15:01.662225 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7f6072b1-7137-4564-9000-aa50b569ceac-cert podName:7f6072b1-7137-4564-9000-aa50b569ceac nodeName:}" failed. No retries permitted until 2025-12-16 07:15:02.162203682 +0000 UTC m=+1180.650769805 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7f6072b1-7137-4564-9000-aa50b569ceac-cert") pod "infra-operator-controller-manager-84b495f78-8rx8h" (UID: "7f6072b1-7137-4564-9000-aa50b569ceac") : secret "infra-operator-webhook-server-cert" not found Dec 16 07:15:01 crc kubenswrapper[4823]: I1216 07:15:01.685536 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-f76f4954c-btpw8"] Dec 16 07:15:01 crc kubenswrapper[4823]: I1216 07:15:01.694820 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwsd4\" (UniqueName: \"kubernetes.io/projected/e5260194-8fc8-4615-bfd5-98210220f074-kube-api-access-fwsd4\") pod \"barbican-operator-controller-manager-95949466-l2h76\" (UID: \"e5260194-8fc8-4615-bfd5-98210220f074\") " pod="openstack-operators/barbican-operator-controller-manager-95949466-l2h76" Dec 16 07:15:01 crc kubenswrapper[4823]: I1216 07:15:01.704842 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-5fdd9786f7-57b67"] Dec 16 07:15:01 crc kubenswrapper[4823]: I1216 07:15:01.705797 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-5fdd9786f7-57b67" Dec 16 07:15:01 crc kubenswrapper[4823]: I1216 07:15:01.710755 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-7cd87b778f-z7kmz"] Dec 16 07:15:01 crc kubenswrapper[4823]: I1216 07:15:01.711628 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-7cd87b778f-z7kmz" Dec 16 07:15:01 crc kubenswrapper[4823]: I1216 07:15:01.713582 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-n87mr" Dec 16 07:15:01 crc kubenswrapper[4823]: I1216 07:15:01.713938 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-b2szq" Dec 16 07:15:01 crc kubenswrapper[4823]: I1216 07:15:01.721726 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-95949466-l2h76" Dec 16 07:15:01 crc kubenswrapper[4823]: I1216 07:15:01.721736 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-5f98b4754f-l6tn9" Dec 16 07:15:01 crc kubenswrapper[4823]: I1216 07:15:01.726568 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vn5nd\" (UniqueName: \"kubernetes.io/projected/62d59368-9ca6-4327-a979-c4c31903630c-kube-api-access-vn5nd\") pod \"glance-operator-controller-manager-767f9d7567-tg8ww\" (UID: \"62d59368-9ca6-4327-a979-c4c31903630c\") " pod="openstack-operators/glance-operator-controller-manager-767f9d7567-tg8ww" Dec 16 07:15:01 crc kubenswrapper[4823]: I1216 07:15:01.727246 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flwht\" (UniqueName: \"kubernetes.io/projected/7f6072b1-7137-4564-9000-aa50b569ceac-kube-api-access-flwht\") pod \"infra-operator-controller-manager-84b495f78-8rx8h\" (UID: \"7f6072b1-7137-4564-9000-aa50b569ceac\") " pod="openstack-operators/infra-operator-controller-manager-84b495f78-8rx8h" Dec 16 07:15:01 crc kubenswrapper[4823]: I1216 07:15:01.727929 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-88l6p\" (UniqueName: \"kubernetes.io/projected/37b11baa-1136-4fea-869d-e3d8f98bca83-kube-api-access-88l6p\") pod \"designate-operator-controller-manager-66f8b87655-fvsjg\" (UID: \"37b11baa-1136-4fea-869d-e3d8f98bca83\") " pod="openstack-operators/designate-operator-controller-manager-66f8b87655-fvsjg" Dec 16 07:15:01 crc kubenswrapper[4823]: I1216 07:15:01.731736 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5b9cd\" (UniqueName: \"kubernetes.io/projected/ae33bf2e-0415-4ba8-9508-f7c36182aec8-kube-api-access-5b9cd\") pod \"heat-operator-controller-manager-59b8dcb766-26qs6\" (UID: \"ae33bf2e-0415-4ba8-9508-f7c36182aec8\") " pod="openstack-operators/heat-operator-controller-manager-59b8dcb766-26qs6" Dec 16 07:15:01 crc kubenswrapper[4823]: I1216 07:15:01.753456 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-767f9d7567-tg8ww" Dec 16 07:15:01 crc kubenswrapper[4823]: I1216 07:15:01.762879 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxhhc\" (UniqueName: \"kubernetes.io/projected/e75f878b-9fb0-429b-8d6a-b30b98c1dba5-kube-api-access-pxhhc\") pod \"manila-operator-controller-manager-5fdd9786f7-57b67\" (UID: \"e75f878b-9fb0-429b-8d6a-b30b98c1dba5\") " pod="openstack-operators/manila-operator-controller-manager-5fdd9786f7-57b67" Dec 16 07:15:01 crc kubenswrapper[4823]: I1216 07:15:01.762931 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xfdz\" (UniqueName: \"kubernetes.io/projected/b3422264-49a2-4032-8906-b74358a9451d-kube-api-access-6xfdz\") pod \"neutron-operator-controller-manager-7cd87b778f-z7kmz\" (UID: \"b3422264-49a2-4032-8906-b74358a9451d\") " pod="openstack-operators/neutron-operator-controller-manager-7cd87b778f-z7kmz" Dec 16 07:15:01 crc kubenswrapper[4823]: I1216 
07:15:01.763043 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4h92\" (UniqueName: \"kubernetes.io/projected/8a46fccc-0870-4aed-96db-064958d3f0c3-kube-api-access-v4h92\") pod \"ironic-operator-controller-manager-f458558d7-629tj\" (UID: \"8a46fccc-0870-4aed-96db-064958d3f0c3\") " pod="openstack-operators/ironic-operator-controller-manager-f458558d7-629tj" Dec 16 07:15:01 crc kubenswrapper[4823]: I1216 07:15:01.763111 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jb59h\" (UniqueName: \"kubernetes.io/projected/9b601555-09ee-46de-8736-e28797436673-kube-api-access-jb59h\") pod \"mariadb-operator-controller-manager-f76f4954c-btpw8\" (UID: \"9b601555-09ee-46de-8736-e28797436673\") " pod="openstack-operators/mariadb-operator-controller-manager-f76f4954c-btpw8" Dec 16 07:15:01 crc kubenswrapper[4823]: I1216 07:15:01.763159 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czrvz\" (UniqueName: \"kubernetes.io/projected/862263d5-cd38-4867-a8ce-6a82d3170f48-kube-api-access-czrvz\") pod \"keystone-operator-controller-manager-5c7cbf548f-ffflh\" (UID: \"862263d5-cd38-4867-a8ce-6a82d3170f48\") " pod="openstack-operators/keystone-operator-controller-manager-5c7cbf548f-ffflh" Dec 16 07:15:01 crc kubenswrapper[4823]: I1216 07:15:01.763987 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-rkn7m"] Dec 16 07:15:01 crc kubenswrapper[4823]: I1216 07:15:01.772744 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-rkn7m" Dec 16 07:15:01 crc kubenswrapper[4823]: I1216 07:15:01.775524 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-kkzk2" Dec 16 07:15:01 crc kubenswrapper[4823]: I1216 07:15:01.784184 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-66f8b87655-fvsjg" Dec 16 07:15:01 crc kubenswrapper[4823]: I1216 07:15:01.796683 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmqnt\" (UniqueName: \"kubernetes.io/projected/62b57d47-be40-449a-8503-b86187f19914-kube-api-access-xmqnt\") pod \"horizon-operator-controller-manager-6ccf486b9-8mh84\" (UID: \"62b57d47-be40-449a-8503-b86187f19914\") " pod="openstack-operators/horizon-operator-controller-manager-6ccf486b9-8mh84" Dec 16 07:15:01 crc kubenswrapper[4823]: I1216 07:15:01.800482 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-59b8dcb766-26qs6" Dec 16 07:15:01 crc kubenswrapper[4823]: I1216 07:15:01.817912 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4h92\" (UniqueName: \"kubernetes.io/projected/8a46fccc-0870-4aed-96db-064958d3f0c3-kube-api-access-v4h92\") pod \"ironic-operator-controller-manager-f458558d7-629tj\" (UID: \"8a46fccc-0870-4aed-96db-064958d3f0c3\") " pod="openstack-operators/ironic-operator-controller-manager-f458558d7-629tj" Dec 16 07:15:01 crc kubenswrapper[4823]: I1216 07:15:01.825134 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-7cd87b778f-z7kmz"] Dec 16 07:15:01 crc kubenswrapper[4823]: I1216 07:15:01.825583 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6ccf486b9-8mh84" Dec 16 07:15:01 crc kubenswrapper[4823]: I1216 07:15:01.847360 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czrvz\" (UniqueName: \"kubernetes.io/projected/862263d5-cd38-4867-a8ce-6a82d3170f48-kube-api-access-czrvz\") pod \"keystone-operator-controller-manager-5c7cbf548f-ffflh\" (UID: \"862263d5-cd38-4867-a8ce-6a82d3170f48\") " pod="openstack-operators/keystone-operator-controller-manager-5c7cbf548f-ffflh" Dec 16 07:15:01 crc kubenswrapper[4823]: I1216 07:15:01.876087 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-5fdd9786f7-57b67"] Dec 16 07:15:01 crc kubenswrapper[4823]: I1216 07:15:01.890399 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jb59h\" (UniqueName: \"kubernetes.io/projected/9b601555-09ee-46de-8736-e28797436673-kube-api-access-jb59h\") pod \"mariadb-operator-controller-manager-f76f4954c-btpw8\" (UID: \"9b601555-09ee-46de-8736-e28797436673\") " pod="openstack-operators/mariadb-operator-controller-manager-f76f4954c-btpw8" Dec 16 07:15:01 crc kubenswrapper[4823]: I1216 07:15:01.890558 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcmtn\" (UniqueName: \"kubernetes.io/projected/201bb612-805a-4516-b18e-41382e5e4c42-kube-api-access-rcmtn\") pod \"nova-operator-controller-manager-5fbbf8b6cc-rkn7m\" (UID: \"201bb612-805a-4516-b18e-41382e5e4c42\") " pod="openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-rkn7m" Dec 16 07:15:01 crc kubenswrapper[4823]: I1216 07:15:01.890595 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pxhhc\" (UniqueName: \"kubernetes.io/projected/e75f878b-9fb0-429b-8d6a-b30b98c1dba5-kube-api-access-pxhhc\") pod 
\"manila-operator-controller-manager-5fdd9786f7-57b67\" (UID: \"e75f878b-9fb0-429b-8d6a-b30b98c1dba5\") " pod="openstack-operators/manila-operator-controller-manager-5fdd9786f7-57b67" Dec 16 07:15:01 crc kubenswrapper[4823]: I1216 07:15:01.890617 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xfdz\" (UniqueName: \"kubernetes.io/projected/b3422264-49a2-4032-8906-b74358a9451d-kube-api-access-6xfdz\") pod \"neutron-operator-controller-manager-7cd87b778f-z7kmz\" (UID: \"b3422264-49a2-4032-8906-b74358a9451d\") " pod="openstack-operators/neutron-operator-controller-manager-7cd87b778f-z7kmz" Dec 16 07:15:01 crc kubenswrapper[4823]: I1216 07:15:01.892619 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-68c649d9d-8n9sx"] Dec 16 07:15:01 crc kubenswrapper[4823]: I1216 07:15:01.893524 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-68c649d9d-8n9sx" Dec 16 07:15:01 crc kubenswrapper[4823]: I1216 07:15:01.908257 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-f458558d7-629tj" Dec 16 07:15:01 crc kubenswrapper[4823]: I1216 07:15:01.927981 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-rkn7m"] Dec 16 07:15:01 crc kubenswrapper[4823]: I1216 07:15:01.933976 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-68c649d9d-8n9sx"] Dec 16 07:15:01 crc kubenswrapper[4823]: I1216 07:15:01.949638 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-mt9jb" Dec 16 07:15:02 crc kubenswrapper[4823]: I1216 07:15:02.007260 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-5c7cbf548f-ffflh" Dec 16 07:15:02 crc kubenswrapper[4823]: I1216 07:15:02.013700 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8scmr\" (UniqueName: \"kubernetes.io/projected/9de24328-0da5-4d0a-a34c-5cd820b35a23-kube-api-access-8scmr\") pod \"octavia-operator-controller-manager-68c649d9d-8n9sx\" (UID: \"9de24328-0da5-4d0a-a34c-5cd820b35a23\") " pod="openstack-operators/octavia-operator-controller-manager-68c649d9d-8n9sx" Dec 16 07:15:02 crc kubenswrapper[4823]: I1216 07:15:02.059679 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rcmtn\" (UniqueName: \"kubernetes.io/projected/201bb612-805a-4516-b18e-41382e5e4c42-kube-api-access-rcmtn\") pod \"nova-operator-controller-manager-5fbbf8b6cc-rkn7m\" (UID: \"201bb612-805a-4516-b18e-41382e5e4c42\") " pod="openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-rkn7m" Dec 16 07:15:02 crc kubenswrapper[4823]: I1216 07:15:02.086794 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxhhc\" (UniqueName: \"kubernetes.io/projected/e75f878b-9fb0-429b-8d6a-b30b98c1dba5-kube-api-access-pxhhc\") pod \"manila-operator-controller-manager-5fdd9786f7-57b67\" (UID: \"e75f878b-9fb0-429b-8d6a-b30b98c1dba5\") " pod="openstack-operators/manila-operator-controller-manager-5fdd9786f7-57b67" Dec 16 07:15:02 crc kubenswrapper[4823]: I1216 07:15:02.088809 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xfdz\" (UniqueName: \"kubernetes.io/projected/b3422264-49a2-4032-8906-b74358a9451d-kube-api-access-6xfdz\") pod \"neutron-operator-controller-manager-7cd87b778f-z7kmz\" (UID: \"b3422264-49a2-4032-8906-b74358a9451d\") " pod="openstack-operators/neutron-operator-controller-manager-7cd87b778f-z7kmz" Dec 16 07:15:02 crc kubenswrapper[4823]: I1216 
07:15:02.124406 4823 generic.go:334] "Generic (PLEG): container finished" podID="1bb3b1a1-ff5f-4119-8b6e-4ccc834a34d6" containerID="d29286bc202b75173f2d2104bec333caae3792ccb51715cb7c891d3c31f531ec" exitCode=0 Dec 16 07:15:02 crc kubenswrapper[4823]: I1216 07:15:02.124694 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431155-4qg92" event={"ID":"1bb3b1a1-ff5f-4119-8b6e-4ccc834a34d6","Type":"ContainerDied","Data":"d29286bc202b75173f2d2104bec333caae3792ccb51715cb7c891d3c31f531ec"} Dec 16 07:15:02 crc kubenswrapper[4823]: I1216 07:15:02.125794 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jb59h\" (UniqueName: \"kubernetes.io/projected/9b601555-09ee-46de-8736-e28797436673-kube-api-access-jb59h\") pod \"mariadb-operator-controller-manager-f76f4954c-btpw8\" (UID: \"9b601555-09ee-46de-8736-e28797436673\") " pod="openstack-operators/mariadb-operator-controller-manager-f76f4954c-btpw8" Dec 16 07:15:02 crc kubenswrapper[4823]: I1216 07:15:02.126365 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-5fdd9786f7-57b67" Dec 16 07:15:02 crc kubenswrapper[4823]: I1216 07:15:02.129906 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-bf6d4f946-7pmnl"] Dec 16 07:15:02 crc kubenswrapper[4823]: I1216 07:15:02.131609 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-bf6d4f946-7pmnl" Dec 16 07:15:02 crc kubenswrapper[4823]: I1216 07:15:02.132066 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcmtn\" (UniqueName: \"kubernetes.io/projected/201bb612-805a-4516-b18e-41382e5e4c42-kube-api-access-rcmtn\") pod \"nova-operator-controller-manager-5fbbf8b6cc-rkn7m\" (UID: \"201bb612-805a-4516-b18e-41382e5e4c42\") " pod="openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-rkn7m" Dec 16 07:15:02 crc kubenswrapper[4823]: I1216 07:15:02.144208 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-c67p2" Dec 16 07:15:02 crc kubenswrapper[4823]: I1216 07:15:02.160494 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-7cd87b778f-z7kmz" Dec 16 07:15:02 crc kubenswrapper[4823]: I1216 07:15:02.162935 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8scmr\" (UniqueName: \"kubernetes.io/projected/9de24328-0da5-4d0a-a34c-5cd820b35a23-kube-api-access-8scmr\") pod \"octavia-operator-controller-manager-68c649d9d-8n9sx\" (UID: \"9de24328-0da5-4d0a-a34c-5cd820b35a23\") " pod="openstack-operators/octavia-operator-controller-manager-68c649d9d-8n9sx" Dec 16 07:15:02 crc kubenswrapper[4823]: I1216 07:15:02.162987 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7f6072b1-7137-4564-9000-aa50b569ceac-cert\") pod \"infra-operator-controller-manager-84b495f78-8rx8h\" (UID: \"7f6072b1-7137-4564-9000-aa50b569ceac\") " pod="openstack-operators/infra-operator-controller-manager-84b495f78-8rx8h" Dec 16 07:15:02 crc kubenswrapper[4823]: E1216 07:15:02.163377 4823 secret.go:188] Couldn't get secret 
openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 16 07:15:02 crc kubenswrapper[4823]: E1216 07:15:02.163415 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7f6072b1-7137-4564-9000-aa50b569ceac-cert podName:7f6072b1-7137-4564-9000-aa50b569ceac nodeName:}" failed. No retries permitted until 2025-12-16 07:15:03.16340158 +0000 UTC m=+1181.651967703 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7f6072b1-7137-4564-9000-aa50b569ceac-cert") pod "infra-operator-controller-manager-84b495f78-8rx8h" (UID: "7f6072b1-7137-4564-9000-aa50b569ceac") : secret "infra-operator-webhook-server-cert" not found Dec 16 07:15:02 crc kubenswrapper[4823]: I1216 07:15:02.163692 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-8665b56d78-47v54"] Dec 16 07:15:02 crc kubenswrapper[4823]: I1216 07:15:02.164930 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-8665b56d78-47v54" Dec 16 07:15:02 crc kubenswrapper[4823]: I1216 07:15:02.172762 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-66fff4bf6b4m4mt"] Dec 16 07:15:02 crc kubenswrapper[4823]: I1216 07:15:02.173737 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-66fff4bf6b4m4mt" Dec 16 07:15:02 crc kubenswrapper[4823]: I1216 07:15:02.177773 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-xkt7x" Dec 16 07:15:02 crc kubenswrapper[4823]: I1216 07:15:02.182721 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8scmr\" (UniqueName: \"kubernetes.io/projected/9de24328-0da5-4d0a-a34c-5cd820b35a23-kube-api-access-8scmr\") pod \"octavia-operator-controller-manager-68c649d9d-8n9sx\" (UID: \"9de24328-0da5-4d0a-a34c-5cd820b35a23\") " pod="openstack-operators/octavia-operator-controller-manager-68c649d9d-8n9sx" Dec 16 07:15:02 crc kubenswrapper[4823]: I1216 07:15:02.182967 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-bf6d4f946-7pmnl"] Dec 16 07:15:02 crc kubenswrapper[4823]: I1216 07:15:02.198637 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-5c6df8f9-f4294"] Dec 16 07:15:02 crc kubenswrapper[4823]: I1216 07:15:02.199544 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5c6df8f9-f4294" Dec 16 07:15:02 crc kubenswrapper[4823]: I1216 07:15:02.201508 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-66fff4bf6b4m4mt"] Dec 16 07:15:02 crc kubenswrapper[4823]: I1216 07:15:02.206183 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Dec 16 07:15:02 crc kubenswrapper[4823]: I1216 07:15:02.207431 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-mm989" Dec 16 07:15:02 crc kubenswrapper[4823]: I1216 07:15:02.207652 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-n4mp6" Dec 16 07:15:02 crc kubenswrapper[4823]: I1216 07:15:02.207774 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-8665b56d78-47v54"] Dec 16 07:15:02 crc kubenswrapper[4823]: I1216 07:15:02.212684 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5c6df8f9-f4294"] Dec 16 07:15:02 crc kubenswrapper[4823]: I1216 07:15:02.226371 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-756ccf86c7-dr886"] Dec 16 07:15:02 crc kubenswrapper[4823]: I1216 07:15:02.227191 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-rkn7m" Dec 16 07:15:02 crc kubenswrapper[4823]: I1216 07:15:02.227587 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-756ccf86c7-dr886" Dec 16 07:15:02 crc kubenswrapper[4823]: I1216 07:15:02.230918 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-97d456b9-wrzfd"] Dec 16 07:15:02 crc kubenswrapper[4823]: I1216 07:15:02.232181 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-77hvm" Dec 16 07:15:02 crc kubenswrapper[4823]: I1216 07:15:02.232624 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-97d456b9-wrzfd" Dec 16 07:15:02 crc kubenswrapper[4823]: I1216 07:15:02.236499 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-f82n6" Dec 16 07:15:02 crc kubenswrapper[4823]: I1216 07:15:02.241243 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-756ccf86c7-dr886"] Dec 16 07:15:02 crc kubenswrapper[4823]: I1216 07:15:02.244941 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-55f78b7c4c-sgp6f"] Dec 16 07:15:02 crc kubenswrapper[4823]: I1216 07:15:02.245992 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-55f78b7c4c-sgp6f" Dec 16 07:15:02 crc kubenswrapper[4823]: I1216 07:15:02.246816 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-68c649d9d-8n9sx" Dec 16 07:15:02 crc kubenswrapper[4823]: I1216 07:15:02.250681 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-l7ztn" Dec 16 07:15:02 crc kubenswrapper[4823]: I1216 07:15:02.253092 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-97d456b9-wrzfd"] Dec 16 07:15:02 crc kubenswrapper[4823]: I1216 07:15:02.256465 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-55f78b7c4c-sgp6f"] Dec 16 07:15:02 crc kubenswrapper[4823]: I1216 07:15:02.269245 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bpvn\" (UniqueName: \"kubernetes.io/projected/c69e81b5-99ea-4629-a61e-5d0e012bd472-kube-api-access-5bpvn\") pod \"openstack-baremetal-operator-controller-manager-66fff4bf6b4m4mt\" (UID: \"c69e81b5-99ea-4629-a61e-5d0e012bd472\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-66fff4bf6b4m4mt" Dec 16 07:15:02 crc kubenswrapper[4823]: I1216 07:15:02.269299 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c69e81b5-99ea-4629-a61e-5d0e012bd472-cert\") pod \"openstack-baremetal-operator-controller-manager-66fff4bf6b4m4mt\" (UID: \"c69e81b5-99ea-4629-a61e-5d0e012bd472\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-66fff4bf6b4m4mt" Dec 16 07:15:02 crc kubenswrapper[4823]: I1216 07:15:02.269326 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnl2l\" (UniqueName: \"kubernetes.io/projected/73f9c317-3748-4e9d-a683-1d7fab3949b5-kube-api-access-lnl2l\") pod 
\"ovn-operator-controller-manager-bf6d4f946-7pmnl\" (UID: \"73f9c317-3748-4e9d-a683-1d7fab3949b5\") " pod="openstack-operators/ovn-operator-controller-manager-bf6d4f946-7pmnl" Dec 16 07:15:02 crc kubenswrapper[4823]: I1216 07:15:02.269429 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gq8px\" (UniqueName: \"kubernetes.io/projected/f390c9c8-73bb-44e7-aa6f-4501691d8415-kube-api-access-gq8px\") pod \"placement-operator-controller-manager-8665b56d78-47v54\" (UID: \"f390c9c8-73bb-44e7-aa6f-4501691d8415\") " pod="openstack-operators/placement-operator-controller-manager-8665b56d78-47v54" Dec 16 07:15:02 crc kubenswrapper[4823]: I1216 07:15:02.365493 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-f76f4954c-btpw8" Dec 16 07:15:02 crc kubenswrapper[4823]: I1216 07:15:02.376787 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-txmdd"] Dec 16 07:15:02 crc kubenswrapper[4823]: I1216 07:15:02.380600 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-txmdd" Dec 16 07:15:02 crc kubenswrapper[4823]: I1216 07:15:02.386799 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jr5tk\" (UniqueName: \"kubernetes.io/projected/f93451fb-312c-448f-a868-43c05626d74a-kube-api-access-jr5tk\") pod \"telemetry-operator-controller-manager-97d456b9-wrzfd\" (UID: \"f93451fb-312c-448f-a868-43c05626d74a\") " pod="openstack-operators/telemetry-operator-controller-manager-97d456b9-wrzfd" Dec 16 07:15:02 crc kubenswrapper[4823]: I1216 07:15:02.386985 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmzl2\" (UniqueName: \"kubernetes.io/projected/3995bf1e-51ff-4543-ba13-cce941e6caab-kube-api-access-xmzl2\") pod \"swift-operator-controller-manager-5c6df8f9-f4294\" (UID: \"3995bf1e-51ff-4543-ba13-cce941e6caab\") " pod="openstack-operators/swift-operator-controller-manager-5c6df8f9-f4294" Dec 16 07:15:02 crc kubenswrapper[4823]: I1216 07:15:02.387034 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gq8px\" (UniqueName: \"kubernetes.io/projected/f390c9c8-73bb-44e7-aa6f-4501691d8415-kube-api-access-gq8px\") pod \"placement-operator-controller-manager-8665b56d78-47v54\" (UID: \"f390c9c8-73bb-44e7-aa6f-4501691d8415\") " pod="openstack-operators/placement-operator-controller-manager-8665b56d78-47v54" Dec 16 07:15:02 crc kubenswrapper[4823]: I1216 07:15:02.387101 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qb7cd\" (UniqueName: \"kubernetes.io/projected/ebfb35eb-0e15-454b-9f27-27c35373793b-kube-api-access-qb7cd\") pod \"watcher-operator-controller-manager-55f78b7c4c-sgp6f\" (UID: \"ebfb35eb-0e15-454b-9f27-27c35373793b\") " pod="openstack-operators/watcher-operator-controller-manager-55f78b7c4c-sgp6f" 
Dec 16 07:15:02 crc kubenswrapper[4823]: I1216 07:15:02.387141 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bpvn\" (UniqueName: \"kubernetes.io/projected/c69e81b5-99ea-4629-a61e-5d0e012bd472-kube-api-access-5bpvn\") pod \"openstack-baremetal-operator-controller-manager-66fff4bf6b4m4mt\" (UID: \"c69e81b5-99ea-4629-a61e-5d0e012bd472\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-66fff4bf6b4m4mt" Dec 16 07:15:02 crc kubenswrapper[4823]: I1216 07:15:02.387164 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gsfbr\" (UniqueName: \"kubernetes.io/projected/22746af9-8023-44ba-8377-e35e048923fe-kube-api-access-gsfbr\") pod \"test-operator-controller-manager-756ccf86c7-dr886\" (UID: \"22746af9-8023-44ba-8377-e35e048923fe\") " pod="openstack-operators/test-operator-controller-manager-756ccf86c7-dr886" Dec 16 07:15:02 crc kubenswrapper[4823]: I1216 07:15:02.387182 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c69e81b5-99ea-4629-a61e-5d0e012bd472-cert\") pod \"openstack-baremetal-operator-controller-manager-66fff4bf6b4m4mt\" (UID: \"c69e81b5-99ea-4629-a61e-5d0e012bd472\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-66fff4bf6b4m4mt" Dec 16 07:15:02 crc kubenswrapper[4823]: I1216 07:15:02.387200 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lnl2l\" (UniqueName: \"kubernetes.io/projected/73f9c317-3748-4e9d-a683-1d7fab3949b5-kube-api-access-lnl2l\") pod \"ovn-operator-controller-manager-bf6d4f946-7pmnl\" (UID: \"73f9c317-3748-4e9d-a683-1d7fab3949b5\") " pod="openstack-operators/ovn-operator-controller-manager-bf6d4f946-7pmnl" Dec 16 07:15:02 crc kubenswrapper[4823]: I1216 07:15:02.387368 4823 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/openstack-operator-controller-manager-678747d7fb-qbkkw"] Dec 16 07:15:02 crc kubenswrapper[4823]: E1216 07:15:02.387808 4823 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 16 07:15:02 crc kubenswrapper[4823]: E1216 07:15:02.387852 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c69e81b5-99ea-4629-a61e-5d0e012bd472-cert podName:c69e81b5-99ea-4629-a61e-5d0e012bd472 nodeName:}" failed. No retries permitted until 2025-12-16 07:15:02.887840054 +0000 UTC m=+1181.376406177 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c69e81b5-99ea-4629-a61e-5d0e012bd472-cert") pod "openstack-baremetal-operator-controller-manager-66fff4bf6b4m4mt" (UID: "c69e81b5-99ea-4629-a61e-5d0e012bd472") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 16 07:15:02 crc kubenswrapper[4823]: I1216 07:15:02.388180 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-z8mbd" Dec 16 07:15:02 crc kubenswrapper[4823]: I1216 07:15:02.388322 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-678747d7fb-qbkkw" Dec 16 07:15:02 crc kubenswrapper[4823]: I1216 07:15:02.395517 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-txmdd"] Dec 16 07:15:02 crc kubenswrapper[4823]: I1216 07:15:02.398507 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Dec 16 07:15:02 crc kubenswrapper[4823]: I1216 07:15:02.398684 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Dec 16 07:15:02 crc kubenswrapper[4823]: I1216 07:15:02.398859 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-7ptf6" Dec 16 07:15:02 crc kubenswrapper[4823]: I1216 07:15:02.430484 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnl2l\" (UniqueName: \"kubernetes.io/projected/73f9c317-3748-4e9d-a683-1d7fab3949b5-kube-api-access-lnl2l\") pod \"ovn-operator-controller-manager-bf6d4f946-7pmnl\" (UID: \"73f9c317-3748-4e9d-a683-1d7fab3949b5\") " pod="openstack-operators/ovn-operator-controller-manager-bf6d4f946-7pmnl" Dec 16 07:15:02 crc kubenswrapper[4823]: I1216 07:15:02.451427 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gq8px\" (UniqueName: \"kubernetes.io/projected/f390c9c8-73bb-44e7-aa6f-4501691d8415-kube-api-access-gq8px\") pod \"placement-operator-controller-manager-8665b56d78-47v54\" (UID: \"f390c9c8-73bb-44e7-aa6f-4501691d8415\") " pod="openstack-operators/placement-operator-controller-manager-8665b56d78-47v54" Dec 16 07:15:02 crc kubenswrapper[4823]: I1216 07:15:02.453222 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bpvn\" (UniqueName: 
\"kubernetes.io/projected/c69e81b5-99ea-4629-a61e-5d0e012bd472-kube-api-access-5bpvn\") pod \"openstack-baremetal-operator-controller-manager-66fff4bf6b4m4mt\" (UID: \"c69e81b5-99ea-4629-a61e-5d0e012bd472\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-66fff4bf6b4m4mt" Dec 16 07:15:02 crc kubenswrapper[4823]: I1216 07:15:02.473602 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-bf6d4f946-7pmnl" Dec 16 07:15:02 crc kubenswrapper[4823]: I1216 07:15:02.484867 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-678747d7fb-qbkkw"] Dec 16 07:15:02 crc kubenswrapper[4823]: I1216 07:15:02.488219 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qb7cd\" (UniqueName: \"kubernetes.io/projected/ebfb35eb-0e15-454b-9f27-27c35373793b-kube-api-access-qb7cd\") pod \"watcher-operator-controller-manager-55f78b7c4c-sgp6f\" (UID: \"ebfb35eb-0e15-454b-9f27-27c35373793b\") " pod="openstack-operators/watcher-operator-controller-manager-55f78b7c4c-sgp6f" Dec 16 07:15:02 crc kubenswrapper[4823]: I1216 07:15:02.488305 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gsfbr\" (UniqueName: \"kubernetes.io/projected/22746af9-8023-44ba-8377-e35e048923fe-kube-api-access-gsfbr\") pod \"test-operator-controller-manager-756ccf86c7-dr886\" (UID: \"22746af9-8023-44ba-8377-e35e048923fe\") " pod="openstack-operators/test-operator-controller-manager-756ccf86c7-dr886" Dec 16 07:15:02 crc kubenswrapper[4823]: I1216 07:15:02.488356 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jr5tk\" (UniqueName: \"kubernetes.io/projected/f93451fb-312c-448f-a868-43c05626d74a-kube-api-access-jr5tk\") pod \"telemetry-operator-controller-manager-97d456b9-wrzfd\" (UID: 
\"f93451fb-312c-448f-a868-43c05626d74a\") " pod="openstack-operators/telemetry-operator-controller-manager-97d456b9-wrzfd" Dec 16 07:15:02 crc kubenswrapper[4823]: I1216 07:15:02.488419 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gp42c\" (UniqueName: \"kubernetes.io/projected/8fbf843f-253e-46b0-944e-7e4055e7ecdb-kube-api-access-gp42c\") pod \"openstack-operator-controller-manager-678747d7fb-qbkkw\" (UID: \"8fbf843f-253e-46b0-944e-7e4055e7ecdb\") " pod="openstack-operators/openstack-operator-controller-manager-678747d7fb-qbkkw" Dec 16 07:15:02 crc kubenswrapper[4823]: I1216 07:15:02.488454 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8fbf843f-253e-46b0-944e-7e4055e7ecdb-metrics-certs\") pod \"openstack-operator-controller-manager-678747d7fb-qbkkw\" (UID: \"8fbf843f-253e-46b0-944e-7e4055e7ecdb\") " pod="openstack-operators/openstack-operator-controller-manager-678747d7fb-qbkkw" Dec 16 07:15:02 crc kubenswrapper[4823]: I1216 07:15:02.488481 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmzl2\" (UniqueName: \"kubernetes.io/projected/3995bf1e-51ff-4543-ba13-cce941e6caab-kube-api-access-xmzl2\") pod \"swift-operator-controller-manager-5c6df8f9-f4294\" (UID: \"3995bf1e-51ff-4543-ba13-cce941e6caab\") " pod="openstack-operators/swift-operator-controller-manager-5c6df8f9-f4294" Dec 16 07:15:02 crc kubenswrapper[4823]: I1216 07:15:02.488519 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/8fbf843f-253e-46b0-944e-7e4055e7ecdb-webhook-certs\") pod \"openstack-operator-controller-manager-678747d7fb-qbkkw\" (UID: \"8fbf843f-253e-46b0-944e-7e4055e7ecdb\") " pod="openstack-operators/openstack-operator-controller-manager-678747d7fb-qbkkw" Dec 16 
07:15:02 crc kubenswrapper[4823]: I1216 07:15:02.488567 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hss2f\" (UniqueName: \"kubernetes.io/projected/3378ca15-f3fb-410e-a3fe-96b21dfce8d8-kube-api-access-hss2f\") pod \"rabbitmq-cluster-operator-manager-668c99d594-txmdd\" (UID: \"3378ca15-f3fb-410e-a3fe-96b21dfce8d8\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-txmdd" Dec 16 07:15:02 crc kubenswrapper[4823]: I1216 07:15:02.507095 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-8665b56d78-47v54" Dec 16 07:15:02 crc kubenswrapper[4823]: I1216 07:15:02.515716 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmzl2\" (UniqueName: \"kubernetes.io/projected/3995bf1e-51ff-4543-ba13-cce941e6caab-kube-api-access-xmzl2\") pod \"swift-operator-controller-manager-5c6df8f9-f4294\" (UID: \"3995bf1e-51ff-4543-ba13-cce941e6caab\") " pod="openstack-operators/swift-operator-controller-manager-5c6df8f9-f4294" Dec 16 07:15:02 crc kubenswrapper[4823]: I1216 07:15:02.527387 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qb7cd\" (UniqueName: \"kubernetes.io/projected/ebfb35eb-0e15-454b-9f27-27c35373793b-kube-api-access-qb7cd\") pod \"watcher-operator-controller-manager-55f78b7c4c-sgp6f\" (UID: \"ebfb35eb-0e15-454b-9f27-27c35373793b\") " pod="openstack-operators/watcher-operator-controller-manager-55f78b7c4c-sgp6f" Dec 16 07:15:02 crc kubenswrapper[4823]: I1216 07:15:02.528147 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jr5tk\" (UniqueName: \"kubernetes.io/projected/f93451fb-312c-448f-a868-43c05626d74a-kube-api-access-jr5tk\") pod \"telemetry-operator-controller-manager-97d456b9-wrzfd\" (UID: \"f93451fb-312c-448f-a868-43c05626d74a\") " 
pod="openstack-operators/telemetry-operator-controller-manager-97d456b9-wrzfd" Dec 16 07:15:02 crc kubenswrapper[4823]: I1216 07:15:02.553809 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5c6df8f9-f4294" Dec 16 07:15:02 crc kubenswrapper[4823]: I1216 07:15:02.591563 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gp42c\" (UniqueName: \"kubernetes.io/projected/8fbf843f-253e-46b0-944e-7e4055e7ecdb-kube-api-access-gp42c\") pod \"openstack-operator-controller-manager-678747d7fb-qbkkw\" (UID: \"8fbf843f-253e-46b0-944e-7e4055e7ecdb\") " pod="openstack-operators/openstack-operator-controller-manager-678747d7fb-qbkkw" Dec 16 07:15:02 crc kubenswrapper[4823]: I1216 07:15:02.591624 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8fbf843f-253e-46b0-944e-7e4055e7ecdb-metrics-certs\") pod \"openstack-operator-controller-manager-678747d7fb-qbkkw\" (UID: \"8fbf843f-253e-46b0-944e-7e4055e7ecdb\") " pod="openstack-operators/openstack-operator-controller-manager-678747d7fb-qbkkw" Dec 16 07:15:02 crc kubenswrapper[4823]: I1216 07:15:02.591664 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/8fbf843f-253e-46b0-944e-7e4055e7ecdb-webhook-certs\") pod \"openstack-operator-controller-manager-678747d7fb-qbkkw\" (UID: \"8fbf843f-253e-46b0-944e-7e4055e7ecdb\") " pod="openstack-operators/openstack-operator-controller-manager-678747d7fb-qbkkw" Dec 16 07:15:02 crc kubenswrapper[4823]: I1216 07:15:02.591694 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hss2f\" (UniqueName: \"kubernetes.io/projected/3378ca15-f3fb-410e-a3fe-96b21dfce8d8-kube-api-access-hss2f\") pod \"rabbitmq-cluster-operator-manager-668c99d594-txmdd\" (UID: 
\"3378ca15-f3fb-410e-a3fe-96b21dfce8d8\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-txmdd" Dec 16 07:15:02 crc kubenswrapper[4823]: E1216 07:15:02.591933 4823 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 16 07:15:02 crc kubenswrapper[4823]: E1216 07:15:02.592095 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8fbf843f-253e-46b0-944e-7e4055e7ecdb-metrics-certs podName:8fbf843f-253e-46b0-944e-7e4055e7ecdb nodeName:}" failed. No retries permitted until 2025-12-16 07:15:03.092046323 +0000 UTC m=+1181.580612446 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8fbf843f-253e-46b0-944e-7e4055e7ecdb-metrics-certs") pod "openstack-operator-controller-manager-678747d7fb-qbkkw" (UID: "8fbf843f-253e-46b0-944e-7e4055e7ecdb") : secret "metrics-server-cert" not found Dec 16 07:15:02 crc kubenswrapper[4823]: E1216 07:15:02.592416 4823 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 16 07:15:02 crc kubenswrapper[4823]: E1216 07:15:02.592477 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8fbf843f-253e-46b0-944e-7e4055e7ecdb-webhook-certs podName:8fbf843f-253e-46b0-944e-7e4055e7ecdb nodeName:}" failed. No retries permitted until 2025-12-16 07:15:03.092452636 +0000 UTC m=+1181.581018759 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/8fbf843f-253e-46b0-944e-7e4055e7ecdb-webhook-certs") pod "openstack-operator-controller-manager-678747d7fb-qbkkw" (UID: "8fbf843f-253e-46b0-944e-7e4055e7ecdb") : secret "webhook-server-cert" not found Dec 16 07:15:02 crc kubenswrapper[4823]: I1216 07:15:02.607825 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gsfbr\" (UniqueName: \"kubernetes.io/projected/22746af9-8023-44ba-8377-e35e048923fe-kube-api-access-gsfbr\") pod \"test-operator-controller-manager-756ccf86c7-dr886\" (UID: \"22746af9-8023-44ba-8377-e35e048923fe\") " pod="openstack-operators/test-operator-controller-manager-756ccf86c7-dr886" Dec 16 07:15:02 crc kubenswrapper[4823]: I1216 07:15:02.624086 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hss2f\" (UniqueName: \"kubernetes.io/projected/3378ca15-f3fb-410e-a3fe-96b21dfce8d8-kube-api-access-hss2f\") pod \"rabbitmq-cluster-operator-manager-668c99d594-txmdd\" (UID: \"3378ca15-f3fb-410e-a3fe-96b21dfce8d8\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-txmdd" Dec 16 07:15:02 crc kubenswrapper[4823]: I1216 07:15:02.624615 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gp42c\" (UniqueName: \"kubernetes.io/projected/8fbf843f-253e-46b0-944e-7e4055e7ecdb-kube-api-access-gp42c\") pod \"openstack-operator-controller-manager-678747d7fb-qbkkw\" (UID: \"8fbf843f-253e-46b0-944e-7e4055e7ecdb\") " pod="openstack-operators/openstack-operator-controller-manager-678747d7fb-qbkkw" Dec 16 07:15:02 crc kubenswrapper[4823]: I1216 07:15:02.712473 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-756ccf86c7-dr886" Dec 16 07:15:02 crc kubenswrapper[4823]: I1216 07:15:02.714460 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-97d456b9-wrzfd" Dec 16 07:15:02 crc kubenswrapper[4823]: I1216 07:15:02.721421 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-55f78b7c4c-sgp6f" Dec 16 07:15:02 crc kubenswrapper[4823]: I1216 07:15:02.876586 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-txmdd" Dec 16 07:15:02 crc kubenswrapper[4823]: I1216 07:15:02.948282 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-5f98b4754f-l6tn9"] Dec 16 07:15:02 crc kubenswrapper[4823]: I1216 07:15:02.950428 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c69e81b5-99ea-4629-a61e-5d0e012bd472-cert\") pod \"openstack-baremetal-operator-controller-manager-66fff4bf6b4m4mt\" (UID: \"c69e81b5-99ea-4629-a61e-5d0e012bd472\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-66fff4bf6b4m4mt" Dec 16 07:15:02 crc kubenswrapper[4823]: E1216 07:15:02.950806 4823 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 16 07:15:02 crc kubenswrapper[4823]: E1216 07:15:02.951063 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c69e81b5-99ea-4629-a61e-5d0e012bd472-cert podName:c69e81b5-99ea-4629-a61e-5d0e012bd472 nodeName:}" failed. No retries permitted until 2025-12-16 07:15:03.951045395 +0000 UTC m=+1182.439611518 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c69e81b5-99ea-4629-a61e-5d0e012bd472-cert") pod "openstack-baremetal-operator-controller-manager-66fff4bf6b4m4mt" (UID: "c69e81b5-99ea-4629-a61e-5d0e012bd472") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 16 07:15:03 crc kubenswrapper[4823]: I1216 07:15:03.153778 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8fbf843f-253e-46b0-944e-7e4055e7ecdb-metrics-certs\") pod \"openstack-operator-controller-manager-678747d7fb-qbkkw\" (UID: \"8fbf843f-253e-46b0-944e-7e4055e7ecdb\") " pod="openstack-operators/openstack-operator-controller-manager-678747d7fb-qbkkw" Dec 16 07:15:03 crc kubenswrapper[4823]: I1216 07:15:03.153875 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/8fbf843f-253e-46b0-944e-7e4055e7ecdb-webhook-certs\") pod \"openstack-operator-controller-manager-678747d7fb-qbkkw\" (UID: \"8fbf843f-253e-46b0-944e-7e4055e7ecdb\") " pod="openstack-operators/openstack-operator-controller-manager-678747d7fb-qbkkw" Dec 16 07:15:03 crc kubenswrapper[4823]: E1216 07:15:03.154038 4823 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 16 07:15:03 crc kubenswrapper[4823]: E1216 07:15:03.154068 4823 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 16 07:15:03 crc kubenswrapper[4823]: E1216 07:15:03.154160 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8fbf843f-253e-46b0-944e-7e4055e7ecdb-metrics-certs podName:8fbf843f-253e-46b0-944e-7e4055e7ecdb nodeName:}" failed. No retries permitted until 2025-12-16 07:15:04.154125599 +0000 UTC m=+1182.642691722 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8fbf843f-253e-46b0-944e-7e4055e7ecdb-metrics-certs") pod "openstack-operator-controller-manager-678747d7fb-qbkkw" (UID: "8fbf843f-253e-46b0-944e-7e4055e7ecdb") : secret "metrics-server-cert" not found Dec 16 07:15:03 crc kubenswrapper[4823]: E1216 07:15:03.154194 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8fbf843f-253e-46b0-944e-7e4055e7ecdb-webhook-certs podName:8fbf843f-253e-46b0-944e-7e4055e7ecdb nodeName:}" failed. No retries permitted until 2025-12-16 07:15:04.1541785 +0000 UTC m=+1182.642744623 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/8fbf843f-253e-46b0-944e-7e4055e7ecdb-webhook-certs") pod "openstack-operator-controller-manager-678747d7fb-qbkkw" (UID: "8fbf843f-253e-46b0-944e-7e4055e7ecdb") : secret "webhook-server-cert" not found Dec 16 07:15:03 crc kubenswrapper[4823]: I1216 07:15:03.212978 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-66f8b87655-fvsjg"] Dec 16 07:15:03 crc kubenswrapper[4823]: I1216 07:15:03.255924 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7f6072b1-7137-4564-9000-aa50b569ceac-cert\") pod \"infra-operator-controller-manager-84b495f78-8rx8h\" (UID: \"7f6072b1-7137-4564-9000-aa50b569ceac\") " pod="openstack-operators/infra-operator-controller-manager-84b495f78-8rx8h" Dec 16 07:15:03 crc kubenswrapper[4823]: E1216 07:15:03.256130 4823 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 16 07:15:03 crc kubenswrapper[4823]: E1216 07:15:03.256188 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7f6072b1-7137-4564-9000-aa50b569ceac-cert 
podName:7f6072b1-7137-4564-9000-aa50b569ceac nodeName:}" failed. No retries permitted until 2025-12-16 07:15:05.256167027 +0000 UTC m=+1183.744733160 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7f6072b1-7137-4564-9000-aa50b569ceac-cert") pod "infra-operator-controller-manager-84b495f78-8rx8h" (UID: "7f6072b1-7137-4564-9000-aa50b569ceac") : secret "infra-operator-webhook-server-cert" not found Dec 16 07:15:03 crc kubenswrapper[4823]: I1216 07:15:03.264430 4823 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 16 07:15:03 crc kubenswrapper[4823]: W1216 07:15:03.438263 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37b11baa_1136_4fea_869d_e3d8f98bca83.slice/crio-2dd617e51b11b9099b3e188b1fda0c8bd21b6f96ded18cf43b7e3172a605bd6e WatchSource:0}: Error finding container 2dd617e51b11b9099b3e188b1fda0c8bd21b6f96ded18cf43b7e3172a605bd6e: Status 404 returned error can't find the container with id 2dd617e51b11b9099b3e188b1fda0c8bd21b6f96ded18cf43b7e3172a605bd6e Dec 16 07:15:03 crc kubenswrapper[4823]: I1216 07:15:03.959945 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c69e81b5-99ea-4629-a61e-5d0e012bd472-cert\") pod \"openstack-baremetal-operator-controller-manager-66fff4bf6b4m4mt\" (UID: \"c69e81b5-99ea-4629-a61e-5d0e012bd472\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-66fff4bf6b4m4mt" Dec 16 07:15:03 crc kubenswrapper[4823]: E1216 07:15:03.960657 4823 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 16 07:15:03 crc kubenswrapper[4823]: E1216 07:15:03.960739 4823 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/c69e81b5-99ea-4629-a61e-5d0e012bd472-cert podName:c69e81b5-99ea-4629-a61e-5d0e012bd472 nodeName:}" failed. No retries permitted until 2025-12-16 07:15:05.960714056 +0000 UTC m=+1184.449280179 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c69e81b5-99ea-4629-a61e-5d0e012bd472-cert") pod "openstack-baremetal-operator-controller-manager-66fff4bf6b4m4mt" (UID: "c69e81b5-99ea-4629-a61e-5d0e012bd472") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 16 07:15:04 crc kubenswrapper[4823]: I1216 07:15:04.157271 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-5f98b4754f-l6tn9" event={"ID":"e58b0dc3-aa85-4623-bc8b-d2e1dc73dca1","Type":"ContainerStarted","Data":"9f2a28a2e601c5270edbcceac062627bdb01b49096ceaeb77785a9eae5b5a96e"} Dec 16 07:15:04 crc kubenswrapper[4823]: I1216 07:15:04.158049 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-66f8b87655-fvsjg" event={"ID":"37b11baa-1136-4fea-869d-e3d8f98bca83","Type":"ContainerStarted","Data":"2dd617e51b11b9099b3e188b1fda0c8bd21b6f96ded18cf43b7e3172a605bd6e"} Dec 16 07:15:04 crc kubenswrapper[4823]: I1216 07:15:04.164252 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8fbf843f-253e-46b0-944e-7e4055e7ecdb-metrics-certs\") pod \"openstack-operator-controller-manager-678747d7fb-qbkkw\" (UID: \"8fbf843f-253e-46b0-944e-7e4055e7ecdb\") " pod="openstack-operators/openstack-operator-controller-manager-678747d7fb-qbkkw" Dec 16 07:15:04 crc kubenswrapper[4823]: I1216 07:15:04.164306 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/8fbf843f-253e-46b0-944e-7e4055e7ecdb-webhook-certs\") pod 
\"openstack-operator-controller-manager-678747d7fb-qbkkw\" (UID: \"8fbf843f-253e-46b0-944e-7e4055e7ecdb\") " pod="openstack-operators/openstack-operator-controller-manager-678747d7fb-qbkkw" Dec 16 07:15:04 crc kubenswrapper[4823]: E1216 07:15:04.164434 4823 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 16 07:15:04 crc kubenswrapper[4823]: E1216 07:15:04.164518 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8fbf843f-253e-46b0-944e-7e4055e7ecdb-webhook-certs podName:8fbf843f-253e-46b0-944e-7e4055e7ecdb nodeName:}" failed. No retries permitted until 2025-12-16 07:15:06.164474213 +0000 UTC m=+1184.653040336 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/8fbf843f-253e-46b0-944e-7e4055e7ecdb-webhook-certs") pod "openstack-operator-controller-manager-678747d7fb-qbkkw" (UID: "8fbf843f-253e-46b0-944e-7e4055e7ecdb") : secret "webhook-server-cert" not found Dec 16 07:15:04 crc kubenswrapper[4823]: E1216 07:15:04.164820 4823 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 16 07:15:04 crc kubenswrapper[4823]: E1216 07:15:04.164918 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8fbf843f-253e-46b0-944e-7e4055e7ecdb-metrics-certs podName:8fbf843f-253e-46b0-944e-7e4055e7ecdb nodeName:}" failed. No retries permitted until 2025-12-16 07:15:06.164893956 +0000 UTC m=+1184.653460139 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8fbf843f-253e-46b0-944e-7e4055e7ecdb-metrics-certs") pod "openstack-operator-controller-manager-678747d7fb-qbkkw" (UID: "8fbf843f-253e-46b0-944e-7e4055e7ecdb") : secret "metrics-server-cert" not found Dec 16 07:15:04 crc kubenswrapper[4823]: I1216 07:15:04.331169 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-767f9d7567-tg8ww"] Dec 16 07:15:04 crc kubenswrapper[4823]: W1216 07:15:04.337840 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod62d59368_9ca6_4327_a979_c4c31903630c.slice/crio-d9ade18c71b0c203662c84b602b71133c792cb827a1db5b88c36b06188866a78 WatchSource:0}: Error finding container d9ade18c71b0c203662c84b602b71133c792cb827a1db5b88c36b06188866a78: Status 404 returned error can't find the container with id d9ade18c71b0c203662c84b602b71133c792cb827a1db5b88c36b06188866a78 Dec 16 07:15:04 crc kubenswrapper[4823]: I1216 07:15:04.433134 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431155-4qg92" Dec 16 07:15:04 crc kubenswrapper[4823]: I1216 07:15:04.554910 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-95949466-l2h76"] Dec 16 07:15:04 crc kubenswrapper[4823]: I1216 07:15:04.579957 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t292p\" (UniqueName: \"kubernetes.io/projected/1bb3b1a1-ff5f-4119-8b6e-4ccc834a34d6-kube-api-access-t292p\") pod \"1bb3b1a1-ff5f-4119-8b6e-4ccc834a34d6\" (UID: \"1bb3b1a1-ff5f-4119-8b6e-4ccc834a34d6\") " Dec 16 07:15:04 crc kubenswrapper[4823]: I1216 07:15:04.580049 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1bb3b1a1-ff5f-4119-8b6e-4ccc834a34d6-config-volume\") pod \"1bb3b1a1-ff5f-4119-8b6e-4ccc834a34d6\" (UID: \"1bb3b1a1-ff5f-4119-8b6e-4ccc834a34d6\") " Dec 16 07:15:04 crc kubenswrapper[4823]: I1216 07:15:04.580129 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1bb3b1a1-ff5f-4119-8b6e-4ccc834a34d6-secret-volume\") pod \"1bb3b1a1-ff5f-4119-8b6e-4ccc834a34d6\" (UID: \"1bb3b1a1-ff5f-4119-8b6e-4ccc834a34d6\") " Dec 16 07:15:04 crc kubenswrapper[4823]: I1216 07:15:04.581751 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bb3b1a1-ff5f-4119-8b6e-4ccc834a34d6-config-volume" (OuterVolumeSpecName: "config-volume") pod "1bb3b1a1-ff5f-4119-8b6e-4ccc834a34d6" (UID: "1bb3b1a1-ff5f-4119-8b6e-4ccc834a34d6"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:15:04 crc kubenswrapper[4823]: I1216 07:15:04.590761 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bb3b1a1-ff5f-4119-8b6e-4ccc834a34d6-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "1bb3b1a1-ff5f-4119-8b6e-4ccc834a34d6" (UID: "1bb3b1a1-ff5f-4119-8b6e-4ccc834a34d6"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:15:04 crc kubenswrapper[4823]: I1216 07:15:04.590986 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bb3b1a1-ff5f-4119-8b6e-4ccc834a34d6-kube-api-access-t292p" (OuterVolumeSpecName: "kube-api-access-t292p") pod "1bb3b1a1-ff5f-4119-8b6e-4ccc834a34d6" (UID: "1bb3b1a1-ff5f-4119-8b6e-4ccc834a34d6"). InnerVolumeSpecName "kube-api-access-t292p". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:15:04 crc kubenswrapper[4823]: I1216 07:15:04.686942 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t292p\" (UniqueName: \"kubernetes.io/projected/1bb3b1a1-ff5f-4119-8b6e-4ccc834a34d6-kube-api-access-t292p\") on node \"crc\" DevicePath \"\"" Dec 16 07:15:04 crc kubenswrapper[4823]: I1216 07:15:04.686991 4823 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1bb3b1a1-ff5f-4119-8b6e-4ccc834a34d6-config-volume\") on node \"crc\" DevicePath \"\"" Dec 16 07:15:04 crc kubenswrapper[4823]: I1216 07:15:04.687010 4823 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1bb3b1a1-ff5f-4119-8b6e-4ccc834a34d6-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 16 07:15:04 crc kubenswrapper[4823]: I1216 07:15:04.816074 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-5fdd9786f7-57b67"] Dec 16 07:15:04 crc 
kubenswrapper[4823]: I1216 07:15:04.827090 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6ccf486b9-8mh84"] Dec 16 07:15:04 crc kubenswrapper[4823]: I1216 07:15:04.844080 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-f458558d7-629tj"] Dec 16 07:15:04 crc kubenswrapper[4823]: I1216 07:15:04.860005 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-59b8dcb766-26qs6"] Dec 16 07:15:04 crc kubenswrapper[4823]: I1216 07:15:04.882728 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-97d456b9-wrzfd"] Dec 16 07:15:04 crc kubenswrapper[4823]: W1216 07:15:04.900145 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a46fccc_0870_4aed_96db_064958d3f0c3.slice/crio-178d73c48f15cb89d84ffbbdcb5ddc0e7d7f2d86b9a97da4d6f360d902c4ed50 WatchSource:0}: Error finding container 178d73c48f15cb89d84ffbbdcb5ddc0e7d7f2d86b9a97da4d6f360d902c4ed50: Status 404 returned error can't find the container with id 178d73c48f15cb89d84ffbbdcb5ddc0e7d7f2d86b9a97da4d6f360d902c4ed50 Dec 16 07:15:04 crc kubenswrapper[4823]: I1216 07:15:04.902631 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-68c649d9d-8n9sx"] Dec 16 07:15:04 crc kubenswrapper[4823]: I1216 07:15:04.964076 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-8665b56d78-47v54"] Dec 16 07:15:04 crc kubenswrapper[4823]: W1216 07:15:04.965159 4823 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf93451fb_312c_448f_a868_43c05626d74a.slice/crio-7d8c7cb7d3b24bc39d8dc11fbd94a22fa1ef16ce8424b61ae88ff9d97c60a852 WatchSource:0}: Error finding container 7d8c7cb7d3b24bc39d8dc11fbd94a22fa1ef16ce8424b61ae88ff9d97c60a852: Status 404 returned error can't find the container with id 7d8c7cb7d3b24bc39d8dc11fbd94a22fa1ef16ce8424b61ae88ff9d97c60a852 Dec 16 07:15:04 crc kubenswrapper[4823]: I1216 07:15:04.981830 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-rkn7m"] Dec 16 07:15:04 crc kubenswrapper[4823]: W1216 07:15:04.995828 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod201bb612_805a_4516_b18e_41382e5e4c42.slice/crio-443d5e742279d82be5eea03ac7e5ab4c1c86f6b849402a49de338403a6dcd3f8 WatchSource:0}: Error finding container 443d5e742279d82be5eea03ac7e5ab4c1c86f6b849402a49de338403a6dcd3f8: Status 404 returned error can't find the container with id 443d5e742279d82be5eea03ac7e5ab4c1c86f6b849402a49de338403a6dcd3f8 Dec 16 07:15:05 crc kubenswrapper[4823]: I1216 07:15:05.040809 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-txmdd"] Dec 16 07:15:05 crc kubenswrapper[4823]: W1216 07:15:05.050701 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3378ca15_f3fb_410e_a3fe_96b21dfce8d8.slice/crio-21743fff99ce520c56674bfe68d7295ec059558905e042a40133bfaed4b5d492 WatchSource:0}: Error finding container 21743fff99ce520c56674bfe68d7295ec059558905e042a40133bfaed4b5d492: Status 404 returned error can't find the container with id 21743fff99ce520c56674bfe68d7295ec059558905e042a40133bfaed4b5d492 Dec 16 07:15:05 crc kubenswrapper[4823]: I1216 07:15:05.053999 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/watcher-operator-controller-manager-55f78b7c4c-sgp6f"] Dec 16 07:15:05 crc kubenswrapper[4823]: W1216 07:15:05.056277 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod862263d5_cd38_4867_a8ce_6a82d3170f48.slice/crio-af6f9c8cb728b9b462e041fcb436852d5e0ccd5dbef942ed5ffa33e32ffdf2b0 WatchSource:0}: Error finding container af6f9c8cb728b9b462e041fcb436852d5e0ccd5dbef942ed5ffa33e32ffdf2b0: Status 404 returned error can't find the container with id af6f9c8cb728b9b462e041fcb436852d5e0ccd5dbef942ed5ffa33e32ffdf2b0 Dec 16 07:15:05 crc kubenswrapper[4823]: E1216 07:15:05.057141 4823 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-gsfbr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-756ccf86c7-dr886_openstack-operators(22746af9-8023-44ba-8377-e35e048923fe): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 16 07:15:05 crc kubenswrapper[4823]: E1216 07:15:05.058499 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-756ccf86c7-dr886" podUID="22746af9-8023-44ba-8377-e35e048923fe" Dec 16 07:15:05 crc kubenswrapper[4823]: W1216 07:15:05.065238 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3995bf1e_51ff_4543_ba13_cce941e6caab.slice/crio-1c9870613768eb8c4b50a453948ab230fc79d0804f049692e4875d3d7b504af4 WatchSource:0}: Error finding container 
1c9870613768eb8c4b50a453948ab230fc79d0804f049692e4875d3d7b504af4: Status 404 returned error can't find the container with id 1c9870613768eb8c4b50a453948ab230fc79d0804f049692e4875d3d7b504af4 Dec 16 07:15:05 crc kubenswrapper[4823]: E1216 07:15:05.066045 4823 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:72ad6517987f674af0d0ae092cbb874aeae909c8b8b60188099c311762ebc8f7,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-czrvz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-5c7cbf548f-ffflh_openstack-operators(862263d5-cd38-4867-a8ce-6a82d3170f48): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 16 07:15:05 crc kubenswrapper[4823]: W1216 07:15:05.066166 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podebfb35eb_0e15_454b_9f27_27c35373793b.slice/crio-a72d53d9b4bde8dbd74f9bacf8f3562f7122cbea97902f975d187cdd3d7ffbc5 WatchSource:0}: Error finding container a72d53d9b4bde8dbd74f9bacf8f3562f7122cbea97902f975d187cdd3d7ffbc5: Status 404 returned error can't find the container with id a72d53d9b4bde8dbd74f9bacf8f3562f7122cbea97902f975d187cdd3d7ffbc5 Dec 16 07:15:05 crc kubenswrapper[4823]: E1216 07:15:05.067442 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/keystone-operator-controller-manager-5c7cbf548f-ffflh" podUID="862263d5-cd38-4867-a8ce-6a82d3170f48" Dec 16 07:15:05 crc kubenswrapper[4823]: I1216 07:15:05.067589 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/mariadb-operator-controller-manager-f76f4954c-btpw8"] Dec 16 07:15:05 crc kubenswrapper[4823]: E1216 07:15:05.067996 4823 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:0b3fb69f35c151895d3dffd514974a9f9fe1c77c3bca69b78b81efb183cf4557,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6xfdz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-7cd87b778f-z7kmz_openstack-operators(b3422264-49a2-4032-8906-b74358a9451d): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 16 07:15:05 crc kubenswrapper[4823]: E1216 07:15:05.069432 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/neutron-operator-controller-manager-7cd87b778f-z7kmz" podUID="b3422264-49a2-4032-8906-b74358a9451d" Dec 16 07:15:05 crc kubenswrapper[4823]: E1216 07:15:05.070835 4823 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:3aa109bb973253ae9dcf339b9b65abbd1176cdb4be672c93e538a5f113816991,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-xmzl2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-5c6df8f9-f4294_openstack-operators(3995bf1e-51ff-4543-ba13-cce941e6caab): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 16 07:15:05 crc kubenswrapper[4823]: E1216 07:15:05.071907 4823 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:961417d59f527d925ac48ff6a11de747d0493315e496e34dc83d76a1a1fff58a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qb7cd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-55f78b7c4c-sgp6f_openstack-operators(ebfb35eb-0e15-454b-9f27-27c35373793b): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 16 07:15:05 crc kubenswrapper[4823]: E1216 07:15:05.071962 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-5c6df8f9-f4294" podUID="3995bf1e-51ff-4543-ba13-cce941e6caab" Dec 16 07:15:05 crc 
kubenswrapper[4823]: E1216 07:15:05.073062 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-55f78b7c4c-sgp6f" podUID="ebfb35eb-0e15-454b-9f27-27c35373793b" Dec 16 07:15:05 crc kubenswrapper[4823]: I1216 07:15:05.081992 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-7cd87b778f-z7kmz"] Dec 16 07:15:05 crc kubenswrapper[4823]: W1216 07:15:05.086586 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73f9c317_3748_4e9d_a683_1d7fab3949b5.slice/crio-8103276db54c2619e38572fb3efeb502342d617dbac5bc19f9a13001fae4db68 WatchSource:0}: Error finding container 8103276db54c2619e38572fb3efeb502342d617dbac5bc19f9a13001fae4db68: Status 404 returned error can't find the container with id 8103276db54c2619e38572fb3efeb502342d617dbac5bc19f9a13001fae4db68 Dec 16 07:15:05 crc kubenswrapper[4823]: I1216 07:15:05.091176 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-756ccf86c7-dr886"] Dec 16 07:15:05 crc kubenswrapper[4823]: I1216 07:15:05.098140 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-5c7cbf548f-ffflh"] Dec 16 07:15:05 crc kubenswrapper[4823]: I1216 07:15:05.104080 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5c6df8f9-f4294"] Dec 16 07:15:05 crc kubenswrapper[4823]: I1216 07:15:05.123665 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-bf6d4f946-7pmnl"] Dec 16 07:15:05 crc kubenswrapper[4823]: I1216 07:15:05.167338 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/ironic-operator-controller-manager-f458558d7-629tj" event={"ID":"8a46fccc-0870-4aed-96db-064958d3f0c3","Type":"ContainerStarted","Data":"178d73c48f15cb89d84ffbbdcb5ddc0e7d7f2d86b9a97da4d6f360d902c4ed50"} Dec 16 07:15:05 crc kubenswrapper[4823]: I1216 07:15:05.170552 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-rkn7m" event={"ID":"201bb612-805a-4516-b18e-41382e5e4c42","Type":"ContainerStarted","Data":"443d5e742279d82be5eea03ac7e5ab4c1c86f6b849402a49de338403a6dcd3f8"} Dec 16 07:15:05 crc kubenswrapper[4823]: I1216 07:15:05.171959 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-8665b56d78-47v54" event={"ID":"f390c9c8-73bb-44e7-aa6f-4501691d8415","Type":"ContainerStarted","Data":"dd1e76e5dd2ff3749eeb83accb2a81b629e25ffab98fefd4213500db9d1632ba"} Dec 16 07:15:05 crc kubenswrapper[4823]: I1216 07:15:05.175059 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-f76f4954c-btpw8" event={"ID":"9b601555-09ee-46de-8736-e28797436673","Type":"ContainerStarted","Data":"e3b2368c554006e105089c33227c169dea85718b330cbcb63e39c3f98032399b"} Dec 16 07:15:05 crc kubenswrapper[4823]: I1216 07:15:05.176745 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-756ccf86c7-dr886" event={"ID":"22746af9-8023-44ba-8377-e35e048923fe","Type":"ContainerStarted","Data":"5760cc456d5833dac84a20d5a2e61973a846cf0af6b67a4e5583eb67ce5d4142"} Dec 16 07:15:05 crc kubenswrapper[4823]: E1216 07:15:05.178923 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94\\\"\"" 
pod="openstack-operators/test-operator-controller-manager-756ccf86c7-dr886" podUID="22746af9-8023-44ba-8377-e35e048923fe" Dec 16 07:15:05 crc kubenswrapper[4823]: I1216 07:15:05.180706 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6ccf486b9-8mh84" event={"ID":"62b57d47-be40-449a-8503-b86187f19914","Type":"ContainerStarted","Data":"ea6c061ecb7ed0852f4567a672e752e8071b19770f4e2945765f25bbfce2bfbb"} Dec 16 07:15:05 crc kubenswrapper[4823]: I1216 07:15:05.183212 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-767f9d7567-tg8ww" event={"ID":"62d59368-9ca6-4327-a979-c4c31903630c","Type":"ContainerStarted","Data":"d9ade18c71b0c203662c84b602b71133c792cb827a1db5b88c36b06188866a78"} Dec 16 07:15:05 crc kubenswrapper[4823]: I1216 07:15:05.184843 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431155-4qg92" event={"ID":"1bb3b1a1-ff5f-4119-8b6e-4ccc834a34d6","Type":"ContainerDied","Data":"485f1234eedde1243278aa9c55ed0c4cca31c00aee8f234f5af7c308c21a09d9"} Dec 16 07:15:05 crc kubenswrapper[4823]: I1216 07:15:05.184865 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="485f1234eedde1243278aa9c55ed0c4cca31c00aee8f234f5af7c308c21a09d9" Dec 16 07:15:05 crc kubenswrapper[4823]: I1216 07:15:05.184915 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431155-4qg92" Dec 16 07:15:05 crc kubenswrapper[4823]: I1216 07:15:05.203243 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5c6df8f9-f4294" event={"ID":"3995bf1e-51ff-4543-ba13-cce941e6caab","Type":"ContainerStarted","Data":"1c9870613768eb8c4b50a453948ab230fc79d0804f049692e4875d3d7b504af4"} Dec 16 07:15:05 crc kubenswrapper[4823]: I1216 07:15:05.204784 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-97d456b9-wrzfd" event={"ID":"f93451fb-312c-448f-a868-43c05626d74a","Type":"ContainerStarted","Data":"7d8c7cb7d3b24bc39d8dc11fbd94a22fa1ef16ce8424b61ae88ff9d97c60a852"} Dec 16 07:15:05 crc kubenswrapper[4823]: E1216 07:15:05.205326 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:3aa109bb973253ae9dcf339b9b65abbd1176cdb4be672c93e538a5f113816991\\\"\"" pod="openstack-operators/swift-operator-controller-manager-5c6df8f9-f4294" podUID="3995bf1e-51ff-4543-ba13-cce941e6caab" Dec 16 07:15:05 crc kubenswrapper[4823]: I1216 07:15:05.207842 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-95949466-l2h76" event={"ID":"e5260194-8fc8-4615-bfd5-98210220f074","Type":"ContainerStarted","Data":"9d5d91dbf5467949053d62527cc96610efc9ecac55244bb1e54e96a41e65ab6a"} Dec 16 07:15:05 crc kubenswrapper[4823]: I1216 07:15:05.210825 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-59b8dcb766-26qs6" event={"ID":"ae33bf2e-0415-4ba8-9508-f7c36182aec8","Type":"ContainerStarted","Data":"232bdff2db7a0ad32b7d7f3f7dcc050c2f063ebd6a83431774664f89bcf196b5"} Dec 16 07:15:05 crc kubenswrapper[4823]: 
I1216 07:15:05.222729 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-55f78b7c4c-sgp6f" event={"ID":"ebfb35eb-0e15-454b-9f27-27c35373793b","Type":"ContainerStarted","Data":"a72d53d9b4bde8dbd74f9bacf8f3562f7122cbea97902f975d187cdd3d7ffbc5"} Dec 16 07:15:05 crc kubenswrapper[4823]: E1216 07:15:05.224712 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:961417d59f527d925ac48ff6a11de747d0493315e496e34dc83d76a1a1fff58a\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-55f78b7c4c-sgp6f" podUID="ebfb35eb-0e15-454b-9f27-27c35373793b" Dec 16 07:15:05 crc kubenswrapper[4823]: I1216 07:15:05.225811 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-7cd87b778f-z7kmz" event={"ID":"b3422264-49a2-4032-8906-b74358a9451d","Type":"ContainerStarted","Data":"f6339f7ce9fdff244b46e21d8eba9b35f6a737b4859117fdadc59f713fcbc2db"} Dec 16 07:15:05 crc kubenswrapper[4823]: E1216 07:15:05.227792 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:0b3fb69f35c151895d3dffd514974a9f9fe1c77c3bca69b78b81efb183cf4557\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-7cd87b778f-z7kmz" podUID="b3422264-49a2-4032-8906-b74358a9451d" Dec 16 07:15:05 crc kubenswrapper[4823]: I1216 07:15:05.231215 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-5c7cbf548f-ffflh" event={"ID":"862263d5-cd38-4867-a8ce-6a82d3170f48","Type":"ContainerStarted","Data":"af6f9c8cb728b9b462e041fcb436852d5e0ccd5dbef942ed5ffa33e32ffdf2b0"} Dec 16 07:15:05 crc kubenswrapper[4823]: 
E1216 07:15:05.232574 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:72ad6517987f674af0d0ae092cbb874aeae909c8b8b60188099c311762ebc8f7\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-5c7cbf548f-ffflh" podUID="862263d5-cd38-4867-a8ce-6a82d3170f48" Dec 16 07:15:05 crc kubenswrapper[4823]: I1216 07:15:05.241566 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-68c649d9d-8n9sx" event={"ID":"9de24328-0da5-4d0a-a34c-5cd820b35a23","Type":"ContainerStarted","Data":"9d3a5415c9578ee3b11a8b1cbe5e5a07c4933ff990f6735e38859562965d6812"} Dec 16 07:15:05 crc kubenswrapper[4823]: I1216 07:15:05.250645 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-5fdd9786f7-57b67" event={"ID":"e75f878b-9fb0-429b-8d6a-b30b98c1dba5","Type":"ContainerStarted","Data":"f41f03623f63321e2fbf04ed4130d57df66025d285d1f412b59c120f7dabce02"} Dec 16 07:15:05 crc kubenswrapper[4823]: I1216 07:15:05.253407 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-bf6d4f946-7pmnl" event={"ID":"73f9c317-3748-4e9d-a683-1d7fab3949b5","Type":"ContainerStarted","Data":"8103276db54c2619e38572fb3efeb502342d617dbac5bc19f9a13001fae4db68"} Dec 16 07:15:05 crc kubenswrapper[4823]: I1216 07:15:05.259830 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-txmdd" event={"ID":"3378ca15-f3fb-410e-a3fe-96b21dfce8d8","Type":"ContainerStarted","Data":"21743fff99ce520c56674bfe68d7295ec059558905e042a40133bfaed4b5d492"} Dec 16 07:15:05 crc kubenswrapper[4823]: I1216 07:15:05.298232 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/7f6072b1-7137-4564-9000-aa50b569ceac-cert\") pod \"infra-operator-controller-manager-84b495f78-8rx8h\" (UID: \"7f6072b1-7137-4564-9000-aa50b569ceac\") " pod="openstack-operators/infra-operator-controller-manager-84b495f78-8rx8h" Dec 16 07:15:05 crc kubenswrapper[4823]: E1216 07:15:05.299165 4823 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 16 07:15:05 crc kubenswrapper[4823]: E1216 07:15:05.299205 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7f6072b1-7137-4564-9000-aa50b569ceac-cert podName:7f6072b1-7137-4564-9000-aa50b569ceac nodeName:}" failed. No retries permitted until 2025-12-16 07:15:09.299191004 +0000 UTC m=+1187.787757127 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7f6072b1-7137-4564-9000-aa50b569ceac-cert") pod "infra-operator-controller-manager-84b495f78-8rx8h" (UID: "7f6072b1-7137-4564-9000-aa50b569ceac") : secret "infra-operator-webhook-server-cert" not found Dec 16 07:15:06 crc kubenswrapper[4823]: I1216 07:15:06.033661 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c69e81b5-99ea-4629-a61e-5d0e012bd472-cert\") pod \"openstack-baremetal-operator-controller-manager-66fff4bf6b4m4mt\" (UID: \"c69e81b5-99ea-4629-a61e-5d0e012bd472\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-66fff4bf6b4m4mt" Dec 16 07:15:06 crc kubenswrapper[4823]: E1216 07:15:06.033965 4823 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 16 07:15:06 crc kubenswrapper[4823]: E1216 07:15:06.034060 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c69e81b5-99ea-4629-a61e-5d0e012bd472-cert 
podName:c69e81b5-99ea-4629-a61e-5d0e012bd472 nodeName:}" failed. No retries permitted until 2025-12-16 07:15:10.034040974 +0000 UTC m=+1188.522607097 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c69e81b5-99ea-4629-a61e-5d0e012bd472-cert") pod "openstack-baremetal-operator-controller-manager-66fff4bf6b4m4mt" (UID: "c69e81b5-99ea-4629-a61e-5d0e012bd472") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 16 07:15:06 crc kubenswrapper[4823]: I1216 07:15:06.236954 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8fbf843f-253e-46b0-944e-7e4055e7ecdb-metrics-certs\") pod \"openstack-operator-controller-manager-678747d7fb-qbkkw\" (UID: \"8fbf843f-253e-46b0-944e-7e4055e7ecdb\") " pod="openstack-operators/openstack-operator-controller-manager-678747d7fb-qbkkw" Dec 16 07:15:06 crc kubenswrapper[4823]: I1216 07:15:06.237014 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/8fbf843f-253e-46b0-944e-7e4055e7ecdb-webhook-certs\") pod \"openstack-operator-controller-manager-678747d7fb-qbkkw\" (UID: \"8fbf843f-253e-46b0-944e-7e4055e7ecdb\") " pod="openstack-operators/openstack-operator-controller-manager-678747d7fb-qbkkw" Dec 16 07:15:06 crc kubenswrapper[4823]: E1216 07:15:06.237188 4823 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 16 07:15:06 crc kubenswrapper[4823]: E1216 07:15:06.237550 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8fbf843f-253e-46b0-944e-7e4055e7ecdb-metrics-certs podName:8fbf843f-253e-46b0-944e-7e4055e7ecdb nodeName:}" failed. No retries permitted until 2025-12-16 07:15:10.237521541 +0000 UTC m=+1188.726087704 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8fbf843f-253e-46b0-944e-7e4055e7ecdb-metrics-certs") pod "openstack-operator-controller-manager-678747d7fb-qbkkw" (UID: "8fbf843f-253e-46b0-944e-7e4055e7ecdb") : secret "metrics-server-cert" not found Dec 16 07:15:06 crc kubenswrapper[4823]: E1216 07:15:06.237221 4823 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 16 07:15:06 crc kubenswrapper[4823]: E1216 07:15:06.238156 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8fbf843f-253e-46b0-944e-7e4055e7ecdb-webhook-certs podName:8fbf843f-253e-46b0-944e-7e4055e7ecdb nodeName:}" failed. No retries permitted until 2025-12-16 07:15:10.238090009 +0000 UTC m=+1188.726656132 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/8fbf843f-253e-46b0-944e-7e4055e7ecdb-webhook-certs") pod "openstack-operator-controller-manager-678747d7fb-qbkkw" (UID: "8fbf843f-253e-46b0-944e-7e4055e7ecdb") : secret "webhook-server-cert" not found Dec 16 07:15:06 crc kubenswrapper[4823]: E1216 07:15:06.274437 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:961417d59f527d925ac48ff6a11de747d0493315e496e34dc83d76a1a1fff58a\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-55f78b7c4c-sgp6f" podUID="ebfb35eb-0e15-454b-9f27-27c35373793b" Dec 16 07:15:06 crc kubenswrapper[4823]: E1216 07:15:06.274921 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:72ad6517987f674af0d0ae092cbb874aeae909c8b8b60188099c311762ebc8f7\\\"\"" 
pod="openstack-operators/keystone-operator-controller-manager-5c7cbf548f-ffflh" podUID="862263d5-cd38-4867-a8ce-6a82d3170f48" Dec 16 07:15:06 crc kubenswrapper[4823]: E1216 07:15:06.275015 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:0b3fb69f35c151895d3dffd514974a9f9fe1c77c3bca69b78b81efb183cf4557\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-7cd87b778f-z7kmz" podUID="b3422264-49a2-4032-8906-b74358a9451d" Dec 16 07:15:06 crc kubenswrapper[4823]: E1216 07:15:06.275046 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:3aa109bb973253ae9dcf339b9b65abbd1176cdb4be672c93e538a5f113816991\\\"\"" pod="openstack-operators/swift-operator-controller-manager-5c6df8f9-f4294" podUID="3995bf1e-51ff-4543-ba13-cce941e6caab" Dec 16 07:15:06 crc kubenswrapper[4823]: E1216 07:15:06.275205 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94\\\"\"" pod="openstack-operators/test-operator-controller-manager-756ccf86c7-dr886" podUID="22746af9-8023-44ba-8377-e35e048923fe" Dec 16 07:15:09 crc kubenswrapper[4823]: I1216 07:15:09.386611 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7f6072b1-7137-4564-9000-aa50b569ceac-cert\") pod \"infra-operator-controller-manager-84b495f78-8rx8h\" (UID: \"7f6072b1-7137-4564-9000-aa50b569ceac\") " pod="openstack-operators/infra-operator-controller-manager-84b495f78-8rx8h" Dec 16 07:15:09 crc kubenswrapper[4823]: 
E1216 07:15:09.386785 4823 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 16 07:15:09 crc kubenswrapper[4823]: E1216 07:15:09.386867 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7f6072b1-7137-4564-9000-aa50b569ceac-cert podName:7f6072b1-7137-4564-9000-aa50b569ceac nodeName:}" failed. No retries permitted until 2025-12-16 07:15:17.386845839 +0000 UTC m=+1195.875411962 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7f6072b1-7137-4564-9000-aa50b569ceac-cert") pod "infra-operator-controller-manager-84b495f78-8rx8h" (UID: "7f6072b1-7137-4564-9000-aa50b569ceac") : secret "infra-operator-webhook-server-cert" not found Dec 16 07:15:10 crc kubenswrapper[4823]: I1216 07:15:10.099878 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c69e81b5-99ea-4629-a61e-5d0e012bd472-cert\") pod \"openstack-baremetal-operator-controller-manager-66fff4bf6b4m4mt\" (UID: \"c69e81b5-99ea-4629-a61e-5d0e012bd472\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-66fff4bf6b4m4mt" Dec 16 07:15:10 crc kubenswrapper[4823]: E1216 07:15:10.100066 4823 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 16 07:15:10 crc kubenswrapper[4823]: E1216 07:15:10.100139 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c69e81b5-99ea-4629-a61e-5d0e012bd472-cert podName:c69e81b5-99ea-4629-a61e-5d0e012bd472 nodeName:}" failed. No retries permitted until 2025-12-16 07:15:18.100121823 +0000 UTC m=+1196.588687946 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c69e81b5-99ea-4629-a61e-5d0e012bd472-cert") pod "openstack-baremetal-operator-controller-manager-66fff4bf6b4m4mt" (UID: "c69e81b5-99ea-4629-a61e-5d0e012bd472") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 16 07:15:10 crc kubenswrapper[4823]: I1216 07:15:10.303889 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/8fbf843f-253e-46b0-944e-7e4055e7ecdb-webhook-certs\") pod \"openstack-operator-controller-manager-678747d7fb-qbkkw\" (UID: \"8fbf843f-253e-46b0-944e-7e4055e7ecdb\") " pod="openstack-operators/openstack-operator-controller-manager-678747d7fb-qbkkw" Dec 16 07:15:10 crc kubenswrapper[4823]: I1216 07:15:10.304145 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8fbf843f-253e-46b0-944e-7e4055e7ecdb-metrics-certs\") pod \"openstack-operator-controller-manager-678747d7fb-qbkkw\" (UID: \"8fbf843f-253e-46b0-944e-7e4055e7ecdb\") " pod="openstack-operators/openstack-operator-controller-manager-678747d7fb-qbkkw" Dec 16 07:15:10 crc kubenswrapper[4823]: E1216 07:15:10.304239 4823 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 16 07:15:10 crc kubenswrapper[4823]: E1216 07:15:10.304293 4823 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 16 07:15:10 crc kubenswrapper[4823]: E1216 07:15:10.304297 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8fbf843f-253e-46b0-944e-7e4055e7ecdb-webhook-certs podName:8fbf843f-253e-46b0-944e-7e4055e7ecdb nodeName:}" failed. No retries permitted until 2025-12-16 07:15:18.304281051 +0000 UTC m=+1196.792847174 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/8fbf843f-253e-46b0-944e-7e4055e7ecdb-webhook-certs") pod "openstack-operator-controller-manager-678747d7fb-qbkkw" (UID: "8fbf843f-253e-46b0-944e-7e4055e7ecdb") : secret "webhook-server-cert" not found Dec 16 07:15:10 crc kubenswrapper[4823]: E1216 07:15:10.304375 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8fbf843f-253e-46b0-944e-7e4055e7ecdb-metrics-certs podName:8fbf843f-253e-46b0-944e-7e4055e7ecdb nodeName:}" failed. No retries permitted until 2025-12-16 07:15:18.304352473 +0000 UTC m=+1196.792918656 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8fbf843f-253e-46b0-944e-7e4055e7ecdb-metrics-certs") pod "openstack-operator-controller-manager-678747d7fb-qbkkw" (UID: "8fbf843f-253e-46b0-944e-7e4055e7ecdb") : secret "metrics-server-cert" not found Dec 16 07:15:16 crc kubenswrapper[4823]: E1216 07:15:16.262561 4823 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/glance-operator@sha256:5370dc4a8e776923eec00bb50cbdb2e390e9dde50be26bdc04a216bd2d6b5027" Dec 16 07:15:16 crc kubenswrapper[4823]: E1216 07:15:16.263146 4823 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/glance-operator@sha256:5370dc4a8e776923eec00bb50cbdb2e390e9dde50be26bdc04a216bd2d6b5027,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-vn5nd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-operator-controller-manager-767f9d7567-tg8ww_openstack-operators(62d59368-9ca6-4327-a979-c4c31903630c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 16 07:15:16 crc kubenswrapper[4823]: E1216 07:15:16.264317 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/glance-operator-controller-manager-767f9d7567-tg8ww" podUID="62d59368-9ca6-4327-a979-c4c31903630c" Dec 16 07:15:16 crc kubenswrapper[4823]: E1216 07:15:16.360815 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/glance-operator@sha256:5370dc4a8e776923eec00bb50cbdb2e390e9dde50be26bdc04a216bd2d6b5027\\\"\"" pod="openstack-operators/glance-operator-controller-manager-767f9d7567-tg8ww" podUID="62d59368-9ca6-4327-a979-c4c31903630c" Dec 16 07:15:17 crc kubenswrapper[4823]: I1216 07:15:17.415821 4823 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7f6072b1-7137-4564-9000-aa50b569ceac-cert\") pod \"infra-operator-controller-manager-84b495f78-8rx8h\" (UID: \"7f6072b1-7137-4564-9000-aa50b569ceac\") " pod="openstack-operators/infra-operator-controller-manager-84b495f78-8rx8h" Dec 16 07:15:17 crc kubenswrapper[4823]: I1216 07:15:17.420931 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7f6072b1-7137-4564-9000-aa50b569ceac-cert\") pod \"infra-operator-controller-manager-84b495f78-8rx8h\" (UID: \"7f6072b1-7137-4564-9000-aa50b569ceac\") " pod="openstack-operators/infra-operator-controller-manager-84b495f78-8rx8h" Dec 16 07:15:17 crc kubenswrapper[4823]: I1216 07:15:17.461602 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-84b495f78-8rx8h" Dec 16 07:15:18 crc kubenswrapper[4823]: I1216 07:15:18.124681 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c69e81b5-99ea-4629-a61e-5d0e012bd472-cert\") pod \"openstack-baremetal-operator-controller-manager-66fff4bf6b4m4mt\" (UID: \"c69e81b5-99ea-4629-a61e-5d0e012bd472\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-66fff4bf6b4m4mt" Dec 16 07:15:18 crc kubenswrapper[4823]: I1216 07:15:18.131755 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c69e81b5-99ea-4629-a61e-5d0e012bd472-cert\") pod \"openstack-baremetal-operator-controller-manager-66fff4bf6b4m4mt\" (UID: \"c69e81b5-99ea-4629-a61e-5d0e012bd472\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-66fff4bf6b4m4mt" Dec 16 07:15:18 crc kubenswrapper[4823]: I1216 07:15:18.327434 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" 
(UniqueName: \"kubernetes.io/secret/8fbf843f-253e-46b0-944e-7e4055e7ecdb-webhook-certs\") pod \"openstack-operator-controller-manager-678747d7fb-qbkkw\" (UID: \"8fbf843f-253e-46b0-944e-7e4055e7ecdb\") " pod="openstack-operators/openstack-operator-controller-manager-678747d7fb-qbkkw" Dec 16 07:15:18 crc kubenswrapper[4823]: I1216 07:15:18.327700 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8fbf843f-253e-46b0-944e-7e4055e7ecdb-metrics-certs\") pod \"openstack-operator-controller-manager-678747d7fb-qbkkw\" (UID: \"8fbf843f-253e-46b0-944e-7e4055e7ecdb\") " pod="openstack-operators/openstack-operator-controller-manager-678747d7fb-qbkkw" Dec 16 07:15:18 crc kubenswrapper[4823]: I1216 07:15:18.332468 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8fbf843f-253e-46b0-944e-7e4055e7ecdb-metrics-certs\") pod \"openstack-operator-controller-manager-678747d7fb-qbkkw\" (UID: \"8fbf843f-253e-46b0-944e-7e4055e7ecdb\") " pod="openstack-operators/openstack-operator-controller-manager-678747d7fb-qbkkw" Dec 16 07:15:18 crc kubenswrapper[4823]: I1216 07:15:18.335946 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/8fbf843f-253e-46b0-944e-7e4055e7ecdb-webhook-certs\") pod \"openstack-operator-controller-manager-678747d7fb-qbkkw\" (UID: \"8fbf843f-253e-46b0-944e-7e4055e7ecdb\") " pod="openstack-operators/openstack-operator-controller-manager-678747d7fb-qbkkw" Dec 16 07:15:18 crc kubenswrapper[4823]: I1216 07:15:18.429259 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-66fff4bf6b4m4mt" Dec 16 07:15:18 crc kubenswrapper[4823]: I1216 07:15:18.506374 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-678747d7fb-qbkkw" Dec 16 07:15:23 crc kubenswrapper[4823]: E1216 07:15:23.983222 4823 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/manila-operator@sha256:44126f9c6b1d2bf752ddf989e20a4fc4cc1c07723d4fcb78465ccb2f55da6b3a" Dec 16 07:15:23 crc kubenswrapper[4823]: E1216 07:15:23.983959 4823 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:44126f9c6b1d2bf752ddf989e20a4fc4cc1c07723d4fcb78465ccb2f55da6b3a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-pxhhc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-5fdd9786f7-57b67_openstack-operators(e75f878b-9fb0-429b-8d6a-b30b98c1dba5): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 16 07:15:23 crc kubenswrapper[4823]: E1216 07:15:23.985182 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/manila-operator-controller-manager-5fdd9786f7-57b67" podUID="e75f878b-9fb0-429b-8d6a-b30b98c1dba5" Dec 16 07:15:24 crc kubenswrapper[4823]: E1216 07:15:24.416819 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:44126f9c6b1d2bf752ddf989e20a4fc4cc1c07723d4fcb78465ccb2f55da6b3a\\\"\"" pod="openstack-operators/manila-operator-controller-manager-5fdd9786f7-57b67" podUID="e75f878b-9fb0-429b-8d6a-b30b98c1dba5" Dec 16 07:15:24 crc kubenswrapper[4823]: E1216 07:15:24.824132 4823 log.go:32] "PullImage from image service failed" 
err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/horizon-operator@sha256:9e847f4dbdea19ab997f32a02b3680a9bd966f9c705911645c3866a19fda9ea5" Dec 16 07:15:24 crc kubenswrapper[4823]: E1216 07:15:24.824355 4823 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/horizon-operator@sha256:9e847f4dbdea19ab997f32a02b3680a9bd966f9c705911645c3866a19fda9ea5,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-xmqnt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-operator-controller-manager-6ccf486b9-8mh84_openstack-operators(62b57d47-be40-449a-8503-b86187f19914): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 16 07:15:24 crc kubenswrapper[4823]: E1216 07:15:24.825561 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/horizon-operator-controller-manager-6ccf486b9-8mh84" podUID="62b57d47-be40-449a-8503-b86187f19914" Dec 16 07:15:25 crc kubenswrapper[4823]: E1216 07:15:25.437873 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/horizon-operator@sha256:9e847f4dbdea19ab997f32a02b3680a9bd966f9c705911645c3866a19fda9ea5\\\"\"" pod="openstack-operators/horizon-operator-controller-manager-6ccf486b9-8mh84" podUID="62b57d47-be40-449a-8503-b86187f19914" Dec 16 07:15:26 crc kubenswrapper[4823]: E1216 07:15:26.720875 4823 log.go:32] "PullImage from image service failed" 
err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/telemetry-operator@sha256:f27e732ec1faee765461bf137d9be81278b2fa39675019a73622755e1e610b6f" Dec 16 07:15:26 crc kubenswrapper[4823]: E1216 07:15:26.721501 4823 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:f27e732ec1faee765461bf137d9be81278b2fa39675019a73622755e1e610b6f,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jr5tk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-97d456b9-wrzfd_openstack-operators(f93451fb-312c-448f-a868-43c05626d74a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 16 07:15:26 crc kubenswrapper[4823]: E1216 07:15:26.722669 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/telemetry-operator-controller-manager-97d456b9-wrzfd" podUID="f93451fb-312c-448f-a868-43c05626d74a" Dec 16 07:15:27 crc kubenswrapper[4823]: E1216 07:15:27.457742 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:f27e732ec1faee765461bf137d9be81278b2fa39675019a73622755e1e610b6f\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-97d456b9-wrzfd" podUID="f93451fb-312c-448f-a868-43c05626d74a" Dec 16 07:15:29 crc kubenswrapper[4823]: E1216 07:15:29.348711 4823 log.go:32] "PullImage from image service 
failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f" Dec 16 07:15:29 crc kubenswrapper[4823]: E1216 07:15:29.349257 4823 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-gq8px,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-8665b56d78-47v54_openstack-operators(f390c9c8-73bb-44e7-aa6f-4501691d8415): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 16 07:15:29 crc kubenswrapper[4823]: E1216 07:15:29.350957 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/placement-operator-controller-manager-8665b56d78-47v54" podUID="f390c9c8-73bb-44e7-aa6f-4501691d8415" Dec 16 07:15:29 crc kubenswrapper[4823]: E1216 07:15:29.469865 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f\\\"\"" pod="openstack-operators/placement-operator-controller-manager-8665b56d78-47v54" podUID="f390c9c8-73bb-44e7-aa6f-4501691d8415" Dec 16 07:15:31 crc kubenswrapper[4823]: E1216 07:15:31.168240 4823 log.go:32] "PullImage from image 
service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2" Dec 16 07:15:31 crc kubenswrapper[4823]: E1216 07:15:31.169444 4823 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hss2f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-txmdd_openstack-operators(3378ca15-f3fb-410e-a3fe-96b21dfce8d8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 16 07:15:31 crc kubenswrapper[4823]: E1216 07:15:31.174444 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-txmdd" podUID="3378ca15-f3fb-410e-a3fe-96b21dfce8d8" Dec 16 07:15:31 crc kubenswrapper[4823]: E1216 07:15:31.478790 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-txmdd" podUID="3378ca15-f3fb-410e-a3fe-96b21dfce8d8" Dec 16 07:15:34 crc 
kubenswrapper[4823]: E1216 07:15:34.292407 4823 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670" Dec 16 07:15:34 crc kubenswrapper[4823]: E1216 07:15:34.292620 4823 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-rcmtn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-5fbbf8b6cc-rkn7m_openstack-operators(201bb612-805a-4516-b18e-41382e5e4c42): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 16 07:15:34 crc kubenswrapper[4823]: E1216 07:15:34.294756 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-rkn7m" podUID="201bb612-805a-4516-b18e-41382e5e4c42" Dec 16 07:15:34 crc kubenswrapper[4823]: E1216 07:15:34.501826 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670\\\"\"" pod="openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-rkn7m" podUID="201bb612-805a-4516-b18e-41382e5e4c42" Dec 16 07:15:35 crc kubenswrapper[4823]: I1216 07:15:35.553285 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/openstack-baremetal-operator-controller-manager-66fff4bf6b4m4mt"] Dec 16 07:15:35 crc kubenswrapper[4823]: W1216 07:15:35.560832 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc69e81b5_99ea_4629_a61e_5d0e012bd472.slice/crio-b7f235b4053ac74ef021e30d19521e425b9252e91cd8e54dd3447daa410f319a WatchSource:0}: Error finding container b7f235b4053ac74ef021e30d19521e425b9252e91cd8e54dd3447daa410f319a: Status 404 returned error can't find the container with id b7f235b4053ac74ef021e30d19521e425b9252e91cd8e54dd3447daa410f319a Dec 16 07:15:35 crc kubenswrapper[4823]: I1216 07:15:35.695971 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-84b495f78-8rx8h"] Dec 16 07:15:35 crc kubenswrapper[4823]: W1216 07:15:35.709385 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f6072b1_7137_4564_9000_aa50b569ceac.slice/crio-457fbfdc1020dd9b21211dd71126972eb60d169b0f8d752066f6f658c2bbe417 WatchSource:0}: Error finding container 457fbfdc1020dd9b21211dd71126972eb60d169b0f8d752066f6f658c2bbe417: Status 404 returned error can't find the container with id 457fbfdc1020dd9b21211dd71126972eb60d169b0f8d752066f6f658c2bbe417 Dec 16 07:15:35 crc kubenswrapper[4823]: I1216 07:15:35.797579 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-678747d7fb-qbkkw"] Dec 16 07:15:36 crc kubenswrapper[4823]: I1216 07:15:36.529276 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-66fff4bf6b4m4mt" event={"ID":"c69e81b5-99ea-4629-a61e-5d0e012bd472","Type":"ContainerStarted","Data":"b7f235b4053ac74ef021e30d19521e425b9252e91cd8e54dd3447daa410f319a"} Dec 16 07:15:36 crc kubenswrapper[4823]: I1216 07:15:36.565950 4823 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-f76f4954c-btpw8" event={"ID":"9b601555-09ee-46de-8736-e28797436673","Type":"ContainerStarted","Data":"48bf967a01a683f19e2051dcfb76063aa800a42068158dcd9a87081d4db2b27e"} Dec 16 07:15:36 crc kubenswrapper[4823]: I1216 07:15:36.567337 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-f76f4954c-btpw8" Dec 16 07:15:36 crc kubenswrapper[4823]: I1216 07:15:36.607707 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-bf6d4f946-7pmnl" event={"ID":"73f9c317-3748-4e9d-a683-1d7fab3949b5","Type":"ContainerStarted","Data":"8ab00daa7882a27d9b75a13016fa28195c0a476198626127991f6d4a225d2dad"} Dec 16 07:15:36 crc kubenswrapper[4823]: I1216 07:15:36.608597 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-bf6d4f946-7pmnl" Dec 16 07:15:36 crc kubenswrapper[4823]: I1216 07:15:36.622148 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-f76f4954c-btpw8" podStartSLOduration=9.47595228 podStartE2EDuration="35.622130068s" podCreationTimestamp="2025-12-16 07:15:01 +0000 UTC" firstStartedPulling="2025-12-16 07:15:05.055187517 +0000 UTC m=+1183.543753640" lastFinishedPulling="2025-12-16 07:15:31.201365305 +0000 UTC m=+1209.689931428" observedRunningTime="2025-12-16 07:15:36.619394743 +0000 UTC m=+1215.107960886" watchObservedRunningTime="2025-12-16 07:15:36.622130068 +0000 UTC m=+1215.110696191" Dec 16 07:15:36 crc kubenswrapper[4823]: I1216 07:15:36.640208 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-55f78b7c4c-sgp6f" 
event={"ID":"ebfb35eb-0e15-454b-9f27-27c35373793b","Type":"ContainerStarted","Data":"f10179c66e7db88877ef6499453c47f665994dc78d4e3bbe99c51d91cba553d7"} Dec 16 07:15:36 crc kubenswrapper[4823]: I1216 07:15:36.641213 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-55f78b7c4c-sgp6f" Dec 16 07:15:36 crc kubenswrapper[4823]: I1216 07:15:36.663471 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-f458558d7-629tj" event={"ID":"8a46fccc-0870-4aed-96db-064958d3f0c3","Type":"ContainerStarted","Data":"e52e002bcc51e744193a8402c319a28541656414083c0f101265f858d0ea4ae7"} Dec 16 07:15:36 crc kubenswrapper[4823]: I1216 07:15:36.664472 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-f458558d7-629tj" Dec 16 07:15:36 crc kubenswrapper[4823]: I1216 07:15:36.666292 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-7cd87b778f-z7kmz" event={"ID":"b3422264-49a2-4032-8906-b74358a9451d","Type":"ContainerStarted","Data":"82a22c8877008a977b0043ca049c787738ce6529c04cf7fa8cbdc2982cf408aa"} Dec 16 07:15:36 crc kubenswrapper[4823]: I1216 07:15:36.666988 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-7cd87b778f-z7kmz" Dec 16 07:15:36 crc kubenswrapper[4823]: I1216 07:15:36.668578 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-5f98b4754f-l6tn9" event={"ID":"e58b0dc3-aa85-4623-bc8b-d2e1dc73dca1","Type":"ContainerStarted","Data":"c497617ebd775c95a6bb372ff04a06e4b2ab290995717d235e96e406aa1261dd"} Dec 16 07:15:36 crc kubenswrapper[4823]: I1216 07:15:36.669259 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/cinder-operator-controller-manager-5f98b4754f-l6tn9" Dec 16 07:15:36 crc kubenswrapper[4823]: I1216 07:15:36.684985 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-678747d7fb-qbkkw" event={"ID":"8fbf843f-253e-46b0-944e-7e4055e7ecdb","Type":"ContainerStarted","Data":"561e9bb9d659b253969108b1ef121b7e74eedf7539ac18a62add47f9212d3fc2"} Dec 16 07:15:36 crc kubenswrapper[4823]: I1216 07:15:36.685041 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-678747d7fb-qbkkw" event={"ID":"8fbf843f-253e-46b0-944e-7e4055e7ecdb","Type":"ContainerStarted","Data":"e3a4093abc799886c9ccdee6f84b1546989af69ee42d4c432f4fbfed18fcb7c4"} Dec 16 07:15:36 crc kubenswrapper[4823]: I1216 07:15:36.685649 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-678747d7fb-qbkkw" Dec 16 07:15:36 crc kubenswrapper[4823]: I1216 07:15:36.695619 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-767f9d7567-tg8ww" event={"ID":"62d59368-9ca6-4327-a979-c4c31903630c","Type":"ContainerStarted","Data":"1dd0567fb1d80657fd62081668eac8fdf2bb1cc2f3bed0eb4a953a84291d5983"} Dec 16 07:15:36 crc kubenswrapper[4823]: I1216 07:15:36.696145 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-767f9d7567-tg8ww" Dec 16 07:15:36 crc kubenswrapper[4823]: I1216 07:15:36.706217 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-756ccf86c7-dr886" event={"ID":"22746af9-8023-44ba-8377-e35e048923fe","Type":"ContainerStarted","Data":"b99dff92de22c753bceace7cc13c47f35ffdd7275fbf7160a3abb0c2be34ad43"} Dec 16 07:15:36 crc kubenswrapper[4823]: I1216 07:15:36.706854 4823 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-756ccf86c7-dr886" Dec 16 07:15:36 crc kubenswrapper[4823]: I1216 07:15:36.712646 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-84b495f78-8rx8h" event={"ID":"7f6072b1-7137-4564-9000-aa50b569ceac","Type":"ContainerStarted","Data":"457fbfdc1020dd9b21211dd71126972eb60d169b0f8d752066f6f658c2bbe417"} Dec 16 07:15:36 crc kubenswrapper[4823]: I1216 07:15:36.737563 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-bf6d4f946-7pmnl" podStartSLOduration=9.12714833 podStartE2EDuration="35.737545135s" podCreationTimestamp="2025-12-16 07:15:01 +0000 UTC" firstStartedPulling="2025-12-16 07:15:05.090602818 +0000 UTC m=+1183.579168951" lastFinishedPulling="2025-12-16 07:15:31.700999633 +0000 UTC m=+1210.189565756" observedRunningTime="2025-12-16 07:15:36.684571214 +0000 UTC m=+1215.173137327" watchObservedRunningTime="2025-12-16 07:15:36.737545135 +0000 UTC m=+1215.226111258" Dec 16 07:15:36 crc kubenswrapper[4823]: I1216 07:15:36.747415 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-5c7cbf548f-ffflh" event={"ID":"862263d5-cd38-4867-a8ce-6a82d3170f48","Type":"ContainerStarted","Data":"1309116f1baa848ebaec725b7869b95a79d26507f3bef2d102de7f6b950c7aa8"} Dec 16 07:15:36 crc kubenswrapper[4823]: I1216 07:15:36.747789 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-5c7cbf548f-ffflh" Dec 16 07:15:36 crc kubenswrapper[4823]: I1216 07:15:36.779755 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5c6df8f9-f4294" 
event={"ID":"3995bf1e-51ff-4543-ba13-cce941e6caab","Type":"ContainerStarted","Data":"d73c4941e89273c7e5ab942382dc1b6d5e0a6a485651fd63b3c9bd79804becf9"} Dec 16 07:15:36 crc kubenswrapper[4823]: I1216 07:15:36.780794 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-5c6df8f9-f4294" Dec 16 07:15:36 crc kubenswrapper[4823]: I1216 07:15:36.815092 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-95949466-l2h76" event={"ID":"e5260194-8fc8-4615-bfd5-98210220f074","Type":"ContainerStarted","Data":"cede3eb203d9d681d8cb99d3eb64e4fd6d6fcd2472ef45fa68e127b3eb048ba7"} Dec 16 07:15:36 crc kubenswrapper[4823]: I1216 07:15:36.815742 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-95949466-l2h76" Dec 16 07:15:36 crc kubenswrapper[4823]: I1216 07:15:36.899744 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-f458558d7-629tj" podStartSLOduration=9.649648015 podStartE2EDuration="35.899727298s" podCreationTimestamp="2025-12-16 07:15:01 +0000 UTC" firstStartedPulling="2025-12-16 07:15:04.950246479 +0000 UTC m=+1183.438812592" lastFinishedPulling="2025-12-16 07:15:31.200325752 +0000 UTC m=+1209.688891875" observedRunningTime="2025-12-16 07:15:36.741440297 +0000 UTC m=+1215.230006420" watchObservedRunningTime="2025-12-16 07:15:36.899727298 +0000 UTC m=+1215.388293421" Dec 16 07:15:36 crc kubenswrapper[4823]: I1216 07:15:36.899978 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-66f8b87655-fvsjg" event={"ID":"37b11baa-1136-4fea-869d-e3d8f98bca83","Type":"ContainerStarted","Data":"97665499fae6d7c55fe1ac2344720301015ce1561927e2b1a0100a9c80cb3f96"} Dec 16 07:15:36 crc kubenswrapper[4823]: I1216 07:15:36.900584 4823 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-66f8b87655-fvsjg" Dec 16 07:15:36 crc kubenswrapper[4823]: I1216 07:15:36.900625 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-5f98b4754f-l6tn9" podStartSLOduration=7.963377677 podStartE2EDuration="35.900617215s" podCreationTimestamp="2025-12-16 07:15:01 +0000 UTC" firstStartedPulling="2025-12-16 07:15:03.264083765 +0000 UTC m=+1181.752649888" lastFinishedPulling="2025-12-16 07:15:31.201323303 +0000 UTC m=+1209.689889426" observedRunningTime="2025-12-16 07:15:36.898247201 +0000 UTC m=+1215.386813324" watchObservedRunningTime="2025-12-16 07:15:36.900617215 +0000 UTC m=+1215.389183338" Dec 16 07:15:36 crc kubenswrapper[4823]: I1216 07:15:36.912326 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-68c649d9d-8n9sx" event={"ID":"9de24328-0da5-4d0a-a34c-5cd820b35a23","Type":"ContainerStarted","Data":"5553f21db9eee18b2aa91c245e687991801838ce83b90fbc061ba92bf9b462c6"} Dec 16 07:15:36 crc kubenswrapper[4823]: I1216 07:15:36.913161 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-68c649d9d-8n9sx" Dec 16 07:15:36 crc kubenswrapper[4823]: I1216 07:15:36.931685 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-59b8dcb766-26qs6" event={"ID":"ae33bf2e-0415-4ba8-9508-f7c36182aec8","Type":"ContainerStarted","Data":"63b271be0bbdde1e96d5d1e7e00b7721e929d0bb036c9281605a89b4c1be27d9"} Dec 16 07:15:36 crc kubenswrapper[4823]: I1216 07:15:36.932348 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-59b8dcb766-26qs6" Dec 16 07:15:36 crc kubenswrapper[4823]: I1216 07:15:36.956397 4823 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-7cd87b778f-z7kmz" podStartSLOduration=5.769956448 podStartE2EDuration="35.956374883s" podCreationTimestamp="2025-12-16 07:15:01 +0000 UTC" firstStartedPulling="2025-12-16 07:15:05.067897246 +0000 UTC m=+1183.556463369" lastFinishedPulling="2025-12-16 07:15:35.254315681 +0000 UTC m=+1213.742881804" observedRunningTime="2025-12-16 07:15:36.948191986 +0000 UTC m=+1215.436758109" watchObservedRunningTime="2025-12-16 07:15:36.956374883 +0000 UTC m=+1215.444940996" Dec 16 07:15:36 crc kubenswrapper[4823]: I1216 07:15:36.996615 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-55f78b7c4c-sgp6f" podStartSLOduration=5.787274219 podStartE2EDuration="35.996592503s" podCreationTimestamp="2025-12-16 07:15:01 +0000 UTC" firstStartedPulling="2025-12-16 07:15:05.071841749 +0000 UTC m=+1183.560407872" lastFinishedPulling="2025-12-16 07:15:35.281160033 +0000 UTC m=+1213.769726156" observedRunningTime="2025-12-16 07:15:36.992619419 +0000 UTC m=+1215.481185542" watchObservedRunningTime="2025-12-16 07:15:36.996592503 +0000 UTC m=+1215.485158626" Dec 16 07:15:37 crc kubenswrapper[4823]: I1216 07:15:37.017670 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-95949466-l2h76" podStartSLOduration=8.887582931 podStartE2EDuration="36.017655104s" podCreationTimestamp="2025-12-16 07:15:01 +0000 UTC" firstStartedPulling="2025-12-16 07:15:04.570939401 +0000 UTC m=+1183.059505514" lastFinishedPulling="2025-12-16 07:15:31.701011564 +0000 UTC m=+1210.189577687" observedRunningTime="2025-12-16 07:15:37.014309809 +0000 UTC m=+1215.502875932" watchObservedRunningTime="2025-12-16 07:15:37.017655104 +0000 UTC m=+1215.506221227" Dec 16 07:15:37 crc kubenswrapper[4823]: I1216 07:15:37.067466 4823 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-68c649d9d-8n9sx" podStartSLOduration=9.332013599 podStartE2EDuration="36.067450724s" podCreationTimestamp="2025-12-16 07:15:01 +0000 UTC" firstStartedPulling="2025-12-16 07:15:04.966420495 +0000 UTC m=+1183.454986618" lastFinishedPulling="2025-12-16 07:15:31.70185762 +0000 UTC m=+1210.190423743" observedRunningTime="2025-12-16 07:15:37.065927876 +0000 UTC m=+1215.554493999" watchObservedRunningTime="2025-12-16 07:15:37.067450724 +0000 UTC m=+1215.556016847" Dec 16 07:15:37 crc kubenswrapper[4823]: I1216 07:15:37.152746 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-66f8b87655-fvsjg" podStartSLOduration=8.408811268000001 podStartE2EDuration="36.152729497s" podCreationTimestamp="2025-12-16 07:15:01 +0000 UTC" firstStartedPulling="2025-12-16 07:15:03.458006183 +0000 UTC m=+1181.946572306" lastFinishedPulling="2025-12-16 07:15:31.201924412 +0000 UTC m=+1209.690490535" observedRunningTime="2025-12-16 07:15:37.10974291 +0000 UTC m=+1215.598309043" watchObservedRunningTime="2025-12-16 07:15:37.152729497 +0000 UTC m=+1215.641295610" Dec 16 07:15:37 crc kubenswrapper[4823]: I1216 07:15:37.156939 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-678747d7fb-qbkkw" podStartSLOduration=35.156929099 podStartE2EDuration="35.156929099s" podCreationTimestamp="2025-12-16 07:15:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 07:15:37.154161882 +0000 UTC m=+1215.642727995" watchObservedRunningTime="2025-12-16 07:15:37.156929099 +0000 UTC m=+1215.645495222" Dec 16 07:15:37 crc kubenswrapper[4823]: I1216 07:15:37.177299 4823 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack-operators/heat-operator-controller-manager-59b8dcb766-26qs6" podStartSLOduration=9.426487641 podStartE2EDuration="36.177278396s" podCreationTimestamp="2025-12-16 07:15:01 +0000 UTC" firstStartedPulling="2025-12-16 07:15:04.950629591 +0000 UTC m=+1183.439195714" lastFinishedPulling="2025-12-16 07:15:31.701420346 +0000 UTC m=+1210.189986469" observedRunningTime="2025-12-16 07:15:37.165126025 +0000 UTC m=+1215.653692178" watchObservedRunningTime="2025-12-16 07:15:37.177278396 +0000 UTC m=+1215.665844519" Dec 16 07:15:37 crc kubenswrapper[4823]: I1216 07:15:37.190877 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-5c7cbf548f-ffflh" podStartSLOduration=5.882444843 podStartE2EDuration="36.190857662s" podCreationTimestamp="2025-12-16 07:15:01 +0000 UTC" firstStartedPulling="2025-12-16 07:15:05.065665156 +0000 UTC m=+1183.554231279" lastFinishedPulling="2025-12-16 07:15:35.374077965 +0000 UTC m=+1213.862644098" observedRunningTime="2025-12-16 07:15:37.181410306 +0000 UTC m=+1215.669976429" watchObservedRunningTime="2025-12-16 07:15:37.190857662 +0000 UTC m=+1215.679423785" Dec 16 07:15:37 crc kubenswrapper[4823]: I1216 07:15:37.208947 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-767f9d7567-tg8ww" podStartSLOduration=5.424291814 podStartE2EDuration="36.208926788s" podCreationTimestamp="2025-12-16 07:15:01 +0000 UTC" firstStartedPulling="2025-12-16 07:15:04.342557304 +0000 UTC m=+1182.831123427" lastFinishedPulling="2025-12-16 07:15:35.127192278 +0000 UTC m=+1213.615758401" observedRunningTime="2025-12-16 07:15:37.194058143 +0000 UTC m=+1215.682624266" watchObservedRunningTime="2025-12-16 07:15:37.208926788 +0000 UTC m=+1215.697492911" Dec 16 07:15:37 crc kubenswrapper[4823]: I1216 07:15:37.225784 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/test-operator-controller-manager-756ccf86c7-dr886" podStartSLOduration=6.00075876 podStartE2EDuration="36.225761975s" podCreationTimestamp="2025-12-16 07:15:01 +0000 UTC" firstStartedPulling="2025-12-16 07:15:05.056875471 +0000 UTC m=+1183.545441594" lastFinishedPulling="2025-12-16 07:15:35.281878686 +0000 UTC m=+1213.770444809" observedRunningTime="2025-12-16 07:15:37.219481428 +0000 UTC m=+1215.708047551" watchObservedRunningTime="2025-12-16 07:15:37.225761975 +0000 UTC m=+1215.714328108" Dec 16 07:15:37 crc kubenswrapper[4823]: I1216 07:15:37.651377 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-5c6df8f9-f4294" podStartSLOduration=6.441018508 podStartE2EDuration="36.651328762s" podCreationTimestamp="2025-12-16 07:15:01 +0000 UTC" firstStartedPulling="2025-12-16 07:15:05.070757226 +0000 UTC m=+1183.559323349" lastFinishedPulling="2025-12-16 07:15:35.28106748 +0000 UTC m=+1213.769633603" observedRunningTime="2025-12-16 07:15:37.646913954 +0000 UTC m=+1216.135480087" watchObservedRunningTime="2025-12-16 07:15:37.651328762 +0000 UTC m=+1216.139894905" Dec 16 07:15:38 crc kubenswrapper[4823]: I1216 07:15:38.041174 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-5fdd9786f7-57b67" event={"ID":"e75f878b-9fb0-429b-8d6a-b30b98c1dba5","Type":"ContainerStarted","Data":"ae21df593f9b93dace342ba4f303f083fe3c538b726bc0b000378c8964cc618d"} Dec 16 07:15:38 crc kubenswrapper[4823]: I1216 07:15:38.079356 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-5fdd9786f7-57b67" podStartSLOduration=4.39701295 podStartE2EDuration="37.079334486s" podCreationTimestamp="2025-12-16 07:15:01 +0000 UTC" firstStartedPulling="2025-12-16 07:15:04.965197918 +0000 UTC m=+1183.453764041" lastFinishedPulling="2025-12-16 07:15:37.647519454 +0000 UTC 
m=+1216.136085577" observedRunningTime="2025-12-16 07:15:38.076147266 +0000 UTC m=+1216.564713409" watchObservedRunningTime="2025-12-16 07:15:38.079334486 +0000 UTC m=+1216.567900609" Dec 16 07:15:41 crc kubenswrapper[4823]: I1216 07:15:41.725273 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-5f98b4754f-l6tn9" Dec 16 07:15:41 crc kubenswrapper[4823]: I1216 07:15:41.727952 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-95949466-l2h76" Dec 16 07:15:41 crc kubenswrapper[4823]: I1216 07:15:41.765556 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-767f9d7567-tg8ww" Dec 16 07:15:41 crc kubenswrapper[4823]: I1216 07:15:41.803196 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-66f8b87655-fvsjg" Dec 16 07:15:41 crc kubenswrapper[4823]: I1216 07:15:41.809856 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-59b8dcb766-26qs6" Dec 16 07:15:41 crc kubenswrapper[4823]: I1216 07:15:41.920800 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-f458558d7-629tj" Dec 16 07:15:42 crc kubenswrapper[4823]: I1216 07:15:42.014643 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-5c7cbf548f-ffflh" Dec 16 07:15:42 crc kubenswrapper[4823]: I1216 07:15:42.081104 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-66fff4bf6b4m4mt" 
event={"ID":"c69e81b5-99ea-4629-a61e-5d0e012bd472","Type":"ContainerStarted","Data":"bacb53c80ebc123da1adb62c95ef7072298ec82f9189d9d92bafcd29e24d383b"} Dec 16 07:15:42 crc kubenswrapper[4823]: I1216 07:15:42.082088 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-66fff4bf6b4m4mt" Dec 16 07:15:42 crc kubenswrapper[4823]: I1216 07:15:42.083409 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-84b495f78-8rx8h" event={"ID":"7f6072b1-7137-4564-9000-aa50b569ceac","Type":"ContainerStarted","Data":"31982d98af8f8c719d3bee8830f5feb54844821e69a8393c105c6cab3335b6b2"} Dec 16 07:15:42 crc kubenswrapper[4823]: I1216 07:15:42.083814 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-84b495f78-8rx8h" Dec 16 07:15:42 crc kubenswrapper[4823]: I1216 07:15:42.085183 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6ccf486b9-8mh84" event={"ID":"62b57d47-be40-449a-8503-b86187f19914","Type":"ContainerStarted","Data":"a8eb3323b59e8df10e906d499d1628fddaf8a2b6aad66fee6b5fb5d1d557c670"} Dec 16 07:15:42 crc kubenswrapper[4823]: I1216 07:15:42.085604 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-6ccf486b9-8mh84" Dec 16 07:15:42 crc kubenswrapper[4823]: I1216 07:15:42.087238 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-8665b56d78-47v54" event={"ID":"f390c9c8-73bb-44e7-aa6f-4501691d8415","Type":"ContainerStarted","Data":"f7c34670311d5c52c518e5c0986fa7ac8eb7236c5439e7a01e62290a37aa09a7"} Dec 16 07:15:42 crc kubenswrapper[4823]: I1216 07:15:42.087647 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/placement-operator-controller-manager-8665b56d78-47v54" Dec 16 07:15:42 crc kubenswrapper[4823]: I1216 07:15:42.089530 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-97d456b9-wrzfd" event={"ID":"f93451fb-312c-448f-a868-43c05626d74a","Type":"ContainerStarted","Data":"91399c2950371b3e41ee030e5a671fe873723fb7ab6b817f1a26f0946360eaf2"} Dec 16 07:15:42 crc kubenswrapper[4823]: I1216 07:15:42.089989 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-97d456b9-wrzfd" Dec 16 07:15:42 crc kubenswrapper[4823]: I1216 07:15:42.130387 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-5fdd9786f7-57b67" Dec 16 07:15:42 crc kubenswrapper[4823]: I1216 07:15:42.135670 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-5fdd9786f7-57b67" Dec 16 07:15:42 crc kubenswrapper[4823]: I1216 07:15:42.148059 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-66fff4bf6b4m4mt" podStartSLOduration=35.475347009 podStartE2EDuration="41.148016646s" podCreationTimestamp="2025-12-16 07:15:01 +0000 UTC" firstStartedPulling="2025-12-16 07:15:35.570069508 +0000 UTC m=+1214.058635631" lastFinishedPulling="2025-12-16 07:15:41.242739145 +0000 UTC m=+1219.731305268" observedRunningTime="2025-12-16 07:15:42.142172233 +0000 UTC m=+1220.630738376" watchObservedRunningTime="2025-12-16 07:15:42.148016646 +0000 UTC m=+1220.636582769" Dec 16 07:15:42 crc kubenswrapper[4823]: I1216 07:15:42.157972 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-8665b56d78-47v54" podStartSLOduration=4.806281495 
podStartE2EDuration="41.157954067s" podCreationTimestamp="2025-12-16 07:15:01 +0000 UTC" firstStartedPulling="2025-12-16 07:15:05.038780193 +0000 UTC m=+1183.527346316" lastFinishedPulling="2025-12-16 07:15:41.390452765 +0000 UTC m=+1219.879018888" observedRunningTime="2025-12-16 07:15:42.154052736 +0000 UTC m=+1220.642618859" watchObservedRunningTime="2025-12-16 07:15:42.157954067 +0000 UTC m=+1220.646520190" Dec 16 07:15:42 crc kubenswrapper[4823]: I1216 07:15:42.163155 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-7cd87b778f-z7kmz" Dec 16 07:15:42 crc kubenswrapper[4823]: I1216 07:15:42.179660 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-6ccf486b9-8mh84" podStartSLOduration=4.812061237 podStartE2EDuration="41.179640708s" podCreationTimestamp="2025-12-16 07:15:01 +0000 UTC" firstStartedPulling="2025-12-16 07:15:04.873952907 +0000 UTC m=+1183.362519040" lastFinishedPulling="2025-12-16 07:15:41.241532388 +0000 UTC m=+1219.730098511" observedRunningTime="2025-12-16 07:15:42.175102765 +0000 UTC m=+1220.663668888" watchObservedRunningTime="2025-12-16 07:15:42.179640708 +0000 UTC m=+1220.668206831" Dec 16 07:15:42 crc kubenswrapper[4823]: I1216 07:15:42.229848 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-84b495f78-8rx8h" podStartSLOduration=35.706633976 podStartE2EDuration="41.22982647s" podCreationTimestamp="2025-12-16 07:15:01 +0000 UTC" firstStartedPulling="2025-12-16 07:15:35.71822167 +0000 UTC m=+1214.206787793" lastFinishedPulling="2025-12-16 07:15:41.241414164 +0000 UTC m=+1219.729980287" observedRunningTime="2025-12-16 07:15:42.224758312 +0000 UTC m=+1220.713324445" watchObservedRunningTime="2025-12-16 07:15:42.22982647 +0000 UTC m=+1220.718392593" Dec 16 07:15:42 crc kubenswrapper[4823]: I1216 
07:15:42.230685 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-97d456b9-wrzfd" podStartSLOduration=4.981063113 podStartE2EDuration="41.230677776s" podCreationTimestamp="2025-12-16 07:15:01 +0000 UTC" firstStartedPulling="2025-12-16 07:15:04.993173984 +0000 UTC m=+1183.481740107" lastFinishedPulling="2025-12-16 07:15:41.242788647 +0000 UTC m=+1219.731354770" observedRunningTime="2025-12-16 07:15:42.209359289 +0000 UTC m=+1220.697925412" watchObservedRunningTime="2025-12-16 07:15:42.230677776 +0000 UTC m=+1220.719243899" Dec 16 07:15:42 crc kubenswrapper[4823]: I1216 07:15:42.249314 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-68c649d9d-8n9sx" Dec 16 07:15:42 crc kubenswrapper[4823]: I1216 07:15:42.368371 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-f76f4954c-btpw8" Dec 16 07:15:42 crc kubenswrapper[4823]: I1216 07:15:42.476094 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-bf6d4f946-7pmnl" Dec 16 07:15:42 crc kubenswrapper[4823]: I1216 07:15:42.557165 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-5c6df8f9-f4294" Dec 16 07:15:42 crc kubenswrapper[4823]: I1216 07:15:42.866788 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-756ccf86c7-dr886" Dec 16 07:15:42 crc kubenswrapper[4823]: I1216 07:15:42.867984 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-55f78b7c4c-sgp6f" Dec 16 07:15:44 crc kubenswrapper[4823]: I1216 07:15:44.105622 4823 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-txmdd" event={"ID":"3378ca15-f3fb-410e-a3fe-96b21dfce8d8","Type":"ContainerStarted","Data":"085afef98c4ee1201e62e1a63e8cce6d7b5702a39a9c038d313fc3f981275d41"} Dec 16 07:15:44 crc kubenswrapper[4823]: I1216 07:15:44.126287 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-txmdd" podStartSLOduration=3.88809332 podStartE2EDuration="42.126266993s" podCreationTimestamp="2025-12-16 07:15:02 +0000 UTC" firstStartedPulling="2025-12-16 07:15:05.053904827 +0000 UTC m=+1183.542470950" lastFinishedPulling="2025-12-16 07:15:43.2920785 +0000 UTC m=+1221.780644623" observedRunningTime="2025-12-16 07:15:44.122283099 +0000 UTC m=+1222.610849222" watchObservedRunningTime="2025-12-16 07:15:44.126266993 +0000 UTC m=+1222.614833116" Dec 16 07:15:47 crc kubenswrapper[4823]: I1216 07:15:47.468263 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-84b495f78-8rx8h" Dec 16 07:15:48 crc kubenswrapper[4823]: I1216 07:15:48.435017 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-66fff4bf6b4m4mt" Dec 16 07:15:48 crc kubenswrapper[4823]: I1216 07:15:48.512758 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-678747d7fb-qbkkw" Dec 16 07:15:51 crc kubenswrapper[4823]: I1216 07:15:51.156726 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-rkn7m" event={"ID":"201bb612-805a-4516-b18e-41382e5e4c42","Type":"ContainerStarted","Data":"c3596a8097b1b677fd43573f7cb00fa02937e1f3d2ed1853f363fdacf94cc86d"} Dec 16 07:15:51 crc kubenswrapper[4823]: I1216 07:15:51.158014 4823 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-rkn7m" Dec 16 07:15:51 crc kubenswrapper[4823]: I1216 07:15:51.172506 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-rkn7m" podStartSLOduration=4.91265518 podStartE2EDuration="50.172484498s" podCreationTimestamp="2025-12-16 07:15:01 +0000 UTC" firstStartedPulling="2025-12-16 07:15:05.039432244 +0000 UTC m=+1183.527998367" lastFinishedPulling="2025-12-16 07:15:50.299261562 +0000 UTC m=+1228.787827685" observedRunningTime="2025-12-16 07:15:51.17029246 +0000 UTC m=+1229.658858603" watchObservedRunningTime="2025-12-16 07:15:51.172484498 +0000 UTC m=+1229.661050621" Dec 16 07:15:51 crc kubenswrapper[4823]: I1216 07:15:51.828542 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-6ccf486b9-8mh84" Dec 16 07:15:52 crc kubenswrapper[4823]: I1216 07:15:52.510308 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-8665b56d78-47v54" Dec 16 07:15:52 crc kubenswrapper[4823]: I1216 07:15:52.716472 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-97d456b9-wrzfd" Dec 16 07:16:02 crc kubenswrapper[4823]: I1216 07:16:02.230785 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-rkn7m" Dec 16 07:16:17 crc kubenswrapper[4823]: I1216 07:16:17.234779 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-84bb9d8bd9-7m8m9"] Dec 16 07:16:17 crc kubenswrapper[4823]: E1216 07:16:17.235588 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bb3b1a1-ff5f-4119-8b6e-4ccc834a34d6" containerName="collect-profiles" 
Dec 16 07:16:17 crc kubenswrapper[4823]: I1216 07:16:17.235605 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bb3b1a1-ff5f-4119-8b6e-4ccc834a34d6" containerName="collect-profiles" Dec 16 07:16:17 crc kubenswrapper[4823]: I1216 07:16:17.235786 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="1bb3b1a1-ff5f-4119-8b6e-4ccc834a34d6" containerName="collect-profiles" Dec 16 07:16:17 crc kubenswrapper[4823]: I1216 07:16:17.236490 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84bb9d8bd9-7m8m9" Dec 16 07:16:17 crc kubenswrapper[4823]: I1216 07:16:17.242658 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Dec 16 07:16:17 crc kubenswrapper[4823]: I1216 07:16:17.243286 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Dec 16 07:16:17 crc kubenswrapper[4823]: I1216 07:16:17.243414 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Dec 16 07:16:17 crc kubenswrapper[4823]: I1216 07:16:17.246258 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-zmkxx" Dec 16 07:16:17 crc kubenswrapper[4823]: I1216 07:16:17.253569 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84bb9d8bd9-7m8m9"] Dec 16 07:16:17 crc kubenswrapper[4823]: I1216 07:16:17.317612 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5f854695bc-kt72s"] Dec 16 07:16:17 crc kubenswrapper[4823]: I1216 07:16:17.320666 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f854695bc-kt72s" Dec 16 07:16:17 crc kubenswrapper[4823]: I1216 07:16:17.327258 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Dec 16 07:16:17 crc kubenswrapper[4823]: I1216 07:16:17.340346 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f854695bc-kt72s"] Dec 16 07:16:17 crc kubenswrapper[4823]: I1216 07:16:17.393646 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/625ed164-d2cc-45d3-b977-eea43da4cf51-config\") pod \"dnsmasq-dns-84bb9d8bd9-7m8m9\" (UID: \"625ed164-d2cc-45d3-b977-eea43da4cf51\") " pod="openstack/dnsmasq-dns-84bb9d8bd9-7m8m9" Dec 16 07:16:17 crc kubenswrapper[4823]: I1216 07:16:17.393723 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfc29\" (UniqueName: \"kubernetes.io/projected/625ed164-d2cc-45d3-b977-eea43da4cf51-kube-api-access-jfc29\") pod \"dnsmasq-dns-84bb9d8bd9-7m8m9\" (UID: \"625ed164-d2cc-45d3-b977-eea43da4cf51\") " pod="openstack/dnsmasq-dns-84bb9d8bd9-7m8m9" Dec 16 07:16:17 crc kubenswrapper[4823]: I1216 07:16:17.494600 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgvkw\" (UniqueName: \"kubernetes.io/projected/36de76e9-6942-4148-a52f-423e6b0b2a18-kube-api-access-vgvkw\") pod \"dnsmasq-dns-5f854695bc-kt72s\" (UID: \"36de76e9-6942-4148-a52f-423e6b0b2a18\") " pod="openstack/dnsmasq-dns-5f854695bc-kt72s" Dec 16 07:16:17 crc kubenswrapper[4823]: I1216 07:16:17.495089 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/625ed164-d2cc-45d3-b977-eea43da4cf51-config\") pod \"dnsmasq-dns-84bb9d8bd9-7m8m9\" (UID: \"625ed164-d2cc-45d3-b977-eea43da4cf51\") " pod="openstack/dnsmasq-dns-84bb9d8bd9-7m8m9" 
Dec 16 07:16:17 crc kubenswrapper[4823]: I1216 07:16:17.496100 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jfc29\" (UniqueName: \"kubernetes.io/projected/625ed164-d2cc-45d3-b977-eea43da4cf51-kube-api-access-jfc29\") pod \"dnsmasq-dns-84bb9d8bd9-7m8m9\" (UID: \"625ed164-d2cc-45d3-b977-eea43da4cf51\") " pod="openstack/dnsmasq-dns-84bb9d8bd9-7m8m9" Dec 16 07:16:17 crc kubenswrapper[4823]: I1216 07:16:17.496444 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36de76e9-6942-4148-a52f-423e6b0b2a18-config\") pod \"dnsmasq-dns-5f854695bc-kt72s\" (UID: \"36de76e9-6942-4148-a52f-423e6b0b2a18\") " pod="openstack/dnsmasq-dns-5f854695bc-kt72s" Dec 16 07:16:17 crc kubenswrapper[4823]: I1216 07:16:17.496553 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/36de76e9-6942-4148-a52f-423e6b0b2a18-dns-svc\") pod \"dnsmasq-dns-5f854695bc-kt72s\" (UID: \"36de76e9-6942-4148-a52f-423e6b0b2a18\") " pod="openstack/dnsmasq-dns-5f854695bc-kt72s" Dec 16 07:16:17 crc kubenswrapper[4823]: I1216 07:16:17.496053 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/625ed164-d2cc-45d3-b977-eea43da4cf51-config\") pod \"dnsmasq-dns-84bb9d8bd9-7m8m9\" (UID: \"625ed164-d2cc-45d3-b977-eea43da4cf51\") " pod="openstack/dnsmasq-dns-84bb9d8bd9-7m8m9" Dec 16 07:16:17 crc kubenswrapper[4823]: I1216 07:16:17.529744 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfc29\" (UniqueName: \"kubernetes.io/projected/625ed164-d2cc-45d3-b977-eea43da4cf51-kube-api-access-jfc29\") pod \"dnsmasq-dns-84bb9d8bd9-7m8m9\" (UID: \"625ed164-d2cc-45d3-b977-eea43da4cf51\") " pod="openstack/dnsmasq-dns-84bb9d8bd9-7m8m9" Dec 16 07:16:17 crc kubenswrapper[4823]: I1216 
07:16:17.557064 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84bb9d8bd9-7m8m9" Dec 16 07:16:17 crc kubenswrapper[4823]: I1216 07:16:17.597284 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36de76e9-6942-4148-a52f-423e6b0b2a18-config\") pod \"dnsmasq-dns-5f854695bc-kt72s\" (UID: \"36de76e9-6942-4148-a52f-423e6b0b2a18\") " pod="openstack/dnsmasq-dns-5f854695bc-kt72s" Dec 16 07:16:17 crc kubenswrapper[4823]: I1216 07:16:17.598794 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/36de76e9-6942-4148-a52f-423e6b0b2a18-dns-svc\") pod \"dnsmasq-dns-5f854695bc-kt72s\" (UID: \"36de76e9-6942-4148-a52f-423e6b0b2a18\") " pod="openstack/dnsmasq-dns-5f854695bc-kt72s" Dec 16 07:16:17 crc kubenswrapper[4823]: I1216 07:16:17.599592 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vgvkw\" (UniqueName: \"kubernetes.io/projected/36de76e9-6942-4148-a52f-423e6b0b2a18-kube-api-access-vgvkw\") pod \"dnsmasq-dns-5f854695bc-kt72s\" (UID: \"36de76e9-6942-4148-a52f-423e6b0b2a18\") " pod="openstack/dnsmasq-dns-5f854695bc-kt72s" Dec 16 07:16:17 crc kubenswrapper[4823]: I1216 07:16:17.599511 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/36de76e9-6942-4148-a52f-423e6b0b2a18-dns-svc\") pod \"dnsmasq-dns-5f854695bc-kt72s\" (UID: \"36de76e9-6942-4148-a52f-423e6b0b2a18\") " pod="openstack/dnsmasq-dns-5f854695bc-kt72s" Dec 16 07:16:17 crc kubenswrapper[4823]: I1216 07:16:17.598751 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36de76e9-6942-4148-a52f-423e6b0b2a18-config\") pod \"dnsmasq-dns-5f854695bc-kt72s\" (UID: \"36de76e9-6942-4148-a52f-423e6b0b2a18\") " 
pod="openstack/dnsmasq-dns-5f854695bc-kt72s" Dec 16 07:16:17 crc kubenswrapper[4823]: I1216 07:16:17.619690 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgvkw\" (UniqueName: \"kubernetes.io/projected/36de76e9-6942-4148-a52f-423e6b0b2a18-kube-api-access-vgvkw\") pod \"dnsmasq-dns-5f854695bc-kt72s\" (UID: \"36de76e9-6942-4148-a52f-423e6b0b2a18\") " pod="openstack/dnsmasq-dns-5f854695bc-kt72s" Dec 16 07:16:17 crc kubenswrapper[4823]: I1216 07:16:17.646737 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f854695bc-kt72s" Dec 16 07:16:17 crc kubenswrapper[4823]: I1216 07:16:17.863951 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84bb9d8bd9-7m8m9"] Dec 16 07:16:18 crc kubenswrapper[4823]: I1216 07:16:18.175326 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f854695bc-kt72s"] Dec 16 07:16:18 crc kubenswrapper[4823]: I1216 07:16:18.348410 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f854695bc-kt72s" event={"ID":"36de76e9-6942-4148-a52f-423e6b0b2a18","Type":"ContainerStarted","Data":"d5050ef95a36ebf6f243e0068f730d75c97df989f3f962a82c6369df577aa6c0"} Dec 16 07:16:18 crc kubenswrapper[4823]: I1216 07:16:18.349730 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84bb9d8bd9-7m8m9" event={"ID":"625ed164-d2cc-45d3-b977-eea43da4cf51","Type":"ContainerStarted","Data":"73ec00f5f63dc8f18c73b267a6b980da194899a2b29f7ca4cdd76a8d2342e325"} Dec 16 07:16:19 crc kubenswrapper[4823]: I1216 07:16:19.639253 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f854695bc-kt72s"] Dec 16 07:16:19 crc kubenswrapper[4823]: I1216 07:16:19.690974 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-c7cbb8f79-tl9kf"] Dec 16 07:16:19 crc kubenswrapper[4823]: I1216 07:16:19.693059 4823 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-c7cbb8f79-tl9kf" Dec 16 07:16:19 crc kubenswrapper[4823]: I1216 07:16:19.709894 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-c7cbb8f79-tl9kf"] Dec 16 07:16:19 crc kubenswrapper[4823]: I1216 07:16:19.741062 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff169a34-dc27-40ca-86ca-ba5e8f644502-config\") pod \"dnsmasq-dns-c7cbb8f79-tl9kf\" (UID: \"ff169a34-dc27-40ca-86ca-ba5e8f644502\") " pod="openstack/dnsmasq-dns-c7cbb8f79-tl9kf" Dec 16 07:16:19 crc kubenswrapper[4823]: I1216 07:16:19.741140 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ff169a34-dc27-40ca-86ca-ba5e8f644502-dns-svc\") pod \"dnsmasq-dns-c7cbb8f79-tl9kf\" (UID: \"ff169a34-dc27-40ca-86ca-ba5e8f644502\") " pod="openstack/dnsmasq-dns-c7cbb8f79-tl9kf" Dec 16 07:16:19 crc kubenswrapper[4823]: I1216 07:16:19.741207 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnt5f\" (UniqueName: \"kubernetes.io/projected/ff169a34-dc27-40ca-86ca-ba5e8f644502-kube-api-access-bnt5f\") pod \"dnsmasq-dns-c7cbb8f79-tl9kf\" (UID: \"ff169a34-dc27-40ca-86ca-ba5e8f644502\") " pod="openstack/dnsmasq-dns-c7cbb8f79-tl9kf" Dec 16 07:16:19 crc kubenswrapper[4823]: I1216 07:16:19.842932 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff169a34-dc27-40ca-86ca-ba5e8f644502-config\") pod \"dnsmasq-dns-c7cbb8f79-tl9kf\" (UID: \"ff169a34-dc27-40ca-86ca-ba5e8f644502\") " pod="openstack/dnsmasq-dns-c7cbb8f79-tl9kf" Dec 16 07:16:19 crc kubenswrapper[4823]: I1216 07:16:19.843024 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/ff169a34-dc27-40ca-86ca-ba5e8f644502-dns-svc\") pod \"dnsmasq-dns-c7cbb8f79-tl9kf\" (UID: \"ff169a34-dc27-40ca-86ca-ba5e8f644502\") " pod="openstack/dnsmasq-dns-c7cbb8f79-tl9kf" Dec 16 07:16:19 crc kubenswrapper[4823]: I1216 07:16:19.843094 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bnt5f\" (UniqueName: \"kubernetes.io/projected/ff169a34-dc27-40ca-86ca-ba5e8f644502-kube-api-access-bnt5f\") pod \"dnsmasq-dns-c7cbb8f79-tl9kf\" (UID: \"ff169a34-dc27-40ca-86ca-ba5e8f644502\") " pod="openstack/dnsmasq-dns-c7cbb8f79-tl9kf" Dec 16 07:16:19 crc kubenswrapper[4823]: I1216 07:16:19.844184 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ff169a34-dc27-40ca-86ca-ba5e8f644502-dns-svc\") pod \"dnsmasq-dns-c7cbb8f79-tl9kf\" (UID: \"ff169a34-dc27-40ca-86ca-ba5e8f644502\") " pod="openstack/dnsmasq-dns-c7cbb8f79-tl9kf" Dec 16 07:16:19 crc kubenswrapper[4823]: I1216 07:16:19.844353 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff169a34-dc27-40ca-86ca-ba5e8f644502-config\") pod \"dnsmasq-dns-c7cbb8f79-tl9kf\" (UID: \"ff169a34-dc27-40ca-86ca-ba5e8f644502\") " pod="openstack/dnsmasq-dns-c7cbb8f79-tl9kf" Dec 16 07:16:19 crc kubenswrapper[4823]: I1216 07:16:19.890086 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnt5f\" (UniqueName: \"kubernetes.io/projected/ff169a34-dc27-40ca-86ca-ba5e8f644502-kube-api-access-bnt5f\") pod \"dnsmasq-dns-c7cbb8f79-tl9kf\" (UID: \"ff169a34-dc27-40ca-86ca-ba5e8f644502\") " pod="openstack/dnsmasq-dns-c7cbb8f79-tl9kf" Dec 16 07:16:19 crc kubenswrapper[4823]: I1216 07:16:19.970598 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84bb9d8bd9-7m8m9"] Dec 16 07:16:20 crc kubenswrapper[4823]: I1216 07:16:20.003583 4823 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/dnsmasq-dns-95f5f6995-jqbcz"] Dec 16 07:16:20 crc kubenswrapper[4823]: I1216 07:16:20.005017 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-95f5f6995-jqbcz" Dec 16 07:16:20 crc kubenswrapper[4823]: I1216 07:16:20.015384 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-95f5f6995-jqbcz"] Dec 16 07:16:20 crc kubenswrapper[4823]: I1216 07:16:20.026222 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-c7cbb8f79-tl9kf" Dec 16 07:16:20 crc kubenswrapper[4823]: I1216 07:16:20.044816 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/408c33cd-064f-42e1-b3b5-a2c1b7046f0c-config\") pod \"dnsmasq-dns-95f5f6995-jqbcz\" (UID: \"408c33cd-064f-42e1-b3b5-a2c1b7046f0c\") " pod="openstack/dnsmasq-dns-95f5f6995-jqbcz" Dec 16 07:16:20 crc kubenswrapper[4823]: I1216 07:16:20.044861 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/408c33cd-064f-42e1-b3b5-a2c1b7046f0c-dns-svc\") pod \"dnsmasq-dns-95f5f6995-jqbcz\" (UID: \"408c33cd-064f-42e1-b3b5-a2c1b7046f0c\") " pod="openstack/dnsmasq-dns-95f5f6995-jqbcz" Dec 16 07:16:20 crc kubenswrapper[4823]: I1216 07:16:20.044934 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4mvz\" (UniqueName: \"kubernetes.io/projected/408c33cd-064f-42e1-b3b5-a2c1b7046f0c-kube-api-access-w4mvz\") pod \"dnsmasq-dns-95f5f6995-jqbcz\" (UID: \"408c33cd-064f-42e1-b3b5-a2c1b7046f0c\") " pod="openstack/dnsmasq-dns-95f5f6995-jqbcz" Dec 16 07:16:20 crc kubenswrapper[4823]: I1216 07:16:20.146590 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/408c33cd-064f-42e1-b3b5-a2c1b7046f0c-dns-svc\") pod \"dnsmasq-dns-95f5f6995-jqbcz\" (UID: \"408c33cd-064f-42e1-b3b5-a2c1b7046f0c\") " pod="openstack/dnsmasq-dns-95f5f6995-jqbcz" Dec 16 07:16:20 crc kubenswrapper[4823]: I1216 07:16:20.146690 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4mvz\" (UniqueName: \"kubernetes.io/projected/408c33cd-064f-42e1-b3b5-a2c1b7046f0c-kube-api-access-w4mvz\") pod \"dnsmasq-dns-95f5f6995-jqbcz\" (UID: \"408c33cd-064f-42e1-b3b5-a2c1b7046f0c\") " pod="openstack/dnsmasq-dns-95f5f6995-jqbcz" Dec 16 07:16:20 crc kubenswrapper[4823]: I1216 07:16:20.146758 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/408c33cd-064f-42e1-b3b5-a2c1b7046f0c-config\") pod \"dnsmasq-dns-95f5f6995-jqbcz\" (UID: \"408c33cd-064f-42e1-b3b5-a2c1b7046f0c\") " pod="openstack/dnsmasq-dns-95f5f6995-jqbcz" Dec 16 07:16:20 crc kubenswrapper[4823]: I1216 07:16:20.147681 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/408c33cd-064f-42e1-b3b5-a2c1b7046f0c-config\") pod \"dnsmasq-dns-95f5f6995-jqbcz\" (UID: \"408c33cd-064f-42e1-b3b5-a2c1b7046f0c\") " pod="openstack/dnsmasq-dns-95f5f6995-jqbcz" Dec 16 07:16:20 crc kubenswrapper[4823]: I1216 07:16:20.148248 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/408c33cd-064f-42e1-b3b5-a2c1b7046f0c-dns-svc\") pod \"dnsmasq-dns-95f5f6995-jqbcz\" (UID: \"408c33cd-064f-42e1-b3b5-a2c1b7046f0c\") " pod="openstack/dnsmasq-dns-95f5f6995-jqbcz" Dec 16 07:16:20 crc kubenswrapper[4823]: I1216 07:16:20.174709 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4mvz\" (UniqueName: \"kubernetes.io/projected/408c33cd-064f-42e1-b3b5-a2c1b7046f0c-kube-api-access-w4mvz\") pod 
\"dnsmasq-dns-95f5f6995-jqbcz\" (UID: \"408c33cd-064f-42e1-b3b5-a2c1b7046f0c\") " pod="openstack/dnsmasq-dns-95f5f6995-jqbcz" Dec 16 07:16:20 crc kubenswrapper[4823]: I1216 07:16:20.347452 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-95f5f6995-jqbcz" Dec 16 07:16:20 crc kubenswrapper[4823]: I1216 07:16:20.825142 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 16 07:16:20 crc kubenswrapper[4823]: I1216 07:16:20.826651 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 16 07:16:20 crc kubenswrapper[4823]: I1216 07:16:20.834014 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Dec 16 07:16:20 crc kubenswrapper[4823]: I1216 07:16:20.834092 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Dec 16 07:16:20 crc kubenswrapper[4823]: I1216 07:16:20.834356 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 16 07:16:20 crc kubenswrapper[4823]: I1216 07:16:20.834363 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 16 07:16:20 crc kubenswrapper[4823]: I1216 07:16:20.834519 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 16 07:16:20 crc kubenswrapper[4823]: I1216 07:16:20.834630 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-mbwj5" Dec 16 07:16:20 crc kubenswrapper[4823]: I1216 07:16:20.838228 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 16 07:16:20 crc kubenswrapper[4823]: I1216 07:16:20.854618 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/rabbitmq-cell1-server-0"] Dec 16 07:16:20 crc kubenswrapper[4823]: I1216 07:16:20.893220 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-c7cbb8f79-tl9kf"] Dec 16 07:16:21 crc kubenswrapper[4823]: I1216 07:16:21.014623 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 07:16:21 crc kubenswrapper[4823]: I1216 07:16:21.014715 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwr5v\" (UniqueName: \"kubernetes.io/projected/cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1-kube-api-access-kwr5v\") pod \"rabbitmq-cell1-server-0\" (UID: \"cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 07:16:21 crc kubenswrapper[4823]: I1216 07:16:21.014779 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 07:16:21 crc kubenswrapper[4823]: I1216 07:16:21.014837 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 07:16:21 crc kubenswrapper[4823]: I1216 07:16:21.014857 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" 
(UniqueName: \"kubernetes.io/empty-dir/cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 07:16:21 crc kubenswrapper[4823]: I1216 07:16:21.015665 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 07:16:21 crc kubenswrapper[4823]: I1216 07:16:21.015700 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 07:16:21 crc kubenswrapper[4823]: I1216 07:16:21.015742 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 07:16:21 crc kubenswrapper[4823]: I1216 07:16:21.016673 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 07:16:21 crc kubenswrapper[4823]: I1216 07:16:21.016745 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 07:16:21 crc kubenswrapper[4823]: I1216 07:16:21.016783 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 07:16:21 crc kubenswrapper[4823]: I1216 07:16:21.118282 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 07:16:21 crc kubenswrapper[4823]: I1216 07:16:21.118323 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 07:16:21 crc kubenswrapper[4823]: I1216 07:16:21.118347 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 07:16:21 crc kubenswrapper[4823]: I1216 07:16:21.118367 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1-rabbitmq-tls\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 07:16:21 crc kubenswrapper[4823]: I1216 07:16:21.118389 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 07:16:21 crc kubenswrapper[4823]: I1216 07:16:21.119440 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 07:16:21 crc kubenswrapper[4823]: I1216 07:16:21.119798 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 07:16:21 crc kubenswrapper[4823]: I1216 07:16:21.120435 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 07:16:21 crc kubenswrapper[4823]: I1216 07:16:21.122132 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 07:16:21 crc 
kubenswrapper[4823]: I1216 07:16:21.122239 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 07:16:21 crc kubenswrapper[4823]: I1216 07:16:21.122278 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 07:16:21 crc kubenswrapper[4823]: I1216 07:16:21.122327 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 07:16:21 crc kubenswrapper[4823]: I1216 07:16:21.122376 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwr5v\" (UniqueName: \"kubernetes.io/projected/cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1-kube-api-access-kwr5v\") pod \"rabbitmq-cell1-server-0\" (UID: \"cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 07:16:21 crc kubenswrapper[4823]: I1216 07:16:21.122438 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 07:16:21 crc kubenswrapper[4823]: I1216 07:16:21.122590 4823 operation_generator.go:580] 
"MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/rabbitmq-cell1-server-0" Dec 16 07:16:21 crc kubenswrapper[4823]: I1216 07:16:21.123192 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 07:16:21 crc kubenswrapper[4823]: I1216 07:16:21.124727 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 07:16:21 crc kubenswrapper[4823]: I1216 07:16:21.130686 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 16 07:16:21 crc kubenswrapper[4823]: I1216 07:16:21.133796 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 16 07:16:21 crc kubenswrapper[4823]: I1216 07:16:21.136134 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 07:16:21 crc kubenswrapper[4823]: I1216 07:16:21.136658 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 07:16:21 crc kubenswrapper[4823]: I1216 07:16:21.137603 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 07:16:21 crc kubenswrapper[4823]: I1216 07:16:21.138372 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 07:16:21 crc kubenswrapper[4823]: I1216 07:16:21.138639 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 16 07:16:21 crc kubenswrapper[4823]: I1216 07:16:21.138653 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 16 07:16:21 crc kubenswrapper[4823]: I1216 07:16:21.138840 4823 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openstack"/"rabbitmq-config-data" Dec 16 07:16:21 crc kubenswrapper[4823]: I1216 07:16:21.139685 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 16 07:16:21 crc kubenswrapper[4823]: I1216 07:16:21.139707 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-svz6s" Dec 16 07:16:21 crc kubenswrapper[4823]: I1216 07:16:21.139810 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 16 07:16:21 crc kubenswrapper[4823]: I1216 07:16:21.140836 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Dec 16 07:16:21 crc kubenswrapper[4823]: I1216 07:16:21.142942 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 16 07:16:21 crc kubenswrapper[4823]: I1216 07:16:21.148340 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwr5v\" (UniqueName: \"kubernetes.io/projected/cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1-kube-api-access-kwr5v\") pod \"rabbitmq-cell1-server-0\" (UID: \"cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 07:16:21 crc kubenswrapper[4823]: I1216 07:16:21.152175 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 07:16:21 crc kubenswrapper[4823]: I1216 07:16:21.170620 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 16 07:16:21 crc kubenswrapper[4823]: I1216 07:16:21.198664 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-95f5f6995-jqbcz"] Dec 16 07:16:21 crc kubenswrapper[4823]: I1216 07:16:21.324964 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a686a945-8fa0-406c-ac01-cf061c865a28-server-conf\") pod \"rabbitmq-server-0\" (UID: \"a686a945-8fa0-406c-ac01-cf061c865a28\") " pod="openstack/rabbitmq-server-0" Dec 16 07:16:21 crc kubenswrapper[4823]: I1216 07:16:21.325005 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a686a945-8fa0-406c-ac01-cf061c865a28-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"a686a945-8fa0-406c-ac01-cf061c865a28\") " pod="openstack/rabbitmq-server-0" Dec 16 07:16:21 crc kubenswrapper[4823]: I1216 07:16:21.325030 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a686a945-8fa0-406c-ac01-cf061c865a28-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"a686a945-8fa0-406c-ac01-cf061c865a28\") " pod="openstack/rabbitmq-server-0" Dec 16 07:16:21 crc kubenswrapper[4823]: I1216 07:16:21.325063 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a686a945-8fa0-406c-ac01-cf061c865a28-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"a686a945-8fa0-406c-ac01-cf061c865a28\") " pod="openstack/rabbitmq-server-0" Dec 16 07:16:21 crc kubenswrapper[4823]: I1216 07:16:21.325094 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/a686a945-8fa0-406c-ac01-cf061c865a28-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"a686a945-8fa0-406c-ac01-cf061c865a28\") " pod="openstack/rabbitmq-server-0" Dec 16 07:16:21 crc kubenswrapper[4823]: I1216 07:16:21.325171 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vq4j\" (UniqueName: \"kubernetes.io/projected/a686a945-8fa0-406c-ac01-cf061c865a28-kube-api-access-9vq4j\") pod \"rabbitmq-server-0\" (UID: \"a686a945-8fa0-406c-ac01-cf061c865a28\") " pod="openstack/rabbitmq-server-0" Dec 16 07:16:21 crc kubenswrapper[4823]: I1216 07:16:21.325207 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"a686a945-8fa0-406c-ac01-cf061c865a28\") " pod="openstack/rabbitmq-server-0" Dec 16 07:16:21 crc kubenswrapper[4823]: I1216 07:16:21.325258 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a686a945-8fa0-406c-ac01-cf061c865a28-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"a686a945-8fa0-406c-ac01-cf061c865a28\") " pod="openstack/rabbitmq-server-0" Dec 16 07:16:21 crc kubenswrapper[4823]: I1216 07:16:21.325303 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a686a945-8fa0-406c-ac01-cf061c865a28-config-data\") pod \"rabbitmq-server-0\" (UID: \"a686a945-8fa0-406c-ac01-cf061c865a28\") " pod="openstack/rabbitmq-server-0" Dec 16 07:16:21 crc kubenswrapper[4823]: I1216 07:16:21.325371 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a686a945-8fa0-406c-ac01-cf061c865a28-pod-info\") pod 
\"rabbitmq-server-0\" (UID: \"a686a945-8fa0-406c-ac01-cf061c865a28\") " pod="openstack/rabbitmq-server-0" Dec 16 07:16:21 crc kubenswrapper[4823]: I1216 07:16:21.325430 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a686a945-8fa0-406c-ac01-cf061c865a28-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"a686a945-8fa0-406c-ac01-cf061c865a28\") " pod="openstack/rabbitmq-server-0" Dec 16 07:16:21 crc kubenswrapper[4823]: I1216 07:16:21.394562 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c7cbb8f79-tl9kf" event={"ID":"ff169a34-dc27-40ca-86ca-ba5e8f644502","Type":"ContainerStarted","Data":"14ebaf545ed1186ba91c1abc6be3f10edce3ad3e9b902760e07af3db9b204a67"} Dec 16 07:16:21 crc kubenswrapper[4823]: I1216 07:16:21.398682 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-95f5f6995-jqbcz" event={"ID":"408c33cd-064f-42e1-b3b5-a2c1b7046f0c","Type":"ContainerStarted","Data":"12251f8e6a6b3eee048fdd603b30610930747609dafadcc3b005264ede9a2e8d"} Dec 16 07:16:21 crc kubenswrapper[4823]: I1216 07:16:21.427094 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a686a945-8fa0-406c-ac01-cf061c865a28-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"a686a945-8fa0-406c-ac01-cf061c865a28\") " pod="openstack/rabbitmq-server-0" Dec 16 07:16:21 crc kubenswrapper[4823]: I1216 07:16:21.427172 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9vq4j\" (UniqueName: \"kubernetes.io/projected/a686a945-8fa0-406c-ac01-cf061c865a28-kube-api-access-9vq4j\") pod \"rabbitmq-server-0\" (UID: \"a686a945-8fa0-406c-ac01-cf061c865a28\") " pod="openstack/rabbitmq-server-0" Dec 16 07:16:21 crc kubenswrapper[4823]: I1216 07:16:21.427204 4823 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"a686a945-8fa0-406c-ac01-cf061c865a28\") " pod="openstack/rabbitmq-server-0" Dec 16 07:16:21 crc kubenswrapper[4823]: I1216 07:16:21.427245 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a686a945-8fa0-406c-ac01-cf061c865a28-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"a686a945-8fa0-406c-ac01-cf061c865a28\") " pod="openstack/rabbitmq-server-0" Dec 16 07:16:21 crc kubenswrapper[4823]: I1216 07:16:21.427282 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a686a945-8fa0-406c-ac01-cf061c865a28-config-data\") pod \"rabbitmq-server-0\" (UID: \"a686a945-8fa0-406c-ac01-cf061c865a28\") " pod="openstack/rabbitmq-server-0" Dec 16 07:16:21 crc kubenswrapper[4823]: I1216 07:16:21.427318 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a686a945-8fa0-406c-ac01-cf061c865a28-pod-info\") pod \"rabbitmq-server-0\" (UID: \"a686a945-8fa0-406c-ac01-cf061c865a28\") " pod="openstack/rabbitmq-server-0" Dec 16 07:16:21 crc kubenswrapper[4823]: I1216 07:16:21.427377 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a686a945-8fa0-406c-ac01-cf061c865a28-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"a686a945-8fa0-406c-ac01-cf061c865a28\") " pod="openstack/rabbitmq-server-0" Dec 16 07:16:21 crc kubenswrapper[4823]: I1216 07:16:21.427423 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a686a945-8fa0-406c-ac01-cf061c865a28-server-conf\") pod \"rabbitmq-server-0\" (UID: 
\"a686a945-8fa0-406c-ac01-cf061c865a28\") " pod="openstack/rabbitmq-server-0" Dec 16 07:16:21 crc kubenswrapper[4823]: I1216 07:16:21.427453 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a686a945-8fa0-406c-ac01-cf061c865a28-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"a686a945-8fa0-406c-ac01-cf061c865a28\") " pod="openstack/rabbitmq-server-0" Dec 16 07:16:21 crc kubenswrapper[4823]: I1216 07:16:21.427500 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a686a945-8fa0-406c-ac01-cf061c865a28-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"a686a945-8fa0-406c-ac01-cf061c865a28\") " pod="openstack/rabbitmq-server-0" Dec 16 07:16:21 crc kubenswrapper[4823]: I1216 07:16:21.427554 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a686a945-8fa0-406c-ac01-cf061c865a28-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"a686a945-8fa0-406c-ac01-cf061c865a28\") " pod="openstack/rabbitmq-server-0" Dec 16 07:16:21 crc kubenswrapper[4823]: I1216 07:16:21.427641 4823 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"a686a945-8fa0-406c-ac01-cf061c865a28\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/rabbitmq-server-0" Dec 16 07:16:21 crc kubenswrapper[4823]: I1216 07:16:21.429354 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a686a945-8fa0-406c-ac01-cf061c865a28-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"a686a945-8fa0-406c-ac01-cf061c865a28\") " pod="openstack/rabbitmq-server-0" Dec 16 07:16:21 crc kubenswrapper[4823]: I1216 07:16:21.429617 4823 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a686a945-8fa0-406c-ac01-cf061c865a28-config-data\") pod \"rabbitmq-server-0\" (UID: \"a686a945-8fa0-406c-ac01-cf061c865a28\") " pod="openstack/rabbitmq-server-0" Dec 16 07:16:21 crc kubenswrapper[4823]: I1216 07:16:21.430085 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a686a945-8fa0-406c-ac01-cf061c865a28-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"a686a945-8fa0-406c-ac01-cf061c865a28\") " pod="openstack/rabbitmq-server-0" Dec 16 07:16:21 crc kubenswrapper[4823]: I1216 07:16:21.430279 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a686a945-8fa0-406c-ac01-cf061c865a28-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"a686a945-8fa0-406c-ac01-cf061c865a28\") " pod="openstack/rabbitmq-server-0" Dec 16 07:16:21 crc kubenswrapper[4823]: I1216 07:16:21.430315 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a686a945-8fa0-406c-ac01-cf061c865a28-server-conf\") pod \"rabbitmq-server-0\" (UID: \"a686a945-8fa0-406c-ac01-cf061c865a28\") " pod="openstack/rabbitmq-server-0" Dec 16 07:16:21 crc kubenswrapper[4823]: I1216 07:16:21.440819 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a686a945-8fa0-406c-ac01-cf061c865a28-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"a686a945-8fa0-406c-ac01-cf061c865a28\") " pod="openstack/rabbitmq-server-0" Dec 16 07:16:21 crc kubenswrapper[4823]: I1216 07:16:21.442252 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a686a945-8fa0-406c-ac01-cf061c865a28-pod-info\") pod 
\"rabbitmq-server-0\" (UID: \"a686a945-8fa0-406c-ac01-cf061c865a28\") " pod="openstack/rabbitmq-server-0" Dec 16 07:16:21 crc kubenswrapper[4823]: I1216 07:16:21.444798 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a686a945-8fa0-406c-ac01-cf061c865a28-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"a686a945-8fa0-406c-ac01-cf061c865a28\") " pod="openstack/rabbitmq-server-0" Dec 16 07:16:21 crc kubenswrapper[4823]: I1216 07:16:21.448379 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a686a945-8fa0-406c-ac01-cf061c865a28-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"a686a945-8fa0-406c-ac01-cf061c865a28\") " pod="openstack/rabbitmq-server-0" Dec 16 07:16:21 crc kubenswrapper[4823]: I1216 07:16:21.451731 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vq4j\" (UniqueName: \"kubernetes.io/projected/a686a945-8fa0-406c-ac01-cf061c865a28-kube-api-access-9vq4j\") pod \"rabbitmq-server-0\" (UID: \"a686a945-8fa0-406c-ac01-cf061c865a28\") " pod="openstack/rabbitmq-server-0" Dec 16 07:16:21 crc kubenswrapper[4823]: I1216 07:16:21.457466 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"a686a945-8fa0-406c-ac01-cf061c865a28\") " pod="openstack/rabbitmq-server-0" Dec 16 07:16:21 crc kubenswrapper[4823]: I1216 07:16:21.569094 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 16 07:16:21 crc kubenswrapper[4823]: I1216 07:16:21.937647 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 16 07:16:22 crc kubenswrapper[4823]: I1216 07:16:22.349382 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Dec 16 07:16:22 crc kubenswrapper[4823]: I1216 07:16:22.363129 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Dec 16 07:16:22 crc kubenswrapper[4823]: I1216 07:16:22.365745 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Dec 16 07:16:22 crc kubenswrapper[4823]: I1216 07:16:22.369988 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Dec 16 07:16:22 crc kubenswrapper[4823]: I1216 07:16:22.371863 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-ncfcv" Dec 16 07:16:22 crc kubenswrapper[4823]: I1216 07:16:22.372312 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Dec 16 07:16:22 crc kubenswrapper[4823]: I1216 07:16:22.374622 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 16 07:16:22 crc kubenswrapper[4823]: I1216 07:16:22.383107 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Dec 16 07:16:22 crc kubenswrapper[4823]: I1216 07:16:22.564503 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1","Type":"ContainerStarted","Data":"c545d7e12c64e5493278719a7106677c5060fbade8234638011f610fd4d1cfab"} Dec 16 07:16:22 crc kubenswrapper[4823]: I1216 07:16:22.641468 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/rabbitmq-server-0"] Dec 16 07:16:22 crc kubenswrapper[4823]: I1216 07:16:22.673073 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dbcff04b-7d0d-45b4-bc28-7882421c6000-operator-scripts\") pod \"openstack-galera-0\" (UID: \"dbcff04b-7d0d-45b4-bc28-7882421c6000\") " pod="openstack/openstack-galera-0" Dec 16 07:16:22 crc kubenswrapper[4823]: I1216 07:16:22.673464 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/dbcff04b-7d0d-45b4-bc28-7882421c6000-config-data-default\") pod \"openstack-galera-0\" (UID: \"dbcff04b-7d0d-45b4-bc28-7882421c6000\") " pod="openstack/openstack-galera-0" Dec 16 07:16:22 crc kubenswrapper[4823]: I1216 07:16:22.673534 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/dbcff04b-7d0d-45b4-bc28-7882421c6000-kolla-config\") pod \"openstack-galera-0\" (UID: \"dbcff04b-7d0d-45b4-bc28-7882421c6000\") " pod="openstack/openstack-galera-0" Dec 16 07:16:22 crc kubenswrapper[4823]: I1216 07:16:22.673646 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghctv\" (UniqueName: \"kubernetes.io/projected/dbcff04b-7d0d-45b4-bc28-7882421c6000-kube-api-access-ghctv\") pod \"openstack-galera-0\" (UID: \"dbcff04b-7d0d-45b4-bc28-7882421c6000\") " pod="openstack/openstack-galera-0" Dec 16 07:16:22 crc kubenswrapper[4823]: I1216 07:16:22.673681 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/dbcff04b-7d0d-45b4-bc28-7882421c6000-config-data-generated\") pod \"openstack-galera-0\" (UID: \"dbcff04b-7d0d-45b4-bc28-7882421c6000\") " 
pod="openstack/openstack-galera-0" Dec 16 07:16:22 crc kubenswrapper[4823]: I1216 07:16:22.673729 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID: \"dbcff04b-7d0d-45b4-bc28-7882421c6000\") " pod="openstack/openstack-galera-0" Dec 16 07:16:22 crc kubenswrapper[4823]: I1216 07:16:22.673803 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbcff04b-7d0d-45b4-bc28-7882421c6000-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"dbcff04b-7d0d-45b4-bc28-7882421c6000\") " pod="openstack/openstack-galera-0" Dec 16 07:16:22 crc kubenswrapper[4823]: I1216 07:16:22.673875 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbcff04b-7d0d-45b4-bc28-7882421c6000-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"dbcff04b-7d0d-45b4-bc28-7882421c6000\") " pod="openstack/openstack-galera-0" Dec 16 07:16:22 crc kubenswrapper[4823]: W1216 07:16:22.696863 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda686a945_8fa0_406c_ac01_cf061c865a28.slice/crio-d342eaa90ec3f7fc03cef38dfcf7f773219dea63e67185b44ac6dff967b46a73 WatchSource:0}: Error finding container d342eaa90ec3f7fc03cef38dfcf7f773219dea63e67185b44ac6dff967b46a73: Status 404 returned error can't find the container with id d342eaa90ec3f7fc03cef38dfcf7f773219dea63e67185b44ac6dff967b46a73 Dec 16 07:16:22 crc kubenswrapper[4823]: I1216 07:16:22.775686 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dbcff04b-7d0d-45b4-bc28-7882421c6000-operator-scripts\") pod 
\"openstack-galera-0\" (UID: \"dbcff04b-7d0d-45b4-bc28-7882421c6000\") " pod="openstack/openstack-galera-0" Dec 16 07:16:22 crc kubenswrapper[4823]: I1216 07:16:22.775758 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/dbcff04b-7d0d-45b4-bc28-7882421c6000-config-data-default\") pod \"openstack-galera-0\" (UID: \"dbcff04b-7d0d-45b4-bc28-7882421c6000\") " pod="openstack/openstack-galera-0" Dec 16 07:16:22 crc kubenswrapper[4823]: I1216 07:16:22.775804 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/dbcff04b-7d0d-45b4-bc28-7882421c6000-kolla-config\") pod \"openstack-galera-0\" (UID: \"dbcff04b-7d0d-45b4-bc28-7882421c6000\") " pod="openstack/openstack-galera-0" Dec 16 07:16:22 crc kubenswrapper[4823]: I1216 07:16:22.775850 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ghctv\" (UniqueName: \"kubernetes.io/projected/dbcff04b-7d0d-45b4-bc28-7882421c6000-kube-api-access-ghctv\") pod \"openstack-galera-0\" (UID: \"dbcff04b-7d0d-45b4-bc28-7882421c6000\") " pod="openstack/openstack-galera-0" Dec 16 07:16:22 crc kubenswrapper[4823]: I1216 07:16:22.775893 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/dbcff04b-7d0d-45b4-bc28-7882421c6000-config-data-generated\") pod \"openstack-galera-0\" (UID: \"dbcff04b-7d0d-45b4-bc28-7882421c6000\") " pod="openstack/openstack-galera-0" Dec 16 07:16:22 crc kubenswrapper[4823]: I1216 07:16:22.775916 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID: \"dbcff04b-7d0d-45b4-bc28-7882421c6000\") " pod="openstack/openstack-galera-0" Dec 16 07:16:22 crc 
kubenswrapper[4823]: I1216 07:16:22.775970 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbcff04b-7d0d-45b4-bc28-7882421c6000-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"dbcff04b-7d0d-45b4-bc28-7882421c6000\") " pod="openstack/openstack-galera-0" Dec 16 07:16:22 crc kubenswrapper[4823]: I1216 07:16:22.776001 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbcff04b-7d0d-45b4-bc28-7882421c6000-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"dbcff04b-7d0d-45b4-bc28-7882421c6000\") " pod="openstack/openstack-galera-0" Dec 16 07:16:22 crc kubenswrapper[4823]: I1216 07:16:22.777340 4823 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID: \"dbcff04b-7d0d-45b4-bc28-7882421c6000\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/openstack-galera-0" Dec 16 07:16:22 crc kubenswrapper[4823]: I1216 07:16:22.778325 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/dbcff04b-7d0d-45b4-bc28-7882421c6000-config-data-generated\") pod \"openstack-galera-0\" (UID: \"dbcff04b-7d0d-45b4-bc28-7882421c6000\") " pod="openstack/openstack-galera-0" Dec 16 07:16:22 crc kubenswrapper[4823]: I1216 07:16:22.778415 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dbcff04b-7d0d-45b4-bc28-7882421c6000-operator-scripts\") pod \"openstack-galera-0\" (UID: \"dbcff04b-7d0d-45b4-bc28-7882421c6000\") " pod="openstack/openstack-galera-0" Dec 16 07:16:22 crc kubenswrapper[4823]: I1216 07:16:22.778494 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kolla-config\" (UniqueName: \"kubernetes.io/configmap/dbcff04b-7d0d-45b4-bc28-7882421c6000-kolla-config\") pod \"openstack-galera-0\" (UID: \"dbcff04b-7d0d-45b4-bc28-7882421c6000\") " pod="openstack/openstack-galera-0" Dec 16 07:16:22 crc kubenswrapper[4823]: I1216 07:16:22.778842 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/dbcff04b-7d0d-45b4-bc28-7882421c6000-config-data-default\") pod \"openstack-galera-0\" (UID: \"dbcff04b-7d0d-45b4-bc28-7882421c6000\") " pod="openstack/openstack-galera-0" Dec 16 07:16:22 crc kubenswrapper[4823]: I1216 07:16:22.786344 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbcff04b-7d0d-45b4-bc28-7882421c6000-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"dbcff04b-7d0d-45b4-bc28-7882421c6000\") " pod="openstack/openstack-galera-0" Dec 16 07:16:22 crc kubenswrapper[4823]: I1216 07:16:22.806835 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghctv\" (UniqueName: \"kubernetes.io/projected/dbcff04b-7d0d-45b4-bc28-7882421c6000-kube-api-access-ghctv\") pod \"openstack-galera-0\" (UID: \"dbcff04b-7d0d-45b4-bc28-7882421c6000\") " pod="openstack/openstack-galera-0" Dec 16 07:16:22 crc kubenswrapper[4823]: I1216 07:16:22.808200 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID: \"dbcff04b-7d0d-45b4-bc28-7882421c6000\") " pod="openstack/openstack-galera-0" Dec 16 07:16:22 crc kubenswrapper[4823]: I1216 07:16:22.808937 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbcff04b-7d0d-45b4-bc28-7882421c6000-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"dbcff04b-7d0d-45b4-bc28-7882421c6000\") " 
pod="openstack/openstack-galera-0" Dec 16 07:16:23 crc kubenswrapper[4823]: I1216 07:16:22.998279 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Dec 16 07:16:23 crc kubenswrapper[4823]: I1216 07:16:23.591661 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"a686a945-8fa0-406c-ac01-cf061c865a28","Type":"ContainerStarted","Data":"d342eaa90ec3f7fc03cef38dfcf7f773219dea63e67185b44ac6dff967b46a73"} Dec 16 07:16:23 crc kubenswrapper[4823]: I1216 07:16:23.605816 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 16 07:16:23 crc kubenswrapper[4823]: W1216 07:16:23.699304 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddbcff04b_7d0d_45b4_bc28_7882421c6000.slice/crio-477e61703af31d689c7c23af31872ff1ab2c4ed808379217e613c31f40aa13d3 WatchSource:0}: Error finding container 477e61703af31d689c7c23af31872ff1ab2c4ed808379217e613c31f40aa13d3: Status 404 returned error can't find the container with id 477e61703af31d689c7c23af31872ff1ab2c4ed808379217e613c31f40aa13d3 Dec 16 07:16:23 crc kubenswrapper[4823]: I1216 07:16:23.719528 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 16 07:16:23 crc kubenswrapper[4823]: I1216 07:16:23.720797 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 16 07:16:23 crc kubenswrapper[4823]: I1216 07:16:23.725013 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-kl75v" Dec 16 07:16:23 crc kubenswrapper[4823]: I1216 07:16:23.725198 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Dec 16 07:16:23 crc kubenswrapper[4823]: I1216 07:16:23.725322 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Dec 16 07:16:23 crc kubenswrapper[4823]: I1216 07:16:23.725328 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Dec 16 07:16:23 crc kubenswrapper[4823]: I1216 07:16:23.737318 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 16 07:16:23 crc kubenswrapper[4823]: I1216 07:16:23.873254 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Dec 16 07:16:23 crc kubenswrapper[4823]: I1216 07:16:23.874443 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Dec 16 07:16:23 crc kubenswrapper[4823]: I1216 07:16:23.884058 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Dec 16 07:16:23 crc kubenswrapper[4823]: I1216 07:16:23.884790 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Dec 16 07:16:23 crc kubenswrapper[4823]: I1216 07:16:23.885595 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-pq5nz" Dec 16 07:16:23 crc kubenswrapper[4823]: I1216 07:16:23.900233 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 16 07:16:23 crc kubenswrapper[4823]: I1216 07:16:23.919419 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/45a2fe80-7cf2-4419-91c9-3c958d33d5a8-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"45a2fe80-7cf2-4419-91c9-3c958d33d5a8\") " pod="openstack/openstack-cell1-galera-0" Dec 16 07:16:23 crc kubenswrapper[4823]: I1216 07:16:23.919536 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/45a2fe80-7cf2-4419-91c9-3c958d33d5a8-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"45a2fe80-7cf2-4419-91c9-3c958d33d5a8\") " pod="openstack/openstack-cell1-galera-0" Dec 16 07:16:23 crc kubenswrapper[4823]: I1216 07:16:23.919574 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/45a2fe80-7cf2-4419-91c9-3c958d33d5a8-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"45a2fe80-7cf2-4419-91c9-3c958d33d5a8\") " pod="openstack/openstack-cell1-galera-0" Dec 16 07:16:23 crc kubenswrapper[4823]: I1216 07:16:23.919623 
4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/45a2fe80-7cf2-4419-91c9-3c958d33d5a8-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"45a2fe80-7cf2-4419-91c9-3c958d33d5a8\") " pod="openstack/openstack-cell1-galera-0" Dec 16 07:16:23 crc kubenswrapper[4823]: I1216 07:16:23.919685 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whqlv\" (UniqueName: \"kubernetes.io/projected/45a2fe80-7cf2-4419-91c9-3c958d33d5a8-kube-api-access-whqlv\") pod \"openstack-cell1-galera-0\" (UID: \"45a2fe80-7cf2-4419-91c9-3c958d33d5a8\") " pod="openstack/openstack-cell1-galera-0" Dec 16 07:16:23 crc kubenswrapper[4823]: I1216 07:16:23.919717 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45a2fe80-7cf2-4419-91c9-3c958d33d5a8-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"45a2fe80-7cf2-4419-91c9-3c958d33d5a8\") " pod="openstack/openstack-cell1-galera-0" Dec 16 07:16:23 crc kubenswrapper[4823]: I1216 07:16:23.919781 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45a2fe80-7cf2-4419-91c9-3c958d33d5a8-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"45a2fe80-7cf2-4419-91c9-3c958d33d5a8\") " pod="openstack/openstack-cell1-galera-0" Dec 16 07:16:23 crc kubenswrapper[4823]: I1216 07:16:23.919798 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"45a2fe80-7cf2-4419-91c9-3c958d33d5a8\") " pod="openstack/openstack-cell1-galera-0" Dec 16 07:16:24 crc kubenswrapper[4823]: I1216 07:16:24.021013 4823 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3eee92de-9c0e-4afd-8a27-52d82caa27ad-combined-ca-bundle\") pod \"memcached-0\" (UID: \"3eee92de-9c0e-4afd-8a27-52d82caa27ad\") " pod="openstack/memcached-0" Dec 16 07:16:24 crc kubenswrapper[4823]: I1216 07:16:24.021114 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/45a2fe80-7cf2-4419-91c9-3c958d33d5a8-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"45a2fe80-7cf2-4419-91c9-3c958d33d5a8\") " pod="openstack/openstack-cell1-galera-0" Dec 16 07:16:24 crc kubenswrapper[4823]: I1216 07:16:24.021153 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3eee92de-9c0e-4afd-8a27-52d82caa27ad-kolla-config\") pod \"memcached-0\" (UID: \"3eee92de-9c0e-4afd-8a27-52d82caa27ad\") " pod="openstack/memcached-0" Dec 16 07:16:24 crc kubenswrapper[4823]: I1216 07:16:24.021189 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/45a2fe80-7cf2-4419-91c9-3c958d33d5a8-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"45a2fe80-7cf2-4419-91c9-3c958d33d5a8\") " pod="openstack/openstack-cell1-galera-0" Dec 16 07:16:24 crc kubenswrapper[4823]: I1216 07:16:24.021230 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whqlv\" (UniqueName: \"kubernetes.io/projected/45a2fe80-7cf2-4419-91c9-3c958d33d5a8-kube-api-access-whqlv\") pod \"openstack-cell1-galera-0\" (UID: \"45a2fe80-7cf2-4419-91c9-3c958d33d5a8\") " pod="openstack/openstack-cell1-galera-0" Dec 16 07:16:24 crc kubenswrapper[4823]: I1216 07:16:24.021610 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/45a2fe80-7cf2-4419-91c9-3c958d33d5a8-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"45a2fe80-7cf2-4419-91c9-3c958d33d5a8\") " pod="openstack/openstack-cell1-galera-0" Dec 16 07:16:24 crc kubenswrapper[4823]: I1216 07:16:24.021631 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3eee92de-9c0e-4afd-8a27-52d82caa27ad-config-data\") pod \"memcached-0\" (UID: \"3eee92de-9c0e-4afd-8a27-52d82caa27ad\") " pod="openstack/memcached-0" Dec 16 07:16:24 crc kubenswrapper[4823]: I1216 07:16:24.021662 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45a2fe80-7cf2-4419-91c9-3c958d33d5a8-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"45a2fe80-7cf2-4419-91c9-3c958d33d5a8\") " pod="openstack/openstack-cell1-galera-0" Dec 16 07:16:24 crc kubenswrapper[4823]: I1216 07:16:24.021705 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45a2fe80-7cf2-4419-91c9-3c958d33d5a8-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"45a2fe80-7cf2-4419-91c9-3c958d33d5a8\") " pod="openstack/openstack-cell1-galera-0" Dec 16 07:16:24 crc kubenswrapper[4823]: I1216 07:16:24.021979 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/45a2fe80-7cf2-4419-91c9-3c958d33d5a8-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"45a2fe80-7cf2-4419-91c9-3c958d33d5a8\") " pod="openstack/openstack-cell1-galera-0" Dec 16 07:16:24 crc kubenswrapper[4823]: I1216 07:16:24.022239 4823 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod 
\"openstack-cell1-galera-0\" (UID: \"45a2fe80-7cf2-4419-91c9-3c958d33d5a8\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/openstack-cell1-galera-0" Dec 16 07:16:24 crc kubenswrapper[4823]: I1216 07:16:24.021986 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"45a2fe80-7cf2-4419-91c9-3c958d33d5a8\") " pod="openstack/openstack-cell1-galera-0" Dec 16 07:16:24 crc kubenswrapper[4823]: I1216 07:16:24.022693 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/45a2fe80-7cf2-4419-91c9-3c958d33d5a8-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"45a2fe80-7cf2-4419-91c9-3c958d33d5a8\") " pod="openstack/openstack-cell1-galera-0" Dec 16 07:16:24 crc kubenswrapper[4823]: I1216 07:16:24.022749 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/3eee92de-9c0e-4afd-8a27-52d82caa27ad-memcached-tls-certs\") pod \"memcached-0\" (UID: \"3eee92de-9c0e-4afd-8a27-52d82caa27ad\") " pod="openstack/memcached-0" Dec 16 07:16:24 crc kubenswrapper[4823]: I1216 07:16:24.022875 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/45a2fe80-7cf2-4419-91c9-3c958d33d5a8-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"45a2fe80-7cf2-4419-91c9-3c958d33d5a8\") " pod="openstack/openstack-cell1-galera-0" Dec 16 07:16:24 crc kubenswrapper[4823]: I1216 07:16:24.022916 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qd8pk\" (UniqueName: \"kubernetes.io/projected/3eee92de-9c0e-4afd-8a27-52d82caa27ad-kube-api-access-qd8pk\") pod \"memcached-0\" (UID: 
\"3eee92de-9c0e-4afd-8a27-52d82caa27ad\") " pod="openstack/memcached-0" Dec 16 07:16:24 crc kubenswrapper[4823]: I1216 07:16:24.023604 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45a2fe80-7cf2-4419-91c9-3c958d33d5a8-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"45a2fe80-7cf2-4419-91c9-3c958d33d5a8\") " pod="openstack/openstack-cell1-galera-0" Dec 16 07:16:24 crc kubenswrapper[4823]: I1216 07:16:24.030757 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/45a2fe80-7cf2-4419-91c9-3c958d33d5a8-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"45a2fe80-7cf2-4419-91c9-3c958d33d5a8\") " pod="openstack/openstack-cell1-galera-0" Dec 16 07:16:24 crc kubenswrapper[4823]: I1216 07:16:24.030881 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/45a2fe80-7cf2-4419-91c9-3c958d33d5a8-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"45a2fe80-7cf2-4419-91c9-3c958d33d5a8\") " pod="openstack/openstack-cell1-galera-0" Dec 16 07:16:24 crc kubenswrapper[4823]: I1216 07:16:24.046886 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45a2fe80-7cf2-4419-91c9-3c958d33d5a8-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"45a2fe80-7cf2-4419-91c9-3c958d33d5a8\") " pod="openstack/openstack-cell1-galera-0" Dec 16 07:16:24 crc kubenswrapper[4823]: I1216 07:16:24.057240 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whqlv\" (UniqueName: \"kubernetes.io/projected/45a2fe80-7cf2-4419-91c9-3c958d33d5a8-kube-api-access-whqlv\") pod \"openstack-cell1-galera-0\" (UID: \"45a2fe80-7cf2-4419-91c9-3c958d33d5a8\") " pod="openstack/openstack-cell1-galera-0" Dec 16 07:16:24 crc 
kubenswrapper[4823]: I1216 07:16:24.080586 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"45a2fe80-7cf2-4419-91c9-3c958d33d5a8\") " pod="openstack/openstack-cell1-galera-0" Dec 16 07:16:24 crc kubenswrapper[4823]: I1216 07:16:24.123797 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qd8pk\" (UniqueName: \"kubernetes.io/projected/3eee92de-9c0e-4afd-8a27-52d82caa27ad-kube-api-access-qd8pk\") pod \"memcached-0\" (UID: \"3eee92de-9c0e-4afd-8a27-52d82caa27ad\") " pod="openstack/memcached-0" Dec 16 07:16:24 crc kubenswrapper[4823]: I1216 07:16:24.123840 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3eee92de-9c0e-4afd-8a27-52d82caa27ad-combined-ca-bundle\") pod \"memcached-0\" (UID: \"3eee92de-9c0e-4afd-8a27-52d82caa27ad\") " pod="openstack/memcached-0" Dec 16 07:16:24 crc kubenswrapper[4823]: I1216 07:16:24.123880 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3eee92de-9c0e-4afd-8a27-52d82caa27ad-kolla-config\") pod \"memcached-0\" (UID: \"3eee92de-9c0e-4afd-8a27-52d82caa27ad\") " pod="openstack/memcached-0" Dec 16 07:16:24 crc kubenswrapper[4823]: I1216 07:16:24.124050 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3eee92de-9c0e-4afd-8a27-52d82caa27ad-config-data\") pod \"memcached-0\" (UID: \"3eee92de-9c0e-4afd-8a27-52d82caa27ad\") " pod="openstack/memcached-0" Dec 16 07:16:24 crc kubenswrapper[4823]: I1216 07:16:24.124097 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/3eee92de-9c0e-4afd-8a27-52d82caa27ad-memcached-tls-certs\") pod \"memcached-0\" (UID: \"3eee92de-9c0e-4afd-8a27-52d82caa27ad\") " pod="openstack/memcached-0" Dec 16 07:16:24 crc kubenswrapper[4823]: I1216 07:16:24.124894 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3eee92de-9c0e-4afd-8a27-52d82caa27ad-kolla-config\") pod \"memcached-0\" (UID: \"3eee92de-9c0e-4afd-8a27-52d82caa27ad\") " pod="openstack/memcached-0" Dec 16 07:16:24 crc kubenswrapper[4823]: I1216 07:16:24.125434 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3eee92de-9c0e-4afd-8a27-52d82caa27ad-config-data\") pod \"memcached-0\" (UID: \"3eee92de-9c0e-4afd-8a27-52d82caa27ad\") " pod="openstack/memcached-0" Dec 16 07:16:24 crc kubenswrapper[4823]: I1216 07:16:24.147641 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qd8pk\" (UniqueName: \"kubernetes.io/projected/3eee92de-9c0e-4afd-8a27-52d82caa27ad-kube-api-access-qd8pk\") pod \"memcached-0\" (UID: \"3eee92de-9c0e-4afd-8a27-52d82caa27ad\") " pod="openstack/memcached-0" Dec 16 07:16:24 crc kubenswrapper[4823]: I1216 07:16:24.148835 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3eee92de-9c0e-4afd-8a27-52d82caa27ad-combined-ca-bundle\") pod \"memcached-0\" (UID: \"3eee92de-9c0e-4afd-8a27-52d82caa27ad\") " pod="openstack/memcached-0" Dec 16 07:16:24 crc kubenswrapper[4823]: I1216 07:16:24.149403 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/3eee92de-9c0e-4afd-8a27-52d82caa27ad-memcached-tls-certs\") pod \"memcached-0\" (UID: \"3eee92de-9c0e-4afd-8a27-52d82caa27ad\") " pod="openstack/memcached-0" Dec 16 07:16:24 crc kubenswrapper[4823]: I1216 07:16:24.197596 4823 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Dec 16 07:16:24 crc kubenswrapper[4823]: I1216 07:16:24.354003 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 16 07:16:24 crc kubenswrapper[4823]: I1216 07:16:24.605554 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"dbcff04b-7d0d-45b4-bc28-7882421c6000","Type":"ContainerStarted","Data":"477e61703af31d689c7c23af31872ff1ab2c4ed808379217e613c31f40aa13d3"} Dec 16 07:16:24 crc kubenswrapper[4823]: I1216 07:16:24.888440 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 16 07:16:24 crc kubenswrapper[4823]: W1216 07:16:24.909303 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3eee92de_9c0e_4afd_8a27_52d82caa27ad.slice/crio-b1b1b327a28624e923bddebeefdae9b7ba095e1e0f973a89b6756076f00dfaef WatchSource:0}: Error finding container b1b1b327a28624e923bddebeefdae9b7ba095e1e0f973a89b6756076f00dfaef: Status 404 returned error can't find the container with id b1b1b327a28624e923bddebeefdae9b7ba095e1e0f973a89b6756076f00dfaef Dec 16 07:16:25 crc kubenswrapper[4823]: I1216 07:16:25.225303 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 16 07:16:25 crc kubenswrapper[4823]: W1216 07:16:25.262223 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod45a2fe80_7cf2_4419_91c9_3c958d33d5a8.slice/crio-d649e376b8690bada7045f5b0459236523b60396dcf2c13df59b05b65cdff845 WatchSource:0}: Error finding container d649e376b8690bada7045f5b0459236523b60396dcf2c13df59b05b65cdff845: Status 404 returned error can't find the container with id d649e376b8690bada7045f5b0459236523b60396dcf2c13df59b05b65cdff845 Dec 16 07:16:25 crc 
kubenswrapper[4823]: I1216 07:16:25.622550 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"3eee92de-9c0e-4afd-8a27-52d82caa27ad","Type":"ContainerStarted","Data":"b1b1b327a28624e923bddebeefdae9b7ba095e1e0f973a89b6756076f00dfaef"} Dec 16 07:16:25 crc kubenswrapper[4823]: I1216 07:16:25.641677 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"45a2fe80-7cf2-4419-91c9-3c958d33d5a8","Type":"ContainerStarted","Data":"d649e376b8690bada7045f5b0459236523b60396dcf2c13df59b05b65cdff845"} Dec 16 07:16:25 crc kubenswrapper[4823]: I1216 07:16:25.857495 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 16 07:16:25 crc kubenswrapper[4823]: I1216 07:16:25.861680 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 16 07:16:25 crc kubenswrapper[4823]: I1216 07:16:25.867266 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-9qttx" Dec 16 07:16:25 crc kubenswrapper[4823]: I1216 07:16:25.879961 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 16 07:16:26 crc kubenswrapper[4823]: I1216 07:16:26.012603 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xllpk\" (UniqueName: \"kubernetes.io/projected/8db2b8b4-03e8-4ae0-875d-5f3a6414d0e0-kube-api-access-xllpk\") pod \"kube-state-metrics-0\" (UID: \"8db2b8b4-03e8-4ae0-875d-5f3a6414d0e0\") " pod="openstack/kube-state-metrics-0" Dec 16 07:16:26 crc kubenswrapper[4823]: I1216 07:16:26.114082 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xllpk\" (UniqueName: \"kubernetes.io/projected/8db2b8b4-03e8-4ae0-875d-5f3a6414d0e0-kube-api-access-xllpk\") pod \"kube-state-metrics-0\" (UID: 
\"8db2b8b4-03e8-4ae0-875d-5f3a6414d0e0\") " pod="openstack/kube-state-metrics-0" Dec 16 07:16:26 crc kubenswrapper[4823]: I1216 07:16:26.137172 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xllpk\" (UniqueName: \"kubernetes.io/projected/8db2b8b4-03e8-4ae0-875d-5f3a6414d0e0-kube-api-access-xllpk\") pod \"kube-state-metrics-0\" (UID: \"8db2b8b4-03e8-4ae0-875d-5f3a6414d0e0\") " pod="openstack/kube-state-metrics-0" Dec 16 07:16:26 crc kubenswrapper[4823]: I1216 07:16:26.183747 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 16 07:16:26 crc kubenswrapper[4823]: I1216 07:16:26.847663 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 16 07:16:26 crc kubenswrapper[4823]: W1216 07:16:26.848106 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8db2b8b4_03e8_4ae0_875d_5f3a6414d0e0.slice/crio-a68940f057a874f65624bd9e9430a72529863e9fe47ff0ba2bb0d29c6db815ac WatchSource:0}: Error finding container a68940f057a874f65624bd9e9430a72529863e9fe47ff0ba2bb0d29c6db815ac: Status 404 returned error can't find the container with id a68940f057a874f65624bd9e9430a72529863e9fe47ff0ba2bb0d29c6db815ac Dec 16 07:16:27 crc kubenswrapper[4823]: I1216 07:16:27.671625 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"8db2b8b4-03e8-4ae0-875d-5f3a6414d0e0","Type":"ContainerStarted","Data":"a68940f057a874f65624bd9e9430a72529863e9fe47ff0ba2bb0d29c6db815ac"} Dec 16 07:16:28 crc kubenswrapper[4823]: I1216 07:16:28.134180 4823 patch_prober.go:28] interesting pod/machine-config-daemon-fv56f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 
16 07:16:28 crc kubenswrapper[4823]: I1216 07:16:28.134509 4823 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 07:16:29 crc kubenswrapper[4823]: I1216 07:16:29.541220 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-fvqqp"] Dec 16 07:16:29 crc kubenswrapper[4823]: I1216 07:16:29.544201 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-fvqqp" Dec 16 07:16:29 crc kubenswrapper[4823]: I1216 07:16:29.545752 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-spsqp" Dec 16 07:16:29 crc kubenswrapper[4823]: I1216 07:16:29.546185 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Dec 16 07:16:29 crc kubenswrapper[4823]: I1216 07:16:29.546421 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Dec 16 07:16:29 crc kubenswrapper[4823]: I1216 07:16:29.552564 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-fvqqp"] Dec 16 07:16:29 crc kubenswrapper[4823]: I1216 07:16:29.593640 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-29jcz"] Dec 16 07:16:29 crc kubenswrapper[4823]: I1216 07:16:29.595962 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-29jcz" Dec 16 07:16:29 crc kubenswrapper[4823]: I1216 07:16:29.600760 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-29jcz"] Dec 16 07:16:29 crc kubenswrapper[4823]: I1216 07:16:29.646168 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5fe879e4-70bf-4f38-a4a7-98f5eb23a769-var-run\") pod \"ovn-controller-fvqqp\" (UID: \"5fe879e4-70bf-4f38-a4a7-98f5eb23a769\") " pod="openstack/ovn-controller-fvqqp" Dec 16 07:16:29 crc kubenswrapper[4823]: I1216 07:16:29.646214 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4edb9072-dfce-44ca-88d3-64136ac7e1c3-scripts\") pod \"ovn-controller-ovs-29jcz\" (UID: \"4edb9072-dfce-44ca-88d3-64136ac7e1c3\") " pod="openstack/ovn-controller-ovs-29jcz" Dec 16 07:16:29 crc kubenswrapper[4823]: I1216 07:16:29.646235 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/4edb9072-dfce-44ca-88d3-64136ac7e1c3-etc-ovs\") pod \"ovn-controller-ovs-29jcz\" (UID: \"4edb9072-dfce-44ca-88d3-64136ac7e1c3\") " pod="openstack/ovn-controller-ovs-29jcz" Dec 16 07:16:29 crc kubenswrapper[4823]: I1216 07:16:29.646292 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fe879e4-70bf-4f38-a4a7-98f5eb23a769-combined-ca-bundle\") pod \"ovn-controller-fvqqp\" (UID: \"5fe879e4-70bf-4f38-a4a7-98f5eb23a769\") " pod="openstack/ovn-controller-fvqqp" Dec 16 07:16:29 crc kubenswrapper[4823]: I1216 07:16:29.646317 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/5fe879e4-70bf-4f38-a4a7-98f5eb23a769-ovn-controller-tls-certs\") pod \"ovn-controller-fvqqp\" (UID: \"5fe879e4-70bf-4f38-a4a7-98f5eb23a769\") " pod="openstack/ovn-controller-fvqqp" Dec 16 07:16:29 crc kubenswrapper[4823]: I1216 07:16:29.646442 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5fe879e4-70bf-4f38-a4a7-98f5eb23a769-var-log-ovn\") pod \"ovn-controller-fvqqp\" (UID: \"5fe879e4-70bf-4f38-a4a7-98f5eb23a769\") " pod="openstack/ovn-controller-fvqqp" Dec 16 07:16:29 crc kubenswrapper[4823]: I1216 07:16:29.646547 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4edb9072-dfce-44ca-88d3-64136ac7e1c3-var-run\") pod \"ovn-controller-ovs-29jcz\" (UID: \"4edb9072-dfce-44ca-88d3-64136ac7e1c3\") " pod="openstack/ovn-controller-ovs-29jcz" Dec 16 07:16:29 crc kubenswrapper[4823]: I1216 07:16:29.646624 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5fe879e4-70bf-4f38-a4a7-98f5eb23a769-var-run-ovn\") pod \"ovn-controller-fvqqp\" (UID: \"5fe879e4-70bf-4f38-a4a7-98f5eb23a769\") " pod="openstack/ovn-controller-fvqqp" Dec 16 07:16:29 crc kubenswrapper[4823]: I1216 07:16:29.646649 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/4edb9072-dfce-44ca-88d3-64136ac7e1c3-var-log\") pod \"ovn-controller-ovs-29jcz\" (UID: \"4edb9072-dfce-44ca-88d3-64136ac7e1c3\") " pod="openstack/ovn-controller-ovs-29jcz" Dec 16 07:16:29 crc kubenswrapper[4823]: I1216 07:16:29.646679 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-875md\" (UniqueName: 
\"kubernetes.io/projected/5fe879e4-70bf-4f38-a4a7-98f5eb23a769-kube-api-access-875md\") pod \"ovn-controller-fvqqp\" (UID: \"5fe879e4-70bf-4f38-a4a7-98f5eb23a769\") " pod="openstack/ovn-controller-fvqqp" Dec 16 07:16:29 crc kubenswrapper[4823]: I1216 07:16:29.646738 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/4edb9072-dfce-44ca-88d3-64136ac7e1c3-var-lib\") pod \"ovn-controller-ovs-29jcz\" (UID: \"4edb9072-dfce-44ca-88d3-64136ac7e1c3\") " pod="openstack/ovn-controller-ovs-29jcz" Dec 16 07:16:29 crc kubenswrapper[4823]: I1216 07:16:29.646802 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6pr4\" (UniqueName: \"kubernetes.io/projected/4edb9072-dfce-44ca-88d3-64136ac7e1c3-kube-api-access-d6pr4\") pod \"ovn-controller-ovs-29jcz\" (UID: \"4edb9072-dfce-44ca-88d3-64136ac7e1c3\") " pod="openstack/ovn-controller-ovs-29jcz" Dec 16 07:16:29 crc kubenswrapper[4823]: I1216 07:16:29.646882 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5fe879e4-70bf-4f38-a4a7-98f5eb23a769-scripts\") pod \"ovn-controller-fvqqp\" (UID: \"5fe879e4-70bf-4f38-a4a7-98f5eb23a769\") " pod="openstack/ovn-controller-fvqqp" Dec 16 07:16:29 crc kubenswrapper[4823]: I1216 07:16:29.748515 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4edb9072-dfce-44ca-88d3-64136ac7e1c3-var-run\") pod \"ovn-controller-ovs-29jcz\" (UID: \"4edb9072-dfce-44ca-88d3-64136ac7e1c3\") " pod="openstack/ovn-controller-ovs-29jcz" Dec 16 07:16:29 crc kubenswrapper[4823]: I1216 07:16:29.748590 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5fe879e4-70bf-4f38-a4a7-98f5eb23a769-var-run-ovn\") 
pod \"ovn-controller-fvqqp\" (UID: \"5fe879e4-70bf-4f38-a4a7-98f5eb23a769\") " pod="openstack/ovn-controller-fvqqp" Dec 16 07:16:29 crc kubenswrapper[4823]: I1216 07:16:29.748615 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/4edb9072-dfce-44ca-88d3-64136ac7e1c3-var-log\") pod \"ovn-controller-ovs-29jcz\" (UID: \"4edb9072-dfce-44ca-88d3-64136ac7e1c3\") " pod="openstack/ovn-controller-ovs-29jcz" Dec 16 07:16:29 crc kubenswrapper[4823]: I1216 07:16:29.748644 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-875md\" (UniqueName: \"kubernetes.io/projected/5fe879e4-70bf-4f38-a4a7-98f5eb23a769-kube-api-access-875md\") pod \"ovn-controller-fvqqp\" (UID: \"5fe879e4-70bf-4f38-a4a7-98f5eb23a769\") " pod="openstack/ovn-controller-fvqqp" Dec 16 07:16:29 crc kubenswrapper[4823]: I1216 07:16:29.748680 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/4edb9072-dfce-44ca-88d3-64136ac7e1c3-var-lib\") pod \"ovn-controller-ovs-29jcz\" (UID: \"4edb9072-dfce-44ca-88d3-64136ac7e1c3\") " pod="openstack/ovn-controller-ovs-29jcz" Dec 16 07:16:29 crc kubenswrapper[4823]: I1216 07:16:29.748718 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6pr4\" (UniqueName: \"kubernetes.io/projected/4edb9072-dfce-44ca-88d3-64136ac7e1c3-kube-api-access-d6pr4\") pod \"ovn-controller-ovs-29jcz\" (UID: \"4edb9072-dfce-44ca-88d3-64136ac7e1c3\") " pod="openstack/ovn-controller-ovs-29jcz" Dec 16 07:16:29 crc kubenswrapper[4823]: I1216 07:16:29.748764 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5fe879e4-70bf-4f38-a4a7-98f5eb23a769-scripts\") pod \"ovn-controller-fvqqp\" (UID: \"5fe879e4-70bf-4f38-a4a7-98f5eb23a769\") " pod="openstack/ovn-controller-fvqqp" Dec 16 
07:16:29 crc kubenswrapper[4823]: I1216 07:16:29.748783 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5fe879e4-70bf-4f38-a4a7-98f5eb23a769-var-run\") pod \"ovn-controller-fvqqp\" (UID: \"5fe879e4-70bf-4f38-a4a7-98f5eb23a769\") " pod="openstack/ovn-controller-fvqqp" Dec 16 07:16:29 crc kubenswrapper[4823]: I1216 07:16:29.748800 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4edb9072-dfce-44ca-88d3-64136ac7e1c3-scripts\") pod \"ovn-controller-ovs-29jcz\" (UID: \"4edb9072-dfce-44ca-88d3-64136ac7e1c3\") " pod="openstack/ovn-controller-ovs-29jcz" Dec 16 07:16:29 crc kubenswrapper[4823]: I1216 07:16:29.748815 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/4edb9072-dfce-44ca-88d3-64136ac7e1c3-etc-ovs\") pod \"ovn-controller-ovs-29jcz\" (UID: \"4edb9072-dfce-44ca-88d3-64136ac7e1c3\") " pod="openstack/ovn-controller-ovs-29jcz" Dec 16 07:16:29 crc kubenswrapper[4823]: I1216 07:16:29.748835 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fe879e4-70bf-4f38-a4a7-98f5eb23a769-combined-ca-bundle\") pod \"ovn-controller-fvqqp\" (UID: \"5fe879e4-70bf-4f38-a4a7-98f5eb23a769\") " pod="openstack/ovn-controller-fvqqp" Dec 16 07:16:29 crc kubenswrapper[4823]: I1216 07:16:29.748855 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/5fe879e4-70bf-4f38-a4a7-98f5eb23a769-ovn-controller-tls-certs\") pod \"ovn-controller-fvqqp\" (UID: \"5fe879e4-70bf-4f38-a4a7-98f5eb23a769\") " pod="openstack/ovn-controller-fvqqp" Dec 16 07:16:29 crc kubenswrapper[4823]: I1216 07:16:29.748876 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5fe879e4-70bf-4f38-a4a7-98f5eb23a769-var-log-ovn\") pod \"ovn-controller-fvqqp\" (UID: \"5fe879e4-70bf-4f38-a4a7-98f5eb23a769\") " pod="openstack/ovn-controller-fvqqp" Dec 16 07:16:29 crc kubenswrapper[4823]: I1216 07:16:29.749432 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5fe879e4-70bf-4f38-a4a7-98f5eb23a769-var-log-ovn\") pod \"ovn-controller-fvqqp\" (UID: \"5fe879e4-70bf-4f38-a4a7-98f5eb23a769\") " pod="openstack/ovn-controller-fvqqp" Dec 16 07:16:29 crc kubenswrapper[4823]: I1216 07:16:29.749561 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4edb9072-dfce-44ca-88d3-64136ac7e1c3-var-run\") pod \"ovn-controller-ovs-29jcz\" (UID: \"4edb9072-dfce-44ca-88d3-64136ac7e1c3\") " pod="openstack/ovn-controller-ovs-29jcz" Dec 16 07:16:29 crc kubenswrapper[4823]: I1216 07:16:29.749662 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5fe879e4-70bf-4f38-a4a7-98f5eb23a769-var-run-ovn\") pod \"ovn-controller-fvqqp\" (UID: \"5fe879e4-70bf-4f38-a4a7-98f5eb23a769\") " pod="openstack/ovn-controller-fvqqp" Dec 16 07:16:29 crc kubenswrapper[4823]: I1216 07:16:29.749733 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/4edb9072-dfce-44ca-88d3-64136ac7e1c3-var-log\") pod \"ovn-controller-ovs-29jcz\" (UID: \"4edb9072-dfce-44ca-88d3-64136ac7e1c3\") " pod="openstack/ovn-controller-ovs-29jcz" Dec 16 07:16:29 crc kubenswrapper[4823]: I1216 07:16:29.750137 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/4edb9072-dfce-44ca-88d3-64136ac7e1c3-var-lib\") pod \"ovn-controller-ovs-29jcz\" (UID: \"4edb9072-dfce-44ca-88d3-64136ac7e1c3\") " 
pod="openstack/ovn-controller-ovs-29jcz" Dec 16 07:16:29 crc kubenswrapper[4823]: I1216 07:16:29.752253 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/4edb9072-dfce-44ca-88d3-64136ac7e1c3-etc-ovs\") pod \"ovn-controller-ovs-29jcz\" (UID: \"4edb9072-dfce-44ca-88d3-64136ac7e1c3\") " pod="openstack/ovn-controller-ovs-29jcz" Dec 16 07:16:29 crc kubenswrapper[4823]: I1216 07:16:29.752312 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5fe879e4-70bf-4f38-a4a7-98f5eb23a769-var-run\") pod \"ovn-controller-fvqqp\" (UID: \"5fe879e4-70bf-4f38-a4a7-98f5eb23a769\") " pod="openstack/ovn-controller-fvqqp" Dec 16 07:16:29 crc kubenswrapper[4823]: I1216 07:16:29.754154 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4edb9072-dfce-44ca-88d3-64136ac7e1c3-scripts\") pod \"ovn-controller-ovs-29jcz\" (UID: \"4edb9072-dfce-44ca-88d3-64136ac7e1c3\") " pod="openstack/ovn-controller-ovs-29jcz" Dec 16 07:16:29 crc kubenswrapper[4823]: I1216 07:16:29.754508 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5fe879e4-70bf-4f38-a4a7-98f5eb23a769-scripts\") pod \"ovn-controller-fvqqp\" (UID: \"5fe879e4-70bf-4f38-a4a7-98f5eb23a769\") " pod="openstack/ovn-controller-fvqqp" Dec 16 07:16:29 crc kubenswrapper[4823]: I1216 07:16:29.759470 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/5fe879e4-70bf-4f38-a4a7-98f5eb23a769-ovn-controller-tls-certs\") pod \"ovn-controller-fvqqp\" (UID: \"5fe879e4-70bf-4f38-a4a7-98f5eb23a769\") " pod="openstack/ovn-controller-fvqqp" Dec 16 07:16:29 crc kubenswrapper[4823]: I1216 07:16:29.766481 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-875md\" (UniqueName: \"kubernetes.io/projected/5fe879e4-70bf-4f38-a4a7-98f5eb23a769-kube-api-access-875md\") pod \"ovn-controller-fvqqp\" (UID: \"5fe879e4-70bf-4f38-a4a7-98f5eb23a769\") " pod="openstack/ovn-controller-fvqqp" Dec 16 07:16:29 crc kubenswrapper[4823]: I1216 07:16:29.769262 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6pr4\" (UniqueName: \"kubernetes.io/projected/4edb9072-dfce-44ca-88d3-64136ac7e1c3-kube-api-access-d6pr4\") pod \"ovn-controller-ovs-29jcz\" (UID: \"4edb9072-dfce-44ca-88d3-64136ac7e1c3\") " pod="openstack/ovn-controller-ovs-29jcz" Dec 16 07:16:29 crc kubenswrapper[4823]: I1216 07:16:29.769362 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fe879e4-70bf-4f38-a4a7-98f5eb23a769-combined-ca-bundle\") pod \"ovn-controller-fvqqp\" (UID: \"5fe879e4-70bf-4f38-a4a7-98f5eb23a769\") " pod="openstack/ovn-controller-fvqqp" Dec 16 07:16:29 crc kubenswrapper[4823]: I1216 07:16:29.889659 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-fvqqp" Dec 16 07:16:29 crc kubenswrapper[4823]: I1216 07:16:29.943674 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-29jcz" Dec 16 07:16:30 crc kubenswrapper[4823]: I1216 07:16:30.424454 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 16 07:16:30 crc kubenswrapper[4823]: I1216 07:16:30.427245 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 16 07:16:30 crc kubenswrapper[4823]: I1216 07:16:30.429661 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-5v6hk" Dec 16 07:16:30 crc kubenswrapper[4823]: I1216 07:16:30.429967 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Dec 16 07:16:30 crc kubenswrapper[4823]: I1216 07:16:30.429972 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Dec 16 07:16:30 crc kubenswrapper[4823]: I1216 07:16:30.430011 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Dec 16 07:16:30 crc kubenswrapper[4823]: I1216 07:16:30.430023 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Dec 16 07:16:30 crc kubenswrapper[4823]: I1216 07:16:30.448160 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 16 07:16:30 crc kubenswrapper[4823]: I1216 07:16:30.592492 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ksj4k\" (UniqueName: \"kubernetes.io/projected/b566f9ee-8a75-4041-aac4-1573ca610541-kube-api-access-ksj4k\") pod \"ovsdbserver-nb-0\" (UID: \"b566f9ee-8a75-4041-aac4-1573ca610541\") " pod="openstack/ovsdbserver-nb-0" Dec 16 07:16:30 crc kubenswrapper[4823]: I1216 07:16:30.592569 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b566f9ee-8a75-4041-aac4-1573ca610541-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"b566f9ee-8a75-4041-aac4-1573ca610541\") " pod="openstack/ovsdbserver-nb-0" Dec 16 07:16:30 crc kubenswrapper[4823]: I1216 07:16:30.592630 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/b566f9ee-8a75-4041-aac4-1573ca610541-config\") pod \"ovsdbserver-nb-0\" (UID: \"b566f9ee-8a75-4041-aac4-1573ca610541\") " pod="openstack/ovsdbserver-nb-0" Dec 16 07:16:30 crc kubenswrapper[4823]: I1216 07:16:30.592687 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-nb-0\" (UID: \"b566f9ee-8a75-4041-aac4-1573ca610541\") " pod="openstack/ovsdbserver-nb-0" Dec 16 07:16:30 crc kubenswrapper[4823]: I1216 07:16:30.592738 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b566f9ee-8a75-4041-aac4-1573ca610541-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"b566f9ee-8a75-4041-aac4-1573ca610541\") " pod="openstack/ovsdbserver-nb-0" Dec 16 07:16:30 crc kubenswrapper[4823]: I1216 07:16:30.592764 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b566f9ee-8a75-4041-aac4-1573ca610541-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"b566f9ee-8a75-4041-aac4-1573ca610541\") " pod="openstack/ovsdbserver-nb-0" Dec 16 07:16:30 crc kubenswrapper[4823]: I1216 07:16:30.592817 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b566f9ee-8a75-4041-aac4-1573ca610541-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"b566f9ee-8a75-4041-aac4-1573ca610541\") " pod="openstack/ovsdbserver-nb-0" Dec 16 07:16:30 crc kubenswrapper[4823]: I1216 07:16:30.592844 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/b566f9ee-8a75-4041-aac4-1573ca610541-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"b566f9ee-8a75-4041-aac4-1573ca610541\") " pod="openstack/ovsdbserver-nb-0" Dec 16 07:16:30 crc kubenswrapper[4823]: I1216 07:16:30.694780 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b566f9ee-8a75-4041-aac4-1573ca610541-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"b566f9ee-8a75-4041-aac4-1573ca610541\") " pod="openstack/ovsdbserver-nb-0" Dec 16 07:16:30 crc kubenswrapper[4823]: I1216 07:16:30.694834 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b566f9ee-8a75-4041-aac4-1573ca610541-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"b566f9ee-8a75-4041-aac4-1573ca610541\") " pod="openstack/ovsdbserver-nb-0" Dec 16 07:16:30 crc kubenswrapper[4823]: I1216 07:16:30.694865 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ksj4k\" (UniqueName: \"kubernetes.io/projected/b566f9ee-8a75-4041-aac4-1573ca610541-kube-api-access-ksj4k\") pod \"ovsdbserver-nb-0\" (UID: \"b566f9ee-8a75-4041-aac4-1573ca610541\") " pod="openstack/ovsdbserver-nb-0" Dec 16 07:16:30 crc kubenswrapper[4823]: I1216 07:16:30.694906 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b566f9ee-8a75-4041-aac4-1573ca610541-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"b566f9ee-8a75-4041-aac4-1573ca610541\") " pod="openstack/ovsdbserver-nb-0" Dec 16 07:16:30 crc kubenswrapper[4823]: I1216 07:16:30.694944 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b566f9ee-8a75-4041-aac4-1573ca610541-config\") pod \"ovsdbserver-nb-0\" (UID: \"b566f9ee-8a75-4041-aac4-1573ca610541\") " 
pod="openstack/ovsdbserver-nb-0" Dec 16 07:16:30 crc kubenswrapper[4823]: I1216 07:16:30.694999 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-nb-0\" (UID: \"b566f9ee-8a75-4041-aac4-1573ca610541\") " pod="openstack/ovsdbserver-nb-0" Dec 16 07:16:30 crc kubenswrapper[4823]: I1216 07:16:30.695068 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b566f9ee-8a75-4041-aac4-1573ca610541-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"b566f9ee-8a75-4041-aac4-1573ca610541\") " pod="openstack/ovsdbserver-nb-0" Dec 16 07:16:30 crc kubenswrapper[4823]: I1216 07:16:30.695094 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b566f9ee-8a75-4041-aac4-1573ca610541-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"b566f9ee-8a75-4041-aac4-1573ca610541\") " pod="openstack/ovsdbserver-nb-0" Dec 16 07:16:30 crc kubenswrapper[4823]: I1216 07:16:30.695650 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b566f9ee-8a75-4041-aac4-1573ca610541-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"b566f9ee-8a75-4041-aac4-1573ca610541\") " pod="openstack/ovsdbserver-nb-0" Dec 16 07:16:30 crc kubenswrapper[4823]: I1216 07:16:30.697199 4823 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-nb-0\" (UID: \"b566f9ee-8a75-4041-aac4-1573ca610541\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/ovsdbserver-nb-0" Dec 16 07:16:30 crc kubenswrapper[4823]: I1216 07:16:30.697624 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/b566f9ee-8a75-4041-aac4-1573ca610541-config\") pod \"ovsdbserver-nb-0\" (UID: \"b566f9ee-8a75-4041-aac4-1573ca610541\") " pod="openstack/ovsdbserver-nb-0" Dec 16 07:16:30 crc kubenswrapper[4823]: I1216 07:16:30.698412 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b566f9ee-8a75-4041-aac4-1573ca610541-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"b566f9ee-8a75-4041-aac4-1573ca610541\") " pod="openstack/ovsdbserver-nb-0" Dec 16 07:16:30 crc kubenswrapper[4823]: I1216 07:16:30.700873 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b566f9ee-8a75-4041-aac4-1573ca610541-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"b566f9ee-8a75-4041-aac4-1573ca610541\") " pod="openstack/ovsdbserver-nb-0" Dec 16 07:16:30 crc kubenswrapper[4823]: I1216 07:16:30.701779 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b566f9ee-8a75-4041-aac4-1573ca610541-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"b566f9ee-8a75-4041-aac4-1573ca610541\") " pod="openstack/ovsdbserver-nb-0" Dec 16 07:16:30 crc kubenswrapper[4823]: I1216 07:16:30.712963 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b566f9ee-8a75-4041-aac4-1573ca610541-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"b566f9ee-8a75-4041-aac4-1573ca610541\") " pod="openstack/ovsdbserver-nb-0" Dec 16 07:16:30 crc kubenswrapper[4823]: I1216 07:16:30.715300 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ksj4k\" (UniqueName: \"kubernetes.io/projected/b566f9ee-8a75-4041-aac4-1573ca610541-kube-api-access-ksj4k\") pod \"ovsdbserver-nb-0\" (UID: \"b566f9ee-8a75-4041-aac4-1573ca610541\") " 
pod="openstack/ovsdbserver-nb-0" Dec 16 07:16:30 crc kubenswrapper[4823]: I1216 07:16:30.718710 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-nb-0\" (UID: \"b566f9ee-8a75-4041-aac4-1573ca610541\") " pod="openstack/ovsdbserver-nb-0" Dec 16 07:16:30 crc kubenswrapper[4823]: I1216 07:16:30.752271 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 16 07:16:33 crc kubenswrapper[4823]: I1216 07:16:33.516320 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 16 07:16:33 crc kubenswrapper[4823]: I1216 07:16:33.530516 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 16 07:16:33 crc kubenswrapper[4823]: I1216 07:16:33.530600 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 16 07:16:33 crc kubenswrapper[4823]: I1216 07:16:33.537379 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Dec 16 07:16:33 crc kubenswrapper[4823]: I1216 07:16:33.540180 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Dec 16 07:16:33 crc kubenswrapper[4823]: I1216 07:16:33.540386 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-hx976" Dec 16 07:16:33 crc kubenswrapper[4823]: I1216 07:16:33.541591 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Dec 16 07:16:33 crc kubenswrapper[4823]: I1216 07:16:33.669302 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/603d469a-39a2-4d84-87cb-f2c7499b7a28-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"603d469a-39a2-4d84-87cb-f2c7499b7a28\") " pod="openstack/ovsdbserver-sb-0" Dec 16 07:16:33 crc kubenswrapper[4823]: I1216 07:16:33.669374 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckcq8\" (UniqueName: \"kubernetes.io/projected/603d469a-39a2-4d84-87cb-f2c7499b7a28-kube-api-access-ckcq8\") pod \"ovsdbserver-sb-0\" (UID: \"603d469a-39a2-4d84-87cb-f2c7499b7a28\") " pod="openstack/ovsdbserver-sb-0" Dec 16 07:16:33 crc kubenswrapper[4823]: I1216 07:16:33.669398 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/603d469a-39a2-4d84-87cb-f2c7499b7a28-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"603d469a-39a2-4d84-87cb-f2c7499b7a28\") " pod="openstack/ovsdbserver-sb-0" Dec 16 07:16:33 crc kubenswrapper[4823]: I1216 07:16:33.669451 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/603d469a-39a2-4d84-87cb-f2c7499b7a28-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"603d469a-39a2-4d84-87cb-f2c7499b7a28\") " pod="openstack/ovsdbserver-sb-0" Dec 16 07:16:33 crc kubenswrapper[4823]: I1216 07:16:33.669483 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/603d469a-39a2-4d84-87cb-f2c7499b7a28-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"603d469a-39a2-4d84-87cb-f2c7499b7a28\") " pod="openstack/ovsdbserver-sb-0" Dec 16 07:16:33 crc kubenswrapper[4823]: I1216 07:16:33.669519 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/603d469a-39a2-4d84-87cb-f2c7499b7a28-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"603d469a-39a2-4d84-87cb-f2c7499b7a28\") " pod="openstack/ovsdbserver-sb-0" Dec 16 07:16:33 crc kubenswrapper[4823]: I1216 07:16:33.669571 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-sb-0\" (UID: \"603d469a-39a2-4d84-87cb-f2c7499b7a28\") " pod="openstack/ovsdbserver-sb-0" Dec 16 07:16:33 crc kubenswrapper[4823]: I1216 07:16:33.669592 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/603d469a-39a2-4d84-87cb-f2c7499b7a28-config\") pod \"ovsdbserver-sb-0\" (UID: \"603d469a-39a2-4d84-87cb-f2c7499b7a28\") " pod="openstack/ovsdbserver-sb-0" Dec 16 07:16:33 crc kubenswrapper[4823]: I1216 07:16:33.771518 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckcq8\" (UniqueName: \"kubernetes.io/projected/603d469a-39a2-4d84-87cb-f2c7499b7a28-kube-api-access-ckcq8\") pod \"ovsdbserver-sb-0\" (UID: \"603d469a-39a2-4d84-87cb-f2c7499b7a28\") " pod="openstack/ovsdbserver-sb-0" Dec 16 07:16:33 crc kubenswrapper[4823]: I1216 07:16:33.771561 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/603d469a-39a2-4d84-87cb-f2c7499b7a28-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"603d469a-39a2-4d84-87cb-f2c7499b7a28\") " pod="openstack/ovsdbserver-sb-0" Dec 16 07:16:33 crc kubenswrapper[4823]: I1216 07:16:33.771655 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/603d469a-39a2-4d84-87cb-f2c7499b7a28-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"603d469a-39a2-4d84-87cb-f2c7499b7a28\") " 
pod="openstack/ovsdbserver-sb-0" Dec 16 07:16:33 crc kubenswrapper[4823]: I1216 07:16:33.771693 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/603d469a-39a2-4d84-87cb-f2c7499b7a28-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"603d469a-39a2-4d84-87cb-f2c7499b7a28\") " pod="openstack/ovsdbserver-sb-0" Dec 16 07:16:33 crc kubenswrapper[4823]: I1216 07:16:33.771738 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/603d469a-39a2-4d84-87cb-f2c7499b7a28-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"603d469a-39a2-4d84-87cb-f2c7499b7a28\") " pod="openstack/ovsdbserver-sb-0" Dec 16 07:16:33 crc kubenswrapper[4823]: I1216 07:16:33.771817 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-sb-0\" (UID: \"603d469a-39a2-4d84-87cb-f2c7499b7a28\") " pod="openstack/ovsdbserver-sb-0" Dec 16 07:16:33 crc kubenswrapper[4823]: I1216 07:16:33.771830 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/603d469a-39a2-4d84-87cb-f2c7499b7a28-config\") pod \"ovsdbserver-sb-0\" (UID: \"603d469a-39a2-4d84-87cb-f2c7499b7a28\") " pod="openstack/ovsdbserver-sb-0" Dec 16 07:16:33 crc kubenswrapper[4823]: I1216 07:16:33.771909 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/603d469a-39a2-4d84-87cb-f2c7499b7a28-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"603d469a-39a2-4d84-87cb-f2c7499b7a28\") " pod="openstack/ovsdbserver-sb-0" Dec 16 07:16:33 crc kubenswrapper[4823]: I1216 07:16:33.772490 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" 
(UniqueName: \"kubernetes.io/empty-dir/603d469a-39a2-4d84-87cb-f2c7499b7a28-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"603d469a-39a2-4d84-87cb-f2c7499b7a28\") " pod="openstack/ovsdbserver-sb-0" Dec 16 07:16:33 crc kubenswrapper[4823]: I1216 07:16:33.772714 4823 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-sb-0\" (UID: \"603d469a-39a2-4d84-87cb-f2c7499b7a28\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/ovsdbserver-sb-0" Dec 16 07:16:33 crc kubenswrapper[4823]: I1216 07:16:33.773323 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/603d469a-39a2-4d84-87cb-f2c7499b7a28-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"603d469a-39a2-4d84-87cb-f2c7499b7a28\") " pod="openstack/ovsdbserver-sb-0" Dec 16 07:16:33 crc kubenswrapper[4823]: I1216 07:16:33.774248 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/603d469a-39a2-4d84-87cb-f2c7499b7a28-config\") pod \"ovsdbserver-sb-0\" (UID: \"603d469a-39a2-4d84-87cb-f2c7499b7a28\") " pod="openstack/ovsdbserver-sb-0" Dec 16 07:16:33 crc kubenswrapper[4823]: I1216 07:16:33.778498 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/603d469a-39a2-4d84-87cb-f2c7499b7a28-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"603d469a-39a2-4d84-87cb-f2c7499b7a28\") " pod="openstack/ovsdbserver-sb-0" Dec 16 07:16:33 crc kubenswrapper[4823]: I1216 07:16:33.779771 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/603d469a-39a2-4d84-87cb-f2c7499b7a28-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"603d469a-39a2-4d84-87cb-f2c7499b7a28\") " pod="openstack/ovsdbserver-sb-0" 
Dec 16 07:16:33 crc kubenswrapper[4823]: I1216 07:16:33.780606 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/603d469a-39a2-4d84-87cb-f2c7499b7a28-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"603d469a-39a2-4d84-87cb-f2c7499b7a28\") " pod="openstack/ovsdbserver-sb-0" Dec 16 07:16:33 crc kubenswrapper[4823]: I1216 07:16:33.791714 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckcq8\" (UniqueName: \"kubernetes.io/projected/603d469a-39a2-4d84-87cb-f2c7499b7a28-kube-api-access-ckcq8\") pod \"ovsdbserver-sb-0\" (UID: \"603d469a-39a2-4d84-87cb-f2c7499b7a28\") " pod="openstack/ovsdbserver-sb-0" Dec 16 07:16:33 crc kubenswrapper[4823]: I1216 07:16:33.793585 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-sb-0\" (UID: \"603d469a-39a2-4d84-87cb-f2c7499b7a28\") " pod="openstack/ovsdbserver-sb-0" Dec 16 07:16:33 crc kubenswrapper[4823]: I1216 07:16:33.870674 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 16 07:16:43 crc kubenswrapper[4823]: E1216 07:16:43.580187 4823 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq@sha256:e733252aab7f4bc0efbdd712bcd88e44c5498bf1773dba843bc9dcfac324fe3d" Dec 16 07:16:43 crc kubenswrapper[4823]: E1216 07:16:43.581399 4823 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq@sha256:e733252aab7f4bc0efbdd712bcd88e44c5498bf1773dba843bc9dcfac324fe3d,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9vq4j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-server-0_openstack(a686a945-8fa0-406c-ac01-cf061c865a28): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 16 07:16:43 crc 
kubenswrapper[4823]: E1216 07:16:43.583219 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-server-0" podUID="a686a945-8fa0-406c-ac01-cf061c865a28" Dec 16 07:16:43 crc kubenswrapper[4823]: E1216 07:16:43.864650 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq@sha256:e733252aab7f4bc0efbdd712bcd88e44c5498bf1773dba843bc9dcfac324fe3d\\\"\"" pod="openstack/rabbitmq-server-0" podUID="a686a945-8fa0-406c-ac01-cf061c865a28" Dec 16 07:16:46 crc kubenswrapper[4823]: E1216 07:16:46.004324 4823 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq@sha256:e733252aab7f4bc0efbdd712bcd88e44c5498bf1773dba843bc9dcfac324fe3d" Dec 16 07:16:46 crc kubenswrapper[4823]: E1216 07:16:46.004837 4823 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq@sha256:e733252aab7f4bc0efbdd712bcd88e44c5498bf1773dba843bc9dcfac324fe3d,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: 
{{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kwr5v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cell1-server-0_openstack(cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1): ErrImagePull: rpc error: 
code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 16 07:16:46 crc kubenswrapper[4823]: E1216 07:16:46.006262 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-cell1-server-0" podUID="cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1" Dec 16 07:16:46 crc kubenswrapper[4823]: E1216 07:16:46.784073 4823 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-memcached@sha256:e47191ba776414b781b3e27b856ab45a03b9480c7dc2b1addb939608794882dc" Dec 16 07:16:46 crc kubenswrapper[4823]: E1216 07:16:46.784304 4823 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:memcached,Image:quay.io/podified-antelope-centos9/openstack-memcached@sha256:e47191ba776414b781b3e27b856ab45a03b9480c7dc2b1addb939608794882dc,Command:[/usr/bin/dumb-init -- 
/usr/local/bin/kolla_start],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:memcached,HostPort:0,ContainerPort:11211,Protocol:TCP,HostIP:,},ContainerPort{Name:memcached-tls,HostPort:0,ContainerPort:11212,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:POD_IPS,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIPs,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:CONFIG_HASH,Value:n564hbbh544h695hbh5dh674hcch54hf5h574h98h568h7fhf5h58h56dh5dbh6bh549h59dh546h85hbbh57bh56hc7hcfh59dh84h598hf8q,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/src,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:memcached-tls-certs,ReadOnly:true,MountPath:/var/lib/config-data/tls/certs/memcached.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:memcached-tls-certs,ReadOnly:true,MountPath:/var/lib/config-data/tls/private/memcached.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qd8pk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 11211 
},Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 11211 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42457,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42457,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod memcached-0_openstack(3eee92de-9c0e-4afd-8a27-52d82caa27ad): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 16 07:16:46 crc kubenswrapper[4823]: E1216 07:16:46.785536 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"memcached\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/memcached-0" podUID="3eee92de-9c0e-4afd-8a27-52d82caa27ad" Dec 16 07:16:46 crc kubenswrapper[4823]: E1216 07:16:46.902675 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq@sha256:e733252aab7f4bc0efbdd712bcd88e44c5498bf1773dba843bc9dcfac324fe3d\\\"\"" pod="openstack/rabbitmq-cell1-server-0" podUID="cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1" Dec 16 07:16:46 crc 
kubenswrapper[4823]: E1216 07:16:46.903094 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"memcached\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-memcached@sha256:e47191ba776414b781b3e27b856ab45a03b9480c7dc2b1addb939608794882dc\\\"\"" pod="openstack/memcached-0" podUID="3eee92de-9c0e-4afd-8a27-52d82caa27ad" Dec 16 07:16:52 crc kubenswrapper[4823]: E1216 07:16:52.655950 4823 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:ea0bf67f1aa5d95a9a07b9c8692c293470f1311792c55d3d57f1f92e56689c33" Dec 16 07:16:52 crc kubenswrapper[4823]: E1216 07:16:52.656594 4823 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:ea0bf67f1aa5d95a9a07b9c8692c293470f1311792c55d3d57f1f92e56689c33,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vgvkw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-5f854695bc-kt72s_openstack(36de76e9-6942-4148-a52f-423e6b0b2a18): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 16 07:16:52 crc kubenswrapper[4823]: E1216 07:16:52.657799 4823 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-5f854695bc-kt72s" podUID="36de76e9-6942-4148-a52f-423e6b0b2a18" Dec 16 07:16:52 crc kubenswrapper[4823]: E1216 07:16:52.673496 4823 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:ea0bf67f1aa5d95a9a07b9c8692c293470f1311792c55d3d57f1f92e56689c33" Dec 16 07:16:52 crc kubenswrapper[4823]: E1216 07:16:52.673681 4823 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:ea0bf67f1aa5d95a9a07b9c8692c293470f1311792c55d3d57f1f92e56689c33,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jfc29,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-84bb9d8bd9-7m8m9_openstack(625ed164-d2cc-45d3-b977-eea43da4cf51): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 16 07:16:52 crc kubenswrapper[4823]: E1216 07:16:52.681194 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-84bb9d8bd9-7m8m9" podUID="625ed164-d2cc-45d3-b977-eea43da4cf51" Dec 16 07:16:52 crc kubenswrapper[4823]: E1216 07:16:52.684963 4823 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:ea0bf67f1aa5d95a9a07b9c8692c293470f1311792c55d3d57f1f92e56689c33" Dec 16 07:16:52 crc kubenswrapper[4823]: E1216 07:16:52.685176 4823 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:ea0bf67f1aa5d95a9a07b9c8692c293470f1311792c55d3d57f1f92e56689c33,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nfdh5dfhb6h64h676hc4h78h97h669h54chfbh696hb5h54bh5d4h6bh64h644h677h584h5cbh698h9dh5bbh5f8h5b8hcdh644h5c7h694hbfh589q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bnt5f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,}
,},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-c7cbb8f79-tl9kf_openstack(ff169a34-dc27-40ca-86ca-ba5e8f644502): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 16 07:16:52 crc kubenswrapper[4823]: E1216 07:16:52.686225 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-c7cbb8f79-tl9kf" podUID="ff169a34-dc27-40ca-86ca-ba5e8f644502" Dec 16 07:16:52 crc kubenswrapper[4823]: E1216 07:16:52.819403 4823 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:ea0bf67f1aa5d95a9a07b9c8692c293470f1311792c55d3d57f1f92e56689c33" Dec 16 07:16:52 crc kubenswrapper[4823]: E1216 07:16:52.819668 4823 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:ea0bf67f1aa5d95a9a07b9c8692c293470f1311792c55d3d57f1f92e56689c33,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug 
--bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-w4mvz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-95f5f6995-jqbcz_openstack(408c33cd-064f-42e1-b3b5-a2c1b7046f0c): ErrImagePull: rpc error: code = Canceled desc = copying config: context 
canceled" logger="UnhandledError" Dec 16 07:16:52 crc kubenswrapper[4823]: E1216 07:16:52.823573 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-95f5f6995-jqbcz" podUID="408c33cd-064f-42e1-b3b5-a2c1b7046f0c" Dec 16 07:16:52 crc kubenswrapper[4823]: E1216 07:16:52.966855 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:ea0bf67f1aa5d95a9a07b9c8692c293470f1311792c55d3d57f1f92e56689c33\\\"\"" pod="openstack/dnsmasq-dns-c7cbb8f79-tl9kf" podUID="ff169a34-dc27-40ca-86ca-ba5e8f644502" Dec 16 07:16:52 crc kubenswrapper[4823]: E1216 07:16:52.966902 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:ea0bf67f1aa5d95a9a07b9c8692c293470f1311792c55d3d57f1f92e56689c33\\\"\"" pod="openstack/dnsmasq-dns-95f5f6995-jqbcz" podUID="408c33cd-064f-42e1-b3b5-a2c1b7046f0c" Dec 16 07:16:53 crc kubenswrapper[4823]: I1216 07:16:53.249696 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-fvqqp"] Dec 16 07:16:53 crc kubenswrapper[4823]: I1216 07:16:53.297393 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 16 07:16:53 crc kubenswrapper[4823]: I1216 07:16:53.413043 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-29jcz"] Dec 16 07:16:53 crc kubenswrapper[4823]: I1216 07:16:53.444862 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-84bb9d8bd9-7m8m9" Dec 16 07:16:53 crc kubenswrapper[4823]: W1216 07:16:53.513889 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4edb9072_dfce_44ca_88d3_64136ac7e1c3.slice/crio-235232dd5c4000cb81bcdc3a65e84dd1780c277e0d516b25d57d5ed080d7f45e WatchSource:0}: Error finding container 235232dd5c4000cb81bcdc3a65e84dd1780c277e0d516b25d57d5ed080d7f45e: Status 404 returned error can't find the container with id 235232dd5c4000cb81bcdc3a65e84dd1780c277e0d516b25d57d5ed080d7f45e Dec 16 07:16:53 crc kubenswrapper[4823]: I1216 07:16:53.522645 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f854695bc-kt72s" Dec 16 07:16:53 crc kubenswrapper[4823]: I1216 07:16:53.545242 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/625ed164-d2cc-45d3-b977-eea43da4cf51-config\") pod \"625ed164-d2cc-45d3-b977-eea43da4cf51\" (UID: \"625ed164-d2cc-45d3-b977-eea43da4cf51\") " Dec 16 07:16:53 crc kubenswrapper[4823]: I1216 07:16:53.545390 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jfc29\" (UniqueName: \"kubernetes.io/projected/625ed164-d2cc-45d3-b977-eea43da4cf51-kube-api-access-jfc29\") pod \"625ed164-d2cc-45d3-b977-eea43da4cf51\" (UID: \"625ed164-d2cc-45d3-b977-eea43da4cf51\") " Dec 16 07:16:53 crc kubenswrapper[4823]: I1216 07:16:53.545743 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/625ed164-d2cc-45d3-b977-eea43da4cf51-config" (OuterVolumeSpecName: "config") pod "625ed164-d2cc-45d3-b977-eea43da4cf51" (UID: "625ed164-d2cc-45d3-b977-eea43da4cf51"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:16:53 crc kubenswrapper[4823]: I1216 07:16:53.554013 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/625ed164-d2cc-45d3-b977-eea43da4cf51-kube-api-access-jfc29" (OuterVolumeSpecName: "kube-api-access-jfc29") pod "625ed164-d2cc-45d3-b977-eea43da4cf51" (UID: "625ed164-d2cc-45d3-b977-eea43da4cf51"). InnerVolumeSpecName "kube-api-access-jfc29". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:16:53 crc kubenswrapper[4823]: I1216 07:16:53.646957 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vgvkw\" (UniqueName: \"kubernetes.io/projected/36de76e9-6942-4148-a52f-423e6b0b2a18-kube-api-access-vgvkw\") pod \"36de76e9-6942-4148-a52f-423e6b0b2a18\" (UID: \"36de76e9-6942-4148-a52f-423e6b0b2a18\") " Dec 16 07:16:53 crc kubenswrapper[4823]: I1216 07:16:53.647105 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36de76e9-6942-4148-a52f-423e6b0b2a18-config\") pod \"36de76e9-6942-4148-a52f-423e6b0b2a18\" (UID: \"36de76e9-6942-4148-a52f-423e6b0b2a18\") " Dec 16 07:16:53 crc kubenswrapper[4823]: I1216 07:16:53.647135 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/36de76e9-6942-4148-a52f-423e6b0b2a18-dns-svc\") pod \"36de76e9-6942-4148-a52f-423e6b0b2a18\" (UID: \"36de76e9-6942-4148-a52f-423e6b0b2a18\") " Dec 16 07:16:53 crc kubenswrapper[4823]: I1216 07:16:53.647794 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36de76e9-6942-4148-a52f-423e6b0b2a18-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "36de76e9-6942-4148-a52f-423e6b0b2a18" (UID: "36de76e9-6942-4148-a52f-423e6b0b2a18"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:16:53 crc kubenswrapper[4823]: I1216 07:16:53.647786 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36de76e9-6942-4148-a52f-423e6b0b2a18-config" (OuterVolumeSpecName: "config") pod "36de76e9-6942-4148-a52f-423e6b0b2a18" (UID: "36de76e9-6942-4148-a52f-423e6b0b2a18"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:16:53 crc kubenswrapper[4823]: I1216 07:16:53.647952 4823 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/625ed164-d2cc-45d3-b977-eea43da4cf51-config\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:53 crc kubenswrapper[4823]: I1216 07:16:53.647987 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jfc29\" (UniqueName: \"kubernetes.io/projected/625ed164-d2cc-45d3-b977-eea43da4cf51-kube-api-access-jfc29\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:53 crc kubenswrapper[4823]: I1216 07:16:53.649700 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36de76e9-6942-4148-a52f-423e6b0b2a18-kube-api-access-vgvkw" (OuterVolumeSpecName: "kube-api-access-vgvkw") pod "36de76e9-6942-4148-a52f-423e6b0b2a18" (UID: "36de76e9-6942-4148-a52f-423e6b0b2a18"). InnerVolumeSpecName "kube-api-access-vgvkw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:16:53 crc kubenswrapper[4823]: I1216 07:16:53.749807 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vgvkw\" (UniqueName: \"kubernetes.io/projected/36de76e9-6942-4148-a52f-423e6b0b2a18-kube-api-access-vgvkw\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:53 crc kubenswrapper[4823]: I1216 07:16:53.749857 4823 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36de76e9-6942-4148-a52f-423e6b0b2a18-config\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:53 crc kubenswrapper[4823]: I1216 07:16:53.749868 4823 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/36de76e9-6942-4148-a52f-423e6b0b2a18-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:53 crc kubenswrapper[4823]: E1216 07:16:53.924247 4823 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.k8s.io/kube-state-metrics/kube-state-metrics@sha256:db384bf43222b066c378e77027a675d4cd9911107adba46c2922b3a55e10d6fb" Dec 16 07:16:53 crc kubenswrapper[4823]: E1216 07:16:53.924560 4823 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.k8s.io/kube-state-metrics/kube-state-metrics@sha256:db384bf43222b066c378e77027a675d4cd9911107adba46c2922b3a55e10d6fb" Dec 16 07:16:53 crc kubenswrapper[4823]: E1216 07:16:53.924726 4823 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-state-metrics,Image:registry.k8s.io/kube-state-metrics/kube-state-metrics@sha256:db384bf43222b066c378e77027a675d4cd9911107adba46c2922b3a55e10d6fb,Command:[],Args:[--resources=pods 
--namespaces=openstack],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:http-metrics,HostPort:0,ContainerPort:8080,Protocol:TCP,HostIP:,},ContainerPort{Name:telemetry,HostPort:0,ContainerPort:8081,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-xllpk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/livez,Port:{0 8080 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod kube-state-metrics-0_openstack(8db2b8b4-03e8-4ae0-875d-5f3a6414d0e0): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying 
config: context canceled" logger="UnhandledError" Dec 16 07:16:53 crc kubenswrapper[4823]: E1216 07:16:53.925943 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openstack/kube-state-metrics-0" podUID="8db2b8b4-03e8-4ae0-875d-5f3a6414d0e0" Dec 16 07:16:53 crc kubenswrapper[4823]: I1216 07:16:53.972910 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-29jcz" event={"ID":"4edb9072-dfce-44ca-88d3-64136ac7e1c3","Type":"ContainerStarted","Data":"235232dd5c4000cb81bcdc3a65e84dd1780c277e0d516b25d57d5ed080d7f45e"} Dec 16 07:16:53 crc kubenswrapper[4823]: I1216 07:16:53.976965 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-fvqqp" event={"ID":"5fe879e4-70bf-4f38-a4a7-98f5eb23a769","Type":"ContainerStarted","Data":"a2b81c76e6cce262197b4a0317a6d19dc1b5e9e49d911f38fef703c2a4247695"} Dec 16 07:16:53 crc kubenswrapper[4823]: I1216 07:16:53.988347 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84bb9d8bd9-7m8m9" event={"ID":"625ed164-d2cc-45d3-b977-eea43da4cf51","Type":"ContainerDied","Data":"73ec00f5f63dc8f18c73b267a6b980da194899a2b29f7ca4cdd76a8d2342e325"} Dec 16 07:16:53 crc kubenswrapper[4823]: I1216 07:16:53.988446 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84bb9d8bd9-7m8m9" Dec 16 07:16:54 crc kubenswrapper[4823]: I1216 07:16:54.003706 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f854695bc-kt72s" event={"ID":"36de76e9-6942-4148-a52f-423e6b0b2a18","Type":"ContainerDied","Data":"d5050ef95a36ebf6f243e0068f730d75c97df989f3f962a82c6369df577aa6c0"} Dec 16 07:16:54 crc kubenswrapper[4823]: I1216 07:16:54.003822 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f854695bc-kt72s" Dec 16 07:16:54 crc kubenswrapper[4823]: I1216 07:16:54.011474 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"dbcff04b-7d0d-45b4-bc28-7882421c6000","Type":"ContainerStarted","Data":"2a7d0f9a298fcb19a0fa4b0b8135003ee9e99ea4befbcc1f9f084602c0766551"} Dec 16 07:16:54 crc kubenswrapper[4823]: I1216 07:16:54.015190 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"b566f9ee-8a75-4041-aac4-1573ca610541","Type":"ContainerStarted","Data":"51f23d92b9b14cdf5a284d17abcda28f72a9586a5fe031540b37af42aff48a7c"} Dec 16 07:16:54 crc kubenswrapper[4823]: E1216 07:16:54.029542 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.k8s.io/kube-state-metrics/kube-state-metrics@sha256:db384bf43222b066c378e77027a675d4cd9911107adba46c2922b3a55e10d6fb\\\"\"" pod="openstack/kube-state-metrics-0" podUID="8db2b8b4-03e8-4ae0-875d-5f3a6414d0e0" Dec 16 07:16:54 crc kubenswrapper[4823]: I1216 07:16:54.060744 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f854695bc-kt72s"] Dec 16 07:16:54 crc kubenswrapper[4823]: I1216 07:16:54.077992 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5f854695bc-kt72s"] Dec 16 07:16:54 crc kubenswrapper[4823]: I1216 07:16:54.099614 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84bb9d8bd9-7m8m9"] Dec 16 07:16:54 crc kubenswrapper[4823]: I1216 07:16:54.107195 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-84bb9d8bd9-7m8m9"] Dec 16 07:16:54 crc kubenswrapper[4823]: I1216 07:16:54.201746 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 16 07:16:55 crc kubenswrapper[4823]: I1216 07:16:55.038017 
4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"45a2fe80-7cf2-4419-91c9-3c958d33d5a8","Type":"ContainerStarted","Data":"7fdc30c61c04e114057f4a9e2d6e7879f0b0f75c9c3a5cf2549057bada61a9f0"} Dec 16 07:16:55 crc kubenswrapper[4823]: I1216 07:16:55.039528 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"603d469a-39a2-4d84-87cb-f2c7499b7a28","Type":"ContainerStarted","Data":"0bf498f98b17c62cf40b0e4da2f105da1f7c94abfdaa144c0064b688a016bc7c"} Dec 16 07:16:55 crc kubenswrapper[4823]: I1216 07:16:55.791402 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36de76e9-6942-4148-a52f-423e6b0b2a18" path="/var/lib/kubelet/pods/36de76e9-6942-4148-a52f-423e6b0b2a18/volumes" Dec 16 07:16:55 crc kubenswrapper[4823]: I1216 07:16:55.792189 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="625ed164-d2cc-45d3-b977-eea43da4cf51" path="/var/lib/kubelet/pods/625ed164-d2cc-45d3-b977-eea43da4cf51/volumes" Dec 16 07:16:58 crc kubenswrapper[4823]: I1216 07:16:58.067799 4823 generic.go:334] "Generic (PLEG): container finished" podID="dbcff04b-7d0d-45b4-bc28-7882421c6000" containerID="2a7d0f9a298fcb19a0fa4b0b8135003ee9e99ea4befbcc1f9f084602c0766551" exitCode=0 Dec 16 07:16:58 crc kubenswrapper[4823]: I1216 07:16:58.068416 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"dbcff04b-7d0d-45b4-bc28-7882421c6000","Type":"ContainerDied","Data":"2a7d0f9a298fcb19a0fa4b0b8135003ee9e99ea4befbcc1f9f084602c0766551"} Dec 16 07:16:58 crc kubenswrapper[4823]: I1216 07:16:58.070732 4823 generic.go:334] "Generic (PLEG): container finished" podID="45a2fe80-7cf2-4419-91c9-3c958d33d5a8" containerID="7fdc30c61c04e114057f4a9e2d6e7879f0b0f75c9c3a5cf2549057bada61a9f0" exitCode=0 Dec 16 07:16:58 crc kubenswrapper[4823]: I1216 07:16:58.070795 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/openstack-cell1-galera-0" event={"ID":"45a2fe80-7cf2-4419-91c9-3c958d33d5a8","Type":"ContainerDied","Data":"7fdc30c61c04e114057f4a9e2d6e7879f0b0f75c9c3a5cf2549057bada61a9f0"} Dec 16 07:16:58 crc kubenswrapper[4823]: I1216 07:16:58.073201 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"b566f9ee-8a75-4041-aac4-1573ca610541","Type":"ContainerStarted","Data":"58ecdd2ed593319a070e731bb8ab19823f35297f4e564693222927a67dcf6a7b"} Dec 16 07:16:58 crc kubenswrapper[4823]: I1216 07:16:58.075062 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-29jcz" event={"ID":"4edb9072-dfce-44ca-88d3-64136ac7e1c3","Type":"ContainerStarted","Data":"4e9fde7fbe0438d93c11da4a80083cc4d8cfbd62271f2ea000e704d3bf65f337"} Dec 16 07:16:58 crc kubenswrapper[4823]: I1216 07:16:58.082932 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-fvqqp" event={"ID":"5fe879e4-70bf-4f38-a4a7-98f5eb23a769","Type":"ContainerStarted","Data":"0a921f2564997ac6138f43048f8c54eaec84f63860087b54d2b224f1860777a9"} Dec 16 07:16:58 crc kubenswrapper[4823]: I1216 07:16:58.083253 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-fvqqp" Dec 16 07:16:58 crc kubenswrapper[4823]: I1216 07:16:58.091333 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"603d469a-39a2-4d84-87cb-f2c7499b7a28","Type":"ContainerStarted","Data":"90bb6f7603a93a35c6ff65c8dd4f67d20079e1d4acfdcadb0ec6ae63addd6404"} Dec 16 07:16:58 crc kubenswrapper[4823]: I1216 07:16:58.134198 4823 patch_prober.go:28] interesting pod/machine-config-daemon-fv56f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 07:16:58 crc kubenswrapper[4823]: I1216 07:16:58.134261 
4823 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 07:16:58 crc kubenswrapper[4823]: I1216 07:16:58.155324 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-fvqqp" podStartSLOduration=25.120558631 podStartE2EDuration="29.155280866s" podCreationTimestamp="2025-12-16 07:16:29 +0000 UTC" firstStartedPulling="2025-12-16 07:16:53.415223032 +0000 UTC m=+1291.903789145" lastFinishedPulling="2025-12-16 07:16:57.449945257 +0000 UTC m=+1295.938511380" observedRunningTime="2025-12-16 07:16:58.127579819 +0000 UTC m=+1296.616145942" watchObservedRunningTime="2025-12-16 07:16:58.155280866 +0000 UTC m=+1296.643847009" Dec 16 07:16:59 crc kubenswrapper[4823]: I1216 07:16:59.100339 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"a686a945-8fa0-406c-ac01-cf061c865a28","Type":"ContainerStarted","Data":"0639ca39d4b510f82c5a92153f15cb0546ff06018f3f66e0dd1e7b8d07959478"} Dec 16 07:16:59 crc kubenswrapper[4823]: I1216 07:16:59.102923 4823 generic.go:334] "Generic (PLEG): container finished" podID="4edb9072-dfce-44ca-88d3-64136ac7e1c3" containerID="4e9fde7fbe0438d93c11da4a80083cc4d8cfbd62271f2ea000e704d3bf65f337" exitCode=0 Dec 16 07:16:59 crc kubenswrapper[4823]: I1216 07:16:59.102984 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-29jcz" event={"ID":"4edb9072-dfce-44ca-88d3-64136ac7e1c3","Type":"ContainerDied","Data":"4e9fde7fbe0438d93c11da4a80083cc4d8cfbd62271f2ea000e704d3bf65f337"} Dec 16 07:16:59 crc kubenswrapper[4823]: I1216 07:16:59.105376 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" 
event={"ID":"dbcff04b-7d0d-45b4-bc28-7882421c6000","Type":"ContainerStarted","Data":"310505b16221fa383a21293252ba7eeff5b379b06f829c1276c31a12ac010ede"} Dec 16 07:16:59 crc kubenswrapper[4823]: I1216 07:16:59.110140 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"45a2fe80-7cf2-4419-91c9-3c958d33d5a8","Type":"ContainerStarted","Data":"28af7097fe36966795ffd4f08fbf3fc9b6142fd27eb3db8592b4ce75e52927e8"} Dec 16 07:16:59 crc kubenswrapper[4823]: I1216 07:16:59.172470 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=9.835341022 podStartE2EDuration="37.172452465s" podCreationTimestamp="2025-12-16 07:16:22 +0000 UTC" firstStartedPulling="2025-12-16 07:16:25.267594849 +0000 UTC m=+1263.756160962" lastFinishedPulling="2025-12-16 07:16:52.604706292 +0000 UTC m=+1291.093272405" observedRunningTime="2025-12-16 07:16:59.16779746 +0000 UTC m=+1297.656363583" watchObservedRunningTime="2025-12-16 07:16:59.172452465 +0000 UTC m=+1297.661018588" Dec 16 07:16:59 crc kubenswrapper[4823]: I1216 07:16:59.210467 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=10.480828554 podStartE2EDuration="38.210445554s" podCreationTimestamp="2025-12-16 07:16:21 +0000 UTC" firstStartedPulling="2025-12-16 07:16:23.776858995 +0000 UTC m=+1262.265425118" lastFinishedPulling="2025-12-16 07:16:51.506475995 +0000 UTC m=+1289.995042118" observedRunningTime="2025-12-16 07:16:59.207466542 +0000 UTC m=+1297.696032675" watchObservedRunningTime="2025-12-16 07:16:59.210445554 +0000 UTC m=+1297.699011677" Dec 16 07:17:00 crc kubenswrapper[4823]: I1216 07:17:00.122391 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-29jcz" 
event={"ID":"4edb9072-dfce-44ca-88d3-64136ac7e1c3","Type":"ContainerStarted","Data":"143890d9503ac11d18cb9ffe222557c6fbf01e56e0ee7fe6c9718deb211756f0"} Dec 16 07:17:00 crc kubenswrapper[4823]: I1216 07:17:00.122858 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-29jcz" Dec 16 07:17:00 crc kubenswrapper[4823]: I1216 07:17:00.122878 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-29jcz" event={"ID":"4edb9072-dfce-44ca-88d3-64136ac7e1c3","Type":"ContainerStarted","Data":"a75ddf05569eb842ee7ced1c6941dc65b4fda1943932e8021d1066d4d5ed3578"} Dec 16 07:17:01 crc kubenswrapper[4823]: I1216 07:17:01.129945 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-29jcz" Dec 16 07:17:01 crc kubenswrapper[4823]: I1216 07:17:01.879237 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-29jcz" podStartSLOduration=28.948289277 podStartE2EDuration="32.879216003s" podCreationTimestamp="2025-12-16 07:16:29 +0000 UTC" firstStartedPulling="2025-12-16 07:16:53.516390279 +0000 UTC m=+1292.004956402" lastFinishedPulling="2025-12-16 07:16:57.447317005 +0000 UTC m=+1295.935883128" observedRunningTime="2025-12-16 07:17:00.154410243 +0000 UTC m=+1298.642976366" watchObservedRunningTime="2025-12-16 07:17:01.879216003 +0000 UTC m=+1300.367782126" Dec 16 07:17:02 crc kubenswrapper[4823]: I1216 07:17:02.999417 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Dec 16 07:17:02 crc kubenswrapper[4823]: I1216 07:17:02.999774 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Dec 16 07:17:03 crc kubenswrapper[4823]: I1216 07:17:03.146807 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1","Type":"ContainerStarted","Data":"bdecbd186c280c8e1a08344d429387d2b5b9ce5dc22f4986496eacc03840a6ae"} Dec 16 07:17:03 crc kubenswrapper[4823]: I1216 07:17:03.148848 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"b566f9ee-8a75-4041-aac4-1573ca610541","Type":"ContainerStarted","Data":"ad6913219d4e64984189276d714fe66372819d7e73a5bd2b7c37eef8a55f9181"} Dec 16 07:17:03 crc kubenswrapper[4823]: I1216 07:17:03.150582 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"603d469a-39a2-4d84-87cb-f2c7499b7a28","Type":"ContainerStarted","Data":"06ecae0f130331b9c70dbb4604848fad60c6fe33be915c08a2e497633d78988f"} Dec 16 07:17:03 crc kubenswrapper[4823]: I1216 07:17:03.156929 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"3eee92de-9c0e-4afd-8a27-52d82caa27ad","Type":"ContainerStarted","Data":"f726deb4280e9246c48b014e13b7b17cc95089d7bc4863e4e768298ed64067ba"} Dec 16 07:17:03 crc kubenswrapper[4823]: I1216 07:17:03.157197 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Dec 16 07:17:03 crc kubenswrapper[4823]: I1216 07:17:03.226515 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=25.340207447 podStartE2EDuration="34.226477985s" podCreationTimestamp="2025-12-16 07:16:29 +0000 UTC" firstStartedPulling="2025-12-16 07:16:53.417851155 +0000 UTC m=+1291.906417278" lastFinishedPulling="2025-12-16 07:17:02.304121693 +0000 UTC m=+1300.792687816" observedRunningTime="2025-12-16 07:17:03.222719467 +0000 UTC m=+1301.711285590" watchObservedRunningTime="2025-12-16 07:17:03.226477985 +0000 UTC m=+1301.715044108" Dec 16 07:17:03 crc kubenswrapper[4823]: I1216 07:17:03.281250 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/ovsdbserver-sb-0" podStartSLOduration=23.207022289 podStartE2EDuration="31.281233728s" podCreationTimestamp="2025-12-16 07:16:32 +0000 UTC" firstStartedPulling="2025-12-16 07:16:54.217719043 +0000 UTC m=+1292.706285176" lastFinishedPulling="2025-12-16 07:17:02.291930492 +0000 UTC m=+1300.780496615" observedRunningTime="2025-12-16 07:17:03.256866186 +0000 UTC m=+1301.745432329" watchObservedRunningTime="2025-12-16 07:17:03.281233728 +0000 UTC m=+1301.769799851" Dec 16 07:17:03 crc kubenswrapper[4823]: I1216 07:17:03.377147 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Dec 16 07:17:03 crc kubenswrapper[4823]: I1216 07:17:03.399502 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=3.019290868 podStartE2EDuration="40.39947603s" podCreationTimestamp="2025-12-16 07:16:23 +0000 UTC" firstStartedPulling="2025-12-16 07:16:24.913551166 +0000 UTC m=+1263.402117289" lastFinishedPulling="2025-12-16 07:17:02.293736328 +0000 UTC m=+1300.782302451" observedRunningTime="2025-12-16 07:17:03.286748481 +0000 UTC m=+1301.775314614" watchObservedRunningTime="2025-12-16 07:17:03.39947603 +0000 UTC m=+1301.888042153" Dec 16 07:17:03 crc kubenswrapper[4823]: I1216 07:17:03.443753 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Dec 16 07:17:03 crc kubenswrapper[4823]: I1216 07:17:03.753075 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Dec 16 07:17:03 crc kubenswrapper[4823]: I1216 07:17:03.840506 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Dec 16 07:17:03 crc kubenswrapper[4823]: I1216 07:17:03.871838 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Dec 16 07:17:03 crc kubenswrapper[4823]: I1216 
07:17:03.872144 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Dec 16 07:17:03 crc kubenswrapper[4823]: I1216 07:17:03.923114 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Dec 16 07:17:04 crc kubenswrapper[4823]: I1216 07:17:04.164606 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Dec 16 07:17:04 crc kubenswrapper[4823]: I1216 07:17:04.200148 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Dec 16 07:17:04 crc kubenswrapper[4823]: I1216 07:17:04.207567 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Dec 16 07:17:04 crc kubenswrapper[4823]: I1216 07:17:04.354616 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Dec 16 07:17:04 crc kubenswrapper[4823]: I1216 07:17:04.355572 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Dec 16 07:17:04 crc kubenswrapper[4823]: I1216 07:17:04.374487 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-95f5f6995-jqbcz"] Dec 16 07:17:04 crc kubenswrapper[4823]: I1216 07:17:04.421755 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b79764b65-ndqjh"] Dec 16 07:17:04 crc kubenswrapper[4823]: I1216 07:17:04.423261 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b79764b65-ndqjh" Dec 16 07:17:04 crc kubenswrapper[4823]: I1216 07:17:04.430291 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Dec 16 07:17:04 crc kubenswrapper[4823]: I1216 07:17:04.437570 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b79764b65-ndqjh"] Dec 16 07:17:04 crc kubenswrapper[4823]: I1216 07:17:04.446190 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-956hc"] Dec 16 07:17:04 crc kubenswrapper[4823]: I1216 07:17:04.447315 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-956hc" Dec 16 07:17:04 crc kubenswrapper[4823]: I1216 07:17:04.451168 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Dec 16 07:17:04 crc kubenswrapper[4823]: I1216 07:17:04.513391 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-956hc"] Dec 16 07:17:04 crc kubenswrapper[4823]: I1216 07:17:04.554293 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/efe17b2e-19bd-430b-8cb5-147ed1d2ffb6-ovs-rundir\") pod \"ovn-controller-metrics-956hc\" (UID: \"efe17b2e-19bd-430b-8cb5-147ed1d2ffb6\") " pod="openstack/ovn-controller-metrics-956hc" Dec 16 07:17:04 crc kubenswrapper[4823]: I1216 07:17:04.554577 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/45d886e8-673c-4192-9a4d-b507ecc835ac-dns-svc\") pod \"dnsmasq-dns-5b79764b65-ndqjh\" (UID: \"45d886e8-673c-4192-9a4d-b507ecc835ac\") " pod="openstack/dnsmasq-dns-5b79764b65-ndqjh" Dec 16 07:17:04 crc kubenswrapper[4823]: I1216 07:17:04.554712 4823 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/efe17b2e-19bd-430b-8cb5-147ed1d2ffb6-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-956hc\" (UID: \"efe17b2e-19bd-430b-8cb5-147ed1d2ffb6\") " pod="openstack/ovn-controller-metrics-956hc" Dec 16 07:17:04 crc kubenswrapper[4823]: I1216 07:17:04.554880 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/45d886e8-673c-4192-9a4d-b507ecc835ac-ovsdbserver-sb\") pod \"dnsmasq-dns-5b79764b65-ndqjh\" (UID: \"45d886e8-673c-4192-9a4d-b507ecc835ac\") " pod="openstack/dnsmasq-dns-5b79764b65-ndqjh" Dec 16 07:17:04 crc kubenswrapper[4823]: I1216 07:17:04.555123 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5kgs\" (UniqueName: \"kubernetes.io/projected/45d886e8-673c-4192-9a4d-b507ecc835ac-kube-api-access-f5kgs\") pod \"dnsmasq-dns-5b79764b65-ndqjh\" (UID: \"45d886e8-673c-4192-9a4d-b507ecc835ac\") " pod="openstack/dnsmasq-dns-5b79764b65-ndqjh" Dec 16 07:17:04 crc kubenswrapper[4823]: I1216 07:17:04.555296 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efe17b2e-19bd-430b-8cb5-147ed1d2ffb6-combined-ca-bundle\") pod \"ovn-controller-metrics-956hc\" (UID: \"efe17b2e-19bd-430b-8cb5-147ed1d2ffb6\") " pod="openstack/ovn-controller-metrics-956hc" Dec 16 07:17:04 crc kubenswrapper[4823]: I1216 07:17:04.555441 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/efe17b2e-19bd-430b-8cb5-147ed1d2ffb6-ovn-rundir\") pod \"ovn-controller-metrics-956hc\" (UID: \"efe17b2e-19bd-430b-8cb5-147ed1d2ffb6\") " pod="openstack/ovn-controller-metrics-956hc" Dec 16 07:17:04 crc 
kubenswrapper[4823]: I1216 07:17:04.555552 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/efe17b2e-19bd-430b-8cb5-147ed1d2ffb6-config\") pod \"ovn-controller-metrics-956hc\" (UID: \"efe17b2e-19bd-430b-8cb5-147ed1d2ffb6\") " pod="openstack/ovn-controller-metrics-956hc" Dec 16 07:17:04 crc kubenswrapper[4823]: I1216 07:17:04.555669 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45d886e8-673c-4192-9a4d-b507ecc835ac-config\") pod \"dnsmasq-dns-5b79764b65-ndqjh\" (UID: \"45d886e8-673c-4192-9a4d-b507ecc835ac\") " pod="openstack/dnsmasq-dns-5b79764b65-ndqjh" Dec 16 07:17:04 crc kubenswrapper[4823]: I1216 07:17:04.555774 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5pmm\" (UniqueName: \"kubernetes.io/projected/efe17b2e-19bd-430b-8cb5-147ed1d2ffb6-kube-api-access-b5pmm\") pod \"ovn-controller-metrics-956hc\" (UID: \"efe17b2e-19bd-430b-8cb5-147ed1d2ffb6\") " pod="openstack/ovn-controller-metrics-956hc" Dec 16 07:17:04 crc kubenswrapper[4823]: I1216 07:17:04.602785 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-c7cbb8f79-tl9kf"] Dec 16 07:17:04 crc kubenswrapper[4823]: I1216 07:17:04.668008 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-586b989cdc-9fb9c"] Dec 16 07:17:04 crc kubenswrapper[4823]: I1216 07:17:04.673914 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-586b989cdc-9fb9c" Dec 16 07:17:04 crc kubenswrapper[4823]: I1216 07:17:04.679720 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Dec 16 07:17:04 crc kubenswrapper[4823]: I1216 07:17:04.726911 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efe17b2e-19bd-430b-8cb5-147ed1d2ffb6-combined-ca-bundle\") pod \"ovn-controller-metrics-956hc\" (UID: \"efe17b2e-19bd-430b-8cb5-147ed1d2ffb6\") " pod="openstack/ovn-controller-metrics-956hc" Dec 16 07:17:04 crc kubenswrapper[4823]: I1216 07:17:04.727006 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/efe17b2e-19bd-430b-8cb5-147ed1d2ffb6-ovn-rundir\") pod \"ovn-controller-metrics-956hc\" (UID: \"efe17b2e-19bd-430b-8cb5-147ed1d2ffb6\") " pod="openstack/ovn-controller-metrics-956hc" Dec 16 07:17:04 crc kubenswrapper[4823]: I1216 07:17:04.727071 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/efe17b2e-19bd-430b-8cb5-147ed1d2ffb6-config\") pod \"ovn-controller-metrics-956hc\" (UID: \"efe17b2e-19bd-430b-8cb5-147ed1d2ffb6\") " pod="openstack/ovn-controller-metrics-956hc" Dec 16 07:17:04 crc kubenswrapper[4823]: I1216 07:17:04.727112 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45d886e8-673c-4192-9a4d-b507ecc835ac-config\") pod \"dnsmasq-dns-5b79764b65-ndqjh\" (UID: \"45d886e8-673c-4192-9a4d-b507ecc835ac\") " pod="openstack/dnsmasq-dns-5b79764b65-ndqjh" Dec 16 07:17:04 crc kubenswrapper[4823]: I1216 07:17:04.727160 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5pmm\" (UniqueName: 
\"kubernetes.io/projected/efe17b2e-19bd-430b-8cb5-147ed1d2ffb6-kube-api-access-b5pmm\") pod \"ovn-controller-metrics-956hc\" (UID: \"efe17b2e-19bd-430b-8cb5-147ed1d2ffb6\") " pod="openstack/ovn-controller-metrics-956hc" Dec 16 07:17:04 crc kubenswrapper[4823]: I1216 07:17:04.727225 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/efe17b2e-19bd-430b-8cb5-147ed1d2ffb6-ovs-rundir\") pod \"ovn-controller-metrics-956hc\" (UID: \"efe17b2e-19bd-430b-8cb5-147ed1d2ffb6\") " pod="openstack/ovn-controller-metrics-956hc" Dec 16 07:17:04 crc kubenswrapper[4823]: I1216 07:17:04.727276 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/45d886e8-673c-4192-9a4d-b507ecc835ac-dns-svc\") pod \"dnsmasq-dns-5b79764b65-ndqjh\" (UID: \"45d886e8-673c-4192-9a4d-b507ecc835ac\") " pod="openstack/dnsmasq-dns-5b79764b65-ndqjh" Dec 16 07:17:04 crc kubenswrapper[4823]: I1216 07:17:04.727372 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/efe17b2e-19bd-430b-8cb5-147ed1d2ffb6-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-956hc\" (UID: \"efe17b2e-19bd-430b-8cb5-147ed1d2ffb6\") " pod="openstack/ovn-controller-metrics-956hc" Dec 16 07:17:04 crc kubenswrapper[4823]: I1216 07:17:04.727410 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/45d886e8-673c-4192-9a4d-b507ecc835ac-ovsdbserver-sb\") pod \"dnsmasq-dns-5b79764b65-ndqjh\" (UID: \"45d886e8-673c-4192-9a4d-b507ecc835ac\") " pod="openstack/dnsmasq-dns-5b79764b65-ndqjh" Dec 16 07:17:04 crc kubenswrapper[4823]: I1216 07:17:04.728995 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: 
\"kubernetes.io/host-path/efe17b2e-19bd-430b-8cb5-147ed1d2ffb6-ovn-rundir\") pod \"ovn-controller-metrics-956hc\" (UID: \"efe17b2e-19bd-430b-8cb5-147ed1d2ffb6\") " pod="openstack/ovn-controller-metrics-956hc" Dec 16 07:17:04 crc kubenswrapper[4823]: I1216 07:17:04.729869 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/efe17b2e-19bd-430b-8cb5-147ed1d2ffb6-config\") pod \"ovn-controller-metrics-956hc\" (UID: \"efe17b2e-19bd-430b-8cb5-147ed1d2ffb6\") " pod="openstack/ovn-controller-metrics-956hc" Dec 16 07:17:04 crc kubenswrapper[4823]: I1216 07:17:04.728485 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5kgs\" (UniqueName: \"kubernetes.io/projected/45d886e8-673c-4192-9a4d-b507ecc835ac-kube-api-access-f5kgs\") pod \"dnsmasq-dns-5b79764b65-ndqjh\" (UID: \"45d886e8-673c-4192-9a4d-b507ecc835ac\") " pod="openstack/dnsmasq-dns-5b79764b65-ndqjh" Dec 16 07:17:04 crc kubenswrapper[4823]: I1216 07:17:04.732190 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/45d886e8-673c-4192-9a4d-b507ecc835ac-ovsdbserver-sb\") pod \"dnsmasq-dns-5b79764b65-ndqjh\" (UID: \"45d886e8-673c-4192-9a4d-b507ecc835ac\") " pod="openstack/dnsmasq-dns-5b79764b65-ndqjh" Dec 16 07:17:04 crc kubenswrapper[4823]: I1216 07:17:04.739957 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/efe17b2e-19bd-430b-8cb5-147ed1d2ffb6-ovs-rundir\") pod \"ovn-controller-metrics-956hc\" (UID: \"efe17b2e-19bd-430b-8cb5-147ed1d2ffb6\") " pod="openstack/ovn-controller-metrics-956hc" Dec 16 07:17:04 crc kubenswrapper[4823]: I1216 07:17:04.740630 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/45d886e8-673c-4192-9a4d-b507ecc835ac-dns-svc\") pod 
\"dnsmasq-dns-5b79764b65-ndqjh\" (UID: \"45d886e8-673c-4192-9a4d-b507ecc835ac\") " pod="openstack/dnsmasq-dns-5b79764b65-ndqjh" Dec 16 07:17:04 crc kubenswrapper[4823]: I1216 07:17:04.741557 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45d886e8-673c-4192-9a4d-b507ecc835ac-config\") pod \"dnsmasq-dns-5b79764b65-ndqjh\" (UID: \"45d886e8-673c-4192-9a4d-b507ecc835ac\") " pod="openstack/dnsmasq-dns-5b79764b65-ndqjh" Dec 16 07:17:04 crc kubenswrapper[4823]: I1216 07:17:04.747628 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/efe17b2e-19bd-430b-8cb5-147ed1d2ffb6-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-956hc\" (UID: \"efe17b2e-19bd-430b-8cb5-147ed1d2ffb6\") " pod="openstack/ovn-controller-metrics-956hc" Dec 16 07:17:04 crc kubenswrapper[4823]: I1216 07:17:04.748182 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efe17b2e-19bd-430b-8cb5-147ed1d2ffb6-combined-ca-bundle\") pod \"ovn-controller-metrics-956hc\" (UID: \"efe17b2e-19bd-430b-8cb5-147ed1d2ffb6\") " pod="openstack/ovn-controller-metrics-956hc" Dec 16 07:17:04 crc kubenswrapper[4823]: I1216 07:17:04.771543 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5kgs\" (UniqueName: \"kubernetes.io/projected/45d886e8-673c-4192-9a4d-b507ecc835ac-kube-api-access-f5kgs\") pod \"dnsmasq-dns-5b79764b65-ndqjh\" (UID: \"45d886e8-673c-4192-9a4d-b507ecc835ac\") " pod="openstack/dnsmasq-dns-5b79764b65-ndqjh" Dec 16 07:17:04 crc kubenswrapper[4823]: I1216 07:17:04.782972 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Dec 16 07:17:04 crc kubenswrapper[4823]: I1216 07:17:04.792085 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Dec 16 07:17:04 crc kubenswrapper[4823]: I1216 07:17:04.793532 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-586b989cdc-9fb9c"] Dec 16 07:17:04 crc kubenswrapper[4823]: I1216 07:17:04.794832 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Dec 16 07:17:04 crc kubenswrapper[4823]: I1216 07:17:04.796485 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Dec 16 07:17:04 crc kubenswrapper[4823]: I1216 07:17:04.797090 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Dec 16 07:17:04 crc kubenswrapper[4823]: I1216 07:17:04.797213 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-97m6r" Dec 16 07:17:04 crc kubenswrapper[4823]: I1216 07:17:04.799478 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b79764b65-ndqjh" Dec 16 07:17:04 crc kubenswrapper[4823]: I1216 07:17:04.806641 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5pmm\" (UniqueName: \"kubernetes.io/projected/efe17b2e-19bd-430b-8cb5-147ed1d2ffb6-kube-api-access-b5pmm\") pod \"ovn-controller-metrics-956hc\" (UID: \"efe17b2e-19bd-430b-8cb5-147ed1d2ffb6\") " pod="openstack/ovn-controller-metrics-956hc" Dec 16 07:17:04 crc kubenswrapper[4823]: I1216 07:17:04.809796 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-956hc" Dec 16 07:17:04 crc kubenswrapper[4823]: I1216 07:17:04.814088 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 16 07:17:04 crc kubenswrapper[4823]: I1216 07:17:04.851517 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p57nq\" (UniqueName: \"kubernetes.io/projected/9f3f8b57-525e-4d99-95c5-a4f41b0329c3-kube-api-access-p57nq\") pod \"dnsmasq-dns-586b989cdc-9fb9c\" (UID: \"9f3f8b57-525e-4d99-95c5-a4f41b0329c3\") " pod="openstack/dnsmasq-dns-586b989cdc-9fb9c" Dec 16 07:17:04 crc kubenswrapper[4823]: I1216 07:17:04.851594 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9f3f8b57-525e-4d99-95c5-a4f41b0329c3-ovsdbserver-sb\") pod \"dnsmasq-dns-586b989cdc-9fb9c\" (UID: \"9f3f8b57-525e-4d99-95c5-a4f41b0329c3\") " pod="openstack/dnsmasq-dns-586b989cdc-9fb9c" Dec 16 07:17:04 crc kubenswrapper[4823]: I1216 07:17:04.851658 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f3f8b57-525e-4d99-95c5-a4f41b0329c3-config\") pod \"dnsmasq-dns-586b989cdc-9fb9c\" (UID: \"9f3f8b57-525e-4d99-95c5-a4f41b0329c3\") " pod="openstack/dnsmasq-dns-586b989cdc-9fb9c" Dec 16 07:17:04 crc kubenswrapper[4823]: I1216 07:17:04.851719 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9f3f8b57-525e-4d99-95c5-a4f41b0329c3-dns-svc\") pod \"dnsmasq-dns-586b989cdc-9fb9c\" (UID: \"9f3f8b57-525e-4d99-95c5-a4f41b0329c3\") " pod="openstack/dnsmasq-dns-586b989cdc-9fb9c" Dec 16 07:17:04 crc kubenswrapper[4823]: I1216 07:17:04.851882 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9f3f8b57-525e-4d99-95c5-a4f41b0329c3-ovsdbserver-nb\") pod \"dnsmasq-dns-586b989cdc-9fb9c\" (UID: \"9f3f8b57-525e-4d99-95c5-a4f41b0329c3\") " pod="openstack/dnsmasq-dns-586b989cdc-9fb9c" Dec 16 07:17:04 crc kubenswrapper[4823]: I1216 07:17:04.929488 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-95f5f6995-jqbcz" Dec 16 07:17:04 crc kubenswrapper[4823]: I1216 07:17:04.954814 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9f3f8b57-525e-4d99-95c5-a4f41b0329c3-ovsdbserver-sb\") pod \"dnsmasq-dns-586b989cdc-9fb9c\" (UID: \"9f3f8b57-525e-4d99-95c5-a4f41b0329c3\") " pod="openstack/dnsmasq-dns-586b989cdc-9fb9c" Dec 16 07:17:04 crc kubenswrapper[4823]: I1216 07:17:04.955064 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f3f8b57-525e-4d99-95c5-a4f41b0329c3-config\") pod \"dnsmasq-dns-586b989cdc-9fb9c\" (UID: \"9f3f8b57-525e-4d99-95c5-a4f41b0329c3\") " pod="openstack/dnsmasq-dns-586b989cdc-9fb9c" Dec 16 07:17:04 crc kubenswrapper[4823]: I1216 07:17:04.955107 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/cfd02f05-0804-48c6-b9b4-cda88fd6b14a-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"cfd02f05-0804-48c6-b9b4-cda88fd6b14a\") " pod="openstack/ovn-northd-0" Dec 16 07:17:04 crc kubenswrapper[4823]: I1216 07:17:04.955126 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cfd02f05-0804-48c6-b9b4-cda88fd6b14a-scripts\") pod \"ovn-northd-0\" (UID: \"cfd02f05-0804-48c6-b9b4-cda88fd6b14a\") " pod="openstack/ovn-northd-0" Dec 16 07:17:04 crc kubenswrapper[4823]: I1216 
07:17:04.955144 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9f3f8b57-525e-4d99-95c5-a4f41b0329c3-dns-svc\") pod \"dnsmasq-dns-586b989cdc-9fb9c\" (UID: \"9f3f8b57-525e-4d99-95c5-a4f41b0329c3\") " pod="openstack/dnsmasq-dns-586b989cdc-9fb9c" Dec 16 07:17:04 crc kubenswrapper[4823]: I1216 07:17:04.955185 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9f3f8b57-525e-4d99-95c5-a4f41b0329c3-ovsdbserver-nb\") pod \"dnsmasq-dns-586b989cdc-9fb9c\" (UID: \"9f3f8b57-525e-4d99-95c5-a4f41b0329c3\") " pod="openstack/dnsmasq-dns-586b989cdc-9fb9c" Dec 16 07:17:04 crc kubenswrapper[4823]: I1216 07:17:04.955201 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7694w\" (UniqueName: \"kubernetes.io/projected/cfd02f05-0804-48c6-b9b4-cda88fd6b14a-kube-api-access-7694w\") pod \"ovn-northd-0\" (UID: \"cfd02f05-0804-48c6-b9b4-cda88fd6b14a\") " pod="openstack/ovn-northd-0" Dec 16 07:17:04 crc kubenswrapper[4823]: I1216 07:17:04.955219 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cfd02f05-0804-48c6-b9b4-cda88fd6b14a-config\") pod \"ovn-northd-0\" (UID: \"cfd02f05-0804-48c6-b9b4-cda88fd6b14a\") " pod="openstack/ovn-northd-0" Dec 16 07:17:04 crc kubenswrapper[4823]: I1216 07:17:04.955241 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/cfd02f05-0804-48c6-b9b4-cda88fd6b14a-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"cfd02f05-0804-48c6-b9b4-cda88fd6b14a\") " pod="openstack/ovn-northd-0" Dec 16 07:17:04 crc kubenswrapper[4823]: I1216 07:17:04.955267 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfd02f05-0804-48c6-b9b4-cda88fd6b14a-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"cfd02f05-0804-48c6-b9b4-cda88fd6b14a\") " pod="openstack/ovn-northd-0" Dec 16 07:17:04 crc kubenswrapper[4823]: I1216 07:17:04.955303 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p57nq\" (UniqueName: \"kubernetes.io/projected/9f3f8b57-525e-4d99-95c5-a4f41b0329c3-kube-api-access-p57nq\") pod \"dnsmasq-dns-586b989cdc-9fb9c\" (UID: \"9f3f8b57-525e-4d99-95c5-a4f41b0329c3\") " pod="openstack/dnsmasq-dns-586b989cdc-9fb9c" Dec 16 07:17:04 crc kubenswrapper[4823]: I1216 07:17:04.955324 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/cfd02f05-0804-48c6-b9b4-cda88fd6b14a-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"cfd02f05-0804-48c6-b9b4-cda88fd6b14a\") " pod="openstack/ovn-northd-0" Dec 16 07:17:04 crc kubenswrapper[4823]: I1216 07:17:04.956246 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9f3f8b57-525e-4d99-95c5-a4f41b0329c3-ovsdbserver-sb\") pod \"dnsmasq-dns-586b989cdc-9fb9c\" (UID: \"9f3f8b57-525e-4d99-95c5-a4f41b0329c3\") " pod="openstack/dnsmasq-dns-586b989cdc-9fb9c" Dec 16 07:17:04 crc kubenswrapper[4823]: I1216 07:17:04.956993 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9f3f8b57-525e-4d99-95c5-a4f41b0329c3-ovsdbserver-nb\") pod \"dnsmasq-dns-586b989cdc-9fb9c\" (UID: \"9f3f8b57-525e-4d99-95c5-a4f41b0329c3\") " pod="openstack/dnsmasq-dns-586b989cdc-9fb9c" Dec 16 07:17:04 crc kubenswrapper[4823]: I1216 07:17:04.957297 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/9f3f8b57-525e-4d99-95c5-a4f41b0329c3-dns-svc\") pod \"dnsmasq-dns-586b989cdc-9fb9c\" (UID: \"9f3f8b57-525e-4d99-95c5-a4f41b0329c3\") " pod="openstack/dnsmasq-dns-586b989cdc-9fb9c" Dec 16 07:17:04 crc kubenswrapper[4823]: I1216 07:17:04.957339 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f3f8b57-525e-4d99-95c5-a4f41b0329c3-config\") pod \"dnsmasq-dns-586b989cdc-9fb9c\" (UID: \"9f3f8b57-525e-4d99-95c5-a4f41b0329c3\") " pod="openstack/dnsmasq-dns-586b989cdc-9fb9c" Dec 16 07:17:04 crc kubenswrapper[4823]: I1216 07:17:04.982980 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p57nq\" (UniqueName: \"kubernetes.io/projected/9f3f8b57-525e-4d99-95c5-a4f41b0329c3-kube-api-access-p57nq\") pod \"dnsmasq-dns-586b989cdc-9fb9c\" (UID: \"9f3f8b57-525e-4d99-95c5-a4f41b0329c3\") " pod="openstack/dnsmasq-dns-586b989cdc-9fb9c" Dec 16 07:17:05 crc kubenswrapper[4823]: I1216 07:17:05.020178 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-586b989cdc-9fb9c" Dec 16 07:17:05 crc kubenswrapper[4823]: I1216 07:17:05.056210 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/408c33cd-064f-42e1-b3b5-a2c1b7046f0c-dns-svc\") pod \"408c33cd-064f-42e1-b3b5-a2c1b7046f0c\" (UID: \"408c33cd-064f-42e1-b3b5-a2c1b7046f0c\") " Dec 16 07:17:05 crc kubenswrapper[4823]: I1216 07:17:05.056282 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/408c33cd-064f-42e1-b3b5-a2c1b7046f0c-config\") pod \"408c33cd-064f-42e1-b3b5-a2c1b7046f0c\" (UID: \"408c33cd-064f-42e1-b3b5-a2c1b7046f0c\") " Dec 16 07:17:05 crc kubenswrapper[4823]: I1216 07:17:05.056360 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4mvz\" (UniqueName: \"kubernetes.io/projected/408c33cd-064f-42e1-b3b5-a2c1b7046f0c-kube-api-access-w4mvz\") pod \"408c33cd-064f-42e1-b3b5-a2c1b7046f0c\" (UID: \"408c33cd-064f-42e1-b3b5-a2c1b7046f0c\") " Dec 16 07:17:05 crc kubenswrapper[4823]: I1216 07:17:05.056836 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/cfd02f05-0804-48c6-b9b4-cda88fd6b14a-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"cfd02f05-0804-48c6-b9b4-cda88fd6b14a\") " pod="openstack/ovn-northd-0" Dec 16 07:17:05 crc kubenswrapper[4823]: I1216 07:17:05.056875 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cfd02f05-0804-48c6-b9b4-cda88fd6b14a-scripts\") pod \"ovn-northd-0\" (UID: \"cfd02f05-0804-48c6-b9b4-cda88fd6b14a\") " pod="openstack/ovn-northd-0" Dec 16 07:17:05 crc kubenswrapper[4823]: I1216 07:17:05.056935 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-7694w\" (UniqueName: \"kubernetes.io/projected/cfd02f05-0804-48c6-b9b4-cda88fd6b14a-kube-api-access-7694w\") pod \"ovn-northd-0\" (UID: \"cfd02f05-0804-48c6-b9b4-cda88fd6b14a\") " pod="openstack/ovn-northd-0" Dec 16 07:17:05 crc kubenswrapper[4823]: I1216 07:17:05.056962 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cfd02f05-0804-48c6-b9b4-cda88fd6b14a-config\") pod \"ovn-northd-0\" (UID: \"cfd02f05-0804-48c6-b9b4-cda88fd6b14a\") " pod="openstack/ovn-northd-0" Dec 16 07:17:05 crc kubenswrapper[4823]: I1216 07:17:05.056988 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/cfd02f05-0804-48c6-b9b4-cda88fd6b14a-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"cfd02f05-0804-48c6-b9b4-cda88fd6b14a\") " pod="openstack/ovn-northd-0" Dec 16 07:17:05 crc kubenswrapper[4823]: I1216 07:17:05.057044 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfd02f05-0804-48c6-b9b4-cda88fd6b14a-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"cfd02f05-0804-48c6-b9b4-cda88fd6b14a\") " pod="openstack/ovn-northd-0" Dec 16 07:17:05 crc kubenswrapper[4823]: I1216 07:17:05.057093 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/cfd02f05-0804-48c6-b9b4-cda88fd6b14a-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"cfd02f05-0804-48c6-b9b4-cda88fd6b14a\") " pod="openstack/ovn-northd-0" Dec 16 07:17:05 crc kubenswrapper[4823]: I1216 07:17:05.057465 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/408c33cd-064f-42e1-b3b5-a2c1b7046f0c-config" (OuterVolumeSpecName: "config") pod "408c33cd-064f-42e1-b3b5-a2c1b7046f0c" (UID: "408c33cd-064f-42e1-b3b5-a2c1b7046f0c"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:17:05 crc kubenswrapper[4823]: I1216 07:17:05.057647 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/cfd02f05-0804-48c6-b9b4-cda88fd6b14a-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"cfd02f05-0804-48c6-b9b4-cda88fd6b14a\") " pod="openstack/ovn-northd-0" Dec 16 07:17:05 crc kubenswrapper[4823]: I1216 07:17:05.057899 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/408c33cd-064f-42e1-b3b5-a2c1b7046f0c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "408c33cd-064f-42e1-b3b5-a2c1b7046f0c" (UID: "408c33cd-064f-42e1-b3b5-a2c1b7046f0c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:17:05 crc kubenswrapper[4823]: I1216 07:17:05.066843 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cfd02f05-0804-48c6-b9b4-cda88fd6b14a-scripts\") pod \"ovn-northd-0\" (UID: \"cfd02f05-0804-48c6-b9b4-cda88fd6b14a\") " pod="openstack/ovn-northd-0" Dec 16 07:17:05 crc kubenswrapper[4823]: I1216 07:17:05.067278 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/408c33cd-064f-42e1-b3b5-a2c1b7046f0c-kube-api-access-w4mvz" (OuterVolumeSpecName: "kube-api-access-w4mvz") pod "408c33cd-064f-42e1-b3b5-a2c1b7046f0c" (UID: "408c33cd-064f-42e1-b3b5-a2c1b7046f0c"). InnerVolumeSpecName "kube-api-access-w4mvz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:17:05 crc kubenswrapper[4823]: I1216 07:17:05.067968 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/cfd02f05-0804-48c6-b9b4-cda88fd6b14a-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"cfd02f05-0804-48c6-b9b4-cda88fd6b14a\") " pod="openstack/ovn-northd-0" Dec 16 07:17:05 crc kubenswrapper[4823]: I1216 07:17:05.069240 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cfd02f05-0804-48c6-b9b4-cda88fd6b14a-config\") pod \"ovn-northd-0\" (UID: \"cfd02f05-0804-48c6-b9b4-cda88fd6b14a\") " pod="openstack/ovn-northd-0" Dec 16 07:17:05 crc kubenswrapper[4823]: I1216 07:17:05.073688 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfd02f05-0804-48c6-b9b4-cda88fd6b14a-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"cfd02f05-0804-48c6-b9b4-cda88fd6b14a\") " pod="openstack/ovn-northd-0" Dec 16 07:17:05 crc kubenswrapper[4823]: I1216 07:17:05.073972 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/cfd02f05-0804-48c6-b9b4-cda88fd6b14a-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"cfd02f05-0804-48c6-b9b4-cda88fd6b14a\") " pod="openstack/ovn-northd-0" Dec 16 07:17:05 crc kubenswrapper[4823]: I1216 07:17:05.079789 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-c7cbb8f79-tl9kf" Dec 16 07:17:05 crc kubenswrapper[4823]: I1216 07:17:05.080283 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7694w\" (UniqueName: \"kubernetes.io/projected/cfd02f05-0804-48c6-b9b4-cda88fd6b14a-kube-api-access-7694w\") pod \"ovn-northd-0\" (UID: \"cfd02f05-0804-48c6-b9b4-cda88fd6b14a\") " pod="openstack/ovn-northd-0" Dec 16 07:17:05 crc kubenswrapper[4823]: I1216 07:17:05.157773 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff169a34-dc27-40ca-86ca-ba5e8f644502-config\") pod \"ff169a34-dc27-40ca-86ca-ba5e8f644502\" (UID: \"ff169a34-dc27-40ca-86ca-ba5e8f644502\") " Dec 16 07:17:05 crc kubenswrapper[4823]: I1216 07:17:05.158217 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bnt5f\" (UniqueName: \"kubernetes.io/projected/ff169a34-dc27-40ca-86ca-ba5e8f644502-kube-api-access-bnt5f\") pod \"ff169a34-dc27-40ca-86ca-ba5e8f644502\" (UID: \"ff169a34-dc27-40ca-86ca-ba5e8f644502\") " Dec 16 07:17:05 crc kubenswrapper[4823]: I1216 07:17:05.158254 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ff169a34-dc27-40ca-86ca-ba5e8f644502-dns-svc\") pod \"ff169a34-dc27-40ca-86ca-ba5e8f644502\" (UID: \"ff169a34-dc27-40ca-86ca-ba5e8f644502\") " Dec 16 07:17:05 crc kubenswrapper[4823]: I1216 07:17:05.158602 4823 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/408c33cd-064f-42e1-b3b5-a2c1b7046f0c-config\") on node \"crc\" DevicePath \"\"" Dec 16 07:17:05 crc kubenswrapper[4823]: I1216 07:17:05.158616 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4mvz\" (UniqueName: \"kubernetes.io/projected/408c33cd-064f-42e1-b3b5-a2c1b7046f0c-kube-api-access-w4mvz\") on node \"crc\" 
DevicePath \"\"" Dec 16 07:17:05 crc kubenswrapper[4823]: I1216 07:17:05.158628 4823 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/408c33cd-064f-42e1-b3b5-a2c1b7046f0c-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 16 07:17:05 crc kubenswrapper[4823]: I1216 07:17:05.159051 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff169a34-dc27-40ca-86ca-ba5e8f644502-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ff169a34-dc27-40ca-86ca-ba5e8f644502" (UID: "ff169a34-dc27-40ca-86ca-ba5e8f644502"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:17:05 crc kubenswrapper[4823]: I1216 07:17:05.159487 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff169a34-dc27-40ca-86ca-ba5e8f644502-config" (OuterVolumeSpecName: "config") pod "ff169a34-dc27-40ca-86ca-ba5e8f644502" (UID: "ff169a34-dc27-40ca-86ca-ba5e8f644502"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:17:05 crc kubenswrapper[4823]: I1216 07:17:05.162638 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff169a34-dc27-40ca-86ca-ba5e8f644502-kube-api-access-bnt5f" (OuterVolumeSpecName: "kube-api-access-bnt5f") pod "ff169a34-dc27-40ca-86ca-ba5e8f644502" (UID: "ff169a34-dc27-40ca-86ca-ba5e8f644502"). InnerVolumeSpecName "kube-api-access-bnt5f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:17:05 crc kubenswrapper[4823]: I1216 07:17:05.176511 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c7cbb8f79-tl9kf" event={"ID":"ff169a34-dc27-40ca-86ca-ba5e8f644502","Type":"ContainerDied","Data":"14ebaf545ed1186ba91c1abc6be3f10edce3ad3e9b902760e07af3db9b204a67"} Dec 16 07:17:05 crc kubenswrapper[4823]: I1216 07:17:05.176530 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-c7cbb8f79-tl9kf" Dec 16 07:17:05 crc kubenswrapper[4823]: I1216 07:17:05.181916 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-95f5f6995-jqbcz" event={"ID":"408c33cd-064f-42e1-b3b5-a2c1b7046f0c","Type":"ContainerDied","Data":"12251f8e6a6b3eee048fdd603b30610930747609dafadcc3b005264ede9a2e8d"} Dec 16 07:17:05 crc kubenswrapper[4823]: I1216 07:17:05.182137 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-95f5f6995-jqbcz" Dec 16 07:17:05 crc kubenswrapper[4823]: I1216 07:17:05.222340 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Dec 16 07:17:05 crc kubenswrapper[4823]: I1216 07:17:05.260421 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bnt5f\" (UniqueName: \"kubernetes.io/projected/ff169a34-dc27-40ca-86ca-ba5e8f644502-kube-api-access-bnt5f\") on node \"crc\" DevicePath \"\"" Dec 16 07:17:05 crc kubenswrapper[4823]: I1216 07:17:05.260467 4823 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ff169a34-dc27-40ca-86ca-ba5e8f644502-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 16 07:17:05 crc kubenswrapper[4823]: I1216 07:17:05.260480 4823 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff169a34-dc27-40ca-86ca-ba5e8f644502-config\") on node \"crc\" DevicePath \"\"" Dec 16 07:17:05 crc kubenswrapper[4823]: I1216 07:17:05.265953 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-c7cbb8f79-tl9kf"] Dec 16 07:17:05 crc kubenswrapper[4823]: I1216 07:17:05.277013 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-c7cbb8f79-tl9kf"] Dec 16 07:17:05 crc kubenswrapper[4823]: I1216 07:17:05.294935 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-95f5f6995-jqbcz"] Dec 16 07:17:05 crc kubenswrapper[4823]: I1216 07:17:05.300568 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-95f5f6995-jqbcz"] Dec 16 07:17:05 crc kubenswrapper[4823]: I1216 07:17:05.406381 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b79764b65-ndqjh"] Dec 16 07:17:05 crc kubenswrapper[4823]: I1216 07:17:05.418904 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-956hc"] Dec 16 07:17:05 crc kubenswrapper[4823]: I1216 07:17:05.492531 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 16 07:17:05 crc 
kubenswrapper[4823]: I1216 07:17:05.580477 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-586b989cdc-9fb9c"] Dec 16 07:17:05 crc kubenswrapper[4823]: I1216 07:17:05.784237 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="408c33cd-064f-42e1-b3b5-a2c1b7046f0c" path="/var/lib/kubelet/pods/408c33cd-064f-42e1-b3b5-a2c1b7046f0c/volumes" Dec 16 07:17:05 crc kubenswrapper[4823]: I1216 07:17:05.784975 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff169a34-dc27-40ca-86ca-ba5e8f644502" path="/var/lib/kubelet/pods/ff169a34-dc27-40ca-86ca-ba5e8f644502/volumes" Dec 16 07:17:06 crc kubenswrapper[4823]: I1216 07:17:06.195496 4823 generic.go:334] "Generic (PLEG): container finished" podID="45d886e8-673c-4192-9a4d-b507ecc835ac" containerID="8a50792e79ac6e5fa1a1c7a8db9d24acbefecbadcd96ed8e500753c742da39e1" exitCode=0 Dec 16 07:17:06 crc kubenswrapper[4823]: I1216 07:17:06.195558 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b79764b65-ndqjh" event={"ID":"45d886e8-673c-4192-9a4d-b507ecc835ac","Type":"ContainerDied","Data":"8a50792e79ac6e5fa1a1c7a8db9d24acbefecbadcd96ed8e500753c742da39e1"} Dec 16 07:17:06 crc kubenswrapper[4823]: I1216 07:17:06.195620 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b79764b65-ndqjh" event={"ID":"45d886e8-673c-4192-9a4d-b507ecc835ac","Type":"ContainerStarted","Data":"0b90e91e1383123dd55fcce76682499e807e568da36ffb04363d98b86790dea4"} Dec 16 07:17:06 crc kubenswrapper[4823]: I1216 07:17:06.199691 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-956hc" event={"ID":"efe17b2e-19bd-430b-8cb5-147ed1d2ffb6","Type":"ContainerStarted","Data":"fd126ddb078fca0b47691ccb775b5689a84f7d7d50e7281488f17b418ac9e03a"} Dec 16 07:17:06 crc kubenswrapper[4823]: I1216 07:17:06.199722 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-controller-metrics-956hc" event={"ID":"efe17b2e-19bd-430b-8cb5-147ed1d2ffb6","Type":"ContainerStarted","Data":"ac1271e00df71a05fa4a2e68ea677a67464683b6651208b8b270e51f046cd15b"} Dec 16 07:17:06 crc kubenswrapper[4823]: I1216 07:17:06.201361 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"cfd02f05-0804-48c6-b9b4-cda88fd6b14a","Type":"ContainerStarted","Data":"251a58e423c3606cf72245339a9084a59e6134e6c468b74fb650a57e0f9ca8e9"} Dec 16 07:17:06 crc kubenswrapper[4823]: I1216 07:17:06.205393 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-586b989cdc-9fb9c" event={"ID":"9f3f8b57-525e-4d99-95c5-a4f41b0329c3","Type":"ContainerStarted","Data":"a7d9ba5c1a561ac6ed74b0ca1bc1c5fb58bf72ea5baa8dba7d19c527c0b95ae9"} Dec 16 07:17:06 crc kubenswrapper[4823]: I1216 07:17:06.237912 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-956hc" podStartSLOduration=2.237890508 podStartE2EDuration="2.237890508s" podCreationTimestamp="2025-12-16 07:17:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 07:17:06.233700177 +0000 UTC m=+1304.722266320" watchObservedRunningTime="2025-12-16 07:17:06.237890508 +0000 UTC m=+1304.726456651" Dec 16 07:17:07 crc kubenswrapper[4823]: I1216 07:17:07.216424 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b79764b65-ndqjh" event={"ID":"45d886e8-673c-4192-9a4d-b507ecc835ac","Type":"ContainerStarted","Data":"38c912911689fa5c46ae9dc7c4d1e631a6d378a2f816aa0fa0aef3ad6b7a2a60"} Dec 16 07:17:07 crc kubenswrapper[4823]: I1216 07:17:07.216876 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5b79764b65-ndqjh" Dec 16 07:17:07 crc kubenswrapper[4823]: I1216 07:17:07.219778 4823 generic.go:334] "Generic (PLEG): container finished" 
podID="9f3f8b57-525e-4d99-95c5-a4f41b0329c3" containerID="05a10d39894185589e65377738a6d7c93d73da48bd92ab655c9d06377116e944" exitCode=0 Dec 16 07:17:07 crc kubenswrapper[4823]: I1216 07:17:07.219880 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-586b989cdc-9fb9c" event={"ID":"9f3f8b57-525e-4d99-95c5-a4f41b0329c3","Type":"ContainerDied","Data":"05a10d39894185589e65377738a6d7c93d73da48bd92ab655c9d06377116e944"} Dec 16 07:17:07 crc kubenswrapper[4823]: I1216 07:17:07.222687 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"8db2b8b4-03e8-4ae0-875d-5f3a6414d0e0","Type":"ContainerStarted","Data":"84f5f9f54b8d2494731ff503a44dec71fc986fb565aff9085ad97bcea2c334ff"} Dec 16 07:17:07 crc kubenswrapper[4823]: I1216 07:17:07.223269 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 16 07:17:07 crc kubenswrapper[4823]: I1216 07:17:07.250052 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5b79764b65-ndqjh" podStartSLOduration=2.795128782 podStartE2EDuration="3.25001245s" podCreationTimestamp="2025-12-16 07:17:04 +0000 UTC" firstStartedPulling="2025-12-16 07:17:05.422425023 +0000 UTC m=+1303.910991146" lastFinishedPulling="2025-12-16 07:17:05.877308691 +0000 UTC m=+1304.365874814" observedRunningTime="2025-12-16 07:17:07.247656116 +0000 UTC m=+1305.736222269" watchObservedRunningTime="2025-12-16 07:17:07.25001245 +0000 UTC m=+1305.738578583" Dec 16 07:17:07 crc kubenswrapper[4823]: I1216 07:17:07.266748 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.973031925 podStartE2EDuration="42.266717583s" podCreationTimestamp="2025-12-16 07:16:25 +0000 UTC" firstStartedPulling="2025-12-16 07:16:26.85412218 +0000 UTC m=+1265.342688303" lastFinishedPulling="2025-12-16 07:17:06.147807838 +0000 UTC m=+1304.636373961" 
observedRunningTime="2025-12-16 07:17:07.262902743 +0000 UTC m=+1305.751468866" watchObservedRunningTime="2025-12-16 07:17:07.266717583 +0000 UTC m=+1305.755283706" Dec 16 07:17:08 crc kubenswrapper[4823]: I1216 07:17:08.747473 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Dec 16 07:17:08 crc kubenswrapper[4823]: I1216 07:17:08.824169 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Dec 16 07:17:09 crc kubenswrapper[4823]: I1216 07:17:09.200116 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Dec 16 07:17:09 crc kubenswrapper[4823]: I1216 07:17:09.238258 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-586b989cdc-9fb9c" event={"ID":"9f3f8b57-525e-4d99-95c5-a4f41b0329c3","Type":"ContainerStarted","Data":"17cc1e8fd8348b3bf8e06dcbe2d5aceb8cad9309336edad706be0035289e384a"} Dec 16 07:17:09 crc kubenswrapper[4823]: I1216 07:17:09.238430 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-586b989cdc-9fb9c" Dec 16 07:17:09 crc kubenswrapper[4823]: I1216 07:17:09.240326 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"cfd02f05-0804-48c6-b9b4-cda88fd6b14a","Type":"ContainerStarted","Data":"5f969b423030012c6374edf5d132a7aa122d3b273687a37f08e1b4c115ee2b6a"} Dec 16 07:17:09 crc kubenswrapper[4823]: I1216 07:17:09.240387 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"cfd02f05-0804-48c6-b9b4-cda88fd6b14a","Type":"ContainerStarted","Data":"c687331eefea963d1e68c44d1eded52992a9e7de45fe0c58d59d647313f5f399"} Dec 16 07:17:09 crc kubenswrapper[4823]: I1216 07:17:09.240514 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Dec 16 07:17:09 crc kubenswrapper[4823]: I1216 07:17:09.271251 
4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-586b989cdc-9fb9c" podStartSLOduration=4.6483022179999995 podStartE2EDuration="5.271222476s" podCreationTimestamp="2025-12-16 07:17:04 +0000 UTC" firstStartedPulling="2025-12-16 07:17:05.581465011 +0000 UTC m=+1304.070031134" lastFinishedPulling="2025-12-16 07:17:06.204385269 +0000 UTC m=+1304.692951392" observedRunningTime="2025-12-16 07:17:09.26113731 +0000 UTC m=+1307.749703443" watchObservedRunningTime="2025-12-16 07:17:09.271222476 +0000 UTC m=+1307.759788599" Dec 16 07:17:09 crc kubenswrapper[4823]: I1216 07:17:09.292337 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.282498403 podStartE2EDuration="5.292316106s" podCreationTimestamp="2025-12-16 07:17:04 +0000 UTC" firstStartedPulling="2025-12-16 07:17:05.497557264 +0000 UTC m=+1303.986123387" lastFinishedPulling="2025-12-16 07:17:08.507374967 +0000 UTC m=+1306.995941090" observedRunningTime="2025-12-16 07:17:09.286090021 +0000 UTC m=+1307.774656144" watchObservedRunningTime="2025-12-16 07:17:09.292316106 +0000 UTC m=+1307.780882229" Dec 16 07:17:09 crc kubenswrapper[4823]: I1216 07:17:09.501197 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-7g2pq"] Dec 16 07:17:09 crc kubenswrapper[4823]: I1216 07:17:09.502474 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-7g2pq" Dec 16 07:17:09 crc kubenswrapper[4823]: I1216 07:17:09.521077 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-7g2pq"] Dec 16 07:17:09 crc kubenswrapper[4823]: I1216 07:17:09.549892 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-ec9f-account-create-update-c7mql"] Dec 16 07:17:09 crc kubenswrapper[4823]: I1216 07:17:09.551146 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-ec9f-account-create-update-c7mql" Dec 16 07:17:09 crc kubenswrapper[4823]: I1216 07:17:09.560554 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Dec 16 07:17:09 crc kubenswrapper[4823]: I1216 07:17:09.574881 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-ec9f-account-create-update-c7mql"] Dec 16 07:17:09 crc kubenswrapper[4823]: I1216 07:17:09.649986 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpxsm\" (UniqueName: \"kubernetes.io/projected/2dec2d9f-40dd-4b15-ab4f-529a346e7857-kube-api-access-xpxsm\") pod \"glance-db-create-7g2pq\" (UID: \"2dec2d9f-40dd-4b15-ab4f-529a346e7857\") " pod="openstack/glance-db-create-7g2pq" Dec 16 07:17:09 crc kubenswrapper[4823]: I1216 07:17:09.650178 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g288x\" (UniqueName: \"kubernetes.io/projected/880f4ab2-4f17-4edf-91f0-6b2fae15c9a9-kube-api-access-g288x\") pod \"glance-ec9f-account-create-update-c7mql\" (UID: \"880f4ab2-4f17-4edf-91f0-6b2fae15c9a9\") " pod="openstack/glance-ec9f-account-create-update-c7mql" Dec 16 07:17:09 crc kubenswrapper[4823]: I1216 07:17:09.650315 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2dec2d9f-40dd-4b15-ab4f-529a346e7857-operator-scripts\") pod \"glance-db-create-7g2pq\" (UID: \"2dec2d9f-40dd-4b15-ab4f-529a346e7857\") " pod="openstack/glance-db-create-7g2pq" Dec 16 07:17:09 crc kubenswrapper[4823]: I1216 07:17:09.650393 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/880f4ab2-4f17-4edf-91f0-6b2fae15c9a9-operator-scripts\") pod \"glance-ec9f-account-create-update-c7mql\" 
(UID: \"880f4ab2-4f17-4edf-91f0-6b2fae15c9a9\") " pod="openstack/glance-ec9f-account-create-update-c7mql" Dec 16 07:17:09 crc kubenswrapper[4823]: I1216 07:17:09.751631 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/880f4ab2-4f17-4edf-91f0-6b2fae15c9a9-operator-scripts\") pod \"glance-ec9f-account-create-update-c7mql\" (UID: \"880f4ab2-4f17-4edf-91f0-6b2fae15c9a9\") " pod="openstack/glance-ec9f-account-create-update-c7mql" Dec 16 07:17:09 crc kubenswrapper[4823]: I1216 07:17:09.751709 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xpxsm\" (UniqueName: \"kubernetes.io/projected/2dec2d9f-40dd-4b15-ab4f-529a346e7857-kube-api-access-xpxsm\") pod \"glance-db-create-7g2pq\" (UID: \"2dec2d9f-40dd-4b15-ab4f-529a346e7857\") " pod="openstack/glance-db-create-7g2pq" Dec 16 07:17:09 crc kubenswrapper[4823]: I1216 07:17:09.751743 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g288x\" (UniqueName: \"kubernetes.io/projected/880f4ab2-4f17-4edf-91f0-6b2fae15c9a9-kube-api-access-g288x\") pod \"glance-ec9f-account-create-update-c7mql\" (UID: \"880f4ab2-4f17-4edf-91f0-6b2fae15c9a9\") " pod="openstack/glance-ec9f-account-create-update-c7mql" Dec 16 07:17:09 crc kubenswrapper[4823]: I1216 07:17:09.751795 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2dec2d9f-40dd-4b15-ab4f-529a346e7857-operator-scripts\") pod \"glance-db-create-7g2pq\" (UID: \"2dec2d9f-40dd-4b15-ab4f-529a346e7857\") " pod="openstack/glance-db-create-7g2pq" Dec 16 07:17:09 crc kubenswrapper[4823]: I1216 07:17:09.752448 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2dec2d9f-40dd-4b15-ab4f-529a346e7857-operator-scripts\") pod 
\"glance-db-create-7g2pq\" (UID: \"2dec2d9f-40dd-4b15-ab4f-529a346e7857\") " pod="openstack/glance-db-create-7g2pq" Dec 16 07:17:09 crc kubenswrapper[4823]: I1216 07:17:09.753080 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/880f4ab2-4f17-4edf-91f0-6b2fae15c9a9-operator-scripts\") pod \"glance-ec9f-account-create-update-c7mql\" (UID: \"880f4ab2-4f17-4edf-91f0-6b2fae15c9a9\") " pod="openstack/glance-ec9f-account-create-update-c7mql" Dec 16 07:17:09 crc kubenswrapper[4823]: I1216 07:17:09.772390 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpxsm\" (UniqueName: \"kubernetes.io/projected/2dec2d9f-40dd-4b15-ab4f-529a346e7857-kube-api-access-xpxsm\") pod \"glance-db-create-7g2pq\" (UID: \"2dec2d9f-40dd-4b15-ab4f-529a346e7857\") " pod="openstack/glance-db-create-7g2pq" Dec 16 07:17:09 crc kubenswrapper[4823]: I1216 07:17:09.784841 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g288x\" (UniqueName: \"kubernetes.io/projected/880f4ab2-4f17-4edf-91f0-6b2fae15c9a9-kube-api-access-g288x\") pod \"glance-ec9f-account-create-update-c7mql\" (UID: \"880f4ab2-4f17-4edf-91f0-6b2fae15c9a9\") " pod="openstack/glance-ec9f-account-create-update-c7mql" Dec 16 07:17:09 crc kubenswrapper[4823]: I1216 07:17:09.875377 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-7g2pq" Dec 16 07:17:09 crc kubenswrapper[4823]: I1216 07:17:09.897707 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-ec9f-account-create-update-c7mql" Dec 16 07:17:10 crc kubenswrapper[4823]: I1216 07:17:10.217793 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-7g2pq"] Dec 16 07:17:10 crc kubenswrapper[4823]: W1216 07:17:10.221366 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2dec2d9f_40dd_4b15_ab4f_529a346e7857.slice/crio-b1e5907d9e9c8b472feace072049b931a88bd4e4ff2ae05b6f48540972b6cf6e WatchSource:0}: Error finding container b1e5907d9e9c8b472feace072049b931a88bd4e4ff2ae05b6f48540972b6cf6e: Status 404 returned error can't find the container with id b1e5907d9e9c8b472feace072049b931a88bd4e4ff2ae05b6f48540972b6cf6e Dec 16 07:17:10 crc kubenswrapper[4823]: I1216 07:17:10.252553 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-7g2pq" event={"ID":"2dec2d9f-40dd-4b15-ab4f-529a346e7857","Type":"ContainerStarted","Data":"b1e5907d9e9c8b472feace072049b931a88bd4e4ff2ae05b6f48540972b6cf6e"} Dec 16 07:17:10 crc kubenswrapper[4823]: W1216 07:17:10.493863 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod880f4ab2_4f17_4edf_91f0_6b2fae15c9a9.slice/crio-97176c5b11cf0d6568484ab37c9b3f4fed62adb4e1d9122924e08be1f20b2267 WatchSource:0}: Error finding container 97176c5b11cf0d6568484ab37c9b3f4fed62adb4e1d9122924e08be1f20b2267: Status 404 returned error can't find the container with id 97176c5b11cf0d6568484ab37c9b3f4fed62adb4e1d9122924e08be1f20b2267 Dec 16 07:17:10 crc kubenswrapper[4823]: I1216 07:17:10.494094 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-ec9f-account-create-update-c7mql"] Dec 16 07:17:11 crc kubenswrapper[4823]: I1216 07:17:11.261598 4823 generic.go:334] "Generic (PLEG): container finished" podID="880f4ab2-4f17-4edf-91f0-6b2fae15c9a9" 
containerID="bd0b3284091f48885d6643e6ca71a2073437a5eb768a82f1e8e8406468a01cec" exitCode=0 Dec 16 07:17:11 crc kubenswrapper[4823]: I1216 07:17:11.261663 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-ec9f-account-create-update-c7mql" event={"ID":"880f4ab2-4f17-4edf-91f0-6b2fae15c9a9","Type":"ContainerDied","Data":"bd0b3284091f48885d6643e6ca71a2073437a5eb768a82f1e8e8406468a01cec"} Dec 16 07:17:11 crc kubenswrapper[4823]: I1216 07:17:11.261911 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-ec9f-account-create-update-c7mql" event={"ID":"880f4ab2-4f17-4edf-91f0-6b2fae15c9a9","Type":"ContainerStarted","Data":"97176c5b11cf0d6568484ab37c9b3f4fed62adb4e1d9122924e08be1f20b2267"} Dec 16 07:17:11 crc kubenswrapper[4823]: I1216 07:17:11.263827 4823 generic.go:334] "Generic (PLEG): container finished" podID="2dec2d9f-40dd-4b15-ab4f-529a346e7857" containerID="7629ae1e5e4c4f1f9ac4ef046163592794f31a30605927bd59270499037455b0" exitCode=0 Dec 16 07:17:11 crc kubenswrapper[4823]: I1216 07:17:11.263881 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-7g2pq" event={"ID":"2dec2d9f-40dd-4b15-ab4f-529a346e7857","Type":"ContainerDied","Data":"7629ae1e5e4c4f1f9ac4ef046163592794f31a30605927bd59270499037455b0"} Dec 16 07:17:12 crc kubenswrapper[4823]: I1216 07:17:12.613699 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-ec9f-account-create-update-c7mql" Dec 16 07:17:12 crc kubenswrapper[4823]: I1216 07:17:12.618525 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g288x\" (UniqueName: \"kubernetes.io/projected/880f4ab2-4f17-4edf-91f0-6b2fae15c9a9-kube-api-access-g288x\") pod \"880f4ab2-4f17-4edf-91f0-6b2fae15c9a9\" (UID: \"880f4ab2-4f17-4edf-91f0-6b2fae15c9a9\") " Dec 16 07:17:12 crc kubenswrapper[4823]: I1216 07:17:12.620159 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-7g2pq" Dec 16 07:17:12 crc kubenswrapper[4823]: I1216 07:17:12.629537 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/880f4ab2-4f17-4edf-91f0-6b2fae15c9a9-kube-api-access-g288x" (OuterVolumeSpecName: "kube-api-access-g288x") pod "880f4ab2-4f17-4edf-91f0-6b2fae15c9a9" (UID: "880f4ab2-4f17-4edf-91f0-6b2fae15c9a9"). InnerVolumeSpecName "kube-api-access-g288x". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:17:12 crc kubenswrapper[4823]: I1216 07:17:12.719713 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2dec2d9f-40dd-4b15-ab4f-529a346e7857-operator-scripts\") pod \"2dec2d9f-40dd-4b15-ab4f-529a346e7857\" (UID: \"2dec2d9f-40dd-4b15-ab4f-529a346e7857\") " Dec 16 07:17:12 crc kubenswrapper[4823]: I1216 07:17:12.719818 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/880f4ab2-4f17-4edf-91f0-6b2fae15c9a9-operator-scripts\") pod \"880f4ab2-4f17-4edf-91f0-6b2fae15c9a9\" (UID: \"880f4ab2-4f17-4edf-91f0-6b2fae15c9a9\") " Dec 16 07:17:12 crc kubenswrapper[4823]: I1216 07:17:12.719923 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xpxsm\" (UniqueName: \"kubernetes.io/projected/2dec2d9f-40dd-4b15-ab4f-529a346e7857-kube-api-access-xpxsm\") pod \"2dec2d9f-40dd-4b15-ab4f-529a346e7857\" (UID: \"2dec2d9f-40dd-4b15-ab4f-529a346e7857\") " Dec 16 07:17:12 crc kubenswrapper[4823]: I1216 07:17:12.720214 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g288x\" (UniqueName: \"kubernetes.io/projected/880f4ab2-4f17-4edf-91f0-6b2fae15c9a9-kube-api-access-g288x\") on node \"crc\" DevicePath \"\"" Dec 16 07:17:12 crc kubenswrapper[4823]: I1216 07:17:12.720578 4823 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2dec2d9f-40dd-4b15-ab4f-529a346e7857-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2dec2d9f-40dd-4b15-ab4f-529a346e7857" (UID: "2dec2d9f-40dd-4b15-ab4f-529a346e7857"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:17:12 crc kubenswrapper[4823]: I1216 07:17:12.720620 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/880f4ab2-4f17-4edf-91f0-6b2fae15c9a9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "880f4ab2-4f17-4edf-91f0-6b2fae15c9a9" (UID: "880f4ab2-4f17-4edf-91f0-6b2fae15c9a9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:17:12 crc kubenswrapper[4823]: I1216 07:17:12.723593 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2dec2d9f-40dd-4b15-ab4f-529a346e7857-kube-api-access-xpxsm" (OuterVolumeSpecName: "kube-api-access-xpxsm") pod "2dec2d9f-40dd-4b15-ab4f-529a346e7857" (UID: "2dec2d9f-40dd-4b15-ab4f-529a346e7857"). InnerVolumeSpecName "kube-api-access-xpxsm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:17:12 crc kubenswrapper[4823]: I1216 07:17:12.821275 4823 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/880f4ab2-4f17-4edf-91f0-6b2fae15c9a9-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:17:12 crc kubenswrapper[4823]: I1216 07:17:12.821316 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xpxsm\" (UniqueName: \"kubernetes.io/projected/2dec2d9f-40dd-4b15-ab4f-529a346e7857-kube-api-access-xpxsm\") on node \"crc\" DevicePath \"\"" Dec 16 07:17:12 crc kubenswrapper[4823]: I1216 07:17:12.821329 4823 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2dec2d9f-40dd-4b15-ab4f-529a346e7857-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:17:13 crc kubenswrapper[4823]: I1216 07:17:13.278863 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-ec9f-account-create-update-c7mql" event={"ID":"880f4ab2-4f17-4edf-91f0-6b2fae15c9a9","Type":"ContainerDied","Data":"97176c5b11cf0d6568484ab37c9b3f4fed62adb4e1d9122924e08be1f20b2267"} Dec 16 07:17:13 crc kubenswrapper[4823]: I1216 07:17:13.278913 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-ec9f-account-create-update-c7mql" Dec 16 07:17:13 crc kubenswrapper[4823]: I1216 07:17:13.278915 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="97176c5b11cf0d6568484ab37c9b3f4fed62adb4e1d9122924e08be1f20b2267" Dec 16 07:17:13 crc kubenswrapper[4823]: I1216 07:17:13.281330 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-7g2pq" event={"ID":"2dec2d9f-40dd-4b15-ab4f-529a346e7857","Type":"ContainerDied","Data":"b1e5907d9e9c8b472feace072049b931a88bd4e4ff2ae05b6f48540972b6cf6e"} Dec 16 07:17:13 crc kubenswrapper[4823]: I1216 07:17:13.281354 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b1e5907d9e9c8b472feace072049b931a88bd4e4ff2ae05b6f48540972b6cf6e" Dec 16 07:17:13 crc kubenswrapper[4823]: I1216 07:17:13.281363 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-7g2pq" Dec 16 07:17:13 crc kubenswrapper[4823]: I1216 07:17:13.747728 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-76bqm"] Dec 16 07:17:13 crc kubenswrapper[4823]: E1216 07:17:13.748142 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="880f4ab2-4f17-4edf-91f0-6b2fae15c9a9" containerName="mariadb-account-create-update" Dec 16 07:17:13 crc kubenswrapper[4823]: I1216 07:17:13.748159 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="880f4ab2-4f17-4edf-91f0-6b2fae15c9a9" containerName="mariadb-account-create-update" Dec 16 07:17:13 crc kubenswrapper[4823]: E1216 07:17:13.748194 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2dec2d9f-40dd-4b15-ab4f-529a346e7857" containerName="mariadb-database-create" Dec 16 07:17:13 crc kubenswrapper[4823]: I1216 07:17:13.748203 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="2dec2d9f-40dd-4b15-ab4f-529a346e7857" 
containerName="mariadb-database-create" Dec 16 07:17:13 crc kubenswrapper[4823]: I1216 07:17:13.748380 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="880f4ab2-4f17-4edf-91f0-6b2fae15c9a9" containerName="mariadb-account-create-update" Dec 16 07:17:13 crc kubenswrapper[4823]: I1216 07:17:13.748408 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="2dec2d9f-40dd-4b15-ab4f-529a346e7857" containerName="mariadb-database-create" Dec 16 07:17:13 crc kubenswrapper[4823]: I1216 07:17:13.748983 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-76bqm" Dec 16 07:17:13 crc kubenswrapper[4823]: I1216 07:17:13.761918 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-76bqm"] Dec 16 07:17:13 crc kubenswrapper[4823]: I1216 07:17:13.839872 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/74961679-896e-4f16-a5c3-12708a20a4b1-operator-scripts\") pod \"keystone-db-create-76bqm\" (UID: \"74961679-896e-4f16-a5c3-12708a20a4b1\") " pod="openstack/keystone-db-create-76bqm" Dec 16 07:17:13 crc kubenswrapper[4823]: I1216 07:17:13.840229 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkc57\" (UniqueName: \"kubernetes.io/projected/74961679-896e-4f16-a5c3-12708a20a4b1-kube-api-access-gkc57\") pod \"keystone-db-create-76bqm\" (UID: \"74961679-896e-4f16-a5c3-12708a20a4b1\") " pod="openstack/keystone-db-create-76bqm" Dec 16 07:17:13 crc kubenswrapper[4823]: I1216 07:17:13.868382 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-df58-account-create-update-6mv8r"] Dec 16 07:17:13 crc kubenswrapper[4823]: I1216 07:17:13.870522 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-df58-account-create-update-6mv8r" Dec 16 07:17:13 crc kubenswrapper[4823]: I1216 07:17:13.873633 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Dec 16 07:17:13 crc kubenswrapper[4823]: I1216 07:17:13.883898 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-df58-account-create-update-6mv8r"] Dec 16 07:17:13 crc kubenswrapper[4823]: I1216 07:17:13.942199 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gkc57\" (UniqueName: \"kubernetes.io/projected/74961679-896e-4f16-a5c3-12708a20a4b1-kube-api-access-gkc57\") pod \"keystone-db-create-76bqm\" (UID: \"74961679-896e-4f16-a5c3-12708a20a4b1\") " pod="openstack/keystone-db-create-76bqm" Dec 16 07:17:13 crc kubenswrapper[4823]: I1216 07:17:13.942407 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8d5ee65e-affe-42fd-af62-724d11efe03d-operator-scripts\") pod \"keystone-df58-account-create-update-6mv8r\" (UID: \"8d5ee65e-affe-42fd-af62-724d11efe03d\") " pod="openstack/keystone-df58-account-create-update-6mv8r" Dec 16 07:17:13 crc kubenswrapper[4823]: I1216 07:17:13.942439 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57dkh\" (UniqueName: \"kubernetes.io/projected/8d5ee65e-affe-42fd-af62-724d11efe03d-kube-api-access-57dkh\") pod \"keystone-df58-account-create-update-6mv8r\" (UID: \"8d5ee65e-affe-42fd-af62-724d11efe03d\") " pod="openstack/keystone-df58-account-create-update-6mv8r" Dec 16 07:17:13 crc kubenswrapper[4823]: I1216 07:17:13.942586 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/74961679-896e-4f16-a5c3-12708a20a4b1-operator-scripts\") pod \"keystone-db-create-76bqm\" 
(UID: \"74961679-896e-4f16-a5c3-12708a20a4b1\") " pod="openstack/keystone-db-create-76bqm" Dec 16 07:17:13 crc kubenswrapper[4823]: I1216 07:17:13.943588 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/74961679-896e-4f16-a5c3-12708a20a4b1-operator-scripts\") pod \"keystone-db-create-76bqm\" (UID: \"74961679-896e-4f16-a5c3-12708a20a4b1\") " pod="openstack/keystone-db-create-76bqm" Dec 16 07:17:13 crc kubenswrapper[4823]: I1216 07:17:13.963287 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gkc57\" (UniqueName: \"kubernetes.io/projected/74961679-896e-4f16-a5c3-12708a20a4b1-kube-api-access-gkc57\") pod \"keystone-db-create-76bqm\" (UID: \"74961679-896e-4f16-a5c3-12708a20a4b1\") " pod="openstack/keystone-db-create-76bqm" Dec 16 07:17:14 crc kubenswrapper[4823]: I1216 07:17:14.043985 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8d5ee65e-affe-42fd-af62-724d11efe03d-operator-scripts\") pod \"keystone-df58-account-create-update-6mv8r\" (UID: \"8d5ee65e-affe-42fd-af62-724d11efe03d\") " pod="openstack/keystone-df58-account-create-update-6mv8r" Dec 16 07:17:14 crc kubenswrapper[4823]: I1216 07:17:14.044044 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57dkh\" (UniqueName: \"kubernetes.io/projected/8d5ee65e-affe-42fd-af62-724d11efe03d-kube-api-access-57dkh\") pod \"keystone-df58-account-create-update-6mv8r\" (UID: \"8d5ee65e-affe-42fd-af62-724d11efe03d\") " pod="openstack/keystone-df58-account-create-update-6mv8r" Dec 16 07:17:14 crc kubenswrapper[4823]: I1216 07:17:14.044885 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8d5ee65e-affe-42fd-af62-724d11efe03d-operator-scripts\") pod 
\"keystone-df58-account-create-update-6mv8r\" (UID: \"8d5ee65e-affe-42fd-af62-724d11efe03d\") " pod="openstack/keystone-df58-account-create-update-6mv8r" Dec 16 07:17:14 crc kubenswrapper[4823]: I1216 07:17:14.068002 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-76bqm" Dec 16 07:17:14 crc kubenswrapper[4823]: I1216 07:17:14.071645 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57dkh\" (UniqueName: \"kubernetes.io/projected/8d5ee65e-affe-42fd-af62-724d11efe03d-kube-api-access-57dkh\") pod \"keystone-df58-account-create-update-6mv8r\" (UID: \"8d5ee65e-affe-42fd-af62-724d11efe03d\") " pod="openstack/keystone-df58-account-create-update-6mv8r" Dec 16 07:17:14 crc kubenswrapper[4823]: I1216 07:17:14.074480 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-dxgzr"] Dec 16 07:17:14 crc kubenswrapper[4823]: I1216 07:17:14.075729 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-dxgzr" Dec 16 07:17:14 crc kubenswrapper[4823]: I1216 07:17:14.083782 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-dxgzr"] Dec 16 07:17:14 crc kubenswrapper[4823]: I1216 07:17:14.146598 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2aa17caf-7d1d-4094-9334-453fe242229e-operator-scripts\") pod \"placement-db-create-dxgzr\" (UID: \"2aa17caf-7d1d-4094-9334-453fe242229e\") " pod="openstack/placement-db-create-dxgzr" Dec 16 07:17:14 crc kubenswrapper[4823]: I1216 07:17:14.146692 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmmdv\" (UniqueName: \"kubernetes.io/projected/2aa17caf-7d1d-4094-9334-453fe242229e-kube-api-access-tmmdv\") pod \"placement-db-create-dxgzr\" (UID: \"2aa17caf-7d1d-4094-9334-453fe242229e\") " pod="openstack/placement-db-create-dxgzr" Dec 16 07:17:14 crc kubenswrapper[4823]: I1216 07:17:14.187782 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-f3f4-account-create-update-tlhkf"] Dec 16 07:17:14 crc kubenswrapper[4823]: I1216 07:17:14.189118 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-f3f4-account-create-update-tlhkf" Dec 16 07:17:14 crc kubenswrapper[4823]: I1216 07:17:14.195367 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Dec 16 07:17:14 crc kubenswrapper[4823]: I1216 07:17:14.195434 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-df58-account-create-update-6mv8r" Dec 16 07:17:14 crc kubenswrapper[4823]: I1216 07:17:14.204803 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-f3f4-account-create-update-tlhkf"] Dec 16 07:17:14 crc kubenswrapper[4823]: I1216 07:17:14.255780 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hgxwf\" (UniqueName: \"kubernetes.io/projected/d6ccf3be-a323-4df6-8c32-c646c4ced20f-kube-api-access-hgxwf\") pod \"placement-f3f4-account-create-update-tlhkf\" (UID: \"d6ccf3be-a323-4df6-8c32-c646c4ced20f\") " pod="openstack/placement-f3f4-account-create-update-tlhkf" Dec 16 07:17:14 crc kubenswrapper[4823]: I1216 07:17:14.255904 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d6ccf3be-a323-4df6-8c32-c646c4ced20f-operator-scripts\") pod \"placement-f3f4-account-create-update-tlhkf\" (UID: \"d6ccf3be-a323-4df6-8c32-c646c4ced20f\") " pod="openstack/placement-f3f4-account-create-update-tlhkf" Dec 16 07:17:14 crc kubenswrapper[4823]: I1216 07:17:14.256051 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2aa17caf-7d1d-4094-9334-453fe242229e-operator-scripts\") pod \"placement-db-create-dxgzr\" (UID: \"2aa17caf-7d1d-4094-9334-453fe242229e\") " pod="openstack/placement-db-create-dxgzr" Dec 16 07:17:14 crc kubenswrapper[4823]: I1216 07:17:14.256128 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tmmdv\" (UniqueName: \"kubernetes.io/projected/2aa17caf-7d1d-4094-9334-453fe242229e-kube-api-access-tmmdv\") pod \"placement-db-create-dxgzr\" (UID: \"2aa17caf-7d1d-4094-9334-453fe242229e\") " pod="openstack/placement-db-create-dxgzr" Dec 16 07:17:14 crc kubenswrapper[4823]: I1216 07:17:14.257665 
4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2aa17caf-7d1d-4094-9334-453fe242229e-operator-scripts\") pod \"placement-db-create-dxgzr\" (UID: \"2aa17caf-7d1d-4094-9334-453fe242229e\") " pod="openstack/placement-db-create-dxgzr" Dec 16 07:17:14 crc kubenswrapper[4823]: I1216 07:17:14.275431 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmmdv\" (UniqueName: \"kubernetes.io/projected/2aa17caf-7d1d-4094-9334-453fe242229e-kube-api-access-tmmdv\") pod \"placement-db-create-dxgzr\" (UID: \"2aa17caf-7d1d-4094-9334-453fe242229e\") " pod="openstack/placement-db-create-dxgzr" Dec 16 07:17:14 crc kubenswrapper[4823]: I1216 07:17:14.357299 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d6ccf3be-a323-4df6-8c32-c646c4ced20f-operator-scripts\") pod \"placement-f3f4-account-create-update-tlhkf\" (UID: \"d6ccf3be-a323-4df6-8c32-c646c4ced20f\") " pod="openstack/placement-f3f4-account-create-update-tlhkf" Dec 16 07:17:14 crc kubenswrapper[4823]: I1216 07:17:14.357446 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hgxwf\" (UniqueName: \"kubernetes.io/projected/d6ccf3be-a323-4df6-8c32-c646c4ced20f-kube-api-access-hgxwf\") pod \"placement-f3f4-account-create-update-tlhkf\" (UID: \"d6ccf3be-a323-4df6-8c32-c646c4ced20f\") " pod="openstack/placement-f3f4-account-create-update-tlhkf" Dec 16 07:17:14 crc kubenswrapper[4823]: I1216 07:17:14.358276 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d6ccf3be-a323-4df6-8c32-c646c4ced20f-operator-scripts\") pod \"placement-f3f4-account-create-update-tlhkf\" (UID: \"d6ccf3be-a323-4df6-8c32-c646c4ced20f\") " pod="openstack/placement-f3f4-account-create-update-tlhkf" Dec 16 07:17:14 crc 
kubenswrapper[4823]: I1216 07:17:14.374647 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hgxwf\" (UniqueName: \"kubernetes.io/projected/d6ccf3be-a323-4df6-8c32-c646c4ced20f-kube-api-access-hgxwf\") pod \"placement-f3f4-account-create-update-tlhkf\" (UID: \"d6ccf3be-a323-4df6-8c32-c646c4ced20f\") " pod="openstack/placement-f3f4-account-create-update-tlhkf" Dec 16 07:17:14 crc kubenswrapper[4823]: I1216 07:17:14.496981 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-dxgzr" Dec 16 07:17:14 crc kubenswrapper[4823]: I1216 07:17:14.521253 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-f3f4-account-create-update-tlhkf" Dec 16 07:17:14 crc kubenswrapper[4823]: I1216 07:17:14.566879 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-76bqm"] Dec 16 07:17:14 crc kubenswrapper[4823]: I1216 07:17:14.676151 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-df58-account-create-update-6mv8r"] Dec 16 07:17:14 crc kubenswrapper[4823]: W1216 07:17:14.678572 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8d5ee65e_affe_42fd_af62_724d11efe03d.slice/crio-192957ee765098040110d62885a79fc3c630bfa7544af32fbe860f0a54881b55 WatchSource:0}: Error finding container 192957ee765098040110d62885a79fc3c630bfa7544af32fbe860f0a54881b55: Status 404 returned error can't find the container with id 192957ee765098040110d62885a79fc3c630bfa7544af32fbe860f0a54881b55 Dec 16 07:17:14 crc kubenswrapper[4823]: I1216 07:17:14.801769 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5b79764b65-ndqjh" Dec 16 07:17:14 crc kubenswrapper[4823]: I1216 07:17:14.851639 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-2mqx2"] Dec 
16 07:17:14 crc kubenswrapper[4823]: I1216 07:17:14.852564 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-2mqx2" Dec 16 07:17:14 crc kubenswrapper[4823]: I1216 07:17:14.857113 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Dec 16 07:17:14 crc kubenswrapper[4823]: I1216 07:17:14.857218 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-8pjhw" Dec 16 07:17:14 crc kubenswrapper[4823]: I1216 07:17:14.869581 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-2mqx2"] Dec 16 07:17:14 crc kubenswrapper[4823]: I1216 07:17:14.873071 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4506b142-a95e-4cf3-a000-56fbee5e024d-db-sync-config-data\") pod \"glance-db-sync-2mqx2\" (UID: \"4506b142-a95e-4cf3-a000-56fbee5e024d\") " pod="openstack/glance-db-sync-2mqx2" Dec 16 07:17:14 crc kubenswrapper[4823]: I1216 07:17:14.873185 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4506b142-a95e-4cf3-a000-56fbee5e024d-combined-ca-bundle\") pod \"glance-db-sync-2mqx2\" (UID: \"4506b142-a95e-4cf3-a000-56fbee5e024d\") " pod="openstack/glance-db-sync-2mqx2" Dec 16 07:17:14 crc kubenswrapper[4823]: I1216 07:17:14.873329 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l25df\" (UniqueName: \"kubernetes.io/projected/4506b142-a95e-4cf3-a000-56fbee5e024d-kube-api-access-l25df\") pod \"glance-db-sync-2mqx2\" (UID: \"4506b142-a95e-4cf3-a000-56fbee5e024d\") " pod="openstack/glance-db-sync-2mqx2" Dec 16 07:17:14 crc kubenswrapper[4823]: I1216 07:17:14.873428 4823 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4506b142-a95e-4cf3-a000-56fbee5e024d-config-data\") pod \"glance-db-sync-2mqx2\" (UID: \"4506b142-a95e-4cf3-a000-56fbee5e024d\") " pod="openstack/glance-db-sync-2mqx2" Dec 16 07:17:14 crc kubenswrapper[4823]: I1216 07:17:14.978049 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l25df\" (UniqueName: \"kubernetes.io/projected/4506b142-a95e-4cf3-a000-56fbee5e024d-kube-api-access-l25df\") pod \"glance-db-sync-2mqx2\" (UID: \"4506b142-a95e-4cf3-a000-56fbee5e024d\") " pod="openstack/glance-db-sync-2mqx2" Dec 16 07:17:14 crc kubenswrapper[4823]: I1216 07:17:14.978139 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4506b142-a95e-4cf3-a000-56fbee5e024d-config-data\") pod \"glance-db-sync-2mqx2\" (UID: \"4506b142-a95e-4cf3-a000-56fbee5e024d\") " pod="openstack/glance-db-sync-2mqx2" Dec 16 07:17:14 crc kubenswrapper[4823]: I1216 07:17:14.978188 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4506b142-a95e-4cf3-a000-56fbee5e024d-db-sync-config-data\") pod \"glance-db-sync-2mqx2\" (UID: \"4506b142-a95e-4cf3-a000-56fbee5e024d\") " pod="openstack/glance-db-sync-2mqx2" Dec 16 07:17:14 crc kubenswrapper[4823]: I1216 07:17:14.978232 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4506b142-a95e-4cf3-a000-56fbee5e024d-combined-ca-bundle\") pod \"glance-db-sync-2mqx2\" (UID: \"4506b142-a95e-4cf3-a000-56fbee5e024d\") " pod="openstack/glance-db-sync-2mqx2" Dec 16 07:17:14 crc kubenswrapper[4823]: I1216 07:17:14.984374 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/4506b142-a95e-4cf3-a000-56fbee5e024d-db-sync-config-data\") pod \"glance-db-sync-2mqx2\" (UID: \"4506b142-a95e-4cf3-a000-56fbee5e024d\") " pod="openstack/glance-db-sync-2mqx2" Dec 16 07:17:14 crc kubenswrapper[4823]: I1216 07:17:14.986399 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4506b142-a95e-4cf3-a000-56fbee5e024d-combined-ca-bundle\") pod \"glance-db-sync-2mqx2\" (UID: \"4506b142-a95e-4cf3-a000-56fbee5e024d\") " pod="openstack/glance-db-sync-2mqx2" Dec 16 07:17:14 crc kubenswrapper[4823]: I1216 07:17:14.988059 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4506b142-a95e-4cf3-a000-56fbee5e024d-config-data\") pod \"glance-db-sync-2mqx2\" (UID: \"4506b142-a95e-4cf3-a000-56fbee5e024d\") " pod="openstack/glance-db-sync-2mqx2" Dec 16 07:17:14 crc kubenswrapper[4823]: I1216 07:17:14.989746 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-dxgzr"] Dec 16 07:17:15 crc kubenswrapper[4823]: I1216 07:17:15.000636 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l25df\" (UniqueName: \"kubernetes.io/projected/4506b142-a95e-4cf3-a000-56fbee5e024d-kube-api-access-l25df\") pod \"glance-db-sync-2mqx2\" (UID: \"4506b142-a95e-4cf3-a000-56fbee5e024d\") " pod="openstack/glance-db-sync-2mqx2" Dec 16 07:17:15 crc kubenswrapper[4823]: I1216 07:17:15.022190 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-586b989cdc-9fb9c" Dec 16 07:17:15 crc kubenswrapper[4823]: I1216 07:17:15.099256 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b79764b65-ndqjh"] Dec 16 07:17:15 crc kubenswrapper[4823]: I1216 07:17:15.112957 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-f3f4-account-create-update-tlhkf"] Dec 16 
07:17:15 crc kubenswrapper[4823]: I1216 07:17:15.190612 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-2mqx2" Dec 16 07:17:15 crc kubenswrapper[4823]: I1216 07:17:15.305699 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-f3f4-account-create-update-tlhkf" event={"ID":"d6ccf3be-a323-4df6-8c32-c646c4ced20f","Type":"ContainerStarted","Data":"0351077d1f1b19ebb11fd0c70508eb069f6ad75b01a78ade33e07ec7bd8a676a"} Dec 16 07:17:15 crc kubenswrapper[4823]: I1216 07:17:15.310173 4823 generic.go:334] "Generic (PLEG): container finished" podID="8d5ee65e-affe-42fd-af62-724d11efe03d" containerID="f3d1465f72d95e81aa577c6e692c288ce8109410382e3c7ab46eaaa8aa34126d" exitCode=0 Dec 16 07:17:15 crc kubenswrapper[4823]: I1216 07:17:15.310244 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-df58-account-create-update-6mv8r" event={"ID":"8d5ee65e-affe-42fd-af62-724d11efe03d","Type":"ContainerDied","Data":"f3d1465f72d95e81aa577c6e692c288ce8109410382e3c7ab46eaaa8aa34126d"} Dec 16 07:17:15 crc kubenswrapper[4823]: I1216 07:17:15.310269 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-df58-account-create-update-6mv8r" event={"ID":"8d5ee65e-affe-42fd-af62-724d11efe03d","Type":"ContainerStarted","Data":"192957ee765098040110d62885a79fc3c630bfa7544af32fbe860f0a54881b55"} Dec 16 07:17:15 crc kubenswrapper[4823]: I1216 07:17:15.311773 4823 generic.go:334] "Generic (PLEG): container finished" podID="74961679-896e-4f16-a5c3-12708a20a4b1" containerID="8b55431d53902ca53d22cd6d0b32b46fec767b6a6c22ef706d3cff33e889fb4b" exitCode=0 Dec 16 07:17:15 crc kubenswrapper[4823]: I1216 07:17:15.311825 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-76bqm" event={"ID":"74961679-896e-4f16-a5c3-12708a20a4b1","Type":"ContainerDied","Data":"8b55431d53902ca53d22cd6d0b32b46fec767b6a6c22ef706d3cff33e889fb4b"} Dec 16 07:17:15 crc 
kubenswrapper[4823]: I1216 07:17:15.311846 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-76bqm" event={"ID":"74961679-896e-4f16-a5c3-12708a20a4b1","Type":"ContainerStarted","Data":"79dca52db8cf73c9592e984c6480d4e97461946ede1eebbd8f3e78bc5b579a4a"} Dec 16 07:17:15 crc kubenswrapper[4823]: I1216 07:17:15.329637 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-dxgzr" event={"ID":"2aa17caf-7d1d-4094-9334-453fe242229e","Type":"ContainerStarted","Data":"5d9bcc2504fa53c56e883532e051415d71de97d009307590b10b97fd15a58fa3"} Dec 16 07:17:15 crc kubenswrapper[4823]: I1216 07:17:15.329819 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5b79764b65-ndqjh" podUID="45d886e8-673c-4192-9a4d-b507ecc835ac" containerName="dnsmasq-dns" containerID="cri-o://38c912911689fa5c46ae9dc7c4d1e631a6d378a2f816aa0fa0aef3ad6b7a2a60" gracePeriod=10 Dec 16 07:17:15 crc kubenswrapper[4823]: I1216 07:17:15.727556 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b79764b65-ndqjh" Dec 16 07:17:15 crc kubenswrapper[4823]: I1216 07:17:15.786466 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-2mqx2"] Dec 16 07:17:15 crc kubenswrapper[4823]: W1216 07:17:15.788146 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4506b142_a95e_4cf3_a000_56fbee5e024d.slice/crio-f646c71e61816734c61e0d887295a31557a0b683fc094b983a72589fb6390a47 WatchSource:0}: Error finding container f646c71e61816734c61e0d887295a31557a0b683fc094b983a72589fb6390a47: Status 404 returned error can't find the container with id f646c71e61816734c61e0d887295a31557a0b683fc094b983a72589fb6390a47 Dec 16 07:17:15 crc kubenswrapper[4823]: I1216 07:17:15.790823 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45d886e8-673c-4192-9a4d-b507ecc835ac-config\") pod \"45d886e8-673c-4192-9a4d-b507ecc835ac\" (UID: \"45d886e8-673c-4192-9a4d-b507ecc835ac\") " Dec 16 07:17:15 crc kubenswrapper[4823]: I1216 07:17:15.790895 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f5kgs\" (UniqueName: \"kubernetes.io/projected/45d886e8-673c-4192-9a4d-b507ecc835ac-kube-api-access-f5kgs\") pod \"45d886e8-673c-4192-9a4d-b507ecc835ac\" (UID: \"45d886e8-673c-4192-9a4d-b507ecc835ac\") " Dec 16 07:17:15 crc kubenswrapper[4823]: I1216 07:17:15.790940 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/45d886e8-673c-4192-9a4d-b507ecc835ac-dns-svc\") pod \"45d886e8-673c-4192-9a4d-b507ecc835ac\" (UID: \"45d886e8-673c-4192-9a4d-b507ecc835ac\") " Dec 16 07:17:15 crc kubenswrapper[4823]: I1216 07:17:15.791094 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/45d886e8-673c-4192-9a4d-b507ecc835ac-ovsdbserver-sb\") pod \"45d886e8-673c-4192-9a4d-b507ecc835ac\" (UID: \"45d886e8-673c-4192-9a4d-b507ecc835ac\") " Dec 16 07:17:15 crc kubenswrapper[4823]: I1216 07:17:15.799252 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45d886e8-673c-4192-9a4d-b507ecc835ac-kube-api-access-f5kgs" (OuterVolumeSpecName: "kube-api-access-f5kgs") pod "45d886e8-673c-4192-9a4d-b507ecc835ac" (UID: "45d886e8-673c-4192-9a4d-b507ecc835ac"). InnerVolumeSpecName "kube-api-access-f5kgs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:17:15 crc kubenswrapper[4823]: I1216 07:17:15.830541 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45d886e8-673c-4192-9a4d-b507ecc835ac-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "45d886e8-673c-4192-9a4d-b507ecc835ac" (UID: "45d886e8-673c-4192-9a4d-b507ecc835ac"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:17:15 crc kubenswrapper[4823]: I1216 07:17:15.831493 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45d886e8-673c-4192-9a4d-b507ecc835ac-config" (OuterVolumeSpecName: "config") pod "45d886e8-673c-4192-9a4d-b507ecc835ac" (UID: "45d886e8-673c-4192-9a4d-b507ecc835ac"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:17:15 crc kubenswrapper[4823]: I1216 07:17:15.834383 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45d886e8-673c-4192-9a4d-b507ecc835ac-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "45d886e8-673c-4192-9a4d-b507ecc835ac" (UID: "45d886e8-673c-4192-9a4d-b507ecc835ac"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:17:15 crc kubenswrapper[4823]: I1216 07:17:15.892540 4823 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/45d886e8-673c-4192-9a4d-b507ecc835ac-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 16 07:17:15 crc kubenswrapper[4823]: I1216 07:17:15.892571 4823 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45d886e8-673c-4192-9a4d-b507ecc835ac-config\") on node \"crc\" DevicePath \"\"" Dec 16 07:17:15 crc kubenswrapper[4823]: I1216 07:17:15.892581 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f5kgs\" (UniqueName: \"kubernetes.io/projected/45d886e8-673c-4192-9a4d-b507ecc835ac-kube-api-access-f5kgs\") on node \"crc\" DevicePath \"\"" Dec 16 07:17:15 crc kubenswrapper[4823]: I1216 07:17:15.892591 4823 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/45d886e8-673c-4192-9a4d-b507ecc835ac-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 16 07:17:16 crc kubenswrapper[4823]: I1216 07:17:16.188455 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 16 07:17:16 crc kubenswrapper[4823]: I1216 07:17:16.201927 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-67fdf7998c-x4fkj"] Dec 16 07:17:16 crc kubenswrapper[4823]: E1216 07:17:16.202315 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45d886e8-673c-4192-9a4d-b507ecc835ac" containerName="dnsmasq-dns" Dec 16 07:17:16 crc kubenswrapper[4823]: I1216 07:17:16.202332 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="45d886e8-673c-4192-9a4d-b507ecc835ac" containerName="dnsmasq-dns" Dec 16 07:17:16 crc kubenswrapper[4823]: E1216 07:17:16.202352 4823 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="45d886e8-673c-4192-9a4d-b507ecc835ac" containerName="init" Dec 16 07:17:16 crc kubenswrapper[4823]: I1216 07:17:16.202359 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="45d886e8-673c-4192-9a4d-b507ecc835ac" containerName="init" Dec 16 07:17:16 crc kubenswrapper[4823]: I1216 07:17:16.202492 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="45d886e8-673c-4192-9a4d-b507ecc835ac" containerName="dnsmasq-dns" Dec 16 07:17:16 crc kubenswrapper[4823]: I1216 07:17:16.203325 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67fdf7998c-x4fkj" Dec 16 07:17:16 crc kubenswrapper[4823]: I1216 07:17:16.239489 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67fdf7998c-x4fkj"] Dec 16 07:17:16 crc kubenswrapper[4823]: I1216 07:17:16.299864 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5282b108-1519-455e-b112-ad707af48a9f-config\") pod \"dnsmasq-dns-67fdf7998c-x4fkj\" (UID: \"5282b108-1519-455e-b112-ad707af48a9f\") " pod="openstack/dnsmasq-dns-67fdf7998c-x4fkj" Dec 16 07:17:16 crc kubenswrapper[4823]: I1216 07:17:16.299917 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5282b108-1519-455e-b112-ad707af48a9f-ovsdbserver-sb\") pod \"dnsmasq-dns-67fdf7998c-x4fkj\" (UID: \"5282b108-1519-455e-b112-ad707af48a9f\") " pod="openstack/dnsmasq-dns-67fdf7998c-x4fkj" Dec 16 07:17:16 crc kubenswrapper[4823]: I1216 07:17:16.299962 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2728\" (UniqueName: \"kubernetes.io/projected/5282b108-1519-455e-b112-ad707af48a9f-kube-api-access-q2728\") pod \"dnsmasq-dns-67fdf7998c-x4fkj\" (UID: \"5282b108-1519-455e-b112-ad707af48a9f\") " 
pod="openstack/dnsmasq-dns-67fdf7998c-x4fkj" Dec 16 07:17:16 crc kubenswrapper[4823]: I1216 07:17:16.299999 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5282b108-1519-455e-b112-ad707af48a9f-ovsdbserver-nb\") pod \"dnsmasq-dns-67fdf7998c-x4fkj\" (UID: \"5282b108-1519-455e-b112-ad707af48a9f\") " pod="openstack/dnsmasq-dns-67fdf7998c-x4fkj" Dec 16 07:17:16 crc kubenswrapper[4823]: I1216 07:17:16.300055 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5282b108-1519-455e-b112-ad707af48a9f-dns-svc\") pod \"dnsmasq-dns-67fdf7998c-x4fkj\" (UID: \"5282b108-1519-455e-b112-ad707af48a9f\") " pod="openstack/dnsmasq-dns-67fdf7998c-x4fkj" Dec 16 07:17:16 crc kubenswrapper[4823]: I1216 07:17:16.339206 4823 generic.go:334] "Generic (PLEG): container finished" podID="d6ccf3be-a323-4df6-8c32-c646c4ced20f" containerID="beed455aa58d1c72e6c92727c12ca8890c4f785879024962ab40d5107da8e72c" exitCode=0 Dec 16 07:17:16 crc kubenswrapper[4823]: I1216 07:17:16.339283 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-f3f4-account-create-update-tlhkf" event={"ID":"d6ccf3be-a323-4df6-8c32-c646c4ced20f","Type":"ContainerDied","Data":"beed455aa58d1c72e6c92727c12ca8890c4f785879024962ab40d5107da8e72c"} Dec 16 07:17:16 crc kubenswrapper[4823]: I1216 07:17:16.342935 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-2mqx2" event={"ID":"4506b142-a95e-4cf3-a000-56fbee5e024d","Type":"ContainerStarted","Data":"f646c71e61816734c61e0d887295a31557a0b683fc094b983a72589fb6390a47"} Dec 16 07:17:16 crc kubenswrapper[4823]: I1216 07:17:16.344977 4823 generic.go:334] "Generic (PLEG): container finished" podID="45d886e8-673c-4192-9a4d-b507ecc835ac" containerID="38c912911689fa5c46ae9dc7c4d1e631a6d378a2f816aa0fa0aef3ad6b7a2a60" exitCode=0 Dec 16 
07:17:16 crc kubenswrapper[4823]: I1216 07:17:16.345056 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b79764b65-ndqjh" event={"ID":"45d886e8-673c-4192-9a4d-b507ecc835ac","Type":"ContainerDied","Data":"38c912911689fa5c46ae9dc7c4d1e631a6d378a2f816aa0fa0aef3ad6b7a2a60"} Dec 16 07:17:16 crc kubenswrapper[4823]: I1216 07:17:16.345082 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b79764b65-ndqjh" event={"ID":"45d886e8-673c-4192-9a4d-b507ecc835ac","Type":"ContainerDied","Data":"0b90e91e1383123dd55fcce76682499e807e568da36ffb04363d98b86790dea4"} Dec 16 07:17:16 crc kubenswrapper[4823]: I1216 07:17:16.345102 4823 scope.go:117] "RemoveContainer" containerID="38c912911689fa5c46ae9dc7c4d1e631a6d378a2f816aa0fa0aef3ad6b7a2a60" Dec 16 07:17:16 crc kubenswrapper[4823]: I1216 07:17:16.345248 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b79764b65-ndqjh" Dec 16 07:17:16 crc kubenswrapper[4823]: I1216 07:17:16.352035 4823 generic.go:334] "Generic (PLEG): container finished" podID="2aa17caf-7d1d-4094-9334-453fe242229e" containerID="8be0dc60e3282098ffe9292c57b0b05f6e0084ca7fc9c9c39988f91573dc30f2" exitCode=0 Dec 16 07:17:16 crc kubenswrapper[4823]: I1216 07:17:16.352841 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-dxgzr" event={"ID":"2aa17caf-7d1d-4094-9334-453fe242229e","Type":"ContainerDied","Data":"8be0dc60e3282098ffe9292c57b0b05f6e0084ca7fc9c9c39988f91573dc30f2"} Dec 16 07:17:16 crc kubenswrapper[4823]: I1216 07:17:16.390595 4823 scope.go:117] "RemoveContainer" containerID="8a50792e79ac6e5fa1a1c7a8db9d24acbefecbadcd96ed8e500753c742da39e1" Dec 16 07:17:16 crc kubenswrapper[4823]: I1216 07:17:16.415806 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5282b108-1519-455e-b112-ad707af48a9f-ovsdbserver-nb\") pod 
\"dnsmasq-dns-67fdf7998c-x4fkj\" (UID: \"5282b108-1519-455e-b112-ad707af48a9f\") " pod="openstack/dnsmasq-dns-67fdf7998c-x4fkj" Dec 16 07:17:16 crc kubenswrapper[4823]: I1216 07:17:16.416096 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5282b108-1519-455e-b112-ad707af48a9f-dns-svc\") pod \"dnsmasq-dns-67fdf7998c-x4fkj\" (UID: \"5282b108-1519-455e-b112-ad707af48a9f\") " pod="openstack/dnsmasq-dns-67fdf7998c-x4fkj" Dec 16 07:17:16 crc kubenswrapper[4823]: I1216 07:17:16.416473 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5282b108-1519-455e-b112-ad707af48a9f-config\") pod \"dnsmasq-dns-67fdf7998c-x4fkj\" (UID: \"5282b108-1519-455e-b112-ad707af48a9f\") " pod="openstack/dnsmasq-dns-67fdf7998c-x4fkj" Dec 16 07:17:16 crc kubenswrapper[4823]: I1216 07:17:16.416500 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5282b108-1519-455e-b112-ad707af48a9f-ovsdbserver-sb\") pod \"dnsmasq-dns-67fdf7998c-x4fkj\" (UID: \"5282b108-1519-455e-b112-ad707af48a9f\") " pod="openstack/dnsmasq-dns-67fdf7998c-x4fkj" Dec 16 07:17:16 crc kubenswrapper[4823]: I1216 07:17:16.416538 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2728\" (UniqueName: \"kubernetes.io/projected/5282b108-1519-455e-b112-ad707af48a9f-kube-api-access-q2728\") pod \"dnsmasq-dns-67fdf7998c-x4fkj\" (UID: \"5282b108-1519-455e-b112-ad707af48a9f\") " pod="openstack/dnsmasq-dns-67fdf7998c-x4fkj" Dec 16 07:17:16 crc kubenswrapper[4823]: I1216 07:17:16.418289 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5282b108-1519-455e-b112-ad707af48a9f-dns-svc\") pod \"dnsmasq-dns-67fdf7998c-x4fkj\" (UID: \"5282b108-1519-455e-b112-ad707af48a9f\") " 
pod="openstack/dnsmasq-dns-67fdf7998c-x4fkj" Dec 16 07:17:16 crc kubenswrapper[4823]: I1216 07:17:16.418726 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5282b108-1519-455e-b112-ad707af48a9f-ovsdbserver-sb\") pod \"dnsmasq-dns-67fdf7998c-x4fkj\" (UID: \"5282b108-1519-455e-b112-ad707af48a9f\") " pod="openstack/dnsmasq-dns-67fdf7998c-x4fkj" Dec 16 07:17:16 crc kubenswrapper[4823]: I1216 07:17:16.418987 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5282b108-1519-455e-b112-ad707af48a9f-ovsdbserver-nb\") pod \"dnsmasq-dns-67fdf7998c-x4fkj\" (UID: \"5282b108-1519-455e-b112-ad707af48a9f\") " pod="openstack/dnsmasq-dns-67fdf7998c-x4fkj" Dec 16 07:17:16 crc kubenswrapper[4823]: I1216 07:17:16.419705 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5282b108-1519-455e-b112-ad707af48a9f-config\") pod \"dnsmasq-dns-67fdf7998c-x4fkj\" (UID: \"5282b108-1519-455e-b112-ad707af48a9f\") " pod="openstack/dnsmasq-dns-67fdf7998c-x4fkj" Dec 16 07:17:16 crc kubenswrapper[4823]: I1216 07:17:16.427578 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b79764b65-ndqjh"] Dec 16 07:17:16 crc kubenswrapper[4823]: I1216 07:17:16.437048 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2728\" (UniqueName: \"kubernetes.io/projected/5282b108-1519-455e-b112-ad707af48a9f-kube-api-access-q2728\") pod \"dnsmasq-dns-67fdf7998c-x4fkj\" (UID: \"5282b108-1519-455e-b112-ad707af48a9f\") " pod="openstack/dnsmasq-dns-67fdf7998c-x4fkj" Dec 16 07:17:16 crc kubenswrapper[4823]: I1216 07:17:16.440744 4823 scope.go:117] "RemoveContainer" containerID="38c912911689fa5c46ae9dc7c4d1e631a6d378a2f816aa0fa0aef3ad6b7a2a60" Dec 16 07:17:16 crc kubenswrapper[4823]: E1216 07:17:16.441405 4823 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38c912911689fa5c46ae9dc7c4d1e631a6d378a2f816aa0fa0aef3ad6b7a2a60\": container with ID starting with 38c912911689fa5c46ae9dc7c4d1e631a6d378a2f816aa0fa0aef3ad6b7a2a60 not found: ID does not exist" containerID="38c912911689fa5c46ae9dc7c4d1e631a6d378a2f816aa0fa0aef3ad6b7a2a60" Dec 16 07:17:16 crc kubenswrapper[4823]: I1216 07:17:16.441435 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38c912911689fa5c46ae9dc7c4d1e631a6d378a2f816aa0fa0aef3ad6b7a2a60"} err="failed to get container status \"38c912911689fa5c46ae9dc7c4d1e631a6d378a2f816aa0fa0aef3ad6b7a2a60\": rpc error: code = NotFound desc = could not find container \"38c912911689fa5c46ae9dc7c4d1e631a6d378a2f816aa0fa0aef3ad6b7a2a60\": container with ID starting with 38c912911689fa5c46ae9dc7c4d1e631a6d378a2f816aa0fa0aef3ad6b7a2a60 not found: ID does not exist" Dec 16 07:17:16 crc kubenswrapper[4823]: I1216 07:17:16.441457 4823 scope.go:117] "RemoveContainer" containerID="8a50792e79ac6e5fa1a1c7a8db9d24acbefecbadcd96ed8e500753c742da39e1" Dec 16 07:17:16 crc kubenswrapper[4823]: E1216 07:17:16.441692 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a50792e79ac6e5fa1a1c7a8db9d24acbefecbadcd96ed8e500753c742da39e1\": container with ID starting with 8a50792e79ac6e5fa1a1c7a8db9d24acbefecbadcd96ed8e500753c742da39e1 not found: ID does not exist" containerID="8a50792e79ac6e5fa1a1c7a8db9d24acbefecbadcd96ed8e500753c742da39e1" Dec 16 07:17:16 crc kubenswrapper[4823]: I1216 07:17:16.441712 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a50792e79ac6e5fa1a1c7a8db9d24acbefecbadcd96ed8e500753c742da39e1"} err="failed to get container status \"8a50792e79ac6e5fa1a1c7a8db9d24acbefecbadcd96ed8e500753c742da39e1\": rpc error: code = NotFound desc = could 
not find container \"8a50792e79ac6e5fa1a1c7a8db9d24acbefecbadcd96ed8e500753c742da39e1\": container with ID starting with 8a50792e79ac6e5fa1a1c7a8db9d24acbefecbadcd96ed8e500753c742da39e1 not found: ID does not exist" Dec 16 07:17:16 crc kubenswrapper[4823]: I1216 07:17:16.446379 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b79764b65-ndqjh"] Dec 16 07:17:16 crc kubenswrapper[4823]: I1216 07:17:16.523180 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67fdf7998c-x4fkj" Dec 16 07:17:16 crc kubenswrapper[4823]: I1216 07:17:16.669615 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-df58-account-create-update-6mv8r" Dec 16 07:17:16 crc kubenswrapper[4823]: I1216 07:17:16.724835 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8d5ee65e-affe-42fd-af62-724d11efe03d-operator-scripts\") pod \"8d5ee65e-affe-42fd-af62-724d11efe03d\" (UID: \"8d5ee65e-affe-42fd-af62-724d11efe03d\") " Dec 16 07:17:16 crc kubenswrapper[4823]: I1216 07:17:16.727482 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d5ee65e-affe-42fd-af62-724d11efe03d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8d5ee65e-affe-42fd-af62-724d11efe03d" (UID: "8d5ee65e-affe-42fd-af62-724d11efe03d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:17:16 crc kubenswrapper[4823]: I1216 07:17:16.742235 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-76bqm" Dec 16 07:17:16 crc kubenswrapper[4823]: I1216 07:17:16.827370 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-57dkh\" (UniqueName: \"kubernetes.io/projected/8d5ee65e-affe-42fd-af62-724d11efe03d-kube-api-access-57dkh\") pod \"8d5ee65e-affe-42fd-af62-724d11efe03d\" (UID: \"8d5ee65e-affe-42fd-af62-724d11efe03d\") " Dec 16 07:17:16 crc kubenswrapper[4823]: I1216 07:17:16.827476 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/74961679-896e-4f16-a5c3-12708a20a4b1-operator-scripts\") pod \"74961679-896e-4f16-a5c3-12708a20a4b1\" (UID: \"74961679-896e-4f16-a5c3-12708a20a4b1\") " Dec 16 07:17:16 crc kubenswrapper[4823]: I1216 07:17:16.827632 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gkc57\" (UniqueName: \"kubernetes.io/projected/74961679-896e-4f16-a5c3-12708a20a4b1-kube-api-access-gkc57\") pod \"74961679-896e-4f16-a5c3-12708a20a4b1\" (UID: \"74961679-896e-4f16-a5c3-12708a20a4b1\") " Dec 16 07:17:16 crc kubenswrapper[4823]: I1216 07:17:16.827937 4823 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8d5ee65e-affe-42fd-af62-724d11efe03d-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:17:16 crc kubenswrapper[4823]: I1216 07:17:16.828076 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74961679-896e-4f16-a5c3-12708a20a4b1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "74961679-896e-4f16-a5c3-12708a20a4b1" (UID: "74961679-896e-4f16-a5c3-12708a20a4b1"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:17:16 crc kubenswrapper[4823]: I1216 07:17:16.832286 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74961679-896e-4f16-a5c3-12708a20a4b1-kube-api-access-gkc57" (OuterVolumeSpecName: "kube-api-access-gkc57") pod "74961679-896e-4f16-a5c3-12708a20a4b1" (UID: "74961679-896e-4f16-a5c3-12708a20a4b1"). InnerVolumeSpecName "kube-api-access-gkc57". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:17:16 crc kubenswrapper[4823]: I1216 07:17:16.832395 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d5ee65e-affe-42fd-af62-724d11efe03d-kube-api-access-57dkh" (OuterVolumeSpecName: "kube-api-access-57dkh") pod "8d5ee65e-affe-42fd-af62-724d11efe03d" (UID: "8d5ee65e-affe-42fd-af62-724d11efe03d"). InnerVolumeSpecName "kube-api-access-57dkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:17:16 crc kubenswrapper[4823]: I1216 07:17:16.928790 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-57dkh\" (UniqueName: \"kubernetes.io/projected/8d5ee65e-affe-42fd-af62-724d11efe03d-kube-api-access-57dkh\") on node \"crc\" DevicePath \"\"" Dec 16 07:17:16 crc kubenswrapper[4823]: I1216 07:17:16.928823 4823 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/74961679-896e-4f16-a5c3-12708a20a4b1-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:17:16 crc kubenswrapper[4823]: I1216 07:17:16.928832 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gkc57\" (UniqueName: \"kubernetes.io/projected/74961679-896e-4f16-a5c3-12708a20a4b1-kube-api-access-gkc57\") on node \"crc\" DevicePath \"\"" Dec 16 07:17:17 crc kubenswrapper[4823]: I1216 07:17:17.029487 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67fdf7998c-x4fkj"] Dec 16 07:17:17 
crc kubenswrapper[4823]: I1216 07:17:17.242550 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Dec 16 07:17:17 crc kubenswrapper[4823]: E1216 07:17:17.243136 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d5ee65e-affe-42fd-af62-724d11efe03d" containerName="mariadb-account-create-update" Dec 16 07:17:17 crc kubenswrapper[4823]: I1216 07:17:17.243154 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d5ee65e-affe-42fd-af62-724d11efe03d" containerName="mariadb-account-create-update" Dec 16 07:17:17 crc kubenswrapper[4823]: E1216 07:17:17.243178 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74961679-896e-4f16-a5c3-12708a20a4b1" containerName="mariadb-database-create" Dec 16 07:17:17 crc kubenswrapper[4823]: I1216 07:17:17.243185 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="74961679-896e-4f16-a5c3-12708a20a4b1" containerName="mariadb-database-create" Dec 16 07:17:17 crc kubenswrapper[4823]: I1216 07:17:17.243348 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d5ee65e-affe-42fd-af62-724d11efe03d" containerName="mariadb-account-create-update" Dec 16 07:17:17 crc kubenswrapper[4823]: I1216 07:17:17.243363 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="74961679-896e-4f16-a5c3-12708a20a4b1" containerName="mariadb-database-create" Dec 16 07:17:17 crc kubenswrapper[4823]: I1216 07:17:17.248625 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Dec 16 07:17:17 crc kubenswrapper[4823]: I1216 07:17:17.253702 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Dec 16 07:17:17 crc kubenswrapper[4823]: I1216 07:17:17.253816 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Dec 16 07:17:17 crc kubenswrapper[4823]: I1216 07:17:17.253816 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-h2sbp" Dec 16 07:17:17 crc kubenswrapper[4823]: I1216 07:17:17.264107 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Dec 16 07:17:17 crc kubenswrapper[4823]: I1216 07:17:17.266430 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Dec 16 07:17:17 crc kubenswrapper[4823]: I1216 07:17:17.364356 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67fdf7998c-x4fkj" event={"ID":"5282b108-1519-455e-b112-ad707af48a9f","Type":"ContainerStarted","Data":"f709cd387b90c849459cf29f098990224eee24c85f315471bbe11e4fbdd027e8"} Dec 16 07:17:17 crc kubenswrapper[4823]: I1216 07:17:17.366663 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-df58-account-create-update-6mv8r" event={"ID":"8d5ee65e-affe-42fd-af62-724d11efe03d","Type":"ContainerDied","Data":"192957ee765098040110d62885a79fc3c630bfa7544af32fbe860f0a54881b55"} Dec 16 07:17:17 crc kubenswrapper[4823]: I1216 07:17:17.366715 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="192957ee765098040110d62885a79fc3c630bfa7544af32fbe860f0a54881b55" Dec 16 07:17:17 crc kubenswrapper[4823]: I1216 07:17:17.366778 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-df58-account-create-update-6mv8r" Dec 16 07:17:17 crc kubenswrapper[4823]: I1216 07:17:17.369066 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-76bqm" event={"ID":"74961679-896e-4f16-a5c3-12708a20a4b1","Type":"ContainerDied","Data":"79dca52db8cf73c9592e984c6480d4e97461946ede1eebbd8f3e78bc5b579a4a"} Dec 16 07:17:17 crc kubenswrapper[4823]: I1216 07:17:17.369110 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="79dca52db8cf73c9592e984c6480d4e97461946ede1eebbd8f3e78bc5b579a4a" Dec 16 07:17:17 crc kubenswrapper[4823]: I1216 07:17:17.369177 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-76bqm" Dec 16 07:17:17 crc kubenswrapper[4823]: I1216 07:17:17.436975 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/37eade87-02f6-4584-87d3-9b22e16ad915-cache\") pod \"swift-storage-0\" (UID: \"37eade87-02f6-4584-87d3-9b22e16ad915\") " pod="openstack/swift-storage-0" Dec 16 07:17:17 crc kubenswrapper[4823]: I1216 07:17:17.437031 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gvpn\" (UniqueName: \"kubernetes.io/projected/37eade87-02f6-4584-87d3-9b22e16ad915-kube-api-access-5gvpn\") pod \"swift-storage-0\" (UID: \"37eade87-02f6-4584-87d3-9b22e16ad915\") " pod="openstack/swift-storage-0" Dec 16 07:17:17 crc kubenswrapper[4823]: I1216 07:17:17.437067 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/37eade87-02f6-4584-87d3-9b22e16ad915-etc-swift\") pod \"swift-storage-0\" (UID: \"37eade87-02f6-4584-87d3-9b22e16ad915\") " pod="openstack/swift-storage-0" Dec 16 07:17:17 crc kubenswrapper[4823]: I1216 07:17:17.437091 4823 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"37eade87-02f6-4584-87d3-9b22e16ad915\") " pod="openstack/swift-storage-0" Dec 16 07:17:17 crc kubenswrapper[4823]: I1216 07:17:17.437135 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/37eade87-02f6-4584-87d3-9b22e16ad915-lock\") pod \"swift-storage-0\" (UID: \"37eade87-02f6-4584-87d3-9b22e16ad915\") " pod="openstack/swift-storage-0" Dec 16 07:17:17 crc kubenswrapper[4823]: I1216 07:17:17.539065 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/37eade87-02f6-4584-87d3-9b22e16ad915-lock\") pod \"swift-storage-0\" (UID: \"37eade87-02f6-4584-87d3-9b22e16ad915\") " pod="openstack/swift-storage-0" Dec 16 07:17:17 crc kubenswrapper[4823]: I1216 07:17:17.539168 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/37eade87-02f6-4584-87d3-9b22e16ad915-cache\") pod \"swift-storage-0\" (UID: \"37eade87-02f6-4584-87d3-9b22e16ad915\") " pod="openstack/swift-storage-0" Dec 16 07:17:17 crc kubenswrapper[4823]: I1216 07:17:17.539192 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5gvpn\" (UniqueName: \"kubernetes.io/projected/37eade87-02f6-4584-87d3-9b22e16ad915-kube-api-access-5gvpn\") pod \"swift-storage-0\" (UID: \"37eade87-02f6-4584-87d3-9b22e16ad915\") " pod="openstack/swift-storage-0" Dec 16 07:17:17 crc kubenswrapper[4823]: I1216 07:17:17.539227 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/37eade87-02f6-4584-87d3-9b22e16ad915-etc-swift\") pod \"swift-storage-0\" (UID: 
\"37eade87-02f6-4584-87d3-9b22e16ad915\") " pod="openstack/swift-storage-0" Dec 16 07:17:17 crc kubenswrapper[4823]: I1216 07:17:17.539252 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"37eade87-02f6-4584-87d3-9b22e16ad915\") " pod="openstack/swift-storage-0" Dec 16 07:17:17 crc kubenswrapper[4823]: E1216 07:17:17.539481 4823 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 16 07:17:17 crc kubenswrapper[4823]: E1216 07:17:17.539505 4823 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 16 07:17:17 crc kubenswrapper[4823]: E1216 07:17:17.539559 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/37eade87-02f6-4584-87d3-9b22e16ad915-etc-swift podName:37eade87-02f6-4584-87d3-9b22e16ad915 nodeName:}" failed. No retries permitted until 2025-12-16 07:17:18.039538349 +0000 UTC m=+1316.528104472 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/37eade87-02f6-4584-87d3-9b22e16ad915-etc-swift") pod "swift-storage-0" (UID: "37eade87-02f6-4584-87d3-9b22e16ad915") : configmap "swift-ring-files" not found Dec 16 07:17:17 crc kubenswrapper[4823]: I1216 07:17:17.539572 4823 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"37eade87-02f6-4584-87d3-9b22e16ad915\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/swift-storage-0" Dec 16 07:17:17 crc kubenswrapper[4823]: I1216 07:17:17.540074 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/37eade87-02f6-4584-87d3-9b22e16ad915-lock\") pod \"swift-storage-0\" (UID: \"37eade87-02f6-4584-87d3-9b22e16ad915\") " pod="openstack/swift-storage-0" Dec 16 07:17:17 crc kubenswrapper[4823]: I1216 07:17:17.540142 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/37eade87-02f6-4584-87d3-9b22e16ad915-cache\") pod \"swift-storage-0\" (UID: \"37eade87-02f6-4584-87d3-9b22e16ad915\") " pod="openstack/swift-storage-0" Dec 16 07:17:17 crc kubenswrapper[4823]: I1216 07:17:17.561364 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gvpn\" (UniqueName: \"kubernetes.io/projected/37eade87-02f6-4584-87d3-9b22e16ad915-kube-api-access-5gvpn\") pod \"swift-storage-0\" (UID: \"37eade87-02f6-4584-87d3-9b22e16ad915\") " pod="openstack/swift-storage-0" Dec 16 07:17:17 crc kubenswrapper[4823]: I1216 07:17:17.563224 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"37eade87-02f6-4584-87d3-9b22e16ad915\") " 
pod="openstack/swift-storage-0" Dec 16 07:17:17 crc kubenswrapper[4823]: I1216 07:17:17.719642 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-f3f4-account-create-update-tlhkf" Dec 16 07:17:17 crc kubenswrapper[4823]: I1216 07:17:17.748314 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hgxwf\" (UniqueName: \"kubernetes.io/projected/d6ccf3be-a323-4df6-8c32-c646c4ced20f-kube-api-access-hgxwf\") pod \"d6ccf3be-a323-4df6-8c32-c646c4ced20f\" (UID: \"d6ccf3be-a323-4df6-8c32-c646c4ced20f\") " Dec 16 07:17:17 crc kubenswrapper[4823]: I1216 07:17:17.748581 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d6ccf3be-a323-4df6-8c32-c646c4ced20f-operator-scripts\") pod \"d6ccf3be-a323-4df6-8c32-c646c4ced20f\" (UID: \"d6ccf3be-a323-4df6-8c32-c646c4ced20f\") " Dec 16 07:17:17 crc kubenswrapper[4823]: I1216 07:17:17.750907 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6ccf3be-a323-4df6-8c32-c646c4ced20f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d6ccf3be-a323-4df6-8c32-c646c4ced20f" (UID: "d6ccf3be-a323-4df6-8c32-c646c4ced20f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:17:17 crc kubenswrapper[4823]: I1216 07:17:17.756440 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6ccf3be-a323-4df6-8c32-c646c4ced20f-kube-api-access-hgxwf" (OuterVolumeSpecName: "kube-api-access-hgxwf") pod "d6ccf3be-a323-4df6-8c32-c646c4ced20f" (UID: "d6ccf3be-a323-4df6-8c32-c646c4ced20f"). InnerVolumeSpecName "kube-api-access-hgxwf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:17:17 crc kubenswrapper[4823]: I1216 07:17:17.793808 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45d886e8-673c-4192-9a4d-b507ecc835ac" path="/var/lib/kubelet/pods/45d886e8-673c-4192-9a4d-b507ecc835ac/volumes" Dec 16 07:17:17 crc kubenswrapper[4823]: I1216 07:17:17.800577 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-dxgzr" Dec 16 07:17:17 crc kubenswrapper[4823]: I1216 07:17:17.851202 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tmmdv\" (UniqueName: \"kubernetes.io/projected/2aa17caf-7d1d-4094-9334-453fe242229e-kube-api-access-tmmdv\") pod \"2aa17caf-7d1d-4094-9334-453fe242229e\" (UID: \"2aa17caf-7d1d-4094-9334-453fe242229e\") " Dec 16 07:17:17 crc kubenswrapper[4823]: I1216 07:17:17.851608 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2aa17caf-7d1d-4094-9334-453fe242229e-operator-scripts\") pod \"2aa17caf-7d1d-4094-9334-453fe242229e\" (UID: \"2aa17caf-7d1d-4094-9334-453fe242229e\") " Dec 16 07:17:17 crc kubenswrapper[4823]: I1216 07:17:17.852158 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2aa17caf-7d1d-4094-9334-453fe242229e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2aa17caf-7d1d-4094-9334-453fe242229e" (UID: "2aa17caf-7d1d-4094-9334-453fe242229e"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:17:17 crc kubenswrapper[4823]: I1216 07:17:17.852447 4823 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d6ccf3be-a323-4df6-8c32-c646c4ced20f-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:17:17 crc kubenswrapper[4823]: I1216 07:17:17.852469 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hgxwf\" (UniqueName: \"kubernetes.io/projected/d6ccf3be-a323-4df6-8c32-c646c4ced20f-kube-api-access-hgxwf\") on node \"crc\" DevicePath \"\"" Dec 16 07:17:17 crc kubenswrapper[4823]: I1216 07:17:17.852481 4823 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2aa17caf-7d1d-4094-9334-453fe242229e-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:17:17 crc kubenswrapper[4823]: I1216 07:17:17.854635 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2aa17caf-7d1d-4094-9334-453fe242229e-kube-api-access-tmmdv" (OuterVolumeSpecName: "kube-api-access-tmmdv") pod "2aa17caf-7d1d-4094-9334-453fe242229e" (UID: "2aa17caf-7d1d-4094-9334-453fe242229e"). InnerVolumeSpecName "kube-api-access-tmmdv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:17:17 crc kubenswrapper[4823]: I1216 07:17:17.954846 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tmmdv\" (UniqueName: \"kubernetes.io/projected/2aa17caf-7d1d-4094-9334-453fe242229e-kube-api-access-tmmdv\") on node \"crc\" DevicePath \"\"" Dec 16 07:17:18 crc kubenswrapper[4823]: I1216 07:17:18.055928 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/37eade87-02f6-4584-87d3-9b22e16ad915-etc-swift\") pod \"swift-storage-0\" (UID: \"37eade87-02f6-4584-87d3-9b22e16ad915\") " pod="openstack/swift-storage-0" Dec 16 07:17:18 crc kubenswrapper[4823]: E1216 07:17:18.056173 4823 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 16 07:17:18 crc kubenswrapper[4823]: E1216 07:17:18.056192 4823 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 16 07:17:18 crc kubenswrapper[4823]: E1216 07:17:18.056239 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/37eade87-02f6-4584-87d3-9b22e16ad915-etc-swift podName:37eade87-02f6-4584-87d3-9b22e16ad915 nodeName:}" failed. No retries permitted until 2025-12-16 07:17:19.056222973 +0000 UTC m=+1317.544789096 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/37eade87-02f6-4584-87d3-9b22e16ad915-etc-swift") pod "swift-storage-0" (UID: "37eade87-02f6-4584-87d3-9b22e16ad915") : configmap "swift-ring-files" not found Dec 16 07:17:18 crc kubenswrapper[4823]: I1216 07:17:18.397767 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-f3f4-account-create-update-tlhkf" event={"ID":"d6ccf3be-a323-4df6-8c32-c646c4ced20f","Type":"ContainerDied","Data":"0351077d1f1b19ebb11fd0c70508eb069f6ad75b01a78ade33e07ec7bd8a676a"} Dec 16 07:17:18 crc kubenswrapper[4823]: I1216 07:17:18.397808 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0351077d1f1b19ebb11fd0c70508eb069f6ad75b01a78ade33e07ec7bd8a676a" Dec 16 07:17:18 crc kubenswrapper[4823]: I1216 07:17:18.397871 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-f3f4-account-create-update-tlhkf" Dec 16 07:17:18 crc kubenswrapper[4823]: I1216 07:17:18.419667 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-dxgzr" event={"ID":"2aa17caf-7d1d-4094-9334-453fe242229e","Type":"ContainerDied","Data":"5d9bcc2504fa53c56e883532e051415d71de97d009307590b10b97fd15a58fa3"} Dec 16 07:17:18 crc kubenswrapper[4823]: I1216 07:17:18.419715 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5d9bcc2504fa53c56e883532e051415d71de97d009307590b10b97fd15a58fa3" Dec 16 07:17:18 crc kubenswrapper[4823]: I1216 07:17:18.419787 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-dxgzr" Dec 16 07:17:19 crc kubenswrapper[4823]: I1216 07:17:19.072694 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/37eade87-02f6-4584-87d3-9b22e16ad915-etc-swift\") pod \"swift-storage-0\" (UID: \"37eade87-02f6-4584-87d3-9b22e16ad915\") " pod="openstack/swift-storage-0" Dec 16 07:17:19 crc kubenswrapper[4823]: E1216 07:17:19.072992 4823 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 16 07:17:19 crc kubenswrapper[4823]: E1216 07:17:19.073214 4823 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 16 07:17:19 crc kubenswrapper[4823]: E1216 07:17:19.073299 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/37eade87-02f6-4584-87d3-9b22e16ad915-etc-swift podName:37eade87-02f6-4584-87d3-9b22e16ad915 nodeName:}" failed. No retries permitted until 2025-12-16 07:17:21.073274498 +0000 UTC m=+1319.561840661 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/37eade87-02f6-4584-87d3-9b22e16ad915-etc-swift") pod "swift-storage-0" (UID: "37eade87-02f6-4584-87d3-9b22e16ad915") : configmap "swift-ring-files" not found Dec 16 07:17:19 crc kubenswrapper[4823]: I1216 07:17:19.430226 4823 generic.go:334] "Generic (PLEG): container finished" podID="5282b108-1519-455e-b112-ad707af48a9f" containerID="6ee1a67603123534c21fbda13fcc80a4d47b24f3d7820be62d68dacfc183a448" exitCode=0 Dec 16 07:17:19 crc kubenswrapper[4823]: I1216 07:17:19.430283 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67fdf7998c-x4fkj" event={"ID":"5282b108-1519-455e-b112-ad707af48a9f","Type":"ContainerDied","Data":"6ee1a67603123534c21fbda13fcc80a4d47b24f3d7820be62d68dacfc183a448"} Dec 16 07:17:20 crc kubenswrapper[4823]: I1216 07:17:20.293967 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Dec 16 07:17:20 crc kubenswrapper[4823]: I1216 07:17:20.440382 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67fdf7998c-x4fkj" event={"ID":"5282b108-1519-455e-b112-ad707af48a9f","Type":"ContainerStarted","Data":"7b7c792a68d4e76b92c312443198e54881b3f7fc18a5ddf4d23f57980f79af89"} Dec 16 07:17:20 crc kubenswrapper[4823]: I1216 07:17:20.440486 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-67fdf7998c-x4fkj" Dec 16 07:17:20 crc kubenswrapper[4823]: I1216 07:17:20.468832 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-67fdf7998c-x4fkj" podStartSLOduration=4.468812202 podStartE2EDuration="4.468812202s" podCreationTimestamp="2025-12-16 07:17:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 07:17:20.461620247 +0000 UTC m=+1318.950186370" watchObservedRunningTime="2025-12-16 
07:17:20.468812202 +0000 UTC m=+1318.957378325" Dec 16 07:17:21 crc kubenswrapper[4823]: I1216 07:17:21.112433 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/37eade87-02f6-4584-87d3-9b22e16ad915-etc-swift\") pod \"swift-storage-0\" (UID: \"37eade87-02f6-4584-87d3-9b22e16ad915\") " pod="openstack/swift-storage-0" Dec 16 07:17:21 crc kubenswrapper[4823]: E1216 07:17:21.112639 4823 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 16 07:17:21 crc kubenswrapper[4823]: E1216 07:17:21.112847 4823 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 16 07:17:21 crc kubenswrapper[4823]: E1216 07:17:21.112893 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/37eade87-02f6-4584-87d3-9b22e16ad915-etc-swift podName:37eade87-02f6-4584-87d3-9b22e16ad915 nodeName:}" failed. No retries permitted until 2025-12-16 07:17:25.112879913 +0000 UTC m=+1323.601446036 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/37eade87-02f6-4584-87d3-9b22e16ad915-etc-swift") pod "swift-storage-0" (UID: "37eade87-02f6-4584-87d3-9b22e16ad915") : configmap "swift-ring-files" not found Dec 16 07:17:21 crc kubenswrapper[4823]: I1216 07:17:21.260243 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-nhgg2"] Dec 16 07:17:21 crc kubenswrapper[4823]: E1216 07:17:21.260653 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2aa17caf-7d1d-4094-9334-453fe242229e" containerName="mariadb-database-create" Dec 16 07:17:21 crc kubenswrapper[4823]: I1216 07:17:21.260675 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="2aa17caf-7d1d-4094-9334-453fe242229e" containerName="mariadb-database-create" Dec 16 07:17:21 crc kubenswrapper[4823]: E1216 07:17:21.260724 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6ccf3be-a323-4df6-8c32-c646c4ced20f" containerName="mariadb-account-create-update" Dec 16 07:17:21 crc kubenswrapper[4823]: I1216 07:17:21.260733 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6ccf3be-a323-4df6-8c32-c646c4ced20f" containerName="mariadb-account-create-update" Dec 16 07:17:21 crc kubenswrapper[4823]: I1216 07:17:21.260914 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6ccf3be-a323-4df6-8c32-c646c4ced20f" containerName="mariadb-account-create-update" Dec 16 07:17:21 crc kubenswrapper[4823]: I1216 07:17:21.260948 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="2aa17caf-7d1d-4094-9334-453fe242229e" containerName="mariadb-database-create" Dec 16 07:17:21 crc kubenswrapper[4823]: I1216 07:17:21.261573 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-nhgg2" Dec 16 07:17:21 crc kubenswrapper[4823]: I1216 07:17:21.267103 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Dec 16 07:17:21 crc kubenswrapper[4823]: I1216 07:17:21.267431 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Dec 16 07:17:21 crc kubenswrapper[4823]: I1216 07:17:21.267680 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Dec 16 07:17:21 crc kubenswrapper[4823]: I1216 07:17:21.278548 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-nhgg2"] Dec 16 07:17:21 crc kubenswrapper[4823]: I1216 07:17:21.318588 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5e02e173-17cf-486b-9c4a-b68aa6879f97-swiftconf\") pod \"swift-ring-rebalance-nhgg2\" (UID: \"5e02e173-17cf-486b-9c4a-b68aa6879f97\") " pod="openstack/swift-ring-rebalance-nhgg2" Dec 16 07:17:21 crc kubenswrapper[4823]: I1216 07:17:21.318637 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5e02e173-17cf-486b-9c4a-b68aa6879f97-etc-swift\") pod \"swift-ring-rebalance-nhgg2\" (UID: \"5e02e173-17cf-486b-9c4a-b68aa6879f97\") " pod="openstack/swift-ring-rebalance-nhgg2" Dec 16 07:17:21 crc kubenswrapper[4823]: I1216 07:17:21.318675 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5e02e173-17cf-486b-9c4a-b68aa6879f97-dispersionconf\") pod \"swift-ring-rebalance-nhgg2\" (UID: \"5e02e173-17cf-486b-9c4a-b68aa6879f97\") " pod="openstack/swift-ring-rebalance-nhgg2" Dec 16 07:17:21 crc kubenswrapper[4823]: I1216 07:17:21.318716 4823 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e02e173-17cf-486b-9c4a-b68aa6879f97-combined-ca-bundle\") pod \"swift-ring-rebalance-nhgg2\" (UID: \"5e02e173-17cf-486b-9c4a-b68aa6879f97\") " pod="openstack/swift-ring-rebalance-nhgg2" Dec 16 07:17:21 crc kubenswrapper[4823]: I1216 07:17:21.318757 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5e02e173-17cf-486b-9c4a-b68aa6879f97-ring-data-devices\") pod \"swift-ring-rebalance-nhgg2\" (UID: \"5e02e173-17cf-486b-9c4a-b68aa6879f97\") " pod="openstack/swift-ring-rebalance-nhgg2" Dec 16 07:17:21 crc kubenswrapper[4823]: I1216 07:17:21.318777 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5e02e173-17cf-486b-9c4a-b68aa6879f97-scripts\") pod \"swift-ring-rebalance-nhgg2\" (UID: \"5e02e173-17cf-486b-9c4a-b68aa6879f97\") " pod="openstack/swift-ring-rebalance-nhgg2" Dec 16 07:17:21 crc kubenswrapper[4823]: I1216 07:17:21.318815 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjhj7\" (UniqueName: \"kubernetes.io/projected/5e02e173-17cf-486b-9c4a-b68aa6879f97-kube-api-access-jjhj7\") pod \"swift-ring-rebalance-nhgg2\" (UID: \"5e02e173-17cf-486b-9c4a-b68aa6879f97\") " pod="openstack/swift-ring-rebalance-nhgg2" Dec 16 07:17:21 crc kubenswrapper[4823]: I1216 07:17:21.420547 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5e02e173-17cf-486b-9c4a-b68aa6879f97-dispersionconf\") pod \"swift-ring-rebalance-nhgg2\" (UID: \"5e02e173-17cf-486b-9c4a-b68aa6879f97\") " pod="openstack/swift-ring-rebalance-nhgg2" Dec 16 07:17:21 crc kubenswrapper[4823]: I1216 
07:17:21.420642 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e02e173-17cf-486b-9c4a-b68aa6879f97-combined-ca-bundle\") pod \"swift-ring-rebalance-nhgg2\" (UID: \"5e02e173-17cf-486b-9c4a-b68aa6879f97\") " pod="openstack/swift-ring-rebalance-nhgg2" Dec 16 07:17:21 crc kubenswrapper[4823]: I1216 07:17:21.420702 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5e02e173-17cf-486b-9c4a-b68aa6879f97-ring-data-devices\") pod \"swift-ring-rebalance-nhgg2\" (UID: \"5e02e173-17cf-486b-9c4a-b68aa6879f97\") " pod="openstack/swift-ring-rebalance-nhgg2" Dec 16 07:17:21 crc kubenswrapper[4823]: I1216 07:17:21.420736 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5e02e173-17cf-486b-9c4a-b68aa6879f97-scripts\") pod \"swift-ring-rebalance-nhgg2\" (UID: \"5e02e173-17cf-486b-9c4a-b68aa6879f97\") " pod="openstack/swift-ring-rebalance-nhgg2" Dec 16 07:17:21 crc kubenswrapper[4823]: I1216 07:17:21.420794 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjhj7\" (UniqueName: \"kubernetes.io/projected/5e02e173-17cf-486b-9c4a-b68aa6879f97-kube-api-access-jjhj7\") pod \"swift-ring-rebalance-nhgg2\" (UID: \"5e02e173-17cf-486b-9c4a-b68aa6879f97\") " pod="openstack/swift-ring-rebalance-nhgg2" Dec 16 07:17:21 crc kubenswrapper[4823]: I1216 07:17:21.420865 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5e02e173-17cf-486b-9c4a-b68aa6879f97-swiftconf\") pod \"swift-ring-rebalance-nhgg2\" (UID: \"5e02e173-17cf-486b-9c4a-b68aa6879f97\") " pod="openstack/swift-ring-rebalance-nhgg2" Dec 16 07:17:21 crc kubenswrapper[4823]: I1216 07:17:21.420900 4823 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5e02e173-17cf-486b-9c4a-b68aa6879f97-etc-swift\") pod \"swift-ring-rebalance-nhgg2\" (UID: \"5e02e173-17cf-486b-9c4a-b68aa6879f97\") " pod="openstack/swift-ring-rebalance-nhgg2" Dec 16 07:17:21 crc kubenswrapper[4823]: I1216 07:17:21.422044 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5e02e173-17cf-486b-9c4a-b68aa6879f97-etc-swift\") pod \"swift-ring-rebalance-nhgg2\" (UID: \"5e02e173-17cf-486b-9c4a-b68aa6879f97\") " pod="openstack/swift-ring-rebalance-nhgg2" Dec 16 07:17:21 crc kubenswrapper[4823]: I1216 07:17:21.422544 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5e02e173-17cf-486b-9c4a-b68aa6879f97-scripts\") pod \"swift-ring-rebalance-nhgg2\" (UID: \"5e02e173-17cf-486b-9c4a-b68aa6879f97\") " pod="openstack/swift-ring-rebalance-nhgg2" Dec 16 07:17:21 crc kubenswrapper[4823]: I1216 07:17:21.423820 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5e02e173-17cf-486b-9c4a-b68aa6879f97-ring-data-devices\") pod \"swift-ring-rebalance-nhgg2\" (UID: \"5e02e173-17cf-486b-9c4a-b68aa6879f97\") " pod="openstack/swift-ring-rebalance-nhgg2" Dec 16 07:17:21 crc kubenswrapper[4823]: I1216 07:17:21.427905 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5e02e173-17cf-486b-9c4a-b68aa6879f97-swiftconf\") pod \"swift-ring-rebalance-nhgg2\" (UID: \"5e02e173-17cf-486b-9c4a-b68aa6879f97\") " pod="openstack/swift-ring-rebalance-nhgg2" Dec 16 07:17:21 crc kubenswrapper[4823]: I1216 07:17:21.439471 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjhj7\" (UniqueName: 
\"kubernetes.io/projected/5e02e173-17cf-486b-9c4a-b68aa6879f97-kube-api-access-jjhj7\") pod \"swift-ring-rebalance-nhgg2\" (UID: \"5e02e173-17cf-486b-9c4a-b68aa6879f97\") " pod="openstack/swift-ring-rebalance-nhgg2" Dec 16 07:17:21 crc kubenswrapper[4823]: I1216 07:17:21.440347 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e02e173-17cf-486b-9c4a-b68aa6879f97-combined-ca-bundle\") pod \"swift-ring-rebalance-nhgg2\" (UID: \"5e02e173-17cf-486b-9c4a-b68aa6879f97\") " pod="openstack/swift-ring-rebalance-nhgg2" Dec 16 07:17:21 crc kubenswrapper[4823]: I1216 07:17:21.453278 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5e02e173-17cf-486b-9c4a-b68aa6879f97-dispersionconf\") pod \"swift-ring-rebalance-nhgg2\" (UID: \"5e02e173-17cf-486b-9c4a-b68aa6879f97\") " pod="openstack/swift-ring-rebalance-nhgg2" Dec 16 07:17:21 crc kubenswrapper[4823]: I1216 07:17:21.587928 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-nhgg2" Dec 16 07:17:22 crc kubenswrapper[4823]: I1216 07:17:22.064136 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-nhgg2"] Dec 16 07:17:25 crc kubenswrapper[4823]: I1216 07:17:25.203169 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/37eade87-02f6-4584-87d3-9b22e16ad915-etc-swift\") pod \"swift-storage-0\" (UID: \"37eade87-02f6-4584-87d3-9b22e16ad915\") " pod="openstack/swift-storage-0" Dec 16 07:17:25 crc kubenswrapper[4823]: E1216 07:17:25.203916 4823 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 16 07:17:25 crc kubenswrapper[4823]: E1216 07:17:25.203936 4823 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 16 07:17:25 crc kubenswrapper[4823]: E1216 07:17:25.203983 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/37eade87-02f6-4584-87d3-9b22e16ad915-etc-swift podName:37eade87-02f6-4584-87d3-9b22e16ad915 nodeName:}" failed. No retries permitted until 2025-12-16 07:17:33.203967442 +0000 UTC m=+1331.692533565 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/37eade87-02f6-4584-87d3-9b22e16ad915-etc-swift") pod "swift-storage-0" (UID: "37eade87-02f6-4584-87d3-9b22e16ad915") : configmap "swift-ring-files" not found Dec 16 07:17:26 crc kubenswrapper[4823]: I1216 07:17:26.525268 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-67fdf7998c-x4fkj" Dec 16 07:17:26 crc kubenswrapper[4823]: I1216 07:17:26.640211 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-586b989cdc-9fb9c"] Dec 16 07:17:26 crc kubenswrapper[4823]: I1216 07:17:26.640976 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-586b989cdc-9fb9c" podUID="9f3f8b57-525e-4d99-95c5-a4f41b0329c3" containerName="dnsmasq-dns" containerID="cri-o://17cc1e8fd8348b3bf8e06dcbe2d5aceb8cad9309336edad706be0035289e384a" gracePeriod=10 Dec 16 07:17:28 crc kubenswrapper[4823]: I1216 07:17:28.133828 4823 patch_prober.go:28] interesting pod/machine-config-daemon-fv56f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 07:17:28 crc kubenswrapper[4823]: I1216 07:17:28.134156 4823 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 07:17:28 crc kubenswrapper[4823]: I1216 07:17:28.134201 4823 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" Dec 16 07:17:28 crc kubenswrapper[4823]: I1216 07:17:28.134901 4823 
kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"76342a6438b46c6d8e5101ee8ceb1df808db353230663e448e28ebb26272e882"} pod="openshift-machine-config-operator/machine-config-daemon-fv56f" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 16 07:17:28 crc kubenswrapper[4823]: I1216 07:17:28.134960 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" containerName="machine-config-daemon" containerID="cri-o://76342a6438b46c6d8e5101ee8ceb1df808db353230663e448e28ebb26272e882" gracePeriod=600 Dec 16 07:17:28 crc kubenswrapper[4823]: I1216 07:17:28.537142 4823 generic.go:334] "Generic (PLEG): container finished" podID="25dec47c-3043-486c-b371-2be103c214e3" containerID="76342a6438b46c6d8e5101ee8ceb1df808db353230663e448e28ebb26272e882" exitCode=0 Dec 16 07:17:28 crc kubenswrapper[4823]: I1216 07:17:28.537214 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" event={"ID":"25dec47c-3043-486c-b371-2be103c214e3","Type":"ContainerDied","Data":"76342a6438b46c6d8e5101ee8ceb1df808db353230663e448e28ebb26272e882"} Dec 16 07:17:28 crc kubenswrapper[4823]: I1216 07:17:28.537275 4823 scope.go:117] "RemoveContainer" containerID="c07a7c4faebf9ec795cca9e8449add482643e386f41ece163e5f5944f0d37df3" Dec 16 07:17:28 crc kubenswrapper[4823]: I1216 07:17:28.539890 4823 generic.go:334] "Generic (PLEG): container finished" podID="9f3f8b57-525e-4d99-95c5-a4f41b0329c3" containerID="17cc1e8fd8348b3bf8e06dcbe2d5aceb8cad9309336edad706be0035289e384a" exitCode=0 Dec 16 07:17:28 crc kubenswrapper[4823]: I1216 07:17:28.539931 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-586b989cdc-9fb9c" 
event={"ID":"9f3f8b57-525e-4d99-95c5-a4f41b0329c3","Type":"ContainerDied","Data":"17cc1e8fd8348b3bf8e06dcbe2d5aceb8cad9309336edad706be0035289e384a"} Dec 16 07:17:29 crc kubenswrapper[4823]: I1216 07:17:29.982486 4823 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-fvqqp" podUID="5fe879e4-70bf-4f38-a4a7-98f5eb23a769" containerName="ovn-controller" probeResult="failure" output=< Dec 16 07:17:29 crc kubenswrapper[4823]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Dec 16 07:17:29 crc kubenswrapper[4823]: > Dec 16 07:17:30 crc kubenswrapper[4823]: I1216 07:17:30.009375 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-29jcz" Dec 16 07:17:30 crc kubenswrapper[4823]: I1216 07:17:30.010733 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-29jcz" Dec 16 07:17:30 crc kubenswrapper[4823]: I1216 07:17:30.021143 4823 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-586b989cdc-9fb9c" podUID="9f3f8b57-525e-4d99-95c5-a4f41b0329c3" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.109:5353: connect: connection refused" Dec 16 07:17:30 crc kubenswrapper[4823]: I1216 07:17:30.446108 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-fvqqp-config-psgj5"] Dec 16 07:17:30 crc kubenswrapper[4823]: I1216 07:17:30.447587 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-fvqqp-config-psgj5"] Dec 16 07:17:30 crc kubenswrapper[4823]: I1216 07:17:30.447713 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-fvqqp-config-psgj5" Dec 16 07:17:30 crc kubenswrapper[4823]: I1216 07:17:30.449787 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Dec 16 07:17:30 crc kubenswrapper[4823]: I1216 07:17:30.563906 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c19bc4c1-5228-4f72-9260-24c7501fcf29-var-run\") pod \"ovn-controller-fvqqp-config-psgj5\" (UID: \"c19bc4c1-5228-4f72-9260-24c7501fcf29\") " pod="openstack/ovn-controller-fvqqp-config-psgj5" Dec 16 07:17:30 crc kubenswrapper[4823]: I1216 07:17:30.563952 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7v8pb\" (UniqueName: \"kubernetes.io/projected/c19bc4c1-5228-4f72-9260-24c7501fcf29-kube-api-access-7v8pb\") pod \"ovn-controller-fvqqp-config-psgj5\" (UID: \"c19bc4c1-5228-4f72-9260-24c7501fcf29\") " pod="openstack/ovn-controller-fvqqp-config-psgj5" Dec 16 07:17:30 crc kubenswrapper[4823]: I1216 07:17:30.563988 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c19bc4c1-5228-4f72-9260-24c7501fcf29-additional-scripts\") pod \"ovn-controller-fvqqp-config-psgj5\" (UID: \"c19bc4c1-5228-4f72-9260-24c7501fcf29\") " pod="openstack/ovn-controller-fvqqp-config-psgj5" Dec 16 07:17:30 crc kubenswrapper[4823]: I1216 07:17:30.564039 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c19bc4c1-5228-4f72-9260-24c7501fcf29-scripts\") pod \"ovn-controller-fvqqp-config-psgj5\" (UID: \"c19bc4c1-5228-4f72-9260-24c7501fcf29\") " pod="openstack/ovn-controller-fvqqp-config-psgj5" Dec 16 07:17:30 crc kubenswrapper[4823]: I1216 07:17:30.564059 4823 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c19bc4c1-5228-4f72-9260-24c7501fcf29-var-log-ovn\") pod \"ovn-controller-fvqqp-config-psgj5\" (UID: \"c19bc4c1-5228-4f72-9260-24c7501fcf29\") " pod="openstack/ovn-controller-fvqqp-config-psgj5" Dec 16 07:17:30 crc kubenswrapper[4823]: I1216 07:17:30.564082 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c19bc4c1-5228-4f72-9260-24c7501fcf29-var-run-ovn\") pod \"ovn-controller-fvqqp-config-psgj5\" (UID: \"c19bc4c1-5228-4f72-9260-24c7501fcf29\") " pod="openstack/ovn-controller-fvqqp-config-psgj5" Dec 16 07:17:30 crc kubenswrapper[4823]: I1216 07:17:30.664942 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c19bc4c1-5228-4f72-9260-24c7501fcf29-additional-scripts\") pod \"ovn-controller-fvqqp-config-psgj5\" (UID: \"c19bc4c1-5228-4f72-9260-24c7501fcf29\") " pod="openstack/ovn-controller-fvqqp-config-psgj5" Dec 16 07:17:30 crc kubenswrapper[4823]: I1216 07:17:30.665033 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c19bc4c1-5228-4f72-9260-24c7501fcf29-scripts\") pod \"ovn-controller-fvqqp-config-psgj5\" (UID: \"c19bc4c1-5228-4f72-9260-24c7501fcf29\") " pod="openstack/ovn-controller-fvqqp-config-psgj5" Dec 16 07:17:30 crc kubenswrapper[4823]: I1216 07:17:30.665067 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c19bc4c1-5228-4f72-9260-24c7501fcf29-var-log-ovn\") pod \"ovn-controller-fvqqp-config-psgj5\" (UID: \"c19bc4c1-5228-4f72-9260-24c7501fcf29\") " pod="openstack/ovn-controller-fvqqp-config-psgj5" Dec 16 07:17:30 crc kubenswrapper[4823]: I1216 
07:17:30.665096 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c19bc4c1-5228-4f72-9260-24c7501fcf29-var-run-ovn\") pod \"ovn-controller-fvqqp-config-psgj5\" (UID: \"c19bc4c1-5228-4f72-9260-24c7501fcf29\") " pod="openstack/ovn-controller-fvqqp-config-psgj5" Dec 16 07:17:30 crc kubenswrapper[4823]: I1216 07:17:30.665196 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c19bc4c1-5228-4f72-9260-24c7501fcf29-var-run\") pod \"ovn-controller-fvqqp-config-psgj5\" (UID: \"c19bc4c1-5228-4f72-9260-24c7501fcf29\") " pod="openstack/ovn-controller-fvqqp-config-psgj5" Dec 16 07:17:30 crc kubenswrapper[4823]: I1216 07:17:30.665247 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7v8pb\" (UniqueName: \"kubernetes.io/projected/c19bc4c1-5228-4f72-9260-24c7501fcf29-kube-api-access-7v8pb\") pod \"ovn-controller-fvqqp-config-psgj5\" (UID: \"c19bc4c1-5228-4f72-9260-24c7501fcf29\") " pod="openstack/ovn-controller-fvqqp-config-psgj5" Dec 16 07:17:30 crc kubenswrapper[4823]: I1216 07:17:30.665402 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c19bc4c1-5228-4f72-9260-24c7501fcf29-var-log-ovn\") pod \"ovn-controller-fvqqp-config-psgj5\" (UID: \"c19bc4c1-5228-4f72-9260-24c7501fcf29\") " pod="openstack/ovn-controller-fvqqp-config-psgj5" Dec 16 07:17:30 crc kubenswrapper[4823]: I1216 07:17:30.665446 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c19bc4c1-5228-4f72-9260-24c7501fcf29-var-run-ovn\") pod \"ovn-controller-fvqqp-config-psgj5\" (UID: \"c19bc4c1-5228-4f72-9260-24c7501fcf29\") " pod="openstack/ovn-controller-fvqqp-config-psgj5" Dec 16 07:17:30 crc kubenswrapper[4823]: I1216 07:17:30.665475 4823 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c19bc4c1-5228-4f72-9260-24c7501fcf29-var-run\") pod \"ovn-controller-fvqqp-config-psgj5\" (UID: \"c19bc4c1-5228-4f72-9260-24c7501fcf29\") " pod="openstack/ovn-controller-fvqqp-config-psgj5" Dec 16 07:17:30 crc kubenswrapper[4823]: I1216 07:17:30.665795 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c19bc4c1-5228-4f72-9260-24c7501fcf29-additional-scripts\") pod \"ovn-controller-fvqqp-config-psgj5\" (UID: \"c19bc4c1-5228-4f72-9260-24c7501fcf29\") " pod="openstack/ovn-controller-fvqqp-config-psgj5" Dec 16 07:17:30 crc kubenswrapper[4823]: I1216 07:17:30.668477 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c19bc4c1-5228-4f72-9260-24c7501fcf29-scripts\") pod \"ovn-controller-fvqqp-config-psgj5\" (UID: \"c19bc4c1-5228-4f72-9260-24c7501fcf29\") " pod="openstack/ovn-controller-fvqqp-config-psgj5" Dec 16 07:17:30 crc kubenswrapper[4823]: I1216 07:17:30.710088 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7v8pb\" (UniqueName: \"kubernetes.io/projected/c19bc4c1-5228-4f72-9260-24c7501fcf29-kube-api-access-7v8pb\") pod \"ovn-controller-fvqqp-config-psgj5\" (UID: \"c19bc4c1-5228-4f72-9260-24c7501fcf29\") " pod="openstack/ovn-controller-fvqqp-config-psgj5" Dec 16 07:17:30 crc kubenswrapper[4823]: I1216 07:17:30.783707 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-fvqqp-config-psgj5" Dec 16 07:17:31 crc kubenswrapper[4823]: I1216 07:17:31.578334 4823 generic.go:334] "Generic (PLEG): container finished" podID="a686a945-8fa0-406c-ac01-cf061c865a28" containerID="0639ca39d4b510f82c5a92153f15cb0546ff06018f3f66e0dd1e7b8d07959478" exitCode=0 Dec 16 07:17:31 crc kubenswrapper[4823]: I1216 07:17:31.578401 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"a686a945-8fa0-406c-ac01-cf061c865a28","Type":"ContainerDied","Data":"0639ca39d4b510f82c5a92153f15cb0546ff06018f3f66e0dd1e7b8d07959478"} Dec 16 07:17:32 crc kubenswrapper[4823]: W1216 07:17:32.380376 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5e02e173_17cf_486b_9c4a_b68aa6879f97.slice/crio-42590f3e75a61120a03aa37caa5ebe4345cea087163b3825eb6bea600955d83a WatchSource:0}: Error finding container 42590f3e75a61120a03aa37caa5ebe4345cea087163b3825eb6bea600955d83a: Status 404 returned error can't find the container with id 42590f3e75a61120a03aa37caa5ebe4345cea087163b3825eb6bea600955d83a Dec 16 07:17:32 crc kubenswrapper[4823]: E1216 07:17:32.431894 4823 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-glance-api@sha256:e4aa4ebbb1e581a12040e9ad2ae2709ac31b5d965bb64fc4252d1028b05c565f" Dec 16 07:17:32 crc kubenswrapper[4823]: E1216 07:17:32.432154 4823 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:glance-db-sync,Image:quay.io/podified-antelope-centos9/openstack-glance-api@sha256:e4aa4ebbb1e581a12040e9ad2ae2709ac31b5d965bb64fc4252d1028b05c565f,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/glance/glance.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-l25df,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42415,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42415,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-db-sync-2mqx2_openstack(4506b142-a95e-4cf3-a000-56fbee5e024d): ErrImagePull: rpc error: code = Canceled desc = 
copying config: context canceled" logger="UnhandledError" Dec 16 07:17:32 crc kubenswrapper[4823]: E1216 07:17:32.433540 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/glance-db-sync-2mqx2" podUID="4506b142-a95e-4cf3-a000-56fbee5e024d" Dec 16 07:17:32 crc kubenswrapper[4823]: I1216 07:17:32.605383 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-nhgg2" event={"ID":"5e02e173-17cf-486b-9c4a-b68aa6879f97","Type":"ContainerStarted","Data":"42590f3e75a61120a03aa37caa5ebe4345cea087163b3825eb6bea600955d83a"} Dec 16 07:17:32 crc kubenswrapper[4823]: E1216 07:17:32.626192 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-glance-api@sha256:e4aa4ebbb1e581a12040e9ad2ae2709ac31b5d965bb64fc4252d1028b05c565f\\\"\"" pod="openstack/glance-db-sync-2mqx2" podUID="4506b142-a95e-4cf3-a000-56fbee5e024d" Dec 16 07:17:32 crc kubenswrapper[4823]: I1216 07:17:32.687250 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-586b989cdc-9fb9c" Dec 16 07:17:32 crc kubenswrapper[4823]: I1216 07:17:32.804888 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9f3f8b57-525e-4d99-95c5-a4f41b0329c3-ovsdbserver-sb\") pod \"9f3f8b57-525e-4d99-95c5-a4f41b0329c3\" (UID: \"9f3f8b57-525e-4d99-95c5-a4f41b0329c3\") " Dec 16 07:17:32 crc kubenswrapper[4823]: I1216 07:17:32.804970 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p57nq\" (UniqueName: \"kubernetes.io/projected/9f3f8b57-525e-4d99-95c5-a4f41b0329c3-kube-api-access-p57nq\") pod \"9f3f8b57-525e-4d99-95c5-a4f41b0329c3\" (UID: \"9f3f8b57-525e-4d99-95c5-a4f41b0329c3\") " Dec 16 07:17:32 crc kubenswrapper[4823]: I1216 07:17:32.805036 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f3f8b57-525e-4d99-95c5-a4f41b0329c3-config\") pod \"9f3f8b57-525e-4d99-95c5-a4f41b0329c3\" (UID: \"9f3f8b57-525e-4d99-95c5-a4f41b0329c3\") " Dec 16 07:17:32 crc kubenswrapper[4823]: I1216 07:17:32.805069 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9f3f8b57-525e-4d99-95c5-a4f41b0329c3-dns-svc\") pod \"9f3f8b57-525e-4d99-95c5-a4f41b0329c3\" (UID: \"9f3f8b57-525e-4d99-95c5-a4f41b0329c3\") " Dec 16 07:17:32 crc kubenswrapper[4823]: I1216 07:17:32.805145 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9f3f8b57-525e-4d99-95c5-a4f41b0329c3-ovsdbserver-nb\") pod \"9f3f8b57-525e-4d99-95c5-a4f41b0329c3\" (UID: \"9f3f8b57-525e-4d99-95c5-a4f41b0329c3\") " Dec 16 07:17:32 crc kubenswrapper[4823]: I1216 07:17:32.811323 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/9f3f8b57-525e-4d99-95c5-a4f41b0329c3-kube-api-access-p57nq" (OuterVolumeSpecName: "kube-api-access-p57nq") pod "9f3f8b57-525e-4d99-95c5-a4f41b0329c3" (UID: "9f3f8b57-525e-4d99-95c5-a4f41b0329c3"). InnerVolumeSpecName "kube-api-access-p57nq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:17:32 crc kubenswrapper[4823]: I1216 07:17:32.854284 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f3f8b57-525e-4d99-95c5-a4f41b0329c3-config" (OuterVolumeSpecName: "config") pod "9f3f8b57-525e-4d99-95c5-a4f41b0329c3" (UID: "9f3f8b57-525e-4d99-95c5-a4f41b0329c3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:17:32 crc kubenswrapper[4823]: I1216 07:17:32.857809 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f3f8b57-525e-4d99-95c5-a4f41b0329c3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9f3f8b57-525e-4d99-95c5-a4f41b0329c3" (UID: "9f3f8b57-525e-4d99-95c5-a4f41b0329c3"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:17:32 crc kubenswrapper[4823]: I1216 07:17:32.868533 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f3f8b57-525e-4d99-95c5-a4f41b0329c3-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9f3f8b57-525e-4d99-95c5-a4f41b0329c3" (UID: "9f3f8b57-525e-4d99-95c5-a4f41b0329c3"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:17:32 crc kubenswrapper[4823]: I1216 07:17:32.874749 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f3f8b57-525e-4d99-95c5-a4f41b0329c3-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9f3f8b57-525e-4d99-95c5-a4f41b0329c3" (UID: "9f3f8b57-525e-4d99-95c5-a4f41b0329c3"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:17:32 crc kubenswrapper[4823]: I1216 07:17:32.878525 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-fvqqp-config-psgj5"] Dec 16 07:17:32 crc kubenswrapper[4823]: I1216 07:17:32.907064 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p57nq\" (UniqueName: \"kubernetes.io/projected/9f3f8b57-525e-4d99-95c5-a4f41b0329c3-kube-api-access-p57nq\") on node \"crc\" DevicePath \"\"" Dec 16 07:17:32 crc kubenswrapper[4823]: I1216 07:17:32.907108 4823 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f3f8b57-525e-4d99-95c5-a4f41b0329c3-config\") on node \"crc\" DevicePath \"\"" Dec 16 07:17:32 crc kubenswrapper[4823]: I1216 07:17:32.907120 4823 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9f3f8b57-525e-4d99-95c5-a4f41b0329c3-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 16 07:17:32 crc kubenswrapper[4823]: I1216 07:17:32.907131 4823 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9f3f8b57-525e-4d99-95c5-a4f41b0329c3-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 16 07:17:32 crc kubenswrapper[4823]: I1216 07:17:32.907144 4823 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9f3f8b57-525e-4d99-95c5-a4f41b0329c3-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 16 07:17:33 crc kubenswrapper[4823]: I1216 07:17:33.211576 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/37eade87-02f6-4584-87d3-9b22e16ad915-etc-swift\") pod \"swift-storage-0\" (UID: \"37eade87-02f6-4584-87d3-9b22e16ad915\") " pod="openstack/swift-storage-0" Dec 16 07:17:33 crc kubenswrapper[4823]: E1216 07:17:33.211762 4823 projected.go:288] Couldn't 
get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 16 07:17:33 crc kubenswrapper[4823]: E1216 07:17:33.211877 4823 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 16 07:17:33 crc kubenswrapper[4823]: E1216 07:17:33.211930 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/37eade87-02f6-4584-87d3-9b22e16ad915-etc-swift podName:37eade87-02f6-4584-87d3-9b22e16ad915 nodeName:}" failed. No retries permitted until 2025-12-16 07:17:49.211914136 +0000 UTC m=+1347.700480259 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/37eade87-02f6-4584-87d3-9b22e16ad915-etc-swift") pod "swift-storage-0" (UID: "37eade87-02f6-4584-87d3-9b22e16ad915") : configmap "swift-ring-files" not found Dec 16 07:17:33 crc kubenswrapper[4823]: I1216 07:17:33.633129 4823 generic.go:334] "Generic (PLEG): container finished" podID="c19bc4c1-5228-4f72-9260-24c7501fcf29" containerID="982d249f69b777a6b32b89fdb716b811aa505c0b63a3d52a07e50118e0f78094" exitCode=0 Dec 16 07:17:33 crc kubenswrapper[4823]: I1216 07:17:33.633216 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-fvqqp-config-psgj5" event={"ID":"c19bc4c1-5228-4f72-9260-24c7501fcf29","Type":"ContainerDied","Data":"982d249f69b777a6b32b89fdb716b811aa505c0b63a3d52a07e50118e0f78094"} Dec 16 07:17:33 crc kubenswrapper[4823]: I1216 07:17:33.633487 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-fvqqp-config-psgj5" event={"ID":"c19bc4c1-5228-4f72-9260-24c7501fcf29","Type":"ContainerStarted","Data":"2cab6e8ad507a78d3fa6830bfdfa1987d106a0ee66c3c17306e0a3363d4194bb"} Dec 16 07:17:33 crc kubenswrapper[4823]: I1216 07:17:33.635580 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" 
event={"ID":"25dec47c-3043-486c-b371-2be103c214e3","Type":"ContainerStarted","Data":"37b5da4c3e0632087412acf947c72a2aad7577385641e763185ee25747c43921"} Dec 16 07:17:33 crc kubenswrapper[4823]: I1216 07:17:33.637816 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-586b989cdc-9fb9c" event={"ID":"9f3f8b57-525e-4d99-95c5-a4f41b0329c3","Type":"ContainerDied","Data":"a7d9ba5c1a561ac6ed74b0ca1bc1c5fb58bf72ea5baa8dba7d19c527c0b95ae9"} Dec 16 07:17:33 crc kubenswrapper[4823]: I1216 07:17:33.637864 4823 scope.go:117] "RemoveContainer" containerID="17cc1e8fd8348b3bf8e06dcbe2d5aceb8cad9309336edad706be0035289e384a" Dec 16 07:17:33 crc kubenswrapper[4823]: I1216 07:17:33.637966 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-586b989cdc-9fb9c" Dec 16 07:17:33 crc kubenswrapper[4823]: I1216 07:17:33.656114 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"a686a945-8fa0-406c-ac01-cf061c865a28","Type":"ContainerStarted","Data":"36a98e82cbcb4bee731b20517aebf25ec378c019a17c67f3b8b2c9437196612b"} Dec 16 07:17:33 crc kubenswrapper[4823]: I1216 07:17:33.656375 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 16 07:17:33 crc kubenswrapper[4823]: I1216 07:17:33.676653 4823 scope.go:117] "RemoveContainer" containerID="05a10d39894185589e65377738a6d7c93d73da48bd92ab655c9d06377116e944" Dec 16 07:17:33 crc kubenswrapper[4823]: I1216 07:17:33.692597 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-586b989cdc-9fb9c"] Dec 16 07:17:33 crc kubenswrapper[4823]: I1216 07:17:33.700697 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-586b989cdc-9fb9c"] Dec 16 07:17:33 crc kubenswrapper[4823]: I1216 07:17:33.717428 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" 
podStartSLOduration=39.077819114 podStartE2EDuration="1m13.717409299s" podCreationTimestamp="2025-12-16 07:16:20 +0000 UTC" firstStartedPulling="2025-12-16 07:16:22.808899267 +0000 UTC m=+1261.297465390" lastFinishedPulling="2025-12-16 07:16:57.448489452 +0000 UTC m=+1295.937055575" observedRunningTime="2025-12-16 07:17:33.711628148 +0000 UTC m=+1332.200194271" watchObservedRunningTime="2025-12-16 07:17:33.717409299 +0000 UTC m=+1332.205975412" Dec 16 07:17:33 crc kubenswrapper[4823]: I1216 07:17:33.783763 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f3f8b57-525e-4d99-95c5-a4f41b0329c3" path="/var/lib/kubelet/pods/9f3f8b57-525e-4d99-95c5-a4f41b0329c3/volumes" Dec 16 07:17:34 crc kubenswrapper[4823]: I1216 07:17:34.933144 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-fvqqp" Dec 16 07:17:35 crc kubenswrapper[4823]: I1216 07:17:35.676839 4823 generic.go:334] "Generic (PLEG): container finished" podID="cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1" containerID="bdecbd186c280c8e1a08344d429387d2b5b9ce5dc22f4986496eacc03840a6ae" exitCode=0 Dec 16 07:17:35 crc kubenswrapper[4823]: I1216 07:17:35.676982 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1","Type":"ContainerDied","Data":"bdecbd186c280c8e1a08344d429387d2b5b9ce5dc22f4986496eacc03840a6ae"} Dec 16 07:17:36 crc kubenswrapper[4823]: I1216 07:17:36.686335 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-fvqqp-config-psgj5" event={"ID":"c19bc4c1-5228-4f72-9260-24c7501fcf29","Type":"ContainerDied","Data":"2cab6e8ad507a78d3fa6830bfdfa1987d106a0ee66c3c17306e0a3363d4194bb"} Dec 16 07:17:36 crc kubenswrapper[4823]: I1216 07:17:36.686391 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2cab6e8ad507a78d3fa6830bfdfa1987d106a0ee66c3c17306e0a3363d4194bb" Dec 16 07:17:36 crc 
kubenswrapper[4823]: I1216 07:17:36.936348 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-fvqqp-config-psgj5" Dec 16 07:17:37 crc kubenswrapper[4823]: I1216 07:17:37.087794 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c19bc4c1-5228-4f72-9260-24c7501fcf29-var-run-ovn\") pod \"c19bc4c1-5228-4f72-9260-24c7501fcf29\" (UID: \"c19bc4c1-5228-4f72-9260-24c7501fcf29\") " Dec 16 07:17:37 crc kubenswrapper[4823]: I1216 07:17:37.087858 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c19bc4c1-5228-4f72-9260-24c7501fcf29-scripts\") pod \"c19bc4c1-5228-4f72-9260-24c7501fcf29\" (UID: \"c19bc4c1-5228-4f72-9260-24c7501fcf29\") " Dec 16 07:17:37 crc kubenswrapper[4823]: I1216 07:17:37.087893 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7v8pb\" (UniqueName: \"kubernetes.io/projected/c19bc4c1-5228-4f72-9260-24c7501fcf29-kube-api-access-7v8pb\") pod \"c19bc4c1-5228-4f72-9260-24c7501fcf29\" (UID: \"c19bc4c1-5228-4f72-9260-24c7501fcf29\") " Dec 16 07:17:37 crc kubenswrapper[4823]: I1216 07:17:37.087924 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c19bc4c1-5228-4f72-9260-24c7501fcf29-additional-scripts\") pod \"c19bc4c1-5228-4f72-9260-24c7501fcf29\" (UID: \"c19bc4c1-5228-4f72-9260-24c7501fcf29\") " Dec 16 07:17:37 crc kubenswrapper[4823]: I1216 07:17:37.087957 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c19bc4c1-5228-4f72-9260-24c7501fcf29-var-run\") pod \"c19bc4c1-5228-4f72-9260-24c7501fcf29\" (UID: \"c19bc4c1-5228-4f72-9260-24c7501fcf29\") " Dec 16 07:17:37 crc kubenswrapper[4823]: I1216 07:17:37.087962 
4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c19bc4c1-5228-4f72-9260-24c7501fcf29-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "c19bc4c1-5228-4f72-9260-24c7501fcf29" (UID: "c19bc4c1-5228-4f72-9260-24c7501fcf29"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 07:17:37 crc kubenswrapper[4823]: I1216 07:17:37.088073 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c19bc4c1-5228-4f72-9260-24c7501fcf29-var-run" (OuterVolumeSpecName: "var-run") pod "c19bc4c1-5228-4f72-9260-24c7501fcf29" (UID: "c19bc4c1-5228-4f72-9260-24c7501fcf29"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 07:17:37 crc kubenswrapper[4823]: I1216 07:17:37.088645 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c19bc4c1-5228-4f72-9260-24c7501fcf29-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "c19bc4c1-5228-4f72-9260-24c7501fcf29" (UID: "c19bc4c1-5228-4f72-9260-24c7501fcf29"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:17:37 crc kubenswrapper[4823]: I1216 07:17:37.088712 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c19bc4c1-5228-4f72-9260-24c7501fcf29-var-log-ovn\") pod \"c19bc4c1-5228-4f72-9260-24c7501fcf29\" (UID: \"c19bc4c1-5228-4f72-9260-24c7501fcf29\") " Dec 16 07:17:37 crc kubenswrapper[4823]: I1216 07:17:37.088787 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c19bc4c1-5228-4f72-9260-24c7501fcf29-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "c19bc4c1-5228-4f72-9260-24c7501fcf29" (UID: "c19bc4c1-5228-4f72-9260-24c7501fcf29"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 07:17:37 crc kubenswrapper[4823]: I1216 07:17:37.089131 4823 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c19bc4c1-5228-4f72-9260-24c7501fcf29-var-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 16 07:17:37 crc kubenswrapper[4823]: I1216 07:17:37.089147 4823 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c19bc4c1-5228-4f72-9260-24c7501fcf29-additional-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:17:37 crc kubenswrapper[4823]: I1216 07:17:37.089158 4823 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c19bc4c1-5228-4f72-9260-24c7501fcf29-var-run\") on node \"crc\" DevicePath \"\"" Dec 16 07:17:37 crc kubenswrapper[4823]: I1216 07:17:37.089166 4823 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c19bc4c1-5228-4f72-9260-24c7501fcf29-var-log-ovn\") on node \"crc\" DevicePath \"\"" Dec 16 07:17:37 crc kubenswrapper[4823]: I1216 07:17:37.089171 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c19bc4c1-5228-4f72-9260-24c7501fcf29-scripts" (OuterVolumeSpecName: "scripts") pod "c19bc4c1-5228-4f72-9260-24c7501fcf29" (UID: "c19bc4c1-5228-4f72-9260-24c7501fcf29"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:17:37 crc kubenswrapper[4823]: I1216 07:17:37.094587 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c19bc4c1-5228-4f72-9260-24c7501fcf29-kube-api-access-7v8pb" (OuterVolumeSpecName: "kube-api-access-7v8pb") pod "c19bc4c1-5228-4f72-9260-24c7501fcf29" (UID: "c19bc4c1-5228-4f72-9260-24c7501fcf29"). InnerVolumeSpecName "kube-api-access-7v8pb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:17:37 crc kubenswrapper[4823]: I1216 07:17:37.190556 4823 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c19bc4c1-5228-4f72-9260-24c7501fcf29-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:17:37 crc kubenswrapper[4823]: I1216 07:17:37.190593 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7v8pb\" (UniqueName: \"kubernetes.io/projected/c19bc4c1-5228-4f72-9260-24c7501fcf29-kube-api-access-7v8pb\") on node \"crc\" DevicePath \"\"" Dec 16 07:17:37 crc kubenswrapper[4823]: I1216 07:17:37.694779 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-nhgg2" event={"ID":"5e02e173-17cf-486b-9c4a-b68aa6879f97","Type":"ContainerStarted","Data":"cf3000c3c19630c7b52fe3ee392070cdf29cc20770328f3c4cee0ca752e7ce59"} Dec 16 07:17:37 crc kubenswrapper[4823]: I1216 07:17:37.697161 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1","Type":"ContainerStarted","Data":"51ee0e5df9e688f5c88a35c0aa9dd24ecbc2d9cd3579ec6b75a7584a9bee2720"} Dec 16 07:17:37 crc kubenswrapper[4823]: I1216 07:17:37.697186 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-fvqqp-config-psgj5" Dec 16 07:17:37 crc kubenswrapper[4823]: I1216 07:17:37.697552 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 16 07:17:37 crc kubenswrapper[4823]: I1216 07:17:37.723760 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-nhgg2" podStartSLOduration=12.358784003 podStartE2EDuration="16.723742605s" podCreationTimestamp="2025-12-16 07:17:21 +0000 UTC" firstStartedPulling="2025-12-16 07:17:32.392682792 +0000 UTC m=+1330.881248915" lastFinishedPulling="2025-12-16 07:17:36.757641374 +0000 UTC m=+1335.246207517" observedRunningTime="2025-12-16 07:17:37.719247534 +0000 UTC m=+1336.207813657" watchObservedRunningTime="2025-12-16 07:17:37.723742605 +0000 UTC m=+1336.212308728" Dec 16 07:17:37 crc kubenswrapper[4823]: I1216 07:17:37.745155 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=-9223371958.109636 podStartE2EDuration="1m18.745138685s" podCreationTimestamp="2025-12-16 07:16:19 +0000 UTC" firstStartedPulling="2025-12-16 07:16:21.961870023 +0000 UTC m=+1260.450436146" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 07:17:37.740928663 +0000 UTC m=+1336.229494796" watchObservedRunningTime="2025-12-16 07:17:37.745138685 +0000 UTC m=+1336.233704808" Dec 16 07:17:38 crc kubenswrapper[4823]: I1216 07:17:38.041084 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-fvqqp-config-psgj5"] Dec 16 07:17:38 crc kubenswrapper[4823]: I1216 07:17:38.047829 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-fvqqp-config-psgj5"] Dec 16 07:17:38 crc kubenswrapper[4823]: I1216 07:17:38.155171 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-fvqqp-config-hzdbj"] Dec 16 07:17:38 crc 
kubenswrapper[4823]: E1216 07:17:38.155570 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f3f8b57-525e-4d99-95c5-a4f41b0329c3" containerName="dnsmasq-dns" Dec 16 07:17:38 crc kubenswrapper[4823]: I1216 07:17:38.155593 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f3f8b57-525e-4d99-95c5-a4f41b0329c3" containerName="dnsmasq-dns" Dec 16 07:17:38 crc kubenswrapper[4823]: E1216 07:17:38.155617 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f3f8b57-525e-4d99-95c5-a4f41b0329c3" containerName="init" Dec 16 07:17:38 crc kubenswrapper[4823]: I1216 07:17:38.155625 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f3f8b57-525e-4d99-95c5-a4f41b0329c3" containerName="init" Dec 16 07:17:38 crc kubenswrapper[4823]: E1216 07:17:38.155650 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c19bc4c1-5228-4f72-9260-24c7501fcf29" containerName="ovn-config" Dec 16 07:17:38 crc kubenswrapper[4823]: I1216 07:17:38.155657 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="c19bc4c1-5228-4f72-9260-24c7501fcf29" containerName="ovn-config" Dec 16 07:17:38 crc kubenswrapper[4823]: I1216 07:17:38.155862 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f3f8b57-525e-4d99-95c5-a4f41b0329c3" containerName="dnsmasq-dns" Dec 16 07:17:38 crc kubenswrapper[4823]: I1216 07:17:38.155889 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="c19bc4c1-5228-4f72-9260-24c7501fcf29" containerName="ovn-config" Dec 16 07:17:38 crc kubenswrapper[4823]: I1216 07:17:38.156556 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-fvqqp-config-hzdbj" Dec 16 07:17:38 crc kubenswrapper[4823]: I1216 07:17:38.159378 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Dec 16 07:17:38 crc kubenswrapper[4823]: I1216 07:17:38.175661 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-fvqqp-config-hzdbj"] Dec 16 07:17:38 crc kubenswrapper[4823]: I1216 07:17:38.209778 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e3533a65-c660-4e94-a820-1488e9eb1108-scripts\") pod \"ovn-controller-fvqqp-config-hzdbj\" (UID: \"e3533a65-c660-4e94-a820-1488e9eb1108\") " pod="openstack/ovn-controller-fvqqp-config-hzdbj" Dec 16 07:17:38 crc kubenswrapper[4823]: I1216 07:17:38.209835 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e3533a65-c660-4e94-a820-1488e9eb1108-var-log-ovn\") pod \"ovn-controller-fvqqp-config-hzdbj\" (UID: \"e3533a65-c660-4e94-a820-1488e9eb1108\") " pod="openstack/ovn-controller-fvqqp-config-hzdbj" Dec 16 07:17:38 crc kubenswrapper[4823]: I1216 07:17:38.210044 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e3533a65-c660-4e94-a820-1488e9eb1108-var-run\") pod \"ovn-controller-fvqqp-config-hzdbj\" (UID: \"e3533a65-c660-4e94-a820-1488e9eb1108\") " pod="openstack/ovn-controller-fvqqp-config-hzdbj" Dec 16 07:17:38 crc kubenswrapper[4823]: I1216 07:17:38.210142 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6kqc\" (UniqueName: \"kubernetes.io/projected/e3533a65-c660-4e94-a820-1488e9eb1108-kube-api-access-w6kqc\") pod \"ovn-controller-fvqqp-config-hzdbj\" (UID: 
\"e3533a65-c660-4e94-a820-1488e9eb1108\") " pod="openstack/ovn-controller-fvqqp-config-hzdbj" Dec 16 07:17:38 crc kubenswrapper[4823]: I1216 07:17:38.210484 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e3533a65-c660-4e94-a820-1488e9eb1108-var-run-ovn\") pod \"ovn-controller-fvqqp-config-hzdbj\" (UID: \"e3533a65-c660-4e94-a820-1488e9eb1108\") " pod="openstack/ovn-controller-fvqqp-config-hzdbj" Dec 16 07:17:38 crc kubenswrapper[4823]: I1216 07:17:38.210571 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/e3533a65-c660-4e94-a820-1488e9eb1108-additional-scripts\") pod \"ovn-controller-fvqqp-config-hzdbj\" (UID: \"e3533a65-c660-4e94-a820-1488e9eb1108\") " pod="openstack/ovn-controller-fvqqp-config-hzdbj" Dec 16 07:17:38 crc kubenswrapper[4823]: I1216 07:17:38.312440 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e3533a65-c660-4e94-a820-1488e9eb1108-var-run\") pod \"ovn-controller-fvqqp-config-hzdbj\" (UID: \"e3533a65-c660-4e94-a820-1488e9eb1108\") " pod="openstack/ovn-controller-fvqqp-config-hzdbj" Dec 16 07:17:38 crc kubenswrapper[4823]: I1216 07:17:38.312516 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6kqc\" (UniqueName: \"kubernetes.io/projected/e3533a65-c660-4e94-a820-1488e9eb1108-kube-api-access-w6kqc\") pod \"ovn-controller-fvqqp-config-hzdbj\" (UID: \"e3533a65-c660-4e94-a820-1488e9eb1108\") " pod="openstack/ovn-controller-fvqqp-config-hzdbj" Dec 16 07:17:38 crc kubenswrapper[4823]: I1216 07:17:38.312610 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e3533a65-c660-4e94-a820-1488e9eb1108-var-run-ovn\") pod 
\"ovn-controller-fvqqp-config-hzdbj\" (UID: \"e3533a65-c660-4e94-a820-1488e9eb1108\") " pod="openstack/ovn-controller-fvqqp-config-hzdbj" Dec 16 07:17:38 crc kubenswrapper[4823]: I1216 07:17:38.312642 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/e3533a65-c660-4e94-a820-1488e9eb1108-additional-scripts\") pod \"ovn-controller-fvqqp-config-hzdbj\" (UID: \"e3533a65-c660-4e94-a820-1488e9eb1108\") " pod="openstack/ovn-controller-fvqqp-config-hzdbj" Dec 16 07:17:38 crc kubenswrapper[4823]: I1216 07:17:38.312667 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e3533a65-c660-4e94-a820-1488e9eb1108-scripts\") pod \"ovn-controller-fvqqp-config-hzdbj\" (UID: \"e3533a65-c660-4e94-a820-1488e9eb1108\") " pod="openstack/ovn-controller-fvqqp-config-hzdbj" Dec 16 07:17:38 crc kubenswrapper[4823]: I1216 07:17:38.312686 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e3533a65-c660-4e94-a820-1488e9eb1108-var-log-ovn\") pod \"ovn-controller-fvqqp-config-hzdbj\" (UID: \"e3533a65-c660-4e94-a820-1488e9eb1108\") " pod="openstack/ovn-controller-fvqqp-config-hzdbj" Dec 16 07:17:38 crc kubenswrapper[4823]: I1216 07:17:38.312813 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e3533a65-c660-4e94-a820-1488e9eb1108-var-log-ovn\") pod \"ovn-controller-fvqqp-config-hzdbj\" (UID: \"e3533a65-c660-4e94-a820-1488e9eb1108\") " pod="openstack/ovn-controller-fvqqp-config-hzdbj" Dec 16 07:17:38 crc kubenswrapper[4823]: I1216 07:17:38.312842 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e3533a65-c660-4e94-a820-1488e9eb1108-var-run\") pod \"ovn-controller-fvqqp-config-hzdbj\" (UID: 
\"e3533a65-c660-4e94-a820-1488e9eb1108\") " pod="openstack/ovn-controller-fvqqp-config-hzdbj" Dec 16 07:17:38 crc kubenswrapper[4823]: I1216 07:17:38.312864 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e3533a65-c660-4e94-a820-1488e9eb1108-var-run-ovn\") pod \"ovn-controller-fvqqp-config-hzdbj\" (UID: \"e3533a65-c660-4e94-a820-1488e9eb1108\") " pod="openstack/ovn-controller-fvqqp-config-hzdbj" Dec 16 07:17:38 crc kubenswrapper[4823]: I1216 07:17:38.313629 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/e3533a65-c660-4e94-a820-1488e9eb1108-additional-scripts\") pod \"ovn-controller-fvqqp-config-hzdbj\" (UID: \"e3533a65-c660-4e94-a820-1488e9eb1108\") " pod="openstack/ovn-controller-fvqqp-config-hzdbj" Dec 16 07:17:38 crc kubenswrapper[4823]: I1216 07:17:38.315294 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e3533a65-c660-4e94-a820-1488e9eb1108-scripts\") pod \"ovn-controller-fvqqp-config-hzdbj\" (UID: \"e3533a65-c660-4e94-a820-1488e9eb1108\") " pod="openstack/ovn-controller-fvqqp-config-hzdbj" Dec 16 07:17:38 crc kubenswrapper[4823]: I1216 07:17:38.333901 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6kqc\" (UniqueName: \"kubernetes.io/projected/e3533a65-c660-4e94-a820-1488e9eb1108-kube-api-access-w6kqc\") pod \"ovn-controller-fvqqp-config-hzdbj\" (UID: \"e3533a65-c660-4e94-a820-1488e9eb1108\") " pod="openstack/ovn-controller-fvqqp-config-hzdbj" Dec 16 07:17:38 crc kubenswrapper[4823]: I1216 07:17:38.477087 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-fvqqp-config-hzdbj" Dec 16 07:17:38 crc kubenswrapper[4823]: I1216 07:17:38.918334 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-fvqqp-config-hzdbj"] Dec 16 07:17:39 crc kubenswrapper[4823]: I1216 07:17:39.718157 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-fvqqp-config-hzdbj" event={"ID":"e3533a65-c660-4e94-a820-1488e9eb1108","Type":"ContainerStarted","Data":"178f692ca2b4248aba6322541c8c8d404e1f3755c471036e3a5c112f6767916d"} Dec 16 07:17:39 crc kubenswrapper[4823]: I1216 07:17:39.718708 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-fvqqp-config-hzdbj" event={"ID":"e3533a65-c660-4e94-a820-1488e9eb1108","Type":"ContainerStarted","Data":"2b4354169bf981eecb6706930288e2b8c49b75e0977df630c0fac645f56802f3"} Dec 16 07:17:39 crc kubenswrapper[4823]: I1216 07:17:39.740179 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-fvqqp-config-hzdbj" podStartSLOduration=1.740160232 podStartE2EDuration="1.740160232s" podCreationTimestamp="2025-12-16 07:17:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 07:17:39.735554358 +0000 UTC m=+1338.224120511" watchObservedRunningTime="2025-12-16 07:17:39.740160232 +0000 UTC m=+1338.228726355" Dec 16 07:17:39 crc kubenswrapper[4823]: I1216 07:17:39.782378 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c19bc4c1-5228-4f72-9260-24c7501fcf29" path="/var/lib/kubelet/pods/c19bc4c1-5228-4f72-9260-24c7501fcf29/volumes" Dec 16 07:17:40 crc kubenswrapper[4823]: I1216 07:17:40.725994 4823 generic.go:334] "Generic (PLEG): container finished" podID="e3533a65-c660-4e94-a820-1488e9eb1108" containerID="178f692ca2b4248aba6322541c8c8d404e1f3755c471036e3a5c112f6767916d" exitCode=0 Dec 16 07:17:40 crc 
kubenswrapper[4823]: I1216 07:17:40.726056 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-fvqqp-config-hzdbj" event={"ID":"e3533a65-c660-4e94-a820-1488e9eb1108","Type":"ContainerDied","Data":"178f692ca2b4248aba6322541c8c8d404e1f3755c471036e3a5c112f6767916d"} Dec 16 07:17:42 crc kubenswrapper[4823]: I1216 07:17:42.128010 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-fvqqp-config-hzdbj" Dec 16 07:17:42 crc kubenswrapper[4823]: I1216 07:17:42.182303 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e3533a65-c660-4e94-a820-1488e9eb1108-scripts\") pod \"e3533a65-c660-4e94-a820-1488e9eb1108\" (UID: \"e3533a65-c660-4e94-a820-1488e9eb1108\") " Dec 16 07:17:42 crc kubenswrapper[4823]: I1216 07:17:42.182400 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e3533a65-c660-4e94-a820-1488e9eb1108-var-run\") pod \"e3533a65-c660-4e94-a820-1488e9eb1108\" (UID: \"e3533a65-c660-4e94-a820-1488e9eb1108\") " Dec 16 07:17:42 crc kubenswrapper[4823]: I1216 07:17:42.182459 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e3533a65-c660-4e94-a820-1488e9eb1108-var-run" (OuterVolumeSpecName: "var-run") pod "e3533a65-c660-4e94-a820-1488e9eb1108" (UID: "e3533a65-c660-4e94-a820-1488e9eb1108"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 07:17:42 crc kubenswrapper[4823]: I1216 07:17:42.182521 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w6kqc\" (UniqueName: \"kubernetes.io/projected/e3533a65-c660-4e94-a820-1488e9eb1108-kube-api-access-w6kqc\") pod \"e3533a65-c660-4e94-a820-1488e9eb1108\" (UID: \"e3533a65-c660-4e94-a820-1488e9eb1108\") " Dec 16 07:17:42 crc kubenswrapper[4823]: I1216 07:17:42.182687 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e3533a65-c660-4e94-a820-1488e9eb1108-var-run-ovn\") pod \"e3533a65-c660-4e94-a820-1488e9eb1108\" (UID: \"e3533a65-c660-4e94-a820-1488e9eb1108\") " Dec 16 07:17:42 crc kubenswrapper[4823]: I1216 07:17:42.182723 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/e3533a65-c660-4e94-a820-1488e9eb1108-additional-scripts\") pod \"e3533a65-c660-4e94-a820-1488e9eb1108\" (UID: \"e3533a65-c660-4e94-a820-1488e9eb1108\") " Dec 16 07:17:42 crc kubenswrapper[4823]: I1216 07:17:42.182760 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e3533a65-c660-4e94-a820-1488e9eb1108-var-log-ovn\") pod \"e3533a65-c660-4e94-a820-1488e9eb1108\" (UID: \"e3533a65-c660-4e94-a820-1488e9eb1108\") " Dec 16 07:17:42 crc kubenswrapper[4823]: I1216 07:17:42.182849 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e3533a65-c660-4e94-a820-1488e9eb1108-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "e3533a65-c660-4e94-a820-1488e9eb1108" (UID: "e3533a65-c660-4e94-a820-1488e9eb1108"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 07:17:42 crc kubenswrapper[4823]: I1216 07:17:42.183011 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e3533a65-c660-4e94-a820-1488e9eb1108-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "e3533a65-c660-4e94-a820-1488e9eb1108" (UID: "e3533a65-c660-4e94-a820-1488e9eb1108"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 07:17:42 crc kubenswrapper[4823]: I1216 07:17:42.183537 4823 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e3533a65-c660-4e94-a820-1488e9eb1108-var-log-ovn\") on node \"crc\" DevicePath \"\"" Dec 16 07:17:42 crc kubenswrapper[4823]: I1216 07:17:42.183554 4823 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e3533a65-c660-4e94-a820-1488e9eb1108-var-run\") on node \"crc\" DevicePath \"\"" Dec 16 07:17:42 crc kubenswrapper[4823]: I1216 07:17:42.183567 4823 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e3533a65-c660-4e94-a820-1488e9eb1108-var-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 16 07:17:42 crc kubenswrapper[4823]: I1216 07:17:42.183563 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3533a65-c660-4e94-a820-1488e9eb1108-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "e3533a65-c660-4e94-a820-1488e9eb1108" (UID: "e3533a65-c660-4e94-a820-1488e9eb1108"). InnerVolumeSpecName "additional-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:17:42 crc kubenswrapper[4823]: I1216 07:17:42.183820 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3533a65-c660-4e94-a820-1488e9eb1108-scripts" (OuterVolumeSpecName: "scripts") pod "e3533a65-c660-4e94-a820-1488e9eb1108" (UID: "e3533a65-c660-4e94-a820-1488e9eb1108"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:17:42 crc kubenswrapper[4823]: I1216 07:17:42.201197 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3533a65-c660-4e94-a820-1488e9eb1108-kube-api-access-w6kqc" (OuterVolumeSpecName: "kube-api-access-w6kqc") pod "e3533a65-c660-4e94-a820-1488e9eb1108" (UID: "e3533a65-c660-4e94-a820-1488e9eb1108"). InnerVolumeSpecName "kube-api-access-w6kqc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:17:42 crc kubenswrapper[4823]: I1216 07:17:42.285235 4823 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/e3533a65-c660-4e94-a820-1488e9eb1108-additional-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:17:42 crc kubenswrapper[4823]: I1216 07:17:42.285287 4823 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e3533a65-c660-4e94-a820-1488e9eb1108-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:17:42 crc kubenswrapper[4823]: I1216 07:17:42.285301 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w6kqc\" (UniqueName: \"kubernetes.io/projected/e3533a65-c660-4e94-a820-1488e9eb1108-kube-api-access-w6kqc\") on node \"crc\" DevicePath \"\"" Dec 16 07:17:42 crc kubenswrapper[4823]: I1216 07:17:42.752781 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-fvqqp-config-hzdbj" 
event={"ID":"e3533a65-c660-4e94-a820-1488e9eb1108","Type":"ContainerDied","Data":"2b4354169bf981eecb6706930288e2b8c49b75e0977df630c0fac645f56802f3"} Dec 16 07:17:42 crc kubenswrapper[4823]: I1216 07:17:42.753087 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2b4354169bf981eecb6706930288e2b8c49b75e0977df630c0fac645f56802f3" Dec 16 07:17:42 crc kubenswrapper[4823]: I1216 07:17:42.752918 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-fvqqp-config-hzdbj" Dec 16 07:17:42 crc kubenswrapper[4823]: I1216 07:17:42.817435 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-fvqqp-config-hzdbj"] Dec 16 07:17:42 crc kubenswrapper[4823]: I1216 07:17:42.825039 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-fvqqp-config-hzdbj"] Dec 16 07:17:43 crc kubenswrapper[4823]: I1216 07:17:43.781565 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3533a65-c660-4e94-a820-1488e9eb1108" path="/var/lib/kubelet/pods/e3533a65-c660-4e94-a820-1488e9eb1108/volumes" Dec 16 07:17:44 crc kubenswrapper[4823]: I1216 07:17:44.767706 4823 generic.go:334] "Generic (PLEG): container finished" podID="5e02e173-17cf-486b-9c4a-b68aa6879f97" containerID="cf3000c3c19630c7b52fe3ee392070cdf29cc20770328f3c4cee0ca752e7ce59" exitCode=0 Dec 16 07:17:44 crc kubenswrapper[4823]: I1216 07:17:44.767752 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-nhgg2" event={"ID":"5e02e173-17cf-486b-9c4a-b68aa6879f97","Type":"ContainerDied","Data":"cf3000c3c19630c7b52fe3ee392070cdf29cc20770328f3c4cee0ca752e7ce59"} Dec 16 07:17:46 crc kubenswrapper[4823]: I1216 07:17:46.431831 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-nhgg2" Dec 16 07:17:46 crc kubenswrapper[4823]: I1216 07:17:46.456566 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5e02e173-17cf-486b-9c4a-b68aa6879f97-swiftconf\") pod \"5e02e173-17cf-486b-9c4a-b68aa6879f97\" (UID: \"5e02e173-17cf-486b-9c4a-b68aa6879f97\") " Dec 16 07:17:46 crc kubenswrapper[4823]: I1216 07:17:46.456698 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jjhj7\" (UniqueName: \"kubernetes.io/projected/5e02e173-17cf-486b-9c4a-b68aa6879f97-kube-api-access-jjhj7\") pod \"5e02e173-17cf-486b-9c4a-b68aa6879f97\" (UID: \"5e02e173-17cf-486b-9c4a-b68aa6879f97\") " Dec 16 07:17:46 crc kubenswrapper[4823]: I1216 07:17:46.456741 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5e02e173-17cf-486b-9c4a-b68aa6879f97-etc-swift\") pod \"5e02e173-17cf-486b-9c4a-b68aa6879f97\" (UID: \"5e02e173-17cf-486b-9c4a-b68aa6879f97\") " Dec 16 07:17:46 crc kubenswrapper[4823]: I1216 07:17:46.456767 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5e02e173-17cf-486b-9c4a-b68aa6879f97-scripts\") pod \"5e02e173-17cf-486b-9c4a-b68aa6879f97\" (UID: \"5e02e173-17cf-486b-9c4a-b68aa6879f97\") " Dec 16 07:17:46 crc kubenswrapper[4823]: I1216 07:17:46.456823 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e02e173-17cf-486b-9c4a-b68aa6879f97-combined-ca-bundle\") pod \"5e02e173-17cf-486b-9c4a-b68aa6879f97\" (UID: \"5e02e173-17cf-486b-9c4a-b68aa6879f97\") " Dec 16 07:17:46 crc kubenswrapper[4823]: I1216 07:17:46.456844 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" 
(UniqueName: \"kubernetes.io/configmap/5e02e173-17cf-486b-9c4a-b68aa6879f97-ring-data-devices\") pod \"5e02e173-17cf-486b-9c4a-b68aa6879f97\" (UID: \"5e02e173-17cf-486b-9c4a-b68aa6879f97\") " Dec 16 07:17:46 crc kubenswrapper[4823]: I1216 07:17:46.456937 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5e02e173-17cf-486b-9c4a-b68aa6879f97-dispersionconf\") pod \"5e02e173-17cf-486b-9c4a-b68aa6879f97\" (UID: \"5e02e173-17cf-486b-9c4a-b68aa6879f97\") " Dec 16 07:17:46 crc kubenswrapper[4823]: I1216 07:17:46.460954 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e02e173-17cf-486b-9c4a-b68aa6879f97-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "5e02e173-17cf-486b-9c4a-b68aa6879f97" (UID: "5e02e173-17cf-486b-9c4a-b68aa6879f97"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:17:46 crc kubenswrapper[4823]: I1216 07:17:46.461136 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e02e173-17cf-486b-9c4a-b68aa6879f97-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "5e02e173-17cf-486b-9c4a-b68aa6879f97" (UID: "5e02e173-17cf-486b-9c4a-b68aa6879f97"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:17:46 crc kubenswrapper[4823]: I1216 07:17:46.469635 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e02e173-17cf-486b-9c4a-b68aa6879f97-kube-api-access-jjhj7" (OuterVolumeSpecName: "kube-api-access-jjhj7") pod "5e02e173-17cf-486b-9c4a-b68aa6879f97" (UID: "5e02e173-17cf-486b-9c4a-b68aa6879f97"). InnerVolumeSpecName "kube-api-access-jjhj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:17:46 crc kubenswrapper[4823]: I1216 07:17:46.471819 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e02e173-17cf-486b-9c4a-b68aa6879f97-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "5e02e173-17cf-486b-9c4a-b68aa6879f97" (UID: "5e02e173-17cf-486b-9c4a-b68aa6879f97"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:17:46 crc kubenswrapper[4823]: I1216 07:17:46.484178 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e02e173-17cf-486b-9c4a-b68aa6879f97-scripts" (OuterVolumeSpecName: "scripts") pod "5e02e173-17cf-486b-9c4a-b68aa6879f97" (UID: "5e02e173-17cf-486b-9c4a-b68aa6879f97"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:17:46 crc kubenswrapper[4823]: I1216 07:17:46.485051 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e02e173-17cf-486b-9c4a-b68aa6879f97-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "5e02e173-17cf-486b-9c4a-b68aa6879f97" (UID: "5e02e173-17cf-486b-9c4a-b68aa6879f97"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:17:46 crc kubenswrapper[4823]: I1216 07:17:46.485682 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e02e173-17cf-486b-9c4a-b68aa6879f97-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5e02e173-17cf-486b-9c4a-b68aa6879f97" (UID: "5e02e173-17cf-486b-9c4a-b68aa6879f97"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:17:46 crc kubenswrapper[4823]: I1216 07:17:46.558627 4823 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5e02e173-17cf-486b-9c4a-b68aa6879f97-dispersionconf\") on node \"crc\" DevicePath \"\"" Dec 16 07:17:46 crc kubenswrapper[4823]: I1216 07:17:46.558663 4823 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5e02e173-17cf-486b-9c4a-b68aa6879f97-swiftconf\") on node \"crc\" DevicePath \"\"" Dec 16 07:17:46 crc kubenswrapper[4823]: I1216 07:17:46.558676 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jjhj7\" (UniqueName: \"kubernetes.io/projected/5e02e173-17cf-486b-9c4a-b68aa6879f97-kube-api-access-jjhj7\") on node \"crc\" DevicePath \"\"" Dec 16 07:17:46 crc kubenswrapper[4823]: I1216 07:17:46.558690 4823 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5e02e173-17cf-486b-9c4a-b68aa6879f97-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 16 07:17:46 crc kubenswrapper[4823]: I1216 07:17:46.558703 4823 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5e02e173-17cf-486b-9c4a-b68aa6879f97-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:17:46 crc kubenswrapper[4823]: I1216 07:17:46.558714 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e02e173-17cf-486b-9c4a-b68aa6879f97-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:17:46 crc kubenswrapper[4823]: I1216 07:17:46.558725 4823 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5e02e173-17cf-486b-9c4a-b68aa6879f97-ring-data-devices\") on node \"crc\" DevicePath \"\"" Dec 16 07:17:46 crc kubenswrapper[4823]: I1216 07:17:46.785575 4823 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-nhgg2" event={"ID":"5e02e173-17cf-486b-9c4a-b68aa6879f97","Type":"ContainerDied","Data":"42590f3e75a61120a03aa37caa5ebe4345cea087163b3825eb6bea600955d83a"} Dec 16 07:17:46 crc kubenswrapper[4823]: I1216 07:17:46.785621 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="42590f3e75a61120a03aa37caa5ebe4345cea087163b3825eb6bea600955d83a" Dec 16 07:17:46 crc kubenswrapper[4823]: I1216 07:17:46.785685 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-nhgg2" Dec 16 07:17:47 crc kubenswrapper[4823]: I1216 07:17:47.794882 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-2mqx2" event={"ID":"4506b142-a95e-4cf3-a000-56fbee5e024d","Type":"ContainerStarted","Data":"fbf17c728f21d60e2722f73e8d92c8f01170959769dc3b6af1de2092502dbd5f"} Dec 16 07:17:47 crc kubenswrapper[4823]: I1216 07:17:47.813319 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-2mqx2" podStartSLOduration=2.92713481 podStartE2EDuration="33.813294277s" podCreationTimestamp="2025-12-16 07:17:14 +0000 UTC" firstStartedPulling="2025-12-16 07:17:15.790881083 +0000 UTC m=+1314.279447206" lastFinishedPulling="2025-12-16 07:17:46.67704055 +0000 UTC m=+1345.165606673" observedRunningTime="2025-12-16 07:17:47.810072456 +0000 UTC m=+1346.298638599" watchObservedRunningTime="2025-12-16 07:17:47.813294277 +0000 UTC m=+1346.301860400" Dec 16 07:17:49 crc kubenswrapper[4823]: I1216 07:17:49.307403 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/37eade87-02f6-4584-87d3-9b22e16ad915-etc-swift\") pod \"swift-storage-0\" (UID: \"37eade87-02f6-4584-87d3-9b22e16ad915\") " pod="openstack/swift-storage-0" Dec 16 07:17:49 crc kubenswrapper[4823]: I1216 07:17:49.314985 4823 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/37eade87-02f6-4584-87d3-9b22e16ad915-etc-swift\") pod \"swift-storage-0\" (UID: \"37eade87-02f6-4584-87d3-9b22e16ad915\") " pod="openstack/swift-storage-0" Dec 16 07:17:49 crc kubenswrapper[4823]: I1216 07:17:49.367934 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Dec 16 07:17:49 crc kubenswrapper[4823]: I1216 07:17:49.907432 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Dec 16 07:17:50 crc kubenswrapper[4823]: I1216 07:17:50.816545 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"37eade87-02f6-4584-87d3-9b22e16ad915","Type":"ContainerStarted","Data":"e74114a842e19517f0819c0014b296da5863d1ef5807dc31915c92d2558cf539"} Dec 16 07:17:51 crc kubenswrapper[4823]: I1216 07:17:51.173322 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 16 07:17:51 crc kubenswrapper[4823]: I1216 07:17:51.573244 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 16 07:17:51 crc kubenswrapper[4823]: I1216 07:17:51.836485 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"37eade87-02f6-4584-87d3-9b22e16ad915","Type":"ContainerStarted","Data":"a492d0597a24fbc3874db2d66724810617a47a1b04e07bd6166546bf01c14b03"} Dec 16 07:17:52 crc kubenswrapper[4823]: I1216 07:17:52.847197 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"37eade87-02f6-4584-87d3-9b22e16ad915","Type":"ContainerStarted","Data":"9857b55eb51a54f3ae493111d268c42a0d2bc195ef3b7082fc757220e93cba07"} Dec 16 07:17:52 crc kubenswrapper[4823]: I1216 07:17:52.847238 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/swift-storage-0" event={"ID":"37eade87-02f6-4584-87d3-9b22e16ad915","Type":"ContainerStarted","Data":"037ada7a883b0afa2d539ebbbabaf8e1ff97dd775ed349460d0029680d2b1517"} Dec 16 07:17:52 crc kubenswrapper[4823]: I1216 07:17:52.847247 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"37eade87-02f6-4584-87d3-9b22e16ad915","Type":"ContainerStarted","Data":"e40b9ddd3f7fc60ce93f808d19e11679050ad9b41de42d02b22ca40a92083f09"} Dec 16 07:17:53 crc kubenswrapper[4823]: I1216 07:17:53.806847 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-8fktl"] Dec 16 07:17:53 crc kubenswrapper[4823]: E1216 07:17:53.807668 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e02e173-17cf-486b-9c4a-b68aa6879f97" containerName="swift-ring-rebalance" Dec 16 07:17:53 crc kubenswrapper[4823]: I1216 07:17:53.807690 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e02e173-17cf-486b-9c4a-b68aa6879f97" containerName="swift-ring-rebalance" Dec 16 07:17:53 crc kubenswrapper[4823]: E1216 07:17:53.807718 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3533a65-c660-4e94-a820-1488e9eb1108" containerName="ovn-config" Dec 16 07:17:53 crc kubenswrapper[4823]: I1216 07:17:53.807727 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3533a65-c660-4e94-a820-1488e9eb1108" containerName="ovn-config" Dec 16 07:17:53 crc kubenswrapper[4823]: I1216 07:17:53.807948 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e02e173-17cf-486b-9c4a-b68aa6879f97" containerName="swift-ring-rebalance" Dec 16 07:17:53 crc kubenswrapper[4823]: I1216 07:17:53.807978 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3533a65-c660-4e94-a820-1488e9eb1108" containerName="ovn-config" Dec 16 07:17:53 crc kubenswrapper[4823]: I1216 07:17:53.818299 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-8fktl" Dec 16 07:17:53 crc kubenswrapper[4823]: I1216 07:17:53.823866 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-8fktl"] Dec 16 07:17:53 crc kubenswrapper[4823]: I1216 07:17:53.896255 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d7e42473-2988-4fdb-8c8d-55a0d4e3a6bf-operator-scripts\") pod \"cinder-db-create-8fktl\" (UID: \"d7e42473-2988-4fdb-8c8d-55a0d4e3a6bf\") " pod="openstack/cinder-db-create-8fktl" Dec 16 07:17:53 crc kubenswrapper[4823]: I1216 07:17:53.896447 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-974vh\" (UniqueName: \"kubernetes.io/projected/d7e42473-2988-4fdb-8c8d-55a0d4e3a6bf-kube-api-access-974vh\") pod \"cinder-db-create-8fktl\" (UID: \"d7e42473-2988-4fdb-8c8d-55a0d4e3a6bf\") " pod="openstack/cinder-db-create-8fktl" Dec 16 07:17:53 crc kubenswrapper[4823]: I1216 07:17:53.922885 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-5a20-account-create-update-42fv4"] Dec 16 07:17:53 crc kubenswrapper[4823]: I1216 07:17:53.924010 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-5a20-account-create-update-42fv4" Dec 16 07:17:53 crc kubenswrapper[4823]: I1216 07:17:53.929674 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Dec 16 07:17:53 crc kubenswrapper[4823]: I1216 07:17:53.945521 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-dbq9q"] Dec 16 07:17:53 crc kubenswrapper[4823]: I1216 07:17:53.946552 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-dbq9q" Dec 16 07:17:53 crc kubenswrapper[4823]: I1216 07:17:53.967183 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-5a20-account-create-update-42fv4"] Dec 16 07:17:53 crc kubenswrapper[4823]: I1216 07:17:53.999409 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-dbq9q"] Dec 16 07:17:54 crc kubenswrapper[4823]: I1216 07:17:54.018735 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9t7rr\" (UniqueName: \"kubernetes.io/projected/244ca852-a6d0-4537-8f87-b52b1237ff9b-kube-api-access-9t7rr\") pod \"barbican-db-create-dbq9q\" (UID: \"244ca852-a6d0-4537-8f87-b52b1237ff9b\") " pod="openstack/barbican-db-create-dbq9q" Dec 16 07:17:54 crc kubenswrapper[4823]: I1216 07:17:54.019139 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-974vh\" (UniqueName: \"kubernetes.io/projected/d7e42473-2988-4fdb-8c8d-55a0d4e3a6bf-kube-api-access-974vh\") pod \"cinder-db-create-8fktl\" (UID: \"d7e42473-2988-4fdb-8c8d-55a0d4e3a6bf\") " pod="openstack/cinder-db-create-8fktl" Dec 16 07:17:54 crc kubenswrapper[4823]: I1216 07:17:54.019354 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/244ca852-a6d0-4537-8f87-b52b1237ff9b-operator-scripts\") pod \"barbican-db-create-dbq9q\" (UID: \"244ca852-a6d0-4537-8f87-b52b1237ff9b\") " pod="openstack/barbican-db-create-dbq9q" Dec 16 07:17:54 crc kubenswrapper[4823]: I1216 07:17:54.019502 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d7e42473-2988-4fdb-8c8d-55a0d4e3a6bf-operator-scripts\") pod \"cinder-db-create-8fktl\" (UID: \"d7e42473-2988-4fdb-8c8d-55a0d4e3a6bf\") " pod="openstack/cinder-db-create-8fktl" Dec 16 
07:17:54 crc kubenswrapper[4823]: I1216 07:17:54.019628 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4c5cb375-fe40-481c-a59e-a0f2ae2322bc-operator-scripts\") pod \"barbican-5a20-account-create-update-42fv4\" (UID: \"4c5cb375-fe40-481c-a59e-a0f2ae2322bc\") " pod="openstack/barbican-5a20-account-create-update-42fv4" Dec 16 07:17:54 crc kubenswrapper[4823]: I1216 07:17:54.019946 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmdxv\" (UniqueName: \"kubernetes.io/projected/4c5cb375-fe40-481c-a59e-a0f2ae2322bc-kube-api-access-fmdxv\") pod \"barbican-5a20-account-create-update-42fv4\" (UID: \"4c5cb375-fe40-481c-a59e-a0f2ae2322bc\") " pod="openstack/barbican-5a20-account-create-update-42fv4" Dec 16 07:17:54 crc kubenswrapper[4823]: I1216 07:17:54.021298 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d7e42473-2988-4fdb-8c8d-55a0d4e3a6bf-operator-scripts\") pod \"cinder-db-create-8fktl\" (UID: \"d7e42473-2988-4fdb-8c8d-55a0d4e3a6bf\") " pod="openstack/cinder-db-create-8fktl" Dec 16 07:17:54 crc kubenswrapper[4823]: I1216 07:17:54.064826 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-974vh\" (UniqueName: \"kubernetes.io/projected/d7e42473-2988-4fdb-8c8d-55a0d4e3a6bf-kube-api-access-974vh\") pod \"cinder-db-create-8fktl\" (UID: \"d7e42473-2988-4fdb-8c8d-55a0d4e3a6bf\") " pod="openstack/cinder-db-create-8fktl" Dec 16 07:17:54 crc kubenswrapper[4823]: I1216 07:17:54.064915 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-1f3e-account-create-update-qtwg7"] Dec 16 07:17:54 crc kubenswrapper[4823]: I1216 07:17:54.066890 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-1f3e-account-create-update-qtwg7" Dec 16 07:17:54 crc kubenswrapper[4823]: I1216 07:17:54.081922 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Dec 16 07:17:54 crc kubenswrapper[4823]: I1216 07:17:54.081932 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-1f3e-account-create-update-qtwg7"] Dec 16 07:17:54 crc kubenswrapper[4823]: I1216 07:17:54.114302 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-2jnpg"] Dec 16 07:17:54 crc kubenswrapper[4823]: I1216 07:17:54.116018 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-2jnpg" Dec 16 07:17:54 crc kubenswrapper[4823]: I1216 07:17:54.122587 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-wtcrq" Dec 16 07:17:54 crc kubenswrapper[4823]: I1216 07:17:54.122737 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 16 07:17:54 crc kubenswrapper[4823]: I1216 07:17:54.122789 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 16 07:17:54 crc kubenswrapper[4823]: I1216 07:17:54.122911 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 16 07:17:54 crc kubenswrapper[4823]: I1216 07:17:54.123700 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/490618c1-e6b6-4546-86ea-27cf18723a7a-operator-scripts\") pod \"cinder-1f3e-account-create-update-qtwg7\" (UID: \"490618c1-e6b6-4546-86ea-27cf18723a7a\") " pod="openstack/cinder-1f3e-account-create-update-qtwg7" Dec 16 07:17:54 crc kubenswrapper[4823]: I1216 07:17:54.123752 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-fmdxv\" (UniqueName: \"kubernetes.io/projected/4c5cb375-fe40-481c-a59e-a0f2ae2322bc-kube-api-access-fmdxv\") pod \"barbican-5a20-account-create-update-42fv4\" (UID: \"4c5cb375-fe40-481c-a59e-a0f2ae2322bc\") " pod="openstack/barbican-5a20-account-create-update-42fv4" Dec 16 07:17:54 crc kubenswrapper[4823]: I1216 07:17:54.123968 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9t7rr\" (UniqueName: \"kubernetes.io/projected/244ca852-a6d0-4537-8f87-b52b1237ff9b-kube-api-access-9t7rr\") pod \"barbican-db-create-dbq9q\" (UID: \"244ca852-a6d0-4537-8f87-b52b1237ff9b\") " pod="openstack/barbican-db-create-dbq9q" Dec 16 07:17:54 crc kubenswrapper[4823]: I1216 07:17:54.124008 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzq2v\" (UniqueName: \"kubernetes.io/projected/490618c1-e6b6-4546-86ea-27cf18723a7a-kube-api-access-mzq2v\") pod \"cinder-1f3e-account-create-update-qtwg7\" (UID: \"490618c1-e6b6-4546-86ea-27cf18723a7a\") " pod="openstack/cinder-1f3e-account-create-update-qtwg7" Dec 16 07:17:54 crc kubenswrapper[4823]: I1216 07:17:54.124077 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/244ca852-a6d0-4537-8f87-b52b1237ff9b-operator-scripts\") pod \"barbican-db-create-dbq9q\" (UID: \"244ca852-a6d0-4537-8f87-b52b1237ff9b\") " pod="openstack/barbican-db-create-dbq9q" Dec 16 07:17:54 crc kubenswrapper[4823]: I1216 07:17:54.124108 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4c5cb375-fe40-481c-a59e-a0f2ae2322bc-operator-scripts\") pod \"barbican-5a20-account-create-update-42fv4\" (UID: \"4c5cb375-fe40-481c-a59e-a0f2ae2322bc\") " pod="openstack/barbican-5a20-account-create-update-42fv4" Dec 16 07:17:54 crc kubenswrapper[4823]: I1216 07:17:54.124846 4823 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/244ca852-a6d0-4537-8f87-b52b1237ff9b-operator-scripts\") pod \"barbican-db-create-dbq9q\" (UID: \"244ca852-a6d0-4537-8f87-b52b1237ff9b\") " pod="openstack/barbican-db-create-dbq9q" Dec 16 07:17:54 crc kubenswrapper[4823]: I1216 07:17:54.124927 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4c5cb375-fe40-481c-a59e-a0f2ae2322bc-operator-scripts\") pod \"barbican-5a20-account-create-update-42fv4\" (UID: \"4c5cb375-fe40-481c-a59e-a0f2ae2322bc\") " pod="openstack/barbican-5a20-account-create-update-42fv4" Dec 16 07:17:54 crc kubenswrapper[4823]: I1216 07:17:54.131977 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-2jnpg"] Dec 16 07:17:54 crc kubenswrapper[4823]: I1216 07:17:54.140254 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-8fktl" Dec 16 07:17:54 crc kubenswrapper[4823]: I1216 07:17:54.148863 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmdxv\" (UniqueName: \"kubernetes.io/projected/4c5cb375-fe40-481c-a59e-a0f2ae2322bc-kube-api-access-fmdxv\") pod \"barbican-5a20-account-create-update-42fv4\" (UID: \"4c5cb375-fe40-481c-a59e-a0f2ae2322bc\") " pod="openstack/barbican-5a20-account-create-update-42fv4" Dec 16 07:17:54 crc kubenswrapper[4823]: I1216 07:17:54.158016 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9t7rr\" (UniqueName: \"kubernetes.io/projected/244ca852-a6d0-4537-8f87-b52b1237ff9b-kube-api-access-9t7rr\") pod \"barbican-db-create-dbq9q\" (UID: \"244ca852-a6d0-4537-8f87-b52b1237ff9b\") " pod="openstack/barbican-db-create-dbq9q" Dec 16 07:17:54 crc kubenswrapper[4823]: I1216 07:17:54.226654 4823 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-mzq2v\" (UniqueName: \"kubernetes.io/projected/490618c1-e6b6-4546-86ea-27cf18723a7a-kube-api-access-mzq2v\") pod \"cinder-1f3e-account-create-update-qtwg7\" (UID: \"490618c1-e6b6-4546-86ea-27cf18723a7a\") " pod="openstack/cinder-1f3e-account-create-update-qtwg7" Dec 16 07:17:54 crc kubenswrapper[4823]: I1216 07:17:54.227107 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6tkh\" (UniqueName: \"kubernetes.io/projected/dee4f17f-49e7-4f83-b138-a913f67757b3-kube-api-access-f6tkh\") pod \"keystone-db-sync-2jnpg\" (UID: \"dee4f17f-49e7-4f83-b138-a913f67757b3\") " pod="openstack/keystone-db-sync-2jnpg" Dec 16 07:17:54 crc kubenswrapper[4823]: I1216 07:17:54.227162 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/490618c1-e6b6-4546-86ea-27cf18723a7a-operator-scripts\") pod \"cinder-1f3e-account-create-update-qtwg7\" (UID: \"490618c1-e6b6-4546-86ea-27cf18723a7a\") " pod="openstack/cinder-1f3e-account-create-update-qtwg7" Dec 16 07:17:54 crc kubenswrapper[4823]: I1216 07:17:54.227204 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dee4f17f-49e7-4f83-b138-a913f67757b3-config-data\") pod \"keystone-db-sync-2jnpg\" (UID: \"dee4f17f-49e7-4f83-b138-a913f67757b3\") " pod="openstack/keystone-db-sync-2jnpg" Dec 16 07:17:54 crc kubenswrapper[4823]: I1216 07:17:54.227262 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dee4f17f-49e7-4f83-b138-a913f67757b3-combined-ca-bundle\") pod \"keystone-db-sync-2jnpg\" (UID: \"dee4f17f-49e7-4f83-b138-a913f67757b3\") " pod="openstack/keystone-db-sync-2jnpg" Dec 16 07:17:54 crc kubenswrapper[4823]: I1216 07:17:54.228130 4823 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/490618c1-e6b6-4546-86ea-27cf18723a7a-operator-scripts\") pod \"cinder-1f3e-account-create-update-qtwg7\" (UID: \"490618c1-e6b6-4546-86ea-27cf18723a7a\") " pod="openstack/cinder-1f3e-account-create-update-qtwg7" Dec 16 07:17:54 crc kubenswrapper[4823]: I1216 07:17:54.241397 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-5a20-account-create-update-42fv4" Dec 16 07:17:54 crc kubenswrapper[4823]: I1216 07:17:54.250630 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzq2v\" (UniqueName: \"kubernetes.io/projected/490618c1-e6b6-4546-86ea-27cf18723a7a-kube-api-access-mzq2v\") pod \"cinder-1f3e-account-create-update-qtwg7\" (UID: \"490618c1-e6b6-4546-86ea-27cf18723a7a\") " pod="openstack/cinder-1f3e-account-create-update-qtwg7" Dec 16 07:17:54 crc kubenswrapper[4823]: I1216 07:17:54.322956 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-dbq9q" Dec 16 07:17:54 crc kubenswrapper[4823]: I1216 07:17:54.328732 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dee4f17f-49e7-4f83-b138-a913f67757b3-combined-ca-bundle\") pod \"keystone-db-sync-2jnpg\" (UID: \"dee4f17f-49e7-4f83-b138-a913f67757b3\") " pod="openstack/keystone-db-sync-2jnpg" Dec 16 07:17:54 crc kubenswrapper[4823]: I1216 07:17:54.328800 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6tkh\" (UniqueName: \"kubernetes.io/projected/dee4f17f-49e7-4f83-b138-a913f67757b3-kube-api-access-f6tkh\") pod \"keystone-db-sync-2jnpg\" (UID: \"dee4f17f-49e7-4f83-b138-a913f67757b3\") " pod="openstack/keystone-db-sync-2jnpg" Dec 16 07:17:54 crc kubenswrapper[4823]: I1216 07:17:54.328874 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dee4f17f-49e7-4f83-b138-a913f67757b3-config-data\") pod \"keystone-db-sync-2jnpg\" (UID: \"dee4f17f-49e7-4f83-b138-a913f67757b3\") " pod="openstack/keystone-db-sync-2jnpg" Dec 16 07:17:54 crc kubenswrapper[4823]: I1216 07:17:54.332138 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-dqg6x"] Dec 16 07:17:54 crc kubenswrapper[4823]: I1216 07:17:54.333214 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-dqg6x" Dec 16 07:17:54 crc kubenswrapper[4823]: I1216 07:17:54.333737 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dee4f17f-49e7-4f83-b138-a913f67757b3-config-data\") pod \"keystone-db-sync-2jnpg\" (UID: \"dee4f17f-49e7-4f83-b138-a913f67757b3\") " pod="openstack/keystone-db-sync-2jnpg" Dec 16 07:17:54 crc kubenswrapper[4823]: I1216 07:17:54.334502 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dee4f17f-49e7-4f83-b138-a913f67757b3-combined-ca-bundle\") pod \"keystone-db-sync-2jnpg\" (UID: \"dee4f17f-49e7-4f83-b138-a913f67757b3\") " pod="openstack/keystone-db-sync-2jnpg" Dec 16 07:17:54 crc kubenswrapper[4823]: I1216 07:17:54.355777 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6tkh\" (UniqueName: \"kubernetes.io/projected/dee4f17f-49e7-4f83-b138-a913f67757b3-kube-api-access-f6tkh\") pod \"keystone-db-sync-2jnpg\" (UID: \"dee4f17f-49e7-4f83-b138-a913f67757b3\") " pod="openstack/keystone-db-sync-2jnpg" Dec 16 07:17:54 crc kubenswrapper[4823]: I1216 07:17:54.363216 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-ba48-account-create-update-tk44m"] Dec 16 07:17:54 crc kubenswrapper[4823]: I1216 07:17:54.366475 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-ba48-account-create-update-tk44m" Dec 16 07:17:54 crc kubenswrapper[4823]: I1216 07:17:54.373101 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Dec 16 07:17:54 crc kubenswrapper[4823]: I1216 07:17:54.378386 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-dqg6x"] Dec 16 07:17:54 crc kubenswrapper[4823]: I1216 07:17:54.388242 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-ba48-account-create-update-tk44m"] Dec 16 07:17:54 crc kubenswrapper[4823]: I1216 07:17:54.418871 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-1f3e-account-create-update-qtwg7" Dec 16 07:17:54 crc kubenswrapper[4823]: I1216 07:17:54.430078 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a725db70-608c-4a15-8d30-88bf5dbb764f-operator-scripts\") pod \"neutron-ba48-account-create-update-tk44m\" (UID: \"a725db70-608c-4a15-8d30-88bf5dbb764f\") " pod="openstack/neutron-ba48-account-create-update-tk44m" Dec 16 07:17:54 crc kubenswrapper[4823]: I1216 07:17:54.430191 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smhrd\" (UniqueName: \"kubernetes.io/projected/dc43f91c-775d-4641-9192-53ddf96bd2b2-kube-api-access-smhrd\") pod \"neutron-db-create-dqg6x\" (UID: \"dc43f91c-775d-4641-9192-53ddf96bd2b2\") " pod="openstack/neutron-db-create-dqg6x" Dec 16 07:17:54 crc kubenswrapper[4823]: I1216 07:17:54.430274 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dc43f91c-775d-4641-9192-53ddf96bd2b2-operator-scripts\") pod \"neutron-db-create-dqg6x\" (UID: \"dc43f91c-775d-4641-9192-53ddf96bd2b2\") " 
pod="openstack/neutron-db-create-dqg6x" Dec 16 07:17:54 crc kubenswrapper[4823]: I1216 07:17:54.430303 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gx8cr\" (UniqueName: \"kubernetes.io/projected/a725db70-608c-4a15-8d30-88bf5dbb764f-kube-api-access-gx8cr\") pod \"neutron-ba48-account-create-update-tk44m\" (UID: \"a725db70-608c-4a15-8d30-88bf5dbb764f\") " pod="openstack/neutron-ba48-account-create-update-tk44m" Dec 16 07:17:54 crc kubenswrapper[4823]: I1216 07:17:54.434421 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-2jnpg" Dec 16 07:17:54 crc kubenswrapper[4823]: I1216 07:17:54.532254 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dc43f91c-775d-4641-9192-53ddf96bd2b2-operator-scripts\") pod \"neutron-db-create-dqg6x\" (UID: \"dc43f91c-775d-4641-9192-53ddf96bd2b2\") " pod="openstack/neutron-db-create-dqg6x" Dec 16 07:17:54 crc kubenswrapper[4823]: I1216 07:17:54.532307 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gx8cr\" (UniqueName: \"kubernetes.io/projected/a725db70-608c-4a15-8d30-88bf5dbb764f-kube-api-access-gx8cr\") pod \"neutron-ba48-account-create-update-tk44m\" (UID: \"a725db70-608c-4a15-8d30-88bf5dbb764f\") " pod="openstack/neutron-ba48-account-create-update-tk44m" Dec 16 07:17:54 crc kubenswrapper[4823]: I1216 07:17:54.532370 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a725db70-608c-4a15-8d30-88bf5dbb764f-operator-scripts\") pod \"neutron-ba48-account-create-update-tk44m\" (UID: \"a725db70-608c-4a15-8d30-88bf5dbb764f\") " pod="openstack/neutron-ba48-account-create-update-tk44m" Dec 16 07:17:54 crc kubenswrapper[4823]: I1216 07:17:54.532450 4823 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-smhrd\" (UniqueName: \"kubernetes.io/projected/dc43f91c-775d-4641-9192-53ddf96bd2b2-kube-api-access-smhrd\") pod \"neutron-db-create-dqg6x\" (UID: \"dc43f91c-775d-4641-9192-53ddf96bd2b2\") " pod="openstack/neutron-db-create-dqg6x" Dec 16 07:17:54 crc kubenswrapper[4823]: I1216 07:17:54.534436 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dc43f91c-775d-4641-9192-53ddf96bd2b2-operator-scripts\") pod \"neutron-db-create-dqg6x\" (UID: \"dc43f91c-775d-4641-9192-53ddf96bd2b2\") " pod="openstack/neutron-db-create-dqg6x" Dec 16 07:17:54 crc kubenswrapper[4823]: I1216 07:17:54.534813 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a725db70-608c-4a15-8d30-88bf5dbb764f-operator-scripts\") pod \"neutron-ba48-account-create-update-tk44m\" (UID: \"a725db70-608c-4a15-8d30-88bf5dbb764f\") " pod="openstack/neutron-ba48-account-create-update-tk44m" Dec 16 07:17:54 crc kubenswrapper[4823]: I1216 07:17:54.554181 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-smhrd\" (UniqueName: \"kubernetes.io/projected/dc43f91c-775d-4641-9192-53ddf96bd2b2-kube-api-access-smhrd\") pod \"neutron-db-create-dqg6x\" (UID: \"dc43f91c-775d-4641-9192-53ddf96bd2b2\") " pod="openstack/neutron-db-create-dqg6x" Dec 16 07:17:54 crc kubenswrapper[4823]: I1216 07:17:54.555881 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gx8cr\" (UniqueName: \"kubernetes.io/projected/a725db70-608c-4a15-8d30-88bf5dbb764f-kube-api-access-gx8cr\") pod \"neutron-ba48-account-create-update-tk44m\" (UID: \"a725db70-608c-4a15-8d30-88bf5dbb764f\") " pod="openstack/neutron-ba48-account-create-update-tk44m" Dec 16 07:17:54 crc kubenswrapper[4823]: I1216 07:17:54.659403 4823 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/neutron-db-create-dqg6x" Dec 16 07:17:54 crc kubenswrapper[4823]: I1216 07:17:54.721623 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-ba48-account-create-update-tk44m" Dec 16 07:17:54 crc kubenswrapper[4823]: I1216 07:17:54.764437 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-8fktl"] Dec 16 07:17:54 crc kubenswrapper[4823]: W1216 07:17:54.789165 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd7e42473_2988_4fdb_8c8d_55a0d4e3a6bf.slice/crio-610cc8093a68d59f35e1fdd2ccf3659ca8efa24f809b8b28b42e9ed25f235db2 WatchSource:0}: Error finding container 610cc8093a68d59f35e1fdd2ccf3659ca8efa24f809b8b28b42e9ed25f235db2: Status 404 returned error can't find the container with id 610cc8093a68d59f35e1fdd2ccf3659ca8efa24f809b8b28b42e9ed25f235db2 Dec 16 07:17:54 crc kubenswrapper[4823]: I1216 07:17:54.887561 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-5a20-account-create-update-42fv4"] Dec 16 07:17:54 crc kubenswrapper[4823]: I1216 07:17:54.923511 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"37eade87-02f6-4584-87d3-9b22e16ad915","Type":"ContainerStarted","Data":"6989d85752f4e1b6c7b23a46754686007edf09212f93e356aa9e002490d63f86"} Dec 16 07:17:54 crc kubenswrapper[4823]: I1216 07:17:54.935555 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-8fktl" event={"ID":"d7e42473-2988-4fdb-8c8d-55a0d4e3a6bf","Type":"ContainerStarted","Data":"610cc8093a68d59f35e1fdd2ccf3659ca8efa24f809b8b28b42e9ed25f235db2"} Dec 16 07:17:54 crc kubenswrapper[4823]: I1216 07:17:54.998411 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-dbq9q"] Dec 16 07:17:55 crc kubenswrapper[4823]: I1216 07:17:55.183001 4823 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/cinder-1f3e-account-create-update-qtwg7"] Dec 16 07:17:55 crc kubenswrapper[4823]: I1216 07:17:55.216374 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-2jnpg"] Dec 16 07:17:55 crc kubenswrapper[4823]: I1216 07:17:55.346658 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-dqg6x"] Dec 16 07:17:55 crc kubenswrapper[4823]: I1216 07:17:55.442641 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-ba48-account-create-update-tk44m"] Dec 16 07:17:55 crc kubenswrapper[4823]: I1216 07:17:55.944267 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-5a20-account-create-update-42fv4" event={"ID":"4c5cb375-fe40-481c-a59e-a0f2ae2322bc","Type":"ContainerStarted","Data":"32c0e352f42f76e5ffcb921e8e918670d2886c2b5957c2c19528d342ab203fc1"} Dec 16 07:17:57 crc kubenswrapper[4823]: I1216 07:17:57.962404 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"37eade87-02f6-4584-87d3-9b22e16ad915","Type":"ContainerStarted","Data":"3e8bd97535cc7d73ba58df356afd74ec5adc282b78f6bd60a29d41243373dfe8"} Dec 16 07:17:58 crc kubenswrapper[4823]: W1216 07:17:58.312714 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod244ca852_a6d0_4537_8f87_b52b1237ff9b.slice/crio-fb7997faa7be476e050b85524972e8c95cf4767fa773e5bb59aa44b50df98b56 WatchSource:0}: Error finding container fb7997faa7be476e050b85524972e8c95cf4767fa773e5bb59aa44b50df98b56: Status 404 returned error can't find the container with id fb7997faa7be476e050b85524972e8c95cf4767fa773e5bb59aa44b50df98b56 Dec 16 07:17:58 crc kubenswrapper[4823]: W1216 07:17:58.318198 4823 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddee4f17f_49e7_4f83_b138_a913f67757b3.slice/crio-ae3505fc5d19a24ad5bee2b4fb305097018ea68221b125d537480db75e723878 WatchSource:0}: Error finding container ae3505fc5d19a24ad5bee2b4fb305097018ea68221b125d537480db75e723878: Status 404 returned error can't find the container with id ae3505fc5d19a24ad5bee2b4fb305097018ea68221b125d537480db75e723878 Dec 16 07:17:58 crc kubenswrapper[4823]: W1216 07:17:58.322643 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddc43f91c_775d_4641_9192_53ddf96bd2b2.slice/crio-2b383d6602fd2bcfecd4ac3f78427201d68ccdd8c44cbd324e56f81fbc86fe21 WatchSource:0}: Error finding container 2b383d6602fd2bcfecd4ac3f78427201d68ccdd8c44cbd324e56f81fbc86fe21: Status 404 returned error can't find the container with id 2b383d6602fd2bcfecd4ac3f78427201d68ccdd8c44cbd324e56f81fbc86fe21 Dec 16 07:17:58 crc kubenswrapper[4823]: I1216 07:17:58.977691 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-8fktl" event={"ID":"d7e42473-2988-4fdb-8c8d-55a0d4e3a6bf","Type":"ContainerStarted","Data":"cbea58da480f94db2a2ee35249963ed9ccff56507255870a694a1b9d2f6a6af6"} Dec 16 07:17:58 crc kubenswrapper[4823]: I1216 07:17:58.982190 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-2jnpg" event={"ID":"dee4f17f-49e7-4f83-b138-a913f67757b3","Type":"ContainerStarted","Data":"ae3505fc5d19a24ad5bee2b4fb305097018ea68221b125d537480db75e723878"} Dec 16 07:17:58 crc kubenswrapper[4823]: I1216 07:17:58.987185 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"37eade87-02f6-4584-87d3-9b22e16ad915","Type":"ContainerStarted","Data":"0867818f24c8ec64e592ab31aa5d2950ef78f3e7e0fe1694feaadae8d16fd195"} Dec 16 07:17:58 crc kubenswrapper[4823]: I1216 07:17:58.987227 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/swift-storage-0" event={"ID":"37eade87-02f6-4584-87d3-9b22e16ad915","Type":"ContainerStarted","Data":"bad977d222921a4fb519d95600bc9d018f6a41b0993e19b99e544f9729b364ec"} Dec 16 07:17:58 crc kubenswrapper[4823]: I1216 07:17:58.989980 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-ba48-account-create-update-tk44m" event={"ID":"a725db70-608c-4a15-8d30-88bf5dbb764f","Type":"ContainerStarted","Data":"6fa7315b3a421e4a53c77f20200c6b69db9dcfebd2b05f6223c1276fbb6ac91e"} Dec 16 07:17:58 crc kubenswrapper[4823]: I1216 07:17:58.990009 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-ba48-account-create-update-tk44m" event={"ID":"a725db70-608c-4a15-8d30-88bf5dbb764f","Type":"ContainerStarted","Data":"6b9874fff90a54053f14226b3c5e90167a0750d1c931ff284ef8b5506854b57a"} Dec 16 07:17:58 crc kubenswrapper[4823]: I1216 07:17:58.999124 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-5a20-account-create-update-42fv4" event={"ID":"4c5cb375-fe40-481c-a59e-a0f2ae2322bc","Type":"ContainerStarted","Data":"3d8bf1eec57a6eef7e33e2d7523f1be1a8f7b422798ea3442862f3825e3f251e"} Dec 16 07:17:59 crc kubenswrapper[4823]: I1216 07:17:59.001514 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-1f3e-account-create-update-qtwg7" event={"ID":"490618c1-e6b6-4546-86ea-27cf18723a7a","Type":"ContainerStarted","Data":"52aaf1d3bed1904d4e9402c6601a7b4cf9067783dfeabc37b488a2a4d81ee20f"} Dec 16 07:17:59 crc kubenswrapper[4823]: I1216 07:17:59.001546 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-1f3e-account-create-update-qtwg7" event={"ID":"490618c1-e6b6-4546-86ea-27cf18723a7a","Type":"ContainerStarted","Data":"f018ceb5cf2aa9d5bc29c6f1453c9fb8a3cf76b14c0a55059adfe2422ea173d1"} Dec 16 07:17:59 crc kubenswrapper[4823]: I1216 07:17:59.008518 4823 generic.go:334] "Generic (PLEG): container finished" podID="dc43f91c-775d-4641-9192-53ddf96bd2b2" 
containerID="fb4f73ed34378d9f3cbdb5b0ce00ba2183bba7bcb920f2a0c15d5c8a957ce220" exitCode=0 Dec 16 07:17:59 crc kubenswrapper[4823]: I1216 07:17:59.008640 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-dqg6x" event={"ID":"dc43f91c-775d-4641-9192-53ddf96bd2b2","Type":"ContainerDied","Data":"fb4f73ed34378d9f3cbdb5b0ce00ba2183bba7bcb920f2a0c15d5c8a957ce220"} Dec 16 07:17:59 crc kubenswrapper[4823]: I1216 07:17:59.008688 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-dqg6x" event={"ID":"dc43f91c-775d-4641-9192-53ddf96bd2b2","Type":"ContainerStarted","Data":"2b383d6602fd2bcfecd4ac3f78427201d68ccdd8c44cbd324e56f81fbc86fe21"} Dec 16 07:17:59 crc kubenswrapper[4823]: I1216 07:17:59.017084 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-dbq9q" event={"ID":"244ca852-a6d0-4537-8f87-b52b1237ff9b","Type":"ContainerStarted","Data":"d1c1a2e134858b3458134e2fa1a0775f9e25a7109ded0caff60bedc00c2090bb"} Dec 16 07:17:59 crc kubenswrapper[4823]: I1216 07:17:59.017146 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-dbq9q" event={"ID":"244ca852-a6d0-4537-8f87-b52b1237ff9b","Type":"ContainerStarted","Data":"fb7997faa7be476e050b85524972e8c95cf4767fa773e5bb59aa44b50df98b56"} Dec 16 07:17:59 crc kubenswrapper[4823]: I1216 07:17:59.021501 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-ba48-account-create-update-tk44m" podStartSLOduration=5.021479235 podStartE2EDuration="5.021479235s" podCreationTimestamp="2025-12-16 07:17:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 07:17:59.01814386 +0000 UTC m=+1357.506709993" watchObservedRunningTime="2025-12-16 07:17:59.021479235 +0000 UTC m=+1357.510045358" Dec 16 07:17:59 crc kubenswrapper[4823]: I1216 07:17:59.028756 4823 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-create-8fktl" podStartSLOduration=6.028739002 podStartE2EDuration="6.028739002s" podCreationTimestamp="2025-12-16 07:17:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 07:17:58.993149838 +0000 UTC m=+1357.481715961" watchObservedRunningTime="2025-12-16 07:17:59.028739002 +0000 UTC m=+1357.517305125" Dec 16 07:17:59 crc kubenswrapper[4823]: I1216 07:17:59.075655 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-1f3e-account-create-update-qtwg7" podStartSLOduration=5.0756318 podStartE2EDuration="5.0756318s" podCreationTimestamp="2025-12-16 07:17:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 07:17:59.072326517 +0000 UTC m=+1357.560892630" watchObservedRunningTime="2025-12-16 07:17:59.0756318 +0000 UTC m=+1357.564197923" Dec 16 07:17:59 crc kubenswrapper[4823]: I1216 07:17:59.093515 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-5a20-account-create-update-42fv4" podStartSLOduration=6.093498459 podStartE2EDuration="6.093498459s" podCreationTimestamp="2025-12-16 07:17:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 07:17:59.089865446 +0000 UTC m=+1357.578431589" watchObservedRunningTime="2025-12-16 07:17:59.093498459 +0000 UTC m=+1357.582064582" Dec 16 07:18:00 crc kubenswrapper[4823]: I1216 07:18:00.033385 4823 generic.go:334] "Generic (PLEG): container finished" podID="a725db70-608c-4a15-8d30-88bf5dbb764f" containerID="6fa7315b3a421e4a53c77f20200c6b69db9dcfebd2b05f6223c1276fbb6ac91e" exitCode=0 Dec 16 07:18:00 crc kubenswrapper[4823]: I1216 07:18:00.033498 4823 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/neutron-ba48-account-create-update-tk44m" event={"ID":"a725db70-608c-4a15-8d30-88bf5dbb764f","Type":"ContainerDied","Data":"6fa7315b3a421e4a53c77f20200c6b69db9dcfebd2b05f6223c1276fbb6ac91e"} Dec 16 07:18:00 crc kubenswrapper[4823]: I1216 07:18:00.037834 4823 generic.go:334] "Generic (PLEG): container finished" podID="4c5cb375-fe40-481c-a59e-a0f2ae2322bc" containerID="3d8bf1eec57a6eef7e33e2d7523f1be1a8f7b422798ea3442862f3825e3f251e" exitCode=0 Dec 16 07:18:00 crc kubenswrapper[4823]: I1216 07:18:00.038082 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-5a20-account-create-update-42fv4" event={"ID":"4c5cb375-fe40-481c-a59e-a0f2ae2322bc","Type":"ContainerDied","Data":"3d8bf1eec57a6eef7e33e2d7523f1be1a8f7b422798ea3442862f3825e3f251e"} Dec 16 07:18:00 crc kubenswrapper[4823]: I1216 07:18:00.040238 4823 generic.go:334] "Generic (PLEG): container finished" podID="490618c1-e6b6-4546-86ea-27cf18723a7a" containerID="52aaf1d3bed1904d4e9402c6601a7b4cf9067783dfeabc37b488a2a4d81ee20f" exitCode=0 Dec 16 07:18:00 crc kubenswrapper[4823]: I1216 07:18:00.040324 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-1f3e-account-create-update-qtwg7" event={"ID":"490618c1-e6b6-4546-86ea-27cf18723a7a","Type":"ContainerDied","Data":"52aaf1d3bed1904d4e9402c6601a7b4cf9067783dfeabc37b488a2a4d81ee20f"} Dec 16 07:18:00 crc kubenswrapper[4823]: I1216 07:18:00.042990 4823 generic.go:334] "Generic (PLEG): container finished" podID="244ca852-a6d0-4537-8f87-b52b1237ff9b" containerID="d1c1a2e134858b3458134e2fa1a0775f9e25a7109ded0caff60bedc00c2090bb" exitCode=0 Dec 16 07:18:00 crc kubenswrapper[4823]: I1216 07:18:00.043044 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-dbq9q" event={"ID":"244ca852-a6d0-4537-8f87-b52b1237ff9b","Type":"ContainerDied","Data":"d1c1a2e134858b3458134e2fa1a0775f9e25a7109ded0caff60bedc00c2090bb"} Dec 16 07:18:00 crc 
kubenswrapper[4823]: I1216 07:18:00.046953 4823 generic.go:334] "Generic (PLEG): container finished" podID="d7e42473-2988-4fdb-8c8d-55a0d4e3a6bf" containerID="cbea58da480f94db2a2ee35249963ed9ccff56507255870a694a1b9d2f6a6af6" exitCode=0 Dec 16 07:18:00 crc kubenswrapper[4823]: I1216 07:18:00.047041 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-8fktl" event={"ID":"d7e42473-2988-4fdb-8c8d-55a0d4e3a6bf","Type":"ContainerDied","Data":"cbea58da480f94db2a2ee35249963ed9ccff56507255870a694a1b9d2f6a6af6"} Dec 16 07:18:00 crc kubenswrapper[4823]: E1216 07:18:00.750368 4823 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4506b142_a95e_4cf3_a000_56fbee5e024d.slice/crio-fbf17c728f21d60e2722f73e8d92c8f01170959769dc3b6af1de2092502dbd5f.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4506b142_a95e_4cf3_a000_56fbee5e024d.slice/crio-conmon-fbf17c728f21d60e2722f73e8d92c8f01170959769dc3b6af1de2092502dbd5f.scope\": RecentStats: unable to find data in memory cache]" Dec 16 07:18:01 crc kubenswrapper[4823]: I1216 07:18:01.057680 4823 generic.go:334] "Generic (PLEG): container finished" podID="4506b142-a95e-4cf3-a000-56fbee5e024d" containerID="fbf17c728f21d60e2722f73e8d92c8f01170959769dc3b6af1de2092502dbd5f" exitCode=0 Dec 16 07:18:01 crc kubenswrapper[4823]: I1216 07:18:01.057788 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-2mqx2" event={"ID":"4506b142-a95e-4cf3-a000-56fbee5e024d","Type":"ContainerDied","Data":"fbf17c728f21d60e2722f73e8d92c8f01170959769dc3b6af1de2092502dbd5f"} Dec 16 07:18:03 crc kubenswrapper[4823]: I1216 07:18:03.805551 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-1f3e-account-create-update-qtwg7" Dec 16 07:18:03 crc kubenswrapper[4823]: I1216 07:18:03.827117 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-5a20-account-create-update-42fv4" Dec 16 07:18:03 crc kubenswrapper[4823]: I1216 07:18:03.829835 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-ba48-account-create-update-tk44m" Dec 16 07:18:03 crc kubenswrapper[4823]: I1216 07:18:03.836230 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-dbq9q" Dec 16 07:18:03 crc kubenswrapper[4823]: I1216 07:18:03.869130 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-dqg6x" Dec 16 07:18:03 crc kubenswrapper[4823]: I1216 07:18:03.888390 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-8fktl" Dec 16 07:18:03 crc kubenswrapper[4823]: I1216 07:18:03.901499 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-2mqx2" Dec 16 07:18:03 crc kubenswrapper[4823]: I1216 07:18:03.918108 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/490618c1-e6b6-4546-86ea-27cf18723a7a-operator-scripts\") pod \"490618c1-e6b6-4546-86ea-27cf18723a7a\" (UID: \"490618c1-e6b6-4546-86ea-27cf18723a7a\") " Dec 16 07:18:03 crc kubenswrapper[4823]: I1216 07:18:03.918271 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mzq2v\" (UniqueName: \"kubernetes.io/projected/490618c1-e6b6-4546-86ea-27cf18723a7a-kube-api-access-mzq2v\") pod \"490618c1-e6b6-4546-86ea-27cf18723a7a\" (UID: \"490618c1-e6b6-4546-86ea-27cf18723a7a\") " Dec 16 07:18:03 crc kubenswrapper[4823]: I1216 07:18:03.919501 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/490618c1-e6b6-4546-86ea-27cf18723a7a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "490618c1-e6b6-4546-86ea-27cf18723a7a" (UID: "490618c1-e6b6-4546-86ea-27cf18723a7a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:18:03 crc kubenswrapper[4823]: I1216 07:18:03.922909 4823 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/490618c1-e6b6-4546-86ea-27cf18723a7a-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:18:03 crc kubenswrapper[4823]: I1216 07:18:03.935521 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/490618c1-e6b6-4546-86ea-27cf18723a7a-kube-api-access-mzq2v" (OuterVolumeSpecName: "kube-api-access-mzq2v") pod "490618c1-e6b6-4546-86ea-27cf18723a7a" (UID: "490618c1-e6b6-4546-86ea-27cf18723a7a"). InnerVolumeSpecName "kube-api-access-mzq2v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:18:04 crc kubenswrapper[4823]: I1216 07:18:04.024491 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/244ca852-a6d0-4537-8f87-b52b1237ff9b-operator-scripts\") pod \"244ca852-a6d0-4537-8f87-b52b1237ff9b\" (UID: \"244ca852-a6d0-4537-8f87-b52b1237ff9b\") " Dec 16 07:18:04 crc kubenswrapper[4823]: I1216 07:18:04.024564 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4506b142-a95e-4cf3-a000-56fbee5e024d-config-data\") pod \"4506b142-a95e-4cf3-a000-56fbee5e024d\" (UID: \"4506b142-a95e-4cf3-a000-56fbee5e024d\") " Dec 16 07:18:04 crc kubenswrapper[4823]: I1216 07:18:04.024602 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d7e42473-2988-4fdb-8c8d-55a0d4e3a6bf-operator-scripts\") pod \"d7e42473-2988-4fdb-8c8d-55a0d4e3a6bf\" (UID: \"d7e42473-2988-4fdb-8c8d-55a0d4e3a6bf\") " Dec 16 07:18:04 crc kubenswrapper[4823]: I1216 07:18:04.024669 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9t7rr\" (UniqueName: \"kubernetes.io/projected/244ca852-a6d0-4537-8f87-b52b1237ff9b-kube-api-access-9t7rr\") pod \"244ca852-a6d0-4537-8f87-b52b1237ff9b\" (UID: \"244ca852-a6d0-4537-8f87-b52b1237ff9b\") " Dec 16 07:18:04 crc kubenswrapper[4823]: I1216 07:18:04.025173 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-smhrd\" (UniqueName: \"kubernetes.io/projected/dc43f91c-775d-4641-9192-53ddf96bd2b2-kube-api-access-smhrd\") pod \"dc43f91c-775d-4641-9192-53ddf96bd2b2\" (UID: \"dc43f91c-775d-4641-9192-53ddf96bd2b2\") " Dec 16 07:18:04 crc kubenswrapper[4823]: I1216 07:18:04.025213 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4506b142-a95e-4cf3-a000-56fbee5e024d-combined-ca-bundle\") pod \"4506b142-a95e-4cf3-a000-56fbee5e024d\" (UID: \"4506b142-a95e-4cf3-a000-56fbee5e024d\") " Dec 16 07:18:04 crc kubenswrapper[4823]: I1216 07:18:04.025248 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gx8cr\" (UniqueName: \"kubernetes.io/projected/a725db70-608c-4a15-8d30-88bf5dbb764f-kube-api-access-gx8cr\") pod \"a725db70-608c-4a15-8d30-88bf5dbb764f\" (UID: \"a725db70-608c-4a15-8d30-88bf5dbb764f\") " Dec 16 07:18:04 crc kubenswrapper[4823]: I1216 07:18:04.025358 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a725db70-608c-4a15-8d30-88bf5dbb764f-operator-scripts\") pod \"a725db70-608c-4a15-8d30-88bf5dbb764f\" (UID: \"a725db70-608c-4a15-8d30-88bf5dbb764f\") " Dec 16 07:18:04 crc kubenswrapper[4823]: I1216 07:18:04.025418 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-974vh\" (UniqueName: \"kubernetes.io/projected/d7e42473-2988-4fdb-8c8d-55a0d4e3a6bf-kube-api-access-974vh\") pod \"d7e42473-2988-4fdb-8c8d-55a0d4e3a6bf\" (UID: \"d7e42473-2988-4fdb-8c8d-55a0d4e3a6bf\") " Dec 16 07:18:04 crc kubenswrapper[4823]: I1216 07:18:04.025451 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4506b142-a95e-4cf3-a000-56fbee5e024d-db-sync-config-data\") pod \"4506b142-a95e-4cf3-a000-56fbee5e024d\" (UID: \"4506b142-a95e-4cf3-a000-56fbee5e024d\") " Dec 16 07:18:04 crc kubenswrapper[4823]: I1216 07:18:04.025502 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4c5cb375-fe40-481c-a59e-a0f2ae2322bc-operator-scripts\") pod \"4c5cb375-fe40-481c-a59e-a0f2ae2322bc\" (UID: 
\"4c5cb375-fe40-481c-a59e-a0f2ae2322bc\") " Dec 16 07:18:04 crc kubenswrapper[4823]: I1216 07:18:04.025557 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l25df\" (UniqueName: \"kubernetes.io/projected/4506b142-a95e-4cf3-a000-56fbee5e024d-kube-api-access-l25df\") pod \"4506b142-a95e-4cf3-a000-56fbee5e024d\" (UID: \"4506b142-a95e-4cf3-a000-56fbee5e024d\") " Dec 16 07:18:04 crc kubenswrapper[4823]: I1216 07:18:04.026243 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dc43f91c-775d-4641-9192-53ddf96bd2b2-operator-scripts\") pod \"dc43f91c-775d-4641-9192-53ddf96bd2b2\" (UID: \"dc43f91c-775d-4641-9192-53ddf96bd2b2\") " Dec 16 07:18:04 crc kubenswrapper[4823]: I1216 07:18:04.026314 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fmdxv\" (UniqueName: \"kubernetes.io/projected/4c5cb375-fe40-481c-a59e-a0f2ae2322bc-kube-api-access-fmdxv\") pod \"4c5cb375-fe40-481c-a59e-a0f2ae2322bc\" (UID: \"4c5cb375-fe40-481c-a59e-a0f2ae2322bc\") " Dec 16 07:18:04 crc kubenswrapper[4823]: I1216 07:18:04.025015 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/244ca852-a6d0-4537-8f87-b52b1237ff9b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "244ca852-a6d0-4537-8f87-b52b1237ff9b" (UID: "244ca852-a6d0-4537-8f87-b52b1237ff9b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:18:04 crc kubenswrapper[4823]: I1216 07:18:04.025101 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7e42473-2988-4fdb-8c8d-55a0d4e3a6bf-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d7e42473-2988-4fdb-8c8d-55a0d4e3a6bf" (UID: "d7e42473-2988-4fdb-8c8d-55a0d4e3a6bf"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:18:04 crc kubenswrapper[4823]: I1216 07:18:04.026622 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c5cb375-fe40-481c-a59e-a0f2ae2322bc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4c5cb375-fe40-481c-a59e-a0f2ae2322bc" (UID: "4c5cb375-fe40-481c-a59e-a0f2ae2322bc"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:18:04 crc kubenswrapper[4823]: I1216 07:18:04.026661 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a725db70-608c-4a15-8d30-88bf5dbb764f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a725db70-608c-4a15-8d30-88bf5dbb764f" (UID: "a725db70-608c-4a15-8d30-88bf5dbb764f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:18:04 crc kubenswrapper[4823]: I1216 07:18:04.027062 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mzq2v\" (UniqueName: \"kubernetes.io/projected/490618c1-e6b6-4546-86ea-27cf18723a7a-kube-api-access-mzq2v\") on node \"crc\" DevicePath \"\"" Dec 16 07:18:04 crc kubenswrapper[4823]: I1216 07:18:04.027082 4823 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a725db70-608c-4a15-8d30-88bf5dbb764f-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:18:04 crc kubenswrapper[4823]: I1216 07:18:04.027094 4823 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4c5cb375-fe40-481c-a59e-a0f2ae2322bc-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:18:04 crc kubenswrapper[4823]: I1216 07:18:04.027107 4823 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/244ca852-a6d0-4537-8f87-b52b1237ff9b-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:18:04 crc kubenswrapper[4823]: I1216 07:18:04.027119 4823 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d7e42473-2988-4fdb-8c8d-55a0d4e3a6bf-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:18:04 crc kubenswrapper[4823]: I1216 07:18:04.028880 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4506b142-a95e-4cf3-a000-56fbee5e024d-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "4506b142-a95e-4cf3-a000-56fbee5e024d" (UID: "4506b142-a95e-4cf3-a000-56fbee5e024d"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:18:04 crc kubenswrapper[4823]: I1216 07:18:04.029415 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc43f91c-775d-4641-9192-53ddf96bd2b2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "dc43f91c-775d-4641-9192-53ddf96bd2b2" (UID: "dc43f91c-775d-4641-9192-53ddf96bd2b2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:18:04 crc kubenswrapper[4823]: I1216 07:18:04.029869 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc43f91c-775d-4641-9192-53ddf96bd2b2-kube-api-access-smhrd" (OuterVolumeSpecName: "kube-api-access-smhrd") pod "dc43f91c-775d-4641-9192-53ddf96bd2b2" (UID: "dc43f91c-775d-4641-9192-53ddf96bd2b2"). InnerVolumeSpecName "kube-api-access-smhrd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:18:04 crc kubenswrapper[4823]: I1216 07:18:04.030418 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7e42473-2988-4fdb-8c8d-55a0d4e3a6bf-kube-api-access-974vh" (OuterVolumeSpecName: "kube-api-access-974vh") pod "d7e42473-2988-4fdb-8c8d-55a0d4e3a6bf" (UID: "d7e42473-2988-4fdb-8c8d-55a0d4e3a6bf"). InnerVolumeSpecName "kube-api-access-974vh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:18:04 crc kubenswrapper[4823]: I1216 07:18:04.030814 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c5cb375-fe40-481c-a59e-a0f2ae2322bc-kube-api-access-fmdxv" (OuterVolumeSpecName: "kube-api-access-fmdxv") pod "4c5cb375-fe40-481c-a59e-a0f2ae2322bc" (UID: "4c5cb375-fe40-481c-a59e-a0f2ae2322bc"). InnerVolumeSpecName "kube-api-access-fmdxv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:18:04 crc kubenswrapper[4823]: I1216 07:18:04.031831 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4506b142-a95e-4cf3-a000-56fbee5e024d-kube-api-access-l25df" (OuterVolumeSpecName: "kube-api-access-l25df") pod "4506b142-a95e-4cf3-a000-56fbee5e024d" (UID: "4506b142-a95e-4cf3-a000-56fbee5e024d"). InnerVolumeSpecName "kube-api-access-l25df". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:18:04 crc kubenswrapper[4823]: I1216 07:18:04.032314 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a725db70-608c-4a15-8d30-88bf5dbb764f-kube-api-access-gx8cr" (OuterVolumeSpecName: "kube-api-access-gx8cr") pod "a725db70-608c-4a15-8d30-88bf5dbb764f" (UID: "a725db70-608c-4a15-8d30-88bf5dbb764f"). InnerVolumeSpecName "kube-api-access-gx8cr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:18:04 crc kubenswrapper[4823]: I1216 07:18:04.035301 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/244ca852-a6d0-4537-8f87-b52b1237ff9b-kube-api-access-9t7rr" (OuterVolumeSpecName: "kube-api-access-9t7rr") pod "244ca852-a6d0-4537-8f87-b52b1237ff9b" (UID: "244ca852-a6d0-4537-8f87-b52b1237ff9b"). InnerVolumeSpecName "kube-api-access-9t7rr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:18:04 crc kubenswrapper[4823]: I1216 07:18:04.064807 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4506b142-a95e-4cf3-a000-56fbee5e024d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4506b142-a95e-4cf3-a000-56fbee5e024d" (UID: "4506b142-a95e-4cf3-a000-56fbee5e024d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:18:04 crc kubenswrapper[4823]: I1216 07:18:04.078250 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4506b142-a95e-4cf3-a000-56fbee5e024d-config-data" (OuterVolumeSpecName: "config-data") pod "4506b142-a95e-4cf3-a000-56fbee5e024d" (UID: "4506b142-a95e-4cf3-a000-56fbee5e024d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:18:04 crc kubenswrapper[4823]: I1216 07:18:04.084919 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-8fktl" event={"ID":"d7e42473-2988-4fdb-8c8d-55a0d4e3a6bf","Type":"ContainerDied","Data":"610cc8093a68d59f35e1fdd2ccf3659ca8efa24f809b8b28b42e9ed25f235db2"} Dec 16 07:18:04 crc kubenswrapper[4823]: I1216 07:18:04.084967 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="610cc8093a68d59f35e1fdd2ccf3659ca8efa24f809b8b28b42e9ed25f235db2" Dec 16 07:18:04 crc kubenswrapper[4823]: I1216 07:18:04.085047 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-8fktl" Dec 16 07:18:04 crc kubenswrapper[4823]: I1216 07:18:04.096732 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-ba48-account-create-update-tk44m" event={"ID":"a725db70-608c-4a15-8d30-88bf5dbb764f","Type":"ContainerDied","Data":"6b9874fff90a54053f14226b3c5e90167a0750d1c931ff284ef8b5506854b57a"} Dec 16 07:18:04 crc kubenswrapper[4823]: I1216 07:18:04.096795 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6b9874fff90a54053f14226b3c5e90167a0750d1c931ff284ef8b5506854b57a" Dec 16 07:18:04 crc kubenswrapper[4823]: I1216 07:18:04.096753 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-ba48-account-create-update-tk44m" Dec 16 07:18:04 crc kubenswrapper[4823]: I1216 07:18:04.104231 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-dbq9q" event={"ID":"244ca852-a6d0-4537-8f87-b52b1237ff9b","Type":"ContainerDied","Data":"fb7997faa7be476e050b85524972e8c95cf4767fa773e5bb59aa44b50df98b56"} Dec 16 07:18:04 crc kubenswrapper[4823]: I1216 07:18:04.104272 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fb7997faa7be476e050b85524972e8c95cf4767fa773e5bb59aa44b50df98b56" Dec 16 07:18:04 crc kubenswrapper[4823]: I1216 07:18:04.104330 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-dbq9q" Dec 16 07:18:04 crc kubenswrapper[4823]: I1216 07:18:04.115368 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"37eade87-02f6-4584-87d3-9b22e16ad915","Type":"ContainerStarted","Data":"01e5b8f2f03cdaee2d9aa0f7009e062e757b69095af1ac126d2b409afda22307"} Dec 16 07:18:04 crc kubenswrapper[4823]: I1216 07:18:04.116843 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-5a20-account-create-update-42fv4" event={"ID":"4c5cb375-fe40-481c-a59e-a0f2ae2322bc","Type":"ContainerDied","Data":"32c0e352f42f76e5ffcb921e8e918670d2886c2b5957c2c19528d342ab203fc1"} Dec 16 07:18:04 crc kubenswrapper[4823]: I1216 07:18:04.117102 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="32c0e352f42f76e5ffcb921e8e918670d2886c2b5957c2c19528d342ab203fc1" Dec 16 07:18:04 crc kubenswrapper[4823]: I1216 07:18:04.117263 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-5a20-account-create-update-42fv4" Dec 16 07:18:04 crc kubenswrapper[4823]: I1216 07:18:04.119457 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-1f3e-account-create-update-qtwg7" event={"ID":"490618c1-e6b6-4546-86ea-27cf18723a7a","Type":"ContainerDied","Data":"f018ceb5cf2aa9d5bc29c6f1453c9fb8a3cf76b14c0a55059adfe2422ea173d1"} Dec 16 07:18:04 crc kubenswrapper[4823]: I1216 07:18:04.119481 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f018ceb5cf2aa9d5bc29c6f1453c9fb8a3cf76b14c0a55059adfe2422ea173d1" Dec 16 07:18:04 crc kubenswrapper[4823]: I1216 07:18:04.119519 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-1f3e-account-create-update-qtwg7" Dec 16 07:18:04 crc kubenswrapper[4823]: I1216 07:18:04.127927 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fmdxv\" (UniqueName: \"kubernetes.io/projected/4c5cb375-fe40-481c-a59e-a0f2ae2322bc-kube-api-access-fmdxv\") on node \"crc\" DevicePath \"\"" Dec 16 07:18:04 crc kubenswrapper[4823]: I1216 07:18:04.127995 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4506b142-a95e-4cf3-a000-56fbee5e024d-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 07:18:04 crc kubenswrapper[4823]: I1216 07:18:04.128014 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9t7rr\" (UniqueName: \"kubernetes.io/projected/244ca852-a6d0-4537-8f87-b52b1237ff9b-kube-api-access-9t7rr\") on node \"crc\" DevicePath \"\"" Dec 16 07:18:04 crc kubenswrapper[4823]: I1216 07:18:04.128054 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-smhrd\" (UniqueName: \"kubernetes.io/projected/dc43f91c-775d-4641-9192-53ddf96bd2b2-kube-api-access-smhrd\") on node \"crc\" DevicePath \"\"" Dec 16 07:18:04 crc kubenswrapper[4823]: I1216 07:18:04.128067 4823 
reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4506b142-a95e-4cf3-a000-56fbee5e024d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:18:04 crc kubenswrapper[4823]: I1216 07:18:04.128078 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gx8cr\" (UniqueName: \"kubernetes.io/projected/a725db70-608c-4a15-8d30-88bf5dbb764f-kube-api-access-gx8cr\") on node \"crc\" DevicePath \"\"" Dec 16 07:18:04 crc kubenswrapper[4823]: I1216 07:18:04.128091 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-974vh\" (UniqueName: \"kubernetes.io/projected/d7e42473-2988-4fdb-8c8d-55a0d4e3a6bf-kube-api-access-974vh\") on node \"crc\" DevicePath \"\"" Dec 16 07:18:04 crc kubenswrapper[4823]: I1216 07:18:04.128143 4823 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4506b142-a95e-4cf3-a000-56fbee5e024d-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 07:18:04 crc kubenswrapper[4823]: I1216 07:18:04.128159 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l25df\" (UniqueName: \"kubernetes.io/projected/4506b142-a95e-4cf3-a000-56fbee5e024d-kube-api-access-l25df\") on node \"crc\" DevicePath \"\"" Dec 16 07:18:04 crc kubenswrapper[4823]: I1216 07:18:04.128171 4823 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dc43f91c-775d-4641-9192-53ddf96bd2b2-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:18:04 crc kubenswrapper[4823]: I1216 07:18:04.129405 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-dqg6x" event={"ID":"dc43f91c-775d-4641-9192-53ddf96bd2b2","Type":"ContainerDied","Data":"2b383d6602fd2bcfecd4ac3f78427201d68ccdd8c44cbd324e56f81fbc86fe21"} Dec 16 07:18:04 crc kubenswrapper[4823]: I1216 07:18:04.129435 4823 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2b383d6602fd2bcfecd4ac3f78427201d68ccdd8c44cbd324e56f81fbc86fe21" Dec 16 07:18:04 crc kubenswrapper[4823]: I1216 07:18:04.129500 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-dqg6x" Dec 16 07:18:04 crc kubenswrapper[4823]: I1216 07:18:04.131239 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-2mqx2" event={"ID":"4506b142-a95e-4cf3-a000-56fbee5e024d","Type":"ContainerDied","Data":"f646c71e61816734c61e0d887295a31557a0b683fc094b983a72589fb6390a47"} Dec 16 07:18:04 crc kubenswrapper[4823]: I1216 07:18:04.131265 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f646c71e61816734c61e0d887295a31557a0b683fc094b983a72589fb6390a47" Dec 16 07:18:04 crc kubenswrapper[4823]: I1216 07:18:04.131427 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-2mqx2" Dec 16 07:18:04 crc kubenswrapper[4823]: I1216 07:18:04.133314 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-2jnpg" event={"ID":"dee4f17f-49e7-4f83-b138-a913f67757b3","Type":"ContainerStarted","Data":"cd1dc48ce8bf98695f893a5adecf3b9f44bbc3521a4cef478e87f157c2618181"} Dec 16 07:18:04 crc kubenswrapper[4823]: I1216 07:18:04.843054 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-2jnpg" podStartSLOduration=5.497753789 podStartE2EDuration="10.843034997s" podCreationTimestamp="2025-12-16 07:17:54 +0000 UTC" firstStartedPulling="2025-12-16 07:17:58.320528193 +0000 UTC m=+1356.809094316" lastFinishedPulling="2025-12-16 07:18:03.665809411 +0000 UTC m=+1362.154375524" observedRunningTime="2025-12-16 07:18:04.156160631 +0000 UTC m=+1362.644726754" watchObservedRunningTime="2025-12-16 07:18:04.843034997 +0000 UTC m=+1363.331601120" Dec 16 07:18:05 crc 
kubenswrapper[4823]: I1216 07:18:05.148934 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"37eade87-02f6-4584-87d3-9b22e16ad915","Type":"ContainerStarted","Data":"40ee29e6ae29936dd852b2034a257b376daf068184e991736706829246c42569"} Dec 16 07:18:05 crc kubenswrapper[4823]: I1216 07:18:05.149318 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"37eade87-02f6-4584-87d3-9b22e16ad915","Type":"ContainerStarted","Data":"b1a5f1f8235f35f66a00999ce9d7e06be67e6583b5dc430df80fd71d14a63993"} Dec 16 07:18:05 crc kubenswrapper[4823]: I1216 07:18:05.149336 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"37eade87-02f6-4584-87d3-9b22e16ad915","Type":"ContainerStarted","Data":"c5d6df967dd64ce250c15ed15f061a8be5c2ace3ce71f17ecbb4eeb82eee16bb"} Dec 16 07:18:05 crc kubenswrapper[4823]: I1216 07:18:05.285201 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6bfd654465-4jj7z"] Dec 16 07:18:05 crc kubenswrapper[4823]: E1216 07:18:05.285678 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="490618c1-e6b6-4546-86ea-27cf18723a7a" containerName="mariadb-account-create-update" Dec 16 07:18:05 crc kubenswrapper[4823]: I1216 07:18:05.285704 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="490618c1-e6b6-4546-86ea-27cf18723a7a" containerName="mariadb-account-create-update" Dec 16 07:18:05 crc kubenswrapper[4823]: E1216 07:18:05.285732 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7e42473-2988-4fdb-8c8d-55a0d4e3a6bf" containerName="mariadb-database-create" Dec 16 07:18:05 crc kubenswrapper[4823]: I1216 07:18:05.285741 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7e42473-2988-4fdb-8c8d-55a0d4e3a6bf" containerName="mariadb-database-create" Dec 16 07:18:05 crc kubenswrapper[4823]: E1216 07:18:05.285756 4823 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="a725db70-608c-4a15-8d30-88bf5dbb764f" containerName="mariadb-account-create-update" Dec 16 07:18:05 crc kubenswrapper[4823]: I1216 07:18:05.285763 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="a725db70-608c-4a15-8d30-88bf5dbb764f" containerName="mariadb-account-create-update" Dec 16 07:18:05 crc kubenswrapper[4823]: E1216 07:18:05.285777 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="244ca852-a6d0-4537-8f87-b52b1237ff9b" containerName="mariadb-database-create" Dec 16 07:18:05 crc kubenswrapper[4823]: I1216 07:18:05.285784 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="244ca852-a6d0-4537-8f87-b52b1237ff9b" containerName="mariadb-database-create" Dec 16 07:18:05 crc kubenswrapper[4823]: E1216 07:18:05.285800 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4506b142-a95e-4cf3-a000-56fbee5e024d" containerName="glance-db-sync" Dec 16 07:18:05 crc kubenswrapper[4823]: I1216 07:18:05.285809 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="4506b142-a95e-4cf3-a000-56fbee5e024d" containerName="glance-db-sync" Dec 16 07:18:05 crc kubenswrapper[4823]: E1216 07:18:05.285827 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc43f91c-775d-4641-9192-53ddf96bd2b2" containerName="mariadb-database-create" Dec 16 07:18:05 crc kubenswrapper[4823]: I1216 07:18:05.285836 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc43f91c-775d-4641-9192-53ddf96bd2b2" containerName="mariadb-database-create" Dec 16 07:18:05 crc kubenswrapper[4823]: E1216 07:18:05.285857 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c5cb375-fe40-481c-a59e-a0f2ae2322bc" containerName="mariadb-account-create-update" Dec 16 07:18:05 crc kubenswrapper[4823]: I1216 07:18:05.285866 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c5cb375-fe40-481c-a59e-a0f2ae2322bc" containerName="mariadb-account-create-update" Dec 16 07:18:05 crc 
kubenswrapper[4823]: I1216 07:18:05.287258 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="490618c1-e6b6-4546-86ea-27cf18723a7a" containerName="mariadb-account-create-update" Dec 16 07:18:05 crc kubenswrapper[4823]: I1216 07:18:05.287286 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7e42473-2988-4fdb-8c8d-55a0d4e3a6bf" containerName="mariadb-database-create" Dec 16 07:18:05 crc kubenswrapper[4823]: I1216 07:18:05.287302 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="a725db70-608c-4a15-8d30-88bf5dbb764f" containerName="mariadb-account-create-update" Dec 16 07:18:05 crc kubenswrapper[4823]: I1216 07:18:05.287319 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c5cb375-fe40-481c-a59e-a0f2ae2322bc" containerName="mariadb-account-create-update" Dec 16 07:18:05 crc kubenswrapper[4823]: I1216 07:18:05.287336 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc43f91c-775d-4641-9192-53ddf96bd2b2" containerName="mariadb-database-create" Dec 16 07:18:05 crc kubenswrapper[4823]: I1216 07:18:05.287345 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="4506b142-a95e-4cf3-a000-56fbee5e024d" containerName="glance-db-sync" Dec 16 07:18:05 crc kubenswrapper[4823]: I1216 07:18:05.287353 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="244ca852-a6d0-4537-8f87-b52b1237ff9b" containerName="mariadb-database-create" Dec 16 07:18:05 crc kubenswrapper[4823]: I1216 07:18:05.288417 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bfd654465-4jj7z" Dec 16 07:18:05 crc kubenswrapper[4823]: I1216 07:18:05.298978 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bfd654465-4jj7z"] Dec 16 07:18:05 crc kubenswrapper[4823]: I1216 07:18:05.360558 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2968d903-49aa-408e-bca1-984f529ea0ec-dns-svc\") pod \"dnsmasq-dns-6bfd654465-4jj7z\" (UID: \"2968d903-49aa-408e-bca1-984f529ea0ec\") " pod="openstack/dnsmasq-dns-6bfd654465-4jj7z" Dec 16 07:18:05 crc kubenswrapper[4823]: I1216 07:18:05.360917 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2968d903-49aa-408e-bca1-984f529ea0ec-ovsdbserver-sb\") pod \"dnsmasq-dns-6bfd654465-4jj7z\" (UID: \"2968d903-49aa-408e-bca1-984f529ea0ec\") " pod="openstack/dnsmasq-dns-6bfd654465-4jj7z" Dec 16 07:18:05 crc kubenswrapper[4823]: I1216 07:18:05.360951 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5tggb\" (UniqueName: \"kubernetes.io/projected/2968d903-49aa-408e-bca1-984f529ea0ec-kube-api-access-5tggb\") pod \"dnsmasq-dns-6bfd654465-4jj7z\" (UID: \"2968d903-49aa-408e-bca1-984f529ea0ec\") " pod="openstack/dnsmasq-dns-6bfd654465-4jj7z" Dec 16 07:18:05 crc kubenswrapper[4823]: I1216 07:18:05.361092 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2968d903-49aa-408e-bca1-984f529ea0ec-ovsdbserver-nb\") pod \"dnsmasq-dns-6bfd654465-4jj7z\" (UID: \"2968d903-49aa-408e-bca1-984f529ea0ec\") " pod="openstack/dnsmasq-dns-6bfd654465-4jj7z" Dec 16 07:18:05 crc kubenswrapper[4823]: I1216 07:18:05.362774 4823 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2968d903-49aa-408e-bca1-984f529ea0ec-config\") pod \"dnsmasq-dns-6bfd654465-4jj7z\" (UID: \"2968d903-49aa-408e-bca1-984f529ea0ec\") " pod="openstack/dnsmasq-dns-6bfd654465-4jj7z" Dec 16 07:18:05 crc kubenswrapper[4823]: I1216 07:18:05.466234 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2968d903-49aa-408e-bca1-984f529ea0ec-config\") pod \"dnsmasq-dns-6bfd654465-4jj7z\" (UID: \"2968d903-49aa-408e-bca1-984f529ea0ec\") " pod="openstack/dnsmasq-dns-6bfd654465-4jj7z" Dec 16 07:18:05 crc kubenswrapper[4823]: I1216 07:18:05.466975 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2968d903-49aa-408e-bca1-984f529ea0ec-dns-svc\") pod \"dnsmasq-dns-6bfd654465-4jj7z\" (UID: \"2968d903-49aa-408e-bca1-984f529ea0ec\") " pod="openstack/dnsmasq-dns-6bfd654465-4jj7z" Dec 16 07:18:05 crc kubenswrapper[4823]: I1216 07:18:05.467089 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2968d903-49aa-408e-bca1-984f529ea0ec-ovsdbserver-sb\") pod \"dnsmasq-dns-6bfd654465-4jj7z\" (UID: \"2968d903-49aa-408e-bca1-984f529ea0ec\") " pod="openstack/dnsmasq-dns-6bfd654465-4jj7z" Dec 16 07:18:05 crc kubenswrapper[4823]: I1216 07:18:05.467121 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5tggb\" (UniqueName: \"kubernetes.io/projected/2968d903-49aa-408e-bca1-984f529ea0ec-kube-api-access-5tggb\") pod \"dnsmasq-dns-6bfd654465-4jj7z\" (UID: \"2968d903-49aa-408e-bca1-984f529ea0ec\") " pod="openstack/dnsmasq-dns-6bfd654465-4jj7z" Dec 16 07:18:05 crc kubenswrapper[4823]: I1216 07:18:05.467413 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2968d903-49aa-408e-bca1-984f529ea0ec-ovsdbserver-nb\") pod \"dnsmasq-dns-6bfd654465-4jj7z\" (UID: \"2968d903-49aa-408e-bca1-984f529ea0ec\") " pod="openstack/dnsmasq-dns-6bfd654465-4jj7z" Dec 16 07:18:05 crc kubenswrapper[4823]: I1216 07:18:05.467915 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2968d903-49aa-408e-bca1-984f529ea0ec-dns-svc\") pod \"dnsmasq-dns-6bfd654465-4jj7z\" (UID: \"2968d903-49aa-408e-bca1-984f529ea0ec\") " pod="openstack/dnsmasq-dns-6bfd654465-4jj7z" Dec 16 07:18:05 crc kubenswrapper[4823]: I1216 07:18:05.468007 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2968d903-49aa-408e-bca1-984f529ea0ec-ovsdbserver-sb\") pod \"dnsmasq-dns-6bfd654465-4jj7z\" (UID: \"2968d903-49aa-408e-bca1-984f529ea0ec\") " pod="openstack/dnsmasq-dns-6bfd654465-4jj7z" Dec 16 07:18:05 crc kubenswrapper[4823]: I1216 07:18:05.468287 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2968d903-49aa-408e-bca1-984f529ea0ec-ovsdbserver-nb\") pod \"dnsmasq-dns-6bfd654465-4jj7z\" (UID: \"2968d903-49aa-408e-bca1-984f529ea0ec\") " pod="openstack/dnsmasq-dns-6bfd654465-4jj7z" Dec 16 07:18:05 crc kubenswrapper[4823]: I1216 07:18:05.468511 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2968d903-49aa-408e-bca1-984f529ea0ec-config\") pod \"dnsmasq-dns-6bfd654465-4jj7z\" (UID: \"2968d903-49aa-408e-bca1-984f529ea0ec\") " pod="openstack/dnsmasq-dns-6bfd654465-4jj7z" Dec 16 07:18:05 crc kubenswrapper[4823]: I1216 07:18:05.504460 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5tggb\" (UniqueName: 
\"kubernetes.io/projected/2968d903-49aa-408e-bca1-984f529ea0ec-kube-api-access-5tggb\") pod \"dnsmasq-dns-6bfd654465-4jj7z\" (UID: \"2968d903-49aa-408e-bca1-984f529ea0ec\") " pod="openstack/dnsmasq-dns-6bfd654465-4jj7z" Dec 16 07:18:05 crc kubenswrapper[4823]: I1216 07:18:05.616771 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bfd654465-4jj7z" Dec 16 07:18:06 crc kubenswrapper[4823]: I1216 07:18:06.135932 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bfd654465-4jj7z"] Dec 16 07:18:06 crc kubenswrapper[4823]: W1216 07:18:06.137397 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2968d903_49aa_408e_bca1_984f529ea0ec.slice/crio-ec7d93987bc4b0756d4f9c4faa36068670b22a94b8ef94628bb265f1217bceea WatchSource:0}: Error finding container ec7d93987bc4b0756d4f9c4faa36068670b22a94b8ef94628bb265f1217bceea: Status 404 returned error can't find the container with id ec7d93987bc4b0756d4f9c4faa36068670b22a94b8ef94628bb265f1217bceea Dec 16 07:18:06 crc kubenswrapper[4823]: I1216 07:18:06.163409 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"37eade87-02f6-4584-87d3-9b22e16ad915","Type":"ContainerStarted","Data":"f87675dcfff9fc973b357762f0993278cb4dedf83d6ea269b8db0911d6c505df"} Dec 16 07:18:06 crc kubenswrapper[4823]: I1216 07:18:06.163453 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"37eade87-02f6-4584-87d3-9b22e16ad915","Type":"ContainerStarted","Data":"145ae0bd995a296d5194b205c5a110eae0cc0b53171f8ed6f7aab0a0e2c48aca"} Dec 16 07:18:06 crc kubenswrapper[4823]: I1216 07:18:06.163467 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"37eade87-02f6-4584-87d3-9b22e16ad915","Type":"ContainerStarted","Data":"fba9f42156608e6cc226456c4628eb8a6093a4e736f19553c3b609538523e305"} Dec 16 07:18:06 crc kubenswrapper[4823]: I1216 07:18:06.168237 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bfd654465-4jj7z" event={"ID":"2968d903-49aa-408e-bca1-984f529ea0ec","Type":"ContainerStarted","Data":"ec7d93987bc4b0756d4f9c4faa36068670b22a94b8ef94628bb265f1217bceea"} Dec 16 07:18:06 crc kubenswrapper[4823]: I1216 07:18:06.210072 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=36.501517118 podStartE2EDuration="50.210050672s" podCreationTimestamp="2025-12-16 07:17:16 +0000 UTC" firstStartedPulling="2025-12-16 07:17:49.92860186 +0000 UTC m=+1348.417167983" lastFinishedPulling="2025-12-16 07:18:03.637135394 +0000 UTC m=+1362.125701537" observedRunningTime="2025-12-16 07:18:06.198336746 +0000 UTC m=+1364.686902869" watchObservedRunningTime="2025-12-16 07:18:06.210050672 +0000 UTC m=+1364.698616795" Dec 16 07:18:06 crc kubenswrapper[4823]: I1216 07:18:06.483906 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bfd654465-4jj7z"] Dec 16 07:18:06 crc kubenswrapper[4823]: I1216 07:18:06.519300 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-74dfc89d77-rtrnt"] Dec 16 07:18:06 crc kubenswrapper[4823]: I1216 07:18:06.521552 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74dfc89d77-rtrnt" Dec 16 07:18:06 crc kubenswrapper[4823]: I1216 07:18:06.526397 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Dec 16 07:18:06 crc kubenswrapper[4823]: I1216 07:18:06.530185 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74dfc89d77-rtrnt"] Dec 16 07:18:06 crc kubenswrapper[4823]: I1216 07:18:06.695216 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c18e8ceb-4282-417a-a207-19e78c89c1af-dns-svc\") pod \"dnsmasq-dns-74dfc89d77-rtrnt\" (UID: \"c18e8ceb-4282-417a-a207-19e78c89c1af\") " pod="openstack/dnsmasq-dns-74dfc89d77-rtrnt" Dec 16 07:18:06 crc kubenswrapper[4823]: I1216 07:18:06.695432 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c18e8ceb-4282-417a-a207-19e78c89c1af-ovsdbserver-sb\") pod \"dnsmasq-dns-74dfc89d77-rtrnt\" (UID: \"c18e8ceb-4282-417a-a207-19e78c89c1af\") " pod="openstack/dnsmasq-dns-74dfc89d77-rtrnt" Dec 16 07:18:06 crc kubenswrapper[4823]: I1216 07:18:06.695819 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c18e8ceb-4282-417a-a207-19e78c89c1af-config\") pod \"dnsmasq-dns-74dfc89d77-rtrnt\" (UID: \"c18e8ceb-4282-417a-a207-19e78c89c1af\") " pod="openstack/dnsmasq-dns-74dfc89d77-rtrnt" Dec 16 07:18:06 crc kubenswrapper[4823]: I1216 07:18:06.695885 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c18e8ceb-4282-417a-a207-19e78c89c1af-ovsdbserver-nb\") pod \"dnsmasq-dns-74dfc89d77-rtrnt\" (UID: \"c18e8ceb-4282-417a-a207-19e78c89c1af\") " 
pod="openstack/dnsmasq-dns-74dfc89d77-rtrnt" Dec 16 07:18:06 crc kubenswrapper[4823]: I1216 07:18:06.695942 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjwlr\" (UniqueName: \"kubernetes.io/projected/c18e8ceb-4282-417a-a207-19e78c89c1af-kube-api-access-xjwlr\") pod \"dnsmasq-dns-74dfc89d77-rtrnt\" (UID: \"c18e8ceb-4282-417a-a207-19e78c89c1af\") " pod="openstack/dnsmasq-dns-74dfc89d77-rtrnt" Dec 16 07:18:06 crc kubenswrapper[4823]: I1216 07:18:06.695974 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c18e8ceb-4282-417a-a207-19e78c89c1af-dns-swift-storage-0\") pod \"dnsmasq-dns-74dfc89d77-rtrnt\" (UID: \"c18e8ceb-4282-417a-a207-19e78c89c1af\") " pod="openstack/dnsmasq-dns-74dfc89d77-rtrnt" Dec 16 07:18:06 crc kubenswrapper[4823]: I1216 07:18:06.798050 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c18e8ceb-4282-417a-a207-19e78c89c1af-config\") pod \"dnsmasq-dns-74dfc89d77-rtrnt\" (UID: \"c18e8ceb-4282-417a-a207-19e78c89c1af\") " pod="openstack/dnsmasq-dns-74dfc89d77-rtrnt" Dec 16 07:18:06 crc kubenswrapper[4823]: I1216 07:18:06.798422 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c18e8ceb-4282-417a-a207-19e78c89c1af-ovsdbserver-nb\") pod \"dnsmasq-dns-74dfc89d77-rtrnt\" (UID: \"c18e8ceb-4282-417a-a207-19e78c89c1af\") " pod="openstack/dnsmasq-dns-74dfc89d77-rtrnt" Dec 16 07:18:06 crc kubenswrapper[4823]: I1216 07:18:06.798460 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjwlr\" (UniqueName: \"kubernetes.io/projected/c18e8ceb-4282-417a-a207-19e78c89c1af-kube-api-access-xjwlr\") pod \"dnsmasq-dns-74dfc89d77-rtrnt\" (UID: 
\"c18e8ceb-4282-417a-a207-19e78c89c1af\") " pod="openstack/dnsmasq-dns-74dfc89d77-rtrnt" Dec 16 07:18:06 crc kubenswrapper[4823]: I1216 07:18:06.798492 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c18e8ceb-4282-417a-a207-19e78c89c1af-dns-swift-storage-0\") pod \"dnsmasq-dns-74dfc89d77-rtrnt\" (UID: \"c18e8ceb-4282-417a-a207-19e78c89c1af\") " pod="openstack/dnsmasq-dns-74dfc89d77-rtrnt" Dec 16 07:18:06 crc kubenswrapper[4823]: I1216 07:18:06.798516 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c18e8ceb-4282-417a-a207-19e78c89c1af-dns-svc\") pod \"dnsmasq-dns-74dfc89d77-rtrnt\" (UID: \"c18e8ceb-4282-417a-a207-19e78c89c1af\") " pod="openstack/dnsmasq-dns-74dfc89d77-rtrnt" Dec 16 07:18:06 crc kubenswrapper[4823]: I1216 07:18:06.798568 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c18e8ceb-4282-417a-a207-19e78c89c1af-ovsdbserver-sb\") pod \"dnsmasq-dns-74dfc89d77-rtrnt\" (UID: \"c18e8ceb-4282-417a-a207-19e78c89c1af\") " pod="openstack/dnsmasq-dns-74dfc89d77-rtrnt" Dec 16 07:18:06 crc kubenswrapper[4823]: I1216 07:18:06.799569 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c18e8ceb-4282-417a-a207-19e78c89c1af-ovsdbserver-nb\") pod \"dnsmasq-dns-74dfc89d77-rtrnt\" (UID: \"c18e8ceb-4282-417a-a207-19e78c89c1af\") " pod="openstack/dnsmasq-dns-74dfc89d77-rtrnt" Dec 16 07:18:06 crc kubenswrapper[4823]: I1216 07:18:06.799620 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c18e8ceb-4282-417a-a207-19e78c89c1af-ovsdbserver-sb\") pod \"dnsmasq-dns-74dfc89d77-rtrnt\" (UID: \"c18e8ceb-4282-417a-a207-19e78c89c1af\") " 
pod="openstack/dnsmasq-dns-74dfc89d77-rtrnt" Dec 16 07:18:06 crc kubenswrapper[4823]: I1216 07:18:06.799908 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c18e8ceb-4282-417a-a207-19e78c89c1af-dns-svc\") pod \"dnsmasq-dns-74dfc89d77-rtrnt\" (UID: \"c18e8ceb-4282-417a-a207-19e78c89c1af\") " pod="openstack/dnsmasq-dns-74dfc89d77-rtrnt" Dec 16 07:18:06 crc kubenswrapper[4823]: I1216 07:18:06.799972 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c18e8ceb-4282-417a-a207-19e78c89c1af-config\") pod \"dnsmasq-dns-74dfc89d77-rtrnt\" (UID: \"c18e8ceb-4282-417a-a207-19e78c89c1af\") " pod="openstack/dnsmasq-dns-74dfc89d77-rtrnt" Dec 16 07:18:06 crc kubenswrapper[4823]: I1216 07:18:06.800456 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c18e8ceb-4282-417a-a207-19e78c89c1af-dns-swift-storage-0\") pod \"dnsmasq-dns-74dfc89d77-rtrnt\" (UID: \"c18e8ceb-4282-417a-a207-19e78c89c1af\") " pod="openstack/dnsmasq-dns-74dfc89d77-rtrnt" Dec 16 07:18:06 crc kubenswrapper[4823]: I1216 07:18:06.819144 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjwlr\" (UniqueName: \"kubernetes.io/projected/c18e8ceb-4282-417a-a207-19e78c89c1af-kube-api-access-xjwlr\") pod \"dnsmasq-dns-74dfc89d77-rtrnt\" (UID: \"c18e8ceb-4282-417a-a207-19e78c89c1af\") " pod="openstack/dnsmasq-dns-74dfc89d77-rtrnt" Dec 16 07:18:06 crc kubenswrapper[4823]: I1216 07:18:06.842886 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74dfc89d77-rtrnt" Dec 16 07:18:07 crc kubenswrapper[4823]: I1216 07:18:07.177397 4823 generic.go:334] "Generic (PLEG): container finished" podID="2968d903-49aa-408e-bca1-984f529ea0ec" containerID="89b6a48c510ff948af9267aa48e62862d2381aaa15a567a4698272a8082ae044" exitCode=0 Dec 16 07:18:07 crc kubenswrapper[4823]: I1216 07:18:07.177552 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bfd654465-4jj7z" event={"ID":"2968d903-49aa-408e-bca1-984f529ea0ec","Type":"ContainerDied","Data":"89b6a48c510ff948af9267aa48e62862d2381aaa15a567a4698272a8082ae044"} Dec 16 07:18:07 crc kubenswrapper[4823]: I1216 07:18:07.340175 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74dfc89d77-rtrnt"] Dec 16 07:18:07 crc kubenswrapper[4823]: W1216 07:18:07.344481 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc18e8ceb_4282_417a_a207_19e78c89c1af.slice/crio-5755e3946ca18072a39960729d93e35d9621a4e1316265f78fdf5d9a1e33b583 WatchSource:0}: Error finding container 5755e3946ca18072a39960729d93e35d9621a4e1316265f78fdf5d9a1e33b583: Status 404 returned error can't find the container with id 5755e3946ca18072a39960729d93e35d9621a4e1316265f78fdf5d9a1e33b583 Dec 16 07:18:08 crc kubenswrapper[4823]: I1216 07:18:08.196538 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bfd654465-4jj7z" event={"ID":"2968d903-49aa-408e-bca1-984f529ea0ec","Type":"ContainerStarted","Data":"26753200f05270d1a52d237d01009efc702028f3f75356304e6cbcdaf36198a7"} Dec 16 07:18:08 crc kubenswrapper[4823]: I1216 07:18:08.196599 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6bfd654465-4jj7z" podUID="2968d903-49aa-408e-bca1-984f529ea0ec" containerName="dnsmasq-dns" 
containerID="cri-o://26753200f05270d1a52d237d01009efc702028f3f75356304e6cbcdaf36198a7" gracePeriod=10 Dec 16 07:18:08 crc kubenswrapper[4823]: I1216 07:18:08.196638 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6bfd654465-4jj7z" Dec 16 07:18:08 crc kubenswrapper[4823]: I1216 07:18:08.198332 4823 generic.go:334] "Generic (PLEG): container finished" podID="dee4f17f-49e7-4f83-b138-a913f67757b3" containerID="cd1dc48ce8bf98695f893a5adecf3b9f44bbc3521a4cef478e87f157c2618181" exitCode=0 Dec 16 07:18:08 crc kubenswrapper[4823]: I1216 07:18:08.198403 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-2jnpg" event={"ID":"dee4f17f-49e7-4f83-b138-a913f67757b3","Type":"ContainerDied","Data":"cd1dc48ce8bf98695f893a5adecf3b9f44bbc3521a4cef478e87f157c2618181"} Dec 16 07:18:08 crc kubenswrapper[4823]: I1216 07:18:08.204462 4823 generic.go:334] "Generic (PLEG): container finished" podID="c18e8ceb-4282-417a-a207-19e78c89c1af" containerID="69ef19b166f1aa3bf87453d32ff40974a0dbee2293781a4b289eddff9e6de9fb" exitCode=0 Dec 16 07:18:08 crc kubenswrapper[4823]: I1216 07:18:08.204513 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74dfc89d77-rtrnt" event={"ID":"c18e8ceb-4282-417a-a207-19e78c89c1af","Type":"ContainerDied","Data":"69ef19b166f1aa3bf87453d32ff40974a0dbee2293781a4b289eddff9e6de9fb"} Dec 16 07:18:08 crc kubenswrapper[4823]: I1216 07:18:08.204543 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74dfc89d77-rtrnt" event={"ID":"c18e8ceb-4282-417a-a207-19e78c89c1af","Type":"ContainerStarted","Data":"5755e3946ca18072a39960729d93e35d9621a4e1316265f78fdf5d9a1e33b583"} Dec 16 07:18:08 crc kubenswrapper[4823]: I1216 07:18:08.217436 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6bfd654465-4jj7z" podStartSLOduration=3.217417863 podStartE2EDuration="3.217417863s" 
podCreationTimestamp="2025-12-16 07:18:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 07:18:08.215467452 +0000 UTC m=+1366.704033585" watchObservedRunningTime="2025-12-16 07:18:08.217417863 +0000 UTC m=+1366.705983986" Dec 16 07:18:08 crc kubenswrapper[4823]: I1216 07:18:08.638822 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bfd654465-4jj7z" Dec 16 07:18:08 crc kubenswrapper[4823]: I1216 07:18:08.728762 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5tggb\" (UniqueName: \"kubernetes.io/projected/2968d903-49aa-408e-bca1-984f529ea0ec-kube-api-access-5tggb\") pod \"2968d903-49aa-408e-bca1-984f529ea0ec\" (UID: \"2968d903-49aa-408e-bca1-984f529ea0ec\") " Dec 16 07:18:08 crc kubenswrapper[4823]: I1216 07:18:08.728818 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2968d903-49aa-408e-bca1-984f529ea0ec-config\") pod \"2968d903-49aa-408e-bca1-984f529ea0ec\" (UID: \"2968d903-49aa-408e-bca1-984f529ea0ec\") " Dec 16 07:18:08 crc kubenswrapper[4823]: I1216 07:18:08.728873 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2968d903-49aa-408e-bca1-984f529ea0ec-ovsdbserver-sb\") pod \"2968d903-49aa-408e-bca1-984f529ea0ec\" (UID: \"2968d903-49aa-408e-bca1-984f529ea0ec\") " Dec 16 07:18:08 crc kubenswrapper[4823]: I1216 07:18:08.728937 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2968d903-49aa-408e-bca1-984f529ea0ec-dns-svc\") pod \"2968d903-49aa-408e-bca1-984f529ea0ec\" (UID: \"2968d903-49aa-408e-bca1-984f529ea0ec\") " Dec 16 07:18:08 crc kubenswrapper[4823]: I1216 07:18:08.729398 4823 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2968d903-49aa-408e-bca1-984f529ea0ec-ovsdbserver-nb\") pod \"2968d903-49aa-408e-bca1-984f529ea0ec\" (UID: \"2968d903-49aa-408e-bca1-984f529ea0ec\") " Dec 16 07:18:08 crc kubenswrapper[4823]: I1216 07:18:08.736057 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2968d903-49aa-408e-bca1-984f529ea0ec-kube-api-access-5tggb" (OuterVolumeSpecName: "kube-api-access-5tggb") pod "2968d903-49aa-408e-bca1-984f529ea0ec" (UID: "2968d903-49aa-408e-bca1-984f529ea0ec"). InnerVolumeSpecName "kube-api-access-5tggb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:18:08 crc kubenswrapper[4823]: I1216 07:18:08.781011 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2968d903-49aa-408e-bca1-984f529ea0ec-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2968d903-49aa-408e-bca1-984f529ea0ec" (UID: "2968d903-49aa-408e-bca1-984f529ea0ec"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:18:08 crc kubenswrapper[4823]: I1216 07:18:08.784417 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2968d903-49aa-408e-bca1-984f529ea0ec-config" (OuterVolumeSpecName: "config") pod "2968d903-49aa-408e-bca1-984f529ea0ec" (UID: "2968d903-49aa-408e-bca1-984f529ea0ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:18:08 crc kubenswrapper[4823]: I1216 07:18:08.785641 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2968d903-49aa-408e-bca1-984f529ea0ec-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2968d903-49aa-408e-bca1-984f529ea0ec" (UID: "2968d903-49aa-408e-bca1-984f529ea0ec"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:18:08 crc kubenswrapper[4823]: I1216 07:18:08.786501 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2968d903-49aa-408e-bca1-984f529ea0ec-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2968d903-49aa-408e-bca1-984f529ea0ec" (UID: "2968d903-49aa-408e-bca1-984f529ea0ec"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:18:08 crc kubenswrapper[4823]: I1216 07:18:08.831366 4823 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2968d903-49aa-408e-bca1-984f529ea0ec-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 16 07:18:08 crc kubenswrapper[4823]: I1216 07:18:08.831412 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5tggb\" (UniqueName: \"kubernetes.io/projected/2968d903-49aa-408e-bca1-984f529ea0ec-kube-api-access-5tggb\") on node \"crc\" DevicePath \"\"" Dec 16 07:18:08 crc kubenswrapper[4823]: I1216 07:18:08.831429 4823 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2968d903-49aa-408e-bca1-984f529ea0ec-config\") on node \"crc\" DevicePath \"\"" Dec 16 07:18:08 crc kubenswrapper[4823]: I1216 07:18:08.831442 4823 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2968d903-49aa-408e-bca1-984f529ea0ec-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 16 07:18:08 crc kubenswrapper[4823]: I1216 07:18:08.831453 4823 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2968d903-49aa-408e-bca1-984f529ea0ec-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 16 07:18:09 crc kubenswrapper[4823]: I1216 07:18:09.212574 4823 generic.go:334] "Generic (PLEG): container finished" podID="2968d903-49aa-408e-bca1-984f529ea0ec" 
containerID="26753200f05270d1a52d237d01009efc702028f3f75356304e6cbcdaf36198a7" exitCode=0 Dec 16 07:18:09 crc kubenswrapper[4823]: I1216 07:18:09.212832 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bfd654465-4jj7z" Dec 16 07:18:09 crc kubenswrapper[4823]: I1216 07:18:09.212779 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bfd654465-4jj7z" event={"ID":"2968d903-49aa-408e-bca1-984f529ea0ec","Type":"ContainerDied","Data":"26753200f05270d1a52d237d01009efc702028f3f75356304e6cbcdaf36198a7"} Dec 16 07:18:09 crc kubenswrapper[4823]: I1216 07:18:09.212885 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bfd654465-4jj7z" event={"ID":"2968d903-49aa-408e-bca1-984f529ea0ec","Type":"ContainerDied","Data":"ec7d93987bc4b0756d4f9c4faa36068670b22a94b8ef94628bb265f1217bceea"} Dec 16 07:18:09 crc kubenswrapper[4823]: I1216 07:18:09.212907 4823 scope.go:117] "RemoveContainer" containerID="26753200f05270d1a52d237d01009efc702028f3f75356304e6cbcdaf36198a7" Dec 16 07:18:09 crc kubenswrapper[4823]: I1216 07:18:09.224242 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74dfc89d77-rtrnt" event={"ID":"c18e8ceb-4282-417a-a207-19e78c89c1af","Type":"ContainerStarted","Data":"9ae8973126f3c15b2cb0e0cebea03e45af2715b2924e20ccc8e2805f501fe397"} Dec 16 07:18:09 crc kubenswrapper[4823]: I1216 07:18:09.236932 4823 scope.go:117] "RemoveContainer" containerID="89b6a48c510ff948af9267aa48e62862d2381aaa15a567a4698272a8082ae044" Dec 16 07:18:09 crc kubenswrapper[4823]: I1216 07:18:09.251011 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-74dfc89d77-rtrnt" podStartSLOduration=3.250991255 podStartE2EDuration="3.250991255s" podCreationTimestamp="2025-12-16 07:18:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-12-16 07:18:09.24254938 +0000 UTC m=+1367.731115513" watchObservedRunningTime="2025-12-16 07:18:09.250991255 +0000 UTC m=+1367.739557378" Dec 16 07:18:09 crc kubenswrapper[4823]: I1216 07:18:09.270134 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bfd654465-4jj7z"] Dec 16 07:18:09 crc kubenswrapper[4823]: I1216 07:18:09.282359 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6bfd654465-4jj7z"] Dec 16 07:18:09 crc kubenswrapper[4823]: I1216 07:18:09.288058 4823 scope.go:117] "RemoveContainer" containerID="26753200f05270d1a52d237d01009efc702028f3f75356304e6cbcdaf36198a7" Dec 16 07:18:09 crc kubenswrapper[4823]: E1216 07:18:09.288466 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26753200f05270d1a52d237d01009efc702028f3f75356304e6cbcdaf36198a7\": container with ID starting with 26753200f05270d1a52d237d01009efc702028f3f75356304e6cbcdaf36198a7 not found: ID does not exist" containerID="26753200f05270d1a52d237d01009efc702028f3f75356304e6cbcdaf36198a7" Dec 16 07:18:09 crc kubenswrapper[4823]: I1216 07:18:09.288518 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26753200f05270d1a52d237d01009efc702028f3f75356304e6cbcdaf36198a7"} err="failed to get container status \"26753200f05270d1a52d237d01009efc702028f3f75356304e6cbcdaf36198a7\": rpc error: code = NotFound desc = could not find container \"26753200f05270d1a52d237d01009efc702028f3f75356304e6cbcdaf36198a7\": container with ID starting with 26753200f05270d1a52d237d01009efc702028f3f75356304e6cbcdaf36198a7 not found: ID does not exist" Dec 16 07:18:09 crc kubenswrapper[4823]: I1216 07:18:09.288552 4823 scope.go:117] "RemoveContainer" containerID="89b6a48c510ff948af9267aa48e62862d2381aaa15a567a4698272a8082ae044" Dec 16 07:18:09 crc kubenswrapper[4823]: E1216 07:18:09.288890 4823 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89b6a48c510ff948af9267aa48e62862d2381aaa15a567a4698272a8082ae044\": container with ID starting with 89b6a48c510ff948af9267aa48e62862d2381aaa15a567a4698272a8082ae044 not found: ID does not exist" containerID="89b6a48c510ff948af9267aa48e62862d2381aaa15a567a4698272a8082ae044" Dec 16 07:18:09 crc kubenswrapper[4823]: I1216 07:18:09.288918 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89b6a48c510ff948af9267aa48e62862d2381aaa15a567a4698272a8082ae044"} err="failed to get container status \"89b6a48c510ff948af9267aa48e62862d2381aaa15a567a4698272a8082ae044\": rpc error: code = NotFound desc = could not find container \"89b6a48c510ff948af9267aa48e62862d2381aaa15a567a4698272a8082ae044\": container with ID starting with 89b6a48c510ff948af9267aa48e62862d2381aaa15a567a4698272a8082ae044 not found: ID does not exist" Dec 16 07:18:09 crc kubenswrapper[4823]: I1216 07:18:09.516965 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-2jnpg" Dec 16 07:18:09 crc kubenswrapper[4823]: I1216 07:18:09.644953 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f6tkh\" (UniqueName: \"kubernetes.io/projected/dee4f17f-49e7-4f83-b138-a913f67757b3-kube-api-access-f6tkh\") pod \"dee4f17f-49e7-4f83-b138-a913f67757b3\" (UID: \"dee4f17f-49e7-4f83-b138-a913f67757b3\") " Dec 16 07:18:09 crc kubenswrapper[4823]: I1216 07:18:09.645015 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dee4f17f-49e7-4f83-b138-a913f67757b3-config-data\") pod \"dee4f17f-49e7-4f83-b138-a913f67757b3\" (UID: \"dee4f17f-49e7-4f83-b138-a913f67757b3\") " Dec 16 07:18:09 crc kubenswrapper[4823]: I1216 07:18:09.645144 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dee4f17f-49e7-4f83-b138-a913f67757b3-combined-ca-bundle\") pod \"dee4f17f-49e7-4f83-b138-a913f67757b3\" (UID: \"dee4f17f-49e7-4f83-b138-a913f67757b3\") " Dec 16 07:18:09 crc kubenswrapper[4823]: I1216 07:18:09.651295 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dee4f17f-49e7-4f83-b138-a913f67757b3-kube-api-access-f6tkh" (OuterVolumeSpecName: "kube-api-access-f6tkh") pod "dee4f17f-49e7-4f83-b138-a913f67757b3" (UID: "dee4f17f-49e7-4f83-b138-a913f67757b3"). InnerVolumeSpecName "kube-api-access-f6tkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:18:09 crc kubenswrapper[4823]: I1216 07:18:09.668945 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dee4f17f-49e7-4f83-b138-a913f67757b3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dee4f17f-49e7-4f83-b138-a913f67757b3" (UID: "dee4f17f-49e7-4f83-b138-a913f67757b3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:18:09 crc kubenswrapper[4823]: I1216 07:18:09.686327 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dee4f17f-49e7-4f83-b138-a913f67757b3-config-data" (OuterVolumeSpecName: "config-data") pod "dee4f17f-49e7-4f83-b138-a913f67757b3" (UID: "dee4f17f-49e7-4f83-b138-a913f67757b3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:18:09 crc kubenswrapper[4823]: I1216 07:18:09.747446 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f6tkh\" (UniqueName: \"kubernetes.io/projected/dee4f17f-49e7-4f83-b138-a913f67757b3-kube-api-access-f6tkh\") on node \"crc\" DevicePath \"\"" Dec 16 07:18:09 crc kubenswrapper[4823]: I1216 07:18:09.747489 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dee4f17f-49e7-4f83-b138-a913f67757b3-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 07:18:09 crc kubenswrapper[4823]: I1216 07:18:09.747501 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dee4f17f-49e7-4f83-b138-a913f67757b3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:18:09 crc kubenswrapper[4823]: I1216 07:18:09.782560 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2968d903-49aa-408e-bca1-984f529ea0ec" path="/var/lib/kubelet/pods/2968d903-49aa-408e-bca1-984f529ea0ec/volumes" Dec 16 07:18:10 crc kubenswrapper[4823]: I1216 07:18:10.233998 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-2jnpg" event={"ID":"dee4f17f-49e7-4f83-b138-a913f67757b3","Type":"ContainerDied","Data":"ae3505fc5d19a24ad5bee2b4fb305097018ea68221b125d537480db75e723878"} Dec 16 07:18:10 crc kubenswrapper[4823]: I1216 07:18:10.234312 4823 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="ae3505fc5d19a24ad5bee2b4fb305097018ea68221b125d537480db75e723878"
Dec 16 07:18:10 crc kubenswrapper[4823]: I1216 07:18:10.234107 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-2jnpg"
Dec 16 07:18:10 crc kubenswrapper[4823]: I1216 07:18:10.236538 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-74dfc89d77-rtrnt"
Dec 16 07:18:10 crc kubenswrapper[4823]: I1216 07:18:10.519732 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74dfc89d77-rtrnt"]
Dec 16 07:18:10 crc kubenswrapper[4823]: I1216 07:18:10.536056 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-msjl2"]
Dec 16 07:18:10 crc kubenswrapper[4823]: E1216 07:18:10.536410 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dee4f17f-49e7-4f83-b138-a913f67757b3" containerName="keystone-db-sync"
Dec 16 07:18:10 crc kubenswrapper[4823]: I1216 07:18:10.536425 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="dee4f17f-49e7-4f83-b138-a913f67757b3" containerName="keystone-db-sync"
Dec 16 07:18:10 crc kubenswrapper[4823]: E1216 07:18:10.536438 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2968d903-49aa-408e-bca1-984f529ea0ec" containerName="init"
Dec 16 07:18:10 crc kubenswrapper[4823]: I1216 07:18:10.536444 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="2968d903-49aa-408e-bca1-984f529ea0ec" containerName="init"
Dec 16 07:18:10 crc kubenswrapper[4823]: E1216 07:18:10.536466 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2968d903-49aa-408e-bca1-984f529ea0ec" containerName="dnsmasq-dns"
Dec 16 07:18:10 crc kubenswrapper[4823]: I1216 07:18:10.536472 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="2968d903-49aa-408e-bca1-984f529ea0ec" containerName="dnsmasq-dns"
Dec 16 07:18:10 crc kubenswrapper[4823]: I1216 07:18:10.536733 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="2968d903-49aa-408e-bca1-984f529ea0ec" containerName="dnsmasq-dns"
Dec 16 07:18:10 crc kubenswrapper[4823]: I1216 07:18:10.536754 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="dee4f17f-49e7-4f83-b138-a913f67757b3" containerName="keystone-db-sync"
Dec 16 07:18:10 crc kubenswrapper[4823]: I1216 07:18:10.537305 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-msjl2"
Dec 16 07:18:10 crc kubenswrapper[4823]: I1216 07:18:10.547550 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret"
Dec 16 07:18:10 crc kubenswrapper[4823]: I1216 07:18:10.547759 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Dec 16 07:18:10 crc kubenswrapper[4823]: I1216 07:18:10.547929 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Dec 16 07:18:10 crc kubenswrapper[4823]: I1216 07:18:10.548071 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Dec 16 07:18:10 crc kubenswrapper[4823]: I1216 07:18:10.548236 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-wtcrq"
Dec 16 07:18:10 crc kubenswrapper[4823]: I1216 07:18:10.572300 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5fdbfbc95f-rq8ng"]
Dec 16 07:18:10 crc kubenswrapper[4823]: I1216 07:18:10.573825 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5fdbfbc95f-rq8ng"
Dec 16 07:18:10 crc kubenswrapper[4823]: I1216 07:18:10.584684 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-msjl2"]
Dec 16 07:18:10 crc kubenswrapper[4823]: I1216 07:18:10.665035 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/46e59506-a8dc-49b4-b1b9-d505eeeee126-credential-keys\") pod \"keystone-bootstrap-msjl2\" (UID: \"46e59506-a8dc-49b4-b1b9-d505eeeee126\") " pod="openstack/keystone-bootstrap-msjl2"
Dec 16 07:18:10 crc kubenswrapper[4823]: I1216 07:18:10.665086 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46e59506-a8dc-49b4-b1b9-d505eeeee126-scripts\") pod \"keystone-bootstrap-msjl2\" (UID: \"46e59506-a8dc-49b4-b1b9-d505eeeee126\") " pod="openstack/keystone-bootstrap-msjl2"
Dec 16 07:18:10 crc kubenswrapper[4823]: I1216 07:18:10.665127 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cb9d044a-5b65-468c-88a7-6338c76e3021-ovsdbserver-nb\") pod \"dnsmasq-dns-5fdbfbc95f-rq8ng\" (UID: \"cb9d044a-5b65-468c-88a7-6338c76e3021\") " pod="openstack/dnsmasq-dns-5fdbfbc95f-rq8ng"
Dec 16 07:18:10 crc kubenswrapper[4823]: I1216 07:18:10.665167 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cb9d044a-5b65-468c-88a7-6338c76e3021-dns-svc\") pod \"dnsmasq-dns-5fdbfbc95f-rq8ng\" (UID: \"cb9d044a-5b65-468c-88a7-6338c76e3021\") " pod="openstack/dnsmasq-dns-5fdbfbc95f-rq8ng"
Dec 16 07:18:10 crc kubenswrapper[4823]: I1216 07:18:10.665225 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cb9d044a-5b65-468c-88a7-6338c76e3021-dns-swift-storage-0\") pod \"dnsmasq-dns-5fdbfbc95f-rq8ng\" (UID: \"cb9d044a-5b65-468c-88a7-6338c76e3021\") " pod="openstack/dnsmasq-dns-5fdbfbc95f-rq8ng"
Dec 16 07:18:10 crc kubenswrapper[4823]: I1216 07:18:10.665250 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcbhf\" (UniqueName: \"kubernetes.io/projected/cb9d044a-5b65-468c-88a7-6338c76e3021-kube-api-access-tcbhf\") pod \"dnsmasq-dns-5fdbfbc95f-rq8ng\" (UID: \"cb9d044a-5b65-468c-88a7-6338c76e3021\") " pod="openstack/dnsmasq-dns-5fdbfbc95f-rq8ng"
Dec 16 07:18:10 crc kubenswrapper[4823]: I1216 07:18:10.665312 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb9d044a-5b65-468c-88a7-6338c76e3021-config\") pod \"dnsmasq-dns-5fdbfbc95f-rq8ng\" (UID: \"cb9d044a-5b65-468c-88a7-6338c76e3021\") " pod="openstack/dnsmasq-dns-5fdbfbc95f-rq8ng"
Dec 16 07:18:10 crc kubenswrapper[4823]: I1216 07:18:10.665360 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/46e59506-a8dc-49b4-b1b9-d505eeeee126-fernet-keys\") pod \"keystone-bootstrap-msjl2\" (UID: \"46e59506-a8dc-49b4-b1b9-d505eeeee126\") " pod="openstack/keystone-bootstrap-msjl2"
Dec 16 07:18:10 crc kubenswrapper[4823]: I1216 07:18:10.665396 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8fb7\" (UniqueName: \"kubernetes.io/projected/46e59506-a8dc-49b4-b1b9-d505eeeee126-kube-api-access-q8fb7\") pod \"keystone-bootstrap-msjl2\" (UID: \"46e59506-a8dc-49b4-b1b9-d505eeeee126\") " pod="openstack/keystone-bootstrap-msjl2"
Dec 16 07:18:10 crc kubenswrapper[4823]: I1216 07:18:10.665425 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46e59506-a8dc-49b4-b1b9-d505eeeee126-combined-ca-bundle\") pod \"keystone-bootstrap-msjl2\" (UID: \"46e59506-a8dc-49b4-b1b9-d505eeeee126\") " pod="openstack/keystone-bootstrap-msjl2"
Dec 16 07:18:10 crc kubenswrapper[4823]: I1216 07:18:10.665465 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46e59506-a8dc-49b4-b1b9-d505eeeee126-config-data\") pod \"keystone-bootstrap-msjl2\" (UID: \"46e59506-a8dc-49b4-b1b9-d505eeeee126\") " pod="openstack/keystone-bootstrap-msjl2"
Dec 16 07:18:10 crc kubenswrapper[4823]: I1216 07:18:10.665493 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cb9d044a-5b65-468c-88a7-6338c76e3021-ovsdbserver-sb\") pod \"dnsmasq-dns-5fdbfbc95f-rq8ng\" (UID: \"cb9d044a-5b65-468c-88a7-6338c76e3021\") " pod="openstack/dnsmasq-dns-5fdbfbc95f-rq8ng"
Dec 16 07:18:10 crc kubenswrapper[4823]: I1216 07:18:10.692400 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5fdbfbc95f-rq8ng"]
Dec 16 07:18:10 crc kubenswrapper[4823]: I1216 07:18:10.766945 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46e59506-a8dc-49b4-b1b9-d505eeeee126-scripts\") pod \"keystone-bootstrap-msjl2\" (UID: \"46e59506-a8dc-49b4-b1b9-d505eeeee126\") " pod="openstack/keystone-bootstrap-msjl2"
Dec 16 07:18:10 crc kubenswrapper[4823]: I1216 07:18:10.767015 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cb9d044a-5b65-468c-88a7-6338c76e3021-ovsdbserver-nb\") pod \"dnsmasq-dns-5fdbfbc95f-rq8ng\" (UID: \"cb9d044a-5b65-468c-88a7-6338c76e3021\") " pod="openstack/dnsmasq-dns-5fdbfbc95f-rq8ng"
Dec 16 07:18:10 crc kubenswrapper[4823]: I1216 07:18:10.767077 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cb9d044a-5b65-468c-88a7-6338c76e3021-dns-svc\") pod \"dnsmasq-dns-5fdbfbc95f-rq8ng\" (UID: \"cb9d044a-5b65-468c-88a7-6338c76e3021\") " pod="openstack/dnsmasq-dns-5fdbfbc95f-rq8ng"
Dec 16 07:18:10 crc kubenswrapper[4823]: I1216 07:18:10.767124 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cb9d044a-5b65-468c-88a7-6338c76e3021-dns-swift-storage-0\") pod \"dnsmasq-dns-5fdbfbc95f-rq8ng\" (UID: \"cb9d044a-5b65-468c-88a7-6338c76e3021\") " pod="openstack/dnsmasq-dns-5fdbfbc95f-rq8ng"
Dec 16 07:18:10 crc kubenswrapper[4823]: I1216 07:18:10.767145 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tcbhf\" (UniqueName: \"kubernetes.io/projected/cb9d044a-5b65-468c-88a7-6338c76e3021-kube-api-access-tcbhf\") pod \"dnsmasq-dns-5fdbfbc95f-rq8ng\" (UID: \"cb9d044a-5b65-468c-88a7-6338c76e3021\") " pod="openstack/dnsmasq-dns-5fdbfbc95f-rq8ng"
Dec 16 07:18:10 crc kubenswrapper[4823]: I1216 07:18:10.767177 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb9d044a-5b65-468c-88a7-6338c76e3021-config\") pod \"dnsmasq-dns-5fdbfbc95f-rq8ng\" (UID: \"cb9d044a-5b65-468c-88a7-6338c76e3021\") " pod="openstack/dnsmasq-dns-5fdbfbc95f-rq8ng"
Dec 16 07:18:10 crc kubenswrapper[4823]: I1216 07:18:10.767209 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/46e59506-a8dc-49b4-b1b9-d505eeeee126-fernet-keys\") pod \"keystone-bootstrap-msjl2\" (UID: \"46e59506-a8dc-49b4-b1b9-d505eeeee126\") " pod="openstack/keystone-bootstrap-msjl2"
Dec 16 07:18:10 crc kubenswrapper[4823]: I1216 07:18:10.767263 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8fb7\" (UniqueName: \"kubernetes.io/projected/46e59506-a8dc-49b4-b1b9-d505eeeee126-kube-api-access-q8fb7\") pod \"keystone-bootstrap-msjl2\" (UID: \"46e59506-a8dc-49b4-b1b9-d505eeeee126\") " pod="openstack/keystone-bootstrap-msjl2"
Dec 16 07:18:10 crc kubenswrapper[4823]: I1216 07:18:10.767290 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46e59506-a8dc-49b4-b1b9-d505eeeee126-combined-ca-bundle\") pod \"keystone-bootstrap-msjl2\" (UID: \"46e59506-a8dc-49b4-b1b9-d505eeeee126\") " pod="openstack/keystone-bootstrap-msjl2"
Dec 16 07:18:10 crc kubenswrapper[4823]: I1216 07:18:10.767311 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46e59506-a8dc-49b4-b1b9-d505eeeee126-config-data\") pod \"keystone-bootstrap-msjl2\" (UID: \"46e59506-a8dc-49b4-b1b9-d505eeeee126\") " pod="openstack/keystone-bootstrap-msjl2"
Dec 16 07:18:10 crc kubenswrapper[4823]: I1216 07:18:10.767337 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cb9d044a-5b65-468c-88a7-6338c76e3021-ovsdbserver-sb\") pod \"dnsmasq-dns-5fdbfbc95f-rq8ng\" (UID: \"cb9d044a-5b65-468c-88a7-6338c76e3021\") " pod="openstack/dnsmasq-dns-5fdbfbc95f-rq8ng"
Dec 16 07:18:10 crc kubenswrapper[4823]: I1216 07:18:10.767408 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/46e59506-a8dc-49b4-b1b9-d505eeeee126-credential-keys\") pod \"keystone-bootstrap-msjl2\" (UID: \"46e59506-a8dc-49b4-b1b9-d505eeeee126\") " pod="openstack/keystone-bootstrap-msjl2"
Dec 16 07:18:10 crc kubenswrapper[4823]: I1216 07:18:10.774831 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/46e59506-a8dc-49b4-b1b9-d505eeeee126-fernet-keys\") pod \"keystone-bootstrap-msjl2\" (UID: \"46e59506-a8dc-49b4-b1b9-d505eeeee126\") " pod="openstack/keystone-bootstrap-msjl2"
Dec 16 07:18:10 crc kubenswrapper[4823]: I1216 07:18:10.775824 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cb9d044a-5b65-468c-88a7-6338c76e3021-dns-svc\") pod \"dnsmasq-dns-5fdbfbc95f-rq8ng\" (UID: \"cb9d044a-5b65-468c-88a7-6338c76e3021\") " pod="openstack/dnsmasq-dns-5fdbfbc95f-rq8ng"
Dec 16 07:18:10 crc kubenswrapper[4823]: I1216 07:18:10.790409 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/46e59506-a8dc-49b4-b1b9-d505eeeee126-credential-keys\") pod \"keystone-bootstrap-msjl2\" (UID: \"46e59506-a8dc-49b4-b1b9-d505eeeee126\") " pod="openstack/keystone-bootstrap-msjl2"
Dec 16 07:18:10 crc kubenswrapper[4823]: I1216 07:18:10.794710 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cb9d044a-5b65-468c-88a7-6338c76e3021-ovsdbserver-nb\") pod \"dnsmasq-dns-5fdbfbc95f-rq8ng\" (UID: \"cb9d044a-5b65-468c-88a7-6338c76e3021\") " pod="openstack/dnsmasq-dns-5fdbfbc95f-rq8ng"
Dec 16 07:18:10 crc kubenswrapper[4823]: I1216 07:18:10.796065 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46e59506-a8dc-49b4-b1b9-d505eeeee126-combined-ca-bundle\") pod \"keystone-bootstrap-msjl2\" (UID: \"46e59506-a8dc-49b4-b1b9-d505eeeee126\") " pod="openstack/keystone-bootstrap-msjl2"
Dec 16 07:18:10 crc kubenswrapper[4823]: I1216 07:18:10.796667 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46e59506-a8dc-49b4-b1b9-d505eeeee126-config-data\") pod \"keystone-bootstrap-msjl2\" (UID: \"46e59506-a8dc-49b4-b1b9-d505eeeee126\") " pod="openstack/keystone-bootstrap-msjl2"
Dec 16 07:18:10 crc kubenswrapper[4823]: I1216 07:18:10.801152 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cb9d044a-5b65-468c-88a7-6338c76e3021-dns-swift-storage-0\") pod \"dnsmasq-dns-5fdbfbc95f-rq8ng\" (UID: \"cb9d044a-5b65-468c-88a7-6338c76e3021\") " pod="openstack/dnsmasq-dns-5fdbfbc95f-rq8ng"
Dec 16 07:18:10 crc kubenswrapper[4823]: I1216 07:18:10.801531 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46e59506-a8dc-49b4-b1b9-d505eeeee126-scripts\") pod \"keystone-bootstrap-msjl2\" (UID: \"46e59506-a8dc-49b4-b1b9-d505eeeee126\") " pod="openstack/keystone-bootstrap-msjl2"
Dec 16 07:18:10 crc kubenswrapper[4823]: I1216 07:18:10.801865 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb9d044a-5b65-468c-88a7-6338c76e3021-config\") pod \"dnsmasq-dns-5fdbfbc95f-rq8ng\" (UID: \"cb9d044a-5b65-468c-88a7-6338c76e3021\") " pod="openstack/dnsmasq-dns-5fdbfbc95f-rq8ng"
Dec 16 07:18:10 crc kubenswrapper[4823]: I1216 07:18:10.804983 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cb9d044a-5b65-468c-88a7-6338c76e3021-ovsdbserver-sb\") pod \"dnsmasq-dns-5fdbfbc95f-rq8ng\" (UID: \"cb9d044a-5b65-468c-88a7-6338c76e3021\") " pod="openstack/dnsmasq-dns-5fdbfbc95f-rq8ng"
Dec 16 07:18:10 crc kubenswrapper[4823]: I1216 07:18:10.815677 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8fb7\" (UniqueName: \"kubernetes.io/projected/46e59506-a8dc-49b4-b1b9-d505eeeee126-kube-api-access-q8fb7\") pod \"keystone-bootstrap-msjl2\" (UID: \"46e59506-a8dc-49b4-b1b9-d505eeeee126\") " pod="openstack/keystone-bootstrap-msjl2"
Dec 16 07:18:10 crc kubenswrapper[4823]: I1216 07:18:10.843325 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcbhf\" (UniqueName: \"kubernetes.io/projected/cb9d044a-5b65-468c-88a7-6338c76e3021-kube-api-access-tcbhf\") pod \"dnsmasq-dns-5fdbfbc95f-rq8ng\" (UID: \"cb9d044a-5b65-468c-88a7-6338c76e3021\") " pod="openstack/dnsmasq-dns-5fdbfbc95f-rq8ng"
Dec 16 07:18:10 crc kubenswrapper[4823]: I1216 07:18:10.866875 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Dec 16 07:18:10 crc kubenswrapper[4823]: I1216 07:18:10.869272 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 16 07:18:10 crc kubenswrapper[4823]: I1216 07:18:10.872539 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Dec 16 07:18:10 crc kubenswrapper[4823]: I1216 07:18:10.892696 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Dec 16 07:18:10 crc kubenswrapper[4823]: I1216 07:18:10.940554 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-n2br8"]
Dec 16 07:18:10 crc kubenswrapper[4823]: I1216 07:18:10.942472 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-n2br8"
Dec 16 07:18:10 crc kubenswrapper[4823]: I1216 07:18:10.943103 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-msjl2"
Dec 16 07:18:10 crc kubenswrapper[4823]: I1216 07:18:10.945426 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-jg2mt"
Dec 16 07:18:10 crc kubenswrapper[4823]: I1216 07:18:10.946170 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data"
Dec 16 07:18:10 crc kubenswrapper[4823]: I1216 07:18:10.946203 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts"
Dec 16 07:18:10 crc kubenswrapper[4823]: I1216 07:18:10.973633 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fcd5e697-1360-4376-8160-ba0bc7fa56f8-config-data\") pod \"cinder-db-sync-n2br8\" (UID: \"fcd5e697-1360-4376-8160-ba0bc7fa56f8\") " pod="openstack/cinder-db-sync-n2br8"
Dec 16 07:18:10 crc kubenswrapper[4823]: I1216 07:18:10.974610 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/22c10a9c-6dba-4d35-a0d8-2ef0b82352cb-run-httpd\") pod \"ceilometer-0\" (UID: \"22c10a9c-6dba-4d35-a0d8-2ef0b82352cb\") " pod="openstack/ceilometer-0"
Dec 16 07:18:10 crc kubenswrapper[4823]: I1216 07:18:10.974734 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4dkp\" (UniqueName: \"kubernetes.io/projected/fcd5e697-1360-4376-8160-ba0bc7fa56f8-kube-api-access-m4dkp\") pod \"cinder-db-sync-n2br8\" (UID: \"fcd5e697-1360-4376-8160-ba0bc7fa56f8\") " pod="openstack/cinder-db-sync-n2br8"
Dec 16 07:18:10 crc kubenswrapper[4823]: I1216 07:18:10.974869 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22c10a9c-6dba-4d35-a0d8-2ef0b82352cb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"22c10a9c-6dba-4d35-a0d8-2ef0b82352cb\") " pod="openstack/ceilometer-0"
Dec 16 07:18:10 crc kubenswrapper[4823]: I1216 07:18:10.974984 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/22c10a9c-6dba-4d35-a0d8-2ef0b82352cb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"22c10a9c-6dba-4d35-a0d8-2ef0b82352cb\") " pod="openstack/ceilometer-0"
Dec 16 07:18:10 crc kubenswrapper[4823]: I1216 07:18:10.975105 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fcd5e697-1360-4376-8160-ba0bc7fa56f8-etc-machine-id\") pod \"cinder-db-sync-n2br8\" (UID: \"fcd5e697-1360-4376-8160-ba0bc7fa56f8\") " pod="openstack/cinder-db-sync-n2br8"
Dec 16 07:18:10 crc kubenswrapper[4823]: I1216 07:18:10.975277 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qr9jj\" (UniqueName: \"kubernetes.io/projected/22c10a9c-6dba-4d35-a0d8-2ef0b82352cb-kube-api-access-qr9jj\") pod \"ceilometer-0\" (UID: \"22c10a9c-6dba-4d35-a0d8-2ef0b82352cb\") " pod="openstack/ceilometer-0"
Dec 16 07:18:10 crc kubenswrapper[4823]: I1216 07:18:10.987057 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fcd5e697-1360-4376-8160-ba0bc7fa56f8-scripts\") pod \"cinder-db-sync-n2br8\" (UID: \"fcd5e697-1360-4376-8160-ba0bc7fa56f8\") " pod="openstack/cinder-db-sync-n2br8"
Dec 16 07:18:10 crc kubenswrapper[4823]: I1216 07:18:10.987263 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcd5e697-1360-4376-8160-ba0bc7fa56f8-combined-ca-bundle\") pod \"cinder-db-sync-n2br8\" (UID: \"fcd5e697-1360-4376-8160-ba0bc7fa56f8\") " pod="openstack/cinder-db-sync-n2br8"
Dec 16 07:18:10 crc kubenswrapper[4823]: I1216 07:18:10.987394 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/22c10a9c-6dba-4d35-a0d8-2ef0b82352cb-scripts\") pod \"ceilometer-0\" (UID: \"22c10a9c-6dba-4d35-a0d8-2ef0b82352cb\") " pod="openstack/ceilometer-0"
Dec 16 07:18:10 crc kubenswrapper[4823]: I1216 07:18:10.987547 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22c10a9c-6dba-4d35-a0d8-2ef0b82352cb-config-data\") pod \"ceilometer-0\" (UID: \"22c10a9c-6dba-4d35-a0d8-2ef0b82352cb\") " pod="openstack/ceilometer-0"
Dec 16 07:18:10 crc kubenswrapper[4823]: I1216 07:18:10.987671 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5fdbfbc95f-rq8ng"
Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.001709 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/fcd5e697-1360-4376-8160-ba0bc7fa56f8-db-sync-config-data\") pod \"cinder-db-sync-n2br8\" (UID: \"fcd5e697-1360-4376-8160-ba0bc7fa56f8\") " pod="openstack/cinder-db-sync-n2br8"
Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.002082 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/22c10a9c-6dba-4d35-a0d8-2ef0b82352cb-log-httpd\") pod \"ceilometer-0\" (UID: \"22c10a9c-6dba-4d35-a0d8-2ef0b82352cb\") " pod="openstack/ceilometer-0"
Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.002301 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-hzvj8"]
Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.004764 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-hzvj8"
Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.040401 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config"
Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.040704 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-kcfc5"
Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.040867 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config"
Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.072664 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-n2br8"]
Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.097135 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.106040 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22c10a9c-6dba-4d35-a0d8-2ef0b82352cb-config-data\") pod \"ceilometer-0\" (UID: \"22c10a9c-6dba-4d35-a0d8-2ef0b82352cb\") " pod="openstack/ceilometer-0"
Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.106103 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/fcd5e697-1360-4376-8160-ba0bc7fa56f8-db-sync-config-data\") pod \"cinder-db-sync-n2br8\" (UID: \"fcd5e697-1360-4376-8160-ba0bc7fa56f8\") " pod="openstack/cinder-db-sync-n2br8"
Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.106132 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/22c10a9c-6dba-4d35-a0d8-2ef0b82352cb-log-httpd\") pod \"ceilometer-0\" (UID: \"22c10a9c-6dba-4d35-a0d8-2ef0b82352cb\") " pod="openstack/ceilometer-0"
Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.106168 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fcd5e697-1360-4376-8160-ba0bc7fa56f8-config-data\") pod \"cinder-db-sync-n2br8\" (UID: \"fcd5e697-1360-4376-8160-ba0bc7fa56f8\") " pod="openstack/cinder-db-sync-n2br8"
Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.106186 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/22c10a9c-6dba-4d35-a0d8-2ef0b82352cb-run-httpd\") pod \"ceilometer-0\" (UID: \"22c10a9c-6dba-4d35-a0d8-2ef0b82352cb\") " pod="openstack/ceilometer-0"
Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.106204 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4dkp\" (UniqueName: \"kubernetes.io/projected/fcd5e697-1360-4376-8160-ba0bc7fa56f8-kube-api-access-m4dkp\") pod \"cinder-db-sync-n2br8\" (UID: \"fcd5e697-1360-4376-8160-ba0bc7fa56f8\") " pod="openstack/cinder-db-sync-n2br8"
Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.106236 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22c10a9c-6dba-4d35-a0d8-2ef0b82352cb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"22c10a9c-6dba-4d35-a0d8-2ef0b82352cb\") " pod="openstack/ceilometer-0"
Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.106253 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/22c10a9c-6dba-4d35-a0d8-2ef0b82352cb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"22c10a9c-6dba-4d35-a0d8-2ef0b82352cb\") " pod="openstack/ceilometer-0"
Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.106274 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21cc81af-96c8-4f21-85c5-07c7b9ade605-combined-ca-bundle\") pod \"neutron-db-sync-hzvj8\" (UID: \"21cc81af-96c8-4f21-85c5-07c7b9ade605\") " pod="openstack/neutron-db-sync-hzvj8"
Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.106295 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fcd5e697-1360-4376-8160-ba0bc7fa56f8-etc-machine-id\") pod \"cinder-db-sync-n2br8\" (UID: \"fcd5e697-1360-4376-8160-ba0bc7fa56f8\") " pod="openstack/cinder-db-sync-n2br8"
Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.106320 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qr9jj\" (UniqueName: \"kubernetes.io/projected/22c10a9c-6dba-4d35-a0d8-2ef0b82352cb-kube-api-access-qr9jj\") pod \"ceilometer-0\" (UID: \"22c10a9c-6dba-4d35-a0d8-2ef0b82352cb\") " pod="openstack/ceilometer-0"
Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.106339 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fcd5e697-1360-4376-8160-ba0bc7fa56f8-scripts\") pod \"cinder-db-sync-n2br8\" (UID: \"fcd5e697-1360-4376-8160-ba0bc7fa56f8\") " pod="openstack/cinder-db-sync-n2br8"
Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.106370 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/21cc81af-96c8-4f21-85c5-07c7b9ade605-config\") pod \"neutron-db-sync-hzvj8\" (UID: \"21cc81af-96c8-4f21-85c5-07c7b9ade605\") " pod="openstack/neutron-db-sync-hzvj8"
Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.106387 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6s92\" (UniqueName: \"kubernetes.io/projected/21cc81af-96c8-4f21-85c5-07c7b9ade605-kube-api-access-f6s92\") pod \"neutron-db-sync-hzvj8\" (UID: \"21cc81af-96c8-4f21-85c5-07c7b9ade605\") " pod="openstack/neutron-db-sync-hzvj8"
Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.106406 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcd5e697-1360-4376-8160-ba0bc7fa56f8-combined-ca-bundle\") pod \"cinder-db-sync-n2br8\" (UID: \"fcd5e697-1360-4376-8160-ba0bc7fa56f8\") " pod="openstack/cinder-db-sync-n2br8"
Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.106429 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/22c10a9c-6dba-4d35-a0d8-2ef0b82352cb-scripts\") pod \"ceilometer-0\" (UID: \"22c10a9c-6dba-4d35-a0d8-2ef0b82352cb\") " pod="openstack/ceilometer-0"
Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.107346 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/22c10a9c-6dba-4d35-a0d8-2ef0b82352cb-log-httpd\") pod \"ceilometer-0\" (UID: \"22c10a9c-6dba-4d35-a0d8-2ef0b82352cb\") " pod="openstack/ceilometer-0"
Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.107684 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/22c10a9c-6dba-4d35-a0d8-2ef0b82352cb-run-httpd\") pod \"ceilometer-0\" (UID: \"22c10a9c-6dba-4d35-a0d8-2ef0b82352cb\") " pod="openstack/ceilometer-0"
Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.112698 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/fcd5e697-1360-4376-8160-ba0bc7fa56f8-db-sync-config-data\") pod \"cinder-db-sync-n2br8\" (UID: \"fcd5e697-1360-4376-8160-ba0bc7fa56f8\") " pod="openstack/cinder-db-sync-n2br8"
Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.112998 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/22c10a9c-6dba-4d35-a0d8-2ef0b82352cb-scripts\") pod \"ceilometer-0\" (UID: \"22c10a9c-6dba-4d35-a0d8-2ef0b82352cb\") " pod="openstack/ceilometer-0"
Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.113223 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fcd5e697-1360-4376-8160-ba0bc7fa56f8-etc-machine-id\") pod \"cinder-db-sync-n2br8\" (UID: \"fcd5e697-1360-4376-8160-ba0bc7fa56f8\") " pod="openstack/cinder-db-sync-n2br8"
Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.113574 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-hzvj8"]
Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.115689 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/22c10a9c-6dba-4d35-a0d8-2ef0b82352cb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"22c10a9c-6dba-4d35-a0d8-2ef0b82352cb\") " pod="openstack/ceilometer-0"
Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.116514 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22c10a9c-6dba-4d35-a0d8-2ef0b82352cb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"22c10a9c-6dba-4d35-a0d8-2ef0b82352cb\") " pod="openstack/ceilometer-0"
Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.117634 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcd5e697-1360-4376-8160-ba0bc7fa56f8-combined-ca-bundle\") pod \"cinder-db-sync-n2br8\" (UID: \"fcd5e697-1360-4376-8160-ba0bc7fa56f8\") " pod="openstack/cinder-db-sync-n2br8"
Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.120126 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22c10a9c-6dba-4d35-a0d8-2ef0b82352cb-config-data\") pod \"ceilometer-0\" (UID: \"22c10a9c-6dba-4d35-a0d8-2ef0b82352cb\") " pod="openstack/ceilometer-0"
Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.124271 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fcd5e697-1360-4376-8160-ba0bc7fa56f8-scripts\") pod \"cinder-db-sync-n2br8\" (UID: \"fcd5e697-1360-4376-8160-ba0bc7fa56f8\") " pod="openstack/cinder-db-sync-n2br8"
Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.134609 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fcd5e697-1360-4376-8160-ba0bc7fa56f8-config-data\") pod \"cinder-db-sync-n2br8\" (UID: \"fcd5e697-1360-4376-8160-ba0bc7fa56f8\") " pod="openstack/cinder-db-sync-n2br8"
Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.137682 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4dkp\" (UniqueName: \"kubernetes.io/projected/fcd5e697-1360-4376-8160-ba0bc7fa56f8-kube-api-access-m4dkp\") pod \"cinder-db-sync-n2br8\" (UID: \"fcd5e697-1360-4376-8160-ba0bc7fa56f8\") " pod="openstack/cinder-db-sync-n2br8"
Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.149347 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5fdbfbc95f-rq8ng"]
Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.156809 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qr9jj\" (UniqueName: \"kubernetes.io/projected/22c10a9c-6dba-4d35-a0d8-2ef0b82352cb-kube-api-access-qr9jj\") pod \"ceilometer-0\" (UID: \"22c10a9c-6dba-4d35-a0d8-2ef0b82352cb\") " pod="openstack/ceilometer-0"
Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.158559 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-7mm88"]
Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.160829 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-7mm88"
Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.161552 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-n2br8"
Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.169903 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data"
Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.170193 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts"
Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.170428 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-57pgn"
Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.171061 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-q69qd"]
Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.190672 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6f6f8cb849-wxg9m"]
Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.190920 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-q69qd"
Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.193506 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-7mm88"]
Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.193601 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f6f8cb849-wxg9m"
Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.193764 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data"
Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.193991 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-6gt6g"
Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.212338 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6f6f8cb849-wxg9m"]
Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.215102 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21cc81af-96c8-4f21-85c5-07c7b9ade605-combined-ca-bundle\") pod \"neutron-db-sync-hzvj8\" (UID: \"21cc81af-96c8-4f21-85c5-07c7b9ade605\") " pod="openstack/neutron-db-sync-hzvj8"
Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.215284 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/21cc81af-96c8-4f21-85c5-07c7b9ade605-config\") pod \"neutron-db-sync-hzvj8\" (UID: \"21cc81af-96c8-4f21-85c5-07c7b9ade605\") " pod="openstack/neutron-db-sync-hzvj8"
Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.215311 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6s92\" (UniqueName: \"kubernetes.io/projected/21cc81af-96c8-4f21-85c5-07c7b9ade605-kube-api-access-f6s92\") pod \"neutron-db-sync-hzvj8\" (UID: \"21cc81af-96c8-4f21-85c5-07c7b9ade605\") " pod="openstack/neutron-db-sync-hzvj8"
Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.220172 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/21cc81af-96c8-4f21-85c5-07c7b9ade605-config\") pod \"neutron-db-sync-hzvj8\" (UID:
\"21cc81af-96c8-4f21-85c5-07c7b9ade605\") " pod="openstack/neutron-db-sync-hzvj8" Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.225324 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21cc81af-96c8-4f21-85c5-07c7b9ade605-combined-ca-bundle\") pod \"neutron-db-sync-hzvj8\" (UID: \"21cc81af-96c8-4f21-85c5-07c7b9ade605\") " pod="openstack/neutron-db-sync-hzvj8" Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.228702 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-q69qd"] Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.245342 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6s92\" (UniqueName: \"kubernetes.io/projected/21cc81af-96c8-4f21-85c5-07c7b9ade605-kube-api-access-f6s92\") pod \"neutron-db-sync-hzvj8\" (UID: \"21cc81af-96c8-4f21-85c5-07c7b9ade605\") " pod="openstack/neutron-db-sync-hzvj8" Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.316908 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34693374-b301-47b2-b909-b5b93fd96fd0-scripts\") pod \"placement-db-sync-7mm88\" (UID: \"34693374-b301-47b2-b909-b5b93fd96fd0\") " pod="openstack/placement-db-sync-7mm88" Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.317013 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/166784d2-df96-4ee8-a1a3-22a967bff610-ovsdbserver-nb\") pod \"dnsmasq-dns-6f6f8cb849-wxg9m\" (UID: \"166784d2-df96-4ee8-a1a3-22a967bff610\") " pod="openstack/dnsmasq-dns-6f6f8cb849-wxg9m" Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.317093 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/59c74f3a-8b4c-47eb-8d8d-af32e667d121-combined-ca-bundle\") pod \"barbican-db-sync-q69qd\" (UID: \"59c74f3a-8b4c-47eb-8d8d-af32e667d121\") " pod="openstack/barbican-db-sync-q69qd" Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.317149 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/59c74f3a-8b4c-47eb-8d8d-af32e667d121-db-sync-config-data\") pod \"barbican-db-sync-q69qd\" (UID: \"59c74f3a-8b4c-47eb-8d8d-af32e667d121\") " pod="openstack/barbican-db-sync-q69qd" Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.317298 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/166784d2-df96-4ee8-a1a3-22a967bff610-dns-swift-storage-0\") pod \"dnsmasq-dns-6f6f8cb849-wxg9m\" (UID: \"166784d2-df96-4ee8-a1a3-22a967bff610\") " pod="openstack/dnsmasq-dns-6f6f8cb849-wxg9m" Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.317806 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34693374-b301-47b2-b909-b5b93fd96fd0-combined-ca-bundle\") pod \"placement-db-sync-7mm88\" (UID: \"34693374-b301-47b2-b909-b5b93fd96fd0\") " pod="openstack/placement-db-sync-7mm88" Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.317868 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lldfz\" (UniqueName: \"kubernetes.io/projected/59c74f3a-8b4c-47eb-8d8d-af32e667d121-kube-api-access-lldfz\") pod \"barbican-db-sync-q69qd\" (UID: \"59c74f3a-8b4c-47eb-8d8d-af32e667d121\") " pod="openstack/barbican-db-sync-q69qd" Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.318266 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-rzvxj\" (UniqueName: \"kubernetes.io/projected/166784d2-df96-4ee8-a1a3-22a967bff610-kube-api-access-rzvxj\") pod \"dnsmasq-dns-6f6f8cb849-wxg9m\" (UID: \"166784d2-df96-4ee8-a1a3-22a967bff610\") " pod="openstack/dnsmasq-dns-6f6f8cb849-wxg9m" Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.318337 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5d4gn\" (UniqueName: \"kubernetes.io/projected/34693374-b301-47b2-b909-b5b93fd96fd0-kube-api-access-5d4gn\") pod \"placement-db-sync-7mm88\" (UID: \"34693374-b301-47b2-b909-b5b93fd96fd0\") " pod="openstack/placement-db-sync-7mm88" Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.318358 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/166784d2-df96-4ee8-a1a3-22a967bff610-dns-svc\") pod \"dnsmasq-dns-6f6f8cb849-wxg9m\" (UID: \"166784d2-df96-4ee8-a1a3-22a967bff610\") " pod="openstack/dnsmasq-dns-6f6f8cb849-wxg9m" Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.318580 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34693374-b301-47b2-b909-b5b93fd96fd0-config-data\") pod \"placement-db-sync-7mm88\" (UID: \"34693374-b301-47b2-b909-b5b93fd96fd0\") " pod="openstack/placement-db-sync-7mm88" Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.318614 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/166784d2-df96-4ee8-a1a3-22a967bff610-ovsdbserver-sb\") pod \"dnsmasq-dns-6f6f8cb849-wxg9m\" (UID: \"166784d2-df96-4ee8-a1a3-22a967bff610\") " pod="openstack/dnsmasq-dns-6f6f8cb849-wxg9m" Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.318671 4823 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/166784d2-df96-4ee8-a1a3-22a967bff610-config\") pod \"dnsmasq-dns-6f6f8cb849-wxg9m\" (UID: \"166784d2-df96-4ee8-a1a3-22a967bff610\") " pod="openstack/dnsmasq-dns-6f6f8cb849-wxg9m" Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.318716 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/34693374-b301-47b2-b909-b5b93fd96fd0-logs\") pod \"placement-db-sync-7mm88\" (UID: \"34693374-b301-47b2-b909-b5b93fd96fd0\") " pod="openstack/placement-db-sync-7mm88" Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.415208 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.421498 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/166784d2-df96-4ee8-a1a3-22a967bff610-ovsdbserver-nb\") pod \"dnsmasq-dns-6f6f8cb849-wxg9m\" (UID: \"166784d2-df96-4ee8-a1a3-22a967bff610\") " pod="openstack/dnsmasq-dns-6f6f8cb849-wxg9m" Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.421572 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59c74f3a-8b4c-47eb-8d8d-af32e667d121-combined-ca-bundle\") pod \"barbican-db-sync-q69qd\" (UID: \"59c74f3a-8b4c-47eb-8d8d-af32e667d121\") " pod="openstack/barbican-db-sync-q69qd" Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.421631 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/59c74f3a-8b4c-47eb-8d8d-af32e667d121-db-sync-config-data\") pod \"barbican-db-sync-q69qd\" (UID: \"59c74f3a-8b4c-47eb-8d8d-af32e667d121\") " pod="openstack/barbican-db-sync-q69qd" 
Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.421672 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/166784d2-df96-4ee8-a1a3-22a967bff610-dns-swift-storage-0\") pod \"dnsmasq-dns-6f6f8cb849-wxg9m\" (UID: \"166784d2-df96-4ee8-a1a3-22a967bff610\") " pod="openstack/dnsmasq-dns-6f6f8cb849-wxg9m" Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.421710 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34693374-b301-47b2-b909-b5b93fd96fd0-combined-ca-bundle\") pod \"placement-db-sync-7mm88\" (UID: \"34693374-b301-47b2-b909-b5b93fd96fd0\") " pod="openstack/placement-db-sync-7mm88" Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.421762 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lldfz\" (UniqueName: \"kubernetes.io/projected/59c74f3a-8b4c-47eb-8d8d-af32e667d121-kube-api-access-lldfz\") pod \"barbican-db-sync-q69qd\" (UID: \"59c74f3a-8b4c-47eb-8d8d-af32e667d121\") " pod="openstack/barbican-db-sync-q69qd" Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.421815 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzvxj\" (UniqueName: \"kubernetes.io/projected/166784d2-df96-4ee8-a1a3-22a967bff610-kube-api-access-rzvxj\") pod \"dnsmasq-dns-6f6f8cb849-wxg9m\" (UID: \"166784d2-df96-4ee8-a1a3-22a967bff610\") " pod="openstack/dnsmasq-dns-6f6f8cb849-wxg9m" Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.421837 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5d4gn\" (UniqueName: \"kubernetes.io/projected/34693374-b301-47b2-b909-b5b93fd96fd0-kube-api-access-5d4gn\") pod \"placement-db-sync-7mm88\" (UID: \"34693374-b301-47b2-b909-b5b93fd96fd0\") " pod="openstack/placement-db-sync-7mm88" Dec 16 07:18:11 crc 
kubenswrapper[4823]: I1216 07:18:11.421864 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/166784d2-df96-4ee8-a1a3-22a967bff610-dns-svc\") pod \"dnsmasq-dns-6f6f8cb849-wxg9m\" (UID: \"166784d2-df96-4ee8-a1a3-22a967bff610\") " pod="openstack/dnsmasq-dns-6f6f8cb849-wxg9m" Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.421951 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34693374-b301-47b2-b909-b5b93fd96fd0-config-data\") pod \"placement-db-sync-7mm88\" (UID: \"34693374-b301-47b2-b909-b5b93fd96fd0\") " pod="openstack/placement-db-sync-7mm88" Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.421972 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/166784d2-df96-4ee8-a1a3-22a967bff610-ovsdbserver-sb\") pod \"dnsmasq-dns-6f6f8cb849-wxg9m\" (UID: \"166784d2-df96-4ee8-a1a3-22a967bff610\") " pod="openstack/dnsmasq-dns-6f6f8cb849-wxg9m" Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.421999 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/166784d2-df96-4ee8-a1a3-22a967bff610-config\") pod \"dnsmasq-dns-6f6f8cb849-wxg9m\" (UID: \"166784d2-df96-4ee8-a1a3-22a967bff610\") " pod="openstack/dnsmasq-dns-6f6f8cb849-wxg9m" Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.422068 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/34693374-b301-47b2-b909-b5b93fd96fd0-logs\") pod \"placement-db-sync-7mm88\" (UID: \"34693374-b301-47b2-b909-b5b93fd96fd0\") " pod="openstack/placement-db-sync-7mm88" Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.422104 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/34693374-b301-47b2-b909-b5b93fd96fd0-scripts\") pod \"placement-db-sync-7mm88\" (UID: \"34693374-b301-47b2-b909-b5b93fd96fd0\") " pod="openstack/placement-db-sync-7mm88" Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.423965 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/166784d2-df96-4ee8-a1a3-22a967bff610-ovsdbserver-nb\") pod \"dnsmasq-dns-6f6f8cb849-wxg9m\" (UID: \"166784d2-df96-4ee8-a1a3-22a967bff610\") " pod="openstack/dnsmasq-dns-6f6f8cb849-wxg9m" Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.425688 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/166784d2-df96-4ee8-a1a3-22a967bff610-dns-swift-storage-0\") pod \"dnsmasq-dns-6f6f8cb849-wxg9m\" (UID: \"166784d2-df96-4ee8-a1a3-22a967bff610\") " pod="openstack/dnsmasq-dns-6f6f8cb849-wxg9m" Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.426269 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/166784d2-df96-4ee8-a1a3-22a967bff610-config\") pod \"dnsmasq-dns-6f6f8cb849-wxg9m\" (UID: \"166784d2-df96-4ee8-a1a3-22a967bff610\") " pod="openstack/dnsmasq-dns-6f6f8cb849-wxg9m" Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.426991 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/166784d2-df96-4ee8-a1a3-22a967bff610-dns-svc\") pod \"dnsmasq-dns-6f6f8cb849-wxg9m\" (UID: \"166784d2-df96-4ee8-a1a3-22a967bff610\") " pod="openstack/dnsmasq-dns-6f6f8cb849-wxg9m" Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.427278 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/34693374-b301-47b2-b909-b5b93fd96fd0-logs\") pod \"placement-db-sync-7mm88\" (UID: 
\"34693374-b301-47b2-b909-b5b93fd96fd0\") " pod="openstack/placement-db-sync-7mm88" Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.427443 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/166784d2-df96-4ee8-a1a3-22a967bff610-ovsdbserver-sb\") pod \"dnsmasq-dns-6f6f8cb849-wxg9m\" (UID: \"166784d2-df96-4ee8-a1a3-22a967bff610\") " pod="openstack/dnsmasq-dns-6f6f8cb849-wxg9m" Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.430652 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34693374-b301-47b2-b909-b5b93fd96fd0-scripts\") pod \"placement-db-sync-7mm88\" (UID: \"34693374-b301-47b2-b909-b5b93fd96fd0\") " pod="openstack/placement-db-sync-7mm88" Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.430896 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34693374-b301-47b2-b909-b5b93fd96fd0-config-data\") pod \"placement-db-sync-7mm88\" (UID: \"34693374-b301-47b2-b909-b5b93fd96fd0\") " pod="openstack/placement-db-sync-7mm88" Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.433351 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/59c74f3a-8b4c-47eb-8d8d-af32e667d121-db-sync-config-data\") pod \"barbican-db-sync-q69qd\" (UID: \"59c74f3a-8b4c-47eb-8d8d-af32e667d121\") " pod="openstack/barbican-db-sync-q69qd" Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.448007 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59c74f3a-8b4c-47eb-8d8d-af32e667d121-combined-ca-bundle\") pod \"barbican-db-sync-q69qd\" (UID: \"59c74f3a-8b4c-47eb-8d8d-af32e667d121\") " pod="openstack/barbican-db-sync-q69qd" Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.449640 4823 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34693374-b301-47b2-b909-b5b93fd96fd0-combined-ca-bundle\") pod \"placement-db-sync-7mm88\" (UID: \"34693374-b301-47b2-b909-b5b93fd96fd0\") " pod="openstack/placement-db-sync-7mm88" Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.454628 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzvxj\" (UniqueName: \"kubernetes.io/projected/166784d2-df96-4ee8-a1a3-22a967bff610-kube-api-access-rzvxj\") pod \"dnsmasq-dns-6f6f8cb849-wxg9m\" (UID: \"166784d2-df96-4ee8-a1a3-22a967bff610\") " pod="openstack/dnsmasq-dns-6f6f8cb849-wxg9m" Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.455497 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lldfz\" (UniqueName: \"kubernetes.io/projected/59c74f3a-8b4c-47eb-8d8d-af32e667d121-kube-api-access-lldfz\") pod \"barbican-db-sync-q69qd\" (UID: \"59c74f3a-8b4c-47eb-8d8d-af32e667d121\") " pod="openstack/barbican-db-sync-q69qd" Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.457176 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5d4gn\" (UniqueName: \"kubernetes.io/projected/34693374-b301-47b2-b909-b5b93fd96fd0-kube-api-access-5d4gn\") pod \"placement-db-sync-7mm88\" (UID: \"34693374-b301-47b2-b909-b5b93fd96fd0\") " pod="openstack/placement-db-sync-7mm88" Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.476079 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-hzvj8" Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.501510 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-7mm88" Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.519971 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-q69qd" Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.528375 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f6f8cb849-wxg9m" Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.569392 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-msjl2"] Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.677578 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.679827 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.686851 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.687076 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.687128 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-8pjhw" Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.687174 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.688833 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.727831 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9de2069a-57e3-4ef0-8206-35a2cad119c7-config-data\") pod \"glance-default-external-api-0\" (UID: \"9de2069a-57e3-4ef0-8206-35a2cad119c7\") " 
pod="openstack/glance-default-external-api-0" Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.728071 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9de2069a-57e3-4ef0-8206-35a2cad119c7-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"9de2069a-57e3-4ef0-8206-35a2cad119c7\") " pod="openstack/glance-default-external-api-0" Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.728164 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9de2069a-57e3-4ef0-8206-35a2cad119c7-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"9de2069a-57e3-4ef0-8206-35a2cad119c7\") " pod="openstack/glance-default-external-api-0" Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.728239 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"9de2069a-57e3-4ef0-8206-35a2cad119c7\") " pod="openstack/glance-default-external-api-0" Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.728326 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9de2069a-57e3-4ef0-8206-35a2cad119c7-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"9de2069a-57e3-4ef0-8206-35a2cad119c7\") " pod="openstack/glance-default-external-api-0" Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.728397 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9de2069a-57e3-4ef0-8206-35a2cad119c7-logs\") pod \"glance-default-external-api-0\" (UID: \"9de2069a-57e3-4ef0-8206-35a2cad119c7\") 
" pod="openstack/glance-default-external-api-0" Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.728493 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9de2069a-57e3-4ef0-8206-35a2cad119c7-scripts\") pod \"glance-default-external-api-0\" (UID: \"9de2069a-57e3-4ef0-8206-35a2cad119c7\") " pod="openstack/glance-default-external-api-0" Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.728560 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dg9nn\" (UniqueName: \"kubernetes.io/projected/9de2069a-57e3-4ef0-8206-35a2cad119c7-kube-api-access-dg9nn\") pod \"glance-default-external-api-0\" (UID: \"9de2069a-57e3-4ef0-8206-35a2cad119c7\") " pod="openstack/glance-default-external-api-0" Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.759906 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.761684 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.764557 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.764739 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.770354 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.833387 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5e4bc29-92ee-49cb-b3c7-792d403f1afa-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c5e4bc29-92ee-49cb-b3c7-792d403f1afa\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.833457 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c5e4bc29-92ee-49cb-b3c7-792d403f1afa-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c5e4bc29-92ee-49cb-b3c7-792d403f1afa\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.833493 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9de2069a-57e3-4ef0-8206-35a2cad119c7-config-data\") pod \"glance-default-external-api-0\" (UID: \"9de2069a-57e3-4ef0-8206-35a2cad119c7\") " pod="openstack/glance-default-external-api-0" Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.833523 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/9de2069a-57e3-4ef0-8206-35a2cad119c7-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"9de2069a-57e3-4ef0-8206-35a2cad119c7\") " pod="openstack/glance-default-external-api-0" Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.838343 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9de2069a-57e3-4ef0-8206-35a2cad119c7-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"9de2069a-57e3-4ef0-8206-35a2cad119c7\") " pod="openstack/glance-default-external-api-0" Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.838615 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"9de2069a-57e3-4ef0-8206-35a2cad119c7\") " pod="openstack/glance-default-external-api-0" Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.838816 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9de2069a-57e3-4ef0-8206-35a2cad119c7-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"9de2069a-57e3-4ef0-8206-35a2cad119c7\") " pod="openstack/glance-default-external-api-0" Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.838848 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5e4bc29-92ee-49cb-b3c7-792d403f1afa-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c5e4bc29-92ee-49cb-b3c7-792d403f1afa\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.838886 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c5e4bc29-92ee-49cb-b3c7-792d403f1afa-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c5e4bc29-92ee-49cb-b3c7-792d403f1afa\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.839459 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9de2069a-57e3-4ef0-8206-35a2cad119c7-logs\") pod \"glance-default-external-api-0\" (UID: \"9de2069a-57e3-4ef0-8206-35a2cad119c7\") " pod="openstack/glance-default-external-api-0" Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.839573 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5e4bc29-92ee-49cb-b3c7-792d403f1afa-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c5e4bc29-92ee-49cb-b3c7-792d403f1afa\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.839646 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4l52\" (UniqueName: \"kubernetes.io/projected/c5e4bc29-92ee-49cb-b3c7-792d403f1afa-kube-api-access-h4l52\") pod \"glance-default-internal-api-0\" (UID: \"c5e4bc29-92ee-49cb-b3c7-792d403f1afa\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.839693 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"c5e4bc29-92ee-49cb-b3c7-792d403f1afa\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.839767 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/c5e4bc29-92ee-49cb-b3c7-792d403f1afa-logs\") pod \"glance-default-internal-api-0\" (UID: \"c5e4bc29-92ee-49cb-b3c7-792d403f1afa\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.839837 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9de2069a-57e3-4ef0-8206-35a2cad119c7-scripts\") pod \"glance-default-external-api-0\" (UID: \"9de2069a-57e3-4ef0-8206-35a2cad119c7\") " pod="openstack/glance-default-external-api-0" Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.839871 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dg9nn\" (UniqueName: \"kubernetes.io/projected/9de2069a-57e3-4ef0-8206-35a2cad119c7-kube-api-access-dg9nn\") pod \"glance-default-external-api-0\" (UID: \"9de2069a-57e3-4ef0-8206-35a2cad119c7\") " pod="openstack/glance-default-external-api-0" Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.840914 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9de2069a-57e3-4ef0-8206-35a2cad119c7-logs\") pod \"glance-default-external-api-0\" (UID: \"9de2069a-57e3-4ef0-8206-35a2cad119c7\") " pod="openstack/glance-default-external-api-0" Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.841595 4823 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"9de2069a-57e3-4ef0-8206-35a2cad119c7\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/glance-default-external-api-0" Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.847939 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9de2069a-57e3-4ef0-8206-35a2cad119c7-httpd-run\") pod 
\"glance-default-external-api-0\" (UID: \"9de2069a-57e3-4ef0-8206-35a2cad119c7\") " pod="openstack/glance-default-external-api-0" Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.864752 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9de2069a-57e3-4ef0-8206-35a2cad119c7-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"9de2069a-57e3-4ef0-8206-35a2cad119c7\") " pod="openstack/glance-default-external-api-0" Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.865243 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9de2069a-57e3-4ef0-8206-35a2cad119c7-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"9de2069a-57e3-4ef0-8206-35a2cad119c7\") " pod="openstack/glance-default-external-api-0" Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.865918 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9de2069a-57e3-4ef0-8206-35a2cad119c7-scripts\") pod \"glance-default-external-api-0\" (UID: \"9de2069a-57e3-4ef0-8206-35a2cad119c7\") " pod="openstack/glance-default-external-api-0" Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.893928 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dg9nn\" (UniqueName: \"kubernetes.io/projected/9de2069a-57e3-4ef0-8206-35a2cad119c7-kube-api-access-dg9nn\") pod \"glance-default-external-api-0\" (UID: \"9de2069a-57e3-4ef0-8206-35a2cad119c7\") " pod="openstack/glance-default-external-api-0" Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.894744 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-n2br8"] Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.896263 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"9de2069a-57e3-4ef0-8206-35a2cad119c7\") " pod="openstack/glance-default-external-api-0" Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.897998 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9de2069a-57e3-4ef0-8206-35a2cad119c7-config-data\") pod \"glance-default-external-api-0\" (UID: \"9de2069a-57e3-4ef0-8206-35a2cad119c7\") " pod="openstack/glance-default-external-api-0" Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.914275 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5fdbfbc95f-rq8ng"] Dec 16 07:18:11 crc kubenswrapper[4823]: W1216 07:18:11.917571 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod46e59506_a8dc_49b4_b1b9_d505eeeee126.slice/crio-37a74a9eebba3b3d42c224009f1e9770203526e94e40efcb76661fef5bb3bff2 WatchSource:0}: Error finding container 37a74a9eebba3b3d42c224009f1e9770203526e94e40efcb76661fef5bb3bff2: Status 404 returned error can't find the container with id 37a74a9eebba3b3d42c224009f1e9770203526e94e40efcb76661fef5bb3bff2 Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.952071 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5e4bc29-92ee-49cb-b3c7-792d403f1afa-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c5e4bc29-92ee-49cb-b3c7-792d403f1afa\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.952135 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5e4bc29-92ee-49cb-b3c7-792d403f1afa-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c5e4bc29-92ee-49cb-b3c7-792d403f1afa\") " 
pod="openstack/glance-default-internal-api-0" Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.956219 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5e4bc29-92ee-49cb-b3c7-792d403f1afa-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c5e4bc29-92ee-49cb-b3c7-792d403f1afa\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.956279 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4l52\" (UniqueName: \"kubernetes.io/projected/c5e4bc29-92ee-49cb-b3c7-792d403f1afa-kube-api-access-h4l52\") pod \"glance-default-internal-api-0\" (UID: \"c5e4bc29-92ee-49cb-b3c7-792d403f1afa\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.956320 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"c5e4bc29-92ee-49cb-b3c7-792d403f1afa\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.956388 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c5e4bc29-92ee-49cb-b3c7-792d403f1afa-logs\") pod \"glance-default-internal-api-0\" (UID: \"c5e4bc29-92ee-49cb-b3c7-792d403f1afa\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.956509 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5e4bc29-92ee-49cb-b3c7-792d403f1afa-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c5e4bc29-92ee-49cb-b3c7-792d403f1afa\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:18:11 crc kubenswrapper[4823]: 
I1216 07:18:11.956619 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c5e4bc29-92ee-49cb-b3c7-792d403f1afa-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c5e4bc29-92ee-49cb-b3c7-792d403f1afa\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.957366 4823 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"c5e4bc29-92ee-49cb-b3c7-792d403f1afa\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-internal-api-0" Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.957687 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c5e4bc29-92ee-49cb-b3c7-792d403f1afa-logs\") pod \"glance-default-internal-api-0\" (UID: \"c5e4bc29-92ee-49cb-b3c7-792d403f1afa\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.963424 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c5e4bc29-92ee-49cb-b3c7-792d403f1afa-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c5e4bc29-92ee-49cb-b3c7-792d403f1afa\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.969082 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5e4bc29-92ee-49cb-b3c7-792d403f1afa-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c5e4bc29-92ee-49cb-b3c7-792d403f1afa\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.969318 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5e4bc29-92ee-49cb-b3c7-792d403f1afa-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c5e4bc29-92ee-49cb-b3c7-792d403f1afa\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.973334 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5e4bc29-92ee-49cb-b3c7-792d403f1afa-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c5e4bc29-92ee-49cb-b3c7-792d403f1afa\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.991378 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5e4bc29-92ee-49cb-b3c7-792d403f1afa-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c5e4bc29-92ee-49cb-b3c7-792d403f1afa\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:18:11 crc kubenswrapper[4823]: I1216 07:18:11.998086 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4l52\" (UniqueName: \"kubernetes.io/projected/c5e4bc29-92ee-49cb-b3c7-792d403f1afa-kube-api-access-h4l52\") pod \"glance-default-internal-api-0\" (UID: \"c5e4bc29-92ee-49cb-b3c7-792d403f1afa\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:18:12 crc kubenswrapper[4823]: I1216 07:18:12.006945 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"c5e4bc29-92ee-49cb-b3c7-792d403f1afa\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:18:12 crc kubenswrapper[4823]: I1216 07:18:12.066482 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 16 07:18:12 crc kubenswrapper[4823]: I1216 07:18:12.205787 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 16 07:18:12 crc kubenswrapper[4823]: I1216 07:18:12.267924 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-msjl2" event={"ID":"46e59506-a8dc-49b4-b1b9-d505eeeee126","Type":"ContainerStarted","Data":"37a74a9eebba3b3d42c224009f1e9770203526e94e40efcb76661fef5bb3bff2"} Dec 16 07:18:12 crc kubenswrapper[4823]: I1216 07:18:12.271980 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fdbfbc95f-rq8ng" event={"ID":"cb9d044a-5b65-468c-88a7-6338c76e3021","Type":"ContainerStarted","Data":"0e208820aa4b0e01f84b1ab59bcaaa98f7894a0061d465acbb1fd493c3009574"} Dec 16 07:18:12 crc kubenswrapper[4823]: I1216 07:18:12.275410 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-74dfc89d77-rtrnt" podUID="c18e8ceb-4282-417a-a207-19e78c89c1af" containerName="dnsmasq-dns" containerID="cri-o://9ae8973126f3c15b2cb0e0cebea03e45af2715b2924e20ccc8e2805f501fe397" gracePeriod=10 Dec 16 07:18:12 crc kubenswrapper[4823]: I1216 07:18:12.275544 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-n2br8" event={"ID":"fcd5e697-1360-4376-8160-ba0bc7fa56f8","Type":"ContainerStarted","Data":"fbcfb626861a05972d3142b45ff6fa52e798cda73a2a2f309993c022e980d418"} Dec 16 07:18:12 crc kubenswrapper[4823]: I1216 07:18:12.605126 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-7mm88"] Dec 16 07:18:12 crc kubenswrapper[4823]: I1216 07:18:12.618988 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 16 07:18:12 crc kubenswrapper[4823]: I1216 07:18:12.640607 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/neutron-db-sync-hzvj8"] Dec 16 07:18:12 crc kubenswrapper[4823]: W1216 07:18:12.654268 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod21cc81af_96c8_4f21_85c5_07c7b9ade605.slice/crio-253a13f7be04fe99c2eb8daed173bb3ca9385725843b1241ef7ec78c5d5ff278 WatchSource:0}: Error finding container 253a13f7be04fe99c2eb8daed173bb3ca9385725843b1241ef7ec78c5d5ff278: Status 404 returned error can't find the container with id 253a13f7be04fe99c2eb8daed173bb3ca9385725843b1241ef7ec78c5d5ff278 Dec 16 07:18:12 crc kubenswrapper[4823]: I1216 07:18:12.856677 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-q69qd"] Dec 16 07:18:12 crc kubenswrapper[4823]: I1216 07:18:12.866342 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6f6f8cb849-wxg9m"] Dec 16 07:18:13 crc kubenswrapper[4823]: I1216 07:18:13.088978 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74dfc89d77-rtrnt" Dec 16 07:18:13 crc kubenswrapper[4823]: I1216 07:18:13.134369 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 16 07:18:13 crc kubenswrapper[4823]: I1216 07:18:13.227122 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 16 07:18:13 crc kubenswrapper[4823]: W1216 07:18:13.261609 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc5e4bc29_92ee_49cb_b3c7_792d403f1afa.slice/crio-e5d6cae45512b499c2872ea7038831edd94f0211be753c26f8e98bbe694a9b10 WatchSource:0}: Error finding container e5d6cae45512b499c2872ea7038831edd94f0211be753c26f8e98bbe694a9b10: Status 404 returned error can't find the container with id e5d6cae45512b499c2872ea7038831edd94f0211be753c26f8e98bbe694a9b10 Dec 16 07:18:13 crc kubenswrapper[4823]: I1216 07:18:13.291789 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c18e8ceb-4282-417a-a207-19e78c89c1af-config\") pod \"c18e8ceb-4282-417a-a207-19e78c89c1af\" (UID: \"c18e8ceb-4282-417a-a207-19e78c89c1af\") " Dec 16 07:18:13 crc kubenswrapper[4823]: I1216 07:18:13.292009 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c18e8ceb-4282-417a-a207-19e78c89c1af-dns-svc\") pod \"c18e8ceb-4282-417a-a207-19e78c89c1af\" (UID: \"c18e8ceb-4282-417a-a207-19e78c89c1af\") " Dec 16 07:18:13 crc kubenswrapper[4823]: I1216 07:18:13.292104 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c18e8ceb-4282-417a-a207-19e78c89c1af-ovsdbserver-nb\") pod \"c18e8ceb-4282-417a-a207-19e78c89c1af\" (UID: \"c18e8ceb-4282-417a-a207-19e78c89c1af\") " Dec 16 07:18:13 crc 
kubenswrapper[4823]: I1216 07:18:13.292133 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xjwlr\" (UniqueName: \"kubernetes.io/projected/c18e8ceb-4282-417a-a207-19e78c89c1af-kube-api-access-xjwlr\") pod \"c18e8ceb-4282-417a-a207-19e78c89c1af\" (UID: \"c18e8ceb-4282-417a-a207-19e78c89c1af\") " Dec 16 07:18:13 crc kubenswrapper[4823]: I1216 07:18:13.292160 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c18e8ceb-4282-417a-a207-19e78c89c1af-dns-swift-storage-0\") pod \"c18e8ceb-4282-417a-a207-19e78c89c1af\" (UID: \"c18e8ceb-4282-417a-a207-19e78c89c1af\") " Dec 16 07:18:13 crc kubenswrapper[4823]: I1216 07:18:13.292210 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c18e8ceb-4282-417a-a207-19e78c89c1af-ovsdbserver-sb\") pod \"c18e8ceb-4282-417a-a207-19e78c89c1af\" (UID: \"c18e8ceb-4282-417a-a207-19e78c89c1af\") " Dec 16 07:18:13 crc kubenswrapper[4823]: I1216 07:18:13.299820 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c18e8ceb-4282-417a-a207-19e78c89c1af-kube-api-access-xjwlr" (OuterVolumeSpecName: "kube-api-access-xjwlr") pod "c18e8ceb-4282-417a-a207-19e78c89c1af" (UID: "c18e8ceb-4282-417a-a207-19e78c89c1af"). InnerVolumeSpecName "kube-api-access-xjwlr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:18:13 crc kubenswrapper[4823]: I1216 07:18:13.300401 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c5e4bc29-92ee-49cb-b3c7-792d403f1afa","Type":"ContainerStarted","Data":"e5d6cae45512b499c2872ea7038831edd94f0211be753c26f8e98bbe694a9b10"} Dec 16 07:18:13 crc kubenswrapper[4823]: I1216 07:18:13.325788 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f6f8cb849-wxg9m" event={"ID":"166784d2-df96-4ee8-a1a3-22a967bff610","Type":"ContainerStarted","Data":"4c4254e8613eac97614956581232adf031224db77cbd5566f427b617354b173f"} Dec 16 07:18:13 crc kubenswrapper[4823]: I1216 07:18:13.325838 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f6f8cb849-wxg9m" event={"ID":"166784d2-df96-4ee8-a1a3-22a967bff610","Type":"ContainerStarted","Data":"b84d1788d23315388f454377246223a6daabce45f9f84cd5e6fb1c9f2bb2087a"} Dec 16 07:18:13 crc kubenswrapper[4823]: I1216 07:18:13.329328 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-msjl2" event={"ID":"46e59506-a8dc-49b4-b1b9-d505eeeee126","Type":"ContainerStarted","Data":"6c7499272eb96a0be4f830b28f2badc678a46e4891df1fbff674a8bf16f9dc6b"} Dec 16 07:18:13 crc kubenswrapper[4823]: I1216 07:18:13.331133 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9de2069a-57e3-4ef0-8206-35a2cad119c7","Type":"ContainerStarted","Data":"62cb116a8da9ece34dcf4e86f9d102a11e975e44698a73580ce0dbc4b47d827a"} Dec 16 07:18:13 crc kubenswrapper[4823]: I1216 07:18:13.334197 4823 generic.go:334] "Generic (PLEG): container finished" podID="c18e8ceb-4282-417a-a207-19e78c89c1af" containerID="9ae8973126f3c15b2cb0e0cebea03e45af2715b2924e20ccc8e2805f501fe397" exitCode=0 Dec 16 07:18:13 crc kubenswrapper[4823]: I1216 07:18:13.334272 4823 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/dnsmasq-dns-74dfc89d77-rtrnt" event={"ID":"c18e8ceb-4282-417a-a207-19e78c89c1af","Type":"ContainerDied","Data":"9ae8973126f3c15b2cb0e0cebea03e45af2715b2924e20ccc8e2805f501fe397"} Dec 16 07:18:13 crc kubenswrapper[4823]: I1216 07:18:13.334302 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74dfc89d77-rtrnt" event={"ID":"c18e8ceb-4282-417a-a207-19e78c89c1af","Type":"ContainerDied","Data":"5755e3946ca18072a39960729d93e35d9621a4e1316265f78fdf5d9a1e33b583"} Dec 16 07:18:13 crc kubenswrapper[4823]: I1216 07:18:13.334319 4823 scope.go:117] "RemoveContainer" containerID="9ae8973126f3c15b2cb0e0cebea03e45af2715b2924e20ccc8e2805f501fe397" Dec 16 07:18:13 crc kubenswrapper[4823]: I1216 07:18:13.334620 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74dfc89d77-rtrnt" Dec 16 07:18:13 crc kubenswrapper[4823]: I1216 07:18:13.340148 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-hzvj8" event={"ID":"21cc81af-96c8-4f21-85c5-07c7b9ade605","Type":"ContainerStarted","Data":"a45c473f291f2511a351060d4ccb2b122a8889fc17f7f1e03231443022b74af9"} Dec 16 07:18:13 crc kubenswrapper[4823]: I1216 07:18:13.340193 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-hzvj8" event={"ID":"21cc81af-96c8-4f21-85c5-07c7b9ade605","Type":"ContainerStarted","Data":"253a13f7be04fe99c2eb8daed173bb3ca9385725843b1241ef7ec78c5d5ff278"} Dec 16 07:18:13 crc kubenswrapper[4823]: I1216 07:18:13.349989 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c18e8ceb-4282-417a-a207-19e78c89c1af-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c18e8ceb-4282-417a-a207-19e78c89c1af" (UID: "c18e8ceb-4282-417a-a207-19e78c89c1af"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:18:13 crc kubenswrapper[4823]: I1216 07:18:13.354096 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-7mm88" event={"ID":"34693374-b301-47b2-b909-b5b93fd96fd0","Type":"ContainerStarted","Data":"88129a2199ae5e8da9ef90f6ca686ccfaf33226c9e85a68d24e7f99a929f5c40"} Dec 16 07:18:13 crc kubenswrapper[4823]: I1216 07:18:13.357820 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c18e8ceb-4282-417a-a207-19e78c89c1af-config" (OuterVolumeSpecName: "config") pod "c18e8ceb-4282-417a-a207-19e78c89c1af" (UID: "c18e8ceb-4282-417a-a207-19e78c89c1af"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:18:13 crc kubenswrapper[4823]: I1216 07:18:13.359726 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-q69qd" event={"ID":"59c74f3a-8b4c-47eb-8d8d-af32e667d121","Type":"ContainerStarted","Data":"ed78fb69cd2dd95f410ea4905a49bb5c04f60faefd3d37dca05b373efbcfb246"} Dec 16 07:18:13 crc kubenswrapper[4823]: I1216 07:18:13.360282 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c18e8ceb-4282-417a-a207-19e78c89c1af-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "c18e8ceb-4282-417a-a207-19e78c89c1af" (UID: "c18e8ceb-4282-417a-a207-19e78c89c1af"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:18:13 crc kubenswrapper[4823]: I1216 07:18:13.367634 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-msjl2" podStartSLOduration=3.367616488 podStartE2EDuration="3.367616488s" podCreationTimestamp="2025-12-16 07:18:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 07:18:13.362541848 +0000 UTC m=+1371.851107971" watchObservedRunningTime="2025-12-16 07:18:13.367616488 +0000 UTC m=+1371.856182611" Dec 16 07:18:13 crc kubenswrapper[4823]: I1216 07:18:13.367896 4823 generic.go:334] "Generic (PLEG): container finished" podID="cb9d044a-5b65-468c-88a7-6338c76e3021" containerID="f602521792920a2169930078f727631021f362b6dd3714a5de572238db821ce5" exitCode=0 Dec 16 07:18:13 crc kubenswrapper[4823]: I1216 07:18:13.367954 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fdbfbc95f-rq8ng" event={"ID":"cb9d044a-5b65-468c-88a7-6338c76e3021","Type":"ContainerDied","Data":"f602521792920a2169930078f727631021f362b6dd3714a5de572238db821ce5"} Dec 16 07:18:13 crc kubenswrapper[4823]: I1216 07:18:13.376183 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"22c10a9c-6dba-4d35-a0d8-2ef0b82352cb","Type":"ContainerStarted","Data":"d830021772bc6acdb57fc16639b21dc822f7b3e5addd734a34dd548a4cfce5b9"} Dec 16 07:18:13 crc kubenswrapper[4823]: I1216 07:18:13.379187 4823 scope.go:117] "RemoveContainer" containerID="69ef19b166f1aa3bf87453d32ff40974a0dbee2293781a4b289eddff9e6de9fb" Dec 16 07:18:13 crc kubenswrapper[4823]: I1216 07:18:13.396303 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xjwlr\" (UniqueName: \"kubernetes.io/projected/c18e8ceb-4282-417a-a207-19e78c89c1af-kube-api-access-xjwlr\") on node \"crc\" DevicePath \"\"" Dec 16 07:18:13 crc kubenswrapper[4823]: I1216 
07:18:13.396337 4823 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c18e8ceb-4282-417a-a207-19e78c89c1af-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 16 07:18:13 crc kubenswrapper[4823]: I1216 07:18:13.396656 4823 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c18e8ceb-4282-417a-a207-19e78c89c1af-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 16 07:18:13 crc kubenswrapper[4823]: I1216 07:18:13.406320 4823 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c18e8ceb-4282-417a-a207-19e78c89c1af-config\") on node \"crc\" DevicePath \"\"" Dec 16 07:18:13 crc kubenswrapper[4823]: I1216 07:18:13.406201 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-hzvj8" podStartSLOduration=3.406179886 podStartE2EDuration="3.406179886s" podCreationTimestamp="2025-12-16 07:18:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 07:18:13.379290894 +0000 UTC m=+1371.867857017" watchObservedRunningTime="2025-12-16 07:18:13.406179886 +0000 UTC m=+1371.894746009" Dec 16 07:18:13 crc kubenswrapper[4823]: I1216 07:18:13.657011 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c18e8ceb-4282-417a-a207-19e78c89c1af-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c18e8ceb-4282-417a-a207-19e78c89c1af" (UID: "c18e8ceb-4282-417a-a207-19e78c89c1af"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:18:13 crc kubenswrapper[4823]: I1216 07:18:13.665164 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c18e8ceb-4282-417a-a207-19e78c89c1af-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c18e8ceb-4282-417a-a207-19e78c89c1af" (UID: "c18e8ceb-4282-417a-a207-19e78c89c1af"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:18:13 crc kubenswrapper[4823]: I1216 07:18:13.686896 4823 scope.go:117] "RemoveContainer" containerID="9ae8973126f3c15b2cb0e0cebea03e45af2715b2924e20ccc8e2805f501fe397" Dec 16 07:18:13 crc kubenswrapper[4823]: E1216 07:18:13.687337 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ae8973126f3c15b2cb0e0cebea03e45af2715b2924e20ccc8e2805f501fe397\": container with ID starting with 9ae8973126f3c15b2cb0e0cebea03e45af2715b2924e20ccc8e2805f501fe397 not found: ID does not exist" containerID="9ae8973126f3c15b2cb0e0cebea03e45af2715b2924e20ccc8e2805f501fe397" Dec 16 07:18:13 crc kubenswrapper[4823]: I1216 07:18:13.687363 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ae8973126f3c15b2cb0e0cebea03e45af2715b2924e20ccc8e2805f501fe397"} err="failed to get container status \"9ae8973126f3c15b2cb0e0cebea03e45af2715b2924e20ccc8e2805f501fe397\": rpc error: code = NotFound desc = could not find container \"9ae8973126f3c15b2cb0e0cebea03e45af2715b2924e20ccc8e2805f501fe397\": container with ID starting with 9ae8973126f3c15b2cb0e0cebea03e45af2715b2924e20ccc8e2805f501fe397 not found: ID does not exist" Dec 16 07:18:13 crc kubenswrapper[4823]: I1216 07:18:13.687382 4823 scope.go:117] "RemoveContainer" containerID="69ef19b166f1aa3bf87453d32ff40974a0dbee2293781a4b289eddff9e6de9fb" Dec 16 07:18:13 crc kubenswrapper[4823]: E1216 07:18:13.687574 4823 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69ef19b166f1aa3bf87453d32ff40974a0dbee2293781a4b289eddff9e6de9fb\": container with ID starting with 69ef19b166f1aa3bf87453d32ff40974a0dbee2293781a4b289eddff9e6de9fb not found: ID does not exist" containerID="69ef19b166f1aa3bf87453d32ff40974a0dbee2293781a4b289eddff9e6de9fb" Dec 16 07:18:13 crc kubenswrapper[4823]: I1216 07:18:13.687591 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69ef19b166f1aa3bf87453d32ff40974a0dbee2293781a4b289eddff9e6de9fb"} err="failed to get container status \"69ef19b166f1aa3bf87453d32ff40974a0dbee2293781a4b289eddff9e6de9fb\": rpc error: code = NotFound desc = could not find container \"69ef19b166f1aa3bf87453d32ff40974a0dbee2293781a4b289eddff9e6de9fb\": container with ID starting with 69ef19b166f1aa3bf87453d32ff40974a0dbee2293781a4b289eddff9e6de9fb not found: ID does not exist" Dec 16 07:18:13 crc kubenswrapper[4823]: I1216 07:18:13.756360 4823 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c18e8ceb-4282-417a-a207-19e78c89c1af-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 16 07:18:13 crc kubenswrapper[4823]: I1216 07:18:13.756402 4823 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c18e8ceb-4282-417a-a207-19e78c89c1af-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 16 07:18:13 crc kubenswrapper[4823]: I1216 07:18:13.805603 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 16 07:18:13 crc kubenswrapper[4823]: I1216 07:18:13.901210 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 16 07:18:13 crc kubenswrapper[4823]: I1216 07:18:13.971081 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 16 07:18:14 crc 
kubenswrapper[4823]: I1216 07:18:14.014229 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74dfc89d77-rtrnt"] Dec 16 07:18:14 crc kubenswrapper[4823]: I1216 07:18:14.031492 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-74dfc89d77-rtrnt"] Dec 16 07:18:14 crc kubenswrapper[4823]: I1216 07:18:14.100428 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5fdbfbc95f-rq8ng" Dec 16 07:18:14 crc kubenswrapper[4823]: I1216 07:18:14.265902 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cb9d044a-5b65-468c-88a7-6338c76e3021-dns-swift-storage-0\") pod \"cb9d044a-5b65-468c-88a7-6338c76e3021\" (UID: \"cb9d044a-5b65-468c-88a7-6338c76e3021\") " Dec 16 07:18:14 crc kubenswrapper[4823]: I1216 07:18:14.265982 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tcbhf\" (UniqueName: \"kubernetes.io/projected/cb9d044a-5b65-468c-88a7-6338c76e3021-kube-api-access-tcbhf\") pod \"cb9d044a-5b65-468c-88a7-6338c76e3021\" (UID: \"cb9d044a-5b65-468c-88a7-6338c76e3021\") " Dec 16 07:18:14 crc kubenswrapper[4823]: I1216 07:18:14.266014 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cb9d044a-5b65-468c-88a7-6338c76e3021-dns-svc\") pod \"cb9d044a-5b65-468c-88a7-6338c76e3021\" (UID: \"cb9d044a-5b65-468c-88a7-6338c76e3021\") " Dec 16 07:18:14 crc kubenswrapper[4823]: I1216 07:18:14.266163 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb9d044a-5b65-468c-88a7-6338c76e3021-config\") pod \"cb9d044a-5b65-468c-88a7-6338c76e3021\" (UID: \"cb9d044a-5b65-468c-88a7-6338c76e3021\") " Dec 16 07:18:14 crc kubenswrapper[4823]: I1216 07:18:14.266386 4823 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cb9d044a-5b65-468c-88a7-6338c76e3021-ovsdbserver-sb\") pod \"cb9d044a-5b65-468c-88a7-6338c76e3021\" (UID: \"cb9d044a-5b65-468c-88a7-6338c76e3021\") " Dec 16 07:18:14 crc kubenswrapper[4823]: I1216 07:18:14.266471 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cb9d044a-5b65-468c-88a7-6338c76e3021-ovsdbserver-nb\") pod \"cb9d044a-5b65-468c-88a7-6338c76e3021\" (UID: \"cb9d044a-5b65-468c-88a7-6338c76e3021\") " Dec 16 07:18:14 crc kubenswrapper[4823]: I1216 07:18:14.271863 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb9d044a-5b65-468c-88a7-6338c76e3021-kube-api-access-tcbhf" (OuterVolumeSpecName: "kube-api-access-tcbhf") pod "cb9d044a-5b65-468c-88a7-6338c76e3021" (UID: "cb9d044a-5b65-468c-88a7-6338c76e3021"). InnerVolumeSpecName "kube-api-access-tcbhf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:18:14 crc kubenswrapper[4823]: I1216 07:18:14.297122 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb9d044a-5b65-468c-88a7-6338c76e3021-config" (OuterVolumeSpecName: "config") pod "cb9d044a-5b65-468c-88a7-6338c76e3021" (UID: "cb9d044a-5b65-468c-88a7-6338c76e3021"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:18:14 crc kubenswrapper[4823]: I1216 07:18:14.299750 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb9d044a-5b65-468c-88a7-6338c76e3021-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "cb9d044a-5b65-468c-88a7-6338c76e3021" (UID: "cb9d044a-5b65-468c-88a7-6338c76e3021"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:18:14 crc kubenswrapper[4823]: I1216 07:18:14.311815 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb9d044a-5b65-468c-88a7-6338c76e3021-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "cb9d044a-5b65-468c-88a7-6338c76e3021" (UID: "cb9d044a-5b65-468c-88a7-6338c76e3021"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:18:14 crc kubenswrapper[4823]: I1216 07:18:14.312412 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb9d044a-5b65-468c-88a7-6338c76e3021-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "cb9d044a-5b65-468c-88a7-6338c76e3021" (UID: "cb9d044a-5b65-468c-88a7-6338c76e3021"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:18:14 crc kubenswrapper[4823]: I1216 07:18:14.325621 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb9d044a-5b65-468c-88a7-6338c76e3021-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "cb9d044a-5b65-468c-88a7-6338c76e3021" (UID: "cb9d044a-5b65-468c-88a7-6338c76e3021"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:18:14 crc kubenswrapper[4823]: I1216 07:18:14.368113 4823 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cb9d044a-5b65-468c-88a7-6338c76e3021-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 16 07:18:14 crc kubenswrapper[4823]: I1216 07:18:14.368370 4823 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cb9d044a-5b65-468c-88a7-6338c76e3021-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 16 07:18:14 crc kubenswrapper[4823]: I1216 07:18:14.368380 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tcbhf\" (UniqueName: \"kubernetes.io/projected/cb9d044a-5b65-468c-88a7-6338c76e3021-kube-api-access-tcbhf\") on node \"crc\" DevicePath \"\"" Dec 16 07:18:14 crc kubenswrapper[4823]: I1216 07:18:14.370705 4823 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cb9d044a-5b65-468c-88a7-6338c76e3021-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 16 07:18:14 crc kubenswrapper[4823]: I1216 07:18:14.370743 4823 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb9d044a-5b65-468c-88a7-6338c76e3021-config\") on node \"crc\" DevicePath \"\"" Dec 16 07:18:14 crc kubenswrapper[4823]: I1216 07:18:14.370753 4823 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cb9d044a-5b65-468c-88a7-6338c76e3021-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 16 07:18:14 crc kubenswrapper[4823]: I1216 07:18:14.413497 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5fdbfbc95f-rq8ng" Dec 16 07:18:14 crc kubenswrapper[4823]: I1216 07:18:14.414428 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fdbfbc95f-rq8ng" event={"ID":"cb9d044a-5b65-468c-88a7-6338c76e3021","Type":"ContainerDied","Data":"0e208820aa4b0e01f84b1ab59bcaaa98f7894a0061d465acbb1fd493c3009574"} Dec 16 07:18:14 crc kubenswrapper[4823]: I1216 07:18:14.414487 4823 scope.go:117] "RemoveContainer" containerID="f602521792920a2169930078f727631021f362b6dd3714a5de572238db821ce5" Dec 16 07:18:14 crc kubenswrapper[4823]: I1216 07:18:14.420493 4823 generic.go:334] "Generic (PLEG): container finished" podID="166784d2-df96-4ee8-a1a3-22a967bff610" containerID="4c4254e8613eac97614956581232adf031224db77cbd5566f427b617354b173f" exitCode=0 Dec 16 07:18:14 crc kubenswrapper[4823]: I1216 07:18:14.420599 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f6f8cb849-wxg9m" event={"ID":"166784d2-df96-4ee8-a1a3-22a967bff610","Type":"ContainerDied","Data":"4c4254e8613eac97614956581232adf031224db77cbd5566f427b617354b173f"} Dec 16 07:18:14 crc kubenswrapper[4823]: I1216 07:18:14.559183 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5fdbfbc95f-rq8ng"] Dec 16 07:18:14 crc kubenswrapper[4823]: I1216 07:18:14.571790 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5fdbfbc95f-rq8ng"] Dec 16 07:18:15 crc kubenswrapper[4823]: I1216 07:18:15.433441 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c5e4bc29-92ee-49cb-b3c7-792d403f1afa","Type":"ContainerStarted","Data":"b6404971ffd3806b279fe73ea8a60a61baa7a8fbbe8d84fef6f66440c2a70b53"} Dec 16 07:18:15 crc kubenswrapper[4823]: I1216 07:18:15.441099 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f6f8cb849-wxg9m" 
event={"ID":"166784d2-df96-4ee8-a1a3-22a967bff610","Type":"ContainerStarted","Data":"915bf11a2d2e78e3982dc67bdcd3b8756575f7fc1659fb4dbf5ed5f2329a9e62"} Dec 16 07:18:15 crc kubenswrapper[4823]: I1216 07:18:15.442642 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6f6f8cb849-wxg9m" Dec 16 07:18:15 crc kubenswrapper[4823]: I1216 07:18:15.447232 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9de2069a-57e3-4ef0-8206-35a2cad119c7","Type":"ContainerStarted","Data":"404b52af3ac618c4c677f8884891b92486537bbc5c3934501fd1e30ef876f5c2"} Dec 16 07:18:15 crc kubenswrapper[4823]: I1216 07:18:15.478889 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6f6f8cb849-wxg9m" podStartSLOduration=4.478868533 podStartE2EDuration="4.478868533s" podCreationTimestamp="2025-12-16 07:18:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 07:18:15.474419933 +0000 UTC m=+1373.962986066" watchObservedRunningTime="2025-12-16 07:18:15.478868533 +0000 UTC m=+1373.967434656" Dec 16 07:18:15 crc kubenswrapper[4823]: I1216 07:18:15.782787 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c18e8ceb-4282-417a-a207-19e78c89c1af" path="/var/lib/kubelet/pods/c18e8ceb-4282-417a-a207-19e78c89c1af/volumes" Dec 16 07:18:15 crc kubenswrapper[4823]: I1216 07:18:15.783745 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb9d044a-5b65-468c-88a7-6338c76e3021" path="/var/lib/kubelet/pods/cb9d044a-5b65-468c-88a7-6338c76e3021/volumes" Dec 16 07:18:16 crc kubenswrapper[4823]: I1216 07:18:16.469764 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"9de2069a-57e3-4ef0-8206-35a2cad119c7","Type":"ContainerStarted","Data":"dcfa7f9b2e73bc5854a2733f0706c4b72ed728ec5d5a8f799030622f297dcb08"} Dec 16 07:18:16 crc kubenswrapper[4823]: I1216 07:18:16.469858 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="9de2069a-57e3-4ef0-8206-35a2cad119c7" containerName="glance-log" containerID="cri-o://404b52af3ac618c4c677f8884891b92486537bbc5c3934501fd1e30ef876f5c2" gracePeriod=30 Dec 16 07:18:16 crc kubenswrapper[4823]: I1216 07:18:16.469912 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="9de2069a-57e3-4ef0-8206-35a2cad119c7" containerName="glance-httpd" containerID="cri-o://dcfa7f9b2e73bc5854a2733f0706c4b72ed728ec5d5a8f799030622f297dcb08" gracePeriod=30 Dec 16 07:18:16 crc kubenswrapper[4823]: I1216 07:18:16.476401 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="c5e4bc29-92ee-49cb-b3c7-792d403f1afa" containerName="glance-log" containerID="cri-o://b6404971ffd3806b279fe73ea8a60a61baa7a8fbbe8d84fef6f66440c2a70b53" gracePeriod=30 Dec 16 07:18:16 crc kubenswrapper[4823]: I1216 07:18:16.476571 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="c5e4bc29-92ee-49cb-b3c7-792d403f1afa" containerName="glance-httpd" containerID="cri-o://ada25730040dec0b01392bc063d16f90552718206b331be4ad1934eef2f28496" gracePeriod=30 Dec 16 07:18:16 crc kubenswrapper[4823]: I1216 07:18:16.476649 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c5e4bc29-92ee-49cb-b3c7-792d403f1afa","Type":"ContainerStarted","Data":"ada25730040dec0b01392bc063d16f90552718206b331be4ad1934eef2f28496"} Dec 16 07:18:16 crc kubenswrapper[4823]: I1216 07:18:16.511792 4823 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=6.511774603 podStartE2EDuration="6.511774603s" podCreationTimestamp="2025-12-16 07:18:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 07:18:16.497494866 +0000 UTC m=+1374.986060989" watchObservedRunningTime="2025-12-16 07:18:16.511774603 +0000 UTC m=+1375.000340726" Dec 16 07:18:16 crc kubenswrapper[4823]: I1216 07:18:16.530552 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=6.530537621 podStartE2EDuration="6.530537621s" podCreationTimestamp="2025-12-16 07:18:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 07:18:16.527806855 +0000 UTC m=+1375.016372978" watchObservedRunningTime="2025-12-16 07:18:16.530537621 +0000 UTC m=+1375.019103744" Dec 16 07:18:17 crc kubenswrapper[4823]: I1216 07:18:17.495172 4823 generic.go:334] "Generic (PLEG): container finished" podID="c5e4bc29-92ee-49cb-b3c7-792d403f1afa" containerID="ada25730040dec0b01392bc063d16f90552718206b331be4ad1934eef2f28496" exitCode=0 Dec 16 07:18:17 crc kubenswrapper[4823]: I1216 07:18:17.495420 4823 generic.go:334] "Generic (PLEG): container finished" podID="c5e4bc29-92ee-49cb-b3c7-792d403f1afa" containerID="b6404971ffd3806b279fe73ea8a60a61baa7a8fbbe8d84fef6f66440c2a70b53" exitCode=143 Dec 16 07:18:17 crc kubenswrapper[4823]: I1216 07:18:17.495218 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c5e4bc29-92ee-49cb-b3c7-792d403f1afa","Type":"ContainerDied","Data":"ada25730040dec0b01392bc063d16f90552718206b331be4ad1934eef2f28496"} Dec 16 07:18:17 crc kubenswrapper[4823]: I1216 07:18:17.495476 4823 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c5e4bc29-92ee-49cb-b3c7-792d403f1afa","Type":"ContainerDied","Data":"b6404971ffd3806b279fe73ea8a60a61baa7a8fbbe8d84fef6f66440c2a70b53"} Dec 16 07:18:17 crc kubenswrapper[4823]: I1216 07:18:17.497833 4823 generic.go:334] "Generic (PLEG): container finished" podID="46e59506-a8dc-49b4-b1b9-d505eeeee126" containerID="6c7499272eb96a0be4f830b28f2badc678a46e4891df1fbff674a8bf16f9dc6b" exitCode=0 Dec 16 07:18:17 crc kubenswrapper[4823]: I1216 07:18:17.497870 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-msjl2" event={"ID":"46e59506-a8dc-49b4-b1b9-d505eeeee126","Type":"ContainerDied","Data":"6c7499272eb96a0be4f830b28f2badc678a46e4891df1fbff674a8bf16f9dc6b"} Dec 16 07:18:17 crc kubenswrapper[4823]: I1216 07:18:17.501114 4823 generic.go:334] "Generic (PLEG): container finished" podID="9de2069a-57e3-4ef0-8206-35a2cad119c7" containerID="dcfa7f9b2e73bc5854a2733f0706c4b72ed728ec5d5a8f799030622f297dcb08" exitCode=0 Dec 16 07:18:17 crc kubenswrapper[4823]: I1216 07:18:17.501134 4823 generic.go:334] "Generic (PLEG): container finished" podID="9de2069a-57e3-4ef0-8206-35a2cad119c7" containerID="404b52af3ac618c4c677f8884891b92486537bbc5c3934501fd1e30ef876f5c2" exitCode=143 Dec 16 07:18:17 crc kubenswrapper[4823]: I1216 07:18:17.501799 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9de2069a-57e3-4ef0-8206-35a2cad119c7","Type":"ContainerDied","Data":"dcfa7f9b2e73bc5854a2733f0706c4b72ed728ec5d5a8f799030622f297dcb08"} Dec 16 07:18:17 crc kubenswrapper[4823]: I1216 07:18:17.501824 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9de2069a-57e3-4ef0-8206-35a2cad119c7","Type":"ContainerDied","Data":"404b52af3ac618c4c677f8884891b92486537bbc5c3934501fd1e30ef876f5c2"} Dec 16 07:18:19 crc kubenswrapper[4823]: I1216 07:18:19.936877 4823 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-msjl2" Dec 16 07:18:20 crc kubenswrapper[4823]: I1216 07:18:20.007435 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q8fb7\" (UniqueName: \"kubernetes.io/projected/46e59506-a8dc-49b4-b1b9-d505eeeee126-kube-api-access-q8fb7\") pod \"46e59506-a8dc-49b4-b1b9-d505eeeee126\" (UID: \"46e59506-a8dc-49b4-b1b9-d505eeeee126\") " Dec 16 07:18:20 crc kubenswrapper[4823]: I1216 07:18:20.007508 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46e59506-a8dc-49b4-b1b9-d505eeeee126-config-data\") pod \"46e59506-a8dc-49b4-b1b9-d505eeeee126\" (UID: \"46e59506-a8dc-49b4-b1b9-d505eeeee126\") " Dec 16 07:18:20 crc kubenswrapper[4823]: I1216 07:18:20.007587 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46e59506-a8dc-49b4-b1b9-d505eeeee126-scripts\") pod \"46e59506-a8dc-49b4-b1b9-d505eeeee126\" (UID: \"46e59506-a8dc-49b4-b1b9-d505eeeee126\") " Dec 16 07:18:20 crc kubenswrapper[4823]: I1216 07:18:20.007662 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/46e59506-a8dc-49b4-b1b9-d505eeeee126-fernet-keys\") pod \"46e59506-a8dc-49b4-b1b9-d505eeeee126\" (UID: \"46e59506-a8dc-49b4-b1b9-d505eeeee126\") " Dec 16 07:18:20 crc kubenswrapper[4823]: I1216 07:18:20.007695 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/46e59506-a8dc-49b4-b1b9-d505eeeee126-credential-keys\") pod \"46e59506-a8dc-49b4-b1b9-d505eeeee126\" (UID: \"46e59506-a8dc-49b4-b1b9-d505eeeee126\") " Dec 16 07:18:20 crc kubenswrapper[4823]: I1216 07:18:20.007738 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46e59506-a8dc-49b4-b1b9-d505eeeee126-combined-ca-bundle\") pod \"46e59506-a8dc-49b4-b1b9-d505eeeee126\" (UID: \"46e59506-a8dc-49b4-b1b9-d505eeeee126\") " Dec 16 07:18:20 crc kubenswrapper[4823]: I1216 07:18:20.014636 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46e59506-a8dc-49b4-b1b9-d505eeeee126-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "46e59506-a8dc-49b4-b1b9-d505eeeee126" (UID: "46e59506-a8dc-49b4-b1b9-d505eeeee126"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:18:20 crc kubenswrapper[4823]: I1216 07:18:20.014690 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46e59506-a8dc-49b4-b1b9-d505eeeee126-scripts" (OuterVolumeSpecName: "scripts") pod "46e59506-a8dc-49b4-b1b9-d505eeeee126" (UID: "46e59506-a8dc-49b4-b1b9-d505eeeee126"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:18:20 crc kubenswrapper[4823]: I1216 07:18:20.017061 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46e59506-a8dc-49b4-b1b9-d505eeeee126-kube-api-access-q8fb7" (OuterVolumeSpecName: "kube-api-access-q8fb7") pod "46e59506-a8dc-49b4-b1b9-d505eeeee126" (UID: "46e59506-a8dc-49b4-b1b9-d505eeeee126"). InnerVolumeSpecName "kube-api-access-q8fb7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:18:20 crc kubenswrapper[4823]: I1216 07:18:20.019835 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46e59506-a8dc-49b4-b1b9-d505eeeee126-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "46e59506-a8dc-49b4-b1b9-d505eeeee126" (UID: "46e59506-a8dc-49b4-b1b9-d505eeeee126"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:18:20 crc kubenswrapper[4823]: I1216 07:18:20.040929 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46e59506-a8dc-49b4-b1b9-d505eeeee126-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "46e59506-a8dc-49b4-b1b9-d505eeeee126" (UID: "46e59506-a8dc-49b4-b1b9-d505eeeee126"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:18:20 crc kubenswrapper[4823]: I1216 07:18:20.065116 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46e59506-a8dc-49b4-b1b9-d505eeeee126-config-data" (OuterVolumeSpecName: "config-data") pod "46e59506-a8dc-49b4-b1b9-d505eeeee126" (UID: "46e59506-a8dc-49b4-b1b9-d505eeeee126"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:18:20 crc kubenswrapper[4823]: I1216 07:18:20.116220 4823 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/46e59506-a8dc-49b4-b1b9-d505eeeee126-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 16 07:18:20 crc kubenswrapper[4823]: I1216 07:18:20.116264 4823 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/46e59506-a8dc-49b4-b1b9-d505eeeee126-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 16 07:18:20 crc kubenswrapper[4823]: I1216 07:18:20.116274 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46e59506-a8dc-49b4-b1b9-d505eeeee126-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:18:20 crc kubenswrapper[4823]: I1216 07:18:20.116288 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q8fb7\" (UniqueName: \"kubernetes.io/projected/46e59506-a8dc-49b4-b1b9-d505eeeee126-kube-api-access-q8fb7\") on node \"crc\" 
DevicePath \"\"" Dec 16 07:18:20 crc kubenswrapper[4823]: I1216 07:18:20.116299 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46e59506-a8dc-49b4-b1b9-d505eeeee126-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 07:18:20 crc kubenswrapper[4823]: I1216 07:18:20.116308 4823 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46e59506-a8dc-49b4-b1b9-d505eeeee126-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:18:20 crc kubenswrapper[4823]: I1216 07:18:20.528609 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-msjl2" event={"ID":"46e59506-a8dc-49b4-b1b9-d505eeeee126","Type":"ContainerDied","Data":"37a74a9eebba3b3d42c224009f1e9770203526e94e40efcb76661fef5bb3bff2"} Dec 16 07:18:20 crc kubenswrapper[4823]: I1216 07:18:20.528647 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="37a74a9eebba3b3d42c224009f1e9770203526e94e40efcb76661fef5bb3bff2" Dec 16 07:18:20 crc kubenswrapper[4823]: I1216 07:18:20.528699 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-msjl2" Dec 16 07:18:21 crc kubenswrapper[4823]: I1216 07:18:21.023086 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-msjl2"] Dec 16 07:18:21 crc kubenswrapper[4823]: I1216 07:18:21.031236 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-msjl2"] Dec 16 07:18:21 crc kubenswrapper[4823]: I1216 07:18:21.119843 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-np6b5"] Dec 16 07:18:21 crc kubenswrapper[4823]: E1216 07:18:21.120539 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46e59506-a8dc-49b4-b1b9-d505eeeee126" containerName="keystone-bootstrap" Dec 16 07:18:21 crc kubenswrapper[4823]: I1216 07:18:21.120558 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="46e59506-a8dc-49b4-b1b9-d505eeeee126" containerName="keystone-bootstrap" Dec 16 07:18:21 crc kubenswrapper[4823]: E1216 07:18:21.120586 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb9d044a-5b65-468c-88a7-6338c76e3021" containerName="init" Dec 16 07:18:21 crc kubenswrapper[4823]: I1216 07:18:21.120595 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb9d044a-5b65-468c-88a7-6338c76e3021" containerName="init" Dec 16 07:18:21 crc kubenswrapper[4823]: E1216 07:18:21.120608 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c18e8ceb-4282-417a-a207-19e78c89c1af" containerName="dnsmasq-dns" Dec 16 07:18:21 crc kubenswrapper[4823]: I1216 07:18:21.120615 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="c18e8ceb-4282-417a-a207-19e78c89c1af" containerName="dnsmasq-dns" Dec 16 07:18:21 crc kubenswrapper[4823]: E1216 07:18:21.120627 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c18e8ceb-4282-417a-a207-19e78c89c1af" containerName="init" Dec 16 07:18:21 crc kubenswrapper[4823]: I1216 07:18:21.120632 4823 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="c18e8ceb-4282-417a-a207-19e78c89c1af" containerName="init" Dec 16 07:18:21 crc kubenswrapper[4823]: I1216 07:18:21.120808 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb9d044a-5b65-468c-88a7-6338c76e3021" containerName="init" Dec 16 07:18:21 crc kubenswrapper[4823]: I1216 07:18:21.120820 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="c18e8ceb-4282-417a-a207-19e78c89c1af" containerName="dnsmasq-dns" Dec 16 07:18:21 crc kubenswrapper[4823]: I1216 07:18:21.120832 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="46e59506-a8dc-49b4-b1b9-d505eeeee126" containerName="keystone-bootstrap" Dec 16 07:18:21 crc kubenswrapper[4823]: I1216 07:18:21.121469 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-np6b5" Dec 16 07:18:21 crc kubenswrapper[4823]: I1216 07:18:21.127095 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 16 07:18:21 crc kubenswrapper[4823]: I1216 07:18:21.127240 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 16 07:18:21 crc kubenswrapper[4823]: I1216 07:18:21.127255 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 16 07:18:21 crc kubenswrapper[4823]: I1216 07:18:21.127666 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-wtcrq" Dec 16 07:18:21 crc kubenswrapper[4823]: I1216 07:18:21.127955 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 16 07:18:21 crc kubenswrapper[4823]: I1216 07:18:21.128158 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-np6b5"] Dec 16 07:18:21 crc kubenswrapper[4823]: I1216 07:18:21.241936 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dca6476-18f2-4c1a-8c95-e894c5f9facd-combined-ca-bundle\") pod \"keystone-bootstrap-np6b5\" (UID: \"9dca6476-18f2-4c1a-8c95-e894c5f9facd\") " pod="openstack/keystone-bootstrap-np6b5" Dec 16 07:18:21 crc kubenswrapper[4823]: I1216 07:18:21.242043 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9dca6476-18f2-4c1a-8c95-e894c5f9facd-credential-keys\") pod \"keystone-bootstrap-np6b5\" (UID: \"9dca6476-18f2-4c1a-8c95-e894c5f9facd\") " pod="openstack/keystone-bootstrap-np6b5" Dec 16 07:18:21 crc kubenswrapper[4823]: I1216 07:18:21.242085 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9dca6476-18f2-4c1a-8c95-e894c5f9facd-fernet-keys\") pod \"keystone-bootstrap-np6b5\" (UID: \"9dca6476-18f2-4c1a-8c95-e894c5f9facd\") " pod="openstack/keystone-bootstrap-np6b5" Dec 16 07:18:21 crc kubenswrapper[4823]: I1216 07:18:21.242112 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9dca6476-18f2-4c1a-8c95-e894c5f9facd-scripts\") pod \"keystone-bootstrap-np6b5\" (UID: \"9dca6476-18f2-4c1a-8c95-e894c5f9facd\") " pod="openstack/keystone-bootstrap-np6b5" Dec 16 07:18:21 crc kubenswrapper[4823]: I1216 07:18:21.242351 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9dca6476-18f2-4c1a-8c95-e894c5f9facd-config-data\") pod \"keystone-bootstrap-np6b5\" (UID: \"9dca6476-18f2-4c1a-8c95-e894c5f9facd\") " pod="openstack/keystone-bootstrap-np6b5" Dec 16 07:18:21 crc kubenswrapper[4823]: I1216 07:18:21.242683 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxk85\" 
(UniqueName: \"kubernetes.io/projected/9dca6476-18f2-4c1a-8c95-e894c5f9facd-kube-api-access-wxk85\") pod \"keystone-bootstrap-np6b5\" (UID: \"9dca6476-18f2-4c1a-8c95-e894c5f9facd\") " pod="openstack/keystone-bootstrap-np6b5" Dec 16 07:18:21 crc kubenswrapper[4823]: I1216 07:18:21.344498 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxk85\" (UniqueName: \"kubernetes.io/projected/9dca6476-18f2-4c1a-8c95-e894c5f9facd-kube-api-access-wxk85\") pod \"keystone-bootstrap-np6b5\" (UID: \"9dca6476-18f2-4c1a-8c95-e894c5f9facd\") " pod="openstack/keystone-bootstrap-np6b5" Dec 16 07:18:21 crc kubenswrapper[4823]: I1216 07:18:21.344587 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dca6476-18f2-4c1a-8c95-e894c5f9facd-combined-ca-bundle\") pod \"keystone-bootstrap-np6b5\" (UID: \"9dca6476-18f2-4c1a-8c95-e894c5f9facd\") " pod="openstack/keystone-bootstrap-np6b5" Dec 16 07:18:21 crc kubenswrapper[4823]: I1216 07:18:21.344670 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9dca6476-18f2-4c1a-8c95-e894c5f9facd-credential-keys\") pod \"keystone-bootstrap-np6b5\" (UID: \"9dca6476-18f2-4c1a-8c95-e894c5f9facd\") " pod="openstack/keystone-bootstrap-np6b5" Dec 16 07:18:21 crc kubenswrapper[4823]: I1216 07:18:21.344707 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9dca6476-18f2-4c1a-8c95-e894c5f9facd-fernet-keys\") pod \"keystone-bootstrap-np6b5\" (UID: \"9dca6476-18f2-4c1a-8c95-e894c5f9facd\") " pod="openstack/keystone-bootstrap-np6b5" Dec 16 07:18:21 crc kubenswrapper[4823]: I1216 07:18:21.344733 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9dca6476-18f2-4c1a-8c95-e894c5f9facd-scripts\") 
pod \"keystone-bootstrap-np6b5\" (UID: \"9dca6476-18f2-4c1a-8c95-e894c5f9facd\") " pod="openstack/keystone-bootstrap-np6b5" Dec 16 07:18:21 crc kubenswrapper[4823]: I1216 07:18:21.344771 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9dca6476-18f2-4c1a-8c95-e894c5f9facd-config-data\") pod \"keystone-bootstrap-np6b5\" (UID: \"9dca6476-18f2-4c1a-8c95-e894c5f9facd\") " pod="openstack/keystone-bootstrap-np6b5" Dec 16 07:18:21 crc kubenswrapper[4823]: I1216 07:18:21.350347 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9dca6476-18f2-4c1a-8c95-e894c5f9facd-credential-keys\") pod \"keystone-bootstrap-np6b5\" (UID: \"9dca6476-18f2-4c1a-8c95-e894c5f9facd\") " pod="openstack/keystone-bootstrap-np6b5" Dec 16 07:18:21 crc kubenswrapper[4823]: I1216 07:18:21.350723 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dca6476-18f2-4c1a-8c95-e894c5f9facd-combined-ca-bundle\") pod \"keystone-bootstrap-np6b5\" (UID: \"9dca6476-18f2-4c1a-8c95-e894c5f9facd\") " pod="openstack/keystone-bootstrap-np6b5" Dec 16 07:18:21 crc kubenswrapper[4823]: I1216 07:18:21.350801 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9dca6476-18f2-4c1a-8c95-e894c5f9facd-fernet-keys\") pod \"keystone-bootstrap-np6b5\" (UID: \"9dca6476-18f2-4c1a-8c95-e894c5f9facd\") " pod="openstack/keystone-bootstrap-np6b5" Dec 16 07:18:21 crc kubenswrapper[4823]: I1216 07:18:21.352230 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9dca6476-18f2-4c1a-8c95-e894c5f9facd-scripts\") pod \"keystone-bootstrap-np6b5\" (UID: \"9dca6476-18f2-4c1a-8c95-e894c5f9facd\") " pod="openstack/keystone-bootstrap-np6b5" Dec 16 07:18:21 crc 
kubenswrapper[4823]: I1216 07:18:21.358689 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9dca6476-18f2-4c1a-8c95-e894c5f9facd-config-data\") pod \"keystone-bootstrap-np6b5\" (UID: \"9dca6476-18f2-4c1a-8c95-e894c5f9facd\") " pod="openstack/keystone-bootstrap-np6b5" Dec 16 07:18:21 crc kubenswrapper[4823]: I1216 07:18:21.363162 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxk85\" (UniqueName: \"kubernetes.io/projected/9dca6476-18f2-4c1a-8c95-e894c5f9facd-kube-api-access-wxk85\") pod \"keystone-bootstrap-np6b5\" (UID: \"9dca6476-18f2-4c1a-8c95-e894c5f9facd\") " pod="openstack/keystone-bootstrap-np6b5" Dec 16 07:18:21 crc kubenswrapper[4823]: I1216 07:18:21.455425 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-np6b5" Dec 16 07:18:21 crc kubenswrapper[4823]: I1216 07:18:21.536236 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6f6f8cb849-wxg9m" Dec 16 07:18:21 crc kubenswrapper[4823]: I1216 07:18:21.625281 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67fdf7998c-x4fkj"] Dec 16 07:18:21 crc kubenswrapper[4823]: I1216 07:18:21.625510 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-67fdf7998c-x4fkj" podUID="5282b108-1519-455e-b112-ad707af48a9f" containerName="dnsmasq-dns" containerID="cri-o://7b7c792a68d4e76b92c312443198e54881b3f7fc18a5ddf4d23f57980f79af89" gracePeriod=10 Dec 16 07:18:21 crc kubenswrapper[4823]: I1216 07:18:21.800649 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46e59506-a8dc-49b4-b1b9-d505eeeee126" path="/var/lib/kubelet/pods/46e59506-a8dc-49b4-b1b9-d505eeeee126/volumes" Dec 16 07:18:22 crc kubenswrapper[4823]: I1216 07:18:22.573668 4823 generic.go:334] "Generic (PLEG): container finished" 
podID="5282b108-1519-455e-b112-ad707af48a9f" containerID="7b7c792a68d4e76b92c312443198e54881b3f7fc18a5ddf4d23f57980f79af89" exitCode=0 Dec 16 07:18:22 crc kubenswrapper[4823]: I1216 07:18:22.573742 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67fdf7998c-x4fkj" event={"ID":"5282b108-1519-455e-b112-ad707af48a9f","Type":"ContainerDied","Data":"7b7c792a68d4e76b92c312443198e54881b3f7fc18a5ddf4d23f57980f79af89"} Dec 16 07:18:31 crc kubenswrapper[4823]: I1216 07:18:31.524837 4823 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-67fdf7998c-x4fkj" podUID="5282b108-1519-455e-b112-ad707af48a9f" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.118:5353: i/o timeout" Dec 16 07:18:33 crc kubenswrapper[4823]: E1216 07:18:33.157568 4823 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:fe32d3ea620f0c7ecfdde9bbf28417fde03bc18c6f60b1408fa8da24d8188f16" Dec 16 07:18:33 crc kubenswrapper[4823]: E1216 07:18:33.158269 4823 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:fe32d3ea620f0c7ecfdde9bbf28417fde03bc18c6f60b1408fa8da24d8188f16,Command:[/bin/bash],Args:[-c barbican-manage db 
upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lldfz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-q69qd_openstack(59c74f3a-8b4c-47eb-8d8d-af32e667d121): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 16 07:18:33 crc kubenswrapper[4823]: E1216 07:18:33.159464 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-q69qd" 
podUID="59c74f3a-8b4c-47eb-8d8d-af32e667d121" Dec 16 07:18:33 crc kubenswrapper[4823]: E1216 07:18:33.706772 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:fe32d3ea620f0c7ecfdde9bbf28417fde03bc18c6f60b1408fa8da24d8188f16\\\"\"" pod="openstack/barbican-db-sync-q69qd" podUID="59c74f3a-8b4c-47eb-8d8d-af32e667d121" Dec 16 07:18:34 crc kubenswrapper[4823]: E1216 07:18:34.136197 4823 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:b59b7445e581cc720038107e421371c86c5765b2967e77d884ef29b1d9fd0f49" Dec 16 07:18:34 crc kubenswrapper[4823]: E1216 07:18:34.136481 4823 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:b59b7445e581cc720038107e421371c86c5765b2967e77d884ef29b1d9fd0f49,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-m4dkp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin
:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-n2br8_openstack(fcd5e697-1360-4376-8160-ba0bc7fa56f8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 16 07:18:34 crc kubenswrapper[4823]: E1216 07:18:34.138301 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-n2br8" podUID="fcd5e697-1360-4376-8160-ba0bc7fa56f8" Dec 16 07:18:34 crc kubenswrapper[4823]: I1216 07:18:34.210147 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 16 07:18:34 crc kubenswrapper[4823]: I1216 07:18:34.222592 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 16 07:18:34 crc kubenswrapper[4823]: I1216 07:18:34.322541 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h4l52\" (UniqueName: \"kubernetes.io/projected/c5e4bc29-92ee-49cb-b3c7-792d403f1afa-kube-api-access-h4l52\") pod \"c5e4bc29-92ee-49cb-b3c7-792d403f1afa\" (UID: \"c5e4bc29-92ee-49cb-b3c7-792d403f1afa\") " Dec 16 07:18:34 crc kubenswrapper[4823]: I1216 07:18:34.322592 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dg9nn\" (UniqueName: \"kubernetes.io/projected/9de2069a-57e3-4ef0-8206-35a2cad119c7-kube-api-access-dg9nn\") pod \"9de2069a-57e3-4ef0-8206-35a2cad119c7\" (UID: \"9de2069a-57e3-4ef0-8206-35a2cad119c7\") " Dec 16 07:18:34 crc kubenswrapper[4823]: I1216 07:18:34.322621 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c5e4bc29-92ee-49cb-b3c7-792d403f1afa-logs\") pod \"c5e4bc29-92ee-49cb-b3c7-792d403f1afa\" (UID: \"c5e4bc29-92ee-49cb-b3c7-792d403f1afa\") " Dec 16 07:18:34 crc kubenswrapper[4823]: I1216 07:18:34.322719 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5e4bc29-92ee-49cb-b3c7-792d403f1afa-combined-ca-bundle\") pod \"c5e4bc29-92ee-49cb-b3c7-792d403f1afa\" (UID: \"c5e4bc29-92ee-49cb-b3c7-792d403f1afa\") " Dec 16 07:18:34 crc kubenswrapper[4823]: I1216 07:18:34.322743 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5e4bc29-92ee-49cb-b3c7-792d403f1afa-scripts\") pod \"c5e4bc29-92ee-49cb-b3c7-792d403f1afa\" (UID: \"c5e4bc29-92ee-49cb-b3c7-792d403f1afa\") " Dec 16 07:18:34 crc kubenswrapper[4823]: I1216 07:18:34.322783 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9de2069a-57e3-4ef0-8206-35a2cad119c7-public-tls-certs\") pod \"9de2069a-57e3-4ef0-8206-35a2cad119c7\" (UID: \"9de2069a-57e3-4ef0-8206-35a2cad119c7\") " Dec 16 07:18:34 crc kubenswrapper[4823]: I1216 07:18:34.322827 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9de2069a-57e3-4ef0-8206-35a2cad119c7-logs\") pod \"9de2069a-57e3-4ef0-8206-35a2cad119c7\" (UID: \"9de2069a-57e3-4ef0-8206-35a2cad119c7\") " Dec 16 07:18:34 crc kubenswrapper[4823]: I1216 07:18:34.322905 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9de2069a-57e3-4ef0-8206-35a2cad119c7-httpd-run\") pod \"9de2069a-57e3-4ef0-8206-35a2cad119c7\" (UID: \"9de2069a-57e3-4ef0-8206-35a2cad119c7\") " Dec 16 07:18:34 crc kubenswrapper[4823]: I1216 07:18:34.322943 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9de2069a-57e3-4ef0-8206-35a2cad119c7-combined-ca-bundle\") pod \"9de2069a-57e3-4ef0-8206-35a2cad119c7\" (UID: \"9de2069a-57e3-4ef0-8206-35a2cad119c7\") " Dec 16 07:18:34 crc kubenswrapper[4823]: I1216 07:18:34.322968 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c5e4bc29-92ee-49cb-b3c7-792d403f1afa-httpd-run\") pod \"c5e4bc29-92ee-49cb-b3c7-792d403f1afa\" (UID: \"c5e4bc29-92ee-49cb-b3c7-792d403f1afa\") " Dec 16 07:18:34 crc kubenswrapper[4823]: I1216 07:18:34.322996 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9de2069a-57e3-4ef0-8206-35a2cad119c7-scripts\") pod \"9de2069a-57e3-4ef0-8206-35a2cad119c7\" (UID: \"9de2069a-57e3-4ef0-8206-35a2cad119c7\") " Dec 16 07:18:34 crc kubenswrapper[4823]: I1216 07:18:34.324071 4823 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9de2069a-57e3-4ef0-8206-35a2cad119c7-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "9de2069a-57e3-4ef0-8206-35a2cad119c7" (UID: "9de2069a-57e3-4ef0-8206-35a2cad119c7"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:18:34 crc kubenswrapper[4823]: I1216 07:18:34.326303 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"c5e4bc29-92ee-49cb-b3c7-792d403f1afa\" (UID: \"c5e4bc29-92ee-49cb-b3c7-792d403f1afa\") " Dec 16 07:18:34 crc kubenswrapper[4823]: I1216 07:18:34.326369 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"9de2069a-57e3-4ef0-8206-35a2cad119c7\" (UID: \"9de2069a-57e3-4ef0-8206-35a2cad119c7\") " Dec 16 07:18:34 crc kubenswrapper[4823]: I1216 07:18:34.326413 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9de2069a-57e3-4ef0-8206-35a2cad119c7-config-data\") pod \"9de2069a-57e3-4ef0-8206-35a2cad119c7\" (UID: \"9de2069a-57e3-4ef0-8206-35a2cad119c7\") " Dec 16 07:18:34 crc kubenswrapper[4823]: I1216 07:18:34.326441 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5e4bc29-92ee-49cb-b3c7-792d403f1afa-config-data\") pod \"c5e4bc29-92ee-49cb-b3c7-792d403f1afa\" (UID: \"c5e4bc29-92ee-49cb-b3c7-792d403f1afa\") " Dec 16 07:18:34 crc kubenswrapper[4823]: I1216 07:18:34.326488 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5e4bc29-92ee-49cb-b3c7-792d403f1afa-internal-tls-certs\") pod \"c5e4bc29-92ee-49cb-b3c7-792d403f1afa\" (UID: 
\"c5e4bc29-92ee-49cb-b3c7-792d403f1afa\") " Dec 16 07:18:34 crc kubenswrapper[4823]: I1216 07:18:34.326522 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9de2069a-57e3-4ef0-8206-35a2cad119c7-logs" (OuterVolumeSpecName: "logs") pod "9de2069a-57e3-4ef0-8206-35a2cad119c7" (UID: "9de2069a-57e3-4ef0-8206-35a2cad119c7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:18:34 crc kubenswrapper[4823]: I1216 07:18:34.326689 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5e4bc29-92ee-49cb-b3c7-792d403f1afa-logs" (OuterVolumeSpecName: "logs") pod "c5e4bc29-92ee-49cb-b3c7-792d403f1afa" (UID: "c5e4bc29-92ee-49cb-b3c7-792d403f1afa"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:18:34 crc kubenswrapper[4823]: I1216 07:18:34.327154 4823 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c5e4bc29-92ee-49cb-b3c7-792d403f1afa-logs\") on node \"crc\" DevicePath \"\"" Dec 16 07:18:34 crc kubenswrapper[4823]: I1216 07:18:34.327178 4823 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9de2069a-57e3-4ef0-8206-35a2cad119c7-logs\") on node \"crc\" DevicePath \"\"" Dec 16 07:18:34 crc kubenswrapper[4823]: I1216 07:18:34.327190 4823 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9de2069a-57e3-4ef0-8206-35a2cad119c7-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 16 07:18:34 crc kubenswrapper[4823]: I1216 07:18:34.327215 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5e4bc29-92ee-49cb-b3c7-792d403f1afa-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "c5e4bc29-92ee-49cb-b3c7-792d403f1afa" (UID: "c5e4bc29-92ee-49cb-b3c7-792d403f1afa"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:18:34 crc kubenswrapper[4823]: I1216 07:18:34.386402 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5e4bc29-92ee-49cb-b3c7-792d403f1afa-scripts" (OuterVolumeSpecName: "scripts") pod "c5e4bc29-92ee-49cb-b3c7-792d403f1afa" (UID: "c5e4bc29-92ee-49cb-b3c7-792d403f1afa"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:18:34 crc kubenswrapper[4823]: I1216 07:18:34.386418 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9de2069a-57e3-4ef0-8206-35a2cad119c7-kube-api-access-dg9nn" (OuterVolumeSpecName: "kube-api-access-dg9nn") pod "9de2069a-57e3-4ef0-8206-35a2cad119c7" (UID: "9de2069a-57e3-4ef0-8206-35a2cad119c7"). InnerVolumeSpecName "kube-api-access-dg9nn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:18:34 crc kubenswrapper[4823]: I1216 07:18:34.386530 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5e4bc29-92ee-49cb-b3c7-792d403f1afa-kube-api-access-h4l52" (OuterVolumeSpecName: "kube-api-access-h4l52") pod "c5e4bc29-92ee-49cb-b3c7-792d403f1afa" (UID: "c5e4bc29-92ee-49cb-b3c7-792d403f1afa"). InnerVolumeSpecName "kube-api-access-h4l52". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:18:34 crc kubenswrapper[4823]: I1216 07:18:34.386566 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9de2069a-57e3-4ef0-8206-35a2cad119c7-scripts" (OuterVolumeSpecName: "scripts") pod "9de2069a-57e3-4ef0-8206-35a2cad119c7" (UID: "9de2069a-57e3-4ef0-8206-35a2cad119c7"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:18:34 crc kubenswrapper[4823]: I1216 07:18:34.386585 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "glance") pod "c5e4bc29-92ee-49cb-b3c7-792d403f1afa" (UID: "c5e4bc29-92ee-49cb-b3c7-792d403f1afa"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 16 07:18:34 crc kubenswrapper[4823]: I1216 07:18:34.387490 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance") pod "9de2069a-57e3-4ef0-8206-35a2cad119c7" (UID: "9de2069a-57e3-4ef0-8206-35a2cad119c7"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 16 07:18:34 crc kubenswrapper[4823]: I1216 07:18:34.409402 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9de2069a-57e3-4ef0-8206-35a2cad119c7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9de2069a-57e3-4ef0-8206-35a2cad119c7" (UID: "9de2069a-57e3-4ef0-8206-35a2cad119c7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:18:34 crc kubenswrapper[4823]: I1216 07:18:34.427052 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5e4bc29-92ee-49cb-b3c7-792d403f1afa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c5e4bc29-92ee-49cb-b3c7-792d403f1afa" (UID: "c5e4bc29-92ee-49cb-b3c7-792d403f1afa"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:18:34 crc kubenswrapper[4823]: I1216 07:18:34.428522 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9de2069a-57e3-4ef0-8206-35a2cad119c7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:18:34 crc kubenswrapper[4823]: I1216 07:18:34.428637 4823 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c5e4bc29-92ee-49cb-b3c7-792d403f1afa-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 16 07:18:34 crc kubenswrapper[4823]: I1216 07:18:34.428700 4823 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9de2069a-57e3-4ef0-8206-35a2cad119c7-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:18:34 crc kubenswrapper[4823]: I1216 07:18:34.428778 4823 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Dec 16 07:18:34 crc kubenswrapper[4823]: I1216 07:18:34.428857 4823 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Dec 16 07:18:34 crc kubenswrapper[4823]: I1216 07:18:34.428914 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h4l52\" (UniqueName: \"kubernetes.io/projected/c5e4bc29-92ee-49cb-b3c7-792d403f1afa-kube-api-access-h4l52\") on node \"crc\" DevicePath \"\"" Dec 16 07:18:34 crc kubenswrapper[4823]: I1216 07:18:34.428973 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dg9nn\" (UniqueName: \"kubernetes.io/projected/9de2069a-57e3-4ef0-8206-35a2cad119c7-kube-api-access-dg9nn\") on node \"crc\" DevicePath \"\"" Dec 16 07:18:34 crc kubenswrapper[4823]: I1216 07:18:34.429112 4823 reconciler_common.go:293] 
"Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5e4bc29-92ee-49cb-b3c7-792d403f1afa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:18:34 crc kubenswrapper[4823]: I1216 07:18:34.429220 4823 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5e4bc29-92ee-49cb-b3c7-792d403f1afa-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:18:34 crc kubenswrapper[4823]: I1216 07:18:34.446846 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5e4bc29-92ee-49cb-b3c7-792d403f1afa-config-data" (OuterVolumeSpecName: "config-data") pod "c5e4bc29-92ee-49cb-b3c7-792d403f1afa" (UID: "c5e4bc29-92ee-49cb-b3c7-792d403f1afa"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:18:34 crc kubenswrapper[4823]: I1216 07:18:34.451791 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9de2069a-57e3-4ef0-8206-35a2cad119c7-config-data" (OuterVolumeSpecName: "config-data") pod "9de2069a-57e3-4ef0-8206-35a2cad119c7" (UID: "9de2069a-57e3-4ef0-8206-35a2cad119c7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:18:34 crc kubenswrapper[4823]: I1216 07:18:34.453386 4823 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Dec 16 07:18:34 crc kubenswrapper[4823]: I1216 07:18:34.457372 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5e4bc29-92ee-49cb-b3c7-792d403f1afa-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "c5e4bc29-92ee-49cb-b3c7-792d403f1afa" (UID: "c5e4bc29-92ee-49cb-b3c7-792d403f1afa"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:18:34 crc kubenswrapper[4823]: I1216 07:18:34.468788 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9de2069a-57e3-4ef0-8206-35a2cad119c7-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "9de2069a-57e3-4ef0-8206-35a2cad119c7" (UID: "9de2069a-57e3-4ef0-8206-35a2cad119c7"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:18:34 crc kubenswrapper[4823]: I1216 07:18:34.470134 4823 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Dec 16 07:18:34 crc kubenswrapper[4823]: I1216 07:18:34.531540 4823 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9de2069a-57e3-4ef0-8206-35a2cad119c7-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 07:18:34 crc kubenswrapper[4823]: I1216 07:18:34.531586 4823 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Dec 16 07:18:34 crc kubenswrapper[4823]: I1216 07:18:34.531600 4823 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Dec 16 07:18:34 crc kubenswrapper[4823]: I1216 07:18:34.531612 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9de2069a-57e3-4ef0-8206-35a2cad119c7-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 07:18:34 crc kubenswrapper[4823]: I1216 07:18:34.531624 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5e4bc29-92ee-49cb-b3c7-792d403f1afa-config-data\") on node \"crc\" DevicePath \"\"" 
Dec 16 07:18:34 crc kubenswrapper[4823]: I1216 07:18:34.531636 4823 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5e4bc29-92ee-49cb-b3c7-792d403f1afa-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 07:18:34 crc kubenswrapper[4823]: E1216 07:18:34.679231 4823 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central@sha256:5a548c25fe3d02f7a042cb0a6d28fc8039a34c4a3b3d07aadda4aba3a926e777" Dec 16 07:18:34 crc kubenswrapper[4823]: E1216 07:18:34.679961 4823 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central@sha256:5a548c25fe3d02f7a042cb0a6d28fc8039a34c4a3b3d07aadda4aba3a926e777,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n4h5f5h55h87h59dh7bh67ch5f6h65ch578h648h5f9h88h55fh5fbh669hdch5f7h66bh578h5c8hf7h75h66dh5cch55dh78hf7h656h57h64ch5b7q,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,
MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qr9jj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 /var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(22c10a9c-6dba-4d35-a0d8-2ef0b82352cb): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 16 07:18:34 crc kubenswrapper[4823]: I1216 07:18:34.713615 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67fdf7998c-x4fkj" Dec 16 07:18:34 crc kubenswrapper[4823]: I1216 07:18:34.715413 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 16 07:18:34 crc kubenswrapper[4823]: I1216 07:18:34.715406 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9de2069a-57e3-4ef0-8206-35a2cad119c7","Type":"ContainerDied","Data":"62cb116a8da9ece34dcf4e86f9d102a11e975e44698a73580ce0dbc4b47d827a"} Dec 16 07:18:34 crc kubenswrapper[4823]: I1216 07:18:34.715499 4823 scope.go:117] "RemoveContainer" containerID="dcfa7f9b2e73bc5854a2733f0706c4b72ed728ec5d5a8f799030622f297dcb08" Dec 16 07:18:34 crc kubenswrapper[4823]: I1216 07:18:34.721667 4823 generic.go:334] "Generic (PLEG): container finished" podID="21cc81af-96c8-4f21-85c5-07c7b9ade605" containerID="a45c473f291f2511a351060d4ccb2b122a8889fc17f7f1e03231443022b74af9" exitCode=0 Dec 16 07:18:34 crc kubenswrapper[4823]: I1216 07:18:34.721745 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-hzvj8" event={"ID":"21cc81af-96c8-4f21-85c5-07c7b9ade605","Type":"ContainerDied","Data":"a45c473f291f2511a351060d4ccb2b122a8889fc17f7f1e03231443022b74af9"} Dec 16 07:18:34 crc kubenswrapper[4823]: I1216 07:18:34.723987 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67fdf7998c-x4fkj" event={"ID":"5282b108-1519-455e-b112-ad707af48a9f","Type":"ContainerDied","Data":"f709cd387b90c849459cf29f098990224eee24c85f315471bbe11e4fbdd027e8"} Dec 16 07:18:34 crc kubenswrapper[4823]: I1216 07:18:34.724640 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67fdf7998c-x4fkj" Dec 16 07:18:34 crc kubenswrapper[4823]: I1216 07:18:34.726760 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 16 07:18:34 crc kubenswrapper[4823]: I1216 07:18:34.727357 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c5e4bc29-92ee-49cb-b3c7-792d403f1afa","Type":"ContainerDied","Data":"e5d6cae45512b499c2872ea7038831edd94f0211be753c26f8e98bbe694a9b10"} Dec 16 07:18:34 crc kubenswrapper[4823]: E1216 07:18:34.727845 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:b59b7445e581cc720038107e421371c86c5765b2967e77d884ef29b1d9fd0f49\\\"\"" pod="openstack/cinder-db-sync-n2br8" podUID="fcd5e697-1360-4376-8160-ba0bc7fa56f8" Dec 16 07:18:34 crc kubenswrapper[4823]: I1216 07:18:34.777220 4823 scope.go:117] "RemoveContainer" containerID="404b52af3ac618c4c677f8884891b92486537bbc5c3934501fd1e30ef876f5c2" Dec 16 07:18:34 crc kubenswrapper[4823]: I1216 07:18:34.835320 4823 scope.go:117] "RemoveContainer" containerID="7b7c792a68d4e76b92c312443198e54881b3f7fc18a5ddf4d23f57980f79af89" Dec 16 07:18:34 crc kubenswrapper[4823]: I1216 07:18:34.835544 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 16 07:18:34 crc kubenswrapper[4823]: I1216 07:18:34.838214 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5282b108-1519-455e-b112-ad707af48a9f-ovsdbserver-sb\") pod \"5282b108-1519-455e-b112-ad707af48a9f\" (UID: \"5282b108-1519-455e-b112-ad707af48a9f\") " Dec 16 07:18:34 crc kubenswrapper[4823]: I1216 07:18:34.838583 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q2728\" (UniqueName: \"kubernetes.io/projected/5282b108-1519-455e-b112-ad707af48a9f-kube-api-access-q2728\") pod 
\"5282b108-1519-455e-b112-ad707af48a9f\" (UID: \"5282b108-1519-455e-b112-ad707af48a9f\") " Dec 16 07:18:34 crc kubenswrapper[4823]: I1216 07:18:34.839456 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5282b108-1519-455e-b112-ad707af48a9f-dns-svc\") pod \"5282b108-1519-455e-b112-ad707af48a9f\" (UID: \"5282b108-1519-455e-b112-ad707af48a9f\") " Dec 16 07:18:34 crc kubenswrapper[4823]: I1216 07:18:34.850437 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5282b108-1519-455e-b112-ad707af48a9f-config\") pod \"5282b108-1519-455e-b112-ad707af48a9f\" (UID: \"5282b108-1519-455e-b112-ad707af48a9f\") " Dec 16 07:18:34 crc kubenswrapper[4823]: I1216 07:18:34.850645 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5282b108-1519-455e-b112-ad707af48a9f-ovsdbserver-nb\") pod \"5282b108-1519-455e-b112-ad707af48a9f\" (UID: \"5282b108-1519-455e-b112-ad707af48a9f\") " Dec 16 07:18:34 crc kubenswrapper[4823]: I1216 07:18:34.858699 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5282b108-1519-455e-b112-ad707af48a9f-kube-api-access-q2728" (OuterVolumeSpecName: "kube-api-access-q2728") pod "5282b108-1519-455e-b112-ad707af48a9f" (UID: "5282b108-1519-455e-b112-ad707af48a9f"). InnerVolumeSpecName "kube-api-access-q2728". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:18:34 crc kubenswrapper[4823]: I1216 07:18:34.867017 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q2728\" (UniqueName: \"kubernetes.io/projected/5282b108-1519-455e-b112-ad707af48a9f-kube-api-access-q2728\") on node \"crc\" DevicePath \"\"" Dec 16 07:18:34 crc kubenswrapper[4823]: I1216 07:18:34.883992 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 16 07:18:34 crc kubenswrapper[4823]: I1216 07:18:34.909793 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 16 07:18:34 crc kubenswrapper[4823]: I1216 07:18:34.911704 4823 scope.go:117] "RemoveContainer" containerID="6ee1a67603123534c21fbda13fcc80a4d47b24f3d7820be62d68dacfc183a448" Dec 16 07:18:34 crc kubenswrapper[4823]: I1216 07:18:34.918483 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5282b108-1519-455e-b112-ad707af48a9f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5282b108-1519-455e-b112-ad707af48a9f" (UID: "5282b108-1519-455e-b112-ad707af48a9f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:18:34 crc kubenswrapper[4823]: I1216 07:18:34.923679 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5282b108-1519-455e-b112-ad707af48a9f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5282b108-1519-455e-b112-ad707af48a9f" (UID: "5282b108-1519-455e-b112-ad707af48a9f"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:18:34 crc kubenswrapper[4823]: I1216 07:18:34.932720 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5282b108-1519-455e-b112-ad707af48a9f-config" (OuterVolumeSpecName: "config") pod "5282b108-1519-455e-b112-ad707af48a9f" (UID: "5282b108-1519-455e-b112-ad707af48a9f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:18:34 crc kubenswrapper[4823]: I1216 07:18:34.932954 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 16 07:18:34 crc kubenswrapper[4823]: I1216 07:18:34.939963 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5282b108-1519-455e-b112-ad707af48a9f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5282b108-1519-455e-b112-ad707af48a9f" (UID: "5282b108-1519-455e-b112-ad707af48a9f"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:18:34 crc kubenswrapper[4823]: I1216 07:18:34.955933 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 16 07:18:34 crc kubenswrapper[4823]: E1216 07:18:34.956343 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9de2069a-57e3-4ef0-8206-35a2cad119c7" containerName="glance-log" Dec 16 07:18:34 crc kubenswrapper[4823]: I1216 07:18:34.956357 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="9de2069a-57e3-4ef0-8206-35a2cad119c7" containerName="glance-log" Dec 16 07:18:34 crc kubenswrapper[4823]: E1216 07:18:34.956375 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5e4bc29-92ee-49cb-b3c7-792d403f1afa" containerName="glance-httpd" Dec 16 07:18:34 crc kubenswrapper[4823]: I1216 07:18:34.956381 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5e4bc29-92ee-49cb-b3c7-792d403f1afa" containerName="glance-httpd" Dec 16 07:18:34 crc kubenswrapper[4823]: E1216 07:18:34.956396 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5e4bc29-92ee-49cb-b3c7-792d403f1afa" containerName="glance-log" Dec 16 07:18:34 crc kubenswrapper[4823]: I1216 07:18:34.956402 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5e4bc29-92ee-49cb-b3c7-792d403f1afa" containerName="glance-log" Dec 16 07:18:34 crc kubenswrapper[4823]: E1216 07:18:34.956408 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9de2069a-57e3-4ef0-8206-35a2cad119c7" containerName="glance-httpd" Dec 16 07:18:34 crc kubenswrapper[4823]: I1216 07:18:34.956414 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="9de2069a-57e3-4ef0-8206-35a2cad119c7" containerName="glance-httpd" Dec 16 07:18:34 crc kubenswrapper[4823]: E1216 07:18:34.956437 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5282b108-1519-455e-b112-ad707af48a9f" containerName="dnsmasq-dns" Dec 16 07:18:34 crc 
kubenswrapper[4823]: I1216 07:18:34.956442 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="5282b108-1519-455e-b112-ad707af48a9f" containerName="dnsmasq-dns" Dec 16 07:18:34 crc kubenswrapper[4823]: E1216 07:18:34.956450 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5282b108-1519-455e-b112-ad707af48a9f" containerName="init" Dec 16 07:18:34 crc kubenswrapper[4823]: I1216 07:18:34.956456 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="5282b108-1519-455e-b112-ad707af48a9f" containerName="init" Dec 16 07:18:34 crc kubenswrapper[4823]: I1216 07:18:34.956609 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="9de2069a-57e3-4ef0-8206-35a2cad119c7" containerName="glance-log" Dec 16 07:18:34 crc kubenswrapper[4823]: I1216 07:18:34.956624 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5e4bc29-92ee-49cb-b3c7-792d403f1afa" containerName="glance-httpd" Dec 16 07:18:34 crc kubenswrapper[4823]: I1216 07:18:34.956635 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="9de2069a-57e3-4ef0-8206-35a2cad119c7" containerName="glance-httpd" Dec 16 07:18:34 crc kubenswrapper[4823]: I1216 07:18:34.956646 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="5282b108-1519-455e-b112-ad707af48a9f" containerName="dnsmasq-dns" Dec 16 07:18:34 crc kubenswrapper[4823]: I1216 07:18:34.956657 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5e4bc29-92ee-49cb-b3c7-792d403f1afa" containerName="glance-log" Dec 16 07:18:34 crc kubenswrapper[4823]: I1216 07:18:34.957624 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 16 07:18:34 crc kubenswrapper[4823]: I1216 07:18:34.963276 4823 scope.go:117] "RemoveContainer" containerID="ada25730040dec0b01392bc063d16f90552718206b331be4ad1934eef2f28496" Dec 16 07:18:34 crc kubenswrapper[4823]: I1216 07:18:34.964225 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Dec 16 07:18:34 crc kubenswrapper[4823]: I1216 07:18:34.964864 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-8pjhw" Dec 16 07:18:34 crc kubenswrapper[4823]: I1216 07:18:34.964928 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 16 07:18:34 crc kubenswrapper[4823]: I1216 07:18:34.964983 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 16 07:18:34 crc kubenswrapper[4823]: I1216 07:18:34.970289 4823 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5282b108-1519-455e-b112-ad707af48a9f-config\") on node \"crc\" DevicePath \"\"" Dec 16 07:18:34 crc kubenswrapper[4823]: I1216 07:18:34.970310 4823 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5282b108-1519-455e-b112-ad707af48a9f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 16 07:18:34 crc kubenswrapper[4823]: I1216 07:18:34.970320 4823 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5282b108-1519-455e-b112-ad707af48a9f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 16 07:18:34 crc kubenswrapper[4823]: I1216 07:18:34.970329 4823 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5282b108-1519-455e-b112-ad707af48a9f-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 16 07:18:34 crc 
kubenswrapper[4823]: I1216 07:18:34.983699 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 16 07:18:35 crc kubenswrapper[4823]: I1216 07:18:35.019281 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 16 07:18:35 crc kubenswrapper[4823]: I1216 07:18:35.023682 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 16 07:18:35 crc kubenswrapper[4823]: I1216 07:18:35.027387 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 16 07:18:35 crc kubenswrapper[4823]: I1216 07:18:35.031872 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 16 07:18:35 crc kubenswrapper[4823]: I1216 07:18:35.031891 4823 scope.go:117] "RemoveContainer" containerID="b6404971ffd3806b279fe73ea8a60a61baa7a8fbbe8d84fef6f66440c2a70b53" Dec 16 07:18:35 crc kubenswrapper[4823]: I1216 07:18:35.059262 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 16 07:18:35 crc kubenswrapper[4823]: I1216 07:18:35.072057 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/efa3cd8b-aa5f-4769-a8aa-801716fa389c-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"efa3cd8b-aa5f-4769-a8aa-801716fa389c\") " pod="openstack/glance-default-external-api-0" Dec 16 07:18:35 crc kubenswrapper[4823]: I1216 07:18:35.072142 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jg5ss\" (UniqueName: \"kubernetes.io/projected/efa3cd8b-aa5f-4769-a8aa-801716fa389c-kube-api-access-jg5ss\") pod \"glance-default-external-api-0\" (UID: \"efa3cd8b-aa5f-4769-a8aa-801716fa389c\") " 
pod="openstack/glance-default-external-api-0" Dec 16 07:18:35 crc kubenswrapper[4823]: I1216 07:18:35.072170 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/efa3cd8b-aa5f-4769-a8aa-801716fa389c-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"efa3cd8b-aa5f-4769-a8aa-801716fa389c\") " pod="openstack/glance-default-external-api-0" Dec 16 07:18:35 crc kubenswrapper[4823]: I1216 07:18:35.072193 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"efa3cd8b-aa5f-4769-a8aa-801716fa389c\") " pod="openstack/glance-default-external-api-0" Dec 16 07:18:35 crc kubenswrapper[4823]: I1216 07:18:35.072259 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efa3cd8b-aa5f-4769-a8aa-801716fa389c-config-data\") pod \"glance-default-external-api-0\" (UID: \"efa3cd8b-aa5f-4769-a8aa-801716fa389c\") " pod="openstack/glance-default-external-api-0" Dec 16 07:18:35 crc kubenswrapper[4823]: I1216 07:18:35.072278 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efa3cd8b-aa5f-4769-a8aa-801716fa389c-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"efa3cd8b-aa5f-4769-a8aa-801716fa389c\") " pod="openstack/glance-default-external-api-0" Dec 16 07:18:35 crc kubenswrapper[4823]: I1216 07:18:35.072303 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/efa3cd8b-aa5f-4769-a8aa-801716fa389c-scripts\") pod \"glance-default-external-api-0\" (UID: \"efa3cd8b-aa5f-4769-a8aa-801716fa389c\") " 
pod="openstack/glance-default-external-api-0" Dec 16 07:18:35 crc kubenswrapper[4823]: I1216 07:18:35.072324 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/efa3cd8b-aa5f-4769-a8aa-801716fa389c-logs\") pod \"glance-default-external-api-0\" (UID: \"efa3cd8b-aa5f-4769-a8aa-801716fa389c\") " pod="openstack/glance-default-external-api-0" Dec 16 07:18:35 crc kubenswrapper[4823]: I1216 07:18:35.126504 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67fdf7998c-x4fkj"] Dec 16 07:18:35 crc kubenswrapper[4823]: I1216 07:18:35.139387 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-67fdf7998c-x4fkj"] Dec 16 07:18:35 crc kubenswrapper[4823]: I1216 07:18:35.174304 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"f85fada8-dce0-4e2b-83f5-ebf4f6fe2c80\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:18:35 crc kubenswrapper[4823]: I1216 07:18:35.174348 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f85fada8-dce0-4e2b-83f5-ebf4f6fe2c80-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"f85fada8-dce0-4e2b-83f5-ebf4f6fe2c80\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:18:35 crc kubenswrapper[4823]: I1216 07:18:35.174378 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efa3cd8b-aa5f-4769-a8aa-801716fa389c-config-data\") pod \"glance-default-external-api-0\" (UID: \"efa3cd8b-aa5f-4769-a8aa-801716fa389c\") " pod="openstack/glance-default-external-api-0" Dec 16 07:18:35 crc kubenswrapper[4823]: I1216 07:18:35.174399 4823 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efa3cd8b-aa5f-4769-a8aa-801716fa389c-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"efa3cd8b-aa5f-4769-a8aa-801716fa389c\") " pod="openstack/glance-default-external-api-0" Dec 16 07:18:35 crc kubenswrapper[4823]: I1216 07:18:35.174427 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/efa3cd8b-aa5f-4769-a8aa-801716fa389c-scripts\") pod \"glance-default-external-api-0\" (UID: \"efa3cd8b-aa5f-4769-a8aa-801716fa389c\") " pod="openstack/glance-default-external-api-0" Dec 16 07:18:35 crc kubenswrapper[4823]: I1216 07:18:35.174450 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/efa3cd8b-aa5f-4769-a8aa-801716fa389c-logs\") pod \"glance-default-external-api-0\" (UID: \"efa3cd8b-aa5f-4769-a8aa-801716fa389c\") " pod="openstack/glance-default-external-api-0" Dec 16 07:18:35 crc kubenswrapper[4823]: I1216 07:18:35.174479 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nz4ls\" (UniqueName: \"kubernetes.io/projected/f85fada8-dce0-4e2b-83f5-ebf4f6fe2c80-kube-api-access-nz4ls\") pod \"glance-default-internal-api-0\" (UID: \"f85fada8-dce0-4e2b-83f5-ebf4f6fe2c80\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:18:35 crc kubenswrapper[4823]: I1216 07:18:35.174504 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f85fada8-dce0-4e2b-83f5-ebf4f6fe2c80-logs\") pod \"glance-default-internal-api-0\" (UID: \"f85fada8-dce0-4e2b-83f5-ebf4f6fe2c80\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:18:35 crc kubenswrapper[4823]: I1216 07:18:35.174535 4823 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f85fada8-dce0-4e2b-83f5-ebf4f6fe2c80-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"f85fada8-dce0-4e2b-83f5-ebf4f6fe2c80\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:18:35 crc kubenswrapper[4823]: I1216 07:18:35.174562 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f85fada8-dce0-4e2b-83f5-ebf4f6fe2c80-scripts\") pod \"glance-default-internal-api-0\" (UID: \"f85fada8-dce0-4e2b-83f5-ebf4f6fe2c80\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:18:35 crc kubenswrapper[4823]: I1216 07:18:35.174590 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f85fada8-dce0-4e2b-83f5-ebf4f6fe2c80-config-data\") pod \"glance-default-internal-api-0\" (UID: \"f85fada8-dce0-4e2b-83f5-ebf4f6fe2c80\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:18:35 crc kubenswrapper[4823]: I1216 07:18:35.174628 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/efa3cd8b-aa5f-4769-a8aa-801716fa389c-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"efa3cd8b-aa5f-4769-a8aa-801716fa389c\") " pod="openstack/glance-default-external-api-0" Dec 16 07:18:35 crc kubenswrapper[4823]: I1216 07:18:35.174676 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jg5ss\" (UniqueName: \"kubernetes.io/projected/efa3cd8b-aa5f-4769-a8aa-801716fa389c-kube-api-access-jg5ss\") pod \"glance-default-external-api-0\" (UID: \"efa3cd8b-aa5f-4769-a8aa-801716fa389c\") " pod="openstack/glance-default-external-api-0" Dec 16 07:18:35 crc kubenswrapper[4823]: I1216 07:18:35.174704 4823 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f85fada8-dce0-4e2b-83f5-ebf4f6fe2c80-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"f85fada8-dce0-4e2b-83f5-ebf4f6fe2c80\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:18:35 crc kubenswrapper[4823]: I1216 07:18:35.174726 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/efa3cd8b-aa5f-4769-a8aa-801716fa389c-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"efa3cd8b-aa5f-4769-a8aa-801716fa389c\") " pod="openstack/glance-default-external-api-0" Dec 16 07:18:35 crc kubenswrapper[4823]: I1216 07:18:35.174749 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"efa3cd8b-aa5f-4769-a8aa-801716fa389c\") " pod="openstack/glance-default-external-api-0" Dec 16 07:18:35 crc kubenswrapper[4823]: I1216 07:18:35.175068 4823 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"efa3cd8b-aa5f-4769-a8aa-801716fa389c\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/glance-default-external-api-0" Dec 16 07:18:35 crc kubenswrapper[4823]: I1216 07:18:35.178199 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/efa3cd8b-aa5f-4769-a8aa-801716fa389c-logs\") pod \"glance-default-external-api-0\" (UID: \"efa3cd8b-aa5f-4769-a8aa-801716fa389c\") " pod="openstack/glance-default-external-api-0" Dec 16 07:18:35 crc kubenswrapper[4823]: I1216 07:18:35.178644 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/efa3cd8b-aa5f-4769-a8aa-801716fa389c-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"efa3cd8b-aa5f-4769-a8aa-801716fa389c\") " pod="openstack/glance-default-external-api-0" Dec 16 07:18:35 crc kubenswrapper[4823]: I1216 07:18:35.180499 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efa3cd8b-aa5f-4769-a8aa-801716fa389c-config-data\") pod \"glance-default-external-api-0\" (UID: \"efa3cd8b-aa5f-4769-a8aa-801716fa389c\") " pod="openstack/glance-default-external-api-0" Dec 16 07:18:35 crc kubenswrapper[4823]: I1216 07:18:35.182792 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efa3cd8b-aa5f-4769-a8aa-801716fa389c-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"efa3cd8b-aa5f-4769-a8aa-801716fa389c\") " pod="openstack/glance-default-external-api-0" Dec 16 07:18:35 crc kubenswrapper[4823]: I1216 07:18:35.184921 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/efa3cd8b-aa5f-4769-a8aa-801716fa389c-scripts\") pod \"glance-default-external-api-0\" (UID: \"efa3cd8b-aa5f-4769-a8aa-801716fa389c\") " pod="openstack/glance-default-external-api-0" Dec 16 07:18:35 crc kubenswrapper[4823]: I1216 07:18:35.187501 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/efa3cd8b-aa5f-4769-a8aa-801716fa389c-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"efa3cd8b-aa5f-4769-a8aa-801716fa389c\") " pod="openstack/glance-default-external-api-0" Dec 16 07:18:35 crc kubenswrapper[4823]: I1216 07:18:35.195784 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jg5ss\" (UniqueName: 
\"kubernetes.io/projected/efa3cd8b-aa5f-4769-a8aa-801716fa389c-kube-api-access-jg5ss\") pod \"glance-default-external-api-0\" (UID: \"efa3cd8b-aa5f-4769-a8aa-801716fa389c\") " pod="openstack/glance-default-external-api-0" Dec 16 07:18:35 crc kubenswrapper[4823]: I1216 07:18:35.208665 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"efa3cd8b-aa5f-4769-a8aa-801716fa389c\") " pod="openstack/glance-default-external-api-0" Dec 16 07:18:35 crc kubenswrapper[4823]: I1216 07:18:35.276441 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nz4ls\" (UniqueName: \"kubernetes.io/projected/f85fada8-dce0-4e2b-83f5-ebf4f6fe2c80-kube-api-access-nz4ls\") pod \"glance-default-internal-api-0\" (UID: \"f85fada8-dce0-4e2b-83f5-ebf4f6fe2c80\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:18:35 crc kubenswrapper[4823]: I1216 07:18:35.276526 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f85fada8-dce0-4e2b-83f5-ebf4f6fe2c80-logs\") pod \"glance-default-internal-api-0\" (UID: \"f85fada8-dce0-4e2b-83f5-ebf4f6fe2c80\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:18:35 crc kubenswrapper[4823]: I1216 07:18:35.276567 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f85fada8-dce0-4e2b-83f5-ebf4f6fe2c80-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"f85fada8-dce0-4e2b-83f5-ebf4f6fe2c80\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:18:35 crc kubenswrapper[4823]: I1216 07:18:35.276597 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f85fada8-dce0-4e2b-83f5-ebf4f6fe2c80-scripts\") pod 
\"glance-default-internal-api-0\" (UID: \"f85fada8-dce0-4e2b-83f5-ebf4f6fe2c80\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:18:35 crc kubenswrapper[4823]: I1216 07:18:35.276633 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f85fada8-dce0-4e2b-83f5-ebf4f6fe2c80-config-data\") pod \"glance-default-internal-api-0\" (UID: \"f85fada8-dce0-4e2b-83f5-ebf4f6fe2c80\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:18:35 crc kubenswrapper[4823]: I1216 07:18:35.276714 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f85fada8-dce0-4e2b-83f5-ebf4f6fe2c80-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"f85fada8-dce0-4e2b-83f5-ebf4f6fe2c80\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:18:35 crc kubenswrapper[4823]: I1216 07:18:35.276796 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"f85fada8-dce0-4e2b-83f5-ebf4f6fe2c80\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:18:35 crc kubenswrapper[4823]: I1216 07:18:35.276822 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f85fada8-dce0-4e2b-83f5-ebf4f6fe2c80-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"f85fada8-dce0-4e2b-83f5-ebf4f6fe2c80\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:18:35 crc kubenswrapper[4823]: I1216 07:18:35.277467 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f85fada8-dce0-4e2b-83f5-ebf4f6fe2c80-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"f85fada8-dce0-4e2b-83f5-ebf4f6fe2c80\") " 
pod="openstack/glance-default-internal-api-0" Dec 16 07:18:35 crc kubenswrapper[4823]: I1216 07:18:35.278002 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f85fada8-dce0-4e2b-83f5-ebf4f6fe2c80-logs\") pod \"glance-default-internal-api-0\" (UID: \"f85fada8-dce0-4e2b-83f5-ebf4f6fe2c80\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:18:35 crc kubenswrapper[4823]: I1216 07:18:35.280767 4823 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"f85fada8-dce0-4e2b-83f5-ebf4f6fe2c80\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-internal-api-0" Dec 16 07:18:35 crc kubenswrapper[4823]: I1216 07:18:35.286736 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f85fada8-dce0-4e2b-83f5-ebf4f6fe2c80-config-data\") pod \"glance-default-internal-api-0\" (UID: \"f85fada8-dce0-4e2b-83f5-ebf4f6fe2c80\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:18:35 crc kubenswrapper[4823]: I1216 07:18:35.286817 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f85fada8-dce0-4e2b-83f5-ebf4f6fe2c80-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"f85fada8-dce0-4e2b-83f5-ebf4f6fe2c80\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:18:35 crc kubenswrapper[4823]: I1216 07:18:35.287313 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f85fada8-dce0-4e2b-83f5-ebf4f6fe2c80-scripts\") pod \"glance-default-internal-api-0\" (UID: \"f85fada8-dce0-4e2b-83f5-ebf4f6fe2c80\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:18:35 crc kubenswrapper[4823]: I1216 
07:18:35.291389 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f85fada8-dce0-4e2b-83f5-ebf4f6fe2c80-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"f85fada8-dce0-4e2b-83f5-ebf4f6fe2c80\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:18:35 crc kubenswrapper[4823]: I1216 07:18:35.299864 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-np6b5"] Dec 16 07:18:35 crc kubenswrapper[4823]: I1216 07:18:35.304873 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"f85fada8-dce0-4e2b-83f5-ebf4f6fe2c80\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:18:35 crc kubenswrapper[4823]: I1216 07:18:35.307504 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 16 07:18:35 crc kubenswrapper[4823]: I1216 07:18:35.320105 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nz4ls\" (UniqueName: \"kubernetes.io/projected/f85fada8-dce0-4e2b-83f5-ebf4f6fe2c80-kube-api-access-nz4ls\") pod \"glance-default-internal-api-0\" (UID: \"f85fada8-dce0-4e2b-83f5-ebf4f6fe2c80\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:18:35 crc kubenswrapper[4823]: W1216 07:18:35.328788 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9dca6476_18f2_4c1a_8c95_e894c5f9facd.slice/crio-c637a3efacc029888e5e245c83a83ee6e3f12bf8f08992e439e061e4db9076eb WatchSource:0}: Error finding container c637a3efacc029888e5e245c83a83ee6e3f12bf8f08992e439e061e4db9076eb: Status 404 returned error can't find the container with id c637a3efacc029888e5e245c83a83ee6e3f12bf8f08992e439e061e4db9076eb Dec 16 07:18:35 
crc kubenswrapper[4823]: I1216 07:18:35.356205 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 16 07:18:35 crc kubenswrapper[4823]: I1216 07:18:35.364079 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 16 07:18:35 crc kubenswrapper[4823]: I1216 07:18:35.741989 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-np6b5" event={"ID":"9dca6476-18f2-4c1a-8c95-e894c5f9facd","Type":"ContainerStarted","Data":"0697000ce67a8cd70a4f5cf3424d1c8f80091a2488379262811ac7ef93a7f556"} Dec 16 07:18:35 crc kubenswrapper[4823]: I1216 07:18:35.742044 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-np6b5" event={"ID":"9dca6476-18f2-4c1a-8c95-e894c5f9facd","Type":"ContainerStarted","Data":"c637a3efacc029888e5e245c83a83ee6e3f12bf8f08992e439e061e4db9076eb"} Dec 16 07:18:35 crc kubenswrapper[4823]: I1216 07:18:35.754125 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-7mm88" event={"ID":"34693374-b301-47b2-b909-b5b93fd96fd0","Type":"ContainerStarted","Data":"1b71cb799085b8870997294e24797bb13ae088e514d562be9045f395f4dd9211"} Dec 16 07:18:35 crc kubenswrapper[4823]: I1216 07:18:35.775882 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-np6b5" podStartSLOduration=14.775861079 podStartE2EDuration="14.775861079s" podCreationTimestamp="2025-12-16 07:18:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 07:18:35.767102054 +0000 UTC m=+1394.255668177" watchObservedRunningTime="2025-12-16 07:18:35.775861079 +0000 UTC m=+1394.264427202" Dec 16 07:18:35 crc kubenswrapper[4823]: I1216 07:18:35.790860 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-7mm88" 
podStartSLOduration=2.642312752 podStartE2EDuration="24.790840267s" podCreationTimestamp="2025-12-16 07:18:11 +0000 UTC" firstStartedPulling="2025-12-16 07:18:12.619757655 +0000 UTC m=+1371.108323778" lastFinishedPulling="2025-12-16 07:18:34.76828515 +0000 UTC m=+1393.256851293" observedRunningTime="2025-12-16 07:18:35.788765853 +0000 UTC m=+1394.277331976" watchObservedRunningTime="2025-12-16 07:18:35.790840267 +0000 UTC m=+1394.279406380" Dec 16 07:18:35 crc kubenswrapper[4823]: I1216 07:18:35.807062 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5282b108-1519-455e-b112-ad707af48a9f" path="/var/lib/kubelet/pods/5282b108-1519-455e-b112-ad707af48a9f/volumes" Dec 16 07:18:35 crc kubenswrapper[4823]: I1216 07:18:35.807892 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9de2069a-57e3-4ef0-8206-35a2cad119c7" path="/var/lib/kubelet/pods/9de2069a-57e3-4ef0-8206-35a2cad119c7/volumes" Dec 16 07:18:35 crc kubenswrapper[4823]: I1216 07:18:35.809017 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5e4bc29-92ee-49cb-b3c7-792d403f1afa" path="/var/lib/kubelet/pods/c5e4bc29-92ee-49cb-b3c7-792d403f1afa/volumes" Dec 16 07:18:35 crc kubenswrapper[4823]: I1216 07:18:35.929248 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 16 07:18:36 crc kubenswrapper[4823]: I1216 07:18:36.043461 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 16 07:18:36 crc kubenswrapper[4823]: W1216 07:18:36.046950 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf85fada8_dce0_4e2b_83f5_ebf4f6fe2c80.slice/crio-97576eb822d67e73d6511817b900348abb68bbe0055fbd5f9f05edb9efa1d245 WatchSource:0}: Error finding container 97576eb822d67e73d6511817b900348abb68bbe0055fbd5f9f05edb9efa1d245: Status 404 returned error can't find the container with 
id 97576eb822d67e73d6511817b900348abb68bbe0055fbd5f9f05edb9efa1d245 Dec 16 07:18:36 crc kubenswrapper[4823]: I1216 07:18:36.185936 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-hzvj8" Dec 16 07:18:36 crc kubenswrapper[4823]: I1216 07:18:36.302683 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/21cc81af-96c8-4f21-85c5-07c7b9ade605-config\") pod \"21cc81af-96c8-4f21-85c5-07c7b9ade605\" (UID: \"21cc81af-96c8-4f21-85c5-07c7b9ade605\") " Dec 16 07:18:36 crc kubenswrapper[4823]: I1216 07:18:36.302809 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f6s92\" (UniqueName: \"kubernetes.io/projected/21cc81af-96c8-4f21-85c5-07c7b9ade605-kube-api-access-f6s92\") pod \"21cc81af-96c8-4f21-85c5-07c7b9ade605\" (UID: \"21cc81af-96c8-4f21-85c5-07c7b9ade605\") " Dec 16 07:18:36 crc kubenswrapper[4823]: I1216 07:18:36.302880 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21cc81af-96c8-4f21-85c5-07c7b9ade605-combined-ca-bundle\") pod \"21cc81af-96c8-4f21-85c5-07c7b9ade605\" (UID: \"21cc81af-96c8-4f21-85c5-07c7b9ade605\") " Dec 16 07:18:36 crc kubenswrapper[4823]: I1216 07:18:36.308657 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21cc81af-96c8-4f21-85c5-07c7b9ade605-kube-api-access-f6s92" (OuterVolumeSpecName: "kube-api-access-f6s92") pod "21cc81af-96c8-4f21-85c5-07c7b9ade605" (UID: "21cc81af-96c8-4f21-85c5-07c7b9ade605"). InnerVolumeSpecName "kube-api-access-f6s92". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:18:36 crc kubenswrapper[4823]: I1216 07:18:36.333433 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21cc81af-96c8-4f21-85c5-07c7b9ade605-config" (OuterVolumeSpecName: "config") pod "21cc81af-96c8-4f21-85c5-07c7b9ade605" (UID: "21cc81af-96c8-4f21-85c5-07c7b9ade605"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:18:36 crc kubenswrapper[4823]: I1216 07:18:36.338088 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21cc81af-96c8-4f21-85c5-07c7b9ade605-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "21cc81af-96c8-4f21-85c5-07c7b9ade605" (UID: "21cc81af-96c8-4f21-85c5-07c7b9ade605"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:18:36 crc kubenswrapper[4823]: I1216 07:18:36.405677 4823 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/21cc81af-96c8-4f21-85c5-07c7b9ade605-config\") on node \"crc\" DevicePath \"\"" Dec 16 07:18:36 crc kubenswrapper[4823]: I1216 07:18:36.405750 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f6s92\" (UniqueName: \"kubernetes.io/projected/21cc81af-96c8-4f21-85c5-07c7b9ade605-kube-api-access-f6s92\") on node \"crc\" DevicePath \"\"" Dec 16 07:18:36 crc kubenswrapper[4823]: I1216 07:18:36.406300 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21cc81af-96c8-4f21-85c5-07c7b9ade605-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:18:36 crc kubenswrapper[4823]: I1216 07:18:36.526703 4823 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-67fdf7998c-x4fkj" podUID="5282b108-1519-455e-b112-ad707af48a9f" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 
10.217.0.118:5353: i/o timeout" Dec 16 07:18:36 crc kubenswrapper[4823]: I1216 07:18:36.762329 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"efa3cd8b-aa5f-4769-a8aa-801716fa389c","Type":"ContainerStarted","Data":"1b8beada80ec38510530fbd0b46a6f089e5e02413dd039fc19808fe329678851"} Dec 16 07:18:36 crc kubenswrapper[4823]: I1216 07:18:36.763331 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f85fada8-dce0-4e2b-83f5-ebf4f6fe2c80","Type":"ContainerStarted","Data":"97576eb822d67e73d6511817b900348abb68bbe0055fbd5f9f05edb9efa1d245"} Dec 16 07:18:36 crc kubenswrapper[4823]: I1216 07:18:36.765720 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-hzvj8" Dec 16 07:18:36 crc kubenswrapper[4823]: I1216 07:18:36.765824 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-hzvj8" event={"ID":"21cc81af-96c8-4f21-85c5-07c7b9ade605","Type":"ContainerDied","Data":"253a13f7be04fe99c2eb8daed173bb3ca9385725843b1241ef7ec78c5d5ff278"} Dec 16 07:18:36 crc kubenswrapper[4823]: I1216 07:18:36.765846 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="253a13f7be04fe99c2eb8daed173bb3ca9385725843b1241ef7ec78c5d5ff278" Dec 16 07:18:37 crc kubenswrapper[4823]: I1216 07:18:37.075438 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-685444497c-mb8qw"] Dec 16 07:18:37 crc kubenswrapper[4823]: E1216 07:18:37.075911 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21cc81af-96c8-4f21-85c5-07c7b9ade605" containerName="neutron-db-sync" Dec 16 07:18:37 crc kubenswrapper[4823]: I1216 07:18:37.075935 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="21cc81af-96c8-4f21-85c5-07c7b9ade605" containerName="neutron-db-sync" Dec 16 07:18:37 crc kubenswrapper[4823]: I1216 07:18:37.076189 4823 
memory_manager.go:354] "RemoveStaleState removing state" podUID="21cc81af-96c8-4f21-85c5-07c7b9ade605" containerName="neutron-db-sync" Dec 16 07:18:37 crc kubenswrapper[4823]: I1216 07:18:37.077376 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-685444497c-mb8qw" Dec 16 07:18:37 crc kubenswrapper[4823]: I1216 07:18:37.084034 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-685444497c-mb8qw"] Dec 16 07:18:37 crc kubenswrapper[4823]: I1216 07:18:37.114980 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-765f8bc948-dqt65"] Dec 16 07:18:37 crc kubenswrapper[4823]: I1216 07:18:37.116329 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-765f8bc948-dqt65" Dec 16 07:18:37 crc kubenswrapper[4823]: I1216 07:18:37.139289 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Dec 16 07:18:37 crc kubenswrapper[4823]: I1216 07:18:37.139502 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 16 07:18:37 crc kubenswrapper[4823]: I1216 07:18:37.139619 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 16 07:18:37 crc kubenswrapper[4823]: I1216 07:18:37.161135 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-kcfc5" Dec 16 07:18:37 crc kubenswrapper[4823]: I1216 07:18:37.180501 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-765f8bc948-dqt65"] Dec 16 07:18:37 crc kubenswrapper[4823]: I1216 07:18:37.220815 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ebefd0b6-7523-402f-8952-76a232986c74-config\") pod \"neutron-765f8bc948-dqt65\" (UID: \"ebefd0b6-7523-402f-8952-76a232986c74\") " 
pod="openstack/neutron-765f8bc948-dqt65" Dec 16 07:18:37 crc kubenswrapper[4823]: I1216 07:18:37.220907 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7dzc\" (UniqueName: \"kubernetes.io/projected/ebefd0b6-7523-402f-8952-76a232986c74-kube-api-access-s7dzc\") pod \"neutron-765f8bc948-dqt65\" (UID: \"ebefd0b6-7523-402f-8952-76a232986c74\") " pod="openstack/neutron-765f8bc948-dqt65" Dec 16 07:18:37 crc kubenswrapper[4823]: I1216 07:18:37.220959 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c0d41fd9-3e78-4e48-ba89-6acc88459df8-dns-swift-storage-0\") pod \"dnsmasq-dns-685444497c-mb8qw\" (UID: \"c0d41fd9-3e78-4e48-ba89-6acc88459df8\") " pod="openstack/dnsmasq-dns-685444497c-mb8qw" Dec 16 07:18:37 crc kubenswrapper[4823]: I1216 07:18:37.221058 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0d41fd9-3e78-4e48-ba89-6acc88459df8-config\") pod \"dnsmasq-dns-685444497c-mb8qw\" (UID: \"c0d41fd9-3e78-4e48-ba89-6acc88459df8\") " pod="openstack/dnsmasq-dns-685444497c-mb8qw" Dec 16 07:18:37 crc kubenswrapper[4823]: I1216 07:18:37.221143 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c0d41fd9-3e78-4e48-ba89-6acc88459df8-ovsdbserver-sb\") pod \"dnsmasq-dns-685444497c-mb8qw\" (UID: \"c0d41fd9-3e78-4e48-ba89-6acc88459df8\") " pod="openstack/dnsmasq-dns-685444497c-mb8qw" Dec 16 07:18:37 crc kubenswrapper[4823]: I1216 07:18:37.221380 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebefd0b6-7523-402f-8952-76a232986c74-ovndb-tls-certs\") pod \"neutron-765f8bc948-dqt65\" (UID: 
\"ebefd0b6-7523-402f-8952-76a232986c74\") " pod="openstack/neutron-765f8bc948-dqt65" Dec 16 07:18:37 crc kubenswrapper[4823]: I1216 07:18:37.221423 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c0d41fd9-3e78-4e48-ba89-6acc88459df8-dns-svc\") pod \"dnsmasq-dns-685444497c-mb8qw\" (UID: \"c0d41fd9-3e78-4e48-ba89-6acc88459df8\") " pod="openstack/dnsmasq-dns-685444497c-mb8qw" Dec 16 07:18:37 crc kubenswrapper[4823]: I1216 07:18:37.221593 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c0d41fd9-3e78-4e48-ba89-6acc88459df8-ovsdbserver-nb\") pod \"dnsmasq-dns-685444497c-mb8qw\" (UID: \"c0d41fd9-3e78-4e48-ba89-6acc88459df8\") " pod="openstack/dnsmasq-dns-685444497c-mb8qw" Dec 16 07:18:37 crc kubenswrapper[4823]: I1216 07:18:37.221618 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ebefd0b6-7523-402f-8952-76a232986c74-httpd-config\") pod \"neutron-765f8bc948-dqt65\" (UID: \"ebefd0b6-7523-402f-8952-76a232986c74\") " pod="openstack/neutron-765f8bc948-dqt65" Dec 16 07:18:37 crc kubenswrapper[4823]: I1216 07:18:37.221696 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-796k7\" (UniqueName: \"kubernetes.io/projected/c0d41fd9-3e78-4e48-ba89-6acc88459df8-kube-api-access-796k7\") pod \"dnsmasq-dns-685444497c-mb8qw\" (UID: \"c0d41fd9-3e78-4e48-ba89-6acc88459df8\") " pod="openstack/dnsmasq-dns-685444497c-mb8qw" Dec 16 07:18:37 crc kubenswrapper[4823]: I1216 07:18:37.221726 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebefd0b6-7523-402f-8952-76a232986c74-combined-ca-bundle\") pod 
\"neutron-765f8bc948-dqt65\" (UID: \"ebefd0b6-7523-402f-8952-76a232986c74\") " pod="openstack/neutron-765f8bc948-dqt65" Dec 16 07:18:37 crc kubenswrapper[4823]: I1216 07:18:37.322993 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebefd0b6-7523-402f-8952-76a232986c74-ovndb-tls-certs\") pod \"neutron-765f8bc948-dqt65\" (UID: \"ebefd0b6-7523-402f-8952-76a232986c74\") " pod="openstack/neutron-765f8bc948-dqt65" Dec 16 07:18:37 crc kubenswrapper[4823]: I1216 07:18:37.323084 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c0d41fd9-3e78-4e48-ba89-6acc88459df8-dns-svc\") pod \"dnsmasq-dns-685444497c-mb8qw\" (UID: \"c0d41fd9-3e78-4e48-ba89-6acc88459df8\") " pod="openstack/dnsmasq-dns-685444497c-mb8qw" Dec 16 07:18:37 crc kubenswrapper[4823]: I1216 07:18:37.323144 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ebefd0b6-7523-402f-8952-76a232986c74-httpd-config\") pod \"neutron-765f8bc948-dqt65\" (UID: \"ebefd0b6-7523-402f-8952-76a232986c74\") " pod="openstack/neutron-765f8bc948-dqt65" Dec 16 07:18:37 crc kubenswrapper[4823]: I1216 07:18:37.323164 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c0d41fd9-3e78-4e48-ba89-6acc88459df8-ovsdbserver-nb\") pod \"dnsmasq-dns-685444497c-mb8qw\" (UID: \"c0d41fd9-3e78-4e48-ba89-6acc88459df8\") " pod="openstack/dnsmasq-dns-685444497c-mb8qw" Dec 16 07:18:37 crc kubenswrapper[4823]: I1216 07:18:37.323196 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-796k7\" (UniqueName: \"kubernetes.io/projected/c0d41fd9-3e78-4e48-ba89-6acc88459df8-kube-api-access-796k7\") pod \"dnsmasq-dns-685444497c-mb8qw\" (UID: \"c0d41fd9-3e78-4e48-ba89-6acc88459df8\") " 
pod="openstack/dnsmasq-dns-685444497c-mb8qw" Dec 16 07:18:37 crc kubenswrapper[4823]: I1216 07:18:37.323214 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebefd0b6-7523-402f-8952-76a232986c74-combined-ca-bundle\") pod \"neutron-765f8bc948-dqt65\" (UID: \"ebefd0b6-7523-402f-8952-76a232986c74\") " pod="openstack/neutron-765f8bc948-dqt65" Dec 16 07:18:37 crc kubenswrapper[4823]: I1216 07:18:37.323248 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ebefd0b6-7523-402f-8952-76a232986c74-config\") pod \"neutron-765f8bc948-dqt65\" (UID: \"ebefd0b6-7523-402f-8952-76a232986c74\") " pod="openstack/neutron-765f8bc948-dqt65" Dec 16 07:18:37 crc kubenswrapper[4823]: I1216 07:18:37.323267 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7dzc\" (UniqueName: \"kubernetes.io/projected/ebefd0b6-7523-402f-8952-76a232986c74-kube-api-access-s7dzc\") pod \"neutron-765f8bc948-dqt65\" (UID: \"ebefd0b6-7523-402f-8952-76a232986c74\") " pod="openstack/neutron-765f8bc948-dqt65" Dec 16 07:18:37 crc kubenswrapper[4823]: I1216 07:18:37.323288 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c0d41fd9-3e78-4e48-ba89-6acc88459df8-dns-swift-storage-0\") pod \"dnsmasq-dns-685444497c-mb8qw\" (UID: \"c0d41fd9-3e78-4e48-ba89-6acc88459df8\") " pod="openstack/dnsmasq-dns-685444497c-mb8qw" Dec 16 07:18:37 crc kubenswrapper[4823]: I1216 07:18:37.323306 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0d41fd9-3e78-4e48-ba89-6acc88459df8-config\") pod \"dnsmasq-dns-685444497c-mb8qw\" (UID: \"c0d41fd9-3e78-4e48-ba89-6acc88459df8\") " pod="openstack/dnsmasq-dns-685444497c-mb8qw" Dec 16 07:18:37 crc 
kubenswrapper[4823]: I1216 07:18:37.323333 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c0d41fd9-3e78-4e48-ba89-6acc88459df8-ovsdbserver-sb\") pod \"dnsmasq-dns-685444497c-mb8qw\" (UID: \"c0d41fd9-3e78-4e48-ba89-6acc88459df8\") " pod="openstack/dnsmasq-dns-685444497c-mb8qw" Dec 16 07:18:37 crc kubenswrapper[4823]: I1216 07:18:37.324367 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c0d41fd9-3e78-4e48-ba89-6acc88459df8-ovsdbserver-sb\") pod \"dnsmasq-dns-685444497c-mb8qw\" (UID: \"c0d41fd9-3e78-4e48-ba89-6acc88459df8\") " pod="openstack/dnsmasq-dns-685444497c-mb8qw" Dec 16 07:18:37 crc kubenswrapper[4823]: I1216 07:18:37.326984 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c0d41fd9-3e78-4e48-ba89-6acc88459df8-ovsdbserver-nb\") pod \"dnsmasq-dns-685444497c-mb8qw\" (UID: \"c0d41fd9-3e78-4e48-ba89-6acc88459df8\") " pod="openstack/dnsmasq-dns-685444497c-mb8qw" Dec 16 07:18:37 crc kubenswrapper[4823]: I1216 07:18:37.326996 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0d41fd9-3e78-4e48-ba89-6acc88459df8-config\") pod \"dnsmasq-dns-685444497c-mb8qw\" (UID: \"c0d41fd9-3e78-4e48-ba89-6acc88459df8\") " pod="openstack/dnsmasq-dns-685444497c-mb8qw" Dec 16 07:18:37 crc kubenswrapper[4823]: I1216 07:18:37.327270 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c0d41fd9-3e78-4e48-ba89-6acc88459df8-dns-svc\") pod \"dnsmasq-dns-685444497c-mb8qw\" (UID: \"c0d41fd9-3e78-4e48-ba89-6acc88459df8\") " pod="openstack/dnsmasq-dns-685444497c-mb8qw" Dec 16 07:18:37 crc kubenswrapper[4823]: I1216 07:18:37.327705 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c0d41fd9-3e78-4e48-ba89-6acc88459df8-dns-swift-storage-0\") pod \"dnsmasq-dns-685444497c-mb8qw\" (UID: \"c0d41fd9-3e78-4e48-ba89-6acc88459df8\") " pod="openstack/dnsmasq-dns-685444497c-mb8qw" Dec 16 07:18:37 crc kubenswrapper[4823]: I1216 07:18:37.330482 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebefd0b6-7523-402f-8952-76a232986c74-combined-ca-bundle\") pod \"neutron-765f8bc948-dqt65\" (UID: \"ebefd0b6-7523-402f-8952-76a232986c74\") " pod="openstack/neutron-765f8bc948-dqt65" Dec 16 07:18:37 crc kubenswrapper[4823]: I1216 07:18:37.332348 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/ebefd0b6-7523-402f-8952-76a232986c74-config\") pod \"neutron-765f8bc948-dqt65\" (UID: \"ebefd0b6-7523-402f-8952-76a232986c74\") " pod="openstack/neutron-765f8bc948-dqt65" Dec 16 07:18:37 crc kubenswrapper[4823]: I1216 07:18:37.334040 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ebefd0b6-7523-402f-8952-76a232986c74-httpd-config\") pod \"neutron-765f8bc948-dqt65\" (UID: \"ebefd0b6-7523-402f-8952-76a232986c74\") " pod="openstack/neutron-765f8bc948-dqt65" Dec 16 07:18:37 crc kubenswrapper[4823]: I1216 07:18:37.337423 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebefd0b6-7523-402f-8952-76a232986c74-ovndb-tls-certs\") pod \"neutron-765f8bc948-dqt65\" (UID: \"ebefd0b6-7523-402f-8952-76a232986c74\") " pod="openstack/neutron-765f8bc948-dqt65" Dec 16 07:18:37 crc kubenswrapper[4823]: I1216 07:18:37.355098 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-796k7\" (UniqueName: \"kubernetes.io/projected/c0d41fd9-3e78-4e48-ba89-6acc88459df8-kube-api-access-796k7\") pod 
\"dnsmasq-dns-685444497c-mb8qw\" (UID: \"c0d41fd9-3e78-4e48-ba89-6acc88459df8\") " pod="openstack/dnsmasq-dns-685444497c-mb8qw" Dec 16 07:18:37 crc kubenswrapper[4823]: I1216 07:18:37.361899 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7dzc\" (UniqueName: \"kubernetes.io/projected/ebefd0b6-7523-402f-8952-76a232986c74-kube-api-access-s7dzc\") pod \"neutron-765f8bc948-dqt65\" (UID: \"ebefd0b6-7523-402f-8952-76a232986c74\") " pod="openstack/neutron-765f8bc948-dqt65" Dec 16 07:18:37 crc kubenswrapper[4823]: I1216 07:18:37.393801 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-685444497c-mb8qw" Dec 16 07:18:37 crc kubenswrapper[4823]: I1216 07:18:37.440711 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-765f8bc948-dqt65" Dec 16 07:18:37 crc kubenswrapper[4823]: I1216 07:18:37.782680 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"efa3cd8b-aa5f-4769-a8aa-801716fa389c","Type":"ContainerStarted","Data":"83a3257ecbd5e248b7007b2c0b4e4b4f18d9a35aa4a2da2baaa67699c0eaf10a"} Dec 16 07:18:37 crc kubenswrapper[4823]: I1216 07:18:37.784372 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f85fada8-dce0-4e2b-83f5-ebf4f6fe2c80","Type":"ContainerStarted","Data":"0f3ee8c3dad0e8e1137f41d14df6304cc4ea56017bfa0aa42257759226420db4"} Dec 16 07:18:37 crc kubenswrapper[4823]: I1216 07:18:37.969549 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-685444497c-mb8qw"] Dec 16 07:18:38 crc kubenswrapper[4823]: I1216 07:18:38.137910 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-765f8bc948-dqt65"] Dec 16 07:18:38 crc kubenswrapper[4823]: W1216 07:18:38.147765 4823 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podebefd0b6_7523_402f_8952_76a232986c74.slice/crio-2520c01c87155de5f3e9ff16bc69a0c668d788a5f4eeb5e9a10be09df6651154 WatchSource:0}: Error finding container 2520c01c87155de5f3e9ff16bc69a0c668d788a5f4eeb5e9a10be09df6651154: Status 404 returned error can't find the container with id 2520c01c87155de5f3e9ff16bc69a0c668d788a5f4eeb5e9a10be09df6651154 Dec 16 07:18:38 crc kubenswrapper[4823]: I1216 07:18:38.837996 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-765f8bc948-dqt65" event={"ID":"ebefd0b6-7523-402f-8952-76a232986c74","Type":"ContainerStarted","Data":"7e1578371b4d6a145919aebf10c8fa0a868a73fb853b572714a607e7dd2c094e"} Dec 16 07:18:38 crc kubenswrapper[4823]: I1216 07:18:38.838748 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-765f8bc948-dqt65" event={"ID":"ebefd0b6-7523-402f-8952-76a232986c74","Type":"ContainerStarted","Data":"3eb704ae25fe49bca8f72c9dba890b0568acbb78194d883488897c4a6cc39dd9"} Dec 16 07:18:38 crc kubenswrapper[4823]: I1216 07:18:38.838767 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-765f8bc948-dqt65" event={"ID":"ebefd0b6-7523-402f-8952-76a232986c74","Type":"ContainerStarted","Data":"2520c01c87155de5f3e9ff16bc69a0c668d788a5f4eeb5e9a10be09df6651154"} Dec 16 07:18:38 crc kubenswrapper[4823]: I1216 07:18:38.838828 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-765f8bc948-dqt65" Dec 16 07:18:38 crc kubenswrapper[4823]: I1216 07:18:38.853405 4823 generic.go:334] "Generic (PLEG): container finished" podID="34693374-b301-47b2-b909-b5b93fd96fd0" containerID="1b71cb799085b8870997294e24797bb13ae088e514d562be9045f395f4dd9211" exitCode=0 Dec 16 07:18:38 crc kubenswrapper[4823]: I1216 07:18:38.853509 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-7mm88" 
event={"ID":"34693374-b301-47b2-b909-b5b93fd96fd0","Type":"ContainerDied","Data":"1b71cb799085b8870997294e24797bb13ae088e514d562be9045f395f4dd9211"} Dec 16 07:18:38 crc kubenswrapper[4823]: I1216 07:18:38.874079 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"efa3cd8b-aa5f-4769-a8aa-801716fa389c","Type":"ContainerStarted","Data":"073a08c446fb9875f9f26912e0877ed5083484ef1b445236e4f1bb03cdf07728"} Dec 16 07:18:38 crc kubenswrapper[4823]: I1216 07:18:38.875486 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-765f8bc948-dqt65" podStartSLOduration=1.875465978 podStartE2EDuration="1.875465978s" podCreationTimestamp="2025-12-16 07:18:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 07:18:38.870773932 +0000 UTC m=+1397.359340055" watchObservedRunningTime="2025-12-16 07:18:38.875465978 +0000 UTC m=+1397.364032101" Dec 16 07:18:38 crc kubenswrapper[4823]: I1216 07:18:38.888820 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"22c10a9c-6dba-4d35-a0d8-2ef0b82352cb","Type":"ContainerStarted","Data":"e1d8bf154774451c8501c6f494acda39ed4df5c898f297ae1bf7629b652517dc"} Dec 16 07:18:38 crc kubenswrapper[4823]: I1216 07:18:38.912232 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f85fada8-dce0-4e2b-83f5-ebf4f6fe2c80","Type":"ContainerStarted","Data":"f7e6a59d65cbff5bad0fdcd293a529173be048f1508f81f77dc92df6f821abe4"} Dec 16 07:18:38 crc kubenswrapper[4823]: I1216 07:18:38.914253 4823 generic.go:334] "Generic (PLEG): container finished" podID="c0d41fd9-3e78-4e48-ba89-6acc88459df8" containerID="fc56b1cdd1fc6092ae3891627b24eda58bf5762f233f59e9efb780577d24e001" exitCode=0 Dec 16 07:18:38 crc kubenswrapper[4823]: I1216 07:18:38.914308 4823 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/dnsmasq-dns-685444497c-mb8qw" event={"ID":"c0d41fd9-3e78-4e48-ba89-6acc88459df8","Type":"ContainerDied","Data":"fc56b1cdd1fc6092ae3891627b24eda58bf5762f233f59e9efb780577d24e001"} Dec 16 07:18:38 crc kubenswrapper[4823]: I1216 07:18:38.914336 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-685444497c-mb8qw" event={"ID":"c0d41fd9-3e78-4e48-ba89-6acc88459df8","Type":"ContainerStarted","Data":"337742653f05fa5f436d4d079724cbda10debf60e9d8bafa93c82d074e7fe2f4"} Dec 16 07:18:38 crc kubenswrapper[4823]: I1216 07:18:38.939642 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.939618937 podStartE2EDuration="4.939618937s" podCreationTimestamp="2025-12-16 07:18:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 07:18:38.931310357 +0000 UTC m=+1397.419876500" watchObservedRunningTime="2025-12-16 07:18:38.939618937 +0000 UTC m=+1397.428185060" Dec 16 07:18:39 crc kubenswrapper[4823]: I1216 07:18:39.027330 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.027312464 podStartE2EDuration="5.027312464s" podCreationTimestamp="2025-12-16 07:18:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 07:18:39.026910852 +0000 UTC m=+1397.515476965" watchObservedRunningTime="2025-12-16 07:18:39.027312464 +0000 UTC m=+1397.515878587" Dec 16 07:18:39 crc kubenswrapper[4823]: I1216 07:18:39.623256 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-bbf9986cc-sjljb"] Dec 16 07:18:39 crc kubenswrapper[4823]: I1216 07:18:39.625021 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-bbf9986cc-sjljb" Dec 16 07:18:39 crc kubenswrapper[4823]: I1216 07:18:39.630396 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Dec 16 07:18:39 crc kubenswrapper[4823]: I1216 07:18:39.630648 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Dec 16 07:18:39 crc kubenswrapper[4823]: I1216 07:18:39.642245 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-bbf9986cc-sjljb"] Dec 16 07:18:39 crc kubenswrapper[4823]: I1216 07:18:39.771077 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2b1ed60-7cb0-48f0-aebf-3de778dbb95b-combined-ca-bundle\") pod \"neutron-bbf9986cc-sjljb\" (UID: \"f2b1ed60-7cb0-48f0-aebf-3de778dbb95b\") " pod="openstack/neutron-bbf9986cc-sjljb" Dec 16 07:18:39 crc kubenswrapper[4823]: I1216 07:18:39.771167 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbtbf\" (UniqueName: \"kubernetes.io/projected/f2b1ed60-7cb0-48f0-aebf-3de778dbb95b-kube-api-access-hbtbf\") pod \"neutron-bbf9986cc-sjljb\" (UID: \"f2b1ed60-7cb0-48f0-aebf-3de778dbb95b\") " pod="openstack/neutron-bbf9986cc-sjljb" Dec 16 07:18:39 crc kubenswrapper[4823]: I1216 07:18:39.771210 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f2b1ed60-7cb0-48f0-aebf-3de778dbb95b-httpd-config\") pod \"neutron-bbf9986cc-sjljb\" (UID: \"f2b1ed60-7cb0-48f0-aebf-3de778dbb95b\") " pod="openstack/neutron-bbf9986cc-sjljb" Dec 16 07:18:39 crc kubenswrapper[4823]: I1216 07:18:39.771263 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/f2b1ed60-7cb0-48f0-aebf-3de778dbb95b-ovndb-tls-certs\") pod \"neutron-bbf9986cc-sjljb\" (UID: \"f2b1ed60-7cb0-48f0-aebf-3de778dbb95b\") " pod="openstack/neutron-bbf9986cc-sjljb" Dec 16 07:18:39 crc kubenswrapper[4823]: I1216 07:18:39.771300 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2b1ed60-7cb0-48f0-aebf-3de778dbb95b-public-tls-certs\") pod \"neutron-bbf9986cc-sjljb\" (UID: \"f2b1ed60-7cb0-48f0-aebf-3de778dbb95b\") " pod="openstack/neutron-bbf9986cc-sjljb" Dec 16 07:18:39 crc kubenswrapper[4823]: I1216 07:18:39.771323 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2b1ed60-7cb0-48f0-aebf-3de778dbb95b-internal-tls-certs\") pod \"neutron-bbf9986cc-sjljb\" (UID: \"f2b1ed60-7cb0-48f0-aebf-3de778dbb95b\") " pod="openstack/neutron-bbf9986cc-sjljb" Dec 16 07:18:39 crc kubenswrapper[4823]: I1216 07:18:39.771386 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f2b1ed60-7cb0-48f0-aebf-3de778dbb95b-config\") pod \"neutron-bbf9986cc-sjljb\" (UID: \"f2b1ed60-7cb0-48f0-aebf-3de778dbb95b\") " pod="openstack/neutron-bbf9986cc-sjljb" Dec 16 07:18:39 crc kubenswrapper[4823]: I1216 07:18:39.872827 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2b1ed60-7cb0-48f0-aebf-3de778dbb95b-combined-ca-bundle\") pod \"neutron-bbf9986cc-sjljb\" (UID: \"f2b1ed60-7cb0-48f0-aebf-3de778dbb95b\") " pod="openstack/neutron-bbf9986cc-sjljb" Dec 16 07:18:39 crc kubenswrapper[4823]: I1216 07:18:39.874160 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hbtbf\" (UniqueName: 
\"kubernetes.io/projected/f2b1ed60-7cb0-48f0-aebf-3de778dbb95b-kube-api-access-hbtbf\") pod \"neutron-bbf9986cc-sjljb\" (UID: \"f2b1ed60-7cb0-48f0-aebf-3de778dbb95b\") " pod="openstack/neutron-bbf9986cc-sjljb" Dec 16 07:18:39 crc kubenswrapper[4823]: I1216 07:18:39.874538 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f2b1ed60-7cb0-48f0-aebf-3de778dbb95b-httpd-config\") pod \"neutron-bbf9986cc-sjljb\" (UID: \"f2b1ed60-7cb0-48f0-aebf-3de778dbb95b\") " pod="openstack/neutron-bbf9986cc-sjljb" Dec 16 07:18:39 crc kubenswrapper[4823]: I1216 07:18:39.874828 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2b1ed60-7cb0-48f0-aebf-3de778dbb95b-ovndb-tls-certs\") pod \"neutron-bbf9986cc-sjljb\" (UID: \"f2b1ed60-7cb0-48f0-aebf-3de778dbb95b\") " pod="openstack/neutron-bbf9986cc-sjljb" Dec 16 07:18:39 crc kubenswrapper[4823]: I1216 07:18:39.874996 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2b1ed60-7cb0-48f0-aebf-3de778dbb95b-public-tls-certs\") pod \"neutron-bbf9986cc-sjljb\" (UID: \"f2b1ed60-7cb0-48f0-aebf-3de778dbb95b\") " pod="openstack/neutron-bbf9986cc-sjljb" Dec 16 07:18:39 crc kubenswrapper[4823]: I1216 07:18:39.875120 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2b1ed60-7cb0-48f0-aebf-3de778dbb95b-internal-tls-certs\") pod \"neutron-bbf9986cc-sjljb\" (UID: \"f2b1ed60-7cb0-48f0-aebf-3de778dbb95b\") " pod="openstack/neutron-bbf9986cc-sjljb" Dec 16 07:18:39 crc kubenswrapper[4823]: I1216 07:18:39.875356 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f2b1ed60-7cb0-48f0-aebf-3de778dbb95b-config\") pod \"neutron-bbf9986cc-sjljb\" (UID: 
\"f2b1ed60-7cb0-48f0-aebf-3de778dbb95b\") " pod="openstack/neutron-bbf9986cc-sjljb" Dec 16 07:18:39 crc kubenswrapper[4823]: I1216 07:18:39.888958 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2b1ed60-7cb0-48f0-aebf-3de778dbb95b-combined-ca-bundle\") pod \"neutron-bbf9986cc-sjljb\" (UID: \"f2b1ed60-7cb0-48f0-aebf-3de778dbb95b\") " pod="openstack/neutron-bbf9986cc-sjljb" Dec 16 07:18:39 crc kubenswrapper[4823]: I1216 07:18:39.889853 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2b1ed60-7cb0-48f0-aebf-3de778dbb95b-ovndb-tls-certs\") pod \"neutron-bbf9986cc-sjljb\" (UID: \"f2b1ed60-7cb0-48f0-aebf-3de778dbb95b\") " pod="openstack/neutron-bbf9986cc-sjljb" Dec 16 07:18:39 crc kubenswrapper[4823]: I1216 07:18:39.890422 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2b1ed60-7cb0-48f0-aebf-3de778dbb95b-internal-tls-certs\") pod \"neutron-bbf9986cc-sjljb\" (UID: \"f2b1ed60-7cb0-48f0-aebf-3de778dbb95b\") " pod="openstack/neutron-bbf9986cc-sjljb" Dec 16 07:18:39 crc kubenswrapper[4823]: I1216 07:18:39.897405 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2b1ed60-7cb0-48f0-aebf-3de778dbb95b-public-tls-certs\") pod \"neutron-bbf9986cc-sjljb\" (UID: \"f2b1ed60-7cb0-48f0-aebf-3de778dbb95b\") " pod="openstack/neutron-bbf9986cc-sjljb" Dec 16 07:18:39 crc kubenswrapper[4823]: I1216 07:18:39.910955 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/f2b1ed60-7cb0-48f0-aebf-3de778dbb95b-config\") pod \"neutron-bbf9986cc-sjljb\" (UID: \"f2b1ed60-7cb0-48f0-aebf-3de778dbb95b\") " pod="openstack/neutron-bbf9986cc-sjljb" Dec 16 07:18:39 crc kubenswrapper[4823]: I1216 07:18:39.928874 4823 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f2b1ed60-7cb0-48f0-aebf-3de778dbb95b-httpd-config\") pod \"neutron-bbf9986cc-sjljb\" (UID: \"f2b1ed60-7cb0-48f0-aebf-3de778dbb95b\") " pod="openstack/neutron-bbf9986cc-sjljb" Dec 16 07:18:39 crc kubenswrapper[4823]: I1216 07:18:39.930286 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbtbf\" (UniqueName: \"kubernetes.io/projected/f2b1ed60-7cb0-48f0-aebf-3de778dbb95b-kube-api-access-hbtbf\") pod \"neutron-bbf9986cc-sjljb\" (UID: \"f2b1ed60-7cb0-48f0-aebf-3de778dbb95b\") " pod="openstack/neutron-bbf9986cc-sjljb" Dec 16 07:18:39 crc kubenswrapper[4823]: I1216 07:18:39.965292 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-bbf9986cc-sjljb" Dec 16 07:18:40 crc kubenswrapper[4823]: I1216 07:18:40.057075 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-685444497c-mb8qw" event={"ID":"c0d41fd9-3e78-4e48-ba89-6acc88459df8","Type":"ContainerStarted","Data":"6e7456152f0bd419c54c730ad6957ea312799bd92c55bce3c877a279ab7f4319"} Dec 16 07:18:40 crc kubenswrapper[4823]: I1216 07:18:40.057629 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-685444497c-mb8qw" Dec 16 07:18:40 crc kubenswrapper[4823]: I1216 07:18:40.461740 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-7mm88" Dec 16 07:18:40 crc kubenswrapper[4823]: I1216 07:18:40.476228 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-685444497c-mb8qw" podStartSLOduration=3.476208845 podStartE2EDuration="3.476208845s" podCreationTimestamp="2025-12-16 07:18:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 07:18:40.107794305 +0000 UTC m=+1398.596360428" watchObservedRunningTime="2025-12-16 07:18:40.476208845 +0000 UTC m=+1398.964774968" Dec 16 07:18:40 crc kubenswrapper[4823]: I1216 07:18:40.590075 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/34693374-b301-47b2-b909-b5b93fd96fd0-logs\") pod \"34693374-b301-47b2-b909-b5b93fd96fd0\" (UID: \"34693374-b301-47b2-b909-b5b93fd96fd0\") " Dec 16 07:18:40 crc kubenswrapper[4823]: I1216 07:18:40.590232 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5d4gn\" (UniqueName: \"kubernetes.io/projected/34693374-b301-47b2-b909-b5b93fd96fd0-kube-api-access-5d4gn\") pod \"34693374-b301-47b2-b909-b5b93fd96fd0\" (UID: \"34693374-b301-47b2-b909-b5b93fd96fd0\") " Dec 16 07:18:40 crc kubenswrapper[4823]: I1216 07:18:40.590307 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34693374-b301-47b2-b909-b5b93fd96fd0-scripts\") pod \"34693374-b301-47b2-b909-b5b93fd96fd0\" (UID: \"34693374-b301-47b2-b909-b5b93fd96fd0\") " Dec 16 07:18:40 crc kubenswrapper[4823]: I1216 07:18:40.590343 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34693374-b301-47b2-b909-b5b93fd96fd0-combined-ca-bundle\") pod \"34693374-b301-47b2-b909-b5b93fd96fd0\" (UID: 
\"34693374-b301-47b2-b909-b5b93fd96fd0\") " Dec 16 07:18:40 crc kubenswrapper[4823]: I1216 07:18:40.590385 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34693374-b301-47b2-b909-b5b93fd96fd0-config-data\") pod \"34693374-b301-47b2-b909-b5b93fd96fd0\" (UID: \"34693374-b301-47b2-b909-b5b93fd96fd0\") " Dec 16 07:18:40 crc kubenswrapper[4823]: I1216 07:18:40.590564 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34693374-b301-47b2-b909-b5b93fd96fd0-logs" (OuterVolumeSpecName: "logs") pod "34693374-b301-47b2-b909-b5b93fd96fd0" (UID: "34693374-b301-47b2-b909-b5b93fd96fd0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:18:40 crc kubenswrapper[4823]: I1216 07:18:40.590821 4823 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/34693374-b301-47b2-b909-b5b93fd96fd0-logs\") on node \"crc\" DevicePath \"\"" Dec 16 07:18:40 crc kubenswrapper[4823]: I1216 07:18:40.602267 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34693374-b301-47b2-b909-b5b93fd96fd0-scripts" (OuterVolumeSpecName: "scripts") pod "34693374-b301-47b2-b909-b5b93fd96fd0" (UID: "34693374-b301-47b2-b909-b5b93fd96fd0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:18:40 crc kubenswrapper[4823]: I1216 07:18:40.602871 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34693374-b301-47b2-b909-b5b93fd96fd0-kube-api-access-5d4gn" (OuterVolumeSpecName: "kube-api-access-5d4gn") pod "34693374-b301-47b2-b909-b5b93fd96fd0" (UID: "34693374-b301-47b2-b909-b5b93fd96fd0"). InnerVolumeSpecName "kube-api-access-5d4gn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:18:40 crc kubenswrapper[4823]: I1216 07:18:40.624098 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34693374-b301-47b2-b909-b5b93fd96fd0-config-data" (OuterVolumeSpecName: "config-data") pod "34693374-b301-47b2-b909-b5b93fd96fd0" (UID: "34693374-b301-47b2-b909-b5b93fd96fd0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:18:40 crc kubenswrapper[4823]: I1216 07:18:40.649185 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34693374-b301-47b2-b909-b5b93fd96fd0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "34693374-b301-47b2-b909-b5b93fd96fd0" (UID: "34693374-b301-47b2-b909-b5b93fd96fd0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:18:40 crc kubenswrapper[4823]: I1216 07:18:40.692730 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34693374-b301-47b2-b909-b5b93fd96fd0-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 07:18:40 crc kubenswrapper[4823]: I1216 07:18:40.692766 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5d4gn\" (UniqueName: \"kubernetes.io/projected/34693374-b301-47b2-b909-b5b93fd96fd0-kube-api-access-5d4gn\") on node \"crc\" DevicePath \"\"" Dec 16 07:18:40 crc kubenswrapper[4823]: I1216 07:18:40.692777 4823 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34693374-b301-47b2-b909-b5b93fd96fd0-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:18:40 crc kubenswrapper[4823]: I1216 07:18:40.692785 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34693374-b301-47b2-b909-b5b93fd96fd0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 
16 07:18:40 crc kubenswrapper[4823]: I1216 07:18:40.724849 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-bbf9986cc-sjljb"] Dec 16 07:18:40 crc kubenswrapper[4823]: W1216 07:18:40.737782 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf2b1ed60_7cb0_48f0_aebf_3de778dbb95b.slice/crio-b946699914bb5413be32189c06d97b3818db2811af84ebe07b2bb0e71fc2447b WatchSource:0}: Error finding container b946699914bb5413be32189c06d97b3818db2811af84ebe07b2bb0e71fc2447b: Status 404 returned error can't find the container with id b946699914bb5413be32189c06d97b3818db2811af84ebe07b2bb0e71fc2447b Dec 16 07:18:41 crc kubenswrapper[4823]: I1216 07:18:41.052042 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-59fd5f5fb-h7tf5"] Dec 16 07:18:41 crc kubenswrapper[4823]: E1216 07:18:41.053459 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34693374-b301-47b2-b909-b5b93fd96fd0" containerName="placement-db-sync" Dec 16 07:18:41 crc kubenswrapper[4823]: I1216 07:18:41.053478 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="34693374-b301-47b2-b909-b5b93fd96fd0" containerName="placement-db-sync" Dec 16 07:18:41 crc kubenswrapper[4823]: I1216 07:18:41.053657 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="34693374-b301-47b2-b909-b5b93fd96fd0" containerName="placement-db-sync" Dec 16 07:18:41 crc kubenswrapper[4823]: I1216 07:18:41.056269 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-59fd5f5fb-h7tf5" Dec 16 07:18:41 crc kubenswrapper[4823]: I1216 07:18:41.061483 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Dec 16 07:18:41 crc kubenswrapper[4823]: I1216 07:18:41.061750 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Dec 16 07:18:41 crc kubenswrapper[4823]: I1216 07:18:41.068967 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-59fd5f5fb-h7tf5"] Dec 16 07:18:41 crc kubenswrapper[4823]: I1216 07:18:41.082319 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-bbf9986cc-sjljb" event={"ID":"f2b1ed60-7cb0-48f0-aebf-3de778dbb95b","Type":"ContainerStarted","Data":"4a902115438412f167a7c224fe223d644746f437002cb2288beb05ad185be48a"} Dec 16 07:18:41 crc kubenswrapper[4823]: I1216 07:18:41.082362 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-bbf9986cc-sjljb" event={"ID":"f2b1ed60-7cb0-48f0-aebf-3de778dbb95b","Type":"ContainerStarted","Data":"b946699914bb5413be32189c06d97b3818db2811af84ebe07b2bb0e71fc2447b"} Dec 16 07:18:41 crc kubenswrapper[4823]: I1216 07:18:41.095442 4823 generic.go:334] "Generic (PLEG): container finished" podID="9dca6476-18f2-4c1a-8c95-e894c5f9facd" containerID="0697000ce67a8cd70a4f5cf3424d1c8f80091a2488379262811ac7ef93a7f556" exitCode=0 Dec 16 07:18:41 crc kubenswrapper[4823]: I1216 07:18:41.095536 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-np6b5" event={"ID":"9dca6476-18f2-4c1a-8c95-e894c5f9facd","Type":"ContainerDied","Data":"0697000ce67a8cd70a4f5cf3424d1c8f80091a2488379262811ac7ef93a7f556"} Dec 16 07:18:41 crc kubenswrapper[4823]: I1216 07:18:41.112383 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-7mm88" Dec 16 07:18:41 crc kubenswrapper[4823]: I1216 07:18:41.113087 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-7mm88" event={"ID":"34693374-b301-47b2-b909-b5b93fd96fd0","Type":"ContainerDied","Data":"88129a2199ae5e8da9ef90f6ca686ccfaf33226c9e85a68d24e7f99a929f5c40"} Dec 16 07:18:41 crc kubenswrapper[4823]: I1216 07:18:41.113116 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="88129a2199ae5e8da9ef90f6ca686ccfaf33226c9e85a68d24e7f99a929f5c40" Dec 16 07:18:41 crc kubenswrapper[4823]: I1216 07:18:41.213601 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/196356f3-e866-4cf1-b3e8-eba3d9e4c99f-logs\") pod \"placement-59fd5f5fb-h7tf5\" (UID: \"196356f3-e866-4cf1-b3e8-eba3d9e4c99f\") " pod="openstack/placement-59fd5f5fb-h7tf5" Dec 16 07:18:41 crc kubenswrapper[4823]: I1216 07:18:41.213707 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/196356f3-e866-4cf1-b3e8-eba3d9e4c99f-scripts\") pod \"placement-59fd5f5fb-h7tf5\" (UID: \"196356f3-e866-4cf1-b3e8-eba3d9e4c99f\") " pod="openstack/placement-59fd5f5fb-h7tf5" Dec 16 07:18:41 crc kubenswrapper[4823]: I1216 07:18:41.213776 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/196356f3-e866-4cf1-b3e8-eba3d9e4c99f-internal-tls-certs\") pod \"placement-59fd5f5fb-h7tf5\" (UID: \"196356f3-e866-4cf1-b3e8-eba3d9e4c99f\") " pod="openstack/placement-59fd5f5fb-h7tf5" Dec 16 07:18:41 crc kubenswrapper[4823]: I1216 07:18:41.213806 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/196356f3-e866-4cf1-b3e8-eba3d9e4c99f-public-tls-certs\") pod \"placement-59fd5f5fb-h7tf5\" (UID: \"196356f3-e866-4cf1-b3e8-eba3d9e4c99f\") " pod="openstack/placement-59fd5f5fb-h7tf5" Dec 16 07:18:41 crc kubenswrapper[4823]: I1216 07:18:41.213835 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bdn4\" (UniqueName: \"kubernetes.io/projected/196356f3-e866-4cf1-b3e8-eba3d9e4c99f-kube-api-access-4bdn4\") pod \"placement-59fd5f5fb-h7tf5\" (UID: \"196356f3-e866-4cf1-b3e8-eba3d9e4c99f\") " pod="openstack/placement-59fd5f5fb-h7tf5" Dec 16 07:18:41 crc kubenswrapper[4823]: I1216 07:18:41.213942 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/196356f3-e866-4cf1-b3e8-eba3d9e4c99f-config-data\") pod \"placement-59fd5f5fb-h7tf5\" (UID: \"196356f3-e866-4cf1-b3e8-eba3d9e4c99f\") " pod="openstack/placement-59fd5f5fb-h7tf5" Dec 16 07:18:41 crc kubenswrapper[4823]: I1216 07:18:41.214009 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/196356f3-e866-4cf1-b3e8-eba3d9e4c99f-combined-ca-bundle\") pod \"placement-59fd5f5fb-h7tf5\" (UID: \"196356f3-e866-4cf1-b3e8-eba3d9e4c99f\") " pod="openstack/placement-59fd5f5fb-h7tf5" Dec 16 07:18:41 crc kubenswrapper[4823]: I1216 07:18:41.317142 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/196356f3-e866-4cf1-b3e8-eba3d9e4c99f-logs\") pod \"placement-59fd5f5fb-h7tf5\" (UID: \"196356f3-e866-4cf1-b3e8-eba3d9e4c99f\") " pod="openstack/placement-59fd5f5fb-h7tf5" Dec 16 07:18:41 crc kubenswrapper[4823]: I1216 07:18:41.317210 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/196356f3-e866-4cf1-b3e8-eba3d9e4c99f-scripts\") pod \"placement-59fd5f5fb-h7tf5\" (UID: \"196356f3-e866-4cf1-b3e8-eba3d9e4c99f\") " pod="openstack/placement-59fd5f5fb-h7tf5" Dec 16 07:18:41 crc kubenswrapper[4823]: I1216 07:18:41.317275 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/196356f3-e866-4cf1-b3e8-eba3d9e4c99f-internal-tls-certs\") pod \"placement-59fd5f5fb-h7tf5\" (UID: \"196356f3-e866-4cf1-b3e8-eba3d9e4c99f\") " pod="openstack/placement-59fd5f5fb-h7tf5" Dec 16 07:18:41 crc kubenswrapper[4823]: I1216 07:18:41.317304 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/196356f3-e866-4cf1-b3e8-eba3d9e4c99f-public-tls-certs\") pod \"placement-59fd5f5fb-h7tf5\" (UID: \"196356f3-e866-4cf1-b3e8-eba3d9e4c99f\") " pod="openstack/placement-59fd5f5fb-h7tf5" Dec 16 07:18:41 crc kubenswrapper[4823]: I1216 07:18:41.317331 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4bdn4\" (UniqueName: \"kubernetes.io/projected/196356f3-e866-4cf1-b3e8-eba3d9e4c99f-kube-api-access-4bdn4\") pod \"placement-59fd5f5fb-h7tf5\" (UID: \"196356f3-e866-4cf1-b3e8-eba3d9e4c99f\") " pod="openstack/placement-59fd5f5fb-h7tf5" Dec 16 07:18:41 crc kubenswrapper[4823]: I1216 07:18:41.317408 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/196356f3-e866-4cf1-b3e8-eba3d9e4c99f-config-data\") pod \"placement-59fd5f5fb-h7tf5\" (UID: \"196356f3-e866-4cf1-b3e8-eba3d9e4c99f\") " pod="openstack/placement-59fd5f5fb-h7tf5" Dec 16 07:18:41 crc kubenswrapper[4823]: I1216 07:18:41.317465 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/196356f3-e866-4cf1-b3e8-eba3d9e4c99f-combined-ca-bundle\") pod \"placement-59fd5f5fb-h7tf5\" (UID: \"196356f3-e866-4cf1-b3e8-eba3d9e4c99f\") " pod="openstack/placement-59fd5f5fb-h7tf5" Dec 16 07:18:41 crc kubenswrapper[4823]: I1216 07:18:41.317841 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/196356f3-e866-4cf1-b3e8-eba3d9e4c99f-logs\") pod \"placement-59fd5f5fb-h7tf5\" (UID: \"196356f3-e866-4cf1-b3e8-eba3d9e4c99f\") " pod="openstack/placement-59fd5f5fb-h7tf5" Dec 16 07:18:41 crc kubenswrapper[4823]: I1216 07:18:41.323947 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/196356f3-e866-4cf1-b3e8-eba3d9e4c99f-combined-ca-bundle\") pod \"placement-59fd5f5fb-h7tf5\" (UID: \"196356f3-e866-4cf1-b3e8-eba3d9e4c99f\") " pod="openstack/placement-59fd5f5fb-h7tf5" Dec 16 07:18:41 crc kubenswrapper[4823]: I1216 07:18:41.324583 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/196356f3-e866-4cf1-b3e8-eba3d9e4c99f-config-data\") pod \"placement-59fd5f5fb-h7tf5\" (UID: \"196356f3-e866-4cf1-b3e8-eba3d9e4c99f\") " pod="openstack/placement-59fd5f5fb-h7tf5" Dec 16 07:18:41 crc kubenswrapper[4823]: I1216 07:18:41.324938 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/196356f3-e866-4cf1-b3e8-eba3d9e4c99f-public-tls-certs\") pod \"placement-59fd5f5fb-h7tf5\" (UID: \"196356f3-e866-4cf1-b3e8-eba3d9e4c99f\") " pod="openstack/placement-59fd5f5fb-h7tf5" Dec 16 07:18:41 crc kubenswrapper[4823]: I1216 07:18:41.325302 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/196356f3-e866-4cf1-b3e8-eba3d9e4c99f-internal-tls-certs\") pod \"placement-59fd5f5fb-h7tf5\" (UID: 
\"196356f3-e866-4cf1-b3e8-eba3d9e4c99f\") " pod="openstack/placement-59fd5f5fb-h7tf5" Dec 16 07:18:41 crc kubenswrapper[4823]: I1216 07:18:41.326124 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/196356f3-e866-4cf1-b3e8-eba3d9e4c99f-scripts\") pod \"placement-59fd5f5fb-h7tf5\" (UID: \"196356f3-e866-4cf1-b3e8-eba3d9e4c99f\") " pod="openstack/placement-59fd5f5fb-h7tf5" Dec 16 07:18:41 crc kubenswrapper[4823]: I1216 07:18:41.344394 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bdn4\" (UniqueName: \"kubernetes.io/projected/196356f3-e866-4cf1-b3e8-eba3d9e4c99f-kube-api-access-4bdn4\") pod \"placement-59fd5f5fb-h7tf5\" (UID: \"196356f3-e866-4cf1-b3e8-eba3d9e4c99f\") " pod="openstack/placement-59fd5f5fb-h7tf5" Dec 16 07:18:41 crc kubenswrapper[4823]: I1216 07:18:41.397483 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-59fd5f5fb-h7tf5" Dec 16 07:18:45 crc kubenswrapper[4823]: I1216 07:18:45.308942 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 16 07:18:45 crc kubenswrapper[4823]: I1216 07:18:45.309689 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 16 07:18:45 crc kubenswrapper[4823]: I1216 07:18:45.343637 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 16 07:18:45 crc kubenswrapper[4823]: I1216 07:18:45.347255 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 16 07:18:45 crc kubenswrapper[4823]: I1216 07:18:45.368777 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 16 07:18:45 crc kubenswrapper[4823]: I1216 07:18:45.368820 
4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 16 07:18:45 crc kubenswrapper[4823]: I1216 07:18:45.403221 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 16 07:18:45 crc kubenswrapper[4823]: I1216 07:18:45.409938 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 16 07:18:46 crc kubenswrapper[4823]: I1216 07:18:46.156772 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 16 07:18:46 crc kubenswrapper[4823]: I1216 07:18:46.157128 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 16 07:18:46 crc kubenswrapper[4823]: I1216 07:18:46.157147 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 16 07:18:46 crc kubenswrapper[4823]: I1216 07:18:46.157158 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 16 07:18:47 crc kubenswrapper[4823]: I1216 07:18:47.396479 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-685444497c-mb8qw" Dec 16 07:18:47 crc kubenswrapper[4823]: I1216 07:18:47.483071 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6f6f8cb849-wxg9m"] Dec 16 07:18:47 crc kubenswrapper[4823]: I1216 07:18:47.483307 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6f6f8cb849-wxg9m" podUID="166784d2-df96-4ee8-a1a3-22a967bff610" containerName="dnsmasq-dns" containerID="cri-o://915bf11a2d2e78e3982dc67bdcd3b8756575f7fc1659fb4dbf5ed5f2329a9e62" gracePeriod=10 Dec 16 07:18:48 crc kubenswrapper[4823]: I1216 07:18:48.193842 4823 generic.go:334] "Generic 
(PLEG): container finished" podID="166784d2-df96-4ee8-a1a3-22a967bff610" containerID="915bf11a2d2e78e3982dc67bdcd3b8756575f7fc1659fb4dbf5ed5f2329a9e62" exitCode=0 Dec 16 07:18:48 crc kubenswrapper[4823]: I1216 07:18:48.193922 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f6f8cb849-wxg9m" event={"ID":"166784d2-df96-4ee8-a1a3-22a967bff610","Type":"ContainerDied","Data":"915bf11a2d2e78e3982dc67bdcd3b8756575f7fc1659fb4dbf5ed5f2329a9e62"} Dec 16 07:18:48 crc kubenswrapper[4823]: I1216 07:18:48.194541 4823 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 16 07:18:48 crc kubenswrapper[4823]: I1216 07:18:48.194581 4823 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 16 07:18:48 crc kubenswrapper[4823]: I1216 07:18:48.194593 4823 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 16 07:18:48 crc kubenswrapper[4823]: I1216 07:18:48.194624 4823 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 16 07:18:48 crc kubenswrapper[4823]: I1216 07:18:48.484270 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 16 07:18:48 crc kubenswrapper[4823]: I1216 07:18:48.490086 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 16 07:18:48 crc kubenswrapper[4823]: I1216 07:18:48.800597 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 16 07:18:48 crc kubenswrapper[4823]: I1216 07:18:48.961384 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 16 07:18:50 crc kubenswrapper[4823]: I1216 07:18:50.077656 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-np6b5" Dec 16 07:18:50 crc kubenswrapper[4823]: I1216 07:18:50.236572 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9dca6476-18f2-4c1a-8c95-e894c5f9facd-scripts\") pod \"9dca6476-18f2-4c1a-8c95-e894c5f9facd\" (UID: \"9dca6476-18f2-4c1a-8c95-e894c5f9facd\") " Dec 16 07:18:50 crc kubenswrapper[4823]: I1216 07:18:50.236672 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxk85\" (UniqueName: \"kubernetes.io/projected/9dca6476-18f2-4c1a-8c95-e894c5f9facd-kube-api-access-wxk85\") pod \"9dca6476-18f2-4c1a-8c95-e894c5f9facd\" (UID: \"9dca6476-18f2-4c1a-8c95-e894c5f9facd\") " Dec 16 07:18:50 crc kubenswrapper[4823]: I1216 07:18:50.236730 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9dca6476-18f2-4c1a-8c95-e894c5f9facd-fernet-keys\") pod \"9dca6476-18f2-4c1a-8c95-e894c5f9facd\" (UID: \"9dca6476-18f2-4c1a-8c95-e894c5f9facd\") " Dec 16 07:18:50 crc kubenswrapper[4823]: I1216 07:18:50.236792 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9dca6476-18f2-4c1a-8c95-e894c5f9facd-credential-keys\") pod \"9dca6476-18f2-4c1a-8c95-e894c5f9facd\" (UID: \"9dca6476-18f2-4c1a-8c95-e894c5f9facd\") " Dec 16 07:18:50 crc kubenswrapper[4823]: I1216 07:18:50.236862 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9dca6476-18f2-4c1a-8c95-e894c5f9facd-config-data\") pod \"9dca6476-18f2-4c1a-8c95-e894c5f9facd\" (UID: \"9dca6476-18f2-4c1a-8c95-e894c5f9facd\") " Dec 16 07:18:50 crc kubenswrapper[4823]: I1216 07:18:50.236924 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9dca6476-18f2-4c1a-8c95-e894c5f9facd-combined-ca-bundle\") pod \"9dca6476-18f2-4c1a-8c95-e894c5f9facd\" (UID: \"9dca6476-18f2-4c1a-8c95-e894c5f9facd\") " Dec 16 07:18:50 crc kubenswrapper[4823]: I1216 07:18:50.276892 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9dca6476-18f2-4c1a-8c95-e894c5f9facd-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "9dca6476-18f2-4c1a-8c95-e894c5f9facd" (UID: "9dca6476-18f2-4c1a-8c95-e894c5f9facd"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:18:50 crc kubenswrapper[4823]: I1216 07:18:50.298890 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9dca6476-18f2-4c1a-8c95-e894c5f9facd-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "9dca6476-18f2-4c1a-8c95-e894c5f9facd" (UID: "9dca6476-18f2-4c1a-8c95-e894c5f9facd"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:18:50 crc kubenswrapper[4823]: I1216 07:18:50.302376 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9dca6476-18f2-4c1a-8c95-e894c5f9facd-scripts" (OuterVolumeSpecName: "scripts") pod "9dca6476-18f2-4c1a-8c95-e894c5f9facd" (UID: "9dca6476-18f2-4c1a-8c95-e894c5f9facd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:18:50 crc kubenswrapper[4823]: I1216 07:18:50.308385 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9dca6476-18f2-4c1a-8c95-e894c5f9facd-kube-api-access-wxk85" (OuterVolumeSpecName: "kube-api-access-wxk85") pod "9dca6476-18f2-4c1a-8c95-e894c5f9facd" (UID: "9dca6476-18f2-4c1a-8c95-e894c5f9facd"). InnerVolumeSpecName "kube-api-access-wxk85". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:18:50 crc kubenswrapper[4823]: I1216 07:18:50.339308 4823 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9dca6476-18f2-4c1a-8c95-e894c5f9facd-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:18:50 crc kubenswrapper[4823]: I1216 07:18:50.339347 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxk85\" (UniqueName: \"kubernetes.io/projected/9dca6476-18f2-4c1a-8c95-e894c5f9facd-kube-api-access-wxk85\") on node \"crc\" DevicePath \"\"" Dec 16 07:18:50 crc kubenswrapper[4823]: I1216 07:18:50.339361 4823 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9dca6476-18f2-4c1a-8c95-e894c5f9facd-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 16 07:18:50 crc kubenswrapper[4823]: I1216 07:18:50.339374 4823 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9dca6476-18f2-4c1a-8c95-e894c5f9facd-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 16 07:18:50 crc kubenswrapper[4823]: I1216 07:18:50.360509 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-np6b5" Dec 16 07:18:50 crc kubenswrapper[4823]: I1216 07:18:50.360677 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-np6b5" event={"ID":"9dca6476-18f2-4c1a-8c95-e894c5f9facd","Type":"ContainerDied","Data":"c637a3efacc029888e5e245c83a83ee6e3f12bf8f08992e439e061e4db9076eb"} Dec 16 07:18:50 crc kubenswrapper[4823]: I1216 07:18:50.360712 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c637a3efacc029888e5e245c83a83ee6e3f12bf8f08992e439e061e4db9076eb" Dec 16 07:18:50 crc kubenswrapper[4823]: I1216 07:18:50.366538 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9dca6476-18f2-4c1a-8c95-e894c5f9facd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9dca6476-18f2-4c1a-8c95-e894c5f9facd" (UID: "9dca6476-18f2-4c1a-8c95-e894c5f9facd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:18:50 crc kubenswrapper[4823]: I1216 07:18:50.427214 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9dca6476-18f2-4c1a-8c95-e894c5f9facd-config-data" (OuterVolumeSpecName: "config-data") pod "9dca6476-18f2-4c1a-8c95-e894c5f9facd" (UID: "9dca6476-18f2-4c1a-8c95-e894c5f9facd"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:18:50 crc kubenswrapper[4823]: I1216 07:18:50.441053 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9dca6476-18f2-4c1a-8c95-e894c5f9facd-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 07:18:50 crc kubenswrapper[4823]: I1216 07:18:50.441108 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dca6476-18f2-4c1a-8c95-e894c5f9facd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:18:50 crc kubenswrapper[4823]: I1216 07:18:50.570472 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f6f8cb849-wxg9m" Dec 16 07:18:50 crc kubenswrapper[4823]: I1216 07:18:50.648132 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/166784d2-df96-4ee8-a1a3-22a967bff610-dns-svc\") pod \"166784d2-df96-4ee8-a1a3-22a967bff610\" (UID: \"166784d2-df96-4ee8-a1a3-22a967bff610\") " Dec 16 07:18:50 crc kubenswrapper[4823]: I1216 07:18:50.648191 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rzvxj\" (UniqueName: \"kubernetes.io/projected/166784d2-df96-4ee8-a1a3-22a967bff610-kube-api-access-rzvxj\") pod \"166784d2-df96-4ee8-a1a3-22a967bff610\" (UID: \"166784d2-df96-4ee8-a1a3-22a967bff610\") " Dec 16 07:18:50 crc kubenswrapper[4823]: I1216 07:18:50.648220 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/166784d2-df96-4ee8-a1a3-22a967bff610-dns-swift-storage-0\") pod \"166784d2-df96-4ee8-a1a3-22a967bff610\" (UID: \"166784d2-df96-4ee8-a1a3-22a967bff610\") " Dec 16 07:18:50 crc kubenswrapper[4823]: I1216 07:18:50.648303 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/166784d2-df96-4ee8-a1a3-22a967bff610-ovsdbserver-nb\") pod \"166784d2-df96-4ee8-a1a3-22a967bff610\" (UID: \"166784d2-df96-4ee8-a1a3-22a967bff610\") " Dec 16 07:18:50 crc kubenswrapper[4823]: I1216 07:18:50.648384 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/166784d2-df96-4ee8-a1a3-22a967bff610-ovsdbserver-sb\") pod \"166784d2-df96-4ee8-a1a3-22a967bff610\" (UID: \"166784d2-df96-4ee8-a1a3-22a967bff610\") " Dec 16 07:18:50 crc kubenswrapper[4823]: I1216 07:18:50.648449 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/166784d2-df96-4ee8-a1a3-22a967bff610-config\") pod \"166784d2-df96-4ee8-a1a3-22a967bff610\" (UID: \"166784d2-df96-4ee8-a1a3-22a967bff610\") " Dec 16 07:18:50 crc kubenswrapper[4823]: I1216 07:18:50.677195 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/166784d2-df96-4ee8-a1a3-22a967bff610-kube-api-access-rzvxj" (OuterVolumeSpecName: "kube-api-access-rzvxj") pod "166784d2-df96-4ee8-a1a3-22a967bff610" (UID: "166784d2-df96-4ee8-a1a3-22a967bff610"). InnerVolumeSpecName "kube-api-access-rzvxj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:18:50 crc kubenswrapper[4823]: I1216 07:18:50.734162 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/166784d2-df96-4ee8-a1a3-22a967bff610-config" (OuterVolumeSpecName: "config") pod "166784d2-df96-4ee8-a1a3-22a967bff610" (UID: "166784d2-df96-4ee8-a1a3-22a967bff610"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:18:50 crc kubenswrapper[4823]: I1216 07:18:50.759667 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rzvxj\" (UniqueName: \"kubernetes.io/projected/166784d2-df96-4ee8-a1a3-22a967bff610-kube-api-access-rzvxj\") on node \"crc\" DevicePath \"\"" Dec 16 07:18:50 crc kubenswrapper[4823]: I1216 07:18:50.760015 4823 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/166784d2-df96-4ee8-a1a3-22a967bff610-config\") on node \"crc\" DevicePath \"\"" Dec 16 07:18:50 crc kubenswrapper[4823]: I1216 07:18:50.761325 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/166784d2-df96-4ee8-a1a3-22a967bff610-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "166784d2-df96-4ee8-a1a3-22a967bff610" (UID: "166784d2-df96-4ee8-a1a3-22a967bff610"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:18:50 crc kubenswrapper[4823]: I1216 07:18:50.816697 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/166784d2-df96-4ee8-a1a3-22a967bff610-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "166784d2-df96-4ee8-a1a3-22a967bff610" (UID: "166784d2-df96-4ee8-a1a3-22a967bff610"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:18:50 crc kubenswrapper[4823]: I1216 07:18:50.822129 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/166784d2-df96-4ee8-a1a3-22a967bff610-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "166784d2-df96-4ee8-a1a3-22a967bff610" (UID: "166784d2-df96-4ee8-a1a3-22a967bff610"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:18:50 crc kubenswrapper[4823]: I1216 07:18:50.822836 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/166784d2-df96-4ee8-a1a3-22a967bff610-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "166784d2-df96-4ee8-a1a3-22a967bff610" (UID: "166784d2-df96-4ee8-a1a3-22a967bff610"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:18:50 crc kubenswrapper[4823]: I1216 07:18:50.865221 4823 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/166784d2-df96-4ee8-a1a3-22a967bff610-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 16 07:18:50 crc kubenswrapper[4823]: I1216 07:18:50.865252 4823 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/166784d2-df96-4ee8-a1a3-22a967bff610-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 16 07:18:50 crc kubenswrapper[4823]: I1216 07:18:50.865285 4823 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/166784d2-df96-4ee8-a1a3-22a967bff610-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 16 07:18:50 crc kubenswrapper[4823]: I1216 07:18:50.865297 4823 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/166784d2-df96-4ee8-a1a3-22a967bff610-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 16 07:18:51 crc kubenswrapper[4823]: I1216 07:18:51.236088 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-6c7767d9f4-5rbv6"] Dec 16 07:18:51 crc kubenswrapper[4823]: E1216 07:18:51.237085 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="166784d2-df96-4ee8-a1a3-22a967bff610" containerName="dnsmasq-dns" Dec 16 07:18:51 crc kubenswrapper[4823]: I1216 
07:18:51.237098 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="166784d2-df96-4ee8-a1a3-22a967bff610" containerName="dnsmasq-dns" Dec 16 07:18:51 crc kubenswrapper[4823]: E1216 07:18:51.237122 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9dca6476-18f2-4c1a-8c95-e894c5f9facd" containerName="keystone-bootstrap" Dec 16 07:18:51 crc kubenswrapper[4823]: I1216 07:18:51.237129 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="9dca6476-18f2-4c1a-8c95-e894c5f9facd" containerName="keystone-bootstrap" Dec 16 07:18:51 crc kubenswrapper[4823]: E1216 07:18:51.237153 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="166784d2-df96-4ee8-a1a3-22a967bff610" containerName="init" Dec 16 07:18:51 crc kubenswrapper[4823]: I1216 07:18:51.237159 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="166784d2-df96-4ee8-a1a3-22a967bff610" containerName="init" Dec 16 07:18:51 crc kubenswrapper[4823]: I1216 07:18:51.237452 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="166784d2-df96-4ee8-a1a3-22a967bff610" containerName="dnsmasq-dns" Dec 16 07:18:51 crc kubenswrapper[4823]: I1216 07:18:51.237472 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="9dca6476-18f2-4c1a-8c95-e894c5f9facd" containerName="keystone-bootstrap" Dec 16 07:18:51 crc kubenswrapper[4823]: I1216 07:18:51.238259 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-6c7767d9f4-5rbv6" Dec 16 07:18:51 crc kubenswrapper[4823]: I1216 07:18:51.242905 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 16 07:18:51 crc kubenswrapper[4823]: I1216 07:18:51.255419 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-wtcrq" Dec 16 07:18:51 crc kubenswrapper[4823]: I1216 07:18:51.255703 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 16 07:18:51 crc kubenswrapper[4823]: I1216 07:18:51.255822 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 16 07:18:51 crc kubenswrapper[4823]: I1216 07:18:51.256017 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Dec 16 07:18:51 crc kubenswrapper[4823]: I1216 07:18:51.256594 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Dec 16 07:18:51 crc kubenswrapper[4823]: I1216 07:18:51.333119 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6c7767d9f4-5rbv6"] Dec 16 07:18:51 crc kubenswrapper[4823]: I1216 07:18:51.349941 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-59fd5f5fb-h7tf5"] Dec 16 07:18:51 crc kubenswrapper[4823]: I1216 07:18:51.387656 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d7a88b40-28bf-4b43-bed8-0b3df3baec5c-credential-keys\") pod \"keystone-6c7767d9f4-5rbv6\" (UID: \"d7a88b40-28bf-4b43-bed8-0b3df3baec5c\") " pod="openstack/keystone-6c7767d9f4-5rbv6" Dec 16 07:18:51 crc kubenswrapper[4823]: I1216 07:18:51.387718 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nw5tz\" (UniqueName: 
\"kubernetes.io/projected/d7a88b40-28bf-4b43-bed8-0b3df3baec5c-kube-api-access-nw5tz\") pod \"keystone-6c7767d9f4-5rbv6\" (UID: \"d7a88b40-28bf-4b43-bed8-0b3df3baec5c\") " pod="openstack/keystone-6c7767d9f4-5rbv6" Dec 16 07:18:51 crc kubenswrapper[4823]: I1216 07:18:51.387781 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d7a88b40-28bf-4b43-bed8-0b3df3baec5c-fernet-keys\") pod \"keystone-6c7767d9f4-5rbv6\" (UID: \"d7a88b40-28bf-4b43-bed8-0b3df3baec5c\") " pod="openstack/keystone-6c7767d9f4-5rbv6" Dec 16 07:18:51 crc kubenswrapper[4823]: I1216 07:18:51.387804 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7a88b40-28bf-4b43-bed8-0b3df3baec5c-public-tls-certs\") pod \"keystone-6c7767d9f4-5rbv6\" (UID: \"d7a88b40-28bf-4b43-bed8-0b3df3baec5c\") " pod="openstack/keystone-6c7767d9f4-5rbv6" Dec 16 07:18:51 crc kubenswrapper[4823]: I1216 07:18:51.387841 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7a88b40-28bf-4b43-bed8-0b3df3baec5c-combined-ca-bundle\") pod \"keystone-6c7767d9f4-5rbv6\" (UID: \"d7a88b40-28bf-4b43-bed8-0b3df3baec5c\") " pod="openstack/keystone-6c7767d9f4-5rbv6" Dec 16 07:18:51 crc kubenswrapper[4823]: I1216 07:18:51.387907 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7a88b40-28bf-4b43-bed8-0b3df3baec5c-internal-tls-certs\") pod \"keystone-6c7767d9f4-5rbv6\" (UID: \"d7a88b40-28bf-4b43-bed8-0b3df3baec5c\") " pod="openstack/keystone-6c7767d9f4-5rbv6" Dec 16 07:18:51 crc kubenswrapper[4823]: I1216 07:18:51.387959 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/d7a88b40-28bf-4b43-bed8-0b3df3baec5c-config-data\") pod \"keystone-6c7767d9f4-5rbv6\" (UID: \"d7a88b40-28bf-4b43-bed8-0b3df3baec5c\") " pod="openstack/keystone-6c7767d9f4-5rbv6" Dec 16 07:18:51 crc kubenswrapper[4823]: I1216 07:18:51.388005 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7a88b40-28bf-4b43-bed8-0b3df3baec5c-scripts\") pod \"keystone-6c7767d9f4-5rbv6\" (UID: \"d7a88b40-28bf-4b43-bed8-0b3df3baec5c\") " pod="openstack/keystone-6c7767d9f4-5rbv6" Dec 16 07:18:51 crc kubenswrapper[4823]: I1216 07:18:51.445100 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-59fd5f5fb-h7tf5" event={"ID":"196356f3-e866-4cf1-b3e8-eba3d9e4c99f","Type":"ContainerStarted","Data":"541126e09e93db247581ec589e02c3df986338da5e0953de66629883930267f7"} Dec 16 07:18:51 crc kubenswrapper[4823]: I1216 07:18:51.482046 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f6f8cb849-wxg9m" event={"ID":"166784d2-df96-4ee8-a1a3-22a967bff610","Type":"ContainerDied","Data":"b84d1788d23315388f454377246223a6daabce45f9f84cd5e6fb1c9f2bb2087a"} Dec 16 07:18:51 crc kubenswrapper[4823]: I1216 07:18:51.482101 4823 scope.go:117] "RemoveContainer" containerID="915bf11a2d2e78e3982dc67bdcd3b8756575f7fc1659fb4dbf5ed5f2329a9e62" Dec 16 07:18:51 crc kubenswrapper[4823]: I1216 07:18:51.482228 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6f6f8cb849-wxg9m" Dec 16 07:18:51 crc kubenswrapper[4823]: I1216 07:18:51.492134 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7a88b40-28bf-4b43-bed8-0b3df3baec5c-config-data\") pod \"keystone-6c7767d9f4-5rbv6\" (UID: \"d7a88b40-28bf-4b43-bed8-0b3df3baec5c\") " pod="openstack/keystone-6c7767d9f4-5rbv6" Dec 16 07:18:51 crc kubenswrapper[4823]: I1216 07:18:51.492190 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7a88b40-28bf-4b43-bed8-0b3df3baec5c-scripts\") pod \"keystone-6c7767d9f4-5rbv6\" (UID: \"d7a88b40-28bf-4b43-bed8-0b3df3baec5c\") " pod="openstack/keystone-6c7767d9f4-5rbv6" Dec 16 07:18:51 crc kubenswrapper[4823]: I1216 07:18:51.492233 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d7a88b40-28bf-4b43-bed8-0b3df3baec5c-credential-keys\") pod \"keystone-6c7767d9f4-5rbv6\" (UID: \"d7a88b40-28bf-4b43-bed8-0b3df3baec5c\") " pod="openstack/keystone-6c7767d9f4-5rbv6" Dec 16 07:18:51 crc kubenswrapper[4823]: I1216 07:18:51.492265 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nw5tz\" (UniqueName: \"kubernetes.io/projected/d7a88b40-28bf-4b43-bed8-0b3df3baec5c-kube-api-access-nw5tz\") pod \"keystone-6c7767d9f4-5rbv6\" (UID: \"d7a88b40-28bf-4b43-bed8-0b3df3baec5c\") " pod="openstack/keystone-6c7767d9f4-5rbv6" Dec 16 07:18:51 crc kubenswrapper[4823]: I1216 07:18:51.492302 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d7a88b40-28bf-4b43-bed8-0b3df3baec5c-fernet-keys\") pod \"keystone-6c7767d9f4-5rbv6\" (UID: \"d7a88b40-28bf-4b43-bed8-0b3df3baec5c\") " pod="openstack/keystone-6c7767d9f4-5rbv6" Dec 16 07:18:51 crc 
kubenswrapper[4823]: I1216 07:18:51.492331 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7a88b40-28bf-4b43-bed8-0b3df3baec5c-public-tls-certs\") pod \"keystone-6c7767d9f4-5rbv6\" (UID: \"d7a88b40-28bf-4b43-bed8-0b3df3baec5c\") " pod="openstack/keystone-6c7767d9f4-5rbv6" Dec 16 07:18:51 crc kubenswrapper[4823]: I1216 07:18:51.492358 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7a88b40-28bf-4b43-bed8-0b3df3baec5c-combined-ca-bundle\") pod \"keystone-6c7767d9f4-5rbv6\" (UID: \"d7a88b40-28bf-4b43-bed8-0b3df3baec5c\") " pod="openstack/keystone-6c7767d9f4-5rbv6" Dec 16 07:18:51 crc kubenswrapper[4823]: I1216 07:18:51.492402 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7a88b40-28bf-4b43-bed8-0b3df3baec5c-internal-tls-certs\") pod \"keystone-6c7767d9f4-5rbv6\" (UID: \"d7a88b40-28bf-4b43-bed8-0b3df3baec5c\") " pod="openstack/keystone-6c7767d9f4-5rbv6" Dec 16 07:18:51 crc kubenswrapper[4823]: I1216 07:18:51.498635 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7a88b40-28bf-4b43-bed8-0b3df3baec5c-internal-tls-certs\") pod \"keystone-6c7767d9f4-5rbv6\" (UID: \"d7a88b40-28bf-4b43-bed8-0b3df3baec5c\") " pod="openstack/keystone-6c7767d9f4-5rbv6" Dec 16 07:18:51 crc kubenswrapper[4823]: I1216 07:18:51.512557 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7a88b40-28bf-4b43-bed8-0b3df3baec5c-config-data\") pod \"keystone-6c7767d9f4-5rbv6\" (UID: \"d7a88b40-28bf-4b43-bed8-0b3df3baec5c\") " pod="openstack/keystone-6c7767d9f4-5rbv6" Dec 16 07:18:51 crc kubenswrapper[4823]: I1216 07:18:51.515720 4823 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7a88b40-28bf-4b43-bed8-0b3df3baec5c-combined-ca-bundle\") pod \"keystone-6c7767d9f4-5rbv6\" (UID: \"d7a88b40-28bf-4b43-bed8-0b3df3baec5c\") " pod="openstack/keystone-6c7767d9f4-5rbv6" Dec 16 07:18:51 crc kubenswrapper[4823]: I1216 07:18:51.516354 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d7a88b40-28bf-4b43-bed8-0b3df3baec5c-credential-keys\") pod \"keystone-6c7767d9f4-5rbv6\" (UID: \"d7a88b40-28bf-4b43-bed8-0b3df3baec5c\") " pod="openstack/keystone-6c7767d9f4-5rbv6" Dec 16 07:18:51 crc kubenswrapper[4823]: I1216 07:18:51.517137 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7a88b40-28bf-4b43-bed8-0b3df3baec5c-scripts\") pod \"keystone-6c7767d9f4-5rbv6\" (UID: \"d7a88b40-28bf-4b43-bed8-0b3df3baec5c\") " pod="openstack/keystone-6c7767d9f4-5rbv6" Dec 16 07:18:51 crc kubenswrapper[4823]: I1216 07:18:51.523679 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nw5tz\" (UniqueName: \"kubernetes.io/projected/d7a88b40-28bf-4b43-bed8-0b3df3baec5c-kube-api-access-nw5tz\") pod \"keystone-6c7767d9f4-5rbv6\" (UID: \"d7a88b40-28bf-4b43-bed8-0b3df3baec5c\") " pod="openstack/keystone-6c7767d9f4-5rbv6" Dec 16 07:18:51 crc kubenswrapper[4823]: I1216 07:18:51.524356 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7a88b40-28bf-4b43-bed8-0b3df3baec5c-public-tls-certs\") pod \"keystone-6c7767d9f4-5rbv6\" (UID: \"d7a88b40-28bf-4b43-bed8-0b3df3baec5c\") " pod="openstack/keystone-6c7767d9f4-5rbv6" Dec 16 07:18:51 crc kubenswrapper[4823]: I1216 07:18:51.526065 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-bbf9986cc-sjljb" 
event={"ID":"f2b1ed60-7cb0-48f0-aebf-3de778dbb95b","Type":"ContainerStarted","Data":"63b9a035e047de6a0a1943c6d043167a9dedd896ef10da24426158630e0de9b7"} Dec 16 07:18:51 crc kubenswrapper[4823]: I1216 07:18:51.527059 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-bbf9986cc-sjljb" Dec 16 07:18:51 crc kubenswrapper[4823]: I1216 07:18:51.534418 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d7a88b40-28bf-4b43-bed8-0b3df3baec5c-fernet-keys\") pod \"keystone-6c7767d9f4-5rbv6\" (UID: \"d7a88b40-28bf-4b43-bed8-0b3df3baec5c\") " pod="openstack/keystone-6c7767d9f4-5rbv6" Dec 16 07:18:51 crc kubenswrapper[4823]: I1216 07:18:51.550113 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6f6f8cb849-wxg9m"] Dec 16 07:18:51 crc kubenswrapper[4823]: I1216 07:18:51.560762 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-q69qd" event={"ID":"59c74f3a-8b4c-47eb-8d8d-af32e667d121","Type":"ContainerStarted","Data":"9645652666b527c7d0539b4988e942317ac4144cec04a21d624b294741e7213e"} Dec 16 07:18:51 crc kubenswrapper[4823]: I1216 07:18:51.566838 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6f6f8cb849-wxg9m"] Dec 16 07:18:51 crc kubenswrapper[4823]: I1216 07:18:51.586295 4823 scope.go:117] "RemoveContainer" containerID="4c4254e8613eac97614956581232adf031224db77cbd5566f427b617354b173f" Dec 16 07:18:51 crc kubenswrapper[4823]: I1216 07:18:51.589069 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-bbf9986cc-sjljb" podStartSLOduration=12.58905613 podStartE2EDuration="12.58905613s" podCreationTimestamp="2025-12-16 07:18:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 07:18:51.580787711 +0000 UTC m=+1410.069353834" 
watchObservedRunningTime="2025-12-16 07:18:51.58905613 +0000 UTC m=+1410.077622253" Dec 16 07:18:51 crc kubenswrapper[4823]: I1216 07:18:51.606784 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-q69qd" podStartSLOduration=3.203605481 podStartE2EDuration="40.606759885s" podCreationTimestamp="2025-12-16 07:18:11 +0000 UTC" firstStartedPulling="2025-12-16 07:18:12.842114529 +0000 UTC m=+1371.330680652" lastFinishedPulling="2025-12-16 07:18:50.245268933 +0000 UTC m=+1408.733835056" observedRunningTime="2025-12-16 07:18:51.602395438 +0000 UTC m=+1410.090961561" watchObservedRunningTime="2025-12-16 07:18:51.606759885 +0000 UTC m=+1410.095326008" Dec 16 07:18:51 crc kubenswrapper[4823]: I1216 07:18:51.616177 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"22c10a9c-6dba-4d35-a0d8-2ef0b82352cb","Type":"ContainerStarted","Data":"613f620bca924e060d3999a84035724c6efbc43f0b4c818377f5ccf218c6557f"} Dec 16 07:18:51 crc kubenswrapper[4823]: I1216 07:18:51.768419 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-6c7767d9f4-5rbv6" Dec 16 07:18:51 crc kubenswrapper[4823]: I1216 07:18:51.784165 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="166784d2-df96-4ee8-a1a3-22a967bff610" path="/var/lib/kubelet/pods/166784d2-df96-4ee8-a1a3-22a967bff610/volumes" Dec 16 07:18:52 crc kubenswrapper[4823]: I1216 07:18:52.254089 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6c7767d9f4-5rbv6"] Dec 16 07:18:52 crc kubenswrapper[4823]: W1216 07:18:52.255353 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd7a88b40_28bf_4b43_bed8_0b3df3baec5c.slice/crio-bb835f7ff7aad3e6fa75a7c0216849fa0da434bcb6199e32d801c9063e346a67 WatchSource:0}: Error finding container bb835f7ff7aad3e6fa75a7c0216849fa0da434bcb6199e32d801c9063e346a67: Status 404 returned error can't find the container with id bb835f7ff7aad3e6fa75a7c0216849fa0da434bcb6199e32d801c9063e346a67 Dec 16 07:18:52 crc kubenswrapper[4823]: I1216 07:18:52.634184 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6c7767d9f4-5rbv6" event={"ID":"d7a88b40-28bf-4b43-bed8-0b3df3baec5c","Type":"ContainerStarted","Data":"bb835f7ff7aad3e6fa75a7c0216849fa0da434bcb6199e32d801c9063e346a67"} Dec 16 07:18:52 crc kubenswrapper[4823]: I1216 07:18:52.639292 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-59fd5f5fb-h7tf5" event={"ID":"196356f3-e866-4cf1-b3e8-eba3d9e4c99f","Type":"ContainerStarted","Data":"754f57f4d21e96f08486902a1f29fc3d73326be71cf93cc74a912ea8e5adfbfe"} Dec 16 07:18:52 crc kubenswrapper[4823]: I1216 07:18:52.643586 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-n2br8" event={"ID":"fcd5e697-1360-4376-8160-ba0bc7fa56f8","Type":"ContainerStarted","Data":"31248dd72823a40fe4ee23b7fbaa7a419c7a61036cec2f854466ad00f8a80f4b"} Dec 16 07:18:53 crc kubenswrapper[4823]: I1216 
07:18:53.711878 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6c7767d9f4-5rbv6" event={"ID":"d7a88b40-28bf-4b43-bed8-0b3df3baec5c","Type":"ContainerStarted","Data":"c0cd38487b75afdb67a7225ee2f0fe111d46a163417ffe7f85edb1cbb15aead4"} Dec 16 07:18:53 crc kubenswrapper[4823]: I1216 07:18:53.713324 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-6c7767d9f4-5rbv6" Dec 16 07:18:53 crc kubenswrapper[4823]: I1216 07:18:53.715799 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-59fd5f5fb-h7tf5" event={"ID":"196356f3-e866-4cf1-b3e8-eba3d9e4c99f","Type":"ContainerStarted","Data":"fa7ad139671c8c3444b9e62aff507fb0fc6b2d2d087722f71ba9f8cc7977708c"} Dec 16 07:18:53 crc kubenswrapper[4823]: I1216 07:18:53.715832 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-59fd5f5fb-h7tf5" Dec 16 07:18:53 crc kubenswrapper[4823]: I1216 07:18:53.716287 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-59fd5f5fb-h7tf5" Dec 16 07:18:53 crc kubenswrapper[4823]: I1216 07:18:53.760945 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-6c7767d9f4-5rbv6" podStartSLOduration=2.7609270439999998 podStartE2EDuration="2.760927044s" podCreationTimestamp="2025-12-16 07:18:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 07:18:53.753407108 +0000 UTC m=+1412.241973241" watchObservedRunningTime="2025-12-16 07:18:53.760927044 +0000 UTC m=+1412.249493167" Dec 16 07:18:53 crc kubenswrapper[4823]: I1216 07:18:53.796427 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-n2br8" podStartSLOduration=5.475004062 podStartE2EDuration="43.796407296s" podCreationTimestamp="2025-12-16 07:18:10 +0000 UTC" firstStartedPulling="2025-12-16 
07:18:11.956013516 +0000 UTC m=+1370.444579639" lastFinishedPulling="2025-12-16 07:18:50.27741675 +0000 UTC m=+1408.765982873" observedRunningTime="2025-12-16 07:18:53.785480113 +0000 UTC m=+1412.274046236" watchObservedRunningTime="2025-12-16 07:18:53.796407296 +0000 UTC m=+1412.284973419" Dec 16 07:18:53 crc kubenswrapper[4823]: I1216 07:18:53.814801 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-59fd5f5fb-h7tf5" podStartSLOduration=12.81477647 podStartE2EDuration="12.81477647s" podCreationTimestamp="2025-12-16 07:18:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 07:18:53.81283713 +0000 UTC m=+1412.301403273" watchObservedRunningTime="2025-12-16 07:18:53.81477647 +0000 UTC m=+1412.303342593" Dec 16 07:18:54 crc kubenswrapper[4823]: I1216 07:18:54.726353 4823 generic.go:334] "Generic (PLEG): container finished" podID="59c74f3a-8b4c-47eb-8d8d-af32e667d121" containerID="9645652666b527c7d0539b4988e942317ac4144cec04a21d624b294741e7213e" exitCode=0 Dec 16 07:18:54 crc kubenswrapper[4823]: I1216 07:18:54.726444 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-q69qd" event={"ID":"59c74f3a-8b4c-47eb-8d8d-af32e667d121","Type":"ContainerDied","Data":"9645652666b527c7d0539b4988e942317ac4144cec04a21d624b294741e7213e"} Dec 16 07:18:57 crc kubenswrapper[4823]: I1216 07:18:57.764495 4823 generic.go:334] "Generic (PLEG): container finished" podID="fcd5e697-1360-4376-8160-ba0bc7fa56f8" containerID="31248dd72823a40fe4ee23b7fbaa7a419c7a61036cec2f854466ad00f8a80f4b" exitCode=0 Dec 16 07:18:57 crc kubenswrapper[4823]: I1216 07:18:57.764596 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-n2br8" event={"ID":"fcd5e697-1360-4376-8160-ba0bc7fa56f8","Type":"ContainerDied","Data":"31248dd72823a40fe4ee23b7fbaa7a419c7a61036cec2f854466ad00f8a80f4b"} Dec 16 07:18:59 crc 
kubenswrapper[4823]: I1216 07:18:59.138929 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-q69qd" Dec 16 07:18:59 crc kubenswrapper[4823]: I1216 07:18:59.249402 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/59c74f3a-8b4c-47eb-8d8d-af32e667d121-db-sync-config-data\") pod \"59c74f3a-8b4c-47eb-8d8d-af32e667d121\" (UID: \"59c74f3a-8b4c-47eb-8d8d-af32e667d121\") " Dec 16 07:18:59 crc kubenswrapper[4823]: I1216 07:18:59.249447 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lldfz\" (UniqueName: \"kubernetes.io/projected/59c74f3a-8b4c-47eb-8d8d-af32e667d121-kube-api-access-lldfz\") pod \"59c74f3a-8b4c-47eb-8d8d-af32e667d121\" (UID: \"59c74f3a-8b4c-47eb-8d8d-af32e667d121\") " Dec 16 07:18:59 crc kubenswrapper[4823]: I1216 07:18:59.249541 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59c74f3a-8b4c-47eb-8d8d-af32e667d121-combined-ca-bundle\") pod \"59c74f3a-8b4c-47eb-8d8d-af32e667d121\" (UID: \"59c74f3a-8b4c-47eb-8d8d-af32e667d121\") " Dec 16 07:18:59 crc kubenswrapper[4823]: I1216 07:18:59.256124 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59c74f3a-8b4c-47eb-8d8d-af32e667d121-kube-api-access-lldfz" (OuterVolumeSpecName: "kube-api-access-lldfz") pod "59c74f3a-8b4c-47eb-8d8d-af32e667d121" (UID: "59c74f3a-8b4c-47eb-8d8d-af32e667d121"). InnerVolumeSpecName "kube-api-access-lldfz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:18:59 crc kubenswrapper[4823]: I1216 07:18:59.256163 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59c74f3a-8b4c-47eb-8d8d-af32e667d121-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "59c74f3a-8b4c-47eb-8d8d-af32e667d121" (UID: "59c74f3a-8b4c-47eb-8d8d-af32e667d121"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:18:59 crc kubenswrapper[4823]: I1216 07:18:59.282452 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59c74f3a-8b4c-47eb-8d8d-af32e667d121-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "59c74f3a-8b4c-47eb-8d8d-af32e667d121" (UID: "59c74f3a-8b4c-47eb-8d8d-af32e667d121"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:18:59 crc kubenswrapper[4823]: I1216 07:18:59.351834 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59c74f3a-8b4c-47eb-8d8d-af32e667d121-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:18:59 crc kubenswrapper[4823]: I1216 07:18:59.351888 4823 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/59c74f3a-8b4c-47eb-8d8d-af32e667d121-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 07:18:59 crc kubenswrapper[4823]: I1216 07:18:59.351900 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lldfz\" (UniqueName: \"kubernetes.io/projected/59c74f3a-8b4c-47eb-8d8d-af32e667d121-kube-api-access-lldfz\") on node \"crc\" DevicePath \"\"" Dec 16 07:18:59 crc kubenswrapper[4823]: I1216 07:18:59.522239 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-n2br8" Dec 16 07:18:59 crc kubenswrapper[4823]: I1216 07:18:59.655492 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/fcd5e697-1360-4376-8160-ba0bc7fa56f8-db-sync-config-data\") pod \"fcd5e697-1360-4376-8160-ba0bc7fa56f8\" (UID: \"fcd5e697-1360-4376-8160-ba0bc7fa56f8\") " Dec 16 07:18:59 crc kubenswrapper[4823]: I1216 07:18:59.655579 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fcd5e697-1360-4376-8160-ba0bc7fa56f8-scripts\") pod \"fcd5e697-1360-4376-8160-ba0bc7fa56f8\" (UID: \"fcd5e697-1360-4376-8160-ba0bc7fa56f8\") " Dec 16 07:18:59 crc kubenswrapper[4823]: I1216 07:18:59.655609 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m4dkp\" (UniqueName: \"kubernetes.io/projected/fcd5e697-1360-4376-8160-ba0bc7fa56f8-kube-api-access-m4dkp\") pod \"fcd5e697-1360-4376-8160-ba0bc7fa56f8\" (UID: \"fcd5e697-1360-4376-8160-ba0bc7fa56f8\") " Dec 16 07:18:59 crc kubenswrapper[4823]: I1216 07:18:59.655634 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fcd5e697-1360-4376-8160-ba0bc7fa56f8-etc-machine-id\") pod \"fcd5e697-1360-4376-8160-ba0bc7fa56f8\" (UID: \"fcd5e697-1360-4376-8160-ba0bc7fa56f8\") " Dec 16 07:18:59 crc kubenswrapper[4823]: I1216 07:18:59.655694 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fcd5e697-1360-4376-8160-ba0bc7fa56f8-config-data\") pod \"fcd5e697-1360-4376-8160-ba0bc7fa56f8\" (UID: \"fcd5e697-1360-4376-8160-ba0bc7fa56f8\") " Dec 16 07:18:59 crc kubenswrapper[4823]: I1216 07:18:59.655765 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/fcd5e697-1360-4376-8160-ba0bc7fa56f8-combined-ca-bundle\") pod \"fcd5e697-1360-4376-8160-ba0bc7fa56f8\" (UID: \"fcd5e697-1360-4376-8160-ba0bc7fa56f8\") " Dec 16 07:18:59 crc kubenswrapper[4823]: I1216 07:18:59.656460 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fcd5e697-1360-4376-8160-ba0bc7fa56f8-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "fcd5e697-1360-4376-8160-ba0bc7fa56f8" (UID: "fcd5e697-1360-4376-8160-ba0bc7fa56f8"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 07:18:59 crc kubenswrapper[4823]: I1216 07:18:59.659323 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fcd5e697-1360-4376-8160-ba0bc7fa56f8-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "fcd5e697-1360-4376-8160-ba0bc7fa56f8" (UID: "fcd5e697-1360-4376-8160-ba0bc7fa56f8"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:18:59 crc kubenswrapper[4823]: I1216 07:18:59.660193 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fcd5e697-1360-4376-8160-ba0bc7fa56f8-kube-api-access-m4dkp" (OuterVolumeSpecName: "kube-api-access-m4dkp") pod "fcd5e697-1360-4376-8160-ba0bc7fa56f8" (UID: "fcd5e697-1360-4376-8160-ba0bc7fa56f8"). InnerVolumeSpecName "kube-api-access-m4dkp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:18:59 crc kubenswrapper[4823]: I1216 07:18:59.660663 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fcd5e697-1360-4376-8160-ba0bc7fa56f8-scripts" (OuterVolumeSpecName: "scripts") pod "fcd5e697-1360-4376-8160-ba0bc7fa56f8" (UID: "fcd5e697-1360-4376-8160-ba0bc7fa56f8"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:18:59 crc kubenswrapper[4823]: I1216 07:18:59.691691 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fcd5e697-1360-4376-8160-ba0bc7fa56f8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fcd5e697-1360-4376-8160-ba0bc7fa56f8" (UID: "fcd5e697-1360-4376-8160-ba0bc7fa56f8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:18:59 crc kubenswrapper[4823]: E1216 07:18:59.695776 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="22c10a9c-6dba-4d35-a0d8-2ef0b82352cb" Dec 16 07:18:59 crc kubenswrapper[4823]: I1216 07:18:59.706488 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fcd5e697-1360-4376-8160-ba0bc7fa56f8-config-data" (OuterVolumeSpecName: "config-data") pod "fcd5e697-1360-4376-8160-ba0bc7fa56f8" (UID: "fcd5e697-1360-4376-8160-ba0bc7fa56f8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:18:59 crc kubenswrapper[4823]: I1216 07:18:59.757331 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fcd5e697-1360-4376-8160-ba0bc7fa56f8-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 07:18:59 crc kubenswrapper[4823]: I1216 07:18:59.757371 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcd5e697-1360-4376-8160-ba0bc7fa56f8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:18:59 crc kubenswrapper[4823]: I1216 07:18:59.757388 4823 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/fcd5e697-1360-4376-8160-ba0bc7fa56f8-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 07:18:59 crc kubenswrapper[4823]: I1216 07:18:59.757397 4823 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fcd5e697-1360-4376-8160-ba0bc7fa56f8-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:18:59 crc kubenswrapper[4823]: I1216 07:18:59.757406 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m4dkp\" (UniqueName: \"kubernetes.io/projected/fcd5e697-1360-4376-8160-ba0bc7fa56f8-kube-api-access-m4dkp\") on node \"crc\" DevicePath \"\"" Dec 16 07:18:59 crc kubenswrapper[4823]: I1216 07:18:59.757416 4823 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fcd5e697-1360-4376-8160-ba0bc7fa56f8-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 16 07:18:59 crc kubenswrapper[4823]: I1216 07:18:59.790140 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-n2br8" Dec 16 07:18:59 crc kubenswrapper[4823]: I1216 07:18:59.791923 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-q69qd" Dec 16 07:18:59 crc kubenswrapper[4823]: I1216 07:18:59.798071 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-n2br8" event={"ID":"fcd5e697-1360-4376-8160-ba0bc7fa56f8","Type":"ContainerDied","Data":"fbcfb626861a05972d3142b45ff6fa52e798cda73a2a2f309993c022e980d418"} Dec 16 07:18:59 crc kubenswrapper[4823]: I1216 07:18:59.798112 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fbcfb626861a05972d3142b45ff6fa52e798cda73a2a2f309993c022e980d418" Dec 16 07:18:59 crc kubenswrapper[4823]: I1216 07:18:59.798125 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-q69qd" event={"ID":"59c74f3a-8b4c-47eb-8d8d-af32e667d121","Type":"ContainerDied","Data":"ed78fb69cd2dd95f410ea4905a49bb5c04f60faefd3d37dca05b373efbcfb246"} Dec 16 07:18:59 crc kubenswrapper[4823]: I1216 07:18:59.798134 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ed78fb69cd2dd95f410ea4905a49bb5c04f60faefd3d37dca05b373efbcfb246" Dec 16 07:18:59 crc kubenswrapper[4823]: I1216 07:18:59.807973 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"22c10a9c-6dba-4d35-a0d8-2ef0b82352cb","Type":"ContainerStarted","Data":"b2ec0bdc64647f0f25b6b8ce3a675ed50d759193297e4d2a2fde6255b767495b"} Dec 16 07:18:59 crc kubenswrapper[4823]: I1216 07:18:59.808146 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="22c10a9c-6dba-4d35-a0d8-2ef0b82352cb" containerName="ceilometer-notification-agent" containerID="cri-o://e1d8bf154774451c8501c6f494acda39ed4df5c898f297ae1bf7629b652517dc" gracePeriod=30 Dec 16 07:18:59 crc kubenswrapper[4823]: I1216 07:18:59.808418 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 16 07:18:59 crc kubenswrapper[4823]: I1216 07:18:59.808714 4823 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="22c10a9c-6dba-4d35-a0d8-2ef0b82352cb" containerName="proxy-httpd" containerID="cri-o://b2ec0bdc64647f0f25b6b8ce3a675ed50d759193297e4d2a2fde6255b767495b" gracePeriod=30 Dec 16 07:18:59 crc kubenswrapper[4823]: I1216 07:18:59.808779 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="22c10a9c-6dba-4d35-a0d8-2ef0b82352cb" containerName="sg-core" containerID="cri-o://613f620bca924e060d3999a84035724c6efbc43f0b4c818377f5ccf218c6557f" gracePeriod=30 Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.077730 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 16 07:19:00 crc kubenswrapper[4823]: E1216 07:19:00.078470 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59c74f3a-8b4c-47eb-8d8d-af32e667d121" containerName="barbican-db-sync" Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.078496 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="59c74f3a-8b4c-47eb-8d8d-af32e667d121" containerName="barbican-db-sync" Dec 16 07:19:00 crc kubenswrapper[4823]: E1216 07:19:00.078513 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcd5e697-1360-4376-8160-ba0bc7fa56f8" containerName="cinder-db-sync" Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.078523 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcd5e697-1360-4376-8160-ba0bc7fa56f8" containerName="cinder-db-sync" Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.078742 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="fcd5e697-1360-4376-8160-ba0bc7fa56f8" containerName="cinder-db-sync" Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.078807 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="59c74f3a-8b4c-47eb-8d8d-af32e667d121" containerName="barbican-db-sync" Dec 16 07:19:00 crc 
kubenswrapper[4823]: I1216 07:19:00.079936 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.084156 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.084235 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.084164 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-jg2mt" Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.087189 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.097343 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.177837 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1e69e0bb-4482-4f95-b26e-d129784035d0-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"1e69e0bb-4482-4f95-b26e-d129784035d0\") " pod="openstack/cinder-scheduler-0" Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.187013 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5pcj\" (UniqueName: \"kubernetes.io/projected/1e69e0bb-4482-4f95-b26e-d129784035d0-kube-api-access-n5pcj\") pod \"cinder-scheduler-0\" (UID: \"1e69e0bb-4482-4f95-b26e-d129784035d0\") " pod="openstack/cinder-scheduler-0" Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.187236 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/1e69e0bb-4482-4f95-b26e-d129784035d0-config-data\") pod \"cinder-scheduler-0\" (UID: \"1e69e0bb-4482-4f95-b26e-d129784035d0\") " pod="openstack/cinder-scheduler-0" Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.187412 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e69e0bb-4482-4f95-b26e-d129784035d0-scripts\") pod \"cinder-scheduler-0\" (UID: \"1e69e0bb-4482-4f95-b26e-d129784035d0\") " pod="openstack/cinder-scheduler-0" Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.187462 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1e69e0bb-4482-4f95-b26e-d129784035d0-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"1e69e0bb-4482-4f95-b26e-d129784035d0\") " pod="openstack/cinder-scheduler-0" Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.187500 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e69e0bb-4482-4f95-b26e-d129784035d0-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"1e69e0bb-4482-4f95-b26e-d129784035d0\") " pod="openstack/cinder-scheduler-0" Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.198854 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6f6d6ddd89-h4vj9"] Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.200889 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6f6d6ddd89-h4vj9" Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.220258 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6f6d6ddd89-h4vj9"] Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.289763 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e69e0bb-4482-4f95-b26e-d129784035d0-scripts\") pod \"cinder-scheduler-0\" (UID: \"1e69e0bb-4482-4f95-b26e-d129784035d0\") " pod="openstack/cinder-scheduler-0" Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.289814 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1e69e0bb-4482-4f95-b26e-d129784035d0-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"1e69e0bb-4482-4f95-b26e-d129784035d0\") " pod="openstack/cinder-scheduler-0" Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.289842 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xl8r8\" (UniqueName: \"kubernetes.io/projected/bd08d68a-e304-44ba-a7bc-2545ba318c5f-kube-api-access-xl8r8\") pod \"dnsmasq-dns-6f6d6ddd89-h4vj9\" (UID: \"bd08d68a-e304-44ba-a7bc-2545ba318c5f\") " pod="openstack/dnsmasq-dns-6f6d6ddd89-h4vj9" Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.289865 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e69e0bb-4482-4f95-b26e-d129784035d0-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"1e69e0bb-4482-4f95-b26e-d129784035d0\") " pod="openstack/cinder-scheduler-0" Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.289905 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/bd08d68a-e304-44ba-a7bc-2545ba318c5f-dns-swift-storage-0\") pod \"dnsmasq-dns-6f6d6ddd89-h4vj9\" (UID: \"bd08d68a-e304-44ba-a7bc-2545ba318c5f\") " pod="openstack/dnsmasq-dns-6f6d6ddd89-h4vj9" Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.289933 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1e69e0bb-4482-4f95-b26e-d129784035d0-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"1e69e0bb-4482-4f95-b26e-d129784035d0\") " pod="openstack/cinder-scheduler-0" Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.289954 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5pcj\" (UniqueName: \"kubernetes.io/projected/1e69e0bb-4482-4f95-b26e-d129784035d0-kube-api-access-n5pcj\") pod \"cinder-scheduler-0\" (UID: \"1e69e0bb-4482-4f95-b26e-d129784035d0\") " pod="openstack/cinder-scheduler-0" Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.289982 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bd08d68a-e304-44ba-a7bc-2545ba318c5f-ovsdbserver-sb\") pod \"dnsmasq-dns-6f6d6ddd89-h4vj9\" (UID: \"bd08d68a-e304-44ba-a7bc-2545ba318c5f\") " pod="openstack/dnsmasq-dns-6f6d6ddd89-h4vj9" Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.290016 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bd08d68a-e304-44ba-a7bc-2545ba318c5f-dns-svc\") pod \"dnsmasq-dns-6f6d6ddd89-h4vj9\" (UID: \"bd08d68a-e304-44ba-a7bc-2545ba318c5f\") " pod="openstack/dnsmasq-dns-6f6d6ddd89-h4vj9" Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.290055 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/bd08d68a-e304-44ba-a7bc-2545ba318c5f-config\") pod \"dnsmasq-dns-6f6d6ddd89-h4vj9\" (UID: \"bd08d68a-e304-44ba-a7bc-2545ba318c5f\") " pod="openstack/dnsmasq-dns-6f6d6ddd89-h4vj9" Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.290075 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bd08d68a-e304-44ba-a7bc-2545ba318c5f-ovsdbserver-nb\") pod \"dnsmasq-dns-6f6d6ddd89-h4vj9\" (UID: \"bd08d68a-e304-44ba-a7bc-2545ba318c5f\") " pod="openstack/dnsmasq-dns-6f6d6ddd89-h4vj9" Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.290092 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e69e0bb-4482-4f95-b26e-d129784035d0-config-data\") pod \"cinder-scheduler-0\" (UID: \"1e69e0bb-4482-4f95-b26e-d129784035d0\") " pod="openstack/cinder-scheduler-0" Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.293149 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1e69e0bb-4482-4f95-b26e-d129784035d0-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"1e69e0bb-4482-4f95-b26e-d129784035d0\") " pod="openstack/cinder-scheduler-0" Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.297264 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e69e0bb-4482-4f95-b26e-d129784035d0-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"1e69e0bb-4482-4f95-b26e-d129784035d0\") " pod="openstack/cinder-scheduler-0" Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.297584 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e69e0bb-4482-4f95-b26e-d129784035d0-config-data\") pod \"cinder-scheduler-0\" (UID: 
\"1e69e0bb-4482-4f95-b26e-d129784035d0\") " pod="openstack/cinder-scheduler-0" Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.298106 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1e69e0bb-4482-4f95-b26e-d129784035d0-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"1e69e0bb-4482-4f95-b26e-d129784035d0\") " pod="openstack/cinder-scheduler-0" Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.306403 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e69e0bb-4482-4f95-b26e-d129784035d0-scripts\") pod \"cinder-scheduler-0\" (UID: \"1e69e0bb-4482-4f95-b26e-d129784035d0\") " pod="openstack/cinder-scheduler-0" Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.338899 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.339780 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5pcj\" (UniqueName: \"kubernetes.io/projected/1e69e0bb-4482-4f95-b26e-d129784035d0-kube-api-access-n5pcj\") pod \"cinder-scheduler-0\" (UID: \"1e69e0bb-4482-4f95-b26e-d129784035d0\") " pod="openstack/cinder-scheduler-0" Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.341141 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.344521 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.382577 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.391824 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xl8r8\" (UniqueName: \"kubernetes.io/projected/bd08d68a-e304-44ba-a7bc-2545ba318c5f-kube-api-access-xl8r8\") pod \"dnsmasq-dns-6f6d6ddd89-h4vj9\" (UID: \"bd08d68a-e304-44ba-a7bc-2545ba318c5f\") " pod="openstack/dnsmasq-dns-6f6d6ddd89-h4vj9" Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.391891 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bd08d68a-e304-44ba-a7bc-2545ba318c5f-dns-swift-storage-0\") pod \"dnsmasq-dns-6f6d6ddd89-h4vj9\" (UID: \"bd08d68a-e304-44ba-a7bc-2545ba318c5f\") " pod="openstack/dnsmasq-dns-6f6d6ddd89-h4vj9" Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.391932 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bd08d68a-e304-44ba-a7bc-2545ba318c5f-ovsdbserver-sb\") pod \"dnsmasq-dns-6f6d6ddd89-h4vj9\" (UID: \"bd08d68a-e304-44ba-a7bc-2545ba318c5f\") " pod="openstack/dnsmasq-dns-6f6d6ddd89-h4vj9" Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.391963 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bd08d68a-e304-44ba-a7bc-2545ba318c5f-dns-svc\") pod \"dnsmasq-dns-6f6d6ddd89-h4vj9\" (UID: \"bd08d68a-e304-44ba-a7bc-2545ba318c5f\") " pod="openstack/dnsmasq-dns-6f6d6ddd89-h4vj9" Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.391992 
4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd08d68a-e304-44ba-a7bc-2545ba318c5f-config\") pod \"dnsmasq-dns-6f6d6ddd89-h4vj9\" (UID: \"bd08d68a-e304-44ba-a7bc-2545ba318c5f\") " pod="openstack/dnsmasq-dns-6f6d6ddd89-h4vj9" Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.392011 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bd08d68a-e304-44ba-a7bc-2545ba318c5f-ovsdbserver-nb\") pod \"dnsmasq-dns-6f6d6ddd89-h4vj9\" (UID: \"bd08d68a-e304-44ba-a7bc-2545ba318c5f\") " pod="openstack/dnsmasq-dns-6f6d6ddd89-h4vj9" Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.392965 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bd08d68a-e304-44ba-a7bc-2545ba318c5f-ovsdbserver-nb\") pod \"dnsmasq-dns-6f6d6ddd89-h4vj9\" (UID: \"bd08d68a-e304-44ba-a7bc-2545ba318c5f\") " pod="openstack/dnsmasq-dns-6f6d6ddd89-h4vj9" Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.393763 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bd08d68a-e304-44ba-a7bc-2545ba318c5f-dns-swift-storage-0\") pod \"dnsmasq-dns-6f6d6ddd89-h4vj9\" (UID: \"bd08d68a-e304-44ba-a7bc-2545ba318c5f\") " pod="openstack/dnsmasq-dns-6f6d6ddd89-h4vj9" Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.394395 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bd08d68a-e304-44ba-a7bc-2545ba318c5f-ovsdbserver-sb\") pod \"dnsmasq-dns-6f6d6ddd89-h4vj9\" (UID: \"bd08d68a-e304-44ba-a7bc-2545ba318c5f\") " pod="openstack/dnsmasq-dns-6f6d6ddd89-h4vj9" Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.395884 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" 
(UniqueName: \"kubernetes.io/configmap/bd08d68a-e304-44ba-a7bc-2545ba318c5f-dns-svc\") pod \"dnsmasq-dns-6f6d6ddd89-h4vj9\" (UID: \"bd08d68a-e304-44ba-a7bc-2545ba318c5f\") " pod="openstack/dnsmasq-dns-6f6d6ddd89-h4vj9" Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.396131 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd08d68a-e304-44ba-a7bc-2545ba318c5f-config\") pod \"dnsmasq-dns-6f6d6ddd89-h4vj9\" (UID: \"bd08d68a-e304-44ba-a7bc-2545ba318c5f\") " pod="openstack/dnsmasq-dns-6f6d6ddd89-h4vj9" Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.451867 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xl8r8\" (UniqueName: \"kubernetes.io/projected/bd08d68a-e304-44ba-a7bc-2545ba318c5f-kube-api-access-xl8r8\") pod \"dnsmasq-dns-6f6d6ddd89-h4vj9\" (UID: \"bd08d68a-e304-44ba-a7bc-2545ba318c5f\") " pod="openstack/dnsmasq-dns-6f6d6ddd89-h4vj9" Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.463655 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.490096 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-99f9cf477-cj5ss"] Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.491981 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-99f9cf477-cj5ss" Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.493579 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gvfr\" (UniqueName: \"kubernetes.io/projected/c8dc4eee-95c9-4460-84d3-fc8c18a8ab1a-kube-api-access-6gvfr\") pod \"cinder-api-0\" (UID: \"c8dc4eee-95c9-4460-84d3-fc8c18a8ab1a\") " pod="openstack/cinder-api-0" Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.493621 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8dc4eee-95c9-4460-84d3-fc8c18a8ab1a-scripts\") pod \"cinder-api-0\" (UID: \"c8dc4eee-95c9-4460-84d3-fc8c18a8ab1a\") " pod="openstack/cinder-api-0" Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.493646 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8dc4eee-95c9-4460-84d3-fc8c18a8ab1a-config-data\") pod \"cinder-api-0\" (UID: \"c8dc4eee-95c9-4460-84d3-fc8c18a8ab1a\") " pod="openstack/cinder-api-0" Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.493680 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c8dc4eee-95c9-4460-84d3-fc8c18a8ab1a-etc-machine-id\") pod \"cinder-api-0\" (UID: \"c8dc4eee-95c9-4460-84d3-fc8c18a8ab1a\") " pod="openstack/cinder-api-0" Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.493719 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c8dc4eee-95c9-4460-84d3-fc8c18a8ab1a-logs\") pod \"cinder-api-0\" (UID: \"c8dc4eee-95c9-4460-84d3-fc8c18a8ab1a\") " pod="openstack/cinder-api-0" Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.493911 4823 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8dc4eee-95c9-4460-84d3-fc8c18a8ab1a-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"c8dc4eee-95c9-4460-84d3-fc8c18a8ab1a\") " pod="openstack/cinder-api-0" Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.494009 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c8dc4eee-95c9-4460-84d3-fc8c18a8ab1a-config-data-custom\") pod \"cinder-api-0\" (UID: \"c8dc4eee-95c9-4460-84d3-fc8c18a8ab1a\") " pod="openstack/cinder-api-0" Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.500967 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.501283 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-6gt6g" Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.502162 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.526558 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-749d6ff74-w7lnp"] Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.531580 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-749d6ff74-w7lnp" Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.537122 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6f6d6ddd89-h4vj9" Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.543085 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-99f9cf477-cj5ss"] Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.556682 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.569562 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-749d6ff74-w7lnp"] Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.600782 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8dc4eee-95c9-4460-84d3-fc8c18a8ab1a-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"c8dc4eee-95c9-4460-84d3-fc8c18a8ab1a\") " pod="openstack/cinder-api-0" Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.600877 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c8dc4eee-95c9-4460-84d3-fc8c18a8ab1a-config-data-custom\") pod \"cinder-api-0\" (UID: \"c8dc4eee-95c9-4460-84d3-fc8c18a8ab1a\") " pod="openstack/cinder-api-0" Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.600950 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b1cd2ecc-0f95-47b9-b43e-5c36fa6a33b6-config-data-custom\") pod \"barbican-keystone-listener-749d6ff74-w7lnp\" (UID: \"b1cd2ecc-0f95-47b9-b43e-5c36fa6a33b6\") " pod="openstack/barbican-keystone-listener-749d6ff74-w7lnp" Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.600978 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b1cd2ecc-0f95-47b9-b43e-5c36fa6a33b6-combined-ca-bundle\") pod \"barbican-keystone-listener-749d6ff74-w7lnp\" (UID: \"b1cd2ecc-0f95-47b9-b43e-5c36fa6a33b6\") " pod="openstack/barbican-keystone-listener-749d6ff74-w7lnp" Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.601067 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1cd2ecc-0f95-47b9-b43e-5c36fa6a33b6-config-data\") pod \"barbican-keystone-listener-749d6ff74-w7lnp\" (UID: \"b1cd2ecc-0f95-47b9-b43e-5c36fa6a33b6\") " pod="openstack/barbican-keystone-listener-749d6ff74-w7lnp" Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.601131 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b1cd2ecc-0f95-47b9-b43e-5c36fa6a33b6-logs\") pod \"barbican-keystone-listener-749d6ff74-w7lnp\" (UID: \"b1cd2ecc-0f95-47b9-b43e-5c36fa6a33b6\") " pod="openstack/barbican-keystone-listener-749d6ff74-w7lnp" Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.601172 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22db0f3f-88b5-4909-aa80-f4b020d1ce18-config-data\") pod \"barbican-worker-99f9cf477-cj5ss\" (UID: \"22db0f3f-88b5-4909-aa80-f4b020d1ce18\") " pod="openstack/barbican-worker-99f9cf477-cj5ss" Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.601210 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6gvfr\" (UniqueName: \"kubernetes.io/projected/c8dc4eee-95c9-4460-84d3-fc8c18a8ab1a-kube-api-access-6gvfr\") pod \"cinder-api-0\" (UID: \"c8dc4eee-95c9-4460-84d3-fc8c18a8ab1a\") " pod="openstack/cinder-api-0" Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.601237 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8dc4eee-95c9-4460-84d3-fc8c18a8ab1a-scripts\") pod \"cinder-api-0\" (UID: \"c8dc4eee-95c9-4460-84d3-fc8c18a8ab1a\") " pod="openstack/cinder-api-0" Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.601261 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8dc4eee-95c9-4460-84d3-fc8c18a8ab1a-config-data\") pod \"cinder-api-0\" (UID: \"c8dc4eee-95c9-4460-84d3-fc8c18a8ab1a\") " pod="openstack/cinder-api-0" Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.601285 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7pdt\" (UniqueName: \"kubernetes.io/projected/22db0f3f-88b5-4909-aa80-f4b020d1ce18-kube-api-access-m7pdt\") pod \"barbican-worker-99f9cf477-cj5ss\" (UID: \"22db0f3f-88b5-4909-aa80-f4b020d1ce18\") " pod="openstack/barbican-worker-99f9cf477-cj5ss" Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.601316 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sg59w\" (UniqueName: \"kubernetes.io/projected/b1cd2ecc-0f95-47b9-b43e-5c36fa6a33b6-kube-api-access-sg59w\") pod \"barbican-keystone-listener-749d6ff74-w7lnp\" (UID: \"b1cd2ecc-0f95-47b9-b43e-5c36fa6a33b6\") " pod="openstack/barbican-keystone-listener-749d6ff74-w7lnp" Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.601343 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22db0f3f-88b5-4909-aa80-f4b020d1ce18-combined-ca-bundle\") pod \"barbican-worker-99f9cf477-cj5ss\" (UID: \"22db0f3f-88b5-4909-aa80-f4b020d1ce18\") " pod="openstack/barbican-worker-99f9cf477-cj5ss" Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.601383 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c8dc4eee-95c9-4460-84d3-fc8c18a8ab1a-etc-machine-id\") pod \"cinder-api-0\" (UID: \"c8dc4eee-95c9-4460-84d3-fc8c18a8ab1a\") " pod="openstack/cinder-api-0" Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.601422 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/22db0f3f-88b5-4909-aa80-f4b020d1ce18-config-data-custom\") pod \"barbican-worker-99f9cf477-cj5ss\" (UID: \"22db0f3f-88b5-4909-aa80-f4b020d1ce18\") " pod="openstack/barbican-worker-99f9cf477-cj5ss" Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.601461 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c8dc4eee-95c9-4460-84d3-fc8c18a8ab1a-logs\") pod \"cinder-api-0\" (UID: \"c8dc4eee-95c9-4460-84d3-fc8c18a8ab1a\") " pod="openstack/cinder-api-0" Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.601525 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/22db0f3f-88b5-4909-aa80-f4b020d1ce18-logs\") pod \"barbican-worker-99f9cf477-cj5ss\" (UID: \"22db0f3f-88b5-4909-aa80-f4b020d1ce18\") " pod="openstack/barbican-worker-99f9cf477-cj5ss" Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.608149 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c8dc4eee-95c9-4460-84d3-fc8c18a8ab1a-etc-machine-id\") pod \"cinder-api-0\" (UID: \"c8dc4eee-95c9-4460-84d3-fc8c18a8ab1a\") " pod="openstack/cinder-api-0" Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.608587 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c8dc4eee-95c9-4460-84d3-fc8c18a8ab1a-logs\") pod \"cinder-api-0\" (UID: 
\"c8dc4eee-95c9-4460-84d3-fc8c18a8ab1a\") " pod="openstack/cinder-api-0" Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.615851 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8dc4eee-95c9-4460-84d3-fc8c18a8ab1a-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"c8dc4eee-95c9-4460-84d3-fc8c18a8ab1a\") " pod="openstack/cinder-api-0" Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.626166 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c8dc4eee-95c9-4460-84d3-fc8c18a8ab1a-config-data-custom\") pod \"cinder-api-0\" (UID: \"c8dc4eee-95c9-4460-84d3-fc8c18a8ab1a\") " pod="openstack/cinder-api-0" Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.629299 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8dc4eee-95c9-4460-84d3-fc8c18a8ab1a-config-data\") pod \"cinder-api-0\" (UID: \"c8dc4eee-95c9-4460-84d3-fc8c18a8ab1a\") " pod="openstack/cinder-api-0" Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.663756 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8dc4eee-95c9-4460-84d3-fc8c18a8ab1a-scripts\") pod \"cinder-api-0\" (UID: \"c8dc4eee-95c9-4460-84d3-fc8c18a8ab1a\") " pod="openstack/cinder-api-0" Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.696280 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gvfr\" (UniqueName: \"kubernetes.io/projected/c8dc4eee-95c9-4460-84d3-fc8c18a8ab1a-kube-api-access-6gvfr\") pod \"cinder-api-0\" (UID: \"c8dc4eee-95c9-4460-84d3-fc8c18a8ab1a\") " pod="openstack/cinder-api-0" Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.703613 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/b1cd2ecc-0f95-47b9-b43e-5c36fa6a33b6-config-data-custom\") pod \"barbican-keystone-listener-749d6ff74-w7lnp\" (UID: \"b1cd2ecc-0f95-47b9-b43e-5c36fa6a33b6\") " pod="openstack/barbican-keystone-listener-749d6ff74-w7lnp" Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.703690 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1cd2ecc-0f95-47b9-b43e-5c36fa6a33b6-combined-ca-bundle\") pod \"barbican-keystone-listener-749d6ff74-w7lnp\" (UID: \"b1cd2ecc-0f95-47b9-b43e-5c36fa6a33b6\") " pod="openstack/barbican-keystone-listener-749d6ff74-w7lnp" Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.703756 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1cd2ecc-0f95-47b9-b43e-5c36fa6a33b6-config-data\") pod \"barbican-keystone-listener-749d6ff74-w7lnp\" (UID: \"b1cd2ecc-0f95-47b9-b43e-5c36fa6a33b6\") " pod="openstack/barbican-keystone-listener-749d6ff74-w7lnp" Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.703833 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b1cd2ecc-0f95-47b9-b43e-5c36fa6a33b6-logs\") pod \"barbican-keystone-listener-749d6ff74-w7lnp\" (UID: \"b1cd2ecc-0f95-47b9-b43e-5c36fa6a33b6\") " pod="openstack/barbican-keystone-listener-749d6ff74-w7lnp" Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.703872 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22db0f3f-88b5-4909-aa80-f4b020d1ce18-config-data\") pod \"barbican-worker-99f9cf477-cj5ss\" (UID: \"22db0f3f-88b5-4909-aa80-f4b020d1ce18\") " pod="openstack/barbican-worker-99f9cf477-cj5ss" Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.703937 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-m7pdt\" (UniqueName: \"kubernetes.io/projected/22db0f3f-88b5-4909-aa80-f4b020d1ce18-kube-api-access-m7pdt\") pod \"barbican-worker-99f9cf477-cj5ss\" (UID: \"22db0f3f-88b5-4909-aa80-f4b020d1ce18\") " pod="openstack/barbican-worker-99f9cf477-cj5ss" Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.703983 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sg59w\" (UniqueName: \"kubernetes.io/projected/b1cd2ecc-0f95-47b9-b43e-5c36fa6a33b6-kube-api-access-sg59w\") pod \"barbican-keystone-listener-749d6ff74-w7lnp\" (UID: \"b1cd2ecc-0f95-47b9-b43e-5c36fa6a33b6\") " pod="openstack/barbican-keystone-listener-749d6ff74-w7lnp" Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.704008 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22db0f3f-88b5-4909-aa80-f4b020d1ce18-combined-ca-bundle\") pod \"barbican-worker-99f9cf477-cj5ss\" (UID: \"22db0f3f-88b5-4909-aa80-f4b020d1ce18\") " pod="openstack/barbican-worker-99f9cf477-cj5ss" Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.704070 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/22db0f3f-88b5-4909-aa80-f4b020d1ce18-config-data-custom\") pod \"barbican-worker-99f9cf477-cj5ss\" (UID: \"22db0f3f-88b5-4909-aa80-f4b020d1ce18\") " pod="openstack/barbican-worker-99f9cf477-cj5ss" Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.704159 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/22db0f3f-88b5-4909-aa80-f4b020d1ce18-logs\") pod \"barbican-worker-99f9cf477-cj5ss\" (UID: \"22db0f3f-88b5-4909-aa80-f4b020d1ce18\") " pod="openstack/barbican-worker-99f9cf477-cj5ss" Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.704965 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/22db0f3f-88b5-4909-aa80-f4b020d1ce18-logs\") pod \"barbican-worker-99f9cf477-cj5ss\" (UID: \"22db0f3f-88b5-4909-aa80-f4b020d1ce18\") " pod="openstack/barbican-worker-99f9cf477-cj5ss" Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.708483 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b1cd2ecc-0f95-47b9-b43e-5c36fa6a33b6-logs\") pod \"barbican-keystone-listener-749d6ff74-w7lnp\" (UID: \"b1cd2ecc-0f95-47b9-b43e-5c36fa6a33b6\") " pod="openstack/barbican-keystone-listener-749d6ff74-w7lnp" Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.715637 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.734634 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22db0f3f-88b5-4909-aa80-f4b020d1ce18-config-data\") pod \"barbican-worker-99f9cf477-cj5ss\" (UID: \"22db0f3f-88b5-4909-aa80-f4b020d1ce18\") " pod="openstack/barbican-worker-99f9cf477-cj5ss" Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.739729 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/22db0f3f-88b5-4909-aa80-f4b020d1ce18-config-data-custom\") pod \"barbican-worker-99f9cf477-cj5ss\" (UID: \"22db0f3f-88b5-4909-aa80-f4b020d1ce18\") " pod="openstack/barbican-worker-99f9cf477-cj5ss" Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.740424 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b1cd2ecc-0f95-47b9-b43e-5c36fa6a33b6-config-data-custom\") pod \"barbican-keystone-listener-749d6ff74-w7lnp\" (UID: \"b1cd2ecc-0f95-47b9-b43e-5c36fa6a33b6\") " pod="openstack/barbican-keystone-listener-749d6ff74-w7lnp" Dec 16 07:19:00 crc 
kubenswrapper[4823]: I1216 07:19:00.740865 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1cd2ecc-0f95-47b9-b43e-5c36fa6a33b6-combined-ca-bundle\") pod \"barbican-keystone-listener-749d6ff74-w7lnp\" (UID: \"b1cd2ecc-0f95-47b9-b43e-5c36fa6a33b6\") " pod="openstack/barbican-keystone-listener-749d6ff74-w7lnp" Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.744986 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sg59w\" (UniqueName: \"kubernetes.io/projected/b1cd2ecc-0f95-47b9-b43e-5c36fa6a33b6-kube-api-access-sg59w\") pod \"barbican-keystone-listener-749d6ff74-w7lnp\" (UID: \"b1cd2ecc-0f95-47b9-b43e-5c36fa6a33b6\") " pod="openstack/barbican-keystone-listener-749d6ff74-w7lnp" Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.750331 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6f6d6ddd89-h4vj9"] Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.753988 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22db0f3f-88b5-4909-aa80-f4b020d1ce18-combined-ca-bundle\") pod \"barbican-worker-99f9cf477-cj5ss\" (UID: \"22db0f3f-88b5-4909-aa80-f4b020d1ce18\") " pod="openstack/barbican-worker-99f9cf477-cj5ss" Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.766310 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1cd2ecc-0f95-47b9-b43e-5c36fa6a33b6-config-data\") pod \"barbican-keystone-listener-749d6ff74-w7lnp\" (UID: \"b1cd2ecc-0f95-47b9-b43e-5c36fa6a33b6\") " pod="openstack/barbican-keystone-listener-749d6ff74-w7lnp" Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.792847 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7pdt\" (UniqueName: 
\"kubernetes.io/projected/22db0f3f-88b5-4909-aa80-f4b020d1ce18-kube-api-access-m7pdt\") pod \"barbican-worker-99f9cf477-cj5ss\" (UID: \"22db0f3f-88b5-4909-aa80-f4b020d1ce18\") " pod="openstack/barbican-worker-99f9cf477-cj5ss" Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.909136 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-75dbb546bf-zxsbt"] Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.910575 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75dbb546bf-zxsbt" Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.912700 4823 generic.go:334] "Generic (PLEG): container finished" podID="22c10a9c-6dba-4d35-a0d8-2ef0b82352cb" containerID="b2ec0bdc64647f0f25b6b8ce3a675ed50d759193297e4d2a2fde6255b767495b" exitCode=0 Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.912742 4823 generic.go:334] "Generic (PLEG): container finished" podID="22c10a9c-6dba-4d35-a0d8-2ef0b82352cb" containerID="613f620bca924e060d3999a84035724c6efbc43f0b4c818377f5ccf218c6557f" exitCode=2 Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.912763 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"22c10a9c-6dba-4d35-a0d8-2ef0b82352cb","Type":"ContainerDied","Data":"b2ec0bdc64647f0f25b6b8ce3a675ed50d759193297e4d2a2fde6255b767495b"} Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.912791 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"22c10a9c-6dba-4d35-a0d8-2ef0b82352cb","Type":"ContainerDied","Data":"613f620bca924e060d3999a84035724c6efbc43f0b4c818377f5ccf218c6557f"} Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.941689 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-99f9cf477-cj5ss" Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.942839 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75dbb546bf-zxsbt"] Dec 16 07:19:00 crc kubenswrapper[4823]: I1216 07:19:00.984614 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-749d6ff74-w7lnp" Dec 16 07:19:01 crc kubenswrapper[4823]: I1216 07:19:01.003151 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-58d7958684-mfsdc"] Dec 16 07:19:01 crc kubenswrapper[4823]: I1216 07:19:01.004614 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-58d7958684-mfsdc" Dec 16 07:19:01 crc kubenswrapper[4823]: I1216 07:19:01.019428 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Dec 16 07:19:01 crc kubenswrapper[4823]: I1216 07:19:01.060477 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/52fb8160-a1e0-4b7e-a3ce-bd018dc8c512-config\") pod \"dnsmasq-dns-75dbb546bf-zxsbt\" (UID: \"52fb8160-a1e0-4b7e-a3ce-bd018dc8c512\") " pod="openstack/dnsmasq-dns-75dbb546bf-zxsbt" Dec 16 07:19:01 crc kubenswrapper[4823]: I1216 07:19:01.060512 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/52fb8160-a1e0-4b7e-a3ce-bd018dc8c512-dns-svc\") pod \"dnsmasq-dns-75dbb546bf-zxsbt\" (UID: \"52fb8160-a1e0-4b7e-a3ce-bd018dc8c512\") " pod="openstack/dnsmasq-dns-75dbb546bf-zxsbt" Dec 16 07:19:01 crc kubenswrapper[4823]: I1216 07:19:01.060585 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/52fb8160-a1e0-4b7e-a3ce-bd018dc8c512-dns-swift-storage-0\") pod \"dnsmasq-dns-75dbb546bf-zxsbt\" (UID: \"52fb8160-a1e0-4b7e-a3ce-bd018dc8c512\") " pod="openstack/dnsmasq-dns-75dbb546bf-zxsbt" Dec 16 07:19:01 crc kubenswrapper[4823]: I1216 07:19:01.060611 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8tlt\" (UniqueName: \"kubernetes.io/projected/52fb8160-a1e0-4b7e-a3ce-bd018dc8c512-kube-api-access-q8tlt\") pod \"dnsmasq-dns-75dbb546bf-zxsbt\" (UID: \"52fb8160-a1e0-4b7e-a3ce-bd018dc8c512\") " pod="openstack/dnsmasq-dns-75dbb546bf-zxsbt" Dec 16 07:19:01 crc kubenswrapper[4823]: I1216 07:19:01.060676 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/52fb8160-a1e0-4b7e-a3ce-bd018dc8c512-ovsdbserver-nb\") pod \"dnsmasq-dns-75dbb546bf-zxsbt\" (UID: \"52fb8160-a1e0-4b7e-a3ce-bd018dc8c512\") " pod="openstack/dnsmasq-dns-75dbb546bf-zxsbt" Dec 16 07:19:01 crc kubenswrapper[4823]: I1216 07:19:01.060696 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/52fb8160-a1e0-4b7e-a3ce-bd018dc8c512-ovsdbserver-sb\") pod \"dnsmasq-dns-75dbb546bf-zxsbt\" (UID: \"52fb8160-a1e0-4b7e-a3ce-bd018dc8c512\") " pod="openstack/dnsmasq-dns-75dbb546bf-zxsbt" Dec 16 07:19:01 crc kubenswrapper[4823]: I1216 07:19:01.074569 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-58d7958684-mfsdc"] Dec 16 07:19:01 crc kubenswrapper[4823]: I1216 07:19:01.162577 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/52fb8160-a1e0-4b7e-a3ce-bd018dc8c512-config\") pod \"dnsmasq-dns-75dbb546bf-zxsbt\" (UID: \"52fb8160-a1e0-4b7e-a3ce-bd018dc8c512\") " pod="openstack/dnsmasq-dns-75dbb546bf-zxsbt" Dec 
16 07:19:01 crc kubenswrapper[4823]: I1216 07:19:01.162617 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/52fb8160-a1e0-4b7e-a3ce-bd018dc8c512-dns-svc\") pod \"dnsmasq-dns-75dbb546bf-zxsbt\" (UID: \"52fb8160-a1e0-4b7e-a3ce-bd018dc8c512\") " pod="openstack/dnsmasq-dns-75dbb546bf-zxsbt" Dec 16 07:19:01 crc kubenswrapper[4823]: I1216 07:19:01.162660 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/472078f0-72f2-4e3e-a626-8a98e3329fe6-config-data\") pod \"barbican-api-58d7958684-mfsdc\" (UID: \"472078f0-72f2-4e3e-a626-8a98e3329fe6\") " pod="openstack/barbican-api-58d7958684-mfsdc" Dec 16 07:19:01 crc kubenswrapper[4823]: I1216 07:19:01.162750 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/52fb8160-a1e0-4b7e-a3ce-bd018dc8c512-dns-swift-storage-0\") pod \"dnsmasq-dns-75dbb546bf-zxsbt\" (UID: \"52fb8160-a1e0-4b7e-a3ce-bd018dc8c512\") " pod="openstack/dnsmasq-dns-75dbb546bf-zxsbt" Dec 16 07:19:01 crc kubenswrapper[4823]: I1216 07:19:01.162775 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8tlt\" (UniqueName: \"kubernetes.io/projected/52fb8160-a1e0-4b7e-a3ce-bd018dc8c512-kube-api-access-q8tlt\") pod \"dnsmasq-dns-75dbb546bf-zxsbt\" (UID: \"52fb8160-a1e0-4b7e-a3ce-bd018dc8c512\") " pod="openstack/dnsmasq-dns-75dbb546bf-zxsbt" Dec 16 07:19:01 crc kubenswrapper[4823]: I1216 07:19:01.162800 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/472078f0-72f2-4e3e-a626-8a98e3329fe6-combined-ca-bundle\") pod \"barbican-api-58d7958684-mfsdc\" (UID: \"472078f0-72f2-4e3e-a626-8a98e3329fe6\") " pod="openstack/barbican-api-58d7958684-mfsdc" Dec 16 
07:19:01 crc kubenswrapper[4823]: I1216 07:19:01.162851 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/472078f0-72f2-4e3e-a626-8a98e3329fe6-logs\") pod \"barbican-api-58d7958684-mfsdc\" (UID: \"472078f0-72f2-4e3e-a626-8a98e3329fe6\") " pod="openstack/barbican-api-58d7958684-mfsdc" Dec 16 07:19:01 crc kubenswrapper[4823]: I1216 07:19:01.162907 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/52fb8160-a1e0-4b7e-a3ce-bd018dc8c512-ovsdbserver-nb\") pod \"dnsmasq-dns-75dbb546bf-zxsbt\" (UID: \"52fb8160-a1e0-4b7e-a3ce-bd018dc8c512\") " pod="openstack/dnsmasq-dns-75dbb546bf-zxsbt" Dec 16 07:19:01 crc kubenswrapper[4823]: I1216 07:19:01.162923 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2txrv\" (UniqueName: \"kubernetes.io/projected/472078f0-72f2-4e3e-a626-8a98e3329fe6-kube-api-access-2txrv\") pod \"barbican-api-58d7958684-mfsdc\" (UID: \"472078f0-72f2-4e3e-a626-8a98e3329fe6\") " pod="openstack/barbican-api-58d7958684-mfsdc" Dec 16 07:19:01 crc kubenswrapper[4823]: I1216 07:19:01.162944 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/52fb8160-a1e0-4b7e-a3ce-bd018dc8c512-ovsdbserver-sb\") pod \"dnsmasq-dns-75dbb546bf-zxsbt\" (UID: \"52fb8160-a1e0-4b7e-a3ce-bd018dc8c512\") " pod="openstack/dnsmasq-dns-75dbb546bf-zxsbt" Dec 16 07:19:01 crc kubenswrapper[4823]: I1216 07:19:01.162989 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/472078f0-72f2-4e3e-a626-8a98e3329fe6-config-data-custom\") pod \"barbican-api-58d7958684-mfsdc\" (UID: \"472078f0-72f2-4e3e-a626-8a98e3329fe6\") " pod="openstack/barbican-api-58d7958684-mfsdc" 
Dec 16 07:19:01 crc kubenswrapper[4823]: I1216 07:19:01.164084 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/52fb8160-a1e0-4b7e-a3ce-bd018dc8c512-config\") pod \"dnsmasq-dns-75dbb546bf-zxsbt\" (UID: \"52fb8160-a1e0-4b7e-a3ce-bd018dc8c512\") " pod="openstack/dnsmasq-dns-75dbb546bf-zxsbt" Dec 16 07:19:01 crc kubenswrapper[4823]: I1216 07:19:01.164084 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/52fb8160-a1e0-4b7e-a3ce-bd018dc8c512-dns-swift-storage-0\") pod \"dnsmasq-dns-75dbb546bf-zxsbt\" (UID: \"52fb8160-a1e0-4b7e-a3ce-bd018dc8c512\") " pod="openstack/dnsmasq-dns-75dbb546bf-zxsbt" Dec 16 07:19:01 crc kubenswrapper[4823]: I1216 07:19:01.165843 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/52fb8160-a1e0-4b7e-a3ce-bd018dc8c512-ovsdbserver-sb\") pod \"dnsmasq-dns-75dbb546bf-zxsbt\" (UID: \"52fb8160-a1e0-4b7e-a3ce-bd018dc8c512\") " pod="openstack/dnsmasq-dns-75dbb546bf-zxsbt" Dec 16 07:19:01 crc kubenswrapper[4823]: I1216 07:19:01.173616 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/52fb8160-a1e0-4b7e-a3ce-bd018dc8c512-ovsdbserver-nb\") pod \"dnsmasq-dns-75dbb546bf-zxsbt\" (UID: \"52fb8160-a1e0-4b7e-a3ce-bd018dc8c512\") " pod="openstack/dnsmasq-dns-75dbb546bf-zxsbt" Dec 16 07:19:01 crc kubenswrapper[4823]: I1216 07:19:01.182571 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/52fb8160-a1e0-4b7e-a3ce-bd018dc8c512-dns-svc\") pod \"dnsmasq-dns-75dbb546bf-zxsbt\" (UID: \"52fb8160-a1e0-4b7e-a3ce-bd018dc8c512\") " pod="openstack/dnsmasq-dns-75dbb546bf-zxsbt" Dec 16 07:19:01 crc kubenswrapper[4823]: I1216 07:19:01.202990 4823 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-q8tlt\" (UniqueName: \"kubernetes.io/projected/52fb8160-a1e0-4b7e-a3ce-bd018dc8c512-kube-api-access-q8tlt\") pod \"dnsmasq-dns-75dbb546bf-zxsbt\" (UID: \"52fb8160-a1e0-4b7e-a3ce-bd018dc8c512\") " pod="openstack/dnsmasq-dns-75dbb546bf-zxsbt" Dec 16 07:19:01 crc kubenswrapper[4823]: I1216 07:19:01.268789 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2txrv\" (UniqueName: \"kubernetes.io/projected/472078f0-72f2-4e3e-a626-8a98e3329fe6-kube-api-access-2txrv\") pod \"barbican-api-58d7958684-mfsdc\" (UID: \"472078f0-72f2-4e3e-a626-8a98e3329fe6\") " pod="openstack/barbican-api-58d7958684-mfsdc" Dec 16 07:19:01 crc kubenswrapper[4823]: I1216 07:19:01.269312 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/472078f0-72f2-4e3e-a626-8a98e3329fe6-config-data-custom\") pod \"barbican-api-58d7958684-mfsdc\" (UID: \"472078f0-72f2-4e3e-a626-8a98e3329fe6\") " pod="openstack/barbican-api-58d7958684-mfsdc" Dec 16 07:19:01 crc kubenswrapper[4823]: I1216 07:19:01.269384 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/472078f0-72f2-4e3e-a626-8a98e3329fe6-config-data\") pod \"barbican-api-58d7958684-mfsdc\" (UID: \"472078f0-72f2-4e3e-a626-8a98e3329fe6\") " pod="openstack/barbican-api-58d7958684-mfsdc" Dec 16 07:19:01 crc kubenswrapper[4823]: I1216 07:19:01.270470 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/472078f0-72f2-4e3e-a626-8a98e3329fe6-combined-ca-bundle\") pod \"barbican-api-58d7958684-mfsdc\" (UID: \"472078f0-72f2-4e3e-a626-8a98e3329fe6\") " pod="openstack/barbican-api-58d7958684-mfsdc" Dec 16 07:19:01 crc kubenswrapper[4823]: I1216 07:19:01.270556 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/472078f0-72f2-4e3e-a626-8a98e3329fe6-logs\") pod \"barbican-api-58d7958684-mfsdc\" (UID: \"472078f0-72f2-4e3e-a626-8a98e3329fe6\") " pod="openstack/barbican-api-58d7958684-mfsdc" Dec 16 07:19:01 crc kubenswrapper[4823]: I1216 07:19:01.271076 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/472078f0-72f2-4e3e-a626-8a98e3329fe6-logs\") pod \"barbican-api-58d7958684-mfsdc\" (UID: \"472078f0-72f2-4e3e-a626-8a98e3329fe6\") " pod="openstack/barbican-api-58d7958684-mfsdc" Dec 16 07:19:01 crc kubenswrapper[4823]: I1216 07:19:01.278575 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/472078f0-72f2-4e3e-a626-8a98e3329fe6-config-data-custom\") pod \"barbican-api-58d7958684-mfsdc\" (UID: \"472078f0-72f2-4e3e-a626-8a98e3329fe6\") " pod="openstack/barbican-api-58d7958684-mfsdc" Dec 16 07:19:01 crc kubenswrapper[4823]: I1216 07:19:01.288015 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/472078f0-72f2-4e3e-a626-8a98e3329fe6-config-data\") pod \"barbican-api-58d7958684-mfsdc\" (UID: \"472078f0-72f2-4e3e-a626-8a98e3329fe6\") " pod="openstack/barbican-api-58d7958684-mfsdc" Dec 16 07:19:01 crc kubenswrapper[4823]: I1216 07:19:01.292523 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/472078f0-72f2-4e3e-a626-8a98e3329fe6-combined-ca-bundle\") pod \"barbican-api-58d7958684-mfsdc\" (UID: \"472078f0-72f2-4e3e-a626-8a98e3329fe6\") " pod="openstack/barbican-api-58d7958684-mfsdc" Dec 16 07:19:01 crc kubenswrapper[4823]: I1216 07:19:01.302773 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2txrv\" (UniqueName: 
\"kubernetes.io/projected/472078f0-72f2-4e3e-a626-8a98e3329fe6-kube-api-access-2txrv\") pod \"barbican-api-58d7958684-mfsdc\" (UID: \"472078f0-72f2-4e3e-a626-8a98e3329fe6\") " pod="openstack/barbican-api-58d7958684-mfsdc" Dec 16 07:19:01 crc kubenswrapper[4823]: I1216 07:19:01.332536 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75dbb546bf-zxsbt" Dec 16 07:19:01 crc kubenswrapper[4823]: I1216 07:19:01.379966 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 16 07:19:01 crc kubenswrapper[4823]: W1216 07:19:01.382047 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1e69e0bb_4482_4f95_b26e_d129784035d0.slice/crio-140af84f921b606a0b7c9baa955128fdbd81d52746839b0922c104b816105bd1 WatchSource:0}: Error finding container 140af84f921b606a0b7c9baa955128fdbd81d52746839b0922c104b816105bd1: Status 404 returned error can't find the container with id 140af84f921b606a0b7c9baa955128fdbd81d52746839b0922c104b816105bd1 Dec 16 07:19:01 crc kubenswrapper[4823]: I1216 07:19:01.558526 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-58d7958684-mfsdc" Dec 16 07:19:01 crc kubenswrapper[4823]: I1216 07:19:01.573239 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6f6d6ddd89-h4vj9"] Dec 16 07:19:01 crc kubenswrapper[4823]: I1216 07:19:01.714755 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 16 07:19:01 crc kubenswrapper[4823]: I1216 07:19:01.824845 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-749d6ff74-w7lnp"] Dec 16 07:19:01 crc kubenswrapper[4823]: W1216 07:19:01.837562 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod22db0f3f_88b5_4909_aa80_f4b020d1ce18.slice/crio-04a40d440f394dcb42dbb0ead7abc03d86f229d9020fab91c176e64657413429 WatchSource:0}: Error finding container 04a40d440f394dcb42dbb0ead7abc03d86f229d9020fab91c176e64657413429: Status 404 returned error can't find the container with id 04a40d440f394dcb42dbb0ead7abc03d86f229d9020fab91c176e64657413429 Dec 16 07:19:01 crc kubenswrapper[4823]: I1216 07:19:01.854861 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-99f9cf477-cj5ss"] Dec 16 07:19:01 crc kubenswrapper[4823]: I1216 07:19:01.924794 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-749d6ff74-w7lnp" event={"ID":"b1cd2ecc-0f95-47b9-b43e-5c36fa6a33b6","Type":"ContainerStarted","Data":"65e3b1c7decd0499ef13149a8062101f21080b161212cffa19da508b806df46c"} Dec 16 07:19:01 crc kubenswrapper[4823]: I1216 07:19:01.928052 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-99f9cf477-cj5ss" event={"ID":"22db0f3f-88b5-4909-aa80-f4b020d1ce18","Type":"ContainerStarted","Data":"04a40d440f394dcb42dbb0ead7abc03d86f229d9020fab91c176e64657413429"} Dec 16 07:19:01 crc kubenswrapper[4823]: I1216 07:19:01.930111 4823 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"c8dc4eee-95c9-4460-84d3-fc8c18a8ab1a","Type":"ContainerStarted","Data":"33edd90111724c4a16cf1c1680a380bb7fd0fd42427c17b0fafdeb431db19842"} Dec 16 07:19:01 crc kubenswrapper[4823]: I1216 07:19:01.937621 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f6d6ddd89-h4vj9" event={"ID":"bd08d68a-e304-44ba-a7bc-2545ba318c5f","Type":"ContainerStarted","Data":"bd2c13872ff3e1f4400b7c61fb35b3c93e8774df2726642f816df932855bbe19"} Dec 16 07:19:01 crc kubenswrapper[4823]: I1216 07:19:01.944215 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1e69e0bb-4482-4f95-b26e-d129784035d0","Type":"ContainerStarted","Data":"140af84f921b606a0b7c9baa955128fdbd81d52746839b0922c104b816105bd1"} Dec 16 07:19:02 crc kubenswrapper[4823]: I1216 07:19:02.161852 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-58d7958684-mfsdc"] Dec 16 07:19:02 crc kubenswrapper[4823]: W1216 07:19:02.373880 4823 helpers.go:245] readString: Failed to read "/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod22c10a9c_6dba_4d35_a0d8_2ef0b82352cb.slice/crio-conmon-e1d8bf154774451c8501c6f494acda39ed4df5c898f297ae1bf7629b652517dc.scope/memory.swap.max": read /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod22c10a9c_6dba_4d35_a0d8_2ef0b82352cb.slice/crio-conmon-e1d8bf154774451c8501c6f494acda39ed4df5c898f297ae1bf7629b652517dc.scope/memory.swap.max: no such device Dec 16 07:19:02 crc kubenswrapper[4823]: I1216 07:19:02.389469 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75dbb546bf-zxsbt"] Dec 16 07:19:02 crc kubenswrapper[4823]: W1216 07:19:02.457914 4823 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod52fb8160_a1e0_4b7e_a3ce_bd018dc8c512.slice/crio-3b938c5353d77ba2e003d77b926ed4311cd8b81b8961f00030320fb101dc9baa WatchSource:0}: Error finding container 3b938c5353d77ba2e003d77b926ed4311cd8b81b8961f00030320fb101dc9baa: Status 404 returned error can't find the container with id 3b938c5353d77ba2e003d77b926ed4311cd8b81b8961f00030320fb101dc9baa Dec 16 07:19:02 crc kubenswrapper[4823]: E1216 07:19:02.610327 4823 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod22c10a9c_6dba_4d35_a0d8_2ef0b82352cb.slice/crio-conmon-e1d8bf154774451c8501c6f494acda39ed4df5c898f297ae1bf7629b652517dc.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbd08d68a_e304_44ba_a7bc_2545ba318c5f.slice/crio-conmon-39b4a380251b27d9040d063a437748b482291ddd9afcdc666a1cd3646fad804a.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbd08d68a_e304_44ba_a7bc_2545ba318c5f.slice/crio-39b4a380251b27d9040d063a437748b482291ddd9afcdc666a1cd3646fad804a.scope\": RecentStats: unable to find data in memory cache]" Dec 16 07:19:03 crc kubenswrapper[4823]: I1216 07:19:03.057048 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-58d7958684-mfsdc" event={"ID":"472078f0-72f2-4e3e-a626-8a98e3329fe6","Type":"ContainerStarted","Data":"ad0e619fe597f38298ba6b5425eeda3261561fe3395b804ebbd85c41d6bac42e"} Dec 16 07:19:03 crc kubenswrapper[4823]: I1216 07:19:03.057479 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-58d7958684-mfsdc" event={"ID":"472078f0-72f2-4e3e-a626-8a98e3329fe6","Type":"ContainerStarted","Data":"79e08e3b403b1c797e52c3704ec66026cfab83c5c450ea72b672827d35c00a25"} Dec 16 07:19:03 crc kubenswrapper[4823]: 
I1216 07:19:03.057500 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-58d7958684-mfsdc" event={"ID":"472078f0-72f2-4e3e-a626-8a98e3329fe6","Type":"ContainerStarted","Data":"d7c298a5ca4a6849c10bbae0c2c999f343f42297d5c93e9df241c702c82bd9b4"} Dec 16 07:19:03 crc kubenswrapper[4823]: I1216 07:19:03.074866 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"c8dc4eee-95c9-4460-84d3-fc8c18a8ab1a","Type":"ContainerStarted","Data":"7df7313948dc8aac9ecdd39743719fe4550166c29436ffebdf4751a6884734bf"} Dec 16 07:19:03 crc kubenswrapper[4823]: I1216 07:19:03.103207 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75dbb546bf-zxsbt" event={"ID":"52fb8160-a1e0-4b7e-a3ce-bd018dc8c512","Type":"ContainerStarted","Data":"40d274525e2d9a0c414e988dd7895e2962042be8fc3f624f35efa6cd9aab456e"} Dec 16 07:19:03 crc kubenswrapper[4823]: I1216 07:19:03.103271 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75dbb546bf-zxsbt" event={"ID":"52fb8160-a1e0-4b7e-a3ce-bd018dc8c512","Type":"ContainerStarted","Data":"3b938c5353d77ba2e003d77b926ed4311cd8b81b8961f00030320fb101dc9baa"} Dec 16 07:19:03 crc kubenswrapper[4823]: I1216 07:19:03.144312 4823 generic.go:334] "Generic (PLEG): container finished" podID="bd08d68a-e304-44ba-a7bc-2545ba318c5f" containerID="39b4a380251b27d9040d063a437748b482291ddd9afcdc666a1cd3646fad804a" exitCode=0 Dec 16 07:19:03 crc kubenswrapper[4823]: I1216 07:19:03.144411 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f6d6ddd89-h4vj9" event={"ID":"bd08d68a-e304-44ba-a7bc-2545ba318c5f","Type":"ContainerDied","Data":"39b4a380251b27d9040d063a437748b482291ddd9afcdc666a1cd3646fad804a"} Dec 16 07:19:03 crc kubenswrapper[4823]: I1216 07:19:03.163863 4823 generic.go:334] "Generic (PLEG): container finished" podID="22c10a9c-6dba-4d35-a0d8-2ef0b82352cb" 
containerID="e1d8bf154774451c8501c6f494acda39ed4df5c898f297ae1bf7629b652517dc" exitCode=0 Dec 16 07:19:03 crc kubenswrapper[4823]: I1216 07:19:03.163921 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"22c10a9c-6dba-4d35-a0d8-2ef0b82352cb","Type":"ContainerDied","Data":"e1d8bf154774451c8501c6f494acda39ed4df5c898f297ae1bf7629b652517dc"} Dec 16 07:19:03 crc kubenswrapper[4823]: I1216 07:19:03.168219 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 16 07:19:03 crc kubenswrapper[4823]: I1216 07:19:03.483578 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 16 07:19:03 crc kubenswrapper[4823]: I1216 07:19:03.643551 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/22c10a9c-6dba-4d35-a0d8-2ef0b82352cb-sg-core-conf-yaml\") pod \"22c10a9c-6dba-4d35-a0d8-2ef0b82352cb\" (UID: \"22c10a9c-6dba-4d35-a0d8-2ef0b82352cb\") " Dec 16 07:19:03 crc kubenswrapper[4823]: I1216 07:19:03.643609 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/22c10a9c-6dba-4d35-a0d8-2ef0b82352cb-log-httpd\") pod \"22c10a9c-6dba-4d35-a0d8-2ef0b82352cb\" (UID: \"22c10a9c-6dba-4d35-a0d8-2ef0b82352cb\") " Dec 16 07:19:03 crc kubenswrapper[4823]: I1216 07:19:03.643713 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/22c10a9c-6dba-4d35-a0d8-2ef0b82352cb-scripts\") pod \"22c10a9c-6dba-4d35-a0d8-2ef0b82352cb\" (UID: \"22c10a9c-6dba-4d35-a0d8-2ef0b82352cb\") " Dec 16 07:19:03 crc kubenswrapper[4823]: I1216 07:19:03.643742 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/22c10a9c-6dba-4d35-a0d8-2ef0b82352cb-run-httpd\") pod 
\"22c10a9c-6dba-4d35-a0d8-2ef0b82352cb\" (UID: \"22c10a9c-6dba-4d35-a0d8-2ef0b82352cb\") " Dec 16 07:19:03 crc kubenswrapper[4823]: I1216 07:19:03.643786 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qr9jj\" (UniqueName: \"kubernetes.io/projected/22c10a9c-6dba-4d35-a0d8-2ef0b82352cb-kube-api-access-qr9jj\") pod \"22c10a9c-6dba-4d35-a0d8-2ef0b82352cb\" (UID: \"22c10a9c-6dba-4d35-a0d8-2ef0b82352cb\") " Dec 16 07:19:03 crc kubenswrapper[4823]: I1216 07:19:03.643833 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22c10a9c-6dba-4d35-a0d8-2ef0b82352cb-config-data\") pod \"22c10a9c-6dba-4d35-a0d8-2ef0b82352cb\" (UID: \"22c10a9c-6dba-4d35-a0d8-2ef0b82352cb\") " Dec 16 07:19:03 crc kubenswrapper[4823]: I1216 07:19:03.643935 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22c10a9c-6dba-4d35-a0d8-2ef0b82352cb-combined-ca-bundle\") pod \"22c10a9c-6dba-4d35-a0d8-2ef0b82352cb\" (UID: \"22c10a9c-6dba-4d35-a0d8-2ef0b82352cb\") " Dec 16 07:19:03 crc kubenswrapper[4823]: I1216 07:19:03.653559 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/22c10a9c-6dba-4d35-a0d8-2ef0b82352cb-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "22c10a9c-6dba-4d35-a0d8-2ef0b82352cb" (UID: "22c10a9c-6dba-4d35-a0d8-2ef0b82352cb"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:19:03 crc kubenswrapper[4823]: I1216 07:19:03.653893 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/22c10a9c-6dba-4d35-a0d8-2ef0b82352cb-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "22c10a9c-6dba-4d35-a0d8-2ef0b82352cb" (UID: "22c10a9c-6dba-4d35-a0d8-2ef0b82352cb"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:19:03 crc kubenswrapper[4823]: I1216 07:19:03.682230 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c10a9c-6dba-4d35-a0d8-2ef0b82352cb-scripts" (OuterVolumeSpecName: "scripts") pod "22c10a9c-6dba-4d35-a0d8-2ef0b82352cb" (UID: "22c10a9c-6dba-4d35-a0d8-2ef0b82352cb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:19:03 crc kubenswrapper[4823]: I1216 07:19:03.682367 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c10a9c-6dba-4d35-a0d8-2ef0b82352cb-kube-api-access-qr9jj" (OuterVolumeSpecName: "kube-api-access-qr9jj") pod "22c10a9c-6dba-4d35-a0d8-2ef0b82352cb" (UID: "22c10a9c-6dba-4d35-a0d8-2ef0b82352cb"). InnerVolumeSpecName "kube-api-access-qr9jj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:19:03 crc kubenswrapper[4823]: I1216 07:19:03.746106 4823 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/22c10a9c-6dba-4d35-a0d8-2ef0b82352cb-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 16 07:19:03 crc kubenswrapper[4823]: I1216 07:19:03.746144 4823 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/22c10a9c-6dba-4d35-a0d8-2ef0b82352cb-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:19:03 crc kubenswrapper[4823]: I1216 07:19:03.746157 4823 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/22c10a9c-6dba-4d35-a0d8-2ef0b82352cb-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 16 07:19:03 crc kubenswrapper[4823]: I1216 07:19:03.746168 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qr9jj\" (UniqueName: \"kubernetes.io/projected/22c10a9c-6dba-4d35-a0d8-2ef0b82352cb-kube-api-access-qr9jj\") on node \"crc\" DevicePath \"\"" Dec 16 07:19:03 
crc kubenswrapper[4823]: I1216 07:19:03.801749 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c10a9c-6dba-4d35-a0d8-2ef0b82352cb-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "22c10a9c-6dba-4d35-a0d8-2ef0b82352cb" (UID: "22c10a9c-6dba-4d35-a0d8-2ef0b82352cb"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:19:03 crc kubenswrapper[4823]: I1216 07:19:03.850137 4823 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/22c10a9c-6dba-4d35-a0d8-2ef0b82352cb-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 16 07:19:03 crc kubenswrapper[4823]: I1216 07:19:03.903829 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f6d6ddd89-h4vj9" Dec 16 07:19:03 crc kubenswrapper[4823]: I1216 07:19:03.945930 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c10a9c-6dba-4d35-a0d8-2ef0b82352cb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "22c10a9c-6dba-4d35-a0d8-2ef0b82352cb" (UID: "22c10a9c-6dba-4d35-a0d8-2ef0b82352cb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:19:03 crc kubenswrapper[4823]: I1216 07:19:03.952431 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22c10a9c-6dba-4d35-a0d8-2ef0b82352cb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:19:03 crc kubenswrapper[4823]: I1216 07:19:03.992187 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c10a9c-6dba-4d35-a0d8-2ef0b82352cb-config-data" (OuterVolumeSpecName: "config-data") pod "22c10a9c-6dba-4d35-a0d8-2ef0b82352cb" (UID: "22c10a9c-6dba-4d35-a0d8-2ef0b82352cb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:19:04 crc kubenswrapper[4823]: I1216 07:19:04.053424 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd08d68a-e304-44ba-a7bc-2545ba318c5f-config\") pod \"bd08d68a-e304-44ba-a7bc-2545ba318c5f\" (UID: \"bd08d68a-e304-44ba-a7bc-2545ba318c5f\") " Dec 16 07:19:04 crc kubenswrapper[4823]: I1216 07:19:04.053500 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bd08d68a-e304-44ba-a7bc-2545ba318c5f-ovsdbserver-sb\") pod \"bd08d68a-e304-44ba-a7bc-2545ba318c5f\" (UID: \"bd08d68a-e304-44ba-a7bc-2545ba318c5f\") " Dec 16 07:19:04 crc kubenswrapper[4823]: I1216 07:19:04.053531 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bd08d68a-e304-44ba-a7bc-2545ba318c5f-ovsdbserver-nb\") pod \"bd08d68a-e304-44ba-a7bc-2545ba318c5f\" (UID: \"bd08d68a-e304-44ba-a7bc-2545ba318c5f\") " Dec 16 07:19:04 crc kubenswrapper[4823]: I1216 07:19:04.053591 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bd08d68a-e304-44ba-a7bc-2545ba318c5f-dns-swift-storage-0\") pod \"bd08d68a-e304-44ba-a7bc-2545ba318c5f\" (UID: \"bd08d68a-e304-44ba-a7bc-2545ba318c5f\") " Dec 16 07:19:04 crc kubenswrapper[4823]: I1216 07:19:04.053616 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xl8r8\" (UniqueName: \"kubernetes.io/projected/bd08d68a-e304-44ba-a7bc-2545ba318c5f-kube-api-access-xl8r8\") pod \"bd08d68a-e304-44ba-a7bc-2545ba318c5f\" (UID: \"bd08d68a-e304-44ba-a7bc-2545ba318c5f\") " Dec 16 07:19:04 crc kubenswrapper[4823]: I1216 07:19:04.053647 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" 
(UniqueName: \"kubernetes.io/configmap/bd08d68a-e304-44ba-a7bc-2545ba318c5f-dns-svc\") pod \"bd08d68a-e304-44ba-a7bc-2545ba318c5f\" (UID: \"bd08d68a-e304-44ba-a7bc-2545ba318c5f\") " Dec 16 07:19:04 crc kubenswrapper[4823]: I1216 07:19:04.054870 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22c10a9c-6dba-4d35-a0d8-2ef0b82352cb-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 07:19:04 crc kubenswrapper[4823]: I1216 07:19:04.065131 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd08d68a-e304-44ba-a7bc-2545ba318c5f-kube-api-access-xl8r8" (OuterVolumeSpecName: "kube-api-access-xl8r8") pod "bd08d68a-e304-44ba-a7bc-2545ba318c5f" (UID: "bd08d68a-e304-44ba-a7bc-2545ba318c5f"). InnerVolumeSpecName "kube-api-access-xl8r8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:19:04 crc kubenswrapper[4823]: I1216 07:19:04.081072 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd08d68a-e304-44ba-a7bc-2545ba318c5f-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "bd08d68a-e304-44ba-a7bc-2545ba318c5f" (UID: "bd08d68a-e304-44ba-a7bc-2545ba318c5f"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:19:04 crc kubenswrapper[4823]: I1216 07:19:04.082059 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd08d68a-e304-44ba-a7bc-2545ba318c5f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "bd08d68a-e304-44ba-a7bc-2545ba318c5f" (UID: "bd08d68a-e304-44ba-a7bc-2545ba318c5f"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:19:04 crc kubenswrapper[4823]: I1216 07:19:04.083942 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd08d68a-e304-44ba-a7bc-2545ba318c5f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "bd08d68a-e304-44ba-a7bc-2545ba318c5f" (UID: "bd08d68a-e304-44ba-a7bc-2545ba318c5f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:19:04 crc kubenswrapper[4823]: I1216 07:19:04.091460 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd08d68a-e304-44ba-a7bc-2545ba318c5f-config" (OuterVolumeSpecName: "config") pod "bd08d68a-e304-44ba-a7bc-2545ba318c5f" (UID: "bd08d68a-e304-44ba-a7bc-2545ba318c5f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:19:04 crc kubenswrapper[4823]: I1216 07:19:04.095873 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd08d68a-e304-44ba-a7bc-2545ba318c5f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "bd08d68a-e304-44ba-a7bc-2545ba318c5f" (UID: "bd08d68a-e304-44ba-a7bc-2545ba318c5f"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:19:04 crc kubenswrapper[4823]: I1216 07:19:04.156585 4823 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bd08d68a-e304-44ba-a7bc-2545ba318c5f-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 16 07:19:04 crc kubenswrapper[4823]: I1216 07:19:04.156624 4823 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd08d68a-e304-44ba-a7bc-2545ba318c5f-config\") on node \"crc\" DevicePath \"\"" Dec 16 07:19:04 crc kubenswrapper[4823]: I1216 07:19:04.156637 4823 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bd08d68a-e304-44ba-a7bc-2545ba318c5f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 16 07:19:04 crc kubenswrapper[4823]: I1216 07:19:04.156648 4823 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bd08d68a-e304-44ba-a7bc-2545ba318c5f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 16 07:19:04 crc kubenswrapper[4823]: I1216 07:19:04.156657 4823 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bd08d68a-e304-44ba-a7bc-2545ba318c5f-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 16 07:19:04 crc kubenswrapper[4823]: I1216 07:19:04.156667 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xl8r8\" (UniqueName: \"kubernetes.io/projected/bd08d68a-e304-44ba-a7bc-2545ba318c5f-kube-api-access-xl8r8\") on node \"crc\" DevicePath \"\"" Dec 16 07:19:04 crc kubenswrapper[4823]: I1216 07:19:04.176505 4823 generic.go:334] "Generic (PLEG): container finished" podID="52fb8160-a1e0-4b7e-a3ce-bd018dc8c512" containerID="40d274525e2d9a0c414e988dd7895e2962042be8fc3f624f35efa6cd9aab456e" exitCode=0 Dec 16 07:19:04 crc kubenswrapper[4823]: I1216 07:19:04.176566 4823 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75dbb546bf-zxsbt" event={"ID":"52fb8160-a1e0-4b7e-a3ce-bd018dc8c512","Type":"ContainerDied","Data":"40d274525e2d9a0c414e988dd7895e2962042be8fc3f624f35efa6cd9aab456e"} Dec 16 07:19:04 crc kubenswrapper[4823]: I1216 07:19:04.184656 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f6d6ddd89-h4vj9" event={"ID":"bd08d68a-e304-44ba-a7bc-2545ba318c5f","Type":"ContainerDied","Data":"bd2c13872ff3e1f4400b7c61fb35b3c93e8774df2726642f816df932855bbe19"} Dec 16 07:19:04 crc kubenswrapper[4823]: I1216 07:19:04.184701 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f6d6ddd89-h4vj9" Dec 16 07:19:04 crc kubenswrapper[4823]: I1216 07:19:04.184709 4823 scope.go:117] "RemoveContainer" containerID="39b4a380251b27d9040d063a437748b482291ddd9afcdc666a1cd3646fad804a" Dec 16 07:19:04 crc kubenswrapper[4823]: I1216 07:19:04.194348 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1e69e0bb-4482-4f95-b26e-d129784035d0","Type":"ContainerStarted","Data":"01b3ec6ae5d5b01394c9db938ef8c1223f1e6a09a65c2a48ae44637c46108dfd"} Dec 16 07:19:04 crc kubenswrapper[4823]: I1216 07:19:04.198715 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"22c10a9c-6dba-4d35-a0d8-2ef0b82352cb","Type":"ContainerDied","Data":"d830021772bc6acdb57fc16639b21dc822f7b3e5addd734a34dd548a4cfce5b9"} Dec 16 07:19:04 crc kubenswrapper[4823]: I1216 07:19:04.198759 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 16 07:19:04 crc kubenswrapper[4823]: I1216 07:19:04.201710 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="c8dc4eee-95c9-4460-84d3-fc8c18a8ab1a" containerName="cinder-api-log" containerID="cri-o://7df7313948dc8aac9ecdd39743719fe4550166c29436ffebdf4751a6884734bf" gracePeriod=30 Dec 16 07:19:04 crc kubenswrapper[4823]: I1216 07:19:04.201969 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"c8dc4eee-95c9-4460-84d3-fc8c18a8ab1a","Type":"ContainerStarted","Data":"3bbb7d0b518486c9da08c3aa98624cbf02668f6a757f1b21cd693ea4f2ba78eb"} Dec 16 07:19:04 crc kubenswrapper[4823]: I1216 07:19:04.202001 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-58d7958684-mfsdc" Dec 16 07:19:04 crc kubenswrapper[4823]: I1216 07:19:04.202039 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-58d7958684-mfsdc" Dec 16 07:19:04 crc kubenswrapper[4823]: I1216 07:19:04.202057 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 16 07:19:04 crc kubenswrapper[4823]: I1216 07:19:04.202083 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="c8dc4eee-95c9-4460-84d3-fc8c18a8ab1a" containerName="cinder-api" containerID="cri-o://3bbb7d0b518486c9da08c3aa98624cbf02668f6a757f1b21cd693ea4f2ba78eb" gracePeriod=30 Dec 16 07:19:04 crc kubenswrapper[4823]: I1216 07:19:04.265621 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6f6d6ddd89-h4vj9"] Dec 16 07:19:04 crc kubenswrapper[4823]: I1216 07:19:04.278203 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6f6d6ddd89-h4vj9"] Dec 16 07:19:04 crc kubenswrapper[4823]: I1216 07:19:04.282332 4823 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.282310846 podStartE2EDuration="4.282310846s" podCreationTimestamp="2025-12-16 07:19:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 07:19:04.267975157 +0000 UTC m=+1422.756541280" watchObservedRunningTime="2025-12-16 07:19:04.282310846 +0000 UTC m=+1422.770876969" Dec 16 07:19:04 crc kubenswrapper[4823]: I1216 07:19:04.302569 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-58d7958684-mfsdc" podStartSLOduration=4.30254447 podStartE2EDuration="4.30254447s" podCreationTimestamp="2025-12-16 07:19:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 07:19:04.30063829 +0000 UTC m=+1422.789204423" watchObservedRunningTime="2025-12-16 07:19:04.30254447 +0000 UTC m=+1422.791110593" Dec 16 07:19:04 crc kubenswrapper[4823]: I1216 07:19:04.365419 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 16 07:19:04 crc kubenswrapper[4823]: I1216 07:19:04.388094 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 16 07:19:04 crc kubenswrapper[4823]: I1216 07:19:04.415788 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 16 07:19:04 crc kubenswrapper[4823]: E1216 07:19:04.416197 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22c10a9c-6dba-4d35-a0d8-2ef0b82352cb" containerName="ceilometer-notification-agent" Dec 16 07:19:04 crc kubenswrapper[4823]: I1216 07:19:04.416213 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="22c10a9c-6dba-4d35-a0d8-2ef0b82352cb" containerName="ceilometer-notification-agent" Dec 16 07:19:04 crc kubenswrapper[4823]: E1216 07:19:04.416234 4823 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="22c10a9c-6dba-4d35-a0d8-2ef0b82352cb" containerName="sg-core" Dec 16 07:19:04 crc kubenswrapper[4823]: I1216 07:19:04.416240 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="22c10a9c-6dba-4d35-a0d8-2ef0b82352cb" containerName="sg-core" Dec 16 07:19:04 crc kubenswrapper[4823]: E1216 07:19:04.416264 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22c10a9c-6dba-4d35-a0d8-2ef0b82352cb" containerName="proxy-httpd" Dec 16 07:19:04 crc kubenswrapper[4823]: I1216 07:19:04.416270 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="22c10a9c-6dba-4d35-a0d8-2ef0b82352cb" containerName="proxy-httpd" Dec 16 07:19:04 crc kubenswrapper[4823]: E1216 07:19:04.416281 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd08d68a-e304-44ba-a7bc-2545ba318c5f" containerName="init" Dec 16 07:19:04 crc kubenswrapper[4823]: I1216 07:19:04.416287 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd08d68a-e304-44ba-a7bc-2545ba318c5f" containerName="init" Dec 16 07:19:04 crc kubenswrapper[4823]: I1216 07:19:04.416450 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="22c10a9c-6dba-4d35-a0d8-2ef0b82352cb" containerName="proxy-httpd" Dec 16 07:19:04 crc kubenswrapper[4823]: I1216 07:19:04.416465 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="22c10a9c-6dba-4d35-a0d8-2ef0b82352cb" containerName="ceilometer-notification-agent" Dec 16 07:19:04 crc kubenswrapper[4823]: I1216 07:19:04.416479 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd08d68a-e304-44ba-a7bc-2545ba318c5f" containerName="init" Dec 16 07:19:04 crc kubenswrapper[4823]: I1216 07:19:04.416490 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="22c10a9c-6dba-4d35-a0d8-2ef0b82352cb" containerName="sg-core" Dec 16 07:19:04 crc kubenswrapper[4823]: I1216 07:19:04.418038 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 16 07:19:04 crc kubenswrapper[4823]: I1216 07:19:04.423630 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 16 07:19:04 crc kubenswrapper[4823]: I1216 07:19:04.423861 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 16 07:19:04 crc kubenswrapper[4823]: I1216 07:19:04.424771 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 16 07:19:04 crc kubenswrapper[4823]: I1216 07:19:04.564955 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/421bf187-1b90-4633-9cce-3ef3e0387343-run-httpd\") pod \"ceilometer-0\" (UID: \"421bf187-1b90-4633-9cce-3ef3e0387343\") " pod="openstack/ceilometer-0" Dec 16 07:19:04 crc kubenswrapper[4823]: I1216 07:19:04.565294 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/421bf187-1b90-4633-9cce-3ef3e0387343-log-httpd\") pod \"ceilometer-0\" (UID: \"421bf187-1b90-4633-9cce-3ef3e0387343\") " pod="openstack/ceilometer-0" Dec 16 07:19:04 crc kubenswrapper[4823]: I1216 07:19:04.565357 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/421bf187-1b90-4633-9cce-3ef3e0387343-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"421bf187-1b90-4633-9cce-3ef3e0387343\") " pod="openstack/ceilometer-0" Dec 16 07:19:04 crc kubenswrapper[4823]: I1216 07:19:04.565382 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4vmn\" (UniqueName: \"kubernetes.io/projected/421bf187-1b90-4633-9cce-3ef3e0387343-kube-api-access-z4vmn\") pod \"ceilometer-0\" (UID: 
\"421bf187-1b90-4633-9cce-3ef3e0387343\") " pod="openstack/ceilometer-0" Dec 16 07:19:04 crc kubenswrapper[4823]: I1216 07:19:04.565516 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/421bf187-1b90-4633-9cce-3ef3e0387343-config-data\") pod \"ceilometer-0\" (UID: \"421bf187-1b90-4633-9cce-3ef3e0387343\") " pod="openstack/ceilometer-0" Dec 16 07:19:04 crc kubenswrapper[4823]: I1216 07:19:04.565555 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/421bf187-1b90-4633-9cce-3ef3e0387343-scripts\") pod \"ceilometer-0\" (UID: \"421bf187-1b90-4633-9cce-3ef3e0387343\") " pod="openstack/ceilometer-0" Dec 16 07:19:04 crc kubenswrapper[4823]: I1216 07:19:04.565610 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/421bf187-1b90-4633-9cce-3ef3e0387343-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"421bf187-1b90-4633-9cce-3ef3e0387343\") " pod="openstack/ceilometer-0" Dec 16 07:19:04 crc kubenswrapper[4823]: I1216 07:19:04.667453 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/421bf187-1b90-4633-9cce-3ef3e0387343-config-data\") pod \"ceilometer-0\" (UID: \"421bf187-1b90-4633-9cce-3ef3e0387343\") " pod="openstack/ceilometer-0" Dec 16 07:19:04 crc kubenswrapper[4823]: I1216 07:19:04.667515 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/421bf187-1b90-4633-9cce-3ef3e0387343-scripts\") pod \"ceilometer-0\" (UID: \"421bf187-1b90-4633-9cce-3ef3e0387343\") " pod="openstack/ceilometer-0" Dec 16 07:19:04 crc kubenswrapper[4823]: I1216 07:19:04.667540 4823 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/421bf187-1b90-4633-9cce-3ef3e0387343-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"421bf187-1b90-4633-9cce-3ef3e0387343\") " pod="openstack/ceilometer-0" Dec 16 07:19:04 crc kubenswrapper[4823]: I1216 07:19:04.667623 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/421bf187-1b90-4633-9cce-3ef3e0387343-run-httpd\") pod \"ceilometer-0\" (UID: \"421bf187-1b90-4633-9cce-3ef3e0387343\") " pod="openstack/ceilometer-0" Dec 16 07:19:04 crc kubenswrapper[4823]: I1216 07:19:04.667641 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/421bf187-1b90-4633-9cce-3ef3e0387343-log-httpd\") pod \"ceilometer-0\" (UID: \"421bf187-1b90-4633-9cce-3ef3e0387343\") " pod="openstack/ceilometer-0" Dec 16 07:19:04 crc kubenswrapper[4823]: I1216 07:19:04.667680 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/421bf187-1b90-4633-9cce-3ef3e0387343-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"421bf187-1b90-4633-9cce-3ef3e0387343\") " pod="openstack/ceilometer-0" Dec 16 07:19:04 crc kubenswrapper[4823]: I1216 07:19:04.667698 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4vmn\" (UniqueName: \"kubernetes.io/projected/421bf187-1b90-4633-9cce-3ef3e0387343-kube-api-access-z4vmn\") pod \"ceilometer-0\" (UID: \"421bf187-1b90-4633-9cce-3ef3e0387343\") " pod="openstack/ceilometer-0" Dec 16 07:19:04 crc kubenswrapper[4823]: I1216 07:19:04.672605 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/421bf187-1b90-4633-9cce-3ef3e0387343-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"421bf187-1b90-4633-9cce-3ef3e0387343\") " 
pod="openstack/ceilometer-0" Dec 16 07:19:04 crc kubenswrapper[4823]: I1216 07:19:04.672649 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/421bf187-1b90-4633-9cce-3ef3e0387343-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"421bf187-1b90-4633-9cce-3ef3e0387343\") " pod="openstack/ceilometer-0" Dec 16 07:19:04 crc kubenswrapper[4823]: I1216 07:19:04.673009 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/421bf187-1b90-4633-9cce-3ef3e0387343-scripts\") pod \"ceilometer-0\" (UID: \"421bf187-1b90-4633-9cce-3ef3e0387343\") " pod="openstack/ceilometer-0" Dec 16 07:19:04 crc kubenswrapper[4823]: I1216 07:19:04.673180 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/421bf187-1b90-4633-9cce-3ef3e0387343-config-data\") pod \"ceilometer-0\" (UID: \"421bf187-1b90-4633-9cce-3ef3e0387343\") " pod="openstack/ceilometer-0" Dec 16 07:19:04 crc kubenswrapper[4823]: I1216 07:19:04.674694 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/421bf187-1b90-4633-9cce-3ef3e0387343-log-httpd\") pod \"ceilometer-0\" (UID: \"421bf187-1b90-4633-9cce-3ef3e0387343\") " pod="openstack/ceilometer-0" Dec 16 07:19:04 crc kubenswrapper[4823]: I1216 07:19:04.675406 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/421bf187-1b90-4633-9cce-3ef3e0387343-run-httpd\") pod \"ceilometer-0\" (UID: \"421bf187-1b90-4633-9cce-3ef3e0387343\") " pod="openstack/ceilometer-0" Dec 16 07:19:04 crc kubenswrapper[4823]: I1216 07:19:04.686777 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4vmn\" (UniqueName: \"kubernetes.io/projected/421bf187-1b90-4633-9cce-3ef3e0387343-kube-api-access-z4vmn\") pod 
\"ceilometer-0\" (UID: \"421bf187-1b90-4633-9cce-3ef3e0387343\") " pod="openstack/ceilometer-0" Dec 16 07:19:04 crc kubenswrapper[4823]: I1216 07:19:04.752823 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 16 07:19:05 crc kubenswrapper[4823]: I1216 07:19:05.212062 4823 generic.go:334] "Generic (PLEG): container finished" podID="c8dc4eee-95c9-4460-84d3-fc8c18a8ab1a" containerID="7df7313948dc8aac9ecdd39743719fe4550166c29436ffebdf4751a6884734bf" exitCode=143 Dec 16 07:19:05 crc kubenswrapper[4823]: I1216 07:19:05.212148 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"c8dc4eee-95c9-4460-84d3-fc8c18a8ab1a","Type":"ContainerDied","Data":"7df7313948dc8aac9ecdd39743719fe4550166c29436ffebdf4751a6884734bf"} Dec 16 07:19:05 crc kubenswrapper[4823]: I1216 07:19:05.666472 4823 scope.go:117] "RemoveContainer" containerID="b2ec0bdc64647f0f25b6b8ce3a675ed50d759193297e4d2a2fde6255b767495b" Dec 16 07:19:05 crc kubenswrapper[4823]: I1216 07:19:05.791990 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c10a9c-6dba-4d35-a0d8-2ef0b82352cb" path="/var/lib/kubelet/pods/22c10a9c-6dba-4d35-a0d8-2ef0b82352cb/volumes" Dec 16 07:19:05 crc kubenswrapper[4823]: I1216 07:19:05.792928 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd08d68a-e304-44ba-a7bc-2545ba318c5f" path="/var/lib/kubelet/pods/bd08d68a-e304-44ba-a7bc-2545ba318c5f/volumes" Dec 16 07:19:05 crc kubenswrapper[4823]: I1216 07:19:05.953386 4823 scope.go:117] "RemoveContainer" containerID="613f620bca924e060d3999a84035724c6efbc43f0b4c818377f5ccf218c6557f" Dec 16 07:19:05 crc kubenswrapper[4823]: I1216 07:19:05.994545 4823 scope.go:117] "RemoveContainer" containerID="e1d8bf154774451c8501c6f494acda39ed4df5c898f297ae1bf7629b652517dc" Dec 16 07:19:06 crc kubenswrapper[4823]: I1216 07:19:06.248752 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-75dbb546bf-zxsbt" event={"ID":"52fb8160-a1e0-4b7e-a3ce-bd018dc8c512","Type":"ContainerStarted","Data":"66fb8cb9dc2bdcafdd3f90d87593590c0679946f3ccdc9113b87fb499e690755"} Dec 16 07:19:06 crc kubenswrapper[4823]: I1216 07:19:06.250324 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-75dbb546bf-zxsbt" Dec 16 07:19:06 crc kubenswrapper[4823]: I1216 07:19:06.297611 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-749d6ff74-w7lnp" event={"ID":"b1cd2ecc-0f95-47b9-b43e-5c36fa6a33b6","Type":"ContainerStarted","Data":"8ad897558a4d1aff0a9610d56a52fa00c8ef0a67fe7b4ed5924be748f672b9da"} Dec 16 07:19:06 crc kubenswrapper[4823]: I1216 07:19:06.298743 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-75dbb546bf-zxsbt" podStartSLOduration=6.298729041 podStartE2EDuration="6.298729041s" podCreationTimestamp="2025-12-16 07:19:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 07:19:06.284617639 +0000 UTC m=+1424.773183762" watchObservedRunningTime="2025-12-16 07:19:06.298729041 +0000 UTC m=+1424.787295174" Dec 16 07:19:06 crc kubenswrapper[4823]: I1216 07:19:06.313348 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-99f9cf477-cj5ss" event={"ID":"22db0f3f-88b5-4909-aa80-f4b020d1ce18","Type":"ContainerStarted","Data":"831ecac837adb99ec40863b494bf97107b9ae0dabceaa9308863d145e46da25d"} Dec 16 07:19:06 crc kubenswrapper[4823]: I1216 07:19:06.322838 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 16 07:19:07 crc kubenswrapper[4823]: I1216 07:19:07.323329 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"1e69e0bb-4482-4f95-b26e-d129784035d0","Type":"ContainerStarted","Data":"6b0d9c033657c7fa6a3102ca3db0c867e213117f9b08f450b0469f1ef4f82594"} Dec 16 07:19:07 crc kubenswrapper[4823]: I1216 07:19:07.324893 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-749d6ff74-w7lnp" event={"ID":"b1cd2ecc-0f95-47b9-b43e-5c36fa6a33b6","Type":"ContainerStarted","Data":"e8fc2c585ca3548f97ebfebb7bf47c7049bc68135e31ef4c2ea178062973a34c"} Dec 16 07:19:07 crc kubenswrapper[4823]: I1216 07:19:07.326651 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-99f9cf477-cj5ss" event={"ID":"22db0f3f-88b5-4909-aa80-f4b020d1ce18","Type":"ContainerStarted","Data":"cf30b2b23d3cfc03fbaa30e01fba97a5a84312c839e774aea9c6f64e79f21e6a"} Dec 16 07:19:07 crc kubenswrapper[4823]: I1216 07:19:07.329660 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"421bf187-1b90-4633-9cce-3ef3e0387343","Type":"ContainerStarted","Data":"45b6fd78d4cc27fc4f505fe84854559b90664fc56cf4eea2db97bbae374da5fa"} Dec 16 07:19:07 crc kubenswrapper[4823]: I1216 07:19:07.349787 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=6.065151015 podStartE2EDuration="7.349771309s" podCreationTimestamp="2025-12-16 07:19:00 +0000 UTC" firstStartedPulling="2025-12-16 07:19:01.385392074 +0000 UTC m=+1419.873958197" lastFinishedPulling="2025-12-16 07:19:02.670012368 +0000 UTC m=+1421.158578491" observedRunningTime="2025-12-16 07:19:07.343990438 +0000 UTC m=+1425.832556561" watchObservedRunningTime="2025-12-16 07:19:07.349771309 +0000 UTC m=+1425.838337422" Dec 16 07:19:07 crc kubenswrapper[4823]: I1216 07:19:07.367541 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-749d6ff74-w7lnp" podStartSLOduration=3.464028778 podStartE2EDuration="7.367505575s" 
podCreationTimestamp="2025-12-16 07:19:00 +0000 UTC" firstStartedPulling="2025-12-16 07:19:01.82841286 +0000 UTC m=+1420.316978983" lastFinishedPulling="2025-12-16 07:19:05.731889657 +0000 UTC m=+1424.220455780" observedRunningTime="2025-12-16 07:19:07.362465837 +0000 UTC m=+1425.851031960" watchObservedRunningTime="2025-12-16 07:19:07.367505575 +0000 UTC m=+1425.856071698" Dec 16 07:19:07 crc kubenswrapper[4823]: I1216 07:19:07.382655 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-99f9cf477-cj5ss" podStartSLOduration=3.510673328 podStartE2EDuration="7.382638359s" podCreationTimestamp="2025-12-16 07:19:00 +0000 UTC" firstStartedPulling="2025-12-16 07:19:01.85750537 +0000 UTC m=+1420.346071503" lastFinishedPulling="2025-12-16 07:19:05.729470411 +0000 UTC m=+1424.218036534" observedRunningTime="2025-12-16 07:19:07.380381988 +0000 UTC m=+1425.868948121" watchObservedRunningTime="2025-12-16 07:19:07.382638359 +0000 UTC m=+1425.871204482" Dec 16 07:19:07 crc kubenswrapper[4823]: I1216 07:19:07.451057 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-6456ccccf4-rhnf4"] Dec 16 07:19:07 crc kubenswrapper[4823]: I1216 07:19:07.452556 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6456ccccf4-rhnf4" Dec 16 07:19:07 crc kubenswrapper[4823]: I1216 07:19:07.455048 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Dec 16 07:19:07 crc kubenswrapper[4823]: I1216 07:19:07.455144 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Dec 16 07:19:07 crc kubenswrapper[4823]: I1216 07:19:07.463936 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6456ccccf4-rhnf4"] Dec 16 07:19:07 crc kubenswrapper[4823]: I1216 07:19:07.464732 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-765f8bc948-dqt65" Dec 16 07:19:07 crc kubenswrapper[4823]: I1216 07:19:07.531355 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c559ee21-de8f-44a1-998a-cb0b4aff8cd7-internal-tls-certs\") pod \"barbican-api-6456ccccf4-rhnf4\" (UID: \"c559ee21-de8f-44a1-998a-cb0b4aff8cd7\") " pod="openstack/barbican-api-6456ccccf4-rhnf4" Dec 16 07:19:07 crc kubenswrapper[4823]: I1216 07:19:07.531449 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c559ee21-de8f-44a1-998a-cb0b4aff8cd7-config-data-custom\") pod \"barbican-api-6456ccccf4-rhnf4\" (UID: \"c559ee21-de8f-44a1-998a-cb0b4aff8cd7\") " pod="openstack/barbican-api-6456ccccf4-rhnf4" Dec 16 07:19:07 crc kubenswrapper[4823]: I1216 07:19:07.531496 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzzmh\" (UniqueName: \"kubernetes.io/projected/c559ee21-de8f-44a1-998a-cb0b4aff8cd7-kube-api-access-qzzmh\") pod \"barbican-api-6456ccccf4-rhnf4\" (UID: \"c559ee21-de8f-44a1-998a-cb0b4aff8cd7\") " 
pod="openstack/barbican-api-6456ccccf4-rhnf4" Dec 16 07:19:07 crc kubenswrapper[4823]: I1216 07:19:07.531553 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c559ee21-de8f-44a1-998a-cb0b4aff8cd7-public-tls-certs\") pod \"barbican-api-6456ccccf4-rhnf4\" (UID: \"c559ee21-de8f-44a1-998a-cb0b4aff8cd7\") " pod="openstack/barbican-api-6456ccccf4-rhnf4" Dec 16 07:19:07 crc kubenswrapper[4823]: I1216 07:19:07.531602 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c559ee21-de8f-44a1-998a-cb0b4aff8cd7-config-data\") pod \"barbican-api-6456ccccf4-rhnf4\" (UID: \"c559ee21-de8f-44a1-998a-cb0b4aff8cd7\") " pod="openstack/barbican-api-6456ccccf4-rhnf4" Dec 16 07:19:07 crc kubenswrapper[4823]: I1216 07:19:07.531652 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c559ee21-de8f-44a1-998a-cb0b4aff8cd7-combined-ca-bundle\") pod \"barbican-api-6456ccccf4-rhnf4\" (UID: \"c559ee21-de8f-44a1-998a-cb0b4aff8cd7\") " pod="openstack/barbican-api-6456ccccf4-rhnf4" Dec 16 07:19:07 crc kubenswrapper[4823]: I1216 07:19:07.531687 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c559ee21-de8f-44a1-998a-cb0b4aff8cd7-logs\") pod \"barbican-api-6456ccccf4-rhnf4\" (UID: \"c559ee21-de8f-44a1-998a-cb0b4aff8cd7\") " pod="openstack/barbican-api-6456ccccf4-rhnf4" Dec 16 07:19:07 crc kubenswrapper[4823]: I1216 07:19:07.633637 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c559ee21-de8f-44a1-998a-cb0b4aff8cd7-config-data\") pod \"barbican-api-6456ccccf4-rhnf4\" (UID: \"c559ee21-de8f-44a1-998a-cb0b4aff8cd7\") " 
pod="openstack/barbican-api-6456ccccf4-rhnf4" Dec 16 07:19:07 crc kubenswrapper[4823]: I1216 07:19:07.633724 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c559ee21-de8f-44a1-998a-cb0b4aff8cd7-combined-ca-bundle\") pod \"barbican-api-6456ccccf4-rhnf4\" (UID: \"c559ee21-de8f-44a1-998a-cb0b4aff8cd7\") " pod="openstack/barbican-api-6456ccccf4-rhnf4" Dec 16 07:19:07 crc kubenswrapper[4823]: I1216 07:19:07.633752 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c559ee21-de8f-44a1-998a-cb0b4aff8cd7-logs\") pod \"barbican-api-6456ccccf4-rhnf4\" (UID: \"c559ee21-de8f-44a1-998a-cb0b4aff8cd7\") " pod="openstack/barbican-api-6456ccccf4-rhnf4" Dec 16 07:19:07 crc kubenswrapper[4823]: I1216 07:19:07.633872 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c559ee21-de8f-44a1-998a-cb0b4aff8cd7-internal-tls-certs\") pod \"barbican-api-6456ccccf4-rhnf4\" (UID: \"c559ee21-de8f-44a1-998a-cb0b4aff8cd7\") " pod="openstack/barbican-api-6456ccccf4-rhnf4" Dec 16 07:19:07 crc kubenswrapper[4823]: I1216 07:19:07.633907 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c559ee21-de8f-44a1-998a-cb0b4aff8cd7-config-data-custom\") pod \"barbican-api-6456ccccf4-rhnf4\" (UID: \"c559ee21-de8f-44a1-998a-cb0b4aff8cd7\") " pod="openstack/barbican-api-6456ccccf4-rhnf4" Dec 16 07:19:07 crc kubenswrapper[4823]: I1216 07:19:07.633936 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qzzmh\" (UniqueName: \"kubernetes.io/projected/c559ee21-de8f-44a1-998a-cb0b4aff8cd7-kube-api-access-qzzmh\") pod \"barbican-api-6456ccccf4-rhnf4\" (UID: \"c559ee21-de8f-44a1-998a-cb0b4aff8cd7\") " 
pod="openstack/barbican-api-6456ccccf4-rhnf4" Dec 16 07:19:07 crc kubenswrapper[4823]: I1216 07:19:07.633988 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c559ee21-de8f-44a1-998a-cb0b4aff8cd7-public-tls-certs\") pod \"barbican-api-6456ccccf4-rhnf4\" (UID: \"c559ee21-de8f-44a1-998a-cb0b4aff8cd7\") " pod="openstack/barbican-api-6456ccccf4-rhnf4" Dec 16 07:19:07 crc kubenswrapper[4823]: I1216 07:19:07.635350 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c559ee21-de8f-44a1-998a-cb0b4aff8cd7-logs\") pod \"barbican-api-6456ccccf4-rhnf4\" (UID: \"c559ee21-de8f-44a1-998a-cb0b4aff8cd7\") " pod="openstack/barbican-api-6456ccccf4-rhnf4" Dec 16 07:19:07 crc kubenswrapper[4823]: I1216 07:19:07.640955 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c559ee21-de8f-44a1-998a-cb0b4aff8cd7-config-data-custom\") pod \"barbican-api-6456ccccf4-rhnf4\" (UID: \"c559ee21-de8f-44a1-998a-cb0b4aff8cd7\") " pod="openstack/barbican-api-6456ccccf4-rhnf4" Dec 16 07:19:07 crc kubenswrapper[4823]: I1216 07:19:07.641125 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c559ee21-de8f-44a1-998a-cb0b4aff8cd7-combined-ca-bundle\") pod \"barbican-api-6456ccccf4-rhnf4\" (UID: \"c559ee21-de8f-44a1-998a-cb0b4aff8cd7\") " pod="openstack/barbican-api-6456ccccf4-rhnf4" Dec 16 07:19:07 crc kubenswrapper[4823]: I1216 07:19:07.655252 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c559ee21-de8f-44a1-998a-cb0b4aff8cd7-config-data\") pod \"barbican-api-6456ccccf4-rhnf4\" (UID: \"c559ee21-de8f-44a1-998a-cb0b4aff8cd7\") " pod="openstack/barbican-api-6456ccccf4-rhnf4" Dec 16 07:19:07 crc kubenswrapper[4823]: I1216 
07:19:07.659491 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c559ee21-de8f-44a1-998a-cb0b4aff8cd7-public-tls-certs\") pod \"barbican-api-6456ccccf4-rhnf4\" (UID: \"c559ee21-de8f-44a1-998a-cb0b4aff8cd7\") " pod="openstack/barbican-api-6456ccccf4-rhnf4" Dec 16 07:19:07 crc kubenswrapper[4823]: I1216 07:19:07.665079 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c559ee21-de8f-44a1-998a-cb0b4aff8cd7-internal-tls-certs\") pod \"barbican-api-6456ccccf4-rhnf4\" (UID: \"c559ee21-de8f-44a1-998a-cb0b4aff8cd7\") " pod="openstack/barbican-api-6456ccccf4-rhnf4" Dec 16 07:19:07 crc kubenswrapper[4823]: I1216 07:19:07.672602 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzzmh\" (UniqueName: \"kubernetes.io/projected/c559ee21-de8f-44a1-998a-cb0b4aff8cd7-kube-api-access-qzzmh\") pod \"barbican-api-6456ccccf4-rhnf4\" (UID: \"c559ee21-de8f-44a1-998a-cb0b4aff8cd7\") " pod="openstack/barbican-api-6456ccccf4-rhnf4" Dec 16 07:19:07 crc kubenswrapper[4823]: I1216 07:19:07.768995 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6456ccccf4-rhnf4" Dec 16 07:19:08 crc kubenswrapper[4823]: I1216 07:19:08.330584 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6456ccccf4-rhnf4"] Dec 16 07:19:08 crc kubenswrapper[4823]: I1216 07:19:08.348673 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"421bf187-1b90-4633-9cce-3ef3e0387343","Type":"ContainerStarted","Data":"3718e34b1fc21b42d98d69728a16b761c782e2c2e37d66729e379687f6b6da20"} Dec 16 07:19:08 crc kubenswrapper[4823]: I1216 07:19:08.350659 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6456ccccf4-rhnf4" event={"ID":"c559ee21-de8f-44a1-998a-cb0b4aff8cd7","Type":"ContainerStarted","Data":"021fd354a518969266cacdeeef782252068339aeb8870177816a95bed2decbec"} Dec 16 07:19:09 crc kubenswrapper[4823]: I1216 07:19:09.397591 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6456ccccf4-rhnf4" event={"ID":"c559ee21-de8f-44a1-998a-cb0b4aff8cd7","Type":"ContainerStarted","Data":"530a4f541e791946b14339252ed09b59df393a5827ee6015fa327e0dbbc98aec"} Dec 16 07:19:09 crc kubenswrapper[4823]: I1216 07:19:09.398185 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6456ccccf4-rhnf4" event={"ID":"c559ee21-de8f-44a1-998a-cb0b4aff8cd7","Type":"ContainerStarted","Data":"6e803790a094c100a2004f1b22829f8f62d04305a0ff039b94d3de7aaff12828"} Dec 16 07:19:09 crc kubenswrapper[4823]: I1216 07:19:09.399420 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6456ccccf4-rhnf4" Dec 16 07:19:09 crc kubenswrapper[4823]: I1216 07:19:09.399444 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6456ccccf4-rhnf4" Dec 16 07:19:09 crc kubenswrapper[4823]: I1216 07:19:09.429661 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/barbican-api-6456ccccf4-rhnf4" podStartSLOduration=2.429636601 podStartE2EDuration="2.429636601s" podCreationTimestamp="2025-12-16 07:19:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 07:19:09.415333594 +0000 UTC m=+1427.903899717" watchObservedRunningTime="2025-12-16 07:19:09.429636601 +0000 UTC m=+1427.918202724" Dec 16 07:19:09 crc kubenswrapper[4823]: I1216 07:19:09.988162 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-bbf9986cc-sjljb" Dec 16 07:19:10 crc kubenswrapper[4823]: I1216 07:19:10.054320 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-765f8bc948-dqt65"] Dec 16 07:19:10 crc kubenswrapper[4823]: I1216 07:19:10.054588 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-765f8bc948-dqt65" podUID="ebefd0b6-7523-402f-8952-76a232986c74" containerName="neutron-api" containerID="cri-o://3eb704ae25fe49bca8f72c9dba890b0568acbb78194d883488897c4a6cc39dd9" gracePeriod=30 Dec 16 07:19:10 crc kubenswrapper[4823]: I1216 07:19:10.054734 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-765f8bc948-dqt65" podUID="ebefd0b6-7523-402f-8952-76a232986c74" containerName="neutron-httpd" containerID="cri-o://7e1578371b4d6a145919aebf10c8fa0a868a73fb853b572714a607e7dd2c094e" gracePeriod=30 Dec 16 07:19:10 crc kubenswrapper[4823]: I1216 07:19:10.429370 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-765f8bc948-dqt65" event={"ID":"ebefd0b6-7523-402f-8952-76a232986c74","Type":"ContainerDied","Data":"7e1578371b4d6a145919aebf10c8fa0a868a73fb853b572714a607e7dd2c094e"} Dec 16 07:19:10 crc kubenswrapper[4823]: I1216 07:19:10.429390 4823 generic.go:334] "Generic (PLEG): container finished" podID="ebefd0b6-7523-402f-8952-76a232986c74" 
containerID="7e1578371b4d6a145919aebf10c8fa0a868a73fb853b572714a607e7dd2c094e" exitCode=0 Dec 16 07:19:10 crc kubenswrapper[4823]: I1216 07:19:10.433473 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"421bf187-1b90-4633-9cce-3ef3e0387343","Type":"ContainerStarted","Data":"420a315d161117af966330608a0704e9cdc7e630c2109694cd4dafa6bf4f0f1d"} Dec 16 07:19:10 crc kubenswrapper[4823]: I1216 07:19:10.464598 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 16 07:19:10 crc kubenswrapper[4823]: I1216 07:19:10.700614 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 16 07:19:11 crc kubenswrapper[4823]: I1216 07:19:11.334186 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-75dbb546bf-zxsbt" Dec 16 07:19:11 crc kubenswrapper[4823]: I1216 07:19:11.397506 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-685444497c-mb8qw"] Dec 16 07:19:11 crc kubenswrapper[4823]: I1216 07:19:11.397770 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-685444497c-mb8qw" podUID="c0d41fd9-3e78-4e48-ba89-6acc88459df8" containerName="dnsmasq-dns" containerID="cri-o://6e7456152f0bd419c54c730ad6957ea312799bd92c55bce3c877a279ab7f4319" gracePeriod=10 Dec 16 07:19:11 crc kubenswrapper[4823]: I1216 07:19:11.454673 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"421bf187-1b90-4633-9cce-3ef3e0387343","Type":"ContainerStarted","Data":"74d6b3f6906669a91552701733b0d73db38ce2772a9816834e6e528c11bbfe95"} Dec 16 07:19:11 crc kubenswrapper[4823]: I1216 07:19:11.538731 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 16 07:19:12 crc kubenswrapper[4823]: I1216 07:19:12.136506 4823 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openstack/dnsmasq-dns-685444497c-mb8qw" Dec 16 07:19:12 crc kubenswrapper[4823]: I1216 07:19:12.257308 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0d41fd9-3e78-4e48-ba89-6acc88459df8-config\") pod \"c0d41fd9-3e78-4e48-ba89-6acc88459df8\" (UID: \"c0d41fd9-3e78-4e48-ba89-6acc88459df8\") " Dec 16 07:19:12 crc kubenswrapper[4823]: I1216 07:19:12.257391 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c0d41fd9-3e78-4e48-ba89-6acc88459df8-dns-svc\") pod \"c0d41fd9-3e78-4e48-ba89-6acc88459df8\" (UID: \"c0d41fd9-3e78-4e48-ba89-6acc88459df8\") " Dec 16 07:19:12 crc kubenswrapper[4823]: I1216 07:19:12.257424 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c0d41fd9-3e78-4e48-ba89-6acc88459df8-ovsdbserver-sb\") pod \"c0d41fd9-3e78-4e48-ba89-6acc88459df8\" (UID: \"c0d41fd9-3e78-4e48-ba89-6acc88459df8\") " Dec 16 07:19:12 crc kubenswrapper[4823]: I1216 07:19:12.257441 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c0d41fd9-3e78-4e48-ba89-6acc88459df8-ovsdbserver-nb\") pod \"c0d41fd9-3e78-4e48-ba89-6acc88459df8\" (UID: \"c0d41fd9-3e78-4e48-ba89-6acc88459df8\") " Dec 16 07:19:12 crc kubenswrapper[4823]: I1216 07:19:12.257504 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-796k7\" (UniqueName: \"kubernetes.io/projected/c0d41fd9-3e78-4e48-ba89-6acc88459df8-kube-api-access-796k7\") pod \"c0d41fd9-3e78-4e48-ba89-6acc88459df8\" (UID: \"c0d41fd9-3e78-4e48-ba89-6acc88459df8\") " Dec 16 07:19:12 crc kubenswrapper[4823]: I1216 07:19:12.257559 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c0d41fd9-3e78-4e48-ba89-6acc88459df8-dns-swift-storage-0\") pod \"c0d41fd9-3e78-4e48-ba89-6acc88459df8\" (UID: \"c0d41fd9-3e78-4e48-ba89-6acc88459df8\") " Dec 16 07:19:12 crc kubenswrapper[4823]: I1216 07:19:12.266290 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0d41fd9-3e78-4e48-ba89-6acc88459df8-kube-api-access-796k7" (OuterVolumeSpecName: "kube-api-access-796k7") pod "c0d41fd9-3e78-4e48-ba89-6acc88459df8" (UID: "c0d41fd9-3e78-4e48-ba89-6acc88459df8"). InnerVolumeSpecName "kube-api-access-796k7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:19:12 crc kubenswrapper[4823]: I1216 07:19:12.303529 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0d41fd9-3e78-4e48-ba89-6acc88459df8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c0d41fd9-3e78-4e48-ba89-6acc88459df8" (UID: "c0d41fd9-3e78-4e48-ba89-6acc88459df8"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:19:12 crc kubenswrapper[4823]: I1216 07:19:12.310513 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0d41fd9-3e78-4e48-ba89-6acc88459df8-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c0d41fd9-3e78-4e48-ba89-6acc88459df8" (UID: "c0d41fd9-3e78-4e48-ba89-6acc88459df8"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:19:12 crc kubenswrapper[4823]: I1216 07:19:12.328590 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0d41fd9-3e78-4e48-ba89-6acc88459df8-config" (OuterVolumeSpecName: "config") pod "c0d41fd9-3e78-4e48-ba89-6acc88459df8" (UID: "c0d41fd9-3e78-4e48-ba89-6acc88459df8"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:19:12 crc kubenswrapper[4823]: I1216 07:19:12.343613 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0d41fd9-3e78-4e48-ba89-6acc88459df8-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "c0d41fd9-3e78-4e48-ba89-6acc88459df8" (UID: "c0d41fd9-3e78-4e48-ba89-6acc88459df8"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:19:12 crc kubenswrapper[4823]: I1216 07:19:12.348423 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0d41fd9-3e78-4e48-ba89-6acc88459df8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c0d41fd9-3e78-4e48-ba89-6acc88459df8" (UID: "c0d41fd9-3e78-4e48-ba89-6acc88459df8"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:19:12 crc kubenswrapper[4823]: I1216 07:19:12.365348 4823 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c0d41fd9-3e78-4e48-ba89-6acc88459df8-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 16 07:19:12 crc kubenswrapper[4823]: I1216 07:19:12.365396 4823 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0d41fd9-3e78-4e48-ba89-6acc88459df8-config\") on node \"crc\" DevicePath \"\"" Dec 16 07:19:12 crc kubenswrapper[4823]: I1216 07:19:12.365407 4823 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c0d41fd9-3e78-4e48-ba89-6acc88459df8-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 16 07:19:12 crc kubenswrapper[4823]: I1216 07:19:12.365418 4823 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c0d41fd9-3e78-4e48-ba89-6acc88459df8-ovsdbserver-sb\") on node \"crc\" 
DevicePath \"\"" Dec 16 07:19:12 crc kubenswrapper[4823]: I1216 07:19:12.365428 4823 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c0d41fd9-3e78-4e48-ba89-6acc88459df8-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 16 07:19:12 crc kubenswrapper[4823]: I1216 07:19:12.365438 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-796k7\" (UniqueName: \"kubernetes.io/projected/c0d41fd9-3e78-4e48-ba89-6acc88459df8-kube-api-access-796k7\") on node \"crc\" DevicePath \"\"" Dec 16 07:19:12 crc kubenswrapper[4823]: I1216 07:19:12.465748 4823 generic.go:334] "Generic (PLEG): container finished" podID="c0d41fd9-3e78-4e48-ba89-6acc88459df8" containerID="6e7456152f0bd419c54c730ad6957ea312799bd92c55bce3c877a279ab7f4319" exitCode=0 Dec 16 07:19:12 crc kubenswrapper[4823]: I1216 07:19:12.465812 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-685444497c-mb8qw" Dec 16 07:19:12 crc kubenswrapper[4823]: I1216 07:19:12.465833 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-685444497c-mb8qw" event={"ID":"c0d41fd9-3e78-4e48-ba89-6acc88459df8","Type":"ContainerDied","Data":"6e7456152f0bd419c54c730ad6957ea312799bd92c55bce3c877a279ab7f4319"} Dec 16 07:19:12 crc kubenswrapper[4823]: I1216 07:19:12.465875 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-685444497c-mb8qw" event={"ID":"c0d41fd9-3e78-4e48-ba89-6acc88459df8","Type":"ContainerDied","Data":"337742653f05fa5f436d4d079724cbda10debf60e9d8bafa93c82d074e7fe2f4"} Dec 16 07:19:12 crc kubenswrapper[4823]: I1216 07:19:12.465898 4823 scope.go:117] "RemoveContainer" containerID="6e7456152f0bd419c54c730ad6957ea312799bd92c55bce3c877a279ab7f4319" Dec 16 07:19:12 crc kubenswrapper[4823]: I1216 07:19:12.471159 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"421bf187-1b90-4633-9cce-3ef3e0387343","Type":"ContainerStarted","Data":"34d5d59e1e9cf6b86e8a339b71d69ac7e017e6031c70c62ac5f611787a5f2436"} Dec 16 07:19:12 crc kubenswrapper[4823]: I1216 07:19:12.471228 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="1e69e0bb-4482-4f95-b26e-d129784035d0" containerName="cinder-scheduler" containerID="cri-o://01b3ec6ae5d5b01394c9db938ef8c1223f1e6a09a65c2a48ae44637c46108dfd" gracePeriod=30 Dec 16 07:19:12 crc kubenswrapper[4823]: I1216 07:19:12.471306 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="1e69e0bb-4482-4f95-b26e-d129784035d0" containerName="probe" containerID="cri-o://6b0d9c033657c7fa6a3102ca3db0c867e213117f9b08f450b0469f1ef4f82594" gracePeriod=30 Dec 16 07:19:12 crc kubenswrapper[4823]: I1216 07:19:12.491860 4823 scope.go:117] "RemoveContainer" containerID="fc56b1cdd1fc6092ae3891627b24eda58bf5762f233f59e9efb780577d24e001" Dec 16 07:19:12 crc kubenswrapper[4823]: I1216 07:19:12.522388 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.897190184 podStartE2EDuration="8.522365736s" podCreationTimestamp="2025-12-16 07:19:04 +0000 UTC" firstStartedPulling="2025-12-16 07:19:06.328247475 +0000 UTC m=+1424.816813598" lastFinishedPulling="2025-12-16 07:19:11.953423027 +0000 UTC m=+1430.441989150" observedRunningTime="2025-12-16 07:19:12.50430247 +0000 UTC m=+1430.992868593" watchObservedRunningTime="2025-12-16 07:19:12.522365736 +0000 UTC m=+1431.010931859" Dec 16 07:19:12 crc kubenswrapper[4823]: I1216 07:19:12.535233 4823 scope.go:117] "RemoveContainer" containerID="6e7456152f0bd419c54c730ad6957ea312799bd92c55bce3c877a279ab7f4319" Dec 16 07:19:12 crc kubenswrapper[4823]: E1216 07:19:12.536180 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"6e7456152f0bd419c54c730ad6957ea312799bd92c55bce3c877a279ab7f4319\": container with ID starting with 6e7456152f0bd419c54c730ad6957ea312799bd92c55bce3c877a279ab7f4319 not found: ID does not exist" containerID="6e7456152f0bd419c54c730ad6957ea312799bd92c55bce3c877a279ab7f4319" Dec 16 07:19:12 crc kubenswrapper[4823]: I1216 07:19:12.536236 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e7456152f0bd419c54c730ad6957ea312799bd92c55bce3c877a279ab7f4319"} err="failed to get container status \"6e7456152f0bd419c54c730ad6957ea312799bd92c55bce3c877a279ab7f4319\": rpc error: code = NotFound desc = could not find container \"6e7456152f0bd419c54c730ad6957ea312799bd92c55bce3c877a279ab7f4319\": container with ID starting with 6e7456152f0bd419c54c730ad6957ea312799bd92c55bce3c877a279ab7f4319 not found: ID does not exist" Dec 16 07:19:12 crc kubenswrapper[4823]: I1216 07:19:12.536261 4823 scope.go:117] "RemoveContainer" containerID="fc56b1cdd1fc6092ae3891627b24eda58bf5762f233f59e9efb780577d24e001" Dec 16 07:19:12 crc kubenswrapper[4823]: E1216 07:19:12.538442 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc56b1cdd1fc6092ae3891627b24eda58bf5762f233f59e9efb780577d24e001\": container with ID starting with fc56b1cdd1fc6092ae3891627b24eda58bf5762f233f59e9efb780577d24e001 not found: ID does not exist" containerID="fc56b1cdd1fc6092ae3891627b24eda58bf5762f233f59e9efb780577d24e001" Dec 16 07:19:12 crc kubenswrapper[4823]: I1216 07:19:12.538475 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc56b1cdd1fc6092ae3891627b24eda58bf5762f233f59e9efb780577d24e001"} err="failed to get container status \"fc56b1cdd1fc6092ae3891627b24eda58bf5762f233f59e9efb780577d24e001\": rpc error: code = NotFound desc = could not find container \"fc56b1cdd1fc6092ae3891627b24eda58bf5762f233f59e9efb780577d24e001\": container 
with ID starting with fc56b1cdd1fc6092ae3891627b24eda58bf5762f233f59e9efb780577d24e001 not found: ID does not exist" Dec 16 07:19:12 crc kubenswrapper[4823]: I1216 07:19:12.546003 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-685444497c-mb8qw"] Dec 16 07:19:12 crc kubenswrapper[4823]: I1216 07:19:12.555248 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-685444497c-mb8qw"] Dec 16 07:19:13 crc kubenswrapper[4823]: I1216 07:19:13.485078 4823 generic.go:334] "Generic (PLEG): container finished" podID="1e69e0bb-4482-4f95-b26e-d129784035d0" containerID="6b0d9c033657c7fa6a3102ca3db0c867e213117f9b08f450b0469f1ef4f82594" exitCode=0 Dec 16 07:19:13 crc kubenswrapper[4823]: I1216 07:19:13.485149 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1e69e0bb-4482-4f95-b26e-d129784035d0","Type":"ContainerDied","Data":"6b0d9c033657c7fa6a3102ca3db0c867e213117f9b08f450b0469f1ef4f82594"} Dec 16 07:19:13 crc kubenswrapper[4823]: I1216 07:19:13.485597 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 16 07:19:13 crc kubenswrapper[4823]: I1216 07:19:13.562458 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-58d7958684-mfsdc" Dec 16 07:19:13 crc kubenswrapper[4823]: I1216 07:19:13.673815 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-58d7958684-mfsdc" Dec 16 07:19:13 crc kubenswrapper[4823]: I1216 07:19:13.784774 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0d41fd9-3e78-4e48-ba89-6acc88459df8" path="/var/lib/kubelet/pods/c0d41fd9-3e78-4e48-ba89-6acc88459df8/volumes" Dec 16 07:19:13 crc kubenswrapper[4823]: I1216 07:19:13.902664 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Dec 16 07:19:14 crc kubenswrapper[4823]: I1216 
07:19:14.505186 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-59fd5f5fb-h7tf5" Dec 16 07:19:14 crc kubenswrapper[4823]: I1216 07:19:14.513846 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-59fd5f5fb-h7tf5" Dec 16 07:19:16 crc kubenswrapper[4823]: I1216 07:19:16.084009 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 16 07:19:16 crc kubenswrapper[4823]: I1216 07:19:16.232403 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e69e0bb-4482-4f95-b26e-d129784035d0-config-data\") pod \"1e69e0bb-4482-4f95-b26e-d129784035d0\" (UID: \"1e69e0bb-4482-4f95-b26e-d129784035d0\") " Dec 16 07:19:16 crc kubenswrapper[4823]: I1216 07:19:16.232447 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1e69e0bb-4482-4f95-b26e-d129784035d0-config-data-custom\") pod \"1e69e0bb-4482-4f95-b26e-d129784035d0\" (UID: \"1e69e0bb-4482-4f95-b26e-d129784035d0\") " Dec 16 07:19:16 crc kubenswrapper[4823]: I1216 07:19:16.232509 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n5pcj\" (UniqueName: \"kubernetes.io/projected/1e69e0bb-4482-4f95-b26e-d129784035d0-kube-api-access-n5pcj\") pod \"1e69e0bb-4482-4f95-b26e-d129784035d0\" (UID: \"1e69e0bb-4482-4f95-b26e-d129784035d0\") " Dec 16 07:19:16 crc kubenswrapper[4823]: I1216 07:19:16.232599 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1e69e0bb-4482-4f95-b26e-d129784035d0-etc-machine-id\") pod \"1e69e0bb-4482-4f95-b26e-d129784035d0\" (UID: \"1e69e0bb-4482-4f95-b26e-d129784035d0\") " Dec 16 07:19:16 crc kubenswrapper[4823]: I1216 07:19:16.232628 4823 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e69e0bb-4482-4f95-b26e-d129784035d0-scripts\") pod \"1e69e0bb-4482-4f95-b26e-d129784035d0\" (UID: \"1e69e0bb-4482-4f95-b26e-d129784035d0\") " Dec 16 07:19:16 crc kubenswrapper[4823]: I1216 07:19:16.232669 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e69e0bb-4482-4f95-b26e-d129784035d0-combined-ca-bundle\") pod \"1e69e0bb-4482-4f95-b26e-d129784035d0\" (UID: \"1e69e0bb-4482-4f95-b26e-d129784035d0\") " Dec 16 07:19:16 crc kubenswrapper[4823]: I1216 07:19:16.234205 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1e69e0bb-4482-4f95-b26e-d129784035d0-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "1e69e0bb-4482-4f95-b26e-d129784035d0" (UID: "1e69e0bb-4482-4f95-b26e-d129784035d0"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 07:19:16 crc kubenswrapper[4823]: I1216 07:19:16.247894 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e69e0bb-4482-4f95-b26e-d129784035d0-scripts" (OuterVolumeSpecName: "scripts") pod "1e69e0bb-4482-4f95-b26e-d129784035d0" (UID: "1e69e0bb-4482-4f95-b26e-d129784035d0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:19:16 crc kubenswrapper[4823]: I1216 07:19:16.252188 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e69e0bb-4482-4f95-b26e-d129784035d0-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "1e69e0bb-4482-4f95-b26e-d129784035d0" (UID: "1e69e0bb-4482-4f95-b26e-d129784035d0"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:19:16 crc kubenswrapper[4823]: I1216 07:19:16.252228 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e69e0bb-4482-4f95-b26e-d129784035d0-kube-api-access-n5pcj" (OuterVolumeSpecName: "kube-api-access-n5pcj") pod "1e69e0bb-4482-4f95-b26e-d129784035d0" (UID: "1e69e0bb-4482-4f95-b26e-d129784035d0"). InnerVolumeSpecName "kube-api-access-n5pcj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:19:16 crc kubenswrapper[4823]: I1216 07:19:16.292827 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e69e0bb-4482-4f95-b26e-d129784035d0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1e69e0bb-4482-4f95-b26e-d129784035d0" (UID: "1e69e0bb-4482-4f95-b26e-d129784035d0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:19:16 crc kubenswrapper[4823]: I1216 07:19:16.341286 4823 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1e69e0bb-4482-4f95-b26e-d129784035d0-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 16 07:19:16 crc kubenswrapper[4823]: I1216 07:19:16.341324 4823 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e69e0bb-4482-4f95-b26e-d129784035d0-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:19:16 crc kubenswrapper[4823]: I1216 07:19:16.341336 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e69e0bb-4482-4f95-b26e-d129784035d0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:19:16 crc kubenswrapper[4823]: I1216 07:19:16.341351 4823 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1e69e0bb-4482-4f95-b26e-d129784035d0-config-data-custom\") on 
node \"crc\" DevicePath \"\"" Dec 16 07:19:16 crc kubenswrapper[4823]: I1216 07:19:16.341362 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n5pcj\" (UniqueName: \"kubernetes.io/projected/1e69e0bb-4482-4f95-b26e-d129784035d0-kube-api-access-n5pcj\") on node \"crc\" DevicePath \"\"" Dec 16 07:19:16 crc kubenswrapper[4823]: I1216 07:19:16.345213 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e69e0bb-4482-4f95-b26e-d129784035d0-config-data" (OuterVolumeSpecName: "config-data") pod "1e69e0bb-4482-4f95-b26e-d129784035d0" (UID: "1e69e0bb-4482-4f95-b26e-d129784035d0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:19:16 crc kubenswrapper[4823]: I1216 07:19:16.443659 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e69e0bb-4482-4f95-b26e-d129784035d0-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 07:19:16 crc kubenswrapper[4823]: I1216 07:19:16.509705 4823 generic.go:334] "Generic (PLEG): container finished" podID="1e69e0bb-4482-4f95-b26e-d129784035d0" containerID="01b3ec6ae5d5b01394c9db938ef8c1223f1e6a09a65c2a48ae44637c46108dfd" exitCode=0 Dec 16 07:19:16 crc kubenswrapper[4823]: I1216 07:19:16.509738 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 16 07:19:16 crc kubenswrapper[4823]: I1216 07:19:16.509761 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1e69e0bb-4482-4f95-b26e-d129784035d0","Type":"ContainerDied","Data":"01b3ec6ae5d5b01394c9db938ef8c1223f1e6a09a65c2a48ae44637c46108dfd"} Dec 16 07:19:16 crc kubenswrapper[4823]: I1216 07:19:16.509799 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1e69e0bb-4482-4f95-b26e-d129784035d0","Type":"ContainerDied","Data":"140af84f921b606a0b7c9baa955128fdbd81d52746839b0922c104b816105bd1"} Dec 16 07:19:16 crc kubenswrapper[4823]: I1216 07:19:16.509826 4823 scope.go:117] "RemoveContainer" containerID="6b0d9c033657c7fa6a3102ca3db0c867e213117f9b08f450b0469f1ef4f82594" Dec 16 07:19:16 crc kubenswrapper[4823]: I1216 07:19:16.540626 4823 scope.go:117] "RemoveContainer" containerID="01b3ec6ae5d5b01394c9db938ef8c1223f1e6a09a65c2a48ae44637c46108dfd" Dec 16 07:19:16 crc kubenswrapper[4823]: I1216 07:19:16.556726 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 16 07:19:16 crc kubenswrapper[4823]: I1216 07:19:16.567366 4823 scope.go:117] "RemoveContainer" containerID="6b0d9c033657c7fa6a3102ca3db0c867e213117f9b08f450b0469f1ef4f82594" Dec 16 07:19:16 crc kubenswrapper[4823]: E1216 07:19:16.567811 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b0d9c033657c7fa6a3102ca3db0c867e213117f9b08f450b0469f1ef4f82594\": container with ID starting with 6b0d9c033657c7fa6a3102ca3db0c867e213117f9b08f450b0469f1ef4f82594 not found: ID does not exist" containerID="6b0d9c033657c7fa6a3102ca3db0c867e213117f9b08f450b0469f1ef4f82594" Dec 16 07:19:16 crc kubenswrapper[4823]: I1216 07:19:16.567849 4823 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"6b0d9c033657c7fa6a3102ca3db0c867e213117f9b08f450b0469f1ef4f82594"} err="failed to get container status \"6b0d9c033657c7fa6a3102ca3db0c867e213117f9b08f450b0469f1ef4f82594\": rpc error: code = NotFound desc = could not find container \"6b0d9c033657c7fa6a3102ca3db0c867e213117f9b08f450b0469f1ef4f82594\": container with ID starting with 6b0d9c033657c7fa6a3102ca3db0c867e213117f9b08f450b0469f1ef4f82594 not found: ID does not exist" Dec 16 07:19:16 crc kubenswrapper[4823]: I1216 07:19:16.567876 4823 scope.go:117] "RemoveContainer" containerID="01b3ec6ae5d5b01394c9db938ef8c1223f1e6a09a65c2a48ae44637c46108dfd" Dec 16 07:19:16 crc kubenswrapper[4823]: E1216 07:19:16.568083 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01b3ec6ae5d5b01394c9db938ef8c1223f1e6a09a65c2a48ae44637c46108dfd\": container with ID starting with 01b3ec6ae5d5b01394c9db938ef8c1223f1e6a09a65c2a48ae44637c46108dfd not found: ID does not exist" containerID="01b3ec6ae5d5b01394c9db938ef8c1223f1e6a09a65c2a48ae44637c46108dfd" Dec 16 07:19:16 crc kubenswrapper[4823]: I1216 07:19:16.568114 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01b3ec6ae5d5b01394c9db938ef8c1223f1e6a09a65c2a48ae44637c46108dfd"} err="failed to get container status \"01b3ec6ae5d5b01394c9db938ef8c1223f1e6a09a65c2a48ae44637c46108dfd\": rpc error: code = NotFound desc = could not find container \"01b3ec6ae5d5b01394c9db938ef8c1223f1e6a09a65c2a48ae44637c46108dfd\": container with ID starting with 01b3ec6ae5d5b01394c9db938ef8c1223f1e6a09a65c2a48ae44637c46108dfd not found: ID does not exist" Dec 16 07:19:16 crc kubenswrapper[4823]: I1216 07:19:16.586213 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 16 07:19:16 crc kubenswrapper[4823]: I1216 07:19:16.595208 4823 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/cinder-scheduler-0"] Dec 16 07:19:16 crc kubenswrapper[4823]: E1216 07:19:16.595608 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e69e0bb-4482-4f95-b26e-d129784035d0" containerName="cinder-scheduler" Dec 16 07:19:16 crc kubenswrapper[4823]: I1216 07:19:16.595625 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e69e0bb-4482-4f95-b26e-d129784035d0" containerName="cinder-scheduler" Dec 16 07:19:16 crc kubenswrapper[4823]: E1216 07:19:16.595656 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e69e0bb-4482-4f95-b26e-d129784035d0" containerName="probe" Dec 16 07:19:16 crc kubenswrapper[4823]: I1216 07:19:16.595662 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e69e0bb-4482-4f95-b26e-d129784035d0" containerName="probe" Dec 16 07:19:16 crc kubenswrapper[4823]: E1216 07:19:16.595673 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0d41fd9-3e78-4e48-ba89-6acc88459df8" containerName="init" Dec 16 07:19:16 crc kubenswrapper[4823]: I1216 07:19:16.595679 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0d41fd9-3e78-4e48-ba89-6acc88459df8" containerName="init" Dec 16 07:19:16 crc kubenswrapper[4823]: E1216 07:19:16.595689 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0d41fd9-3e78-4e48-ba89-6acc88459df8" containerName="dnsmasq-dns" Dec 16 07:19:16 crc kubenswrapper[4823]: I1216 07:19:16.595696 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0d41fd9-3e78-4e48-ba89-6acc88459df8" containerName="dnsmasq-dns" Dec 16 07:19:16 crc kubenswrapper[4823]: I1216 07:19:16.595853 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e69e0bb-4482-4f95-b26e-d129784035d0" containerName="probe" Dec 16 07:19:16 crc kubenswrapper[4823]: I1216 07:19:16.595871 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e69e0bb-4482-4f95-b26e-d129784035d0" containerName="cinder-scheduler" Dec 16 07:19:16 crc 
kubenswrapper[4823]: I1216 07:19:16.595884 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0d41fd9-3e78-4e48-ba89-6acc88459df8" containerName="dnsmasq-dns" Dec 16 07:19:16 crc kubenswrapper[4823]: I1216 07:19:16.596850 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 16 07:19:16 crc kubenswrapper[4823]: I1216 07:19:16.602261 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 16 07:19:16 crc kubenswrapper[4823]: I1216 07:19:16.605104 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 16 07:19:16 crc kubenswrapper[4823]: I1216 07:19:16.748395 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a27cd126-6c5b-4e95-b313-0bb19568f42a-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"a27cd126-6c5b-4e95-b313-0bb19568f42a\") " pod="openstack/cinder-scheduler-0" Dec 16 07:19:16 crc kubenswrapper[4823]: I1216 07:19:16.748799 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a27cd126-6c5b-4e95-b313-0bb19568f42a-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"a27cd126-6c5b-4e95-b313-0bb19568f42a\") " pod="openstack/cinder-scheduler-0" Dec 16 07:19:16 crc kubenswrapper[4823]: I1216 07:19:16.748938 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a27cd126-6c5b-4e95-b313-0bb19568f42a-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"a27cd126-6c5b-4e95-b313-0bb19568f42a\") " pod="openstack/cinder-scheduler-0" Dec 16 07:19:16 crc kubenswrapper[4823]: I1216 07:19:16.749135 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a27cd126-6c5b-4e95-b313-0bb19568f42a-scripts\") pod \"cinder-scheduler-0\" (UID: \"a27cd126-6c5b-4e95-b313-0bb19568f42a\") " pod="openstack/cinder-scheduler-0" Dec 16 07:19:16 crc kubenswrapper[4823]: I1216 07:19:16.749161 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a27cd126-6c5b-4e95-b313-0bb19568f42a-config-data\") pod \"cinder-scheduler-0\" (UID: \"a27cd126-6c5b-4e95-b313-0bb19568f42a\") " pod="openstack/cinder-scheduler-0" Dec 16 07:19:16 crc kubenswrapper[4823]: I1216 07:19:16.749288 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5sv5\" (UniqueName: \"kubernetes.io/projected/a27cd126-6c5b-4e95-b313-0bb19568f42a-kube-api-access-k5sv5\") pod \"cinder-scheduler-0\" (UID: \"a27cd126-6c5b-4e95-b313-0bb19568f42a\") " pod="openstack/cinder-scheduler-0" Dec 16 07:19:16 crc kubenswrapper[4823]: I1216 07:19:16.851529 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a27cd126-6c5b-4e95-b313-0bb19568f42a-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"a27cd126-6c5b-4e95-b313-0bb19568f42a\") " pod="openstack/cinder-scheduler-0" Dec 16 07:19:16 crc kubenswrapper[4823]: I1216 07:19:16.851624 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a27cd126-6c5b-4e95-b313-0bb19568f42a-scripts\") pod \"cinder-scheduler-0\" (UID: \"a27cd126-6c5b-4e95-b313-0bb19568f42a\") " pod="openstack/cinder-scheduler-0" Dec 16 07:19:16 crc kubenswrapper[4823]: I1216 07:19:16.851628 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a27cd126-6c5b-4e95-b313-0bb19568f42a-etc-machine-id\") pod 
\"cinder-scheduler-0\" (UID: \"a27cd126-6c5b-4e95-b313-0bb19568f42a\") " pod="openstack/cinder-scheduler-0"
Dec 16 07:19:16 crc kubenswrapper[4823]: I1216 07:19:16.851640 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a27cd126-6c5b-4e95-b313-0bb19568f42a-config-data\") pod \"cinder-scheduler-0\" (UID: \"a27cd126-6c5b-4e95-b313-0bb19568f42a\") " pod="openstack/cinder-scheduler-0"
Dec 16 07:19:16 crc kubenswrapper[4823]: I1216 07:19:16.851830 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k5sv5\" (UniqueName: \"kubernetes.io/projected/a27cd126-6c5b-4e95-b313-0bb19568f42a-kube-api-access-k5sv5\") pod \"cinder-scheduler-0\" (UID: \"a27cd126-6c5b-4e95-b313-0bb19568f42a\") " pod="openstack/cinder-scheduler-0"
Dec 16 07:19:16 crc kubenswrapper[4823]: I1216 07:19:16.851946 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a27cd126-6c5b-4e95-b313-0bb19568f42a-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"a27cd126-6c5b-4e95-b313-0bb19568f42a\") " pod="openstack/cinder-scheduler-0"
Dec 16 07:19:16 crc kubenswrapper[4823]: I1216 07:19:16.852007 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a27cd126-6c5b-4e95-b313-0bb19568f42a-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"a27cd126-6c5b-4e95-b313-0bb19568f42a\") " pod="openstack/cinder-scheduler-0"
Dec 16 07:19:16 crc kubenswrapper[4823]: I1216 07:19:16.856548 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a27cd126-6c5b-4e95-b313-0bb19568f42a-scripts\") pod \"cinder-scheduler-0\" (UID: \"a27cd126-6c5b-4e95-b313-0bb19568f42a\") " pod="openstack/cinder-scheduler-0"
Dec 16 07:19:16 crc kubenswrapper[4823]: I1216 07:19:16.860555 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a27cd126-6c5b-4e95-b313-0bb19568f42a-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"a27cd126-6c5b-4e95-b313-0bb19568f42a\") " pod="openstack/cinder-scheduler-0"
Dec 16 07:19:16 crc kubenswrapper[4823]: I1216 07:19:16.862905 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a27cd126-6c5b-4e95-b313-0bb19568f42a-config-data\") pod \"cinder-scheduler-0\" (UID: \"a27cd126-6c5b-4e95-b313-0bb19568f42a\") " pod="openstack/cinder-scheduler-0"
Dec 16 07:19:16 crc kubenswrapper[4823]: I1216 07:19:16.864567 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a27cd126-6c5b-4e95-b313-0bb19568f42a-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"a27cd126-6c5b-4e95-b313-0bb19568f42a\") " pod="openstack/cinder-scheduler-0"
Dec 16 07:19:16 crc kubenswrapper[4823]: I1216 07:19:16.884701 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5sv5\" (UniqueName: \"kubernetes.io/projected/a27cd126-6c5b-4e95-b313-0bb19568f42a-kube-api-access-k5sv5\") pod \"cinder-scheduler-0\" (UID: \"a27cd126-6c5b-4e95-b313-0bb19568f42a\") " pod="openstack/cinder-scheduler-0"
Dec 16 07:19:16 crc kubenswrapper[4823]: I1216 07:19:16.949014 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Dec 16 07:19:17 crc kubenswrapper[4823]: I1216 07:19:17.451133 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Dec 16 07:19:17 crc kubenswrapper[4823]: I1216 07:19:17.522570 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a27cd126-6c5b-4e95-b313-0bb19568f42a","Type":"ContainerStarted","Data":"55e3e3c97fe64bb8c1f0e0df7efd5f5006ca6ff0ffd6c2588c464f2071ce4177"}
Dec 16 07:19:17 crc kubenswrapper[4823]: I1216 07:19:17.793093 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e69e0bb-4482-4f95-b26e-d129784035d0" path="/var/lib/kubelet/pods/1e69e0bb-4482-4f95-b26e-d129784035d0/volumes"
Dec 16 07:19:19 crc kubenswrapper[4823]: I1216 07:19:19.515289 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6456ccccf4-rhnf4"
Dec 16 07:19:19 crc kubenswrapper[4823]: I1216 07:19:19.539340 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a27cd126-6c5b-4e95-b313-0bb19568f42a","Type":"ContainerStarted","Data":"457377de0d8e4d6837606a566ccbe412c1ad0e48f0692027311c6646fc5a9d02"}
Dec 16 07:19:19 crc kubenswrapper[4823]: I1216 07:19:19.539382 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a27cd126-6c5b-4e95-b313-0bb19568f42a","Type":"ContainerStarted","Data":"dec8a740e5a159ade11e7e1e6846443afc7b4cf141676ae0fa16ecdefdc7efc5"}
Dec 16 07:19:19 crc kubenswrapper[4823]: I1216 07:19:19.571326 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.571304269 podStartE2EDuration="3.571304269s" podCreationTimestamp="2025-12-16 07:19:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 07:19:19.567098447 +0000 UTC m=+1438.055664580" watchObservedRunningTime="2025-12-16 07:19:19.571304269 +0000 UTC m=+1438.059870412"
Dec 16 07:19:19 crc kubenswrapper[4823]: I1216 07:19:19.600860 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6456ccccf4-rhnf4"
Dec 16 07:19:19 crc kubenswrapper[4823]: I1216 07:19:19.688372 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-58d7958684-mfsdc"]
Dec 16 07:19:19 crc kubenswrapper[4823]: I1216 07:19:19.688912 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-58d7958684-mfsdc" podUID="472078f0-72f2-4e3e-a626-8a98e3329fe6" containerName="barbican-api" containerID="cri-o://ad0e619fe597f38298ba6b5425eeda3261561fe3395b804ebbd85c41d6bac42e" gracePeriod=30
Dec 16 07:19:19 crc kubenswrapper[4823]: I1216 07:19:19.689205 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-58d7958684-mfsdc" podUID="472078f0-72f2-4e3e-a626-8a98e3329fe6" containerName="barbican-api-log" containerID="cri-o://79e08e3b403b1c797e52c3704ec66026cfab83c5c450ea72b672827d35c00a25" gracePeriod=30
Dec 16 07:19:20 crc kubenswrapper[4823]: I1216 07:19:20.559505 4823 generic.go:334] "Generic (PLEG): container finished" podID="472078f0-72f2-4e3e-a626-8a98e3329fe6" containerID="79e08e3b403b1c797e52c3704ec66026cfab83c5c450ea72b672827d35c00a25" exitCode=143
Dec 16 07:19:20 crc kubenswrapper[4823]: I1216 07:19:20.561462 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-58d7958684-mfsdc" event={"ID":"472078f0-72f2-4e3e-a626-8a98e3329fe6","Type":"ContainerDied","Data":"79e08e3b403b1c797e52c3704ec66026cfab83c5c450ea72b672827d35c00a25"}
Dec 16 07:19:21 crc kubenswrapper[4823]: I1216 07:19:21.950544 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0"
Dec 16 07:19:22 crc kubenswrapper[4823]: I1216 07:19:22.879474 4823 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-58d7958684-mfsdc" podUID="472078f0-72f2-4e3e-a626-8a98e3329fe6" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.156:9311/healthcheck\": read tcp 10.217.0.2:52028->10.217.0.156:9311: read: connection reset by peer"
Dec 16 07:19:22 crc kubenswrapper[4823]: I1216 07:19:22.879474 4823 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-58d7958684-mfsdc" podUID="472078f0-72f2-4e3e-a626-8a98e3329fe6" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.156:9311/healthcheck\": read tcp 10.217.0.2:52014->10.217.0.156:9311: read: connection reset by peer"
Dec 16 07:19:23 crc kubenswrapper[4823]: I1216 07:19:23.356127 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-58d7958684-mfsdc"
Dec 16 07:19:23 crc kubenswrapper[4823]: I1216 07:19:23.506093 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/472078f0-72f2-4e3e-a626-8a98e3329fe6-config-data\") pod \"472078f0-72f2-4e3e-a626-8a98e3329fe6\" (UID: \"472078f0-72f2-4e3e-a626-8a98e3329fe6\") "
Dec 16 07:19:23 crc kubenswrapper[4823]: I1216 07:19:23.506171 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/472078f0-72f2-4e3e-a626-8a98e3329fe6-combined-ca-bundle\") pod \"472078f0-72f2-4e3e-a626-8a98e3329fe6\" (UID: \"472078f0-72f2-4e3e-a626-8a98e3329fe6\") "
Dec 16 07:19:23 crc kubenswrapper[4823]: I1216 07:19:23.506222 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/472078f0-72f2-4e3e-a626-8a98e3329fe6-logs\") pod \"472078f0-72f2-4e3e-a626-8a98e3329fe6\" (UID: \"472078f0-72f2-4e3e-a626-8a98e3329fe6\") "
Dec 16 07:19:23 crc kubenswrapper[4823]: I1216 07:19:23.506286 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2txrv\" (UniqueName: \"kubernetes.io/projected/472078f0-72f2-4e3e-a626-8a98e3329fe6-kube-api-access-2txrv\") pod \"472078f0-72f2-4e3e-a626-8a98e3329fe6\" (UID: \"472078f0-72f2-4e3e-a626-8a98e3329fe6\") "
Dec 16 07:19:23 crc kubenswrapper[4823]: I1216 07:19:23.506307 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/472078f0-72f2-4e3e-a626-8a98e3329fe6-config-data-custom\") pod \"472078f0-72f2-4e3e-a626-8a98e3329fe6\" (UID: \"472078f0-72f2-4e3e-a626-8a98e3329fe6\") "
Dec 16 07:19:23 crc kubenswrapper[4823]: I1216 07:19:23.507061 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/472078f0-72f2-4e3e-a626-8a98e3329fe6-logs" (OuterVolumeSpecName: "logs") pod "472078f0-72f2-4e3e-a626-8a98e3329fe6" (UID: "472078f0-72f2-4e3e-a626-8a98e3329fe6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 16 07:19:23 crc kubenswrapper[4823]: I1216 07:19:23.522247 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/472078f0-72f2-4e3e-a626-8a98e3329fe6-kube-api-access-2txrv" (OuterVolumeSpecName: "kube-api-access-2txrv") pod "472078f0-72f2-4e3e-a626-8a98e3329fe6" (UID: "472078f0-72f2-4e3e-a626-8a98e3329fe6"). InnerVolumeSpecName "kube-api-access-2txrv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 07:19:23 crc kubenswrapper[4823]: I1216 07:19:23.522306 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/472078f0-72f2-4e3e-a626-8a98e3329fe6-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "472078f0-72f2-4e3e-a626-8a98e3329fe6" (UID: "472078f0-72f2-4e3e-a626-8a98e3329fe6"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 07:19:23 crc kubenswrapper[4823]: I1216 07:19:23.588951 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/472078f0-72f2-4e3e-a626-8a98e3329fe6-config-data" (OuterVolumeSpecName: "config-data") pod "472078f0-72f2-4e3e-a626-8a98e3329fe6" (UID: "472078f0-72f2-4e3e-a626-8a98e3329fe6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 07:19:23 crc kubenswrapper[4823]: I1216 07:19:23.607170 4823 generic.go:334] "Generic (PLEG): container finished" podID="472078f0-72f2-4e3e-a626-8a98e3329fe6" containerID="ad0e619fe597f38298ba6b5425eeda3261561fe3395b804ebbd85c41d6bac42e" exitCode=0
Dec 16 07:19:23 crc kubenswrapper[4823]: I1216 07:19:23.607219 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-58d7958684-mfsdc" event={"ID":"472078f0-72f2-4e3e-a626-8a98e3329fe6","Type":"ContainerDied","Data":"ad0e619fe597f38298ba6b5425eeda3261561fe3395b804ebbd85c41d6bac42e"}
Dec 16 07:19:23 crc kubenswrapper[4823]: I1216 07:19:23.607251 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-58d7958684-mfsdc" event={"ID":"472078f0-72f2-4e3e-a626-8a98e3329fe6","Type":"ContainerDied","Data":"d7c298a5ca4a6849c10bbae0c2c999f343f42297d5c93e9df241c702c82bd9b4"}
Dec 16 07:19:23 crc kubenswrapper[4823]: I1216 07:19:23.607272 4823 scope.go:117] "RemoveContainer" containerID="ad0e619fe597f38298ba6b5425eeda3261561fe3395b804ebbd85c41d6bac42e"
Dec 16 07:19:23 crc kubenswrapper[4823]: I1216 07:19:23.607434 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-58d7958684-mfsdc"
Dec 16 07:19:23 crc kubenswrapper[4823]: I1216 07:19:23.611936 4823 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/472078f0-72f2-4e3e-a626-8a98e3329fe6-logs\") on node \"crc\" DevicePath \"\""
Dec 16 07:19:23 crc kubenswrapper[4823]: I1216 07:19:23.611999 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2txrv\" (UniqueName: \"kubernetes.io/projected/472078f0-72f2-4e3e-a626-8a98e3329fe6-kube-api-access-2txrv\") on node \"crc\" DevicePath \"\""
Dec 16 07:19:23 crc kubenswrapper[4823]: I1216 07:19:23.612036 4823 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/472078f0-72f2-4e3e-a626-8a98e3329fe6-config-data-custom\") on node \"crc\" DevicePath \"\""
Dec 16 07:19:23 crc kubenswrapper[4823]: I1216 07:19:23.612051 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/472078f0-72f2-4e3e-a626-8a98e3329fe6-config-data\") on node \"crc\" DevicePath \"\""
Dec 16 07:19:23 crc kubenswrapper[4823]: I1216 07:19:23.612534 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-6c7767d9f4-5rbv6"
Dec 16 07:19:23 crc kubenswrapper[4823]: I1216 07:19:23.624203 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/472078f0-72f2-4e3e-a626-8a98e3329fe6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "472078f0-72f2-4e3e-a626-8a98e3329fe6" (UID: "472078f0-72f2-4e3e-a626-8a98e3329fe6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 07:19:23 crc kubenswrapper[4823]: I1216 07:19:23.662205 4823 scope.go:117] "RemoveContainer" containerID="79e08e3b403b1c797e52c3704ec66026cfab83c5c450ea72b672827d35c00a25"
Dec 16 07:19:23 crc kubenswrapper[4823]: I1216 07:19:23.717139 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/472078f0-72f2-4e3e-a626-8a98e3329fe6-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 16 07:19:23 crc kubenswrapper[4823]: I1216 07:19:23.731770 4823 scope.go:117] "RemoveContainer" containerID="ad0e619fe597f38298ba6b5425eeda3261561fe3395b804ebbd85c41d6bac42e"
Dec 16 07:19:23 crc kubenswrapper[4823]: E1216 07:19:23.738567 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad0e619fe597f38298ba6b5425eeda3261561fe3395b804ebbd85c41d6bac42e\": container with ID starting with ad0e619fe597f38298ba6b5425eeda3261561fe3395b804ebbd85c41d6bac42e not found: ID does not exist" containerID="ad0e619fe597f38298ba6b5425eeda3261561fe3395b804ebbd85c41d6bac42e"
Dec 16 07:19:23 crc kubenswrapper[4823]: I1216 07:19:23.738617 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad0e619fe597f38298ba6b5425eeda3261561fe3395b804ebbd85c41d6bac42e"} err="failed to get container status \"ad0e619fe597f38298ba6b5425eeda3261561fe3395b804ebbd85c41d6bac42e\": rpc error: code = NotFound desc = could not find container \"ad0e619fe597f38298ba6b5425eeda3261561fe3395b804ebbd85c41d6bac42e\": container with ID starting with ad0e619fe597f38298ba6b5425eeda3261561fe3395b804ebbd85c41d6bac42e not found: ID does not exist"
Dec 16 07:19:23 crc kubenswrapper[4823]: I1216 07:19:23.738643 4823 scope.go:117] "RemoveContainer" containerID="79e08e3b403b1c797e52c3704ec66026cfab83c5c450ea72b672827d35c00a25"
Dec 16 07:19:23 crc kubenswrapper[4823]: E1216 07:19:23.742497 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79e08e3b403b1c797e52c3704ec66026cfab83c5c450ea72b672827d35c00a25\": container with ID starting with 79e08e3b403b1c797e52c3704ec66026cfab83c5c450ea72b672827d35c00a25 not found: ID does not exist" containerID="79e08e3b403b1c797e52c3704ec66026cfab83c5c450ea72b672827d35c00a25"
Dec 16 07:19:23 crc kubenswrapper[4823]: I1216 07:19:23.742526 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79e08e3b403b1c797e52c3704ec66026cfab83c5c450ea72b672827d35c00a25"} err="failed to get container status \"79e08e3b403b1c797e52c3704ec66026cfab83c5c450ea72b672827d35c00a25\": rpc error: code = NotFound desc = could not find container \"79e08e3b403b1c797e52c3704ec66026cfab83c5c450ea72b672827d35c00a25\": container with ID starting with 79e08e3b403b1c797e52c3704ec66026cfab83c5c450ea72b672827d35c00a25 not found: ID does not exist"
Dec 16 07:19:23 crc kubenswrapper[4823]: I1216 07:19:23.938730 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-58d7958684-mfsdc"]
Dec 16 07:19:23 crc kubenswrapper[4823]: I1216 07:19:23.947598 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-58d7958684-mfsdc"]
Dec 16 07:19:25 crc kubenswrapper[4823]: I1216 07:19:25.477696 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"]
Dec 16 07:19:25 crc kubenswrapper[4823]: E1216 07:19:25.478353 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="472078f0-72f2-4e3e-a626-8a98e3329fe6" containerName="barbican-api"
Dec 16 07:19:25 crc kubenswrapper[4823]: I1216 07:19:25.478371 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="472078f0-72f2-4e3e-a626-8a98e3329fe6" containerName="barbican-api"
Dec 16 07:19:25 crc kubenswrapper[4823]: E1216 07:19:25.478387 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="472078f0-72f2-4e3e-a626-8a98e3329fe6" containerName="barbican-api-log"
Dec 16 07:19:25 crc kubenswrapper[4823]: I1216 07:19:25.478395 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="472078f0-72f2-4e3e-a626-8a98e3329fe6" containerName="barbican-api-log"
Dec 16 07:19:25 crc kubenswrapper[4823]: I1216 07:19:25.478574 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="472078f0-72f2-4e3e-a626-8a98e3329fe6" containerName="barbican-api"
Dec 16 07:19:25 crc kubenswrapper[4823]: I1216 07:19:25.478589 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="472078f0-72f2-4e3e-a626-8a98e3329fe6" containerName="barbican-api-log"
Dec 16 07:19:25 crc kubenswrapper[4823]: I1216 07:19:25.479193 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Dec 16 07:19:25 crc kubenswrapper[4823]: I1216 07:19:25.487814 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret"
Dec 16 07:19:25 crc kubenswrapper[4823]: I1216 07:19:25.487887 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config"
Dec 16 07:19:25 crc kubenswrapper[4823]: I1216 07:19:25.488187 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-wxhfz"
Dec 16 07:19:25 crc kubenswrapper[4823]: I1216 07:19:25.499499 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Dec 16 07:19:25 crc kubenswrapper[4823]: I1216 07:19:25.552932 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5f6144a-70e4-4772-a8d8-2adf38127212-combined-ca-bundle\") pod \"openstackclient\" (UID: \"b5f6144a-70e4-4772-a8d8-2adf38127212\") " pod="openstack/openstackclient"
Dec 16 07:19:25 crc kubenswrapper[4823]: I1216 07:19:25.553062 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqqf6\" (UniqueName: \"kubernetes.io/projected/b5f6144a-70e4-4772-a8d8-2adf38127212-kube-api-access-fqqf6\") pod \"openstackclient\" (UID: \"b5f6144a-70e4-4772-a8d8-2adf38127212\") " pod="openstack/openstackclient"
Dec 16 07:19:25 crc kubenswrapper[4823]: I1216 07:19:25.553136 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b5f6144a-70e4-4772-a8d8-2adf38127212-openstack-config\") pod \"openstackclient\" (UID: \"b5f6144a-70e4-4772-a8d8-2adf38127212\") " pod="openstack/openstackclient"
Dec 16 07:19:25 crc kubenswrapper[4823]: I1216 07:19:25.553201 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b5f6144a-70e4-4772-a8d8-2adf38127212-openstack-config-secret\") pod \"openstackclient\" (UID: \"b5f6144a-70e4-4772-a8d8-2adf38127212\") " pod="openstack/openstackclient"
Dec 16 07:19:25 crc kubenswrapper[4823]: I1216 07:19:25.659662 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqqf6\" (UniqueName: \"kubernetes.io/projected/b5f6144a-70e4-4772-a8d8-2adf38127212-kube-api-access-fqqf6\") pod \"openstackclient\" (UID: \"b5f6144a-70e4-4772-a8d8-2adf38127212\") " pod="openstack/openstackclient"
Dec 16 07:19:25 crc kubenswrapper[4823]: I1216 07:19:25.659835 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b5f6144a-70e4-4772-a8d8-2adf38127212-openstack-config\") pod \"openstackclient\" (UID: \"b5f6144a-70e4-4772-a8d8-2adf38127212\") " pod="openstack/openstackclient"
Dec 16 07:19:25 crc kubenswrapper[4823]: I1216 07:19:25.659923 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b5f6144a-70e4-4772-a8d8-2adf38127212-openstack-config-secret\") pod \"openstackclient\" (UID: \"b5f6144a-70e4-4772-a8d8-2adf38127212\") " pod="openstack/openstackclient"
Dec 16 07:19:25 crc kubenswrapper[4823]: I1216 07:19:25.660085 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5f6144a-70e4-4772-a8d8-2adf38127212-combined-ca-bundle\") pod \"openstackclient\" (UID: \"b5f6144a-70e4-4772-a8d8-2adf38127212\") " pod="openstack/openstackclient"
Dec 16 07:19:25 crc kubenswrapper[4823]: I1216 07:19:25.662441 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b5f6144a-70e4-4772-a8d8-2adf38127212-openstack-config\") pod \"openstackclient\" (UID: \"b5f6144a-70e4-4772-a8d8-2adf38127212\") " pod="openstack/openstackclient"
Dec 16 07:19:25 crc kubenswrapper[4823]: I1216 07:19:25.675960 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5f6144a-70e4-4772-a8d8-2adf38127212-combined-ca-bundle\") pod \"openstackclient\" (UID: \"b5f6144a-70e4-4772-a8d8-2adf38127212\") " pod="openstack/openstackclient"
Dec 16 07:19:25 crc kubenswrapper[4823]: I1216 07:19:25.681531 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b5f6144a-70e4-4772-a8d8-2adf38127212-openstack-config-secret\") pod \"openstackclient\" (UID: \"b5f6144a-70e4-4772-a8d8-2adf38127212\") " pod="openstack/openstackclient"
Dec 16 07:19:25 crc kubenswrapper[4823]: I1216 07:19:25.708643 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqqf6\" (UniqueName: \"kubernetes.io/projected/b5f6144a-70e4-4772-a8d8-2adf38127212-kube-api-access-fqqf6\") pod \"openstackclient\" (UID: \"b5f6144a-70e4-4772-a8d8-2adf38127212\") " pod="openstack/openstackclient"
Dec 16 07:19:25 crc kubenswrapper[4823]: I1216 07:19:25.793059 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="472078f0-72f2-4e3e-a626-8a98e3329fe6" path="/var/lib/kubelet/pods/472078f0-72f2-4e3e-a626-8a98e3329fe6/volumes"
Dec 16 07:19:25 crc kubenswrapper[4823]: I1216 07:19:25.820393 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Dec 16 07:19:26 crc kubenswrapper[4823]: I1216 07:19:26.357089 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Dec 16 07:19:26 crc kubenswrapper[4823]: I1216 07:19:26.645386 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"b5f6144a-70e4-4772-a8d8-2adf38127212","Type":"ContainerStarted","Data":"3dd0769d8f29041b0cf1367310231aefe0ce4f33754856f33102192f5357821d"}
Dec 16 07:19:27 crc kubenswrapper[4823]: I1216 07:19:27.180167 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0"
Dec 16 07:19:29 crc kubenswrapper[4823]: I1216 07:19:29.101253 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-75996b444f-cfsnf"]
Dec 16 07:19:29 crc kubenswrapper[4823]: I1216 07:19:29.102692 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-75996b444f-cfsnf"
Dec 16 07:19:29 crc kubenswrapper[4823]: I1216 07:19:29.114999 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-75996b444f-cfsnf"]
Dec 16 07:19:29 crc kubenswrapper[4823]: I1216 07:19:29.115491 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data"
Dec 16 07:19:29 crc kubenswrapper[4823]: I1216 07:19:29.115543 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc"
Dec 16 07:19:29 crc kubenswrapper[4823]: I1216 07:19:29.115740 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc"
Dec 16 07:19:29 crc kubenswrapper[4823]: I1216 07:19:29.250356 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tbqm\" (UniqueName: \"kubernetes.io/projected/acfde95a-b68d-4aee-9302-a81c73eafa99-kube-api-access-6tbqm\") pod \"swift-proxy-75996b444f-cfsnf\" (UID: \"acfde95a-b68d-4aee-9302-a81c73eafa99\") " pod="openstack/swift-proxy-75996b444f-cfsnf"
Dec 16 07:19:29 crc kubenswrapper[4823]: I1216 07:19:29.250423 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/acfde95a-b68d-4aee-9302-a81c73eafa99-run-httpd\") pod \"swift-proxy-75996b444f-cfsnf\" (UID: \"acfde95a-b68d-4aee-9302-a81c73eafa99\") " pod="openstack/swift-proxy-75996b444f-cfsnf"
Dec 16 07:19:29 crc kubenswrapper[4823]: I1216 07:19:29.250540 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/acfde95a-b68d-4aee-9302-a81c73eafa99-config-data\") pod \"swift-proxy-75996b444f-cfsnf\" (UID: \"acfde95a-b68d-4aee-9302-a81c73eafa99\") " pod="openstack/swift-proxy-75996b444f-cfsnf"
Dec 16 07:19:29 crc kubenswrapper[4823]: I1216 07:19:29.250622 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/acfde95a-b68d-4aee-9302-a81c73eafa99-public-tls-certs\") pod \"swift-proxy-75996b444f-cfsnf\" (UID: \"acfde95a-b68d-4aee-9302-a81c73eafa99\") " pod="openstack/swift-proxy-75996b444f-cfsnf"
Dec 16 07:19:29 crc kubenswrapper[4823]: I1216 07:19:29.250643 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/acfde95a-b68d-4aee-9302-a81c73eafa99-internal-tls-certs\") pod \"swift-proxy-75996b444f-cfsnf\" (UID: \"acfde95a-b68d-4aee-9302-a81c73eafa99\") " pod="openstack/swift-proxy-75996b444f-cfsnf"
Dec 16 07:19:29 crc kubenswrapper[4823]: I1216 07:19:29.250714 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/acfde95a-b68d-4aee-9302-a81c73eafa99-etc-swift\") pod \"swift-proxy-75996b444f-cfsnf\" (UID: \"acfde95a-b68d-4aee-9302-a81c73eafa99\") " pod="openstack/swift-proxy-75996b444f-cfsnf"
Dec 16 07:19:29 crc kubenswrapper[4823]: I1216 07:19:29.250773 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acfde95a-b68d-4aee-9302-a81c73eafa99-combined-ca-bundle\") pod \"swift-proxy-75996b444f-cfsnf\" (UID: \"acfde95a-b68d-4aee-9302-a81c73eafa99\") " pod="openstack/swift-proxy-75996b444f-cfsnf"
Dec 16 07:19:29 crc kubenswrapper[4823]: I1216 07:19:29.250795 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/acfde95a-b68d-4aee-9302-a81c73eafa99-log-httpd\") pod \"swift-proxy-75996b444f-cfsnf\" (UID: \"acfde95a-b68d-4aee-9302-a81c73eafa99\") " pod="openstack/swift-proxy-75996b444f-cfsnf"
Dec 16 07:19:29 crc kubenswrapper[4823]: I1216 07:19:29.352197 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6tbqm\" (UniqueName: \"kubernetes.io/projected/acfde95a-b68d-4aee-9302-a81c73eafa99-kube-api-access-6tbqm\") pod \"swift-proxy-75996b444f-cfsnf\" (UID: \"acfde95a-b68d-4aee-9302-a81c73eafa99\") " pod="openstack/swift-proxy-75996b444f-cfsnf"
Dec 16 07:19:29 crc kubenswrapper[4823]: I1216 07:19:29.352263 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/acfde95a-b68d-4aee-9302-a81c73eafa99-run-httpd\") pod \"swift-proxy-75996b444f-cfsnf\" (UID: \"acfde95a-b68d-4aee-9302-a81c73eafa99\") " pod="openstack/swift-proxy-75996b444f-cfsnf"
Dec 16 07:19:29 crc kubenswrapper[4823]: I1216 07:19:29.352295 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/acfde95a-b68d-4aee-9302-a81c73eafa99-config-data\") pod \"swift-proxy-75996b444f-cfsnf\" (UID: \"acfde95a-b68d-4aee-9302-a81c73eafa99\") " pod="openstack/swift-proxy-75996b444f-cfsnf"
Dec 16 07:19:29 crc kubenswrapper[4823]: I1216 07:19:29.352352 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/acfde95a-b68d-4aee-9302-a81c73eafa99-public-tls-certs\") pod \"swift-proxy-75996b444f-cfsnf\" (UID: \"acfde95a-b68d-4aee-9302-a81c73eafa99\") " pod="openstack/swift-proxy-75996b444f-cfsnf"
Dec 16 07:19:29 crc kubenswrapper[4823]: I1216 07:19:29.352375 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/acfde95a-b68d-4aee-9302-a81c73eafa99-internal-tls-certs\") pod \"swift-proxy-75996b444f-cfsnf\" (UID: \"acfde95a-b68d-4aee-9302-a81c73eafa99\") " pod="openstack/swift-proxy-75996b444f-cfsnf"
Dec 16 07:19:29 crc kubenswrapper[4823]: I1216 07:19:29.352420 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/acfde95a-b68d-4aee-9302-a81c73eafa99-etc-swift\") pod \"swift-proxy-75996b444f-cfsnf\" (UID: \"acfde95a-b68d-4aee-9302-a81c73eafa99\") " pod="openstack/swift-proxy-75996b444f-cfsnf"
Dec 16 07:19:29 crc kubenswrapper[4823]: I1216 07:19:29.352484 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acfde95a-b68d-4aee-9302-a81c73eafa99-combined-ca-bundle\") pod \"swift-proxy-75996b444f-cfsnf\" (UID: \"acfde95a-b68d-4aee-9302-a81c73eafa99\") " pod="openstack/swift-proxy-75996b444f-cfsnf"
Dec 16 07:19:29 crc kubenswrapper[4823]: I1216 07:19:29.352509 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/acfde95a-b68d-4aee-9302-a81c73eafa99-log-httpd\") pod \"swift-proxy-75996b444f-cfsnf\" (UID: \"acfde95a-b68d-4aee-9302-a81c73eafa99\") " pod="openstack/swift-proxy-75996b444f-cfsnf"
Dec 16 07:19:29 crc kubenswrapper[4823]: I1216 07:19:29.353160 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/acfde95a-b68d-4aee-9302-a81c73eafa99-log-httpd\") pod \"swift-proxy-75996b444f-cfsnf\" (UID: \"acfde95a-b68d-4aee-9302-a81c73eafa99\") " pod="openstack/swift-proxy-75996b444f-cfsnf"
Dec 16 07:19:29 crc kubenswrapper[4823]: I1216 07:19:29.353722 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/acfde95a-b68d-4aee-9302-a81c73eafa99-run-httpd\") pod \"swift-proxy-75996b444f-cfsnf\" (UID: \"acfde95a-b68d-4aee-9302-a81c73eafa99\") " pod="openstack/swift-proxy-75996b444f-cfsnf"
Dec 16 07:19:29 crc kubenswrapper[4823]: I1216 07:19:29.360892 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/acfde95a-b68d-4aee-9302-a81c73eafa99-internal-tls-certs\") pod \"swift-proxy-75996b444f-cfsnf\" (UID: \"acfde95a-b68d-4aee-9302-a81c73eafa99\") " pod="openstack/swift-proxy-75996b444f-cfsnf"
Dec 16 07:19:29 crc kubenswrapper[4823]: I1216 07:19:29.362154 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acfde95a-b68d-4aee-9302-a81c73eafa99-combined-ca-bundle\") pod \"swift-proxy-75996b444f-cfsnf\" (UID: \"acfde95a-b68d-4aee-9302-a81c73eafa99\") " pod="openstack/swift-proxy-75996b444f-cfsnf"
Dec 16 07:19:29 crc kubenswrapper[4823]: I1216 07:19:29.365178 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/acfde95a-b68d-4aee-9302-a81c73eafa99-config-data\") pod \"swift-proxy-75996b444f-cfsnf\" (UID: \"acfde95a-b68d-4aee-9302-a81c73eafa99\") " pod="openstack/swift-proxy-75996b444f-cfsnf"
Dec 16 07:19:29 crc kubenswrapper[4823]: I1216 07:19:29.365665 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/acfde95a-b68d-4aee-9302-a81c73eafa99-public-tls-certs\") pod \"swift-proxy-75996b444f-cfsnf\" (UID: \"acfde95a-b68d-4aee-9302-a81c73eafa99\") " pod="openstack/swift-proxy-75996b444f-cfsnf"
Dec 16 07:19:29 crc kubenswrapper[4823]: I1216 07:19:29.365737 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/acfde95a-b68d-4aee-9302-a81c73eafa99-etc-swift\") pod \"swift-proxy-75996b444f-cfsnf\" (UID: \"acfde95a-b68d-4aee-9302-a81c73eafa99\") " pod="openstack/swift-proxy-75996b444f-cfsnf"
Dec 16 07:19:29 crc kubenswrapper[4823]: I1216 07:19:29.373022 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tbqm\" (UniqueName: \"kubernetes.io/projected/acfde95a-b68d-4aee-9302-a81c73eafa99-kube-api-access-6tbqm\") pod \"swift-proxy-75996b444f-cfsnf\" (UID: \"acfde95a-b68d-4aee-9302-a81c73eafa99\") " pod="openstack/swift-proxy-75996b444f-cfsnf"
Dec 16 07:19:29 crc kubenswrapper[4823]: I1216 07:19:29.447096 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-75996b444f-cfsnf"
Dec 16 07:19:30 crc kubenswrapper[4823]: I1216 07:19:30.048353 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-75996b444f-cfsnf"]
Dec 16 07:19:30 crc kubenswrapper[4823]: I1216 07:19:30.713227 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-75996b444f-cfsnf" event={"ID":"acfde95a-b68d-4aee-9302-a81c73eafa99","Type":"ContainerStarted","Data":"4c4e79f2a5dd3e53e86fd303d293f08e7a4df7dc0b54cdda4b91bc74df4c3386"}
Dec 16 07:19:30 crc kubenswrapper[4823]: I1216 07:19:30.713550 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-75996b444f-cfsnf" event={"ID":"acfde95a-b68d-4aee-9302-a81c73eafa99","Type":"ContainerStarted","Data":"d25459bc894939d2fcabcc5640a5016ecc71457b4cdaf7962569b58c6456358d"}
Dec 16 07:19:30 crc kubenswrapper[4823]: I1216 07:19:30.713565 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-75996b444f-cfsnf" event={"ID":"acfde95a-b68d-4aee-9302-a81c73eafa99","Type":"ContainerStarted","Data":"f4d8edf31a44a0c46340fbdc8bc97c9bb07031f9c938b8d86a94a020fa999433"}
Dec 16 07:19:30 crc kubenswrapper[4823]: I1216 07:19:30.739468 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-75996b444f-cfsnf" podStartSLOduration=1.739450618 podStartE2EDuration="1.739450618s" podCreationTimestamp="2025-12-16 07:19:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 07:19:30.738308712 +0000 UTC m=+1449.226874835" watchObservedRunningTime="2025-12-16 07:19:30.739450618 +0000 UTC m=+1449.228016741"
Dec 16 07:19:31 crc kubenswrapper[4823]: I1216 07:19:31.725375 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-75996b444f-cfsnf"
Dec 16 07:19:31 crc kubenswrapper[4823]: I1216 07:19:31.725717 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-75996b444f-cfsnf"
Dec 16 07:19:32 crc kubenswrapper[4823]: I1216 07:19:32.060515 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Dec 16 07:19:32 crc kubenswrapper[4823]: I1216 07:19:32.060820 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="421bf187-1b90-4633-9cce-3ef3e0387343" containerName="ceilometer-central-agent" containerID="cri-o://3718e34b1fc21b42d98d69728a16b761c782e2c2e37d66729e379687f6b6da20" gracePeriod=30
Dec 16 07:19:32 crc kubenswrapper[4823]: I1216 07:19:32.061649 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="421bf187-1b90-4633-9cce-3ef3e0387343" containerName="proxy-httpd" containerID="cri-o://34d5d59e1e9cf6b86e8a339b71d69ac7e017e6031c70c62ac5f611787a5f2436" gracePeriod=30
Dec 16 07:19:32 crc kubenswrapper[4823]: I1216 07:19:32.061712 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="421bf187-1b90-4633-9cce-3ef3e0387343" containerName="sg-core" containerID="cri-o://74d6b3f6906669a91552701733b0d73db38ce2772a9816834e6e528c11bbfe95" gracePeriod=30
Dec 16 07:19:32 crc kubenswrapper[4823]: I1216 07:19:32.061761 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="421bf187-1b90-4633-9cce-3ef3e0387343" containerName="ceilometer-notification-agent" containerID="cri-o://420a315d161117af966330608a0704e9cdc7e630c2109694cd4dafa6bf4f0f1d" gracePeriod=30
Dec 16
07:19:32 crc kubenswrapper[4823]: I1216 07:19:32.071261 4823 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="421bf187-1b90-4633-9cce-3ef3e0387343" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.157:3000/\": EOF" Dec 16 07:19:32 crc kubenswrapper[4823]: I1216 07:19:32.736676 4823 generic.go:334] "Generic (PLEG): container finished" podID="421bf187-1b90-4633-9cce-3ef3e0387343" containerID="34d5d59e1e9cf6b86e8a339b71d69ac7e017e6031c70c62ac5f611787a5f2436" exitCode=0 Dec 16 07:19:32 crc kubenswrapper[4823]: I1216 07:19:32.737066 4823 generic.go:334] "Generic (PLEG): container finished" podID="421bf187-1b90-4633-9cce-3ef3e0387343" containerID="74d6b3f6906669a91552701733b0d73db38ce2772a9816834e6e528c11bbfe95" exitCode=2 Dec 16 07:19:32 crc kubenswrapper[4823]: I1216 07:19:32.737078 4823 generic.go:334] "Generic (PLEG): container finished" podID="421bf187-1b90-4633-9cce-3ef3e0387343" containerID="3718e34b1fc21b42d98d69728a16b761c782e2c2e37d66729e379687f6b6da20" exitCode=0 Dec 16 07:19:32 crc kubenswrapper[4823]: I1216 07:19:32.736751 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"421bf187-1b90-4633-9cce-3ef3e0387343","Type":"ContainerDied","Data":"34d5d59e1e9cf6b86e8a339b71d69ac7e017e6031c70c62ac5f611787a5f2436"} Dec 16 07:19:32 crc kubenswrapper[4823]: I1216 07:19:32.737408 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"421bf187-1b90-4633-9cce-3ef3e0387343","Type":"ContainerDied","Data":"74d6b3f6906669a91552701733b0d73db38ce2772a9816834e6e528c11bbfe95"} Dec 16 07:19:32 crc kubenswrapper[4823]: I1216 07:19:32.737443 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"421bf187-1b90-4633-9cce-3ef3e0387343","Type":"ContainerDied","Data":"3718e34b1fc21b42d98d69728a16b761c782e2c2e37d66729e379687f6b6da20"} Dec 16 07:19:33 crc kubenswrapper[4823]: I1216 
07:19:33.748917 4823 generic.go:334] "Generic (PLEG): container finished" podID="421bf187-1b90-4633-9cce-3ef3e0387343" containerID="420a315d161117af966330608a0704e9cdc7e630c2109694cd4dafa6bf4f0f1d" exitCode=0 Dec 16 07:19:33 crc kubenswrapper[4823]: I1216 07:19:33.749040 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"421bf187-1b90-4633-9cce-3ef3e0387343","Type":"ContainerDied","Data":"420a315d161117af966330608a0704e9cdc7e630c2109694cd4dafa6bf4f0f1d"} Dec 16 07:19:34 crc kubenswrapper[4823]: I1216 07:19:34.753503 4823 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="421bf187-1b90-4633-9cce-3ef3e0387343" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.157:3000/\": dial tcp 10.217.0.157:3000: connect: connection refused" Dec 16 07:19:34 crc kubenswrapper[4823]: I1216 07:19:34.759386 4823 generic.go:334] "Generic (PLEG): container finished" podID="c8dc4eee-95c9-4460-84d3-fc8c18a8ab1a" containerID="3bbb7d0b518486c9da08c3aa98624cbf02668f6a757f1b21cd693ea4f2ba78eb" exitCode=137 Dec 16 07:19:34 crc kubenswrapper[4823]: I1216 07:19:34.759476 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"c8dc4eee-95c9-4460-84d3-fc8c18a8ab1a","Type":"ContainerDied","Data":"3bbb7d0b518486c9da08c3aa98624cbf02668f6a757f1b21cd693ea4f2ba78eb"} Dec 16 07:19:35 crc kubenswrapper[4823]: I1216 07:19:35.717177 4823 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="c8dc4eee-95c9-4460-84d3-fc8c18a8ab1a" containerName="cinder-api" probeResult="failure" output="Get \"http://10.217.0.152:8776/healthcheck\": dial tcp 10.217.0.152:8776: connect: connection refused" Dec 16 07:19:37 crc kubenswrapper[4823]: I1216 07:19:37.067225 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 16 07:19:37 crc kubenswrapper[4823]: I1216 07:19:37.215627 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/421bf187-1b90-4633-9cce-3ef3e0387343-log-httpd\") pod \"421bf187-1b90-4633-9cce-3ef3e0387343\" (UID: \"421bf187-1b90-4633-9cce-3ef3e0387343\") " Dec 16 07:19:37 crc kubenswrapper[4823]: I1216 07:19:37.215931 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/421bf187-1b90-4633-9cce-3ef3e0387343-run-httpd\") pod \"421bf187-1b90-4633-9cce-3ef3e0387343\" (UID: \"421bf187-1b90-4633-9cce-3ef3e0387343\") " Dec 16 07:19:37 crc kubenswrapper[4823]: I1216 07:19:37.215995 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/421bf187-1b90-4633-9cce-3ef3e0387343-sg-core-conf-yaml\") pod \"421bf187-1b90-4633-9cce-3ef3e0387343\" (UID: \"421bf187-1b90-4633-9cce-3ef3e0387343\") " Dec 16 07:19:37 crc kubenswrapper[4823]: I1216 07:19:37.216034 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/421bf187-1b90-4633-9cce-3ef3e0387343-config-data\") pod \"421bf187-1b90-4633-9cce-3ef3e0387343\" (UID: \"421bf187-1b90-4633-9cce-3ef3e0387343\") " Dec 16 07:19:37 crc kubenswrapper[4823]: I1216 07:19:37.216063 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/421bf187-1b90-4633-9cce-3ef3e0387343-scripts\") pod \"421bf187-1b90-4633-9cce-3ef3e0387343\" (UID: \"421bf187-1b90-4633-9cce-3ef3e0387343\") " Dec 16 07:19:37 crc kubenswrapper[4823]: I1216 07:19:37.216139 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/421bf187-1b90-4633-9cce-3ef3e0387343-combined-ca-bundle\") pod \"421bf187-1b90-4633-9cce-3ef3e0387343\" (UID: \"421bf187-1b90-4633-9cce-3ef3e0387343\") " Dec 16 07:19:37 crc kubenswrapper[4823]: I1216 07:19:37.216207 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z4vmn\" (UniqueName: \"kubernetes.io/projected/421bf187-1b90-4633-9cce-3ef3e0387343-kube-api-access-z4vmn\") pod \"421bf187-1b90-4633-9cce-3ef3e0387343\" (UID: \"421bf187-1b90-4633-9cce-3ef3e0387343\") " Dec 16 07:19:37 crc kubenswrapper[4823]: I1216 07:19:37.216403 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/421bf187-1b90-4633-9cce-3ef3e0387343-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "421bf187-1b90-4633-9cce-3ef3e0387343" (UID: "421bf187-1b90-4633-9cce-3ef3e0387343"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:19:37 crc kubenswrapper[4823]: I1216 07:19:37.216880 4823 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/421bf187-1b90-4633-9cce-3ef3e0387343-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 16 07:19:37 crc kubenswrapper[4823]: I1216 07:19:37.217240 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/421bf187-1b90-4633-9cce-3ef3e0387343-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "421bf187-1b90-4633-9cce-3ef3e0387343" (UID: "421bf187-1b90-4633-9cce-3ef3e0387343"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:19:37 crc kubenswrapper[4823]: I1216 07:19:37.221285 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/421bf187-1b90-4633-9cce-3ef3e0387343-kube-api-access-z4vmn" (OuterVolumeSpecName: "kube-api-access-z4vmn") pod "421bf187-1b90-4633-9cce-3ef3e0387343" (UID: "421bf187-1b90-4633-9cce-3ef3e0387343"). InnerVolumeSpecName "kube-api-access-z4vmn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:19:37 crc kubenswrapper[4823]: I1216 07:19:37.221610 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/421bf187-1b90-4633-9cce-3ef3e0387343-scripts" (OuterVolumeSpecName: "scripts") pod "421bf187-1b90-4633-9cce-3ef3e0387343" (UID: "421bf187-1b90-4633-9cce-3ef3e0387343"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:19:37 crc kubenswrapper[4823]: I1216 07:19:37.228747 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 16 07:19:37 crc kubenswrapper[4823]: I1216 07:19:37.284224 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/421bf187-1b90-4633-9cce-3ef3e0387343-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "421bf187-1b90-4633-9cce-3ef3e0387343" (UID: "421bf187-1b90-4633-9cce-3ef3e0387343"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:19:37 crc kubenswrapper[4823]: I1216 07:19:37.327829 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c8dc4eee-95c9-4460-84d3-fc8c18a8ab1a-logs\") pod \"c8dc4eee-95c9-4460-84d3-fc8c18a8ab1a\" (UID: \"c8dc4eee-95c9-4460-84d3-fc8c18a8ab1a\") " Dec 16 07:19:37 crc kubenswrapper[4823]: I1216 07:19:37.327908 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8dc4eee-95c9-4460-84d3-fc8c18a8ab1a-config-data\") pod \"c8dc4eee-95c9-4460-84d3-fc8c18a8ab1a\" (UID: \"c8dc4eee-95c9-4460-84d3-fc8c18a8ab1a\") " Dec 16 07:19:37 crc kubenswrapper[4823]: I1216 07:19:37.327947 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6gvfr\" (UniqueName: \"kubernetes.io/projected/c8dc4eee-95c9-4460-84d3-fc8c18a8ab1a-kube-api-access-6gvfr\") pod \"c8dc4eee-95c9-4460-84d3-fc8c18a8ab1a\" (UID: \"c8dc4eee-95c9-4460-84d3-fc8c18a8ab1a\") " Dec 16 07:19:37 crc kubenswrapper[4823]: I1216 07:19:37.328099 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c8dc4eee-95c9-4460-84d3-fc8c18a8ab1a-config-data-custom\") pod \"c8dc4eee-95c9-4460-84d3-fc8c18a8ab1a\" (UID: \"c8dc4eee-95c9-4460-84d3-fc8c18a8ab1a\") " Dec 16 07:19:37 crc kubenswrapper[4823]: I1216 07:19:37.328188 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8dc4eee-95c9-4460-84d3-fc8c18a8ab1a-scripts\") pod \"c8dc4eee-95c9-4460-84d3-fc8c18a8ab1a\" (UID: \"c8dc4eee-95c9-4460-84d3-fc8c18a8ab1a\") " Dec 16 07:19:37 crc kubenswrapper[4823]: I1216 07:19:37.328210 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c8dc4eee-95c9-4460-84d3-fc8c18a8ab1a-combined-ca-bundle\") pod \"c8dc4eee-95c9-4460-84d3-fc8c18a8ab1a\" (UID: \"c8dc4eee-95c9-4460-84d3-fc8c18a8ab1a\") " Dec 16 07:19:37 crc kubenswrapper[4823]: I1216 07:19:37.328322 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c8dc4eee-95c9-4460-84d3-fc8c18a8ab1a-etc-machine-id\") pod \"c8dc4eee-95c9-4460-84d3-fc8c18a8ab1a\" (UID: \"c8dc4eee-95c9-4460-84d3-fc8c18a8ab1a\") " Dec 16 07:19:37 crc kubenswrapper[4823]: I1216 07:19:37.328663 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8dc4eee-95c9-4460-84d3-fc8c18a8ab1a-logs" (OuterVolumeSpecName: "logs") pod "c8dc4eee-95c9-4460-84d3-fc8c18a8ab1a" (UID: "c8dc4eee-95c9-4460-84d3-fc8c18a8ab1a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:19:37 crc kubenswrapper[4823]: I1216 07:19:37.328713 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c8dc4eee-95c9-4460-84d3-fc8c18a8ab1a-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "c8dc4eee-95c9-4460-84d3-fc8c18a8ab1a" (UID: "c8dc4eee-95c9-4460-84d3-fc8c18a8ab1a"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 07:19:37 crc kubenswrapper[4823]: I1216 07:19:37.328806 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z4vmn\" (UniqueName: \"kubernetes.io/projected/421bf187-1b90-4633-9cce-3ef3e0387343-kube-api-access-z4vmn\") on node \"crc\" DevicePath \"\"" Dec 16 07:19:37 crc kubenswrapper[4823]: I1216 07:19:37.328829 4823 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c8dc4eee-95c9-4460-84d3-fc8c18a8ab1a-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 16 07:19:37 crc kubenswrapper[4823]: I1216 07:19:37.328842 4823 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/421bf187-1b90-4633-9cce-3ef3e0387343-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 16 07:19:37 crc kubenswrapper[4823]: I1216 07:19:37.328855 4823 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c8dc4eee-95c9-4460-84d3-fc8c18a8ab1a-logs\") on node \"crc\" DevicePath \"\"" Dec 16 07:19:37 crc kubenswrapper[4823]: I1216 07:19:37.328867 4823 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/421bf187-1b90-4633-9cce-3ef3e0387343-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 16 07:19:37 crc kubenswrapper[4823]: I1216 07:19:37.328879 4823 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/421bf187-1b90-4633-9cce-3ef3e0387343-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:19:37 crc kubenswrapper[4823]: I1216 07:19:37.334861 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8dc4eee-95c9-4460-84d3-fc8c18a8ab1a-scripts" (OuterVolumeSpecName: "scripts") pod "c8dc4eee-95c9-4460-84d3-fc8c18a8ab1a" (UID: "c8dc4eee-95c9-4460-84d3-fc8c18a8ab1a"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:19:37 crc kubenswrapper[4823]: I1216 07:19:37.335081 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8dc4eee-95c9-4460-84d3-fc8c18a8ab1a-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "c8dc4eee-95c9-4460-84d3-fc8c18a8ab1a" (UID: "c8dc4eee-95c9-4460-84d3-fc8c18a8ab1a"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:19:37 crc kubenswrapper[4823]: I1216 07:19:37.357656 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8dc4eee-95c9-4460-84d3-fc8c18a8ab1a-kube-api-access-6gvfr" (OuterVolumeSpecName: "kube-api-access-6gvfr") pod "c8dc4eee-95c9-4460-84d3-fc8c18a8ab1a" (UID: "c8dc4eee-95c9-4460-84d3-fc8c18a8ab1a"). InnerVolumeSpecName "kube-api-access-6gvfr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:19:37 crc kubenswrapper[4823]: I1216 07:19:37.431806 4823 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c8dc4eee-95c9-4460-84d3-fc8c18a8ab1a-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 16 07:19:37 crc kubenswrapper[4823]: I1216 07:19:37.431863 4823 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8dc4eee-95c9-4460-84d3-fc8c18a8ab1a-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:19:37 crc kubenswrapper[4823]: I1216 07:19:37.431875 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6gvfr\" (UniqueName: \"kubernetes.io/projected/c8dc4eee-95c9-4460-84d3-fc8c18a8ab1a-kube-api-access-6gvfr\") on node \"crc\" DevicePath \"\"" Dec 16 07:19:37 crc kubenswrapper[4823]: I1216 07:19:37.438001 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/421bf187-1b90-4633-9cce-3ef3e0387343-combined-ca-bundle" 
(OuterVolumeSpecName: "combined-ca-bundle") pod "421bf187-1b90-4633-9cce-3ef3e0387343" (UID: "421bf187-1b90-4633-9cce-3ef3e0387343"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:19:37 crc kubenswrapper[4823]: I1216 07:19:37.439579 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8dc4eee-95c9-4460-84d3-fc8c18a8ab1a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c8dc4eee-95c9-4460-84d3-fc8c18a8ab1a" (UID: "c8dc4eee-95c9-4460-84d3-fc8c18a8ab1a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:19:37 crc kubenswrapper[4823]: I1216 07:19:37.446478 4823 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-765f8bc948-dqt65" podUID="ebefd0b6-7523-402f-8952-76a232986c74" containerName="neutron-httpd" probeResult="failure" output="Get \"http://10.217.0.146:9696/\": dial tcp 10.217.0.146:9696: connect: connection refused" Dec 16 07:19:37 crc kubenswrapper[4823]: I1216 07:19:37.452682 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8dc4eee-95c9-4460-84d3-fc8c18a8ab1a-config-data" (OuterVolumeSpecName: "config-data") pod "c8dc4eee-95c9-4460-84d3-fc8c18a8ab1a" (UID: "c8dc4eee-95c9-4460-84d3-fc8c18a8ab1a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:19:37 crc kubenswrapper[4823]: I1216 07:19:37.481498 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/421bf187-1b90-4633-9cce-3ef3e0387343-config-data" (OuterVolumeSpecName: "config-data") pod "421bf187-1b90-4633-9cce-3ef3e0387343" (UID: "421bf187-1b90-4633-9cce-3ef3e0387343"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:19:37 crc kubenswrapper[4823]: I1216 07:19:37.533856 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8dc4eee-95c9-4460-84d3-fc8c18a8ab1a-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 07:19:37 crc kubenswrapper[4823]: I1216 07:19:37.533907 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/421bf187-1b90-4633-9cce-3ef3e0387343-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 07:19:37 crc kubenswrapper[4823]: I1216 07:19:37.533937 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/421bf187-1b90-4633-9cce-3ef3e0387343-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:19:37 crc kubenswrapper[4823]: I1216 07:19:37.533950 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8dc4eee-95c9-4460-84d3-fc8c18a8ab1a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:19:37 crc kubenswrapper[4823]: I1216 07:19:37.798870 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"421bf187-1b90-4633-9cce-3ef3e0387343","Type":"ContainerDied","Data":"45b6fd78d4cc27fc4f505fe84854559b90664fc56cf4eea2db97bbae374da5fa"} Dec 16 07:19:37 crc kubenswrapper[4823]: I1216 07:19:37.798937 4823 scope.go:117] "RemoveContainer" containerID="34d5d59e1e9cf6b86e8a339b71d69ac7e017e6031c70c62ac5f611787a5f2436" Dec 16 07:19:37 crc kubenswrapper[4823]: I1216 07:19:37.798941 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 16 07:19:37 crc kubenswrapper[4823]: I1216 07:19:37.811697 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"c8dc4eee-95c9-4460-84d3-fc8c18a8ab1a","Type":"ContainerDied","Data":"33edd90111724c4a16cf1c1680a380bb7fd0fd42427c17b0fafdeb431db19842"} Dec 16 07:19:37 crc kubenswrapper[4823]: I1216 07:19:37.811870 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 16 07:19:37 crc kubenswrapper[4823]: I1216 07:19:37.824173 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"b5f6144a-70e4-4772-a8d8-2adf38127212","Type":"ContainerStarted","Data":"f115ec7d425d70b2afcfd5cf1785d9ea4d296e40ca9ff51d30788a90679af605"} Dec 16 07:19:37 crc kubenswrapper[4823]: I1216 07:19:37.835455 4823 scope.go:117] "RemoveContainer" containerID="74d6b3f6906669a91552701733b0d73db38ce2772a9816834e6e528c11bbfe95" Dec 16 07:19:37 crc kubenswrapper[4823]: I1216 07:19:37.863157 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 16 07:19:37 crc kubenswrapper[4823]: I1216 07:19:37.871429 4823 scope.go:117] "RemoveContainer" containerID="420a315d161117af966330608a0704e9cdc7e630c2109694cd4dafa6bf4f0f1d" Dec 16 07:19:37 crc kubenswrapper[4823]: I1216 07:19:37.896464 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 16 07:19:37 crc kubenswrapper[4823]: I1216 07:19:37.910406 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 16 07:19:37 crc kubenswrapper[4823]: I1216 07:19:37.926243 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Dec 16 07:19:37 crc kubenswrapper[4823]: I1216 07:19:37.930339 4823 scope.go:117] "RemoveContainer" containerID="3718e34b1fc21b42d98d69728a16b761c782e2c2e37d66729e379687f6b6da20" Dec 16 07:19:37 crc kubenswrapper[4823]: 
I1216 07:19:37.943706 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 16 07:19:37 crc kubenswrapper[4823]: E1216 07:19:37.944212 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="421bf187-1b90-4633-9cce-3ef3e0387343" containerName="sg-core" Dec 16 07:19:37 crc kubenswrapper[4823]: I1216 07:19:37.944238 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="421bf187-1b90-4633-9cce-3ef3e0387343" containerName="sg-core" Dec 16 07:19:37 crc kubenswrapper[4823]: E1216 07:19:37.944259 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="421bf187-1b90-4633-9cce-3ef3e0387343" containerName="ceilometer-notification-agent" Dec 16 07:19:37 crc kubenswrapper[4823]: I1216 07:19:37.944267 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="421bf187-1b90-4633-9cce-3ef3e0387343" containerName="ceilometer-notification-agent" Dec 16 07:19:37 crc kubenswrapper[4823]: E1216 07:19:37.944284 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8dc4eee-95c9-4460-84d3-fc8c18a8ab1a" containerName="cinder-api" Dec 16 07:19:37 crc kubenswrapper[4823]: I1216 07:19:37.944291 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8dc4eee-95c9-4460-84d3-fc8c18a8ab1a" containerName="cinder-api" Dec 16 07:19:37 crc kubenswrapper[4823]: E1216 07:19:37.944314 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8dc4eee-95c9-4460-84d3-fc8c18a8ab1a" containerName="cinder-api-log" Dec 16 07:19:37 crc kubenswrapper[4823]: I1216 07:19:37.944322 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8dc4eee-95c9-4460-84d3-fc8c18a8ab1a" containerName="cinder-api-log" Dec 16 07:19:37 crc kubenswrapper[4823]: E1216 07:19:37.944331 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="421bf187-1b90-4633-9cce-3ef3e0387343" containerName="ceilometer-central-agent" Dec 16 07:19:37 crc kubenswrapper[4823]: I1216 07:19:37.944338 4823 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="421bf187-1b90-4633-9cce-3ef3e0387343" containerName="ceilometer-central-agent" Dec 16 07:19:37 crc kubenswrapper[4823]: E1216 07:19:37.944361 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="421bf187-1b90-4633-9cce-3ef3e0387343" containerName="proxy-httpd" Dec 16 07:19:37 crc kubenswrapper[4823]: I1216 07:19:37.944368 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="421bf187-1b90-4633-9cce-3ef3e0387343" containerName="proxy-httpd" Dec 16 07:19:37 crc kubenswrapper[4823]: I1216 07:19:37.944567 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8dc4eee-95c9-4460-84d3-fc8c18a8ab1a" containerName="cinder-api" Dec 16 07:19:37 crc kubenswrapper[4823]: I1216 07:19:37.944586 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="421bf187-1b90-4633-9cce-3ef3e0387343" containerName="ceilometer-central-agent" Dec 16 07:19:37 crc kubenswrapper[4823]: I1216 07:19:37.944595 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="421bf187-1b90-4633-9cce-3ef3e0387343" containerName="ceilometer-notification-agent" Dec 16 07:19:37 crc kubenswrapper[4823]: I1216 07:19:37.944609 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="421bf187-1b90-4633-9cce-3ef3e0387343" containerName="proxy-httpd" Dec 16 07:19:37 crc kubenswrapper[4823]: I1216 07:19:37.944630 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8dc4eee-95c9-4460-84d3-fc8c18a8ab1a" containerName="cinder-api-log" Dec 16 07:19:37 crc kubenswrapper[4823]: I1216 07:19:37.944641 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="421bf187-1b90-4633-9cce-3ef3e0387343" containerName="sg-core" Dec 16 07:19:37 crc kubenswrapper[4823]: I1216 07:19:37.946673 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 16 07:19:37 crc kubenswrapper[4823]: I1216 07:19:37.953839 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 16 07:19:37 crc kubenswrapper[4823]: I1216 07:19:37.954068 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 16 07:19:37 crc kubenswrapper[4823]: I1216 07:19:37.972044 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 16 07:19:37 crc kubenswrapper[4823]: I1216 07:19:37.973708 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 16 07:19:37 crc kubenswrapper[4823]: I1216 07:19:37.977318 4823 scope.go:117] "RemoveContainer" containerID="3bbb7d0b518486c9da08c3aa98624cbf02668f6a757f1b21cd693ea4f2ba78eb" Dec 16 07:19:37 crc kubenswrapper[4823]: I1216 07:19:37.978700 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.392293283 podStartE2EDuration="12.97860925s" podCreationTimestamp="2025-12-16 07:19:25 +0000 UTC" firstStartedPulling="2025-12-16 07:19:26.364639947 +0000 UTC m=+1444.853206070" lastFinishedPulling="2025-12-16 07:19:36.950955904 +0000 UTC m=+1455.439522037" observedRunningTime="2025-12-16 07:19:37.907540714 +0000 UTC m=+1456.396106847" watchObservedRunningTime="2025-12-16 07:19:37.97860925 +0000 UTC m=+1456.467175373" Dec 16 07:19:37 crc kubenswrapper[4823]: I1216 07:19:37.984321 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 16 07:19:37 crc kubenswrapper[4823]: I1216 07:19:37.984546 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Dec 16 07:19:37 crc kubenswrapper[4823]: I1216 07:19:37.984659 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Dec 16 
07:19:38 crc kubenswrapper[4823]: I1216 07:19:38.006218 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-zw7xm"] Dec 16 07:19:38 crc kubenswrapper[4823]: I1216 07:19:38.007311 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-zw7xm" Dec 16 07:19:38 crc kubenswrapper[4823]: I1216 07:19:38.027400 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 16 07:19:38 crc kubenswrapper[4823]: I1216 07:19:38.041852 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/17cbb31a-6067-4925-ba57-956baf53ce8b-scripts\") pod \"cinder-api-0\" (UID: \"17cbb31a-6067-4925-ba57-956baf53ce8b\") " pod="openstack/cinder-api-0" Dec 16 07:19:38 crc kubenswrapper[4823]: I1216 07:19:38.042257 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/27f18cac-1d41-44f1-b1b1-81cd65e8162a-log-httpd\") pod \"ceilometer-0\" (UID: \"27f18cac-1d41-44f1-b1b1-81cd65e8162a\") " pod="openstack/ceilometer-0" Dec 16 07:19:38 crc kubenswrapper[4823]: I1216 07:19:38.042400 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/27f18cac-1d41-44f1-b1b1-81cd65e8162a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"27f18cac-1d41-44f1-b1b1-81cd65e8162a\") " pod="openstack/ceilometer-0" Dec 16 07:19:38 crc kubenswrapper[4823]: I1216 07:19:38.042507 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27f18cac-1d41-44f1-b1b1-81cd65e8162a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"27f18cac-1d41-44f1-b1b1-81cd65e8162a\") " pod="openstack/ceilometer-0" Dec 16 07:19:38 crc 
kubenswrapper[4823]: I1216 07:19:38.042635 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/17cbb31a-6067-4925-ba57-956baf53ce8b-etc-machine-id\") pod \"cinder-api-0\" (UID: \"17cbb31a-6067-4925-ba57-956baf53ce8b\") " pod="openstack/cinder-api-0" Dec 16 07:19:38 crc kubenswrapper[4823]: I1216 07:19:38.042761 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27f18cac-1d41-44f1-b1b1-81cd65e8162a-config-data\") pod \"ceilometer-0\" (UID: \"27f18cac-1d41-44f1-b1b1-81cd65e8162a\") " pod="openstack/ceilometer-0" Dec 16 07:19:38 crc kubenswrapper[4823]: I1216 07:19:38.042885 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/17cbb31a-6067-4925-ba57-956baf53ce8b-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"17cbb31a-6067-4925-ba57-956baf53ce8b\") " pod="openstack/cinder-api-0" Dec 16 07:19:38 crc kubenswrapper[4823]: I1216 07:19:38.042981 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/27f18cac-1d41-44f1-b1b1-81cd65e8162a-scripts\") pod \"ceilometer-0\" (UID: \"27f18cac-1d41-44f1-b1b1-81cd65e8162a\") " pod="openstack/ceilometer-0" Dec 16 07:19:38 crc kubenswrapper[4823]: I1216 07:19:38.043109 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtk9t\" (UniqueName: \"kubernetes.io/projected/17cbb31a-6067-4925-ba57-956baf53ce8b-kube-api-access-vtk9t\") pod \"cinder-api-0\" (UID: \"17cbb31a-6067-4925-ba57-956baf53ce8b\") " pod="openstack/cinder-api-0" Dec 16 07:19:38 crc kubenswrapper[4823]: I1216 07:19:38.043226 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/17cbb31a-6067-4925-ba57-956baf53ce8b-config-data-custom\") pod \"cinder-api-0\" (UID: \"17cbb31a-6067-4925-ba57-956baf53ce8b\") " pod="openstack/cinder-api-0" Dec 16 07:19:38 crc kubenswrapper[4823]: I1216 07:19:38.043349 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/27f18cac-1d41-44f1-b1b1-81cd65e8162a-run-httpd\") pod \"ceilometer-0\" (UID: \"27f18cac-1d41-44f1-b1b1-81cd65e8162a\") " pod="openstack/ceilometer-0" Dec 16 07:19:38 crc kubenswrapper[4823]: I1216 07:19:38.043454 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-274dh\" (UniqueName: \"kubernetes.io/projected/27f18cac-1d41-44f1-b1b1-81cd65e8162a-kube-api-access-274dh\") pod \"ceilometer-0\" (UID: \"27f18cac-1d41-44f1-b1b1-81cd65e8162a\") " pod="openstack/ceilometer-0" Dec 16 07:19:38 crc kubenswrapper[4823]: I1216 07:19:38.043550 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17cbb31a-6067-4925-ba57-956baf53ce8b-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"17cbb31a-6067-4925-ba57-956baf53ce8b\") " pod="openstack/cinder-api-0" Dec 16 07:19:38 crc kubenswrapper[4823]: I1216 07:19:38.043634 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/17cbb31a-6067-4925-ba57-956baf53ce8b-public-tls-certs\") pod \"cinder-api-0\" (UID: \"17cbb31a-6067-4925-ba57-956baf53ce8b\") " pod="openstack/cinder-api-0" Dec 16 07:19:38 crc kubenswrapper[4823]: I1216 07:19:38.044339 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/17cbb31a-6067-4925-ba57-956baf53ce8b-config-data\") pod \"cinder-api-0\" (UID: \"17cbb31a-6067-4925-ba57-956baf53ce8b\") " pod="openstack/cinder-api-0" Dec 16 07:19:38 crc kubenswrapper[4823]: I1216 07:19:38.044449 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/17cbb31a-6067-4925-ba57-956baf53ce8b-logs\") pod \"cinder-api-0\" (UID: \"17cbb31a-6067-4925-ba57-956baf53ce8b\") " pod="openstack/cinder-api-0" Dec 16 07:19:38 crc kubenswrapper[4823]: I1216 07:19:38.044698 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 16 07:19:38 crc kubenswrapper[4823]: I1216 07:19:38.052374 4823 scope.go:117] "RemoveContainer" containerID="7df7313948dc8aac9ecdd39743719fe4550166c29436ffebdf4751a6884734bf" Dec 16 07:19:38 crc kubenswrapper[4823]: I1216 07:19:38.054431 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-zw7xm"] Dec 16 07:19:38 crc kubenswrapper[4823]: I1216 07:19:38.072384 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-f74gj"] Dec 16 07:19:38 crc kubenswrapper[4823]: I1216 07:19:38.073948 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-f74gj" Dec 16 07:19:38 crc kubenswrapper[4823]: I1216 07:19:38.107171 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-f74gj"] Dec 16 07:19:38 crc kubenswrapper[4823]: I1216 07:19:38.123341 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-c1ba-account-create-update-br8dd"] Dec 16 07:19:38 crc kubenswrapper[4823]: I1216 07:19:38.124956 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-c1ba-account-create-update-br8dd" Dec 16 07:19:38 crc kubenswrapper[4823]: I1216 07:19:38.127553 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Dec 16 07:19:38 crc kubenswrapper[4823]: I1216 07:19:38.143634 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-c1ba-account-create-update-br8dd"] Dec 16 07:19:38 crc kubenswrapper[4823]: I1216 07:19:38.148115 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/17cbb31a-6067-4925-ba57-956baf53ce8b-scripts\") pod \"cinder-api-0\" (UID: \"17cbb31a-6067-4925-ba57-956baf53ce8b\") " pod="openstack/cinder-api-0" Dec 16 07:19:38 crc kubenswrapper[4823]: I1216 07:19:38.148169 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/27f18cac-1d41-44f1-b1b1-81cd65e8162a-log-httpd\") pod \"ceilometer-0\" (UID: \"27f18cac-1d41-44f1-b1b1-81cd65e8162a\") " pod="openstack/ceilometer-0" Dec 16 07:19:38 crc kubenswrapper[4823]: I1216 07:19:38.148221 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/27f18cac-1d41-44f1-b1b1-81cd65e8162a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"27f18cac-1d41-44f1-b1b1-81cd65e8162a\") " pod="openstack/ceilometer-0" Dec 16 07:19:38 crc kubenswrapper[4823]: I1216 07:19:38.148245 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27f18cac-1d41-44f1-b1b1-81cd65e8162a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"27f18cac-1d41-44f1-b1b1-81cd65e8162a\") " pod="openstack/ceilometer-0" Dec 16 07:19:38 crc kubenswrapper[4823]: I1216 07:19:38.148284 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" 
(UniqueName: \"kubernetes.io/host-path/17cbb31a-6067-4925-ba57-956baf53ce8b-etc-machine-id\") pod \"cinder-api-0\" (UID: \"17cbb31a-6067-4925-ba57-956baf53ce8b\") " pod="openstack/cinder-api-0" Dec 16 07:19:38 crc kubenswrapper[4823]: I1216 07:19:38.148322 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27f18cac-1d41-44f1-b1b1-81cd65e8162a-config-data\") pod \"ceilometer-0\" (UID: \"27f18cac-1d41-44f1-b1b1-81cd65e8162a\") " pod="openstack/ceilometer-0" Dec 16 07:19:38 crc kubenswrapper[4823]: I1216 07:19:38.148355 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qx8gx\" (UniqueName: \"kubernetes.io/projected/5df51999-222a-4ef1-a776-5b2c16270039-kube-api-access-qx8gx\") pod \"nova-api-db-create-zw7xm\" (UID: \"5df51999-222a-4ef1-a776-5b2c16270039\") " pod="openstack/nova-api-db-create-zw7xm" Dec 16 07:19:38 crc kubenswrapper[4823]: I1216 07:19:38.148390 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/17cbb31a-6067-4925-ba57-956baf53ce8b-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"17cbb31a-6067-4925-ba57-956baf53ce8b\") " pod="openstack/cinder-api-0" Dec 16 07:19:38 crc kubenswrapper[4823]: I1216 07:19:38.148410 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/27f18cac-1d41-44f1-b1b1-81cd65e8162a-scripts\") pod \"ceilometer-0\" (UID: \"27f18cac-1d41-44f1-b1b1-81cd65e8162a\") " pod="openstack/ceilometer-0" Dec 16 07:19:38 crc kubenswrapper[4823]: I1216 07:19:38.148426 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/17cbb31a-6067-4925-ba57-956baf53ce8b-etc-machine-id\") pod \"cinder-api-0\" (UID: \"17cbb31a-6067-4925-ba57-956baf53ce8b\") " 
pod="openstack/cinder-api-0" Dec 16 07:19:38 crc kubenswrapper[4823]: I1216 07:19:38.148443 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vtk9t\" (UniqueName: \"kubernetes.io/projected/17cbb31a-6067-4925-ba57-956baf53ce8b-kube-api-access-vtk9t\") pod \"cinder-api-0\" (UID: \"17cbb31a-6067-4925-ba57-956baf53ce8b\") " pod="openstack/cinder-api-0" Dec 16 07:19:38 crc kubenswrapper[4823]: I1216 07:19:38.148467 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/17cbb31a-6067-4925-ba57-956baf53ce8b-config-data-custom\") pod \"cinder-api-0\" (UID: \"17cbb31a-6067-4925-ba57-956baf53ce8b\") " pod="openstack/cinder-api-0" Dec 16 07:19:38 crc kubenswrapper[4823]: I1216 07:19:38.148533 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5df51999-222a-4ef1-a776-5b2c16270039-operator-scripts\") pod \"nova-api-db-create-zw7xm\" (UID: \"5df51999-222a-4ef1-a776-5b2c16270039\") " pod="openstack/nova-api-db-create-zw7xm" Dec 16 07:19:38 crc kubenswrapper[4823]: I1216 07:19:38.148564 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/27f18cac-1d41-44f1-b1b1-81cd65e8162a-run-httpd\") pod \"ceilometer-0\" (UID: \"27f18cac-1d41-44f1-b1b1-81cd65e8162a\") " pod="openstack/ceilometer-0" Dec 16 07:19:38 crc kubenswrapper[4823]: I1216 07:19:38.148595 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/17cbb31a-6067-4925-ba57-956baf53ce8b-public-tls-certs\") pod \"cinder-api-0\" (UID: \"17cbb31a-6067-4925-ba57-956baf53ce8b\") " pod="openstack/cinder-api-0" Dec 16 07:19:38 crc kubenswrapper[4823]: I1216 07:19:38.148616 4823 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-274dh\" (UniqueName: \"kubernetes.io/projected/27f18cac-1d41-44f1-b1b1-81cd65e8162a-kube-api-access-274dh\") pod \"ceilometer-0\" (UID: \"27f18cac-1d41-44f1-b1b1-81cd65e8162a\") " pod="openstack/ceilometer-0" Dec 16 07:19:38 crc kubenswrapper[4823]: I1216 07:19:38.148637 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17cbb31a-6067-4925-ba57-956baf53ce8b-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"17cbb31a-6067-4925-ba57-956baf53ce8b\") " pod="openstack/cinder-api-0" Dec 16 07:19:38 crc kubenswrapper[4823]: I1216 07:19:38.148674 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lt48m\" (UniqueName: \"kubernetes.io/projected/02ea4a50-20c1-4954-8438-520ce44b72a4-kube-api-access-lt48m\") pod \"nova-cell0-db-create-f74gj\" (UID: \"02ea4a50-20c1-4954-8438-520ce44b72a4\") " pod="openstack/nova-cell0-db-create-f74gj" Dec 16 07:19:38 crc kubenswrapper[4823]: I1216 07:19:38.148707 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/02ea4a50-20c1-4954-8438-520ce44b72a4-operator-scripts\") pod \"nova-cell0-db-create-f74gj\" (UID: \"02ea4a50-20c1-4954-8438-520ce44b72a4\") " pod="openstack/nova-cell0-db-create-f74gj" Dec 16 07:19:38 crc kubenswrapper[4823]: I1216 07:19:38.148750 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17cbb31a-6067-4925-ba57-956baf53ce8b-config-data\") pod \"cinder-api-0\" (UID: \"17cbb31a-6067-4925-ba57-956baf53ce8b\") " pod="openstack/cinder-api-0" Dec 16 07:19:38 crc kubenswrapper[4823]: I1216 07:19:38.148773 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/17cbb31a-6067-4925-ba57-956baf53ce8b-logs\") pod \"cinder-api-0\" (UID: \"17cbb31a-6067-4925-ba57-956baf53ce8b\") " pod="openstack/cinder-api-0" Dec 16 07:19:38 crc kubenswrapper[4823]: I1216 07:19:38.148861 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/27f18cac-1d41-44f1-b1b1-81cd65e8162a-log-httpd\") pod \"ceilometer-0\" (UID: \"27f18cac-1d41-44f1-b1b1-81cd65e8162a\") " pod="openstack/ceilometer-0" Dec 16 07:19:38 crc kubenswrapper[4823]: I1216 07:19:38.149237 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/17cbb31a-6067-4925-ba57-956baf53ce8b-logs\") pod \"cinder-api-0\" (UID: \"17cbb31a-6067-4925-ba57-956baf53ce8b\") " pod="openstack/cinder-api-0" Dec 16 07:19:38 crc kubenswrapper[4823]: I1216 07:19:38.154940 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/27f18cac-1d41-44f1-b1b1-81cd65e8162a-run-httpd\") pod \"ceilometer-0\" (UID: \"27f18cac-1d41-44f1-b1b1-81cd65e8162a\") " pod="openstack/ceilometer-0" Dec 16 07:19:38 crc kubenswrapper[4823]: I1216 07:19:38.157232 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/27f18cac-1d41-44f1-b1b1-81cd65e8162a-scripts\") pod \"ceilometer-0\" (UID: \"27f18cac-1d41-44f1-b1b1-81cd65e8162a\") " pod="openstack/ceilometer-0" Dec 16 07:19:38 crc kubenswrapper[4823]: I1216 07:19:38.157942 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/17cbb31a-6067-4925-ba57-956baf53ce8b-scripts\") pod \"cinder-api-0\" (UID: \"17cbb31a-6067-4925-ba57-956baf53ce8b\") " pod="openstack/cinder-api-0" Dec 16 07:19:38 crc kubenswrapper[4823]: I1216 07:19:38.158148 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/27f18cac-1d41-44f1-b1b1-81cd65e8162a-config-data\") pod \"ceilometer-0\" (UID: \"27f18cac-1d41-44f1-b1b1-81cd65e8162a\") " pod="openstack/ceilometer-0" Dec 16 07:19:38 crc kubenswrapper[4823]: I1216 07:19:38.160416 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/17cbb31a-6067-4925-ba57-956baf53ce8b-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"17cbb31a-6067-4925-ba57-956baf53ce8b\") " pod="openstack/cinder-api-0" Dec 16 07:19:38 crc kubenswrapper[4823]: I1216 07:19:38.161120 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/17cbb31a-6067-4925-ba57-956baf53ce8b-public-tls-certs\") pod \"cinder-api-0\" (UID: \"17cbb31a-6067-4925-ba57-956baf53ce8b\") " pod="openstack/cinder-api-0" Dec 16 07:19:38 crc kubenswrapper[4823]: I1216 07:19:38.161969 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/27f18cac-1d41-44f1-b1b1-81cd65e8162a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"27f18cac-1d41-44f1-b1b1-81cd65e8162a\") " pod="openstack/ceilometer-0" Dec 16 07:19:38 crc kubenswrapper[4823]: I1216 07:19:38.170764 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27f18cac-1d41-44f1-b1b1-81cd65e8162a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"27f18cac-1d41-44f1-b1b1-81cd65e8162a\") " pod="openstack/ceilometer-0" Dec 16 07:19:38 crc kubenswrapper[4823]: I1216 07:19:38.170945 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17cbb31a-6067-4925-ba57-956baf53ce8b-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"17cbb31a-6067-4925-ba57-956baf53ce8b\") " pod="openstack/cinder-api-0" Dec 16 07:19:38 crc kubenswrapper[4823]: I1216 
07:19:38.171114 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17cbb31a-6067-4925-ba57-956baf53ce8b-config-data\") pod \"cinder-api-0\" (UID: \"17cbb31a-6067-4925-ba57-956baf53ce8b\") " pod="openstack/cinder-api-0" Dec 16 07:19:38 crc kubenswrapper[4823]: I1216 07:19:38.172794 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/17cbb31a-6067-4925-ba57-956baf53ce8b-config-data-custom\") pod \"cinder-api-0\" (UID: \"17cbb31a-6067-4925-ba57-956baf53ce8b\") " pod="openstack/cinder-api-0" Dec 16 07:19:38 crc kubenswrapper[4823]: I1216 07:19:38.173746 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-274dh\" (UniqueName: \"kubernetes.io/projected/27f18cac-1d41-44f1-b1b1-81cd65e8162a-kube-api-access-274dh\") pod \"ceilometer-0\" (UID: \"27f18cac-1d41-44f1-b1b1-81cd65e8162a\") " pod="openstack/ceilometer-0" Dec 16 07:19:38 crc kubenswrapper[4823]: I1216 07:19:38.184077 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-6c77-account-create-update-dktr4"] Dec 16 07:19:38 crc kubenswrapper[4823]: I1216 07:19:38.184251 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtk9t\" (UniqueName: \"kubernetes.io/projected/17cbb31a-6067-4925-ba57-956baf53ce8b-kube-api-access-vtk9t\") pod \"cinder-api-0\" (UID: \"17cbb31a-6067-4925-ba57-956baf53ce8b\") " pod="openstack/cinder-api-0" Dec 16 07:19:38 crc kubenswrapper[4823]: I1216 07:19:38.185166 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-6c77-account-create-update-dktr4" Dec 16 07:19:38 crc kubenswrapper[4823]: I1216 07:19:38.188285 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Dec 16 07:19:38 crc kubenswrapper[4823]: I1216 07:19:38.233361 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-ms77f"] Dec 16 07:19:38 crc kubenswrapper[4823]: I1216 07:19:38.237289 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-ms77f" Dec 16 07:19:38 crc kubenswrapper[4823]: I1216 07:19:38.250642 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5df51999-222a-4ef1-a776-5b2c16270039-operator-scripts\") pod \"nova-api-db-create-zw7xm\" (UID: \"5df51999-222a-4ef1-a776-5b2c16270039\") " pod="openstack/nova-api-db-create-zw7xm" Dec 16 07:19:38 crc kubenswrapper[4823]: I1216 07:19:38.250715 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgxm6\" (UniqueName: \"kubernetes.io/projected/3c508895-4490-426b-95d4-47b5a2e871e9-kube-api-access-sgxm6\") pod \"nova-api-c1ba-account-create-update-br8dd\" (UID: \"3c508895-4490-426b-95d4-47b5a2e871e9\") " pod="openstack/nova-api-c1ba-account-create-update-br8dd" Dec 16 07:19:38 crc kubenswrapper[4823]: I1216 07:19:38.250737 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lb7rx\" (UniqueName: \"kubernetes.io/projected/a5a702e1-b24e-4d21-b56a-1e5ec5145565-kube-api-access-lb7rx\") pod \"nova-cell0-6c77-account-create-update-dktr4\" (UID: \"a5a702e1-b24e-4d21-b56a-1e5ec5145565\") " pod="openstack/nova-cell0-6c77-account-create-update-dktr4" Dec 16 07:19:38 crc kubenswrapper[4823]: I1216 07:19:38.250757 4823 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-lt48m\" (UniqueName: \"kubernetes.io/projected/02ea4a50-20c1-4954-8438-520ce44b72a4-kube-api-access-lt48m\") pod \"nova-cell0-db-create-f74gj\" (UID: \"02ea4a50-20c1-4954-8438-520ce44b72a4\") " pod="openstack/nova-cell0-db-create-f74gj" Dec 16 07:19:38 crc kubenswrapper[4823]: I1216 07:19:38.250780 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/02ea4a50-20c1-4954-8438-520ce44b72a4-operator-scripts\") pod \"nova-cell0-db-create-f74gj\" (UID: \"02ea4a50-20c1-4954-8438-520ce44b72a4\") " pod="openstack/nova-cell0-db-create-f74gj" Dec 16 07:19:38 crc kubenswrapper[4823]: I1216 07:19:38.250857 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a5a702e1-b24e-4d21-b56a-1e5ec5145565-operator-scripts\") pod \"nova-cell0-6c77-account-create-update-dktr4\" (UID: \"a5a702e1-b24e-4d21-b56a-1e5ec5145565\") " pod="openstack/nova-cell0-6c77-account-create-update-dktr4" Dec 16 07:19:38 crc kubenswrapper[4823]: I1216 07:19:38.250903 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c508895-4490-426b-95d4-47b5a2e871e9-operator-scripts\") pod \"nova-api-c1ba-account-create-update-br8dd\" (UID: \"3c508895-4490-426b-95d4-47b5a2e871e9\") " pod="openstack/nova-api-c1ba-account-create-update-br8dd" Dec 16 07:19:38 crc kubenswrapper[4823]: I1216 07:19:38.250936 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qx8gx\" (UniqueName: \"kubernetes.io/projected/5df51999-222a-4ef1-a776-5b2c16270039-kube-api-access-qx8gx\") pod \"nova-api-db-create-zw7xm\" (UID: \"5df51999-222a-4ef1-a776-5b2c16270039\") " pod="openstack/nova-api-db-create-zw7xm" Dec 16 07:19:38 crc 
kubenswrapper[4823]: I1216 07:19:38.251631 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5df51999-222a-4ef1-a776-5b2c16270039-operator-scripts\") pod \"nova-api-db-create-zw7xm\" (UID: \"5df51999-222a-4ef1-a776-5b2c16270039\") " pod="openstack/nova-api-db-create-zw7xm" Dec 16 07:19:38 crc kubenswrapper[4823]: I1216 07:19:38.260225 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-ms77f"] Dec 16 07:19:38 crc kubenswrapper[4823]: I1216 07:19:38.269879 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 16 07:19:38 crc kubenswrapper[4823]: I1216 07:19:38.273833 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qx8gx\" (UniqueName: \"kubernetes.io/projected/5df51999-222a-4ef1-a776-5b2c16270039-kube-api-access-qx8gx\") pod \"nova-api-db-create-zw7xm\" (UID: \"5df51999-222a-4ef1-a776-5b2c16270039\") " pod="openstack/nova-api-db-create-zw7xm" Dec 16 07:19:38 crc kubenswrapper[4823]: I1216 07:19:38.277611 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/02ea4a50-20c1-4954-8438-520ce44b72a4-operator-scripts\") pod \"nova-cell0-db-create-f74gj\" (UID: \"02ea4a50-20c1-4954-8438-520ce44b72a4\") " pod="openstack/nova-cell0-db-create-f74gj" Dec 16 07:19:38 crc kubenswrapper[4823]: I1216 07:19:38.279391 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lt48m\" (UniqueName: \"kubernetes.io/projected/02ea4a50-20c1-4954-8438-520ce44b72a4-kube-api-access-lt48m\") pod \"nova-cell0-db-create-f74gj\" (UID: \"02ea4a50-20c1-4954-8438-520ce44b72a4\") " pod="openstack/nova-cell0-db-create-f74gj" Dec 16 07:19:38 crc kubenswrapper[4823]: I1216 07:19:38.296677 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/nova-cell0-6c77-account-create-update-dktr4"] Dec 16 07:19:38 crc kubenswrapper[4823]: I1216 07:19:38.309967 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 16 07:19:38 crc kubenswrapper[4823]: I1216 07:19:38.331705 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-zw7xm" Dec 16 07:19:38 crc kubenswrapper[4823]: I1216 07:19:38.343115 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 16 07:19:38 crc kubenswrapper[4823]: I1216 07:19:38.343482 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="efa3cd8b-aa5f-4769-a8aa-801716fa389c" containerName="glance-log" containerID="cri-o://83a3257ecbd5e248b7007b2c0b4e4b4f18d9a35aa4a2da2baaa67699c0eaf10a" gracePeriod=30 Dec 16 07:19:38 crc kubenswrapper[4823]: I1216 07:19:38.344043 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="efa3cd8b-aa5f-4769-a8aa-801716fa389c" containerName="glance-httpd" containerID="cri-o://073a08c446fb9875f9f26912e0877ed5083484ef1b445236e4f1bb03cdf07728" gracePeriod=30 Dec 16 07:19:38 crc kubenswrapper[4823]: I1216 07:19:38.354257 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sgxm6\" (UniqueName: \"kubernetes.io/projected/3c508895-4490-426b-95d4-47b5a2e871e9-kube-api-access-sgxm6\") pod \"nova-api-c1ba-account-create-update-br8dd\" (UID: \"3c508895-4490-426b-95d4-47b5a2e871e9\") " pod="openstack/nova-api-c1ba-account-create-update-br8dd" Dec 16 07:19:38 crc kubenswrapper[4823]: I1216 07:19:38.354297 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lb7rx\" (UniqueName: \"kubernetes.io/projected/a5a702e1-b24e-4d21-b56a-1e5ec5145565-kube-api-access-lb7rx\") pod 
\"nova-cell0-6c77-account-create-update-dktr4\" (UID: \"a5a702e1-b24e-4d21-b56a-1e5ec5145565\") " pod="openstack/nova-cell0-6c77-account-create-update-dktr4" Dec 16 07:19:38 crc kubenswrapper[4823]: I1216 07:19:38.354383 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/23090877-6b52-4bf9-8272-0a3146fb5e70-operator-scripts\") pod \"nova-cell1-db-create-ms77f\" (UID: \"23090877-6b52-4bf9-8272-0a3146fb5e70\") " pod="openstack/nova-cell1-db-create-ms77f" Dec 16 07:19:38 crc kubenswrapper[4823]: I1216 07:19:38.354438 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a5a702e1-b24e-4d21-b56a-1e5ec5145565-operator-scripts\") pod \"nova-cell0-6c77-account-create-update-dktr4\" (UID: \"a5a702e1-b24e-4d21-b56a-1e5ec5145565\") " pod="openstack/nova-cell0-6c77-account-create-update-dktr4" Dec 16 07:19:38 crc kubenswrapper[4823]: I1216 07:19:38.354473 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbxrn\" (UniqueName: \"kubernetes.io/projected/23090877-6b52-4bf9-8272-0a3146fb5e70-kube-api-access-rbxrn\") pod \"nova-cell1-db-create-ms77f\" (UID: \"23090877-6b52-4bf9-8272-0a3146fb5e70\") " pod="openstack/nova-cell1-db-create-ms77f" Dec 16 07:19:38 crc kubenswrapper[4823]: I1216 07:19:38.354523 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c508895-4490-426b-95d4-47b5a2e871e9-operator-scripts\") pod \"nova-api-c1ba-account-create-update-br8dd\" (UID: \"3c508895-4490-426b-95d4-47b5a2e871e9\") " pod="openstack/nova-api-c1ba-account-create-update-br8dd" Dec 16 07:19:38 crc kubenswrapper[4823]: I1216 07:19:38.355270 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/3c508895-4490-426b-95d4-47b5a2e871e9-operator-scripts\") pod \"nova-api-c1ba-account-create-update-br8dd\" (UID: \"3c508895-4490-426b-95d4-47b5a2e871e9\") " pod="openstack/nova-api-c1ba-account-create-update-br8dd" Dec 16 07:19:38 crc kubenswrapper[4823]: I1216 07:19:38.356405 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a5a702e1-b24e-4d21-b56a-1e5ec5145565-operator-scripts\") pod \"nova-cell0-6c77-account-create-update-dktr4\" (UID: \"a5a702e1-b24e-4d21-b56a-1e5ec5145565\") " pod="openstack/nova-cell0-6c77-account-create-update-dktr4" Dec 16 07:19:38 crc kubenswrapper[4823]: I1216 07:19:38.356458 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-15f7-account-create-update-tdlw5"] Dec 16 07:19:38 crc kubenswrapper[4823]: I1216 07:19:38.357826 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-15f7-account-create-update-tdlw5" Dec 16 07:19:38 crc kubenswrapper[4823]: I1216 07:19:38.361358 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Dec 16 07:19:38 crc kubenswrapper[4823]: I1216 07:19:38.370830 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-15f7-account-create-update-tdlw5"] Dec 16 07:19:38 crc kubenswrapper[4823]: I1216 07:19:38.379557 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lb7rx\" (UniqueName: \"kubernetes.io/projected/a5a702e1-b24e-4d21-b56a-1e5ec5145565-kube-api-access-lb7rx\") pod \"nova-cell0-6c77-account-create-update-dktr4\" (UID: \"a5a702e1-b24e-4d21-b56a-1e5ec5145565\") " pod="openstack/nova-cell0-6c77-account-create-update-dktr4" Dec 16 07:19:38 crc kubenswrapper[4823]: I1216 07:19:38.388167 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgxm6\" (UniqueName: 
\"kubernetes.io/projected/3c508895-4490-426b-95d4-47b5a2e871e9-kube-api-access-sgxm6\") pod \"nova-api-c1ba-account-create-update-br8dd\" (UID: \"3c508895-4490-426b-95d4-47b5a2e871e9\") " pod="openstack/nova-api-c1ba-account-create-update-br8dd" Dec 16 07:19:38 crc kubenswrapper[4823]: I1216 07:19:38.397921 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-6c77-account-create-update-dktr4" Dec 16 07:19:38 crc kubenswrapper[4823]: I1216 07:19:38.401539 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-f74gj" Dec 16 07:19:38 crc kubenswrapper[4823]: I1216 07:19:38.444443 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-c1ba-account-create-update-br8dd" Dec 16 07:19:38 crc kubenswrapper[4823]: I1216 07:19:38.456281 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbxrn\" (UniqueName: \"kubernetes.io/projected/23090877-6b52-4bf9-8272-0a3146fb5e70-kube-api-access-rbxrn\") pod \"nova-cell1-db-create-ms77f\" (UID: \"23090877-6b52-4bf9-8272-0a3146fb5e70\") " pod="openstack/nova-cell1-db-create-ms77f" Dec 16 07:19:38 crc kubenswrapper[4823]: I1216 07:19:38.456349 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwq4c\" (UniqueName: \"kubernetes.io/projected/119de702-bd92-49d3-8bef-ba0fd81637c2-kube-api-access-kwq4c\") pod \"nova-cell1-15f7-account-create-update-tdlw5\" (UID: \"119de702-bd92-49d3-8bef-ba0fd81637c2\") " pod="openstack/nova-cell1-15f7-account-create-update-tdlw5" Dec 16 07:19:38 crc kubenswrapper[4823]: I1216 07:19:38.456478 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/23090877-6b52-4bf9-8272-0a3146fb5e70-operator-scripts\") pod \"nova-cell1-db-create-ms77f\" (UID: 
\"23090877-6b52-4bf9-8272-0a3146fb5e70\") " pod="openstack/nova-cell1-db-create-ms77f" Dec 16 07:19:38 crc kubenswrapper[4823]: I1216 07:19:38.456496 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/119de702-bd92-49d3-8bef-ba0fd81637c2-operator-scripts\") pod \"nova-cell1-15f7-account-create-update-tdlw5\" (UID: \"119de702-bd92-49d3-8bef-ba0fd81637c2\") " pod="openstack/nova-cell1-15f7-account-create-update-tdlw5" Dec 16 07:19:38 crc kubenswrapper[4823]: I1216 07:19:38.457611 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/23090877-6b52-4bf9-8272-0a3146fb5e70-operator-scripts\") pod \"nova-cell1-db-create-ms77f\" (UID: \"23090877-6b52-4bf9-8272-0a3146fb5e70\") " pod="openstack/nova-cell1-db-create-ms77f" Dec 16 07:19:38 crc kubenswrapper[4823]: I1216 07:19:38.478678 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbxrn\" (UniqueName: \"kubernetes.io/projected/23090877-6b52-4bf9-8272-0a3146fb5e70-kube-api-access-rbxrn\") pod \"nova-cell1-db-create-ms77f\" (UID: \"23090877-6b52-4bf9-8272-0a3146fb5e70\") " pod="openstack/nova-cell1-db-create-ms77f" Dec 16 07:19:38 crc kubenswrapper[4823]: I1216 07:19:38.557850 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/119de702-bd92-49d3-8bef-ba0fd81637c2-operator-scripts\") pod \"nova-cell1-15f7-account-create-update-tdlw5\" (UID: \"119de702-bd92-49d3-8bef-ba0fd81637c2\") " pod="openstack/nova-cell1-15f7-account-create-update-tdlw5" Dec 16 07:19:38 crc kubenswrapper[4823]: I1216 07:19:38.558305 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwq4c\" (UniqueName: \"kubernetes.io/projected/119de702-bd92-49d3-8bef-ba0fd81637c2-kube-api-access-kwq4c\") pod 
\"nova-cell1-15f7-account-create-update-tdlw5\" (UID: \"119de702-bd92-49d3-8bef-ba0fd81637c2\") " pod="openstack/nova-cell1-15f7-account-create-update-tdlw5" Dec 16 07:19:38 crc kubenswrapper[4823]: I1216 07:19:38.559310 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/119de702-bd92-49d3-8bef-ba0fd81637c2-operator-scripts\") pod \"nova-cell1-15f7-account-create-update-tdlw5\" (UID: \"119de702-bd92-49d3-8bef-ba0fd81637c2\") " pod="openstack/nova-cell1-15f7-account-create-update-tdlw5" Dec 16 07:19:38 crc kubenswrapper[4823]: I1216 07:19:38.585620 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwq4c\" (UniqueName: \"kubernetes.io/projected/119de702-bd92-49d3-8bef-ba0fd81637c2-kube-api-access-kwq4c\") pod \"nova-cell1-15f7-account-create-update-tdlw5\" (UID: \"119de702-bd92-49d3-8bef-ba0fd81637c2\") " pod="openstack/nova-cell1-15f7-account-create-update-tdlw5" Dec 16 07:19:38 crc kubenswrapper[4823]: I1216 07:19:38.759634 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-ms77f" Dec 16 07:19:38 crc kubenswrapper[4823]: I1216 07:19:38.777591 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-15f7-account-create-update-tdlw5" Dec 16 07:19:38 crc kubenswrapper[4823]: I1216 07:19:38.875693 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 16 07:19:38 crc kubenswrapper[4823]: I1216 07:19:38.882490 4823 generic.go:334] "Generic (PLEG): container finished" podID="efa3cd8b-aa5f-4769-a8aa-801716fa389c" containerID="83a3257ecbd5e248b7007b2c0b4e4b4f18d9a35aa4a2da2baaa67699c0eaf10a" exitCode=143 Dec 16 07:19:38 crc kubenswrapper[4823]: I1216 07:19:38.882595 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"efa3cd8b-aa5f-4769-a8aa-801716fa389c","Type":"ContainerDied","Data":"83a3257ecbd5e248b7007b2c0b4e4b4f18d9a35aa4a2da2baaa67699c0eaf10a"} Dec 16 07:19:39 crc kubenswrapper[4823]: I1216 07:19:39.100378 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-zw7xm"] Dec 16 07:19:39 crc kubenswrapper[4823]: I1216 07:19:39.291420 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-c1ba-account-create-update-br8dd"] Dec 16 07:19:39 crc kubenswrapper[4823]: I1216 07:19:39.428078 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 16 07:19:39 crc kubenswrapper[4823]: I1216 07:19:39.439694 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-f74gj"] Dec 16 07:19:39 crc kubenswrapper[4823]: I1216 07:19:39.448419 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-6c77-account-create-update-dktr4"] Dec 16 07:19:39 crc kubenswrapper[4823]: I1216 07:19:39.455475 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-75996b444f-cfsnf" Dec 16 07:19:39 crc kubenswrapper[4823]: I1216 07:19:39.462454 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-75996b444f-cfsnf" Dec 
16 07:19:39 crc kubenswrapper[4823]: I1216 07:19:39.509393 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-ms77f"] Dec 16 07:19:39 crc kubenswrapper[4823]: I1216 07:19:39.591472 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-15f7-account-create-update-tdlw5"] Dec 16 07:19:39 crc kubenswrapper[4823]: I1216 07:19:39.798115 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="421bf187-1b90-4633-9cce-3ef3e0387343" path="/var/lib/kubelet/pods/421bf187-1b90-4633-9cce-3ef3e0387343/volumes" Dec 16 07:19:39 crc kubenswrapper[4823]: I1216 07:19:39.800518 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8dc4eee-95c9-4460-84d3-fc8c18a8ab1a" path="/var/lib/kubelet/pods/c8dc4eee-95c9-4460-84d3-fc8c18a8ab1a/volumes" Dec 16 07:19:39 crc kubenswrapper[4823]: I1216 07:19:39.986821 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-6c77-account-create-update-dktr4" event={"ID":"a5a702e1-b24e-4d21-b56a-1e5ec5145565","Type":"ContainerStarted","Data":"ed8da51b1f401f8f4e2d7a7d7452b6f625aac4f2e92ebf487be96e296cef532b"} Dec 16 07:19:39 crc kubenswrapper[4823]: I1216 07:19:39.986870 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-6c77-account-create-update-dktr4" event={"ID":"a5a702e1-b24e-4d21-b56a-1e5ec5145565","Type":"ContainerStarted","Data":"68e1eb7dfa5b448dbd50aac8dc0b393c10fa5cd4662aedd23e1025e9a3f74530"} Dec 16 07:19:39 crc kubenswrapper[4823]: I1216 07:19:39.989732 4823 generic.go:334] "Generic (PLEG): container finished" podID="5df51999-222a-4ef1-a776-5b2c16270039" containerID="7e146bbc79bbe9eb68a312975f35b67d90430fd0786d945860fd8ab6a984eb53" exitCode=0 Dec 16 07:19:39 crc kubenswrapper[4823]: I1216 07:19:39.989804 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-zw7xm" 
event={"ID":"5df51999-222a-4ef1-a776-5b2c16270039","Type":"ContainerDied","Data":"7e146bbc79bbe9eb68a312975f35b67d90430fd0786d945860fd8ab6a984eb53"} Dec 16 07:19:39 crc kubenswrapper[4823]: I1216 07:19:39.989834 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-zw7xm" event={"ID":"5df51999-222a-4ef1-a776-5b2c16270039","Type":"ContainerStarted","Data":"dd51519ca27adfa2a8d67ed5f58976f1781a2c38430dee84462f288e1f37c6a5"} Dec 16 07:19:39 crc kubenswrapper[4823]: I1216 07:19:39.991465 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"17cbb31a-6067-4925-ba57-956baf53ce8b","Type":"ContainerStarted","Data":"7d28597c75c0d7c63dc80fe3f8ba2359f5560ee9f5ba2768db65afd9ec3f19c7"} Dec 16 07:19:39 crc kubenswrapper[4823]: I1216 07:19:39.996054 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"27f18cac-1d41-44f1-b1b1-81cd65e8162a","Type":"ContainerStarted","Data":"ce3aeda0c823fa13b031e1d58c32c8239a4c0c36a73f16fdd4ca6544c4f6065b"} Dec 16 07:19:40 crc kubenswrapper[4823]: I1216 07:19:40.011075 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-f74gj" event={"ID":"02ea4a50-20c1-4954-8438-520ce44b72a4","Type":"ContainerStarted","Data":"16a95f2b7fef319066cab4f1313e0adcc10ee53441d3966332ef54251e1bdd00"} Dec 16 07:19:40 crc kubenswrapper[4823]: I1216 07:19:40.011125 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-f74gj" event={"ID":"02ea4a50-20c1-4954-8438-520ce44b72a4","Type":"ContainerStarted","Data":"1c1cbdd3d307f40f7484a4afbf6c5f1c12adf3f5c80de9e53634f26f97053cf5"} Dec 16 07:19:40 crc kubenswrapper[4823]: I1216 07:19:40.035108 4823 generic.go:334] "Generic (PLEG): container finished" podID="3c508895-4490-426b-95d4-47b5a2e871e9" containerID="77d605fe574dc720c6f8c4e19ea7723f3ce2ff5404d8309c4b773329eab3bced" exitCode=0 Dec 16 07:19:40 crc kubenswrapper[4823]: I1216 
07:19:40.035210 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-c1ba-account-create-update-br8dd" event={"ID":"3c508895-4490-426b-95d4-47b5a2e871e9","Type":"ContainerDied","Data":"77d605fe574dc720c6f8c4e19ea7723f3ce2ff5404d8309c4b773329eab3bced"} Dec 16 07:19:40 crc kubenswrapper[4823]: I1216 07:19:40.035246 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-c1ba-account-create-update-br8dd" event={"ID":"3c508895-4490-426b-95d4-47b5a2e871e9","Type":"ContainerStarted","Data":"2638beb15644520d560dfa13ca3916aa7ff0772a1853513cb7f859f8fcfad3d5"} Dec 16 07:19:40 crc kubenswrapper[4823]: I1216 07:19:40.041364 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-6c77-account-create-update-dktr4" podStartSLOduration=2.041336184 podStartE2EDuration="2.041336184s" podCreationTimestamp="2025-12-16 07:19:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 07:19:40.017612552 +0000 UTC m=+1458.506178675" watchObservedRunningTime="2025-12-16 07:19:40.041336184 +0000 UTC m=+1458.529902317" Dec 16 07:19:40 crc kubenswrapper[4823]: I1216 07:19:40.054381 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-ms77f" event={"ID":"23090877-6b52-4bf9-8272-0a3146fb5e70","Type":"ContainerStarted","Data":"f710f8ddca7bba7b7c064ebf5f57928e26b3e5f08ca8cbef0029d84ebbf3c47d"} Dec 16 07:19:40 crc kubenswrapper[4823]: I1216 07:19:40.057772 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-15f7-account-create-update-tdlw5" event={"ID":"119de702-bd92-49d3-8bef-ba0fd81637c2","Type":"ContainerStarted","Data":"532bf462e0fbaf66e9461842d079d4e81163b5f703c6431f6d87090a0cad20d4"} Dec 16 07:19:40 crc kubenswrapper[4823]: I1216 07:19:40.066232 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/nova-cell0-db-create-f74gj" podStartSLOduration=3.066208413 podStartE2EDuration="3.066208413s" podCreationTimestamp="2025-12-16 07:19:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 07:19:40.049327925 +0000 UTC m=+1458.537894048" watchObservedRunningTime="2025-12-16 07:19:40.066208413 +0000 UTC m=+1458.554774546" Dec 16 07:19:40 crc kubenswrapper[4823]: E1216 07:19:40.411587 4823 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod421bf187_1b90_4633_9cce_3ef3e0387343.slice/crio-conmon-3718e34b1fc21b42d98d69728a16b761c782e2c2e37d66729e379687f6b6da20.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc8dc4eee_95c9_4460_84d3_fc8c18a8ab1a.slice/crio-33edd90111724c4a16cf1c1680a380bb7fd0fd42427c17b0fafdeb431db19842\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podefa3cd8b_aa5f_4769_a8aa_801716fa389c.slice/crio-conmon-83a3257ecbd5e248b7007b2c0b4e4b4f18d9a35aa4a2da2baaa67699c0eaf10a.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod421bf187_1b90_4633_9cce_3ef3e0387343.slice/crio-conmon-420a315d161117af966330608a0704e9cdc7e630c2109694cd4dafa6bf4f0f1d.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc8dc4eee_95c9_4460_84d3_fc8c18a8ab1a.slice/crio-3bbb7d0b518486c9da08c3aa98624cbf02668f6a757f1b21cd693ea4f2ba78eb.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod421bf187_1b90_4633_9cce_3ef3e0387343.slice/crio-3718e34b1fc21b42d98d69728a16b761c782e2c2e37d66729e379687f6b6da20.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podebefd0b6_7523_402f_8952_76a232986c74.slice/crio-3eb704ae25fe49bca8f72c9dba890b0568acbb78194d883488897c4a6cc39dd9.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod421bf187_1b90_4633_9cce_3ef3e0387343.slice/crio-conmon-34d5d59e1e9cf6b86e8a339b71d69ac7e017e6031c70c62ac5f611787a5f2436.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod421bf187_1b90_4633_9cce_3ef3e0387343.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod421bf187_1b90_4633_9cce_3ef3e0387343.slice/crio-conmon-74d6b3f6906669a91552701733b0d73db38ce2772a9816834e6e528c11bbfe95.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc8dc4eee_95c9_4460_84d3_fc8c18a8ab1a.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podefa3cd8b_aa5f_4769_a8aa_801716fa389c.slice/crio-83a3257ecbd5e248b7007b2c0b4e4b4f18d9a35aa4a2da2baaa67699c0eaf10a.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podebefd0b6_7523_402f_8952_76a232986c74.slice/crio-conmon-3eb704ae25fe49bca8f72c9dba890b0568acbb78194d883488897c4a6cc39dd9.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod421bf187_1b90_4633_9cce_3ef3e0387343.slice/crio-34d5d59e1e9cf6b86e8a339b71d69ac7e017e6031c70c62ac5f611787a5f2436.scope\": RecentStats: 
unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod421bf187_1b90_4633_9cce_3ef3e0387343.slice/crio-45b6fd78d4cc27fc4f505fe84854559b90664fc56cf4eea2db97bbae374da5fa\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod421bf187_1b90_4633_9cce_3ef3e0387343.slice/crio-420a315d161117af966330608a0704e9cdc7e630c2109694cd4dafa6bf4f0f1d.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc8dc4eee_95c9_4460_84d3_fc8c18a8ab1a.slice/crio-conmon-3bbb7d0b518486c9da08c3aa98624cbf02668f6a757f1b21cd693ea4f2ba78eb.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod421bf187_1b90_4633_9cce_3ef3e0387343.slice/crio-74d6b3f6906669a91552701733b0d73db38ce2772a9816834e6e528c11bbfe95.scope\": RecentStats: unable to find data in memory cache]" Dec 16 07:19:40 crc kubenswrapper[4823]: I1216 07:19:40.816475 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-765f8bc948-dqt65_ebefd0b6-7523-402f-8952-76a232986c74/neutron-api/0.log" Dec 16 07:19:40 crc kubenswrapper[4823]: I1216 07:19:40.816559 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-765f8bc948-dqt65" Dec 16 07:19:40 crc kubenswrapper[4823]: I1216 07:19:40.927269 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s7dzc\" (UniqueName: \"kubernetes.io/projected/ebefd0b6-7523-402f-8952-76a232986c74-kube-api-access-s7dzc\") pod \"ebefd0b6-7523-402f-8952-76a232986c74\" (UID: \"ebefd0b6-7523-402f-8952-76a232986c74\") " Dec 16 07:19:40 crc kubenswrapper[4823]: I1216 07:19:40.927646 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebefd0b6-7523-402f-8952-76a232986c74-combined-ca-bundle\") pod \"ebefd0b6-7523-402f-8952-76a232986c74\" (UID: \"ebefd0b6-7523-402f-8952-76a232986c74\") " Dec 16 07:19:40 crc kubenswrapper[4823]: I1216 07:19:40.927713 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebefd0b6-7523-402f-8952-76a232986c74-ovndb-tls-certs\") pod \"ebefd0b6-7523-402f-8952-76a232986c74\" (UID: \"ebefd0b6-7523-402f-8952-76a232986c74\") " Dec 16 07:19:40 crc kubenswrapper[4823]: I1216 07:19:40.927830 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ebefd0b6-7523-402f-8952-76a232986c74-config\") pod \"ebefd0b6-7523-402f-8952-76a232986c74\" (UID: \"ebefd0b6-7523-402f-8952-76a232986c74\") " Dec 16 07:19:40 crc kubenswrapper[4823]: I1216 07:19:40.927866 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ebefd0b6-7523-402f-8952-76a232986c74-httpd-config\") pod \"ebefd0b6-7523-402f-8952-76a232986c74\" (UID: \"ebefd0b6-7523-402f-8952-76a232986c74\") " Dec 16 07:19:40 crc kubenswrapper[4823]: I1216 07:19:40.934142 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/ebefd0b6-7523-402f-8952-76a232986c74-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "ebefd0b6-7523-402f-8952-76a232986c74" (UID: "ebefd0b6-7523-402f-8952-76a232986c74"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:19:40 crc kubenswrapper[4823]: I1216 07:19:40.934165 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebefd0b6-7523-402f-8952-76a232986c74-kube-api-access-s7dzc" (OuterVolumeSpecName: "kube-api-access-s7dzc") pod "ebefd0b6-7523-402f-8952-76a232986c74" (UID: "ebefd0b6-7523-402f-8952-76a232986c74"). InnerVolumeSpecName "kube-api-access-s7dzc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:19:41 crc kubenswrapper[4823]: I1216 07:19:41.001779 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebefd0b6-7523-402f-8952-76a232986c74-config" (OuterVolumeSpecName: "config") pod "ebefd0b6-7523-402f-8952-76a232986c74" (UID: "ebefd0b6-7523-402f-8952-76a232986c74"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:19:41 crc kubenswrapper[4823]: I1216 07:19:41.029566 4823 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/ebefd0b6-7523-402f-8952-76a232986c74-config\") on node \"crc\" DevicePath \"\"" Dec 16 07:19:41 crc kubenswrapper[4823]: I1216 07:19:41.029632 4823 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ebefd0b6-7523-402f-8952-76a232986c74-httpd-config\") on node \"crc\" DevicePath \"\"" Dec 16 07:19:41 crc kubenswrapper[4823]: I1216 07:19:41.029643 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s7dzc\" (UniqueName: \"kubernetes.io/projected/ebefd0b6-7523-402f-8952-76a232986c74-kube-api-access-s7dzc\") on node \"crc\" DevicePath \"\"" Dec 16 07:19:41 crc kubenswrapper[4823]: I1216 07:19:41.032435 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebefd0b6-7523-402f-8952-76a232986c74-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ebefd0b6-7523-402f-8952-76a232986c74" (UID: "ebefd0b6-7523-402f-8952-76a232986c74"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:19:41 crc kubenswrapper[4823]: I1216 07:19:41.047354 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebefd0b6-7523-402f-8952-76a232986c74-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "ebefd0b6-7523-402f-8952-76a232986c74" (UID: "ebefd0b6-7523-402f-8952-76a232986c74"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:19:41 crc kubenswrapper[4823]: I1216 07:19:41.082997 4823 generic.go:334] "Generic (PLEG): container finished" podID="02ea4a50-20c1-4954-8438-520ce44b72a4" containerID="16a95f2b7fef319066cab4f1313e0adcc10ee53441d3966332ef54251e1bdd00" exitCode=0 Dec 16 07:19:41 crc kubenswrapper[4823]: I1216 07:19:41.083080 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-f74gj" event={"ID":"02ea4a50-20c1-4954-8438-520ce44b72a4","Type":"ContainerDied","Data":"16a95f2b7fef319066cab4f1313e0adcc10ee53441d3966332ef54251e1bdd00"} Dec 16 07:19:41 crc kubenswrapper[4823]: I1216 07:19:41.088290 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-765f8bc948-dqt65_ebefd0b6-7523-402f-8952-76a232986c74/neutron-api/0.log" Dec 16 07:19:41 crc kubenswrapper[4823]: I1216 07:19:41.088351 4823 generic.go:334] "Generic (PLEG): container finished" podID="ebefd0b6-7523-402f-8952-76a232986c74" containerID="3eb704ae25fe49bca8f72c9dba890b0568acbb78194d883488897c4a6cc39dd9" exitCode=137 Dec 16 07:19:41 crc kubenswrapper[4823]: I1216 07:19:41.088436 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-765f8bc948-dqt65" event={"ID":"ebefd0b6-7523-402f-8952-76a232986c74","Type":"ContainerDied","Data":"3eb704ae25fe49bca8f72c9dba890b0568acbb78194d883488897c4a6cc39dd9"} Dec 16 07:19:41 crc kubenswrapper[4823]: I1216 07:19:41.088468 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-765f8bc948-dqt65" event={"ID":"ebefd0b6-7523-402f-8952-76a232986c74","Type":"ContainerDied","Data":"2520c01c87155de5f3e9ff16bc69a0c668d788a5f4eeb5e9a10be09df6651154"} Dec 16 07:19:41 crc kubenswrapper[4823]: I1216 07:19:41.088489 4823 scope.go:117] "RemoveContainer" containerID="7e1578371b4d6a145919aebf10c8fa0a868a73fb853b572714a607e7dd2c094e" Dec 16 07:19:41 crc kubenswrapper[4823]: I1216 07:19:41.088642 4823 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openstack/neutron-765f8bc948-dqt65" Dec 16 07:19:41 crc kubenswrapper[4823]: I1216 07:19:41.108282 4823 generic.go:334] "Generic (PLEG): container finished" podID="23090877-6b52-4bf9-8272-0a3146fb5e70" containerID="33c1c84e6505bf5e60cb15c74c9530a062509d13ccde74bcf09a73dbf725eeee" exitCode=0 Dec 16 07:19:41 crc kubenswrapper[4823]: I1216 07:19:41.108375 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-ms77f" event={"ID":"23090877-6b52-4bf9-8272-0a3146fb5e70","Type":"ContainerDied","Data":"33c1c84e6505bf5e60cb15c74c9530a062509d13ccde74bcf09a73dbf725eeee"} Dec 16 07:19:41 crc kubenswrapper[4823]: I1216 07:19:41.117820 4823 generic.go:334] "Generic (PLEG): container finished" podID="119de702-bd92-49d3-8bef-ba0fd81637c2" containerID="04cfc7ac2370fcc670b0a4d36151d595decad0d56f5fab56594a90ea3fd9eb05" exitCode=0 Dec 16 07:19:41 crc kubenswrapper[4823]: I1216 07:19:41.117911 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-15f7-account-create-update-tdlw5" event={"ID":"119de702-bd92-49d3-8bef-ba0fd81637c2","Type":"ContainerDied","Data":"04cfc7ac2370fcc670b0a4d36151d595decad0d56f5fab56594a90ea3fd9eb05"} Dec 16 07:19:41 crc kubenswrapper[4823]: I1216 07:19:41.125130 4823 generic.go:334] "Generic (PLEG): container finished" podID="a5a702e1-b24e-4d21-b56a-1e5ec5145565" containerID="ed8da51b1f401f8f4e2d7a7d7452b6f625aac4f2e92ebf487be96e296cef532b" exitCode=0 Dec 16 07:19:41 crc kubenswrapper[4823]: I1216 07:19:41.125214 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-6c77-account-create-update-dktr4" event={"ID":"a5a702e1-b24e-4d21-b56a-1e5ec5145565","Type":"ContainerDied","Data":"ed8da51b1f401f8f4e2d7a7d7452b6f625aac4f2e92ebf487be96e296cef532b"} Dec 16 07:19:41 crc kubenswrapper[4823]: I1216 07:19:41.131664 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ebefd0b6-7523-402f-8952-76a232986c74-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:19:41 crc kubenswrapper[4823]: I1216 07:19:41.131704 4823 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebefd0b6-7523-402f-8952-76a232986c74-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 07:19:41 crc kubenswrapper[4823]: I1216 07:19:41.152573 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"17cbb31a-6067-4925-ba57-956baf53ce8b","Type":"ContainerStarted","Data":"51565ca562af3db8782b4b38fb1d3b09a6c7f19f5c5020ef8e0d0b046831c28d"} Dec 16 07:19:41 crc kubenswrapper[4823]: I1216 07:19:41.161251 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-765f8bc948-dqt65"] Dec 16 07:19:41 crc kubenswrapper[4823]: I1216 07:19:41.164481 4823 scope.go:117] "RemoveContainer" containerID="3eb704ae25fe49bca8f72c9dba890b0568acbb78194d883488897c4a6cc39dd9" Dec 16 07:19:41 crc kubenswrapper[4823]: I1216 07:19:41.166850 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"27f18cac-1d41-44f1-b1b1-81cd65e8162a","Type":"ContainerStarted","Data":"0ce37b9e2d682bff4ed7a343afd322004978b019c106114a737cb0273299956f"} Dec 16 07:19:41 crc kubenswrapper[4823]: I1216 07:19:41.166896 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"27f18cac-1d41-44f1-b1b1-81cd65e8162a","Type":"ContainerStarted","Data":"0554ea0464e696023a5a79ebbbad46e3a5e0f3f41b1854665b94758071ee63c6"} Dec 16 07:19:41 crc kubenswrapper[4823]: I1216 07:19:41.174198 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-765f8bc948-dqt65"] Dec 16 07:19:41 crc kubenswrapper[4823]: I1216 07:19:41.242643 4823 scope.go:117] "RemoveContainer" containerID="7e1578371b4d6a145919aebf10c8fa0a868a73fb853b572714a607e7dd2c094e" Dec 16 07:19:41 crc kubenswrapper[4823]: 
E1216 07:19:41.243254 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e1578371b4d6a145919aebf10c8fa0a868a73fb853b572714a607e7dd2c094e\": container with ID starting with 7e1578371b4d6a145919aebf10c8fa0a868a73fb853b572714a607e7dd2c094e not found: ID does not exist" containerID="7e1578371b4d6a145919aebf10c8fa0a868a73fb853b572714a607e7dd2c094e" Dec 16 07:19:41 crc kubenswrapper[4823]: I1216 07:19:41.243289 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e1578371b4d6a145919aebf10c8fa0a868a73fb853b572714a607e7dd2c094e"} err="failed to get container status \"7e1578371b4d6a145919aebf10c8fa0a868a73fb853b572714a607e7dd2c094e\": rpc error: code = NotFound desc = could not find container \"7e1578371b4d6a145919aebf10c8fa0a868a73fb853b572714a607e7dd2c094e\": container with ID starting with 7e1578371b4d6a145919aebf10c8fa0a868a73fb853b572714a607e7dd2c094e not found: ID does not exist" Dec 16 07:19:41 crc kubenswrapper[4823]: I1216 07:19:41.243325 4823 scope.go:117] "RemoveContainer" containerID="3eb704ae25fe49bca8f72c9dba890b0568acbb78194d883488897c4a6cc39dd9" Dec 16 07:19:41 crc kubenswrapper[4823]: E1216 07:19:41.243968 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3eb704ae25fe49bca8f72c9dba890b0568acbb78194d883488897c4a6cc39dd9\": container with ID starting with 3eb704ae25fe49bca8f72c9dba890b0568acbb78194d883488897c4a6cc39dd9 not found: ID does not exist" containerID="3eb704ae25fe49bca8f72c9dba890b0568acbb78194d883488897c4a6cc39dd9" Dec 16 07:19:41 crc kubenswrapper[4823]: I1216 07:19:41.244012 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3eb704ae25fe49bca8f72c9dba890b0568acbb78194d883488897c4a6cc39dd9"} err="failed to get container status \"3eb704ae25fe49bca8f72c9dba890b0568acbb78194d883488897c4a6cc39dd9\": 
rpc error: code = NotFound desc = could not find container \"3eb704ae25fe49bca8f72c9dba890b0568acbb78194d883488897c4a6cc39dd9\": container with ID starting with 3eb704ae25fe49bca8f72c9dba890b0568acbb78194d883488897c4a6cc39dd9 not found: ID does not exist" Dec 16 07:19:41 crc kubenswrapper[4823]: I1216 07:19:41.545333 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-zw7xm" Dec 16 07:19:41 crc kubenswrapper[4823]: I1216 07:19:41.640830 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5df51999-222a-4ef1-a776-5b2c16270039-operator-scripts\") pod \"5df51999-222a-4ef1-a776-5b2c16270039\" (UID: \"5df51999-222a-4ef1-a776-5b2c16270039\") " Dec 16 07:19:41 crc kubenswrapper[4823]: I1216 07:19:41.641052 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qx8gx\" (UniqueName: \"kubernetes.io/projected/5df51999-222a-4ef1-a776-5b2c16270039-kube-api-access-qx8gx\") pod \"5df51999-222a-4ef1-a776-5b2c16270039\" (UID: \"5df51999-222a-4ef1-a776-5b2c16270039\") " Dec 16 07:19:41 crc kubenswrapper[4823]: I1216 07:19:41.648743 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5df51999-222a-4ef1-a776-5b2c16270039-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5df51999-222a-4ef1-a776-5b2c16270039" (UID: "5df51999-222a-4ef1-a776-5b2c16270039"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:19:41 crc kubenswrapper[4823]: I1216 07:19:41.650258 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5df51999-222a-4ef1-a776-5b2c16270039-kube-api-access-qx8gx" (OuterVolumeSpecName: "kube-api-access-qx8gx") pod "5df51999-222a-4ef1-a776-5b2c16270039" (UID: "5df51999-222a-4ef1-a776-5b2c16270039"). 
InnerVolumeSpecName "kube-api-access-qx8gx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:19:41 crc kubenswrapper[4823]: I1216 07:19:41.708644 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-c1ba-account-create-update-br8dd" Dec 16 07:19:41 crc kubenswrapper[4823]: I1216 07:19:41.743964 4823 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5df51999-222a-4ef1-a776-5b2c16270039-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:19:41 crc kubenswrapper[4823]: I1216 07:19:41.744002 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qx8gx\" (UniqueName: \"kubernetes.io/projected/5df51999-222a-4ef1-a776-5b2c16270039-kube-api-access-qx8gx\") on node \"crc\" DevicePath \"\"" Dec 16 07:19:41 crc kubenswrapper[4823]: I1216 07:19:41.793150 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ebefd0b6-7523-402f-8952-76a232986c74" path="/var/lib/kubelet/pods/ebefd0b6-7523-402f-8952-76a232986c74/volumes" Dec 16 07:19:41 crc kubenswrapper[4823]: I1216 07:19:41.845192 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sgxm6\" (UniqueName: \"kubernetes.io/projected/3c508895-4490-426b-95d4-47b5a2e871e9-kube-api-access-sgxm6\") pod \"3c508895-4490-426b-95d4-47b5a2e871e9\" (UID: \"3c508895-4490-426b-95d4-47b5a2e871e9\") " Dec 16 07:19:41 crc kubenswrapper[4823]: I1216 07:19:41.845305 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c508895-4490-426b-95d4-47b5a2e871e9-operator-scripts\") pod \"3c508895-4490-426b-95d4-47b5a2e871e9\" (UID: \"3c508895-4490-426b-95d4-47b5a2e871e9\") " Dec 16 07:19:41 crc kubenswrapper[4823]: I1216 07:19:41.850410 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/3c508895-4490-426b-95d4-47b5a2e871e9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3c508895-4490-426b-95d4-47b5a2e871e9" (UID: "3c508895-4490-426b-95d4-47b5a2e871e9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:19:41 crc kubenswrapper[4823]: I1216 07:19:41.863860 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c508895-4490-426b-95d4-47b5a2e871e9-kube-api-access-sgxm6" (OuterVolumeSpecName: "kube-api-access-sgxm6") pod "3c508895-4490-426b-95d4-47b5a2e871e9" (UID: "3c508895-4490-426b-95d4-47b5a2e871e9"). InnerVolumeSpecName "kube-api-access-sgxm6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:19:41 crc kubenswrapper[4823]: I1216 07:19:41.866056 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sgxm6\" (UniqueName: \"kubernetes.io/projected/3c508895-4490-426b-95d4-47b5a2e871e9-kube-api-access-sgxm6\") on node \"crc\" DevicePath \"\"" Dec 16 07:19:41 crc kubenswrapper[4823]: I1216 07:19:41.866095 4823 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c508895-4490-426b-95d4-47b5a2e871e9-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:19:42 crc kubenswrapper[4823]: I1216 07:19:42.005911 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 16 07:19:42 crc kubenswrapper[4823]: I1216 07:19:42.006159 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="f85fada8-dce0-4e2b-83f5-ebf4f6fe2c80" containerName="glance-log" containerID="cri-o://0f3ee8c3dad0e8e1137f41d14df6304cc4ea56017bfa0aa42257759226420db4" gracePeriod=30 Dec 16 07:19:42 crc kubenswrapper[4823]: I1216 07:19:42.006446 4823 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/glance-default-internal-api-0" podUID="f85fada8-dce0-4e2b-83f5-ebf4f6fe2c80" containerName="glance-httpd" containerID="cri-o://f7e6a59d65cbff5bad0fdcd293a529173be048f1508f81f77dc92df6f821abe4" gracePeriod=30 Dec 16 07:19:42 crc kubenswrapper[4823]: I1216 07:19:42.182624 4823 generic.go:334] "Generic (PLEG): container finished" podID="f85fada8-dce0-4e2b-83f5-ebf4f6fe2c80" containerID="0f3ee8c3dad0e8e1137f41d14df6304cc4ea56017bfa0aa42257759226420db4" exitCode=143 Dec 16 07:19:42 crc kubenswrapper[4823]: I1216 07:19:42.182950 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f85fada8-dce0-4e2b-83f5-ebf4f6fe2c80","Type":"ContainerDied","Data":"0f3ee8c3dad0e8e1137f41d14df6304cc4ea56017bfa0aa42257759226420db4"} Dec 16 07:19:42 crc kubenswrapper[4823]: I1216 07:19:42.190202 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-zw7xm" Dec 16 07:19:42 crc kubenswrapper[4823]: I1216 07:19:42.190216 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-zw7xm" event={"ID":"5df51999-222a-4ef1-a776-5b2c16270039","Type":"ContainerDied","Data":"dd51519ca27adfa2a8d67ed5f58976f1781a2c38430dee84462f288e1f37c6a5"} Dec 16 07:19:42 crc kubenswrapper[4823]: I1216 07:19:42.190263 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dd51519ca27adfa2a8d67ed5f58976f1781a2c38430dee84462f288e1f37c6a5" Dec 16 07:19:42 crc kubenswrapper[4823]: I1216 07:19:42.192689 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"17cbb31a-6067-4925-ba57-956baf53ce8b","Type":"ContainerStarted","Data":"a2e711057ef9e93e470930a37179c721716096884ec2356c0cc2c2d27a2dddf4"} Dec 16 07:19:42 crc kubenswrapper[4823]: I1216 07:19:42.193756 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 16 07:19:42 crc 
kubenswrapper[4823]: I1216 07:19:42.199086 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-c1ba-account-create-update-br8dd" event={"ID":"3c508895-4490-426b-95d4-47b5a2e871e9","Type":"ContainerDied","Data":"2638beb15644520d560dfa13ca3916aa7ff0772a1853513cb7f859f8fcfad3d5"} Dec 16 07:19:42 crc kubenswrapper[4823]: I1216 07:19:42.199131 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2638beb15644520d560dfa13ca3916aa7ff0772a1853513cb7f859f8fcfad3d5" Dec 16 07:19:42 crc kubenswrapper[4823]: I1216 07:19:42.199212 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-c1ba-account-create-update-br8dd" Dec 16 07:19:42 crc kubenswrapper[4823]: I1216 07:19:42.204253 4823 generic.go:334] "Generic (PLEG): container finished" podID="efa3cd8b-aa5f-4769-a8aa-801716fa389c" containerID="073a08c446fb9875f9f26912e0877ed5083484ef1b445236e4f1bb03cdf07728" exitCode=0 Dec 16 07:19:42 crc kubenswrapper[4823]: I1216 07:19:42.204461 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"efa3cd8b-aa5f-4769-a8aa-801716fa389c","Type":"ContainerDied","Data":"073a08c446fb9875f9f26912e0877ed5083484ef1b445236e4f1bb03cdf07728"} Dec 16 07:19:42 crc kubenswrapper[4823]: I1216 07:19:42.233752 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=5.233733181 podStartE2EDuration="5.233733181s" podCreationTimestamp="2025-12-16 07:19:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 07:19:42.211928828 +0000 UTC m=+1460.700494951" watchObservedRunningTime="2025-12-16 07:19:42.233733181 +0000 UTC m=+1460.722299314" Dec 16 07:19:42 crc kubenswrapper[4823]: I1216 07:19:42.397449 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 16 07:19:42 crc kubenswrapper[4823]: I1216 07:19:42.461623 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 16 07:19:42 crc kubenswrapper[4823]: I1216 07:19:42.481083 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efa3cd8b-aa5f-4769-a8aa-801716fa389c-config-data\") pod \"efa3cd8b-aa5f-4769-a8aa-801716fa389c\" (UID: \"efa3cd8b-aa5f-4769-a8aa-801716fa389c\") " Dec 16 07:19:42 crc kubenswrapper[4823]: I1216 07:19:42.481192 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/efa3cd8b-aa5f-4769-a8aa-801716fa389c-logs\") pod \"efa3cd8b-aa5f-4769-a8aa-801716fa389c\" (UID: \"efa3cd8b-aa5f-4769-a8aa-801716fa389c\") " Dec 16 07:19:42 crc kubenswrapper[4823]: I1216 07:19:42.481272 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/efa3cd8b-aa5f-4769-a8aa-801716fa389c-scripts\") pod \"efa3cd8b-aa5f-4769-a8aa-801716fa389c\" (UID: \"efa3cd8b-aa5f-4769-a8aa-801716fa389c\") " Dec 16 07:19:42 crc kubenswrapper[4823]: I1216 07:19:42.481395 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/efa3cd8b-aa5f-4769-a8aa-801716fa389c-httpd-run\") pod \"efa3cd8b-aa5f-4769-a8aa-801716fa389c\" (UID: \"efa3cd8b-aa5f-4769-a8aa-801716fa389c\") " Dec 16 07:19:42 crc kubenswrapper[4823]: I1216 07:19:42.481448 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/efa3cd8b-aa5f-4769-a8aa-801716fa389c-public-tls-certs\") pod \"efa3cd8b-aa5f-4769-a8aa-801716fa389c\" (UID: \"efa3cd8b-aa5f-4769-a8aa-801716fa389c\") " Dec 16 07:19:42 crc kubenswrapper[4823]: I1216 07:19:42.481483 
4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"efa3cd8b-aa5f-4769-a8aa-801716fa389c\" (UID: \"efa3cd8b-aa5f-4769-a8aa-801716fa389c\") " Dec 16 07:19:42 crc kubenswrapper[4823]: I1216 07:19:42.481528 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efa3cd8b-aa5f-4769-a8aa-801716fa389c-combined-ca-bundle\") pod \"efa3cd8b-aa5f-4769-a8aa-801716fa389c\" (UID: \"efa3cd8b-aa5f-4769-a8aa-801716fa389c\") " Dec 16 07:19:42 crc kubenswrapper[4823]: I1216 07:19:42.481565 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jg5ss\" (UniqueName: \"kubernetes.io/projected/efa3cd8b-aa5f-4769-a8aa-801716fa389c-kube-api-access-jg5ss\") pod \"efa3cd8b-aa5f-4769-a8aa-801716fa389c\" (UID: \"efa3cd8b-aa5f-4769-a8aa-801716fa389c\") " Dec 16 07:19:42 crc kubenswrapper[4823]: I1216 07:19:42.482321 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/efa3cd8b-aa5f-4769-a8aa-801716fa389c-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "efa3cd8b-aa5f-4769-a8aa-801716fa389c" (UID: "efa3cd8b-aa5f-4769-a8aa-801716fa389c"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:19:42 crc kubenswrapper[4823]: I1216 07:19:42.484088 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/efa3cd8b-aa5f-4769-a8aa-801716fa389c-logs" (OuterVolumeSpecName: "logs") pod "efa3cd8b-aa5f-4769-a8aa-801716fa389c" (UID: "efa3cd8b-aa5f-4769-a8aa-801716fa389c"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:19:42 crc kubenswrapper[4823]: I1216 07:19:42.487897 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efa3cd8b-aa5f-4769-a8aa-801716fa389c-scripts" (OuterVolumeSpecName: "scripts") pod "efa3cd8b-aa5f-4769-a8aa-801716fa389c" (UID: "efa3cd8b-aa5f-4769-a8aa-801716fa389c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:19:42 crc kubenswrapper[4823]: I1216 07:19:42.487961 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efa3cd8b-aa5f-4769-a8aa-801716fa389c-kube-api-access-jg5ss" (OuterVolumeSpecName: "kube-api-access-jg5ss") pod "efa3cd8b-aa5f-4769-a8aa-801716fa389c" (UID: "efa3cd8b-aa5f-4769-a8aa-801716fa389c"). InnerVolumeSpecName "kube-api-access-jg5ss". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:19:42 crc kubenswrapper[4823]: I1216 07:19:42.490644 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance") pod "efa3cd8b-aa5f-4769-a8aa-801716fa389c" (UID: "efa3cd8b-aa5f-4769-a8aa-801716fa389c"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 16 07:19:42 crc kubenswrapper[4823]: I1216 07:19:42.552071 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efa3cd8b-aa5f-4769-a8aa-801716fa389c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "efa3cd8b-aa5f-4769-a8aa-801716fa389c" (UID: "efa3cd8b-aa5f-4769-a8aa-801716fa389c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:19:42 crc kubenswrapper[4823]: I1216 07:19:42.584530 4823 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/efa3cd8b-aa5f-4769-a8aa-801716fa389c-logs\") on node \"crc\" DevicePath \"\"" Dec 16 07:19:42 crc kubenswrapper[4823]: I1216 07:19:42.584564 4823 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/efa3cd8b-aa5f-4769-a8aa-801716fa389c-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:19:42 crc kubenswrapper[4823]: I1216 07:19:42.584572 4823 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/efa3cd8b-aa5f-4769-a8aa-801716fa389c-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 16 07:19:42 crc kubenswrapper[4823]: I1216 07:19:42.584601 4823 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Dec 16 07:19:42 crc kubenswrapper[4823]: I1216 07:19:42.584611 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efa3cd8b-aa5f-4769-a8aa-801716fa389c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:19:42 crc kubenswrapper[4823]: I1216 07:19:42.584621 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jg5ss\" (UniqueName: \"kubernetes.io/projected/efa3cd8b-aa5f-4769-a8aa-801716fa389c-kube-api-access-jg5ss\") on node \"crc\" DevicePath \"\"" Dec 16 07:19:42 crc kubenswrapper[4823]: I1216 07:19:42.594919 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efa3cd8b-aa5f-4769-a8aa-801716fa389c-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "efa3cd8b-aa5f-4769-a8aa-801716fa389c" (UID: "efa3cd8b-aa5f-4769-a8aa-801716fa389c"). 
InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:19:42 crc kubenswrapper[4823]: I1216 07:19:42.605604 4823 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Dec 16 07:19:42 crc kubenswrapper[4823]: I1216 07:19:42.609053 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efa3cd8b-aa5f-4769-a8aa-801716fa389c-config-data" (OuterVolumeSpecName: "config-data") pod "efa3cd8b-aa5f-4769-a8aa-801716fa389c" (UID: "efa3cd8b-aa5f-4769-a8aa-801716fa389c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:19:42 crc kubenswrapper[4823]: I1216 07:19:42.660272 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-6c77-account-create-update-dktr4" Dec 16 07:19:42 crc kubenswrapper[4823]: I1216 07:19:42.686144 4823 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/efa3cd8b-aa5f-4769-a8aa-801716fa389c-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 07:19:42 crc kubenswrapper[4823]: I1216 07:19:42.686183 4823 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Dec 16 07:19:42 crc kubenswrapper[4823]: I1216 07:19:42.686196 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efa3cd8b-aa5f-4769-a8aa-801716fa389c-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 07:19:42 crc kubenswrapper[4823]: I1216 07:19:42.787579 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lb7rx\" (UniqueName: \"kubernetes.io/projected/a5a702e1-b24e-4d21-b56a-1e5ec5145565-kube-api-access-lb7rx\") 
pod \"a5a702e1-b24e-4d21-b56a-1e5ec5145565\" (UID: \"a5a702e1-b24e-4d21-b56a-1e5ec5145565\") " Dec 16 07:19:42 crc kubenswrapper[4823]: I1216 07:19:42.787838 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a5a702e1-b24e-4d21-b56a-1e5ec5145565-operator-scripts\") pod \"a5a702e1-b24e-4d21-b56a-1e5ec5145565\" (UID: \"a5a702e1-b24e-4d21-b56a-1e5ec5145565\") " Dec 16 07:19:42 crc kubenswrapper[4823]: I1216 07:19:42.788746 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a5a702e1-b24e-4d21-b56a-1e5ec5145565-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a5a702e1-b24e-4d21-b56a-1e5ec5145565" (UID: "a5a702e1-b24e-4d21-b56a-1e5ec5145565"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:19:42 crc kubenswrapper[4823]: I1216 07:19:42.817255 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5a702e1-b24e-4d21-b56a-1e5ec5145565-kube-api-access-lb7rx" (OuterVolumeSpecName: "kube-api-access-lb7rx") pod "a5a702e1-b24e-4d21-b56a-1e5ec5145565" (UID: "a5a702e1-b24e-4d21-b56a-1e5ec5145565"). InnerVolumeSpecName "kube-api-access-lb7rx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:19:42 crc kubenswrapper[4823]: I1216 07:19:42.891431 4823 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a5a702e1-b24e-4d21-b56a-1e5ec5145565-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:19:42 crc kubenswrapper[4823]: I1216 07:19:42.891473 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lb7rx\" (UniqueName: \"kubernetes.io/projected/a5a702e1-b24e-4d21-b56a-1e5ec5145565-kube-api-access-lb7rx\") on node \"crc\" DevicePath \"\"" Dec 16 07:19:42 crc kubenswrapper[4823]: I1216 07:19:42.989125 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-ms77f" Dec 16 07:19:43 crc kubenswrapper[4823]: I1216 07:19:43.024603 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-f74gj" Dec 16 07:19:43 crc kubenswrapper[4823]: I1216 07:19:43.026947 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-15f7-account-create-update-tdlw5" Dec 16 07:19:43 crc kubenswrapper[4823]: I1216 07:19:43.095117 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rbxrn\" (UniqueName: \"kubernetes.io/projected/23090877-6b52-4bf9-8272-0a3146fb5e70-kube-api-access-rbxrn\") pod \"23090877-6b52-4bf9-8272-0a3146fb5e70\" (UID: \"23090877-6b52-4bf9-8272-0a3146fb5e70\") " Dec 16 07:19:43 crc kubenswrapper[4823]: I1216 07:19:43.095216 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kwq4c\" (UniqueName: \"kubernetes.io/projected/119de702-bd92-49d3-8bef-ba0fd81637c2-kube-api-access-kwq4c\") pod \"119de702-bd92-49d3-8bef-ba0fd81637c2\" (UID: \"119de702-bd92-49d3-8bef-ba0fd81637c2\") " Dec 16 07:19:43 crc kubenswrapper[4823]: I1216 07:19:43.095305 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/02ea4a50-20c1-4954-8438-520ce44b72a4-operator-scripts\") pod \"02ea4a50-20c1-4954-8438-520ce44b72a4\" (UID: \"02ea4a50-20c1-4954-8438-520ce44b72a4\") " Dec 16 07:19:43 crc kubenswrapper[4823]: I1216 07:19:43.095331 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/23090877-6b52-4bf9-8272-0a3146fb5e70-operator-scripts\") pod \"23090877-6b52-4bf9-8272-0a3146fb5e70\" (UID: \"23090877-6b52-4bf9-8272-0a3146fb5e70\") " Dec 16 07:19:43 crc kubenswrapper[4823]: I1216 07:19:43.095356 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/119de702-bd92-49d3-8bef-ba0fd81637c2-operator-scripts\") pod \"119de702-bd92-49d3-8bef-ba0fd81637c2\" (UID: \"119de702-bd92-49d3-8bef-ba0fd81637c2\") " Dec 16 07:19:43 crc kubenswrapper[4823]: I1216 07:19:43.095489 4823 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-lt48m\" (UniqueName: \"kubernetes.io/projected/02ea4a50-20c1-4954-8438-520ce44b72a4-kube-api-access-lt48m\") pod \"02ea4a50-20c1-4954-8438-520ce44b72a4\" (UID: \"02ea4a50-20c1-4954-8438-520ce44b72a4\") " Dec 16 07:19:43 crc kubenswrapper[4823]: I1216 07:19:43.096586 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/02ea4a50-20c1-4954-8438-520ce44b72a4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "02ea4a50-20c1-4954-8438-520ce44b72a4" (UID: "02ea4a50-20c1-4954-8438-520ce44b72a4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:19:43 crc kubenswrapper[4823]: I1216 07:19:43.100661 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/119de702-bd92-49d3-8bef-ba0fd81637c2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "119de702-bd92-49d3-8bef-ba0fd81637c2" (UID: "119de702-bd92-49d3-8bef-ba0fd81637c2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:19:43 crc kubenswrapper[4823]: I1216 07:19:43.100730 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23090877-6b52-4bf9-8272-0a3146fb5e70-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "23090877-6b52-4bf9-8272-0a3146fb5e70" (UID: "23090877-6b52-4bf9-8272-0a3146fb5e70"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:19:43 crc kubenswrapper[4823]: I1216 07:19:43.100900 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/119de702-bd92-49d3-8bef-ba0fd81637c2-kube-api-access-kwq4c" (OuterVolumeSpecName: "kube-api-access-kwq4c") pod "119de702-bd92-49d3-8bef-ba0fd81637c2" (UID: "119de702-bd92-49d3-8bef-ba0fd81637c2"). 
InnerVolumeSpecName "kube-api-access-kwq4c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:19:43 crc kubenswrapper[4823]: I1216 07:19:43.105132 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02ea4a50-20c1-4954-8438-520ce44b72a4-kube-api-access-lt48m" (OuterVolumeSpecName: "kube-api-access-lt48m") pod "02ea4a50-20c1-4954-8438-520ce44b72a4" (UID: "02ea4a50-20c1-4954-8438-520ce44b72a4"). InnerVolumeSpecName "kube-api-access-lt48m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:19:43 crc kubenswrapper[4823]: I1216 07:19:43.111440 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23090877-6b52-4bf9-8272-0a3146fb5e70-kube-api-access-rbxrn" (OuterVolumeSpecName: "kube-api-access-rbxrn") pod "23090877-6b52-4bf9-8272-0a3146fb5e70" (UID: "23090877-6b52-4bf9-8272-0a3146fb5e70"). InnerVolumeSpecName "kube-api-access-rbxrn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:19:43 crc kubenswrapper[4823]: I1216 07:19:43.197551 4823 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/02ea4a50-20c1-4954-8438-520ce44b72a4-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:19:43 crc kubenswrapper[4823]: I1216 07:19:43.197601 4823 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/23090877-6b52-4bf9-8272-0a3146fb5e70-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:19:43 crc kubenswrapper[4823]: I1216 07:19:43.197614 4823 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/119de702-bd92-49d3-8bef-ba0fd81637c2-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:19:43 crc kubenswrapper[4823]: I1216 07:19:43.197625 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lt48m\" 
(UniqueName: \"kubernetes.io/projected/02ea4a50-20c1-4954-8438-520ce44b72a4-kube-api-access-lt48m\") on node \"crc\" DevicePath \"\"" Dec 16 07:19:43 crc kubenswrapper[4823]: I1216 07:19:43.197638 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rbxrn\" (UniqueName: \"kubernetes.io/projected/23090877-6b52-4bf9-8272-0a3146fb5e70-kube-api-access-rbxrn\") on node \"crc\" DevicePath \"\"" Dec 16 07:19:43 crc kubenswrapper[4823]: I1216 07:19:43.197651 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kwq4c\" (UniqueName: \"kubernetes.io/projected/119de702-bd92-49d3-8bef-ba0fd81637c2-kube-api-access-kwq4c\") on node \"crc\" DevicePath \"\"" Dec 16 07:19:43 crc kubenswrapper[4823]: I1216 07:19:43.223374 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-f74gj" Dec 16 07:19:43 crc kubenswrapper[4823]: I1216 07:19:43.223373 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-f74gj" event={"ID":"02ea4a50-20c1-4954-8438-520ce44b72a4","Type":"ContainerDied","Data":"1c1cbdd3d307f40f7484a4afbf6c5f1c12adf3f5c80de9e53634f26f97053cf5"} Dec 16 07:19:43 crc kubenswrapper[4823]: I1216 07:19:43.223443 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1c1cbdd3d307f40f7484a4afbf6c5f1c12adf3f5c80de9e53634f26f97053cf5" Dec 16 07:19:43 crc kubenswrapper[4823]: I1216 07:19:43.224739 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-ms77f" event={"ID":"23090877-6b52-4bf9-8272-0a3146fb5e70","Type":"ContainerDied","Data":"f710f8ddca7bba7b7c064ebf5f57928e26b3e5f08ca8cbef0029d84ebbf3c47d"} Dec 16 07:19:43 crc kubenswrapper[4823]: I1216 07:19:43.224785 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-ms77f" Dec 16 07:19:43 crc kubenswrapper[4823]: I1216 07:19:43.224819 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f710f8ddca7bba7b7c064ebf5f57928e26b3e5f08ca8cbef0029d84ebbf3c47d" Dec 16 07:19:43 crc kubenswrapper[4823]: I1216 07:19:43.226404 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"efa3cd8b-aa5f-4769-a8aa-801716fa389c","Type":"ContainerDied","Data":"1b8beada80ec38510530fbd0b46a6f089e5e02413dd039fc19808fe329678851"} Dec 16 07:19:43 crc kubenswrapper[4823]: I1216 07:19:43.226442 4823 scope.go:117] "RemoveContainer" containerID="073a08c446fb9875f9f26912e0877ed5083484ef1b445236e4f1bb03cdf07728" Dec 16 07:19:43 crc kubenswrapper[4823]: I1216 07:19:43.226585 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 16 07:19:43 crc kubenswrapper[4823]: I1216 07:19:43.230521 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-15f7-account-create-update-tdlw5" Dec 16 07:19:43 crc kubenswrapper[4823]: I1216 07:19:43.230517 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-15f7-account-create-update-tdlw5" event={"ID":"119de702-bd92-49d3-8bef-ba0fd81637c2","Type":"ContainerDied","Data":"532bf462e0fbaf66e9461842d079d4e81163b5f703c6431f6d87090a0cad20d4"} Dec 16 07:19:43 crc kubenswrapper[4823]: I1216 07:19:43.230682 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="532bf462e0fbaf66e9461842d079d4e81163b5f703c6431f6d87090a0cad20d4" Dec 16 07:19:43 crc kubenswrapper[4823]: I1216 07:19:43.235542 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-6c77-account-create-update-dktr4" Dec 16 07:19:43 crc kubenswrapper[4823]: I1216 07:19:43.235571 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-6c77-account-create-update-dktr4" event={"ID":"a5a702e1-b24e-4d21-b56a-1e5ec5145565","Type":"ContainerDied","Data":"68e1eb7dfa5b448dbd50aac8dc0b393c10fa5cd4662aedd23e1025e9a3f74530"} Dec 16 07:19:43 crc kubenswrapper[4823]: I1216 07:19:43.235618 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="68e1eb7dfa5b448dbd50aac8dc0b393c10fa5cd4662aedd23e1025e9a3f74530" Dec 16 07:19:43 crc kubenswrapper[4823]: I1216 07:19:43.237829 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"27f18cac-1d41-44f1-b1b1-81cd65e8162a","Type":"ContainerStarted","Data":"dabbb904de8166a0f13fd44ee28a44dd660742c437cd4663b17be2496b0e5c52"} Dec 16 07:19:43 crc kubenswrapper[4823]: I1216 07:19:43.259561 4823 scope.go:117] "RemoveContainer" containerID="83a3257ecbd5e248b7007b2c0b4e4b4f18d9a35aa4a2da2baaa67699c0eaf10a" Dec 16 07:19:43 crc kubenswrapper[4823]: I1216 07:19:43.299042 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 16 07:19:43 crc kubenswrapper[4823]: I1216 07:19:43.322304 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 16 07:19:43 crc kubenswrapper[4823]: I1216 07:19:43.334287 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 16 07:19:43 crc kubenswrapper[4823]: E1216 07:19:43.334659 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c508895-4490-426b-95d4-47b5a2e871e9" containerName="mariadb-account-create-update" Dec 16 07:19:43 crc kubenswrapper[4823]: I1216 07:19:43.334678 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c508895-4490-426b-95d4-47b5a2e871e9" 
containerName="mariadb-account-create-update" Dec 16 07:19:43 crc kubenswrapper[4823]: E1216 07:19:43.334695 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efa3cd8b-aa5f-4769-a8aa-801716fa389c" containerName="glance-httpd" Dec 16 07:19:43 crc kubenswrapper[4823]: I1216 07:19:43.334701 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="efa3cd8b-aa5f-4769-a8aa-801716fa389c" containerName="glance-httpd" Dec 16 07:19:43 crc kubenswrapper[4823]: E1216 07:19:43.334712 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02ea4a50-20c1-4954-8438-520ce44b72a4" containerName="mariadb-database-create" Dec 16 07:19:43 crc kubenswrapper[4823]: I1216 07:19:43.334719 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="02ea4a50-20c1-4954-8438-520ce44b72a4" containerName="mariadb-database-create" Dec 16 07:19:43 crc kubenswrapper[4823]: E1216 07:19:43.334732 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="119de702-bd92-49d3-8bef-ba0fd81637c2" containerName="mariadb-account-create-update" Dec 16 07:19:43 crc kubenswrapper[4823]: I1216 07:19:43.334737 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="119de702-bd92-49d3-8bef-ba0fd81637c2" containerName="mariadb-account-create-update" Dec 16 07:19:43 crc kubenswrapper[4823]: E1216 07:19:43.334748 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebefd0b6-7523-402f-8952-76a232986c74" containerName="neutron-api" Dec 16 07:19:43 crc kubenswrapper[4823]: I1216 07:19:43.334754 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebefd0b6-7523-402f-8952-76a232986c74" containerName="neutron-api" Dec 16 07:19:43 crc kubenswrapper[4823]: E1216 07:19:43.334764 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5df51999-222a-4ef1-a776-5b2c16270039" containerName="mariadb-database-create" Dec 16 07:19:43 crc kubenswrapper[4823]: I1216 07:19:43.334770 4823 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="5df51999-222a-4ef1-a776-5b2c16270039" containerName="mariadb-database-create" Dec 16 07:19:43 crc kubenswrapper[4823]: E1216 07:19:43.334779 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efa3cd8b-aa5f-4769-a8aa-801716fa389c" containerName="glance-log" Dec 16 07:19:43 crc kubenswrapper[4823]: I1216 07:19:43.334785 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="efa3cd8b-aa5f-4769-a8aa-801716fa389c" containerName="glance-log" Dec 16 07:19:43 crc kubenswrapper[4823]: E1216 07:19:43.334795 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5a702e1-b24e-4d21-b56a-1e5ec5145565" containerName="mariadb-account-create-update" Dec 16 07:19:43 crc kubenswrapper[4823]: I1216 07:19:43.334803 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5a702e1-b24e-4d21-b56a-1e5ec5145565" containerName="mariadb-account-create-update" Dec 16 07:19:43 crc kubenswrapper[4823]: E1216 07:19:43.334815 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23090877-6b52-4bf9-8272-0a3146fb5e70" containerName="mariadb-database-create" Dec 16 07:19:43 crc kubenswrapper[4823]: I1216 07:19:43.334821 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="23090877-6b52-4bf9-8272-0a3146fb5e70" containerName="mariadb-database-create" Dec 16 07:19:43 crc kubenswrapper[4823]: E1216 07:19:43.334832 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebefd0b6-7523-402f-8952-76a232986c74" containerName="neutron-httpd" Dec 16 07:19:43 crc kubenswrapper[4823]: I1216 07:19:43.334838 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebefd0b6-7523-402f-8952-76a232986c74" containerName="neutron-httpd" Dec 16 07:19:43 crc kubenswrapper[4823]: I1216 07:19:43.335002 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c508895-4490-426b-95d4-47b5a2e871e9" containerName="mariadb-account-create-update" Dec 16 07:19:43 crc kubenswrapper[4823]: I1216 07:19:43.335016 4823 
memory_manager.go:354] "RemoveStaleState removing state" podUID="ebefd0b6-7523-402f-8952-76a232986c74" containerName="neutron-api" Dec 16 07:19:43 crc kubenswrapper[4823]: I1216 07:19:43.335042 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="5df51999-222a-4ef1-a776-5b2c16270039" containerName="mariadb-database-create" Dec 16 07:19:43 crc kubenswrapper[4823]: I1216 07:19:43.335055 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5a702e1-b24e-4d21-b56a-1e5ec5145565" containerName="mariadb-account-create-update" Dec 16 07:19:43 crc kubenswrapper[4823]: I1216 07:19:43.335061 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="efa3cd8b-aa5f-4769-a8aa-801716fa389c" containerName="glance-httpd" Dec 16 07:19:43 crc kubenswrapper[4823]: I1216 07:19:43.335067 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="119de702-bd92-49d3-8bef-ba0fd81637c2" containerName="mariadb-account-create-update" Dec 16 07:19:43 crc kubenswrapper[4823]: I1216 07:19:43.335077 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="efa3cd8b-aa5f-4769-a8aa-801716fa389c" containerName="glance-log" Dec 16 07:19:43 crc kubenswrapper[4823]: I1216 07:19:43.335091 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="23090877-6b52-4bf9-8272-0a3146fb5e70" containerName="mariadb-database-create" Dec 16 07:19:43 crc kubenswrapper[4823]: I1216 07:19:43.335101 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebefd0b6-7523-402f-8952-76a232986c74" containerName="neutron-httpd" Dec 16 07:19:43 crc kubenswrapper[4823]: I1216 07:19:43.335110 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="02ea4a50-20c1-4954-8438-520ce44b72a4" containerName="mariadb-database-create" Dec 16 07:19:43 crc kubenswrapper[4823]: I1216 07:19:43.335959 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 16 07:19:43 crc kubenswrapper[4823]: I1216 07:19:43.339269 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 16 07:19:43 crc kubenswrapper[4823]: I1216 07:19:43.339338 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 16 07:19:43 crc kubenswrapper[4823]: I1216 07:19:43.346733 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 16 07:19:43 crc kubenswrapper[4823]: I1216 07:19:43.402455 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmsn8\" (UniqueName: \"kubernetes.io/projected/dbee1863-ef4e-4d0a-aca7-f7c09e3f0a50-kube-api-access-zmsn8\") pod \"glance-default-external-api-0\" (UID: \"dbee1863-ef4e-4d0a-aca7-f7c09e3f0a50\") " pod="openstack/glance-default-external-api-0" Dec 16 07:19:43 crc kubenswrapper[4823]: I1216 07:19:43.402519 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"dbee1863-ef4e-4d0a-aca7-f7c09e3f0a50\") " pod="openstack/glance-default-external-api-0" Dec 16 07:19:43 crc kubenswrapper[4823]: I1216 07:19:43.402562 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dbee1863-ef4e-4d0a-aca7-f7c09e3f0a50-logs\") pod \"glance-default-external-api-0\" (UID: \"dbee1863-ef4e-4d0a-aca7-f7c09e3f0a50\") " pod="openstack/glance-default-external-api-0" Dec 16 07:19:43 crc kubenswrapper[4823]: I1216 07:19:43.402628 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/dbee1863-ef4e-4d0a-aca7-f7c09e3f0a50-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"dbee1863-ef4e-4d0a-aca7-f7c09e3f0a50\") " pod="openstack/glance-default-external-api-0" Dec 16 07:19:43 crc kubenswrapper[4823]: I1216 07:19:43.402657 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbee1863-ef4e-4d0a-aca7-f7c09e3f0a50-config-data\") pod \"glance-default-external-api-0\" (UID: \"dbee1863-ef4e-4d0a-aca7-f7c09e3f0a50\") " pod="openstack/glance-default-external-api-0" Dec 16 07:19:43 crc kubenswrapper[4823]: I1216 07:19:43.402683 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dbee1863-ef4e-4d0a-aca7-f7c09e3f0a50-scripts\") pod \"glance-default-external-api-0\" (UID: \"dbee1863-ef4e-4d0a-aca7-f7c09e3f0a50\") " pod="openstack/glance-default-external-api-0" Dec 16 07:19:43 crc kubenswrapper[4823]: I1216 07:19:43.402990 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbee1863-ef4e-4d0a-aca7-f7c09e3f0a50-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"dbee1863-ef4e-4d0a-aca7-f7c09e3f0a50\") " pod="openstack/glance-default-external-api-0" Dec 16 07:19:43 crc kubenswrapper[4823]: I1216 07:19:43.403072 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbee1863-ef4e-4d0a-aca7-f7c09e3f0a50-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"dbee1863-ef4e-4d0a-aca7-f7c09e3f0a50\") " pod="openstack/glance-default-external-api-0" Dec 16 07:19:43 crc kubenswrapper[4823]: I1216 07:19:43.505392 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/dbee1863-ef4e-4d0a-aca7-f7c09e3f0a50-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"dbee1863-ef4e-4d0a-aca7-f7c09e3f0a50\") " pod="openstack/glance-default-external-api-0" Dec 16 07:19:43 crc kubenswrapper[4823]: I1216 07:19:43.505505 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zmsn8\" (UniqueName: \"kubernetes.io/projected/dbee1863-ef4e-4d0a-aca7-f7c09e3f0a50-kube-api-access-zmsn8\") pod \"glance-default-external-api-0\" (UID: \"dbee1863-ef4e-4d0a-aca7-f7c09e3f0a50\") " pod="openstack/glance-default-external-api-0" Dec 16 07:19:43 crc kubenswrapper[4823]: I1216 07:19:43.505544 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"dbee1863-ef4e-4d0a-aca7-f7c09e3f0a50\") " pod="openstack/glance-default-external-api-0" Dec 16 07:19:43 crc kubenswrapper[4823]: I1216 07:19:43.505589 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dbee1863-ef4e-4d0a-aca7-f7c09e3f0a50-logs\") pod \"glance-default-external-api-0\" (UID: \"dbee1863-ef4e-4d0a-aca7-f7c09e3f0a50\") " pod="openstack/glance-default-external-api-0" Dec 16 07:19:43 crc kubenswrapper[4823]: I1216 07:19:43.505636 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dbee1863-ef4e-4d0a-aca7-f7c09e3f0a50-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"dbee1863-ef4e-4d0a-aca7-f7c09e3f0a50\") " pod="openstack/glance-default-external-api-0" Dec 16 07:19:43 crc kubenswrapper[4823]: I1216 07:19:43.505662 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbee1863-ef4e-4d0a-aca7-f7c09e3f0a50-config-data\") 
pod \"glance-default-external-api-0\" (UID: \"dbee1863-ef4e-4d0a-aca7-f7c09e3f0a50\") " pod="openstack/glance-default-external-api-0" Dec 16 07:19:43 crc kubenswrapper[4823]: I1216 07:19:43.505681 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dbee1863-ef4e-4d0a-aca7-f7c09e3f0a50-scripts\") pod \"glance-default-external-api-0\" (UID: \"dbee1863-ef4e-4d0a-aca7-f7c09e3f0a50\") " pod="openstack/glance-default-external-api-0" Dec 16 07:19:43 crc kubenswrapper[4823]: I1216 07:19:43.505739 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbee1863-ef4e-4d0a-aca7-f7c09e3f0a50-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"dbee1863-ef4e-4d0a-aca7-f7c09e3f0a50\") " pod="openstack/glance-default-external-api-0" Dec 16 07:19:43 crc kubenswrapper[4823]: I1216 07:19:43.506537 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dbee1863-ef4e-4d0a-aca7-f7c09e3f0a50-logs\") pod \"glance-default-external-api-0\" (UID: \"dbee1863-ef4e-4d0a-aca7-f7c09e3f0a50\") " pod="openstack/glance-default-external-api-0" Dec 16 07:19:43 crc kubenswrapper[4823]: I1216 07:19:43.506768 4823 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"dbee1863-ef4e-4d0a-aca7-f7c09e3f0a50\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/glance-default-external-api-0" Dec 16 07:19:43 crc kubenswrapper[4823]: I1216 07:19:43.510156 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbee1863-ef4e-4d0a-aca7-f7c09e3f0a50-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: 
\"dbee1863-ef4e-4d0a-aca7-f7c09e3f0a50\") " pod="openstack/glance-default-external-api-0" Dec 16 07:19:43 crc kubenswrapper[4823]: I1216 07:19:43.510456 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dbee1863-ef4e-4d0a-aca7-f7c09e3f0a50-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"dbee1863-ef4e-4d0a-aca7-f7c09e3f0a50\") " pod="openstack/glance-default-external-api-0" Dec 16 07:19:43 crc kubenswrapper[4823]: I1216 07:19:43.511554 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbee1863-ef4e-4d0a-aca7-f7c09e3f0a50-config-data\") pod \"glance-default-external-api-0\" (UID: \"dbee1863-ef4e-4d0a-aca7-f7c09e3f0a50\") " pod="openstack/glance-default-external-api-0" Dec 16 07:19:43 crc kubenswrapper[4823]: I1216 07:19:43.511790 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbee1863-ef4e-4d0a-aca7-f7c09e3f0a50-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"dbee1863-ef4e-4d0a-aca7-f7c09e3f0a50\") " pod="openstack/glance-default-external-api-0" Dec 16 07:19:43 crc kubenswrapper[4823]: I1216 07:19:43.512658 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dbee1863-ef4e-4d0a-aca7-f7c09e3f0a50-scripts\") pod \"glance-default-external-api-0\" (UID: \"dbee1863-ef4e-4d0a-aca7-f7c09e3f0a50\") " pod="openstack/glance-default-external-api-0" Dec 16 07:19:43 crc kubenswrapper[4823]: I1216 07:19:43.524146 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmsn8\" (UniqueName: \"kubernetes.io/projected/dbee1863-ef4e-4d0a-aca7-f7c09e3f0a50-kube-api-access-zmsn8\") pod \"glance-default-external-api-0\" (UID: \"dbee1863-ef4e-4d0a-aca7-f7c09e3f0a50\") " pod="openstack/glance-default-external-api-0" Dec 16 07:19:43 
crc kubenswrapper[4823]: I1216 07:19:43.534581 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"dbee1863-ef4e-4d0a-aca7-f7c09e3f0a50\") " pod="openstack/glance-default-external-api-0" Dec 16 07:19:43 crc kubenswrapper[4823]: I1216 07:19:43.663835 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 16 07:19:43 crc kubenswrapper[4823]: I1216 07:19:43.797414 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efa3cd8b-aa5f-4769-a8aa-801716fa389c" path="/var/lib/kubelet/pods/efa3cd8b-aa5f-4769-a8aa-801716fa389c/volumes" Dec 16 07:19:44 crc kubenswrapper[4823]: I1216 07:19:44.251927 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"27f18cac-1d41-44f1-b1b1-81cd65e8162a","Type":"ContainerStarted","Data":"65c720aa66198cd64db7ed4812aa6319f1cf4cd549bb47e90ab3755332db7d7c"} Dec 16 07:19:44 crc kubenswrapper[4823]: I1216 07:19:44.252787 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 16 07:19:44 crc kubenswrapper[4823]: I1216 07:19:44.252117 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="27f18cac-1d41-44f1-b1b1-81cd65e8162a" containerName="sg-core" containerID="cri-o://dabbb904de8166a0f13fd44ee28a44dd660742c437cd4663b17be2496b0e5c52" gracePeriod=30 Dec 16 07:19:44 crc kubenswrapper[4823]: I1216 07:19:44.252099 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="27f18cac-1d41-44f1-b1b1-81cd65e8162a" containerName="ceilometer-central-agent" containerID="cri-o://0554ea0464e696023a5a79ebbbad46e3a5e0f3f41b1854665b94758071ee63c6" gracePeriod=30 Dec 16 07:19:44 crc kubenswrapper[4823]: I1216 07:19:44.252142 
4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="27f18cac-1d41-44f1-b1b1-81cd65e8162a" containerName="proxy-httpd" containerID="cri-o://65c720aa66198cd64db7ed4812aa6319f1cf4cd549bb47e90ab3755332db7d7c" gracePeriod=30 Dec 16 07:19:44 crc kubenswrapper[4823]: I1216 07:19:44.252145 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="27f18cac-1d41-44f1-b1b1-81cd65e8162a" containerName="ceilometer-notification-agent" containerID="cri-o://0ce37b9e2d682bff4ed7a343afd322004978b019c106114a737cb0273299956f" gracePeriod=30 Dec 16 07:19:44 crc kubenswrapper[4823]: I1216 07:19:44.261706 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 16 07:19:44 crc kubenswrapper[4823]: W1216 07:19:44.273920 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddbee1863_ef4e_4d0a_aca7_f7c09e3f0a50.slice/crio-eb855cdc74329a8ae82cfe1b766eadc8b371fcfc4c6784324b0352ca09302388 WatchSource:0}: Error finding container eb855cdc74329a8ae82cfe1b766eadc8b371fcfc4c6784324b0352ca09302388: Status 404 returned error can't find the container with id eb855cdc74329a8ae82cfe1b766eadc8b371fcfc4c6784324b0352ca09302388 Dec 16 07:19:45 crc kubenswrapper[4823]: I1216 07:19:45.266837 4823 generic.go:334] "Generic (PLEG): container finished" podID="27f18cac-1d41-44f1-b1b1-81cd65e8162a" containerID="65c720aa66198cd64db7ed4812aa6319f1cf4cd549bb47e90ab3755332db7d7c" exitCode=0 Dec 16 07:19:45 crc kubenswrapper[4823]: I1216 07:19:45.267395 4823 generic.go:334] "Generic (PLEG): container finished" podID="27f18cac-1d41-44f1-b1b1-81cd65e8162a" containerID="dabbb904de8166a0f13fd44ee28a44dd660742c437cd4663b17be2496b0e5c52" exitCode=2 Dec 16 07:19:45 crc kubenswrapper[4823]: I1216 07:19:45.267409 4823 generic.go:334] "Generic (PLEG): container finished" 
podID="27f18cac-1d41-44f1-b1b1-81cd65e8162a" containerID="0ce37b9e2d682bff4ed7a343afd322004978b019c106114a737cb0273299956f" exitCode=0 Dec 16 07:19:45 crc kubenswrapper[4823]: I1216 07:19:45.267417 4823 generic.go:334] "Generic (PLEG): container finished" podID="27f18cac-1d41-44f1-b1b1-81cd65e8162a" containerID="0554ea0464e696023a5a79ebbbad46e3a5e0f3f41b1854665b94758071ee63c6" exitCode=0 Dec 16 07:19:45 crc kubenswrapper[4823]: I1216 07:19:45.267035 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"27f18cac-1d41-44f1-b1b1-81cd65e8162a","Type":"ContainerDied","Data":"65c720aa66198cd64db7ed4812aa6319f1cf4cd549bb47e90ab3755332db7d7c"} Dec 16 07:19:45 crc kubenswrapper[4823]: I1216 07:19:45.267538 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"27f18cac-1d41-44f1-b1b1-81cd65e8162a","Type":"ContainerDied","Data":"dabbb904de8166a0f13fd44ee28a44dd660742c437cd4663b17be2496b0e5c52"} Dec 16 07:19:45 crc kubenswrapper[4823]: I1216 07:19:45.267558 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"27f18cac-1d41-44f1-b1b1-81cd65e8162a","Type":"ContainerDied","Data":"0ce37b9e2d682bff4ed7a343afd322004978b019c106114a737cb0273299956f"} Dec 16 07:19:45 crc kubenswrapper[4823]: I1216 07:19:45.267571 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"27f18cac-1d41-44f1-b1b1-81cd65e8162a","Type":"ContainerDied","Data":"0554ea0464e696023a5a79ebbbad46e3a5e0f3f41b1854665b94758071ee63c6"} Dec 16 07:19:45 crc kubenswrapper[4823]: I1216 07:19:45.269197 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"dbee1863-ef4e-4d0a-aca7-f7c09e3f0a50","Type":"ContainerStarted","Data":"966c9a295917276f353f9e97ebb9a673f7628bec540b8b5ef3aef083889d35ba"} Dec 16 07:19:45 crc kubenswrapper[4823]: I1216 07:19:45.269231 4823 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/glance-default-external-api-0" event={"ID":"dbee1863-ef4e-4d0a-aca7-f7c09e3f0a50","Type":"ContainerStarted","Data":"eb855cdc74329a8ae82cfe1b766eadc8b371fcfc4c6784324b0352ca09302388"} Dec 16 07:19:45 crc kubenswrapper[4823]: I1216 07:19:45.429673 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 16 07:19:45 crc kubenswrapper[4823]: I1216 07:19:45.544563 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/27f18cac-1d41-44f1-b1b1-81cd65e8162a-sg-core-conf-yaml\") pod \"27f18cac-1d41-44f1-b1b1-81cd65e8162a\" (UID: \"27f18cac-1d41-44f1-b1b1-81cd65e8162a\") " Dec 16 07:19:45 crc kubenswrapper[4823]: I1216 07:19:45.544609 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/27f18cac-1d41-44f1-b1b1-81cd65e8162a-log-httpd\") pod \"27f18cac-1d41-44f1-b1b1-81cd65e8162a\" (UID: \"27f18cac-1d41-44f1-b1b1-81cd65e8162a\") " Dec 16 07:19:45 crc kubenswrapper[4823]: I1216 07:19:45.544638 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27f18cac-1d41-44f1-b1b1-81cd65e8162a-combined-ca-bundle\") pod \"27f18cac-1d41-44f1-b1b1-81cd65e8162a\" (UID: \"27f18cac-1d41-44f1-b1b1-81cd65e8162a\") " Dec 16 07:19:45 crc kubenswrapper[4823]: I1216 07:19:45.544711 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27f18cac-1d41-44f1-b1b1-81cd65e8162a-config-data\") pod \"27f18cac-1d41-44f1-b1b1-81cd65e8162a\" (UID: \"27f18cac-1d41-44f1-b1b1-81cd65e8162a\") " Dec 16 07:19:45 crc kubenswrapper[4823]: I1216 07:19:45.544768 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-274dh\" (UniqueName: 
\"kubernetes.io/projected/27f18cac-1d41-44f1-b1b1-81cd65e8162a-kube-api-access-274dh\") pod \"27f18cac-1d41-44f1-b1b1-81cd65e8162a\" (UID: \"27f18cac-1d41-44f1-b1b1-81cd65e8162a\") " Dec 16 07:19:45 crc kubenswrapper[4823]: I1216 07:19:45.544807 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/27f18cac-1d41-44f1-b1b1-81cd65e8162a-run-httpd\") pod \"27f18cac-1d41-44f1-b1b1-81cd65e8162a\" (UID: \"27f18cac-1d41-44f1-b1b1-81cd65e8162a\") " Dec 16 07:19:45 crc kubenswrapper[4823]: I1216 07:19:45.544974 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/27f18cac-1d41-44f1-b1b1-81cd65e8162a-scripts\") pod \"27f18cac-1d41-44f1-b1b1-81cd65e8162a\" (UID: \"27f18cac-1d41-44f1-b1b1-81cd65e8162a\") " Dec 16 07:19:45 crc kubenswrapper[4823]: I1216 07:19:45.546168 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27f18cac-1d41-44f1-b1b1-81cd65e8162a-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "27f18cac-1d41-44f1-b1b1-81cd65e8162a" (UID: "27f18cac-1d41-44f1-b1b1-81cd65e8162a"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:19:45 crc kubenswrapper[4823]: I1216 07:19:45.546517 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27f18cac-1d41-44f1-b1b1-81cd65e8162a-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "27f18cac-1d41-44f1-b1b1-81cd65e8162a" (UID: "27f18cac-1d41-44f1-b1b1-81cd65e8162a"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:19:45 crc kubenswrapper[4823]: I1216 07:19:45.552510 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27f18cac-1d41-44f1-b1b1-81cd65e8162a-kube-api-access-274dh" (OuterVolumeSpecName: "kube-api-access-274dh") pod "27f18cac-1d41-44f1-b1b1-81cd65e8162a" (UID: "27f18cac-1d41-44f1-b1b1-81cd65e8162a"). InnerVolumeSpecName "kube-api-access-274dh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:19:45 crc kubenswrapper[4823]: I1216 07:19:45.554243 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27f18cac-1d41-44f1-b1b1-81cd65e8162a-scripts" (OuterVolumeSpecName: "scripts") pod "27f18cac-1d41-44f1-b1b1-81cd65e8162a" (UID: "27f18cac-1d41-44f1-b1b1-81cd65e8162a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:19:45 crc kubenswrapper[4823]: I1216 07:19:45.593243 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27f18cac-1d41-44f1-b1b1-81cd65e8162a-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "27f18cac-1d41-44f1-b1b1-81cd65e8162a" (UID: "27f18cac-1d41-44f1-b1b1-81cd65e8162a"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:19:45 crc kubenswrapper[4823]: I1216 07:19:45.647318 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-274dh\" (UniqueName: \"kubernetes.io/projected/27f18cac-1d41-44f1-b1b1-81cd65e8162a-kube-api-access-274dh\") on node \"crc\" DevicePath \"\"" Dec 16 07:19:45 crc kubenswrapper[4823]: I1216 07:19:45.647357 4823 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/27f18cac-1d41-44f1-b1b1-81cd65e8162a-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 16 07:19:45 crc kubenswrapper[4823]: I1216 07:19:45.647369 4823 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/27f18cac-1d41-44f1-b1b1-81cd65e8162a-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:19:45 crc kubenswrapper[4823]: I1216 07:19:45.647380 4823 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/27f18cac-1d41-44f1-b1b1-81cd65e8162a-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 16 07:19:45 crc kubenswrapper[4823]: I1216 07:19:45.647393 4823 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/27f18cac-1d41-44f1-b1b1-81cd65e8162a-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 16 07:19:45 crc kubenswrapper[4823]: I1216 07:19:45.659474 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27f18cac-1d41-44f1-b1b1-81cd65e8162a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "27f18cac-1d41-44f1-b1b1-81cd65e8162a" (UID: "27f18cac-1d41-44f1-b1b1-81cd65e8162a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:19:45 crc kubenswrapper[4823]: I1216 07:19:45.696264 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27f18cac-1d41-44f1-b1b1-81cd65e8162a-config-data" (OuterVolumeSpecName: "config-data") pod "27f18cac-1d41-44f1-b1b1-81cd65e8162a" (UID: "27f18cac-1d41-44f1-b1b1-81cd65e8162a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:19:45 crc kubenswrapper[4823]: I1216 07:19:45.749771 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27f18cac-1d41-44f1-b1b1-81cd65e8162a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:19:45 crc kubenswrapper[4823]: I1216 07:19:45.749808 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27f18cac-1d41-44f1-b1b1-81cd65e8162a-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 07:19:45 crc kubenswrapper[4823]: I1216 07:19:45.864504 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 16 07:19:45 crc kubenswrapper[4823]: I1216 07:19:45.952821 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f85fada8-dce0-4e2b-83f5-ebf4f6fe2c80-config-data\") pod \"f85fada8-dce0-4e2b-83f5-ebf4f6fe2c80\" (UID: \"f85fada8-dce0-4e2b-83f5-ebf4f6fe2c80\") " Dec 16 07:19:45 crc kubenswrapper[4823]: I1216 07:19:45.953343 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nz4ls\" (UniqueName: \"kubernetes.io/projected/f85fada8-dce0-4e2b-83f5-ebf4f6fe2c80-kube-api-access-nz4ls\") pod \"f85fada8-dce0-4e2b-83f5-ebf4f6fe2c80\" (UID: \"f85fada8-dce0-4e2b-83f5-ebf4f6fe2c80\") " Dec 16 07:19:45 crc kubenswrapper[4823]: I1216 07:19:45.953428 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f85fada8-dce0-4e2b-83f5-ebf4f6fe2c80-httpd-run\") pod \"f85fada8-dce0-4e2b-83f5-ebf4f6fe2c80\" (UID: \"f85fada8-dce0-4e2b-83f5-ebf4f6fe2c80\") " Dec 16 07:19:45 crc kubenswrapper[4823]: I1216 07:19:45.953487 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f85fada8-dce0-4e2b-83f5-ebf4f6fe2c80-combined-ca-bundle\") pod \"f85fada8-dce0-4e2b-83f5-ebf4f6fe2c80\" (UID: \"f85fada8-dce0-4e2b-83f5-ebf4f6fe2c80\") " Dec 16 07:19:45 crc kubenswrapper[4823]: I1216 07:19:45.953560 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f85fada8-dce0-4e2b-83f5-ebf4f6fe2c80-logs\") pod \"f85fada8-dce0-4e2b-83f5-ebf4f6fe2c80\" (UID: \"f85fada8-dce0-4e2b-83f5-ebf4f6fe2c80\") " Dec 16 07:19:45 crc kubenswrapper[4823]: I1216 07:19:45.953636 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage12-crc\") pod \"f85fada8-dce0-4e2b-83f5-ebf4f6fe2c80\" (UID: \"f85fada8-dce0-4e2b-83f5-ebf4f6fe2c80\") " Dec 16 07:19:45 crc kubenswrapper[4823]: I1216 07:19:45.953688 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f85fada8-dce0-4e2b-83f5-ebf4f6fe2c80-internal-tls-certs\") pod \"f85fada8-dce0-4e2b-83f5-ebf4f6fe2c80\" (UID: \"f85fada8-dce0-4e2b-83f5-ebf4f6fe2c80\") " Dec 16 07:19:45 crc kubenswrapper[4823]: I1216 07:19:45.953730 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f85fada8-dce0-4e2b-83f5-ebf4f6fe2c80-scripts\") pod \"f85fada8-dce0-4e2b-83f5-ebf4f6fe2c80\" (UID: \"f85fada8-dce0-4e2b-83f5-ebf4f6fe2c80\") " Dec 16 07:19:45 crc kubenswrapper[4823]: I1216 07:19:45.954556 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f85fada8-dce0-4e2b-83f5-ebf4f6fe2c80-logs" (OuterVolumeSpecName: "logs") pod "f85fada8-dce0-4e2b-83f5-ebf4f6fe2c80" (UID: "f85fada8-dce0-4e2b-83f5-ebf4f6fe2c80"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:19:45 crc kubenswrapper[4823]: I1216 07:19:45.957536 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f85fada8-dce0-4e2b-83f5-ebf4f6fe2c80-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "f85fada8-dce0-4e2b-83f5-ebf4f6fe2c80" (UID: "f85fada8-dce0-4e2b-83f5-ebf4f6fe2c80"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:19:45 crc kubenswrapper[4823]: I1216 07:19:45.957971 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f85fada8-dce0-4e2b-83f5-ebf4f6fe2c80-kube-api-access-nz4ls" (OuterVolumeSpecName: "kube-api-access-nz4ls") pod "f85fada8-dce0-4e2b-83f5-ebf4f6fe2c80" (UID: "f85fada8-dce0-4e2b-83f5-ebf4f6fe2c80"). InnerVolumeSpecName "kube-api-access-nz4ls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:19:45 crc kubenswrapper[4823]: I1216 07:19:45.963591 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f85fada8-dce0-4e2b-83f5-ebf4f6fe2c80-scripts" (OuterVolumeSpecName: "scripts") pod "f85fada8-dce0-4e2b-83f5-ebf4f6fe2c80" (UID: "f85fada8-dce0-4e2b-83f5-ebf4f6fe2c80"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:19:45 crc kubenswrapper[4823]: I1216 07:19:45.965648 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "glance") pod "f85fada8-dce0-4e2b-83f5-ebf4f6fe2c80" (UID: "f85fada8-dce0-4e2b-83f5-ebf4f6fe2c80"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 16 07:19:45 crc kubenswrapper[4823]: I1216 07:19:45.992492 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f85fada8-dce0-4e2b-83f5-ebf4f6fe2c80-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f85fada8-dce0-4e2b-83f5-ebf4f6fe2c80" (UID: "f85fada8-dce0-4e2b-83f5-ebf4f6fe2c80"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:19:46 crc kubenswrapper[4823]: I1216 07:19:46.022551 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f85fada8-dce0-4e2b-83f5-ebf4f6fe2c80-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "f85fada8-dce0-4e2b-83f5-ebf4f6fe2c80" (UID: "f85fada8-dce0-4e2b-83f5-ebf4f6fe2c80"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:19:46 crc kubenswrapper[4823]: I1216 07:19:46.026444 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f85fada8-dce0-4e2b-83f5-ebf4f6fe2c80-config-data" (OuterVolumeSpecName: "config-data") pod "f85fada8-dce0-4e2b-83f5-ebf4f6fe2c80" (UID: "f85fada8-dce0-4e2b-83f5-ebf4f6fe2c80"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:19:46 crc kubenswrapper[4823]: I1216 07:19:46.056195 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nz4ls\" (UniqueName: \"kubernetes.io/projected/f85fada8-dce0-4e2b-83f5-ebf4f6fe2c80-kube-api-access-nz4ls\") on node \"crc\" DevicePath \"\"" Dec 16 07:19:46 crc kubenswrapper[4823]: I1216 07:19:46.056238 4823 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f85fada8-dce0-4e2b-83f5-ebf4f6fe2c80-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 16 07:19:46 crc kubenswrapper[4823]: I1216 07:19:46.056252 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f85fada8-dce0-4e2b-83f5-ebf4f6fe2c80-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:19:46 crc kubenswrapper[4823]: I1216 07:19:46.056264 4823 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f85fada8-dce0-4e2b-83f5-ebf4f6fe2c80-logs\") on node \"crc\" DevicePath \"\"" Dec 16 
07:19:46 crc kubenswrapper[4823]: I1216 07:19:46.056305 4823 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Dec 16 07:19:46 crc kubenswrapper[4823]: I1216 07:19:46.056318 4823 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f85fada8-dce0-4e2b-83f5-ebf4f6fe2c80-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 07:19:46 crc kubenswrapper[4823]: I1216 07:19:46.056329 4823 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f85fada8-dce0-4e2b-83f5-ebf4f6fe2c80-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:19:46 crc kubenswrapper[4823]: I1216 07:19:46.056339 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f85fada8-dce0-4e2b-83f5-ebf4f6fe2c80-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 07:19:46 crc kubenswrapper[4823]: I1216 07:19:46.076530 4823 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Dec 16 07:19:46 crc kubenswrapper[4823]: I1216 07:19:46.158118 4823 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Dec 16 07:19:46 crc kubenswrapper[4823]: I1216 07:19:46.278991 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"dbee1863-ef4e-4d0a-aca7-f7c09e3f0a50","Type":"ContainerStarted","Data":"9b756370e64890389fb5a7ac91f02c8282951c0bd28b30fb354e18e101c1af71"} Dec 16 07:19:46 crc kubenswrapper[4823]: I1216 07:19:46.280634 4823 generic.go:334] "Generic (PLEG): container finished" podID="f85fada8-dce0-4e2b-83f5-ebf4f6fe2c80" 
containerID="f7e6a59d65cbff5bad0fdcd293a529173be048f1508f81f77dc92df6f821abe4" exitCode=0 Dec 16 07:19:46 crc kubenswrapper[4823]: I1216 07:19:46.280665 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 16 07:19:46 crc kubenswrapper[4823]: I1216 07:19:46.280694 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f85fada8-dce0-4e2b-83f5-ebf4f6fe2c80","Type":"ContainerDied","Data":"f7e6a59d65cbff5bad0fdcd293a529173be048f1508f81f77dc92df6f821abe4"} Dec 16 07:19:46 crc kubenswrapper[4823]: I1216 07:19:46.280719 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f85fada8-dce0-4e2b-83f5-ebf4f6fe2c80","Type":"ContainerDied","Data":"97576eb822d67e73d6511817b900348abb68bbe0055fbd5f9f05edb9efa1d245"} Dec 16 07:19:46 crc kubenswrapper[4823]: I1216 07:19:46.280735 4823 scope.go:117] "RemoveContainer" containerID="f7e6a59d65cbff5bad0fdcd293a529173be048f1508f81f77dc92df6f821abe4" Dec 16 07:19:46 crc kubenswrapper[4823]: I1216 07:19:46.283771 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"27f18cac-1d41-44f1-b1b1-81cd65e8162a","Type":"ContainerDied","Data":"ce3aeda0c823fa13b031e1d58c32c8239a4c0c36a73f16fdd4ca6544c4f6065b"} Dec 16 07:19:46 crc kubenswrapper[4823]: I1216 07:19:46.283867 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 16 07:19:46 crc kubenswrapper[4823]: I1216 07:19:46.307682 4823 scope.go:117] "RemoveContainer" containerID="0f3ee8c3dad0e8e1137f41d14df6304cc4ea56017bfa0aa42257759226420db4" Dec 16 07:19:46 crc kubenswrapper[4823]: I1216 07:19:46.313619 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.313598253 podStartE2EDuration="3.313598253s" podCreationTimestamp="2025-12-16 07:19:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 07:19:46.301222105 +0000 UTC m=+1464.789788228" watchObservedRunningTime="2025-12-16 07:19:46.313598253 +0000 UTC m=+1464.802164376" Dec 16 07:19:46 crc kubenswrapper[4823]: I1216 07:19:46.328897 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 16 07:19:46 crc kubenswrapper[4823]: I1216 07:19:46.338924 4823 scope.go:117] "RemoveContainer" containerID="f7e6a59d65cbff5bad0fdcd293a529173be048f1508f81f77dc92df6f821abe4" Dec 16 07:19:46 crc kubenswrapper[4823]: E1216 07:19:46.339460 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7e6a59d65cbff5bad0fdcd293a529173be048f1508f81f77dc92df6f821abe4\": container with ID starting with f7e6a59d65cbff5bad0fdcd293a529173be048f1508f81f77dc92df6f821abe4 not found: ID does not exist" containerID="f7e6a59d65cbff5bad0fdcd293a529173be048f1508f81f77dc92df6f821abe4" Dec 16 07:19:46 crc kubenswrapper[4823]: I1216 07:19:46.339505 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7e6a59d65cbff5bad0fdcd293a529173be048f1508f81f77dc92df6f821abe4"} err="failed to get container status \"f7e6a59d65cbff5bad0fdcd293a529173be048f1508f81f77dc92df6f821abe4\": rpc error: code = NotFound desc = could not find 
container \"f7e6a59d65cbff5bad0fdcd293a529173be048f1508f81f77dc92df6f821abe4\": container with ID starting with f7e6a59d65cbff5bad0fdcd293a529173be048f1508f81f77dc92df6f821abe4 not found: ID does not exist" Dec 16 07:19:46 crc kubenswrapper[4823]: I1216 07:19:46.339528 4823 scope.go:117] "RemoveContainer" containerID="0f3ee8c3dad0e8e1137f41d14df6304cc4ea56017bfa0aa42257759226420db4" Dec 16 07:19:46 crc kubenswrapper[4823]: E1216 07:19:46.341108 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f3ee8c3dad0e8e1137f41d14df6304cc4ea56017bfa0aa42257759226420db4\": container with ID starting with 0f3ee8c3dad0e8e1137f41d14df6304cc4ea56017bfa0aa42257759226420db4 not found: ID does not exist" containerID="0f3ee8c3dad0e8e1137f41d14df6304cc4ea56017bfa0aa42257759226420db4" Dec 16 07:19:46 crc kubenswrapper[4823]: I1216 07:19:46.341198 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f3ee8c3dad0e8e1137f41d14df6304cc4ea56017bfa0aa42257759226420db4"} err="failed to get container status \"0f3ee8c3dad0e8e1137f41d14df6304cc4ea56017bfa0aa42257759226420db4\": rpc error: code = NotFound desc = could not find container \"0f3ee8c3dad0e8e1137f41d14df6304cc4ea56017bfa0aa42257759226420db4\": container with ID starting with 0f3ee8c3dad0e8e1137f41d14df6304cc4ea56017bfa0aa42257759226420db4 not found: ID does not exist" Dec 16 07:19:46 crc kubenswrapper[4823]: I1216 07:19:46.341225 4823 scope.go:117] "RemoveContainer" containerID="65c720aa66198cd64db7ed4812aa6319f1cf4cd549bb47e90ab3755332db7d7c" Dec 16 07:19:46 crc kubenswrapper[4823]: I1216 07:19:46.374663 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 16 07:19:46 crc kubenswrapper[4823]: I1216 07:19:46.382574 4823 scope.go:117] "RemoveContainer" containerID="dabbb904de8166a0f13fd44ee28a44dd660742c437cd4663b17be2496b0e5c52" Dec 16 07:19:46 crc 
kubenswrapper[4823]: I1216 07:19:46.402316 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 16 07:19:46 crc kubenswrapper[4823]: I1216 07:19:46.418471 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 16 07:19:46 crc kubenswrapper[4823]: I1216 07:19:46.426724 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 16 07:19:46 crc kubenswrapper[4823]: E1216 07:19:46.427179 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85fada8-dce0-4e2b-83f5-ebf4f6fe2c80" containerName="glance-log" Dec 16 07:19:46 crc kubenswrapper[4823]: I1216 07:19:46.427208 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85fada8-dce0-4e2b-83f5-ebf4f6fe2c80" containerName="glance-log" Dec 16 07:19:46 crc kubenswrapper[4823]: E1216 07:19:46.427234 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27f18cac-1d41-44f1-b1b1-81cd65e8162a" containerName="sg-core" Dec 16 07:19:46 crc kubenswrapper[4823]: I1216 07:19:46.427241 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="27f18cac-1d41-44f1-b1b1-81cd65e8162a" containerName="sg-core" Dec 16 07:19:46 crc kubenswrapper[4823]: E1216 07:19:46.427264 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27f18cac-1d41-44f1-b1b1-81cd65e8162a" containerName="ceilometer-central-agent" Dec 16 07:19:46 crc kubenswrapper[4823]: I1216 07:19:46.427278 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="27f18cac-1d41-44f1-b1b1-81cd65e8162a" containerName="ceilometer-central-agent" Dec 16 07:19:46 crc kubenswrapper[4823]: E1216 07:19:46.427289 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27f18cac-1d41-44f1-b1b1-81cd65e8162a" containerName="ceilometer-notification-agent" Dec 16 07:19:46 crc kubenswrapper[4823]: I1216 07:19:46.427295 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="27f18cac-1d41-44f1-b1b1-81cd65e8162a" 
containerName="ceilometer-notification-agent" Dec 16 07:19:46 crc kubenswrapper[4823]: E1216 07:19:46.427307 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27f18cac-1d41-44f1-b1b1-81cd65e8162a" containerName="proxy-httpd" Dec 16 07:19:46 crc kubenswrapper[4823]: I1216 07:19:46.427312 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="27f18cac-1d41-44f1-b1b1-81cd65e8162a" containerName="proxy-httpd" Dec 16 07:19:46 crc kubenswrapper[4823]: E1216 07:19:46.427330 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85fada8-dce0-4e2b-83f5-ebf4f6fe2c80" containerName="glance-httpd" Dec 16 07:19:46 crc kubenswrapper[4823]: I1216 07:19:46.427338 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85fada8-dce0-4e2b-83f5-ebf4f6fe2c80" containerName="glance-httpd" Dec 16 07:19:46 crc kubenswrapper[4823]: I1216 07:19:46.427496 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85fada8-dce0-4e2b-83f5-ebf4f6fe2c80" containerName="glance-log" Dec 16 07:19:46 crc kubenswrapper[4823]: I1216 07:19:46.427512 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="27f18cac-1d41-44f1-b1b1-81cd65e8162a" containerName="sg-core" Dec 16 07:19:46 crc kubenswrapper[4823]: I1216 07:19:46.427522 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85fada8-dce0-4e2b-83f5-ebf4f6fe2c80" containerName="glance-httpd" Dec 16 07:19:46 crc kubenswrapper[4823]: I1216 07:19:46.427536 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="27f18cac-1d41-44f1-b1b1-81cd65e8162a" containerName="proxy-httpd" Dec 16 07:19:46 crc kubenswrapper[4823]: I1216 07:19:46.427546 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="27f18cac-1d41-44f1-b1b1-81cd65e8162a" containerName="ceilometer-central-agent" Dec 16 07:19:46 crc kubenswrapper[4823]: I1216 07:19:46.427556 4823 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="27f18cac-1d41-44f1-b1b1-81cd65e8162a" containerName="ceilometer-notification-agent" Dec 16 07:19:46 crc kubenswrapper[4823]: I1216 07:19:46.428499 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 16 07:19:46 crc kubenswrapper[4823]: I1216 07:19:46.431137 4823 scope.go:117] "RemoveContainer" containerID="0ce37b9e2d682bff4ed7a343afd322004978b019c106114a737cb0273299956f" Dec 16 07:19:46 crc kubenswrapper[4823]: I1216 07:19:46.432040 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 16 07:19:46 crc kubenswrapper[4823]: I1216 07:19:46.437676 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 16 07:19:46 crc kubenswrapper[4823]: I1216 07:19:46.439139 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 16 07:19:46 crc kubenswrapper[4823]: I1216 07:19:46.441605 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 16 07:19:46 crc kubenswrapper[4823]: I1216 07:19:46.445677 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 16 07:19:46 crc kubenswrapper[4823]: I1216 07:19:46.445835 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 16 07:19:46 crc kubenswrapper[4823]: I1216 07:19:46.450014 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 16 07:19:46 crc kubenswrapper[4823]: I1216 07:19:46.461339 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 16 07:19:46 crc kubenswrapper[4823]: I1216 07:19:46.470047 4823 scope.go:117] "RemoveContainer" containerID="0554ea0464e696023a5a79ebbbad46e3a5e0f3f41b1854665b94758071ee63c6" Dec 16 07:19:46 crc kubenswrapper[4823]: I1216 07:19:46.571790 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"bc7377ca-c7ab-4ee0-ae2a-5fbb782ba925\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:19:46 crc kubenswrapper[4823]: I1216 07:19:46.571870 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9ed72123-2696-4dbd-b7bd-7509458dfaa0-log-httpd\") pod \"ceilometer-0\" (UID: \"9ed72123-2696-4dbd-b7bd-7509458dfaa0\") " pod="openstack/ceilometer-0" Dec 16 07:19:46 crc kubenswrapper[4823]: I1216 07:19:46.571897 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9ed72123-2696-4dbd-b7bd-7509458dfaa0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9ed72123-2696-4dbd-b7bd-7509458dfaa0\") " 
pod="openstack/ceilometer-0" Dec 16 07:19:46 crc kubenswrapper[4823]: I1216 07:19:46.571945 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bc7377ca-c7ab-4ee0-ae2a-5fbb782ba925-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"bc7377ca-c7ab-4ee0-ae2a-5fbb782ba925\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:19:46 crc kubenswrapper[4823]: I1216 07:19:46.571974 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc7377ca-c7ab-4ee0-ae2a-5fbb782ba925-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"bc7377ca-c7ab-4ee0-ae2a-5fbb782ba925\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:19:46 crc kubenswrapper[4823]: I1216 07:19:46.571999 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bc7377ca-c7ab-4ee0-ae2a-5fbb782ba925-scripts\") pod \"glance-default-internal-api-0\" (UID: \"bc7377ca-c7ab-4ee0-ae2a-5fbb782ba925\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:19:46 crc kubenswrapper[4823]: I1216 07:19:46.572059 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fb6nm\" (UniqueName: \"kubernetes.io/projected/bc7377ca-c7ab-4ee0-ae2a-5fbb782ba925-kube-api-access-fb6nm\") pod \"glance-default-internal-api-0\" (UID: \"bc7377ca-c7ab-4ee0-ae2a-5fbb782ba925\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:19:46 crc kubenswrapper[4823]: I1216 07:19:46.572123 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ed72123-2696-4dbd-b7bd-7509458dfaa0-config-data\") pod \"ceilometer-0\" (UID: 
\"9ed72123-2696-4dbd-b7bd-7509458dfaa0\") " pod="openstack/ceilometer-0" Dec 16 07:19:46 crc kubenswrapper[4823]: I1216 07:19:46.572156 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ed72123-2696-4dbd-b7bd-7509458dfaa0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9ed72123-2696-4dbd-b7bd-7509458dfaa0\") " pod="openstack/ceilometer-0" Dec 16 07:19:46 crc kubenswrapper[4823]: I1216 07:19:46.572186 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc7377ca-c7ab-4ee0-ae2a-5fbb782ba925-config-data\") pod \"glance-default-internal-api-0\" (UID: \"bc7377ca-c7ab-4ee0-ae2a-5fbb782ba925\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:19:46 crc kubenswrapper[4823]: I1216 07:19:46.572217 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9ed72123-2696-4dbd-b7bd-7509458dfaa0-run-httpd\") pod \"ceilometer-0\" (UID: \"9ed72123-2696-4dbd-b7bd-7509458dfaa0\") " pod="openstack/ceilometer-0" Dec 16 07:19:46 crc kubenswrapper[4823]: I1216 07:19:46.572251 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bc7377ca-c7ab-4ee0-ae2a-5fbb782ba925-logs\") pod \"glance-default-internal-api-0\" (UID: \"bc7377ca-c7ab-4ee0-ae2a-5fbb782ba925\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:19:46 crc kubenswrapper[4823]: I1216 07:19:46.572279 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ed72123-2696-4dbd-b7bd-7509458dfaa0-scripts\") pod \"ceilometer-0\" (UID: \"9ed72123-2696-4dbd-b7bd-7509458dfaa0\") " pod="openstack/ceilometer-0" Dec 16 07:19:46 crc 
kubenswrapper[4823]: I1216 07:19:46.572301 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bc7377ca-c7ab-4ee0-ae2a-5fbb782ba925-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"bc7377ca-c7ab-4ee0-ae2a-5fbb782ba925\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:19:46 crc kubenswrapper[4823]: I1216 07:19:46.572340 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sn4rv\" (UniqueName: \"kubernetes.io/projected/9ed72123-2696-4dbd-b7bd-7509458dfaa0-kube-api-access-sn4rv\") pod \"ceilometer-0\" (UID: \"9ed72123-2696-4dbd-b7bd-7509458dfaa0\") " pod="openstack/ceilometer-0" Dec 16 07:19:46 crc kubenswrapper[4823]: I1216 07:19:46.674193 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ed72123-2696-4dbd-b7bd-7509458dfaa0-config-data\") pod \"ceilometer-0\" (UID: \"9ed72123-2696-4dbd-b7bd-7509458dfaa0\") " pod="openstack/ceilometer-0" Dec 16 07:19:46 crc kubenswrapper[4823]: I1216 07:19:46.674246 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ed72123-2696-4dbd-b7bd-7509458dfaa0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9ed72123-2696-4dbd-b7bd-7509458dfaa0\") " pod="openstack/ceilometer-0" Dec 16 07:19:46 crc kubenswrapper[4823]: I1216 07:19:46.674270 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc7377ca-c7ab-4ee0-ae2a-5fbb782ba925-config-data\") pod \"glance-default-internal-api-0\" (UID: \"bc7377ca-c7ab-4ee0-ae2a-5fbb782ba925\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:19:46 crc kubenswrapper[4823]: I1216 07:19:46.674293 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9ed72123-2696-4dbd-b7bd-7509458dfaa0-run-httpd\") pod \"ceilometer-0\" (UID: \"9ed72123-2696-4dbd-b7bd-7509458dfaa0\") " pod="openstack/ceilometer-0" Dec 16 07:19:46 crc kubenswrapper[4823]: I1216 07:19:46.674306 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bc7377ca-c7ab-4ee0-ae2a-5fbb782ba925-logs\") pod \"glance-default-internal-api-0\" (UID: \"bc7377ca-c7ab-4ee0-ae2a-5fbb782ba925\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:19:46 crc kubenswrapper[4823]: I1216 07:19:46.674324 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ed72123-2696-4dbd-b7bd-7509458dfaa0-scripts\") pod \"ceilometer-0\" (UID: \"9ed72123-2696-4dbd-b7bd-7509458dfaa0\") " pod="openstack/ceilometer-0" Dec 16 07:19:46 crc kubenswrapper[4823]: I1216 07:19:46.674339 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bc7377ca-c7ab-4ee0-ae2a-5fbb782ba925-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"bc7377ca-c7ab-4ee0-ae2a-5fbb782ba925\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:19:46 crc kubenswrapper[4823]: I1216 07:19:46.674376 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sn4rv\" (UniqueName: \"kubernetes.io/projected/9ed72123-2696-4dbd-b7bd-7509458dfaa0-kube-api-access-sn4rv\") pod \"ceilometer-0\" (UID: \"9ed72123-2696-4dbd-b7bd-7509458dfaa0\") " pod="openstack/ceilometer-0" Dec 16 07:19:46 crc kubenswrapper[4823]: I1216 07:19:46.674425 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"bc7377ca-c7ab-4ee0-ae2a-5fbb782ba925\") " 
pod="openstack/glance-default-internal-api-0" Dec 16 07:19:46 crc kubenswrapper[4823]: I1216 07:19:46.674453 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9ed72123-2696-4dbd-b7bd-7509458dfaa0-log-httpd\") pod \"ceilometer-0\" (UID: \"9ed72123-2696-4dbd-b7bd-7509458dfaa0\") " pod="openstack/ceilometer-0" Dec 16 07:19:46 crc kubenswrapper[4823]: I1216 07:19:46.674467 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9ed72123-2696-4dbd-b7bd-7509458dfaa0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9ed72123-2696-4dbd-b7bd-7509458dfaa0\") " pod="openstack/ceilometer-0" Dec 16 07:19:46 crc kubenswrapper[4823]: I1216 07:19:46.674498 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bc7377ca-c7ab-4ee0-ae2a-5fbb782ba925-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"bc7377ca-c7ab-4ee0-ae2a-5fbb782ba925\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:19:46 crc kubenswrapper[4823]: I1216 07:19:46.674519 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc7377ca-c7ab-4ee0-ae2a-5fbb782ba925-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"bc7377ca-c7ab-4ee0-ae2a-5fbb782ba925\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:19:46 crc kubenswrapper[4823]: I1216 07:19:46.674538 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bc7377ca-c7ab-4ee0-ae2a-5fbb782ba925-scripts\") pod \"glance-default-internal-api-0\" (UID: \"bc7377ca-c7ab-4ee0-ae2a-5fbb782ba925\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:19:46 crc kubenswrapper[4823]: I1216 07:19:46.674564 4823 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fb6nm\" (UniqueName: \"kubernetes.io/projected/bc7377ca-c7ab-4ee0-ae2a-5fbb782ba925-kube-api-access-fb6nm\") pod \"glance-default-internal-api-0\" (UID: \"bc7377ca-c7ab-4ee0-ae2a-5fbb782ba925\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:19:46 crc kubenswrapper[4823]: I1216 07:19:46.676107 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bc7377ca-c7ab-4ee0-ae2a-5fbb782ba925-logs\") pod \"glance-default-internal-api-0\" (UID: \"bc7377ca-c7ab-4ee0-ae2a-5fbb782ba925\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:19:46 crc kubenswrapper[4823]: I1216 07:19:46.676318 4823 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"bc7377ca-c7ab-4ee0-ae2a-5fbb782ba925\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-internal-api-0" Dec 16 07:19:46 crc kubenswrapper[4823]: I1216 07:19:46.676386 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9ed72123-2696-4dbd-b7bd-7509458dfaa0-run-httpd\") pod \"ceilometer-0\" (UID: \"9ed72123-2696-4dbd-b7bd-7509458dfaa0\") " pod="openstack/ceilometer-0" Dec 16 07:19:46 crc kubenswrapper[4823]: I1216 07:19:46.676503 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bc7377ca-c7ab-4ee0-ae2a-5fbb782ba925-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"bc7377ca-c7ab-4ee0-ae2a-5fbb782ba925\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:19:46 crc kubenswrapper[4823]: I1216 07:19:46.676719 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/9ed72123-2696-4dbd-b7bd-7509458dfaa0-log-httpd\") pod \"ceilometer-0\" (UID: \"9ed72123-2696-4dbd-b7bd-7509458dfaa0\") " pod="openstack/ceilometer-0" Dec 16 07:19:46 crc kubenswrapper[4823]: I1216 07:19:46.684189 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc7377ca-c7ab-4ee0-ae2a-5fbb782ba925-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"bc7377ca-c7ab-4ee0-ae2a-5fbb782ba925\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:19:46 crc kubenswrapper[4823]: I1216 07:19:46.692242 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9ed72123-2696-4dbd-b7bd-7509458dfaa0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9ed72123-2696-4dbd-b7bd-7509458dfaa0\") " pod="openstack/ceilometer-0" Dec 16 07:19:46 crc kubenswrapper[4823]: I1216 07:19:46.693068 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bc7377ca-c7ab-4ee0-ae2a-5fbb782ba925-scripts\") pod \"glance-default-internal-api-0\" (UID: \"bc7377ca-c7ab-4ee0-ae2a-5fbb782ba925\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:19:46 crc kubenswrapper[4823]: I1216 07:19:46.695560 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bc7377ca-c7ab-4ee0-ae2a-5fbb782ba925-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"bc7377ca-c7ab-4ee0-ae2a-5fbb782ba925\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:19:46 crc kubenswrapper[4823]: I1216 07:19:46.695840 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ed72123-2696-4dbd-b7bd-7509458dfaa0-scripts\") pod \"ceilometer-0\" (UID: \"9ed72123-2696-4dbd-b7bd-7509458dfaa0\") " 
pod="openstack/ceilometer-0" Dec 16 07:19:46 crc kubenswrapper[4823]: I1216 07:19:46.696973 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc7377ca-c7ab-4ee0-ae2a-5fbb782ba925-config-data\") pod \"glance-default-internal-api-0\" (UID: \"bc7377ca-c7ab-4ee0-ae2a-5fbb782ba925\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:19:46 crc kubenswrapper[4823]: I1216 07:19:46.697256 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sn4rv\" (UniqueName: \"kubernetes.io/projected/9ed72123-2696-4dbd-b7bd-7509458dfaa0-kube-api-access-sn4rv\") pod \"ceilometer-0\" (UID: \"9ed72123-2696-4dbd-b7bd-7509458dfaa0\") " pod="openstack/ceilometer-0" Dec 16 07:19:46 crc kubenswrapper[4823]: I1216 07:19:46.698249 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ed72123-2696-4dbd-b7bd-7509458dfaa0-config-data\") pod \"ceilometer-0\" (UID: \"9ed72123-2696-4dbd-b7bd-7509458dfaa0\") " pod="openstack/ceilometer-0" Dec 16 07:19:46 crc kubenswrapper[4823]: I1216 07:19:46.699091 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fb6nm\" (UniqueName: \"kubernetes.io/projected/bc7377ca-c7ab-4ee0-ae2a-5fbb782ba925-kube-api-access-fb6nm\") pod \"glance-default-internal-api-0\" (UID: \"bc7377ca-c7ab-4ee0-ae2a-5fbb782ba925\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:19:46 crc kubenswrapper[4823]: I1216 07:19:46.704305 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ed72123-2696-4dbd-b7bd-7509458dfaa0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9ed72123-2696-4dbd-b7bd-7509458dfaa0\") " pod="openstack/ceilometer-0" Dec 16 07:19:46 crc kubenswrapper[4823]: I1216 07:19:46.712691 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"bc7377ca-c7ab-4ee0-ae2a-5fbb782ba925\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:19:46 crc kubenswrapper[4823]: I1216 07:19:46.782858 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 16 07:19:46 crc kubenswrapper[4823]: I1216 07:19:46.801038 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 16 07:19:47 crc kubenswrapper[4823]: I1216 07:19:47.307699 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 16 07:19:47 crc kubenswrapper[4823]: W1216 07:19:47.327210 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9ed72123_2696_4dbd_b7bd_7509458dfaa0.slice/crio-104ca67bffb3be737b157021b8fa214e543d9e1deb897ef7db3c34fc8c54309d WatchSource:0}: Error finding container 104ca67bffb3be737b157021b8fa214e543d9e1deb897ef7db3c34fc8c54309d: Status 404 returned error can't find the container with id 104ca67bffb3be737b157021b8fa214e543d9e1deb897ef7db3c34fc8c54309d Dec 16 07:19:47 crc kubenswrapper[4823]: W1216 07:19:47.397584 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbc7377ca_c7ab_4ee0_ae2a_5fbb782ba925.slice/crio-1aad62c97e347c1cb323d949097e8cf2b4fd9c0df9bffef7fcb5c5eb54fa2f65 WatchSource:0}: Error finding container 1aad62c97e347c1cb323d949097e8cf2b4fd9c0df9bffef7fcb5c5eb54fa2f65: Status 404 returned error can't find the container with id 1aad62c97e347c1cb323d949097e8cf2b4fd9c0df9bffef7fcb5c5eb54fa2f65 Dec 16 07:19:47 crc kubenswrapper[4823]: I1216 07:19:47.397856 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 16 07:19:47 crc 
kubenswrapper[4823]: I1216 07:19:47.788326 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27f18cac-1d41-44f1-b1b1-81cd65e8162a" path="/var/lib/kubelet/pods/27f18cac-1d41-44f1-b1b1-81cd65e8162a/volumes" Dec 16 07:19:47 crc kubenswrapper[4823]: I1216 07:19:47.789537 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85fada8-dce0-4e2b-83f5-ebf4f6fe2c80" path="/var/lib/kubelet/pods/f85fada8-dce0-4e2b-83f5-ebf4f6fe2c80/volumes" Dec 16 07:19:48 crc kubenswrapper[4823]: I1216 07:19:48.311454 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9ed72123-2696-4dbd-b7bd-7509458dfaa0","Type":"ContainerStarted","Data":"ecb083e34591e17d3bff711b2f12bbfbbbdd25c03157db0c53fb26956b8e446b"} Dec 16 07:19:48 crc kubenswrapper[4823]: I1216 07:19:48.311808 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9ed72123-2696-4dbd-b7bd-7509458dfaa0","Type":"ContainerStarted","Data":"104ca67bffb3be737b157021b8fa214e543d9e1deb897ef7db3c34fc8c54309d"} Dec 16 07:19:48 crc kubenswrapper[4823]: I1216 07:19:48.312890 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"bc7377ca-c7ab-4ee0-ae2a-5fbb782ba925","Type":"ContainerStarted","Data":"3ce6c26d6258938fda89230a518530d00595939cb83c8d60892c9449174748b0"} Dec 16 07:19:48 crc kubenswrapper[4823]: I1216 07:19:48.312936 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"bc7377ca-c7ab-4ee0-ae2a-5fbb782ba925","Type":"ContainerStarted","Data":"1aad62c97e347c1cb323d949097e8cf2b4fd9c0df9bffef7fcb5c5eb54fa2f65"} Dec 16 07:19:48 crc kubenswrapper[4823]: I1216 07:19:48.490699 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-qm72p"] Dec 16 07:19:48 crc kubenswrapper[4823]: I1216 07:19:48.491983 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-qm72p" Dec 16 07:19:48 crc kubenswrapper[4823]: I1216 07:19:48.494081 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Dec 16 07:19:48 crc kubenswrapper[4823]: I1216 07:19:48.495776 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 16 07:19:48 crc kubenswrapper[4823]: I1216 07:19:48.495945 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-7vhm5" Dec 16 07:19:48 crc kubenswrapper[4823]: I1216 07:19:48.521672 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-qm72p"] Dec 16 07:19:48 crc kubenswrapper[4823]: I1216 07:19:48.614789 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0d16c10-6c99-4b18-b515-4a9c18c830b5-config-data\") pod \"nova-cell0-conductor-db-sync-qm72p\" (UID: \"e0d16c10-6c99-4b18-b515-4a9c18c830b5\") " pod="openstack/nova-cell0-conductor-db-sync-qm72p" Dec 16 07:19:48 crc kubenswrapper[4823]: I1216 07:19:48.616902 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cj7qq\" (UniqueName: \"kubernetes.io/projected/e0d16c10-6c99-4b18-b515-4a9c18c830b5-kube-api-access-cj7qq\") pod \"nova-cell0-conductor-db-sync-qm72p\" (UID: \"e0d16c10-6c99-4b18-b515-4a9c18c830b5\") " pod="openstack/nova-cell0-conductor-db-sync-qm72p" Dec 16 07:19:48 crc kubenswrapper[4823]: I1216 07:19:48.617257 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0d16c10-6c99-4b18-b515-4a9c18c830b5-scripts\") pod \"nova-cell0-conductor-db-sync-qm72p\" (UID: \"e0d16c10-6c99-4b18-b515-4a9c18c830b5\") " 
pod="openstack/nova-cell0-conductor-db-sync-qm72p" Dec 16 07:19:48 crc kubenswrapper[4823]: I1216 07:19:48.617470 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0d16c10-6c99-4b18-b515-4a9c18c830b5-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-qm72p\" (UID: \"e0d16c10-6c99-4b18-b515-4a9c18c830b5\") " pod="openstack/nova-cell0-conductor-db-sync-qm72p" Dec 16 07:19:48 crc kubenswrapper[4823]: I1216 07:19:48.733401 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0d16c10-6c99-4b18-b515-4a9c18c830b5-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-qm72p\" (UID: \"e0d16c10-6c99-4b18-b515-4a9c18c830b5\") " pod="openstack/nova-cell0-conductor-db-sync-qm72p" Dec 16 07:19:48 crc kubenswrapper[4823]: I1216 07:19:48.733611 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0d16c10-6c99-4b18-b515-4a9c18c830b5-config-data\") pod \"nova-cell0-conductor-db-sync-qm72p\" (UID: \"e0d16c10-6c99-4b18-b515-4a9c18c830b5\") " pod="openstack/nova-cell0-conductor-db-sync-qm72p" Dec 16 07:19:48 crc kubenswrapper[4823]: I1216 07:19:48.733720 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cj7qq\" (UniqueName: \"kubernetes.io/projected/e0d16c10-6c99-4b18-b515-4a9c18c830b5-kube-api-access-cj7qq\") pod \"nova-cell0-conductor-db-sync-qm72p\" (UID: \"e0d16c10-6c99-4b18-b515-4a9c18c830b5\") " pod="openstack/nova-cell0-conductor-db-sync-qm72p" Dec 16 07:19:48 crc kubenswrapper[4823]: I1216 07:19:48.733916 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0d16c10-6c99-4b18-b515-4a9c18c830b5-scripts\") pod \"nova-cell0-conductor-db-sync-qm72p\" (UID: 
\"e0d16c10-6c99-4b18-b515-4a9c18c830b5\") " pod="openstack/nova-cell0-conductor-db-sync-qm72p" Dec 16 07:19:48 crc kubenswrapper[4823]: I1216 07:19:48.741979 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0d16c10-6c99-4b18-b515-4a9c18c830b5-scripts\") pod \"nova-cell0-conductor-db-sync-qm72p\" (UID: \"e0d16c10-6c99-4b18-b515-4a9c18c830b5\") " pod="openstack/nova-cell0-conductor-db-sync-qm72p" Dec 16 07:19:48 crc kubenswrapper[4823]: I1216 07:19:48.744602 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0d16c10-6c99-4b18-b515-4a9c18c830b5-config-data\") pod \"nova-cell0-conductor-db-sync-qm72p\" (UID: \"e0d16c10-6c99-4b18-b515-4a9c18c830b5\") " pod="openstack/nova-cell0-conductor-db-sync-qm72p" Dec 16 07:19:48 crc kubenswrapper[4823]: I1216 07:19:48.753056 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0d16c10-6c99-4b18-b515-4a9c18c830b5-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-qm72p\" (UID: \"e0d16c10-6c99-4b18-b515-4a9c18c830b5\") " pod="openstack/nova-cell0-conductor-db-sync-qm72p" Dec 16 07:19:48 crc kubenswrapper[4823]: I1216 07:19:48.759250 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cj7qq\" (UniqueName: \"kubernetes.io/projected/e0d16c10-6c99-4b18-b515-4a9c18c830b5-kube-api-access-cj7qq\") pod \"nova-cell0-conductor-db-sync-qm72p\" (UID: \"e0d16c10-6c99-4b18-b515-4a9c18c830b5\") " pod="openstack/nova-cell0-conductor-db-sync-qm72p" Dec 16 07:19:48 crc kubenswrapper[4823]: I1216 07:19:48.830796 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-qm72p" Dec 16 07:19:49 crc kubenswrapper[4823]: I1216 07:19:49.323065 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9ed72123-2696-4dbd-b7bd-7509458dfaa0","Type":"ContainerStarted","Data":"cc997320328ecd0b5433bb9c9d52e6bd6bd5a69b2afaec04e2122e2788aaa14b"} Dec 16 07:19:49 crc kubenswrapper[4823]: I1216 07:19:49.325048 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"bc7377ca-c7ab-4ee0-ae2a-5fbb782ba925","Type":"ContainerStarted","Data":"a3797feae0da2f46b99e7827ab8d4f11114590dcdde7cc7247a8b58f538e9505"} Dec 16 07:19:49 crc kubenswrapper[4823]: I1216 07:19:49.362146 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-qm72p"] Dec 16 07:19:49 crc kubenswrapper[4823]: I1216 07:19:49.365620 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.365599201 podStartE2EDuration="3.365599201s" podCreationTimestamp="2025-12-16 07:19:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 07:19:49.345900454 +0000 UTC m=+1467.834466577" watchObservedRunningTime="2025-12-16 07:19:49.365599201 +0000 UTC m=+1467.854165324" Dec 16 07:19:50 crc kubenswrapper[4823]: I1216 07:19:50.337552 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9ed72123-2696-4dbd-b7bd-7509458dfaa0","Type":"ContainerStarted","Data":"db7040f6856e1d319563364284196e734bc8ef160feda3501df3bfe7793745b6"} Dec 16 07:19:50 crc kubenswrapper[4823]: I1216 07:19:50.339012 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-qm72p" 
event={"ID":"e0d16c10-6c99-4b18-b515-4a9c18c830b5","Type":"ContainerStarted","Data":"85c043aa9b80f87aa942636042adc26ac770f0b4310d96d228a70b98e2c1075f"} Dec 16 07:19:50 crc kubenswrapper[4823]: I1216 07:19:50.721117 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Dec 16 07:19:51 crc kubenswrapper[4823]: I1216 07:19:51.355950 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9ed72123-2696-4dbd-b7bd-7509458dfaa0","Type":"ContainerStarted","Data":"2f65dbae1908971e62c0b5183daa807f87eb8a86a23c4a86bf77dbf243fcc6ba"} Dec 16 07:19:51 crc kubenswrapper[4823]: I1216 07:19:51.356344 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 16 07:19:51 crc kubenswrapper[4823]: I1216 07:19:51.383087 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.745957346 podStartE2EDuration="5.3830682s" podCreationTimestamp="2025-12-16 07:19:46 +0000 UTC" firstStartedPulling="2025-12-16 07:19:47.329970566 +0000 UTC m=+1465.818536689" lastFinishedPulling="2025-12-16 07:19:50.96708142 +0000 UTC m=+1469.455647543" observedRunningTime="2025-12-16 07:19:51.381765238 +0000 UTC m=+1469.870331381" watchObservedRunningTime="2025-12-16 07:19:51.3830682 +0000 UTC m=+1469.871634333" Dec 16 07:19:53 crc kubenswrapper[4823]: I1216 07:19:53.665010 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 16 07:19:53 crc kubenswrapper[4823]: I1216 07:19:53.667643 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 16 07:19:53 crc kubenswrapper[4823]: I1216 07:19:53.704632 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 16 07:19:53 crc kubenswrapper[4823]: I1216 07:19:53.715554 
4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 16 07:19:54 crc kubenswrapper[4823]: I1216 07:19:54.393921 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 16 07:19:54 crc kubenswrapper[4823]: I1216 07:19:54.394005 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 16 07:19:56 crc kubenswrapper[4823]: I1216 07:19:56.514823 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 16 07:19:56 crc kubenswrapper[4823]: I1216 07:19:56.515258 4823 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 16 07:19:56 crc kubenswrapper[4823]: I1216 07:19:56.518439 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 16 07:19:56 crc kubenswrapper[4823]: I1216 07:19:56.784806 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 16 07:19:56 crc kubenswrapper[4823]: I1216 07:19:56.784854 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 16 07:19:56 crc kubenswrapper[4823]: I1216 07:19:56.824436 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 16 07:19:56 crc kubenswrapper[4823]: I1216 07:19:56.838428 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 16 07:19:57 crc kubenswrapper[4823]: I1216 07:19:57.421988 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 16 07:19:57 crc kubenswrapper[4823]: I1216 07:19:57.422375 4823 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 16 07:19:58 crc kubenswrapper[4823]: I1216 07:19:58.134235 4823 patch_prober.go:28] interesting pod/machine-config-daemon-fv56f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 07:19:58 crc kubenswrapper[4823]: I1216 07:19:58.134302 4823 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 07:19:58 crc kubenswrapper[4823]: I1216 07:19:58.432784 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-qm72p" event={"ID":"e0d16c10-6c99-4b18-b515-4a9c18c830b5","Type":"ContainerStarted","Data":"c415931a0be8201cf5c9581bbc5fd4c823fe4b93ff209f48cd059567ea181a32"} Dec 16 07:19:58 crc kubenswrapper[4823]: I1216 07:19:58.456862 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-qm72p" podStartSLOduration=2.36620563 podStartE2EDuration="10.456845541s" podCreationTimestamp="2025-12-16 07:19:48 +0000 UTC" firstStartedPulling="2025-12-16 07:19:49.3556534 +0000 UTC m=+1467.844219523" lastFinishedPulling="2025-12-16 07:19:57.446293311 +0000 UTC m=+1475.934859434" observedRunningTime="2025-12-16 07:19:58.450262645 +0000 UTC m=+1476.938828768" watchObservedRunningTime="2025-12-16 07:19:58.456845541 +0000 UTC m=+1476.945411664" Dec 16 07:19:59 crc kubenswrapper[4823]: I1216 07:19:59.521572 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 16 07:19:59 crc 
kubenswrapper[4823]: I1216 07:19:59.521946 4823 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 16 07:19:59 crc kubenswrapper[4823]: I1216 07:19:59.525432 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 16 07:20:01 crc kubenswrapper[4823]: I1216 07:20:01.537359 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 16 07:20:01 crc kubenswrapper[4823]: I1216 07:20:01.537980 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9ed72123-2696-4dbd-b7bd-7509458dfaa0" containerName="ceilometer-central-agent" containerID="cri-o://ecb083e34591e17d3bff711b2f12bbfbbbdd25c03157db0c53fb26956b8e446b" gracePeriod=30 Dec 16 07:20:01 crc kubenswrapper[4823]: I1216 07:20:01.538126 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9ed72123-2696-4dbd-b7bd-7509458dfaa0" containerName="ceilometer-notification-agent" containerID="cri-o://cc997320328ecd0b5433bb9c9d52e6bd6bd5a69b2afaec04e2122e2788aaa14b" gracePeriod=30 Dec 16 07:20:01 crc kubenswrapper[4823]: I1216 07:20:01.538142 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9ed72123-2696-4dbd-b7bd-7509458dfaa0" containerName="sg-core" containerID="cri-o://db7040f6856e1d319563364284196e734bc8ef160feda3501df3bfe7793745b6" gracePeriod=30 Dec 16 07:20:01 crc kubenswrapper[4823]: I1216 07:20:01.538361 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9ed72123-2696-4dbd-b7bd-7509458dfaa0" containerName="proxy-httpd" containerID="cri-o://2f65dbae1908971e62c0b5183daa807f87eb8a86a23c4a86bf77dbf243fcc6ba" gracePeriod=30 Dec 16 07:20:01 crc kubenswrapper[4823]: I1216 07:20:01.580798 4823 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openstack/ceilometer-0" podUID="9ed72123-2696-4dbd-b7bd-7509458dfaa0" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.172:3000/\": EOF" Dec 16 07:20:02 crc kubenswrapper[4823]: I1216 07:20:02.467566 4823 generic.go:334] "Generic (PLEG): container finished" podID="9ed72123-2696-4dbd-b7bd-7509458dfaa0" containerID="2f65dbae1908971e62c0b5183daa807f87eb8a86a23c4a86bf77dbf243fcc6ba" exitCode=0 Dec 16 07:20:02 crc kubenswrapper[4823]: I1216 07:20:02.467941 4823 generic.go:334] "Generic (PLEG): container finished" podID="9ed72123-2696-4dbd-b7bd-7509458dfaa0" containerID="db7040f6856e1d319563364284196e734bc8ef160feda3501df3bfe7793745b6" exitCode=2 Dec 16 07:20:02 crc kubenswrapper[4823]: I1216 07:20:02.467963 4823 generic.go:334] "Generic (PLEG): container finished" podID="9ed72123-2696-4dbd-b7bd-7509458dfaa0" containerID="ecb083e34591e17d3bff711b2f12bbfbbbdd25c03157db0c53fb26956b8e446b" exitCode=0 Dec 16 07:20:02 crc kubenswrapper[4823]: I1216 07:20:02.467768 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9ed72123-2696-4dbd-b7bd-7509458dfaa0","Type":"ContainerDied","Data":"2f65dbae1908971e62c0b5183daa807f87eb8a86a23c4a86bf77dbf243fcc6ba"} Dec 16 07:20:02 crc kubenswrapper[4823]: I1216 07:20:02.468130 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9ed72123-2696-4dbd-b7bd-7509458dfaa0","Type":"ContainerDied","Data":"db7040f6856e1d319563364284196e734bc8ef160feda3501df3bfe7793745b6"} Dec 16 07:20:02 crc kubenswrapper[4823]: I1216 07:20:02.468162 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9ed72123-2696-4dbd-b7bd-7509458dfaa0","Type":"ContainerDied","Data":"ecb083e34591e17d3bff711b2f12bbfbbbdd25c03157db0c53fb26956b8e446b"} Dec 16 07:20:06 crc kubenswrapper[4823]: I1216 07:20:06.125730 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-5855z"] 
Dec 16 07:20:06 crc kubenswrapper[4823]: I1216 07:20:06.128321 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5855z" Dec 16 07:20:06 crc kubenswrapper[4823]: I1216 07:20:06.136476 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5855z"] Dec 16 07:20:06 crc kubenswrapper[4823]: I1216 07:20:06.277078 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxjs9\" (UniqueName: \"kubernetes.io/projected/9fa50495-02f1-4a8d-a65a-8f28ebedaa92-kube-api-access-hxjs9\") pod \"certified-operators-5855z\" (UID: \"9fa50495-02f1-4a8d-a65a-8f28ebedaa92\") " pod="openshift-marketplace/certified-operators-5855z" Dec 16 07:20:06 crc kubenswrapper[4823]: I1216 07:20:06.277682 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9fa50495-02f1-4a8d-a65a-8f28ebedaa92-utilities\") pod \"certified-operators-5855z\" (UID: \"9fa50495-02f1-4a8d-a65a-8f28ebedaa92\") " pod="openshift-marketplace/certified-operators-5855z" Dec 16 07:20:06 crc kubenswrapper[4823]: I1216 07:20:06.277738 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9fa50495-02f1-4a8d-a65a-8f28ebedaa92-catalog-content\") pod \"certified-operators-5855z\" (UID: \"9fa50495-02f1-4a8d-a65a-8f28ebedaa92\") " pod="openshift-marketplace/certified-operators-5855z" Dec 16 07:20:06 crc kubenswrapper[4823]: I1216 07:20:06.379316 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxjs9\" (UniqueName: \"kubernetes.io/projected/9fa50495-02f1-4a8d-a65a-8f28ebedaa92-kube-api-access-hxjs9\") pod \"certified-operators-5855z\" (UID: \"9fa50495-02f1-4a8d-a65a-8f28ebedaa92\") " 
pod="openshift-marketplace/certified-operators-5855z" Dec 16 07:20:06 crc kubenswrapper[4823]: I1216 07:20:06.379467 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9fa50495-02f1-4a8d-a65a-8f28ebedaa92-utilities\") pod \"certified-operators-5855z\" (UID: \"9fa50495-02f1-4a8d-a65a-8f28ebedaa92\") " pod="openshift-marketplace/certified-operators-5855z" Dec 16 07:20:06 crc kubenswrapper[4823]: I1216 07:20:06.379492 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9fa50495-02f1-4a8d-a65a-8f28ebedaa92-catalog-content\") pod \"certified-operators-5855z\" (UID: \"9fa50495-02f1-4a8d-a65a-8f28ebedaa92\") " pod="openshift-marketplace/certified-operators-5855z" Dec 16 07:20:06 crc kubenswrapper[4823]: I1216 07:20:06.379970 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9fa50495-02f1-4a8d-a65a-8f28ebedaa92-utilities\") pod \"certified-operators-5855z\" (UID: \"9fa50495-02f1-4a8d-a65a-8f28ebedaa92\") " pod="openshift-marketplace/certified-operators-5855z" Dec 16 07:20:06 crc kubenswrapper[4823]: I1216 07:20:06.380014 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9fa50495-02f1-4a8d-a65a-8f28ebedaa92-catalog-content\") pod \"certified-operators-5855z\" (UID: \"9fa50495-02f1-4a8d-a65a-8f28ebedaa92\") " pod="openshift-marketplace/certified-operators-5855z" Dec 16 07:20:06 crc kubenswrapper[4823]: I1216 07:20:06.402224 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxjs9\" (UniqueName: \"kubernetes.io/projected/9fa50495-02f1-4a8d-a65a-8f28ebedaa92-kube-api-access-hxjs9\") pod \"certified-operators-5855z\" (UID: \"9fa50495-02f1-4a8d-a65a-8f28ebedaa92\") " 
pod="openshift-marketplace/certified-operators-5855z" Dec 16 07:20:06 crc kubenswrapper[4823]: I1216 07:20:06.448880 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5855z" Dec 16 07:20:07 crc kubenswrapper[4823]: I1216 07:20:07.072330 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 16 07:20:07 crc kubenswrapper[4823]: I1216 07:20:07.099986 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9ed72123-2696-4dbd-b7bd-7509458dfaa0-sg-core-conf-yaml\") pod \"9ed72123-2696-4dbd-b7bd-7509458dfaa0\" (UID: \"9ed72123-2696-4dbd-b7bd-7509458dfaa0\") " Dec 16 07:20:07 crc kubenswrapper[4823]: I1216 07:20:07.100050 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ed72123-2696-4dbd-b7bd-7509458dfaa0-combined-ca-bundle\") pod \"9ed72123-2696-4dbd-b7bd-7509458dfaa0\" (UID: \"9ed72123-2696-4dbd-b7bd-7509458dfaa0\") " Dec 16 07:20:07 crc kubenswrapper[4823]: I1216 07:20:07.100093 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ed72123-2696-4dbd-b7bd-7509458dfaa0-scripts\") pod \"9ed72123-2696-4dbd-b7bd-7509458dfaa0\" (UID: \"9ed72123-2696-4dbd-b7bd-7509458dfaa0\") " Dec 16 07:20:07 crc kubenswrapper[4823]: I1216 07:20:07.100264 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sn4rv\" (UniqueName: \"kubernetes.io/projected/9ed72123-2696-4dbd-b7bd-7509458dfaa0-kube-api-access-sn4rv\") pod \"9ed72123-2696-4dbd-b7bd-7509458dfaa0\" (UID: \"9ed72123-2696-4dbd-b7bd-7509458dfaa0\") " Dec 16 07:20:07 crc kubenswrapper[4823]: I1216 07:20:07.100325 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/9ed72123-2696-4dbd-b7bd-7509458dfaa0-config-data\") pod \"9ed72123-2696-4dbd-b7bd-7509458dfaa0\" (UID: \"9ed72123-2696-4dbd-b7bd-7509458dfaa0\") " Dec 16 07:20:07 crc kubenswrapper[4823]: I1216 07:20:07.100392 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9ed72123-2696-4dbd-b7bd-7509458dfaa0-run-httpd\") pod \"9ed72123-2696-4dbd-b7bd-7509458dfaa0\" (UID: \"9ed72123-2696-4dbd-b7bd-7509458dfaa0\") " Dec 16 07:20:07 crc kubenswrapper[4823]: I1216 07:20:07.100420 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9ed72123-2696-4dbd-b7bd-7509458dfaa0-log-httpd\") pod \"9ed72123-2696-4dbd-b7bd-7509458dfaa0\" (UID: \"9ed72123-2696-4dbd-b7bd-7509458dfaa0\") " Dec 16 07:20:07 crc kubenswrapper[4823]: I1216 07:20:07.101318 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ed72123-2696-4dbd-b7bd-7509458dfaa0-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "9ed72123-2696-4dbd-b7bd-7509458dfaa0" (UID: "9ed72123-2696-4dbd-b7bd-7509458dfaa0"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:20:07 crc kubenswrapper[4823]: I1216 07:20:07.102065 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ed72123-2696-4dbd-b7bd-7509458dfaa0-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "9ed72123-2696-4dbd-b7bd-7509458dfaa0" (UID: "9ed72123-2696-4dbd-b7bd-7509458dfaa0"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:20:07 crc kubenswrapper[4823]: I1216 07:20:07.128290 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ed72123-2696-4dbd-b7bd-7509458dfaa0-scripts" (OuterVolumeSpecName: "scripts") pod "9ed72123-2696-4dbd-b7bd-7509458dfaa0" (UID: "9ed72123-2696-4dbd-b7bd-7509458dfaa0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:20:07 crc kubenswrapper[4823]: I1216 07:20:07.128446 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ed72123-2696-4dbd-b7bd-7509458dfaa0-kube-api-access-sn4rv" (OuterVolumeSpecName: "kube-api-access-sn4rv") pod "9ed72123-2696-4dbd-b7bd-7509458dfaa0" (UID: "9ed72123-2696-4dbd-b7bd-7509458dfaa0"). InnerVolumeSpecName "kube-api-access-sn4rv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:20:07 crc kubenswrapper[4823]: I1216 07:20:07.132989 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ed72123-2696-4dbd-b7bd-7509458dfaa0-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "9ed72123-2696-4dbd-b7bd-7509458dfaa0" (UID: "9ed72123-2696-4dbd-b7bd-7509458dfaa0"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:20:07 crc kubenswrapper[4823]: I1216 07:20:07.200177 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ed72123-2696-4dbd-b7bd-7509458dfaa0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9ed72123-2696-4dbd-b7bd-7509458dfaa0" (UID: "9ed72123-2696-4dbd-b7bd-7509458dfaa0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:20:07 crc kubenswrapper[4823]: I1216 07:20:07.202171 4823 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9ed72123-2696-4dbd-b7bd-7509458dfaa0-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 16 07:20:07 crc kubenswrapper[4823]: I1216 07:20:07.202201 4823 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9ed72123-2696-4dbd-b7bd-7509458dfaa0-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 16 07:20:07 crc kubenswrapper[4823]: I1216 07:20:07.202210 4823 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9ed72123-2696-4dbd-b7bd-7509458dfaa0-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 16 07:20:07 crc kubenswrapper[4823]: I1216 07:20:07.202218 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ed72123-2696-4dbd-b7bd-7509458dfaa0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:20:07 crc kubenswrapper[4823]: I1216 07:20:07.202227 4823 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ed72123-2696-4dbd-b7bd-7509458dfaa0-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:20:07 crc kubenswrapper[4823]: I1216 07:20:07.202235 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sn4rv\" (UniqueName: \"kubernetes.io/projected/9ed72123-2696-4dbd-b7bd-7509458dfaa0-kube-api-access-sn4rv\") on node \"crc\" DevicePath \"\"" Dec 16 07:20:07 crc kubenswrapper[4823]: I1216 07:20:07.249923 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ed72123-2696-4dbd-b7bd-7509458dfaa0-config-data" (OuterVolumeSpecName: "config-data") pod "9ed72123-2696-4dbd-b7bd-7509458dfaa0" (UID: "9ed72123-2696-4dbd-b7bd-7509458dfaa0"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:20:07 crc kubenswrapper[4823]: I1216 07:20:07.303517 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ed72123-2696-4dbd-b7bd-7509458dfaa0-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 07:20:07 crc kubenswrapper[4823]: I1216 07:20:07.394300 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5855z"] Dec 16 07:20:07 crc kubenswrapper[4823]: W1216 07:20:07.395112 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9fa50495_02f1_4a8d_a65a_8f28ebedaa92.slice/crio-af60c4150f4400ea9af4ffcfe91fa37e307504200a4d944d5dc2b8072c44dd0e WatchSource:0}: Error finding container af60c4150f4400ea9af4ffcfe91fa37e307504200a4d944d5dc2b8072c44dd0e: Status 404 returned error can't find the container with id af60c4150f4400ea9af4ffcfe91fa37e307504200a4d944d5dc2b8072c44dd0e Dec 16 07:20:07 crc kubenswrapper[4823]: I1216 07:20:07.518945 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5855z" event={"ID":"9fa50495-02f1-4a8d-a65a-8f28ebedaa92","Type":"ContainerStarted","Data":"af60c4150f4400ea9af4ffcfe91fa37e307504200a4d944d5dc2b8072c44dd0e"} Dec 16 07:20:07 crc kubenswrapper[4823]: I1216 07:20:07.521736 4823 generic.go:334] "Generic (PLEG): container finished" podID="9ed72123-2696-4dbd-b7bd-7509458dfaa0" containerID="cc997320328ecd0b5433bb9c9d52e6bd6bd5a69b2afaec04e2122e2788aaa14b" exitCode=0 Dec 16 07:20:07 crc kubenswrapper[4823]: I1216 07:20:07.521772 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9ed72123-2696-4dbd-b7bd-7509458dfaa0","Type":"ContainerDied","Data":"cc997320328ecd0b5433bb9c9d52e6bd6bd5a69b2afaec04e2122e2788aaa14b"} Dec 16 07:20:07 crc kubenswrapper[4823]: I1216 07:20:07.521802 
4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9ed72123-2696-4dbd-b7bd-7509458dfaa0","Type":"ContainerDied","Data":"104ca67bffb3be737b157021b8fa214e543d9e1deb897ef7db3c34fc8c54309d"} Dec 16 07:20:07 crc kubenswrapper[4823]: I1216 07:20:07.521819 4823 scope.go:117] "RemoveContainer" containerID="2f65dbae1908971e62c0b5183daa807f87eb8a86a23c4a86bf77dbf243fcc6ba" Dec 16 07:20:07 crc kubenswrapper[4823]: I1216 07:20:07.521817 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 16 07:20:07 crc kubenswrapper[4823]: I1216 07:20:07.552655 4823 scope.go:117] "RemoveContainer" containerID="db7040f6856e1d319563364284196e734bc8ef160feda3501df3bfe7793745b6" Dec 16 07:20:07 crc kubenswrapper[4823]: I1216 07:20:07.579844 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 16 07:20:07 crc kubenswrapper[4823]: I1216 07:20:07.582517 4823 scope.go:117] "RemoveContainer" containerID="cc997320328ecd0b5433bb9c9d52e6bd6bd5a69b2afaec04e2122e2788aaa14b" Dec 16 07:20:07 crc kubenswrapper[4823]: I1216 07:20:07.582647 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 16 07:20:07 crc kubenswrapper[4823]: I1216 07:20:07.597330 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 16 07:20:07 crc kubenswrapper[4823]: E1216 07:20:07.597687 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ed72123-2696-4dbd-b7bd-7509458dfaa0" containerName="ceilometer-central-agent" Dec 16 07:20:07 crc kubenswrapper[4823]: I1216 07:20:07.597703 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ed72123-2696-4dbd-b7bd-7509458dfaa0" containerName="ceilometer-central-agent" Dec 16 07:20:07 crc kubenswrapper[4823]: E1216 07:20:07.597710 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ed72123-2696-4dbd-b7bd-7509458dfaa0" 
containerName="ceilometer-notification-agent" Dec 16 07:20:07 crc kubenswrapper[4823]: I1216 07:20:07.597717 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ed72123-2696-4dbd-b7bd-7509458dfaa0" containerName="ceilometer-notification-agent" Dec 16 07:20:07 crc kubenswrapper[4823]: E1216 07:20:07.597728 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ed72123-2696-4dbd-b7bd-7509458dfaa0" containerName="sg-core" Dec 16 07:20:07 crc kubenswrapper[4823]: I1216 07:20:07.597735 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ed72123-2696-4dbd-b7bd-7509458dfaa0" containerName="sg-core" Dec 16 07:20:07 crc kubenswrapper[4823]: E1216 07:20:07.597742 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ed72123-2696-4dbd-b7bd-7509458dfaa0" containerName="proxy-httpd" Dec 16 07:20:07 crc kubenswrapper[4823]: I1216 07:20:07.597748 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ed72123-2696-4dbd-b7bd-7509458dfaa0" containerName="proxy-httpd" Dec 16 07:20:07 crc kubenswrapper[4823]: I1216 07:20:07.597922 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ed72123-2696-4dbd-b7bd-7509458dfaa0" containerName="proxy-httpd" Dec 16 07:20:07 crc kubenswrapper[4823]: I1216 07:20:07.597940 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ed72123-2696-4dbd-b7bd-7509458dfaa0" containerName="ceilometer-notification-agent" Dec 16 07:20:07 crc kubenswrapper[4823]: I1216 07:20:07.597951 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ed72123-2696-4dbd-b7bd-7509458dfaa0" containerName="ceilometer-central-agent" Dec 16 07:20:07 crc kubenswrapper[4823]: I1216 07:20:07.597965 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ed72123-2696-4dbd-b7bd-7509458dfaa0" containerName="sg-core" Dec 16 07:20:07 crc kubenswrapper[4823]: I1216 07:20:07.599438 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 16 07:20:07 crc kubenswrapper[4823]: I1216 07:20:07.601531 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 16 07:20:07 crc kubenswrapper[4823]: I1216 07:20:07.602844 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 16 07:20:07 crc kubenswrapper[4823]: I1216 07:20:07.608999 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/29578ac7-f6f2-4a0d-8fab-39f39e55acc0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"29578ac7-f6f2-4a0d-8fab-39f39e55acc0\") " pod="openstack/ceilometer-0" Dec 16 07:20:07 crc kubenswrapper[4823]: I1216 07:20:07.609073 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 16 07:20:07 crc kubenswrapper[4823]: I1216 07:20:07.609108 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/29578ac7-f6f2-4a0d-8fab-39f39e55acc0-log-httpd\") pod \"ceilometer-0\" (UID: \"29578ac7-f6f2-4a0d-8fab-39f39e55acc0\") " pod="openstack/ceilometer-0" Dec 16 07:20:07 crc kubenswrapper[4823]: I1216 07:20:07.609229 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bxzx\" (UniqueName: \"kubernetes.io/projected/29578ac7-f6f2-4a0d-8fab-39f39e55acc0-kube-api-access-7bxzx\") pod \"ceilometer-0\" (UID: \"29578ac7-f6f2-4a0d-8fab-39f39e55acc0\") " pod="openstack/ceilometer-0" Dec 16 07:20:07 crc kubenswrapper[4823]: I1216 07:20:07.609264 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29578ac7-f6f2-4a0d-8fab-39f39e55acc0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"29578ac7-f6f2-4a0d-8fab-39f39e55acc0\") " pod="openstack/ceilometer-0" Dec 16 07:20:07 crc kubenswrapper[4823]: I1216 07:20:07.609311 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29578ac7-f6f2-4a0d-8fab-39f39e55acc0-scripts\") pod \"ceilometer-0\" (UID: \"29578ac7-f6f2-4a0d-8fab-39f39e55acc0\") " pod="openstack/ceilometer-0" Dec 16 07:20:07 crc kubenswrapper[4823]: I1216 07:20:07.609332 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29578ac7-f6f2-4a0d-8fab-39f39e55acc0-config-data\") pod \"ceilometer-0\" (UID: \"29578ac7-f6f2-4a0d-8fab-39f39e55acc0\") " pod="openstack/ceilometer-0" Dec 16 07:20:07 crc kubenswrapper[4823]: I1216 07:20:07.609386 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/29578ac7-f6f2-4a0d-8fab-39f39e55acc0-run-httpd\") pod \"ceilometer-0\" (UID: \"29578ac7-f6f2-4a0d-8fab-39f39e55acc0\") " pod="openstack/ceilometer-0" Dec 16 07:20:07 crc kubenswrapper[4823]: I1216 07:20:07.634208 4823 scope.go:117] "RemoveContainer" containerID="ecb083e34591e17d3bff711b2f12bbfbbbdd25c03157db0c53fb26956b8e446b" Dec 16 07:20:07 crc kubenswrapper[4823]: I1216 07:20:07.658266 4823 scope.go:117] "RemoveContainer" containerID="2f65dbae1908971e62c0b5183daa807f87eb8a86a23c4a86bf77dbf243fcc6ba" Dec 16 07:20:07 crc kubenswrapper[4823]: E1216 07:20:07.659821 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f65dbae1908971e62c0b5183daa807f87eb8a86a23c4a86bf77dbf243fcc6ba\": container with ID starting with 2f65dbae1908971e62c0b5183daa807f87eb8a86a23c4a86bf77dbf243fcc6ba not found: ID does not exist" containerID="2f65dbae1908971e62c0b5183daa807f87eb8a86a23c4a86bf77dbf243fcc6ba" Dec 16 
07:20:07 crc kubenswrapper[4823]: I1216 07:20:07.659878 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f65dbae1908971e62c0b5183daa807f87eb8a86a23c4a86bf77dbf243fcc6ba"} err="failed to get container status \"2f65dbae1908971e62c0b5183daa807f87eb8a86a23c4a86bf77dbf243fcc6ba\": rpc error: code = NotFound desc = could not find container \"2f65dbae1908971e62c0b5183daa807f87eb8a86a23c4a86bf77dbf243fcc6ba\": container with ID starting with 2f65dbae1908971e62c0b5183daa807f87eb8a86a23c4a86bf77dbf243fcc6ba not found: ID does not exist" Dec 16 07:20:07 crc kubenswrapper[4823]: I1216 07:20:07.659908 4823 scope.go:117] "RemoveContainer" containerID="db7040f6856e1d319563364284196e734bc8ef160feda3501df3bfe7793745b6" Dec 16 07:20:07 crc kubenswrapper[4823]: E1216 07:20:07.660295 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db7040f6856e1d319563364284196e734bc8ef160feda3501df3bfe7793745b6\": container with ID starting with db7040f6856e1d319563364284196e734bc8ef160feda3501df3bfe7793745b6 not found: ID does not exist" containerID="db7040f6856e1d319563364284196e734bc8ef160feda3501df3bfe7793745b6" Dec 16 07:20:07 crc kubenswrapper[4823]: I1216 07:20:07.660323 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db7040f6856e1d319563364284196e734bc8ef160feda3501df3bfe7793745b6"} err="failed to get container status \"db7040f6856e1d319563364284196e734bc8ef160feda3501df3bfe7793745b6\": rpc error: code = NotFound desc = could not find container \"db7040f6856e1d319563364284196e734bc8ef160feda3501df3bfe7793745b6\": container with ID starting with db7040f6856e1d319563364284196e734bc8ef160feda3501df3bfe7793745b6 not found: ID does not exist" Dec 16 07:20:07 crc kubenswrapper[4823]: I1216 07:20:07.660339 4823 scope.go:117] "RemoveContainer" 
containerID="cc997320328ecd0b5433bb9c9d52e6bd6bd5a69b2afaec04e2122e2788aaa14b" Dec 16 07:20:07 crc kubenswrapper[4823]: E1216 07:20:07.660644 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc997320328ecd0b5433bb9c9d52e6bd6bd5a69b2afaec04e2122e2788aaa14b\": container with ID starting with cc997320328ecd0b5433bb9c9d52e6bd6bd5a69b2afaec04e2122e2788aaa14b not found: ID does not exist" containerID="cc997320328ecd0b5433bb9c9d52e6bd6bd5a69b2afaec04e2122e2788aaa14b" Dec 16 07:20:07 crc kubenswrapper[4823]: I1216 07:20:07.660671 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc997320328ecd0b5433bb9c9d52e6bd6bd5a69b2afaec04e2122e2788aaa14b"} err="failed to get container status \"cc997320328ecd0b5433bb9c9d52e6bd6bd5a69b2afaec04e2122e2788aaa14b\": rpc error: code = NotFound desc = could not find container \"cc997320328ecd0b5433bb9c9d52e6bd6bd5a69b2afaec04e2122e2788aaa14b\": container with ID starting with cc997320328ecd0b5433bb9c9d52e6bd6bd5a69b2afaec04e2122e2788aaa14b not found: ID does not exist" Dec 16 07:20:07 crc kubenswrapper[4823]: I1216 07:20:07.660689 4823 scope.go:117] "RemoveContainer" containerID="ecb083e34591e17d3bff711b2f12bbfbbbdd25c03157db0c53fb26956b8e446b" Dec 16 07:20:07 crc kubenswrapper[4823]: E1216 07:20:07.660987 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ecb083e34591e17d3bff711b2f12bbfbbbdd25c03157db0c53fb26956b8e446b\": container with ID starting with ecb083e34591e17d3bff711b2f12bbfbbbdd25c03157db0c53fb26956b8e446b not found: ID does not exist" containerID="ecb083e34591e17d3bff711b2f12bbfbbbdd25c03157db0c53fb26956b8e446b" Dec 16 07:20:07 crc kubenswrapper[4823]: I1216 07:20:07.661014 4823 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ecb083e34591e17d3bff711b2f12bbfbbbdd25c03157db0c53fb26956b8e446b"} err="failed to get container status \"ecb083e34591e17d3bff711b2f12bbfbbbdd25c03157db0c53fb26956b8e446b\": rpc error: code = NotFound desc = could not find container \"ecb083e34591e17d3bff711b2f12bbfbbbdd25c03157db0c53fb26956b8e446b\": container with ID starting with ecb083e34591e17d3bff711b2f12bbfbbbdd25c03157db0c53fb26956b8e446b not found: ID does not exist" Dec 16 07:20:07 crc kubenswrapper[4823]: I1216 07:20:07.710864 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/29578ac7-f6f2-4a0d-8fab-39f39e55acc0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"29578ac7-f6f2-4a0d-8fab-39f39e55acc0\") " pod="openstack/ceilometer-0" Dec 16 07:20:07 crc kubenswrapper[4823]: I1216 07:20:07.710924 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/29578ac7-f6f2-4a0d-8fab-39f39e55acc0-log-httpd\") pod \"ceilometer-0\" (UID: \"29578ac7-f6f2-4a0d-8fab-39f39e55acc0\") " pod="openstack/ceilometer-0" Dec 16 07:20:07 crc kubenswrapper[4823]: I1216 07:20:07.710972 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7bxzx\" (UniqueName: \"kubernetes.io/projected/29578ac7-f6f2-4a0d-8fab-39f39e55acc0-kube-api-access-7bxzx\") pod \"ceilometer-0\" (UID: \"29578ac7-f6f2-4a0d-8fab-39f39e55acc0\") " pod="openstack/ceilometer-0" Dec 16 07:20:07 crc kubenswrapper[4823]: I1216 07:20:07.711006 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29578ac7-f6f2-4a0d-8fab-39f39e55acc0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"29578ac7-f6f2-4a0d-8fab-39f39e55acc0\") " pod="openstack/ceilometer-0" Dec 16 07:20:07 crc kubenswrapper[4823]: I1216 07:20:07.711057 4823 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29578ac7-f6f2-4a0d-8fab-39f39e55acc0-scripts\") pod \"ceilometer-0\" (UID: \"29578ac7-f6f2-4a0d-8fab-39f39e55acc0\") " pod="openstack/ceilometer-0" Dec 16 07:20:07 crc kubenswrapper[4823]: I1216 07:20:07.711078 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29578ac7-f6f2-4a0d-8fab-39f39e55acc0-config-data\") pod \"ceilometer-0\" (UID: \"29578ac7-f6f2-4a0d-8fab-39f39e55acc0\") " pod="openstack/ceilometer-0" Dec 16 07:20:07 crc kubenswrapper[4823]: I1216 07:20:07.711115 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/29578ac7-f6f2-4a0d-8fab-39f39e55acc0-run-httpd\") pod \"ceilometer-0\" (UID: \"29578ac7-f6f2-4a0d-8fab-39f39e55acc0\") " pod="openstack/ceilometer-0" Dec 16 07:20:07 crc kubenswrapper[4823]: I1216 07:20:07.711476 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/29578ac7-f6f2-4a0d-8fab-39f39e55acc0-log-httpd\") pod \"ceilometer-0\" (UID: \"29578ac7-f6f2-4a0d-8fab-39f39e55acc0\") " pod="openstack/ceilometer-0" Dec 16 07:20:07 crc kubenswrapper[4823]: I1216 07:20:07.711562 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/29578ac7-f6f2-4a0d-8fab-39f39e55acc0-run-httpd\") pod \"ceilometer-0\" (UID: \"29578ac7-f6f2-4a0d-8fab-39f39e55acc0\") " pod="openstack/ceilometer-0" Dec 16 07:20:07 crc kubenswrapper[4823]: I1216 07:20:07.716588 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29578ac7-f6f2-4a0d-8fab-39f39e55acc0-config-data\") pod \"ceilometer-0\" (UID: \"29578ac7-f6f2-4a0d-8fab-39f39e55acc0\") " pod="openstack/ceilometer-0" Dec 16 07:20:07 crc kubenswrapper[4823]: 
I1216 07:20:07.717137 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29578ac7-f6f2-4a0d-8fab-39f39e55acc0-scripts\") pod \"ceilometer-0\" (UID: \"29578ac7-f6f2-4a0d-8fab-39f39e55acc0\") " pod="openstack/ceilometer-0" Dec 16 07:20:07 crc kubenswrapper[4823]: I1216 07:20:07.717648 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/29578ac7-f6f2-4a0d-8fab-39f39e55acc0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"29578ac7-f6f2-4a0d-8fab-39f39e55acc0\") " pod="openstack/ceilometer-0" Dec 16 07:20:07 crc kubenswrapper[4823]: I1216 07:20:07.718711 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29578ac7-f6f2-4a0d-8fab-39f39e55acc0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"29578ac7-f6f2-4a0d-8fab-39f39e55acc0\") " pod="openstack/ceilometer-0" Dec 16 07:20:07 crc kubenswrapper[4823]: I1216 07:20:07.729526 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bxzx\" (UniqueName: \"kubernetes.io/projected/29578ac7-f6f2-4a0d-8fab-39f39e55acc0-kube-api-access-7bxzx\") pod \"ceilometer-0\" (UID: \"29578ac7-f6f2-4a0d-8fab-39f39e55acc0\") " pod="openstack/ceilometer-0" Dec 16 07:20:07 crc kubenswrapper[4823]: I1216 07:20:07.785986 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ed72123-2696-4dbd-b7bd-7509458dfaa0" path="/var/lib/kubelet/pods/9ed72123-2696-4dbd-b7bd-7509458dfaa0/volumes" Dec 16 07:20:07 crc kubenswrapper[4823]: I1216 07:20:07.925145 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 16 07:20:08 crc kubenswrapper[4823]: I1216 07:20:08.378403 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 16 07:20:08 crc kubenswrapper[4823]: W1216 07:20:08.393309 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod29578ac7_f6f2_4a0d_8fab_39f39e55acc0.slice/crio-e3873de515a2cda4cf14db445c24855470a2a8dc8e4db320eca90123d6f878eb WatchSource:0}: Error finding container e3873de515a2cda4cf14db445c24855470a2a8dc8e4db320eca90123d6f878eb: Status 404 returned error can't find the container with id e3873de515a2cda4cf14db445c24855470a2a8dc8e4db320eca90123d6f878eb Dec 16 07:20:08 crc kubenswrapper[4823]: I1216 07:20:08.397405 4823 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 16 07:20:08 crc kubenswrapper[4823]: I1216 07:20:08.531841 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"29578ac7-f6f2-4a0d-8fab-39f39e55acc0","Type":"ContainerStarted","Data":"e3873de515a2cda4cf14db445c24855470a2a8dc8e4db320eca90123d6f878eb"} Dec 16 07:20:08 crc kubenswrapper[4823]: I1216 07:20:08.533941 4823 generic.go:334] "Generic (PLEG): container finished" podID="9fa50495-02f1-4a8d-a65a-8f28ebedaa92" containerID="5e71505e703b0f231c389f42b881b6695200a645a9f187b96bb2999e37242c2b" exitCode=0 Dec 16 07:20:08 crc kubenswrapper[4823]: I1216 07:20:08.533989 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5855z" event={"ID":"9fa50495-02f1-4a8d-a65a-8f28ebedaa92","Type":"ContainerDied","Data":"5e71505e703b0f231c389f42b881b6695200a645a9f187b96bb2999e37242c2b"} Dec 16 07:20:09 crc kubenswrapper[4823]: I1216 07:20:09.545340 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"29578ac7-f6f2-4a0d-8fab-39f39e55acc0","Type":"ContainerStarted","Data":"c955ce6041571e455d47f11d7672f330969cd7bee20a2ebc46c3b1b01bea3299"} Dec 16 07:20:09 crc kubenswrapper[4823]: I1216 07:20:09.547663 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5855z" event={"ID":"9fa50495-02f1-4a8d-a65a-8f28ebedaa92","Type":"ContainerStarted","Data":"a687b5919468b3772d87d1ca4c7eb55dd57c24f15ce4a3d8849269f768ce5470"} Dec 16 07:20:10 crc kubenswrapper[4823]: I1216 07:20:10.255069 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 16 07:20:10 crc kubenswrapper[4823]: I1216 07:20:10.576986 4823 generic.go:334] "Generic (PLEG): container finished" podID="9fa50495-02f1-4a8d-a65a-8f28ebedaa92" containerID="a687b5919468b3772d87d1ca4c7eb55dd57c24f15ce4a3d8849269f768ce5470" exitCode=0 Dec 16 07:20:10 crc kubenswrapper[4823]: I1216 07:20:10.577056 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5855z" event={"ID":"9fa50495-02f1-4a8d-a65a-8f28ebedaa92","Type":"ContainerDied","Data":"a687b5919468b3772d87d1ca4c7eb55dd57c24f15ce4a3d8849269f768ce5470"} Dec 16 07:20:12 crc kubenswrapper[4823]: I1216 07:20:12.604385 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5855z" event={"ID":"9fa50495-02f1-4a8d-a65a-8f28ebedaa92","Type":"ContainerStarted","Data":"b4dc436043d89179184d1f034863103dde286650864660b5bece9a67cbc9077e"} Dec 16 07:20:12 crc kubenswrapper[4823]: I1216 07:20:12.606139 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"29578ac7-f6f2-4a0d-8fab-39f39e55acc0","Type":"ContainerStarted","Data":"b16f86679c997d30b6aca6f2aa5df76a1880c68b3ec769c521c8945a324a11a5"} Dec 16 07:20:12 crc kubenswrapper[4823]: I1216 07:20:12.643500 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/certified-operators-5855z" podStartSLOduration=3.308792447 podStartE2EDuration="6.64348126s" podCreationTimestamp="2025-12-16 07:20:06 +0000 UTC" firstStartedPulling="2025-12-16 07:20:08.536058515 +0000 UTC m=+1487.024624638" lastFinishedPulling="2025-12-16 07:20:11.870747328 +0000 UTC m=+1490.359313451" observedRunningTime="2025-12-16 07:20:12.637044459 +0000 UTC m=+1491.125610582" watchObservedRunningTime="2025-12-16 07:20:12.64348126 +0000 UTC m=+1491.132047383" Dec 16 07:20:13 crc kubenswrapper[4823]: I1216 07:20:13.619230 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"29578ac7-f6f2-4a0d-8fab-39f39e55acc0","Type":"ContainerStarted","Data":"f010755d9b8b357e79bfe7052e92e23ba3c179146d138ec71e21d1b8bb9eeda4"} Dec 16 07:20:15 crc kubenswrapper[4823]: I1216 07:20:15.640070 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"29578ac7-f6f2-4a0d-8fab-39f39e55acc0","Type":"ContainerStarted","Data":"87627a22be8888cde133c7d64d4247cda38703e6fd8c15eccc434f6eddc8a8e6"} Dec 16 07:20:15 crc kubenswrapper[4823]: I1216 07:20:15.640646 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 16 07:20:15 crc kubenswrapper[4823]: I1216 07:20:15.640256 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="29578ac7-f6f2-4a0d-8fab-39f39e55acc0" containerName="proxy-httpd" containerID="cri-o://87627a22be8888cde133c7d64d4247cda38703e6fd8c15eccc434f6eddc8a8e6" gracePeriod=30 Dec 16 07:20:15 crc kubenswrapper[4823]: I1216 07:20:15.640209 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="29578ac7-f6f2-4a0d-8fab-39f39e55acc0" containerName="ceilometer-central-agent" containerID="cri-o://c955ce6041571e455d47f11d7672f330969cd7bee20a2ebc46c3b1b01bea3299" gracePeriod=30 Dec 16 07:20:15 crc kubenswrapper[4823]: 
I1216 07:20:15.640331 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="29578ac7-f6f2-4a0d-8fab-39f39e55acc0" containerName="ceilometer-notification-agent" containerID="cri-o://b16f86679c997d30b6aca6f2aa5df76a1880c68b3ec769c521c8945a324a11a5" gracePeriod=30 Dec 16 07:20:15 crc kubenswrapper[4823]: I1216 07:20:15.640290 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="29578ac7-f6f2-4a0d-8fab-39f39e55acc0" containerName="sg-core" containerID="cri-o://f010755d9b8b357e79bfe7052e92e23ba3c179146d138ec71e21d1b8bb9eeda4" gracePeriod=30 Dec 16 07:20:15 crc kubenswrapper[4823]: I1216 07:20:15.676471 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.580119234 podStartE2EDuration="8.676446193s" podCreationTimestamp="2025-12-16 07:20:07 +0000 UTC" firstStartedPulling="2025-12-16 07:20:08.397103232 +0000 UTC m=+1486.885669355" lastFinishedPulling="2025-12-16 07:20:14.493430191 +0000 UTC m=+1492.981996314" observedRunningTime="2025-12-16 07:20:15.659820582 +0000 UTC m=+1494.148386705" watchObservedRunningTime="2025-12-16 07:20:15.676446193 +0000 UTC m=+1494.165012316" Dec 16 07:20:16 crc kubenswrapper[4823]: I1216 07:20:16.449553 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-5855z" Dec 16 07:20:16 crc kubenswrapper[4823]: I1216 07:20:16.449865 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-5855z" Dec 16 07:20:16 crc kubenswrapper[4823]: I1216 07:20:16.652257 4823 generic.go:334] "Generic (PLEG): container finished" podID="29578ac7-f6f2-4a0d-8fab-39f39e55acc0" containerID="87627a22be8888cde133c7d64d4247cda38703e6fd8c15eccc434f6eddc8a8e6" exitCode=0 Dec 16 07:20:16 crc kubenswrapper[4823]: I1216 07:20:16.652313 4823 generic.go:334] "Generic 
(PLEG): container finished" podID="29578ac7-f6f2-4a0d-8fab-39f39e55acc0" containerID="f010755d9b8b357e79bfe7052e92e23ba3c179146d138ec71e21d1b8bb9eeda4" exitCode=2 Dec 16 07:20:16 crc kubenswrapper[4823]: I1216 07:20:16.652322 4823 generic.go:334] "Generic (PLEG): container finished" podID="29578ac7-f6f2-4a0d-8fab-39f39e55acc0" containerID="b16f86679c997d30b6aca6f2aa5df76a1880c68b3ec769c521c8945a324a11a5" exitCode=0 Dec 16 07:20:16 crc kubenswrapper[4823]: I1216 07:20:16.652338 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"29578ac7-f6f2-4a0d-8fab-39f39e55acc0","Type":"ContainerDied","Data":"87627a22be8888cde133c7d64d4247cda38703e6fd8c15eccc434f6eddc8a8e6"} Dec 16 07:20:16 crc kubenswrapper[4823]: I1216 07:20:16.652377 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"29578ac7-f6f2-4a0d-8fab-39f39e55acc0","Type":"ContainerDied","Data":"f010755d9b8b357e79bfe7052e92e23ba3c179146d138ec71e21d1b8bb9eeda4"} Dec 16 07:20:16 crc kubenswrapper[4823]: I1216 07:20:16.652397 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"29578ac7-f6f2-4a0d-8fab-39f39e55acc0","Type":"ContainerDied","Data":"b16f86679c997d30b6aca6f2aa5df76a1880c68b3ec769c521c8945a324a11a5"} Dec 16 07:20:17 crc kubenswrapper[4823]: I1216 07:20:17.499596 4823 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-5855z" podUID="9fa50495-02f1-4a8d-a65a-8f28ebedaa92" containerName="registry-server" probeResult="failure" output=< Dec 16 07:20:17 crc kubenswrapper[4823]: timeout: failed to connect service ":50051" within 1s Dec 16 07:20:17 crc kubenswrapper[4823]: > Dec 16 07:20:17 crc kubenswrapper[4823]: I1216 07:20:17.661293 4823 generic.go:334] "Generic (PLEG): container finished" podID="e0d16c10-6c99-4b18-b515-4a9c18c830b5" containerID="c415931a0be8201cf5c9581bbc5fd4c823fe4b93ff209f48cd059567ea181a32" exitCode=0 Dec 16 07:20:17 
crc kubenswrapper[4823]: I1216 07:20:17.661337 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-qm72p" event={"ID":"e0d16c10-6c99-4b18-b515-4a9c18c830b5","Type":"ContainerDied","Data":"c415931a0be8201cf5c9581bbc5fd4c823fe4b93ff209f48cd059567ea181a32"} Dec 16 07:20:19 crc kubenswrapper[4823]: I1216 07:20:19.112181 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-qm72p" Dec 16 07:20:19 crc kubenswrapper[4823]: I1216 07:20:19.221938 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0d16c10-6c99-4b18-b515-4a9c18c830b5-scripts\") pod \"e0d16c10-6c99-4b18-b515-4a9c18c830b5\" (UID: \"e0d16c10-6c99-4b18-b515-4a9c18c830b5\") " Dec 16 07:20:19 crc kubenswrapper[4823]: I1216 07:20:19.222348 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0d16c10-6c99-4b18-b515-4a9c18c830b5-config-data\") pod \"e0d16c10-6c99-4b18-b515-4a9c18c830b5\" (UID: \"e0d16c10-6c99-4b18-b515-4a9c18c830b5\") " Dec 16 07:20:19 crc kubenswrapper[4823]: I1216 07:20:19.222390 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0d16c10-6c99-4b18-b515-4a9c18c830b5-combined-ca-bundle\") pod \"e0d16c10-6c99-4b18-b515-4a9c18c830b5\" (UID: \"e0d16c10-6c99-4b18-b515-4a9c18c830b5\") " Dec 16 07:20:19 crc kubenswrapper[4823]: I1216 07:20:19.222560 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cj7qq\" (UniqueName: \"kubernetes.io/projected/e0d16c10-6c99-4b18-b515-4a9c18c830b5-kube-api-access-cj7qq\") pod \"e0d16c10-6c99-4b18-b515-4a9c18c830b5\" (UID: \"e0d16c10-6c99-4b18-b515-4a9c18c830b5\") " Dec 16 07:20:19 crc kubenswrapper[4823]: I1216 07:20:19.228239 4823 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0d16c10-6c99-4b18-b515-4a9c18c830b5-scripts" (OuterVolumeSpecName: "scripts") pod "e0d16c10-6c99-4b18-b515-4a9c18c830b5" (UID: "e0d16c10-6c99-4b18-b515-4a9c18c830b5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:20:19 crc kubenswrapper[4823]: I1216 07:20:19.229614 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0d16c10-6c99-4b18-b515-4a9c18c830b5-kube-api-access-cj7qq" (OuterVolumeSpecName: "kube-api-access-cj7qq") pod "e0d16c10-6c99-4b18-b515-4a9c18c830b5" (UID: "e0d16c10-6c99-4b18-b515-4a9c18c830b5"). InnerVolumeSpecName "kube-api-access-cj7qq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:20:19 crc kubenswrapper[4823]: I1216 07:20:19.249531 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0d16c10-6c99-4b18-b515-4a9c18c830b5-config-data" (OuterVolumeSpecName: "config-data") pod "e0d16c10-6c99-4b18-b515-4a9c18c830b5" (UID: "e0d16c10-6c99-4b18-b515-4a9c18c830b5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:20:19 crc kubenswrapper[4823]: I1216 07:20:19.260520 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0d16c10-6c99-4b18-b515-4a9c18c830b5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e0d16c10-6c99-4b18-b515-4a9c18c830b5" (UID: "e0d16c10-6c99-4b18-b515-4a9c18c830b5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:20:19 crc kubenswrapper[4823]: I1216 07:20:19.324868 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cj7qq\" (UniqueName: \"kubernetes.io/projected/e0d16c10-6c99-4b18-b515-4a9c18c830b5-kube-api-access-cj7qq\") on node \"crc\" DevicePath \"\"" Dec 16 07:20:19 crc kubenswrapper[4823]: I1216 07:20:19.324904 4823 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0d16c10-6c99-4b18-b515-4a9c18c830b5-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:20:19 crc kubenswrapper[4823]: I1216 07:20:19.324914 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0d16c10-6c99-4b18-b515-4a9c18c830b5-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 07:20:19 crc kubenswrapper[4823]: I1216 07:20:19.324925 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0d16c10-6c99-4b18-b515-4a9c18c830b5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:20:19 crc kubenswrapper[4823]: I1216 07:20:19.700387 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-qm72p" event={"ID":"e0d16c10-6c99-4b18-b515-4a9c18c830b5","Type":"ContainerDied","Data":"85c043aa9b80f87aa942636042adc26ac770f0b4310d96d228a70b98e2c1075f"} Dec 16 07:20:19 crc kubenswrapper[4823]: I1216 07:20:19.700431 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="85c043aa9b80f87aa942636042adc26ac770f0b4310d96d228a70b98e2c1075f" Dec 16 07:20:19 crc kubenswrapper[4823]: I1216 07:20:19.700460 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-qm72p" Dec 16 07:20:19 crc kubenswrapper[4823]: I1216 07:20:19.791797 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 16 07:20:19 crc kubenswrapper[4823]: E1216 07:20:19.792302 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0d16c10-6c99-4b18-b515-4a9c18c830b5" containerName="nova-cell0-conductor-db-sync" Dec 16 07:20:19 crc kubenswrapper[4823]: I1216 07:20:19.792323 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0d16c10-6c99-4b18-b515-4a9c18c830b5" containerName="nova-cell0-conductor-db-sync" Dec 16 07:20:19 crc kubenswrapper[4823]: I1216 07:20:19.792550 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0d16c10-6c99-4b18-b515-4a9c18c830b5" containerName="nova-cell0-conductor-db-sync" Dec 16 07:20:19 crc kubenswrapper[4823]: I1216 07:20:19.793310 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 16 07:20:19 crc kubenswrapper[4823]: I1216 07:20:19.794977 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 16 07:20:19 crc kubenswrapper[4823]: I1216 07:20:19.796299 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-7vhm5" Dec 16 07:20:19 crc kubenswrapper[4823]: I1216 07:20:19.819003 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 16 07:20:19 crc kubenswrapper[4823]: I1216 07:20:19.936830 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ad8e2a2-14c6-45b5-86f3-e4765cddd777-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"7ad8e2a2-14c6-45b5-86f3-e4765cddd777\") " pod="openstack/nova-cell0-conductor-0" Dec 16 07:20:19 crc kubenswrapper[4823]: 
I1216 07:20:19.936924 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ad8e2a2-14c6-45b5-86f3-e4765cddd777-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"7ad8e2a2-14c6-45b5-86f3-e4765cddd777\") " pod="openstack/nova-cell0-conductor-0" Dec 16 07:20:19 crc kubenswrapper[4823]: I1216 07:20:19.937090 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvqlz\" (UniqueName: \"kubernetes.io/projected/7ad8e2a2-14c6-45b5-86f3-e4765cddd777-kube-api-access-bvqlz\") pod \"nova-cell0-conductor-0\" (UID: \"7ad8e2a2-14c6-45b5-86f3-e4765cddd777\") " pod="openstack/nova-cell0-conductor-0" Dec 16 07:20:20 crc kubenswrapper[4823]: I1216 07:20:20.038702 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ad8e2a2-14c6-45b5-86f3-e4765cddd777-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"7ad8e2a2-14c6-45b5-86f3-e4765cddd777\") " pod="openstack/nova-cell0-conductor-0" Dec 16 07:20:20 crc kubenswrapper[4823]: I1216 07:20:20.038760 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ad8e2a2-14c6-45b5-86f3-e4765cddd777-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"7ad8e2a2-14c6-45b5-86f3-e4765cddd777\") " pod="openstack/nova-cell0-conductor-0" Dec 16 07:20:20 crc kubenswrapper[4823]: I1216 07:20:20.038886 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bvqlz\" (UniqueName: \"kubernetes.io/projected/7ad8e2a2-14c6-45b5-86f3-e4765cddd777-kube-api-access-bvqlz\") pod \"nova-cell0-conductor-0\" (UID: \"7ad8e2a2-14c6-45b5-86f3-e4765cddd777\") " pod="openstack/nova-cell0-conductor-0" Dec 16 07:20:20 crc kubenswrapper[4823]: I1216 07:20:20.042952 4823 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ad8e2a2-14c6-45b5-86f3-e4765cddd777-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"7ad8e2a2-14c6-45b5-86f3-e4765cddd777\") " pod="openstack/nova-cell0-conductor-0" Dec 16 07:20:20 crc kubenswrapper[4823]: I1216 07:20:20.044645 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ad8e2a2-14c6-45b5-86f3-e4765cddd777-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"7ad8e2a2-14c6-45b5-86f3-e4765cddd777\") " pod="openstack/nova-cell0-conductor-0" Dec 16 07:20:20 crc kubenswrapper[4823]: I1216 07:20:20.060773 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvqlz\" (UniqueName: \"kubernetes.io/projected/7ad8e2a2-14c6-45b5-86f3-e4765cddd777-kube-api-access-bvqlz\") pod \"nova-cell0-conductor-0\" (UID: \"7ad8e2a2-14c6-45b5-86f3-e4765cddd777\") " pod="openstack/nova-cell0-conductor-0" Dec 16 07:20:20 crc kubenswrapper[4823]: I1216 07:20:20.111330 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 16 07:20:20 crc kubenswrapper[4823]: I1216 07:20:20.578515 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 16 07:20:20 crc kubenswrapper[4823]: I1216 07:20:20.710389 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"7ad8e2a2-14c6-45b5-86f3-e4765cddd777","Type":"ContainerStarted","Data":"1adc30d428b5b9d7595ca81cc6e015f7c66aa26f79ddf2c7abdc5fafc704a8df"} Dec 16 07:20:20 crc kubenswrapper[4823]: I1216 07:20:20.714251 4823 generic.go:334] "Generic (PLEG): container finished" podID="29578ac7-f6f2-4a0d-8fab-39f39e55acc0" containerID="c955ce6041571e455d47f11d7672f330969cd7bee20a2ebc46c3b1b01bea3299" exitCode=0 Dec 16 07:20:20 crc kubenswrapper[4823]: I1216 07:20:20.714305 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"29578ac7-f6f2-4a0d-8fab-39f39e55acc0","Type":"ContainerDied","Data":"c955ce6041571e455d47f11d7672f330969cd7bee20a2ebc46c3b1b01bea3299"} Dec 16 07:20:20 crc kubenswrapper[4823]: I1216 07:20:20.795664 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 16 07:20:20 crc kubenswrapper[4823]: I1216 07:20:20.959728 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/29578ac7-f6f2-4a0d-8fab-39f39e55acc0-run-httpd\") pod \"29578ac7-f6f2-4a0d-8fab-39f39e55acc0\" (UID: \"29578ac7-f6f2-4a0d-8fab-39f39e55acc0\") " Dec 16 07:20:20 crc kubenswrapper[4823]: I1216 07:20:20.959863 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29578ac7-f6f2-4a0d-8fab-39f39e55acc0-scripts\") pod \"29578ac7-f6f2-4a0d-8fab-39f39e55acc0\" (UID: \"29578ac7-f6f2-4a0d-8fab-39f39e55acc0\") " Dec 16 07:20:20 crc kubenswrapper[4823]: I1216 07:20:20.959909 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7bxzx\" (UniqueName: \"kubernetes.io/projected/29578ac7-f6f2-4a0d-8fab-39f39e55acc0-kube-api-access-7bxzx\") pod \"29578ac7-f6f2-4a0d-8fab-39f39e55acc0\" (UID: \"29578ac7-f6f2-4a0d-8fab-39f39e55acc0\") " Dec 16 07:20:20 crc kubenswrapper[4823]: I1216 07:20:20.959944 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/29578ac7-f6f2-4a0d-8fab-39f39e55acc0-sg-core-conf-yaml\") pod \"29578ac7-f6f2-4a0d-8fab-39f39e55acc0\" (UID: \"29578ac7-f6f2-4a0d-8fab-39f39e55acc0\") " Dec 16 07:20:20 crc kubenswrapper[4823]: I1216 07:20:20.960060 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29578ac7-f6f2-4a0d-8fab-39f39e55acc0-combined-ca-bundle\") pod \"29578ac7-f6f2-4a0d-8fab-39f39e55acc0\" (UID: \"29578ac7-f6f2-4a0d-8fab-39f39e55acc0\") " Dec 16 07:20:20 crc kubenswrapper[4823]: I1216 07:20:20.960128 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/29578ac7-f6f2-4a0d-8fab-39f39e55acc0-config-data\") pod \"29578ac7-f6f2-4a0d-8fab-39f39e55acc0\" (UID: \"29578ac7-f6f2-4a0d-8fab-39f39e55acc0\") " Dec 16 07:20:20 crc kubenswrapper[4823]: I1216 07:20:20.960178 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/29578ac7-f6f2-4a0d-8fab-39f39e55acc0-log-httpd\") pod \"29578ac7-f6f2-4a0d-8fab-39f39e55acc0\" (UID: \"29578ac7-f6f2-4a0d-8fab-39f39e55acc0\") " Dec 16 07:20:20 crc kubenswrapper[4823]: I1216 07:20:20.960499 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29578ac7-f6f2-4a0d-8fab-39f39e55acc0-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "29578ac7-f6f2-4a0d-8fab-39f39e55acc0" (UID: "29578ac7-f6f2-4a0d-8fab-39f39e55acc0"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:20:20 crc kubenswrapper[4823]: I1216 07:20:20.961111 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29578ac7-f6f2-4a0d-8fab-39f39e55acc0-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "29578ac7-f6f2-4a0d-8fab-39f39e55acc0" (UID: "29578ac7-f6f2-4a0d-8fab-39f39e55acc0"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:20:20 crc kubenswrapper[4823]: I1216 07:20:20.961244 4823 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/29578ac7-f6f2-4a0d-8fab-39f39e55acc0-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 16 07:20:20 crc kubenswrapper[4823]: I1216 07:20:20.965587 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29578ac7-f6f2-4a0d-8fab-39f39e55acc0-kube-api-access-7bxzx" (OuterVolumeSpecName: "kube-api-access-7bxzx") pod "29578ac7-f6f2-4a0d-8fab-39f39e55acc0" (UID: "29578ac7-f6f2-4a0d-8fab-39f39e55acc0"). 
InnerVolumeSpecName "kube-api-access-7bxzx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:20:20 crc kubenswrapper[4823]: I1216 07:20:20.965661 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29578ac7-f6f2-4a0d-8fab-39f39e55acc0-scripts" (OuterVolumeSpecName: "scripts") pod "29578ac7-f6f2-4a0d-8fab-39f39e55acc0" (UID: "29578ac7-f6f2-4a0d-8fab-39f39e55acc0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:20:20 crc kubenswrapper[4823]: I1216 07:20:20.998077 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29578ac7-f6f2-4a0d-8fab-39f39e55acc0-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "29578ac7-f6f2-4a0d-8fab-39f39e55acc0" (UID: "29578ac7-f6f2-4a0d-8fab-39f39e55acc0"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:20:21 crc kubenswrapper[4823]: I1216 07:20:21.044852 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29578ac7-f6f2-4a0d-8fab-39f39e55acc0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "29578ac7-f6f2-4a0d-8fab-39f39e55acc0" (UID: "29578ac7-f6f2-4a0d-8fab-39f39e55acc0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:20:21 crc kubenswrapper[4823]: I1216 07:20:21.065226 4823 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/29578ac7-f6f2-4a0d-8fab-39f39e55acc0-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 16 07:20:21 crc kubenswrapper[4823]: I1216 07:20:21.065273 4823 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29578ac7-f6f2-4a0d-8fab-39f39e55acc0-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:20:21 crc kubenswrapper[4823]: I1216 07:20:21.065292 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7bxzx\" (UniqueName: \"kubernetes.io/projected/29578ac7-f6f2-4a0d-8fab-39f39e55acc0-kube-api-access-7bxzx\") on node \"crc\" DevicePath \"\"" Dec 16 07:20:21 crc kubenswrapper[4823]: I1216 07:20:21.065311 4823 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/29578ac7-f6f2-4a0d-8fab-39f39e55acc0-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 16 07:20:21 crc kubenswrapper[4823]: I1216 07:20:21.065322 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29578ac7-f6f2-4a0d-8fab-39f39e55acc0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:20:21 crc kubenswrapper[4823]: I1216 07:20:21.101225 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29578ac7-f6f2-4a0d-8fab-39f39e55acc0-config-data" (OuterVolumeSpecName: "config-data") pod "29578ac7-f6f2-4a0d-8fab-39f39e55acc0" (UID: "29578ac7-f6f2-4a0d-8fab-39f39e55acc0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:20:21 crc kubenswrapper[4823]: I1216 07:20:21.167127 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29578ac7-f6f2-4a0d-8fab-39f39e55acc0-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 07:20:21 crc kubenswrapper[4823]: I1216 07:20:21.734735 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"7ad8e2a2-14c6-45b5-86f3-e4765cddd777","Type":"ContainerStarted","Data":"51e78213653e84ab99bbc7625548d55635dfcb54de59c8fec91ff584c2afb7a9"} Dec 16 07:20:21 crc kubenswrapper[4823]: I1216 07:20:21.734863 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Dec 16 07:20:21 crc kubenswrapper[4823]: I1216 07:20:21.739950 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"29578ac7-f6f2-4a0d-8fab-39f39e55acc0","Type":"ContainerDied","Data":"e3873de515a2cda4cf14db445c24855470a2a8dc8e4db320eca90123d6f878eb"} Dec 16 07:20:21 crc kubenswrapper[4823]: I1216 07:20:21.740302 4823 scope.go:117] "RemoveContainer" containerID="87627a22be8888cde133c7d64d4247cda38703e6fd8c15eccc434f6eddc8a8e6" Dec 16 07:20:21 crc kubenswrapper[4823]: I1216 07:20:21.740145 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 16 07:20:21 crc kubenswrapper[4823]: I1216 07:20:21.769705 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.769679864 podStartE2EDuration="2.769679864s" podCreationTimestamp="2025-12-16 07:20:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 07:20:21.752535997 +0000 UTC m=+1500.241102140" watchObservedRunningTime="2025-12-16 07:20:21.769679864 +0000 UTC m=+1500.258245997" Dec 16 07:20:21 crc kubenswrapper[4823]: I1216 07:20:21.779763 4823 scope.go:117] "RemoveContainer" containerID="f010755d9b8b357e79bfe7052e92e23ba3c179146d138ec71e21d1b8bb9eeda4" Dec 16 07:20:21 crc kubenswrapper[4823]: I1216 07:20:21.852501 4823 scope.go:117] "RemoveContainer" containerID="b16f86679c997d30b6aca6f2aa5df76a1880c68b3ec769c521c8945a324a11a5" Dec 16 07:20:21 crc kubenswrapper[4823]: I1216 07:20:21.865424 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 16 07:20:21 crc kubenswrapper[4823]: I1216 07:20:21.877932 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 16 07:20:21 crc kubenswrapper[4823]: I1216 07:20:21.895594 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 16 07:20:21 crc kubenswrapper[4823]: E1216 07:20:21.896041 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29578ac7-f6f2-4a0d-8fab-39f39e55acc0" containerName="ceilometer-central-agent" Dec 16 07:20:21 crc kubenswrapper[4823]: I1216 07:20:21.896062 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="29578ac7-f6f2-4a0d-8fab-39f39e55acc0" containerName="ceilometer-central-agent" Dec 16 07:20:21 crc kubenswrapper[4823]: E1216 07:20:21.896089 4823 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="29578ac7-f6f2-4a0d-8fab-39f39e55acc0" containerName="ceilometer-notification-agent" Dec 16 07:20:21 crc kubenswrapper[4823]: I1216 07:20:21.896098 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="29578ac7-f6f2-4a0d-8fab-39f39e55acc0" containerName="ceilometer-notification-agent" Dec 16 07:20:21 crc kubenswrapper[4823]: E1216 07:20:21.896117 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29578ac7-f6f2-4a0d-8fab-39f39e55acc0" containerName="sg-core" Dec 16 07:20:21 crc kubenswrapper[4823]: I1216 07:20:21.896125 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="29578ac7-f6f2-4a0d-8fab-39f39e55acc0" containerName="sg-core" Dec 16 07:20:21 crc kubenswrapper[4823]: E1216 07:20:21.896137 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29578ac7-f6f2-4a0d-8fab-39f39e55acc0" containerName="proxy-httpd" Dec 16 07:20:21 crc kubenswrapper[4823]: I1216 07:20:21.896143 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="29578ac7-f6f2-4a0d-8fab-39f39e55acc0" containerName="proxy-httpd" Dec 16 07:20:21 crc kubenswrapper[4823]: I1216 07:20:21.896306 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="29578ac7-f6f2-4a0d-8fab-39f39e55acc0" containerName="ceilometer-notification-agent" Dec 16 07:20:21 crc kubenswrapper[4823]: I1216 07:20:21.896320 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="29578ac7-f6f2-4a0d-8fab-39f39e55acc0" containerName="sg-core" Dec 16 07:20:21 crc kubenswrapper[4823]: I1216 07:20:21.896332 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="29578ac7-f6f2-4a0d-8fab-39f39e55acc0" containerName="proxy-httpd" Dec 16 07:20:21 crc kubenswrapper[4823]: I1216 07:20:21.896340 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="29578ac7-f6f2-4a0d-8fab-39f39e55acc0" containerName="ceilometer-central-agent" Dec 16 07:20:21 crc kubenswrapper[4823]: I1216 07:20:21.897756 4823 scope.go:117] "RemoveContainer" 
containerID="c955ce6041571e455d47f11d7672f330969cd7bee20a2ebc46c3b1b01bea3299" Dec 16 07:20:21 crc kubenswrapper[4823]: I1216 07:20:21.898148 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 16 07:20:21 crc kubenswrapper[4823]: I1216 07:20:21.902164 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 16 07:20:21 crc kubenswrapper[4823]: I1216 07:20:21.902318 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 16 07:20:21 crc kubenswrapper[4823]: I1216 07:20:21.903401 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 16 07:20:22 crc kubenswrapper[4823]: I1216 07:20:22.000615 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6jqb\" (UniqueName: \"kubernetes.io/projected/41b2b49f-acfe-4019-a983-c9cea9de4378-kube-api-access-r6jqb\") pod \"ceilometer-0\" (UID: \"41b2b49f-acfe-4019-a983-c9cea9de4378\") " pod="openstack/ceilometer-0" Dec 16 07:20:22 crc kubenswrapper[4823]: I1216 07:20:22.000725 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41b2b49f-acfe-4019-a983-c9cea9de4378-run-httpd\") pod \"ceilometer-0\" (UID: \"41b2b49f-acfe-4019-a983-c9cea9de4378\") " pod="openstack/ceilometer-0" Dec 16 07:20:22 crc kubenswrapper[4823]: I1216 07:20:22.000851 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41b2b49f-acfe-4019-a983-c9cea9de4378-log-httpd\") pod \"ceilometer-0\" (UID: \"41b2b49f-acfe-4019-a983-c9cea9de4378\") " pod="openstack/ceilometer-0" Dec 16 07:20:22 crc kubenswrapper[4823]: I1216 07:20:22.000981 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41b2b49f-acfe-4019-a983-c9cea9de4378-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"41b2b49f-acfe-4019-a983-c9cea9de4378\") " pod="openstack/ceilometer-0" Dec 16 07:20:22 crc kubenswrapper[4823]: I1216 07:20:22.001171 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41b2b49f-acfe-4019-a983-c9cea9de4378-config-data\") pod \"ceilometer-0\" (UID: \"41b2b49f-acfe-4019-a983-c9cea9de4378\") " pod="openstack/ceilometer-0" Dec 16 07:20:22 crc kubenswrapper[4823]: I1216 07:20:22.001253 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41b2b49f-acfe-4019-a983-c9cea9de4378-scripts\") pod \"ceilometer-0\" (UID: \"41b2b49f-acfe-4019-a983-c9cea9de4378\") " pod="openstack/ceilometer-0" Dec 16 07:20:22 crc kubenswrapper[4823]: I1216 07:20:22.001451 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/41b2b49f-acfe-4019-a983-c9cea9de4378-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"41b2b49f-acfe-4019-a983-c9cea9de4378\") " pod="openstack/ceilometer-0" Dec 16 07:20:22 crc kubenswrapper[4823]: I1216 07:20:22.102628 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6jqb\" (UniqueName: \"kubernetes.io/projected/41b2b49f-acfe-4019-a983-c9cea9de4378-kube-api-access-r6jqb\") pod \"ceilometer-0\" (UID: \"41b2b49f-acfe-4019-a983-c9cea9de4378\") " pod="openstack/ceilometer-0" Dec 16 07:20:22 crc kubenswrapper[4823]: I1216 07:20:22.102705 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41b2b49f-acfe-4019-a983-c9cea9de4378-run-httpd\") pod \"ceilometer-0\" (UID: 
\"41b2b49f-acfe-4019-a983-c9cea9de4378\") " pod="openstack/ceilometer-0" Dec 16 07:20:22 crc kubenswrapper[4823]: I1216 07:20:22.102770 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41b2b49f-acfe-4019-a983-c9cea9de4378-log-httpd\") pod \"ceilometer-0\" (UID: \"41b2b49f-acfe-4019-a983-c9cea9de4378\") " pod="openstack/ceilometer-0" Dec 16 07:20:22 crc kubenswrapper[4823]: I1216 07:20:22.102823 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41b2b49f-acfe-4019-a983-c9cea9de4378-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"41b2b49f-acfe-4019-a983-c9cea9de4378\") " pod="openstack/ceilometer-0" Dec 16 07:20:22 crc kubenswrapper[4823]: I1216 07:20:22.102885 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41b2b49f-acfe-4019-a983-c9cea9de4378-config-data\") pod \"ceilometer-0\" (UID: \"41b2b49f-acfe-4019-a983-c9cea9de4378\") " pod="openstack/ceilometer-0" Dec 16 07:20:22 crc kubenswrapper[4823]: I1216 07:20:22.102912 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41b2b49f-acfe-4019-a983-c9cea9de4378-scripts\") pod \"ceilometer-0\" (UID: \"41b2b49f-acfe-4019-a983-c9cea9de4378\") " pod="openstack/ceilometer-0" Dec 16 07:20:22 crc kubenswrapper[4823]: I1216 07:20:22.102949 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/41b2b49f-acfe-4019-a983-c9cea9de4378-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"41b2b49f-acfe-4019-a983-c9cea9de4378\") " pod="openstack/ceilometer-0" Dec 16 07:20:22 crc kubenswrapper[4823]: I1216 07:20:22.103631 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/41b2b49f-acfe-4019-a983-c9cea9de4378-log-httpd\") pod \"ceilometer-0\" (UID: \"41b2b49f-acfe-4019-a983-c9cea9de4378\") " pod="openstack/ceilometer-0" Dec 16 07:20:22 crc kubenswrapper[4823]: I1216 07:20:22.103892 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41b2b49f-acfe-4019-a983-c9cea9de4378-run-httpd\") pod \"ceilometer-0\" (UID: \"41b2b49f-acfe-4019-a983-c9cea9de4378\") " pod="openstack/ceilometer-0" Dec 16 07:20:22 crc kubenswrapper[4823]: I1216 07:20:22.108310 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41b2b49f-acfe-4019-a983-c9cea9de4378-scripts\") pod \"ceilometer-0\" (UID: \"41b2b49f-acfe-4019-a983-c9cea9de4378\") " pod="openstack/ceilometer-0" Dec 16 07:20:22 crc kubenswrapper[4823]: I1216 07:20:22.109702 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/41b2b49f-acfe-4019-a983-c9cea9de4378-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"41b2b49f-acfe-4019-a983-c9cea9de4378\") " pod="openstack/ceilometer-0" Dec 16 07:20:22 crc kubenswrapper[4823]: I1216 07:20:22.118624 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41b2b49f-acfe-4019-a983-c9cea9de4378-config-data\") pod \"ceilometer-0\" (UID: \"41b2b49f-acfe-4019-a983-c9cea9de4378\") " pod="openstack/ceilometer-0" Dec 16 07:20:22 crc kubenswrapper[4823]: I1216 07:20:22.120562 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41b2b49f-acfe-4019-a983-c9cea9de4378-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"41b2b49f-acfe-4019-a983-c9cea9de4378\") " pod="openstack/ceilometer-0" Dec 16 07:20:22 crc kubenswrapper[4823]: I1216 07:20:22.122760 4823 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-r6jqb\" (UniqueName: \"kubernetes.io/projected/41b2b49f-acfe-4019-a983-c9cea9de4378-kube-api-access-r6jqb\") pod \"ceilometer-0\" (UID: \"41b2b49f-acfe-4019-a983-c9cea9de4378\") " pod="openstack/ceilometer-0" Dec 16 07:20:22 crc kubenswrapper[4823]: I1216 07:20:22.243458 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 16 07:20:22 crc kubenswrapper[4823]: I1216 07:20:22.707086 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 16 07:20:22 crc kubenswrapper[4823]: W1216 07:20:22.713830 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod41b2b49f_acfe_4019_a983_c9cea9de4378.slice/crio-2f193c658c700f19047e1e8a00dff6ea721e5995a14485b02f0e9deffbe95aec WatchSource:0}: Error finding container 2f193c658c700f19047e1e8a00dff6ea721e5995a14485b02f0e9deffbe95aec: Status 404 returned error can't find the container with id 2f193c658c700f19047e1e8a00dff6ea721e5995a14485b02f0e9deffbe95aec Dec 16 07:20:22 crc kubenswrapper[4823]: I1216 07:20:22.755814 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"41b2b49f-acfe-4019-a983-c9cea9de4378","Type":"ContainerStarted","Data":"2f193c658c700f19047e1e8a00dff6ea721e5995a14485b02f0e9deffbe95aec"} Dec 16 07:20:23 crc kubenswrapper[4823]: I1216 07:20:23.785847 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29578ac7-f6f2-4a0d-8fab-39f39e55acc0" path="/var/lib/kubelet/pods/29578ac7-f6f2-4a0d-8fab-39f39e55acc0/volumes" Dec 16 07:20:23 crc kubenswrapper[4823]: I1216 07:20:23.787043 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"41b2b49f-acfe-4019-a983-c9cea9de4378","Type":"ContainerStarted","Data":"05a5b9f8956ec4b6d4cff55811bfb65a933e6db7a0295b8f7c4bc94832dccb2b"} Dec 16 07:20:24 crc 
kubenswrapper[4823]: I1216 07:20:24.783938 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"41b2b49f-acfe-4019-a983-c9cea9de4378","Type":"ContainerStarted","Data":"dc3495df65d03d20434814e4df5e7c4ba019d019526109dd9bc388dad5bacca8"}
Dec 16 07:20:25 crc kubenswrapper[4823]: I1216 07:20:25.139694 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0"
Dec 16 07:20:25 crc kubenswrapper[4823]: I1216 07:20:25.722746 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-fzmlg"]
Dec 16 07:20:25 crc kubenswrapper[4823]: I1216 07:20:25.724373 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-fzmlg"
Dec 16 07:20:25 crc kubenswrapper[4823]: I1216 07:20:25.726767 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data"
Dec 16 07:20:25 crc kubenswrapper[4823]: I1216 07:20:25.727160 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts"
Dec 16 07:20:25 crc kubenswrapper[4823]: I1216 07:20:25.737877 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-fzmlg"]
Dec 16 07:20:25 crc kubenswrapper[4823]: I1216 07:20:25.780201 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cgphd\" (UniqueName: \"kubernetes.io/projected/f10ce7b3-53e0-4318-b7a2-1d2a33b9eb3b-kube-api-access-cgphd\") pod \"nova-cell0-cell-mapping-fzmlg\" (UID: \"f10ce7b3-53e0-4318-b7a2-1d2a33b9eb3b\") " pod="openstack/nova-cell0-cell-mapping-fzmlg"
Dec 16 07:20:25 crc kubenswrapper[4823]: I1216 07:20:25.780341 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f10ce7b3-53e0-4318-b7a2-1d2a33b9eb3b-scripts\") pod \"nova-cell0-cell-mapping-fzmlg\" (UID: \"f10ce7b3-53e0-4318-b7a2-1d2a33b9eb3b\") " pod="openstack/nova-cell0-cell-mapping-fzmlg"
Dec 16 07:20:25 crc kubenswrapper[4823]: I1216 07:20:25.780423 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f10ce7b3-53e0-4318-b7a2-1d2a33b9eb3b-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-fzmlg\" (UID: \"f10ce7b3-53e0-4318-b7a2-1d2a33b9eb3b\") " pod="openstack/nova-cell0-cell-mapping-fzmlg"
Dec 16 07:20:25 crc kubenswrapper[4823]: I1216 07:20:25.780461 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f10ce7b3-53e0-4318-b7a2-1d2a33b9eb3b-config-data\") pod \"nova-cell0-cell-mapping-fzmlg\" (UID: \"f10ce7b3-53e0-4318-b7a2-1d2a33b9eb3b\") " pod="openstack/nova-cell0-cell-mapping-fzmlg"
Dec 16 07:20:25 crc kubenswrapper[4823]: I1216 07:20:25.827857 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"41b2b49f-acfe-4019-a983-c9cea9de4378","Type":"ContainerStarted","Data":"12cdd53f57280c98da4f4f1685a71705c5a0d3d19818af557a48834817456ea8"}
Dec 16 07:20:25 crc kubenswrapper[4823]: I1216 07:20:25.871093 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Dec 16 07:20:25 crc kubenswrapper[4823]: I1216 07:20:25.872860 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Dec 16 07:20:25 crc kubenswrapper[4823]: I1216 07:20:25.875303 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Dec 16 07:20:25 crc kubenswrapper[4823]: I1216 07:20:25.881553 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f10ce7b3-53e0-4318-b7a2-1d2a33b9eb3b-config-data\") pod \"nova-cell0-cell-mapping-fzmlg\" (UID: \"f10ce7b3-53e0-4318-b7a2-1d2a33b9eb3b\") " pod="openstack/nova-cell0-cell-mapping-fzmlg"
Dec 16 07:20:25 crc kubenswrapper[4823]: I1216 07:20:25.881662 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cgphd\" (UniqueName: \"kubernetes.io/projected/f10ce7b3-53e0-4318-b7a2-1d2a33b9eb3b-kube-api-access-cgphd\") pod \"nova-cell0-cell-mapping-fzmlg\" (UID: \"f10ce7b3-53e0-4318-b7a2-1d2a33b9eb3b\") " pod="openstack/nova-cell0-cell-mapping-fzmlg"
Dec 16 07:20:25 crc kubenswrapper[4823]: I1216 07:20:25.881768 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f10ce7b3-53e0-4318-b7a2-1d2a33b9eb3b-scripts\") pod \"nova-cell0-cell-mapping-fzmlg\" (UID: \"f10ce7b3-53e0-4318-b7a2-1d2a33b9eb3b\") " pod="openstack/nova-cell0-cell-mapping-fzmlg"
Dec 16 07:20:25 crc kubenswrapper[4823]: I1216 07:20:25.881853 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f10ce7b3-53e0-4318-b7a2-1d2a33b9eb3b-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-fzmlg\" (UID: \"f10ce7b3-53e0-4318-b7a2-1d2a33b9eb3b\") " pod="openstack/nova-cell0-cell-mapping-fzmlg"
Dec 16 07:20:25 crc kubenswrapper[4823]: I1216 07:20:25.888149 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Dec 16 07:20:25 crc kubenswrapper[4823]: I1216 07:20:25.890965 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f10ce7b3-53e0-4318-b7a2-1d2a33b9eb3b-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-fzmlg\" (UID: \"f10ce7b3-53e0-4318-b7a2-1d2a33b9eb3b\") " pod="openstack/nova-cell0-cell-mapping-fzmlg"
Dec 16 07:20:25 crc kubenswrapper[4823]: I1216 07:20:25.914204 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f10ce7b3-53e0-4318-b7a2-1d2a33b9eb3b-config-data\") pod \"nova-cell0-cell-mapping-fzmlg\" (UID: \"f10ce7b3-53e0-4318-b7a2-1d2a33b9eb3b\") " pod="openstack/nova-cell0-cell-mapping-fzmlg"
Dec 16 07:20:25 crc kubenswrapper[4823]: I1216 07:20:25.915081 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f10ce7b3-53e0-4318-b7a2-1d2a33b9eb3b-scripts\") pod \"nova-cell0-cell-mapping-fzmlg\" (UID: \"f10ce7b3-53e0-4318-b7a2-1d2a33b9eb3b\") " pod="openstack/nova-cell0-cell-mapping-fzmlg"
Dec 16 07:20:25 crc kubenswrapper[4823]: I1216 07:20:25.923943 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cgphd\" (UniqueName: \"kubernetes.io/projected/f10ce7b3-53e0-4318-b7a2-1d2a33b9eb3b-kube-api-access-cgphd\") pod \"nova-cell0-cell-mapping-fzmlg\" (UID: \"f10ce7b3-53e0-4318-b7a2-1d2a33b9eb3b\") " pod="openstack/nova-cell0-cell-mapping-fzmlg"
Dec 16 07:20:25 crc kubenswrapper[4823]: I1216 07:20:25.984735 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2893d1b-5ad4-432b-964e-fa981929487a-logs\") pod \"nova-api-0\" (UID: \"c2893d1b-5ad4-432b-964e-fa981929487a\") " pod="openstack/nova-api-0"
Dec 16 07:20:25 crc kubenswrapper[4823]: I1216 07:20:25.985006 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2893d1b-5ad4-432b-964e-fa981929487a-config-data\") pod \"nova-api-0\" (UID: \"c2893d1b-5ad4-432b-964e-fa981929487a\") " pod="openstack/nova-api-0"
Dec 16 07:20:25 crc kubenswrapper[4823]: I1216 07:20:25.985117 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2893d1b-5ad4-432b-964e-fa981929487a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c2893d1b-5ad4-432b-964e-fa981929487a\") " pod="openstack/nova-api-0"
Dec 16 07:20:25 crc kubenswrapper[4823]: I1216 07:20:25.985222 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnjp2\" (UniqueName: \"kubernetes.io/projected/c2893d1b-5ad4-432b-964e-fa981929487a-kube-api-access-rnjp2\") pod \"nova-api-0\" (UID: \"c2893d1b-5ad4-432b-964e-fa981929487a\") " pod="openstack/nova-api-0"
Dec 16 07:20:25 crc kubenswrapper[4823]: I1216 07:20:25.998084 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Dec 16 07:20:25 crc kubenswrapper[4823]: I1216 07:20:25.999646 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Dec 16 07:20:26 crc kubenswrapper[4823]: I1216 07:20:26.004540 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data"
Dec 16 07:20:26 crc kubenswrapper[4823]: I1216 07:20:26.037229 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Dec 16 07:20:26 crc kubenswrapper[4823]: I1216 07:20:26.054422 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-fzmlg"
Dec 16 07:20:26 crc kubenswrapper[4823]: I1216 07:20:26.063730 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Dec 16 07:20:26 crc kubenswrapper[4823]: I1216 07:20:26.065306 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Dec 16 07:20:26 crc kubenswrapper[4823]: I1216 07:20:26.069953 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Dec 16 07:20:26 crc kubenswrapper[4823]: I1216 07:20:26.075253 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Dec 16 07:20:26 crc kubenswrapper[4823]: I1216 07:20:26.087365 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3bdc8199-cc99-47c0-a271-7653cf92832e-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"3bdc8199-cc99-47c0-a271-7653cf92832e\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 16 07:20:26 crc kubenswrapper[4823]: I1216 07:20:26.087579 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2893d1b-5ad4-432b-964e-fa981929487a-logs\") pod \"nova-api-0\" (UID: \"c2893d1b-5ad4-432b-964e-fa981929487a\") " pod="openstack/nova-api-0"
Dec 16 07:20:26 crc kubenswrapper[4823]: I1216 07:20:26.087719 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5fkq\" (UniqueName: \"kubernetes.io/projected/3bdc8199-cc99-47c0-a271-7653cf92832e-kube-api-access-m5fkq\") pod \"nova-cell1-novncproxy-0\" (UID: \"3bdc8199-cc99-47c0-a271-7653cf92832e\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 16 07:20:26 crc kubenswrapper[4823]: I1216 07:20:26.087848 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2893d1b-5ad4-432b-964e-fa981929487a-config-data\") pod \"nova-api-0\" (UID: \"c2893d1b-5ad4-432b-964e-fa981929487a\") " pod="openstack/nova-api-0"
Dec 16 07:20:26 crc kubenswrapper[4823]: I1216 07:20:26.088133 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2893d1b-5ad4-432b-964e-fa981929487a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c2893d1b-5ad4-432b-964e-fa981929487a\") " pod="openstack/nova-api-0"
Dec 16 07:20:26 crc kubenswrapper[4823]: I1216 07:20:26.088255 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bdc8199-cc99-47c0-a271-7653cf92832e-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"3bdc8199-cc99-47c0-a271-7653cf92832e\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 16 07:20:26 crc kubenswrapper[4823]: I1216 07:20:26.088347 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rnjp2\" (UniqueName: \"kubernetes.io/projected/c2893d1b-5ad4-432b-964e-fa981929487a-kube-api-access-rnjp2\") pod \"nova-api-0\" (UID: \"c2893d1b-5ad4-432b-964e-fa981929487a\") " pod="openstack/nova-api-0"
Dec 16 07:20:26 crc kubenswrapper[4823]: I1216 07:20:26.088496 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2893d1b-5ad4-432b-964e-fa981929487a-logs\") pod \"nova-api-0\" (UID: \"c2893d1b-5ad4-432b-964e-fa981929487a\") " pod="openstack/nova-api-0"
Dec 16 07:20:26 crc kubenswrapper[4823]: I1216 07:20:26.100936 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2893d1b-5ad4-432b-964e-fa981929487a-config-data\") pod \"nova-api-0\" (UID: \"c2893d1b-5ad4-432b-964e-fa981929487a\") " pod="openstack/nova-api-0"
Dec 16 07:20:26 crc kubenswrapper[4823]: I1216 07:20:26.101827 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2893d1b-5ad4-432b-964e-fa981929487a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c2893d1b-5ad4-432b-964e-fa981929487a\") " pod="openstack/nova-api-0"
Dec 16 07:20:26 crc kubenswrapper[4823]: I1216 07:20:26.120938 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnjp2\" (UniqueName: \"kubernetes.io/projected/c2893d1b-5ad4-432b-964e-fa981929487a-kube-api-access-rnjp2\") pod \"nova-api-0\" (UID: \"c2893d1b-5ad4-432b-964e-fa981929487a\") " pod="openstack/nova-api-0"
Dec 16 07:20:26 crc kubenswrapper[4823]: I1216 07:20:26.181093 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-647df7b8c5-lfdcp"]
Dec 16 07:20:26 crc kubenswrapper[4823]: I1216 07:20:26.182644 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-647df7b8c5-lfdcp"
Dec 16 07:20:26 crc kubenswrapper[4823]: I1216 07:20:26.190777 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05f31fcf-de71-43dc-a94c-11ef850cf1e4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"05f31fcf-de71-43dc-a94c-11ef850cf1e4\") " pod="openstack/nova-metadata-0"
Dec 16 07:20:26 crc kubenswrapper[4823]: I1216 07:20:26.190827 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5fkq\" (UniqueName: \"kubernetes.io/projected/3bdc8199-cc99-47c0-a271-7653cf92832e-kube-api-access-m5fkq\") pod \"nova-cell1-novncproxy-0\" (UID: \"3bdc8199-cc99-47c0-a271-7653cf92832e\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 16 07:20:26 crc kubenswrapper[4823]: I1216 07:20:26.190959 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bdc8199-cc99-47c0-a271-7653cf92832e-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"3bdc8199-cc99-47c0-a271-7653cf92832e\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 16 07:20:26 crc kubenswrapper[4823]: I1216 07:20:26.191016 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05f31fcf-de71-43dc-a94c-11ef850cf1e4-config-data\") pod \"nova-metadata-0\" (UID: \"05f31fcf-de71-43dc-a94c-11ef850cf1e4\") " pod="openstack/nova-metadata-0"
Dec 16 07:20:26 crc kubenswrapper[4823]: I1216 07:20:26.191146 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/05f31fcf-de71-43dc-a94c-11ef850cf1e4-logs\") pod \"nova-metadata-0\" (UID: \"05f31fcf-de71-43dc-a94c-11ef850cf1e4\") " pod="openstack/nova-metadata-0"
Dec 16 07:20:26 crc kubenswrapper[4823]: I1216 07:20:26.191177 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2q6m2\" (UniqueName: \"kubernetes.io/projected/05f31fcf-de71-43dc-a94c-11ef850cf1e4-kube-api-access-2q6m2\") pod \"nova-metadata-0\" (UID: \"05f31fcf-de71-43dc-a94c-11ef850cf1e4\") " pod="openstack/nova-metadata-0"
Dec 16 07:20:26 crc kubenswrapper[4823]: I1216 07:20:26.191228 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3bdc8199-cc99-47c0-a271-7653cf92832e-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"3bdc8199-cc99-47c0-a271-7653cf92832e\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 16 07:20:26 crc kubenswrapper[4823]: I1216 07:20:26.200380 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-647df7b8c5-lfdcp"]
Dec 16 07:20:26 crc kubenswrapper[4823]: I1216 07:20:26.215885 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Dec 16 07:20:26 crc kubenswrapper[4823]: I1216 07:20:26.217260 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Dec 16 07:20:26 crc kubenswrapper[4823]: I1216 07:20:26.225511 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Dec 16 07:20:26 crc kubenswrapper[4823]: I1216 07:20:26.226065 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Dec 16 07:20:26 crc kubenswrapper[4823]: I1216 07:20:26.228537 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3bdc8199-cc99-47c0-a271-7653cf92832e-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"3bdc8199-cc99-47c0-a271-7653cf92832e\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 16 07:20:26 crc kubenswrapper[4823]: I1216 07:20:26.232782 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bdc8199-cc99-47c0-a271-7653cf92832e-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"3bdc8199-cc99-47c0-a271-7653cf92832e\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 16 07:20:26 crc kubenswrapper[4823]: I1216 07:20:26.236789 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5fkq\" (UniqueName: \"kubernetes.io/projected/3bdc8199-cc99-47c0-a271-7653cf92832e-kube-api-access-m5fkq\") pod \"nova-cell1-novncproxy-0\" (UID: \"3bdc8199-cc99-47c0-a271-7653cf92832e\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 16 07:20:26 crc kubenswrapper[4823]: I1216 07:20:26.308404 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Dec 16 07:20:26 crc kubenswrapper[4823]: I1216 07:20:26.309502 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d61c0ba6-5bd0-4f7f-85f7-28eae946d2aa-ovsdbserver-sb\") pod \"dnsmasq-dns-647df7b8c5-lfdcp\" (UID: \"d61c0ba6-5bd0-4f7f-85f7-28eae946d2aa\") " pod="openstack/dnsmasq-dns-647df7b8c5-lfdcp"
Dec 16 07:20:26 crc kubenswrapper[4823]: I1216 07:20:26.309663 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/719d0a32-c321-47de-8e64-ebb61884922d-config-data\") pod \"nova-scheduler-0\" (UID: \"719d0a32-c321-47de-8e64-ebb61884922d\") " pod="openstack/nova-scheduler-0"
Dec 16 07:20:26 crc kubenswrapper[4823]: I1216 07:20:26.309686 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/719d0a32-c321-47de-8e64-ebb61884922d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"719d0a32-c321-47de-8e64-ebb61884922d\") " pod="openstack/nova-scheduler-0"
Dec 16 07:20:26 crc kubenswrapper[4823]: I1216 07:20:26.309846 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05f31fcf-de71-43dc-a94c-11ef850cf1e4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"05f31fcf-de71-43dc-a94c-11ef850cf1e4\") " pod="openstack/nova-metadata-0"
Dec 16 07:20:26 crc kubenswrapper[4823]: I1216 07:20:26.309901 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d61c0ba6-5bd0-4f7f-85f7-28eae946d2aa-config\") pod \"dnsmasq-dns-647df7b8c5-lfdcp\" (UID: \"d61c0ba6-5bd0-4f7f-85f7-28eae946d2aa\") " pod="openstack/dnsmasq-dns-647df7b8c5-lfdcp"
Dec 16 07:20:26 crc kubenswrapper[4823]: I1216 07:20:26.309946 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d61c0ba6-5bd0-4f7f-85f7-28eae946d2aa-ovsdbserver-nb\") pod \"dnsmasq-dns-647df7b8c5-lfdcp\" (UID: \"d61c0ba6-5bd0-4f7f-85f7-28eae946d2aa\") " pod="openstack/dnsmasq-dns-647df7b8c5-lfdcp"
Dec 16 07:20:26 crc kubenswrapper[4823]: I1216 07:20:26.310079 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bm4h2\" (UniqueName: \"kubernetes.io/projected/d61c0ba6-5bd0-4f7f-85f7-28eae946d2aa-kube-api-access-bm4h2\") pod \"dnsmasq-dns-647df7b8c5-lfdcp\" (UID: \"d61c0ba6-5bd0-4f7f-85f7-28eae946d2aa\") " pod="openstack/dnsmasq-dns-647df7b8c5-lfdcp"
Dec 16 07:20:26 crc kubenswrapper[4823]: I1216 07:20:26.310223 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4t9pn\" (UniqueName: \"kubernetes.io/projected/719d0a32-c321-47de-8e64-ebb61884922d-kube-api-access-4t9pn\") pod \"nova-scheduler-0\" (UID: \"719d0a32-c321-47de-8e64-ebb61884922d\") " pod="openstack/nova-scheduler-0"
Dec 16 07:20:26 crc kubenswrapper[4823]: I1216 07:20:26.310603 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d61c0ba6-5bd0-4f7f-85f7-28eae946d2aa-dns-swift-storage-0\") pod \"dnsmasq-dns-647df7b8c5-lfdcp\" (UID: \"d61c0ba6-5bd0-4f7f-85f7-28eae946d2aa\") " pod="openstack/dnsmasq-dns-647df7b8c5-lfdcp"
Dec 16 07:20:26 crc kubenswrapper[4823]: I1216 07:20:26.310775 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05f31fcf-de71-43dc-a94c-11ef850cf1e4-config-data\") pod \"nova-metadata-0\" (UID: \"05f31fcf-de71-43dc-a94c-11ef850cf1e4\") " pod="openstack/nova-metadata-0"
Dec 16 07:20:26 crc kubenswrapper[4823]: I1216 07:20:26.310802 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/05f31fcf-de71-43dc-a94c-11ef850cf1e4-logs\") pod \"nova-metadata-0\" (UID: \"05f31fcf-de71-43dc-a94c-11ef850cf1e4\") " pod="openstack/nova-metadata-0"
Dec 16 07:20:26 crc kubenswrapper[4823]: I1216 07:20:26.310854 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2q6m2\" (UniqueName: \"kubernetes.io/projected/05f31fcf-de71-43dc-a94c-11ef850cf1e4-kube-api-access-2q6m2\") pod \"nova-metadata-0\" (UID: \"05f31fcf-de71-43dc-a94c-11ef850cf1e4\") " pod="openstack/nova-metadata-0"
Dec 16 07:20:26 crc kubenswrapper[4823]: I1216 07:20:26.310882 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d61c0ba6-5bd0-4f7f-85f7-28eae946d2aa-dns-svc\") pod \"dnsmasq-dns-647df7b8c5-lfdcp\" (UID: \"d61c0ba6-5bd0-4f7f-85f7-28eae946d2aa\") " pod="openstack/dnsmasq-dns-647df7b8c5-lfdcp"
Dec 16 07:20:26 crc kubenswrapper[4823]: I1216 07:20:26.320491 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/05f31fcf-de71-43dc-a94c-11ef850cf1e4-logs\") pod \"nova-metadata-0\" (UID: \"05f31fcf-de71-43dc-a94c-11ef850cf1e4\") " pod="openstack/nova-metadata-0"
Dec 16 07:20:26 crc kubenswrapper[4823]: I1216 07:20:26.328061 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Dec 16 07:20:26 crc kubenswrapper[4823]: I1216 07:20:26.328466 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05f31fcf-de71-43dc-a94c-11ef850cf1e4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"05f31fcf-de71-43dc-a94c-11ef850cf1e4\") " pod="openstack/nova-metadata-0"
Dec 16 07:20:26 crc kubenswrapper[4823]: I1216 07:20:26.338831 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2q6m2\" (UniqueName: \"kubernetes.io/projected/05f31fcf-de71-43dc-a94c-11ef850cf1e4-kube-api-access-2q6m2\") pod \"nova-metadata-0\" (UID: \"05f31fcf-de71-43dc-a94c-11ef850cf1e4\") " pod="openstack/nova-metadata-0"
Dec 16 07:20:26 crc kubenswrapper[4823]: I1216 07:20:26.409067 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05f31fcf-de71-43dc-a94c-11ef850cf1e4-config-data\") pod \"nova-metadata-0\" (UID: \"05f31fcf-de71-43dc-a94c-11ef850cf1e4\") " pod="openstack/nova-metadata-0"
Dec 16 07:20:26 crc kubenswrapper[4823]: I1216 07:20:26.412478 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d61c0ba6-5bd0-4f7f-85f7-28eae946d2aa-dns-svc\") pod \"dnsmasq-dns-647df7b8c5-lfdcp\" (UID: \"d61c0ba6-5bd0-4f7f-85f7-28eae946d2aa\") " pod="openstack/dnsmasq-dns-647df7b8c5-lfdcp"
Dec 16 07:20:26 crc kubenswrapper[4823]: I1216 07:20:26.413679 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d61c0ba6-5bd0-4f7f-85f7-28eae946d2aa-ovsdbserver-sb\") pod \"dnsmasq-dns-647df7b8c5-lfdcp\" (UID: \"d61c0ba6-5bd0-4f7f-85f7-28eae946d2aa\") " pod="openstack/dnsmasq-dns-647df7b8c5-lfdcp"
Dec 16 07:20:26 crc kubenswrapper[4823]: I1216 07:20:26.413701 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/719d0a32-c321-47de-8e64-ebb61884922d-config-data\") pod \"nova-scheduler-0\" (UID: \"719d0a32-c321-47de-8e64-ebb61884922d\") " pod="openstack/nova-scheduler-0"
Dec 16 07:20:26 crc kubenswrapper[4823]: I1216 07:20:26.413719 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/719d0a32-c321-47de-8e64-ebb61884922d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"719d0a32-c321-47de-8e64-ebb61884922d\") " pod="openstack/nova-scheduler-0"
Dec 16 07:20:26 crc kubenswrapper[4823]: I1216 07:20:26.413761 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d61c0ba6-5bd0-4f7f-85f7-28eae946d2aa-config\") pod \"dnsmasq-dns-647df7b8c5-lfdcp\" (UID: \"d61c0ba6-5bd0-4f7f-85f7-28eae946d2aa\") " pod="openstack/dnsmasq-dns-647df7b8c5-lfdcp"
Dec 16 07:20:26 crc kubenswrapper[4823]: I1216 07:20:26.413798 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d61c0ba6-5bd0-4f7f-85f7-28eae946d2aa-ovsdbserver-nb\") pod \"dnsmasq-dns-647df7b8c5-lfdcp\" (UID: \"d61c0ba6-5bd0-4f7f-85f7-28eae946d2aa\") " pod="openstack/dnsmasq-dns-647df7b8c5-lfdcp"
Dec 16 07:20:26 crc kubenswrapper[4823]: I1216 07:20:26.413818 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bm4h2\" (UniqueName: \"kubernetes.io/projected/d61c0ba6-5bd0-4f7f-85f7-28eae946d2aa-kube-api-access-bm4h2\") pod \"dnsmasq-dns-647df7b8c5-lfdcp\" (UID: \"d61c0ba6-5bd0-4f7f-85f7-28eae946d2aa\") " pod="openstack/dnsmasq-dns-647df7b8c5-lfdcp"
Dec 16 07:20:26 crc kubenswrapper[4823]: I1216 07:20:26.413838 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4t9pn\" (UniqueName: \"kubernetes.io/projected/719d0a32-c321-47de-8e64-ebb61884922d-kube-api-access-4t9pn\") pod \"nova-scheduler-0\" (UID: \"719d0a32-c321-47de-8e64-ebb61884922d\") " pod="openstack/nova-scheduler-0"
Dec 16 07:20:26 crc kubenswrapper[4823]: I1216 07:20:26.413890 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d61c0ba6-5bd0-4f7f-85f7-28eae946d2aa-dns-swift-storage-0\") pod \"dnsmasq-dns-647df7b8c5-lfdcp\" (UID: \"d61c0ba6-5bd0-4f7f-85f7-28eae946d2aa\") " pod="openstack/dnsmasq-dns-647df7b8c5-lfdcp"
Dec 16 07:20:26 crc kubenswrapper[4823]: I1216 07:20:26.414842 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d61c0ba6-5bd0-4f7f-85f7-28eae946d2aa-dns-swift-storage-0\") pod \"dnsmasq-dns-647df7b8c5-lfdcp\" (UID: \"d61c0ba6-5bd0-4f7f-85f7-28eae946d2aa\") " pod="openstack/dnsmasq-dns-647df7b8c5-lfdcp"
Dec 16 07:20:26 crc kubenswrapper[4823]: I1216 07:20:26.415583 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d61c0ba6-5bd0-4f7f-85f7-28eae946d2aa-dns-svc\") pod \"dnsmasq-dns-647df7b8c5-lfdcp\" (UID: \"d61c0ba6-5bd0-4f7f-85f7-28eae946d2aa\") " pod="openstack/dnsmasq-dns-647df7b8c5-lfdcp"
Dec 16 07:20:26 crc kubenswrapper[4823]: I1216 07:20:26.416267 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d61c0ba6-5bd0-4f7f-85f7-28eae946d2aa-ovsdbserver-sb\") pod \"dnsmasq-dns-647df7b8c5-lfdcp\" (UID: \"d61c0ba6-5bd0-4f7f-85f7-28eae946d2aa\") " pod="openstack/dnsmasq-dns-647df7b8c5-lfdcp"
Dec 16 07:20:26 crc kubenswrapper[4823]: I1216 07:20:26.419728 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d61c0ba6-5bd0-4f7f-85f7-28eae946d2aa-ovsdbserver-nb\") pod \"dnsmasq-dns-647df7b8c5-lfdcp\" (UID: \"d61c0ba6-5bd0-4f7f-85f7-28eae946d2aa\") " pod="openstack/dnsmasq-dns-647df7b8c5-lfdcp"
Dec 16 07:20:26 crc kubenswrapper[4823]: I1216 07:20:26.420564 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/719d0a32-c321-47de-8e64-ebb61884922d-config-data\") pod \"nova-scheduler-0\" (UID: \"719d0a32-c321-47de-8e64-ebb61884922d\") " pod="openstack/nova-scheduler-0"
Dec 16 07:20:26 crc kubenswrapper[4823]: I1216 07:20:26.422167 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d61c0ba6-5bd0-4f7f-85f7-28eae946d2aa-config\") pod \"dnsmasq-dns-647df7b8c5-lfdcp\" (UID: \"d61c0ba6-5bd0-4f7f-85f7-28eae946d2aa\") " pod="openstack/dnsmasq-dns-647df7b8c5-lfdcp"
Dec 16 07:20:26 crc kubenswrapper[4823]: I1216 07:20:26.427788 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/719d0a32-c321-47de-8e64-ebb61884922d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"719d0a32-c321-47de-8e64-ebb61884922d\") " pod="openstack/nova-scheduler-0"
Dec 16 07:20:26 crc kubenswrapper[4823]: I1216 07:20:26.441170 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4t9pn\" (UniqueName: \"kubernetes.io/projected/719d0a32-c321-47de-8e64-ebb61884922d-kube-api-access-4t9pn\") pod \"nova-scheduler-0\" (UID: \"719d0a32-c321-47de-8e64-ebb61884922d\") " pod="openstack/nova-scheduler-0"
Dec 16 07:20:26 crc kubenswrapper[4823]: I1216 07:20:26.444823 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bm4h2\" (UniqueName: \"kubernetes.io/projected/d61c0ba6-5bd0-4f7f-85f7-28eae946d2aa-kube-api-access-bm4h2\") pod \"dnsmasq-dns-647df7b8c5-lfdcp\" (UID: \"d61c0ba6-5bd0-4f7f-85f7-28eae946d2aa\") " pod="openstack/dnsmasq-dns-647df7b8c5-lfdcp"
Dec 16 07:20:26 crc kubenswrapper[4823]: I1216 07:20:26.513279 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-5855z"
Dec 16 07:20:26 crc kubenswrapper[4823]: I1216 07:20:26.628432 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-647df7b8c5-lfdcp"
Dec 16 07:20:26 crc kubenswrapper[4823]: I1216 07:20:26.644450 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Dec 16 07:20:26 crc kubenswrapper[4823]: I1216 07:20:26.659740 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-5855z"
Dec 16 07:20:26 crc kubenswrapper[4823]: I1216 07:20:26.693332 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Dec 16 07:20:26 crc kubenswrapper[4823]: I1216 07:20:26.721101 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-fzmlg"]
Dec 16 07:20:26 crc kubenswrapper[4823]: W1216 07:20:26.802158 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf10ce7b3_53e0_4318_b7a2_1d2a33b9eb3b.slice/crio-322ce5d15c0a97a23167188cfe3e14c826a5574d2dbfc74ddc13544f66631edf WatchSource:0}: Error finding container 322ce5d15c0a97a23167188cfe3e14c826a5574d2dbfc74ddc13544f66631edf: Status 404 returned error can't find the container with id 322ce5d15c0a97a23167188cfe3e14c826a5574d2dbfc74ddc13544f66631edf
Dec 16 07:20:26 crc kubenswrapper[4823]: I1216 07:20:26.826239 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5855z"]
Dec 16 07:20:26 crc kubenswrapper[4823]: I1216 07:20:26.884182 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"41b2b49f-acfe-4019-a983-c9cea9de4378","Type":"ContainerStarted","Data":"20b846e2d51c83fa289005416400c0a2d9d523ade2f29770a1d2db4efcf5961c"}
Dec 16 07:20:26 crc kubenswrapper[4823]: I1216 07:20:26.885430 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Dec 16 07:20:26 crc kubenswrapper[4823]: I1216 07:20:26.899198 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-fzmlg" event={"ID":"f10ce7b3-53e0-4318-b7a2-1d2a33b9eb3b","Type":"ContainerStarted","Data":"322ce5d15c0a97a23167188cfe3e14c826a5574d2dbfc74ddc13544f66631edf"}
Dec 16 07:20:26 crc kubenswrapper[4823]: I1216 07:20:26.930290 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Dec 16 07:20:26 crc kubenswrapper[4823]: I1216 07:20:26.933681 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.365855147 podStartE2EDuration="5.933660101s" podCreationTimestamp="2025-12-16 07:20:21 +0000 UTC" firstStartedPulling="2025-12-16 07:20:22.716624003 +0000 UTC m=+1501.205190136" lastFinishedPulling="2025-12-16 07:20:26.284428967 +0000 UTC m=+1504.772995090" observedRunningTime="2025-12-16 07:20:26.915946077 +0000 UTC m=+1505.404512200" watchObservedRunningTime="2025-12-16 07:20:26.933660101 +0000 UTC m=+1505.422226224"
Dec 16 07:20:27 crc kubenswrapper[4823]: I1216 07:20:27.149416 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Dec 16 07:20:27 crc kubenswrapper[4823]: I1216 07:20:27.294458 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-647df7b8c5-lfdcp"]
Dec 16 07:20:27 crc kubenswrapper[4823]: I1216 07:20:27.401738 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-c9snq"]
Dec 16 07:20:27 crc kubenswrapper[4823]: I1216 07:20:27.403521 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-c9snq"
Dec 16 07:20:27 crc kubenswrapper[4823]: I1216 07:20:27.408500 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts"
Dec 16 07:20:27 crc kubenswrapper[4823]: I1216 07:20:27.408585 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data"
Dec 16 07:20:27 crc kubenswrapper[4823]: I1216 07:20:27.425406 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Dec 16 07:20:27 crc kubenswrapper[4823]: I1216 07:20:27.433729 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-c9snq"]
Dec 16 07:20:27 crc kubenswrapper[4823]: I1216 07:20:27.439830 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-px55s\" (UniqueName: \"kubernetes.io/projected/4f568c50-222d-46ec-8b2b-d9605d6ace8a-kube-api-access-px55s\") pod \"nova-cell1-conductor-db-sync-c9snq\" (UID: \"4f568c50-222d-46ec-8b2b-d9605d6ace8a\") " pod="openstack/nova-cell1-conductor-db-sync-c9snq"
Dec 16 07:20:27 crc kubenswrapper[4823]: I1216 07:20:27.439969 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f568c50-222d-46ec-8b2b-d9605d6ace8a-scripts\") pod \"nova-cell1-conductor-db-sync-c9snq\" (UID: \"4f568c50-222d-46ec-8b2b-d9605d6ace8a\") " pod="openstack/nova-cell1-conductor-db-sync-c9snq"
Dec 16 07:20:27 crc kubenswrapper[4823]: I1216 07:20:27.440058 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f568c50-222d-46ec-8b2b-d9605d6ace8a-config-data\") pod \"nova-cell1-conductor-db-sync-c9snq\" (UID: \"4f568c50-222d-46ec-8b2b-d9605d6ace8a\") " pod="openstack/nova-cell1-conductor-db-sync-c9snq"
Dec
16 07:20:27 crc kubenswrapper[4823]: I1216 07:20:27.440189 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f568c50-222d-46ec-8b2b-d9605d6ace8a-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-c9snq\" (UID: \"4f568c50-222d-46ec-8b2b-d9605d6ace8a\") " pod="openstack/nova-cell1-conductor-db-sync-c9snq" Dec 16 07:20:27 crc kubenswrapper[4823]: I1216 07:20:27.515009 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 16 07:20:27 crc kubenswrapper[4823]: I1216 07:20:27.541524 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f568c50-222d-46ec-8b2b-d9605d6ace8a-scripts\") pod \"nova-cell1-conductor-db-sync-c9snq\" (UID: \"4f568c50-222d-46ec-8b2b-d9605d6ace8a\") " pod="openstack/nova-cell1-conductor-db-sync-c9snq" Dec 16 07:20:27 crc kubenswrapper[4823]: I1216 07:20:27.541583 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f568c50-222d-46ec-8b2b-d9605d6ace8a-config-data\") pod \"nova-cell1-conductor-db-sync-c9snq\" (UID: \"4f568c50-222d-46ec-8b2b-d9605d6ace8a\") " pod="openstack/nova-cell1-conductor-db-sync-c9snq" Dec 16 07:20:27 crc kubenswrapper[4823]: I1216 07:20:27.541659 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f568c50-222d-46ec-8b2b-d9605d6ace8a-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-c9snq\" (UID: \"4f568c50-222d-46ec-8b2b-d9605d6ace8a\") " pod="openstack/nova-cell1-conductor-db-sync-c9snq" Dec 16 07:20:27 crc kubenswrapper[4823]: I1216 07:20:27.541718 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-px55s\" (UniqueName: 
\"kubernetes.io/projected/4f568c50-222d-46ec-8b2b-d9605d6ace8a-kube-api-access-px55s\") pod \"nova-cell1-conductor-db-sync-c9snq\" (UID: \"4f568c50-222d-46ec-8b2b-d9605d6ace8a\") " pod="openstack/nova-cell1-conductor-db-sync-c9snq" Dec 16 07:20:27 crc kubenswrapper[4823]: I1216 07:20:27.546178 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f568c50-222d-46ec-8b2b-d9605d6ace8a-scripts\") pod \"nova-cell1-conductor-db-sync-c9snq\" (UID: \"4f568c50-222d-46ec-8b2b-d9605d6ace8a\") " pod="openstack/nova-cell1-conductor-db-sync-c9snq" Dec 16 07:20:27 crc kubenswrapper[4823]: I1216 07:20:27.546289 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f568c50-222d-46ec-8b2b-d9605d6ace8a-config-data\") pod \"nova-cell1-conductor-db-sync-c9snq\" (UID: \"4f568c50-222d-46ec-8b2b-d9605d6ace8a\") " pod="openstack/nova-cell1-conductor-db-sync-c9snq" Dec 16 07:20:27 crc kubenswrapper[4823]: I1216 07:20:27.546844 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f568c50-222d-46ec-8b2b-d9605d6ace8a-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-c9snq\" (UID: \"4f568c50-222d-46ec-8b2b-d9605d6ace8a\") " pod="openstack/nova-cell1-conductor-db-sync-c9snq" Dec 16 07:20:27 crc kubenswrapper[4823]: I1216 07:20:27.560495 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-px55s\" (UniqueName: \"kubernetes.io/projected/4f568c50-222d-46ec-8b2b-d9605d6ace8a-kube-api-access-px55s\") pod \"nova-cell1-conductor-db-sync-c9snq\" (UID: \"4f568c50-222d-46ec-8b2b-d9605d6ace8a\") " pod="openstack/nova-cell1-conductor-db-sync-c9snq" Dec 16 07:20:27 crc kubenswrapper[4823]: I1216 07:20:27.731097 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-c9snq" Dec 16 07:20:27 crc kubenswrapper[4823]: I1216 07:20:27.974857 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-fzmlg" event={"ID":"f10ce7b3-53e0-4318-b7a2-1d2a33b9eb3b","Type":"ContainerStarted","Data":"07a9d0c25e6eeab6239ccf65db9c887bfc778f97b4b49626f4c06ae1fafb22b5"} Dec 16 07:20:28 crc kubenswrapper[4823]: I1216 07:20:28.004174 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-fzmlg" podStartSLOduration=3.00415872 podStartE2EDuration="3.00415872s" podCreationTimestamp="2025-12-16 07:20:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 07:20:28.001472385 +0000 UTC m=+1506.490038508" watchObservedRunningTime="2025-12-16 07:20:28.00415872 +0000 UTC m=+1506.492724923" Dec 16 07:20:28 crc kubenswrapper[4823]: I1216 07:20:28.005378 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c2893d1b-5ad4-432b-964e-fa981929487a","Type":"ContainerStarted","Data":"e358006e6eb460fd9fe7c9dd59101298f914a1244fc3544b14ff7109e1f7eb59"} Dec 16 07:20:28 crc kubenswrapper[4823]: I1216 07:20:28.027342 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"719d0a32-c321-47de-8e64-ebb61884922d","Type":"ContainerStarted","Data":"bbe47a48f93d7b246ebe3b5ec97fcc5e7aa43618b7973cba220f28f1877c58bd"} Dec 16 07:20:28 crc kubenswrapper[4823]: I1216 07:20:28.029039 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"3bdc8199-cc99-47c0-a271-7653cf92832e","Type":"ContainerStarted","Data":"98fdc86d2c70e40cb779803c096f218d5b06169fec19211d33f39bb35a6813a5"} Dec 16 07:20:28 crc kubenswrapper[4823]: I1216 07:20:28.048000 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-metadata-0" event={"ID":"05f31fcf-de71-43dc-a94c-11ef850cf1e4","Type":"ContainerStarted","Data":"740bfd7c5d4bd470eedf9a35591ffa4a893ba1b199247567cc0102dfe14fa190"} Dec 16 07:20:28 crc kubenswrapper[4823]: I1216 07:20:28.065205 4823 generic.go:334] "Generic (PLEG): container finished" podID="d61c0ba6-5bd0-4f7f-85f7-28eae946d2aa" containerID="d0c12acae64a11345b532f0613a2afb372cb1aeb0df6c6420c1b6b7634f2ade4" exitCode=0 Dec 16 07:20:28 crc kubenswrapper[4823]: I1216 07:20:28.065444 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-5855z" podUID="9fa50495-02f1-4a8d-a65a-8f28ebedaa92" containerName="registry-server" containerID="cri-o://b4dc436043d89179184d1f034863103dde286650864660b5bece9a67cbc9077e" gracePeriod=2 Dec 16 07:20:28 crc kubenswrapper[4823]: I1216 07:20:28.065855 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-647df7b8c5-lfdcp" event={"ID":"d61c0ba6-5bd0-4f7f-85f7-28eae946d2aa","Type":"ContainerDied","Data":"d0c12acae64a11345b532f0613a2afb372cb1aeb0df6c6420c1b6b7634f2ade4"} Dec 16 07:20:28 crc kubenswrapper[4823]: I1216 07:20:28.065916 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-647df7b8c5-lfdcp" event={"ID":"d61c0ba6-5bd0-4f7f-85f7-28eae946d2aa","Type":"ContainerStarted","Data":"b2476c086cbe3a7d8ce3a52a1933a4ebdbeccda9029a293fbd426eec90fdff8e"} Dec 16 07:20:28 crc kubenswrapper[4823]: I1216 07:20:28.135207 4823 patch_prober.go:28] interesting pod/machine-config-daemon-fv56f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 07:20:28 crc kubenswrapper[4823]: I1216 07:20:28.135274 4823 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" 
podUID="25dec47c-3043-486c-b371-2be103c214e3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 07:20:28 crc kubenswrapper[4823]: I1216 07:20:28.346190 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-c9snq"] Dec 16 07:20:28 crc kubenswrapper[4823]: I1216 07:20:28.573195 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5855z" Dec 16 07:20:28 crc kubenswrapper[4823]: I1216 07:20:28.683301 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hxjs9\" (UniqueName: \"kubernetes.io/projected/9fa50495-02f1-4a8d-a65a-8f28ebedaa92-kube-api-access-hxjs9\") pod \"9fa50495-02f1-4a8d-a65a-8f28ebedaa92\" (UID: \"9fa50495-02f1-4a8d-a65a-8f28ebedaa92\") " Dec 16 07:20:28 crc kubenswrapper[4823]: I1216 07:20:28.683719 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9fa50495-02f1-4a8d-a65a-8f28ebedaa92-catalog-content\") pod \"9fa50495-02f1-4a8d-a65a-8f28ebedaa92\" (UID: \"9fa50495-02f1-4a8d-a65a-8f28ebedaa92\") " Dec 16 07:20:28 crc kubenswrapper[4823]: I1216 07:20:28.683747 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9fa50495-02f1-4a8d-a65a-8f28ebedaa92-utilities\") pod \"9fa50495-02f1-4a8d-a65a-8f28ebedaa92\" (UID: \"9fa50495-02f1-4a8d-a65a-8f28ebedaa92\") " Dec 16 07:20:28 crc kubenswrapper[4823]: I1216 07:20:28.688579 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9fa50495-02f1-4a8d-a65a-8f28ebedaa92-kube-api-access-hxjs9" (OuterVolumeSpecName: "kube-api-access-hxjs9") pod "9fa50495-02f1-4a8d-a65a-8f28ebedaa92" (UID: 
"9fa50495-02f1-4a8d-a65a-8f28ebedaa92"). InnerVolumeSpecName "kube-api-access-hxjs9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:20:28 crc kubenswrapper[4823]: I1216 07:20:28.693769 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9fa50495-02f1-4a8d-a65a-8f28ebedaa92-utilities" (OuterVolumeSpecName: "utilities") pod "9fa50495-02f1-4a8d-a65a-8f28ebedaa92" (UID: "9fa50495-02f1-4a8d-a65a-8f28ebedaa92"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:20:28 crc kubenswrapper[4823]: I1216 07:20:28.748880 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9fa50495-02f1-4a8d-a65a-8f28ebedaa92-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9fa50495-02f1-4a8d-a65a-8f28ebedaa92" (UID: "9fa50495-02f1-4a8d-a65a-8f28ebedaa92"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:20:28 crc kubenswrapper[4823]: I1216 07:20:28.786282 4823 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9fa50495-02f1-4a8d-a65a-8f28ebedaa92-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 07:20:28 crc kubenswrapper[4823]: I1216 07:20:28.786319 4823 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9fa50495-02f1-4a8d-a65a-8f28ebedaa92-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 07:20:28 crc kubenswrapper[4823]: I1216 07:20:28.786329 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hxjs9\" (UniqueName: \"kubernetes.io/projected/9fa50495-02f1-4a8d-a65a-8f28ebedaa92-kube-api-access-hxjs9\") on node \"crc\" DevicePath \"\"" Dec 16 07:20:29 crc kubenswrapper[4823]: I1216 07:20:29.082367 4823 generic.go:334] "Generic (PLEG): container finished" podID="9fa50495-02f1-4a8d-a65a-8f28ebedaa92" 
containerID="b4dc436043d89179184d1f034863103dde286650864660b5bece9a67cbc9077e" exitCode=0 Dec 16 07:20:29 crc kubenswrapper[4823]: I1216 07:20:29.082519 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5855z" Dec 16 07:20:29 crc kubenswrapper[4823]: I1216 07:20:29.083093 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5855z" event={"ID":"9fa50495-02f1-4a8d-a65a-8f28ebedaa92","Type":"ContainerDied","Data":"b4dc436043d89179184d1f034863103dde286650864660b5bece9a67cbc9077e"} Dec 16 07:20:29 crc kubenswrapper[4823]: I1216 07:20:29.083161 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5855z" event={"ID":"9fa50495-02f1-4a8d-a65a-8f28ebedaa92","Type":"ContainerDied","Data":"af60c4150f4400ea9af4ffcfe91fa37e307504200a4d944d5dc2b8072c44dd0e"} Dec 16 07:20:29 crc kubenswrapper[4823]: I1216 07:20:29.083186 4823 scope.go:117] "RemoveContainer" containerID="b4dc436043d89179184d1f034863103dde286650864660b5bece9a67cbc9077e" Dec 16 07:20:29 crc kubenswrapper[4823]: I1216 07:20:29.087737 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-647df7b8c5-lfdcp" event={"ID":"d61c0ba6-5bd0-4f7f-85f7-28eae946d2aa","Type":"ContainerStarted","Data":"f5027690ff5170201f3250e3d31dfbd810fe793d136225ccf387795cb8773c20"} Dec 16 07:20:29 crc kubenswrapper[4823]: I1216 07:20:29.088097 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-647df7b8c5-lfdcp" Dec 16 07:20:29 crc kubenswrapper[4823]: I1216 07:20:29.095442 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-c9snq" event={"ID":"4f568c50-222d-46ec-8b2b-d9605d6ace8a","Type":"ContainerStarted","Data":"8c0b7b84afb786dddbed1b11c3ecd557ca15d8f1a49d5fef28e57f3520fd1ec6"} Dec 16 07:20:29 crc kubenswrapper[4823]: I1216 07:20:29.095480 4823 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-c9snq" event={"ID":"4f568c50-222d-46ec-8b2b-d9605d6ace8a","Type":"ContainerStarted","Data":"1dff03057dd51f22e837255516db41b8b0b9ba9a9739d94c551829c12e35634c"} Dec 16 07:20:29 crc kubenswrapper[4823]: I1216 07:20:29.115654 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-647df7b8c5-lfdcp" podStartSLOduration=3.115627241 podStartE2EDuration="3.115627241s" podCreationTimestamp="2025-12-16 07:20:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 07:20:29.105548555 +0000 UTC m=+1507.594114678" watchObservedRunningTime="2025-12-16 07:20:29.115627241 +0000 UTC m=+1507.604193364" Dec 16 07:20:29 crc kubenswrapper[4823]: I1216 07:20:29.122113 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-c9snq" podStartSLOduration=2.122094954 podStartE2EDuration="2.122094954s" podCreationTimestamp="2025-12-16 07:20:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 07:20:29.121093252 +0000 UTC m=+1507.609659395" watchObservedRunningTime="2025-12-16 07:20:29.122094954 +0000 UTC m=+1507.610661077" Dec 16 07:20:29 crc kubenswrapper[4823]: I1216 07:20:29.150137 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5855z"] Dec 16 07:20:29 crc kubenswrapper[4823]: I1216 07:20:29.163043 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-5855z"] Dec 16 07:20:29 crc kubenswrapper[4823]: I1216 07:20:29.573332 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-kprln"] Dec 16 07:20:29 crc kubenswrapper[4823]: E1216 07:20:29.574135 4823 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fa50495-02f1-4a8d-a65a-8f28ebedaa92" containerName="registry-server" Dec 16 07:20:29 crc kubenswrapper[4823]: I1216 07:20:29.574154 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fa50495-02f1-4a8d-a65a-8f28ebedaa92" containerName="registry-server" Dec 16 07:20:29 crc kubenswrapper[4823]: E1216 07:20:29.574178 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fa50495-02f1-4a8d-a65a-8f28ebedaa92" containerName="extract-content" Dec 16 07:20:29 crc kubenswrapper[4823]: I1216 07:20:29.574186 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fa50495-02f1-4a8d-a65a-8f28ebedaa92" containerName="extract-content" Dec 16 07:20:29 crc kubenswrapper[4823]: E1216 07:20:29.574207 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fa50495-02f1-4a8d-a65a-8f28ebedaa92" containerName="extract-utilities" Dec 16 07:20:29 crc kubenswrapper[4823]: I1216 07:20:29.574217 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fa50495-02f1-4a8d-a65a-8f28ebedaa92" containerName="extract-utilities" Dec 16 07:20:29 crc kubenswrapper[4823]: I1216 07:20:29.574452 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="9fa50495-02f1-4a8d-a65a-8f28ebedaa92" containerName="registry-server" Dec 16 07:20:29 crc kubenswrapper[4823]: I1216 07:20:29.576360 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kprln" Dec 16 07:20:29 crc kubenswrapper[4823]: I1216 07:20:29.597462 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kprln"] Dec 16 07:20:29 crc kubenswrapper[4823]: I1216 07:20:29.604293 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/822a5f46-f8d5-4d0b-8e78-c5385552f353-utilities\") pod \"community-operators-kprln\" (UID: \"822a5f46-f8d5-4d0b-8e78-c5385552f353\") " pod="openshift-marketplace/community-operators-kprln" Dec 16 07:20:29 crc kubenswrapper[4823]: I1216 07:20:29.604429 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9s8w\" (UniqueName: \"kubernetes.io/projected/822a5f46-f8d5-4d0b-8e78-c5385552f353-kube-api-access-m9s8w\") pod \"community-operators-kprln\" (UID: \"822a5f46-f8d5-4d0b-8e78-c5385552f353\") " pod="openshift-marketplace/community-operators-kprln" Dec 16 07:20:29 crc kubenswrapper[4823]: I1216 07:20:29.604522 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/822a5f46-f8d5-4d0b-8e78-c5385552f353-catalog-content\") pod \"community-operators-kprln\" (UID: \"822a5f46-f8d5-4d0b-8e78-c5385552f353\") " pod="openshift-marketplace/community-operators-kprln" Dec 16 07:20:29 crc kubenswrapper[4823]: I1216 07:20:29.706177 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/822a5f46-f8d5-4d0b-8e78-c5385552f353-catalog-content\") pod \"community-operators-kprln\" (UID: \"822a5f46-f8d5-4d0b-8e78-c5385552f353\") " pod="openshift-marketplace/community-operators-kprln" Dec 16 07:20:29 crc kubenswrapper[4823]: I1216 07:20:29.706292 4823 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/822a5f46-f8d5-4d0b-8e78-c5385552f353-utilities\") pod \"community-operators-kprln\" (UID: \"822a5f46-f8d5-4d0b-8e78-c5385552f353\") " pod="openshift-marketplace/community-operators-kprln" Dec 16 07:20:29 crc kubenswrapper[4823]: I1216 07:20:29.706369 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9s8w\" (UniqueName: \"kubernetes.io/projected/822a5f46-f8d5-4d0b-8e78-c5385552f353-kube-api-access-m9s8w\") pod \"community-operators-kprln\" (UID: \"822a5f46-f8d5-4d0b-8e78-c5385552f353\") " pod="openshift-marketplace/community-operators-kprln" Dec 16 07:20:29 crc kubenswrapper[4823]: I1216 07:20:29.706812 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/822a5f46-f8d5-4d0b-8e78-c5385552f353-catalog-content\") pod \"community-operators-kprln\" (UID: \"822a5f46-f8d5-4d0b-8e78-c5385552f353\") " pod="openshift-marketplace/community-operators-kprln" Dec 16 07:20:29 crc kubenswrapper[4823]: I1216 07:20:29.706832 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/822a5f46-f8d5-4d0b-8e78-c5385552f353-utilities\") pod \"community-operators-kprln\" (UID: \"822a5f46-f8d5-4d0b-8e78-c5385552f353\") " pod="openshift-marketplace/community-operators-kprln" Dec 16 07:20:29 crc kubenswrapper[4823]: I1216 07:20:29.750045 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9s8w\" (UniqueName: \"kubernetes.io/projected/822a5f46-f8d5-4d0b-8e78-c5385552f353-kube-api-access-m9s8w\") pod \"community-operators-kprln\" (UID: \"822a5f46-f8d5-4d0b-8e78-c5385552f353\") " pod="openshift-marketplace/community-operators-kprln" Dec 16 07:20:29 crc kubenswrapper[4823]: I1216 07:20:29.787895 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="9fa50495-02f1-4a8d-a65a-8f28ebedaa92" path="/var/lib/kubelet/pods/9fa50495-02f1-4a8d-a65a-8f28ebedaa92/volumes" Dec 16 07:20:29 crc kubenswrapper[4823]: I1216 07:20:29.900841 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kprln" Dec 16 07:20:30 crc kubenswrapper[4823]: I1216 07:20:30.081063 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 16 07:20:30 crc kubenswrapper[4823]: I1216 07:20:30.089215 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 16 07:20:31 crc kubenswrapper[4823]: I1216 07:20:31.383044 4823 scope.go:117] "RemoveContainer" containerID="a687b5919468b3772d87d1ca4c7eb55dd57c24f15ce4a3d8849269f768ce5470" Dec 16 07:20:31 crc kubenswrapper[4823]: I1216 07:20:31.470654 4823 scope.go:117] "RemoveContainer" containerID="5e71505e703b0f231c389f42b881b6695200a645a9f187b96bb2999e37242c2b" Dec 16 07:20:31 crc kubenswrapper[4823]: I1216 07:20:31.761138 4823 scope.go:117] "RemoveContainer" containerID="b4dc436043d89179184d1f034863103dde286650864660b5bece9a67cbc9077e" Dec 16 07:20:31 crc kubenswrapper[4823]: E1216 07:20:31.763574 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4dc436043d89179184d1f034863103dde286650864660b5bece9a67cbc9077e\": container with ID starting with b4dc436043d89179184d1f034863103dde286650864660b5bece9a67cbc9077e not found: ID does not exist" containerID="b4dc436043d89179184d1f034863103dde286650864660b5bece9a67cbc9077e" Dec 16 07:20:31 crc kubenswrapper[4823]: I1216 07:20:31.763612 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4dc436043d89179184d1f034863103dde286650864660b5bece9a67cbc9077e"} err="failed to get container status \"b4dc436043d89179184d1f034863103dde286650864660b5bece9a67cbc9077e\": rpc error: code = NotFound desc = 
could not find container \"b4dc436043d89179184d1f034863103dde286650864660b5bece9a67cbc9077e\": container with ID starting with b4dc436043d89179184d1f034863103dde286650864660b5bece9a67cbc9077e not found: ID does not exist" Dec 16 07:20:31 crc kubenswrapper[4823]: I1216 07:20:31.763636 4823 scope.go:117] "RemoveContainer" containerID="a687b5919468b3772d87d1ca4c7eb55dd57c24f15ce4a3d8849269f768ce5470" Dec 16 07:20:31 crc kubenswrapper[4823]: E1216 07:20:31.766103 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a687b5919468b3772d87d1ca4c7eb55dd57c24f15ce4a3d8849269f768ce5470\": container with ID starting with a687b5919468b3772d87d1ca4c7eb55dd57c24f15ce4a3d8849269f768ce5470 not found: ID does not exist" containerID="a687b5919468b3772d87d1ca4c7eb55dd57c24f15ce4a3d8849269f768ce5470" Dec 16 07:20:31 crc kubenswrapper[4823]: I1216 07:20:31.766140 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a687b5919468b3772d87d1ca4c7eb55dd57c24f15ce4a3d8849269f768ce5470"} err="failed to get container status \"a687b5919468b3772d87d1ca4c7eb55dd57c24f15ce4a3d8849269f768ce5470\": rpc error: code = NotFound desc = could not find container \"a687b5919468b3772d87d1ca4c7eb55dd57c24f15ce4a3d8849269f768ce5470\": container with ID starting with a687b5919468b3772d87d1ca4c7eb55dd57c24f15ce4a3d8849269f768ce5470 not found: ID does not exist" Dec 16 07:20:31 crc kubenswrapper[4823]: I1216 07:20:31.766161 4823 scope.go:117] "RemoveContainer" containerID="5e71505e703b0f231c389f42b881b6695200a645a9f187b96bb2999e37242c2b" Dec 16 07:20:31 crc kubenswrapper[4823]: E1216 07:20:31.767298 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e71505e703b0f231c389f42b881b6695200a645a9f187b96bb2999e37242c2b\": container with ID starting with 5e71505e703b0f231c389f42b881b6695200a645a9f187b96bb2999e37242c2b not 
found: ID does not exist" containerID="5e71505e703b0f231c389f42b881b6695200a645a9f187b96bb2999e37242c2b" Dec 16 07:20:31 crc kubenswrapper[4823]: I1216 07:20:31.767323 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e71505e703b0f231c389f42b881b6695200a645a9f187b96bb2999e37242c2b"} err="failed to get container status \"5e71505e703b0f231c389f42b881b6695200a645a9f187b96bb2999e37242c2b\": rpc error: code = NotFound desc = could not find container \"5e71505e703b0f231c389f42b881b6695200a645a9f187b96bb2999e37242c2b\": container with ID starting with 5e71505e703b0f231c389f42b881b6695200a645a9f187b96bb2999e37242c2b not found: ID does not exist" Dec 16 07:20:32 crc kubenswrapper[4823]: I1216 07:20:32.049450 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kprln"] Dec 16 07:20:32 crc kubenswrapper[4823]: W1216 07:20:32.068234 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod822a5f46_f8d5_4d0b_8e78_c5385552f353.slice/crio-8ebda4a2d745342591f06048455cdd7d8b91296abbdd3d1374c3f189d54ea3c6 WatchSource:0}: Error finding container 8ebda4a2d745342591f06048455cdd7d8b91296abbdd3d1374c3f189d54ea3c6: Status 404 returned error can't find the container with id 8ebda4a2d745342591f06048455cdd7d8b91296abbdd3d1374c3f189d54ea3c6 Dec 16 07:20:32 crc kubenswrapper[4823]: I1216 07:20:32.138373 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"719d0a32-c321-47de-8e64-ebb61884922d","Type":"ContainerStarted","Data":"ac2e538e0b5f4fa4998a5cbe9a6f0f6fe2f6c02b0e03d8a99f9b04a5e6c00229"} Dec 16 07:20:32 crc kubenswrapper[4823]: I1216 07:20:32.155702 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" 
event={"ID":"3bdc8199-cc99-47c0-a271-7653cf92832e","Type":"ContainerStarted","Data":"ac51929429ff0dc494a7f7bf47c48ab01b7d2dee92724a05c9ab6ca16fab3c14"} Dec 16 07:20:32 crc kubenswrapper[4823]: I1216 07:20:32.155863 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="3bdc8199-cc99-47c0-a271-7653cf92832e" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://ac51929429ff0dc494a7f7bf47c48ab01b7d2dee92724a05c9ab6ca16fab3c14" gracePeriod=30 Dec 16 07:20:32 crc kubenswrapper[4823]: I1216 07:20:32.157476 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.224058926 podStartE2EDuration="6.157455551s" podCreationTimestamp="2025-12-16 07:20:26 +0000 UTC" firstStartedPulling="2025-12-16 07:20:27.522556956 +0000 UTC m=+1506.011123079" lastFinishedPulling="2025-12-16 07:20:31.455953591 +0000 UTC m=+1509.944519704" observedRunningTime="2025-12-16 07:20:32.154754707 +0000 UTC m=+1510.643320830" watchObservedRunningTime="2025-12-16 07:20:32.157455551 +0000 UTC m=+1510.646021674" Dec 16 07:20:32 crc kubenswrapper[4823]: I1216 07:20:32.176579 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.868461968 podStartE2EDuration="7.176558609s" podCreationTimestamp="2025-12-16 07:20:25 +0000 UTC" firstStartedPulling="2025-12-16 07:20:27.147429156 +0000 UTC m=+1505.635995279" lastFinishedPulling="2025-12-16 07:20:31.455525797 +0000 UTC m=+1509.944091920" observedRunningTime="2025-12-16 07:20:32.169245621 +0000 UTC m=+1510.657811744" watchObservedRunningTime="2025-12-16 07:20:32.176558609 +0000 UTC m=+1510.665124732" Dec 16 07:20:32 crc kubenswrapper[4823]: I1216 07:20:32.178844 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"05f31fcf-de71-43dc-a94c-11ef850cf1e4","Type":"ContainerStarted","Data":"0a550f17b17ac9b38481c402ef52c613adb84141592a677e22ded679a0bbc92f"} Dec 16 07:20:32 crc kubenswrapper[4823]: I1216 07:20:32.179000 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="05f31fcf-de71-43dc-a94c-11ef850cf1e4" containerName="nova-metadata-log" containerID="cri-o://0a550f17b17ac9b38481c402ef52c613adb84141592a677e22ded679a0bbc92f" gracePeriod=30 Dec 16 07:20:32 crc kubenswrapper[4823]: I1216 07:20:32.179106 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="05f31fcf-de71-43dc-a94c-11ef850cf1e4" containerName="nova-metadata-metadata" containerID="cri-o://d7813d1c91814e432a0eb6dcaf440aa3cdd98f1c31cee37f446714f48c9f71ec" gracePeriod=30 Dec 16 07:20:32 crc kubenswrapper[4823]: I1216 07:20:32.189296 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kprln" event={"ID":"822a5f46-f8d5-4d0b-8e78-c5385552f353","Type":"ContainerStarted","Data":"8ebda4a2d745342591f06048455cdd7d8b91296abbdd3d1374c3f189d54ea3c6"} Dec 16 07:20:32 crc kubenswrapper[4823]: I1216 07:20:32.194623 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c2893d1b-5ad4-432b-964e-fa981929487a","Type":"ContainerStarted","Data":"41afbf4a7eb85171756e9a962e57cd0e86d2e83651ac7f358cabac0ee33f7fe5"} Dec 16 07:20:32 crc kubenswrapper[4823]: I1216 07:20:32.198559 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.159989041 podStartE2EDuration="7.198545389s" podCreationTimestamp="2025-12-16 07:20:25 +0000 UTC" firstStartedPulling="2025-12-16 07:20:27.435892651 +0000 UTC m=+1505.924458774" lastFinishedPulling="2025-12-16 07:20:31.474448999 +0000 UTC m=+1509.963015122" observedRunningTime="2025-12-16 07:20:32.195818464 +0000 UTC 
m=+1510.684384587" watchObservedRunningTime="2025-12-16 07:20:32.198545389 +0000 UTC m=+1510.687111512" Dec 16 07:20:32 crc kubenswrapper[4823]: I1216 07:20:32.224093 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.746649833 podStartE2EDuration="7.224072367s" podCreationTimestamp="2025-12-16 07:20:25 +0000 UTC" firstStartedPulling="2025-12-16 07:20:26.978359771 +0000 UTC m=+1505.466925894" lastFinishedPulling="2025-12-16 07:20:31.455782305 +0000 UTC m=+1509.944348428" observedRunningTime="2025-12-16 07:20:32.215390656 +0000 UTC m=+1510.703956779" watchObservedRunningTime="2025-12-16 07:20:32.224072367 +0000 UTC m=+1510.712638490" Dec 16 07:20:33 crc kubenswrapper[4823]: I1216 07:20:33.205608 4823 generic.go:334] "Generic (PLEG): container finished" podID="822a5f46-f8d5-4d0b-8e78-c5385552f353" containerID="80ea8742db042e42d3a2fdfb64330ae1dfd957e80706b34ed6dcffd2a00909a8" exitCode=0 Dec 16 07:20:33 crc kubenswrapper[4823]: I1216 07:20:33.205663 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kprln" event={"ID":"822a5f46-f8d5-4d0b-8e78-c5385552f353","Type":"ContainerDied","Data":"80ea8742db042e42d3a2fdfb64330ae1dfd957e80706b34ed6dcffd2a00909a8"} Dec 16 07:20:33 crc kubenswrapper[4823]: I1216 07:20:33.208936 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c2893d1b-5ad4-432b-964e-fa981929487a","Type":"ContainerStarted","Data":"2ff24c8b69871cbfc85b80c5aca37619b618e9916395e92eaf52df2f2dab18ca"} Dec 16 07:20:33 crc kubenswrapper[4823]: I1216 07:20:33.212999 4823 generic.go:334] "Generic (PLEG): container finished" podID="05f31fcf-de71-43dc-a94c-11ef850cf1e4" containerID="0a550f17b17ac9b38481c402ef52c613adb84141592a677e22ded679a0bbc92f" exitCode=143 Dec 16 07:20:33 crc kubenswrapper[4823]: I1216 07:20:33.213656 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"05f31fcf-de71-43dc-a94c-11ef850cf1e4","Type":"ContainerDied","Data":"0a550f17b17ac9b38481c402ef52c613adb84141592a677e22ded679a0bbc92f"} Dec 16 07:20:33 crc kubenswrapper[4823]: I1216 07:20:33.213703 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"05f31fcf-de71-43dc-a94c-11ef850cf1e4","Type":"ContainerStarted","Data":"d7813d1c91814e432a0eb6dcaf440aa3cdd98f1c31cee37f446714f48c9f71ec"} Dec 16 07:20:34 crc kubenswrapper[4823]: I1216 07:20:34.222892 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kprln" event={"ID":"822a5f46-f8d5-4d0b-8e78-c5385552f353","Type":"ContainerStarted","Data":"0d18db6a73c742367ced16771267babc1d9b4536dca1ef668e936f7d41dc8110"} Dec 16 07:20:35 crc kubenswrapper[4823]: I1216 07:20:35.234129 4823 generic.go:334] "Generic (PLEG): container finished" podID="822a5f46-f8d5-4d0b-8e78-c5385552f353" containerID="0d18db6a73c742367ced16771267babc1d9b4536dca1ef668e936f7d41dc8110" exitCode=0 Dec 16 07:20:35 crc kubenswrapper[4823]: I1216 07:20:35.234371 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kprln" event={"ID":"822a5f46-f8d5-4d0b-8e78-c5385552f353","Type":"ContainerDied","Data":"0d18db6a73c742367ced16771267babc1d9b4536dca1ef668e936f7d41dc8110"} Dec 16 07:20:36 crc kubenswrapper[4823]: I1216 07:20:36.246217 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kprln" event={"ID":"822a5f46-f8d5-4d0b-8e78-c5385552f353","Type":"ContainerStarted","Data":"c3604f4e821c897dc1cb5cf7e6beaaa120e1740e0ad00805b9d5177f80c463ad"} Dec 16 07:20:36 crc kubenswrapper[4823]: I1216 07:20:36.248658 4823 generic.go:334] "Generic (PLEG): container finished" podID="f10ce7b3-53e0-4318-b7a2-1d2a33b9eb3b" containerID="07a9d0c25e6eeab6239ccf65db9c887bfc778f97b4b49626f4c06ae1fafb22b5" exitCode=0 Dec 16 07:20:36 crc kubenswrapper[4823]: I1216 07:20:36.248689 4823 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-fzmlg" event={"ID":"f10ce7b3-53e0-4318-b7a2-1d2a33b9eb3b","Type":"ContainerDied","Data":"07a9d0c25e6eeab6239ccf65db9c887bfc778f97b4b49626f4c06ae1fafb22b5"} Dec 16 07:20:36 crc kubenswrapper[4823]: I1216 07:20:36.268887 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-kprln" podStartSLOduration=4.813810979 podStartE2EDuration="7.268871972s" podCreationTimestamp="2025-12-16 07:20:29 +0000 UTC" firstStartedPulling="2025-12-16 07:20:33.20878421 +0000 UTC m=+1511.697350343" lastFinishedPulling="2025-12-16 07:20:35.663845213 +0000 UTC m=+1514.152411336" observedRunningTime="2025-12-16 07:20:36.261645525 +0000 UTC m=+1514.750211668" watchObservedRunningTime="2025-12-16 07:20:36.268871972 +0000 UTC m=+1514.757438085" Dec 16 07:20:36 crc kubenswrapper[4823]: I1216 07:20:36.309316 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 16 07:20:36 crc kubenswrapper[4823]: I1216 07:20:36.309646 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 16 07:20:36 crc kubenswrapper[4823]: I1216 07:20:36.328797 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 16 07:20:36 crc kubenswrapper[4823]: I1216 07:20:36.630166 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-647df7b8c5-lfdcp" Dec 16 07:20:36 crc kubenswrapper[4823]: I1216 07:20:36.645166 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 16 07:20:36 crc kubenswrapper[4823]: I1216 07:20:36.645233 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 16 07:20:36 crc kubenswrapper[4823]: I1216 07:20:36.694680 4823 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 16 07:20:36 crc kubenswrapper[4823]: I1216 07:20:36.695000 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 16 07:20:36 crc kubenswrapper[4823]: I1216 07:20:36.697057 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75dbb546bf-zxsbt"] Dec 16 07:20:36 crc kubenswrapper[4823]: I1216 07:20:36.697324 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-75dbb546bf-zxsbt" podUID="52fb8160-a1e0-4b7e-a3ce-bd018dc8c512" containerName="dnsmasq-dns" containerID="cri-o://66fb8cb9dc2bdcafdd3f90d87593590c0679946f3ccdc9113b87fb499e690755" gracePeriod=10 Dec 16 07:20:36 crc kubenswrapper[4823]: I1216 07:20:36.702864 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 16 07:20:37 crc kubenswrapper[4823]: I1216 07:20:37.265474 4823 generic.go:334] "Generic (PLEG): container finished" podID="52fb8160-a1e0-4b7e-a3ce-bd018dc8c512" containerID="66fb8cb9dc2bdcafdd3f90d87593590c0679946f3ccdc9113b87fb499e690755" exitCode=0 Dec 16 07:20:37 crc kubenswrapper[4823]: I1216 07:20:37.265908 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75dbb546bf-zxsbt" event={"ID":"52fb8160-a1e0-4b7e-a3ce-bd018dc8c512","Type":"ContainerDied","Data":"66fb8cb9dc2bdcafdd3f90d87593590c0679946f3ccdc9113b87fb499e690755"} Dec 16 07:20:37 crc kubenswrapper[4823]: I1216 07:20:37.266808 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75dbb546bf-zxsbt" event={"ID":"52fb8160-a1e0-4b7e-a3ce-bd018dc8c512","Type":"ContainerDied","Data":"3b938c5353d77ba2e003d77b926ed4311cd8b81b8961f00030320fb101dc9baa"} Dec 16 07:20:37 crc kubenswrapper[4823]: I1216 07:20:37.266830 4823 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="3b938c5353d77ba2e003d77b926ed4311cd8b81b8961f00030320fb101dc9baa" Dec 16 07:20:37 crc kubenswrapper[4823]: I1216 07:20:37.305452 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75dbb546bf-zxsbt" Dec 16 07:20:37 crc kubenswrapper[4823]: I1216 07:20:37.310165 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 16 07:20:37 crc kubenswrapper[4823]: I1216 07:20:37.378175 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/52fb8160-a1e0-4b7e-a3ce-bd018dc8c512-dns-swift-storage-0\") pod \"52fb8160-a1e0-4b7e-a3ce-bd018dc8c512\" (UID: \"52fb8160-a1e0-4b7e-a3ce-bd018dc8c512\") " Dec 16 07:20:37 crc kubenswrapper[4823]: I1216 07:20:37.378300 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/52fb8160-a1e0-4b7e-a3ce-bd018dc8c512-config\") pod \"52fb8160-a1e0-4b7e-a3ce-bd018dc8c512\" (UID: \"52fb8160-a1e0-4b7e-a3ce-bd018dc8c512\") " Dec 16 07:20:37 crc kubenswrapper[4823]: I1216 07:20:37.378335 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/52fb8160-a1e0-4b7e-a3ce-bd018dc8c512-ovsdbserver-sb\") pod \"52fb8160-a1e0-4b7e-a3ce-bd018dc8c512\" (UID: \"52fb8160-a1e0-4b7e-a3ce-bd018dc8c512\") " Dec 16 07:20:37 crc kubenswrapper[4823]: I1216 07:20:37.378444 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/52fb8160-a1e0-4b7e-a3ce-bd018dc8c512-ovsdbserver-nb\") pod \"52fb8160-a1e0-4b7e-a3ce-bd018dc8c512\" (UID: \"52fb8160-a1e0-4b7e-a3ce-bd018dc8c512\") " Dec 16 07:20:37 crc kubenswrapper[4823]: I1216 07:20:37.378482 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-q8tlt\" (UniqueName: \"kubernetes.io/projected/52fb8160-a1e0-4b7e-a3ce-bd018dc8c512-kube-api-access-q8tlt\") pod \"52fb8160-a1e0-4b7e-a3ce-bd018dc8c512\" (UID: \"52fb8160-a1e0-4b7e-a3ce-bd018dc8c512\") " Dec 16 07:20:37 crc kubenswrapper[4823]: I1216 07:20:37.378626 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/52fb8160-a1e0-4b7e-a3ce-bd018dc8c512-dns-svc\") pod \"52fb8160-a1e0-4b7e-a3ce-bd018dc8c512\" (UID: \"52fb8160-a1e0-4b7e-a3ce-bd018dc8c512\") " Dec 16 07:20:37 crc kubenswrapper[4823]: I1216 07:20:37.396902 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52fb8160-a1e0-4b7e-a3ce-bd018dc8c512-kube-api-access-q8tlt" (OuterVolumeSpecName: "kube-api-access-q8tlt") pod "52fb8160-a1e0-4b7e-a3ce-bd018dc8c512" (UID: "52fb8160-a1e0-4b7e-a3ce-bd018dc8c512"). InnerVolumeSpecName "kube-api-access-q8tlt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:20:37 crc kubenswrapper[4823]: I1216 07:20:37.399337 4823 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="c2893d1b-5ad4-432b-964e-fa981929487a" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.179:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 16 07:20:37 crc kubenswrapper[4823]: I1216 07:20:37.399694 4823 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="c2893d1b-5ad4-432b-964e-fa981929487a" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.179:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 16 07:20:37 crc kubenswrapper[4823]: I1216 07:20:37.475631 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/52fb8160-a1e0-4b7e-a3ce-bd018dc8c512-config" (OuterVolumeSpecName: "config") pod 
"52fb8160-a1e0-4b7e-a3ce-bd018dc8c512" (UID: "52fb8160-a1e0-4b7e-a3ce-bd018dc8c512"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:20:37 crc kubenswrapper[4823]: I1216 07:20:37.482352 4823 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/52fb8160-a1e0-4b7e-a3ce-bd018dc8c512-config\") on node \"crc\" DevicePath \"\"" Dec 16 07:20:37 crc kubenswrapper[4823]: I1216 07:20:37.482387 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q8tlt\" (UniqueName: \"kubernetes.io/projected/52fb8160-a1e0-4b7e-a3ce-bd018dc8c512-kube-api-access-q8tlt\") on node \"crc\" DevicePath \"\"" Dec 16 07:20:37 crc kubenswrapper[4823]: I1216 07:20:37.520508 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/52fb8160-a1e0-4b7e-a3ce-bd018dc8c512-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "52fb8160-a1e0-4b7e-a3ce-bd018dc8c512" (UID: "52fb8160-a1e0-4b7e-a3ce-bd018dc8c512"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:20:37 crc kubenswrapper[4823]: I1216 07:20:37.527711 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/52fb8160-a1e0-4b7e-a3ce-bd018dc8c512-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "52fb8160-a1e0-4b7e-a3ce-bd018dc8c512" (UID: "52fb8160-a1e0-4b7e-a3ce-bd018dc8c512"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:20:37 crc kubenswrapper[4823]: I1216 07:20:37.545790 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/52fb8160-a1e0-4b7e-a3ce-bd018dc8c512-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "52fb8160-a1e0-4b7e-a3ce-bd018dc8c512" (UID: "52fb8160-a1e0-4b7e-a3ce-bd018dc8c512"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:20:37 crc kubenswrapper[4823]: I1216 07:20:37.550562 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/52fb8160-a1e0-4b7e-a3ce-bd018dc8c512-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "52fb8160-a1e0-4b7e-a3ce-bd018dc8c512" (UID: "52fb8160-a1e0-4b7e-a3ce-bd018dc8c512"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:20:37 crc kubenswrapper[4823]: I1216 07:20:37.584749 4823 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/52fb8160-a1e0-4b7e-a3ce-bd018dc8c512-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 16 07:20:37 crc kubenswrapper[4823]: I1216 07:20:37.584782 4823 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/52fb8160-a1e0-4b7e-a3ce-bd018dc8c512-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 16 07:20:37 crc kubenswrapper[4823]: I1216 07:20:37.584792 4823 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/52fb8160-a1e0-4b7e-a3ce-bd018dc8c512-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 16 07:20:37 crc kubenswrapper[4823]: I1216 07:20:37.584800 4823 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/52fb8160-a1e0-4b7e-a3ce-bd018dc8c512-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 16 07:20:37 crc kubenswrapper[4823]: I1216 07:20:37.667668 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-fzmlg" Dec 16 07:20:37 crc kubenswrapper[4823]: I1216 07:20:37.685262 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cgphd\" (UniqueName: \"kubernetes.io/projected/f10ce7b3-53e0-4318-b7a2-1d2a33b9eb3b-kube-api-access-cgphd\") pod \"f10ce7b3-53e0-4318-b7a2-1d2a33b9eb3b\" (UID: \"f10ce7b3-53e0-4318-b7a2-1d2a33b9eb3b\") " Dec 16 07:20:37 crc kubenswrapper[4823]: I1216 07:20:37.685327 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f10ce7b3-53e0-4318-b7a2-1d2a33b9eb3b-config-data\") pod \"f10ce7b3-53e0-4318-b7a2-1d2a33b9eb3b\" (UID: \"f10ce7b3-53e0-4318-b7a2-1d2a33b9eb3b\") " Dec 16 07:20:37 crc kubenswrapper[4823]: I1216 07:20:37.685416 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f10ce7b3-53e0-4318-b7a2-1d2a33b9eb3b-combined-ca-bundle\") pod \"f10ce7b3-53e0-4318-b7a2-1d2a33b9eb3b\" (UID: \"f10ce7b3-53e0-4318-b7a2-1d2a33b9eb3b\") " Dec 16 07:20:37 crc kubenswrapper[4823]: I1216 07:20:37.685443 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f10ce7b3-53e0-4318-b7a2-1d2a33b9eb3b-scripts\") pod \"f10ce7b3-53e0-4318-b7a2-1d2a33b9eb3b\" (UID: \"f10ce7b3-53e0-4318-b7a2-1d2a33b9eb3b\") " Dec 16 07:20:37 crc kubenswrapper[4823]: I1216 07:20:37.704318 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f10ce7b3-53e0-4318-b7a2-1d2a33b9eb3b-kube-api-access-cgphd" (OuterVolumeSpecName: "kube-api-access-cgphd") pod "f10ce7b3-53e0-4318-b7a2-1d2a33b9eb3b" (UID: "f10ce7b3-53e0-4318-b7a2-1d2a33b9eb3b"). InnerVolumeSpecName "kube-api-access-cgphd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:20:37 crc kubenswrapper[4823]: I1216 07:20:37.768196 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f10ce7b3-53e0-4318-b7a2-1d2a33b9eb3b-scripts" (OuterVolumeSpecName: "scripts") pod "f10ce7b3-53e0-4318-b7a2-1d2a33b9eb3b" (UID: "f10ce7b3-53e0-4318-b7a2-1d2a33b9eb3b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:20:37 crc kubenswrapper[4823]: I1216 07:20:37.800554 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cgphd\" (UniqueName: \"kubernetes.io/projected/f10ce7b3-53e0-4318-b7a2-1d2a33b9eb3b-kube-api-access-cgphd\") on node \"crc\" DevicePath \"\"" Dec 16 07:20:37 crc kubenswrapper[4823]: I1216 07:20:37.800599 4823 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f10ce7b3-53e0-4318-b7a2-1d2a33b9eb3b-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:20:37 crc kubenswrapper[4823]: I1216 07:20:37.832229 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f10ce7b3-53e0-4318-b7a2-1d2a33b9eb3b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f10ce7b3-53e0-4318-b7a2-1d2a33b9eb3b" (UID: "f10ce7b3-53e0-4318-b7a2-1d2a33b9eb3b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:20:37 crc kubenswrapper[4823]: I1216 07:20:37.874221 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f10ce7b3-53e0-4318-b7a2-1d2a33b9eb3b-config-data" (OuterVolumeSpecName: "config-data") pod "f10ce7b3-53e0-4318-b7a2-1d2a33b9eb3b" (UID: "f10ce7b3-53e0-4318-b7a2-1d2a33b9eb3b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:20:37 crc kubenswrapper[4823]: I1216 07:20:37.903328 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f10ce7b3-53e0-4318-b7a2-1d2a33b9eb3b-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 07:20:37 crc kubenswrapper[4823]: I1216 07:20:37.903367 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f10ce7b3-53e0-4318-b7a2-1d2a33b9eb3b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:20:38 crc kubenswrapper[4823]: I1216 07:20:38.277209 4823 generic.go:334] "Generic (PLEG): container finished" podID="4f568c50-222d-46ec-8b2b-d9605d6ace8a" containerID="8c0b7b84afb786dddbed1b11c3ecd557ca15d8f1a49d5fef28e57f3520fd1ec6" exitCode=0 Dec 16 07:20:38 crc kubenswrapper[4823]: I1216 07:20:38.277297 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-c9snq" event={"ID":"4f568c50-222d-46ec-8b2b-d9605d6ace8a","Type":"ContainerDied","Data":"8c0b7b84afb786dddbed1b11c3ecd557ca15d8f1a49d5fef28e57f3520fd1ec6"} Dec 16 07:20:38 crc kubenswrapper[4823]: I1216 07:20:38.279521 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75dbb546bf-zxsbt" Dec 16 07:20:38 crc kubenswrapper[4823]: I1216 07:20:38.279570 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-fzmlg" Dec 16 07:20:38 crc kubenswrapper[4823]: I1216 07:20:38.279518 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-fzmlg" event={"ID":"f10ce7b3-53e0-4318-b7a2-1d2a33b9eb3b","Type":"ContainerDied","Data":"322ce5d15c0a97a23167188cfe3e14c826a5574d2dbfc74ddc13544f66631edf"} Dec 16 07:20:38 crc kubenswrapper[4823]: I1216 07:20:38.279682 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="322ce5d15c0a97a23167188cfe3e14c826a5574d2dbfc74ddc13544f66631edf" Dec 16 07:20:38 crc kubenswrapper[4823]: I1216 07:20:38.331251 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75dbb546bf-zxsbt"] Dec 16 07:20:38 crc kubenswrapper[4823]: I1216 07:20:38.341218 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-75dbb546bf-zxsbt"] Dec 16 07:20:38 crc kubenswrapper[4823]: I1216 07:20:38.462727 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 16 07:20:38 crc kubenswrapper[4823]: I1216 07:20:38.474073 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 16 07:20:38 crc kubenswrapper[4823]: I1216 07:20:38.474328 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="c2893d1b-5ad4-432b-964e-fa981929487a" containerName="nova-api-log" containerID="cri-o://41afbf4a7eb85171756e9a962e57cd0e86d2e83651ac7f358cabac0ee33f7fe5" gracePeriod=30 Dec 16 07:20:38 crc kubenswrapper[4823]: I1216 07:20:38.474359 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="c2893d1b-5ad4-432b-964e-fa981929487a" containerName="nova-api-api" containerID="cri-o://2ff24c8b69871cbfc85b80c5aca37619b618e9916395e92eaf52df2f2dab18ca" gracePeriod=30 Dec 16 07:20:39 crc kubenswrapper[4823]: I1216 07:20:39.289642 4823 generic.go:334] 
"Generic (PLEG): container finished" podID="c2893d1b-5ad4-432b-964e-fa981929487a" containerID="41afbf4a7eb85171756e9a962e57cd0e86d2e83651ac7f358cabac0ee33f7fe5" exitCode=143 Dec 16 07:20:39 crc kubenswrapper[4823]: I1216 07:20:39.289788 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="719d0a32-c321-47de-8e64-ebb61884922d" containerName="nova-scheduler-scheduler" containerID="cri-o://ac2e538e0b5f4fa4998a5cbe9a6f0f6fe2f6c02b0e03d8a99f9b04a5e6c00229" gracePeriod=30 Dec 16 07:20:39 crc kubenswrapper[4823]: I1216 07:20:39.290639 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c2893d1b-5ad4-432b-964e-fa981929487a","Type":"ContainerDied","Data":"41afbf4a7eb85171756e9a962e57cd0e86d2e83651ac7f358cabac0ee33f7fe5"} Dec 16 07:20:39 crc kubenswrapper[4823]: I1216 07:20:39.736449 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-c9snq" Dec 16 07:20:39 crc kubenswrapper[4823]: I1216 07:20:39.785346 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52fb8160-a1e0-4b7e-a3ce-bd018dc8c512" path="/var/lib/kubelet/pods/52fb8160-a1e0-4b7e-a3ce-bd018dc8c512/volumes" Dec 16 07:20:39 crc kubenswrapper[4823]: I1216 07:20:39.901938 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-kprln" Dec 16 07:20:39 crc kubenswrapper[4823]: I1216 07:20:39.902073 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-kprln" Dec 16 07:20:39 crc kubenswrapper[4823]: I1216 07:20:39.938989 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f568c50-222d-46ec-8b2b-d9605d6ace8a-config-data\") pod \"4f568c50-222d-46ec-8b2b-d9605d6ace8a\" (UID: \"4f568c50-222d-46ec-8b2b-d9605d6ace8a\") " 
Dec 16 07:20:39 crc kubenswrapper[4823]: I1216 07:20:39.939088 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-px55s\" (UniqueName: \"kubernetes.io/projected/4f568c50-222d-46ec-8b2b-d9605d6ace8a-kube-api-access-px55s\") pod \"4f568c50-222d-46ec-8b2b-d9605d6ace8a\" (UID: \"4f568c50-222d-46ec-8b2b-d9605d6ace8a\") " Dec 16 07:20:39 crc kubenswrapper[4823]: I1216 07:20:39.939653 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f568c50-222d-46ec-8b2b-d9605d6ace8a-combined-ca-bundle\") pod \"4f568c50-222d-46ec-8b2b-d9605d6ace8a\" (UID: \"4f568c50-222d-46ec-8b2b-d9605d6ace8a\") " Dec 16 07:20:39 crc kubenswrapper[4823]: I1216 07:20:39.939702 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f568c50-222d-46ec-8b2b-d9605d6ace8a-scripts\") pod \"4f568c50-222d-46ec-8b2b-d9605d6ace8a\" (UID: \"4f568c50-222d-46ec-8b2b-d9605d6ace8a\") " Dec 16 07:20:39 crc kubenswrapper[4823]: I1216 07:20:39.945986 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f568c50-222d-46ec-8b2b-d9605d6ace8a-kube-api-access-px55s" (OuterVolumeSpecName: "kube-api-access-px55s") pod "4f568c50-222d-46ec-8b2b-d9605d6ace8a" (UID: "4f568c50-222d-46ec-8b2b-d9605d6ace8a"). InnerVolumeSpecName "kube-api-access-px55s". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:20:39 crc kubenswrapper[4823]: I1216 07:20:39.948087 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f568c50-222d-46ec-8b2b-d9605d6ace8a-scripts" (OuterVolumeSpecName: "scripts") pod "4f568c50-222d-46ec-8b2b-d9605d6ace8a" (UID: "4f568c50-222d-46ec-8b2b-d9605d6ace8a"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:20:39 crc kubenswrapper[4823]: I1216 07:20:39.974824 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-kprln" Dec 16 07:20:39 crc kubenswrapper[4823]: I1216 07:20:39.976456 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f568c50-222d-46ec-8b2b-d9605d6ace8a-config-data" (OuterVolumeSpecName: "config-data") pod "4f568c50-222d-46ec-8b2b-d9605d6ace8a" (UID: "4f568c50-222d-46ec-8b2b-d9605d6ace8a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:20:39 crc kubenswrapper[4823]: I1216 07:20:39.978200 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f568c50-222d-46ec-8b2b-d9605d6ace8a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4f568c50-222d-46ec-8b2b-d9605d6ace8a" (UID: "4f568c50-222d-46ec-8b2b-d9605d6ace8a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:20:40 crc kubenswrapper[4823]: I1216 07:20:40.041477 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f568c50-222d-46ec-8b2b-d9605d6ace8a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:20:40 crc kubenswrapper[4823]: I1216 07:20:40.041519 4823 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f568c50-222d-46ec-8b2b-d9605d6ace8a-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:20:40 crc kubenswrapper[4823]: I1216 07:20:40.041532 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f568c50-222d-46ec-8b2b-d9605d6ace8a-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 07:20:40 crc kubenswrapper[4823]: I1216 07:20:40.041542 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-px55s\" (UniqueName: \"kubernetes.io/projected/4f568c50-222d-46ec-8b2b-d9605d6ace8a-kube-api-access-px55s\") on node \"crc\" DevicePath \"\"" Dec 16 07:20:40 crc kubenswrapper[4823]: I1216 07:20:40.299360 4823 generic.go:334] "Generic (PLEG): container finished" podID="719d0a32-c321-47de-8e64-ebb61884922d" containerID="ac2e538e0b5f4fa4998a5cbe9a6f0f6fe2f6c02b0e03d8a99f9b04a5e6c00229" exitCode=0 Dec 16 07:20:40 crc kubenswrapper[4823]: I1216 07:20:40.299463 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"719d0a32-c321-47de-8e64-ebb61884922d","Type":"ContainerDied","Data":"ac2e538e0b5f4fa4998a5cbe9a6f0f6fe2f6c02b0e03d8a99f9b04a5e6c00229"} Dec 16 07:20:40 crc kubenswrapper[4823]: I1216 07:20:40.302014 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-c9snq" event={"ID":"4f568c50-222d-46ec-8b2b-d9605d6ace8a","Type":"ContainerDied","Data":"1dff03057dd51f22e837255516db41b8b0b9ba9a9739d94c551829c12e35634c"} Dec 16 
07:20:40 crc kubenswrapper[4823]: I1216 07:20:40.302060 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1dff03057dd51f22e837255516db41b8b0b9ba9a9739d94c551829c12e35634c" Dec 16 07:20:40 crc kubenswrapper[4823]: I1216 07:20:40.302081 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-c9snq" Dec 16 07:20:40 crc kubenswrapper[4823]: I1216 07:20:40.359748 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-kprln" Dec 16 07:20:40 crc kubenswrapper[4823]: I1216 07:20:40.370162 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 16 07:20:40 crc kubenswrapper[4823]: E1216 07:20:40.370568 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f10ce7b3-53e0-4318-b7a2-1d2a33b9eb3b" containerName="nova-manage" Dec 16 07:20:40 crc kubenswrapper[4823]: I1216 07:20:40.370579 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="f10ce7b3-53e0-4318-b7a2-1d2a33b9eb3b" containerName="nova-manage" Dec 16 07:20:40 crc kubenswrapper[4823]: E1216 07:20:40.370596 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f568c50-222d-46ec-8b2b-d9605d6ace8a" containerName="nova-cell1-conductor-db-sync" Dec 16 07:20:40 crc kubenswrapper[4823]: I1216 07:20:40.370603 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f568c50-222d-46ec-8b2b-d9605d6ace8a" containerName="nova-cell1-conductor-db-sync" Dec 16 07:20:40 crc kubenswrapper[4823]: E1216 07:20:40.370614 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52fb8160-a1e0-4b7e-a3ce-bd018dc8c512" containerName="dnsmasq-dns" Dec 16 07:20:40 crc kubenswrapper[4823]: I1216 07:20:40.370621 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="52fb8160-a1e0-4b7e-a3ce-bd018dc8c512" containerName="dnsmasq-dns" Dec 16 07:20:40 crc kubenswrapper[4823]: 
E1216 07:20:40.370645 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52fb8160-a1e0-4b7e-a3ce-bd018dc8c512" containerName="init" Dec 16 07:20:40 crc kubenswrapper[4823]: I1216 07:20:40.370651 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="52fb8160-a1e0-4b7e-a3ce-bd018dc8c512" containerName="init" Dec 16 07:20:40 crc kubenswrapper[4823]: I1216 07:20:40.370831 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f568c50-222d-46ec-8b2b-d9605d6ace8a" containerName="nova-cell1-conductor-db-sync" Dec 16 07:20:40 crc kubenswrapper[4823]: I1216 07:20:40.370841 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="52fb8160-a1e0-4b7e-a3ce-bd018dc8c512" containerName="dnsmasq-dns" Dec 16 07:20:40 crc kubenswrapper[4823]: I1216 07:20:40.370854 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="f10ce7b3-53e0-4318-b7a2-1d2a33b9eb3b" containerName="nova-manage" Dec 16 07:20:40 crc kubenswrapper[4823]: I1216 07:20:40.372110 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 16 07:20:40 crc kubenswrapper[4823]: I1216 07:20:40.375464 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 16 07:20:40 crc kubenswrapper[4823]: I1216 07:20:40.379379 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 16 07:20:40 crc kubenswrapper[4823]: I1216 07:20:40.447975 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfk5f\" (UniqueName: \"kubernetes.io/projected/79a24114-2ee1-4cc0-9045-770fcf074950-kube-api-access-zfk5f\") pod \"nova-cell1-conductor-0\" (UID: \"79a24114-2ee1-4cc0-9045-770fcf074950\") " pod="openstack/nova-cell1-conductor-0" Dec 16 07:20:40 crc kubenswrapper[4823]: I1216 07:20:40.448107 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79a24114-2ee1-4cc0-9045-770fcf074950-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"79a24114-2ee1-4cc0-9045-770fcf074950\") " pod="openstack/nova-cell1-conductor-0" Dec 16 07:20:40 crc kubenswrapper[4823]: I1216 07:20:40.448215 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79a24114-2ee1-4cc0-9045-770fcf074950-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"79a24114-2ee1-4cc0-9045-770fcf074950\") " pod="openstack/nova-cell1-conductor-0" Dec 16 07:20:40 crc kubenswrapper[4823]: I1216 07:20:40.550648 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zfk5f\" (UniqueName: \"kubernetes.io/projected/79a24114-2ee1-4cc0-9045-770fcf074950-kube-api-access-zfk5f\") pod \"nova-cell1-conductor-0\" (UID: \"79a24114-2ee1-4cc0-9045-770fcf074950\") " pod="openstack/nova-cell1-conductor-0" Dec 16 
07:20:40 crc kubenswrapper[4823]: I1216 07:20:40.550756 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79a24114-2ee1-4cc0-9045-770fcf074950-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"79a24114-2ee1-4cc0-9045-770fcf074950\") " pod="openstack/nova-cell1-conductor-0" Dec 16 07:20:40 crc kubenswrapper[4823]: I1216 07:20:40.550826 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79a24114-2ee1-4cc0-9045-770fcf074950-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"79a24114-2ee1-4cc0-9045-770fcf074950\") " pod="openstack/nova-cell1-conductor-0" Dec 16 07:20:40 crc kubenswrapper[4823]: I1216 07:20:40.557601 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79a24114-2ee1-4cc0-9045-770fcf074950-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"79a24114-2ee1-4cc0-9045-770fcf074950\") " pod="openstack/nova-cell1-conductor-0" Dec 16 07:20:40 crc kubenswrapper[4823]: I1216 07:20:40.558353 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79a24114-2ee1-4cc0-9045-770fcf074950-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"79a24114-2ee1-4cc0-9045-770fcf074950\") " pod="openstack/nova-cell1-conductor-0" Dec 16 07:20:40 crc kubenswrapper[4823]: I1216 07:20:40.569636 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfk5f\" (UniqueName: \"kubernetes.io/projected/79a24114-2ee1-4cc0-9045-770fcf074950-kube-api-access-zfk5f\") pod \"nova-cell1-conductor-0\" (UID: \"79a24114-2ee1-4cc0-9045-770fcf074950\") " pod="openstack/nova-cell1-conductor-0" Dec 16 07:20:40 crc kubenswrapper[4823]: I1216 07:20:40.748249 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 16 07:20:40 crc kubenswrapper[4823]: I1216 07:20:40.939269 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 16 07:20:41 crc kubenswrapper[4823]: I1216 07:20:41.064930 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/719d0a32-c321-47de-8e64-ebb61884922d-config-data\") pod \"719d0a32-c321-47de-8e64-ebb61884922d\" (UID: \"719d0a32-c321-47de-8e64-ebb61884922d\") " Dec 16 07:20:41 crc kubenswrapper[4823]: I1216 07:20:41.065165 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4t9pn\" (UniqueName: \"kubernetes.io/projected/719d0a32-c321-47de-8e64-ebb61884922d-kube-api-access-4t9pn\") pod \"719d0a32-c321-47de-8e64-ebb61884922d\" (UID: \"719d0a32-c321-47de-8e64-ebb61884922d\") " Dec 16 07:20:41 crc kubenswrapper[4823]: I1216 07:20:41.065250 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/719d0a32-c321-47de-8e64-ebb61884922d-combined-ca-bundle\") pod \"719d0a32-c321-47de-8e64-ebb61884922d\" (UID: \"719d0a32-c321-47de-8e64-ebb61884922d\") " Dec 16 07:20:41 crc kubenswrapper[4823]: I1216 07:20:41.074755 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/719d0a32-c321-47de-8e64-ebb61884922d-kube-api-access-4t9pn" (OuterVolumeSpecName: "kube-api-access-4t9pn") pod "719d0a32-c321-47de-8e64-ebb61884922d" (UID: "719d0a32-c321-47de-8e64-ebb61884922d"). InnerVolumeSpecName "kube-api-access-4t9pn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:20:41 crc kubenswrapper[4823]: I1216 07:20:41.103432 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/719d0a32-c321-47de-8e64-ebb61884922d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "719d0a32-c321-47de-8e64-ebb61884922d" (UID: "719d0a32-c321-47de-8e64-ebb61884922d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:20:41 crc kubenswrapper[4823]: I1216 07:20:41.103629 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/719d0a32-c321-47de-8e64-ebb61884922d-config-data" (OuterVolumeSpecName: "config-data") pod "719d0a32-c321-47de-8e64-ebb61884922d" (UID: "719d0a32-c321-47de-8e64-ebb61884922d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:20:41 crc kubenswrapper[4823]: I1216 07:20:41.167839 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/719d0a32-c321-47de-8e64-ebb61884922d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:20:41 crc kubenswrapper[4823]: I1216 07:20:41.168165 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/719d0a32-c321-47de-8e64-ebb61884922d-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 07:20:41 crc kubenswrapper[4823]: I1216 07:20:41.168176 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4t9pn\" (UniqueName: \"kubernetes.io/projected/719d0a32-c321-47de-8e64-ebb61884922d-kube-api-access-4t9pn\") on node \"crc\" DevicePath \"\"" Dec 16 07:20:41 crc kubenswrapper[4823]: I1216 07:20:41.264160 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 16 07:20:41 crc kubenswrapper[4823]: W1216 07:20:41.266014 4823 manager.go:1169] Failed to 
process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod79a24114_2ee1_4cc0_9045_770fcf074950.slice/crio-16e9052b9673da895f792777b9356bdea60548993b3b320610c25c067da7b775 WatchSource:0}: Error finding container 16e9052b9673da895f792777b9356bdea60548993b3b320610c25c067da7b775: Status 404 returned error can't find the container with id 16e9052b9673da895f792777b9356bdea60548993b3b320610c25c067da7b775 Dec 16 07:20:41 crc kubenswrapper[4823]: I1216 07:20:41.315067 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"719d0a32-c321-47de-8e64-ebb61884922d","Type":"ContainerDied","Data":"bbe47a48f93d7b246ebe3b5ec97fcc5e7aa43618b7973cba220f28f1877c58bd"} Dec 16 07:20:41 crc kubenswrapper[4823]: I1216 07:20:41.315139 4823 scope.go:117] "RemoveContainer" containerID="ac2e538e0b5f4fa4998a5cbe9a6f0f6fe2f6c02b0e03d8a99f9b04a5e6c00229" Dec 16 07:20:41 crc kubenswrapper[4823]: I1216 07:20:41.315082 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 16 07:20:41 crc kubenswrapper[4823]: I1216 07:20:41.320251 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"79a24114-2ee1-4cc0-9045-770fcf074950","Type":"ContainerStarted","Data":"16e9052b9673da895f792777b9356bdea60548993b3b320610c25c067da7b775"} Dec 16 07:20:41 crc kubenswrapper[4823]: I1216 07:20:41.359749 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kprln"] Dec 16 07:20:41 crc kubenswrapper[4823]: I1216 07:20:41.373140 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 16 07:20:41 crc kubenswrapper[4823]: I1216 07:20:41.384201 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 16 07:20:41 crc kubenswrapper[4823]: I1216 07:20:41.394879 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 16 07:20:41 crc kubenswrapper[4823]: E1216 07:20:41.395389 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="719d0a32-c321-47de-8e64-ebb61884922d" containerName="nova-scheduler-scheduler" Dec 16 07:20:41 crc kubenswrapper[4823]: I1216 07:20:41.395415 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="719d0a32-c321-47de-8e64-ebb61884922d" containerName="nova-scheduler-scheduler" Dec 16 07:20:41 crc kubenswrapper[4823]: I1216 07:20:41.395683 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="719d0a32-c321-47de-8e64-ebb61884922d" containerName="nova-scheduler-scheduler" Dec 16 07:20:41 crc kubenswrapper[4823]: I1216 07:20:41.396484 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 16 07:20:41 crc kubenswrapper[4823]: I1216 07:20:41.399122 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 16 07:20:41 crc kubenswrapper[4823]: I1216 07:20:41.411103 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 16 07:20:41 crc kubenswrapper[4823]: I1216 07:20:41.575220 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c5cc73c-9d84-405e-b093-d6c721a739c8-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"9c5cc73c-9d84-405e-b093-d6c721a739c8\") " pod="openstack/nova-scheduler-0" Dec 16 07:20:41 crc kubenswrapper[4823]: I1216 07:20:41.575337 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvvx6\" (UniqueName: \"kubernetes.io/projected/9c5cc73c-9d84-405e-b093-d6c721a739c8-kube-api-access-gvvx6\") pod \"nova-scheduler-0\" (UID: \"9c5cc73c-9d84-405e-b093-d6c721a739c8\") " pod="openstack/nova-scheduler-0" Dec 16 07:20:41 crc kubenswrapper[4823]: I1216 07:20:41.575424 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c5cc73c-9d84-405e-b093-d6c721a739c8-config-data\") pod \"nova-scheduler-0\" (UID: \"9c5cc73c-9d84-405e-b093-d6c721a739c8\") " pod="openstack/nova-scheduler-0" Dec 16 07:20:41 crc kubenswrapper[4823]: I1216 07:20:41.676938 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c5cc73c-9d84-405e-b093-d6c721a739c8-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"9c5cc73c-9d84-405e-b093-d6c721a739c8\") " pod="openstack/nova-scheduler-0" Dec 16 07:20:41 crc kubenswrapper[4823]: I1216 07:20:41.677385 4823 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvvx6\" (UniqueName: \"kubernetes.io/projected/9c5cc73c-9d84-405e-b093-d6c721a739c8-kube-api-access-gvvx6\") pod \"nova-scheduler-0\" (UID: \"9c5cc73c-9d84-405e-b093-d6c721a739c8\") " pod="openstack/nova-scheduler-0" Dec 16 07:20:41 crc kubenswrapper[4823]: I1216 07:20:41.677475 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c5cc73c-9d84-405e-b093-d6c721a739c8-config-data\") pod \"nova-scheduler-0\" (UID: \"9c5cc73c-9d84-405e-b093-d6c721a739c8\") " pod="openstack/nova-scheduler-0" Dec 16 07:20:41 crc kubenswrapper[4823]: I1216 07:20:41.683080 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c5cc73c-9d84-405e-b093-d6c721a739c8-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"9c5cc73c-9d84-405e-b093-d6c721a739c8\") " pod="openstack/nova-scheduler-0" Dec 16 07:20:41 crc kubenswrapper[4823]: I1216 07:20:41.684101 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c5cc73c-9d84-405e-b093-d6c721a739c8-config-data\") pod \"nova-scheduler-0\" (UID: \"9c5cc73c-9d84-405e-b093-d6c721a739c8\") " pod="openstack/nova-scheduler-0" Dec 16 07:20:41 crc kubenswrapper[4823]: I1216 07:20:41.695655 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvvx6\" (UniqueName: \"kubernetes.io/projected/9c5cc73c-9d84-405e-b093-d6c721a739c8-kube-api-access-gvvx6\") pod \"nova-scheduler-0\" (UID: \"9c5cc73c-9d84-405e-b093-d6c721a739c8\") " pod="openstack/nova-scheduler-0" Dec 16 07:20:41 crc kubenswrapper[4823]: I1216 07:20:41.721832 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 16 07:20:41 crc kubenswrapper[4823]: I1216 07:20:41.787704 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="719d0a32-c321-47de-8e64-ebb61884922d" path="/var/lib/kubelet/pods/719d0a32-c321-47de-8e64-ebb61884922d/volumes" Dec 16 07:20:42 crc kubenswrapper[4823]: I1216 07:20:42.253252 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 16 07:20:42 crc kubenswrapper[4823]: I1216 07:20:42.336036 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"9c5cc73c-9d84-405e-b093-d6c721a739c8","Type":"ContainerStarted","Data":"483327e9f27dc555d169b453eeb0d6919819d632910c977c2e17de8adbaf3e6e"} Dec 16 07:20:42 crc kubenswrapper[4823]: I1216 07:20:42.357167 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"79a24114-2ee1-4cc0-9045-770fcf074950","Type":"ContainerStarted","Data":"e121b5fc19f8847f31857c92e1abac87de929236af3edad6305ba6de36abc8a3"} Dec 16 07:20:42 crc kubenswrapper[4823]: I1216 07:20:42.357401 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Dec 16 07:20:42 crc kubenswrapper[4823]: I1216 07:20:42.385454 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.385433434 podStartE2EDuration="2.385433434s" podCreationTimestamp="2025-12-16 07:20:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 07:20:42.380939103 +0000 UTC m=+1520.869505216" watchObservedRunningTime="2025-12-16 07:20:42.385433434 +0000 UTC m=+1520.873999557" Dec 16 07:20:43 crc kubenswrapper[4823]: I1216 07:20:43.367545 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-kprln" 
podUID="822a5f46-f8d5-4d0b-8e78-c5385552f353" containerName="registry-server" containerID="cri-o://c3604f4e821c897dc1cb5cf7e6beaaa120e1740e0ad00805b9d5177f80c463ad" gracePeriod=2 Dec 16 07:20:43 crc kubenswrapper[4823]: I1216 07:20:43.367512 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"9c5cc73c-9d84-405e-b093-d6c721a739c8","Type":"ContainerStarted","Data":"0cc0a9a38c2935b48929c447d78701bf7998435b0f11f9050ac1a1cdb9405733"} Dec 16 07:20:43 crc kubenswrapper[4823]: I1216 07:20:43.394395 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.394376044 podStartE2EDuration="2.394376044s" podCreationTimestamp="2025-12-16 07:20:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 07:20:43.389168671 +0000 UTC m=+1521.877734804" watchObservedRunningTime="2025-12-16 07:20:43.394376044 +0000 UTC m=+1521.882942167" Dec 16 07:20:43 crc kubenswrapper[4823]: I1216 07:20:43.971445 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kprln" Dec 16 07:20:44 crc kubenswrapper[4823]: I1216 07:20:44.138259 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/822a5f46-f8d5-4d0b-8e78-c5385552f353-utilities\") pod \"822a5f46-f8d5-4d0b-8e78-c5385552f353\" (UID: \"822a5f46-f8d5-4d0b-8e78-c5385552f353\") " Dec 16 07:20:44 crc kubenswrapper[4823]: I1216 07:20:44.138554 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m9s8w\" (UniqueName: \"kubernetes.io/projected/822a5f46-f8d5-4d0b-8e78-c5385552f353-kube-api-access-m9s8w\") pod \"822a5f46-f8d5-4d0b-8e78-c5385552f353\" (UID: \"822a5f46-f8d5-4d0b-8e78-c5385552f353\") " Dec 16 07:20:44 crc kubenswrapper[4823]: I1216 07:20:44.139775 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/822a5f46-f8d5-4d0b-8e78-c5385552f353-catalog-content\") pod \"822a5f46-f8d5-4d0b-8e78-c5385552f353\" (UID: \"822a5f46-f8d5-4d0b-8e78-c5385552f353\") " Dec 16 07:20:44 crc kubenswrapper[4823]: I1216 07:20:44.139156 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/822a5f46-f8d5-4d0b-8e78-c5385552f353-utilities" (OuterVolumeSpecName: "utilities") pod "822a5f46-f8d5-4d0b-8e78-c5385552f353" (UID: "822a5f46-f8d5-4d0b-8e78-c5385552f353"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:20:44 crc kubenswrapper[4823]: I1216 07:20:44.140596 4823 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/822a5f46-f8d5-4d0b-8e78-c5385552f353-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 07:20:44 crc kubenswrapper[4823]: I1216 07:20:44.149324 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/822a5f46-f8d5-4d0b-8e78-c5385552f353-kube-api-access-m9s8w" (OuterVolumeSpecName: "kube-api-access-m9s8w") pod "822a5f46-f8d5-4d0b-8e78-c5385552f353" (UID: "822a5f46-f8d5-4d0b-8e78-c5385552f353"). InnerVolumeSpecName "kube-api-access-m9s8w". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:20:44 crc kubenswrapper[4823]: I1216 07:20:44.201266 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/822a5f46-f8d5-4d0b-8e78-c5385552f353-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "822a5f46-f8d5-4d0b-8e78-c5385552f353" (UID: "822a5f46-f8d5-4d0b-8e78-c5385552f353"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:20:44 crc kubenswrapper[4823]: I1216 07:20:44.241698 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m9s8w\" (UniqueName: \"kubernetes.io/projected/822a5f46-f8d5-4d0b-8e78-c5385552f353-kube-api-access-m9s8w\") on node \"crc\" DevicePath \"\"" Dec 16 07:20:44 crc kubenswrapper[4823]: I1216 07:20:44.241739 4823 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/822a5f46-f8d5-4d0b-8e78-c5385552f353-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 07:20:44 crc kubenswrapper[4823]: I1216 07:20:44.305857 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 16 07:20:44 crc kubenswrapper[4823]: I1216 07:20:44.354962 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2893d1b-5ad4-432b-964e-fa981929487a-logs\") pod \"c2893d1b-5ad4-432b-964e-fa981929487a\" (UID: \"c2893d1b-5ad4-432b-964e-fa981929487a\") " Dec 16 07:20:44 crc kubenswrapper[4823]: I1216 07:20:44.355064 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2893d1b-5ad4-432b-964e-fa981929487a-config-data\") pod \"c2893d1b-5ad4-432b-964e-fa981929487a\" (UID: \"c2893d1b-5ad4-432b-964e-fa981929487a\") " Dec 16 07:20:44 crc kubenswrapper[4823]: I1216 07:20:44.355261 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2893d1b-5ad4-432b-964e-fa981929487a-combined-ca-bundle\") pod \"c2893d1b-5ad4-432b-964e-fa981929487a\" (UID: \"c2893d1b-5ad4-432b-964e-fa981929487a\") " Dec 16 07:20:44 crc kubenswrapper[4823]: I1216 07:20:44.355331 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnjp2\" (UniqueName: \"kubernetes.io/projected/c2893d1b-5ad4-432b-964e-fa981929487a-kube-api-access-rnjp2\") pod \"c2893d1b-5ad4-432b-964e-fa981929487a\" (UID: \"c2893d1b-5ad4-432b-964e-fa981929487a\") " Dec 16 07:20:44 crc kubenswrapper[4823]: I1216 07:20:44.357147 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2893d1b-5ad4-432b-964e-fa981929487a-logs" (OuterVolumeSpecName: "logs") pod "c2893d1b-5ad4-432b-964e-fa981929487a" (UID: "c2893d1b-5ad4-432b-964e-fa981929487a"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:20:44 crc kubenswrapper[4823]: I1216 07:20:44.361310 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2893d1b-5ad4-432b-964e-fa981929487a-kube-api-access-rnjp2" (OuterVolumeSpecName: "kube-api-access-rnjp2") pod "c2893d1b-5ad4-432b-964e-fa981929487a" (UID: "c2893d1b-5ad4-432b-964e-fa981929487a"). InnerVolumeSpecName "kube-api-access-rnjp2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:20:44 crc kubenswrapper[4823]: I1216 07:20:44.391745 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2893d1b-5ad4-432b-964e-fa981929487a-config-data" (OuterVolumeSpecName: "config-data") pod "c2893d1b-5ad4-432b-964e-fa981929487a" (UID: "c2893d1b-5ad4-432b-964e-fa981929487a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:20:44 crc kubenswrapper[4823]: I1216 07:20:44.391929 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2893d1b-5ad4-432b-964e-fa981929487a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c2893d1b-5ad4-432b-964e-fa981929487a" (UID: "c2893d1b-5ad4-432b-964e-fa981929487a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:20:44 crc kubenswrapper[4823]: I1216 07:20:44.399045 4823 generic.go:334] "Generic (PLEG): container finished" podID="c2893d1b-5ad4-432b-964e-fa981929487a" containerID="2ff24c8b69871cbfc85b80c5aca37619b618e9916395e92eaf52df2f2dab18ca" exitCode=0 Dec 16 07:20:44 crc kubenswrapper[4823]: I1216 07:20:44.399099 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c2893d1b-5ad4-432b-964e-fa981929487a","Type":"ContainerDied","Data":"2ff24c8b69871cbfc85b80c5aca37619b618e9916395e92eaf52df2f2dab18ca"} Dec 16 07:20:44 crc kubenswrapper[4823]: I1216 07:20:44.399164 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c2893d1b-5ad4-432b-964e-fa981929487a","Type":"ContainerDied","Data":"e358006e6eb460fd9fe7c9dd59101298f914a1244fc3544b14ff7109e1f7eb59"} Dec 16 07:20:44 crc kubenswrapper[4823]: I1216 07:20:44.399118 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 16 07:20:44 crc kubenswrapper[4823]: I1216 07:20:44.399186 4823 scope.go:117] "RemoveContainer" containerID="2ff24c8b69871cbfc85b80c5aca37619b618e9916395e92eaf52df2f2dab18ca" Dec 16 07:20:44 crc kubenswrapper[4823]: I1216 07:20:44.402694 4823 generic.go:334] "Generic (PLEG): container finished" podID="822a5f46-f8d5-4d0b-8e78-c5385552f353" containerID="c3604f4e821c897dc1cb5cf7e6beaaa120e1740e0ad00805b9d5177f80c463ad" exitCode=0 Dec 16 07:20:44 crc kubenswrapper[4823]: I1216 07:20:44.403824 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kprln" Dec 16 07:20:44 crc kubenswrapper[4823]: I1216 07:20:44.404194 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kprln" event={"ID":"822a5f46-f8d5-4d0b-8e78-c5385552f353","Type":"ContainerDied","Data":"c3604f4e821c897dc1cb5cf7e6beaaa120e1740e0ad00805b9d5177f80c463ad"} Dec 16 07:20:44 crc kubenswrapper[4823]: I1216 07:20:44.404247 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kprln" event={"ID":"822a5f46-f8d5-4d0b-8e78-c5385552f353","Type":"ContainerDied","Data":"8ebda4a2d745342591f06048455cdd7d8b91296abbdd3d1374c3f189d54ea3c6"} Dec 16 07:20:44 crc kubenswrapper[4823]: I1216 07:20:44.436873 4823 scope.go:117] "RemoveContainer" containerID="41afbf4a7eb85171756e9a962e57cd0e86d2e83651ac7f358cabac0ee33f7fe5" Dec 16 07:20:44 crc kubenswrapper[4823]: I1216 07:20:44.445499 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kprln"] Dec 16 07:20:44 crc kubenswrapper[4823]: I1216 07:20:44.457310 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnjp2\" (UniqueName: \"kubernetes.io/projected/c2893d1b-5ad4-432b-964e-fa981929487a-kube-api-access-rnjp2\") on node \"crc\" DevicePath \"\"" Dec 16 07:20:44 crc kubenswrapper[4823]: I1216 07:20:44.457345 4823 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2893d1b-5ad4-432b-964e-fa981929487a-logs\") on node \"crc\" DevicePath \"\"" Dec 16 07:20:44 crc kubenswrapper[4823]: I1216 07:20:44.457381 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2893d1b-5ad4-432b-964e-fa981929487a-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 07:20:44 crc kubenswrapper[4823]: I1216 07:20:44.457392 4823 reconciler_common.go:293] "Volume detached for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2893d1b-5ad4-432b-964e-fa981929487a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:20:44 crc kubenswrapper[4823]: I1216 07:20:44.457850 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-kprln"] Dec 16 07:20:44 crc kubenswrapper[4823]: I1216 07:20:44.468542 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 16 07:20:44 crc kubenswrapper[4823]: I1216 07:20:44.488005 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 16 07:20:44 crc kubenswrapper[4823]: I1216 07:20:44.497990 4823 scope.go:117] "RemoveContainer" containerID="2ff24c8b69871cbfc85b80c5aca37619b618e9916395e92eaf52df2f2dab18ca" Dec 16 07:20:44 crc kubenswrapper[4823]: E1216 07:20:44.505178 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ff24c8b69871cbfc85b80c5aca37619b618e9916395e92eaf52df2f2dab18ca\": container with ID starting with 2ff24c8b69871cbfc85b80c5aca37619b618e9916395e92eaf52df2f2dab18ca not found: ID does not exist" containerID="2ff24c8b69871cbfc85b80c5aca37619b618e9916395e92eaf52df2f2dab18ca" Dec 16 07:20:44 crc kubenswrapper[4823]: I1216 07:20:44.505254 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ff24c8b69871cbfc85b80c5aca37619b618e9916395e92eaf52df2f2dab18ca"} err="failed to get container status \"2ff24c8b69871cbfc85b80c5aca37619b618e9916395e92eaf52df2f2dab18ca\": rpc error: code = NotFound desc = could not find container \"2ff24c8b69871cbfc85b80c5aca37619b618e9916395e92eaf52df2f2dab18ca\": container with ID starting with 2ff24c8b69871cbfc85b80c5aca37619b618e9916395e92eaf52df2f2dab18ca not found: ID does not exist" Dec 16 07:20:44 crc kubenswrapper[4823]: I1216 07:20:44.505281 4823 scope.go:117] "RemoveContainer" 
containerID="41afbf4a7eb85171756e9a962e57cd0e86d2e83651ac7f358cabac0ee33f7fe5" Dec 16 07:20:44 crc kubenswrapper[4823]: E1216 07:20:44.507196 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41afbf4a7eb85171756e9a962e57cd0e86d2e83651ac7f358cabac0ee33f7fe5\": container with ID starting with 41afbf4a7eb85171756e9a962e57cd0e86d2e83651ac7f358cabac0ee33f7fe5 not found: ID does not exist" containerID="41afbf4a7eb85171756e9a962e57cd0e86d2e83651ac7f358cabac0ee33f7fe5" Dec 16 07:20:44 crc kubenswrapper[4823]: I1216 07:20:44.507278 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41afbf4a7eb85171756e9a962e57cd0e86d2e83651ac7f358cabac0ee33f7fe5"} err="failed to get container status \"41afbf4a7eb85171756e9a962e57cd0e86d2e83651ac7f358cabac0ee33f7fe5\": rpc error: code = NotFound desc = could not find container \"41afbf4a7eb85171756e9a962e57cd0e86d2e83651ac7f358cabac0ee33f7fe5\": container with ID starting with 41afbf4a7eb85171756e9a962e57cd0e86d2e83651ac7f358cabac0ee33f7fe5 not found: ID does not exist" Dec 16 07:20:44 crc kubenswrapper[4823]: I1216 07:20:44.507337 4823 scope.go:117] "RemoveContainer" containerID="c3604f4e821c897dc1cb5cf7e6beaaa120e1740e0ad00805b9d5177f80c463ad" Dec 16 07:20:44 crc kubenswrapper[4823]: I1216 07:20:44.513325 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 16 07:20:44 crc kubenswrapper[4823]: E1216 07:20:44.516081 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="822a5f46-f8d5-4d0b-8e78-c5385552f353" containerName="registry-server" Dec 16 07:20:44 crc kubenswrapper[4823]: I1216 07:20:44.516113 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="822a5f46-f8d5-4d0b-8e78-c5385552f353" containerName="registry-server" Dec 16 07:20:44 crc kubenswrapper[4823]: E1216 07:20:44.516153 4823 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="c2893d1b-5ad4-432b-964e-fa981929487a" containerName="nova-api-api" Dec 16 07:20:44 crc kubenswrapper[4823]: I1216 07:20:44.516162 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2893d1b-5ad4-432b-964e-fa981929487a" containerName="nova-api-api" Dec 16 07:20:44 crc kubenswrapper[4823]: E1216 07:20:44.516189 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="822a5f46-f8d5-4d0b-8e78-c5385552f353" containerName="extract-utilities" Dec 16 07:20:44 crc kubenswrapper[4823]: I1216 07:20:44.516196 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="822a5f46-f8d5-4d0b-8e78-c5385552f353" containerName="extract-utilities" Dec 16 07:20:44 crc kubenswrapper[4823]: E1216 07:20:44.516209 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2893d1b-5ad4-432b-964e-fa981929487a" containerName="nova-api-log" Dec 16 07:20:44 crc kubenswrapper[4823]: I1216 07:20:44.516215 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2893d1b-5ad4-432b-964e-fa981929487a" containerName="nova-api-log" Dec 16 07:20:44 crc kubenswrapper[4823]: E1216 07:20:44.516244 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="822a5f46-f8d5-4d0b-8e78-c5385552f353" containerName="extract-content" Dec 16 07:20:44 crc kubenswrapper[4823]: I1216 07:20:44.516250 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="822a5f46-f8d5-4d0b-8e78-c5385552f353" containerName="extract-content" Dec 16 07:20:44 crc kubenswrapper[4823]: I1216 07:20:44.516646 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2893d1b-5ad4-432b-964e-fa981929487a" containerName="nova-api-api" Dec 16 07:20:44 crc kubenswrapper[4823]: I1216 07:20:44.516689 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2893d1b-5ad4-432b-964e-fa981929487a" containerName="nova-api-log" Dec 16 07:20:44 crc kubenswrapper[4823]: I1216 07:20:44.516734 4823 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="822a5f46-f8d5-4d0b-8e78-c5385552f353" containerName="registry-server" Dec 16 07:20:44 crc kubenswrapper[4823]: I1216 07:20:44.518385 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 16 07:20:44 crc kubenswrapper[4823]: I1216 07:20:44.526639 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 16 07:20:44 crc kubenswrapper[4823]: I1216 07:20:44.531255 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 16 07:20:44 crc kubenswrapper[4823]: I1216 07:20:44.559223 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d20d7e4d-f319-43f8-bf31-87b114fa7517-logs\") pod \"nova-api-0\" (UID: \"d20d7e4d-f319-43f8-bf31-87b114fa7517\") " pod="openstack/nova-api-0" Dec 16 07:20:44 crc kubenswrapper[4823]: I1216 07:20:44.559303 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d20d7e4d-f319-43f8-bf31-87b114fa7517-config-data\") pod \"nova-api-0\" (UID: \"d20d7e4d-f319-43f8-bf31-87b114fa7517\") " pod="openstack/nova-api-0" Dec 16 07:20:44 crc kubenswrapper[4823]: I1216 07:20:44.559333 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d20d7e4d-f319-43f8-bf31-87b114fa7517-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d20d7e4d-f319-43f8-bf31-87b114fa7517\") " pod="openstack/nova-api-0" Dec 16 07:20:44 crc kubenswrapper[4823]: I1216 07:20:44.559406 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxxdl\" (UniqueName: \"kubernetes.io/projected/d20d7e4d-f319-43f8-bf31-87b114fa7517-kube-api-access-fxxdl\") pod \"nova-api-0\" (UID: 
\"d20d7e4d-f319-43f8-bf31-87b114fa7517\") " pod="openstack/nova-api-0" Dec 16 07:20:44 crc kubenswrapper[4823]: I1216 07:20:44.559757 4823 scope.go:117] "RemoveContainer" containerID="0d18db6a73c742367ced16771267babc1d9b4536dca1ef668e936f7d41dc8110" Dec 16 07:20:44 crc kubenswrapper[4823]: I1216 07:20:44.580240 4823 scope.go:117] "RemoveContainer" containerID="80ea8742db042e42d3a2fdfb64330ae1dfd957e80706b34ed6dcffd2a00909a8" Dec 16 07:20:44 crc kubenswrapper[4823]: I1216 07:20:44.599783 4823 scope.go:117] "RemoveContainer" containerID="c3604f4e821c897dc1cb5cf7e6beaaa120e1740e0ad00805b9d5177f80c463ad" Dec 16 07:20:44 crc kubenswrapper[4823]: E1216 07:20:44.600881 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3604f4e821c897dc1cb5cf7e6beaaa120e1740e0ad00805b9d5177f80c463ad\": container with ID starting with c3604f4e821c897dc1cb5cf7e6beaaa120e1740e0ad00805b9d5177f80c463ad not found: ID does not exist" containerID="c3604f4e821c897dc1cb5cf7e6beaaa120e1740e0ad00805b9d5177f80c463ad" Dec 16 07:20:44 crc kubenswrapper[4823]: I1216 07:20:44.600939 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3604f4e821c897dc1cb5cf7e6beaaa120e1740e0ad00805b9d5177f80c463ad"} err="failed to get container status \"c3604f4e821c897dc1cb5cf7e6beaaa120e1740e0ad00805b9d5177f80c463ad\": rpc error: code = NotFound desc = could not find container \"c3604f4e821c897dc1cb5cf7e6beaaa120e1740e0ad00805b9d5177f80c463ad\": container with ID starting with c3604f4e821c897dc1cb5cf7e6beaaa120e1740e0ad00805b9d5177f80c463ad not found: ID does not exist" Dec 16 07:20:44 crc kubenswrapper[4823]: I1216 07:20:44.600973 4823 scope.go:117] "RemoveContainer" containerID="0d18db6a73c742367ced16771267babc1d9b4536dca1ef668e936f7d41dc8110" Dec 16 07:20:44 crc kubenswrapper[4823]: E1216 07:20:44.602011 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"0d18db6a73c742367ced16771267babc1d9b4536dca1ef668e936f7d41dc8110\": container with ID starting with 0d18db6a73c742367ced16771267babc1d9b4536dca1ef668e936f7d41dc8110 not found: ID does not exist" containerID="0d18db6a73c742367ced16771267babc1d9b4536dca1ef668e936f7d41dc8110" Dec 16 07:20:44 crc kubenswrapper[4823]: I1216 07:20:44.602076 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d18db6a73c742367ced16771267babc1d9b4536dca1ef668e936f7d41dc8110"} err="failed to get container status \"0d18db6a73c742367ced16771267babc1d9b4536dca1ef668e936f7d41dc8110\": rpc error: code = NotFound desc = could not find container \"0d18db6a73c742367ced16771267babc1d9b4536dca1ef668e936f7d41dc8110\": container with ID starting with 0d18db6a73c742367ced16771267babc1d9b4536dca1ef668e936f7d41dc8110 not found: ID does not exist" Dec 16 07:20:44 crc kubenswrapper[4823]: I1216 07:20:44.602104 4823 scope.go:117] "RemoveContainer" containerID="80ea8742db042e42d3a2fdfb64330ae1dfd957e80706b34ed6dcffd2a00909a8" Dec 16 07:20:44 crc kubenswrapper[4823]: E1216 07:20:44.602329 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80ea8742db042e42d3a2fdfb64330ae1dfd957e80706b34ed6dcffd2a00909a8\": container with ID starting with 80ea8742db042e42d3a2fdfb64330ae1dfd957e80706b34ed6dcffd2a00909a8 not found: ID does not exist" containerID="80ea8742db042e42d3a2fdfb64330ae1dfd957e80706b34ed6dcffd2a00909a8" Dec 16 07:20:44 crc kubenswrapper[4823]: I1216 07:20:44.602358 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80ea8742db042e42d3a2fdfb64330ae1dfd957e80706b34ed6dcffd2a00909a8"} err="failed to get container status \"80ea8742db042e42d3a2fdfb64330ae1dfd957e80706b34ed6dcffd2a00909a8\": rpc error: code = NotFound desc = could not find container 
\"80ea8742db042e42d3a2fdfb64330ae1dfd957e80706b34ed6dcffd2a00909a8\": container with ID starting with 80ea8742db042e42d3a2fdfb64330ae1dfd957e80706b34ed6dcffd2a00909a8 not found: ID does not exist" Dec 16 07:20:44 crc kubenswrapper[4823]: I1216 07:20:44.661229 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d20d7e4d-f319-43f8-bf31-87b114fa7517-logs\") pod \"nova-api-0\" (UID: \"d20d7e4d-f319-43f8-bf31-87b114fa7517\") " pod="openstack/nova-api-0" Dec 16 07:20:44 crc kubenswrapper[4823]: I1216 07:20:44.661506 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d20d7e4d-f319-43f8-bf31-87b114fa7517-config-data\") pod \"nova-api-0\" (UID: \"d20d7e4d-f319-43f8-bf31-87b114fa7517\") " pod="openstack/nova-api-0" Dec 16 07:20:44 crc kubenswrapper[4823]: I1216 07:20:44.661625 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d20d7e4d-f319-43f8-bf31-87b114fa7517-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d20d7e4d-f319-43f8-bf31-87b114fa7517\") " pod="openstack/nova-api-0" Dec 16 07:20:44 crc kubenswrapper[4823]: I1216 07:20:44.661770 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxxdl\" (UniqueName: \"kubernetes.io/projected/d20d7e4d-f319-43f8-bf31-87b114fa7517-kube-api-access-fxxdl\") pod \"nova-api-0\" (UID: \"d20d7e4d-f319-43f8-bf31-87b114fa7517\") " pod="openstack/nova-api-0" Dec 16 07:20:44 crc kubenswrapper[4823]: I1216 07:20:44.662787 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d20d7e4d-f319-43f8-bf31-87b114fa7517-logs\") pod \"nova-api-0\" (UID: \"d20d7e4d-f319-43f8-bf31-87b114fa7517\") " pod="openstack/nova-api-0" Dec 16 07:20:44 crc kubenswrapper[4823]: I1216 07:20:44.665611 4823 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d20d7e4d-f319-43f8-bf31-87b114fa7517-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d20d7e4d-f319-43f8-bf31-87b114fa7517\") " pod="openstack/nova-api-0" Dec 16 07:20:44 crc kubenswrapper[4823]: I1216 07:20:44.665775 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d20d7e4d-f319-43f8-bf31-87b114fa7517-config-data\") pod \"nova-api-0\" (UID: \"d20d7e4d-f319-43f8-bf31-87b114fa7517\") " pod="openstack/nova-api-0" Dec 16 07:20:44 crc kubenswrapper[4823]: I1216 07:20:44.679413 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxxdl\" (UniqueName: \"kubernetes.io/projected/d20d7e4d-f319-43f8-bf31-87b114fa7517-kube-api-access-fxxdl\") pod \"nova-api-0\" (UID: \"d20d7e4d-f319-43f8-bf31-87b114fa7517\") " pod="openstack/nova-api-0" Dec 16 07:20:44 crc kubenswrapper[4823]: I1216 07:20:44.850849 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 16 07:20:45 crc kubenswrapper[4823]: I1216 07:20:45.291311 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 16 07:20:45 crc kubenswrapper[4823]: I1216 07:20:45.416785 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d20d7e4d-f319-43f8-bf31-87b114fa7517","Type":"ContainerStarted","Data":"ef012e164506db665546b697cf2a092893ebdf336790a65fb79e95034bf3039b"} Dec 16 07:20:45 crc kubenswrapper[4823]: I1216 07:20:45.784473 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="822a5f46-f8d5-4d0b-8e78-c5385552f353" path="/var/lib/kubelet/pods/822a5f46-f8d5-4d0b-8e78-c5385552f353/volumes" Dec 16 07:20:45 crc kubenswrapper[4823]: I1216 07:20:45.785360 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2893d1b-5ad4-432b-964e-fa981929487a" path="/var/lib/kubelet/pods/c2893d1b-5ad4-432b-964e-fa981929487a/volumes" Dec 16 07:20:46 crc kubenswrapper[4823]: I1216 07:20:46.433294 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d20d7e4d-f319-43f8-bf31-87b114fa7517","Type":"ContainerStarted","Data":"898f21005519e2d3aa8dec31a7bf05da98a85c82a0b9743d0195cdf75205f2a5"} Dec 16 07:20:46 crc kubenswrapper[4823]: I1216 07:20:46.433340 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d20d7e4d-f319-43f8-bf31-87b114fa7517","Type":"ContainerStarted","Data":"7077901ca745fda9f1511b3ad5a0907f42f5d61f05e06175d316699713e8a2e8"} Dec 16 07:20:46 crc kubenswrapper[4823]: I1216 07:20:46.460143 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.460126054 podStartE2EDuration="2.460126054s" podCreationTimestamp="2025-12-16 07:20:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-12-16 07:20:46.457735999 +0000 UTC m=+1524.946302162" watchObservedRunningTime="2025-12-16 07:20:46.460126054 +0000 UTC m=+1524.948692187" Dec 16 07:20:46 crc kubenswrapper[4823]: I1216 07:20:46.722271 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 16 07:20:50 crc kubenswrapper[4823]: I1216 07:20:50.780055 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Dec 16 07:20:51 crc kubenswrapper[4823]: I1216 07:20:51.722203 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 16 07:20:51 crc kubenswrapper[4823]: I1216 07:20:51.749787 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 16 07:20:52 crc kubenswrapper[4823]: I1216 07:20:52.251761 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 16 07:20:52 crc kubenswrapper[4823]: I1216 07:20:52.530888 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 16 07:20:54 crc kubenswrapper[4823]: I1216 07:20:54.851525 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 16 07:20:54 crc kubenswrapper[4823]: I1216 07:20:54.851585 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 16 07:20:55 crc kubenswrapper[4823]: I1216 07:20:55.759578 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 16 07:20:55 crc kubenswrapper[4823]: I1216 07:20:55.759823 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="8db2b8b4-03e8-4ae0-875d-5f3a6414d0e0" containerName="kube-state-metrics" 
containerID="cri-o://84f5f9f54b8d2494731ff503a44dec71fc986fb565aff9085ad97bcea2c334ff" gracePeriod=30 Dec 16 07:20:55 crc kubenswrapper[4823]: I1216 07:20:55.934262 4823 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d20d7e4d-f319-43f8-bf31-87b114fa7517" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.188:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 16 07:20:55 crc kubenswrapper[4823]: I1216 07:20:55.934318 4823 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d20d7e4d-f319-43f8-bf31-87b114fa7517" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.188:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 16 07:20:56 crc kubenswrapper[4823]: I1216 07:20:56.310767 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 16 07:20:56 crc kubenswrapper[4823]: I1216 07:20:56.476913 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xllpk\" (UniqueName: \"kubernetes.io/projected/8db2b8b4-03e8-4ae0-875d-5f3a6414d0e0-kube-api-access-xllpk\") pod \"8db2b8b4-03e8-4ae0-875d-5f3a6414d0e0\" (UID: \"8db2b8b4-03e8-4ae0-875d-5f3a6414d0e0\") " Dec 16 07:20:56 crc kubenswrapper[4823]: I1216 07:20:56.483236 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8db2b8b4-03e8-4ae0-875d-5f3a6414d0e0-kube-api-access-xllpk" (OuterVolumeSpecName: "kube-api-access-xllpk") pod "8db2b8b4-03e8-4ae0-875d-5f3a6414d0e0" (UID: "8db2b8b4-03e8-4ae0-875d-5f3a6414d0e0"). InnerVolumeSpecName "kube-api-access-xllpk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:20:56 crc kubenswrapper[4823]: I1216 07:20:56.545133 4823 generic.go:334] "Generic (PLEG): container finished" podID="8db2b8b4-03e8-4ae0-875d-5f3a6414d0e0" containerID="84f5f9f54b8d2494731ff503a44dec71fc986fb565aff9085ad97bcea2c334ff" exitCode=2 Dec 16 07:20:56 crc kubenswrapper[4823]: I1216 07:20:56.545233 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"8db2b8b4-03e8-4ae0-875d-5f3a6414d0e0","Type":"ContainerDied","Data":"84f5f9f54b8d2494731ff503a44dec71fc986fb565aff9085ad97bcea2c334ff"} Dec 16 07:20:56 crc kubenswrapper[4823]: I1216 07:20:56.545687 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 16 07:20:56 crc kubenswrapper[4823]: I1216 07:20:56.545702 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"8db2b8b4-03e8-4ae0-875d-5f3a6414d0e0","Type":"ContainerDied","Data":"a68940f057a874f65624bd9e9430a72529863e9fe47ff0ba2bb0d29c6db815ac"} Dec 16 07:20:56 crc kubenswrapper[4823]: I1216 07:20:56.545761 4823 scope.go:117] "RemoveContainer" containerID="84f5f9f54b8d2494731ff503a44dec71fc986fb565aff9085ad97bcea2c334ff" Dec 16 07:20:56 crc kubenswrapper[4823]: I1216 07:20:56.566819 4823 scope.go:117] "RemoveContainer" containerID="84f5f9f54b8d2494731ff503a44dec71fc986fb565aff9085ad97bcea2c334ff" Dec 16 07:20:56 crc kubenswrapper[4823]: E1216 07:20:56.567463 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84f5f9f54b8d2494731ff503a44dec71fc986fb565aff9085ad97bcea2c334ff\": container with ID starting with 84f5f9f54b8d2494731ff503a44dec71fc986fb565aff9085ad97bcea2c334ff not found: ID does not exist" containerID="84f5f9f54b8d2494731ff503a44dec71fc986fb565aff9085ad97bcea2c334ff" Dec 16 07:20:56 crc kubenswrapper[4823]: I1216 07:20:56.567528 4823 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84f5f9f54b8d2494731ff503a44dec71fc986fb565aff9085ad97bcea2c334ff"} err="failed to get container status \"84f5f9f54b8d2494731ff503a44dec71fc986fb565aff9085ad97bcea2c334ff\": rpc error: code = NotFound desc = could not find container \"84f5f9f54b8d2494731ff503a44dec71fc986fb565aff9085ad97bcea2c334ff\": container with ID starting with 84f5f9f54b8d2494731ff503a44dec71fc986fb565aff9085ad97bcea2c334ff not found: ID does not exist" Dec 16 07:20:56 crc kubenswrapper[4823]: I1216 07:20:56.579920 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xllpk\" (UniqueName: \"kubernetes.io/projected/8db2b8b4-03e8-4ae0-875d-5f3a6414d0e0-kube-api-access-xllpk\") on node \"crc\" DevicePath \"\"" Dec 16 07:20:56 crc kubenswrapper[4823]: I1216 07:20:56.583939 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 16 07:20:56 crc kubenswrapper[4823]: I1216 07:20:56.593807 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 16 07:20:56 crc kubenswrapper[4823]: I1216 07:20:56.613052 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 16 07:20:56 crc kubenswrapper[4823]: E1216 07:20:56.613592 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8db2b8b4-03e8-4ae0-875d-5f3a6414d0e0" containerName="kube-state-metrics" Dec 16 07:20:56 crc kubenswrapper[4823]: I1216 07:20:56.613613 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="8db2b8b4-03e8-4ae0-875d-5f3a6414d0e0" containerName="kube-state-metrics" Dec 16 07:20:56 crc kubenswrapper[4823]: I1216 07:20:56.613858 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="8db2b8b4-03e8-4ae0-875d-5f3a6414d0e0" containerName="kube-state-metrics" Dec 16 07:20:56 crc kubenswrapper[4823]: I1216 07:20:56.614692 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 16 07:20:56 crc kubenswrapper[4823]: I1216 07:20:56.617758 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Dec 16 07:20:56 crc kubenswrapper[4823]: I1216 07:20:56.620112 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Dec 16 07:20:56 crc kubenswrapper[4823]: I1216 07:20:56.626210 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 16 07:20:56 crc kubenswrapper[4823]: I1216 07:20:56.682148 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/4a9f0e08-d61e-4503-afc5-09cb29ff3175-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"4a9f0e08-d61e-4503-afc5-09cb29ff3175\") " pod="openstack/kube-state-metrics-0" Dec 16 07:20:56 crc kubenswrapper[4823]: I1216 07:20:56.682233 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a9f0e08-d61e-4503-afc5-09cb29ff3175-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"4a9f0e08-d61e-4503-afc5-09cb29ff3175\") " pod="openstack/kube-state-metrics-0" Dec 16 07:20:56 crc kubenswrapper[4823]: I1216 07:20:56.682418 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9fcs\" (UniqueName: \"kubernetes.io/projected/4a9f0e08-d61e-4503-afc5-09cb29ff3175-kube-api-access-n9fcs\") pod \"kube-state-metrics-0\" (UID: \"4a9f0e08-d61e-4503-afc5-09cb29ff3175\") " pod="openstack/kube-state-metrics-0" Dec 16 07:20:56 crc kubenswrapper[4823]: I1216 07:20:56.682598 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/4a9f0e08-d61e-4503-afc5-09cb29ff3175-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"4a9f0e08-d61e-4503-afc5-09cb29ff3175\") " pod="openstack/kube-state-metrics-0" Dec 16 07:20:56 crc kubenswrapper[4823]: I1216 07:20:56.783377 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a9f0e08-d61e-4503-afc5-09cb29ff3175-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"4a9f0e08-d61e-4503-afc5-09cb29ff3175\") " pod="openstack/kube-state-metrics-0" Dec 16 07:20:56 crc kubenswrapper[4823]: I1216 07:20:56.783534 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/4a9f0e08-d61e-4503-afc5-09cb29ff3175-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"4a9f0e08-d61e-4503-afc5-09cb29ff3175\") " pod="openstack/kube-state-metrics-0" Dec 16 07:20:56 crc kubenswrapper[4823]: I1216 07:20:56.783577 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a9f0e08-d61e-4503-afc5-09cb29ff3175-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"4a9f0e08-d61e-4503-afc5-09cb29ff3175\") " pod="openstack/kube-state-metrics-0" Dec 16 07:20:56 crc kubenswrapper[4823]: I1216 07:20:56.783663 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9fcs\" (UniqueName: \"kubernetes.io/projected/4a9f0e08-d61e-4503-afc5-09cb29ff3175-kube-api-access-n9fcs\") pod \"kube-state-metrics-0\" (UID: \"4a9f0e08-d61e-4503-afc5-09cb29ff3175\") " pod="openstack/kube-state-metrics-0" Dec 16 07:20:56 crc kubenswrapper[4823]: I1216 07:20:56.787592 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: 
\"kubernetes.io/secret/4a9f0e08-d61e-4503-afc5-09cb29ff3175-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"4a9f0e08-d61e-4503-afc5-09cb29ff3175\") " pod="openstack/kube-state-metrics-0" Dec 16 07:20:56 crc kubenswrapper[4823]: I1216 07:20:56.791851 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a9f0e08-d61e-4503-afc5-09cb29ff3175-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"4a9f0e08-d61e-4503-afc5-09cb29ff3175\") " pod="openstack/kube-state-metrics-0" Dec 16 07:20:56 crc kubenswrapper[4823]: I1216 07:20:56.800927 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a9f0e08-d61e-4503-afc5-09cb29ff3175-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"4a9f0e08-d61e-4503-afc5-09cb29ff3175\") " pod="openstack/kube-state-metrics-0" Dec 16 07:20:56 crc kubenswrapper[4823]: I1216 07:20:56.801727 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9fcs\" (UniqueName: \"kubernetes.io/projected/4a9f0e08-d61e-4503-afc5-09cb29ff3175-kube-api-access-n9fcs\") pod \"kube-state-metrics-0\" (UID: \"4a9f0e08-d61e-4503-afc5-09cb29ff3175\") " pod="openstack/kube-state-metrics-0" Dec 16 07:20:56 crc kubenswrapper[4823]: I1216 07:20:56.943646 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 16 07:20:57 crc kubenswrapper[4823]: I1216 07:20:57.439324 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 16 07:20:57 crc kubenswrapper[4823]: I1216 07:20:57.555759 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"4a9f0e08-d61e-4503-afc5-09cb29ff3175","Type":"ContainerStarted","Data":"ddece497f262c1e5208bb1692a45a5f74d43ab4d9560423c5e712470e3e5818e"} Dec 16 07:20:57 crc kubenswrapper[4823]: I1216 07:20:57.785885 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8db2b8b4-03e8-4ae0-875d-5f3a6414d0e0" path="/var/lib/kubelet/pods/8db2b8b4-03e8-4ae0-875d-5f3a6414d0e0/volumes" Dec 16 07:20:57 crc kubenswrapper[4823]: I1216 07:20:57.817491 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 16 07:20:57 crc kubenswrapper[4823]: I1216 07:20:57.817772 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="41b2b49f-acfe-4019-a983-c9cea9de4378" containerName="ceilometer-central-agent" containerID="cri-o://05a5b9f8956ec4b6d4cff55811bfb65a933e6db7a0295b8f7c4bc94832dccb2b" gracePeriod=30 Dec 16 07:20:57 crc kubenswrapper[4823]: I1216 07:20:57.817840 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="41b2b49f-acfe-4019-a983-c9cea9de4378" containerName="proxy-httpd" containerID="cri-o://20b846e2d51c83fa289005416400c0a2d9d523ade2f29770a1d2db4efcf5961c" gracePeriod=30 Dec 16 07:20:57 crc kubenswrapper[4823]: I1216 07:20:57.817901 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="41b2b49f-acfe-4019-a983-c9cea9de4378" containerName="ceilometer-notification-agent" containerID="cri-o://dc3495df65d03d20434814e4df5e7c4ba019d019526109dd9bc388dad5bacca8" gracePeriod=30 Dec 16 07:20:57 
crc kubenswrapper[4823]: I1216 07:20:57.817889 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="41b2b49f-acfe-4019-a983-c9cea9de4378" containerName="sg-core" containerID="cri-o://12cdd53f57280c98da4f4f1685a71705c5a0d3d19818af557a48834817456ea8" gracePeriod=30 Dec 16 07:20:58 crc kubenswrapper[4823]: I1216 07:20:58.134294 4823 patch_prober.go:28] interesting pod/machine-config-daemon-fv56f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 07:20:58 crc kubenswrapper[4823]: I1216 07:20:58.134344 4823 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 07:20:58 crc kubenswrapper[4823]: I1216 07:20:58.134406 4823 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" Dec 16 07:20:58 crc kubenswrapper[4823]: I1216 07:20:58.135368 4823 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"37b5da4c3e0632087412acf947c72a2aad7577385641e763185ee25747c43921"} pod="openshift-machine-config-operator/machine-config-daemon-fv56f" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 16 07:20:58 crc kubenswrapper[4823]: I1216 07:20:58.135445 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" 
containerName="machine-config-daemon" containerID="cri-o://37b5da4c3e0632087412acf947c72a2aad7577385641e763185ee25747c43921" gracePeriod=600 Dec 16 07:20:58 crc kubenswrapper[4823]: E1216 07:20:58.273606 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 07:20:58 crc kubenswrapper[4823]: I1216 07:20:58.568787 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"4a9f0e08-d61e-4503-afc5-09cb29ff3175","Type":"ContainerStarted","Data":"48f6096b95361df10996fa9107240728047380521cee4e036be0b67323319318"} Dec 16 07:20:58 crc kubenswrapper[4823]: I1216 07:20:58.568891 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 16 07:20:58 crc kubenswrapper[4823]: I1216 07:20:58.571900 4823 generic.go:334] "Generic (PLEG): container finished" podID="25dec47c-3043-486c-b371-2be103c214e3" containerID="37b5da4c3e0632087412acf947c72a2aad7577385641e763185ee25747c43921" exitCode=0 Dec 16 07:20:58 crc kubenswrapper[4823]: I1216 07:20:58.571966 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" event={"ID":"25dec47c-3043-486c-b371-2be103c214e3","Type":"ContainerDied","Data":"37b5da4c3e0632087412acf947c72a2aad7577385641e763185ee25747c43921"} Dec 16 07:20:58 crc kubenswrapper[4823]: I1216 07:20:58.572051 4823 scope.go:117] "RemoveContainer" containerID="76342a6438b46c6d8e5101ee8ceb1df808db353230663e448e28ebb26272e882" Dec 16 07:20:58 crc kubenswrapper[4823]: I1216 07:20:58.572415 4823 scope.go:117] "RemoveContainer" 
containerID="37b5da4c3e0632087412acf947c72a2aad7577385641e763185ee25747c43921" Dec 16 07:20:58 crc kubenswrapper[4823]: E1216 07:20:58.572657 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 07:20:58 crc kubenswrapper[4823]: I1216 07:20:58.583863 4823 generic.go:334] "Generic (PLEG): container finished" podID="41b2b49f-acfe-4019-a983-c9cea9de4378" containerID="20b846e2d51c83fa289005416400c0a2d9d523ade2f29770a1d2db4efcf5961c" exitCode=0 Dec 16 07:20:58 crc kubenswrapper[4823]: I1216 07:20:58.583893 4823 generic.go:334] "Generic (PLEG): container finished" podID="41b2b49f-acfe-4019-a983-c9cea9de4378" containerID="12cdd53f57280c98da4f4f1685a71705c5a0d3d19818af557a48834817456ea8" exitCode=2 Dec 16 07:20:58 crc kubenswrapper[4823]: I1216 07:20:58.583901 4823 generic.go:334] "Generic (PLEG): container finished" podID="41b2b49f-acfe-4019-a983-c9cea9de4378" containerID="05a5b9f8956ec4b6d4cff55811bfb65a933e6db7a0295b8f7c4bc94832dccb2b" exitCode=0 Dec 16 07:20:58 crc kubenswrapper[4823]: I1216 07:20:58.583922 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"41b2b49f-acfe-4019-a983-c9cea9de4378","Type":"ContainerDied","Data":"20b846e2d51c83fa289005416400c0a2d9d523ade2f29770a1d2db4efcf5961c"} Dec 16 07:20:58 crc kubenswrapper[4823]: I1216 07:20:58.583947 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"41b2b49f-acfe-4019-a983-c9cea9de4378","Type":"ContainerDied","Data":"12cdd53f57280c98da4f4f1685a71705c5a0d3d19818af557a48834817456ea8"} Dec 16 07:20:58 crc kubenswrapper[4823]: I1216 07:20:58.583956 4823 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"41b2b49f-acfe-4019-a983-c9cea9de4378","Type":"ContainerDied","Data":"05a5b9f8956ec4b6d4cff55811bfb65a933e6db7a0295b8f7c4bc94832dccb2b"} Dec 16 07:20:58 crc kubenswrapper[4823]: I1216 07:20:58.591281 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.235762458 podStartE2EDuration="2.591255112s" podCreationTimestamp="2025-12-16 07:20:56 +0000 UTC" firstStartedPulling="2025-12-16 07:20:57.462868961 +0000 UTC m=+1535.951435084" lastFinishedPulling="2025-12-16 07:20:57.818361605 +0000 UTC m=+1536.306927738" observedRunningTime="2025-12-16 07:20:58.590717285 +0000 UTC m=+1537.079283408" watchObservedRunningTime="2025-12-16 07:20:58.591255112 +0000 UTC m=+1537.079821265" Dec 16 07:21:01 crc kubenswrapper[4823]: I1216 07:21:01.186128 4823 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/kube-state-metrics-0" podUID="8db2b8b4-03e8-4ae0-875d-5f3a6414d0e0" containerName="kube-state-metrics" probeResult="failure" output="Get \"http://10.217.0.102:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 16 07:21:02 crc kubenswrapper[4823]: E1216 07:21:02.603333 4823 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod05f31fcf_de71_43dc_a94c_11ef850cf1e4.slice/crio-d7813d1c91814e432a0eb6dcaf440aa3cdd98f1c31cee37f446714f48c9f71ec.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod05f31fcf_de71_43dc_a94c_11ef850cf1e4.slice/crio-conmon-d7813d1c91814e432a0eb6dcaf440aa3cdd98f1c31cee37f446714f48c9f71ec.scope\": RecentStats: unable to find data in memory cache]" Dec 16 07:21:02 crc kubenswrapper[4823]: I1216 07:21:02.619853 4823 generic.go:334] "Generic (PLEG): 
container finished" podID="05f31fcf-de71-43dc-a94c-11ef850cf1e4" containerID="d7813d1c91814e432a0eb6dcaf440aa3cdd98f1c31cee37f446714f48c9f71ec" exitCode=137 Dec 16 07:21:02 crc kubenswrapper[4823]: I1216 07:21:02.619931 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"05f31fcf-de71-43dc-a94c-11ef850cf1e4","Type":"ContainerDied","Data":"d7813d1c91814e432a0eb6dcaf440aa3cdd98f1c31cee37f446714f48c9f71ec"} Dec 16 07:21:02 crc kubenswrapper[4823]: I1216 07:21:02.622949 4823 generic.go:334] "Generic (PLEG): container finished" podID="41b2b49f-acfe-4019-a983-c9cea9de4378" containerID="dc3495df65d03d20434814e4df5e7c4ba019d019526109dd9bc388dad5bacca8" exitCode=0 Dec 16 07:21:02 crc kubenswrapper[4823]: I1216 07:21:02.623068 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"41b2b49f-acfe-4019-a983-c9cea9de4378","Type":"ContainerDied","Data":"dc3495df65d03d20434814e4df5e7c4ba019d019526109dd9bc388dad5bacca8"} Dec 16 07:21:02 crc kubenswrapper[4823]: I1216 07:21:02.624753 4823 generic.go:334] "Generic (PLEG): container finished" podID="3bdc8199-cc99-47c0-a271-7653cf92832e" containerID="ac51929429ff0dc494a7f7bf47c48ab01b7d2dee92724a05c9ab6ca16fab3c14" exitCode=137 Dec 16 07:21:02 crc kubenswrapper[4823]: I1216 07:21:02.624811 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"3bdc8199-cc99-47c0-a271-7653cf92832e","Type":"ContainerDied","Data":"ac51929429ff0dc494a7f7bf47c48ab01b7d2dee92724a05c9ab6ca16fab3c14"} Dec 16 07:21:02 crc kubenswrapper[4823]: I1216 07:21:02.624834 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"3bdc8199-cc99-47c0-a271-7653cf92832e","Type":"ContainerDied","Data":"98fdc86d2c70e40cb779803c096f218d5b06169fec19211d33f39bb35a6813a5"} Dec 16 07:21:02 crc kubenswrapper[4823]: I1216 07:21:02.624844 4823 pod_container_deletor.go:80] "Container not found 
in pod's containers" containerID="98fdc86d2c70e40cb779803c096f218d5b06169fec19211d33f39bb35a6813a5" Dec 16 07:21:02 crc kubenswrapper[4823]: I1216 07:21:02.701562 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 16 07:21:02 crc kubenswrapper[4823]: I1216 07:21:02.809608 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m5fkq\" (UniqueName: \"kubernetes.io/projected/3bdc8199-cc99-47c0-a271-7653cf92832e-kube-api-access-m5fkq\") pod \"3bdc8199-cc99-47c0-a271-7653cf92832e\" (UID: \"3bdc8199-cc99-47c0-a271-7653cf92832e\") " Dec 16 07:21:02 crc kubenswrapper[4823]: I1216 07:21:02.809694 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bdc8199-cc99-47c0-a271-7653cf92832e-combined-ca-bundle\") pod \"3bdc8199-cc99-47c0-a271-7653cf92832e\" (UID: \"3bdc8199-cc99-47c0-a271-7653cf92832e\") " Dec 16 07:21:02 crc kubenswrapper[4823]: I1216 07:21:02.809855 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3bdc8199-cc99-47c0-a271-7653cf92832e-config-data\") pod \"3bdc8199-cc99-47c0-a271-7653cf92832e\" (UID: \"3bdc8199-cc99-47c0-a271-7653cf92832e\") " Dec 16 07:21:02 crc kubenswrapper[4823]: I1216 07:21:02.817262 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3bdc8199-cc99-47c0-a271-7653cf92832e-kube-api-access-m5fkq" (OuterVolumeSpecName: "kube-api-access-m5fkq") pod "3bdc8199-cc99-47c0-a271-7653cf92832e" (UID: "3bdc8199-cc99-47c0-a271-7653cf92832e"). InnerVolumeSpecName "kube-api-access-m5fkq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:21:02 crc kubenswrapper[4823]: I1216 07:21:02.840529 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3bdc8199-cc99-47c0-a271-7653cf92832e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3bdc8199-cc99-47c0-a271-7653cf92832e" (UID: "3bdc8199-cc99-47c0-a271-7653cf92832e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:21:02 crc kubenswrapper[4823]: I1216 07:21:02.846284 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3bdc8199-cc99-47c0-a271-7653cf92832e-config-data" (OuterVolumeSpecName: "config-data") pod "3bdc8199-cc99-47c0-a271-7653cf92832e" (UID: "3bdc8199-cc99-47c0-a271-7653cf92832e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:21:02 crc kubenswrapper[4823]: I1216 07:21:02.886672 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 16 07:21:02 crc kubenswrapper[4823]: I1216 07:21:02.909708 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 16 07:21:02 crc kubenswrapper[4823]: I1216 07:21:02.915844 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bdc8199-cc99-47c0-a271-7653cf92832e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:21:02 crc kubenswrapper[4823]: I1216 07:21:02.915878 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3bdc8199-cc99-47c0-a271-7653cf92832e-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 07:21:02 crc kubenswrapper[4823]: I1216 07:21:02.915888 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m5fkq\" (UniqueName: \"kubernetes.io/projected/3bdc8199-cc99-47c0-a271-7653cf92832e-kube-api-access-m5fkq\") on node \"crc\" DevicePath \"\"" Dec 16 07:21:03 crc kubenswrapper[4823]: I1216 07:21:03.017462 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/41b2b49f-acfe-4019-a983-c9cea9de4378-sg-core-conf-yaml\") pod \"41b2b49f-acfe-4019-a983-c9cea9de4378\" (UID: \"41b2b49f-acfe-4019-a983-c9cea9de4378\") " Dec 16 07:21:03 crc kubenswrapper[4823]: I1216 07:21:03.017807 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05f31fcf-de71-43dc-a94c-11ef850cf1e4-combined-ca-bundle\") pod \"05f31fcf-de71-43dc-a94c-11ef850cf1e4\" (UID: \"05f31fcf-de71-43dc-a94c-11ef850cf1e4\") " Dec 16 07:21:03 crc kubenswrapper[4823]: I1216 07:21:03.017870 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r6jqb\" (UniqueName: \"kubernetes.io/projected/41b2b49f-acfe-4019-a983-c9cea9de4378-kube-api-access-r6jqb\") pod \"41b2b49f-acfe-4019-a983-c9cea9de4378\" (UID: \"41b2b49f-acfe-4019-a983-c9cea9de4378\") " Dec 16 07:21:03 crc kubenswrapper[4823]: I1216 
07:21:03.017940 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41b2b49f-acfe-4019-a983-c9cea9de4378-log-httpd\") pod \"41b2b49f-acfe-4019-a983-c9cea9de4378\" (UID: \"41b2b49f-acfe-4019-a983-c9cea9de4378\") " Dec 16 07:21:03 crc kubenswrapper[4823]: I1216 07:21:03.017978 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05f31fcf-de71-43dc-a94c-11ef850cf1e4-config-data\") pod \"05f31fcf-de71-43dc-a94c-11ef850cf1e4\" (UID: \"05f31fcf-de71-43dc-a94c-11ef850cf1e4\") " Dec 16 07:21:03 crc kubenswrapper[4823]: I1216 07:21:03.018189 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41b2b49f-acfe-4019-a983-c9cea9de4378-run-httpd\") pod \"41b2b49f-acfe-4019-a983-c9cea9de4378\" (UID: \"41b2b49f-acfe-4019-a983-c9cea9de4378\") " Dec 16 07:21:03 crc kubenswrapper[4823]: I1216 07:21:03.018565 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41b2b49f-acfe-4019-a983-c9cea9de4378-scripts\") pod \"41b2b49f-acfe-4019-a983-c9cea9de4378\" (UID: \"41b2b49f-acfe-4019-a983-c9cea9de4378\") " Dec 16 07:21:03 crc kubenswrapper[4823]: I1216 07:21:03.019725 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2q6m2\" (UniqueName: \"kubernetes.io/projected/05f31fcf-de71-43dc-a94c-11ef850cf1e4-kube-api-access-2q6m2\") pod \"05f31fcf-de71-43dc-a94c-11ef850cf1e4\" (UID: \"05f31fcf-de71-43dc-a94c-11ef850cf1e4\") " Dec 16 07:21:03 crc kubenswrapper[4823]: I1216 07:21:03.018631 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41b2b49f-acfe-4019-a983-c9cea9de4378-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "41b2b49f-acfe-4019-a983-c9cea9de4378" (UID: 
"41b2b49f-acfe-4019-a983-c9cea9de4378"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:21:03 crc kubenswrapper[4823]: I1216 07:21:03.018663 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41b2b49f-acfe-4019-a983-c9cea9de4378-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "41b2b49f-acfe-4019-a983-c9cea9de4378" (UID: "41b2b49f-acfe-4019-a983-c9cea9de4378"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:21:03 crc kubenswrapper[4823]: I1216 07:21:03.022235 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/05f31fcf-de71-43dc-a94c-11ef850cf1e4-logs\") pod \"05f31fcf-de71-43dc-a94c-11ef850cf1e4\" (UID: \"05f31fcf-de71-43dc-a94c-11ef850cf1e4\") " Dec 16 07:21:03 crc kubenswrapper[4823]: I1216 07:21:03.022307 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41b2b49f-acfe-4019-a983-c9cea9de4378-config-data\") pod \"41b2b49f-acfe-4019-a983-c9cea9de4378\" (UID: \"41b2b49f-acfe-4019-a983-c9cea9de4378\") " Dec 16 07:21:03 crc kubenswrapper[4823]: I1216 07:21:03.022361 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41b2b49f-acfe-4019-a983-c9cea9de4378-combined-ca-bundle\") pod \"41b2b49f-acfe-4019-a983-c9cea9de4378\" (UID: \"41b2b49f-acfe-4019-a983-c9cea9de4378\") " Dec 16 07:21:03 crc kubenswrapper[4823]: I1216 07:21:03.023517 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05f31fcf-de71-43dc-a94c-11ef850cf1e4-logs" (OuterVolumeSpecName: "logs") pod "05f31fcf-de71-43dc-a94c-11ef850cf1e4" (UID: "05f31fcf-de71-43dc-a94c-11ef850cf1e4"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:21:03 crc kubenswrapper[4823]: I1216 07:21:03.024741 4823 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41b2b49f-acfe-4019-a983-c9cea9de4378-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 16 07:21:03 crc kubenswrapper[4823]: I1216 07:21:03.024769 4823 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41b2b49f-acfe-4019-a983-c9cea9de4378-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 16 07:21:03 crc kubenswrapper[4823]: I1216 07:21:03.027531 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41b2b49f-acfe-4019-a983-c9cea9de4378-scripts" (OuterVolumeSpecName: "scripts") pod "41b2b49f-acfe-4019-a983-c9cea9de4378" (UID: "41b2b49f-acfe-4019-a983-c9cea9de4378"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:21:03 crc kubenswrapper[4823]: I1216 07:21:03.040730 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41b2b49f-acfe-4019-a983-c9cea9de4378-kube-api-access-r6jqb" (OuterVolumeSpecName: "kube-api-access-r6jqb") pod "41b2b49f-acfe-4019-a983-c9cea9de4378" (UID: "41b2b49f-acfe-4019-a983-c9cea9de4378"). InnerVolumeSpecName "kube-api-access-r6jqb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:21:03 crc kubenswrapper[4823]: I1216 07:21:03.045240 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05f31fcf-de71-43dc-a94c-11ef850cf1e4-kube-api-access-2q6m2" (OuterVolumeSpecName: "kube-api-access-2q6m2") pod "05f31fcf-de71-43dc-a94c-11ef850cf1e4" (UID: "05f31fcf-de71-43dc-a94c-11ef850cf1e4"). InnerVolumeSpecName "kube-api-access-2q6m2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:21:03 crc kubenswrapper[4823]: I1216 07:21:03.057038 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05f31fcf-de71-43dc-a94c-11ef850cf1e4-config-data" (OuterVolumeSpecName: "config-data") pod "05f31fcf-de71-43dc-a94c-11ef850cf1e4" (UID: "05f31fcf-de71-43dc-a94c-11ef850cf1e4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:21:03 crc kubenswrapper[4823]: I1216 07:21:03.061618 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05f31fcf-de71-43dc-a94c-11ef850cf1e4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "05f31fcf-de71-43dc-a94c-11ef850cf1e4" (UID: "05f31fcf-de71-43dc-a94c-11ef850cf1e4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:21:03 crc kubenswrapper[4823]: I1216 07:21:03.066631 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41b2b49f-acfe-4019-a983-c9cea9de4378-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "41b2b49f-acfe-4019-a983-c9cea9de4378" (UID: "41b2b49f-acfe-4019-a983-c9cea9de4378"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:21:03 crc kubenswrapper[4823]: I1216 07:21:03.103185 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41b2b49f-acfe-4019-a983-c9cea9de4378-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "41b2b49f-acfe-4019-a983-c9cea9de4378" (UID: "41b2b49f-acfe-4019-a983-c9cea9de4378"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:21:03 crc kubenswrapper[4823]: I1216 07:21:03.126525 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41b2b49f-acfe-4019-a983-c9cea9de4378-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:21:03 crc kubenswrapper[4823]: I1216 07:21:03.126552 4823 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/41b2b49f-acfe-4019-a983-c9cea9de4378-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 16 07:21:03 crc kubenswrapper[4823]: I1216 07:21:03.126561 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05f31fcf-de71-43dc-a94c-11ef850cf1e4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:21:03 crc kubenswrapper[4823]: I1216 07:21:03.126587 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r6jqb\" (UniqueName: \"kubernetes.io/projected/41b2b49f-acfe-4019-a983-c9cea9de4378-kube-api-access-r6jqb\") on node \"crc\" DevicePath \"\"" Dec 16 07:21:03 crc kubenswrapper[4823]: I1216 07:21:03.126600 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05f31fcf-de71-43dc-a94c-11ef850cf1e4-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 07:21:03 crc kubenswrapper[4823]: I1216 07:21:03.126608 4823 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41b2b49f-acfe-4019-a983-c9cea9de4378-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:21:03 crc kubenswrapper[4823]: I1216 07:21:03.126616 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2q6m2\" (UniqueName: \"kubernetes.io/projected/05f31fcf-de71-43dc-a94c-11ef850cf1e4-kube-api-access-2q6m2\") on node \"crc\" DevicePath \"\"" Dec 16 07:21:03 crc kubenswrapper[4823]: I1216 
07:21:03.126624 4823 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/05f31fcf-de71-43dc-a94c-11ef850cf1e4-logs\") on node \"crc\" DevicePath \"\"" Dec 16 07:21:03 crc kubenswrapper[4823]: I1216 07:21:03.159141 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41b2b49f-acfe-4019-a983-c9cea9de4378-config-data" (OuterVolumeSpecName: "config-data") pod "41b2b49f-acfe-4019-a983-c9cea9de4378" (UID: "41b2b49f-acfe-4019-a983-c9cea9de4378"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:21:03 crc kubenswrapper[4823]: I1216 07:21:03.228665 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41b2b49f-acfe-4019-a983-c9cea9de4378-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 07:21:03 crc kubenswrapper[4823]: I1216 07:21:03.635097 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"05f31fcf-de71-43dc-a94c-11ef850cf1e4","Type":"ContainerDied","Data":"740bfd7c5d4bd470eedf9a35591ffa4a893ba1b199247567cc0102dfe14fa190"} Dec 16 07:21:03 crc kubenswrapper[4823]: I1216 07:21:03.635166 4823 scope.go:117] "RemoveContainer" containerID="d7813d1c91814e432a0eb6dcaf440aa3cdd98f1c31cee37f446714f48c9f71ec" Dec 16 07:21:03 crc kubenswrapper[4823]: I1216 07:21:03.635163 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 16 07:21:03 crc kubenswrapper[4823]: I1216 07:21:03.637902 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 16 07:21:03 crc kubenswrapper[4823]: I1216 07:21:03.640177 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 16 07:21:03 crc kubenswrapper[4823]: I1216 07:21:03.640202 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"41b2b49f-acfe-4019-a983-c9cea9de4378","Type":"ContainerDied","Data":"2f193c658c700f19047e1e8a00dff6ea721e5995a14485b02f0e9deffbe95aec"} Dec 16 07:21:03 crc kubenswrapper[4823]: I1216 07:21:03.668846 4823 scope.go:117] "RemoveContainer" containerID="0a550f17b17ac9b38481c402ef52c613adb84141592a677e22ded679a0bbc92f" Dec 16 07:21:03 crc kubenswrapper[4823]: I1216 07:21:03.688367 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 16 07:21:03 crc kubenswrapper[4823]: I1216 07:21:03.704275 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 16 07:21:03 crc kubenswrapper[4823]: I1216 07:21:03.716187 4823 scope.go:117] "RemoveContainer" containerID="20b846e2d51c83fa289005416400c0a2d9d523ade2f29770a1d2db4efcf5961c" Dec 16 07:21:03 crc kubenswrapper[4823]: I1216 07:21:03.736495 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 16 07:21:03 crc kubenswrapper[4823]: I1216 07:21:03.761219 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 16 07:21:03 crc kubenswrapper[4823]: E1216 07:21:03.761582 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41b2b49f-acfe-4019-a983-c9cea9de4378" containerName="ceilometer-notification-agent" Dec 16 07:21:03 crc kubenswrapper[4823]: I1216 07:21:03.761598 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="41b2b49f-acfe-4019-a983-c9cea9de4378" containerName="ceilometer-notification-agent" Dec 16 07:21:03 crc kubenswrapper[4823]: E1216 07:21:03.761608 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41b2b49f-acfe-4019-a983-c9cea9de4378" containerName="proxy-httpd" Dec 16 07:21:03 crc kubenswrapper[4823]: 
I1216 07:21:03.761615 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="41b2b49f-acfe-4019-a983-c9cea9de4378" containerName="proxy-httpd" Dec 16 07:21:03 crc kubenswrapper[4823]: E1216 07:21:03.761629 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05f31fcf-de71-43dc-a94c-11ef850cf1e4" containerName="nova-metadata-log" Dec 16 07:21:03 crc kubenswrapper[4823]: I1216 07:21:03.761636 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="05f31fcf-de71-43dc-a94c-11ef850cf1e4" containerName="nova-metadata-log" Dec 16 07:21:03 crc kubenswrapper[4823]: E1216 07:21:03.761649 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bdc8199-cc99-47c0-a271-7653cf92832e" containerName="nova-cell1-novncproxy-novncproxy" Dec 16 07:21:03 crc kubenswrapper[4823]: I1216 07:21:03.761655 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bdc8199-cc99-47c0-a271-7653cf92832e" containerName="nova-cell1-novncproxy-novncproxy" Dec 16 07:21:03 crc kubenswrapper[4823]: E1216 07:21:03.761677 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41b2b49f-acfe-4019-a983-c9cea9de4378" containerName="sg-core" Dec 16 07:21:03 crc kubenswrapper[4823]: I1216 07:21:03.761683 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="41b2b49f-acfe-4019-a983-c9cea9de4378" containerName="sg-core" Dec 16 07:21:03 crc kubenswrapper[4823]: E1216 07:21:03.761690 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41b2b49f-acfe-4019-a983-c9cea9de4378" containerName="ceilometer-central-agent" Dec 16 07:21:03 crc kubenswrapper[4823]: I1216 07:21:03.761696 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="41b2b49f-acfe-4019-a983-c9cea9de4378" containerName="ceilometer-central-agent" Dec 16 07:21:03 crc kubenswrapper[4823]: E1216 07:21:03.761707 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05f31fcf-de71-43dc-a94c-11ef850cf1e4" containerName="nova-metadata-metadata" Dec 16 
07:21:03 crc kubenswrapper[4823]: I1216 07:21:03.761714 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="05f31fcf-de71-43dc-a94c-11ef850cf1e4" containerName="nova-metadata-metadata" Dec 16 07:21:03 crc kubenswrapper[4823]: I1216 07:21:03.761876 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="41b2b49f-acfe-4019-a983-c9cea9de4378" containerName="ceilometer-central-agent" Dec 16 07:21:03 crc kubenswrapper[4823]: I1216 07:21:03.761888 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="41b2b49f-acfe-4019-a983-c9cea9de4378" containerName="sg-core" Dec 16 07:21:03 crc kubenswrapper[4823]: I1216 07:21:03.761900 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="3bdc8199-cc99-47c0-a271-7653cf92832e" containerName="nova-cell1-novncproxy-novncproxy" Dec 16 07:21:03 crc kubenswrapper[4823]: I1216 07:21:03.761907 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="05f31fcf-de71-43dc-a94c-11ef850cf1e4" containerName="nova-metadata-metadata" Dec 16 07:21:03 crc kubenswrapper[4823]: I1216 07:21:03.761916 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="41b2b49f-acfe-4019-a983-c9cea9de4378" containerName="ceilometer-notification-agent" Dec 16 07:21:03 crc kubenswrapper[4823]: I1216 07:21:03.761927 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="05f31fcf-de71-43dc-a94c-11ef850cf1e4" containerName="nova-metadata-log" Dec 16 07:21:03 crc kubenswrapper[4823]: I1216 07:21:03.761937 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="41b2b49f-acfe-4019-a983-c9cea9de4378" containerName="proxy-httpd" Dec 16 07:21:03 crc kubenswrapper[4823]: I1216 07:21:03.762580 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 16 07:21:03 crc kubenswrapper[4823]: I1216 07:21:03.767917 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 16 07:21:03 crc kubenswrapper[4823]: I1216 07:21:03.768377 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Dec 16 07:21:03 crc kubenswrapper[4823]: I1216 07:21:03.768706 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Dec 16 07:21:03 crc kubenswrapper[4823]: I1216 07:21:03.773007 4823 scope.go:117] "RemoveContainer" containerID="12cdd53f57280c98da4f4f1685a71705c5a0d3d19818af557a48834817456ea8" Dec 16 07:21:03 crc kubenswrapper[4823]: I1216 07:21:03.802846 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3bdc8199-cc99-47c0-a271-7653cf92832e" path="/var/lib/kubelet/pods/3bdc8199-cc99-47c0-a271-7653cf92832e/volumes" Dec 16 07:21:03 crc kubenswrapper[4823]: I1216 07:21:03.803741 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 16 07:21:03 crc kubenswrapper[4823]: I1216 07:21:03.803896 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 16 07:21:03 crc kubenswrapper[4823]: I1216 07:21:03.806946 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 16 07:21:03 crc kubenswrapper[4823]: I1216 07:21:03.814427 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 16 07:21:03 crc kubenswrapper[4823]: I1216 07:21:03.818008 4823 scope.go:117] "RemoveContainer" containerID="dc3495df65d03d20434814e4df5e7c4ba019d019526109dd9bc388dad5bacca8" Dec 16 07:21:03 crc kubenswrapper[4823]: I1216 07:21:03.826612 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 16 07:21:03 crc 
kubenswrapper[4823]: I1216 07:21:03.835523 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 16 07:21:03 crc kubenswrapper[4823]: I1216 07:21:03.839214 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 16 07:21:03 crc kubenswrapper[4823]: I1216 07:21:03.839923 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 16 07:21:03 crc kubenswrapper[4823]: I1216 07:21:03.840382 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 16 07:21:03 crc kubenswrapper[4823]: I1216 07:21:03.843321 4823 scope.go:117] "RemoveContainer" containerID="05a5b9f8956ec4b6d4cff55811bfb65a933e6db7a0295b8f7c4bc94832dccb2b" Dec 16 07:21:03 crc kubenswrapper[4823]: I1216 07:21:03.849541 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 16 07:21:03 crc kubenswrapper[4823]: I1216 07:21:03.851896 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 16 07:21:03 crc kubenswrapper[4823]: I1216 07:21:03.858164 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 16 07:21:03 crc kubenswrapper[4823]: I1216 07:21:03.858257 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 16 07:21:03 crc kubenswrapper[4823]: I1216 07:21:03.858428 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 16 07:21:03 crc kubenswrapper[4823]: I1216 07:21:03.860906 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 16 07:21:03 crc kubenswrapper[4823]: I1216 07:21:03.941039 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbb285b0-26ce-494d-9d69-8fe905e39469-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"dbb285b0-26ce-494d-9d69-8fe905e39469\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 07:21:03 crc kubenswrapper[4823]: I1216 07:21:03.941364 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5f8g\" (UniqueName: \"kubernetes.io/projected/dbb285b0-26ce-494d-9d69-8fe905e39469-kube-api-access-w5f8g\") pod \"nova-cell1-novncproxy-0\" (UID: \"dbb285b0-26ce-494d-9d69-8fe905e39469\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 07:21:03 crc kubenswrapper[4823]: I1216 07:21:03.941389 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbb285b0-26ce-494d-9d69-8fe905e39469-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"dbb285b0-26ce-494d-9d69-8fe905e39469\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 07:21:03 crc kubenswrapper[4823]: I1216 07:21:03.941418 
4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b89e60a8-db7c-4e57-ac6b-aab09909e7ad-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b89e60a8-db7c-4e57-ac6b-aab09909e7ad\") " pod="openstack/nova-metadata-0" Dec 16 07:21:03 crc kubenswrapper[4823]: I1216 07:21:03.941482 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbb285b0-26ce-494d-9d69-8fe905e39469-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"dbb285b0-26ce-494d-9d69-8fe905e39469\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 07:21:03 crc kubenswrapper[4823]: I1216 07:21:03.941517 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4n6ld\" (UniqueName: \"kubernetes.io/projected/d3c5cbc0-9fd6-4c23-8837-847571047381-kube-api-access-4n6ld\") pod \"ceilometer-0\" (UID: \"d3c5cbc0-9fd6-4c23-8837-847571047381\") " pod="openstack/ceilometer-0" Dec 16 07:21:03 crc kubenswrapper[4823]: I1216 07:21:03.941539 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d3c5cbc0-9fd6-4c23-8837-847571047381-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d3c5cbc0-9fd6-4c23-8837-847571047381\") " pod="openstack/ceilometer-0" Dec 16 07:21:03 crc kubenswrapper[4823]: I1216 07:21:03.941630 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3c5cbc0-9fd6-4c23-8837-847571047381-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"d3c5cbc0-9fd6-4c23-8837-847571047381\") " pod="openstack/ceilometer-0" Dec 16 07:21:03 crc kubenswrapper[4823]: I1216 07:21:03.941660 4823 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbb285b0-26ce-494d-9d69-8fe905e39469-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"dbb285b0-26ce-494d-9d69-8fe905e39469\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 07:21:03 crc kubenswrapper[4823]: I1216 07:21:03.941679 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d3c5cbc0-9fd6-4c23-8837-847571047381-run-httpd\") pod \"ceilometer-0\" (UID: \"d3c5cbc0-9fd6-4c23-8837-847571047381\") " pod="openstack/ceilometer-0" Dec 16 07:21:03 crc kubenswrapper[4823]: I1216 07:21:03.941703 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3c5cbc0-9fd6-4c23-8837-847571047381-config-data\") pod \"ceilometer-0\" (UID: \"d3c5cbc0-9fd6-4c23-8837-847571047381\") " pod="openstack/ceilometer-0" Dec 16 07:21:03 crc kubenswrapper[4823]: I1216 07:21:03.941956 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b89e60a8-db7c-4e57-ac6b-aab09909e7ad-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b89e60a8-db7c-4e57-ac6b-aab09909e7ad\") " pod="openstack/nova-metadata-0" Dec 16 07:21:03 crc kubenswrapper[4823]: I1216 07:21:03.942200 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b89e60a8-db7c-4e57-ac6b-aab09909e7ad-logs\") pod \"nova-metadata-0\" (UID: \"b89e60a8-db7c-4e57-ac6b-aab09909e7ad\") " pod="openstack/nova-metadata-0" Dec 16 07:21:03 crc kubenswrapper[4823]: I1216 07:21:03.942353 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z64cz\" (UniqueName: 
\"kubernetes.io/projected/b89e60a8-db7c-4e57-ac6b-aab09909e7ad-kube-api-access-z64cz\") pod \"nova-metadata-0\" (UID: \"b89e60a8-db7c-4e57-ac6b-aab09909e7ad\") " pod="openstack/nova-metadata-0" Dec 16 07:21:03 crc kubenswrapper[4823]: I1216 07:21:03.942403 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3c5cbc0-9fd6-4c23-8837-847571047381-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d3c5cbc0-9fd6-4c23-8837-847571047381\") " pod="openstack/ceilometer-0" Dec 16 07:21:03 crc kubenswrapper[4823]: I1216 07:21:03.942454 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b89e60a8-db7c-4e57-ac6b-aab09909e7ad-config-data\") pod \"nova-metadata-0\" (UID: \"b89e60a8-db7c-4e57-ac6b-aab09909e7ad\") " pod="openstack/nova-metadata-0" Dec 16 07:21:03 crc kubenswrapper[4823]: I1216 07:21:03.942650 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d3c5cbc0-9fd6-4c23-8837-847571047381-log-httpd\") pod \"ceilometer-0\" (UID: \"d3c5cbc0-9fd6-4c23-8837-847571047381\") " pod="openstack/ceilometer-0" Dec 16 07:21:03 crc kubenswrapper[4823]: I1216 07:21:03.942771 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3c5cbc0-9fd6-4c23-8837-847571047381-scripts\") pod \"ceilometer-0\" (UID: \"d3c5cbc0-9fd6-4c23-8837-847571047381\") " pod="openstack/ceilometer-0" Dec 16 07:21:04 crc kubenswrapper[4823]: I1216 07:21:04.044775 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3c5cbc0-9fd6-4c23-8837-847571047381-config-data\") pod \"ceilometer-0\" (UID: \"d3c5cbc0-9fd6-4c23-8837-847571047381\") " 
pod="openstack/ceilometer-0" Dec 16 07:21:04 crc kubenswrapper[4823]: I1216 07:21:04.045710 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b89e60a8-db7c-4e57-ac6b-aab09909e7ad-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b89e60a8-db7c-4e57-ac6b-aab09909e7ad\") " pod="openstack/nova-metadata-0" Dec 16 07:21:04 crc kubenswrapper[4823]: I1216 07:21:04.045764 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b89e60a8-db7c-4e57-ac6b-aab09909e7ad-logs\") pod \"nova-metadata-0\" (UID: \"b89e60a8-db7c-4e57-ac6b-aab09909e7ad\") " pod="openstack/nova-metadata-0" Dec 16 07:21:04 crc kubenswrapper[4823]: I1216 07:21:04.045807 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z64cz\" (UniqueName: \"kubernetes.io/projected/b89e60a8-db7c-4e57-ac6b-aab09909e7ad-kube-api-access-z64cz\") pod \"nova-metadata-0\" (UID: \"b89e60a8-db7c-4e57-ac6b-aab09909e7ad\") " pod="openstack/nova-metadata-0" Dec 16 07:21:04 crc kubenswrapper[4823]: I1216 07:21:04.045830 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3c5cbc0-9fd6-4c23-8837-847571047381-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d3c5cbc0-9fd6-4c23-8837-847571047381\") " pod="openstack/ceilometer-0" Dec 16 07:21:04 crc kubenswrapper[4823]: I1216 07:21:04.045849 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b89e60a8-db7c-4e57-ac6b-aab09909e7ad-config-data\") pod \"nova-metadata-0\" (UID: \"b89e60a8-db7c-4e57-ac6b-aab09909e7ad\") " pod="openstack/nova-metadata-0" Dec 16 07:21:04 crc kubenswrapper[4823]: I1216 07:21:04.045885 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d3c5cbc0-9fd6-4c23-8837-847571047381-log-httpd\") pod \"ceilometer-0\" (UID: \"d3c5cbc0-9fd6-4c23-8837-847571047381\") " pod="openstack/ceilometer-0" Dec 16 07:21:04 crc kubenswrapper[4823]: I1216 07:21:04.045911 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3c5cbc0-9fd6-4c23-8837-847571047381-scripts\") pod \"ceilometer-0\" (UID: \"d3c5cbc0-9fd6-4c23-8837-847571047381\") " pod="openstack/ceilometer-0" Dec 16 07:21:04 crc kubenswrapper[4823]: I1216 07:21:04.045936 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbb285b0-26ce-494d-9d69-8fe905e39469-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"dbb285b0-26ce-494d-9d69-8fe905e39469\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 07:21:04 crc kubenswrapper[4823]: I1216 07:21:04.045959 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5f8g\" (UniqueName: \"kubernetes.io/projected/dbb285b0-26ce-494d-9d69-8fe905e39469-kube-api-access-w5f8g\") pod \"nova-cell1-novncproxy-0\" (UID: \"dbb285b0-26ce-494d-9d69-8fe905e39469\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 07:21:04 crc kubenswrapper[4823]: I1216 07:21:04.045978 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbb285b0-26ce-494d-9d69-8fe905e39469-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"dbb285b0-26ce-494d-9d69-8fe905e39469\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 07:21:04 crc kubenswrapper[4823]: I1216 07:21:04.045999 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b89e60a8-db7c-4e57-ac6b-aab09909e7ad-combined-ca-bundle\") pod 
\"nova-metadata-0\" (UID: \"b89e60a8-db7c-4e57-ac6b-aab09909e7ad\") " pod="openstack/nova-metadata-0" Dec 16 07:21:04 crc kubenswrapper[4823]: I1216 07:21:04.046047 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbb285b0-26ce-494d-9d69-8fe905e39469-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"dbb285b0-26ce-494d-9d69-8fe905e39469\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 07:21:04 crc kubenswrapper[4823]: I1216 07:21:04.046070 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4n6ld\" (UniqueName: \"kubernetes.io/projected/d3c5cbc0-9fd6-4c23-8837-847571047381-kube-api-access-4n6ld\") pod \"ceilometer-0\" (UID: \"d3c5cbc0-9fd6-4c23-8837-847571047381\") " pod="openstack/ceilometer-0" Dec 16 07:21:04 crc kubenswrapper[4823]: I1216 07:21:04.046238 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d3c5cbc0-9fd6-4c23-8837-847571047381-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d3c5cbc0-9fd6-4c23-8837-847571047381\") " pod="openstack/ceilometer-0" Dec 16 07:21:04 crc kubenswrapper[4823]: I1216 07:21:04.046283 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3c5cbc0-9fd6-4c23-8837-847571047381-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"d3c5cbc0-9fd6-4c23-8837-847571047381\") " pod="openstack/ceilometer-0" Dec 16 07:21:04 crc kubenswrapper[4823]: I1216 07:21:04.046585 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbb285b0-26ce-494d-9d69-8fe905e39469-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"dbb285b0-26ce-494d-9d69-8fe905e39469\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 07:21:04 crc kubenswrapper[4823]: I1216 
07:21:04.046622 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d3c5cbc0-9fd6-4c23-8837-847571047381-run-httpd\") pod \"ceilometer-0\" (UID: \"d3c5cbc0-9fd6-4c23-8837-847571047381\") " pod="openstack/ceilometer-0" Dec 16 07:21:04 crc kubenswrapper[4823]: I1216 07:21:04.047084 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d3c5cbc0-9fd6-4c23-8837-847571047381-log-httpd\") pod \"ceilometer-0\" (UID: \"d3c5cbc0-9fd6-4c23-8837-847571047381\") " pod="openstack/ceilometer-0" Dec 16 07:21:04 crc kubenswrapper[4823]: I1216 07:21:04.047227 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d3c5cbc0-9fd6-4c23-8837-847571047381-run-httpd\") pod \"ceilometer-0\" (UID: \"d3c5cbc0-9fd6-4c23-8837-847571047381\") " pod="openstack/ceilometer-0" Dec 16 07:21:04 crc kubenswrapper[4823]: I1216 07:21:04.047393 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b89e60a8-db7c-4e57-ac6b-aab09909e7ad-logs\") pod \"nova-metadata-0\" (UID: \"b89e60a8-db7c-4e57-ac6b-aab09909e7ad\") " pod="openstack/nova-metadata-0" Dec 16 07:21:04 crc kubenswrapper[4823]: I1216 07:21:04.051671 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbb285b0-26ce-494d-9d69-8fe905e39469-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"dbb285b0-26ce-494d-9d69-8fe905e39469\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 07:21:04 crc kubenswrapper[4823]: I1216 07:21:04.052347 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b89e60a8-db7c-4e57-ac6b-aab09909e7ad-config-data\") pod \"nova-metadata-0\" (UID: \"b89e60a8-db7c-4e57-ac6b-aab09909e7ad\") " 
pod="openstack/nova-metadata-0" Dec 16 07:21:04 crc kubenswrapper[4823]: I1216 07:21:04.053168 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbb285b0-26ce-494d-9d69-8fe905e39469-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"dbb285b0-26ce-494d-9d69-8fe905e39469\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 07:21:04 crc kubenswrapper[4823]: I1216 07:21:04.054458 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b89e60a8-db7c-4e57-ac6b-aab09909e7ad-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b89e60a8-db7c-4e57-ac6b-aab09909e7ad\") " pod="openstack/nova-metadata-0" Dec 16 07:21:04 crc kubenswrapper[4823]: I1216 07:21:04.054516 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbb285b0-26ce-494d-9d69-8fe905e39469-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"dbb285b0-26ce-494d-9d69-8fe905e39469\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 07:21:04 crc kubenswrapper[4823]: I1216 07:21:04.056273 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3c5cbc0-9fd6-4c23-8837-847571047381-config-data\") pod \"ceilometer-0\" (UID: \"d3c5cbc0-9fd6-4c23-8837-847571047381\") " pod="openstack/ceilometer-0" Dec 16 07:21:04 crc kubenswrapper[4823]: I1216 07:21:04.057243 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b89e60a8-db7c-4e57-ac6b-aab09909e7ad-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b89e60a8-db7c-4e57-ac6b-aab09909e7ad\") " pod="openstack/nova-metadata-0" Dec 16 07:21:04 crc kubenswrapper[4823]: I1216 07:21:04.057945 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbb285b0-26ce-494d-9d69-8fe905e39469-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"dbb285b0-26ce-494d-9d69-8fe905e39469\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 07:21:04 crc kubenswrapper[4823]: I1216 07:21:04.058412 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d3c5cbc0-9fd6-4c23-8837-847571047381-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d3c5cbc0-9fd6-4c23-8837-847571047381\") " pod="openstack/ceilometer-0" Dec 16 07:21:04 crc kubenswrapper[4823]: I1216 07:21:04.059356 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3c5cbc0-9fd6-4c23-8837-847571047381-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d3c5cbc0-9fd6-4c23-8837-847571047381\") " pod="openstack/ceilometer-0" Dec 16 07:21:04 crc kubenswrapper[4823]: I1216 07:21:04.068523 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3c5cbc0-9fd6-4c23-8837-847571047381-scripts\") pod \"ceilometer-0\" (UID: \"d3c5cbc0-9fd6-4c23-8837-847571047381\") " pod="openstack/ceilometer-0" Dec 16 07:21:04 crc kubenswrapper[4823]: I1216 07:21:04.068653 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z64cz\" (UniqueName: \"kubernetes.io/projected/b89e60a8-db7c-4e57-ac6b-aab09909e7ad-kube-api-access-z64cz\") pod \"nova-metadata-0\" (UID: \"b89e60a8-db7c-4e57-ac6b-aab09909e7ad\") " pod="openstack/nova-metadata-0" Dec 16 07:21:04 crc kubenswrapper[4823]: I1216 07:21:04.068717 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5f8g\" (UniqueName: \"kubernetes.io/projected/dbb285b0-26ce-494d-9d69-8fe905e39469-kube-api-access-w5f8g\") pod \"nova-cell1-novncproxy-0\" (UID: 
\"dbb285b0-26ce-494d-9d69-8fe905e39469\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 07:21:04 crc kubenswrapper[4823]: I1216 07:21:04.080401 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3c5cbc0-9fd6-4c23-8837-847571047381-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"d3c5cbc0-9fd6-4c23-8837-847571047381\") " pod="openstack/ceilometer-0" Dec 16 07:21:04 crc kubenswrapper[4823]: I1216 07:21:04.081535 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4n6ld\" (UniqueName: \"kubernetes.io/projected/d3c5cbc0-9fd6-4c23-8837-847571047381-kube-api-access-4n6ld\") pod \"ceilometer-0\" (UID: \"d3c5cbc0-9fd6-4c23-8837-847571047381\") " pod="openstack/ceilometer-0" Dec 16 07:21:04 crc kubenswrapper[4823]: I1216 07:21:04.083364 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 16 07:21:04 crc kubenswrapper[4823]: I1216 07:21:04.159450 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 16 07:21:04 crc kubenswrapper[4823]: I1216 07:21:04.181090 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 16 07:21:04 crc kubenswrapper[4823]: I1216 07:21:04.560162 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 16 07:21:04 crc kubenswrapper[4823]: I1216 07:21:04.665912 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"dbb285b0-26ce-494d-9d69-8fe905e39469","Type":"ContainerStarted","Data":"b599852651b94ae9867c04811d750013da43c36522f96160d7a3e0a15baad0ab"} Dec 16 07:21:04 crc kubenswrapper[4823]: W1216 07:21:04.737677 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd3c5cbc0_9fd6_4c23_8837_847571047381.slice/crio-86dc605d5320b0cf7dece0664ba254d367fc4250174be36a4a55984e4cbb6cb2 WatchSource:0}: Error finding container 86dc605d5320b0cf7dece0664ba254d367fc4250174be36a4a55984e4cbb6cb2: Status 404 returned error can't find the container with id 86dc605d5320b0cf7dece0664ba254d367fc4250174be36a4a55984e4cbb6cb2 Dec 16 07:21:04 crc kubenswrapper[4823]: I1216 07:21:04.747163 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 16 07:21:04 crc kubenswrapper[4823]: W1216 07:21:04.754934 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb89e60a8_db7c_4e57_ac6b_aab09909e7ad.slice/crio-4e4d5dd17d1ecff3f0e2e6c39ba3e503ce1648ac35eb89efe9517e0b2056f216 WatchSource:0}: Error finding container 4e4d5dd17d1ecff3f0e2e6c39ba3e503ce1648ac35eb89efe9517e0b2056f216: Status 404 returned error can't find the container with id 4e4d5dd17d1ecff3f0e2e6c39ba3e503ce1648ac35eb89efe9517e0b2056f216 Dec 16 07:21:04 crc kubenswrapper[4823]: I1216 07:21:04.757859 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 16 07:21:04 crc kubenswrapper[4823]: I1216 07:21:04.856835 4823 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 16 07:21:04 crc kubenswrapper[4823]: I1216 07:21:04.857637 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 16 07:21:04 crc kubenswrapper[4823]: I1216 07:21:04.857870 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 16 07:21:04 crc kubenswrapper[4823]: I1216 07:21:04.860891 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 16 07:21:05 crc kubenswrapper[4823]: I1216 07:21:05.676157 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"dbb285b0-26ce-494d-9d69-8fe905e39469","Type":"ContainerStarted","Data":"dcb6ee461f8c315b99af0cef59bee6ad1bc80844d030f1935cac757ed7544094"} Dec 16 07:21:05 crc kubenswrapper[4823]: I1216 07:21:05.677760 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d3c5cbc0-9fd6-4c23-8837-847571047381","Type":"ContainerStarted","Data":"63b2b4c9726db8bccf13e10770be4dd99f89d0be67786afc82c63b099d5cac82"} Dec 16 07:21:05 crc kubenswrapper[4823]: I1216 07:21:05.677794 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d3c5cbc0-9fd6-4c23-8837-847571047381","Type":"ContainerStarted","Data":"86dc605d5320b0cf7dece0664ba254d367fc4250174be36a4a55984e4cbb6cb2"} Dec 16 07:21:05 crc kubenswrapper[4823]: I1216 07:21:05.679782 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b89e60a8-db7c-4e57-ac6b-aab09909e7ad","Type":"ContainerStarted","Data":"c62da01b313746c6d90969373bce73cebad0d872877bc5e5c836f8ce324d682f"} Dec 16 07:21:05 crc kubenswrapper[4823]: I1216 07:21:05.679813 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"b89e60a8-db7c-4e57-ac6b-aab09909e7ad","Type":"ContainerStarted","Data":"01c19397046d515a2341974fd6714585936ff05125e45b62837c4191dd2de677"} Dec 16 07:21:05 crc kubenswrapper[4823]: I1216 07:21:05.679826 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b89e60a8-db7c-4e57-ac6b-aab09909e7ad","Type":"ContainerStarted","Data":"4e4d5dd17d1ecff3f0e2e6c39ba3e503ce1648ac35eb89efe9517e0b2056f216"} Dec 16 07:21:05 crc kubenswrapper[4823]: I1216 07:21:05.680057 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 16 07:21:05 crc kubenswrapper[4823]: I1216 07:21:05.686311 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 16 07:21:05 crc kubenswrapper[4823]: I1216 07:21:05.696074 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.696055456 podStartE2EDuration="2.696055456s" podCreationTimestamp="2025-12-16 07:21:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 07:21:05.691280156 +0000 UTC m=+1544.179846289" watchObservedRunningTime="2025-12-16 07:21:05.696055456 +0000 UTC m=+1544.184621569" Dec 16 07:21:05 crc kubenswrapper[4823]: I1216 07:21:05.738353 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.73832874 podStartE2EDuration="2.73832874s" podCreationTimestamp="2025-12-16 07:21:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 07:21:05.72906103 +0000 UTC m=+1544.217627433" watchObservedRunningTime="2025-12-16 07:21:05.73832874 +0000 UTC m=+1544.226894863" Dec 16 07:21:05 crc kubenswrapper[4823]: I1216 07:21:05.815322 4823 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="05f31fcf-de71-43dc-a94c-11ef850cf1e4" path="/var/lib/kubelet/pods/05f31fcf-de71-43dc-a94c-11ef850cf1e4/volumes" Dec 16 07:21:05 crc kubenswrapper[4823]: I1216 07:21:05.816103 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41b2b49f-acfe-4019-a983-c9cea9de4378" path="/var/lib/kubelet/pods/41b2b49f-acfe-4019-a983-c9cea9de4378/volumes" Dec 16 07:21:06 crc kubenswrapper[4823]: I1216 07:21:06.035580 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-fcd6f8f8f-l8nbv"] Dec 16 07:21:06 crc kubenswrapper[4823]: I1216 07:21:06.052211 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-fcd6f8f8f-l8nbv" Dec 16 07:21:06 crc kubenswrapper[4823]: I1216 07:21:06.063569 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-fcd6f8f8f-l8nbv"] Dec 16 07:21:06 crc kubenswrapper[4823]: I1216 07:21:06.195075 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4795acd-bc9b-4c2c-aaa2-feb41c3c491f-config\") pod \"dnsmasq-dns-fcd6f8f8f-l8nbv\" (UID: \"c4795acd-bc9b-4c2c-aaa2-feb41c3c491f\") " pod="openstack/dnsmasq-dns-fcd6f8f8f-l8nbv" Dec 16 07:21:06 crc kubenswrapper[4823]: I1216 07:21:06.195394 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c4795acd-bc9b-4c2c-aaa2-feb41c3c491f-ovsdbserver-sb\") pod \"dnsmasq-dns-fcd6f8f8f-l8nbv\" (UID: \"c4795acd-bc9b-4c2c-aaa2-feb41c3c491f\") " pod="openstack/dnsmasq-dns-fcd6f8f8f-l8nbv" Dec 16 07:21:06 crc kubenswrapper[4823]: I1216 07:21:06.195438 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vmkg\" (UniqueName: \"kubernetes.io/projected/c4795acd-bc9b-4c2c-aaa2-feb41c3c491f-kube-api-access-4vmkg\") pod 
\"dnsmasq-dns-fcd6f8f8f-l8nbv\" (UID: \"c4795acd-bc9b-4c2c-aaa2-feb41c3c491f\") " pod="openstack/dnsmasq-dns-fcd6f8f8f-l8nbv" Dec 16 07:21:06 crc kubenswrapper[4823]: I1216 07:21:06.195545 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c4795acd-bc9b-4c2c-aaa2-feb41c3c491f-ovsdbserver-nb\") pod \"dnsmasq-dns-fcd6f8f8f-l8nbv\" (UID: \"c4795acd-bc9b-4c2c-aaa2-feb41c3c491f\") " pod="openstack/dnsmasq-dns-fcd6f8f8f-l8nbv" Dec 16 07:21:06 crc kubenswrapper[4823]: I1216 07:21:06.195582 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c4795acd-bc9b-4c2c-aaa2-feb41c3c491f-dns-swift-storage-0\") pod \"dnsmasq-dns-fcd6f8f8f-l8nbv\" (UID: \"c4795acd-bc9b-4c2c-aaa2-feb41c3c491f\") " pod="openstack/dnsmasq-dns-fcd6f8f8f-l8nbv" Dec 16 07:21:06 crc kubenswrapper[4823]: I1216 07:21:06.195784 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c4795acd-bc9b-4c2c-aaa2-feb41c3c491f-dns-svc\") pod \"dnsmasq-dns-fcd6f8f8f-l8nbv\" (UID: \"c4795acd-bc9b-4c2c-aaa2-feb41c3c491f\") " pod="openstack/dnsmasq-dns-fcd6f8f8f-l8nbv" Dec 16 07:21:06 crc kubenswrapper[4823]: I1216 07:21:06.297364 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c4795acd-bc9b-4c2c-aaa2-feb41c3c491f-dns-svc\") pod \"dnsmasq-dns-fcd6f8f8f-l8nbv\" (UID: \"c4795acd-bc9b-4c2c-aaa2-feb41c3c491f\") " pod="openstack/dnsmasq-dns-fcd6f8f8f-l8nbv" Dec 16 07:21:06 crc kubenswrapper[4823]: I1216 07:21:06.297438 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4795acd-bc9b-4c2c-aaa2-feb41c3c491f-config\") pod \"dnsmasq-dns-fcd6f8f8f-l8nbv\" (UID: 
\"c4795acd-bc9b-4c2c-aaa2-feb41c3c491f\") " pod="openstack/dnsmasq-dns-fcd6f8f8f-l8nbv" Dec 16 07:21:06 crc kubenswrapper[4823]: I1216 07:21:06.297465 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c4795acd-bc9b-4c2c-aaa2-feb41c3c491f-ovsdbserver-sb\") pod \"dnsmasq-dns-fcd6f8f8f-l8nbv\" (UID: \"c4795acd-bc9b-4c2c-aaa2-feb41c3c491f\") " pod="openstack/dnsmasq-dns-fcd6f8f8f-l8nbv" Dec 16 07:21:06 crc kubenswrapper[4823]: I1216 07:21:06.297491 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vmkg\" (UniqueName: \"kubernetes.io/projected/c4795acd-bc9b-4c2c-aaa2-feb41c3c491f-kube-api-access-4vmkg\") pod \"dnsmasq-dns-fcd6f8f8f-l8nbv\" (UID: \"c4795acd-bc9b-4c2c-aaa2-feb41c3c491f\") " pod="openstack/dnsmasq-dns-fcd6f8f8f-l8nbv" Dec 16 07:21:06 crc kubenswrapper[4823]: I1216 07:21:06.297547 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c4795acd-bc9b-4c2c-aaa2-feb41c3c491f-ovsdbserver-nb\") pod \"dnsmasq-dns-fcd6f8f8f-l8nbv\" (UID: \"c4795acd-bc9b-4c2c-aaa2-feb41c3c491f\") " pod="openstack/dnsmasq-dns-fcd6f8f8f-l8nbv" Dec 16 07:21:06 crc kubenswrapper[4823]: I1216 07:21:06.297579 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c4795acd-bc9b-4c2c-aaa2-feb41c3c491f-dns-swift-storage-0\") pod \"dnsmasq-dns-fcd6f8f8f-l8nbv\" (UID: \"c4795acd-bc9b-4c2c-aaa2-feb41c3c491f\") " pod="openstack/dnsmasq-dns-fcd6f8f8f-l8nbv" Dec 16 07:21:06 crc kubenswrapper[4823]: I1216 07:21:06.298310 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c4795acd-bc9b-4c2c-aaa2-feb41c3c491f-dns-svc\") pod \"dnsmasq-dns-fcd6f8f8f-l8nbv\" (UID: \"c4795acd-bc9b-4c2c-aaa2-feb41c3c491f\") " 
pod="openstack/dnsmasq-dns-fcd6f8f8f-l8nbv" Dec 16 07:21:06 crc kubenswrapper[4823]: I1216 07:21:06.298600 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c4795acd-bc9b-4c2c-aaa2-feb41c3c491f-ovsdbserver-nb\") pod \"dnsmasq-dns-fcd6f8f8f-l8nbv\" (UID: \"c4795acd-bc9b-4c2c-aaa2-feb41c3c491f\") " pod="openstack/dnsmasq-dns-fcd6f8f8f-l8nbv" Dec 16 07:21:06 crc kubenswrapper[4823]: I1216 07:21:06.298809 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c4795acd-bc9b-4c2c-aaa2-feb41c3c491f-dns-swift-storage-0\") pod \"dnsmasq-dns-fcd6f8f8f-l8nbv\" (UID: \"c4795acd-bc9b-4c2c-aaa2-feb41c3c491f\") " pod="openstack/dnsmasq-dns-fcd6f8f8f-l8nbv" Dec 16 07:21:06 crc kubenswrapper[4823]: I1216 07:21:06.298946 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4795acd-bc9b-4c2c-aaa2-feb41c3c491f-config\") pod \"dnsmasq-dns-fcd6f8f8f-l8nbv\" (UID: \"c4795acd-bc9b-4c2c-aaa2-feb41c3c491f\") " pod="openstack/dnsmasq-dns-fcd6f8f8f-l8nbv" Dec 16 07:21:06 crc kubenswrapper[4823]: I1216 07:21:06.298981 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c4795acd-bc9b-4c2c-aaa2-feb41c3c491f-ovsdbserver-sb\") pod \"dnsmasq-dns-fcd6f8f8f-l8nbv\" (UID: \"c4795acd-bc9b-4c2c-aaa2-feb41c3c491f\") " pod="openstack/dnsmasq-dns-fcd6f8f8f-l8nbv" Dec 16 07:21:06 crc kubenswrapper[4823]: I1216 07:21:06.319730 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vmkg\" (UniqueName: \"kubernetes.io/projected/c4795acd-bc9b-4c2c-aaa2-feb41c3c491f-kube-api-access-4vmkg\") pod \"dnsmasq-dns-fcd6f8f8f-l8nbv\" (UID: \"c4795acd-bc9b-4c2c-aaa2-feb41c3c491f\") " pod="openstack/dnsmasq-dns-fcd6f8f8f-l8nbv" Dec 16 07:21:06 crc kubenswrapper[4823]: I1216 
07:21:06.379975 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-fcd6f8f8f-l8nbv" Dec 16 07:21:06 crc kubenswrapper[4823]: I1216 07:21:06.889375 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-fcd6f8f8f-l8nbv"] Dec 16 07:21:06 crc kubenswrapper[4823]: W1216 07:21:06.896144 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc4795acd_bc9b_4c2c_aaa2_feb41c3c491f.slice/crio-5ef6f083b75b8234ee95ab33fb173e5b23c5618094d3c17a0fd61db492a224b3 WatchSource:0}: Error finding container 5ef6f083b75b8234ee95ab33fb173e5b23c5618094d3c17a0fd61db492a224b3: Status 404 returned error can't find the container with id 5ef6f083b75b8234ee95ab33fb173e5b23c5618094d3c17a0fd61db492a224b3 Dec 16 07:21:06 crc kubenswrapper[4823]: I1216 07:21:06.981435 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 16 07:21:07 crc kubenswrapper[4823]: I1216 07:21:07.179914 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-nbkzm"] Dec 16 07:21:07 crc kubenswrapper[4823]: I1216 07:21:07.185117 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nbkzm" Dec 16 07:21:07 crc kubenswrapper[4823]: I1216 07:21:07.223083 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nbkzm"] Dec 16 07:21:07 crc kubenswrapper[4823]: I1216 07:21:07.323448 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f3c3ac16-0b4e-4828-a690-d740851a5ede-utilities\") pod \"redhat-marketplace-nbkzm\" (UID: \"f3c3ac16-0b4e-4828-a690-d740851a5ede\") " pod="openshift-marketplace/redhat-marketplace-nbkzm" Dec 16 07:21:07 crc kubenswrapper[4823]: I1216 07:21:07.323587 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gznv8\" (UniqueName: \"kubernetes.io/projected/f3c3ac16-0b4e-4828-a690-d740851a5ede-kube-api-access-gznv8\") pod \"redhat-marketplace-nbkzm\" (UID: \"f3c3ac16-0b4e-4828-a690-d740851a5ede\") " pod="openshift-marketplace/redhat-marketplace-nbkzm" Dec 16 07:21:07 crc kubenswrapper[4823]: I1216 07:21:07.323621 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f3c3ac16-0b4e-4828-a690-d740851a5ede-catalog-content\") pod \"redhat-marketplace-nbkzm\" (UID: \"f3c3ac16-0b4e-4828-a690-d740851a5ede\") " pod="openshift-marketplace/redhat-marketplace-nbkzm" Dec 16 07:21:07 crc kubenswrapper[4823]: I1216 07:21:07.424860 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f3c3ac16-0b4e-4828-a690-d740851a5ede-utilities\") pod \"redhat-marketplace-nbkzm\" (UID: \"f3c3ac16-0b4e-4828-a690-d740851a5ede\") " pod="openshift-marketplace/redhat-marketplace-nbkzm" Dec 16 07:21:07 crc kubenswrapper[4823]: I1216 07:21:07.425209 4823 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f3c3ac16-0b4e-4828-a690-d740851a5ede-catalog-content\") pod \"redhat-marketplace-nbkzm\" (UID: \"f3c3ac16-0b4e-4828-a690-d740851a5ede\") " pod="openshift-marketplace/redhat-marketplace-nbkzm" Dec 16 07:21:07 crc kubenswrapper[4823]: I1216 07:21:07.425230 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gznv8\" (UniqueName: \"kubernetes.io/projected/f3c3ac16-0b4e-4828-a690-d740851a5ede-kube-api-access-gznv8\") pod \"redhat-marketplace-nbkzm\" (UID: \"f3c3ac16-0b4e-4828-a690-d740851a5ede\") " pod="openshift-marketplace/redhat-marketplace-nbkzm" Dec 16 07:21:07 crc kubenswrapper[4823]: I1216 07:21:07.425567 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f3c3ac16-0b4e-4828-a690-d740851a5ede-catalog-content\") pod \"redhat-marketplace-nbkzm\" (UID: \"f3c3ac16-0b4e-4828-a690-d740851a5ede\") " pod="openshift-marketplace/redhat-marketplace-nbkzm" Dec 16 07:21:07 crc kubenswrapper[4823]: I1216 07:21:07.425746 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f3c3ac16-0b4e-4828-a690-d740851a5ede-utilities\") pod \"redhat-marketplace-nbkzm\" (UID: \"f3c3ac16-0b4e-4828-a690-d740851a5ede\") " pod="openshift-marketplace/redhat-marketplace-nbkzm" Dec 16 07:21:07 crc kubenswrapper[4823]: I1216 07:21:07.448472 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gznv8\" (UniqueName: \"kubernetes.io/projected/f3c3ac16-0b4e-4828-a690-d740851a5ede-kube-api-access-gznv8\") pod \"redhat-marketplace-nbkzm\" (UID: \"f3c3ac16-0b4e-4828-a690-d740851a5ede\") " pod="openshift-marketplace/redhat-marketplace-nbkzm" Dec 16 07:21:07 crc kubenswrapper[4823]: I1216 07:21:07.510914 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nbkzm" Dec 16 07:21:07 crc kubenswrapper[4823]: I1216 07:21:07.717237 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d3c5cbc0-9fd6-4c23-8837-847571047381","Type":"ContainerStarted","Data":"2db5ed303f81f27607275d887547c0c5a26a3dab90193c23d7bd8f2f006b2438"} Dec 16 07:21:07 crc kubenswrapper[4823]: I1216 07:21:07.717581 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d3c5cbc0-9fd6-4c23-8837-847571047381","Type":"ContainerStarted","Data":"8864e3445a28302e064ecf862d77ff42768e3f4454468b006beac2489abd068f"} Dec 16 07:21:07 crc kubenswrapper[4823]: I1216 07:21:07.734754 4823 generic.go:334] "Generic (PLEG): container finished" podID="c4795acd-bc9b-4c2c-aaa2-feb41c3c491f" containerID="bbb61b83c03517ab496d3469eca7132d7dd7639ebb3875043aeecd6b0de352ca" exitCode=0 Dec 16 07:21:07 crc kubenswrapper[4823]: I1216 07:21:07.735629 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fcd6f8f8f-l8nbv" event={"ID":"c4795acd-bc9b-4c2c-aaa2-feb41c3c491f","Type":"ContainerDied","Data":"bbb61b83c03517ab496d3469eca7132d7dd7639ebb3875043aeecd6b0de352ca"} Dec 16 07:21:07 crc kubenswrapper[4823]: I1216 07:21:07.735659 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fcd6f8f8f-l8nbv" event={"ID":"c4795acd-bc9b-4c2c-aaa2-feb41c3c491f","Type":"ContainerStarted","Data":"5ef6f083b75b8234ee95ab33fb173e5b23c5618094d3c17a0fd61db492a224b3"} Dec 16 07:21:08 crc kubenswrapper[4823]: I1216 07:21:08.079116 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nbkzm"] Dec 16 07:21:08 crc kubenswrapper[4823]: I1216 07:21:08.750553 4823 generic.go:334] "Generic (PLEG): container finished" podID="f3c3ac16-0b4e-4828-a690-d740851a5ede" containerID="cf8f734bce5534b0f96ac6bc74e7e9dd0f06c78f5c8679805b3d3ee551f026a4" exitCode=0 Dec 16 
07:21:08 crc kubenswrapper[4823]: I1216 07:21:08.750659 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nbkzm" event={"ID":"f3c3ac16-0b4e-4828-a690-d740851a5ede","Type":"ContainerDied","Data":"cf8f734bce5534b0f96ac6bc74e7e9dd0f06c78f5c8679805b3d3ee551f026a4"} Dec 16 07:21:08 crc kubenswrapper[4823]: I1216 07:21:08.750854 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nbkzm" event={"ID":"f3c3ac16-0b4e-4828-a690-d740851a5ede","Type":"ContainerStarted","Data":"00d6d4c03efb48006ad6e3c15109b45e1b152125db3d7753443c64c2e7c980ad"} Dec 16 07:21:08 crc kubenswrapper[4823]: I1216 07:21:08.757376 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fcd6f8f8f-l8nbv" event={"ID":"c4795acd-bc9b-4c2c-aaa2-feb41c3c491f","Type":"ContainerStarted","Data":"14f9aa7c5d7c0e6bdf53c979b009b546f44e6652421ca6154616d807431fa6e2"} Dec 16 07:21:08 crc kubenswrapper[4823]: I1216 07:21:08.757584 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-fcd6f8f8f-l8nbv" Dec 16 07:21:08 crc kubenswrapper[4823]: I1216 07:21:08.829597 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-fcd6f8f8f-l8nbv" podStartSLOduration=3.829571738 podStartE2EDuration="3.829571738s" podCreationTimestamp="2025-12-16 07:21:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 07:21:08.815467536 +0000 UTC m=+1547.304033679" watchObservedRunningTime="2025-12-16 07:21:08.829571738 +0000 UTC m=+1547.318137861" Dec 16 07:21:09 crc kubenswrapper[4823]: I1216 07:21:09.084655 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 16 07:21:09 crc kubenswrapper[4823]: I1216 07:21:09.112680 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/ceilometer-0"] Dec 16 07:21:09 crc kubenswrapper[4823]: I1216 07:21:09.161399 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 16 07:21:09 crc kubenswrapper[4823]: I1216 07:21:09.161441 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 16 07:21:09 crc kubenswrapper[4823]: I1216 07:21:09.461608 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 16 07:21:09 crc kubenswrapper[4823]: I1216 07:21:09.461835 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="d20d7e4d-f319-43f8-bf31-87b114fa7517" containerName="nova-api-log" containerID="cri-o://7077901ca745fda9f1511b3ad5a0907f42f5d61f05e06175d316699713e8a2e8" gracePeriod=30 Dec 16 07:21:09 crc kubenswrapper[4823]: I1216 07:21:09.461936 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="d20d7e4d-f319-43f8-bf31-87b114fa7517" containerName="nova-api-api" containerID="cri-o://898f21005519e2d3aa8dec31a7bf05da98a85c82a0b9743d0195cdf75205f2a5" gracePeriod=30 Dec 16 07:21:09 crc kubenswrapper[4823]: I1216 07:21:09.771103 4823 generic.go:334] "Generic (PLEG): container finished" podID="d20d7e4d-f319-43f8-bf31-87b114fa7517" containerID="7077901ca745fda9f1511b3ad5a0907f42f5d61f05e06175d316699713e8a2e8" exitCode=143 Dec 16 07:21:09 crc kubenswrapper[4823]: I1216 07:21:09.771552 4823 scope.go:117] "RemoveContainer" containerID="37b5da4c3e0632087412acf947c72a2aad7577385641e763185ee25747c43921" Dec 16 07:21:09 crc kubenswrapper[4823]: E1216 07:21:09.771798 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 07:21:09 crc kubenswrapper[4823]: I1216 07:21:09.775341 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d3c5cbc0-9fd6-4c23-8837-847571047381" containerName="ceilometer-central-agent" containerID="cri-o://63b2b4c9726db8bccf13e10770be4dd99f89d0be67786afc82c63b099d5cac82" gracePeriod=30 Dec 16 07:21:09 crc kubenswrapper[4823]: I1216 07:21:09.775416 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d3c5cbc0-9fd6-4c23-8837-847571047381" containerName="ceilometer-notification-agent" containerID="cri-o://2db5ed303f81f27607275d887547c0c5a26a3dab90193c23d7bd8f2f006b2438" gracePeriod=30 Dec 16 07:21:09 crc kubenswrapper[4823]: I1216 07:21:09.775443 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d3c5cbc0-9fd6-4c23-8837-847571047381" containerName="sg-core" containerID="cri-o://8864e3445a28302e064ecf862d77ff42768e3f4454468b006beac2489abd068f" gracePeriod=30 Dec 16 07:21:09 crc kubenswrapper[4823]: I1216 07:21:09.775484 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d3c5cbc0-9fd6-4c23-8837-847571047381" containerName="proxy-httpd" containerID="cri-o://02d6ce908322bb130482f39ef6c9936af2a30bfa263eb15901a484280b9f2d6d" gracePeriod=30 Dec 16 07:21:09 crc kubenswrapper[4823]: I1216 07:21:09.786936 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 16 07:21:09 crc kubenswrapper[4823]: I1216 07:21:09.786961 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"d20d7e4d-f319-43f8-bf31-87b114fa7517","Type":"ContainerDied","Data":"7077901ca745fda9f1511b3ad5a0907f42f5d61f05e06175d316699713e8a2e8"} Dec 16 07:21:09 crc kubenswrapper[4823]: I1216 07:21:09.786979 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d3c5cbc0-9fd6-4c23-8837-847571047381","Type":"ContainerStarted","Data":"02d6ce908322bb130482f39ef6c9936af2a30bfa263eb15901a484280b9f2d6d"} Dec 16 07:21:09 crc kubenswrapper[4823]: I1216 07:21:09.798735 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.542058482 podStartE2EDuration="6.798716012s" podCreationTimestamp="2025-12-16 07:21:03 +0000 UTC" firstStartedPulling="2025-12-16 07:21:04.742397307 +0000 UTC m=+1543.230963430" lastFinishedPulling="2025-12-16 07:21:08.999054837 +0000 UTC m=+1547.487620960" observedRunningTime="2025-12-16 07:21:09.795571884 +0000 UTC m=+1548.284138007" watchObservedRunningTime="2025-12-16 07:21:09.798716012 +0000 UTC m=+1548.287282135" Dec 16 07:21:10 crc kubenswrapper[4823]: I1216 07:21:10.801980 4823 generic.go:334] "Generic (PLEG): container finished" podID="f3c3ac16-0b4e-4828-a690-d740851a5ede" containerID="b0c08a304ae44e404b62d95b3f8e0782bf668f61838491bceabd461a64ff5057" exitCode=0 Dec 16 07:21:10 crc kubenswrapper[4823]: I1216 07:21:10.802039 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nbkzm" event={"ID":"f3c3ac16-0b4e-4828-a690-d740851a5ede","Type":"ContainerDied","Data":"b0c08a304ae44e404b62d95b3f8e0782bf668f61838491bceabd461a64ff5057"} Dec 16 07:21:10 crc kubenswrapper[4823]: I1216 07:21:10.809796 4823 generic.go:334] "Generic (PLEG): container finished" podID="d3c5cbc0-9fd6-4c23-8837-847571047381" containerID="02d6ce908322bb130482f39ef6c9936af2a30bfa263eb15901a484280b9f2d6d" exitCode=0 Dec 16 07:21:10 crc kubenswrapper[4823]: I1216 07:21:10.809841 4823 generic.go:334] "Generic (PLEG): 
container finished" podID="d3c5cbc0-9fd6-4c23-8837-847571047381" containerID="8864e3445a28302e064ecf862d77ff42768e3f4454468b006beac2489abd068f" exitCode=2 Dec 16 07:21:10 crc kubenswrapper[4823]: I1216 07:21:10.809855 4823 generic.go:334] "Generic (PLEG): container finished" podID="d3c5cbc0-9fd6-4c23-8837-847571047381" containerID="2db5ed303f81f27607275d887547c0c5a26a3dab90193c23d7bd8f2f006b2438" exitCode=0 Dec 16 07:21:10 crc kubenswrapper[4823]: I1216 07:21:10.809845 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d3c5cbc0-9fd6-4c23-8837-847571047381","Type":"ContainerDied","Data":"02d6ce908322bb130482f39ef6c9936af2a30bfa263eb15901a484280b9f2d6d"} Dec 16 07:21:10 crc kubenswrapper[4823]: I1216 07:21:10.809904 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d3c5cbc0-9fd6-4c23-8837-847571047381","Type":"ContainerDied","Data":"8864e3445a28302e064ecf862d77ff42768e3f4454468b006beac2489abd068f"} Dec 16 07:21:10 crc kubenswrapper[4823]: I1216 07:21:10.809925 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d3c5cbc0-9fd6-4c23-8837-847571047381","Type":"ContainerDied","Data":"2db5ed303f81f27607275d887547c0c5a26a3dab90193c23d7bd8f2f006b2438"} Dec 16 07:21:11 crc kubenswrapper[4823]: I1216 07:21:11.821779 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nbkzm" event={"ID":"f3c3ac16-0b4e-4828-a690-d740851a5ede","Type":"ContainerStarted","Data":"3ee47c044cad56d8a1c60f1e13ea55ebcb0648fb880d3fb0ef6ecc7bff1e91fb"} Dec 16 07:21:11 crc kubenswrapper[4823]: I1216 07:21:11.858367 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-nbkzm" podStartSLOduration=2.311934936 podStartE2EDuration="4.85833905s" podCreationTimestamp="2025-12-16 07:21:07 +0000 UTC" firstStartedPulling="2025-12-16 07:21:08.752764223 +0000 UTC 
m=+1547.241330346" lastFinishedPulling="2025-12-16 07:21:11.299168337 +0000 UTC m=+1549.787734460" observedRunningTime="2025-12-16 07:21:11.837504508 +0000 UTC m=+1550.326070651" watchObservedRunningTime="2025-12-16 07:21:11.85833905 +0000 UTC m=+1550.346905163" Dec 16 07:21:12 crc kubenswrapper[4823]: I1216 07:21:12.856521 4823 generic.go:334] "Generic (PLEG): container finished" podID="d20d7e4d-f319-43f8-bf31-87b114fa7517" containerID="898f21005519e2d3aa8dec31a7bf05da98a85c82a0b9743d0195cdf75205f2a5" exitCode=0 Dec 16 07:21:12 crc kubenswrapper[4823]: I1216 07:21:12.859057 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d20d7e4d-f319-43f8-bf31-87b114fa7517","Type":"ContainerDied","Data":"898f21005519e2d3aa8dec31a7bf05da98a85c82a0b9743d0195cdf75205f2a5"} Dec 16 07:21:13 crc kubenswrapper[4823]: I1216 07:21:13.130202 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 16 07:21:13 crc kubenswrapper[4823]: I1216 07:21:13.253816 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fxxdl\" (UniqueName: \"kubernetes.io/projected/d20d7e4d-f319-43f8-bf31-87b114fa7517-kube-api-access-fxxdl\") pod \"d20d7e4d-f319-43f8-bf31-87b114fa7517\" (UID: \"d20d7e4d-f319-43f8-bf31-87b114fa7517\") " Dec 16 07:21:13 crc kubenswrapper[4823]: I1216 07:21:13.253959 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d20d7e4d-f319-43f8-bf31-87b114fa7517-config-data\") pod \"d20d7e4d-f319-43f8-bf31-87b114fa7517\" (UID: \"d20d7e4d-f319-43f8-bf31-87b114fa7517\") " Dec 16 07:21:13 crc kubenswrapper[4823]: I1216 07:21:13.254136 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d20d7e4d-f319-43f8-bf31-87b114fa7517-combined-ca-bundle\") pod 
\"d20d7e4d-f319-43f8-bf31-87b114fa7517\" (UID: \"d20d7e4d-f319-43f8-bf31-87b114fa7517\") " Dec 16 07:21:13 crc kubenswrapper[4823]: I1216 07:21:13.254212 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d20d7e4d-f319-43f8-bf31-87b114fa7517-logs\") pod \"d20d7e4d-f319-43f8-bf31-87b114fa7517\" (UID: \"d20d7e4d-f319-43f8-bf31-87b114fa7517\") " Dec 16 07:21:13 crc kubenswrapper[4823]: I1216 07:21:13.254713 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d20d7e4d-f319-43f8-bf31-87b114fa7517-logs" (OuterVolumeSpecName: "logs") pod "d20d7e4d-f319-43f8-bf31-87b114fa7517" (UID: "d20d7e4d-f319-43f8-bf31-87b114fa7517"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:21:13 crc kubenswrapper[4823]: I1216 07:21:13.260412 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d20d7e4d-f319-43f8-bf31-87b114fa7517-kube-api-access-fxxdl" (OuterVolumeSpecName: "kube-api-access-fxxdl") pod "d20d7e4d-f319-43f8-bf31-87b114fa7517" (UID: "d20d7e4d-f319-43f8-bf31-87b114fa7517"). InnerVolumeSpecName "kube-api-access-fxxdl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:21:13 crc kubenswrapper[4823]: I1216 07:21:13.290654 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d20d7e4d-f319-43f8-bf31-87b114fa7517-config-data" (OuterVolumeSpecName: "config-data") pod "d20d7e4d-f319-43f8-bf31-87b114fa7517" (UID: "d20d7e4d-f319-43f8-bf31-87b114fa7517"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:21:13 crc kubenswrapper[4823]: I1216 07:21:13.294482 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d20d7e4d-f319-43f8-bf31-87b114fa7517-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d20d7e4d-f319-43f8-bf31-87b114fa7517" (UID: "d20d7e4d-f319-43f8-bf31-87b114fa7517"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:21:13 crc kubenswrapper[4823]: I1216 07:21:13.356690 4823 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d20d7e4d-f319-43f8-bf31-87b114fa7517-logs\") on node \"crc\" DevicePath \"\"" Dec 16 07:21:13 crc kubenswrapper[4823]: I1216 07:21:13.356730 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fxxdl\" (UniqueName: \"kubernetes.io/projected/d20d7e4d-f319-43f8-bf31-87b114fa7517-kube-api-access-fxxdl\") on node \"crc\" DevicePath \"\"" Dec 16 07:21:13 crc kubenswrapper[4823]: I1216 07:21:13.356742 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d20d7e4d-f319-43f8-bf31-87b114fa7517-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 07:21:13 crc kubenswrapper[4823]: I1216 07:21:13.356752 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d20d7e4d-f319-43f8-bf31-87b114fa7517-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:21:13 crc kubenswrapper[4823]: I1216 07:21:13.873367 4823 generic.go:334] "Generic (PLEG): container finished" podID="d3c5cbc0-9fd6-4c23-8837-847571047381" containerID="63b2b4c9726db8bccf13e10770be4dd99f89d0be67786afc82c63b099d5cac82" exitCode=0 Dec 16 07:21:13 crc kubenswrapper[4823]: I1216 07:21:13.873424 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"d3c5cbc0-9fd6-4c23-8837-847571047381","Type":"ContainerDied","Data":"63b2b4c9726db8bccf13e10770be4dd99f89d0be67786afc82c63b099d5cac82"} Dec 16 07:21:13 crc kubenswrapper[4823]: I1216 07:21:13.875784 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d20d7e4d-f319-43f8-bf31-87b114fa7517","Type":"ContainerDied","Data":"ef012e164506db665546b697cf2a092893ebdf336790a65fb79e95034bf3039b"} Dec 16 07:21:13 crc kubenswrapper[4823]: I1216 07:21:13.875820 4823 scope.go:117] "RemoveContainer" containerID="898f21005519e2d3aa8dec31a7bf05da98a85c82a0b9743d0195cdf75205f2a5" Dec 16 07:21:13 crc kubenswrapper[4823]: I1216 07:21:13.875966 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 16 07:21:13 crc kubenswrapper[4823]: I1216 07:21:13.901101 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 16 07:21:13 crc kubenswrapper[4823]: I1216 07:21:13.908832 4823 scope.go:117] "RemoveContainer" containerID="7077901ca745fda9f1511b3ad5a0907f42f5d61f05e06175d316699713e8a2e8" Dec 16 07:21:13 crc kubenswrapper[4823]: I1216 07:21:13.937830 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 16 07:21:13 crc kubenswrapper[4823]: I1216 07:21:13.965201 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 16 07:21:13 crc kubenswrapper[4823]: E1216 07:21:13.965829 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d20d7e4d-f319-43f8-bf31-87b114fa7517" containerName="nova-api-api" Dec 16 07:21:13 crc kubenswrapper[4823]: I1216 07:21:13.965921 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="d20d7e4d-f319-43f8-bf31-87b114fa7517" containerName="nova-api-api" Dec 16 07:21:13 crc kubenswrapper[4823]: E1216 07:21:13.965996 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d20d7e4d-f319-43f8-bf31-87b114fa7517" 
containerName="nova-api-log" Dec 16 07:21:13 crc kubenswrapper[4823]: I1216 07:21:13.966137 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="d20d7e4d-f319-43f8-bf31-87b114fa7517" containerName="nova-api-log" Dec 16 07:21:13 crc kubenswrapper[4823]: I1216 07:21:13.966460 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="d20d7e4d-f319-43f8-bf31-87b114fa7517" containerName="nova-api-log" Dec 16 07:21:13 crc kubenswrapper[4823]: I1216 07:21:13.966535 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="d20d7e4d-f319-43f8-bf31-87b114fa7517" containerName="nova-api-api" Dec 16 07:21:13 crc kubenswrapper[4823]: I1216 07:21:13.967595 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 16 07:21:13 crc kubenswrapper[4823]: I1216 07:21:13.969583 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 16 07:21:13 crc kubenswrapper[4823]: I1216 07:21:13.969793 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 16 07:21:13 crc kubenswrapper[4823]: I1216 07:21:13.970006 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 16 07:21:13 crc kubenswrapper[4823]: I1216 07:21:13.977357 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 16 07:21:14 crc kubenswrapper[4823]: I1216 07:21:14.020152 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 16 07:21:14 crc kubenswrapper[4823]: I1216 07:21:14.072712 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/541a384f-aa52-43ce-b85d-e3475ad49acd-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"541a384f-aa52-43ce-b85d-e3475ad49acd\") " pod="openstack/nova-api-0" Dec 16 07:21:14 crc kubenswrapper[4823]: I1216 07:21:14.072973 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/541a384f-aa52-43ce-b85d-e3475ad49acd-public-tls-certs\") pod \"nova-api-0\" (UID: \"541a384f-aa52-43ce-b85d-e3475ad49acd\") " pod="openstack/nova-api-0" Dec 16 07:21:14 crc kubenswrapper[4823]: I1216 07:21:14.073040 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/541a384f-aa52-43ce-b85d-e3475ad49acd-internal-tls-certs\") pod \"nova-api-0\" (UID: \"541a384f-aa52-43ce-b85d-e3475ad49acd\") " pod="openstack/nova-api-0" Dec 16 07:21:14 crc kubenswrapper[4823]: I1216 07:21:14.073073 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/541a384f-aa52-43ce-b85d-e3475ad49acd-config-data\") pod \"nova-api-0\" (UID: \"541a384f-aa52-43ce-b85d-e3475ad49acd\") " pod="openstack/nova-api-0" Dec 16 07:21:14 crc kubenswrapper[4823]: I1216 07:21:14.073109 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9qcp\" (UniqueName: \"kubernetes.io/projected/541a384f-aa52-43ce-b85d-e3475ad49acd-kube-api-access-z9qcp\") pod \"nova-api-0\" (UID: \"541a384f-aa52-43ce-b85d-e3475ad49acd\") " pod="openstack/nova-api-0" Dec 16 07:21:14 crc kubenswrapper[4823]: I1216 07:21:14.073187 4823 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/541a384f-aa52-43ce-b85d-e3475ad49acd-logs\") pod \"nova-api-0\" (UID: \"541a384f-aa52-43ce-b85d-e3475ad49acd\") " pod="openstack/nova-api-0" Dec 16 07:21:14 crc kubenswrapper[4823]: I1216 07:21:14.085305 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Dec 16 07:21:14 crc kubenswrapper[4823]: I1216 07:21:14.103621 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Dec 16 07:21:14 crc kubenswrapper[4823]: I1216 07:21:14.161235 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 16 07:21:14 crc kubenswrapper[4823]: I1216 07:21:14.161309 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 16 07:21:14 crc kubenswrapper[4823]: I1216 07:21:14.174855 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d3c5cbc0-9fd6-4c23-8837-847571047381-sg-core-conf-yaml\") pod \"d3c5cbc0-9fd6-4c23-8837-847571047381\" (UID: \"d3c5cbc0-9fd6-4c23-8837-847571047381\") " Dec 16 07:21:14 crc kubenswrapper[4823]: I1216 07:21:14.174896 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3c5cbc0-9fd6-4c23-8837-847571047381-ceilometer-tls-certs\") pod \"d3c5cbc0-9fd6-4c23-8837-847571047381\" (UID: \"d3c5cbc0-9fd6-4c23-8837-847571047381\") " Dec 16 07:21:14 crc kubenswrapper[4823]: I1216 07:21:14.174933 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3c5cbc0-9fd6-4c23-8837-847571047381-config-data\") pod \"d3c5cbc0-9fd6-4c23-8837-847571047381\" 
(UID: \"d3c5cbc0-9fd6-4c23-8837-847571047381\") " Dec 16 07:21:14 crc kubenswrapper[4823]: I1216 07:21:14.175073 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d3c5cbc0-9fd6-4c23-8837-847571047381-log-httpd\") pod \"d3c5cbc0-9fd6-4c23-8837-847571047381\" (UID: \"d3c5cbc0-9fd6-4c23-8837-847571047381\") " Dec 16 07:21:14 crc kubenswrapper[4823]: I1216 07:21:14.175115 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3c5cbc0-9fd6-4c23-8837-847571047381-combined-ca-bundle\") pod \"d3c5cbc0-9fd6-4c23-8837-847571047381\" (UID: \"d3c5cbc0-9fd6-4c23-8837-847571047381\") " Dec 16 07:21:14 crc kubenswrapper[4823]: I1216 07:21:14.175235 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3c5cbc0-9fd6-4c23-8837-847571047381-scripts\") pod \"d3c5cbc0-9fd6-4c23-8837-847571047381\" (UID: \"d3c5cbc0-9fd6-4c23-8837-847571047381\") " Dec 16 07:21:14 crc kubenswrapper[4823]: I1216 07:21:14.175289 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4n6ld\" (UniqueName: \"kubernetes.io/projected/d3c5cbc0-9fd6-4c23-8837-847571047381-kube-api-access-4n6ld\") pod \"d3c5cbc0-9fd6-4c23-8837-847571047381\" (UID: \"d3c5cbc0-9fd6-4c23-8837-847571047381\") " Dec 16 07:21:14 crc kubenswrapper[4823]: I1216 07:21:14.175312 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d3c5cbc0-9fd6-4c23-8837-847571047381-run-httpd\") pod \"d3c5cbc0-9fd6-4c23-8837-847571047381\" (UID: \"d3c5cbc0-9fd6-4c23-8837-847571047381\") " Dec 16 07:21:14 crc kubenswrapper[4823]: I1216 07:21:14.175540 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/541a384f-aa52-43ce-b85d-e3475ad49acd-internal-tls-certs\") pod \"nova-api-0\" (UID: \"541a384f-aa52-43ce-b85d-e3475ad49acd\") " pod="openstack/nova-api-0" Dec 16 07:21:14 crc kubenswrapper[4823]: I1216 07:21:14.175604 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/541a384f-aa52-43ce-b85d-e3475ad49acd-config-data\") pod \"nova-api-0\" (UID: \"541a384f-aa52-43ce-b85d-e3475ad49acd\") " pod="openstack/nova-api-0" Dec 16 07:21:14 crc kubenswrapper[4823]: I1216 07:21:14.175643 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9qcp\" (UniqueName: \"kubernetes.io/projected/541a384f-aa52-43ce-b85d-e3475ad49acd-kube-api-access-z9qcp\") pod \"nova-api-0\" (UID: \"541a384f-aa52-43ce-b85d-e3475ad49acd\") " pod="openstack/nova-api-0" Dec 16 07:21:14 crc kubenswrapper[4823]: I1216 07:21:14.175732 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/541a384f-aa52-43ce-b85d-e3475ad49acd-logs\") pod \"nova-api-0\" (UID: \"541a384f-aa52-43ce-b85d-e3475ad49acd\") " pod="openstack/nova-api-0" Dec 16 07:21:14 crc kubenswrapper[4823]: I1216 07:21:14.175840 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3c5cbc0-9fd6-4c23-8837-847571047381-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "d3c5cbc0-9fd6-4c23-8837-847571047381" (UID: "d3c5cbc0-9fd6-4c23-8837-847571047381"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:21:14 crc kubenswrapper[4823]: I1216 07:21:14.175877 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/541a384f-aa52-43ce-b85d-e3475ad49acd-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"541a384f-aa52-43ce-b85d-e3475ad49acd\") " pod="openstack/nova-api-0" Dec 16 07:21:14 crc kubenswrapper[4823]: I1216 07:21:14.175912 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/541a384f-aa52-43ce-b85d-e3475ad49acd-public-tls-certs\") pod \"nova-api-0\" (UID: \"541a384f-aa52-43ce-b85d-e3475ad49acd\") " pod="openstack/nova-api-0" Dec 16 07:21:14 crc kubenswrapper[4823]: I1216 07:21:14.175993 4823 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d3c5cbc0-9fd6-4c23-8837-847571047381-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 16 07:21:14 crc kubenswrapper[4823]: I1216 07:21:14.178107 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/541a384f-aa52-43ce-b85d-e3475ad49acd-logs\") pod \"nova-api-0\" (UID: \"541a384f-aa52-43ce-b85d-e3475ad49acd\") " pod="openstack/nova-api-0" Dec 16 07:21:14 crc kubenswrapper[4823]: I1216 07:21:14.180160 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3c5cbc0-9fd6-4c23-8837-847571047381-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "d3c5cbc0-9fd6-4c23-8837-847571047381" (UID: "d3c5cbc0-9fd6-4c23-8837-847571047381"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:21:14 crc kubenswrapper[4823]: I1216 07:21:14.181337 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3c5cbc0-9fd6-4c23-8837-847571047381-scripts" (OuterVolumeSpecName: "scripts") pod "d3c5cbc0-9fd6-4c23-8837-847571047381" (UID: "d3c5cbc0-9fd6-4c23-8837-847571047381"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:21:14 crc kubenswrapper[4823]: I1216 07:21:14.184527 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/541a384f-aa52-43ce-b85d-e3475ad49acd-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"541a384f-aa52-43ce-b85d-e3475ad49acd\") " pod="openstack/nova-api-0" Dec 16 07:21:14 crc kubenswrapper[4823]: I1216 07:21:14.185451 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/541a384f-aa52-43ce-b85d-e3475ad49acd-public-tls-certs\") pod \"nova-api-0\" (UID: \"541a384f-aa52-43ce-b85d-e3475ad49acd\") " pod="openstack/nova-api-0" Dec 16 07:21:14 crc kubenswrapper[4823]: I1216 07:21:14.185601 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3c5cbc0-9fd6-4c23-8837-847571047381-kube-api-access-4n6ld" (OuterVolumeSpecName: "kube-api-access-4n6ld") pod "d3c5cbc0-9fd6-4c23-8837-847571047381" (UID: "d3c5cbc0-9fd6-4c23-8837-847571047381"). InnerVolumeSpecName "kube-api-access-4n6ld". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:21:14 crc kubenswrapper[4823]: I1216 07:21:14.185691 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/541a384f-aa52-43ce-b85d-e3475ad49acd-internal-tls-certs\") pod \"nova-api-0\" (UID: \"541a384f-aa52-43ce-b85d-e3475ad49acd\") " pod="openstack/nova-api-0" Dec 16 07:21:14 crc kubenswrapper[4823]: I1216 07:21:14.187261 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/541a384f-aa52-43ce-b85d-e3475ad49acd-config-data\") pod \"nova-api-0\" (UID: \"541a384f-aa52-43ce-b85d-e3475ad49acd\") " pod="openstack/nova-api-0" Dec 16 07:21:14 crc kubenswrapper[4823]: I1216 07:21:14.197249 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9qcp\" (UniqueName: \"kubernetes.io/projected/541a384f-aa52-43ce-b85d-e3475ad49acd-kube-api-access-z9qcp\") pod \"nova-api-0\" (UID: \"541a384f-aa52-43ce-b85d-e3475ad49acd\") " pod="openstack/nova-api-0" Dec 16 07:21:14 crc kubenswrapper[4823]: I1216 07:21:14.209319 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3c5cbc0-9fd6-4c23-8837-847571047381-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "d3c5cbc0-9fd6-4c23-8837-847571047381" (UID: "d3c5cbc0-9fd6-4c23-8837-847571047381"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:21:14 crc kubenswrapper[4823]: I1216 07:21:14.246223 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3c5cbc0-9fd6-4c23-8837-847571047381-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "d3c5cbc0-9fd6-4c23-8837-847571047381" (UID: "d3c5cbc0-9fd6-4c23-8837-847571047381"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:21:14 crc kubenswrapper[4823]: I1216 07:21:14.270719 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3c5cbc0-9fd6-4c23-8837-847571047381-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d3c5cbc0-9fd6-4c23-8837-847571047381" (UID: "d3c5cbc0-9fd6-4c23-8837-847571047381"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:21:14 crc kubenswrapper[4823]: I1216 07:21:14.277962 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3c5cbc0-9fd6-4c23-8837-847571047381-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:21:14 crc kubenswrapper[4823]: I1216 07:21:14.277990 4823 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3c5cbc0-9fd6-4c23-8837-847571047381-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:21:14 crc kubenswrapper[4823]: I1216 07:21:14.278000 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4n6ld\" (UniqueName: \"kubernetes.io/projected/d3c5cbc0-9fd6-4c23-8837-847571047381-kube-api-access-4n6ld\") on node \"crc\" DevicePath \"\"" Dec 16 07:21:14 crc kubenswrapper[4823]: I1216 07:21:14.278009 4823 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d3c5cbc0-9fd6-4c23-8837-847571047381-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 16 07:21:14 crc kubenswrapper[4823]: I1216 07:21:14.278018 4823 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d3c5cbc0-9fd6-4c23-8837-847571047381-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 16 07:21:14 crc kubenswrapper[4823]: I1216 07:21:14.278037 4823 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/d3c5cbc0-9fd6-4c23-8837-847571047381-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 07:21:14 crc kubenswrapper[4823]: I1216 07:21:14.311251 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3c5cbc0-9fd6-4c23-8837-847571047381-config-data" (OuterVolumeSpecName: "config-data") pod "d3c5cbc0-9fd6-4c23-8837-847571047381" (UID: "d3c5cbc0-9fd6-4c23-8837-847571047381"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:21:14 crc kubenswrapper[4823]: I1216 07:21:14.331445 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 16 07:21:14 crc kubenswrapper[4823]: I1216 07:21:14.379340 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3c5cbc0-9fd6-4c23-8837-847571047381-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 07:21:14 crc kubenswrapper[4823]: I1216 07:21:14.860202 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 16 07:21:14 crc kubenswrapper[4823]: I1216 07:21:14.888189 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"541a384f-aa52-43ce-b85d-e3475ad49acd","Type":"ContainerStarted","Data":"c28e8b1a07ab723bc979477bd9ef0baa555d24c7ee70346a0319372fd90c4362"} Dec 16 07:21:14 crc kubenswrapper[4823]: I1216 07:21:14.898482 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 16 07:21:14 crc kubenswrapper[4823]: I1216 07:21:14.899611 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d3c5cbc0-9fd6-4c23-8837-847571047381","Type":"ContainerDied","Data":"86dc605d5320b0cf7dece0664ba254d367fc4250174be36a4a55984e4cbb6cb2"} Dec 16 07:21:14 crc kubenswrapper[4823]: I1216 07:21:14.899681 4823 scope.go:117] "RemoveContainer" containerID="02d6ce908322bb130482f39ef6c9936af2a30bfa263eb15901a484280b9f2d6d" Dec 16 07:21:14 crc kubenswrapper[4823]: I1216 07:21:14.916217 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Dec 16 07:21:14 crc kubenswrapper[4823]: I1216 07:21:14.973076 4823 scope.go:117] "RemoveContainer" containerID="8864e3445a28302e064ecf862d77ff42768e3f4454468b006beac2489abd068f" Dec 16 07:21:15 crc kubenswrapper[4823]: I1216 07:21:15.009308 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 16 07:21:15 crc kubenswrapper[4823]: I1216 07:21:15.020138 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 16 07:21:15 crc kubenswrapper[4823]: I1216 07:21:15.032673 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 16 07:21:15 crc kubenswrapper[4823]: E1216 07:21:15.033204 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3c5cbc0-9fd6-4c23-8837-847571047381" containerName="ceilometer-notification-agent" Dec 16 07:21:15 crc kubenswrapper[4823]: I1216 07:21:15.033223 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3c5cbc0-9fd6-4c23-8837-847571047381" containerName="ceilometer-notification-agent" Dec 16 07:21:15 crc kubenswrapper[4823]: E1216 07:21:15.033243 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3c5cbc0-9fd6-4c23-8837-847571047381" containerName="ceilometer-central-agent" Dec 16 07:21:15 crc 
kubenswrapper[4823]: I1216 07:21:15.033250 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3c5cbc0-9fd6-4c23-8837-847571047381" containerName="ceilometer-central-agent" Dec 16 07:21:15 crc kubenswrapper[4823]: E1216 07:21:15.033263 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3c5cbc0-9fd6-4c23-8837-847571047381" containerName="proxy-httpd" Dec 16 07:21:15 crc kubenswrapper[4823]: I1216 07:21:15.033271 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3c5cbc0-9fd6-4c23-8837-847571047381" containerName="proxy-httpd" Dec 16 07:21:15 crc kubenswrapper[4823]: E1216 07:21:15.033285 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3c5cbc0-9fd6-4c23-8837-847571047381" containerName="sg-core" Dec 16 07:21:15 crc kubenswrapper[4823]: I1216 07:21:15.033292 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3c5cbc0-9fd6-4c23-8837-847571047381" containerName="sg-core" Dec 16 07:21:15 crc kubenswrapper[4823]: I1216 07:21:15.033512 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3c5cbc0-9fd6-4c23-8837-847571047381" containerName="ceilometer-notification-agent" Dec 16 07:21:15 crc kubenswrapper[4823]: I1216 07:21:15.033529 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3c5cbc0-9fd6-4c23-8837-847571047381" containerName="proxy-httpd" Dec 16 07:21:15 crc kubenswrapper[4823]: I1216 07:21:15.033542 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3c5cbc0-9fd6-4c23-8837-847571047381" containerName="sg-core" Dec 16 07:21:15 crc kubenswrapper[4823]: I1216 07:21:15.033556 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3c5cbc0-9fd6-4c23-8837-847571047381" containerName="ceilometer-central-agent" Dec 16 07:21:15 crc kubenswrapper[4823]: I1216 07:21:15.035937 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 16 07:21:15 crc kubenswrapper[4823]: I1216 07:21:15.039387 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 16 07:21:15 crc kubenswrapper[4823]: I1216 07:21:15.039855 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 16 07:21:15 crc kubenswrapper[4823]: I1216 07:21:15.039872 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 16 07:21:15 crc kubenswrapper[4823]: I1216 07:21:15.047325 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 16 07:21:15 crc kubenswrapper[4823]: I1216 07:21:15.069039 4823 scope.go:117] "RemoveContainer" containerID="2db5ed303f81f27607275d887547c0c5a26a3dab90193c23d7bd8f2f006b2438" Dec 16 07:21:15 crc kubenswrapper[4823]: I1216 07:21:15.095523 4823 scope.go:117] "RemoveContainer" containerID="63b2b4c9726db8bccf13e10770be4dd99f89d0be67786afc82c63b099d5cac82" Dec 16 07:21:15 crc kubenswrapper[4823]: I1216 07:21:15.100127 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/77e933eb-7294-47b8-af0c-fbb03725d3d8-log-httpd\") pod \"ceilometer-0\" (UID: \"77e933eb-7294-47b8-af0c-fbb03725d3d8\") " pod="openstack/ceilometer-0" Dec 16 07:21:15 crc kubenswrapper[4823]: I1216 07:21:15.100343 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/77e933eb-7294-47b8-af0c-fbb03725d3d8-scripts\") pod \"ceilometer-0\" (UID: \"77e933eb-7294-47b8-af0c-fbb03725d3d8\") " pod="openstack/ceilometer-0" Dec 16 07:21:15 crc kubenswrapper[4823]: I1216 07:21:15.100547 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/77e933eb-7294-47b8-af0c-fbb03725d3d8-config-data\") pod \"ceilometer-0\" (UID: \"77e933eb-7294-47b8-af0c-fbb03725d3d8\") " pod="openstack/ceilometer-0" Dec 16 07:21:15 crc kubenswrapper[4823]: I1216 07:21:15.100604 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/77e933eb-7294-47b8-af0c-fbb03725d3d8-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"77e933eb-7294-47b8-af0c-fbb03725d3d8\") " pod="openstack/ceilometer-0" Dec 16 07:21:15 crc kubenswrapper[4823]: I1216 07:21:15.100680 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljcbm\" (UniqueName: \"kubernetes.io/projected/77e933eb-7294-47b8-af0c-fbb03725d3d8-kube-api-access-ljcbm\") pod \"ceilometer-0\" (UID: \"77e933eb-7294-47b8-af0c-fbb03725d3d8\") " pod="openstack/ceilometer-0" Dec 16 07:21:15 crc kubenswrapper[4823]: I1216 07:21:15.100719 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/77e933eb-7294-47b8-af0c-fbb03725d3d8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"77e933eb-7294-47b8-af0c-fbb03725d3d8\") " pod="openstack/ceilometer-0" Dec 16 07:21:15 crc kubenswrapper[4823]: I1216 07:21:15.100774 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77e933eb-7294-47b8-af0c-fbb03725d3d8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"77e933eb-7294-47b8-af0c-fbb03725d3d8\") " pod="openstack/ceilometer-0" Dec 16 07:21:15 crc kubenswrapper[4823]: I1216 07:21:15.100824 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/77e933eb-7294-47b8-af0c-fbb03725d3d8-run-httpd\") pod \"ceilometer-0\" 
(UID: \"77e933eb-7294-47b8-af0c-fbb03725d3d8\") " pod="openstack/ceilometer-0" Dec 16 07:21:15 crc kubenswrapper[4823]: I1216 07:21:15.176220 4823 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="b89e60a8-db7c-4e57-ac6b-aab09909e7ad" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.191:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 16 07:21:15 crc kubenswrapper[4823]: I1216 07:21:15.176724 4823 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="b89e60a8-db7c-4e57-ac6b-aab09909e7ad" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.191:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 16 07:21:15 crc kubenswrapper[4823]: I1216 07:21:15.204340 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/77e933eb-7294-47b8-af0c-fbb03725d3d8-log-httpd\") pod \"ceilometer-0\" (UID: \"77e933eb-7294-47b8-af0c-fbb03725d3d8\") " pod="openstack/ceilometer-0" Dec 16 07:21:15 crc kubenswrapper[4823]: I1216 07:21:15.204456 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/77e933eb-7294-47b8-af0c-fbb03725d3d8-scripts\") pod \"ceilometer-0\" (UID: \"77e933eb-7294-47b8-af0c-fbb03725d3d8\") " pod="openstack/ceilometer-0" Dec 16 07:21:15 crc kubenswrapper[4823]: I1216 07:21:15.204497 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77e933eb-7294-47b8-af0c-fbb03725d3d8-config-data\") pod \"ceilometer-0\" (UID: \"77e933eb-7294-47b8-af0c-fbb03725d3d8\") " pod="openstack/ceilometer-0" Dec 16 07:21:15 crc kubenswrapper[4823]: I1216 07:21:15.204514 4823 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/77e933eb-7294-47b8-af0c-fbb03725d3d8-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"77e933eb-7294-47b8-af0c-fbb03725d3d8\") " pod="openstack/ceilometer-0" Dec 16 07:21:15 crc kubenswrapper[4823]: I1216 07:21:15.204541 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljcbm\" (UniqueName: \"kubernetes.io/projected/77e933eb-7294-47b8-af0c-fbb03725d3d8-kube-api-access-ljcbm\") pod \"ceilometer-0\" (UID: \"77e933eb-7294-47b8-af0c-fbb03725d3d8\") " pod="openstack/ceilometer-0" Dec 16 07:21:15 crc kubenswrapper[4823]: I1216 07:21:15.204563 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/77e933eb-7294-47b8-af0c-fbb03725d3d8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"77e933eb-7294-47b8-af0c-fbb03725d3d8\") " pod="openstack/ceilometer-0" Dec 16 07:21:15 crc kubenswrapper[4823]: I1216 07:21:15.204584 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77e933eb-7294-47b8-af0c-fbb03725d3d8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"77e933eb-7294-47b8-af0c-fbb03725d3d8\") " pod="openstack/ceilometer-0" Dec 16 07:21:15 crc kubenswrapper[4823]: I1216 07:21:15.204612 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/77e933eb-7294-47b8-af0c-fbb03725d3d8-run-httpd\") pod \"ceilometer-0\" (UID: \"77e933eb-7294-47b8-af0c-fbb03725d3d8\") " pod="openstack/ceilometer-0" Dec 16 07:21:15 crc kubenswrapper[4823]: I1216 07:21:15.205035 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/77e933eb-7294-47b8-af0c-fbb03725d3d8-run-httpd\") pod \"ceilometer-0\" (UID: \"77e933eb-7294-47b8-af0c-fbb03725d3d8\") " 
pod="openstack/ceilometer-0" Dec 16 07:21:15 crc kubenswrapper[4823]: I1216 07:21:15.205246 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/77e933eb-7294-47b8-af0c-fbb03725d3d8-log-httpd\") pod \"ceilometer-0\" (UID: \"77e933eb-7294-47b8-af0c-fbb03725d3d8\") " pod="openstack/ceilometer-0" Dec 16 07:21:15 crc kubenswrapper[4823]: I1216 07:21:15.210943 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77e933eb-7294-47b8-af0c-fbb03725d3d8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"77e933eb-7294-47b8-af0c-fbb03725d3d8\") " pod="openstack/ceilometer-0" Dec 16 07:21:15 crc kubenswrapper[4823]: I1216 07:21:15.217078 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-vvf8j"] Dec 16 07:21:15 crc kubenswrapper[4823]: I1216 07:21:15.217952 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/77e933eb-7294-47b8-af0c-fbb03725d3d8-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"77e933eb-7294-47b8-af0c-fbb03725d3d8\") " pod="openstack/ceilometer-0" Dec 16 07:21:15 crc kubenswrapper[4823]: I1216 07:21:15.218356 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-vvf8j" Dec 16 07:21:15 crc kubenswrapper[4823]: I1216 07:21:15.218761 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/77e933eb-7294-47b8-af0c-fbb03725d3d8-scripts\") pod \"ceilometer-0\" (UID: \"77e933eb-7294-47b8-af0c-fbb03725d3d8\") " pod="openstack/ceilometer-0" Dec 16 07:21:15 crc kubenswrapper[4823]: I1216 07:21:15.220270 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/77e933eb-7294-47b8-af0c-fbb03725d3d8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"77e933eb-7294-47b8-af0c-fbb03725d3d8\") " pod="openstack/ceilometer-0" Dec 16 07:21:15 crc kubenswrapper[4823]: I1216 07:21:15.226660 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Dec 16 07:21:15 crc kubenswrapper[4823]: I1216 07:21:15.226671 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Dec 16 07:21:15 crc kubenswrapper[4823]: I1216 07:21:15.228505 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77e933eb-7294-47b8-af0c-fbb03725d3d8-config-data\") pod \"ceilometer-0\" (UID: \"77e933eb-7294-47b8-af0c-fbb03725d3d8\") " pod="openstack/ceilometer-0" Dec 16 07:21:15 crc kubenswrapper[4823]: I1216 07:21:15.240234 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-vvf8j"] Dec 16 07:21:15 crc kubenswrapper[4823]: I1216 07:21:15.252094 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljcbm\" (UniqueName: \"kubernetes.io/projected/77e933eb-7294-47b8-af0c-fbb03725d3d8-kube-api-access-ljcbm\") pod \"ceilometer-0\" (UID: \"77e933eb-7294-47b8-af0c-fbb03725d3d8\") " pod="openstack/ceilometer-0" Dec 16 07:21:15 crc 
kubenswrapper[4823]: I1216 07:21:15.307447 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b6c804c-4a6b-4061-95f4-9a8c96167f76-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-vvf8j\" (UID: \"6b6c804c-4a6b-4061-95f4-9a8c96167f76\") " pod="openstack/nova-cell1-cell-mapping-vvf8j" Dec 16 07:21:15 crc kubenswrapper[4823]: I1216 07:21:15.307531 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7j6vg\" (UniqueName: \"kubernetes.io/projected/6b6c804c-4a6b-4061-95f4-9a8c96167f76-kube-api-access-7j6vg\") pod \"nova-cell1-cell-mapping-vvf8j\" (UID: \"6b6c804c-4a6b-4061-95f4-9a8c96167f76\") " pod="openstack/nova-cell1-cell-mapping-vvf8j" Dec 16 07:21:15 crc kubenswrapper[4823]: I1216 07:21:15.307598 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b6c804c-4a6b-4061-95f4-9a8c96167f76-config-data\") pod \"nova-cell1-cell-mapping-vvf8j\" (UID: \"6b6c804c-4a6b-4061-95f4-9a8c96167f76\") " pod="openstack/nova-cell1-cell-mapping-vvf8j" Dec 16 07:21:15 crc kubenswrapper[4823]: I1216 07:21:15.307672 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b6c804c-4a6b-4061-95f4-9a8c96167f76-scripts\") pod \"nova-cell1-cell-mapping-vvf8j\" (UID: \"6b6c804c-4a6b-4061-95f4-9a8c96167f76\") " pod="openstack/nova-cell1-cell-mapping-vvf8j" Dec 16 07:21:15 crc kubenswrapper[4823]: I1216 07:21:15.371356 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 16 07:21:15 crc kubenswrapper[4823]: I1216 07:21:15.409611 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b6c804c-4a6b-4061-95f4-9a8c96167f76-scripts\") pod \"nova-cell1-cell-mapping-vvf8j\" (UID: \"6b6c804c-4a6b-4061-95f4-9a8c96167f76\") " pod="openstack/nova-cell1-cell-mapping-vvf8j" Dec 16 07:21:15 crc kubenswrapper[4823]: I1216 07:21:15.409863 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b6c804c-4a6b-4061-95f4-9a8c96167f76-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-vvf8j\" (UID: \"6b6c804c-4a6b-4061-95f4-9a8c96167f76\") " pod="openstack/nova-cell1-cell-mapping-vvf8j" Dec 16 07:21:15 crc kubenswrapper[4823]: I1216 07:21:15.409900 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7j6vg\" (UniqueName: \"kubernetes.io/projected/6b6c804c-4a6b-4061-95f4-9a8c96167f76-kube-api-access-7j6vg\") pod \"nova-cell1-cell-mapping-vvf8j\" (UID: \"6b6c804c-4a6b-4061-95f4-9a8c96167f76\") " pod="openstack/nova-cell1-cell-mapping-vvf8j" Dec 16 07:21:15 crc kubenswrapper[4823]: I1216 07:21:15.409938 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b6c804c-4a6b-4061-95f4-9a8c96167f76-config-data\") pod \"nova-cell1-cell-mapping-vvf8j\" (UID: \"6b6c804c-4a6b-4061-95f4-9a8c96167f76\") " pod="openstack/nova-cell1-cell-mapping-vvf8j" Dec 16 07:21:15 crc kubenswrapper[4823]: I1216 07:21:15.414853 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b6c804c-4a6b-4061-95f4-9a8c96167f76-scripts\") pod \"nova-cell1-cell-mapping-vvf8j\" (UID: \"6b6c804c-4a6b-4061-95f4-9a8c96167f76\") " pod="openstack/nova-cell1-cell-mapping-vvf8j" Dec 16 07:21:15 crc 
kubenswrapper[4823]: I1216 07:21:15.415112 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b6c804c-4a6b-4061-95f4-9a8c96167f76-config-data\") pod \"nova-cell1-cell-mapping-vvf8j\" (UID: \"6b6c804c-4a6b-4061-95f4-9a8c96167f76\") " pod="openstack/nova-cell1-cell-mapping-vvf8j" Dec 16 07:21:15 crc kubenswrapper[4823]: I1216 07:21:15.415296 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b6c804c-4a6b-4061-95f4-9a8c96167f76-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-vvf8j\" (UID: \"6b6c804c-4a6b-4061-95f4-9a8c96167f76\") " pod="openstack/nova-cell1-cell-mapping-vvf8j" Dec 16 07:21:15 crc kubenswrapper[4823]: I1216 07:21:15.431408 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7j6vg\" (UniqueName: \"kubernetes.io/projected/6b6c804c-4a6b-4061-95f4-9a8c96167f76-kube-api-access-7j6vg\") pod \"nova-cell1-cell-mapping-vvf8j\" (UID: \"6b6c804c-4a6b-4061-95f4-9a8c96167f76\") " pod="openstack/nova-cell1-cell-mapping-vvf8j" Dec 16 07:21:15 crc kubenswrapper[4823]: I1216 07:21:15.600738 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-vvf8j" Dec 16 07:21:15 crc kubenswrapper[4823]: I1216 07:21:15.791196 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d20d7e4d-f319-43f8-bf31-87b114fa7517" path="/var/lib/kubelet/pods/d20d7e4d-f319-43f8-bf31-87b114fa7517/volumes" Dec 16 07:21:15 crc kubenswrapper[4823]: I1216 07:21:15.792325 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3c5cbc0-9fd6-4c23-8837-847571047381" path="/var/lib/kubelet/pods/d3c5cbc0-9fd6-4c23-8837-847571047381/volumes" Dec 16 07:21:15 crc kubenswrapper[4823]: I1216 07:21:15.827202 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 16 07:21:15 crc kubenswrapper[4823]: I1216 07:21:15.912220 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"77e933eb-7294-47b8-af0c-fbb03725d3d8","Type":"ContainerStarted","Data":"d1eb8faa55599142c8d064c3cde3f9380cfdbbedc01cb4c2b32d4809cdfc6ff7"} Dec 16 07:21:15 crc kubenswrapper[4823]: I1216 07:21:15.917489 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"541a384f-aa52-43ce-b85d-e3475ad49acd","Type":"ContainerStarted","Data":"dfe650b69a33382b4e88ea0e1ea3dcf0cdeeeca7466deeb1ea4883e6de434868"} Dec 16 07:21:15 crc kubenswrapper[4823]: I1216 07:21:15.917567 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"541a384f-aa52-43ce-b85d-e3475ad49acd","Type":"ContainerStarted","Data":"368d9aec36767fcc7a48f7f97635938a1dba7d4d72dfda2a3c5838680ef2c087"} Dec 16 07:21:15 crc kubenswrapper[4823]: I1216 07:21:15.957547 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.957488106 podStartE2EDuration="2.957488106s" podCreationTimestamp="2025-12-16 07:21:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-12-16 07:21:15.949271339 +0000 UTC m=+1554.437837542" watchObservedRunningTime="2025-12-16 07:21:15.957488106 +0000 UTC m=+1554.446054239" Dec 16 07:21:16 crc kubenswrapper[4823]: W1216 07:21:16.037771 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6b6c804c_4a6b_4061_95f4_9a8c96167f76.slice/crio-a6d8fd6802f304ad949608a80d6a22f469dd4491288ed083aa3c043a8d39e5e8 WatchSource:0}: Error finding container a6d8fd6802f304ad949608a80d6a22f469dd4491288ed083aa3c043a8d39e5e8: Status 404 returned error can't find the container with id a6d8fd6802f304ad949608a80d6a22f469dd4491288ed083aa3c043a8d39e5e8 Dec 16 07:21:16 crc kubenswrapper[4823]: I1216 07:21:16.039585 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-vvf8j"] Dec 16 07:21:16 crc kubenswrapper[4823]: I1216 07:21:16.381225 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-fcd6f8f8f-l8nbv" Dec 16 07:21:16 crc kubenswrapper[4823]: I1216 07:21:16.447137 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-647df7b8c5-lfdcp"] Dec 16 07:21:16 crc kubenswrapper[4823]: I1216 07:21:16.447430 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-647df7b8c5-lfdcp" podUID="d61c0ba6-5bd0-4f7f-85f7-28eae946d2aa" containerName="dnsmasq-dns" containerID="cri-o://f5027690ff5170201f3250e3d31dfbd810fe793d136225ccf387795cb8773c20" gracePeriod=10 Dec 16 07:21:16 crc kubenswrapper[4823]: I1216 07:21:16.632008 4823 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-647df7b8c5-lfdcp" podUID="d61c0ba6-5bd0-4f7f-85f7-28eae946d2aa" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.182:5353: connect: connection refused" Dec 16 07:21:16 crc kubenswrapper[4823]: I1216 07:21:16.926183 4823 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"77e933eb-7294-47b8-af0c-fbb03725d3d8","Type":"ContainerStarted","Data":"fdc90f0e714e4f423158468800e55aec113c93ffa463f7e4d06ce66853197e2c"} Dec 16 07:21:16 crc kubenswrapper[4823]: I1216 07:21:16.928346 4823 generic.go:334] "Generic (PLEG): container finished" podID="d61c0ba6-5bd0-4f7f-85f7-28eae946d2aa" containerID="f5027690ff5170201f3250e3d31dfbd810fe793d136225ccf387795cb8773c20" exitCode=0 Dec 16 07:21:16 crc kubenswrapper[4823]: I1216 07:21:16.928382 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-647df7b8c5-lfdcp" event={"ID":"d61c0ba6-5bd0-4f7f-85f7-28eae946d2aa","Type":"ContainerDied","Data":"f5027690ff5170201f3250e3d31dfbd810fe793d136225ccf387795cb8773c20"} Dec 16 07:21:16 crc kubenswrapper[4823]: I1216 07:21:16.932584 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-vvf8j" event={"ID":"6b6c804c-4a6b-4061-95f4-9a8c96167f76","Type":"ContainerStarted","Data":"3f783c5a727d05736908b2e0ecd933e245b110228a42ec9667cdc218c6a08477"} Dec 16 07:21:16 crc kubenswrapper[4823]: I1216 07:21:16.932632 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-vvf8j" event={"ID":"6b6c804c-4a6b-4061-95f4-9a8c96167f76","Type":"ContainerStarted","Data":"a6d8fd6802f304ad949608a80d6a22f469dd4491288ed083aa3c043a8d39e5e8"} Dec 16 07:21:16 crc kubenswrapper[4823]: I1216 07:21:16.950894 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-vvf8j" podStartSLOduration=1.950879739 podStartE2EDuration="1.950879739s" podCreationTimestamp="2025-12-16 07:21:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 07:21:16.948537526 +0000 UTC m=+1555.437103649" watchObservedRunningTime="2025-12-16 07:21:16.950879739 +0000 UTC m=+1555.439445862" Dec 16 07:21:17 crc 
kubenswrapper[4823]: I1216 07:21:17.091883 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-647df7b8c5-lfdcp" Dec 16 07:21:17 crc kubenswrapper[4823]: I1216 07:21:17.148122 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d61c0ba6-5bd0-4f7f-85f7-28eae946d2aa-config\") pod \"d61c0ba6-5bd0-4f7f-85f7-28eae946d2aa\" (UID: \"d61c0ba6-5bd0-4f7f-85f7-28eae946d2aa\") " Dec 16 07:21:17 crc kubenswrapper[4823]: I1216 07:21:17.148569 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bm4h2\" (UniqueName: \"kubernetes.io/projected/d61c0ba6-5bd0-4f7f-85f7-28eae946d2aa-kube-api-access-bm4h2\") pod \"d61c0ba6-5bd0-4f7f-85f7-28eae946d2aa\" (UID: \"d61c0ba6-5bd0-4f7f-85f7-28eae946d2aa\") " Dec 16 07:21:17 crc kubenswrapper[4823]: I1216 07:21:17.148721 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d61c0ba6-5bd0-4f7f-85f7-28eae946d2aa-ovsdbserver-sb\") pod \"d61c0ba6-5bd0-4f7f-85f7-28eae946d2aa\" (UID: \"d61c0ba6-5bd0-4f7f-85f7-28eae946d2aa\") " Dec 16 07:21:17 crc kubenswrapper[4823]: I1216 07:21:17.148807 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d61c0ba6-5bd0-4f7f-85f7-28eae946d2aa-dns-svc\") pod \"d61c0ba6-5bd0-4f7f-85f7-28eae946d2aa\" (UID: \"d61c0ba6-5bd0-4f7f-85f7-28eae946d2aa\") " Dec 16 07:21:17 crc kubenswrapper[4823]: I1216 07:21:17.148829 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d61c0ba6-5bd0-4f7f-85f7-28eae946d2aa-dns-swift-storage-0\") pod \"d61c0ba6-5bd0-4f7f-85f7-28eae946d2aa\" (UID: \"d61c0ba6-5bd0-4f7f-85f7-28eae946d2aa\") " Dec 16 07:21:17 crc kubenswrapper[4823]: I1216 07:21:17.148865 
4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d61c0ba6-5bd0-4f7f-85f7-28eae946d2aa-ovsdbserver-nb\") pod \"d61c0ba6-5bd0-4f7f-85f7-28eae946d2aa\" (UID: \"d61c0ba6-5bd0-4f7f-85f7-28eae946d2aa\") " Dec 16 07:21:17 crc kubenswrapper[4823]: I1216 07:21:17.173550 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d61c0ba6-5bd0-4f7f-85f7-28eae946d2aa-kube-api-access-bm4h2" (OuterVolumeSpecName: "kube-api-access-bm4h2") pod "d61c0ba6-5bd0-4f7f-85f7-28eae946d2aa" (UID: "d61c0ba6-5bd0-4f7f-85f7-28eae946d2aa"). InnerVolumeSpecName "kube-api-access-bm4h2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:21:17 crc kubenswrapper[4823]: I1216 07:21:17.205223 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d61c0ba6-5bd0-4f7f-85f7-28eae946d2aa-config" (OuterVolumeSpecName: "config") pod "d61c0ba6-5bd0-4f7f-85f7-28eae946d2aa" (UID: "d61c0ba6-5bd0-4f7f-85f7-28eae946d2aa"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:21:17 crc kubenswrapper[4823]: I1216 07:21:17.236258 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d61c0ba6-5bd0-4f7f-85f7-28eae946d2aa-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d61c0ba6-5bd0-4f7f-85f7-28eae946d2aa" (UID: "d61c0ba6-5bd0-4f7f-85f7-28eae946d2aa"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:21:17 crc kubenswrapper[4823]: I1216 07:21:17.239871 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d61c0ba6-5bd0-4f7f-85f7-28eae946d2aa-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "d61c0ba6-5bd0-4f7f-85f7-28eae946d2aa" (UID: "d61c0ba6-5bd0-4f7f-85f7-28eae946d2aa"). 
InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:21:17 crc kubenswrapper[4823]: I1216 07:21:17.243480 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d61c0ba6-5bd0-4f7f-85f7-28eae946d2aa-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d61c0ba6-5bd0-4f7f-85f7-28eae946d2aa" (UID: "d61c0ba6-5bd0-4f7f-85f7-28eae946d2aa"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:21:17 crc kubenswrapper[4823]: I1216 07:21:17.245575 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d61c0ba6-5bd0-4f7f-85f7-28eae946d2aa-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d61c0ba6-5bd0-4f7f-85f7-28eae946d2aa" (UID: "d61c0ba6-5bd0-4f7f-85f7-28eae946d2aa"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:21:17 crc kubenswrapper[4823]: I1216 07:21:17.250638 4823 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d61c0ba6-5bd0-4f7f-85f7-28eae946d2aa-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 16 07:21:17 crc kubenswrapper[4823]: I1216 07:21:17.250670 4823 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d61c0ba6-5bd0-4f7f-85f7-28eae946d2aa-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 16 07:21:17 crc kubenswrapper[4823]: I1216 07:21:17.250681 4823 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d61c0ba6-5bd0-4f7f-85f7-28eae946d2aa-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 16 07:21:17 crc kubenswrapper[4823]: I1216 07:21:17.250692 4823 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d61c0ba6-5bd0-4f7f-85f7-28eae946d2aa-config\") on node \"crc\" 
DevicePath \"\"" Dec 16 07:21:17 crc kubenswrapper[4823]: I1216 07:21:17.250700 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bm4h2\" (UniqueName: \"kubernetes.io/projected/d61c0ba6-5bd0-4f7f-85f7-28eae946d2aa-kube-api-access-bm4h2\") on node \"crc\" DevicePath \"\"" Dec 16 07:21:17 crc kubenswrapper[4823]: I1216 07:21:17.250713 4823 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d61c0ba6-5bd0-4f7f-85f7-28eae946d2aa-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 16 07:21:17 crc kubenswrapper[4823]: I1216 07:21:17.511457 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-nbkzm" Dec 16 07:21:17 crc kubenswrapper[4823]: I1216 07:21:17.523269 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-nbkzm" Dec 16 07:21:17 crc kubenswrapper[4823]: I1216 07:21:17.582647 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-nbkzm" Dec 16 07:21:17 crc kubenswrapper[4823]: I1216 07:21:17.941962 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"77e933eb-7294-47b8-af0c-fbb03725d3d8","Type":"ContainerStarted","Data":"c5e2a5c0a31ec4e9f5ca94f30d407742eb67d1531e0744587618b59f17762294"} Dec 16 07:21:17 crc kubenswrapper[4823]: I1216 07:21:17.945081 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-647df7b8c5-lfdcp" event={"ID":"d61c0ba6-5bd0-4f7f-85f7-28eae946d2aa","Type":"ContainerDied","Data":"b2476c086cbe3a7d8ce3a52a1933a4ebdbeccda9029a293fbd426eec90fdff8e"} Dec 16 07:21:17 crc kubenswrapper[4823]: I1216 07:21:17.945153 4823 scope.go:117] "RemoveContainer" containerID="f5027690ff5170201f3250e3d31dfbd810fe793d136225ccf387795cb8773c20" Dec 16 07:21:17 crc kubenswrapper[4823]: I1216 07:21:17.945201 4823 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-647df7b8c5-lfdcp" Dec 16 07:21:17 crc kubenswrapper[4823]: I1216 07:21:17.973431 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-647df7b8c5-lfdcp"] Dec 16 07:21:17 crc kubenswrapper[4823]: I1216 07:21:17.979302 4823 scope.go:117] "RemoveContainer" containerID="d0c12acae64a11345b532f0613a2afb372cb1aeb0df6c6420c1b6b7634f2ade4" Dec 16 07:21:17 crc kubenswrapper[4823]: I1216 07:21:17.984310 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-647df7b8c5-lfdcp"] Dec 16 07:21:18 crc kubenswrapper[4823]: I1216 07:21:18.014432 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-nbkzm" Dec 16 07:21:18 crc kubenswrapper[4823]: I1216 07:21:18.068768 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nbkzm"] Dec 16 07:21:18 crc kubenswrapper[4823]: I1216 07:21:18.955169 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"77e933eb-7294-47b8-af0c-fbb03725d3d8","Type":"ContainerStarted","Data":"eb53ac47f3a5804dff14f24e15141b3633409733b254cc498392acbc24442813"} Dec 16 07:21:19 crc kubenswrapper[4823]: I1216 07:21:19.783452 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d61c0ba6-5bd0-4f7f-85f7-28eae946d2aa" path="/var/lib/kubelet/pods/d61c0ba6-5bd0-4f7f-85f7-28eae946d2aa/volumes" Dec 16 07:21:19 crc kubenswrapper[4823]: I1216 07:21:19.985864 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-nbkzm" podUID="f3c3ac16-0b4e-4828-a690-d740851a5ede" containerName="registry-server" containerID="cri-o://3ee47c044cad56d8a1c60f1e13ea55ebcb0648fb880d3fb0ef6ecc7bff1e91fb" gracePeriod=2 Dec 16 07:21:20 crc kubenswrapper[4823]: I1216 07:21:20.586415 4823 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nbkzm" Dec 16 07:21:20 crc kubenswrapper[4823]: I1216 07:21:20.717671 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gznv8\" (UniqueName: \"kubernetes.io/projected/f3c3ac16-0b4e-4828-a690-d740851a5ede-kube-api-access-gznv8\") pod \"f3c3ac16-0b4e-4828-a690-d740851a5ede\" (UID: \"f3c3ac16-0b4e-4828-a690-d740851a5ede\") " Dec 16 07:21:20 crc kubenswrapper[4823]: I1216 07:21:20.717746 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f3c3ac16-0b4e-4828-a690-d740851a5ede-catalog-content\") pod \"f3c3ac16-0b4e-4828-a690-d740851a5ede\" (UID: \"f3c3ac16-0b4e-4828-a690-d740851a5ede\") " Dec 16 07:21:20 crc kubenswrapper[4823]: I1216 07:21:20.718034 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f3c3ac16-0b4e-4828-a690-d740851a5ede-utilities\") pod \"f3c3ac16-0b4e-4828-a690-d740851a5ede\" (UID: \"f3c3ac16-0b4e-4828-a690-d740851a5ede\") " Dec 16 07:21:20 crc kubenswrapper[4823]: I1216 07:21:20.718729 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f3c3ac16-0b4e-4828-a690-d740851a5ede-utilities" (OuterVolumeSpecName: "utilities") pod "f3c3ac16-0b4e-4828-a690-d740851a5ede" (UID: "f3c3ac16-0b4e-4828-a690-d740851a5ede"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:21:20 crc kubenswrapper[4823]: I1216 07:21:20.723554 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3c3ac16-0b4e-4828-a690-d740851a5ede-kube-api-access-gznv8" (OuterVolumeSpecName: "kube-api-access-gznv8") pod "f3c3ac16-0b4e-4828-a690-d740851a5ede" (UID: "f3c3ac16-0b4e-4828-a690-d740851a5ede"). 
InnerVolumeSpecName "kube-api-access-gznv8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:21:20 crc kubenswrapper[4823]: I1216 07:21:20.743139 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f3c3ac16-0b4e-4828-a690-d740851a5ede-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f3c3ac16-0b4e-4828-a690-d740851a5ede" (UID: "f3c3ac16-0b4e-4828-a690-d740851a5ede"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:21:20 crc kubenswrapper[4823]: I1216 07:21:20.820034 4823 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f3c3ac16-0b4e-4828-a690-d740851a5ede-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 07:21:20 crc kubenswrapper[4823]: I1216 07:21:20.820068 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gznv8\" (UniqueName: \"kubernetes.io/projected/f3c3ac16-0b4e-4828-a690-d740851a5ede-kube-api-access-gznv8\") on node \"crc\" DevicePath \"\"" Dec 16 07:21:20 crc kubenswrapper[4823]: I1216 07:21:20.820078 4823 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f3c3ac16-0b4e-4828-a690-d740851a5ede-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 07:21:21 crc kubenswrapper[4823]: I1216 07:21:21.005204 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"77e933eb-7294-47b8-af0c-fbb03725d3d8","Type":"ContainerStarted","Data":"42c46777cdba45701a8cea0658eed9dc6daf90aae7d325c8008f6934dfe32214"} Dec 16 07:21:21 crc kubenswrapper[4823]: I1216 07:21:21.005381 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 16 07:21:21 crc kubenswrapper[4823]: I1216 07:21:21.013406 4823 generic.go:334] "Generic (PLEG): container finished" podID="f3c3ac16-0b4e-4828-a690-d740851a5ede" 
containerID="3ee47c044cad56d8a1c60f1e13ea55ebcb0648fb880d3fb0ef6ecc7bff1e91fb" exitCode=0 Dec 16 07:21:21 crc kubenswrapper[4823]: I1216 07:21:21.013486 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nbkzm" event={"ID":"f3c3ac16-0b4e-4828-a690-d740851a5ede","Type":"ContainerDied","Data":"3ee47c044cad56d8a1c60f1e13ea55ebcb0648fb880d3fb0ef6ecc7bff1e91fb"} Dec 16 07:21:21 crc kubenswrapper[4823]: I1216 07:21:21.013553 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nbkzm" event={"ID":"f3c3ac16-0b4e-4828-a690-d740851a5ede","Type":"ContainerDied","Data":"00d6d4c03efb48006ad6e3c15109b45e1b152125db3d7753443c64c2e7c980ad"} Dec 16 07:21:21 crc kubenswrapper[4823]: I1216 07:21:21.013582 4823 scope.go:117] "RemoveContainer" containerID="3ee47c044cad56d8a1c60f1e13ea55ebcb0648fb880d3fb0ef6ecc7bff1e91fb" Dec 16 07:21:21 crc kubenswrapper[4823]: I1216 07:21:21.013757 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nbkzm" Dec 16 07:21:21 crc kubenswrapper[4823]: I1216 07:21:21.040694 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.897942141 podStartE2EDuration="7.040676382s" podCreationTimestamp="2025-12-16 07:21:14 +0000 UTC" firstStartedPulling="2025-12-16 07:21:15.845124017 +0000 UTC m=+1554.333690160" lastFinishedPulling="2025-12-16 07:21:19.987858268 +0000 UTC m=+1558.476424401" observedRunningTime="2025-12-16 07:21:21.039487925 +0000 UTC m=+1559.528054068" watchObservedRunningTime="2025-12-16 07:21:21.040676382 +0000 UTC m=+1559.529242505" Dec 16 07:21:21 crc kubenswrapper[4823]: I1216 07:21:21.068935 4823 scope.go:117] "RemoveContainer" containerID="b0c08a304ae44e404b62d95b3f8e0782bf668f61838491bceabd461a64ff5057" Dec 16 07:21:21 crc kubenswrapper[4823]: I1216 07:21:21.108600 4823 scope.go:117] "RemoveContainer" containerID="cf8f734bce5534b0f96ac6bc74e7e9dd0f06c78f5c8679805b3d3ee551f026a4" Dec 16 07:21:21 crc kubenswrapper[4823]: I1216 07:21:21.110498 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nbkzm"] Dec 16 07:21:21 crc kubenswrapper[4823]: I1216 07:21:21.120389 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-nbkzm"] Dec 16 07:21:21 crc kubenswrapper[4823]: I1216 07:21:21.144916 4823 scope.go:117] "RemoveContainer" containerID="3ee47c044cad56d8a1c60f1e13ea55ebcb0648fb880d3fb0ef6ecc7bff1e91fb" Dec 16 07:21:21 crc kubenswrapper[4823]: E1216 07:21:21.145817 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ee47c044cad56d8a1c60f1e13ea55ebcb0648fb880d3fb0ef6ecc7bff1e91fb\": container with ID starting with 3ee47c044cad56d8a1c60f1e13ea55ebcb0648fb880d3fb0ef6ecc7bff1e91fb not found: ID does not exist" 
containerID="3ee47c044cad56d8a1c60f1e13ea55ebcb0648fb880d3fb0ef6ecc7bff1e91fb" Dec 16 07:21:21 crc kubenswrapper[4823]: I1216 07:21:21.145849 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ee47c044cad56d8a1c60f1e13ea55ebcb0648fb880d3fb0ef6ecc7bff1e91fb"} err="failed to get container status \"3ee47c044cad56d8a1c60f1e13ea55ebcb0648fb880d3fb0ef6ecc7bff1e91fb\": rpc error: code = NotFound desc = could not find container \"3ee47c044cad56d8a1c60f1e13ea55ebcb0648fb880d3fb0ef6ecc7bff1e91fb\": container with ID starting with 3ee47c044cad56d8a1c60f1e13ea55ebcb0648fb880d3fb0ef6ecc7bff1e91fb not found: ID does not exist" Dec 16 07:21:21 crc kubenswrapper[4823]: I1216 07:21:21.145872 4823 scope.go:117] "RemoveContainer" containerID="b0c08a304ae44e404b62d95b3f8e0782bf668f61838491bceabd461a64ff5057" Dec 16 07:21:21 crc kubenswrapper[4823]: E1216 07:21:21.146252 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0c08a304ae44e404b62d95b3f8e0782bf668f61838491bceabd461a64ff5057\": container with ID starting with b0c08a304ae44e404b62d95b3f8e0782bf668f61838491bceabd461a64ff5057 not found: ID does not exist" containerID="b0c08a304ae44e404b62d95b3f8e0782bf668f61838491bceabd461a64ff5057" Dec 16 07:21:21 crc kubenswrapper[4823]: I1216 07:21:21.146309 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0c08a304ae44e404b62d95b3f8e0782bf668f61838491bceabd461a64ff5057"} err="failed to get container status \"b0c08a304ae44e404b62d95b3f8e0782bf668f61838491bceabd461a64ff5057\": rpc error: code = NotFound desc = could not find container \"b0c08a304ae44e404b62d95b3f8e0782bf668f61838491bceabd461a64ff5057\": container with ID starting with b0c08a304ae44e404b62d95b3f8e0782bf668f61838491bceabd461a64ff5057 not found: ID does not exist" Dec 16 07:21:21 crc kubenswrapper[4823]: I1216 07:21:21.146340 4823 scope.go:117] 
"RemoveContainer" containerID="cf8f734bce5534b0f96ac6bc74e7e9dd0f06c78f5c8679805b3d3ee551f026a4" Dec 16 07:21:21 crc kubenswrapper[4823]: E1216 07:21:21.146779 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf8f734bce5534b0f96ac6bc74e7e9dd0f06c78f5c8679805b3d3ee551f026a4\": container with ID starting with cf8f734bce5534b0f96ac6bc74e7e9dd0f06c78f5c8679805b3d3ee551f026a4 not found: ID does not exist" containerID="cf8f734bce5534b0f96ac6bc74e7e9dd0f06c78f5c8679805b3d3ee551f026a4" Dec 16 07:21:21 crc kubenswrapper[4823]: I1216 07:21:21.146806 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf8f734bce5534b0f96ac6bc74e7e9dd0f06c78f5c8679805b3d3ee551f026a4"} err="failed to get container status \"cf8f734bce5534b0f96ac6bc74e7e9dd0f06c78f5c8679805b3d3ee551f026a4\": rpc error: code = NotFound desc = could not find container \"cf8f734bce5534b0f96ac6bc74e7e9dd0f06c78f5c8679805b3d3ee551f026a4\": container with ID starting with cf8f734bce5534b0f96ac6bc74e7e9dd0f06c78f5c8679805b3d3ee551f026a4 not found: ID does not exist" Dec 16 07:21:21 crc kubenswrapper[4823]: I1216 07:21:21.785594 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3c3ac16-0b4e-4828-a690-d740851a5ede" path="/var/lib/kubelet/pods/f3c3ac16-0b4e-4828-a690-d740851a5ede/volumes" Dec 16 07:21:22 crc kubenswrapper[4823]: I1216 07:21:22.031977 4823 generic.go:334] "Generic (PLEG): container finished" podID="6b6c804c-4a6b-4061-95f4-9a8c96167f76" containerID="3f783c5a727d05736908b2e0ecd933e245b110228a42ec9667cdc218c6a08477" exitCode=0 Dec 16 07:21:22 crc kubenswrapper[4823]: I1216 07:21:22.032105 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-vvf8j" event={"ID":"6b6c804c-4a6b-4061-95f4-9a8c96167f76","Type":"ContainerDied","Data":"3f783c5a727d05736908b2e0ecd933e245b110228a42ec9667cdc218c6a08477"} Dec 16 07:21:23 crc 
kubenswrapper[4823]: I1216 07:21:23.407637 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-vvf8j" Dec 16 07:21:23 crc kubenswrapper[4823]: I1216 07:21:23.480165 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7j6vg\" (UniqueName: \"kubernetes.io/projected/6b6c804c-4a6b-4061-95f4-9a8c96167f76-kube-api-access-7j6vg\") pod \"6b6c804c-4a6b-4061-95f4-9a8c96167f76\" (UID: \"6b6c804c-4a6b-4061-95f4-9a8c96167f76\") " Dec 16 07:21:23 crc kubenswrapper[4823]: I1216 07:21:23.480339 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b6c804c-4a6b-4061-95f4-9a8c96167f76-scripts\") pod \"6b6c804c-4a6b-4061-95f4-9a8c96167f76\" (UID: \"6b6c804c-4a6b-4061-95f4-9a8c96167f76\") " Dec 16 07:21:23 crc kubenswrapper[4823]: I1216 07:21:23.480372 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b6c804c-4a6b-4061-95f4-9a8c96167f76-config-data\") pod \"6b6c804c-4a6b-4061-95f4-9a8c96167f76\" (UID: \"6b6c804c-4a6b-4061-95f4-9a8c96167f76\") " Dec 16 07:21:23 crc kubenswrapper[4823]: I1216 07:21:23.480394 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b6c804c-4a6b-4061-95f4-9a8c96167f76-combined-ca-bundle\") pod \"6b6c804c-4a6b-4061-95f4-9a8c96167f76\" (UID: \"6b6c804c-4a6b-4061-95f4-9a8c96167f76\") " Dec 16 07:21:23 crc kubenswrapper[4823]: I1216 07:21:23.486157 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b6c804c-4a6b-4061-95f4-9a8c96167f76-scripts" (OuterVolumeSpecName: "scripts") pod "6b6c804c-4a6b-4061-95f4-9a8c96167f76" (UID: "6b6c804c-4a6b-4061-95f4-9a8c96167f76"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:21:23 crc kubenswrapper[4823]: I1216 07:21:23.486246 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b6c804c-4a6b-4061-95f4-9a8c96167f76-kube-api-access-7j6vg" (OuterVolumeSpecName: "kube-api-access-7j6vg") pod "6b6c804c-4a6b-4061-95f4-9a8c96167f76" (UID: "6b6c804c-4a6b-4061-95f4-9a8c96167f76"). InnerVolumeSpecName "kube-api-access-7j6vg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:21:23 crc kubenswrapper[4823]: I1216 07:21:23.506880 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b6c804c-4a6b-4061-95f4-9a8c96167f76-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6b6c804c-4a6b-4061-95f4-9a8c96167f76" (UID: "6b6c804c-4a6b-4061-95f4-9a8c96167f76"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:21:23 crc kubenswrapper[4823]: I1216 07:21:23.525003 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b6c804c-4a6b-4061-95f4-9a8c96167f76-config-data" (OuterVolumeSpecName: "config-data") pod "6b6c804c-4a6b-4061-95f4-9a8c96167f76" (UID: "6b6c804c-4a6b-4061-95f4-9a8c96167f76"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:21:23 crc kubenswrapper[4823]: I1216 07:21:23.582768 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7j6vg\" (UniqueName: \"kubernetes.io/projected/6b6c804c-4a6b-4061-95f4-9a8c96167f76-kube-api-access-7j6vg\") on node \"crc\" DevicePath \"\"" Dec 16 07:21:23 crc kubenswrapper[4823]: I1216 07:21:23.583013 4823 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b6c804c-4a6b-4061-95f4-9a8c96167f76-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:21:23 crc kubenswrapper[4823]: I1216 07:21:23.583150 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b6c804c-4a6b-4061-95f4-9a8c96167f76-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 07:21:23 crc kubenswrapper[4823]: I1216 07:21:23.583244 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b6c804c-4a6b-4061-95f4-9a8c96167f76-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:21:24 crc kubenswrapper[4823]: I1216 07:21:24.051809 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-vvf8j" event={"ID":"6b6c804c-4a6b-4061-95f4-9a8c96167f76","Type":"ContainerDied","Data":"a6d8fd6802f304ad949608a80d6a22f469dd4491288ed083aa3c043a8d39e5e8"} Dec 16 07:21:24 crc kubenswrapper[4823]: I1216 07:21:24.052387 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a6d8fd6802f304ad949608a80d6a22f469dd4491288ed083aa3c043a8d39e5e8" Dec 16 07:21:24 crc kubenswrapper[4823]: I1216 07:21:24.051884 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-vvf8j" Dec 16 07:21:24 crc kubenswrapper[4823]: I1216 07:21:24.297132 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 16 07:21:24 crc kubenswrapper[4823]: I1216 07:21:24.297378 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="541a384f-aa52-43ce-b85d-e3475ad49acd" containerName="nova-api-log" containerID="cri-o://368d9aec36767fcc7a48f7f97635938a1dba7d4d72dfda2a3c5838680ef2c087" gracePeriod=30 Dec 16 07:21:24 crc kubenswrapper[4823]: I1216 07:21:24.297464 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="541a384f-aa52-43ce-b85d-e3475ad49acd" containerName="nova-api-api" containerID="cri-o://dfe650b69a33382b4e88ea0e1ea3dcf0cdeeeca7466deeb1ea4883e6de434868" gracePeriod=30 Dec 16 07:21:24 crc kubenswrapper[4823]: I1216 07:21:24.331489 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 16 07:21:24 crc kubenswrapper[4823]: I1216 07:21:24.331689 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="9c5cc73c-9d84-405e-b093-d6c721a739c8" containerName="nova-scheduler-scheduler" containerID="cri-o://0cc0a9a38c2935b48929c447d78701bf7998435b0f11f9050ac1a1cdb9405733" gracePeriod=30 Dec 16 07:21:24 crc kubenswrapper[4823]: I1216 07:21:24.369951 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 16 07:21:24 crc kubenswrapper[4823]: I1216 07:21:24.370247 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="b89e60a8-db7c-4e57-ac6b-aab09909e7ad" containerName="nova-metadata-log" containerID="cri-o://01c19397046d515a2341974fd6714585936ff05125e45b62837c4191dd2de677" gracePeriod=30 Dec 16 07:21:24 crc kubenswrapper[4823]: I1216 07:21:24.370859 4823 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="b89e60a8-db7c-4e57-ac6b-aab09909e7ad" containerName="nova-metadata-metadata" containerID="cri-o://c62da01b313746c6d90969373bce73cebad0d872877bc5e5c836f8ce324d682f" gracePeriod=30 Dec 16 07:21:24 crc kubenswrapper[4823]: I1216 07:21:24.391066 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 16 07:21:24 crc kubenswrapper[4823]: I1216 07:21:24.401914 4823 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="b89e60a8-db7c-4e57-ac6b-aab09909e7ad" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.191:8775/\": EOF" Dec 16 07:21:24 crc kubenswrapper[4823]: I1216 07:21:24.771340 4823 scope.go:117] "RemoveContainer" containerID="37b5da4c3e0632087412acf947c72a2aad7577385641e763185ee25747c43921" Dec 16 07:21:24 crc kubenswrapper[4823]: E1216 07:21:24.771539 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 07:21:25 crc kubenswrapper[4823]: I1216 07:21:25.073329 4823 generic.go:334] "Generic (PLEG): container finished" podID="541a384f-aa52-43ce-b85d-e3475ad49acd" containerID="dfe650b69a33382b4e88ea0e1ea3dcf0cdeeeca7466deeb1ea4883e6de434868" exitCode=0 Dec 16 07:21:25 crc kubenswrapper[4823]: I1216 07:21:25.073649 4823 generic.go:334] "Generic (PLEG): container finished" podID="541a384f-aa52-43ce-b85d-e3475ad49acd" containerID="368d9aec36767fcc7a48f7f97635938a1dba7d4d72dfda2a3c5838680ef2c087" exitCode=143 Dec 16 07:21:25 crc kubenswrapper[4823]: I1216 
07:21:25.073394 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"541a384f-aa52-43ce-b85d-e3475ad49acd","Type":"ContainerDied","Data":"dfe650b69a33382b4e88ea0e1ea3dcf0cdeeeca7466deeb1ea4883e6de434868"} Dec 16 07:21:25 crc kubenswrapper[4823]: I1216 07:21:25.073758 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"541a384f-aa52-43ce-b85d-e3475ad49acd","Type":"ContainerDied","Data":"368d9aec36767fcc7a48f7f97635938a1dba7d4d72dfda2a3c5838680ef2c087"} Dec 16 07:21:25 crc kubenswrapper[4823]: I1216 07:21:25.076261 4823 generic.go:334] "Generic (PLEG): container finished" podID="b89e60a8-db7c-4e57-ac6b-aab09909e7ad" containerID="01c19397046d515a2341974fd6714585936ff05125e45b62837c4191dd2de677" exitCode=143 Dec 16 07:21:25 crc kubenswrapper[4823]: I1216 07:21:25.076321 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b89e60a8-db7c-4e57-ac6b-aab09909e7ad","Type":"ContainerDied","Data":"01c19397046d515a2341974fd6714585936ff05125e45b62837c4191dd2de677"} Dec 16 07:21:25 crc kubenswrapper[4823]: I1216 07:21:25.341216 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 16 07:21:25 crc kubenswrapper[4823]: I1216 07:21:25.440642 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/541a384f-aa52-43ce-b85d-e3475ad49acd-combined-ca-bundle\") pod \"541a384f-aa52-43ce-b85d-e3475ad49acd\" (UID: \"541a384f-aa52-43ce-b85d-e3475ad49acd\") " Dec 16 07:21:25 crc kubenswrapper[4823]: I1216 07:21:25.440757 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z9qcp\" (UniqueName: \"kubernetes.io/projected/541a384f-aa52-43ce-b85d-e3475ad49acd-kube-api-access-z9qcp\") pod \"541a384f-aa52-43ce-b85d-e3475ad49acd\" (UID: \"541a384f-aa52-43ce-b85d-e3475ad49acd\") " Dec 16 07:21:25 crc kubenswrapper[4823]: I1216 07:21:25.440931 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/541a384f-aa52-43ce-b85d-e3475ad49acd-logs\") pod \"541a384f-aa52-43ce-b85d-e3475ad49acd\" (UID: \"541a384f-aa52-43ce-b85d-e3475ad49acd\") " Dec 16 07:21:25 crc kubenswrapper[4823]: I1216 07:21:25.440963 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/541a384f-aa52-43ce-b85d-e3475ad49acd-internal-tls-certs\") pod \"541a384f-aa52-43ce-b85d-e3475ad49acd\" (UID: \"541a384f-aa52-43ce-b85d-e3475ad49acd\") " Dec 16 07:21:25 crc kubenswrapper[4823]: I1216 07:21:25.441005 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/541a384f-aa52-43ce-b85d-e3475ad49acd-public-tls-certs\") pod \"541a384f-aa52-43ce-b85d-e3475ad49acd\" (UID: \"541a384f-aa52-43ce-b85d-e3475ad49acd\") " Dec 16 07:21:25 crc kubenswrapper[4823]: I1216 07:21:25.441051 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/541a384f-aa52-43ce-b85d-e3475ad49acd-config-data\") pod \"541a384f-aa52-43ce-b85d-e3475ad49acd\" (UID: \"541a384f-aa52-43ce-b85d-e3475ad49acd\") " Dec 16 07:21:25 crc kubenswrapper[4823]: I1216 07:21:25.441611 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/541a384f-aa52-43ce-b85d-e3475ad49acd-logs" (OuterVolumeSpecName: "logs") pod "541a384f-aa52-43ce-b85d-e3475ad49acd" (UID: "541a384f-aa52-43ce-b85d-e3475ad49acd"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:21:25 crc kubenswrapper[4823]: I1216 07:21:25.447283 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/541a384f-aa52-43ce-b85d-e3475ad49acd-kube-api-access-z9qcp" (OuterVolumeSpecName: "kube-api-access-z9qcp") pod "541a384f-aa52-43ce-b85d-e3475ad49acd" (UID: "541a384f-aa52-43ce-b85d-e3475ad49acd"). InnerVolumeSpecName "kube-api-access-z9qcp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:21:25 crc kubenswrapper[4823]: I1216 07:21:25.466163 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/541a384f-aa52-43ce-b85d-e3475ad49acd-config-data" (OuterVolumeSpecName: "config-data") pod "541a384f-aa52-43ce-b85d-e3475ad49acd" (UID: "541a384f-aa52-43ce-b85d-e3475ad49acd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:21:25 crc kubenswrapper[4823]: I1216 07:21:25.477226 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/541a384f-aa52-43ce-b85d-e3475ad49acd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "541a384f-aa52-43ce-b85d-e3475ad49acd" (UID: "541a384f-aa52-43ce-b85d-e3475ad49acd"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:21:25 crc kubenswrapper[4823]: I1216 07:21:25.494820 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/541a384f-aa52-43ce-b85d-e3475ad49acd-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "541a384f-aa52-43ce-b85d-e3475ad49acd" (UID: "541a384f-aa52-43ce-b85d-e3475ad49acd"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:21:25 crc kubenswrapper[4823]: I1216 07:21:25.497179 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/541a384f-aa52-43ce-b85d-e3475ad49acd-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "541a384f-aa52-43ce-b85d-e3475ad49acd" (UID: "541a384f-aa52-43ce-b85d-e3475ad49acd"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:21:25 crc kubenswrapper[4823]: I1216 07:21:25.542915 4823 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/541a384f-aa52-43ce-b85d-e3475ad49acd-logs\") on node \"crc\" DevicePath \"\"" Dec 16 07:21:25 crc kubenswrapper[4823]: I1216 07:21:25.542953 4823 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/541a384f-aa52-43ce-b85d-e3475ad49acd-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 07:21:25 crc kubenswrapper[4823]: I1216 07:21:25.542964 4823 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/541a384f-aa52-43ce-b85d-e3475ad49acd-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 07:21:25 crc kubenswrapper[4823]: I1216 07:21:25.542973 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/541a384f-aa52-43ce-b85d-e3475ad49acd-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 
07:21:25 crc kubenswrapper[4823]: I1216 07:21:25.542986 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/541a384f-aa52-43ce-b85d-e3475ad49acd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:21:25 crc kubenswrapper[4823]: I1216 07:21:25.542996 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z9qcp\" (UniqueName: \"kubernetes.io/projected/541a384f-aa52-43ce-b85d-e3475ad49acd-kube-api-access-z9qcp\") on node \"crc\" DevicePath \"\"" Dec 16 07:21:25 crc kubenswrapper[4823]: I1216 07:21:25.890651 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 16 07:21:25 crc kubenswrapper[4823]: I1216 07:21:25.950178 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gvvx6\" (UniqueName: \"kubernetes.io/projected/9c5cc73c-9d84-405e-b093-d6c721a739c8-kube-api-access-gvvx6\") pod \"9c5cc73c-9d84-405e-b093-d6c721a739c8\" (UID: \"9c5cc73c-9d84-405e-b093-d6c721a739c8\") " Dec 16 07:21:25 crc kubenswrapper[4823]: I1216 07:21:25.950935 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c5cc73c-9d84-405e-b093-d6c721a739c8-config-data\") pod \"9c5cc73c-9d84-405e-b093-d6c721a739c8\" (UID: \"9c5cc73c-9d84-405e-b093-d6c721a739c8\") " Dec 16 07:21:25 crc kubenswrapper[4823]: I1216 07:21:25.951054 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c5cc73c-9d84-405e-b093-d6c721a739c8-combined-ca-bundle\") pod \"9c5cc73c-9d84-405e-b093-d6c721a739c8\" (UID: \"9c5cc73c-9d84-405e-b093-d6c721a739c8\") " Dec 16 07:21:25 crc kubenswrapper[4823]: I1216 07:21:25.956105 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/9c5cc73c-9d84-405e-b093-d6c721a739c8-kube-api-access-gvvx6" (OuterVolumeSpecName: "kube-api-access-gvvx6") pod "9c5cc73c-9d84-405e-b093-d6c721a739c8" (UID: "9c5cc73c-9d84-405e-b093-d6c721a739c8"). InnerVolumeSpecName "kube-api-access-gvvx6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:21:25 crc kubenswrapper[4823]: I1216 07:21:25.981216 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c5cc73c-9d84-405e-b093-d6c721a739c8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9c5cc73c-9d84-405e-b093-d6c721a739c8" (UID: "9c5cc73c-9d84-405e-b093-d6c721a739c8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:21:25 crc kubenswrapper[4823]: I1216 07:21:25.994195 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c5cc73c-9d84-405e-b093-d6c721a739c8-config-data" (OuterVolumeSpecName: "config-data") pod "9c5cc73c-9d84-405e-b093-d6c721a739c8" (UID: "9c5cc73c-9d84-405e-b093-d6c721a739c8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:21:26 crc kubenswrapper[4823]: I1216 07:21:26.053993 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gvvx6\" (UniqueName: \"kubernetes.io/projected/9c5cc73c-9d84-405e-b093-d6c721a739c8-kube-api-access-gvvx6\") on node \"crc\" DevicePath \"\"" Dec 16 07:21:26 crc kubenswrapper[4823]: I1216 07:21:26.054042 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c5cc73c-9d84-405e-b093-d6c721a739c8-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 07:21:26 crc kubenswrapper[4823]: I1216 07:21:26.054053 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c5cc73c-9d84-405e-b093-d6c721a739c8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:21:26 crc kubenswrapper[4823]: I1216 07:21:26.087891 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"541a384f-aa52-43ce-b85d-e3475ad49acd","Type":"ContainerDied","Data":"c28e8b1a07ab723bc979477bd9ef0baa555d24c7ee70346a0319372fd90c4362"} Dec 16 07:21:26 crc kubenswrapper[4823]: I1216 07:21:26.087937 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 16 07:21:26 crc kubenswrapper[4823]: I1216 07:21:26.087961 4823 scope.go:117] "RemoveContainer" containerID="dfe650b69a33382b4e88ea0e1ea3dcf0cdeeeca7466deeb1ea4883e6de434868" Dec 16 07:21:26 crc kubenswrapper[4823]: I1216 07:21:26.090728 4823 generic.go:334] "Generic (PLEG): container finished" podID="9c5cc73c-9d84-405e-b093-d6c721a739c8" containerID="0cc0a9a38c2935b48929c447d78701bf7998435b0f11f9050ac1a1cdb9405733" exitCode=0 Dec 16 07:21:26 crc kubenswrapper[4823]: I1216 07:21:26.090766 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"9c5cc73c-9d84-405e-b093-d6c721a739c8","Type":"ContainerDied","Data":"0cc0a9a38c2935b48929c447d78701bf7998435b0f11f9050ac1a1cdb9405733"} Dec 16 07:21:26 crc kubenswrapper[4823]: I1216 07:21:26.090809 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"9c5cc73c-9d84-405e-b093-d6c721a739c8","Type":"ContainerDied","Data":"483327e9f27dc555d169b453eeb0d6919819d632910c977c2e17de8adbaf3e6e"} Dec 16 07:21:26 crc kubenswrapper[4823]: I1216 07:21:26.090782 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 16 07:21:26 crc kubenswrapper[4823]: I1216 07:21:26.116466 4823 scope.go:117] "RemoveContainer" containerID="368d9aec36767fcc7a48f7f97635938a1dba7d4d72dfda2a3c5838680ef2c087" Dec 16 07:21:26 crc kubenswrapper[4823]: I1216 07:21:26.143713 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 16 07:21:26 crc kubenswrapper[4823]: I1216 07:21:26.156290 4823 scope.go:117] "RemoveContainer" containerID="0cc0a9a38c2935b48929c447d78701bf7998435b0f11f9050ac1a1cdb9405733" Dec 16 07:21:26 crc kubenswrapper[4823]: I1216 07:21:26.163886 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 16 07:21:26 crc kubenswrapper[4823]: I1216 07:21:26.178639 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 16 07:21:26 crc kubenswrapper[4823]: I1216 07:21:26.190748 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 16 07:21:26 crc kubenswrapper[4823]: I1216 07:21:26.198373 4823 scope.go:117] "RemoveContainer" containerID="0cc0a9a38c2935b48929c447d78701bf7998435b0f11f9050ac1a1cdb9405733" Dec 16 07:21:26 crc kubenswrapper[4823]: I1216 07:21:26.199747 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 16 07:21:26 crc kubenswrapper[4823]: E1216 07:21:26.200308 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b6c804c-4a6b-4061-95f4-9a8c96167f76" containerName="nova-manage" Dec 16 07:21:26 crc kubenswrapper[4823]: I1216 07:21:26.200606 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b6c804c-4a6b-4061-95f4-9a8c96167f76" containerName="nova-manage" Dec 16 07:21:26 crc kubenswrapper[4823]: E1216 07:21:26.200731 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3c3ac16-0b4e-4828-a690-d740851a5ede" containerName="registry-server" Dec 16 07:21:26 crc kubenswrapper[4823]: I1216 07:21:26.200812 4823 
state_mem.go:107] "Deleted CPUSet assignment" podUID="f3c3ac16-0b4e-4828-a690-d740851a5ede" containerName="registry-server" Dec 16 07:21:26 crc kubenswrapper[4823]: E1216 07:21:26.200903 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="541a384f-aa52-43ce-b85d-e3475ad49acd" containerName="nova-api-log" Dec 16 07:21:26 crc kubenswrapper[4823]: I1216 07:21:26.200977 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="541a384f-aa52-43ce-b85d-e3475ad49acd" containerName="nova-api-log" Dec 16 07:21:26 crc kubenswrapper[4823]: E1216 07:21:26.201093 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="541a384f-aa52-43ce-b85d-e3475ad49acd" containerName="nova-api-api" Dec 16 07:21:26 crc kubenswrapper[4823]: I1216 07:21:26.201283 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="541a384f-aa52-43ce-b85d-e3475ad49acd" containerName="nova-api-api" Dec 16 07:21:26 crc kubenswrapper[4823]: E1216 07:21:26.201400 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d61c0ba6-5bd0-4f7f-85f7-28eae946d2aa" containerName="init" Dec 16 07:21:26 crc kubenswrapper[4823]: I1216 07:21:26.201497 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="d61c0ba6-5bd0-4f7f-85f7-28eae946d2aa" containerName="init" Dec 16 07:21:26 crc kubenswrapper[4823]: E1216 07:21:26.201623 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3c3ac16-0b4e-4828-a690-d740851a5ede" containerName="extract-content" Dec 16 07:21:26 crc kubenswrapper[4823]: I1216 07:21:26.201723 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3c3ac16-0b4e-4828-a690-d740851a5ede" containerName="extract-content" Dec 16 07:21:26 crc kubenswrapper[4823]: E1216 07:21:26.201829 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c5cc73c-9d84-405e-b093-d6c721a739c8" containerName="nova-scheduler-scheduler" Dec 16 07:21:26 crc kubenswrapper[4823]: I1216 07:21:26.201936 4823 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="9c5cc73c-9d84-405e-b093-d6c721a739c8" containerName="nova-scheduler-scheduler" Dec 16 07:21:26 crc kubenswrapper[4823]: E1216 07:21:26.202121 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d61c0ba6-5bd0-4f7f-85f7-28eae946d2aa" containerName="dnsmasq-dns" Dec 16 07:21:26 crc kubenswrapper[4823]: I1216 07:21:26.202234 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="d61c0ba6-5bd0-4f7f-85f7-28eae946d2aa" containerName="dnsmasq-dns" Dec 16 07:21:26 crc kubenswrapper[4823]: E1216 07:21:26.202341 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3c3ac16-0b4e-4828-a690-d740851a5ede" containerName="extract-utilities" Dec 16 07:21:26 crc kubenswrapper[4823]: I1216 07:21:26.202446 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3c3ac16-0b4e-4828-a690-d740851a5ede" containerName="extract-utilities" Dec 16 07:21:26 crc kubenswrapper[4823]: I1216 07:21:26.202752 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="541a384f-aa52-43ce-b85d-e3475ad49acd" containerName="nova-api-api" Dec 16 07:21:26 crc kubenswrapper[4823]: I1216 07:21:26.202882 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b6c804c-4a6b-4061-95f4-9a8c96167f76" containerName="nova-manage" Dec 16 07:21:26 crc kubenswrapper[4823]: I1216 07:21:26.202992 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="d61c0ba6-5bd0-4f7f-85f7-28eae946d2aa" containerName="dnsmasq-dns" Dec 16 07:21:26 crc kubenswrapper[4823]: I1216 07:21:26.203117 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c5cc73c-9d84-405e-b093-d6c721a739c8" containerName="nova-scheduler-scheduler" Dec 16 07:21:26 crc kubenswrapper[4823]: I1216 07:21:26.203229 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3c3ac16-0b4e-4828-a690-d740851a5ede" containerName="registry-server" Dec 16 07:21:26 crc kubenswrapper[4823]: I1216 07:21:26.203359 4823 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="541a384f-aa52-43ce-b85d-e3475ad49acd" containerName="nova-api-log" Dec 16 07:21:26 crc kubenswrapper[4823]: E1216 07:21:26.201228 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0cc0a9a38c2935b48929c447d78701bf7998435b0f11f9050ac1a1cdb9405733\": container with ID starting with 0cc0a9a38c2935b48929c447d78701bf7998435b0f11f9050ac1a1cdb9405733 not found: ID does not exist" containerID="0cc0a9a38c2935b48929c447d78701bf7998435b0f11f9050ac1a1cdb9405733" Dec 16 07:21:26 crc kubenswrapper[4823]: I1216 07:21:26.204148 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0cc0a9a38c2935b48929c447d78701bf7998435b0f11f9050ac1a1cdb9405733"} err="failed to get container status \"0cc0a9a38c2935b48929c447d78701bf7998435b0f11f9050ac1a1cdb9405733\": rpc error: code = NotFound desc = could not find container \"0cc0a9a38c2935b48929c447d78701bf7998435b0f11f9050ac1a1cdb9405733\": container with ID starting with 0cc0a9a38c2935b48929c447d78701bf7998435b0f11f9050ac1a1cdb9405733 not found: ID does not exist" Dec 16 07:21:26 crc kubenswrapper[4823]: I1216 07:21:26.204811 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 16 07:21:26 crc kubenswrapper[4823]: I1216 07:21:26.206861 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 16 07:21:26 crc kubenswrapper[4823]: I1216 07:21:26.207042 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 16 07:21:26 crc kubenswrapper[4823]: I1216 07:21:26.207508 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 16 07:21:26 crc kubenswrapper[4823]: I1216 07:21:26.207926 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 16 07:21:26 crc kubenswrapper[4823]: I1216 07:21:26.209408 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 16 07:21:26 crc kubenswrapper[4823]: I1216 07:21:26.210723 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 16 07:21:26 crc kubenswrapper[4823]: I1216 07:21:26.217654 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 16 07:21:26 crc kubenswrapper[4823]: I1216 07:21:26.228303 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 16 07:21:26 crc kubenswrapper[4823]: I1216 07:21:26.360817 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3d6c697-a49c-4919-81b5-6899a080d06b-internal-tls-certs\") pod \"nova-api-0\" (UID: \"d3d6c697-a49c-4919-81b5-6899a080d06b\") " pod="openstack/nova-api-0" Dec 16 07:21:26 crc kubenswrapper[4823]: I1216 07:21:26.360885 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3d6c697-a49c-4919-81b5-6899a080d06b-logs\") pod \"nova-api-0\" (UID: 
\"d3d6c697-a49c-4919-81b5-6899a080d06b\") " pod="openstack/nova-api-0" Dec 16 07:21:26 crc kubenswrapper[4823]: I1216 07:21:26.360958 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3d6c697-a49c-4919-81b5-6899a080d06b-config-data\") pod \"nova-api-0\" (UID: \"d3d6c697-a49c-4919-81b5-6899a080d06b\") " pod="openstack/nova-api-0" Dec 16 07:21:26 crc kubenswrapper[4823]: I1216 07:21:26.361163 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3d6c697-a49c-4919-81b5-6899a080d06b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d3d6c697-a49c-4919-81b5-6899a080d06b\") " pod="openstack/nova-api-0" Dec 16 07:21:26 crc kubenswrapper[4823]: I1216 07:21:26.361325 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a18c5d6c-3429-4aa3-b933-85176e0e5ece-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a18c5d6c-3429-4aa3-b933-85176e0e5ece\") " pod="openstack/nova-scheduler-0" Dec 16 07:21:26 crc kubenswrapper[4823]: I1216 07:21:26.361346 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82x4g\" (UniqueName: \"kubernetes.io/projected/d3d6c697-a49c-4919-81b5-6899a080d06b-kube-api-access-82x4g\") pod \"nova-api-0\" (UID: \"d3d6c697-a49c-4919-81b5-6899a080d06b\") " pod="openstack/nova-api-0" Dec 16 07:21:26 crc kubenswrapper[4823]: I1216 07:21:26.361371 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3d6c697-a49c-4919-81b5-6899a080d06b-public-tls-certs\") pod \"nova-api-0\" (UID: \"d3d6c697-a49c-4919-81b5-6899a080d06b\") " pod="openstack/nova-api-0" Dec 16 07:21:26 crc kubenswrapper[4823]: 
I1216 07:21:26.361405 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a18c5d6c-3429-4aa3-b933-85176e0e5ece-config-data\") pod \"nova-scheduler-0\" (UID: \"a18c5d6c-3429-4aa3-b933-85176e0e5ece\") " pod="openstack/nova-scheduler-0" Dec 16 07:21:26 crc kubenswrapper[4823]: I1216 07:21:26.361494 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m594w\" (UniqueName: \"kubernetes.io/projected/a18c5d6c-3429-4aa3-b933-85176e0e5ece-kube-api-access-m594w\") pod \"nova-scheduler-0\" (UID: \"a18c5d6c-3429-4aa3-b933-85176e0e5ece\") " pod="openstack/nova-scheduler-0" Dec 16 07:21:26 crc kubenswrapper[4823]: I1216 07:21:26.462750 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a18c5d6c-3429-4aa3-b933-85176e0e5ece-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a18c5d6c-3429-4aa3-b933-85176e0e5ece\") " pod="openstack/nova-scheduler-0" Dec 16 07:21:26 crc kubenswrapper[4823]: I1216 07:21:26.462787 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82x4g\" (UniqueName: \"kubernetes.io/projected/d3d6c697-a49c-4919-81b5-6899a080d06b-kube-api-access-82x4g\") pod \"nova-api-0\" (UID: \"d3d6c697-a49c-4919-81b5-6899a080d06b\") " pod="openstack/nova-api-0" Dec 16 07:21:26 crc kubenswrapper[4823]: I1216 07:21:26.462809 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3d6c697-a49c-4919-81b5-6899a080d06b-public-tls-certs\") pod \"nova-api-0\" (UID: \"d3d6c697-a49c-4919-81b5-6899a080d06b\") " pod="openstack/nova-api-0" Dec 16 07:21:26 crc kubenswrapper[4823]: I1216 07:21:26.462835 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/a18c5d6c-3429-4aa3-b933-85176e0e5ece-config-data\") pod \"nova-scheduler-0\" (UID: \"a18c5d6c-3429-4aa3-b933-85176e0e5ece\") " pod="openstack/nova-scheduler-0" Dec 16 07:21:26 crc kubenswrapper[4823]: I1216 07:21:26.462867 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m594w\" (UniqueName: \"kubernetes.io/projected/a18c5d6c-3429-4aa3-b933-85176e0e5ece-kube-api-access-m594w\") pod \"nova-scheduler-0\" (UID: \"a18c5d6c-3429-4aa3-b933-85176e0e5ece\") " pod="openstack/nova-scheduler-0" Dec 16 07:21:26 crc kubenswrapper[4823]: I1216 07:21:26.462898 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3d6c697-a49c-4919-81b5-6899a080d06b-internal-tls-certs\") pod \"nova-api-0\" (UID: \"d3d6c697-a49c-4919-81b5-6899a080d06b\") " pod="openstack/nova-api-0" Dec 16 07:21:26 crc kubenswrapper[4823]: I1216 07:21:26.462921 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3d6c697-a49c-4919-81b5-6899a080d06b-logs\") pod \"nova-api-0\" (UID: \"d3d6c697-a49c-4919-81b5-6899a080d06b\") " pod="openstack/nova-api-0" Dec 16 07:21:26 crc kubenswrapper[4823]: I1216 07:21:26.462954 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3d6c697-a49c-4919-81b5-6899a080d06b-config-data\") pod \"nova-api-0\" (UID: \"d3d6c697-a49c-4919-81b5-6899a080d06b\") " pod="openstack/nova-api-0" Dec 16 07:21:26 crc kubenswrapper[4823]: I1216 07:21:26.463051 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3d6c697-a49c-4919-81b5-6899a080d06b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d3d6c697-a49c-4919-81b5-6899a080d06b\") " pod="openstack/nova-api-0" Dec 16 07:21:26 crc kubenswrapper[4823]: 
I1216 07:21:26.463419 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3d6c697-a49c-4919-81b5-6899a080d06b-logs\") pod \"nova-api-0\" (UID: \"d3d6c697-a49c-4919-81b5-6899a080d06b\") " pod="openstack/nova-api-0" Dec 16 07:21:26 crc kubenswrapper[4823]: I1216 07:21:26.468392 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3d6c697-a49c-4919-81b5-6899a080d06b-internal-tls-certs\") pod \"nova-api-0\" (UID: \"d3d6c697-a49c-4919-81b5-6899a080d06b\") " pod="openstack/nova-api-0" Dec 16 07:21:26 crc kubenswrapper[4823]: I1216 07:21:26.468828 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a18c5d6c-3429-4aa3-b933-85176e0e5ece-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a18c5d6c-3429-4aa3-b933-85176e0e5ece\") " pod="openstack/nova-scheduler-0" Dec 16 07:21:26 crc kubenswrapper[4823]: I1216 07:21:26.468987 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3d6c697-a49c-4919-81b5-6899a080d06b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d3d6c697-a49c-4919-81b5-6899a080d06b\") " pod="openstack/nova-api-0" Dec 16 07:21:26 crc kubenswrapper[4823]: I1216 07:21:26.469010 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a18c5d6c-3429-4aa3-b933-85176e0e5ece-config-data\") pod \"nova-scheduler-0\" (UID: \"a18c5d6c-3429-4aa3-b933-85176e0e5ece\") " pod="openstack/nova-scheduler-0" Dec 16 07:21:26 crc kubenswrapper[4823]: I1216 07:21:26.470961 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3d6c697-a49c-4919-81b5-6899a080d06b-config-data\") pod \"nova-api-0\" (UID: 
\"d3d6c697-a49c-4919-81b5-6899a080d06b\") " pod="openstack/nova-api-0" Dec 16 07:21:26 crc kubenswrapper[4823]: I1216 07:21:26.471976 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3d6c697-a49c-4919-81b5-6899a080d06b-public-tls-certs\") pod \"nova-api-0\" (UID: \"d3d6c697-a49c-4919-81b5-6899a080d06b\") " pod="openstack/nova-api-0" Dec 16 07:21:26 crc kubenswrapper[4823]: I1216 07:21:26.496408 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m594w\" (UniqueName: \"kubernetes.io/projected/a18c5d6c-3429-4aa3-b933-85176e0e5ece-kube-api-access-m594w\") pod \"nova-scheduler-0\" (UID: \"a18c5d6c-3429-4aa3-b933-85176e0e5ece\") " pod="openstack/nova-scheduler-0" Dec 16 07:21:26 crc kubenswrapper[4823]: I1216 07:21:26.497784 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82x4g\" (UniqueName: \"kubernetes.io/projected/d3d6c697-a49c-4919-81b5-6899a080d06b-kube-api-access-82x4g\") pod \"nova-api-0\" (UID: \"d3d6c697-a49c-4919-81b5-6899a080d06b\") " pod="openstack/nova-api-0" Dec 16 07:21:26 crc kubenswrapper[4823]: I1216 07:21:26.539421 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 16 07:21:26 crc kubenswrapper[4823]: I1216 07:21:26.552537 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 16 07:21:27 crc kubenswrapper[4823]: I1216 07:21:27.064823 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 16 07:21:27 crc kubenswrapper[4823]: W1216 07:21:27.067054 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd3d6c697_a49c_4919_81b5_6899a080d06b.slice/crio-f6ec12bad23c8b3b31196727590c5f21c3001bb444f9eb1d2bcbda95f78a69e3 WatchSource:0}: Error finding container f6ec12bad23c8b3b31196727590c5f21c3001bb444f9eb1d2bcbda95f78a69e3: Status 404 returned error can't find the container with id f6ec12bad23c8b3b31196727590c5f21c3001bb444f9eb1d2bcbda95f78a69e3 Dec 16 07:21:27 crc kubenswrapper[4823]: I1216 07:21:27.102265 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d3d6c697-a49c-4919-81b5-6899a080d06b","Type":"ContainerStarted","Data":"f6ec12bad23c8b3b31196727590c5f21c3001bb444f9eb1d2bcbda95f78a69e3"} Dec 16 07:21:27 crc kubenswrapper[4823]: I1216 07:21:27.267439 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 16 07:21:27 crc kubenswrapper[4823]: W1216 07:21:27.278931 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda18c5d6c_3429_4aa3_b933_85176e0e5ece.slice/crio-b8f8e19bb3fc5f06b6f1cb8ab0b5c114739dfe21c06dc6eb8db75a76961ced2f WatchSource:0}: Error finding container b8f8e19bb3fc5f06b6f1cb8ab0b5c114739dfe21c06dc6eb8db75a76961ced2f: Status 404 returned error can't find the container with id b8f8e19bb3fc5f06b6f1cb8ab0b5c114739dfe21c06dc6eb8db75a76961ced2f Dec 16 07:21:27 crc kubenswrapper[4823]: I1216 07:21:27.786902 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="541a384f-aa52-43ce-b85d-e3475ad49acd" path="/var/lib/kubelet/pods/541a384f-aa52-43ce-b85d-e3475ad49acd/volumes" Dec 16 
07:21:27 crc kubenswrapper[4823]: I1216 07:21:27.788156 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c5cc73c-9d84-405e-b093-d6c721a739c8" path="/var/lib/kubelet/pods/9c5cc73c-9d84-405e-b093-d6c721a739c8/volumes" Dec 16 07:21:28 crc kubenswrapper[4823]: I1216 07:21:28.113102 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a18c5d6c-3429-4aa3-b933-85176e0e5ece","Type":"ContainerStarted","Data":"3650c948a8c447960d8862d51fa7c7f894dc0569eb954a455acb9fd443967ab1"} Dec 16 07:21:28 crc kubenswrapper[4823]: I1216 07:21:28.113141 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a18c5d6c-3429-4aa3-b933-85176e0e5ece","Type":"ContainerStarted","Data":"b8f8e19bb3fc5f06b6f1cb8ab0b5c114739dfe21c06dc6eb8db75a76961ced2f"} Dec 16 07:21:28 crc kubenswrapper[4823]: I1216 07:21:28.116847 4823 generic.go:334] "Generic (PLEG): container finished" podID="b89e60a8-db7c-4e57-ac6b-aab09909e7ad" containerID="c62da01b313746c6d90969373bce73cebad0d872877bc5e5c836f8ce324d682f" exitCode=0 Dec 16 07:21:28 crc kubenswrapper[4823]: I1216 07:21:28.116886 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b89e60a8-db7c-4e57-ac6b-aab09909e7ad","Type":"ContainerDied","Data":"c62da01b313746c6d90969373bce73cebad0d872877bc5e5c836f8ce324d682f"} Dec 16 07:21:28 crc kubenswrapper[4823]: I1216 07:21:28.118272 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d3d6c697-a49c-4919-81b5-6899a080d06b","Type":"ContainerStarted","Data":"f04505b3b1dfe8b6dfd28ec3fadb6fe3ba712cf0ca0ed6cc257b567eb1c5714b"} Dec 16 07:21:28 crc kubenswrapper[4823]: I1216 07:21:28.118291 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d3d6c697-a49c-4919-81b5-6899a080d06b","Type":"ContainerStarted","Data":"f33c995e4b22b44c31a3cf7f028d6d43a1e215de8f4963b068a6b9ffc12fa049"} 
Dec 16 07:21:28 crc kubenswrapper[4823]: I1216 07:21:28.155548 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.155527991 podStartE2EDuration="2.155527991s" podCreationTimestamp="2025-12-16 07:21:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 07:21:28.129680581 +0000 UTC m=+1566.618246704" watchObservedRunningTime="2025-12-16 07:21:28.155527991 +0000 UTC m=+1566.644094114" Dec 16 07:21:28 crc kubenswrapper[4823]: I1216 07:21:28.268549 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 16 07:21:28 crc kubenswrapper[4823]: I1216 07:21:28.286981 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.286962937 podStartE2EDuration="2.286962937s" podCreationTimestamp="2025-12-16 07:21:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 07:21:28.155981215 +0000 UTC m=+1566.644547348" watchObservedRunningTime="2025-12-16 07:21:28.286962937 +0000 UTC m=+1566.775529060" Dec 16 07:21:28 crc kubenswrapper[4823]: I1216 07:21:28.409316 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b89e60a8-db7c-4e57-ac6b-aab09909e7ad-config-data\") pod \"b89e60a8-db7c-4e57-ac6b-aab09909e7ad\" (UID: \"b89e60a8-db7c-4e57-ac6b-aab09909e7ad\") " Dec 16 07:21:28 crc kubenswrapper[4823]: I1216 07:21:28.409384 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b89e60a8-db7c-4e57-ac6b-aab09909e7ad-combined-ca-bundle\") pod \"b89e60a8-db7c-4e57-ac6b-aab09909e7ad\" (UID: \"b89e60a8-db7c-4e57-ac6b-aab09909e7ad\") " Dec 16 
07:21:28 crc kubenswrapper[4823]: I1216 07:21:28.409415 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b89e60a8-db7c-4e57-ac6b-aab09909e7ad-logs\") pod \"b89e60a8-db7c-4e57-ac6b-aab09909e7ad\" (UID: \"b89e60a8-db7c-4e57-ac6b-aab09909e7ad\") " Dec 16 07:21:28 crc kubenswrapper[4823]: I1216 07:21:28.409472 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z64cz\" (UniqueName: \"kubernetes.io/projected/b89e60a8-db7c-4e57-ac6b-aab09909e7ad-kube-api-access-z64cz\") pod \"b89e60a8-db7c-4e57-ac6b-aab09909e7ad\" (UID: \"b89e60a8-db7c-4e57-ac6b-aab09909e7ad\") " Dec 16 07:21:28 crc kubenswrapper[4823]: I1216 07:21:28.409535 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b89e60a8-db7c-4e57-ac6b-aab09909e7ad-nova-metadata-tls-certs\") pod \"b89e60a8-db7c-4e57-ac6b-aab09909e7ad\" (UID: \"b89e60a8-db7c-4e57-ac6b-aab09909e7ad\") " Dec 16 07:21:28 crc kubenswrapper[4823]: I1216 07:21:28.410892 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b89e60a8-db7c-4e57-ac6b-aab09909e7ad-logs" (OuterVolumeSpecName: "logs") pod "b89e60a8-db7c-4e57-ac6b-aab09909e7ad" (UID: "b89e60a8-db7c-4e57-ac6b-aab09909e7ad"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:21:28 crc kubenswrapper[4823]: I1216 07:21:28.415212 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b89e60a8-db7c-4e57-ac6b-aab09909e7ad-kube-api-access-z64cz" (OuterVolumeSpecName: "kube-api-access-z64cz") pod "b89e60a8-db7c-4e57-ac6b-aab09909e7ad" (UID: "b89e60a8-db7c-4e57-ac6b-aab09909e7ad"). InnerVolumeSpecName "kube-api-access-z64cz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:21:28 crc kubenswrapper[4823]: I1216 07:21:28.441072 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b89e60a8-db7c-4e57-ac6b-aab09909e7ad-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b89e60a8-db7c-4e57-ac6b-aab09909e7ad" (UID: "b89e60a8-db7c-4e57-ac6b-aab09909e7ad"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:21:28 crc kubenswrapper[4823]: I1216 07:21:28.443273 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b89e60a8-db7c-4e57-ac6b-aab09909e7ad-config-data" (OuterVolumeSpecName: "config-data") pod "b89e60a8-db7c-4e57-ac6b-aab09909e7ad" (UID: "b89e60a8-db7c-4e57-ac6b-aab09909e7ad"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:21:28 crc kubenswrapper[4823]: I1216 07:21:28.485859 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b89e60a8-db7c-4e57-ac6b-aab09909e7ad-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "b89e60a8-db7c-4e57-ac6b-aab09909e7ad" (UID: "b89e60a8-db7c-4e57-ac6b-aab09909e7ad"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:21:28 crc kubenswrapper[4823]: I1216 07:21:28.516042 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b89e60a8-db7c-4e57-ac6b-aab09909e7ad-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 07:21:28 crc kubenswrapper[4823]: I1216 07:21:28.516077 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b89e60a8-db7c-4e57-ac6b-aab09909e7ad-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:21:28 crc kubenswrapper[4823]: I1216 07:21:28.516095 4823 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b89e60a8-db7c-4e57-ac6b-aab09909e7ad-logs\") on node \"crc\" DevicePath \"\"" Dec 16 07:21:28 crc kubenswrapper[4823]: I1216 07:21:28.516106 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z64cz\" (UniqueName: \"kubernetes.io/projected/b89e60a8-db7c-4e57-ac6b-aab09909e7ad-kube-api-access-z64cz\") on node \"crc\" DevicePath \"\"" Dec 16 07:21:28 crc kubenswrapper[4823]: I1216 07:21:28.516118 4823 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b89e60a8-db7c-4e57-ac6b-aab09909e7ad-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 07:21:29 crc kubenswrapper[4823]: I1216 07:21:29.130304 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b89e60a8-db7c-4e57-ac6b-aab09909e7ad","Type":"ContainerDied","Data":"4e4d5dd17d1ecff3f0e2e6c39ba3e503ce1648ac35eb89efe9517e0b2056f216"} Dec 16 07:21:29 crc kubenswrapper[4823]: I1216 07:21:29.130390 4823 scope.go:117] "RemoveContainer" containerID="c62da01b313746c6d90969373bce73cebad0d872877bc5e5c836f8ce324d682f" Dec 16 07:21:29 crc kubenswrapper[4823]: I1216 07:21:29.130464 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 16 07:21:29 crc kubenswrapper[4823]: I1216 07:21:29.187524 4823 scope.go:117] "RemoveContainer" containerID="01c19397046d515a2341974fd6714585936ff05125e45b62837c4191dd2de677" Dec 16 07:21:29 crc kubenswrapper[4823]: I1216 07:21:29.205522 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 16 07:21:29 crc kubenswrapper[4823]: I1216 07:21:29.226216 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 16 07:21:29 crc kubenswrapper[4823]: I1216 07:21:29.236963 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 16 07:21:29 crc kubenswrapper[4823]: E1216 07:21:29.237466 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b89e60a8-db7c-4e57-ac6b-aab09909e7ad" containerName="nova-metadata-metadata" Dec 16 07:21:29 crc kubenswrapper[4823]: I1216 07:21:29.237491 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="b89e60a8-db7c-4e57-ac6b-aab09909e7ad" containerName="nova-metadata-metadata" Dec 16 07:21:29 crc kubenswrapper[4823]: E1216 07:21:29.237519 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b89e60a8-db7c-4e57-ac6b-aab09909e7ad" containerName="nova-metadata-log" Dec 16 07:21:29 crc kubenswrapper[4823]: I1216 07:21:29.237527 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="b89e60a8-db7c-4e57-ac6b-aab09909e7ad" containerName="nova-metadata-log" Dec 16 07:21:29 crc kubenswrapper[4823]: I1216 07:21:29.237763 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="b89e60a8-db7c-4e57-ac6b-aab09909e7ad" containerName="nova-metadata-log" Dec 16 07:21:29 crc kubenswrapper[4823]: I1216 07:21:29.237801 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="b89e60a8-db7c-4e57-ac6b-aab09909e7ad" containerName="nova-metadata-metadata" Dec 16 07:21:29 crc kubenswrapper[4823]: I1216 07:21:29.238911 4823 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 16 07:21:29 crc kubenswrapper[4823]: I1216 07:21:29.245464 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 16 07:21:29 crc kubenswrapper[4823]: I1216 07:21:29.245658 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 16 07:21:29 crc kubenswrapper[4823]: I1216 07:21:29.248795 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 16 07:21:29 crc kubenswrapper[4823]: I1216 07:21:29.342143 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d2faec4-82e9-409b-a6c1-93f8cd78b9ec-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6d2faec4-82e9-409b-a6c1-93f8cd78b9ec\") " pod="openstack/nova-metadata-0" Dec 16 07:21:29 crc kubenswrapper[4823]: I1216 07:21:29.342588 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d2faec4-82e9-409b-a6c1-93f8cd78b9ec-config-data\") pod \"nova-metadata-0\" (UID: \"6d2faec4-82e9-409b-a6c1-93f8cd78b9ec\") " pod="openstack/nova-metadata-0" Dec 16 07:21:29 crc kubenswrapper[4823]: I1216 07:21:29.342620 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzpxq\" (UniqueName: \"kubernetes.io/projected/6d2faec4-82e9-409b-a6c1-93f8cd78b9ec-kube-api-access-rzpxq\") pod \"nova-metadata-0\" (UID: \"6d2faec4-82e9-409b-a6c1-93f8cd78b9ec\") " pod="openstack/nova-metadata-0" Dec 16 07:21:29 crc kubenswrapper[4823]: I1216 07:21:29.342651 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/6d2faec4-82e9-409b-a6c1-93f8cd78b9ec-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"6d2faec4-82e9-409b-a6c1-93f8cd78b9ec\") " pod="openstack/nova-metadata-0" Dec 16 07:21:29 crc kubenswrapper[4823]: I1216 07:21:29.342673 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6d2faec4-82e9-409b-a6c1-93f8cd78b9ec-logs\") pod \"nova-metadata-0\" (UID: \"6d2faec4-82e9-409b-a6c1-93f8cd78b9ec\") " pod="openstack/nova-metadata-0" Dec 16 07:21:29 crc kubenswrapper[4823]: I1216 07:21:29.444641 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d2faec4-82e9-409b-a6c1-93f8cd78b9ec-config-data\") pod \"nova-metadata-0\" (UID: \"6d2faec4-82e9-409b-a6c1-93f8cd78b9ec\") " pod="openstack/nova-metadata-0" Dec 16 07:21:29 crc kubenswrapper[4823]: I1216 07:21:29.444738 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzpxq\" (UniqueName: \"kubernetes.io/projected/6d2faec4-82e9-409b-a6c1-93f8cd78b9ec-kube-api-access-rzpxq\") pod \"nova-metadata-0\" (UID: \"6d2faec4-82e9-409b-a6c1-93f8cd78b9ec\") " pod="openstack/nova-metadata-0" Dec 16 07:21:29 crc kubenswrapper[4823]: I1216 07:21:29.444811 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d2faec4-82e9-409b-a6c1-93f8cd78b9ec-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"6d2faec4-82e9-409b-a6c1-93f8cd78b9ec\") " pod="openstack/nova-metadata-0" Dec 16 07:21:29 crc kubenswrapper[4823]: I1216 07:21:29.444896 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6d2faec4-82e9-409b-a6c1-93f8cd78b9ec-logs\") pod \"nova-metadata-0\" (UID: \"6d2faec4-82e9-409b-a6c1-93f8cd78b9ec\") " 
pod="openstack/nova-metadata-0" Dec 16 07:21:29 crc kubenswrapper[4823]: I1216 07:21:29.445110 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d2faec4-82e9-409b-a6c1-93f8cd78b9ec-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6d2faec4-82e9-409b-a6c1-93f8cd78b9ec\") " pod="openstack/nova-metadata-0" Dec 16 07:21:29 crc kubenswrapper[4823]: I1216 07:21:29.447191 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6d2faec4-82e9-409b-a6c1-93f8cd78b9ec-logs\") pod \"nova-metadata-0\" (UID: \"6d2faec4-82e9-409b-a6c1-93f8cd78b9ec\") " pod="openstack/nova-metadata-0" Dec 16 07:21:29 crc kubenswrapper[4823]: I1216 07:21:29.450015 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d2faec4-82e9-409b-a6c1-93f8cd78b9ec-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"6d2faec4-82e9-409b-a6c1-93f8cd78b9ec\") " pod="openstack/nova-metadata-0" Dec 16 07:21:29 crc kubenswrapper[4823]: I1216 07:21:29.450286 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d2faec4-82e9-409b-a6c1-93f8cd78b9ec-config-data\") pod \"nova-metadata-0\" (UID: \"6d2faec4-82e9-409b-a6c1-93f8cd78b9ec\") " pod="openstack/nova-metadata-0" Dec 16 07:21:29 crc kubenswrapper[4823]: I1216 07:21:29.450486 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d2faec4-82e9-409b-a6c1-93f8cd78b9ec-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6d2faec4-82e9-409b-a6c1-93f8cd78b9ec\") " pod="openstack/nova-metadata-0" Dec 16 07:21:29 crc kubenswrapper[4823]: I1216 07:21:29.483328 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzpxq\" (UniqueName: 
\"kubernetes.io/projected/6d2faec4-82e9-409b-a6c1-93f8cd78b9ec-kube-api-access-rzpxq\") pod \"nova-metadata-0\" (UID: \"6d2faec4-82e9-409b-a6c1-93f8cd78b9ec\") " pod="openstack/nova-metadata-0" Dec 16 07:21:29 crc kubenswrapper[4823]: I1216 07:21:29.561853 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 16 07:21:29 crc kubenswrapper[4823]: I1216 07:21:29.793448 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b89e60a8-db7c-4e57-ac6b-aab09909e7ad" path="/var/lib/kubelet/pods/b89e60a8-db7c-4e57-ac6b-aab09909e7ad/volumes" Dec 16 07:21:30 crc kubenswrapper[4823]: I1216 07:21:30.090633 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 16 07:21:30 crc kubenswrapper[4823]: W1216 07:21:30.092228 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6d2faec4_82e9_409b_a6c1_93f8cd78b9ec.slice/crio-47e968bf3ad64e6c9679d57265619c43ab3f2db861c24ec3d5b4f2967fa690f1 WatchSource:0}: Error finding container 47e968bf3ad64e6c9679d57265619c43ab3f2db861c24ec3d5b4f2967fa690f1: Status 404 returned error can't find the container with id 47e968bf3ad64e6c9679d57265619c43ab3f2db861c24ec3d5b4f2967fa690f1 Dec 16 07:21:30 crc kubenswrapper[4823]: I1216 07:21:30.157766 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6d2faec4-82e9-409b-a6c1-93f8cd78b9ec","Type":"ContainerStarted","Data":"47e968bf3ad64e6c9679d57265619c43ab3f2db861c24ec3d5b4f2967fa690f1"} Dec 16 07:21:31 crc kubenswrapper[4823]: I1216 07:21:31.173938 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6d2faec4-82e9-409b-a6c1-93f8cd78b9ec","Type":"ContainerStarted","Data":"7c64a48ce42e0fd4916be0a094d9954ddcea66b005eeff168ca0a4dec1eb2cff"} Dec 16 07:21:31 crc kubenswrapper[4823]: I1216 07:21:31.174295 4823 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6d2faec4-82e9-409b-a6c1-93f8cd78b9ec","Type":"ContainerStarted","Data":"3364c619253b1feab519f7ec3af4216d4032c2f42e27c3ea18c8f718e361769b"} Dec 16 07:21:31 crc kubenswrapper[4823]: I1216 07:21:31.205984 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.205963561 podStartE2EDuration="2.205963561s" podCreationTimestamp="2025-12-16 07:21:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 07:21:31.199910211 +0000 UTC m=+1569.688476344" watchObservedRunningTime="2025-12-16 07:21:31.205963561 +0000 UTC m=+1569.694529694" Dec 16 07:21:31 crc kubenswrapper[4823]: I1216 07:21:31.554237 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 16 07:21:34 crc kubenswrapper[4823]: I1216 07:21:34.562957 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 16 07:21:34 crc kubenswrapper[4823]: I1216 07:21:34.563618 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 16 07:21:36 crc kubenswrapper[4823]: I1216 07:21:36.539820 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 16 07:21:36 crc kubenswrapper[4823]: I1216 07:21:36.540163 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 16 07:21:36 crc kubenswrapper[4823]: I1216 07:21:36.553712 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 16 07:21:36 crc kubenswrapper[4823]: I1216 07:21:36.581241 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 16 07:21:37 crc kubenswrapper[4823]: I1216 
07:21:37.281821 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 16 07:21:37 crc kubenswrapper[4823]: I1216 07:21:37.550172 4823 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d3d6c697-a49c-4919-81b5-6899a080d06b" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.198:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 16 07:21:37 crc kubenswrapper[4823]: I1216 07:21:37.550209 4823 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d3d6c697-a49c-4919-81b5-6899a080d06b" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.198:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 16 07:21:39 crc kubenswrapper[4823]: I1216 07:21:39.562904 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 16 07:21:39 crc kubenswrapper[4823]: I1216 07:21:39.563406 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 16 07:21:39 crc kubenswrapper[4823]: I1216 07:21:39.773199 4823 scope.go:117] "RemoveContainer" containerID="37b5da4c3e0632087412acf947c72a2aad7577385641e763185ee25747c43921" Dec 16 07:21:39 crc kubenswrapper[4823]: E1216 07:21:39.773490 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 07:21:40 crc kubenswrapper[4823]: I1216 07:21:40.576463 4823 prober.go:107] "Probe failed" probeType="Startup" 
pod="openstack/nova-metadata-0" podUID="6d2faec4-82e9-409b-a6c1-93f8cd78b9ec" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.200:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 16 07:21:40 crc kubenswrapper[4823]: I1216 07:21:40.576470 4823 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="6d2faec4-82e9-409b-a6c1-93f8cd78b9ec" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.200:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 16 07:21:45 crc kubenswrapper[4823]: I1216 07:21:45.388143 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 16 07:21:46 crc kubenswrapper[4823]: I1216 07:21:46.548907 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 16 07:21:46 crc kubenswrapper[4823]: I1216 07:21:46.549595 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 16 07:21:46 crc kubenswrapper[4823]: I1216 07:21:46.550942 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 16 07:21:46 crc kubenswrapper[4823]: I1216 07:21:46.566811 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 16 07:21:47 crc kubenswrapper[4823]: I1216 07:21:47.330505 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 16 07:21:47 crc kubenswrapper[4823]: I1216 07:21:47.338339 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 16 07:21:49 crc kubenswrapper[4823]: I1216 07:21:49.571120 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 16 07:21:49 crc kubenswrapper[4823]: I1216 
07:21:49.574881 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 16 07:21:49 crc kubenswrapper[4823]: I1216 07:21:49.580096 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 16 07:21:50 crc kubenswrapper[4823]: I1216 07:21:50.370815 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 16 07:21:54 crc kubenswrapper[4823]: I1216 07:21:54.772938 4823 scope.go:117] "RemoveContainer" containerID="37b5da4c3e0632087412acf947c72a2aad7577385641e763185ee25747c43921" Dec 16 07:21:54 crc kubenswrapper[4823]: E1216 07:21:54.774300 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 07:22:07 crc kubenswrapper[4823]: I1216 07:22:07.778082 4823 scope.go:117] "RemoveContainer" containerID="37b5da4c3e0632087412acf947c72a2aad7577385641e763185ee25747c43921" Dec 16 07:22:07 crc kubenswrapper[4823]: E1216 07:22:07.782810 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 07:22:07 crc kubenswrapper[4823]: I1216 07:22:07.945631 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Dec 16 07:22:07 crc kubenswrapper[4823]: 
I1216 07:22:07.945932 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="b5f6144a-70e4-4772-a8d8-2adf38127212" containerName="openstackclient" containerID="cri-o://f115ec7d425d70b2afcfd5cf1785d9ea4d296e40ca9ff51d30788a90679af605" gracePeriod=2 Dec 16 07:22:07 crc kubenswrapper[4823]: I1216 07:22:07.978094 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Dec 16 07:22:08 crc kubenswrapper[4823]: I1216 07:22:08.272743 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 16 07:22:08 crc kubenswrapper[4823]: I1216 07:22:08.273008 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="a27cd126-6c5b-4e95-b313-0bb19568f42a" containerName="cinder-scheduler" containerID="cri-o://dec8a740e5a159ade11e7e1e6846443afc7b4cf141676ae0fa16ecdefdc7efc5" gracePeriod=30 Dec 16 07:22:08 crc kubenswrapper[4823]: I1216 07:22:08.273287 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="a27cd126-6c5b-4e95-b313-0bb19568f42a" containerName="probe" containerID="cri-o://457377de0d8e4d6837606a566ccbe412c1ad0e48f0692027311c6646fc5a9d02" gracePeriod=30 Dec 16 07:22:08 crc kubenswrapper[4823]: I1216 07:22:08.381654 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 16 07:22:08 crc kubenswrapper[4823]: I1216 07:22:08.400825 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 16 07:22:08 crc kubenswrapper[4823]: I1216 07:22:08.401171 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="b566f9ee-8a75-4041-aac4-1573ca610541" containerName="openstack-network-exporter" containerID="cri-o://ad6913219d4e64984189276d714fe66372819d7e73a5bd2b7c37eef8a55f9181" gracePeriod=300 Dec 16 07:22:08 crc 
kubenswrapper[4823]: I1216 07:22:08.551468 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 16 07:22:08 crc kubenswrapper[4823]: I1216 07:22:08.551921 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="17cbb31a-6067-4925-ba57-956baf53ce8b" containerName="cinder-api-log" containerID="cri-o://51565ca562af3db8782b4b38fb1d3b09a6c7f19f5c5020ef8e0d0b046831c28d" gracePeriod=30 Dec 16 07:22:08 crc kubenswrapper[4823]: I1216 07:22:08.552283 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="17cbb31a-6067-4925-ba57-956baf53ce8b" containerName="cinder-api" containerID="cri-o://a2e711057ef9e93e470930a37179c721716096884ec2356c0cc2c2d27a2dddf4" gracePeriod=30 Dec 16 07:22:08 crc kubenswrapper[4823]: E1216 07:22:08.569395 4823 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Dec 16 07:22:08 crc kubenswrapper[4823]: E1216 07:22:08.569460 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a686a945-8fa0-406c-ac01-cf061c865a28-config-data podName:a686a945-8fa0-406c-ac01-cf061c865a28 nodeName:}" failed. No retries permitted until 2025-12-16 07:22:09.06944116 +0000 UTC m=+1607.558007283 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/a686a945-8fa0-406c-ac01-cf061c865a28-config-data") pod "rabbitmq-server-0" (UID: "a686a945-8fa0-406c-ac01-cf061c865a28") : configmap "rabbitmq-config-data" not found Dec 16 07:22:08 crc kubenswrapper[4823]: I1216 07:22:08.592563 4823 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cinder-api-0" podUID="17cbb31a-6067-4925-ba57-956baf53ce8b" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.163:8776/healthcheck\": EOF" Dec 16 07:22:08 crc kubenswrapper[4823]: I1216 07:22:08.612141 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican5a20-account-delete-f8kwx"] Dec 16 07:22:08 crc kubenswrapper[4823]: E1216 07:22:08.612538 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5f6144a-70e4-4772-a8d8-2adf38127212" containerName="openstackclient" Dec 16 07:22:08 crc kubenswrapper[4823]: I1216 07:22:08.612558 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5f6144a-70e4-4772-a8d8-2adf38127212" containerName="openstackclient" Dec 16 07:22:08 crc kubenswrapper[4823]: I1216 07:22:08.612754 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5f6144a-70e4-4772-a8d8-2adf38127212" containerName="openstackclient" Dec 16 07:22:08 crc kubenswrapper[4823]: I1216 07:22:08.613361 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican5a20-account-delete-f8kwx" Dec 16 07:22:08 crc kubenswrapper[4823]: I1216 07:22:08.636532 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican5a20-account-delete-f8kwx"] Dec 16 07:22:08 crc kubenswrapper[4823]: I1216 07:22:08.729000 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placementf3f4-account-delete-kq7rl"] Dec 16 07:22:08 crc kubenswrapper[4823]: I1216 07:22:08.732343 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placementf3f4-account-delete-kq7rl" Dec 16 07:22:08 crc kubenswrapper[4823]: I1216 07:22:08.772136 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 16 07:22:08 crc kubenswrapper[4823]: I1216 07:22:08.772425 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="603d469a-39a2-4d84-87cb-f2c7499b7a28" containerName="openstack-network-exporter" containerID="cri-o://06ecae0f130331b9c70dbb4604848fad60c6fe33be915c08a2e497633d78988f" gracePeriod=300 Dec 16 07:22:08 crc kubenswrapper[4823]: I1216 07:22:08.779935 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/81b4a0e9-2642-4fc1-b6fc-f5a0367a34ab-operator-scripts\") pod \"barbican5a20-account-delete-f8kwx\" (UID: \"81b4a0e9-2642-4fc1-b6fc-f5a0367a34ab\") " pod="openstack/barbican5a20-account-delete-f8kwx" Dec 16 07:22:08 crc kubenswrapper[4823]: I1216 07:22:08.780283 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-msrst\" (UniqueName: \"kubernetes.io/projected/81b4a0e9-2642-4fc1-b6fc-f5a0367a34ab-kube-api-access-msrst\") pod \"barbican5a20-account-delete-f8kwx\" (UID: \"81b4a0e9-2642-4fc1-b6fc-f5a0367a34ab\") " pod="openstack/barbican5a20-account-delete-f8kwx" Dec 16 07:22:08 crc kubenswrapper[4823]: I1216 07:22:08.824154 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 16 07:22:08 crc kubenswrapper[4823]: I1216 07:22:08.854373 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placementf3f4-account-delete-kq7rl"] Dec 16 07:22:08 crc kubenswrapper[4823]: I1216 07:22:08.879365 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glanceec9f-account-delete-klr92"] Dec 16 07:22:08 crc kubenswrapper[4823]: I1216 
07:22:08.880525 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glanceec9f-account-delete-klr92" Dec 16 07:22:08 crc kubenswrapper[4823]: I1216 07:22:08.882986 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/81b4a0e9-2642-4fc1-b6fc-f5a0367a34ab-operator-scripts\") pod \"barbican5a20-account-delete-f8kwx\" (UID: \"81b4a0e9-2642-4fc1-b6fc-f5a0367a34ab\") " pod="openstack/barbican5a20-account-delete-f8kwx" Dec 16 07:22:08 crc kubenswrapper[4823]: I1216 07:22:08.883114 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0bed5482-3232-4318-b8a0-dcfd51d8611b-operator-scripts\") pod \"placementf3f4-account-delete-kq7rl\" (UID: \"0bed5482-3232-4318-b8a0-dcfd51d8611b\") " pod="openstack/placementf3f4-account-delete-kq7rl" Dec 16 07:22:08 crc kubenswrapper[4823]: I1216 07:22:08.883188 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gcbhz\" (UniqueName: \"kubernetes.io/projected/0bed5482-3232-4318-b8a0-dcfd51d8611b-kube-api-access-gcbhz\") pod \"placementf3f4-account-delete-kq7rl\" (UID: \"0bed5482-3232-4318-b8a0-dcfd51d8611b\") " pod="openstack/placementf3f4-account-delete-kq7rl" Dec 16 07:22:08 crc kubenswrapper[4823]: I1216 07:22:08.883320 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-msrst\" (UniqueName: \"kubernetes.io/projected/81b4a0e9-2642-4fc1-b6fc-f5a0367a34ab-kube-api-access-msrst\") pod \"barbican5a20-account-delete-f8kwx\" (UID: \"81b4a0e9-2642-4fc1-b6fc-f5a0367a34ab\") " pod="openstack/barbican5a20-account-delete-f8kwx" Dec 16 07:22:08 crc kubenswrapper[4823]: I1216 07:22:08.884451 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/81b4a0e9-2642-4fc1-b6fc-f5a0367a34ab-operator-scripts\") pod \"barbican5a20-account-delete-f8kwx\" (UID: \"81b4a0e9-2642-4fc1-b6fc-f5a0367a34ab\") " pod="openstack/barbican5a20-account-delete-f8kwx" Dec 16 07:22:08 crc kubenswrapper[4823]: I1216 07:22:08.954734 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Dec 16 07:22:08 crc kubenswrapper[4823]: I1216 07:22:08.954964 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="cfd02f05-0804-48c6-b9b4-cda88fd6b14a" containerName="ovn-northd" containerID="cri-o://c687331eefea963d1e68c44d1eded52992a9e7de45fe0c58d59d647313f5f399" gracePeriod=30 Dec 16 07:22:08 crc kubenswrapper[4823]: I1216 07:22:08.958204 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="cfd02f05-0804-48c6-b9b4-cda88fd6b14a" containerName="openstack-network-exporter" containerID="cri-o://5f969b423030012c6374edf5d132a7aa122d3b273687a37f08e1b4c115ee2b6a" gracePeriod=30 Dec 16 07:22:08 crc kubenswrapper[4823]: I1216 07:22:08.985123 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7d6b8\" (UniqueName: \"kubernetes.io/projected/362dcfe9-8417-425b-8eab-8bd39bf661fc-kube-api-access-7d6b8\") pod \"glanceec9f-account-delete-klr92\" (UID: \"362dcfe9-8417-425b-8eab-8bd39bf661fc\") " pod="openstack/glanceec9f-account-delete-klr92" Dec 16 07:22:08 crc kubenswrapper[4823]: I1216 07:22:08.985511 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0bed5482-3232-4318-b8a0-dcfd51d8611b-operator-scripts\") pod \"placementf3f4-account-delete-kq7rl\" (UID: \"0bed5482-3232-4318-b8a0-dcfd51d8611b\") " pod="openstack/placementf3f4-account-delete-kq7rl" Dec 16 07:22:08 crc kubenswrapper[4823]: I1216 07:22:08.985604 4823 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/362dcfe9-8417-425b-8eab-8bd39bf661fc-operator-scripts\") pod \"glanceec9f-account-delete-klr92\" (UID: \"362dcfe9-8417-425b-8eab-8bd39bf661fc\") " pod="openstack/glanceec9f-account-delete-klr92" Dec 16 07:22:08 crc kubenswrapper[4823]: I1216 07:22:08.985632 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gcbhz\" (UniqueName: \"kubernetes.io/projected/0bed5482-3232-4318-b8a0-dcfd51d8611b-kube-api-access-gcbhz\") pod \"placementf3f4-account-delete-kq7rl\" (UID: \"0bed5482-3232-4318-b8a0-dcfd51d8611b\") " pod="openstack/placementf3f4-account-delete-kq7rl" Dec 16 07:22:08 crc kubenswrapper[4823]: E1216 07:22:08.986699 4823 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Dec 16 07:22:08 crc kubenswrapper[4823]: E1216 07:22:08.986753 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1-config-data podName:cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1 nodeName:}" failed. No retries permitted until 2025-12-16 07:22:09.48673774 +0000 UTC m=+1607.975303863 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1-config-data") pod "rabbitmq-cell1-server-0" (UID: "cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1") : configmap "rabbitmq-cell1-config-data" not found Dec 16 07:22:08 crc kubenswrapper[4823]: I1216 07:22:08.988592 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0bed5482-3232-4318-b8a0-dcfd51d8611b-operator-scripts\") pod \"placementf3f4-account-delete-kq7rl\" (UID: \"0bed5482-3232-4318-b8a0-dcfd51d8611b\") " pod="openstack/placementf3f4-account-delete-kq7rl" Dec 16 07:22:08 crc kubenswrapper[4823]: I1216 07:22:08.990871 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-msrst\" (UniqueName: \"kubernetes.io/projected/81b4a0e9-2642-4fc1-b6fc-f5a0367a34ab-kube-api-access-msrst\") pod \"barbican5a20-account-delete-f8kwx\" (UID: \"81b4a0e9-2642-4fc1-b6fc-f5a0367a34ab\") " pod="openstack/barbican5a20-account-delete-f8kwx" Dec 16 07:22:09 crc kubenswrapper[4823]: I1216 07:22:09.007236 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="603d469a-39a2-4d84-87cb-f2c7499b7a28" containerName="ovsdbserver-sb" containerID="cri-o://90bb6f7603a93a35c6ff65c8dd4f67d20079e1d4acfdcadb0ec6ae63addd6404" gracePeriod=300 Dec 16 07:22:09 crc kubenswrapper[4823]: I1216 07:22:09.016922 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-q69qd"] Dec 16 07:22:09 crc kubenswrapper[4823]: I1216 07:22:09.034456 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gcbhz\" (UniqueName: \"kubernetes.io/projected/0bed5482-3232-4318-b8a0-dcfd51d8611b-kube-api-access-gcbhz\") pod \"placementf3f4-account-delete-kq7rl\" (UID: \"0bed5482-3232-4318-b8a0-dcfd51d8611b\") " pod="openstack/placementf3f4-account-delete-kq7rl" Dec 16 
07:22:09 crc kubenswrapper[4823]: I1216 07:22:09.051505 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-q69qd"] Dec 16 07:22:09 crc kubenswrapper[4823]: I1216 07:22:09.056670 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placementf3f4-account-delete-kq7rl" Dec 16 07:22:09 crc kubenswrapper[4823]: I1216 07:22:09.089104 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glanceec9f-account-delete-klr92"] Dec 16 07:22:09 crc kubenswrapper[4823]: I1216 07:22:09.090511 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/362dcfe9-8417-425b-8eab-8bd39bf661fc-operator-scripts\") pod \"glanceec9f-account-delete-klr92\" (UID: \"362dcfe9-8417-425b-8eab-8bd39bf661fc\") " pod="openstack/glanceec9f-account-delete-klr92" Dec 16 07:22:09 crc kubenswrapper[4823]: I1216 07:22:09.090616 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7d6b8\" (UniqueName: \"kubernetes.io/projected/362dcfe9-8417-425b-8eab-8bd39bf661fc-kube-api-access-7d6b8\") pod \"glanceec9f-account-delete-klr92\" (UID: \"362dcfe9-8417-425b-8eab-8bd39bf661fc\") " pod="openstack/glanceec9f-account-delete-klr92" Dec 16 07:22:09 crc kubenswrapper[4823]: E1216 07:22:09.090945 4823 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Dec 16 07:22:09 crc kubenswrapper[4823]: E1216 07:22:09.090990 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a686a945-8fa0-406c-ac01-cf061c865a28-config-data podName:a686a945-8fa0-406c-ac01-cf061c865a28 nodeName:}" failed. No retries permitted until 2025-12-16 07:22:10.090976015 +0000 UTC m=+1608.579542138 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/a686a945-8fa0-406c-ac01-cf061c865a28-config-data") pod "rabbitmq-server-0" (UID: "a686a945-8fa0-406c-ac01-cf061c865a28") : configmap "rabbitmq-config-data" not found Dec 16 07:22:09 crc kubenswrapper[4823]: I1216 07:22:09.091892 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/362dcfe9-8417-425b-8eab-8bd39bf661fc-operator-scripts\") pod \"glanceec9f-account-delete-klr92\" (UID: \"362dcfe9-8417-425b-8eab-8bd39bf661fc\") " pod="openstack/glanceec9f-account-delete-klr92" Dec 16 07:22:09 crc kubenswrapper[4823]: I1216 07:22:09.115548 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7d6b8\" (UniqueName: \"kubernetes.io/projected/362dcfe9-8417-425b-8eab-8bd39bf661fc-kube-api-access-7d6b8\") pod \"glanceec9f-account-delete-klr92\" (UID: \"362dcfe9-8417-425b-8eab-8bd39bf661fc\") " pod="openstack/glanceec9f-account-delete-klr92" Dec 16 07:22:09 crc kubenswrapper[4823]: I1216 07:22:09.127806 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/novaapic1ba-account-delete-sldxg"] Dec 16 07:22:09 crc kubenswrapper[4823]: I1216 07:22:09.129396 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/novaapic1ba-account-delete-sldxg" Dec 16 07:22:09 crc kubenswrapper[4823]: I1216 07:22:09.131224 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glanceec9f-account-delete-klr92" Dec 16 07:22:09 crc kubenswrapper[4823]: I1216 07:22:09.151305 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-7mm88"] Dec 16 07:22:09 crc kubenswrapper[4823]: I1216 07:22:09.172294 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-7mm88"] Dec 16 07:22:09 crc kubenswrapper[4823]: I1216 07:22:09.234663 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/65278526-b5ee-4e40-b66b-1ee9b993f429-operator-scripts\") pod \"novaapic1ba-account-delete-sldxg\" (UID: \"65278526-b5ee-4e40-b66b-1ee9b993f429\") " pod="openstack/novaapic1ba-account-delete-sldxg" Dec 16 07:22:09 crc kubenswrapper[4823]: I1216 07:22:09.234741 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5qhh\" (UniqueName: \"kubernetes.io/projected/65278526-b5ee-4e40-b66b-1ee9b993f429-kube-api-access-p5qhh\") pod \"novaapic1ba-account-delete-sldxg\" (UID: \"65278526-b5ee-4e40-b66b-1ee9b993f429\") " pod="openstack/novaapic1ba-account-delete-sldxg" Dec 16 07:22:09 crc kubenswrapper[4823]: I1216 07:22:09.244598 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/novaapic1ba-account-delete-sldxg"] Dec 16 07:22:09 crc kubenswrapper[4823]: I1216 07:22:09.245779 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican5a20-account-delete-f8kwx" Dec 16 07:22:09 crc kubenswrapper[4823]: I1216 07:22:09.304871 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/novacell06c77-account-delete-5jkkk"] Dec 16 07:22:09 crc kubenswrapper[4823]: I1216 07:22:09.306210 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/novacell06c77-account-delete-5jkkk" Dec 16 07:22:09 crc kubenswrapper[4823]: I1216 07:22:09.338251 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvfw7\" (UniqueName: \"kubernetes.io/projected/dfd7efdc-36ba-4037-9f6c-a1a8c946ab33-kube-api-access-rvfw7\") pod \"novacell06c77-account-delete-5jkkk\" (UID: \"dfd7efdc-36ba-4037-9f6c-a1a8c946ab33\") " pod="openstack/novacell06c77-account-delete-5jkkk" Dec 16 07:22:09 crc kubenswrapper[4823]: I1216 07:22:09.338378 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dfd7efdc-36ba-4037-9f6c-a1a8c946ab33-operator-scripts\") pod \"novacell06c77-account-delete-5jkkk\" (UID: \"dfd7efdc-36ba-4037-9f6c-a1a8c946ab33\") " pod="openstack/novacell06c77-account-delete-5jkkk" Dec 16 07:22:09 crc kubenswrapper[4823]: I1216 07:22:09.338430 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/65278526-b5ee-4e40-b66b-1ee9b993f429-operator-scripts\") pod \"novaapic1ba-account-delete-sldxg\" (UID: \"65278526-b5ee-4e40-b66b-1ee9b993f429\") " pod="openstack/novaapic1ba-account-delete-sldxg" Dec 16 07:22:09 crc kubenswrapper[4823]: I1216 07:22:09.338461 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5qhh\" (UniqueName: \"kubernetes.io/projected/65278526-b5ee-4e40-b66b-1ee9b993f429-kube-api-access-p5qhh\") pod \"novaapic1ba-account-delete-sldxg\" (UID: \"65278526-b5ee-4e40-b66b-1ee9b993f429\") " pod="openstack/novaapic1ba-account-delete-sldxg" Dec 16 07:22:09 crc kubenswrapper[4823]: I1216 07:22:09.339823 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/65278526-b5ee-4e40-b66b-1ee9b993f429-operator-scripts\") 
pod \"novaapic1ba-account-delete-sldxg\" (UID: \"65278526-b5ee-4e40-b66b-1ee9b993f429\") " pod="openstack/novaapic1ba-account-delete-sldxg" Dec 16 07:22:09 crc kubenswrapper[4823]: I1216 07:22:09.339865 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/novacell06c77-account-delete-5jkkk"] Dec 16 07:22:09 crc kubenswrapper[4823]: I1216 07:22:09.369313 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5qhh\" (UniqueName: \"kubernetes.io/projected/65278526-b5ee-4e40-b66b-1ee9b993f429-kube-api-access-p5qhh\") pod \"novaapic1ba-account-delete-sldxg\" (UID: \"65278526-b5ee-4e40-b66b-1ee9b993f429\") " pod="openstack/novaapic1ba-account-delete-sldxg" Dec 16 07:22:09 crc kubenswrapper[4823]: I1216 07:22:09.372727 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-29jcz"] Dec 16 07:22:09 crc kubenswrapper[4823]: I1216 07:22:09.388885 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-fcd6f8f8f-l8nbv"] Dec 16 07:22:09 crc kubenswrapper[4823]: I1216 07:22:09.389201 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-fcd6f8f8f-l8nbv" podUID="c4795acd-bc9b-4c2c-aaa2-feb41c3c491f" containerName="dnsmasq-dns" containerID="cri-o://14f9aa7c5d7c0e6bdf53c979b009b546f44e6652421ca6154616d807431fa6e2" gracePeriod=10 Dec 16 07:22:09 crc kubenswrapper[4823]: I1216 07:22:09.418653 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder1f3e-account-delete-q5pns"] Dec 16 07:22:09 crc kubenswrapper[4823]: I1216 07:22:09.419987 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder1f3e-account-delete-q5pns" Dec 16 07:22:09 crc kubenswrapper[4823]: I1216 07:22:09.439948 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvfw7\" (UniqueName: \"kubernetes.io/projected/dfd7efdc-36ba-4037-9f6c-a1a8c946ab33-kube-api-access-rvfw7\") pod \"novacell06c77-account-delete-5jkkk\" (UID: \"dfd7efdc-36ba-4037-9f6c-a1a8c946ab33\") " pod="openstack/novacell06c77-account-delete-5jkkk" Dec 16 07:22:09 crc kubenswrapper[4823]: I1216 07:22:09.440122 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dfd7efdc-36ba-4037-9f6c-a1a8c946ab33-operator-scripts\") pod \"novacell06c77-account-delete-5jkkk\" (UID: \"dfd7efdc-36ba-4037-9f6c-a1a8c946ab33\") " pod="openstack/novacell06c77-account-delete-5jkkk" Dec 16 07:22:09 crc kubenswrapper[4823]: I1216 07:22:09.445696 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dfd7efdc-36ba-4037-9f6c-a1a8c946ab33-operator-scripts\") pod \"novacell06c77-account-delete-5jkkk\" (UID: \"dfd7efdc-36ba-4037-9f6c-a1a8c946ab33\") " pod="openstack/novacell06c77-account-delete-5jkkk" Dec 16 07:22:09 crc kubenswrapper[4823]: I1216 07:22:09.446089 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-fvqqp"] Dec 16 07:22:09 crc kubenswrapper[4823]: I1216 07:22:09.474216 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-956hc"] Dec 16 07:22:09 crc kubenswrapper[4823]: I1216 07:22:09.474488 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-metrics-956hc" podUID="efe17b2e-19bd-430b-8cb5-147ed1d2ffb6" containerName="openstack-network-exporter" containerID="cri-o://fd126ddb078fca0b47691ccb775b5689a84f7d7d50e7281488f17b418ac9e03a" gracePeriod=30 Dec 16 07:22:09 
crc kubenswrapper[4823]: I1216 07:22:09.488963 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder1f3e-account-delete-q5pns"] Dec 16 07:22:09 crc kubenswrapper[4823]: I1216 07:22:09.497522 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvfw7\" (UniqueName: \"kubernetes.io/projected/dfd7efdc-36ba-4037-9f6c-a1a8c946ab33-kube-api-access-rvfw7\") pod \"novacell06c77-account-delete-5jkkk\" (UID: \"dfd7efdc-36ba-4037-9f6c-a1a8c946ab33\") " pod="openstack/novacell06c77-account-delete-5jkkk" Dec 16 07:22:09 crc kubenswrapper[4823]: I1216 07:22:09.515088 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-fzmlg"] Dec 16 07:22:09 crc kubenswrapper[4823]: I1216 07:22:09.534479 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-2mqx2"] Dec 16 07:22:09 crc kubenswrapper[4823]: I1216 07:22:09.567716 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ec00a24a-8417-452e-a350-b46f36d4a84d-operator-scripts\") pod \"cinder1f3e-account-delete-q5pns\" (UID: \"ec00a24a-8417-452e-a350-b46f36d4a84d\") " pod="openstack/cinder1f3e-account-delete-q5pns" Dec 16 07:22:09 crc kubenswrapper[4823]: I1216 07:22:09.567877 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98cf7\" (UniqueName: \"kubernetes.io/projected/ec00a24a-8417-452e-a350-b46f36d4a84d-kube-api-access-98cf7\") pod \"cinder1f3e-account-delete-q5pns\" (UID: \"ec00a24a-8417-452e-a350-b46f36d4a84d\") " pod="openstack/cinder1f3e-account-delete-q5pns" Dec 16 07:22:09 crc kubenswrapper[4823]: E1216 07:22:09.567977 4823 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Dec 16 07:22:09 crc kubenswrapper[4823]: E1216 07:22:09.568163 4823 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1-config-data podName:cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1 nodeName:}" failed. No retries permitted until 2025-12-16 07:22:10.568141759 +0000 UTC m=+1609.056707882 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1-config-data") pod "rabbitmq-cell1-server-0" (UID: "cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1") : configmap "rabbitmq-cell1-config-data" not found Dec 16 07:22:09 crc kubenswrapper[4823]: I1216 07:22:09.578506 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-vvf8j"] Dec 16 07:22:09 crc kubenswrapper[4823]: I1216 07:22:09.593492 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-fzmlg"] Dec 16 07:22:09 crc kubenswrapper[4823]: I1216 07:22:09.593614 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="b566f9ee-8a75-4041-aac4-1573ca610541" containerName="ovsdbserver-nb" containerID="cri-o://58ecdd2ed593319a070e731bb8ab19823f35297f4e564693222927a67dcf6a7b" gracePeriod=299 Dec 16 07:22:09 crc kubenswrapper[4823]: I1216 07:22:09.614292 4823 generic.go:334] "Generic (PLEG): container finished" podID="b566f9ee-8a75-4041-aac4-1573ca610541" containerID="ad6913219d4e64984189276d714fe66372819d7e73a5bd2b7c37eef8a55f9181" exitCode=2 Dec 16 07:22:09 crc kubenswrapper[4823]: I1216 07:22:09.614355 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"b566f9ee-8a75-4041-aac4-1573ca610541","Type":"ContainerDied","Data":"ad6913219d4e64984189276d714fe66372819d7e73a5bd2b7c37eef8a55f9181"} Dec 16 07:22:09 crc kubenswrapper[4823]: I1216 07:22:09.616179 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-vvf8j"] Dec 16 07:22:09 
crc kubenswrapper[4823]: I1216 07:22:09.635477 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-2mqx2"] Dec 16 07:22:09 crc kubenswrapper[4823]: I1216 07:22:09.639422 4823 generic.go:334] "Generic (PLEG): container finished" podID="603d469a-39a2-4d84-87cb-f2c7499b7a28" containerID="06ecae0f130331b9c70dbb4604848fad60c6fe33be915c08a2e497633d78988f" exitCode=2 Dec 16 07:22:09 crc kubenswrapper[4823]: I1216 07:22:09.639594 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"603d469a-39a2-4d84-87cb-f2c7499b7a28","Type":"ContainerDied","Data":"06ecae0f130331b9c70dbb4604848fad60c6fe33be915c08a2e497633d78988f"} Dec 16 07:22:09 crc kubenswrapper[4823]: I1216 07:22:09.659141 4823 generic.go:334] "Generic (PLEG): container finished" podID="c4795acd-bc9b-4c2c-aaa2-feb41c3c491f" containerID="14f9aa7c5d7c0e6bdf53c979b009b546f44e6652421ca6154616d807431fa6e2" exitCode=0 Dec 16 07:22:09 crc kubenswrapper[4823]: I1216 07:22:09.659283 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fcd6f8f8f-l8nbv" event={"ID":"c4795acd-bc9b-4c2c-aaa2-feb41c3c491f","Type":"ContainerDied","Data":"14f9aa7c5d7c0e6bdf53c979b009b546f44e6652421ca6154616d807431fa6e2"} Dec 16 07:22:09 crc kubenswrapper[4823]: I1216 07:22:09.671148 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98cf7\" (UniqueName: \"kubernetes.io/projected/ec00a24a-8417-452e-a350-b46f36d4a84d-kube-api-access-98cf7\") pod \"cinder1f3e-account-delete-q5pns\" (UID: \"ec00a24a-8417-452e-a350-b46f36d4a84d\") " pod="openstack/cinder1f3e-account-delete-q5pns" Dec 16 07:22:09 crc kubenswrapper[4823]: I1216 07:22:09.671297 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ec00a24a-8417-452e-a350-b46f36d4a84d-operator-scripts\") pod \"cinder1f3e-account-delete-q5pns\" (UID: 
\"ec00a24a-8417-452e-a350-b46f36d4a84d\") " pod="openstack/cinder1f3e-account-delete-q5pns" Dec 16 07:22:09 crc kubenswrapper[4823]: I1216 07:22:09.673928 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ec00a24a-8417-452e-a350-b46f36d4a84d-operator-scripts\") pod \"cinder1f3e-account-delete-q5pns\" (UID: \"ec00a24a-8417-452e-a350-b46f36d4a84d\") " pod="openstack/cinder1f3e-account-delete-q5pns" Dec 16 07:22:09 crc kubenswrapper[4823]: I1216 07:22:09.674311 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_cfd02f05-0804-48c6-b9b4-cda88fd6b14a/ovn-northd/0.log" Dec 16 07:22:09 crc kubenswrapper[4823]: I1216 07:22:09.674355 4823 generic.go:334] "Generic (PLEG): container finished" podID="cfd02f05-0804-48c6-b9b4-cda88fd6b14a" containerID="5f969b423030012c6374edf5d132a7aa122d3b273687a37f08e1b4c115ee2b6a" exitCode=2 Dec 16 07:22:09 crc kubenswrapper[4823]: I1216 07:22:09.674371 4823 generic.go:334] "Generic (PLEG): container finished" podID="cfd02f05-0804-48c6-b9b4-cda88fd6b14a" containerID="c687331eefea963d1e68c44d1eded52992a9e7de45fe0c58d59d647313f5f399" exitCode=143 Dec 16 07:22:09 crc kubenswrapper[4823]: I1216 07:22:09.674429 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"cfd02f05-0804-48c6-b9b4-cda88fd6b14a","Type":"ContainerDied","Data":"5f969b423030012c6374edf5d132a7aa122d3b273687a37f08e1b4c115ee2b6a"} Dec 16 07:22:09 crc kubenswrapper[4823]: I1216 07:22:09.674459 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"cfd02f05-0804-48c6-b9b4-cda88fd6b14a","Type":"ContainerDied","Data":"c687331eefea963d1e68c44d1eded52992a9e7de45fe0c58d59d647313f5f399"} Dec 16 07:22:09 crc kubenswrapper[4823]: I1216 07:22:09.674498 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutronba48-account-delete-87d8j"] Dec 16 07:22:09 crc kubenswrapper[4823]: 
I1216 07:22:09.690262 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutronba48-account-delete-87d8j" Dec 16 07:22:09 crc kubenswrapper[4823]: I1216 07:22:09.718820 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98cf7\" (UniqueName: \"kubernetes.io/projected/ec00a24a-8417-452e-a350-b46f36d4a84d-kube-api-access-98cf7\") pod \"cinder1f3e-account-delete-q5pns\" (UID: \"ec00a24a-8417-452e-a350-b46f36d4a84d\") " pod="openstack/cinder1f3e-account-delete-q5pns" Dec 16 07:22:09 crc kubenswrapper[4823]: I1216 07:22:09.721876 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutronba48-account-delete-87d8j"] Dec 16 07:22:09 crc kubenswrapper[4823]: I1216 07:22:09.754476 4823 generic.go:334] "Generic (PLEG): container finished" podID="17cbb31a-6067-4925-ba57-956baf53ce8b" containerID="51565ca562af3db8782b4b38fb1d3b09a6c7f19f5c5020ef8e0d0b046831c28d" exitCode=143 Dec 16 07:22:09 crc kubenswrapper[4823]: I1216 07:22:09.754532 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"17cbb31a-6067-4925-ba57-956baf53ce8b","Type":"ContainerDied","Data":"51565ca562af3db8782b4b38fb1d3b09a6c7f19f5c5020ef8e0d0b046831c28d"} Dec 16 07:22:09 crc kubenswrapper[4823]: I1216 07:22:09.774414 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4a3f54ee-1dba-42f5-8697-b70de0f5b4c2-operator-scripts\") pod \"neutronba48-account-delete-87d8j\" (UID: \"4a3f54ee-1dba-42f5-8697-b70de0f5b4c2\") " pod="openstack/neutronba48-account-delete-87d8j" Dec 16 07:22:09 crc kubenswrapper[4823]: I1216 07:22:09.774496 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7vwd\" (UniqueName: \"kubernetes.io/projected/4a3f54ee-1dba-42f5-8697-b70de0f5b4c2-kube-api-access-j7vwd\") pod 
\"neutronba48-account-delete-87d8j\" (UID: \"4a3f54ee-1dba-42f5-8697-b70de0f5b4c2\") " pod="openstack/neutronba48-account-delete-87d8j" Dec 16 07:22:09 crc kubenswrapper[4823]: I1216 07:22:09.839375 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34693374-b301-47b2-b909-b5b93fd96fd0" path="/var/lib/kubelet/pods/34693374-b301-47b2-b909-b5b93fd96fd0/volumes" Dec 16 07:22:09 crc kubenswrapper[4823]: I1216 07:22:09.840104 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4506b142-a95e-4cf3-a000-56fbee5e024d" path="/var/lib/kubelet/pods/4506b142-a95e-4cf3-a000-56fbee5e024d/volumes" Dec 16 07:22:09 crc kubenswrapper[4823]: I1216 07:22:09.849210 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59c74f3a-8b4c-47eb-8d8d-af32e667d121" path="/var/lib/kubelet/pods/59c74f3a-8b4c-47eb-8d8d-af32e667d121/volumes" Dec 16 07:22:09 crc kubenswrapper[4823]: I1216 07:22:09.849854 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b6c804c-4a6b-4061-95f4-9a8c96167f76" path="/var/lib/kubelet/pods/6b6c804c-4a6b-4061-95f4-9a8c96167f76/volumes" Dec 16 07:22:09 crc kubenswrapper[4823]: I1216 07:22:09.850638 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f10ce7b3-53e0-4318-b7a2-1d2a33b9eb3b" path="/var/lib/kubelet/pods/f10ce7b3-53e0-4318-b7a2-1d2a33b9eb3b/volumes" Dec 16 07:22:09 crc kubenswrapper[4823]: I1216 07:22:09.854646 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-nhgg2"] Dec 16 07:22:09 crc kubenswrapper[4823]: I1216 07:22:09.854697 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-nhgg2"] Dec 16 07:22:09 crc kubenswrapper[4823]: I1216 07:22:09.863187 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-n2br8"] Dec 16 07:22:09 crc kubenswrapper[4823]: I1216 07:22:09.879827 4823 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4a3f54ee-1dba-42f5-8697-b70de0f5b4c2-operator-scripts\") pod \"neutronba48-account-delete-87d8j\" (UID: \"4a3f54ee-1dba-42f5-8697-b70de0f5b4c2\") " pod="openstack/neutronba48-account-delete-87d8j" Dec 16 07:22:09 crc kubenswrapper[4823]: I1216 07:22:09.880181 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7vwd\" (UniqueName: \"kubernetes.io/projected/4a3f54ee-1dba-42f5-8697-b70de0f5b4c2-kube-api-access-j7vwd\") pod \"neutronba48-account-delete-87d8j\" (UID: \"4a3f54ee-1dba-42f5-8697-b70de0f5b4c2\") " pod="openstack/neutronba48-account-delete-87d8j" Dec 16 07:22:09 crc kubenswrapper[4823]: I1216 07:22:09.882434 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4a3f54ee-1dba-42f5-8697-b70de0f5b4c2-operator-scripts\") pod \"neutronba48-account-delete-87d8j\" (UID: \"4a3f54ee-1dba-42f5-8697-b70de0f5b4c2\") " pod="openstack/neutronba48-account-delete-87d8j" Dec 16 07:22:09 crc kubenswrapper[4823]: I1216 07:22:09.896082 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-n2br8"] Dec 16 07:22:09 crc kubenswrapper[4823]: I1216 07:22:09.904714 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-hzvj8"] Dec 16 07:22:09 crc kubenswrapper[4823]: I1216 07:22:09.934904 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7vwd\" (UniqueName: \"kubernetes.io/projected/4a3f54ee-1dba-42f5-8697-b70de0f5b4c2-kube-api-access-j7vwd\") pod \"neutronba48-account-delete-87d8j\" (UID: \"4a3f54ee-1dba-42f5-8697-b70de0f5b4c2\") " pod="openstack/neutronba48-account-delete-87d8j" Dec 16 07:22:09 crc kubenswrapper[4823]: I1216 07:22:09.934986 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-hzvj8"] Dec 16 07:22:09 crc kubenswrapper[4823]: I1216 
07:22:09.985202 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Dec 16 07:22:09 crc kubenswrapper[4823]: I1216 07:22:09.985666 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="37eade87-02f6-4584-87d3-9b22e16ad915" containerName="account-server" containerID="cri-o://a492d0597a24fbc3874db2d66724810617a47a1b04e07bd6166546bf01c14b03" gracePeriod=30 Dec 16 07:22:09 crc kubenswrapper[4823]: I1216 07:22:09.986109 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="37eade87-02f6-4584-87d3-9b22e16ad915" containerName="swift-recon-cron" containerID="cri-o://f87675dcfff9fc973b357762f0993278cb4dedf83d6ea269b8db0911d6c505df" gracePeriod=30 Dec 16 07:22:09 crc kubenswrapper[4823]: I1216 07:22:09.986153 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="37eade87-02f6-4584-87d3-9b22e16ad915" containerName="rsync" containerID="cri-o://145ae0bd995a296d5194b205c5a110eae0cc0b53171f8ed6f7aab0a0e2c48aca" gracePeriod=30 Dec 16 07:22:09 crc kubenswrapper[4823]: I1216 07:22:09.986189 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="37eade87-02f6-4584-87d3-9b22e16ad915" containerName="object-expirer" containerID="cri-o://fba9f42156608e6cc226456c4628eb8a6093a4e736f19553c3b609538523e305" gracePeriod=30 Dec 16 07:22:09 crc kubenswrapper[4823]: I1216 07:22:09.986223 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="37eade87-02f6-4584-87d3-9b22e16ad915" containerName="object-updater" containerID="cri-o://40ee29e6ae29936dd852b2034a257b376daf068184e991736706829246c42569" gracePeriod=30 Dec 16 07:22:09 crc kubenswrapper[4823]: I1216 07:22:09.987507 4823 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/swift-storage-0" podUID="37eade87-02f6-4584-87d3-9b22e16ad915" containerName="container-replicator" containerID="cri-o://3e8bd97535cc7d73ba58df356afd74ec5adc282b78f6bd60a29d41243373dfe8" gracePeriod=30 Dec 16 07:22:09 crc kubenswrapper[4823]: I1216 07:22:09.987726 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="37eade87-02f6-4584-87d3-9b22e16ad915" containerName="object-auditor" containerID="cri-o://b1a5f1f8235f35f66a00999ce9d7e06be67e6583b5dc430df80fd71d14a63993" gracePeriod=30 Dec 16 07:22:09 crc kubenswrapper[4823]: I1216 07:22:09.987791 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="37eade87-02f6-4584-87d3-9b22e16ad915" containerName="object-replicator" containerID="cri-o://c5d6df967dd64ce250c15ed15f061a8be5c2ace3ce71f17ecbb4eeb82eee16bb" gracePeriod=30 Dec 16 07:22:09 crc kubenswrapper[4823]: I1216 07:22:09.987845 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="37eade87-02f6-4584-87d3-9b22e16ad915" containerName="object-server" containerID="cri-o://01e5b8f2f03cdaee2d9aa0f7009e062e757b69095af1ac126d2b409afda22307" gracePeriod=30 Dec 16 07:22:09 crc kubenswrapper[4823]: I1216 07:22:09.987892 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="37eade87-02f6-4584-87d3-9b22e16ad915" containerName="container-updater" containerID="cri-o://0867818f24c8ec64e592ab31aa5d2950ef78f3e7e0fe1694feaadae8d16fd195" gracePeriod=30 Dec 16 07:22:09 crc kubenswrapper[4823]: I1216 07:22:09.987985 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="37eade87-02f6-4584-87d3-9b22e16ad915" containerName="account-auditor" containerID="cri-o://037ada7a883b0afa2d539ebbbabaf8e1ff97dd775ed349460d0029680d2b1517" gracePeriod=30 Dec 16 07:22:09 crc 
kubenswrapper[4823]: I1216 07:22:09.987995 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="37eade87-02f6-4584-87d3-9b22e16ad915" containerName="container-auditor" containerID="cri-o://bad977d222921a4fb519d95600bc9d018f6a41b0993e19b99e544f9729b364ec" gracePeriod=30 Dec 16 07:22:09 crc kubenswrapper[4823]: I1216 07:22:09.988054 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="37eade87-02f6-4584-87d3-9b22e16ad915" containerName="container-server" containerID="cri-o://6989d85752f4e1b6c7b23a46754686007edf09212f93e356aa9e002490d63f86" gracePeriod=30 Dec 16 07:22:09 crc kubenswrapper[4823]: I1216 07:22:09.988111 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="37eade87-02f6-4584-87d3-9b22e16ad915" containerName="account-reaper" containerID="cri-o://9857b55eb51a54f3ae493111d268c42a0d2bc195ef3b7082fc757220e93cba07" gracePeriod=30 Dec 16 07:22:09 crc kubenswrapper[4823]: I1216 07:22:09.988121 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="37eade87-02f6-4584-87d3-9b22e16ad915" containerName="account-replicator" containerID="cri-o://e40b9ddd3f7fc60ce93f808d19e11679050ad9b41de42d02b22ca40a92083f09" gracePeriod=30 Dec 16 07:22:10 crc kubenswrapper[4823]: I1216 07:22:10.008969 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-59fd5f5fb-h7tf5"] Dec 16 07:22:10 crc kubenswrapper[4823]: I1216 07:22:10.009266 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-59fd5f5fb-h7tf5" podUID="196356f3-e866-4cf1-b3e8-eba3d9e4c99f" containerName="placement-log" containerID="cri-o://754f57f4d21e96f08486902a1f29fc3d73326be71cf93cc74a912ea8e5adfbfe" gracePeriod=30 Dec 16 07:22:10 crc kubenswrapper[4823]: I1216 07:22:10.009602 4823 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-59fd5f5fb-h7tf5" podUID="196356f3-e866-4cf1-b3e8-eba3d9e4c99f" containerName="placement-api" containerID="cri-o://fa7ad139671c8c3444b9e62aff507fb0fc6b2d2d087722f71ba9f8cc7977708c" gracePeriod=30 Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:10.065278 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 16 07:22:11 crc kubenswrapper[4823]: E1216 07:22:10.097181 4823 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Dec 16 07:22:11 crc kubenswrapper[4823]: E1216 07:22:10.097252 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a686a945-8fa0-406c-ac01-cf061c865a28-config-data podName:a686a945-8fa0-406c-ac01-cf061c865a28 nodeName:}" failed. No retries permitted until 2025-12-16 07:22:12.09723564 +0000 UTC m=+1610.585801763 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/a686a945-8fa0-406c-ac01-cf061c865a28-config-data") pod "rabbitmq-server-0" (UID: "a686a945-8fa0-406c-ac01-cf061c865a28") : configmap "rabbitmq-config-data" not found Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:10.106297 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:10.106542 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="dbee1863-ef4e-4d0a-aca7-f7c09e3f0a50" containerName="glance-log" containerID="cri-o://966c9a295917276f353f9e97ebb9a673f7628bec540b8b5ef3aef083889d35ba" gracePeriod=30 Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:10.106783 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="dbee1863-ef4e-4d0a-aca7-f7c09e3f0a50" containerName="glance-httpd" containerID="cri-o://9b756370e64890389fb5a7ac91f02c8282951c0bd28b30fb354e18e101c1af71" gracePeriod=30 Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:10.170438 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:10.171044 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="a18c5d6c-3429-4aa3-b933-85176e0e5ece" containerName="nova-scheduler-scheduler" containerID="cri-o://3650c948a8c447960d8862d51fa7c7f894dc0569eb954a455acb9fd443967ab1" gracePeriod=30 Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:10.189331 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="a686a945-8fa0-406c-ac01-cf061c865a28" containerName="rabbitmq" containerID="cri-o://36a98e82cbcb4bee731b20517aebf25ec378c019a17c67f3b8b2c9437196612b" 
gracePeriod=604800 Dec 16 07:22:11 crc kubenswrapper[4823]: E1216 07:22:10.225221 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c687331eefea963d1e68c44d1eded52992a9e7de45fe0c58d59d647313f5f399 is running failed: container process not found" containerID="c687331eefea963d1e68c44d1eded52992a9e7de45fe0c58d59d647313f5f399" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Dec 16 07:22:11 crc kubenswrapper[4823]: E1216 07:22:10.227706 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c687331eefea963d1e68c44d1eded52992a9e7de45fe0c58d59d647313f5f399 is running failed: container process not found" containerID="c687331eefea963d1e68c44d1eded52992a9e7de45fe0c58d59d647313f5f399" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:10.228897 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:10.229260 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="bc7377ca-c7ab-4ee0-ae2a-5fbb782ba925" containerName="glance-log" containerID="cri-o://3ce6c26d6258938fda89230a518530d00595939cb83c8d60892c9449174748b0" gracePeriod=30 Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:10.229375 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="bc7377ca-c7ab-4ee0-ae2a-5fbb782ba925" containerName="glance-httpd" containerID="cri-o://a3797feae0da2f46b99e7827ab8d4f11114590dcdde7cc7247a8b58f538e9505" gracePeriod=30 Dec 16 07:22:11 crc kubenswrapper[4823]: E1216 07:22:10.229614 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = 
container is not created or running: checking if PID of c687331eefea963d1e68c44d1eded52992a9e7de45fe0c58d59d647313f5f399 is running failed: container process not found" containerID="c687331eefea963d1e68c44d1eded52992a9e7de45fe0c58d59d647313f5f399" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Dec 16 07:22:11 crc kubenswrapper[4823]: E1216 07:22:10.229647 4823 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c687331eefea963d1e68c44d1eded52992a9e7de45fe0c58d59d647313f5f399 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="cfd02f05-0804-48c6-b9b4-cda88fd6b14a" containerName="ovn-northd" Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:10.256809 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:10.309283 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/novaapic1ba-account-delete-sldxg" Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:10.330495 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/novacell06c77-account-delete-5jkkk" Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:10.334146 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-bbf9986cc-sjljb"] Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:10.342373 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-bbf9986cc-sjljb" podUID="f2b1ed60-7cb0-48f0-aebf-3de778dbb95b" containerName="neutron-httpd" containerID="cri-o://63b9a035e047de6a0a1943c6d043167a9dedd896ef10da24426158630e0de9b7" gracePeriod=30 Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:10.344554 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-bbf9986cc-sjljb" podUID="f2b1ed60-7cb0-48f0-aebf-3de778dbb95b" containerName="neutron-api" containerID="cri-o://4a902115438412f167a7c224fe223d644746f437002cb2288beb05ad185be48a" gracePeriod=30 Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:10.357641 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder1f3e-account-delete-q5pns" Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:10.446556 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-75996b444f-cfsnf"] Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:10.446874 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-75996b444f-cfsnf" podUID="acfde95a-b68d-4aee-9302-a81c73eafa99" containerName="proxy-httpd" containerID="cri-o://d25459bc894939d2fcabcc5640a5016ecc71457b4cdaf7962569b58c6456358d" gracePeriod=30 Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:10.447469 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-75996b444f-cfsnf" podUID="acfde95a-b68d-4aee-9302-a81c73eafa99" containerName="proxy-server" containerID="cri-o://4c4e79f2a5dd3e53e86fd303d293f08e7a4df7dc0b54cdda4b91bc74df4c3386" gracePeriod=30 Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:10.454080 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutronba48-account-delete-87d8j" Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:10.472441 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:10.472828 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="6d2faec4-82e9-409b-a6c1-93f8cd78b9ec" containerName="nova-metadata-log" containerID="cri-o://3364c619253b1feab519f7ec3af4216d4032c2f42e27c3ea18c8f718e361769b" gracePeriod=30 Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:10.473812 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="6d2faec4-82e9-409b-a6c1-93f8cd78b9ec" containerName="nova-metadata-metadata" containerID="cri-o://7c64a48ce42e0fd4916be0a094d9954ddcea66b005eeff168ca0a4dec1eb2cff" gracePeriod=30 Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:10.488220 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-99f9cf477-cj5ss"] Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:10.488549 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-99f9cf477-cj5ss" podUID="22db0f3f-88b5-4909-aa80-f4b020d1ce18" containerName="barbican-worker-log" containerID="cri-o://831ecac837adb99ec40863b494bf97107b9ae0dabceaa9308863d145e46da25d" gracePeriod=30 Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:10.489077 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-99f9cf477-cj5ss" podUID="22db0f3f-88b5-4909-aa80-f4b020d1ce18" containerName="barbican-worker" containerID="cri-o://cf30b2b23d3cfc03fbaa30e01fba97a5a84312c839e774aea9c6f64e79f21e6a" gracePeriod=30 Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:10.501089 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/barbican-keystone-listener-749d6ff74-w7lnp"] Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:10.501359 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-749d6ff74-w7lnp" podUID="b1cd2ecc-0f95-47b9-b43e-5c36fa6a33b6" containerName="barbican-keystone-listener-log" containerID="cri-o://8ad897558a4d1aff0a9610d56a52fa00c8ef0a67fe7b4ed5924be748f672b9da" gracePeriod=30 Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:10.501792 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-749d6ff74-w7lnp" podUID="b1cd2ecc-0f95-47b9-b43e-5c36fa6a33b6" containerName="barbican-keystone-listener" containerID="cri-o://e8fc2c585ca3548f97ebfebb7bf47c7049bc68135e31ef4c2ea178062973a34c" gracePeriod=30 Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:10.515153 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6456ccccf4-rhnf4"] Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:10.515490 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6456ccccf4-rhnf4" podUID="c559ee21-de8f-44a1-998a-cb0b4aff8cd7" containerName="barbican-api-log" containerID="cri-o://6e803790a094c100a2004f1b22829f8f62d04305a0ff039b94d3de7aaff12828" gracePeriod=30 Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:10.516012 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6456ccccf4-rhnf4" podUID="c559ee21-de8f-44a1-998a-cb0b4aff8cd7" containerName="barbican-api" containerID="cri-o://530a4f541e791946b14339252ed09b59df393a5827ee6015fa327e0dbbc98aec" gracePeriod=30 Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:10.525978 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:10.526287 4823 kuberuntime_container.go:808] "Killing container 
with a grace period" pod="openstack/nova-api-0" podUID="d3d6c697-a49c-4919-81b5-6899a080d06b" containerName="nova-api-log" containerID="cri-o://f33c995e4b22b44c31a3cf7f028d6d43a1e215de8f4963b068a6b9ffc12fa049" gracePeriod=30 Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:10.526462 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="d3d6c697-a49c-4919-81b5-6899a080d06b" containerName="nova-api-api" containerID="cri-o://f04505b3b1dfe8b6dfd28ec3fadb6fe3ba712cf0ca0ed6cc257b567eb1c5714b" gracePeriod=30 Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:10.547218 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-qm72p"] Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:10.592965 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-qm72p"] Dec 16 07:22:11 crc kubenswrapper[4823]: E1216 07:22:10.632240 4823 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Dec 16 07:22:11 crc kubenswrapper[4823]: E1216 07:22:10.632317 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1-config-data podName:cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1 nodeName:}" failed. No retries permitted until 2025-12-16 07:22:12.632297589 +0000 UTC m=+1611.120863712 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1-config-data") pod "rabbitmq-cell1-server-0" (UID: "cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1") : configmap "rabbitmq-cell1-config-data" not found Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:10.653017 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:10.653327 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="7ad8e2a2-14c6-45b5-86f3-e4765cddd777" containerName="nova-cell0-conductor-conductor" containerID="cri-o://51e78213653e84ab99bbc7625548d55635dfcb54de59c8fec91ff584c2afb7a9" gracePeriod=30 Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:10.682743 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-29jcz" podUID="4edb9072-dfce-44ca-88d3-64136ac7e1c3" containerName="ovs-vswitchd" containerID="cri-o://143890d9503ac11d18cb9ffe222557c6fbf01e56e0ee7fe6c9718deb211756f0" gracePeriod=29 Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:10.682896 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-c9snq"] Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:10.721202 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:10.721529 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="79a24114-2ee1-4cc0-9045-770fcf074950" containerName="nova-cell1-conductor-conductor" containerID="cri-o://e121b5fc19f8847f31857c92e1abac87de929236af3edad6305ba6de36abc8a3" gracePeriod=30 Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:10.741319 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/nova-cell1-conductor-db-sync-c9snq"] Dec 16 07:22:11 crc kubenswrapper[4823]: E1216 07:22:10.757629 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 58ecdd2ed593319a070e731bb8ab19823f35297f4e564693222927a67dcf6a7b is running failed: container process not found" containerID="58ecdd2ed593319a070e731bb8ab19823f35297f4e564693222927a67dcf6a7b" cmd=["/usr/bin/pidof","ovsdb-server"] Dec 16 07:22:11 crc kubenswrapper[4823]: E1216 07:22:10.764785 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e121b5fc19f8847f31857c92e1abac87de929236af3edad6305ba6de36abc8a3" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 16 07:22:11 crc kubenswrapper[4823]: E1216 07:22:10.764983 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 58ecdd2ed593319a070e731bb8ab19823f35297f4e564693222927a67dcf6a7b is running failed: container process not found" containerID="58ecdd2ed593319a070e731bb8ab19823f35297f4e564693222927a67dcf6a7b" cmd=["/usr/bin/pidof","ovsdb-server"] Dec 16 07:22:11 crc kubenswrapper[4823]: E1216 07:22:10.765924 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 58ecdd2ed593319a070e731bb8ab19823f35297f4e564693222927a67dcf6a7b is running failed: container process not found" containerID="58ecdd2ed593319a070e731bb8ab19823f35297f4e564693222927a67dcf6a7b" cmd=["/usr/bin/pidof","ovsdb-server"] Dec 16 07:22:11 crc kubenswrapper[4823]: E1216 07:22:10.765974 4823 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 
58ecdd2ed593319a070e731bb8ab19823f35297f4e564693222927a67dcf6a7b is running failed: container process not found" probeType="Readiness" pod="openstack/ovsdbserver-nb-0" podUID="b566f9ee-8a75-4041-aac4-1573ca610541" containerName="ovsdbserver-nb" Dec 16 07:22:11 crc kubenswrapper[4823]: E1216 07:22:10.766300 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e121b5fc19f8847f31857c92e1abac87de929236af3edad6305ba6de36abc8a3" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:10.769848 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-ms77f"] Dec 16 07:22:11 crc kubenswrapper[4823]: E1216 07:22:10.770101 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e121b5fc19f8847f31857c92e1abac87de929236af3edad6305ba6de36abc8a3" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 16 07:22:11 crc kubenswrapper[4823]: E1216 07:22:10.770129 4823 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="79a24114-2ee1-4cc0-9045-770fcf074950" containerName="nova-cell1-conductor-conductor" Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:10.781992 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:10.782232 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="dbb285b0-26ce-494d-9d69-8fe905e39469" 
containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://dcb6ee461f8c315b99af0cef59bee6ad1bc80844d030f1935cac757ed7544094" gracePeriod=30 Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:10.794006 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-ms77f"] Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:10.795988 4823 generic.go:334] "Generic (PLEG): container finished" podID="bc7377ca-c7ab-4ee0-ae2a-5fbb782ba925" containerID="3ce6c26d6258938fda89230a518530d00595939cb83c8d60892c9449174748b0" exitCode=143 Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:10.796104 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"bc7377ca-c7ab-4ee0-ae2a-5fbb782ba925","Type":"ContainerDied","Data":"3ce6c26d6258938fda89230a518530d00595939cb83c8d60892c9449174748b0"} Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:10.811129 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-15f7-account-create-update-tdlw5"] Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:10.820197 4823 generic.go:334] "Generic (PLEG): container finished" podID="c559ee21-de8f-44a1-998a-cb0b4aff8cd7" containerID="6e803790a094c100a2004f1b22829f8f62d04305a0ff039b94d3de7aaff12828" exitCode=143 Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:10.820248 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6456ccccf4-rhnf4" event={"ID":"c559ee21-de8f-44a1-998a-cb0b4aff8cd7","Type":"ContainerDied","Data":"6e803790a094c100a2004f1b22829f8f62d04305a0ff039b94d3de7aaff12828"} Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:10.825180 4823 generic.go:334] "Generic (PLEG): container finished" podID="a27cd126-6c5b-4e95-b313-0bb19568f42a" containerID="457377de0d8e4d6837606a566ccbe412c1ad0e48f0692027311c6646fc5a9d02" exitCode=0 Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:10.825222 4823 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a27cd126-6c5b-4e95-b313-0bb19568f42a","Type":"ContainerDied","Data":"457377de0d8e4d6837606a566ccbe412c1ad0e48f0692027311c6646fc5a9d02"} Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:10.826717 4823 generic.go:334] "Generic (PLEG): container finished" podID="b5f6144a-70e4-4772-a8d8-2adf38127212" containerID="f115ec7d425d70b2afcfd5cf1785d9ea4d296e40ca9ff51d30788a90679af605" exitCode=137 Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:10.829091 4823 generic.go:334] "Generic (PLEG): container finished" podID="dbee1863-ef4e-4d0a-aca7-f7c09e3f0a50" containerID="966c9a295917276f353f9e97ebb9a673f7628bec540b8b5ef3aef083889d35ba" exitCode=143 Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:10.829159 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"dbee1863-ef4e-4d0a-aca7-f7c09e3f0a50","Type":"ContainerDied","Data":"966c9a295917276f353f9e97ebb9a673f7628bec540b8b5ef3aef083889d35ba"} Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:10.831959 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_603d469a-39a2-4d84-87cb-f2c7499b7a28/ovsdbserver-sb/0.log" Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:10.832003 4823 generic.go:334] "Generic (PLEG): container finished" podID="603d469a-39a2-4d84-87cb-f2c7499b7a28" containerID="90bb6f7603a93a35c6ff65c8dd4f67d20079e1d4acfdcadb0ec6ae63addd6404" exitCode=143 Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:10.832180 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-15f7-account-create-update-tdlw5"] Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:10.832212 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"603d469a-39a2-4d84-87cb-f2c7499b7a28","Type":"ContainerDied","Data":"90bb6f7603a93a35c6ff65c8dd4f67d20079e1d4acfdcadb0ec6ae63addd6404"} Dec 16 07:22:11 crc 
kubenswrapper[4823]: I1216 07:22:10.834204 4823 generic.go:334] "Generic (PLEG): container finished" podID="6d2faec4-82e9-409b-a6c1-93f8cd78b9ec" containerID="3364c619253b1feab519f7ec3af4216d4032c2f42e27c3ea18c8f718e361769b" exitCode=143 Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:10.834263 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6d2faec4-82e9-409b-a6c1-93f8cd78b9ec","Type":"ContainerDied","Data":"3364c619253b1feab519f7ec3af4216d4032c2f42e27c3ea18c8f718e361769b"} Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:10.834704 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-cell1-galera-0" podUID="45a2fe80-7cf2-4419-91c9-3c958d33d5a8" containerName="galera" containerID="cri-o://28af7097fe36966795ffd4f08fbf3fc9b6142fd27eb3db8592b4ce75e52927e8" gracePeriod=30 Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:10.849164 4823 generic.go:334] "Generic (PLEG): container finished" podID="196356f3-e866-4cf1-b3e8-eba3d9e4c99f" containerID="754f57f4d21e96f08486902a1f29fc3d73326be71cf93cc74a912ea8e5adfbfe" exitCode=143 Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:10.849243 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-59fd5f5fb-h7tf5" event={"ID":"196356f3-e866-4cf1-b3e8-eba3d9e4c99f","Type":"ContainerDied","Data":"754f57f4d21e96f08486902a1f29fc3d73326be71cf93cc74a912ea8e5adfbfe"} Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:10.856577 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:10.868687 4823 generic.go:334] "Generic (PLEG): container finished" podID="f2b1ed60-7cb0-48f0-aebf-3de778dbb95b" containerID="63b9a035e047de6a0a1943c6d043167a9dedd896ef10da24426158630e0de9b7" exitCode=0 Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:10.868801 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-bbf9986cc-sjljb" event={"ID":"f2b1ed60-7cb0-48f0-aebf-3de778dbb95b","Type":"ContainerDied","Data":"63b9a035e047de6a0a1943c6d043167a9dedd896ef10da24426158630e0de9b7"} Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:10.872966 4823 generic.go:334] "Generic (PLEG): container finished" podID="b1cd2ecc-0f95-47b9-b43e-5c36fa6a33b6" containerID="8ad897558a4d1aff0a9610d56a52fa00c8ef0a67fe7b4ed5924be748f672b9da" exitCode=143 Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:10.873054 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-749d6ff74-w7lnp" event={"ID":"b1cd2ecc-0f95-47b9-b43e-5c36fa6a33b6","Type":"ContainerDied","Data":"8ad897558a4d1aff0a9610d56a52fa00c8ef0a67fe7b4ed5924be748f672b9da"} Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:10.875206 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placementf3f4-account-delete-kq7rl"] Dec 16 07:22:11 crc kubenswrapper[4823]: E1216 07:22:10.877871 4823 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=< Dec 16 07:22:11 crc kubenswrapper[4823]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Dec 16 07:22:11 crc kubenswrapper[4823]: + source /usr/local/bin/container-scripts/functions Dec 16 07:22:11 crc kubenswrapper[4823]: ++ OVNBridge=br-int Dec 16 07:22:11 crc kubenswrapper[4823]: ++ OVNRemote=tcp:localhost:6642 Dec 16 07:22:11 crc kubenswrapper[4823]: ++ OVNEncapType=geneve Dec 16 07:22:11 crc kubenswrapper[4823]: ++ OVNAvailabilityZones= Dec 16 07:22:11 crc kubenswrapper[4823]: ++ EnableChassisAsGateway=true Dec 16 07:22:11 crc kubenswrapper[4823]: ++ PhysicalNetworks= Dec 16 07:22:11 crc kubenswrapper[4823]: ++ OVNHostName= Dec 16 07:22:11 crc kubenswrapper[4823]: ++ DB_FILE=/etc/openvswitch/conf.db Dec 16 07:22:11 crc kubenswrapper[4823]: ++ ovs_dir=/var/lib/openvswitch Dec 16 07:22:11 crc 
kubenswrapper[4823]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Dec 16 07:22:11 crc kubenswrapper[4823]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Dec 16 07:22:11 crc kubenswrapper[4823]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Dec 16 07:22:11 crc kubenswrapper[4823]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Dec 16 07:22:11 crc kubenswrapper[4823]: + sleep 0.5 Dec 16 07:22:11 crc kubenswrapper[4823]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Dec 16 07:22:11 crc kubenswrapper[4823]: + sleep 0.5 Dec 16 07:22:11 crc kubenswrapper[4823]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Dec 16 07:22:11 crc kubenswrapper[4823]: + cleanup_ovsdb_server_semaphore Dec 16 07:22:11 crc kubenswrapper[4823]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Dec 16 07:22:11 crc kubenswrapper[4823]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Dec 16 07:22:11 crc kubenswrapper[4823]: > execCommand=["/usr/local/bin/container-scripts/stop-ovsdb-server.sh"] containerName="ovsdb-server" pod="openstack/ovn-controller-ovs-29jcz" message=< Dec 16 07:22:11 crc kubenswrapper[4823]: Exiting ovsdb-server (5) [ OK ] Dec 16 07:22:11 crc kubenswrapper[4823]: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Dec 16 07:22:11 crc kubenswrapper[4823]: + source /usr/local/bin/container-scripts/functions Dec 16 07:22:11 crc kubenswrapper[4823]: ++ OVNBridge=br-int Dec 16 07:22:11 crc kubenswrapper[4823]: ++ OVNRemote=tcp:localhost:6642 Dec 16 07:22:11 crc kubenswrapper[4823]: ++ OVNEncapType=geneve Dec 16 07:22:11 crc kubenswrapper[4823]: ++ OVNAvailabilityZones= Dec 16 07:22:11 crc kubenswrapper[4823]: ++ EnableChassisAsGateway=true Dec 16 07:22:11 crc kubenswrapper[4823]: ++ PhysicalNetworks= Dec 16 07:22:11 crc kubenswrapper[4823]: ++ OVNHostName= Dec 16 07:22:11 crc kubenswrapper[4823]: ++ 
DB_FILE=/etc/openvswitch/conf.db Dec 16 07:22:11 crc kubenswrapper[4823]: ++ ovs_dir=/var/lib/openvswitch Dec 16 07:22:11 crc kubenswrapper[4823]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Dec 16 07:22:11 crc kubenswrapper[4823]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Dec 16 07:22:11 crc kubenswrapper[4823]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Dec 16 07:22:11 crc kubenswrapper[4823]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Dec 16 07:22:11 crc kubenswrapper[4823]: + sleep 0.5 Dec 16 07:22:11 crc kubenswrapper[4823]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Dec 16 07:22:11 crc kubenswrapper[4823]: + sleep 0.5 Dec 16 07:22:11 crc kubenswrapper[4823]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Dec 16 07:22:11 crc kubenswrapper[4823]: + cleanup_ovsdb_server_semaphore Dec 16 07:22:11 crc kubenswrapper[4823]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Dec 16 07:22:11 crc kubenswrapper[4823]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Dec 16 07:22:11 crc kubenswrapper[4823]: > Dec 16 07:22:11 crc kubenswrapper[4823]: E1216 07:22:10.877907 4823 kuberuntime_container.go:691] "PreStop hook failed" err=< Dec 16 07:22:11 crc kubenswrapper[4823]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Dec 16 07:22:11 crc kubenswrapper[4823]: + source /usr/local/bin/container-scripts/functions Dec 16 07:22:11 crc kubenswrapper[4823]: ++ OVNBridge=br-int Dec 16 07:22:11 crc kubenswrapper[4823]: ++ OVNRemote=tcp:localhost:6642 Dec 16 07:22:11 crc kubenswrapper[4823]: ++ OVNEncapType=geneve Dec 16 07:22:11 crc kubenswrapper[4823]: ++ OVNAvailabilityZones= Dec 16 07:22:11 crc kubenswrapper[4823]: ++ EnableChassisAsGateway=true Dec 16 07:22:11 crc kubenswrapper[4823]: ++ PhysicalNetworks= Dec 16 
07:22:11 crc kubenswrapper[4823]: ++ OVNHostName= Dec 16 07:22:11 crc kubenswrapper[4823]: ++ DB_FILE=/etc/openvswitch/conf.db Dec 16 07:22:11 crc kubenswrapper[4823]: ++ ovs_dir=/var/lib/openvswitch Dec 16 07:22:11 crc kubenswrapper[4823]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Dec 16 07:22:11 crc kubenswrapper[4823]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Dec 16 07:22:11 crc kubenswrapper[4823]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Dec 16 07:22:11 crc kubenswrapper[4823]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Dec 16 07:22:11 crc kubenswrapper[4823]: + sleep 0.5 Dec 16 07:22:11 crc kubenswrapper[4823]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Dec 16 07:22:11 crc kubenswrapper[4823]: + sleep 0.5 Dec 16 07:22:11 crc kubenswrapper[4823]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Dec 16 07:22:11 crc kubenswrapper[4823]: + cleanup_ovsdb_server_semaphore Dec 16 07:22:11 crc kubenswrapper[4823]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Dec 16 07:22:11 crc kubenswrapper[4823]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Dec 16 07:22:11 crc kubenswrapper[4823]: > pod="openstack/ovn-controller-ovs-29jcz" podUID="4edb9072-dfce-44ca-88d3-64136ac7e1c3" containerName="ovsdb-server" containerID="cri-o://a75ddf05569eb842ee7ced1c6941dc65b4fda1943932e8021d1066d4d5ed3578" Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:10.877939 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-29jcz" podUID="4edb9072-dfce-44ca-88d3-64136ac7e1c3" containerName="ovsdb-server" containerID="cri-o://a75ddf05569eb842ee7ced1c6941dc65b4fda1943932e8021d1066d4d5ed3578" gracePeriod=29 Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:10.879615 4823 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-metrics-956hc_efe17b2e-19bd-430b-8cb5-147ed1d2ffb6/openstack-network-exporter/0.log" Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:10.879645 4823 generic.go:334] "Generic (PLEG): container finished" podID="efe17b2e-19bd-430b-8cb5-147ed1d2ffb6" containerID="fd126ddb078fca0b47691ccb775b5689a84f7d7d50e7281488f17b418ac9e03a" exitCode=2 Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:10.879691 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-956hc" event={"ID":"efe17b2e-19bd-430b-8cb5-147ed1d2ffb6","Type":"ContainerDied","Data":"fd126ddb078fca0b47691ccb775b5689a84f7d7d50e7281488f17b418ac9e03a"} Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:10.887259 4823 generic.go:334] "Generic (PLEG): container finished" podID="d3d6c697-a49c-4919-81b5-6899a080d06b" containerID="f33c995e4b22b44c31a3cf7f028d6d43a1e215de8f4963b068a6b9ffc12fa049" exitCode=143 Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:10.887321 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d3d6c697-a49c-4919-81b5-6899a080d06b","Type":"ContainerDied","Data":"f33c995e4b22b44c31a3cf7f028d6d43a1e215de8f4963b068a6b9ffc12fa049"} Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:10.893657 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placementf3f4-account-delete-kq7rl" event={"ID":"0bed5482-3232-4318-b8a0-dcfd51d8611b","Type":"ContainerStarted","Data":"83d52232f6a5bf5bea1ad6fb6f6555d9c4a68c8bc8cbded87638ec5ec23f8f1e"} Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:10.904441 4823 generic.go:334] "Generic (PLEG): container finished" podID="37eade87-02f6-4584-87d3-9b22e16ad915" containerID="145ae0bd995a296d5194b205c5a110eae0cc0b53171f8ed6f7aab0a0e2c48aca" exitCode=0 Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:10.904467 4823 generic.go:334] "Generic (PLEG): container finished" podID="37eade87-02f6-4584-87d3-9b22e16ad915" 
containerID="fba9f42156608e6cc226456c4628eb8a6093a4e736f19553c3b609538523e305" exitCode=0 Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:10.904475 4823 generic.go:334] "Generic (PLEG): container finished" podID="37eade87-02f6-4584-87d3-9b22e16ad915" containerID="40ee29e6ae29936dd852b2034a257b376daf068184e991736706829246c42569" exitCode=0 Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:10.904482 4823 generic.go:334] "Generic (PLEG): container finished" podID="37eade87-02f6-4584-87d3-9b22e16ad915" containerID="b1a5f1f8235f35f66a00999ce9d7e06be67e6583b5dc430df80fd71d14a63993" exitCode=0 Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:10.904488 4823 generic.go:334] "Generic (PLEG): container finished" podID="37eade87-02f6-4584-87d3-9b22e16ad915" containerID="c5d6df967dd64ce250c15ed15f061a8be5c2ace3ce71f17ecbb4eeb82eee16bb" exitCode=0 Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:10.904494 4823 generic.go:334] "Generic (PLEG): container finished" podID="37eade87-02f6-4584-87d3-9b22e16ad915" containerID="0867818f24c8ec64e592ab31aa5d2950ef78f3e7e0fe1694feaadae8d16fd195" exitCode=0 Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:10.904500 4823 generic.go:334] "Generic (PLEG): container finished" podID="37eade87-02f6-4584-87d3-9b22e16ad915" containerID="bad977d222921a4fb519d95600bc9d018f6a41b0993e19b99e544f9729b364ec" exitCode=0 Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:10.904508 4823 generic.go:334] "Generic (PLEG): container finished" podID="37eade87-02f6-4584-87d3-9b22e16ad915" containerID="3e8bd97535cc7d73ba58df356afd74ec5adc282b78f6bd60a29d41243373dfe8" exitCode=0 Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:10.904514 4823 generic.go:334] "Generic (PLEG): container finished" podID="37eade87-02f6-4584-87d3-9b22e16ad915" containerID="9857b55eb51a54f3ae493111d268c42a0d2bc195ef3b7082fc757220e93cba07" exitCode=0 Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:10.904520 4823 generic.go:334] "Generic (PLEG): container 
finished" podID="37eade87-02f6-4584-87d3-9b22e16ad915" containerID="037ada7a883b0afa2d539ebbbabaf8e1ff97dd775ed349460d0029680d2b1517" exitCode=0 Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:10.904526 4823 generic.go:334] "Generic (PLEG): container finished" podID="37eade87-02f6-4584-87d3-9b22e16ad915" containerID="e40b9ddd3f7fc60ce93f808d19e11679050ad9b41de42d02b22ca40a92083f09" exitCode=0 Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:10.904532 4823 generic.go:334] "Generic (PLEG): container finished" podID="37eade87-02f6-4584-87d3-9b22e16ad915" containerID="a492d0597a24fbc3874db2d66724810617a47a1b04e07bd6166546bf01c14b03" exitCode=0 Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:10.904568 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"37eade87-02f6-4584-87d3-9b22e16ad915","Type":"ContainerDied","Data":"145ae0bd995a296d5194b205c5a110eae0cc0b53171f8ed6f7aab0a0e2c48aca"} Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:10.904594 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"37eade87-02f6-4584-87d3-9b22e16ad915","Type":"ContainerDied","Data":"fba9f42156608e6cc226456c4628eb8a6093a4e736f19553c3b609538523e305"} Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:10.904604 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"37eade87-02f6-4584-87d3-9b22e16ad915","Type":"ContainerDied","Data":"40ee29e6ae29936dd852b2034a257b376daf068184e991736706829246c42569"} Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:10.904613 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"37eade87-02f6-4584-87d3-9b22e16ad915","Type":"ContainerDied","Data":"b1a5f1f8235f35f66a00999ce9d7e06be67e6583b5dc430df80fd71d14a63993"} Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:10.904621 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/swift-storage-0" event={"ID":"37eade87-02f6-4584-87d3-9b22e16ad915","Type":"ContainerDied","Data":"c5d6df967dd64ce250c15ed15f061a8be5c2ace3ce71f17ecbb4eeb82eee16bb"} Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:10.904629 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"37eade87-02f6-4584-87d3-9b22e16ad915","Type":"ContainerDied","Data":"0867818f24c8ec64e592ab31aa5d2950ef78f3e7e0fe1694feaadae8d16fd195"} Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:10.904637 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"37eade87-02f6-4584-87d3-9b22e16ad915","Type":"ContainerDied","Data":"bad977d222921a4fb519d95600bc9d018f6a41b0993e19b99e544f9729b364ec"} Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:10.904645 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"37eade87-02f6-4584-87d3-9b22e16ad915","Type":"ContainerDied","Data":"3e8bd97535cc7d73ba58df356afd74ec5adc282b78f6bd60a29d41243373dfe8"} Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:10.904653 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"37eade87-02f6-4584-87d3-9b22e16ad915","Type":"ContainerDied","Data":"9857b55eb51a54f3ae493111d268c42a0d2bc195ef3b7082fc757220e93cba07"} Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:10.904660 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"37eade87-02f6-4584-87d3-9b22e16ad915","Type":"ContainerDied","Data":"037ada7a883b0afa2d539ebbbabaf8e1ff97dd775ed349460d0029680d2b1517"} Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:10.904668 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"37eade87-02f6-4584-87d3-9b22e16ad915","Type":"ContainerDied","Data":"e40b9ddd3f7fc60ce93f808d19e11679050ad9b41de42d02b22ca40a92083f09"} Dec 16 07:22:11 crc 
kubenswrapper[4823]: I1216 07:22:10.904676 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"37eade87-02f6-4584-87d3-9b22e16ad915","Type":"ContainerDied","Data":"a492d0597a24fbc3874db2d66724810617a47a1b04e07bd6166546bf01c14b03"} Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:10.906853 4823 generic.go:334] "Generic (PLEG): container finished" podID="b566f9ee-8a75-4041-aac4-1573ca610541" containerID="58ecdd2ed593319a070e731bb8ab19823f35297f4e564693222927a67dcf6a7b" exitCode=0 Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:10.906875 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"b566f9ee-8a75-4041-aac4-1573ca610541","Type":"ContainerDied","Data":"58ecdd2ed593319a070e731bb8ab19823f35297f4e564693222927a67dcf6a7b"} Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:10.911226 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1" containerName="rabbitmq" containerID="cri-o://51ee0e5df9e688f5c88a35c0aa9dd24ecbc2d9cd3579ec6b75a7584a9bee2720" gracePeriod=604800 Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.172954 4823 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.97:5671: connect: connection refused" Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.340233 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-956hc_efe17b2e-19bd-430b-8cb5-147ed1d2ffb6/openstack-network-exporter/0.log" Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.340330 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-956hc" Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.355977 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_603d469a-39a2-4d84-87cb-f2c7499b7a28/ovsdbserver-sb/0.log" Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.356466 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.384889 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_cfd02f05-0804-48c6-b9b4-cda88fd6b14a/ovn-northd/0.log" Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.385258 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.453695 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7694w\" (UniqueName: \"kubernetes.io/projected/cfd02f05-0804-48c6-b9b4-cda88fd6b14a-kube-api-access-7694w\") pod \"cfd02f05-0804-48c6-b9b4-cda88fd6b14a\" (UID: \"cfd02f05-0804-48c6-b9b4-cda88fd6b14a\") " Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.453765 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b5pmm\" (UniqueName: \"kubernetes.io/projected/efe17b2e-19bd-430b-8cb5-147ed1d2ffb6-kube-api-access-b5pmm\") pod \"efe17b2e-19bd-430b-8cb5-147ed1d2ffb6\" (UID: \"efe17b2e-19bd-430b-8cb5-147ed1d2ffb6\") " Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.453850 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/cfd02f05-0804-48c6-b9b4-cda88fd6b14a-ovn-rundir\") pod \"cfd02f05-0804-48c6-b9b4-cda88fd6b14a\" (UID: \"cfd02f05-0804-48c6-b9b4-cda88fd6b14a\") " Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.453871 4823 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/603d469a-39a2-4d84-87cb-f2c7499b7a28-ovsdb-rundir\") pod \"603d469a-39a2-4d84-87cb-f2c7499b7a28\" (UID: \"603d469a-39a2-4d84-87cb-f2c7499b7a28\") " Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.453912 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efe17b2e-19bd-430b-8cb5-147ed1d2ffb6-combined-ca-bundle\") pod \"efe17b2e-19bd-430b-8cb5-147ed1d2ffb6\" (UID: \"efe17b2e-19bd-430b-8cb5-147ed1d2ffb6\") " Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.453933 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/603d469a-39a2-4d84-87cb-f2c7499b7a28-combined-ca-bundle\") pod \"603d469a-39a2-4d84-87cb-f2c7499b7a28\" (UID: \"603d469a-39a2-4d84-87cb-f2c7499b7a28\") " Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.453966 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cfd02f05-0804-48c6-b9b4-cda88fd6b14a-scripts\") pod \"cfd02f05-0804-48c6-b9b4-cda88fd6b14a\" (UID: \"cfd02f05-0804-48c6-b9b4-cda88fd6b14a\") " Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.455898 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"603d469a-39a2-4d84-87cb-f2c7499b7a28\" (UID: \"603d469a-39a2-4d84-87cb-f2c7499b7a28\") " Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.455921 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/efe17b2e-19bd-430b-8cb5-147ed1d2ffb6-ovn-rundir\") pod \"efe17b2e-19bd-430b-8cb5-147ed1d2ffb6\" (UID: 
\"efe17b2e-19bd-430b-8cb5-147ed1d2ffb6\") " Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.455942 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cfd02f05-0804-48c6-b9b4-cda88fd6b14a-config\") pod \"cfd02f05-0804-48c6-b9b4-cda88fd6b14a\" (UID: \"cfd02f05-0804-48c6-b9b4-cda88fd6b14a\") " Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.455965 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/603d469a-39a2-4d84-87cb-f2c7499b7a28-config\") pod \"603d469a-39a2-4d84-87cb-f2c7499b7a28\" (UID: \"603d469a-39a2-4d84-87cb-f2c7499b7a28\") " Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.455984 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/603d469a-39a2-4d84-87cb-f2c7499b7a28-metrics-certs-tls-certs\") pod \"603d469a-39a2-4d84-87cb-f2c7499b7a28\" (UID: \"603d469a-39a2-4d84-87cb-f2c7499b7a28\") " Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.455999 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/efe17b2e-19bd-430b-8cb5-147ed1d2ffb6-ovs-rundir\") pod \"efe17b2e-19bd-430b-8cb5-147ed1d2ffb6\" (UID: \"efe17b2e-19bd-430b-8cb5-147ed1d2ffb6\") " Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.456040 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/603d469a-39a2-4d84-87cb-f2c7499b7a28-scripts\") pod \"603d469a-39a2-4d84-87cb-f2c7499b7a28\" (UID: \"603d469a-39a2-4d84-87cb-f2c7499b7a28\") " Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.456065 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/cfd02f05-0804-48c6-b9b4-cda88fd6b14a-ovn-northd-tls-certs\") pod \"cfd02f05-0804-48c6-b9b4-cda88fd6b14a\" (UID: \"cfd02f05-0804-48c6-b9b4-cda88fd6b14a\") " Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.456144 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ckcq8\" (UniqueName: \"kubernetes.io/projected/603d469a-39a2-4d84-87cb-f2c7499b7a28-kube-api-access-ckcq8\") pod \"603d469a-39a2-4d84-87cb-f2c7499b7a28\" (UID: \"603d469a-39a2-4d84-87cb-f2c7499b7a28\") " Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.456161 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/efe17b2e-19bd-430b-8cb5-147ed1d2ffb6-config\") pod \"efe17b2e-19bd-430b-8cb5-147ed1d2ffb6\" (UID: \"efe17b2e-19bd-430b-8cb5-147ed1d2ffb6\") " Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.456187 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfd02f05-0804-48c6-b9b4-cda88fd6b14a-combined-ca-bundle\") pod \"cfd02f05-0804-48c6-b9b4-cda88fd6b14a\" (UID: \"cfd02f05-0804-48c6-b9b4-cda88fd6b14a\") " Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.456227 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/603d469a-39a2-4d84-87cb-f2c7499b7a28-ovsdbserver-sb-tls-certs\") pod \"603d469a-39a2-4d84-87cb-f2c7499b7a28\" (UID: \"603d469a-39a2-4d84-87cb-f2c7499b7a28\") " Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.456242 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/cfd02f05-0804-48c6-b9b4-cda88fd6b14a-metrics-certs-tls-certs\") pod \"cfd02f05-0804-48c6-b9b4-cda88fd6b14a\" (UID: \"cfd02f05-0804-48c6-b9b4-cda88fd6b14a\") " Dec 16 07:22:11 
crc kubenswrapper[4823]: I1216 07:22:11.456263 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/efe17b2e-19bd-430b-8cb5-147ed1d2ffb6-metrics-certs-tls-certs\") pod \"efe17b2e-19bd-430b-8cb5-147ed1d2ffb6\" (UID: \"efe17b2e-19bd-430b-8cb5-147ed1d2ffb6\") " Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.454779 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cfd02f05-0804-48c6-b9b4-cda88fd6b14a-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "cfd02f05-0804-48c6-b9b4-cda88fd6b14a" (UID: "cfd02f05-0804-48c6-b9b4-cda88fd6b14a"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.464678 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/efe17b2e-19bd-430b-8cb5-147ed1d2ffb6-ovs-rundir" (OuterVolumeSpecName: "ovs-rundir") pod "efe17b2e-19bd-430b-8cb5-147ed1d2ffb6" (UID: "efe17b2e-19bd-430b-8cb5-147ed1d2ffb6"). InnerVolumeSpecName "ovs-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.465312 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/603d469a-39a2-4d84-87cb-f2c7499b7a28-scripts" (OuterVolumeSpecName: "scripts") pod "603d469a-39a2-4d84-87cb-f2c7499b7a28" (UID: "603d469a-39a2-4d84-87cb-f2c7499b7a28"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.465832 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/603d469a-39a2-4d84-87cb-f2c7499b7a28-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "603d469a-39a2-4d84-87cb-f2c7499b7a28" (UID: "603d469a-39a2-4d84-87cb-f2c7499b7a28"). 
InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.467496 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/efe17b2e-19bd-430b-8cb5-147ed1d2ffb6-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "efe17b2e-19bd-430b-8cb5-147ed1d2ffb6" (UID: "efe17b2e-19bd-430b-8cb5-147ed1d2ffb6"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.467585 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cfd02f05-0804-48c6-b9b4-cda88fd6b14a-scripts" (OuterVolumeSpecName: "scripts") pod "cfd02f05-0804-48c6-b9b4-cda88fd6b14a" (UID: "cfd02f05-0804-48c6-b9b4-cda88fd6b14a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.468632 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/603d469a-39a2-4d84-87cb-f2c7499b7a28-config" (OuterVolumeSpecName: "config") pod "603d469a-39a2-4d84-87cb-f2c7499b7a28" (UID: "603d469a-39a2-4d84-87cb-f2c7499b7a28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.469352 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/efe17b2e-19bd-430b-8cb5-147ed1d2ffb6-config" (OuterVolumeSpecName: "config") pod "efe17b2e-19bd-430b-8cb5-147ed1d2ffb6" (UID: "efe17b2e-19bd-430b-8cb5-147ed1d2ffb6"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.470205 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cfd02f05-0804-48c6-b9b4-cda88fd6b14a-config" (OuterVolumeSpecName: "config") pod "cfd02f05-0804-48c6-b9b4-cda88fd6b14a" (UID: "cfd02f05-0804-48c6-b9b4-cda88fd6b14a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.482421 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfd02f05-0804-48c6-b9b4-cda88fd6b14a-kube-api-access-7694w" (OuterVolumeSpecName: "kube-api-access-7694w") pod "cfd02f05-0804-48c6-b9b4-cda88fd6b14a" (UID: "cfd02f05-0804-48c6-b9b4-cda88fd6b14a"). InnerVolumeSpecName "kube-api-access-7694w". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.482616 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/603d469a-39a2-4d84-87cb-f2c7499b7a28-kube-api-access-ckcq8" (OuterVolumeSpecName: "kube-api-access-ckcq8") pod "603d469a-39a2-4d84-87cb-f2c7499b7a28" (UID: "603d469a-39a2-4d84-87cb-f2c7499b7a28"). InnerVolumeSpecName "kube-api-access-ckcq8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.483628 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "603d469a-39a2-4d84-87cb-f2c7499b7a28" (UID: "603d469a-39a2-4d84-87cb-f2c7499b7a28"). InnerVolumeSpecName "local-storage07-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.486971 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efe17b2e-19bd-430b-8cb5-147ed1d2ffb6-kube-api-access-b5pmm" (OuterVolumeSpecName: "kube-api-access-b5pmm") pod "efe17b2e-19bd-430b-8cb5-147ed1d2ffb6" (UID: "efe17b2e-19bd-430b-8cb5-147ed1d2ffb6"). InnerVolumeSpecName "kube-api-access-b5pmm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.558091 4823 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cfd02f05-0804-48c6-b9b4-cda88fd6b14a-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.558147 4823 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.558163 4823 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/efe17b2e-19bd-430b-8cb5-147ed1d2ffb6-ovn-rundir\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.558174 4823 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cfd02f05-0804-48c6-b9b4-cda88fd6b14a-config\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.558185 4823 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/603d469a-39a2-4d84-87cb-f2c7499b7a28-config\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.558196 4823 reconciler_common.go:293] "Volume detached for volume \"ovs-rundir\" (UniqueName: 
\"kubernetes.io/host-path/efe17b2e-19bd-430b-8cb5-147ed1d2ffb6-ovs-rundir\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.558203 4823 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/603d469a-39a2-4d84-87cb-f2c7499b7a28-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.558214 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ckcq8\" (UniqueName: \"kubernetes.io/projected/603d469a-39a2-4d84-87cb-f2c7499b7a28-kube-api-access-ckcq8\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.558229 4823 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/efe17b2e-19bd-430b-8cb5-147ed1d2ffb6-config\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.558240 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7694w\" (UniqueName: \"kubernetes.io/projected/cfd02f05-0804-48c6-b9b4-cda88fd6b14a-kube-api-access-7694w\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.558252 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b5pmm\" (UniqueName: \"kubernetes.io/projected/efe17b2e-19bd-430b-8cb5-147ed1d2ffb6-kube-api-access-b5pmm\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.558261 4823 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/cfd02f05-0804-48c6-b9b4-cda88fd6b14a-ovn-rundir\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.558269 4823 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/603d469a-39a2-4d84-87cb-f2c7499b7a28-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Dec 16 
07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.558958 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efe17b2e-19bd-430b-8cb5-147ed1d2ffb6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "efe17b2e-19bd-430b-8cb5-147ed1d2ffb6" (UID: "efe17b2e-19bd-430b-8cb5-147ed1d2ffb6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:22:11 crc kubenswrapper[4823]: E1216 07:22:11.561777 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3650c948a8c447960d8862d51fa7c7f894dc0569eb954a455acb9fd443967ab1" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.569802 4823 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="a686a945-8fa0-406c-ac01-cf061c865a28" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.98:5671: connect: connection refused" Dec 16 07:22:11 crc kubenswrapper[4823]: E1216 07:22:11.576586 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3650c948a8c447960d8862d51fa7c7f894dc0569eb954a455acb9fd443967ab1" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 16 07:22:11 crc kubenswrapper[4823]: E1216 07:22:11.579982 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3650c948a8c447960d8862d51fa7c7f894dc0569eb954a455acb9fd443967ab1" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 16 07:22:11 crc kubenswrapper[4823]: E1216 07:22:11.580078 4823 
prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="a18c5d6c-3429-4aa3-b933-85176e0e5ece" containerName="nova-scheduler-scheduler" Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.584677 4823 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.607858 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfd02f05-0804-48c6-b9b4-cda88fd6b14a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cfd02f05-0804-48c6-b9b4-cda88fd6b14a" (UID: "cfd02f05-0804-48c6-b9b4-cda88fd6b14a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.611951 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/603d469a-39a2-4d84-87cb-f2c7499b7a28-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "603d469a-39a2-4d84-87cb-f2c7499b7a28" (UID: "603d469a-39a2-4d84-87cb-f2c7499b7a28"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.638310 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.641333 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-75996b444f-cfsnf" Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.646322 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-fcd6f8f8f-l8nbv" Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.654597 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.662401 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5f6144a-70e4-4772-a8d8-2adf38127212-combined-ca-bundle\") pod \"b5f6144a-70e4-4772-a8d8-2adf38127212\" (UID: \"b5f6144a-70e4-4772-a8d8-2adf38127212\") " Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.662452 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c4795acd-bc9b-4c2c-aaa2-feb41c3c491f-dns-svc\") pod \"c4795acd-bc9b-4c2c-aaa2-feb41c3c491f\" (UID: \"c4795acd-bc9b-4c2c-aaa2-feb41c3c491f\") " Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.662477 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/acfde95a-b68d-4aee-9302-a81c73eafa99-log-httpd\") pod \"acfde95a-b68d-4aee-9302-a81c73eafa99\" (UID: \"acfde95a-b68d-4aee-9302-a81c73eafa99\") " Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.662512 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b566f9ee-8a75-4041-aac4-1573ca610541-metrics-certs-tls-certs\") pod \"b566f9ee-8a75-4041-aac4-1573ca610541\" (UID: \"b566f9ee-8a75-4041-aac4-1573ca610541\") " Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.662541 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b566f9ee-8a75-4041-aac4-1573ca610541-config\") pod \"b566f9ee-8a75-4041-aac4-1573ca610541\" (UID: \"b566f9ee-8a75-4041-aac4-1573ca610541\") " 
Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.662561 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c4795acd-bc9b-4c2c-aaa2-feb41c3c491f-dns-swift-storage-0\") pod \"c4795acd-bc9b-4c2c-aaa2-feb41c3c491f\" (UID: \"c4795acd-bc9b-4c2c-aaa2-feb41c3c491f\") " Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.662578 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b566f9ee-8a75-4041-aac4-1573ca610541-combined-ca-bundle\") pod \"b566f9ee-8a75-4041-aac4-1573ca610541\" (UID: \"b566f9ee-8a75-4041-aac4-1573ca610541\") " Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.662600 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c4795acd-bc9b-4c2c-aaa2-feb41c3c491f-ovsdbserver-sb\") pod \"c4795acd-bc9b-4c2c-aaa2-feb41c3c491f\" (UID: \"c4795acd-bc9b-4c2c-aaa2-feb41c3c491f\") " Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.662623 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4795acd-bc9b-4c2c-aaa2-feb41c3c491f-config\") pod \"c4795acd-bc9b-4c2c-aaa2-feb41c3c491f\" (UID: \"c4795acd-bc9b-4c2c-aaa2-feb41c3c491f\") " Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.662639 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b566f9ee-8a75-4041-aac4-1573ca610541-ovsdbserver-nb-tls-certs\") pod \"b566f9ee-8a75-4041-aac4-1573ca610541\" (UID: \"b566f9ee-8a75-4041-aac4-1573ca610541\") " Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.662688 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/projected/acfde95a-b68d-4aee-9302-a81c73eafa99-etc-swift\") pod \"acfde95a-b68d-4aee-9302-a81c73eafa99\" (UID: \"acfde95a-b68d-4aee-9302-a81c73eafa99\") " Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.662711 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b5f6144a-70e4-4772-a8d8-2adf38127212-openstack-config-secret\") pod \"b5f6144a-70e4-4772-a8d8-2adf38127212\" (UID: \"b5f6144a-70e4-4772-a8d8-2adf38127212\") " Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.662735 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqqf6\" (UniqueName: \"kubernetes.io/projected/b5f6144a-70e4-4772-a8d8-2adf38127212-kube-api-access-fqqf6\") pod \"b5f6144a-70e4-4772-a8d8-2adf38127212\" (UID: \"b5f6144a-70e4-4772-a8d8-2adf38127212\") " Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.662769 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4vmkg\" (UniqueName: \"kubernetes.io/projected/c4795acd-bc9b-4c2c-aaa2-feb41c3c491f-kube-api-access-4vmkg\") pod \"c4795acd-bc9b-4c2c-aaa2-feb41c3c491f\" (UID: \"c4795acd-bc9b-4c2c-aaa2-feb41c3c491f\") " Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.662801 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6tbqm\" (UniqueName: \"kubernetes.io/projected/acfde95a-b68d-4aee-9302-a81c73eafa99-kube-api-access-6tbqm\") pod \"acfde95a-b68d-4aee-9302-a81c73eafa99\" (UID: \"acfde95a-b68d-4aee-9302-a81c73eafa99\") " Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.662819 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c4795acd-bc9b-4c2c-aaa2-feb41c3c491f-ovsdbserver-nb\") pod \"c4795acd-bc9b-4c2c-aaa2-feb41c3c491f\" (UID: \"c4795acd-bc9b-4c2c-aaa2-feb41c3c491f\") " 
Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.662897 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b566f9ee-8a75-4041-aac4-1573ca610541-ovsdb-rundir\") pod \"b566f9ee-8a75-4041-aac4-1573ca610541\" (UID: \"b566f9ee-8a75-4041-aac4-1573ca610541\") " Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.662917 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acfde95a-b68d-4aee-9302-a81c73eafa99-combined-ca-bundle\") pod \"acfde95a-b68d-4aee-9302-a81c73eafa99\" (UID: \"acfde95a-b68d-4aee-9302-a81c73eafa99\") " Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.662932 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b5f6144a-70e4-4772-a8d8-2adf38127212-openstack-config\") pod \"b5f6144a-70e4-4772-a8d8-2adf38127212\" (UID: \"b5f6144a-70e4-4772-a8d8-2adf38127212\") " Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.662950 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/acfde95a-b68d-4aee-9302-a81c73eafa99-run-httpd\") pod \"acfde95a-b68d-4aee-9302-a81c73eafa99\" (UID: \"acfde95a-b68d-4aee-9302-a81c73eafa99\") " Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.662964 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/acfde95a-b68d-4aee-9302-a81c73eafa99-internal-tls-certs\") pod \"acfde95a-b68d-4aee-9302-a81c73eafa99\" (UID: \"acfde95a-b68d-4aee-9302-a81c73eafa99\") " Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.662985 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ksj4k\" (UniqueName: 
\"kubernetes.io/projected/b566f9ee-8a75-4041-aac4-1573ca610541-kube-api-access-ksj4k\") pod \"b566f9ee-8a75-4041-aac4-1573ca610541\" (UID: \"b566f9ee-8a75-4041-aac4-1573ca610541\") " Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.663007 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b566f9ee-8a75-4041-aac4-1573ca610541-scripts\") pod \"b566f9ee-8a75-4041-aac4-1573ca610541\" (UID: \"b566f9ee-8a75-4041-aac4-1573ca610541\") " Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.663626 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b566f9ee-8a75-4041-aac4-1573ca610541-config" (OuterVolumeSpecName: "config") pod "b566f9ee-8a75-4041-aac4-1573ca610541" (UID: "b566f9ee-8a75-4041-aac4-1573ca610541"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.664892 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b566f9ee-8a75-4041-aac4-1573ca610541-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "b566f9ee-8a75-4041-aac4-1573ca610541" (UID: "b566f9ee-8a75-4041-aac4-1573ca610541"). InnerVolumeSpecName "ovsdb-rundir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.668211 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-nb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"b566f9ee-8a75-4041-aac4-1573ca610541\" (UID: \"b566f9ee-8a75-4041-aac4-1573ca610541\") " Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.668245 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/acfde95a-b68d-4aee-9302-a81c73eafa99-config-data\") pod \"acfde95a-b68d-4aee-9302-a81c73eafa99\" (UID: \"acfde95a-b68d-4aee-9302-a81c73eafa99\") " Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.668299 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/acfde95a-b68d-4aee-9302-a81c73eafa99-public-tls-certs\") pod \"acfde95a-b68d-4aee-9302-a81c73eafa99\" (UID: \"acfde95a-b68d-4aee-9302-a81c73eafa99\") " Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.669002 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfd02f05-0804-48c6-b9b4-cda88fd6b14a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.669035 4823 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b566f9ee-8a75-4041-aac4-1573ca610541-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.669045 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efe17b2e-19bd-430b-8cb5-147ed1d2ffb6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.669055 4823 reconciler_common.go:293] 
"Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/603d469a-39a2-4d84-87cb-f2c7499b7a28-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.669063 4823 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.669072 4823 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b566f9ee-8a75-4041-aac4-1573ca610541-config\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.669651 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/acfde95a-b68d-4aee-9302-a81c73eafa99-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "acfde95a-b68d-4aee-9302-a81c73eafa99" (UID: "acfde95a-b68d-4aee-9302-a81c73eafa99"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.673515 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/603d469a-39a2-4d84-87cb-f2c7499b7a28-ovsdbserver-sb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-sb-tls-certs") pod "603d469a-39a2-4d84-87cb-f2c7499b7a28" (UID: "603d469a-39a2-4d84-87cb-f2c7499b7a28"). InnerVolumeSpecName "ovsdbserver-sb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.674001 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/acfde95a-b68d-4aee-9302-a81c73eafa99-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "acfde95a-b68d-4aee-9302-a81c73eafa99" (UID: "acfde95a-b68d-4aee-9302-a81c73eafa99"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.675125 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/acfde95a-b68d-4aee-9302-a81c73eafa99-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "acfde95a-b68d-4aee-9302-a81c73eafa99" (UID: "acfde95a-b68d-4aee-9302-a81c73eafa99"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.675478 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b566f9ee-8a75-4041-aac4-1573ca610541-scripts" (OuterVolumeSpecName: "scripts") pod "b566f9ee-8a75-4041-aac4-1573ca610541" (UID: "b566f9ee-8a75-4041-aac4-1573ca610541"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.705004 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfd02f05-0804-48c6-b9b4-cda88fd6b14a-ovn-northd-tls-certs" (OuterVolumeSpecName: "ovn-northd-tls-certs") pod "cfd02f05-0804-48c6-b9b4-cda88fd6b14a" (UID: "cfd02f05-0804-48c6-b9b4-cda88fd6b14a"). InnerVolumeSpecName "ovn-northd-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.706994 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efe17b2e-19bd-430b-8cb5-147ed1d2ffb6-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "efe17b2e-19bd-430b-8cb5-147ed1d2ffb6" (UID: "efe17b2e-19bd-430b-8cb5-147ed1d2ffb6"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.708756 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/acfde95a-b68d-4aee-9302-a81c73eafa99-kube-api-access-6tbqm" (OuterVolumeSpecName: "kube-api-access-6tbqm") pod "acfde95a-b68d-4aee-9302-a81c73eafa99" (UID: "acfde95a-b68d-4aee-9302-a81c73eafa99"). InnerVolumeSpecName "kube-api-access-6tbqm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.708819 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4795acd-bc9b-4c2c-aaa2-feb41c3c491f-kube-api-access-4vmkg" (OuterVolumeSpecName: "kube-api-access-4vmkg") pod "c4795acd-bc9b-4c2c-aaa2-feb41c3c491f" (UID: "c4795acd-bc9b-4c2c-aaa2-feb41c3c491f"). InnerVolumeSpecName "kube-api-access-4vmkg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.709158 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5f6144a-70e4-4772-a8d8-2adf38127212-kube-api-access-fqqf6" (OuterVolumeSpecName: "kube-api-access-fqqf6") pod "b5f6144a-70e4-4772-a8d8-2adf38127212" (UID: "b5f6144a-70e4-4772-a8d8-2adf38127212"). InnerVolumeSpecName "kube-api-access-fqqf6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.716424 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfd02f05-0804-48c6-b9b4-cda88fd6b14a-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "cfd02f05-0804-48c6-b9b4-cda88fd6b14a" (UID: "cfd02f05-0804-48c6-b9b4-cda88fd6b14a"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.720100 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/603d469a-39a2-4d84-87cb-f2c7499b7a28-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "603d469a-39a2-4d84-87cb-f2c7499b7a28" (UID: "603d469a-39a2-4d84-87cb-f2c7499b7a28"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.720700 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b566f9ee-8a75-4041-aac4-1573ca610541-kube-api-access-ksj4k" (OuterVolumeSpecName: "kube-api-access-ksj4k") pod "b566f9ee-8a75-4041-aac4-1573ca610541" (UID: "b566f9ee-8a75-4041-aac4-1573ca610541"). InnerVolumeSpecName "kube-api-access-ksj4k". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.726263 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod "b566f9ee-8a75-4041-aac4-1573ca610541" (UID: "b566f9ee-8a75-4041-aac4-1573ca610541"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.768231 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5f6144a-70e4-4772-a8d8-2adf38127212-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b5f6144a-70e4-4772-a8d8-2adf38127212" (UID: "b5f6144a-70e4-4772-a8d8-2adf38127212"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.770048 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5f6144a-70e4-4772-a8d8-2adf38127212-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "b5f6144a-70e4-4772-a8d8-2adf38127212" (UID: "b5f6144a-70e4-4772-a8d8-2adf38127212"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.772363 4823 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.772427 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5f6144a-70e4-4772-a8d8-2adf38127212-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.772446 4823 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/acfde95a-b68d-4aee-9302-a81c73eafa99-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.772461 4823 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/603d469a-39a2-4d84-87cb-f2c7499b7a28-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.772473 4823 reconciler_common.go:293] "Volume detached for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/cfd02f05-0804-48c6-b9b4-cda88fd6b14a-ovn-northd-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.772511 4823 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/projected/acfde95a-b68d-4aee-9302-a81c73eafa99-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.772523 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqqf6\" (UniqueName: \"kubernetes.io/projected/b5f6144a-70e4-4772-a8d8-2adf38127212-kube-api-access-fqqf6\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.772535 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4vmkg\" (UniqueName: \"kubernetes.io/projected/c4795acd-bc9b-4c2c-aaa2-feb41c3c491f-kube-api-access-4vmkg\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.772547 4823 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/603d469a-39a2-4d84-87cb-f2c7499b7a28-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.772581 4823 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/cfd02f05-0804-48c6-b9b4-cda88fd6b14a-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.772594 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6tbqm\" (UniqueName: \"kubernetes.io/projected/acfde95a-b68d-4aee-9302-a81c73eafa99-kube-api-access-6tbqm\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.772607 4823 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/efe17b2e-19bd-430b-8cb5-147ed1d2ffb6-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.772618 4823 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/b5f6144a-70e4-4772-a8d8-2adf38127212-openstack-config\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.772628 4823 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/acfde95a-b68d-4aee-9302-a81c73eafa99-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.772743 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ksj4k\" (UniqueName: \"kubernetes.io/projected/b566f9ee-8a75-4041-aac4-1573ca610541-kube-api-access-ksj4k\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.772755 4823 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b566f9ee-8a75-4041-aac4-1573ca610541-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.801819 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="119de702-bd92-49d3-8bef-ba0fd81637c2" path="/var/lib/kubelet/pods/119de702-bd92-49d3-8bef-ba0fd81637c2/volumes" Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.803046 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21cc81af-96c8-4f21-85c5-07c7b9ade605" path="/var/lib/kubelet/pods/21cc81af-96c8-4f21-85c5-07c7b9ade605/volumes" Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.803735 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23090877-6b52-4bf9-8272-0a3146fb5e70" path="/var/lib/kubelet/pods/23090877-6b52-4bf9-8272-0a3146fb5e70/volumes" Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.804487 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f568c50-222d-46ec-8b2b-d9605d6ace8a" path="/var/lib/kubelet/pods/4f568c50-222d-46ec-8b2b-d9605d6ace8a/volumes" Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.806622 4823 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e02e173-17cf-486b-9c4a-b68aa6879f97" path="/var/lib/kubelet/pods/5e02e173-17cf-486b-9c4a-b68aa6879f97/volumes" Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.808953 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0d16c10-6c99-4b18-b515-4a9c18c830b5" path="/var/lib/kubelet/pods/e0d16c10-6c99-4b18-b515-4a9c18c830b5/volumes" Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.810682 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fcd5e697-1360-4376-8160-ba0bc7fa56f8" path="/var/lib/kubelet/pods/fcd5e697-1360-4376-8160-ba0bc7fa56f8/volumes" Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.834677 4823 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.875044 4823 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.887545 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4795acd-bc9b-4c2c-aaa2-feb41c3c491f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c4795acd-bc9b-4c2c-aaa2-feb41c3c491f" (UID: "c4795acd-bc9b-4c2c-aaa2-feb41c3c491f"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:22:11 crc kubenswrapper[4823]: W1216 07:22:11.890749 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod362dcfe9_8417_425b_8eab_8bd39bf661fc.slice/crio-d5b8ab3e0453fede3de057778b21383cdbab6e3c06b304cd26ba5004fde865c7 WatchSource:0}: Error finding container d5b8ab3e0453fede3de057778b21383cdbab6e3c06b304cd26ba5004fde865c7: Status 404 returned error can't find the container with id d5b8ab3e0453fede3de057778b21383cdbab6e3c06b304cd26ba5004fde865c7 Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.936786 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4795acd-bc9b-4c2c-aaa2-feb41c3c491f-config" (OuterVolumeSpecName: "config") pod "c4795acd-bc9b-4c2c-aaa2-feb41c3c491f" (UID: "c4795acd-bc9b-4c2c-aaa2-feb41c3c491f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.947703 4823 generic.go:334] "Generic (PLEG): container finished" podID="45a2fe80-7cf2-4419-91c9-3c958d33d5a8" containerID="28af7097fe36966795ffd4f08fbf3fc9b6142fd27eb3db8592b4ce75e52927e8" exitCode=0 Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.951572 4823 generic.go:334] "Generic (PLEG): container finished" podID="22db0f3f-88b5-4909-aa80-f4b020d1ce18" containerID="831ecac837adb99ec40863b494bf97107b9ae0dabceaa9308863d145e46da25d" exitCode=143 Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.967312 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.977957 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.994789 4823 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c4795acd-bc9b-4c2c-aaa2-feb41c3c491f-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:11 crc kubenswrapper[4823]: I1216 07:22:11.994982 4823 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4795acd-bc9b-4c2c-aaa2-feb41c3c491f-config\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.004891 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b566f9ee-8a75-4041-aac4-1573ca610541-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b566f9ee-8a75-4041-aac4-1573ca610541" (UID: "b566f9ee-8a75-4041-aac4-1573ca610541"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.005145 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4795acd-bc9b-4c2c-aaa2-feb41c3c491f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c4795acd-bc9b-4c2c-aaa2-feb41c3c491f" (UID: "c4795acd-bc9b-4c2c-aaa2-feb41c3c491f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.005299 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-956hc_efe17b2e-19bd-430b-8cb5-147ed1d2ffb6/openstack-network-exporter/0.log" Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.005505 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-956hc" Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.020319 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4795acd-bc9b-4c2c-aaa2-feb41c3c491f-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "c4795acd-bc9b-4c2c-aaa2-feb41c3c491f" (UID: "c4795acd-bc9b-4c2c-aaa2-feb41c3c491f"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.020845 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_cfd02f05-0804-48c6-b9b4-cda88fd6b14a/ovn-northd/0.log" Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.021247 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.023806 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acfde95a-b68d-4aee-9302-a81c73eafa99-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "acfde95a-b68d-4aee-9302-a81c73eafa99" (UID: "acfde95a-b68d-4aee-9302-a81c73eafa99"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.048940 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5f6144a-70e4-4772-a8d8-2adf38127212-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "b5f6144a-70e4-4772-a8d8-2adf38127212" (UID: "b5f6144a-70e4-4772-a8d8-2adf38127212"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.050665 4823 generic.go:334] "Generic (PLEG): container finished" podID="37eade87-02f6-4584-87d3-9b22e16ad915" containerID="01e5b8f2f03cdaee2d9aa0f7009e062e757b69095af1ac126d2b409afda22307" exitCode=0 Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.050696 4823 generic.go:334] "Generic (PLEG): container finished" podID="37eade87-02f6-4584-87d3-9b22e16ad915" containerID="6989d85752f4e1b6c7b23a46754686007edf09212f93e356aa9e002490d63f86" exitCode=0 Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.073083 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_603d469a-39a2-4d84-87cb-f2c7499b7a28/ovsdbserver-sb/0.log" Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.073228 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.073834 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acfde95a-b68d-4aee-9302-a81c73eafa99-config-data" (OuterVolumeSpecName: "config-data") pod "acfde95a-b68d-4aee-9302-a81c73eafa99" (UID: "acfde95a-b68d-4aee-9302-a81c73eafa99"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.090258 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4795acd-bc9b-4c2c-aaa2-feb41c3c491f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c4795acd-bc9b-4c2c-aaa2-feb41c3c491f" (UID: "c4795acd-bc9b-4c2c-aaa2-feb41c3c491f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.095553 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-fcd6f8f8f-l8nbv" Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.097464 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acfde95a-b68d-4aee-9302-a81c73eafa99-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "acfde95a-b68d-4aee-9302-a81c73eafa99" (UID: "acfde95a-b68d-4aee-9302-a81c73eafa99"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.100313 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acfde95a-b68d-4aee-9302-a81c73eafa99-combined-ca-bundle\") pod \"acfde95a-b68d-4aee-9302-a81c73eafa99\" (UID: \"acfde95a-b68d-4aee-9302-a81c73eafa99\") " Dec 16 07:22:12 crc kubenswrapper[4823]: W1216 07:22:12.100852 4823 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/acfde95a-b68d-4aee-9302-a81c73eafa99/volumes/kubernetes.io~secret/combined-ca-bundle Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.100956 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acfde95a-b68d-4aee-9302-a81c73eafa99-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "acfde95a-b68d-4aee-9302-a81c73eafa99" (UID: "acfde95a-b68d-4aee-9302-a81c73eafa99"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:22:12 crc kubenswrapper[4823]: E1216 07:22:12.101097 4823 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Dec 16 07:22:12 crc kubenswrapper[4823]: E1216 07:22:12.101157 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a686a945-8fa0-406c-ac01-cf061c865a28-config-data podName:a686a945-8fa0-406c-ac01-cf061c865a28 nodeName:}" failed. No retries permitted until 2025-12-16 07:22:16.101141894 +0000 UTC m=+1614.589708117 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/a686a945-8fa0-406c-ac01-cf061c865a28-config-data") pod "rabbitmq-server-0" (UID: "a686a945-8fa0-406c-ac01-cf061c865a28") : configmap "rabbitmq-config-data" not found Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.101384 4823 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c4795acd-bc9b-4c2c-aaa2-feb41c3c491f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.101519 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acfde95a-b68d-4aee-9302-a81c73eafa99-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.101603 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/acfde95a-b68d-4aee-9302-a81c73eafa99-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.101675 4823 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/acfde95a-b68d-4aee-9302-a81c73eafa99-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 
07:22:12.101743 4823 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c4795acd-bc9b-4c2c-aaa2-feb41c3c491f-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.101810 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b566f9ee-8a75-4041-aac4-1573ca610541-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.101875 4823 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c4795acd-bc9b-4c2c-aaa2-feb41c3c491f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.101943 4823 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b5f6144a-70e4-4772-a8d8-2adf38127212-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.103210 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acfde95a-b68d-4aee-9302-a81c73eafa99-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "acfde95a-b68d-4aee-9302-a81c73eafa99" (UID: "acfde95a-b68d-4aee-9302-a81c73eafa99"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.108978 4823 generic.go:334] "Generic (PLEG): container finished" podID="acfde95a-b68d-4aee-9302-a81c73eafa99" containerID="4c4e79f2a5dd3e53e86fd303d293f08e7a4df7dc0b54cdda4b91bc74df4c3386" exitCode=0 Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.109037 4823 generic.go:334] "Generic (PLEG): container finished" podID="acfde95a-b68d-4aee-9302-a81c73eafa99" containerID="d25459bc894939d2fcabcc5640a5016ecc71457b4cdaf7962569b58c6456358d" exitCode=0 Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.109217 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-75996b444f-cfsnf" Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.150463 4823 generic.go:334] "Generic (PLEG): container finished" podID="dbb285b0-26ce-494d-9d69-8fe905e39469" containerID="dcb6ee461f8c315b99af0cef59bee6ad1bc80844d030f1935cac757ed7544094" exitCode=0 Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.150572 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b566f9ee-8a75-4041-aac4-1573ca610541-ovsdbserver-nb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-nb-tls-certs") pod "b566f9ee-8a75-4041-aac4-1573ca610541" (UID: "b566f9ee-8a75-4041-aac4-1573ca610541"). InnerVolumeSpecName "ovsdbserver-nb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.154293 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b566f9ee-8a75-4041-aac4-1573ca610541-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "b566f9ee-8a75-4041-aac4-1573ca610541" (UID: "b566f9ee-8a75-4041-aac4-1573ca610541"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.167903 4823 generic.go:334] "Generic (PLEG): container finished" podID="4edb9072-dfce-44ca-88d3-64136ac7e1c3" containerID="a75ddf05569eb842ee7ced1c6941dc65b4fda1943932e8021d1066d4d5ed3578" exitCode=0 Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.191110 4823 generic.go:334] "Generic (PLEG): container finished" podID="0bed5482-3232-4318-b8a0-dcfd51d8611b" containerID="f07f968d96b80dfc9117a3c0164dea559bbcdd8fa65d00f555226983c461f302" exitCode=0 Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.204693 4823 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b566f9ee-8a75-4041-aac4-1573ca610541-ovsdbserver-nb-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.205011 4823 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/acfde95a-b68d-4aee-9302-a81c73eafa99-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.205039 4823 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b566f9ee-8a75-4041-aac4-1573ca610541-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.266273 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"45a2fe80-7cf2-4419-91c9-3c958d33d5a8","Type":"ContainerDied","Data":"28af7097fe36966795ffd4f08fbf3fc9b6142fd27eb3db8592b4ce75e52927e8"} Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.266326 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican5a20-account-delete-f8kwx"] Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.266346 4823 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/barbican-worker-99f9cf477-cj5ss" event={"ID":"22db0f3f-88b5-4909-aa80-f4b020d1ce18","Type":"ContainerDied","Data":"831ecac837adb99ec40863b494bf97107b9ae0dabceaa9308863d145e46da25d"} Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.266360 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"b566f9ee-8a75-4041-aac4-1573ca610541","Type":"ContainerDied","Data":"51f23d92b9b14cdf5a284d17abcda28f72a9586a5fe031540b37af42aff48a7c"} Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.266377 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glanceec9f-account-delete-klr92"] Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.266394 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-956hc" event={"ID":"efe17b2e-19bd-430b-8cb5-147ed1d2ffb6","Type":"ContainerDied","Data":"ac1271e00df71a05fa4a2e68ea677a67464683b6651208b8b270e51f046cd15b"} Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.266404 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"cfd02f05-0804-48c6-b9b4-cda88fd6b14a","Type":"ContainerDied","Data":"251a58e423c3606cf72245339a9084a59e6134e6c468b74fb650a57e0f9ca8e9"} Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.266415 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glanceec9f-account-delete-klr92" event={"ID":"362dcfe9-8417-425b-8eab-8bd39bf661fc","Type":"ContainerStarted","Data":"d5b8ab3e0453fede3de057778b21383cdbab6e3c06b304cd26ba5004fde865c7"} Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.266423 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"37eade87-02f6-4584-87d3-9b22e16ad915","Type":"ContainerDied","Data":"01e5b8f2f03cdaee2d9aa0f7009e062e757b69095af1ac126d2b409afda22307"} Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.266434 4823 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/swift-storage-0" event={"ID":"37eade87-02f6-4584-87d3-9b22e16ad915","Type":"ContainerDied","Data":"6989d85752f4e1b6c7b23a46754686007edf09212f93e356aa9e002490d63f86"} Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.266443 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"603d469a-39a2-4d84-87cb-f2c7499b7a28","Type":"ContainerDied","Data":"0bf498f98b17c62cf40b0e4da2f105da1f7c94abfdaa144c0064b688a016bc7c"} Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.266453 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fcd6f8f8f-l8nbv" event={"ID":"c4795acd-bc9b-4c2c-aaa2-feb41c3c491f","Type":"ContainerDied","Data":"5ef6f083b75b8234ee95ab33fb173e5b23c5618094d3c17a0fd61db492a224b3"} Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.266466 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-75996b444f-cfsnf" event={"ID":"acfde95a-b68d-4aee-9302-a81c73eafa99","Type":"ContainerDied","Data":"4c4e79f2a5dd3e53e86fd303d293f08e7a4df7dc0b54cdda4b91bc74df4c3386"} Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.266479 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-75996b444f-cfsnf" event={"ID":"acfde95a-b68d-4aee-9302-a81c73eafa99","Type":"ContainerDied","Data":"d25459bc894939d2fcabcc5640a5016ecc71457b4cdaf7962569b58c6456358d"} Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.266488 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-75996b444f-cfsnf" event={"ID":"acfde95a-b68d-4aee-9302-a81c73eafa99","Type":"ContainerDied","Data":"f4d8edf31a44a0c46340fbdc8bc97c9bb07031f9c938b8d86a94a020fa999433"} Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.266496 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican5a20-account-delete-f8kwx" 
event={"ID":"81b4a0e9-2642-4fc1-b6fc-f5a0367a34ab","Type":"ContainerStarted","Data":"a8fb5c1e54480b603b47932795e53e6483ea4c9e82b99b01de203609825ebb35"} Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.266504 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"dbb285b0-26ce-494d-9d69-8fe905e39469","Type":"ContainerDied","Data":"dcb6ee461f8c315b99af0cef59bee6ad1bc80844d030f1935cac757ed7544094"} Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.266515 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-29jcz" event={"ID":"4edb9072-dfce-44ca-88d3-64136ac7e1c3","Type":"ContainerDied","Data":"a75ddf05569eb842ee7ced1c6941dc65b4fda1943932e8021d1066d4d5ed3578"} Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.266528 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placementf3f4-account-delete-kq7rl" event={"ID":"0bed5482-3232-4318-b8a0-dcfd51d8611b","Type":"ContainerDied","Data":"f07f968d96b80dfc9117a3c0164dea559bbcdd8fa65d00f555226983c461f302"} Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.266545 4823 scope.go:117] "RemoveContainer" containerID="ad6913219d4e64984189276d714fe66372819d7e73a5bd2b7c37eef8a55f9181" Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.357745 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.380790 4823 scope.go:117] "RemoveContainer" containerID="58ecdd2ed593319a070e731bb8ab19823f35297f4e564693222927a67dcf6a7b" Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.408093 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/45a2fe80-7cf2-4419-91c9-3c958d33d5a8-galera-tls-certs\") pod \"45a2fe80-7cf2-4419-91c9-3c958d33d5a8\" (UID: \"45a2fe80-7cf2-4419-91c9-3c958d33d5a8\") " Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.408251 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-whqlv\" (UniqueName: \"kubernetes.io/projected/45a2fe80-7cf2-4419-91c9-3c958d33d5a8-kube-api-access-whqlv\") pod \"45a2fe80-7cf2-4419-91c9-3c958d33d5a8\" (UID: \"45a2fe80-7cf2-4419-91c9-3c958d33d5a8\") " Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.408336 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/45a2fe80-7cf2-4419-91c9-3c958d33d5a8-config-data-generated\") pod \"45a2fe80-7cf2-4419-91c9-3c958d33d5a8\" (UID: \"45a2fe80-7cf2-4419-91c9-3c958d33d5a8\") " Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.408396 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45a2fe80-7cf2-4419-91c9-3c958d33d5a8-combined-ca-bundle\") pod \"45a2fe80-7cf2-4419-91c9-3c958d33d5a8\" (UID: \"45a2fe80-7cf2-4419-91c9-3c958d33d5a8\") " Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.408511 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45a2fe80-7cf2-4419-91c9-3c958d33d5a8-operator-scripts\") pod 
\"45a2fe80-7cf2-4419-91c9-3c958d33d5a8\" (UID: \"45a2fe80-7cf2-4419-91c9-3c958d33d5a8\") " Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.408556 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/45a2fe80-7cf2-4419-91c9-3c958d33d5a8-config-data-default\") pod \"45a2fe80-7cf2-4419-91c9-3c958d33d5a8\" (UID: \"45a2fe80-7cf2-4419-91c9-3c958d33d5a8\") " Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.408582 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"45a2fe80-7cf2-4419-91c9-3c958d33d5a8\" (UID: \"45a2fe80-7cf2-4419-91c9-3c958d33d5a8\") " Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.408637 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/45a2fe80-7cf2-4419-91c9-3c958d33d5a8-kolla-config\") pod \"45a2fe80-7cf2-4419-91c9-3c958d33d5a8\" (UID: \"45a2fe80-7cf2-4419-91c9-3c958d33d5a8\") " Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.409398 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45a2fe80-7cf2-4419-91c9-3c958d33d5a8-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "45a2fe80-7cf2-4419-91c9-3c958d33d5a8" (UID: "45a2fe80-7cf2-4419-91c9-3c958d33d5a8"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.409899 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45a2fe80-7cf2-4419-91c9-3c958d33d5a8-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "45a2fe80-7cf2-4419-91c9-3c958d33d5a8" (UID: "45a2fe80-7cf2-4419-91c9-3c958d33d5a8"). InnerVolumeSpecName "kolla-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.413345 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45a2fe80-7cf2-4419-91c9-3c958d33d5a8-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "45a2fe80-7cf2-4419-91c9-3c958d33d5a8" (UID: "45a2fe80-7cf2-4419-91c9-3c958d33d5a8"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.415212 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45a2fe80-7cf2-4419-91c9-3c958d33d5a8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "45a2fe80-7cf2-4419-91c9-3c958d33d5a8" (UID: "45a2fe80-7cf2-4419-91c9-3c958d33d5a8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.420558 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45a2fe80-7cf2-4419-91c9-3c958d33d5a8-kube-api-access-whqlv" (OuterVolumeSpecName: "kube-api-access-whqlv") pod "45a2fe80-7cf2-4419-91c9-3c958d33d5a8" (UID: "45a2fe80-7cf2-4419-91c9-3c958d33d5a8"). InnerVolumeSpecName "kube-api-access-whqlv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.421935 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "mysql-db") pod "45a2fe80-7cf2-4419-91c9-3c958d33d5a8" (UID: "45a2fe80-7cf2-4419-91c9-3c958d33d5a8"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.442767 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.452528 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder1f3e-account-delete-q5pns"] Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.453556 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45a2fe80-7cf2-4419-91c9-3c958d33d5a8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "45a2fe80-7cf2-4419-91c9-3c958d33d5a8" (UID: "45a2fe80-7cf2-4419-91c9-3c958d33d5a8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.476774 4823 scope.go:117] "RemoveContainer" containerID="f115ec7d425d70b2afcfd5cf1785d9ea4d296e40ca9ff51d30788a90679af605" Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.483687 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutronba48-account-delete-87d8j"] Dec 16 07:22:12 crc kubenswrapper[4823]: W1216 07:22:12.504132 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4a3f54ee_1dba_42f5_8697_b70de0f5b4c2.slice/crio-82294d0918b15b9d66958622c0c2cbf6ca03371d7da85f785b9259c3c2e86a07 WatchSource:0}: Error finding container 82294d0918b15b9d66958622c0c2cbf6ca03371d7da85f785b9259c3c2e86a07: Status 404 returned error can't find the container with id 82294d0918b15b9d66958622c0c2cbf6ca03371d7da85f785b9259c3c2e86a07 Dec 16 07:22:12 crc kubenswrapper[4823]: W1216 07:22:12.506660 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod65278526_b5ee_4e40_b66b_1ee9b993f429.slice/crio-c9b587abd6e349be75aa6251d530d3502453499c8ac2414b5493dd65b04180e1 WatchSource:0}: Error finding container c9b587abd6e349be75aa6251d530d3502453499c8ac2414b5493dd65b04180e1: Status 404 
returned error can't find the container with id c9b587abd6e349be75aa6251d530d3502453499c8ac2414b5493dd65b04180e1 Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.510965 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbb285b0-26ce-494d-9d69-8fe905e39469-vencrypt-tls-certs\") pod \"dbb285b0-26ce-494d-9d69-8fe905e39469\" (UID: \"dbb285b0-26ce-494d-9d69-8fe905e39469\") " Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.511107 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbb285b0-26ce-494d-9d69-8fe905e39469-combined-ca-bundle\") pod \"dbb285b0-26ce-494d-9d69-8fe905e39469\" (UID: \"dbb285b0-26ce-494d-9d69-8fe905e39469\") " Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.511176 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbb285b0-26ce-494d-9d69-8fe905e39469-nova-novncproxy-tls-certs\") pod \"dbb285b0-26ce-494d-9d69-8fe905e39469\" (UID: \"dbb285b0-26ce-494d-9d69-8fe905e39469\") " Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.511206 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w5f8g\" (UniqueName: \"kubernetes.io/projected/dbb285b0-26ce-494d-9d69-8fe905e39469-kube-api-access-w5f8g\") pod \"dbb285b0-26ce-494d-9d69-8fe905e39469\" (UID: \"dbb285b0-26ce-494d-9d69-8fe905e39469\") " Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.511226 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbb285b0-26ce-494d-9d69-8fe905e39469-config-data\") pod \"dbb285b0-26ce-494d-9d69-8fe905e39469\" (UID: \"dbb285b0-26ce-494d-9d69-8fe905e39469\") " Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.511639 4823 
reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.511653 4823 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/45a2fe80-7cf2-4419-91c9-3c958d33d5a8-kolla-config\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.511664 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-whqlv\" (UniqueName: \"kubernetes.io/projected/45a2fe80-7cf2-4419-91c9-3c958d33d5a8-kube-api-access-whqlv\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.511673 4823 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/45a2fe80-7cf2-4419-91c9-3c958d33d5a8-config-data-generated\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.511681 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45a2fe80-7cf2-4419-91c9-3c958d33d5a8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.511690 4823 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45a2fe80-7cf2-4419-91c9-3c958d33d5a8-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.511699 4823 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/45a2fe80-7cf2-4419-91c9-3c958d33d5a8-config-data-default\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.517159 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/45a2fe80-7cf2-4419-91c9-3c958d33d5a8-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "45a2fe80-7cf2-4419-91c9-3c958d33d5a8" (UID: "45a2fe80-7cf2-4419-91c9-3c958d33d5a8"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.520545 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/novaapic1ba-account-delete-sldxg"] Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.521704 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbb285b0-26ce-494d-9d69-8fe905e39469-kube-api-access-w5f8g" (OuterVolumeSpecName: "kube-api-access-w5f8g") pod "dbb285b0-26ce-494d-9d69-8fe905e39469" (UID: "dbb285b0-26ce-494d-9d69-8fe905e39469"). InnerVolumeSpecName "kube-api-access-w5f8g". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.558670 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbb285b0-26ce-494d-9d69-8fe905e39469-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dbb285b0-26ce-494d-9d69-8fe905e39469" (UID: "dbb285b0-26ce-494d-9d69-8fe905e39469"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.578005 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.579512 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.581732 4823 scope.go:117] "RemoveContainer" containerID="fd126ddb078fca0b47691ccb775b5689a84f7d7d50e7281488f17b418ac9e03a" Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.587753 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.598640 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.606784 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.612100 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.612793 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a27cd126-6c5b-4e95-b313-0bb19568f42a-scripts\") pod \"a27cd126-6c5b-4e95-b313-0bb19568f42a\" (UID: \"a27cd126-6c5b-4e95-b313-0bb19568f42a\") " Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.612824 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k5sv5\" (UniqueName: \"kubernetes.io/projected/a27cd126-6c5b-4e95-b313-0bb19568f42a-kube-api-access-k5sv5\") pod \"a27cd126-6c5b-4e95-b313-0bb19568f42a\" (UID: \"a27cd126-6c5b-4e95-b313-0bb19568f42a\") " Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.612886 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a27cd126-6c5b-4e95-b313-0bb19568f42a-combined-ca-bundle\") pod \"a27cd126-6c5b-4e95-b313-0bb19568f42a\" (UID: \"a27cd126-6c5b-4e95-b313-0bb19568f42a\") " Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.612985 4823 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a27cd126-6c5b-4e95-b313-0bb19568f42a-config-data-custom\") pod \"a27cd126-6c5b-4e95-b313-0bb19568f42a\" (UID: \"a27cd126-6c5b-4e95-b313-0bb19568f42a\") " Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.613009 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a27cd126-6c5b-4e95-b313-0bb19568f42a-config-data\") pod \"a27cd126-6c5b-4e95-b313-0bb19568f42a\" (UID: \"a27cd126-6c5b-4e95-b313-0bb19568f42a\") " Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.613052 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a27cd126-6c5b-4e95-b313-0bb19568f42a-etc-machine-id\") pod \"a27cd126-6c5b-4e95-b313-0bb19568f42a\" (UID: \"a27cd126-6c5b-4e95-b313-0bb19568f42a\") " Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.613436 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w5f8g\" (UniqueName: \"kubernetes.io/projected/dbb285b0-26ce-494d-9d69-8fe905e39469-kube-api-access-w5f8g\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.613448 4823 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/45a2fe80-7cf2-4419-91c9-3c958d33d5a8-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.613457 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbb285b0-26ce-494d-9d69-8fe905e39469-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.613521 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/a27cd126-6c5b-4e95-b313-0bb19568f42a-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "a27cd126-6c5b-4e95-b313-0bb19568f42a" (UID: "a27cd126-6c5b-4e95-b313-0bb19568f42a"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.619893 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-northd-0"] Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.637605 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a27cd126-6c5b-4e95-b313-0bb19568f42a-scripts" (OuterVolumeSpecName: "scripts") pod "a27cd126-6c5b-4e95-b313-0bb19568f42a" (UID: "a27cd126-6c5b-4e95-b313-0bb19568f42a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.644991 4823 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.646559 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a27cd126-6c5b-4e95-b313-0bb19568f42a-kube-api-access-k5sv5" (OuterVolumeSpecName: "kube-api-access-k5sv5") pod "a27cd126-6c5b-4e95-b313-0bb19568f42a" (UID: "a27cd126-6c5b-4e95-b313-0bb19568f42a"). InnerVolumeSpecName "kube-api-access-k5sv5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.649657 4823 scope.go:117] "RemoveContainer" containerID="5f969b423030012c6374edf5d132a7aa122d3b273687a37f08e1b4c115ee2b6a" Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.652189 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbb285b0-26ce-494d-9d69-8fe905e39469-config-data" (OuterVolumeSpecName: "config-data") pod "dbb285b0-26ce-494d-9d69-8fe905e39469" (UID: "dbb285b0-26ce-494d-9d69-8fe905e39469"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.657106 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a27cd126-6c5b-4e95-b313-0bb19568f42a-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "a27cd126-6c5b-4e95-b313-0bb19568f42a" (UID: "a27cd126-6c5b-4e95-b313-0bb19568f42a"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.678111 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-fcd6f8f8f-l8nbv"] Dec 16 07:22:12 crc kubenswrapper[4823]: W1216 07:22:12.683177 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddfd7efdc_36ba_4037_9f6c_a1a8c946ab33.slice/crio-73eafc561fc77a9630134c39e4984c1ab313fe1246675f93fff1e81ec4c4178c WatchSource:0}: Error finding container 73eafc561fc77a9630134c39e4984c1ab313fe1246675f93fff1e81ec4c4178c: Status 404 returned error can't find the container with id 73eafc561fc77a9630134c39e4984c1ab313fe1246675f93fff1e81ec4c4178c Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.684867 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-fcd6f8f8f-l8nbv"] Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.693139 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-75996b444f-cfsnf"] Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.696815 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/novacell06c77-account-delete-5jkkk"] Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.702267 4823 scope.go:117] "RemoveContainer" containerID="c687331eefea963d1e68c44d1eded52992a9e7de45fe0c58d59d647313f5f399" Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.705464 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-proxy-75996b444f-cfsnf"] Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.715292 4823 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a27cd126-6c5b-4e95-b313-0bb19568f42a-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.715326 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k5sv5\" 
(UniqueName: \"kubernetes.io/projected/a27cd126-6c5b-4e95-b313-0bb19568f42a-kube-api-access-k5sv5\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.715336 4823 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a27cd126-6c5b-4e95-b313-0bb19568f42a-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.715345 4823 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a27cd126-6c5b-4e95-b313-0bb19568f42a-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.715353 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbb285b0-26ce-494d-9d69-8fe905e39469-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.715366 4823 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:12 crc kubenswrapper[4823]: E1216 07:22:12.715423 4823 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Dec 16 07:22:12 crc kubenswrapper[4823]: E1216 07:22:12.715468 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1-config-data podName:cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1 nodeName:}" failed. No retries permitted until 2025-12-16 07:22:16.715454244 +0000 UTC m=+1615.204020367 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1-config-data") pod "rabbitmq-cell1-server-0" (UID: "cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1") : configmap "rabbitmq-cell1-config-data" not found Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.740331 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbb285b0-26ce-494d-9d69-8fe905e39469-vencrypt-tls-certs" (OuterVolumeSpecName: "vencrypt-tls-certs") pod "dbb285b0-26ce-494d-9d69-8fe905e39469" (UID: "dbb285b0-26ce-494d-9d69-8fe905e39469"). InnerVolumeSpecName "vencrypt-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.765238 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbb285b0-26ce-494d-9d69-8fe905e39469-nova-novncproxy-tls-certs" (OuterVolumeSpecName: "nova-novncproxy-tls-certs") pod "dbb285b0-26ce-494d-9d69-8fe905e39469" (UID: "dbb285b0-26ce-494d-9d69-8fe905e39469"). InnerVolumeSpecName "nova-novncproxy-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.794368 4823 scope.go:117] "RemoveContainer" containerID="06ecae0f130331b9c70dbb4604848fad60c6fe33be915c08a2e497633d78988f" Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.809454 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a27cd126-6c5b-4e95-b313-0bb19568f42a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a27cd126-6c5b-4e95-b313-0bb19568f42a" (UID: "a27cd126-6c5b-4e95-b313-0bb19568f42a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.821822 4823 reconciler_common.go:293] "Volume detached for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbb285b0-26ce-494d-9d69-8fe905e39469-nova-novncproxy-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.821856 4823 reconciler_common.go:293] "Volume detached for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbb285b0-26ce-494d-9d69-8fe905e39469-vencrypt-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.821865 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a27cd126-6c5b-4e95-b313-0bb19568f42a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.834033 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a27cd126-6c5b-4e95-b313-0bb19568f42a-config-data" (OuterVolumeSpecName: "config-data") pod "a27cd126-6c5b-4e95-b313-0bb19568f42a" (UID: "a27cd126-6c5b-4e95-b313-0bb19568f42a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.919271 4823 scope.go:117] "RemoveContainer" containerID="90bb6f7603a93a35c6ff65c8dd4f67d20079e1d4acfdcadb0ec6ae63addd6404" Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.924355 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a27cd126-6c5b-4e95-b313-0bb19568f42a-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.973876 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.975603 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="77e933eb-7294-47b8-af0c-fbb03725d3d8" containerName="ceilometer-central-agent" containerID="cri-o://fdc90f0e714e4f423158468800e55aec113c93ffa463f7e4d06ce66853197e2c" gracePeriod=30 Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.975751 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="77e933eb-7294-47b8-af0c-fbb03725d3d8" containerName="proxy-httpd" containerID="cri-o://42c46777cdba45701a8cea0658eed9dc6daf90aae7d325c8008f6934dfe32214" gracePeriod=30 Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.975815 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="77e933eb-7294-47b8-af0c-fbb03725d3d8" containerName="sg-core" containerID="cri-o://eb53ac47f3a5804dff14f24e15141b3633409733b254cc498392acbc24442813" gracePeriod=30 Dec 16 07:22:12 crc kubenswrapper[4823]: I1216 07:22:12.975868 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="77e933eb-7294-47b8-af0c-fbb03725d3d8" containerName="ceilometer-notification-agent" 
containerID="cri-o://c5e2a5c0a31ec4e9f5ca94f30d407742eb67d1531e0744587618b59f17762294" gracePeriod=30 Dec 16 07:22:13 crc kubenswrapper[4823]: I1216 07:22:13.036783 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 16 07:22:13 crc kubenswrapper[4823]: I1216 07:22:13.037051 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="4a9f0e08-d61e-4503-afc5-09cb29ff3175" containerName="kube-state-metrics" containerID="cri-o://48f6096b95361df10996fa9107240728047380521cee4e036be0b67323319318" gracePeriod=30 Dec 16 07:22:13 crc kubenswrapper[4823]: I1216 07:22:13.107728 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Dec 16 07:22:13 crc kubenswrapper[4823]: I1216 07:22:13.107892 4823 scope.go:117] "RemoveContainer" containerID="14f9aa7c5d7c0e6bdf53c979b009b546f44e6652421ca6154616d807431fa6e2" Dec 16 07:22:13 crc kubenswrapper[4823]: I1216 07:22:13.108343 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/memcached-0" podUID="3eee92de-9c0e-4afd-8a27-52d82caa27ad" containerName="memcached" containerID="cri-o://f726deb4280e9246c48b014e13b7b17cc95089d7bc4863e4e768298ed64067ba" gracePeriod=30 Dec 16 07:22:13 crc kubenswrapper[4823]: I1216 07:22:13.220211 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-np6b5"] Dec 16 07:22:13 crc kubenswrapper[4823]: I1216 07:22:13.230491 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-np6b5"] Dec 16 07:22:13 crc kubenswrapper[4823]: I1216 07:22:13.250587 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-2jnpg"] Dec 16 07:22:13 crc kubenswrapper[4823]: I1216 07:22:13.254838 4823 generic.go:334] "Generic (PLEG): container finished" podID="4a9f0e08-d61e-4503-afc5-09cb29ff3175" 
containerID="48f6096b95361df10996fa9107240728047380521cee4e036be0b67323319318" exitCode=2 Dec 16 07:22:13 crc kubenswrapper[4823]: I1216 07:22:13.254937 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"4a9f0e08-d61e-4503-afc5-09cb29ff3175","Type":"ContainerDied","Data":"48f6096b95361df10996fa9107240728047380521cee4e036be0b67323319318"} Dec 16 07:22:13 crc kubenswrapper[4823]: I1216 07:22:13.273774 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novaapic1ba-account-delete-sldxg" event={"ID":"65278526-b5ee-4e40-b66b-1ee9b993f429","Type":"ContainerStarted","Data":"c9b587abd6e349be75aa6251d530d3502453499c8ac2414b5493dd65b04180e1"} Dec 16 07:22:13 crc kubenswrapper[4823]: I1216 07:22:13.283934 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-2jnpg"] Dec 16 07:22:13 crc kubenswrapper[4823]: I1216 07:22:13.283993 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell06c77-account-delete-5jkkk" event={"ID":"dfd7efdc-36ba-4037-9f6c-a1a8c946ab33","Type":"ContainerStarted","Data":"73eafc561fc77a9630134c39e4984c1ab313fe1246675f93fff1e81ec4c4178c"} Dec 16 07:22:13 crc kubenswrapper[4823]: I1216 07:22:13.287910 4823 generic.go:334] "Generic (PLEG): container finished" podID="81b4a0e9-2642-4fc1-b6fc-f5a0367a34ab" containerID="72c14dbead3689fee64d41c987422c77b20738b49f095e2e65138aaa6b36bf8d" exitCode=0 Dec 16 07:22:13 crc kubenswrapper[4823]: I1216 07:22:13.291241 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican5a20-account-delete-f8kwx" event={"ID":"81b4a0e9-2642-4fc1-b6fc-f5a0367a34ab","Type":"ContainerDied","Data":"72c14dbead3689fee64d41c987422c77b20738b49f095e2e65138aaa6b36bf8d"} Dec 16 07:22:13 crc kubenswrapper[4823]: I1216 07:22:13.292881 4823 generic.go:334] "Generic (PLEG): container finished" podID="a27cd126-6c5b-4e95-b313-0bb19568f42a" 
containerID="dec8a740e5a159ade11e7e1e6846443afc7b4cf141676ae0fa16ecdefdc7efc5" exitCode=0 Dec 16 07:22:13 crc kubenswrapper[4823]: I1216 07:22:13.292974 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a27cd126-6c5b-4e95-b313-0bb19568f42a","Type":"ContainerDied","Data":"dec8a740e5a159ade11e7e1e6846443afc7b4cf141676ae0fa16ecdefdc7efc5"} Dec 16 07:22:13 crc kubenswrapper[4823]: I1216 07:22:13.293056 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a27cd126-6c5b-4e95-b313-0bb19568f42a","Type":"ContainerDied","Data":"55e3e3c97fe64bb8c1f0e0df7efd5f5006ca6ff0ffd6c2588c464f2071ce4177"} Dec 16 07:22:13 crc kubenswrapper[4823]: I1216 07:22:13.293078 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 16 07:22:13 crc kubenswrapper[4823]: I1216 07:22:13.299522 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-6c7767d9f4-5rbv6"] Dec 16 07:22:13 crc kubenswrapper[4823]: I1216 07:22:13.300588 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glanceec9f-account-delete-klr92" event={"ID":"362dcfe9-8417-425b-8eab-8bd39bf661fc","Type":"ContainerDied","Data":"8e4f40865e8b5b6f7f423d358797fe6b396dfe5efe196d02239ab7408b314c84"} Dec 16 07:22:13 crc kubenswrapper[4823]: I1216 07:22:13.300197 4823 generic.go:334] "Generic (PLEG): container finished" podID="362dcfe9-8417-425b-8eab-8bd39bf661fc" containerID="8e4f40865e8b5b6f7f423d358797fe6b396dfe5efe196d02239ab7408b314c84" exitCode=0 Dec 16 07:22:13 crc kubenswrapper[4823]: I1216 07:22:13.302151 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/keystone-6c7767d9f4-5rbv6" podUID="d7a88b40-28bf-4b43-bed8-0b3df3baec5c" containerName="keystone-api" containerID="cri-o://c0cd38487b75afdb67a7225ee2f0fe111d46a163417ffe7f85edb1cbb15aead4" gracePeriod=30 Dec 16 07:22:13 crc 
kubenswrapper[4823]: I1216 07:22:13.302755 4823 scope.go:117] "RemoveContainer" containerID="bbb61b83c03517ab496d3469eca7132d7dd7639ebb3875043aeecd6b0de352ca" Dec 16 07:22:13 crc kubenswrapper[4823]: I1216 07:22:13.307433 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"45a2fe80-7cf2-4419-91c9-3c958d33d5a8","Type":"ContainerDied","Data":"d649e376b8690bada7045f5b0459236523b60396dcf2c13df59b05b65cdff845"} Dec 16 07:22:13 crc kubenswrapper[4823]: I1216 07:22:13.307550 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 16 07:22:13 crc kubenswrapper[4823]: I1216 07:22:13.313377 4823 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="17cbb31a-6067-4925-ba57-956baf53ce8b" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.163:8776/healthcheck\": dial tcp 10.217.0.163:8776: connect: connection refused" Dec 16 07:22:13 crc kubenswrapper[4823]: I1216 07:22:13.322139 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutronba48-account-delete-87d8j" event={"ID":"4a3f54ee-1dba-42f5-8697-b70de0f5b4c2","Type":"ContainerStarted","Data":"82294d0918b15b9d66958622c0c2cbf6ca03371d7da85f785b9259c3c2e86a07"} Dec 16 07:22:13 crc kubenswrapper[4823]: I1216 07:22:13.328477 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Dec 16 07:22:13 crc kubenswrapper[4823]: I1216 07:22:13.330560 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"dbb285b0-26ce-494d-9d69-8fe905e39469","Type":"ContainerDied","Data":"b599852651b94ae9867c04811d750013da43c36522f96160d7a3e0a15baad0ab"} Dec 16 07:22:13 crc kubenswrapper[4823]: I1216 07:22:13.330666 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 16 07:22:13 crc kubenswrapper[4823]: I1216 07:22:13.336496 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder1f3e-account-delete-q5pns" event={"ID":"ec00a24a-8417-452e-a350-b46f36d4a84d","Type":"ContainerStarted","Data":"fc21714557f1933f1c54e52ba6c8488cf3c0a643887a4313a7800d55d3a72eb1"} Dec 16 07:22:13 crc kubenswrapper[4823]: I1216 07:22:13.339473 4823 generic.go:334] "Generic (PLEG): container finished" podID="77e933eb-7294-47b8-af0c-fbb03725d3d8" containerID="eb53ac47f3a5804dff14f24e15141b3633409733b254cc498392acbc24442813" exitCode=2 Dec 16 07:22:13 crc kubenswrapper[4823]: I1216 07:22:13.339695 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"77e933eb-7294-47b8-af0c-fbb03725d3d8","Type":"ContainerDied","Data":"eb53ac47f3a5804dff14f24e15141b3633409733b254cc498392acbc24442813"} Dec 16 07:22:13 crc kubenswrapper[4823]: I1216 07:22:13.344381 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-76bqm"] Dec 16 07:22:13 crc kubenswrapper[4823]: I1216 07:22:13.370299 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-76bqm"] Dec 16 07:22:13 crc kubenswrapper[4823]: I1216 07:22:13.399179 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-df58-account-create-update-6mv8r"] Dec 16 07:22:13 crc kubenswrapper[4823]: I1216 07:22:13.409091 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-df58-account-create-update-6mv8r"] Dec 16 07:22:13 crc kubenswrapper[4823]: I1216 07:22:13.414272 4823 scope.go:117] "RemoveContainer" containerID="4c4e79f2a5dd3e53e86fd303d293f08e7a4df7dc0b54cdda4b91bc74df4c3386" Dec 16 07:22:13 crc kubenswrapper[4823]: I1216 07:22:13.437046 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 16 07:22:13 crc kubenswrapper[4823]: I1216 
07:22:13.448350 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 16 07:22:13 crc kubenswrapper[4823]: I1216 07:22:13.500314 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 16 07:22:13 crc kubenswrapper[4823]: I1216 07:22:13.505574 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 16 07:22:13 crc kubenswrapper[4823]: I1216 07:22:13.517452 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 16 07:22:13 crc kubenswrapper[4823]: I1216 07:22:13.523589 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 16 07:22:13 crc kubenswrapper[4823]: I1216 07:22:13.797545 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45a2fe80-7cf2-4419-91c9-3c958d33d5a8" path="/var/lib/kubelet/pods/45a2fe80-7cf2-4419-91c9-3c958d33d5a8/volumes" Dec 16 07:22:13 crc kubenswrapper[4823]: I1216 07:22:13.798468 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="603d469a-39a2-4d84-87cb-f2c7499b7a28" path="/var/lib/kubelet/pods/603d469a-39a2-4d84-87cb-f2c7499b7a28/volumes" Dec 16 07:22:13 crc kubenswrapper[4823]: I1216 07:22:13.799305 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74961679-896e-4f16-a5c3-12708a20a4b1" path="/var/lib/kubelet/pods/74961679-896e-4f16-a5c3-12708a20a4b1/volumes" Dec 16 07:22:13 crc kubenswrapper[4823]: I1216 07:22:13.800795 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d5ee65e-affe-42fd-af62-724d11efe03d" path="/var/lib/kubelet/pods/8d5ee65e-affe-42fd-af62-724d11efe03d/volumes" Dec 16 07:22:13 crc kubenswrapper[4823]: I1216 07:22:13.801544 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9dca6476-18f2-4c1a-8c95-e894c5f9facd" path="/var/lib/kubelet/pods/9dca6476-18f2-4c1a-8c95-e894c5f9facd/volumes" Dec 
16 07:22:13 crc kubenswrapper[4823]: I1216 07:22:13.802179 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a27cd126-6c5b-4e95-b313-0bb19568f42a" path="/var/lib/kubelet/pods/a27cd126-6c5b-4e95-b313-0bb19568f42a/volumes" Dec 16 07:22:13 crc kubenswrapper[4823]: I1216 07:22:13.803603 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="acfde95a-b68d-4aee-9302-a81c73eafa99" path="/var/lib/kubelet/pods/acfde95a-b68d-4aee-9302-a81c73eafa99/volumes" Dec 16 07:22:13 crc kubenswrapper[4823]: I1216 07:22:13.804425 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b566f9ee-8a75-4041-aac4-1573ca610541" path="/var/lib/kubelet/pods/b566f9ee-8a75-4041-aac4-1573ca610541/volumes" Dec 16 07:22:13 crc kubenswrapper[4823]: I1216 07:22:13.805654 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5f6144a-70e4-4772-a8d8-2adf38127212" path="/var/lib/kubelet/pods/b5f6144a-70e4-4772-a8d8-2adf38127212/volumes" Dec 16 07:22:13 crc kubenswrapper[4823]: I1216 07:22:13.806312 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4795acd-bc9b-4c2c-aaa2-feb41c3c491f" path="/var/lib/kubelet/pods/c4795acd-bc9b-4c2c-aaa2-feb41c3c491f/volumes" Dec 16 07:22:13 crc kubenswrapper[4823]: I1216 07:22:13.807124 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cfd02f05-0804-48c6-b9b4-cda88fd6b14a" path="/var/lib/kubelet/pods/cfd02f05-0804-48c6-b9b4-cda88fd6b14a/volumes" Dec 16 07:22:13 crc kubenswrapper[4823]: I1216 07:22:13.808306 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dbb285b0-26ce-494d-9d69-8fe905e39469" path="/var/lib/kubelet/pods/dbb285b0-26ce-494d-9d69-8fe905e39469/volumes" Dec 16 07:22:13 crc kubenswrapper[4823]: I1216 07:22:13.808918 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dee4f17f-49e7-4f83-b138-a913f67757b3" path="/var/lib/kubelet/pods/dee4f17f-49e7-4f83-b138-a913f67757b3/volumes" Dec 
16 07:22:13 crc kubenswrapper[4823]: I1216 07:22:13.916643 4823 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/placement-59fd5f5fb-h7tf5" podUID="196356f3-e866-4cf1-b3e8-eba3d9e4c99f" containerName="placement-api" probeResult="failure" output="Get \"https://10.217.0.148:8778/\": read tcp 10.217.0.2:35416->10.217.0.148:8778: read: connection reset by peer" Dec 16 07:22:13 crc kubenswrapper[4823]: I1216 07:22:13.916902 4823 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/placement-59fd5f5fb-h7tf5" podUID="196356f3-e866-4cf1-b3e8-eba3d9e4c99f" containerName="placement-log" probeResult="failure" output="Get \"https://10.217.0.148:8778/\": read tcp 10.217.0.2:35424->10.217.0.148:8778: read: connection reset by peer" Dec 16 07:22:13 crc kubenswrapper[4823]: I1216 07:22:13.991596 4823 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="dbee1863-ef4e-4d0a-aca7-f7c09e3f0a50" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.170:9292/healthcheck\": read tcp 10.217.0.2:49642->10.217.0.170:9292: read: connection reset by peer" Dec 16 07:22:13 crc kubenswrapper[4823]: I1216 07:22:13.992877 4823 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="dbee1863-ef4e-4d0a-aca7-f7c09e3f0a50" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.170:9292/healthcheck\": read tcp 10.217.0.2:49646->10.217.0.170:9292: read: connection reset by peer" Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.014595 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-dbq9q"] Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.050699 4823 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6456ccccf4-rhnf4" podUID="c559ee21-de8f-44a1-998a-cb0b4aff8cd7" containerName="barbican-api-log" probeResult="failure" output="Get 
\"https://10.217.0.158:9311/healthcheck\": read tcp 10.217.0.2:43928->10.217.0.158:9311: read: connection reset by peer" Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.050823 4823 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6456ccccf4-rhnf4" podUID="c559ee21-de8f-44a1-998a-cb0b4aff8cd7" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.158:9311/healthcheck\": read tcp 10.217.0.2:43918->10.217.0.158:9311: read: connection reset by peer" Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.054969 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-dbq9q"] Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.096187 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-5a20-account-create-update-42fv4"] Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.166137 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-5a20-account-create-update-42fv4"] Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.179273 4823 scope.go:117] "RemoveContainer" containerID="d25459bc894939d2fcabcc5640a5016ecc71457b4cdaf7962569b58c6456358d" Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.198120 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican5a20-account-delete-f8kwx"] Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.209062 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-dxgzr"] Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.220161 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-dxgzr"] Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.228367 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-f3f4-account-create-update-tlhkf"] Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.241531 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/placementf3f4-account-delete-kq7rl"] Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.251126 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-f3f4-account-create-update-tlhkf"] Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.274470 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-7g2pq"] Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.281728 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-7g2pq"] Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.302902 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glanceec9f-account-delete-klr92"] Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.321376 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-ec9f-account-create-update-c7mql"] Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.329112 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-ec9f-account-create-update-c7mql"] Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.339875 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-zw7xm"] Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.347211 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-zw7xm"] Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.358921 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-galera-0" podUID="dbcff04b-7d0d-45b4-bc28-7882421c6000" containerName="galera" containerID="cri-o://310505b16221fa383a21293252ba7eeff5b379b06f829c1276c31a12ac010ede" gracePeriod=29 Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.371047 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novaapic1ba-account-delete-sldxg"] Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.376879 4823 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/cinder1f3e-account-delete-q5pns" event={"ID":"ec00a24a-8417-452e-a350-b46f36d4a84d","Type":"ContainerStarted","Data":"d094c25706d601af35d0ca55d737a0c04cbd27e5d230d6ca4704d7d20a91aaad"} Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.377802 4823 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack/cinder1f3e-account-delete-q5pns" secret="" err="secret \"galera-openstack-dockercfg-ncfcv\" not found" Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.395433 4823 generic.go:334] "Generic (PLEG): container finished" podID="77e933eb-7294-47b8-af0c-fbb03725d3d8" containerID="42c46777cdba45701a8cea0658eed9dc6daf90aae7d325c8008f6934dfe32214" exitCode=0 Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.395460 4823 generic.go:334] "Generic (PLEG): container finished" podID="77e933eb-7294-47b8-af0c-fbb03725d3d8" containerID="fdc90f0e714e4f423158468800e55aec113c93ffa463f7e4d06ce66853197e2c" exitCode=0 Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.395506 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"77e933eb-7294-47b8-af0c-fbb03725d3d8","Type":"ContainerDied","Data":"42c46777cdba45701a8cea0658eed9dc6daf90aae7d325c8008f6934dfe32214"} Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.395538 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"77e933eb-7294-47b8-af0c-fbb03725d3d8","Type":"ContainerDied","Data":"fdc90f0e714e4f423158468800e55aec113c93ffa463f7e4d06ce66853197e2c"} Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.399264 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-c1ba-account-create-update-br8dd"] Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.419813 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-c1ba-account-create-update-br8dd"] Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 
07:22:14.423984 4823 generic.go:334] "Generic (PLEG): container finished" podID="6d2faec4-82e9-409b-a6c1-93f8cd78b9ec" containerID="7c64a48ce42e0fd4916be0a094d9954ddcea66b005eeff168ca0a4dec1eb2cff" exitCode=0 Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.424595 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6d2faec4-82e9-409b-a6c1-93f8cd78b9ec","Type":"ContainerDied","Data":"7c64a48ce42e0fd4916be0a094d9954ddcea66b005eeff168ca0a4dec1eb2cff"} Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.440210 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d3d6c697-a49c-4919-81b5-6899a080d06b","Type":"ContainerDied","Data":"f04505b3b1dfe8b6dfd28ec3fadb6fe3ba712cf0ca0ed6cc257b567eb1c5714b"} Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.440345 4823 generic.go:334] "Generic (PLEG): container finished" podID="d3d6c697-a49c-4919-81b5-6899a080d06b" containerID="f04505b3b1dfe8b6dfd28ec3fadb6fe3ba712cf0ca0ed6cc257b567eb1c5714b" exitCode=0 Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.442440 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-f74gj"] Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.452332 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novaapic1ba-account-delete-sldxg" event={"ID":"65278526-b5ee-4e40-b66b-1ee9b993f429","Type":"ContainerStarted","Data":"329a04c4c9ff60b70d5395a727c15217c7dff6014bc04044a76975b137573df3"} Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.452714 4823 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack/novaapic1ba-account-delete-sldxg" secret="" err="secret \"galera-openstack-dockercfg-ncfcv\" not found" Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.457859 4823 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openstack/neutronba48-account-delete-87d8j" secret="" err="secret \"galera-openstack-dockercfg-ncfcv\" not found" Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.458054 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutronba48-account-delete-87d8j" event={"ID":"4a3f54ee-1dba-42f5-8697-b70de0f5b4c2","Type":"ContainerStarted","Data":"7a23844a4049e95cfc8cfe3f25c2cd33313f835e15a5177e0274066336cddb91"} Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.482271 4823 generic.go:334] "Generic (PLEG): container finished" podID="dbee1863-ef4e-4d0a-aca7-f7c09e3f0a50" containerID="9b756370e64890389fb5a7ac91f02c8282951c0bd28b30fb354e18e101c1af71" exitCode=0 Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.482351 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"dbee1863-ef4e-4d0a-aca7-f7c09e3f0a50","Type":"ContainerDied","Data":"9b756370e64890389fb5a7ac91f02c8282951c0bd28b30fb354e18e101c1af71"} Dec 16 07:22:14 crc kubenswrapper[4823]: E1216 07:22:14.486165 4823 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 16 07:22:14 crc kubenswrapper[4823]: E1216 07:22:14.486214 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ec00a24a-8417-452e-a350-b46f36d4a84d-operator-scripts podName:ec00a24a-8417-452e-a350-b46f36d4a84d nodeName:}" failed. No retries permitted until 2025-12-16 07:22:14.986201534 +0000 UTC m=+1613.474767657 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/ec00a24a-8417-452e-a350-b46f36d4a84d-operator-scripts") pod "cinder1f3e-account-delete-q5pns" (UID: "ec00a24a-8417-452e-a350-b46f36d4a84d") : configmap "openstack-scripts" not found Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.488186 4823 generic.go:334] "Generic (PLEG): container finished" podID="196356f3-e866-4cf1-b3e8-eba3d9e4c99f" containerID="fa7ad139671c8c3444b9e62aff507fb0fc6b2d2d087722f71ba9f8cc7977708c" exitCode=0 Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.488243 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-59fd5f5fb-h7tf5" event={"ID":"196356f3-e866-4cf1-b3e8-eba3d9e4c99f","Type":"ContainerDied","Data":"fa7ad139671c8c3444b9e62aff507fb0fc6b2d2d087722f71ba9f8cc7977708c"} Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.508101 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell06c77-account-delete-5jkkk" event={"ID":"dfd7efdc-36ba-4037-9f6c-a1a8c946ab33","Type":"ContainerStarted","Data":"2ab531bd1e295bde32d0b4c0e522a1f27e9e25e3ec22550920bf8f2922c669d6"} Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.508688 4823 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openstack/novacell06c77-account-delete-5jkkk" secret="" err="secret \"galera-openstack-dockercfg-ncfcv\" not found" Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.520483 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-f74gj"] Dec 16 07:22:14 crc kubenswrapper[4823]: E1216 07:22:14.528505 4823 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc559ee21_de8f_44a1_998a_cb0b4aff8cd7.slice/crio-conmon-530a4f541e791946b14339252ed09b59df393a5827ee6015fa327e0dbbc98aec.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod77e933eb_7294_47b8_af0c_fbb03725d3d8.slice/crio-fdc90f0e714e4f423158468800e55aec113c93ffa463f7e4d06ce66853197e2c.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc559ee21_de8f_44a1_998a_cb0b4aff8cd7.slice/crio-530a4f541e791946b14339252ed09b59df393a5827ee6015fa327e0dbbc98aec.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddbee1863_ef4e_4d0a_aca7_f7c09e3f0a50.slice/crio-conmon-9b756370e64890389fb5a7ac91f02c8282951c0bd28b30fb354e18e101c1af71.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddbee1863_ef4e_4d0a_aca7_f7c09e3f0a50.slice/crio-9b756370e64890389fb5a7ac91f02c8282951c0bd28b30fb354e18e101c1af71.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod77e933eb_7294_47b8_af0c_fbb03725d3d8.slice/crio-conmon-42c46777cdba45701a8cea0658eed9dc6daf90aae7d325c8008f6934dfe32214.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6d2faec4_82e9_409b_a6c1_93f8cd78b9ec.slice/crio-conmon-7c64a48ce42e0fd4916be0a094d9954ddcea66b005eeff168ca0a4dec1eb2cff.scope\": RecentStats: unable to find data in memory cache]" Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.535314 4823 generic.go:334] "Generic (PLEG): container finished" podID="17cbb31a-6067-4925-ba57-956baf53ce8b" containerID="a2e711057ef9e93e470930a37179c721716096884ec2356c0cc2c2d27a2dddf4" exitCode=0 Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.535410 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"17cbb31a-6067-4925-ba57-956baf53ce8b","Type":"ContainerDied","Data":"a2e711057ef9e93e470930a37179c721716096884ec2356c0cc2c2d27a2dddf4"} Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.547501 4823 generic.go:334] "Generic (PLEG): container finished" podID="bc7377ca-c7ab-4ee0-ae2a-5fbb782ba925" containerID="a3797feae0da2f46b99e7827ab8d4f11114590dcdde7cc7247a8b58f538e9505" exitCode=0 Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.547579 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"bc7377ca-c7ab-4ee0-ae2a-5fbb782ba925","Type":"ContainerDied","Data":"a3797feae0da2f46b99e7827ab8d4f11114590dcdde7cc7247a8b58f538e9505"} Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.555774 4823 generic.go:334] "Generic (PLEG): container finished" podID="c559ee21-de8f-44a1-998a-cb0b4aff8cd7" containerID="530a4f541e791946b14339252ed09b59df393a5827ee6015fa327e0dbbc98aec" exitCode=0 Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.556098 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6456ccccf4-rhnf4" event={"ID":"c559ee21-de8f-44a1-998a-cb0b4aff8cd7","Type":"ContainerDied","Data":"530a4f541e791946b14339252ed09b59df393a5827ee6015fa327e0dbbc98aec"} Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 
07:22:14.579311 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novacell06c77-account-delete-5jkkk"] Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.585136 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-6c77-account-create-update-dktr4"] Dec 16 07:22:14 crc kubenswrapper[4823]: E1216 07:22:14.588402 4823 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 16 07:22:14 crc kubenswrapper[4823]: E1216 07:22:14.588462 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4a3f54ee-1dba-42f5-8697-b70de0f5b4c2-operator-scripts podName:4a3f54ee-1dba-42f5-8697-b70de0f5b4c2 nodeName:}" failed. No retries permitted until 2025-12-16 07:22:15.088444326 +0000 UTC m=+1613.577010449 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/4a3f54ee-1dba-42f5-8697-b70de0f5b4c2-operator-scripts") pod "neutronba48-account-delete-87d8j" (UID: "4a3f54ee-1dba-42f5-8697-b70de0f5b4c2") : configmap "openstack-scripts" not found Dec 16 07:22:14 crc kubenswrapper[4823]: E1216 07:22:14.588726 4823 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 16 07:22:14 crc kubenswrapper[4823]: E1216 07:22:14.588764 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/dfd7efdc-36ba-4037-9f6c-a1a8c946ab33-operator-scripts podName:dfd7efdc-36ba-4037-9f6c-a1a8c946ab33 nodeName:}" failed. No retries permitted until 2025-12-16 07:22:15.088752156 +0000 UTC m=+1613.577318279 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/dfd7efdc-36ba-4037-9f6c-a1a8c946ab33-operator-scripts") pod "novacell06c77-account-delete-5jkkk" (UID: "dfd7efdc-36ba-4037-9f6c-a1a8c946ab33") : configmap "openstack-scripts" not found Dec 16 07:22:14 crc kubenswrapper[4823]: E1216 07:22:14.589470 4823 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 16 07:22:14 crc kubenswrapper[4823]: E1216 07:22:14.589503 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/65278526-b5ee-4e40-b66b-1ee9b993f429-operator-scripts podName:65278526-b5ee-4e40-b66b-1ee9b993f429 nodeName:}" failed. No retries permitted until 2025-12-16 07:22:15.089493109 +0000 UTC m=+1613.578059232 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/65278526-b5ee-4e40-b66b-1ee9b993f429-operator-scripts") pod "novaapic1ba-account-delete-sldxg" (UID: "65278526-b5ee-4e40-b66b-1ee9b993f429") : configmap "openstack-scripts" not found Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.594608 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.594974 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-6c77-account-create-update-dktr4"] Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.600749 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-8fktl"] Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.612835 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placementf3f4-account-delete-kq7rl" Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.618537 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-8fktl"] Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.634243 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-1f3e-account-create-update-qtwg7"] Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.643080 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder1f3e-account-delete-q5pns"] Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.665883 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-1f3e-account-create-update-qtwg7"] Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.679110 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-dqg6x"] Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.686242 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-dqg6x"] Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.689100 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n9fcs\" (UniqueName: \"kubernetes.io/projected/4a9f0e08-d61e-4503-afc5-09cb29ff3175-kube-api-access-n9fcs\") pod \"4a9f0e08-d61e-4503-afc5-09cb29ff3175\" (UID: \"4a9f0e08-d61e-4503-afc5-09cb29ff3175\") " Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.689151 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a9f0e08-d61e-4503-afc5-09cb29ff3175-kube-state-metrics-tls-certs\") pod \"4a9f0e08-d61e-4503-afc5-09cb29ff3175\" (UID: \"4a9f0e08-d61e-4503-afc5-09cb29ff3175\") " Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.689245 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-config\" 
(UniqueName: \"kubernetes.io/secret/4a9f0e08-d61e-4503-afc5-09cb29ff3175-kube-state-metrics-tls-config\") pod \"4a9f0e08-d61e-4503-afc5-09cb29ff3175\" (UID: \"4a9f0e08-d61e-4503-afc5-09cb29ff3175\") " Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.689267 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gcbhz\" (UniqueName: \"kubernetes.io/projected/0bed5482-3232-4318-b8a0-dcfd51d8611b-kube-api-access-gcbhz\") pod \"0bed5482-3232-4318-b8a0-dcfd51d8611b\" (UID: \"0bed5482-3232-4318-b8a0-dcfd51d8611b\") " Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.689318 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0bed5482-3232-4318-b8a0-dcfd51d8611b-operator-scripts\") pod \"0bed5482-3232-4318-b8a0-dcfd51d8611b\" (UID: \"0bed5482-3232-4318-b8a0-dcfd51d8611b\") " Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.689352 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a9f0e08-d61e-4503-afc5-09cb29ff3175-combined-ca-bundle\") pod \"4a9f0e08-d61e-4503-afc5-09cb29ff3175\" (UID: \"4a9f0e08-d61e-4503-afc5-09cb29ff3175\") " Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.689926 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0bed5482-3232-4318-b8a0-dcfd51d8611b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0bed5482-3232-4318-b8a0-dcfd51d8611b" (UID: "0bed5482-3232-4318-b8a0-dcfd51d8611b"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.702296 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-ba48-account-create-update-tk44m"] Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.707891 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-ba48-account-create-update-tk44m"] Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.708493 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0bed5482-3232-4318-b8a0-dcfd51d8611b-kube-api-access-gcbhz" (OuterVolumeSpecName: "kube-api-access-gcbhz") pod "0bed5482-3232-4318-b8a0-dcfd51d8611b" (UID: "0bed5482-3232-4318-b8a0-dcfd51d8611b"). InnerVolumeSpecName "kube-api-access-gcbhz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.711162 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.717267 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a9f0e08-d61e-4503-afc5-09cb29ff3175-kube-api-access-n9fcs" (OuterVolumeSpecName: "kube-api-access-n9fcs") pod "4a9f0e08-d61e-4503-afc5-09cb29ff3175" (UID: "4a9f0e08-d61e-4503-afc5-09cb29ff3175"). InnerVolumeSpecName "kube-api-access-n9fcs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.717715 4823 scope.go:117] "RemoveContainer" containerID="4c4e79f2a5dd3e53e86fd303d293f08e7a4df7dc0b54cdda4b91bc74df4c3386" Dec 16 07:22:14 crc kubenswrapper[4823]: E1216 07:22:14.722712 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c4e79f2a5dd3e53e86fd303d293f08e7a4df7dc0b54cdda4b91bc74df4c3386\": container with ID starting with 4c4e79f2a5dd3e53e86fd303d293f08e7a4df7dc0b54cdda4b91bc74df4c3386 not found: ID does not exist" containerID="4c4e79f2a5dd3e53e86fd303d293f08e7a4df7dc0b54cdda4b91bc74df4c3386" Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.722743 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c4e79f2a5dd3e53e86fd303d293f08e7a4df7dc0b54cdda4b91bc74df4c3386"} err="failed to get container status \"4c4e79f2a5dd3e53e86fd303d293f08e7a4df7dc0b54cdda4b91bc74df4c3386\": rpc error: code = NotFound desc = could not find container \"4c4e79f2a5dd3e53e86fd303d293f08e7a4df7dc0b54cdda4b91bc74df4c3386\": container with ID starting with 4c4e79f2a5dd3e53e86fd303d293f08e7a4df7dc0b54cdda4b91bc74df4c3386 not found: ID does not exist" Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.722763 4823 scope.go:117] "RemoveContainer" containerID="d25459bc894939d2fcabcc5640a5016ecc71457b4cdaf7962569b58c6456358d" Dec 16 07:22:14 crc kubenswrapper[4823]: E1216 07:22:14.725109 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d25459bc894939d2fcabcc5640a5016ecc71457b4cdaf7962569b58c6456358d\": container with ID starting with d25459bc894939d2fcabcc5640a5016ecc71457b4cdaf7962569b58c6456358d not found: ID does not exist" containerID="d25459bc894939d2fcabcc5640a5016ecc71457b4cdaf7962569b58c6456358d" Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.725142 
4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d25459bc894939d2fcabcc5640a5016ecc71457b4cdaf7962569b58c6456358d"} err="failed to get container status \"d25459bc894939d2fcabcc5640a5016ecc71457b4cdaf7962569b58c6456358d\": rpc error: code = NotFound desc = could not find container \"d25459bc894939d2fcabcc5640a5016ecc71457b4cdaf7962569b58c6456358d\": container with ID starting with d25459bc894939d2fcabcc5640a5016ecc71457b4cdaf7962569b58c6456358d not found: ID does not exist" Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.725158 4823 scope.go:117] "RemoveContainer" containerID="4c4e79f2a5dd3e53e86fd303d293f08e7a4df7dc0b54cdda4b91bc74df4c3386" Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.728692 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c4e79f2a5dd3e53e86fd303d293f08e7a4df7dc0b54cdda4b91bc74df4c3386"} err="failed to get container status \"4c4e79f2a5dd3e53e86fd303d293f08e7a4df7dc0b54cdda4b91bc74df4c3386\": rpc error: code = NotFound desc = could not find container \"4c4e79f2a5dd3e53e86fd303d293f08e7a4df7dc0b54cdda4b91bc74df4c3386\": container with ID starting with 4c4e79f2a5dd3e53e86fd303d293f08e7a4df7dc0b54cdda4b91bc74df4c3386 not found: ID does not exist" Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.728722 4823 scope.go:117] "RemoveContainer" containerID="d25459bc894939d2fcabcc5640a5016ecc71457b4cdaf7962569b58c6456358d" Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.732337 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutronba48-account-delete-87d8j"] Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.742231 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d25459bc894939d2fcabcc5640a5016ecc71457b4cdaf7962569b58c6456358d"} err="failed to get container status \"d25459bc894939d2fcabcc5640a5016ecc71457b4cdaf7962569b58c6456358d\": rpc error: 
code = NotFound desc = could not find container \"d25459bc894939d2fcabcc5640a5016ecc71457b4cdaf7962569b58c6456358d\": container with ID starting with d25459bc894939d2fcabcc5640a5016ecc71457b4cdaf7962569b58c6456358d not found: ID does not exist" Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.742273 4823 scope.go:117] "RemoveContainer" containerID="457377de0d8e4d6837606a566ccbe412c1ad0e48f0692027311c6646fc5a9d02" Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.746350 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder1f3e-account-delete-q5pns" podStartSLOduration=5.746332132 podStartE2EDuration="5.746332132s" podCreationTimestamp="2025-12-16 07:22:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 07:22:14.391735675 +0000 UTC m=+1612.880301808" watchObservedRunningTime="2025-12-16 07:22:14.746332132 +0000 UTC m=+1613.234898245" Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.752279 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/novaapic1ba-account-delete-sldxg" podStartSLOduration=6.752265368 podStartE2EDuration="6.752265368s" podCreationTimestamp="2025-12-16 07:22:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 07:22:14.470945566 +0000 UTC m=+1612.959511689" watchObservedRunningTime="2025-12-16 07:22:14.752265368 +0000 UTC m=+1613.240831491" Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.758157 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a9f0e08-d61e-4503-afc5-09cb29ff3175-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4a9f0e08-d61e-4503-afc5-09cb29ff3175" (UID: "4a9f0e08-d61e-4503-afc5-09cb29ff3175"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.760302 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutronba48-account-delete-87d8j" podStartSLOduration=5.760288819 podStartE2EDuration="5.760288819s" podCreationTimestamp="2025-12-16 07:22:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 07:22:14.487918138 +0000 UTC m=+1612.976484261" watchObservedRunningTime="2025-12-16 07:22:14.760288819 +0000 UTC m=+1613.248854942" Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.761158 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a9f0e08-d61e-4503-afc5-09cb29ff3175-kube-state-metrics-tls-config" (OuterVolumeSpecName: "kube-state-metrics-tls-config") pod "4a9f0e08-d61e-4503-afc5-09cb29ff3175" (UID: "4a9f0e08-d61e-4503-afc5-09cb29ff3175"). InnerVolumeSpecName "kube-state-metrics-tls-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.768281 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/novacell06c77-account-delete-5jkkk" podStartSLOduration=5.768264488 podStartE2EDuration="5.768264488s" podCreationTimestamp="2025-12-16 07:22:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 07:22:14.528165509 +0000 UTC m=+1613.016731632" watchObservedRunningTime="2025-12-16 07:22:14.768264488 +0000 UTC m=+1613.256830611" Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.769329 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a9f0e08-d61e-4503-afc5-09cb29ff3175-kube-state-metrics-tls-certs" (OuterVolumeSpecName: "kube-state-metrics-tls-certs") pod "4a9f0e08-d61e-4503-afc5-09cb29ff3175" (UID: "4a9f0e08-d61e-4503-afc5-09cb29ff3175"). InnerVolumeSpecName "kube-state-metrics-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.782435 4823 scope.go:117] "RemoveContainer" containerID="dec8a740e5a159ade11e7e1e6846443afc7b4cf141676ae0fa16ecdefdc7efc5" Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.787357 4823 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-fvqqp" podUID="5fe879e4-70bf-4f38-a4a7-98f5eb23a769" containerName="ovn-controller" probeResult="failure" output="command timed out" Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.791825 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/17cbb31a-6067-4925-ba57-956baf53ce8b-public-tls-certs\") pod \"17cbb31a-6067-4925-ba57-956baf53ce8b\" (UID: \"17cbb31a-6067-4925-ba57-956baf53ce8b\") " Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.791885 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/17cbb31a-6067-4925-ba57-956baf53ce8b-config-data-custom\") pod \"17cbb31a-6067-4925-ba57-956baf53ce8b\" (UID: \"17cbb31a-6067-4925-ba57-956baf53ce8b\") " Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.791972 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/17cbb31a-6067-4925-ba57-956baf53ce8b-scripts\") pod \"17cbb31a-6067-4925-ba57-956baf53ce8b\" (UID: \"17cbb31a-6067-4925-ba57-956baf53ce8b\") " Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.791996 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/17cbb31a-6067-4925-ba57-956baf53ce8b-etc-machine-id\") pod \"17cbb31a-6067-4925-ba57-956baf53ce8b\" (UID: \"17cbb31a-6067-4925-ba57-956baf53ce8b\") " Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.792391 4823 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/17cbb31a-6067-4925-ba57-956baf53ce8b-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "17cbb31a-6067-4925-ba57-956baf53ce8b" (UID: "17cbb31a-6067-4925-ba57-956baf53ce8b"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.792481 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17cbb31a-6067-4925-ba57-956baf53ce8b-config-data\") pod \"17cbb31a-6067-4925-ba57-956baf53ce8b\" (UID: \"17cbb31a-6067-4925-ba57-956baf53ce8b\") " Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.792514 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vtk9t\" (UniqueName: \"kubernetes.io/projected/17cbb31a-6067-4925-ba57-956baf53ce8b-kube-api-access-vtk9t\") pod \"17cbb31a-6067-4925-ba57-956baf53ce8b\" (UID: \"17cbb31a-6067-4925-ba57-956baf53ce8b\") " Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.793438 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/17cbb31a-6067-4925-ba57-956baf53ce8b-internal-tls-certs\") pod \"17cbb31a-6067-4925-ba57-956baf53ce8b\" (UID: \"17cbb31a-6067-4925-ba57-956baf53ce8b\") " Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.794197 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/17cbb31a-6067-4925-ba57-956baf53ce8b-logs\") pod \"17cbb31a-6067-4925-ba57-956baf53ce8b\" (UID: \"17cbb31a-6067-4925-ba57-956baf53ce8b\") " Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.794226 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/17cbb31a-6067-4925-ba57-956baf53ce8b-combined-ca-bundle\") pod \"17cbb31a-6067-4925-ba57-956baf53ce8b\" (UID: \"17cbb31a-6067-4925-ba57-956baf53ce8b\") " Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.794985 4823 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/4a9f0e08-d61e-4503-afc5-09cb29ff3175-kube-state-metrics-tls-config\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.794999 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gcbhz\" (UniqueName: \"kubernetes.io/projected/0bed5482-3232-4318-b8a0-dcfd51d8611b-kube-api-access-gcbhz\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.795008 4823 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/17cbb31a-6067-4925-ba57-956baf53ce8b-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.795031 4823 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0bed5482-3232-4318-b8a0-dcfd51d8611b-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.795041 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a9f0e08-d61e-4503-afc5-09cb29ff3175-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.795049 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n9fcs\" (UniqueName: \"kubernetes.io/projected/4a9f0e08-d61e-4503-afc5-09cb29ff3175-kube-api-access-n9fcs\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.795059 4823 reconciler_common.go:293] "Volume detached for volume 
\"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a9f0e08-d61e-4503-afc5-09cb29ff3175-kube-state-metrics-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.797371 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17cbb31a-6067-4925-ba57-956baf53ce8b-logs" (OuterVolumeSpecName: "logs") pod "17cbb31a-6067-4925-ba57-956baf53ce8b" (UID: "17cbb31a-6067-4925-ba57-956baf53ce8b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.806275 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17cbb31a-6067-4925-ba57-956baf53ce8b-kube-api-access-vtk9t" (OuterVolumeSpecName: "kube-api-access-vtk9t") pod "17cbb31a-6067-4925-ba57-956baf53ce8b" (UID: "17cbb31a-6067-4925-ba57-956baf53ce8b"). InnerVolumeSpecName "kube-api-access-vtk9t". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.806358 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17cbb31a-6067-4925-ba57-956baf53ce8b-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "17cbb31a-6067-4925-ba57-956baf53ce8b" (UID: "17cbb31a-6067-4925-ba57-956baf53ce8b"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.810302 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17cbb31a-6067-4925-ba57-956baf53ce8b-scripts" (OuterVolumeSpecName: "scripts") pod "17cbb31a-6067-4925-ba57-956baf53ce8b" (UID: "17cbb31a-6067-4925-ba57-956baf53ce8b"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.819623 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17cbb31a-6067-4925-ba57-956baf53ce8b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "17cbb31a-6067-4925-ba57-956baf53ce8b" (UID: "17cbb31a-6067-4925-ba57-956baf53ce8b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.839854 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17cbb31a-6067-4925-ba57-956baf53ce8b-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "17cbb31a-6067-4925-ba57-956baf53ce8b" (UID: "17cbb31a-6067-4925-ba57-956baf53ce8b"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.849205 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17cbb31a-6067-4925-ba57-956baf53ce8b-config-data" (OuterVolumeSpecName: "config-data") pod "17cbb31a-6067-4925-ba57-956baf53ce8b" (UID: "17cbb31a-6067-4925-ba57-956baf53ce8b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.869343 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17cbb31a-6067-4925-ba57-956baf53ce8b-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "17cbb31a-6067-4925-ba57-956baf53ce8b" (UID: "17cbb31a-6067-4925-ba57-956baf53ce8b"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.896357 4823 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/17cbb31a-6067-4925-ba57-956baf53ce8b-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.896386 4823 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/17cbb31a-6067-4925-ba57-956baf53ce8b-logs\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.896395 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17cbb31a-6067-4925-ba57-956baf53ce8b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.896404 4823 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/17cbb31a-6067-4925-ba57-956baf53ce8b-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.896412 4823 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/17cbb31a-6067-4925-ba57-956baf53ce8b-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.896420 4823 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/17cbb31a-6067-4925-ba57-956baf53ce8b-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.896429 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17cbb31a-6067-4925-ba57-956baf53ce8b-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.896437 4823 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-vtk9t\" (UniqueName: \"kubernetes.io/projected/17cbb31a-6067-4925-ba57-956baf53ce8b-kube-api-access-vtk9t\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:14 crc kubenswrapper[4823]: I1216 07:22:14.925098 4823 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-fvqqp" podUID="5fe879e4-70bf-4f38-a4a7-98f5eb23a769" containerName="ovn-controller" probeResult="failure" output=< Dec 16 07:22:14 crc kubenswrapper[4823]: ERROR - Failed to get connection status from ovn-controller, ovn-appctl exit status: 0 Dec 16 07:22:14 crc kubenswrapper[4823]: > Dec 16 07:22:14 crc kubenswrapper[4823]: E1216 07:22:14.948434 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="143890d9503ac11d18cb9ffe222557c6fbf01e56e0ee7fe6c9718deb211756f0" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 16 07:22:14 crc kubenswrapper[4823]: E1216 07:22:14.948542 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a75ddf05569eb842ee7ced1c6941dc65b4fda1943932e8021d1066d4d5ed3578 is running failed: container process not found" containerID="a75ddf05569eb842ee7ced1c6941dc65b4fda1943932e8021d1066d4d5ed3578" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 16 07:22:14 crc kubenswrapper[4823]: E1216 07:22:14.949136 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a75ddf05569eb842ee7ced1c6941dc65b4fda1943932e8021d1066d4d5ed3578 is running failed: container process not found" containerID="a75ddf05569eb842ee7ced1c6941dc65b4fda1943932e8021d1066d4d5ed3578" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 16 07:22:14 crc 
kubenswrapper[4823]: E1216 07:22:14.950399 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="143890d9503ac11d18cb9ffe222557c6fbf01e56e0ee7fe6c9718deb211756f0" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 16 07:22:14 crc kubenswrapper[4823]: E1216 07:22:14.962204 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="143890d9503ac11d18cb9ffe222557c6fbf01e56e0ee7fe6c9718deb211756f0" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 16 07:22:14 crc kubenswrapper[4823]: E1216 07:22:14.962367 4823 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-29jcz" podUID="4edb9072-dfce-44ca-88d3-64136ac7e1c3" containerName="ovs-vswitchd" Dec 16 07:22:14 crc kubenswrapper[4823]: E1216 07:22:14.962292 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a75ddf05569eb842ee7ced1c6941dc65b4fda1943932e8021d1066d4d5ed3578 is running failed: container process not found" containerID="a75ddf05569eb842ee7ced1c6941dc65b4fda1943932e8021d1066d4d5ed3578" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 16 07:22:14 crc kubenswrapper[4823]: E1216 07:22:14.966069 4823 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a75ddf05569eb842ee7ced1c6941dc65b4fda1943932e8021d1066d4d5ed3578 is running failed: container process not found" probeType="Readiness" 
pod="openstack/ovn-controller-ovs-29jcz" podUID="4edb9072-dfce-44ca-88d3-64136ac7e1c3" containerName="ovsdb-server" Dec 16 07:22:15 crc kubenswrapper[4823]: E1216 07:22:15.000154 4823 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 16 07:22:15 crc kubenswrapper[4823]: E1216 07:22:15.000300 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ec00a24a-8417-452e-a350-b46f36d4a84d-operator-scripts podName:ec00a24a-8417-452e-a350-b46f36d4a84d nodeName:}" failed. No retries permitted until 2025-12-16 07:22:16.000279245 +0000 UTC m=+1614.488845378 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/ec00a24a-8417-452e-a350-b46f36d4a84d-operator-scripts") pod "cinder1f3e-account-delete-q5pns" (UID: "ec00a24a-8417-452e-a350-b46f36d4a84d") : configmap "openstack-scripts" not found Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.037398 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-59fd5f5fb-h7tf5" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.065149 4823 scope.go:117] "RemoveContainer" containerID="457377de0d8e4d6837606a566ccbe412c1ad0e48f0692027311c6646fc5a9d02" Dec 16 07:22:15 crc kubenswrapper[4823]: E1216 07:22:15.066488 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"457377de0d8e4d6837606a566ccbe412c1ad0e48f0692027311c6646fc5a9d02\": container with ID starting with 457377de0d8e4d6837606a566ccbe412c1ad0e48f0692027311c6646fc5a9d02 not found: ID does not exist" containerID="457377de0d8e4d6837606a566ccbe412c1ad0e48f0692027311c6646fc5a9d02" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.066536 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"457377de0d8e4d6837606a566ccbe412c1ad0e48f0692027311c6646fc5a9d02"} err="failed to get container status \"457377de0d8e4d6837606a566ccbe412c1ad0e48f0692027311c6646fc5a9d02\": rpc error: code = NotFound desc = could not find container \"457377de0d8e4d6837606a566ccbe412c1ad0e48f0692027311c6646fc5a9d02\": container with ID starting with 457377de0d8e4d6837606a566ccbe412c1ad0e48f0692027311c6646fc5a9d02 not found: ID does not exist" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.066568 4823 scope.go:117] "RemoveContainer" containerID="dec8a740e5a159ade11e7e1e6846443afc7b4cf141676ae0fa16ecdefdc7efc5" Dec 16 07:22:15 crc kubenswrapper[4823]: E1216 07:22:15.066994 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dec8a740e5a159ade11e7e1e6846443afc7b4cf141676ae0fa16ecdefdc7efc5\": container with ID starting with dec8a740e5a159ade11e7e1e6846443afc7b4cf141676ae0fa16ecdefdc7efc5 not found: ID does not exist" containerID="dec8a740e5a159ade11e7e1e6846443afc7b4cf141676ae0fa16ecdefdc7efc5" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 
07:22:15.067046 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dec8a740e5a159ade11e7e1e6846443afc7b4cf141676ae0fa16ecdefdc7efc5"} err="failed to get container status \"dec8a740e5a159ade11e7e1e6846443afc7b4cf141676ae0fa16ecdefdc7efc5\": rpc error: code = NotFound desc = could not find container \"dec8a740e5a159ade11e7e1e6846443afc7b4cf141676ae0fa16ecdefdc7efc5\": container with ID starting with dec8a740e5a159ade11e7e1e6846443afc7b4cf141676ae0fa16ecdefdc7efc5 not found: ID does not exist" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.067063 4823 scope.go:117] "RemoveContainer" containerID="28af7097fe36966795ffd4f08fbf3fc9b6142fd27eb3db8592b4ce75e52927e8" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.070256 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.075741 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.080219 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6456ccccf4-rhnf4" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.108163 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/196356f3-e866-4cf1-b3e8-eba3d9e4c99f-public-tls-certs\") pod \"196356f3-e866-4cf1-b3e8-eba3d9e4c99f\" (UID: \"196356f3-e866-4cf1-b3e8-eba3d9e4c99f\") " Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.108218 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/196356f3-e866-4cf1-b3e8-eba3d9e4c99f-scripts\") pod \"196356f3-e866-4cf1-b3e8-eba3d9e4c99f\" (UID: \"196356f3-e866-4cf1-b3e8-eba3d9e4c99f\") " Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.108273 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/196356f3-e866-4cf1-b3e8-eba3d9e4c99f-combined-ca-bundle\") pod \"196356f3-e866-4cf1-b3e8-eba3d9e4c99f\" (UID: \"196356f3-e866-4cf1-b3e8-eba3d9e4c99f\") " Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.108316 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/196356f3-e866-4cf1-b3e8-eba3d9e4c99f-internal-tls-certs\") pod \"196356f3-e866-4cf1-b3e8-eba3d9e4c99f\" (UID: \"196356f3-e866-4cf1-b3e8-eba3d9e4c99f\") " Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.108348 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/196356f3-e866-4cf1-b3e8-eba3d9e4c99f-logs\") pod \"196356f3-e866-4cf1-b3e8-eba3d9e4c99f\" (UID: \"196356f3-e866-4cf1-b3e8-eba3d9e4c99f\") " Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.108396 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/196356f3-e866-4cf1-b3e8-eba3d9e4c99f-config-data\") pod \"196356f3-e866-4cf1-b3e8-eba3d9e4c99f\" (UID: \"196356f3-e866-4cf1-b3e8-eba3d9e4c99f\") " Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.108590 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4bdn4\" (UniqueName: \"kubernetes.io/projected/196356f3-e866-4cf1-b3e8-eba3d9e4c99f-kube-api-access-4bdn4\") pod \"196356f3-e866-4cf1-b3e8-eba3d9e4c99f\" (UID: \"196356f3-e866-4cf1-b3e8-eba3d9e4c99f\") " Dec 16 07:22:15 crc kubenswrapper[4823]: E1216 07:22:15.109094 4823 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 16 07:22:15 crc kubenswrapper[4823]: E1216 07:22:15.109143 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4a3f54ee-1dba-42f5-8697-b70de0f5b4c2-operator-scripts podName:4a3f54ee-1dba-42f5-8697-b70de0f5b4c2 nodeName:}" failed. No retries permitted until 2025-12-16 07:22:16.109130784 +0000 UTC m=+1614.597696907 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/4a3f54ee-1dba-42f5-8697-b70de0f5b4c2-operator-scripts") pod "neutronba48-account-delete-87d8j" (UID: "4a3f54ee-1dba-42f5-8697-b70de0f5b4c2") : configmap "openstack-scripts" not found Dec 16 07:22:15 crc kubenswrapper[4823]: E1216 07:22:15.109256 4823 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 16 07:22:15 crc kubenswrapper[4823]: E1216 07:22:15.109295 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/dfd7efdc-36ba-4037-9f6c-a1a8c946ab33-operator-scripts podName:dfd7efdc-36ba-4037-9f6c-a1a8c946ab33 nodeName:}" failed. No retries permitted until 2025-12-16 07:22:16.109284639 +0000 UTC m=+1614.597850762 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/dfd7efdc-36ba-4037-9f6c-a1a8c946ab33-operator-scripts") pod "novacell06c77-account-delete-5jkkk" (UID: "dfd7efdc-36ba-4037-9f6c-a1a8c946ab33") : configmap "openstack-scripts" not found Dec 16 07:22:15 crc kubenswrapper[4823]: E1216 07:22:15.111421 4823 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 16 07:22:15 crc kubenswrapper[4823]: E1216 07:22:15.111464 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/65278526-b5ee-4e40-b66b-1ee9b993f429-operator-scripts podName:65278526-b5ee-4e40-b66b-1ee9b993f429 nodeName:}" failed. No retries permitted until 2025-12-16 07:22:16.111452497 +0000 UTC m=+1614.600018620 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/65278526-b5ee-4e40-b66b-1ee9b993f429-operator-scripts") pod "novaapic1ba-account-delete-sldxg" (UID: "65278526-b5ee-4e40-b66b-1ee9b993f429") : configmap "openstack-scripts" not found Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.111891 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/196356f3-e866-4cf1-b3e8-eba3d9e4c99f-logs" (OuterVolumeSpecName: "logs") pod "196356f3-e866-4cf1-b3e8-eba3d9e4c99f" (UID: "196356f3-e866-4cf1-b3e8-eba3d9e4c99f"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.112322 4823 scope.go:117] "RemoveContainer" containerID="7fdc30c61c04e114057f4a9e2d6e7879f0b0f75c9c3a5cf2549057bada61a9f0" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.114518 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/196356f3-e866-4cf1-b3e8-eba3d9e4c99f-kube-api-access-4bdn4" (OuterVolumeSpecName: "kube-api-access-4bdn4") pod "196356f3-e866-4cf1-b3e8-eba3d9e4c99f" (UID: "196356f3-e866-4cf1-b3e8-eba3d9e4c99f"). InnerVolumeSpecName "kube-api-access-4bdn4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:22:15 crc kubenswrapper[4823]: E1216 07:22:15.115521 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="51e78213653e84ab99bbc7625548d55635dfcb54de59c8fec91ff584c2afb7a9" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 16 07:22:15 crc kubenswrapper[4823]: E1216 07:22:15.128667 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="51e78213653e84ab99bbc7625548d55635dfcb54de59c8fec91ff584c2afb7a9" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.140417 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/196356f3-e866-4cf1-b3e8-eba3d9e4c99f-scripts" (OuterVolumeSpecName: "scripts") pod "196356f3-e866-4cf1-b3e8-eba3d9e4c99f" (UID: "196356f3-e866-4cf1-b3e8-eba3d9e4c99f"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:22:15 crc kubenswrapper[4823]: E1216 07:22:15.144284 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="51e78213653e84ab99bbc7625548d55635dfcb54de59c8fec91ff584c2afb7a9" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 16 07:22:15 crc kubenswrapper[4823]: E1216 07:22:15.144356 4823 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="7ad8e2a2-14c6-45b5-86f3-e4765cddd777" containerName="nova-cell0-conductor-conductor" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.215735 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bc7377ca-c7ab-4ee0-ae2a-5fbb782ba925-logs\") pod \"bc7377ca-c7ab-4ee0-ae2a-5fbb782ba925\" (UID: \"bc7377ca-c7ab-4ee0-ae2a-5fbb782ba925\") " Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.215988 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bc7377ca-c7ab-4ee0-ae2a-5fbb782ba925-httpd-run\") pod \"bc7377ca-c7ab-4ee0-ae2a-5fbb782ba925\" (UID: \"bc7377ca-c7ab-4ee0-ae2a-5fbb782ba925\") " Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.217047 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc7377ca-c7ab-4ee0-ae2a-5fbb782ba925-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "bc7377ca-c7ab-4ee0-ae2a-5fbb782ba925" (UID: "bc7377ca-c7ab-4ee0-ae2a-5fbb782ba925"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.217441 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc7377ca-c7ab-4ee0-ae2a-5fbb782ba925-logs" (OuterVolumeSpecName: "logs") pod "bc7377ca-c7ab-4ee0-ae2a-5fbb782ba925" (UID: "bc7377ca-c7ab-4ee0-ae2a-5fbb782ba925"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.227206 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qzzmh\" (UniqueName: \"kubernetes.io/projected/c559ee21-de8f-44a1-998a-cb0b4aff8cd7-kube-api-access-qzzmh\") pod \"c559ee21-de8f-44a1-998a-cb0b4aff8cd7\" (UID: \"c559ee21-de8f-44a1-998a-cb0b4aff8cd7\") " Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.227263 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d2faec4-82e9-409b-a6c1-93f8cd78b9ec-combined-ca-bundle\") pod \"6d2faec4-82e9-409b-a6c1-93f8cd78b9ec\" (UID: \"6d2faec4-82e9-409b-a6c1-93f8cd78b9ec\") " Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.227302 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c559ee21-de8f-44a1-998a-cb0b4aff8cd7-internal-tls-certs\") pod \"c559ee21-de8f-44a1-998a-cb0b4aff8cd7\" (UID: \"c559ee21-de8f-44a1-998a-cb0b4aff8cd7\") " Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.227355 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc7377ca-c7ab-4ee0-ae2a-5fbb782ba925-combined-ca-bundle\") pod \"bc7377ca-c7ab-4ee0-ae2a-5fbb782ba925\" (UID: \"bc7377ca-c7ab-4ee0-ae2a-5fbb782ba925\") " Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.227377 4823 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bc7377ca-c7ab-4ee0-ae2a-5fbb782ba925-internal-tls-certs\") pod \"bc7377ca-c7ab-4ee0-ae2a-5fbb782ba925\" (UID: \"bc7377ca-c7ab-4ee0-ae2a-5fbb782ba925\") " Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.227443 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fb6nm\" (UniqueName: \"kubernetes.io/projected/bc7377ca-c7ab-4ee0-ae2a-5fbb782ba925-kube-api-access-fb6nm\") pod \"bc7377ca-c7ab-4ee0-ae2a-5fbb782ba925\" (UID: \"bc7377ca-c7ab-4ee0-ae2a-5fbb782ba925\") " Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.227478 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bc7377ca-c7ab-4ee0-ae2a-5fbb782ba925-scripts\") pod \"bc7377ca-c7ab-4ee0-ae2a-5fbb782ba925\" (UID: \"bc7377ca-c7ab-4ee0-ae2a-5fbb782ba925\") " Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.227505 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d2faec4-82e9-409b-a6c1-93f8cd78b9ec-config-data\") pod \"6d2faec4-82e9-409b-a6c1-93f8cd78b9ec\" (UID: \"6d2faec4-82e9-409b-a6c1-93f8cd78b9ec\") " Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.227540 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rzpxq\" (UniqueName: \"kubernetes.io/projected/6d2faec4-82e9-409b-a6c1-93f8cd78b9ec-kube-api-access-rzpxq\") pod \"6d2faec4-82e9-409b-a6c1-93f8cd78b9ec\" (UID: \"6d2faec4-82e9-409b-a6c1-93f8cd78b9ec\") " Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.227581 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"bc7377ca-c7ab-4ee0-ae2a-5fbb782ba925\" (UID: 
\"bc7377ca-c7ab-4ee0-ae2a-5fbb782ba925\") " Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.227641 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c559ee21-de8f-44a1-998a-cb0b4aff8cd7-combined-ca-bundle\") pod \"c559ee21-de8f-44a1-998a-cb0b4aff8cd7\" (UID: \"c559ee21-de8f-44a1-998a-cb0b4aff8cd7\") " Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.227664 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c559ee21-de8f-44a1-998a-cb0b4aff8cd7-public-tls-certs\") pod \"c559ee21-de8f-44a1-998a-cb0b4aff8cd7\" (UID: \"c559ee21-de8f-44a1-998a-cb0b4aff8cd7\") " Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.227719 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c559ee21-de8f-44a1-998a-cb0b4aff8cd7-config-data\") pod \"c559ee21-de8f-44a1-998a-cb0b4aff8cd7\" (UID: \"c559ee21-de8f-44a1-998a-cb0b4aff8cd7\") " Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.227751 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d2faec4-82e9-409b-a6c1-93f8cd78b9ec-nova-metadata-tls-certs\") pod \"6d2faec4-82e9-409b-a6c1-93f8cd78b9ec\" (UID: \"6d2faec4-82e9-409b-a6c1-93f8cd78b9ec\") " Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.227782 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6d2faec4-82e9-409b-a6c1-93f8cd78b9ec-logs\") pod \"6d2faec4-82e9-409b-a6c1-93f8cd78b9ec\" (UID: \"6d2faec4-82e9-409b-a6c1-93f8cd78b9ec\") " Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.227821 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/c559ee21-de8f-44a1-998a-cb0b4aff8cd7-config-data-custom\") pod \"c559ee21-de8f-44a1-998a-cb0b4aff8cd7\" (UID: \"c559ee21-de8f-44a1-998a-cb0b4aff8cd7\") " Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.227849 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc7377ca-c7ab-4ee0-ae2a-5fbb782ba925-config-data\") pod \"bc7377ca-c7ab-4ee0-ae2a-5fbb782ba925\" (UID: \"bc7377ca-c7ab-4ee0-ae2a-5fbb782ba925\") " Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.227911 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c559ee21-de8f-44a1-998a-cb0b4aff8cd7-logs\") pod \"c559ee21-de8f-44a1-998a-cb0b4aff8cd7\" (UID: \"c559ee21-de8f-44a1-998a-cb0b4aff8cd7\") " Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.228687 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d2faec4-82e9-409b-a6c1-93f8cd78b9ec-logs" (OuterVolumeSpecName: "logs") pod "6d2faec4-82e9-409b-a6c1-93f8cd78b9ec" (UID: "6d2faec4-82e9-409b-a6c1-93f8cd78b9ec"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.228708 4823 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bc7377ca-c7ab-4ee0-ae2a-5fbb782ba925-logs\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.228727 4823 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bc7377ca-c7ab-4ee0-ae2a-5fbb782ba925-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.228741 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4bdn4\" (UniqueName: \"kubernetes.io/projected/196356f3-e866-4cf1-b3e8-eba3d9e4c99f-kube-api-access-4bdn4\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.228752 4823 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/196356f3-e866-4cf1-b3e8-eba3d9e4c99f-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.228763 4823 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/196356f3-e866-4cf1-b3e8-eba3d9e4c99f-logs\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.229304 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c559ee21-de8f-44a1-998a-cb0b4aff8cd7-logs" (OuterVolumeSpecName: "logs") pod "c559ee21-de8f-44a1-998a-cb0b4aff8cd7" (UID: "c559ee21-de8f-44a1-998a-cb0b4aff8cd7"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.239613 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d2faec4-82e9-409b-a6c1-93f8cd78b9ec-kube-api-access-rzpxq" (OuterVolumeSpecName: "kube-api-access-rzpxq") pod "6d2faec4-82e9-409b-a6c1-93f8cd78b9ec" (UID: "6d2faec4-82e9-409b-a6c1-93f8cd78b9ec"). InnerVolumeSpecName "kube-api-access-rzpxq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.254124 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc7377ca-c7ab-4ee0-ae2a-5fbb782ba925-kube-api-access-fb6nm" (OuterVolumeSpecName: "kube-api-access-fb6nm") pod "bc7377ca-c7ab-4ee0-ae2a-5fbb782ba925" (UID: "bc7377ca-c7ab-4ee0-ae2a-5fbb782ba925"). InnerVolumeSpecName "kube-api-access-fb6nm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.255317 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c559ee21-de8f-44a1-998a-cb0b4aff8cd7-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "c559ee21-de8f-44a1-998a-cb0b4aff8cd7" (UID: "c559ee21-de8f-44a1-998a-cb0b4aff8cd7"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.255688 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c559ee21-de8f-44a1-998a-cb0b4aff8cd7-kube-api-access-qzzmh" (OuterVolumeSpecName: "kube-api-access-qzzmh") pod "c559ee21-de8f-44a1-998a-cb0b4aff8cd7" (UID: "c559ee21-de8f-44a1-998a-cb0b4aff8cd7"). InnerVolumeSpecName "kube-api-access-qzzmh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.255983 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "glance") pod "bc7377ca-c7ab-4ee0-ae2a-5fbb782ba925" (UID: "bc7377ca-c7ab-4ee0-ae2a-5fbb782ba925"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.256176 4823 scope.go:117] "RemoveContainer" containerID="dcb6ee461f8c315b99af0cef59bee6ad1bc80844d030f1935cac757ed7544094" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.257560 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/196356f3-e866-4cf1-b3e8-eba3d9e4c99f-config-data" (OuterVolumeSpecName: "config-data") pod "196356f3-e866-4cf1-b3e8-eba3d9e4c99f" (UID: "196356f3-e866-4cf1-b3e8-eba3d9e4c99f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.257854 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc7377ca-c7ab-4ee0-ae2a-5fbb782ba925-scripts" (OuterVolumeSpecName: "scripts") pod "bc7377ca-c7ab-4ee0-ae2a-5fbb782ba925" (UID: "bc7377ca-c7ab-4ee0-ae2a-5fbb782ba925"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.293871 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican5a20-account-delete-f8kwx" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.300875 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.306859 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.322434 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glanceec9f-account-delete-klr92" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.331193 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/196356f3-e866-4cf1-b3e8-eba3d9e4c99f-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.331227 4823 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6d2faec4-82e9-409b-a6c1-93f8cd78b9ec-logs\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.331265 4823 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c559ee21-de8f-44a1-998a-cb0b4aff8cd7-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.331281 4823 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c559ee21-de8f-44a1-998a-cb0b4aff8cd7-logs\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.331293 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qzzmh\" (UniqueName: \"kubernetes.io/projected/c559ee21-de8f-44a1-998a-cb0b4aff8cd7-kube-api-access-qzzmh\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.331301 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fb6nm\" (UniqueName: \"kubernetes.io/projected/bc7377ca-c7ab-4ee0-ae2a-5fbb782ba925-kube-api-access-fb6nm\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.331310 4823 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/bc7377ca-c7ab-4ee0-ae2a-5fbb782ba925-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.331317 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rzpxq\" (UniqueName: \"kubernetes.io/projected/6d2faec4-82e9-409b-a6c1-93f8cd78b9ec-kube-api-access-rzpxq\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.331343 4823 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.380629 4823 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="77e933eb-7294-47b8-af0c-fbb03725d3d8" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.196:3000/\": dial tcp 10.217.0.196:3000: connect: connection refused" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.396841 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.404213 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/196356f3-e866-4cf1-b3e8-eba3d9e4c99f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "196356f3-e866-4cf1-b3e8-eba3d9e4c99f" (UID: "196356f3-e866-4cf1-b3e8-eba3d9e4c99f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.416423 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c559ee21-de8f-44a1-998a-cb0b4aff8cd7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c559ee21-de8f-44a1-998a-cb0b4aff8cd7" (UID: "c559ee21-de8f-44a1-998a-cb0b4aff8cd7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.432201 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c559ee21-de8f-44a1-998a-cb0b4aff8cd7-config-data" (OuterVolumeSpecName: "config-data") pod "c559ee21-de8f-44a1-998a-cb0b4aff8cd7" (UID: "c559ee21-de8f-44a1-998a-cb0b4aff8cd7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.441934 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3eee92de-9c0e-4afd-8a27-52d82caa27ad-config-data\") pod \"3eee92de-9c0e-4afd-8a27-52d82caa27ad\" (UID: \"3eee92de-9c0e-4afd-8a27-52d82caa27ad\") " Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.442013 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3eee92de-9c0e-4afd-8a27-52d82caa27ad-combined-ca-bundle\") pod \"3eee92de-9c0e-4afd-8a27-52d82caa27ad\" (UID: \"3eee92de-9c0e-4afd-8a27-52d82caa27ad\") " Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.442140 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"dbee1863-ef4e-4d0a-aca7-f7c09e3f0a50\" (UID: \"dbee1863-ef4e-4d0a-aca7-f7c09e3f0a50\") " Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.442182 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/3eee92de-9c0e-4afd-8a27-52d82caa27ad-memcached-tls-certs\") pod \"3eee92de-9c0e-4afd-8a27-52d82caa27ad\" (UID: \"3eee92de-9c0e-4afd-8a27-52d82caa27ad\") " Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.442241 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dbee1863-ef4e-4d0a-aca7-f7c09e3f0a50-logs\") pod \"dbee1863-ef4e-4d0a-aca7-f7c09e3f0a50\" (UID: \"dbee1863-ef4e-4d0a-aca7-f7c09e3f0a50\") " Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.442311 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/81b4a0e9-2642-4fc1-b6fc-f5a0367a34ab-operator-scripts\") pod \"81b4a0e9-2642-4fc1-b6fc-f5a0367a34ab\" (UID: \"81b4a0e9-2642-4fc1-b6fc-f5a0367a34ab\") " Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.442343 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qd8pk\" (UniqueName: \"kubernetes.io/projected/3eee92de-9c0e-4afd-8a27-52d82caa27ad-kube-api-access-qd8pk\") pod \"3eee92de-9c0e-4afd-8a27-52d82caa27ad\" (UID: \"3eee92de-9c0e-4afd-8a27-52d82caa27ad\") " Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.442389 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbee1863-ef4e-4d0a-aca7-f7c09e3f0a50-public-tls-certs\") pod \"dbee1863-ef4e-4d0a-aca7-f7c09e3f0a50\" (UID: \"dbee1863-ef4e-4d0a-aca7-f7c09e3f0a50\") " Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.442417 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-msrst\" (UniqueName: \"kubernetes.io/projected/81b4a0e9-2642-4fc1-b6fc-f5a0367a34ab-kube-api-access-msrst\") pod \"81b4a0e9-2642-4fc1-b6fc-f5a0367a34ab\" (UID: \"81b4a0e9-2642-4fc1-b6fc-f5a0367a34ab\") " Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.442452 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zmsn8\" (UniqueName: \"kubernetes.io/projected/dbee1863-ef4e-4d0a-aca7-f7c09e3f0a50-kube-api-access-zmsn8\") pod \"dbee1863-ef4e-4d0a-aca7-f7c09e3f0a50\" (UID: 
\"dbee1863-ef4e-4d0a-aca7-f7c09e3f0a50\") " Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.442479 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3eee92de-9c0e-4afd-8a27-52d82caa27ad-kolla-config\") pod \"3eee92de-9c0e-4afd-8a27-52d82caa27ad\" (UID: \"3eee92de-9c0e-4afd-8a27-52d82caa27ad\") " Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.442506 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dbee1863-ef4e-4d0a-aca7-f7c09e3f0a50-httpd-run\") pod \"dbee1863-ef4e-4d0a-aca7-f7c09e3f0a50\" (UID: \"dbee1863-ef4e-4d0a-aca7-f7c09e3f0a50\") " Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.442543 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7d6b8\" (UniqueName: \"kubernetes.io/projected/362dcfe9-8417-425b-8eab-8bd39bf661fc-kube-api-access-7d6b8\") pod \"362dcfe9-8417-425b-8eab-8bd39bf661fc\" (UID: \"362dcfe9-8417-425b-8eab-8bd39bf661fc\") " Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.442585 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dbee1863-ef4e-4d0a-aca7-f7c09e3f0a50-scripts\") pod \"dbee1863-ef4e-4d0a-aca7-f7c09e3f0a50\" (UID: \"dbee1863-ef4e-4d0a-aca7-f7c09e3f0a50\") " Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.442626 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/362dcfe9-8417-425b-8eab-8bd39bf661fc-operator-scripts\") pod \"362dcfe9-8417-425b-8eab-8bd39bf661fc\" (UID: \"362dcfe9-8417-425b-8eab-8bd39bf661fc\") " Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.442657 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/dbee1863-ef4e-4d0a-aca7-f7c09e3f0a50-combined-ca-bundle\") pod \"dbee1863-ef4e-4d0a-aca7-f7c09e3f0a50\" (UID: \"dbee1863-ef4e-4d0a-aca7-f7c09e3f0a50\") " Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.442727 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbee1863-ef4e-4d0a-aca7-f7c09e3f0a50-config-data\") pod \"dbee1863-ef4e-4d0a-aca7-f7c09e3f0a50\" (UID: \"dbee1863-ef4e-4d0a-aca7-f7c09e3f0a50\") " Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.443809 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/196356f3-e866-4cf1-b3e8-eba3d9e4c99f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.443845 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c559ee21-de8f-44a1-998a-cb0b4aff8cd7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.443860 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c559ee21-de8f-44a1-998a-cb0b4aff8cd7-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.446187 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81b4a0e9-2642-4fc1-b6fc-f5a0367a34ab-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "81b4a0e9-2642-4fc1-b6fc-f5a0367a34ab" (UID: "81b4a0e9-2642-4fc1-b6fc-f5a0367a34ab"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.447267 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dbee1863-ef4e-4d0a-aca7-f7c09e3f0a50-logs" (OuterVolumeSpecName: "logs") pod "dbee1863-ef4e-4d0a-aca7-f7c09e3f0a50" (UID: "dbee1863-ef4e-4d0a-aca7-f7c09e3f0a50"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.449013 4823 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.450017 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/362dcfe9-8417-425b-8eab-8bd39bf661fc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "362dcfe9-8417-425b-8eab-8bd39bf661fc" (UID: "362dcfe9-8417-425b-8eab-8bd39bf661fc"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.453642 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3eee92de-9c0e-4afd-8a27-52d82caa27ad-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "3eee92de-9c0e-4afd-8a27-52d82caa27ad" (UID: "3eee92de-9c0e-4afd-8a27-52d82caa27ad"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.455478 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3eee92de-9c0e-4afd-8a27-52d82caa27ad-config-data" (OuterVolumeSpecName: "config-data") pod "3eee92de-9c0e-4afd-8a27-52d82caa27ad" (UID: "3eee92de-9c0e-4afd-8a27-52d82caa27ad"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.456116 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dbee1863-ef4e-4d0a-aca7-f7c09e3f0a50-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "dbee1863-ef4e-4d0a-aca7-f7c09e3f0a50" (UID: "dbee1863-ef4e-4d0a-aca7-f7c09e3f0a50"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.464216 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbee1863-ef4e-4d0a-aca7-f7c09e3f0a50-scripts" (OuterVolumeSpecName: "scripts") pod "dbee1863-ef4e-4d0a-aca7-f7c09e3f0a50" (UID: "dbee1863-ef4e-4d0a-aca7-f7c09e3f0a50"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.465300 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/362dcfe9-8417-425b-8eab-8bd39bf661fc-kube-api-access-7d6b8" (OuterVolumeSpecName: "kube-api-access-7d6b8") pod "362dcfe9-8417-425b-8eab-8bd39bf661fc" (UID: "362dcfe9-8417-425b-8eab-8bd39bf661fc"). InnerVolumeSpecName "kube-api-access-7d6b8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.473219 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc7377ca-c7ab-4ee0-ae2a-5fbb782ba925-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bc7377ca-c7ab-4ee0-ae2a-5fbb782ba925" (UID: "bc7377ca-c7ab-4ee0-ae2a-5fbb782ba925"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.473654 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3eee92de-9c0e-4afd-8a27-52d82caa27ad-kube-api-access-qd8pk" (OuterVolumeSpecName: "kube-api-access-qd8pk") pod "3eee92de-9c0e-4afd-8a27-52d82caa27ad" (UID: "3eee92de-9c0e-4afd-8a27-52d82caa27ad"). InnerVolumeSpecName "kube-api-access-qd8pk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.474190 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc7377ca-c7ab-4ee0-ae2a-5fbb782ba925-config-data" (OuterVolumeSpecName: "config-data") pod "bc7377ca-c7ab-4ee0-ae2a-5fbb782ba925" (UID: "bc7377ca-c7ab-4ee0-ae2a-5fbb782ba925"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.482071 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81b4a0e9-2642-4fc1-b6fc-f5a0367a34ab-kube-api-access-msrst" (OuterVolumeSpecName: "kube-api-access-msrst") pod "81b4a0e9-2642-4fc1-b6fc-f5a0367a34ab" (UID: "81b4a0e9-2642-4fc1-b6fc-f5a0367a34ab"). InnerVolumeSpecName "kube-api-access-msrst". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.484361 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance") pod "dbee1863-ef4e-4d0a-aca7-f7c09e3f0a50" (UID: "dbee1863-ef4e-4d0a-aca7-f7c09e3f0a50"). InnerVolumeSpecName "local-storage02-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.484727 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/196356f3-e866-4cf1-b3e8-eba3d9e4c99f-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "196356f3-e866-4cf1-b3e8-eba3d9e4c99f" (UID: "196356f3-e866-4cf1-b3e8-eba3d9e4c99f"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.489811 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbee1863-ef4e-4d0a-aca7-f7c09e3f0a50-kube-api-access-zmsn8" (OuterVolumeSpecName: "kube-api-access-zmsn8") pod "dbee1863-ef4e-4d0a-aca7-f7c09e3f0a50" (UID: "dbee1863-ef4e-4d0a-aca7-f7c09e3f0a50"). InnerVolumeSpecName "kube-api-access-zmsn8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.505875 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc7377ca-c7ab-4ee0-ae2a-5fbb782ba925-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "bc7377ca-c7ab-4ee0-ae2a-5fbb782ba925" (UID: "bc7377ca-c7ab-4ee0-ae2a-5fbb782ba925"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.514791 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d2faec4-82e9-409b-a6c1-93f8cd78b9ec-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "6d2faec4-82e9-409b-a6c1-93f8cd78b9ec" (UID: "6d2faec4-82e9-409b-a6c1-93f8cd78b9ec"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.528185 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d2faec4-82e9-409b-a6c1-93f8cd78b9ec-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6d2faec4-82e9-409b-a6c1-93f8cd78b9ec" (UID: "6d2faec4-82e9-409b-a6c1-93f8cd78b9ec"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.529243 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c559ee21-de8f-44a1-998a-cb0b4aff8cd7-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "c559ee21-de8f-44a1-998a-cb0b4aff8cd7" (UID: "c559ee21-de8f-44a1-998a-cb0b4aff8cd7"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.539489 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c559ee21-de8f-44a1-998a-cb0b4aff8cd7-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "c559ee21-de8f-44a1-998a-cb0b4aff8cd7" (UID: "c559ee21-de8f-44a1-998a-cb0b4aff8cd7"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.544859 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3d6c697-a49c-4919-81b5-6899a080d06b-logs\") pod \"d3d6c697-a49c-4919-81b5-6899a080d06b\" (UID: \"d3d6c697-a49c-4919-81b5-6899a080d06b\") " Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.545144 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3d6c697-a49c-4919-81b5-6899a080d06b-public-tls-certs\") pod \"d3d6c697-a49c-4919-81b5-6899a080d06b\" (UID: \"d3d6c697-a49c-4919-81b5-6899a080d06b\") " Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.545219 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3d6c697-a49c-4919-81b5-6899a080d06b-internal-tls-certs\") pod \"d3d6c697-a49c-4919-81b5-6899a080d06b\" (UID: \"d3d6c697-a49c-4919-81b5-6899a080d06b\") " Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.545242 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-82x4g\" (UniqueName: \"kubernetes.io/projected/d3d6c697-a49c-4919-81b5-6899a080d06b-kube-api-access-82x4g\") pod \"d3d6c697-a49c-4919-81b5-6899a080d06b\" (UID: \"d3d6c697-a49c-4919-81b5-6899a080d06b\") " Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.545268 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3d6c697-a49c-4919-81b5-6899a080d06b-config-data\") pod \"d3d6c697-a49c-4919-81b5-6899a080d06b\" (UID: \"d3d6c697-a49c-4919-81b5-6899a080d06b\") " Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.545290 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d3d6c697-a49c-4919-81b5-6899a080d06b-combined-ca-bundle\") pod \"d3d6c697-a49c-4919-81b5-6899a080d06b\" (UID: \"d3d6c697-a49c-4919-81b5-6899a080d06b\") " Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.545448 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3d6c697-a49c-4919-81b5-6899a080d06b-logs" (OuterVolumeSpecName: "logs") pod "d3d6c697-a49c-4919-81b5-6899a080d06b" (UID: "d3d6c697-a49c-4919-81b5-6899a080d06b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.547298 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zmsn8\" (UniqueName: \"kubernetes.io/projected/dbee1863-ef4e-4d0a-aca7-f7c09e3f0a50-kube-api-access-zmsn8\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.547325 4823 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3eee92de-9c0e-4afd-8a27-52d82caa27ad-kolla-config\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.547339 4823 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dbee1863-ef4e-4d0a-aca7-f7c09e3f0a50-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.547352 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc7377ca-c7ab-4ee0-ae2a-5fbb782ba925-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.547363 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7d6b8\" (UniqueName: \"kubernetes.io/projected/362dcfe9-8417-425b-8eab-8bd39bf661fc-kube-api-access-7d6b8\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.547374 
4823 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dbee1863-ef4e-4d0a-aca7-f7c09e3f0a50-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.547388 4823 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/362dcfe9-8417-425b-8eab-8bd39bf661fc-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.547399 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d2faec4-82e9-409b-a6c1-93f8cd78b9ec-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.547410 4823 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c559ee21-de8f-44a1-998a-cb0b4aff8cd7-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.547422 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc7377ca-c7ab-4ee0-ae2a-5fbb782ba925-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.547432 4823 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bc7377ca-c7ab-4ee0-ae2a-5fbb782ba925-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.547443 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3eee92de-9c0e-4afd-8a27-52d82caa27ad-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.547453 4823 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/196356f3-e866-4cf1-b3e8-eba3d9e4c99f-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.547464 4823 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.547487 4823 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.547499 4823 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c559ee21-de8f-44a1-998a-cb0b4aff8cd7-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.547510 4823 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dbee1863-ef4e-4d0a-aca7-f7c09e3f0a50-logs\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.547521 4823 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d2faec4-82e9-409b-a6c1-93f8cd78b9ec-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.547533 4823 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/81b4a0e9-2642-4fc1-b6fc-f5a0367a34ab-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.547545 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qd8pk\" (UniqueName: \"kubernetes.io/projected/3eee92de-9c0e-4afd-8a27-52d82caa27ad-kube-api-access-qd8pk\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:15 crc 
kubenswrapper[4823]: I1216 07:22:15.547555 4823 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3d6c697-a49c-4919-81b5-6899a080d06b-logs\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.547566 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-msrst\" (UniqueName: \"kubernetes.io/projected/81b4a0e9-2642-4fc1-b6fc-f5a0367a34ab-kube-api-access-msrst\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.549709 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d2faec4-82e9-409b-a6c1-93f8cd78b9ec-config-data" (OuterVolumeSpecName: "config-data") pod "6d2faec4-82e9-409b-a6c1-93f8cd78b9ec" (UID: "6d2faec4-82e9-409b-a6c1-93f8cd78b9ec"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.550187 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3d6c697-a49c-4919-81b5-6899a080d06b-kube-api-access-82x4g" (OuterVolumeSpecName: "kube-api-access-82x4g") pod "d3d6c697-a49c-4919-81b5-6899a080d06b" (UID: "d3d6c697-a49c-4919-81b5-6899a080d06b"). InnerVolumeSpecName "kube-api-access-82x4g". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.572231 4823 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.573862 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3eee92de-9c0e-4afd-8a27-52d82caa27ad-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3eee92de-9c0e-4afd-8a27-52d82caa27ad" (UID: "3eee92de-9c0e-4afd-8a27-52d82caa27ad"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.574137 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6d2faec4-82e9-409b-a6c1-93f8cd78b9ec","Type":"ContainerDied","Data":"47e968bf3ad64e6c9679d57265619c43ab3f2db861c24ec3d5b4f2967fa690f1"} Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.574192 4823 scope.go:117] "RemoveContainer" containerID="7c64a48ce42e0fd4916be0a094d9954ddcea66b005eeff168ca0a4dec1eb2cff" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.574799 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.577757 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d3d6c697-a49c-4919-81b5-6899a080d06b","Type":"ContainerDied","Data":"f6ec12bad23c8b3b31196727590c5f21c3001bb444f9eb1d2bcbda95f78a69e3"} Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.577914 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.580395 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican5a20-account-delete-f8kwx" event={"ID":"81b4a0e9-2642-4fc1-b6fc-f5a0367a34ab","Type":"ContainerDied","Data":"a8fb5c1e54480b603b47932795e53e6483ea4c9e82b99b01de203609825ebb35"} Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.580431 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a8fb5c1e54480b603b47932795e53e6483ea4c9e82b99b01de203609825ebb35" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.580493 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican5a20-account-delete-f8kwx" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.589300 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"17cbb31a-6067-4925-ba57-956baf53ce8b","Type":"ContainerDied","Data":"7d28597c75c0d7c63dc80fe3f8ba2359f5560ee9f5ba2768db65afd9ec3f19c7"} Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.589614 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.601117 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"4a9f0e08-d61e-4503-afc5-09cb29ff3175","Type":"ContainerDied","Data":"ddece497f262c1e5208bb1692a45a5f74d43ab4d9560423c5e712470e3e5818e"} Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.601702 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.609624 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"bc7377ca-c7ab-4ee0-ae2a-5fbb782ba925","Type":"ContainerDied","Data":"1aad62c97e347c1cb323d949097e8cf2b4fd9c0df9bffef7fcb5c5eb54fa2f65"} Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.609726 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.618465 4823 scope.go:117] "RemoveContainer" containerID="3364c619253b1feab519f7ec3af4216d4032c2f42e27c3ea18c8f718e361769b" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.619259 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6456ccccf4-rhnf4" event={"ID":"c559ee21-de8f-44a1-998a-cb0b4aff8cd7","Type":"ContainerDied","Data":"021fd354a518969266cacdeeef782252068339aeb8870177816a95bed2decbec"} Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.623330 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican5a20-account-delete-f8kwx"] Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.625733 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glanceec9f-account-delete-klr92" event={"ID":"362dcfe9-8417-425b-8eab-8bd39bf661fc","Type":"ContainerDied","Data":"d5b8ab3e0453fede3de057778b21383cdbab6e3c06b304cd26ba5004fde865c7"} Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.626732 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d5b8ab3e0453fede3de057778b21383cdbab6e3c06b304cd26ba5004fde865c7" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.626515 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glanceec9f-account-delete-klr92" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.627724 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3d6c697-a49c-4919-81b5-6899a080d06b-config-data" (OuterVolumeSpecName: "config-data") pod "d3d6c697-a49c-4919-81b5-6899a080d06b" (UID: "d3d6c697-a49c-4919-81b5-6899a080d06b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.633038 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"dbee1863-ef4e-4d0a-aca7-f7c09e3f0a50","Type":"ContainerDied","Data":"eb855cdc74329a8ae82cfe1b766eadc8b371fcfc4c6784324b0352ca09302388"} Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.633148 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.629011 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6456ccccf4-rhnf4" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.647983 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-59fd5f5fb-h7tf5" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.648068 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-59fd5f5fb-h7tf5" event={"ID":"196356f3-e866-4cf1-b3e8-eba3d9e4c99f","Type":"ContainerDied","Data":"541126e09e93db247581ec589e02c3df986338da5e0953de66629883930267f7"} Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.648886 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3eee92de-9c0e-4afd-8a27-52d82caa27ad-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.648914 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d2faec4-82e9-409b-a6c1-93f8cd78b9ec-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.648924 4823 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node 
\"crc\" DevicePath \"\"" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.648933 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-82x4g\" (UniqueName: \"kubernetes.io/projected/d3d6c697-a49c-4919-81b5-6899a080d06b-kube-api-access-82x4g\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.648943 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3d6c697-a49c-4919-81b5-6899a080d06b-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.651483 4823 scope.go:117] "RemoveContainer" containerID="f04505b3b1dfe8b6dfd28ec3fadb6fe3ba712cf0ca0ed6cc257b567eb1c5714b" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.655347 4823 generic.go:334] "Generic (PLEG): container finished" podID="3eee92de-9c0e-4afd-8a27-52d82caa27ad" containerID="f726deb4280e9246c48b014e13b7b17cc95089d7bc4863e4e768298ed64067ba" exitCode=0 Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.655480 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.655728 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"3eee92de-9c0e-4afd-8a27-52d82caa27ad","Type":"ContainerDied","Data":"f726deb4280e9246c48b014e13b7b17cc95089d7bc4863e4e768298ed64067ba"} Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.655805 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"3eee92de-9c0e-4afd-8a27-52d82caa27ad","Type":"ContainerDied","Data":"b1b1b327a28624e923bddebeefdae9b7ba095e1e0f973a89b6756076f00dfaef"} Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.660194 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placementf3f4-account-delete-kq7rl" event={"ID":"0bed5482-3232-4318-b8a0-dcfd51d8611b","Type":"ContainerDied","Data":"83d52232f6a5bf5bea1ad6fb6f6555d9c4a68c8bc8cbded87638ec5ec23f8f1e"} Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.660435 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="83d52232f6a5bf5bea1ad6fb6f6555d9c4a68c8bc8cbded87638ec5ec23f8f1e" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.660721 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placementf3f4-account-delete-kq7rl" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.663124 4823 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openstack/neutronba48-account-delete-87d8j" secret="" err="secret \"galera-openstack-dockercfg-ncfcv\" not found" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.663248 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/novaapic1ba-account-delete-sldxg" podUID="65278526-b5ee-4e40-b66b-1ee9b993f429" containerName="mariadb-account-delete" containerID="cri-o://329a04c4c9ff60b70d5395a727c15217c7dff6014bc04044a76975b137573df3" gracePeriod=30 Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.664009 4823 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack/cinder1f3e-account-delete-q5pns" secret="" err="secret \"galera-openstack-dockercfg-ncfcv\" not found" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.664056 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3d6c697-a49c-4919-81b5-6899a080d06b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d3d6c697-a49c-4919-81b5-6899a080d06b" (UID: "d3d6c697-a49c-4919-81b5-6899a080d06b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.665841 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbee1863-ef4e-4d0a-aca7-f7c09e3f0a50-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "dbee1863-ef4e-4d0a-aca7-f7c09e3f0a50" (UID: "dbee1863-ef4e-4d0a-aca7-f7c09e3f0a50"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.671127 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican5a20-account-delete-f8kwx"] Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.694677 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3eee92de-9c0e-4afd-8a27-52d82caa27ad-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "3eee92de-9c0e-4afd-8a27-52d82caa27ad" (UID: "3eee92de-9c0e-4afd-8a27-52d82caa27ad"). InnerVolumeSpecName "memcached-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.701256 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbee1863-ef4e-4d0a-aca7-f7c09e3f0a50-config-data" (OuterVolumeSpecName: "config-data") pod "dbee1863-ef4e-4d0a-aca7-f7c09e3f0a50" (UID: "dbee1863-ef4e-4d0a-aca7-f7c09e3f0a50"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.713315 4823 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openstack/novacell06c77-account-delete-5jkkk" secret="" err="secret \"galera-openstack-dockercfg-ncfcv\" not found" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.719154 4823 scope.go:117] "RemoveContainer" containerID="f33c995e4b22b44c31a3cf7f028d6d43a1e215de8f4963b068a6b9ffc12fa049" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.719227 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.725708 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbee1863-ef4e-4d0a-aca7-f7c09e3f0a50-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dbee1863-ef4e-4d0a-aca7-f7c09e3f0a50" (UID: "dbee1863-ef4e-4d0a-aca7-f7c09e3f0a50"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.728886 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.729348 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3d6c697-a49c-4919-81b5-6899a080d06b-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "d3d6c697-a49c-4919-81b5-6899a080d06b" (UID: "d3d6c697-a49c-4919-81b5-6899a080d06b"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.737120 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.742057 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3d6c697-a49c-4919-81b5-6899a080d06b-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "d3d6c697-a49c-4919-81b5-6899a080d06b" (UID: "d3d6c697-a49c-4919-81b5-6899a080d06b"). 
InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.742121 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.747543 4823 scope.go:117] "RemoveContainer" containerID="a2e711057ef9e93e470930a37179c721716096884ec2356c0cc2c2d27a2dddf4" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.749863 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.751165 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbee1863-ef4e-4d0a-aca7-f7c09e3f0a50-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.751199 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbee1863-ef4e-4d0a-aca7-f7c09e3f0a50-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.751213 4823 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3d6c697-a49c-4919-81b5-6899a080d06b-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.751226 4823 reconciler_common.go:293] "Volume detached for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/3eee92de-9c0e-4afd-8a27-52d82caa27ad-memcached-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.751238 4823 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3d6c697-a49c-4919-81b5-6899a080d06b-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.751250 4823 
reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3d6c697-a49c-4919-81b5-6899a080d06b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.751262 4823 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbee1863-ef4e-4d0a-aca7-f7c09e3f0a50-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:15 crc kubenswrapper[4823]: E1216 07:22:15.753325 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e121b5fc19f8847f31857c92e1abac87de929236af3edad6305ba6de36abc8a3" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 16 07:22:15 crc kubenswrapper[4823]: E1216 07:22:15.755267 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e121b5fc19f8847f31857c92e1abac87de929236af3edad6305ba6de36abc8a3" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 16 07:22:15 crc kubenswrapper[4823]: E1216 07:22:15.757067 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e121b5fc19f8847f31857c92e1abac87de929236af3edad6305ba6de36abc8a3" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 16 07:22:15 crc kubenswrapper[4823]: E1216 07:22:15.757127 4823 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="79a24114-2ee1-4cc0-9045-770fcf074950" 
containerName="nova-cell1-conductor-conductor" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.757853 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.762835 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6456ccccf4-rhnf4"] Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.768297 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/196356f3-e866-4cf1-b3e8-eba3d9e4c99f-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "196356f3-e866-4cf1-b3e8-eba3d9e4c99f" (UID: "196356f3-e866-4cf1-b3e8-eba3d9e4c99f"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.768359 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-6456ccccf4-rhnf4"] Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.774714 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glanceec9f-account-delete-klr92"] Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.790240 4823 scope.go:117] "RemoveContainer" containerID="51565ca562af3db8782b4b38fb1d3b09a6c7f19f5c5020ef8e0d0b046831c28d" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.794776 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02ea4a50-20c1-4954-8438-520ce44b72a4" path="/var/lib/kubelet/pods/02ea4a50-20c1-4954-8438-520ce44b72a4/volumes" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.795455 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17cbb31a-6067-4925-ba57-956baf53ce8b" path="/var/lib/kubelet/pods/17cbb31a-6067-4925-ba57-956baf53ce8b/volumes" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.796053 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="244ca852-a6d0-4537-8f87-b52b1237ff9b" path="/var/lib/kubelet/pods/244ca852-a6d0-4537-8f87-b52b1237ff9b/volumes" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.796572 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2aa17caf-7d1d-4094-9334-453fe242229e" path="/var/lib/kubelet/pods/2aa17caf-7d1d-4094-9334-453fe242229e/volumes" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.798085 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2dec2d9f-40dd-4b15-ab4f-529a346e7857" path="/var/lib/kubelet/pods/2dec2d9f-40dd-4b15-ab4f-529a346e7857/volumes" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.798551 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c508895-4490-426b-95d4-47b5a2e871e9" path="/var/lib/kubelet/pods/3c508895-4490-426b-95d4-47b5a2e871e9/volumes" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.799002 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="490618c1-e6b6-4546-86ea-27cf18723a7a" path="/var/lib/kubelet/pods/490618c1-e6b6-4546-86ea-27cf18723a7a/volumes" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.799981 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a9f0e08-d61e-4503-afc5-09cb29ff3175" path="/var/lib/kubelet/pods/4a9f0e08-d61e-4503-afc5-09cb29ff3175/volumes" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.800553 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c5cb375-fe40-481c-a59e-a0f2ae2322bc" path="/var/lib/kubelet/pods/4c5cb375-fe40-481c-a59e-a0f2ae2322bc/volumes" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.801597 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5df51999-222a-4ef1-a776-5b2c16270039" path="/var/lib/kubelet/pods/5df51999-222a-4ef1-a776-5b2c16270039/volumes" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.802300 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="6d2faec4-82e9-409b-a6c1-93f8cd78b9ec" path="/var/lib/kubelet/pods/6d2faec4-82e9-409b-a6c1-93f8cd78b9ec/volumes" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.803565 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81b4a0e9-2642-4fc1-b6fc-f5a0367a34ab" path="/var/lib/kubelet/pods/81b4a0e9-2642-4fc1-b6fc-f5a0367a34ab/volumes" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.804210 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="880f4ab2-4f17-4edf-91f0-6b2fae15c9a9" path="/var/lib/kubelet/pods/880f4ab2-4f17-4edf-91f0-6b2fae15c9a9/volumes" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.804870 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5a702e1-b24e-4d21-b56a-1e5ec5145565" path="/var/lib/kubelet/pods/a5a702e1-b24e-4d21-b56a-1e5ec5145565/volumes" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.805920 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a725db70-608c-4a15-8d30-88bf5dbb764f" path="/var/lib/kubelet/pods/a725db70-608c-4a15-8d30-88bf5dbb764f/volumes" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.806553 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c559ee21-de8f-44a1-998a-cb0b4aff8cd7" path="/var/lib/kubelet/pods/c559ee21-de8f-44a1-998a-cb0b4aff8cd7/volumes" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.807147 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6ccf3be-a323-4df6-8c32-c646c4ced20f" path="/var/lib/kubelet/pods/d6ccf3be-a323-4df6-8c32-c646c4ced20f/volumes" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.808086 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7e42473-2988-4fdb-8c8d-55a0d4e3a6bf" path="/var/lib/kubelet/pods/d7e42473-2988-4fdb-8c8d-55a0d4e3a6bf/volumes" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.808524 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="dc43f91c-775d-4641-9192-53ddf96bd2b2" path="/var/lib/kubelet/pods/dc43f91c-775d-4641-9192-53ddf96bd2b2/volumes" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.821640 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glanceec9f-account-delete-klr92"] Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.830158 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placementf3f4-account-delete-kq7rl"] Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.838042 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placementf3f4-account-delete-kq7rl"] Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.845624 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.852299 4823 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/196356f3-e866-4cf1-b3e8-eba3d9e4c99f-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.853283 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.941478 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.942669 4823 scope.go:117] "RemoveContainer" containerID="48f6096b95361df10996fa9107240728047380521cee4e036be0b67323319318" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.959546 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.976281 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.976577 4823 scope.go:117] "RemoveContainer" 
containerID="a3797feae0da2f46b99e7827ab8d4f11114590dcdde7cc7247a8b58f538e9505" Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.980901 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.990945 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-59fd5f5fb-h7tf5"] Dec 16 07:22:15 crc kubenswrapper[4823]: I1216 07:22:15.997464 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-59fd5f5fb-h7tf5"] Dec 16 07:22:16 crc kubenswrapper[4823]: I1216 07:22:16.006549 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Dec 16 07:22:16 crc kubenswrapper[4823]: I1216 07:22:16.007427 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/memcached-0"] Dec 16 07:22:16 crc kubenswrapper[4823]: I1216 07:22:16.012242 4823 scope.go:117] "RemoveContainer" containerID="3ce6c26d6258938fda89230a518530d00595939cb83c8d60892c9449174748b0" Dec 16 07:22:16 crc kubenswrapper[4823]: I1216 07:22:16.027466 4823 scope.go:117] "RemoveContainer" containerID="530a4f541e791946b14339252ed09b59df393a5827ee6015fa327e0dbbc98aec" Dec 16 07:22:16 crc kubenswrapper[4823]: I1216 07:22:16.047679 4823 scope.go:117] "RemoveContainer" containerID="6e803790a094c100a2004f1b22829f8f62d04305a0ff039b94d3de7aaff12828" Dec 16 07:22:16 crc kubenswrapper[4823]: E1216 07:22:16.055447 4823 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 16 07:22:16 crc kubenswrapper[4823]: E1216 07:22:16.055518 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ec00a24a-8417-452e-a350-b46f36d4a84d-operator-scripts podName:ec00a24a-8417-452e-a350-b46f36d4a84d nodeName:}" failed. No retries permitted until 2025-12-16 07:22:18.055501085 +0000 UTC m=+1616.544067208 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/ec00a24a-8417-452e-a350-b46f36d4a84d-operator-scripts") pod "cinder1f3e-account-delete-q5pns" (UID: "ec00a24a-8417-452e-a350-b46f36d4a84d") : configmap "openstack-scripts" not found Dec 16 07:22:16 crc kubenswrapper[4823]: I1216 07:22:16.065445 4823 scope.go:117] "RemoveContainer" containerID="9b756370e64890389fb5a7ac91f02c8282951c0bd28b30fb354e18e101c1af71" Dec 16 07:22:16 crc kubenswrapper[4823]: I1216 07:22:16.088902 4823 scope.go:117] "RemoveContainer" containerID="966c9a295917276f353f9e97ebb9a673f7628bec540b8b5ef3aef083889d35ba" Dec 16 07:22:16 crc kubenswrapper[4823]: I1216 07:22:16.106341 4823 scope.go:117] "RemoveContainer" containerID="fa7ad139671c8c3444b9e62aff507fb0fc6b2d2d087722f71ba9f8cc7977708c" Dec 16 07:22:16 crc kubenswrapper[4823]: I1216 07:22:16.125244 4823 scope.go:117] "RemoveContainer" containerID="754f57f4d21e96f08486902a1f29fc3d73326be71cf93cc74a912ea8e5adfbfe" Dec 16 07:22:16 crc kubenswrapper[4823]: I1216 07:22:16.145806 4823 scope.go:117] "RemoveContainer" containerID="f726deb4280e9246c48b014e13b7b17cc95089d7bc4863e4e768298ed64067ba" Dec 16 07:22:16 crc kubenswrapper[4823]: E1216 07:22:16.157415 4823 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 16 07:22:16 crc kubenswrapper[4823]: E1216 07:22:16.157456 4823 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Dec 16 07:22:16 crc kubenswrapper[4823]: E1216 07:22:16.157485 4823 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 16 07:22:16 crc kubenswrapper[4823]: E1216 07:22:16.157491 4823 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 16 07:22:16 crc kubenswrapper[4823]: E1216 07:22:16.157653 4823 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/configmap/dfd7efdc-36ba-4037-9f6c-a1a8c946ab33-operator-scripts podName:dfd7efdc-36ba-4037-9f6c-a1a8c946ab33 nodeName:}" failed. No retries permitted until 2025-12-16 07:22:18.157554321 +0000 UTC m=+1616.646120514 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/dfd7efdc-36ba-4037-9f6c-a1a8c946ab33-operator-scripts") pod "novacell06c77-account-delete-5jkkk" (UID: "dfd7efdc-36ba-4037-9f6c-a1a8c946ab33") : configmap "openstack-scripts" not found Dec 16 07:22:16 crc kubenswrapper[4823]: E1216 07:22:16.157671 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a686a945-8fa0-406c-ac01-cf061c865a28-config-data podName:a686a945-8fa0-406c-ac01-cf061c865a28 nodeName:}" failed. No retries permitted until 2025-12-16 07:22:24.157662165 +0000 UTC m=+1622.646228388 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/a686a945-8fa0-406c-ac01-cf061c865a28-config-data") pod "rabbitmq-server-0" (UID: "a686a945-8fa0-406c-ac01-cf061c865a28") : configmap "rabbitmq-config-data" not found Dec 16 07:22:16 crc kubenswrapper[4823]: E1216 07:22:16.157684 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4a3f54ee-1dba-42f5-8697-b70de0f5b4c2-operator-scripts podName:4a3f54ee-1dba-42f5-8697-b70de0f5b4c2 nodeName:}" failed. No retries permitted until 2025-12-16 07:22:18.157679095 +0000 UTC m=+1616.646245318 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/4a3f54ee-1dba-42f5-8697-b70de0f5b4c2-operator-scripts") pod "neutronba48-account-delete-87d8j" (UID: "4a3f54ee-1dba-42f5-8697-b70de0f5b4c2") : configmap "openstack-scripts" not found Dec 16 07:22:16 crc kubenswrapper[4823]: E1216 07:22:16.157695 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/65278526-b5ee-4e40-b66b-1ee9b993f429-operator-scripts podName:65278526-b5ee-4e40-b66b-1ee9b993f429 nodeName:}" failed. No retries permitted until 2025-12-16 07:22:18.157689696 +0000 UTC m=+1616.646255919 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/65278526-b5ee-4e40-b66b-1ee9b993f429-operator-scripts") pod "novaapic1ba-account-delete-sldxg" (UID: "65278526-b5ee-4e40-b66b-1ee9b993f429") : configmap "openstack-scripts" not found Dec 16 07:22:16 crc kubenswrapper[4823]: I1216 07:22:16.188497 4823 scope.go:117] "RemoveContainer" containerID="f726deb4280e9246c48b014e13b7b17cc95089d7bc4863e4e768298ed64067ba" Dec 16 07:22:16 crc kubenswrapper[4823]: E1216 07:22:16.189391 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f726deb4280e9246c48b014e13b7b17cc95089d7bc4863e4e768298ed64067ba\": container with ID starting with f726deb4280e9246c48b014e13b7b17cc95089d7bc4863e4e768298ed64067ba not found: ID does not exist" containerID="f726deb4280e9246c48b014e13b7b17cc95089d7bc4863e4e768298ed64067ba" Dec 16 07:22:16 crc kubenswrapper[4823]: I1216 07:22:16.189447 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f726deb4280e9246c48b014e13b7b17cc95089d7bc4863e4e768298ed64067ba"} err="failed to get container status \"f726deb4280e9246c48b014e13b7b17cc95089d7bc4863e4e768298ed64067ba\": rpc error: code = NotFound desc = could not find container 
\"f726deb4280e9246c48b014e13b7b17cc95089d7bc4863e4e768298ed64067ba\": container with ID starting with f726deb4280e9246c48b014e13b7b17cc95089d7bc4863e4e768298ed64067ba not found: ID does not exist" Dec 16 07:22:16 crc kubenswrapper[4823]: I1216 07:22:16.383491 4823 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-fcd6f8f8f-l8nbv" podUID="c4795acd-bc9b-4c2c-aaa2-feb41c3c491f" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.193:5353: i/o timeout" Dec 16 07:22:16 crc kubenswrapper[4823]: E1216 07:22:16.560166 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3650c948a8c447960d8862d51fa7c7f894dc0569eb954a455acb9fd443967ab1" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 16 07:22:16 crc kubenswrapper[4823]: E1216 07:22:16.562102 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3650c948a8c447960d8862d51fa7c7f894dc0569eb954a455acb9fd443967ab1" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 16 07:22:16 crc kubenswrapper[4823]: E1216 07:22:16.564674 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3650c948a8c447960d8862d51fa7c7f894dc0569eb954a455acb9fd443967ab1" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 16 07:22:16 crc kubenswrapper[4823]: E1216 07:22:16.564790 4823 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" 
podUID="a18c5d6c-3429-4aa3-b933-85176e0e5ece" containerName="nova-scheduler-scheduler" Dec 16 07:22:16 crc kubenswrapper[4823]: I1216 07:22:16.686057 4823 generic.go:334] "Generic (PLEG): container finished" podID="a686a945-8fa0-406c-ac01-cf061c865a28" containerID="36a98e82cbcb4bee731b20517aebf25ec378c019a17c67f3b8b2c9437196612b" exitCode=0 Dec 16 07:22:16 crc kubenswrapper[4823]: I1216 07:22:16.686333 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"a686a945-8fa0-406c-ac01-cf061c865a28","Type":"ContainerDied","Data":"36a98e82cbcb4bee731b20517aebf25ec378c019a17c67f3b8b2c9437196612b"} Dec 16 07:22:16 crc kubenswrapper[4823]: I1216 07:22:16.688314 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/novacell06c77-account-delete-5jkkk" podUID="dfd7efdc-36ba-4037-9f6c-a1a8c946ab33" containerName="mariadb-account-delete" containerID="cri-o://2ab531bd1e295bde32d0b4c0e522a1f27e9e25e3ec22550920bf8f2922c669d6" gracePeriod=30 Dec 16 07:22:16 crc kubenswrapper[4823]: I1216 07:22:16.688492 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder1f3e-account-delete-q5pns" podUID="ec00a24a-8417-452e-a350-b46f36d4a84d" containerName="mariadb-account-delete" containerID="cri-o://d094c25706d601af35d0ca55d737a0c04cbd27e5d230d6ca4704d7d20a91aaad" gracePeriod=30 Dec 16 07:22:16 crc kubenswrapper[4823]: I1216 07:22:16.688663 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutronba48-account-delete-87d8j" podUID="4a3f54ee-1dba-42f5-8697-b70de0f5b4c2" containerName="mariadb-account-delete" containerID="cri-o://7a23844a4049e95cfc8cfe3f25c2cd33313f835e15a5177e0274066336cddb91" gracePeriod=30 Dec 16 07:22:16 crc kubenswrapper[4823]: E1216 07:22:16.769330 4823 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Dec 16 07:22:16 crc 
kubenswrapper[4823]: E1216 07:22:16.769404 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1-config-data podName:cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1 nodeName:}" failed. No retries permitted until 2025-12-16 07:22:24.769387774 +0000 UTC m=+1623.257953897 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1-config-data") pod "rabbitmq-cell1-server-0" (UID: "cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1") : configmap "rabbitmq-cell1-config-data" not found Dec 16 07:22:16 crc kubenswrapper[4823]: I1216 07:22:16.885619 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 16 07:22:16 crc kubenswrapper[4823]: I1216 07:22:16.972886 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"a686a945-8fa0-406c-ac01-cf061c865a28\" (UID: \"a686a945-8fa0-406c-ac01-cf061c865a28\") " Dec 16 07:22:16 crc kubenswrapper[4823]: I1216 07:22:16.973522 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a686a945-8fa0-406c-ac01-cf061c865a28-rabbitmq-erlang-cookie\") pod \"a686a945-8fa0-406c-ac01-cf061c865a28\" (UID: \"a686a945-8fa0-406c-ac01-cf061c865a28\") " Dec 16 07:22:16 crc kubenswrapper[4823]: I1216 07:22:16.973618 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9vq4j\" (UniqueName: \"kubernetes.io/projected/a686a945-8fa0-406c-ac01-cf061c865a28-kube-api-access-9vq4j\") pod \"a686a945-8fa0-406c-ac01-cf061c865a28\" (UID: \"a686a945-8fa0-406c-ac01-cf061c865a28\") " Dec 16 07:22:16 crc kubenswrapper[4823]: I1216 07:22:16.973698 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a686a945-8fa0-406c-ac01-cf061c865a28-rabbitmq-plugins\") pod \"a686a945-8fa0-406c-ac01-cf061c865a28\" (UID: \"a686a945-8fa0-406c-ac01-cf061c865a28\") " Dec 16 07:22:16 crc kubenswrapper[4823]: I1216 07:22:16.973838 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a686a945-8fa0-406c-ac01-cf061c865a28-server-conf\") pod \"a686a945-8fa0-406c-ac01-cf061c865a28\" (UID: \"a686a945-8fa0-406c-ac01-cf061c865a28\") " Dec 16 07:22:16 crc kubenswrapper[4823]: I1216 07:22:16.973945 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a686a945-8fa0-406c-ac01-cf061c865a28-config-data\") pod \"a686a945-8fa0-406c-ac01-cf061c865a28\" (UID: \"a686a945-8fa0-406c-ac01-cf061c865a28\") " Dec 16 07:22:16 crc kubenswrapper[4823]: I1216 07:22:16.974094 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a686a945-8fa0-406c-ac01-cf061c865a28-erlang-cookie-secret\") pod \"a686a945-8fa0-406c-ac01-cf061c865a28\" (UID: \"a686a945-8fa0-406c-ac01-cf061c865a28\") " Dec 16 07:22:16 crc kubenswrapper[4823]: I1216 07:22:16.974211 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a686a945-8fa0-406c-ac01-cf061c865a28-pod-info\") pod \"a686a945-8fa0-406c-ac01-cf061c865a28\" (UID: \"a686a945-8fa0-406c-ac01-cf061c865a28\") " Dec 16 07:22:16 crc kubenswrapper[4823]: I1216 07:22:16.974246 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a686a945-8fa0-406c-ac01-cf061c865a28-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "a686a945-8fa0-406c-ac01-cf061c865a28" (UID: "a686a945-8fa0-406c-ac01-cf061c865a28"). 
InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:22:16 crc kubenswrapper[4823]: I1216 07:22:16.974412 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a686a945-8fa0-406c-ac01-cf061c865a28-rabbitmq-tls\") pod \"a686a945-8fa0-406c-ac01-cf061c865a28\" (UID: \"a686a945-8fa0-406c-ac01-cf061c865a28\") " Dec 16 07:22:16 crc kubenswrapper[4823]: I1216 07:22:16.974549 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a686a945-8fa0-406c-ac01-cf061c865a28-rabbitmq-confd\") pod \"a686a945-8fa0-406c-ac01-cf061c865a28\" (UID: \"a686a945-8fa0-406c-ac01-cf061c865a28\") " Dec 16 07:22:16 crc kubenswrapper[4823]: I1216 07:22:16.974667 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a686a945-8fa0-406c-ac01-cf061c865a28-plugins-conf\") pod \"a686a945-8fa0-406c-ac01-cf061c865a28\" (UID: \"a686a945-8fa0-406c-ac01-cf061c865a28\") " Dec 16 07:22:16 crc kubenswrapper[4823]: I1216 07:22:16.975854 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a686a945-8fa0-406c-ac01-cf061c865a28-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "a686a945-8fa0-406c-ac01-cf061c865a28" (UID: "a686a945-8fa0-406c-ac01-cf061c865a28"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:22:16 crc kubenswrapper[4823]: I1216 07:22:16.976001 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a686a945-8fa0-406c-ac01-cf061c865a28-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "a686a945-8fa0-406c-ac01-cf061c865a28" (UID: "a686a945-8fa0-406c-ac01-cf061c865a28"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:22:16 crc kubenswrapper[4823]: I1216 07:22:16.976224 4823 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a686a945-8fa0-406c-ac01-cf061c865a28-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:16 crc kubenswrapper[4823]: I1216 07:22:16.976291 4823 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a686a945-8fa0-406c-ac01-cf061c865a28-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:16 crc kubenswrapper[4823]: I1216 07:22:16.976356 4823 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a686a945-8fa0-406c-ac01-cf061c865a28-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:16 crc kubenswrapper[4823]: I1216 07:22:16.978443 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a686a945-8fa0-406c-ac01-cf061c865a28-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "a686a945-8fa0-406c-ac01-cf061c865a28" (UID: "a686a945-8fa0-406c-ac01-cf061c865a28"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:22:16 crc kubenswrapper[4823]: I1216 07:22:16.978504 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a686a945-8fa0-406c-ac01-cf061c865a28-kube-api-access-9vq4j" (OuterVolumeSpecName: "kube-api-access-9vq4j") pod "a686a945-8fa0-406c-ac01-cf061c865a28" (UID: "a686a945-8fa0-406c-ac01-cf061c865a28"). InnerVolumeSpecName "kube-api-access-9vq4j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:22:16 crc kubenswrapper[4823]: I1216 07:22:16.978539 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a686a945-8fa0-406c-ac01-cf061c865a28-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "a686a945-8fa0-406c-ac01-cf061c865a28" (UID: "a686a945-8fa0-406c-ac01-cf061c865a28"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:22:16 crc kubenswrapper[4823]: I1216 07:22:16.978972 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/a686a945-8fa0-406c-ac01-cf061c865a28-pod-info" (OuterVolumeSpecName: "pod-info") pod "a686a945-8fa0-406c-ac01-cf061c865a28" (UID: "a686a945-8fa0-406c-ac01-cf061c865a28"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 16 07:22:16 crc kubenswrapper[4823]: I1216 07:22:16.979554 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "persistence") pod "a686a945-8fa0-406c-ac01-cf061c865a28" (UID: "a686a945-8fa0-406c-ac01-cf061c865a28"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 16 07:22:16 crc kubenswrapper[4823]: I1216 07:22:16.998710 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a686a945-8fa0-406c-ac01-cf061c865a28-config-data" (OuterVolumeSpecName: "config-data") pod "a686a945-8fa0-406c-ac01-cf061c865a28" (UID: "a686a945-8fa0-406c-ac01-cf061c865a28"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:22:17 crc kubenswrapper[4823]: I1216 07:22:17.018926 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a686a945-8fa0-406c-ac01-cf061c865a28-server-conf" (OuterVolumeSpecName: "server-conf") pod "a686a945-8fa0-406c-ac01-cf061c865a28" (UID: "a686a945-8fa0-406c-ac01-cf061c865a28"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:22:17 crc kubenswrapper[4823]: I1216 07:22:17.068380 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a686a945-8fa0-406c-ac01-cf061c865a28-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "a686a945-8fa0-406c-ac01-cf061c865a28" (UID: "a686a945-8fa0-406c-ac01-cf061c865a28"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:22:17 crc kubenswrapper[4823]: I1216 07:22:17.078546 4823 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Dec 16 07:22:17 crc kubenswrapper[4823]: I1216 07:22:17.078588 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9vq4j\" (UniqueName: \"kubernetes.io/projected/a686a945-8fa0-406c-ac01-cf061c865a28-kube-api-access-9vq4j\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:17 crc kubenswrapper[4823]: I1216 07:22:17.078602 4823 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a686a945-8fa0-406c-ac01-cf061c865a28-server-conf\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:17 crc kubenswrapper[4823]: I1216 07:22:17.078613 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a686a945-8fa0-406c-ac01-cf061c865a28-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 
07:22:17 crc kubenswrapper[4823]: I1216 07:22:17.078626 4823 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a686a945-8fa0-406c-ac01-cf061c865a28-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:17 crc kubenswrapper[4823]: I1216 07:22:17.078636 4823 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a686a945-8fa0-406c-ac01-cf061c865a28-pod-info\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:17 crc kubenswrapper[4823]: I1216 07:22:17.078646 4823 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a686a945-8fa0-406c-ac01-cf061c865a28-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:17 crc kubenswrapper[4823]: I1216 07:22:17.078656 4823 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a686a945-8fa0-406c-ac01-cf061c865a28-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:17 crc kubenswrapper[4823]: I1216 07:22:17.110549 4823 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Dec 16 07:22:17 crc kubenswrapper[4823]: I1216 07:22:17.181213 4823 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:17 crc kubenswrapper[4823]: I1216 07:22:17.452288 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Dec 16 07:22:17 crc kubenswrapper[4823]: I1216 07:22:17.486910 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/dbcff04b-7d0d-45b4-bc28-7882421c6000-config-data-generated\") pod \"dbcff04b-7d0d-45b4-bc28-7882421c6000\" (UID: \"dbcff04b-7d0d-45b4-bc28-7882421c6000\") " Dec 16 07:22:17 crc kubenswrapper[4823]: I1216 07:22:17.486985 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dbcff04b-7d0d-45b4-bc28-7882421c6000-operator-scripts\") pod \"dbcff04b-7d0d-45b4-bc28-7882421c6000\" (UID: \"dbcff04b-7d0d-45b4-bc28-7882421c6000\") " Dec 16 07:22:17 crc kubenswrapper[4823]: I1216 07:22:17.487044 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ghctv\" (UniqueName: \"kubernetes.io/projected/dbcff04b-7d0d-45b4-bc28-7882421c6000-kube-api-access-ghctv\") pod \"dbcff04b-7d0d-45b4-bc28-7882421c6000\" (UID: \"dbcff04b-7d0d-45b4-bc28-7882421c6000\") " Dec 16 07:22:17 crc kubenswrapper[4823]: I1216 07:22:17.487100 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/dbcff04b-7d0d-45b4-bc28-7882421c6000-kolla-config\") pod \"dbcff04b-7d0d-45b4-bc28-7882421c6000\" (UID: \"dbcff04b-7d0d-45b4-bc28-7882421c6000\") " Dec 16 07:22:17 crc kubenswrapper[4823]: I1216 07:22:17.487117 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbcff04b-7d0d-45b4-bc28-7882421c6000-combined-ca-bundle\") pod \"dbcff04b-7d0d-45b4-bc28-7882421c6000\" (UID: \"dbcff04b-7d0d-45b4-bc28-7882421c6000\") " Dec 16 07:22:17 crc kubenswrapper[4823]: I1216 07:22:17.487166 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/dbcff04b-7d0d-45b4-bc28-7882421c6000-config-data-default\") pod \"dbcff04b-7d0d-45b4-bc28-7882421c6000\" (UID: \"dbcff04b-7d0d-45b4-bc28-7882421c6000\") " Dec 16 07:22:17 crc kubenswrapper[4823]: I1216 07:22:17.487197 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbcff04b-7d0d-45b4-bc28-7882421c6000-galera-tls-certs\") pod \"dbcff04b-7d0d-45b4-bc28-7882421c6000\" (UID: \"dbcff04b-7d0d-45b4-bc28-7882421c6000\") " Dec 16 07:22:17 crc kubenswrapper[4823]: I1216 07:22:17.487215 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"dbcff04b-7d0d-45b4-bc28-7882421c6000\" (UID: \"dbcff04b-7d0d-45b4-bc28-7882421c6000\") " Dec 16 07:22:17 crc kubenswrapper[4823]: I1216 07:22:17.489860 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dbcff04b-7d0d-45b4-bc28-7882421c6000-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "dbcff04b-7d0d-45b4-bc28-7882421c6000" (UID: "dbcff04b-7d0d-45b4-bc28-7882421c6000"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:22:17 crc kubenswrapper[4823]: I1216 07:22:17.490408 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dbcff04b-7d0d-45b4-bc28-7882421c6000-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "dbcff04b-7d0d-45b4-bc28-7882421c6000" (UID: "dbcff04b-7d0d-45b4-bc28-7882421c6000"). InnerVolumeSpecName "config-data-generated". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:22:17 crc kubenswrapper[4823]: I1216 07:22:17.491117 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dbcff04b-7d0d-45b4-bc28-7882421c6000-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "dbcff04b-7d0d-45b4-bc28-7882421c6000" (UID: "dbcff04b-7d0d-45b4-bc28-7882421c6000"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:22:17 crc kubenswrapper[4823]: I1216 07:22:17.491768 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dbcff04b-7d0d-45b4-bc28-7882421c6000-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "dbcff04b-7d0d-45b4-bc28-7882421c6000" (UID: "dbcff04b-7d0d-45b4-bc28-7882421c6000"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:22:17 crc kubenswrapper[4823]: I1216 07:22:17.503458 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "mysql-db") pod "dbcff04b-7d0d-45b4-bc28-7882421c6000" (UID: "dbcff04b-7d0d-45b4-bc28-7882421c6000"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 16 07:22:17 crc kubenswrapper[4823]: I1216 07:22:17.510407 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbcff04b-7d0d-45b4-bc28-7882421c6000-kube-api-access-ghctv" (OuterVolumeSpecName: "kube-api-access-ghctv") pod "dbcff04b-7d0d-45b4-bc28-7882421c6000" (UID: "dbcff04b-7d0d-45b4-bc28-7882421c6000"). InnerVolumeSpecName "kube-api-access-ghctv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:22:17 crc kubenswrapper[4823]: I1216 07:22:17.537851 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbcff04b-7d0d-45b4-bc28-7882421c6000-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dbcff04b-7d0d-45b4-bc28-7882421c6000" (UID: "dbcff04b-7d0d-45b4-bc28-7882421c6000"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:22:17 crc kubenswrapper[4823]: I1216 07:22:17.541544 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 16 07:22:17 crc kubenswrapper[4823]: I1216 07:22:17.585396 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbcff04b-7d0d-45b4-bc28-7882421c6000-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "dbcff04b-7d0d-45b4-bc28-7882421c6000" (UID: "dbcff04b-7d0d-45b4-bc28-7882421c6000"). InnerVolumeSpecName "galera-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:22:17 crc kubenswrapper[4823]: I1216 07:22:17.588916 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1-pod-info\") pod \"cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1\" (UID: \"cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1\") " Dec 16 07:22:17 crc kubenswrapper[4823]: I1216 07:22:17.588972 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1-rabbitmq-erlang-cookie\") pod \"cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1\" (UID: \"cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1\") " Dec 16 07:22:17 crc kubenswrapper[4823]: I1216 07:22:17.589055 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1-plugins-conf\") pod \"cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1\" (UID: \"cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1\") " Dec 16 07:22:17 crc kubenswrapper[4823]: I1216 07:22:17.589107 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kwr5v\" (UniqueName: \"kubernetes.io/projected/cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1-kube-api-access-kwr5v\") pod \"cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1\" (UID: \"cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1\") " Dec 16 07:22:17 crc kubenswrapper[4823]: I1216 07:22:17.589136 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1-config-data\") pod \"cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1\" (UID: \"cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1\") " Dec 16 07:22:17 crc kubenswrapper[4823]: I1216 07:22:17.589178 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1-erlang-cookie-secret\") pod \"cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1\" (UID: \"cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1\") " Dec 16 07:22:17 crc kubenswrapper[4823]: I1216 07:22:17.589215 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1-rabbitmq-tls\") pod \"cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1\" (UID: \"cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1\") " Dec 16 07:22:17 crc kubenswrapper[4823]: I1216 07:22:17.589253 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1\" (UID: \"cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1\") " Dec 16 07:22:17 crc kubenswrapper[4823]: I1216 07:22:17.589327 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1-rabbitmq-confd\") pod \"cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1\" (UID: \"cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1\") " Dec 16 07:22:17 crc kubenswrapper[4823]: I1216 07:22:17.589361 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1-rabbitmq-plugins\") pod \"cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1\" (UID: \"cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1\") " Dec 16 07:22:17 crc kubenswrapper[4823]: I1216 07:22:17.590164 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1-server-conf\") pod \"cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1\" (UID: \"cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1\") " Dec 16 07:22:17 crc kubenswrapper[4823]: I1216 
07:22:17.590567 4823 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbcff04b-7d0d-45b4-bc28-7882421c6000-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:17 crc kubenswrapper[4823]: I1216 07:22:17.590592 4823 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Dec 16 07:22:17 crc kubenswrapper[4823]: I1216 07:22:17.590606 4823 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/dbcff04b-7d0d-45b4-bc28-7882421c6000-config-data-generated\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:17 crc kubenswrapper[4823]: I1216 07:22:17.590618 4823 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dbcff04b-7d0d-45b4-bc28-7882421c6000-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:17 crc kubenswrapper[4823]: I1216 07:22:17.590629 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ghctv\" (UniqueName: \"kubernetes.io/projected/dbcff04b-7d0d-45b4-bc28-7882421c6000-kube-api-access-ghctv\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:17 crc kubenswrapper[4823]: I1216 07:22:17.590639 4823 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/dbcff04b-7d0d-45b4-bc28-7882421c6000-kolla-config\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:17 crc kubenswrapper[4823]: I1216 07:22:17.590649 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbcff04b-7d0d-45b4-bc28-7882421c6000-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:17 crc kubenswrapper[4823]: I1216 07:22:17.590661 4823 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" 
(UniqueName: \"kubernetes.io/configmap/dbcff04b-7d0d-45b4-bc28-7882421c6000-config-data-default\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:17 crc kubenswrapper[4823]: I1216 07:22:17.591842 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1-pod-info" (OuterVolumeSpecName: "pod-info") pod "cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1" (UID: "cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 16 07:22:17 crc kubenswrapper[4823]: I1216 07:22:17.594753 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1" (UID: "cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:22:17 crc kubenswrapper[4823]: I1216 07:22:17.599713 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1" (UID: "cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:22:17 crc kubenswrapper[4823]: I1216 07:22:17.600447 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1" (UID: "cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:22:17 crc kubenswrapper[4823]: I1216 07:22:17.604888 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1" (UID: "cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:22:17 crc kubenswrapper[4823]: I1216 07:22:17.607447 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1" (UID: "cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:22:17 crc kubenswrapper[4823]: I1216 07:22:17.607622 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1-kube-api-access-kwr5v" (OuterVolumeSpecName: "kube-api-access-kwr5v") pod "cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1" (UID: "cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1"). InnerVolumeSpecName "kube-api-access-kwr5v". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:22:17 crc kubenswrapper[4823]: I1216 07:22:17.609968 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "persistence") pod "cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1" (UID: "cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1"). InnerVolumeSpecName "local-storage11-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 16 07:22:17 crc kubenswrapper[4823]: I1216 07:22:17.619658 4823 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Dec 16 07:22:17 crc kubenswrapper[4823]: I1216 07:22:17.629387 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1-config-data" (OuterVolumeSpecName: "config-data") pod "cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1" (UID: "cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:22:17 crc kubenswrapper[4823]: I1216 07:22:17.646535 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1-server-conf" (OuterVolumeSpecName: "server-conf") pod "cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1" (UID: "cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:22:17 crc kubenswrapper[4823]: I1216 07:22:17.678223 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1" (UID: "cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:22:17 crc kubenswrapper[4823]: I1216 07:22:17.703603 4823 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:17 crc kubenswrapper[4823]: I1216 07:22:17.703633 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kwr5v\" (UniqueName: \"kubernetes.io/projected/cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1-kube-api-access-kwr5v\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:17 crc kubenswrapper[4823]: I1216 07:22:17.703643 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:17 crc kubenswrapper[4823]: I1216 07:22:17.703651 4823 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:17 crc kubenswrapper[4823]: I1216 07:22:17.703660 4823 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:17 crc kubenswrapper[4823]: I1216 07:22:17.703669 4823 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:17 crc kubenswrapper[4823]: I1216 07:22:17.703689 4823 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Dec 16 07:22:17 crc kubenswrapper[4823]: I1216 07:22:17.703697 4823 reconciler_common.go:293] 
"Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:17 crc kubenswrapper[4823]: I1216 07:22:17.703706 4823 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:17 crc kubenswrapper[4823]: I1216 07:22:17.703714 4823 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1-server-conf\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:17 crc kubenswrapper[4823]: I1216 07:22:17.703723 4823 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1-pod-info\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:17 crc kubenswrapper[4823]: I1216 07:22:17.703731 4823 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:17 crc kubenswrapper[4823]: I1216 07:22:17.722194 4823 generic.go:334] "Generic (PLEG): container finished" podID="cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1" containerID="51ee0e5df9e688f5c88a35c0aa9dd24ecbc2d9cd3579ec6b75a7584a9bee2720" exitCode=0 Dec 16 07:22:17 crc kubenswrapper[4823]: I1216 07:22:17.722260 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1","Type":"ContainerDied","Data":"51ee0e5df9e688f5c88a35c0aa9dd24ecbc2d9cd3579ec6b75a7584a9bee2720"} Dec 16 07:22:17 crc kubenswrapper[4823]: I1216 07:22:17.722287 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1","Type":"ContainerDied","Data":"c545d7e12c64e5493278719a7106677c5060fbade8234638011f610fd4d1cfab"} Dec 16 07:22:17 crc kubenswrapper[4823]: I1216 07:22:17.722303 4823 scope.go:117] "RemoveContainer" containerID="51ee0e5df9e688f5c88a35c0aa9dd24ecbc2d9cd3579ec6b75a7584a9bee2720" Dec 16 07:22:17 crc kubenswrapper[4823]: I1216 07:22:17.722343 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 16 07:22:17 crc kubenswrapper[4823]: I1216 07:22:17.729232 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"a686a945-8fa0-406c-ac01-cf061c865a28","Type":"ContainerDied","Data":"d342eaa90ec3f7fc03cef38dfcf7f773219dea63e67185b44ac6dff967b46a73"} Dec 16 07:22:17 crc kubenswrapper[4823]: I1216 07:22:17.729386 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 16 07:22:17 crc kubenswrapper[4823]: I1216 07:22:17.734768 4823 generic.go:334] "Generic (PLEG): container finished" podID="dbcff04b-7d0d-45b4-bc28-7882421c6000" containerID="310505b16221fa383a21293252ba7eeff5b379b06f829c1276c31a12ac010ede" exitCode=0 Dec 16 07:22:17 crc kubenswrapper[4823]: I1216 07:22:17.734803 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"dbcff04b-7d0d-45b4-bc28-7882421c6000","Type":"ContainerDied","Data":"310505b16221fa383a21293252ba7eeff5b379b06f829c1276c31a12ac010ede"} Dec 16 07:22:17 crc kubenswrapper[4823]: I1216 07:22:17.734827 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"dbcff04b-7d0d-45b4-bc28-7882421c6000","Type":"ContainerDied","Data":"477e61703af31d689c7c23af31872ff1ab2c4ed808379217e613c31f40aa13d3"} Dec 16 07:22:17 crc kubenswrapper[4823]: I1216 07:22:17.734878 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Dec 16 07:22:17 crc kubenswrapper[4823]: I1216 07:22:17.741678 4823 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Dec 16 07:22:17 crc kubenswrapper[4823]: I1216 07:22:17.752775 4823 scope.go:117] "RemoveContainer" containerID="bdecbd186c280c8e1a08344d429387d2b5b9ce5dc22f4986496eacc03840a6ae" Dec 16 07:22:17 crc kubenswrapper[4823]: I1216 07:22:17.768018 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 16 07:22:17 crc kubenswrapper[4823]: I1216 07:22:17.788954 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0bed5482-3232-4318-b8a0-dcfd51d8611b" path="/var/lib/kubelet/pods/0bed5482-3232-4318-b8a0-dcfd51d8611b/volumes" Dec 16 07:22:17 crc kubenswrapper[4823]: I1216 07:22:17.789579 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="196356f3-e866-4cf1-b3e8-eba3d9e4c99f" path="/var/lib/kubelet/pods/196356f3-e866-4cf1-b3e8-eba3d9e4c99f/volumes" Dec 16 07:22:17 crc kubenswrapper[4823]: I1216 07:22:17.790176 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="362dcfe9-8417-425b-8eab-8bd39bf661fc" path="/var/lib/kubelet/pods/362dcfe9-8417-425b-8eab-8bd39bf661fc/volumes" Dec 16 07:22:17 crc kubenswrapper[4823]: I1216 07:22:17.791151 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3eee92de-9c0e-4afd-8a27-52d82caa27ad" path="/var/lib/kubelet/pods/3eee92de-9c0e-4afd-8a27-52d82caa27ad/volumes" Dec 16 07:22:17 crc kubenswrapper[4823]: I1216 07:22:17.792165 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc7377ca-c7ab-4ee0-ae2a-5fbb782ba925" path="/var/lib/kubelet/pods/bc7377ca-c7ab-4ee0-ae2a-5fbb782ba925/volumes" Dec 16 07:22:17 crc kubenswrapper[4823]: I1216 07:22:17.792734 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="d3d6c697-a49c-4919-81b5-6899a080d06b" path="/var/lib/kubelet/pods/d3d6c697-a49c-4919-81b5-6899a080d06b/volumes" Dec 16 07:22:17 crc kubenswrapper[4823]: I1216 07:22:17.793812 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dbee1863-ef4e-4d0a-aca7-f7c09e3f0a50" path="/var/lib/kubelet/pods/dbee1863-ef4e-4d0a-aca7-f7c09e3f0a50/volumes" Dec 16 07:22:17 crc kubenswrapper[4823]: I1216 07:22:17.800315 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 16 07:22:17 crc kubenswrapper[4823]: I1216 07:22:17.805558 4823 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:17 crc kubenswrapper[4823]: I1216 07:22:17.810248 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 16 07:22:17 crc kubenswrapper[4823]: I1216 07:22:17.815480 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 16 07:22:17 crc kubenswrapper[4823]: E1216 07:22:17.881561 4823 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=< Dec 16 07:22:17 crc kubenswrapper[4823]: command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: 2025-12-16T07:22:10Z|00001|fatal_signal|WARN|terminating with signal 14 (Alarm clock) Dec 16 07:22:17 crc kubenswrapper[4823]: /etc/init.d/functions: line 589: 414 Alarm clock "$@" Dec 16 07:22:17 crc kubenswrapper[4823]: > execCommand=["/usr/share/ovn/scripts/ovn-ctl","stop_controller"] containerName="ovn-controller" pod="openstack/ovn-controller-fvqqp" message=< Dec 16 07:22:17 crc kubenswrapper[4823]: Exiting ovn-controller (1) [FAILED] Dec 16 07:22:17 crc kubenswrapper[4823]: Killing ovn-controller (1) [ OK ] Dec 16 07:22:17 crc kubenswrapper[4823]: Killing ovn-controller (1) with SIGKILL [ OK ] Dec 16 07:22:17 crc kubenswrapper[4823]: 
2025-12-16T07:22:10Z|00001|fatal_signal|WARN|terminating with signal 14 (Alarm clock) Dec 16 07:22:17 crc kubenswrapper[4823]: /etc/init.d/functions: line 589: 414 Alarm clock "$@" Dec 16 07:22:17 crc kubenswrapper[4823]: > Dec 16 07:22:17 crc kubenswrapper[4823]: E1216 07:22:17.881629 4823 kuberuntime_container.go:691] "PreStop hook failed" err=< Dec 16 07:22:17 crc kubenswrapper[4823]: command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: 2025-12-16T07:22:10Z|00001|fatal_signal|WARN|terminating with signal 14 (Alarm clock) Dec 16 07:22:17 crc kubenswrapper[4823]: /etc/init.d/functions: line 589: 414 Alarm clock "$@" Dec 16 07:22:17 crc kubenswrapper[4823]: > pod="openstack/ovn-controller-fvqqp" podUID="5fe879e4-70bf-4f38-a4a7-98f5eb23a769" containerName="ovn-controller" containerID="cri-o://0a921f2564997ac6138f43048f8c54eaec84f63860087b54d2b224f1860777a9" Dec 16 07:22:17 crc kubenswrapper[4823]: I1216 07:22:17.881694 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-fvqqp" podUID="5fe879e4-70bf-4f38-a4a7-98f5eb23a769" containerName="ovn-controller" containerID="cri-o://0a921f2564997ac6138f43048f8c54eaec84f63860087b54d2b224f1860777a9" gracePeriod=22 Dec 16 07:22:18 crc kubenswrapper[4823]: E1216 07:22:18.118336 4823 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 16 07:22:18 crc kubenswrapper[4823]: E1216 07:22:18.118719 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ec00a24a-8417-452e-a350-b46f36d4a84d-operator-scripts podName:ec00a24a-8417-452e-a350-b46f36d4a84d nodeName:}" failed. No retries permitted until 2025-12-16 07:22:22.118702615 +0000 UTC m=+1620.607268748 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/ec00a24a-8417-452e-a350-b46f36d4a84d-operator-scripts") pod "cinder1f3e-account-delete-q5pns" (UID: "ec00a24a-8417-452e-a350-b46f36d4a84d") : configmap "openstack-scripts" not found Dec 16 07:22:18 crc kubenswrapper[4823]: E1216 07:22:18.220165 4823 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 16 07:22:18 crc kubenswrapper[4823]: E1216 07:22:18.220246 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/65278526-b5ee-4e40-b66b-1ee9b993f429-operator-scripts podName:65278526-b5ee-4e40-b66b-1ee9b993f429 nodeName:}" failed. No retries permitted until 2025-12-16 07:22:22.220227865 +0000 UTC m=+1620.708793988 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/65278526-b5ee-4e40-b66b-1ee9b993f429-operator-scripts") pod "novaapic1ba-account-delete-sldxg" (UID: "65278526-b5ee-4e40-b66b-1ee9b993f429") : configmap "openstack-scripts" not found Dec 16 07:22:18 crc kubenswrapper[4823]: E1216 07:22:18.220660 4823 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 16 07:22:18 crc kubenswrapper[4823]: E1216 07:22:18.220698 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4a3f54ee-1dba-42f5-8697-b70de0f5b4c2-operator-scripts podName:4a3f54ee-1dba-42f5-8697-b70de0f5b4c2 nodeName:}" failed. No retries permitted until 2025-12-16 07:22:22.220689069 +0000 UTC m=+1620.709255192 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/4a3f54ee-1dba-42f5-8697-b70de0f5b4c2-operator-scripts") pod "neutronba48-account-delete-87d8j" (UID: "4a3f54ee-1dba-42f5-8697-b70de0f5b4c2") : configmap "openstack-scripts" not found Dec 16 07:22:18 crc kubenswrapper[4823]: E1216 07:22:18.220728 4823 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 16 07:22:18 crc kubenswrapper[4823]: E1216 07:22:18.220752 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/dfd7efdc-36ba-4037-9f6c-a1a8c946ab33-operator-scripts podName:dfd7efdc-36ba-4037-9f6c-a1a8c946ab33 nodeName:}" failed. No retries permitted until 2025-12-16 07:22:22.220744081 +0000 UTC m=+1620.709310214 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/dfd7efdc-36ba-4037-9f6c-a1a8c946ab33-operator-scripts") pod "novacell06c77-account-delete-5jkkk" (UID: "dfd7efdc-36ba-4037-9f6c-a1a8c946ab33") : configmap "openstack-scripts" not found Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.343662 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-749d6ff74-w7lnp" Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.369649 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-fvqqp_5fe879e4-70bf-4f38-a4a7-98f5eb23a769/ovn-controller/0.log" Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.369730 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-fvqqp" Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.410234 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-99f9cf477-cj5ss" Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.422497 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5fe879e4-70bf-4f38-a4a7-98f5eb23a769-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "5fe879e4-70bf-4f38-a4a7-98f5eb23a769" (UID: "5fe879e4-70bf-4f38-a4a7-98f5eb23a769"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.422613 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5fe879e4-70bf-4f38-a4a7-98f5eb23a769-var-log-ovn\") pod \"5fe879e4-70bf-4f38-a4a7-98f5eb23a769\" (UID: \"5fe879e4-70bf-4f38-a4a7-98f5eb23a769\") " Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.422634 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b1cd2ecc-0f95-47b9-b43e-5c36fa6a33b6-config-data-custom\") pod \"b1cd2ecc-0f95-47b9-b43e-5c36fa6a33b6\" (UID: \"b1cd2ecc-0f95-47b9-b43e-5c36fa6a33b6\") " Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.422666 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fe879e4-70bf-4f38-a4a7-98f5eb23a769-combined-ca-bundle\") pod \"5fe879e4-70bf-4f38-a4a7-98f5eb23a769\" (UID: \"5fe879e4-70bf-4f38-a4a7-98f5eb23a769\") " Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.422684 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5fe879e4-70bf-4f38-a4a7-98f5eb23a769-var-run\") pod \"5fe879e4-70bf-4f38-a4a7-98f5eb23a769\" (UID: \"5fe879e4-70bf-4f38-a4a7-98f5eb23a769\") " Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.422696 4823 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5fe879e4-70bf-4f38-a4a7-98f5eb23a769-var-run-ovn\") pod \"5fe879e4-70bf-4f38-a4a7-98f5eb23a769\" (UID: \"5fe879e4-70bf-4f38-a4a7-98f5eb23a769\") " Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.422718 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1cd2ecc-0f95-47b9-b43e-5c36fa6a33b6-config-data\") pod \"b1cd2ecc-0f95-47b9-b43e-5c36fa6a33b6\" (UID: \"b1cd2ecc-0f95-47b9-b43e-5c36fa6a33b6\") " Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.422747 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sg59w\" (UniqueName: \"kubernetes.io/projected/b1cd2ecc-0f95-47b9-b43e-5c36fa6a33b6-kube-api-access-sg59w\") pod \"b1cd2ecc-0f95-47b9-b43e-5c36fa6a33b6\" (UID: \"b1cd2ecc-0f95-47b9-b43e-5c36fa6a33b6\") " Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.422808 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5fe879e4-70bf-4f38-a4a7-98f5eb23a769-scripts\") pod \"5fe879e4-70bf-4f38-a4a7-98f5eb23a769\" (UID: \"5fe879e4-70bf-4f38-a4a7-98f5eb23a769\") " Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.422878 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b1cd2ecc-0f95-47b9-b43e-5c36fa6a33b6-logs\") pod \"b1cd2ecc-0f95-47b9-b43e-5c36fa6a33b6\" (UID: \"b1cd2ecc-0f95-47b9-b43e-5c36fa6a33b6\") " Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.422893 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-875md\" (UniqueName: \"kubernetes.io/projected/5fe879e4-70bf-4f38-a4a7-98f5eb23a769-kube-api-access-875md\") pod \"5fe879e4-70bf-4f38-a4a7-98f5eb23a769\" (UID: 
\"5fe879e4-70bf-4f38-a4a7-98f5eb23a769\") " Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.422914 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1cd2ecc-0f95-47b9-b43e-5c36fa6a33b6-combined-ca-bundle\") pod \"b1cd2ecc-0f95-47b9-b43e-5c36fa6a33b6\" (UID: \"b1cd2ecc-0f95-47b9-b43e-5c36fa6a33b6\") " Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.422939 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/5fe879e4-70bf-4f38-a4a7-98f5eb23a769-ovn-controller-tls-certs\") pod \"5fe879e4-70bf-4f38-a4a7-98f5eb23a769\" (UID: \"5fe879e4-70bf-4f38-a4a7-98f5eb23a769\") " Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.423488 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5fe879e4-70bf-4f38-a4a7-98f5eb23a769-var-run" (OuterVolumeSpecName: "var-run") pod "5fe879e4-70bf-4f38-a4a7-98f5eb23a769" (UID: "5fe879e4-70bf-4f38-a4a7-98f5eb23a769"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.423557 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5fe879e4-70bf-4f38-a4a7-98f5eb23a769-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "5fe879e4-70bf-4f38-a4a7-98f5eb23a769" (UID: "5fe879e4-70bf-4f38-a4a7-98f5eb23a769"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.423771 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1cd2ecc-0f95-47b9-b43e-5c36fa6a33b6-logs" (OuterVolumeSpecName: "logs") pod "b1cd2ecc-0f95-47b9-b43e-5c36fa6a33b6" (UID: "b1cd2ecc-0f95-47b9-b43e-5c36fa6a33b6"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.424068 4823 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b1cd2ecc-0f95-47b9-b43e-5c36fa6a33b6-logs\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.424083 4823 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5fe879e4-70bf-4f38-a4a7-98f5eb23a769-var-log-ovn\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.424092 4823 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5fe879e4-70bf-4f38-a4a7-98f5eb23a769-var-run\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.424100 4823 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5fe879e4-70bf-4f38-a4a7-98f5eb23a769-var-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.424873 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5fe879e4-70bf-4f38-a4a7-98f5eb23a769-scripts" (OuterVolumeSpecName: "scripts") pod "5fe879e4-70bf-4f38-a4a7-98f5eb23a769" (UID: "5fe879e4-70bf-4f38-a4a7-98f5eb23a769"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.438986 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-6c7767d9f4-5rbv6" Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.444006 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1cd2ecc-0f95-47b9-b43e-5c36fa6a33b6-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "b1cd2ecc-0f95-47b9-b43e-5c36fa6a33b6" (UID: "b1cd2ecc-0f95-47b9-b43e-5c36fa6a33b6"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.444562 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe879e4-70bf-4f38-a4a7-98f5eb23a769-kube-api-access-875md" (OuterVolumeSpecName: "kube-api-access-875md") pod "5fe879e4-70bf-4f38-a4a7-98f5eb23a769" (UID: "5fe879e4-70bf-4f38-a4a7-98f5eb23a769"). InnerVolumeSpecName "kube-api-access-875md". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.445248 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1cd2ecc-0f95-47b9-b43e-5c36fa6a33b6-kube-api-access-sg59w" (OuterVolumeSpecName: "kube-api-access-sg59w") pod "b1cd2ecc-0f95-47b9-b43e-5c36fa6a33b6" (UID: "b1cd2ecc-0f95-47b9-b43e-5c36fa6a33b6"). InnerVolumeSpecName "kube-api-access-sg59w". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.463479 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe879e4-70bf-4f38-a4a7-98f5eb23a769-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5fe879e4-70bf-4f38-a4a7-98f5eb23a769" (UID: "5fe879e4-70bf-4f38-a4a7-98f5eb23a769"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.466182 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1cd2ecc-0f95-47b9-b43e-5c36fa6a33b6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b1cd2ecc-0f95-47b9-b43e-5c36fa6a33b6" (UID: "b1cd2ecc-0f95-47b9-b43e-5c36fa6a33b6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.469670 4823 scope.go:117] "RemoveContainer" containerID="51ee0e5df9e688f5c88a35c0aa9dd24ecbc2d9cd3579ec6b75a7584a9bee2720" Dec 16 07:22:18 crc kubenswrapper[4823]: E1216 07:22:18.470128 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51ee0e5df9e688f5c88a35c0aa9dd24ecbc2d9cd3579ec6b75a7584a9bee2720\": container with ID starting with 51ee0e5df9e688f5c88a35c0aa9dd24ecbc2d9cd3579ec6b75a7584a9bee2720 not found: ID does not exist" containerID="51ee0e5df9e688f5c88a35c0aa9dd24ecbc2d9cd3579ec6b75a7584a9bee2720" Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.470160 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51ee0e5df9e688f5c88a35c0aa9dd24ecbc2d9cd3579ec6b75a7584a9bee2720"} err="failed to get container status \"51ee0e5df9e688f5c88a35c0aa9dd24ecbc2d9cd3579ec6b75a7584a9bee2720\": rpc error: code = NotFound desc = could not find container \"51ee0e5df9e688f5c88a35c0aa9dd24ecbc2d9cd3579ec6b75a7584a9bee2720\": container with ID starting with 51ee0e5df9e688f5c88a35c0aa9dd24ecbc2d9cd3579ec6b75a7584a9bee2720 not found: ID does not exist" Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.470179 4823 scope.go:117] "RemoveContainer" containerID="bdecbd186c280c8e1a08344d429387d2b5b9ce5dc22f4986496eacc03840a6ae" Dec 16 07:22:18 crc kubenswrapper[4823]: E1216 07:22:18.470427 4823 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bdecbd186c280c8e1a08344d429387d2b5b9ce5dc22f4986496eacc03840a6ae\": container with ID starting with bdecbd186c280c8e1a08344d429387d2b5b9ce5dc22f4986496eacc03840a6ae not found: ID does not exist" containerID="bdecbd186c280c8e1a08344d429387d2b5b9ce5dc22f4986496eacc03840a6ae" Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.470445 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bdecbd186c280c8e1a08344d429387d2b5b9ce5dc22f4986496eacc03840a6ae"} err="failed to get container status \"bdecbd186c280c8e1a08344d429387d2b5b9ce5dc22f4986496eacc03840a6ae\": rpc error: code = NotFound desc = could not find container \"bdecbd186c280c8e1a08344d429387d2b5b9ce5dc22f4986496eacc03840a6ae\": container with ID starting with bdecbd186c280c8e1a08344d429387d2b5b9ce5dc22f4986496eacc03840a6ae not found: ID does not exist" Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.470459 4823 scope.go:117] "RemoveContainer" containerID="36a98e82cbcb4bee731b20517aebf25ec378c019a17c67f3b8b2c9437196612b" Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.503884 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1cd2ecc-0f95-47b9-b43e-5c36fa6a33b6-config-data" (OuterVolumeSpecName: "config-data") pod "b1cd2ecc-0f95-47b9-b43e-5c36fa6a33b6" (UID: "b1cd2ecc-0f95-47b9-b43e-5c36fa6a33b6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.521853 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe879e4-70bf-4f38-a4a7-98f5eb23a769-ovn-controller-tls-certs" (OuterVolumeSpecName: "ovn-controller-tls-certs") pod "5fe879e4-70bf-4f38-a4a7-98f5eb23a769" (UID: "5fe879e4-70bf-4f38-a4a7-98f5eb23a769"). InnerVolumeSpecName "ovn-controller-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.525320 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22db0f3f-88b5-4909-aa80-f4b020d1ce18-combined-ca-bundle\") pod \"22db0f3f-88b5-4909-aa80-f4b020d1ce18\" (UID: \"22db0f3f-88b5-4909-aa80-f4b020d1ce18\") " Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.525371 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7a88b40-28bf-4b43-bed8-0b3df3baec5c-internal-tls-certs\") pod \"d7a88b40-28bf-4b43-bed8-0b3df3baec5c\" (UID: \"d7a88b40-28bf-4b43-bed8-0b3df3baec5c\") " Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.526055 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7a88b40-28bf-4b43-bed8-0b3df3baec5c-combined-ca-bundle\") pod \"d7a88b40-28bf-4b43-bed8-0b3df3baec5c\" (UID: \"d7a88b40-28bf-4b43-bed8-0b3df3baec5c\") " Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.526158 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/22db0f3f-88b5-4909-aa80-f4b020d1ce18-logs\") pod \"22db0f3f-88b5-4909-aa80-f4b020d1ce18\" (UID: \"22db0f3f-88b5-4909-aa80-f4b020d1ce18\") " Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.526178 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22db0f3f-88b5-4909-aa80-f4b020d1ce18-config-data\") pod \"22db0f3f-88b5-4909-aa80-f4b020d1ce18\" (UID: \"22db0f3f-88b5-4909-aa80-f4b020d1ce18\") " Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.526221 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/d7a88b40-28bf-4b43-bed8-0b3df3baec5c-public-tls-certs\") pod \"d7a88b40-28bf-4b43-bed8-0b3df3baec5c\" (UID: \"d7a88b40-28bf-4b43-bed8-0b3df3baec5c\") " Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.526260 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d7a88b40-28bf-4b43-bed8-0b3df3baec5c-fernet-keys\") pod \"d7a88b40-28bf-4b43-bed8-0b3df3baec5c\" (UID: \"d7a88b40-28bf-4b43-bed8-0b3df3baec5c\") " Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.526311 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/22db0f3f-88b5-4909-aa80-f4b020d1ce18-config-data-custom\") pod \"22db0f3f-88b5-4909-aa80-f4b020d1ce18\" (UID: \"22db0f3f-88b5-4909-aa80-f4b020d1ce18\") " Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.526328 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7a88b40-28bf-4b43-bed8-0b3df3baec5c-scripts\") pod \"d7a88b40-28bf-4b43-bed8-0b3df3baec5c\" (UID: \"d7a88b40-28bf-4b43-bed8-0b3df3baec5c\") " Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.526467 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7a88b40-28bf-4b43-bed8-0b3df3baec5c-config-data\") pod \"d7a88b40-28bf-4b43-bed8-0b3df3baec5c\" (UID: \"d7a88b40-28bf-4b43-bed8-0b3df3baec5c\") " Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.526492 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m7pdt\" (UniqueName: \"kubernetes.io/projected/22db0f3f-88b5-4909-aa80-f4b020d1ce18-kube-api-access-m7pdt\") pod \"22db0f3f-88b5-4909-aa80-f4b020d1ce18\" (UID: \"22db0f3f-88b5-4909-aa80-f4b020d1ce18\") " Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.526537 4823 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d7a88b40-28bf-4b43-bed8-0b3df3baec5c-credential-keys\") pod \"d7a88b40-28bf-4b43-bed8-0b3df3baec5c\" (UID: \"d7a88b40-28bf-4b43-bed8-0b3df3baec5c\") " Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.526568 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nw5tz\" (UniqueName: \"kubernetes.io/projected/d7a88b40-28bf-4b43-bed8-0b3df3baec5c-kube-api-access-nw5tz\") pod \"d7a88b40-28bf-4b43-bed8-0b3df3baec5c\" (UID: \"d7a88b40-28bf-4b43-bed8-0b3df3baec5c\") " Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.527083 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sg59w\" (UniqueName: \"kubernetes.io/projected/b1cd2ecc-0f95-47b9-b43e-5c36fa6a33b6-kube-api-access-sg59w\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.527099 4823 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5fe879e4-70bf-4f38-a4a7-98f5eb23a769-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.527133 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-875md\" (UniqueName: \"kubernetes.io/projected/5fe879e4-70bf-4f38-a4a7-98f5eb23a769-kube-api-access-875md\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.527145 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1cd2ecc-0f95-47b9-b43e-5c36fa6a33b6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.527154 4823 reconciler_common.go:293] "Volume detached for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/5fe879e4-70bf-4f38-a4a7-98f5eb23a769-ovn-controller-tls-certs\") on 
node \"crc\" DevicePath \"\"" Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.527163 4823 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b1cd2ecc-0f95-47b9-b43e-5c36fa6a33b6-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.527172 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fe879e4-70bf-4f38-a4a7-98f5eb23a769-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.527198 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1cd2ecc-0f95-47b9-b43e-5c36fa6a33b6-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.537287 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/22db0f3f-88b5-4909-aa80-f4b020d1ce18-logs" (OuterVolumeSpecName: "logs") pod "22db0f3f-88b5-4909-aa80-f4b020d1ce18" (UID: "22db0f3f-88b5-4909-aa80-f4b020d1ce18"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.546205 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7a88b40-28bf-4b43-bed8-0b3df3baec5c-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "d7a88b40-28bf-4b43-bed8-0b3df3baec5c" (UID: "d7a88b40-28bf-4b43-bed8-0b3df3baec5c"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.546221 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22db0f3f-88b5-4909-aa80-f4b020d1ce18-kube-api-access-m7pdt" (OuterVolumeSpecName: "kube-api-access-m7pdt") pod "22db0f3f-88b5-4909-aa80-f4b020d1ce18" (UID: "22db0f3f-88b5-4909-aa80-f4b020d1ce18"). InnerVolumeSpecName "kube-api-access-m7pdt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.550817 4823 scope.go:117] "RemoveContainer" containerID="0639ca39d4b510f82c5a92153f15cb0546ff06018f3f66e0dd1e7b8d07959478" Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.554338 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7a88b40-28bf-4b43-bed8-0b3df3baec5c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d7a88b40-28bf-4b43-bed8-0b3df3baec5c" (UID: "d7a88b40-28bf-4b43-bed8-0b3df3baec5c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.556849 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7a88b40-28bf-4b43-bed8-0b3df3baec5c-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "d7a88b40-28bf-4b43-bed8-0b3df3baec5c" (UID: "d7a88b40-28bf-4b43-bed8-0b3df3baec5c"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.558195 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7a88b40-28bf-4b43-bed8-0b3df3baec5c-scripts" (OuterVolumeSpecName: "scripts") pod "d7a88b40-28bf-4b43-bed8-0b3df3baec5c" (UID: "d7a88b40-28bf-4b43-bed8-0b3df3baec5c"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.560275 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7a88b40-28bf-4b43-bed8-0b3df3baec5c-kube-api-access-nw5tz" (OuterVolumeSpecName: "kube-api-access-nw5tz") pod "d7a88b40-28bf-4b43-bed8-0b3df3baec5c" (UID: "d7a88b40-28bf-4b43-bed8-0b3df3baec5c"). InnerVolumeSpecName "kube-api-access-nw5tz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.569301 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22db0f3f-88b5-4909-aa80-f4b020d1ce18-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "22db0f3f-88b5-4909-aa80-f4b020d1ce18" (UID: "22db0f3f-88b5-4909-aa80-f4b020d1ce18"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.569389 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22db0f3f-88b5-4909-aa80-f4b020d1ce18-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "22db0f3f-88b5-4909-aa80-f4b020d1ce18" (UID: "22db0f3f-88b5-4909-aa80-f4b020d1ce18"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.578435 4823 scope.go:117] "RemoveContainer" containerID="310505b16221fa383a21293252ba7eeff5b379b06f829c1276c31a12ac010ede" Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.591365 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22db0f3f-88b5-4909-aa80-f4b020d1ce18-config-data" (OuterVolumeSpecName: "config-data") pod "22db0f3f-88b5-4909-aa80-f4b020d1ce18" (UID: "22db0f3f-88b5-4909-aa80-f4b020d1ce18"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.611232 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7a88b40-28bf-4b43-bed8-0b3df3baec5c-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "d7a88b40-28bf-4b43-bed8-0b3df3baec5c" (UID: "d7a88b40-28bf-4b43-bed8-0b3df3baec5c"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.614724 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7a88b40-28bf-4b43-bed8-0b3df3baec5c-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "d7a88b40-28bf-4b43-bed8-0b3df3baec5c" (UID: "d7a88b40-28bf-4b43-bed8-0b3df3baec5c"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.614861 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7a88b40-28bf-4b43-bed8-0b3df3baec5c-config-data" (OuterVolumeSpecName: "config-data") pod "d7a88b40-28bf-4b43-bed8-0b3df3baec5c" (UID: "d7a88b40-28bf-4b43-bed8-0b3df3baec5c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.619264 4823 scope.go:117] "RemoveContainer" containerID="2a7d0f9a298fcb19a0fa4b0b8135003ee9e99ea4befbcc1f9f084602c0766551" Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.629362 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7a88b40-28bf-4b43-bed8-0b3df3baec5c-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.629396 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m7pdt\" (UniqueName: \"kubernetes.io/projected/22db0f3f-88b5-4909-aa80-f4b020d1ce18-kube-api-access-m7pdt\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.629407 4823 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d7a88b40-28bf-4b43-bed8-0b3df3baec5c-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.629418 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nw5tz\" (UniqueName: \"kubernetes.io/projected/d7a88b40-28bf-4b43-bed8-0b3df3baec5c-kube-api-access-nw5tz\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.629426 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22db0f3f-88b5-4909-aa80-f4b020d1ce18-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.629436 4823 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7a88b40-28bf-4b43-bed8-0b3df3baec5c-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.629469 4823 reconciler_common.go:293] "Volume detached for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7a88b40-28bf-4b43-bed8-0b3df3baec5c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.629478 4823 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/22db0f3f-88b5-4909-aa80-f4b020d1ce18-logs\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.629487 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22db0f3f-88b5-4909-aa80-f4b020d1ce18-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.629495 4823 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7a88b40-28bf-4b43-bed8-0b3df3baec5c-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.629503 4823 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d7a88b40-28bf-4b43-bed8-0b3df3baec5c-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.629510 4823 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/22db0f3f-88b5-4909-aa80-f4b020d1ce18-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.629520 4823 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7a88b40-28bf-4b43-bed8-0b3df3baec5c-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.662767 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.665332 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.669064 4823 scope.go:117] "RemoveContainer" containerID="310505b16221fa383a21293252ba7eeff5b379b06f829c1276c31a12ac010ede" Dec 16 07:22:18 crc kubenswrapper[4823]: E1216 07:22:18.669362 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"310505b16221fa383a21293252ba7eeff5b379b06f829c1276c31a12ac010ede\": container with ID starting with 310505b16221fa383a21293252ba7eeff5b379b06f829c1276c31a12ac010ede not found: ID does not exist" containerID="310505b16221fa383a21293252ba7eeff5b379b06f829c1276c31a12ac010ede" Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.669390 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"310505b16221fa383a21293252ba7eeff5b379b06f829c1276c31a12ac010ede"} err="failed to get container status \"310505b16221fa383a21293252ba7eeff5b379b06f829c1276c31a12ac010ede\": rpc error: code = NotFound desc = could not find container \"310505b16221fa383a21293252ba7eeff5b379b06f829c1276c31a12ac010ede\": container with ID starting with 310505b16221fa383a21293252ba7eeff5b379b06f829c1276c31a12ac010ede not found: ID does not exist" Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.669407 4823 scope.go:117] "RemoveContainer" containerID="2a7d0f9a298fcb19a0fa4b0b8135003ee9e99ea4befbcc1f9f084602c0766551" Dec 16 07:22:18 crc kubenswrapper[4823]: E1216 07:22:18.669655 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a7d0f9a298fcb19a0fa4b0b8135003ee9e99ea4befbcc1f9f084602c0766551\": container with ID starting with 
2a7d0f9a298fcb19a0fa4b0b8135003ee9e99ea4befbcc1f9f084602c0766551 not found: ID does not exist" containerID="2a7d0f9a298fcb19a0fa4b0b8135003ee9e99ea4befbcc1f9f084602c0766551" Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.669697 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a7d0f9a298fcb19a0fa4b0b8135003ee9e99ea4befbcc1f9f084602c0766551"} err="failed to get container status \"2a7d0f9a298fcb19a0fa4b0b8135003ee9e99ea4befbcc1f9f084602c0766551\": rpc error: code = NotFound desc = could not find container \"2a7d0f9a298fcb19a0fa4b0b8135003ee9e99ea4befbcc1f9f084602c0766551\": container with ID starting with 2a7d0f9a298fcb19a0fa4b0b8135003ee9e99ea4befbcc1f9f084602c0766551 not found: ID does not exist" Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.746368 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77e933eb-7294-47b8-af0c-fbb03725d3d8-combined-ca-bundle\") pod \"77e933eb-7294-47b8-af0c-fbb03725d3d8\" (UID: \"77e933eb-7294-47b8-af0c-fbb03725d3d8\") " Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.746459 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a18c5d6c-3429-4aa3-b933-85176e0e5ece-combined-ca-bundle\") pod \"a18c5d6c-3429-4aa3-b933-85176e0e5ece\" (UID: \"a18c5d6c-3429-4aa3-b933-85176e0e5ece\") " Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.746544 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m594w\" (UniqueName: \"kubernetes.io/projected/a18c5d6c-3429-4aa3-b933-85176e0e5ece-kube-api-access-m594w\") pod \"a18c5d6c-3429-4aa3-b933-85176e0e5ece\" (UID: \"a18c5d6c-3429-4aa3-b933-85176e0e5ece\") " Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.746575 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/77e933eb-7294-47b8-af0c-fbb03725d3d8-scripts\") pod \"77e933eb-7294-47b8-af0c-fbb03725d3d8\" (UID: \"77e933eb-7294-47b8-af0c-fbb03725d3d8\") " Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.746611 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/77e933eb-7294-47b8-af0c-fbb03725d3d8-run-httpd\") pod \"77e933eb-7294-47b8-af0c-fbb03725d3d8\" (UID: \"77e933eb-7294-47b8-af0c-fbb03725d3d8\") " Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.746682 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a18c5d6c-3429-4aa3-b933-85176e0e5ece-config-data\") pod \"a18c5d6c-3429-4aa3-b933-85176e0e5ece\" (UID: \"a18c5d6c-3429-4aa3-b933-85176e0e5ece\") " Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.746712 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/77e933eb-7294-47b8-af0c-fbb03725d3d8-sg-core-conf-yaml\") pod \"77e933eb-7294-47b8-af0c-fbb03725d3d8\" (UID: \"77e933eb-7294-47b8-af0c-fbb03725d3d8\") " Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.746748 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77e933eb-7294-47b8-af0c-fbb03725d3d8-config-data\") pod \"77e933eb-7294-47b8-af0c-fbb03725d3d8\" (UID: \"77e933eb-7294-47b8-af0c-fbb03725d3d8\") " Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.746807 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ljcbm\" (UniqueName: \"kubernetes.io/projected/77e933eb-7294-47b8-af0c-fbb03725d3d8-kube-api-access-ljcbm\") pod \"77e933eb-7294-47b8-af0c-fbb03725d3d8\" (UID: \"77e933eb-7294-47b8-af0c-fbb03725d3d8\") " Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 
07:22:18.746851 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/77e933eb-7294-47b8-af0c-fbb03725d3d8-log-httpd\") pod \"77e933eb-7294-47b8-af0c-fbb03725d3d8\" (UID: \"77e933eb-7294-47b8-af0c-fbb03725d3d8\") " Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.746904 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/77e933eb-7294-47b8-af0c-fbb03725d3d8-ceilometer-tls-certs\") pod \"77e933eb-7294-47b8-af0c-fbb03725d3d8\" (UID: \"77e933eb-7294-47b8-af0c-fbb03725d3d8\") " Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.751879 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77e933eb-7294-47b8-af0c-fbb03725d3d8-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "77e933eb-7294-47b8-af0c-fbb03725d3d8" (UID: "77e933eb-7294-47b8-af0c-fbb03725d3d8"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.753123 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77e933eb-7294-47b8-af0c-fbb03725d3d8-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "77e933eb-7294-47b8-af0c-fbb03725d3d8" (UID: "77e933eb-7294-47b8-af0c-fbb03725d3d8"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.754893 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77e933eb-7294-47b8-af0c-fbb03725d3d8-kube-api-access-ljcbm" (OuterVolumeSpecName: "kube-api-access-ljcbm") pod "77e933eb-7294-47b8-af0c-fbb03725d3d8" (UID: "77e933eb-7294-47b8-af0c-fbb03725d3d8"). InnerVolumeSpecName "kube-api-access-ljcbm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.755432 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a18c5d6c-3429-4aa3-b933-85176e0e5ece-kube-api-access-m594w" (OuterVolumeSpecName: "kube-api-access-m594w") pod "a18c5d6c-3429-4aa3-b933-85176e0e5ece" (UID: "a18c5d6c-3429-4aa3-b933-85176e0e5ece"). InnerVolumeSpecName "kube-api-access-m594w". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.758452 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77e933eb-7294-47b8-af0c-fbb03725d3d8-scripts" (OuterVolumeSpecName: "scripts") pod "77e933eb-7294-47b8-af0c-fbb03725d3d8" (UID: "77e933eb-7294-47b8-af0c-fbb03725d3d8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.777580 4823 generic.go:334] "Generic (PLEG): container finished" podID="b1cd2ecc-0f95-47b9-b43e-5c36fa6a33b6" containerID="e8fc2c585ca3548f97ebfebb7bf47c7049bc68135e31ef4c2ea178062973a34c" exitCode=0 Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.777649 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-749d6ff74-w7lnp" event={"ID":"b1cd2ecc-0f95-47b9-b43e-5c36fa6a33b6","Type":"ContainerDied","Data":"e8fc2c585ca3548f97ebfebb7bf47c7049bc68135e31ef4c2ea178062973a34c"} Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.777677 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-749d6ff74-w7lnp" event={"ID":"b1cd2ecc-0f95-47b9-b43e-5c36fa6a33b6","Type":"ContainerDied","Data":"65e3b1c7decd0499ef13149a8062101f21080b161212cffa19da508b806df46c"} Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.777694 4823 scope.go:117] "RemoveContainer" 
containerID="e8fc2c585ca3548f97ebfebb7bf47c7049bc68135e31ef4c2ea178062973a34c" Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.777855 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-749d6ff74-w7lnp" Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.787300 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a18c5d6c-3429-4aa3-b933-85176e0e5ece-config-data" (OuterVolumeSpecName: "config-data") pod "a18c5d6c-3429-4aa3-b933-85176e0e5ece" (UID: "a18c5d6c-3429-4aa3-b933-85176e0e5ece"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.788782 4823 generic.go:334] "Generic (PLEG): container finished" podID="22db0f3f-88b5-4909-aa80-f4b020d1ce18" containerID="cf30b2b23d3cfc03fbaa30e01fba97a5a84312c839e774aea9c6f64e79f21e6a" exitCode=0 Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.788939 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-99f9cf477-cj5ss" Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.789043 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-99f9cf477-cj5ss" event={"ID":"22db0f3f-88b5-4909-aa80-f4b020d1ce18","Type":"ContainerDied","Data":"cf30b2b23d3cfc03fbaa30e01fba97a5a84312c839e774aea9c6f64e79f21e6a"} Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.789203 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-99f9cf477-cj5ss" event={"ID":"22db0f3f-88b5-4909-aa80-f4b020d1ce18","Type":"ContainerDied","Data":"04a40d440f394dcb42dbb0ead7abc03d86f229d9020fab91c176e64657413429"} Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.789748 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77e933eb-7294-47b8-af0c-fbb03725d3d8-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "77e933eb-7294-47b8-af0c-fbb03725d3d8" (UID: "77e933eb-7294-47b8-af0c-fbb03725d3d8"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.791466 4823 generic.go:334] "Generic (PLEG): container finished" podID="7ad8e2a2-14c6-45b5-86f3-e4765cddd777" containerID="51e78213653e84ab99bbc7625548d55635dfcb54de59c8fec91ff584c2afb7a9" exitCode=0 Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.791541 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"7ad8e2a2-14c6-45b5-86f3-e4765cddd777","Type":"ContainerDied","Data":"51e78213653e84ab99bbc7625548d55635dfcb54de59c8fec91ff584c2afb7a9"} Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.800947 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-fvqqp_5fe879e4-70bf-4f38-a4a7-98f5eb23a769/ovn-controller/0.log" Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.801013 4823 generic.go:334] "Generic (PLEG): container finished" podID="5fe879e4-70bf-4f38-a4a7-98f5eb23a769" containerID="0a921f2564997ac6138f43048f8c54eaec84f63860087b54d2b224f1860777a9" exitCode=137 Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.801137 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-fvqqp" event={"ID":"5fe879e4-70bf-4f38-a4a7-98f5eb23a769","Type":"ContainerDied","Data":"0a921f2564997ac6138f43048f8c54eaec84f63860087b54d2b224f1860777a9"} Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.801182 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-fvqqp" event={"ID":"5fe879e4-70bf-4f38-a4a7-98f5eb23a769","Type":"ContainerDied","Data":"a2b81c76e6cce262197b4a0317a6d19dc1b5e9e49d911f38fef703c2a4247695"} Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.801146 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-fvqqp" Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.803382 4823 generic.go:334] "Generic (PLEG): container finished" podID="a18c5d6c-3429-4aa3-b933-85176e0e5ece" containerID="3650c948a8c447960d8862d51fa7c7f894dc0569eb954a455acb9fd443967ab1" exitCode=0 Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.803452 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a18c5d6c-3429-4aa3-b933-85176e0e5ece","Type":"ContainerDied","Data":"3650c948a8c447960d8862d51fa7c7f894dc0569eb954a455acb9fd443967ab1"} Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.803481 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a18c5d6c-3429-4aa3-b933-85176e0e5ece","Type":"ContainerDied","Data":"b8f8e19bb3fc5f06b6f1cb8ab0b5c114739dfe21c06dc6eb8db75a76961ced2f"} Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.803600 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.811457 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a18c5d6c-3429-4aa3-b933-85176e0e5ece-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a18c5d6c-3429-4aa3-b933-85176e0e5ece" (UID: "a18c5d6c-3429-4aa3-b933-85176e0e5ece"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.816227 4823 scope.go:117] "RemoveContainer" containerID="8ad897558a4d1aff0a9610d56a52fa00c8ef0a67fe7b4ed5924be748f672b9da" Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.818725 4823 generic.go:334] "Generic (PLEG): container finished" podID="79a24114-2ee1-4cc0-9045-770fcf074950" containerID="e121b5fc19f8847f31857c92e1abac87de929236af3edad6305ba6de36abc8a3" exitCode=0 Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.818794 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"79a24114-2ee1-4cc0-9045-770fcf074950","Type":"ContainerDied","Data":"e121b5fc19f8847f31857c92e1abac87de929236af3edad6305ba6de36abc8a3"} Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.828664 4823 generic.go:334] "Generic (PLEG): container finished" podID="d7a88b40-28bf-4b43-bed8-0b3df3baec5c" containerID="c0cd38487b75afdb67a7225ee2f0fe111d46a163417ffe7f85edb1cbb15aead4" exitCode=0 Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.828746 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6c7767d9f4-5rbv6" event={"ID":"d7a88b40-28bf-4b43-bed8-0b3df3baec5c","Type":"ContainerDied","Data":"c0cd38487b75afdb67a7225ee2f0fe111d46a163417ffe7f85edb1cbb15aead4"} Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.828774 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6c7767d9f4-5rbv6" event={"ID":"d7a88b40-28bf-4b43-bed8-0b3df3baec5c","Type":"ContainerDied","Data":"bb835f7ff7aad3e6fa75a7c0216849fa0da434bcb6199e32d801c9063e346a67"} Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.828875 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-6c7767d9f4-5rbv6" Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.831193 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-99f9cf477-cj5ss"] Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.837219 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-99f9cf477-cj5ss"] Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.850079 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a18c5d6c-3429-4aa3-b933-85176e0e5ece-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.850111 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m594w\" (UniqueName: \"kubernetes.io/projected/a18c5d6c-3429-4aa3-b933-85176e0e5ece-kube-api-access-m594w\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.850121 4823 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/77e933eb-7294-47b8-af0c-fbb03725d3d8-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.850131 4823 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/77e933eb-7294-47b8-af0c-fbb03725d3d8-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.850140 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a18c5d6c-3429-4aa3-b933-85176e0e5ece-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.850148 4823 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/77e933eb-7294-47b8-af0c-fbb03725d3d8-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 
16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.850157 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ljcbm\" (UniqueName: \"kubernetes.io/projected/77e933eb-7294-47b8-af0c-fbb03725d3d8-kube-api-access-ljcbm\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.850165 4823 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/77e933eb-7294-47b8-af0c-fbb03725d3d8-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.853514 4823 generic.go:334] "Generic (PLEG): container finished" podID="77e933eb-7294-47b8-af0c-fbb03725d3d8" containerID="c5e2a5c0a31ec4e9f5ca94f30d407742eb67d1531e0744587618b59f17762294" exitCode=0 Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.853549 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"77e933eb-7294-47b8-af0c-fbb03725d3d8","Type":"ContainerDied","Data":"c5e2a5c0a31ec4e9f5ca94f30d407742eb67d1531e0744587618b59f17762294"} Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.853574 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"77e933eb-7294-47b8-af0c-fbb03725d3d8","Type":"ContainerDied","Data":"d1eb8faa55599142c8d064c3cde3f9380cfdbbedc01cb4c2b32d4809cdfc6ff7"} Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.853641 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.858676 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77e933eb-7294-47b8-af0c-fbb03725d3d8-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "77e933eb-7294-47b8-af0c-fbb03725d3d8" (UID: "77e933eb-7294-47b8-af0c-fbb03725d3d8"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.875582 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77e933eb-7294-47b8-af0c-fbb03725d3d8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "77e933eb-7294-47b8-af0c-fbb03725d3d8" (UID: "77e933eb-7294-47b8-af0c-fbb03725d3d8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.880261 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77e933eb-7294-47b8-af0c-fbb03725d3d8-config-data" (OuterVolumeSpecName: "config-data") pod "77e933eb-7294-47b8-af0c-fbb03725d3d8" (UID: "77e933eb-7294-47b8-af0c-fbb03725d3d8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.951475 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77e933eb-7294-47b8-af0c-fbb03725d3d8-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.951514 4823 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/77e933eb-7294-47b8-af0c-fbb03725d3d8-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:18 crc kubenswrapper[4823]: I1216 07:22:18.951530 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77e933eb-7294-47b8-af0c-fbb03725d3d8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:19 crc kubenswrapper[4823]: I1216 07:22:18.999945 4823 scope.go:117] "RemoveContainer" containerID="e8fc2c585ca3548f97ebfebb7bf47c7049bc68135e31ef4c2ea178062973a34c" Dec 16 07:22:19 crc kubenswrapper[4823]: E1216 07:22:19.000852 4823 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8fc2c585ca3548f97ebfebb7bf47c7049bc68135e31ef4c2ea178062973a34c\": container with ID starting with e8fc2c585ca3548f97ebfebb7bf47c7049bc68135e31ef4c2ea178062973a34c not found: ID does not exist" containerID="e8fc2c585ca3548f97ebfebb7bf47c7049bc68135e31ef4c2ea178062973a34c" Dec 16 07:22:19 crc kubenswrapper[4823]: I1216 07:22:19.000882 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8fc2c585ca3548f97ebfebb7bf47c7049bc68135e31ef4c2ea178062973a34c"} err="failed to get container status \"e8fc2c585ca3548f97ebfebb7bf47c7049bc68135e31ef4c2ea178062973a34c\": rpc error: code = NotFound desc = could not find container \"e8fc2c585ca3548f97ebfebb7bf47c7049bc68135e31ef4c2ea178062973a34c\": container with ID starting with e8fc2c585ca3548f97ebfebb7bf47c7049bc68135e31ef4c2ea178062973a34c not found: ID does not exist" Dec 16 07:22:19 crc kubenswrapper[4823]: I1216 07:22:19.000903 4823 scope.go:117] "RemoveContainer" containerID="8ad897558a4d1aff0a9610d56a52fa00c8ef0a67fe7b4ed5924be748f672b9da" Dec 16 07:22:19 crc kubenswrapper[4823]: E1216 07:22:19.005101 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ad897558a4d1aff0a9610d56a52fa00c8ef0a67fe7b4ed5924be748f672b9da\": container with ID starting with 8ad897558a4d1aff0a9610d56a52fa00c8ef0a67fe7b4ed5924be748f672b9da not found: ID does not exist" containerID="8ad897558a4d1aff0a9610d56a52fa00c8ef0a67fe7b4ed5924be748f672b9da" Dec 16 07:22:19 crc kubenswrapper[4823]: I1216 07:22:19.005156 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ad897558a4d1aff0a9610d56a52fa00c8ef0a67fe7b4ed5924be748f672b9da"} err="failed to get container status \"8ad897558a4d1aff0a9610d56a52fa00c8ef0a67fe7b4ed5924be748f672b9da\": rpc error: code = NotFound desc = could 
not find container \"8ad897558a4d1aff0a9610d56a52fa00c8ef0a67fe7b4ed5924be748f672b9da\": container with ID starting with 8ad897558a4d1aff0a9610d56a52fa00c8ef0a67fe7b4ed5924be748f672b9da not found: ID does not exist" Dec 16 07:22:19 crc kubenswrapper[4823]: I1216 07:22:19.005189 4823 scope.go:117] "RemoveContainer" containerID="cf30b2b23d3cfc03fbaa30e01fba97a5a84312c839e774aea9c6f64e79f21e6a" Dec 16 07:22:19 crc kubenswrapper[4823]: I1216 07:22:19.013118 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-749d6ff74-w7lnp"] Dec 16 07:22:19 crc kubenswrapper[4823]: I1216 07:22:19.021122 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-749d6ff74-w7lnp"] Dec 16 07:22:19 crc kubenswrapper[4823]: I1216 07:22:19.021417 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 16 07:22:19 crc kubenswrapper[4823]: I1216 07:22:19.031313 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-fvqqp"] Dec 16 07:22:19 crc kubenswrapper[4823]: I1216 07:22:19.035901 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-fvqqp"] Dec 16 07:22:19 crc kubenswrapper[4823]: I1216 07:22:19.044273 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-6c7767d9f4-5rbv6"] Dec 16 07:22:19 crc kubenswrapper[4823]: I1216 07:22:19.044627 4823 scope.go:117] "RemoveContainer" containerID="831ecac837adb99ec40863b494bf97107b9ae0dabceaa9308863d145e46da25d" Dec 16 07:22:19 crc kubenswrapper[4823]: I1216 07:22:19.048102 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-6c7767d9f4-5rbv6"] Dec 16 07:22:19 crc kubenswrapper[4823]: I1216 07:22:19.068367 4823 scope.go:117] "RemoveContainer" containerID="cf30b2b23d3cfc03fbaa30e01fba97a5a84312c839e774aea9c6f64e79f21e6a" Dec 16 07:22:19 crc kubenswrapper[4823]: E1216 07:22:19.068821 4823 
log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf30b2b23d3cfc03fbaa30e01fba97a5a84312c839e774aea9c6f64e79f21e6a\": container with ID starting with cf30b2b23d3cfc03fbaa30e01fba97a5a84312c839e774aea9c6f64e79f21e6a not found: ID does not exist" containerID="cf30b2b23d3cfc03fbaa30e01fba97a5a84312c839e774aea9c6f64e79f21e6a" Dec 16 07:22:19 crc kubenswrapper[4823]: I1216 07:22:19.068915 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf30b2b23d3cfc03fbaa30e01fba97a5a84312c839e774aea9c6f64e79f21e6a"} err="failed to get container status \"cf30b2b23d3cfc03fbaa30e01fba97a5a84312c839e774aea9c6f64e79f21e6a\": rpc error: code = NotFound desc = could not find container \"cf30b2b23d3cfc03fbaa30e01fba97a5a84312c839e774aea9c6f64e79f21e6a\": container with ID starting with cf30b2b23d3cfc03fbaa30e01fba97a5a84312c839e774aea9c6f64e79f21e6a not found: ID does not exist" Dec 16 07:22:19 crc kubenswrapper[4823]: I1216 07:22:19.069014 4823 scope.go:117] "RemoveContainer" containerID="831ecac837adb99ec40863b494bf97107b9ae0dabceaa9308863d145e46da25d" Dec 16 07:22:19 crc kubenswrapper[4823]: E1216 07:22:19.069316 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"831ecac837adb99ec40863b494bf97107b9ae0dabceaa9308863d145e46da25d\": container with ID starting with 831ecac837adb99ec40863b494bf97107b9ae0dabceaa9308863d145e46da25d not found: ID does not exist" containerID="831ecac837adb99ec40863b494bf97107b9ae0dabceaa9308863d145e46da25d" Dec 16 07:22:19 crc kubenswrapper[4823]: I1216 07:22:19.069398 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"831ecac837adb99ec40863b494bf97107b9ae0dabceaa9308863d145e46da25d"} err="failed to get container status \"831ecac837adb99ec40863b494bf97107b9ae0dabceaa9308863d145e46da25d\": rpc error: code = NotFound 
desc = could not find container \"831ecac837adb99ec40863b494bf97107b9ae0dabceaa9308863d145e46da25d\": container with ID starting with 831ecac837adb99ec40863b494bf97107b9ae0dabceaa9308863d145e46da25d not found: ID does not exist" Dec 16 07:22:19 crc kubenswrapper[4823]: I1216 07:22:19.069460 4823 scope.go:117] "RemoveContainer" containerID="0a921f2564997ac6138f43048f8c54eaec84f63860087b54d2b224f1860777a9" Dec 16 07:22:19 crc kubenswrapper[4823]: I1216 07:22:19.090854 4823 scope.go:117] "RemoveContainer" containerID="0a921f2564997ac6138f43048f8c54eaec84f63860087b54d2b224f1860777a9" Dec 16 07:22:19 crc kubenswrapper[4823]: E1216 07:22:19.091429 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a921f2564997ac6138f43048f8c54eaec84f63860087b54d2b224f1860777a9\": container with ID starting with 0a921f2564997ac6138f43048f8c54eaec84f63860087b54d2b224f1860777a9 not found: ID does not exist" containerID="0a921f2564997ac6138f43048f8c54eaec84f63860087b54d2b224f1860777a9" Dec 16 07:22:19 crc kubenswrapper[4823]: I1216 07:22:19.091487 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a921f2564997ac6138f43048f8c54eaec84f63860087b54d2b224f1860777a9"} err="failed to get container status \"0a921f2564997ac6138f43048f8c54eaec84f63860087b54d2b224f1860777a9\": rpc error: code = NotFound desc = could not find container \"0a921f2564997ac6138f43048f8c54eaec84f63860087b54d2b224f1860777a9\": container with ID starting with 0a921f2564997ac6138f43048f8c54eaec84f63860087b54d2b224f1860777a9 not found: ID does not exist" Dec 16 07:22:19 crc kubenswrapper[4823]: I1216 07:22:19.091520 4823 scope.go:117] "RemoveContainer" containerID="3650c948a8c447960d8862d51fa7c7f894dc0569eb954a455acb9fd443967ab1" Dec 16 07:22:19 crc kubenswrapper[4823]: I1216 07:22:19.154766 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/7ad8e2a2-14c6-45b5-86f3-e4765cddd777-combined-ca-bundle\") pod \"7ad8e2a2-14c6-45b5-86f3-e4765cddd777\" (UID: \"7ad8e2a2-14c6-45b5-86f3-e4765cddd777\") " Dec 16 07:22:19 crc kubenswrapper[4823]: I1216 07:22:19.154867 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bvqlz\" (UniqueName: \"kubernetes.io/projected/7ad8e2a2-14c6-45b5-86f3-e4765cddd777-kube-api-access-bvqlz\") pod \"7ad8e2a2-14c6-45b5-86f3-e4765cddd777\" (UID: \"7ad8e2a2-14c6-45b5-86f3-e4765cddd777\") " Dec 16 07:22:19 crc kubenswrapper[4823]: I1216 07:22:19.154928 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ad8e2a2-14c6-45b5-86f3-e4765cddd777-config-data\") pod \"7ad8e2a2-14c6-45b5-86f3-e4765cddd777\" (UID: \"7ad8e2a2-14c6-45b5-86f3-e4765cddd777\") " Dec 16 07:22:19 crc kubenswrapper[4823]: I1216 07:22:19.159541 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ad8e2a2-14c6-45b5-86f3-e4765cddd777-kube-api-access-bvqlz" (OuterVolumeSpecName: "kube-api-access-bvqlz") pod "7ad8e2a2-14c6-45b5-86f3-e4765cddd777" (UID: "7ad8e2a2-14c6-45b5-86f3-e4765cddd777"). InnerVolumeSpecName "kube-api-access-bvqlz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:22:19 crc kubenswrapper[4823]: I1216 07:22:19.164416 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 16 07:22:19 crc kubenswrapper[4823]: I1216 07:22:19.164518 4823 scope.go:117] "RemoveContainer" containerID="3650c948a8c447960d8862d51fa7c7f894dc0569eb954a455acb9fd443967ab1" Dec 16 07:22:19 crc kubenswrapper[4823]: E1216 07:22:19.164970 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3650c948a8c447960d8862d51fa7c7f894dc0569eb954a455acb9fd443967ab1\": container with ID starting with 3650c948a8c447960d8862d51fa7c7f894dc0569eb954a455acb9fd443967ab1 not found: ID does not exist" containerID="3650c948a8c447960d8862d51fa7c7f894dc0569eb954a455acb9fd443967ab1" Dec 16 07:22:19 crc kubenswrapper[4823]: I1216 07:22:19.165003 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3650c948a8c447960d8862d51fa7c7f894dc0569eb954a455acb9fd443967ab1"} err="failed to get container status \"3650c948a8c447960d8862d51fa7c7f894dc0569eb954a455acb9fd443967ab1\": rpc error: code = NotFound desc = could not find container \"3650c948a8c447960d8862d51fa7c7f894dc0569eb954a455acb9fd443967ab1\": container with ID starting with 3650c948a8c447960d8862d51fa7c7f894dc0569eb954a455acb9fd443967ab1 not found: ID does not exist" Dec 16 07:22:19 crc kubenswrapper[4823]: I1216 07:22:19.165040 4823 scope.go:117] "RemoveContainer" containerID="c0cd38487b75afdb67a7225ee2f0fe111d46a163417ffe7f85edb1cbb15aead4" Dec 16 07:22:19 crc kubenswrapper[4823]: I1216 07:22:19.181382 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ad8e2a2-14c6-45b5-86f3-e4765cddd777-config-data" (OuterVolumeSpecName: "config-data") pod "7ad8e2a2-14c6-45b5-86f3-e4765cddd777" (UID: "7ad8e2a2-14c6-45b5-86f3-e4765cddd777"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:22:19 crc kubenswrapper[4823]: I1216 07:22:19.188533 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 16 07:22:19 crc kubenswrapper[4823]: I1216 07:22:19.190135 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ad8e2a2-14c6-45b5-86f3-e4765cddd777-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7ad8e2a2-14c6-45b5-86f3-e4765cddd777" (UID: "7ad8e2a2-14c6-45b5-86f3-e4765cddd777"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:22:19 crc kubenswrapper[4823]: I1216 07:22:19.205075 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 16 07:22:19 crc kubenswrapper[4823]: I1216 07:22:19.217815 4823 scope.go:117] "RemoveContainer" containerID="c0cd38487b75afdb67a7225ee2f0fe111d46a163417ffe7f85edb1cbb15aead4" Dec 16 07:22:19 crc kubenswrapper[4823]: E1216 07:22:19.218442 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0cd38487b75afdb67a7225ee2f0fe111d46a163417ffe7f85edb1cbb15aead4\": container with ID starting with c0cd38487b75afdb67a7225ee2f0fe111d46a163417ffe7f85edb1cbb15aead4 not found: ID does not exist" containerID="c0cd38487b75afdb67a7225ee2f0fe111d46a163417ffe7f85edb1cbb15aead4" Dec 16 07:22:19 crc kubenswrapper[4823]: I1216 07:22:19.218483 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0cd38487b75afdb67a7225ee2f0fe111d46a163417ffe7f85edb1cbb15aead4"} err="failed to get container status \"c0cd38487b75afdb67a7225ee2f0fe111d46a163417ffe7f85edb1cbb15aead4\": rpc error: code = NotFound desc = could not find container \"c0cd38487b75afdb67a7225ee2f0fe111d46a163417ffe7f85edb1cbb15aead4\": container with ID starting with 
c0cd38487b75afdb67a7225ee2f0fe111d46a163417ffe7f85edb1cbb15aead4 not found: ID does not exist" Dec 16 07:22:19 crc kubenswrapper[4823]: I1216 07:22:19.218511 4823 scope.go:117] "RemoveContainer" containerID="42c46777cdba45701a8cea0658eed9dc6daf90aae7d325c8008f6934dfe32214" Dec 16 07:22:19 crc kubenswrapper[4823]: I1216 07:22:19.223130 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 16 07:22:19 crc kubenswrapper[4823]: I1216 07:22:19.228794 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 16 07:22:19 crc kubenswrapper[4823]: I1216 07:22:19.242528 4823 scope.go:117] "RemoveContainer" containerID="eb53ac47f3a5804dff14f24e15141b3633409733b254cc498392acbc24442813" Dec 16 07:22:19 crc kubenswrapper[4823]: I1216 07:22:19.256485 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79a24114-2ee1-4cc0-9045-770fcf074950-combined-ca-bundle\") pod \"79a24114-2ee1-4cc0-9045-770fcf074950\" (UID: \"79a24114-2ee1-4cc0-9045-770fcf074950\") " Dec 16 07:22:19 crc kubenswrapper[4823]: I1216 07:22:19.256596 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zfk5f\" (UniqueName: \"kubernetes.io/projected/79a24114-2ee1-4cc0-9045-770fcf074950-kube-api-access-zfk5f\") pod \"79a24114-2ee1-4cc0-9045-770fcf074950\" (UID: \"79a24114-2ee1-4cc0-9045-770fcf074950\") " Dec 16 07:22:19 crc kubenswrapper[4823]: I1216 07:22:19.256652 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79a24114-2ee1-4cc0-9045-770fcf074950-config-data\") pod \"79a24114-2ee1-4cc0-9045-770fcf074950\" (UID: \"79a24114-2ee1-4cc0-9045-770fcf074950\") " Dec 16 07:22:19 crc kubenswrapper[4823]: I1216 07:22:19.256949 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7ad8e2a2-14c6-45b5-86f3-e4765cddd777-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:19 crc kubenswrapper[4823]: I1216 07:22:19.256968 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bvqlz\" (UniqueName: \"kubernetes.io/projected/7ad8e2a2-14c6-45b5-86f3-e4765cddd777-kube-api-access-bvqlz\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:19 crc kubenswrapper[4823]: I1216 07:22:19.256979 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ad8e2a2-14c6-45b5-86f3-e4765cddd777-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:19 crc kubenswrapper[4823]: I1216 07:22:19.259806 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79a24114-2ee1-4cc0-9045-770fcf074950-kube-api-access-zfk5f" (OuterVolumeSpecName: "kube-api-access-zfk5f") pod "79a24114-2ee1-4cc0-9045-770fcf074950" (UID: "79a24114-2ee1-4cc0-9045-770fcf074950"). InnerVolumeSpecName "kube-api-access-zfk5f". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:22:19 crc kubenswrapper[4823]: I1216 07:22:19.262816 4823 scope.go:117] "RemoveContainer" containerID="c5e2a5c0a31ec4e9f5ca94f30d407742eb67d1531e0744587618b59f17762294" Dec 16 07:22:19 crc kubenswrapper[4823]: I1216 07:22:19.275980 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79a24114-2ee1-4cc0-9045-770fcf074950-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "79a24114-2ee1-4cc0-9045-770fcf074950" (UID: "79a24114-2ee1-4cc0-9045-770fcf074950"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:22:19 crc kubenswrapper[4823]: I1216 07:22:19.277243 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79a24114-2ee1-4cc0-9045-770fcf074950-config-data" (OuterVolumeSpecName: "config-data") pod "79a24114-2ee1-4cc0-9045-770fcf074950" (UID: "79a24114-2ee1-4cc0-9045-770fcf074950"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:22:19 crc kubenswrapper[4823]: I1216 07:22:19.280835 4823 scope.go:117] "RemoveContainer" containerID="fdc90f0e714e4f423158468800e55aec113c93ffa463f7e4d06ce66853197e2c" Dec 16 07:22:19 crc kubenswrapper[4823]: I1216 07:22:19.300273 4823 scope.go:117] "RemoveContainer" containerID="42c46777cdba45701a8cea0658eed9dc6daf90aae7d325c8008f6934dfe32214" Dec 16 07:22:19 crc kubenswrapper[4823]: E1216 07:22:19.300765 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42c46777cdba45701a8cea0658eed9dc6daf90aae7d325c8008f6934dfe32214\": container with ID starting with 42c46777cdba45701a8cea0658eed9dc6daf90aae7d325c8008f6934dfe32214 not found: ID does not exist" containerID="42c46777cdba45701a8cea0658eed9dc6daf90aae7d325c8008f6934dfe32214" Dec 16 07:22:19 crc kubenswrapper[4823]: I1216 07:22:19.300793 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42c46777cdba45701a8cea0658eed9dc6daf90aae7d325c8008f6934dfe32214"} err="failed to get container status \"42c46777cdba45701a8cea0658eed9dc6daf90aae7d325c8008f6934dfe32214\": rpc error: code = NotFound desc = could not find container \"42c46777cdba45701a8cea0658eed9dc6daf90aae7d325c8008f6934dfe32214\": container with ID starting with 42c46777cdba45701a8cea0658eed9dc6daf90aae7d325c8008f6934dfe32214 not found: ID does not exist" Dec 16 07:22:19 crc kubenswrapper[4823]: I1216 07:22:19.300812 4823 scope.go:117] "RemoveContainer" 
containerID="eb53ac47f3a5804dff14f24e15141b3633409733b254cc498392acbc24442813" Dec 16 07:22:19 crc kubenswrapper[4823]: E1216 07:22:19.301175 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb53ac47f3a5804dff14f24e15141b3633409733b254cc498392acbc24442813\": container with ID starting with eb53ac47f3a5804dff14f24e15141b3633409733b254cc498392acbc24442813 not found: ID does not exist" containerID="eb53ac47f3a5804dff14f24e15141b3633409733b254cc498392acbc24442813" Dec 16 07:22:19 crc kubenswrapper[4823]: I1216 07:22:19.301196 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb53ac47f3a5804dff14f24e15141b3633409733b254cc498392acbc24442813"} err="failed to get container status \"eb53ac47f3a5804dff14f24e15141b3633409733b254cc498392acbc24442813\": rpc error: code = NotFound desc = could not find container \"eb53ac47f3a5804dff14f24e15141b3633409733b254cc498392acbc24442813\": container with ID starting with eb53ac47f3a5804dff14f24e15141b3633409733b254cc498392acbc24442813 not found: ID does not exist" Dec 16 07:22:19 crc kubenswrapper[4823]: I1216 07:22:19.301208 4823 scope.go:117] "RemoveContainer" containerID="c5e2a5c0a31ec4e9f5ca94f30d407742eb67d1531e0744587618b59f17762294" Dec 16 07:22:19 crc kubenswrapper[4823]: E1216 07:22:19.301524 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5e2a5c0a31ec4e9f5ca94f30d407742eb67d1531e0744587618b59f17762294\": container with ID starting with c5e2a5c0a31ec4e9f5ca94f30d407742eb67d1531e0744587618b59f17762294 not found: ID does not exist" containerID="c5e2a5c0a31ec4e9f5ca94f30d407742eb67d1531e0744587618b59f17762294" Dec 16 07:22:19 crc kubenswrapper[4823]: I1216 07:22:19.301545 4823 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"c5e2a5c0a31ec4e9f5ca94f30d407742eb67d1531e0744587618b59f17762294"} err="failed to get container status \"c5e2a5c0a31ec4e9f5ca94f30d407742eb67d1531e0744587618b59f17762294\": rpc error: code = NotFound desc = could not find container \"c5e2a5c0a31ec4e9f5ca94f30d407742eb67d1531e0744587618b59f17762294\": container with ID starting with c5e2a5c0a31ec4e9f5ca94f30d407742eb67d1531e0744587618b59f17762294 not found: ID does not exist" Dec 16 07:22:19 crc kubenswrapper[4823]: I1216 07:22:19.301557 4823 scope.go:117] "RemoveContainer" containerID="fdc90f0e714e4f423158468800e55aec113c93ffa463f7e4d06ce66853197e2c" Dec 16 07:22:19 crc kubenswrapper[4823]: E1216 07:22:19.301813 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fdc90f0e714e4f423158468800e55aec113c93ffa463f7e4d06ce66853197e2c\": container with ID starting with fdc90f0e714e4f423158468800e55aec113c93ffa463f7e4d06ce66853197e2c not found: ID does not exist" containerID="fdc90f0e714e4f423158468800e55aec113c93ffa463f7e4d06ce66853197e2c" Dec 16 07:22:19 crc kubenswrapper[4823]: I1216 07:22:19.301835 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fdc90f0e714e4f423158468800e55aec113c93ffa463f7e4d06ce66853197e2c"} err="failed to get container status \"fdc90f0e714e4f423158468800e55aec113c93ffa463f7e4d06ce66853197e2c\": rpc error: code = NotFound desc = could not find container \"fdc90f0e714e4f423158468800e55aec113c93ffa463f7e4d06ce66853197e2c\": container with ID starting with fdc90f0e714e4f423158468800e55aec113c93ffa463f7e4d06ce66853197e2c not found: ID does not exist" Dec 16 07:22:19 crc kubenswrapper[4823]: I1216 07:22:19.358483 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zfk5f\" (UniqueName: \"kubernetes.io/projected/79a24114-2ee1-4cc0-9045-770fcf074950-kube-api-access-zfk5f\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:19 
crc kubenswrapper[4823]: I1216 07:22:19.358552 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79a24114-2ee1-4cc0-9045-770fcf074950-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:19 crc kubenswrapper[4823]: I1216 07:22:19.358563 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79a24114-2ee1-4cc0-9045-770fcf074950-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:19 crc kubenswrapper[4823]: I1216 07:22:19.566110 4823 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="6d2faec4-82e9-409b-a6c1-93f8cd78b9ec" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.200:8775/\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 16 07:22:19 crc kubenswrapper[4823]: I1216 07:22:19.566349 4823 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="6d2faec4-82e9-409b-a6c1-93f8cd78b9ec" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.200:8775/\": context deadline exceeded" Dec 16 07:22:19 crc kubenswrapper[4823]: I1216 07:22:19.772292 4823 scope.go:117] "RemoveContainer" containerID="37b5da4c3e0632087412acf947c72a2aad7577385641e763185ee25747c43921" Dec 16 07:22:19 crc kubenswrapper[4823]: E1216 07:22:19.772847 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 07:22:19 crc kubenswrapper[4823]: I1216 07:22:19.781777 4823 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22db0f3f-88b5-4909-aa80-f4b020d1ce18" path="/var/lib/kubelet/pods/22db0f3f-88b5-4909-aa80-f4b020d1ce18/volumes" Dec 16 07:22:19 crc kubenswrapper[4823]: I1216 07:22:19.782850 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe879e4-70bf-4f38-a4a7-98f5eb23a769" path="/var/lib/kubelet/pods/5fe879e4-70bf-4f38-a4a7-98f5eb23a769/volumes" Dec 16 07:22:19 crc kubenswrapper[4823]: I1216 07:22:19.783795 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77e933eb-7294-47b8-af0c-fbb03725d3d8" path="/var/lib/kubelet/pods/77e933eb-7294-47b8-af0c-fbb03725d3d8/volumes" Dec 16 07:22:19 crc kubenswrapper[4823]: I1216 07:22:19.785561 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a18c5d6c-3429-4aa3-b933-85176e0e5ece" path="/var/lib/kubelet/pods/a18c5d6c-3429-4aa3-b933-85176e0e5ece/volumes" Dec 16 07:22:19 crc kubenswrapper[4823]: I1216 07:22:19.786616 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a686a945-8fa0-406c-ac01-cf061c865a28" path="/var/lib/kubelet/pods/a686a945-8fa0-406c-ac01-cf061c865a28/volumes" Dec 16 07:22:19 crc kubenswrapper[4823]: I1216 07:22:19.788375 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1cd2ecc-0f95-47b9-b43e-5c36fa6a33b6" path="/var/lib/kubelet/pods/b1cd2ecc-0f95-47b9-b43e-5c36fa6a33b6/volumes" Dec 16 07:22:19 crc kubenswrapper[4823]: I1216 07:22:19.789639 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1" path="/var/lib/kubelet/pods/cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1/volumes" Dec 16 07:22:19 crc kubenswrapper[4823]: I1216 07:22:19.790505 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7a88b40-28bf-4b43-bed8-0b3df3baec5c" path="/var/lib/kubelet/pods/d7a88b40-28bf-4b43-bed8-0b3df3baec5c/volumes" Dec 16 07:22:19 crc kubenswrapper[4823]: I1216 07:22:19.868676 4823 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"79a24114-2ee1-4cc0-9045-770fcf074950","Type":"ContainerDied","Data":"16e9052b9673da895f792777b9356bdea60548993b3b320610c25c067da7b775"} Dec 16 07:22:19 crc kubenswrapper[4823]: I1216 07:22:19.868735 4823 scope.go:117] "RemoveContainer" containerID="e121b5fc19f8847f31857c92e1abac87de929236af3edad6305ba6de36abc8a3" Dec 16 07:22:19 crc kubenswrapper[4823]: I1216 07:22:19.868753 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 16 07:22:19 crc kubenswrapper[4823]: I1216 07:22:19.891588 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"7ad8e2a2-14c6-45b5-86f3-e4765cddd777","Type":"ContainerDied","Data":"1adc30d428b5b9d7595ca81cc6e015f7c66aa26f79ddf2c7abdc5fafc704a8df"} Dec 16 07:22:19 crc kubenswrapper[4823]: I1216 07:22:19.891935 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 16 07:22:19 crc kubenswrapper[4823]: I1216 07:22:19.897996 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 16 07:22:19 crc kubenswrapper[4823]: I1216 07:22:19.900902 4823 scope.go:117] "RemoveContainer" containerID="51e78213653e84ab99bbc7625548d55635dfcb54de59c8fec91ff584c2afb7a9" Dec 16 07:22:19 crc kubenswrapper[4823]: I1216 07:22:19.909908 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 16 07:22:19 crc kubenswrapper[4823]: I1216 07:22:19.925789 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 16 07:22:19 crc kubenswrapper[4823]: I1216 07:22:19.938397 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 16 07:22:19 crc kubenswrapper[4823]: E1216 07:22:19.945516 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a75ddf05569eb842ee7ced1c6941dc65b4fda1943932e8021d1066d4d5ed3578 is running failed: container process not found" containerID="a75ddf05569eb842ee7ced1c6941dc65b4fda1943932e8021d1066d4d5ed3578" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 16 07:22:19 crc kubenswrapper[4823]: E1216 07:22:19.945992 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a75ddf05569eb842ee7ced1c6941dc65b4fda1943932e8021d1066d4d5ed3578 is running failed: container process not found" containerID="a75ddf05569eb842ee7ced1c6941dc65b4fda1943932e8021d1066d4d5ed3578" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 16 07:22:19 crc kubenswrapper[4823]: E1216 07:22:19.946288 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container 
is not created or running: checking if PID of a75ddf05569eb842ee7ced1c6941dc65b4fda1943932e8021d1066d4d5ed3578 is running failed: container process not found" containerID="a75ddf05569eb842ee7ced1c6941dc65b4fda1943932e8021d1066d4d5ed3578" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 16 07:22:19 crc kubenswrapper[4823]: E1216 07:22:19.946354 4823 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a75ddf05569eb842ee7ced1c6941dc65b4fda1943932e8021d1066d4d5ed3578 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-29jcz" podUID="4edb9072-dfce-44ca-88d3-64136ac7e1c3" containerName="ovsdb-server" Dec 16 07:22:19 crc kubenswrapper[4823]: E1216 07:22:19.947176 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="143890d9503ac11d18cb9ffe222557c6fbf01e56e0ee7fe6c9718deb211756f0" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 16 07:22:19 crc kubenswrapper[4823]: E1216 07:22:19.948997 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="143890d9503ac11d18cb9ffe222557c6fbf01e56e0ee7fe6c9718deb211756f0" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 16 07:22:19 crc kubenswrapper[4823]: E1216 07:22:19.950632 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="143890d9503ac11d18cb9ffe222557c6fbf01e56e0ee7fe6c9718deb211756f0" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 16 07:22:19 crc kubenswrapper[4823]: 
E1216 07:22:19.950675 4823 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-29jcz" podUID="4edb9072-dfce-44ca-88d3-64136ac7e1c3" containerName="ovs-vswitchd" Dec 16 07:22:21 crc kubenswrapper[4823]: I1216 07:22:21.787124 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79a24114-2ee1-4cc0-9045-770fcf074950" path="/var/lib/kubelet/pods/79a24114-2ee1-4cc0-9045-770fcf074950/volumes" Dec 16 07:22:21 crc kubenswrapper[4823]: I1216 07:22:21.788236 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ad8e2a2-14c6-45b5-86f3-e4765cddd777" path="/var/lib/kubelet/pods/7ad8e2a2-14c6-45b5-86f3-e4765cddd777/volumes" Dec 16 07:22:22 crc kubenswrapper[4823]: E1216 07:22:22.206602 4823 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 16 07:22:22 crc kubenswrapper[4823]: E1216 07:22:22.207120 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ec00a24a-8417-452e-a350-b46f36d4a84d-operator-scripts podName:ec00a24a-8417-452e-a350-b46f36d4a84d nodeName:}" failed. No retries permitted until 2025-12-16 07:22:30.207079464 +0000 UTC m=+1628.695645597 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/ec00a24a-8417-452e-a350-b46f36d4a84d-operator-scripts") pod "cinder1f3e-account-delete-q5pns" (UID: "ec00a24a-8417-452e-a350-b46f36d4a84d") : configmap "openstack-scripts" not found Dec 16 07:22:22 crc kubenswrapper[4823]: E1216 07:22:22.308921 4823 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 16 07:22:22 crc kubenswrapper[4823]: E1216 07:22:22.308974 4823 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 16 07:22:22 crc kubenswrapper[4823]: E1216 07:22:22.309058 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4a3f54ee-1dba-42f5-8697-b70de0f5b4c2-operator-scripts podName:4a3f54ee-1dba-42f5-8697-b70de0f5b4c2 nodeName:}" failed. No retries permitted until 2025-12-16 07:22:30.309009136 +0000 UTC m=+1628.797575269 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/4a3f54ee-1dba-42f5-8697-b70de0f5b4c2-operator-scripts") pod "neutronba48-account-delete-87d8j" (UID: "4a3f54ee-1dba-42f5-8697-b70de0f5b4c2") : configmap "openstack-scripts" not found Dec 16 07:22:22 crc kubenswrapper[4823]: E1216 07:22:22.309097 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/dfd7efdc-36ba-4037-9f6c-a1a8c946ab33-operator-scripts podName:dfd7efdc-36ba-4037-9f6c-a1a8c946ab33 nodeName:}" failed. No retries permitted until 2025-12-16 07:22:30.309088728 +0000 UTC m=+1628.797654861 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/dfd7efdc-36ba-4037-9f6c-a1a8c946ab33-operator-scripts") pod "novacell06c77-account-delete-5jkkk" (UID: "dfd7efdc-36ba-4037-9f6c-a1a8c946ab33") : configmap "openstack-scripts" not found Dec 16 07:22:22 crc kubenswrapper[4823]: E1216 07:22:22.309197 4823 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 16 07:22:22 crc kubenswrapper[4823]: E1216 07:22:22.309301 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/65278526-b5ee-4e40-b66b-1ee9b993f429-operator-scripts podName:65278526-b5ee-4e40-b66b-1ee9b993f429 nodeName:}" failed. No retries permitted until 2025-12-16 07:22:30.309276944 +0000 UTC m=+1628.797843077 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/65278526-b5ee-4e40-b66b-1ee9b993f429-operator-scripts") pod "novaapic1ba-account-delete-sldxg" (UID: "65278526-b5ee-4e40-b66b-1ee9b993f429") : configmap "openstack-scripts" not found Dec 16 07:22:24 crc kubenswrapper[4823]: E1216 07:22:24.947902 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a75ddf05569eb842ee7ced1c6941dc65b4fda1943932e8021d1066d4d5ed3578 is running failed: container process not found" containerID="a75ddf05569eb842ee7ced1c6941dc65b4fda1943932e8021d1066d4d5ed3578" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 16 07:22:24 crc kubenswrapper[4823]: E1216 07:22:24.948746 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="143890d9503ac11d18cb9ffe222557c6fbf01e56e0ee7fe6c9718deb211756f0" 
cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 16 07:22:24 crc kubenswrapper[4823]: E1216 07:22:24.948807 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a75ddf05569eb842ee7ced1c6941dc65b4fda1943932e8021d1066d4d5ed3578 is running failed: container process not found" containerID="a75ddf05569eb842ee7ced1c6941dc65b4fda1943932e8021d1066d4d5ed3578" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 16 07:22:24 crc kubenswrapper[4823]: E1216 07:22:24.949418 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a75ddf05569eb842ee7ced1c6941dc65b4fda1943932e8021d1066d4d5ed3578 is running failed: container process not found" containerID="a75ddf05569eb842ee7ced1c6941dc65b4fda1943932e8021d1066d4d5ed3578" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 16 07:22:24 crc kubenswrapper[4823]: E1216 07:22:24.949452 4823 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a75ddf05569eb842ee7ced1c6941dc65b4fda1943932e8021d1066d4d5ed3578 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-29jcz" podUID="4edb9072-dfce-44ca-88d3-64136ac7e1c3" containerName="ovsdb-server" Dec 16 07:22:24 crc kubenswrapper[4823]: E1216 07:22:24.950277 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="143890d9503ac11d18cb9ffe222557c6fbf01e56e0ee7fe6c9718deb211756f0" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 16 07:22:24 crc kubenswrapper[4823]: E1216 07:22:24.951781 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc 
error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="143890d9503ac11d18cb9ffe222557c6fbf01e56e0ee7fe6c9718deb211756f0" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 16 07:22:24 crc kubenswrapper[4823]: E1216 07:22:24.951860 4823 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-29jcz" podUID="4edb9072-dfce-44ca-88d3-64136ac7e1c3" containerName="ovs-vswitchd" Dec 16 07:22:29 crc kubenswrapper[4823]: E1216 07:22:29.945648 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a75ddf05569eb842ee7ced1c6941dc65b4fda1943932e8021d1066d4d5ed3578 is running failed: container process not found" containerID="a75ddf05569eb842ee7ced1c6941dc65b4fda1943932e8021d1066d4d5ed3578" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 16 07:22:29 crc kubenswrapper[4823]: E1216 07:22:29.947526 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a75ddf05569eb842ee7ced1c6941dc65b4fda1943932e8021d1066d4d5ed3578 is running failed: container process not found" containerID="a75ddf05569eb842ee7ced1c6941dc65b4fda1943932e8021d1066d4d5ed3578" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 16 07:22:29 crc kubenswrapper[4823]: E1216 07:22:29.947870 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="143890d9503ac11d18cb9ffe222557c6fbf01e56e0ee7fe6c9718deb211756f0" 
cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 16 07:22:29 crc kubenswrapper[4823]: E1216 07:22:29.948128 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a75ddf05569eb842ee7ced1c6941dc65b4fda1943932e8021d1066d4d5ed3578 is running failed: container process not found" containerID="a75ddf05569eb842ee7ced1c6941dc65b4fda1943932e8021d1066d4d5ed3578" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 16 07:22:29 crc kubenswrapper[4823]: E1216 07:22:29.948161 4823 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a75ddf05569eb842ee7ced1c6941dc65b4fda1943932e8021d1066d4d5ed3578 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-29jcz" podUID="4edb9072-dfce-44ca-88d3-64136ac7e1c3" containerName="ovsdb-server" Dec 16 07:22:29 crc kubenswrapper[4823]: E1216 07:22:29.952517 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="143890d9503ac11d18cb9ffe222557c6fbf01e56e0ee7fe6c9718deb211756f0" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 16 07:22:29 crc kubenswrapper[4823]: E1216 07:22:29.954544 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="143890d9503ac11d18cb9ffe222557c6fbf01e56e0ee7fe6c9718deb211756f0" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 16 07:22:29 crc kubenswrapper[4823]: E1216 07:22:29.954596 4823 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, 
stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-29jcz" podUID="4edb9072-dfce-44ca-88d3-64136ac7e1c3" containerName="ovs-vswitchd" Dec 16 07:22:30 crc kubenswrapper[4823]: E1216 07:22:30.234922 4823 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 16 07:22:30 crc kubenswrapper[4823]: E1216 07:22:30.234990 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ec00a24a-8417-452e-a350-b46f36d4a84d-operator-scripts podName:ec00a24a-8417-452e-a350-b46f36d4a84d nodeName:}" failed. No retries permitted until 2025-12-16 07:22:46.234976852 +0000 UTC m=+1644.723542975 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/ec00a24a-8417-452e-a350-b46f36d4a84d-operator-scripts") pod "cinder1f3e-account-delete-q5pns" (UID: "ec00a24a-8417-452e-a350-b46f36d4a84d") : configmap "openstack-scripts" not found Dec 16 07:22:30 crc kubenswrapper[4823]: E1216 07:22:30.336894 4823 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 16 07:22:30 crc kubenswrapper[4823]: E1216 07:22:30.336962 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/65278526-b5ee-4e40-b66b-1ee9b993f429-operator-scripts podName:65278526-b5ee-4e40-b66b-1ee9b993f429 nodeName:}" failed. No retries permitted until 2025-12-16 07:22:46.336948587 +0000 UTC m=+1644.825514710 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/65278526-b5ee-4e40-b66b-1ee9b993f429-operator-scripts") pod "novaapic1ba-account-delete-sldxg" (UID: "65278526-b5ee-4e40-b66b-1ee9b993f429") : configmap "openstack-scripts" not found Dec 16 07:22:30 crc kubenswrapper[4823]: E1216 07:22:30.337309 4823 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 16 07:22:30 crc kubenswrapper[4823]: E1216 07:22:30.337337 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4a3f54ee-1dba-42f5-8697-b70de0f5b4c2-operator-scripts podName:4a3f54ee-1dba-42f5-8697-b70de0f5b4c2 nodeName:}" failed. No retries permitted until 2025-12-16 07:22:46.337328959 +0000 UTC m=+1644.825895082 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/4a3f54ee-1dba-42f5-8697-b70de0f5b4c2-operator-scripts") pod "neutronba48-account-delete-87d8j" (UID: "4a3f54ee-1dba-42f5-8697-b70de0f5b4c2") : configmap "openstack-scripts" not found Dec 16 07:22:30 crc kubenswrapper[4823]: E1216 07:22:30.337503 4823 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 16 07:22:30 crc kubenswrapper[4823]: E1216 07:22:30.337668 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/dfd7efdc-36ba-4037-9f6c-a1a8c946ab33-operator-scripts podName:dfd7efdc-36ba-4037-9f6c-a1a8c946ab33 nodeName:}" failed. No retries permitted until 2025-12-16 07:22:46.33765277 +0000 UTC m=+1644.826218913 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/dfd7efdc-36ba-4037-9f6c-a1a8c946ab33-operator-scripts") pod "novacell06c77-account-delete-5jkkk" (UID: "dfd7efdc-36ba-4037-9f6c-a1a8c946ab33") : configmap "openstack-scripts" not found Dec 16 07:22:32 crc kubenswrapper[4823]: I1216 07:22:32.041781 4823 generic.go:334] "Generic (PLEG): container finished" podID="f2b1ed60-7cb0-48f0-aebf-3de778dbb95b" containerID="4a902115438412f167a7c224fe223d644746f437002cb2288beb05ad185be48a" exitCode=0 Dec 16 07:22:32 crc kubenswrapper[4823]: I1216 07:22:32.041882 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-bbf9986cc-sjljb" event={"ID":"f2b1ed60-7cb0-48f0-aebf-3de778dbb95b","Type":"ContainerDied","Data":"4a902115438412f167a7c224fe223d644746f437002cb2288beb05ad185be48a"} Dec 16 07:22:32 crc kubenswrapper[4823]: I1216 07:22:32.503065 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-bbf9986cc-sjljb" Dec 16 07:22:32 crc kubenswrapper[4823]: I1216 07:22:32.575745 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f2b1ed60-7cb0-48f0-aebf-3de778dbb95b-config\") pod \"f2b1ed60-7cb0-48f0-aebf-3de778dbb95b\" (UID: \"f2b1ed60-7cb0-48f0-aebf-3de778dbb95b\") " Dec 16 07:22:32 crc kubenswrapper[4823]: I1216 07:22:32.576175 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2b1ed60-7cb0-48f0-aebf-3de778dbb95b-public-tls-certs\") pod \"f2b1ed60-7cb0-48f0-aebf-3de778dbb95b\" (UID: \"f2b1ed60-7cb0-48f0-aebf-3de778dbb95b\") " Dec 16 07:22:32 crc kubenswrapper[4823]: I1216 07:22:32.576234 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2b1ed60-7cb0-48f0-aebf-3de778dbb95b-ovndb-tls-certs\") pod 
\"f2b1ed60-7cb0-48f0-aebf-3de778dbb95b\" (UID: \"f2b1ed60-7cb0-48f0-aebf-3de778dbb95b\") " Dec 16 07:22:32 crc kubenswrapper[4823]: I1216 07:22:32.576269 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2b1ed60-7cb0-48f0-aebf-3de778dbb95b-combined-ca-bundle\") pod \"f2b1ed60-7cb0-48f0-aebf-3de778dbb95b\" (UID: \"f2b1ed60-7cb0-48f0-aebf-3de778dbb95b\") " Dec 16 07:22:32 crc kubenswrapper[4823]: I1216 07:22:32.576361 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hbtbf\" (UniqueName: \"kubernetes.io/projected/f2b1ed60-7cb0-48f0-aebf-3de778dbb95b-kube-api-access-hbtbf\") pod \"f2b1ed60-7cb0-48f0-aebf-3de778dbb95b\" (UID: \"f2b1ed60-7cb0-48f0-aebf-3de778dbb95b\") " Dec 16 07:22:32 crc kubenswrapper[4823]: I1216 07:22:32.576435 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2b1ed60-7cb0-48f0-aebf-3de778dbb95b-internal-tls-certs\") pod \"f2b1ed60-7cb0-48f0-aebf-3de778dbb95b\" (UID: \"f2b1ed60-7cb0-48f0-aebf-3de778dbb95b\") " Dec 16 07:22:32 crc kubenswrapper[4823]: I1216 07:22:32.576574 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f2b1ed60-7cb0-48f0-aebf-3de778dbb95b-httpd-config\") pod \"f2b1ed60-7cb0-48f0-aebf-3de778dbb95b\" (UID: \"f2b1ed60-7cb0-48f0-aebf-3de778dbb95b\") " Dec 16 07:22:32 crc kubenswrapper[4823]: I1216 07:22:32.582090 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2b1ed60-7cb0-48f0-aebf-3de778dbb95b-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "f2b1ed60-7cb0-48f0-aebf-3de778dbb95b" (UID: "f2b1ed60-7cb0-48f0-aebf-3de778dbb95b"). InnerVolumeSpecName "httpd-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:22:32 crc kubenswrapper[4823]: I1216 07:22:32.583599 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2b1ed60-7cb0-48f0-aebf-3de778dbb95b-kube-api-access-hbtbf" (OuterVolumeSpecName: "kube-api-access-hbtbf") pod "f2b1ed60-7cb0-48f0-aebf-3de778dbb95b" (UID: "f2b1ed60-7cb0-48f0-aebf-3de778dbb95b"). InnerVolumeSpecName "kube-api-access-hbtbf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:22:32 crc kubenswrapper[4823]: I1216 07:22:32.615634 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2b1ed60-7cb0-48f0-aebf-3de778dbb95b-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "f2b1ed60-7cb0-48f0-aebf-3de778dbb95b" (UID: "f2b1ed60-7cb0-48f0-aebf-3de778dbb95b"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:22:32 crc kubenswrapper[4823]: I1216 07:22:32.616673 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2b1ed60-7cb0-48f0-aebf-3de778dbb95b-config" (OuterVolumeSpecName: "config") pod "f2b1ed60-7cb0-48f0-aebf-3de778dbb95b" (UID: "f2b1ed60-7cb0-48f0-aebf-3de778dbb95b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:22:32 crc kubenswrapper[4823]: I1216 07:22:32.618091 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2b1ed60-7cb0-48f0-aebf-3de778dbb95b-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "f2b1ed60-7cb0-48f0-aebf-3de778dbb95b" (UID: "f2b1ed60-7cb0-48f0-aebf-3de778dbb95b"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:22:32 crc kubenswrapper[4823]: I1216 07:22:32.623842 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2b1ed60-7cb0-48f0-aebf-3de778dbb95b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f2b1ed60-7cb0-48f0-aebf-3de778dbb95b" (UID: "f2b1ed60-7cb0-48f0-aebf-3de778dbb95b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:22:32 crc kubenswrapper[4823]: I1216 07:22:32.638254 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2b1ed60-7cb0-48f0-aebf-3de778dbb95b-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "f2b1ed60-7cb0-48f0-aebf-3de778dbb95b" (UID: "f2b1ed60-7cb0-48f0-aebf-3de778dbb95b"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:22:32 crc kubenswrapper[4823]: I1216 07:22:32.679297 4823 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f2b1ed60-7cb0-48f0-aebf-3de778dbb95b-httpd-config\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:32 crc kubenswrapper[4823]: I1216 07:22:32.679341 4823 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/f2b1ed60-7cb0-48f0-aebf-3de778dbb95b-config\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:32 crc kubenswrapper[4823]: I1216 07:22:32.679356 4823 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2b1ed60-7cb0-48f0-aebf-3de778dbb95b-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:32 crc kubenswrapper[4823]: I1216 07:22:32.679397 4823 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2b1ed60-7cb0-48f0-aebf-3de778dbb95b-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 
07:22:32 crc kubenswrapper[4823]: I1216 07:22:32.679430 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2b1ed60-7cb0-48f0-aebf-3de778dbb95b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:32 crc kubenswrapper[4823]: I1216 07:22:32.679443 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hbtbf\" (UniqueName: \"kubernetes.io/projected/f2b1ed60-7cb0-48f0-aebf-3de778dbb95b-kube-api-access-hbtbf\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:32 crc kubenswrapper[4823]: I1216 07:22:32.679457 4823 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2b1ed60-7cb0-48f0-aebf-3de778dbb95b-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:32 crc kubenswrapper[4823]: I1216 07:22:32.771818 4823 scope.go:117] "RemoveContainer" containerID="37b5da4c3e0632087412acf947c72a2aad7577385641e763185ee25747c43921" Dec 16 07:22:32 crc kubenswrapper[4823]: E1216 07:22:32.772240 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 07:22:33 crc kubenswrapper[4823]: I1216 07:22:33.054969 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-bbf9986cc-sjljb" event={"ID":"f2b1ed60-7cb0-48f0-aebf-3de778dbb95b","Type":"ContainerDied","Data":"b946699914bb5413be32189c06d97b3818db2811af84ebe07b2bb0e71fc2447b"} Dec 16 07:22:33 crc kubenswrapper[4823]: I1216 07:22:33.055067 4823 scope.go:117] "RemoveContainer" containerID="63b9a035e047de6a0a1943c6d043167a9dedd896ef10da24426158630e0de9b7" Dec 16 07:22:33 crc 
kubenswrapper[4823]: I1216 07:22:33.055162 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-bbf9986cc-sjljb" Dec 16 07:22:33 crc kubenswrapper[4823]: I1216 07:22:33.087741 4823 scope.go:117] "RemoveContainer" containerID="4a902115438412f167a7c224fe223d644746f437002cb2288beb05ad185be48a" Dec 16 07:22:33 crc kubenswrapper[4823]: I1216 07:22:33.109359 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-bbf9986cc-sjljb"] Dec 16 07:22:33 crc kubenswrapper[4823]: I1216 07:22:33.114703 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-bbf9986cc-sjljb"] Dec 16 07:22:33 crc kubenswrapper[4823]: I1216 07:22:33.787749 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2b1ed60-7cb0-48f0-aebf-3de778dbb95b" path="/var/lib/kubelet/pods/f2b1ed60-7cb0-48f0-aebf-3de778dbb95b/volumes" Dec 16 07:22:34 crc kubenswrapper[4823]: E1216 07:22:34.944771 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a75ddf05569eb842ee7ced1c6941dc65b4fda1943932e8021d1066d4d5ed3578 is running failed: container process not found" containerID="a75ddf05569eb842ee7ced1c6941dc65b4fda1943932e8021d1066d4d5ed3578" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 16 07:22:34 crc kubenswrapper[4823]: E1216 07:22:34.945408 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a75ddf05569eb842ee7ced1c6941dc65b4fda1943932e8021d1066d4d5ed3578 is running failed: container process not found" containerID="a75ddf05569eb842ee7ced1c6941dc65b4fda1943932e8021d1066d4d5ed3578" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 16 07:22:34 crc kubenswrapper[4823]: E1216 07:22:34.946189 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc 
error: code = NotFound desc = container is not created or running: checking if PID of a75ddf05569eb842ee7ced1c6941dc65b4fda1943932e8021d1066d4d5ed3578 is running failed: container process not found" containerID="a75ddf05569eb842ee7ced1c6941dc65b4fda1943932e8021d1066d4d5ed3578" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 16 07:22:34 crc kubenswrapper[4823]: E1216 07:22:34.946242 4823 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a75ddf05569eb842ee7ced1c6941dc65b4fda1943932e8021d1066d4d5ed3578 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-29jcz" podUID="4edb9072-dfce-44ca-88d3-64136ac7e1c3" containerName="ovsdb-server" Dec 16 07:22:34 crc kubenswrapper[4823]: E1216 07:22:34.947010 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="143890d9503ac11d18cb9ffe222557c6fbf01e56e0ee7fe6c9718deb211756f0" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 16 07:22:34 crc kubenswrapper[4823]: E1216 07:22:34.948427 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="143890d9503ac11d18cb9ffe222557c6fbf01e56e0ee7fe6c9718deb211756f0" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 16 07:22:34 crc kubenswrapper[4823]: E1216 07:22:34.950095 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="143890d9503ac11d18cb9ffe222557c6fbf01e56e0ee7fe6c9718deb211756f0" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] 
Dec 16 07:22:34 crc kubenswrapper[4823]: E1216 07:22:34.950135 4823 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-29jcz" podUID="4edb9072-dfce-44ca-88d3-64136ac7e1c3" containerName="ovs-vswitchd" Dec 16 07:22:39 crc kubenswrapper[4823]: E1216 07:22:39.946621 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 143890d9503ac11d18cb9ffe222557c6fbf01e56e0ee7fe6c9718deb211756f0 is running failed: container process not found" containerID="143890d9503ac11d18cb9ffe222557c6fbf01e56e0ee7fe6c9718deb211756f0" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 16 07:22:39 crc kubenswrapper[4823]: E1216 07:22:39.946869 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a75ddf05569eb842ee7ced1c6941dc65b4fda1943932e8021d1066d4d5ed3578 is running failed: container process not found" containerID="a75ddf05569eb842ee7ced1c6941dc65b4fda1943932e8021d1066d4d5ed3578" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 16 07:22:39 crc kubenswrapper[4823]: E1216 07:22:39.948168 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a75ddf05569eb842ee7ced1c6941dc65b4fda1943932e8021d1066d4d5ed3578 is running failed: container process not found" containerID="a75ddf05569eb842ee7ced1c6941dc65b4fda1943932e8021d1066d4d5ed3578" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 16 07:22:39 crc kubenswrapper[4823]: E1216 07:22:39.948187 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: 
checking if PID of 143890d9503ac11d18cb9ffe222557c6fbf01e56e0ee7fe6c9718deb211756f0 is running failed: container process not found" containerID="143890d9503ac11d18cb9ffe222557c6fbf01e56e0ee7fe6c9718deb211756f0" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 16 07:22:39 crc kubenswrapper[4823]: E1216 07:22:39.948784 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 143890d9503ac11d18cb9ffe222557c6fbf01e56e0ee7fe6c9718deb211756f0 is running failed: container process not found" containerID="143890d9503ac11d18cb9ffe222557c6fbf01e56e0ee7fe6c9718deb211756f0" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 16 07:22:39 crc kubenswrapper[4823]: E1216 07:22:39.948843 4823 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 143890d9503ac11d18cb9ffe222557c6fbf01e56e0ee7fe6c9718deb211756f0 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-29jcz" podUID="4edb9072-dfce-44ca-88d3-64136ac7e1c3" containerName="ovs-vswitchd" Dec 16 07:22:39 crc kubenswrapper[4823]: E1216 07:22:39.949779 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a75ddf05569eb842ee7ced1c6941dc65b4fda1943932e8021d1066d4d5ed3578 is running failed: container process not found" containerID="a75ddf05569eb842ee7ced1c6941dc65b4fda1943932e8021d1066d4d5ed3578" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 16 07:22:39 crc kubenswrapper[4823]: E1216 07:22:39.949861 4823 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a75ddf05569eb842ee7ced1c6941dc65b4fda1943932e8021d1066d4d5ed3578 is running failed: container process not found" probeType="Readiness" 
pod="openstack/ovn-controller-ovs-29jcz" podUID="4edb9072-dfce-44ca-88d3-64136ac7e1c3" containerName="ovsdb-server" Dec 16 07:22:40 crc kubenswrapper[4823]: I1216 07:22:40.141106 4823 generic.go:334] "Generic (PLEG): container finished" podID="37eade87-02f6-4584-87d3-9b22e16ad915" containerID="f87675dcfff9fc973b357762f0993278cb4dedf83d6ea269b8db0911d6c505df" exitCode=137 Dec 16 07:22:40 crc kubenswrapper[4823]: I1216 07:22:40.141260 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"37eade87-02f6-4584-87d3-9b22e16ad915","Type":"ContainerDied","Data":"f87675dcfff9fc973b357762f0993278cb4dedf83d6ea269b8db0911d6c505df"} Dec 16 07:22:40 crc kubenswrapper[4823]: I1216 07:22:40.149600 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-29jcz_4edb9072-dfce-44ca-88d3-64136ac7e1c3/ovs-vswitchd/0.log" Dec 16 07:22:40 crc kubenswrapper[4823]: I1216 07:22:40.150613 4823 generic.go:334] "Generic (PLEG): container finished" podID="4edb9072-dfce-44ca-88d3-64136ac7e1c3" containerID="143890d9503ac11d18cb9ffe222557c6fbf01e56e0ee7fe6c9718deb211756f0" exitCode=137 Dec 16 07:22:40 crc kubenswrapper[4823]: I1216 07:22:40.150666 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-29jcz" event={"ID":"4edb9072-dfce-44ca-88d3-64136ac7e1c3","Type":"ContainerDied","Data":"143890d9503ac11d18cb9ffe222557c6fbf01e56e0ee7fe6c9718deb211756f0"} Dec 16 07:22:40 crc kubenswrapper[4823]: I1216 07:22:40.261282 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-29jcz_4edb9072-dfce-44ca-88d3-64136ac7e1c3/ovs-vswitchd/0.log" Dec 16 07:22:40 crc kubenswrapper[4823]: I1216 07:22:40.262908 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-29jcz" Dec 16 07:22:40 crc kubenswrapper[4823]: I1216 07:22:40.422120 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4edb9072-dfce-44ca-88d3-64136ac7e1c3-scripts\") pod \"4edb9072-dfce-44ca-88d3-64136ac7e1c3\" (UID: \"4edb9072-dfce-44ca-88d3-64136ac7e1c3\") " Dec 16 07:22:40 crc kubenswrapper[4823]: I1216 07:22:40.422182 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/4edb9072-dfce-44ca-88d3-64136ac7e1c3-etc-ovs\") pod \"4edb9072-dfce-44ca-88d3-64136ac7e1c3\" (UID: \"4edb9072-dfce-44ca-88d3-64136ac7e1c3\") " Dec 16 07:22:40 crc kubenswrapper[4823]: I1216 07:22:40.422260 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/4edb9072-dfce-44ca-88d3-64136ac7e1c3-var-lib\") pod \"4edb9072-dfce-44ca-88d3-64136ac7e1c3\" (UID: \"4edb9072-dfce-44ca-88d3-64136ac7e1c3\") " Dec 16 07:22:40 crc kubenswrapper[4823]: I1216 07:22:40.422362 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4edb9072-dfce-44ca-88d3-64136ac7e1c3-etc-ovs" (OuterVolumeSpecName: "etc-ovs") pod "4edb9072-dfce-44ca-88d3-64136ac7e1c3" (UID: "4edb9072-dfce-44ca-88d3-64136ac7e1c3"). InnerVolumeSpecName "etc-ovs". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 07:22:40 crc kubenswrapper[4823]: I1216 07:22:40.422372 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4edb9072-dfce-44ca-88d3-64136ac7e1c3-var-run\") pod \"4edb9072-dfce-44ca-88d3-64136ac7e1c3\" (UID: \"4edb9072-dfce-44ca-88d3-64136ac7e1c3\") " Dec 16 07:22:40 crc kubenswrapper[4823]: I1216 07:22:40.422407 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4edb9072-dfce-44ca-88d3-64136ac7e1c3-var-run" (OuterVolumeSpecName: "var-run") pod "4edb9072-dfce-44ca-88d3-64136ac7e1c3" (UID: "4edb9072-dfce-44ca-88d3-64136ac7e1c3"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 07:22:40 crc kubenswrapper[4823]: I1216 07:22:40.422463 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6pr4\" (UniqueName: \"kubernetes.io/projected/4edb9072-dfce-44ca-88d3-64136ac7e1c3-kube-api-access-d6pr4\") pod \"4edb9072-dfce-44ca-88d3-64136ac7e1c3\" (UID: \"4edb9072-dfce-44ca-88d3-64136ac7e1c3\") " Dec 16 07:22:40 crc kubenswrapper[4823]: I1216 07:22:40.422466 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4edb9072-dfce-44ca-88d3-64136ac7e1c3-var-lib" (OuterVolumeSpecName: "var-lib") pod "4edb9072-dfce-44ca-88d3-64136ac7e1c3" (UID: "4edb9072-dfce-44ca-88d3-64136ac7e1c3"). InnerVolumeSpecName "var-lib". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 07:22:40 crc kubenswrapper[4823]: I1216 07:22:40.422517 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/4edb9072-dfce-44ca-88d3-64136ac7e1c3-var-log\") pod \"4edb9072-dfce-44ca-88d3-64136ac7e1c3\" (UID: \"4edb9072-dfce-44ca-88d3-64136ac7e1c3\") " Dec 16 07:22:40 crc kubenswrapper[4823]: I1216 07:22:40.422704 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4edb9072-dfce-44ca-88d3-64136ac7e1c3-var-log" (OuterVolumeSpecName: "var-log") pod "4edb9072-dfce-44ca-88d3-64136ac7e1c3" (UID: "4edb9072-dfce-44ca-88d3-64136ac7e1c3"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 07:22:40 crc kubenswrapper[4823]: I1216 07:22:40.422908 4823 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/4edb9072-dfce-44ca-88d3-64136ac7e1c3-var-log\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:40 crc kubenswrapper[4823]: I1216 07:22:40.422925 4823 reconciler_common.go:293] "Volume detached for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/4edb9072-dfce-44ca-88d3-64136ac7e1c3-etc-ovs\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:40 crc kubenswrapper[4823]: I1216 07:22:40.422936 4823 reconciler_common.go:293] "Volume detached for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/4edb9072-dfce-44ca-88d3-64136ac7e1c3-var-lib\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:40 crc kubenswrapper[4823]: I1216 07:22:40.422948 4823 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4edb9072-dfce-44ca-88d3-64136ac7e1c3-var-run\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:40 crc kubenswrapper[4823]: I1216 07:22:40.423531 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/4edb9072-dfce-44ca-88d3-64136ac7e1c3-scripts" (OuterVolumeSpecName: "scripts") pod "4edb9072-dfce-44ca-88d3-64136ac7e1c3" (UID: "4edb9072-dfce-44ca-88d3-64136ac7e1c3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:22:40 crc kubenswrapper[4823]: I1216 07:22:40.427453 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4edb9072-dfce-44ca-88d3-64136ac7e1c3-kube-api-access-d6pr4" (OuterVolumeSpecName: "kube-api-access-d6pr4") pod "4edb9072-dfce-44ca-88d3-64136ac7e1c3" (UID: "4edb9072-dfce-44ca-88d3-64136ac7e1c3"). InnerVolumeSpecName "kube-api-access-d6pr4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:22:40 crc kubenswrapper[4823]: I1216 07:22:40.437407 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Dec 16 07:22:40 crc kubenswrapper[4823]: I1216 07:22:40.524245 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5gvpn\" (UniqueName: \"kubernetes.io/projected/37eade87-02f6-4584-87d3-9b22e16ad915-kube-api-access-5gvpn\") pod \"37eade87-02f6-4584-87d3-9b22e16ad915\" (UID: \"37eade87-02f6-4584-87d3-9b22e16ad915\") " Dec 16 07:22:40 crc kubenswrapper[4823]: I1216 07:22:40.524310 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/37eade87-02f6-4584-87d3-9b22e16ad915-lock\") pod \"37eade87-02f6-4584-87d3-9b22e16ad915\" (UID: \"37eade87-02f6-4584-87d3-9b22e16ad915\") " Dec 16 07:22:40 crc kubenswrapper[4823]: I1216 07:22:40.524347 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/37eade87-02f6-4584-87d3-9b22e16ad915-cache\") pod \"37eade87-02f6-4584-87d3-9b22e16ad915\" (UID: \"37eade87-02f6-4584-87d3-9b22e16ad915\") " Dec 16 07:22:40 crc kubenswrapper[4823]: 
I1216 07:22:40.524458 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"37eade87-02f6-4584-87d3-9b22e16ad915\" (UID: \"37eade87-02f6-4584-87d3-9b22e16ad915\") " Dec 16 07:22:40 crc kubenswrapper[4823]: I1216 07:22:40.524524 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/37eade87-02f6-4584-87d3-9b22e16ad915-etc-swift\") pod \"37eade87-02f6-4584-87d3-9b22e16ad915\" (UID: \"37eade87-02f6-4584-87d3-9b22e16ad915\") " Dec 16 07:22:40 crc kubenswrapper[4823]: I1216 07:22:40.525131 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37eade87-02f6-4584-87d3-9b22e16ad915-cache" (OuterVolumeSpecName: "cache") pod "37eade87-02f6-4584-87d3-9b22e16ad915" (UID: "37eade87-02f6-4584-87d3-9b22e16ad915"). InnerVolumeSpecName "cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:22:40 crc kubenswrapper[4823]: I1216 07:22:40.525258 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37eade87-02f6-4584-87d3-9b22e16ad915-lock" (OuterVolumeSpecName: "lock") pod "37eade87-02f6-4584-87d3-9b22e16ad915" (UID: "37eade87-02f6-4584-87d3-9b22e16ad915"). InnerVolumeSpecName "lock". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:22:40 crc kubenswrapper[4823]: I1216 07:22:40.525446 4823 reconciler_common.go:293] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/37eade87-02f6-4584-87d3-9b22e16ad915-lock\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:40 crc kubenswrapper[4823]: I1216 07:22:40.525459 4823 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/37eade87-02f6-4584-87d3-9b22e16ad915-cache\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:40 crc kubenswrapper[4823]: I1216 07:22:40.525468 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6pr4\" (UniqueName: \"kubernetes.io/projected/4edb9072-dfce-44ca-88d3-64136ac7e1c3-kube-api-access-d6pr4\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:40 crc kubenswrapper[4823]: I1216 07:22:40.525478 4823 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4edb9072-dfce-44ca-88d3-64136ac7e1c3-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:40 crc kubenswrapper[4823]: I1216 07:22:40.527681 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37eade87-02f6-4584-87d3-9b22e16ad915-kube-api-access-5gvpn" (OuterVolumeSpecName: "kube-api-access-5gvpn") pod "37eade87-02f6-4584-87d3-9b22e16ad915" (UID: "37eade87-02f6-4584-87d3-9b22e16ad915"). InnerVolumeSpecName "kube-api-access-5gvpn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:22:40 crc kubenswrapper[4823]: I1216 07:22:40.527745 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "swift") pod "37eade87-02f6-4584-87d3-9b22e16ad915" (UID: "37eade87-02f6-4584-87d3-9b22e16ad915"). InnerVolumeSpecName "local-storage08-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 16 07:22:40 crc kubenswrapper[4823]: I1216 07:22:40.527765 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37eade87-02f6-4584-87d3-9b22e16ad915-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "37eade87-02f6-4584-87d3-9b22e16ad915" (UID: "37eade87-02f6-4584-87d3-9b22e16ad915"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:22:40 crc kubenswrapper[4823]: I1216 07:22:40.627138 4823 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Dec 16 07:22:40 crc kubenswrapper[4823]: I1216 07:22:40.627166 4823 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/37eade87-02f6-4584-87d3-9b22e16ad915-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:40 crc kubenswrapper[4823]: I1216 07:22:40.627176 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5gvpn\" (UniqueName: \"kubernetes.io/projected/37eade87-02f6-4584-87d3-9b22e16ad915-kube-api-access-5gvpn\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:40 crc kubenswrapper[4823]: I1216 07:22:40.640611 4823 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Dec 16 07:22:40 crc kubenswrapper[4823]: I1216 07:22:40.728595 4823 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:40 crc kubenswrapper[4823]: E1216 07:22:40.878346 4823 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat 
"/var/lib/containers/storage/overlay/9c4a6749466f0b0f0e235448607727d282344fafca9d28609cc6543cd453127c/diff" to get inode usage: stat /var/lib/containers/storage/overlay/9c4a6749466f0b0f0e235448607727d282344fafca9d28609cc6543cd453127c/diff: no such file or directory, extraDiskErr: could not stat "/var/log/pods/openstack_neutron-bbf9986cc-sjljb_f2b1ed60-7cb0-48f0-aebf-3de778dbb95b/neutron-api/0.log" to get inode usage: stat /var/log/pods/openstack_neutron-bbf9986cc-sjljb_f2b1ed60-7cb0-48f0-aebf-3de778dbb95b/neutron-api/0.log: no such file or directory Dec 16 07:22:41 crc kubenswrapper[4823]: I1216 07:22:41.176121 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"37eade87-02f6-4584-87d3-9b22e16ad915","Type":"ContainerDied","Data":"e74114a842e19517f0819c0014b296da5863d1ef5807dc31915c92d2558cf539"} Dec 16 07:22:41 crc kubenswrapper[4823]: I1216 07:22:41.176216 4823 scope.go:117] "RemoveContainer" containerID="f87675dcfff9fc973b357762f0993278cb4dedf83d6ea269b8db0911d6c505df" Dec 16 07:22:41 crc kubenswrapper[4823]: I1216 07:22:41.176260 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Dec 16 07:22:41 crc kubenswrapper[4823]: I1216 07:22:41.180810 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-29jcz_4edb9072-dfce-44ca-88d3-64136ac7e1c3/ovs-vswitchd/0.log" Dec 16 07:22:41 crc kubenswrapper[4823]: I1216 07:22:41.182156 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-29jcz" event={"ID":"4edb9072-dfce-44ca-88d3-64136ac7e1c3","Type":"ContainerDied","Data":"235232dd5c4000cb81bcdc3a65e84dd1780c277e0d516b25d57d5ed080d7f45e"} Dec 16 07:22:41 crc kubenswrapper[4823]: I1216 07:22:41.182279 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-29jcz" Dec 16 07:22:41 crc kubenswrapper[4823]: I1216 07:22:41.207217 4823 scope.go:117] "RemoveContainer" containerID="145ae0bd995a296d5194b205c5a110eae0cc0b53171f8ed6f7aab0a0e2c48aca" Dec 16 07:22:41 crc kubenswrapper[4823]: I1216 07:22:41.241813 4823 scope.go:117] "RemoveContainer" containerID="fba9f42156608e6cc226456c4628eb8a6093a4e736f19553c3b609538523e305" Dec 16 07:22:41 crc kubenswrapper[4823]: I1216 07:22:41.249383 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-29jcz"] Dec 16 07:22:41 crc kubenswrapper[4823]: I1216 07:22:41.258597 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-ovs-29jcz"] Dec 16 07:22:41 crc kubenswrapper[4823]: I1216 07:22:41.265199 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Dec 16 07:22:41 crc kubenswrapper[4823]: I1216 07:22:41.274341 4823 scope.go:117] "RemoveContainer" containerID="40ee29e6ae29936dd852b2034a257b376daf068184e991736706829246c42569" Dec 16 07:22:41 crc kubenswrapper[4823]: I1216 07:22:41.275803 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-storage-0"] Dec 16 07:22:41 crc kubenswrapper[4823]: I1216 07:22:41.296855 4823 scope.go:117] "RemoveContainer" containerID="b1a5f1f8235f35f66a00999ce9d7e06be67e6583b5dc430df80fd71d14a63993" Dec 16 07:22:41 crc kubenswrapper[4823]: I1216 07:22:41.322299 4823 scope.go:117] "RemoveContainer" containerID="c5d6df967dd64ce250c15ed15f061a8be5c2ace3ce71f17ecbb4eeb82eee16bb" Dec 16 07:22:41 crc kubenswrapper[4823]: I1216 07:22:41.363224 4823 scope.go:117] "RemoveContainer" containerID="01e5b8f2f03cdaee2d9aa0f7009e062e757b69095af1ac126d2b409afda22307" Dec 16 07:22:41 crc kubenswrapper[4823]: I1216 07:22:41.386576 4823 scope.go:117] "RemoveContainer" containerID="0867818f24c8ec64e592ab31aa5d2950ef78f3e7e0fe1694feaadae8d16fd195" Dec 16 07:22:41 crc kubenswrapper[4823]: I1216 
07:22:41.411193 4823 scope.go:117] "RemoveContainer" containerID="bad977d222921a4fb519d95600bc9d018f6a41b0993e19b99e544f9729b364ec" Dec 16 07:22:41 crc kubenswrapper[4823]: I1216 07:22:41.437733 4823 scope.go:117] "RemoveContainer" containerID="3e8bd97535cc7d73ba58df356afd74ec5adc282b78f6bd60a29d41243373dfe8" Dec 16 07:22:41 crc kubenswrapper[4823]: I1216 07:22:41.467927 4823 scope.go:117] "RemoveContainer" containerID="6989d85752f4e1b6c7b23a46754686007edf09212f93e356aa9e002490d63f86" Dec 16 07:22:41 crc kubenswrapper[4823]: I1216 07:22:41.488688 4823 scope.go:117] "RemoveContainer" containerID="9857b55eb51a54f3ae493111d268c42a0d2bc195ef3b7082fc757220e93cba07" Dec 16 07:22:41 crc kubenswrapper[4823]: I1216 07:22:41.509076 4823 scope.go:117] "RemoveContainer" containerID="037ada7a883b0afa2d539ebbbabaf8e1ff97dd775ed349460d0029680d2b1517" Dec 16 07:22:41 crc kubenswrapper[4823]: I1216 07:22:41.527562 4823 scope.go:117] "RemoveContainer" containerID="e40b9ddd3f7fc60ce93f808d19e11679050ad9b41de42d02b22ca40a92083f09" Dec 16 07:22:41 crc kubenswrapper[4823]: I1216 07:22:41.546563 4823 scope.go:117] "RemoveContainer" containerID="a492d0597a24fbc3874db2d66724810617a47a1b04e07bd6166546bf01c14b03" Dec 16 07:22:41 crc kubenswrapper[4823]: I1216 07:22:41.563846 4823 scope.go:117] "RemoveContainer" containerID="143890d9503ac11d18cb9ffe222557c6fbf01e56e0ee7fe6c9718deb211756f0" Dec 16 07:22:41 crc kubenswrapper[4823]: I1216 07:22:41.579941 4823 scope.go:117] "RemoveContainer" containerID="a75ddf05569eb842ee7ced1c6941dc65b4fda1943932e8021d1066d4d5ed3578" Dec 16 07:22:41 crc kubenswrapper[4823]: I1216 07:22:41.597511 4823 scope.go:117] "RemoveContainer" containerID="4e9fde7fbe0438d93c11da4a80083cc4d8cfbd62271f2ea000e704d3bf65f337" Dec 16 07:22:41 crc kubenswrapper[4823]: I1216 07:22:41.788006 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37eade87-02f6-4584-87d3-9b22e16ad915" path="/var/lib/kubelet/pods/37eade87-02f6-4584-87d3-9b22e16ad915/volumes" Dec 16 
07:22:41 crc kubenswrapper[4823]: I1216 07:22:41.792346 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4edb9072-dfce-44ca-88d3-64136ac7e1c3" path="/var/lib/kubelet/pods/4edb9072-dfce-44ca-88d3-64136ac7e1c3/volumes" Dec 16 07:22:42 crc kubenswrapper[4823]: I1216 07:22:42.272740 4823 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","podefe17b2e-19bd-430b-8cb5-147ed1d2ffb6"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort podefe17b2e-19bd-430b-8cb5-147ed1d2ffb6] : Timed out while waiting for systemd to remove kubepods-besteffort-podefe17b2e_19bd_430b_8cb5_147ed1d2ffb6.slice" Dec 16 07:22:42 crc kubenswrapper[4823]: E1216 07:22:42.273477 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort podefe17b2e-19bd-430b-8cb5-147ed1d2ffb6] : unable to destroy cgroup paths for cgroup [kubepods besteffort podefe17b2e-19bd-430b-8cb5-147ed1d2ffb6] : Timed out while waiting for systemd to remove kubepods-besteffort-podefe17b2e_19bd_430b_8cb5_147ed1d2ffb6.slice" pod="openstack/ovn-controller-metrics-956hc" podUID="efe17b2e-19bd-430b-8cb5-147ed1d2ffb6" Dec 16 07:22:43 crc kubenswrapper[4823]: I1216 07:22:43.208677 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-956hc" Dec 16 07:22:43 crc kubenswrapper[4823]: I1216 07:22:43.253988 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-956hc"] Dec 16 07:22:43 crc kubenswrapper[4823]: I1216 07:22:43.263565 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-metrics-956hc"] Dec 16 07:22:43 crc kubenswrapper[4823]: I1216 07:22:43.781137 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efe17b2e-19bd-430b-8cb5-147ed1d2ffb6" path="/var/lib/kubelet/pods/efe17b2e-19bd-430b-8cb5-147ed1d2ffb6/volumes" Dec 16 07:22:44 crc kubenswrapper[4823]: I1216 07:22:44.772689 4823 scope.go:117] "RemoveContainer" containerID="37b5da4c3e0632087412acf947c72a2aad7577385641e763185ee25747c43921" Dec 16 07:22:44 crc kubenswrapper[4823]: E1216 07:22:44.773338 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 07:22:46 crc kubenswrapper[4823]: I1216 07:22:46.237596 4823 generic.go:334] "Generic (PLEG): container finished" podID="65278526-b5ee-4e40-b66b-1ee9b993f429" containerID="329a04c4c9ff60b70d5395a727c15217c7dff6014bc04044a76975b137573df3" exitCode=137 Dec 16 07:22:46 crc kubenswrapper[4823]: I1216 07:22:46.237667 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novaapic1ba-account-delete-sldxg" event={"ID":"65278526-b5ee-4e40-b66b-1ee9b993f429","Type":"ContainerDied","Data":"329a04c4c9ff60b70d5395a727c15217c7dff6014bc04044a76975b137573df3"} Dec 16 07:22:46 crc kubenswrapper[4823]: E1216 07:22:46.319083 4823 configmap.go:193] Couldn't get 
configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 16 07:22:46 crc kubenswrapper[4823]: E1216 07:22:46.319151 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ec00a24a-8417-452e-a350-b46f36d4a84d-operator-scripts podName:ec00a24a-8417-452e-a350-b46f36d4a84d nodeName:}" failed. No retries permitted until 2025-12-16 07:23:18.319134106 +0000 UTC m=+1676.807700229 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/ec00a24a-8417-452e-a350-b46f36d4a84d-operator-scripts") pod "cinder1f3e-account-delete-q5pns" (UID: "ec00a24a-8417-452e-a350-b46f36d4a84d") : configmap "openstack-scripts" not found Dec 16 07:22:46 crc kubenswrapper[4823]: E1216 07:22:46.420727 4823 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 16 07:22:46 crc kubenswrapper[4823]: E1216 07:22:46.420795 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/65278526-b5ee-4e40-b66b-1ee9b993f429-operator-scripts podName:65278526-b5ee-4e40-b66b-1ee9b993f429 nodeName:}" failed. No retries permitted until 2025-12-16 07:23:18.420781731 +0000 UTC m=+1676.909347854 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/65278526-b5ee-4e40-b66b-1ee9b993f429-operator-scripts") pod "novaapic1ba-account-delete-sldxg" (UID: "65278526-b5ee-4e40-b66b-1ee9b993f429") : configmap "openstack-scripts" not found Dec 16 07:22:46 crc kubenswrapper[4823]: E1216 07:22:46.420799 4823 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 16 07:22:46 crc kubenswrapper[4823]: E1216 07:22:46.420897 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4a3f54ee-1dba-42f5-8697-b70de0f5b4c2-operator-scripts podName:4a3f54ee-1dba-42f5-8697-b70de0f5b4c2 nodeName:}" failed. No retries permitted until 2025-12-16 07:23:18.420880754 +0000 UTC m=+1676.909446877 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/4a3f54ee-1dba-42f5-8697-b70de0f5b4c2-operator-scripts") pod "neutronba48-account-delete-87d8j" (UID: "4a3f54ee-1dba-42f5-8697-b70de0f5b4c2") : configmap "openstack-scripts" not found Dec 16 07:22:46 crc kubenswrapper[4823]: E1216 07:22:46.420731 4823 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 16 07:22:46 crc kubenswrapper[4823]: E1216 07:22:46.420935 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/dfd7efdc-36ba-4037-9f6c-a1a8c946ab33-operator-scripts podName:dfd7efdc-36ba-4037-9f6c-a1a8c946ab33 nodeName:}" failed. No retries permitted until 2025-12-16 07:23:18.420927785 +0000 UTC m=+1676.909493898 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/dfd7efdc-36ba-4037-9f6c-a1a8c946ab33-operator-scripts") pod "novacell06c77-account-delete-5jkkk" (UID: "dfd7efdc-36ba-4037-9f6c-a1a8c946ab33") : configmap "openstack-scripts" not found Dec 16 07:22:46 crc kubenswrapper[4823]: I1216 07:22:46.501652 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/novaapic1ba-account-delete-sldxg" Dec 16 07:22:46 crc kubenswrapper[4823]: I1216 07:22:46.623205 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/65278526-b5ee-4e40-b66b-1ee9b993f429-operator-scripts\") pod \"65278526-b5ee-4e40-b66b-1ee9b993f429\" (UID: \"65278526-b5ee-4e40-b66b-1ee9b993f429\") " Dec 16 07:22:46 crc kubenswrapper[4823]: I1216 07:22:46.623261 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p5qhh\" (UniqueName: \"kubernetes.io/projected/65278526-b5ee-4e40-b66b-1ee9b993f429-kube-api-access-p5qhh\") pod \"65278526-b5ee-4e40-b66b-1ee9b993f429\" (UID: \"65278526-b5ee-4e40-b66b-1ee9b993f429\") " Dec 16 07:22:46 crc kubenswrapper[4823]: I1216 07:22:46.624614 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65278526-b5ee-4e40-b66b-1ee9b993f429-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "65278526-b5ee-4e40-b66b-1ee9b993f429" (UID: "65278526-b5ee-4e40-b66b-1ee9b993f429"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:22:46 crc kubenswrapper[4823]: I1216 07:22:46.627973 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65278526-b5ee-4e40-b66b-1ee9b993f429-kube-api-access-p5qhh" (OuterVolumeSpecName: "kube-api-access-p5qhh") pod "65278526-b5ee-4e40-b66b-1ee9b993f429" (UID: "65278526-b5ee-4e40-b66b-1ee9b993f429"). InnerVolumeSpecName "kube-api-access-p5qhh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:22:46 crc kubenswrapper[4823]: I1216 07:22:46.725491 4823 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/65278526-b5ee-4e40-b66b-1ee9b993f429-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:46 crc kubenswrapper[4823]: I1216 07:22:46.725537 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p5qhh\" (UniqueName: \"kubernetes.io/projected/65278526-b5ee-4e40-b66b-1ee9b993f429-kube-api-access-p5qhh\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:46 crc kubenswrapper[4823]: E1216 07:22:46.962211 4823 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podec00a24a_8417_452e_a350_b46f36d4a84d.slice/crio-conmon-d094c25706d601af35d0ca55d737a0c04cbd27e5d230d6ca4704d7d20a91aaad.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4a3f54ee_1dba_42f5_8697_b70de0f5b4c2.slice/crio-conmon-7a23844a4049e95cfc8cfe3f25c2cd33313f835e15a5177e0274066336cddb91.scope\": RecentStats: unable to find data in memory cache]" Dec 16 07:22:47 crc kubenswrapper[4823]: I1216 07:22:47.214211 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutronba48-account-delete-87d8j" Dec 16 07:22:47 crc kubenswrapper[4823]: I1216 07:22:47.220836 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder1f3e-account-delete-q5pns" Dec 16 07:22:47 crc kubenswrapper[4823]: I1216 07:22:47.229452 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/novacell06c77-account-delete-5jkkk" Dec 16 07:22:47 crc kubenswrapper[4823]: I1216 07:22:47.252699 4823 generic.go:334] "Generic (PLEG): container finished" podID="ec00a24a-8417-452e-a350-b46f36d4a84d" containerID="d094c25706d601af35d0ca55d737a0c04cbd27e5d230d6ca4704d7d20a91aaad" exitCode=137 Dec 16 07:22:47 crc kubenswrapper[4823]: I1216 07:22:47.252789 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder1f3e-account-delete-q5pns" Dec 16 07:22:47 crc kubenswrapper[4823]: I1216 07:22:47.252909 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder1f3e-account-delete-q5pns" event={"ID":"ec00a24a-8417-452e-a350-b46f36d4a84d","Type":"ContainerDied","Data":"d094c25706d601af35d0ca55d737a0c04cbd27e5d230d6ca4704d7d20a91aaad"} Dec 16 07:22:47 crc kubenswrapper[4823]: I1216 07:22:47.252936 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder1f3e-account-delete-q5pns" event={"ID":"ec00a24a-8417-452e-a350-b46f36d4a84d","Type":"ContainerDied","Data":"fc21714557f1933f1c54e52ba6c8488cf3c0a643887a4313a7800d55d3a72eb1"} Dec 16 07:22:47 crc kubenswrapper[4823]: I1216 07:22:47.252969 4823 scope.go:117] "RemoveContainer" containerID="d094c25706d601af35d0ca55d737a0c04cbd27e5d230d6ca4704d7d20a91aaad" Dec 16 07:22:47 crc kubenswrapper[4823]: I1216 07:22:47.254194 4823 generic.go:334] "Generic (PLEG): container finished" podID="dfd7efdc-36ba-4037-9f6c-a1a8c946ab33" containerID="2ab531bd1e295bde32d0b4c0e522a1f27e9e25e3ec22550920bf8f2922c669d6" exitCode=137 Dec 16 07:22:47 
crc kubenswrapper[4823]: I1216 07:22:47.254243 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/novacell06c77-account-delete-5jkkk" Dec 16 07:22:47 crc kubenswrapper[4823]: I1216 07:22:47.254242 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell06c77-account-delete-5jkkk" event={"ID":"dfd7efdc-36ba-4037-9f6c-a1a8c946ab33","Type":"ContainerDied","Data":"2ab531bd1e295bde32d0b4c0e522a1f27e9e25e3ec22550920bf8f2922c669d6"} Dec 16 07:22:47 crc kubenswrapper[4823]: I1216 07:22:47.254414 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell06c77-account-delete-5jkkk" event={"ID":"dfd7efdc-36ba-4037-9f6c-a1a8c946ab33","Type":"ContainerDied","Data":"73eafc561fc77a9630134c39e4984c1ab313fe1246675f93fff1e81ec4c4178c"} Dec 16 07:22:47 crc kubenswrapper[4823]: I1216 07:22:47.256120 4823 generic.go:334] "Generic (PLEG): container finished" podID="4a3f54ee-1dba-42f5-8697-b70de0f5b4c2" containerID="7a23844a4049e95cfc8cfe3f25c2cd33313f835e15a5177e0274066336cddb91" exitCode=137 Dec 16 07:22:47 crc kubenswrapper[4823]: I1216 07:22:47.256200 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutronba48-account-delete-87d8j" event={"ID":"4a3f54ee-1dba-42f5-8697-b70de0f5b4c2","Type":"ContainerDied","Data":"7a23844a4049e95cfc8cfe3f25c2cd33313f835e15a5177e0274066336cddb91"} Dec 16 07:22:47 crc kubenswrapper[4823]: I1216 07:22:47.256223 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutronba48-account-delete-87d8j" event={"ID":"4a3f54ee-1dba-42f5-8697-b70de0f5b4c2","Type":"ContainerDied","Data":"82294d0918b15b9d66958622c0c2cbf6ca03371d7da85f785b9259c3c2e86a07"} Dec 16 07:22:47 crc kubenswrapper[4823]: I1216 07:22:47.256271 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutronba48-account-delete-87d8j" Dec 16 07:22:47 crc kubenswrapper[4823]: I1216 07:22:47.275508 4823 scope.go:117] "RemoveContainer" containerID="d094c25706d601af35d0ca55d737a0c04cbd27e5d230d6ca4704d7d20a91aaad" Dec 16 07:22:47 crc kubenswrapper[4823]: E1216 07:22:47.275805 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d094c25706d601af35d0ca55d737a0c04cbd27e5d230d6ca4704d7d20a91aaad\": container with ID starting with d094c25706d601af35d0ca55d737a0c04cbd27e5d230d6ca4704d7d20a91aaad not found: ID does not exist" containerID="d094c25706d601af35d0ca55d737a0c04cbd27e5d230d6ca4704d7d20a91aaad" Dec 16 07:22:47 crc kubenswrapper[4823]: I1216 07:22:47.275854 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d094c25706d601af35d0ca55d737a0c04cbd27e5d230d6ca4704d7d20a91aaad"} err="failed to get container status \"d094c25706d601af35d0ca55d737a0c04cbd27e5d230d6ca4704d7d20a91aaad\": rpc error: code = NotFound desc = could not find container \"d094c25706d601af35d0ca55d737a0c04cbd27e5d230d6ca4704d7d20a91aaad\": container with ID starting with d094c25706d601af35d0ca55d737a0c04cbd27e5d230d6ca4704d7d20a91aaad not found: ID does not exist" Dec 16 07:22:47 crc kubenswrapper[4823]: I1216 07:22:47.275874 4823 scope.go:117] "RemoveContainer" containerID="2ab531bd1e295bde32d0b4c0e522a1f27e9e25e3ec22550920bf8f2922c669d6" Dec 16 07:22:47 crc kubenswrapper[4823]: I1216 07:22:47.277596 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novaapic1ba-account-delete-sldxg" event={"ID":"65278526-b5ee-4e40-b66b-1ee9b993f429","Type":"ContainerDied","Data":"c9b587abd6e349be75aa6251d530d3502453499c8ac2414b5493dd65b04180e1"} Dec 16 07:22:47 crc kubenswrapper[4823]: I1216 07:22:47.277678 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/novaapic1ba-account-delete-sldxg" Dec 16 07:22:47 crc kubenswrapper[4823]: I1216 07:22:47.312256 4823 scope.go:117] "RemoveContainer" containerID="2ab531bd1e295bde32d0b4c0e522a1f27e9e25e3ec22550920bf8f2922c669d6" Dec 16 07:22:47 crc kubenswrapper[4823]: E1216 07:22:47.312912 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ab531bd1e295bde32d0b4c0e522a1f27e9e25e3ec22550920bf8f2922c669d6\": container with ID starting with 2ab531bd1e295bde32d0b4c0e522a1f27e9e25e3ec22550920bf8f2922c669d6 not found: ID does not exist" containerID="2ab531bd1e295bde32d0b4c0e522a1f27e9e25e3ec22550920bf8f2922c669d6" Dec 16 07:22:47 crc kubenswrapper[4823]: I1216 07:22:47.312956 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ab531bd1e295bde32d0b4c0e522a1f27e9e25e3ec22550920bf8f2922c669d6"} err="failed to get container status \"2ab531bd1e295bde32d0b4c0e522a1f27e9e25e3ec22550920bf8f2922c669d6\": rpc error: code = NotFound desc = could not find container \"2ab531bd1e295bde32d0b4c0e522a1f27e9e25e3ec22550920bf8f2922c669d6\": container with ID starting with 2ab531bd1e295bde32d0b4c0e522a1f27e9e25e3ec22550920bf8f2922c669d6 not found: ID does not exist" Dec 16 07:22:47 crc kubenswrapper[4823]: I1216 07:22:47.312980 4823 scope.go:117] "RemoveContainer" containerID="7a23844a4049e95cfc8cfe3f25c2cd33313f835e15a5177e0274066336cddb91" Dec 16 07:22:47 crc kubenswrapper[4823]: I1216 07:22:47.314309 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novaapic1ba-account-delete-sldxg"] Dec 16 07:22:47 crc kubenswrapper[4823]: I1216 07:22:47.321927 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/novaapic1ba-account-delete-sldxg"] Dec 16 07:22:47 crc kubenswrapper[4823]: I1216 07:22:47.326392 4823 scope.go:117] "RemoveContainer" 
containerID="7a23844a4049e95cfc8cfe3f25c2cd33313f835e15a5177e0274066336cddb91" Dec 16 07:22:47 crc kubenswrapper[4823]: E1216 07:22:47.327441 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a23844a4049e95cfc8cfe3f25c2cd33313f835e15a5177e0274066336cddb91\": container with ID starting with 7a23844a4049e95cfc8cfe3f25c2cd33313f835e15a5177e0274066336cddb91 not found: ID does not exist" containerID="7a23844a4049e95cfc8cfe3f25c2cd33313f835e15a5177e0274066336cddb91" Dec 16 07:22:47 crc kubenswrapper[4823]: I1216 07:22:47.327480 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a23844a4049e95cfc8cfe3f25c2cd33313f835e15a5177e0274066336cddb91"} err="failed to get container status \"7a23844a4049e95cfc8cfe3f25c2cd33313f835e15a5177e0274066336cddb91\": rpc error: code = NotFound desc = could not find container \"7a23844a4049e95cfc8cfe3f25c2cd33313f835e15a5177e0274066336cddb91\": container with ID starting with 7a23844a4049e95cfc8cfe3f25c2cd33313f835e15a5177e0274066336cddb91 not found: ID does not exist" Dec 16 07:22:47 crc kubenswrapper[4823]: I1216 07:22:47.327509 4823 scope.go:117] "RemoveContainer" containerID="329a04c4c9ff60b70d5395a727c15217c7dff6014bc04044a76975b137573df3" Dec 16 07:22:47 crc kubenswrapper[4823]: I1216 07:22:47.331774 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j7vwd\" (UniqueName: \"kubernetes.io/projected/4a3f54ee-1dba-42f5-8697-b70de0f5b4c2-kube-api-access-j7vwd\") pod \"4a3f54ee-1dba-42f5-8697-b70de0f5b4c2\" (UID: \"4a3f54ee-1dba-42f5-8697-b70de0f5b4c2\") " Dec 16 07:22:47 crc kubenswrapper[4823]: I1216 07:22:47.331842 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dfd7efdc-36ba-4037-9f6c-a1a8c946ab33-operator-scripts\") pod \"dfd7efdc-36ba-4037-9f6c-a1a8c946ab33\" (UID: 
\"dfd7efdc-36ba-4037-9f6c-a1a8c946ab33\") " Dec 16 07:22:47 crc kubenswrapper[4823]: I1216 07:22:47.331877 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rvfw7\" (UniqueName: \"kubernetes.io/projected/dfd7efdc-36ba-4037-9f6c-a1a8c946ab33-kube-api-access-rvfw7\") pod \"dfd7efdc-36ba-4037-9f6c-a1a8c946ab33\" (UID: \"dfd7efdc-36ba-4037-9f6c-a1a8c946ab33\") " Dec 16 07:22:47 crc kubenswrapper[4823]: I1216 07:22:47.331969 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-98cf7\" (UniqueName: \"kubernetes.io/projected/ec00a24a-8417-452e-a350-b46f36d4a84d-kube-api-access-98cf7\") pod \"ec00a24a-8417-452e-a350-b46f36d4a84d\" (UID: \"ec00a24a-8417-452e-a350-b46f36d4a84d\") " Dec 16 07:22:47 crc kubenswrapper[4823]: I1216 07:22:47.332002 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4a3f54ee-1dba-42f5-8697-b70de0f5b4c2-operator-scripts\") pod \"4a3f54ee-1dba-42f5-8697-b70de0f5b4c2\" (UID: \"4a3f54ee-1dba-42f5-8697-b70de0f5b4c2\") " Dec 16 07:22:47 crc kubenswrapper[4823]: I1216 07:22:47.332036 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ec00a24a-8417-452e-a350-b46f36d4a84d-operator-scripts\") pod \"ec00a24a-8417-452e-a350-b46f36d4a84d\" (UID: \"ec00a24a-8417-452e-a350-b46f36d4a84d\") " Dec 16 07:22:47 crc kubenswrapper[4823]: I1216 07:22:47.332715 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a3f54ee-1dba-42f5-8697-b70de0f5b4c2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4a3f54ee-1dba-42f5-8697-b70de0f5b4c2" (UID: "4a3f54ee-1dba-42f5-8697-b70de0f5b4c2"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:22:47 crc kubenswrapper[4823]: I1216 07:22:47.332828 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dfd7efdc-36ba-4037-9f6c-a1a8c946ab33-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "dfd7efdc-36ba-4037-9f6c-a1a8c946ab33" (UID: "dfd7efdc-36ba-4037-9f6c-a1a8c946ab33"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:22:47 crc kubenswrapper[4823]: I1216 07:22:47.332855 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec00a24a-8417-452e-a350-b46f36d4a84d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ec00a24a-8417-452e-a350-b46f36d4a84d" (UID: "ec00a24a-8417-452e-a350-b46f36d4a84d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:22:47 crc kubenswrapper[4823]: I1216 07:22:47.337161 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dfd7efdc-36ba-4037-9f6c-a1a8c946ab33-kube-api-access-rvfw7" (OuterVolumeSpecName: "kube-api-access-rvfw7") pod "dfd7efdc-36ba-4037-9f6c-a1a8c946ab33" (UID: "dfd7efdc-36ba-4037-9f6c-a1a8c946ab33"). InnerVolumeSpecName "kube-api-access-rvfw7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:22:47 crc kubenswrapper[4823]: I1216 07:22:47.337182 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a3f54ee-1dba-42f5-8697-b70de0f5b4c2-kube-api-access-j7vwd" (OuterVolumeSpecName: "kube-api-access-j7vwd") pod "4a3f54ee-1dba-42f5-8697-b70de0f5b4c2" (UID: "4a3f54ee-1dba-42f5-8697-b70de0f5b4c2"). InnerVolumeSpecName "kube-api-access-j7vwd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:22:47 crc kubenswrapper[4823]: I1216 07:22:47.337266 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec00a24a-8417-452e-a350-b46f36d4a84d-kube-api-access-98cf7" (OuterVolumeSpecName: "kube-api-access-98cf7") pod "ec00a24a-8417-452e-a350-b46f36d4a84d" (UID: "ec00a24a-8417-452e-a350-b46f36d4a84d"). InnerVolumeSpecName "kube-api-access-98cf7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:22:47 crc kubenswrapper[4823]: I1216 07:22:47.433046 4823 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dfd7efdc-36ba-4037-9f6c-a1a8c946ab33-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:47 crc kubenswrapper[4823]: I1216 07:22:47.433078 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rvfw7\" (UniqueName: \"kubernetes.io/projected/dfd7efdc-36ba-4037-9f6c-a1a8c946ab33-kube-api-access-rvfw7\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:47 crc kubenswrapper[4823]: I1216 07:22:47.433091 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-98cf7\" (UniqueName: \"kubernetes.io/projected/ec00a24a-8417-452e-a350-b46f36d4a84d-kube-api-access-98cf7\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:47 crc kubenswrapper[4823]: I1216 07:22:47.433099 4823 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4a3f54ee-1dba-42f5-8697-b70de0f5b4c2-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:47 crc kubenswrapper[4823]: I1216 07:22:47.433108 4823 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ec00a24a-8417-452e-a350-b46f36d4a84d-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:47 crc kubenswrapper[4823]: I1216 07:22:47.433116 4823 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-j7vwd\" (UniqueName: \"kubernetes.io/projected/4a3f54ee-1dba-42f5-8697-b70de0f5b4c2-kube-api-access-j7vwd\") on node \"crc\" DevicePath \"\"" Dec 16 07:22:47 crc kubenswrapper[4823]: I1216 07:22:47.599705 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novacell06c77-account-delete-5jkkk"] Dec 16 07:22:47 crc kubenswrapper[4823]: I1216 07:22:47.612221 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/novacell06c77-account-delete-5jkkk"] Dec 16 07:22:47 crc kubenswrapper[4823]: I1216 07:22:47.623836 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder1f3e-account-delete-q5pns"] Dec 16 07:22:47 crc kubenswrapper[4823]: I1216 07:22:47.630435 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder1f3e-account-delete-q5pns"] Dec 16 07:22:47 crc kubenswrapper[4823]: I1216 07:22:47.637997 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutronba48-account-delete-87d8j"] Dec 16 07:22:47 crc kubenswrapper[4823]: I1216 07:22:47.642853 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutronba48-account-delete-87d8j"] Dec 16 07:22:47 crc kubenswrapper[4823]: I1216 07:22:47.771133 4823 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","poddbcff04b-7d0d-45b4-bc28-7882421c6000"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort poddbcff04b-7d0d-45b4-bc28-7882421c6000] : Timed out while waiting for systemd to remove kubepods-besteffort-poddbcff04b_7d0d_45b4_bc28_7882421c6000.slice" Dec 16 07:22:47 crc kubenswrapper[4823]: E1216 07:22:47.771452 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort poddbcff04b-7d0d-45b4-bc28-7882421c6000] : unable to destroy cgroup paths for cgroup [kubepods besteffort poddbcff04b-7d0d-45b4-bc28-7882421c6000] : Timed out while waiting for 
systemd to remove kubepods-besteffort-poddbcff04b_7d0d_45b4_bc28_7882421c6000.slice" pod="openstack/openstack-galera-0" podUID="dbcff04b-7d0d-45b4-bc28-7882421c6000" Dec 16 07:22:47 crc kubenswrapper[4823]: I1216 07:22:47.779154 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a3f54ee-1dba-42f5-8697-b70de0f5b4c2" path="/var/lib/kubelet/pods/4a3f54ee-1dba-42f5-8697-b70de0f5b4c2/volumes" Dec 16 07:22:47 crc kubenswrapper[4823]: I1216 07:22:47.779678 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65278526-b5ee-4e40-b66b-1ee9b993f429" path="/var/lib/kubelet/pods/65278526-b5ee-4e40-b66b-1ee9b993f429/volumes" Dec 16 07:22:47 crc kubenswrapper[4823]: I1216 07:22:47.780150 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dfd7efdc-36ba-4037-9f6c-a1a8c946ab33" path="/var/lib/kubelet/pods/dfd7efdc-36ba-4037-9f6c-a1a8c946ab33/volumes" Dec 16 07:22:47 crc kubenswrapper[4823]: I1216 07:22:47.780617 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec00a24a-8417-452e-a350-b46f36d4a84d" path="/var/lib/kubelet/pods/ec00a24a-8417-452e-a350-b46f36d4a84d/volumes" Dec 16 07:22:48 crc kubenswrapper[4823]: I1216 07:22:48.299442 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Dec 16 07:22:48 crc kubenswrapper[4823]: I1216 07:22:48.362889 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Dec 16 07:22:48 crc kubenswrapper[4823]: I1216 07:22:48.372809 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-galera-0"] Dec 16 07:22:49 crc kubenswrapper[4823]: I1216 07:22:49.785076 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dbcff04b-7d0d-45b4-bc28-7882421c6000" path="/var/lib/kubelet/pods/dbcff04b-7d0d-45b4-bc28-7882421c6000/volumes" Dec 16 07:22:59 crc kubenswrapper[4823]: I1216 07:22:59.772115 4823 scope.go:117] "RemoveContainer" containerID="37b5da4c3e0632087412acf947c72a2aad7577385641e763185ee25747c43921" Dec 16 07:22:59 crc kubenswrapper[4823]: E1216 07:22:59.772901 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 07:23:10 crc kubenswrapper[4823]: I1216 07:23:10.771176 4823 scope.go:117] "RemoveContainer" containerID="37b5da4c3e0632087412acf947c72a2aad7577385641e763185ee25747c43921" Dec 16 07:23:10 crc kubenswrapper[4823]: E1216 07:23:10.771954 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 07:23:21 crc 
kubenswrapper[4823]: I1216 07:23:21.775433 4823 scope.go:117] "RemoveContainer" containerID="37b5da4c3e0632087412acf947c72a2aad7577385641e763185ee25747c43921" Dec 16 07:23:21 crc kubenswrapper[4823]: E1216 07:23:21.776290 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 07:23:32 crc kubenswrapper[4823]: I1216 07:23:32.771414 4823 scope.go:117] "RemoveContainer" containerID="37b5da4c3e0632087412acf947c72a2aad7577385641e763185ee25747c43921" Dec 16 07:23:32 crc kubenswrapper[4823]: E1216 07:23:32.772468 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 07:23:38 crc kubenswrapper[4823]: I1216 07:23:38.249054 4823 scope.go:117] "RemoveContainer" containerID="8be0dc60e3282098ffe9292c57b0b05f6e0084ca7fc9c9c39988f91573dc30f2" Dec 16 07:23:38 crc kubenswrapper[4823]: I1216 07:23:38.288250 4823 scope.go:117] "RemoveContainer" containerID="982d249f69b777a6b32b89fdb716b811aa505c0b63a3d52a07e50118e0f78094" Dec 16 07:23:38 crc kubenswrapper[4823]: I1216 07:23:38.334586 4823 scope.go:117] "RemoveContainer" containerID="f3d1465f72d95e81aa577c6e692c288ce8109410382e3c7ab46eaaa8aa34126d" Dec 16 07:23:38 crc kubenswrapper[4823]: I1216 07:23:38.367658 4823 scope.go:117] "RemoveContainer" 
containerID="beed455aa58d1c72e6c92727c12ca8890c4f785879024962ab40d5107da8e72c" Dec 16 07:23:38 crc kubenswrapper[4823]: I1216 07:23:38.394294 4823 scope.go:117] "RemoveContainer" containerID="cf3000c3c19630c7b52fe3ee392070cdf29cc20770328f3c4cee0ca752e7ce59" Dec 16 07:23:38 crc kubenswrapper[4823]: I1216 07:23:38.418943 4823 scope.go:117] "RemoveContainer" containerID="8b55431d53902ca53d22cd6d0b32b46fec767b6a6c22ef706d3cff33e889fb4b" Dec 16 07:23:38 crc kubenswrapper[4823]: I1216 07:23:38.447469 4823 scope.go:117] "RemoveContainer" containerID="bd0b3284091f48885d6643e6ca71a2073437a5eb768a82f1e8e8406468a01cec" Dec 16 07:23:38 crc kubenswrapper[4823]: I1216 07:23:38.466072 4823 scope.go:117] "RemoveContainer" containerID="7629ae1e5e4c4f1f9ac4ef046163592794f31a30605927bd59270499037455b0" Dec 16 07:23:47 crc kubenswrapper[4823]: I1216 07:23:47.772175 4823 scope.go:117] "RemoveContainer" containerID="37b5da4c3e0632087412acf947c72a2aad7577385641e763185ee25747c43921" Dec 16 07:23:47 crc kubenswrapper[4823]: E1216 07:23:47.772784 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 07:23:58 crc kubenswrapper[4823]: I1216 07:23:58.771695 4823 scope.go:117] "RemoveContainer" containerID="37b5da4c3e0632087412acf947c72a2aad7577385641e763185ee25747c43921" Dec 16 07:23:58 crc kubenswrapper[4823]: E1216 07:23:58.772813 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 07:24:10 crc kubenswrapper[4823]: I1216 07:24:10.771191 4823 scope.go:117] "RemoveContainer" containerID="37b5da4c3e0632087412acf947c72a2aad7577385641e763185ee25747c43921" Dec 16 07:24:10 crc kubenswrapper[4823]: E1216 07:24:10.771992 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 07:24:22 crc kubenswrapper[4823]: I1216 07:24:22.771557 4823 scope.go:117] "RemoveContainer" containerID="37b5da4c3e0632087412acf947c72a2aad7577385641e763185ee25747c43921" Dec 16 07:24:22 crc kubenswrapper[4823]: E1216 07:24:22.772466 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 07:24:37 crc kubenswrapper[4823]: I1216 07:24:37.772121 4823 scope.go:117] "RemoveContainer" containerID="37b5da4c3e0632087412acf947c72a2aad7577385641e763185ee25747c43921" Dec 16 07:24:37 crc kubenswrapper[4823]: E1216 07:24:37.772945 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 07:24:38 crc kubenswrapper[4823]: I1216 07:24:38.676916 4823 scope.go:117] "RemoveContainer" containerID="3d8bf1eec57a6eef7e33e2d7523f1be1a8f7b422798ea3442862f3825e3f251e" Dec 16 07:24:38 crc kubenswrapper[4823]: I1216 07:24:38.710776 4823 scope.go:117] "RemoveContainer" containerID="6fa7315b3a421e4a53c77f20200c6b69db9dcfebd2b05f6223c1276fbb6ac91e" Dec 16 07:24:38 crc kubenswrapper[4823]: I1216 07:24:38.761311 4823 scope.go:117] "RemoveContainer" containerID="0697000ce67a8cd70a4f5cf3424d1c8f80091a2488379262811ac7ef93a7f556" Dec 16 07:24:38 crc kubenswrapper[4823]: I1216 07:24:38.811357 4823 scope.go:117] "RemoveContainer" containerID="1b71cb799085b8870997294e24797bb13ae088e514d562be9045f395f4dd9211" Dec 16 07:24:38 crc kubenswrapper[4823]: I1216 07:24:38.846323 4823 scope.go:117] "RemoveContainer" containerID="6c7499272eb96a0be4f830b28f2badc678a46e4891df1fbff674a8bf16f9dc6b" Dec 16 07:24:38 crc kubenswrapper[4823]: I1216 07:24:38.886286 4823 scope.go:117] "RemoveContainer" containerID="cbea58da480f94db2a2ee35249963ed9ccff56507255870a694a1b9d2f6a6af6" Dec 16 07:24:38 crc kubenswrapper[4823]: I1216 07:24:38.908158 4823 scope.go:117] "RemoveContainer" containerID="a45c473f291f2511a351060d4ccb2b122a8889fc17f7f1e03231443022b74af9" Dec 16 07:24:38 crc kubenswrapper[4823]: I1216 07:24:38.946894 4823 scope.go:117] "RemoveContainer" containerID="fbf17c728f21d60e2722f73e8d92c8f01170959769dc3b6af1de2092502dbd5f" Dec 16 07:24:39 crc kubenswrapper[4823]: I1216 07:24:39.004513 4823 scope.go:117] "RemoveContainer" containerID="cd1dc48ce8bf98695f893a5adecf3b9f44bbc3521a4cef478e87f157c2618181" Dec 16 07:24:39 crc kubenswrapper[4823]: I1216 07:24:39.048974 4823 scope.go:117] "RemoveContainer" 
containerID="d1c1a2e134858b3458134e2fa1a0775f9e25a7109ded0caff60bedc00c2090bb" Dec 16 07:24:39 crc kubenswrapper[4823]: I1216 07:24:39.069732 4823 scope.go:117] "RemoveContainer" containerID="178f692ca2b4248aba6322541c8c8d404e1f3755c471036e3a5c112f6767916d" Dec 16 07:24:39 crc kubenswrapper[4823]: I1216 07:24:39.100899 4823 scope.go:117] "RemoveContainer" containerID="52aaf1d3bed1904d4e9402c6601a7b4cf9067783dfeabc37b488a2a4d81ee20f" Dec 16 07:24:39 crc kubenswrapper[4823]: I1216 07:24:39.120297 4823 scope.go:117] "RemoveContainer" containerID="fb4f73ed34378d9f3cbdb5b0ce00ba2183bba7bcb920f2a0c15d5c8a957ce220" Dec 16 07:24:48 crc kubenswrapper[4823]: I1216 07:24:48.771103 4823 scope.go:117] "RemoveContainer" containerID="37b5da4c3e0632087412acf947c72a2aad7577385641e763185ee25747c43921" Dec 16 07:24:48 crc kubenswrapper[4823]: E1216 07:24:48.771579 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 07:25:01 crc kubenswrapper[4823]: I1216 07:25:01.775987 4823 scope.go:117] "RemoveContainer" containerID="37b5da4c3e0632087412acf947c72a2aad7577385641e763185ee25747c43921" Dec 16 07:25:01 crc kubenswrapper[4823]: E1216 07:25:01.778202 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 07:25:12 crc 
kubenswrapper[4823]: I1216 07:25:12.771271 4823 scope.go:117] "RemoveContainer" containerID="37b5da4c3e0632087412acf947c72a2aad7577385641e763185ee25747c43921" Dec 16 07:25:12 crc kubenswrapper[4823]: E1216 07:25:12.772149 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 07:25:25 crc kubenswrapper[4823]: I1216 07:25:25.772089 4823 scope.go:117] "RemoveContainer" containerID="37b5da4c3e0632087412acf947c72a2aad7577385641e763185ee25747c43921" Dec 16 07:25:25 crc kubenswrapper[4823]: E1216 07:25:25.772985 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 07:25:38 crc kubenswrapper[4823]: I1216 07:25:38.773262 4823 scope.go:117] "RemoveContainer" containerID="37b5da4c3e0632087412acf947c72a2aad7577385641e763185ee25747c43921" Dec 16 07:25:38 crc kubenswrapper[4823]: E1216 07:25:38.774758 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 
16 07:25:39 crc kubenswrapper[4823]: I1216 07:25:39.306929 4823 scope.go:117] "RemoveContainer" containerID="31248dd72823a40fe4ee23b7fbaa7a419c7a61036cec2f854466ad00f8a80f4b" Dec 16 07:25:39 crc kubenswrapper[4823]: I1216 07:25:39.355355 4823 scope.go:117] "RemoveContainer" containerID="9645652666b527c7d0539b4988e942317ac4144cec04a21d624b294741e7213e" Dec 16 07:25:39 crc kubenswrapper[4823]: I1216 07:25:39.389825 4823 scope.go:117] "RemoveContainer" containerID="7e146bbc79bbe9eb68a312975f35b67d90430fd0786d945860fd8ab6a984eb53" Dec 16 07:25:39 crc kubenswrapper[4823]: I1216 07:25:39.411293 4823 scope.go:117] "RemoveContainer" containerID="40d274525e2d9a0c414e988dd7895e2962042be8fc3f624f35efa6cd9aab456e" Dec 16 07:25:39 crc kubenswrapper[4823]: I1216 07:25:39.434602 4823 scope.go:117] "RemoveContainer" containerID="66fb8cb9dc2bdcafdd3f90d87593590c0679946f3ccdc9113b87fb499e690755" Dec 16 07:25:51 crc kubenswrapper[4823]: I1216 07:25:51.776749 4823 scope.go:117] "RemoveContainer" containerID="37b5da4c3e0632087412acf947c72a2aad7577385641e763185ee25747c43921" Dec 16 07:25:51 crc kubenswrapper[4823]: E1216 07:25:51.778881 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 07:26:05 crc kubenswrapper[4823]: I1216 07:26:05.772256 4823 scope.go:117] "RemoveContainer" containerID="37b5da4c3e0632087412acf947c72a2aad7577385641e763185ee25747c43921" Dec 16 07:26:06 crc kubenswrapper[4823]: I1216 07:26:06.124913 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" 
event={"ID":"25dec47c-3043-486c-b371-2be103c214e3","Type":"ContainerStarted","Data":"09163b4ab4e9994c64567f3f3aa8f5c17c63088f1a3e3778b97b8c712dcc34f7"} Dec 16 07:26:39 crc kubenswrapper[4823]: I1216 07:26:39.533768 4823 scope.go:117] "RemoveContainer" containerID="07a9d0c25e6eeab6239ccf65db9c887bfc778f97b4b49626f4c06ae1fafb22b5" Dec 16 07:26:39 crc kubenswrapper[4823]: I1216 07:26:39.586388 4823 scope.go:117] "RemoveContainer" containerID="16a95f2b7fef319066cab4f1313e0adcc10ee53441d3966332ef54251e1bdd00" Dec 16 07:26:39 crc kubenswrapper[4823]: I1216 07:26:39.615164 4823 scope.go:117] "RemoveContainer" containerID="ac51929429ff0dc494a7f7bf47c48ab01b7d2dee92724a05c9ab6ca16fab3c14" Dec 16 07:26:39 crc kubenswrapper[4823]: I1216 07:26:39.644793 4823 scope.go:117] "RemoveContainer" containerID="8c0b7b84afb786dddbed1b11c3ecd557ca15d8f1a49d5fef28e57f3520fd1ec6" Dec 16 07:26:39 crc kubenswrapper[4823]: I1216 07:26:39.693633 4823 scope.go:117] "RemoveContainer" containerID="ed8da51b1f401f8f4e2d7a7d7452b6f625aac4f2e92ebf487be96e296cef532b" Dec 16 07:26:39 crc kubenswrapper[4823]: I1216 07:26:39.717544 4823 scope.go:117] "RemoveContainer" containerID="04cfc7ac2370fcc670b0a4d36151d595decad0d56f5fab56594a90ea3fd9eb05" Dec 16 07:26:39 crc kubenswrapper[4823]: I1216 07:26:39.741241 4823 scope.go:117] "RemoveContainer" containerID="c415931a0be8201cf5c9581bbc5fd4c823fe4b93ff209f48cd059567ea181a32" Dec 16 07:26:39 crc kubenswrapper[4823]: I1216 07:26:39.787319 4823 scope.go:117] "RemoveContainer" containerID="77d605fe574dc720c6f8c4e19ea7723f3ce2ff5404d8309c4b773329eab3bced" Dec 16 07:26:39 crc kubenswrapper[4823]: I1216 07:26:39.804917 4823 scope.go:117] "RemoveContainer" containerID="33c1c84e6505bf5e60cb15c74c9530a062509d13ccde74bcf09a73dbf725eeee" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.538817 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-jp62m"] Dec 16 07:26:47 crc kubenswrapper[4823]: E1216 07:26:47.539554 4823 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37eade87-02f6-4584-87d3-9b22e16ad915" containerName="container-updater" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.539573 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="37eade87-02f6-4584-87d3-9b22e16ad915" containerName="container-updater" Dec 16 07:26:47 crc kubenswrapper[4823]: E1216 07:26:47.539629 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a18c5d6c-3429-4aa3-b933-85176e0e5ece" containerName="nova-scheduler-scheduler" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.539638 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="a18c5d6c-3429-4aa3-b933-85176e0e5ece" containerName="nova-scheduler-scheduler" Dec 16 07:26:47 crc kubenswrapper[4823]: E1216 07:26:47.539658 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1" containerName="setup-container" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.539667 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1" containerName="setup-container" Dec 16 07:26:47 crc kubenswrapper[4823]: E1216 07:26:47.539679 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d2faec4-82e9-409b-a6c1-93f8cd78b9ec" containerName="nova-metadata-metadata" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.539686 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d2faec4-82e9-409b-a6c1-93f8cd78b9ec" containerName="nova-metadata-metadata" Dec 16 07:26:47 crc kubenswrapper[4823]: E1216 07:26:47.539698 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7a88b40-28bf-4b43-bed8-0b3df3baec5c" containerName="keystone-api" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.539704 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7a88b40-28bf-4b43-bed8-0b3df3baec5c" containerName="keystone-api" Dec 16 07:26:47 crc kubenswrapper[4823]: E1216 
07:26:47.539722 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bed5482-3232-4318-b8a0-dcfd51d8611b" containerName="mariadb-account-delete" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.539730 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bed5482-3232-4318-b8a0-dcfd51d8611b" containerName="mariadb-account-delete" Dec 16 07:26:47 crc kubenswrapper[4823]: E1216 07:26:47.539740 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37eade87-02f6-4584-87d3-9b22e16ad915" containerName="object-server" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.539749 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="37eade87-02f6-4584-87d3-9b22e16ad915" containerName="object-server" Dec 16 07:26:47 crc kubenswrapper[4823]: E1216 07:26:47.539759 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37eade87-02f6-4584-87d3-9b22e16ad915" containerName="object-replicator" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.539765 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="37eade87-02f6-4584-87d3-9b22e16ad915" containerName="object-replicator" Dec 16 07:26:47 crc kubenswrapper[4823]: E1216 07:26:47.539779 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d2faec4-82e9-409b-a6c1-93f8cd78b9ec" containerName="nova-metadata-log" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.539787 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d2faec4-82e9-409b-a6c1-93f8cd78b9ec" containerName="nova-metadata-log" Dec 16 07:26:47 crc kubenswrapper[4823]: E1216 07:26:47.539796 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77e933eb-7294-47b8-af0c-fbb03725d3d8" containerName="sg-core" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.539803 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="77e933eb-7294-47b8-af0c-fbb03725d3d8" containerName="sg-core" Dec 16 07:26:47 crc kubenswrapper[4823]: E1216 
07:26:47.539814 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4edb9072-dfce-44ca-88d3-64136ac7e1c3" containerName="ovsdb-server" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.539823 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="4edb9072-dfce-44ca-88d3-64136ac7e1c3" containerName="ovsdb-server" Dec 16 07:26:47 crc kubenswrapper[4823]: E1216 07:26:47.539831 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37eade87-02f6-4584-87d3-9b22e16ad915" containerName="account-auditor" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.539838 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="37eade87-02f6-4584-87d3-9b22e16ad915" containerName="account-auditor" Dec 16 07:26:47 crc kubenswrapper[4823]: E1216 07:26:47.539853 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acfde95a-b68d-4aee-9302-a81c73eafa99" containerName="proxy-httpd" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.539860 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="acfde95a-b68d-4aee-9302-a81c73eafa99" containerName="proxy-httpd" Dec 16 07:26:47 crc kubenswrapper[4823]: E1216 07:26:47.539871 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77e933eb-7294-47b8-af0c-fbb03725d3d8" containerName="proxy-httpd" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.539878 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="77e933eb-7294-47b8-af0c-fbb03725d3d8" containerName="proxy-httpd" Dec 16 07:26:47 crc kubenswrapper[4823]: E1216 07:26:47.539886 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1" containerName="rabbitmq" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.539894 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1" containerName="rabbitmq" Dec 16 07:26:47 crc kubenswrapper[4823]: E1216 07:26:47.539909 4823 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="603d469a-39a2-4d84-87cb-f2c7499b7a28" containerName="ovsdbserver-sb" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.539922 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="603d469a-39a2-4d84-87cb-f2c7499b7a28" containerName="ovsdbserver-sb" Dec 16 07:26:47 crc kubenswrapper[4823]: E1216 07:26:47.539933 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1cd2ecc-0f95-47b9-b43e-5c36fa6a33b6" containerName="barbican-keystone-listener" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.539941 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1cd2ecc-0f95-47b9-b43e-5c36fa6a33b6" containerName="barbican-keystone-listener" Dec 16 07:26:47 crc kubenswrapper[4823]: E1216 07:26:47.539955 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a9f0e08-d61e-4503-afc5-09cb29ff3175" containerName="kube-state-metrics" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.539962 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a9f0e08-d61e-4503-afc5-09cb29ff3175" containerName="kube-state-metrics" Dec 16 07:26:47 crc kubenswrapper[4823]: E1216 07:26:47.539975 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37eade87-02f6-4584-87d3-9b22e16ad915" containerName="account-server" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.539983 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="37eade87-02f6-4584-87d3-9b22e16ad915" containerName="account-server" Dec 16 07:26:47 crc kubenswrapper[4823]: E1216 07:26:47.539996 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4edb9072-dfce-44ca-88d3-64136ac7e1c3" containerName="ovs-vswitchd" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.540003 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="4edb9072-dfce-44ca-88d3-64136ac7e1c3" containerName="ovs-vswitchd" Dec 16 07:26:47 crc kubenswrapper[4823]: E1216 07:26:47.540015 4823 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37eade87-02f6-4584-87d3-9b22e16ad915" containerName="swift-recon-cron" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.540063 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="37eade87-02f6-4584-87d3-9b22e16ad915" containerName="swift-recon-cron" Dec 16 07:26:47 crc kubenswrapper[4823]: E1216 07:26:47.540073 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2b1ed60-7cb0-48f0-aebf-3de778dbb95b" containerName="neutron-httpd" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.540081 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2b1ed60-7cb0-48f0-aebf-3de778dbb95b" containerName="neutron-httpd" Dec 16 07:26:47 crc kubenswrapper[4823]: E1216 07:26:47.540092 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22db0f3f-88b5-4909-aa80-f4b020d1ce18" containerName="barbican-worker-log" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.540099 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="22db0f3f-88b5-4909-aa80-f4b020d1ce18" containerName="barbican-worker-log" Dec 16 07:26:47 crc kubenswrapper[4823]: E1216 07:26:47.540113 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37eade87-02f6-4584-87d3-9b22e16ad915" containerName="object-auditor" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.540120 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="37eade87-02f6-4584-87d3-9b22e16ad915" containerName="object-auditor" Dec 16 07:26:47 crc kubenswrapper[4823]: E1216 07:26:47.540145 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37eade87-02f6-4584-87d3-9b22e16ad915" containerName="container-auditor" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.540153 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="37eade87-02f6-4584-87d3-9b22e16ad915" containerName="container-auditor" Dec 16 07:26:47 crc kubenswrapper[4823]: E1216 07:26:47.540165 4823 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a686a945-8fa0-406c-ac01-cf061c865a28" containerName="setup-container" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.540171 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="a686a945-8fa0-406c-ac01-cf061c865a28" containerName="setup-container" Dec 16 07:26:47 crc kubenswrapper[4823]: E1216 07:26:47.540183 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="603d469a-39a2-4d84-87cb-f2c7499b7a28" containerName="openstack-network-exporter" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.540189 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="603d469a-39a2-4d84-87cb-f2c7499b7a28" containerName="openstack-network-exporter" Dec 16 07:26:47 crc kubenswrapper[4823]: E1216 07:26:47.540199 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b566f9ee-8a75-4041-aac4-1573ca610541" containerName="openstack-network-exporter" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.540205 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="b566f9ee-8a75-4041-aac4-1573ca610541" containerName="openstack-network-exporter" Dec 16 07:26:47 crc kubenswrapper[4823]: E1216 07:26:47.540214 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfd7efdc-36ba-4037-9f6c-a1a8c946ab33" containerName="mariadb-account-delete" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.540222 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfd7efdc-36ba-4037-9f6c-a1a8c946ab33" containerName="mariadb-account-delete" Dec 16 07:26:47 crc kubenswrapper[4823]: E1216 07:26:47.540235 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4795acd-bc9b-4c2c-aaa2-feb41c3c491f" containerName="init" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.540243 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4795acd-bc9b-4c2c-aaa2-feb41c3c491f" containerName="init" Dec 16 07:26:47 crc kubenswrapper[4823]: E1216 
07:26:47.540254 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbcff04b-7d0d-45b4-bc28-7882421c6000" containerName="mysql-bootstrap" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.540263 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbcff04b-7d0d-45b4-bc28-7882421c6000" containerName="mysql-bootstrap" Dec 16 07:26:47 crc kubenswrapper[4823]: E1216 07:26:47.540274 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc7377ca-c7ab-4ee0-ae2a-5fbb782ba925" containerName="glance-log" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.540282 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc7377ca-c7ab-4ee0-ae2a-5fbb782ba925" containerName="glance-log" Dec 16 07:26:47 crc kubenswrapper[4823]: E1216 07:26:47.540296 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45a2fe80-7cf2-4419-91c9-3c958d33d5a8" containerName="mysql-bootstrap" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.540303 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="45a2fe80-7cf2-4419-91c9-3c958d33d5a8" containerName="mysql-bootstrap" Dec 16 07:26:47 crc kubenswrapper[4823]: E1216 07:26:47.540311 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17cbb31a-6067-4925-ba57-956baf53ce8b" containerName="cinder-api" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.540319 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="17cbb31a-6067-4925-ba57-956baf53ce8b" containerName="cinder-api" Dec 16 07:26:47 crc kubenswrapper[4823]: E1216 07:26:47.540333 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c559ee21-de8f-44a1-998a-cb0b4aff8cd7" containerName="barbican-api-log" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.540340 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="c559ee21-de8f-44a1-998a-cb0b4aff8cd7" containerName="barbican-api-log" Dec 16 07:26:47 crc kubenswrapper[4823]: E1216 07:26:47.540349 4823 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efe17b2e-19bd-430b-8cb5-147ed1d2ffb6" containerName="openstack-network-exporter" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.540356 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="efe17b2e-19bd-430b-8cb5-147ed1d2ffb6" containerName="openstack-network-exporter" Dec 16 07:26:47 crc kubenswrapper[4823]: E1216 07:26:47.540365 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45a2fe80-7cf2-4419-91c9-3c958d33d5a8" containerName="galera" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.540372 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="45a2fe80-7cf2-4419-91c9-3c958d33d5a8" containerName="galera" Dec 16 07:26:47 crc kubenswrapper[4823]: E1216 07:26:47.540381 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77e933eb-7294-47b8-af0c-fbb03725d3d8" containerName="ceilometer-central-agent" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.540388 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="77e933eb-7294-47b8-af0c-fbb03725d3d8" containerName="ceilometer-central-agent" Dec 16 07:26:47 crc kubenswrapper[4823]: E1216 07:26:47.540396 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="362dcfe9-8417-425b-8eab-8bd39bf661fc" containerName="mariadb-account-delete" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.540404 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="362dcfe9-8417-425b-8eab-8bd39bf661fc" containerName="mariadb-account-delete" Dec 16 07:26:47 crc kubenswrapper[4823]: E1216 07:26:47.540414 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbcff04b-7d0d-45b4-bc28-7882421c6000" containerName="galera" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.540421 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbcff04b-7d0d-45b4-bc28-7882421c6000" containerName="galera" Dec 16 07:26:47 crc kubenswrapper[4823]: E1216 07:26:47.540430 
4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3eee92de-9c0e-4afd-8a27-52d82caa27ad" containerName="memcached" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.540437 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="3eee92de-9c0e-4afd-8a27-52d82caa27ad" containerName="memcached" Dec 16 07:26:47 crc kubenswrapper[4823]: E1216 07:26:47.540444 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="196356f3-e866-4cf1-b3e8-eba3d9e4c99f" containerName="placement-api" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.540451 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="196356f3-e866-4cf1-b3e8-eba3d9e4c99f" containerName="placement-api" Dec 16 07:26:47 crc kubenswrapper[4823]: E1216 07:26:47.540462 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbb285b0-26ce-494d-9d69-8fe905e39469" containerName="nova-cell1-novncproxy-novncproxy" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.540481 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbb285b0-26ce-494d-9d69-8fe905e39469" containerName="nova-cell1-novncproxy-novncproxy" Dec 16 07:26:47 crc kubenswrapper[4823]: E1216 07:26:47.540493 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4edb9072-dfce-44ca-88d3-64136ac7e1c3" containerName="ovsdb-server-init" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.540500 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="4edb9072-dfce-44ca-88d3-64136ac7e1c3" containerName="ovsdb-server-init" Dec 16 07:26:47 crc kubenswrapper[4823]: E1216 07:26:47.540510 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a27cd126-6c5b-4e95-b313-0bb19568f42a" containerName="probe" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.540518 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="a27cd126-6c5b-4e95-b313-0bb19568f42a" containerName="probe" Dec 16 07:26:47 crc kubenswrapper[4823]: E1216 07:26:47.540529 4823 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37eade87-02f6-4584-87d3-9b22e16ad915" containerName="rsync" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.540539 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="37eade87-02f6-4584-87d3-9b22e16ad915" containerName="rsync" Dec 16 07:26:47 crc kubenswrapper[4823]: E1216 07:26:47.540548 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37eade87-02f6-4584-87d3-9b22e16ad915" containerName="account-reaper" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.540556 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="37eade87-02f6-4584-87d3-9b22e16ad915" containerName="account-reaper" Dec 16 07:26:47 crc kubenswrapper[4823]: E1216 07:26:47.540566 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acfde95a-b68d-4aee-9302-a81c73eafa99" containerName="proxy-server" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.540573 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="acfde95a-b68d-4aee-9302-a81c73eafa99" containerName="proxy-server" Dec 16 07:26:47 crc kubenswrapper[4823]: E1216 07:26:47.540586 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37eade87-02f6-4584-87d3-9b22e16ad915" containerName="object-expirer" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.540593 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="37eade87-02f6-4584-87d3-9b22e16ad915" containerName="object-expirer" Dec 16 07:26:47 crc kubenswrapper[4823]: E1216 07:26:47.540603 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22db0f3f-88b5-4909-aa80-f4b020d1ce18" containerName="barbican-worker" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.540611 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="22db0f3f-88b5-4909-aa80-f4b020d1ce18" containerName="barbican-worker" Dec 16 07:26:47 crc kubenswrapper[4823]: E1216 07:26:47.540624 4823 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="bc7377ca-c7ab-4ee0-ae2a-5fbb782ba925" containerName="glance-httpd" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.540631 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc7377ca-c7ab-4ee0-ae2a-5fbb782ba925" containerName="glance-httpd" Dec 16 07:26:47 crc kubenswrapper[4823]: E1216 07:26:47.540641 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81b4a0e9-2642-4fc1-b6fc-f5a0367a34ab" containerName="mariadb-account-delete" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.540649 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="81b4a0e9-2642-4fc1-b6fc-f5a0367a34ab" containerName="mariadb-account-delete" Dec 16 07:26:47 crc kubenswrapper[4823]: E1216 07:26:47.540663 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ad8e2a2-14c6-45b5-86f3-e4765cddd777" containerName="nova-cell0-conductor-conductor" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.540671 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ad8e2a2-14c6-45b5-86f3-e4765cddd777" containerName="nova-cell0-conductor-conductor" Dec 16 07:26:47 crc kubenswrapper[4823]: E1216 07:26:47.540680 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77e933eb-7294-47b8-af0c-fbb03725d3d8" containerName="ceilometer-notification-agent" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.540688 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="77e933eb-7294-47b8-af0c-fbb03725d3d8" containerName="ceilometer-notification-agent" Dec 16 07:26:47 crc kubenswrapper[4823]: E1216 07:26:47.540698 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3d6c697-a49c-4919-81b5-6899a080d06b" containerName="nova-api-log" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.540705 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3d6c697-a49c-4919-81b5-6899a080d06b" containerName="nova-api-log" Dec 16 07:26:47 crc kubenswrapper[4823]: E1216 
07:26:47.540718 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a3f54ee-1dba-42f5-8697-b70de0f5b4c2" containerName="mariadb-account-delete" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.540726 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a3f54ee-1dba-42f5-8697-b70de0f5b4c2" containerName="mariadb-account-delete" Dec 16 07:26:47 crc kubenswrapper[4823]: E1216 07:26:47.540738 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfd02f05-0804-48c6-b9b4-cda88fd6b14a" containerName="ovn-northd" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.540746 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfd02f05-0804-48c6-b9b4-cda88fd6b14a" containerName="ovn-northd" Dec 16 07:26:47 crc kubenswrapper[4823]: E1216 07:26:47.540755 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37eade87-02f6-4584-87d3-9b22e16ad915" containerName="account-replicator" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.540763 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="37eade87-02f6-4584-87d3-9b22e16ad915" containerName="account-replicator" Dec 16 07:26:47 crc kubenswrapper[4823]: E1216 07:26:47.540775 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17cbb31a-6067-4925-ba57-956baf53ce8b" containerName="cinder-api-log" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.540782 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="17cbb31a-6067-4925-ba57-956baf53ce8b" containerName="cinder-api-log" Dec 16 07:26:47 crc kubenswrapper[4823]: E1216 07:26:47.540794 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbee1863-ef4e-4d0a-aca7-f7c09e3f0a50" containerName="glance-httpd" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.540802 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbee1863-ef4e-4d0a-aca7-f7c09e3f0a50" containerName="glance-httpd" Dec 16 07:26:47 crc kubenswrapper[4823]: E1216 
07:26:47.540813 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fe879e4-70bf-4f38-a4a7-98f5eb23a769" containerName="ovn-controller" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.540831 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fe879e4-70bf-4f38-a4a7-98f5eb23a769" containerName="ovn-controller" Dec 16 07:26:47 crc kubenswrapper[4823]: E1216 07:26:47.540844 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec00a24a-8417-452e-a350-b46f36d4a84d" containerName="mariadb-account-delete" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.540852 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec00a24a-8417-452e-a350-b46f36d4a84d" containerName="mariadb-account-delete" Dec 16 07:26:47 crc kubenswrapper[4823]: E1216 07:26:47.540861 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfd02f05-0804-48c6-b9b4-cda88fd6b14a" containerName="openstack-network-exporter" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.540869 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfd02f05-0804-48c6-b9b4-cda88fd6b14a" containerName="openstack-network-exporter" Dec 16 07:26:47 crc kubenswrapper[4823]: E1216 07:26:47.540880 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37eade87-02f6-4584-87d3-9b22e16ad915" containerName="object-updater" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.540888 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="37eade87-02f6-4584-87d3-9b22e16ad915" containerName="object-updater" Dec 16 07:26:47 crc kubenswrapper[4823]: E1216 07:26:47.540914 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79a24114-2ee1-4cc0-9045-770fcf074950" containerName="nova-cell1-conductor-conductor" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.540922 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="79a24114-2ee1-4cc0-9045-770fcf074950" containerName="nova-cell1-conductor-conductor" 
Dec 16 07:26:47 crc kubenswrapper[4823]: E1216 07:26:47.540934 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2b1ed60-7cb0-48f0-aebf-3de778dbb95b" containerName="neutron-api" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.540943 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2b1ed60-7cb0-48f0-aebf-3de778dbb95b" containerName="neutron-api" Dec 16 07:26:47 crc kubenswrapper[4823]: E1216 07:26:47.540955 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1cd2ecc-0f95-47b9-b43e-5c36fa6a33b6" containerName="barbican-keystone-listener-log" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.540964 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1cd2ecc-0f95-47b9-b43e-5c36fa6a33b6" containerName="barbican-keystone-listener-log" Dec 16 07:26:47 crc kubenswrapper[4823]: E1216 07:26:47.540983 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b566f9ee-8a75-4041-aac4-1573ca610541" containerName="ovsdbserver-nb" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.540991 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="b566f9ee-8a75-4041-aac4-1573ca610541" containerName="ovsdbserver-nb" Dec 16 07:26:47 crc kubenswrapper[4823]: E1216 07:26:47.541000 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c559ee21-de8f-44a1-998a-cb0b4aff8cd7" containerName="barbican-api" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.541008 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="c559ee21-de8f-44a1-998a-cb0b4aff8cd7" containerName="barbican-api" Dec 16 07:26:47 crc kubenswrapper[4823]: E1216 07:26:47.541018 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a686a945-8fa0-406c-ac01-cf061c865a28" containerName="rabbitmq" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.541046 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="a686a945-8fa0-406c-ac01-cf061c865a28" containerName="rabbitmq" Dec 16 
07:26:47 crc kubenswrapper[4823]: E1216 07:26:47.541058 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3d6c697-a49c-4919-81b5-6899a080d06b" containerName="nova-api-api" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.541066 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3d6c697-a49c-4919-81b5-6899a080d06b" containerName="nova-api-api" Dec 16 07:26:47 crc kubenswrapper[4823]: E1216 07:26:47.541077 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37eade87-02f6-4584-87d3-9b22e16ad915" containerName="container-replicator" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.541083 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="37eade87-02f6-4584-87d3-9b22e16ad915" containerName="container-replicator" Dec 16 07:26:47 crc kubenswrapper[4823]: E1216 07:26:47.541095 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a27cd126-6c5b-4e95-b313-0bb19568f42a" containerName="cinder-scheduler" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.541103 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="a27cd126-6c5b-4e95-b313-0bb19568f42a" containerName="cinder-scheduler" Dec 16 07:26:47 crc kubenswrapper[4823]: E1216 07:26:47.541111 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65278526-b5ee-4e40-b66b-1ee9b993f429" containerName="mariadb-account-delete" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.541118 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="65278526-b5ee-4e40-b66b-1ee9b993f429" containerName="mariadb-account-delete" Dec 16 07:26:47 crc kubenswrapper[4823]: E1216 07:26:47.541127 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="196356f3-e866-4cf1-b3e8-eba3d9e4c99f" containerName="placement-log" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.541134 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="196356f3-e866-4cf1-b3e8-eba3d9e4c99f" containerName="placement-log" Dec 
16 07:26:47 crc kubenswrapper[4823]: E1216 07:26:47.541142 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4795acd-bc9b-4c2c-aaa2-feb41c3c491f" containerName="dnsmasq-dns" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.541149 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4795acd-bc9b-4c2c-aaa2-feb41c3c491f" containerName="dnsmasq-dns" Dec 16 07:26:47 crc kubenswrapper[4823]: E1216 07:26:47.541160 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37eade87-02f6-4584-87d3-9b22e16ad915" containerName="container-server" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.541167 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="37eade87-02f6-4584-87d3-9b22e16ad915" containerName="container-server" Dec 16 07:26:47 crc kubenswrapper[4823]: E1216 07:26:47.541176 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbee1863-ef4e-4d0a-aca7-f7c09e3f0a50" containerName="glance-log" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.541184 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbee1863-ef4e-4d0a-aca7-f7c09e3f0a50" containerName="glance-log" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.541375 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="362dcfe9-8417-425b-8eab-8bd39bf661fc" containerName="mariadb-account-delete" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.541403 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbcff04b-7d0d-45b4-bc28-7882421c6000" containerName="galera" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.541421 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="37eade87-02f6-4584-87d3-9b22e16ad915" containerName="object-auditor" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.541433 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="37eade87-02f6-4584-87d3-9b22e16ad915" containerName="object-expirer" Dec 16 07:26:47 crc 
kubenswrapper[4823]: I1216 07:26:47.541451 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="efe17b2e-19bd-430b-8cb5-147ed1d2ffb6" containerName="openstack-network-exporter" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.541463 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="37eade87-02f6-4584-87d3-9b22e16ad915" containerName="rsync" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.541477 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbee1863-ef4e-4d0a-aca7-f7c09e3f0a50" containerName="glance-log" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.541585 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="37eade87-02f6-4584-87d3-9b22e16ad915" containerName="container-auditor" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.541609 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="a18c5d6c-3429-4aa3-b933-85176e0e5ece" containerName="nova-scheduler-scheduler" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.541624 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a9f0e08-d61e-4503-afc5-09cb29ff3175" containerName="kube-state-metrics" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.541634 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbee1863-ef4e-4d0a-aca7-f7c09e3f0a50" containerName="glance-httpd" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.541647 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfb273dd-7ca8-42e5-97b3-d3ea5c4010e1" containerName="rabbitmq" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.541659 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="196356f3-e866-4cf1-b3e8-eba3d9e4c99f" containerName="placement-api" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.541669 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="37eade87-02f6-4584-87d3-9b22e16ad915" 
containerName="container-updater" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.541677 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfd7efdc-36ba-4037-9f6c-a1a8c946ab33" containerName="mariadb-account-delete" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.541687 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="a27cd126-6c5b-4e95-b313-0bb19568f42a" containerName="cinder-scheduler" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.541696 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="5fe879e4-70bf-4f38-a4a7-98f5eb23a769" containerName="ovn-controller" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.541703 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1cd2ecc-0f95-47b9-b43e-5c36fa6a33b6" containerName="barbican-keystone-listener-log" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.541715 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="acfde95a-b68d-4aee-9302-a81c73eafa99" containerName="proxy-httpd" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.541722 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="37eade87-02f6-4584-87d3-9b22e16ad915" containerName="container-server" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.541734 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="37eade87-02f6-4584-87d3-9b22e16ad915" containerName="swift-recon-cron" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.541741 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="79a24114-2ee1-4cc0-9045-770fcf074950" containerName="nova-cell1-conductor-conductor" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.541752 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2b1ed60-7cb0-48f0-aebf-3de778dbb95b" containerName="neutron-httpd" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.541768 4823 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="65278526-b5ee-4e40-b66b-1ee9b993f429" containerName="mariadb-account-delete" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.541775 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="a27cd126-6c5b-4e95-b313-0bb19568f42a" containerName="probe" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.541784 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="37eade87-02f6-4584-87d3-9b22e16ad915" containerName="object-replicator" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.541810 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="c559ee21-de8f-44a1-998a-cb0b4aff8cd7" containerName="barbican-api-log" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.541819 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="22db0f3f-88b5-4909-aa80-f4b020d1ce18" containerName="barbican-worker-log" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.541827 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3d6c697-a49c-4919-81b5-6899a080d06b" containerName="nova-api-log" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.541838 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="37eade87-02f6-4584-87d3-9b22e16ad915" containerName="account-reaper" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.541847 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="17cbb31a-6067-4925-ba57-956baf53ce8b" containerName="cinder-api" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.541859 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="196356f3-e866-4cf1-b3e8-eba3d9e4c99f" containerName="placement-log" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.541867 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="77e933eb-7294-47b8-af0c-fbb03725d3d8" containerName="proxy-httpd" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.541875 4823 
memory_manager.go:354] "RemoveStaleState removing state" podUID="bc7377ca-c7ab-4ee0-ae2a-5fbb782ba925" containerName="glance-httpd" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.541882 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="4edb9072-dfce-44ca-88d3-64136ac7e1c3" containerName="ovsdb-server" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.541894 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="a686a945-8fa0-406c-ac01-cf061c865a28" containerName="rabbitmq" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.541901 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbb285b0-26ce-494d-9d69-8fe905e39469" containerName="nova-cell1-novncproxy-novncproxy" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.541909 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfd02f05-0804-48c6-b9b4-cda88fd6b14a" containerName="ovn-northd" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.541918 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="22db0f3f-88b5-4909-aa80-f4b020d1ce18" containerName="barbican-worker" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.541924 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3d6c697-a49c-4919-81b5-6899a080d06b" containerName="nova-api-api" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.541932 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="603d469a-39a2-4d84-87cb-f2c7499b7a28" containerName="openstack-network-exporter" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.541939 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ad8e2a2-14c6-45b5-86f3-e4765cddd777" containerName="nova-cell0-conductor-conductor" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.541946 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="37eade87-02f6-4584-87d3-9b22e16ad915" containerName="container-replicator" Dec 16 
07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.541956 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="37eade87-02f6-4584-87d3-9b22e16ad915" containerName="object-server" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.541963 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d2faec4-82e9-409b-a6c1-93f8cd78b9ec" containerName="nova-metadata-metadata" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.541986 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="37eade87-02f6-4584-87d3-9b22e16ad915" containerName="object-updater" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.541996 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7a88b40-28bf-4b43-bed8-0b3df3baec5c" containerName="keystone-api" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.542004 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="77e933eb-7294-47b8-af0c-fbb03725d3d8" containerName="ceilometer-notification-agent" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.542015 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="81b4a0e9-2642-4fc1-b6fc-f5a0367a34ab" containerName="mariadb-account-delete" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.542042 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="37eade87-02f6-4584-87d3-9b22e16ad915" containerName="account-server" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.542053 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="4edb9072-dfce-44ca-88d3-64136ac7e1c3" containerName="ovs-vswitchd" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.542061 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a3f54ee-1dba-42f5-8697-b70de0f5b4c2" containerName="mariadb-account-delete" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.542073 4823 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="b1cd2ecc-0f95-47b9-b43e-5c36fa6a33b6" containerName="barbican-keystone-listener" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.542085 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="77e933eb-7294-47b8-af0c-fbb03725d3d8" containerName="ceilometer-central-agent" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.542095 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="b566f9ee-8a75-4041-aac4-1573ca610541" containerName="openstack-network-exporter" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.542103 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="603d469a-39a2-4d84-87cb-f2c7499b7a28" containerName="ovsdbserver-sb" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.542112 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="77e933eb-7294-47b8-af0c-fbb03725d3d8" containerName="sg-core" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.542119 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="acfde95a-b68d-4aee-9302-a81c73eafa99" containerName="proxy-server" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.542131 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="3eee92de-9c0e-4afd-8a27-52d82caa27ad" containerName="memcached" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.542140 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="b566f9ee-8a75-4041-aac4-1573ca610541" containerName="ovsdbserver-nb" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.542150 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="37eade87-02f6-4584-87d3-9b22e16ad915" containerName="account-auditor" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.542161 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfd02f05-0804-48c6-b9b4-cda88fd6b14a" containerName="openstack-network-exporter" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.542169 4823 
memory_manager.go:354] "RemoveStaleState removing state" podUID="45a2fe80-7cf2-4419-91c9-3c958d33d5a8" containerName="galera" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.542179 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4795acd-bc9b-4c2c-aaa2-feb41c3c491f" containerName="dnsmasq-dns" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.542189 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec00a24a-8417-452e-a350-b46f36d4a84d" containerName="mariadb-account-delete" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.542199 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc7377ca-c7ab-4ee0-ae2a-5fbb782ba925" containerName="glance-log" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.542211 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="c559ee21-de8f-44a1-998a-cb0b4aff8cd7" containerName="barbican-api" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.542219 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d2faec4-82e9-409b-a6c1-93f8cd78b9ec" containerName="nova-metadata-log" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.542228 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="37eade87-02f6-4584-87d3-9b22e16ad915" containerName="account-replicator" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.542236 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2b1ed60-7cb0-48f0-aebf-3de778dbb95b" containerName="neutron-api" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.542245 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="17cbb31a-6067-4925-ba57-956baf53ce8b" containerName="cinder-api-log" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.542254 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="0bed5482-3232-4318-b8a0-dcfd51d8611b" containerName="mariadb-account-delete" Dec 16 07:26:47 crc kubenswrapper[4823]: 
I1216 07:26:47.543619 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jp62m" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.553529 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jp62m"] Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.719330 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/392c68ba-f5dc-4cb0-9019-586aba3c8ee8-catalog-content\") pod \"redhat-operators-jp62m\" (UID: \"392c68ba-f5dc-4cb0-9019-586aba3c8ee8\") " pod="openshift-marketplace/redhat-operators-jp62m" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.719968 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/392c68ba-f5dc-4cb0-9019-586aba3c8ee8-utilities\") pod \"redhat-operators-jp62m\" (UID: \"392c68ba-f5dc-4cb0-9019-586aba3c8ee8\") " pod="openshift-marketplace/redhat-operators-jp62m" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.720195 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smmbd\" (UniqueName: \"kubernetes.io/projected/392c68ba-f5dc-4cb0-9019-586aba3c8ee8-kube-api-access-smmbd\") pod \"redhat-operators-jp62m\" (UID: \"392c68ba-f5dc-4cb0-9019-586aba3c8ee8\") " pod="openshift-marketplace/redhat-operators-jp62m" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.821996 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/392c68ba-f5dc-4cb0-9019-586aba3c8ee8-catalog-content\") pod \"redhat-operators-jp62m\" (UID: \"392c68ba-f5dc-4cb0-9019-586aba3c8ee8\") " pod="openshift-marketplace/redhat-operators-jp62m" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 
07:26:47.822318 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/392c68ba-f5dc-4cb0-9019-586aba3c8ee8-utilities\") pod \"redhat-operators-jp62m\" (UID: \"392c68ba-f5dc-4cb0-9019-586aba3c8ee8\") " pod="openshift-marketplace/redhat-operators-jp62m" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.822539 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-smmbd\" (UniqueName: \"kubernetes.io/projected/392c68ba-f5dc-4cb0-9019-586aba3c8ee8-kube-api-access-smmbd\") pod \"redhat-operators-jp62m\" (UID: \"392c68ba-f5dc-4cb0-9019-586aba3c8ee8\") " pod="openshift-marketplace/redhat-operators-jp62m" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.822551 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/392c68ba-f5dc-4cb0-9019-586aba3c8ee8-catalog-content\") pod \"redhat-operators-jp62m\" (UID: \"392c68ba-f5dc-4cb0-9019-586aba3c8ee8\") " pod="openshift-marketplace/redhat-operators-jp62m" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.822746 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/392c68ba-f5dc-4cb0-9019-586aba3c8ee8-utilities\") pod \"redhat-operators-jp62m\" (UID: \"392c68ba-f5dc-4cb0-9019-586aba3c8ee8\") " pod="openshift-marketplace/redhat-operators-jp62m" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.844211 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-smmbd\" (UniqueName: \"kubernetes.io/projected/392c68ba-f5dc-4cb0-9019-586aba3c8ee8-kube-api-access-smmbd\") pod \"redhat-operators-jp62m\" (UID: \"392c68ba-f5dc-4cb0-9019-586aba3c8ee8\") " pod="openshift-marketplace/redhat-operators-jp62m" Dec 16 07:26:47 crc kubenswrapper[4823]: I1216 07:26:47.877765 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jp62m" Dec 16 07:26:48 crc kubenswrapper[4823]: I1216 07:26:48.355969 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jp62m"] Dec 16 07:26:48 crc kubenswrapper[4823]: I1216 07:26:48.478437 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jp62m" event={"ID":"392c68ba-f5dc-4cb0-9019-586aba3c8ee8","Type":"ContainerStarted","Data":"3afe9a7110f8e72844243b7792d4d26c8715c2f234e166a868e42e4e1bd2c466"} Dec 16 07:26:49 crc kubenswrapper[4823]: I1216 07:26:49.487633 4823 generic.go:334] "Generic (PLEG): container finished" podID="392c68ba-f5dc-4cb0-9019-586aba3c8ee8" containerID="b2a0fd5333fabaf78d857ee066c09bae1eebb927090fcc1bab55ef3119361539" exitCode=0 Dec 16 07:26:49 crc kubenswrapper[4823]: I1216 07:26:49.487692 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jp62m" event={"ID":"392c68ba-f5dc-4cb0-9019-586aba3c8ee8","Type":"ContainerDied","Data":"b2a0fd5333fabaf78d857ee066c09bae1eebb927090fcc1bab55ef3119361539"} Dec 16 07:26:49 crc kubenswrapper[4823]: I1216 07:26:49.489938 4823 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 16 07:26:51 crc kubenswrapper[4823]: I1216 07:26:51.508556 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jp62m" event={"ID":"392c68ba-f5dc-4cb0-9019-586aba3c8ee8","Type":"ContainerStarted","Data":"95290cb2bfd64ad86de6b5c1927d079acbcc7f862d565f0ac434faccd17556c4"} Dec 16 07:26:52 crc kubenswrapper[4823]: I1216 07:26:52.519546 4823 generic.go:334] "Generic (PLEG): container finished" podID="392c68ba-f5dc-4cb0-9019-586aba3c8ee8" containerID="95290cb2bfd64ad86de6b5c1927d079acbcc7f862d565f0ac434faccd17556c4" exitCode=0 Dec 16 07:26:52 crc kubenswrapper[4823]: I1216 07:26:52.519600 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-jp62m" event={"ID":"392c68ba-f5dc-4cb0-9019-586aba3c8ee8","Type":"ContainerDied","Data":"95290cb2bfd64ad86de6b5c1927d079acbcc7f862d565f0ac434faccd17556c4"} Dec 16 07:26:53 crc kubenswrapper[4823]: I1216 07:26:53.527899 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jp62m" event={"ID":"392c68ba-f5dc-4cb0-9019-586aba3c8ee8","Type":"ContainerStarted","Data":"ad9ad41b01cc729033d91996ad7e658ed0c8909853bba571b879f4109487e658"} Dec 16 07:26:53 crc kubenswrapper[4823]: I1216 07:26:53.554320 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-jp62m" podStartSLOduration=2.970048453 podStartE2EDuration="6.554300222s" podCreationTimestamp="2025-12-16 07:26:47 +0000 UTC" firstStartedPulling="2025-12-16 07:26:49.489749022 +0000 UTC m=+1887.978315145" lastFinishedPulling="2025-12-16 07:26:53.074000781 +0000 UTC m=+1891.562566914" observedRunningTime="2025-12-16 07:26:53.551218985 +0000 UTC m=+1892.039785118" watchObservedRunningTime="2025-12-16 07:26:53.554300222 +0000 UTC m=+1892.042866365" Dec 16 07:26:57 crc kubenswrapper[4823]: I1216 07:26:57.878584 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-jp62m" Dec 16 07:26:57 crc kubenswrapper[4823]: I1216 07:26:57.879107 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-jp62m" Dec 16 07:26:58 crc kubenswrapper[4823]: I1216 07:26:58.938922 4823 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-jp62m" podUID="392c68ba-f5dc-4cb0-9019-586aba3c8ee8" containerName="registry-server" probeResult="failure" output=< Dec 16 07:26:58 crc kubenswrapper[4823]: timeout: failed to connect service ":50051" within 1s Dec 16 07:26:58 crc kubenswrapper[4823]: > Dec 16 07:27:07 crc kubenswrapper[4823]: I1216 
07:27:07.924835 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-jp62m" Dec 16 07:27:07 crc kubenswrapper[4823]: I1216 07:27:07.969842 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-jp62m" Dec 16 07:27:08 crc kubenswrapper[4823]: I1216 07:27:08.162235 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jp62m"] Dec 16 07:27:09 crc kubenswrapper[4823]: I1216 07:27:09.736472 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-jp62m" podUID="392c68ba-f5dc-4cb0-9019-586aba3c8ee8" containerName="registry-server" containerID="cri-o://ad9ad41b01cc729033d91996ad7e658ed0c8909853bba571b879f4109487e658" gracePeriod=2 Dec 16 07:27:10 crc kubenswrapper[4823]: I1216 07:27:10.214445 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jp62m" Dec 16 07:27:10 crc kubenswrapper[4823]: I1216 07:27:10.358649 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/392c68ba-f5dc-4cb0-9019-586aba3c8ee8-catalog-content\") pod \"392c68ba-f5dc-4cb0-9019-586aba3c8ee8\" (UID: \"392c68ba-f5dc-4cb0-9019-586aba3c8ee8\") " Dec 16 07:27:10 crc kubenswrapper[4823]: I1216 07:27:10.358739 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-smmbd\" (UniqueName: \"kubernetes.io/projected/392c68ba-f5dc-4cb0-9019-586aba3c8ee8-kube-api-access-smmbd\") pod \"392c68ba-f5dc-4cb0-9019-586aba3c8ee8\" (UID: \"392c68ba-f5dc-4cb0-9019-586aba3c8ee8\") " Dec 16 07:27:10 crc kubenswrapper[4823]: I1216 07:27:10.358994 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/392c68ba-f5dc-4cb0-9019-586aba3c8ee8-utilities\") pod \"392c68ba-f5dc-4cb0-9019-586aba3c8ee8\" (UID: \"392c68ba-f5dc-4cb0-9019-586aba3c8ee8\") " Dec 16 07:27:10 crc kubenswrapper[4823]: I1216 07:27:10.360053 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/392c68ba-f5dc-4cb0-9019-586aba3c8ee8-utilities" (OuterVolumeSpecName: "utilities") pod "392c68ba-f5dc-4cb0-9019-586aba3c8ee8" (UID: "392c68ba-f5dc-4cb0-9019-586aba3c8ee8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:27:10 crc kubenswrapper[4823]: I1216 07:27:10.364933 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/392c68ba-f5dc-4cb0-9019-586aba3c8ee8-kube-api-access-smmbd" (OuterVolumeSpecName: "kube-api-access-smmbd") pod "392c68ba-f5dc-4cb0-9019-586aba3c8ee8" (UID: "392c68ba-f5dc-4cb0-9019-586aba3c8ee8"). InnerVolumeSpecName "kube-api-access-smmbd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:27:10 crc kubenswrapper[4823]: I1216 07:27:10.461242 4823 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/392c68ba-f5dc-4cb0-9019-586aba3c8ee8-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 07:27:10 crc kubenswrapper[4823]: I1216 07:27:10.461294 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-smmbd\" (UniqueName: \"kubernetes.io/projected/392c68ba-f5dc-4cb0-9019-586aba3c8ee8-kube-api-access-smmbd\") on node \"crc\" DevicePath \"\"" Dec 16 07:27:10 crc kubenswrapper[4823]: I1216 07:27:10.515394 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/392c68ba-f5dc-4cb0-9019-586aba3c8ee8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "392c68ba-f5dc-4cb0-9019-586aba3c8ee8" (UID: "392c68ba-f5dc-4cb0-9019-586aba3c8ee8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:27:10 crc kubenswrapper[4823]: I1216 07:27:10.566811 4823 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/392c68ba-f5dc-4cb0-9019-586aba3c8ee8-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 07:27:10 crc kubenswrapper[4823]: I1216 07:27:10.745643 4823 generic.go:334] "Generic (PLEG): container finished" podID="392c68ba-f5dc-4cb0-9019-586aba3c8ee8" containerID="ad9ad41b01cc729033d91996ad7e658ed0c8909853bba571b879f4109487e658" exitCode=0 Dec 16 07:27:10 crc kubenswrapper[4823]: I1216 07:27:10.745736 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jp62m" event={"ID":"392c68ba-f5dc-4cb0-9019-586aba3c8ee8","Type":"ContainerDied","Data":"ad9ad41b01cc729033d91996ad7e658ed0c8909853bba571b879f4109487e658"} Dec 16 07:27:10 crc kubenswrapper[4823]: I1216 07:27:10.745769 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jp62m" event={"ID":"392c68ba-f5dc-4cb0-9019-586aba3c8ee8","Type":"ContainerDied","Data":"3afe9a7110f8e72844243b7792d4d26c8715c2f234e166a868e42e4e1bd2c466"} Dec 16 07:27:10 crc kubenswrapper[4823]: I1216 07:27:10.745791 4823 scope.go:117] "RemoveContainer" containerID="ad9ad41b01cc729033d91996ad7e658ed0c8909853bba571b879f4109487e658" Dec 16 07:27:10 crc kubenswrapper[4823]: I1216 07:27:10.746059 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jp62m" Dec 16 07:27:10 crc kubenswrapper[4823]: I1216 07:27:10.782155 4823 scope.go:117] "RemoveContainer" containerID="95290cb2bfd64ad86de6b5c1927d079acbcc7f862d565f0ac434faccd17556c4" Dec 16 07:27:10 crc kubenswrapper[4823]: I1216 07:27:10.790265 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jp62m"] Dec 16 07:27:10 crc kubenswrapper[4823]: I1216 07:27:10.797682 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-jp62m"] Dec 16 07:27:10 crc kubenswrapper[4823]: I1216 07:27:10.808360 4823 scope.go:117] "RemoveContainer" containerID="b2a0fd5333fabaf78d857ee066c09bae1eebb927090fcc1bab55ef3119361539" Dec 16 07:27:10 crc kubenswrapper[4823]: I1216 07:27:10.843575 4823 scope.go:117] "RemoveContainer" containerID="ad9ad41b01cc729033d91996ad7e658ed0c8909853bba571b879f4109487e658" Dec 16 07:27:10 crc kubenswrapper[4823]: E1216 07:27:10.844103 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad9ad41b01cc729033d91996ad7e658ed0c8909853bba571b879f4109487e658\": container with ID starting with ad9ad41b01cc729033d91996ad7e658ed0c8909853bba571b879f4109487e658 not found: ID does not exist" containerID="ad9ad41b01cc729033d91996ad7e658ed0c8909853bba571b879f4109487e658" Dec 16 07:27:10 crc kubenswrapper[4823]: I1216 07:27:10.844149 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad9ad41b01cc729033d91996ad7e658ed0c8909853bba571b879f4109487e658"} err="failed to get container status \"ad9ad41b01cc729033d91996ad7e658ed0c8909853bba571b879f4109487e658\": rpc error: code = NotFound desc = could not find container \"ad9ad41b01cc729033d91996ad7e658ed0c8909853bba571b879f4109487e658\": container with ID starting with ad9ad41b01cc729033d91996ad7e658ed0c8909853bba571b879f4109487e658 not found: ID does 
not exist" Dec 16 07:27:10 crc kubenswrapper[4823]: I1216 07:27:10.844176 4823 scope.go:117] "RemoveContainer" containerID="95290cb2bfd64ad86de6b5c1927d079acbcc7f862d565f0ac434faccd17556c4" Dec 16 07:27:10 crc kubenswrapper[4823]: E1216 07:27:10.844470 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95290cb2bfd64ad86de6b5c1927d079acbcc7f862d565f0ac434faccd17556c4\": container with ID starting with 95290cb2bfd64ad86de6b5c1927d079acbcc7f862d565f0ac434faccd17556c4 not found: ID does not exist" containerID="95290cb2bfd64ad86de6b5c1927d079acbcc7f862d565f0ac434faccd17556c4" Dec 16 07:27:10 crc kubenswrapper[4823]: I1216 07:27:10.844492 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95290cb2bfd64ad86de6b5c1927d079acbcc7f862d565f0ac434faccd17556c4"} err="failed to get container status \"95290cb2bfd64ad86de6b5c1927d079acbcc7f862d565f0ac434faccd17556c4\": rpc error: code = NotFound desc = could not find container \"95290cb2bfd64ad86de6b5c1927d079acbcc7f862d565f0ac434faccd17556c4\": container with ID starting with 95290cb2bfd64ad86de6b5c1927d079acbcc7f862d565f0ac434faccd17556c4 not found: ID does not exist" Dec 16 07:27:10 crc kubenswrapper[4823]: I1216 07:27:10.844507 4823 scope.go:117] "RemoveContainer" containerID="b2a0fd5333fabaf78d857ee066c09bae1eebb927090fcc1bab55ef3119361539" Dec 16 07:27:10 crc kubenswrapper[4823]: E1216 07:27:10.844725 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2a0fd5333fabaf78d857ee066c09bae1eebb927090fcc1bab55ef3119361539\": container with ID starting with b2a0fd5333fabaf78d857ee066c09bae1eebb927090fcc1bab55ef3119361539 not found: ID does not exist" containerID="b2a0fd5333fabaf78d857ee066c09bae1eebb927090fcc1bab55ef3119361539" Dec 16 07:27:10 crc kubenswrapper[4823]: I1216 07:27:10.844755 4823 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2a0fd5333fabaf78d857ee066c09bae1eebb927090fcc1bab55ef3119361539"} err="failed to get container status \"b2a0fd5333fabaf78d857ee066c09bae1eebb927090fcc1bab55ef3119361539\": rpc error: code = NotFound desc = could not find container \"b2a0fd5333fabaf78d857ee066c09bae1eebb927090fcc1bab55ef3119361539\": container with ID starting with b2a0fd5333fabaf78d857ee066c09bae1eebb927090fcc1bab55ef3119361539 not found: ID does not exist" Dec 16 07:27:11 crc kubenswrapper[4823]: I1216 07:27:11.792497 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="392c68ba-f5dc-4cb0-9019-586aba3c8ee8" path="/var/lib/kubelet/pods/392c68ba-f5dc-4cb0-9019-586aba3c8ee8/volumes" Dec 16 07:27:39 crc kubenswrapper[4823]: I1216 07:27:39.948204 4823 scope.go:117] "RemoveContainer" containerID="3f783c5a727d05736908b2e0ecd933e245b110228a42ec9667cdc218c6a08477" Dec 16 07:28:28 crc kubenswrapper[4823]: I1216 07:28:28.133676 4823 patch_prober.go:28] interesting pod/machine-config-daemon-fv56f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 07:28:28 crc kubenswrapper[4823]: I1216 07:28:28.134350 4823 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 07:28:40 crc kubenswrapper[4823]: I1216 07:28:40.073132 4823 scope.go:117] "RemoveContainer" containerID="8e4f40865e8b5b6f7f423d358797fe6b396dfe5efe196d02239ab7408b314c84" Dec 16 07:28:40 crc kubenswrapper[4823]: I1216 07:28:40.097181 4823 scope.go:117] "RemoveContainer" 
containerID="72c14dbead3689fee64d41c987422c77b20738b49f095e2e65138aaa6b36bf8d" Dec 16 07:28:40 crc kubenswrapper[4823]: I1216 07:28:40.131936 4823 scope.go:117] "RemoveContainer" containerID="f07f968d96b80dfc9117a3c0164dea559bbcdd8fa65d00f555226983c461f302" Dec 16 07:28:58 crc kubenswrapper[4823]: I1216 07:28:58.133792 4823 patch_prober.go:28] interesting pod/machine-config-daemon-fv56f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 07:28:58 crc kubenswrapper[4823]: I1216 07:28:58.135487 4823 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 07:29:28 crc kubenswrapper[4823]: I1216 07:29:28.133912 4823 patch_prober.go:28] interesting pod/machine-config-daemon-fv56f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 07:29:28 crc kubenswrapper[4823]: I1216 07:29:28.134563 4823 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 07:29:28 crc kubenswrapper[4823]: I1216 07:29:28.134620 4823 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" Dec 16 07:29:28 crc 
kubenswrapper[4823]: I1216 07:29:28.135330 4823 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"09163b4ab4e9994c64567f3f3aa8f5c17c63088f1a3e3778b97b8c712dcc34f7"} pod="openshift-machine-config-operator/machine-config-daemon-fv56f" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 16 07:29:28 crc kubenswrapper[4823]: I1216 07:29:28.135388 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" containerName="machine-config-daemon" containerID="cri-o://09163b4ab4e9994c64567f3f3aa8f5c17c63088f1a3e3778b97b8c712dcc34f7" gracePeriod=600 Dec 16 07:29:28 crc kubenswrapper[4823]: I1216 07:29:28.322472 4823 generic.go:334] "Generic (PLEG): container finished" podID="25dec47c-3043-486c-b371-2be103c214e3" containerID="09163b4ab4e9994c64567f3f3aa8f5c17c63088f1a3e3778b97b8c712dcc34f7" exitCode=0 Dec 16 07:29:28 crc kubenswrapper[4823]: I1216 07:29:28.322540 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" event={"ID":"25dec47c-3043-486c-b371-2be103c214e3","Type":"ContainerDied","Data":"09163b4ab4e9994c64567f3f3aa8f5c17c63088f1a3e3778b97b8c712dcc34f7"} Dec 16 07:29:28 crc kubenswrapper[4823]: I1216 07:29:28.322630 4823 scope.go:117] "RemoveContainer" containerID="37b5da4c3e0632087412acf947c72a2aad7577385641e763185ee25747c43921" Dec 16 07:29:29 crc kubenswrapper[4823]: I1216 07:29:29.337981 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" event={"ID":"25dec47c-3043-486c-b371-2be103c214e3","Type":"ContainerStarted","Data":"9050a18beacbae3340b69e617e1ef6c1a7391ea7818aeeb48fc6cc970d45d0d3"} Dec 16 07:30:00 crc kubenswrapper[4823]: I1216 07:30:00.158964 4823 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431170-ng292"] Dec 16 07:30:00 crc kubenswrapper[4823]: E1216 07:30:00.159824 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="392c68ba-f5dc-4cb0-9019-586aba3c8ee8" containerName="extract-content" Dec 16 07:30:00 crc kubenswrapper[4823]: I1216 07:30:00.159838 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="392c68ba-f5dc-4cb0-9019-586aba3c8ee8" containerName="extract-content" Dec 16 07:30:00 crc kubenswrapper[4823]: E1216 07:30:00.159852 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="392c68ba-f5dc-4cb0-9019-586aba3c8ee8" containerName="extract-utilities" Dec 16 07:30:00 crc kubenswrapper[4823]: I1216 07:30:00.159860 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="392c68ba-f5dc-4cb0-9019-586aba3c8ee8" containerName="extract-utilities" Dec 16 07:30:00 crc kubenswrapper[4823]: E1216 07:30:00.159874 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="392c68ba-f5dc-4cb0-9019-586aba3c8ee8" containerName="registry-server" Dec 16 07:30:00 crc kubenswrapper[4823]: I1216 07:30:00.159882 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="392c68ba-f5dc-4cb0-9019-586aba3c8ee8" containerName="registry-server" Dec 16 07:30:00 crc kubenswrapper[4823]: I1216 07:30:00.160055 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="392c68ba-f5dc-4cb0-9019-586aba3c8ee8" containerName="registry-server" Dec 16 07:30:00 crc kubenswrapper[4823]: I1216 07:30:00.160545 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431170-ng292" Dec 16 07:30:00 crc kubenswrapper[4823]: I1216 07:30:00.163228 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 16 07:30:00 crc kubenswrapper[4823]: I1216 07:30:00.164014 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 16 07:30:00 crc kubenswrapper[4823]: I1216 07:30:00.172490 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431170-ng292"] Dec 16 07:30:00 crc kubenswrapper[4823]: I1216 07:30:00.274199 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/97d1f858-3be7-4e76-99be-0eda5f3f7595-config-volume\") pod \"collect-profiles-29431170-ng292\" (UID: \"97d1f858-3be7-4e76-99be-0eda5f3f7595\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431170-ng292" Dec 16 07:30:00 crc kubenswrapper[4823]: I1216 07:30:00.274262 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjgcn\" (UniqueName: \"kubernetes.io/projected/97d1f858-3be7-4e76-99be-0eda5f3f7595-kube-api-access-sjgcn\") pod \"collect-profiles-29431170-ng292\" (UID: \"97d1f858-3be7-4e76-99be-0eda5f3f7595\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431170-ng292" Dec 16 07:30:00 crc kubenswrapper[4823]: I1216 07:30:00.274295 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/97d1f858-3be7-4e76-99be-0eda5f3f7595-secret-volume\") pod \"collect-profiles-29431170-ng292\" (UID: \"97d1f858-3be7-4e76-99be-0eda5f3f7595\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29431170-ng292" Dec 16 07:30:00 crc kubenswrapper[4823]: I1216 07:30:00.376135 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/97d1f858-3be7-4e76-99be-0eda5f3f7595-config-volume\") pod \"collect-profiles-29431170-ng292\" (UID: \"97d1f858-3be7-4e76-99be-0eda5f3f7595\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431170-ng292" Dec 16 07:30:00 crc kubenswrapper[4823]: I1216 07:30:00.376215 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjgcn\" (UniqueName: \"kubernetes.io/projected/97d1f858-3be7-4e76-99be-0eda5f3f7595-kube-api-access-sjgcn\") pod \"collect-profiles-29431170-ng292\" (UID: \"97d1f858-3be7-4e76-99be-0eda5f3f7595\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431170-ng292" Dec 16 07:30:00 crc kubenswrapper[4823]: I1216 07:30:00.376297 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/97d1f858-3be7-4e76-99be-0eda5f3f7595-secret-volume\") pod \"collect-profiles-29431170-ng292\" (UID: \"97d1f858-3be7-4e76-99be-0eda5f3f7595\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431170-ng292" Dec 16 07:30:00 crc kubenswrapper[4823]: I1216 07:30:00.377364 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/97d1f858-3be7-4e76-99be-0eda5f3f7595-config-volume\") pod \"collect-profiles-29431170-ng292\" (UID: \"97d1f858-3be7-4e76-99be-0eda5f3f7595\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431170-ng292" Dec 16 07:30:00 crc kubenswrapper[4823]: I1216 07:30:00.385476 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/97d1f858-3be7-4e76-99be-0eda5f3f7595-secret-volume\") pod \"collect-profiles-29431170-ng292\" (UID: \"97d1f858-3be7-4e76-99be-0eda5f3f7595\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431170-ng292" Dec 16 07:30:00 crc kubenswrapper[4823]: I1216 07:30:00.396703 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjgcn\" (UniqueName: \"kubernetes.io/projected/97d1f858-3be7-4e76-99be-0eda5f3f7595-kube-api-access-sjgcn\") pod \"collect-profiles-29431170-ng292\" (UID: \"97d1f858-3be7-4e76-99be-0eda5f3f7595\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431170-ng292" Dec 16 07:30:00 crc kubenswrapper[4823]: I1216 07:30:00.487079 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431170-ng292" Dec 16 07:30:00 crc kubenswrapper[4823]: I1216 07:30:00.930706 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431170-ng292"] Dec 16 07:30:01 crc kubenswrapper[4823]: I1216 07:30:01.662673 4823 generic.go:334] "Generic (PLEG): container finished" podID="97d1f858-3be7-4e76-99be-0eda5f3f7595" containerID="6b8d8117e276881284b088bf5e8d963380dfc7fc2c607547925104130ea3d392" exitCode=0 Dec 16 07:30:01 crc kubenswrapper[4823]: I1216 07:30:01.662774 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431170-ng292" event={"ID":"97d1f858-3be7-4e76-99be-0eda5f3f7595","Type":"ContainerDied","Data":"6b8d8117e276881284b088bf5e8d963380dfc7fc2c607547925104130ea3d392"} Dec 16 07:30:01 crc kubenswrapper[4823]: I1216 07:30:01.663034 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431170-ng292" 
event={"ID":"97d1f858-3be7-4e76-99be-0eda5f3f7595","Type":"ContainerStarted","Data":"f29e9b08bb5b67fbe2afe6fc221a4e1e0baff19be0dc8779c14c307d9ea2e362"} Dec 16 07:30:02 crc kubenswrapper[4823]: I1216 07:30:02.959641 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431170-ng292" Dec 16 07:30:03 crc kubenswrapper[4823]: I1216 07:30:03.013303 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/97d1f858-3be7-4e76-99be-0eda5f3f7595-secret-volume\") pod \"97d1f858-3be7-4e76-99be-0eda5f3f7595\" (UID: \"97d1f858-3be7-4e76-99be-0eda5f3f7595\") " Dec 16 07:30:03 crc kubenswrapper[4823]: I1216 07:30:03.013612 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/97d1f858-3be7-4e76-99be-0eda5f3f7595-config-volume\") pod \"97d1f858-3be7-4e76-99be-0eda5f3f7595\" (UID: \"97d1f858-3be7-4e76-99be-0eda5f3f7595\") " Dec 16 07:30:03 crc kubenswrapper[4823]: I1216 07:30:03.013773 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sjgcn\" (UniqueName: \"kubernetes.io/projected/97d1f858-3be7-4e76-99be-0eda5f3f7595-kube-api-access-sjgcn\") pod \"97d1f858-3be7-4e76-99be-0eda5f3f7595\" (UID: \"97d1f858-3be7-4e76-99be-0eda5f3f7595\") " Dec 16 07:30:03 crc kubenswrapper[4823]: I1216 07:30:03.014106 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97d1f858-3be7-4e76-99be-0eda5f3f7595-config-volume" (OuterVolumeSpecName: "config-volume") pod "97d1f858-3be7-4e76-99be-0eda5f3f7595" (UID: "97d1f858-3be7-4e76-99be-0eda5f3f7595"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:30:03 crc kubenswrapper[4823]: I1216 07:30:03.014275 4823 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/97d1f858-3be7-4e76-99be-0eda5f3f7595-config-volume\") on node \"crc\" DevicePath \"\"" Dec 16 07:30:03 crc kubenswrapper[4823]: I1216 07:30:03.019385 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97d1f858-3be7-4e76-99be-0eda5f3f7595-kube-api-access-sjgcn" (OuterVolumeSpecName: "kube-api-access-sjgcn") pod "97d1f858-3be7-4e76-99be-0eda5f3f7595" (UID: "97d1f858-3be7-4e76-99be-0eda5f3f7595"). InnerVolumeSpecName "kube-api-access-sjgcn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:30:03 crc kubenswrapper[4823]: I1216 07:30:03.019909 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97d1f858-3be7-4e76-99be-0eda5f3f7595-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "97d1f858-3be7-4e76-99be-0eda5f3f7595" (UID: "97d1f858-3be7-4e76-99be-0eda5f3f7595"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:30:03 crc kubenswrapper[4823]: I1216 07:30:03.115331 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sjgcn\" (UniqueName: \"kubernetes.io/projected/97d1f858-3be7-4e76-99be-0eda5f3f7595-kube-api-access-sjgcn\") on node \"crc\" DevicePath \"\"" Dec 16 07:30:03 crc kubenswrapper[4823]: I1216 07:30:03.115597 4823 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/97d1f858-3be7-4e76-99be-0eda5f3f7595-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 16 07:30:03 crc kubenswrapper[4823]: I1216 07:30:03.678939 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431170-ng292" event={"ID":"97d1f858-3be7-4e76-99be-0eda5f3f7595","Type":"ContainerDied","Data":"f29e9b08bb5b67fbe2afe6fc221a4e1e0baff19be0dc8779c14c307d9ea2e362"} Dec 16 07:30:03 crc kubenswrapper[4823]: I1216 07:30:03.678980 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f29e9b08bb5b67fbe2afe6fc221a4e1e0baff19be0dc8779c14c307d9ea2e362" Dec 16 07:30:03 crc kubenswrapper[4823]: I1216 07:30:03.679054 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431170-ng292" Dec 16 07:30:04 crc kubenswrapper[4823]: I1216 07:30:04.044747 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431125-j4w8x"] Dec 16 07:30:04 crc kubenswrapper[4823]: I1216 07:30:04.050410 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431125-j4w8x"] Dec 16 07:30:05 crc kubenswrapper[4823]: I1216 07:30:05.783804 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6b1d27b-235a-4b1e-adaa-512f3ae25954" path="/var/lib/kubelet/pods/b6b1d27b-235a-4b1e-adaa-512f3ae25954/volumes" Dec 16 07:30:20 crc kubenswrapper[4823]: I1216 07:30:20.538393 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-h6qnj"] Dec 16 07:30:20 crc kubenswrapper[4823]: E1216 07:30:20.540268 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97d1f858-3be7-4e76-99be-0eda5f3f7595" containerName="collect-profiles" Dec 16 07:30:20 crc kubenswrapper[4823]: I1216 07:30:20.540362 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="97d1f858-3be7-4e76-99be-0eda5f3f7595" containerName="collect-profiles" Dec 16 07:30:20 crc kubenswrapper[4823]: I1216 07:30:20.540604 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="97d1f858-3be7-4e76-99be-0eda5f3f7595" containerName="collect-profiles" Dec 16 07:30:20 crc kubenswrapper[4823]: I1216 07:30:20.541833 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-h6qnj" Dec 16 07:30:20 crc kubenswrapper[4823]: I1216 07:30:20.557778 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-h6qnj"] Dec 16 07:30:20 crc kubenswrapper[4823]: I1216 07:30:20.673843 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rcwj\" (UniqueName: \"kubernetes.io/projected/d4bdcef7-298c-468e-8117-80d07a192233-kube-api-access-2rcwj\") pod \"certified-operators-h6qnj\" (UID: \"d4bdcef7-298c-468e-8117-80d07a192233\") " pod="openshift-marketplace/certified-operators-h6qnj" Dec 16 07:30:20 crc kubenswrapper[4823]: I1216 07:30:20.674286 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4bdcef7-298c-468e-8117-80d07a192233-utilities\") pod \"certified-operators-h6qnj\" (UID: \"d4bdcef7-298c-468e-8117-80d07a192233\") " pod="openshift-marketplace/certified-operators-h6qnj" Dec 16 07:30:20 crc kubenswrapper[4823]: I1216 07:30:20.674523 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4bdcef7-298c-468e-8117-80d07a192233-catalog-content\") pod \"certified-operators-h6qnj\" (UID: \"d4bdcef7-298c-468e-8117-80d07a192233\") " pod="openshift-marketplace/certified-operators-h6qnj" Dec 16 07:30:20 crc kubenswrapper[4823]: I1216 07:30:20.775598 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4bdcef7-298c-468e-8117-80d07a192233-utilities\") pod \"certified-operators-h6qnj\" (UID: \"d4bdcef7-298c-468e-8117-80d07a192233\") " pod="openshift-marketplace/certified-operators-h6qnj" Dec 16 07:30:20 crc kubenswrapper[4823]: I1216 07:30:20.775682 4823 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4bdcef7-298c-468e-8117-80d07a192233-catalog-content\") pod \"certified-operators-h6qnj\" (UID: \"d4bdcef7-298c-468e-8117-80d07a192233\") " pod="openshift-marketplace/certified-operators-h6qnj" Dec 16 07:30:20 crc kubenswrapper[4823]: I1216 07:30:20.775733 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2rcwj\" (UniqueName: \"kubernetes.io/projected/d4bdcef7-298c-468e-8117-80d07a192233-kube-api-access-2rcwj\") pod \"certified-operators-h6qnj\" (UID: \"d4bdcef7-298c-468e-8117-80d07a192233\") " pod="openshift-marketplace/certified-operators-h6qnj" Dec 16 07:30:20 crc kubenswrapper[4823]: I1216 07:30:20.776767 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4bdcef7-298c-468e-8117-80d07a192233-utilities\") pod \"certified-operators-h6qnj\" (UID: \"d4bdcef7-298c-468e-8117-80d07a192233\") " pod="openshift-marketplace/certified-operators-h6qnj" Dec 16 07:30:20 crc kubenswrapper[4823]: I1216 07:30:20.777054 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4bdcef7-298c-468e-8117-80d07a192233-catalog-content\") pod \"certified-operators-h6qnj\" (UID: \"d4bdcef7-298c-468e-8117-80d07a192233\") " pod="openshift-marketplace/certified-operators-h6qnj" Dec 16 07:30:20 crc kubenswrapper[4823]: I1216 07:30:20.794139 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rcwj\" (UniqueName: \"kubernetes.io/projected/d4bdcef7-298c-468e-8117-80d07a192233-kube-api-access-2rcwj\") pod \"certified-operators-h6qnj\" (UID: \"d4bdcef7-298c-468e-8117-80d07a192233\") " pod="openshift-marketplace/certified-operators-h6qnj" Dec 16 07:30:20 crc kubenswrapper[4823]: I1216 07:30:20.872951 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-h6qnj" Dec 16 07:30:21 crc kubenswrapper[4823]: I1216 07:30:21.330183 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-h6qnj"] Dec 16 07:30:21 crc kubenswrapper[4823]: I1216 07:30:21.808462 4823 generic.go:334] "Generic (PLEG): container finished" podID="d4bdcef7-298c-468e-8117-80d07a192233" containerID="fb0bec15cddbcd02e1e9228d8e37a0615943f7a506f936b83e7b40bf41eacb80" exitCode=0 Dec 16 07:30:21 crc kubenswrapper[4823]: I1216 07:30:21.808514 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h6qnj" event={"ID":"d4bdcef7-298c-468e-8117-80d07a192233","Type":"ContainerDied","Data":"fb0bec15cddbcd02e1e9228d8e37a0615943f7a506f936b83e7b40bf41eacb80"} Dec 16 07:30:21 crc kubenswrapper[4823]: I1216 07:30:21.808769 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h6qnj" event={"ID":"d4bdcef7-298c-468e-8117-80d07a192233","Type":"ContainerStarted","Data":"cbf60082d401c4a9ac3bcf660d02fae9ee5c9aac160fb1240025bb54bdd04977"} Dec 16 07:30:22 crc kubenswrapper[4823]: I1216 07:30:22.816743 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h6qnj" event={"ID":"d4bdcef7-298c-468e-8117-80d07a192233","Type":"ContainerStarted","Data":"e0521b27943514a2a13223fd98fecf7885653b35b80cf52b890467c6cf14612a"} Dec 16 07:30:23 crc kubenswrapper[4823]: I1216 07:30:23.827932 4823 generic.go:334] "Generic (PLEG): container finished" podID="d4bdcef7-298c-468e-8117-80d07a192233" containerID="e0521b27943514a2a13223fd98fecf7885653b35b80cf52b890467c6cf14612a" exitCode=0 Dec 16 07:30:23 crc kubenswrapper[4823]: I1216 07:30:23.828017 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h6qnj" 
event={"ID":"d4bdcef7-298c-468e-8117-80d07a192233","Type":"ContainerDied","Data":"e0521b27943514a2a13223fd98fecf7885653b35b80cf52b890467c6cf14612a"} Dec 16 07:30:24 crc kubenswrapper[4823]: I1216 07:30:24.836678 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h6qnj" event={"ID":"d4bdcef7-298c-468e-8117-80d07a192233","Type":"ContainerStarted","Data":"94f151dac0a32d13d4eb828bf33c3d055b99be4e8d6ec46b75a5f6377f9bf082"} Dec 16 07:30:24 crc kubenswrapper[4823]: I1216 07:30:24.869169 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-h6qnj" podStartSLOduration=2.414951688 podStartE2EDuration="4.869143335s" podCreationTimestamp="2025-12-16 07:30:20 +0000 UTC" firstStartedPulling="2025-12-16 07:30:21.81186858 +0000 UTC m=+2100.300434713" lastFinishedPulling="2025-12-16 07:30:24.266060237 +0000 UTC m=+2102.754626360" observedRunningTime="2025-12-16 07:30:24.856116427 +0000 UTC m=+2103.344682640" watchObservedRunningTime="2025-12-16 07:30:24.869143335 +0000 UTC m=+2103.357709498" Dec 16 07:30:30 crc kubenswrapper[4823]: I1216 07:30:30.873821 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-h6qnj" Dec 16 07:30:30 crc kubenswrapper[4823]: I1216 07:30:30.874460 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-h6qnj" Dec 16 07:30:30 crc kubenswrapper[4823]: I1216 07:30:30.919334 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-h6qnj" Dec 16 07:30:30 crc kubenswrapper[4823]: I1216 07:30:30.972609 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-h6qnj" Dec 16 07:30:31 crc kubenswrapper[4823]: I1216 07:30:31.151880 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-h6qnj"] Dec 16 07:30:32 crc kubenswrapper[4823]: I1216 07:30:32.901969 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-h6qnj" podUID="d4bdcef7-298c-468e-8117-80d07a192233" containerName="registry-server" containerID="cri-o://94f151dac0a32d13d4eb828bf33c3d055b99be4e8d6ec46b75a5f6377f9bf082" gracePeriod=2 Dec 16 07:30:33 crc kubenswrapper[4823]: I1216 07:30:33.911356 4823 generic.go:334] "Generic (PLEG): container finished" podID="d4bdcef7-298c-468e-8117-80d07a192233" containerID="94f151dac0a32d13d4eb828bf33c3d055b99be4e8d6ec46b75a5f6377f9bf082" exitCode=0 Dec 16 07:30:33 crc kubenswrapper[4823]: I1216 07:30:33.911579 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h6qnj" event={"ID":"d4bdcef7-298c-468e-8117-80d07a192233","Type":"ContainerDied","Data":"94f151dac0a32d13d4eb828bf33c3d055b99be4e8d6ec46b75a5f6377f9bf082"} Dec 16 07:30:33 crc kubenswrapper[4823]: I1216 07:30:33.912694 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h6qnj" event={"ID":"d4bdcef7-298c-468e-8117-80d07a192233","Type":"ContainerDied","Data":"cbf60082d401c4a9ac3bcf660d02fae9ee5c9aac160fb1240025bb54bdd04977"} Dec 16 07:30:33 crc kubenswrapper[4823]: I1216 07:30:33.912781 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cbf60082d401c4a9ac3bcf660d02fae9ee5c9aac160fb1240025bb54bdd04977" Dec 16 07:30:33 crc kubenswrapper[4823]: I1216 07:30:33.932619 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-h6qnj" Dec 16 07:30:34 crc kubenswrapper[4823]: I1216 07:30:34.110138 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2rcwj\" (UniqueName: \"kubernetes.io/projected/d4bdcef7-298c-468e-8117-80d07a192233-kube-api-access-2rcwj\") pod \"d4bdcef7-298c-468e-8117-80d07a192233\" (UID: \"d4bdcef7-298c-468e-8117-80d07a192233\") " Dec 16 07:30:34 crc kubenswrapper[4823]: I1216 07:30:34.110557 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4bdcef7-298c-468e-8117-80d07a192233-utilities\") pod \"d4bdcef7-298c-468e-8117-80d07a192233\" (UID: \"d4bdcef7-298c-468e-8117-80d07a192233\") " Dec 16 07:30:34 crc kubenswrapper[4823]: I1216 07:30:34.110808 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4bdcef7-298c-468e-8117-80d07a192233-catalog-content\") pod \"d4bdcef7-298c-468e-8117-80d07a192233\" (UID: \"d4bdcef7-298c-468e-8117-80d07a192233\") " Dec 16 07:30:34 crc kubenswrapper[4823]: I1216 07:30:34.111581 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4bdcef7-298c-468e-8117-80d07a192233-utilities" (OuterVolumeSpecName: "utilities") pod "d4bdcef7-298c-468e-8117-80d07a192233" (UID: "d4bdcef7-298c-468e-8117-80d07a192233"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:30:34 crc kubenswrapper[4823]: I1216 07:30:34.117875 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4bdcef7-298c-468e-8117-80d07a192233-kube-api-access-2rcwj" (OuterVolumeSpecName: "kube-api-access-2rcwj") pod "d4bdcef7-298c-468e-8117-80d07a192233" (UID: "d4bdcef7-298c-468e-8117-80d07a192233"). InnerVolumeSpecName "kube-api-access-2rcwj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:30:34 crc kubenswrapper[4823]: I1216 07:30:34.193869 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4bdcef7-298c-468e-8117-80d07a192233-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d4bdcef7-298c-468e-8117-80d07a192233" (UID: "d4bdcef7-298c-468e-8117-80d07a192233"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:30:34 crc kubenswrapper[4823]: I1216 07:30:34.212667 4823 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4bdcef7-298c-468e-8117-80d07a192233-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 07:30:34 crc kubenswrapper[4823]: I1216 07:30:34.212709 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2rcwj\" (UniqueName: \"kubernetes.io/projected/d4bdcef7-298c-468e-8117-80d07a192233-kube-api-access-2rcwj\") on node \"crc\" DevicePath \"\"" Dec 16 07:30:34 crc kubenswrapper[4823]: I1216 07:30:34.212725 4823 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4bdcef7-298c-468e-8117-80d07a192233-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 07:30:34 crc kubenswrapper[4823]: I1216 07:30:34.920597 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-h6qnj" Dec 16 07:30:34 crc kubenswrapper[4823]: I1216 07:30:34.964088 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-h6qnj"] Dec 16 07:30:34 crc kubenswrapper[4823]: I1216 07:30:34.975093 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-h6qnj"] Dec 16 07:30:35 crc kubenswrapper[4823]: I1216 07:30:35.786230 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4bdcef7-298c-468e-8117-80d07a192233" path="/var/lib/kubelet/pods/d4bdcef7-298c-468e-8117-80d07a192233/volumes" Dec 16 07:30:40 crc kubenswrapper[4823]: I1216 07:30:40.211085 4823 scope.go:117] "RemoveContainer" containerID="0f59ba2538eb734d4a4b11eddc447e57ec3828b2088c0ac2a4cc3536f3e69b67" Dec 16 07:31:28 crc kubenswrapper[4823]: I1216 07:31:28.134188 4823 patch_prober.go:28] interesting pod/machine-config-daemon-fv56f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 07:31:28 crc kubenswrapper[4823]: I1216 07:31:28.135058 4823 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 07:31:37 crc kubenswrapper[4823]: I1216 07:31:37.717384 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-9dv6w"] Dec 16 07:31:37 crc kubenswrapper[4823]: E1216 07:31:37.720371 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4bdcef7-298c-468e-8117-80d07a192233" containerName="extract-content" Dec 16 07:31:37 crc 
kubenswrapper[4823]: I1216 07:31:37.720505 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4bdcef7-298c-468e-8117-80d07a192233" containerName="extract-content" Dec 16 07:31:37 crc kubenswrapper[4823]: E1216 07:31:37.720592 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4bdcef7-298c-468e-8117-80d07a192233" containerName="registry-server" Dec 16 07:31:37 crc kubenswrapper[4823]: I1216 07:31:37.720702 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4bdcef7-298c-468e-8117-80d07a192233" containerName="registry-server" Dec 16 07:31:37 crc kubenswrapper[4823]: E1216 07:31:37.720809 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4bdcef7-298c-468e-8117-80d07a192233" containerName="extract-utilities" Dec 16 07:31:37 crc kubenswrapper[4823]: I1216 07:31:37.720889 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4bdcef7-298c-468e-8117-80d07a192233" containerName="extract-utilities" Dec 16 07:31:37 crc kubenswrapper[4823]: I1216 07:31:37.721170 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4bdcef7-298c-468e-8117-80d07a192233" containerName="registry-server" Dec 16 07:31:37 crc kubenswrapper[4823]: I1216 07:31:37.723389 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9dv6w" Dec 16 07:31:37 crc kubenswrapper[4823]: I1216 07:31:37.733263 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9dv6w"] Dec 16 07:31:37 crc kubenswrapper[4823]: I1216 07:31:37.868592 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71303f90-3f56-4734-978e-9c4575332704-catalog-content\") pod \"redhat-marketplace-9dv6w\" (UID: \"71303f90-3f56-4734-978e-9c4575332704\") " pod="openshift-marketplace/redhat-marketplace-9dv6w" Dec 16 07:31:37 crc kubenswrapper[4823]: I1216 07:31:37.869234 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xg9q9\" (UniqueName: \"kubernetes.io/projected/71303f90-3f56-4734-978e-9c4575332704-kube-api-access-xg9q9\") pod \"redhat-marketplace-9dv6w\" (UID: \"71303f90-3f56-4734-978e-9c4575332704\") " pod="openshift-marketplace/redhat-marketplace-9dv6w" Dec 16 07:31:37 crc kubenswrapper[4823]: I1216 07:31:37.869648 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71303f90-3f56-4734-978e-9c4575332704-utilities\") pod \"redhat-marketplace-9dv6w\" (UID: \"71303f90-3f56-4734-978e-9c4575332704\") " pod="openshift-marketplace/redhat-marketplace-9dv6w" Dec 16 07:31:37 crc kubenswrapper[4823]: I1216 07:31:37.970937 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xg9q9\" (UniqueName: \"kubernetes.io/projected/71303f90-3f56-4734-978e-9c4575332704-kube-api-access-xg9q9\") pod \"redhat-marketplace-9dv6w\" (UID: \"71303f90-3f56-4734-978e-9c4575332704\") " pod="openshift-marketplace/redhat-marketplace-9dv6w" Dec 16 07:31:37 crc kubenswrapper[4823]: I1216 07:31:37.970997 4823 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71303f90-3f56-4734-978e-9c4575332704-utilities\") pod \"redhat-marketplace-9dv6w\" (UID: \"71303f90-3f56-4734-978e-9c4575332704\") " pod="openshift-marketplace/redhat-marketplace-9dv6w" Dec 16 07:31:37 crc kubenswrapper[4823]: I1216 07:31:37.971167 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71303f90-3f56-4734-978e-9c4575332704-catalog-content\") pod \"redhat-marketplace-9dv6w\" (UID: \"71303f90-3f56-4734-978e-9c4575332704\") " pod="openshift-marketplace/redhat-marketplace-9dv6w" Dec 16 07:31:37 crc kubenswrapper[4823]: I1216 07:31:37.971917 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71303f90-3f56-4734-978e-9c4575332704-utilities\") pod \"redhat-marketplace-9dv6w\" (UID: \"71303f90-3f56-4734-978e-9c4575332704\") " pod="openshift-marketplace/redhat-marketplace-9dv6w" Dec 16 07:31:37 crc kubenswrapper[4823]: I1216 07:31:37.972014 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71303f90-3f56-4734-978e-9c4575332704-catalog-content\") pod \"redhat-marketplace-9dv6w\" (UID: \"71303f90-3f56-4734-978e-9c4575332704\") " pod="openshift-marketplace/redhat-marketplace-9dv6w" Dec 16 07:31:38 crc kubenswrapper[4823]: I1216 07:31:38.000891 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xg9q9\" (UniqueName: \"kubernetes.io/projected/71303f90-3f56-4734-978e-9c4575332704-kube-api-access-xg9q9\") pod \"redhat-marketplace-9dv6w\" (UID: \"71303f90-3f56-4734-978e-9c4575332704\") " pod="openshift-marketplace/redhat-marketplace-9dv6w" Dec 16 07:31:38 crc kubenswrapper[4823]: I1216 07:31:38.046737 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9dv6w" Dec 16 07:31:38 crc kubenswrapper[4823]: I1216 07:31:38.554591 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9dv6w"] Dec 16 07:31:39 crc kubenswrapper[4823]: I1216 07:31:39.482984 4823 generic.go:334] "Generic (PLEG): container finished" podID="71303f90-3f56-4734-978e-9c4575332704" containerID="f390b53e98e8d2f210b32c3e34b8840457028803768337979080cfd698c49b5d" exitCode=0 Dec 16 07:31:39 crc kubenswrapper[4823]: I1216 07:31:39.483101 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9dv6w" event={"ID":"71303f90-3f56-4734-978e-9c4575332704","Type":"ContainerDied","Data":"f390b53e98e8d2f210b32c3e34b8840457028803768337979080cfd698c49b5d"} Dec 16 07:31:39 crc kubenswrapper[4823]: I1216 07:31:39.484335 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9dv6w" event={"ID":"71303f90-3f56-4734-978e-9c4575332704","Type":"ContainerStarted","Data":"50a1d7810cd28a6a54eb3191e0ad47f6e0e20b39bc0a670ae2e1b4aebb19e8ff"} Dec 16 07:31:40 crc kubenswrapper[4823]: I1216 07:31:40.495511 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9dv6w" event={"ID":"71303f90-3f56-4734-978e-9c4575332704","Type":"ContainerStarted","Data":"a1ee8c3c2dddcfc4d95ec69f0290f66ff8b2f47b3344522bd8a3649f120d4ca2"} Dec 16 07:31:41 crc kubenswrapper[4823]: I1216 07:31:41.505078 4823 generic.go:334] "Generic (PLEG): container finished" podID="71303f90-3f56-4734-978e-9c4575332704" containerID="a1ee8c3c2dddcfc4d95ec69f0290f66ff8b2f47b3344522bd8a3649f120d4ca2" exitCode=0 Dec 16 07:31:41 crc kubenswrapper[4823]: I1216 07:31:41.505118 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9dv6w" 
event={"ID":"71303f90-3f56-4734-978e-9c4575332704","Type":"ContainerDied","Data":"a1ee8c3c2dddcfc4d95ec69f0290f66ff8b2f47b3344522bd8a3649f120d4ca2"} Dec 16 07:31:42 crc kubenswrapper[4823]: I1216 07:31:42.515961 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9dv6w" event={"ID":"71303f90-3f56-4734-978e-9c4575332704","Type":"ContainerStarted","Data":"8e33d56ead0ce20e648ff97ac43a3731a95e0356f2bb26626abb655b59671c6a"} Dec 16 07:31:42 crc kubenswrapper[4823]: I1216 07:31:42.535266 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-9dv6w" podStartSLOduration=3.087005597 podStartE2EDuration="5.53524679s" podCreationTimestamp="2025-12-16 07:31:37 +0000 UTC" firstStartedPulling="2025-12-16 07:31:39.484665183 +0000 UTC m=+2177.973231316" lastFinishedPulling="2025-12-16 07:31:41.932906386 +0000 UTC m=+2180.421472509" observedRunningTime="2025-12-16 07:31:42.533785174 +0000 UTC m=+2181.022351307" watchObservedRunningTime="2025-12-16 07:31:42.53524679 +0000 UTC m=+2181.023812913" Dec 16 07:31:48 crc kubenswrapper[4823]: I1216 07:31:48.056186 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-9dv6w" Dec 16 07:31:48 crc kubenswrapper[4823]: I1216 07:31:48.056602 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-9dv6w" Dec 16 07:31:48 crc kubenswrapper[4823]: I1216 07:31:48.210740 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-9dv6w" Dec 16 07:31:48 crc kubenswrapper[4823]: I1216 07:31:48.616463 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-9dv6w" Dec 16 07:31:48 crc kubenswrapper[4823]: I1216 07:31:48.675644 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-9dv6w"] Dec 16 07:31:50 crc kubenswrapper[4823]: I1216 07:31:50.584283 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-9dv6w" podUID="71303f90-3f56-4734-978e-9c4575332704" containerName="registry-server" containerID="cri-o://8e33d56ead0ce20e648ff97ac43a3731a95e0356f2bb26626abb655b59671c6a" gracePeriod=2 Dec 16 07:31:51 crc kubenswrapper[4823]: I1216 07:31:51.595243 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9dv6w" event={"ID":"71303f90-3f56-4734-978e-9c4575332704","Type":"ContainerDied","Data":"8e33d56ead0ce20e648ff97ac43a3731a95e0356f2bb26626abb655b59671c6a"} Dec 16 07:31:51 crc kubenswrapper[4823]: I1216 07:31:51.595226 4823 generic.go:334] "Generic (PLEG): container finished" podID="71303f90-3f56-4734-978e-9c4575332704" containerID="8e33d56ead0ce20e648ff97ac43a3731a95e0356f2bb26626abb655b59671c6a" exitCode=0 Dec 16 07:31:51 crc kubenswrapper[4823]: I1216 07:31:51.595639 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9dv6w" event={"ID":"71303f90-3f56-4734-978e-9c4575332704","Type":"ContainerDied","Data":"50a1d7810cd28a6a54eb3191e0ad47f6e0e20b39bc0a670ae2e1b4aebb19e8ff"} Dec 16 07:31:51 crc kubenswrapper[4823]: I1216 07:31:51.595653 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="50a1d7810cd28a6a54eb3191e0ad47f6e0e20b39bc0a670ae2e1b4aebb19e8ff" Dec 16 07:31:51 crc kubenswrapper[4823]: I1216 07:31:51.624912 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9dv6w" Dec 16 07:31:51 crc kubenswrapper[4823]: I1216 07:31:51.717078 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xg9q9\" (UniqueName: \"kubernetes.io/projected/71303f90-3f56-4734-978e-9c4575332704-kube-api-access-xg9q9\") pod \"71303f90-3f56-4734-978e-9c4575332704\" (UID: \"71303f90-3f56-4734-978e-9c4575332704\") " Dec 16 07:31:51 crc kubenswrapper[4823]: I1216 07:31:51.717131 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71303f90-3f56-4734-978e-9c4575332704-utilities\") pod \"71303f90-3f56-4734-978e-9c4575332704\" (UID: \"71303f90-3f56-4734-978e-9c4575332704\") " Dec 16 07:31:51 crc kubenswrapper[4823]: I1216 07:31:51.717184 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71303f90-3f56-4734-978e-9c4575332704-catalog-content\") pod \"71303f90-3f56-4734-978e-9c4575332704\" (UID: \"71303f90-3f56-4734-978e-9c4575332704\") " Dec 16 07:31:51 crc kubenswrapper[4823]: I1216 07:31:51.718686 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71303f90-3f56-4734-978e-9c4575332704-utilities" (OuterVolumeSpecName: "utilities") pod "71303f90-3f56-4734-978e-9c4575332704" (UID: "71303f90-3f56-4734-978e-9c4575332704"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:31:51 crc kubenswrapper[4823]: I1216 07:31:51.725124 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71303f90-3f56-4734-978e-9c4575332704-kube-api-access-xg9q9" (OuterVolumeSpecName: "kube-api-access-xg9q9") pod "71303f90-3f56-4734-978e-9c4575332704" (UID: "71303f90-3f56-4734-978e-9c4575332704"). InnerVolumeSpecName "kube-api-access-xg9q9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:31:51 crc kubenswrapper[4823]: I1216 07:31:51.740729 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71303f90-3f56-4734-978e-9c4575332704-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "71303f90-3f56-4734-978e-9c4575332704" (UID: "71303f90-3f56-4734-978e-9c4575332704"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:31:51 crc kubenswrapper[4823]: I1216 07:31:51.819693 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xg9q9\" (UniqueName: \"kubernetes.io/projected/71303f90-3f56-4734-978e-9c4575332704-kube-api-access-xg9q9\") on node \"crc\" DevicePath \"\"" Dec 16 07:31:51 crc kubenswrapper[4823]: I1216 07:31:51.819759 4823 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71303f90-3f56-4734-978e-9c4575332704-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 07:31:51 crc kubenswrapper[4823]: I1216 07:31:51.819787 4823 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71303f90-3f56-4734-978e-9c4575332704-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 07:31:52 crc kubenswrapper[4823]: I1216 07:31:52.604222 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9dv6w" Dec 16 07:31:52 crc kubenswrapper[4823]: I1216 07:31:52.637756 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9dv6w"] Dec 16 07:31:52 crc kubenswrapper[4823]: I1216 07:31:52.648314 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-9dv6w"] Dec 16 07:31:53 crc kubenswrapper[4823]: I1216 07:31:53.788322 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71303f90-3f56-4734-978e-9c4575332704" path="/var/lib/kubelet/pods/71303f90-3f56-4734-978e-9c4575332704/volumes" Dec 16 07:31:58 crc kubenswrapper[4823]: I1216 07:31:58.133522 4823 patch_prober.go:28] interesting pod/machine-config-daemon-fv56f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 07:31:58 crc kubenswrapper[4823]: I1216 07:31:58.134158 4823 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 07:32:28 crc kubenswrapper[4823]: I1216 07:32:28.134274 4823 patch_prober.go:28] interesting pod/machine-config-daemon-fv56f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 07:32:28 crc kubenswrapper[4823]: I1216 07:32:28.134735 4823 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" 
podUID="25dec47c-3043-486c-b371-2be103c214e3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 07:32:28 crc kubenswrapper[4823]: I1216 07:32:28.134770 4823 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" Dec 16 07:32:28 crc kubenswrapper[4823]: I1216 07:32:28.135309 4823 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9050a18beacbae3340b69e617e1ef6c1a7391ea7818aeeb48fc6cc970d45d0d3"} pod="openshift-machine-config-operator/machine-config-daemon-fv56f" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 16 07:32:28 crc kubenswrapper[4823]: I1216 07:32:28.135356 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" containerName="machine-config-daemon" containerID="cri-o://9050a18beacbae3340b69e617e1ef6c1a7391ea7818aeeb48fc6cc970d45d0d3" gracePeriod=600 Dec 16 07:32:28 crc kubenswrapper[4823]: E1216 07:32:28.296149 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 07:32:28 crc kubenswrapper[4823]: I1216 07:32:28.884721 4823 generic.go:334] "Generic (PLEG): container finished" podID="25dec47c-3043-486c-b371-2be103c214e3" containerID="9050a18beacbae3340b69e617e1ef6c1a7391ea7818aeeb48fc6cc970d45d0d3" exitCode=0 Dec 16 
07:32:28 crc kubenswrapper[4823]: I1216 07:32:28.884798 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" event={"ID":"25dec47c-3043-486c-b371-2be103c214e3","Type":"ContainerDied","Data":"9050a18beacbae3340b69e617e1ef6c1a7391ea7818aeeb48fc6cc970d45d0d3"} Dec 16 07:32:28 crc kubenswrapper[4823]: I1216 07:32:28.885090 4823 scope.go:117] "RemoveContainer" containerID="09163b4ab4e9994c64567f3f3aa8f5c17c63088f1a3e3778b97b8c712dcc34f7" Dec 16 07:32:28 crc kubenswrapper[4823]: I1216 07:32:28.885590 4823 scope.go:117] "RemoveContainer" containerID="9050a18beacbae3340b69e617e1ef6c1a7391ea7818aeeb48fc6cc970d45d0d3" Dec 16 07:32:28 crc kubenswrapper[4823]: E1216 07:32:28.885817 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 07:32:41 crc kubenswrapper[4823]: I1216 07:32:41.775189 4823 scope.go:117] "RemoveContainer" containerID="9050a18beacbae3340b69e617e1ef6c1a7391ea7818aeeb48fc6cc970d45d0d3" Dec 16 07:32:41 crc kubenswrapper[4823]: E1216 07:32:41.776055 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 07:32:53 crc kubenswrapper[4823]: I1216 07:32:53.771671 4823 scope.go:117] "RemoveContainer" 
containerID="9050a18beacbae3340b69e617e1ef6c1a7391ea7818aeeb48fc6cc970d45d0d3" Dec 16 07:32:53 crc kubenswrapper[4823]: E1216 07:32:53.772628 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 07:33:04 crc kubenswrapper[4823]: I1216 07:33:04.772462 4823 scope.go:117] "RemoveContainer" containerID="9050a18beacbae3340b69e617e1ef6c1a7391ea7818aeeb48fc6cc970d45d0d3" Dec 16 07:33:04 crc kubenswrapper[4823]: E1216 07:33:04.773592 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 07:33:17 crc kubenswrapper[4823]: I1216 07:33:17.772573 4823 scope.go:117] "RemoveContainer" containerID="9050a18beacbae3340b69e617e1ef6c1a7391ea7818aeeb48fc6cc970d45d0d3" Dec 16 07:33:17 crc kubenswrapper[4823]: E1216 07:33:17.773434 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 07:33:28 crc kubenswrapper[4823]: I1216 07:33:28.772698 4823 scope.go:117] 
"RemoveContainer" containerID="9050a18beacbae3340b69e617e1ef6c1a7391ea7818aeeb48fc6cc970d45d0d3" Dec 16 07:33:28 crc kubenswrapper[4823]: E1216 07:33:28.773783 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 07:33:41 crc kubenswrapper[4823]: I1216 07:33:41.779542 4823 scope.go:117] "RemoveContainer" containerID="9050a18beacbae3340b69e617e1ef6c1a7391ea7818aeeb48fc6cc970d45d0d3" Dec 16 07:33:41 crc kubenswrapper[4823]: E1216 07:33:41.780362 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 07:33:53 crc kubenswrapper[4823]: I1216 07:33:53.771702 4823 scope.go:117] "RemoveContainer" containerID="9050a18beacbae3340b69e617e1ef6c1a7391ea7818aeeb48fc6cc970d45d0d3" Dec 16 07:33:53 crc kubenswrapper[4823]: E1216 07:33:53.772635 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 07:34:06 crc kubenswrapper[4823]: I1216 07:34:06.772756 
4823 scope.go:117] "RemoveContainer" containerID="9050a18beacbae3340b69e617e1ef6c1a7391ea7818aeeb48fc6cc970d45d0d3" Dec 16 07:34:06 crc kubenswrapper[4823]: E1216 07:34:06.773609 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 07:34:20 crc kubenswrapper[4823]: I1216 07:34:20.771323 4823 scope.go:117] "RemoveContainer" containerID="9050a18beacbae3340b69e617e1ef6c1a7391ea7818aeeb48fc6cc970d45d0d3" Dec 16 07:34:20 crc kubenswrapper[4823]: E1216 07:34:20.772089 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 07:34:31 crc kubenswrapper[4823]: I1216 07:34:31.781326 4823 scope.go:117] "RemoveContainer" containerID="9050a18beacbae3340b69e617e1ef6c1a7391ea7818aeeb48fc6cc970d45d0d3" Dec 16 07:34:31 crc kubenswrapper[4823]: E1216 07:34:31.782085 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 07:34:44 crc kubenswrapper[4823]: I1216 
07:34:44.771518 4823 scope.go:117] "RemoveContainer" containerID="9050a18beacbae3340b69e617e1ef6c1a7391ea7818aeeb48fc6cc970d45d0d3" Dec 16 07:34:44 crc kubenswrapper[4823]: E1216 07:34:44.772421 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 07:34:48 crc kubenswrapper[4823]: I1216 07:34:48.172940 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-fb624"] Dec 16 07:34:48 crc kubenswrapper[4823]: E1216 07:34:48.174305 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71303f90-3f56-4734-978e-9c4575332704" containerName="registry-server" Dec 16 07:34:48 crc kubenswrapper[4823]: I1216 07:34:48.174376 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="71303f90-3f56-4734-978e-9c4575332704" containerName="registry-server" Dec 16 07:34:48 crc kubenswrapper[4823]: E1216 07:34:48.174419 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71303f90-3f56-4734-978e-9c4575332704" containerName="extract-content" Dec 16 07:34:48 crc kubenswrapper[4823]: I1216 07:34:48.174478 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="71303f90-3f56-4734-978e-9c4575332704" containerName="extract-content" Dec 16 07:34:48 crc kubenswrapper[4823]: E1216 07:34:48.174506 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71303f90-3f56-4734-978e-9c4575332704" containerName="extract-utilities" Dec 16 07:34:48 crc kubenswrapper[4823]: I1216 07:34:48.174521 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="71303f90-3f56-4734-978e-9c4575332704" containerName="extract-utilities" Dec 16 
07:34:48 crc kubenswrapper[4823]: I1216 07:34:48.174912 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="71303f90-3f56-4734-978e-9c4575332704" containerName="registry-server" Dec 16 07:34:48 crc kubenswrapper[4823]: I1216 07:34:48.181447 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fb624" Dec 16 07:34:48 crc kubenswrapper[4823]: I1216 07:34:48.185587 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fb624"] Dec 16 07:34:48 crc kubenswrapper[4823]: I1216 07:34:48.232713 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/639299af-31e8-4fbc-9b06-5b45178ab1e1-catalog-content\") pod \"community-operators-fb624\" (UID: \"639299af-31e8-4fbc-9b06-5b45178ab1e1\") " pod="openshift-marketplace/community-operators-fb624" Dec 16 07:34:48 crc kubenswrapper[4823]: I1216 07:34:48.233142 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/639299af-31e8-4fbc-9b06-5b45178ab1e1-utilities\") pod \"community-operators-fb624\" (UID: \"639299af-31e8-4fbc-9b06-5b45178ab1e1\") " pod="openshift-marketplace/community-operators-fb624" Dec 16 07:34:48 crc kubenswrapper[4823]: I1216 07:34:48.233190 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmbm8\" (UniqueName: \"kubernetes.io/projected/639299af-31e8-4fbc-9b06-5b45178ab1e1-kube-api-access-wmbm8\") pod \"community-operators-fb624\" (UID: \"639299af-31e8-4fbc-9b06-5b45178ab1e1\") " pod="openshift-marketplace/community-operators-fb624" Dec 16 07:34:48 crc kubenswrapper[4823]: I1216 07:34:48.334622 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/639299af-31e8-4fbc-9b06-5b45178ab1e1-catalog-content\") pod \"community-operators-fb624\" (UID: \"639299af-31e8-4fbc-9b06-5b45178ab1e1\") " pod="openshift-marketplace/community-operators-fb624" Dec 16 07:34:48 crc kubenswrapper[4823]: I1216 07:34:48.334720 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/639299af-31e8-4fbc-9b06-5b45178ab1e1-utilities\") pod \"community-operators-fb624\" (UID: \"639299af-31e8-4fbc-9b06-5b45178ab1e1\") " pod="openshift-marketplace/community-operators-fb624" Dec 16 07:34:48 crc kubenswrapper[4823]: I1216 07:34:48.334767 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmbm8\" (UniqueName: \"kubernetes.io/projected/639299af-31e8-4fbc-9b06-5b45178ab1e1-kube-api-access-wmbm8\") pod \"community-operators-fb624\" (UID: \"639299af-31e8-4fbc-9b06-5b45178ab1e1\") " pod="openshift-marketplace/community-operators-fb624" Dec 16 07:34:48 crc kubenswrapper[4823]: I1216 07:34:48.335328 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/639299af-31e8-4fbc-9b06-5b45178ab1e1-catalog-content\") pod \"community-operators-fb624\" (UID: \"639299af-31e8-4fbc-9b06-5b45178ab1e1\") " pod="openshift-marketplace/community-operators-fb624" Dec 16 07:34:48 crc kubenswrapper[4823]: I1216 07:34:48.335356 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/639299af-31e8-4fbc-9b06-5b45178ab1e1-utilities\") pod \"community-operators-fb624\" (UID: \"639299af-31e8-4fbc-9b06-5b45178ab1e1\") " pod="openshift-marketplace/community-operators-fb624" Dec 16 07:34:48 crc kubenswrapper[4823]: I1216 07:34:48.366560 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmbm8\" (UniqueName: 
\"kubernetes.io/projected/639299af-31e8-4fbc-9b06-5b45178ab1e1-kube-api-access-wmbm8\") pod \"community-operators-fb624\" (UID: \"639299af-31e8-4fbc-9b06-5b45178ab1e1\") " pod="openshift-marketplace/community-operators-fb624" Dec 16 07:34:48 crc kubenswrapper[4823]: I1216 07:34:48.528791 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fb624" Dec 16 07:34:49 crc kubenswrapper[4823]: I1216 07:34:49.031592 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fb624"] Dec 16 07:34:49 crc kubenswrapper[4823]: I1216 07:34:49.985641 4823 generic.go:334] "Generic (PLEG): container finished" podID="639299af-31e8-4fbc-9b06-5b45178ab1e1" containerID="b3b26297e3b34720da3882a52b986a16048ff88dd54748ad206f86574e627d00" exitCode=0 Dec 16 07:34:49 crc kubenswrapper[4823]: I1216 07:34:49.985706 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fb624" event={"ID":"639299af-31e8-4fbc-9b06-5b45178ab1e1","Type":"ContainerDied","Data":"b3b26297e3b34720da3882a52b986a16048ff88dd54748ad206f86574e627d00"} Dec 16 07:34:49 crc kubenswrapper[4823]: I1216 07:34:49.985928 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fb624" event={"ID":"639299af-31e8-4fbc-9b06-5b45178ab1e1","Type":"ContainerStarted","Data":"345ef60336a862366a95be6565dad05c0a2202919263a17204da9d97d2697dd2"} Dec 16 07:34:49 crc kubenswrapper[4823]: I1216 07:34:49.987704 4823 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 16 07:34:54 crc kubenswrapper[4823]: I1216 07:34:54.020792 4823 generic.go:334] "Generic (PLEG): container finished" podID="639299af-31e8-4fbc-9b06-5b45178ab1e1" containerID="abe347046632215a7937c7e18f19562f1b9e2c1af0eada1ba8358e41b3dfa643" exitCode=0 Dec 16 07:34:54 crc kubenswrapper[4823]: I1216 07:34:54.020861 4823 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fb624" event={"ID":"639299af-31e8-4fbc-9b06-5b45178ab1e1","Type":"ContainerDied","Data":"abe347046632215a7937c7e18f19562f1b9e2c1af0eada1ba8358e41b3dfa643"} Dec 16 07:34:55 crc kubenswrapper[4823]: I1216 07:34:55.772363 4823 scope.go:117] "RemoveContainer" containerID="9050a18beacbae3340b69e617e1ef6c1a7391ea7818aeeb48fc6cc970d45d0d3" Dec 16 07:34:55 crc kubenswrapper[4823]: E1216 07:34:55.773226 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 07:34:56 crc kubenswrapper[4823]: I1216 07:34:56.036418 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fb624" event={"ID":"639299af-31e8-4fbc-9b06-5b45178ab1e1","Type":"ContainerStarted","Data":"70dbef58f83b64fba45ee64e7ca511c8eca35128c4a7424f018919a38c329e56"} Dec 16 07:34:56 crc kubenswrapper[4823]: I1216 07:34:56.063759 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-fb624" podStartSLOduration=2.6838107620000002 podStartE2EDuration="8.063737165s" podCreationTimestamp="2025-12-16 07:34:48 +0000 UTC" firstStartedPulling="2025-12-16 07:34:49.98746896 +0000 UTC m=+2368.476035083" lastFinishedPulling="2025-12-16 07:34:55.367395323 +0000 UTC m=+2373.855961486" observedRunningTime="2025-12-16 07:34:56.057891682 +0000 UTC m=+2374.546457845" watchObservedRunningTime="2025-12-16 07:34:56.063737165 +0000 UTC m=+2374.552303298" Dec 16 07:34:58 crc kubenswrapper[4823]: I1216 07:34:58.529576 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/community-operators-fb624" Dec 16 07:34:58 crc kubenswrapper[4823]: I1216 07:34:58.530006 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-fb624" Dec 16 07:34:58 crc kubenswrapper[4823]: I1216 07:34:58.602155 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-fb624" Dec 16 07:35:00 crc kubenswrapper[4823]: I1216 07:35:00.142517 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-fb624" Dec 16 07:35:00 crc kubenswrapper[4823]: I1216 07:35:00.235612 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fb624"] Dec 16 07:35:00 crc kubenswrapper[4823]: I1216 07:35:00.300816 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4f255"] Dec 16 07:35:00 crc kubenswrapper[4823]: I1216 07:35:00.301377 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-4f255" podUID="af52958a-a702-46cf-b108-0d6f3227d7f5" containerName="registry-server" containerID="cri-o://350f5db595cd158265896244b10201e5df3b0e55245ead8eb062c0a3146bc51c" gracePeriod=2 Dec 16 07:35:00 crc kubenswrapper[4823]: E1216 07:35:00.542042 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 350f5db595cd158265896244b10201e5df3b0e55245ead8eb062c0a3146bc51c is running failed: container process not found" containerID="350f5db595cd158265896244b10201e5df3b0e55245ead8eb062c0a3146bc51c" cmd=["grpc_health_probe","-addr=:50051"] Dec 16 07:35:00 crc kubenswrapper[4823]: E1216 07:35:00.542542 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or 
running: checking if PID of 350f5db595cd158265896244b10201e5df3b0e55245ead8eb062c0a3146bc51c is running failed: container process not found" containerID="350f5db595cd158265896244b10201e5df3b0e55245ead8eb062c0a3146bc51c" cmd=["grpc_health_probe","-addr=:50051"] Dec 16 07:35:00 crc kubenswrapper[4823]: E1216 07:35:00.543038 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 350f5db595cd158265896244b10201e5df3b0e55245ead8eb062c0a3146bc51c is running failed: container process not found" containerID="350f5db595cd158265896244b10201e5df3b0e55245ead8eb062c0a3146bc51c" cmd=["grpc_health_probe","-addr=:50051"] Dec 16 07:35:00 crc kubenswrapper[4823]: E1216 07:35:00.543109 4823 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 350f5db595cd158265896244b10201e5df3b0e55245ead8eb062c0a3146bc51c is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/community-operators-4f255" podUID="af52958a-a702-46cf-b108-0d6f3227d7f5" containerName="registry-server" Dec 16 07:35:02 crc kubenswrapper[4823]: I1216 07:35:02.100188 4823 generic.go:334] "Generic (PLEG): container finished" podID="af52958a-a702-46cf-b108-0d6f3227d7f5" containerID="350f5db595cd158265896244b10201e5df3b0e55245ead8eb062c0a3146bc51c" exitCode=0 Dec 16 07:35:02 crc kubenswrapper[4823]: I1216 07:35:02.100315 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4f255" event={"ID":"af52958a-a702-46cf-b108-0d6f3227d7f5","Type":"ContainerDied","Data":"350f5db595cd158265896244b10201e5df3b0e55245ead8eb062c0a3146bc51c"} Dec 16 07:35:02 crc kubenswrapper[4823]: I1216 07:35:02.620902 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4f255" Dec 16 07:35:02 crc kubenswrapper[4823]: I1216 07:35:02.758281 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af52958a-a702-46cf-b108-0d6f3227d7f5-catalog-content\") pod \"af52958a-a702-46cf-b108-0d6f3227d7f5\" (UID: \"af52958a-a702-46cf-b108-0d6f3227d7f5\") " Dec 16 07:35:02 crc kubenswrapper[4823]: I1216 07:35:02.758531 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af52958a-a702-46cf-b108-0d6f3227d7f5-utilities\") pod \"af52958a-a702-46cf-b108-0d6f3227d7f5\" (UID: \"af52958a-a702-46cf-b108-0d6f3227d7f5\") " Dec 16 07:35:02 crc kubenswrapper[4823]: I1216 07:35:02.758584 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-27t9g\" (UniqueName: \"kubernetes.io/projected/af52958a-a702-46cf-b108-0d6f3227d7f5-kube-api-access-27t9g\") pod \"af52958a-a702-46cf-b108-0d6f3227d7f5\" (UID: \"af52958a-a702-46cf-b108-0d6f3227d7f5\") " Dec 16 07:35:02 crc kubenswrapper[4823]: I1216 07:35:02.759514 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af52958a-a702-46cf-b108-0d6f3227d7f5-utilities" (OuterVolumeSpecName: "utilities") pod "af52958a-a702-46cf-b108-0d6f3227d7f5" (UID: "af52958a-a702-46cf-b108-0d6f3227d7f5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:35:02 crc kubenswrapper[4823]: I1216 07:35:02.768621 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af52958a-a702-46cf-b108-0d6f3227d7f5-kube-api-access-27t9g" (OuterVolumeSpecName: "kube-api-access-27t9g") pod "af52958a-a702-46cf-b108-0d6f3227d7f5" (UID: "af52958a-a702-46cf-b108-0d6f3227d7f5"). InnerVolumeSpecName "kube-api-access-27t9g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:35:02 crc kubenswrapper[4823]: I1216 07:35:02.821017 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af52958a-a702-46cf-b108-0d6f3227d7f5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "af52958a-a702-46cf-b108-0d6f3227d7f5" (UID: "af52958a-a702-46cf-b108-0d6f3227d7f5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:35:02 crc kubenswrapper[4823]: I1216 07:35:02.862199 4823 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af52958a-a702-46cf-b108-0d6f3227d7f5-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 07:35:02 crc kubenswrapper[4823]: I1216 07:35:02.862251 4823 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af52958a-a702-46cf-b108-0d6f3227d7f5-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 07:35:02 crc kubenswrapper[4823]: I1216 07:35:02.862270 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-27t9g\" (UniqueName: \"kubernetes.io/projected/af52958a-a702-46cf-b108-0d6f3227d7f5-kube-api-access-27t9g\") on node \"crc\" DevicePath \"\"" Dec 16 07:35:03 crc kubenswrapper[4823]: I1216 07:35:03.119322 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4f255" event={"ID":"af52958a-a702-46cf-b108-0d6f3227d7f5","Type":"ContainerDied","Data":"ed18e7986e45a46db8ff1b975fa4253c553e091212ecbbff70730221a8927f9b"} Dec 16 07:35:03 crc kubenswrapper[4823]: I1216 07:35:03.119454 4823 scope.go:117] "RemoveContainer" containerID="350f5db595cd158265896244b10201e5df3b0e55245ead8eb062c0a3146bc51c" Dec 16 07:35:03 crc kubenswrapper[4823]: I1216 07:35:03.119575 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4f255" Dec 16 07:35:03 crc kubenswrapper[4823]: I1216 07:35:03.161419 4823 scope.go:117] "RemoveContainer" containerID="4770362034bafdeedc02490ef46fcaf61f520c1a895229ec8bf02a2d742d0951" Dec 16 07:35:03 crc kubenswrapper[4823]: I1216 07:35:03.182486 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4f255"] Dec 16 07:35:03 crc kubenswrapper[4823]: I1216 07:35:03.191085 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-4f255"] Dec 16 07:35:03 crc kubenswrapper[4823]: I1216 07:35:03.204210 4823 scope.go:117] "RemoveContainer" containerID="cc92b6c7602edd8095f5057c411e2bf5a4368647e2660dc2b4ae8c911a811d65" Dec 16 07:35:03 crc kubenswrapper[4823]: I1216 07:35:03.787220 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af52958a-a702-46cf-b108-0d6f3227d7f5" path="/var/lib/kubelet/pods/af52958a-a702-46cf-b108-0d6f3227d7f5/volumes" Dec 16 07:35:08 crc kubenswrapper[4823]: I1216 07:35:08.772545 4823 scope.go:117] "RemoveContainer" containerID="9050a18beacbae3340b69e617e1ef6c1a7391ea7818aeeb48fc6cc970d45d0d3" Dec 16 07:35:08 crc kubenswrapper[4823]: E1216 07:35:08.773287 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 07:35:21 crc kubenswrapper[4823]: I1216 07:35:21.788522 4823 scope.go:117] "RemoveContainer" containerID="9050a18beacbae3340b69e617e1ef6c1a7391ea7818aeeb48fc6cc970d45d0d3" Dec 16 07:35:21 crc kubenswrapper[4823]: E1216 07:35:21.789992 4823 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 07:35:32 crc kubenswrapper[4823]: I1216 07:35:32.771369 4823 scope.go:117] "RemoveContainer" containerID="9050a18beacbae3340b69e617e1ef6c1a7391ea7818aeeb48fc6cc970d45d0d3" Dec 16 07:35:32 crc kubenswrapper[4823]: E1216 07:35:32.772048 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 07:35:45 crc kubenswrapper[4823]: I1216 07:35:45.772540 4823 scope.go:117] "RemoveContainer" containerID="9050a18beacbae3340b69e617e1ef6c1a7391ea7818aeeb48fc6cc970d45d0d3" Dec 16 07:35:45 crc kubenswrapper[4823]: E1216 07:35:45.773361 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 07:35:58 crc kubenswrapper[4823]: I1216 07:35:58.772357 4823 scope.go:117] "RemoveContainer" containerID="9050a18beacbae3340b69e617e1ef6c1a7391ea7818aeeb48fc6cc970d45d0d3" Dec 16 07:35:58 crc kubenswrapper[4823]: E1216 07:35:58.773290 4823 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 07:36:12 crc kubenswrapper[4823]: I1216 07:36:12.772327 4823 scope.go:117] "RemoveContainer" containerID="9050a18beacbae3340b69e617e1ef6c1a7391ea7818aeeb48fc6cc970d45d0d3" Dec 16 07:36:12 crc kubenswrapper[4823]: E1216 07:36:12.773020 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 07:36:24 crc kubenswrapper[4823]: I1216 07:36:24.771810 4823 scope.go:117] "RemoveContainer" containerID="9050a18beacbae3340b69e617e1ef6c1a7391ea7818aeeb48fc6cc970d45d0d3" Dec 16 07:36:24 crc kubenswrapper[4823]: E1216 07:36:24.772620 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 07:36:39 crc kubenswrapper[4823]: I1216 07:36:39.771525 4823 scope.go:117] "RemoveContainer" containerID="9050a18beacbae3340b69e617e1ef6c1a7391ea7818aeeb48fc6cc970d45d0d3" Dec 16 07:36:39 crc kubenswrapper[4823]: E1216 07:36:39.772257 4823 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 07:36:40 crc kubenswrapper[4823]: I1216 07:36:40.354626 4823 scope.go:117] "RemoveContainer" containerID="e0521b27943514a2a13223fd98fecf7885653b35b80cf52b890467c6cf14612a" Dec 16 07:36:40 crc kubenswrapper[4823]: I1216 07:36:40.373877 4823 scope.go:117] "RemoveContainer" containerID="fb0bec15cddbcd02e1e9228d8e37a0615943f7a506f936b83e7b40bf41eacb80" Dec 16 07:36:40 crc kubenswrapper[4823]: I1216 07:36:40.406909 4823 scope.go:117] "RemoveContainer" containerID="94f151dac0a32d13d4eb828bf33c3d055b99be4e8d6ec46b75a5f6377f9bf082" Dec 16 07:36:51 crc kubenswrapper[4823]: I1216 07:36:51.775726 4823 scope.go:117] "RemoveContainer" containerID="9050a18beacbae3340b69e617e1ef6c1a7391ea7818aeeb48fc6cc970d45d0d3" Dec 16 07:36:51 crc kubenswrapper[4823]: E1216 07:36:51.776538 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 07:37:06 crc kubenswrapper[4823]: I1216 07:37:06.771606 4823 scope.go:117] "RemoveContainer" containerID="9050a18beacbae3340b69e617e1ef6c1a7391ea7818aeeb48fc6cc970d45d0d3" Dec 16 07:37:06 crc kubenswrapper[4823]: E1216 07:37:06.772362 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 07:37:21 crc kubenswrapper[4823]: I1216 07:37:21.776670 4823 scope.go:117] "RemoveContainer" containerID="9050a18beacbae3340b69e617e1ef6c1a7391ea7818aeeb48fc6cc970d45d0d3" Dec 16 07:37:21 crc kubenswrapper[4823]: E1216 07:37:21.777430 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 07:37:35 crc kubenswrapper[4823]: I1216 07:37:35.771636 4823 scope.go:117] "RemoveContainer" containerID="9050a18beacbae3340b69e617e1ef6c1a7391ea7818aeeb48fc6cc970d45d0d3" Dec 16 07:37:36 crc kubenswrapper[4823]: I1216 07:37:36.341364 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" event={"ID":"25dec47c-3043-486c-b371-2be103c214e3","Type":"ContainerStarted","Data":"c834dad71dd152fdffca68bf2b931bcb63c99e897f038cd6ad832b07707115a5"} Dec 16 07:37:40 crc kubenswrapper[4823]: I1216 07:37:40.453686 4823 scope.go:117] "RemoveContainer" containerID="f390b53e98e8d2f210b32c3e34b8840457028803768337979080cfd698c49b5d" Dec 16 07:37:40 crc kubenswrapper[4823]: I1216 07:37:40.484570 4823 scope.go:117] "RemoveContainer" containerID="a1ee8c3c2dddcfc4d95ec69f0290f66ff8b2f47b3344522bd8a3649f120d4ca2" Dec 16 07:38:40 crc kubenswrapper[4823]: I1216 07:38:40.519656 4823 scope.go:117] "RemoveContainer" 
containerID="8e33d56ead0ce20e648ff97ac43a3731a95e0356f2bb26626abb655b59671c6a" Dec 16 07:39:54 crc kubenswrapper[4823]: I1216 07:39:54.224956 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-qbm26"] Dec 16 07:39:54 crc kubenswrapper[4823]: E1216 07:39:54.225788 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af52958a-a702-46cf-b108-0d6f3227d7f5" containerName="extract-content" Dec 16 07:39:54 crc kubenswrapper[4823]: I1216 07:39:54.225805 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="af52958a-a702-46cf-b108-0d6f3227d7f5" containerName="extract-content" Dec 16 07:39:54 crc kubenswrapper[4823]: E1216 07:39:54.225824 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af52958a-a702-46cf-b108-0d6f3227d7f5" containerName="extract-utilities" Dec 16 07:39:54 crc kubenswrapper[4823]: I1216 07:39:54.225831 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="af52958a-a702-46cf-b108-0d6f3227d7f5" containerName="extract-utilities" Dec 16 07:39:54 crc kubenswrapper[4823]: E1216 07:39:54.225846 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af52958a-a702-46cf-b108-0d6f3227d7f5" containerName="registry-server" Dec 16 07:39:54 crc kubenswrapper[4823]: I1216 07:39:54.225867 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="af52958a-a702-46cf-b108-0d6f3227d7f5" containerName="registry-server" Dec 16 07:39:54 crc kubenswrapper[4823]: I1216 07:39:54.226057 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="af52958a-a702-46cf-b108-0d6f3227d7f5" containerName="registry-server" Dec 16 07:39:54 crc kubenswrapper[4823]: I1216 07:39:54.227011 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qbm26" Dec 16 07:39:54 crc kubenswrapper[4823]: I1216 07:39:54.235371 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qbm26"] Dec 16 07:39:54 crc kubenswrapper[4823]: I1216 07:39:54.377686 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ea9935c-2de0-4b33-8ccf-46c9b7b2b148-utilities\") pod \"redhat-operators-qbm26\" (UID: \"9ea9935c-2de0-4b33-8ccf-46c9b7b2b148\") " pod="openshift-marketplace/redhat-operators-qbm26" Dec 16 07:39:54 crc kubenswrapper[4823]: I1216 07:39:54.377731 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ea9935c-2de0-4b33-8ccf-46c9b7b2b148-catalog-content\") pod \"redhat-operators-qbm26\" (UID: \"9ea9935c-2de0-4b33-8ccf-46c9b7b2b148\") " pod="openshift-marketplace/redhat-operators-qbm26" Dec 16 07:39:54 crc kubenswrapper[4823]: I1216 07:39:54.377779 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2btb5\" (UniqueName: \"kubernetes.io/projected/9ea9935c-2de0-4b33-8ccf-46c9b7b2b148-kube-api-access-2btb5\") pod \"redhat-operators-qbm26\" (UID: \"9ea9935c-2de0-4b33-8ccf-46c9b7b2b148\") " pod="openshift-marketplace/redhat-operators-qbm26" Dec 16 07:39:54 crc kubenswrapper[4823]: I1216 07:39:54.479397 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ea9935c-2de0-4b33-8ccf-46c9b7b2b148-utilities\") pod \"redhat-operators-qbm26\" (UID: \"9ea9935c-2de0-4b33-8ccf-46c9b7b2b148\") " pod="openshift-marketplace/redhat-operators-qbm26" Dec 16 07:39:54 crc kubenswrapper[4823]: I1216 07:39:54.479451 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ea9935c-2de0-4b33-8ccf-46c9b7b2b148-catalog-content\") pod \"redhat-operators-qbm26\" (UID: \"9ea9935c-2de0-4b33-8ccf-46c9b7b2b148\") " pod="openshift-marketplace/redhat-operators-qbm26" Dec 16 07:39:54 crc kubenswrapper[4823]: I1216 07:39:54.479518 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2btb5\" (UniqueName: \"kubernetes.io/projected/9ea9935c-2de0-4b33-8ccf-46c9b7b2b148-kube-api-access-2btb5\") pod \"redhat-operators-qbm26\" (UID: \"9ea9935c-2de0-4b33-8ccf-46c9b7b2b148\") " pod="openshift-marketplace/redhat-operators-qbm26" Dec 16 07:39:54 crc kubenswrapper[4823]: I1216 07:39:54.480106 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ea9935c-2de0-4b33-8ccf-46c9b7b2b148-utilities\") pod \"redhat-operators-qbm26\" (UID: \"9ea9935c-2de0-4b33-8ccf-46c9b7b2b148\") " pod="openshift-marketplace/redhat-operators-qbm26" Dec 16 07:39:54 crc kubenswrapper[4823]: I1216 07:39:54.480118 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ea9935c-2de0-4b33-8ccf-46c9b7b2b148-catalog-content\") pod \"redhat-operators-qbm26\" (UID: \"9ea9935c-2de0-4b33-8ccf-46c9b7b2b148\") " pod="openshift-marketplace/redhat-operators-qbm26" Dec 16 07:39:54 crc kubenswrapper[4823]: I1216 07:39:54.508992 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2btb5\" (UniqueName: \"kubernetes.io/projected/9ea9935c-2de0-4b33-8ccf-46c9b7b2b148-kube-api-access-2btb5\") pod \"redhat-operators-qbm26\" (UID: \"9ea9935c-2de0-4b33-8ccf-46c9b7b2b148\") " pod="openshift-marketplace/redhat-operators-qbm26" Dec 16 07:39:54 crc kubenswrapper[4823]: I1216 07:39:54.586331 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qbm26" Dec 16 07:39:55 crc kubenswrapper[4823]: I1216 07:39:55.028574 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qbm26"] Dec 16 07:39:55 crc kubenswrapper[4823]: I1216 07:39:55.600381 4823 generic.go:334] "Generic (PLEG): container finished" podID="9ea9935c-2de0-4b33-8ccf-46c9b7b2b148" containerID="d87fbc8d54a846f63de41002768b5818352c0cf04a474ac66175f7b6898dbebe" exitCode=0 Dec 16 07:39:55 crc kubenswrapper[4823]: I1216 07:39:55.600527 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qbm26" event={"ID":"9ea9935c-2de0-4b33-8ccf-46c9b7b2b148","Type":"ContainerDied","Data":"d87fbc8d54a846f63de41002768b5818352c0cf04a474ac66175f7b6898dbebe"} Dec 16 07:39:55 crc kubenswrapper[4823]: I1216 07:39:55.600763 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qbm26" event={"ID":"9ea9935c-2de0-4b33-8ccf-46c9b7b2b148","Type":"ContainerStarted","Data":"2fb8cdd621115dd20a054aca3e240ac74ca2f80c9067229fa20074962872e0fe"} Dec 16 07:39:55 crc kubenswrapper[4823]: I1216 07:39:55.602345 4823 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 16 07:39:57 crc kubenswrapper[4823]: I1216 07:39:57.615200 4823 generic.go:334] "Generic (PLEG): container finished" podID="9ea9935c-2de0-4b33-8ccf-46c9b7b2b148" containerID="a283c30d97cac5bf4b5c056b4da4ddc2f5f138b58ed3374f1594e31f04ccc4f0" exitCode=0 Dec 16 07:39:57 crc kubenswrapper[4823]: I1216 07:39:57.615325 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qbm26" event={"ID":"9ea9935c-2de0-4b33-8ccf-46c9b7b2b148","Type":"ContainerDied","Data":"a283c30d97cac5bf4b5c056b4da4ddc2f5f138b58ed3374f1594e31f04ccc4f0"} Dec 16 07:39:58 crc kubenswrapper[4823]: I1216 07:39:58.134294 4823 patch_prober.go:28] interesting 
pod/machine-config-daemon-fv56f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 07:39:58 crc kubenswrapper[4823]: I1216 07:39:58.134375 4823 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 07:39:58 crc kubenswrapper[4823]: I1216 07:39:58.623901 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qbm26" event={"ID":"9ea9935c-2de0-4b33-8ccf-46c9b7b2b148","Type":"ContainerStarted","Data":"6e586e55e0da471789da0d8e17c985b7f617b904097e2deaa494327d4fe2a810"} Dec 16 07:39:58 crc kubenswrapper[4823]: I1216 07:39:58.640131 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-qbm26" podStartSLOduration=2.15851338 podStartE2EDuration="4.64011487s" podCreationTimestamp="2025-12-16 07:39:54 +0000 UTC" firstStartedPulling="2025-12-16 07:39:55.602007012 +0000 UTC m=+2674.090573135" lastFinishedPulling="2025-12-16 07:39:58.083608502 +0000 UTC m=+2676.572174625" observedRunningTime="2025-12-16 07:39:58.638494679 +0000 UTC m=+2677.127060802" watchObservedRunningTime="2025-12-16 07:39:58.64011487 +0000 UTC m=+2677.128680983" Dec 16 07:40:04 crc kubenswrapper[4823]: I1216 07:40:04.586722 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-qbm26" Dec 16 07:40:04 crc kubenswrapper[4823]: I1216 07:40:04.587736 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-qbm26" Dec 16 07:40:04 crc 
kubenswrapper[4823]: I1216 07:40:04.639111 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-qbm26" Dec 16 07:40:04 crc kubenswrapper[4823]: I1216 07:40:04.712230 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-qbm26" Dec 16 07:40:05 crc kubenswrapper[4823]: I1216 07:40:05.416010 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qbm26"] Dec 16 07:40:06 crc kubenswrapper[4823]: I1216 07:40:06.683990 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-qbm26" podUID="9ea9935c-2de0-4b33-8ccf-46c9b7b2b148" containerName="registry-server" containerID="cri-o://6e586e55e0da471789da0d8e17c985b7f617b904097e2deaa494327d4fe2a810" gracePeriod=2 Dec 16 07:40:07 crc kubenswrapper[4823]: I1216 07:40:07.109798 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qbm26" Dec 16 07:40:07 crc kubenswrapper[4823]: I1216 07:40:07.211258 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2btb5\" (UniqueName: \"kubernetes.io/projected/9ea9935c-2de0-4b33-8ccf-46c9b7b2b148-kube-api-access-2btb5\") pod \"9ea9935c-2de0-4b33-8ccf-46c9b7b2b148\" (UID: \"9ea9935c-2de0-4b33-8ccf-46c9b7b2b148\") " Dec 16 07:40:07 crc kubenswrapper[4823]: I1216 07:40:07.211359 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ea9935c-2de0-4b33-8ccf-46c9b7b2b148-utilities\") pod \"9ea9935c-2de0-4b33-8ccf-46c9b7b2b148\" (UID: \"9ea9935c-2de0-4b33-8ccf-46c9b7b2b148\") " Dec 16 07:40:07 crc kubenswrapper[4823]: I1216 07:40:07.211498 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ea9935c-2de0-4b33-8ccf-46c9b7b2b148-catalog-content\") pod \"9ea9935c-2de0-4b33-8ccf-46c9b7b2b148\" (UID: \"9ea9935c-2de0-4b33-8ccf-46c9b7b2b148\") " Dec 16 07:40:07 crc kubenswrapper[4823]: I1216 07:40:07.212492 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ea9935c-2de0-4b33-8ccf-46c9b7b2b148-utilities" (OuterVolumeSpecName: "utilities") pod "9ea9935c-2de0-4b33-8ccf-46c9b7b2b148" (UID: "9ea9935c-2de0-4b33-8ccf-46c9b7b2b148"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:40:07 crc kubenswrapper[4823]: I1216 07:40:07.221253 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ea9935c-2de0-4b33-8ccf-46c9b7b2b148-kube-api-access-2btb5" (OuterVolumeSpecName: "kube-api-access-2btb5") pod "9ea9935c-2de0-4b33-8ccf-46c9b7b2b148" (UID: "9ea9935c-2de0-4b33-8ccf-46c9b7b2b148"). InnerVolumeSpecName "kube-api-access-2btb5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:40:07 crc kubenswrapper[4823]: I1216 07:40:07.312908 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2btb5\" (UniqueName: \"kubernetes.io/projected/9ea9935c-2de0-4b33-8ccf-46c9b7b2b148-kube-api-access-2btb5\") on node \"crc\" DevicePath \"\"" Dec 16 07:40:07 crc kubenswrapper[4823]: I1216 07:40:07.312953 4823 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ea9935c-2de0-4b33-8ccf-46c9b7b2b148-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 07:40:07 crc kubenswrapper[4823]: I1216 07:40:07.693786 4823 generic.go:334] "Generic (PLEG): container finished" podID="9ea9935c-2de0-4b33-8ccf-46c9b7b2b148" containerID="6e586e55e0da471789da0d8e17c985b7f617b904097e2deaa494327d4fe2a810" exitCode=0 Dec 16 07:40:07 crc kubenswrapper[4823]: I1216 07:40:07.693829 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qbm26" event={"ID":"9ea9935c-2de0-4b33-8ccf-46c9b7b2b148","Type":"ContainerDied","Data":"6e586e55e0da471789da0d8e17c985b7f617b904097e2deaa494327d4fe2a810"} Dec 16 07:40:07 crc kubenswrapper[4823]: I1216 07:40:07.693864 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qbm26" event={"ID":"9ea9935c-2de0-4b33-8ccf-46c9b7b2b148","Type":"ContainerDied","Data":"2fb8cdd621115dd20a054aca3e240ac74ca2f80c9067229fa20074962872e0fe"} Dec 16 07:40:07 crc kubenswrapper[4823]: I1216 07:40:07.693892 4823 scope.go:117] "RemoveContainer" containerID="6e586e55e0da471789da0d8e17c985b7f617b904097e2deaa494327d4fe2a810" Dec 16 07:40:07 crc kubenswrapper[4823]: I1216 07:40:07.693901 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qbm26" Dec 16 07:40:07 crc kubenswrapper[4823]: I1216 07:40:07.715974 4823 scope.go:117] "RemoveContainer" containerID="a283c30d97cac5bf4b5c056b4da4ddc2f5f138b58ed3374f1594e31f04ccc4f0" Dec 16 07:40:07 crc kubenswrapper[4823]: I1216 07:40:07.740622 4823 scope.go:117] "RemoveContainer" containerID="d87fbc8d54a846f63de41002768b5818352c0cf04a474ac66175f7b6898dbebe" Dec 16 07:40:07 crc kubenswrapper[4823]: I1216 07:40:07.769317 4823 scope.go:117] "RemoveContainer" containerID="6e586e55e0da471789da0d8e17c985b7f617b904097e2deaa494327d4fe2a810" Dec 16 07:40:07 crc kubenswrapper[4823]: E1216 07:40:07.770058 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e586e55e0da471789da0d8e17c985b7f617b904097e2deaa494327d4fe2a810\": container with ID starting with 6e586e55e0da471789da0d8e17c985b7f617b904097e2deaa494327d4fe2a810 not found: ID does not exist" containerID="6e586e55e0da471789da0d8e17c985b7f617b904097e2deaa494327d4fe2a810" Dec 16 07:40:07 crc kubenswrapper[4823]: I1216 07:40:07.770173 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e586e55e0da471789da0d8e17c985b7f617b904097e2deaa494327d4fe2a810"} err="failed to get container status \"6e586e55e0da471789da0d8e17c985b7f617b904097e2deaa494327d4fe2a810\": rpc error: code = NotFound desc = could not find container \"6e586e55e0da471789da0d8e17c985b7f617b904097e2deaa494327d4fe2a810\": container with ID starting with 6e586e55e0da471789da0d8e17c985b7f617b904097e2deaa494327d4fe2a810 not found: ID does not exist" Dec 16 07:40:07 crc kubenswrapper[4823]: I1216 07:40:07.770226 4823 scope.go:117] "RemoveContainer" containerID="a283c30d97cac5bf4b5c056b4da4ddc2f5f138b58ed3374f1594e31f04ccc4f0" Dec 16 07:40:07 crc kubenswrapper[4823]: E1216 07:40:07.770763 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"a283c30d97cac5bf4b5c056b4da4ddc2f5f138b58ed3374f1594e31f04ccc4f0\": container with ID starting with a283c30d97cac5bf4b5c056b4da4ddc2f5f138b58ed3374f1594e31f04ccc4f0 not found: ID does not exist" containerID="a283c30d97cac5bf4b5c056b4da4ddc2f5f138b58ed3374f1594e31f04ccc4f0" Dec 16 07:40:07 crc kubenswrapper[4823]: I1216 07:40:07.770831 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a283c30d97cac5bf4b5c056b4da4ddc2f5f138b58ed3374f1594e31f04ccc4f0"} err="failed to get container status \"a283c30d97cac5bf4b5c056b4da4ddc2f5f138b58ed3374f1594e31f04ccc4f0\": rpc error: code = NotFound desc = could not find container \"a283c30d97cac5bf4b5c056b4da4ddc2f5f138b58ed3374f1594e31f04ccc4f0\": container with ID starting with a283c30d97cac5bf4b5c056b4da4ddc2f5f138b58ed3374f1594e31f04ccc4f0 not found: ID does not exist" Dec 16 07:40:07 crc kubenswrapper[4823]: I1216 07:40:07.770875 4823 scope.go:117] "RemoveContainer" containerID="d87fbc8d54a846f63de41002768b5818352c0cf04a474ac66175f7b6898dbebe" Dec 16 07:40:07 crc kubenswrapper[4823]: E1216 07:40:07.771277 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d87fbc8d54a846f63de41002768b5818352c0cf04a474ac66175f7b6898dbebe\": container with ID starting with d87fbc8d54a846f63de41002768b5818352c0cf04a474ac66175f7b6898dbebe not found: ID does not exist" containerID="d87fbc8d54a846f63de41002768b5818352c0cf04a474ac66175f7b6898dbebe" Dec 16 07:40:07 crc kubenswrapper[4823]: I1216 07:40:07.771303 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d87fbc8d54a846f63de41002768b5818352c0cf04a474ac66175f7b6898dbebe"} err="failed to get container status \"d87fbc8d54a846f63de41002768b5818352c0cf04a474ac66175f7b6898dbebe\": rpc error: code = NotFound desc = could not find container 
\"d87fbc8d54a846f63de41002768b5818352c0cf04a474ac66175f7b6898dbebe\": container with ID starting with d87fbc8d54a846f63de41002768b5818352c0cf04a474ac66175f7b6898dbebe not found: ID does not exist" Dec 16 07:40:09 crc kubenswrapper[4823]: I1216 07:40:09.933237 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ea9935c-2de0-4b33-8ccf-46c9b7b2b148-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9ea9935c-2de0-4b33-8ccf-46c9b7b2b148" (UID: "9ea9935c-2de0-4b33-8ccf-46c9b7b2b148"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:40:09 crc kubenswrapper[4823]: I1216 07:40:09.947925 4823 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ea9935c-2de0-4b33-8ccf-46c9b7b2b148-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 07:40:10 crc kubenswrapper[4823]: I1216 07:40:10.125224 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qbm26"] Dec 16 07:40:10 crc kubenswrapper[4823]: I1216 07:40:10.130047 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-qbm26"] Dec 16 07:40:11 crc kubenswrapper[4823]: I1216 07:40:11.787805 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ea9935c-2de0-4b33-8ccf-46c9b7b2b148" path="/var/lib/kubelet/pods/9ea9935c-2de0-4b33-8ccf-46c9b7b2b148/volumes" Dec 16 07:40:28 crc kubenswrapper[4823]: I1216 07:40:28.134285 4823 patch_prober.go:28] interesting pod/machine-config-daemon-fv56f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 07:40:28 crc kubenswrapper[4823]: I1216 07:40:28.135438 4823 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 07:40:33 crc kubenswrapper[4823]: I1216 07:40:33.774451 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-mj7bp"] Dec 16 07:40:33 crc kubenswrapper[4823]: E1216 07:40:33.775165 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ea9935c-2de0-4b33-8ccf-46c9b7b2b148" containerName="extract-utilities" Dec 16 07:40:33 crc kubenswrapper[4823]: I1216 07:40:33.775179 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ea9935c-2de0-4b33-8ccf-46c9b7b2b148" containerName="extract-utilities" Dec 16 07:40:33 crc kubenswrapper[4823]: E1216 07:40:33.775194 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ea9935c-2de0-4b33-8ccf-46c9b7b2b148" containerName="extract-content" Dec 16 07:40:33 crc kubenswrapper[4823]: I1216 07:40:33.775201 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ea9935c-2de0-4b33-8ccf-46c9b7b2b148" containerName="extract-content" Dec 16 07:40:33 crc kubenswrapper[4823]: E1216 07:40:33.775220 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ea9935c-2de0-4b33-8ccf-46c9b7b2b148" containerName="registry-server" Dec 16 07:40:33 crc kubenswrapper[4823]: I1216 07:40:33.775230 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ea9935c-2de0-4b33-8ccf-46c9b7b2b148" containerName="registry-server" Dec 16 07:40:33 crc kubenswrapper[4823]: I1216 07:40:33.775401 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ea9935c-2de0-4b33-8ccf-46c9b7b2b148" containerName="registry-server" Dec 16 07:40:33 crc kubenswrapper[4823]: I1216 07:40:33.776632 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mj7bp" Dec 16 07:40:33 crc kubenswrapper[4823]: I1216 07:40:33.795301 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mj7bp"] Dec 16 07:40:33 crc kubenswrapper[4823]: I1216 07:40:33.821353 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlq7h\" (UniqueName: \"kubernetes.io/projected/40b6446c-9fa2-4b03-b0a9-7925f7eb92b4-kube-api-access-dlq7h\") pod \"certified-operators-mj7bp\" (UID: \"40b6446c-9fa2-4b03-b0a9-7925f7eb92b4\") " pod="openshift-marketplace/certified-operators-mj7bp" Dec 16 07:40:33 crc kubenswrapper[4823]: I1216 07:40:33.821651 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40b6446c-9fa2-4b03-b0a9-7925f7eb92b4-catalog-content\") pod \"certified-operators-mj7bp\" (UID: \"40b6446c-9fa2-4b03-b0a9-7925f7eb92b4\") " pod="openshift-marketplace/certified-operators-mj7bp" Dec 16 07:40:33 crc kubenswrapper[4823]: I1216 07:40:33.821758 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40b6446c-9fa2-4b03-b0a9-7925f7eb92b4-utilities\") pod \"certified-operators-mj7bp\" (UID: \"40b6446c-9fa2-4b03-b0a9-7925f7eb92b4\") " pod="openshift-marketplace/certified-operators-mj7bp" Dec 16 07:40:33 crc kubenswrapper[4823]: I1216 07:40:33.922758 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40b6446c-9fa2-4b03-b0a9-7925f7eb92b4-utilities\") pod \"certified-operators-mj7bp\" (UID: \"40b6446c-9fa2-4b03-b0a9-7925f7eb92b4\") " pod="openshift-marketplace/certified-operators-mj7bp" Dec 16 07:40:33 crc kubenswrapper[4823]: I1216 07:40:33.922860 4823 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-dlq7h\" (UniqueName: \"kubernetes.io/projected/40b6446c-9fa2-4b03-b0a9-7925f7eb92b4-kube-api-access-dlq7h\") pod \"certified-operators-mj7bp\" (UID: \"40b6446c-9fa2-4b03-b0a9-7925f7eb92b4\") " pod="openshift-marketplace/certified-operators-mj7bp" Dec 16 07:40:33 crc kubenswrapper[4823]: I1216 07:40:33.922883 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40b6446c-9fa2-4b03-b0a9-7925f7eb92b4-catalog-content\") pod \"certified-operators-mj7bp\" (UID: \"40b6446c-9fa2-4b03-b0a9-7925f7eb92b4\") " pod="openshift-marketplace/certified-operators-mj7bp" Dec 16 07:40:33 crc kubenswrapper[4823]: I1216 07:40:33.923319 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40b6446c-9fa2-4b03-b0a9-7925f7eb92b4-utilities\") pod \"certified-operators-mj7bp\" (UID: \"40b6446c-9fa2-4b03-b0a9-7925f7eb92b4\") " pod="openshift-marketplace/certified-operators-mj7bp" Dec 16 07:40:33 crc kubenswrapper[4823]: I1216 07:40:33.923349 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40b6446c-9fa2-4b03-b0a9-7925f7eb92b4-catalog-content\") pod \"certified-operators-mj7bp\" (UID: \"40b6446c-9fa2-4b03-b0a9-7925f7eb92b4\") " pod="openshift-marketplace/certified-operators-mj7bp" Dec 16 07:40:33 crc kubenswrapper[4823]: I1216 07:40:33.953294 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlq7h\" (UniqueName: \"kubernetes.io/projected/40b6446c-9fa2-4b03-b0a9-7925f7eb92b4-kube-api-access-dlq7h\") pod \"certified-operators-mj7bp\" (UID: \"40b6446c-9fa2-4b03-b0a9-7925f7eb92b4\") " pod="openshift-marketplace/certified-operators-mj7bp" Dec 16 07:40:34 crc kubenswrapper[4823]: I1216 07:40:34.098568 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mj7bp" Dec 16 07:40:34 crc kubenswrapper[4823]: I1216 07:40:34.596214 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mj7bp"] Dec 16 07:40:34 crc kubenswrapper[4823]: I1216 07:40:34.892892 4823 generic.go:334] "Generic (PLEG): container finished" podID="40b6446c-9fa2-4b03-b0a9-7925f7eb92b4" containerID="13deec4d717db522c4273b328a95bc32a62f361c1e55e20c873d53a541a5e1e1" exitCode=0 Dec 16 07:40:34 crc kubenswrapper[4823]: I1216 07:40:34.892980 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mj7bp" event={"ID":"40b6446c-9fa2-4b03-b0a9-7925f7eb92b4","Type":"ContainerDied","Data":"13deec4d717db522c4273b328a95bc32a62f361c1e55e20c873d53a541a5e1e1"} Dec 16 07:40:34 crc kubenswrapper[4823]: I1216 07:40:34.894257 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mj7bp" event={"ID":"40b6446c-9fa2-4b03-b0a9-7925f7eb92b4","Type":"ContainerStarted","Data":"730164d48dbe22b760891ec9fb78adaba482bce68b0e52f9905443c56aad8aac"} Dec 16 07:40:35 crc kubenswrapper[4823]: I1216 07:40:35.910600 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mj7bp" event={"ID":"40b6446c-9fa2-4b03-b0a9-7925f7eb92b4","Type":"ContainerStarted","Data":"f850b5badc20188f268fa48c47e469cf326cd645755c00e979d3a225d1ccaafd"} Dec 16 07:40:36 crc kubenswrapper[4823]: I1216 07:40:36.924323 4823 generic.go:334] "Generic (PLEG): container finished" podID="40b6446c-9fa2-4b03-b0a9-7925f7eb92b4" containerID="f850b5badc20188f268fa48c47e469cf326cd645755c00e979d3a225d1ccaafd" exitCode=0 Dec 16 07:40:36 crc kubenswrapper[4823]: I1216 07:40:36.924375 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mj7bp" 
event={"ID":"40b6446c-9fa2-4b03-b0a9-7925f7eb92b4","Type":"ContainerDied","Data":"f850b5badc20188f268fa48c47e469cf326cd645755c00e979d3a225d1ccaafd"} Dec 16 07:40:37 crc kubenswrapper[4823]: I1216 07:40:37.939593 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mj7bp" event={"ID":"40b6446c-9fa2-4b03-b0a9-7925f7eb92b4","Type":"ContainerStarted","Data":"4c4afcf9ed55c4084248a562951af0104e07cca18bb5e172e6db1df76ba9ae03"} Dec 16 07:40:37 crc kubenswrapper[4823]: I1216 07:40:37.964554 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-mj7bp" podStartSLOduration=2.265189731 podStartE2EDuration="4.964532528s" podCreationTimestamp="2025-12-16 07:40:33 +0000 UTC" firstStartedPulling="2025-12-16 07:40:34.895458439 +0000 UTC m=+2713.384024562" lastFinishedPulling="2025-12-16 07:40:37.594801236 +0000 UTC m=+2716.083367359" observedRunningTime="2025-12-16 07:40:37.958964383 +0000 UTC m=+2716.447530506" watchObservedRunningTime="2025-12-16 07:40:37.964532528 +0000 UTC m=+2716.453098651" Dec 16 07:40:44 crc kubenswrapper[4823]: I1216 07:40:44.098739 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-mj7bp" Dec 16 07:40:44 crc kubenswrapper[4823]: I1216 07:40:44.099327 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-mj7bp" Dec 16 07:40:44 crc kubenswrapper[4823]: I1216 07:40:44.150271 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-mj7bp" Dec 16 07:40:45 crc kubenswrapper[4823]: I1216 07:40:45.028040 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-mj7bp" Dec 16 07:40:46 crc kubenswrapper[4823]: I1216 07:40:46.788224 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-mj7bp"] Dec 16 07:40:46 crc kubenswrapper[4823]: I1216 07:40:46.997983 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-mj7bp" podUID="40b6446c-9fa2-4b03-b0a9-7925f7eb92b4" containerName="registry-server" containerID="cri-o://4c4afcf9ed55c4084248a562951af0104e07cca18bb5e172e6db1df76ba9ae03" gracePeriod=2 Dec 16 07:40:48 crc kubenswrapper[4823]: I1216 07:40:48.012481 4823 generic.go:334] "Generic (PLEG): container finished" podID="40b6446c-9fa2-4b03-b0a9-7925f7eb92b4" containerID="4c4afcf9ed55c4084248a562951af0104e07cca18bb5e172e6db1df76ba9ae03" exitCode=0 Dec 16 07:40:48 crc kubenswrapper[4823]: I1216 07:40:48.012828 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mj7bp" event={"ID":"40b6446c-9fa2-4b03-b0a9-7925f7eb92b4","Type":"ContainerDied","Data":"4c4afcf9ed55c4084248a562951af0104e07cca18bb5e172e6db1df76ba9ae03"} Dec 16 07:40:48 crc kubenswrapper[4823]: I1216 07:40:48.129436 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mj7bp" Dec 16 07:40:48 crc kubenswrapper[4823]: I1216 07:40:48.142701 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40b6446c-9fa2-4b03-b0a9-7925f7eb92b4-utilities\") pod \"40b6446c-9fa2-4b03-b0a9-7925f7eb92b4\" (UID: \"40b6446c-9fa2-4b03-b0a9-7925f7eb92b4\") " Dec 16 07:40:48 crc kubenswrapper[4823]: I1216 07:40:48.142790 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dlq7h\" (UniqueName: \"kubernetes.io/projected/40b6446c-9fa2-4b03-b0a9-7925f7eb92b4-kube-api-access-dlq7h\") pod \"40b6446c-9fa2-4b03-b0a9-7925f7eb92b4\" (UID: \"40b6446c-9fa2-4b03-b0a9-7925f7eb92b4\") " Dec 16 07:40:48 crc kubenswrapper[4823]: I1216 07:40:48.142842 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40b6446c-9fa2-4b03-b0a9-7925f7eb92b4-catalog-content\") pod \"40b6446c-9fa2-4b03-b0a9-7925f7eb92b4\" (UID: \"40b6446c-9fa2-4b03-b0a9-7925f7eb92b4\") " Dec 16 07:40:48 crc kubenswrapper[4823]: I1216 07:40:48.143682 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40b6446c-9fa2-4b03-b0a9-7925f7eb92b4-utilities" (OuterVolumeSpecName: "utilities") pod "40b6446c-9fa2-4b03-b0a9-7925f7eb92b4" (UID: "40b6446c-9fa2-4b03-b0a9-7925f7eb92b4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:40:48 crc kubenswrapper[4823]: I1216 07:40:48.157322 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40b6446c-9fa2-4b03-b0a9-7925f7eb92b4-kube-api-access-dlq7h" (OuterVolumeSpecName: "kube-api-access-dlq7h") pod "40b6446c-9fa2-4b03-b0a9-7925f7eb92b4" (UID: "40b6446c-9fa2-4b03-b0a9-7925f7eb92b4"). InnerVolumeSpecName "kube-api-access-dlq7h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:40:48 crc kubenswrapper[4823]: I1216 07:40:48.210855 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40b6446c-9fa2-4b03-b0a9-7925f7eb92b4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "40b6446c-9fa2-4b03-b0a9-7925f7eb92b4" (UID: "40b6446c-9fa2-4b03-b0a9-7925f7eb92b4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:40:48 crc kubenswrapper[4823]: I1216 07:40:48.244844 4823 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40b6446c-9fa2-4b03-b0a9-7925f7eb92b4-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 07:40:48 crc kubenswrapper[4823]: I1216 07:40:48.244936 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dlq7h\" (UniqueName: \"kubernetes.io/projected/40b6446c-9fa2-4b03-b0a9-7925f7eb92b4-kube-api-access-dlq7h\") on node \"crc\" DevicePath \"\"" Dec 16 07:40:48 crc kubenswrapper[4823]: I1216 07:40:48.244956 4823 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40b6446c-9fa2-4b03-b0a9-7925f7eb92b4-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 07:40:49 crc kubenswrapper[4823]: I1216 07:40:49.021703 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mj7bp" event={"ID":"40b6446c-9fa2-4b03-b0a9-7925f7eb92b4","Type":"ContainerDied","Data":"730164d48dbe22b760891ec9fb78adaba482bce68b0e52f9905443c56aad8aac"} Dec 16 07:40:49 crc kubenswrapper[4823]: I1216 07:40:49.021770 4823 scope.go:117] "RemoveContainer" containerID="4c4afcf9ed55c4084248a562951af0104e07cca18bb5e172e6db1df76ba9ae03" Dec 16 07:40:49 crc kubenswrapper[4823]: I1216 07:40:49.021783 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mj7bp" Dec 16 07:40:49 crc kubenswrapper[4823]: I1216 07:40:49.040123 4823 scope.go:117] "RemoveContainer" containerID="f850b5badc20188f268fa48c47e469cf326cd645755c00e979d3a225d1ccaafd" Dec 16 07:40:49 crc kubenswrapper[4823]: I1216 07:40:49.053068 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mj7bp"] Dec 16 07:40:49 crc kubenswrapper[4823]: I1216 07:40:49.059111 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-mj7bp"] Dec 16 07:40:49 crc kubenswrapper[4823]: I1216 07:40:49.065913 4823 scope.go:117] "RemoveContainer" containerID="13deec4d717db522c4273b328a95bc32a62f361c1e55e20c873d53a541a5e1e1" Dec 16 07:40:49 crc kubenswrapper[4823]: I1216 07:40:49.782781 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40b6446c-9fa2-4b03-b0a9-7925f7eb92b4" path="/var/lib/kubelet/pods/40b6446c-9fa2-4b03-b0a9-7925f7eb92b4/volumes" Dec 16 07:40:58 crc kubenswrapper[4823]: I1216 07:40:58.134935 4823 patch_prober.go:28] interesting pod/machine-config-daemon-fv56f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 07:40:58 crc kubenswrapper[4823]: I1216 07:40:58.135567 4823 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 07:40:58 crc kubenswrapper[4823]: I1216 07:40:58.135621 4823 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" Dec 16 
07:40:58 crc kubenswrapper[4823]: I1216 07:40:58.136343 4823 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c834dad71dd152fdffca68bf2b931bcb63c99e897f038cd6ad832b07707115a5"} pod="openshift-machine-config-operator/machine-config-daemon-fv56f" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 16 07:40:58 crc kubenswrapper[4823]: I1216 07:40:58.136410 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" containerName="machine-config-daemon" containerID="cri-o://c834dad71dd152fdffca68bf2b931bcb63c99e897f038cd6ad832b07707115a5" gracePeriod=600 Dec 16 07:40:59 crc kubenswrapper[4823]: I1216 07:40:59.094132 4823 generic.go:334] "Generic (PLEG): container finished" podID="25dec47c-3043-486c-b371-2be103c214e3" containerID="c834dad71dd152fdffca68bf2b931bcb63c99e897f038cd6ad832b07707115a5" exitCode=0 Dec 16 07:40:59 crc kubenswrapper[4823]: I1216 07:40:59.094206 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" event={"ID":"25dec47c-3043-486c-b371-2be103c214e3","Type":"ContainerDied","Data":"c834dad71dd152fdffca68bf2b931bcb63c99e897f038cd6ad832b07707115a5"} Dec 16 07:40:59 crc kubenswrapper[4823]: I1216 07:40:59.094680 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" event={"ID":"25dec47c-3043-486c-b371-2be103c214e3","Type":"ContainerStarted","Data":"8122cdf24d33128bdb673858f354f06ec5dffb9a8e2b01561997db4f050635c7"} Dec 16 07:40:59 crc kubenswrapper[4823]: I1216 07:40:59.094707 4823 scope.go:117] "RemoveContainer" containerID="9050a18beacbae3340b69e617e1ef6c1a7391ea7818aeeb48fc6cc970d45d0d3" Dec 16 07:42:06 crc kubenswrapper[4823]: I1216 07:42:06.933539 4823 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-mkgvs"]
Dec 16 07:42:06 crc kubenswrapper[4823]: E1216 07:42:06.934503 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40b6446c-9fa2-4b03-b0a9-7925f7eb92b4" containerName="registry-server"
Dec 16 07:42:06 crc kubenswrapper[4823]: I1216 07:42:06.934520 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="40b6446c-9fa2-4b03-b0a9-7925f7eb92b4" containerName="registry-server"
Dec 16 07:42:06 crc kubenswrapper[4823]: E1216 07:42:06.934535 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40b6446c-9fa2-4b03-b0a9-7925f7eb92b4" containerName="extract-utilities"
Dec 16 07:42:06 crc kubenswrapper[4823]: I1216 07:42:06.934543 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="40b6446c-9fa2-4b03-b0a9-7925f7eb92b4" containerName="extract-utilities"
Dec 16 07:42:06 crc kubenswrapper[4823]: E1216 07:42:06.934572 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40b6446c-9fa2-4b03-b0a9-7925f7eb92b4" containerName="extract-content"
Dec 16 07:42:06 crc kubenswrapper[4823]: I1216 07:42:06.934582 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="40b6446c-9fa2-4b03-b0a9-7925f7eb92b4" containerName="extract-content"
Dec 16 07:42:06 crc kubenswrapper[4823]: I1216 07:42:06.934752 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="40b6446c-9fa2-4b03-b0a9-7925f7eb92b4" containerName="registry-server"
Dec 16 07:42:06 crc kubenswrapper[4823]: I1216 07:42:06.936136 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mkgvs"
Dec 16 07:42:06 crc kubenswrapper[4823]: I1216 07:42:06.943999 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mkgvs"]
Dec 16 07:42:07 crc kubenswrapper[4823]: I1216 07:42:07.055344 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfq6t\" (UniqueName: \"kubernetes.io/projected/d11b4462-a27e-46d6-9b58-1e30042b53e5-kube-api-access-zfq6t\") pod \"redhat-marketplace-mkgvs\" (UID: \"d11b4462-a27e-46d6-9b58-1e30042b53e5\") " pod="openshift-marketplace/redhat-marketplace-mkgvs"
Dec 16 07:42:07 crc kubenswrapper[4823]: I1216 07:42:07.055667 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d11b4462-a27e-46d6-9b58-1e30042b53e5-utilities\") pod \"redhat-marketplace-mkgvs\" (UID: \"d11b4462-a27e-46d6-9b58-1e30042b53e5\") " pod="openshift-marketplace/redhat-marketplace-mkgvs"
Dec 16 07:42:07 crc kubenswrapper[4823]: I1216 07:42:07.055765 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d11b4462-a27e-46d6-9b58-1e30042b53e5-catalog-content\") pod \"redhat-marketplace-mkgvs\" (UID: \"d11b4462-a27e-46d6-9b58-1e30042b53e5\") " pod="openshift-marketplace/redhat-marketplace-mkgvs"
Dec 16 07:42:07 crc kubenswrapper[4823]: I1216 07:42:07.157562 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zfq6t\" (UniqueName: \"kubernetes.io/projected/d11b4462-a27e-46d6-9b58-1e30042b53e5-kube-api-access-zfq6t\") pod \"redhat-marketplace-mkgvs\" (UID: \"d11b4462-a27e-46d6-9b58-1e30042b53e5\") " pod="openshift-marketplace/redhat-marketplace-mkgvs"
Dec 16 07:42:07 crc kubenswrapper[4823]: I1216 07:42:07.157663 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d11b4462-a27e-46d6-9b58-1e30042b53e5-utilities\") pod \"redhat-marketplace-mkgvs\" (UID: \"d11b4462-a27e-46d6-9b58-1e30042b53e5\") " pod="openshift-marketplace/redhat-marketplace-mkgvs"
Dec 16 07:42:07 crc kubenswrapper[4823]: I1216 07:42:07.157689 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d11b4462-a27e-46d6-9b58-1e30042b53e5-catalog-content\") pod \"redhat-marketplace-mkgvs\" (UID: \"d11b4462-a27e-46d6-9b58-1e30042b53e5\") " pod="openshift-marketplace/redhat-marketplace-mkgvs"
Dec 16 07:42:07 crc kubenswrapper[4823]: I1216 07:42:07.158277 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d11b4462-a27e-46d6-9b58-1e30042b53e5-catalog-content\") pod \"redhat-marketplace-mkgvs\" (UID: \"d11b4462-a27e-46d6-9b58-1e30042b53e5\") " pod="openshift-marketplace/redhat-marketplace-mkgvs"
Dec 16 07:42:07 crc kubenswrapper[4823]: I1216 07:42:07.158476 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d11b4462-a27e-46d6-9b58-1e30042b53e5-utilities\") pod \"redhat-marketplace-mkgvs\" (UID: \"d11b4462-a27e-46d6-9b58-1e30042b53e5\") " pod="openshift-marketplace/redhat-marketplace-mkgvs"
Dec 16 07:42:07 crc kubenswrapper[4823]: I1216 07:42:07.182886 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfq6t\" (UniqueName: \"kubernetes.io/projected/d11b4462-a27e-46d6-9b58-1e30042b53e5-kube-api-access-zfq6t\") pod \"redhat-marketplace-mkgvs\" (UID: \"d11b4462-a27e-46d6-9b58-1e30042b53e5\") " pod="openshift-marketplace/redhat-marketplace-mkgvs"
Dec 16 07:42:07 crc kubenswrapper[4823]: I1216 07:42:07.261633 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mkgvs"
Dec 16 07:42:07 crc kubenswrapper[4823]: I1216 07:42:07.746514 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mkgvs"]
Dec 16 07:42:08 crc kubenswrapper[4823]: I1216 07:42:08.635257 4823 generic.go:334] "Generic (PLEG): container finished" podID="d11b4462-a27e-46d6-9b58-1e30042b53e5" containerID="0a98f9b3f661b0cdd3cb6ecefc16d938d51440c09d3442bd4638d27248732716" exitCode=0
Dec 16 07:42:08 crc kubenswrapper[4823]: I1216 07:42:08.635375 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mkgvs" event={"ID":"d11b4462-a27e-46d6-9b58-1e30042b53e5","Type":"ContainerDied","Data":"0a98f9b3f661b0cdd3cb6ecefc16d938d51440c09d3442bd4638d27248732716"}
Dec 16 07:42:08 crc kubenswrapper[4823]: I1216 07:42:08.635532 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mkgvs" event={"ID":"d11b4462-a27e-46d6-9b58-1e30042b53e5","Type":"ContainerStarted","Data":"4d524069fa340ff0e534bf4912d1e0df0a03344ed8aa8d2a48c28be8338ca9b0"}
Dec 16 07:42:10 crc kubenswrapper[4823]: I1216 07:42:10.671377 4823 generic.go:334] "Generic (PLEG): container finished" podID="d11b4462-a27e-46d6-9b58-1e30042b53e5" containerID="cb080b9fe1b7a27926b2eefcebae0fbe4337716c31951112d937497fd7ad96c4" exitCode=0
Dec 16 07:42:10 crc kubenswrapper[4823]: I1216 07:42:10.671732 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mkgvs" event={"ID":"d11b4462-a27e-46d6-9b58-1e30042b53e5","Type":"ContainerDied","Data":"cb080b9fe1b7a27926b2eefcebae0fbe4337716c31951112d937497fd7ad96c4"}
Dec 16 07:42:11 crc kubenswrapper[4823]: I1216 07:42:11.683511 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mkgvs" event={"ID":"d11b4462-a27e-46d6-9b58-1e30042b53e5","Type":"ContainerStarted","Data":"bd3265a732fb28b1023952dea4a96a85a979e97c1e86c4a67c7b183c0b0b6baa"}
Dec 16 07:42:11 crc kubenswrapper[4823]: I1216 07:42:11.707856 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-mkgvs" podStartSLOduration=3.208326929 podStartE2EDuration="5.707837821s" podCreationTimestamp="2025-12-16 07:42:06 +0000 UTC" firstStartedPulling="2025-12-16 07:42:08.636841902 +0000 UTC m=+2807.125408025" lastFinishedPulling="2025-12-16 07:42:11.136352794 +0000 UTC m=+2809.624918917" observedRunningTime="2025-12-16 07:42:11.702508274 +0000 UTC m=+2810.191074417" watchObservedRunningTime="2025-12-16 07:42:11.707837821 +0000 UTC m=+2810.196403944"
Dec 16 07:42:17 crc kubenswrapper[4823]: I1216 07:42:17.262602 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-mkgvs"
Dec 16 07:42:17 crc kubenswrapper[4823]: I1216 07:42:17.263231 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-mkgvs"
Dec 16 07:42:17 crc kubenswrapper[4823]: I1216 07:42:17.306332 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-mkgvs"
Dec 16 07:42:17 crc kubenswrapper[4823]: I1216 07:42:17.784925 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-mkgvs"
Dec 16 07:42:17 crc kubenswrapper[4823]: I1216 07:42:17.843413 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mkgvs"]
Dec 16 07:42:19 crc kubenswrapper[4823]: I1216 07:42:19.741666 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-mkgvs" podUID="d11b4462-a27e-46d6-9b58-1e30042b53e5" containerName="registry-server" containerID="cri-o://bd3265a732fb28b1023952dea4a96a85a979e97c1e86c4a67c7b183c0b0b6baa" gracePeriod=2
Dec 16 07:42:20 crc kubenswrapper[4823]: I1216 07:42:20.762961 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mkgvs" event={"ID":"d11b4462-a27e-46d6-9b58-1e30042b53e5","Type":"ContainerDied","Data":"bd3265a732fb28b1023952dea4a96a85a979e97c1e86c4a67c7b183c0b0b6baa"}
Dec 16 07:42:20 crc kubenswrapper[4823]: I1216 07:42:20.762909 4823 generic.go:334] "Generic (PLEG): container finished" podID="d11b4462-a27e-46d6-9b58-1e30042b53e5" containerID="bd3265a732fb28b1023952dea4a96a85a979e97c1e86c4a67c7b183c0b0b6baa" exitCode=0
Dec 16 07:42:20 crc kubenswrapper[4823]: I1216 07:42:20.825888 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mkgvs"
Dec 16 07:42:20 crc kubenswrapper[4823]: I1216 07:42:20.955282 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d11b4462-a27e-46d6-9b58-1e30042b53e5-utilities\") pod \"d11b4462-a27e-46d6-9b58-1e30042b53e5\" (UID: \"d11b4462-a27e-46d6-9b58-1e30042b53e5\") "
Dec 16 07:42:20 crc kubenswrapper[4823]: I1216 07:42:20.955425 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zfq6t\" (UniqueName: \"kubernetes.io/projected/d11b4462-a27e-46d6-9b58-1e30042b53e5-kube-api-access-zfq6t\") pod \"d11b4462-a27e-46d6-9b58-1e30042b53e5\" (UID: \"d11b4462-a27e-46d6-9b58-1e30042b53e5\") "
Dec 16 07:42:20 crc kubenswrapper[4823]: I1216 07:42:20.955469 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d11b4462-a27e-46d6-9b58-1e30042b53e5-catalog-content\") pod \"d11b4462-a27e-46d6-9b58-1e30042b53e5\" (UID: \"d11b4462-a27e-46d6-9b58-1e30042b53e5\") "
Dec 16 07:42:20 crc kubenswrapper[4823]: I1216 07:42:20.956463 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d11b4462-a27e-46d6-9b58-1e30042b53e5-utilities" (OuterVolumeSpecName: "utilities") pod "d11b4462-a27e-46d6-9b58-1e30042b53e5" (UID: "d11b4462-a27e-46d6-9b58-1e30042b53e5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 16 07:42:20 crc kubenswrapper[4823]: I1216 07:42:20.961376 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d11b4462-a27e-46d6-9b58-1e30042b53e5-kube-api-access-zfq6t" (OuterVolumeSpecName: "kube-api-access-zfq6t") pod "d11b4462-a27e-46d6-9b58-1e30042b53e5" (UID: "d11b4462-a27e-46d6-9b58-1e30042b53e5"). InnerVolumeSpecName "kube-api-access-zfq6t". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 07:42:20 crc kubenswrapper[4823]: I1216 07:42:20.976619 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d11b4462-a27e-46d6-9b58-1e30042b53e5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d11b4462-a27e-46d6-9b58-1e30042b53e5" (UID: "d11b4462-a27e-46d6-9b58-1e30042b53e5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 16 07:42:21 crc kubenswrapper[4823]: I1216 07:42:21.057024 4823 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d11b4462-a27e-46d6-9b58-1e30042b53e5-utilities\") on node \"crc\" DevicePath \"\""
Dec 16 07:42:21 crc kubenswrapper[4823]: I1216 07:42:21.057091 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zfq6t\" (UniqueName: \"kubernetes.io/projected/d11b4462-a27e-46d6-9b58-1e30042b53e5-kube-api-access-zfq6t\") on node \"crc\" DevicePath \"\""
Dec 16 07:42:21 crc kubenswrapper[4823]: I1216 07:42:21.057107 4823 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d11b4462-a27e-46d6-9b58-1e30042b53e5-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 16 07:42:21 crc kubenswrapper[4823]: I1216 07:42:21.778141 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mkgvs"
Dec 16 07:42:21 crc kubenswrapper[4823]: I1216 07:42:21.783603 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mkgvs" event={"ID":"d11b4462-a27e-46d6-9b58-1e30042b53e5","Type":"ContainerDied","Data":"4d524069fa340ff0e534bf4912d1e0df0a03344ed8aa8d2a48c28be8338ca9b0"}
Dec 16 07:42:21 crc kubenswrapper[4823]: I1216 07:42:21.783650 4823 scope.go:117] "RemoveContainer" containerID="bd3265a732fb28b1023952dea4a96a85a979e97c1e86c4a67c7b183c0b0b6baa"
Dec 16 07:42:21 crc kubenswrapper[4823]: I1216 07:42:21.812114 4823 scope.go:117] "RemoveContainer" containerID="cb080b9fe1b7a27926b2eefcebae0fbe4337716c31951112d937497fd7ad96c4"
Dec 16 07:42:21 crc kubenswrapper[4823]: I1216 07:42:21.818223 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mkgvs"]
Dec 16 07:42:21 crc kubenswrapper[4823]: I1216 07:42:21.830483 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-mkgvs"]
Dec 16 07:42:21 crc kubenswrapper[4823]: I1216 07:42:21.837800 4823 scope.go:117] "RemoveContainer" containerID="0a98f9b3f661b0cdd3cb6ecefc16d938d51440c09d3442bd4638d27248732716"
Dec 16 07:42:23 crc kubenswrapper[4823]: I1216 07:42:23.782093 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d11b4462-a27e-46d6-9b58-1e30042b53e5" path="/var/lib/kubelet/pods/d11b4462-a27e-46d6-9b58-1e30042b53e5/volumes"
Dec 16 07:42:29 crc kubenswrapper[4823]: E1216 07:42:29.843009 4823 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd11b4462_a27e_46d6_9b58_1e30042b53e5.slice\": RecentStats: unable to find data in memory cache]"
Dec 16 07:42:40 crc kubenswrapper[4823]: E1216 07:42:40.012499 4823 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd11b4462_a27e_46d6_9b58_1e30042b53e5.slice\": RecentStats: unable to find data in memory cache]"
Dec 16 07:42:50 crc kubenswrapper[4823]: E1216 07:42:50.191659 4823 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd11b4462_a27e_46d6_9b58_1e30042b53e5.slice\": RecentStats: unable to find data in memory cache]"
Dec 16 07:42:58 crc kubenswrapper[4823]: I1216 07:42:58.134526 4823 patch_prober.go:28] interesting pod/machine-config-daemon-fv56f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 16 07:42:58 crc kubenswrapper[4823]: I1216 07:42:58.135214 4823 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 16 07:43:00 crc kubenswrapper[4823]: E1216 07:43:00.384566 4823 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd11b4462_a27e_46d6_9b58_1e30042b53e5.slice\": RecentStats: unable to find data in memory cache]"
Dec 16 07:43:10 crc kubenswrapper[4823]: E1216 07:43:10.566511 4823 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd11b4462_a27e_46d6_9b58_1e30042b53e5.slice\": RecentStats: unable to find data in memory cache]"
Dec 16 07:43:20 crc kubenswrapper[4823]: E1216 07:43:20.760995 4823 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd11b4462_a27e_46d6_9b58_1e30042b53e5.slice\": RecentStats: unable to find data in memory cache]"
Dec 16 07:43:28 crc kubenswrapper[4823]: I1216 07:43:28.133892 4823 patch_prober.go:28] interesting pod/machine-config-daemon-fv56f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 16 07:43:28 crc kubenswrapper[4823]: I1216 07:43:28.134356 4823 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 16 07:43:58 crc kubenswrapper[4823]: I1216 07:43:58.133832 4823 patch_prober.go:28] interesting pod/machine-config-daemon-fv56f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 16 07:43:58 crc kubenswrapper[4823]: I1216 07:43:58.134576 4823 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 16 07:43:58 crc kubenswrapper[4823]: I1216 07:43:58.134640 4823 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fv56f"
Dec 16 07:43:58 crc kubenswrapper[4823]: I1216 07:43:58.135478 4823 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8122cdf24d33128bdb673858f354f06ec5dffb9a8e2b01561997db4f050635c7"} pod="openshift-machine-config-operator/machine-config-daemon-fv56f" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 16 07:43:58 crc kubenswrapper[4823]: I1216 07:43:58.135620 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" containerName="machine-config-daemon" containerID="cri-o://8122cdf24d33128bdb673858f354f06ec5dffb9a8e2b01561997db4f050635c7" gracePeriod=600
Dec 16 07:43:58 crc kubenswrapper[4823]: E1216 07:43:58.257926 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3"
Dec 16 07:43:58 crc kubenswrapper[4823]: I1216 07:43:58.452241 4823 generic.go:334] "Generic (PLEG): container finished" podID="25dec47c-3043-486c-b371-2be103c214e3" containerID="8122cdf24d33128bdb673858f354f06ec5dffb9a8e2b01561997db4f050635c7" exitCode=0
Dec 16 07:43:58 crc kubenswrapper[4823]: I1216 07:43:58.452290 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" event={"ID":"25dec47c-3043-486c-b371-2be103c214e3","Type":"ContainerDied","Data":"8122cdf24d33128bdb673858f354f06ec5dffb9a8e2b01561997db4f050635c7"}
Dec 16 07:43:58 crc kubenswrapper[4823]: I1216 07:43:58.452328 4823 scope.go:117] "RemoveContainer" containerID="c834dad71dd152fdffca68bf2b931bcb63c99e897f038cd6ad832b07707115a5"
Dec 16 07:43:58 crc kubenswrapper[4823]: I1216 07:43:58.452791 4823 scope.go:117] "RemoveContainer" containerID="8122cdf24d33128bdb673858f354f06ec5dffb9a8e2b01561997db4f050635c7"
Dec 16 07:43:58 crc kubenswrapper[4823]: E1216 07:43:58.452995 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3"
Dec 16 07:44:11 crc kubenswrapper[4823]: I1216 07:44:11.775945 4823 scope.go:117] "RemoveContainer" containerID="8122cdf24d33128bdb673858f354f06ec5dffb9a8e2b01561997db4f050635c7"
Dec 16 07:44:11 crc kubenswrapper[4823]: E1216 07:44:11.776436 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3"
Dec 16 07:44:24 crc kubenswrapper[4823]: I1216 07:44:24.772166 4823 scope.go:117] "RemoveContainer" containerID="8122cdf24d33128bdb673858f354f06ec5dffb9a8e2b01561997db4f050635c7"
Dec 16 07:44:24 crc kubenswrapper[4823]: E1216 07:44:24.772705 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3"
Dec 16 07:44:36 crc kubenswrapper[4823]: I1216 07:44:36.771303 4823 scope.go:117] "RemoveContainer" containerID="8122cdf24d33128bdb673858f354f06ec5dffb9a8e2b01561997db4f050635c7"
Dec 16 07:44:36 crc kubenswrapper[4823]: E1216 07:44:36.772229 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3"
Dec 16 07:44:50 crc kubenswrapper[4823]: I1216 07:44:50.225311 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-6zx4p"]
Dec 16 07:44:50 crc kubenswrapper[4823]: E1216 07:44:50.229723 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d11b4462-a27e-46d6-9b58-1e30042b53e5" containerName="extract-utilities"
Dec 16 07:44:50 crc kubenswrapper[4823]: I1216 07:44:50.229976 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="d11b4462-a27e-46d6-9b58-1e30042b53e5" containerName="extract-utilities"
Dec 16 07:44:50 crc kubenswrapper[4823]: E1216 07:44:50.230212 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d11b4462-a27e-46d6-9b58-1e30042b53e5" containerName="registry-server"
Dec 16 07:44:50 crc kubenswrapper[4823]: I1216 07:44:50.230322 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="d11b4462-a27e-46d6-9b58-1e30042b53e5" containerName="registry-server"
Dec 16 07:44:50 crc kubenswrapper[4823]: E1216 07:44:50.230407 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d11b4462-a27e-46d6-9b58-1e30042b53e5" containerName="extract-content"
Dec 16 07:44:50 crc kubenswrapper[4823]: I1216 07:44:50.230480 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="d11b4462-a27e-46d6-9b58-1e30042b53e5" containerName="extract-content"
Dec 16 07:44:50 crc kubenswrapper[4823]: I1216 07:44:50.230737 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="d11b4462-a27e-46d6-9b58-1e30042b53e5" containerName="registry-server"
Dec 16 07:44:50 crc kubenswrapper[4823]: I1216 07:44:50.232018 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6zx4p"
Dec 16 07:44:50 crc kubenswrapper[4823]: I1216 07:44:50.244820 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6zx4p"]
Dec 16 07:44:50 crc kubenswrapper[4823]: I1216 07:44:50.318211 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twv8j\" (UniqueName: \"kubernetes.io/projected/28a504fe-f44c-4b99-a2a7-a09d7eb15017-kube-api-access-twv8j\") pod \"community-operators-6zx4p\" (UID: \"28a504fe-f44c-4b99-a2a7-a09d7eb15017\") " pod="openshift-marketplace/community-operators-6zx4p"
Dec 16 07:44:50 crc kubenswrapper[4823]: I1216 07:44:50.318281 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28a504fe-f44c-4b99-a2a7-a09d7eb15017-utilities\") pod \"community-operators-6zx4p\" (UID: \"28a504fe-f44c-4b99-a2a7-a09d7eb15017\") " pod="openshift-marketplace/community-operators-6zx4p"
Dec 16 07:44:50 crc kubenswrapper[4823]: I1216 07:44:50.318394 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28a504fe-f44c-4b99-a2a7-a09d7eb15017-catalog-content\") pod \"community-operators-6zx4p\" (UID: \"28a504fe-f44c-4b99-a2a7-a09d7eb15017\") " pod="openshift-marketplace/community-operators-6zx4p"
Dec 16 07:44:50 crc kubenswrapper[4823]: I1216 07:44:50.419722 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28a504fe-f44c-4b99-a2a7-a09d7eb15017-catalog-content\") pod \"community-operators-6zx4p\" (UID: \"28a504fe-f44c-4b99-a2a7-a09d7eb15017\") " pod="openshift-marketplace/community-operators-6zx4p"
Dec 16 07:44:50 crc kubenswrapper[4823]: I1216 07:44:50.419932 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twv8j\" (UniqueName: \"kubernetes.io/projected/28a504fe-f44c-4b99-a2a7-a09d7eb15017-kube-api-access-twv8j\") pod \"community-operators-6zx4p\" (UID: \"28a504fe-f44c-4b99-a2a7-a09d7eb15017\") " pod="openshift-marketplace/community-operators-6zx4p"
Dec 16 07:44:50 crc kubenswrapper[4823]: I1216 07:44:50.419986 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28a504fe-f44c-4b99-a2a7-a09d7eb15017-utilities\") pod \"community-operators-6zx4p\" (UID: \"28a504fe-f44c-4b99-a2a7-a09d7eb15017\") " pod="openshift-marketplace/community-operators-6zx4p"
Dec 16 07:44:50 crc kubenswrapper[4823]: I1216 07:44:50.420401 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28a504fe-f44c-4b99-a2a7-a09d7eb15017-catalog-content\") pod \"community-operators-6zx4p\" (UID: \"28a504fe-f44c-4b99-a2a7-a09d7eb15017\") " pod="openshift-marketplace/community-operators-6zx4p"
Dec 16 07:44:50 crc kubenswrapper[4823]: I1216 07:44:50.420714 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28a504fe-f44c-4b99-a2a7-a09d7eb15017-utilities\") pod \"community-operators-6zx4p\" (UID: \"28a504fe-f44c-4b99-a2a7-a09d7eb15017\") " pod="openshift-marketplace/community-operators-6zx4p"
Dec 16 07:44:50 crc kubenswrapper[4823]: I1216 07:44:50.449466 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twv8j\" (UniqueName: \"kubernetes.io/projected/28a504fe-f44c-4b99-a2a7-a09d7eb15017-kube-api-access-twv8j\") pod \"community-operators-6zx4p\" (UID: \"28a504fe-f44c-4b99-a2a7-a09d7eb15017\") " pod="openshift-marketplace/community-operators-6zx4p"
Dec 16 07:44:50 crc kubenswrapper[4823]: I1216 07:44:50.577707 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6zx4p"
Dec 16 07:44:50 crc kubenswrapper[4823]: I1216 07:44:50.776085 4823 scope.go:117] "RemoveContainer" containerID="8122cdf24d33128bdb673858f354f06ec5dffb9a8e2b01561997db4f050635c7"
Dec 16 07:44:50 crc kubenswrapper[4823]: E1216 07:44:50.776490 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3"
Dec 16 07:44:51 crc kubenswrapper[4823]: I1216 07:44:51.098398 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6zx4p"]
Dec 16 07:44:51 crc kubenswrapper[4823]: I1216 07:44:51.888577 4823 generic.go:334] "Generic (PLEG): container finished" podID="28a504fe-f44c-4b99-a2a7-a09d7eb15017" containerID="34866379d1d9e2c456d8900390654eafb9eaf1da211930499a9200e6fafa383e" exitCode=0
Dec 16 07:44:51 crc kubenswrapper[4823]: I1216 07:44:51.888677 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6zx4p" event={"ID":"28a504fe-f44c-4b99-a2a7-a09d7eb15017","Type":"ContainerDied","Data":"34866379d1d9e2c456d8900390654eafb9eaf1da211930499a9200e6fafa383e"}
Dec 16 07:44:51 crc kubenswrapper[4823]: I1216 07:44:51.890014 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6zx4p" event={"ID":"28a504fe-f44c-4b99-a2a7-a09d7eb15017","Type":"ContainerStarted","Data":"5da893943d34e19e7ffe3d54a44d8a37dd44c674cd240a4cc27473ad725362c4"}
Dec 16 07:44:53 crc kubenswrapper[4823]: I1216 07:44:53.913308 4823 generic.go:334] "Generic (PLEG): container finished" podID="28a504fe-f44c-4b99-a2a7-a09d7eb15017" containerID="5ee7d29933c2a2ea66f654cc039a75bb11004487ba209de633fbebc5a9dd5554" exitCode=0
Dec 16 07:44:53 crc kubenswrapper[4823]: I1216 07:44:53.913405 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6zx4p" event={"ID":"28a504fe-f44c-4b99-a2a7-a09d7eb15017","Type":"ContainerDied","Data":"5ee7d29933c2a2ea66f654cc039a75bb11004487ba209de633fbebc5a9dd5554"}
Dec 16 07:44:54 crc kubenswrapper[4823]: I1216 07:44:54.925164 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6zx4p" event={"ID":"28a504fe-f44c-4b99-a2a7-a09d7eb15017","Type":"ContainerStarted","Data":"759d3dc2aeb56e970f98b7076284e546298d7de8d4dfae7513a9a8b15ba4953e"}
Dec 16 07:44:54 crc kubenswrapper[4823]: I1216 07:44:54.944503 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-6zx4p" podStartSLOduration=2.453302725 podStartE2EDuration="4.944470885s" podCreationTimestamp="2025-12-16 07:44:50 +0000 UTC" firstStartedPulling="2025-12-16 07:44:51.890576343 +0000 UTC m=+2970.379142486" lastFinishedPulling="2025-12-16 07:44:54.381744513 +0000 UTC m=+2972.870310646" observedRunningTime="2025-12-16 07:44:54.942984009 +0000 UTC m=+2973.431550202" watchObservedRunningTime="2025-12-16 07:44:54.944470885 +0000 UTC m=+2973.433037058"
Dec 16 07:45:00 crc kubenswrapper[4823]: I1216 07:45:00.158054 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431185-zw7gv"]
Dec 16 07:45:00 crc kubenswrapper[4823]: I1216 07:45:00.161227 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431185-zw7gv"
Dec 16 07:45:00 crc kubenswrapper[4823]: I1216 07:45:00.163823 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Dec 16 07:45:00 crc kubenswrapper[4823]: I1216 07:45:00.165247 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Dec 16 07:45:00 crc kubenswrapper[4823]: I1216 07:45:00.175886 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431185-zw7gv"]
Dec 16 07:45:00 crc kubenswrapper[4823]: I1216 07:45:00.265171 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bd17a84c-a98c-4f2b-ae73-32d888461931-config-volume\") pod \"collect-profiles-29431185-zw7gv\" (UID: \"bd17a84c-a98c-4f2b-ae73-32d888461931\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431185-zw7gv"
Dec 16 07:45:00 crc kubenswrapper[4823]: I1216 07:45:00.265277 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkbvn\" (UniqueName: \"kubernetes.io/projected/bd17a84c-a98c-4f2b-ae73-32d888461931-kube-api-access-pkbvn\") pod \"collect-profiles-29431185-zw7gv\" (UID: \"bd17a84c-a98c-4f2b-ae73-32d888461931\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431185-zw7gv"
Dec 16 07:45:00 crc kubenswrapper[4823]: I1216 07:45:00.265385 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bd17a84c-a98c-4f2b-ae73-32d888461931-secret-volume\") pod \"collect-profiles-29431185-zw7gv\" (UID: \"bd17a84c-a98c-4f2b-ae73-32d888461931\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431185-zw7gv"
Dec 16 07:45:00 crc kubenswrapper[4823]: I1216 07:45:00.366841 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bd17a84c-a98c-4f2b-ae73-32d888461931-secret-volume\") pod \"collect-profiles-29431185-zw7gv\" (UID: \"bd17a84c-a98c-4f2b-ae73-32d888461931\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431185-zw7gv"
Dec 16 07:45:00 crc kubenswrapper[4823]: I1216 07:45:00.367417 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bd17a84c-a98c-4f2b-ae73-32d888461931-config-volume\") pod \"collect-profiles-29431185-zw7gv\" (UID: \"bd17a84c-a98c-4f2b-ae73-32d888461931\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431185-zw7gv"
Dec 16 07:45:00 crc kubenswrapper[4823]: I1216 07:45:00.367505 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pkbvn\" (UniqueName: \"kubernetes.io/projected/bd17a84c-a98c-4f2b-ae73-32d888461931-kube-api-access-pkbvn\") pod \"collect-profiles-29431185-zw7gv\" (UID: \"bd17a84c-a98c-4f2b-ae73-32d888461931\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431185-zw7gv"
Dec 16 07:45:00 crc kubenswrapper[4823]: I1216 07:45:00.368327 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bd17a84c-a98c-4f2b-ae73-32d888461931-config-volume\") pod \"collect-profiles-29431185-zw7gv\" (UID: \"bd17a84c-a98c-4f2b-ae73-32d888461931\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431185-zw7gv"
Dec 16 07:45:00 crc kubenswrapper[4823]: I1216 07:45:00.375550 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bd17a84c-a98c-4f2b-ae73-32d888461931-secret-volume\") pod \"collect-profiles-29431185-zw7gv\" (UID: \"bd17a84c-a98c-4f2b-ae73-32d888461931\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431185-zw7gv"
Dec 16 07:45:00 crc kubenswrapper[4823]: I1216 07:45:00.388651 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkbvn\" (UniqueName: \"kubernetes.io/projected/bd17a84c-a98c-4f2b-ae73-32d888461931-kube-api-access-pkbvn\") pod \"collect-profiles-29431185-zw7gv\" (UID: \"bd17a84c-a98c-4f2b-ae73-32d888461931\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431185-zw7gv"
Dec 16 07:45:00 crc kubenswrapper[4823]: I1216 07:45:00.480615 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431185-zw7gv"
Dec 16 07:45:00 crc kubenswrapper[4823]: I1216 07:45:00.579197 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-6zx4p"
Dec 16 07:45:00 crc kubenswrapper[4823]: I1216 07:45:00.579646 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-6zx4p"
Dec 16 07:45:00 crc kubenswrapper[4823]: I1216 07:45:00.631788 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-6zx4p"
Dec 16 07:45:00 crc kubenswrapper[4823]: I1216 07:45:00.933784 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431185-zw7gv"]
Dec 16 07:45:00 crc kubenswrapper[4823]: I1216 07:45:00.974071 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431185-zw7gv" event={"ID":"bd17a84c-a98c-4f2b-ae73-32d888461931","Type":"ContainerStarted","Data":"d7e0e0db5893de3dc419907355f2054af7b1ca3066873759f74da36b8a3cd18a"}
Dec 16
07:45:01 crc kubenswrapper[4823]: I1216 07:45:01.033591 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-6zx4p" Dec 16 07:45:01 crc kubenswrapper[4823]: I1216 07:45:01.085829 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6zx4p"] Dec 16 07:45:01 crc kubenswrapper[4823]: I1216 07:45:01.986422 4823 generic.go:334] "Generic (PLEG): container finished" podID="bd17a84c-a98c-4f2b-ae73-32d888461931" containerID="970f3a5ca4f8d0fd63741d179774e18a0052bce79bbb50353d9ebc7979c37ff3" exitCode=0 Dec 16 07:45:01 crc kubenswrapper[4823]: I1216 07:45:01.986617 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431185-zw7gv" event={"ID":"bd17a84c-a98c-4f2b-ae73-32d888461931","Type":"ContainerDied","Data":"970f3a5ca4f8d0fd63741d179774e18a0052bce79bbb50353d9ebc7979c37ff3"} Dec 16 07:45:02 crc kubenswrapper[4823]: I1216 07:45:02.995693 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-6zx4p" podUID="28a504fe-f44c-4b99-a2a7-a09d7eb15017" containerName="registry-server" containerID="cri-o://759d3dc2aeb56e970f98b7076284e546298d7de8d4dfae7513a9a8b15ba4953e" gracePeriod=2 Dec 16 07:45:03 crc kubenswrapper[4823]: I1216 07:45:03.243955 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431185-zw7gv" Dec 16 07:45:03 crc kubenswrapper[4823]: I1216 07:45:03.409602 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bd17a84c-a98c-4f2b-ae73-32d888461931-secret-volume\") pod \"bd17a84c-a98c-4f2b-ae73-32d888461931\" (UID: \"bd17a84c-a98c-4f2b-ae73-32d888461931\") " Dec 16 07:45:03 crc kubenswrapper[4823]: I1216 07:45:03.410957 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pkbvn\" (UniqueName: \"kubernetes.io/projected/bd17a84c-a98c-4f2b-ae73-32d888461931-kube-api-access-pkbvn\") pod \"bd17a84c-a98c-4f2b-ae73-32d888461931\" (UID: \"bd17a84c-a98c-4f2b-ae73-32d888461931\") " Dec 16 07:45:03 crc kubenswrapper[4823]: I1216 07:45:03.411108 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bd17a84c-a98c-4f2b-ae73-32d888461931-config-volume\") pod \"bd17a84c-a98c-4f2b-ae73-32d888461931\" (UID: \"bd17a84c-a98c-4f2b-ae73-32d888461931\") " Dec 16 07:45:03 crc kubenswrapper[4823]: I1216 07:45:03.412136 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd17a84c-a98c-4f2b-ae73-32d888461931-config-volume" (OuterVolumeSpecName: "config-volume") pod "bd17a84c-a98c-4f2b-ae73-32d888461931" (UID: "bd17a84c-a98c-4f2b-ae73-32d888461931"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:45:03 crc kubenswrapper[4823]: I1216 07:45:03.417253 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd17a84c-a98c-4f2b-ae73-32d888461931-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "bd17a84c-a98c-4f2b-ae73-32d888461931" (UID: "bd17a84c-a98c-4f2b-ae73-32d888461931"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:45:03 crc kubenswrapper[4823]: I1216 07:45:03.417508 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd17a84c-a98c-4f2b-ae73-32d888461931-kube-api-access-pkbvn" (OuterVolumeSpecName: "kube-api-access-pkbvn") pod "bd17a84c-a98c-4f2b-ae73-32d888461931" (UID: "bd17a84c-a98c-4f2b-ae73-32d888461931"). InnerVolumeSpecName "kube-api-access-pkbvn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:45:03 crc kubenswrapper[4823]: I1216 07:45:03.471730 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6zx4p" Dec 16 07:45:03 crc kubenswrapper[4823]: I1216 07:45:03.513236 4823 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bd17a84c-a98c-4f2b-ae73-32d888461931-config-volume\") on node \"crc\" DevicePath \"\"" Dec 16 07:45:03 crc kubenswrapper[4823]: I1216 07:45:03.513469 4823 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bd17a84c-a98c-4f2b-ae73-32d888461931-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 16 07:45:03 crc kubenswrapper[4823]: I1216 07:45:03.513594 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pkbvn\" (UniqueName: \"kubernetes.io/projected/bd17a84c-a98c-4f2b-ae73-32d888461931-kube-api-access-pkbvn\") on node \"crc\" DevicePath \"\"" Dec 16 07:45:03 crc kubenswrapper[4823]: I1216 07:45:03.614434 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28a504fe-f44c-4b99-a2a7-a09d7eb15017-utilities\") pod \"28a504fe-f44c-4b99-a2a7-a09d7eb15017\" (UID: \"28a504fe-f44c-4b99-a2a7-a09d7eb15017\") " Dec 16 07:45:03 crc kubenswrapper[4823]: I1216 07:45:03.614817 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28a504fe-f44c-4b99-a2a7-a09d7eb15017-catalog-content\") pod \"28a504fe-f44c-4b99-a2a7-a09d7eb15017\" (UID: \"28a504fe-f44c-4b99-a2a7-a09d7eb15017\") " Dec 16 07:45:03 crc kubenswrapper[4823]: I1216 07:45:03.614898 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-twv8j\" (UniqueName: \"kubernetes.io/projected/28a504fe-f44c-4b99-a2a7-a09d7eb15017-kube-api-access-twv8j\") pod \"28a504fe-f44c-4b99-a2a7-a09d7eb15017\" (UID: \"28a504fe-f44c-4b99-a2a7-a09d7eb15017\") " Dec 16 07:45:03 crc kubenswrapper[4823]: I1216 07:45:03.616365 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28a504fe-f44c-4b99-a2a7-a09d7eb15017-utilities" (OuterVolumeSpecName: "utilities") pod "28a504fe-f44c-4b99-a2a7-a09d7eb15017" (UID: "28a504fe-f44c-4b99-a2a7-a09d7eb15017"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:45:03 crc kubenswrapper[4823]: I1216 07:45:03.618242 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28a504fe-f44c-4b99-a2a7-a09d7eb15017-kube-api-access-twv8j" (OuterVolumeSpecName: "kube-api-access-twv8j") pod "28a504fe-f44c-4b99-a2a7-a09d7eb15017" (UID: "28a504fe-f44c-4b99-a2a7-a09d7eb15017"). InnerVolumeSpecName "kube-api-access-twv8j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:45:03 crc kubenswrapper[4823]: I1216 07:45:03.716842 4823 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28a504fe-f44c-4b99-a2a7-a09d7eb15017-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 07:45:03 crc kubenswrapper[4823]: I1216 07:45:03.717110 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-twv8j\" (UniqueName: \"kubernetes.io/projected/28a504fe-f44c-4b99-a2a7-a09d7eb15017-kube-api-access-twv8j\") on node \"crc\" DevicePath \"\"" Dec 16 07:45:03 crc kubenswrapper[4823]: I1216 07:45:03.773248 4823 scope.go:117] "RemoveContainer" containerID="8122cdf24d33128bdb673858f354f06ec5dffb9a8e2b01561997db4f050635c7" Dec 16 07:45:03 crc kubenswrapper[4823]: E1216 07:45:03.773841 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 07:45:03 crc kubenswrapper[4823]: I1216 07:45:03.806004 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28a504fe-f44c-4b99-a2a7-a09d7eb15017-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "28a504fe-f44c-4b99-a2a7-a09d7eb15017" (UID: "28a504fe-f44c-4b99-a2a7-a09d7eb15017"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:45:03 crc kubenswrapper[4823]: I1216 07:45:03.818633 4823 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28a504fe-f44c-4b99-a2a7-a09d7eb15017-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 07:45:04 crc kubenswrapper[4823]: I1216 07:45:04.005193 4823 generic.go:334] "Generic (PLEG): container finished" podID="28a504fe-f44c-4b99-a2a7-a09d7eb15017" containerID="759d3dc2aeb56e970f98b7076284e546298d7de8d4dfae7513a9a8b15ba4953e" exitCode=0 Dec 16 07:45:04 crc kubenswrapper[4823]: I1216 07:45:04.005226 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6zx4p" event={"ID":"28a504fe-f44c-4b99-a2a7-a09d7eb15017","Type":"ContainerDied","Data":"759d3dc2aeb56e970f98b7076284e546298d7de8d4dfae7513a9a8b15ba4953e"} Dec 16 07:45:04 crc kubenswrapper[4823]: I1216 07:45:04.005294 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6zx4p" event={"ID":"28a504fe-f44c-4b99-a2a7-a09d7eb15017","Type":"ContainerDied","Data":"5da893943d34e19e7ffe3d54a44d8a37dd44c674cd240a4cc27473ad725362c4"} Dec 16 07:45:04 crc kubenswrapper[4823]: I1216 07:45:04.005306 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6zx4p" Dec 16 07:45:04 crc kubenswrapper[4823]: I1216 07:45:04.005316 4823 scope.go:117] "RemoveContainer" containerID="759d3dc2aeb56e970f98b7076284e546298d7de8d4dfae7513a9a8b15ba4953e" Dec 16 07:45:04 crc kubenswrapper[4823]: I1216 07:45:04.008281 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431185-zw7gv" event={"ID":"bd17a84c-a98c-4f2b-ae73-32d888461931","Type":"ContainerDied","Data":"d7e0e0db5893de3dc419907355f2054af7b1ca3066873759f74da36b8a3cd18a"} Dec 16 07:45:04 crc kubenswrapper[4823]: I1216 07:45:04.008313 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d7e0e0db5893de3dc419907355f2054af7b1ca3066873759f74da36b8a3cd18a" Dec 16 07:45:04 crc kubenswrapper[4823]: I1216 07:45:04.008371 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431185-zw7gv" Dec 16 07:45:04 crc kubenswrapper[4823]: I1216 07:45:04.033973 4823 scope.go:117] "RemoveContainer" containerID="5ee7d29933c2a2ea66f654cc039a75bb11004487ba209de633fbebc5a9dd5554" Dec 16 07:45:04 crc kubenswrapper[4823]: I1216 07:45:04.057473 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6zx4p"] Dec 16 07:45:04 crc kubenswrapper[4823]: I1216 07:45:04.063125 4823 scope.go:117] "RemoveContainer" containerID="34866379d1d9e2c456d8900390654eafb9eaf1da211930499a9200e6fafa383e" Dec 16 07:45:04 crc kubenswrapper[4823]: I1216 07:45:04.066400 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-6zx4p"] Dec 16 07:45:04 crc kubenswrapper[4823]: I1216 07:45:04.089188 4823 scope.go:117] "RemoveContainer" containerID="759d3dc2aeb56e970f98b7076284e546298d7de8d4dfae7513a9a8b15ba4953e" Dec 16 07:45:04 crc kubenswrapper[4823]: E1216 07:45:04.089606 4823 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"759d3dc2aeb56e970f98b7076284e546298d7de8d4dfae7513a9a8b15ba4953e\": container with ID starting with 759d3dc2aeb56e970f98b7076284e546298d7de8d4dfae7513a9a8b15ba4953e not found: ID does not exist" containerID="759d3dc2aeb56e970f98b7076284e546298d7de8d4dfae7513a9a8b15ba4953e" Dec 16 07:45:04 crc kubenswrapper[4823]: I1216 07:45:04.089636 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"759d3dc2aeb56e970f98b7076284e546298d7de8d4dfae7513a9a8b15ba4953e"} err="failed to get container status \"759d3dc2aeb56e970f98b7076284e546298d7de8d4dfae7513a9a8b15ba4953e\": rpc error: code = NotFound desc = could not find container \"759d3dc2aeb56e970f98b7076284e546298d7de8d4dfae7513a9a8b15ba4953e\": container with ID starting with 759d3dc2aeb56e970f98b7076284e546298d7de8d4dfae7513a9a8b15ba4953e not found: ID does not exist" Dec 16 07:45:04 crc kubenswrapper[4823]: I1216 07:45:04.089655 4823 scope.go:117] "RemoveContainer" containerID="5ee7d29933c2a2ea66f654cc039a75bb11004487ba209de633fbebc5a9dd5554" Dec 16 07:45:04 crc kubenswrapper[4823]: E1216 07:45:04.089895 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ee7d29933c2a2ea66f654cc039a75bb11004487ba209de633fbebc5a9dd5554\": container with ID starting with 5ee7d29933c2a2ea66f654cc039a75bb11004487ba209de633fbebc5a9dd5554 not found: ID does not exist" containerID="5ee7d29933c2a2ea66f654cc039a75bb11004487ba209de633fbebc5a9dd5554" Dec 16 07:45:04 crc kubenswrapper[4823]: I1216 07:45:04.089917 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ee7d29933c2a2ea66f654cc039a75bb11004487ba209de633fbebc5a9dd5554"} err="failed to get container status \"5ee7d29933c2a2ea66f654cc039a75bb11004487ba209de633fbebc5a9dd5554\": rpc error: code = NotFound desc = could 
not find container \"5ee7d29933c2a2ea66f654cc039a75bb11004487ba209de633fbebc5a9dd5554\": container with ID starting with 5ee7d29933c2a2ea66f654cc039a75bb11004487ba209de633fbebc5a9dd5554 not found: ID does not exist" Dec 16 07:45:04 crc kubenswrapper[4823]: I1216 07:45:04.089934 4823 scope.go:117] "RemoveContainer" containerID="34866379d1d9e2c456d8900390654eafb9eaf1da211930499a9200e6fafa383e" Dec 16 07:45:04 crc kubenswrapper[4823]: E1216 07:45:04.090409 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34866379d1d9e2c456d8900390654eafb9eaf1da211930499a9200e6fafa383e\": container with ID starting with 34866379d1d9e2c456d8900390654eafb9eaf1da211930499a9200e6fafa383e not found: ID does not exist" containerID="34866379d1d9e2c456d8900390654eafb9eaf1da211930499a9200e6fafa383e" Dec 16 07:45:04 crc kubenswrapper[4823]: I1216 07:45:04.090440 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34866379d1d9e2c456d8900390654eafb9eaf1da211930499a9200e6fafa383e"} err="failed to get container status \"34866379d1d9e2c456d8900390654eafb9eaf1da211930499a9200e6fafa383e\": rpc error: code = NotFound desc = could not find container \"34866379d1d9e2c456d8900390654eafb9eaf1da211930499a9200e6fafa383e\": container with ID starting with 34866379d1d9e2c456d8900390654eafb9eaf1da211930499a9200e6fafa383e not found: ID does not exist" Dec 16 07:45:04 crc kubenswrapper[4823]: I1216 07:45:04.305562 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431140-v9vtd"] Dec 16 07:45:04 crc kubenswrapper[4823]: I1216 07:45:04.310599 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431140-v9vtd"] Dec 16 07:45:05 crc kubenswrapper[4823]: I1216 07:45:05.782200 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="28a504fe-f44c-4b99-a2a7-a09d7eb15017" path="/var/lib/kubelet/pods/28a504fe-f44c-4b99-a2a7-a09d7eb15017/volumes" Dec 16 07:45:05 crc kubenswrapper[4823]: I1216 07:45:05.783072 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb000cb4-8b05-4ee9-b6dd-c8797099232b" path="/var/lib/kubelet/pods/fb000cb4-8b05-4ee9-b6dd-c8797099232b/volumes" Dec 16 07:45:17 crc kubenswrapper[4823]: I1216 07:45:17.771701 4823 scope.go:117] "RemoveContainer" containerID="8122cdf24d33128bdb673858f354f06ec5dffb9a8e2b01561997db4f050635c7" Dec 16 07:45:17 crc kubenswrapper[4823]: E1216 07:45:17.772375 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 07:45:29 crc kubenswrapper[4823]: I1216 07:45:29.772317 4823 scope.go:117] "RemoveContainer" containerID="8122cdf24d33128bdb673858f354f06ec5dffb9a8e2b01561997db4f050635c7" Dec 16 07:45:29 crc kubenswrapper[4823]: E1216 07:45:29.773280 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 07:45:40 crc kubenswrapper[4823]: I1216 07:45:40.700184 4823 scope.go:117] "RemoveContainer" containerID="12be26a45e3fc2e61b7ddf0faab639b15d425610e1a28f68a3617f4f7dfa5a3a" Dec 16 07:45:40 crc kubenswrapper[4823]: I1216 07:45:40.786910 4823 scope.go:117] "RemoveContainer" 
containerID="8122cdf24d33128bdb673858f354f06ec5dffb9a8e2b01561997db4f050635c7" Dec 16 07:45:40 crc kubenswrapper[4823]: E1216 07:45:40.787589 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 07:45:51 crc kubenswrapper[4823]: I1216 07:45:51.776248 4823 scope.go:117] "RemoveContainer" containerID="8122cdf24d33128bdb673858f354f06ec5dffb9a8e2b01561997db4f050635c7" Dec 16 07:45:51 crc kubenswrapper[4823]: E1216 07:45:51.776784 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 07:46:05 crc kubenswrapper[4823]: I1216 07:46:05.772131 4823 scope.go:117] "RemoveContainer" containerID="8122cdf24d33128bdb673858f354f06ec5dffb9a8e2b01561997db4f050635c7" Dec 16 07:46:05 crc kubenswrapper[4823]: E1216 07:46:05.772822 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 07:46:20 crc kubenswrapper[4823]: I1216 07:46:20.772335 4823 scope.go:117] 
"RemoveContainer" containerID="8122cdf24d33128bdb673858f354f06ec5dffb9a8e2b01561997db4f050635c7" Dec 16 07:46:20 crc kubenswrapper[4823]: E1216 07:46:20.773004 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 07:46:35 crc kubenswrapper[4823]: I1216 07:46:35.772138 4823 scope.go:117] "RemoveContainer" containerID="8122cdf24d33128bdb673858f354f06ec5dffb9a8e2b01561997db4f050635c7" Dec 16 07:46:35 crc kubenswrapper[4823]: E1216 07:46:35.772947 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 07:46:48 crc kubenswrapper[4823]: I1216 07:46:48.771773 4823 scope.go:117] "RemoveContainer" containerID="8122cdf24d33128bdb673858f354f06ec5dffb9a8e2b01561997db4f050635c7" Dec 16 07:46:48 crc kubenswrapper[4823]: E1216 07:46:48.772658 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 07:47:01 crc kubenswrapper[4823]: I1216 07:47:01.775551 
4823 scope.go:117] "RemoveContainer" containerID="8122cdf24d33128bdb673858f354f06ec5dffb9a8e2b01561997db4f050635c7" Dec 16 07:47:01 crc kubenswrapper[4823]: E1216 07:47:01.776327 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 07:47:15 crc kubenswrapper[4823]: I1216 07:47:15.771850 4823 scope.go:117] "RemoveContainer" containerID="8122cdf24d33128bdb673858f354f06ec5dffb9a8e2b01561997db4f050635c7" Dec 16 07:47:15 crc kubenswrapper[4823]: E1216 07:47:15.772346 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 07:47:29 crc kubenswrapper[4823]: I1216 07:47:29.772351 4823 scope.go:117] "RemoveContainer" containerID="8122cdf24d33128bdb673858f354f06ec5dffb9a8e2b01561997db4f050635c7" Dec 16 07:47:29 crc kubenswrapper[4823]: E1216 07:47:29.773279 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 07:47:42 crc kubenswrapper[4823]: I1216 
07:47:42.772138 4823 scope.go:117] "RemoveContainer" containerID="8122cdf24d33128bdb673858f354f06ec5dffb9a8e2b01561997db4f050635c7" Dec 16 07:47:42 crc kubenswrapper[4823]: E1216 07:47:42.772850 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 07:47:56 crc kubenswrapper[4823]: I1216 07:47:56.772179 4823 scope.go:117] "RemoveContainer" containerID="8122cdf24d33128bdb673858f354f06ec5dffb9a8e2b01561997db4f050635c7" Dec 16 07:47:56 crc kubenswrapper[4823]: E1216 07:47:56.772943 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 07:48:09 crc kubenswrapper[4823]: I1216 07:48:09.772371 4823 scope.go:117] "RemoveContainer" containerID="8122cdf24d33128bdb673858f354f06ec5dffb9a8e2b01561997db4f050635c7" Dec 16 07:48:09 crc kubenswrapper[4823]: E1216 07:48:09.773310 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 07:48:20 crc 
kubenswrapper[4823]: I1216 07:48:20.772496 4823 scope.go:117] "RemoveContainer" containerID="8122cdf24d33128bdb673858f354f06ec5dffb9a8e2b01561997db4f050635c7" Dec 16 07:48:20 crc kubenswrapper[4823]: E1216 07:48:20.773661 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 07:48:35 crc kubenswrapper[4823]: I1216 07:48:35.772838 4823 scope.go:117] "RemoveContainer" containerID="8122cdf24d33128bdb673858f354f06ec5dffb9a8e2b01561997db4f050635c7" Dec 16 07:48:35 crc kubenswrapper[4823]: E1216 07:48:35.774151 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 07:48:49 crc kubenswrapper[4823]: I1216 07:48:49.772466 4823 scope.go:117] "RemoveContainer" containerID="8122cdf24d33128bdb673858f354f06ec5dffb9a8e2b01561997db4f050635c7" Dec 16 07:48:49 crc kubenswrapper[4823]: E1216 07:48:49.773669 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 
16 07:49:01 crc kubenswrapper[4823]: I1216 07:49:01.779696 4823 scope.go:117] "RemoveContainer" containerID="8122cdf24d33128bdb673858f354f06ec5dffb9a8e2b01561997db4f050635c7" Dec 16 07:49:02 crc kubenswrapper[4823]: I1216 07:49:02.884209 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" event={"ID":"25dec47c-3043-486c-b371-2be103c214e3","Type":"ContainerStarted","Data":"6fb65469102dc2763b9ed51e6c572bbdf3d8b391bede2d8bd15a21f31bcf64a9"} Dec 16 07:51:28 crc kubenswrapper[4823]: I1216 07:51:28.134703 4823 patch_prober.go:28] interesting pod/machine-config-daemon-fv56f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 07:51:28 crc kubenswrapper[4823]: I1216 07:51:28.135612 4823 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 07:51:47 crc kubenswrapper[4823]: I1216 07:51:47.769133 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-9fr4v"] Dec 16 07:51:47 crc kubenswrapper[4823]: E1216 07:51:47.770141 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28a504fe-f44c-4b99-a2a7-a09d7eb15017" containerName="extract-content" Dec 16 07:51:47 crc kubenswrapper[4823]: I1216 07:51:47.770157 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="28a504fe-f44c-4b99-a2a7-a09d7eb15017" containerName="extract-content" Dec 16 07:51:47 crc kubenswrapper[4823]: E1216 07:51:47.770187 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28a504fe-f44c-4b99-a2a7-a09d7eb15017" 
containerName="registry-server" Dec 16 07:51:47 crc kubenswrapper[4823]: I1216 07:51:47.770195 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="28a504fe-f44c-4b99-a2a7-a09d7eb15017" containerName="registry-server" Dec 16 07:51:47 crc kubenswrapper[4823]: E1216 07:51:47.770214 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28a504fe-f44c-4b99-a2a7-a09d7eb15017" containerName="extract-utilities" Dec 16 07:51:47 crc kubenswrapper[4823]: I1216 07:51:47.770223 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="28a504fe-f44c-4b99-a2a7-a09d7eb15017" containerName="extract-utilities" Dec 16 07:51:47 crc kubenswrapper[4823]: E1216 07:51:47.770237 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd17a84c-a98c-4f2b-ae73-32d888461931" containerName="collect-profiles" Dec 16 07:51:47 crc kubenswrapper[4823]: I1216 07:51:47.770245 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd17a84c-a98c-4f2b-ae73-32d888461931" containerName="collect-profiles" Dec 16 07:51:47 crc kubenswrapper[4823]: I1216 07:51:47.770423 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd17a84c-a98c-4f2b-ae73-32d888461931" containerName="collect-profiles" Dec 16 07:51:47 crc kubenswrapper[4823]: I1216 07:51:47.770440 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="28a504fe-f44c-4b99-a2a7-a09d7eb15017" containerName="registry-server" Dec 16 07:51:47 crc kubenswrapper[4823]: I1216 07:51:47.771702 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9fr4v" Dec 16 07:51:47 crc kubenswrapper[4823]: I1216 07:51:47.791612 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9fr4v"] Dec 16 07:51:47 crc kubenswrapper[4823]: I1216 07:51:47.952754 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/90e3e315-6fe8-40aa-bf24-2d34c1d7494c-utilities\") pod \"certified-operators-9fr4v\" (UID: \"90e3e315-6fe8-40aa-bf24-2d34c1d7494c\") " pod="openshift-marketplace/certified-operators-9fr4v" Dec 16 07:51:47 crc kubenswrapper[4823]: I1216 07:51:47.953301 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/90e3e315-6fe8-40aa-bf24-2d34c1d7494c-catalog-content\") pod \"certified-operators-9fr4v\" (UID: \"90e3e315-6fe8-40aa-bf24-2d34c1d7494c\") " pod="openshift-marketplace/certified-operators-9fr4v" Dec 16 07:51:47 crc kubenswrapper[4823]: I1216 07:51:47.953374 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwfqj\" (UniqueName: \"kubernetes.io/projected/90e3e315-6fe8-40aa-bf24-2d34c1d7494c-kube-api-access-nwfqj\") pod \"certified-operators-9fr4v\" (UID: \"90e3e315-6fe8-40aa-bf24-2d34c1d7494c\") " pod="openshift-marketplace/certified-operators-9fr4v" Dec 16 07:51:48 crc kubenswrapper[4823]: I1216 07:51:48.055252 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwfqj\" (UniqueName: \"kubernetes.io/projected/90e3e315-6fe8-40aa-bf24-2d34c1d7494c-kube-api-access-nwfqj\") pod \"certified-operators-9fr4v\" (UID: \"90e3e315-6fe8-40aa-bf24-2d34c1d7494c\") " pod="openshift-marketplace/certified-operators-9fr4v" Dec 16 07:51:48 crc kubenswrapper[4823]: I1216 07:51:48.055395 4823 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/90e3e315-6fe8-40aa-bf24-2d34c1d7494c-utilities\") pod \"certified-operators-9fr4v\" (UID: \"90e3e315-6fe8-40aa-bf24-2d34c1d7494c\") " pod="openshift-marketplace/certified-operators-9fr4v" Dec 16 07:51:48 crc kubenswrapper[4823]: I1216 07:51:48.055434 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/90e3e315-6fe8-40aa-bf24-2d34c1d7494c-catalog-content\") pod \"certified-operators-9fr4v\" (UID: \"90e3e315-6fe8-40aa-bf24-2d34c1d7494c\") " pod="openshift-marketplace/certified-operators-9fr4v" Dec 16 07:51:48 crc kubenswrapper[4823]: I1216 07:51:48.056000 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/90e3e315-6fe8-40aa-bf24-2d34c1d7494c-utilities\") pod \"certified-operators-9fr4v\" (UID: \"90e3e315-6fe8-40aa-bf24-2d34c1d7494c\") " pod="openshift-marketplace/certified-operators-9fr4v" Dec 16 07:51:48 crc kubenswrapper[4823]: I1216 07:51:48.056095 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/90e3e315-6fe8-40aa-bf24-2d34c1d7494c-catalog-content\") pod \"certified-operators-9fr4v\" (UID: \"90e3e315-6fe8-40aa-bf24-2d34c1d7494c\") " pod="openshift-marketplace/certified-operators-9fr4v" Dec 16 07:51:48 crc kubenswrapper[4823]: I1216 07:51:48.081369 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwfqj\" (UniqueName: \"kubernetes.io/projected/90e3e315-6fe8-40aa-bf24-2d34c1d7494c-kube-api-access-nwfqj\") pod \"certified-operators-9fr4v\" (UID: \"90e3e315-6fe8-40aa-bf24-2d34c1d7494c\") " pod="openshift-marketplace/certified-operators-9fr4v" Dec 16 07:51:48 crc kubenswrapper[4823]: I1216 07:51:48.096918 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9fr4v" Dec 16 07:51:48 crc kubenswrapper[4823]: I1216 07:51:48.561605 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9fr4v"] Dec 16 07:51:49 crc kubenswrapper[4823]: I1216 07:51:49.243530 4823 generic.go:334] "Generic (PLEG): container finished" podID="90e3e315-6fe8-40aa-bf24-2d34c1d7494c" containerID="a102393ec4a38b069b8395eef63b3088287945216870aa45c5ff2fd1ff7cb630" exitCode=0 Dec 16 07:51:49 crc kubenswrapper[4823]: I1216 07:51:49.243602 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9fr4v" event={"ID":"90e3e315-6fe8-40aa-bf24-2d34c1d7494c","Type":"ContainerDied","Data":"a102393ec4a38b069b8395eef63b3088287945216870aa45c5ff2fd1ff7cb630"} Dec 16 07:51:49 crc kubenswrapper[4823]: I1216 07:51:49.243889 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9fr4v" event={"ID":"90e3e315-6fe8-40aa-bf24-2d34c1d7494c","Type":"ContainerStarted","Data":"cb0210d639de5d85c054b8c2de3b1a4bf9e7563266ace3cb167fd9ae8f92bc23"} Dec 16 07:51:49 crc kubenswrapper[4823]: I1216 07:51:49.250066 4823 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 16 07:51:50 crc kubenswrapper[4823]: I1216 07:51:50.254363 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9fr4v" event={"ID":"90e3e315-6fe8-40aa-bf24-2d34c1d7494c","Type":"ContainerStarted","Data":"afda43cf92989c2a1a080700a613b8d47248c5dd55f89fe5ec2edf6233a3f66b"} Dec 16 07:51:51 crc kubenswrapper[4823]: I1216 07:51:51.264508 4823 generic.go:334] "Generic (PLEG): container finished" podID="90e3e315-6fe8-40aa-bf24-2d34c1d7494c" containerID="afda43cf92989c2a1a080700a613b8d47248c5dd55f89fe5ec2edf6233a3f66b" exitCode=0 Dec 16 07:51:51 crc kubenswrapper[4823]: I1216 07:51:51.264636 4823 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/certified-operators-9fr4v" event={"ID":"90e3e315-6fe8-40aa-bf24-2d34c1d7494c","Type":"ContainerDied","Data":"afda43cf92989c2a1a080700a613b8d47248c5dd55f89fe5ec2edf6233a3f66b"} Dec 16 07:51:52 crc kubenswrapper[4823]: I1216 07:51:52.274310 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9fr4v" event={"ID":"90e3e315-6fe8-40aa-bf24-2d34c1d7494c","Type":"ContainerStarted","Data":"cd724e04d50d235f8e164732400d6d3feaba051215f63899380b8b2dbff32cfb"} Dec 16 07:51:52 crc kubenswrapper[4823]: I1216 07:51:52.293960 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-9fr4v" podStartSLOduration=2.829989775 podStartE2EDuration="5.293938832s" podCreationTimestamp="2025-12-16 07:51:47 +0000 UTC" firstStartedPulling="2025-12-16 07:51:49.24962935 +0000 UTC m=+3387.738195483" lastFinishedPulling="2025-12-16 07:51:51.713578417 +0000 UTC m=+3390.202144540" observedRunningTime="2025-12-16 07:51:52.288524452 +0000 UTC m=+3390.777090575" watchObservedRunningTime="2025-12-16 07:51:52.293938832 +0000 UTC m=+3390.782504955" Dec 16 07:51:54 crc kubenswrapper[4823]: I1216 07:51:54.154392 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-ntq9f"] Dec 16 07:51:54 crc kubenswrapper[4823]: I1216 07:51:54.155881 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ntq9f" Dec 16 07:51:54 crc kubenswrapper[4823]: I1216 07:51:54.166715 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ntq9f"] Dec 16 07:51:54 crc kubenswrapper[4823]: I1216 07:51:54.254340 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9t87\" (UniqueName: \"kubernetes.io/projected/54fff1b4-dbb7-4b3b-b099-6bdf2065f4c7-kube-api-access-g9t87\") pod \"redhat-operators-ntq9f\" (UID: \"54fff1b4-dbb7-4b3b-b099-6bdf2065f4c7\") " pod="openshift-marketplace/redhat-operators-ntq9f" Dec 16 07:51:54 crc kubenswrapper[4823]: I1216 07:51:54.254510 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54fff1b4-dbb7-4b3b-b099-6bdf2065f4c7-utilities\") pod \"redhat-operators-ntq9f\" (UID: \"54fff1b4-dbb7-4b3b-b099-6bdf2065f4c7\") " pod="openshift-marketplace/redhat-operators-ntq9f" Dec 16 07:51:54 crc kubenswrapper[4823]: I1216 07:51:54.254814 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54fff1b4-dbb7-4b3b-b099-6bdf2065f4c7-catalog-content\") pod \"redhat-operators-ntq9f\" (UID: \"54fff1b4-dbb7-4b3b-b099-6bdf2065f4c7\") " pod="openshift-marketplace/redhat-operators-ntq9f" Dec 16 07:51:54 crc kubenswrapper[4823]: I1216 07:51:54.356544 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54fff1b4-dbb7-4b3b-b099-6bdf2065f4c7-catalog-content\") pod \"redhat-operators-ntq9f\" (UID: \"54fff1b4-dbb7-4b3b-b099-6bdf2065f4c7\") " pod="openshift-marketplace/redhat-operators-ntq9f" Dec 16 07:51:54 crc kubenswrapper[4823]: I1216 07:51:54.356621 4823 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-g9t87\" (UniqueName: \"kubernetes.io/projected/54fff1b4-dbb7-4b3b-b099-6bdf2065f4c7-kube-api-access-g9t87\") pod \"redhat-operators-ntq9f\" (UID: \"54fff1b4-dbb7-4b3b-b099-6bdf2065f4c7\") " pod="openshift-marketplace/redhat-operators-ntq9f" Dec 16 07:51:54 crc kubenswrapper[4823]: I1216 07:51:54.356680 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54fff1b4-dbb7-4b3b-b099-6bdf2065f4c7-utilities\") pod \"redhat-operators-ntq9f\" (UID: \"54fff1b4-dbb7-4b3b-b099-6bdf2065f4c7\") " pod="openshift-marketplace/redhat-operators-ntq9f" Dec 16 07:51:54 crc kubenswrapper[4823]: I1216 07:51:54.357171 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54fff1b4-dbb7-4b3b-b099-6bdf2065f4c7-catalog-content\") pod \"redhat-operators-ntq9f\" (UID: \"54fff1b4-dbb7-4b3b-b099-6bdf2065f4c7\") " pod="openshift-marketplace/redhat-operators-ntq9f" Dec 16 07:51:54 crc kubenswrapper[4823]: I1216 07:51:54.357293 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54fff1b4-dbb7-4b3b-b099-6bdf2065f4c7-utilities\") pod \"redhat-operators-ntq9f\" (UID: \"54fff1b4-dbb7-4b3b-b099-6bdf2065f4c7\") " pod="openshift-marketplace/redhat-operators-ntq9f" Dec 16 07:51:54 crc kubenswrapper[4823]: I1216 07:51:54.384634 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9t87\" (UniqueName: \"kubernetes.io/projected/54fff1b4-dbb7-4b3b-b099-6bdf2065f4c7-kube-api-access-g9t87\") pod \"redhat-operators-ntq9f\" (UID: \"54fff1b4-dbb7-4b3b-b099-6bdf2065f4c7\") " pod="openshift-marketplace/redhat-operators-ntq9f" Dec 16 07:51:54 crc kubenswrapper[4823]: I1216 07:51:54.492774 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ntq9f" Dec 16 07:51:54 crc kubenswrapper[4823]: I1216 07:51:54.919258 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ntq9f"] Dec 16 07:51:54 crc kubenswrapper[4823]: W1216 07:51:54.928139 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod54fff1b4_dbb7_4b3b_b099_6bdf2065f4c7.slice/crio-259cbf8b8a8e71234fc0fea86367d7de8005e9e81b865d710828addd52ee9a3e WatchSource:0}: Error finding container 259cbf8b8a8e71234fc0fea86367d7de8005e9e81b865d710828addd52ee9a3e: Status 404 returned error can't find the container with id 259cbf8b8a8e71234fc0fea86367d7de8005e9e81b865d710828addd52ee9a3e Dec 16 07:51:55 crc kubenswrapper[4823]: I1216 07:51:55.295874 4823 generic.go:334] "Generic (PLEG): container finished" podID="54fff1b4-dbb7-4b3b-b099-6bdf2065f4c7" containerID="418869e909a5bb32fefcf65800278ffb791024d89a95d281e820bf4c3fa0e40d" exitCode=0 Dec 16 07:51:55 crc kubenswrapper[4823]: I1216 07:51:55.295930 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ntq9f" event={"ID":"54fff1b4-dbb7-4b3b-b099-6bdf2065f4c7","Type":"ContainerDied","Data":"418869e909a5bb32fefcf65800278ffb791024d89a95d281e820bf4c3fa0e40d"} Dec 16 07:51:55 crc kubenswrapper[4823]: I1216 07:51:55.295956 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ntq9f" event={"ID":"54fff1b4-dbb7-4b3b-b099-6bdf2065f4c7","Type":"ContainerStarted","Data":"259cbf8b8a8e71234fc0fea86367d7de8005e9e81b865d710828addd52ee9a3e"} Dec 16 07:51:57 crc kubenswrapper[4823]: I1216 07:51:57.318256 4823 generic.go:334] "Generic (PLEG): container finished" podID="54fff1b4-dbb7-4b3b-b099-6bdf2065f4c7" containerID="370a079cba701c88c6b8cecc56bc99c61e47c8b7c9b4c90db19f3c49511f541f" exitCode=0 Dec 16 07:51:57 crc kubenswrapper[4823]: I1216 07:51:57.318396 
4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ntq9f" event={"ID":"54fff1b4-dbb7-4b3b-b099-6bdf2065f4c7","Type":"ContainerDied","Data":"370a079cba701c88c6b8cecc56bc99c61e47c8b7c9b4c90db19f3c49511f541f"} Dec 16 07:51:58 crc kubenswrapper[4823]: I1216 07:51:58.097735 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-9fr4v" Dec 16 07:51:58 crc kubenswrapper[4823]: I1216 07:51:58.097789 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-9fr4v" Dec 16 07:51:58 crc kubenswrapper[4823]: I1216 07:51:58.134001 4823 patch_prober.go:28] interesting pod/machine-config-daemon-fv56f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 07:51:58 crc kubenswrapper[4823]: I1216 07:51:58.134093 4823 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 07:51:58 crc kubenswrapper[4823]: I1216 07:51:58.146735 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-9fr4v" Dec 16 07:51:58 crc kubenswrapper[4823]: I1216 07:51:58.354085 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ntq9f" event={"ID":"54fff1b4-dbb7-4b3b-b099-6bdf2065f4c7","Type":"ContainerStarted","Data":"1dd3283c784ec5cd6ff1f67918745f056022f2f5c6f572982d0be9a0148f6d41"} Dec 16 07:51:58 crc kubenswrapper[4823]: I1216 07:51:58.405731 4823 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-marketplace/redhat-operators-ntq9f" podStartSLOduration=1.911571428 podStartE2EDuration="4.405712432s" podCreationTimestamp="2025-12-16 07:51:54 +0000 UTC" firstStartedPulling="2025-12-16 07:51:55.297372572 +0000 UTC m=+3393.785938695" lastFinishedPulling="2025-12-16 07:51:57.791513576 +0000 UTC m=+3396.280079699" observedRunningTime="2025-12-16 07:51:58.402066707 +0000 UTC m=+3396.890632830" watchObservedRunningTime="2025-12-16 07:51:58.405712432 +0000 UTC m=+3396.894278575" Dec 16 07:51:58 crc kubenswrapper[4823]: I1216 07:51:58.432706 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-9fr4v" Dec 16 07:52:00 crc kubenswrapper[4823]: I1216 07:52:00.542184 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9fr4v"] Dec 16 07:52:00 crc kubenswrapper[4823]: I1216 07:52:00.542732 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-9fr4v" podUID="90e3e315-6fe8-40aa-bf24-2d34c1d7494c" containerName="registry-server" containerID="cri-o://cd724e04d50d235f8e164732400d6d3feaba051215f63899380b8b2dbff32cfb" gracePeriod=2 Dec 16 07:52:01 crc kubenswrapper[4823]: I1216 07:52:01.378418 4823 generic.go:334] "Generic (PLEG): container finished" podID="90e3e315-6fe8-40aa-bf24-2d34c1d7494c" containerID="cd724e04d50d235f8e164732400d6d3feaba051215f63899380b8b2dbff32cfb" exitCode=0 Dec 16 07:52:01 crc kubenswrapper[4823]: I1216 07:52:01.378464 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9fr4v" event={"ID":"90e3e315-6fe8-40aa-bf24-2d34c1d7494c","Type":"ContainerDied","Data":"cd724e04d50d235f8e164732400d6d3feaba051215f63899380b8b2dbff32cfb"} Dec 16 07:52:02 crc kubenswrapper[4823]: I1216 07:52:02.095180 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9fr4v" Dec 16 07:52:02 crc kubenswrapper[4823]: I1216 07:52:02.180735 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/90e3e315-6fe8-40aa-bf24-2d34c1d7494c-catalog-content\") pod \"90e3e315-6fe8-40aa-bf24-2d34c1d7494c\" (UID: \"90e3e315-6fe8-40aa-bf24-2d34c1d7494c\") " Dec 16 07:52:02 crc kubenswrapper[4823]: I1216 07:52:02.180876 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/90e3e315-6fe8-40aa-bf24-2d34c1d7494c-utilities\") pod \"90e3e315-6fe8-40aa-bf24-2d34c1d7494c\" (UID: \"90e3e315-6fe8-40aa-bf24-2d34c1d7494c\") " Dec 16 07:52:02 crc kubenswrapper[4823]: I1216 07:52:02.180914 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nwfqj\" (UniqueName: \"kubernetes.io/projected/90e3e315-6fe8-40aa-bf24-2d34c1d7494c-kube-api-access-nwfqj\") pod \"90e3e315-6fe8-40aa-bf24-2d34c1d7494c\" (UID: \"90e3e315-6fe8-40aa-bf24-2d34c1d7494c\") " Dec 16 07:52:02 crc kubenswrapper[4823]: I1216 07:52:02.182052 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/90e3e315-6fe8-40aa-bf24-2d34c1d7494c-utilities" (OuterVolumeSpecName: "utilities") pod "90e3e315-6fe8-40aa-bf24-2d34c1d7494c" (UID: "90e3e315-6fe8-40aa-bf24-2d34c1d7494c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:52:02 crc kubenswrapper[4823]: I1216 07:52:02.192252 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90e3e315-6fe8-40aa-bf24-2d34c1d7494c-kube-api-access-nwfqj" (OuterVolumeSpecName: "kube-api-access-nwfqj") pod "90e3e315-6fe8-40aa-bf24-2d34c1d7494c" (UID: "90e3e315-6fe8-40aa-bf24-2d34c1d7494c"). InnerVolumeSpecName "kube-api-access-nwfqj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:52:02 crc kubenswrapper[4823]: I1216 07:52:02.232493 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/90e3e315-6fe8-40aa-bf24-2d34c1d7494c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "90e3e315-6fe8-40aa-bf24-2d34c1d7494c" (UID: "90e3e315-6fe8-40aa-bf24-2d34c1d7494c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:52:02 crc kubenswrapper[4823]: I1216 07:52:02.282694 4823 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/90e3e315-6fe8-40aa-bf24-2d34c1d7494c-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 07:52:02 crc kubenswrapper[4823]: I1216 07:52:02.282740 4823 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/90e3e315-6fe8-40aa-bf24-2d34c1d7494c-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 07:52:02 crc kubenswrapper[4823]: I1216 07:52:02.282756 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nwfqj\" (UniqueName: \"kubernetes.io/projected/90e3e315-6fe8-40aa-bf24-2d34c1d7494c-kube-api-access-nwfqj\") on node \"crc\" DevicePath \"\"" Dec 16 07:52:02 crc kubenswrapper[4823]: I1216 07:52:02.391876 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9fr4v" event={"ID":"90e3e315-6fe8-40aa-bf24-2d34c1d7494c","Type":"ContainerDied","Data":"cb0210d639de5d85c054b8c2de3b1a4bf9e7563266ace3cb167fd9ae8f92bc23"} Dec 16 07:52:02 crc kubenswrapper[4823]: I1216 07:52:02.391962 4823 scope.go:117] "RemoveContainer" containerID="cd724e04d50d235f8e164732400d6d3feaba051215f63899380b8b2dbff32cfb" Dec 16 07:52:02 crc kubenswrapper[4823]: I1216 07:52:02.392592 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9fr4v" Dec 16 07:52:02 crc kubenswrapper[4823]: I1216 07:52:02.437043 4823 scope.go:117] "RemoveContainer" containerID="afda43cf92989c2a1a080700a613b8d47248c5dd55f89fe5ec2edf6233a3f66b" Dec 16 07:52:02 crc kubenswrapper[4823]: I1216 07:52:02.438646 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9fr4v"] Dec 16 07:52:02 crc kubenswrapper[4823]: I1216 07:52:02.447315 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-9fr4v"] Dec 16 07:52:02 crc kubenswrapper[4823]: I1216 07:52:02.455112 4823 scope.go:117] "RemoveContainer" containerID="a102393ec4a38b069b8395eef63b3088287945216870aa45c5ff2fd1ff7cb630" Dec 16 07:52:03 crc kubenswrapper[4823]: I1216 07:52:03.783012 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90e3e315-6fe8-40aa-bf24-2d34c1d7494c" path="/var/lib/kubelet/pods/90e3e315-6fe8-40aa-bf24-2d34c1d7494c/volumes" Dec 16 07:52:04 crc kubenswrapper[4823]: I1216 07:52:04.493376 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-ntq9f" Dec 16 07:52:04 crc kubenswrapper[4823]: I1216 07:52:04.493463 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-ntq9f" Dec 16 07:52:04 crc kubenswrapper[4823]: I1216 07:52:04.531005 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-ntq9f" Dec 16 07:52:05 crc kubenswrapper[4823]: I1216 07:52:05.465514 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-ntq9f" Dec 16 07:52:06 crc kubenswrapper[4823]: I1216 07:52:06.144597 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ntq9f"] Dec 16 07:52:07 crc kubenswrapper[4823]: I1216 
07:52:07.427348 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-ntq9f" podUID="54fff1b4-dbb7-4b3b-b099-6bdf2065f4c7" containerName="registry-server" containerID="cri-o://1dd3283c784ec5cd6ff1f67918745f056022f2f5c6f572982d0be9a0148f6d41" gracePeriod=2 Dec 16 07:52:10 crc kubenswrapper[4823]: I1216 07:52:10.454372 4823 generic.go:334] "Generic (PLEG): container finished" podID="54fff1b4-dbb7-4b3b-b099-6bdf2065f4c7" containerID="1dd3283c784ec5cd6ff1f67918745f056022f2f5c6f572982d0be9a0148f6d41" exitCode=0 Dec 16 07:52:10 crc kubenswrapper[4823]: I1216 07:52:10.454461 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ntq9f" event={"ID":"54fff1b4-dbb7-4b3b-b099-6bdf2065f4c7","Type":"ContainerDied","Data":"1dd3283c784ec5cd6ff1f67918745f056022f2f5c6f572982d0be9a0148f6d41"} Dec 16 07:52:10 crc kubenswrapper[4823]: I1216 07:52:10.454967 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ntq9f" event={"ID":"54fff1b4-dbb7-4b3b-b099-6bdf2065f4c7","Type":"ContainerDied","Data":"259cbf8b8a8e71234fc0fea86367d7de8005e9e81b865d710828addd52ee9a3e"} Dec 16 07:52:10 crc kubenswrapper[4823]: I1216 07:52:10.454983 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="259cbf8b8a8e71234fc0fea86367d7de8005e9e81b865d710828addd52ee9a3e" Dec 16 07:52:10 crc kubenswrapper[4823]: I1216 07:52:10.478009 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ntq9f" Dec 16 07:52:10 crc kubenswrapper[4823]: I1216 07:52:10.599801 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g9t87\" (UniqueName: \"kubernetes.io/projected/54fff1b4-dbb7-4b3b-b099-6bdf2065f4c7-kube-api-access-g9t87\") pod \"54fff1b4-dbb7-4b3b-b099-6bdf2065f4c7\" (UID: \"54fff1b4-dbb7-4b3b-b099-6bdf2065f4c7\") " Dec 16 07:52:10 crc kubenswrapper[4823]: I1216 07:52:10.600989 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54fff1b4-dbb7-4b3b-b099-6bdf2065f4c7-utilities\") pod \"54fff1b4-dbb7-4b3b-b099-6bdf2065f4c7\" (UID: \"54fff1b4-dbb7-4b3b-b099-6bdf2065f4c7\") " Dec 16 07:52:10 crc kubenswrapper[4823]: I1216 07:52:10.601189 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54fff1b4-dbb7-4b3b-b099-6bdf2065f4c7-catalog-content\") pod \"54fff1b4-dbb7-4b3b-b099-6bdf2065f4c7\" (UID: \"54fff1b4-dbb7-4b3b-b099-6bdf2065f4c7\") " Dec 16 07:52:10 crc kubenswrapper[4823]: I1216 07:52:10.601764 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54fff1b4-dbb7-4b3b-b099-6bdf2065f4c7-utilities" (OuterVolumeSpecName: "utilities") pod "54fff1b4-dbb7-4b3b-b099-6bdf2065f4c7" (UID: "54fff1b4-dbb7-4b3b-b099-6bdf2065f4c7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:52:10 crc kubenswrapper[4823]: I1216 07:52:10.607277 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54fff1b4-dbb7-4b3b-b099-6bdf2065f4c7-kube-api-access-g9t87" (OuterVolumeSpecName: "kube-api-access-g9t87") pod "54fff1b4-dbb7-4b3b-b099-6bdf2065f4c7" (UID: "54fff1b4-dbb7-4b3b-b099-6bdf2065f4c7"). InnerVolumeSpecName "kube-api-access-g9t87". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:52:10 crc kubenswrapper[4823]: I1216 07:52:10.703002 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g9t87\" (UniqueName: \"kubernetes.io/projected/54fff1b4-dbb7-4b3b-b099-6bdf2065f4c7-kube-api-access-g9t87\") on node \"crc\" DevicePath \"\"" Dec 16 07:52:10 crc kubenswrapper[4823]: I1216 07:52:10.703275 4823 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54fff1b4-dbb7-4b3b-b099-6bdf2065f4c7-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 07:52:10 crc kubenswrapper[4823]: I1216 07:52:10.726654 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54fff1b4-dbb7-4b3b-b099-6bdf2065f4c7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "54fff1b4-dbb7-4b3b-b099-6bdf2065f4c7" (UID: "54fff1b4-dbb7-4b3b-b099-6bdf2065f4c7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:52:10 crc kubenswrapper[4823]: I1216 07:52:10.804175 4823 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54fff1b4-dbb7-4b3b-b099-6bdf2065f4c7-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 07:52:11 crc kubenswrapper[4823]: I1216 07:52:11.461713 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ntq9f" Dec 16 07:52:11 crc kubenswrapper[4823]: I1216 07:52:11.498753 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ntq9f"] Dec 16 07:52:11 crc kubenswrapper[4823]: I1216 07:52:11.508673 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-ntq9f"] Dec 16 07:52:11 crc kubenswrapper[4823]: I1216 07:52:11.781791 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54fff1b4-dbb7-4b3b-b099-6bdf2065f4c7" path="/var/lib/kubelet/pods/54fff1b4-dbb7-4b3b-b099-6bdf2065f4c7/volumes" Dec 16 07:52:28 crc kubenswrapper[4823]: I1216 07:52:28.133957 4823 patch_prober.go:28] interesting pod/machine-config-daemon-fv56f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 07:52:28 crc kubenswrapper[4823]: I1216 07:52:28.134870 4823 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 07:52:28 crc kubenswrapper[4823]: I1216 07:52:28.134935 4823 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" Dec 16 07:52:28 crc kubenswrapper[4823]: I1216 07:52:28.135956 4823 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6fb65469102dc2763b9ed51e6c572bbdf3d8b391bede2d8bd15a21f31bcf64a9"} pod="openshift-machine-config-operator/machine-config-daemon-fv56f" containerMessage="Container 
machine-config-daemon failed liveness probe, will be restarted" Dec 16 07:52:28 crc kubenswrapper[4823]: I1216 07:52:28.136076 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" containerName="machine-config-daemon" containerID="cri-o://6fb65469102dc2763b9ed51e6c572bbdf3d8b391bede2d8bd15a21f31bcf64a9" gracePeriod=600 Dec 16 07:52:28 crc kubenswrapper[4823]: I1216 07:52:28.763911 4823 generic.go:334] "Generic (PLEG): container finished" podID="25dec47c-3043-486c-b371-2be103c214e3" containerID="6fb65469102dc2763b9ed51e6c572bbdf3d8b391bede2d8bd15a21f31bcf64a9" exitCode=0 Dec 16 07:52:28 crc kubenswrapper[4823]: I1216 07:52:28.763956 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" event={"ID":"25dec47c-3043-486c-b371-2be103c214e3","Type":"ContainerDied","Data":"6fb65469102dc2763b9ed51e6c572bbdf3d8b391bede2d8bd15a21f31bcf64a9"} Dec 16 07:52:28 crc kubenswrapper[4823]: I1216 07:52:28.764664 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" event={"ID":"25dec47c-3043-486c-b371-2be103c214e3","Type":"ContainerStarted","Data":"ce2cc9ad0ea70d189504e80b006c88ca5902658469d84ed52b52b4f88ad839eb"} Dec 16 07:52:28 crc kubenswrapper[4823]: I1216 07:52:28.764754 4823 scope.go:117] "RemoveContainer" containerID="8122cdf24d33128bdb673858f354f06ec5dffb9a8e2b01561997db4f050635c7" Dec 16 07:53:01 crc kubenswrapper[4823]: I1216 07:53:01.118468 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-gtfq8"] Dec 16 07:53:01 crc kubenswrapper[4823]: E1216 07:53:01.119513 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54fff1b4-dbb7-4b3b-b099-6bdf2065f4c7" containerName="extract-utilities" Dec 16 07:53:01 crc kubenswrapper[4823]: I1216 
07:53:01.119532 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="54fff1b4-dbb7-4b3b-b099-6bdf2065f4c7" containerName="extract-utilities" Dec 16 07:53:01 crc kubenswrapper[4823]: E1216 07:53:01.119545 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90e3e315-6fe8-40aa-bf24-2d34c1d7494c" containerName="extract-content" Dec 16 07:53:01 crc kubenswrapper[4823]: I1216 07:53:01.119553 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="90e3e315-6fe8-40aa-bf24-2d34c1d7494c" containerName="extract-content" Dec 16 07:53:01 crc kubenswrapper[4823]: E1216 07:53:01.119563 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90e3e315-6fe8-40aa-bf24-2d34c1d7494c" containerName="registry-server" Dec 16 07:53:01 crc kubenswrapper[4823]: I1216 07:53:01.119573 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="90e3e315-6fe8-40aa-bf24-2d34c1d7494c" containerName="registry-server" Dec 16 07:53:01 crc kubenswrapper[4823]: E1216 07:53:01.119600 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54fff1b4-dbb7-4b3b-b099-6bdf2065f4c7" containerName="extract-content" Dec 16 07:53:01 crc kubenswrapper[4823]: I1216 07:53:01.119607 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="54fff1b4-dbb7-4b3b-b099-6bdf2065f4c7" containerName="extract-content" Dec 16 07:53:01 crc kubenswrapper[4823]: E1216 07:53:01.119622 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90e3e315-6fe8-40aa-bf24-2d34c1d7494c" containerName="extract-utilities" Dec 16 07:53:01 crc kubenswrapper[4823]: I1216 07:53:01.119628 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="90e3e315-6fe8-40aa-bf24-2d34c1d7494c" containerName="extract-utilities" Dec 16 07:53:01 crc kubenswrapper[4823]: E1216 07:53:01.119645 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54fff1b4-dbb7-4b3b-b099-6bdf2065f4c7" containerName="registry-server" Dec 16 07:53:01 crc kubenswrapper[4823]: I1216 
07:53:01.119654 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="54fff1b4-dbb7-4b3b-b099-6bdf2065f4c7" containerName="registry-server" Dec 16 07:53:01 crc kubenswrapper[4823]: I1216 07:53:01.119855 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="90e3e315-6fe8-40aa-bf24-2d34c1d7494c" containerName="registry-server" Dec 16 07:53:01 crc kubenswrapper[4823]: I1216 07:53:01.119903 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="54fff1b4-dbb7-4b3b-b099-6bdf2065f4c7" containerName="registry-server" Dec 16 07:53:01 crc kubenswrapper[4823]: I1216 07:53:01.121114 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gtfq8" Dec 16 07:53:01 crc kubenswrapper[4823]: I1216 07:53:01.132819 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gtfq8"] Dec 16 07:53:01 crc kubenswrapper[4823]: I1216 07:53:01.299539 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bcb243b7-6c2e-4def-ac57-fd942990e69b-catalog-content\") pod \"redhat-marketplace-gtfq8\" (UID: \"bcb243b7-6c2e-4def-ac57-fd942990e69b\") " pod="openshift-marketplace/redhat-marketplace-gtfq8" Dec 16 07:53:01 crc kubenswrapper[4823]: I1216 07:53:01.299687 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpxqm\" (UniqueName: \"kubernetes.io/projected/bcb243b7-6c2e-4def-ac57-fd942990e69b-kube-api-access-kpxqm\") pod \"redhat-marketplace-gtfq8\" (UID: \"bcb243b7-6c2e-4def-ac57-fd942990e69b\") " pod="openshift-marketplace/redhat-marketplace-gtfq8" Dec 16 07:53:01 crc kubenswrapper[4823]: I1216 07:53:01.299816 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/bcb243b7-6c2e-4def-ac57-fd942990e69b-utilities\") pod \"redhat-marketplace-gtfq8\" (UID: \"bcb243b7-6c2e-4def-ac57-fd942990e69b\") " pod="openshift-marketplace/redhat-marketplace-gtfq8" Dec 16 07:53:01 crc kubenswrapper[4823]: I1216 07:53:01.400576 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bcb243b7-6c2e-4def-ac57-fd942990e69b-utilities\") pod \"redhat-marketplace-gtfq8\" (UID: \"bcb243b7-6c2e-4def-ac57-fd942990e69b\") " pod="openshift-marketplace/redhat-marketplace-gtfq8" Dec 16 07:53:01 crc kubenswrapper[4823]: I1216 07:53:01.400616 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bcb243b7-6c2e-4def-ac57-fd942990e69b-catalog-content\") pod \"redhat-marketplace-gtfq8\" (UID: \"bcb243b7-6c2e-4def-ac57-fd942990e69b\") " pod="openshift-marketplace/redhat-marketplace-gtfq8" Dec 16 07:53:01 crc kubenswrapper[4823]: I1216 07:53:01.400676 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kpxqm\" (UniqueName: \"kubernetes.io/projected/bcb243b7-6c2e-4def-ac57-fd942990e69b-kube-api-access-kpxqm\") pod \"redhat-marketplace-gtfq8\" (UID: \"bcb243b7-6c2e-4def-ac57-fd942990e69b\") " pod="openshift-marketplace/redhat-marketplace-gtfq8" Dec 16 07:53:01 crc kubenswrapper[4823]: I1216 07:53:01.401121 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bcb243b7-6c2e-4def-ac57-fd942990e69b-utilities\") pod \"redhat-marketplace-gtfq8\" (UID: \"bcb243b7-6c2e-4def-ac57-fd942990e69b\") " pod="openshift-marketplace/redhat-marketplace-gtfq8" Dec 16 07:53:01 crc kubenswrapper[4823]: I1216 07:53:01.401201 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/bcb243b7-6c2e-4def-ac57-fd942990e69b-catalog-content\") pod \"redhat-marketplace-gtfq8\" (UID: \"bcb243b7-6c2e-4def-ac57-fd942990e69b\") " pod="openshift-marketplace/redhat-marketplace-gtfq8" Dec 16 07:53:01 crc kubenswrapper[4823]: I1216 07:53:01.437953 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpxqm\" (UniqueName: \"kubernetes.io/projected/bcb243b7-6c2e-4def-ac57-fd942990e69b-kube-api-access-kpxqm\") pod \"redhat-marketplace-gtfq8\" (UID: \"bcb243b7-6c2e-4def-ac57-fd942990e69b\") " pod="openshift-marketplace/redhat-marketplace-gtfq8" Dec 16 07:53:01 crc kubenswrapper[4823]: I1216 07:53:01.445129 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gtfq8" Dec 16 07:53:01 crc kubenswrapper[4823]: I1216 07:53:01.923249 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gtfq8"] Dec 16 07:53:02 crc kubenswrapper[4823]: I1216 07:53:02.026813 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gtfq8" event={"ID":"bcb243b7-6c2e-4def-ac57-fd942990e69b","Type":"ContainerStarted","Data":"2b5380eb3ee08c52d41921ac721fdf24f39adf43856f13a446db401892ac5a4f"} Dec 16 07:53:03 crc kubenswrapper[4823]: I1216 07:53:03.038601 4823 generic.go:334] "Generic (PLEG): container finished" podID="bcb243b7-6c2e-4def-ac57-fd942990e69b" containerID="f05d2709a7376873aa90fa0ddfe4bd8d40f29fd8f6ae5501001d120b3708c750" exitCode=0 Dec 16 07:53:03 crc kubenswrapper[4823]: I1216 07:53:03.038752 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gtfq8" event={"ID":"bcb243b7-6c2e-4def-ac57-fd942990e69b","Type":"ContainerDied","Data":"f05d2709a7376873aa90fa0ddfe4bd8d40f29fd8f6ae5501001d120b3708c750"} Dec 16 07:53:04 crc kubenswrapper[4823]: I1216 07:53:04.051264 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-gtfq8" event={"ID":"bcb243b7-6c2e-4def-ac57-fd942990e69b","Type":"ContainerStarted","Data":"ca232ca7106db2d7f0940bd40dc25f077d5ff14afb098a2368562570f2ffec8a"} Dec 16 07:53:05 crc kubenswrapper[4823]: I1216 07:53:05.062914 4823 generic.go:334] "Generic (PLEG): container finished" podID="bcb243b7-6c2e-4def-ac57-fd942990e69b" containerID="ca232ca7106db2d7f0940bd40dc25f077d5ff14afb098a2368562570f2ffec8a" exitCode=0 Dec 16 07:53:05 crc kubenswrapper[4823]: I1216 07:53:05.062993 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gtfq8" event={"ID":"bcb243b7-6c2e-4def-ac57-fd942990e69b","Type":"ContainerDied","Data":"ca232ca7106db2d7f0940bd40dc25f077d5ff14afb098a2368562570f2ffec8a"} Dec 16 07:53:06 crc kubenswrapper[4823]: I1216 07:53:06.075913 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gtfq8" event={"ID":"bcb243b7-6c2e-4def-ac57-fd942990e69b","Type":"ContainerStarted","Data":"9f25de829f1336a84cf8ea44f010aef126a43beee62b7ec34a60c13e4864f6ca"} Dec 16 07:53:06 crc kubenswrapper[4823]: I1216 07:53:06.099511 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-gtfq8" podStartSLOduration=2.267255509 podStartE2EDuration="5.099492054s" podCreationTimestamp="2025-12-16 07:53:01 +0000 UTC" firstStartedPulling="2025-12-16 07:53:03.041502053 +0000 UTC m=+3461.530068206" lastFinishedPulling="2025-12-16 07:53:05.873738588 +0000 UTC m=+3464.362304751" observedRunningTime="2025-12-16 07:53:06.096351905 +0000 UTC m=+3464.584918028" watchObservedRunningTime="2025-12-16 07:53:06.099492054 +0000 UTC m=+3464.588058197" Dec 16 07:53:11 crc kubenswrapper[4823]: I1216 07:53:11.445665 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-gtfq8" Dec 16 07:53:11 crc kubenswrapper[4823]: I1216 07:53:11.446494 4823 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-gtfq8" Dec 16 07:53:11 crc kubenswrapper[4823]: I1216 07:53:11.491271 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-gtfq8" Dec 16 07:53:12 crc kubenswrapper[4823]: I1216 07:53:12.184833 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-gtfq8" Dec 16 07:53:12 crc kubenswrapper[4823]: I1216 07:53:12.232787 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gtfq8"] Dec 16 07:53:14 crc kubenswrapper[4823]: I1216 07:53:14.136054 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-gtfq8" podUID="bcb243b7-6c2e-4def-ac57-fd942990e69b" containerName="registry-server" containerID="cri-o://9f25de829f1336a84cf8ea44f010aef126a43beee62b7ec34a60c13e4864f6ca" gracePeriod=2 Dec 16 07:53:15 crc kubenswrapper[4823]: I1216 07:53:15.141825 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gtfq8" Dec 16 07:53:15 crc kubenswrapper[4823]: I1216 07:53:15.149275 4823 generic.go:334] "Generic (PLEG): container finished" podID="bcb243b7-6c2e-4def-ac57-fd942990e69b" containerID="9f25de829f1336a84cf8ea44f010aef126a43beee62b7ec34a60c13e4864f6ca" exitCode=0 Dec 16 07:53:15 crc kubenswrapper[4823]: I1216 07:53:15.149317 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gtfq8" event={"ID":"bcb243b7-6c2e-4def-ac57-fd942990e69b","Type":"ContainerDied","Data":"9f25de829f1336a84cf8ea44f010aef126a43beee62b7ec34a60c13e4864f6ca"} Dec 16 07:53:15 crc kubenswrapper[4823]: I1216 07:53:15.149348 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gtfq8" event={"ID":"bcb243b7-6c2e-4def-ac57-fd942990e69b","Type":"ContainerDied","Data":"2b5380eb3ee08c52d41921ac721fdf24f39adf43856f13a446db401892ac5a4f"} Dec 16 07:53:15 crc kubenswrapper[4823]: I1216 07:53:15.149366 4823 scope.go:117] "RemoveContainer" containerID="9f25de829f1336a84cf8ea44f010aef126a43beee62b7ec34a60c13e4864f6ca" Dec 16 07:53:15 crc kubenswrapper[4823]: I1216 07:53:15.149372 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gtfq8" Dec 16 07:53:15 crc kubenswrapper[4823]: I1216 07:53:15.174958 4823 scope.go:117] "RemoveContainer" containerID="ca232ca7106db2d7f0940bd40dc25f077d5ff14afb098a2368562570f2ffec8a" Dec 16 07:53:15 crc kubenswrapper[4823]: I1216 07:53:15.193365 4823 scope.go:117] "RemoveContainer" containerID="f05d2709a7376873aa90fa0ddfe4bd8d40f29fd8f6ae5501001d120b3708c750" Dec 16 07:53:15 crc kubenswrapper[4823]: I1216 07:53:15.211502 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kpxqm\" (UniqueName: \"kubernetes.io/projected/bcb243b7-6c2e-4def-ac57-fd942990e69b-kube-api-access-kpxqm\") pod \"bcb243b7-6c2e-4def-ac57-fd942990e69b\" (UID: \"bcb243b7-6c2e-4def-ac57-fd942990e69b\") " Dec 16 07:53:15 crc kubenswrapper[4823]: I1216 07:53:15.211577 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bcb243b7-6c2e-4def-ac57-fd942990e69b-utilities\") pod \"bcb243b7-6c2e-4def-ac57-fd942990e69b\" (UID: \"bcb243b7-6c2e-4def-ac57-fd942990e69b\") " Dec 16 07:53:15 crc kubenswrapper[4823]: I1216 07:53:15.211668 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bcb243b7-6c2e-4def-ac57-fd942990e69b-catalog-content\") pod \"bcb243b7-6c2e-4def-ac57-fd942990e69b\" (UID: \"bcb243b7-6c2e-4def-ac57-fd942990e69b\") " Dec 16 07:53:15 crc kubenswrapper[4823]: I1216 07:53:15.213399 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bcb243b7-6c2e-4def-ac57-fd942990e69b-utilities" (OuterVolumeSpecName: "utilities") pod "bcb243b7-6c2e-4def-ac57-fd942990e69b" (UID: "bcb243b7-6c2e-4def-ac57-fd942990e69b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:53:15 crc kubenswrapper[4823]: I1216 07:53:15.217803 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bcb243b7-6c2e-4def-ac57-fd942990e69b-kube-api-access-kpxqm" (OuterVolumeSpecName: "kube-api-access-kpxqm") pod "bcb243b7-6c2e-4def-ac57-fd942990e69b" (UID: "bcb243b7-6c2e-4def-ac57-fd942990e69b"). InnerVolumeSpecName "kube-api-access-kpxqm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:53:15 crc kubenswrapper[4823]: I1216 07:53:15.237345 4823 scope.go:117] "RemoveContainer" containerID="9f25de829f1336a84cf8ea44f010aef126a43beee62b7ec34a60c13e4864f6ca" Dec 16 07:53:15 crc kubenswrapper[4823]: E1216 07:53:15.238309 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f25de829f1336a84cf8ea44f010aef126a43beee62b7ec34a60c13e4864f6ca\": container with ID starting with 9f25de829f1336a84cf8ea44f010aef126a43beee62b7ec34a60c13e4864f6ca not found: ID does not exist" containerID="9f25de829f1336a84cf8ea44f010aef126a43beee62b7ec34a60c13e4864f6ca" Dec 16 07:53:15 crc kubenswrapper[4823]: I1216 07:53:15.238391 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f25de829f1336a84cf8ea44f010aef126a43beee62b7ec34a60c13e4864f6ca"} err="failed to get container status \"9f25de829f1336a84cf8ea44f010aef126a43beee62b7ec34a60c13e4864f6ca\": rpc error: code = NotFound desc = could not find container \"9f25de829f1336a84cf8ea44f010aef126a43beee62b7ec34a60c13e4864f6ca\": container with ID starting with 9f25de829f1336a84cf8ea44f010aef126a43beee62b7ec34a60c13e4864f6ca not found: ID does not exist" Dec 16 07:53:15 crc kubenswrapper[4823]: I1216 07:53:15.238437 4823 scope.go:117] "RemoveContainer" containerID="ca232ca7106db2d7f0940bd40dc25f077d5ff14afb098a2368562570f2ffec8a" Dec 16 07:53:15 crc kubenswrapper[4823]: E1216 07:53:15.238819 
4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca232ca7106db2d7f0940bd40dc25f077d5ff14afb098a2368562570f2ffec8a\": container with ID starting with ca232ca7106db2d7f0940bd40dc25f077d5ff14afb098a2368562570f2ffec8a not found: ID does not exist" containerID="ca232ca7106db2d7f0940bd40dc25f077d5ff14afb098a2368562570f2ffec8a" Dec 16 07:53:15 crc kubenswrapper[4823]: I1216 07:53:15.239289 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca232ca7106db2d7f0940bd40dc25f077d5ff14afb098a2368562570f2ffec8a"} err="failed to get container status \"ca232ca7106db2d7f0940bd40dc25f077d5ff14afb098a2368562570f2ffec8a\": rpc error: code = NotFound desc = could not find container \"ca232ca7106db2d7f0940bd40dc25f077d5ff14afb098a2368562570f2ffec8a\": container with ID starting with ca232ca7106db2d7f0940bd40dc25f077d5ff14afb098a2368562570f2ffec8a not found: ID does not exist" Dec 16 07:53:15 crc kubenswrapper[4823]: I1216 07:53:15.239318 4823 scope.go:117] "RemoveContainer" containerID="f05d2709a7376873aa90fa0ddfe4bd8d40f29fd8f6ae5501001d120b3708c750" Dec 16 07:53:15 crc kubenswrapper[4823]: E1216 07:53:15.239631 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f05d2709a7376873aa90fa0ddfe4bd8d40f29fd8f6ae5501001d120b3708c750\": container with ID starting with f05d2709a7376873aa90fa0ddfe4bd8d40f29fd8f6ae5501001d120b3708c750 not found: ID does not exist" containerID="f05d2709a7376873aa90fa0ddfe4bd8d40f29fd8f6ae5501001d120b3708c750" Dec 16 07:53:15 crc kubenswrapper[4823]: I1216 07:53:15.239680 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f05d2709a7376873aa90fa0ddfe4bd8d40f29fd8f6ae5501001d120b3708c750"} err="failed to get container status \"f05d2709a7376873aa90fa0ddfe4bd8d40f29fd8f6ae5501001d120b3708c750\": rpc error: code = 
NotFound desc = could not find container \"f05d2709a7376873aa90fa0ddfe4bd8d40f29fd8f6ae5501001d120b3708c750\": container with ID starting with f05d2709a7376873aa90fa0ddfe4bd8d40f29fd8f6ae5501001d120b3708c750 not found: ID does not exist" Dec 16 07:53:15 crc kubenswrapper[4823]: I1216 07:53:15.247819 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bcb243b7-6c2e-4def-ac57-fd942990e69b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bcb243b7-6c2e-4def-ac57-fd942990e69b" (UID: "bcb243b7-6c2e-4def-ac57-fd942990e69b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:53:15 crc kubenswrapper[4823]: I1216 07:53:15.313185 4823 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bcb243b7-6c2e-4def-ac57-fd942990e69b-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 07:53:15 crc kubenswrapper[4823]: I1216 07:53:15.313226 4823 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bcb243b7-6c2e-4def-ac57-fd942990e69b-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 07:53:15 crc kubenswrapper[4823]: I1216 07:53:15.313238 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kpxqm\" (UniqueName: \"kubernetes.io/projected/bcb243b7-6c2e-4def-ac57-fd942990e69b-kube-api-access-kpxqm\") on node \"crc\" DevicePath \"\"" Dec 16 07:53:15 crc kubenswrapper[4823]: I1216 07:53:15.489444 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gtfq8"] Dec 16 07:53:15 crc kubenswrapper[4823]: I1216 07:53:15.498535 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-gtfq8"] Dec 16 07:53:15 crc kubenswrapper[4823]: I1216 07:53:15.782935 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="bcb243b7-6c2e-4def-ac57-fd942990e69b" path="/var/lib/kubelet/pods/bcb243b7-6c2e-4def-ac57-fd942990e69b/volumes" Dec 16 07:54:28 crc kubenswrapper[4823]: I1216 07:54:28.155500 4823 patch_prober.go:28] interesting pod/machine-config-daemon-fv56f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 07:54:28 crc kubenswrapper[4823]: I1216 07:54:28.156092 4823 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 07:54:58 crc kubenswrapper[4823]: I1216 07:54:58.133551 4823 patch_prober.go:28] interesting pod/machine-config-daemon-fv56f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 07:54:58 crc kubenswrapper[4823]: I1216 07:54:58.134078 4823 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 07:55:28 crc kubenswrapper[4823]: I1216 07:55:28.134247 4823 patch_prober.go:28] interesting pod/machine-config-daemon-fv56f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 07:55:28 crc 
kubenswrapper[4823]: I1216 07:55:28.134842 4823 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 07:55:28 crc kubenswrapper[4823]: I1216 07:55:28.134913 4823 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" Dec 16 07:55:28 crc kubenswrapper[4823]: I1216 07:55:28.135768 4823 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ce2cc9ad0ea70d189504e80b006c88ca5902658469d84ed52b52b4f88ad839eb"} pod="openshift-machine-config-operator/machine-config-daemon-fv56f" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 16 07:55:28 crc kubenswrapper[4823]: I1216 07:55:28.135864 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" containerName="machine-config-daemon" containerID="cri-o://ce2cc9ad0ea70d189504e80b006c88ca5902658469d84ed52b52b4f88ad839eb" gracePeriod=600 Dec 16 07:55:28 crc kubenswrapper[4823]: E1216 07:55:28.270615 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 07:55:29 crc kubenswrapper[4823]: I1216 07:55:29.176784 4823 generic.go:334] "Generic 
(PLEG): container finished" podID="25dec47c-3043-486c-b371-2be103c214e3" containerID="ce2cc9ad0ea70d189504e80b006c88ca5902658469d84ed52b52b4f88ad839eb" exitCode=0 Dec 16 07:55:29 crc kubenswrapper[4823]: I1216 07:55:29.176859 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" event={"ID":"25dec47c-3043-486c-b371-2be103c214e3","Type":"ContainerDied","Data":"ce2cc9ad0ea70d189504e80b006c88ca5902658469d84ed52b52b4f88ad839eb"} Dec 16 07:55:29 crc kubenswrapper[4823]: I1216 07:55:29.176935 4823 scope.go:117] "RemoveContainer" containerID="6fb65469102dc2763b9ed51e6c572bbdf3d8b391bede2d8bd15a21f31bcf64a9" Dec 16 07:55:29 crc kubenswrapper[4823]: I1216 07:55:29.177699 4823 scope.go:117] "RemoveContainer" containerID="ce2cc9ad0ea70d189504e80b006c88ca5902658469d84ed52b52b4f88ad839eb" Dec 16 07:55:29 crc kubenswrapper[4823]: E1216 07:55:29.178122 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 07:55:42 crc kubenswrapper[4823]: I1216 07:55:42.771171 4823 scope.go:117] "RemoveContainer" containerID="ce2cc9ad0ea70d189504e80b006c88ca5902658469d84ed52b52b4f88ad839eb" Dec 16 07:55:42 crc kubenswrapper[4823]: E1216 07:55:42.771962 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" 
podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 07:55:53 crc kubenswrapper[4823]: I1216 07:55:53.772123 4823 scope.go:117] "RemoveContainer" containerID="ce2cc9ad0ea70d189504e80b006c88ca5902658469d84ed52b52b4f88ad839eb" Dec 16 07:55:53 crc kubenswrapper[4823]: E1216 07:55:53.772847 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 07:56:05 crc kubenswrapper[4823]: I1216 07:56:05.772439 4823 scope.go:117] "RemoveContainer" containerID="ce2cc9ad0ea70d189504e80b006c88ca5902658469d84ed52b52b4f88ad839eb" Dec 16 07:56:05 crc kubenswrapper[4823]: E1216 07:56:05.773623 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 07:56:17 crc kubenswrapper[4823]: I1216 07:56:17.772079 4823 scope.go:117] "RemoveContainer" containerID="ce2cc9ad0ea70d189504e80b006c88ca5902658469d84ed52b52b4f88ad839eb" Dec 16 07:56:17 crc kubenswrapper[4823]: E1216 07:56:17.772873 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 07:56:19 crc kubenswrapper[4823]: I1216 07:56:19.359026 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-wprkf"] Dec 16 07:56:19 crc kubenswrapper[4823]: E1216 07:56:19.359807 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcb243b7-6c2e-4def-ac57-fd942990e69b" containerName="extract-utilities" Dec 16 07:56:19 crc kubenswrapper[4823]: I1216 07:56:19.359822 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcb243b7-6c2e-4def-ac57-fd942990e69b" containerName="extract-utilities" Dec 16 07:56:19 crc kubenswrapper[4823]: E1216 07:56:19.359858 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcb243b7-6c2e-4def-ac57-fd942990e69b" containerName="extract-content" Dec 16 07:56:19 crc kubenswrapper[4823]: I1216 07:56:19.359866 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcb243b7-6c2e-4def-ac57-fd942990e69b" containerName="extract-content" Dec 16 07:56:19 crc kubenswrapper[4823]: E1216 07:56:19.359880 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcb243b7-6c2e-4def-ac57-fd942990e69b" containerName="registry-server" Dec 16 07:56:19 crc kubenswrapper[4823]: I1216 07:56:19.359890 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcb243b7-6c2e-4def-ac57-fd942990e69b" containerName="registry-server" Dec 16 07:56:19 crc kubenswrapper[4823]: I1216 07:56:19.360111 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcb243b7-6c2e-4def-ac57-fd942990e69b" containerName="registry-server" Dec 16 07:56:19 crc kubenswrapper[4823]: I1216 07:56:19.361301 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wprkf" Dec 16 07:56:19 crc kubenswrapper[4823]: I1216 07:56:19.393648 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wprkf"] Dec 16 07:56:19 crc kubenswrapper[4823]: I1216 07:56:19.496791 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d091ce04-ccb0-448b-b7a8-5139d7ad12b6-catalog-content\") pod \"community-operators-wprkf\" (UID: \"d091ce04-ccb0-448b-b7a8-5139d7ad12b6\") " pod="openshift-marketplace/community-operators-wprkf" Dec 16 07:56:19 crc kubenswrapper[4823]: I1216 07:56:19.496899 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zlqxr\" (UniqueName: \"kubernetes.io/projected/d091ce04-ccb0-448b-b7a8-5139d7ad12b6-kube-api-access-zlqxr\") pod \"community-operators-wprkf\" (UID: \"d091ce04-ccb0-448b-b7a8-5139d7ad12b6\") " pod="openshift-marketplace/community-operators-wprkf" Dec 16 07:56:19 crc kubenswrapper[4823]: I1216 07:56:19.496974 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d091ce04-ccb0-448b-b7a8-5139d7ad12b6-utilities\") pod \"community-operators-wprkf\" (UID: \"d091ce04-ccb0-448b-b7a8-5139d7ad12b6\") " pod="openshift-marketplace/community-operators-wprkf" Dec 16 07:56:19 crc kubenswrapper[4823]: I1216 07:56:19.598544 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d091ce04-ccb0-448b-b7a8-5139d7ad12b6-utilities\") pod \"community-operators-wprkf\" (UID: \"d091ce04-ccb0-448b-b7a8-5139d7ad12b6\") " pod="openshift-marketplace/community-operators-wprkf" Dec 16 07:56:19 crc kubenswrapper[4823]: I1216 07:56:19.599168 4823 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d091ce04-ccb0-448b-b7a8-5139d7ad12b6-catalog-content\") pod \"community-operators-wprkf\" (UID: \"d091ce04-ccb0-448b-b7a8-5139d7ad12b6\") " pod="openshift-marketplace/community-operators-wprkf" Dec 16 07:56:19 crc kubenswrapper[4823]: I1216 07:56:19.599501 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zlqxr\" (UniqueName: \"kubernetes.io/projected/d091ce04-ccb0-448b-b7a8-5139d7ad12b6-kube-api-access-zlqxr\") pod \"community-operators-wprkf\" (UID: \"d091ce04-ccb0-448b-b7a8-5139d7ad12b6\") " pod="openshift-marketplace/community-operators-wprkf" Dec 16 07:56:19 crc kubenswrapper[4823]: I1216 07:56:19.599650 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d091ce04-ccb0-448b-b7a8-5139d7ad12b6-utilities\") pod \"community-operators-wprkf\" (UID: \"d091ce04-ccb0-448b-b7a8-5139d7ad12b6\") " pod="openshift-marketplace/community-operators-wprkf" Dec 16 07:56:19 crc kubenswrapper[4823]: I1216 07:56:19.599678 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d091ce04-ccb0-448b-b7a8-5139d7ad12b6-catalog-content\") pod \"community-operators-wprkf\" (UID: \"d091ce04-ccb0-448b-b7a8-5139d7ad12b6\") " pod="openshift-marketplace/community-operators-wprkf" Dec 16 07:56:19 crc kubenswrapper[4823]: I1216 07:56:19.619634 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zlqxr\" (UniqueName: \"kubernetes.io/projected/d091ce04-ccb0-448b-b7a8-5139d7ad12b6-kube-api-access-zlqxr\") pod \"community-operators-wprkf\" (UID: \"d091ce04-ccb0-448b-b7a8-5139d7ad12b6\") " pod="openshift-marketplace/community-operators-wprkf" Dec 16 07:56:19 crc kubenswrapper[4823]: I1216 07:56:19.683095 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wprkf" Dec 16 07:56:20 crc kubenswrapper[4823]: I1216 07:56:20.176064 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wprkf"] Dec 16 07:56:20 crc kubenswrapper[4823]: W1216 07:56:20.178090 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd091ce04_ccb0_448b_b7a8_5139d7ad12b6.slice/crio-7995aa783d0b0ef964abd88aaeca9016cbf4264f98ca91b50b64819254901a03 WatchSource:0}: Error finding container 7995aa783d0b0ef964abd88aaeca9016cbf4264f98ca91b50b64819254901a03: Status 404 returned error can't find the container with id 7995aa783d0b0ef964abd88aaeca9016cbf4264f98ca91b50b64819254901a03 Dec 16 07:56:20 crc kubenswrapper[4823]: I1216 07:56:20.591405 4823 generic.go:334] "Generic (PLEG): container finished" podID="d091ce04-ccb0-448b-b7a8-5139d7ad12b6" containerID="de3412b8507c1e27cebb36414da55e989ae95278cf91cf903e875e355f17ca5c" exitCode=0 Dec 16 07:56:20 crc kubenswrapper[4823]: I1216 07:56:20.591447 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wprkf" event={"ID":"d091ce04-ccb0-448b-b7a8-5139d7ad12b6","Type":"ContainerDied","Data":"de3412b8507c1e27cebb36414da55e989ae95278cf91cf903e875e355f17ca5c"} Dec 16 07:56:20 crc kubenswrapper[4823]: I1216 07:56:20.591471 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wprkf" event={"ID":"d091ce04-ccb0-448b-b7a8-5139d7ad12b6","Type":"ContainerStarted","Data":"7995aa783d0b0ef964abd88aaeca9016cbf4264f98ca91b50b64819254901a03"} Dec 16 07:56:21 crc kubenswrapper[4823]: I1216 07:56:21.601873 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wprkf" 
event={"ID":"d091ce04-ccb0-448b-b7a8-5139d7ad12b6","Type":"ContainerStarted","Data":"0b4d53ee68458c433002a9585fc86fe38f05a9551f9632082f17bd7770791e96"} Dec 16 07:56:22 crc kubenswrapper[4823]: I1216 07:56:22.612798 4823 generic.go:334] "Generic (PLEG): container finished" podID="d091ce04-ccb0-448b-b7a8-5139d7ad12b6" containerID="0b4d53ee68458c433002a9585fc86fe38f05a9551f9632082f17bd7770791e96" exitCode=0 Dec 16 07:56:22 crc kubenswrapper[4823]: I1216 07:56:22.613333 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wprkf" event={"ID":"d091ce04-ccb0-448b-b7a8-5139d7ad12b6","Type":"ContainerDied","Data":"0b4d53ee68458c433002a9585fc86fe38f05a9551f9632082f17bd7770791e96"} Dec 16 07:56:23 crc kubenswrapper[4823]: I1216 07:56:23.627450 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wprkf" event={"ID":"d091ce04-ccb0-448b-b7a8-5139d7ad12b6","Type":"ContainerStarted","Data":"8e3592e108d768aef623256c82879d1257b4f3df6e1b841879c8ccbb21ed8e12"} Dec 16 07:56:23 crc kubenswrapper[4823]: I1216 07:56:23.652759 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-wprkf" podStartSLOduration=2.217038566 podStartE2EDuration="4.652737352s" podCreationTimestamp="2025-12-16 07:56:19 +0000 UTC" firstStartedPulling="2025-12-16 07:56:20.593284555 +0000 UTC m=+3659.081850678" lastFinishedPulling="2025-12-16 07:56:23.028983321 +0000 UTC m=+3661.517549464" observedRunningTime="2025-12-16 07:56:23.644451223 +0000 UTC m=+3662.133017356" watchObservedRunningTime="2025-12-16 07:56:23.652737352 +0000 UTC m=+3662.141303495" Dec 16 07:56:25 crc kubenswrapper[4823]: E1216 07:56:25.892905 4823 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd091ce04_ccb0_448b_b7a8_5139d7ad12b6.slice/crio-0b4d53ee68458c433002a9585fc86fe38f05a9551f9632082f17bd7770791e96.scope\": RecentStats: unable to find data in memory cache]" Dec 16 07:56:29 crc kubenswrapper[4823]: I1216 07:56:29.683939 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-wprkf" Dec 16 07:56:29 crc kubenswrapper[4823]: I1216 07:56:29.684374 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wprkf" Dec 16 07:56:29 crc kubenswrapper[4823]: I1216 07:56:29.723830 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wprkf" Dec 16 07:56:29 crc kubenswrapper[4823]: I1216 07:56:29.772486 4823 scope.go:117] "RemoveContainer" containerID="ce2cc9ad0ea70d189504e80b006c88ca5902658469d84ed52b52b4f88ad839eb" Dec 16 07:56:29 crc kubenswrapper[4823]: E1216 07:56:29.772710 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 07:56:30 crc kubenswrapper[4823]: I1216 07:56:30.743185 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-wprkf" Dec 16 07:56:30 crc kubenswrapper[4823]: I1216 07:56:30.794290 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wprkf"] Dec 16 07:56:32 crc kubenswrapper[4823]: I1216 07:56:32.700258 4823 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/community-operators-wprkf" podUID="d091ce04-ccb0-448b-b7a8-5139d7ad12b6" containerName="registry-server" containerID="cri-o://8e3592e108d768aef623256c82879d1257b4f3df6e1b841879c8ccbb21ed8e12" gracePeriod=2 Dec 16 07:56:33 crc kubenswrapper[4823]: I1216 07:56:33.581478 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wprkf" Dec 16 07:56:33 crc kubenswrapper[4823]: I1216 07:56:33.626520 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zlqxr\" (UniqueName: \"kubernetes.io/projected/d091ce04-ccb0-448b-b7a8-5139d7ad12b6-kube-api-access-zlqxr\") pod \"d091ce04-ccb0-448b-b7a8-5139d7ad12b6\" (UID: \"d091ce04-ccb0-448b-b7a8-5139d7ad12b6\") " Dec 16 07:56:33 crc kubenswrapper[4823]: I1216 07:56:33.626735 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d091ce04-ccb0-448b-b7a8-5139d7ad12b6-utilities\") pod \"d091ce04-ccb0-448b-b7a8-5139d7ad12b6\" (UID: \"d091ce04-ccb0-448b-b7a8-5139d7ad12b6\") " Dec 16 07:56:33 crc kubenswrapper[4823]: I1216 07:56:33.626817 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d091ce04-ccb0-448b-b7a8-5139d7ad12b6-catalog-content\") pod \"d091ce04-ccb0-448b-b7a8-5139d7ad12b6\" (UID: \"d091ce04-ccb0-448b-b7a8-5139d7ad12b6\") " Dec 16 07:56:33 crc kubenswrapper[4823]: I1216 07:56:33.628904 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d091ce04-ccb0-448b-b7a8-5139d7ad12b6-utilities" (OuterVolumeSpecName: "utilities") pod "d091ce04-ccb0-448b-b7a8-5139d7ad12b6" (UID: "d091ce04-ccb0-448b-b7a8-5139d7ad12b6"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:56:33 crc kubenswrapper[4823]: I1216 07:56:33.635180 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d091ce04-ccb0-448b-b7a8-5139d7ad12b6-kube-api-access-zlqxr" (OuterVolumeSpecName: "kube-api-access-zlqxr") pod "d091ce04-ccb0-448b-b7a8-5139d7ad12b6" (UID: "d091ce04-ccb0-448b-b7a8-5139d7ad12b6"). InnerVolumeSpecName "kube-api-access-zlqxr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:56:33 crc kubenswrapper[4823]: I1216 07:56:33.718420 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d091ce04-ccb0-448b-b7a8-5139d7ad12b6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d091ce04-ccb0-448b-b7a8-5139d7ad12b6" (UID: "d091ce04-ccb0-448b-b7a8-5139d7ad12b6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:56:33 crc kubenswrapper[4823]: I1216 07:56:33.725077 4823 generic.go:334] "Generic (PLEG): container finished" podID="d091ce04-ccb0-448b-b7a8-5139d7ad12b6" containerID="8e3592e108d768aef623256c82879d1257b4f3df6e1b841879c8ccbb21ed8e12" exitCode=0 Dec 16 07:56:33 crc kubenswrapper[4823]: I1216 07:56:33.725130 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wprkf" event={"ID":"d091ce04-ccb0-448b-b7a8-5139d7ad12b6","Type":"ContainerDied","Data":"8e3592e108d768aef623256c82879d1257b4f3df6e1b841879c8ccbb21ed8e12"} Dec 16 07:56:33 crc kubenswrapper[4823]: I1216 07:56:33.725169 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wprkf" event={"ID":"d091ce04-ccb0-448b-b7a8-5139d7ad12b6","Type":"ContainerDied","Data":"7995aa783d0b0ef964abd88aaeca9016cbf4264f98ca91b50b64819254901a03"} Dec 16 07:56:33 crc kubenswrapper[4823]: I1216 07:56:33.725192 4823 scope.go:117] "RemoveContainer" 
containerID="8e3592e108d768aef623256c82879d1257b4f3df6e1b841879c8ccbb21ed8e12" Dec 16 07:56:33 crc kubenswrapper[4823]: I1216 07:56:33.725233 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wprkf" Dec 16 07:56:33 crc kubenswrapper[4823]: I1216 07:56:33.728881 4823 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d091ce04-ccb0-448b-b7a8-5139d7ad12b6-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 07:56:33 crc kubenswrapper[4823]: I1216 07:56:33.729282 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zlqxr\" (UniqueName: \"kubernetes.io/projected/d091ce04-ccb0-448b-b7a8-5139d7ad12b6-kube-api-access-zlqxr\") on node \"crc\" DevicePath \"\"" Dec 16 07:56:33 crc kubenswrapper[4823]: I1216 07:56:33.729487 4823 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d091ce04-ccb0-448b-b7a8-5139d7ad12b6-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 07:56:33 crc kubenswrapper[4823]: I1216 07:56:33.758521 4823 scope.go:117] "RemoveContainer" containerID="0b4d53ee68458c433002a9585fc86fe38f05a9551f9632082f17bd7770791e96" Dec 16 07:56:33 crc kubenswrapper[4823]: I1216 07:56:33.797755 4823 scope.go:117] "RemoveContainer" containerID="de3412b8507c1e27cebb36414da55e989ae95278cf91cf903e875e355f17ca5c" Dec 16 07:56:33 crc kubenswrapper[4823]: I1216 07:56:33.798531 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wprkf"] Dec 16 07:56:33 crc kubenswrapper[4823]: I1216 07:56:33.807125 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-wprkf"] Dec 16 07:56:33 crc kubenswrapper[4823]: I1216 07:56:33.826511 4823 scope.go:117] "RemoveContainer" containerID="8e3592e108d768aef623256c82879d1257b4f3df6e1b841879c8ccbb21ed8e12" Dec 16 07:56:33 crc 
kubenswrapper[4823]: E1216 07:56:33.826886 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e3592e108d768aef623256c82879d1257b4f3df6e1b841879c8ccbb21ed8e12\": container with ID starting with 8e3592e108d768aef623256c82879d1257b4f3df6e1b841879c8ccbb21ed8e12 not found: ID does not exist" containerID="8e3592e108d768aef623256c82879d1257b4f3df6e1b841879c8ccbb21ed8e12" Dec 16 07:56:33 crc kubenswrapper[4823]: I1216 07:56:33.826925 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e3592e108d768aef623256c82879d1257b4f3df6e1b841879c8ccbb21ed8e12"} err="failed to get container status \"8e3592e108d768aef623256c82879d1257b4f3df6e1b841879c8ccbb21ed8e12\": rpc error: code = NotFound desc = could not find container \"8e3592e108d768aef623256c82879d1257b4f3df6e1b841879c8ccbb21ed8e12\": container with ID starting with 8e3592e108d768aef623256c82879d1257b4f3df6e1b841879c8ccbb21ed8e12 not found: ID does not exist" Dec 16 07:56:33 crc kubenswrapper[4823]: I1216 07:56:33.826946 4823 scope.go:117] "RemoveContainer" containerID="0b4d53ee68458c433002a9585fc86fe38f05a9551f9632082f17bd7770791e96" Dec 16 07:56:33 crc kubenswrapper[4823]: E1216 07:56:33.827193 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b4d53ee68458c433002a9585fc86fe38f05a9551f9632082f17bd7770791e96\": container with ID starting with 0b4d53ee68458c433002a9585fc86fe38f05a9551f9632082f17bd7770791e96 not found: ID does not exist" containerID="0b4d53ee68458c433002a9585fc86fe38f05a9551f9632082f17bd7770791e96" Dec 16 07:56:33 crc kubenswrapper[4823]: I1216 07:56:33.827260 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b4d53ee68458c433002a9585fc86fe38f05a9551f9632082f17bd7770791e96"} err="failed to get container status 
\"0b4d53ee68458c433002a9585fc86fe38f05a9551f9632082f17bd7770791e96\": rpc error: code = NotFound desc = could not find container \"0b4d53ee68458c433002a9585fc86fe38f05a9551f9632082f17bd7770791e96\": container with ID starting with 0b4d53ee68458c433002a9585fc86fe38f05a9551f9632082f17bd7770791e96 not found: ID does not exist" Dec 16 07:56:33 crc kubenswrapper[4823]: I1216 07:56:33.827316 4823 scope.go:117] "RemoveContainer" containerID="de3412b8507c1e27cebb36414da55e989ae95278cf91cf903e875e355f17ca5c" Dec 16 07:56:33 crc kubenswrapper[4823]: E1216 07:56:33.827606 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de3412b8507c1e27cebb36414da55e989ae95278cf91cf903e875e355f17ca5c\": container with ID starting with de3412b8507c1e27cebb36414da55e989ae95278cf91cf903e875e355f17ca5c not found: ID does not exist" containerID="de3412b8507c1e27cebb36414da55e989ae95278cf91cf903e875e355f17ca5c" Dec 16 07:56:33 crc kubenswrapper[4823]: I1216 07:56:33.827626 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de3412b8507c1e27cebb36414da55e989ae95278cf91cf903e875e355f17ca5c"} err="failed to get container status \"de3412b8507c1e27cebb36414da55e989ae95278cf91cf903e875e355f17ca5c\": rpc error: code = NotFound desc = could not find container \"de3412b8507c1e27cebb36414da55e989ae95278cf91cf903e875e355f17ca5c\": container with ID starting with de3412b8507c1e27cebb36414da55e989ae95278cf91cf903e875e355f17ca5c not found: ID does not exist" Dec 16 07:56:35 crc kubenswrapper[4823]: I1216 07:56:35.786957 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d091ce04-ccb0-448b-b7a8-5139d7ad12b6" path="/var/lib/kubelet/pods/d091ce04-ccb0-448b-b7a8-5139d7ad12b6/volumes" Dec 16 07:56:36 crc kubenswrapper[4823]: E1216 07:56:36.059238 4823 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd091ce04_ccb0_448b_b7a8_5139d7ad12b6.slice/crio-0b4d53ee68458c433002a9585fc86fe38f05a9551f9632082f17bd7770791e96.scope\": RecentStats: unable to find data in memory cache]" Dec 16 07:56:41 crc kubenswrapper[4823]: I1216 07:56:41.780977 4823 scope.go:117] "RemoveContainer" containerID="ce2cc9ad0ea70d189504e80b006c88ca5902658469d84ed52b52b4f88ad839eb" Dec 16 07:56:41 crc kubenswrapper[4823]: E1216 07:56:41.781946 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 07:56:46 crc kubenswrapper[4823]: E1216 07:56:46.286586 4823 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd091ce04_ccb0_448b_b7a8_5139d7ad12b6.slice/crio-0b4d53ee68458c433002a9585fc86fe38f05a9551f9632082f17bd7770791e96.scope\": RecentStats: unable to find data in memory cache]" Dec 16 07:56:55 crc kubenswrapper[4823]: I1216 07:56:55.771736 4823 scope.go:117] "RemoveContainer" containerID="ce2cc9ad0ea70d189504e80b006c88ca5902658469d84ed52b52b4f88ad839eb" Dec 16 07:56:55 crc kubenswrapper[4823]: E1216 07:56:55.772348 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 
16 07:56:56 crc kubenswrapper[4823]: E1216 07:56:56.489191 4823 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd091ce04_ccb0_448b_b7a8_5139d7ad12b6.slice/crio-0b4d53ee68458c433002a9585fc86fe38f05a9551f9632082f17bd7770791e96.scope\": RecentStats: unable to find data in memory cache]" Dec 16 07:57:06 crc kubenswrapper[4823]: E1216 07:57:06.746521 4823 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd091ce04_ccb0_448b_b7a8_5139d7ad12b6.slice/crio-0b4d53ee68458c433002a9585fc86fe38f05a9551f9632082f17bd7770791e96.scope\": RecentStats: unable to find data in memory cache]" Dec 16 07:57:08 crc kubenswrapper[4823]: I1216 07:57:08.771917 4823 scope.go:117] "RemoveContainer" containerID="ce2cc9ad0ea70d189504e80b006c88ca5902658469d84ed52b52b4f88ad839eb" Dec 16 07:57:08 crc kubenswrapper[4823]: E1216 07:57:08.772702 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 07:57:16 crc kubenswrapper[4823]: E1216 07:57:16.919459 4823 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd091ce04_ccb0_448b_b7a8_5139d7ad12b6.slice/crio-0b4d53ee68458c433002a9585fc86fe38f05a9551f9632082f17bd7770791e96.scope\": RecentStats: unable to find data in memory cache]" Dec 16 07:57:20 crc kubenswrapper[4823]: I1216 07:57:20.770999 4823 scope.go:117] 
"RemoveContainer" containerID="ce2cc9ad0ea70d189504e80b006c88ca5902658469d84ed52b52b4f88ad839eb" Dec 16 07:57:20 crc kubenswrapper[4823]: E1216 07:57:20.771542 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 07:57:21 crc kubenswrapper[4823]: E1216 07:57:21.802260 4823 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat "/var/lib/containers/storage/overlay/5213e07dbd59e51462aec65427bd9d4287b3e29020e4a8319fa0a57bcc82bf94/diff" to get inode usage: stat /var/lib/containers/storage/overlay/5213e07dbd59e51462aec65427bd9d4287b3e29020e4a8319fa0a57bcc82bf94/diff: no such file or directory, extraDiskErr: could not stat "/var/log/pods/openshift-marketplace_community-operators-wprkf_d091ce04-ccb0-448b-b7a8-5139d7ad12b6/extract-content/0.log" to get inode usage: stat /var/log/pods/openshift-marketplace_community-operators-wprkf_d091ce04-ccb0-448b-b7a8-5139d7ad12b6/extract-content/0.log: no such file or directory Dec 16 07:57:34 crc kubenswrapper[4823]: I1216 07:57:34.771702 4823 scope.go:117] "RemoveContainer" containerID="ce2cc9ad0ea70d189504e80b006c88ca5902658469d84ed52b52b4f88ad839eb" Dec 16 07:57:34 crc kubenswrapper[4823]: E1216 07:57:34.772735 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" 
podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 07:57:46 crc kubenswrapper[4823]: I1216 07:57:46.771897 4823 scope.go:117] "RemoveContainer" containerID="ce2cc9ad0ea70d189504e80b006c88ca5902658469d84ed52b52b4f88ad839eb" Dec 16 07:57:46 crc kubenswrapper[4823]: E1216 07:57:46.772741 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 07:58:00 crc kubenswrapper[4823]: I1216 07:58:00.772424 4823 scope.go:117] "RemoveContainer" containerID="ce2cc9ad0ea70d189504e80b006c88ca5902658469d84ed52b52b4f88ad839eb" Dec 16 07:58:00 crc kubenswrapper[4823]: E1216 07:58:00.773309 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 07:58:11 crc kubenswrapper[4823]: I1216 07:58:11.778507 4823 scope.go:117] "RemoveContainer" containerID="ce2cc9ad0ea70d189504e80b006c88ca5902658469d84ed52b52b4f88ad839eb" Dec 16 07:58:11 crc kubenswrapper[4823]: E1216 07:58:11.779466 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 07:58:23 crc kubenswrapper[4823]: I1216 07:58:23.772275 4823 scope.go:117] "RemoveContainer" containerID="ce2cc9ad0ea70d189504e80b006c88ca5902658469d84ed52b52b4f88ad839eb" Dec 16 07:58:23 crc kubenswrapper[4823]: E1216 07:58:23.773218 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 07:58:35 crc kubenswrapper[4823]: I1216 07:58:35.772578 4823 scope.go:117] "RemoveContainer" containerID="ce2cc9ad0ea70d189504e80b006c88ca5902658469d84ed52b52b4f88ad839eb" Dec 16 07:58:35 crc kubenswrapper[4823]: E1216 07:58:35.773362 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 07:58:41 crc kubenswrapper[4823]: I1216 07:58:41.045305 4823 scope.go:117] "RemoveContainer" containerID="1dd3283c784ec5cd6ff1f67918745f056022f2f5c6f572982d0be9a0148f6d41" Dec 16 07:58:41 crc kubenswrapper[4823]: I1216 07:58:41.075667 4823 scope.go:117] "RemoveContainer" containerID="418869e909a5bb32fefcf65800278ffb791024d89a95d281e820bf4c3fa0e40d" Dec 16 07:58:41 crc kubenswrapper[4823]: I1216 07:58:41.097914 4823 scope.go:117] "RemoveContainer" containerID="370a079cba701c88c6b8cecc56bc99c61e47c8b7c9b4c90db19f3c49511f541f" Dec 
16 07:58:50 crc kubenswrapper[4823]: I1216 07:58:50.772009 4823 scope.go:117] "RemoveContainer" containerID="ce2cc9ad0ea70d189504e80b006c88ca5902658469d84ed52b52b4f88ad839eb" Dec 16 07:58:50 crc kubenswrapper[4823]: E1216 07:58:50.772826 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 07:59:02 crc kubenswrapper[4823]: I1216 07:59:02.772210 4823 scope.go:117] "RemoveContainer" containerID="ce2cc9ad0ea70d189504e80b006c88ca5902658469d84ed52b52b4f88ad839eb" Dec 16 07:59:02 crc kubenswrapper[4823]: E1216 07:59:02.773295 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 07:59:17 crc kubenswrapper[4823]: I1216 07:59:17.771823 4823 scope.go:117] "RemoveContainer" containerID="ce2cc9ad0ea70d189504e80b006c88ca5902658469d84ed52b52b4f88ad839eb" Dec 16 07:59:17 crc kubenswrapper[4823]: E1216 07:59:17.772643 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" 
podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 07:59:32 crc kubenswrapper[4823]: I1216 07:59:32.772281 4823 scope.go:117] "RemoveContainer" containerID="ce2cc9ad0ea70d189504e80b006c88ca5902658469d84ed52b52b4f88ad839eb" Dec 16 07:59:32 crc kubenswrapper[4823]: E1216 07:59:32.772886 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 07:59:46 crc kubenswrapper[4823]: I1216 07:59:46.772237 4823 scope.go:117] "RemoveContainer" containerID="ce2cc9ad0ea70d189504e80b006c88ca5902658469d84ed52b52b4f88ad839eb" Dec 16 07:59:46 crc kubenswrapper[4823]: E1216 07:59:46.772883 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 07:59:59 crc kubenswrapper[4823]: I1216 07:59:59.772128 4823 scope.go:117] "RemoveContainer" containerID="ce2cc9ad0ea70d189504e80b006c88ca5902658469d84ed52b52b4f88ad839eb" Dec 16 07:59:59 crc kubenswrapper[4823]: E1216 07:59:59.773132 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 08:00:00 crc kubenswrapper[4823]: I1216 08:00:00.191005 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431200-ctpzz"] Dec 16 08:00:00 crc kubenswrapper[4823]: E1216 08:00:00.191681 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d091ce04-ccb0-448b-b7a8-5139d7ad12b6" containerName="extract-content" Dec 16 08:00:00 crc kubenswrapper[4823]: I1216 08:00:00.191705 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="d091ce04-ccb0-448b-b7a8-5139d7ad12b6" containerName="extract-content" Dec 16 08:00:00 crc kubenswrapper[4823]: E1216 08:00:00.191731 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d091ce04-ccb0-448b-b7a8-5139d7ad12b6" containerName="registry-server" Dec 16 08:00:00 crc kubenswrapper[4823]: I1216 08:00:00.191740 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="d091ce04-ccb0-448b-b7a8-5139d7ad12b6" containerName="registry-server" Dec 16 08:00:00 crc kubenswrapper[4823]: E1216 08:00:00.191759 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d091ce04-ccb0-448b-b7a8-5139d7ad12b6" containerName="extract-utilities" Dec 16 08:00:00 crc kubenswrapper[4823]: I1216 08:00:00.191766 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="d091ce04-ccb0-448b-b7a8-5139d7ad12b6" containerName="extract-utilities" Dec 16 08:00:00 crc kubenswrapper[4823]: I1216 08:00:00.191944 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="d091ce04-ccb0-448b-b7a8-5139d7ad12b6" containerName="registry-server" Dec 16 08:00:00 crc kubenswrapper[4823]: I1216 08:00:00.192608 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431200-ctpzz" Dec 16 08:00:00 crc kubenswrapper[4823]: I1216 08:00:00.199233 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 16 08:00:00 crc kubenswrapper[4823]: I1216 08:00:00.199383 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 16 08:00:00 crc kubenswrapper[4823]: I1216 08:00:00.209657 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431200-ctpzz"] Dec 16 08:00:00 crc kubenswrapper[4823]: I1216 08:00:00.230377 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8s654\" (UniqueName: \"kubernetes.io/projected/e97bd637-59e2-4dfb-9935-6a84f4e46388-kube-api-access-8s654\") pod \"collect-profiles-29431200-ctpzz\" (UID: \"e97bd637-59e2-4dfb-9935-6a84f4e46388\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431200-ctpzz" Dec 16 08:00:00 crc kubenswrapper[4823]: I1216 08:00:00.230458 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e97bd637-59e2-4dfb-9935-6a84f4e46388-config-volume\") pod \"collect-profiles-29431200-ctpzz\" (UID: \"e97bd637-59e2-4dfb-9935-6a84f4e46388\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431200-ctpzz" Dec 16 08:00:00 crc kubenswrapper[4823]: I1216 08:00:00.230489 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e97bd637-59e2-4dfb-9935-6a84f4e46388-secret-volume\") pod \"collect-profiles-29431200-ctpzz\" (UID: \"e97bd637-59e2-4dfb-9935-6a84f4e46388\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29431200-ctpzz" Dec 16 08:00:00 crc kubenswrapper[4823]: I1216 08:00:00.331371 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8s654\" (UniqueName: \"kubernetes.io/projected/e97bd637-59e2-4dfb-9935-6a84f4e46388-kube-api-access-8s654\") pod \"collect-profiles-29431200-ctpzz\" (UID: \"e97bd637-59e2-4dfb-9935-6a84f4e46388\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431200-ctpzz" Dec 16 08:00:00 crc kubenswrapper[4823]: I1216 08:00:00.331739 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e97bd637-59e2-4dfb-9935-6a84f4e46388-config-volume\") pod \"collect-profiles-29431200-ctpzz\" (UID: \"e97bd637-59e2-4dfb-9935-6a84f4e46388\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431200-ctpzz" Dec 16 08:00:00 crc kubenswrapper[4823]: I1216 08:00:00.331866 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e97bd637-59e2-4dfb-9935-6a84f4e46388-secret-volume\") pod \"collect-profiles-29431200-ctpzz\" (UID: \"e97bd637-59e2-4dfb-9935-6a84f4e46388\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431200-ctpzz" Dec 16 08:00:00 crc kubenswrapper[4823]: I1216 08:00:00.333954 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e97bd637-59e2-4dfb-9935-6a84f4e46388-config-volume\") pod \"collect-profiles-29431200-ctpzz\" (UID: \"e97bd637-59e2-4dfb-9935-6a84f4e46388\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431200-ctpzz" Dec 16 08:00:00 crc kubenswrapper[4823]: I1216 08:00:00.348177 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/e97bd637-59e2-4dfb-9935-6a84f4e46388-secret-volume\") pod \"collect-profiles-29431200-ctpzz\" (UID: \"e97bd637-59e2-4dfb-9935-6a84f4e46388\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431200-ctpzz" Dec 16 08:00:00 crc kubenswrapper[4823]: I1216 08:00:00.359092 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8s654\" (UniqueName: \"kubernetes.io/projected/e97bd637-59e2-4dfb-9935-6a84f4e46388-kube-api-access-8s654\") pod \"collect-profiles-29431200-ctpzz\" (UID: \"e97bd637-59e2-4dfb-9935-6a84f4e46388\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431200-ctpzz" Dec 16 08:00:00 crc kubenswrapper[4823]: I1216 08:00:00.518575 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431200-ctpzz" Dec 16 08:00:01 crc kubenswrapper[4823]: I1216 08:00:01.062761 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431200-ctpzz"] Dec 16 08:00:01 crc kubenswrapper[4823]: I1216 08:00:01.398484 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431200-ctpzz" event={"ID":"e97bd637-59e2-4dfb-9935-6a84f4e46388","Type":"ContainerStarted","Data":"b25c945c7478d3ce67781f66e14be73a8abf6e776a23d0b8d03a34c19bfdd70c"} Dec 16 08:00:01 crc kubenswrapper[4823]: I1216 08:00:01.398707 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431200-ctpzz" event={"ID":"e97bd637-59e2-4dfb-9935-6a84f4e46388","Type":"ContainerStarted","Data":"45bc3c9a26fa1f40f1126d0f3d07e92f0f9c6bb768a3d520e61ce827cd180ff2"} Dec 16 08:00:02 crc kubenswrapper[4823]: I1216 08:00:02.412188 4823 generic.go:334] "Generic (PLEG): container finished" podID="e97bd637-59e2-4dfb-9935-6a84f4e46388" 
containerID="b25c945c7478d3ce67781f66e14be73a8abf6e776a23d0b8d03a34c19bfdd70c" exitCode=0 Dec 16 08:00:02 crc kubenswrapper[4823]: I1216 08:00:02.412246 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431200-ctpzz" event={"ID":"e97bd637-59e2-4dfb-9935-6a84f4e46388","Type":"ContainerDied","Data":"b25c945c7478d3ce67781f66e14be73a8abf6e776a23d0b8d03a34c19bfdd70c"} Dec 16 08:00:03 crc kubenswrapper[4823]: I1216 08:00:03.942786 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431200-ctpzz" Dec 16 08:00:03 crc kubenswrapper[4823]: I1216 08:00:03.992175 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e97bd637-59e2-4dfb-9935-6a84f4e46388-config-volume\") pod \"e97bd637-59e2-4dfb-9935-6a84f4e46388\" (UID: \"e97bd637-59e2-4dfb-9935-6a84f4e46388\") " Dec 16 08:00:03 crc kubenswrapper[4823]: I1216 08:00:03.992246 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e97bd637-59e2-4dfb-9935-6a84f4e46388-secret-volume\") pod \"e97bd637-59e2-4dfb-9935-6a84f4e46388\" (UID: \"e97bd637-59e2-4dfb-9935-6a84f4e46388\") " Dec 16 08:00:03 crc kubenswrapper[4823]: I1216 08:00:03.992326 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8s654\" (UniqueName: \"kubernetes.io/projected/e97bd637-59e2-4dfb-9935-6a84f4e46388-kube-api-access-8s654\") pod \"e97bd637-59e2-4dfb-9935-6a84f4e46388\" (UID: \"e97bd637-59e2-4dfb-9935-6a84f4e46388\") " Dec 16 08:00:03 crc kubenswrapper[4823]: I1216 08:00:03.992767 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e97bd637-59e2-4dfb-9935-6a84f4e46388-config-volume" (OuterVolumeSpecName: "config-volume") pod 
"e97bd637-59e2-4dfb-9935-6a84f4e46388" (UID: "e97bd637-59e2-4dfb-9935-6a84f4e46388"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:00:03 crc kubenswrapper[4823]: I1216 08:00:03.996967 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e97bd637-59e2-4dfb-9935-6a84f4e46388-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "e97bd637-59e2-4dfb-9935-6a84f4e46388" (UID: "e97bd637-59e2-4dfb-9935-6a84f4e46388"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:00:03 crc kubenswrapper[4823]: I1216 08:00:03.997125 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e97bd637-59e2-4dfb-9935-6a84f4e46388-kube-api-access-8s654" (OuterVolumeSpecName: "kube-api-access-8s654") pod "e97bd637-59e2-4dfb-9935-6a84f4e46388" (UID: "e97bd637-59e2-4dfb-9935-6a84f4e46388"). InnerVolumeSpecName "kube-api-access-8s654". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:00:04 crc kubenswrapper[4823]: I1216 08:00:04.093606 4823 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e97bd637-59e2-4dfb-9935-6a84f4e46388-config-volume\") on node \"crc\" DevicePath \"\"" Dec 16 08:00:04 crc kubenswrapper[4823]: I1216 08:00:04.093650 4823 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e97bd637-59e2-4dfb-9935-6a84f4e46388-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 16 08:00:04 crc kubenswrapper[4823]: I1216 08:00:04.093660 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8s654\" (UniqueName: \"kubernetes.io/projected/e97bd637-59e2-4dfb-9935-6a84f4e46388-kube-api-access-8s654\") on node \"crc\" DevicePath \"\"" Dec 16 08:00:04 crc kubenswrapper[4823]: I1216 08:00:04.430607 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431200-ctpzz" event={"ID":"e97bd637-59e2-4dfb-9935-6a84f4e46388","Type":"ContainerDied","Data":"45bc3c9a26fa1f40f1126d0f3d07e92f0f9c6bb768a3d520e61ce827cd180ff2"} Dec 16 08:00:04 crc kubenswrapper[4823]: I1216 08:00:04.430644 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="45bc3c9a26fa1f40f1126d0f3d07e92f0f9c6bb768a3d520e61ce827cd180ff2" Dec 16 08:00:04 crc kubenswrapper[4823]: I1216 08:00:04.430718 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431200-ctpzz" Dec 16 08:00:04 crc kubenswrapper[4823]: I1216 08:00:04.532767 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431155-4qg92"] Dec 16 08:00:04 crc kubenswrapper[4823]: I1216 08:00:04.541781 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431155-4qg92"] Dec 16 08:00:05 crc kubenswrapper[4823]: I1216 08:00:05.792936 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bb3b1a1-ff5f-4119-8b6e-4ccc834a34d6" path="/var/lib/kubelet/pods/1bb3b1a1-ff5f-4119-8b6e-4ccc834a34d6/volumes" Dec 16 08:00:13 crc kubenswrapper[4823]: I1216 08:00:13.773379 4823 scope.go:117] "RemoveContainer" containerID="ce2cc9ad0ea70d189504e80b006c88ca5902658469d84ed52b52b4f88ad839eb" Dec 16 08:00:13 crc kubenswrapper[4823]: E1216 08:00:13.774457 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 08:00:26 crc kubenswrapper[4823]: I1216 08:00:26.772294 4823 scope.go:117] "RemoveContainer" containerID="ce2cc9ad0ea70d189504e80b006c88ca5902658469d84ed52b52b4f88ad839eb" Dec 16 08:00:26 crc kubenswrapper[4823]: E1216 08:00:26.773211 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 08:00:40 crc kubenswrapper[4823]: I1216 08:00:40.771584 4823 scope.go:117] "RemoveContainer" containerID="ce2cc9ad0ea70d189504e80b006c88ca5902658469d84ed52b52b4f88ad839eb" Dec 16 08:00:41 crc kubenswrapper[4823]: I1216 08:00:41.175479 4823 scope.go:117] "RemoveContainer" containerID="d29286bc202b75173f2d2104bec333caae3792ccb51715cb7c891d3c31f531ec" Dec 16 08:00:41 crc kubenswrapper[4823]: I1216 08:00:41.750084 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" event={"ID":"25dec47c-3043-486c-b371-2be103c214e3","Type":"ContainerStarted","Data":"a801a5c1b5b7b42ebcabc28cc8ce824cde3775f622855cb462e4b7c503edfe83"} Dec 16 08:02:08 crc kubenswrapper[4823]: I1216 08:02:08.442895 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-4td8f"] Dec 16 08:02:08 crc kubenswrapper[4823]: E1216 08:02:08.444147 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e97bd637-59e2-4dfb-9935-6a84f4e46388" containerName="collect-profiles" Dec 16 08:02:08 crc kubenswrapper[4823]: I1216 08:02:08.444167 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="e97bd637-59e2-4dfb-9935-6a84f4e46388" containerName="collect-profiles" Dec 16 08:02:08 crc kubenswrapper[4823]: I1216 08:02:08.444334 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="e97bd637-59e2-4dfb-9935-6a84f4e46388" containerName="collect-profiles" Dec 16 08:02:08 crc kubenswrapper[4823]: I1216 08:02:08.445788 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4td8f" Dec 16 08:02:08 crc kubenswrapper[4823]: I1216 08:02:08.457972 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4td8f"] Dec 16 08:02:08 crc kubenswrapper[4823]: I1216 08:02:08.572873 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rckg8\" (UniqueName: \"kubernetes.io/projected/7f6bcd3b-b8bd-40f9-98e9-012863d00505-kube-api-access-rckg8\") pod \"redhat-operators-4td8f\" (UID: \"7f6bcd3b-b8bd-40f9-98e9-012863d00505\") " pod="openshift-marketplace/redhat-operators-4td8f" Dec 16 08:02:08 crc kubenswrapper[4823]: I1216 08:02:08.573278 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f6bcd3b-b8bd-40f9-98e9-012863d00505-catalog-content\") pod \"redhat-operators-4td8f\" (UID: \"7f6bcd3b-b8bd-40f9-98e9-012863d00505\") " pod="openshift-marketplace/redhat-operators-4td8f" Dec 16 08:02:08 crc kubenswrapper[4823]: I1216 08:02:08.573381 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f6bcd3b-b8bd-40f9-98e9-012863d00505-utilities\") pod \"redhat-operators-4td8f\" (UID: \"7f6bcd3b-b8bd-40f9-98e9-012863d00505\") " pod="openshift-marketplace/redhat-operators-4td8f" Dec 16 08:02:08 crc kubenswrapper[4823]: I1216 08:02:08.675014 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f6bcd3b-b8bd-40f9-98e9-012863d00505-utilities\") pod \"redhat-operators-4td8f\" (UID: \"7f6bcd3b-b8bd-40f9-98e9-012863d00505\") " pod="openshift-marketplace/redhat-operators-4td8f" Dec 16 08:02:08 crc kubenswrapper[4823]: I1216 08:02:08.675102 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-rckg8\" (UniqueName: \"kubernetes.io/projected/7f6bcd3b-b8bd-40f9-98e9-012863d00505-kube-api-access-rckg8\") pod \"redhat-operators-4td8f\" (UID: \"7f6bcd3b-b8bd-40f9-98e9-012863d00505\") " pod="openshift-marketplace/redhat-operators-4td8f" Dec 16 08:02:08 crc kubenswrapper[4823]: I1216 08:02:08.675160 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f6bcd3b-b8bd-40f9-98e9-012863d00505-catalog-content\") pod \"redhat-operators-4td8f\" (UID: \"7f6bcd3b-b8bd-40f9-98e9-012863d00505\") " pod="openshift-marketplace/redhat-operators-4td8f" Dec 16 08:02:08 crc kubenswrapper[4823]: I1216 08:02:08.675806 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f6bcd3b-b8bd-40f9-98e9-012863d00505-utilities\") pod \"redhat-operators-4td8f\" (UID: \"7f6bcd3b-b8bd-40f9-98e9-012863d00505\") " pod="openshift-marketplace/redhat-operators-4td8f" Dec 16 08:02:08 crc kubenswrapper[4823]: I1216 08:02:08.675891 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f6bcd3b-b8bd-40f9-98e9-012863d00505-catalog-content\") pod \"redhat-operators-4td8f\" (UID: \"7f6bcd3b-b8bd-40f9-98e9-012863d00505\") " pod="openshift-marketplace/redhat-operators-4td8f" Dec 16 08:02:08 crc kubenswrapper[4823]: I1216 08:02:08.699998 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rckg8\" (UniqueName: \"kubernetes.io/projected/7f6bcd3b-b8bd-40f9-98e9-012863d00505-kube-api-access-rckg8\") pod \"redhat-operators-4td8f\" (UID: \"7f6bcd3b-b8bd-40f9-98e9-012863d00505\") " pod="openshift-marketplace/redhat-operators-4td8f" Dec 16 08:02:08 crc kubenswrapper[4823]: I1216 08:02:08.766815 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4td8f" Dec 16 08:02:09 crc kubenswrapper[4823]: I1216 08:02:09.718773 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4td8f"] Dec 16 08:02:10 crc kubenswrapper[4823]: I1216 08:02:10.646088 4823 generic.go:334] "Generic (PLEG): container finished" podID="7f6bcd3b-b8bd-40f9-98e9-012863d00505" containerID="d4cdc65f186cb7b87897e670efe24414c73b5da23c4c0db36b9025c98fa2447b" exitCode=0 Dec 16 08:02:10 crc kubenswrapper[4823]: I1216 08:02:10.646214 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4td8f" event={"ID":"7f6bcd3b-b8bd-40f9-98e9-012863d00505","Type":"ContainerDied","Data":"d4cdc65f186cb7b87897e670efe24414c73b5da23c4c0db36b9025c98fa2447b"} Dec 16 08:02:10 crc kubenswrapper[4823]: I1216 08:02:10.646388 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4td8f" event={"ID":"7f6bcd3b-b8bd-40f9-98e9-012863d00505","Type":"ContainerStarted","Data":"81fe4b8296d864350991a6d07c792e1e67e38fa7989da01844b7b73e773ff637"} Dec 16 08:02:10 crc kubenswrapper[4823]: I1216 08:02:10.649647 4823 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 16 08:02:13 crc kubenswrapper[4823]: I1216 08:02:13.675130 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4td8f" event={"ID":"7f6bcd3b-b8bd-40f9-98e9-012863d00505","Type":"ContainerStarted","Data":"3a43e8f8a323d9bc8b802d565ae29adfbf9f4927d3294d87287f6819edbc5e9a"} Dec 16 08:02:14 crc kubenswrapper[4823]: I1216 08:02:14.687609 4823 generic.go:334] "Generic (PLEG): container finished" podID="7f6bcd3b-b8bd-40f9-98e9-012863d00505" containerID="3a43e8f8a323d9bc8b802d565ae29adfbf9f4927d3294d87287f6819edbc5e9a" exitCode=0 Dec 16 08:02:14 crc kubenswrapper[4823]: I1216 08:02:14.687692 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-4td8f" event={"ID":"7f6bcd3b-b8bd-40f9-98e9-012863d00505","Type":"ContainerDied","Data":"3a43e8f8a323d9bc8b802d565ae29adfbf9f4927d3294d87287f6819edbc5e9a"} Dec 16 08:02:15 crc kubenswrapper[4823]: I1216 08:02:15.695417 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4td8f" event={"ID":"7f6bcd3b-b8bd-40f9-98e9-012863d00505","Type":"ContainerStarted","Data":"c3c4b33461fed30965781fd2a7f5b88fda088295b11790bddaf0f722c391e171"} Dec 16 08:02:15 crc kubenswrapper[4823]: I1216 08:02:15.718510 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-4td8f" podStartSLOduration=3.161204979 podStartE2EDuration="7.71849329s" podCreationTimestamp="2025-12-16 08:02:08 +0000 UTC" firstStartedPulling="2025-12-16 08:02:10.649160737 +0000 UTC m=+4009.137726870" lastFinishedPulling="2025-12-16 08:02:15.206449058 +0000 UTC m=+4013.695015181" observedRunningTime="2025-12-16 08:02:15.712694768 +0000 UTC m=+4014.201260901" watchObservedRunningTime="2025-12-16 08:02:15.71849329 +0000 UTC m=+4014.207059413" Dec 16 08:02:18 crc kubenswrapper[4823]: I1216 08:02:18.768017 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-4td8f" Dec 16 08:02:18 crc kubenswrapper[4823]: I1216 08:02:18.769133 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-4td8f" Dec 16 08:02:19 crc kubenswrapper[4823]: I1216 08:02:19.814984 4823 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-4td8f" podUID="7f6bcd3b-b8bd-40f9-98e9-012863d00505" containerName="registry-server" probeResult="failure" output=< Dec 16 08:02:19 crc kubenswrapper[4823]: timeout: failed to connect service ":50051" within 1s Dec 16 08:02:19 crc kubenswrapper[4823]: > Dec 16 08:02:28 crc kubenswrapper[4823]: I1216 
08:02:28.810451 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-4td8f" Dec 16 08:02:28 crc kubenswrapper[4823]: I1216 08:02:28.847764 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-4td8f" Dec 16 08:02:29 crc kubenswrapper[4823]: I1216 08:02:29.048627 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4td8f"] Dec 16 08:02:30 crc kubenswrapper[4823]: I1216 08:02:30.844359 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-4td8f" podUID="7f6bcd3b-b8bd-40f9-98e9-012863d00505" containerName="registry-server" containerID="cri-o://c3c4b33461fed30965781fd2a7f5b88fda088295b11790bddaf0f722c391e171" gracePeriod=2 Dec 16 08:02:32 crc kubenswrapper[4823]: I1216 08:02:32.474365 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4td8f" Dec 16 08:02:32 crc kubenswrapper[4823]: I1216 08:02:32.593974 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f6bcd3b-b8bd-40f9-98e9-012863d00505-utilities\") pod \"7f6bcd3b-b8bd-40f9-98e9-012863d00505\" (UID: \"7f6bcd3b-b8bd-40f9-98e9-012863d00505\") " Dec 16 08:02:32 crc kubenswrapper[4823]: I1216 08:02:32.594015 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f6bcd3b-b8bd-40f9-98e9-012863d00505-catalog-content\") pod \"7f6bcd3b-b8bd-40f9-98e9-012863d00505\" (UID: \"7f6bcd3b-b8bd-40f9-98e9-012863d00505\") " Dec 16 08:02:32 crc kubenswrapper[4823]: I1216 08:02:32.594108 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rckg8\" (UniqueName: 
\"kubernetes.io/projected/7f6bcd3b-b8bd-40f9-98e9-012863d00505-kube-api-access-rckg8\") pod \"7f6bcd3b-b8bd-40f9-98e9-012863d00505\" (UID: \"7f6bcd3b-b8bd-40f9-98e9-012863d00505\") " Dec 16 08:02:32 crc kubenswrapper[4823]: I1216 08:02:32.596522 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f6bcd3b-b8bd-40f9-98e9-012863d00505-utilities" (OuterVolumeSpecName: "utilities") pod "7f6bcd3b-b8bd-40f9-98e9-012863d00505" (UID: "7f6bcd3b-b8bd-40f9-98e9-012863d00505"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 08:02:32 crc kubenswrapper[4823]: I1216 08:02:32.603412 4823 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f6bcd3b-b8bd-40f9-98e9-012863d00505-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 08:02:32 crc kubenswrapper[4823]: I1216 08:02:32.607425 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f6bcd3b-b8bd-40f9-98e9-012863d00505-kube-api-access-rckg8" (OuterVolumeSpecName: "kube-api-access-rckg8") pod "7f6bcd3b-b8bd-40f9-98e9-012863d00505" (UID: "7f6bcd3b-b8bd-40f9-98e9-012863d00505"). InnerVolumeSpecName "kube-api-access-rckg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:02:32 crc kubenswrapper[4823]: I1216 08:02:32.705136 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rckg8\" (UniqueName: \"kubernetes.io/projected/7f6bcd3b-b8bd-40f9-98e9-012863d00505-kube-api-access-rckg8\") on node \"crc\" DevicePath \"\"" Dec 16 08:02:32 crc kubenswrapper[4823]: I1216 08:02:32.711138 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f6bcd3b-b8bd-40f9-98e9-012863d00505-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7f6bcd3b-b8bd-40f9-98e9-012863d00505" (UID: "7f6bcd3b-b8bd-40f9-98e9-012863d00505"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 08:02:32 crc kubenswrapper[4823]: I1216 08:02:32.806803 4823 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f6bcd3b-b8bd-40f9-98e9-012863d00505-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 08:02:32 crc kubenswrapper[4823]: I1216 08:02:32.863659 4823 generic.go:334] "Generic (PLEG): container finished" podID="7f6bcd3b-b8bd-40f9-98e9-012863d00505" containerID="c3c4b33461fed30965781fd2a7f5b88fda088295b11790bddaf0f722c391e171" exitCode=0 Dec 16 08:02:32 crc kubenswrapper[4823]: I1216 08:02:32.863743 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4td8f" event={"ID":"7f6bcd3b-b8bd-40f9-98e9-012863d00505","Type":"ContainerDied","Data":"c3c4b33461fed30965781fd2a7f5b88fda088295b11790bddaf0f722c391e171"} Dec 16 08:02:32 crc kubenswrapper[4823]: I1216 08:02:32.863758 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4td8f" Dec 16 08:02:32 crc kubenswrapper[4823]: I1216 08:02:32.863775 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4td8f" event={"ID":"7f6bcd3b-b8bd-40f9-98e9-012863d00505","Type":"ContainerDied","Data":"81fe4b8296d864350991a6d07c792e1e67e38fa7989da01844b7b73e773ff637"} Dec 16 08:02:32 crc kubenswrapper[4823]: I1216 08:02:32.863791 4823 scope.go:117] "RemoveContainer" containerID="c3c4b33461fed30965781fd2a7f5b88fda088295b11790bddaf0f722c391e171" Dec 16 08:02:32 crc kubenswrapper[4823]: I1216 08:02:32.892154 4823 scope.go:117] "RemoveContainer" containerID="3a43e8f8a323d9bc8b802d565ae29adfbf9f4927d3294d87287f6819edbc5e9a" Dec 16 08:02:32 crc kubenswrapper[4823]: I1216 08:02:32.914973 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4td8f"] Dec 16 08:02:32 crc kubenswrapper[4823]: I1216 08:02:32.921630 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-4td8f"] Dec 16 08:02:32 crc kubenswrapper[4823]: I1216 08:02:32.923531 4823 scope.go:117] "RemoveContainer" containerID="d4cdc65f186cb7b87897e670efe24414c73b5da23c4c0db36b9025c98fa2447b" Dec 16 08:02:32 crc kubenswrapper[4823]: I1216 08:02:32.947233 4823 scope.go:117] "RemoveContainer" containerID="c3c4b33461fed30965781fd2a7f5b88fda088295b11790bddaf0f722c391e171" Dec 16 08:02:32 crc kubenswrapper[4823]: E1216 08:02:32.947890 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3c4b33461fed30965781fd2a7f5b88fda088295b11790bddaf0f722c391e171\": container with ID starting with c3c4b33461fed30965781fd2a7f5b88fda088295b11790bddaf0f722c391e171 not found: ID does not exist" containerID="c3c4b33461fed30965781fd2a7f5b88fda088295b11790bddaf0f722c391e171" Dec 16 08:02:32 crc kubenswrapper[4823]: I1216 08:02:32.947930 4823 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3c4b33461fed30965781fd2a7f5b88fda088295b11790bddaf0f722c391e171"} err="failed to get container status \"c3c4b33461fed30965781fd2a7f5b88fda088295b11790bddaf0f722c391e171\": rpc error: code = NotFound desc = could not find container \"c3c4b33461fed30965781fd2a7f5b88fda088295b11790bddaf0f722c391e171\": container with ID starting with c3c4b33461fed30965781fd2a7f5b88fda088295b11790bddaf0f722c391e171 not found: ID does not exist" Dec 16 08:02:32 crc kubenswrapper[4823]: I1216 08:02:32.947951 4823 scope.go:117] "RemoveContainer" containerID="3a43e8f8a323d9bc8b802d565ae29adfbf9f4927d3294d87287f6819edbc5e9a" Dec 16 08:02:32 crc kubenswrapper[4823]: E1216 08:02:32.948389 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a43e8f8a323d9bc8b802d565ae29adfbf9f4927d3294d87287f6819edbc5e9a\": container with ID starting with 3a43e8f8a323d9bc8b802d565ae29adfbf9f4927d3294d87287f6819edbc5e9a not found: ID does not exist" containerID="3a43e8f8a323d9bc8b802d565ae29adfbf9f4927d3294d87287f6819edbc5e9a" Dec 16 08:02:32 crc kubenswrapper[4823]: I1216 08:02:32.948436 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a43e8f8a323d9bc8b802d565ae29adfbf9f4927d3294d87287f6819edbc5e9a"} err="failed to get container status \"3a43e8f8a323d9bc8b802d565ae29adfbf9f4927d3294d87287f6819edbc5e9a\": rpc error: code = NotFound desc = could not find container \"3a43e8f8a323d9bc8b802d565ae29adfbf9f4927d3294d87287f6819edbc5e9a\": container with ID starting with 3a43e8f8a323d9bc8b802d565ae29adfbf9f4927d3294d87287f6819edbc5e9a not found: ID does not exist" Dec 16 08:02:32 crc kubenswrapper[4823]: I1216 08:02:32.948450 4823 scope.go:117] "RemoveContainer" containerID="d4cdc65f186cb7b87897e670efe24414c73b5da23c4c0db36b9025c98fa2447b" Dec 16 08:02:32 crc kubenswrapper[4823]: E1216 
08:02:32.948684 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4cdc65f186cb7b87897e670efe24414c73b5da23c4c0db36b9025c98fa2447b\": container with ID starting with d4cdc65f186cb7b87897e670efe24414c73b5da23c4c0db36b9025c98fa2447b not found: ID does not exist" containerID="d4cdc65f186cb7b87897e670efe24414c73b5da23c4c0db36b9025c98fa2447b" Dec 16 08:02:32 crc kubenswrapper[4823]: I1216 08:02:32.948706 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4cdc65f186cb7b87897e670efe24414c73b5da23c4c0db36b9025c98fa2447b"} err="failed to get container status \"d4cdc65f186cb7b87897e670efe24414c73b5da23c4c0db36b9025c98fa2447b\": rpc error: code = NotFound desc = could not find container \"d4cdc65f186cb7b87897e670efe24414c73b5da23c4c0db36b9025c98fa2447b\": container with ID starting with d4cdc65f186cb7b87897e670efe24414c73b5da23c4c0db36b9025c98fa2447b not found: ID does not exist" Dec 16 08:02:33 crc kubenswrapper[4823]: I1216 08:02:33.783232 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f6bcd3b-b8bd-40f9-98e9-012863d00505" path="/var/lib/kubelet/pods/7f6bcd3b-b8bd-40f9-98e9-012863d00505/volumes" Dec 16 08:02:58 crc kubenswrapper[4823]: I1216 08:02:58.134175 4823 patch_prober.go:28] interesting pod/machine-config-daemon-fv56f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 08:02:58 crc kubenswrapper[4823]: I1216 08:02:58.134878 4823 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Dec 16 08:03:23 crc kubenswrapper[4823]: I1216 08:03:23.623472 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-ggjl7"] Dec 16 08:03:23 crc kubenswrapper[4823]: E1216 08:03:23.624644 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f6bcd3b-b8bd-40f9-98e9-012863d00505" containerName="extract-utilities" Dec 16 08:03:23 crc kubenswrapper[4823]: I1216 08:03:23.624668 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f6bcd3b-b8bd-40f9-98e9-012863d00505" containerName="extract-utilities" Dec 16 08:03:23 crc kubenswrapper[4823]: E1216 08:03:23.624693 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f6bcd3b-b8bd-40f9-98e9-012863d00505" containerName="registry-server" Dec 16 08:03:23 crc kubenswrapper[4823]: I1216 08:03:23.624704 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f6bcd3b-b8bd-40f9-98e9-012863d00505" containerName="registry-server" Dec 16 08:03:23 crc kubenswrapper[4823]: E1216 08:03:23.624727 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f6bcd3b-b8bd-40f9-98e9-012863d00505" containerName="extract-content" Dec 16 08:03:23 crc kubenswrapper[4823]: I1216 08:03:23.624739 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f6bcd3b-b8bd-40f9-98e9-012863d00505" containerName="extract-content" Dec 16 08:03:23 crc kubenswrapper[4823]: I1216 08:03:23.624999 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f6bcd3b-b8bd-40f9-98e9-012863d00505" containerName="registry-server" Dec 16 08:03:23 crc kubenswrapper[4823]: I1216 08:03:23.626622 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ggjl7" Dec 16 08:03:23 crc kubenswrapper[4823]: I1216 08:03:23.643284 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ggjl7"] Dec 16 08:03:23 crc kubenswrapper[4823]: I1216 08:03:23.784823 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af469ad2-0947-4f6b-b27f-dc00d4f86407-utilities\") pod \"redhat-marketplace-ggjl7\" (UID: \"af469ad2-0947-4f6b-b27f-dc00d4f86407\") " pod="openshift-marketplace/redhat-marketplace-ggjl7" Dec 16 08:03:23 crc kubenswrapper[4823]: I1216 08:03:23.785148 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af469ad2-0947-4f6b-b27f-dc00d4f86407-catalog-content\") pod \"redhat-marketplace-ggjl7\" (UID: \"af469ad2-0947-4f6b-b27f-dc00d4f86407\") " pod="openshift-marketplace/redhat-marketplace-ggjl7" Dec 16 08:03:23 crc kubenswrapper[4823]: I1216 08:03:23.785234 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plzqv\" (UniqueName: \"kubernetes.io/projected/af469ad2-0947-4f6b-b27f-dc00d4f86407-kube-api-access-plzqv\") pod \"redhat-marketplace-ggjl7\" (UID: \"af469ad2-0947-4f6b-b27f-dc00d4f86407\") " pod="openshift-marketplace/redhat-marketplace-ggjl7" Dec 16 08:03:23 crc kubenswrapper[4823]: I1216 08:03:23.886914 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-plzqv\" (UniqueName: \"kubernetes.io/projected/af469ad2-0947-4f6b-b27f-dc00d4f86407-kube-api-access-plzqv\") pod \"redhat-marketplace-ggjl7\" (UID: \"af469ad2-0947-4f6b-b27f-dc00d4f86407\") " pod="openshift-marketplace/redhat-marketplace-ggjl7" Dec 16 08:03:23 crc kubenswrapper[4823]: I1216 08:03:23.887101 4823 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af469ad2-0947-4f6b-b27f-dc00d4f86407-utilities\") pod \"redhat-marketplace-ggjl7\" (UID: \"af469ad2-0947-4f6b-b27f-dc00d4f86407\") " pod="openshift-marketplace/redhat-marketplace-ggjl7" Dec 16 08:03:23 crc kubenswrapper[4823]: I1216 08:03:23.887177 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af469ad2-0947-4f6b-b27f-dc00d4f86407-catalog-content\") pod \"redhat-marketplace-ggjl7\" (UID: \"af469ad2-0947-4f6b-b27f-dc00d4f86407\") " pod="openshift-marketplace/redhat-marketplace-ggjl7" Dec 16 08:03:23 crc kubenswrapper[4823]: I1216 08:03:23.887647 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af469ad2-0947-4f6b-b27f-dc00d4f86407-catalog-content\") pod \"redhat-marketplace-ggjl7\" (UID: \"af469ad2-0947-4f6b-b27f-dc00d4f86407\") " pod="openshift-marketplace/redhat-marketplace-ggjl7" Dec 16 08:03:23 crc kubenswrapper[4823]: I1216 08:03:23.887772 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af469ad2-0947-4f6b-b27f-dc00d4f86407-utilities\") pod \"redhat-marketplace-ggjl7\" (UID: \"af469ad2-0947-4f6b-b27f-dc00d4f86407\") " pod="openshift-marketplace/redhat-marketplace-ggjl7" Dec 16 08:03:23 crc kubenswrapper[4823]: I1216 08:03:23.924230 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-plzqv\" (UniqueName: \"kubernetes.io/projected/af469ad2-0947-4f6b-b27f-dc00d4f86407-kube-api-access-plzqv\") pod \"redhat-marketplace-ggjl7\" (UID: \"af469ad2-0947-4f6b-b27f-dc00d4f86407\") " pod="openshift-marketplace/redhat-marketplace-ggjl7" Dec 16 08:03:23 crc kubenswrapper[4823]: I1216 08:03:23.947703 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ggjl7" Dec 16 08:03:24 crc kubenswrapper[4823]: I1216 08:03:24.381519 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ggjl7"] Dec 16 08:03:25 crc kubenswrapper[4823]: I1216 08:03:25.247585 4823 generic.go:334] "Generic (PLEG): container finished" podID="af469ad2-0947-4f6b-b27f-dc00d4f86407" containerID="a2a63559930639d0e42069563d1d384e4bd4578be7e96216bebfe86b873251d9" exitCode=0 Dec 16 08:03:25 crc kubenswrapper[4823]: I1216 08:03:25.247642 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ggjl7" event={"ID":"af469ad2-0947-4f6b-b27f-dc00d4f86407","Type":"ContainerDied","Data":"a2a63559930639d0e42069563d1d384e4bd4578be7e96216bebfe86b873251d9"} Dec 16 08:03:25 crc kubenswrapper[4823]: I1216 08:03:25.247872 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ggjl7" event={"ID":"af469ad2-0947-4f6b-b27f-dc00d4f86407","Type":"ContainerStarted","Data":"e514ce8be436bdcdfe182b35e07e65c48e9e75d760734733603007d90434a6a9"} Dec 16 08:03:26 crc kubenswrapper[4823]: I1216 08:03:26.256062 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ggjl7" event={"ID":"af469ad2-0947-4f6b-b27f-dc00d4f86407","Type":"ContainerStarted","Data":"0ee0b9426186afd681612e8b379fd57a0975e0ca17f046efd8dcadec61d61fbb"} Dec 16 08:03:27 crc kubenswrapper[4823]: I1216 08:03:27.265119 4823 generic.go:334] "Generic (PLEG): container finished" podID="af469ad2-0947-4f6b-b27f-dc00d4f86407" containerID="0ee0b9426186afd681612e8b379fd57a0975e0ca17f046efd8dcadec61d61fbb" exitCode=0 Dec 16 08:03:27 crc kubenswrapper[4823]: I1216 08:03:27.265200 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ggjl7" 
event={"ID":"af469ad2-0947-4f6b-b27f-dc00d4f86407","Type":"ContainerDied","Data":"0ee0b9426186afd681612e8b379fd57a0975e0ca17f046efd8dcadec61d61fbb"} Dec 16 08:03:28 crc kubenswrapper[4823]: I1216 08:03:28.134566 4823 patch_prober.go:28] interesting pod/machine-config-daemon-fv56f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 08:03:28 crc kubenswrapper[4823]: I1216 08:03:28.134873 4823 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 08:03:28 crc kubenswrapper[4823]: I1216 08:03:28.274206 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ggjl7" event={"ID":"af469ad2-0947-4f6b-b27f-dc00d4f86407","Type":"ContainerStarted","Data":"dd6c411f321295f633912d8bfdb2145fdd2cabf49920445d257e97ac41a7b659"} Dec 16 08:03:33 crc kubenswrapper[4823]: I1216 08:03:33.948094 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-ggjl7" Dec 16 08:03:33 crc kubenswrapper[4823]: I1216 08:03:33.948900 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-ggjl7" Dec 16 08:03:33 crc kubenswrapper[4823]: I1216 08:03:33.995778 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-ggjl7" Dec 16 08:03:34 crc kubenswrapper[4823]: I1216 08:03:34.018973 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-ggjl7" 
podStartSLOduration=8.394682125 podStartE2EDuration="11.018953115s" podCreationTimestamp="2025-12-16 08:03:23 +0000 UTC" firstStartedPulling="2025-12-16 08:03:25.253039234 +0000 UTC m=+4083.741605357" lastFinishedPulling="2025-12-16 08:03:27.877310214 +0000 UTC m=+4086.365876347" observedRunningTime="2025-12-16 08:03:28.310534096 +0000 UTC m=+4086.799100219" watchObservedRunningTime="2025-12-16 08:03:34.018953115 +0000 UTC m=+4092.507519238" Dec 16 08:03:34 crc kubenswrapper[4823]: I1216 08:03:34.358740 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-ggjl7" Dec 16 08:03:34 crc kubenswrapper[4823]: I1216 08:03:34.417554 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ggjl7"] Dec 16 08:03:36 crc kubenswrapper[4823]: I1216 08:03:36.328941 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-ggjl7" podUID="af469ad2-0947-4f6b-b27f-dc00d4f86407" containerName="registry-server" containerID="cri-o://dd6c411f321295f633912d8bfdb2145fdd2cabf49920445d257e97ac41a7b659" gracePeriod=2 Dec 16 08:03:37 crc kubenswrapper[4823]: I1216 08:03:37.252435 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ggjl7" Dec 16 08:03:37 crc kubenswrapper[4823]: I1216 08:03:37.338472 4823 generic.go:334] "Generic (PLEG): container finished" podID="af469ad2-0947-4f6b-b27f-dc00d4f86407" containerID="dd6c411f321295f633912d8bfdb2145fdd2cabf49920445d257e97ac41a7b659" exitCode=0 Dec 16 08:03:37 crc kubenswrapper[4823]: I1216 08:03:37.338520 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ggjl7" event={"ID":"af469ad2-0947-4f6b-b27f-dc00d4f86407","Type":"ContainerDied","Data":"dd6c411f321295f633912d8bfdb2145fdd2cabf49920445d257e97ac41a7b659"} Dec 16 08:03:37 crc kubenswrapper[4823]: I1216 08:03:37.338548 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ggjl7" Dec 16 08:03:37 crc kubenswrapper[4823]: I1216 08:03:37.338570 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ggjl7" event={"ID":"af469ad2-0947-4f6b-b27f-dc00d4f86407","Type":"ContainerDied","Data":"e514ce8be436bdcdfe182b35e07e65c48e9e75d760734733603007d90434a6a9"} Dec 16 08:03:37 crc kubenswrapper[4823]: I1216 08:03:37.338594 4823 scope.go:117] "RemoveContainer" containerID="dd6c411f321295f633912d8bfdb2145fdd2cabf49920445d257e97ac41a7b659" Dec 16 08:03:37 crc kubenswrapper[4823]: I1216 08:03:37.356676 4823 scope.go:117] "RemoveContainer" containerID="0ee0b9426186afd681612e8b379fd57a0975e0ca17f046efd8dcadec61d61fbb" Dec 16 08:03:37 crc kubenswrapper[4823]: I1216 08:03:37.375559 4823 scope.go:117] "RemoveContainer" containerID="a2a63559930639d0e42069563d1d384e4bd4578be7e96216bebfe86b873251d9" Dec 16 08:03:37 crc kubenswrapper[4823]: I1216 08:03:37.386840 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-plzqv\" (UniqueName: \"kubernetes.io/projected/af469ad2-0947-4f6b-b27f-dc00d4f86407-kube-api-access-plzqv\") pod 
\"af469ad2-0947-4f6b-b27f-dc00d4f86407\" (UID: \"af469ad2-0947-4f6b-b27f-dc00d4f86407\") " Dec 16 08:03:37 crc kubenswrapper[4823]: I1216 08:03:37.386901 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af469ad2-0947-4f6b-b27f-dc00d4f86407-catalog-content\") pod \"af469ad2-0947-4f6b-b27f-dc00d4f86407\" (UID: \"af469ad2-0947-4f6b-b27f-dc00d4f86407\") " Dec 16 08:03:37 crc kubenswrapper[4823]: I1216 08:03:37.386939 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af469ad2-0947-4f6b-b27f-dc00d4f86407-utilities\") pod \"af469ad2-0947-4f6b-b27f-dc00d4f86407\" (UID: \"af469ad2-0947-4f6b-b27f-dc00d4f86407\") " Dec 16 08:03:37 crc kubenswrapper[4823]: I1216 08:03:37.388125 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af469ad2-0947-4f6b-b27f-dc00d4f86407-utilities" (OuterVolumeSpecName: "utilities") pod "af469ad2-0947-4f6b-b27f-dc00d4f86407" (UID: "af469ad2-0947-4f6b-b27f-dc00d4f86407"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 08:03:37 crc kubenswrapper[4823]: I1216 08:03:37.393302 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af469ad2-0947-4f6b-b27f-dc00d4f86407-kube-api-access-plzqv" (OuterVolumeSpecName: "kube-api-access-plzqv") pod "af469ad2-0947-4f6b-b27f-dc00d4f86407" (UID: "af469ad2-0947-4f6b-b27f-dc00d4f86407"). InnerVolumeSpecName "kube-api-access-plzqv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:03:37 crc kubenswrapper[4823]: I1216 08:03:37.407232 4823 scope.go:117] "RemoveContainer" containerID="dd6c411f321295f633912d8bfdb2145fdd2cabf49920445d257e97ac41a7b659" Dec 16 08:03:37 crc kubenswrapper[4823]: E1216 08:03:37.407839 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd6c411f321295f633912d8bfdb2145fdd2cabf49920445d257e97ac41a7b659\": container with ID starting with dd6c411f321295f633912d8bfdb2145fdd2cabf49920445d257e97ac41a7b659 not found: ID does not exist" containerID="dd6c411f321295f633912d8bfdb2145fdd2cabf49920445d257e97ac41a7b659" Dec 16 08:03:37 crc kubenswrapper[4823]: I1216 08:03:37.407929 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd6c411f321295f633912d8bfdb2145fdd2cabf49920445d257e97ac41a7b659"} err="failed to get container status \"dd6c411f321295f633912d8bfdb2145fdd2cabf49920445d257e97ac41a7b659\": rpc error: code = NotFound desc = could not find container \"dd6c411f321295f633912d8bfdb2145fdd2cabf49920445d257e97ac41a7b659\": container with ID starting with dd6c411f321295f633912d8bfdb2145fdd2cabf49920445d257e97ac41a7b659 not found: ID does not exist" Dec 16 08:03:37 crc kubenswrapper[4823]: I1216 08:03:37.407966 4823 scope.go:117] "RemoveContainer" containerID="0ee0b9426186afd681612e8b379fd57a0975e0ca17f046efd8dcadec61d61fbb" Dec 16 08:03:37 crc kubenswrapper[4823]: E1216 08:03:37.408724 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ee0b9426186afd681612e8b379fd57a0975e0ca17f046efd8dcadec61d61fbb\": container with ID starting with 0ee0b9426186afd681612e8b379fd57a0975e0ca17f046efd8dcadec61d61fbb not found: ID does not exist" containerID="0ee0b9426186afd681612e8b379fd57a0975e0ca17f046efd8dcadec61d61fbb" Dec 16 08:03:37 crc kubenswrapper[4823]: I1216 08:03:37.408835 
4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ee0b9426186afd681612e8b379fd57a0975e0ca17f046efd8dcadec61d61fbb"} err="failed to get container status \"0ee0b9426186afd681612e8b379fd57a0975e0ca17f046efd8dcadec61d61fbb\": rpc error: code = NotFound desc = could not find container \"0ee0b9426186afd681612e8b379fd57a0975e0ca17f046efd8dcadec61d61fbb\": container with ID starting with 0ee0b9426186afd681612e8b379fd57a0975e0ca17f046efd8dcadec61d61fbb not found: ID does not exist" Dec 16 08:03:37 crc kubenswrapper[4823]: I1216 08:03:37.408921 4823 scope.go:117] "RemoveContainer" containerID="a2a63559930639d0e42069563d1d384e4bd4578be7e96216bebfe86b873251d9" Dec 16 08:03:37 crc kubenswrapper[4823]: E1216 08:03:37.409584 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2a63559930639d0e42069563d1d384e4bd4578be7e96216bebfe86b873251d9\": container with ID starting with a2a63559930639d0e42069563d1d384e4bd4578be7e96216bebfe86b873251d9 not found: ID does not exist" containerID="a2a63559930639d0e42069563d1d384e4bd4578be7e96216bebfe86b873251d9" Dec 16 08:03:37 crc kubenswrapper[4823]: I1216 08:03:37.409607 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2a63559930639d0e42069563d1d384e4bd4578be7e96216bebfe86b873251d9"} err="failed to get container status \"a2a63559930639d0e42069563d1d384e4bd4578be7e96216bebfe86b873251d9\": rpc error: code = NotFound desc = could not find container \"a2a63559930639d0e42069563d1d384e4bd4578be7e96216bebfe86b873251d9\": container with ID starting with a2a63559930639d0e42069563d1d384e4bd4578be7e96216bebfe86b873251d9 not found: ID does not exist" Dec 16 08:03:37 crc kubenswrapper[4823]: I1216 08:03:37.424267 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af469ad2-0947-4f6b-b27f-dc00d4f86407-catalog-content" 
(OuterVolumeSpecName: "catalog-content") pod "af469ad2-0947-4f6b-b27f-dc00d4f86407" (UID: "af469ad2-0947-4f6b-b27f-dc00d4f86407"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 08:03:37 crc kubenswrapper[4823]: I1216 08:03:37.489514 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-plzqv\" (UniqueName: \"kubernetes.io/projected/af469ad2-0947-4f6b-b27f-dc00d4f86407-kube-api-access-plzqv\") on node \"crc\" DevicePath \"\"" Dec 16 08:03:37 crc kubenswrapper[4823]: I1216 08:03:37.489561 4823 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af469ad2-0947-4f6b-b27f-dc00d4f86407-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 08:03:37 crc kubenswrapper[4823]: I1216 08:03:37.489580 4823 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af469ad2-0947-4f6b-b27f-dc00d4f86407-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 08:03:37 crc kubenswrapper[4823]: I1216 08:03:37.684912 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ggjl7"] Dec 16 08:03:37 crc kubenswrapper[4823]: I1216 08:03:37.696105 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-ggjl7"] Dec 16 08:03:37 crc kubenswrapper[4823]: I1216 08:03:37.786519 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af469ad2-0947-4f6b-b27f-dc00d4f86407" path="/var/lib/kubelet/pods/af469ad2-0947-4f6b-b27f-dc00d4f86407/volumes" Dec 16 08:03:58 crc kubenswrapper[4823]: I1216 08:03:58.134308 4823 patch_prober.go:28] interesting pod/machine-config-daemon-fv56f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 08:03:58 
crc kubenswrapper[4823]: I1216 08:03:58.134827 4823 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 08:03:58 crc kubenswrapper[4823]: I1216 08:03:58.134896 4823 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" Dec 16 08:03:58 crc kubenswrapper[4823]: I1216 08:03:58.135744 4823 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a801a5c1b5b7b42ebcabc28cc8ce824cde3775f622855cb462e4b7c503edfe83"} pod="openshift-machine-config-operator/machine-config-daemon-fv56f" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 16 08:03:58 crc kubenswrapper[4823]: I1216 08:03:58.135827 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" containerName="machine-config-daemon" containerID="cri-o://a801a5c1b5b7b42ebcabc28cc8ce824cde3775f622855cb462e4b7c503edfe83" gracePeriod=600 Dec 16 08:03:58 crc kubenswrapper[4823]: I1216 08:03:58.495617 4823 generic.go:334] "Generic (PLEG): container finished" podID="25dec47c-3043-486c-b371-2be103c214e3" containerID="a801a5c1b5b7b42ebcabc28cc8ce824cde3775f622855cb462e4b7c503edfe83" exitCode=0 Dec 16 08:03:58 crc kubenswrapper[4823]: I1216 08:03:58.495667 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" 
event={"ID":"25dec47c-3043-486c-b371-2be103c214e3","Type":"ContainerDied","Data":"a801a5c1b5b7b42ebcabc28cc8ce824cde3775f622855cb462e4b7c503edfe83"} Dec 16 08:03:58 crc kubenswrapper[4823]: I1216 08:03:58.496404 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" event={"ID":"25dec47c-3043-486c-b371-2be103c214e3","Type":"ContainerStarted","Data":"45952c178049555424b8721fc9a52334bd5e546f8c44fee9f8bebca7e7973c92"} Dec 16 08:03:58 crc kubenswrapper[4823]: I1216 08:03:58.496463 4823 scope.go:117] "RemoveContainer" containerID="ce2cc9ad0ea70d189504e80b006c88ca5902658469d84ed52b52b4f88ad839eb" Dec 16 08:05:58 crc kubenswrapper[4823]: I1216 08:05:58.134293 4823 patch_prober.go:28] interesting pod/machine-config-daemon-fv56f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 08:05:58 crc kubenswrapper[4823]: I1216 08:05:58.134877 4823 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 08:06:08 crc kubenswrapper[4823]: I1216 08:06:08.206314 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-v4wh2"] Dec 16 08:06:08 crc kubenswrapper[4823]: E1216 08:06:08.215465 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af469ad2-0947-4f6b-b27f-dc00d4f86407" containerName="registry-server" Dec 16 08:06:08 crc kubenswrapper[4823]: I1216 08:06:08.215518 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="af469ad2-0947-4f6b-b27f-dc00d4f86407" containerName="registry-server" Dec 16 08:06:08 crc 
kubenswrapper[4823]: E1216 08:06:08.215562 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af469ad2-0947-4f6b-b27f-dc00d4f86407" containerName="extract-utilities" Dec 16 08:06:08 crc kubenswrapper[4823]: I1216 08:06:08.215575 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="af469ad2-0947-4f6b-b27f-dc00d4f86407" containerName="extract-utilities" Dec 16 08:06:08 crc kubenswrapper[4823]: E1216 08:06:08.215598 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af469ad2-0947-4f6b-b27f-dc00d4f86407" containerName="extract-content" Dec 16 08:06:08 crc kubenswrapper[4823]: I1216 08:06:08.215607 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="af469ad2-0947-4f6b-b27f-dc00d4f86407" containerName="extract-content" Dec 16 08:06:08 crc kubenswrapper[4823]: I1216 08:06:08.215827 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="af469ad2-0947-4f6b-b27f-dc00d4f86407" containerName="registry-server" Dec 16 08:06:08 crc kubenswrapper[4823]: I1216 08:06:08.217385 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-v4wh2" Dec 16 08:06:08 crc kubenswrapper[4823]: I1216 08:06:08.246145 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-v4wh2"] Dec 16 08:06:08 crc kubenswrapper[4823]: I1216 08:06:08.375283 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/700acf03-28ce-4c8f-a3b0-587d20183ba8-utilities\") pod \"certified-operators-v4wh2\" (UID: \"700acf03-28ce-4c8f-a3b0-587d20183ba8\") " pod="openshift-marketplace/certified-operators-v4wh2" Dec 16 08:06:08 crc kubenswrapper[4823]: I1216 08:06:08.375348 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/700acf03-28ce-4c8f-a3b0-587d20183ba8-catalog-content\") pod \"certified-operators-v4wh2\" (UID: \"700acf03-28ce-4c8f-a3b0-587d20183ba8\") " pod="openshift-marketplace/certified-operators-v4wh2" Dec 16 08:06:08 crc kubenswrapper[4823]: I1216 08:06:08.375904 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlxzh\" (UniqueName: \"kubernetes.io/projected/700acf03-28ce-4c8f-a3b0-587d20183ba8-kube-api-access-xlxzh\") pod \"certified-operators-v4wh2\" (UID: \"700acf03-28ce-4c8f-a3b0-587d20183ba8\") " pod="openshift-marketplace/certified-operators-v4wh2" Dec 16 08:06:08 crc kubenswrapper[4823]: I1216 08:06:08.476980 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/700acf03-28ce-4c8f-a3b0-587d20183ba8-utilities\") pod \"certified-operators-v4wh2\" (UID: \"700acf03-28ce-4c8f-a3b0-587d20183ba8\") " pod="openshift-marketplace/certified-operators-v4wh2" Dec 16 08:06:08 crc kubenswrapper[4823]: I1216 08:06:08.477043 4823 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/700acf03-28ce-4c8f-a3b0-587d20183ba8-catalog-content\") pod \"certified-operators-v4wh2\" (UID: \"700acf03-28ce-4c8f-a3b0-587d20183ba8\") " pod="openshift-marketplace/certified-operators-v4wh2" Dec 16 08:06:08 crc kubenswrapper[4823]: I1216 08:06:08.477135 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xlxzh\" (UniqueName: \"kubernetes.io/projected/700acf03-28ce-4c8f-a3b0-587d20183ba8-kube-api-access-xlxzh\") pod \"certified-operators-v4wh2\" (UID: \"700acf03-28ce-4c8f-a3b0-587d20183ba8\") " pod="openshift-marketplace/certified-operators-v4wh2" Dec 16 08:06:08 crc kubenswrapper[4823]: I1216 08:06:08.477610 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/700acf03-28ce-4c8f-a3b0-587d20183ba8-catalog-content\") pod \"certified-operators-v4wh2\" (UID: \"700acf03-28ce-4c8f-a3b0-587d20183ba8\") " pod="openshift-marketplace/certified-operators-v4wh2" Dec 16 08:06:08 crc kubenswrapper[4823]: I1216 08:06:08.477638 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/700acf03-28ce-4c8f-a3b0-587d20183ba8-utilities\") pod \"certified-operators-v4wh2\" (UID: \"700acf03-28ce-4c8f-a3b0-587d20183ba8\") " pod="openshift-marketplace/certified-operators-v4wh2" Dec 16 08:06:08 crc kubenswrapper[4823]: I1216 08:06:08.499664 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlxzh\" (UniqueName: \"kubernetes.io/projected/700acf03-28ce-4c8f-a3b0-587d20183ba8-kube-api-access-xlxzh\") pod \"certified-operators-v4wh2\" (UID: \"700acf03-28ce-4c8f-a3b0-587d20183ba8\") " pod="openshift-marketplace/certified-operators-v4wh2" Dec 16 08:06:08 crc kubenswrapper[4823]: I1216 08:06:08.554480 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-v4wh2" Dec 16 08:06:09 crc kubenswrapper[4823]: I1216 08:06:09.053385 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-v4wh2"] Dec 16 08:06:09 crc kubenswrapper[4823]: I1216 08:06:09.536721 4823 generic.go:334] "Generic (PLEG): container finished" podID="700acf03-28ce-4c8f-a3b0-587d20183ba8" containerID="37495b6ccc4aa7408ba2b64684ba4a887bb8d85ab175d1c923a5e27ac77e8d08" exitCode=0 Dec 16 08:06:09 crc kubenswrapper[4823]: I1216 08:06:09.536779 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v4wh2" event={"ID":"700acf03-28ce-4c8f-a3b0-587d20183ba8","Type":"ContainerDied","Data":"37495b6ccc4aa7408ba2b64684ba4a887bb8d85ab175d1c923a5e27ac77e8d08"} Dec 16 08:06:09 crc kubenswrapper[4823]: I1216 08:06:09.536824 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v4wh2" event={"ID":"700acf03-28ce-4c8f-a3b0-587d20183ba8","Type":"ContainerStarted","Data":"c452c48230f50b0288f381df780417d1de6574e34482016a7b7f1cdfc24da1b0"} Dec 16 08:06:10 crc kubenswrapper[4823]: I1216 08:06:10.546608 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v4wh2" event={"ID":"700acf03-28ce-4c8f-a3b0-587d20183ba8","Type":"ContainerStarted","Data":"a787d60ec3fcec89a1d6be86cb9fc97ebf6ed5874ab702348943bbde74bcea98"} Dec 16 08:06:11 crc kubenswrapper[4823]: I1216 08:06:11.555582 4823 generic.go:334] "Generic (PLEG): container finished" podID="700acf03-28ce-4c8f-a3b0-587d20183ba8" containerID="a787d60ec3fcec89a1d6be86cb9fc97ebf6ed5874ab702348943bbde74bcea98" exitCode=0 Dec 16 08:06:11 crc kubenswrapper[4823]: I1216 08:06:11.555695 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v4wh2" 
event={"ID":"700acf03-28ce-4c8f-a3b0-587d20183ba8","Type":"ContainerDied","Data":"a787d60ec3fcec89a1d6be86cb9fc97ebf6ed5874ab702348943bbde74bcea98"} Dec 16 08:06:12 crc kubenswrapper[4823]: I1216 08:06:12.565759 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v4wh2" event={"ID":"700acf03-28ce-4c8f-a3b0-587d20183ba8","Type":"ContainerStarted","Data":"b46c51e686cb4e81738ad577b698667f062e0dee9369f00fa60e2a7be118c7d2"} Dec 16 08:06:18 crc kubenswrapper[4823]: I1216 08:06:18.556423 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-v4wh2" Dec 16 08:06:18 crc kubenswrapper[4823]: I1216 08:06:18.557055 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-v4wh2" Dec 16 08:06:18 crc kubenswrapper[4823]: I1216 08:06:18.612610 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-v4wh2" Dec 16 08:06:18 crc kubenswrapper[4823]: I1216 08:06:18.647640 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-v4wh2" podStartSLOduration=8.194573601 podStartE2EDuration="10.647609474s" podCreationTimestamp="2025-12-16 08:06:08 +0000 UTC" firstStartedPulling="2025-12-16 08:06:09.538314557 +0000 UTC m=+4248.026880680" lastFinishedPulling="2025-12-16 08:06:11.99135042 +0000 UTC m=+4250.479916553" observedRunningTime="2025-12-16 08:06:12.588525001 +0000 UTC m=+4251.077091124" watchObservedRunningTime="2025-12-16 08:06:18.647609474 +0000 UTC m=+4257.136175597" Dec 16 08:06:18 crc kubenswrapper[4823]: I1216 08:06:18.669241 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-v4wh2" Dec 16 08:06:18 crc kubenswrapper[4823]: I1216 08:06:18.866876 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-v4wh2"] Dec 16 08:06:20 crc kubenswrapper[4823]: I1216 08:06:20.625991 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-v4wh2" podUID="700acf03-28ce-4c8f-a3b0-587d20183ba8" containerName="registry-server" containerID="cri-o://b46c51e686cb4e81738ad577b698667f062e0dee9369f00fa60e2a7be118c7d2" gracePeriod=2 Dec 16 08:06:21 crc kubenswrapper[4823]: I1216 08:06:21.633688 4823 generic.go:334] "Generic (PLEG): container finished" podID="700acf03-28ce-4c8f-a3b0-587d20183ba8" containerID="b46c51e686cb4e81738ad577b698667f062e0dee9369f00fa60e2a7be118c7d2" exitCode=0 Dec 16 08:06:21 crc kubenswrapper[4823]: I1216 08:06:21.633762 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v4wh2" event={"ID":"700acf03-28ce-4c8f-a3b0-587d20183ba8","Type":"ContainerDied","Data":"b46c51e686cb4e81738ad577b698667f062e0dee9369f00fa60e2a7be118c7d2"} Dec 16 08:06:22 crc kubenswrapper[4823]: I1216 08:06:22.258222 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-v4wh2" Dec 16 08:06:22 crc kubenswrapper[4823]: I1216 08:06:22.396182 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xlxzh\" (UniqueName: \"kubernetes.io/projected/700acf03-28ce-4c8f-a3b0-587d20183ba8-kube-api-access-xlxzh\") pod \"700acf03-28ce-4c8f-a3b0-587d20183ba8\" (UID: \"700acf03-28ce-4c8f-a3b0-587d20183ba8\") " Dec 16 08:06:22 crc kubenswrapper[4823]: I1216 08:06:22.396559 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/700acf03-28ce-4c8f-a3b0-587d20183ba8-catalog-content\") pod \"700acf03-28ce-4c8f-a3b0-587d20183ba8\" (UID: \"700acf03-28ce-4c8f-a3b0-587d20183ba8\") " Dec 16 08:06:22 crc kubenswrapper[4823]: I1216 08:06:22.396598 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/700acf03-28ce-4c8f-a3b0-587d20183ba8-utilities\") pod \"700acf03-28ce-4c8f-a3b0-587d20183ba8\" (UID: \"700acf03-28ce-4c8f-a3b0-587d20183ba8\") " Dec 16 08:06:22 crc kubenswrapper[4823]: I1216 08:06:22.397527 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/700acf03-28ce-4c8f-a3b0-587d20183ba8-utilities" (OuterVolumeSpecName: "utilities") pod "700acf03-28ce-4c8f-a3b0-587d20183ba8" (UID: "700acf03-28ce-4c8f-a3b0-587d20183ba8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 08:06:22 crc kubenswrapper[4823]: I1216 08:06:22.448299 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/700acf03-28ce-4c8f-a3b0-587d20183ba8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "700acf03-28ce-4c8f-a3b0-587d20183ba8" (UID: "700acf03-28ce-4c8f-a3b0-587d20183ba8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 08:06:22 crc kubenswrapper[4823]: I1216 08:06:22.498066 4823 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/700acf03-28ce-4c8f-a3b0-587d20183ba8-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 08:06:22 crc kubenswrapper[4823]: I1216 08:06:22.498118 4823 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/700acf03-28ce-4c8f-a3b0-587d20183ba8-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 08:06:22 crc kubenswrapper[4823]: I1216 08:06:22.643119 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v4wh2" event={"ID":"700acf03-28ce-4c8f-a3b0-587d20183ba8","Type":"ContainerDied","Data":"c452c48230f50b0288f381df780417d1de6574e34482016a7b7f1cdfc24da1b0"} Dec 16 08:06:22 crc kubenswrapper[4823]: I1216 08:06:22.643173 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-v4wh2" Dec 16 08:06:22 crc kubenswrapper[4823]: I1216 08:06:22.643180 4823 scope.go:117] "RemoveContainer" containerID="b46c51e686cb4e81738ad577b698667f062e0dee9369f00fa60e2a7be118c7d2" Dec 16 08:06:22 crc kubenswrapper[4823]: I1216 08:06:22.667032 4823 scope.go:117] "RemoveContainer" containerID="a787d60ec3fcec89a1d6be86cb9fc97ebf6ed5874ab702348943bbde74bcea98" Dec 16 08:06:22 crc kubenswrapper[4823]: I1216 08:06:22.681957 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/700acf03-28ce-4c8f-a3b0-587d20183ba8-kube-api-access-xlxzh" (OuterVolumeSpecName: "kube-api-access-xlxzh") pod "700acf03-28ce-4c8f-a3b0-587d20183ba8" (UID: "700acf03-28ce-4c8f-a3b0-587d20183ba8"). InnerVolumeSpecName "kube-api-access-xlxzh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:06:22 crc kubenswrapper[4823]: I1216 08:06:22.700980 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xlxzh\" (UniqueName: \"kubernetes.io/projected/700acf03-28ce-4c8f-a3b0-587d20183ba8-kube-api-access-xlxzh\") on node \"crc\" DevicePath \"\"" Dec 16 08:06:22 crc kubenswrapper[4823]: I1216 08:06:22.793767 4823 scope.go:117] "RemoveContainer" containerID="37495b6ccc4aa7408ba2b64684ba4a887bb8d85ab175d1c923a5e27ac77e8d08" Dec 16 08:06:22 crc kubenswrapper[4823]: I1216 08:06:22.978700 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-v4wh2"] Dec 16 08:06:22 crc kubenswrapper[4823]: I1216 08:06:22.984243 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-v4wh2"] Dec 16 08:06:23 crc kubenswrapper[4823]: I1216 08:06:23.781362 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="700acf03-28ce-4c8f-a3b0-587d20183ba8" path="/var/lib/kubelet/pods/700acf03-28ce-4c8f-a3b0-587d20183ba8/volumes" Dec 16 08:06:28 crc kubenswrapper[4823]: I1216 08:06:28.134052 4823 patch_prober.go:28] interesting pod/machine-config-daemon-fv56f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 08:06:28 crc kubenswrapper[4823]: I1216 08:06:28.134537 4823 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 08:06:45 crc kubenswrapper[4823]: I1216 08:06:45.561409 4823 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/community-operators-l9bhc"] Dec 16 08:06:45 crc kubenswrapper[4823]: E1216 08:06:45.562575 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="700acf03-28ce-4c8f-a3b0-587d20183ba8" containerName="extract-content" Dec 16 08:06:45 crc kubenswrapper[4823]: I1216 08:06:45.562595 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="700acf03-28ce-4c8f-a3b0-587d20183ba8" containerName="extract-content" Dec 16 08:06:45 crc kubenswrapper[4823]: E1216 08:06:45.562607 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="700acf03-28ce-4c8f-a3b0-587d20183ba8" containerName="registry-server" Dec 16 08:06:45 crc kubenswrapper[4823]: I1216 08:06:45.562620 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="700acf03-28ce-4c8f-a3b0-587d20183ba8" containerName="registry-server" Dec 16 08:06:45 crc kubenswrapper[4823]: E1216 08:06:45.562645 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="700acf03-28ce-4c8f-a3b0-587d20183ba8" containerName="extract-utilities" Dec 16 08:06:45 crc kubenswrapper[4823]: I1216 08:06:45.562652 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="700acf03-28ce-4c8f-a3b0-587d20183ba8" containerName="extract-utilities" Dec 16 08:06:45 crc kubenswrapper[4823]: I1216 08:06:45.577448 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="700acf03-28ce-4c8f-a3b0-587d20183ba8" containerName="registry-server" Dec 16 08:06:45 crc kubenswrapper[4823]: I1216 08:06:45.588468 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-l9bhc" Dec 16 08:06:45 crc kubenswrapper[4823]: I1216 08:06:45.593280 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-l9bhc"] Dec 16 08:06:45 crc kubenswrapper[4823]: I1216 08:06:45.633350 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60ef728e-9c9c-44af-94cf-4fa04a6ef263-utilities\") pod \"community-operators-l9bhc\" (UID: \"60ef728e-9c9c-44af-94cf-4fa04a6ef263\") " pod="openshift-marketplace/community-operators-l9bhc" Dec 16 08:06:45 crc kubenswrapper[4823]: I1216 08:06:45.633529 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60ef728e-9c9c-44af-94cf-4fa04a6ef263-catalog-content\") pod \"community-operators-l9bhc\" (UID: \"60ef728e-9c9c-44af-94cf-4fa04a6ef263\") " pod="openshift-marketplace/community-operators-l9bhc" Dec 16 08:06:45 crc kubenswrapper[4823]: I1216 08:06:45.633689 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4hsg\" (UniqueName: \"kubernetes.io/projected/60ef728e-9c9c-44af-94cf-4fa04a6ef263-kube-api-access-p4hsg\") pod \"community-operators-l9bhc\" (UID: \"60ef728e-9c9c-44af-94cf-4fa04a6ef263\") " pod="openshift-marketplace/community-operators-l9bhc" Dec 16 08:06:45 crc kubenswrapper[4823]: I1216 08:06:45.734938 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60ef728e-9c9c-44af-94cf-4fa04a6ef263-catalog-content\") pod \"community-operators-l9bhc\" (UID: \"60ef728e-9c9c-44af-94cf-4fa04a6ef263\") " pod="openshift-marketplace/community-operators-l9bhc" Dec 16 08:06:45 crc kubenswrapper[4823]: I1216 08:06:45.735008 4823 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-p4hsg\" (UniqueName: \"kubernetes.io/projected/60ef728e-9c9c-44af-94cf-4fa04a6ef263-kube-api-access-p4hsg\") pod \"community-operators-l9bhc\" (UID: \"60ef728e-9c9c-44af-94cf-4fa04a6ef263\") " pod="openshift-marketplace/community-operators-l9bhc" Dec 16 08:06:45 crc kubenswrapper[4823]: I1216 08:06:45.735059 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60ef728e-9c9c-44af-94cf-4fa04a6ef263-utilities\") pod \"community-operators-l9bhc\" (UID: \"60ef728e-9c9c-44af-94cf-4fa04a6ef263\") " pod="openshift-marketplace/community-operators-l9bhc" Dec 16 08:06:45 crc kubenswrapper[4823]: I1216 08:06:45.735459 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60ef728e-9c9c-44af-94cf-4fa04a6ef263-catalog-content\") pod \"community-operators-l9bhc\" (UID: \"60ef728e-9c9c-44af-94cf-4fa04a6ef263\") " pod="openshift-marketplace/community-operators-l9bhc" Dec 16 08:06:45 crc kubenswrapper[4823]: I1216 08:06:45.735552 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60ef728e-9c9c-44af-94cf-4fa04a6ef263-utilities\") pod \"community-operators-l9bhc\" (UID: \"60ef728e-9c9c-44af-94cf-4fa04a6ef263\") " pod="openshift-marketplace/community-operators-l9bhc" Dec 16 08:06:45 crc kubenswrapper[4823]: I1216 08:06:45.759599 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4hsg\" (UniqueName: \"kubernetes.io/projected/60ef728e-9c9c-44af-94cf-4fa04a6ef263-kube-api-access-p4hsg\") pod \"community-operators-l9bhc\" (UID: \"60ef728e-9c9c-44af-94cf-4fa04a6ef263\") " pod="openshift-marketplace/community-operators-l9bhc" Dec 16 08:06:45 crc kubenswrapper[4823]: I1216 08:06:45.912281 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-l9bhc" Dec 16 08:06:46 crc kubenswrapper[4823]: I1216 08:06:46.403775 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-l9bhc"] Dec 16 08:06:46 crc kubenswrapper[4823]: I1216 08:06:46.808231 4823 generic.go:334] "Generic (PLEG): container finished" podID="60ef728e-9c9c-44af-94cf-4fa04a6ef263" containerID="3c3923508c773955604023baed28774a574d62d50638373233579e16e3273c89" exitCode=0 Dec 16 08:06:46 crc kubenswrapper[4823]: I1216 08:06:46.808318 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l9bhc" event={"ID":"60ef728e-9c9c-44af-94cf-4fa04a6ef263","Type":"ContainerDied","Data":"3c3923508c773955604023baed28774a574d62d50638373233579e16e3273c89"} Dec 16 08:06:46 crc kubenswrapper[4823]: I1216 08:06:46.808692 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l9bhc" event={"ID":"60ef728e-9c9c-44af-94cf-4fa04a6ef263","Type":"ContainerStarted","Data":"d14a97c6dd158959151be7dc158764b783bf6fd926e7a383d527fe263d578395"} Dec 16 08:06:47 crc kubenswrapper[4823]: I1216 08:06:47.817977 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l9bhc" event={"ID":"60ef728e-9c9c-44af-94cf-4fa04a6ef263","Type":"ContainerStarted","Data":"ab167d6d2d38de7ca7d576ff96b3cce52d85a58059b990b1250c6b8a8d8408e1"} Dec 16 08:06:48 crc kubenswrapper[4823]: I1216 08:06:48.828014 4823 generic.go:334] "Generic (PLEG): container finished" podID="60ef728e-9c9c-44af-94cf-4fa04a6ef263" containerID="ab167d6d2d38de7ca7d576ff96b3cce52d85a58059b990b1250c6b8a8d8408e1" exitCode=0 Dec 16 08:06:48 crc kubenswrapper[4823]: I1216 08:06:48.828120 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l9bhc" 
event={"ID":"60ef728e-9c9c-44af-94cf-4fa04a6ef263","Type":"ContainerDied","Data":"ab167d6d2d38de7ca7d576ff96b3cce52d85a58059b990b1250c6b8a8d8408e1"} Dec 16 08:06:49 crc kubenswrapper[4823]: I1216 08:06:49.836928 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l9bhc" event={"ID":"60ef728e-9c9c-44af-94cf-4fa04a6ef263","Type":"ContainerStarted","Data":"daa160a535dc48f9c1414c9f4d8b98052e431dc71efb67d2c93d320a43803e35"} Dec 16 08:06:49 crc kubenswrapper[4823]: I1216 08:06:49.856466 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-l9bhc" podStartSLOduration=1.999341609 podStartE2EDuration="4.856445918s" podCreationTimestamp="2025-12-16 08:06:45 +0000 UTC" firstStartedPulling="2025-12-16 08:06:46.810370694 +0000 UTC m=+4285.298936817" lastFinishedPulling="2025-12-16 08:06:49.667475003 +0000 UTC m=+4288.156041126" observedRunningTime="2025-12-16 08:06:49.856049406 +0000 UTC m=+4288.344615539" watchObservedRunningTime="2025-12-16 08:06:49.856445918 +0000 UTC m=+4288.345012051" Dec 16 08:06:55 crc kubenswrapper[4823]: I1216 08:06:55.912860 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-l9bhc" Dec 16 08:06:55 crc kubenswrapper[4823]: I1216 08:06:55.913363 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-l9bhc" Dec 16 08:06:55 crc kubenswrapper[4823]: I1216 08:06:55.965661 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-l9bhc" Dec 16 08:06:56 crc kubenswrapper[4823]: I1216 08:06:56.924983 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-l9bhc" Dec 16 08:06:56 crc kubenswrapper[4823]: I1216 08:06:56.983272 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-l9bhc"] Dec 16 08:06:58 crc kubenswrapper[4823]: I1216 08:06:58.134389 4823 patch_prober.go:28] interesting pod/machine-config-daemon-fv56f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 08:06:58 crc kubenswrapper[4823]: I1216 08:06:58.134758 4823 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 08:06:58 crc kubenswrapper[4823]: I1216 08:06:58.134809 4823 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" Dec 16 08:06:58 crc kubenswrapper[4823]: I1216 08:06:58.135617 4823 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"45952c178049555424b8721fc9a52334bd5e546f8c44fee9f8bebca7e7973c92"} pod="openshift-machine-config-operator/machine-config-daemon-fv56f" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 16 08:06:58 crc kubenswrapper[4823]: I1216 08:06:58.135695 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" containerName="machine-config-daemon" containerID="cri-o://45952c178049555424b8721fc9a52334bd5e546f8c44fee9f8bebca7e7973c92" gracePeriod=600 Dec 16 08:06:58 crc kubenswrapper[4823]: E1216 08:06:58.774399 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 08:06:58 crc kubenswrapper[4823]: I1216 08:06:58.899087 4823 generic.go:334] "Generic (PLEG): container finished" podID="25dec47c-3043-486c-b371-2be103c214e3" containerID="45952c178049555424b8721fc9a52334bd5e546f8c44fee9f8bebca7e7973c92" exitCode=0 Dec 16 08:06:58 crc kubenswrapper[4823]: I1216 08:06:58.899152 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" event={"ID":"25dec47c-3043-486c-b371-2be103c214e3","Type":"ContainerDied","Data":"45952c178049555424b8721fc9a52334bd5e546f8c44fee9f8bebca7e7973c92"} Dec 16 08:06:58 crc kubenswrapper[4823]: I1216 08:06:58.899220 4823 scope.go:117] "RemoveContainer" containerID="a801a5c1b5b7b42ebcabc28cc8ce824cde3775f622855cb462e4b7c503edfe83" Dec 16 08:06:58 crc kubenswrapper[4823]: I1216 08:06:58.899348 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-l9bhc" podUID="60ef728e-9c9c-44af-94cf-4fa04a6ef263" containerName="registry-server" containerID="cri-o://daa160a535dc48f9c1414c9f4d8b98052e431dc71efb67d2c93d320a43803e35" gracePeriod=2 Dec 16 08:06:58 crc kubenswrapper[4823]: I1216 08:06:58.899784 4823 scope.go:117] "RemoveContainer" containerID="45952c178049555424b8721fc9a52334bd5e546f8c44fee9f8bebca7e7973c92" Dec 16 08:06:58 crc kubenswrapper[4823]: E1216 08:06:58.900006 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 08:06:59 crc kubenswrapper[4823]: I1216 08:06:59.250684 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-l9bhc" Dec 16 08:06:59 crc kubenswrapper[4823]: I1216 08:06:59.428923 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p4hsg\" (UniqueName: \"kubernetes.io/projected/60ef728e-9c9c-44af-94cf-4fa04a6ef263-kube-api-access-p4hsg\") pod \"60ef728e-9c9c-44af-94cf-4fa04a6ef263\" (UID: \"60ef728e-9c9c-44af-94cf-4fa04a6ef263\") " Dec 16 08:06:59 crc kubenswrapper[4823]: I1216 08:06:59.429161 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60ef728e-9c9c-44af-94cf-4fa04a6ef263-utilities\") pod \"60ef728e-9c9c-44af-94cf-4fa04a6ef263\" (UID: \"60ef728e-9c9c-44af-94cf-4fa04a6ef263\") " Dec 16 08:06:59 crc kubenswrapper[4823]: I1216 08:06:59.429227 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60ef728e-9c9c-44af-94cf-4fa04a6ef263-catalog-content\") pod \"60ef728e-9c9c-44af-94cf-4fa04a6ef263\" (UID: \"60ef728e-9c9c-44af-94cf-4fa04a6ef263\") " Dec 16 08:06:59 crc kubenswrapper[4823]: I1216 08:06:59.430206 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60ef728e-9c9c-44af-94cf-4fa04a6ef263-utilities" (OuterVolumeSpecName: "utilities") pod "60ef728e-9c9c-44af-94cf-4fa04a6ef263" (UID: "60ef728e-9c9c-44af-94cf-4fa04a6ef263"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 08:06:59 crc kubenswrapper[4823]: I1216 08:06:59.435438 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60ef728e-9c9c-44af-94cf-4fa04a6ef263-kube-api-access-p4hsg" (OuterVolumeSpecName: "kube-api-access-p4hsg") pod "60ef728e-9c9c-44af-94cf-4fa04a6ef263" (UID: "60ef728e-9c9c-44af-94cf-4fa04a6ef263"). InnerVolumeSpecName "kube-api-access-p4hsg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:06:59 crc kubenswrapper[4823]: I1216 08:06:59.494793 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60ef728e-9c9c-44af-94cf-4fa04a6ef263-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "60ef728e-9c9c-44af-94cf-4fa04a6ef263" (UID: "60ef728e-9c9c-44af-94cf-4fa04a6ef263"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 08:06:59 crc kubenswrapper[4823]: I1216 08:06:59.530696 4823 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60ef728e-9c9c-44af-94cf-4fa04a6ef263-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 08:06:59 crc kubenswrapper[4823]: I1216 08:06:59.530733 4823 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60ef728e-9c9c-44af-94cf-4fa04a6ef263-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 08:06:59 crc kubenswrapper[4823]: I1216 08:06:59.530769 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p4hsg\" (UniqueName: \"kubernetes.io/projected/60ef728e-9c9c-44af-94cf-4fa04a6ef263-kube-api-access-p4hsg\") on node \"crc\" DevicePath \"\"" Dec 16 08:06:59 crc kubenswrapper[4823]: I1216 08:06:59.908671 4823 generic.go:334] "Generic (PLEG): container finished" podID="60ef728e-9c9c-44af-94cf-4fa04a6ef263" 
containerID="daa160a535dc48f9c1414c9f4d8b98052e431dc71efb67d2c93d320a43803e35" exitCode=0 Dec 16 08:06:59 crc kubenswrapper[4823]: I1216 08:06:59.908706 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-l9bhc" Dec 16 08:06:59 crc kubenswrapper[4823]: I1216 08:06:59.908732 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l9bhc" event={"ID":"60ef728e-9c9c-44af-94cf-4fa04a6ef263","Type":"ContainerDied","Data":"daa160a535dc48f9c1414c9f4d8b98052e431dc71efb67d2c93d320a43803e35"} Dec 16 08:06:59 crc kubenswrapper[4823]: I1216 08:06:59.908836 4823 scope.go:117] "RemoveContainer" containerID="daa160a535dc48f9c1414c9f4d8b98052e431dc71efb67d2c93d320a43803e35" Dec 16 08:06:59 crc kubenswrapper[4823]: I1216 08:06:59.908995 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l9bhc" event={"ID":"60ef728e-9c9c-44af-94cf-4fa04a6ef263","Type":"ContainerDied","Data":"d14a97c6dd158959151be7dc158764b783bf6fd926e7a383d527fe263d578395"} Dec 16 08:06:59 crc kubenswrapper[4823]: I1216 08:06:59.930953 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-l9bhc"] Dec 16 08:06:59 crc kubenswrapper[4823]: I1216 08:06:59.935837 4823 scope.go:117] "RemoveContainer" containerID="ab167d6d2d38de7ca7d576ff96b3cce52d85a58059b990b1250c6b8a8d8408e1" Dec 16 08:06:59 crc kubenswrapper[4823]: I1216 08:06:59.953305 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-l9bhc"] Dec 16 08:06:59 crc kubenswrapper[4823]: I1216 08:06:59.958256 4823 scope.go:117] "RemoveContainer" containerID="3c3923508c773955604023baed28774a574d62d50638373233579e16e3273c89" Dec 16 08:06:59 crc kubenswrapper[4823]: I1216 08:06:59.981089 4823 scope.go:117] "RemoveContainer" containerID="daa160a535dc48f9c1414c9f4d8b98052e431dc71efb67d2c93d320a43803e35" Dec 16 
08:06:59 crc kubenswrapper[4823]: E1216 08:06:59.981580 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"daa160a535dc48f9c1414c9f4d8b98052e431dc71efb67d2c93d320a43803e35\": container with ID starting with daa160a535dc48f9c1414c9f4d8b98052e431dc71efb67d2c93d320a43803e35 not found: ID does not exist" containerID="daa160a535dc48f9c1414c9f4d8b98052e431dc71efb67d2c93d320a43803e35" Dec 16 08:06:59 crc kubenswrapper[4823]: I1216 08:06:59.981626 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"daa160a535dc48f9c1414c9f4d8b98052e431dc71efb67d2c93d320a43803e35"} err="failed to get container status \"daa160a535dc48f9c1414c9f4d8b98052e431dc71efb67d2c93d320a43803e35\": rpc error: code = NotFound desc = could not find container \"daa160a535dc48f9c1414c9f4d8b98052e431dc71efb67d2c93d320a43803e35\": container with ID starting with daa160a535dc48f9c1414c9f4d8b98052e431dc71efb67d2c93d320a43803e35 not found: ID does not exist" Dec 16 08:06:59 crc kubenswrapper[4823]: I1216 08:06:59.981657 4823 scope.go:117] "RemoveContainer" containerID="ab167d6d2d38de7ca7d576ff96b3cce52d85a58059b990b1250c6b8a8d8408e1" Dec 16 08:06:59 crc kubenswrapper[4823]: E1216 08:06:59.982387 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab167d6d2d38de7ca7d576ff96b3cce52d85a58059b990b1250c6b8a8d8408e1\": container with ID starting with ab167d6d2d38de7ca7d576ff96b3cce52d85a58059b990b1250c6b8a8d8408e1 not found: ID does not exist" containerID="ab167d6d2d38de7ca7d576ff96b3cce52d85a58059b990b1250c6b8a8d8408e1" Dec 16 08:06:59 crc kubenswrapper[4823]: I1216 08:06:59.982435 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab167d6d2d38de7ca7d576ff96b3cce52d85a58059b990b1250c6b8a8d8408e1"} err="failed to get container status 
\"ab167d6d2d38de7ca7d576ff96b3cce52d85a58059b990b1250c6b8a8d8408e1\": rpc error: code = NotFound desc = could not find container \"ab167d6d2d38de7ca7d576ff96b3cce52d85a58059b990b1250c6b8a8d8408e1\": container with ID starting with ab167d6d2d38de7ca7d576ff96b3cce52d85a58059b990b1250c6b8a8d8408e1 not found: ID does not exist" Dec 16 08:06:59 crc kubenswrapper[4823]: I1216 08:06:59.982475 4823 scope.go:117] "RemoveContainer" containerID="3c3923508c773955604023baed28774a574d62d50638373233579e16e3273c89" Dec 16 08:06:59 crc kubenswrapper[4823]: E1216 08:06:59.982790 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c3923508c773955604023baed28774a574d62d50638373233579e16e3273c89\": container with ID starting with 3c3923508c773955604023baed28774a574d62d50638373233579e16e3273c89 not found: ID does not exist" containerID="3c3923508c773955604023baed28774a574d62d50638373233579e16e3273c89" Dec 16 08:06:59 crc kubenswrapper[4823]: I1216 08:06:59.982820 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c3923508c773955604023baed28774a574d62d50638373233579e16e3273c89"} err="failed to get container status \"3c3923508c773955604023baed28774a574d62d50638373233579e16e3273c89\": rpc error: code = NotFound desc = could not find container \"3c3923508c773955604023baed28774a574d62d50638373233579e16e3273c89\": container with ID starting with 3c3923508c773955604023baed28774a574d62d50638373233579e16e3273c89 not found: ID does not exist" Dec 16 08:07:01 crc kubenswrapper[4823]: I1216 08:07:01.794647 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60ef728e-9c9c-44af-94cf-4fa04a6ef263" path="/var/lib/kubelet/pods/60ef728e-9c9c-44af-94cf-4fa04a6ef263/volumes" Dec 16 08:07:10 crc kubenswrapper[4823]: I1216 08:07:10.771184 4823 scope.go:117] "RemoveContainer" containerID="45952c178049555424b8721fc9a52334bd5e546f8c44fee9f8bebca7e7973c92" Dec 16 
08:07:10 crc kubenswrapper[4823]: E1216 08:07:10.771767 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3"
Dec 16 08:07:24 crc kubenswrapper[4823]: I1216 08:07:24.771937 4823 scope.go:117] "RemoveContainer" containerID="45952c178049555424b8721fc9a52334bd5e546f8c44fee9f8bebca7e7973c92"
Dec 16 08:07:24 crc kubenswrapper[4823]: E1216 08:07:24.772776 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3"
Dec 16 08:07:35 crc kubenswrapper[4823]: I1216 08:07:35.771905 4823 scope.go:117] "RemoveContainer" containerID="45952c178049555424b8721fc9a52334bd5e546f8c44fee9f8bebca7e7973c92"
Dec 16 08:07:35 crc kubenswrapper[4823]: E1216 08:07:35.772745 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3"
Dec 16 08:07:46 crc kubenswrapper[4823]: I1216 08:07:46.772457 4823 scope.go:117] "RemoveContainer" containerID="45952c178049555424b8721fc9a52334bd5e546f8c44fee9f8bebca7e7973c92"
Dec 16 08:07:46 crc kubenswrapper[4823]: E1216 08:07:46.773378 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3"
Dec 16 08:07:58 crc kubenswrapper[4823]: I1216 08:07:58.771828 4823 scope.go:117] "RemoveContainer" containerID="45952c178049555424b8721fc9a52334bd5e546f8c44fee9f8bebca7e7973c92"
Dec 16 08:07:58 crc kubenswrapper[4823]: E1216 08:07:58.772693 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3"
Dec 16 08:08:13 crc kubenswrapper[4823]: I1216 08:08:13.772341 4823 scope.go:117] "RemoveContainer" containerID="45952c178049555424b8721fc9a52334bd5e546f8c44fee9f8bebca7e7973c92"
Dec 16 08:08:13 crc kubenswrapper[4823]: E1216 08:08:13.773060 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3"
Dec 16 08:08:25 crc kubenswrapper[4823]: I1216 08:08:25.771391 4823 scope.go:117] "RemoveContainer" containerID="45952c178049555424b8721fc9a52334bd5e546f8c44fee9f8bebca7e7973c92"
Dec 16 08:08:25 crc kubenswrapper[4823]: E1216 08:08:25.772151 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3"
Dec 16 08:08:38 crc kubenswrapper[4823]: I1216 08:08:38.772141 4823 scope.go:117] "RemoveContainer" containerID="45952c178049555424b8721fc9a52334bd5e546f8c44fee9f8bebca7e7973c92"
Dec 16 08:08:38 crc kubenswrapper[4823]: E1216 08:08:38.773398 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3"
Dec 16 08:08:49 crc kubenswrapper[4823]: I1216 08:08:49.771835 4823 scope.go:117] "RemoveContainer" containerID="45952c178049555424b8721fc9a52334bd5e546f8c44fee9f8bebca7e7973c92"
Dec 16 08:08:49 crc kubenswrapper[4823]: E1216 08:08:49.773416 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3"
Dec 16 08:09:01 crc kubenswrapper[4823]: I1216 08:09:01.776541 4823 scope.go:117] "RemoveContainer" containerID="45952c178049555424b8721fc9a52334bd5e546f8c44fee9f8bebca7e7973c92"
Dec 16 08:09:01 crc kubenswrapper[4823]: E1216 08:09:01.777549 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3"
Dec 16 08:09:15 crc kubenswrapper[4823]: I1216 08:09:15.772059 4823 scope.go:117] "RemoveContainer" containerID="45952c178049555424b8721fc9a52334bd5e546f8c44fee9f8bebca7e7973c92"
Dec 16 08:09:15 crc kubenswrapper[4823]: E1216 08:09:15.774264 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3"
Dec 16 08:09:30 crc kubenswrapper[4823]: I1216 08:09:30.771843 4823 scope.go:117] "RemoveContainer" containerID="45952c178049555424b8721fc9a52334bd5e546f8c44fee9f8bebca7e7973c92"
Dec 16 08:09:30 crc kubenswrapper[4823]: E1216 08:09:30.772695 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3"
Dec 16 08:09:44 crc kubenswrapper[4823]: I1216 08:09:44.771220 4823 scope.go:117] "RemoveContainer" containerID="45952c178049555424b8721fc9a52334bd5e546f8c44fee9f8bebca7e7973c92"
Dec 16 08:09:44 crc kubenswrapper[4823]: E1216 08:09:44.771901 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3"
Dec 16 08:09:58 crc kubenswrapper[4823]: I1216 08:09:58.772890 4823 scope.go:117] "RemoveContainer" containerID="45952c178049555424b8721fc9a52334bd5e546f8c44fee9f8bebca7e7973c92"
Dec 16 08:09:58 crc kubenswrapper[4823]: E1216 08:09:58.774213 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3"
Dec 16 08:10:12 crc kubenswrapper[4823]: I1216 08:10:12.772390 4823 scope.go:117] "RemoveContainer" containerID="45952c178049555424b8721fc9a52334bd5e546f8c44fee9f8bebca7e7973c92"
Dec 16 08:10:12 crc kubenswrapper[4823]: E1216 08:10:12.773319 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3"
Dec 16 08:10:26 crc kubenswrapper[4823]: I1216 08:10:26.772131 4823 scope.go:117] "RemoveContainer" containerID="45952c178049555424b8721fc9a52334bd5e546f8c44fee9f8bebca7e7973c92"
Dec 16 08:10:26 crc kubenswrapper[4823]: E1216 08:10:26.772919 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3"
Dec 16 08:10:37 crc kubenswrapper[4823]: I1216 08:10:37.772271 4823 scope.go:117] "RemoveContainer" containerID="45952c178049555424b8721fc9a52334bd5e546f8c44fee9f8bebca7e7973c92"
Dec 16 08:10:37 crc kubenswrapper[4823]: E1216 08:10:37.773283 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3"
Dec 16 08:10:48 crc kubenswrapper[4823]: I1216 08:10:48.772345 4823 scope.go:117] "RemoveContainer" containerID="45952c178049555424b8721fc9a52334bd5e546f8c44fee9f8bebca7e7973c92"
Dec 16 08:10:48 crc kubenswrapper[4823]: E1216 08:10:48.773180 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3"
Dec 16 08:11:02 crc kubenswrapper[4823]: I1216 08:11:02.776292 4823 scope.go:117] "RemoveContainer" containerID="45952c178049555424b8721fc9a52334bd5e546f8c44fee9f8bebca7e7973c92"
Dec 16 08:11:02 crc kubenswrapper[4823]: E1216 08:11:02.778194 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3"
Dec 16 08:11:13 crc kubenswrapper[4823]: I1216 08:11:13.771390 4823 scope.go:117] "RemoveContainer" containerID="45952c178049555424b8721fc9a52334bd5e546f8c44fee9f8bebca7e7973c92"
Dec 16 08:11:13 crc kubenswrapper[4823]: E1216 08:11:13.773146 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3"
Dec 16 08:11:24 crc kubenswrapper[4823]: I1216 08:11:24.771996 4823 scope.go:117] "RemoveContainer" containerID="45952c178049555424b8721fc9a52334bd5e546f8c44fee9f8bebca7e7973c92"
Dec 16 08:11:24 crc kubenswrapper[4823]: E1216 08:11:24.772835 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3"
Dec 16 08:11:35 crc kubenswrapper[4823]: I1216 08:11:35.771759 4823 scope.go:117] "RemoveContainer" containerID="45952c178049555424b8721fc9a52334bd5e546f8c44fee9f8bebca7e7973c92"
Dec 16 08:11:35 crc kubenswrapper[4823]: E1216 08:11:35.772461 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3"
Dec 16 08:11:48 crc kubenswrapper[4823]: I1216 08:11:48.772049 4823 scope.go:117] "RemoveContainer" containerID="45952c178049555424b8721fc9a52334bd5e546f8c44fee9f8bebca7e7973c92"
Dec 16 08:11:48 crc kubenswrapper[4823]: E1216 08:11:48.772886 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3"
Dec 16 08:12:02 crc kubenswrapper[4823]: I1216 08:12:02.772279 4823 scope.go:117] "RemoveContainer" containerID="45952c178049555424b8721fc9a52334bd5e546f8c44fee9f8bebca7e7973c92"
Dec 16 08:12:03 crc kubenswrapper[4823]: I1216 08:12:03.579007 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" event={"ID":"25dec47c-3043-486c-b371-2be103c214e3","Type":"ContainerStarted","Data":"77f0a15859e1c465ef458a6baa328e3fa75c551af5fcc90bf59d9cd46ccb3c67"}
Dec 16 08:12:08 crc kubenswrapper[4823]: I1216 08:12:08.390183 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-525bw"]
Dec 16 08:12:08 crc kubenswrapper[4823]: E1216 08:12:08.391151 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60ef728e-9c9c-44af-94cf-4fa04a6ef263" containerName="extract-utilities"
Dec 16 08:12:08 crc kubenswrapper[4823]: I1216 08:12:08.391168 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="60ef728e-9c9c-44af-94cf-4fa04a6ef263" containerName="extract-utilities"
Dec 16 08:12:08 crc kubenswrapper[4823]: E1216 08:12:08.391183 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60ef728e-9c9c-44af-94cf-4fa04a6ef263" containerName="extract-content"
Dec 16 08:12:08 crc kubenswrapper[4823]: I1216 08:12:08.391190 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="60ef728e-9c9c-44af-94cf-4fa04a6ef263" containerName="extract-content"
Dec 16 08:12:08 crc kubenswrapper[4823]: E1216 08:12:08.391219 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60ef728e-9c9c-44af-94cf-4fa04a6ef263" containerName="registry-server"
Dec 16 08:12:08 crc kubenswrapper[4823]: I1216 08:12:08.391227 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="60ef728e-9c9c-44af-94cf-4fa04a6ef263" containerName="registry-server"
Dec 16 08:12:08 crc kubenswrapper[4823]: I1216 08:12:08.391394 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="60ef728e-9c9c-44af-94cf-4fa04a6ef263" containerName="registry-server"
Dec 16 08:12:08 crc kubenswrapper[4823]: I1216 08:12:08.392727 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-525bw"
Dec 16 08:12:08 crc kubenswrapper[4823]: I1216 08:12:08.398327 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-525bw"]
Dec 16 08:12:08 crc kubenswrapper[4823]: I1216 08:12:08.495970 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fbdx\" (UniqueName: \"kubernetes.io/projected/1274d3e0-24cf-4876-96d6-18fcc1bcf457-kube-api-access-2fbdx\") pod \"redhat-operators-525bw\" (UID: \"1274d3e0-24cf-4876-96d6-18fcc1bcf457\") " pod="openshift-marketplace/redhat-operators-525bw"
Dec 16 08:12:08 crc kubenswrapper[4823]: I1216 08:12:08.496126 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1274d3e0-24cf-4876-96d6-18fcc1bcf457-utilities\") pod \"redhat-operators-525bw\" (UID: \"1274d3e0-24cf-4876-96d6-18fcc1bcf457\") " pod="openshift-marketplace/redhat-operators-525bw"
Dec 16 08:12:08 crc kubenswrapper[4823]: I1216 08:12:08.496160 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1274d3e0-24cf-4876-96d6-18fcc1bcf457-catalog-content\") pod \"redhat-operators-525bw\" (UID: \"1274d3e0-24cf-4876-96d6-18fcc1bcf457\") " pod="openshift-marketplace/redhat-operators-525bw"
Dec 16 08:12:08 crc kubenswrapper[4823]: I1216 08:12:08.597767 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2fbdx\" (UniqueName: \"kubernetes.io/projected/1274d3e0-24cf-4876-96d6-18fcc1bcf457-kube-api-access-2fbdx\") pod \"redhat-operators-525bw\" (UID: \"1274d3e0-24cf-4876-96d6-18fcc1bcf457\") " pod="openshift-marketplace/redhat-operators-525bw"
Dec 16 08:12:08 crc kubenswrapper[4823]: I1216 08:12:08.597940 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1274d3e0-24cf-4876-96d6-18fcc1bcf457-utilities\") pod \"redhat-operators-525bw\" (UID: \"1274d3e0-24cf-4876-96d6-18fcc1bcf457\") " pod="openshift-marketplace/redhat-operators-525bw"
Dec 16 08:12:08 crc kubenswrapper[4823]: I1216 08:12:08.597980 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1274d3e0-24cf-4876-96d6-18fcc1bcf457-catalog-content\") pod \"redhat-operators-525bw\" (UID: \"1274d3e0-24cf-4876-96d6-18fcc1bcf457\") " pod="openshift-marketplace/redhat-operators-525bw"
Dec 16 08:12:08 crc kubenswrapper[4823]: I1216 08:12:08.598917 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1274d3e0-24cf-4876-96d6-18fcc1bcf457-catalog-content\") pod \"redhat-operators-525bw\" (UID: \"1274d3e0-24cf-4876-96d6-18fcc1bcf457\") " pod="openshift-marketplace/redhat-operators-525bw"
Dec 16 08:12:08 crc kubenswrapper[4823]: I1216 08:12:08.599394 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1274d3e0-24cf-4876-96d6-18fcc1bcf457-utilities\") pod \"redhat-operators-525bw\" (UID: \"1274d3e0-24cf-4876-96d6-18fcc1bcf457\") " pod="openshift-marketplace/redhat-operators-525bw"
Dec 16 08:12:08 crc kubenswrapper[4823]: I1216 08:12:08.632012 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fbdx\" (UniqueName: \"kubernetes.io/projected/1274d3e0-24cf-4876-96d6-18fcc1bcf457-kube-api-access-2fbdx\") pod \"redhat-operators-525bw\" (UID: \"1274d3e0-24cf-4876-96d6-18fcc1bcf457\") " pod="openshift-marketplace/redhat-operators-525bw"
Dec 16 08:12:08 crc kubenswrapper[4823]: I1216 08:12:08.711119 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-525bw"
Dec 16 08:12:09 crc kubenswrapper[4823]: I1216 08:12:09.124421 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-525bw"]
Dec 16 08:12:09 crc kubenswrapper[4823]: I1216 08:12:09.623523 4823 generic.go:334] "Generic (PLEG): container finished" podID="1274d3e0-24cf-4876-96d6-18fcc1bcf457" containerID="220989d5b567ac1d14df1bc23ff9e5feff490e41e275a8e0465b9445b5fbed8d" exitCode=0
Dec 16 08:12:09 crc kubenswrapper[4823]: I1216 08:12:09.623575 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-525bw" event={"ID":"1274d3e0-24cf-4876-96d6-18fcc1bcf457","Type":"ContainerDied","Data":"220989d5b567ac1d14df1bc23ff9e5feff490e41e275a8e0465b9445b5fbed8d"}
Dec 16 08:12:09 crc kubenswrapper[4823]: I1216 08:12:09.623646 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-525bw" event={"ID":"1274d3e0-24cf-4876-96d6-18fcc1bcf457","Type":"ContainerStarted","Data":"01608b1d895ea5643ce2527daf52ee658deebbfd3dd0d12c2ce407a3a9a37354"}
Dec 16 08:12:09 crc kubenswrapper[4823]: I1216 08:12:09.625887 4823 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 16 08:12:11 crc kubenswrapper[4823]: I1216 08:12:11.637663 4823 generic.go:334] "Generic (PLEG): container finished" podID="1274d3e0-24cf-4876-96d6-18fcc1bcf457" containerID="7483f70bd1c391089bac7d393a08a068994307e6a06235c78b914b9273514637" exitCode=0
Dec 16 08:12:11 crc kubenswrapper[4823]: I1216 08:12:11.637745 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-525bw" event={"ID":"1274d3e0-24cf-4876-96d6-18fcc1bcf457","Type":"ContainerDied","Data":"7483f70bd1c391089bac7d393a08a068994307e6a06235c78b914b9273514637"}
Dec 16 08:12:12 crc kubenswrapper[4823]: I1216 08:12:12.654525 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-525bw" event={"ID":"1274d3e0-24cf-4876-96d6-18fcc1bcf457","Type":"ContainerStarted","Data":"a079fcfe80928d430ac51d505451ec835f9e32c3b93b04ed0a848ba7307769a4"}
Dec 16 08:12:12 crc kubenswrapper[4823]: I1216 08:12:12.682530 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-525bw" podStartSLOduration=1.927377715 podStartE2EDuration="4.682514735s" podCreationTimestamp="2025-12-16 08:12:08 +0000 UTC" firstStartedPulling="2025-12-16 08:12:09.625670999 +0000 UTC m=+4608.114237122" lastFinishedPulling="2025-12-16 08:12:12.380807989 +0000 UTC m=+4610.869374142" observedRunningTime="2025-12-16 08:12:12.679594514 +0000 UTC m=+4611.168160647" watchObservedRunningTime="2025-12-16 08:12:12.682514735 +0000 UTC m=+4611.171080858"
Dec 16 08:12:18 crc kubenswrapper[4823]: I1216 08:12:18.711674 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-525bw"
Dec 16 08:12:18 crc kubenswrapper[4823]: I1216 08:12:18.711992 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-525bw"
Dec 16 08:12:18 crc kubenswrapper[4823]: I1216 08:12:18.795539 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-525bw"
Dec 16 08:12:19 crc kubenswrapper[4823]: I1216 08:12:19.767607 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-525bw"
Dec 16 08:12:19 crc kubenswrapper[4823]: I1216 08:12:19.835784 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-525bw"]
Dec 16 08:12:21 crc kubenswrapper[4823]: I1216 08:12:21.719495 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-525bw" podUID="1274d3e0-24cf-4876-96d6-18fcc1bcf457" containerName="registry-server" containerID="cri-o://a079fcfe80928d430ac51d505451ec835f9e32c3b93b04ed0a848ba7307769a4" gracePeriod=2
Dec 16 08:12:23 crc kubenswrapper[4823]: I1216 08:12:23.743920 4823 generic.go:334] "Generic (PLEG): container finished" podID="1274d3e0-24cf-4876-96d6-18fcc1bcf457" containerID="a079fcfe80928d430ac51d505451ec835f9e32c3b93b04ed0a848ba7307769a4" exitCode=0
Dec 16 08:12:23 crc kubenswrapper[4823]: I1216 08:12:23.744015 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-525bw" event={"ID":"1274d3e0-24cf-4876-96d6-18fcc1bcf457","Type":"ContainerDied","Data":"a079fcfe80928d430ac51d505451ec835f9e32c3b93b04ed0a848ba7307769a4"}
Dec 16 08:12:24 crc kubenswrapper[4823]: I1216 08:12:24.044506 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-525bw"
Dec 16 08:12:24 crc kubenswrapper[4823]: I1216 08:12:24.159714 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1274d3e0-24cf-4876-96d6-18fcc1bcf457-catalog-content\") pod \"1274d3e0-24cf-4876-96d6-18fcc1bcf457\" (UID: \"1274d3e0-24cf-4876-96d6-18fcc1bcf457\") "
Dec 16 08:12:24 crc kubenswrapper[4823]: I1216 08:12:24.162663 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1274d3e0-24cf-4876-96d6-18fcc1bcf457-utilities\") pod \"1274d3e0-24cf-4876-96d6-18fcc1bcf457\" (UID: \"1274d3e0-24cf-4876-96d6-18fcc1bcf457\") "
Dec 16 08:12:24 crc kubenswrapper[4823]: I1216 08:12:24.163547 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1274d3e0-24cf-4876-96d6-18fcc1bcf457-utilities" (OuterVolumeSpecName: "utilities") pod "1274d3e0-24cf-4876-96d6-18fcc1bcf457" (UID: "1274d3e0-24cf-4876-96d6-18fcc1bcf457"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 16 08:12:24 crc kubenswrapper[4823]: I1216 08:12:24.163892 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2fbdx\" (UniqueName: \"kubernetes.io/projected/1274d3e0-24cf-4876-96d6-18fcc1bcf457-kube-api-access-2fbdx\") pod \"1274d3e0-24cf-4876-96d6-18fcc1bcf457\" (UID: \"1274d3e0-24cf-4876-96d6-18fcc1bcf457\") "
Dec 16 08:12:24 crc kubenswrapper[4823]: I1216 08:12:24.164628 4823 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1274d3e0-24cf-4876-96d6-18fcc1bcf457-utilities\") on node \"crc\" DevicePath \"\""
Dec 16 08:12:24 crc kubenswrapper[4823]: I1216 08:12:24.170341 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1274d3e0-24cf-4876-96d6-18fcc1bcf457-kube-api-access-2fbdx" (OuterVolumeSpecName: "kube-api-access-2fbdx") pod "1274d3e0-24cf-4876-96d6-18fcc1bcf457" (UID: "1274d3e0-24cf-4876-96d6-18fcc1bcf457"). InnerVolumeSpecName "kube-api-access-2fbdx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 08:12:24 crc kubenswrapper[4823]: I1216 08:12:24.266063 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2fbdx\" (UniqueName: \"kubernetes.io/projected/1274d3e0-24cf-4876-96d6-18fcc1bcf457-kube-api-access-2fbdx\") on node \"crc\" DevicePath \"\""
Dec 16 08:12:24 crc kubenswrapper[4823]: I1216 08:12:24.301193 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1274d3e0-24cf-4876-96d6-18fcc1bcf457-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1274d3e0-24cf-4876-96d6-18fcc1bcf457" (UID: "1274d3e0-24cf-4876-96d6-18fcc1bcf457"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 16 08:12:24 crc kubenswrapper[4823]: I1216 08:12:24.366743 4823 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1274d3e0-24cf-4876-96d6-18fcc1bcf457-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 16 08:12:24 crc kubenswrapper[4823]: I1216 08:12:24.753583 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-525bw" event={"ID":"1274d3e0-24cf-4876-96d6-18fcc1bcf457","Type":"ContainerDied","Data":"01608b1d895ea5643ce2527daf52ee658deebbfd3dd0d12c2ce407a3a9a37354"}
Dec 16 08:12:24 crc kubenswrapper[4823]: I1216 08:12:24.754529 4823 scope.go:117] "RemoveContainer" containerID="a079fcfe80928d430ac51d505451ec835f9e32c3b93b04ed0a848ba7307769a4"
Dec 16 08:12:24 crc kubenswrapper[4823]: I1216 08:12:24.754787 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-525bw"
Dec 16 08:12:24 crc kubenswrapper[4823]: I1216 08:12:24.772768 4823 scope.go:117] "RemoveContainer" containerID="7483f70bd1c391089bac7d393a08a068994307e6a06235c78b914b9273514637"
Dec 16 08:12:24 crc kubenswrapper[4823]: I1216 08:12:24.857419 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-525bw"]
Dec 16 08:12:24 crc kubenswrapper[4823]: I1216 08:12:24.864094 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-525bw"]
Dec 16 08:12:24 crc kubenswrapper[4823]: I1216 08:12:24.875267 4823 scope.go:117] "RemoveContainer" containerID="220989d5b567ac1d14df1bc23ff9e5feff490e41e275a8e0465b9445b5fbed8d"
Dec 16 08:12:25 crc kubenswrapper[4823]: I1216 08:12:25.782532 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1274d3e0-24cf-4876-96d6-18fcc1bcf457" path="/var/lib/kubelet/pods/1274d3e0-24cf-4876-96d6-18fcc1bcf457/volumes"
Dec 16 08:14:28 crc kubenswrapper[4823]: I1216 08:14:28.134205 4823 patch_prober.go:28] interesting pod/machine-config-daemon-fv56f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 16 08:14:28 crc kubenswrapper[4823]: I1216 08:14:28.134930 4823 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 16 08:14:47 crc kubenswrapper[4823]: I1216 08:14:47.543832 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-5vm4d"]
Dec 16 08:14:47 crc kubenswrapper[4823]: E1216 08:14:47.546290 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1274d3e0-24cf-4876-96d6-18fcc1bcf457" containerName="extract-utilities"
Dec 16 08:14:47 crc kubenswrapper[4823]: I1216 08:14:47.546459 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="1274d3e0-24cf-4876-96d6-18fcc1bcf457" containerName="extract-utilities"
Dec 16 08:14:47 crc kubenswrapper[4823]: E1216 08:14:47.546644 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1274d3e0-24cf-4876-96d6-18fcc1bcf457" containerName="registry-server"
Dec 16 08:14:47 crc kubenswrapper[4823]: I1216 08:14:47.546824 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="1274d3e0-24cf-4876-96d6-18fcc1bcf457" containerName="registry-server"
Dec 16 08:14:47 crc kubenswrapper[4823]: E1216 08:14:47.547002 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1274d3e0-24cf-4876-96d6-18fcc1bcf457" containerName="extract-content"
Dec 16 08:14:47 crc kubenswrapper[4823]: I1216 08:14:47.547173 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="1274d3e0-24cf-4876-96d6-18fcc1bcf457" containerName="extract-content"
Dec 16 08:14:47 crc kubenswrapper[4823]: I1216 08:14:47.547721 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="1274d3e0-24cf-4876-96d6-18fcc1bcf457" containerName="registry-server"
Dec 16 08:14:47 crc kubenswrapper[4823]: I1216 08:14:47.550329 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5vm4d"
Dec 16 08:14:47 crc kubenswrapper[4823]: I1216 08:14:47.557524 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5vm4d"]
Dec 16 08:14:47 crc kubenswrapper[4823]: I1216 08:14:47.674967 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9x6c\" (UniqueName: \"kubernetes.io/projected/ee0d93eb-ac78-4ca8-8335-a85c5789999e-kube-api-access-c9x6c\") pod \"redhat-marketplace-5vm4d\" (UID: \"ee0d93eb-ac78-4ca8-8335-a85c5789999e\") " pod="openshift-marketplace/redhat-marketplace-5vm4d"
Dec 16 08:14:47 crc kubenswrapper[4823]: I1216 08:14:47.675397 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee0d93eb-ac78-4ca8-8335-a85c5789999e-catalog-content\") pod \"redhat-marketplace-5vm4d\" (UID: \"ee0d93eb-ac78-4ca8-8335-a85c5789999e\") " pod="openshift-marketplace/redhat-marketplace-5vm4d"
Dec 16 08:14:47 crc kubenswrapper[4823]: I1216 08:14:47.675571 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee0d93eb-ac78-4ca8-8335-a85c5789999e-utilities\") pod \"redhat-marketplace-5vm4d\" (UID: \"ee0d93eb-ac78-4ca8-8335-a85c5789999e\") " pod="openshift-marketplace/redhat-marketplace-5vm4d"
Dec 16 08:14:47 crc kubenswrapper[4823]: I1216 08:14:47.776754 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee0d93eb-ac78-4ca8-8335-a85c5789999e-catalog-content\") pod \"redhat-marketplace-5vm4d\" (UID: \"ee0d93eb-ac78-4ca8-8335-a85c5789999e\") " pod="openshift-marketplace/redhat-marketplace-5vm4d"
Dec 16 08:14:47 crc kubenswrapper[4823]: I1216 08:14:47.777355 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee0d93eb-ac78-4ca8-8335-a85c5789999e-catalog-content\") pod \"redhat-marketplace-5vm4d\" (UID: \"ee0d93eb-ac78-4ca8-8335-a85c5789999e\") " pod="openshift-marketplace/redhat-marketplace-5vm4d"
Dec 16 08:14:47 crc kubenswrapper[4823]: I1216 08:14:47.777261 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee0d93eb-ac78-4ca8-8335-a85c5789999e-utilities\") pod \"redhat-marketplace-5vm4d\" (UID: \"ee0d93eb-ac78-4ca8-8335-a85c5789999e\") " pod="openshift-marketplace/redhat-marketplace-5vm4d"
Dec 16 08:14:47 crc kubenswrapper[4823]: I1216 08:14:47.777515 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9x6c\" (UniqueName: \"kubernetes.io/projected/ee0d93eb-ac78-4ca8-8335-a85c5789999e-kube-api-access-c9x6c\") pod \"redhat-marketplace-5vm4d\" (UID: \"ee0d93eb-ac78-4ca8-8335-a85c5789999e\") " pod="openshift-marketplace/redhat-marketplace-5vm4d"
Dec 16 08:14:47 crc kubenswrapper[4823]: I1216 08:14:47.777701 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee0d93eb-ac78-4ca8-8335-a85c5789999e-utilities\") pod \"redhat-marketplace-5vm4d\" (UID: \"ee0d93eb-ac78-4ca8-8335-a85c5789999e\") " pod="openshift-marketplace/redhat-marketplace-5vm4d"
Dec 16 08:14:47 crc kubenswrapper[4823]: I1216 08:14:47.801014 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9x6c\" (UniqueName: \"kubernetes.io/projected/ee0d93eb-ac78-4ca8-8335-a85c5789999e-kube-api-access-c9x6c\") pod \"redhat-marketplace-5vm4d\" (UID: \"ee0d93eb-ac78-4ca8-8335-a85c5789999e\") " pod="openshift-marketplace/redhat-marketplace-5vm4d"
Dec 16 08:14:47 crc kubenswrapper[4823]: I1216 08:14:47.874359 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5vm4d"
Dec 16 08:14:48 crc kubenswrapper[4823]: I1216 08:14:48.489950 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5vm4d"]
Dec 16 08:14:49 crc kubenswrapper[4823]: I1216 08:14:49.117981 4823 generic.go:334] "Generic (PLEG): container finished" podID="ee0d93eb-ac78-4ca8-8335-a85c5789999e" containerID="43866fd31978487f43e569cffe2c3097ab546b63dd178415e72d08b4682077f8" exitCode=0
Dec 16 08:14:49 crc kubenswrapper[4823]: I1216 08:14:49.118091 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5vm4d" event={"ID":"ee0d93eb-ac78-4ca8-8335-a85c5789999e","Type":"ContainerDied","Data":"43866fd31978487f43e569cffe2c3097ab546b63dd178415e72d08b4682077f8"}
Dec 16 08:14:49 crc kubenswrapper[4823]: I1216 08:14:49.118358 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5vm4d" event={"ID":"ee0d93eb-ac78-4ca8-8335-a85c5789999e","Type":"ContainerStarted","Data":"3f462563a945f038890cc88293b3524000c1bdd9bc715221b64d8f6e56f9f1bc"}
Dec 16 08:14:50 crc kubenswrapper[4823]: I1216 08:14:50.127858 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5vm4d" event={"ID":"ee0d93eb-ac78-4ca8-8335-a85c5789999e","Type":"ContainerStarted","Data":"4e97daf047e0229870385a01d07983e975ce2cb2224604492c172c22db86020c"}
Dec 16 08:14:51 crc kubenswrapper[4823]: I1216 08:14:51.142559 4823 generic.go:334] "Generic (PLEG): container finished" podID="ee0d93eb-ac78-4ca8-8335-a85c5789999e" containerID="4e97daf047e0229870385a01d07983e975ce2cb2224604492c172c22db86020c" exitCode=0
Dec 16 08:14:51 crc kubenswrapper[4823]: I1216 08:14:51.142622 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5vm4d" event={"ID":"ee0d93eb-ac78-4ca8-8335-a85c5789999e","Type":"ContainerDied","Data":"4e97daf047e0229870385a01d07983e975ce2cb2224604492c172c22db86020c"}
Dec 16 08:14:52 crc kubenswrapper[4823]: I1216 08:14:52.152147 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5vm4d" event={"ID":"ee0d93eb-ac78-4ca8-8335-a85c5789999e","Type":"ContainerStarted","Data":"b09dfbb7c725f1beaa5f0c6cf022ef4193fd362a1115214fa584f9574b135ed2"}
Dec 16 08:14:52 crc kubenswrapper[4823]: I1216 08:14:52.178392 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-5vm4d" podStartSLOduration=2.6843521949999998 podStartE2EDuration="5.17837141s" podCreationTimestamp="2025-12-16 08:14:47 +0000 UTC" firstStartedPulling="2025-12-16 08:14:49.119735377 +0000 UTC m=+4767.608301510" lastFinishedPulling="2025-12-16 08:14:51.613754562 +0000 UTC m=+4770.102320725" observedRunningTime="2025-12-16 08:14:52.174080765 +0000 UTC m=+4770.662646898" watchObservedRunningTime="2025-12-16 08:14:52.17837141 +0000 UTC m=+4770.666937533"
Dec 16 08:14:57 crc kubenswrapper[4823]: I1216 08:14:57.875577 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-5vm4d"
Dec 16 08:14:57 crc kubenswrapper[4823]: I1216 08:14:57.876660 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-5vm4d"
Dec 16 08:14:57 crc kubenswrapper[4823]: I1216 08:14:57.918270 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-5vm4d"
Dec 16 08:14:58 
crc kubenswrapper[4823]: I1216 08:14:58.134599 4823 patch_prober.go:28] interesting pod/machine-config-daemon-fv56f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 08:14:58 crc kubenswrapper[4823]: I1216 08:14:58.134671 4823 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 08:14:58 crc kubenswrapper[4823]: I1216 08:14:58.239226 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-5vm4d" Dec 16 08:14:58 crc kubenswrapper[4823]: I1216 08:14:58.290942 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5vm4d"] Dec 16 08:15:00 crc kubenswrapper[4823]: I1216 08:15:00.153842 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431215-45r5m"] Dec 16 08:15:00 crc kubenswrapper[4823]: I1216 08:15:00.155264 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431215-45r5m" Dec 16 08:15:00 crc kubenswrapper[4823]: I1216 08:15:00.157798 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 16 08:15:00 crc kubenswrapper[4823]: I1216 08:15:00.157839 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 16 08:15:00 crc kubenswrapper[4823]: I1216 08:15:00.164405 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431215-45r5m"] Dec 16 08:15:00 crc kubenswrapper[4823]: I1216 08:15:00.184402 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gmxd\" (UniqueName: \"kubernetes.io/projected/67e6e2f2-898a-4a31-aa9a-b95c8dcfd0bb-kube-api-access-7gmxd\") pod \"collect-profiles-29431215-45r5m\" (UID: \"67e6e2f2-898a-4a31-aa9a-b95c8dcfd0bb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431215-45r5m" Dec 16 08:15:00 crc kubenswrapper[4823]: I1216 08:15:00.184471 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/67e6e2f2-898a-4a31-aa9a-b95c8dcfd0bb-config-volume\") pod \"collect-profiles-29431215-45r5m\" (UID: \"67e6e2f2-898a-4a31-aa9a-b95c8dcfd0bb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431215-45r5m" Dec 16 08:15:00 crc kubenswrapper[4823]: I1216 08:15:00.184547 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/67e6e2f2-898a-4a31-aa9a-b95c8dcfd0bb-secret-volume\") pod \"collect-profiles-29431215-45r5m\" (UID: \"67e6e2f2-898a-4a31-aa9a-b95c8dcfd0bb\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29431215-45r5m" Dec 16 08:15:00 crc kubenswrapper[4823]: I1216 08:15:00.213300 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-5vm4d" podUID="ee0d93eb-ac78-4ca8-8335-a85c5789999e" containerName="registry-server" containerID="cri-o://b09dfbb7c725f1beaa5f0c6cf022ef4193fd362a1115214fa584f9574b135ed2" gracePeriod=2 Dec 16 08:15:00 crc kubenswrapper[4823]: I1216 08:15:00.285198 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gmxd\" (UniqueName: \"kubernetes.io/projected/67e6e2f2-898a-4a31-aa9a-b95c8dcfd0bb-kube-api-access-7gmxd\") pod \"collect-profiles-29431215-45r5m\" (UID: \"67e6e2f2-898a-4a31-aa9a-b95c8dcfd0bb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431215-45r5m" Dec 16 08:15:00 crc kubenswrapper[4823]: I1216 08:15:00.285256 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/67e6e2f2-898a-4a31-aa9a-b95c8dcfd0bb-config-volume\") pod \"collect-profiles-29431215-45r5m\" (UID: \"67e6e2f2-898a-4a31-aa9a-b95c8dcfd0bb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431215-45r5m" Dec 16 08:15:00 crc kubenswrapper[4823]: I1216 08:15:00.285295 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/67e6e2f2-898a-4a31-aa9a-b95c8dcfd0bb-secret-volume\") pod \"collect-profiles-29431215-45r5m\" (UID: \"67e6e2f2-898a-4a31-aa9a-b95c8dcfd0bb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431215-45r5m" Dec 16 08:15:00 crc kubenswrapper[4823]: I1216 08:15:00.286764 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/67e6e2f2-898a-4a31-aa9a-b95c8dcfd0bb-config-volume\") pod 
\"collect-profiles-29431215-45r5m\" (UID: \"67e6e2f2-898a-4a31-aa9a-b95c8dcfd0bb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431215-45r5m" Dec 16 08:15:00 crc kubenswrapper[4823]: I1216 08:15:00.290971 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/67e6e2f2-898a-4a31-aa9a-b95c8dcfd0bb-secret-volume\") pod \"collect-profiles-29431215-45r5m\" (UID: \"67e6e2f2-898a-4a31-aa9a-b95c8dcfd0bb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431215-45r5m" Dec 16 08:15:00 crc kubenswrapper[4823]: I1216 08:15:00.301556 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gmxd\" (UniqueName: \"kubernetes.io/projected/67e6e2f2-898a-4a31-aa9a-b95c8dcfd0bb-kube-api-access-7gmxd\") pod \"collect-profiles-29431215-45r5m\" (UID: \"67e6e2f2-898a-4a31-aa9a-b95c8dcfd0bb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431215-45r5m" Dec 16 08:15:00 crc kubenswrapper[4823]: I1216 08:15:00.476007 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431215-45r5m" Dec 16 08:15:00 crc kubenswrapper[4823]: I1216 08:15:00.582638 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5vm4d" Dec 16 08:15:00 crc kubenswrapper[4823]: I1216 08:15:00.690280 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee0d93eb-ac78-4ca8-8335-a85c5789999e-catalog-content\") pod \"ee0d93eb-ac78-4ca8-8335-a85c5789999e\" (UID: \"ee0d93eb-ac78-4ca8-8335-a85c5789999e\") " Dec 16 08:15:00 crc kubenswrapper[4823]: I1216 08:15:00.690427 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee0d93eb-ac78-4ca8-8335-a85c5789999e-utilities\") pod \"ee0d93eb-ac78-4ca8-8335-a85c5789999e\" (UID: \"ee0d93eb-ac78-4ca8-8335-a85c5789999e\") " Dec 16 08:15:00 crc kubenswrapper[4823]: I1216 08:15:00.690498 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c9x6c\" (UniqueName: \"kubernetes.io/projected/ee0d93eb-ac78-4ca8-8335-a85c5789999e-kube-api-access-c9x6c\") pod \"ee0d93eb-ac78-4ca8-8335-a85c5789999e\" (UID: \"ee0d93eb-ac78-4ca8-8335-a85c5789999e\") " Dec 16 08:15:00 crc kubenswrapper[4823]: I1216 08:15:00.691528 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee0d93eb-ac78-4ca8-8335-a85c5789999e-utilities" (OuterVolumeSpecName: "utilities") pod "ee0d93eb-ac78-4ca8-8335-a85c5789999e" (UID: "ee0d93eb-ac78-4ca8-8335-a85c5789999e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 08:15:00 crc kubenswrapper[4823]: I1216 08:15:00.696118 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee0d93eb-ac78-4ca8-8335-a85c5789999e-kube-api-access-c9x6c" (OuterVolumeSpecName: "kube-api-access-c9x6c") pod "ee0d93eb-ac78-4ca8-8335-a85c5789999e" (UID: "ee0d93eb-ac78-4ca8-8335-a85c5789999e"). InnerVolumeSpecName "kube-api-access-c9x6c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:15:00 crc kubenswrapper[4823]: I1216 08:15:00.721163 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee0d93eb-ac78-4ca8-8335-a85c5789999e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ee0d93eb-ac78-4ca8-8335-a85c5789999e" (UID: "ee0d93eb-ac78-4ca8-8335-a85c5789999e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 08:15:00 crc kubenswrapper[4823]: I1216 08:15:00.792984 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c9x6c\" (UniqueName: \"kubernetes.io/projected/ee0d93eb-ac78-4ca8-8335-a85c5789999e-kube-api-access-c9x6c\") on node \"crc\" DevicePath \"\"" Dec 16 08:15:00 crc kubenswrapper[4823]: I1216 08:15:00.793358 4823 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee0d93eb-ac78-4ca8-8335-a85c5789999e-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 08:15:00 crc kubenswrapper[4823]: I1216 08:15:00.793386 4823 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee0d93eb-ac78-4ca8-8335-a85c5789999e-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 08:15:00 crc kubenswrapper[4823]: I1216 08:15:00.884501 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431215-45r5m"] Dec 16 08:15:01 crc kubenswrapper[4823]: I1216 08:15:01.223349 4823 generic.go:334] "Generic (PLEG): container finished" podID="ee0d93eb-ac78-4ca8-8335-a85c5789999e" containerID="b09dfbb7c725f1beaa5f0c6cf022ef4193fd362a1115214fa584f9574b135ed2" exitCode=0 Dec 16 08:15:01 crc kubenswrapper[4823]: I1216 08:15:01.223410 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5vm4d" Dec 16 08:15:01 crc kubenswrapper[4823]: I1216 08:15:01.223449 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5vm4d" event={"ID":"ee0d93eb-ac78-4ca8-8335-a85c5789999e","Type":"ContainerDied","Data":"b09dfbb7c725f1beaa5f0c6cf022ef4193fd362a1115214fa584f9574b135ed2"} Dec 16 08:15:01 crc kubenswrapper[4823]: I1216 08:15:01.224743 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5vm4d" event={"ID":"ee0d93eb-ac78-4ca8-8335-a85c5789999e","Type":"ContainerDied","Data":"3f462563a945f038890cc88293b3524000c1bdd9bc715221b64d8f6e56f9f1bc"} Dec 16 08:15:01 crc kubenswrapper[4823]: I1216 08:15:01.224798 4823 scope.go:117] "RemoveContainer" containerID="b09dfbb7c725f1beaa5f0c6cf022ef4193fd362a1115214fa584f9574b135ed2" Dec 16 08:15:01 crc kubenswrapper[4823]: I1216 08:15:01.226267 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431215-45r5m" event={"ID":"67e6e2f2-898a-4a31-aa9a-b95c8dcfd0bb","Type":"ContainerStarted","Data":"888b38634465b11546f4c380d25cb294453abb12e8470b429e8c3ade6a42b4cf"} Dec 16 08:15:01 crc kubenswrapper[4823]: I1216 08:15:01.226319 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431215-45r5m" event={"ID":"67e6e2f2-898a-4a31-aa9a-b95c8dcfd0bb","Type":"ContainerStarted","Data":"bfe035b0defc11835fed6198d0ddd73a0c9d4900ba4011280169d8786df7a6ae"} Dec 16 08:15:01 crc kubenswrapper[4823]: I1216 08:15:01.254991 4823 scope.go:117] "RemoveContainer" containerID="4e97daf047e0229870385a01d07983e975ce2cb2224604492c172c22db86020c" Dec 16 08:15:01 crc kubenswrapper[4823]: I1216 08:15:01.258574 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29431215-45r5m" 
podStartSLOduration=1.258561287 podStartE2EDuration="1.258561287s" podCreationTimestamp="2025-12-16 08:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 08:15:01.252630571 +0000 UTC m=+4779.741196704" watchObservedRunningTime="2025-12-16 08:15:01.258561287 +0000 UTC m=+4779.747127410" Dec 16 08:15:01 crc kubenswrapper[4823]: I1216 08:15:01.272506 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5vm4d"] Dec 16 08:15:01 crc kubenswrapper[4823]: I1216 08:15:01.277921 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-5vm4d"] Dec 16 08:15:01 crc kubenswrapper[4823]: I1216 08:15:01.308120 4823 scope.go:117] "RemoveContainer" containerID="43866fd31978487f43e569cffe2c3097ab546b63dd178415e72d08b4682077f8" Dec 16 08:15:01 crc kubenswrapper[4823]: I1216 08:15:01.338611 4823 scope.go:117] "RemoveContainer" containerID="b09dfbb7c725f1beaa5f0c6cf022ef4193fd362a1115214fa584f9574b135ed2" Dec 16 08:15:01 crc kubenswrapper[4823]: E1216 08:15:01.339246 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b09dfbb7c725f1beaa5f0c6cf022ef4193fd362a1115214fa584f9574b135ed2\": container with ID starting with b09dfbb7c725f1beaa5f0c6cf022ef4193fd362a1115214fa584f9574b135ed2 not found: ID does not exist" containerID="b09dfbb7c725f1beaa5f0c6cf022ef4193fd362a1115214fa584f9574b135ed2" Dec 16 08:15:01 crc kubenswrapper[4823]: I1216 08:15:01.339290 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b09dfbb7c725f1beaa5f0c6cf022ef4193fd362a1115214fa584f9574b135ed2"} err="failed to get container status \"b09dfbb7c725f1beaa5f0c6cf022ef4193fd362a1115214fa584f9574b135ed2\": rpc error: code = NotFound desc = could not find container 
\"b09dfbb7c725f1beaa5f0c6cf022ef4193fd362a1115214fa584f9574b135ed2\": container with ID starting with b09dfbb7c725f1beaa5f0c6cf022ef4193fd362a1115214fa584f9574b135ed2 not found: ID does not exist" Dec 16 08:15:01 crc kubenswrapper[4823]: I1216 08:15:01.339323 4823 scope.go:117] "RemoveContainer" containerID="4e97daf047e0229870385a01d07983e975ce2cb2224604492c172c22db86020c" Dec 16 08:15:01 crc kubenswrapper[4823]: E1216 08:15:01.339775 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e97daf047e0229870385a01d07983e975ce2cb2224604492c172c22db86020c\": container with ID starting with 4e97daf047e0229870385a01d07983e975ce2cb2224604492c172c22db86020c not found: ID does not exist" containerID="4e97daf047e0229870385a01d07983e975ce2cb2224604492c172c22db86020c" Dec 16 08:15:01 crc kubenswrapper[4823]: I1216 08:15:01.339818 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e97daf047e0229870385a01d07983e975ce2cb2224604492c172c22db86020c"} err="failed to get container status \"4e97daf047e0229870385a01d07983e975ce2cb2224604492c172c22db86020c\": rpc error: code = NotFound desc = could not find container \"4e97daf047e0229870385a01d07983e975ce2cb2224604492c172c22db86020c\": container with ID starting with 4e97daf047e0229870385a01d07983e975ce2cb2224604492c172c22db86020c not found: ID does not exist" Dec 16 08:15:01 crc kubenswrapper[4823]: I1216 08:15:01.339863 4823 scope.go:117] "RemoveContainer" containerID="43866fd31978487f43e569cffe2c3097ab546b63dd178415e72d08b4682077f8" Dec 16 08:15:01 crc kubenswrapper[4823]: E1216 08:15:01.340297 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43866fd31978487f43e569cffe2c3097ab546b63dd178415e72d08b4682077f8\": container with ID starting with 43866fd31978487f43e569cffe2c3097ab546b63dd178415e72d08b4682077f8 not found: ID does not exist" 
containerID="43866fd31978487f43e569cffe2c3097ab546b63dd178415e72d08b4682077f8" Dec 16 08:15:01 crc kubenswrapper[4823]: I1216 08:15:01.340324 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43866fd31978487f43e569cffe2c3097ab546b63dd178415e72d08b4682077f8"} err="failed to get container status \"43866fd31978487f43e569cffe2c3097ab546b63dd178415e72d08b4682077f8\": rpc error: code = NotFound desc = could not find container \"43866fd31978487f43e569cffe2c3097ab546b63dd178415e72d08b4682077f8\": container with ID starting with 43866fd31978487f43e569cffe2c3097ab546b63dd178415e72d08b4682077f8 not found: ID does not exist" Dec 16 08:15:01 crc kubenswrapper[4823]: I1216 08:15:01.787794 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee0d93eb-ac78-4ca8-8335-a85c5789999e" path="/var/lib/kubelet/pods/ee0d93eb-ac78-4ca8-8335-a85c5789999e/volumes" Dec 16 08:15:02 crc kubenswrapper[4823]: I1216 08:15:02.240555 4823 generic.go:334] "Generic (PLEG): container finished" podID="67e6e2f2-898a-4a31-aa9a-b95c8dcfd0bb" containerID="888b38634465b11546f4c380d25cb294453abb12e8470b429e8c3ade6a42b4cf" exitCode=0 Dec 16 08:15:02 crc kubenswrapper[4823]: I1216 08:15:02.240635 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431215-45r5m" event={"ID":"67e6e2f2-898a-4a31-aa9a-b95c8dcfd0bb","Type":"ContainerDied","Data":"888b38634465b11546f4c380d25cb294453abb12e8470b429e8c3ade6a42b4cf"} Dec 16 08:15:03 crc kubenswrapper[4823]: I1216 08:15:03.501446 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431215-45r5m" Dec 16 08:15:03 crc kubenswrapper[4823]: I1216 08:15:03.632837 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/67e6e2f2-898a-4a31-aa9a-b95c8dcfd0bb-config-volume\") pod \"67e6e2f2-898a-4a31-aa9a-b95c8dcfd0bb\" (UID: \"67e6e2f2-898a-4a31-aa9a-b95c8dcfd0bb\") " Dec 16 08:15:03 crc kubenswrapper[4823]: I1216 08:15:03.633015 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/67e6e2f2-898a-4a31-aa9a-b95c8dcfd0bb-secret-volume\") pod \"67e6e2f2-898a-4a31-aa9a-b95c8dcfd0bb\" (UID: \"67e6e2f2-898a-4a31-aa9a-b95c8dcfd0bb\") " Dec 16 08:15:03 crc kubenswrapper[4823]: I1216 08:15:03.633137 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7gmxd\" (UniqueName: \"kubernetes.io/projected/67e6e2f2-898a-4a31-aa9a-b95c8dcfd0bb-kube-api-access-7gmxd\") pod \"67e6e2f2-898a-4a31-aa9a-b95c8dcfd0bb\" (UID: \"67e6e2f2-898a-4a31-aa9a-b95c8dcfd0bb\") " Dec 16 08:15:03 crc kubenswrapper[4823]: I1216 08:15:03.633637 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67e6e2f2-898a-4a31-aa9a-b95c8dcfd0bb-config-volume" (OuterVolumeSpecName: "config-volume") pod "67e6e2f2-898a-4a31-aa9a-b95c8dcfd0bb" (UID: "67e6e2f2-898a-4a31-aa9a-b95c8dcfd0bb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:15:03 crc kubenswrapper[4823]: I1216 08:15:03.639514 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67e6e2f2-898a-4a31-aa9a-b95c8dcfd0bb-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "67e6e2f2-898a-4a31-aa9a-b95c8dcfd0bb" (UID: "67e6e2f2-898a-4a31-aa9a-b95c8dcfd0bb"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:15:03 crc kubenswrapper[4823]: I1216 08:15:03.640244 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67e6e2f2-898a-4a31-aa9a-b95c8dcfd0bb-kube-api-access-7gmxd" (OuterVolumeSpecName: "kube-api-access-7gmxd") pod "67e6e2f2-898a-4a31-aa9a-b95c8dcfd0bb" (UID: "67e6e2f2-898a-4a31-aa9a-b95c8dcfd0bb"). InnerVolumeSpecName "kube-api-access-7gmxd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:15:03 crc kubenswrapper[4823]: I1216 08:15:03.735281 4823 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/67e6e2f2-898a-4a31-aa9a-b95c8dcfd0bb-config-volume\") on node \"crc\" DevicePath \"\"" Dec 16 08:15:03 crc kubenswrapper[4823]: I1216 08:15:03.735323 4823 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/67e6e2f2-898a-4a31-aa9a-b95c8dcfd0bb-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 16 08:15:03 crc kubenswrapper[4823]: I1216 08:15:03.735336 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7gmxd\" (UniqueName: \"kubernetes.io/projected/67e6e2f2-898a-4a31-aa9a-b95c8dcfd0bb-kube-api-access-7gmxd\") on node \"crc\" DevicePath \"\"" Dec 16 08:15:04 crc kubenswrapper[4823]: I1216 08:15:04.334270 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431215-45r5m" event={"ID":"67e6e2f2-898a-4a31-aa9a-b95c8dcfd0bb","Type":"ContainerDied","Data":"bfe035b0defc11835fed6198d0ddd73a0c9d4900ba4011280169d8786df7a6ae"} Dec 16 08:15:04 crc kubenswrapper[4823]: I1216 08:15:04.334312 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bfe035b0defc11835fed6198d0ddd73a0c9d4900ba4011280169d8786df7a6ae" Dec 16 08:15:04 crc kubenswrapper[4823]: I1216 08:15:04.334395 4823 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431215-45r5m" Dec 16 08:15:04 crc kubenswrapper[4823]: I1216 08:15:04.385350 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431170-ng292"] Dec 16 08:15:04 crc kubenswrapper[4823]: I1216 08:15:04.390664 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431170-ng292"] Dec 16 08:15:05 crc kubenswrapper[4823]: I1216 08:15:05.782821 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97d1f858-3be7-4e76-99be-0eda5f3f7595" path="/var/lib/kubelet/pods/97d1f858-3be7-4e76-99be-0eda5f3f7595/volumes" Dec 16 08:15:28 crc kubenswrapper[4823]: I1216 08:15:28.134520 4823 patch_prober.go:28] interesting pod/machine-config-daemon-fv56f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 08:15:28 crc kubenswrapper[4823]: I1216 08:15:28.135255 4823 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 08:15:28 crc kubenswrapper[4823]: I1216 08:15:28.135326 4823 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" Dec 16 08:15:28 crc kubenswrapper[4823]: I1216 08:15:28.136122 4823 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"77f0a15859e1c465ef458a6baa328e3fa75c551af5fcc90bf59d9cd46ccb3c67"} 
pod="openshift-machine-config-operator/machine-config-daemon-fv56f" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 16 08:15:28 crc kubenswrapper[4823]: I1216 08:15:28.136192 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" containerName="machine-config-daemon" containerID="cri-o://77f0a15859e1c465ef458a6baa328e3fa75c551af5fcc90bf59d9cd46ccb3c67" gracePeriod=600 Dec 16 08:15:28 crc kubenswrapper[4823]: I1216 08:15:28.593508 4823 generic.go:334] "Generic (PLEG): container finished" podID="25dec47c-3043-486c-b371-2be103c214e3" containerID="77f0a15859e1c465ef458a6baa328e3fa75c551af5fcc90bf59d9cd46ccb3c67" exitCode=0 Dec 16 08:15:28 crc kubenswrapper[4823]: I1216 08:15:28.593588 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" event={"ID":"25dec47c-3043-486c-b371-2be103c214e3","Type":"ContainerDied","Data":"77f0a15859e1c465ef458a6baa328e3fa75c551af5fcc90bf59d9cd46ccb3c67"} Dec 16 08:15:28 crc kubenswrapper[4823]: I1216 08:15:28.593659 4823 scope.go:117] "RemoveContainer" containerID="45952c178049555424b8721fc9a52334bd5e546f8c44fee9f8bebca7e7973c92" Dec 16 08:15:29 crc kubenswrapper[4823]: I1216 08:15:29.606510 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" event={"ID":"25dec47c-3043-486c-b371-2be103c214e3","Type":"ContainerStarted","Data":"28492ebfdcae248f34331dcd5ce9b91596654f34717e0e98b5167555edcbc9ea"} Dec 16 08:15:41 crc kubenswrapper[4823]: I1216 08:15:41.523820 4823 scope.go:117] "RemoveContainer" containerID="6b8d8117e276881284b088bf5e8d963380dfc7fc2c607547925104130ea3d392" Dec 16 08:17:28 crc kubenswrapper[4823]: I1216 08:17:28.134491 4823 patch_prober.go:28] interesting pod/machine-config-daemon-fv56f 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 16 08:17:28 crc kubenswrapper[4823]: I1216 08:17:28.135104 4823 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 16 08:17:36 crc kubenswrapper[4823]: I1216 08:17:36.161263 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-hgqtw"]
Dec 16 08:17:36 crc kubenswrapper[4823]: E1216 08:17:36.162174 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee0d93eb-ac78-4ca8-8335-a85c5789999e" containerName="extract-content"
Dec 16 08:17:36 crc kubenswrapper[4823]: I1216 08:17:36.162199 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee0d93eb-ac78-4ca8-8335-a85c5789999e" containerName="extract-content"
Dec 16 08:17:36 crc kubenswrapper[4823]: E1216 08:17:36.162219 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee0d93eb-ac78-4ca8-8335-a85c5789999e" containerName="registry-server"
Dec 16 08:17:36 crc kubenswrapper[4823]: I1216 08:17:36.162227 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee0d93eb-ac78-4ca8-8335-a85c5789999e" containerName="registry-server"
Dec 16 08:17:36 crc kubenswrapper[4823]: E1216 08:17:36.162242 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67e6e2f2-898a-4a31-aa9a-b95c8dcfd0bb" containerName="collect-profiles"
Dec 16 08:17:36 crc kubenswrapper[4823]: I1216 08:17:36.162249 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="67e6e2f2-898a-4a31-aa9a-b95c8dcfd0bb" containerName="collect-profiles"
Dec 16 08:17:36 crc kubenswrapper[4823]: E1216 08:17:36.162259 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee0d93eb-ac78-4ca8-8335-a85c5789999e" containerName="extract-utilities"
Dec 16 08:17:36 crc kubenswrapper[4823]: I1216 08:17:36.162268 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee0d93eb-ac78-4ca8-8335-a85c5789999e" containerName="extract-utilities"
Dec 16 08:17:36 crc kubenswrapper[4823]: I1216 08:17:36.162427 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="67e6e2f2-898a-4a31-aa9a-b95c8dcfd0bb" containerName="collect-profiles"
Dec 16 08:17:36 crc kubenswrapper[4823]: I1216 08:17:36.162454 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee0d93eb-ac78-4ca8-8335-a85c5789999e" containerName="registry-server"
Dec 16 08:17:36 crc kubenswrapper[4823]: I1216 08:17:36.163692 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hgqtw"
Dec 16 08:17:36 crc kubenswrapper[4823]: I1216 08:17:36.190141 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hgqtw"]
Dec 16 08:17:36 crc kubenswrapper[4823]: I1216 08:17:36.306189 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcs9z\" (UniqueName: \"kubernetes.io/projected/820bdf07-9a25-4ea2-8bc1-cd65f2820432-kube-api-access-tcs9z\") pod \"community-operators-hgqtw\" (UID: \"820bdf07-9a25-4ea2-8bc1-cd65f2820432\") " pod="openshift-marketplace/community-operators-hgqtw"
Dec 16 08:17:36 crc kubenswrapper[4823]: I1216 08:17:36.306267 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/820bdf07-9a25-4ea2-8bc1-cd65f2820432-utilities\") pod \"community-operators-hgqtw\" (UID: \"820bdf07-9a25-4ea2-8bc1-cd65f2820432\") " pod="openshift-marketplace/community-operators-hgqtw"
Dec 16 08:17:36 crc kubenswrapper[4823]: I1216 08:17:36.306351 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/820bdf07-9a25-4ea2-8bc1-cd65f2820432-catalog-content\") pod \"community-operators-hgqtw\" (UID: \"820bdf07-9a25-4ea2-8bc1-cd65f2820432\") " pod="openshift-marketplace/community-operators-hgqtw"
Dec 16 08:17:36 crc kubenswrapper[4823]: I1216 08:17:36.407500 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/820bdf07-9a25-4ea2-8bc1-cd65f2820432-catalog-content\") pod \"community-operators-hgqtw\" (UID: \"820bdf07-9a25-4ea2-8bc1-cd65f2820432\") " pod="openshift-marketplace/community-operators-hgqtw"
Dec 16 08:17:36 crc kubenswrapper[4823]: I1216 08:17:36.407588 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tcs9z\" (UniqueName: \"kubernetes.io/projected/820bdf07-9a25-4ea2-8bc1-cd65f2820432-kube-api-access-tcs9z\") pod \"community-operators-hgqtw\" (UID: \"820bdf07-9a25-4ea2-8bc1-cd65f2820432\") " pod="openshift-marketplace/community-operators-hgqtw"
Dec 16 08:17:36 crc kubenswrapper[4823]: I1216 08:17:36.407628 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/820bdf07-9a25-4ea2-8bc1-cd65f2820432-utilities\") pod \"community-operators-hgqtw\" (UID: \"820bdf07-9a25-4ea2-8bc1-cd65f2820432\") " pod="openshift-marketplace/community-operators-hgqtw"
Dec 16 08:17:36 crc kubenswrapper[4823]: I1216 08:17:36.408016 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/820bdf07-9a25-4ea2-8bc1-cd65f2820432-catalog-content\") pod \"community-operators-hgqtw\" (UID: \"820bdf07-9a25-4ea2-8bc1-cd65f2820432\") " pod="openshift-marketplace/community-operators-hgqtw"
Dec 16 08:17:36 crc kubenswrapper[4823]: I1216 08:17:36.408053 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/820bdf07-9a25-4ea2-8bc1-cd65f2820432-utilities\") pod \"community-operators-hgqtw\" (UID: \"820bdf07-9a25-4ea2-8bc1-cd65f2820432\") " pod="openshift-marketplace/community-operators-hgqtw"
Dec 16 08:17:36 crc kubenswrapper[4823]: I1216 08:17:36.428597 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcs9z\" (UniqueName: \"kubernetes.io/projected/820bdf07-9a25-4ea2-8bc1-cd65f2820432-kube-api-access-tcs9z\") pod \"community-operators-hgqtw\" (UID: \"820bdf07-9a25-4ea2-8bc1-cd65f2820432\") " pod="openshift-marketplace/community-operators-hgqtw"
Dec 16 08:17:36 crc kubenswrapper[4823]: I1216 08:17:36.483656 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hgqtw"
Dec 16 08:17:36 crc kubenswrapper[4823]: I1216 08:17:36.733789 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hgqtw"]
Dec 16 08:17:37 crc kubenswrapper[4823]: I1216 08:17:37.533694 4823 generic.go:334] "Generic (PLEG): container finished" podID="820bdf07-9a25-4ea2-8bc1-cd65f2820432" containerID="e744e0fd8a43f6aa84a088c434a698f1fb051d84a7f3543339d6bac168bac7c2" exitCode=0
Dec 16 08:17:37 crc kubenswrapper[4823]: I1216 08:17:37.533760 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hgqtw" event={"ID":"820bdf07-9a25-4ea2-8bc1-cd65f2820432","Type":"ContainerDied","Data":"e744e0fd8a43f6aa84a088c434a698f1fb051d84a7f3543339d6bac168bac7c2"}
Dec 16 08:17:37 crc kubenswrapper[4823]: I1216 08:17:37.534760 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hgqtw" event={"ID":"820bdf07-9a25-4ea2-8bc1-cd65f2820432","Type":"ContainerStarted","Data":"15b65b359c44a3f00492f354a9d21f0c86edd3cf0232ac302110c6067388be0e"}
Dec 16 08:17:37 crc kubenswrapper[4823]: I1216 08:17:37.535914 4823 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 16 08:17:41 crc kubenswrapper[4823]: I1216 08:17:41.562551 4823 generic.go:334] "Generic (PLEG): container finished" podID="820bdf07-9a25-4ea2-8bc1-cd65f2820432" containerID="3bbd6a57b0a034d93fc71388a263d3cab784f78ff6af3e75576887e6acc56b5a" exitCode=0
Dec 16 08:17:41 crc kubenswrapper[4823]: I1216 08:17:41.562585 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hgqtw" event={"ID":"820bdf07-9a25-4ea2-8bc1-cd65f2820432","Type":"ContainerDied","Data":"3bbd6a57b0a034d93fc71388a263d3cab784f78ff6af3e75576887e6acc56b5a"}
Dec 16 08:17:42 crc kubenswrapper[4823]: I1216 08:17:42.571474 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hgqtw" event={"ID":"820bdf07-9a25-4ea2-8bc1-cd65f2820432","Type":"ContainerStarted","Data":"203aeff68e6cc4bf696cd36824e02aa4298bfe88f71c9d61d53c6e8fb660829c"}
Dec 16 08:17:46 crc kubenswrapper[4823]: I1216 08:17:46.484919 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-hgqtw"
Dec 16 08:17:46 crc kubenswrapper[4823]: I1216 08:17:46.485296 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-hgqtw"
Dec 16 08:17:46 crc kubenswrapper[4823]: I1216 08:17:46.527704 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-hgqtw"
Dec 16 08:17:46 crc kubenswrapper[4823]: I1216 08:17:46.545322 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-hgqtw" podStartSLOduration=5.993723295 podStartE2EDuration="10.545295074s" podCreationTimestamp="2025-12-16 08:17:36 +0000 UTC" firstStartedPulling="2025-12-16 08:17:37.535567955 +0000 UTC m=+4936.024134078" lastFinishedPulling="2025-12-16 08:17:42.087139734 +0000 UTC m=+4940.575705857" observedRunningTime="2025-12-16 08:17:42.591257845 +0000 UTC m=+4941.079823968" watchObservedRunningTime="2025-12-16 08:17:46.545295074 +0000 UTC m=+4945.033861207"
Dec 16 08:17:56 crc kubenswrapper[4823]: I1216 08:17:56.533550 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-hgqtw"
Dec 16 08:17:56 crc kubenswrapper[4823]: I1216 08:17:56.583967 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hgqtw"]
Dec 16 08:17:56 crc kubenswrapper[4823]: I1216 08:17:56.672891 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-hgqtw" podUID="820bdf07-9a25-4ea2-8bc1-cd65f2820432" containerName="registry-server" containerID="cri-o://203aeff68e6cc4bf696cd36824e02aa4298bfe88f71c9d61d53c6e8fb660829c" gracePeriod=2
Dec 16 08:17:58 crc kubenswrapper[4823]: I1216 08:17:58.134441 4823 patch_prober.go:28] interesting pod/machine-config-daemon-fv56f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 16 08:17:58 crc kubenswrapper[4823]: I1216 08:17:58.134734 4823 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 16 08:17:58 crc kubenswrapper[4823]: I1216 08:17:58.232757 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hgqtw"
Dec 16 08:17:58 crc kubenswrapper[4823]: I1216 08:17:58.337806 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/820bdf07-9a25-4ea2-8bc1-cd65f2820432-catalog-content\") pod \"820bdf07-9a25-4ea2-8bc1-cd65f2820432\" (UID: \"820bdf07-9a25-4ea2-8bc1-cd65f2820432\") "
Dec 16 08:17:58 crc kubenswrapper[4823]: I1216 08:17:58.337944 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tcs9z\" (UniqueName: \"kubernetes.io/projected/820bdf07-9a25-4ea2-8bc1-cd65f2820432-kube-api-access-tcs9z\") pod \"820bdf07-9a25-4ea2-8bc1-cd65f2820432\" (UID: \"820bdf07-9a25-4ea2-8bc1-cd65f2820432\") "
Dec 16 08:17:58 crc kubenswrapper[4823]: I1216 08:17:58.338048 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/820bdf07-9a25-4ea2-8bc1-cd65f2820432-utilities\") pod \"820bdf07-9a25-4ea2-8bc1-cd65f2820432\" (UID: \"820bdf07-9a25-4ea2-8bc1-cd65f2820432\") "
Dec 16 08:17:58 crc kubenswrapper[4823]: I1216 08:17:58.343108 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/820bdf07-9a25-4ea2-8bc1-cd65f2820432-utilities" (OuterVolumeSpecName: "utilities") pod "820bdf07-9a25-4ea2-8bc1-cd65f2820432" (UID: "820bdf07-9a25-4ea2-8bc1-cd65f2820432"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 16 08:17:58 crc kubenswrapper[4823]: I1216 08:17:58.349303 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/820bdf07-9a25-4ea2-8bc1-cd65f2820432-kube-api-access-tcs9z" (OuterVolumeSpecName: "kube-api-access-tcs9z") pod "820bdf07-9a25-4ea2-8bc1-cd65f2820432" (UID: "820bdf07-9a25-4ea2-8bc1-cd65f2820432"). InnerVolumeSpecName "kube-api-access-tcs9z". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 08:17:58 crc kubenswrapper[4823]: I1216 08:17:58.396868 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/820bdf07-9a25-4ea2-8bc1-cd65f2820432-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "820bdf07-9a25-4ea2-8bc1-cd65f2820432" (UID: "820bdf07-9a25-4ea2-8bc1-cd65f2820432"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 16 08:17:58 crc kubenswrapper[4823]: I1216 08:17:58.439136 4823 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/820bdf07-9a25-4ea2-8bc1-cd65f2820432-utilities\") on node \"crc\" DevicePath \"\""
Dec 16 08:17:58 crc kubenswrapper[4823]: I1216 08:17:58.439178 4823 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/820bdf07-9a25-4ea2-8bc1-cd65f2820432-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 16 08:17:58 crc kubenswrapper[4823]: I1216 08:17:58.439195 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tcs9z\" (UniqueName: \"kubernetes.io/projected/820bdf07-9a25-4ea2-8bc1-cd65f2820432-kube-api-access-tcs9z\") on node \"crc\" DevicePath \"\""
Dec 16 08:17:58 crc kubenswrapper[4823]: I1216 08:17:58.689466 4823 generic.go:334] "Generic (PLEG): container finished" podID="820bdf07-9a25-4ea2-8bc1-cd65f2820432" containerID="203aeff68e6cc4bf696cd36824e02aa4298bfe88f71c9d61d53c6e8fb660829c" exitCode=0
Dec 16 08:17:58 crc kubenswrapper[4823]: I1216 08:17:58.689519 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hgqtw" event={"ID":"820bdf07-9a25-4ea2-8bc1-cd65f2820432","Type":"ContainerDied","Data":"203aeff68e6cc4bf696cd36824e02aa4298bfe88f71c9d61d53c6e8fb660829c"}
Dec 16 08:17:58 crc kubenswrapper[4823]: I1216 08:17:58.689554 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hgqtw" event={"ID":"820bdf07-9a25-4ea2-8bc1-cd65f2820432","Type":"ContainerDied","Data":"15b65b359c44a3f00492f354a9d21f0c86edd3cf0232ac302110c6067388be0e"}
Dec 16 08:17:58 crc kubenswrapper[4823]: I1216 08:17:58.689575 4823 scope.go:117] "RemoveContainer" containerID="203aeff68e6cc4bf696cd36824e02aa4298bfe88f71c9d61d53c6e8fb660829c"
Dec 16 08:17:58 crc kubenswrapper[4823]: I1216 08:17:58.689573 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hgqtw"
Dec 16 08:17:58 crc kubenswrapper[4823]: I1216 08:17:58.718890 4823 scope.go:117] "RemoveContainer" containerID="3bbd6a57b0a034d93fc71388a263d3cab784f78ff6af3e75576887e6acc56b5a"
Dec 16 08:17:58 crc kubenswrapper[4823]: I1216 08:17:58.736177 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hgqtw"]
Dec 16 08:17:58 crc kubenswrapper[4823]: I1216 08:17:58.753910 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-hgqtw"]
Dec 16 08:17:58 crc kubenswrapper[4823]: I1216 08:17:58.765523 4823 scope.go:117] "RemoveContainer" containerID="e744e0fd8a43f6aa84a088c434a698f1fb051d84a7f3543339d6bac168bac7c2"
Dec 16 08:17:58 crc kubenswrapper[4823]: I1216 08:17:58.788128 4823 scope.go:117] "RemoveContainer" containerID="203aeff68e6cc4bf696cd36824e02aa4298bfe88f71c9d61d53c6e8fb660829c"
Dec 16 08:17:58 crc kubenswrapper[4823]: E1216 08:17:58.788617 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"203aeff68e6cc4bf696cd36824e02aa4298bfe88f71c9d61d53c6e8fb660829c\": container with ID starting with 203aeff68e6cc4bf696cd36824e02aa4298bfe88f71c9d61d53c6e8fb660829c not found: ID does not exist" containerID="203aeff68e6cc4bf696cd36824e02aa4298bfe88f71c9d61d53c6e8fb660829c"
Dec 16 08:17:58 crc kubenswrapper[4823]: I1216 08:17:58.788680 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"203aeff68e6cc4bf696cd36824e02aa4298bfe88f71c9d61d53c6e8fb660829c"} err="failed to get container status \"203aeff68e6cc4bf696cd36824e02aa4298bfe88f71c9d61d53c6e8fb660829c\": rpc error: code = NotFound desc = could not find container \"203aeff68e6cc4bf696cd36824e02aa4298bfe88f71c9d61d53c6e8fb660829c\": container with ID starting with 203aeff68e6cc4bf696cd36824e02aa4298bfe88f71c9d61d53c6e8fb660829c not found: ID does not exist"
Dec 16 08:17:58 crc kubenswrapper[4823]: I1216 08:17:58.788708 4823 scope.go:117] "RemoveContainer" containerID="3bbd6a57b0a034d93fc71388a263d3cab784f78ff6af3e75576887e6acc56b5a"
Dec 16 08:17:58 crc kubenswrapper[4823]: E1216 08:17:58.789313 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3bbd6a57b0a034d93fc71388a263d3cab784f78ff6af3e75576887e6acc56b5a\": container with ID starting with 3bbd6a57b0a034d93fc71388a263d3cab784f78ff6af3e75576887e6acc56b5a not found: ID does not exist" containerID="3bbd6a57b0a034d93fc71388a263d3cab784f78ff6af3e75576887e6acc56b5a"
Dec 16 08:17:58 crc kubenswrapper[4823]: I1216 08:17:58.789356 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3bbd6a57b0a034d93fc71388a263d3cab784f78ff6af3e75576887e6acc56b5a"} err="failed to get container status \"3bbd6a57b0a034d93fc71388a263d3cab784f78ff6af3e75576887e6acc56b5a\": rpc error: code = NotFound desc = could not find container \"3bbd6a57b0a034d93fc71388a263d3cab784f78ff6af3e75576887e6acc56b5a\": container with ID starting with 3bbd6a57b0a034d93fc71388a263d3cab784f78ff6af3e75576887e6acc56b5a not found: ID does not exist"
Dec 16 08:17:58 crc kubenswrapper[4823]: I1216 08:17:58.789382 4823 scope.go:117] "RemoveContainer" containerID="e744e0fd8a43f6aa84a088c434a698f1fb051d84a7f3543339d6bac168bac7c2"
Dec 16 08:17:58 crc kubenswrapper[4823]: E1216 08:17:58.789757 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e744e0fd8a43f6aa84a088c434a698f1fb051d84a7f3543339d6bac168bac7c2\": container with ID starting with e744e0fd8a43f6aa84a088c434a698f1fb051d84a7f3543339d6bac168bac7c2 not found: ID does not exist" containerID="e744e0fd8a43f6aa84a088c434a698f1fb051d84a7f3543339d6bac168bac7c2"
Dec 16 08:17:58 crc kubenswrapper[4823]: I1216 08:17:58.789799 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e744e0fd8a43f6aa84a088c434a698f1fb051d84a7f3543339d6bac168bac7c2"} err="failed to get container status \"e744e0fd8a43f6aa84a088c434a698f1fb051d84a7f3543339d6bac168bac7c2\": rpc error: code = NotFound desc = could not find container \"e744e0fd8a43f6aa84a088c434a698f1fb051d84a7f3543339d6bac168bac7c2\": container with ID starting with e744e0fd8a43f6aa84a088c434a698f1fb051d84a7f3543339d6bac168bac7c2 not found: ID does not exist"
Dec 16 08:17:59 crc kubenswrapper[4823]: I1216 08:17:59.781497 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="820bdf07-9a25-4ea2-8bc1-cd65f2820432" path="/var/lib/kubelet/pods/820bdf07-9a25-4ea2-8bc1-cd65f2820432/volumes"
Dec 16 08:18:28 crc kubenswrapper[4823]: I1216 08:18:28.134525 4823 patch_prober.go:28] interesting pod/machine-config-daemon-fv56f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 16 08:18:28 crc kubenswrapper[4823]: I1216 08:18:28.135334 4823 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 16 08:18:28 crc kubenswrapper[4823]: I1216 08:18:28.135418 4823 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fv56f"
Dec 16 08:18:28 crc kubenswrapper[4823]: I1216 08:18:28.136582 4823 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"28492ebfdcae248f34331dcd5ce9b91596654f34717e0e98b5167555edcbc9ea"} pod="openshift-machine-config-operator/machine-config-daemon-fv56f" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 16 08:18:28 crc kubenswrapper[4823]: I1216 08:18:28.136708 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" containerName="machine-config-daemon" containerID="cri-o://28492ebfdcae248f34331dcd5ce9b91596654f34717e0e98b5167555edcbc9ea" gracePeriod=600
Dec 16 08:18:28 crc kubenswrapper[4823]: E1216 08:18:28.283287 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3"
Dec 16 08:18:28 crc kubenswrapper[4823]: I1216 08:18:28.933206 4823 generic.go:334] "Generic (PLEG): container finished" podID="25dec47c-3043-486c-b371-2be103c214e3" containerID="28492ebfdcae248f34331dcd5ce9b91596654f34717e0e98b5167555edcbc9ea" exitCode=0
Dec 16 08:18:28 crc kubenswrapper[4823]: I1216 08:18:28.933448 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" event={"ID":"25dec47c-3043-486c-b371-2be103c214e3","Type":"ContainerDied","Data":"28492ebfdcae248f34331dcd5ce9b91596654f34717e0e98b5167555edcbc9ea"}
Dec 16 08:18:28 crc kubenswrapper[4823]: I1216 08:18:28.933556 4823 scope.go:117] "RemoveContainer" containerID="77f0a15859e1c465ef458a6baa328e3fa75c551af5fcc90bf59d9cd46ccb3c67"
Dec 16 08:18:28 crc kubenswrapper[4823]: I1216 08:18:28.933945 4823 scope.go:117] "RemoveContainer" containerID="28492ebfdcae248f34331dcd5ce9b91596654f34717e0e98b5167555edcbc9ea"
Dec 16 08:18:28 crc kubenswrapper[4823]: E1216 08:18:28.934281 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3"
Dec 16 08:18:39 crc kubenswrapper[4823]: I1216 08:18:39.772155 4823 scope.go:117] "RemoveContainer" containerID="28492ebfdcae248f34331dcd5ce9b91596654f34717e0e98b5167555edcbc9ea"
Dec 16 08:18:39 crc kubenswrapper[4823]: E1216 08:18:39.772970 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3"
Dec 16 08:18:53 crc kubenswrapper[4823]: I1216 08:18:53.772735 4823 scope.go:117] "RemoveContainer" containerID="28492ebfdcae248f34331dcd5ce9b91596654f34717e0e98b5167555edcbc9ea"
Dec 16 08:18:53 crc kubenswrapper[4823]: E1216 08:18:53.774828 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3"
Dec 16 08:19:06 crc kubenswrapper[4823]: I1216 08:19:06.771732 4823 scope.go:117] "RemoveContainer" containerID="28492ebfdcae248f34331dcd5ce9b91596654f34717e0e98b5167555edcbc9ea"
Dec 16 08:19:06 crc kubenswrapper[4823]: E1216 08:19:06.772742 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3"
Dec 16 08:19:14 crc kubenswrapper[4823]: I1216 08:19:14.891405 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-xn64w"]
Dec 16 08:19:14 crc kubenswrapper[4823]: E1216 08:19:14.892236 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="820bdf07-9a25-4ea2-8bc1-cd65f2820432" containerName="registry-server"
Dec 16 08:19:14 crc kubenswrapper[4823]: I1216 08:19:14.892250 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="820bdf07-9a25-4ea2-8bc1-cd65f2820432" containerName="registry-server"
Dec 16 08:19:14 crc kubenswrapper[4823]: E1216 08:19:14.892279 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="820bdf07-9a25-4ea2-8bc1-cd65f2820432" containerName="extract-content"
Dec 16 08:19:14 crc kubenswrapper[4823]: I1216 08:19:14.892285 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="820bdf07-9a25-4ea2-8bc1-cd65f2820432" containerName="extract-content"
Dec 16 08:19:14 crc kubenswrapper[4823]: E1216 08:19:14.892296 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="820bdf07-9a25-4ea2-8bc1-cd65f2820432" containerName="extract-utilities"
Dec 16 08:19:14 crc kubenswrapper[4823]: I1216 08:19:14.892303 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="820bdf07-9a25-4ea2-8bc1-cd65f2820432" containerName="extract-utilities"
Dec 16 08:19:14 crc kubenswrapper[4823]: I1216 08:19:14.892417 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="820bdf07-9a25-4ea2-8bc1-cd65f2820432" containerName="registry-server"
Dec 16 08:19:14 crc kubenswrapper[4823]: I1216 08:19:14.893536 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xn64w"
Dec 16 08:19:14 crc kubenswrapper[4823]: I1216 08:19:14.907867 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xn64w"]
Dec 16 08:19:14 crc kubenswrapper[4823]: I1216 08:19:14.968585 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d39035a8-d132-4c9f-9508-e8c3daff8c62-catalog-content\") pod \"certified-operators-xn64w\" (UID: \"d39035a8-d132-4c9f-9508-e8c3daff8c62\") " pod="openshift-marketplace/certified-operators-xn64w"
Dec 16 08:19:14 crc kubenswrapper[4823]: I1216 08:19:14.968651 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzjfb\" (UniqueName: \"kubernetes.io/projected/d39035a8-d132-4c9f-9508-e8c3daff8c62-kube-api-access-hzjfb\") pod \"certified-operators-xn64w\" (UID: \"d39035a8-d132-4c9f-9508-e8c3daff8c62\") " pod="openshift-marketplace/certified-operators-xn64w"
Dec 16 08:19:14 crc kubenswrapper[4823]: I1216 08:19:14.968728 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d39035a8-d132-4c9f-9508-e8c3daff8c62-utilities\") pod \"certified-operators-xn64w\" (UID: \"d39035a8-d132-4c9f-9508-e8c3daff8c62\") " pod="openshift-marketplace/certified-operators-xn64w"
Dec 16 08:19:15 crc kubenswrapper[4823]: I1216 08:19:15.070191 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d39035a8-d132-4c9f-9508-e8c3daff8c62-catalog-content\") pod \"certified-operators-xn64w\" (UID: \"d39035a8-d132-4c9f-9508-e8c3daff8c62\") " pod="openshift-marketplace/certified-operators-xn64w"
Dec 16 08:19:15 crc kubenswrapper[4823]: I1216 08:19:15.070250 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hzjfb\" (UniqueName: \"kubernetes.io/projected/d39035a8-d132-4c9f-9508-e8c3daff8c62-kube-api-access-hzjfb\") pod \"certified-operators-xn64w\" (UID: \"d39035a8-d132-4c9f-9508-e8c3daff8c62\") " pod="openshift-marketplace/certified-operators-xn64w"
Dec 16 08:19:15 crc kubenswrapper[4823]: I1216 08:19:15.070323 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d39035a8-d132-4c9f-9508-e8c3daff8c62-utilities\") pod \"certified-operators-xn64w\" (UID: \"d39035a8-d132-4c9f-9508-e8c3daff8c62\") " pod="openshift-marketplace/certified-operators-xn64w"
Dec 16 08:19:15 crc kubenswrapper[4823]: I1216 08:19:15.070776 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d39035a8-d132-4c9f-9508-e8c3daff8c62-utilities\") pod \"certified-operators-xn64w\" (UID: \"d39035a8-d132-4c9f-9508-e8c3daff8c62\") " pod="openshift-marketplace/certified-operators-xn64w"
Dec 16 08:19:15 crc kubenswrapper[4823]: I1216 08:19:15.070992 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d39035a8-d132-4c9f-9508-e8c3daff8c62-catalog-content\") pod \"certified-operators-xn64w\" (UID: \"d39035a8-d132-4c9f-9508-e8c3daff8c62\") " pod="openshift-marketplace/certified-operators-xn64w"
Dec 16 08:19:15 crc kubenswrapper[4823]: I1216 08:19:15.094336 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzjfb\" (UniqueName: \"kubernetes.io/projected/d39035a8-d132-4c9f-9508-e8c3daff8c62-kube-api-access-hzjfb\") pod \"certified-operators-xn64w\" (UID: \"d39035a8-d132-4c9f-9508-e8c3daff8c62\") " pod="openshift-marketplace/certified-operators-xn64w"
Dec 16 08:19:15 crc kubenswrapper[4823]: I1216 08:19:15.231381 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xn64w"
Dec 16 08:19:15 crc kubenswrapper[4823]: I1216 08:19:15.530363 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xn64w"]
Dec 16 08:19:16 crc kubenswrapper[4823]: I1216 08:19:16.292669 4823 generic.go:334] "Generic (PLEG): container finished" podID="d39035a8-d132-4c9f-9508-e8c3daff8c62" containerID="afd39d5a6fffb6394650587f7510442782abc93e01384f16c19b2675a5e494ea" exitCode=0
Dec 16 08:19:16 crc kubenswrapper[4823]: I1216 08:19:16.292737 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xn64w" event={"ID":"d39035a8-d132-4c9f-9508-e8c3daff8c62","Type":"ContainerDied","Data":"afd39d5a6fffb6394650587f7510442782abc93e01384f16c19b2675a5e494ea"}
Dec 16 08:19:16 crc kubenswrapper[4823]: I1216 08:19:16.293122 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xn64w" event={"ID":"d39035a8-d132-4c9f-9508-e8c3daff8c62","Type":"ContainerStarted","Data":"19ca87697ae95bb97b56547f0901fbb2323a6e4abcb4447778339e9150921648"}
Dec 16 08:19:17 crc kubenswrapper[4823]: I1216 08:19:17.305628 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xn64w" event={"ID":"d39035a8-d132-4c9f-9508-e8c3daff8c62","Type":"ContainerStarted","Data":"87278d64ad439eac3ca509c508b9ff8c9ee892fdad3106e6b8e5902d582c1768"}
Dec 16 08:19:17 crc kubenswrapper[4823]: I1216 08:19:17.772440 4823 scope.go:117] "RemoveContainer" containerID="28492ebfdcae248f34331dcd5ce9b91596654f34717e0e98b5167555edcbc9ea"
Dec 16 08:19:17 crc kubenswrapper[4823]: E1216 08:19:17.772929 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3"
Dec 16 08:19:18 crc kubenswrapper[4823]: I1216 08:19:18.315002 4823 generic.go:334] "Generic (PLEG): container finished" podID="d39035a8-d132-4c9f-9508-e8c3daff8c62" containerID="87278d64ad439eac3ca509c508b9ff8c9ee892fdad3106e6b8e5902d582c1768" exitCode=0
Dec 16 08:19:18 crc kubenswrapper[4823]: I1216 08:19:18.315075 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xn64w" event={"ID":"d39035a8-d132-4c9f-9508-e8c3daff8c62","Type":"ContainerDied","Data":"87278d64ad439eac3ca509c508b9ff8c9ee892fdad3106e6b8e5902d582c1768"}
Dec 16 08:19:19 crc kubenswrapper[4823]: I1216 08:19:19.324710 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xn64w" event={"ID":"d39035a8-d132-4c9f-9508-e8c3daff8c62","Type":"ContainerStarted","Data":"4e56212c4d18f4383776ef060a221e34547bb0f7c77c2617247d0db025e44367"}
Dec 16 08:19:19 crc kubenswrapper[4823]: I1216 08:19:19.348888 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-xn64w" podStartSLOduration=2.83356568 podStartE2EDuration="5.348864762s" podCreationTimestamp="2025-12-16 08:19:14 +0000 UTC" firstStartedPulling="2025-12-16 08:19:16.294575626 +0000 UTC m=+5034.783141749" lastFinishedPulling="2025-12-16 08:19:18.809874698 +0000 UTC m=+5037.298440831" observedRunningTime="2025-12-16 08:19:19.343387721 +0000 UTC m=+5037.831953864" watchObservedRunningTime="2025-12-16 08:19:19.348864762 +0000 UTC m=+5037.837430885"
Dec 16 08:19:25 crc kubenswrapper[4823]: I1216 08:19:25.232103 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-xn64w"
Dec 16 08:19:25 crc kubenswrapper[4823]: I1216 08:19:25.232819 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-xn64w"
Dec 16 08:19:25 crc kubenswrapper[4823]: I1216 08:19:25.279137 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-xn64w"
Dec 16 08:19:25 crc kubenswrapper[4823]: I1216 08:19:25.408202 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-xn64w"
Dec 16 08:19:25 crc kubenswrapper[4823]: I1216 08:19:25.511917 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xn64w"]
Dec 16 08:19:27 crc kubenswrapper[4823]: I1216 08:19:27.386168 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-xn64w" podUID="d39035a8-d132-4c9f-9508-e8c3daff8c62" containerName="registry-server" containerID="cri-o://4e56212c4d18f4383776ef060a221e34547bb0f7c77c2617247d0db025e44367" gracePeriod=2
Dec 16 08:19:28 crc kubenswrapper[4823]: I1216 08:19:28.292399 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xn64w"
Dec 16 08:19:28 crc kubenswrapper[4823]: I1216 08:19:28.358752 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hzjfb\" (UniqueName: \"kubernetes.io/projected/d39035a8-d132-4c9f-9508-e8c3daff8c62-kube-api-access-hzjfb\") pod \"d39035a8-d132-4c9f-9508-e8c3daff8c62\" (UID: \"d39035a8-d132-4c9f-9508-e8c3daff8c62\") "
Dec 16 08:19:28 crc kubenswrapper[4823]: I1216 08:19:28.358826 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d39035a8-d132-4c9f-9508-e8c3daff8c62-catalog-content\") pod \"d39035a8-d132-4c9f-9508-e8c3daff8c62\" (UID: \"d39035a8-d132-4c9f-9508-e8c3daff8c62\") "
Dec 16 08:19:28 crc kubenswrapper[4823]: I1216 08:19:28.358908 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d39035a8-d132-4c9f-9508-e8c3daff8c62-utilities\") pod \"d39035a8-d132-4c9f-9508-e8c3daff8c62\" (UID: \"d39035a8-d132-4c9f-9508-e8c3daff8c62\") "
Dec 16 08:19:28 crc kubenswrapper[4823]: I1216 08:19:28.360104 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d39035a8-d132-4c9f-9508-e8c3daff8c62-utilities" (OuterVolumeSpecName: "utilities") pod "d39035a8-d132-4c9f-9508-e8c3daff8c62" (UID: "d39035a8-d132-4c9f-9508-e8c3daff8c62"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 16 08:19:28 crc kubenswrapper[4823]: I1216 08:19:28.375476 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d39035a8-d132-4c9f-9508-e8c3daff8c62-kube-api-access-hzjfb" (OuterVolumeSpecName: "kube-api-access-hzjfb") pod "d39035a8-d132-4c9f-9508-e8c3daff8c62" (UID: "d39035a8-d132-4c9f-9508-e8c3daff8c62"). InnerVolumeSpecName "kube-api-access-hzjfb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 08:19:28 crc kubenswrapper[4823]: I1216 08:19:28.394251 4823 generic.go:334] "Generic (PLEG): container finished" podID="d39035a8-d132-4c9f-9508-e8c3daff8c62" containerID="4e56212c4d18f4383776ef060a221e34547bb0f7c77c2617247d0db025e44367" exitCode=0
Dec 16 08:19:28 crc kubenswrapper[4823]: I1216 08:19:28.394296 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xn64w" event={"ID":"d39035a8-d132-4c9f-9508-e8c3daff8c62","Type":"ContainerDied","Data":"4e56212c4d18f4383776ef060a221e34547bb0f7c77c2617247d0db025e44367"}
Dec 16 08:19:28 crc kubenswrapper[4823]: I1216 08:19:28.394326 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xn64w" event={"ID":"d39035a8-d132-4c9f-9508-e8c3daff8c62","Type":"ContainerDied","Data":"19ca87697ae95bb97b56547f0901fbb2323a6e4abcb4447778339e9150921648"}
Dec 16 08:19:28 crc kubenswrapper[4823]: I1216 08:19:28.394344 4823 scope.go:117] "RemoveContainer" containerID="4e56212c4d18f4383776ef060a221e34547bb0f7c77c2617247d0db025e44367"
Dec 16 08:19:28 crc kubenswrapper[4823]: I1216 08:19:28.394464 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xn64w"
Dec 16 08:19:28 crc kubenswrapper[4823]: I1216 08:19:28.412649 4823 scope.go:117] "RemoveContainer" containerID="87278d64ad439eac3ca509c508b9ff8c9ee892fdad3106e6b8e5902d582c1768"
Dec 16 08:19:28 crc kubenswrapper[4823]: I1216 08:19:28.423887 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d39035a8-d132-4c9f-9508-e8c3daff8c62-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d39035a8-d132-4c9f-9508-e8c3daff8c62" (UID: "d39035a8-d132-4c9f-9508-e8c3daff8c62"). InnerVolumeSpecName "catalog-content".
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 08:19:28 crc kubenswrapper[4823]: I1216 08:19:28.429845 4823 scope.go:117] "RemoveContainer" containerID="afd39d5a6fffb6394650587f7510442782abc93e01384f16c19b2675a5e494ea" Dec 16 08:19:28 crc kubenswrapper[4823]: I1216 08:19:28.453983 4823 scope.go:117] "RemoveContainer" containerID="4e56212c4d18f4383776ef060a221e34547bb0f7c77c2617247d0db025e44367" Dec 16 08:19:28 crc kubenswrapper[4823]: E1216 08:19:28.454916 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e56212c4d18f4383776ef060a221e34547bb0f7c77c2617247d0db025e44367\": container with ID starting with 4e56212c4d18f4383776ef060a221e34547bb0f7c77c2617247d0db025e44367 not found: ID does not exist" containerID="4e56212c4d18f4383776ef060a221e34547bb0f7c77c2617247d0db025e44367" Dec 16 08:19:28 crc kubenswrapper[4823]: I1216 08:19:28.455016 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e56212c4d18f4383776ef060a221e34547bb0f7c77c2617247d0db025e44367"} err="failed to get container status \"4e56212c4d18f4383776ef060a221e34547bb0f7c77c2617247d0db025e44367\": rpc error: code = NotFound desc = could not find container \"4e56212c4d18f4383776ef060a221e34547bb0f7c77c2617247d0db025e44367\": container with ID starting with 4e56212c4d18f4383776ef060a221e34547bb0f7c77c2617247d0db025e44367 not found: ID does not exist" Dec 16 08:19:28 crc kubenswrapper[4823]: I1216 08:19:28.455104 4823 scope.go:117] "RemoveContainer" containerID="87278d64ad439eac3ca509c508b9ff8c9ee892fdad3106e6b8e5902d582c1768" Dec 16 08:19:28 crc kubenswrapper[4823]: E1216 08:19:28.455803 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87278d64ad439eac3ca509c508b9ff8c9ee892fdad3106e6b8e5902d582c1768\": container with ID starting with 
87278d64ad439eac3ca509c508b9ff8c9ee892fdad3106e6b8e5902d582c1768 not found: ID does not exist" containerID="87278d64ad439eac3ca509c508b9ff8c9ee892fdad3106e6b8e5902d582c1768" Dec 16 08:19:28 crc kubenswrapper[4823]: I1216 08:19:28.455857 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87278d64ad439eac3ca509c508b9ff8c9ee892fdad3106e6b8e5902d582c1768"} err="failed to get container status \"87278d64ad439eac3ca509c508b9ff8c9ee892fdad3106e6b8e5902d582c1768\": rpc error: code = NotFound desc = could not find container \"87278d64ad439eac3ca509c508b9ff8c9ee892fdad3106e6b8e5902d582c1768\": container with ID starting with 87278d64ad439eac3ca509c508b9ff8c9ee892fdad3106e6b8e5902d582c1768 not found: ID does not exist" Dec 16 08:19:28 crc kubenswrapper[4823]: I1216 08:19:28.455900 4823 scope.go:117] "RemoveContainer" containerID="afd39d5a6fffb6394650587f7510442782abc93e01384f16c19b2675a5e494ea" Dec 16 08:19:28 crc kubenswrapper[4823]: E1216 08:19:28.456277 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"afd39d5a6fffb6394650587f7510442782abc93e01384f16c19b2675a5e494ea\": container with ID starting with afd39d5a6fffb6394650587f7510442782abc93e01384f16c19b2675a5e494ea not found: ID does not exist" containerID="afd39d5a6fffb6394650587f7510442782abc93e01384f16c19b2675a5e494ea" Dec 16 08:19:28 crc kubenswrapper[4823]: I1216 08:19:28.456312 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"afd39d5a6fffb6394650587f7510442782abc93e01384f16c19b2675a5e494ea"} err="failed to get container status \"afd39d5a6fffb6394650587f7510442782abc93e01384f16c19b2675a5e494ea\": rpc error: code = NotFound desc = could not find container \"afd39d5a6fffb6394650587f7510442782abc93e01384f16c19b2675a5e494ea\": container with ID starting with afd39d5a6fffb6394650587f7510442782abc93e01384f16c19b2675a5e494ea not found: ID does not 
exist" Dec 16 08:19:28 crc kubenswrapper[4823]: I1216 08:19:28.460724 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hzjfb\" (UniqueName: \"kubernetes.io/projected/d39035a8-d132-4c9f-9508-e8c3daff8c62-kube-api-access-hzjfb\") on node \"crc\" DevicePath \"\"" Dec 16 08:19:28 crc kubenswrapper[4823]: I1216 08:19:28.460788 4823 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d39035a8-d132-4c9f-9508-e8c3daff8c62-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 08:19:28 crc kubenswrapper[4823]: I1216 08:19:28.460887 4823 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d39035a8-d132-4c9f-9508-e8c3daff8c62-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 08:19:28 crc kubenswrapper[4823]: I1216 08:19:28.730976 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xn64w"] Dec 16 08:19:28 crc kubenswrapper[4823]: I1216 08:19:28.738920 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-xn64w"] Dec 16 08:19:29 crc kubenswrapper[4823]: I1216 08:19:29.772669 4823 scope.go:117] "RemoveContainer" containerID="28492ebfdcae248f34331dcd5ce9b91596654f34717e0e98b5167555edcbc9ea" Dec 16 08:19:29 crc kubenswrapper[4823]: E1216 08:19:29.773187 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 08:19:29 crc kubenswrapper[4823]: I1216 08:19:29.780829 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="d39035a8-d132-4c9f-9508-e8c3daff8c62" path="/var/lib/kubelet/pods/d39035a8-d132-4c9f-9508-e8c3daff8c62/volumes" Dec 16 08:19:44 crc kubenswrapper[4823]: I1216 08:19:44.771867 4823 scope.go:117] "RemoveContainer" containerID="28492ebfdcae248f34331dcd5ce9b91596654f34717e0e98b5167555edcbc9ea" Dec 16 08:19:44 crc kubenswrapper[4823]: E1216 08:19:44.772558 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 08:19:58 crc kubenswrapper[4823]: I1216 08:19:58.771893 4823 scope.go:117] "RemoveContainer" containerID="28492ebfdcae248f34331dcd5ce9b91596654f34717e0e98b5167555edcbc9ea" Dec 16 08:19:58 crc kubenswrapper[4823]: E1216 08:19:58.773129 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 08:20:10 crc kubenswrapper[4823]: I1216 08:20:10.771946 4823 scope.go:117] "RemoveContainer" containerID="28492ebfdcae248f34331dcd5ce9b91596654f34717e0e98b5167555edcbc9ea" Dec 16 08:20:10 crc kubenswrapper[4823]: E1216 08:20:10.772488 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 08:20:23 crc kubenswrapper[4823]: I1216 08:20:23.771849 4823 scope.go:117] "RemoveContainer" containerID="28492ebfdcae248f34331dcd5ce9b91596654f34717e0e98b5167555edcbc9ea" Dec 16 08:20:23 crc kubenswrapper[4823]: E1216 08:20:23.772574 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 08:20:36 crc kubenswrapper[4823]: I1216 08:20:36.771864 4823 scope.go:117] "RemoveContainer" containerID="28492ebfdcae248f34331dcd5ce9b91596654f34717e0e98b5167555edcbc9ea" Dec 16 08:20:36 crc kubenswrapper[4823]: E1216 08:20:36.772631 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 08:20:52 crc kubenswrapper[4823]: I1216 08:20:52.771246 4823 scope.go:117] "RemoveContainer" containerID="28492ebfdcae248f34331dcd5ce9b91596654f34717e0e98b5167555edcbc9ea" Dec 16 08:20:52 crc kubenswrapper[4823]: E1216 08:20:52.771961 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 08:21:05 crc kubenswrapper[4823]: I1216 08:21:05.771612 4823 scope.go:117] "RemoveContainer" containerID="28492ebfdcae248f34331dcd5ce9b91596654f34717e0e98b5167555edcbc9ea" Dec 16 08:21:05 crc kubenswrapper[4823]: E1216 08:21:05.772349 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 08:21:16 crc kubenswrapper[4823]: I1216 08:21:16.771593 4823 scope.go:117] "RemoveContainer" containerID="28492ebfdcae248f34331dcd5ce9b91596654f34717e0e98b5167555edcbc9ea" Dec 16 08:21:16 crc kubenswrapper[4823]: E1216 08:21:16.772239 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 08:21:31 crc kubenswrapper[4823]: I1216 08:21:31.774875 4823 scope.go:117] "RemoveContainer" containerID="28492ebfdcae248f34331dcd5ce9b91596654f34717e0e98b5167555edcbc9ea" Dec 16 08:21:31 crc kubenswrapper[4823]: E1216 08:21:31.775515 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 08:21:46 crc kubenswrapper[4823]: I1216 08:21:46.771841 4823 scope.go:117] "RemoveContainer" containerID="28492ebfdcae248f34331dcd5ce9b91596654f34717e0e98b5167555edcbc9ea" Dec 16 08:21:46 crc kubenswrapper[4823]: E1216 08:21:46.772552 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 08:21:59 crc kubenswrapper[4823]: I1216 08:21:59.773823 4823 scope.go:117] "RemoveContainer" containerID="28492ebfdcae248f34331dcd5ce9b91596654f34717e0e98b5167555edcbc9ea" Dec 16 08:21:59 crc kubenswrapper[4823]: E1216 08:21:59.774584 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 08:22:10 crc kubenswrapper[4823]: I1216 08:22:10.771945 4823 scope.go:117] "RemoveContainer" containerID="28492ebfdcae248f34331dcd5ce9b91596654f34717e0e98b5167555edcbc9ea" Dec 16 08:22:10 crc kubenswrapper[4823]: E1216 08:22:10.772794 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 08:22:21 crc kubenswrapper[4823]: I1216 08:22:21.775457 4823 scope.go:117] "RemoveContainer" containerID="28492ebfdcae248f34331dcd5ce9b91596654f34717e0e98b5167555edcbc9ea" Dec 16 08:22:21 crc kubenswrapper[4823]: E1216 08:22:21.776209 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 08:22:35 crc kubenswrapper[4823]: I1216 08:22:35.771376 4823 scope.go:117] "RemoveContainer" containerID="28492ebfdcae248f34331dcd5ce9b91596654f34717e0e98b5167555edcbc9ea" Dec 16 08:22:35 crc kubenswrapper[4823]: E1216 08:22:35.772111 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 08:22:48 crc kubenswrapper[4823]: I1216 08:22:48.771510 4823 scope.go:117] "RemoveContainer" containerID="28492ebfdcae248f34331dcd5ce9b91596654f34717e0e98b5167555edcbc9ea" Dec 16 08:22:48 crc kubenswrapper[4823]: E1216 08:22:48.772397 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 08:23:00 crc kubenswrapper[4823]: I1216 08:23:00.771685 4823 scope.go:117] "RemoveContainer" containerID="28492ebfdcae248f34331dcd5ce9b91596654f34717e0e98b5167555edcbc9ea" Dec 16 08:23:00 crc kubenswrapper[4823]: E1216 08:23:00.772488 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 08:23:14 crc kubenswrapper[4823]: I1216 08:23:14.771711 4823 scope.go:117] "RemoveContainer" containerID="28492ebfdcae248f34331dcd5ce9b91596654f34717e0e98b5167555edcbc9ea" Dec 16 08:23:14 crc kubenswrapper[4823]: E1216 08:23:14.774040 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 08:23:26 crc kubenswrapper[4823]: I1216 08:23:26.771952 4823 scope.go:117] "RemoveContainer" containerID="28492ebfdcae248f34331dcd5ce9b91596654f34717e0e98b5167555edcbc9ea" Dec 16 08:23:26 crc kubenswrapper[4823]: E1216 08:23:26.772717 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 08:23:34 crc kubenswrapper[4823]: I1216 08:23:34.583658 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-wlc8c"] Dec 16 08:23:34 crc kubenswrapper[4823]: E1216 08:23:34.587449 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d39035a8-d132-4c9f-9508-e8c3daff8c62" containerName="registry-server" Dec 16 08:23:34 crc kubenswrapper[4823]: I1216 08:23:34.587471 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="d39035a8-d132-4c9f-9508-e8c3daff8c62" containerName="registry-server" Dec 16 08:23:34 crc kubenswrapper[4823]: E1216 08:23:34.587489 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d39035a8-d132-4c9f-9508-e8c3daff8c62" containerName="extract-content" Dec 16 08:23:34 crc kubenswrapper[4823]: I1216 08:23:34.587498 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="d39035a8-d132-4c9f-9508-e8c3daff8c62" containerName="extract-content" Dec 16 08:23:34 crc kubenswrapper[4823]: E1216 08:23:34.587517 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d39035a8-d132-4c9f-9508-e8c3daff8c62" containerName="extract-utilities" Dec 16 08:23:34 crc kubenswrapper[4823]: I1216 08:23:34.587526 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="d39035a8-d132-4c9f-9508-e8c3daff8c62" containerName="extract-utilities" Dec 16 08:23:34 crc kubenswrapper[4823]: I1216 08:23:34.587710 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="d39035a8-d132-4c9f-9508-e8c3daff8c62" containerName="registry-server" Dec 16 08:23:34 crc kubenswrapper[4823]: I1216 08:23:34.591070 4823 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wlc8c" Dec 16 08:23:34 crc kubenswrapper[4823]: I1216 08:23:34.597758 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wlc8c"] Dec 16 08:23:34 crc kubenswrapper[4823]: I1216 08:23:34.764987 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gslr4\" (UniqueName: \"kubernetes.io/projected/22c55405-5bcf-42b4-a1d9-b0828c630b9f-kube-api-access-gslr4\") pod \"redhat-operators-wlc8c\" (UID: \"22c55405-5bcf-42b4-a1d9-b0828c630b9f\") " pod="openshift-marketplace/redhat-operators-wlc8c" Dec 16 08:23:34 crc kubenswrapper[4823]: I1216 08:23:34.765111 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22c55405-5bcf-42b4-a1d9-b0828c630b9f-catalog-content\") pod \"redhat-operators-wlc8c\" (UID: \"22c55405-5bcf-42b4-a1d9-b0828c630b9f\") " pod="openshift-marketplace/redhat-operators-wlc8c" Dec 16 08:23:34 crc kubenswrapper[4823]: I1216 08:23:34.765143 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22c55405-5bcf-42b4-a1d9-b0828c630b9f-utilities\") pod \"redhat-operators-wlc8c\" (UID: \"22c55405-5bcf-42b4-a1d9-b0828c630b9f\") " pod="openshift-marketplace/redhat-operators-wlc8c" Dec 16 08:23:34 crc kubenswrapper[4823]: I1216 08:23:34.866826 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22c55405-5bcf-42b4-a1d9-b0828c630b9f-utilities\") pod \"redhat-operators-wlc8c\" (UID: \"22c55405-5bcf-42b4-a1d9-b0828c630b9f\") " pod="openshift-marketplace/redhat-operators-wlc8c" Dec 16 08:23:34 crc kubenswrapper[4823]: I1216 08:23:34.866949 4823 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-gslr4\" (UniqueName: \"kubernetes.io/projected/22c55405-5bcf-42b4-a1d9-b0828c630b9f-kube-api-access-gslr4\") pod \"redhat-operators-wlc8c\" (UID: \"22c55405-5bcf-42b4-a1d9-b0828c630b9f\") " pod="openshift-marketplace/redhat-operators-wlc8c" Dec 16 08:23:34 crc kubenswrapper[4823]: I1216 08:23:34.867062 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22c55405-5bcf-42b4-a1d9-b0828c630b9f-catalog-content\") pod \"redhat-operators-wlc8c\" (UID: \"22c55405-5bcf-42b4-a1d9-b0828c630b9f\") " pod="openshift-marketplace/redhat-operators-wlc8c" Dec 16 08:23:34 crc kubenswrapper[4823]: I1216 08:23:34.867593 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22c55405-5bcf-42b4-a1d9-b0828c630b9f-utilities\") pod \"redhat-operators-wlc8c\" (UID: \"22c55405-5bcf-42b4-a1d9-b0828c630b9f\") " pod="openshift-marketplace/redhat-operators-wlc8c" Dec 16 08:23:34 crc kubenswrapper[4823]: I1216 08:23:34.867618 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22c55405-5bcf-42b4-a1d9-b0828c630b9f-catalog-content\") pod \"redhat-operators-wlc8c\" (UID: \"22c55405-5bcf-42b4-a1d9-b0828c630b9f\") " pod="openshift-marketplace/redhat-operators-wlc8c" Dec 16 08:23:34 crc kubenswrapper[4823]: I1216 08:23:34.895045 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gslr4\" (UniqueName: \"kubernetes.io/projected/22c55405-5bcf-42b4-a1d9-b0828c630b9f-kube-api-access-gslr4\") pod \"redhat-operators-wlc8c\" (UID: \"22c55405-5bcf-42b4-a1d9-b0828c630b9f\") " pod="openshift-marketplace/redhat-operators-wlc8c" Dec 16 08:23:34 crc kubenswrapper[4823]: I1216 08:23:34.912565 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wlc8c" Dec 16 08:23:35 crc kubenswrapper[4823]: I1216 08:23:35.339668 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wlc8c"] Dec 16 08:23:36 crc kubenswrapper[4823]: I1216 08:23:36.136436 4823 generic.go:334] "Generic (PLEG): container finished" podID="22c55405-5bcf-42b4-a1d9-b0828c630b9f" containerID="2e61a365077d500ce96adc2dcbd2a95f956d9d90cb8939352b76adcce5111901" exitCode=0 Dec 16 08:23:36 crc kubenswrapper[4823]: I1216 08:23:36.136542 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wlc8c" event={"ID":"22c55405-5bcf-42b4-a1d9-b0828c630b9f","Type":"ContainerDied","Data":"2e61a365077d500ce96adc2dcbd2a95f956d9d90cb8939352b76adcce5111901"} Dec 16 08:23:36 crc kubenswrapper[4823]: I1216 08:23:36.136894 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wlc8c" event={"ID":"22c55405-5bcf-42b4-a1d9-b0828c630b9f","Type":"ContainerStarted","Data":"593d00584b85ec3b27d3042cb39b1e8aeee7989f012ef799502f694798e5faca"} Dec 16 08:23:36 crc kubenswrapper[4823]: I1216 08:23:36.139356 4823 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 16 08:23:37 crc kubenswrapper[4823]: I1216 08:23:37.772354 4823 scope.go:117] "RemoveContainer" containerID="28492ebfdcae248f34331dcd5ce9b91596654f34717e0e98b5167555edcbc9ea" Dec 16 08:23:38 crc kubenswrapper[4823]: I1216 08:23:38.158968 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" event={"ID":"25dec47c-3043-486c-b371-2be103c214e3","Type":"ContainerStarted","Data":"f47c5c93d0c43e11fad6aac3bb24214b551760b1e7ba2c6d9838c87d916c84c0"} Dec 16 08:23:38 crc kubenswrapper[4823]: I1216 08:23:38.161512 4823 generic.go:334] "Generic (PLEG): container finished" podID="22c55405-5bcf-42b4-a1d9-b0828c630b9f" 
containerID="35a5c10289a24251766ad494f45574ff5a5a0665f721d33c6bce0236a1cc605a" exitCode=0 Dec 16 08:23:38 crc kubenswrapper[4823]: I1216 08:23:38.161574 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wlc8c" event={"ID":"22c55405-5bcf-42b4-a1d9-b0828c630b9f","Type":"ContainerDied","Data":"35a5c10289a24251766ad494f45574ff5a5a0665f721d33c6bce0236a1cc605a"} Dec 16 08:23:39 crc kubenswrapper[4823]: I1216 08:23:39.174065 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wlc8c" event={"ID":"22c55405-5bcf-42b4-a1d9-b0828c630b9f","Type":"ContainerStarted","Data":"1a58cdebff17aec96060bc962990ce7866a42c0cd1a5348c2331b93bc89d3501"} Dec 16 08:23:44 crc kubenswrapper[4823]: I1216 08:23:44.913482 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-wlc8c" Dec 16 08:23:44 crc kubenswrapper[4823]: I1216 08:23:44.914078 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-wlc8c" Dec 16 08:23:44 crc kubenswrapper[4823]: I1216 08:23:44.954503 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-wlc8c" Dec 16 08:23:44 crc kubenswrapper[4823]: I1216 08:23:44.974478 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-wlc8c" podStartSLOduration=8.486206916 podStartE2EDuration="10.97445742s" podCreationTimestamp="2025-12-16 08:23:34 +0000 UTC" firstStartedPulling="2025-12-16 08:23:36.13892563 +0000 UTC m=+5294.627491763" lastFinishedPulling="2025-12-16 08:23:38.627176144 +0000 UTC m=+5297.115742267" observedRunningTime="2025-12-16 08:23:39.202822008 +0000 UTC m=+5297.691388131" watchObservedRunningTime="2025-12-16 08:23:44.97445742 +0000 UTC m=+5303.463023543" Dec 16 08:23:45 crc kubenswrapper[4823]: I1216 08:23:45.277775 4823 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-wlc8c" Dec 16 08:23:45 crc kubenswrapper[4823]: I1216 08:23:45.322269 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wlc8c"] Dec 16 08:23:47 crc kubenswrapper[4823]: I1216 08:23:47.255043 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-wlc8c" podUID="22c55405-5bcf-42b4-a1d9-b0828c630b9f" containerName="registry-server" containerID="cri-o://1a58cdebff17aec96060bc962990ce7866a42c0cd1a5348c2331b93bc89d3501" gracePeriod=2 Dec 16 08:23:48 crc kubenswrapper[4823]: I1216 08:23:48.264552 4823 generic.go:334] "Generic (PLEG): container finished" podID="22c55405-5bcf-42b4-a1d9-b0828c630b9f" containerID="1a58cdebff17aec96060bc962990ce7866a42c0cd1a5348c2331b93bc89d3501" exitCode=0 Dec 16 08:23:48 crc kubenswrapper[4823]: I1216 08:23:48.264612 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wlc8c" event={"ID":"22c55405-5bcf-42b4-a1d9-b0828c630b9f","Type":"ContainerDied","Data":"1a58cdebff17aec96060bc962990ce7866a42c0cd1a5348c2331b93bc89d3501"} Dec 16 08:23:48 crc kubenswrapper[4823]: I1216 08:23:48.264791 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wlc8c" event={"ID":"22c55405-5bcf-42b4-a1d9-b0828c630b9f","Type":"ContainerDied","Data":"593d00584b85ec3b27d3042cb39b1e8aeee7989f012ef799502f694798e5faca"} Dec 16 08:23:48 crc kubenswrapper[4823]: I1216 08:23:48.264802 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="593d00584b85ec3b27d3042cb39b1e8aeee7989f012ef799502f694798e5faca" Dec 16 08:23:48 crc kubenswrapper[4823]: I1216 08:23:48.266799 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wlc8c" Dec 16 08:23:48 crc kubenswrapper[4823]: I1216 08:23:48.357494 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gslr4\" (UniqueName: \"kubernetes.io/projected/22c55405-5bcf-42b4-a1d9-b0828c630b9f-kube-api-access-gslr4\") pod \"22c55405-5bcf-42b4-a1d9-b0828c630b9f\" (UID: \"22c55405-5bcf-42b4-a1d9-b0828c630b9f\") " Dec 16 08:23:48 crc kubenswrapper[4823]: I1216 08:23:48.357641 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22c55405-5bcf-42b4-a1d9-b0828c630b9f-utilities\") pod \"22c55405-5bcf-42b4-a1d9-b0828c630b9f\" (UID: \"22c55405-5bcf-42b4-a1d9-b0828c630b9f\") " Dec 16 08:23:48 crc kubenswrapper[4823]: I1216 08:23:48.357699 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22c55405-5bcf-42b4-a1d9-b0828c630b9f-catalog-content\") pod \"22c55405-5bcf-42b4-a1d9-b0828c630b9f\" (UID: \"22c55405-5bcf-42b4-a1d9-b0828c630b9f\") " Dec 16 08:23:48 crc kubenswrapper[4823]: I1216 08:23:48.358757 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/22c55405-5bcf-42b4-a1d9-b0828c630b9f-utilities" (OuterVolumeSpecName: "utilities") pod "22c55405-5bcf-42b4-a1d9-b0828c630b9f" (UID: "22c55405-5bcf-42b4-a1d9-b0828c630b9f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 08:23:48 crc kubenswrapper[4823]: I1216 08:23:48.369613 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c55405-5bcf-42b4-a1d9-b0828c630b9f-kube-api-access-gslr4" (OuterVolumeSpecName: "kube-api-access-gslr4") pod "22c55405-5bcf-42b4-a1d9-b0828c630b9f" (UID: "22c55405-5bcf-42b4-a1d9-b0828c630b9f"). InnerVolumeSpecName "kube-api-access-gslr4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:23:48 crc kubenswrapper[4823]: I1216 08:23:48.460034 4823 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22c55405-5bcf-42b4-a1d9-b0828c630b9f-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 08:23:48 crc kubenswrapper[4823]: I1216 08:23:48.460080 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gslr4\" (UniqueName: \"kubernetes.io/projected/22c55405-5bcf-42b4-a1d9-b0828c630b9f-kube-api-access-gslr4\") on node \"crc\" DevicePath \"\"" Dec 16 08:23:48 crc kubenswrapper[4823]: I1216 08:23:48.933484 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/22c55405-5bcf-42b4-a1d9-b0828c630b9f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "22c55405-5bcf-42b4-a1d9-b0828c630b9f" (UID: "22c55405-5bcf-42b4-a1d9-b0828c630b9f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 08:23:48 crc kubenswrapper[4823]: I1216 08:23:48.967870 4823 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22c55405-5bcf-42b4-a1d9-b0828c630b9f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 08:23:49 crc kubenswrapper[4823]: I1216 08:23:49.271472 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wlc8c" Dec 16 08:23:49 crc kubenswrapper[4823]: I1216 08:23:49.304016 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wlc8c"] Dec 16 08:23:49 crc kubenswrapper[4823]: I1216 08:23:49.308882 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-wlc8c"] Dec 16 08:23:49 crc kubenswrapper[4823]: I1216 08:23:49.786682 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c55405-5bcf-42b4-a1d9-b0828c630b9f" path="/var/lib/kubelet/pods/22c55405-5bcf-42b4-a1d9-b0828c630b9f/volumes" Dec 16 08:25:58 crc kubenswrapper[4823]: I1216 08:25:58.134085 4823 patch_prober.go:28] interesting pod/machine-config-daemon-fv56f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 08:25:58 crc kubenswrapper[4823]: I1216 08:25:58.134751 4823 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 08:26:06 crc kubenswrapper[4823]: I1216 08:26:06.649299 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-xxnck"] Dec 16 08:26:06 crc kubenswrapper[4823]: E1216 08:26:06.650191 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22c55405-5bcf-42b4-a1d9-b0828c630b9f" containerName="extract-content" Dec 16 08:26:06 crc kubenswrapper[4823]: I1216 08:26:06.650210 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="22c55405-5bcf-42b4-a1d9-b0828c630b9f" containerName="extract-content" Dec 16 08:26:06 crc 
kubenswrapper[4823]: E1216 08:26:06.650229 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22c55405-5bcf-42b4-a1d9-b0828c630b9f" containerName="extract-utilities" Dec 16 08:26:06 crc kubenswrapper[4823]: I1216 08:26:06.650236 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="22c55405-5bcf-42b4-a1d9-b0828c630b9f" containerName="extract-utilities" Dec 16 08:26:06 crc kubenswrapper[4823]: E1216 08:26:06.650251 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22c55405-5bcf-42b4-a1d9-b0828c630b9f" containerName="registry-server" Dec 16 08:26:06 crc kubenswrapper[4823]: I1216 08:26:06.650258 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="22c55405-5bcf-42b4-a1d9-b0828c630b9f" containerName="registry-server" Dec 16 08:26:06 crc kubenswrapper[4823]: I1216 08:26:06.650423 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="22c55405-5bcf-42b4-a1d9-b0828c630b9f" containerName="registry-server" Dec 16 08:26:06 crc kubenswrapper[4823]: I1216 08:26:06.651614 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xxnck" Dec 16 08:26:06 crc kubenswrapper[4823]: I1216 08:26:06.672164 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xxnck"] Dec 16 08:26:06 crc kubenswrapper[4823]: I1216 08:26:06.840281 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6abf0d3-3653-48c9-8d52-cca2e2f97545-catalog-content\") pod \"redhat-marketplace-xxnck\" (UID: \"f6abf0d3-3653-48c9-8d52-cca2e2f97545\") " pod="openshift-marketplace/redhat-marketplace-xxnck" Dec 16 08:26:06 crc kubenswrapper[4823]: I1216 08:26:06.840493 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6abf0d3-3653-48c9-8d52-cca2e2f97545-utilities\") pod \"redhat-marketplace-xxnck\" (UID: \"f6abf0d3-3653-48c9-8d52-cca2e2f97545\") " pod="openshift-marketplace/redhat-marketplace-xxnck" Dec 16 08:26:06 crc kubenswrapper[4823]: I1216 08:26:06.840580 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcdw4\" (UniqueName: \"kubernetes.io/projected/f6abf0d3-3653-48c9-8d52-cca2e2f97545-kube-api-access-jcdw4\") pod \"redhat-marketplace-xxnck\" (UID: \"f6abf0d3-3653-48c9-8d52-cca2e2f97545\") " pod="openshift-marketplace/redhat-marketplace-xxnck" Dec 16 08:26:06 crc kubenswrapper[4823]: I1216 08:26:06.941388 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6abf0d3-3653-48c9-8d52-cca2e2f97545-utilities\") pod \"redhat-marketplace-xxnck\" (UID: \"f6abf0d3-3653-48c9-8d52-cca2e2f97545\") " pod="openshift-marketplace/redhat-marketplace-xxnck" Dec 16 08:26:06 crc kubenswrapper[4823]: I1216 08:26:06.941447 4823 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-jcdw4\" (UniqueName: \"kubernetes.io/projected/f6abf0d3-3653-48c9-8d52-cca2e2f97545-kube-api-access-jcdw4\") pod \"redhat-marketplace-xxnck\" (UID: \"f6abf0d3-3653-48c9-8d52-cca2e2f97545\") " pod="openshift-marketplace/redhat-marketplace-xxnck" Dec 16 08:26:06 crc kubenswrapper[4823]: I1216 08:26:06.941496 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6abf0d3-3653-48c9-8d52-cca2e2f97545-catalog-content\") pod \"redhat-marketplace-xxnck\" (UID: \"f6abf0d3-3653-48c9-8d52-cca2e2f97545\") " pod="openshift-marketplace/redhat-marketplace-xxnck" Dec 16 08:26:06 crc kubenswrapper[4823]: I1216 08:26:06.942007 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6abf0d3-3653-48c9-8d52-cca2e2f97545-utilities\") pod \"redhat-marketplace-xxnck\" (UID: \"f6abf0d3-3653-48c9-8d52-cca2e2f97545\") " pod="openshift-marketplace/redhat-marketplace-xxnck" Dec 16 08:26:06 crc kubenswrapper[4823]: I1216 08:26:06.942054 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6abf0d3-3653-48c9-8d52-cca2e2f97545-catalog-content\") pod \"redhat-marketplace-xxnck\" (UID: \"f6abf0d3-3653-48c9-8d52-cca2e2f97545\") " pod="openshift-marketplace/redhat-marketplace-xxnck" Dec 16 08:26:06 crc kubenswrapper[4823]: I1216 08:26:06.961769 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcdw4\" (UniqueName: \"kubernetes.io/projected/f6abf0d3-3653-48c9-8d52-cca2e2f97545-kube-api-access-jcdw4\") pod \"redhat-marketplace-xxnck\" (UID: \"f6abf0d3-3653-48c9-8d52-cca2e2f97545\") " pod="openshift-marketplace/redhat-marketplace-xxnck" Dec 16 08:26:06 crc kubenswrapper[4823]: I1216 08:26:06.973899 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xxnck" Dec 16 08:26:07 crc kubenswrapper[4823]: I1216 08:26:07.209488 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xxnck"] Dec 16 08:26:07 crc kubenswrapper[4823]: I1216 08:26:07.641253 4823 generic.go:334] "Generic (PLEG): container finished" podID="f6abf0d3-3653-48c9-8d52-cca2e2f97545" containerID="b9b6abf46f97a4e21c703516987dc86f5b64e1e52a56edcb280d058d21a9f7df" exitCode=0 Dec 16 08:26:07 crc kubenswrapper[4823]: I1216 08:26:07.641306 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xxnck" event={"ID":"f6abf0d3-3653-48c9-8d52-cca2e2f97545","Type":"ContainerDied","Data":"b9b6abf46f97a4e21c703516987dc86f5b64e1e52a56edcb280d058d21a9f7df"} Dec 16 08:26:07 crc kubenswrapper[4823]: I1216 08:26:07.641333 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xxnck" event={"ID":"f6abf0d3-3653-48c9-8d52-cca2e2f97545","Type":"ContainerStarted","Data":"0c4c617ac337cdda07bf6b541d57670296311cda5906ab9dc1e24c13d424fdcf"} Dec 16 08:26:09 crc kubenswrapper[4823]: I1216 08:26:09.657270 4823 generic.go:334] "Generic (PLEG): container finished" podID="f6abf0d3-3653-48c9-8d52-cca2e2f97545" containerID="8021f980c7c2d424e8241ff27999bd9046cf2f2750c5df204e674f58b73e694c" exitCode=0 Dec 16 08:26:09 crc kubenswrapper[4823]: I1216 08:26:09.657379 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xxnck" event={"ID":"f6abf0d3-3653-48c9-8d52-cca2e2f97545","Type":"ContainerDied","Data":"8021f980c7c2d424e8241ff27999bd9046cf2f2750c5df204e674f58b73e694c"} Dec 16 08:26:10 crc kubenswrapper[4823]: I1216 08:26:10.666189 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xxnck" 
event={"ID":"f6abf0d3-3653-48c9-8d52-cca2e2f97545","Type":"ContainerStarted","Data":"f19fdff1c7ce52b08eea558e86c2433b2608e75c4231b8f61c484689f09db484"} Dec 16 08:26:10 crc kubenswrapper[4823]: I1216 08:26:10.691443 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-xxnck" podStartSLOduration=2.090165437 podStartE2EDuration="4.691419662s" podCreationTimestamp="2025-12-16 08:26:06 +0000 UTC" firstStartedPulling="2025-12-16 08:26:07.642499304 +0000 UTC m=+5446.131065417" lastFinishedPulling="2025-12-16 08:26:10.243753519 +0000 UTC m=+5448.732319642" observedRunningTime="2025-12-16 08:26:10.686339262 +0000 UTC m=+5449.174905425" watchObservedRunningTime="2025-12-16 08:26:10.691419662 +0000 UTC m=+5449.179985785" Dec 16 08:26:16 crc kubenswrapper[4823]: I1216 08:26:16.974393 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-xxnck" Dec 16 08:26:16 crc kubenswrapper[4823]: I1216 08:26:16.975128 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-xxnck" Dec 16 08:26:17 crc kubenswrapper[4823]: I1216 08:26:17.025213 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-xxnck" Dec 16 08:26:17 crc kubenswrapper[4823]: I1216 08:26:17.780206 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-xxnck" Dec 16 08:26:18 crc kubenswrapper[4823]: I1216 08:26:18.223873 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xxnck"] Dec 16 08:26:19 crc kubenswrapper[4823]: I1216 08:26:19.728306 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-xxnck" podUID="f6abf0d3-3653-48c9-8d52-cca2e2f97545" containerName="registry-server" 
containerID="cri-o://f19fdff1c7ce52b08eea558e86c2433b2608e75c4231b8f61c484689f09db484" gracePeriod=2 Dec 16 08:26:20 crc kubenswrapper[4823]: I1216 08:26:20.628892 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xxnck" Dec 16 08:26:20 crc kubenswrapper[4823]: I1216 08:26:20.735998 4823 generic.go:334] "Generic (PLEG): container finished" podID="f6abf0d3-3653-48c9-8d52-cca2e2f97545" containerID="f19fdff1c7ce52b08eea558e86c2433b2608e75c4231b8f61c484689f09db484" exitCode=0 Dec 16 08:26:20 crc kubenswrapper[4823]: I1216 08:26:20.736055 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xxnck" event={"ID":"f6abf0d3-3653-48c9-8d52-cca2e2f97545","Type":"ContainerDied","Data":"f19fdff1c7ce52b08eea558e86c2433b2608e75c4231b8f61c484689f09db484"} Dec 16 08:26:20 crc kubenswrapper[4823]: I1216 08:26:20.736078 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xxnck" Dec 16 08:26:20 crc kubenswrapper[4823]: I1216 08:26:20.736096 4823 scope.go:117] "RemoveContainer" containerID="f19fdff1c7ce52b08eea558e86c2433b2608e75c4231b8f61c484689f09db484" Dec 16 08:26:20 crc kubenswrapper[4823]: I1216 08:26:20.736083 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xxnck" event={"ID":"f6abf0d3-3653-48c9-8d52-cca2e2f97545","Type":"ContainerDied","Data":"0c4c617ac337cdda07bf6b541d57670296311cda5906ab9dc1e24c13d424fdcf"} Dec 16 08:26:20 crc kubenswrapper[4823]: I1216 08:26:20.748766 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6abf0d3-3653-48c9-8d52-cca2e2f97545-catalog-content\") pod \"f6abf0d3-3653-48c9-8d52-cca2e2f97545\" (UID: \"f6abf0d3-3653-48c9-8d52-cca2e2f97545\") " Dec 16 08:26:20 crc kubenswrapper[4823]: I1216 08:26:20.748860 4823 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jcdw4\" (UniqueName: \"kubernetes.io/projected/f6abf0d3-3653-48c9-8d52-cca2e2f97545-kube-api-access-jcdw4\") pod \"f6abf0d3-3653-48c9-8d52-cca2e2f97545\" (UID: \"f6abf0d3-3653-48c9-8d52-cca2e2f97545\") " Dec 16 08:26:20 crc kubenswrapper[4823]: I1216 08:26:20.748911 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6abf0d3-3653-48c9-8d52-cca2e2f97545-utilities\") pod \"f6abf0d3-3653-48c9-8d52-cca2e2f97545\" (UID: \"f6abf0d3-3653-48c9-8d52-cca2e2f97545\") " Dec 16 08:26:20 crc kubenswrapper[4823]: I1216 08:26:20.750102 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6abf0d3-3653-48c9-8d52-cca2e2f97545-utilities" (OuterVolumeSpecName: "utilities") pod "f6abf0d3-3653-48c9-8d52-cca2e2f97545" (UID: "f6abf0d3-3653-48c9-8d52-cca2e2f97545"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 08:26:20 crc kubenswrapper[4823]: I1216 08:26:20.755247 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6abf0d3-3653-48c9-8d52-cca2e2f97545-kube-api-access-jcdw4" (OuterVolumeSpecName: "kube-api-access-jcdw4") pod "f6abf0d3-3653-48c9-8d52-cca2e2f97545" (UID: "f6abf0d3-3653-48c9-8d52-cca2e2f97545"). InnerVolumeSpecName "kube-api-access-jcdw4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:26:20 crc kubenswrapper[4823]: I1216 08:26:20.769237 4823 scope.go:117] "RemoveContainer" containerID="8021f980c7c2d424e8241ff27999bd9046cf2f2750c5df204e674f58b73e694c" Dec 16 08:26:20 crc kubenswrapper[4823]: I1216 08:26:20.778538 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6abf0d3-3653-48c9-8d52-cca2e2f97545-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f6abf0d3-3653-48c9-8d52-cca2e2f97545" (UID: "f6abf0d3-3653-48c9-8d52-cca2e2f97545"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 08:26:20 crc kubenswrapper[4823]: I1216 08:26:20.802754 4823 scope.go:117] "RemoveContainer" containerID="b9b6abf46f97a4e21c703516987dc86f5b64e1e52a56edcb280d058d21a9f7df" Dec 16 08:26:20 crc kubenswrapper[4823]: I1216 08:26:20.827820 4823 scope.go:117] "RemoveContainer" containerID="f19fdff1c7ce52b08eea558e86c2433b2608e75c4231b8f61c484689f09db484" Dec 16 08:26:20 crc kubenswrapper[4823]: E1216 08:26:20.828340 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f19fdff1c7ce52b08eea558e86c2433b2608e75c4231b8f61c484689f09db484\": container with ID starting with f19fdff1c7ce52b08eea558e86c2433b2608e75c4231b8f61c484689f09db484 not found: ID does not exist" containerID="f19fdff1c7ce52b08eea558e86c2433b2608e75c4231b8f61c484689f09db484" Dec 16 08:26:20 crc kubenswrapper[4823]: I1216 08:26:20.828437 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f19fdff1c7ce52b08eea558e86c2433b2608e75c4231b8f61c484689f09db484"} err="failed to get container status \"f19fdff1c7ce52b08eea558e86c2433b2608e75c4231b8f61c484689f09db484\": rpc error: code = NotFound desc = could not find container \"f19fdff1c7ce52b08eea558e86c2433b2608e75c4231b8f61c484689f09db484\": container with ID starting 
with f19fdff1c7ce52b08eea558e86c2433b2608e75c4231b8f61c484689f09db484 not found: ID does not exist" Dec 16 08:26:20 crc kubenswrapper[4823]: I1216 08:26:20.828480 4823 scope.go:117] "RemoveContainer" containerID="8021f980c7c2d424e8241ff27999bd9046cf2f2750c5df204e674f58b73e694c" Dec 16 08:26:20 crc kubenswrapper[4823]: E1216 08:26:20.829177 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8021f980c7c2d424e8241ff27999bd9046cf2f2750c5df204e674f58b73e694c\": container with ID starting with 8021f980c7c2d424e8241ff27999bd9046cf2f2750c5df204e674f58b73e694c not found: ID does not exist" containerID="8021f980c7c2d424e8241ff27999bd9046cf2f2750c5df204e674f58b73e694c" Dec 16 08:26:20 crc kubenswrapper[4823]: I1216 08:26:20.829287 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8021f980c7c2d424e8241ff27999bd9046cf2f2750c5df204e674f58b73e694c"} err="failed to get container status \"8021f980c7c2d424e8241ff27999bd9046cf2f2750c5df204e674f58b73e694c\": rpc error: code = NotFound desc = could not find container \"8021f980c7c2d424e8241ff27999bd9046cf2f2750c5df204e674f58b73e694c\": container with ID starting with 8021f980c7c2d424e8241ff27999bd9046cf2f2750c5df204e674f58b73e694c not found: ID does not exist" Dec 16 08:26:20 crc kubenswrapper[4823]: I1216 08:26:20.829318 4823 scope.go:117] "RemoveContainer" containerID="b9b6abf46f97a4e21c703516987dc86f5b64e1e52a56edcb280d058d21a9f7df" Dec 16 08:26:20 crc kubenswrapper[4823]: E1216 08:26:20.829617 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9b6abf46f97a4e21c703516987dc86f5b64e1e52a56edcb280d058d21a9f7df\": container with ID starting with b9b6abf46f97a4e21c703516987dc86f5b64e1e52a56edcb280d058d21a9f7df not found: ID does not exist" containerID="b9b6abf46f97a4e21c703516987dc86f5b64e1e52a56edcb280d058d21a9f7df" Dec 16 08:26:20 
crc kubenswrapper[4823]: I1216 08:26:20.829650 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9b6abf46f97a4e21c703516987dc86f5b64e1e52a56edcb280d058d21a9f7df"} err="failed to get container status \"b9b6abf46f97a4e21c703516987dc86f5b64e1e52a56edcb280d058d21a9f7df\": rpc error: code = NotFound desc = could not find container \"b9b6abf46f97a4e21c703516987dc86f5b64e1e52a56edcb280d058d21a9f7df\": container with ID starting with b9b6abf46f97a4e21c703516987dc86f5b64e1e52a56edcb280d058d21a9f7df not found: ID does not exist" Dec 16 08:26:20 crc kubenswrapper[4823]: I1216 08:26:20.850738 4823 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6abf0d3-3653-48c9-8d52-cca2e2f97545-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 08:26:20 crc kubenswrapper[4823]: I1216 08:26:20.850779 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jcdw4\" (UniqueName: \"kubernetes.io/projected/f6abf0d3-3653-48c9-8d52-cca2e2f97545-kube-api-access-jcdw4\") on node \"crc\" DevicePath \"\"" Dec 16 08:26:20 crc kubenswrapper[4823]: I1216 08:26:20.850793 4823 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6abf0d3-3653-48c9-8d52-cca2e2f97545-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 08:26:21 crc kubenswrapper[4823]: I1216 08:26:21.070046 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xxnck"] Dec 16 08:26:21 crc kubenswrapper[4823]: I1216 08:26:21.075095 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-xxnck"] Dec 16 08:26:21 crc kubenswrapper[4823]: I1216 08:26:21.781364 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6abf0d3-3653-48c9-8d52-cca2e2f97545" path="/var/lib/kubelet/pods/f6abf0d3-3653-48c9-8d52-cca2e2f97545/volumes" Dec 16 08:26:28 
crc kubenswrapper[4823]: I1216 08:26:28.134116 4823 patch_prober.go:28] interesting pod/machine-config-daemon-fv56f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 08:26:28 crc kubenswrapper[4823]: I1216 08:26:28.134668 4823 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 08:26:58 crc kubenswrapper[4823]: I1216 08:26:58.134880 4823 patch_prober.go:28] interesting pod/machine-config-daemon-fv56f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 08:26:58 crc kubenswrapper[4823]: I1216 08:26:58.136465 4823 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 08:26:58 crc kubenswrapper[4823]: I1216 08:26:58.136645 4823 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" Dec 16 08:26:58 crc kubenswrapper[4823]: I1216 08:26:58.138371 4823 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f47c5c93d0c43e11fad6aac3bb24214b551760b1e7ba2c6d9838c87d916c84c0"} 
pod="openshift-machine-config-operator/machine-config-daemon-fv56f" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 16 08:26:58 crc kubenswrapper[4823]: I1216 08:26:58.138477 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" containerName="machine-config-daemon" containerID="cri-o://f47c5c93d0c43e11fad6aac3bb24214b551760b1e7ba2c6d9838c87d916c84c0" gracePeriod=600 Dec 16 08:26:58 crc kubenswrapper[4823]: I1216 08:26:58.999446 4823 generic.go:334] "Generic (PLEG): container finished" podID="25dec47c-3043-486c-b371-2be103c214e3" containerID="f47c5c93d0c43e11fad6aac3bb24214b551760b1e7ba2c6d9838c87d916c84c0" exitCode=0 Dec 16 08:26:58 crc kubenswrapper[4823]: I1216 08:26:58.999497 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" event={"ID":"25dec47c-3043-486c-b371-2be103c214e3","Type":"ContainerDied","Data":"f47c5c93d0c43e11fad6aac3bb24214b551760b1e7ba2c6d9838c87d916c84c0"} Dec 16 08:26:59 crc kubenswrapper[4823]: I1216 08:26:58.999811 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" event={"ID":"25dec47c-3043-486c-b371-2be103c214e3","Type":"ContainerStarted","Data":"1b09ffdd594a8d5cacbf74adcd72b8a090c1db4bd077e6e653ce5a0feae3c64f"} Dec 16 08:26:59 crc kubenswrapper[4823]: I1216 08:26:58.999839 4823 scope.go:117] "RemoveContainer" containerID="28492ebfdcae248f34331dcd5ce9b91596654f34717e0e98b5167555edcbc9ea" Dec 16 08:28:58 crc kubenswrapper[4823]: I1216 08:28:58.134688 4823 patch_prober.go:28] interesting pod/machine-config-daemon-fv56f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Dec 16 08:28:58 crc kubenswrapper[4823]: I1216 08:28:58.135604 4823 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 08:29:28 crc kubenswrapper[4823]: I1216 08:29:28.155848 4823 patch_prober.go:28] interesting pod/machine-config-daemon-fv56f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 08:29:28 crc kubenswrapper[4823]: I1216 08:29:28.157462 4823 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 08:29:41 crc kubenswrapper[4823]: I1216 08:29:41.792391 4823 scope.go:117] "RemoveContainer" containerID="1a58cdebff17aec96060bc962990ce7866a42c0cd1a5348c2331b93bc89d3501" Dec 16 08:29:41 crc kubenswrapper[4823]: I1216 08:29:41.822264 4823 scope.go:117] "RemoveContainer" containerID="35a5c10289a24251766ad494f45574ff5a5a0665f721d33c6bce0236a1cc605a" Dec 16 08:29:41 crc kubenswrapper[4823]: I1216 08:29:41.878140 4823 scope.go:117] "RemoveContainer" containerID="2e61a365077d500ce96adc2dcbd2a95f956d9d90cb8939352b76adcce5111901" Dec 16 08:29:58 crc kubenswrapper[4823]: I1216 08:29:58.134561 4823 patch_prober.go:28] interesting pod/machine-config-daemon-fv56f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 08:29:58 crc kubenswrapper[4823]: I1216 08:29:58.135570 4823 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 08:29:58 crc kubenswrapper[4823]: I1216 08:29:58.135658 4823 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" Dec 16 08:29:58 crc kubenswrapper[4823]: I1216 08:29:58.136780 4823 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1b09ffdd594a8d5cacbf74adcd72b8a090c1db4bd077e6e653ce5a0feae3c64f"} pod="openshift-machine-config-operator/machine-config-daemon-fv56f" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 16 08:29:58 crc kubenswrapper[4823]: I1216 08:29:58.136859 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" containerName="machine-config-daemon" containerID="cri-o://1b09ffdd594a8d5cacbf74adcd72b8a090c1db4bd077e6e653ce5a0feae3c64f" gracePeriod=600 Dec 16 08:29:58 crc kubenswrapper[4823]: I1216 08:29:58.368702 4823 generic.go:334] "Generic (PLEG): container finished" podID="25dec47c-3043-486c-b371-2be103c214e3" containerID="1b09ffdd594a8d5cacbf74adcd72b8a090c1db4bd077e6e653ce5a0feae3c64f" exitCode=0 Dec 16 08:29:58 crc kubenswrapper[4823]: I1216 08:29:58.368741 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" 
event={"ID":"25dec47c-3043-486c-b371-2be103c214e3","Type":"ContainerDied","Data":"1b09ffdd594a8d5cacbf74adcd72b8a090c1db4bd077e6e653ce5a0feae3c64f"} Dec 16 08:29:58 crc kubenswrapper[4823]: I1216 08:29:58.368800 4823 scope.go:117] "RemoveContainer" containerID="f47c5c93d0c43e11fad6aac3bb24214b551760b1e7ba2c6d9838c87d916c84c0" Dec 16 08:29:58 crc kubenswrapper[4823]: E1216 08:29:58.798927 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 08:29:59 crc kubenswrapper[4823]: I1216 08:29:59.392471 4823 scope.go:117] "RemoveContainer" containerID="1b09ffdd594a8d5cacbf74adcd72b8a090c1db4bd077e6e653ce5a0feae3c64f" Dec 16 08:29:59 crc kubenswrapper[4823]: E1216 08:29:59.392703 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 08:30:00 crc kubenswrapper[4823]: I1216 08:30:00.152806 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431230-txxf6"] Dec 16 08:30:00 crc kubenswrapper[4823]: E1216 08:30:00.153167 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6abf0d3-3653-48c9-8d52-cca2e2f97545" containerName="registry-server" Dec 16 08:30:00 crc kubenswrapper[4823]: I1216 08:30:00.153187 4823 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="f6abf0d3-3653-48c9-8d52-cca2e2f97545" containerName="registry-server" Dec 16 08:30:00 crc kubenswrapper[4823]: E1216 08:30:00.153210 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6abf0d3-3653-48c9-8d52-cca2e2f97545" containerName="extract-content" Dec 16 08:30:00 crc kubenswrapper[4823]: I1216 08:30:00.153218 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6abf0d3-3653-48c9-8d52-cca2e2f97545" containerName="extract-content" Dec 16 08:30:00 crc kubenswrapper[4823]: E1216 08:30:00.153235 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6abf0d3-3653-48c9-8d52-cca2e2f97545" containerName="extract-utilities" Dec 16 08:30:00 crc kubenswrapper[4823]: I1216 08:30:00.153245 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6abf0d3-3653-48c9-8d52-cca2e2f97545" containerName="extract-utilities" Dec 16 08:30:00 crc kubenswrapper[4823]: I1216 08:30:00.153408 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6abf0d3-3653-48c9-8d52-cca2e2f97545" containerName="registry-server" Dec 16 08:30:00 crc kubenswrapper[4823]: I1216 08:30:00.154073 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431230-txxf6" Dec 16 08:30:00 crc kubenswrapper[4823]: I1216 08:30:00.158243 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 16 08:30:00 crc kubenswrapper[4823]: I1216 08:30:00.158350 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 16 08:30:00 crc kubenswrapper[4823]: I1216 08:30:00.161394 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431230-txxf6"] Dec 16 08:30:00 crc kubenswrapper[4823]: I1216 08:30:00.233468 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/49999519-6bb6-47bf-aa92-985e7d038b0c-secret-volume\") pod \"collect-profiles-29431230-txxf6\" (UID: \"49999519-6bb6-47bf-aa92-985e7d038b0c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431230-txxf6" Dec 16 08:30:00 crc kubenswrapper[4823]: I1216 08:30:00.233822 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44rwg\" (UniqueName: \"kubernetes.io/projected/49999519-6bb6-47bf-aa92-985e7d038b0c-kube-api-access-44rwg\") pod \"collect-profiles-29431230-txxf6\" (UID: \"49999519-6bb6-47bf-aa92-985e7d038b0c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431230-txxf6" Dec 16 08:30:00 crc kubenswrapper[4823]: I1216 08:30:00.233913 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/49999519-6bb6-47bf-aa92-985e7d038b0c-config-volume\") pod \"collect-profiles-29431230-txxf6\" (UID: \"49999519-6bb6-47bf-aa92-985e7d038b0c\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29431230-txxf6" Dec 16 08:30:00 crc kubenswrapper[4823]: I1216 08:30:00.334905 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44rwg\" (UniqueName: \"kubernetes.io/projected/49999519-6bb6-47bf-aa92-985e7d038b0c-kube-api-access-44rwg\") pod \"collect-profiles-29431230-txxf6\" (UID: \"49999519-6bb6-47bf-aa92-985e7d038b0c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431230-txxf6" Dec 16 08:30:00 crc kubenswrapper[4823]: I1216 08:30:00.334986 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/49999519-6bb6-47bf-aa92-985e7d038b0c-config-volume\") pod \"collect-profiles-29431230-txxf6\" (UID: \"49999519-6bb6-47bf-aa92-985e7d038b0c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431230-txxf6" Dec 16 08:30:00 crc kubenswrapper[4823]: I1216 08:30:00.335053 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/49999519-6bb6-47bf-aa92-985e7d038b0c-secret-volume\") pod \"collect-profiles-29431230-txxf6\" (UID: \"49999519-6bb6-47bf-aa92-985e7d038b0c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431230-txxf6" Dec 16 08:30:00 crc kubenswrapper[4823]: I1216 08:30:00.336705 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/49999519-6bb6-47bf-aa92-985e7d038b0c-config-volume\") pod \"collect-profiles-29431230-txxf6\" (UID: \"49999519-6bb6-47bf-aa92-985e7d038b0c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431230-txxf6" Dec 16 08:30:00 crc kubenswrapper[4823]: I1216 08:30:00.345117 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/49999519-6bb6-47bf-aa92-985e7d038b0c-secret-volume\") pod \"collect-profiles-29431230-txxf6\" (UID: \"49999519-6bb6-47bf-aa92-985e7d038b0c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431230-txxf6" Dec 16 08:30:00 crc kubenswrapper[4823]: I1216 08:30:00.364946 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44rwg\" (UniqueName: \"kubernetes.io/projected/49999519-6bb6-47bf-aa92-985e7d038b0c-kube-api-access-44rwg\") pod \"collect-profiles-29431230-txxf6\" (UID: \"49999519-6bb6-47bf-aa92-985e7d038b0c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431230-txxf6" Dec 16 08:30:00 crc kubenswrapper[4823]: I1216 08:30:00.507849 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431230-txxf6" Dec 16 08:30:00 crc kubenswrapper[4823]: I1216 08:30:00.763000 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431230-txxf6"] Dec 16 08:30:01 crc kubenswrapper[4823]: I1216 08:30:01.409937 4823 generic.go:334] "Generic (PLEG): container finished" podID="49999519-6bb6-47bf-aa92-985e7d038b0c" containerID="883456fb5995de0292ae9252c5a92710d3629466700f6e999e7eecdce8b31a78" exitCode=0 Dec 16 08:30:01 crc kubenswrapper[4823]: I1216 08:30:01.409988 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431230-txxf6" event={"ID":"49999519-6bb6-47bf-aa92-985e7d038b0c","Type":"ContainerDied","Data":"883456fb5995de0292ae9252c5a92710d3629466700f6e999e7eecdce8b31a78"} Dec 16 08:30:01 crc kubenswrapper[4823]: I1216 08:30:01.410243 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431230-txxf6" 
event={"ID":"49999519-6bb6-47bf-aa92-985e7d038b0c","Type":"ContainerStarted","Data":"a0638a1cc3790f5c2cf68c2c16059937e39ca5ca4bf2de6c68d7047edc8bb93f"} Dec 16 08:30:02 crc kubenswrapper[4823]: I1216 08:30:02.757541 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431230-txxf6" Dec 16 08:30:02 crc kubenswrapper[4823]: I1216 08:30:02.869625 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/49999519-6bb6-47bf-aa92-985e7d038b0c-config-volume\") pod \"49999519-6bb6-47bf-aa92-985e7d038b0c\" (UID: \"49999519-6bb6-47bf-aa92-985e7d038b0c\") " Dec 16 08:30:02 crc kubenswrapper[4823]: I1216 08:30:02.869756 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/49999519-6bb6-47bf-aa92-985e7d038b0c-secret-volume\") pod \"49999519-6bb6-47bf-aa92-985e7d038b0c\" (UID: \"49999519-6bb6-47bf-aa92-985e7d038b0c\") " Dec 16 08:30:02 crc kubenswrapper[4823]: I1216 08:30:02.869807 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-44rwg\" (UniqueName: \"kubernetes.io/projected/49999519-6bb6-47bf-aa92-985e7d038b0c-kube-api-access-44rwg\") pod \"49999519-6bb6-47bf-aa92-985e7d038b0c\" (UID: \"49999519-6bb6-47bf-aa92-985e7d038b0c\") " Dec 16 08:30:02 crc kubenswrapper[4823]: I1216 08:30:02.870624 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49999519-6bb6-47bf-aa92-985e7d038b0c-config-volume" (OuterVolumeSpecName: "config-volume") pod "49999519-6bb6-47bf-aa92-985e7d038b0c" (UID: "49999519-6bb6-47bf-aa92-985e7d038b0c"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:30:02 crc kubenswrapper[4823]: I1216 08:30:02.875218 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49999519-6bb6-47bf-aa92-985e7d038b0c-kube-api-access-44rwg" (OuterVolumeSpecName: "kube-api-access-44rwg") pod "49999519-6bb6-47bf-aa92-985e7d038b0c" (UID: "49999519-6bb6-47bf-aa92-985e7d038b0c"). InnerVolumeSpecName "kube-api-access-44rwg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:30:02 crc kubenswrapper[4823]: I1216 08:30:02.875760 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49999519-6bb6-47bf-aa92-985e7d038b0c-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "49999519-6bb6-47bf-aa92-985e7d038b0c" (UID: "49999519-6bb6-47bf-aa92-985e7d038b0c"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:30:02 crc kubenswrapper[4823]: I1216 08:30:02.971534 4823 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/49999519-6bb6-47bf-aa92-985e7d038b0c-config-volume\") on node \"crc\" DevicePath \"\"" Dec 16 08:30:02 crc kubenswrapper[4823]: I1216 08:30:02.971590 4823 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/49999519-6bb6-47bf-aa92-985e7d038b0c-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 16 08:30:02 crc kubenswrapper[4823]: I1216 08:30:02.971602 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-44rwg\" (UniqueName: \"kubernetes.io/projected/49999519-6bb6-47bf-aa92-985e7d038b0c-kube-api-access-44rwg\") on node \"crc\" DevicePath \"\"" Dec 16 08:30:03 crc kubenswrapper[4823]: I1216 08:30:03.425188 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431230-txxf6" 
event={"ID":"49999519-6bb6-47bf-aa92-985e7d038b0c","Type":"ContainerDied","Data":"a0638a1cc3790f5c2cf68c2c16059937e39ca5ca4bf2de6c68d7047edc8bb93f"} Dec 16 08:30:03 crc kubenswrapper[4823]: I1216 08:30:03.425239 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a0638a1cc3790f5c2cf68c2c16059937e39ca5ca4bf2de6c68d7047edc8bb93f" Dec 16 08:30:03 crc kubenswrapper[4823]: I1216 08:30:03.425246 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431230-txxf6" Dec 16 08:30:03 crc kubenswrapper[4823]: I1216 08:30:03.836074 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431185-zw7gv"] Dec 16 08:30:03 crc kubenswrapper[4823]: I1216 08:30:03.840930 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431185-zw7gv"] Dec 16 08:30:05 crc kubenswrapper[4823]: I1216 08:30:05.474253 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-kbzcw"] Dec 16 08:30:05 crc kubenswrapper[4823]: E1216 08:30:05.474570 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49999519-6bb6-47bf-aa92-985e7d038b0c" containerName="collect-profiles" Dec 16 08:30:05 crc kubenswrapper[4823]: I1216 08:30:05.474586 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="49999519-6bb6-47bf-aa92-985e7d038b0c" containerName="collect-profiles" Dec 16 08:30:05 crc kubenswrapper[4823]: I1216 08:30:05.474776 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="49999519-6bb6-47bf-aa92-985e7d038b0c" containerName="collect-profiles" Dec 16 08:30:05 crc kubenswrapper[4823]: I1216 08:30:05.475728 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kbzcw" Dec 16 08:30:05 crc kubenswrapper[4823]: I1216 08:30:05.487257 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kbzcw"] Dec 16 08:30:05 crc kubenswrapper[4823]: I1216 08:30:05.506400 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/279510a6-9725-42c3-80fe-72ab8cae39ab-catalog-content\") pod \"community-operators-kbzcw\" (UID: \"279510a6-9725-42c3-80fe-72ab8cae39ab\") " pod="openshift-marketplace/community-operators-kbzcw" Dec 16 08:30:05 crc kubenswrapper[4823]: I1216 08:30:05.506465 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5676z\" (UniqueName: \"kubernetes.io/projected/279510a6-9725-42c3-80fe-72ab8cae39ab-kube-api-access-5676z\") pod \"community-operators-kbzcw\" (UID: \"279510a6-9725-42c3-80fe-72ab8cae39ab\") " pod="openshift-marketplace/community-operators-kbzcw" Dec 16 08:30:05 crc kubenswrapper[4823]: I1216 08:30:05.506548 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/279510a6-9725-42c3-80fe-72ab8cae39ab-utilities\") pod \"community-operators-kbzcw\" (UID: \"279510a6-9725-42c3-80fe-72ab8cae39ab\") " pod="openshift-marketplace/community-operators-kbzcw" Dec 16 08:30:05 crc kubenswrapper[4823]: I1216 08:30:05.608235 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/279510a6-9725-42c3-80fe-72ab8cae39ab-utilities\") pod \"community-operators-kbzcw\" (UID: \"279510a6-9725-42c3-80fe-72ab8cae39ab\") " pod="openshift-marketplace/community-operators-kbzcw" Dec 16 08:30:05 crc kubenswrapper[4823]: I1216 08:30:05.608348 4823 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/279510a6-9725-42c3-80fe-72ab8cae39ab-catalog-content\") pod \"community-operators-kbzcw\" (UID: \"279510a6-9725-42c3-80fe-72ab8cae39ab\") " pod="openshift-marketplace/community-operators-kbzcw" Dec 16 08:30:05 crc kubenswrapper[4823]: I1216 08:30:05.608381 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5676z\" (UniqueName: \"kubernetes.io/projected/279510a6-9725-42c3-80fe-72ab8cae39ab-kube-api-access-5676z\") pod \"community-operators-kbzcw\" (UID: \"279510a6-9725-42c3-80fe-72ab8cae39ab\") " pod="openshift-marketplace/community-operators-kbzcw" Dec 16 08:30:05 crc kubenswrapper[4823]: I1216 08:30:05.609190 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/279510a6-9725-42c3-80fe-72ab8cae39ab-utilities\") pod \"community-operators-kbzcw\" (UID: \"279510a6-9725-42c3-80fe-72ab8cae39ab\") " pod="openshift-marketplace/community-operators-kbzcw" Dec 16 08:30:05 crc kubenswrapper[4823]: I1216 08:30:05.609426 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/279510a6-9725-42c3-80fe-72ab8cae39ab-catalog-content\") pod \"community-operators-kbzcw\" (UID: \"279510a6-9725-42c3-80fe-72ab8cae39ab\") " pod="openshift-marketplace/community-operators-kbzcw" Dec 16 08:30:05 crc kubenswrapper[4823]: I1216 08:30:05.645194 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5676z\" (UniqueName: \"kubernetes.io/projected/279510a6-9725-42c3-80fe-72ab8cae39ab-kube-api-access-5676z\") pod \"community-operators-kbzcw\" (UID: \"279510a6-9725-42c3-80fe-72ab8cae39ab\") " pod="openshift-marketplace/community-operators-kbzcw" Dec 16 08:30:05 crc kubenswrapper[4823]: I1216 08:30:05.780519 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="bd17a84c-a98c-4f2b-ae73-32d888461931" path="/var/lib/kubelet/pods/bd17a84c-a98c-4f2b-ae73-32d888461931/volumes" Dec 16 08:30:05 crc kubenswrapper[4823]: I1216 08:30:05.794865 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kbzcw" Dec 16 08:30:06 crc kubenswrapper[4823]: I1216 08:30:06.054491 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kbzcw"] Dec 16 08:30:06 crc kubenswrapper[4823]: I1216 08:30:06.443665 4823 generic.go:334] "Generic (PLEG): container finished" podID="279510a6-9725-42c3-80fe-72ab8cae39ab" containerID="b46d0cf94319feb560f160f1d0ce111634f8dfb9546cdf9398a6d3fc068e65cc" exitCode=0 Dec 16 08:30:06 crc kubenswrapper[4823]: I1216 08:30:06.443779 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kbzcw" event={"ID":"279510a6-9725-42c3-80fe-72ab8cae39ab","Type":"ContainerDied","Data":"b46d0cf94319feb560f160f1d0ce111634f8dfb9546cdf9398a6d3fc068e65cc"} Dec 16 08:30:06 crc kubenswrapper[4823]: I1216 08:30:06.444473 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kbzcw" event={"ID":"279510a6-9725-42c3-80fe-72ab8cae39ab","Type":"ContainerStarted","Data":"9560127ab69c81426f1fd501f0b02169c3b54f08dc17422a9274faec87b884eb"} Dec 16 08:30:06 crc kubenswrapper[4823]: I1216 08:30:06.445668 4823 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 16 08:30:08 crc kubenswrapper[4823]: I1216 08:30:08.461873 4823 generic.go:334] "Generic (PLEG): container finished" podID="279510a6-9725-42c3-80fe-72ab8cae39ab" containerID="7302828ef1dba928278d44a2b3709a626ab96ef30e8014c977e81ffa6af25011" exitCode=0 Dec 16 08:30:08 crc kubenswrapper[4823]: I1216 08:30:08.462012 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kbzcw" 
event={"ID":"279510a6-9725-42c3-80fe-72ab8cae39ab","Type":"ContainerDied","Data":"7302828ef1dba928278d44a2b3709a626ab96ef30e8014c977e81ffa6af25011"} Dec 16 08:30:09 crc kubenswrapper[4823]: I1216 08:30:09.272172 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-lmzfz"] Dec 16 08:30:09 crc kubenswrapper[4823]: I1216 08:30:09.274136 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lmzfz" Dec 16 08:30:09 crc kubenswrapper[4823]: I1216 08:30:09.288901 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lmzfz"] Dec 16 08:30:09 crc kubenswrapper[4823]: I1216 08:30:09.395552 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wm6m5\" (UniqueName: \"kubernetes.io/projected/f6cec7b3-d6e9-424b-a325-a20a862db466-kube-api-access-wm6m5\") pod \"certified-operators-lmzfz\" (UID: \"f6cec7b3-d6e9-424b-a325-a20a862db466\") " pod="openshift-marketplace/certified-operators-lmzfz" Dec 16 08:30:09 crc kubenswrapper[4823]: I1216 08:30:09.395621 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6cec7b3-d6e9-424b-a325-a20a862db466-utilities\") pod \"certified-operators-lmzfz\" (UID: \"f6cec7b3-d6e9-424b-a325-a20a862db466\") " pod="openshift-marketplace/certified-operators-lmzfz" Dec 16 08:30:09 crc kubenswrapper[4823]: I1216 08:30:09.395668 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6cec7b3-d6e9-424b-a325-a20a862db466-catalog-content\") pod \"certified-operators-lmzfz\" (UID: \"f6cec7b3-d6e9-424b-a325-a20a862db466\") " pod="openshift-marketplace/certified-operators-lmzfz" Dec 16 08:30:09 crc kubenswrapper[4823]: I1216 08:30:09.496592 
4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wm6m5\" (UniqueName: \"kubernetes.io/projected/f6cec7b3-d6e9-424b-a325-a20a862db466-kube-api-access-wm6m5\") pod \"certified-operators-lmzfz\" (UID: \"f6cec7b3-d6e9-424b-a325-a20a862db466\") " pod="openshift-marketplace/certified-operators-lmzfz" Dec 16 08:30:09 crc kubenswrapper[4823]: I1216 08:30:09.496663 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6cec7b3-d6e9-424b-a325-a20a862db466-utilities\") pod \"certified-operators-lmzfz\" (UID: \"f6cec7b3-d6e9-424b-a325-a20a862db466\") " pod="openshift-marketplace/certified-operators-lmzfz" Dec 16 08:30:09 crc kubenswrapper[4823]: I1216 08:30:09.496711 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6cec7b3-d6e9-424b-a325-a20a862db466-catalog-content\") pod \"certified-operators-lmzfz\" (UID: \"f6cec7b3-d6e9-424b-a325-a20a862db466\") " pod="openshift-marketplace/certified-operators-lmzfz" Dec 16 08:30:09 crc kubenswrapper[4823]: I1216 08:30:09.497267 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6cec7b3-d6e9-424b-a325-a20a862db466-catalog-content\") pod \"certified-operators-lmzfz\" (UID: \"f6cec7b3-d6e9-424b-a325-a20a862db466\") " pod="openshift-marketplace/certified-operators-lmzfz" Dec 16 08:30:09 crc kubenswrapper[4823]: I1216 08:30:09.497389 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6cec7b3-d6e9-424b-a325-a20a862db466-utilities\") pod \"certified-operators-lmzfz\" (UID: \"f6cec7b3-d6e9-424b-a325-a20a862db466\") " pod="openshift-marketplace/certified-operators-lmzfz" Dec 16 08:30:09 crc kubenswrapper[4823]: I1216 08:30:09.519576 4823 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-wm6m5\" (UniqueName: \"kubernetes.io/projected/f6cec7b3-d6e9-424b-a325-a20a862db466-kube-api-access-wm6m5\") pod \"certified-operators-lmzfz\" (UID: \"f6cec7b3-d6e9-424b-a325-a20a862db466\") " pod="openshift-marketplace/certified-operators-lmzfz" Dec 16 08:30:09 crc kubenswrapper[4823]: I1216 08:30:09.606809 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lmzfz" Dec 16 08:30:10 crc kubenswrapper[4823]: I1216 08:30:10.080170 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lmzfz"] Dec 16 08:30:10 crc kubenswrapper[4823]: W1216 08:30:10.088305 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf6cec7b3_d6e9_424b_a325_a20a862db466.slice/crio-145f818e5a76eefdea8e4ba0c112c024f50798abee8cba732637e953a7bc7b2d WatchSource:0}: Error finding container 145f818e5a76eefdea8e4ba0c112c024f50798abee8cba732637e953a7bc7b2d: Status 404 returned error can't find the container with id 145f818e5a76eefdea8e4ba0c112c024f50798abee8cba732637e953a7bc7b2d Dec 16 08:30:10 crc kubenswrapper[4823]: I1216 08:30:10.476983 4823 generic.go:334] "Generic (PLEG): container finished" podID="f6cec7b3-d6e9-424b-a325-a20a862db466" containerID="689f6a47c2e732513ae22ac5d69148c7ef2870cf9da2b9c8023d2b1e2862d101" exitCode=0 Dec 16 08:30:10 crc kubenswrapper[4823]: I1216 08:30:10.477126 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lmzfz" event={"ID":"f6cec7b3-d6e9-424b-a325-a20a862db466","Type":"ContainerDied","Data":"689f6a47c2e732513ae22ac5d69148c7ef2870cf9da2b9c8023d2b1e2862d101"} Dec 16 08:30:10 crc kubenswrapper[4823]: I1216 08:30:10.477162 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lmzfz" 
event={"ID":"f6cec7b3-d6e9-424b-a325-a20a862db466","Type":"ContainerStarted","Data":"145f818e5a76eefdea8e4ba0c112c024f50798abee8cba732637e953a7bc7b2d"} Dec 16 08:30:10 crc kubenswrapper[4823]: I1216 08:30:10.480848 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kbzcw" event={"ID":"279510a6-9725-42c3-80fe-72ab8cae39ab","Type":"ContainerStarted","Data":"ca15920c84455045be46e8819678256fe4e89042860eaa585755bfc9a1896bbd"} Dec 16 08:30:10 crc kubenswrapper[4823]: I1216 08:30:10.534555 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-kbzcw" podStartSLOduration=1.925535566 podStartE2EDuration="5.534515133s" podCreationTimestamp="2025-12-16 08:30:05 +0000 UTC" firstStartedPulling="2025-12-16 08:30:06.445409429 +0000 UTC m=+5684.933975552" lastFinishedPulling="2025-12-16 08:30:10.054388996 +0000 UTC m=+5688.542955119" observedRunningTime="2025-12-16 08:30:10.528295107 +0000 UTC m=+5689.016861240" watchObservedRunningTime="2025-12-16 08:30:10.534515133 +0000 UTC m=+5689.023081246" Dec 16 08:30:11 crc kubenswrapper[4823]: I1216 08:30:11.775999 4823 scope.go:117] "RemoveContainer" containerID="1b09ffdd594a8d5cacbf74adcd72b8a090c1db4bd077e6e653ce5a0feae3c64f" Dec 16 08:30:11 crc kubenswrapper[4823]: E1216 08:30:11.776648 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 08:30:13 crc kubenswrapper[4823]: I1216 08:30:13.499994 4823 generic.go:334] "Generic (PLEG): container finished" podID="f6cec7b3-d6e9-424b-a325-a20a862db466" 
containerID="940e752040a2967d8e043891c54a565c36d6278166f84311160d699b70e98938" exitCode=0 Dec 16 08:30:13 crc kubenswrapper[4823]: I1216 08:30:13.500293 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lmzfz" event={"ID":"f6cec7b3-d6e9-424b-a325-a20a862db466","Type":"ContainerDied","Data":"940e752040a2967d8e043891c54a565c36d6278166f84311160d699b70e98938"} Dec 16 08:30:15 crc kubenswrapper[4823]: I1216 08:30:15.795358 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-kbzcw" Dec 16 08:30:15 crc kubenswrapper[4823]: I1216 08:30:15.796578 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-kbzcw" Dec 16 08:30:15 crc kubenswrapper[4823]: I1216 08:30:15.837692 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-kbzcw" Dec 16 08:30:16 crc kubenswrapper[4823]: I1216 08:30:16.525116 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lmzfz" event={"ID":"f6cec7b3-d6e9-424b-a325-a20a862db466","Type":"ContainerStarted","Data":"5a1352eca2bc328dd6d9d4c779b6f6edea45ac4e5e0ca6257d587482cfcba5b5"} Dec 16 08:30:16 crc kubenswrapper[4823]: I1216 08:30:16.546569 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-lmzfz" podStartSLOduration=2.232314563 podStartE2EDuration="7.546551662s" podCreationTimestamp="2025-12-16 08:30:09 +0000 UTC" firstStartedPulling="2025-12-16 08:30:10.478714464 +0000 UTC m=+5688.967280587" lastFinishedPulling="2025-12-16 08:30:15.792951563 +0000 UTC m=+5694.281517686" observedRunningTime="2025-12-16 08:30:16.54522875 +0000 UTC m=+5695.033794873" watchObservedRunningTime="2025-12-16 08:30:16.546551662 +0000 UTC m=+5695.035117785" Dec 16 08:30:16 crc kubenswrapper[4823]: I1216 
08:30:16.566977 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-kbzcw" Dec 16 08:30:17 crc kubenswrapper[4823]: I1216 08:30:17.882598 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kbzcw"] Dec 16 08:30:18 crc kubenswrapper[4823]: I1216 08:30:18.540797 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-kbzcw" podUID="279510a6-9725-42c3-80fe-72ab8cae39ab" containerName="registry-server" containerID="cri-o://ca15920c84455045be46e8819678256fe4e89042860eaa585755bfc9a1896bbd" gracePeriod=2 Dec 16 08:30:19 crc kubenswrapper[4823]: I1216 08:30:19.607201 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-lmzfz" Dec 16 08:30:19 crc kubenswrapper[4823]: I1216 08:30:19.607492 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-lmzfz" Dec 16 08:30:19 crc kubenswrapper[4823]: I1216 08:30:19.643172 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-lmzfz" Dec 16 08:30:20 crc kubenswrapper[4823]: I1216 08:30:20.557396 4823 generic.go:334] "Generic (PLEG): container finished" podID="279510a6-9725-42c3-80fe-72ab8cae39ab" containerID="ca15920c84455045be46e8819678256fe4e89042860eaa585755bfc9a1896bbd" exitCode=0 Dec 16 08:30:20 crc kubenswrapper[4823]: I1216 08:30:20.557449 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kbzcw" event={"ID":"279510a6-9725-42c3-80fe-72ab8cae39ab","Type":"ContainerDied","Data":"ca15920c84455045be46e8819678256fe4e89042860eaa585755bfc9a1896bbd"} Dec 16 08:30:20 crc kubenswrapper[4823]: I1216 08:30:20.860252 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kbzcw" Dec 16 08:30:20 crc kubenswrapper[4823]: I1216 08:30:20.982150 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/279510a6-9725-42c3-80fe-72ab8cae39ab-catalog-content\") pod \"279510a6-9725-42c3-80fe-72ab8cae39ab\" (UID: \"279510a6-9725-42c3-80fe-72ab8cae39ab\") " Dec 16 08:30:20 crc kubenswrapper[4823]: I1216 08:30:20.982221 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/279510a6-9725-42c3-80fe-72ab8cae39ab-utilities\") pod \"279510a6-9725-42c3-80fe-72ab8cae39ab\" (UID: \"279510a6-9725-42c3-80fe-72ab8cae39ab\") " Dec 16 08:30:20 crc kubenswrapper[4823]: I1216 08:30:20.982248 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5676z\" (UniqueName: \"kubernetes.io/projected/279510a6-9725-42c3-80fe-72ab8cae39ab-kube-api-access-5676z\") pod \"279510a6-9725-42c3-80fe-72ab8cae39ab\" (UID: \"279510a6-9725-42c3-80fe-72ab8cae39ab\") " Dec 16 08:30:20 crc kubenswrapper[4823]: I1216 08:30:20.983871 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/279510a6-9725-42c3-80fe-72ab8cae39ab-utilities" (OuterVolumeSpecName: "utilities") pod "279510a6-9725-42c3-80fe-72ab8cae39ab" (UID: "279510a6-9725-42c3-80fe-72ab8cae39ab"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 08:30:20 crc kubenswrapper[4823]: I1216 08:30:20.994353 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/279510a6-9725-42c3-80fe-72ab8cae39ab-kube-api-access-5676z" (OuterVolumeSpecName: "kube-api-access-5676z") pod "279510a6-9725-42c3-80fe-72ab8cae39ab" (UID: "279510a6-9725-42c3-80fe-72ab8cae39ab"). InnerVolumeSpecName "kube-api-access-5676z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:30:21 crc kubenswrapper[4823]: I1216 08:30:21.042533 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/279510a6-9725-42c3-80fe-72ab8cae39ab-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "279510a6-9725-42c3-80fe-72ab8cae39ab" (UID: "279510a6-9725-42c3-80fe-72ab8cae39ab"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 08:30:21 crc kubenswrapper[4823]: I1216 08:30:21.084145 4823 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/279510a6-9725-42c3-80fe-72ab8cae39ab-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 08:30:21 crc kubenswrapper[4823]: I1216 08:30:21.084190 4823 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/279510a6-9725-42c3-80fe-72ab8cae39ab-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 08:30:21 crc kubenswrapper[4823]: I1216 08:30:21.084204 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5676z\" (UniqueName: \"kubernetes.io/projected/279510a6-9725-42c3-80fe-72ab8cae39ab-kube-api-access-5676z\") on node \"crc\" DevicePath \"\"" Dec 16 08:30:21 crc kubenswrapper[4823]: I1216 08:30:21.565967 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kbzcw" event={"ID":"279510a6-9725-42c3-80fe-72ab8cae39ab","Type":"ContainerDied","Data":"9560127ab69c81426f1fd501f0b02169c3b54f08dc17422a9274faec87b884eb"} Dec 16 08:30:21 crc kubenswrapper[4823]: I1216 08:30:21.566050 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kbzcw" Dec 16 08:30:21 crc kubenswrapper[4823]: I1216 08:30:21.566281 4823 scope.go:117] "RemoveContainer" containerID="ca15920c84455045be46e8819678256fe4e89042860eaa585755bfc9a1896bbd" Dec 16 08:30:21 crc kubenswrapper[4823]: I1216 08:30:21.585075 4823 scope.go:117] "RemoveContainer" containerID="7302828ef1dba928278d44a2b3709a626ab96ef30e8014c977e81ffa6af25011" Dec 16 08:30:21 crc kubenswrapper[4823]: I1216 08:30:21.604989 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kbzcw"] Dec 16 08:30:21 crc kubenswrapper[4823]: I1216 08:30:21.612180 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-kbzcw"] Dec 16 08:30:21 crc kubenswrapper[4823]: I1216 08:30:21.620675 4823 scope.go:117] "RemoveContainer" containerID="b46d0cf94319feb560f160f1d0ce111634f8dfb9546cdf9398a6d3fc068e65cc" Dec 16 08:30:21 crc kubenswrapper[4823]: I1216 08:30:21.786364 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="279510a6-9725-42c3-80fe-72ab8cae39ab" path="/var/lib/kubelet/pods/279510a6-9725-42c3-80fe-72ab8cae39ab/volumes" Dec 16 08:30:25 crc kubenswrapper[4823]: I1216 08:30:25.772296 4823 scope.go:117] "RemoveContainer" containerID="1b09ffdd594a8d5cacbf74adcd72b8a090c1db4bd077e6e653ce5a0feae3c64f" Dec 16 08:30:25 crc kubenswrapper[4823]: E1216 08:30:25.772730 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 08:30:29 crc kubenswrapper[4823]: I1216 08:30:29.671479 4823 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-lmzfz" Dec 16 08:30:29 crc kubenswrapper[4823]: I1216 08:30:29.725907 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lmzfz"] Dec 16 08:30:30 crc kubenswrapper[4823]: I1216 08:30:30.626856 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-lmzfz" podUID="f6cec7b3-d6e9-424b-a325-a20a862db466" containerName="registry-server" containerID="cri-o://5a1352eca2bc328dd6d9d4c779b6f6edea45ac4e5e0ca6257d587482cfcba5b5" gracePeriod=2 Dec 16 08:30:31 crc kubenswrapper[4823]: I1216 08:30:31.638550 4823 generic.go:334] "Generic (PLEG): container finished" podID="f6cec7b3-d6e9-424b-a325-a20a862db466" containerID="5a1352eca2bc328dd6d9d4c779b6f6edea45ac4e5e0ca6257d587482cfcba5b5" exitCode=0 Dec 16 08:30:31 crc kubenswrapper[4823]: I1216 08:30:31.638604 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lmzfz" event={"ID":"f6cec7b3-d6e9-424b-a325-a20a862db466","Type":"ContainerDied","Data":"5a1352eca2bc328dd6d9d4c779b6f6edea45ac4e5e0ca6257d587482cfcba5b5"} Dec 16 08:30:32 crc kubenswrapper[4823]: I1216 08:30:32.208521 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-lmzfz" Dec 16 08:30:32 crc kubenswrapper[4823]: I1216 08:30:32.343857 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6cec7b3-d6e9-424b-a325-a20a862db466-utilities\") pod \"f6cec7b3-d6e9-424b-a325-a20a862db466\" (UID: \"f6cec7b3-d6e9-424b-a325-a20a862db466\") " Dec 16 08:30:32 crc kubenswrapper[4823]: I1216 08:30:32.343929 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6cec7b3-d6e9-424b-a325-a20a862db466-catalog-content\") pod \"f6cec7b3-d6e9-424b-a325-a20a862db466\" (UID: \"f6cec7b3-d6e9-424b-a325-a20a862db466\") " Dec 16 08:30:32 crc kubenswrapper[4823]: I1216 08:30:32.343957 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wm6m5\" (UniqueName: \"kubernetes.io/projected/f6cec7b3-d6e9-424b-a325-a20a862db466-kube-api-access-wm6m5\") pod \"f6cec7b3-d6e9-424b-a325-a20a862db466\" (UID: \"f6cec7b3-d6e9-424b-a325-a20a862db466\") " Dec 16 08:30:32 crc kubenswrapper[4823]: I1216 08:30:32.344738 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6cec7b3-d6e9-424b-a325-a20a862db466-utilities" (OuterVolumeSpecName: "utilities") pod "f6cec7b3-d6e9-424b-a325-a20a862db466" (UID: "f6cec7b3-d6e9-424b-a325-a20a862db466"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 08:30:32 crc kubenswrapper[4823]: I1216 08:30:32.347009 4823 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6cec7b3-d6e9-424b-a325-a20a862db466-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 08:30:32 crc kubenswrapper[4823]: I1216 08:30:32.348978 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6cec7b3-d6e9-424b-a325-a20a862db466-kube-api-access-wm6m5" (OuterVolumeSpecName: "kube-api-access-wm6m5") pod "f6cec7b3-d6e9-424b-a325-a20a862db466" (UID: "f6cec7b3-d6e9-424b-a325-a20a862db466"). InnerVolumeSpecName "kube-api-access-wm6m5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:30:32 crc kubenswrapper[4823]: I1216 08:30:32.398750 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6cec7b3-d6e9-424b-a325-a20a862db466-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f6cec7b3-d6e9-424b-a325-a20a862db466" (UID: "f6cec7b3-d6e9-424b-a325-a20a862db466"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 08:30:32 crc kubenswrapper[4823]: I1216 08:30:32.448163 4823 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6cec7b3-d6e9-424b-a325-a20a862db466-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 08:30:32 crc kubenswrapper[4823]: I1216 08:30:32.448210 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wm6m5\" (UniqueName: \"kubernetes.io/projected/f6cec7b3-d6e9-424b-a325-a20a862db466-kube-api-access-wm6m5\") on node \"crc\" DevicePath \"\"" Dec 16 08:30:32 crc kubenswrapper[4823]: I1216 08:30:32.647662 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lmzfz" event={"ID":"f6cec7b3-d6e9-424b-a325-a20a862db466","Type":"ContainerDied","Data":"145f818e5a76eefdea8e4ba0c112c024f50798abee8cba732637e953a7bc7b2d"} Dec 16 08:30:32 crc kubenswrapper[4823]: I1216 08:30:32.647723 4823 scope.go:117] "RemoveContainer" containerID="5a1352eca2bc328dd6d9d4c779b6f6edea45ac4e5e0ca6257d587482cfcba5b5" Dec 16 08:30:32 crc kubenswrapper[4823]: I1216 08:30:32.647748 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-lmzfz" Dec 16 08:30:32 crc kubenswrapper[4823]: I1216 08:30:32.667214 4823 scope.go:117] "RemoveContainer" containerID="940e752040a2967d8e043891c54a565c36d6278166f84311160d699b70e98938" Dec 16 08:30:32 crc kubenswrapper[4823]: I1216 08:30:32.681339 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lmzfz"] Dec 16 08:30:32 crc kubenswrapper[4823]: I1216 08:30:32.685175 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-lmzfz"] Dec 16 08:30:32 crc kubenswrapper[4823]: I1216 08:30:32.702823 4823 scope.go:117] "RemoveContainer" containerID="689f6a47c2e732513ae22ac5d69148c7ef2870cf9da2b9c8023d2b1e2862d101" Dec 16 08:30:33 crc kubenswrapper[4823]: I1216 08:30:33.779980 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6cec7b3-d6e9-424b-a325-a20a862db466" path="/var/lib/kubelet/pods/f6cec7b3-d6e9-424b-a325-a20a862db466/volumes" Dec 16 08:30:36 crc kubenswrapper[4823]: I1216 08:30:36.772499 4823 scope.go:117] "RemoveContainer" containerID="1b09ffdd594a8d5cacbf74adcd72b8a090c1db4bd077e6e653ce5a0feae3c64f" Dec 16 08:30:36 crc kubenswrapper[4823]: E1216 08:30:36.773172 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 08:30:41 crc kubenswrapper[4823]: I1216 08:30:41.998442 4823 scope.go:117] "RemoveContainer" containerID="970f3a5ca4f8d0fd63741d179774e18a0052bce79bbb50353d9ebc7979c37ff3" Dec 16 08:30:51 crc kubenswrapper[4823]: I1216 08:30:51.774831 4823 scope.go:117] "RemoveContainer" 
containerID="1b09ffdd594a8d5cacbf74adcd72b8a090c1db4bd077e6e653ce5a0feae3c64f" Dec 16 08:30:51 crc kubenswrapper[4823]: E1216 08:30:51.775758 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 08:31:04 crc kubenswrapper[4823]: I1216 08:31:04.772637 4823 scope.go:117] "RemoveContainer" containerID="1b09ffdd594a8d5cacbf74adcd72b8a090c1db4bd077e6e653ce5a0feae3c64f" Dec 16 08:31:04 crc kubenswrapper[4823]: E1216 08:31:04.775337 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 08:31:17 crc kubenswrapper[4823]: I1216 08:31:17.771839 4823 scope.go:117] "RemoveContainer" containerID="1b09ffdd594a8d5cacbf74adcd72b8a090c1db4bd077e6e653ce5a0feae3c64f" Dec 16 08:31:17 crc kubenswrapper[4823]: E1216 08:31:17.772763 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 08:31:31 crc kubenswrapper[4823]: I1216 08:31:31.777915 4823 scope.go:117] 
"RemoveContainer" containerID="1b09ffdd594a8d5cacbf74adcd72b8a090c1db4bd077e6e653ce5a0feae3c64f" Dec 16 08:31:31 crc kubenswrapper[4823]: E1216 08:31:31.779257 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 08:31:43 crc kubenswrapper[4823]: I1216 08:31:43.772165 4823 scope.go:117] "RemoveContainer" containerID="1b09ffdd594a8d5cacbf74adcd72b8a090c1db4bd077e6e653ce5a0feae3c64f" Dec 16 08:31:43 crc kubenswrapper[4823]: E1216 08:31:43.773194 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 08:31:57 crc kubenswrapper[4823]: I1216 08:31:57.771502 4823 scope.go:117] "RemoveContainer" containerID="1b09ffdd594a8d5cacbf74adcd72b8a090c1db4bd077e6e653ce5a0feae3c64f" Dec 16 08:31:57 crc kubenswrapper[4823]: E1216 08:31:57.772357 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 08:32:08 crc kubenswrapper[4823]: I1216 08:32:08.771107 
4823 scope.go:117] "RemoveContainer" containerID="1b09ffdd594a8d5cacbf74adcd72b8a090c1db4bd077e6e653ce5a0feae3c64f" Dec 16 08:32:08 crc kubenswrapper[4823]: E1216 08:32:08.771813 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 08:32:20 crc kubenswrapper[4823]: I1216 08:32:20.771632 4823 scope.go:117] "RemoveContainer" containerID="1b09ffdd594a8d5cacbf74adcd72b8a090c1db4bd077e6e653ce5a0feae3c64f" Dec 16 08:32:20 crc kubenswrapper[4823]: E1216 08:32:20.772458 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 08:32:32 crc kubenswrapper[4823]: I1216 08:32:32.772151 4823 scope.go:117] "RemoveContainer" containerID="1b09ffdd594a8d5cacbf74adcd72b8a090c1db4bd077e6e653ce5a0feae3c64f" Dec 16 08:32:32 crc kubenswrapper[4823]: E1216 08:32:32.772996 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 08:32:43 crc kubenswrapper[4823]: I1216 
08:32:43.772515 4823 scope.go:117] "RemoveContainer" containerID="1b09ffdd594a8d5cacbf74adcd72b8a090c1db4bd077e6e653ce5a0feae3c64f" Dec 16 08:32:43 crc kubenswrapper[4823]: E1216 08:32:43.773383 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 08:32:55 crc kubenswrapper[4823]: I1216 08:32:55.771532 4823 scope.go:117] "RemoveContainer" containerID="1b09ffdd594a8d5cacbf74adcd72b8a090c1db4bd077e6e653ce5a0feae3c64f" Dec 16 08:32:55 crc kubenswrapper[4823]: E1216 08:32:55.772367 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 08:33:10 crc kubenswrapper[4823]: I1216 08:33:10.771275 4823 scope.go:117] "RemoveContainer" containerID="1b09ffdd594a8d5cacbf74adcd72b8a090c1db4bd077e6e653ce5a0feae3c64f" Dec 16 08:33:10 crc kubenswrapper[4823]: E1216 08:33:10.772251 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 08:33:21 crc 
kubenswrapper[4823]: I1216 08:33:21.783331 4823 scope.go:117] "RemoveContainer" containerID="1b09ffdd594a8d5cacbf74adcd72b8a090c1db4bd077e6e653ce5a0feae3c64f" Dec 16 08:33:21 crc kubenswrapper[4823]: E1216 08:33:21.784422 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 08:33:33 crc kubenswrapper[4823]: I1216 08:33:33.772227 4823 scope.go:117] "RemoveContainer" containerID="1b09ffdd594a8d5cacbf74adcd72b8a090c1db4bd077e6e653ce5a0feae3c64f" Dec 16 08:33:33 crc kubenswrapper[4823]: E1216 08:33:33.773493 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 08:33:44 crc kubenswrapper[4823]: I1216 08:33:44.771580 4823 scope.go:117] "RemoveContainer" containerID="1b09ffdd594a8d5cacbf74adcd72b8a090c1db4bd077e6e653ce5a0feae3c64f" Dec 16 08:33:44 crc kubenswrapper[4823]: E1216 08:33:44.772406 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 
16 08:33:59 crc kubenswrapper[4823]: I1216 08:33:59.772631 4823 scope.go:117] "RemoveContainer" containerID="1b09ffdd594a8d5cacbf74adcd72b8a090c1db4bd077e6e653ce5a0feae3c64f" Dec 16 08:33:59 crc kubenswrapper[4823]: E1216 08:33:59.773517 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 08:34:10 crc kubenswrapper[4823]: I1216 08:34:10.772331 4823 scope.go:117] "RemoveContainer" containerID="1b09ffdd594a8d5cacbf74adcd72b8a090c1db4bd077e6e653ce5a0feae3c64f" Dec 16 08:34:10 crc kubenswrapper[4823]: E1216 08:34:10.773504 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 08:34:24 crc kubenswrapper[4823]: I1216 08:34:24.772520 4823 scope.go:117] "RemoveContainer" containerID="1b09ffdd594a8d5cacbf74adcd72b8a090c1db4bd077e6e653ce5a0feae3c64f" Dec 16 08:34:24 crc kubenswrapper[4823]: E1216 08:34:24.773233 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" 
podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 08:34:35 crc kubenswrapper[4823]: I1216 08:34:35.771675 4823 scope.go:117] "RemoveContainer" containerID="1b09ffdd594a8d5cacbf74adcd72b8a090c1db4bd077e6e653ce5a0feae3c64f" Dec 16 08:34:35 crc kubenswrapper[4823]: E1216 08:34:35.772652 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 08:34:47 crc kubenswrapper[4823]: I1216 08:34:47.773387 4823 scope.go:117] "RemoveContainer" containerID="1b09ffdd594a8d5cacbf74adcd72b8a090c1db4bd077e6e653ce5a0feae3c64f" Dec 16 08:34:47 crc kubenswrapper[4823]: E1216 08:34:47.773849 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 08:34:56 crc kubenswrapper[4823]: I1216 08:34:56.371811 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-wnwwv"] Dec 16 08:34:56 crc kubenswrapper[4823]: E1216 08:34:56.373508 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="279510a6-9725-42c3-80fe-72ab8cae39ab" containerName="extract-utilities" Dec 16 08:34:56 crc kubenswrapper[4823]: I1216 08:34:56.373554 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="279510a6-9725-42c3-80fe-72ab8cae39ab" containerName="extract-utilities" Dec 16 08:34:56 crc kubenswrapper[4823]: 
E1216 08:34:56.373587 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="279510a6-9725-42c3-80fe-72ab8cae39ab" containerName="registry-server" Dec 16 08:34:56 crc kubenswrapper[4823]: I1216 08:34:56.373597 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="279510a6-9725-42c3-80fe-72ab8cae39ab" containerName="registry-server" Dec 16 08:34:56 crc kubenswrapper[4823]: E1216 08:34:56.373611 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6cec7b3-d6e9-424b-a325-a20a862db466" containerName="registry-server" Dec 16 08:34:56 crc kubenswrapper[4823]: I1216 08:34:56.373620 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6cec7b3-d6e9-424b-a325-a20a862db466" containerName="registry-server" Dec 16 08:34:56 crc kubenswrapper[4823]: E1216 08:34:56.373634 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6cec7b3-d6e9-424b-a325-a20a862db466" containerName="extract-utilities" Dec 16 08:34:56 crc kubenswrapper[4823]: I1216 08:34:56.373642 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6cec7b3-d6e9-424b-a325-a20a862db466" containerName="extract-utilities" Dec 16 08:34:56 crc kubenswrapper[4823]: E1216 08:34:56.373663 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6cec7b3-d6e9-424b-a325-a20a862db466" containerName="extract-content" Dec 16 08:34:56 crc kubenswrapper[4823]: I1216 08:34:56.373672 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6cec7b3-d6e9-424b-a325-a20a862db466" containerName="extract-content" Dec 16 08:34:56 crc kubenswrapper[4823]: E1216 08:34:56.373687 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="279510a6-9725-42c3-80fe-72ab8cae39ab" containerName="extract-content" Dec 16 08:34:56 crc kubenswrapper[4823]: I1216 08:34:56.373696 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="279510a6-9725-42c3-80fe-72ab8cae39ab" containerName="extract-content" Dec 16 08:34:56 crc kubenswrapper[4823]: I1216 
08:34:56.373894 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="279510a6-9725-42c3-80fe-72ab8cae39ab" containerName="registry-server" Dec 16 08:34:56 crc kubenswrapper[4823]: I1216 08:34:56.373921 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6cec7b3-d6e9-424b-a325-a20a862db466" containerName="registry-server" Dec 16 08:34:56 crc kubenswrapper[4823]: I1216 08:34:56.375270 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wnwwv" Dec 16 08:34:56 crc kubenswrapper[4823]: I1216 08:34:56.392885 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wnwwv"] Dec 16 08:34:56 crc kubenswrapper[4823]: I1216 08:34:56.418882 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2409525c-4705-4456-bc51-468da283b0ca-catalog-content\") pod \"redhat-operators-wnwwv\" (UID: \"2409525c-4705-4456-bc51-468da283b0ca\") " pod="openshift-marketplace/redhat-operators-wnwwv" Dec 16 08:34:56 crc kubenswrapper[4823]: I1216 08:34:56.419349 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-848vb\" (UniqueName: \"kubernetes.io/projected/2409525c-4705-4456-bc51-468da283b0ca-kube-api-access-848vb\") pod \"redhat-operators-wnwwv\" (UID: \"2409525c-4705-4456-bc51-468da283b0ca\") " pod="openshift-marketplace/redhat-operators-wnwwv" Dec 16 08:34:56 crc kubenswrapper[4823]: I1216 08:34:56.419546 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2409525c-4705-4456-bc51-468da283b0ca-utilities\") pod \"redhat-operators-wnwwv\" (UID: \"2409525c-4705-4456-bc51-468da283b0ca\") " pod="openshift-marketplace/redhat-operators-wnwwv" Dec 16 08:34:56 crc kubenswrapper[4823]: I1216 
08:34:56.520506 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2409525c-4705-4456-bc51-468da283b0ca-utilities\") pod \"redhat-operators-wnwwv\" (UID: \"2409525c-4705-4456-bc51-468da283b0ca\") " pod="openshift-marketplace/redhat-operators-wnwwv" Dec 16 08:34:56 crc kubenswrapper[4823]: I1216 08:34:56.520619 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2409525c-4705-4456-bc51-468da283b0ca-catalog-content\") pod \"redhat-operators-wnwwv\" (UID: \"2409525c-4705-4456-bc51-468da283b0ca\") " pod="openshift-marketplace/redhat-operators-wnwwv" Dec 16 08:34:56 crc kubenswrapper[4823]: I1216 08:34:56.520652 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-848vb\" (UniqueName: \"kubernetes.io/projected/2409525c-4705-4456-bc51-468da283b0ca-kube-api-access-848vb\") pod \"redhat-operators-wnwwv\" (UID: \"2409525c-4705-4456-bc51-468da283b0ca\") " pod="openshift-marketplace/redhat-operators-wnwwv" Dec 16 08:34:56 crc kubenswrapper[4823]: I1216 08:34:56.521177 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2409525c-4705-4456-bc51-468da283b0ca-utilities\") pod \"redhat-operators-wnwwv\" (UID: \"2409525c-4705-4456-bc51-468da283b0ca\") " pod="openshift-marketplace/redhat-operators-wnwwv" Dec 16 08:34:56 crc kubenswrapper[4823]: I1216 08:34:56.521423 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2409525c-4705-4456-bc51-468da283b0ca-catalog-content\") pod \"redhat-operators-wnwwv\" (UID: \"2409525c-4705-4456-bc51-468da283b0ca\") " pod="openshift-marketplace/redhat-operators-wnwwv" Dec 16 08:34:56 crc kubenswrapper[4823]: I1216 08:34:56.544217 4823 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-848vb\" (UniqueName: \"kubernetes.io/projected/2409525c-4705-4456-bc51-468da283b0ca-kube-api-access-848vb\") pod \"redhat-operators-wnwwv\" (UID: \"2409525c-4705-4456-bc51-468da283b0ca\") " pod="openshift-marketplace/redhat-operators-wnwwv" Dec 16 08:34:56 crc kubenswrapper[4823]: I1216 08:34:56.727377 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wnwwv" Dec 16 08:34:57 crc kubenswrapper[4823]: I1216 08:34:57.168277 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wnwwv"] Dec 16 08:34:57 crc kubenswrapper[4823]: I1216 08:34:57.643528 4823 generic.go:334] "Generic (PLEG): container finished" podID="2409525c-4705-4456-bc51-468da283b0ca" containerID="45b893947021eb5805d0081f5740c608003f679b3817edb185ca9f783b857273" exitCode=0 Dec 16 08:34:57 crc kubenswrapper[4823]: I1216 08:34:57.643609 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wnwwv" event={"ID":"2409525c-4705-4456-bc51-468da283b0ca","Type":"ContainerDied","Data":"45b893947021eb5805d0081f5740c608003f679b3817edb185ca9f783b857273"} Dec 16 08:34:57 crc kubenswrapper[4823]: I1216 08:34:57.643637 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wnwwv" event={"ID":"2409525c-4705-4456-bc51-468da283b0ca","Type":"ContainerStarted","Data":"0656cbfb8f0a0e1c6bcf7c02d18ed83feb9184b985204c105dcb5edeb3141361"} Dec 16 08:34:58 crc kubenswrapper[4823]: I1216 08:34:58.668939 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wnwwv" event={"ID":"2409525c-4705-4456-bc51-468da283b0ca","Type":"ContainerStarted","Data":"c9cf39f8ddfd3644732c8ad2ba78b7c0da4b5f4492835d50caf2d412087c5e43"} Dec 16 08:34:59 crc kubenswrapper[4823]: I1216 08:34:59.678434 4823 generic.go:334] "Generic (PLEG): container finished" 
podID="2409525c-4705-4456-bc51-468da283b0ca" containerID="c9cf39f8ddfd3644732c8ad2ba78b7c0da4b5f4492835d50caf2d412087c5e43" exitCode=0 Dec 16 08:34:59 crc kubenswrapper[4823]: I1216 08:34:59.678500 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wnwwv" event={"ID":"2409525c-4705-4456-bc51-468da283b0ca","Type":"ContainerDied","Data":"c9cf39f8ddfd3644732c8ad2ba78b7c0da4b5f4492835d50caf2d412087c5e43"} Dec 16 08:35:00 crc kubenswrapper[4823]: I1216 08:35:00.688714 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wnwwv" event={"ID":"2409525c-4705-4456-bc51-468da283b0ca","Type":"ContainerStarted","Data":"4fafd57c38690ffd9d05a188e52354f839021f46b8d9df1ded25fab2843665f7"} Dec 16 08:35:00 crc kubenswrapper[4823]: I1216 08:35:00.718947 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-wnwwv" podStartSLOduration=2.2223098119999998 podStartE2EDuration="4.718927357s" podCreationTimestamp="2025-12-16 08:34:56 +0000 UTC" firstStartedPulling="2025-12-16 08:34:57.645867286 +0000 UTC m=+5976.134433409" lastFinishedPulling="2025-12-16 08:35:00.142484831 +0000 UTC m=+5978.631050954" observedRunningTime="2025-12-16 08:35:00.712692931 +0000 UTC m=+5979.201259094" watchObservedRunningTime="2025-12-16 08:35:00.718927357 +0000 UTC m=+5979.207493470" Dec 16 08:35:02 crc kubenswrapper[4823]: I1216 08:35:02.771058 4823 scope.go:117] "RemoveContainer" containerID="1b09ffdd594a8d5cacbf74adcd72b8a090c1db4bd077e6e653ce5a0feae3c64f" Dec 16 08:35:03 crc kubenswrapper[4823]: I1216 08:35:03.710670 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" event={"ID":"25dec47c-3043-486c-b371-2be103c214e3","Type":"ContainerStarted","Data":"294b6c0f0228f2baf018382a3d963ffc8968e2a2b2dcfcf14ae472b2ce45e535"} Dec 16 08:35:06 crc kubenswrapper[4823]: I1216 08:35:06.728404 4823 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-wnwwv" Dec 16 08:35:06 crc kubenswrapper[4823]: I1216 08:35:06.728786 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-wnwwv" Dec 16 08:35:06 crc kubenswrapper[4823]: I1216 08:35:06.774283 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-wnwwv" Dec 16 08:35:07 crc kubenswrapper[4823]: I1216 08:35:07.786109 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-wnwwv" Dec 16 08:35:07 crc kubenswrapper[4823]: I1216 08:35:07.834127 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wnwwv"] Dec 16 08:35:09 crc kubenswrapper[4823]: I1216 08:35:09.748201 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-wnwwv" podUID="2409525c-4705-4456-bc51-468da283b0ca" containerName="registry-server" containerID="cri-o://4fafd57c38690ffd9d05a188e52354f839021f46b8d9df1ded25fab2843665f7" gracePeriod=2 Dec 16 08:35:11 crc kubenswrapper[4823]: I1216 08:35:11.767153 4823 generic.go:334] "Generic (PLEG): container finished" podID="2409525c-4705-4456-bc51-468da283b0ca" containerID="4fafd57c38690ffd9d05a188e52354f839021f46b8d9df1ded25fab2843665f7" exitCode=0 Dec 16 08:35:11 crc kubenswrapper[4823]: I1216 08:35:11.767201 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wnwwv" event={"ID":"2409525c-4705-4456-bc51-468da283b0ca","Type":"ContainerDied","Data":"4fafd57c38690ffd9d05a188e52354f839021f46b8d9df1ded25fab2843665f7"} Dec 16 08:35:11 crc kubenswrapper[4823]: I1216 08:35:11.927468 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wnwwv" Dec 16 08:35:11 crc kubenswrapper[4823]: I1216 08:35:11.961985 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2409525c-4705-4456-bc51-468da283b0ca-utilities\") pod \"2409525c-4705-4456-bc51-468da283b0ca\" (UID: \"2409525c-4705-4456-bc51-468da283b0ca\") " Dec 16 08:35:11 crc kubenswrapper[4823]: I1216 08:35:11.962098 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2409525c-4705-4456-bc51-468da283b0ca-catalog-content\") pod \"2409525c-4705-4456-bc51-468da283b0ca\" (UID: \"2409525c-4705-4456-bc51-468da283b0ca\") " Dec 16 08:35:11 crc kubenswrapper[4823]: I1216 08:35:11.962121 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-848vb\" (UniqueName: \"kubernetes.io/projected/2409525c-4705-4456-bc51-468da283b0ca-kube-api-access-848vb\") pod \"2409525c-4705-4456-bc51-468da283b0ca\" (UID: \"2409525c-4705-4456-bc51-468da283b0ca\") " Dec 16 08:35:11 crc kubenswrapper[4823]: I1216 08:35:11.987544 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2409525c-4705-4456-bc51-468da283b0ca-utilities" (OuterVolumeSpecName: "utilities") pod "2409525c-4705-4456-bc51-468da283b0ca" (UID: "2409525c-4705-4456-bc51-468da283b0ca"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 08:35:12 crc kubenswrapper[4823]: I1216 08:35:12.008631 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2409525c-4705-4456-bc51-468da283b0ca-kube-api-access-848vb" (OuterVolumeSpecName: "kube-api-access-848vb") pod "2409525c-4705-4456-bc51-468da283b0ca" (UID: "2409525c-4705-4456-bc51-468da283b0ca"). InnerVolumeSpecName "kube-api-access-848vb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:35:12 crc kubenswrapper[4823]: I1216 08:35:12.063865 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-848vb\" (UniqueName: \"kubernetes.io/projected/2409525c-4705-4456-bc51-468da283b0ca-kube-api-access-848vb\") on node \"crc\" DevicePath \"\"" Dec 16 08:35:12 crc kubenswrapper[4823]: I1216 08:35:12.063894 4823 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2409525c-4705-4456-bc51-468da283b0ca-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 08:35:12 crc kubenswrapper[4823]: I1216 08:35:12.106833 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2409525c-4705-4456-bc51-468da283b0ca-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2409525c-4705-4456-bc51-468da283b0ca" (UID: "2409525c-4705-4456-bc51-468da283b0ca"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 08:35:12 crc kubenswrapper[4823]: I1216 08:35:12.165396 4823 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2409525c-4705-4456-bc51-468da283b0ca-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 08:35:12 crc kubenswrapper[4823]: I1216 08:35:12.777098 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wnwwv" event={"ID":"2409525c-4705-4456-bc51-468da283b0ca","Type":"ContainerDied","Data":"0656cbfb8f0a0e1c6bcf7c02d18ed83feb9184b985204c105dcb5edeb3141361"} Dec 16 08:35:12 crc kubenswrapper[4823]: I1216 08:35:12.777182 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wnwwv" Dec 16 08:35:12 crc kubenswrapper[4823]: I1216 08:35:12.777403 4823 scope.go:117] "RemoveContainer" containerID="4fafd57c38690ffd9d05a188e52354f839021f46b8d9df1ded25fab2843665f7" Dec 16 08:35:12 crc kubenswrapper[4823]: I1216 08:35:12.818385 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wnwwv"] Dec 16 08:35:12 crc kubenswrapper[4823]: I1216 08:35:12.819973 4823 scope.go:117] "RemoveContainer" containerID="c9cf39f8ddfd3644732c8ad2ba78b7c0da4b5f4492835d50caf2d412087c5e43" Dec 16 08:35:12 crc kubenswrapper[4823]: I1216 08:35:12.827721 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-wnwwv"] Dec 16 08:35:12 crc kubenswrapper[4823]: I1216 08:35:12.846354 4823 scope.go:117] "RemoveContainer" containerID="45b893947021eb5805d0081f5740c608003f679b3817edb185ca9f783b857273" Dec 16 08:35:13 crc kubenswrapper[4823]: I1216 08:35:13.780366 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2409525c-4705-4456-bc51-468da283b0ca" path="/var/lib/kubelet/pods/2409525c-4705-4456-bc51-468da283b0ca/volumes" Dec 16 08:36:06 crc kubenswrapper[4823]: I1216 08:36:06.629292 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-7vbh4"] Dec 16 08:36:06 crc kubenswrapper[4823]: I1216 08:36:06.635644 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-7vbh4"] Dec 16 08:36:06 crc kubenswrapper[4823]: I1216 08:36:06.753626 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-khcxk"] Dec 16 08:36:06 crc kubenswrapper[4823]: E1216 08:36:06.753962 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2409525c-4705-4456-bc51-468da283b0ca" containerName="extract-content" Dec 16 08:36:06 crc kubenswrapper[4823]: I1216 08:36:06.753977 4823 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="2409525c-4705-4456-bc51-468da283b0ca" containerName="extract-content" Dec 16 08:36:06 crc kubenswrapper[4823]: E1216 08:36:06.753997 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2409525c-4705-4456-bc51-468da283b0ca" containerName="registry-server" Dec 16 08:36:06 crc kubenswrapper[4823]: I1216 08:36:06.754005 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="2409525c-4705-4456-bc51-468da283b0ca" containerName="registry-server" Dec 16 08:36:06 crc kubenswrapper[4823]: E1216 08:36:06.754046 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2409525c-4705-4456-bc51-468da283b0ca" containerName="extract-utilities" Dec 16 08:36:06 crc kubenswrapper[4823]: I1216 08:36:06.754054 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="2409525c-4705-4456-bc51-468da283b0ca" containerName="extract-utilities" Dec 16 08:36:06 crc kubenswrapper[4823]: I1216 08:36:06.754216 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="2409525c-4705-4456-bc51-468da283b0ca" containerName="registry-server" Dec 16 08:36:06 crc kubenswrapper[4823]: I1216 08:36:06.754769 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-khcxk" Dec 16 08:36:06 crc kubenswrapper[4823]: I1216 08:36:06.759150 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Dec 16 08:36:06 crc kubenswrapper[4823]: I1216 08:36:06.759224 4823 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-7fjh7" Dec 16 08:36:06 crc kubenswrapper[4823]: I1216 08:36:06.759947 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Dec 16 08:36:06 crc kubenswrapper[4823]: I1216 08:36:06.760200 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Dec 16 08:36:06 crc kubenswrapper[4823]: I1216 08:36:06.765183 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-khcxk"] Dec 16 08:36:06 crc kubenswrapper[4823]: I1216 08:36:06.869052 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-db8p7\" (UniqueName: \"kubernetes.io/projected/a769b117-48a9-436c-84f2-16890ff2edef-kube-api-access-db8p7\") pod \"crc-storage-crc-khcxk\" (UID: \"a769b117-48a9-436c-84f2-16890ff2edef\") " pod="crc-storage/crc-storage-crc-khcxk" Dec 16 08:36:06 crc kubenswrapper[4823]: I1216 08:36:06.869118 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/a769b117-48a9-436c-84f2-16890ff2edef-crc-storage\") pod \"crc-storage-crc-khcxk\" (UID: \"a769b117-48a9-436c-84f2-16890ff2edef\") " pod="crc-storage/crc-storage-crc-khcxk" Dec 16 08:36:06 crc kubenswrapper[4823]: I1216 08:36:06.869153 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/a769b117-48a9-436c-84f2-16890ff2edef-node-mnt\") pod \"crc-storage-crc-khcxk\" (UID: 
\"a769b117-48a9-436c-84f2-16890ff2edef\") " pod="crc-storage/crc-storage-crc-khcxk" Dec 16 08:36:06 crc kubenswrapper[4823]: I1216 08:36:06.970006 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-db8p7\" (UniqueName: \"kubernetes.io/projected/a769b117-48a9-436c-84f2-16890ff2edef-kube-api-access-db8p7\") pod \"crc-storage-crc-khcxk\" (UID: \"a769b117-48a9-436c-84f2-16890ff2edef\") " pod="crc-storage/crc-storage-crc-khcxk" Dec 16 08:36:06 crc kubenswrapper[4823]: I1216 08:36:06.970081 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/a769b117-48a9-436c-84f2-16890ff2edef-crc-storage\") pod \"crc-storage-crc-khcxk\" (UID: \"a769b117-48a9-436c-84f2-16890ff2edef\") " pod="crc-storage/crc-storage-crc-khcxk" Dec 16 08:36:06 crc kubenswrapper[4823]: I1216 08:36:06.970110 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/a769b117-48a9-436c-84f2-16890ff2edef-node-mnt\") pod \"crc-storage-crc-khcxk\" (UID: \"a769b117-48a9-436c-84f2-16890ff2edef\") " pod="crc-storage/crc-storage-crc-khcxk" Dec 16 08:36:06 crc kubenswrapper[4823]: I1216 08:36:06.970502 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/a769b117-48a9-436c-84f2-16890ff2edef-node-mnt\") pod \"crc-storage-crc-khcxk\" (UID: \"a769b117-48a9-436c-84f2-16890ff2edef\") " pod="crc-storage/crc-storage-crc-khcxk" Dec 16 08:36:06 crc kubenswrapper[4823]: I1216 08:36:06.971322 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/a769b117-48a9-436c-84f2-16890ff2edef-crc-storage\") pod \"crc-storage-crc-khcxk\" (UID: \"a769b117-48a9-436c-84f2-16890ff2edef\") " pod="crc-storage/crc-storage-crc-khcxk" Dec 16 08:36:06 crc kubenswrapper[4823]: I1216 08:36:06.991129 4823 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-db8p7\" (UniqueName: \"kubernetes.io/projected/a769b117-48a9-436c-84f2-16890ff2edef-kube-api-access-db8p7\") pod \"crc-storage-crc-khcxk\" (UID: \"a769b117-48a9-436c-84f2-16890ff2edef\") " pod="crc-storage/crc-storage-crc-khcxk" Dec 16 08:36:07 crc kubenswrapper[4823]: I1216 08:36:07.081112 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-khcxk" Dec 16 08:36:07 crc kubenswrapper[4823]: I1216 08:36:07.560326 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-khcxk"] Dec 16 08:36:07 crc kubenswrapper[4823]: I1216 08:36:07.569582 4823 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 16 08:36:07 crc kubenswrapper[4823]: I1216 08:36:07.783262 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55889c97-d986-4a35-bbcf-af45ac5f9fe8" path="/var/lib/kubelet/pods/55889c97-d986-4a35-bbcf-af45ac5f9fe8/volumes" Dec 16 08:36:08 crc kubenswrapper[4823]: I1216 08:36:08.205337 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-khcxk" event={"ID":"a769b117-48a9-436c-84f2-16890ff2edef","Type":"ContainerStarted","Data":"ec96b59340209bb5b02a095e3abe8db6ce97b427b25493b90bda9b7c998757b3"} Dec 16 08:36:09 crc kubenswrapper[4823]: I1216 08:36:09.218218 4823 generic.go:334] "Generic (PLEG): container finished" podID="a769b117-48a9-436c-84f2-16890ff2edef" containerID="f550381959f87d979ba90f524766f86db98bf913dc564358a20b897de0ae2be3" exitCode=0 Dec 16 08:36:09 crc kubenswrapper[4823]: I1216 08:36:09.218303 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-khcxk" event={"ID":"a769b117-48a9-436c-84f2-16890ff2edef","Type":"ContainerDied","Data":"f550381959f87d979ba90f524766f86db98bf913dc564358a20b897de0ae2be3"} Dec 16 08:36:10 crc kubenswrapper[4823]: I1216 
08:36:10.502320 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-khcxk" Dec 16 08:36:10 crc kubenswrapper[4823]: I1216 08:36:10.622768 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/a769b117-48a9-436c-84f2-16890ff2edef-crc-storage\") pod \"a769b117-48a9-436c-84f2-16890ff2edef\" (UID: \"a769b117-48a9-436c-84f2-16890ff2edef\") " Dec 16 08:36:10 crc kubenswrapper[4823]: I1216 08:36:10.622826 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-db8p7\" (UniqueName: \"kubernetes.io/projected/a769b117-48a9-436c-84f2-16890ff2edef-kube-api-access-db8p7\") pod \"a769b117-48a9-436c-84f2-16890ff2edef\" (UID: \"a769b117-48a9-436c-84f2-16890ff2edef\") " Dec 16 08:36:10 crc kubenswrapper[4823]: I1216 08:36:10.622906 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/a769b117-48a9-436c-84f2-16890ff2edef-node-mnt\") pod \"a769b117-48a9-436c-84f2-16890ff2edef\" (UID: \"a769b117-48a9-436c-84f2-16890ff2edef\") " Dec 16 08:36:10 crc kubenswrapper[4823]: I1216 08:36:10.623273 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a769b117-48a9-436c-84f2-16890ff2edef-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "a769b117-48a9-436c-84f2-16890ff2edef" (UID: "a769b117-48a9-436c-84f2-16890ff2edef"). InnerVolumeSpecName "node-mnt". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 08:36:10 crc kubenswrapper[4823]: I1216 08:36:10.623953 4823 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/a769b117-48a9-436c-84f2-16890ff2edef-node-mnt\") on node \"crc\" DevicePath \"\"" Dec 16 08:36:10 crc kubenswrapper[4823]: I1216 08:36:10.630915 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a769b117-48a9-436c-84f2-16890ff2edef-kube-api-access-db8p7" (OuterVolumeSpecName: "kube-api-access-db8p7") pod "a769b117-48a9-436c-84f2-16890ff2edef" (UID: "a769b117-48a9-436c-84f2-16890ff2edef"). InnerVolumeSpecName "kube-api-access-db8p7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:36:10 crc kubenswrapper[4823]: I1216 08:36:10.660462 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a769b117-48a9-436c-84f2-16890ff2edef-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "a769b117-48a9-436c-84f2-16890ff2edef" (UID: "a769b117-48a9-436c-84f2-16890ff2edef"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:36:10 crc kubenswrapper[4823]: I1216 08:36:10.725150 4823 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/a769b117-48a9-436c-84f2-16890ff2edef-crc-storage\") on node \"crc\" DevicePath \"\"" Dec 16 08:36:10 crc kubenswrapper[4823]: I1216 08:36:10.725195 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-db8p7\" (UniqueName: \"kubernetes.io/projected/a769b117-48a9-436c-84f2-16890ff2edef-kube-api-access-db8p7\") on node \"crc\" DevicePath \"\"" Dec 16 08:36:11 crc kubenswrapper[4823]: I1216 08:36:11.235712 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-khcxk" event={"ID":"a769b117-48a9-436c-84f2-16890ff2edef","Type":"ContainerDied","Data":"ec96b59340209bb5b02a095e3abe8db6ce97b427b25493b90bda9b7c998757b3"} Dec 16 08:36:11 crc kubenswrapper[4823]: I1216 08:36:11.235767 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ec96b59340209bb5b02a095e3abe8db6ce97b427b25493b90bda9b7c998757b3" Dec 16 08:36:11 crc kubenswrapper[4823]: I1216 08:36:11.236384 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-khcxk" Dec 16 08:36:12 crc kubenswrapper[4823]: I1216 08:36:12.598288 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-khcxk"] Dec 16 08:36:12 crc kubenswrapper[4823]: I1216 08:36:12.605601 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-khcxk"] Dec 16 08:36:12 crc kubenswrapper[4823]: I1216 08:36:12.724537 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-tr4pl"] Dec 16 08:36:12 crc kubenswrapper[4823]: E1216 08:36:12.724877 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a769b117-48a9-436c-84f2-16890ff2edef" containerName="storage" Dec 16 08:36:12 crc kubenswrapper[4823]: I1216 08:36:12.724902 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="a769b117-48a9-436c-84f2-16890ff2edef" containerName="storage" Dec 16 08:36:12 crc kubenswrapper[4823]: I1216 08:36:12.725118 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="a769b117-48a9-436c-84f2-16890ff2edef" containerName="storage" Dec 16 08:36:12 crc kubenswrapper[4823]: I1216 08:36:12.725712 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-tr4pl" Dec 16 08:36:12 crc kubenswrapper[4823]: I1216 08:36:12.727881 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Dec 16 08:36:12 crc kubenswrapper[4823]: I1216 08:36:12.727924 4823 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-7fjh7" Dec 16 08:36:12 crc kubenswrapper[4823]: I1216 08:36:12.728313 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Dec 16 08:36:12 crc kubenswrapper[4823]: I1216 08:36:12.732169 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Dec 16 08:36:12 crc kubenswrapper[4823]: I1216 08:36:12.735627 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-tr4pl"] Dec 16 08:36:12 crc kubenswrapper[4823]: I1216 08:36:12.756806 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/26d62dfc-cd58-4d9e-a82f-39265298d5dc-node-mnt\") pod \"crc-storage-crc-tr4pl\" (UID: \"26d62dfc-cd58-4d9e-a82f-39265298d5dc\") " pod="crc-storage/crc-storage-crc-tr4pl" Dec 16 08:36:12 crc kubenswrapper[4823]: I1216 08:36:12.756861 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5m9l\" (UniqueName: \"kubernetes.io/projected/26d62dfc-cd58-4d9e-a82f-39265298d5dc-kube-api-access-t5m9l\") pod \"crc-storage-crc-tr4pl\" (UID: \"26d62dfc-cd58-4d9e-a82f-39265298d5dc\") " pod="crc-storage/crc-storage-crc-tr4pl" Dec 16 08:36:12 crc kubenswrapper[4823]: I1216 08:36:12.756889 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/26d62dfc-cd58-4d9e-a82f-39265298d5dc-crc-storage\") pod \"crc-storage-crc-tr4pl\" (UID: 
\"26d62dfc-cd58-4d9e-a82f-39265298d5dc\") " pod="crc-storage/crc-storage-crc-tr4pl" Dec 16 08:36:12 crc kubenswrapper[4823]: I1216 08:36:12.857554 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5m9l\" (UniqueName: \"kubernetes.io/projected/26d62dfc-cd58-4d9e-a82f-39265298d5dc-kube-api-access-t5m9l\") pod \"crc-storage-crc-tr4pl\" (UID: \"26d62dfc-cd58-4d9e-a82f-39265298d5dc\") " pod="crc-storage/crc-storage-crc-tr4pl" Dec 16 08:36:12 crc kubenswrapper[4823]: I1216 08:36:12.857606 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/26d62dfc-cd58-4d9e-a82f-39265298d5dc-crc-storage\") pod \"crc-storage-crc-tr4pl\" (UID: \"26d62dfc-cd58-4d9e-a82f-39265298d5dc\") " pod="crc-storage/crc-storage-crc-tr4pl" Dec 16 08:36:12 crc kubenswrapper[4823]: I1216 08:36:12.857743 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/26d62dfc-cd58-4d9e-a82f-39265298d5dc-node-mnt\") pod \"crc-storage-crc-tr4pl\" (UID: \"26d62dfc-cd58-4d9e-a82f-39265298d5dc\") " pod="crc-storage/crc-storage-crc-tr4pl" Dec 16 08:36:12 crc kubenswrapper[4823]: I1216 08:36:12.859123 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/26d62dfc-cd58-4d9e-a82f-39265298d5dc-node-mnt\") pod \"crc-storage-crc-tr4pl\" (UID: \"26d62dfc-cd58-4d9e-a82f-39265298d5dc\") " pod="crc-storage/crc-storage-crc-tr4pl" Dec 16 08:36:12 crc kubenswrapper[4823]: I1216 08:36:12.859549 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/26d62dfc-cd58-4d9e-a82f-39265298d5dc-crc-storage\") pod \"crc-storage-crc-tr4pl\" (UID: \"26d62dfc-cd58-4d9e-a82f-39265298d5dc\") " pod="crc-storage/crc-storage-crc-tr4pl" Dec 16 08:36:12 crc kubenswrapper[4823]: I1216 08:36:12.875858 4823 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5m9l\" (UniqueName: \"kubernetes.io/projected/26d62dfc-cd58-4d9e-a82f-39265298d5dc-kube-api-access-t5m9l\") pod \"crc-storage-crc-tr4pl\" (UID: \"26d62dfc-cd58-4d9e-a82f-39265298d5dc\") " pod="crc-storage/crc-storage-crc-tr4pl" Dec 16 08:36:13 crc kubenswrapper[4823]: I1216 08:36:13.044585 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-tr4pl" Dec 16 08:36:13 crc kubenswrapper[4823]: I1216 08:36:13.268031 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-tr4pl"] Dec 16 08:36:13 crc kubenswrapper[4823]: I1216 08:36:13.788322 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a769b117-48a9-436c-84f2-16890ff2edef" path="/var/lib/kubelet/pods/a769b117-48a9-436c-84f2-16890ff2edef/volumes" Dec 16 08:36:14 crc kubenswrapper[4823]: I1216 08:36:14.263916 4823 generic.go:334] "Generic (PLEG): container finished" podID="26d62dfc-cd58-4d9e-a82f-39265298d5dc" containerID="df524718121af9a8b91ec7dc595130b8bab526d5aae3c52d1313a733c56e01a8" exitCode=0 Dec 16 08:36:14 crc kubenswrapper[4823]: I1216 08:36:14.263991 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-tr4pl" event={"ID":"26d62dfc-cd58-4d9e-a82f-39265298d5dc","Type":"ContainerDied","Data":"df524718121af9a8b91ec7dc595130b8bab526d5aae3c52d1313a733c56e01a8"} Dec 16 08:36:14 crc kubenswrapper[4823]: I1216 08:36:14.264036 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-tr4pl" event={"ID":"26d62dfc-cd58-4d9e-a82f-39265298d5dc","Type":"ContainerStarted","Data":"9bc5d6146d281f847238654b1558bda459da3fd1c1f1141e331234493abda153"} Dec 16 08:36:15 crc kubenswrapper[4823]: I1216 08:36:15.736685 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-tr4pl" Dec 16 08:36:15 crc kubenswrapper[4823]: I1216 08:36:15.814762 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/26d62dfc-cd58-4d9e-a82f-39265298d5dc-node-mnt\") pod \"26d62dfc-cd58-4d9e-a82f-39265298d5dc\" (UID: \"26d62dfc-cd58-4d9e-a82f-39265298d5dc\") " Dec 16 08:36:15 crc kubenswrapper[4823]: I1216 08:36:15.814846 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/26d62dfc-cd58-4d9e-a82f-39265298d5dc-crc-storage\") pod \"26d62dfc-cd58-4d9e-a82f-39265298d5dc\" (UID: \"26d62dfc-cd58-4d9e-a82f-39265298d5dc\") " Dec 16 08:36:15 crc kubenswrapper[4823]: I1216 08:36:15.814884 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t5m9l\" (UniqueName: \"kubernetes.io/projected/26d62dfc-cd58-4d9e-a82f-39265298d5dc-kube-api-access-t5m9l\") pod \"26d62dfc-cd58-4d9e-a82f-39265298d5dc\" (UID: \"26d62dfc-cd58-4d9e-a82f-39265298d5dc\") " Dec 16 08:36:15 crc kubenswrapper[4823]: I1216 08:36:15.815184 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/26d62dfc-cd58-4d9e-a82f-39265298d5dc-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "26d62dfc-cd58-4d9e-a82f-39265298d5dc" (UID: "26d62dfc-cd58-4d9e-a82f-39265298d5dc"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 08:36:15 crc kubenswrapper[4823]: I1216 08:36:15.818772 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26d62dfc-cd58-4d9e-a82f-39265298d5dc-kube-api-access-t5m9l" (OuterVolumeSpecName: "kube-api-access-t5m9l") pod "26d62dfc-cd58-4d9e-a82f-39265298d5dc" (UID: "26d62dfc-cd58-4d9e-a82f-39265298d5dc"). InnerVolumeSpecName "kube-api-access-t5m9l". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 08:36:15 crc kubenswrapper[4823]: I1216 08:36:15.833413 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26d62dfc-cd58-4d9e-a82f-39265298d5dc-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "26d62dfc-cd58-4d9e-a82f-39265298d5dc" (UID: "26d62dfc-cd58-4d9e-a82f-39265298d5dc"). InnerVolumeSpecName "crc-storage". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 16 08:36:15 crc kubenswrapper[4823]: I1216 08:36:15.916963 4823 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/26d62dfc-cd58-4d9e-a82f-39265298d5dc-node-mnt\") on node \"crc\" DevicePath \"\""
Dec 16 08:36:15 crc kubenswrapper[4823]: I1216 08:36:15.917016 4823 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/26d62dfc-cd58-4d9e-a82f-39265298d5dc-crc-storage\") on node \"crc\" DevicePath \"\""
Dec 16 08:36:15 crc kubenswrapper[4823]: I1216 08:36:15.917033 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t5m9l\" (UniqueName: \"kubernetes.io/projected/26d62dfc-cd58-4d9e-a82f-39265298d5dc-kube-api-access-t5m9l\") on node \"crc\" DevicePath \"\""
Dec 16 08:36:16 crc kubenswrapper[4823]: I1216 08:36:16.279979 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-tr4pl" event={"ID":"26d62dfc-cd58-4d9e-a82f-39265298d5dc","Type":"ContainerDied","Data":"9bc5d6146d281f847238654b1558bda459da3fd1c1f1141e331234493abda153"}
Dec 16 08:36:16 crc kubenswrapper[4823]: I1216 08:36:16.280056 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9bc5d6146d281f847238654b1558bda459da3fd1c1f1141e331234493abda153"
Dec 16 08:36:16 crc kubenswrapper[4823]: I1216 08:36:16.280089 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-tr4pl"
Dec 16 08:36:24 crc kubenswrapper[4823]: I1216 08:36:24.099594 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-h2dn4"]
Dec 16 08:36:24 crc kubenswrapper[4823]: E1216 08:36:24.100463 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26d62dfc-cd58-4d9e-a82f-39265298d5dc" containerName="storage"
Dec 16 08:36:24 crc kubenswrapper[4823]: I1216 08:36:24.100478 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="26d62dfc-cd58-4d9e-a82f-39265298d5dc" containerName="storage"
Dec 16 08:36:24 crc kubenswrapper[4823]: I1216 08:36:24.100661 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="26d62dfc-cd58-4d9e-a82f-39265298d5dc" containerName="storage"
Dec 16 08:36:24 crc kubenswrapper[4823]: I1216 08:36:24.101749 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h2dn4"
Dec 16 08:36:24 crc kubenswrapper[4823]: I1216 08:36:24.111335 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-h2dn4"]
Dec 16 08:36:24 crc kubenswrapper[4823]: I1216 08:36:24.136112 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqjx9\" (UniqueName: \"kubernetes.io/projected/50ec25aa-a0d1-42b5-9e1e-31ba44c9261d-kube-api-access-hqjx9\") pod \"redhat-marketplace-h2dn4\" (UID: \"50ec25aa-a0d1-42b5-9e1e-31ba44c9261d\") " pod="openshift-marketplace/redhat-marketplace-h2dn4"
Dec 16 08:36:24 crc kubenswrapper[4823]: I1216 08:36:24.136179 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50ec25aa-a0d1-42b5-9e1e-31ba44c9261d-catalog-content\") pod \"redhat-marketplace-h2dn4\" (UID: \"50ec25aa-a0d1-42b5-9e1e-31ba44c9261d\") " pod="openshift-marketplace/redhat-marketplace-h2dn4"
Dec 16 08:36:24 crc kubenswrapper[4823]: I1216 08:36:24.136250 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50ec25aa-a0d1-42b5-9e1e-31ba44c9261d-utilities\") pod \"redhat-marketplace-h2dn4\" (UID: \"50ec25aa-a0d1-42b5-9e1e-31ba44c9261d\") " pod="openshift-marketplace/redhat-marketplace-h2dn4"
Dec 16 08:36:24 crc kubenswrapper[4823]: I1216 08:36:24.237086 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqjx9\" (UniqueName: \"kubernetes.io/projected/50ec25aa-a0d1-42b5-9e1e-31ba44c9261d-kube-api-access-hqjx9\") pod \"redhat-marketplace-h2dn4\" (UID: \"50ec25aa-a0d1-42b5-9e1e-31ba44c9261d\") " pod="openshift-marketplace/redhat-marketplace-h2dn4"
Dec 16 08:36:24 crc kubenswrapper[4823]: I1216 08:36:24.237520 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50ec25aa-a0d1-42b5-9e1e-31ba44c9261d-catalog-content\") pod \"redhat-marketplace-h2dn4\" (UID: \"50ec25aa-a0d1-42b5-9e1e-31ba44c9261d\") " pod="openshift-marketplace/redhat-marketplace-h2dn4"
Dec 16 08:36:24 crc kubenswrapper[4823]: I1216 08:36:24.238124 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50ec25aa-a0d1-42b5-9e1e-31ba44c9261d-catalog-content\") pod \"redhat-marketplace-h2dn4\" (UID: \"50ec25aa-a0d1-42b5-9e1e-31ba44c9261d\") " pod="openshift-marketplace/redhat-marketplace-h2dn4"
Dec 16 08:36:24 crc kubenswrapper[4823]: I1216 08:36:24.238278 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50ec25aa-a0d1-42b5-9e1e-31ba44c9261d-utilities\") pod \"redhat-marketplace-h2dn4\" (UID: \"50ec25aa-a0d1-42b5-9e1e-31ba44c9261d\") " pod="openshift-marketplace/redhat-marketplace-h2dn4"
Dec 16 08:36:24 crc kubenswrapper[4823]: I1216 08:36:24.238577 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50ec25aa-a0d1-42b5-9e1e-31ba44c9261d-utilities\") pod \"redhat-marketplace-h2dn4\" (UID: \"50ec25aa-a0d1-42b5-9e1e-31ba44c9261d\") " pod="openshift-marketplace/redhat-marketplace-h2dn4"
Dec 16 08:36:24 crc kubenswrapper[4823]: I1216 08:36:24.254841 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqjx9\" (UniqueName: \"kubernetes.io/projected/50ec25aa-a0d1-42b5-9e1e-31ba44c9261d-kube-api-access-hqjx9\") pod \"redhat-marketplace-h2dn4\" (UID: \"50ec25aa-a0d1-42b5-9e1e-31ba44c9261d\") " pod="openshift-marketplace/redhat-marketplace-h2dn4"
Dec 16 08:36:24 crc kubenswrapper[4823]: I1216 08:36:24.418875 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h2dn4"
Dec 16 08:36:24 crc kubenswrapper[4823]: I1216 08:36:24.880195 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-h2dn4"]
Dec 16 08:36:25 crc kubenswrapper[4823]: I1216 08:36:25.346773 4823 generic.go:334] "Generic (PLEG): container finished" podID="50ec25aa-a0d1-42b5-9e1e-31ba44c9261d" containerID="a2bb1d4b0a16f4f15b3c12ac52800ed657ad076b08d0c45e389e8244a0240ed0" exitCode=0
Dec 16 08:36:25 crc kubenswrapper[4823]: I1216 08:36:25.346856 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h2dn4" event={"ID":"50ec25aa-a0d1-42b5-9e1e-31ba44c9261d","Type":"ContainerDied","Data":"a2bb1d4b0a16f4f15b3c12ac52800ed657ad076b08d0c45e389e8244a0240ed0"}
Dec 16 08:36:25 crc kubenswrapper[4823]: I1216 08:36:25.346896 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h2dn4" event={"ID":"50ec25aa-a0d1-42b5-9e1e-31ba44c9261d","Type":"ContainerStarted","Data":"450b3ab2be4f378c1c52972857ea5a3eec054aa2065dde020822763640d70060"}
Dec 16 08:36:26 crc kubenswrapper[4823]: I1216 08:36:26.355313 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h2dn4" event={"ID":"50ec25aa-a0d1-42b5-9e1e-31ba44c9261d","Type":"ContainerStarted","Data":"8469ac59bc39283c6128b5f2f092c042d3a03b14976d70043794bb179ad013d1"}
Dec 16 08:36:27 crc kubenswrapper[4823]: I1216 08:36:27.362877 4823 generic.go:334] "Generic (PLEG): container finished" podID="50ec25aa-a0d1-42b5-9e1e-31ba44c9261d" containerID="8469ac59bc39283c6128b5f2f092c042d3a03b14976d70043794bb179ad013d1" exitCode=0
Dec 16 08:36:27 crc kubenswrapper[4823]: I1216 08:36:27.362920 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h2dn4" event={"ID":"50ec25aa-a0d1-42b5-9e1e-31ba44c9261d","Type":"ContainerDied","Data":"8469ac59bc39283c6128b5f2f092c042d3a03b14976d70043794bb179ad013d1"}
Dec 16 08:36:28 crc kubenswrapper[4823]: I1216 08:36:28.373588 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h2dn4" event={"ID":"50ec25aa-a0d1-42b5-9e1e-31ba44c9261d","Type":"ContainerStarted","Data":"23515da0416a098678b4f075765b9fe33733038976419f9900867b00c0495663"}
Dec 16 08:36:28 crc kubenswrapper[4823]: I1216 08:36:28.394737 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-h2dn4" podStartSLOduration=1.90041405 podStartE2EDuration="4.394723213s" podCreationTimestamp="2025-12-16 08:36:24 +0000 UTC" firstStartedPulling="2025-12-16 08:36:25.3493349 +0000 UTC m=+6063.837901023" lastFinishedPulling="2025-12-16 08:36:27.843644073 +0000 UTC m=+6066.332210186" observedRunningTime="2025-12-16 08:36:28.393529986 +0000 UTC m=+6066.882096109" watchObservedRunningTime="2025-12-16 08:36:28.394723213 +0000 UTC m=+6066.883289336"
Dec 16 08:36:34 crc kubenswrapper[4823]: I1216 08:36:34.419763 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-h2dn4"
Dec 16 08:36:34 crc kubenswrapper[4823]: I1216 08:36:34.421471 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-h2dn4"
Dec 16 08:36:34 crc kubenswrapper[4823]: I1216 08:36:34.471813 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-h2dn4"
Dec 16 08:36:35 crc kubenswrapper[4823]: I1216 08:36:35.485765 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-h2dn4"
Dec 16 08:36:35 crc kubenswrapper[4823]: I1216 08:36:35.540920 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-h2dn4"]
Dec 16 08:36:37 crc kubenswrapper[4823]: I1216 08:36:37.449238 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-h2dn4" podUID="50ec25aa-a0d1-42b5-9e1e-31ba44c9261d" containerName="registry-server" containerID="cri-o://23515da0416a098678b4f075765b9fe33733038976419f9900867b00c0495663" gracePeriod=2
Dec 16 08:36:38 crc kubenswrapper[4823]: I1216 08:36:38.346220 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h2dn4"
Dec 16 08:36:38 crc kubenswrapper[4823]: I1216 08:36:38.440199 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50ec25aa-a0d1-42b5-9e1e-31ba44c9261d-catalog-content\") pod \"50ec25aa-a0d1-42b5-9e1e-31ba44c9261d\" (UID: \"50ec25aa-a0d1-42b5-9e1e-31ba44c9261d\") "
Dec 16 08:36:38 crc kubenswrapper[4823]: I1216 08:36:38.440362 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hqjx9\" (UniqueName: \"kubernetes.io/projected/50ec25aa-a0d1-42b5-9e1e-31ba44c9261d-kube-api-access-hqjx9\") pod \"50ec25aa-a0d1-42b5-9e1e-31ba44c9261d\" (UID: \"50ec25aa-a0d1-42b5-9e1e-31ba44c9261d\") "
Dec 16 08:36:38 crc kubenswrapper[4823]: I1216 08:36:38.440508 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50ec25aa-a0d1-42b5-9e1e-31ba44c9261d-utilities\") pod \"50ec25aa-a0d1-42b5-9e1e-31ba44c9261d\" (UID: \"50ec25aa-a0d1-42b5-9e1e-31ba44c9261d\") "
Dec 16 08:36:38 crc kubenswrapper[4823]: I1216 08:36:38.442363 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50ec25aa-a0d1-42b5-9e1e-31ba44c9261d-utilities" (OuterVolumeSpecName: "utilities") pod "50ec25aa-a0d1-42b5-9e1e-31ba44c9261d" (UID: "50ec25aa-a0d1-42b5-9e1e-31ba44c9261d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 16 08:36:38 crc kubenswrapper[4823]: I1216 08:36:38.451744 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50ec25aa-a0d1-42b5-9e1e-31ba44c9261d-kube-api-access-hqjx9" (OuterVolumeSpecName: "kube-api-access-hqjx9") pod "50ec25aa-a0d1-42b5-9e1e-31ba44c9261d" (UID: "50ec25aa-a0d1-42b5-9e1e-31ba44c9261d"). InnerVolumeSpecName "kube-api-access-hqjx9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 08:36:38 crc kubenswrapper[4823]: I1216 08:36:38.460421 4823 generic.go:334] "Generic (PLEG): container finished" podID="50ec25aa-a0d1-42b5-9e1e-31ba44c9261d" containerID="23515da0416a098678b4f075765b9fe33733038976419f9900867b00c0495663" exitCode=0
Dec 16 08:36:38 crc kubenswrapper[4823]: I1216 08:36:38.460491 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h2dn4" event={"ID":"50ec25aa-a0d1-42b5-9e1e-31ba44c9261d","Type":"ContainerDied","Data":"23515da0416a098678b4f075765b9fe33733038976419f9900867b00c0495663"}
Dec 16 08:36:38 crc kubenswrapper[4823]: I1216 08:36:38.460533 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h2dn4" event={"ID":"50ec25aa-a0d1-42b5-9e1e-31ba44c9261d","Type":"ContainerDied","Data":"450b3ab2be4f378c1c52972857ea5a3eec054aa2065dde020822763640d70060"}
Dec 16 08:36:38 crc kubenswrapper[4823]: I1216 08:36:38.460556 4823 scope.go:117] "RemoveContainer" containerID="23515da0416a098678b4f075765b9fe33733038976419f9900867b00c0495663"
Dec 16 08:36:38 crc kubenswrapper[4823]: I1216 08:36:38.460715 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h2dn4"
Dec 16 08:36:38 crc kubenswrapper[4823]: I1216 08:36:38.487789 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50ec25aa-a0d1-42b5-9e1e-31ba44c9261d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "50ec25aa-a0d1-42b5-9e1e-31ba44c9261d" (UID: "50ec25aa-a0d1-42b5-9e1e-31ba44c9261d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 16 08:36:38 crc kubenswrapper[4823]: I1216 08:36:38.494723 4823 scope.go:117] "RemoveContainer" containerID="8469ac59bc39283c6128b5f2f092c042d3a03b14976d70043794bb179ad013d1"
Dec 16 08:36:38 crc kubenswrapper[4823]: I1216 08:36:38.509277 4823 scope.go:117] "RemoveContainer" containerID="a2bb1d4b0a16f4f15b3c12ac52800ed657ad076b08d0c45e389e8244a0240ed0"
Dec 16 08:36:38 crc kubenswrapper[4823]: I1216 08:36:38.531347 4823 scope.go:117] "RemoveContainer" containerID="23515da0416a098678b4f075765b9fe33733038976419f9900867b00c0495663"
Dec 16 08:36:38 crc kubenswrapper[4823]: E1216 08:36:38.531727 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23515da0416a098678b4f075765b9fe33733038976419f9900867b00c0495663\": container with ID starting with 23515da0416a098678b4f075765b9fe33733038976419f9900867b00c0495663 not found: ID does not exist" containerID="23515da0416a098678b4f075765b9fe33733038976419f9900867b00c0495663"
Dec 16 08:36:38 crc kubenswrapper[4823]: I1216 08:36:38.531761 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23515da0416a098678b4f075765b9fe33733038976419f9900867b00c0495663"} err="failed to get container status \"23515da0416a098678b4f075765b9fe33733038976419f9900867b00c0495663\": rpc error: code = NotFound desc = could not find container \"23515da0416a098678b4f075765b9fe33733038976419f9900867b00c0495663\": container with ID starting with 23515da0416a098678b4f075765b9fe33733038976419f9900867b00c0495663 not found: ID does not exist"
Dec 16 08:36:38 crc kubenswrapper[4823]: I1216 08:36:38.531781 4823 scope.go:117] "RemoveContainer" containerID="8469ac59bc39283c6128b5f2f092c042d3a03b14976d70043794bb179ad013d1"
Dec 16 08:36:38 crc kubenswrapper[4823]: E1216 08:36:38.532035 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8469ac59bc39283c6128b5f2f092c042d3a03b14976d70043794bb179ad013d1\": container with ID starting with 8469ac59bc39283c6128b5f2f092c042d3a03b14976d70043794bb179ad013d1 not found: ID does not exist" containerID="8469ac59bc39283c6128b5f2f092c042d3a03b14976d70043794bb179ad013d1"
Dec 16 08:36:38 crc kubenswrapper[4823]: I1216 08:36:38.532055 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8469ac59bc39283c6128b5f2f092c042d3a03b14976d70043794bb179ad013d1"} err="failed to get container status \"8469ac59bc39283c6128b5f2f092c042d3a03b14976d70043794bb179ad013d1\": rpc error: code = NotFound desc = could not find container \"8469ac59bc39283c6128b5f2f092c042d3a03b14976d70043794bb179ad013d1\": container with ID starting with 8469ac59bc39283c6128b5f2f092c042d3a03b14976d70043794bb179ad013d1 not found: ID does not exist"
Dec 16 08:36:38 crc kubenswrapper[4823]: I1216 08:36:38.532074 4823 scope.go:117] "RemoveContainer" containerID="a2bb1d4b0a16f4f15b3c12ac52800ed657ad076b08d0c45e389e8244a0240ed0"
Dec 16 08:36:38 crc kubenswrapper[4823]: E1216 08:36:38.532281 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2bb1d4b0a16f4f15b3c12ac52800ed657ad076b08d0c45e389e8244a0240ed0\": container with ID starting with a2bb1d4b0a16f4f15b3c12ac52800ed657ad076b08d0c45e389e8244a0240ed0 not found: ID does not exist" containerID="a2bb1d4b0a16f4f15b3c12ac52800ed657ad076b08d0c45e389e8244a0240ed0"
Dec 16 08:36:38 crc kubenswrapper[4823]: I1216 08:36:38.532316 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2bb1d4b0a16f4f15b3c12ac52800ed657ad076b08d0c45e389e8244a0240ed0"} err="failed to get container status \"a2bb1d4b0a16f4f15b3c12ac52800ed657ad076b08d0c45e389e8244a0240ed0\": rpc error: code = NotFound desc = could not find container \"a2bb1d4b0a16f4f15b3c12ac52800ed657ad076b08d0c45e389e8244a0240ed0\": container with ID starting with a2bb1d4b0a16f4f15b3c12ac52800ed657ad076b08d0c45e389e8244a0240ed0 not found: ID does not exist"
Dec 16 08:36:38 crc kubenswrapper[4823]: I1216 08:36:38.542771 4823 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50ec25aa-a0d1-42b5-9e1e-31ba44c9261d-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 16 08:36:38 crc kubenswrapper[4823]: I1216 08:36:38.542796 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hqjx9\" (UniqueName: \"kubernetes.io/projected/50ec25aa-a0d1-42b5-9e1e-31ba44c9261d-kube-api-access-hqjx9\") on node \"crc\" DevicePath \"\""
Dec 16 08:36:38 crc kubenswrapper[4823]: I1216 08:36:38.542807 4823 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50ec25aa-a0d1-42b5-9e1e-31ba44c9261d-utilities\") on node \"crc\" DevicePath \"\""
Dec 16 08:36:38 crc kubenswrapper[4823]: I1216 08:36:38.789762 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-h2dn4"]
Dec 16 08:36:38 crc kubenswrapper[4823]: I1216 08:36:38.794777 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-h2dn4"]
Dec 16 08:36:39 crc kubenswrapper[4823]: I1216 08:36:39.781296 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50ec25aa-a0d1-42b5-9e1e-31ba44c9261d" path="/var/lib/kubelet/pods/50ec25aa-a0d1-42b5-9e1e-31ba44c9261d/volumes"
Dec 16 08:36:42 crc kubenswrapper[4823]: I1216 08:36:42.144230 4823 scope.go:117] "RemoveContainer" containerID="63017cb666caabb666f002c8b5a3b7bf50de129e0f932a0af8fdbeda724d4412"
Dec 16 08:37:28 crc kubenswrapper[4823]: I1216 08:37:28.134383 4823 patch_prober.go:28] interesting pod/machine-config-daemon-fv56f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 16 08:37:28 crc kubenswrapper[4823]: I1216 08:37:28.134906 4823 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 16 08:37:58 crc kubenswrapper[4823]: I1216 08:37:58.133738 4823 patch_prober.go:28] interesting pod/machine-config-daemon-fv56f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 16 08:37:58 crc kubenswrapper[4823]: I1216 08:37:58.134299 4823 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 16 08:38:23 crc kubenswrapper[4823]: I1216 08:38:23.943825 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6d854b9c6f-5xkzl"]
Dec 16 08:38:23 crc kubenswrapper[4823]: E1216 08:38:23.944738 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50ec25aa-a0d1-42b5-9e1e-31ba44c9261d" containerName="registry-server"
Dec 16 08:38:23 crc kubenswrapper[4823]: I1216 08:38:23.944758 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="50ec25aa-a0d1-42b5-9e1e-31ba44c9261d" containerName="registry-server"
Dec 16 08:38:23 crc kubenswrapper[4823]: E1216 08:38:23.944780 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50ec25aa-a0d1-42b5-9e1e-31ba44c9261d" containerName="extract-utilities"
Dec 16 08:38:23 crc kubenswrapper[4823]: I1216 08:38:23.944788 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="50ec25aa-a0d1-42b5-9e1e-31ba44c9261d" containerName="extract-utilities"
Dec 16 08:38:23 crc kubenswrapper[4823]: E1216 08:38:23.944797 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50ec25aa-a0d1-42b5-9e1e-31ba44c9261d" containerName="extract-content"
Dec 16 08:38:23 crc kubenswrapper[4823]: I1216 08:38:23.944805 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="50ec25aa-a0d1-42b5-9e1e-31ba44c9261d" containerName="extract-content"
Dec 16 08:38:23 crc kubenswrapper[4823]: I1216 08:38:23.945004 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="50ec25aa-a0d1-42b5-9e1e-31ba44c9261d" containerName="registry-server"
Dec 16 08:38:23 crc kubenswrapper[4823]: I1216 08:38:23.945904 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d854b9c6f-5xkzl"
Dec 16 08:38:23 crc kubenswrapper[4823]: I1216 08:38:23.947724 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-wrp69"
Dec 16 08:38:23 crc kubenswrapper[4823]: I1216 08:38:23.947962 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt"
Dec 16 08:38:23 crc kubenswrapper[4823]: I1216 08:38:23.947967 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt"
Dec 16 08:38:23 crc kubenswrapper[4823]: I1216 08:38:23.947967 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns"
Dec 16 08:38:23 crc kubenswrapper[4823]: I1216 08:38:23.960448 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d854b9c6f-5xkzl"]
Dec 16 08:38:23 crc kubenswrapper[4823]: I1216 08:38:23.961159 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zw2hz\" (UniqueName: \"kubernetes.io/projected/334290ef-5ba0-4690-be5b-768d703e9610-kube-api-access-zw2hz\") pod \"dnsmasq-dns-6d854b9c6f-5xkzl\" (UID: \"334290ef-5ba0-4690-be5b-768d703e9610\") " pod="openstack/dnsmasq-dns-6d854b9c6f-5xkzl"
Dec 16 08:38:23 crc kubenswrapper[4823]: I1216 08:38:23.961280 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/334290ef-5ba0-4690-be5b-768d703e9610-config\") pod \"dnsmasq-dns-6d854b9c6f-5xkzl\" (UID: \"334290ef-5ba0-4690-be5b-768d703e9610\") " pod="openstack/dnsmasq-dns-6d854b9c6f-5xkzl"
Dec 16 08:38:23 crc kubenswrapper[4823]: I1216 08:38:23.982293 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6fcfb4c6f9-txnts"]
Dec 16 08:38:23 crc kubenswrapper[4823]: I1216 08:38:23.983823 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6fcfb4c6f9-txnts"
Dec 16 08:38:23 crc kubenswrapper[4823]: I1216 08:38:23.989631 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc"
Dec 16 08:38:24 crc kubenswrapper[4823]: I1216 08:38:24.002772 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6fcfb4c6f9-txnts"]
Dec 16 08:38:24 crc kubenswrapper[4823]: I1216 08:38:24.062665 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/334290ef-5ba0-4690-be5b-768d703e9610-config\") pod \"dnsmasq-dns-6d854b9c6f-5xkzl\" (UID: \"334290ef-5ba0-4690-be5b-768d703e9610\") " pod="openstack/dnsmasq-dns-6d854b9c6f-5xkzl"
Dec 16 08:38:24 crc kubenswrapper[4823]: I1216 08:38:24.062774 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e3f5eef-3838-48ea-af3e-a0d4b5e0d027-config\") pod \"dnsmasq-dns-6fcfb4c6f9-txnts\" (UID: \"7e3f5eef-3838-48ea-af3e-a0d4b5e0d027\") " pod="openstack/dnsmasq-dns-6fcfb4c6f9-txnts"
Dec 16 08:38:24 crc kubenswrapper[4823]: I1216 08:38:24.062824 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfm9q\" (UniqueName: \"kubernetes.io/projected/7e3f5eef-3838-48ea-af3e-a0d4b5e0d027-kube-api-access-qfm9q\") pod \"dnsmasq-dns-6fcfb4c6f9-txnts\" (UID: \"7e3f5eef-3838-48ea-af3e-a0d4b5e0d027\") " pod="openstack/dnsmasq-dns-6fcfb4c6f9-txnts"
Dec 16 08:38:24 crc kubenswrapper[4823]: I1216 08:38:24.062856 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zw2hz\" (UniqueName: \"kubernetes.io/projected/334290ef-5ba0-4690-be5b-768d703e9610-kube-api-access-zw2hz\") pod \"dnsmasq-dns-6d854b9c6f-5xkzl\" (UID: \"334290ef-5ba0-4690-be5b-768d703e9610\") " pod="openstack/dnsmasq-dns-6d854b9c6f-5xkzl"
Dec 16 08:38:24 crc kubenswrapper[4823]: I1216 08:38:24.063039 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7e3f5eef-3838-48ea-af3e-a0d4b5e0d027-dns-svc\") pod \"dnsmasq-dns-6fcfb4c6f9-txnts\" (UID: \"7e3f5eef-3838-48ea-af3e-a0d4b5e0d027\") " pod="openstack/dnsmasq-dns-6fcfb4c6f9-txnts"
Dec 16 08:38:24 crc kubenswrapper[4823]: I1216 08:38:24.063758 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/334290ef-5ba0-4690-be5b-768d703e9610-config\") pod \"dnsmasq-dns-6d854b9c6f-5xkzl\" (UID: \"334290ef-5ba0-4690-be5b-768d703e9610\") " pod="openstack/dnsmasq-dns-6d854b9c6f-5xkzl"
Dec 16 08:38:24 crc kubenswrapper[4823]: I1216 08:38:24.095288 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zw2hz\" (UniqueName: \"kubernetes.io/projected/334290ef-5ba0-4690-be5b-768d703e9610-kube-api-access-zw2hz\") pod \"dnsmasq-dns-6d854b9c6f-5xkzl\" (UID: \"334290ef-5ba0-4690-be5b-768d703e9610\") " pod="openstack/dnsmasq-dns-6d854b9c6f-5xkzl"
Dec 16 08:38:24 crc kubenswrapper[4823]: I1216 08:38:24.164688 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e3f5eef-3838-48ea-af3e-a0d4b5e0d027-config\") pod \"dnsmasq-dns-6fcfb4c6f9-txnts\" (UID: \"7e3f5eef-3838-48ea-af3e-a0d4b5e0d027\") " pod="openstack/dnsmasq-dns-6fcfb4c6f9-txnts"
Dec 16 08:38:24 crc kubenswrapper[4823]: I1216 08:38:24.165128 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qfm9q\" (UniqueName: \"kubernetes.io/projected/7e3f5eef-3838-48ea-af3e-a0d4b5e0d027-kube-api-access-qfm9q\") pod \"dnsmasq-dns-6fcfb4c6f9-txnts\" (UID: \"7e3f5eef-3838-48ea-af3e-a0d4b5e0d027\") " pod="openstack/dnsmasq-dns-6fcfb4c6f9-txnts"
Dec 16 08:38:24 crc kubenswrapper[4823]: I1216 08:38:24.165195 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7e3f5eef-3838-48ea-af3e-a0d4b5e0d027-dns-svc\") pod \"dnsmasq-dns-6fcfb4c6f9-txnts\" (UID: \"7e3f5eef-3838-48ea-af3e-a0d4b5e0d027\") " pod="openstack/dnsmasq-dns-6fcfb4c6f9-txnts"
Dec 16 08:38:24 crc kubenswrapper[4823]: I1216 08:38:24.165650 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e3f5eef-3838-48ea-af3e-a0d4b5e0d027-config\") pod \"dnsmasq-dns-6fcfb4c6f9-txnts\" (UID: \"7e3f5eef-3838-48ea-af3e-a0d4b5e0d027\") " pod="openstack/dnsmasq-dns-6fcfb4c6f9-txnts"
Dec 16 08:38:24 crc kubenswrapper[4823]: I1216 08:38:24.166102 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7e3f5eef-3838-48ea-af3e-a0d4b5e0d027-dns-svc\") pod \"dnsmasq-dns-6fcfb4c6f9-txnts\" (UID: \"7e3f5eef-3838-48ea-af3e-a0d4b5e0d027\") " pod="openstack/dnsmasq-dns-6fcfb4c6f9-txnts"
Dec 16 08:38:24 crc kubenswrapper[4823]: I1216 08:38:24.188223 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfm9q\" (UniqueName: \"kubernetes.io/projected/7e3f5eef-3838-48ea-af3e-a0d4b5e0d027-kube-api-access-qfm9q\") pod \"dnsmasq-dns-6fcfb4c6f9-txnts\" (UID: \"7e3f5eef-3838-48ea-af3e-a0d4b5e0d027\") " pod="openstack/dnsmasq-dns-6fcfb4c6f9-txnts"
Dec 16 08:38:24 crc kubenswrapper[4823]: I1216 08:38:24.256452 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d854b9c6f-5xkzl"]
Dec 16 08:38:24 crc kubenswrapper[4823]: I1216 08:38:24.256898 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d854b9c6f-5xkzl"
Dec 16 08:38:24 crc kubenswrapper[4823]: I1216 08:38:24.284257 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6fdf89db6c-rtjsh"]
Dec 16 08:38:24 crc kubenswrapper[4823]: I1216 08:38:24.285328 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6fdf89db6c-rtjsh"
Dec 16 08:38:24 crc kubenswrapper[4823]: I1216 08:38:24.308477 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6fdf89db6c-rtjsh"]
Dec 16 08:38:24 crc kubenswrapper[4823]: I1216 08:38:24.312942 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6fcfb4c6f9-txnts"
Dec 16 08:38:24 crc kubenswrapper[4823]: I1216 08:38:24.367964 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-csdw7\" (UniqueName: \"kubernetes.io/projected/6c1ab111-67ba-4e81-8879-e8c1d0b00b1f-kube-api-access-csdw7\") pod \"dnsmasq-dns-6fdf89db6c-rtjsh\" (UID: \"6c1ab111-67ba-4e81-8879-e8c1d0b00b1f\") " pod="openstack/dnsmasq-dns-6fdf89db6c-rtjsh"
Dec 16 08:38:24 crc kubenswrapper[4823]: I1216 08:38:24.368042 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c1ab111-67ba-4e81-8879-e8c1d0b00b1f-config\") pod \"dnsmasq-dns-6fdf89db6c-rtjsh\" (UID: \"6c1ab111-67ba-4e81-8879-e8c1d0b00b1f\") " pod="openstack/dnsmasq-dns-6fdf89db6c-rtjsh"
Dec 16 08:38:24 crc kubenswrapper[4823]: I1216 08:38:24.368140 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6c1ab111-67ba-4e81-8879-e8c1d0b00b1f-dns-svc\") pod \"dnsmasq-dns-6fdf89db6c-rtjsh\" (UID: \"6c1ab111-67ba-4e81-8879-e8c1d0b00b1f\") " pod="openstack/dnsmasq-dns-6fdf89db6c-rtjsh"
Dec 16 08:38:24 crc kubenswrapper[4823]: I1216 08:38:24.469642 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6c1ab111-67ba-4e81-8879-e8c1d0b00b1f-dns-svc\") pod \"dnsmasq-dns-6fdf89db6c-rtjsh\" (UID: \"6c1ab111-67ba-4e81-8879-e8c1d0b00b1f\") " pod="openstack/dnsmasq-dns-6fdf89db6c-rtjsh"
Dec 16 08:38:24 crc kubenswrapper[4823]: I1216 08:38:24.469721 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-csdw7\" (UniqueName: \"kubernetes.io/projected/6c1ab111-67ba-4e81-8879-e8c1d0b00b1f-kube-api-access-csdw7\") pod \"dnsmasq-dns-6fdf89db6c-rtjsh\" (UID: \"6c1ab111-67ba-4e81-8879-e8c1d0b00b1f\") " pod="openstack/dnsmasq-dns-6fdf89db6c-rtjsh"
Dec 16 08:38:24 crc kubenswrapper[4823]: I1216 08:38:24.469753 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c1ab111-67ba-4e81-8879-e8c1d0b00b1f-config\") pod \"dnsmasq-dns-6fdf89db6c-rtjsh\" (UID: \"6c1ab111-67ba-4e81-8879-e8c1d0b00b1f\") " pod="openstack/dnsmasq-dns-6fdf89db6c-rtjsh"
Dec 16 08:38:24 crc kubenswrapper[4823]: I1216 08:38:24.470651 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c1ab111-67ba-4e81-8879-e8c1d0b00b1f-config\") pod \"dnsmasq-dns-6fdf89db6c-rtjsh\" (UID: \"6c1ab111-67ba-4e81-8879-e8c1d0b00b1f\") " pod="openstack/dnsmasq-dns-6fdf89db6c-rtjsh"
Dec 16 08:38:24 crc kubenswrapper[4823]: I1216 08:38:24.470781 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6c1ab111-67ba-4e81-8879-e8c1d0b00b1f-dns-svc\") pod \"dnsmasq-dns-6fdf89db6c-rtjsh\" (UID: \"6c1ab111-67ba-4e81-8879-e8c1d0b00b1f\") " pod="openstack/dnsmasq-dns-6fdf89db6c-rtjsh"
Dec 16 08:38:24 crc kubenswrapper[4823]: I1216 08:38:24.495494 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-csdw7\" (UniqueName: \"kubernetes.io/projected/6c1ab111-67ba-4e81-8879-e8c1d0b00b1f-kube-api-access-csdw7\") pod \"dnsmasq-dns-6fdf89db6c-rtjsh\" (UID: \"6c1ab111-67ba-4e81-8879-e8c1d0b00b1f\") " pod="openstack/dnsmasq-dns-6fdf89db6c-rtjsh"
Dec 16 08:38:24 crc kubenswrapper[4823]: I1216 08:38:24.573757 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6fcfb4c6f9-txnts"]
Dec 16 08:38:24 crc kubenswrapper[4823]: I1216 08:38:24.609185 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57484c487-k9rhm"]
Dec 16 08:38:24 crc kubenswrapper[4823]: I1216 08:38:24.612616 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57484c487-k9rhm"
Dec 16 08:38:24 crc kubenswrapper[4823]: I1216 08:38:24.621193 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57484c487-k9rhm"]
Dec 16 08:38:24 crc kubenswrapper[4823]: I1216 08:38:24.650014 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6fdf89db6c-rtjsh"
Dec 16 08:38:24 crc kubenswrapper[4823]: I1216 08:38:24.673259 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0565c2a-5f25-4b81-b27d-764c8dc154b9-config\") pod \"dnsmasq-dns-57484c487-k9rhm\" (UID: \"e0565c2a-5f25-4b81-b27d-764c8dc154b9\") " pod="openstack/dnsmasq-dns-57484c487-k9rhm"
Dec 16 08:38:24 crc kubenswrapper[4823]: I1216 08:38:24.673322 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pl5gs\" (UniqueName: \"kubernetes.io/projected/e0565c2a-5f25-4b81-b27d-764c8dc154b9-kube-api-access-pl5gs\") pod \"dnsmasq-dns-57484c487-k9rhm\" (UID: \"e0565c2a-5f25-4b81-b27d-764c8dc154b9\") " pod="openstack/dnsmasq-dns-57484c487-k9rhm"
Dec 16 08:38:24 crc kubenswrapper[4823]: I1216 08:38:24.673347 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e0565c2a-5f25-4b81-b27d-764c8dc154b9-dns-svc\") pod \"dnsmasq-dns-57484c487-k9rhm\" (UID: \"e0565c2a-5f25-4b81-b27d-764c8dc154b9\") " pod="openstack/dnsmasq-dns-57484c487-k9rhm"
Dec 16 08:38:24 crc kubenswrapper[4823]: I1216 08:38:24.775179 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pl5gs\" (UniqueName: \"kubernetes.io/projected/e0565c2a-5f25-4b81-b27d-764c8dc154b9-kube-api-access-pl5gs\") pod \"dnsmasq-dns-57484c487-k9rhm\" (UID: \"e0565c2a-5f25-4b81-b27d-764c8dc154b9\") " pod="openstack/dnsmasq-dns-57484c487-k9rhm"
Dec 16 08:38:24 crc kubenswrapper[4823]: I1216 08:38:24.775242 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e0565c2a-5f25-4b81-b27d-764c8dc154b9-dns-svc\") pod \"dnsmasq-dns-57484c487-k9rhm\" (UID: \"e0565c2a-5f25-4b81-b27d-764c8dc154b9\") " pod="openstack/dnsmasq-dns-57484c487-k9rhm"
Dec 16 08:38:24 crc kubenswrapper[4823]: I1216 08:38:24.775358 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0565c2a-5f25-4b81-b27d-764c8dc154b9-config\") pod \"dnsmasq-dns-57484c487-k9rhm\" (UID: \"e0565c2a-5f25-4b81-b27d-764c8dc154b9\") " pod="openstack/dnsmasq-dns-57484c487-k9rhm"
Dec 16 08:38:24 crc kubenswrapper[4823]: I1216 08:38:24.776435 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0565c2a-5f25-4b81-b27d-764c8dc154b9-config\") pod \"dnsmasq-dns-57484c487-k9rhm\" (UID: \"e0565c2a-5f25-4b81-b27d-764c8dc154b9\") " pod="openstack/dnsmasq-dns-57484c487-k9rhm"
Dec 16 08:38:24 crc kubenswrapper[4823]: I1216 08:38:24.776478 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e0565c2a-5f25-4b81-b27d-764c8dc154b9-dns-svc\") pod \"dnsmasq-dns-57484c487-k9rhm\" (UID: \"e0565c2a-5f25-4b81-b27d-764c8dc154b9\") " pod="openstack/dnsmasq-dns-57484c487-k9rhm"
Dec 16 08:38:24 crc kubenswrapper[4823]: I1216 08:38:24.794395 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pl5gs\" (UniqueName: \"kubernetes.io/projected/e0565c2a-5f25-4b81-b27d-764c8dc154b9-kube-api-access-pl5gs\") pod \"dnsmasq-dns-57484c487-k9rhm\" (UID: \"e0565c2a-5f25-4b81-b27d-764c8dc154b9\") " pod="openstack/dnsmasq-dns-57484c487-k9rhm"
Dec 16 08:38:24 crc kubenswrapper[4823]: I1216
08:38:24.822144 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d854b9c6f-5xkzl"] Dec 16 08:38:24 crc kubenswrapper[4823]: I1216 08:38:24.934154 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57484c487-k9rhm" Dec 16 08:38:24 crc kubenswrapper[4823]: I1216 08:38:24.938890 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6fcfb4c6f9-txnts"] Dec 16 08:38:25 crc kubenswrapper[4823]: I1216 08:38:25.159243 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6fdf89db6c-rtjsh"] Dec 16 08:38:25 crc kubenswrapper[4823]: W1216 08:38:25.167345 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6c1ab111_67ba_4e81_8879_e8c1d0b00b1f.slice/crio-f6a3e0753f6d55fcad970786e29cdde2dbbe538bee1258633a63b872f7c08bd9 WatchSource:0}: Error finding container f6a3e0753f6d55fcad970786e29cdde2dbbe538bee1258633a63b872f7c08bd9: Status 404 returned error can't find the container with id f6a3e0753f6d55fcad970786e29cdde2dbbe538bee1258633a63b872f7c08bd9 Dec 16 08:38:25 crc kubenswrapper[4823]: I1216 08:38:25.335039 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6fcfb4c6f9-txnts" event={"ID":"7e3f5eef-3838-48ea-af3e-a0d4b5e0d027","Type":"ContainerStarted","Data":"5b1e290bac0d7f3201d3e15862229e0e24b35f2e2c44b1917ff4077de0cddfc2"} Dec 16 08:38:25 crc kubenswrapper[4823]: I1216 08:38:25.337074 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6fdf89db6c-rtjsh" event={"ID":"6c1ab111-67ba-4e81-8879-e8c1d0b00b1f","Type":"ContainerStarted","Data":"f6a3e0753f6d55fcad970786e29cdde2dbbe538bee1258633a63b872f7c08bd9"} Dec 16 08:38:25 crc kubenswrapper[4823]: I1216 08:38:25.343007 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d854b9c6f-5xkzl" 
event={"ID":"334290ef-5ba0-4690-be5b-768d703e9610","Type":"ContainerStarted","Data":"44e9e2f2877fea0f7ff204c447e05dd5bf672164b277c8cd32371156b633b606"} Dec 16 08:38:25 crc kubenswrapper[4823]: I1216 08:38:25.362525 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57484c487-k9rhm"] Dec 16 08:38:25 crc kubenswrapper[4823]: W1216 08:38:25.371437 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode0565c2a_5f25_4b81_b27d_764c8dc154b9.slice/crio-e4a9b591a6c465261ac6151c96e3ec2bb782a0f62ae85bd2f9bb87b568533b5c WatchSource:0}: Error finding container e4a9b591a6c465261ac6151c96e3ec2bb782a0f62ae85bd2f9bb87b568533b5c: Status 404 returned error can't find the container with id e4a9b591a6c465261ac6151c96e3ec2bb782a0f62ae85bd2f9bb87b568533b5c Dec 16 08:38:25 crc kubenswrapper[4823]: I1216 08:38:25.435520 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 16 08:38:25 crc kubenswrapper[4823]: I1216 08:38:25.436911 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 16 08:38:25 crc kubenswrapper[4823]: I1216 08:38:25.443233 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 16 08:38:25 crc kubenswrapper[4823]: I1216 08:38:25.443501 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 16 08:38:25 crc kubenswrapper[4823]: I1216 08:38:25.449447 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 16 08:38:25 crc kubenswrapper[4823]: I1216 08:38:25.449468 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Dec 16 08:38:25 crc kubenswrapper[4823]: I1216 08:38:25.449521 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 16 08:38:25 crc kubenswrapper[4823]: I1216 08:38:25.449632 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Dec 16 08:38:25 crc kubenswrapper[4823]: I1216 08:38:25.449730 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-2twnw" Dec 16 08:38:25 crc kubenswrapper[4823]: I1216 08:38:25.476015 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 16 08:38:25 crc kubenswrapper[4823]: I1216 08:38:25.591862 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-43028e97-28b0-43cc-9122-0ff68d03ac47\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-43028e97-28b0-43cc-9122-0ff68d03ac47\") pod \"rabbitmq-server-0\" (UID: \"c541c676-03e3-4756-bbfa-770b1d9c3712\") " pod="openstack/rabbitmq-server-0" Dec 16 08:38:25 crc kubenswrapper[4823]: I1216 08:38:25.591941 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" 
(UniqueName: \"kubernetes.io/downward-api/c541c676-03e3-4756-bbfa-770b1d9c3712-pod-info\") pod \"rabbitmq-server-0\" (UID: \"c541c676-03e3-4756-bbfa-770b1d9c3712\") " pod="openstack/rabbitmq-server-0" Dec 16 08:38:25 crc kubenswrapper[4823]: I1216 08:38:25.592092 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8z9jx\" (UniqueName: \"kubernetes.io/projected/c541c676-03e3-4756-bbfa-770b1d9c3712-kube-api-access-8z9jx\") pod \"rabbitmq-server-0\" (UID: \"c541c676-03e3-4756-bbfa-770b1d9c3712\") " pod="openstack/rabbitmq-server-0" Dec 16 08:38:25 crc kubenswrapper[4823]: I1216 08:38:25.592139 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c541c676-03e3-4756-bbfa-770b1d9c3712-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"c541c676-03e3-4756-bbfa-770b1d9c3712\") " pod="openstack/rabbitmq-server-0" Dec 16 08:38:25 crc kubenswrapper[4823]: I1216 08:38:25.592241 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c541c676-03e3-4756-bbfa-770b1d9c3712-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"c541c676-03e3-4756-bbfa-770b1d9c3712\") " pod="openstack/rabbitmq-server-0" Dec 16 08:38:25 crc kubenswrapper[4823]: I1216 08:38:25.592276 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c541c676-03e3-4756-bbfa-770b1d9c3712-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"c541c676-03e3-4756-bbfa-770b1d9c3712\") " pod="openstack/rabbitmq-server-0" Dec 16 08:38:25 crc kubenswrapper[4823]: I1216 08:38:25.592299 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/c541c676-03e3-4756-bbfa-770b1d9c3712-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"c541c676-03e3-4756-bbfa-770b1d9c3712\") " pod="openstack/rabbitmq-server-0" Dec 16 08:38:25 crc kubenswrapper[4823]: I1216 08:38:25.592494 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c541c676-03e3-4756-bbfa-770b1d9c3712-server-conf\") pod \"rabbitmq-server-0\" (UID: \"c541c676-03e3-4756-bbfa-770b1d9c3712\") " pod="openstack/rabbitmq-server-0" Dec 16 08:38:25 crc kubenswrapper[4823]: I1216 08:38:25.592576 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c541c676-03e3-4756-bbfa-770b1d9c3712-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"c541c676-03e3-4756-bbfa-770b1d9c3712\") " pod="openstack/rabbitmq-server-0" Dec 16 08:38:25 crc kubenswrapper[4823]: I1216 08:38:25.592621 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c541c676-03e3-4756-bbfa-770b1d9c3712-config-data\") pod \"rabbitmq-server-0\" (UID: \"c541c676-03e3-4756-bbfa-770b1d9c3712\") " pod="openstack/rabbitmq-server-0" Dec 16 08:38:25 crc kubenswrapper[4823]: I1216 08:38:25.592655 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c541c676-03e3-4756-bbfa-770b1d9c3712-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"c541c676-03e3-4756-bbfa-770b1d9c3712\") " pod="openstack/rabbitmq-server-0" Dec 16 08:38:25 crc kubenswrapper[4823]: I1216 08:38:25.693679 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-43028e97-28b0-43cc-9122-0ff68d03ac47\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-43028e97-28b0-43cc-9122-0ff68d03ac47\") pod \"rabbitmq-server-0\" (UID: \"c541c676-03e3-4756-bbfa-770b1d9c3712\") " pod="openstack/rabbitmq-server-0" Dec 16 08:38:25 crc kubenswrapper[4823]: I1216 08:38:25.693733 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c541c676-03e3-4756-bbfa-770b1d9c3712-pod-info\") pod \"rabbitmq-server-0\" (UID: \"c541c676-03e3-4756-bbfa-770b1d9c3712\") " pod="openstack/rabbitmq-server-0" Dec 16 08:38:25 crc kubenswrapper[4823]: I1216 08:38:25.693769 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8z9jx\" (UniqueName: \"kubernetes.io/projected/c541c676-03e3-4756-bbfa-770b1d9c3712-kube-api-access-8z9jx\") pod \"rabbitmq-server-0\" (UID: \"c541c676-03e3-4756-bbfa-770b1d9c3712\") " pod="openstack/rabbitmq-server-0" Dec 16 08:38:25 crc kubenswrapper[4823]: I1216 08:38:25.693852 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c541c676-03e3-4756-bbfa-770b1d9c3712-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"c541c676-03e3-4756-bbfa-770b1d9c3712\") " pod="openstack/rabbitmq-server-0" Dec 16 08:38:25 crc kubenswrapper[4823]: I1216 08:38:25.693869 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c541c676-03e3-4756-bbfa-770b1d9c3712-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"c541c676-03e3-4756-bbfa-770b1d9c3712\") " pod="openstack/rabbitmq-server-0" Dec 16 08:38:25 crc kubenswrapper[4823]: I1216 08:38:25.693896 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c541c676-03e3-4756-bbfa-770b1d9c3712-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: 
\"c541c676-03e3-4756-bbfa-770b1d9c3712\") " pod="openstack/rabbitmq-server-0" Dec 16 08:38:25 crc kubenswrapper[4823]: I1216 08:38:25.693920 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c541c676-03e3-4756-bbfa-770b1d9c3712-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"c541c676-03e3-4756-bbfa-770b1d9c3712\") " pod="openstack/rabbitmq-server-0" Dec 16 08:38:25 crc kubenswrapper[4823]: I1216 08:38:25.693968 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c541c676-03e3-4756-bbfa-770b1d9c3712-server-conf\") pod \"rabbitmq-server-0\" (UID: \"c541c676-03e3-4756-bbfa-770b1d9c3712\") " pod="openstack/rabbitmq-server-0" Dec 16 08:38:25 crc kubenswrapper[4823]: I1216 08:38:25.694000 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c541c676-03e3-4756-bbfa-770b1d9c3712-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"c541c676-03e3-4756-bbfa-770b1d9c3712\") " pod="openstack/rabbitmq-server-0" Dec 16 08:38:25 crc kubenswrapper[4823]: I1216 08:38:25.694024 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c541c676-03e3-4756-bbfa-770b1d9c3712-config-data\") pod \"rabbitmq-server-0\" (UID: \"c541c676-03e3-4756-bbfa-770b1d9c3712\") " pod="openstack/rabbitmq-server-0" Dec 16 08:38:25 crc kubenswrapper[4823]: I1216 08:38:25.694059 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c541c676-03e3-4756-bbfa-770b1d9c3712-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"c541c676-03e3-4756-bbfa-770b1d9c3712\") " pod="openstack/rabbitmq-server-0" Dec 16 08:38:25 crc kubenswrapper[4823]: I1216 08:38:25.694480 4823 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c541c676-03e3-4756-bbfa-770b1d9c3712-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"c541c676-03e3-4756-bbfa-770b1d9c3712\") " pod="openstack/rabbitmq-server-0" Dec 16 08:38:25 crc kubenswrapper[4823]: I1216 08:38:25.699348 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c541c676-03e3-4756-bbfa-770b1d9c3712-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"c541c676-03e3-4756-bbfa-770b1d9c3712\") " pod="openstack/rabbitmq-server-0" Dec 16 08:38:25 crc kubenswrapper[4823]: I1216 08:38:25.700250 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c541c676-03e3-4756-bbfa-770b1d9c3712-config-data\") pod \"rabbitmq-server-0\" (UID: \"c541c676-03e3-4756-bbfa-770b1d9c3712\") " pod="openstack/rabbitmq-server-0" Dec 16 08:38:25 crc kubenswrapper[4823]: I1216 08:38:25.700293 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c541c676-03e3-4756-bbfa-770b1d9c3712-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"c541c676-03e3-4756-bbfa-770b1d9c3712\") " pod="openstack/rabbitmq-server-0" Dec 16 08:38:25 crc kubenswrapper[4823]: I1216 08:38:25.700713 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c541c676-03e3-4756-bbfa-770b1d9c3712-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"c541c676-03e3-4756-bbfa-770b1d9c3712\") " pod="openstack/rabbitmq-server-0" Dec 16 08:38:25 crc kubenswrapper[4823]: I1216 08:38:25.702721 4823 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 16 08:38:25 crc kubenswrapper[4823]: I1216 08:38:25.702758 4823 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-43028e97-28b0-43cc-9122-0ff68d03ac47\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-43028e97-28b0-43cc-9122-0ff68d03ac47\") pod \"rabbitmq-server-0\" (UID: \"c541c676-03e3-4756-bbfa-770b1d9c3712\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/32117fdd5ecf1abacaad99c35b93f20179735b8b5200ad830b36f632cf8604dd/globalmount\"" pod="openstack/rabbitmq-server-0" Dec 16 08:38:25 crc kubenswrapper[4823]: I1216 08:38:25.703150 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c541c676-03e3-4756-bbfa-770b1d9c3712-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"c541c676-03e3-4756-bbfa-770b1d9c3712\") " pod="openstack/rabbitmq-server-0" Dec 16 08:38:25 crc kubenswrapper[4823]: I1216 08:38:25.706368 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c541c676-03e3-4756-bbfa-770b1d9c3712-server-conf\") pod \"rabbitmq-server-0\" (UID: \"c541c676-03e3-4756-bbfa-770b1d9c3712\") " pod="openstack/rabbitmq-server-0" Dec 16 08:38:25 crc kubenswrapper[4823]: I1216 08:38:25.707051 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c541c676-03e3-4756-bbfa-770b1d9c3712-pod-info\") pod \"rabbitmq-server-0\" (UID: \"c541c676-03e3-4756-bbfa-770b1d9c3712\") " pod="openstack/rabbitmq-server-0" Dec 16 08:38:25 crc kubenswrapper[4823]: I1216 08:38:25.703516 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c541c676-03e3-4756-bbfa-770b1d9c3712-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"c541c676-03e3-4756-bbfa-770b1d9c3712\") " pod="openstack/rabbitmq-server-0" Dec 
16 08:38:25 crc kubenswrapper[4823]: I1216 08:38:25.739534 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8z9jx\" (UniqueName: \"kubernetes.io/projected/c541c676-03e3-4756-bbfa-770b1d9c3712-kube-api-access-8z9jx\") pod \"rabbitmq-server-0\" (UID: \"c541c676-03e3-4756-bbfa-770b1d9c3712\") " pod="openstack/rabbitmq-server-0" Dec 16 08:38:25 crc kubenswrapper[4823]: I1216 08:38:25.749246 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 16 08:38:25 crc kubenswrapper[4823]: I1216 08:38:25.759412 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-43028e97-28b0-43cc-9122-0ff68d03ac47\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-43028e97-28b0-43cc-9122-0ff68d03ac47\") pod \"rabbitmq-server-0\" (UID: \"c541c676-03e3-4756-bbfa-770b1d9c3712\") " pod="openstack/rabbitmq-server-0" Dec 16 08:38:25 crc kubenswrapper[4823]: I1216 08:38:25.767813 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 16 08:38:25 crc kubenswrapper[4823]: I1216 08:38:25.767932 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:38:25 crc kubenswrapper[4823]: I1216 08:38:25.770196 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 16 08:38:25 crc kubenswrapper[4823]: I1216 08:38:25.770439 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 16 08:38:25 crc kubenswrapper[4823]: I1216 08:38:25.771658 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 16 08:38:25 crc kubenswrapper[4823]: I1216 08:38:25.772629 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-lz7bq" Dec 16 08:38:25 crc kubenswrapper[4823]: I1216 08:38:25.772995 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Dec 16 08:38:25 crc kubenswrapper[4823]: I1216 08:38:25.773383 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 16 08:38:25 crc kubenswrapper[4823]: I1216 08:38:25.773867 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Dec 16 08:38:25 crc kubenswrapper[4823]: I1216 08:38:25.790938 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 16 08:38:25 crc kubenswrapper[4823]: I1216 08:38:25.896811 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3dd01190-adb1-4224-9a3d-b6da96bad2e8-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"3dd01190-adb1-4224-9a3d-b6da96bad2e8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:38:25 crc kubenswrapper[4823]: I1216 08:38:25.896914 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3dd01190-adb1-4224-9a3d-b6da96bad2e8-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"3dd01190-adb1-4224-9a3d-b6da96bad2e8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:38:25 crc kubenswrapper[4823]: I1216 08:38:25.896979 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3dd01190-adb1-4224-9a3d-b6da96bad2e8-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3dd01190-adb1-4224-9a3d-b6da96bad2e8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:38:25 crc kubenswrapper[4823]: I1216 08:38:25.897005 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3dd01190-adb1-4224-9a3d-b6da96bad2e8-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"3dd01190-adb1-4224-9a3d-b6da96bad2e8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:38:25 crc kubenswrapper[4823]: I1216 08:38:25.897072 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3dd01190-adb1-4224-9a3d-b6da96bad2e8-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"3dd01190-adb1-4224-9a3d-b6da96bad2e8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:38:25 crc kubenswrapper[4823]: I1216 08:38:25.897096 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3dd01190-adb1-4224-9a3d-b6da96bad2e8-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"3dd01190-adb1-4224-9a3d-b6da96bad2e8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:38:25 crc kubenswrapper[4823]: I1216 08:38:25.897136 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3dd01190-adb1-4224-9a3d-b6da96bad2e8-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"3dd01190-adb1-4224-9a3d-b6da96bad2e8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:38:25 crc kubenswrapper[4823]: I1216 08:38:25.897158 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3dd01190-adb1-4224-9a3d-b6da96bad2e8-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"3dd01190-adb1-4224-9a3d-b6da96bad2e8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:38:25 crc kubenswrapper[4823]: I1216 08:38:25.897187 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-0c2f342f-45ca-4f81-be5a-cc9b87688928\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0c2f342f-45ca-4f81-be5a-cc9b87688928\") pod \"rabbitmq-cell1-server-0\" (UID: \"3dd01190-adb1-4224-9a3d-b6da96bad2e8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:38:25 crc kubenswrapper[4823]: I1216 08:38:25.897204 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3dd01190-adb1-4224-9a3d-b6da96bad2e8-server-conf\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"3dd01190-adb1-4224-9a3d-b6da96bad2e8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:38:25 crc kubenswrapper[4823]: I1216 08:38:25.897226 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crkdq\" (UniqueName: \"kubernetes.io/projected/3dd01190-adb1-4224-9a3d-b6da96bad2e8-kube-api-access-crkdq\") pod \"rabbitmq-cell1-server-0\" (UID: \"3dd01190-adb1-4224-9a3d-b6da96bad2e8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:38:26 crc kubenswrapper[4823]: I1216 08:38:25.999114 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3dd01190-adb1-4224-9a3d-b6da96bad2e8-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"3dd01190-adb1-4224-9a3d-b6da96bad2e8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:38:26 crc kubenswrapper[4823]: I1216 08:38:25.999479 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3dd01190-adb1-4224-9a3d-b6da96bad2e8-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"3dd01190-adb1-4224-9a3d-b6da96bad2e8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:38:26 crc kubenswrapper[4823]: I1216 08:38:25.999529 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3dd01190-adb1-4224-9a3d-b6da96bad2e8-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"3dd01190-adb1-4224-9a3d-b6da96bad2e8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:38:26 crc kubenswrapper[4823]: I1216 08:38:25.999561 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3dd01190-adb1-4224-9a3d-b6da96bad2e8-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"3dd01190-adb1-4224-9a3d-b6da96bad2e8\") " 
pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:38:26 crc kubenswrapper[4823]: I1216 08:38:25.999606 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-0c2f342f-45ca-4f81-be5a-cc9b87688928\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0c2f342f-45ca-4f81-be5a-cc9b87688928\") pod \"rabbitmq-cell1-server-0\" (UID: \"3dd01190-adb1-4224-9a3d-b6da96bad2e8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:38:26 crc kubenswrapper[4823]: I1216 08:38:25.999633 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3dd01190-adb1-4224-9a3d-b6da96bad2e8-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3dd01190-adb1-4224-9a3d-b6da96bad2e8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:38:26 crc kubenswrapper[4823]: I1216 08:38:25.999658 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crkdq\" (UniqueName: \"kubernetes.io/projected/3dd01190-adb1-4224-9a3d-b6da96bad2e8-kube-api-access-crkdq\") pod \"rabbitmq-cell1-server-0\" (UID: \"3dd01190-adb1-4224-9a3d-b6da96bad2e8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:38:26 crc kubenswrapper[4823]: I1216 08:38:25.999733 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3dd01190-adb1-4224-9a3d-b6da96bad2e8-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"3dd01190-adb1-4224-9a3d-b6da96bad2e8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:38:26 crc kubenswrapper[4823]: I1216 08:38:25.999757 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3dd01190-adb1-4224-9a3d-b6da96bad2e8-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"3dd01190-adb1-4224-9a3d-b6da96bad2e8\") " 
pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:38:26 crc kubenswrapper[4823]: I1216 08:38:25.999794 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3dd01190-adb1-4224-9a3d-b6da96bad2e8-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3dd01190-adb1-4224-9a3d-b6da96bad2e8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:38:26 crc kubenswrapper[4823]: I1216 08:38:25.999822 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3dd01190-adb1-4224-9a3d-b6da96bad2e8-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"3dd01190-adb1-4224-9a3d-b6da96bad2e8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:38:26 crc kubenswrapper[4823]: I1216 08:38:26.000884 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3dd01190-adb1-4224-9a3d-b6da96bad2e8-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"3dd01190-adb1-4224-9a3d-b6da96bad2e8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:38:26 crc kubenswrapper[4823]: I1216 08:38:26.001230 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3dd01190-adb1-4224-9a3d-b6da96bad2e8-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"3dd01190-adb1-4224-9a3d-b6da96bad2e8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:38:26 crc kubenswrapper[4823]: I1216 08:38:26.004207 4823 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 16 08:38:26 crc kubenswrapper[4823]: I1216 08:38:26.004250 4823 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-0c2f342f-45ca-4f81-be5a-cc9b87688928\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0c2f342f-45ca-4f81-be5a-cc9b87688928\") pod \"rabbitmq-cell1-server-0\" (UID: \"3dd01190-adb1-4224-9a3d-b6da96bad2e8\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/425127be2ca6cf7b0df5a079ff52c2fb95106fbffb7f1066620281ae7e3b9826/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:38:26 crc kubenswrapper[4823]: I1216 08:38:26.005637 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3dd01190-adb1-4224-9a3d-b6da96bad2e8-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"3dd01190-adb1-4224-9a3d-b6da96bad2e8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:38:26 crc kubenswrapper[4823]: I1216 08:38:26.006396 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3dd01190-adb1-4224-9a3d-b6da96bad2e8-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"3dd01190-adb1-4224-9a3d-b6da96bad2e8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:38:26 crc kubenswrapper[4823]: I1216 08:38:26.007367 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3dd01190-adb1-4224-9a3d-b6da96bad2e8-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3dd01190-adb1-4224-9a3d-b6da96bad2e8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:38:26 crc kubenswrapper[4823]: I1216 08:38:26.007381 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3dd01190-adb1-4224-9a3d-b6da96bad2e8-erlang-cookie-secret\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"3dd01190-adb1-4224-9a3d-b6da96bad2e8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:38:26 crc kubenswrapper[4823]: I1216 08:38:26.008224 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3dd01190-adb1-4224-9a3d-b6da96bad2e8-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"3dd01190-adb1-4224-9a3d-b6da96bad2e8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:38:26 crc kubenswrapper[4823]: I1216 08:38:26.008327 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3dd01190-adb1-4224-9a3d-b6da96bad2e8-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3dd01190-adb1-4224-9a3d-b6da96bad2e8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:38:26 crc kubenswrapper[4823]: I1216 08:38:26.008415 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3dd01190-adb1-4224-9a3d-b6da96bad2e8-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"3dd01190-adb1-4224-9a3d-b6da96bad2e8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:38:26 crc kubenswrapper[4823]: I1216 08:38:26.019972 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crkdq\" (UniqueName: \"kubernetes.io/projected/3dd01190-adb1-4224-9a3d-b6da96bad2e8-kube-api-access-crkdq\") pod \"rabbitmq-cell1-server-0\" (UID: \"3dd01190-adb1-4224-9a3d-b6da96bad2e8\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:38:26 crc kubenswrapper[4823]: I1216 08:38:26.043703 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-0c2f342f-45ca-4f81-be5a-cc9b87688928\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0c2f342f-45ca-4f81-be5a-cc9b87688928\") pod \"rabbitmq-cell1-server-0\" (UID: \"3dd01190-adb1-4224-9a3d-b6da96bad2e8\") " 
pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:38:26 crc kubenswrapper[4823]: I1216 08:38:26.099158 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:38:26 crc kubenswrapper[4823]: I1216 08:38:26.285620 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 16 08:38:26 crc kubenswrapper[4823]: W1216 08:38:26.312379 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc541c676_03e3_4756_bbfa_770b1d9c3712.slice/crio-5fc71f18a11c235f26a8e442f2496563cd0c3f45fb0682805164e91f1e68c3a0 WatchSource:0}: Error finding container 5fc71f18a11c235f26a8e442f2496563cd0c3f45fb0682805164e91f1e68c3a0: Status 404 returned error can't find the container with id 5fc71f18a11c235f26a8e442f2496563cd0c3f45fb0682805164e91f1e68c3a0 Dec 16 08:38:26 crc kubenswrapper[4823]: I1216 08:38:26.356694 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c541c676-03e3-4756-bbfa-770b1d9c3712","Type":"ContainerStarted","Data":"5fc71f18a11c235f26a8e442f2496563cd0c3f45fb0682805164e91f1e68c3a0"} Dec 16 08:38:26 crc kubenswrapper[4823]: I1216 08:38:26.358984 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57484c487-k9rhm" event={"ID":"e0565c2a-5f25-4b81-b27d-764c8dc154b9","Type":"ContainerStarted","Data":"e4a9b591a6c465261ac6151c96e3ec2bb782a0f62ae85bd2f9bb87b568533b5c"} Dec 16 08:38:26 crc kubenswrapper[4823]: I1216 08:38:26.551856 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Dec 16 08:38:26 crc kubenswrapper[4823]: I1216 08:38:26.554552 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Dec 16 08:38:26 crc kubenswrapper[4823]: I1216 08:38:26.557365 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Dec 16 08:38:26 crc kubenswrapper[4823]: I1216 08:38:26.558841 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-8t746" Dec 16 08:38:26 crc kubenswrapper[4823]: I1216 08:38:26.559015 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Dec 16 08:38:26 crc kubenswrapper[4823]: I1216 08:38:26.562129 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Dec 16 08:38:26 crc kubenswrapper[4823]: I1216 08:38:26.566045 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Dec 16 08:38:26 crc kubenswrapper[4823]: I1216 08:38:26.567142 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 16 08:38:26 crc kubenswrapper[4823]: I1216 08:38:26.602215 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 16 08:38:26 crc kubenswrapper[4823]: W1216 08:38:26.610823 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dd01190_adb1_4224_9a3d_b6da96bad2e8.slice/crio-518afd86843d9b668ac6fd907073b9fd7c373d81f6d2cda667c8ff4335710d21 WatchSource:0}: Error finding container 518afd86843d9b668ac6fd907073b9fd7c373d81f6d2cda667c8ff4335710d21: Status 404 returned error can't find the container with id 518afd86843d9b668ac6fd907073b9fd7c373d81f6d2cda667c8ff4335710d21 Dec 16 08:38:26 crc kubenswrapper[4823]: I1216 08:38:26.712455 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0e511eaa-334a-4fe3-ab41-e66d4a53a931-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"0e511eaa-334a-4fe3-ab41-e66d4a53a931\") " pod="openstack/openstack-galera-0" Dec 16 08:38:26 crc kubenswrapper[4823]: I1216 08:38:26.713227 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/0e511eaa-334a-4fe3-ab41-e66d4a53a931-config-data-generated\") pod \"openstack-galera-0\" (UID: \"0e511eaa-334a-4fe3-ab41-e66d4a53a931\") " pod="openstack/openstack-galera-0" Dec 16 08:38:26 crc kubenswrapper[4823]: I1216 08:38:26.713257 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e511eaa-334a-4fe3-ab41-e66d4a53a931-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"0e511eaa-334a-4fe3-ab41-e66d4a53a931\") " pod="openstack/openstack-galera-0" Dec 16 08:38:26 crc kubenswrapper[4823]: I1216 08:38:26.713344 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-1fdf5bdd-a5ac-4ec5-9a41-b051d291361d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1fdf5bdd-a5ac-4ec5-9a41-b051d291361d\") pod \"openstack-galera-0\" (UID: \"0e511eaa-334a-4fe3-ab41-e66d4a53a931\") " pod="openstack/openstack-galera-0" Dec 16 08:38:26 crc kubenswrapper[4823]: I1216 08:38:26.713391 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0e511eaa-334a-4fe3-ab41-e66d4a53a931-kolla-config\") pod \"openstack-galera-0\" (UID: \"0e511eaa-334a-4fe3-ab41-e66d4a53a931\") " pod="openstack/openstack-galera-0" Dec 16 08:38:26 crc kubenswrapper[4823]: I1216 08:38:26.713506 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/0e511eaa-334a-4fe3-ab41-e66d4a53a931-operator-scripts\") pod \"openstack-galera-0\" (UID: \"0e511eaa-334a-4fe3-ab41-e66d4a53a931\") " pod="openstack/openstack-galera-0" Dec 16 08:38:26 crc kubenswrapper[4823]: I1216 08:38:26.713573 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dg96c\" (UniqueName: \"kubernetes.io/projected/0e511eaa-334a-4fe3-ab41-e66d4a53a931-kube-api-access-dg96c\") pod \"openstack-galera-0\" (UID: \"0e511eaa-334a-4fe3-ab41-e66d4a53a931\") " pod="openstack/openstack-galera-0" Dec 16 08:38:26 crc kubenswrapper[4823]: I1216 08:38:26.713608 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/0e511eaa-334a-4fe3-ab41-e66d4a53a931-config-data-default\") pod \"openstack-galera-0\" (UID: \"0e511eaa-334a-4fe3-ab41-e66d4a53a931\") " pod="openstack/openstack-galera-0" Dec 16 08:38:26 crc kubenswrapper[4823]: I1216 08:38:26.815755 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e511eaa-334a-4fe3-ab41-e66d4a53a931-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"0e511eaa-334a-4fe3-ab41-e66d4a53a931\") " pod="openstack/openstack-galera-0" Dec 16 08:38:26 crc kubenswrapper[4823]: I1216 08:38:26.815922 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/0e511eaa-334a-4fe3-ab41-e66d4a53a931-config-data-generated\") pod \"openstack-galera-0\" (UID: \"0e511eaa-334a-4fe3-ab41-e66d4a53a931\") " pod="openstack/openstack-galera-0" Dec 16 08:38:26 crc kubenswrapper[4823]: I1216 08:38:26.815951 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/0e511eaa-334a-4fe3-ab41-e66d4a53a931-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"0e511eaa-334a-4fe3-ab41-e66d4a53a931\") " pod="openstack/openstack-galera-0" Dec 16 08:38:26 crc kubenswrapper[4823]: I1216 08:38:26.815989 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-1fdf5bdd-a5ac-4ec5-9a41-b051d291361d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1fdf5bdd-a5ac-4ec5-9a41-b051d291361d\") pod \"openstack-galera-0\" (UID: \"0e511eaa-334a-4fe3-ab41-e66d4a53a931\") " pod="openstack/openstack-galera-0" Dec 16 08:38:26 crc kubenswrapper[4823]: I1216 08:38:26.816018 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0e511eaa-334a-4fe3-ab41-e66d4a53a931-kolla-config\") pod \"openstack-galera-0\" (UID: \"0e511eaa-334a-4fe3-ab41-e66d4a53a931\") " pod="openstack/openstack-galera-0" Dec 16 08:38:26 crc kubenswrapper[4823]: I1216 08:38:26.816073 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0e511eaa-334a-4fe3-ab41-e66d4a53a931-operator-scripts\") pod \"openstack-galera-0\" (UID: \"0e511eaa-334a-4fe3-ab41-e66d4a53a931\") " pod="openstack/openstack-galera-0" Dec 16 08:38:26 crc kubenswrapper[4823]: I1216 08:38:26.816091 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dg96c\" (UniqueName: \"kubernetes.io/projected/0e511eaa-334a-4fe3-ab41-e66d4a53a931-kube-api-access-dg96c\") pod \"openstack-galera-0\" (UID: \"0e511eaa-334a-4fe3-ab41-e66d4a53a931\") " pod="openstack/openstack-galera-0" Dec 16 08:38:26 crc kubenswrapper[4823]: I1216 08:38:26.816110 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/0e511eaa-334a-4fe3-ab41-e66d4a53a931-config-data-default\") pod 
\"openstack-galera-0\" (UID: \"0e511eaa-334a-4fe3-ab41-e66d4a53a931\") " pod="openstack/openstack-galera-0" Dec 16 08:38:26 crc kubenswrapper[4823]: I1216 08:38:26.817843 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0e511eaa-334a-4fe3-ab41-e66d4a53a931-kolla-config\") pod \"openstack-galera-0\" (UID: \"0e511eaa-334a-4fe3-ab41-e66d4a53a931\") " pod="openstack/openstack-galera-0" Dec 16 08:38:26 crc kubenswrapper[4823]: I1216 08:38:26.818129 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/0e511eaa-334a-4fe3-ab41-e66d4a53a931-config-data-generated\") pod \"openstack-galera-0\" (UID: \"0e511eaa-334a-4fe3-ab41-e66d4a53a931\") " pod="openstack/openstack-galera-0" Dec 16 08:38:26 crc kubenswrapper[4823]: I1216 08:38:26.818975 4823 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 16 08:38:26 crc kubenswrapper[4823]: I1216 08:38:26.818992 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/0e511eaa-334a-4fe3-ab41-e66d4a53a931-config-data-default\") pod \"openstack-galera-0\" (UID: \"0e511eaa-334a-4fe3-ab41-e66d4a53a931\") " pod="openstack/openstack-galera-0" Dec 16 08:38:26 crc kubenswrapper[4823]: I1216 08:38:26.819004 4823 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-1fdf5bdd-a5ac-4ec5-9a41-b051d291361d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1fdf5bdd-a5ac-4ec5-9a41-b051d291361d\") pod \"openstack-galera-0\" (UID: \"0e511eaa-334a-4fe3-ab41-e66d4a53a931\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/d5e2d850ff8ae3c39a56a7cb665d319ce687a3ca07eb0eac587904544aa8dfe3/globalmount\"" pod="openstack/openstack-galera-0" Dec 16 08:38:26 crc kubenswrapper[4823]: I1216 08:38:26.820748 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0e511eaa-334a-4fe3-ab41-e66d4a53a931-operator-scripts\") pod \"openstack-galera-0\" (UID: \"0e511eaa-334a-4fe3-ab41-e66d4a53a931\") " pod="openstack/openstack-galera-0" Dec 16 08:38:26 crc kubenswrapper[4823]: I1216 08:38:26.824768 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e511eaa-334a-4fe3-ab41-e66d4a53a931-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"0e511eaa-334a-4fe3-ab41-e66d4a53a931\") " pod="openstack/openstack-galera-0" Dec 16 08:38:26 crc kubenswrapper[4823]: I1216 08:38:26.827741 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e511eaa-334a-4fe3-ab41-e66d4a53a931-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: 
\"0e511eaa-334a-4fe3-ab41-e66d4a53a931\") " pod="openstack/openstack-galera-0" Dec 16 08:38:26 crc kubenswrapper[4823]: I1216 08:38:26.832884 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dg96c\" (UniqueName: \"kubernetes.io/projected/0e511eaa-334a-4fe3-ab41-e66d4a53a931-kube-api-access-dg96c\") pod \"openstack-galera-0\" (UID: \"0e511eaa-334a-4fe3-ab41-e66d4a53a931\") " pod="openstack/openstack-galera-0" Dec 16 08:38:26 crc kubenswrapper[4823]: I1216 08:38:26.856166 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-1fdf5bdd-a5ac-4ec5-9a41-b051d291361d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1fdf5bdd-a5ac-4ec5-9a41-b051d291361d\") pod \"openstack-galera-0\" (UID: \"0e511eaa-334a-4fe3-ab41-e66d4a53a931\") " pod="openstack/openstack-galera-0" Dec 16 08:38:26 crc kubenswrapper[4823]: I1216 08:38:26.881849 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Dec 16 08:38:27 crc kubenswrapper[4823]: I1216 08:38:27.379314 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3dd01190-adb1-4224-9a3d-b6da96bad2e8","Type":"ContainerStarted","Data":"518afd86843d9b668ac6fd907073b9fd7c373d81f6d2cda667c8ff4335710d21"} Dec 16 08:38:27 crc kubenswrapper[4823]: I1216 08:38:27.466133 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 16 08:38:27 crc kubenswrapper[4823]: W1216 08:38:27.480277 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0e511eaa_334a_4fe3_ab41_e66d4a53a931.slice/crio-d917a2b413c2c2bec575fd904721d3c821e8a3e96601e3f4ffbbefdd622d935e WatchSource:0}: Error finding container d917a2b413c2c2bec575fd904721d3c821e8a3e96601e3f4ffbbefdd622d935e: Status 404 returned error can't find the container with id 
d917a2b413c2c2bec575fd904721d3c821e8a3e96601e3f4ffbbefdd622d935e Dec 16 08:38:28 crc kubenswrapper[4823]: I1216 08:38:28.128636 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 16 08:38:28 crc kubenswrapper[4823]: I1216 08:38:28.130219 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 16 08:38:28 crc kubenswrapper[4823]: I1216 08:38:28.133138 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Dec 16 08:38:28 crc kubenswrapper[4823]: I1216 08:38:28.133324 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Dec 16 08:38:28 crc kubenswrapper[4823]: I1216 08:38:28.133875 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Dec 16 08:38:28 crc kubenswrapper[4823]: I1216 08:38:28.133983 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-cf8lt" Dec 16 08:38:28 crc kubenswrapper[4823]: I1216 08:38:28.135150 4823 patch_prober.go:28] interesting pod/machine-config-daemon-fv56f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 08:38:28 crc kubenswrapper[4823]: I1216 08:38:28.135181 4823 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 08:38:28 crc kubenswrapper[4823]: I1216 08:38:28.135216 4823 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-fv56f" Dec 16 08:38:28 crc kubenswrapper[4823]: I1216 08:38:28.135758 4823 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"294b6c0f0228f2baf018382a3d963ffc8968e2a2b2dcfcf14ae472b2ce45e535"} pod="openshift-machine-config-operator/machine-config-daemon-fv56f" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 16 08:38:28 crc kubenswrapper[4823]: I1216 08:38:28.135802 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" containerName="machine-config-daemon" containerID="cri-o://294b6c0f0228f2baf018382a3d963ffc8968e2a2b2dcfcf14ae472b2ce45e535" gracePeriod=600 Dec 16 08:38:28 crc kubenswrapper[4823]: I1216 08:38:28.137247 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 16 08:38:28 crc kubenswrapper[4823]: I1216 08:38:28.285010 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/4496b25e-2f39-453a-aa60-ffa74e9913c8-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"4496b25e-2f39-453a-aa60-ffa74e9913c8\") " pod="openstack/openstack-cell1-galera-0" Dec 16 08:38:28 crc kubenswrapper[4823]: I1216 08:38:28.285417 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4496b25e-2f39-453a-aa60-ffa74e9913c8-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"4496b25e-2f39-453a-aa60-ffa74e9913c8\") " pod="openstack/openstack-cell1-galera-0" Dec 16 08:38:28 crc kubenswrapper[4823]: I1216 08:38:28.285470 4823 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/4496b25e-2f39-453a-aa60-ffa74e9913c8-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"4496b25e-2f39-453a-aa60-ffa74e9913c8\") " pod="openstack/openstack-cell1-galera-0" Dec 16 08:38:28 crc kubenswrapper[4823]: I1216 08:38:28.285504 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jk2td\" (UniqueName: \"kubernetes.io/projected/4496b25e-2f39-453a-aa60-ffa74e9913c8-kube-api-access-jk2td\") pod \"openstack-cell1-galera-0\" (UID: \"4496b25e-2f39-453a-aa60-ffa74e9913c8\") " pod="openstack/openstack-cell1-galera-0" Dec 16 08:38:28 crc kubenswrapper[4823]: I1216 08:38:28.285543 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-ddfca108-9270-4a16-a7bc-614b69ca9a86\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ddfca108-9270-4a16-a7bc-614b69ca9a86\") pod \"openstack-cell1-galera-0\" (UID: \"4496b25e-2f39-453a-aa60-ffa74e9913c8\") " pod="openstack/openstack-cell1-galera-0" Dec 16 08:38:28 crc kubenswrapper[4823]: I1216 08:38:28.285567 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/4496b25e-2f39-453a-aa60-ffa74e9913c8-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"4496b25e-2f39-453a-aa60-ffa74e9913c8\") " pod="openstack/openstack-cell1-galera-0" Dec 16 08:38:28 crc kubenswrapper[4823]: I1216 08:38:28.285611 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4496b25e-2f39-453a-aa60-ffa74e9913c8-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"4496b25e-2f39-453a-aa60-ffa74e9913c8\") " pod="openstack/openstack-cell1-galera-0" Dec 16 08:38:28 
crc kubenswrapper[4823]: I1216 08:38:28.285638 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4496b25e-2f39-453a-aa60-ffa74e9913c8-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"4496b25e-2f39-453a-aa60-ffa74e9913c8\") " pod="openstack/openstack-cell1-galera-0" Dec 16 08:38:28 crc kubenswrapper[4823]: I1216 08:38:28.391990 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/4496b25e-2f39-453a-aa60-ffa74e9913c8-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"4496b25e-2f39-453a-aa60-ffa74e9913c8\") " pod="openstack/openstack-cell1-galera-0" Dec 16 08:38:28 crc kubenswrapper[4823]: I1216 08:38:28.392843 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/4496b25e-2f39-453a-aa60-ffa74e9913c8-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"4496b25e-2f39-453a-aa60-ffa74e9913c8\") " pod="openstack/openstack-cell1-galera-0" Dec 16 08:38:28 crc kubenswrapper[4823]: I1216 08:38:28.392918 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4496b25e-2f39-453a-aa60-ffa74e9913c8-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"4496b25e-2f39-453a-aa60-ffa74e9913c8\") " pod="openstack/openstack-cell1-galera-0" Dec 16 08:38:28 crc kubenswrapper[4823]: I1216 08:38:28.393418 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/4496b25e-2f39-453a-aa60-ffa74e9913c8-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"4496b25e-2f39-453a-aa60-ffa74e9913c8\") " pod="openstack/openstack-cell1-galera-0" Dec 16 08:38:28 crc kubenswrapper[4823]: I1216 08:38:28.393587 4823 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/4496b25e-2f39-453a-aa60-ffa74e9913c8-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"4496b25e-2f39-453a-aa60-ffa74e9913c8\") " pod="openstack/openstack-cell1-galera-0" Dec 16 08:38:28 crc kubenswrapper[4823]: I1216 08:38:28.393640 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jk2td\" (UniqueName: \"kubernetes.io/projected/4496b25e-2f39-453a-aa60-ffa74e9913c8-kube-api-access-jk2td\") pod \"openstack-cell1-galera-0\" (UID: \"4496b25e-2f39-453a-aa60-ffa74e9913c8\") " pod="openstack/openstack-cell1-galera-0" Dec 16 08:38:28 crc kubenswrapper[4823]: I1216 08:38:28.393683 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-ddfca108-9270-4a16-a7bc-614b69ca9a86\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ddfca108-9270-4a16-a7bc-614b69ca9a86\") pod \"openstack-cell1-galera-0\" (UID: \"4496b25e-2f39-453a-aa60-ffa74e9913c8\") " pod="openstack/openstack-cell1-galera-0" Dec 16 08:38:28 crc kubenswrapper[4823]: I1216 08:38:28.393700 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/4496b25e-2f39-453a-aa60-ffa74e9913c8-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"4496b25e-2f39-453a-aa60-ffa74e9913c8\") " pod="openstack/openstack-cell1-galera-0" Dec 16 08:38:28 crc kubenswrapper[4823]: I1216 08:38:28.393755 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4496b25e-2f39-453a-aa60-ffa74e9913c8-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"4496b25e-2f39-453a-aa60-ffa74e9913c8\") " pod="openstack/openstack-cell1-galera-0" Dec 16 08:38:28 crc kubenswrapper[4823]: I1216 08:38:28.393781 4823 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4496b25e-2f39-453a-aa60-ffa74e9913c8-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"4496b25e-2f39-453a-aa60-ffa74e9913c8\") " pod="openstack/openstack-cell1-galera-0" Dec 16 08:38:28 crc kubenswrapper[4823]: I1216 08:38:28.394278 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4496b25e-2f39-453a-aa60-ffa74e9913c8-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"4496b25e-2f39-453a-aa60-ffa74e9913c8\") " pod="openstack/openstack-cell1-galera-0" Dec 16 08:38:28 crc kubenswrapper[4823]: I1216 08:38:28.395369 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4496b25e-2f39-453a-aa60-ffa74e9913c8-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"4496b25e-2f39-453a-aa60-ffa74e9913c8\") " pod="openstack/openstack-cell1-galera-0" Dec 16 08:38:28 crc kubenswrapper[4823]: I1216 08:38:28.409894 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4496b25e-2f39-453a-aa60-ffa74e9913c8-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"4496b25e-2f39-453a-aa60-ffa74e9913c8\") " pod="openstack/openstack-cell1-galera-0" Dec 16 08:38:28 crc kubenswrapper[4823]: I1216 08:38:28.418334 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/4496b25e-2f39-453a-aa60-ffa74e9913c8-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"4496b25e-2f39-453a-aa60-ffa74e9913c8\") " pod="openstack/openstack-cell1-galera-0" Dec 16 08:38:28 crc kubenswrapper[4823]: I1216 08:38:28.454733 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" 
event={"ID":"0e511eaa-334a-4fe3-ab41-e66d4a53a931","Type":"ContainerStarted","Data":"d917a2b413c2c2bec575fd904721d3c821e8a3e96601e3f4ffbbefdd622d935e"} Dec 16 08:38:28 crc kubenswrapper[4823]: I1216 08:38:28.459694 4823 generic.go:334] "Generic (PLEG): container finished" podID="25dec47c-3043-486c-b371-2be103c214e3" containerID="294b6c0f0228f2baf018382a3d963ffc8968e2a2b2dcfcf14ae472b2ce45e535" exitCode=0 Dec 16 08:38:28 crc kubenswrapper[4823]: I1216 08:38:28.460107 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" event={"ID":"25dec47c-3043-486c-b371-2be103c214e3","Type":"ContainerDied","Data":"294b6c0f0228f2baf018382a3d963ffc8968e2a2b2dcfcf14ae472b2ce45e535"} Dec 16 08:38:28 crc kubenswrapper[4823]: I1216 08:38:28.460556 4823 scope.go:117] "RemoveContainer" containerID="1b09ffdd594a8d5cacbf74adcd72b8a090c1db4bd077e6e653ce5a0feae3c64f" Dec 16 08:38:28 crc kubenswrapper[4823]: I1216 08:38:28.491687 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jk2td\" (UniqueName: \"kubernetes.io/projected/4496b25e-2f39-453a-aa60-ffa74e9913c8-kube-api-access-jk2td\") pod \"openstack-cell1-galera-0\" (UID: \"4496b25e-2f39-453a-aa60-ffa74e9913c8\") " pod="openstack/openstack-cell1-galera-0" Dec 16 08:38:28 crc kubenswrapper[4823]: I1216 08:38:28.506600 4823 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 16 08:38:28 crc kubenswrapper[4823]: I1216 08:38:28.506647 4823 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-ddfca108-9270-4a16-a7bc-614b69ca9a86\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ddfca108-9270-4a16-a7bc-614b69ca9a86\") pod \"openstack-cell1-galera-0\" (UID: \"4496b25e-2f39-453a-aa60-ffa74e9913c8\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/9d88b77a62429723c32a2466cba129d7afb953a1cede41d2f9e4a475ead3ef09/globalmount\"" pod="openstack/openstack-cell1-galera-0" Dec 16 08:38:28 crc kubenswrapper[4823]: I1216 08:38:28.605996 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-ddfca108-9270-4a16-a7bc-614b69ca9a86\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ddfca108-9270-4a16-a7bc-614b69ca9a86\") pod \"openstack-cell1-galera-0\" (UID: \"4496b25e-2f39-453a-aa60-ffa74e9913c8\") " pod="openstack/openstack-cell1-galera-0" Dec 16 08:38:28 crc kubenswrapper[4823]: I1216 08:38:28.658815 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Dec 16 08:38:28 crc kubenswrapper[4823]: I1216 08:38:28.661548 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Dec 16 08:38:28 crc kubenswrapper[4823]: I1216 08:38:28.668536 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-456nc" Dec 16 08:38:28 crc kubenswrapper[4823]: I1216 08:38:28.668859 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Dec 16 08:38:28 crc kubenswrapper[4823]: I1216 08:38:28.669025 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Dec 16 08:38:28 crc kubenswrapper[4823]: I1216 08:38:28.675374 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 16 08:38:28 crc kubenswrapper[4823]: I1216 08:38:28.801421 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-glmmw\" (UniqueName: \"kubernetes.io/projected/c90aab28-60fa-4cdc-a89a-bd041351015d-kube-api-access-glmmw\") pod \"memcached-0\" (UID: \"c90aab28-60fa-4cdc-a89a-bd041351015d\") " pod="openstack/memcached-0" Dec 16 08:38:28 crc kubenswrapper[4823]: I1216 08:38:28.801475 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c90aab28-60fa-4cdc-a89a-bd041351015d-combined-ca-bundle\") pod \"memcached-0\" (UID: \"c90aab28-60fa-4cdc-a89a-bd041351015d\") " pod="openstack/memcached-0" Dec 16 08:38:28 crc kubenswrapper[4823]: I1216 08:38:28.801506 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/c90aab28-60fa-4cdc-a89a-bd041351015d-memcached-tls-certs\") pod \"memcached-0\" (UID: \"c90aab28-60fa-4cdc-a89a-bd041351015d\") " pod="openstack/memcached-0" Dec 16 08:38:28 crc kubenswrapper[4823]: I1216 08:38:28.801550 4823 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c90aab28-60fa-4cdc-a89a-bd041351015d-kolla-config\") pod \"memcached-0\" (UID: \"c90aab28-60fa-4cdc-a89a-bd041351015d\") " pod="openstack/memcached-0" Dec 16 08:38:28 crc kubenswrapper[4823]: I1216 08:38:28.801723 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c90aab28-60fa-4cdc-a89a-bd041351015d-config-data\") pod \"memcached-0\" (UID: \"c90aab28-60fa-4cdc-a89a-bd041351015d\") " pod="openstack/memcached-0" Dec 16 08:38:28 crc kubenswrapper[4823]: I1216 08:38:28.805532 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 16 08:38:28 crc kubenswrapper[4823]: I1216 08:38:28.902806 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c90aab28-60fa-4cdc-a89a-bd041351015d-config-data\") pod \"memcached-0\" (UID: \"c90aab28-60fa-4cdc-a89a-bd041351015d\") " pod="openstack/memcached-0" Dec 16 08:38:28 crc kubenswrapper[4823]: I1216 08:38:28.902893 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-glmmw\" (UniqueName: \"kubernetes.io/projected/c90aab28-60fa-4cdc-a89a-bd041351015d-kube-api-access-glmmw\") pod \"memcached-0\" (UID: \"c90aab28-60fa-4cdc-a89a-bd041351015d\") " pod="openstack/memcached-0" Dec 16 08:38:28 crc kubenswrapper[4823]: I1216 08:38:28.902931 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c90aab28-60fa-4cdc-a89a-bd041351015d-combined-ca-bundle\") pod \"memcached-0\" (UID: \"c90aab28-60fa-4cdc-a89a-bd041351015d\") " pod="openstack/memcached-0" Dec 16 08:38:28 crc kubenswrapper[4823]: I1216 08:38:28.902957 4823 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/c90aab28-60fa-4cdc-a89a-bd041351015d-memcached-tls-certs\") pod \"memcached-0\" (UID: \"c90aab28-60fa-4cdc-a89a-bd041351015d\") " pod="openstack/memcached-0" Dec 16 08:38:28 crc kubenswrapper[4823]: I1216 08:38:28.902979 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c90aab28-60fa-4cdc-a89a-bd041351015d-kolla-config\") pod \"memcached-0\" (UID: \"c90aab28-60fa-4cdc-a89a-bd041351015d\") " pod="openstack/memcached-0" Dec 16 08:38:28 crc kubenswrapper[4823]: I1216 08:38:28.903749 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c90aab28-60fa-4cdc-a89a-bd041351015d-kolla-config\") pod \"memcached-0\" (UID: \"c90aab28-60fa-4cdc-a89a-bd041351015d\") " pod="openstack/memcached-0" Dec 16 08:38:28 crc kubenswrapper[4823]: I1216 08:38:28.903764 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c90aab28-60fa-4cdc-a89a-bd041351015d-config-data\") pod \"memcached-0\" (UID: \"c90aab28-60fa-4cdc-a89a-bd041351015d\") " pod="openstack/memcached-0" Dec 16 08:38:28 crc kubenswrapper[4823]: I1216 08:38:28.907678 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/c90aab28-60fa-4cdc-a89a-bd041351015d-memcached-tls-certs\") pod \"memcached-0\" (UID: \"c90aab28-60fa-4cdc-a89a-bd041351015d\") " pod="openstack/memcached-0" Dec 16 08:38:28 crc kubenswrapper[4823]: I1216 08:38:28.913612 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c90aab28-60fa-4cdc-a89a-bd041351015d-combined-ca-bundle\") pod \"memcached-0\" (UID: \"c90aab28-60fa-4cdc-a89a-bd041351015d\") " 
pod="openstack/memcached-0" Dec 16 08:38:28 crc kubenswrapper[4823]: I1216 08:38:28.923677 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-glmmw\" (UniqueName: \"kubernetes.io/projected/c90aab28-60fa-4cdc-a89a-bd041351015d-kube-api-access-glmmw\") pod \"memcached-0\" (UID: \"c90aab28-60fa-4cdc-a89a-bd041351015d\") " pod="openstack/memcached-0" Dec 16 08:38:29 crc kubenswrapper[4823]: I1216 08:38:29.003792 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Dec 16 08:38:30 crc kubenswrapper[4823]: I1216 08:38:29.356901 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 16 08:38:30 crc kubenswrapper[4823]: I1216 08:38:29.478420 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" event={"ID":"25dec47c-3043-486c-b371-2be103c214e3","Type":"ContainerStarted","Data":"3392da862948c3e1cef11c4a1c08d8880ad4041a534d76052813f8acc165efb4"} Dec 16 08:38:30 crc kubenswrapper[4823]: I1216 08:38:29.482630 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"4496b25e-2f39-453a-aa60-ffa74e9913c8","Type":"ContainerStarted","Data":"e767172d18a6ab67f2f2cb7e2024f66f103a1e4e96f08e1c6664458e1ff73f1b"} Dec 16 08:38:30 crc kubenswrapper[4823]: I1216 08:38:30.995127 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 16 08:38:31 crc kubenswrapper[4823]: I1216 08:38:31.508451 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"c90aab28-60fa-4cdc-a89a-bd041351015d","Type":"ContainerStarted","Data":"fe735a54def89feb2b97130a9f912fa092f9f8ec78518b1aa8976785e19034f8"} Dec 16 08:38:52 crc kubenswrapper[4823]: E1216 08:38:52.888100 4823 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-server:c3a837a7c939c44c9106d2b2c7c72015" Dec 16 08:38:52 crc kubenswrapper[4823]: E1216 08:38:52.890104 4823 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-server:c3a837a7c939c44c9106d2b2c7c72015" Dec 16 08:38:52 crc kubenswrapper[4823]: E1216 08:38:52.890327 4823 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-server:c3a837a7c939c44c9106d2b2c7c72015,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n647h57bh695h68dh54fhf5hc5h67h5d4hb6h696h685h54ch6h599h5c5h679h74h689h644h5c8h64ch555h5c6h5dh569h698h59fh66ch57bh5b9hb7q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zw2hz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabili
ties{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-6d854b9c6f-5xkzl_openstack(334290ef-5ba0-4690-be5b-768d703e9610): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 16 08:38:52 crc kubenswrapper[4823]: E1216 08:38:52.892191 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-6d854b9c6f-5xkzl" podUID="334290ef-5ba0-4690-be5b-768d703e9610" Dec 16 08:38:52 crc kubenswrapper[4823]: E1216 08:38:52.964147 4823 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-server:c3a837a7c939c44c9106d2b2c7c72015" Dec 16 08:38:52 crc kubenswrapper[4823]: E1216 08:38:52.964200 4823 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-server:c3a837a7c939c44c9106d2b2c7c72015" Dec 16 08:38:52 crc kubenswrapper[4823]: E1216 08:38:52.964307 4823 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-server:c3a837a7c939c44c9106d2b2c7c72015,Command:[/bin/bash],Args:[-c dnsmasq --interface=* 
--conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n697h54dhb7h666h69h76h59ch55ch65ch596h8h79h5c8h57hc8hfch5d7h697h79h698h5fch644hf9h54chbfh655hfchcbh5f8h646h5f7h89q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qfm9q,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
dnsmasq-dns-6fcfb4c6f9-txnts_openstack(7e3f5eef-3838-48ea-af3e-a0d4b5e0d027): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 16 08:38:52 crc kubenswrapper[4823]: E1216 08:38:52.965741 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-6fcfb4c6f9-txnts" podUID="7e3f5eef-3838-48ea-af3e-a0d4b5e0d027" Dec 16 08:38:53 crc kubenswrapper[4823]: I1216 08:38:53.685235 4823 generic.go:334] "Generic (PLEG): container finished" podID="6c1ab111-67ba-4e81-8879-e8c1d0b00b1f" containerID="c384b77549257c2985ba3bd2bd093c58f79094b633967f54f55194e23a611537" exitCode=0 Dec 16 08:38:53 crc kubenswrapper[4823]: I1216 08:38:53.685349 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6fdf89db6c-rtjsh" event={"ID":"6c1ab111-67ba-4e81-8879-e8c1d0b00b1f","Type":"ContainerDied","Data":"c384b77549257c2985ba3bd2bd093c58f79094b633967f54f55194e23a611537"} Dec 16 08:38:53 crc kubenswrapper[4823]: I1216 08:38:53.688121 4823 generic.go:334] "Generic (PLEG): container finished" podID="e0565c2a-5f25-4b81-b27d-764c8dc154b9" containerID="1ef96a6489a70f037006955f18c093f97b851e5f52bcc2976737b4bac159c91b" exitCode=0 Dec 16 08:38:53 crc kubenswrapper[4823]: I1216 08:38:53.688228 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57484c487-k9rhm" event={"ID":"e0565c2a-5f25-4b81-b27d-764c8dc154b9","Type":"ContainerDied","Data":"1ef96a6489a70f037006955f18c093f97b851e5f52bcc2976737b4bac159c91b"} Dec 16 08:38:53 crc kubenswrapper[4823]: I1216 08:38:53.690902 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"4496b25e-2f39-453a-aa60-ffa74e9913c8","Type":"ContainerStarted","Data":"8ad69e3090748c1e799f5496828a84c316a66aa70069a402b82eec113ca2e82a"} Dec 16 08:38:53 crc 
kubenswrapper[4823]: I1216 08:38:53.693132 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"0e511eaa-334a-4fe3-ab41-e66d4a53a931","Type":"ContainerStarted","Data":"84d302ca04075fd211e3074ac4840cb034044f62733fd5920a6f6817538aea46"} Dec 16 08:38:53 crc kubenswrapper[4823]: I1216 08:38:53.696900 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"c90aab28-60fa-4cdc-a89a-bd041351015d","Type":"ContainerStarted","Data":"f09c828395889bafb8967722bc9fe10e42c34bbfc0a893f1c82cb91b42750c4b"} Dec 16 08:38:53 crc kubenswrapper[4823]: I1216 08:38:53.697167 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Dec 16 08:38:53 crc kubenswrapper[4823]: I1216 08:38:53.755666 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=3.940137472 podStartE2EDuration="25.755643897s" podCreationTimestamp="2025-12-16 08:38:28 +0000 UTC" firstStartedPulling="2025-12-16 08:38:31.003001922 +0000 UTC m=+6189.491568045" lastFinishedPulling="2025-12-16 08:38:52.818508347 +0000 UTC m=+6211.307074470" observedRunningTime="2025-12-16 08:38:53.747165991 +0000 UTC m=+6212.235732124" watchObservedRunningTime="2025-12-16 08:38:53.755643897 +0000 UTC m=+6212.244210020" Dec 16 08:38:54 crc kubenswrapper[4823]: I1216 08:38:54.076837 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d854b9c6f-5xkzl" Dec 16 08:38:54 crc kubenswrapper[4823]: I1216 08:38:54.082404 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6fcfb4c6f9-txnts" Dec 16 08:38:54 crc kubenswrapper[4823]: I1216 08:38:54.180738 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qfm9q\" (UniqueName: \"kubernetes.io/projected/7e3f5eef-3838-48ea-af3e-a0d4b5e0d027-kube-api-access-qfm9q\") pod \"7e3f5eef-3838-48ea-af3e-a0d4b5e0d027\" (UID: \"7e3f5eef-3838-48ea-af3e-a0d4b5e0d027\") " Dec 16 08:38:54 crc kubenswrapper[4823]: I1216 08:38:54.180806 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/334290ef-5ba0-4690-be5b-768d703e9610-config\") pod \"334290ef-5ba0-4690-be5b-768d703e9610\" (UID: \"334290ef-5ba0-4690-be5b-768d703e9610\") " Dec 16 08:38:54 crc kubenswrapper[4823]: I1216 08:38:54.180859 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e3f5eef-3838-48ea-af3e-a0d4b5e0d027-config\") pod \"7e3f5eef-3838-48ea-af3e-a0d4b5e0d027\" (UID: \"7e3f5eef-3838-48ea-af3e-a0d4b5e0d027\") " Dec 16 08:38:54 crc kubenswrapper[4823]: I1216 08:38:54.180917 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7e3f5eef-3838-48ea-af3e-a0d4b5e0d027-dns-svc\") pod \"7e3f5eef-3838-48ea-af3e-a0d4b5e0d027\" (UID: \"7e3f5eef-3838-48ea-af3e-a0d4b5e0d027\") " Dec 16 08:38:54 crc kubenswrapper[4823]: I1216 08:38:54.180955 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zw2hz\" (UniqueName: \"kubernetes.io/projected/334290ef-5ba0-4690-be5b-768d703e9610-kube-api-access-zw2hz\") pod \"334290ef-5ba0-4690-be5b-768d703e9610\" (UID: \"334290ef-5ba0-4690-be5b-768d703e9610\") " Dec 16 08:38:54 crc kubenswrapper[4823]: I1216 08:38:54.181442 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/7e3f5eef-3838-48ea-af3e-a0d4b5e0d027-config" (OuterVolumeSpecName: "config") pod "7e3f5eef-3838-48ea-af3e-a0d4b5e0d027" (UID: "7e3f5eef-3838-48ea-af3e-a0d4b5e0d027"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:38:54 crc kubenswrapper[4823]: I1216 08:38:54.181459 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e3f5eef-3838-48ea-af3e-a0d4b5e0d027-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7e3f5eef-3838-48ea-af3e-a0d4b5e0d027" (UID: "7e3f5eef-3838-48ea-af3e-a0d4b5e0d027"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:38:54 crc kubenswrapper[4823]: I1216 08:38:54.181754 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/334290ef-5ba0-4690-be5b-768d703e9610-config" (OuterVolumeSpecName: "config") pod "334290ef-5ba0-4690-be5b-768d703e9610" (UID: "334290ef-5ba0-4690-be5b-768d703e9610"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:38:54 crc kubenswrapper[4823]: I1216 08:38:54.184221 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/334290ef-5ba0-4690-be5b-768d703e9610-kube-api-access-zw2hz" (OuterVolumeSpecName: "kube-api-access-zw2hz") pod "334290ef-5ba0-4690-be5b-768d703e9610" (UID: "334290ef-5ba0-4690-be5b-768d703e9610"). InnerVolumeSpecName "kube-api-access-zw2hz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:38:54 crc kubenswrapper[4823]: I1216 08:38:54.184324 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e3f5eef-3838-48ea-af3e-a0d4b5e0d027-kube-api-access-qfm9q" (OuterVolumeSpecName: "kube-api-access-qfm9q") pod "7e3f5eef-3838-48ea-af3e-a0d4b5e0d027" (UID: "7e3f5eef-3838-48ea-af3e-a0d4b5e0d027"). InnerVolumeSpecName "kube-api-access-qfm9q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:38:54 crc kubenswrapper[4823]: I1216 08:38:54.282731 4823 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7e3f5eef-3838-48ea-af3e-a0d4b5e0d027-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 16 08:38:54 crc kubenswrapper[4823]: I1216 08:38:54.282793 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zw2hz\" (UniqueName: \"kubernetes.io/projected/334290ef-5ba0-4690-be5b-768d703e9610-kube-api-access-zw2hz\") on node \"crc\" DevicePath \"\"" Dec 16 08:38:54 crc kubenswrapper[4823]: I1216 08:38:54.282807 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qfm9q\" (UniqueName: \"kubernetes.io/projected/7e3f5eef-3838-48ea-af3e-a0d4b5e0d027-kube-api-access-qfm9q\") on node \"crc\" DevicePath \"\"" Dec 16 08:38:54 crc kubenswrapper[4823]: I1216 08:38:54.282817 4823 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/334290ef-5ba0-4690-be5b-768d703e9610-config\") on node \"crc\" DevicePath \"\"" Dec 16 08:38:54 crc kubenswrapper[4823]: I1216 08:38:54.282827 4823 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e3f5eef-3838-48ea-af3e-a0d4b5e0d027-config\") on node \"crc\" DevicePath \"\"" Dec 16 08:38:54 crc kubenswrapper[4823]: I1216 08:38:54.704653 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3dd01190-adb1-4224-9a3d-b6da96bad2e8","Type":"ContainerStarted","Data":"8eff4ec096d3a02f98a190c46e9b5a752230e3176744fe63e9fd95090303b585"} Dec 16 08:38:54 crc kubenswrapper[4823]: I1216 08:38:54.706665 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57484c487-k9rhm" 
event={"ID":"e0565c2a-5f25-4b81-b27d-764c8dc154b9","Type":"ContainerStarted","Data":"bd090d7e4221f56585efd589621a160fafa31d2c93e35507533a0931abdec9a3"} Dec 16 08:38:54 crc kubenswrapper[4823]: I1216 08:38:54.706817 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57484c487-k9rhm" Dec 16 08:38:54 crc kubenswrapper[4823]: I1216 08:38:54.707691 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d854b9c6f-5xkzl" Dec 16 08:38:54 crc kubenswrapper[4823]: I1216 08:38:54.707710 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d854b9c6f-5xkzl" event={"ID":"334290ef-5ba0-4690-be5b-768d703e9610","Type":"ContainerDied","Data":"44e9e2f2877fea0f7ff204c447e05dd5bf672164b277c8cd32371156b633b606"} Dec 16 08:38:54 crc kubenswrapper[4823]: I1216 08:38:54.725739 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6fcfb4c6f9-txnts" Dec 16 08:38:54 crc kubenswrapper[4823]: I1216 08:38:54.725729 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6fcfb4c6f9-txnts" event={"ID":"7e3f5eef-3838-48ea-af3e-a0d4b5e0d027","Type":"ContainerDied","Data":"5b1e290bac0d7f3201d3e15862229e0e24b35f2e2c44b1917ff4077de0cddfc2"} Dec 16 08:38:54 crc kubenswrapper[4823]: I1216 08:38:54.731463 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6fdf89db6c-rtjsh" event={"ID":"6c1ab111-67ba-4e81-8879-e8c1d0b00b1f","Type":"ContainerStarted","Data":"571def6ff2bdb7d5caa937749d7ab620e93779768ba8ecd5ac16fcce6a0d21aa"} Dec 16 08:38:54 crc kubenswrapper[4823]: I1216 08:38:54.732230 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6fdf89db6c-rtjsh" Dec 16 08:38:54 crc kubenswrapper[4823]: I1216 08:38:54.735606 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"c541c676-03e3-4756-bbfa-770b1d9c3712","Type":"ContainerStarted","Data":"dd454651131887859f882a2ddc94dd427d8d7dc13d843dd0887cc643e74061fd"} Dec 16 08:38:54 crc kubenswrapper[4823]: I1216 08:38:54.765504 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57484c487-k9rhm" podStartSLOduration=3.195167603 podStartE2EDuration="30.765487645s" podCreationTimestamp="2025-12-16 08:38:24 +0000 UTC" firstStartedPulling="2025-12-16 08:38:25.377405614 +0000 UTC m=+6183.865971737" lastFinishedPulling="2025-12-16 08:38:52.947725656 +0000 UTC m=+6211.436291779" observedRunningTime="2025-12-16 08:38:54.764481393 +0000 UTC m=+6213.253047526" watchObservedRunningTime="2025-12-16 08:38:54.765487645 +0000 UTC m=+6213.254053768" Dec 16 08:38:54 crc kubenswrapper[4823]: I1216 08:38:54.796147 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6fdf89db6c-rtjsh" podStartSLOduration=2.989195439 podStartE2EDuration="30.796104335s" podCreationTimestamp="2025-12-16 08:38:24 +0000 UTC" firstStartedPulling="2025-12-16 08:38:25.183104635 +0000 UTC m=+6183.671670758" lastFinishedPulling="2025-12-16 08:38:52.990013531 +0000 UTC m=+6211.478579654" observedRunningTime="2025-12-16 08:38:54.790457718 +0000 UTC m=+6213.279023841" watchObservedRunningTime="2025-12-16 08:38:54.796104335 +0000 UTC m=+6213.284670448" Dec 16 08:38:54 crc kubenswrapper[4823]: I1216 08:38:54.908977 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6fcfb4c6f9-txnts"] Dec 16 08:38:54 crc kubenswrapper[4823]: I1216 08:38:54.916260 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6fcfb4c6f9-txnts"] Dec 16 08:38:54 crc kubenswrapper[4823]: I1216 08:38:54.945047 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d854b9c6f-5xkzl"] Dec 16 08:38:54 crc kubenswrapper[4823]: I1216 08:38:54.953066 4823 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack/dnsmasq-dns-6d854b9c6f-5xkzl"] Dec 16 08:38:55 crc kubenswrapper[4823]: I1216 08:38:55.782391 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="334290ef-5ba0-4690-be5b-768d703e9610" path="/var/lib/kubelet/pods/334290ef-5ba0-4690-be5b-768d703e9610/volumes" Dec 16 08:38:55 crc kubenswrapper[4823]: I1216 08:38:55.782750 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e3f5eef-3838-48ea-af3e-a0d4b5e0d027" path="/var/lib/kubelet/pods/7e3f5eef-3838-48ea-af3e-a0d4b5e0d027/volumes" Dec 16 08:38:57 crc kubenswrapper[4823]: I1216 08:38:57.761747 4823 generic.go:334] "Generic (PLEG): container finished" podID="4496b25e-2f39-453a-aa60-ffa74e9913c8" containerID="8ad69e3090748c1e799f5496828a84c316a66aa70069a402b82eec113ca2e82a" exitCode=0 Dec 16 08:38:57 crc kubenswrapper[4823]: I1216 08:38:57.762098 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"4496b25e-2f39-453a-aa60-ffa74e9913c8","Type":"ContainerDied","Data":"8ad69e3090748c1e799f5496828a84c316a66aa70069a402b82eec113ca2e82a"} Dec 16 08:38:57 crc kubenswrapper[4823]: I1216 08:38:57.764368 4823 generic.go:334] "Generic (PLEG): container finished" podID="0e511eaa-334a-4fe3-ab41-e66d4a53a931" containerID="84d302ca04075fd211e3074ac4840cb034044f62733fd5920a6f6817538aea46" exitCode=0 Dec 16 08:38:57 crc kubenswrapper[4823]: I1216 08:38:57.764408 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"0e511eaa-334a-4fe3-ab41-e66d4a53a931","Type":"ContainerDied","Data":"84d302ca04075fd211e3074ac4840cb034044f62733fd5920a6f6817538aea46"} Dec 16 08:38:58 crc kubenswrapper[4823]: I1216 08:38:58.773198 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"0e511eaa-334a-4fe3-ab41-e66d4a53a931","Type":"ContainerStarted","Data":"721c47558f9af7bfce05a40ae981abe655f7282de605a46f281cb1818605ab71"} Dec 16 
08:38:58 crc kubenswrapper[4823]: I1216 08:38:58.775805 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"4496b25e-2f39-453a-aa60-ffa74e9913c8","Type":"ContainerStarted","Data":"8de49ec4ecd014f0c63b6ec26daddcc666881ad384dbde0da558498c271e033a"} Dec 16 08:38:58 crc kubenswrapper[4823]: I1216 08:38:58.794145 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=8.461646466 podStartE2EDuration="33.794128724s" podCreationTimestamp="2025-12-16 08:38:25 +0000 UTC" firstStartedPulling="2025-12-16 08:38:27.486149423 +0000 UTC m=+6185.974715546" lastFinishedPulling="2025-12-16 08:38:52.818631681 +0000 UTC m=+6211.307197804" observedRunningTime="2025-12-16 08:38:58.793652399 +0000 UTC m=+6217.282218542" watchObservedRunningTime="2025-12-16 08:38:58.794128724 +0000 UTC m=+6217.282694847" Dec 16 08:38:58 crc kubenswrapper[4823]: I1216 08:38:58.806287 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Dec 16 08:38:58 crc kubenswrapper[4823]: I1216 08:38:58.806485 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Dec 16 08:38:58 crc kubenswrapper[4823]: I1216 08:38:58.820937 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=8.320597248 podStartE2EDuration="31.820908124s" podCreationTimestamp="2025-12-16 08:38:27 +0000 UTC" firstStartedPulling="2025-12-16 08:38:29.378604603 +0000 UTC m=+6187.867170716" lastFinishedPulling="2025-12-16 08:38:52.878915469 +0000 UTC m=+6211.367481592" observedRunningTime="2025-12-16 08:38:58.818851479 +0000 UTC m=+6217.307417602" watchObservedRunningTime="2025-12-16 08:38:58.820908124 +0000 UTC m=+6217.309474277" Dec 16 08:38:59 crc kubenswrapper[4823]: I1216 08:38:59.004780 4823 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Dec 16 08:38:59 crc kubenswrapper[4823]: I1216 08:38:59.652511 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6fdf89db6c-rtjsh" Dec 16 08:38:59 crc kubenswrapper[4823]: I1216 08:38:59.936088 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57484c487-k9rhm" Dec 16 08:38:59 crc kubenswrapper[4823]: I1216 08:38:59.993019 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6fdf89db6c-rtjsh"] Dec 16 08:38:59 crc kubenswrapper[4823]: I1216 08:38:59.993226 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6fdf89db6c-rtjsh" podUID="6c1ab111-67ba-4e81-8879-e8c1d0b00b1f" containerName="dnsmasq-dns" containerID="cri-o://571def6ff2bdb7d5caa937749d7ab620e93779768ba8ecd5ac16fcce6a0d21aa" gracePeriod=10 Dec 16 08:39:00 crc kubenswrapper[4823]: I1216 08:39:00.395304 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6fdf89db6c-rtjsh" Dec 16 08:39:00 crc kubenswrapper[4823]: I1216 08:39:00.501229 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6c1ab111-67ba-4e81-8879-e8c1d0b00b1f-dns-svc\") pod \"6c1ab111-67ba-4e81-8879-e8c1d0b00b1f\" (UID: \"6c1ab111-67ba-4e81-8879-e8c1d0b00b1f\") " Dec 16 08:39:00 crc kubenswrapper[4823]: I1216 08:39:00.501364 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c1ab111-67ba-4e81-8879-e8c1d0b00b1f-config\") pod \"6c1ab111-67ba-4e81-8879-e8c1d0b00b1f\" (UID: \"6c1ab111-67ba-4e81-8879-e8c1d0b00b1f\") " Dec 16 08:39:00 crc kubenswrapper[4823]: I1216 08:39:00.501427 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-csdw7\" (UniqueName: \"kubernetes.io/projected/6c1ab111-67ba-4e81-8879-e8c1d0b00b1f-kube-api-access-csdw7\") pod \"6c1ab111-67ba-4e81-8879-e8c1d0b00b1f\" (UID: \"6c1ab111-67ba-4e81-8879-e8c1d0b00b1f\") " Dec 16 08:39:00 crc kubenswrapper[4823]: I1216 08:39:00.507765 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c1ab111-67ba-4e81-8879-e8c1d0b00b1f-kube-api-access-csdw7" (OuterVolumeSpecName: "kube-api-access-csdw7") pod "6c1ab111-67ba-4e81-8879-e8c1d0b00b1f" (UID: "6c1ab111-67ba-4e81-8879-e8c1d0b00b1f"). InnerVolumeSpecName "kube-api-access-csdw7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:39:00 crc kubenswrapper[4823]: I1216 08:39:00.541971 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c1ab111-67ba-4e81-8879-e8c1d0b00b1f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6c1ab111-67ba-4e81-8879-e8c1d0b00b1f" (UID: "6c1ab111-67ba-4e81-8879-e8c1d0b00b1f"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:39:00 crc kubenswrapper[4823]: I1216 08:39:00.549983 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c1ab111-67ba-4e81-8879-e8c1d0b00b1f-config" (OuterVolumeSpecName: "config") pod "6c1ab111-67ba-4e81-8879-e8c1d0b00b1f" (UID: "6c1ab111-67ba-4e81-8879-e8c1d0b00b1f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:39:00 crc kubenswrapper[4823]: I1216 08:39:00.603345 4823 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6c1ab111-67ba-4e81-8879-e8c1d0b00b1f-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 16 08:39:00 crc kubenswrapper[4823]: I1216 08:39:00.603404 4823 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c1ab111-67ba-4e81-8879-e8c1d0b00b1f-config\") on node \"crc\" DevicePath \"\"" Dec 16 08:39:00 crc kubenswrapper[4823]: I1216 08:39:00.603418 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-csdw7\" (UniqueName: \"kubernetes.io/projected/6c1ab111-67ba-4e81-8879-e8c1d0b00b1f-kube-api-access-csdw7\") on node \"crc\" DevicePath \"\"" Dec 16 08:39:00 crc kubenswrapper[4823]: I1216 08:39:00.798833 4823 generic.go:334] "Generic (PLEG): container finished" podID="6c1ab111-67ba-4e81-8879-e8c1d0b00b1f" containerID="571def6ff2bdb7d5caa937749d7ab620e93779768ba8ecd5ac16fcce6a0d21aa" exitCode=0 Dec 16 08:39:00 crc kubenswrapper[4823]: I1216 08:39:00.798878 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6fdf89db6c-rtjsh" Dec 16 08:39:00 crc kubenswrapper[4823]: I1216 08:39:00.798968 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6fdf89db6c-rtjsh" event={"ID":"6c1ab111-67ba-4e81-8879-e8c1d0b00b1f","Type":"ContainerDied","Data":"571def6ff2bdb7d5caa937749d7ab620e93779768ba8ecd5ac16fcce6a0d21aa"} Dec 16 08:39:00 crc kubenswrapper[4823]: I1216 08:39:00.799019 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6fdf89db6c-rtjsh" event={"ID":"6c1ab111-67ba-4e81-8879-e8c1d0b00b1f","Type":"ContainerDied","Data":"f6a3e0753f6d55fcad970786e29cdde2dbbe538bee1258633a63b872f7c08bd9"} Dec 16 08:39:00 crc kubenswrapper[4823]: I1216 08:39:00.799055 4823 scope.go:117] "RemoveContainer" containerID="571def6ff2bdb7d5caa937749d7ab620e93779768ba8ecd5ac16fcce6a0d21aa" Dec 16 08:39:00 crc kubenswrapper[4823]: I1216 08:39:00.827429 4823 scope.go:117] "RemoveContainer" containerID="c384b77549257c2985ba3bd2bd093c58f79094b633967f54f55194e23a611537" Dec 16 08:39:00 crc kubenswrapper[4823]: I1216 08:39:00.829703 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6fdf89db6c-rtjsh"] Dec 16 08:39:00 crc kubenswrapper[4823]: I1216 08:39:00.836231 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6fdf89db6c-rtjsh"] Dec 16 08:39:00 crc kubenswrapper[4823]: I1216 08:39:00.846535 4823 scope.go:117] "RemoveContainer" containerID="571def6ff2bdb7d5caa937749d7ab620e93779768ba8ecd5ac16fcce6a0d21aa" Dec 16 08:39:00 crc kubenswrapper[4823]: E1216 08:39:00.846988 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"571def6ff2bdb7d5caa937749d7ab620e93779768ba8ecd5ac16fcce6a0d21aa\": container with ID starting with 571def6ff2bdb7d5caa937749d7ab620e93779768ba8ecd5ac16fcce6a0d21aa not found: ID does not exist" 
containerID="571def6ff2bdb7d5caa937749d7ab620e93779768ba8ecd5ac16fcce6a0d21aa" Dec 16 08:39:00 crc kubenswrapper[4823]: I1216 08:39:00.847023 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"571def6ff2bdb7d5caa937749d7ab620e93779768ba8ecd5ac16fcce6a0d21aa"} err="failed to get container status \"571def6ff2bdb7d5caa937749d7ab620e93779768ba8ecd5ac16fcce6a0d21aa\": rpc error: code = NotFound desc = could not find container \"571def6ff2bdb7d5caa937749d7ab620e93779768ba8ecd5ac16fcce6a0d21aa\": container with ID starting with 571def6ff2bdb7d5caa937749d7ab620e93779768ba8ecd5ac16fcce6a0d21aa not found: ID does not exist" Dec 16 08:39:00 crc kubenswrapper[4823]: I1216 08:39:00.847067 4823 scope.go:117] "RemoveContainer" containerID="c384b77549257c2985ba3bd2bd093c58f79094b633967f54f55194e23a611537" Dec 16 08:39:00 crc kubenswrapper[4823]: E1216 08:39:00.847420 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c384b77549257c2985ba3bd2bd093c58f79094b633967f54f55194e23a611537\": container with ID starting with c384b77549257c2985ba3bd2bd093c58f79094b633967f54f55194e23a611537 not found: ID does not exist" containerID="c384b77549257c2985ba3bd2bd093c58f79094b633967f54f55194e23a611537" Dec 16 08:39:00 crc kubenswrapper[4823]: I1216 08:39:00.847461 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c384b77549257c2985ba3bd2bd093c58f79094b633967f54f55194e23a611537"} err="failed to get container status \"c384b77549257c2985ba3bd2bd093c58f79094b633967f54f55194e23a611537\": rpc error: code = NotFound desc = could not find container \"c384b77549257c2985ba3bd2bd093c58f79094b633967f54f55194e23a611537\": container with ID starting with c384b77549257c2985ba3bd2bd093c58f79094b633967f54f55194e23a611537 not found: ID does not exist" Dec 16 08:39:01 crc kubenswrapper[4823]: I1216 08:39:01.779294 4823 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c1ab111-67ba-4e81-8879-e8c1d0b00b1f" path="/var/lib/kubelet/pods/6c1ab111-67ba-4e81-8879-e8c1d0b00b1f/volumes" Dec 16 08:39:06 crc kubenswrapper[4823]: I1216 08:39:06.883082 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Dec 16 08:39:06 crc kubenswrapper[4823]: I1216 08:39:06.885240 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Dec 16 08:39:08 crc kubenswrapper[4823]: I1216 08:39:08.531143 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Dec 16 08:39:08 crc kubenswrapper[4823]: I1216 08:39:08.582591 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Dec 16 08:39:08 crc kubenswrapper[4823]: I1216 08:39:08.611826 4823 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-cell1-galera-0" podUID="4496b25e-2f39-453a-aa60-ffa74e9913c8" containerName="galera" probeResult="failure" output=< Dec 16 08:39:08 crc kubenswrapper[4823]: wsrep_local_state_comment (Joined) differs from Synced Dec 16 08:39:08 crc kubenswrapper[4823]: > Dec 16 08:39:08 crc kubenswrapper[4823]: I1216 08:39:08.658449 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Dec 16 08:39:08 crc kubenswrapper[4823]: I1216 08:39:08.912937 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Dec 16 08:39:26 crc kubenswrapper[4823]: I1216 08:39:26.007101 4823 generic.go:334] "Generic (PLEG): container finished" podID="3dd01190-adb1-4224-9a3d-b6da96bad2e8" containerID="8eff4ec096d3a02f98a190c46e9b5a752230e3176744fe63e9fd95090303b585" exitCode=0 Dec 16 08:39:26 crc kubenswrapper[4823]: I1216 08:39:26.007240 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3dd01190-adb1-4224-9a3d-b6da96bad2e8","Type":"ContainerDied","Data":"8eff4ec096d3a02f98a190c46e9b5a752230e3176744fe63e9fd95090303b585"} Dec 16 08:39:27 crc kubenswrapper[4823]: I1216 08:39:27.016622 4823 generic.go:334] "Generic (PLEG): container finished" podID="c541c676-03e3-4756-bbfa-770b1d9c3712" containerID="dd454651131887859f882a2ddc94dd427d8d7dc13d843dd0887cc643e74061fd" exitCode=0 Dec 16 08:39:27 crc kubenswrapper[4823]: I1216 08:39:27.016715 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c541c676-03e3-4756-bbfa-770b1d9c3712","Type":"ContainerDied","Data":"dd454651131887859f882a2ddc94dd427d8d7dc13d843dd0887cc643e74061fd"} Dec 16 08:39:27 crc kubenswrapper[4823]: I1216 08:39:27.024511 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3dd01190-adb1-4224-9a3d-b6da96bad2e8","Type":"ContainerStarted","Data":"0e2231fe1099a36b67ffccb0764845b7ac30d09084441806b710c4416d98f791"} Dec 16 08:39:27 crc kubenswrapper[4823]: I1216 08:39:27.025920 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:39:28 crc kubenswrapper[4823]: I1216 08:39:28.035106 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c541c676-03e3-4756-bbfa-770b1d9c3712","Type":"ContainerStarted","Data":"14c4bd1e581701e98b6e3817b9960d1fa2696f6db0c16fe383c19ac5ad07c985"} Dec 16 08:39:28 crc kubenswrapper[4823]: I1216 08:39:28.036507 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 16 08:39:28 crc kubenswrapper[4823]: I1216 08:39:28.071625 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.869823887 podStartE2EDuration="1m4.071601248s" podCreationTimestamp="2025-12-16 08:38:24 +0000 UTC" 
firstStartedPulling="2025-12-16 08:38:26.616750876 +0000 UTC m=+6185.105316999" lastFinishedPulling="2025-12-16 08:38:52.818528237 +0000 UTC m=+6211.307094360" observedRunningTime="2025-12-16 08:39:27.079293039 +0000 UTC m=+6245.567859172" watchObservedRunningTime="2025-12-16 08:39:28.071601248 +0000 UTC m=+6246.560167361" Dec 16 08:39:28 crc kubenswrapper[4823]: I1216 08:39:28.075234 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.510631049 podStartE2EDuration="1m4.075201181s" podCreationTimestamp="2025-12-16 08:38:24 +0000 UTC" firstStartedPulling="2025-12-16 08:38:26.315959028 +0000 UTC m=+6184.804525151" lastFinishedPulling="2025-12-16 08:38:52.88052916 +0000 UTC m=+6211.369095283" observedRunningTime="2025-12-16 08:39:28.068279394 +0000 UTC m=+6246.556845537" watchObservedRunningTime="2025-12-16 08:39:28.075201181 +0000 UTC m=+6246.563767304" Dec 16 08:39:36 crc kubenswrapper[4823]: I1216 08:39:36.101435 4823 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="3dd01190-adb1-4224-9a3d-b6da96bad2e8" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.246:5671: connect: connection refused" Dec 16 08:39:45 crc kubenswrapper[4823]: I1216 08:39:45.797177 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 16 08:39:46 crc kubenswrapper[4823]: I1216 08:39:46.101400 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:39:51 crc kubenswrapper[4823]: I1216 08:39:51.706214 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55db7cd99c-62nfz"] Dec 16 08:39:51 crc kubenswrapper[4823]: E1216 08:39:51.707176 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c1ab111-67ba-4e81-8879-e8c1d0b00b1f" containerName="dnsmasq-dns" Dec 16 08:39:51 crc 
kubenswrapper[4823]: I1216 08:39:51.707196 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c1ab111-67ba-4e81-8879-e8c1d0b00b1f" containerName="dnsmasq-dns" Dec 16 08:39:51 crc kubenswrapper[4823]: E1216 08:39:51.707221 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c1ab111-67ba-4e81-8879-e8c1d0b00b1f" containerName="init" Dec 16 08:39:51 crc kubenswrapper[4823]: I1216 08:39:51.707229 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c1ab111-67ba-4e81-8879-e8c1d0b00b1f" containerName="init" Dec 16 08:39:51 crc kubenswrapper[4823]: I1216 08:39:51.707418 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c1ab111-67ba-4e81-8879-e8c1d0b00b1f" containerName="dnsmasq-dns" Dec 16 08:39:51 crc kubenswrapper[4823]: I1216 08:39:51.708450 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55db7cd99c-62nfz" Dec 16 08:39:51 crc kubenswrapper[4823]: I1216 08:39:51.718101 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55db7cd99c-62nfz"] Dec 16 08:39:51 crc kubenswrapper[4823]: I1216 08:39:51.780943 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4042708f-2c2b-4c71-adb8-c8c9a12c3284-dns-svc\") pod \"dnsmasq-dns-55db7cd99c-62nfz\" (UID: \"4042708f-2c2b-4c71-adb8-c8c9a12c3284\") " pod="openstack/dnsmasq-dns-55db7cd99c-62nfz" Dec 16 08:39:51 crc kubenswrapper[4823]: I1216 08:39:51.781004 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4042708f-2c2b-4c71-adb8-c8c9a12c3284-config\") pod \"dnsmasq-dns-55db7cd99c-62nfz\" (UID: \"4042708f-2c2b-4c71-adb8-c8c9a12c3284\") " pod="openstack/dnsmasq-dns-55db7cd99c-62nfz" Dec 16 08:39:51 crc kubenswrapper[4823]: I1216 08:39:51.781115 4823 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8l66b\" (UniqueName: \"kubernetes.io/projected/4042708f-2c2b-4c71-adb8-c8c9a12c3284-kube-api-access-8l66b\") pod \"dnsmasq-dns-55db7cd99c-62nfz\" (UID: \"4042708f-2c2b-4c71-adb8-c8c9a12c3284\") " pod="openstack/dnsmasq-dns-55db7cd99c-62nfz" Dec 16 08:39:51 crc kubenswrapper[4823]: I1216 08:39:51.882749 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8l66b\" (UniqueName: \"kubernetes.io/projected/4042708f-2c2b-4c71-adb8-c8c9a12c3284-kube-api-access-8l66b\") pod \"dnsmasq-dns-55db7cd99c-62nfz\" (UID: \"4042708f-2c2b-4c71-adb8-c8c9a12c3284\") " pod="openstack/dnsmasq-dns-55db7cd99c-62nfz" Dec 16 08:39:51 crc kubenswrapper[4823]: I1216 08:39:51.882870 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4042708f-2c2b-4c71-adb8-c8c9a12c3284-dns-svc\") pod \"dnsmasq-dns-55db7cd99c-62nfz\" (UID: \"4042708f-2c2b-4c71-adb8-c8c9a12c3284\") " pod="openstack/dnsmasq-dns-55db7cd99c-62nfz" Dec 16 08:39:51 crc kubenswrapper[4823]: I1216 08:39:51.882911 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4042708f-2c2b-4c71-adb8-c8c9a12c3284-config\") pod \"dnsmasq-dns-55db7cd99c-62nfz\" (UID: \"4042708f-2c2b-4c71-adb8-c8c9a12c3284\") " pod="openstack/dnsmasq-dns-55db7cd99c-62nfz" Dec 16 08:39:51 crc kubenswrapper[4823]: I1216 08:39:51.884248 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4042708f-2c2b-4c71-adb8-c8c9a12c3284-dns-svc\") pod \"dnsmasq-dns-55db7cd99c-62nfz\" (UID: \"4042708f-2c2b-4c71-adb8-c8c9a12c3284\") " pod="openstack/dnsmasq-dns-55db7cd99c-62nfz" Dec 16 08:39:51 crc kubenswrapper[4823]: I1216 08:39:51.884257 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/4042708f-2c2b-4c71-adb8-c8c9a12c3284-config\") pod \"dnsmasq-dns-55db7cd99c-62nfz\" (UID: \"4042708f-2c2b-4c71-adb8-c8c9a12c3284\") " pod="openstack/dnsmasq-dns-55db7cd99c-62nfz" Dec 16 08:39:51 crc kubenswrapper[4823]: I1216 08:39:51.904840 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8l66b\" (UniqueName: \"kubernetes.io/projected/4042708f-2c2b-4c71-adb8-c8c9a12c3284-kube-api-access-8l66b\") pod \"dnsmasq-dns-55db7cd99c-62nfz\" (UID: \"4042708f-2c2b-4c71-adb8-c8c9a12c3284\") " pod="openstack/dnsmasq-dns-55db7cd99c-62nfz" Dec 16 08:39:52 crc kubenswrapper[4823]: I1216 08:39:52.029167 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55db7cd99c-62nfz" Dec 16 08:39:52 crc kubenswrapper[4823]: I1216 08:39:52.329619 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 16 08:39:52 crc kubenswrapper[4823]: I1216 08:39:52.485003 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55db7cd99c-62nfz"] Dec 16 08:39:52 crc kubenswrapper[4823]: W1216 08:39:52.496611 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4042708f_2c2b_4c71_adb8_c8c9a12c3284.slice/crio-72755a305d84714f5026b4124d1492bd95a45e33fe4637363b02d8dda5287ad7 WatchSource:0}: Error finding container 72755a305d84714f5026b4124d1492bd95a45e33fe4637363b02d8dda5287ad7: Status 404 returned error can't find the container with id 72755a305d84714f5026b4124d1492bd95a45e33fe4637363b02d8dda5287ad7 Dec 16 08:39:53 crc kubenswrapper[4823]: I1216 08:39:53.146266 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 16 08:39:53 crc kubenswrapper[4823]: I1216 08:39:53.215938 4823 generic.go:334] "Generic (PLEG): container finished" podID="4042708f-2c2b-4c71-adb8-c8c9a12c3284" 
containerID="055c62ef347cd361e5bfe2a423fe2a269cb09bd4337b3aa1f7ab1e229c9dc996" exitCode=0 Dec 16 08:39:53 crc kubenswrapper[4823]: I1216 08:39:53.215989 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55db7cd99c-62nfz" event={"ID":"4042708f-2c2b-4c71-adb8-c8c9a12c3284","Type":"ContainerDied","Data":"055c62ef347cd361e5bfe2a423fe2a269cb09bd4337b3aa1f7ab1e229c9dc996"} Dec 16 08:39:53 crc kubenswrapper[4823]: I1216 08:39:53.216017 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55db7cd99c-62nfz" event={"ID":"4042708f-2c2b-4c71-adb8-c8c9a12c3284","Type":"ContainerStarted","Data":"72755a305d84714f5026b4124d1492bd95a45e33fe4637363b02d8dda5287ad7"} Dec 16 08:39:54 crc kubenswrapper[4823]: I1216 08:39:54.224168 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55db7cd99c-62nfz" event={"ID":"4042708f-2c2b-4c71-adb8-c8c9a12c3284","Type":"ContainerStarted","Data":"99ade9bb130226e2e9065e905ac5bf166a1e5dc8ccad306acdfb239b54e73335"} Dec 16 08:39:54 crc kubenswrapper[4823]: I1216 08:39:54.224495 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-55db7cd99c-62nfz" Dec 16 08:39:54 crc kubenswrapper[4823]: I1216 08:39:54.251257 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-55db7cd99c-62nfz" podStartSLOduration=3.251241236 podStartE2EDuration="3.251241236s" podCreationTimestamp="2025-12-16 08:39:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 08:39:54.244770102 +0000 UTC m=+6272.733336235" watchObservedRunningTime="2025-12-16 08:39:54.251241236 +0000 UTC m=+6272.739807359" Dec 16 08:39:56 crc kubenswrapper[4823]: I1216 08:39:56.784971 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="c541c676-03e3-4756-bbfa-770b1d9c3712" 
containerName="rabbitmq" containerID="cri-o://14c4bd1e581701e98b6e3817b9960d1fa2696f6db0c16fe383c19ac5ad07c985" gracePeriod=604796 Dec 16 08:39:57 crc kubenswrapper[4823]: I1216 08:39:57.271160 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="3dd01190-adb1-4224-9a3d-b6da96bad2e8" containerName="rabbitmq" containerID="cri-o://0e2231fe1099a36b67ffccb0764845b7ac30d09084441806b710c4416d98f791" gracePeriod=604796 Dec 16 08:40:02 crc kubenswrapper[4823]: I1216 08:40:02.030251 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-55db7cd99c-62nfz" Dec 16 08:40:02 crc kubenswrapper[4823]: I1216 08:40:02.084543 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57484c487-k9rhm"] Dec 16 08:40:02 crc kubenswrapper[4823]: I1216 08:40:02.084820 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57484c487-k9rhm" podUID="e0565c2a-5f25-4b81-b27d-764c8dc154b9" containerName="dnsmasq-dns" containerID="cri-o://bd090d7e4221f56585efd589621a160fafa31d2c93e35507533a0931abdec9a3" gracePeriod=10 Dec 16 08:40:02 crc kubenswrapper[4823]: I1216 08:40:02.292865 4823 generic.go:334] "Generic (PLEG): container finished" podID="e0565c2a-5f25-4b81-b27d-764c8dc154b9" containerID="bd090d7e4221f56585efd589621a160fafa31d2c93e35507533a0931abdec9a3" exitCode=0 Dec 16 08:40:02 crc kubenswrapper[4823]: I1216 08:40:02.292903 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57484c487-k9rhm" event={"ID":"e0565c2a-5f25-4b81-b27d-764c8dc154b9","Type":"ContainerDied","Data":"bd090d7e4221f56585efd589621a160fafa31d2c93e35507533a0931abdec9a3"} Dec 16 08:40:02 crc kubenswrapper[4823]: I1216 08:40:02.495089 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57484c487-k9rhm" Dec 16 08:40:02 crc kubenswrapper[4823]: I1216 08:40:02.552819 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pl5gs\" (UniqueName: \"kubernetes.io/projected/e0565c2a-5f25-4b81-b27d-764c8dc154b9-kube-api-access-pl5gs\") pod \"e0565c2a-5f25-4b81-b27d-764c8dc154b9\" (UID: \"e0565c2a-5f25-4b81-b27d-764c8dc154b9\") " Dec 16 08:40:02 crc kubenswrapper[4823]: I1216 08:40:02.553002 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0565c2a-5f25-4b81-b27d-764c8dc154b9-config\") pod \"e0565c2a-5f25-4b81-b27d-764c8dc154b9\" (UID: \"e0565c2a-5f25-4b81-b27d-764c8dc154b9\") " Dec 16 08:40:02 crc kubenswrapper[4823]: I1216 08:40:02.553075 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e0565c2a-5f25-4b81-b27d-764c8dc154b9-dns-svc\") pod \"e0565c2a-5f25-4b81-b27d-764c8dc154b9\" (UID: \"e0565c2a-5f25-4b81-b27d-764c8dc154b9\") " Dec 16 08:40:02 crc kubenswrapper[4823]: I1216 08:40:02.558438 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0565c2a-5f25-4b81-b27d-764c8dc154b9-kube-api-access-pl5gs" (OuterVolumeSpecName: "kube-api-access-pl5gs") pod "e0565c2a-5f25-4b81-b27d-764c8dc154b9" (UID: "e0565c2a-5f25-4b81-b27d-764c8dc154b9"). InnerVolumeSpecName "kube-api-access-pl5gs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:40:02 crc kubenswrapper[4823]: I1216 08:40:02.596818 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0565c2a-5f25-4b81-b27d-764c8dc154b9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e0565c2a-5f25-4b81-b27d-764c8dc154b9" (UID: "e0565c2a-5f25-4b81-b27d-764c8dc154b9"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:40:02 crc kubenswrapper[4823]: I1216 08:40:02.602091 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0565c2a-5f25-4b81-b27d-764c8dc154b9-config" (OuterVolumeSpecName: "config") pod "e0565c2a-5f25-4b81-b27d-764c8dc154b9" (UID: "e0565c2a-5f25-4b81-b27d-764c8dc154b9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:40:02 crc kubenswrapper[4823]: I1216 08:40:02.655202 4823 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e0565c2a-5f25-4b81-b27d-764c8dc154b9-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 16 08:40:02 crc kubenswrapper[4823]: I1216 08:40:02.655243 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pl5gs\" (UniqueName: \"kubernetes.io/projected/e0565c2a-5f25-4b81-b27d-764c8dc154b9-kube-api-access-pl5gs\") on node \"crc\" DevicePath \"\"" Dec 16 08:40:02 crc kubenswrapper[4823]: I1216 08:40:02.655279 4823 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0565c2a-5f25-4b81-b27d-764c8dc154b9-config\") on node \"crc\" DevicePath \"\"" Dec 16 08:40:03 crc kubenswrapper[4823]: I1216 08:40:03.300643 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57484c487-k9rhm" event={"ID":"e0565c2a-5f25-4b81-b27d-764c8dc154b9","Type":"ContainerDied","Data":"e4a9b591a6c465261ac6151c96e3ec2bb782a0f62ae85bd2f9bb87b568533b5c"} Dec 16 08:40:03 crc kubenswrapper[4823]: I1216 08:40:03.300683 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57484c487-k9rhm" Dec 16 08:40:03 crc kubenswrapper[4823]: I1216 08:40:03.300708 4823 scope.go:117] "RemoveContainer" containerID="bd090d7e4221f56585efd589621a160fafa31d2c93e35507533a0931abdec9a3" Dec 16 08:40:03 crc kubenswrapper[4823]: I1216 08:40:03.303391 4823 generic.go:334] "Generic (PLEG): container finished" podID="c541c676-03e3-4756-bbfa-770b1d9c3712" containerID="14c4bd1e581701e98b6e3817b9960d1fa2696f6db0c16fe383c19ac5ad07c985" exitCode=0 Dec 16 08:40:03 crc kubenswrapper[4823]: I1216 08:40:03.303427 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c541c676-03e3-4756-bbfa-770b1d9c3712","Type":"ContainerDied","Data":"14c4bd1e581701e98b6e3817b9960d1fa2696f6db0c16fe383c19ac5ad07c985"} Dec 16 08:40:03 crc kubenswrapper[4823]: I1216 08:40:03.320397 4823 scope.go:117] "RemoveContainer" containerID="1ef96a6489a70f037006955f18c093f97b851e5f52bcc2976737b4bac159c91b" Dec 16 08:40:03 crc kubenswrapper[4823]: I1216 08:40:03.330751 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57484c487-k9rhm"] Dec 16 08:40:03 crc kubenswrapper[4823]: I1216 08:40:03.337239 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57484c487-k9rhm"] Dec 16 08:40:03 crc kubenswrapper[4823]: I1216 08:40:03.641496 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 16 08:40:03 crc kubenswrapper[4823]: I1216 08:40:03.781423 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0565c2a-5f25-4b81-b27d-764c8dc154b9" path="/var/lib/kubelet/pods/e0565c2a-5f25-4b81-b27d-764c8dc154b9/volumes" Dec 16 08:40:03 crc kubenswrapper[4823]: I1216 08:40:03.784549 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c541c676-03e3-4756-bbfa-770b1d9c3712-server-conf\") pod \"c541c676-03e3-4756-bbfa-770b1d9c3712\" (UID: \"c541c676-03e3-4756-bbfa-770b1d9c3712\") " Dec 16 08:40:03 crc kubenswrapper[4823]: I1216 08:40:03.784596 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c541c676-03e3-4756-bbfa-770b1d9c3712-rabbitmq-tls\") pod \"c541c676-03e3-4756-bbfa-770b1d9c3712\" (UID: \"c541c676-03e3-4756-bbfa-770b1d9c3712\") " Dec 16 08:40:03 crc kubenswrapper[4823]: I1216 08:40:03.784660 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c541c676-03e3-4756-bbfa-770b1d9c3712-rabbitmq-plugins\") pod \"c541c676-03e3-4756-bbfa-770b1d9c3712\" (UID: \"c541c676-03e3-4756-bbfa-770b1d9c3712\") " Dec 16 08:40:03 crc kubenswrapper[4823]: I1216 08:40:03.784714 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c541c676-03e3-4756-bbfa-770b1d9c3712-plugins-conf\") pod \"c541c676-03e3-4756-bbfa-770b1d9c3712\" (UID: \"c541c676-03e3-4756-bbfa-770b1d9c3712\") " Dec 16 08:40:03 crc kubenswrapper[4823]: I1216 08:40:03.784743 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8z9jx\" (UniqueName: \"kubernetes.io/projected/c541c676-03e3-4756-bbfa-770b1d9c3712-kube-api-access-8z9jx\") pod 
\"c541c676-03e3-4756-bbfa-770b1d9c3712\" (UID: \"c541c676-03e3-4756-bbfa-770b1d9c3712\") " Dec 16 08:40:03 crc kubenswrapper[4823]: I1216 08:40:03.784773 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c541c676-03e3-4756-bbfa-770b1d9c3712-erlang-cookie-secret\") pod \"c541c676-03e3-4756-bbfa-770b1d9c3712\" (UID: \"c541c676-03e3-4756-bbfa-770b1d9c3712\") " Dec 16 08:40:03 crc kubenswrapper[4823]: I1216 08:40:03.784804 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c541c676-03e3-4756-bbfa-770b1d9c3712-rabbitmq-erlang-cookie\") pod \"c541c676-03e3-4756-bbfa-770b1d9c3712\" (UID: \"c541c676-03e3-4756-bbfa-770b1d9c3712\") " Dec 16 08:40:03 crc kubenswrapper[4823]: I1216 08:40:03.784841 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c541c676-03e3-4756-bbfa-770b1d9c3712-config-data\") pod \"c541c676-03e3-4756-bbfa-770b1d9c3712\" (UID: \"c541c676-03e3-4756-bbfa-770b1d9c3712\") " Dec 16 08:40:03 crc kubenswrapper[4823]: I1216 08:40:03.784870 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c541c676-03e3-4756-bbfa-770b1d9c3712-pod-info\") pod \"c541c676-03e3-4756-bbfa-770b1d9c3712\" (UID: \"c541c676-03e3-4756-bbfa-770b1d9c3712\") " Dec 16 08:40:03 crc kubenswrapper[4823]: I1216 08:40:03.785009 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-43028e97-28b0-43cc-9122-0ff68d03ac47\") pod \"c541c676-03e3-4756-bbfa-770b1d9c3712\" (UID: \"c541c676-03e3-4756-bbfa-770b1d9c3712\") " Dec 16 08:40:03 crc kubenswrapper[4823]: I1216 08:40:03.785069 4823 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c541c676-03e3-4756-bbfa-770b1d9c3712-rabbitmq-confd\") pod \"c541c676-03e3-4756-bbfa-770b1d9c3712\" (UID: \"c541c676-03e3-4756-bbfa-770b1d9c3712\") " Dec 16 08:40:03 crc kubenswrapper[4823]: I1216 08:40:03.785154 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c541c676-03e3-4756-bbfa-770b1d9c3712-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "c541c676-03e3-4756-bbfa-770b1d9c3712" (UID: "c541c676-03e3-4756-bbfa-770b1d9c3712"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 08:40:03 crc kubenswrapper[4823]: I1216 08:40:03.785396 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c541c676-03e3-4756-bbfa-770b1d9c3712-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "c541c676-03e3-4756-bbfa-770b1d9c3712" (UID: "c541c676-03e3-4756-bbfa-770b1d9c3712"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 08:40:03 crc kubenswrapper[4823]: I1216 08:40:03.785405 4823 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c541c676-03e3-4756-bbfa-770b1d9c3712-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 16 08:40:03 crc kubenswrapper[4823]: I1216 08:40:03.785640 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c541c676-03e3-4756-bbfa-770b1d9c3712-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "c541c676-03e3-4756-bbfa-770b1d9c3712" (UID: "c541c676-03e3-4756-bbfa-770b1d9c3712"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:40:03 crc kubenswrapper[4823]: I1216 08:40:03.791752 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c541c676-03e3-4756-bbfa-770b1d9c3712-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "c541c676-03e3-4756-bbfa-770b1d9c3712" (UID: "c541c676-03e3-4756-bbfa-770b1d9c3712"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:40:03 crc kubenswrapper[4823]: I1216 08:40:03.791845 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c541c676-03e3-4756-bbfa-770b1d9c3712-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "c541c676-03e3-4756-bbfa-770b1d9c3712" (UID: "c541c676-03e3-4756-bbfa-770b1d9c3712"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:40:03 crc kubenswrapper[4823]: I1216 08:40:03.792698 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c541c676-03e3-4756-bbfa-770b1d9c3712-kube-api-access-8z9jx" (OuterVolumeSpecName: "kube-api-access-8z9jx") pod "c541c676-03e3-4756-bbfa-770b1d9c3712" (UID: "c541c676-03e3-4756-bbfa-770b1d9c3712"). InnerVolumeSpecName "kube-api-access-8z9jx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:40:03 crc kubenswrapper[4823]: I1216 08:40:03.812246 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/c541c676-03e3-4756-bbfa-770b1d9c3712-pod-info" (OuterVolumeSpecName: "pod-info") pod "c541c676-03e3-4756-bbfa-770b1d9c3712" (UID: "c541c676-03e3-4756-bbfa-770b1d9c3712"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 16 08:40:03 crc kubenswrapper[4823]: I1216 08:40:03.814586 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-43028e97-28b0-43cc-9122-0ff68d03ac47" (OuterVolumeSpecName: "persistence") pod "c541c676-03e3-4756-bbfa-770b1d9c3712" (UID: "c541c676-03e3-4756-bbfa-770b1d9c3712"). InnerVolumeSpecName "pvc-43028e97-28b0-43cc-9122-0ff68d03ac47". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 16 08:40:03 crc kubenswrapper[4823]: I1216 08:40:03.827231 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c541c676-03e3-4756-bbfa-770b1d9c3712-config-data" (OuterVolumeSpecName: "config-data") pod "c541c676-03e3-4756-bbfa-770b1d9c3712" (UID: "c541c676-03e3-4756-bbfa-770b1d9c3712"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:40:03 crc kubenswrapper[4823]: I1216 08:40:03.832573 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c541c676-03e3-4756-bbfa-770b1d9c3712-server-conf" (OuterVolumeSpecName: "server-conf") pod "c541c676-03e3-4756-bbfa-770b1d9c3712" (UID: "c541c676-03e3-4756-bbfa-770b1d9c3712"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:40:03 crc kubenswrapper[4823]: I1216 08:40:03.881600 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c541c676-03e3-4756-bbfa-770b1d9c3712-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "c541c676-03e3-4756-bbfa-770b1d9c3712" (UID: "c541c676-03e3-4756-bbfa-770b1d9c3712"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:40:03 crc kubenswrapper[4823]: I1216 08:40:03.886474 4823 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c541c676-03e3-4756-bbfa-770b1d9c3712-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 16 08:40:03 crc kubenswrapper[4823]: I1216 08:40:03.886520 4823 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c541c676-03e3-4756-bbfa-770b1d9c3712-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 16 08:40:03 crc kubenswrapper[4823]: I1216 08:40:03.886541 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c541c676-03e3-4756-bbfa-770b1d9c3712-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 08:40:03 crc kubenswrapper[4823]: I1216 08:40:03.886557 4823 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c541c676-03e3-4756-bbfa-770b1d9c3712-pod-info\") on node \"crc\" DevicePath \"\"" Dec 16 08:40:03 crc kubenswrapper[4823]: I1216 08:40:03.886636 4823 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-43028e97-28b0-43cc-9122-0ff68d03ac47\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-43028e97-28b0-43cc-9122-0ff68d03ac47\") on node \"crc\" " Dec 16 08:40:03 crc kubenswrapper[4823]: I1216 08:40:03.886657 4823 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c541c676-03e3-4756-bbfa-770b1d9c3712-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 16 08:40:03 crc kubenswrapper[4823]: I1216 08:40:03.886674 4823 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c541c676-03e3-4756-bbfa-770b1d9c3712-server-conf\") on node \"crc\" DevicePath \"\"" Dec 16 
08:40:03 crc kubenswrapper[4823]: I1216 08:40:03.886689 4823 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c541c676-03e3-4756-bbfa-770b1d9c3712-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 16 08:40:03 crc kubenswrapper[4823]: I1216 08:40:03.886704 4823 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c541c676-03e3-4756-bbfa-770b1d9c3712-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 16 08:40:03 crc kubenswrapper[4823]: I1216 08:40:03.886718 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8z9jx\" (UniqueName: \"kubernetes.io/projected/c541c676-03e3-4756-bbfa-770b1d9c3712-kube-api-access-8z9jx\") on node \"crc\" DevicePath \"\"" Dec 16 08:40:03 crc kubenswrapper[4823]: I1216 08:40:03.905127 4823 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Dec 16 08:40:03 crc kubenswrapper[4823]: I1216 08:40:03.905388 4823 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-43028e97-28b0-43cc-9122-0ff68d03ac47" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-43028e97-28b0-43cc-9122-0ff68d03ac47") on node "crc" Dec 16 08:40:03 crc kubenswrapper[4823]: I1216 08:40:03.943881 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:40:03 crc kubenswrapper[4823]: I1216 08:40:03.987649 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-crkdq\" (UniqueName: \"kubernetes.io/projected/3dd01190-adb1-4224-9a3d-b6da96bad2e8-kube-api-access-crkdq\") pod \"3dd01190-adb1-4224-9a3d-b6da96bad2e8\" (UID: \"3dd01190-adb1-4224-9a3d-b6da96bad2e8\") " Dec 16 08:40:03 crc kubenswrapper[4823]: I1216 08:40:03.987709 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3dd01190-adb1-4224-9a3d-b6da96bad2e8-rabbitmq-confd\") pod \"3dd01190-adb1-4224-9a3d-b6da96bad2e8\" (UID: \"3dd01190-adb1-4224-9a3d-b6da96bad2e8\") " Dec 16 08:40:03 crc kubenswrapper[4823]: I1216 08:40:03.987753 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3dd01190-adb1-4224-9a3d-b6da96bad2e8-rabbitmq-tls\") pod \"3dd01190-adb1-4224-9a3d-b6da96bad2e8\" (UID: \"3dd01190-adb1-4224-9a3d-b6da96bad2e8\") " Dec 16 08:40:03 crc kubenswrapper[4823]: I1216 08:40:03.987784 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3dd01190-adb1-4224-9a3d-b6da96bad2e8-config-data\") pod \"3dd01190-adb1-4224-9a3d-b6da96bad2e8\" (UID: \"3dd01190-adb1-4224-9a3d-b6da96bad2e8\") " Dec 16 08:40:03 crc kubenswrapper[4823]: I1216 08:40:03.987809 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3dd01190-adb1-4224-9a3d-b6da96bad2e8-rabbitmq-erlang-cookie\") pod \"3dd01190-adb1-4224-9a3d-b6da96bad2e8\" (UID: \"3dd01190-adb1-4224-9a3d-b6da96bad2e8\") " Dec 16 08:40:03 crc kubenswrapper[4823]: I1216 08:40:03.987887 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3dd01190-adb1-4224-9a3d-b6da96bad2e8-server-conf\") pod \"3dd01190-adb1-4224-9a3d-b6da96bad2e8\" (UID: \"3dd01190-adb1-4224-9a3d-b6da96bad2e8\") " Dec 16 08:40:03 crc kubenswrapper[4823]: I1216 08:40:03.987919 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3dd01190-adb1-4224-9a3d-b6da96bad2e8-pod-info\") pod \"3dd01190-adb1-4224-9a3d-b6da96bad2e8\" (UID: \"3dd01190-adb1-4224-9a3d-b6da96bad2e8\") " Dec 16 08:40:03 crc kubenswrapper[4823]: I1216 08:40:03.988094 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0c2f342f-45ca-4f81-be5a-cc9b87688928\") pod \"3dd01190-adb1-4224-9a3d-b6da96bad2e8\" (UID: \"3dd01190-adb1-4224-9a3d-b6da96bad2e8\") " Dec 16 08:40:03 crc kubenswrapper[4823]: I1216 08:40:03.988136 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3dd01190-adb1-4224-9a3d-b6da96bad2e8-rabbitmq-plugins\") pod \"3dd01190-adb1-4224-9a3d-b6da96bad2e8\" (UID: \"3dd01190-adb1-4224-9a3d-b6da96bad2e8\") " Dec 16 08:40:03 crc kubenswrapper[4823]: I1216 08:40:03.988166 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3dd01190-adb1-4224-9a3d-b6da96bad2e8-erlang-cookie-secret\") pod \"3dd01190-adb1-4224-9a3d-b6da96bad2e8\" (UID: \"3dd01190-adb1-4224-9a3d-b6da96bad2e8\") " Dec 16 08:40:03 crc kubenswrapper[4823]: I1216 08:40:03.988199 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3dd01190-adb1-4224-9a3d-b6da96bad2e8-plugins-conf\") pod \"3dd01190-adb1-4224-9a3d-b6da96bad2e8\" (UID: \"3dd01190-adb1-4224-9a3d-b6da96bad2e8\") " Dec 16 
08:40:03 crc kubenswrapper[4823]: I1216 08:40:03.988535 4823 reconciler_common.go:293] "Volume detached for volume \"pvc-43028e97-28b0-43cc-9122-0ff68d03ac47\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-43028e97-28b0-43cc-9122-0ff68d03ac47\") on node \"crc\" DevicePath \"\"" Dec 16 08:40:03 crc kubenswrapper[4823]: I1216 08:40:03.989297 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3dd01190-adb1-4224-9a3d-b6da96bad2e8-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "3dd01190-adb1-4224-9a3d-b6da96bad2e8" (UID: "3dd01190-adb1-4224-9a3d-b6da96bad2e8"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:40:03 crc kubenswrapper[4823]: I1216 08:40:03.989779 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3dd01190-adb1-4224-9a3d-b6da96bad2e8-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "3dd01190-adb1-4224-9a3d-b6da96bad2e8" (UID: "3dd01190-adb1-4224-9a3d-b6da96bad2e8"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 08:40:03 crc kubenswrapper[4823]: I1216 08:40:03.990126 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3dd01190-adb1-4224-9a3d-b6da96bad2e8-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "3dd01190-adb1-4224-9a3d-b6da96bad2e8" (UID: "3dd01190-adb1-4224-9a3d-b6da96bad2e8"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.003409 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/3dd01190-adb1-4224-9a3d-b6da96bad2e8-pod-info" (OuterVolumeSpecName: "pod-info") pod "3dd01190-adb1-4224-9a3d-b6da96bad2e8" (UID: "3dd01190-adb1-4224-9a3d-b6da96bad2e8"). 
InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.003446 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3dd01190-adb1-4224-9a3d-b6da96bad2e8-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "3dd01190-adb1-4224-9a3d-b6da96bad2e8" (UID: "3dd01190-adb1-4224-9a3d-b6da96bad2e8"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.003459 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3dd01190-adb1-4224-9a3d-b6da96bad2e8-kube-api-access-crkdq" (OuterVolumeSpecName: "kube-api-access-crkdq") pod "3dd01190-adb1-4224-9a3d-b6da96bad2e8" (UID: "3dd01190-adb1-4224-9a3d-b6da96bad2e8"). InnerVolumeSpecName "kube-api-access-crkdq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.003530 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3dd01190-adb1-4224-9a3d-b6da96bad2e8-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "3dd01190-adb1-4224-9a3d-b6da96bad2e8" (UID: "3dd01190-adb1-4224-9a3d-b6da96bad2e8"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.003685 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0c2f342f-45ca-4f81-be5a-cc9b87688928" (OuterVolumeSpecName: "persistence") pod "3dd01190-adb1-4224-9a3d-b6da96bad2e8" (UID: "3dd01190-adb1-4224-9a3d-b6da96bad2e8"). InnerVolumeSpecName "pvc-0c2f342f-45ca-4f81-be5a-cc9b87688928". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.032389 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3dd01190-adb1-4224-9a3d-b6da96bad2e8-config-data" (OuterVolumeSpecName: "config-data") pod "3dd01190-adb1-4224-9a3d-b6da96bad2e8" (UID: "3dd01190-adb1-4224-9a3d-b6da96bad2e8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.041971 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3dd01190-adb1-4224-9a3d-b6da96bad2e8-server-conf" (OuterVolumeSpecName: "server-conf") pod "3dd01190-adb1-4224-9a3d-b6da96bad2e8" (UID: "3dd01190-adb1-4224-9a3d-b6da96bad2e8"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.070834 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3dd01190-adb1-4224-9a3d-b6da96bad2e8-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "3dd01190-adb1-4224-9a3d-b6da96bad2e8" (UID: "3dd01190-adb1-4224-9a3d-b6da96bad2e8"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.089739 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-crkdq\" (UniqueName: \"kubernetes.io/projected/3dd01190-adb1-4224-9a3d-b6da96bad2e8-kube-api-access-crkdq\") on node \"crc\" DevicePath \"\"" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.090139 4823 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3dd01190-adb1-4224-9a3d-b6da96bad2e8-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.090152 4823 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3dd01190-adb1-4224-9a3d-b6da96bad2e8-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.090163 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3dd01190-adb1-4224-9a3d-b6da96bad2e8-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.090174 4823 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3dd01190-adb1-4224-9a3d-b6da96bad2e8-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.090184 4823 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3dd01190-adb1-4224-9a3d-b6da96bad2e8-server-conf\") on node \"crc\" DevicePath \"\"" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.090195 4823 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3dd01190-adb1-4224-9a3d-b6da96bad2e8-pod-info\") on node \"crc\" DevicePath \"\"" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.090264 
4823 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-0c2f342f-45ca-4f81-be5a-cc9b87688928\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0c2f342f-45ca-4f81-be5a-cc9b87688928\") on node \"crc\" " Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.090278 4823 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3dd01190-adb1-4224-9a3d-b6da96bad2e8-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.090290 4823 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3dd01190-adb1-4224-9a3d-b6da96bad2e8-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.090303 4823 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3dd01190-adb1-4224-9a3d-b6da96bad2e8-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.111282 4823 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.111462 4823 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-0c2f342f-45ca-4f81-be5a-cc9b87688928" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0c2f342f-45ca-4f81-be5a-cc9b87688928") on node "crc" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.195831 4823 reconciler_common.go:293] "Volume detached for volume \"pvc-0c2f342f-45ca-4f81-be5a-cc9b87688928\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0c2f342f-45ca-4f81-be5a-cc9b87688928\") on node \"crc\" DevicePath \"\"" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.311853 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c541c676-03e3-4756-bbfa-770b1d9c3712","Type":"ContainerDied","Data":"5fc71f18a11c235f26a8e442f2496563cd0c3f45fb0682805164e91f1e68c3a0"} Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.311914 4823 scope.go:117] "RemoveContainer" containerID="14c4bd1e581701e98b6e3817b9960d1fa2696f6db0c16fe383c19ac5ad07c985" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.312010 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.317720 4823 generic.go:334] "Generic (PLEG): container finished" podID="3dd01190-adb1-4224-9a3d-b6da96bad2e8" containerID="0e2231fe1099a36b67ffccb0764845b7ac30d09084441806b710c4416d98f791" exitCode=0 Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.317783 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3dd01190-adb1-4224-9a3d-b6da96bad2e8","Type":"ContainerDied","Data":"0e2231fe1099a36b67ffccb0764845b7ac30d09084441806b710c4416d98f791"} Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.317810 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3dd01190-adb1-4224-9a3d-b6da96bad2e8","Type":"ContainerDied","Data":"518afd86843d9b668ac6fd907073b9fd7c373d81f6d2cda667c8ff4335710d21"} Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.317836 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.341147 4823 scope.go:117] "RemoveContainer" containerID="dd454651131887859f882a2ddc94dd427d8d7dc13d843dd0887cc643e74061fd" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.366160 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.372829 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.377129 4823 scope.go:117] "RemoveContainer" containerID="0e2231fe1099a36b67ffccb0764845b7ac30d09084441806b710c4416d98f791" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.385068 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.393050 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.401216 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 16 08:40:04 crc kubenswrapper[4823]: E1216 08:40:04.401532 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0565c2a-5f25-4b81-b27d-764c8dc154b9" containerName="init" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.401545 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0565c2a-5f25-4b81-b27d-764c8dc154b9" containerName="init" Dec 16 08:40:04 crc kubenswrapper[4823]: E1216 08:40:04.401558 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0565c2a-5f25-4b81-b27d-764c8dc154b9" containerName="dnsmasq-dns" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.401565 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0565c2a-5f25-4b81-b27d-764c8dc154b9" containerName="dnsmasq-dns" Dec 16 08:40:04 crc 
kubenswrapper[4823]: E1216 08:40:04.401577 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c541c676-03e3-4756-bbfa-770b1d9c3712" containerName="rabbitmq" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.401584 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="c541c676-03e3-4756-bbfa-770b1d9c3712" containerName="rabbitmq" Dec 16 08:40:04 crc kubenswrapper[4823]: E1216 08:40:04.401597 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c541c676-03e3-4756-bbfa-770b1d9c3712" containerName="setup-container" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.401605 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="c541c676-03e3-4756-bbfa-770b1d9c3712" containerName="setup-container" Dec 16 08:40:04 crc kubenswrapper[4823]: E1216 08:40:04.401621 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3dd01190-adb1-4224-9a3d-b6da96bad2e8" containerName="setup-container" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.401626 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="3dd01190-adb1-4224-9a3d-b6da96bad2e8" containerName="setup-container" Dec 16 08:40:04 crc kubenswrapper[4823]: E1216 08:40:04.401642 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3dd01190-adb1-4224-9a3d-b6da96bad2e8" containerName="rabbitmq" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.401648 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="3dd01190-adb1-4224-9a3d-b6da96bad2e8" containerName="rabbitmq" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.401776 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="c541c676-03e3-4756-bbfa-770b1d9c3712" containerName="rabbitmq" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.401790 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="3dd01190-adb1-4224-9a3d-b6da96bad2e8" containerName="rabbitmq" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.401801 4823 
memory_manager.go:354] "RemoveStaleState removing state" podUID="e0565c2a-5f25-4b81-b27d-764c8dc154b9" containerName="dnsmasq-dns" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.402617 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.405666 4823 scope.go:117] "RemoveContainer" containerID="8eff4ec096d3a02f98a190c46e9b5a752230e3176744fe63e9fd95090303b585" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.408193 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.408729 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.408761 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.409222 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.409249 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.409344 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.410056 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-2twnw" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.411321 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.412678 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.415112 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.415270 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.415459 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.415509 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.415582 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.415650 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.415709 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-lz7bq" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.422484 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.438233 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.443494 4823 scope.go:117] "RemoveContainer" containerID="0e2231fe1099a36b67ffccb0764845b7ac30d09084441806b710c4416d98f791" Dec 16 08:40:04 crc kubenswrapper[4823]: E1216 08:40:04.447042 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"0e2231fe1099a36b67ffccb0764845b7ac30d09084441806b710c4416d98f791\": container with ID starting with 0e2231fe1099a36b67ffccb0764845b7ac30d09084441806b710c4416d98f791 not found: ID does not exist" containerID="0e2231fe1099a36b67ffccb0764845b7ac30d09084441806b710c4416d98f791" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.447078 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e2231fe1099a36b67ffccb0764845b7ac30d09084441806b710c4416d98f791"} err="failed to get container status \"0e2231fe1099a36b67ffccb0764845b7ac30d09084441806b710c4416d98f791\": rpc error: code = NotFound desc = could not find container \"0e2231fe1099a36b67ffccb0764845b7ac30d09084441806b710c4416d98f791\": container with ID starting with 0e2231fe1099a36b67ffccb0764845b7ac30d09084441806b710c4416d98f791 not found: ID does not exist" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.447102 4823 scope.go:117] "RemoveContainer" containerID="8eff4ec096d3a02f98a190c46e9b5a752230e3176744fe63e9fd95090303b585" Dec 16 08:40:04 crc kubenswrapper[4823]: E1216 08:40:04.447359 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8eff4ec096d3a02f98a190c46e9b5a752230e3176744fe63e9fd95090303b585\": container with ID starting with 8eff4ec096d3a02f98a190c46e9b5a752230e3176744fe63e9fd95090303b585 not found: ID does not exist" containerID="8eff4ec096d3a02f98a190c46e9b5a752230e3176744fe63e9fd95090303b585" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.447375 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8eff4ec096d3a02f98a190c46e9b5a752230e3176744fe63e9fd95090303b585"} err="failed to get container status \"8eff4ec096d3a02f98a190c46e9b5a752230e3176744fe63e9fd95090303b585\": rpc error: code = NotFound desc = could not find container \"8eff4ec096d3a02f98a190c46e9b5a752230e3176744fe63e9fd95090303b585\": container with ID 
starting with 8eff4ec096d3a02f98a190c46e9b5a752230e3176744fe63e9fd95090303b585 not found: ID does not exist" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.498950 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7\") " pod="openstack/rabbitmq-server-0" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.499115 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-0c2f342f-45ca-4f81-be5a-cc9b87688928\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0c2f342f-45ca-4f81-be5a-cc9b87688928\") pod \"rabbitmq-cell1-server-0\" (UID: \"bf14ab2c-212b-406f-b102-2a4b8a7a29f5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.499150 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7\") " pod="openstack/rabbitmq-server-0" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.499176 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7-server-conf\") pod \"rabbitmq-server-0\" (UID: \"cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7\") " pod="openstack/rabbitmq-server-0" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.499204 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bf14ab2c-212b-406f-b102-2a4b8a7a29f5-server-conf\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"bf14ab2c-212b-406f-b102-2a4b8a7a29f5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.499241 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-43028e97-28b0-43cc-9122-0ff68d03ac47\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-43028e97-28b0-43cc-9122-0ff68d03ac47\") pod \"rabbitmq-server-0\" (UID: \"cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7\") " pod="openstack/rabbitmq-server-0" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.499264 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bf14ab2c-212b-406f-b102-2a4b8a7a29f5-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"bf14ab2c-212b-406f-b102-2a4b8a7a29f5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.499289 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bf14ab2c-212b-406f-b102-2a4b8a7a29f5-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"bf14ab2c-212b-406f-b102-2a4b8a7a29f5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.499311 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7\") " pod="openstack/rabbitmq-server-0" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.499464 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7\") " pod="openstack/rabbitmq-server-0" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.499512 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6bpx\" (UniqueName: \"kubernetes.io/projected/cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7-kube-api-access-k6bpx\") pod \"rabbitmq-server-0\" (UID: \"cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7\") " pod="openstack/rabbitmq-server-0" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.499555 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bf14ab2c-212b-406f-b102-2a4b8a7a29f5-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"bf14ab2c-212b-406f-b102-2a4b8a7a29f5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.499616 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bf14ab2c-212b-406f-b102-2a4b8a7a29f5-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"bf14ab2c-212b-406f-b102-2a4b8a7a29f5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.499642 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bf14ab2c-212b-406f-b102-2a4b8a7a29f5-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"bf14ab2c-212b-406f-b102-2a4b8a7a29f5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.499675 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jc7hd\" (UniqueName: 
\"kubernetes.io/projected/bf14ab2c-212b-406f-b102-2a4b8a7a29f5-kube-api-access-jc7hd\") pod \"rabbitmq-cell1-server-0\" (UID: \"bf14ab2c-212b-406f-b102-2a4b8a7a29f5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.499732 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bf14ab2c-212b-406f-b102-2a4b8a7a29f5-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"bf14ab2c-212b-406f-b102-2a4b8a7a29f5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.499798 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bf14ab2c-212b-406f-b102-2a4b8a7a29f5-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"bf14ab2c-212b-406f-b102-2a4b8a7a29f5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.499829 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/bf14ab2c-212b-406f-b102-2a4b8a7a29f5-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"bf14ab2c-212b-406f-b102-2a4b8a7a29f5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.499858 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7\") " pod="openstack/rabbitmq-server-0" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.499895 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7\") " pod="openstack/rabbitmq-server-0" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.499939 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7-config-data\") pod \"rabbitmq-server-0\" (UID: \"cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7\") " pod="openstack/rabbitmq-server-0" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.499964 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7-pod-info\") pod \"rabbitmq-server-0\" (UID: \"cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7\") " pod="openstack/rabbitmq-server-0" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.601453 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7\") " pod="openstack/rabbitmq-server-0" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.601538 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-0c2f342f-45ca-4f81-be5a-cc9b87688928\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0c2f342f-45ca-4f81-be5a-cc9b87688928\") pod \"rabbitmq-cell1-server-0\" (UID: \"bf14ab2c-212b-406f-b102-2a4b8a7a29f5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.601574 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7-rabbitmq-plugins\") pod 
\"rabbitmq-server-0\" (UID: \"cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7\") " pod="openstack/rabbitmq-server-0" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.601600 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7-server-conf\") pod \"rabbitmq-server-0\" (UID: \"cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7\") " pod="openstack/rabbitmq-server-0" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.601629 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bf14ab2c-212b-406f-b102-2a4b8a7a29f5-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"bf14ab2c-212b-406f-b102-2a4b8a7a29f5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.601661 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-43028e97-28b0-43cc-9122-0ff68d03ac47\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-43028e97-28b0-43cc-9122-0ff68d03ac47\") pod \"rabbitmq-server-0\" (UID: \"cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7\") " pod="openstack/rabbitmq-server-0" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.601683 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bf14ab2c-212b-406f-b102-2a4b8a7a29f5-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"bf14ab2c-212b-406f-b102-2a4b8a7a29f5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.601710 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bf14ab2c-212b-406f-b102-2a4b8a7a29f5-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"bf14ab2c-212b-406f-b102-2a4b8a7a29f5\") " 
pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.601760 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7\") " pod="openstack/rabbitmq-server-0" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.601784 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7\") " pod="openstack/rabbitmq-server-0" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.601812 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k6bpx\" (UniqueName: \"kubernetes.io/projected/cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7-kube-api-access-k6bpx\") pod \"rabbitmq-server-0\" (UID: \"cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7\") " pod="openstack/rabbitmq-server-0" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.601838 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bf14ab2c-212b-406f-b102-2a4b8a7a29f5-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"bf14ab2c-212b-406f-b102-2a4b8a7a29f5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.601871 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bf14ab2c-212b-406f-b102-2a4b8a7a29f5-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"bf14ab2c-212b-406f-b102-2a4b8a7a29f5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.601896 4823 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bf14ab2c-212b-406f-b102-2a4b8a7a29f5-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"bf14ab2c-212b-406f-b102-2a4b8a7a29f5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.601924 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jc7hd\" (UniqueName: \"kubernetes.io/projected/bf14ab2c-212b-406f-b102-2a4b8a7a29f5-kube-api-access-jc7hd\") pod \"rabbitmq-cell1-server-0\" (UID: \"bf14ab2c-212b-406f-b102-2a4b8a7a29f5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.601946 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bf14ab2c-212b-406f-b102-2a4b8a7a29f5-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"bf14ab2c-212b-406f-b102-2a4b8a7a29f5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.601976 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bf14ab2c-212b-406f-b102-2a4b8a7a29f5-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"bf14ab2c-212b-406f-b102-2a4b8a7a29f5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.601993 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/bf14ab2c-212b-406f-b102-2a4b8a7a29f5-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"bf14ab2c-212b-406f-b102-2a4b8a7a29f5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.602011 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7\") " pod="openstack/rabbitmq-server-0" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.602045 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7\") " pod="openstack/rabbitmq-server-0" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.602070 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7-config-data\") pod \"rabbitmq-server-0\" (UID: \"cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7\") " pod="openstack/rabbitmq-server-0" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.602089 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7-pod-info\") pod \"rabbitmq-server-0\" (UID: \"cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7\") " pod="openstack/rabbitmq-server-0" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.602576 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bf14ab2c-212b-406f-b102-2a4b8a7a29f5-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"bf14ab2c-212b-406f-b102-2a4b8a7a29f5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.602619 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bf14ab2c-212b-406f-b102-2a4b8a7a29f5-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"bf14ab2c-212b-406f-b102-2a4b8a7a29f5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.603614 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bf14ab2c-212b-406f-b102-2a4b8a7a29f5-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"bf14ab2c-212b-406f-b102-2a4b8a7a29f5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.603700 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7\") " pod="openstack/rabbitmq-server-0" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.603971 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bf14ab2c-212b-406f-b102-2a4b8a7a29f5-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"bf14ab2c-212b-406f-b102-2a4b8a7a29f5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.604470 4823 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.604483 4823 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.604505 4823 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-43028e97-28b0-43cc-9122-0ff68d03ac47\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-43028e97-28b0-43cc-9122-0ff68d03ac47\") pod \"rabbitmq-server-0\" (UID: \"cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/32117fdd5ecf1abacaad99c35b93f20179735b8b5200ad830b36f632cf8604dd/globalmount\"" pod="openstack/rabbitmq-server-0" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.604519 4823 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-0c2f342f-45ca-4f81-be5a-cc9b87688928\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0c2f342f-45ca-4f81-be5a-cc9b87688928\") pod \"rabbitmq-cell1-server-0\" (UID: \"bf14ab2c-212b-406f-b102-2a4b8a7a29f5\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/425127be2ca6cf7b0df5a079ff52c2fb95106fbffb7f1066620281ae7e3b9826/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.604480 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7-config-data\") pod \"rabbitmq-server-0\" (UID: \"cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7\") " pod="openstack/rabbitmq-server-0" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.604753 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bf14ab2c-212b-406f-b102-2a4b8a7a29f5-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"bf14ab2c-212b-406f-b102-2a4b8a7a29f5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.604772 4823 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7-server-conf\") pod \"rabbitmq-server-0\" (UID: \"cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7\") " pod="openstack/rabbitmq-server-0" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.607460 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7\") " pod="openstack/rabbitmq-server-0" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.608121 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bf14ab2c-212b-406f-b102-2a4b8a7a29f5-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"bf14ab2c-212b-406f-b102-2a4b8a7a29f5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.608215 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7\") " pod="openstack/rabbitmq-server-0" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.608251 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7\") " pod="openstack/rabbitmq-server-0" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.609088 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bf14ab2c-212b-406f-b102-2a4b8a7a29f5-pod-info\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"bf14ab2c-212b-406f-b102-2a4b8a7a29f5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.610134 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bf14ab2c-212b-406f-b102-2a4b8a7a29f5-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"bf14ab2c-212b-406f-b102-2a4b8a7a29f5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.617743 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7-pod-info\") pod \"rabbitmq-server-0\" (UID: \"cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7\") " pod="openstack/rabbitmq-server-0" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.623293 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jc7hd\" (UniqueName: \"kubernetes.io/projected/bf14ab2c-212b-406f-b102-2a4b8a7a29f5-kube-api-access-jc7hd\") pod \"rabbitmq-cell1-server-0\" (UID: \"bf14ab2c-212b-406f-b102-2a4b8a7a29f5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.623453 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/bf14ab2c-212b-406f-b102-2a4b8a7a29f5-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"bf14ab2c-212b-406f-b102-2a4b8a7a29f5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.623894 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7\") " pod="openstack/rabbitmq-server-0" Dec 16 08:40:04 crc kubenswrapper[4823]: 
I1216 08:40:04.623901 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7\") " pod="openstack/rabbitmq-server-0" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.624774 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6bpx\" (UniqueName: \"kubernetes.io/projected/cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7-kube-api-access-k6bpx\") pod \"rabbitmq-server-0\" (UID: \"cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7\") " pod="openstack/rabbitmq-server-0" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.654313 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-0c2f342f-45ca-4f81-be5a-cc9b87688928\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0c2f342f-45ca-4f81-be5a-cc9b87688928\") pod \"rabbitmq-cell1-server-0\" (UID: \"bf14ab2c-212b-406f-b102-2a4b8a7a29f5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.655650 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-43028e97-28b0-43cc-9122-0ff68d03ac47\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-43028e97-28b0-43cc-9122-0ff68d03ac47\") pod \"rabbitmq-server-0\" (UID: \"cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7\") " pod="openstack/rabbitmq-server-0" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.759047 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 16 08:40:04 crc kubenswrapper[4823]: I1216 08:40:04.772578 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:40:05 crc kubenswrapper[4823]: I1216 08:40:05.222368 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 16 08:40:05 crc kubenswrapper[4823]: I1216 08:40:05.268891 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 16 08:40:05 crc kubenswrapper[4823]: I1216 08:40:05.327404 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"bf14ab2c-212b-406f-b102-2a4b8a7a29f5","Type":"ContainerStarted","Data":"c294eafbea9c890227fd9ce45cfa0707928e1be74231bf94729d59d733b88a4a"} Dec 16 08:40:05 crc kubenswrapper[4823]: I1216 08:40:05.329827 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7","Type":"ContainerStarted","Data":"86a94d8cfb482dfbcf39e835cfac93099d2b5eec84a5a12e62c12d196edc07db"} Dec 16 08:40:05 crc kubenswrapper[4823]: I1216 08:40:05.783779 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3dd01190-adb1-4224-9a3d-b6da96bad2e8" path="/var/lib/kubelet/pods/3dd01190-adb1-4224-9a3d-b6da96bad2e8/volumes" Dec 16 08:40:05 crc kubenswrapper[4823]: I1216 08:40:05.785947 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c541c676-03e3-4756-bbfa-770b1d9c3712" path="/var/lib/kubelet/pods/c541c676-03e3-4756-bbfa-770b1d9c3712/volumes" Dec 16 08:40:07 crc kubenswrapper[4823]: I1216 08:40:07.354722 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7","Type":"ContainerStarted","Data":"9021e9fbfdca4e8c3bc783918228e38655a4e6f41df2ee2221badc7c3cbd77ae"} Dec 16 08:40:07 crc kubenswrapper[4823]: I1216 08:40:07.356822 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"bf14ab2c-212b-406f-b102-2a4b8a7a29f5","Type":"ContainerStarted","Data":"4429a4325a93631f06371dd2afacb4eb2d4fb8581157516905df32e1ec8033bf"} Dec 16 08:40:17 crc kubenswrapper[4823]: I1216 08:40:17.414498 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-lj7lb"] Dec 16 08:40:17 crc kubenswrapper[4823]: I1216 08:40:17.416918 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lj7lb" Dec 16 08:40:17 crc kubenswrapper[4823]: I1216 08:40:17.431582 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lj7lb"] Dec 16 08:40:17 crc kubenswrapper[4823]: I1216 08:40:17.546273 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgw2l\" (UniqueName: \"kubernetes.io/projected/ff738828-d13b-4b49-a235-009530c09d0b-kube-api-access-dgw2l\") pod \"community-operators-lj7lb\" (UID: \"ff738828-d13b-4b49-a235-009530c09d0b\") " pod="openshift-marketplace/community-operators-lj7lb" Dec 16 08:40:17 crc kubenswrapper[4823]: I1216 08:40:17.546326 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff738828-d13b-4b49-a235-009530c09d0b-utilities\") pod \"community-operators-lj7lb\" (UID: \"ff738828-d13b-4b49-a235-009530c09d0b\") " pod="openshift-marketplace/community-operators-lj7lb" Dec 16 08:40:17 crc kubenswrapper[4823]: I1216 08:40:17.546632 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff738828-d13b-4b49-a235-009530c09d0b-catalog-content\") pod \"community-operators-lj7lb\" (UID: \"ff738828-d13b-4b49-a235-009530c09d0b\") " pod="openshift-marketplace/community-operators-lj7lb" Dec 16 08:40:17 crc kubenswrapper[4823]: I1216 08:40:17.647879 
4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff738828-d13b-4b49-a235-009530c09d0b-catalog-content\") pod \"community-operators-lj7lb\" (UID: \"ff738828-d13b-4b49-a235-009530c09d0b\") " pod="openshift-marketplace/community-operators-lj7lb" Dec 16 08:40:17 crc kubenswrapper[4823]: I1216 08:40:17.647963 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dgw2l\" (UniqueName: \"kubernetes.io/projected/ff738828-d13b-4b49-a235-009530c09d0b-kube-api-access-dgw2l\") pod \"community-operators-lj7lb\" (UID: \"ff738828-d13b-4b49-a235-009530c09d0b\") " pod="openshift-marketplace/community-operators-lj7lb" Dec 16 08:40:17 crc kubenswrapper[4823]: I1216 08:40:17.648001 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff738828-d13b-4b49-a235-009530c09d0b-utilities\") pod \"community-operators-lj7lb\" (UID: \"ff738828-d13b-4b49-a235-009530c09d0b\") " pod="openshift-marketplace/community-operators-lj7lb" Dec 16 08:40:17 crc kubenswrapper[4823]: I1216 08:40:17.648486 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff738828-d13b-4b49-a235-009530c09d0b-catalog-content\") pod \"community-operators-lj7lb\" (UID: \"ff738828-d13b-4b49-a235-009530c09d0b\") " pod="openshift-marketplace/community-operators-lj7lb" Dec 16 08:40:17 crc kubenswrapper[4823]: I1216 08:40:17.648548 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff738828-d13b-4b49-a235-009530c09d0b-utilities\") pod \"community-operators-lj7lb\" (UID: \"ff738828-d13b-4b49-a235-009530c09d0b\") " pod="openshift-marketplace/community-operators-lj7lb" Dec 16 08:40:17 crc kubenswrapper[4823]: I1216 08:40:17.666998 4823 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-dgw2l\" (UniqueName: \"kubernetes.io/projected/ff738828-d13b-4b49-a235-009530c09d0b-kube-api-access-dgw2l\") pod \"community-operators-lj7lb\" (UID: \"ff738828-d13b-4b49-a235-009530c09d0b\") " pod="openshift-marketplace/community-operators-lj7lb" Dec 16 08:40:17 crc kubenswrapper[4823]: I1216 08:40:17.737804 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lj7lb" Dec 16 08:40:18 crc kubenswrapper[4823]: I1216 08:40:18.022564 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lj7lb"] Dec 16 08:40:18 crc kubenswrapper[4823]: I1216 08:40:18.439462 4823 generic.go:334] "Generic (PLEG): container finished" podID="ff738828-d13b-4b49-a235-009530c09d0b" containerID="df5f749314b71bb33e44e84070d0a4a517af36c241c4ad443c1592bca381316a" exitCode=0 Dec 16 08:40:18 crc kubenswrapper[4823]: I1216 08:40:18.439713 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lj7lb" event={"ID":"ff738828-d13b-4b49-a235-009530c09d0b","Type":"ContainerDied","Data":"df5f749314b71bb33e44e84070d0a4a517af36c241c4ad443c1592bca381316a"} Dec 16 08:40:18 crc kubenswrapper[4823]: I1216 08:40:18.439737 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lj7lb" event={"ID":"ff738828-d13b-4b49-a235-009530c09d0b","Type":"ContainerStarted","Data":"a4bf961e5f5507fb2a1339a11d4b64d5d2dd9ad72b651451707d58236782943e"} Dec 16 08:40:20 crc kubenswrapper[4823]: I1216 08:40:20.457699 4823 generic.go:334] "Generic (PLEG): container finished" podID="ff738828-d13b-4b49-a235-009530c09d0b" containerID="0511b0fd1043822e5dce7ec8be854f25e3e32ff3c3e9b2bf07be5dc7c0067b67" exitCode=0 Dec 16 08:40:20 crc kubenswrapper[4823]: I1216 08:40:20.457800 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lj7lb" 
event={"ID":"ff738828-d13b-4b49-a235-009530c09d0b","Type":"ContainerDied","Data":"0511b0fd1043822e5dce7ec8be854f25e3e32ff3c3e9b2bf07be5dc7c0067b67"} Dec 16 08:40:21 crc kubenswrapper[4823]: I1216 08:40:21.467974 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lj7lb" event={"ID":"ff738828-d13b-4b49-a235-009530c09d0b","Type":"ContainerStarted","Data":"70b78fada47bb5e44f9a19e68bdfb94dbedecbf9966c52957ba94affc8276955"} Dec 16 08:40:21 crc kubenswrapper[4823]: I1216 08:40:21.490978 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-lj7lb" podStartSLOduration=1.8901133639999999 podStartE2EDuration="4.490958195s" podCreationTimestamp="2025-12-16 08:40:17 +0000 UTC" firstStartedPulling="2025-12-16 08:40:18.441345429 +0000 UTC m=+6296.929911552" lastFinishedPulling="2025-12-16 08:40:21.04219023 +0000 UTC m=+6299.530756383" observedRunningTime="2025-12-16 08:40:21.485842035 +0000 UTC m=+6299.974408158" watchObservedRunningTime="2025-12-16 08:40:21.490958195 +0000 UTC m=+6299.979524318" Dec 16 08:40:27 crc kubenswrapper[4823]: I1216 08:40:27.738950 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-lj7lb" Dec 16 08:40:27 crc kubenswrapper[4823]: I1216 08:40:27.739497 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-lj7lb" Dec 16 08:40:27 crc kubenswrapper[4823]: I1216 08:40:27.788630 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-lj7lb" Dec 16 08:40:28 crc kubenswrapper[4823]: I1216 08:40:28.133493 4823 patch_prober.go:28] interesting pod/machine-config-daemon-fv56f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: 
connect: connection refused" start-of-body= Dec 16 08:40:28 crc kubenswrapper[4823]: I1216 08:40:28.133553 4823 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 08:40:28 crc kubenswrapper[4823]: I1216 08:40:28.567724 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-lj7lb" Dec 16 08:40:28 crc kubenswrapper[4823]: I1216 08:40:28.625709 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lj7lb"] Dec 16 08:40:30 crc kubenswrapper[4823]: I1216 08:40:30.537891 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-lj7lb" podUID="ff738828-d13b-4b49-a235-009530c09d0b" containerName="registry-server" containerID="cri-o://70b78fada47bb5e44f9a19e68bdfb94dbedecbf9966c52957ba94affc8276955" gracePeriod=2 Dec 16 08:40:31 crc kubenswrapper[4823]: I1216 08:40:31.548110 4823 generic.go:334] "Generic (PLEG): container finished" podID="ff738828-d13b-4b49-a235-009530c09d0b" containerID="70b78fada47bb5e44f9a19e68bdfb94dbedecbf9966c52957ba94affc8276955" exitCode=0 Dec 16 08:40:31 crc kubenswrapper[4823]: I1216 08:40:31.548204 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lj7lb" event={"ID":"ff738828-d13b-4b49-a235-009530c09d0b","Type":"ContainerDied","Data":"70b78fada47bb5e44f9a19e68bdfb94dbedecbf9966c52957ba94affc8276955"} Dec 16 08:40:32 crc kubenswrapper[4823]: I1216 08:40:32.128820 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lj7lb" Dec 16 08:40:32 crc kubenswrapper[4823]: I1216 08:40:32.290598 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff738828-d13b-4b49-a235-009530c09d0b-utilities\") pod \"ff738828-d13b-4b49-a235-009530c09d0b\" (UID: \"ff738828-d13b-4b49-a235-009530c09d0b\") " Dec 16 08:40:32 crc kubenswrapper[4823]: I1216 08:40:32.290725 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dgw2l\" (UniqueName: \"kubernetes.io/projected/ff738828-d13b-4b49-a235-009530c09d0b-kube-api-access-dgw2l\") pod \"ff738828-d13b-4b49-a235-009530c09d0b\" (UID: \"ff738828-d13b-4b49-a235-009530c09d0b\") " Dec 16 08:40:32 crc kubenswrapper[4823]: I1216 08:40:32.290771 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff738828-d13b-4b49-a235-009530c09d0b-catalog-content\") pod \"ff738828-d13b-4b49-a235-009530c09d0b\" (UID: \"ff738828-d13b-4b49-a235-009530c09d0b\") " Dec 16 08:40:32 crc kubenswrapper[4823]: I1216 08:40:32.291674 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff738828-d13b-4b49-a235-009530c09d0b-utilities" (OuterVolumeSpecName: "utilities") pod "ff738828-d13b-4b49-a235-009530c09d0b" (UID: "ff738828-d13b-4b49-a235-009530c09d0b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 08:40:32 crc kubenswrapper[4823]: I1216 08:40:32.304146 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff738828-d13b-4b49-a235-009530c09d0b-kube-api-access-dgw2l" (OuterVolumeSpecName: "kube-api-access-dgw2l") pod "ff738828-d13b-4b49-a235-009530c09d0b" (UID: "ff738828-d13b-4b49-a235-009530c09d0b"). InnerVolumeSpecName "kube-api-access-dgw2l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:40:32 crc kubenswrapper[4823]: I1216 08:40:32.342820 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff738828-d13b-4b49-a235-009530c09d0b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ff738828-d13b-4b49-a235-009530c09d0b" (UID: "ff738828-d13b-4b49-a235-009530c09d0b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 08:40:32 crc kubenswrapper[4823]: I1216 08:40:32.393135 4823 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff738828-d13b-4b49-a235-009530c09d0b-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 08:40:32 crc kubenswrapper[4823]: I1216 08:40:32.393175 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dgw2l\" (UniqueName: \"kubernetes.io/projected/ff738828-d13b-4b49-a235-009530c09d0b-kube-api-access-dgw2l\") on node \"crc\" DevicePath \"\"" Dec 16 08:40:32 crc kubenswrapper[4823]: I1216 08:40:32.393189 4823 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff738828-d13b-4b49-a235-009530c09d0b-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 08:40:32 crc kubenswrapper[4823]: I1216 08:40:32.557724 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lj7lb" event={"ID":"ff738828-d13b-4b49-a235-009530c09d0b","Type":"ContainerDied","Data":"a4bf961e5f5507fb2a1339a11d4b64d5d2dd9ad72b651451707d58236782943e"} Dec 16 08:40:32 crc kubenswrapper[4823]: I1216 08:40:32.557773 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lj7lb" Dec 16 08:40:32 crc kubenswrapper[4823]: I1216 08:40:32.557785 4823 scope.go:117] "RemoveContainer" containerID="70b78fada47bb5e44f9a19e68bdfb94dbedecbf9966c52957ba94affc8276955" Dec 16 08:40:32 crc kubenswrapper[4823]: I1216 08:40:32.576131 4823 scope.go:117] "RemoveContainer" containerID="0511b0fd1043822e5dce7ec8be854f25e3e32ff3c3e9b2bf07be5dc7c0067b67" Dec 16 08:40:32 crc kubenswrapper[4823]: I1216 08:40:32.589649 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lj7lb"] Dec 16 08:40:32 crc kubenswrapper[4823]: I1216 08:40:32.596867 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-lj7lb"] Dec 16 08:40:32 crc kubenswrapper[4823]: I1216 08:40:32.614489 4823 scope.go:117] "RemoveContainer" containerID="df5f749314b71bb33e44e84070d0a4a517af36c241c4ad443c1592bca381316a" Dec 16 08:40:33 crc kubenswrapper[4823]: I1216 08:40:33.788646 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff738828-d13b-4b49-a235-009530c09d0b" path="/var/lib/kubelet/pods/ff738828-d13b-4b49-a235-009530c09d0b/volumes" Dec 16 08:40:36 crc kubenswrapper[4823]: I1216 08:40:36.922052 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-vsblm"] Dec 16 08:40:36 crc kubenswrapper[4823]: E1216 08:40:36.922731 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff738828-d13b-4b49-a235-009530c09d0b" containerName="extract-content" Dec 16 08:40:36 crc kubenswrapper[4823]: I1216 08:40:36.922748 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff738828-d13b-4b49-a235-009530c09d0b" containerName="extract-content" Dec 16 08:40:36 crc kubenswrapper[4823]: E1216 08:40:36.922766 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff738828-d13b-4b49-a235-009530c09d0b" containerName="registry-server" Dec 16 08:40:36 
crc kubenswrapper[4823]: I1216 08:40:36.922774 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff738828-d13b-4b49-a235-009530c09d0b" containerName="registry-server" Dec 16 08:40:36 crc kubenswrapper[4823]: E1216 08:40:36.922793 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff738828-d13b-4b49-a235-009530c09d0b" containerName="extract-utilities" Dec 16 08:40:36 crc kubenswrapper[4823]: I1216 08:40:36.922800 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff738828-d13b-4b49-a235-009530c09d0b" containerName="extract-utilities" Dec 16 08:40:36 crc kubenswrapper[4823]: I1216 08:40:36.922986 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff738828-d13b-4b49-a235-009530c09d0b" containerName="registry-server" Dec 16 08:40:36 crc kubenswrapper[4823]: I1216 08:40:36.931431 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vsblm" Dec 16 08:40:36 crc kubenswrapper[4823]: I1216 08:40:36.936572 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vsblm"] Dec 16 08:40:37 crc kubenswrapper[4823]: I1216 08:40:37.062570 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xjmr\" (UniqueName: \"kubernetes.io/projected/e533fcdc-f5a8-4a1b-9ab7-7947969f9111-kube-api-access-4xjmr\") pod \"certified-operators-vsblm\" (UID: \"e533fcdc-f5a8-4a1b-9ab7-7947969f9111\") " pod="openshift-marketplace/certified-operators-vsblm" Dec 16 08:40:37 crc kubenswrapper[4823]: I1216 08:40:37.062689 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e533fcdc-f5a8-4a1b-9ab7-7947969f9111-utilities\") pod \"certified-operators-vsblm\" (UID: \"e533fcdc-f5a8-4a1b-9ab7-7947969f9111\") " pod="openshift-marketplace/certified-operators-vsblm" Dec 16 08:40:37 crc 
kubenswrapper[4823]: I1216 08:40:37.062713 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e533fcdc-f5a8-4a1b-9ab7-7947969f9111-catalog-content\") pod \"certified-operators-vsblm\" (UID: \"e533fcdc-f5a8-4a1b-9ab7-7947969f9111\") " pod="openshift-marketplace/certified-operators-vsblm" Dec 16 08:40:37 crc kubenswrapper[4823]: I1216 08:40:37.164240 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xjmr\" (UniqueName: \"kubernetes.io/projected/e533fcdc-f5a8-4a1b-9ab7-7947969f9111-kube-api-access-4xjmr\") pod \"certified-operators-vsblm\" (UID: \"e533fcdc-f5a8-4a1b-9ab7-7947969f9111\") " pod="openshift-marketplace/certified-operators-vsblm" Dec 16 08:40:37 crc kubenswrapper[4823]: I1216 08:40:37.164376 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e533fcdc-f5a8-4a1b-9ab7-7947969f9111-utilities\") pod \"certified-operators-vsblm\" (UID: \"e533fcdc-f5a8-4a1b-9ab7-7947969f9111\") " pod="openshift-marketplace/certified-operators-vsblm" Dec 16 08:40:37 crc kubenswrapper[4823]: I1216 08:40:37.164400 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e533fcdc-f5a8-4a1b-9ab7-7947969f9111-catalog-content\") pod \"certified-operators-vsblm\" (UID: \"e533fcdc-f5a8-4a1b-9ab7-7947969f9111\") " pod="openshift-marketplace/certified-operators-vsblm" Dec 16 08:40:37 crc kubenswrapper[4823]: I1216 08:40:37.164820 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e533fcdc-f5a8-4a1b-9ab7-7947969f9111-utilities\") pod \"certified-operators-vsblm\" (UID: \"e533fcdc-f5a8-4a1b-9ab7-7947969f9111\") " pod="openshift-marketplace/certified-operators-vsblm" Dec 16 08:40:37 crc 
kubenswrapper[4823]: I1216 08:40:37.164895 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e533fcdc-f5a8-4a1b-9ab7-7947969f9111-catalog-content\") pod \"certified-operators-vsblm\" (UID: \"e533fcdc-f5a8-4a1b-9ab7-7947969f9111\") " pod="openshift-marketplace/certified-operators-vsblm" Dec 16 08:40:37 crc kubenswrapper[4823]: I1216 08:40:37.185602 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xjmr\" (UniqueName: \"kubernetes.io/projected/e533fcdc-f5a8-4a1b-9ab7-7947969f9111-kube-api-access-4xjmr\") pod \"certified-operators-vsblm\" (UID: \"e533fcdc-f5a8-4a1b-9ab7-7947969f9111\") " pod="openshift-marketplace/certified-operators-vsblm" Dec 16 08:40:37 crc kubenswrapper[4823]: I1216 08:40:37.255771 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vsblm" Dec 16 08:40:37 crc kubenswrapper[4823]: I1216 08:40:37.750702 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vsblm"] Dec 16 08:40:38 crc kubenswrapper[4823]: I1216 08:40:38.598056 4823 generic.go:334] "Generic (PLEG): container finished" podID="e533fcdc-f5a8-4a1b-9ab7-7947969f9111" containerID="346ed6599035ab87278f3034ea44d19b97766d91a91a4508be84adeeaa5173be" exitCode=0 Dec 16 08:40:38 crc kubenswrapper[4823]: I1216 08:40:38.598191 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vsblm" event={"ID":"e533fcdc-f5a8-4a1b-9ab7-7947969f9111","Type":"ContainerDied","Data":"346ed6599035ab87278f3034ea44d19b97766d91a91a4508be84adeeaa5173be"} Dec 16 08:40:38 crc kubenswrapper[4823]: I1216 08:40:38.598365 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vsblm" 
event={"ID":"e533fcdc-f5a8-4a1b-9ab7-7947969f9111","Type":"ContainerStarted","Data":"bd1920cc9443db7d16e681157c1c69fad86543aafff4996ec6a652ea2096dec9"} Dec 16 08:40:39 crc kubenswrapper[4823]: I1216 08:40:39.607597 4823 generic.go:334] "Generic (PLEG): container finished" podID="cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7" containerID="9021e9fbfdca4e8c3bc783918228e38655a4e6f41df2ee2221badc7c3cbd77ae" exitCode=0 Dec 16 08:40:39 crc kubenswrapper[4823]: I1216 08:40:39.607727 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7","Type":"ContainerDied","Data":"9021e9fbfdca4e8c3bc783918228e38655a4e6f41df2ee2221badc7c3cbd77ae"} Dec 16 08:40:39 crc kubenswrapper[4823]: I1216 08:40:39.609802 4823 generic.go:334] "Generic (PLEG): container finished" podID="bf14ab2c-212b-406f-b102-2a4b8a7a29f5" containerID="4429a4325a93631f06371dd2afacb4eb2d4fb8581157516905df32e1ec8033bf" exitCode=0 Dec 16 08:40:39 crc kubenswrapper[4823]: I1216 08:40:39.609844 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"bf14ab2c-212b-406f-b102-2a4b8a7a29f5","Type":"ContainerDied","Data":"4429a4325a93631f06371dd2afacb4eb2d4fb8581157516905df32e1ec8033bf"} Dec 16 08:40:40 crc kubenswrapper[4823]: I1216 08:40:40.619448 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"bf14ab2c-212b-406f-b102-2a4b8a7a29f5","Type":"ContainerStarted","Data":"aeb46928562b9e0657f49ff73daa201ffbf7d9ddbfda61724d79baa281b28aab"} Dec 16 08:40:40 crc kubenswrapper[4823]: I1216 08:40:40.620298 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:40:40 crc kubenswrapper[4823]: I1216 08:40:40.621450 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7","Type":"ContainerStarted","Data":"49741b7980cd55e4afdfdbd68688aadb6380c0d69a35239bfa62de2454502776"} Dec 16 08:40:40 crc kubenswrapper[4823]: I1216 08:40:40.621993 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 16 08:40:40 crc kubenswrapper[4823]: I1216 08:40:40.624329 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vsblm" event={"ID":"e533fcdc-f5a8-4a1b-9ab7-7947969f9111","Type":"ContainerStarted","Data":"08496c7eaefac191288ba5197dfc079528f208232929859208d9703b4d453fd4"} Dec 16 08:40:40 crc kubenswrapper[4823]: I1216 08:40:40.642921 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=36.642898653 podStartE2EDuration="36.642898653s" podCreationTimestamp="2025-12-16 08:40:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 08:40:40.641058756 +0000 UTC m=+6319.129624899" watchObservedRunningTime="2025-12-16 08:40:40.642898653 +0000 UTC m=+6319.131464776" Dec 16 08:40:40 crc kubenswrapper[4823]: I1216 08:40:40.695012 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=36.694988965 podStartE2EDuration="36.694988965s" podCreationTimestamp="2025-12-16 08:40:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 08:40:40.691726204 +0000 UTC m=+6319.180292337" watchObservedRunningTime="2025-12-16 08:40:40.694988965 +0000 UTC m=+6319.183555088" Dec 16 08:40:41 crc kubenswrapper[4823]: I1216 08:40:41.631985 4823 generic.go:334] "Generic (PLEG): container finished" podID="e533fcdc-f5a8-4a1b-9ab7-7947969f9111" 
containerID="08496c7eaefac191288ba5197dfc079528f208232929859208d9703b4d453fd4" exitCode=0 Dec 16 08:40:41 crc kubenswrapper[4823]: I1216 08:40:41.633127 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vsblm" event={"ID":"e533fcdc-f5a8-4a1b-9ab7-7947969f9111","Type":"ContainerDied","Data":"08496c7eaefac191288ba5197dfc079528f208232929859208d9703b4d453fd4"} Dec 16 08:40:43 crc kubenswrapper[4823]: I1216 08:40:43.649652 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vsblm" event={"ID":"e533fcdc-f5a8-4a1b-9ab7-7947969f9111","Type":"ContainerStarted","Data":"1840a85318ebd6c21867fb4e94b18ffd422b3fe4af9765ff0d248553392bbb4e"} Dec 16 08:40:43 crc kubenswrapper[4823]: I1216 08:40:43.676122 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-vsblm" podStartSLOduration=3.666285906 podStartE2EDuration="7.676101694s" podCreationTimestamp="2025-12-16 08:40:36 +0000 UTC" firstStartedPulling="2025-12-16 08:40:38.600183594 +0000 UTC m=+6317.088749717" lastFinishedPulling="2025-12-16 08:40:42.609999382 +0000 UTC m=+6321.098565505" observedRunningTime="2025-12-16 08:40:43.669170507 +0000 UTC m=+6322.157736640" watchObservedRunningTime="2025-12-16 08:40:43.676101694 +0000 UTC m=+6322.164667837" Dec 16 08:40:47 crc kubenswrapper[4823]: I1216 08:40:47.256803 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-vsblm" Dec 16 08:40:47 crc kubenswrapper[4823]: I1216 08:40:47.257715 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-vsblm" Dec 16 08:40:47 crc kubenswrapper[4823]: I1216 08:40:47.310053 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-vsblm" Dec 16 08:40:47 crc kubenswrapper[4823]: I1216 
08:40:47.721361 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-vsblm" Dec 16 08:40:48 crc kubenswrapper[4823]: I1216 08:40:48.553797 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vsblm"] Dec 16 08:40:49 crc kubenswrapper[4823]: I1216 08:40:49.693663 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-vsblm" podUID="e533fcdc-f5a8-4a1b-9ab7-7947969f9111" containerName="registry-server" containerID="cri-o://1840a85318ebd6c21867fb4e94b18ffd422b3fe4af9765ff0d248553392bbb4e" gracePeriod=2 Dec 16 08:40:50 crc kubenswrapper[4823]: I1216 08:40:50.116053 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vsblm" Dec 16 08:40:50 crc kubenswrapper[4823]: I1216 08:40:50.169747 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4xjmr\" (UniqueName: \"kubernetes.io/projected/e533fcdc-f5a8-4a1b-9ab7-7947969f9111-kube-api-access-4xjmr\") pod \"e533fcdc-f5a8-4a1b-9ab7-7947969f9111\" (UID: \"e533fcdc-f5a8-4a1b-9ab7-7947969f9111\") " Dec 16 08:40:50 crc kubenswrapper[4823]: I1216 08:40:50.169892 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e533fcdc-f5a8-4a1b-9ab7-7947969f9111-utilities\") pod \"e533fcdc-f5a8-4a1b-9ab7-7947969f9111\" (UID: \"e533fcdc-f5a8-4a1b-9ab7-7947969f9111\") " Dec 16 08:40:50 crc kubenswrapper[4823]: I1216 08:40:50.169933 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e533fcdc-f5a8-4a1b-9ab7-7947969f9111-catalog-content\") pod \"e533fcdc-f5a8-4a1b-9ab7-7947969f9111\" (UID: \"e533fcdc-f5a8-4a1b-9ab7-7947969f9111\") " Dec 16 08:40:50 crc kubenswrapper[4823]: 
I1216 08:40:50.171134 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e533fcdc-f5a8-4a1b-9ab7-7947969f9111-utilities" (OuterVolumeSpecName: "utilities") pod "e533fcdc-f5a8-4a1b-9ab7-7947969f9111" (UID: "e533fcdc-f5a8-4a1b-9ab7-7947969f9111"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 08:40:50 crc kubenswrapper[4823]: I1216 08:40:50.175790 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e533fcdc-f5a8-4a1b-9ab7-7947969f9111-kube-api-access-4xjmr" (OuterVolumeSpecName: "kube-api-access-4xjmr") pod "e533fcdc-f5a8-4a1b-9ab7-7947969f9111" (UID: "e533fcdc-f5a8-4a1b-9ab7-7947969f9111"). InnerVolumeSpecName "kube-api-access-4xjmr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:40:50 crc kubenswrapper[4823]: I1216 08:40:50.233774 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e533fcdc-f5a8-4a1b-9ab7-7947969f9111-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e533fcdc-f5a8-4a1b-9ab7-7947969f9111" (UID: "e533fcdc-f5a8-4a1b-9ab7-7947969f9111"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 08:40:50 crc kubenswrapper[4823]: I1216 08:40:50.271310 4823 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e533fcdc-f5a8-4a1b-9ab7-7947969f9111-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 08:40:50 crc kubenswrapper[4823]: I1216 08:40:50.271347 4823 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e533fcdc-f5a8-4a1b-9ab7-7947969f9111-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 08:40:50 crc kubenswrapper[4823]: I1216 08:40:50.271363 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4xjmr\" (UniqueName: \"kubernetes.io/projected/e533fcdc-f5a8-4a1b-9ab7-7947969f9111-kube-api-access-4xjmr\") on node \"crc\" DevicePath \"\"" Dec 16 08:40:50 crc kubenswrapper[4823]: I1216 08:40:50.706261 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vsblm" Dec 16 08:40:50 crc kubenswrapper[4823]: I1216 08:40:50.706302 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vsblm" event={"ID":"e533fcdc-f5a8-4a1b-9ab7-7947969f9111","Type":"ContainerDied","Data":"1840a85318ebd6c21867fb4e94b18ffd422b3fe4af9765ff0d248553392bbb4e"} Dec 16 08:40:50 crc kubenswrapper[4823]: I1216 08:40:50.706375 4823 scope.go:117] "RemoveContainer" containerID="1840a85318ebd6c21867fb4e94b18ffd422b3fe4af9765ff0d248553392bbb4e" Dec 16 08:40:50 crc kubenswrapper[4823]: I1216 08:40:50.706233 4823 generic.go:334] "Generic (PLEG): container finished" podID="e533fcdc-f5a8-4a1b-9ab7-7947969f9111" containerID="1840a85318ebd6c21867fb4e94b18ffd422b3fe4af9765ff0d248553392bbb4e" exitCode=0 Dec 16 08:40:50 crc kubenswrapper[4823]: I1216 08:40:50.706646 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vsblm" 
event={"ID":"e533fcdc-f5a8-4a1b-9ab7-7947969f9111","Type":"ContainerDied","Data":"bd1920cc9443db7d16e681157c1c69fad86543aafff4996ec6a652ea2096dec9"} Dec 16 08:40:50 crc kubenswrapper[4823]: I1216 08:40:50.752651 4823 scope.go:117] "RemoveContainer" containerID="08496c7eaefac191288ba5197dfc079528f208232929859208d9703b4d453fd4" Dec 16 08:40:50 crc kubenswrapper[4823]: I1216 08:40:50.756946 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vsblm"] Dec 16 08:40:50 crc kubenswrapper[4823]: I1216 08:40:50.763662 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-vsblm"] Dec 16 08:40:50 crc kubenswrapper[4823]: I1216 08:40:50.776186 4823 scope.go:117] "RemoveContainer" containerID="346ed6599035ab87278f3034ea44d19b97766d91a91a4508be84adeeaa5173be" Dec 16 08:40:50 crc kubenswrapper[4823]: I1216 08:40:50.808653 4823 scope.go:117] "RemoveContainer" containerID="1840a85318ebd6c21867fb4e94b18ffd422b3fe4af9765ff0d248553392bbb4e" Dec 16 08:40:50 crc kubenswrapper[4823]: E1216 08:40:50.809170 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1840a85318ebd6c21867fb4e94b18ffd422b3fe4af9765ff0d248553392bbb4e\": container with ID starting with 1840a85318ebd6c21867fb4e94b18ffd422b3fe4af9765ff0d248553392bbb4e not found: ID does not exist" containerID="1840a85318ebd6c21867fb4e94b18ffd422b3fe4af9765ff0d248553392bbb4e" Dec 16 08:40:50 crc kubenswrapper[4823]: I1216 08:40:50.809240 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1840a85318ebd6c21867fb4e94b18ffd422b3fe4af9765ff0d248553392bbb4e"} err="failed to get container status \"1840a85318ebd6c21867fb4e94b18ffd422b3fe4af9765ff0d248553392bbb4e\": rpc error: code = NotFound desc = could not find container \"1840a85318ebd6c21867fb4e94b18ffd422b3fe4af9765ff0d248553392bbb4e\": container with ID starting with 
1840a85318ebd6c21867fb4e94b18ffd422b3fe4af9765ff0d248553392bbb4e not found: ID does not exist" Dec 16 08:40:50 crc kubenswrapper[4823]: I1216 08:40:50.809279 4823 scope.go:117] "RemoveContainer" containerID="08496c7eaefac191288ba5197dfc079528f208232929859208d9703b4d453fd4" Dec 16 08:40:50 crc kubenswrapper[4823]: E1216 08:40:50.809719 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08496c7eaefac191288ba5197dfc079528f208232929859208d9703b4d453fd4\": container with ID starting with 08496c7eaefac191288ba5197dfc079528f208232929859208d9703b4d453fd4 not found: ID does not exist" containerID="08496c7eaefac191288ba5197dfc079528f208232929859208d9703b4d453fd4" Dec 16 08:40:50 crc kubenswrapper[4823]: I1216 08:40:50.809760 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08496c7eaefac191288ba5197dfc079528f208232929859208d9703b4d453fd4"} err="failed to get container status \"08496c7eaefac191288ba5197dfc079528f208232929859208d9703b4d453fd4\": rpc error: code = NotFound desc = could not find container \"08496c7eaefac191288ba5197dfc079528f208232929859208d9703b4d453fd4\": container with ID starting with 08496c7eaefac191288ba5197dfc079528f208232929859208d9703b4d453fd4 not found: ID does not exist" Dec 16 08:40:50 crc kubenswrapper[4823]: I1216 08:40:50.809785 4823 scope.go:117] "RemoveContainer" containerID="346ed6599035ab87278f3034ea44d19b97766d91a91a4508be84adeeaa5173be" Dec 16 08:40:50 crc kubenswrapper[4823]: E1216 08:40:50.810116 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"346ed6599035ab87278f3034ea44d19b97766d91a91a4508be84adeeaa5173be\": container with ID starting with 346ed6599035ab87278f3034ea44d19b97766d91a91a4508be84adeeaa5173be not found: ID does not exist" containerID="346ed6599035ab87278f3034ea44d19b97766d91a91a4508be84adeeaa5173be" Dec 16 08:40:50 crc 
kubenswrapper[4823]: I1216 08:40:50.810135 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"346ed6599035ab87278f3034ea44d19b97766d91a91a4508be84adeeaa5173be"} err="failed to get container status \"346ed6599035ab87278f3034ea44d19b97766d91a91a4508be84adeeaa5173be\": rpc error: code = NotFound desc = could not find container \"346ed6599035ab87278f3034ea44d19b97766d91a91a4508be84adeeaa5173be\": container with ID starting with 346ed6599035ab87278f3034ea44d19b97766d91a91a4508be84adeeaa5173be not found: ID does not exist" Dec 16 08:40:51 crc kubenswrapper[4823]: I1216 08:40:51.781847 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e533fcdc-f5a8-4a1b-9ab7-7947969f9111" path="/var/lib/kubelet/pods/e533fcdc-f5a8-4a1b-9ab7-7947969f9111/volumes" Dec 16 08:40:54 crc kubenswrapper[4823]: I1216 08:40:54.763242 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 16 08:40:54 crc kubenswrapper[4823]: I1216 08:40:54.775368 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:40:58 crc kubenswrapper[4823]: I1216 08:40:58.134409 4823 patch_prober.go:28] interesting pod/machine-config-daemon-fv56f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 08:40:58 crc kubenswrapper[4823]: I1216 08:40:58.134724 4823 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 08:40:58 crc kubenswrapper[4823]: I1216 08:40:58.285554 4823 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-1-default"] Dec 16 08:40:58 crc kubenswrapper[4823]: E1216 08:40:58.285921 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e533fcdc-f5a8-4a1b-9ab7-7947969f9111" containerName="extract-utilities" Dec 16 08:40:58 crc kubenswrapper[4823]: I1216 08:40:58.285946 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="e533fcdc-f5a8-4a1b-9ab7-7947969f9111" containerName="extract-utilities" Dec 16 08:40:58 crc kubenswrapper[4823]: E1216 08:40:58.285963 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e533fcdc-f5a8-4a1b-9ab7-7947969f9111" containerName="registry-server" Dec 16 08:40:58 crc kubenswrapper[4823]: I1216 08:40:58.285971 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="e533fcdc-f5a8-4a1b-9ab7-7947969f9111" containerName="registry-server" Dec 16 08:40:58 crc kubenswrapper[4823]: E1216 08:40:58.285987 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e533fcdc-f5a8-4a1b-9ab7-7947969f9111" containerName="extract-content" Dec 16 08:40:58 crc kubenswrapper[4823]: I1216 08:40:58.285995 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="e533fcdc-f5a8-4a1b-9ab7-7947969f9111" containerName="extract-content" Dec 16 08:40:58 crc kubenswrapper[4823]: I1216 08:40:58.286221 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="e533fcdc-f5a8-4a1b-9ab7-7947969f9111" containerName="registry-server" Dec 16 08:40:58 crc kubenswrapper[4823]: I1216 08:40:58.286831 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-1-default" Dec 16 08:40:58 crc kubenswrapper[4823]: I1216 08:40:58.289032 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-mcfnl" Dec 16 08:40:58 crc kubenswrapper[4823]: I1216 08:40:58.292561 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1-default"] Dec 16 08:40:58 crc kubenswrapper[4823]: I1216 08:40:58.407906 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckqj7\" (UniqueName: \"kubernetes.io/projected/9bb5aa95-b9fe-4f49-aae6-16236e2efd0c-kube-api-access-ckqj7\") pod \"mariadb-client-1-default\" (UID: \"9bb5aa95-b9fe-4f49-aae6-16236e2efd0c\") " pod="openstack/mariadb-client-1-default" Dec 16 08:40:58 crc kubenswrapper[4823]: I1216 08:40:58.509138 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckqj7\" (UniqueName: \"kubernetes.io/projected/9bb5aa95-b9fe-4f49-aae6-16236e2efd0c-kube-api-access-ckqj7\") pod \"mariadb-client-1-default\" (UID: \"9bb5aa95-b9fe-4f49-aae6-16236e2efd0c\") " pod="openstack/mariadb-client-1-default" Dec 16 08:40:58 crc kubenswrapper[4823]: I1216 08:40:58.537041 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckqj7\" (UniqueName: \"kubernetes.io/projected/9bb5aa95-b9fe-4f49-aae6-16236e2efd0c-kube-api-access-ckqj7\") pod \"mariadb-client-1-default\" (UID: \"9bb5aa95-b9fe-4f49-aae6-16236e2efd0c\") " pod="openstack/mariadb-client-1-default" Dec 16 08:40:58 crc kubenswrapper[4823]: I1216 08:40:58.609505 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-1-default" Dec 16 08:40:59 crc kubenswrapper[4823]: I1216 08:40:59.133048 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1-default"] Dec 16 08:40:59 crc kubenswrapper[4823]: W1216 08:40:59.139185 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9bb5aa95_b9fe_4f49_aae6_16236e2efd0c.slice/crio-14fb6d87dc2d6bce14040256d85f4d5770ccb15c1b424ffc647f47eb66b7d571 WatchSource:0}: Error finding container 14fb6d87dc2d6bce14040256d85f4d5770ccb15c1b424ffc647f47eb66b7d571: Status 404 returned error can't find the container with id 14fb6d87dc2d6bce14040256d85f4d5770ccb15c1b424ffc647f47eb66b7d571 Dec 16 08:40:59 crc kubenswrapper[4823]: I1216 08:40:59.782210 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1-default" event={"ID":"9bb5aa95-b9fe-4f49-aae6-16236e2efd0c","Type":"ContainerStarted","Data":"14fb6d87dc2d6bce14040256d85f4d5770ccb15c1b424ffc647f47eb66b7d571"} Dec 16 08:41:00 crc kubenswrapper[4823]: I1216 08:41:00.789503 4823 generic.go:334] "Generic (PLEG): container finished" podID="9bb5aa95-b9fe-4f49-aae6-16236e2efd0c" containerID="bc65fd721fd7c67bf7529175e89a79673a070d6ec4732f446c3d2585d3fa2364" exitCode=0 Dec 16 08:41:00 crc kubenswrapper[4823]: I1216 08:41:00.789572 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1-default" event={"ID":"9bb5aa95-b9fe-4f49-aae6-16236e2efd0c","Type":"ContainerDied","Data":"bc65fd721fd7c67bf7529175e89a79673a070d6ec4732f446c3d2585d3fa2364"} Dec 16 08:41:02 crc kubenswrapper[4823]: I1216 08:41:02.143979 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-1-default" Dec 16 08:41:02 crc kubenswrapper[4823]: I1216 08:41:02.171061 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-1-default_9bb5aa95-b9fe-4f49-aae6-16236e2efd0c/mariadb-client-1-default/0.log" Dec 16 08:41:02 crc kubenswrapper[4823]: I1216 08:41:02.172804 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ckqj7\" (UniqueName: \"kubernetes.io/projected/9bb5aa95-b9fe-4f49-aae6-16236e2efd0c-kube-api-access-ckqj7\") pod \"9bb5aa95-b9fe-4f49-aae6-16236e2efd0c\" (UID: \"9bb5aa95-b9fe-4f49-aae6-16236e2efd0c\") " Dec 16 08:41:02 crc kubenswrapper[4823]: I1216 08:41:02.179401 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9bb5aa95-b9fe-4f49-aae6-16236e2efd0c-kube-api-access-ckqj7" (OuterVolumeSpecName: "kube-api-access-ckqj7") pod "9bb5aa95-b9fe-4f49-aae6-16236e2efd0c" (UID: "9bb5aa95-b9fe-4f49-aae6-16236e2efd0c"). InnerVolumeSpecName "kube-api-access-ckqj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:41:02 crc kubenswrapper[4823]: I1216 08:41:02.200952 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-1-default"] Dec 16 08:41:02 crc kubenswrapper[4823]: I1216 08:41:02.205855 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-1-default"] Dec 16 08:41:02 crc kubenswrapper[4823]: I1216 08:41:02.274733 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ckqj7\" (UniqueName: \"kubernetes.io/projected/9bb5aa95-b9fe-4f49-aae6-16236e2efd0c-kube-api-access-ckqj7\") on node \"crc\" DevicePath \"\"" Dec 16 08:41:02 crc kubenswrapper[4823]: I1216 08:41:02.593888 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-2-default"] Dec 16 08:41:02 crc kubenswrapper[4823]: E1216 08:41:02.594327 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bb5aa95-b9fe-4f49-aae6-16236e2efd0c" containerName="mariadb-client-1-default" Dec 16 08:41:02 crc kubenswrapper[4823]: I1216 08:41:02.594350 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bb5aa95-b9fe-4f49-aae6-16236e2efd0c" containerName="mariadb-client-1-default" Dec 16 08:41:02 crc kubenswrapper[4823]: I1216 08:41:02.594555 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="9bb5aa95-b9fe-4f49-aae6-16236e2efd0c" containerName="mariadb-client-1-default" Dec 16 08:41:02 crc kubenswrapper[4823]: I1216 08:41:02.595256 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-2-default" Dec 16 08:41:02 crc kubenswrapper[4823]: I1216 08:41:02.601112 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2-default"] Dec 16 08:41:02 crc kubenswrapper[4823]: I1216 08:41:02.680445 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7x8b\" (UniqueName: \"kubernetes.io/projected/00a8b6c5-7461-457b-9745-ca93da9e5c35-kube-api-access-z7x8b\") pod \"mariadb-client-2-default\" (UID: \"00a8b6c5-7461-457b-9745-ca93da9e5c35\") " pod="openstack/mariadb-client-2-default" Dec 16 08:41:02 crc kubenswrapper[4823]: I1216 08:41:02.782151 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7x8b\" (UniqueName: \"kubernetes.io/projected/00a8b6c5-7461-457b-9745-ca93da9e5c35-kube-api-access-z7x8b\") pod \"mariadb-client-2-default\" (UID: \"00a8b6c5-7461-457b-9745-ca93da9e5c35\") " pod="openstack/mariadb-client-2-default" Dec 16 08:41:02 crc kubenswrapper[4823]: I1216 08:41:02.804950 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7x8b\" (UniqueName: \"kubernetes.io/projected/00a8b6c5-7461-457b-9745-ca93da9e5c35-kube-api-access-z7x8b\") pod \"mariadb-client-2-default\" (UID: \"00a8b6c5-7461-457b-9745-ca93da9e5c35\") " pod="openstack/mariadb-client-2-default" Dec 16 08:41:02 crc kubenswrapper[4823]: I1216 08:41:02.810271 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="14fb6d87dc2d6bce14040256d85f4d5770ccb15c1b424ffc647f47eb66b7d571" Dec 16 08:41:02 crc kubenswrapper[4823]: I1216 08:41:02.810401 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1-default" Dec 16 08:41:02 crc kubenswrapper[4823]: I1216 08:41:02.910737 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-2-default" Dec 16 08:41:03 crc kubenswrapper[4823]: W1216 08:41:03.408107 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod00a8b6c5_7461_457b_9745_ca93da9e5c35.slice/crio-7de80da0ef1c0f8f43e665d89081f60383686ab0a809ed43fde3a217a6f48608 WatchSource:0}: Error finding container 7de80da0ef1c0f8f43e665d89081f60383686ab0a809ed43fde3a217a6f48608: Status 404 returned error can't find the container with id 7de80da0ef1c0f8f43e665d89081f60383686ab0a809ed43fde3a217a6f48608 Dec 16 08:41:03 crc kubenswrapper[4823]: I1216 08:41:03.408841 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2-default"] Dec 16 08:41:03 crc kubenswrapper[4823]: I1216 08:41:03.784435 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9bb5aa95-b9fe-4f49-aae6-16236e2efd0c" path="/var/lib/kubelet/pods/9bb5aa95-b9fe-4f49-aae6-16236e2efd0c/volumes" Dec 16 08:41:03 crc kubenswrapper[4823]: I1216 08:41:03.820567 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2-default" event={"ID":"00a8b6c5-7461-457b-9745-ca93da9e5c35","Type":"ContainerStarted","Data":"3e914c35e26a8301b1c2f74bc9e95002ba04447b28cb1f2f82975fd96570dbcd"} Dec 16 08:41:03 crc kubenswrapper[4823]: I1216 08:41:03.821183 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2-default" event={"ID":"00a8b6c5-7461-457b-9745-ca93da9e5c35","Type":"ContainerStarted","Data":"7de80da0ef1c0f8f43e665d89081f60383686ab0a809ed43fde3a217a6f48608"} Dec 16 08:41:03 crc kubenswrapper[4823]: I1216 08:41:03.845650 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-client-2-default" podStartSLOduration=1.8456254539999999 podStartE2EDuration="1.845625454s" podCreationTimestamp="2025-12-16 08:41:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 08:41:03.840099501 +0000 UTC m=+6342.328665634" watchObservedRunningTime="2025-12-16 08:41:03.845625454 +0000 UTC m=+6342.334191577" Dec 16 08:41:04 crc kubenswrapper[4823]: I1216 08:41:04.830767 4823 generic.go:334] "Generic (PLEG): container finished" podID="00a8b6c5-7461-457b-9745-ca93da9e5c35" containerID="3e914c35e26a8301b1c2f74bc9e95002ba04447b28cb1f2f82975fd96570dbcd" exitCode=0 Dec 16 08:41:04 crc kubenswrapper[4823]: I1216 08:41:04.830820 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2-default" event={"ID":"00a8b6c5-7461-457b-9745-ca93da9e5c35","Type":"ContainerDied","Data":"3e914c35e26a8301b1c2f74bc9e95002ba04447b28cb1f2f82975fd96570dbcd"} Dec 16 08:41:06 crc kubenswrapper[4823]: I1216 08:41:06.198403 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2-default" Dec 16 08:41:06 crc kubenswrapper[4823]: I1216 08:41:06.239624 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-2-default"] Dec 16 08:41:06 crc kubenswrapper[4823]: I1216 08:41:06.245589 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-2-default"] Dec 16 08:41:06 crc kubenswrapper[4823]: I1216 08:41:06.338356 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z7x8b\" (UniqueName: \"kubernetes.io/projected/00a8b6c5-7461-457b-9745-ca93da9e5c35-kube-api-access-z7x8b\") pod \"00a8b6c5-7461-457b-9745-ca93da9e5c35\" (UID: \"00a8b6c5-7461-457b-9745-ca93da9e5c35\") " Dec 16 08:41:06 crc kubenswrapper[4823]: I1216 08:41:06.345665 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00a8b6c5-7461-457b-9745-ca93da9e5c35-kube-api-access-z7x8b" (OuterVolumeSpecName: "kube-api-access-z7x8b") pod "00a8b6c5-7461-457b-9745-ca93da9e5c35" (UID: 
"00a8b6c5-7461-457b-9745-ca93da9e5c35"). InnerVolumeSpecName "kube-api-access-z7x8b". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:41:06 crc kubenswrapper[4823]: I1216 08:41:06.388802 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-3-default"] Dec 16 08:41:06 crc kubenswrapper[4823]: E1216 08:41:06.389534 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00a8b6c5-7461-457b-9745-ca93da9e5c35" containerName="mariadb-client-2-default" Dec 16 08:41:06 crc kubenswrapper[4823]: I1216 08:41:06.389553 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="00a8b6c5-7461-457b-9745-ca93da9e5c35" containerName="mariadb-client-2-default" Dec 16 08:41:06 crc kubenswrapper[4823]: I1216 08:41:06.389694 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="00a8b6c5-7461-457b-9745-ca93da9e5c35" containerName="mariadb-client-2-default" Dec 16 08:41:06 crc kubenswrapper[4823]: I1216 08:41:06.390231 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-3-default" Dec 16 08:41:06 crc kubenswrapper[4823]: I1216 08:41:06.394413 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-3-default"] Dec 16 08:41:06 crc kubenswrapper[4823]: I1216 08:41:06.440843 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z7x8b\" (UniqueName: \"kubernetes.io/projected/00a8b6c5-7461-457b-9745-ca93da9e5c35-kube-api-access-z7x8b\") on node \"crc\" DevicePath \"\"" Dec 16 08:41:06 crc kubenswrapper[4823]: I1216 08:41:06.542785 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dv2qq\" (UniqueName: \"kubernetes.io/projected/af9c959a-f959-4db8-b31d-28d7a4299384-kube-api-access-dv2qq\") pod \"mariadb-client-3-default\" (UID: \"af9c959a-f959-4db8-b31d-28d7a4299384\") " pod="openstack/mariadb-client-3-default" Dec 16 08:41:06 crc kubenswrapper[4823]: I1216 08:41:06.645142 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dv2qq\" (UniqueName: \"kubernetes.io/projected/af9c959a-f959-4db8-b31d-28d7a4299384-kube-api-access-dv2qq\") pod \"mariadb-client-3-default\" (UID: \"af9c959a-f959-4db8-b31d-28d7a4299384\") " pod="openstack/mariadb-client-3-default" Dec 16 08:41:06 crc kubenswrapper[4823]: I1216 08:41:06.666584 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dv2qq\" (UniqueName: \"kubernetes.io/projected/af9c959a-f959-4db8-b31d-28d7a4299384-kube-api-access-dv2qq\") pod \"mariadb-client-3-default\" (UID: \"af9c959a-f959-4db8-b31d-28d7a4299384\") " pod="openstack/mariadb-client-3-default" Dec 16 08:41:06 crc kubenswrapper[4823]: I1216 08:41:06.725958 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-3-default" Dec 16 08:41:06 crc kubenswrapper[4823]: I1216 08:41:06.863854 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7de80da0ef1c0f8f43e665d89081f60383686ab0a809ed43fde3a217a6f48608" Dec 16 08:41:06 crc kubenswrapper[4823]: I1216 08:41:06.864101 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2-default" Dec 16 08:41:07 crc kubenswrapper[4823]: I1216 08:41:07.302178 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-3-default"] Dec 16 08:41:07 crc kubenswrapper[4823]: I1216 08:41:07.782175 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00a8b6c5-7461-457b-9745-ca93da9e5c35" path="/var/lib/kubelet/pods/00a8b6c5-7461-457b-9745-ca93da9e5c35/volumes" Dec 16 08:41:07 crc kubenswrapper[4823]: I1216 08:41:07.872242 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-3-default" event={"ID":"af9c959a-f959-4db8-b31d-28d7a4299384","Type":"ContainerStarted","Data":"10b50e8e4273e9de7344824c2b0d3bab1e1cd973a869dd705c37ced286b27405"} Dec 16 08:41:07 crc kubenswrapper[4823]: I1216 08:41:07.872301 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-3-default" event={"ID":"af9c959a-f959-4db8-b31d-28d7a4299384","Type":"ContainerStarted","Data":"73784e1d57146e26e72792bf655c372347b764e0ae971f1a7e2fcc8a588f0983"} Dec 16 08:41:07 crc kubenswrapper[4823]: I1216 08:41:07.886475 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-client-3-default" podStartSLOduration=1.886454265 podStartE2EDuration="1.886454265s" podCreationTimestamp="2025-12-16 08:41:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 08:41:07.883517603 +0000 UTC m=+6346.372083746" 
watchObservedRunningTime="2025-12-16 08:41:07.886454265 +0000 UTC m=+6346.375020388" Dec 16 08:41:09 crc kubenswrapper[4823]: I1216 08:41:09.886673 4823 generic.go:334] "Generic (PLEG): container finished" podID="af9c959a-f959-4db8-b31d-28d7a4299384" containerID="10b50e8e4273e9de7344824c2b0d3bab1e1cd973a869dd705c37ced286b27405" exitCode=0 Dec 16 08:41:09 crc kubenswrapper[4823]: I1216 08:41:09.886762 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-3-default" event={"ID":"af9c959a-f959-4db8-b31d-28d7a4299384","Type":"ContainerDied","Data":"10b50e8e4273e9de7344824c2b0d3bab1e1cd973a869dd705c37ced286b27405"} Dec 16 08:41:11 crc kubenswrapper[4823]: I1216 08:41:11.268783 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-3-default" Dec 16 08:41:11 crc kubenswrapper[4823]: I1216 08:41:11.301957 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-3-default"] Dec 16 08:41:11 crc kubenswrapper[4823]: I1216 08:41:11.307046 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-3-default"] Dec 16 08:41:11 crc kubenswrapper[4823]: I1216 08:41:11.431849 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dv2qq\" (UniqueName: \"kubernetes.io/projected/af9c959a-f959-4db8-b31d-28d7a4299384-kube-api-access-dv2qq\") pod \"af9c959a-f959-4db8-b31d-28d7a4299384\" (UID: \"af9c959a-f959-4db8-b31d-28d7a4299384\") " Dec 16 08:41:11 crc kubenswrapper[4823]: I1216 08:41:11.441424 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af9c959a-f959-4db8-b31d-28d7a4299384-kube-api-access-dv2qq" (OuterVolumeSpecName: "kube-api-access-dv2qq") pod "af9c959a-f959-4db8-b31d-28d7a4299384" (UID: "af9c959a-f959-4db8-b31d-28d7a4299384"). InnerVolumeSpecName "kube-api-access-dv2qq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:41:11 crc kubenswrapper[4823]: I1216 08:41:11.533607 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dv2qq\" (UniqueName: \"kubernetes.io/projected/af9c959a-f959-4db8-b31d-28d7a4299384-kube-api-access-dv2qq\") on node \"crc\" DevicePath \"\"" Dec 16 08:41:11 crc kubenswrapper[4823]: I1216 08:41:11.711060 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-1"] Dec 16 08:41:11 crc kubenswrapper[4823]: E1216 08:41:11.711447 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af9c959a-f959-4db8-b31d-28d7a4299384" containerName="mariadb-client-3-default" Dec 16 08:41:11 crc kubenswrapper[4823]: I1216 08:41:11.711466 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="af9c959a-f959-4db8-b31d-28d7a4299384" containerName="mariadb-client-3-default" Dec 16 08:41:11 crc kubenswrapper[4823]: I1216 08:41:11.711650 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="af9c959a-f959-4db8-b31d-28d7a4299384" containerName="mariadb-client-3-default" Dec 16 08:41:11 crc kubenswrapper[4823]: I1216 08:41:11.712358 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-1" Dec 16 08:41:11 crc kubenswrapper[4823]: I1216 08:41:11.718693 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1"] Dec 16 08:41:11 crc kubenswrapper[4823]: I1216 08:41:11.786194 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af9c959a-f959-4db8-b31d-28d7a4299384" path="/var/lib/kubelet/pods/af9c959a-f959-4db8-b31d-28d7a4299384/volumes" Dec 16 08:41:11 crc kubenswrapper[4823]: I1216 08:41:11.838214 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffjpz\" (UniqueName: \"kubernetes.io/projected/13d9807e-f5d7-4b89-a4c1-9b29912c6ba1-kube-api-access-ffjpz\") pod \"mariadb-client-1\" (UID: \"13d9807e-f5d7-4b89-a4c1-9b29912c6ba1\") " pod="openstack/mariadb-client-1" Dec 16 08:41:11 crc kubenswrapper[4823]: I1216 08:41:11.902775 4823 scope.go:117] "RemoveContainer" containerID="10b50e8e4273e9de7344824c2b0d3bab1e1cd973a869dd705c37ced286b27405" Dec 16 08:41:11 crc kubenswrapper[4823]: I1216 08:41:11.902831 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-3-default" Dec 16 08:41:11 crc kubenswrapper[4823]: I1216 08:41:11.940380 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ffjpz\" (UniqueName: \"kubernetes.io/projected/13d9807e-f5d7-4b89-a4c1-9b29912c6ba1-kube-api-access-ffjpz\") pod \"mariadb-client-1\" (UID: \"13d9807e-f5d7-4b89-a4c1-9b29912c6ba1\") " pod="openstack/mariadb-client-1" Dec 16 08:41:11 crc kubenswrapper[4823]: I1216 08:41:11.961940 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffjpz\" (UniqueName: \"kubernetes.io/projected/13d9807e-f5d7-4b89-a4c1-9b29912c6ba1-kube-api-access-ffjpz\") pod \"mariadb-client-1\" (UID: \"13d9807e-f5d7-4b89-a4c1-9b29912c6ba1\") " pod="openstack/mariadb-client-1" Dec 16 08:41:12 crc kubenswrapper[4823]: I1216 08:41:12.030394 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1" Dec 16 08:41:12 crc kubenswrapper[4823]: I1216 08:41:12.522504 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1"] Dec 16 08:41:12 crc kubenswrapper[4823]: I1216 08:41:12.912964 4823 generic.go:334] "Generic (PLEG): container finished" podID="13d9807e-f5d7-4b89-a4c1-9b29912c6ba1" containerID="d7063c97ce184876ec9b65547d7fb8ba587fc200b68dd522df2c81481dbf07df" exitCode=0 Dec 16 08:41:12 crc kubenswrapper[4823]: I1216 08:41:12.913021 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1" event={"ID":"13d9807e-f5d7-4b89-a4c1-9b29912c6ba1","Type":"ContainerDied","Data":"d7063c97ce184876ec9b65547d7fb8ba587fc200b68dd522df2c81481dbf07df"} Dec 16 08:41:12 crc kubenswrapper[4823]: I1216 08:41:12.913074 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1" event={"ID":"13d9807e-f5d7-4b89-a4c1-9b29912c6ba1","Type":"ContainerStarted","Data":"bff6277e5099cde4bd823b9b04a4a215b52d329514bc4b322d8c8cbcda82348c"} 
Dec 16 08:41:14 crc kubenswrapper[4823]: I1216 08:41:14.263141 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1" Dec 16 08:41:14 crc kubenswrapper[4823]: I1216 08:41:14.290785 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-1_13d9807e-f5d7-4b89-a4c1-9b29912c6ba1/mariadb-client-1/0.log" Dec 16 08:41:14 crc kubenswrapper[4823]: I1216 08:41:14.318543 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-1"] Dec 16 08:41:14 crc kubenswrapper[4823]: I1216 08:41:14.329803 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-1"] Dec 16 08:41:14 crc kubenswrapper[4823]: I1216 08:41:14.381720 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ffjpz\" (UniqueName: \"kubernetes.io/projected/13d9807e-f5d7-4b89-a4c1-9b29912c6ba1-kube-api-access-ffjpz\") pod \"13d9807e-f5d7-4b89-a4c1-9b29912c6ba1\" (UID: \"13d9807e-f5d7-4b89-a4c1-9b29912c6ba1\") " Dec 16 08:41:14 crc kubenswrapper[4823]: I1216 08:41:14.387065 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13d9807e-f5d7-4b89-a4c1-9b29912c6ba1-kube-api-access-ffjpz" (OuterVolumeSpecName: "kube-api-access-ffjpz") pod "13d9807e-f5d7-4b89-a4c1-9b29912c6ba1" (UID: "13d9807e-f5d7-4b89-a4c1-9b29912c6ba1"). InnerVolumeSpecName "kube-api-access-ffjpz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:41:14 crc kubenswrapper[4823]: I1216 08:41:14.484048 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ffjpz\" (UniqueName: \"kubernetes.io/projected/13d9807e-f5d7-4b89-a4c1-9b29912c6ba1-kube-api-access-ffjpz\") on node \"crc\" DevicePath \"\"" Dec 16 08:41:14 crc kubenswrapper[4823]: I1216 08:41:14.705129 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-4-default"] Dec 16 08:41:14 crc kubenswrapper[4823]: E1216 08:41:14.705465 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13d9807e-f5d7-4b89-a4c1-9b29912c6ba1" containerName="mariadb-client-1" Dec 16 08:41:14 crc kubenswrapper[4823]: I1216 08:41:14.705485 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="13d9807e-f5d7-4b89-a4c1-9b29912c6ba1" containerName="mariadb-client-1" Dec 16 08:41:14 crc kubenswrapper[4823]: I1216 08:41:14.705640 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="13d9807e-f5d7-4b89-a4c1-9b29912c6ba1" containerName="mariadb-client-1" Dec 16 08:41:14 crc kubenswrapper[4823]: I1216 08:41:14.706163 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-4-default" Dec 16 08:41:14 crc kubenswrapper[4823]: I1216 08:41:14.713778 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-4-default"] Dec 16 08:41:14 crc kubenswrapper[4823]: I1216 08:41:14.890042 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfszr\" (UniqueName: \"kubernetes.io/projected/0518e34b-5d33-4c2c-8891-42d9fbc0e62e-kube-api-access-lfszr\") pod \"mariadb-client-4-default\" (UID: \"0518e34b-5d33-4c2c-8891-42d9fbc0e62e\") " pod="openstack/mariadb-client-4-default" Dec 16 08:41:14 crc kubenswrapper[4823]: I1216 08:41:14.926710 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bff6277e5099cde4bd823b9b04a4a215b52d329514bc4b322d8c8cbcda82348c" Dec 16 08:41:14 crc kubenswrapper[4823]: I1216 08:41:14.927198 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1" Dec 16 08:41:14 crc kubenswrapper[4823]: I1216 08:41:14.991418 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfszr\" (UniqueName: \"kubernetes.io/projected/0518e34b-5d33-4c2c-8891-42d9fbc0e62e-kube-api-access-lfszr\") pod \"mariadb-client-4-default\" (UID: \"0518e34b-5d33-4c2c-8891-42d9fbc0e62e\") " pod="openstack/mariadb-client-4-default" Dec 16 08:41:15 crc kubenswrapper[4823]: I1216 08:41:15.017837 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfszr\" (UniqueName: \"kubernetes.io/projected/0518e34b-5d33-4c2c-8891-42d9fbc0e62e-kube-api-access-lfszr\") pod \"mariadb-client-4-default\" (UID: \"0518e34b-5d33-4c2c-8891-42d9fbc0e62e\") " pod="openstack/mariadb-client-4-default" Dec 16 08:41:15 crc kubenswrapper[4823]: I1216 08:41:15.037683 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-4-default" Dec 16 08:41:15 crc kubenswrapper[4823]: I1216 08:41:15.513793 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-4-default"] Dec 16 08:41:15 crc kubenswrapper[4823]: I1216 08:41:15.781623 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13d9807e-f5d7-4b89-a4c1-9b29912c6ba1" path="/var/lib/kubelet/pods/13d9807e-f5d7-4b89-a4c1-9b29912c6ba1/volumes" Dec 16 08:41:15 crc kubenswrapper[4823]: I1216 08:41:15.938387 4823 generic.go:334] "Generic (PLEG): container finished" podID="0518e34b-5d33-4c2c-8891-42d9fbc0e62e" containerID="982fb6f27ecd57de1fc19a00fcb1cbab57056d81b1f1107201de58b3b3c52433" exitCode=0 Dec 16 08:41:15 crc kubenswrapper[4823]: I1216 08:41:15.938736 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-4-default" event={"ID":"0518e34b-5d33-4c2c-8891-42d9fbc0e62e","Type":"ContainerDied","Data":"982fb6f27ecd57de1fc19a00fcb1cbab57056d81b1f1107201de58b3b3c52433"} Dec 16 08:41:15 crc kubenswrapper[4823]: I1216 08:41:15.938825 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-4-default" event={"ID":"0518e34b-5d33-4c2c-8891-42d9fbc0e62e","Type":"ContainerStarted","Data":"2b0462204f975e45c14753800251cfc2c77c530528a8278e27e6263560344570"} Dec 16 08:41:17 crc kubenswrapper[4823]: I1216 08:41:17.290314 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-4-default" Dec 16 08:41:17 crc kubenswrapper[4823]: I1216 08:41:17.309250 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-4-default_0518e34b-5d33-4c2c-8891-42d9fbc0e62e/mariadb-client-4-default/0.log" Dec 16 08:41:17 crc kubenswrapper[4823]: I1216 08:41:17.335335 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-4-default"] Dec 16 08:41:17 crc kubenswrapper[4823]: I1216 08:41:17.342988 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-4-default"] Dec 16 08:41:17 crc kubenswrapper[4823]: I1216 08:41:17.372794 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lfszr\" (UniqueName: \"kubernetes.io/projected/0518e34b-5d33-4c2c-8891-42d9fbc0e62e-kube-api-access-lfszr\") pod \"0518e34b-5d33-4c2c-8891-42d9fbc0e62e\" (UID: \"0518e34b-5d33-4c2c-8891-42d9fbc0e62e\") " Dec 16 08:41:17 crc kubenswrapper[4823]: I1216 08:41:17.378622 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0518e34b-5d33-4c2c-8891-42d9fbc0e62e-kube-api-access-lfszr" (OuterVolumeSpecName: "kube-api-access-lfszr") pod "0518e34b-5d33-4c2c-8891-42d9fbc0e62e" (UID: "0518e34b-5d33-4c2c-8891-42d9fbc0e62e"). InnerVolumeSpecName "kube-api-access-lfszr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:41:17 crc kubenswrapper[4823]: I1216 08:41:17.474306 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lfszr\" (UniqueName: \"kubernetes.io/projected/0518e34b-5d33-4c2c-8891-42d9fbc0e62e-kube-api-access-lfszr\") on node \"crc\" DevicePath \"\"" Dec 16 08:41:17 crc kubenswrapper[4823]: I1216 08:41:17.791917 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0518e34b-5d33-4c2c-8891-42d9fbc0e62e" path="/var/lib/kubelet/pods/0518e34b-5d33-4c2c-8891-42d9fbc0e62e/volumes" Dec 16 08:41:17 crc kubenswrapper[4823]: I1216 08:41:17.951963 4823 scope.go:117] "RemoveContainer" containerID="982fb6f27ecd57de1fc19a00fcb1cbab57056d81b1f1107201de58b3b3c52433" Dec 16 08:41:17 crc kubenswrapper[4823]: I1216 08:41:17.952060 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-4-default" Dec 16 08:41:20 crc kubenswrapper[4823]: I1216 08:41:20.998181 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-5-default"] Dec 16 08:41:20 crc kubenswrapper[4823]: E1216 08:41:20.999074 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0518e34b-5d33-4c2c-8891-42d9fbc0e62e" containerName="mariadb-client-4-default" Dec 16 08:41:20 crc kubenswrapper[4823]: I1216 08:41:20.999086 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="0518e34b-5d33-4c2c-8891-42d9fbc0e62e" containerName="mariadb-client-4-default" Dec 16 08:41:20 crc kubenswrapper[4823]: I1216 08:41:20.999225 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="0518e34b-5d33-4c2c-8891-42d9fbc0e62e" containerName="mariadb-client-4-default" Dec 16 08:41:20 crc kubenswrapper[4823]: I1216 08:41:20.999733 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-5-default" Dec 16 08:41:21 crc kubenswrapper[4823]: I1216 08:41:21.001689 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-mcfnl" Dec 16 08:41:21 crc kubenswrapper[4823]: I1216 08:41:21.005805 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-5-default"] Dec 16 08:41:21 crc kubenswrapper[4823]: I1216 08:41:21.133645 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mf4z9\" (UniqueName: \"kubernetes.io/projected/2c04fe89-992e-4bb0-ae97-9746bf82acc1-kube-api-access-mf4z9\") pod \"mariadb-client-5-default\" (UID: \"2c04fe89-992e-4bb0-ae97-9746bf82acc1\") " pod="openstack/mariadb-client-5-default" Dec 16 08:41:21 crc kubenswrapper[4823]: I1216 08:41:21.235549 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mf4z9\" (UniqueName: \"kubernetes.io/projected/2c04fe89-992e-4bb0-ae97-9746bf82acc1-kube-api-access-mf4z9\") pod \"mariadb-client-5-default\" (UID: \"2c04fe89-992e-4bb0-ae97-9746bf82acc1\") " pod="openstack/mariadb-client-5-default" Dec 16 08:41:21 crc kubenswrapper[4823]: I1216 08:41:21.254205 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mf4z9\" (UniqueName: \"kubernetes.io/projected/2c04fe89-992e-4bb0-ae97-9746bf82acc1-kube-api-access-mf4z9\") pod \"mariadb-client-5-default\" (UID: \"2c04fe89-992e-4bb0-ae97-9746bf82acc1\") " pod="openstack/mariadb-client-5-default" Dec 16 08:41:21 crc kubenswrapper[4823]: I1216 08:41:21.319758 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-5-default" Dec 16 08:41:21 crc kubenswrapper[4823]: I1216 08:41:21.838105 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-5-default"] Dec 16 08:41:21 crc kubenswrapper[4823]: I1216 08:41:21.985472 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-5-default" event={"ID":"2c04fe89-992e-4bb0-ae97-9746bf82acc1","Type":"ContainerStarted","Data":"aa8eb630f9607f116076682e96a8f3b17e754b7f6e5ba93f17720a7c496c2a1b"} Dec 16 08:41:22 crc kubenswrapper[4823]: I1216 08:41:22.999230 4823 generic.go:334] "Generic (PLEG): container finished" podID="2c04fe89-992e-4bb0-ae97-9746bf82acc1" containerID="18df81425ceeb2a0cdb9bea5a167be02f2cade8a3185a442bce1fa38256c03b6" exitCode=0 Dec 16 08:41:22 crc kubenswrapper[4823]: I1216 08:41:22.999380 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-5-default" event={"ID":"2c04fe89-992e-4bb0-ae97-9746bf82acc1","Type":"ContainerDied","Data":"18df81425ceeb2a0cdb9bea5a167be02f2cade8a3185a442bce1fa38256c03b6"} Dec 16 08:41:24 crc kubenswrapper[4823]: I1216 08:41:24.336173 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-5-default" Dec 16 08:41:24 crc kubenswrapper[4823]: I1216 08:41:24.355611 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-5-default_2c04fe89-992e-4bb0-ae97-9746bf82acc1/mariadb-client-5-default/0.log" Dec 16 08:41:24 crc kubenswrapper[4823]: I1216 08:41:24.380312 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-5-default"] Dec 16 08:41:24 crc kubenswrapper[4823]: I1216 08:41:24.385493 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-5-default"] Dec 16 08:41:24 crc kubenswrapper[4823]: I1216 08:41:24.386561 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mf4z9\" (UniqueName: \"kubernetes.io/projected/2c04fe89-992e-4bb0-ae97-9746bf82acc1-kube-api-access-mf4z9\") pod \"2c04fe89-992e-4bb0-ae97-9746bf82acc1\" (UID: \"2c04fe89-992e-4bb0-ae97-9746bf82acc1\") " Dec 16 08:41:24 crc kubenswrapper[4823]: I1216 08:41:24.391336 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c04fe89-992e-4bb0-ae97-9746bf82acc1-kube-api-access-mf4z9" (OuterVolumeSpecName: "kube-api-access-mf4z9") pod "2c04fe89-992e-4bb0-ae97-9746bf82acc1" (UID: "2c04fe89-992e-4bb0-ae97-9746bf82acc1"). InnerVolumeSpecName "kube-api-access-mf4z9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:41:24 crc kubenswrapper[4823]: I1216 08:41:24.488050 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mf4z9\" (UniqueName: \"kubernetes.io/projected/2c04fe89-992e-4bb0-ae97-9746bf82acc1-kube-api-access-mf4z9\") on node \"crc\" DevicePath \"\"" Dec 16 08:41:24 crc kubenswrapper[4823]: I1216 08:41:24.503493 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-6-default"] Dec 16 08:41:24 crc kubenswrapper[4823]: E1216 08:41:24.503812 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c04fe89-992e-4bb0-ae97-9746bf82acc1" containerName="mariadb-client-5-default" Dec 16 08:41:24 crc kubenswrapper[4823]: I1216 08:41:24.503827 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c04fe89-992e-4bb0-ae97-9746bf82acc1" containerName="mariadb-client-5-default" Dec 16 08:41:24 crc kubenswrapper[4823]: I1216 08:41:24.503958 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c04fe89-992e-4bb0-ae97-9746bf82acc1" containerName="mariadb-client-5-default" Dec 16 08:41:24 crc kubenswrapper[4823]: I1216 08:41:24.504456 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-6-default" Dec 16 08:41:24 crc kubenswrapper[4823]: I1216 08:41:24.521575 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-6-default"] Dec 16 08:41:24 crc kubenswrapper[4823]: I1216 08:41:24.590038 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szjl8\" (UniqueName: \"kubernetes.io/projected/b83144b8-369c-426d-a0d4-d77b4867f93c-kube-api-access-szjl8\") pod \"mariadb-client-6-default\" (UID: \"b83144b8-369c-426d-a0d4-d77b4867f93c\") " pod="openstack/mariadb-client-6-default" Dec 16 08:41:24 crc kubenswrapper[4823]: I1216 08:41:24.692169 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-szjl8\" (UniqueName: \"kubernetes.io/projected/b83144b8-369c-426d-a0d4-d77b4867f93c-kube-api-access-szjl8\") pod \"mariadb-client-6-default\" (UID: \"b83144b8-369c-426d-a0d4-d77b4867f93c\") " pod="openstack/mariadb-client-6-default" Dec 16 08:41:24 crc kubenswrapper[4823]: I1216 08:41:24.721801 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-szjl8\" (UniqueName: \"kubernetes.io/projected/b83144b8-369c-426d-a0d4-d77b4867f93c-kube-api-access-szjl8\") pod \"mariadb-client-6-default\" (UID: \"b83144b8-369c-426d-a0d4-d77b4867f93c\") " pod="openstack/mariadb-client-6-default" Dec 16 08:41:24 crc kubenswrapper[4823]: I1216 08:41:24.822671 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-6-default" Dec 16 08:41:25 crc kubenswrapper[4823]: I1216 08:41:25.016329 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aa8eb630f9607f116076682e96a8f3b17e754b7f6e5ba93f17720a7c496c2a1b" Dec 16 08:41:25 crc kubenswrapper[4823]: I1216 08:41:25.016401 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-5-default" Dec 16 08:41:25 crc kubenswrapper[4823]: I1216 08:41:25.316229 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-6-default"] Dec 16 08:41:25 crc kubenswrapper[4823]: I1216 08:41:25.782394 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c04fe89-992e-4bb0-ae97-9746bf82acc1" path="/var/lib/kubelet/pods/2c04fe89-992e-4bb0-ae97-9746bf82acc1/volumes" Dec 16 08:41:26 crc kubenswrapper[4823]: I1216 08:41:26.026442 4823 generic.go:334] "Generic (PLEG): container finished" podID="b83144b8-369c-426d-a0d4-d77b4867f93c" containerID="71602f86d7d6b53a6bb06ebeb0531dac28dfd9c91958d55e6065c4f296f34715" exitCode=1 Dec 16 08:41:26 crc kubenswrapper[4823]: I1216 08:41:26.026485 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-6-default" event={"ID":"b83144b8-369c-426d-a0d4-d77b4867f93c","Type":"ContainerDied","Data":"71602f86d7d6b53a6bb06ebeb0531dac28dfd9c91958d55e6065c4f296f34715"} Dec 16 08:41:26 crc kubenswrapper[4823]: I1216 08:41:26.026512 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-6-default" event={"ID":"b83144b8-369c-426d-a0d4-d77b4867f93c","Type":"ContainerStarted","Data":"ae869d0fa4667c7428a4aa24da9f6652dda52bf19be9570d6848891705b56ca5"} Dec 16 08:41:27 crc kubenswrapper[4823]: I1216 08:41:27.409889 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-6-default" Dec 16 08:41:27 crc kubenswrapper[4823]: I1216 08:41:27.427113 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-6-default_b83144b8-369c-426d-a0d4-d77b4867f93c/mariadb-client-6-default/0.log" Dec 16 08:41:27 crc kubenswrapper[4823]: I1216 08:41:27.437661 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-szjl8\" (UniqueName: \"kubernetes.io/projected/b83144b8-369c-426d-a0d4-d77b4867f93c-kube-api-access-szjl8\") pod \"b83144b8-369c-426d-a0d4-d77b4867f93c\" (UID: \"b83144b8-369c-426d-a0d4-d77b4867f93c\") " Dec 16 08:41:27 crc kubenswrapper[4823]: I1216 08:41:27.444200 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b83144b8-369c-426d-a0d4-d77b4867f93c-kube-api-access-szjl8" (OuterVolumeSpecName: "kube-api-access-szjl8") pod "b83144b8-369c-426d-a0d4-d77b4867f93c" (UID: "b83144b8-369c-426d-a0d4-d77b4867f93c"). InnerVolumeSpecName "kube-api-access-szjl8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:41:27 crc kubenswrapper[4823]: I1216 08:41:27.458109 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-6-default"] Dec 16 08:41:27 crc kubenswrapper[4823]: I1216 08:41:27.463175 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-6-default"] Dec 16 08:41:27 crc kubenswrapper[4823]: I1216 08:41:27.540487 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-szjl8\" (UniqueName: \"kubernetes.io/projected/b83144b8-369c-426d-a0d4-d77b4867f93c-kube-api-access-szjl8\") on node \"crc\" DevicePath \"\"" Dec 16 08:41:27 crc kubenswrapper[4823]: I1216 08:41:27.604378 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-7-default"] Dec 16 08:41:27 crc kubenswrapper[4823]: E1216 08:41:27.604952 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b83144b8-369c-426d-a0d4-d77b4867f93c" containerName="mariadb-client-6-default" Dec 16 08:41:27 crc kubenswrapper[4823]: I1216 08:41:27.604971 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="b83144b8-369c-426d-a0d4-d77b4867f93c" containerName="mariadb-client-6-default" Dec 16 08:41:27 crc kubenswrapper[4823]: I1216 08:41:27.605195 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="b83144b8-369c-426d-a0d4-d77b4867f93c" containerName="mariadb-client-6-default" Dec 16 08:41:27 crc kubenswrapper[4823]: I1216 08:41:27.605801 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-7-default" Dec 16 08:41:27 crc kubenswrapper[4823]: I1216 08:41:27.613924 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-7-default"] Dec 16 08:41:27 crc kubenswrapper[4823]: I1216 08:41:27.642402 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftd8m\" (UniqueName: \"kubernetes.io/projected/097360a6-b73b-428a-9d70-18430739a654-kube-api-access-ftd8m\") pod \"mariadb-client-7-default\" (UID: \"097360a6-b73b-428a-9d70-18430739a654\") " pod="openstack/mariadb-client-7-default" Dec 16 08:41:27 crc kubenswrapper[4823]: I1216 08:41:27.744381 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftd8m\" (UniqueName: \"kubernetes.io/projected/097360a6-b73b-428a-9d70-18430739a654-kube-api-access-ftd8m\") pod \"mariadb-client-7-default\" (UID: \"097360a6-b73b-428a-9d70-18430739a654\") " pod="openstack/mariadb-client-7-default" Dec 16 08:41:27 crc kubenswrapper[4823]: I1216 08:41:27.765838 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftd8m\" (UniqueName: \"kubernetes.io/projected/097360a6-b73b-428a-9d70-18430739a654-kube-api-access-ftd8m\") pod \"mariadb-client-7-default\" (UID: \"097360a6-b73b-428a-9d70-18430739a654\") " pod="openstack/mariadb-client-7-default" Dec 16 08:41:27 crc kubenswrapper[4823]: I1216 08:41:27.782317 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b83144b8-369c-426d-a0d4-d77b4867f93c" path="/var/lib/kubelet/pods/b83144b8-369c-426d-a0d4-d77b4867f93c/volumes" Dec 16 08:41:27 crc kubenswrapper[4823]: I1216 08:41:27.970588 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-7-default" Dec 16 08:41:28 crc kubenswrapper[4823]: I1216 08:41:28.045758 4823 scope.go:117] "RemoveContainer" containerID="71602f86d7d6b53a6bb06ebeb0531dac28dfd9c91958d55e6065c4f296f34715" Dec 16 08:41:28 crc kubenswrapper[4823]: I1216 08:41:28.045835 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-6-default" Dec 16 08:41:28 crc kubenswrapper[4823]: I1216 08:41:28.134289 4823 patch_prober.go:28] interesting pod/machine-config-daemon-fv56f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 08:41:28 crc kubenswrapper[4823]: I1216 08:41:28.134602 4823 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 08:41:28 crc kubenswrapper[4823]: I1216 08:41:28.134658 4823 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" Dec 16 08:41:28 crc kubenswrapper[4823]: I1216 08:41:28.135379 4823 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3392da862948c3e1cef11c4a1c08d8880ad4041a534d76052813f8acc165efb4"} pod="openshift-machine-config-operator/machine-config-daemon-fv56f" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 16 08:41:28 crc kubenswrapper[4823]: I1216 08:41:28.135446 4823 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" containerName="machine-config-daemon" containerID="cri-o://3392da862948c3e1cef11c4a1c08d8880ad4041a534d76052813f8acc165efb4" gracePeriod=600 Dec 16 08:41:28 crc kubenswrapper[4823]: E1216 08:41:28.268072 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 08:41:28 crc kubenswrapper[4823]: I1216 08:41:28.460937 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-7-default"] Dec 16 08:41:28 crc kubenswrapper[4823]: W1216 08:41:28.465633 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod097360a6_b73b_428a_9d70_18430739a654.slice/crio-8a3cb77a25c6e99323941cc3a2b1626aad71a54b191aa2e5d98e36280b47ca23 WatchSource:0}: Error finding container 8a3cb77a25c6e99323941cc3a2b1626aad71a54b191aa2e5d98e36280b47ca23: Status 404 returned error can't find the container with id 8a3cb77a25c6e99323941cc3a2b1626aad71a54b191aa2e5d98e36280b47ca23 Dec 16 08:41:29 crc kubenswrapper[4823]: I1216 08:41:29.055073 4823 generic.go:334] "Generic (PLEG): container finished" podID="25dec47c-3043-486c-b371-2be103c214e3" containerID="3392da862948c3e1cef11c4a1c08d8880ad4041a534d76052813f8acc165efb4" exitCode=0 Dec 16 08:41:29 crc kubenswrapper[4823]: I1216 08:41:29.055151 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" 
event={"ID":"25dec47c-3043-486c-b371-2be103c214e3","Type":"ContainerDied","Data":"3392da862948c3e1cef11c4a1c08d8880ad4041a534d76052813f8acc165efb4"} Dec 16 08:41:29 crc kubenswrapper[4823]: I1216 08:41:29.055197 4823 scope.go:117] "RemoveContainer" containerID="294b6c0f0228f2baf018382a3d963ffc8968e2a2b2dcfcf14ae472b2ce45e535" Dec 16 08:41:29 crc kubenswrapper[4823]: I1216 08:41:29.055692 4823 scope.go:117] "RemoveContainer" containerID="3392da862948c3e1cef11c4a1c08d8880ad4041a534d76052813f8acc165efb4" Dec 16 08:41:29 crc kubenswrapper[4823]: E1216 08:41:29.055936 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 08:41:29 crc kubenswrapper[4823]: I1216 08:41:29.058133 4823 generic.go:334] "Generic (PLEG): container finished" podID="097360a6-b73b-428a-9d70-18430739a654" containerID="1a73f7002a007d6a34058ac251455bb9bf68eeb908a2ef77175507944ba4d4e5" exitCode=0 Dec 16 08:41:29 crc kubenswrapper[4823]: I1216 08:41:29.058182 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-7-default" event={"ID":"097360a6-b73b-428a-9d70-18430739a654","Type":"ContainerDied","Data":"1a73f7002a007d6a34058ac251455bb9bf68eeb908a2ef77175507944ba4d4e5"} Dec 16 08:41:29 crc kubenswrapper[4823]: I1216 08:41:29.058613 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-7-default" event={"ID":"097360a6-b73b-428a-9d70-18430739a654","Type":"ContainerStarted","Data":"8a3cb77a25c6e99323941cc3a2b1626aad71a54b191aa2e5d98e36280b47ca23"} Dec 16 08:41:30 crc kubenswrapper[4823]: I1216 08:41:30.430268 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-7-default" Dec 16 08:41:30 crc kubenswrapper[4823]: I1216 08:41:30.447373 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-7-default_097360a6-b73b-428a-9d70-18430739a654/mariadb-client-7-default/0.log" Dec 16 08:41:30 crc kubenswrapper[4823]: I1216 08:41:30.473448 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-7-default"] Dec 16 08:41:30 crc kubenswrapper[4823]: I1216 08:41:30.481203 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-7-default"] Dec 16 08:41:30 crc kubenswrapper[4823]: I1216 08:41:30.486017 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ftd8m\" (UniqueName: \"kubernetes.io/projected/097360a6-b73b-428a-9d70-18430739a654-kube-api-access-ftd8m\") pod \"097360a6-b73b-428a-9d70-18430739a654\" (UID: \"097360a6-b73b-428a-9d70-18430739a654\") " Dec 16 08:41:30 crc kubenswrapper[4823]: I1216 08:41:30.496259 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/097360a6-b73b-428a-9d70-18430739a654-kube-api-access-ftd8m" (OuterVolumeSpecName: "kube-api-access-ftd8m") pod "097360a6-b73b-428a-9d70-18430739a654" (UID: "097360a6-b73b-428a-9d70-18430739a654"). InnerVolumeSpecName "kube-api-access-ftd8m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:41:30 crc kubenswrapper[4823]: I1216 08:41:30.588197 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ftd8m\" (UniqueName: \"kubernetes.io/projected/097360a6-b73b-428a-9d70-18430739a654-kube-api-access-ftd8m\") on node \"crc\" DevicePath \"\"" Dec 16 08:41:30 crc kubenswrapper[4823]: I1216 08:41:30.596656 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-2"] Dec 16 08:41:30 crc kubenswrapper[4823]: E1216 08:41:30.597632 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="097360a6-b73b-428a-9d70-18430739a654" containerName="mariadb-client-7-default" Dec 16 08:41:30 crc kubenswrapper[4823]: I1216 08:41:30.597652 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="097360a6-b73b-428a-9d70-18430739a654" containerName="mariadb-client-7-default" Dec 16 08:41:30 crc kubenswrapper[4823]: I1216 08:41:30.597781 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="097360a6-b73b-428a-9d70-18430739a654" containerName="mariadb-client-7-default" Dec 16 08:41:30 crc kubenswrapper[4823]: I1216 08:41:30.600174 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-2" Dec 16 08:41:30 crc kubenswrapper[4823]: I1216 08:41:30.608160 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2"] Dec 16 08:41:30 crc kubenswrapper[4823]: I1216 08:41:30.690852 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkzfz\" (UniqueName: \"kubernetes.io/projected/815ed9bd-002f-40b5-af0d-191df6ac85d2-kube-api-access-fkzfz\") pod \"mariadb-client-2\" (UID: \"815ed9bd-002f-40b5-af0d-191df6ac85d2\") " pod="openstack/mariadb-client-2" Dec 16 08:41:30 crc kubenswrapper[4823]: I1216 08:41:30.792543 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fkzfz\" (UniqueName: \"kubernetes.io/projected/815ed9bd-002f-40b5-af0d-191df6ac85d2-kube-api-access-fkzfz\") pod \"mariadb-client-2\" (UID: \"815ed9bd-002f-40b5-af0d-191df6ac85d2\") " pod="openstack/mariadb-client-2" Dec 16 08:41:30 crc kubenswrapper[4823]: I1216 08:41:30.818715 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkzfz\" (UniqueName: \"kubernetes.io/projected/815ed9bd-002f-40b5-af0d-191df6ac85d2-kube-api-access-fkzfz\") pod \"mariadb-client-2\" (UID: \"815ed9bd-002f-40b5-af0d-191df6ac85d2\") " pod="openstack/mariadb-client-2" Dec 16 08:41:30 crc kubenswrapper[4823]: I1216 08:41:30.917798 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2" Dec 16 08:41:31 crc kubenswrapper[4823]: I1216 08:41:31.076194 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8a3cb77a25c6e99323941cc3a2b1626aad71a54b191aa2e5d98e36280b47ca23" Dec 16 08:41:31 crc kubenswrapper[4823]: I1216 08:41:31.076261 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-7-default" Dec 16 08:41:31 crc kubenswrapper[4823]: I1216 08:41:31.458885 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2"] Dec 16 08:41:31 crc kubenswrapper[4823]: W1216 08:41:31.464192 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod815ed9bd_002f_40b5_af0d_191df6ac85d2.slice/crio-f1b87c5d9ad1baf583bfe977d98cb2c4f0b1ef1ee9cd85352e8b553d6aa337c3 WatchSource:0}: Error finding container f1b87c5d9ad1baf583bfe977d98cb2c4f0b1ef1ee9cd85352e8b553d6aa337c3: Status 404 returned error can't find the container with id f1b87c5d9ad1baf583bfe977d98cb2c4f0b1ef1ee9cd85352e8b553d6aa337c3 Dec 16 08:41:31 crc kubenswrapper[4823]: I1216 08:41:31.780805 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="097360a6-b73b-428a-9d70-18430739a654" path="/var/lib/kubelet/pods/097360a6-b73b-428a-9d70-18430739a654/volumes" Dec 16 08:41:32 crc kubenswrapper[4823]: I1216 08:41:32.083159 4823 generic.go:334] "Generic (PLEG): container finished" podID="815ed9bd-002f-40b5-af0d-191df6ac85d2" containerID="25ab55d96c3ef8fbd8e12410dd0470d6c46e32b7caf91658ea6476f2083c743b" exitCode=0 Dec 16 08:41:32 crc kubenswrapper[4823]: I1216 08:41:32.083210 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2" event={"ID":"815ed9bd-002f-40b5-af0d-191df6ac85d2","Type":"ContainerDied","Data":"25ab55d96c3ef8fbd8e12410dd0470d6c46e32b7caf91658ea6476f2083c743b"} Dec 16 08:41:32 crc kubenswrapper[4823]: I1216 08:41:32.083239 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2" event={"ID":"815ed9bd-002f-40b5-af0d-191df6ac85d2","Type":"ContainerStarted","Data":"f1b87c5d9ad1baf583bfe977d98cb2c4f0b1ef1ee9cd85352e8b553d6aa337c3"} Dec 16 08:41:33 crc kubenswrapper[4823]: I1216 08:41:33.507608 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-2" Dec 16 08:41:33 crc kubenswrapper[4823]: I1216 08:41:33.562647 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-2_815ed9bd-002f-40b5-af0d-191df6ac85d2/mariadb-client-2/0.log" Dec 16 08:41:33 crc kubenswrapper[4823]: I1216 08:41:33.585153 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-2"] Dec 16 08:41:33 crc kubenswrapper[4823]: I1216 08:41:33.590110 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-2"] Dec 16 08:41:33 crc kubenswrapper[4823]: I1216 08:41:33.662952 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fkzfz\" (UniqueName: \"kubernetes.io/projected/815ed9bd-002f-40b5-af0d-191df6ac85d2-kube-api-access-fkzfz\") pod \"815ed9bd-002f-40b5-af0d-191df6ac85d2\" (UID: \"815ed9bd-002f-40b5-af0d-191df6ac85d2\") " Dec 16 08:41:33 crc kubenswrapper[4823]: I1216 08:41:33.669188 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/815ed9bd-002f-40b5-af0d-191df6ac85d2-kube-api-access-fkzfz" (OuterVolumeSpecName: "kube-api-access-fkzfz") pod "815ed9bd-002f-40b5-af0d-191df6ac85d2" (UID: "815ed9bd-002f-40b5-af0d-191df6ac85d2"). InnerVolumeSpecName "kube-api-access-fkzfz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:41:33 crc kubenswrapper[4823]: I1216 08:41:33.765389 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fkzfz\" (UniqueName: \"kubernetes.io/projected/815ed9bd-002f-40b5-af0d-191df6ac85d2-kube-api-access-fkzfz\") on node \"crc\" DevicePath \"\"" Dec 16 08:41:33 crc kubenswrapper[4823]: I1216 08:41:33.784839 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="815ed9bd-002f-40b5-af0d-191df6ac85d2" path="/var/lib/kubelet/pods/815ed9bd-002f-40b5-af0d-191df6ac85d2/volumes" Dec 16 08:41:34 crc kubenswrapper[4823]: I1216 08:41:34.096377 4823 scope.go:117] "RemoveContainer" containerID="25ab55d96c3ef8fbd8e12410dd0470d6c46e32b7caf91658ea6476f2083c743b" Dec 16 08:41:34 crc kubenswrapper[4823]: I1216 08:41:34.096455 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2" Dec 16 08:41:41 crc kubenswrapper[4823]: I1216 08:41:41.776781 4823 scope.go:117] "RemoveContainer" containerID="3392da862948c3e1cef11c4a1c08d8880ad4041a534d76052813f8acc165efb4" Dec 16 08:41:41 crc kubenswrapper[4823]: E1216 08:41:41.777631 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 08:41:52 crc kubenswrapper[4823]: I1216 08:41:52.771921 4823 scope.go:117] "RemoveContainer" containerID="3392da862948c3e1cef11c4a1c08d8880ad4041a534d76052813f8acc165efb4" Dec 16 08:41:52 crc kubenswrapper[4823]: E1216 08:41:52.772502 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 08:42:05 crc kubenswrapper[4823]: I1216 08:42:05.772256 4823 scope.go:117] "RemoveContainer" containerID="3392da862948c3e1cef11c4a1c08d8880ad4041a534d76052813f8acc165efb4" Dec 16 08:42:05 crc kubenswrapper[4823]: E1216 08:42:05.773305 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 08:42:17 crc kubenswrapper[4823]: I1216 08:42:17.771512 4823 scope.go:117] "RemoveContainer" containerID="3392da862948c3e1cef11c4a1c08d8880ad4041a534d76052813f8acc165efb4" Dec 16 08:42:17 crc kubenswrapper[4823]: E1216 08:42:17.772258 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 08:42:28 crc kubenswrapper[4823]: I1216 08:42:28.772205 4823 scope.go:117] "RemoveContainer" containerID="3392da862948c3e1cef11c4a1c08d8880ad4041a534d76052813f8acc165efb4" Dec 16 08:42:28 crc kubenswrapper[4823]: E1216 08:42:28.773386 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 08:42:40 crc kubenswrapper[4823]: I1216 08:42:40.772048 4823 scope.go:117] "RemoveContainer" containerID="3392da862948c3e1cef11c4a1c08d8880ad4041a534d76052813f8acc165efb4" Dec 16 08:42:40 crc kubenswrapper[4823]: E1216 08:42:40.772819 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 08:42:42 crc kubenswrapper[4823]: I1216 08:42:42.408558 4823 scope.go:117] "RemoveContainer" containerID="f550381959f87d979ba90f524766f86db98bf913dc564358a20b897de0ae2be3" Dec 16 08:42:52 crc kubenswrapper[4823]: I1216 08:42:52.772146 4823 scope.go:117] "RemoveContainer" containerID="3392da862948c3e1cef11c4a1c08d8880ad4041a534d76052813f8acc165efb4" Dec 16 08:42:52 crc kubenswrapper[4823]: E1216 08:42:52.772916 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 08:43:04 crc kubenswrapper[4823]: I1216 08:43:04.772522 4823 scope.go:117] "RemoveContainer" 
containerID="3392da862948c3e1cef11c4a1c08d8880ad4041a534d76052813f8acc165efb4" Dec 16 08:43:04 crc kubenswrapper[4823]: E1216 08:43:04.773381 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 08:43:16 crc kubenswrapper[4823]: I1216 08:43:16.772277 4823 scope.go:117] "RemoveContainer" containerID="3392da862948c3e1cef11c4a1c08d8880ad4041a534d76052813f8acc165efb4" Dec 16 08:43:16 crc kubenswrapper[4823]: E1216 08:43:16.772900 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 08:43:30 crc kubenswrapper[4823]: I1216 08:43:30.771585 4823 scope.go:117] "RemoveContainer" containerID="3392da862948c3e1cef11c4a1c08d8880ad4041a534d76052813f8acc165efb4" Dec 16 08:43:30 crc kubenswrapper[4823]: E1216 08:43:30.772338 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 08:43:42 crc kubenswrapper[4823]: I1216 08:43:42.772056 4823 scope.go:117] 
"RemoveContainer" containerID="3392da862948c3e1cef11c4a1c08d8880ad4041a534d76052813f8acc165efb4" Dec 16 08:43:42 crc kubenswrapper[4823]: E1216 08:43:42.773798 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 08:43:53 crc kubenswrapper[4823]: I1216 08:43:53.772872 4823 scope.go:117] "RemoveContainer" containerID="3392da862948c3e1cef11c4a1c08d8880ad4041a534d76052813f8acc165efb4" Dec 16 08:43:53 crc kubenswrapper[4823]: E1216 08:43:53.773806 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 08:44:07 crc kubenswrapper[4823]: I1216 08:44:07.771833 4823 scope.go:117] "RemoveContainer" containerID="3392da862948c3e1cef11c4a1c08d8880ad4041a534d76052813f8acc165efb4" Dec 16 08:44:07 crc kubenswrapper[4823]: E1216 08:44:07.772960 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 08:44:19 crc kubenswrapper[4823]: I1216 08:44:19.771911 
4823 scope.go:117] "RemoveContainer" containerID="3392da862948c3e1cef11c4a1c08d8880ad4041a534d76052813f8acc165efb4" Dec 16 08:44:19 crc kubenswrapper[4823]: E1216 08:44:19.773523 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 08:44:31 crc kubenswrapper[4823]: I1216 08:44:31.776601 4823 scope.go:117] "RemoveContainer" containerID="3392da862948c3e1cef11c4a1c08d8880ad4041a534d76052813f8acc165efb4" Dec 16 08:44:31 crc kubenswrapper[4823]: E1216 08:44:31.777430 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 08:44:44 crc kubenswrapper[4823]: I1216 08:44:44.772511 4823 scope.go:117] "RemoveContainer" containerID="3392da862948c3e1cef11c4a1c08d8880ad4041a534d76052813f8acc165efb4" Dec 16 08:44:44 crc kubenswrapper[4823]: E1216 08:44:44.773297 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 08:44:59 crc kubenswrapper[4823]: I1216 
08:44:59.772240 4823 scope.go:117] "RemoveContainer" containerID="3392da862948c3e1cef11c4a1c08d8880ad4041a534d76052813f8acc165efb4" Dec 16 08:44:59 crc kubenswrapper[4823]: E1216 08:44:59.773009 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 08:45:00 crc kubenswrapper[4823]: I1216 08:45:00.154991 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431245-tpvm5"] Dec 16 08:45:00 crc kubenswrapper[4823]: E1216 08:45:00.155692 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="815ed9bd-002f-40b5-af0d-191df6ac85d2" containerName="mariadb-client-2" Dec 16 08:45:00 crc kubenswrapper[4823]: I1216 08:45:00.155711 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="815ed9bd-002f-40b5-af0d-191df6ac85d2" containerName="mariadb-client-2" Dec 16 08:45:00 crc kubenswrapper[4823]: I1216 08:45:00.155868 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="815ed9bd-002f-40b5-af0d-191df6ac85d2" containerName="mariadb-client-2" Dec 16 08:45:00 crc kubenswrapper[4823]: I1216 08:45:00.156462 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431245-tpvm5" Dec 16 08:45:00 crc kubenswrapper[4823]: I1216 08:45:00.162865 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431245-tpvm5"] Dec 16 08:45:00 crc kubenswrapper[4823]: I1216 08:45:00.210513 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5e373ea9-e611-4955-be87-161a8c10b98d-secret-volume\") pod \"collect-profiles-29431245-tpvm5\" (UID: \"5e373ea9-e611-4955-be87-161a8c10b98d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431245-tpvm5" Dec 16 08:45:00 crc kubenswrapper[4823]: I1216 08:45:00.210622 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jf4p\" (UniqueName: \"kubernetes.io/projected/5e373ea9-e611-4955-be87-161a8c10b98d-kube-api-access-4jf4p\") pod \"collect-profiles-29431245-tpvm5\" (UID: \"5e373ea9-e611-4955-be87-161a8c10b98d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431245-tpvm5" Dec 16 08:45:00 crc kubenswrapper[4823]: I1216 08:45:00.210690 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5e373ea9-e611-4955-be87-161a8c10b98d-config-volume\") pod \"collect-profiles-29431245-tpvm5\" (UID: \"5e373ea9-e611-4955-be87-161a8c10b98d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431245-tpvm5" Dec 16 08:45:00 crc kubenswrapper[4823]: I1216 08:45:00.211837 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 16 08:45:00 crc kubenswrapper[4823]: I1216 08:45:00.212816 4823 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 16 08:45:00 crc kubenswrapper[4823]: I1216 08:45:00.313162 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5e373ea9-e611-4955-be87-161a8c10b98d-secret-volume\") pod \"collect-profiles-29431245-tpvm5\" (UID: \"5e373ea9-e611-4955-be87-161a8c10b98d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431245-tpvm5" Dec 16 08:45:00 crc kubenswrapper[4823]: I1216 08:45:00.313241 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4jf4p\" (UniqueName: \"kubernetes.io/projected/5e373ea9-e611-4955-be87-161a8c10b98d-kube-api-access-4jf4p\") pod \"collect-profiles-29431245-tpvm5\" (UID: \"5e373ea9-e611-4955-be87-161a8c10b98d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431245-tpvm5" Dec 16 08:45:00 crc kubenswrapper[4823]: I1216 08:45:00.313301 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5e373ea9-e611-4955-be87-161a8c10b98d-config-volume\") pod \"collect-profiles-29431245-tpvm5\" (UID: \"5e373ea9-e611-4955-be87-161a8c10b98d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431245-tpvm5" Dec 16 08:45:00 crc kubenswrapper[4823]: I1216 08:45:00.314308 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5e373ea9-e611-4955-be87-161a8c10b98d-config-volume\") pod \"collect-profiles-29431245-tpvm5\" (UID: \"5e373ea9-e611-4955-be87-161a8c10b98d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431245-tpvm5" Dec 16 08:45:00 crc kubenswrapper[4823]: I1216 08:45:00.323269 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/5e373ea9-e611-4955-be87-161a8c10b98d-secret-volume\") pod \"collect-profiles-29431245-tpvm5\" (UID: \"5e373ea9-e611-4955-be87-161a8c10b98d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431245-tpvm5" Dec 16 08:45:00 crc kubenswrapper[4823]: I1216 08:45:00.331790 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jf4p\" (UniqueName: \"kubernetes.io/projected/5e373ea9-e611-4955-be87-161a8c10b98d-kube-api-access-4jf4p\") pod \"collect-profiles-29431245-tpvm5\" (UID: \"5e373ea9-e611-4955-be87-161a8c10b98d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431245-tpvm5" Dec 16 08:45:00 crc kubenswrapper[4823]: I1216 08:45:00.540370 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431245-tpvm5" Dec 16 08:45:00 crc kubenswrapper[4823]: I1216 08:45:00.962891 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431245-tpvm5"] Dec 16 08:45:01 crc kubenswrapper[4823]: I1216 08:45:01.724858 4823 generic.go:334] "Generic (PLEG): container finished" podID="5e373ea9-e611-4955-be87-161a8c10b98d" containerID="4ad9360a4fb969af9766ac1ded4bf60c6225e8f66351e5e5bba621f4febfa3a4" exitCode=0 Dec 16 08:45:01 crc kubenswrapper[4823]: I1216 08:45:01.724908 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431245-tpvm5" event={"ID":"5e373ea9-e611-4955-be87-161a8c10b98d","Type":"ContainerDied","Data":"4ad9360a4fb969af9766ac1ded4bf60c6225e8f66351e5e5bba621f4febfa3a4"} Dec 16 08:45:01 crc kubenswrapper[4823]: I1216 08:45:01.725192 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431245-tpvm5" 
event={"ID":"5e373ea9-e611-4955-be87-161a8c10b98d","Type":"ContainerStarted","Data":"92955662ef0348423c69053a140196952eb719c909280e9c549f8cacf89b0426"} Dec 16 08:45:02 crc kubenswrapper[4823]: I1216 08:45:02.987665 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431245-tpvm5" Dec 16 08:45:03 crc kubenswrapper[4823]: I1216 08:45:03.155846 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4jf4p\" (UniqueName: \"kubernetes.io/projected/5e373ea9-e611-4955-be87-161a8c10b98d-kube-api-access-4jf4p\") pod \"5e373ea9-e611-4955-be87-161a8c10b98d\" (UID: \"5e373ea9-e611-4955-be87-161a8c10b98d\") " Dec 16 08:45:03 crc kubenswrapper[4823]: I1216 08:45:03.155903 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5e373ea9-e611-4955-be87-161a8c10b98d-secret-volume\") pod \"5e373ea9-e611-4955-be87-161a8c10b98d\" (UID: \"5e373ea9-e611-4955-be87-161a8c10b98d\") " Dec 16 08:45:03 crc kubenswrapper[4823]: I1216 08:45:03.155946 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5e373ea9-e611-4955-be87-161a8c10b98d-config-volume\") pod \"5e373ea9-e611-4955-be87-161a8c10b98d\" (UID: \"5e373ea9-e611-4955-be87-161a8c10b98d\") " Dec 16 08:45:03 crc kubenswrapper[4823]: I1216 08:45:03.156932 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e373ea9-e611-4955-be87-161a8c10b98d-config-volume" (OuterVolumeSpecName: "config-volume") pod "5e373ea9-e611-4955-be87-161a8c10b98d" (UID: "5e373ea9-e611-4955-be87-161a8c10b98d"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:45:03 crc kubenswrapper[4823]: I1216 08:45:03.162346 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e373ea9-e611-4955-be87-161a8c10b98d-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "5e373ea9-e611-4955-be87-161a8c10b98d" (UID: "5e373ea9-e611-4955-be87-161a8c10b98d"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:45:03 crc kubenswrapper[4823]: I1216 08:45:03.162897 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e373ea9-e611-4955-be87-161a8c10b98d-kube-api-access-4jf4p" (OuterVolumeSpecName: "kube-api-access-4jf4p") pod "5e373ea9-e611-4955-be87-161a8c10b98d" (UID: "5e373ea9-e611-4955-be87-161a8c10b98d"). InnerVolumeSpecName "kube-api-access-4jf4p". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:45:03 crc kubenswrapper[4823]: I1216 08:45:03.257299 4823 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5e373ea9-e611-4955-be87-161a8c10b98d-config-volume\") on node \"crc\" DevicePath \"\"" Dec 16 08:45:03 crc kubenswrapper[4823]: I1216 08:45:03.257336 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4jf4p\" (UniqueName: \"kubernetes.io/projected/5e373ea9-e611-4955-be87-161a8c10b98d-kube-api-access-4jf4p\") on node \"crc\" DevicePath \"\"" Dec 16 08:45:03 crc kubenswrapper[4823]: I1216 08:45:03.257350 4823 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5e373ea9-e611-4955-be87-161a8c10b98d-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 16 08:45:03 crc kubenswrapper[4823]: I1216 08:45:03.737642 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431245-tpvm5" 
event={"ID":"5e373ea9-e611-4955-be87-161a8c10b98d","Type":"ContainerDied","Data":"92955662ef0348423c69053a140196952eb719c909280e9c549f8cacf89b0426"} Dec 16 08:45:03 crc kubenswrapper[4823]: I1216 08:45:03.737683 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431245-tpvm5" Dec 16 08:45:03 crc kubenswrapper[4823]: I1216 08:45:03.737692 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="92955662ef0348423c69053a140196952eb719c909280e9c549f8cacf89b0426" Dec 16 08:45:04 crc kubenswrapper[4823]: I1216 08:45:04.064329 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431200-ctpzz"] Dec 16 08:45:04 crc kubenswrapper[4823]: I1216 08:45:04.070479 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431200-ctpzz"] Dec 16 08:45:05 crc kubenswrapper[4823]: I1216 08:45:05.008066 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-48gcp"] Dec 16 08:45:05 crc kubenswrapper[4823]: E1216 08:45:05.008459 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e373ea9-e611-4955-be87-161a8c10b98d" containerName="collect-profiles" Dec 16 08:45:05 crc kubenswrapper[4823]: I1216 08:45:05.008485 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e373ea9-e611-4955-be87-161a8c10b98d" containerName="collect-profiles" Dec 16 08:45:05 crc kubenswrapper[4823]: I1216 08:45:05.008688 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e373ea9-e611-4955-be87-161a8c10b98d" containerName="collect-profiles" Dec 16 08:45:05 crc kubenswrapper[4823]: I1216 08:45:05.010050 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-48gcp" Dec 16 08:45:05 crc kubenswrapper[4823]: I1216 08:45:05.028012 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-48gcp"] Dec 16 08:45:05 crc kubenswrapper[4823]: I1216 08:45:05.186177 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f827c95e-f78c-4f89-928b-29edf753c8be-utilities\") pod \"redhat-operators-48gcp\" (UID: \"f827c95e-f78c-4f89-928b-29edf753c8be\") " pod="openshift-marketplace/redhat-operators-48gcp" Dec 16 08:45:05 crc kubenswrapper[4823]: I1216 08:45:05.186222 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f827c95e-f78c-4f89-928b-29edf753c8be-catalog-content\") pod \"redhat-operators-48gcp\" (UID: \"f827c95e-f78c-4f89-928b-29edf753c8be\") " pod="openshift-marketplace/redhat-operators-48gcp" Dec 16 08:45:05 crc kubenswrapper[4823]: I1216 08:45:05.186290 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c65wp\" (UniqueName: \"kubernetes.io/projected/f827c95e-f78c-4f89-928b-29edf753c8be-kube-api-access-c65wp\") pod \"redhat-operators-48gcp\" (UID: \"f827c95e-f78c-4f89-928b-29edf753c8be\") " pod="openshift-marketplace/redhat-operators-48gcp" Dec 16 08:45:05 crc kubenswrapper[4823]: I1216 08:45:05.287985 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f827c95e-f78c-4f89-928b-29edf753c8be-utilities\") pod \"redhat-operators-48gcp\" (UID: \"f827c95e-f78c-4f89-928b-29edf753c8be\") " pod="openshift-marketplace/redhat-operators-48gcp" Dec 16 08:45:05 crc kubenswrapper[4823]: I1216 08:45:05.288069 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f827c95e-f78c-4f89-928b-29edf753c8be-catalog-content\") pod \"redhat-operators-48gcp\" (UID: \"f827c95e-f78c-4f89-928b-29edf753c8be\") " pod="openshift-marketplace/redhat-operators-48gcp" Dec 16 08:45:05 crc kubenswrapper[4823]: I1216 08:45:05.288140 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c65wp\" (UniqueName: \"kubernetes.io/projected/f827c95e-f78c-4f89-928b-29edf753c8be-kube-api-access-c65wp\") pod \"redhat-operators-48gcp\" (UID: \"f827c95e-f78c-4f89-928b-29edf753c8be\") " pod="openshift-marketplace/redhat-operators-48gcp" Dec 16 08:45:05 crc kubenswrapper[4823]: I1216 08:45:05.288716 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f827c95e-f78c-4f89-928b-29edf753c8be-utilities\") pod \"redhat-operators-48gcp\" (UID: \"f827c95e-f78c-4f89-928b-29edf753c8be\") " pod="openshift-marketplace/redhat-operators-48gcp" Dec 16 08:45:05 crc kubenswrapper[4823]: I1216 08:45:05.288790 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f827c95e-f78c-4f89-928b-29edf753c8be-catalog-content\") pod \"redhat-operators-48gcp\" (UID: \"f827c95e-f78c-4f89-928b-29edf753c8be\") " pod="openshift-marketplace/redhat-operators-48gcp" Dec 16 08:45:05 crc kubenswrapper[4823]: I1216 08:45:05.322117 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c65wp\" (UniqueName: \"kubernetes.io/projected/f827c95e-f78c-4f89-928b-29edf753c8be-kube-api-access-c65wp\") pod \"redhat-operators-48gcp\" (UID: \"f827c95e-f78c-4f89-928b-29edf753c8be\") " pod="openshift-marketplace/redhat-operators-48gcp" Dec 16 08:45:05 crc kubenswrapper[4823]: I1216 08:45:05.327732 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-48gcp" Dec 16 08:45:05 crc kubenswrapper[4823]: I1216 08:45:05.782958 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e97bd637-59e2-4dfb-9935-6a84f4e46388" path="/var/lib/kubelet/pods/e97bd637-59e2-4dfb-9935-6a84f4e46388/volumes" Dec 16 08:45:05 crc kubenswrapper[4823]: I1216 08:45:05.811389 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-48gcp"] Dec 16 08:45:06 crc kubenswrapper[4823]: I1216 08:45:06.770796 4823 generic.go:334] "Generic (PLEG): container finished" podID="f827c95e-f78c-4f89-928b-29edf753c8be" containerID="a851e0b6991f10aa565066012d12bc97b1816449701a366e65aadaf520303af7" exitCode=0 Dec 16 08:45:06 crc kubenswrapper[4823]: I1216 08:45:06.770851 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-48gcp" event={"ID":"f827c95e-f78c-4f89-928b-29edf753c8be","Type":"ContainerDied","Data":"a851e0b6991f10aa565066012d12bc97b1816449701a366e65aadaf520303af7"} Dec 16 08:45:06 crc kubenswrapper[4823]: I1216 08:45:06.770921 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-48gcp" event={"ID":"f827c95e-f78c-4f89-928b-29edf753c8be","Type":"ContainerStarted","Data":"dfd4262c48365c4f50d6e3494cb847a6ac0b23493f76e12edfd7af381c0b9649"} Dec 16 08:45:06 crc kubenswrapper[4823]: I1216 08:45:06.774340 4823 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 16 08:45:08 crc kubenswrapper[4823]: I1216 08:45:08.804384 4823 generic.go:334] "Generic (PLEG): container finished" podID="f827c95e-f78c-4f89-928b-29edf753c8be" containerID="f697bcc85ee87d2d60e5896bc5f26afacde7f20a174f3816b5f6282df566bf85" exitCode=0 Dec 16 08:45:08 crc kubenswrapper[4823]: I1216 08:45:08.804516 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-48gcp" 
event={"ID":"f827c95e-f78c-4f89-928b-29edf753c8be","Type":"ContainerDied","Data":"f697bcc85ee87d2d60e5896bc5f26afacde7f20a174f3816b5f6282df566bf85"} Dec 16 08:45:09 crc kubenswrapper[4823]: I1216 08:45:09.816044 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-48gcp" event={"ID":"f827c95e-f78c-4f89-928b-29edf753c8be","Type":"ContainerStarted","Data":"3fea5a410021f7991eabd8af44a5a77732721cad7a505622cf25ec7935c23365"} Dec 16 08:45:09 crc kubenswrapper[4823]: I1216 08:45:09.844885 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-48gcp" podStartSLOduration=3.393535515 podStartE2EDuration="5.844863324s" podCreationTimestamp="2025-12-16 08:45:04 +0000 UTC" firstStartedPulling="2025-12-16 08:45:06.773770343 +0000 UTC m=+6585.262336476" lastFinishedPulling="2025-12-16 08:45:09.225098152 +0000 UTC m=+6587.713664285" observedRunningTime="2025-12-16 08:45:09.83454058 +0000 UTC m=+6588.323106723" watchObservedRunningTime="2025-12-16 08:45:09.844863324 +0000 UTC m=+6588.333429447" Dec 16 08:45:11 crc kubenswrapper[4823]: I1216 08:45:11.775852 4823 scope.go:117] "RemoveContainer" containerID="3392da862948c3e1cef11c4a1c08d8880ad4041a534d76052813f8acc165efb4" Dec 16 08:45:11 crc kubenswrapper[4823]: E1216 08:45:11.776385 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 08:45:15 crc kubenswrapper[4823]: I1216 08:45:15.328277 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-48gcp" Dec 16 08:45:15 crc 
kubenswrapper[4823]: I1216 08:45:15.329131 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-48gcp" Dec 16 08:45:15 crc kubenswrapper[4823]: I1216 08:45:15.388311 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-48gcp" Dec 16 08:45:15 crc kubenswrapper[4823]: I1216 08:45:15.916952 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-48gcp" Dec 16 08:45:15 crc kubenswrapper[4823]: I1216 08:45:15.973893 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-48gcp"] Dec 16 08:45:17 crc kubenswrapper[4823]: I1216 08:45:17.871310 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-48gcp" podUID="f827c95e-f78c-4f89-928b-29edf753c8be" containerName="registry-server" containerID="cri-o://3fea5a410021f7991eabd8af44a5a77732721cad7a505622cf25ec7935c23365" gracePeriod=2 Dec 16 08:45:20 crc kubenswrapper[4823]: I1216 08:45:20.897584 4823 generic.go:334] "Generic (PLEG): container finished" podID="f827c95e-f78c-4f89-928b-29edf753c8be" containerID="3fea5a410021f7991eabd8af44a5a77732721cad7a505622cf25ec7935c23365" exitCode=0 Dec 16 08:45:20 crc kubenswrapper[4823]: I1216 08:45:20.897636 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-48gcp" event={"ID":"f827c95e-f78c-4f89-928b-29edf753c8be","Type":"ContainerDied","Data":"3fea5a410021f7991eabd8af44a5a77732721cad7a505622cf25ec7935c23365"} Dec 16 08:45:21 crc kubenswrapper[4823]: I1216 08:45:21.222479 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-48gcp" Dec 16 08:45:21 crc kubenswrapper[4823]: I1216 08:45:21.359383 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f827c95e-f78c-4f89-928b-29edf753c8be-catalog-content\") pod \"f827c95e-f78c-4f89-928b-29edf753c8be\" (UID: \"f827c95e-f78c-4f89-928b-29edf753c8be\") " Dec 16 08:45:21 crc kubenswrapper[4823]: I1216 08:45:21.359429 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c65wp\" (UniqueName: \"kubernetes.io/projected/f827c95e-f78c-4f89-928b-29edf753c8be-kube-api-access-c65wp\") pod \"f827c95e-f78c-4f89-928b-29edf753c8be\" (UID: \"f827c95e-f78c-4f89-928b-29edf753c8be\") " Dec 16 08:45:21 crc kubenswrapper[4823]: I1216 08:45:21.359463 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f827c95e-f78c-4f89-928b-29edf753c8be-utilities\") pod \"f827c95e-f78c-4f89-928b-29edf753c8be\" (UID: \"f827c95e-f78c-4f89-928b-29edf753c8be\") " Dec 16 08:45:21 crc kubenswrapper[4823]: I1216 08:45:21.360449 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f827c95e-f78c-4f89-928b-29edf753c8be-utilities" (OuterVolumeSpecName: "utilities") pod "f827c95e-f78c-4f89-928b-29edf753c8be" (UID: "f827c95e-f78c-4f89-928b-29edf753c8be"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 08:45:21 crc kubenswrapper[4823]: I1216 08:45:21.365240 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f827c95e-f78c-4f89-928b-29edf753c8be-kube-api-access-c65wp" (OuterVolumeSpecName: "kube-api-access-c65wp") pod "f827c95e-f78c-4f89-928b-29edf753c8be" (UID: "f827c95e-f78c-4f89-928b-29edf753c8be"). InnerVolumeSpecName "kube-api-access-c65wp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:45:21 crc kubenswrapper[4823]: I1216 08:45:21.460783 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c65wp\" (UniqueName: \"kubernetes.io/projected/f827c95e-f78c-4f89-928b-29edf753c8be-kube-api-access-c65wp\") on node \"crc\" DevicePath \"\"" Dec 16 08:45:21 crc kubenswrapper[4823]: I1216 08:45:21.460825 4823 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f827c95e-f78c-4f89-928b-29edf753c8be-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 08:45:21 crc kubenswrapper[4823]: I1216 08:45:21.494939 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f827c95e-f78c-4f89-928b-29edf753c8be-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f827c95e-f78c-4f89-928b-29edf753c8be" (UID: "f827c95e-f78c-4f89-928b-29edf753c8be"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 08:45:21 crc kubenswrapper[4823]: I1216 08:45:21.562183 4823 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f827c95e-f78c-4f89-928b-29edf753c8be-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 08:45:21 crc kubenswrapper[4823]: I1216 08:45:21.907562 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-48gcp" event={"ID":"f827c95e-f78c-4f89-928b-29edf753c8be","Type":"ContainerDied","Data":"dfd4262c48365c4f50d6e3494cb847a6ac0b23493f76e12edfd7af381c0b9649"} Dec 16 08:45:21 crc kubenswrapper[4823]: I1216 08:45:21.907633 4823 scope.go:117] "RemoveContainer" containerID="3fea5a410021f7991eabd8af44a5a77732721cad7a505622cf25ec7935c23365" Dec 16 08:45:21 crc kubenswrapper[4823]: I1216 08:45:21.907628 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-48gcp" Dec 16 08:45:21 crc kubenswrapper[4823]: I1216 08:45:21.931652 4823 scope.go:117] "RemoveContainer" containerID="f697bcc85ee87d2d60e5896bc5f26afacde7f20a174f3816b5f6282df566bf85" Dec 16 08:45:21 crc kubenswrapper[4823]: I1216 08:45:21.934202 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-48gcp"] Dec 16 08:45:21 crc kubenswrapper[4823]: I1216 08:45:21.943508 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-48gcp"] Dec 16 08:45:21 crc kubenswrapper[4823]: I1216 08:45:21.956519 4823 scope.go:117] "RemoveContainer" containerID="a851e0b6991f10aa565066012d12bc97b1816449701a366e65aadaf520303af7" Dec 16 08:45:23 crc kubenswrapper[4823]: I1216 08:45:23.785254 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f827c95e-f78c-4f89-928b-29edf753c8be" path="/var/lib/kubelet/pods/f827c95e-f78c-4f89-928b-29edf753c8be/volumes" Dec 16 08:45:25 crc kubenswrapper[4823]: I1216 08:45:25.771919 4823 scope.go:117] "RemoveContainer" containerID="3392da862948c3e1cef11c4a1c08d8880ad4041a534d76052813f8acc165efb4" Dec 16 08:45:25 crc kubenswrapper[4823]: E1216 08:45:25.772660 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 08:45:37 crc kubenswrapper[4823]: I1216 08:45:37.772148 4823 scope.go:117] "RemoveContainer" containerID="3392da862948c3e1cef11c4a1c08d8880ad4041a534d76052813f8acc165efb4" Dec 16 08:45:37 crc kubenswrapper[4823]: E1216 08:45:37.773188 4823 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 08:45:42 crc kubenswrapper[4823]: I1216 08:45:42.486566 4823 scope.go:117] "RemoveContainer" containerID="b25c945c7478d3ce67781f66e14be73a8abf6e776a23d0b8d03a34c19bfdd70c" Dec 16 08:45:48 crc kubenswrapper[4823]: I1216 08:45:48.772109 4823 scope.go:117] "RemoveContainer" containerID="3392da862948c3e1cef11c4a1c08d8880ad4041a534d76052813f8acc165efb4" Dec 16 08:45:48 crc kubenswrapper[4823]: E1216 08:45:48.773056 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 08:46:01 crc kubenswrapper[4823]: I1216 08:46:01.777095 4823 scope.go:117] "RemoveContainer" containerID="3392da862948c3e1cef11c4a1c08d8880ad4041a534d76052813f8acc165efb4" Dec 16 08:46:01 crc kubenswrapper[4823]: E1216 08:46:01.777916 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 08:46:16 crc kubenswrapper[4823]: I1216 08:46:16.772150 4823 scope.go:117] "RemoveContainer" 
containerID="3392da862948c3e1cef11c4a1c08d8880ad4041a534d76052813f8acc165efb4" Dec 16 08:46:16 crc kubenswrapper[4823]: E1216 08:46:16.773234 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 08:46:29 crc kubenswrapper[4823]: I1216 08:46:29.775080 4823 scope.go:117] "RemoveContainer" containerID="3392da862948c3e1cef11c4a1c08d8880ad4041a534d76052813f8acc165efb4" Dec 16 08:46:30 crc kubenswrapper[4823]: I1216 08:46:30.560183 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" event={"ID":"25dec47c-3043-486c-b371-2be103c214e3","Type":"ContainerStarted","Data":"e7aa677772e57f6515b9bb17d98c3eab3e0272b84b5b66258a7f5432c5f0835b"} Dec 16 08:47:42 crc kubenswrapper[4823]: I1216 08:47:42.566565 4823 scope.go:117] "RemoveContainer" containerID="1a73f7002a007d6a34058ac251455bb9bf68eeb908a2ef77175507944ba4d4e5" Dec 16 08:47:42 crc kubenswrapper[4823]: I1216 08:47:42.593409 4823 scope.go:117] "RemoveContainer" containerID="d7063c97ce184876ec9b65547d7fb8ba587fc200b68dd522df2c81481dbf07df" Dec 16 08:47:42 crc kubenswrapper[4823]: I1216 08:47:42.651142 4823 scope.go:117] "RemoveContainer" containerID="3e914c35e26a8301b1c2f74bc9e95002ba04447b28cb1f2f82975fd96570dbcd" Dec 16 08:47:42 crc kubenswrapper[4823]: I1216 08:47:42.692359 4823 scope.go:117] "RemoveContainer" containerID="bc65fd721fd7c67bf7529175e89a79673a070d6ec4732f446c3d2585d3fa2364" Dec 16 08:47:42 crc kubenswrapper[4823]: I1216 08:47:42.718986 4823 scope.go:117] "RemoveContainer" containerID="18df81425ceeb2a0cdb9bea5a167be02f2cade8a3185a442bce1fa38256c03b6" Dec 16 08:48:58 
crc kubenswrapper[4823]: I1216 08:48:58.134695 4823 patch_prober.go:28] interesting pod/machine-config-daemon-fv56f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 08:48:58 crc kubenswrapper[4823]: I1216 08:48:58.135354 4823 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 08:49:28 crc kubenswrapper[4823]: I1216 08:49:28.134243 4823 patch_prober.go:28] interesting pod/machine-config-daemon-fv56f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 08:49:28 crc kubenswrapper[4823]: I1216 08:49:28.134874 4823 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 08:49:54 crc kubenswrapper[4823]: I1216 08:49:54.062241 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-copy-data"] Dec 16 08:49:54 crc kubenswrapper[4823]: E1216 08:49:54.063406 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f827c95e-f78c-4f89-928b-29edf753c8be" containerName="registry-server" Dec 16 08:49:54 crc kubenswrapper[4823]: I1216 08:49:54.063428 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="f827c95e-f78c-4f89-928b-29edf753c8be" 
containerName="registry-server" Dec 16 08:49:54 crc kubenswrapper[4823]: E1216 08:49:54.063462 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f827c95e-f78c-4f89-928b-29edf753c8be" containerName="extract-content" Dec 16 08:49:54 crc kubenswrapper[4823]: I1216 08:49:54.063474 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="f827c95e-f78c-4f89-928b-29edf753c8be" containerName="extract-content" Dec 16 08:49:54 crc kubenswrapper[4823]: E1216 08:49:54.063491 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f827c95e-f78c-4f89-928b-29edf753c8be" containerName="extract-utilities" Dec 16 08:49:54 crc kubenswrapper[4823]: I1216 08:49:54.063507 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="f827c95e-f78c-4f89-928b-29edf753c8be" containerName="extract-utilities" Dec 16 08:49:54 crc kubenswrapper[4823]: I1216 08:49:54.063707 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="f827c95e-f78c-4f89-928b-29edf753c8be" containerName="registry-server" Dec 16 08:49:54 crc kubenswrapper[4823]: I1216 08:49:54.064538 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-copy-data" Dec 16 08:49:54 crc kubenswrapper[4823]: I1216 08:49:54.157205 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-mcfnl" Dec 16 08:49:54 crc kubenswrapper[4823]: I1216 08:49:54.173796 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-copy-data"] Dec 16 08:49:54 crc kubenswrapper[4823]: I1216 08:49:54.204056 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vc8f2\" (UniqueName: \"kubernetes.io/projected/a5a5b801-4e68-48cb-a50c-c1f7cc5bf2e7-kube-api-access-vc8f2\") pod \"mariadb-copy-data\" (UID: \"a5a5b801-4e68-48cb-a50c-c1f7cc5bf2e7\") " pod="openstack/mariadb-copy-data" Dec 16 08:49:54 crc kubenswrapper[4823]: I1216 08:49:54.204255 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-13347050-82ad-45f6-9369-708eb586bc9c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-13347050-82ad-45f6-9369-708eb586bc9c\") pod \"mariadb-copy-data\" (UID: \"a5a5b801-4e68-48cb-a50c-c1f7cc5bf2e7\") " pod="openstack/mariadb-copy-data" Dec 16 08:49:54 crc kubenswrapper[4823]: I1216 08:49:54.306407 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vc8f2\" (UniqueName: \"kubernetes.io/projected/a5a5b801-4e68-48cb-a50c-c1f7cc5bf2e7-kube-api-access-vc8f2\") pod \"mariadb-copy-data\" (UID: \"a5a5b801-4e68-48cb-a50c-c1f7cc5bf2e7\") " pod="openstack/mariadb-copy-data" Dec 16 08:49:54 crc kubenswrapper[4823]: I1216 08:49:54.306517 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-13347050-82ad-45f6-9369-708eb586bc9c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-13347050-82ad-45f6-9369-708eb586bc9c\") pod \"mariadb-copy-data\" (UID: \"a5a5b801-4e68-48cb-a50c-c1f7cc5bf2e7\") " pod="openstack/mariadb-copy-data" 
Dec 16 08:49:54 crc kubenswrapper[4823]: I1216 08:49:54.310040 4823 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 16 08:49:54 crc kubenswrapper[4823]: I1216 08:49:54.310087 4823 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-13347050-82ad-45f6-9369-708eb586bc9c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-13347050-82ad-45f6-9369-708eb586bc9c\") pod \"mariadb-copy-data\" (UID: \"a5a5b801-4e68-48cb-a50c-c1f7cc5bf2e7\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/c376fd464025f3870758026019d1e4112f265c7e978ca6c56c993ae6e7c3f580/globalmount\"" pod="openstack/mariadb-copy-data" Dec 16 08:49:54 crc kubenswrapper[4823]: I1216 08:49:54.330925 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vc8f2\" (UniqueName: \"kubernetes.io/projected/a5a5b801-4e68-48cb-a50c-c1f7cc5bf2e7-kube-api-access-vc8f2\") pod \"mariadb-copy-data\" (UID: \"a5a5b801-4e68-48cb-a50c-c1f7cc5bf2e7\") " pod="openstack/mariadb-copy-data" Dec 16 08:49:54 crc kubenswrapper[4823]: I1216 08:49:54.342469 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-13347050-82ad-45f6-9369-708eb586bc9c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-13347050-82ad-45f6-9369-708eb586bc9c\") pod \"mariadb-copy-data\" (UID: \"a5a5b801-4e68-48cb-a50c-c1f7cc5bf2e7\") " pod="openstack/mariadb-copy-data" Dec 16 08:49:54 crc kubenswrapper[4823]: I1216 08:49:54.477506 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-copy-data" Dec 16 08:49:55 crc kubenswrapper[4823]: I1216 08:49:55.051441 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-copy-data"] Dec 16 08:49:55 crc kubenswrapper[4823]: I1216 08:49:55.453931 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"a5a5b801-4e68-48cb-a50c-c1f7cc5bf2e7","Type":"ContainerStarted","Data":"809020e57886a330ac0588ed75cd6801305a87d81cbb57b685c04b626655fd93"} Dec 16 08:49:55 crc kubenswrapper[4823]: I1216 08:49:55.454306 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"a5a5b801-4e68-48cb-a50c-c1f7cc5bf2e7","Type":"ContainerStarted","Data":"380427654f4fd9ca64240b353ce082dd27e8be7a481d76582b175c586f4c6e77"} Dec 16 08:49:55 crc kubenswrapper[4823]: I1216 08:49:55.473342 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-copy-data" podStartSLOduration=2.473324503 podStartE2EDuration="2.473324503s" podCreationTimestamp="2025-12-16 08:49:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 08:49:55.469146632 +0000 UTC m=+6873.957712755" watchObservedRunningTime="2025-12-16 08:49:55.473324503 +0000 UTC m=+6873.961890626" Dec 16 08:49:58 crc kubenswrapper[4823]: I1216 08:49:58.134156 4823 patch_prober.go:28] interesting pod/machine-config-daemon-fv56f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 08:49:58 crc kubenswrapper[4823]: I1216 08:49:58.134599 4823 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 08:49:58 crc kubenswrapper[4823]: I1216 08:49:58.134673 4823 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" Dec 16 08:49:58 crc kubenswrapper[4823]: I1216 08:49:58.135723 4823 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e7aa677772e57f6515b9bb17d98c3eab3e0272b84b5b66258a7f5432c5f0835b"} pod="openshift-machine-config-operator/machine-config-daemon-fv56f" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 16 08:49:58 crc kubenswrapper[4823]: I1216 08:49:58.135824 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" containerName="machine-config-daemon" containerID="cri-o://e7aa677772e57f6515b9bb17d98c3eab3e0272b84b5b66258a7f5432c5f0835b" gracePeriod=600 Dec 16 08:49:58 crc kubenswrapper[4823]: I1216 08:49:58.520506 4823 generic.go:334] "Generic (PLEG): container finished" podID="25dec47c-3043-486c-b371-2be103c214e3" containerID="e7aa677772e57f6515b9bb17d98c3eab3e0272b84b5b66258a7f5432c5f0835b" exitCode=0 Dec 16 08:49:58 crc kubenswrapper[4823]: I1216 08:49:58.520557 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" event={"ID":"25dec47c-3043-486c-b371-2be103c214e3","Type":"ContainerDied","Data":"e7aa677772e57f6515b9bb17d98c3eab3e0272b84b5b66258a7f5432c5f0835b"} Dec 16 08:49:58 crc kubenswrapper[4823]: I1216 08:49:58.520875 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" 
event={"ID":"25dec47c-3043-486c-b371-2be103c214e3","Type":"ContainerStarted","Data":"9ce3e6cc66a3ba1f5a9f07614bbf78a449581b45707f8e1e5d9794f67e5c0428"} Dec 16 08:49:58 crc kubenswrapper[4823]: I1216 08:49:58.520896 4823 scope.go:117] "RemoveContainer" containerID="3392da862948c3e1cef11c4a1c08d8880ad4041a534d76052813f8acc165efb4" Dec 16 08:49:58 crc kubenswrapper[4823]: I1216 08:49:58.812449 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"] Dec 16 08:49:58 crc kubenswrapper[4823]: I1216 08:49:58.815667 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Dec 16 08:49:58 crc kubenswrapper[4823]: I1216 08:49:58.820175 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Dec 16 08:49:58 crc kubenswrapper[4823]: I1216 08:49:58.847701 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwrpt\" (UniqueName: \"kubernetes.io/projected/767de378-0244-4ae1-8416-df7e82706d48-kube-api-access-mwrpt\") pod \"mariadb-client\" (UID: \"767de378-0244-4ae1-8416-df7e82706d48\") " pod="openstack/mariadb-client" Dec 16 08:49:58 crc kubenswrapper[4823]: I1216 08:49:58.948786 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwrpt\" (UniqueName: \"kubernetes.io/projected/767de378-0244-4ae1-8416-df7e82706d48-kube-api-access-mwrpt\") pod \"mariadb-client\" (UID: \"767de378-0244-4ae1-8416-df7e82706d48\") " pod="openstack/mariadb-client" Dec 16 08:49:58 crc kubenswrapper[4823]: I1216 08:49:58.978208 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwrpt\" (UniqueName: \"kubernetes.io/projected/767de378-0244-4ae1-8416-df7e82706d48-kube-api-access-mwrpt\") pod \"mariadb-client\" (UID: \"767de378-0244-4ae1-8416-df7e82706d48\") " pod="openstack/mariadb-client" Dec 16 08:49:59 crc kubenswrapper[4823]: I1216 08:49:59.175497 
4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Dec 16 08:49:59 crc kubenswrapper[4823]: I1216 08:49:59.686854 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Dec 16 08:49:59 crc kubenswrapper[4823]: W1216 08:49:59.700429 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod767de378_0244_4ae1_8416_df7e82706d48.slice/crio-11580ff8921f7c10aff62e3ae031c3cae889f09ed36a62fa07de387095f6a628 WatchSource:0}: Error finding container 11580ff8921f7c10aff62e3ae031c3cae889f09ed36a62fa07de387095f6a628: Status 404 returned error can't find the container with id 11580ff8921f7c10aff62e3ae031c3cae889f09ed36a62fa07de387095f6a628 Dec 16 08:50:00 crc kubenswrapper[4823]: I1216 08:50:00.540782 4823 generic.go:334] "Generic (PLEG): container finished" podID="767de378-0244-4ae1-8416-df7e82706d48" containerID="0249968b43bd996ba4c490b1b172d3964d514a2718cf78c3c4b9728773e2dc5f" exitCode=0 Dec 16 08:50:00 crc kubenswrapper[4823]: I1216 08:50:00.540840 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"767de378-0244-4ae1-8416-df7e82706d48","Type":"ContainerDied","Data":"0249968b43bd996ba4c490b1b172d3964d514a2718cf78c3c4b9728773e2dc5f"} Dec 16 08:50:00 crc kubenswrapper[4823]: I1216 08:50:00.542087 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"767de378-0244-4ae1-8416-df7e82706d48","Type":"ContainerStarted","Data":"11580ff8921f7c10aff62e3ae031c3cae889f09ed36a62fa07de387095f6a628"} Dec 16 08:50:01 crc kubenswrapper[4823]: I1216 08:50:01.910141 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Dec 16 08:50:01 crc kubenswrapper[4823]: I1216 08:50:01.953059 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_767de378-0244-4ae1-8416-df7e82706d48/mariadb-client/0.log" Dec 16 08:50:01 crc kubenswrapper[4823]: I1216 08:50:01.984177 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Dec 16 08:50:01 crc kubenswrapper[4823]: I1216 08:50:01.990496 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"] Dec 16 08:50:02 crc kubenswrapper[4823]: I1216 08:50:02.015869 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mwrpt\" (UniqueName: \"kubernetes.io/projected/767de378-0244-4ae1-8416-df7e82706d48-kube-api-access-mwrpt\") pod \"767de378-0244-4ae1-8416-df7e82706d48\" (UID: \"767de378-0244-4ae1-8416-df7e82706d48\") " Dec 16 08:50:02 crc kubenswrapper[4823]: I1216 08:50:02.024231 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/767de378-0244-4ae1-8416-df7e82706d48-kube-api-access-mwrpt" (OuterVolumeSpecName: "kube-api-access-mwrpt") pod "767de378-0244-4ae1-8416-df7e82706d48" (UID: "767de378-0244-4ae1-8416-df7e82706d48"). InnerVolumeSpecName "kube-api-access-mwrpt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:50:02 crc kubenswrapper[4823]: I1216 08:50:02.109763 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"] Dec 16 08:50:02 crc kubenswrapper[4823]: E1216 08:50:02.110095 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="767de378-0244-4ae1-8416-df7e82706d48" containerName="mariadb-client" Dec 16 08:50:02 crc kubenswrapper[4823]: I1216 08:50:02.110107 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="767de378-0244-4ae1-8416-df7e82706d48" containerName="mariadb-client" Dec 16 08:50:02 crc kubenswrapper[4823]: I1216 08:50:02.110252 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="767de378-0244-4ae1-8416-df7e82706d48" containerName="mariadb-client" Dec 16 08:50:02 crc kubenswrapper[4823]: I1216 08:50:02.110695 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Dec 16 08:50:02 crc kubenswrapper[4823]: I1216 08:50:02.117481 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mwrpt\" (UniqueName: \"kubernetes.io/projected/767de378-0244-4ae1-8416-df7e82706d48-kube-api-access-mwrpt\") on node \"crc\" DevicePath \"\"" Dec 16 08:50:02 crc kubenswrapper[4823]: I1216 08:50:02.125541 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Dec 16 08:50:02 crc kubenswrapper[4823]: I1216 08:50:02.218818 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hmzh\" (UniqueName: \"kubernetes.io/projected/11cc0d89-e052-4f9d-9e04-b334941e182f-kube-api-access-9hmzh\") pod \"mariadb-client\" (UID: \"11cc0d89-e052-4f9d-9e04-b334941e182f\") " pod="openstack/mariadb-client" Dec 16 08:50:02 crc kubenswrapper[4823]: I1216 08:50:02.320488 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9hmzh\" (UniqueName: 
\"kubernetes.io/projected/11cc0d89-e052-4f9d-9e04-b334941e182f-kube-api-access-9hmzh\") pod \"mariadb-client\" (UID: \"11cc0d89-e052-4f9d-9e04-b334941e182f\") " pod="openstack/mariadb-client" Dec 16 08:50:02 crc kubenswrapper[4823]: I1216 08:50:02.351520 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hmzh\" (UniqueName: \"kubernetes.io/projected/11cc0d89-e052-4f9d-9e04-b334941e182f-kube-api-access-9hmzh\") pod \"mariadb-client\" (UID: \"11cc0d89-e052-4f9d-9e04-b334941e182f\") " pod="openstack/mariadb-client" Dec 16 08:50:02 crc kubenswrapper[4823]: I1216 08:50:02.473829 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Dec 16 08:50:02 crc kubenswrapper[4823]: I1216 08:50:02.580246 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="11580ff8921f7c10aff62e3ae031c3cae889f09ed36a62fa07de387095f6a628" Dec 16 08:50:02 crc kubenswrapper[4823]: I1216 08:50:02.580327 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Dec 16 08:50:02 crc kubenswrapper[4823]: I1216 08:50:02.599044 4823 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/mariadb-client" oldPodUID="767de378-0244-4ae1-8416-df7e82706d48" podUID="11cc0d89-e052-4f9d-9e04-b334941e182f" Dec 16 08:50:02 crc kubenswrapper[4823]: I1216 08:50:02.915966 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Dec 16 08:50:03 crc kubenswrapper[4823]: I1216 08:50:03.591234 4823 generic.go:334] "Generic (PLEG): container finished" podID="11cc0d89-e052-4f9d-9e04-b334941e182f" containerID="03659f99df8607265e3bc3e2c161dd22c3a561cec8494c87016a1df74b164e10" exitCode=0 Dec 16 08:50:03 crc kubenswrapper[4823]: I1216 08:50:03.591410 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"11cc0d89-e052-4f9d-9e04-b334941e182f","Type":"ContainerDied","Data":"03659f99df8607265e3bc3e2c161dd22c3a561cec8494c87016a1df74b164e10"} Dec 16 08:50:03 crc kubenswrapper[4823]: I1216 08:50:03.591633 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"11cc0d89-e052-4f9d-9e04-b334941e182f","Type":"ContainerStarted","Data":"2144ba5b2bfa34fc8f51a4e1a536cb841cb0e4b0bce102c0635568fcb48438f6"} Dec 16 08:50:03 crc kubenswrapper[4823]: I1216 08:50:03.785204 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="767de378-0244-4ae1-8416-df7e82706d48" path="/var/lib/kubelet/pods/767de378-0244-4ae1-8416-df7e82706d48/volumes" Dec 16 08:50:04 crc kubenswrapper[4823]: I1216 08:50:04.929944 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Dec 16 08:50:04 crc kubenswrapper[4823]: I1216 08:50:04.953711 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_11cc0d89-e052-4f9d-9e04-b334941e182f/mariadb-client/0.log" Dec 16 08:50:04 crc kubenswrapper[4823]: I1216 08:50:04.987798 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Dec 16 08:50:04 crc kubenswrapper[4823]: I1216 08:50:04.997703 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"] Dec 16 08:50:05 crc kubenswrapper[4823]: I1216 08:50:05.065522 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9hmzh\" (UniqueName: \"kubernetes.io/projected/11cc0d89-e052-4f9d-9e04-b334941e182f-kube-api-access-9hmzh\") pod \"11cc0d89-e052-4f9d-9e04-b334941e182f\" (UID: \"11cc0d89-e052-4f9d-9e04-b334941e182f\") " Dec 16 08:50:05 crc kubenswrapper[4823]: I1216 08:50:05.070803 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11cc0d89-e052-4f9d-9e04-b334941e182f-kube-api-access-9hmzh" (OuterVolumeSpecName: "kube-api-access-9hmzh") pod "11cc0d89-e052-4f9d-9e04-b334941e182f" (UID: "11cc0d89-e052-4f9d-9e04-b334941e182f"). InnerVolumeSpecName "kube-api-access-9hmzh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:50:05 crc kubenswrapper[4823]: I1216 08:50:05.167099 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9hmzh\" (UniqueName: \"kubernetes.io/projected/11cc0d89-e052-4f9d-9e04-b334941e182f-kube-api-access-9hmzh\") on node \"crc\" DevicePath \"\"" Dec 16 08:50:05 crc kubenswrapper[4823]: I1216 08:50:05.615910 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2144ba5b2bfa34fc8f51a4e1a536cb841cb0e4b0bce102c0635568fcb48438f6" Dec 16 08:50:05 crc kubenswrapper[4823]: I1216 08:50:05.615964 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Dec 16 08:50:05 crc kubenswrapper[4823]: I1216 08:50:05.780273 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11cc0d89-e052-4f9d-9e04-b334941e182f" path="/var/lib/kubelet/pods/11cc0d89-e052-4f9d-9e04-b334941e182f/volumes" Dec 16 08:50:40 crc kubenswrapper[4823]: I1216 08:50:40.224558 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 16 08:50:40 crc kubenswrapper[4823]: E1216 08:50:40.227098 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11cc0d89-e052-4f9d-9e04-b334941e182f" containerName="mariadb-client" Dec 16 08:50:40 crc kubenswrapper[4823]: I1216 08:50:40.227208 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="11cc0d89-e052-4f9d-9e04-b334941e182f" containerName="mariadb-client" Dec 16 08:50:40 crc kubenswrapper[4823]: I1216 08:50:40.227741 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="11cc0d89-e052-4f9d-9e04-b334941e182f" containerName="mariadb-client" Dec 16 08:50:40 crc kubenswrapper[4823]: I1216 08:50:40.229814 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 16 08:50:40 crc kubenswrapper[4823]: I1216 08:50:40.235475 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Dec 16 08:50:40 crc kubenswrapper[4823]: I1216 08:50:40.235780 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Dec 16 08:50:40 crc kubenswrapper[4823]: I1216 08:50:40.235781 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Dec 16 08:50:40 crc kubenswrapper[4823]: I1216 08:50:40.235596 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-x7t5x" Dec 16 08:50:40 crc kubenswrapper[4823]: I1216 08:50:40.235528 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Dec 16 08:50:40 crc kubenswrapper[4823]: I1216 08:50:40.261658 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-1"] Dec 16 08:50:40 crc kubenswrapper[4823]: I1216 08:50:40.263180 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-1" Dec 16 08:50:40 crc kubenswrapper[4823]: I1216 08:50:40.284856 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-2"] Dec 16 08:50:40 crc kubenswrapper[4823]: I1216 08:50:40.286471 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-2" Dec 16 08:50:40 crc kubenswrapper[4823]: I1216 08:50:40.291873 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 16 08:50:40 crc kubenswrapper[4823]: I1216 08:50:40.300189 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"] Dec 16 08:50:40 crc kubenswrapper[4823]: I1216 08:50:40.308989 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-1"] Dec 16 08:50:40 crc kubenswrapper[4823]: I1216 08:50:40.337834 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8\") " pod="openstack/ovsdbserver-sb-0" Dec 16 08:50:40 crc kubenswrapper[4823]: I1216 08:50:40.337886 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-3de6a60f-97e5-4309-9362-0b3562144f2b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3de6a60f-97e5-4309-9362-0b3562144f2b\") pod \"ovsdbserver-sb-0\" (UID: \"6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8\") " pod="openstack/ovsdbserver-sb-0" Dec 16 08:50:40 crc kubenswrapper[4823]: I1216 08:50:40.337921 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8\") " pod="openstack/ovsdbserver-sb-0" Dec 16 08:50:40 crc kubenswrapper[4823]: I1216 08:50:40.337942 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8njnm\" (UniqueName: 
\"kubernetes.io/projected/6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8-kube-api-access-8njnm\") pod \"ovsdbserver-sb-0\" (UID: \"6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8\") " pod="openstack/ovsdbserver-sb-0" Dec 16 08:50:40 crc kubenswrapper[4823]: I1216 08:50:40.337963 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8\") " pod="openstack/ovsdbserver-sb-0" Dec 16 08:50:40 crc kubenswrapper[4823]: I1216 08:50:40.338004 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8\") " pod="openstack/ovsdbserver-sb-0" Dec 16 08:50:40 crc kubenswrapper[4823]: I1216 08:50:40.338070 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8-config\") pod \"ovsdbserver-sb-0\" (UID: \"6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8\") " pod="openstack/ovsdbserver-sb-0" Dec 16 08:50:40 crc kubenswrapper[4823]: I1216 08:50:40.338097 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8\") " pod="openstack/ovsdbserver-sb-0" Dec 16 08:50:40 crc kubenswrapper[4823]: I1216 08:50:40.439754 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8-ovsdbserver-sb-tls-certs\") pod 
\"ovsdbserver-sb-0\" (UID: \"6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8\") " pod="openstack/ovsdbserver-sb-0" Dec 16 08:50:40 crc kubenswrapper[4823]: I1216 08:50:40.440148 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-3de6a60f-97e5-4309-9362-0b3562144f2b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3de6a60f-97e5-4309-9362-0b3562144f2b\") pod \"ovsdbserver-sb-0\" (UID: \"6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8\") " pod="openstack/ovsdbserver-sb-0" Dec 16 08:50:40 crc kubenswrapper[4823]: I1216 08:50:40.440294 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/dc75b889-6dc5-462d-a589-50f705ffd78f-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"dc75b889-6dc5-462d-a589-50f705ffd78f\") " pod="openstack/ovsdbserver-sb-2" Dec 16 08:50:40 crc kubenswrapper[4823]: I1216 08:50:40.440406 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c99b5e4-de24-426d-9a97-05fdcbe37141-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"6c99b5e4-de24-426d-9a97-05fdcbe37141\") " pod="openstack/ovsdbserver-sb-1" Dec 16 08:50:40 crc kubenswrapper[4823]: I1216 08:50:40.440518 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8\") " pod="openstack/ovsdbserver-sb-0" Dec 16 08:50:40 crc kubenswrapper[4823]: I1216 08:50:40.440627 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8njnm\" (UniqueName: \"kubernetes.io/projected/6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8-kube-api-access-8njnm\") pod \"ovsdbserver-sb-0\" (UID: 
\"6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8\") " pod="openstack/ovsdbserver-sb-0" Dec 16 08:50:40 crc kubenswrapper[4823]: I1216 08:50:40.440749 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8\") " pod="openstack/ovsdbserver-sb-0" Dec 16 08:50:40 crc kubenswrapper[4823]: I1216 08:50:40.440863 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c99b5e4-de24-426d-9a97-05fdcbe37141-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"6c99b5e4-de24-426d-9a97-05fdcbe37141\") " pod="openstack/ovsdbserver-sb-1" Dec 16 08:50:40 crc kubenswrapper[4823]: I1216 08:50:40.440973 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc75b889-6dc5-462d-a589-50f705ffd78f-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"dc75b889-6dc5-462d-a589-50f705ffd78f\") " pod="openstack/ovsdbserver-sb-2" Dec 16 08:50:40 crc kubenswrapper[4823]: I1216 08:50:40.441063 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8\") " pod="openstack/ovsdbserver-sb-0" Dec 16 08:50:40 crc kubenswrapper[4823]: I1216 08:50:40.441216 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc75b889-6dc5-462d-a589-50f705ffd78f-config\") pod \"ovsdbserver-sb-2\" (UID: \"dc75b889-6dc5-462d-a589-50f705ffd78f\") " pod="openstack/ovsdbserver-sb-2" Dec 16 
08:50:40 crc kubenswrapper[4823]: I1216 08:50:40.441317 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c99b5e4-de24-426d-9a97-05fdcbe37141-config\") pod \"ovsdbserver-sb-1\" (UID: \"6c99b5e4-de24-426d-9a97-05fdcbe37141\") " pod="openstack/ovsdbserver-sb-1" Dec 16 08:50:40 crc kubenswrapper[4823]: I1216 08:50:40.441390 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c99b5e4-de24-426d-9a97-05fdcbe37141-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"6c99b5e4-de24-426d-9a97-05fdcbe37141\") " pod="openstack/ovsdbserver-sb-1" Dec 16 08:50:40 crc kubenswrapper[4823]: I1216 08:50:40.441463 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8-config\") pod \"ovsdbserver-sb-0\" (UID: \"6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8\") " pod="openstack/ovsdbserver-sb-0" Dec 16 08:50:40 crc kubenswrapper[4823]: I1216 08:50:40.441539 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6c99b5e4-de24-426d-9a97-05fdcbe37141-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"6c99b5e4-de24-426d-9a97-05fdcbe37141\") " pod="openstack/ovsdbserver-sb-1" Dec 16 08:50:40 crc kubenswrapper[4823]: I1216 08:50:40.441658 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8\") " pod="openstack/ovsdbserver-sb-0" Dec 16 08:50:40 crc kubenswrapper[4823]: I1216 08:50:40.441769 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/configmap/dc75b889-6dc5-462d-a589-50f705ffd78f-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"dc75b889-6dc5-462d-a589-50f705ffd78f\") " pod="openstack/ovsdbserver-sb-2" Dec 16 08:50:40 crc kubenswrapper[4823]: I1216 08:50:40.441878 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc75b889-6dc5-462d-a589-50f705ffd78f-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"dc75b889-6dc5-462d-a589-50f705ffd78f\") " pod="openstack/ovsdbserver-sb-2" Dec 16 08:50:40 crc kubenswrapper[4823]: I1216 08:50:40.442011 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc75b889-6dc5-462d-a589-50f705ffd78f-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"dc75b889-6dc5-462d-a589-50f705ffd78f\") " pod="openstack/ovsdbserver-sb-2" Dec 16 08:50:40 crc kubenswrapper[4823]: I1216 08:50:40.442248 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6c99b5e4-de24-426d-9a97-05fdcbe37141-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"6c99b5e4-de24-426d-9a97-05fdcbe37141\") " pod="openstack/ovsdbserver-sb-1" Dec 16 08:50:40 crc kubenswrapper[4823]: I1216 08:50:40.442382 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9b2q\" (UniqueName: \"kubernetes.io/projected/dc75b889-6dc5-462d-a589-50f705ffd78f-kube-api-access-f9b2q\") pod \"ovsdbserver-sb-2\" (UID: \"dc75b889-6dc5-462d-a589-50f705ffd78f\") " pod="openstack/ovsdbserver-sb-2" Dec 16 08:50:40 crc kubenswrapper[4823]: I1216 08:50:40.442486 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-0a26bacc-8a4e-496b-bde2-446a41ec7f03\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0a26bacc-8a4e-496b-bde2-446a41ec7f03\") pod \"ovsdbserver-sb-1\" (UID: \"6c99b5e4-de24-426d-9a97-05fdcbe37141\") " pod="openstack/ovsdbserver-sb-1" Dec 16 08:50:40 crc kubenswrapper[4823]: I1216 08:50:40.442609 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49rnz\" (UniqueName: \"kubernetes.io/projected/6c99b5e4-de24-426d-9a97-05fdcbe37141-kube-api-access-49rnz\") pod \"ovsdbserver-sb-1\" (UID: \"6c99b5e4-de24-426d-9a97-05fdcbe37141\") " pod="openstack/ovsdbserver-sb-1" Dec 16 08:50:40 crc kubenswrapper[4823]: I1216 08:50:40.442726 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-464a6eca-8ad0-46e3-8e7b-4ed9c1986afb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-464a6eca-8ad0-46e3-8e7b-4ed9c1986afb\") pod \"ovsdbserver-sb-2\" (UID: \"dc75b889-6dc5-462d-a589-50f705ffd78f\") " pod="openstack/ovsdbserver-sb-2" Dec 16 08:50:40 crc kubenswrapper[4823]: I1216 08:50:40.442265 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8\") " pod="openstack/ovsdbserver-sb-0" Dec 16 08:50:40 crc kubenswrapper[4823]: I1216 08:50:40.443097 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8\") " pod="openstack/ovsdbserver-sb-0" Dec 16 08:50:40 crc kubenswrapper[4823]: I1216 08:50:40.442436 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8-config\") pod \"ovsdbserver-sb-0\" (UID: 
\"6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8\") " pod="openstack/ovsdbserver-sb-0" Dec 16 08:50:40 crc kubenswrapper[4823]: I1216 08:50:40.445905 4823 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 16 08:50:40 crc kubenswrapper[4823]: I1216 08:50:40.445958 4823 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-3de6a60f-97e5-4309-9362-0b3562144f2b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3de6a60f-97e5-4309-9362-0b3562144f2b\") pod \"ovsdbserver-sb-0\" (UID: \"6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/49ce9d7fce951f664b8ee8e97a386a1ecdc7485ace717e324e9c7a734749116c/globalmount\"" pod="openstack/ovsdbserver-sb-0" Dec 16 08:50:40 crc kubenswrapper[4823]: I1216 08:50:40.446705 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8\") " pod="openstack/ovsdbserver-sb-0" Dec 16 08:50:40 crc kubenswrapper[4823]: I1216 08:50:40.447109 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8\") " pod="openstack/ovsdbserver-sb-0" Dec 16 08:50:40 crc kubenswrapper[4823]: I1216 08:50:40.447179 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8\") " pod="openstack/ovsdbserver-sb-0" Dec 16 08:50:40 crc kubenswrapper[4823]: 
I1216 08:50:40.460710 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8njnm\" (UniqueName: \"kubernetes.io/projected/6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8-kube-api-access-8njnm\") pod \"ovsdbserver-sb-0\" (UID: \"6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8\") " pod="openstack/ovsdbserver-sb-0" Dec 16 08:50:40 crc kubenswrapper[4823]: I1216 08:50:40.477100 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-3de6a60f-97e5-4309-9362-0b3562144f2b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3de6a60f-97e5-4309-9362-0b3562144f2b\") pod \"ovsdbserver-sb-0\" (UID: \"6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8\") " pod="openstack/ovsdbserver-sb-0" Dec 16 08:50:40 crc kubenswrapper[4823]: I1216 08:50:40.543706 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6c99b5e4-de24-426d-9a97-05fdcbe37141-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"6c99b5e4-de24-426d-9a97-05fdcbe37141\") " pod="openstack/ovsdbserver-sb-1" Dec 16 08:50:40 crc kubenswrapper[4823]: I1216 08:50:40.543769 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dc75b889-6dc5-462d-a589-50f705ffd78f-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"dc75b889-6dc5-462d-a589-50f705ffd78f\") " pod="openstack/ovsdbserver-sb-2" Dec 16 08:50:40 crc kubenswrapper[4823]: I1216 08:50:40.543810 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc75b889-6dc5-462d-a589-50f705ffd78f-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"dc75b889-6dc5-462d-a589-50f705ffd78f\") " pod="openstack/ovsdbserver-sb-2" Dec 16 08:50:40 crc kubenswrapper[4823]: I1216 08:50:40.543852 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/dc75b889-6dc5-462d-a589-50f705ffd78f-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"dc75b889-6dc5-462d-a589-50f705ffd78f\") " pod="openstack/ovsdbserver-sb-2" Dec 16 08:50:40 crc kubenswrapper[4823]: I1216 08:50:40.543880 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6c99b5e4-de24-426d-9a97-05fdcbe37141-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"6c99b5e4-de24-426d-9a97-05fdcbe37141\") " pod="openstack/ovsdbserver-sb-1" Dec 16 08:50:40 crc kubenswrapper[4823]: I1216 08:50:40.543902 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9b2q\" (UniqueName: \"kubernetes.io/projected/dc75b889-6dc5-462d-a589-50f705ffd78f-kube-api-access-f9b2q\") pod \"ovsdbserver-sb-2\" (UID: \"dc75b889-6dc5-462d-a589-50f705ffd78f\") " pod="openstack/ovsdbserver-sb-2" Dec 16 08:50:40 crc kubenswrapper[4823]: I1216 08:50:40.543924 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-0a26bacc-8a4e-496b-bde2-446a41ec7f03\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0a26bacc-8a4e-496b-bde2-446a41ec7f03\") pod \"ovsdbserver-sb-1\" (UID: \"6c99b5e4-de24-426d-9a97-05fdcbe37141\") " pod="openstack/ovsdbserver-sb-1" Dec 16 08:50:40 crc kubenswrapper[4823]: I1216 08:50:40.543966 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49rnz\" (UniqueName: \"kubernetes.io/projected/6c99b5e4-de24-426d-9a97-05fdcbe37141-kube-api-access-49rnz\") pod \"ovsdbserver-sb-1\" (UID: \"6c99b5e4-de24-426d-9a97-05fdcbe37141\") " pod="openstack/ovsdbserver-sb-1" Dec 16 08:50:40 crc kubenswrapper[4823]: I1216 08:50:40.543992 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-464a6eca-8ad0-46e3-8e7b-4ed9c1986afb\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-464a6eca-8ad0-46e3-8e7b-4ed9c1986afb\") pod \"ovsdbserver-sb-2\" (UID: \"dc75b889-6dc5-462d-a589-50f705ffd78f\") " pod="openstack/ovsdbserver-sb-2" Dec 16 08:50:40 crc kubenswrapper[4823]: I1216 08:50:40.544052 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/dc75b889-6dc5-462d-a589-50f705ffd78f-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"dc75b889-6dc5-462d-a589-50f705ffd78f\") " pod="openstack/ovsdbserver-sb-2" Dec 16 08:50:40 crc kubenswrapper[4823]: I1216 08:50:40.544077 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c99b5e4-de24-426d-9a97-05fdcbe37141-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"6c99b5e4-de24-426d-9a97-05fdcbe37141\") " pod="openstack/ovsdbserver-sb-1" Dec 16 08:50:40 crc kubenswrapper[4823]: I1216 08:50:40.544121 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c99b5e4-de24-426d-9a97-05fdcbe37141-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"6c99b5e4-de24-426d-9a97-05fdcbe37141\") " pod="openstack/ovsdbserver-sb-1" Dec 16 08:50:40 crc kubenswrapper[4823]: I1216 08:50:40.544165 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc75b889-6dc5-462d-a589-50f705ffd78f-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"dc75b889-6dc5-462d-a589-50f705ffd78f\") " pod="openstack/ovsdbserver-sb-2" Dec 16 08:50:40 crc kubenswrapper[4823]: I1216 08:50:40.544191 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc75b889-6dc5-462d-a589-50f705ffd78f-config\") pod \"ovsdbserver-sb-2\" (UID: 
\"dc75b889-6dc5-462d-a589-50f705ffd78f\") " pod="openstack/ovsdbserver-sb-2" Dec 16 08:50:40 crc kubenswrapper[4823]: I1216 08:50:40.544212 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c99b5e4-de24-426d-9a97-05fdcbe37141-config\") pod \"ovsdbserver-sb-1\" (UID: \"6c99b5e4-de24-426d-9a97-05fdcbe37141\") " pod="openstack/ovsdbserver-sb-1" Dec 16 08:50:40 crc kubenswrapper[4823]: I1216 08:50:40.544233 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c99b5e4-de24-426d-9a97-05fdcbe37141-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"6c99b5e4-de24-426d-9a97-05fdcbe37141\") " pod="openstack/ovsdbserver-sb-1" Dec 16 08:50:40 crc kubenswrapper[4823]: I1216 08:50:40.544422 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6c99b5e4-de24-426d-9a97-05fdcbe37141-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"6c99b5e4-de24-426d-9a97-05fdcbe37141\") " pod="openstack/ovsdbserver-sb-1" Dec 16 08:50:40 crc kubenswrapper[4823]: I1216 08:50:40.545728 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dc75b889-6dc5-462d-a589-50f705ffd78f-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"dc75b889-6dc5-462d-a589-50f705ffd78f\") " pod="openstack/ovsdbserver-sb-2" Dec 16 08:50:40 crc kubenswrapper[4823]: I1216 08:50:40.546230 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc75b889-6dc5-462d-a589-50f705ffd78f-config\") pod \"ovsdbserver-sb-2\" (UID: \"dc75b889-6dc5-462d-a589-50f705ffd78f\") " pod="openstack/ovsdbserver-sb-2" Dec 16 08:50:40 crc kubenswrapper[4823]: I1216 08:50:40.546972 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/dc75b889-6dc5-462d-a589-50f705ffd78f-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"dc75b889-6dc5-462d-a589-50f705ffd78f\") " pod="openstack/ovsdbserver-sb-2" Dec 16 08:50:40 crc kubenswrapper[4823]: I1216 08:50:40.547100 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c99b5e4-de24-426d-9a97-05fdcbe37141-config\") pod \"ovsdbserver-sb-1\" (UID: \"6c99b5e4-de24-426d-9a97-05fdcbe37141\") " pod="openstack/ovsdbserver-sb-1" Dec 16 08:50:40 crc kubenswrapper[4823]: I1216 08:50:40.549163 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c99b5e4-de24-426d-9a97-05fdcbe37141-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"6c99b5e4-de24-426d-9a97-05fdcbe37141\") " pod="openstack/ovsdbserver-sb-1" Dec 16 08:50:40 crc kubenswrapper[4823]: I1216 08:50:40.549478 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc75b889-6dc5-462d-a589-50f705ffd78f-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"dc75b889-6dc5-462d-a589-50f705ffd78f\") " pod="openstack/ovsdbserver-sb-2" Dec 16 08:50:40 crc kubenswrapper[4823]: I1216 08:50:40.549484 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c99b5e4-de24-426d-9a97-05fdcbe37141-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"6c99b5e4-de24-426d-9a97-05fdcbe37141\") " pod="openstack/ovsdbserver-sb-1" Dec 16 08:50:40 crc kubenswrapper[4823]: I1216 08:50:40.550046 4823 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 16 08:50:40 crc kubenswrapper[4823]: I1216 08:50:40.550078 4823 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-0a26bacc-8a4e-496b-bde2-446a41ec7f03\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0a26bacc-8a4e-496b-bde2-446a41ec7f03\") pod \"ovsdbserver-sb-1\" (UID: \"6c99b5e4-de24-426d-9a97-05fdcbe37141\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/3a42bb67a36d68dbe42439590fc061ffedfcefc6a5bf93d2d83031a12ea9293d/globalmount\"" pod="openstack/ovsdbserver-sb-1" Dec 16 08:50:40 crc kubenswrapper[4823]: I1216 08:50:40.551154 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc75b889-6dc5-462d-a589-50f705ffd78f-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"dc75b889-6dc5-462d-a589-50f705ffd78f\") " pod="openstack/ovsdbserver-sb-2" Dec 16 08:50:40 crc kubenswrapper[4823]: I1216 08:50:40.551343 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6c99b5e4-de24-426d-9a97-05fdcbe37141-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"6c99b5e4-de24-426d-9a97-05fdcbe37141\") " pod="openstack/ovsdbserver-sb-1" Dec 16 08:50:40 crc kubenswrapper[4823]: I1216 08:50:40.551354 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c99b5e4-de24-426d-9a97-05fdcbe37141-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"6c99b5e4-de24-426d-9a97-05fdcbe37141\") " pod="openstack/ovsdbserver-sb-1" Dec 16 08:50:40 crc kubenswrapper[4823]: I1216 08:50:40.551444 4823 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 16 08:50:40 crc kubenswrapper[4823]: I1216 08:50:40.551471 4823 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-464a6eca-8ad0-46e3-8e7b-4ed9c1986afb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-464a6eca-8ad0-46e3-8e7b-4ed9c1986afb\") pod \"ovsdbserver-sb-2\" (UID: \"dc75b889-6dc5-462d-a589-50f705ffd78f\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/391ffb98c0d0bf550200c2232922d1a6ce5645892671e61b3b7e42af031b60ef/globalmount\"" pod="openstack/ovsdbserver-sb-2" Dec 16 08:50:40 crc kubenswrapper[4823]: I1216 08:50:40.555570 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc75b889-6dc5-462d-a589-50f705ffd78f-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"dc75b889-6dc5-462d-a589-50f705ffd78f\") " pod="openstack/ovsdbserver-sb-2" Dec 16 08:50:40 crc kubenswrapper[4823]: I1216 08:50:40.565093 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 16 08:50:40 crc kubenswrapper[4823]: I1216 08:50:40.571711 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49rnz\" (UniqueName: \"kubernetes.io/projected/6c99b5e4-de24-426d-9a97-05fdcbe37141-kube-api-access-49rnz\") pod \"ovsdbserver-sb-1\" (UID: \"6c99b5e4-de24-426d-9a97-05fdcbe37141\") " pod="openstack/ovsdbserver-sb-1" Dec 16 08:50:40 crc kubenswrapper[4823]: I1216 08:50:40.579873 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9b2q\" (UniqueName: \"kubernetes.io/projected/dc75b889-6dc5-462d-a589-50f705ffd78f-kube-api-access-f9b2q\") pod \"ovsdbserver-sb-2\" (UID: \"dc75b889-6dc5-462d-a589-50f705ffd78f\") " pod="openstack/ovsdbserver-sb-2" Dec 16 08:50:40 crc kubenswrapper[4823]: I1216 08:50:40.587091 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-464a6eca-8ad0-46e3-8e7b-4ed9c1986afb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-464a6eca-8ad0-46e3-8e7b-4ed9c1986afb\") pod \"ovsdbserver-sb-2\" (UID: \"dc75b889-6dc5-462d-a589-50f705ffd78f\") " pod="openstack/ovsdbserver-sb-2" Dec 16 08:50:40 crc kubenswrapper[4823]: I1216 08:50:40.604727 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-0a26bacc-8a4e-496b-bde2-446a41ec7f03\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0a26bacc-8a4e-496b-bde2-446a41ec7f03\") pod \"ovsdbserver-sb-1\" (UID: \"6c99b5e4-de24-426d-9a97-05fdcbe37141\") " pod="openstack/ovsdbserver-sb-1" Dec 16 08:50:40 crc kubenswrapper[4823]: I1216 08:50:40.605224 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-2" Dec 16 08:50:40 crc kubenswrapper[4823]: I1216 08:50:40.885833 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-1" Dec 16 08:50:41 crc kubenswrapper[4823]: I1216 08:50:41.171943 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 16 08:50:41 crc kubenswrapper[4823]: W1216 08:50:41.182245 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6cd65ea3_90ff_4a09_9ae8_3406de2d5ad8.slice/crio-ec6bb6a11a5c2013352dc44d193fe800db2ae48b152fedad7595d139852bee58 WatchSource:0}: Error finding container ec6bb6a11a5c2013352dc44d193fe800db2ae48b152fedad7595d139852bee58: Status 404 returned error can't find the container with id ec6bb6a11a5c2013352dc44d193fe800db2ae48b152fedad7595d139852bee58 Dec 16 08:50:41 crc kubenswrapper[4823]: I1216 08:50:41.184376 4823 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 16 08:50:41 crc kubenswrapper[4823]: I1216 08:50:41.267207 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"] Dec 16 08:50:41 crc kubenswrapper[4823]: W1216 08:50:41.272936 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddc75b889_6dc5_462d_a589_50f705ffd78f.slice/crio-6fe56092f831488b7e2b9906b1e4c85664acf1c1917e2265341a547fd64e4cdc WatchSource:0}: Error finding container 6fe56092f831488b7e2b9906b1e4c85664acf1c1917e2265341a547fd64e4cdc: Status 404 returned error can't find the container with id 6fe56092f831488b7e2b9906b1e4c85664acf1c1917e2265341a547fd64e4cdc Dec 16 08:50:41 crc kubenswrapper[4823]: I1216 08:50:41.510783 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-1"] Dec 16 08:50:41 crc kubenswrapper[4823]: W1216 08:50:41.529496 4823 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6c99b5e4_de24_426d_9a97_05fdcbe37141.slice/crio-5bed026074a251bc904029bd61e5b94db39ebac6155e0235d43832673edcc33b WatchSource:0}: Error finding container 5bed026074a251bc904029bd61e5b94db39ebac6155e0235d43832673edcc33b: Status 404 returned error can't find the container with id 5bed026074a251bc904029bd61e5b94db39ebac6155e0235d43832673edcc33b Dec 16 08:50:41 crc kubenswrapper[4823]: I1216 08:50:41.894476 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8","Type":"ContainerStarted","Data":"ec6bb6a11a5c2013352dc44d193fe800db2ae48b152fedad7595d139852bee58"} Dec 16 08:50:41 crc kubenswrapper[4823]: I1216 08:50:41.895563 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"6c99b5e4-de24-426d-9a97-05fdcbe37141","Type":"ContainerStarted","Data":"5bed026074a251bc904029bd61e5b94db39ebac6155e0235d43832673edcc33b"} Dec 16 08:50:41 crc kubenswrapper[4823]: I1216 08:50:41.896956 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"dc75b889-6dc5-462d-a589-50f705ffd78f","Type":"ContainerStarted","Data":"6fe56092f831488b7e2b9906b1e4c85664acf1c1917e2265341a547fd64e4cdc"} Dec 16 08:50:42 crc kubenswrapper[4823]: I1216 08:50:42.304909 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 16 08:50:42 crc kubenswrapper[4823]: I1216 08:50:42.306895 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 16 08:50:42 crc kubenswrapper[4823]: I1216 08:50:42.357729 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-tsq4g" Dec 16 08:50:42 crc kubenswrapper[4823]: I1216 08:50:42.357990 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Dec 16 08:50:42 crc kubenswrapper[4823]: I1216 08:50:42.358302 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Dec 16 08:50:42 crc kubenswrapper[4823]: I1216 08:50:42.358898 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Dec 16 08:50:42 crc kubenswrapper[4823]: I1216 08:50:42.370113 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 16 08:50:42 crc kubenswrapper[4823]: I1216 08:50:42.372205 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05dfc2e3-71af-4150-a4ca-02b5629083ae-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"05dfc2e3-71af-4150-a4ca-02b5629083ae\") " pod="openstack/ovsdbserver-nb-0" Dec 16 08:50:42 crc kubenswrapper[4823]: I1216 08:50:42.372275 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/05dfc2e3-71af-4150-a4ca-02b5629083ae-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"05dfc2e3-71af-4150-a4ca-02b5629083ae\") " pod="openstack/ovsdbserver-nb-0" Dec 16 08:50:42 crc kubenswrapper[4823]: I1216 08:50:42.372305 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/05dfc2e3-71af-4150-a4ca-02b5629083ae-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" 
(UID: \"05dfc2e3-71af-4150-a4ca-02b5629083ae\") " pod="openstack/ovsdbserver-nb-0" Dec 16 08:50:42 crc kubenswrapper[4823]: I1216 08:50:42.372357 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05dfc2e3-71af-4150-a4ca-02b5629083ae-config\") pod \"ovsdbserver-nb-0\" (UID: \"05dfc2e3-71af-4150-a4ca-02b5629083ae\") " pod="openstack/ovsdbserver-nb-0" Dec 16 08:50:42 crc kubenswrapper[4823]: I1216 08:50:42.372385 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/05dfc2e3-71af-4150-a4ca-02b5629083ae-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"05dfc2e3-71af-4150-a4ca-02b5629083ae\") " pod="openstack/ovsdbserver-nb-0" Dec 16 08:50:42 crc kubenswrapper[4823]: I1216 08:50:42.372417 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rslzb\" (UniqueName: \"kubernetes.io/projected/05dfc2e3-71af-4150-a4ca-02b5629083ae-kube-api-access-rslzb\") pod \"ovsdbserver-nb-0\" (UID: \"05dfc2e3-71af-4150-a4ca-02b5629083ae\") " pod="openstack/ovsdbserver-nb-0" Dec 16 08:50:42 crc kubenswrapper[4823]: I1216 08:50:42.372462 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/05dfc2e3-71af-4150-a4ca-02b5629083ae-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"05dfc2e3-71af-4150-a4ca-02b5629083ae\") " pod="openstack/ovsdbserver-nb-0" Dec 16 08:50:42 crc kubenswrapper[4823]: I1216 08:50:42.372566 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-a528e881-7b8b-4172-b241-4b700d70fcaf\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a528e881-7b8b-4172-b241-4b700d70fcaf\") pod \"ovsdbserver-nb-0\" (UID: 
\"05dfc2e3-71af-4150-a4ca-02b5629083ae\") " pod="openstack/ovsdbserver-nb-0" Dec 16 08:50:42 crc kubenswrapper[4823]: I1216 08:50:42.380917 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-2"] Dec 16 08:50:42 crc kubenswrapper[4823]: I1216 08:50:42.382151 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-2" Dec 16 08:50:42 crc kubenswrapper[4823]: I1216 08:50:42.387864 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-1"] Dec 16 08:50:42 crc kubenswrapper[4823]: I1216 08:50:42.389690 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-1" Dec 16 08:50:42 crc kubenswrapper[4823]: I1216 08:50:42.420568 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-2"] Dec 16 08:50:42 crc kubenswrapper[4823]: I1216 08:50:42.434988 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"] Dec 16 08:50:42 crc kubenswrapper[4823]: I1216 08:50:42.475997 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-a528e881-7b8b-4172-b241-4b700d70fcaf\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a528e881-7b8b-4172-b241-4b700d70fcaf\") pod \"ovsdbserver-nb-0\" (UID: \"05dfc2e3-71af-4150-a4ca-02b5629083ae\") " pod="openstack/ovsdbserver-nb-0" Dec 16 08:50:42 crc kubenswrapper[4823]: I1216 08:50:42.476122 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e1d3682-8130-4fa4-aab4-ade2ac069d2e-config\") pod \"ovsdbserver-nb-1\" (UID: \"7e1d3682-8130-4fa4-aab4-ade2ac069d2e\") " pod="openstack/ovsdbserver-nb-1" Dec 16 08:50:42 crc kubenswrapper[4823]: I1216 08:50:42.476171 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"pvc-20321045-c94b-408e-81f6-d22070c77447\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-20321045-c94b-408e-81f6-d22070c77447\") pod \"ovsdbserver-nb-1\" (UID: \"7e1d3682-8130-4fa4-aab4-ade2ac069d2e\") " pod="openstack/ovsdbserver-nb-1" Dec 16 08:50:42 crc kubenswrapper[4823]: I1216 08:50:42.476229 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05dfc2e3-71af-4150-a4ca-02b5629083ae-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"05dfc2e3-71af-4150-a4ca-02b5629083ae\") " pod="openstack/ovsdbserver-nb-0" Dec 16 08:50:42 crc kubenswrapper[4823]: I1216 08:50:42.476257 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e1d3682-8130-4fa4-aab4-ade2ac069d2e-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"7e1d3682-8130-4fa4-aab4-ade2ac069d2e\") " pod="openstack/ovsdbserver-nb-1" Dec 16 08:50:42 crc kubenswrapper[4823]: I1216 08:50:42.476346 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e1d3682-8130-4fa4-aab4-ade2ac069d2e-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"7e1d3682-8130-4fa4-aab4-ade2ac069d2e\") " pod="openstack/ovsdbserver-nb-1" Dec 16 08:50:42 crc kubenswrapper[4823]: I1216 08:50:42.476368 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/05dfc2e3-71af-4150-a4ca-02b5629083ae-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"05dfc2e3-71af-4150-a4ca-02b5629083ae\") " pod="openstack/ovsdbserver-nb-0" Dec 16 08:50:42 crc kubenswrapper[4823]: I1216 08:50:42.476383 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/7e1d3682-8130-4fa4-aab4-ade2ac069d2e-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"7e1d3682-8130-4fa4-aab4-ade2ac069d2e\") " pod="openstack/ovsdbserver-nb-1" Dec 16 08:50:42 crc kubenswrapper[4823]: I1216 08:50:42.476434 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/05dfc2e3-71af-4150-a4ca-02b5629083ae-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"05dfc2e3-71af-4150-a4ca-02b5629083ae\") " pod="openstack/ovsdbserver-nb-0" Dec 16 08:50:42 crc kubenswrapper[4823]: I1216 08:50:42.476509 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05dfc2e3-71af-4150-a4ca-02b5629083ae-config\") pod \"ovsdbserver-nb-0\" (UID: \"05dfc2e3-71af-4150-a4ca-02b5629083ae\") " pod="openstack/ovsdbserver-nb-0" Dec 16 08:50:42 crc kubenswrapper[4823]: I1216 08:50:42.476554 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/05dfc2e3-71af-4150-a4ca-02b5629083ae-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"05dfc2e3-71af-4150-a4ca-02b5629083ae\") " pod="openstack/ovsdbserver-nb-0" Dec 16 08:50:42 crc kubenswrapper[4823]: I1216 08:50:42.476594 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rslzb\" (UniqueName: \"kubernetes.io/projected/05dfc2e3-71af-4150-a4ca-02b5629083ae-kube-api-access-rslzb\") pod \"ovsdbserver-nb-0\" (UID: \"05dfc2e3-71af-4150-a4ca-02b5629083ae\") " pod="openstack/ovsdbserver-nb-0" Dec 16 08:50:42 crc kubenswrapper[4823]: I1216 08:50:42.476631 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e1d3682-8130-4fa4-aab4-ade2ac069d2e-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"7e1d3682-8130-4fa4-aab4-ade2ac069d2e\") " 
pod="openstack/ovsdbserver-nb-1" Dec 16 08:50:42 crc kubenswrapper[4823]: I1216 08:50:42.476656 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7pjtt\" (UniqueName: \"kubernetes.io/projected/7e1d3682-8130-4fa4-aab4-ade2ac069d2e-kube-api-access-7pjtt\") pod \"ovsdbserver-nb-1\" (UID: \"7e1d3682-8130-4fa4-aab4-ade2ac069d2e\") " pod="openstack/ovsdbserver-nb-1" Dec 16 08:50:42 crc kubenswrapper[4823]: I1216 08:50:42.476713 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/05dfc2e3-71af-4150-a4ca-02b5629083ae-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"05dfc2e3-71af-4150-a4ca-02b5629083ae\") " pod="openstack/ovsdbserver-nb-0" Dec 16 08:50:42 crc kubenswrapper[4823]: I1216 08:50:42.476762 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7e1d3682-8130-4fa4-aab4-ade2ac069d2e-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"7e1d3682-8130-4fa4-aab4-ade2ac069d2e\") " pod="openstack/ovsdbserver-nb-1" Dec 16 08:50:42 crc kubenswrapper[4823]: I1216 08:50:42.477737 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/05dfc2e3-71af-4150-a4ca-02b5629083ae-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"05dfc2e3-71af-4150-a4ca-02b5629083ae\") " pod="openstack/ovsdbserver-nb-0" Dec 16 08:50:42 crc kubenswrapper[4823]: I1216 08:50:42.479196 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/05dfc2e3-71af-4150-a4ca-02b5629083ae-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"05dfc2e3-71af-4150-a4ca-02b5629083ae\") " pod="openstack/ovsdbserver-nb-0" Dec 16 08:50:42 crc kubenswrapper[4823]: I1216 08:50:42.479839 4823 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05dfc2e3-71af-4150-a4ca-02b5629083ae-config\") pod \"ovsdbserver-nb-0\" (UID: \"05dfc2e3-71af-4150-a4ca-02b5629083ae\") " pod="openstack/ovsdbserver-nb-0" Dec 16 08:50:42 crc kubenswrapper[4823]: I1216 08:50:42.481860 4823 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 16 08:50:42 crc kubenswrapper[4823]: I1216 08:50:42.481905 4823 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-a528e881-7b8b-4172-b241-4b700d70fcaf\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a528e881-7b8b-4172-b241-4b700d70fcaf\") pod \"ovsdbserver-nb-0\" (UID: \"05dfc2e3-71af-4150-a4ca-02b5629083ae\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/0b4d870f02a361925ed07e9170b5fb7d082ea5173becdb02ed2a005058841083/globalmount\"" pod="openstack/ovsdbserver-nb-0" Dec 16 08:50:42 crc kubenswrapper[4823]: I1216 08:50:42.484517 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/05dfc2e3-71af-4150-a4ca-02b5629083ae-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"05dfc2e3-71af-4150-a4ca-02b5629083ae\") " pod="openstack/ovsdbserver-nb-0" Dec 16 08:50:42 crc kubenswrapper[4823]: I1216 08:50:42.499944 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/05dfc2e3-71af-4150-a4ca-02b5629083ae-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"05dfc2e3-71af-4150-a4ca-02b5629083ae\") " pod="openstack/ovsdbserver-nb-0" Dec 16 08:50:42 crc kubenswrapper[4823]: I1216 08:50:42.500199 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/05dfc2e3-71af-4150-a4ca-02b5629083ae-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"05dfc2e3-71af-4150-a4ca-02b5629083ae\") " pod="openstack/ovsdbserver-nb-0" Dec 16 08:50:42 crc kubenswrapper[4823]: I1216 08:50:42.503203 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rslzb\" (UniqueName: \"kubernetes.io/projected/05dfc2e3-71af-4150-a4ca-02b5629083ae-kube-api-access-rslzb\") pod \"ovsdbserver-nb-0\" (UID: \"05dfc2e3-71af-4150-a4ca-02b5629083ae\") " pod="openstack/ovsdbserver-nb-0" Dec 16 08:50:42 crc kubenswrapper[4823]: I1216 08:50:42.532596 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-a528e881-7b8b-4172-b241-4b700d70fcaf\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a528e881-7b8b-4172-b241-4b700d70fcaf\") pod \"ovsdbserver-nb-0\" (UID: \"05dfc2e3-71af-4150-a4ca-02b5629083ae\") " pod="openstack/ovsdbserver-nb-0" Dec 16 08:50:42 crc kubenswrapper[4823]: I1216 08:50:42.578923 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e1d3682-8130-4fa4-aab4-ade2ac069d2e-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"7e1d3682-8130-4fa4-aab4-ade2ac069d2e\") " pod="openstack/ovsdbserver-nb-1" Dec 16 08:50:42 crc kubenswrapper[4823]: I1216 08:50:42.579579 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7pjtt\" (UniqueName: \"kubernetes.io/projected/7e1d3682-8130-4fa4-aab4-ade2ac069d2e-kube-api-access-7pjtt\") pod \"ovsdbserver-nb-1\" (UID: \"7e1d3682-8130-4fa4-aab4-ade2ac069d2e\") " pod="openstack/ovsdbserver-nb-1" Dec 16 08:50:42 crc kubenswrapper[4823]: I1216 08:50:42.579707 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/6353e69a-5c31-41c9-9d05-2b958aa6a79f-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: \"6353e69a-5c31-41c9-9d05-2b958aa6a79f\") " pod="openstack/ovsdbserver-nb-2" Dec 16 08:50:42 crc kubenswrapper[4823]: I1216 08:50:42.579826 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6353e69a-5c31-41c9-9d05-2b958aa6a79f-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: \"6353e69a-5c31-41c9-9d05-2b958aa6a79f\") " pod="openstack/ovsdbserver-nb-2" Dec 16 08:50:42 crc kubenswrapper[4823]: I1216 08:50:42.579922 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7e1d3682-8130-4fa4-aab4-ade2ac069d2e-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"7e1d3682-8130-4fa4-aab4-ade2ac069d2e\") " pod="openstack/ovsdbserver-nb-1" Dec 16 08:50:42 crc kubenswrapper[4823]: I1216 08:50:42.580038 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-90d11ead-6d5b-4d27-8b2b-082a68583c3d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-90d11ead-6d5b-4d27-8b2b-082a68583c3d\") pod \"ovsdbserver-nb-2\" (UID: \"6353e69a-5c31-41c9-9d05-2b958aa6a79f\") " pod="openstack/ovsdbserver-nb-2" Dec 16 08:50:42 crc kubenswrapper[4823]: I1216 08:50:42.580139 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqfh6\" (UniqueName: \"kubernetes.io/projected/6353e69a-5c31-41c9-9d05-2b958aa6a79f-kube-api-access-xqfh6\") pod \"ovsdbserver-nb-2\" (UID: \"6353e69a-5c31-41c9-9d05-2b958aa6a79f\") " pod="openstack/ovsdbserver-nb-2" Dec 16 08:50:42 crc kubenswrapper[4823]: I1216 08:50:42.580280 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/6353e69a-5c31-41c9-9d05-2b958aa6a79f-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"6353e69a-5c31-41c9-9d05-2b958aa6a79f\") " pod="openstack/ovsdbserver-nb-2" Dec 16 08:50:42 crc kubenswrapper[4823]: I1216 08:50:42.580382 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e1d3682-8130-4fa4-aab4-ade2ac069d2e-config\") pod \"ovsdbserver-nb-1\" (UID: \"7e1d3682-8130-4fa4-aab4-ade2ac069d2e\") " pod="openstack/ovsdbserver-nb-1" Dec 16 08:50:42 crc kubenswrapper[4823]: I1216 08:50:42.580473 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-20321045-c94b-408e-81f6-d22070c77447\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-20321045-c94b-408e-81f6-d22070c77447\") pod \"ovsdbserver-nb-1\" (UID: \"7e1d3682-8130-4fa4-aab4-ade2ac069d2e\") " pod="openstack/ovsdbserver-nb-1" Dec 16 08:50:42 crc kubenswrapper[4823]: I1216 08:50:42.580626 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e1d3682-8130-4fa4-aab4-ade2ac069d2e-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"7e1d3682-8130-4fa4-aab4-ade2ac069d2e\") " pod="openstack/ovsdbserver-nb-1" Dec 16 08:50:42 crc kubenswrapper[4823]: I1216 08:50:42.580697 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6353e69a-5c31-41c9-9d05-2b958aa6a79f-config\") pod \"ovsdbserver-nb-2\" (UID: \"6353e69a-5c31-41c9-9d05-2b958aa6a79f\") " pod="openstack/ovsdbserver-nb-2" Dec 16 08:50:42 crc kubenswrapper[4823]: I1216 08:50:42.580721 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6353e69a-5c31-41c9-9d05-2b958aa6a79f-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" 
(UID: \"6353e69a-5c31-41c9-9d05-2b958aa6a79f\") " pod="openstack/ovsdbserver-nb-2" Dec 16 08:50:42 crc kubenswrapper[4823]: I1216 08:50:42.580761 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e1d3682-8130-4fa4-aab4-ade2ac069d2e-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"7e1d3682-8130-4fa4-aab4-ade2ac069d2e\") " pod="openstack/ovsdbserver-nb-1" Dec 16 08:50:42 crc kubenswrapper[4823]: I1216 08:50:42.580793 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7e1d3682-8130-4fa4-aab4-ade2ac069d2e-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"7e1d3682-8130-4fa4-aab4-ade2ac069d2e\") " pod="openstack/ovsdbserver-nb-1" Dec 16 08:50:42 crc kubenswrapper[4823]: I1216 08:50:42.580857 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6353e69a-5c31-41c9-9d05-2b958aa6a79f-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"6353e69a-5c31-41c9-9d05-2b958aa6a79f\") " pod="openstack/ovsdbserver-nb-2" Dec 16 08:50:42 crc kubenswrapper[4823]: I1216 08:50:42.581463 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7e1d3682-8130-4fa4-aab4-ade2ac069d2e-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"7e1d3682-8130-4fa4-aab4-ade2ac069d2e\") " pod="openstack/ovsdbserver-nb-1" Dec 16 08:50:42 crc kubenswrapper[4823]: I1216 08:50:42.582595 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7e1d3682-8130-4fa4-aab4-ade2ac069d2e-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"7e1d3682-8130-4fa4-aab4-ade2ac069d2e\") " pod="openstack/ovsdbserver-nb-1" Dec 16 08:50:42 crc kubenswrapper[4823]: I1216 08:50:42.583484 4823 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e1d3682-8130-4fa4-aab4-ade2ac069d2e-config\") pod \"ovsdbserver-nb-1\" (UID: \"7e1d3682-8130-4fa4-aab4-ade2ac069d2e\") " pod="openstack/ovsdbserver-nb-1" Dec 16 08:50:42 crc kubenswrapper[4823]: I1216 08:50:42.583981 4823 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 16 08:50:42 crc kubenswrapper[4823]: I1216 08:50:42.584008 4823 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-20321045-c94b-408e-81f6-d22070c77447\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-20321045-c94b-408e-81f6-d22070c77447\") pod \"ovsdbserver-nb-1\" (UID: \"7e1d3682-8130-4fa4-aab4-ade2ac069d2e\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/f1bf7516a7e81cff2b3397669f2672545009be64a75a1f1bbb9b8d243f06c9eb/globalmount\"" pod="openstack/ovsdbserver-nb-1" Dec 16 08:50:42 crc kubenswrapper[4823]: I1216 08:50:42.584280 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e1d3682-8130-4fa4-aab4-ade2ac069d2e-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"7e1d3682-8130-4fa4-aab4-ade2ac069d2e\") " pod="openstack/ovsdbserver-nb-1" Dec 16 08:50:42 crc kubenswrapper[4823]: I1216 08:50:42.586791 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e1d3682-8130-4fa4-aab4-ade2ac069d2e-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"7e1d3682-8130-4fa4-aab4-ade2ac069d2e\") " pod="openstack/ovsdbserver-nb-1" Dec 16 08:50:42 crc kubenswrapper[4823]: I1216 08:50:42.592557 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e1d3682-8130-4fa4-aab4-ade2ac069d2e-metrics-certs-tls-certs\") 
pod \"ovsdbserver-nb-1\" (UID: \"7e1d3682-8130-4fa4-aab4-ade2ac069d2e\") " pod="openstack/ovsdbserver-nb-1" Dec 16 08:50:42 crc kubenswrapper[4823]: I1216 08:50:42.596393 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7pjtt\" (UniqueName: \"kubernetes.io/projected/7e1d3682-8130-4fa4-aab4-ade2ac069d2e-kube-api-access-7pjtt\") pod \"ovsdbserver-nb-1\" (UID: \"7e1d3682-8130-4fa4-aab4-ade2ac069d2e\") " pod="openstack/ovsdbserver-nb-1" Dec 16 08:50:42 crc kubenswrapper[4823]: I1216 08:50:42.614532 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-20321045-c94b-408e-81f6-d22070c77447\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-20321045-c94b-408e-81f6-d22070c77447\") pod \"ovsdbserver-nb-1\" (UID: \"7e1d3682-8130-4fa4-aab4-ade2ac069d2e\") " pod="openstack/ovsdbserver-nb-1" Dec 16 08:50:42 crc kubenswrapper[4823]: I1216 08:50:42.678754 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 16 08:50:42 crc kubenswrapper[4823]: I1216 08:50:42.683371 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6353e69a-5c31-41c9-9d05-2b958aa6a79f-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"6353e69a-5c31-41c9-9d05-2b958aa6a79f\") " pod="openstack/ovsdbserver-nb-2" Dec 16 08:50:42 crc kubenswrapper[4823]: I1216 08:50:42.683411 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6353e69a-5c31-41c9-9d05-2b958aa6a79f-config\") pod \"ovsdbserver-nb-2\" (UID: \"6353e69a-5c31-41c9-9d05-2b958aa6a79f\") " pod="openstack/ovsdbserver-nb-2" Dec 16 08:50:42 crc kubenswrapper[4823]: I1216 08:50:42.683452 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/6353e69a-5c31-41c9-9d05-2b958aa6a79f-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"6353e69a-5c31-41c9-9d05-2b958aa6a79f\") " pod="openstack/ovsdbserver-nb-2" Dec 16 08:50:42 crc kubenswrapper[4823]: I1216 08:50:42.683531 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6353e69a-5c31-41c9-9d05-2b958aa6a79f-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: \"6353e69a-5c31-41c9-9d05-2b958aa6a79f\") " pod="openstack/ovsdbserver-nb-2" Dec 16 08:50:42 crc kubenswrapper[4823]: I1216 08:50:42.683789 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6353e69a-5c31-41c9-9d05-2b958aa6a79f-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: \"6353e69a-5c31-41c9-9d05-2b958aa6a79f\") " pod="openstack/ovsdbserver-nb-2" Dec 16 08:50:42 crc kubenswrapper[4823]: I1216 08:50:42.683832 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-90d11ead-6d5b-4d27-8b2b-082a68583c3d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-90d11ead-6d5b-4d27-8b2b-082a68583c3d\") pod \"ovsdbserver-nb-2\" (UID: \"6353e69a-5c31-41c9-9d05-2b958aa6a79f\") " pod="openstack/ovsdbserver-nb-2" Dec 16 08:50:42 crc kubenswrapper[4823]: I1216 08:50:42.683857 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqfh6\" (UniqueName: \"kubernetes.io/projected/6353e69a-5c31-41c9-9d05-2b958aa6a79f-kube-api-access-xqfh6\") pod \"ovsdbserver-nb-2\" (UID: \"6353e69a-5c31-41c9-9d05-2b958aa6a79f\") " pod="openstack/ovsdbserver-nb-2" Dec 16 08:50:42 crc kubenswrapper[4823]: I1216 08:50:42.683893 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6353e69a-5c31-41c9-9d05-2b958aa6a79f-ovsdb-rundir\") pod 
\"ovsdbserver-nb-2\" (UID: \"6353e69a-5c31-41c9-9d05-2b958aa6a79f\") " pod="openstack/ovsdbserver-nb-2"
Dec 16 08:50:42 crc kubenswrapper[4823]: I1216 08:50:42.684346 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6353e69a-5c31-41c9-9d05-2b958aa6a79f-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"6353e69a-5c31-41c9-9d05-2b958aa6a79f\") " pod="openstack/ovsdbserver-nb-2"
Dec 16 08:50:42 crc kubenswrapper[4823]: I1216 08:50:42.684851 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6353e69a-5c31-41c9-9d05-2b958aa6a79f-config\") pod \"ovsdbserver-nb-2\" (UID: \"6353e69a-5c31-41c9-9d05-2b958aa6a79f\") " pod="openstack/ovsdbserver-nb-2"
Dec 16 08:50:42 crc kubenswrapper[4823]: I1216 08:50:42.685214 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6353e69a-5c31-41c9-9d05-2b958aa6a79f-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"6353e69a-5c31-41c9-9d05-2b958aa6a79f\") " pod="openstack/ovsdbserver-nb-2"
Dec 16 08:50:42 crc kubenswrapper[4823]: I1216 08:50:42.695718 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6353e69a-5c31-41c9-9d05-2b958aa6a79f-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: \"6353e69a-5c31-41c9-9d05-2b958aa6a79f\") " pod="openstack/ovsdbserver-nb-2"
Dec 16 08:50:42 crc kubenswrapper[4823]: I1216 08:50:42.695718 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6353e69a-5c31-41c9-9d05-2b958aa6a79f-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: \"6353e69a-5c31-41c9-9d05-2b958aa6a79f\") " pod="openstack/ovsdbserver-nb-2"
Dec 16 08:50:42 crc kubenswrapper[4823]: I1216 08:50:42.696373 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6353e69a-5c31-41c9-9d05-2b958aa6a79f-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"6353e69a-5c31-41c9-9d05-2b958aa6a79f\") " pod="openstack/ovsdbserver-nb-2"
Dec 16 08:50:42 crc kubenswrapper[4823]: I1216 08:50:42.696493 4823 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Dec 16 08:50:42 crc kubenswrapper[4823]: I1216 08:50:42.696515 4823 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-90d11ead-6d5b-4d27-8b2b-082a68583c3d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-90d11ead-6d5b-4d27-8b2b-082a68583c3d\") pod \"ovsdbserver-nb-2\" (UID: \"6353e69a-5c31-41c9-9d05-2b958aa6a79f\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/8defb8b3ac93cb401485b04314ce355cf7b0abd8bca9f8827de6dd5f47a19c58/globalmount\"" pod="openstack/ovsdbserver-nb-2"
Dec 16 08:50:42 crc kubenswrapper[4823]: I1216 08:50:42.704401 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqfh6\" (UniqueName: \"kubernetes.io/projected/6353e69a-5c31-41c9-9d05-2b958aa6a79f-kube-api-access-xqfh6\") pod \"ovsdbserver-nb-2\" (UID: \"6353e69a-5c31-41c9-9d05-2b958aa6a79f\") " pod="openstack/ovsdbserver-nb-2"
Dec 16 08:50:42 crc kubenswrapper[4823]: I1216 08:50:42.707526 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-1"
Dec 16 08:50:42 crc kubenswrapper[4823]: I1216 08:50:42.733340 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-90d11ead-6d5b-4d27-8b2b-082a68583c3d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-90d11ead-6d5b-4d27-8b2b-082a68583c3d\") pod \"ovsdbserver-nb-2\" (UID: \"6353e69a-5c31-41c9-9d05-2b958aa6a79f\") " pod="openstack/ovsdbserver-nb-2"
Dec 16 08:50:43 crc kubenswrapper[4823]: I1216 08:50:43.003662 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-2"
Dec 16 08:50:43 crc kubenswrapper[4823]: I1216 08:50:43.221463 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Dec 16 08:50:43 crc kubenswrapper[4823]: I1216 08:50:43.321180 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"]
Dec 16 08:50:43 crc kubenswrapper[4823]: I1216 08:50:43.607806 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-2"]
Dec 16 08:50:44 crc kubenswrapper[4823]: W1216 08:50:44.649925 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7e1d3682_8130_4fa4_aab4_ade2ac069d2e.slice/crio-58a1ed64c1ec667db0652f3f8c47b70f0df19a28f9ba119a3a4a5c12c49c63d1 WatchSource:0}: Error finding container 58a1ed64c1ec667db0652f3f8c47b70f0df19a28f9ba119a3a4a5c12c49c63d1: Status 404 returned error can't find the container with id 58a1ed64c1ec667db0652f3f8c47b70f0df19a28f9ba119a3a4a5c12c49c63d1
Dec 16 08:50:44 crc kubenswrapper[4823]: W1216 08:50:44.652606 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6353e69a_5c31_41c9_9d05_2b958aa6a79f.slice/crio-dcc665d2286cee23a5b5de32e11127b5d6045c3da4ce322ae2938df6e68db2af WatchSource:0}: Error finding container dcc665d2286cee23a5b5de32e11127b5d6045c3da4ce322ae2938df6e68db2af: Status 404 returned error can't find the container with id dcc665d2286cee23a5b5de32e11127b5d6045c3da4ce322ae2938df6e68db2af
Dec 16 08:50:44 crc kubenswrapper[4823]: W1216 08:50:44.654324 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod05dfc2e3_71af_4150_a4ca_02b5629083ae.slice/crio-dc880561e1a9fedf478eafe6742eadbdd3dc73c314f10d410548e6de4c546ea9 WatchSource:0}: Error finding container dc880561e1a9fedf478eafe6742eadbdd3dc73c314f10d410548e6de4c546ea9: Status 404 returned error can't find the container with id dc880561e1a9fedf478eafe6742eadbdd3dc73c314f10d410548e6de4c546ea9
Dec 16 08:50:44 crc kubenswrapper[4823]: I1216 08:50:44.925880 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"6353e69a-5c31-41c9-9d05-2b958aa6a79f","Type":"ContainerStarted","Data":"dcc665d2286cee23a5b5de32e11127b5d6045c3da4ce322ae2938df6e68db2af"}
Dec 16 08:50:44 crc kubenswrapper[4823]: I1216 08:50:44.927704 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"7e1d3682-8130-4fa4-aab4-ade2ac069d2e","Type":"ContainerStarted","Data":"58a1ed64c1ec667db0652f3f8c47b70f0df19a28f9ba119a3a4a5c12c49c63d1"}
Dec 16 08:50:44 crc kubenswrapper[4823]: I1216 08:50:44.928906 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"05dfc2e3-71af-4150-a4ca-02b5629083ae","Type":"ContainerStarted","Data":"dc880561e1a9fedf478eafe6742eadbdd3dc73c314f10d410548e6de4c546ea9"}
Dec 16 08:50:45 crc kubenswrapper[4823]: I1216 08:50:45.954395 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8","Type":"ContainerStarted","Data":"81905a10f112a2c628e83243c3b5f4a8905df8cf52465b30f43adc5b1087cc6c"}
Dec 16 08:50:45 crc kubenswrapper[4823]: I1216 08:50:45.957579 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"6c99b5e4-de24-426d-9a97-05fdcbe37141","Type":"ContainerStarted","Data":"d06360657e9e0d4e61ed0bba6b0ba1b231c8900c0ee15d84fd10f3823299aa4b"}
Dec 16 08:50:45 crc kubenswrapper[4823]: I1216 08:50:45.960662 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"dc75b889-6dc5-462d-a589-50f705ffd78f","Type":"ContainerStarted","Data":"b83530bf76668f8bf22e0d206703a2c7ec87906449e3b0e5fc0196514eab70ab"}
Dec 16 08:50:46 crc kubenswrapper[4823]: I1216 08:50:46.973267 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"7e1d3682-8130-4fa4-aab4-ade2ac069d2e","Type":"ContainerStarted","Data":"bf3c177aa7f060a204b18065f9ace154c320ecfa12044e459e50aad48defa022"}
Dec 16 08:50:46 crc kubenswrapper[4823]: I1216 08:50:46.973634 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"7e1d3682-8130-4fa4-aab4-ade2ac069d2e","Type":"ContainerStarted","Data":"2430c406e51c479baf9de2cdddd7f35d4cb9b052428eb9894c3134ee623601b5"}
Dec 16 08:50:46 crc kubenswrapper[4823]: I1216 08:50:46.976671 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"05dfc2e3-71af-4150-a4ca-02b5629083ae","Type":"ContainerStarted","Data":"fa3277346b7569acd2a70b5d918fc5eff0d9ac222f0e8c16aa4ecaef31ed032b"}
Dec 16 08:50:46 crc kubenswrapper[4823]: I1216 08:50:46.976717 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"05dfc2e3-71af-4150-a4ca-02b5629083ae","Type":"ContainerStarted","Data":"066b7853bc0b7a71c51ec7870fb1b2dfe729cc3a82e349d2bb9dcc2c91a68ff6"}
Dec 16 08:50:46 crc kubenswrapper[4823]: I1216 08:50:46.978964 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"6353e69a-5c31-41c9-9d05-2b958aa6a79f","Type":"ContainerStarted","Data":"b324eb19678c78b1a0e6949df42fbbb0f0364093edcb56f5eafa24f8062aff0e"}
Dec 16 08:50:46 crc kubenswrapper[4823]: I1216 08:50:46.978990 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"6353e69a-5c31-41c9-9d05-2b958aa6a79f","Type":"ContainerStarted","Data":"10d23fcf40d14824e8bcbbb27688c846b6ace21845beeab00416051c6361ffe7"}
Dec 16 08:50:46 crc kubenswrapper[4823]: I1216 08:50:46.981084 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"6c99b5e4-de24-426d-9a97-05fdcbe37141","Type":"ContainerStarted","Data":"94d634a132c1bc025be0422b3756b78f98c01350c4a640621b04c9ead2558605"}
Dec 16 08:50:46 crc kubenswrapper[4823]: I1216 08:50:46.983204 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"dc75b889-6dc5-462d-a589-50f705ffd78f","Type":"ContainerStarted","Data":"6ffc44659c4af61247f8ff482db564b84d825032d9d7789049d8414a5dd9a687"}
Dec 16 08:50:46 crc kubenswrapper[4823]: I1216 08:50:46.985265 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8","Type":"ContainerStarted","Data":"0c5789344ac78b72b59aa20ebab8bdaa03ec7af24691842901db5f6dd86d3f14"}
Dec 16 08:50:46 crc kubenswrapper[4823]: I1216 08:50:46.993476 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-1" podStartSLOduration=4.569725506 podStartE2EDuration="5.993461925s" podCreationTimestamp="2025-12-16 08:50:41 +0000 UTC" firstStartedPulling="2025-12-16 08:50:44.653960178 +0000 UTC m=+6923.142526311" lastFinishedPulling="2025-12-16 08:50:46.077696607 +0000 UTC m=+6924.566262730" observedRunningTime="2025-12-16 08:50:46.987712325 +0000 UTC m=+6925.476278448" watchObservedRunningTime="2025-12-16 08:50:46.993461925 +0000 UTC m=+6925.482028048"
Dec 16 08:50:47 crc kubenswrapper[4823]: I1216 08:50:47.017010 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-2" podStartSLOduration=4.211483253 podStartE2EDuration="8.016986892s" podCreationTimestamp="2025-12-16 08:50:39 +0000 UTC" firstStartedPulling="2025-12-16 08:50:41.275258489 +0000 UTC m=+6919.763824612" lastFinishedPulling="2025-12-16 08:50:45.080762118 +0000 UTC m=+6923.569328251" observedRunningTime="2025-12-16 08:50:47.011345245 +0000 UTC m=+6925.499911368" watchObservedRunningTime="2025-12-16 08:50:47.016986892 +0000 UTC m=+6925.505553015"
Dec 16 08:50:47 crc kubenswrapper[4823]: I1216 08:50:47.030728 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=4.641991908 podStartE2EDuration="6.030711062s" podCreationTimestamp="2025-12-16 08:50:41 +0000 UTC" firstStartedPulling="2025-12-16 08:50:44.658060926 +0000 UTC m=+6923.146627049" lastFinishedPulling="2025-12-16 08:50:46.04678008 +0000 UTC m=+6924.535346203" observedRunningTime="2025-12-16 08:50:47.030129293 +0000 UTC m=+6925.518695416" watchObservedRunningTime="2025-12-16 08:50:47.030711062 +0000 UTC m=+6925.519277195"
Dec 16 08:50:47 crc kubenswrapper[4823]: I1216 08:50:47.055119 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-1" podStartSLOduration=4.493916024 podStartE2EDuration="8.055093935s" podCreationTimestamp="2025-12-16 08:50:39 +0000 UTC" firstStartedPulling="2025-12-16 08:50:41.531778349 +0000 UTC m=+6920.020344472" lastFinishedPulling="2025-12-16 08:50:45.09295627 +0000 UTC m=+6923.581522383" observedRunningTime="2025-12-16 08:50:47.053374831 +0000 UTC m=+6925.541940984" watchObservedRunningTime="2025-12-16 08:50:47.055093935 +0000 UTC m=+6925.543660108"
Dec 16 08:50:47 crc kubenswrapper[4823]: I1216 08:50:47.078902 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=4.106930779 podStartE2EDuration="8.078880199s" podCreationTimestamp="2025-12-16 08:50:39 +0000 UTC" firstStartedPulling="2025-12-16 08:50:41.184107515 +0000 UTC m=+6919.672673638" lastFinishedPulling="2025-12-16 08:50:45.156056935 +0000 UTC m=+6923.644623058" observedRunningTime="2025-12-16 08:50:47.077557578 +0000 UTC m=+6925.566123701" watchObservedRunningTime="2025-12-16 08:50:47.078880199 +0000 UTC m=+6925.567446322"
Dec 16 08:50:47 crc kubenswrapper[4823]: I1216 08:50:47.101229 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-2" podStartSLOduration=4.706355722 podStartE2EDuration="6.101208298s" podCreationTimestamp="2025-12-16 08:50:41 +0000 UTC" firstStartedPulling="2025-12-16 08:50:44.655365401 +0000 UTC m=+6923.143931524" lastFinishedPulling="2025-12-16 08:50:46.050217977 +0000 UTC m=+6924.538784100" observedRunningTime="2025-12-16 08:50:47.096637345 +0000 UTC m=+6925.585203478" watchObservedRunningTime="2025-12-16 08:50:47.101208298 +0000 UTC m=+6925.589774421"
Dec 16 08:50:47 crc kubenswrapper[4823]: I1216 08:50:47.679994 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0"
Dec 16 08:50:47 crc kubenswrapper[4823]: I1216 08:50:47.708156 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-1"
Dec 16 08:50:48 crc kubenswrapper[4823]: I1216 08:50:48.004300 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-2"
Dec 16 08:50:48 crc kubenswrapper[4823]: I1216 08:50:48.679053 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0"
Dec 16 08:50:48 crc kubenswrapper[4823]: I1216 08:50:48.708875 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-1"
Dec 16 08:50:49 crc kubenswrapper[4823]: I1216 08:50:49.005074 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-2"
Dec 16 08:50:49 crc kubenswrapper[4823]: I1216 08:50:49.056777 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-2"
Dec 16 08:50:49 crc kubenswrapper[4823]: I1216 08:50:49.566095 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0"
Dec 16 08:50:49 crc kubenswrapper[4823]: I1216 08:50:49.606076 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-2"
Dec 16 08:50:49 crc kubenswrapper[4823]: I1216 08:50:49.631961 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0"
Dec 16 08:50:49 crc kubenswrapper[4823]: I1216 08:50:49.668374 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-2"
Dec 16 08:50:49 crc kubenswrapper[4823]: I1216 08:50:49.886573 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-1"
Dec 16 08:50:49 crc kubenswrapper[4823]: I1216 08:50:49.938541 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-1"
Dec 16 08:50:50 crc kubenswrapper[4823]: I1216 08:50:50.014894 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0"
Dec 16 08:50:50 crc kubenswrapper[4823]: I1216 08:50:50.014980 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-1"
Dec 16 08:50:50 crc kubenswrapper[4823]: I1216 08:50:50.015010 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-2"
Dec 16 08:50:50 crc kubenswrapper[4823]: I1216 08:50:50.082206 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-1"
Dec 16 08:50:50 crc kubenswrapper[4823]: I1216 08:50:50.093232 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0"
Dec 16 08:50:50 crc kubenswrapper[4823]: I1216 08:50:50.093509 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-2"
Dec 16 08:50:50 crc kubenswrapper[4823]: I1216 08:50:50.343993 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-c8659b7c-4wq7n"]
Dec 16 08:50:50 crc kubenswrapper[4823]: I1216 08:50:50.345238 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-c8659b7c-4wq7n"
Dec 16 08:50:50 crc kubenswrapper[4823]: I1216 08:50:50.348147 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb"
Dec 16 08:50:50 crc kubenswrapper[4823]: I1216 08:50:50.352088 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-c8659b7c-4wq7n"]
Dec 16 08:50:50 crc kubenswrapper[4823]: I1216 08:50:50.427283 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/13ee260e-1deb-4739-b738-4e8087135c50-dns-svc\") pod \"dnsmasq-dns-c8659b7c-4wq7n\" (UID: \"13ee260e-1deb-4739-b738-4e8087135c50\") " pod="openstack/dnsmasq-dns-c8659b7c-4wq7n"
Dec 16 08:50:50 crc kubenswrapper[4823]: I1216 08:50:50.427342 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/13ee260e-1deb-4739-b738-4e8087135c50-ovsdbserver-sb\") pod \"dnsmasq-dns-c8659b7c-4wq7n\" (UID: \"13ee260e-1deb-4739-b738-4e8087135c50\") " pod="openstack/dnsmasq-dns-c8659b7c-4wq7n"
Dec 16 08:50:50 crc kubenswrapper[4823]: I1216 08:50:50.427367 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13ee260e-1deb-4739-b738-4e8087135c50-config\") pod \"dnsmasq-dns-c8659b7c-4wq7n\" (UID: \"13ee260e-1deb-4739-b738-4e8087135c50\") " pod="openstack/dnsmasq-dns-c8659b7c-4wq7n"
Dec 16 08:50:50 crc kubenswrapper[4823]: I1216 08:50:50.427441 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rkvc\" (UniqueName: \"kubernetes.io/projected/13ee260e-1deb-4739-b738-4e8087135c50-kube-api-access-4rkvc\") pod \"dnsmasq-dns-c8659b7c-4wq7n\" (UID: \"13ee260e-1deb-4739-b738-4e8087135c50\") " pod="openstack/dnsmasq-dns-c8659b7c-4wq7n"
Dec 16 08:50:50 crc kubenswrapper[4823]: I1216 08:50:50.529019 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/13ee260e-1deb-4739-b738-4e8087135c50-ovsdbserver-sb\") pod \"dnsmasq-dns-c8659b7c-4wq7n\" (UID: \"13ee260e-1deb-4739-b738-4e8087135c50\") " pod="openstack/dnsmasq-dns-c8659b7c-4wq7n"
Dec 16 08:50:50 crc kubenswrapper[4823]: I1216 08:50:50.529086 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13ee260e-1deb-4739-b738-4e8087135c50-config\") pod \"dnsmasq-dns-c8659b7c-4wq7n\" (UID: \"13ee260e-1deb-4739-b738-4e8087135c50\") " pod="openstack/dnsmasq-dns-c8659b7c-4wq7n"
Dec 16 08:50:50 crc kubenswrapper[4823]: I1216 08:50:50.529168 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rkvc\" (UniqueName: \"kubernetes.io/projected/13ee260e-1deb-4739-b738-4e8087135c50-kube-api-access-4rkvc\") pod \"dnsmasq-dns-c8659b7c-4wq7n\" (UID: \"13ee260e-1deb-4739-b738-4e8087135c50\") " pod="openstack/dnsmasq-dns-c8659b7c-4wq7n"
Dec 16 08:50:50 crc kubenswrapper[4823]: I1216 08:50:50.529225 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/13ee260e-1deb-4739-b738-4e8087135c50-dns-svc\") pod \"dnsmasq-dns-c8659b7c-4wq7n\" (UID: \"13ee260e-1deb-4739-b738-4e8087135c50\") " pod="openstack/dnsmasq-dns-c8659b7c-4wq7n"
Dec 16 08:50:50 crc kubenswrapper[4823]: I1216 08:50:50.530671 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/13ee260e-1deb-4739-b738-4e8087135c50-dns-svc\") pod \"dnsmasq-dns-c8659b7c-4wq7n\" (UID: \"13ee260e-1deb-4739-b738-4e8087135c50\") " pod="openstack/dnsmasq-dns-c8659b7c-4wq7n"
Dec 16 08:50:50 crc kubenswrapper[4823]: I1216 08:50:50.530852 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/13ee260e-1deb-4739-b738-4e8087135c50-ovsdbserver-sb\") pod \"dnsmasq-dns-c8659b7c-4wq7n\" (UID: \"13ee260e-1deb-4739-b738-4e8087135c50\") " pod="openstack/dnsmasq-dns-c8659b7c-4wq7n"
Dec 16 08:50:50 crc kubenswrapper[4823]: I1216 08:50:50.531140 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13ee260e-1deb-4739-b738-4e8087135c50-config\") pod \"dnsmasq-dns-c8659b7c-4wq7n\" (UID: \"13ee260e-1deb-4739-b738-4e8087135c50\") " pod="openstack/dnsmasq-dns-c8659b7c-4wq7n"
Dec 16 08:50:50 crc kubenswrapper[4823]: I1216 08:50:50.555621 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rkvc\" (UniqueName: \"kubernetes.io/projected/13ee260e-1deb-4739-b738-4e8087135c50-kube-api-access-4rkvc\") pod \"dnsmasq-dns-c8659b7c-4wq7n\" (UID: \"13ee260e-1deb-4739-b738-4e8087135c50\") " pod="openstack/dnsmasq-dns-c8659b7c-4wq7n"
Dec 16 08:50:50 crc kubenswrapper[4823]: I1216 08:50:50.664954 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-c8659b7c-4wq7n"
Dec 16 08:50:51 crc kubenswrapper[4823]: I1216 08:50:51.056439 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-2"
Dec 16 08:50:51 crc kubenswrapper[4823]: W1216 08:50:51.123227 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod13ee260e_1deb_4739_b738_4e8087135c50.slice/crio-17ac3da30364310c4c38ee442d3aa847a2f8ddb2e586fb02ead75ca52cb59e0b WatchSource:0}: Error finding container 17ac3da30364310c4c38ee442d3aa847a2f8ddb2e586fb02ead75ca52cb59e0b: Status 404 returned error can't find the container with id 17ac3da30364310c4c38ee442d3aa847a2f8ddb2e586fb02ead75ca52cb59e0b
Dec 16 08:50:51 crc kubenswrapper[4823]: I1216 08:50:51.126937 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-c8659b7c-4wq7n"]
Dec 16 08:50:51 crc kubenswrapper[4823]: I1216 08:50:51.400533 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-c8659b7c-4wq7n"]
Dec 16 08:50:51 crc kubenswrapper[4823]: I1216 08:50:51.413865 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7b59595b79-vnhhg"]
Dec 16 08:50:51 crc kubenswrapper[4823]: I1216 08:50:51.415204 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b59595b79-vnhhg"
Dec 16 08:50:51 crc kubenswrapper[4823]: I1216 08:50:51.420496 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb"
Dec 16 08:50:51 crc kubenswrapper[4823]: I1216 08:50:51.500348 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b59595b79-vnhhg"]
Dec 16 08:50:51 crc kubenswrapper[4823]: I1216 08:50:51.558070 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/16894c6b-6fe2-41ae-a89f-67a0c0b3710c-ovsdbserver-sb\") pod \"dnsmasq-dns-7b59595b79-vnhhg\" (UID: \"16894c6b-6fe2-41ae-a89f-67a0c0b3710c\") " pod="openstack/dnsmasq-dns-7b59595b79-vnhhg"
Dec 16 08:50:51 crc kubenswrapper[4823]: I1216 08:50:51.558175 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/16894c6b-6fe2-41ae-a89f-67a0c0b3710c-dns-svc\") pod \"dnsmasq-dns-7b59595b79-vnhhg\" (UID: \"16894c6b-6fe2-41ae-a89f-67a0c0b3710c\") " pod="openstack/dnsmasq-dns-7b59595b79-vnhhg"
Dec 16 08:50:51 crc kubenswrapper[4823]: I1216 08:50:51.558202 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47rgb\" (UniqueName: \"kubernetes.io/projected/16894c6b-6fe2-41ae-a89f-67a0c0b3710c-kube-api-access-47rgb\") pod \"dnsmasq-dns-7b59595b79-vnhhg\" (UID: \"16894c6b-6fe2-41ae-a89f-67a0c0b3710c\") " pod="openstack/dnsmasq-dns-7b59595b79-vnhhg"
Dec 16 08:50:51 crc kubenswrapper[4823]: I1216 08:50:51.558239 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/16894c6b-6fe2-41ae-a89f-67a0c0b3710c-ovsdbserver-nb\") pod \"dnsmasq-dns-7b59595b79-vnhhg\" (UID: \"16894c6b-6fe2-41ae-a89f-67a0c0b3710c\") " pod="openstack/dnsmasq-dns-7b59595b79-vnhhg"
Dec 16 08:50:51 crc kubenswrapper[4823]: I1216 08:50:51.558253 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16894c6b-6fe2-41ae-a89f-67a0c0b3710c-config\") pod \"dnsmasq-dns-7b59595b79-vnhhg\" (UID: \"16894c6b-6fe2-41ae-a89f-67a0c0b3710c\") " pod="openstack/dnsmasq-dns-7b59595b79-vnhhg"
Dec 16 08:50:51 crc kubenswrapper[4823]: I1216 08:50:51.660330 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/16894c6b-6fe2-41ae-a89f-67a0c0b3710c-dns-svc\") pod \"dnsmasq-dns-7b59595b79-vnhhg\" (UID: \"16894c6b-6fe2-41ae-a89f-67a0c0b3710c\") " pod="openstack/dnsmasq-dns-7b59595b79-vnhhg"
Dec 16 08:50:51 crc kubenswrapper[4823]: I1216 08:50:51.660413 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47rgb\" (UniqueName: \"kubernetes.io/projected/16894c6b-6fe2-41ae-a89f-67a0c0b3710c-kube-api-access-47rgb\") pod \"dnsmasq-dns-7b59595b79-vnhhg\" (UID: \"16894c6b-6fe2-41ae-a89f-67a0c0b3710c\") " pod="openstack/dnsmasq-dns-7b59595b79-vnhhg"
Dec 16 08:50:51 crc kubenswrapper[4823]: I1216 08:50:51.660468 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/16894c6b-6fe2-41ae-a89f-67a0c0b3710c-ovsdbserver-nb\") pod \"dnsmasq-dns-7b59595b79-vnhhg\" (UID: \"16894c6b-6fe2-41ae-a89f-67a0c0b3710c\") " pod="openstack/dnsmasq-dns-7b59595b79-vnhhg"
Dec 16 08:50:51 crc kubenswrapper[4823]: I1216 08:50:51.660489 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16894c6b-6fe2-41ae-a89f-67a0c0b3710c-config\") pod \"dnsmasq-dns-7b59595b79-vnhhg\" (UID: \"16894c6b-6fe2-41ae-a89f-67a0c0b3710c\") " pod="openstack/dnsmasq-dns-7b59595b79-vnhhg"
Dec 16 08:50:51 crc kubenswrapper[4823]: I1216 08:50:51.660531 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/16894c6b-6fe2-41ae-a89f-67a0c0b3710c-ovsdbserver-sb\") pod \"dnsmasq-dns-7b59595b79-vnhhg\" (UID: \"16894c6b-6fe2-41ae-a89f-67a0c0b3710c\") " pod="openstack/dnsmasq-dns-7b59595b79-vnhhg"
Dec 16 08:50:51 crc kubenswrapper[4823]: I1216 08:50:51.661548 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/16894c6b-6fe2-41ae-a89f-67a0c0b3710c-dns-svc\") pod \"dnsmasq-dns-7b59595b79-vnhhg\" (UID: \"16894c6b-6fe2-41ae-a89f-67a0c0b3710c\") " pod="openstack/dnsmasq-dns-7b59595b79-vnhhg"
Dec 16 08:50:51 crc kubenswrapper[4823]: I1216 08:50:51.661670 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16894c6b-6fe2-41ae-a89f-67a0c0b3710c-config\") pod \"dnsmasq-dns-7b59595b79-vnhhg\" (UID: \"16894c6b-6fe2-41ae-a89f-67a0c0b3710c\") " pod="openstack/dnsmasq-dns-7b59595b79-vnhhg"
Dec 16 08:50:51 crc kubenswrapper[4823]: I1216 08:50:51.661900 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/16894c6b-6fe2-41ae-a89f-67a0c0b3710c-ovsdbserver-sb\") pod \"dnsmasq-dns-7b59595b79-vnhhg\" (UID: \"16894c6b-6fe2-41ae-a89f-67a0c0b3710c\") " pod="openstack/dnsmasq-dns-7b59595b79-vnhhg"
Dec 16 08:50:51 crc kubenswrapper[4823]: I1216 08:50:51.662081 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/16894c6b-6fe2-41ae-a89f-67a0c0b3710c-ovsdbserver-nb\") pod \"dnsmasq-dns-7b59595b79-vnhhg\" (UID: \"16894c6b-6fe2-41ae-a89f-67a0c0b3710c\") " pod="openstack/dnsmasq-dns-7b59595b79-vnhhg"
Dec 16 08:50:51 crc kubenswrapper[4823]: I1216 08:50:51.680192 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47rgb\" (UniqueName: \"kubernetes.io/projected/16894c6b-6fe2-41ae-a89f-67a0c0b3710c-kube-api-access-47rgb\") pod \"dnsmasq-dns-7b59595b79-vnhhg\" (UID: \"16894c6b-6fe2-41ae-a89f-67a0c0b3710c\") " pod="openstack/dnsmasq-dns-7b59595b79-vnhhg"
Dec 16 08:50:51 crc kubenswrapper[4823]: I1216 08:50:51.724747 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0"
Dec 16 08:50:51 crc kubenswrapper[4823]: I1216 08:50:51.749565 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-1"
Dec 16 08:50:51 crc kubenswrapper[4823]: I1216 08:50:51.769655 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b59595b79-vnhhg"
Dec 16 08:50:51 crc kubenswrapper[4823]: I1216 08:50:51.788070 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0"
Dec 16 08:50:51 crc kubenswrapper[4823]: I1216 08:50:51.797741 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-1"
Dec 16 08:50:52 crc kubenswrapper[4823]: I1216 08:50:52.030958 4823 generic.go:334] "Generic (PLEG): container finished" podID="13ee260e-1deb-4739-b738-4e8087135c50" containerID="0078c2c097f44f7bb440a3852e9557c185dd88b2977b33d7f9290f7ebb4e06f6" exitCode=0
Dec 16 08:50:52 crc kubenswrapper[4823]: I1216 08:50:52.032072 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c8659b7c-4wq7n" event={"ID":"13ee260e-1deb-4739-b738-4e8087135c50","Type":"ContainerDied","Data":"0078c2c097f44f7bb440a3852e9557c185dd88b2977b33d7f9290f7ebb4e06f6"}
Dec 16 08:50:52 crc kubenswrapper[4823]: I1216 08:50:52.032171 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c8659b7c-4wq7n" event={"ID":"13ee260e-1deb-4739-b738-4e8087135c50","Type":"ContainerStarted","Data":"17ac3da30364310c4c38ee442d3aa847a2f8ddb2e586fb02ead75ca52cb59e0b"}
Dec 16 08:50:52 crc kubenswrapper[4823]: I1216 08:50:52.302493 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b59595b79-vnhhg"]
Dec 16 08:50:53 crc kubenswrapper[4823]: I1216 08:50:53.046391 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c8659b7c-4wq7n" event={"ID":"13ee260e-1deb-4739-b738-4e8087135c50","Type":"ContainerStarted","Data":"58d2e873b2e4f1dc37b0a5e286aa446d69704cd5f97b00807ae0a9ca3e0f3394"}
Dec 16 08:50:53 crc kubenswrapper[4823]: I1216 08:50:53.047149 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-c8659b7c-4wq7n" podUID="13ee260e-1deb-4739-b738-4e8087135c50" containerName="dnsmasq-dns" containerID="cri-o://58d2e873b2e4f1dc37b0a5e286aa446d69704cd5f97b00807ae0a9ca3e0f3394" gracePeriod=10
Dec 16 08:50:53 crc kubenswrapper[4823]: I1216 08:50:53.047236 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-c8659b7c-4wq7n"
Dec 16 08:50:53 crc kubenswrapper[4823]: I1216 08:50:53.049700 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b59595b79-vnhhg" event={"ID":"16894c6b-6fe2-41ae-a89f-67a0c0b3710c","Type":"ContainerDied","Data":"c8203acb195c1d9e612d2e0aec7a7c8f859251f9dd746e26af05ac08486fdaf5"}
Dec 16 08:50:53 crc kubenswrapper[4823]: I1216 08:50:53.050911 4823 generic.go:334] "Generic (PLEG): container finished" podID="16894c6b-6fe2-41ae-a89f-67a0c0b3710c" containerID="c8203acb195c1d9e612d2e0aec7a7c8f859251f9dd746e26af05ac08486fdaf5" exitCode=0
Dec 16 08:50:53 crc kubenswrapper[4823]: I1216 08:50:53.051000 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b59595b79-vnhhg" event={"ID":"16894c6b-6fe2-41ae-a89f-67a0c0b3710c","Type":"ContainerStarted","Data":"c00d47d7a3f834c3b133b663d103b7e1649997b2bc12e299d7fbfdb87f162d57"}
Dec 16 08:50:53 crc kubenswrapper[4823]: I1216 08:50:53.082356 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-c8659b7c-4wq7n" podStartSLOduration=3.082331366 podStartE2EDuration="3.082331366s" podCreationTimestamp="2025-12-16 08:50:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 08:50:53.069447723 +0000 UTC m=+6931.558013876" watchObservedRunningTime="2025-12-16 08:50:53.082331366 +0000 UTC m=+6931.570897499"
Dec 16 08:50:53 crc kubenswrapper[4823]: I1216 08:50:53.513964 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-c8659b7c-4wq7n"
Dec 16 08:50:53 crc kubenswrapper[4823]: I1216 08:50:53.599577 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4rkvc\" (UniqueName: \"kubernetes.io/projected/13ee260e-1deb-4739-b738-4e8087135c50-kube-api-access-4rkvc\") pod \"13ee260e-1deb-4739-b738-4e8087135c50\" (UID: \"13ee260e-1deb-4739-b738-4e8087135c50\") "
Dec 16 08:50:53 crc kubenswrapper[4823]: I1216 08:50:53.599669 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/13ee260e-1deb-4739-b738-4e8087135c50-ovsdbserver-sb\") pod \"13ee260e-1deb-4739-b738-4e8087135c50\" (UID: \"13ee260e-1deb-4739-b738-4e8087135c50\") "
Dec 16 08:50:53 crc kubenswrapper[4823]: I1216 08:50:53.599708 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/13ee260e-1deb-4739-b738-4e8087135c50-dns-svc\") pod \"13ee260e-1deb-4739-b738-4e8087135c50\" (UID: \"13ee260e-1deb-4739-b738-4e8087135c50\") "
Dec 16 08:50:53 crc kubenswrapper[4823]: I1216 08:50:53.599822 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13ee260e-1deb-4739-b738-4e8087135c50-config\") pod \"13ee260e-1deb-4739-b738-4e8087135c50\" (UID: \"13ee260e-1deb-4739-b738-4e8087135c50\") "
Dec 16 08:50:53 crc kubenswrapper[4823]: I1216 08:50:53.604814 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13ee260e-1deb-4739-b738-4e8087135c50-kube-api-access-4rkvc" (OuterVolumeSpecName: "kube-api-access-4rkvc") pod "13ee260e-1deb-4739-b738-4e8087135c50" (UID: "13ee260e-1deb-4739-b738-4e8087135c50"). InnerVolumeSpecName "kube-api-access-4rkvc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 08:50:53 crc kubenswrapper[4823]: I1216 08:50:53.635935 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13ee260e-1deb-4739-b738-4e8087135c50-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "13ee260e-1deb-4739-b738-4e8087135c50" (UID: "13ee260e-1deb-4739-b738-4e8087135c50"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 16 08:50:53 crc kubenswrapper[4823]: I1216 08:50:53.641703 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13ee260e-1deb-4739-b738-4e8087135c50-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "13ee260e-1deb-4739-b738-4e8087135c50" (UID: "13ee260e-1deb-4739-b738-4e8087135c50"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 16 08:50:53 crc kubenswrapper[4823]: I1216 08:50:53.643571 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13ee260e-1deb-4739-b738-4e8087135c50-config" (OuterVolumeSpecName: "config") pod "13ee260e-1deb-4739-b738-4e8087135c50" (UID: "13ee260e-1deb-4739-b738-4e8087135c50"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 16 08:50:53 crc kubenswrapper[4823]: I1216 08:50:53.701712 4823 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13ee260e-1deb-4739-b738-4e8087135c50-config\") on node \"crc\" DevicePath \"\""
Dec 16 08:50:53 crc kubenswrapper[4823]: I1216 08:50:53.701739 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4rkvc\" (UniqueName: \"kubernetes.io/projected/13ee260e-1deb-4739-b738-4e8087135c50-kube-api-access-4rkvc\") on node \"crc\" DevicePath \"\""
Dec 16 08:50:53 crc kubenswrapper[4823]: I1216 08:50:53.701751 4823 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/13ee260e-1deb-4739-b738-4e8087135c50-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Dec 16 08:50:53 crc kubenswrapper[4823]: I1216 08:50:53.701760 4823 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/13ee260e-1deb-4739-b738-4e8087135c50-dns-svc\") on node \"crc\" DevicePath \"\""
Dec 16 08:50:53 crc kubenswrapper[4823]: I1216 08:50:53.833981 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-j2wmf"]
Dec 16 08:50:53 crc kubenswrapper[4823]: E1216 08:50:53.835201 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13ee260e-1deb-4739-b738-4e8087135c50" containerName="dnsmasq-dns"
Dec 16 08:50:53 crc kubenswrapper[4823]: I1216 08:50:53.835289 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="13ee260e-1deb-4739-b738-4e8087135c50" containerName="dnsmasq-dns"
Dec 16 08:50:53 crc kubenswrapper[4823]: E1216 08:50:53.835359 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13ee260e-1deb-4739-b738-4e8087135c50" containerName="init"
Dec 16 08:50:53 crc kubenswrapper[4823]: I1216 08:50:53.835415 4823 state_mem.go:107] "Deleted CPUSet
assignment" podUID="13ee260e-1deb-4739-b738-4e8087135c50" containerName="init" Dec 16 08:50:53 crc kubenswrapper[4823]: I1216 08:50:53.835650 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="13ee260e-1deb-4739-b738-4e8087135c50" containerName="dnsmasq-dns" Dec 16 08:50:53 crc kubenswrapper[4823]: I1216 08:50:53.837972 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-j2wmf" Dec 16 08:50:53 crc kubenswrapper[4823]: I1216 08:50:53.847333 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-j2wmf"] Dec 16 08:50:53 crc kubenswrapper[4823]: I1216 08:50:53.904080 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62gws\" (UniqueName: \"kubernetes.io/projected/66d23cee-1793-4b9b-b1ca-df73dc0037aa-kube-api-access-62gws\") pod \"certified-operators-j2wmf\" (UID: \"66d23cee-1793-4b9b-b1ca-df73dc0037aa\") " pod="openshift-marketplace/certified-operators-j2wmf" Dec 16 08:50:53 crc kubenswrapper[4823]: I1216 08:50:53.904184 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66d23cee-1793-4b9b-b1ca-df73dc0037aa-utilities\") pod \"certified-operators-j2wmf\" (UID: \"66d23cee-1793-4b9b-b1ca-df73dc0037aa\") " pod="openshift-marketplace/certified-operators-j2wmf" Dec 16 08:50:53 crc kubenswrapper[4823]: I1216 08:50:53.904266 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66d23cee-1793-4b9b-b1ca-df73dc0037aa-catalog-content\") pod \"certified-operators-j2wmf\" (UID: \"66d23cee-1793-4b9b-b1ca-df73dc0037aa\") " pod="openshift-marketplace/certified-operators-j2wmf" Dec 16 08:50:54 crc kubenswrapper[4823]: I1216 08:50:54.005632 4823 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66d23cee-1793-4b9b-b1ca-df73dc0037aa-catalog-content\") pod \"certified-operators-j2wmf\" (UID: \"66d23cee-1793-4b9b-b1ca-df73dc0037aa\") " pod="openshift-marketplace/certified-operators-j2wmf" Dec 16 08:50:54 crc kubenswrapper[4823]: I1216 08:50:54.005769 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-62gws\" (UniqueName: \"kubernetes.io/projected/66d23cee-1793-4b9b-b1ca-df73dc0037aa-kube-api-access-62gws\") pod \"certified-operators-j2wmf\" (UID: \"66d23cee-1793-4b9b-b1ca-df73dc0037aa\") " pod="openshift-marketplace/certified-operators-j2wmf" Dec 16 08:50:54 crc kubenswrapper[4823]: I1216 08:50:54.005855 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66d23cee-1793-4b9b-b1ca-df73dc0037aa-utilities\") pod \"certified-operators-j2wmf\" (UID: \"66d23cee-1793-4b9b-b1ca-df73dc0037aa\") " pod="openshift-marketplace/certified-operators-j2wmf" Dec 16 08:50:54 crc kubenswrapper[4823]: I1216 08:50:54.006280 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66d23cee-1793-4b9b-b1ca-df73dc0037aa-catalog-content\") pod \"certified-operators-j2wmf\" (UID: \"66d23cee-1793-4b9b-b1ca-df73dc0037aa\") " pod="openshift-marketplace/certified-operators-j2wmf" Dec 16 08:50:54 crc kubenswrapper[4823]: I1216 08:50:54.006447 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66d23cee-1793-4b9b-b1ca-df73dc0037aa-utilities\") pod \"certified-operators-j2wmf\" (UID: \"66d23cee-1793-4b9b-b1ca-df73dc0037aa\") " pod="openshift-marketplace/certified-operators-j2wmf" Dec 16 08:50:54 crc kubenswrapper[4823]: I1216 08:50:54.025409 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-62gws\" (UniqueName: \"kubernetes.io/projected/66d23cee-1793-4b9b-b1ca-df73dc0037aa-kube-api-access-62gws\") pod \"certified-operators-j2wmf\" (UID: \"66d23cee-1793-4b9b-b1ca-df73dc0037aa\") " pod="openshift-marketplace/certified-operators-j2wmf" Dec 16 08:50:54 crc kubenswrapper[4823]: I1216 08:50:54.060579 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b59595b79-vnhhg" event={"ID":"16894c6b-6fe2-41ae-a89f-67a0c0b3710c","Type":"ContainerStarted","Data":"f51fce565aaf95b20db22b6d993baf2ce631cde0839d86c257c1600106bede31"} Dec 16 08:50:54 crc kubenswrapper[4823]: I1216 08:50:54.060715 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7b59595b79-vnhhg" Dec 16 08:50:54 crc kubenswrapper[4823]: I1216 08:50:54.062505 4823 generic.go:334] "Generic (PLEG): container finished" podID="13ee260e-1deb-4739-b738-4e8087135c50" containerID="58d2e873b2e4f1dc37b0a5e286aa446d69704cd5f97b00807ae0a9ca3e0f3394" exitCode=0 Dec 16 08:50:54 crc kubenswrapper[4823]: I1216 08:50:54.062549 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c8659b7c-4wq7n" event={"ID":"13ee260e-1deb-4739-b738-4e8087135c50","Type":"ContainerDied","Data":"58d2e873b2e4f1dc37b0a5e286aa446d69704cd5f97b00807ae0a9ca3e0f3394"} Dec 16 08:50:54 crc kubenswrapper[4823]: I1216 08:50:54.062576 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c8659b7c-4wq7n" event={"ID":"13ee260e-1deb-4739-b738-4e8087135c50","Type":"ContainerDied","Data":"17ac3da30364310c4c38ee442d3aa847a2f8ddb2e586fb02ead75ca52cb59e0b"} Dec 16 08:50:54 crc kubenswrapper[4823]: I1216 08:50:54.062593 4823 scope.go:117] "RemoveContainer" containerID="58d2e873b2e4f1dc37b0a5e286aa446d69704cd5f97b00807ae0a9ca3e0f3394" Dec 16 08:50:54 crc kubenswrapper[4823]: I1216 08:50:54.062739 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-c8659b7c-4wq7n" Dec 16 08:50:54 crc kubenswrapper[4823]: I1216 08:50:54.089374 4823 scope.go:117] "RemoveContainer" containerID="0078c2c097f44f7bb440a3852e9557c185dd88b2977b33d7f9290f7ebb4e06f6" Dec 16 08:50:54 crc kubenswrapper[4823]: I1216 08:50:54.102046 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7b59595b79-vnhhg" podStartSLOduration=3.102009508 podStartE2EDuration="3.102009508s" podCreationTimestamp="2025-12-16 08:50:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 08:50:54.083935141 +0000 UTC m=+6932.572501264" watchObservedRunningTime="2025-12-16 08:50:54.102009508 +0000 UTC m=+6932.590575631" Dec 16 08:50:54 crc kubenswrapper[4823]: I1216 08:50:54.105854 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-c8659b7c-4wq7n"] Dec 16 08:50:54 crc kubenswrapper[4823]: I1216 08:50:54.109495 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-c8659b7c-4wq7n"] Dec 16 08:50:54 crc kubenswrapper[4823]: I1216 08:50:54.112615 4823 scope.go:117] "RemoveContainer" containerID="58d2e873b2e4f1dc37b0a5e286aa446d69704cd5f97b00807ae0a9ca3e0f3394" Dec 16 08:50:54 crc kubenswrapper[4823]: E1216 08:50:54.112919 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58d2e873b2e4f1dc37b0a5e286aa446d69704cd5f97b00807ae0a9ca3e0f3394\": container with ID starting with 58d2e873b2e4f1dc37b0a5e286aa446d69704cd5f97b00807ae0a9ca3e0f3394 not found: ID does not exist" containerID="58d2e873b2e4f1dc37b0a5e286aa446d69704cd5f97b00807ae0a9ca3e0f3394" Dec 16 08:50:54 crc kubenswrapper[4823]: I1216 08:50:54.112952 4823 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"58d2e873b2e4f1dc37b0a5e286aa446d69704cd5f97b00807ae0a9ca3e0f3394"} err="failed to get container status \"58d2e873b2e4f1dc37b0a5e286aa446d69704cd5f97b00807ae0a9ca3e0f3394\": rpc error: code = NotFound desc = could not find container \"58d2e873b2e4f1dc37b0a5e286aa446d69704cd5f97b00807ae0a9ca3e0f3394\": container with ID starting with 58d2e873b2e4f1dc37b0a5e286aa446d69704cd5f97b00807ae0a9ca3e0f3394 not found: ID does not exist" Dec 16 08:50:54 crc kubenswrapper[4823]: I1216 08:50:54.112976 4823 scope.go:117] "RemoveContainer" containerID="0078c2c097f44f7bb440a3852e9557c185dd88b2977b33d7f9290f7ebb4e06f6" Dec 16 08:50:54 crc kubenswrapper[4823]: E1216 08:50:54.113183 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0078c2c097f44f7bb440a3852e9557c185dd88b2977b33d7f9290f7ebb4e06f6\": container with ID starting with 0078c2c097f44f7bb440a3852e9557c185dd88b2977b33d7f9290f7ebb4e06f6 not found: ID does not exist" containerID="0078c2c097f44f7bb440a3852e9557c185dd88b2977b33d7f9290f7ebb4e06f6" Dec 16 08:50:54 crc kubenswrapper[4823]: I1216 08:50:54.113202 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0078c2c097f44f7bb440a3852e9557c185dd88b2977b33d7f9290f7ebb4e06f6"} err="failed to get container status \"0078c2c097f44f7bb440a3852e9557c185dd88b2977b33d7f9290f7ebb4e06f6\": rpc error: code = NotFound desc = could not find container \"0078c2c097f44f7bb440a3852e9557c185dd88b2977b33d7f9290f7ebb4e06f6\": container with ID starting with 0078c2c097f44f7bb440a3852e9557c185dd88b2977b33d7f9290f7ebb4e06f6 not found: ID does not exist" Dec 16 08:50:54 crc kubenswrapper[4823]: I1216 08:50:54.155124 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-j2wmf" Dec 16 08:50:54 crc kubenswrapper[4823]: I1216 08:50:54.588494 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-j2wmf"] Dec 16 08:50:54 crc kubenswrapper[4823]: W1216 08:50:54.591138 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod66d23cee_1793_4b9b_b1ca_df73dc0037aa.slice/crio-2a23d9ca15ddad7c8963249f72ce4e07e2899d80b52cef15d87f01a357ef7f94 WatchSource:0}: Error finding container 2a23d9ca15ddad7c8963249f72ce4e07e2899d80b52cef15d87f01a357ef7f94: Status 404 returned error can't find the container with id 2a23d9ca15ddad7c8963249f72ce4e07e2899d80b52cef15d87f01a357ef7f94 Dec 16 08:50:54 crc kubenswrapper[4823]: I1216 08:50:54.759008 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-copy-data"] Dec 16 08:50:54 crc kubenswrapper[4823]: I1216 08:50:54.760234 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-copy-data" Dec 16 08:50:54 crc kubenswrapper[4823]: I1216 08:50:54.762507 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovn-data-cert" Dec 16 08:50:54 crc kubenswrapper[4823]: I1216 08:50:54.765783 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-copy-data"] Dec 16 08:50:54 crc kubenswrapper[4823]: I1216 08:50:54.818168 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvbpg\" (UniqueName: \"kubernetes.io/projected/b416f746-16ad-4f74-b315-f67ca3d0bb35-kube-api-access-fvbpg\") pod \"ovn-copy-data\" (UID: \"b416f746-16ad-4f74-b315-f67ca3d0bb35\") " pod="openstack/ovn-copy-data" Dec 16 08:50:54 crc kubenswrapper[4823]: I1216 08:50:54.818266 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-60d75f57-91ba-4826-b913-2e68eb6e0abb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-60d75f57-91ba-4826-b913-2e68eb6e0abb\") pod \"ovn-copy-data\" (UID: \"b416f746-16ad-4f74-b315-f67ca3d0bb35\") " pod="openstack/ovn-copy-data" Dec 16 08:50:54 crc kubenswrapper[4823]: I1216 08:50:54.818322 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/b416f746-16ad-4f74-b315-f67ca3d0bb35-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"b416f746-16ad-4f74-b315-f67ca3d0bb35\") " pod="openstack/ovn-copy-data" Dec 16 08:50:54 crc kubenswrapper[4823]: I1216 08:50:54.919672 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvbpg\" (UniqueName: \"kubernetes.io/projected/b416f746-16ad-4f74-b315-f67ca3d0bb35-kube-api-access-fvbpg\") pod \"ovn-copy-data\" (UID: \"b416f746-16ad-4f74-b315-f67ca3d0bb35\") " pod="openstack/ovn-copy-data" Dec 16 08:50:54 crc kubenswrapper[4823]: I1216 08:50:54.919746 4823 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-60d75f57-91ba-4826-b913-2e68eb6e0abb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-60d75f57-91ba-4826-b913-2e68eb6e0abb\") pod \"ovn-copy-data\" (UID: \"b416f746-16ad-4f74-b315-f67ca3d0bb35\") " pod="openstack/ovn-copy-data" Dec 16 08:50:54 crc kubenswrapper[4823]: I1216 08:50:54.919792 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/b416f746-16ad-4f74-b315-f67ca3d0bb35-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"b416f746-16ad-4f74-b315-f67ca3d0bb35\") " pod="openstack/ovn-copy-data" Dec 16 08:50:54 crc kubenswrapper[4823]: I1216 08:50:54.923702 4823 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 16 08:50:54 crc kubenswrapper[4823]: I1216 08:50:54.923751 4823 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-60d75f57-91ba-4826-b913-2e68eb6e0abb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-60d75f57-91ba-4826-b913-2e68eb6e0abb\") pod \"ovn-copy-data\" (UID: \"b416f746-16ad-4f74-b315-f67ca3d0bb35\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/c463b53b5c39d91ef5f9ebd2004be580616b9af69d542851a3c910dfb53c8d06/globalmount\"" pod="openstack/ovn-copy-data" Dec 16 08:50:54 crc kubenswrapper[4823]: I1216 08:50:54.926416 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/b416f746-16ad-4f74-b315-f67ca3d0bb35-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"b416f746-16ad-4f74-b315-f67ca3d0bb35\") " pod="openstack/ovn-copy-data" Dec 16 08:50:54 crc kubenswrapper[4823]: I1216 08:50:54.940540 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvbpg\" (UniqueName: 
\"kubernetes.io/projected/b416f746-16ad-4f74-b315-f67ca3d0bb35-kube-api-access-fvbpg\") pod \"ovn-copy-data\" (UID: \"b416f746-16ad-4f74-b315-f67ca3d0bb35\") " pod="openstack/ovn-copy-data" Dec 16 08:50:54 crc kubenswrapper[4823]: I1216 08:50:54.954106 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-60d75f57-91ba-4826-b913-2e68eb6e0abb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-60d75f57-91ba-4826-b913-2e68eb6e0abb\") pod \"ovn-copy-data\" (UID: \"b416f746-16ad-4f74-b315-f67ca3d0bb35\") " pod="openstack/ovn-copy-data" Dec 16 08:50:55 crc kubenswrapper[4823]: I1216 08:50:55.071385 4823 generic.go:334] "Generic (PLEG): container finished" podID="66d23cee-1793-4b9b-b1ca-df73dc0037aa" containerID="a46889f03f1f51668c4ed58ecc376776cd02e1652876a99f6212994ed195a0a2" exitCode=0 Dec 16 08:50:55 crc kubenswrapper[4823]: I1216 08:50:55.071440 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j2wmf" event={"ID":"66d23cee-1793-4b9b-b1ca-df73dc0037aa","Type":"ContainerDied","Data":"a46889f03f1f51668c4ed58ecc376776cd02e1652876a99f6212994ed195a0a2"} Dec 16 08:50:55 crc kubenswrapper[4823]: I1216 08:50:55.071501 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j2wmf" event={"ID":"66d23cee-1793-4b9b-b1ca-df73dc0037aa","Type":"ContainerStarted","Data":"2a23d9ca15ddad7c8963249f72ce4e07e2899d80b52cef15d87f01a357ef7f94"} Dec 16 08:50:55 crc kubenswrapper[4823]: I1216 08:50:55.077680 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-copy-data" Dec 16 08:50:55 crc kubenswrapper[4823]: I1216 08:50:55.562487 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-copy-data"] Dec 16 08:50:55 crc kubenswrapper[4823]: I1216 08:50:55.787575 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13ee260e-1deb-4739-b738-4e8087135c50" path="/var/lib/kubelet/pods/13ee260e-1deb-4739-b738-4e8087135c50/volumes" Dec 16 08:50:56 crc kubenswrapper[4823]: I1216 08:50:56.079354 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j2wmf" event={"ID":"66d23cee-1793-4b9b-b1ca-df73dc0037aa","Type":"ContainerStarted","Data":"89ed551e87fdf3bb6ecd0557f0bceb8e48a8c31f28056f43a32232bb6fb3abfe"} Dec 16 08:50:56 crc kubenswrapper[4823]: I1216 08:50:56.081216 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"b416f746-16ad-4f74-b315-f67ca3d0bb35","Type":"ContainerStarted","Data":"01385a59cb3d569d4bca785b2dde8a6fb841f70317cca67d92c68009862c7ab6"} Dec 16 08:50:56 crc kubenswrapper[4823]: I1216 08:50:56.081292 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"b416f746-16ad-4f74-b315-f67ca3d0bb35","Type":"ContainerStarted","Data":"c61d96840ac48c4b80c9fe7848ce298b20b61c12c4aaf1281dc79fd3f2c264f0"} Dec 16 08:50:56 crc kubenswrapper[4823]: I1216 08:50:56.129676 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-copy-data" podStartSLOduration=2.93147453 podStartE2EDuration="3.129655783s" podCreationTimestamp="2025-12-16 08:50:53 +0000 UTC" firstStartedPulling="2025-12-16 08:50:55.569325162 +0000 UTC m=+6934.057891285" lastFinishedPulling="2025-12-16 08:50:55.767506395 +0000 UTC m=+6934.256072538" observedRunningTime="2025-12-16 08:50:56.124335095 +0000 UTC m=+6934.612901228" watchObservedRunningTime="2025-12-16 08:50:56.129655783 +0000 UTC m=+6934.618221906" Dec 16 
08:50:57 crc kubenswrapper[4823]: I1216 08:50:57.090013 4823 generic.go:334] "Generic (PLEG): container finished" podID="66d23cee-1793-4b9b-b1ca-df73dc0037aa" containerID="89ed551e87fdf3bb6ecd0557f0bceb8e48a8c31f28056f43a32232bb6fb3abfe" exitCode=0 Dec 16 08:50:57 crc kubenswrapper[4823]: I1216 08:50:57.090118 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j2wmf" event={"ID":"66d23cee-1793-4b9b-b1ca-df73dc0037aa","Type":"ContainerDied","Data":"89ed551e87fdf3bb6ecd0557f0bceb8e48a8c31f28056f43a32232bb6fb3abfe"} Dec 16 08:50:59 crc kubenswrapper[4823]: I1216 08:50:59.111747 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j2wmf" event={"ID":"66d23cee-1793-4b9b-b1ca-df73dc0037aa","Type":"ContainerStarted","Data":"bab6f466d970f99d325b0f094cb48cc87dc57fcf2dc7306107a7995105a23e83"} Dec 16 08:50:59 crc kubenswrapper[4823]: I1216 08:50:59.133544 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-j2wmf" podStartSLOduration=3.257392932 podStartE2EDuration="6.133524659s" podCreationTimestamp="2025-12-16 08:50:53 +0000 UTC" firstStartedPulling="2025-12-16 08:50:55.073396817 +0000 UTC m=+6933.561962930" lastFinishedPulling="2025-12-16 08:50:57.949528534 +0000 UTC m=+6936.438094657" observedRunningTime="2025-12-16 08:50:59.130215335 +0000 UTC m=+6937.618781448" watchObservedRunningTime="2025-12-16 08:50:59.133524659 +0000 UTC m=+6937.622090782" Dec 16 08:51:01 crc kubenswrapper[4823]: I1216 08:51:01.781934 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7b59595b79-vnhhg" Dec 16 08:51:01 crc kubenswrapper[4823]: I1216 08:51:01.844461 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55db7cd99c-62nfz"] Dec 16 08:51:01 crc kubenswrapper[4823]: I1216 08:51:01.845198 4823 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openstack/dnsmasq-dns-55db7cd99c-62nfz" podUID="4042708f-2c2b-4c71-adb8-c8c9a12c3284" containerName="dnsmasq-dns" containerID="cri-o://99ade9bb130226e2e9065e905ac5bf166a1e5dc8ccad306acdfb239b54e73335" gracePeriod=10 Dec 16 08:51:02 crc kubenswrapper[4823]: I1216 08:51:02.029588 4823 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-55db7cd99c-62nfz" podUID="4042708f-2c2b-4c71-adb8-c8c9a12c3284" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.250:5353: connect: connection refused" Dec 16 08:51:02 crc kubenswrapper[4823]: I1216 08:51:02.135509 4823 generic.go:334] "Generic (PLEG): container finished" podID="4042708f-2c2b-4c71-adb8-c8c9a12c3284" containerID="99ade9bb130226e2e9065e905ac5bf166a1e5dc8ccad306acdfb239b54e73335" exitCode=0 Dec 16 08:51:02 crc kubenswrapper[4823]: I1216 08:51:02.135553 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55db7cd99c-62nfz" event={"ID":"4042708f-2c2b-4c71-adb8-c8c9a12c3284","Type":"ContainerDied","Data":"99ade9bb130226e2e9065e905ac5bf166a1e5dc8ccad306acdfb239b54e73335"} Dec 16 08:51:02 crc kubenswrapper[4823]: I1216 08:51:02.952563 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55db7cd99c-62nfz" Dec 16 08:51:03 crc kubenswrapper[4823]: I1216 08:51:03.062445 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4042708f-2c2b-4c71-adb8-c8c9a12c3284-dns-svc\") pod \"4042708f-2c2b-4c71-adb8-c8c9a12c3284\" (UID: \"4042708f-2c2b-4c71-adb8-c8c9a12c3284\") " Dec 16 08:51:03 crc kubenswrapper[4823]: I1216 08:51:03.062502 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4042708f-2c2b-4c71-adb8-c8c9a12c3284-config\") pod \"4042708f-2c2b-4c71-adb8-c8c9a12c3284\" (UID: \"4042708f-2c2b-4c71-adb8-c8c9a12c3284\") " Dec 16 08:51:03 crc kubenswrapper[4823]: I1216 08:51:03.062567 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8l66b\" (UniqueName: \"kubernetes.io/projected/4042708f-2c2b-4c71-adb8-c8c9a12c3284-kube-api-access-8l66b\") pod \"4042708f-2c2b-4c71-adb8-c8c9a12c3284\" (UID: \"4042708f-2c2b-4c71-adb8-c8c9a12c3284\") " Dec 16 08:51:03 crc kubenswrapper[4823]: I1216 08:51:03.069450 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4042708f-2c2b-4c71-adb8-c8c9a12c3284-kube-api-access-8l66b" (OuterVolumeSpecName: "kube-api-access-8l66b") pod "4042708f-2c2b-4c71-adb8-c8c9a12c3284" (UID: "4042708f-2c2b-4c71-adb8-c8c9a12c3284"). InnerVolumeSpecName "kube-api-access-8l66b". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:51:03 crc kubenswrapper[4823]: I1216 08:51:03.110661 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4042708f-2c2b-4c71-adb8-c8c9a12c3284-config" (OuterVolumeSpecName: "config") pod "4042708f-2c2b-4c71-adb8-c8c9a12c3284" (UID: "4042708f-2c2b-4c71-adb8-c8c9a12c3284"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:51:03 crc kubenswrapper[4823]: I1216 08:51:03.117110 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4042708f-2c2b-4c71-adb8-c8c9a12c3284-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4042708f-2c2b-4c71-adb8-c8c9a12c3284" (UID: "4042708f-2c2b-4c71-adb8-c8c9a12c3284"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:51:03 crc kubenswrapper[4823]: I1216 08:51:03.151908 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55db7cd99c-62nfz" event={"ID":"4042708f-2c2b-4c71-adb8-c8c9a12c3284","Type":"ContainerDied","Data":"72755a305d84714f5026b4124d1492bd95a45e33fe4637363b02d8dda5287ad7"} Dec 16 08:51:03 crc kubenswrapper[4823]: I1216 08:51:03.152252 4823 scope.go:117] "RemoveContainer" containerID="99ade9bb130226e2e9065e905ac5bf166a1e5dc8ccad306acdfb239b54e73335" Dec 16 08:51:03 crc kubenswrapper[4823]: I1216 08:51:03.151961 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55db7cd99c-62nfz" Dec 16 08:51:03 crc kubenswrapper[4823]: I1216 08:51:03.164891 4823 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4042708f-2c2b-4c71-adb8-c8c9a12c3284-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 16 08:51:03 crc kubenswrapper[4823]: I1216 08:51:03.165219 4823 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4042708f-2c2b-4c71-adb8-c8c9a12c3284-config\") on node \"crc\" DevicePath \"\"" Dec 16 08:51:03 crc kubenswrapper[4823]: I1216 08:51:03.165229 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8l66b\" (UniqueName: \"kubernetes.io/projected/4042708f-2c2b-4c71-adb8-c8c9a12c3284-kube-api-access-8l66b\") on node \"crc\" DevicePath \"\"" Dec 16 08:51:03 crc kubenswrapper[4823]: I1216 08:51:03.181322 4823 scope.go:117] "RemoveContainer" containerID="055c62ef347cd361e5bfe2a423fe2a269cb09bd4337b3aa1f7ab1e229c9dc996" Dec 16 08:51:03 crc kubenswrapper[4823]: I1216 08:51:03.193735 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55db7cd99c-62nfz"] Dec 16 08:51:03 crc kubenswrapper[4823]: I1216 08:51:03.202494 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-55db7cd99c-62nfz"] Dec 16 08:51:03 crc kubenswrapper[4823]: I1216 08:51:03.782273 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4042708f-2c2b-4c71-adb8-c8c9a12c3284" path="/var/lib/kubelet/pods/4042708f-2c2b-4c71-adb8-c8c9a12c3284/volumes" Dec 16 08:51:03 crc kubenswrapper[4823]: I1216 08:51:03.872971 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-5rr5w"] Dec 16 08:51:03 crc kubenswrapper[4823]: E1216 08:51:03.873755 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4042708f-2c2b-4c71-adb8-c8c9a12c3284" containerName="init" Dec 16 08:51:03 crc 
kubenswrapper[4823]: I1216 08:51:03.873777 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="4042708f-2c2b-4c71-adb8-c8c9a12c3284" containerName="init" Dec 16 08:51:03 crc kubenswrapper[4823]: E1216 08:51:03.873805 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4042708f-2c2b-4c71-adb8-c8c9a12c3284" containerName="dnsmasq-dns" Dec 16 08:51:03 crc kubenswrapper[4823]: I1216 08:51:03.873816 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="4042708f-2c2b-4c71-adb8-c8c9a12c3284" containerName="dnsmasq-dns" Dec 16 08:51:03 crc kubenswrapper[4823]: I1216 08:51:03.874083 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="4042708f-2c2b-4c71-adb8-c8c9a12c3284" containerName="dnsmasq-dns" Dec 16 08:51:03 crc kubenswrapper[4823]: I1216 08:51:03.878288 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5rr5w" Dec 16 08:51:03 crc kubenswrapper[4823]: I1216 08:51:03.884897 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5rr5w"] Dec 16 08:51:03 crc kubenswrapper[4823]: I1216 08:51:03.982771 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef5f70d7-f0d1-4cac-805f-6a04ef9e6b6c-utilities\") pod \"community-operators-5rr5w\" (UID: \"ef5f70d7-f0d1-4cac-805f-6a04ef9e6b6c\") " pod="openshift-marketplace/community-operators-5rr5w" Dec 16 08:51:03 crc kubenswrapper[4823]: I1216 08:51:03.983098 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kh6pp\" (UniqueName: \"kubernetes.io/projected/ef5f70d7-f0d1-4cac-805f-6a04ef9e6b6c-kube-api-access-kh6pp\") pod \"community-operators-5rr5w\" (UID: \"ef5f70d7-f0d1-4cac-805f-6a04ef9e6b6c\") " pod="openshift-marketplace/community-operators-5rr5w" Dec 16 08:51:03 crc kubenswrapper[4823]: I1216 
08:51:03.983306 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef5f70d7-f0d1-4cac-805f-6a04ef9e6b6c-catalog-content\") pod \"community-operators-5rr5w\" (UID: \"ef5f70d7-f0d1-4cac-805f-6a04ef9e6b6c\") " pod="openshift-marketplace/community-operators-5rr5w" Dec 16 08:51:04 crc kubenswrapper[4823]: I1216 08:51:04.084616 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef5f70d7-f0d1-4cac-805f-6a04ef9e6b6c-catalog-content\") pod \"community-operators-5rr5w\" (UID: \"ef5f70d7-f0d1-4cac-805f-6a04ef9e6b6c\") " pod="openshift-marketplace/community-operators-5rr5w" Dec 16 08:51:04 crc kubenswrapper[4823]: I1216 08:51:04.084744 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef5f70d7-f0d1-4cac-805f-6a04ef9e6b6c-utilities\") pod \"community-operators-5rr5w\" (UID: \"ef5f70d7-f0d1-4cac-805f-6a04ef9e6b6c\") " pod="openshift-marketplace/community-operators-5rr5w" Dec 16 08:51:04 crc kubenswrapper[4823]: I1216 08:51:04.084837 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kh6pp\" (UniqueName: \"kubernetes.io/projected/ef5f70d7-f0d1-4cac-805f-6a04ef9e6b6c-kube-api-access-kh6pp\") pod \"community-operators-5rr5w\" (UID: \"ef5f70d7-f0d1-4cac-805f-6a04ef9e6b6c\") " pod="openshift-marketplace/community-operators-5rr5w" Dec 16 08:51:04 crc kubenswrapper[4823]: I1216 08:51:04.085378 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef5f70d7-f0d1-4cac-805f-6a04ef9e6b6c-catalog-content\") pod \"community-operators-5rr5w\" (UID: \"ef5f70d7-f0d1-4cac-805f-6a04ef9e6b6c\") " pod="openshift-marketplace/community-operators-5rr5w" Dec 16 08:51:04 crc kubenswrapper[4823]: I1216 
08:51:04.085401 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef5f70d7-f0d1-4cac-805f-6a04ef9e6b6c-utilities\") pod \"community-operators-5rr5w\" (UID: \"ef5f70d7-f0d1-4cac-805f-6a04ef9e6b6c\") " pod="openshift-marketplace/community-operators-5rr5w" Dec 16 08:51:04 crc kubenswrapper[4823]: I1216 08:51:04.108339 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kh6pp\" (UniqueName: \"kubernetes.io/projected/ef5f70d7-f0d1-4cac-805f-6a04ef9e6b6c-kube-api-access-kh6pp\") pod \"community-operators-5rr5w\" (UID: \"ef5f70d7-f0d1-4cac-805f-6a04ef9e6b6c\") " pod="openshift-marketplace/community-operators-5rr5w" Dec 16 08:51:04 crc kubenswrapper[4823]: I1216 08:51:04.155208 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-j2wmf" Dec 16 08:51:04 crc kubenswrapper[4823]: I1216 08:51:04.155264 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-j2wmf" Dec 16 08:51:04 crc kubenswrapper[4823]: I1216 08:51:04.200515 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5rr5w" Dec 16 08:51:04 crc kubenswrapper[4823]: I1216 08:51:04.211587 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-j2wmf" Dec 16 08:51:04 crc kubenswrapper[4823]: I1216 08:51:04.322359 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Dec 16 08:51:04 crc kubenswrapper[4823]: I1216 08:51:04.323970 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Dec 16 08:51:04 crc kubenswrapper[4823]: I1216 08:51:04.330387 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 16 08:51:04 crc kubenswrapper[4823]: I1216 08:51:04.330556 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-ptv87" Dec 16 08:51:04 crc kubenswrapper[4823]: I1216 08:51:04.330592 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Dec 16 08:51:04 crc kubenswrapper[4823]: I1216 08:51:04.331384 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Dec 16 08:51:04 crc kubenswrapper[4823]: I1216 08:51:04.331529 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Dec 16 08:51:04 crc kubenswrapper[4823]: I1216 08:51:04.393467 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64445002-15b9-4ec6-8c95-7c2bd33e0ecd-config\") pod \"ovn-northd-0\" (UID: \"64445002-15b9-4ec6-8c95-7c2bd33e0ecd\") " pod="openstack/ovn-northd-0" Dec 16 08:51:04 crc kubenswrapper[4823]: I1216 08:51:04.393636 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/64445002-15b9-4ec6-8c95-7c2bd33e0ecd-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"64445002-15b9-4ec6-8c95-7c2bd33e0ecd\") " pod="openstack/ovn-northd-0" Dec 16 08:51:04 crc kubenswrapper[4823]: I1216 08:51:04.393739 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64445002-15b9-4ec6-8c95-7c2bd33e0ecd-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"64445002-15b9-4ec6-8c95-7c2bd33e0ecd\") " 
pod="openstack/ovn-northd-0" Dec 16 08:51:04 crc kubenswrapper[4823]: I1216 08:51:04.393808 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/64445002-15b9-4ec6-8c95-7c2bd33e0ecd-scripts\") pod \"ovn-northd-0\" (UID: \"64445002-15b9-4ec6-8c95-7c2bd33e0ecd\") " pod="openstack/ovn-northd-0" Dec 16 08:51:04 crc kubenswrapper[4823]: I1216 08:51:04.393889 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/64445002-15b9-4ec6-8c95-7c2bd33e0ecd-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"64445002-15b9-4ec6-8c95-7c2bd33e0ecd\") " pod="openstack/ovn-northd-0" Dec 16 08:51:04 crc kubenswrapper[4823]: I1216 08:51:04.393929 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/64445002-15b9-4ec6-8c95-7c2bd33e0ecd-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"64445002-15b9-4ec6-8c95-7c2bd33e0ecd\") " pod="openstack/ovn-northd-0" Dec 16 08:51:04 crc kubenswrapper[4823]: I1216 08:51:04.393977 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xhmz\" (UniqueName: \"kubernetes.io/projected/64445002-15b9-4ec6-8c95-7c2bd33e0ecd-kube-api-access-8xhmz\") pod \"ovn-northd-0\" (UID: \"64445002-15b9-4ec6-8c95-7c2bd33e0ecd\") " pod="openstack/ovn-northd-0" Dec 16 08:51:04 crc kubenswrapper[4823]: I1216 08:51:04.494908 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/64445002-15b9-4ec6-8c95-7c2bd33e0ecd-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"64445002-15b9-4ec6-8c95-7c2bd33e0ecd\") " pod="openstack/ovn-northd-0" Dec 16 08:51:04 crc kubenswrapper[4823]: I1216 08:51:04.494951 4823 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/64445002-15b9-4ec6-8c95-7c2bd33e0ecd-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"64445002-15b9-4ec6-8c95-7c2bd33e0ecd\") " pod="openstack/ovn-northd-0" Dec 16 08:51:04 crc kubenswrapper[4823]: I1216 08:51:04.494982 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xhmz\" (UniqueName: \"kubernetes.io/projected/64445002-15b9-4ec6-8c95-7c2bd33e0ecd-kube-api-access-8xhmz\") pod \"ovn-northd-0\" (UID: \"64445002-15b9-4ec6-8c95-7c2bd33e0ecd\") " pod="openstack/ovn-northd-0" Dec 16 08:51:04 crc kubenswrapper[4823]: I1216 08:51:04.495013 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64445002-15b9-4ec6-8c95-7c2bd33e0ecd-config\") pod \"ovn-northd-0\" (UID: \"64445002-15b9-4ec6-8c95-7c2bd33e0ecd\") " pod="openstack/ovn-northd-0" Dec 16 08:51:04 crc kubenswrapper[4823]: I1216 08:51:04.495051 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/64445002-15b9-4ec6-8c95-7c2bd33e0ecd-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"64445002-15b9-4ec6-8c95-7c2bd33e0ecd\") " pod="openstack/ovn-northd-0" Dec 16 08:51:04 crc kubenswrapper[4823]: I1216 08:51:04.495107 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64445002-15b9-4ec6-8c95-7c2bd33e0ecd-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"64445002-15b9-4ec6-8c95-7c2bd33e0ecd\") " pod="openstack/ovn-northd-0" Dec 16 08:51:04 crc kubenswrapper[4823]: I1216 08:51:04.495148 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/64445002-15b9-4ec6-8c95-7c2bd33e0ecd-scripts\") pod \"ovn-northd-0\" (UID: 
\"64445002-15b9-4ec6-8c95-7c2bd33e0ecd\") " pod="openstack/ovn-northd-0" Dec 16 08:51:04 crc kubenswrapper[4823]: I1216 08:51:04.496231 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/64445002-15b9-4ec6-8c95-7c2bd33e0ecd-scripts\") pod \"ovn-northd-0\" (UID: \"64445002-15b9-4ec6-8c95-7c2bd33e0ecd\") " pod="openstack/ovn-northd-0" Dec 16 08:51:04 crc kubenswrapper[4823]: I1216 08:51:04.497826 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64445002-15b9-4ec6-8c95-7c2bd33e0ecd-config\") pod \"ovn-northd-0\" (UID: \"64445002-15b9-4ec6-8c95-7c2bd33e0ecd\") " pod="openstack/ovn-northd-0" Dec 16 08:51:04 crc kubenswrapper[4823]: I1216 08:51:04.516146 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64445002-15b9-4ec6-8c95-7c2bd33e0ecd-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"64445002-15b9-4ec6-8c95-7c2bd33e0ecd\") " pod="openstack/ovn-northd-0" Dec 16 08:51:04 crc kubenswrapper[4823]: I1216 08:51:04.517564 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/64445002-15b9-4ec6-8c95-7c2bd33e0ecd-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"64445002-15b9-4ec6-8c95-7c2bd33e0ecd\") " pod="openstack/ovn-northd-0" Dec 16 08:51:04 crc kubenswrapper[4823]: I1216 08:51:04.520947 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/64445002-15b9-4ec6-8c95-7c2bd33e0ecd-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"64445002-15b9-4ec6-8c95-7c2bd33e0ecd\") " pod="openstack/ovn-northd-0" Dec 16 08:51:04 crc kubenswrapper[4823]: I1216 08:51:04.524763 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xhmz\" (UniqueName: 
\"kubernetes.io/projected/64445002-15b9-4ec6-8c95-7c2bd33e0ecd-kube-api-access-8xhmz\") pod \"ovn-northd-0\" (UID: \"64445002-15b9-4ec6-8c95-7c2bd33e0ecd\") " pod="openstack/ovn-northd-0" Dec 16 08:51:04 crc kubenswrapper[4823]: I1216 08:51:04.532866 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/64445002-15b9-4ec6-8c95-7c2bd33e0ecd-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"64445002-15b9-4ec6-8c95-7c2bd33e0ecd\") " pod="openstack/ovn-northd-0" Dec 16 08:51:04 crc kubenswrapper[4823]: I1216 08:51:04.678765 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Dec 16 08:51:04 crc kubenswrapper[4823]: I1216 08:51:04.828938 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5rr5w"] Dec 16 08:51:05 crc kubenswrapper[4823]: I1216 08:51:05.180925 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5rr5w" event={"ID":"ef5f70d7-f0d1-4cac-805f-6a04ef9e6b6c","Type":"ContainerStarted","Data":"18a01e21746cb2466dd7929996540d8d5aae3ddc6595d3ced217a914154ce4f7"} Dec 16 08:51:05 crc kubenswrapper[4823]: I1216 08:51:05.181178 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5rr5w" event={"ID":"ef5f70d7-f0d1-4cac-805f-6a04ef9e6b6c","Type":"ContainerStarted","Data":"c8ef4c0b6dd990b858304e555f846a73338858d6d16cd5e44ab6290e1ce95e5f"} Dec 16 08:51:05 crc kubenswrapper[4823]: I1216 08:51:05.187588 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 16 08:51:05 crc kubenswrapper[4823]: W1216 08:51:05.226251 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod64445002_15b9_4ec6_8c95_7c2bd33e0ecd.slice/crio-e71013427b8ec529c188d0f556be5db5037f0ea0074d98f08688944c40eeec32 
WatchSource:0}: Error finding container e71013427b8ec529c188d0f556be5db5037f0ea0074d98f08688944c40eeec32: Status 404 returned error can't find the container with id e71013427b8ec529c188d0f556be5db5037f0ea0074d98f08688944c40eeec32 Dec 16 08:51:05 crc kubenswrapper[4823]: I1216 08:51:05.228838 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-j2wmf" Dec 16 08:51:06 crc kubenswrapper[4823]: I1216 08:51:06.187576 4823 generic.go:334] "Generic (PLEG): container finished" podID="ef5f70d7-f0d1-4cac-805f-6a04ef9e6b6c" containerID="18a01e21746cb2466dd7929996540d8d5aae3ddc6595d3ced217a914154ce4f7" exitCode=0 Dec 16 08:51:06 crc kubenswrapper[4823]: I1216 08:51:06.187648 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5rr5w" event={"ID":"ef5f70d7-f0d1-4cac-805f-6a04ef9e6b6c","Type":"ContainerDied","Data":"18a01e21746cb2466dd7929996540d8d5aae3ddc6595d3ced217a914154ce4f7"} Dec 16 08:51:06 crc kubenswrapper[4823]: I1216 08:51:06.189348 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"64445002-15b9-4ec6-8c95-7c2bd33e0ecd","Type":"ContainerStarted","Data":"e71013427b8ec529c188d0f556be5db5037f0ea0074d98f08688944c40eeec32"} Dec 16 08:51:06 crc kubenswrapper[4823]: I1216 08:51:06.450804 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-j2wmf"] Dec 16 08:51:07 crc kubenswrapper[4823]: I1216 08:51:07.197817 4823 generic.go:334] "Generic (PLEG): container finished" podID="ef5f70d7-f0d1-4cac-805f-6a04ef9e6b6c" containerID="7b07a81c08b9b38532419194b5fe40891b48867a4db00e3cb0fb61f1b814356f" exitCode=0 Dec 16 08:51:07 crc kubenswrapper[4823]: I1216 08:51:07.197881 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5rr5w" 
event={"ID":"ef5f70d7-f0d1-4cac-805f-6a04ef9e6b6c","Type":"ContainerDied","Data":"7b07a81c08b9b38532419194b5fe40891b48867a4db00e3cb0fb61f1b814356f"} Dec 16 08:51:07 crc kubenswrapper[4823]: I1216 08:51:07.201590 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"64445002-15b9-4ec6-8c95-7c2bd33e0ecd","Type":"ContainerStarted","Data":"c7585de23c8702a670fdde1b698b632bfc5040ed1281eb2e9f42a9174e0f40ca"} Dec 16 08:51:07 crc kubenswrapper[4823]: I1216 08:51:07.201651 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"64445002-15b9-4ec6-8c95-7c2bd33e0ecd","Type":"ContainerStarted","Data":"d14f2961c04bfe412ae181946bae7fa89e83576b7b38570699ef5a396fa77523"} Dec 16 08:51:07 crc kubenswrapper[4823]: I1216 08:51:07.201743 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-j2wmf" podUID="66d23cee-1793-4b9b-b1ca-df73dc0037aa" containerName="registry-server" containerID="cri-o://bab6f466d970f99d325b0f094cb48cc87dc57fcf2dc7306107a7995105a23e83" gracePeriod=2 Dec 16 08:51:07 crc kubenswrapper[4823]: I1216 08:51:07.642940 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-j2wmf" Dec 16 08:51:07 crc kubenswrapper[4823]: I1216 08:51:07.671938 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.894337381 podStartE2EDuration="3.671912312s" podCreationTimestamp="2025-12-16 08:51:04 +0000 UTC" firstStartedPulling="2025-12-16 08:51:05.230168494 +0000 UTC m=+6943.718734617" lastFinishedPulling="2025-12-16 08:51:06.007743415 +0000 UTC m=+6944.496309548" observedRunningTime="2025-12-16 08:51:07.246562437 +0000 UTC m=+6945.735128560" watchObservedRunningTime="2025-12-16 08:51:07.671912312 +0000 UTC m=+6946.160478445" Dec 16 08:51:07 crc kubenswrapper[4823]: I1216 08:51:07.764919 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66d23cee-1793-4b9b-b1ca-df73dc0037aa-utilities\") pod \"66d23cee-1793-4b9b-b1ca-df73dc0037aa\" (UID: \"66d23cee-1793-4b9b-b1ca-df73dc0037aa\") " Dec 16 08:51:07 crc kubenswrapper[4823]: I1216 08:51:07.765080 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-62gws\" (UniqueName: \"kubernetes.io/projected/66d23cee-1793-4b9b-b1ca-df73dc0037aa-kube-api-access-62gws\") pod \"66d23cee-1793-4b9b-b1ca-df73dc0037aa\" (UID: \"66d23cee-1793-4b9b-b1ca-df73dc0037aa\") " Dec 16 08:51:07 crc kubenswrapper[4823]: I1216 08:51:07.765193 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66d23cee-1793-4b9b-b1ca-df73dc0037aa-catalog-content\") pod \"66d23cee-1793-4b9b-b1ca-df73dc0037aa\" (UID: \"66d23cee-1793-4b9b-b1ca-df73dc0037aa\") " Dec 16 08:51:07 crc kubenswrapper[4823]: I1216 08:51:07.766792 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66d23cee-1793-4b9b-b1ca-df73dc0037aa-utilities" (OuterVolumeSpecName: 
"utilities") pod "66d23cee-1793-4b9b-b1ca-df73dc0037aa" (UID: "66d23cee-1793-4b9b-b1ca-df73dc0037aa"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 08:51:07 crc kubenswrapper[4823]: I1216 08:51:07.771326 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66d23cee-1793-4b9b-b1ca-df73dc0037aa-kube-api-access-62gws" (OuterVolumeSpecName: "kube-api-access-62gws") pod "66d23cee-1793-4b9b-b1ca-df73dc0037aa" (UID: "66d23cee-1793-4b9b-b1ca-df73dc0037aa"). InnerVolumeSpecName "kube-api-access-62gws". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:51:07 crc kubenswrapper[4823]: I1216 08:51:07.821320 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66d23cee-1793-4b9b-b1ca-df73dc0037aa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "66d23cee-1793-4b9b-b1ca-df73dc0037aa" (UID: "66d23cee-1793-4b9b-b1ca-df73dc0037aa"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 08:51:07 crc kubenswrapper[4823]: I1216 08:51:07.869314 4823 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66d23cee-1793-4b9b-b1ca-df73dc0037aa-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 08:51:07 crc kubenswrapper[4823]: I1216 08:51:07.869370 4823 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66d23cee-1793-4b9b-b1ca-df73dc0037aa-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 08:51:07 crc kubenswrapper[4823]: I1216 08:51:07.869383 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-62gws\" (UniqueName: \"kubernetes.io/projected/66d23cee-1793-4b9b-b1ca-df73dc0037aa-kube-api-access-62gws\") on node \"crc\" DevicePath \"\"" Dec 16 08:51:08 crc kubenswrapper[4823]: I1216 08:51:08.210351 4823 generic.go:334] "Generic (PLEG): container finished" podID="66d23cee-1793-4b9b-b1ca-df73dc0037aa" containerID="bab6f466d970f99d325b0f094cb48cc87dc57fcf2dc7306107a7995105a23e83" exitCode=0 Dec 16 08:51:08 crc kubenswrapper[4823]: I1216 08:51:08.210414 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j2wmf" event={"ID":"66d23cee-1793-4b9b-b1ca-df73dc0037aa","Type":"ContainerDied","Data":"bab6f466d970f99d325b0f094cb48cc87dc57fcf2dc7306107a7995105a23e83"} Dec 16 08:51:08 crc kubenswrapper[4823]: I1216 08:51:08.210444 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j2wmf" event={"ID":"66d23cee-1793-4b9b-b1ca-df73dc0037aa","Type":"ContainerDied","Data":"2a23d9ca15ddad7c8963249f72ce4e07e2899d80b52cef15d87f01a357ef7f94"} Dec 16 08:51:08 crc kubenswrapper[4823]: I1216 08:51:08.210461 4823 scope.go:117] "RemoveContainer" containerID="bab6f466d970f99d325b0f094cb48cc87dc57fcf2dc7306107a7995105a23e83" Dec 16 08:51:08 crc kubenswrapper[4823]: I1216 
08:51:08.211199 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-j2wmf" Dec 16 08:51:08 crc kubenswrapper[4823]: I1216 08:51:08.213077 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5rr5w" event={"ID":"ef5f70d7-f0d1-4cac-805f-6a04ef9e6b6c","Type":"ContainerStarted","Data":"12cba11024c89b28e22ce7989eec125e4152de7a50535c26c64a45517eacb994"} Dec 16 08:51:08 crc kubenswrapper[4823]: I1216 08:51:08.213179 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Dec 16 08:51:08 crc kubenswrapper[4823]: I1216 08:51:08.233401 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-5rr5w" podStartSLOduration=2.769348414 podStartE2EDuration="5.233380269s" podCreationTimestamp="2025-12-16 08:51:03 +0000 UTC" firstStartedPulling="2025-12-16 08:51:05.183235695 +0000 UTC m=+6943.671801838" lastFinishedPulling="2025-12-16 08:51:07.64726756 +0000 UTC m=+6946.135833693" observedRunningTime="2025-12-16 08:51:08.232688907 +0000 UTC m=+6946.721255050" watchObservedRunningTime="2025-12-16 08:51:08.233380269 +0000 UTC m=+6946.721946392" Dec 16 08:51:08 crc kubenswrapper[4823]: I1216 08:51:08.236643 4823 scope.go:117] "RemoveContainer" containerID="89ed551e87fdf3bb6ecd0557f0bceb8e48a8c31f28056f43a32232bb6fb3abfe" Dec 16 08:51:08 crc kubenswrapper[4823]: I1216 08:51:08.254072 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-j2wmf"] Dec 16 08:51:08 crc kubenswrapper[4823]: I1216 08:51:08.260328 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-j2wmf"] Dec 16 08:51:08 crc kubenswrapper[4823]: I1216 08:51:08.272004 4823 scope.go:117] "RemoveContainer" containerID="a46889f03f1f51668c4ed58ecc376776cd02e1652876a99f6212994ed195a0a2" Dec 16 08:51:08 crc 
kubenswrapper[4823]: I1216 08:51:08.290132 4823 scope.go:117] "RemoveContainer" containerID="bab6f466d970f99d325b0f094cb48cc87dc57fcf2dc7306107a7995105a23e83" Dec 16 08:51:08 crc kubenswrapper[4823]: E1216 08:51:08.291198 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bab6f466d970f99d325b0f094cb48cc87dc57fcf2dc7306107a7995105a23e83\": container with ID starting with bab6f466d970f99d325b0f094cb48cc87dc57fcf2dc7306107a7995105a23e83 not found: ID does not exist" containerID="bab6f466d970f99d325b0f094cb48cc87dc57fcf2dc7306107a7995105a23e83" Dec 16 08:51:08 crc kubenswrapper[4823]: I1216 08:51:08.291241 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bab6f466d970f99d325b0f094cb48cc87dc57fcf2dc7306107a7995105a23e83"} err="failed to get container status \"bab6f466d970f99d325b0f094cb48cc87dc57fcf2dc7306107a7995105a23e83\": rpc error: code = NotFound desc = could not find container \"bab6f466d970f99d325b0f094cb48cc87dc57fcf2dc7306107a7995105a23e83\": container with ID starting with bab6f466d970f99d325b0f094cb48cc87dc57fcf2dc7306107a7995105a23e83 not found: ID does not exist" Dec 16 08:51:08 crc kubenswrapper[4823]: I1216 08:51:08.291269 4823 scope.go:117] "RemoveContainer" containerID="89ed551e87fdf3bb6ecd0557f0bceb8e48a8c31f28056f43a32232bb6fb3abfe" Dec 16 08:51:08 crc kubenswrapper[4823]: E1216 08:51:08.291548 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89ed551e87fdf3bb6ecd0557f0bceb8e48a8c31f28056f43a32232bb6fb3abfe\": container with ID starting with 89ed551e87fdf3bb6ecd0557f0bceb8e48a8c31f28056f43a32232bb6fb3abfe not found: ID does not exist" containerID="89ed551e87fdf3bb6ecd0557f0bceb8e48a8c31f28056f43a32232bb6fb3abfe" Dec 16 08:51:08 crc kubenswrapper[4823]: I1216 08:51:08.291572 4823 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"89ed551e87fdf3bb6ecd0557f0bceb8e48a8c31f28056f43a32232bb6fb3abfe"} err="failed to get container status \"89ed551e87fdf3bb6ecd0557f0bceb8e48a8c31f28056f43a32232bb6fb3abfe\": rpc error: code = NotFound desc = could not find container \"89ed551e87fdf3bb6ecd0557f0bceb8e48a8c31f28056f43a32232bb6fb3abfe\": container with ID starting with 89ed551e87fdf3bb6ecd0557f0bceb8e48a8c31f28056f43a32232bb6fb3abfe not found: ID does not exist" Dec 16 08:51:08 crc kubenswrapper[4823]: I1216 08:51:08.291585 4823 scope.go:117] "RemoveContainer" containerID="a46889f03f1f51668c4ed58ecc376776cd02e1652876a99f6212994ed195a0a2" Dec 16 08:51:08 crc kubenswrapper[4823]: E1216 08:51:08.292013 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a46889f03f1f51668c4ed58ecc376776cd02e1652876a99f6212994ed195a0a2\": container with ID starting with a46889f03f1f51668c4ed58ecc376776cd02e1652876a99f6212994ed195a0a2 not found: ID does not exist" containerID="a46889f03f1f51668c4ed58ecc376776cd02e1652876a99f6212994ed195a0a2" Dec 16 08:51:08 crc kubenswrapper[4823]: I1216 08:51:08.292078 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a46889f03f1f51668c4ed58ecc376776cd02e1652876a99f6212994ed195a0a2"} err="failed to get container status \"a46889f03f1f51668c4ed58ecc376776cd02e1652876a99f6212994ed195a0a2\": rpc error: code = NotFound desc = could not find container \"a46889f03f1f51668c4ed58ecc376776cd02e1652876a99f6212994ed195a0a2\": container with ID starting with a46889f03f1f51668c4ed58ecc376776cd02e1652876a99f6212994ed195a0a2 not found: ID does not exist" Dec 16 08:51:09 crc kubenswrapper[4823]: I1216 08:51:09.780842 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66d23cee-1793-4b9b-b1ca-df73dc0037aa" path="/var/lib/kubelet/pods/66d23cee-1793-4b9b-b1ca-df73dc0037aa/volumes" Dec 16 08:51:12 crc kubenswrapper[4823]: I1216 
08:51:12.104521 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-jthq4"] Dec 16 08:51:12 crc kubenswrapper[4823]: E1216 08:51:12.106109 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66d23cee-1793-4b9b-b1ca-df73dc0037aa" containerName="registry-server" Dec 16 08:51:12 crc kubenswrapper[4823]: I1216 08:51:12.106202 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="66d23cee-1793-4b9b-b1ca-df73dc0037aa" containerName="registry-server" Dec 16 08:51:12 crc kubenswrapper[4823]: E1216 08:51:12.106299 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66d23cee-1793-4b9b-b1ca-df73dc0037aa" containerName="extract-utilities" Dec 16 08:51:12 crc kubenswrapper[4823]: I1216 08:51:12.106379 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="66d23cee-1793-4b9b-b1ca-df73dc0037aa" containerName="extract-utilities" Dec 16 08:51:12 crc kubenswrapper[4823]: E1216 08:51:12.106441 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66d23cee-1793-4b9b-b1ca-df73dc0037aa" containerName="extract-content" Dec 16 08:51:12 crc kubenswrapper[4823]: I1216 08:51:12.106505 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="66d23cee-1793-4b9b-b1ca-df73dc0037aa" containerName="extract-content" Dec 16 08:51:12 crc kubenswrapper[4823]: I1216 08:51:12.106724 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="66d23cee-1793-4b9b-b1ca-df73dc0037aa" containerName="registry-server" Dec 16 08:51:12 crc kubenswrapper[4823]: I1216 08:51:12.107379 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-jthq4" Dec 16 08:51:12 crc kubenswrapper[4823]: I1216 08:51:12.112353 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-jthq4"] Dec 16 08:51:12 crc kubenswrapper[4823]: I1216 08:51:12.138991 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wr82\" (UniqueName: \"kubernetes.io/projected/1954da4f-6645-49fa-89b4-1e3f2c0284b2-kube-api-access-7wr82\") pod \"keystone-db-create-jthq4\" (UID: \"1954da4f-6645-49fa-89b4-1e3f2c0284b2\") " pod="openstack/keystone-db-create-jthq4" Dec 16 08:51:12 crc kubenswrapper[4823]: I1216 08:51:12.139310 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1954da4f-6645-49fa-89b4-1e3f2c0284b2-operator-scripts\") pod \"keystone-db-create-jthq4\" (UID: \"1954da4f-6645-49fa-89b4-1e3f2c0284b2\") " pod="openstack/keystone-db-create-jthq4" Dec 16 08:51:12 crc kubenswrapper[4823]: I1216 08:51:12.175009 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-b1da-account-create-update-qx8k4"] Dec 16 08:51:12 crc kubenswrapper[4823]: I1216 08:51:12.176079 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-b1da-account-create-update-qx8k4" Dec 16 08:51:12 crc kubenswrapper[4823]: I1216 08:51:12.184352 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-b1da-account-create-update-qx8k4"] Dec 16 08:51:12 crc kubenswrapper[4823]: I1216 08:51:12.193401 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Dec 16 08:51:12 crc kubenswrapper[4823]: I1216 08:51:12.240783 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/17b9ddf1-8b90-4e7d-8edf-09b5d2b2cece-operator-scripts\") pod \"keystone-b1da-account-create-update-qx8k4\" (UID: \"17b9ddf1-8b90-4e7d-8edf-09b5d2b2cece\") " pod="openstack/keystone-b1da-account-create-update-qx8k4" Dec 16 08:51:12 crc kubenswrapper[4823]: I1216 08:51:12.241067 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqg6f\" (UniqueName: \"kubernetes.io/projected/17b9ddf1-8b90-4e7d-8edf-09b5d2b2cece-kube-api-access-pqg6f\") pod \"keystone-b1da-account-create-update-qx8k4\" (UID: \"17b9ddf1-8b90-4e7d-8edf-09b5d2b2cece\") " pod="openstack/keystone-b1da-account-create-update-qx8k4" Dec 16 08:51:12 crc kubenswrapper[4823]: I1216 08:51:12.241168 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wr82\" (UniqueName: \"kubernetes.io/projected/1954da4f-6645-49fa-89b4-1e3f2c0284b2-kube-api-access-7wr82\") pod \"keystone-db-create-jthq4\" (UID: \"1954da4f-6645-49fa-89b4-1e3f2c0284b2\") " pod="openstack/keystone-db-create-jthq4" Dec 16 08:51:12 crc kubenswrapper[4823]: I1216 08:51:12.241280 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1954da4f-6645-49fa-89b4-1e3f2c0284b2-operator-scripts\") pod \"keystone-db-create-jthq4\" 
(UID: \"1954da4f-6645-49fa-89b4-1e3f2c0284b2\") " pod="openstack/keystone-db-create-jthq4" Dec 16 08:51:12 crc kubenswrapper[4823]: I1216 08:51:12.249425 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1954da4f-6645-49fa-89b4-1e3f2c0284b2-operator-scripts\") pod \"keystone-db-create-jthq4\" (UID: \"1954da4f-6645-49fa-89b4-1e3f2c0284b2\") " pod="openstack/keystone-db-create-jthq4" Dec 16 08:51:12 crc kubenswrapper[4823]: I1216 08:51:12.271616 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wr82\" (UniqueName: \"kubernetes.io/projected/1954da4f-6645-49fa-89b4-1e3f2c0284b2-kube-api-access-7wr82\") pod \"keystone-db-create-jthq4\" (UID: \"1954da4f-6645-49fa-89b4-1e3f2c0284b2\") " pod="openstack/keystone-db-create-jthq4" Dec 16 08:51:12 crc kubenswrapper[4823]: I1216 08:51:12.343079 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/17b9ddf1-8b90-4e7d-8edf-09b5d2b2cece-operator-scripts\") pod \"keystone-b1da-account-create-update-qx8k4\" (UID: \"17b9ddf1-8b90-4e7d-8edf-09b5d2b2cece\") " pod="openstack/keystone-b1da-account-create-update-qx8k4" Dec 16 08:51:12 crc kubenswrapper[4823]: I1216 08:51:12.343148 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pqg6f\" (UniqueName: \"kubernetes.io/projected/17b9ddf1-8b90-4e7d-8edf-09b5d2b2cece-kube-api-access-pqg6f\") pod \"keystone-b1da-account-create-update-qx8k4\" (UID: \"17b9ddf1-8b90-4e7d-8edf-09b5d2b2cece\") " pod="openstack/keystone-b1da-account-create-update-qx8k4" Dec 16 08:51:12 crc kubenswrapper[4823]: I1216 08:51:12.343928 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/17b9ddf1-8b90-4e7d-8edf-09b5d2b2cece-operator-scripts\") pod 
\"keystone-b1da-account-create-update-qx8k4\" (UID: \"17b9ddf1-8b90-4e7d-8edf-09b5d2b2cece\") " pod="openstack/keystone-b1da-account-create-update-qx8k4" Dec 16 08:51:12 crc kubenswrapper[4823]: I1216 08:51:12.358801 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqg6f\" (UniqueName: \"kubernetes.io/projected/17b9ddf1-8b90-4e7d-8edf-09b5d2b2cece-kube-api-access-pqg6f\") pod \"keystone-b1da-account-create-update-qx8k4\" (UID: \"17b9ddf1-8b90-4e7d-8edf-09b5d2b2cece\") " pod="openstack/keystone-b1da-account-create-update-qx8k4" Dec 16 08:51:12 crc kubenswrapper[4823]: I1216 08:51:12.446348 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-jthq4" Dec 16 08:51:12 crc kubenswrapper[4823]: I1216 08:51:12.496659 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-b1da-account-create-update-qx8k4" Dec 16 08:51:13 crc kubenswrapper[4823]: I1216 08:51:13.011175 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-jthq4"] Dec 16 08:51:13 crc kubenswrapper[4823]: W1216 08:51:13.029341 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1954da4f_6645_49fa_89b4_1e3f2c0284b2.slice/crio-5deea188059f17acf2b040c756fec62e9f3528e57aac3efd1325d8a9a3b7c304 WatchSource:0}: Error finding container 5deea188059f17acf2b040c756fec62e9f3528e57aac3efd1325d8a9a3b7c304: Status 404 returned error can't find the container with id 5deea188059f17acf2b040c756fec62e9f3528e57aac3efd1325d8a9a3b7c304 Dec 16 08:51:13 crc kubenswrapper[4823]: I1216 08:51:13.069739 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-b1da-account-create-update-qx8k4"] Dec 16 08:51:13 crc kubenswrapper[4823]: W1216 08:51:13.075335 4823 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod17b9ddf1_8b90_4e7d_8edf_09b5d2b2cece.slice/crio-2dc8e2522d270eb18c59bd33b6cae1cedf848ebaec8864072fa4b87b50a8370b WatchSource:0}: Error finding container 2dc8e2522d270eb18c59bd33b6cae1cedf848ebaec8864072fa4b87b50a8370b: Status 404 returned error can't find the container with id 2dc8e2522d270eb18c59bd33b6cae1cedf848ebaec8864072fa4b87b50a8370b Dec 16 08:51:13 crc kubenswrapper[4823]: I1216 08:51:13.262467 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-jthq4" event={"ID":"1954da4f-6645-49fa-89b4-1e3f2c0284b2","Type":"ContainerStarted","Data":"5deea188059f17acf2b040c756fec62e9f3528e57aac3efd1325d8a9a3b7c304"} Dec 16 08:51:13 crc kubenswrapper[4823]: I1216 08:51:13.265123 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-b1da-account-create-update-qx8k4" event={"ID":"17b9ddf1-8b90-4e7d-8edf-09b5d2b2cece","Type":"ContainerStarted","Data":"2dc8e2522d270eb18c59bd33b6cae1cedf848ebaec8864072fa4b87b50a8370b"} Dec 16 08:51:14 crc kubenswrapper[4823]: I1216 08:51:14.202109 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-5rr5w" Dec 16 08:51:14 crc kubenswrapper[4823]: I1216 08:51:14.202520 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-5rr5w" Dec 16 08:51:14 crc kubenswrapper[4823]: I1216 08:51:14.268111 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-5rr5w" Dec 16 08:51:14 crc kubenswrapper[4823]: I1216 08:51:14.276998 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-jthq4" event={"ID":"1954da4f-6645-49fa-89b4-1e3f2c0284b2","Type":"ContainerStarted","Data":"4ec0f7c2e134442b675d08bc3e4fc7f0885c233878469d1da39df7e1df12e44f"} Dec 16 08:51:14 crc kubenswrapper[4823]: I1216 08:51:14.279971 
4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-b1da-account-create-update-qx8k4" event={"ID":"17b9ddf1-8b90-4e7d-8edf-09b5d2b2cece","Type":"ContainerStarted","Data":"26c7db70f6316543ff12cfb682c93e0642226e2bffa92bcddd444bcecc4ea3f0"} Dec 16 08:51:14 crc kubenswrapper[4823]: I1216 08:51:14.311887 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-b1da-account-create-update-qx8k4" podStartSLOduration=2.311863556 podStartE2EDuration="2.311863556s" podCreationTimestamp="2025-12-16 08:51:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 08:51:14.304938878 +0000 UTC m=+6952.793505021" watchObservedRunningTime="2025-12-16 08:51:14.311863556 +0000 UTC m=+6952.800429719" Dec 16 08:51:14 crc kubenswrapper[4823]: I1216 08:51:14.330416 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-create-jthq4" podStartSLOduration=2.330392635 podStartE2EDuration="2.330392635s" podCreationTimestamp="2025-12-16 08:51:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 08:51:14.319136533 +0000 UTC m=+6952.807702666" watchObservedRunningTime="2025-12-16 08:51:14.330392635 +0000 UTC m=+6952.818958768" Dec 16 08:51:14 crc kubenswrapper[4823]: I1216 08:51:14.347846 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-5rr5w" Dec 16 08:51:14 crc kubenswrapper[4823]: I1216 08:51:14.503159 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5rr5w"] Dec 16 08:51:15 crc kubenswrapper[4823]: I1216 08:51:15.293489 4823 generic.go:334] "Generic (PLEG): container finished" podID="1954da4f-6645-49fa-89b4-1e3f2c0284b2" 
containerID="4ec0f7c2e134442b675d08bc3e4fc7f0885c233878469d1da39df7e1df12e44f" exitCode=0 Dec 16 08:51:15 crc kubenswrapper[4823]: I1216 08:51:15.294470 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-jthq4" event={"ID":"1954da4f-6645-49fa-89b4-1e3f2c0284b2","Type":"ContainerDied","Data":"4ec0f7c2e134442b675d08bc3e4fc7f0885c233878469d1da39df7e1df12e44f"} Dec 16 08:51:15 crc kubenswrapper[4823]: I1216 08:51:15.296355 4823 generic.go:334] "Generic (PLEG): container finished" podID="17b9ddf1-8b90-4e7d-8edf-09b5d2b2cece" containerID="26c7db70f6316543ff12cfb682c93e0642226e2bffa92bcddd444bcecc4ea3f0" exitCode=0 Dec 16 08:51:15 crc kubenswrapper[4823]: I1216 08:51:15.296451 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-b1da-account-create-update-qx8k4" event={"ID":"17b9ddf1-8b90-4e7d-8edf-09b5d2b2cece","Type":"ContainerDied","Data":"26c7db70f6316543ff12cfb682c93e0642226e2bffa92bcddd444bcecc4ea3f0"} Dec 16 08:51:16 crc kubenswrapper[4823]: I1216 08:51:16.306878 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-5rr5w" podUID="ef5f70d7-f0d1-4cac-805f-6a04ef9e6b6c" containerName="registry-server" containerID="cri-o://12cba11024c89b28e22ce7989eec125e4152de7a50535c26c64a45517eacb994" gracePeriod=2 Dec 16 08:51:16 crc kubenswrapper[4823]: I1216 08:51:16.794978 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-b1da-account-create-update-qx8k4" Dec 16 08:51:16 crc kubenswrapper[4823]: I1216 08:51:16.800275 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-jthq4" Dec 16 08:51:16 crc kubenswrapper[4823]: I1216 08:51:16.809883 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5rr5w" Dec 16 08:51:16 crc kubenswrapper[4823]: I1216 08:51:16.865875 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1954da4f-6645-49fa-89b4-1e3f2c0284b2-operator-scripts\") pod \"1954da4f-6645-49fa-89b4-1e3f2c0284b2\" (UID: \"1954da4f-6645-49fa-89b4-1e3f2c0284b2\") " Dec 16 08:51:16 crc kubenswrapper[4823]: I1216 08:51:16.865966 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/17b9ddf1-8b90-4e7d-8edf-09b5d2b2cece-operator-scripts\") pod \"17b9ddf1-8b90-4e7d-8edf-09b5d2b2cece\" (UID: \"17b9ddf1-8b90-4e7d-8edf-09b5d2b2cece\") " Dec 16 08:51:16 crc kubenswrapper[4823]: I1216 08:51:16.866010 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7wr82\" (UniqueName: \"kubernetes.io/projected/1954da4f-6645-49fa-89b4-1e3f2c0284b2-kube-api-access-7wr82\") pod \"1954da4f-6645-49fa-89b4-1e3f2c0284b2\" (UID: \"1954da4f-6645-49fa-89b4-1e3f2c0284b2\") " Dec 16 08:51:16 crc kubenswrapper[4823]: I1216 08:51:16.866707 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pqg6f\" (UniqueName: \"kubernetes.io/projected/17b9ddf1-8b90-4e7d-8edf-09b5d2b2cece-kube-api-access-pqg6f\") pod \"17b9ddf1-8b90-4e7d-8edf-09b5d2b2cece\" (UID: \"17b9ddf1-8b90-4e7d-8edf-09b5d2b2cece\") " Dec 16 08:51:16 crc kubenswrapper[4823]: I1216 08:51:16.866959 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17b9ddf1-8b90-4e7d-8edf-09b5d2b2cece-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "17b9ddf1-8b90-4e7d-8edf-09b5d2b2cece" (UID: "17b9ddf1-8b90-4e7d-8edf-09b5d2b2cece"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:51:16 crc kubenswrapper[4823]: I1216 08:51:16.867291 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1954da4f-6645-49fa-89b4-1e3f2c0284b2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1954da4f-6645-49fa-89b4-1e3f2c0284b2" (UID: "1954da4f-6645-49fa-89b4-1e3f2c0284b2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:51:16 crc kubenswrapper[4823]: I1216 08:51:16.867466 4823 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1954da4f-6645-49fa-89b4-1e3f2c0284b2-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 08:51:16 crc kubenswrapper[4823]: I1216 08:51:16.867482 4823 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/17b9ddf1-8b90-4e7d-8edf-09b5d2b2cece-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 08:51:16 crc kubenswrapper[4823]: I1216 08:51:16.873212 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17b9ddf1-8b90-4e7d-8edf-09b5d2b2cece-kube-api-access-pqg6f" (OuterVolumeSpecName: "kube-api-access-pqg6f") pod "17b9ddf1-8b90-4e7d-8edf-09b5d2b2cece" (UID: "17b9ddf1-8b90-4e7d-8edf-09b5d2b2cece"). InnerVolumeSpecName "kube-api-access-pqg6f". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:51:16 crc kubenswrapper[4823]: I1216 08:51:16.873245 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1954da4f-6645-49fa-89b4-1e3f2c0284b2-kube-api-access-7wr82" (OuterVolumeSpecName: "kube-api-access-7wr82") pod "1954da4f-6645-49fa-89b4-1e3f2c0284b2" (UID: "1954da4f-6645-49fa-89b4-1e3f2c0284b2"). InnerVolumeSpecName "kube-api-access-7wr82". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:51:16 crc kubenswrapper[4823]: I1216 08:51:16.968705 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef5f70d7-f0d1-4cac-805f-6a04ef9e6b6c-catalog-content\") pod \"ef5f70d7-f0d1-4cac-805f-6a04ef9e6b6c\" (UID: \"ef5f70d7-f0d1-4cac-805f-6a04ef9e6b6c\") " Dec 16 08:51:16 crc kubenswrapper[4823]: I1216 08:51:16.968808 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kh6pp\" (UniqueName: \"kubernetes.io/projected/ef5f70d7-f0d1-4cac-805f-6a04ef9e6b6c-kube-api-access-kh6pp\") pod \"ef5f70d7-f0d1-4cac-805f-6a04ef9e6b6c\" (UID: \"ef5f70d7-f0d1-4cac-805f-6a04ef9e6b6c\") " Dec 16 08:51:16 crc kubenswrapper[4823]: I1216 08:51:16.968968 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef5f70d7-f0d1-4cac-805f-6a04ef9e6b6c-utilities\") pod \"ef5f70d7-f0d1-4cac-805f-6a04ef9e6b6c\" (UID: \"ef5f70d7-f0d1-4cac-805f-6a04ef9e6b6c\") " Dec 16 08:51:16 crc kubenswrapper[4823]: I1216 08:51:16.969281 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7wr82\" (UniqueName: \"kubernetes.io/projected/1954da4f-6645-49fa-89b4-1e3f2c0284b2-kube-api-access-7wr82\") on node \"crc\" DevicePath \"\"" Dec 16 08:51:16 crc kubenswrapper[4823]: I1216 08:51:16.969300 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pqg6f\" (UniqueName: \"kubernetes.io/projected/17b9ddf1-8b90-4e7d-8edf-09b5d2b2cece-kube-api-access-pqg6f\") on node \"crc\" DevicePath \"\"" Dec 16 08:51:16 crc kubenswrapper[4823]: I1216 08:51:16.969681 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef5f70d7-f0d1-4cac-805f-6a04ef9e6b6c-utilities" (OuterVolumeSpecName: "utilities") pod "ef5f70d7-f0d1-4cac-805f-6a04ef9e6b6c" (UID: 
"ef5f70d7-f0d1-4cac-805f-6a04ef9e6b6c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 08:51:16 crc kubenswrapper[4823]: I1216 08:51:16.975204 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef5f70d7-f0d1-4cac-805f-6a04ef9e6b6c-kube-api-access-kh6pp" (OuterVolumeSpecName: "kube-api-access-kh6pp") pod "ef5f70d7-f0d1-4cac-805f-6a04ef9e6b6c" (UID: "ef5f70d7-f0d1-4cac-805f-6a04ef9e6b6c"). InnerVolumeSpecName "kube-api-access-kh6pp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:51:17 crc kubenswrapper[4823]: I1216 08:51:17.015975 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef5f70d7-f0d1-4cac-805f-6a04ef9e6b6c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ef5f70d7-f0d1-4cac-805f-6a04ef9e6b6c" (UID: "ef5f70d7-f0d1-4cac-805f-6a04ef9e6b6c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 08:51:17 crc kubenswrapper[4823]: I1216 08:51:17.071007 4823 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef5f70d7-f0d1-4cac-805f-6a04ef9e6b6c-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 08:51:17 crc kubenswrapper[4823]: I1216 08:51:17.071040 4823 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef5f70d7-f0d1-4cac-805f-6a04ef9e6b6c-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 08:51:17 crc kubenswrapper[4823]: I1216 08:51:17.071068 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kh6pp\" (UniqueName: \"kubernetes.io/projected/ef5f70d7-f0d1-4cac-805f-6a04ef9e6b6c-kube-api-access-kh6pp\") on node \"crc\" DevicePath \"\"" Dec 16 08:51:17 crc kubenswrapper[4823]: I1216 08:51:17.317794 4823 generic.go:334] "Generic (PLEG): container finished" 
podID="ef5f70d7-f0d1-4cac-805f-6a04ef9e6b6c" containerID="12cba11024c89b28e22ce7989eec125e4152de7a50535c26c64a45517eacb994" exitCode=0 Dec 16 08:51:17 crc kubenswrapper[4823]: I1216 08:51:17.317852 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5rr5w" event={"ID":"ef5f70d7-f0d1-4cac-805f-6a04ef9e6b6c","Type":"ContainerDied","Data":"12cba11024c89b28e22ce7989eec125e4152de7a50535c26c64a45517eacb994"} Dec 16 08:51:17 crc kubenswrapper[4823]: I1216 08:51:17.317888 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5rr5w" Dec 16 08:51:17 crc kubenswrapper[4823]: I1216 08:51:17.317921 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5rr5w" event={"ID":"ef5f70d7-f0d1-4cac-805f-6a04ef9e6b6c","Type":"ContainerDied","Data":"c8ef4c0b6dd990b858304e555f846a73338858d6d16cd5e44ab6290e1ce95e5f"} Dec 16 08:51:17 crc kubenswrapper[4823]: I1216 08:51:17.317949 4823 scope.go:117] "RemoveContainer" containerID="12cba11024c89b28e22ce7989eec125e4152de7a50535c26c64a45517eacb994" Dec 16 08:51:17 crc kubenswrapper[4823]: I1216 08:51:17.319609 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-jthq4" event={"ID":"1954da4f-6645-49fa-89b4-1e3f2c0284b2","Type":"ContainerDied","Data":"5deea188059f17acf2b040c756fec62e9f3528e57aac3efd1325d8a9a3b7c304"} Dec 16 08:51:17 crc kubenswrapper[4823]: I1216 08:51:17.319634 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5deea188059f17acf2b040c756fec62e9f3528e57aac3efd1325d8a9a3b7c304" Dec 16 08:51:17 crc kubenswrapper[4823]: I1216 08:51:17.319690 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-jthq4" Dec 16 08:51:17 crc kubenswrapper[4823]: I1216 08:51:17.322770 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-b1da-account-create-update-qx8k4" event={"ID":"17b9ddf1-8b90-4e7d-8edf-09b5d2b2cece","Type":"ContainerDied","Data":"2dc8e2522d270eb18c59bd33b6cae1cedf848ebaec8864072fa4b87b50a8370b"} Dec 16 08:51:17 crc kubenswrapper[4823]: I1216 08:51:17.322822 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2dc8e2522d270eb18c59bd33b6cae1cedf848ebaec8864072fa4b87b50a8370b" Dec 16 08:51:17 crc kubenswrapper[4823]: I1216 08:51:17.322891 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-b1da-account-create-update-qx8k4" Dec 16 08:51:17 crc kubenswrapper[4823]: I1216 08:51:17.347493 4823 scope.go:117] "RemoveContainer" containerID="7b07a81c08b9b38532419194b5fe40891b48867a4db00e3cb0fb61f1b814356f" Dec 16 08:51:17 crc kubenswrapper[4823]: I1216 08:51:17.376773 4823 scope.go:117] "RemoveContainer" containerID="18a01e21746cb2466dd7929996540d8d5aae3ddc6595d3ced217a914154ce4f7" Dec 16 08:51:17 crc kubenswrapper[4823]: I1216 08:51:17.381250 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5rr5w"] Dec 16 08:51:17 crc kubenswrapper[4823]: I1216 08:51:17.388613 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-5rr5w"] Dec 16 08:51:17 crc kubenswrapper[4823]: I1216 08:51:17.400765 4823 scope.go:117] "RemoveContainer" containerID="12cba11024c89b28e22ce7989eec125e4152de7a50535c26c64a45517eacb994" Dec 16 08:51:17 crc kubenswrapper[4823]: E1216 08:51:17.401314 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12cba11024c89b28e22ce7989eec125e4152de7a50535c26c64a45517eacb994\": container with ID starting with 
12cba11024c89b28e22ce7989eec125e4152de7a50535c26c64a45517eacb994 not found: ID does not exist" containerID="12cba11024c89b28e22ce7989eec125e4152de7a50535c26c64a45517eacb994" Dec 16 08:51:17 crc kubenswrapper[4823]: I1216 08:51:17.401359 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12cba11024c89b28e22ce7989eec125e4152de7a50535c26c64a45517eacb994"} err="failed to get container status \"12cba11024c89b28e22ce7989eec125e4152de7a50535c26c64a45517eacb994\": rpc error: code = NotFound desc = could not find container \"12cba11024c89b28e22ce7989eec125e4152de7a50535c26c64a45517eacb994\": container with ID starting with 12cba11024c89b28e22ce7989eec125e4152de7a50535c26c64a45517eacb994 not found: ID does not exist" Dec 16 08:51:17 crc kubenswrapper[4823]: I1216 08:51:17.401386 4823 scope.go:117] "RemoveContainer" containerID="7b07a81c08b9b38532419194b5fe40891b48867a4db00e3cb0fb61f1b814356f" Dec 16 08:51:17 crc kubenswrapper[4823]: E1216 08:51:17.401689 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b07a81c08b9b38532419194b5fe40891b48867a4db00e3cb0fb61f1b814356f\": container with ID starting with 7b07a81c08b9b38532419194b5fe40891b48867a4db00e3cb0fb61f1b814356f not found: ID does not exist" containerID="7b07a81c08b9b38532419194b5fe40891b48867a4db00e3cb0fb61f1b814356f" Dec 16 08:51:17 crc kubenswrapper[4823]: I1216 08:51:17.401777 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b07a81c08b9b38532419194b5fe40891b48867a4db00e3cb0fb61f1b814356f"} err="failed to get container status \"7b07a81c08b9b38532419194b5fe40891b48867a4db00e3cb0fb61f1b814356f\": rpc error: code = NotFound desc = could not find container \"7b07a81c08b9b38532419194b5fe40891b48867a4db00e3cb0fb61f1b814356f\": container with ID starting with 7b07a81c08b9b38532419194b5fe40891b48867a4db00e3cb0fb61f1b814356f not found: ID does not 
exist" Dec 16 08:51:17 crc kubenswrapper[4823]: I1216 08:51:17.401853 4823 scope.go:117] "RemoveContainer" containerID="18a01e21746cb2466dd7929996540d8d5aae3ddc6595d3ced217a914154ce4f7" Dec 16 08:51:17 crc kubenswrapper[4823]: E1216 08:51:17.402455 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18a01e21746cb2466dd7929996540d8d5aae3ddc6595d3ced217a914154ce4f7\": container with ID starting with 18a01e21746cb2466dd7929996540d8d5aae3ddc6595d3ced217a914154ce4f7 not found: ID does not exist" containerID="18a01e21746cb2466dd7929996540d8d5aae3ddc6595d3ced217a914154ce4f7" Dec 16 08:51:17 crc kubenswrapper[4823]: I1216 08:51:17.402531 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18a01e21746cb2466dd7929996540d8d5aae3ddc6595d3ced217a914154ce4f7"} err="failed to get container status \"18a01e21746cb2466dd7929996540d8d5aae3ddc6595d3ced217a914154ce4f7\": rpc error: code = NotFound desc = could not find container \"18a01e21746cb2466dd7929996540d8d5aae3ddc6595d3ced217a914154ce4f7\": container with ID starting with 18a01e21746cb2466dd7929996540d8d5aae3ddc6595d3ced217a914154ce4f7 not found: ID does not exist" Dec 16 08:51:17 crc kubenswrapper[4823]: I1216 08:51:17.800644 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef5f70d7-f0d1-4cac-805f-6a04ef9e6b6c" path="/var/lib/kubelet/pods/ef5f70d7-f0d1-4cac-805f-6a04ef9e6b6c/volumes" Dec 16 08:51:19 crc kubenswrapper[4823]: I1216 08:51:19.788734 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Dec 16 08:51:22 crc kubenswrapper[4823]: I1216 08:51:22.737815 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-xqn74"] Dec 16 08:51:22 crc kubenswrapper[4823]: E1216 08:51:22.738572 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef5f70d7-f0d1-4cac-805f-6a04ef9e6b6c" 
containerName="extract-utilities" Dec 16 08:51:22 crc kubenswrapper[4823]: I1216 08:51:22.738589 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef5f70d7-f0d1-4cac-805f-6a04ef9e6b6c" containerName="extract-utilities" Dec 16 08:51:22 crc kubenswrapper[4823]: E1216 08:51:22.738601 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17b9ddf1-8b90-4e7d-8edf-09b5d2b2cece" containerName="mariadb-account-create-update" Dec 16 08:51:22 crc kubenswrapper[4823]: I1216 08:51:22.738619 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="17b9ddf1-8b90-4e7d-8edf-09b5d2b2cece" containerName="mariadb-account-create-update" Dec 16 08:51:22 crc kubenswrapper[4823]: E1216 08:51:22.738635 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1954da4f-6645-49fa-89b4-1e3f2c0284b2" containerName="mariadb-database-create" Dec 16 08:51:22 crc kubenswrapper[4823]: I1216 08:51:22.738643 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="1954da4f-6645-49fa-89b4-1e3f2c0284b2" containerName="mariadb-database-create" Dec 16 08:51:22 crc kubenswrapper[4823]: E1216 08:51:22.738660 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef5f70d7-f0d1-4cac-805f-6a04ef9e6b6c" containerName="extract-content" Dec 16 08:51:22 crc kubenswrapper[4823]: I1216 08:51:22.738669 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef5f70d7-f0d1-4cac-805f-6a04ef9e6b6c" containerName="extract-content" Dec 16 08:51:22 crc kubenswrapper[4823]: E1216 08:51:22.738683 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef5f70d7-f0d1-4cac-805f-6a04ef9e6b6c" containerName="registry-server" Dec 16 08:51:22 crc kubenswrapper[4823]: I1216 08:51:22.738693 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef5f70d7-f0d1-4cac-805f-6a04ef9e6b6c" containerName="registry-server" Dec 16 08:51:22 crc kubenswrapper[4823]: I1216 08:51:22.738895 4823 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="ef5f70d7-f0d1-4cac-805f-6a04ef9e6b6c" containerName="registry-server" Dec 16 08:51:22 crc kubenswrapper[4823]: I1216 08:51:22.738918 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="1954da4f-6645-49fa-89b4-1e3f2c0284b2" containerName="mariadb-database-create" Dec 16 08:51:22 crc kubenswrapper[4823]: I1216 08:51:22.738933 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="17b9ddf1-8b90-4e7d-8edf-09b5d2b2cece" containerName="mariadb-account-create-update" Dec 16 08:51:22 crc kubenswrapper[4823]: I1216 08:51:22.739743 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-xqn74" Dec 16 08:51:22 crc kubenswrapper[4823]: I1216 08:51:22.743898 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 16 08:51:22 crc kubenswrapper[4823]: I1216 08:51:22.747506 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 16 08:51:22 crc kubenswrapper[4823]: I1216 08:51:22.747938 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 16 08:51:22 crc kubenswrapper[4823]: I1216 08:51:22.748541 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-xzvsn" Dec 16 08:51:22 crc kubenswrapper[4823]: I1216 08:51:22.779954 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-xqn74"] Dec 16 08:51:22 crc kubenswrapper[4823]: I1216 08:51:22.884095 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43e373e7-9d6c-44b2-86c5-3ed6e2c5b6d8-combined-ca-bundle\") pod \"keystone-db-sync-xqn74\" (UID: \"43e373e7-9d6c-44b2-86c5-3ed6e2c5b6d8\") " pod="openstack/keystone-db-sync-xqn74" Dec 16 08:51:22 crc kubenswrapper[4823]: I1216 08:51:22.884399 4823 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43e373e7-9d6c-44b2-86c5-3ed6e2c5b6d8-config-data\") pod \"keystone-db-sync-xqn74\" (UID: \"43e373e7-9d6c-44b2-86c5-3ed6e2c5b6d8\") " pod="openstack/keystone-db-sync-xqn74" Dec 16 08:51:22 crc kubenswrapper[4823]: I1216 08:51:22.884608 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2nzcm\" (UniqueName: \"kubernetes.io/projected/43e373e7-9d6c-44b2-86c5-3ed6e2c5b6d8-kube-api-access-2nzcm\") pod \"keystone-db-sync-xqn74\" (UID: \"43e373e7-9d6c-44b2-86c5-3ed6e2c5b6d8\") " pod="openstack/keystone-db-sync-xqn74" Dec 16 08:51:22 crc kubenswrapper[4823]: I1216 08:51:22.986940 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2nzcm\" (UniqueName: \"kubernetes.io/projected/43e373e7-9d6c-44b2-86c5-3ed6e2c5b6d8-kube-api-access-2nzcm\") pod \"keystone-db-sync-xqn74\" (UID: \"43e373e7-9d6c-44b2-86c5-3ed6e2c5b6d8\") " pod="openstack/keystone-db-sync-xqn74" Dec 16 08:51:22 crc kubenswrapper[4823]: I1216 08:51:22.987112 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43e373e7-9d6c-44b2-86c5-3ed6e2c5b6d8-combined-ca-bundle\") pod \"keystone-db-sync-xqn74\" (UID: \"43e373e7-9d6c-44b2-86c5-3ed6e2c5b6d8\") " pod="openstack/keystone-db-sync-xqn74" Dec 16 08:51:22 crc kubenswrapper[4823]: I1216 08:51:22.987183 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43e373e7-9d6c-44b2-86c5-3ed6e2c5b6d8-config-data\") pod \"keystone-db-sync-xqn74\" (UID: \"43e373e7-9d6c-44b2-86c5-3ed6e2c5b6d8\") " pod="openstack/keystone-db-sync-xqn74" Dec 16 08:51:22 crc kubenswrapper[4823]: I1216 08:51:22.994694 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43e373e7-9d6c-44b2-86c5-3ed6e2c5b6d8-combined-ca-bundle\") pod \"keystone-db-sync-xqn74\" (UID: \"43e373e7-9d6c-44b2-86c5-3ed6e2c5b6d8\") " pod="openstack/keystone-db-sync-xqn74" Dec 16 08:51:22 crc kubenswrapper[4823]: I1216 08:51:22.994871 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43e373e7-9d6c-44b2-86c5-3ed6e2c5b6d8-config-data\") pod \"keystone-db-sync-xqn74\" (UID: \"43e373e7-9d6c-44b2-86c5-3ed6e2c5b6d8\") " pod="openstack/keystone-db-sync-xqn74" Dec 16 08:51:23 crc kubenswrapper[4823]: I1216 08:51:23.007888 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2nzcm\" (UniqueName: \"kubernetes.io/projected/43e373e7-9d6c-44b2-86c5-3ed6e2c5b6d8-kube-api-access-2nzcm\") pod \"keystone-db-sync-xqn74\" (UID: \"43e373e7-9d6c-44b2-86c5-3ed6e2c5b6d8\") " pod="openstack/keystone-db-sync-xqn74" Dec 16 08:51:23 crc kubenswrapper[4823]: I1216 08:51:23.064337 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-xqn74" Dec 16 08:51:23 crc kubenswrapper[4823]: I1216 08:51:23.519136 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-xqn74"] Dec 16 08:51:23 crc kubenswrapper[4823]: W1216 08:51:23.528386 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod43e373e7_9d6c_44b2_86c5_3ed6e2c5b6d8.slice/crio-377fb48945018f46ff50aae286df8ea7376d45fb70fabe63462162e2ed29df30 WatchSource:0}: Error finding container 377fb48945018f46ff50aae286df8ea7376d45fb70fabe63462162e2ed29df30: Status 404 returned error can't find the container with id 377fb48945018f46ff50aae286df8ea7376d45fb70fabe63462162e2ed29df30 Dec 16 08:51:24 crc kubenswrapper[4823]: I1216 08:51:24.389494 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-xqn74" event={"ID":"43e373e7-9d6c-44b2-86c5-3ed6e2c5b6d8","Type":"ContainerStarted","Data":"377fb48945018f46ff50aae286df8ea7376d45fb70fabe63462162e2ed29df30"} Dec 16 08:51:29 crc kubenswrapper[4823]: I1216 08:51:29.462156 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-xqn74" event={"ID":"43e373e7-9d6c-44b2-86c5-3ed6e2c5b6d8","Type":"ContainerStarted","Data":"5414e83a96e344b0a96036db40557c25edcf33693ffa4f01634717a9f4f1781f"} Dec 16 08:51:29 crc kubenswrapper[4823]: I1216 08:51:29.505459 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-xqn74" podStartSLOduration=2.647320996 podStartE2EDuration="7.505433599s" podCreationTimestamp="2025-12-16 08:51:22 +0000 UTC" firstStartedPulling="2025-12-16 08:51:23.532055192 +0000 UTC m=+6962.020621305" lastFinishedPulling="2025-12-16 08:51:28.390167795 +0000 UTC m=+6966.878733908" observedRunningTime="2025-12-16 08:51:29.494130865 +0000 UTC m=+6967.982697018" watchObservedRunningTime="2025-12-16 08:51:29.505433599 +0000 UTC 
m=+6967.993999752" Dec 16 08:51:30 crc kubenswrapper[4823]: I1216 08:51:30.471082 4823 generic.go:334] "Generic (PLEG): container finished" podID="43e373e7-9d6c-44b2-86c5-3ed6e2c5b6d8" containerID="5414e83a96e344b0a96036db40557c25edcf33693ffa4f01634717a9f4f1781f" exitCode=0 Dec 16 08:51:30 crc kubenswrapper[4823]: I1216 08:51:30.471158 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-xqn74" event={"ID":"43e373e7-9d6c-44b2-86c5-3ed6e2c5b6d8","Type":"ContainerDied","Data":"5414e83a96e344b0a96036db40557c25edcf33693ffa4f01634717a9f4f1781f"} Dec 16 08:51:31 crc kubenswrapper[4823]: I1216 08:51:31.869282 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-xqn74" Dec 16 08:51:31 crc kubenswrapper[4823]: I1216 08:51:31.946559 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43e373e7-9d6c-44b2-86c5-3ed6e2c5b6d8-config-data\") pod \"43e373e7-9d6c-44b2-86c5-3ed6e2c5b6d8\" (UID: \"43e373e7-9d6c-44b2-86c5-3ed6e2c5b6d8\") " Dec 16 08:51:31 crc kubenswrapper[4823]: I1216 08:51:31.946594 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43e373e7-9d6c-44b2-86c5-3ed6e2c5b6d8-combined-ca-bundle\") pod \"43e373e7-9d6c-44b2-86c5-3ed6e2c5b6d8\" (UID: \"43e373e7-9d6c-44b2-86c5-3ed6e2c5b6d8\") " Dec 16 08:51:31 crc kubenswrapper[4823]: I1216 08:51:31.946616 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2nzcm\" (UniqueName: \"kubernetes.io/projected/43e373e7-9d6c-44b2-86c5-3ed6e2c5b6d8-kube-api-access-2nzcm\") pod \"43e373e7-9d6c-44b2-86c5-3ed6e2c5b6d8\" (UID: \"43e373e7-9d6c-44b2-86c5-3ed6e2c5b6d8\") " Dec 16 08:51:31 crc kubenswrapper[4823]: I1216 08:51:31.957345 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/43e373e7-9d6c-44b2-86c5-3ed6e2c5b6d8-kube-api-access-2nzcm" (OuterVolumeSpecName: "kube-api-access-2nzcm") pod "43e373e7-9d6c-44b2-86c5-3ed6e2c5b6d8" (UID: "43e373e7-9d6c-44b2-86c5-3ed6e2c5b6d8"). InnerVolumeSpecName "kube-api-access-2nzcm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:51:31 crc kubenswrapper[4823]: I1216 08:51:31.975936 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43e373e7-9d6c-44b2-86c5-3ed6e2c5b6d8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "43e373e7-9d6c-44b2-86c5-3ed6e2c5b6d8" (UID: "43e373e7-9d6c-44b2-86c5-3ed6e2c5b6d8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:51:32 crc kubenswrapper[4823]: I1216 08:51:32.002828 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43e373e7-9d6c-44b2-86c5-3ed6e2c5b6d8-config-data" (OuterVolumeSpecName: "config-data") pod "43e373e7-9d6c-44b2-86c5-3ed6e2c5b6d8" (UID: "43e373e7-9d6c-44b2-86c5-3ed6e2c5b6d8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:51:32 crc kubenswrapper[4823]: I1216 08:51:32.048943 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43e373e7-9d6c-44b2-86c5-3ed6e2c5b6d8-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 08:51:32 crc kubenswrapper[4823]: I1216 08:51:32.048985 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43e373e7-9d6c-44b2-86c5-3ed6e2c5b6d8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 08:51:32 crc kubenswrapper[4823]: I1216 08:51:32.049002 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2nzcm\" (UniqueName: \"kubernetes.io/projected/43e373e7-9d6c-44b2-86c5-3ed6e2c5b6d8-kube-api-access-2nzcm\") on node \"crc\" DevicePath \"\"" Dec 16 08:51:32 crc kubenswrapper[4823]: I1216 08:51:32.492886 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-xqn74" event={"ID":"43e373e7-9d6c-44b2-86c5-3ed6e2c5b6d8","Type":"ContainerDied","Data":"377fb48945018f46ff50aae286df8ea7376d45fb70fabe63462162e2ed29df30"} Dec 16 08:51:32 crc kubenswrapper[4823]: I1216 08:51:32.492945 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="377fb48945018f46ff50aae286df8ea7376d45fb70fabe63462162e2ed29df30" Dec 16 08:51:32 crc kubenswrapper[4823]: I1216 08:51:32.492983 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-xqn74" Dec 16 08:51:33 crc kubenswrapper[4823]: I1216 08:51:33.164761 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7855b89597-4wv7t"] Dec 16 08:51:33 crc kubenswrapper[4823]: E1216 08:51:33.167165 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43e373e7-9d6c-44b2-86c5-3ed6e2c5b6d8" containerName="keystone-db-sync" Dec 16 08:51:33 crc kubenswrapper[4823]: I1216 08:51:33.167189 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="43e373e7-9d6c-44b2-86c5-3ed6e2c5b6d8" containerName="keystone-db-sync" Dec 16 08:51:33 crc kubenswrapper[4823]: I1216 08:51:33.167367 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="43e373e7-9d6c-44b2-86c5-3ed6e2c5b6d8" containerName="keystone-db-sync" Dec 16 08:51:33 crc kubenswrapper[4823]: I1216 08:51:33.168288 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7855b89597-4wv7t" Dec 16 08:51:33 crc kubenswrapper[4823]: I1216 08:51:33.173611 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-7rxp4"] Dec 16 08:51:33 crc kubenswrapper[4823]: I1216 08:51:33.175185 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-7rxp4" Dec 16 08:51:33 crc kubenswrapper[4823]: I1216 08:51:33.177293 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 16 08:51:33 crc kubenswrapper[4823]: I1216 08:51:33.184055 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7855b89597-4wv7t"] Dec 16 08:51:33 crc kubenswrapper[4823]: I1216 08:51:33.186401 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 16 08:51:33 crc kubenswrapper[4823]: I1216 08:51:33.186471 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 16 08:51:33 crc kubenswrapper[4823]: I1216 08:51:33.186704 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-xzvsn" Dec 16 08:51:33 crc kubenswrapper[4823]: I1216 08:51:33.186869 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 16 08:51:33 crc kubenswrapper[4823]: I1216 08:51:33.201835 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-7rxp4"] Dec 16 08:51:33 crc kubenswrapper[4823]: I1216 08:51:33.286063 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a93b032-6107-4c47-afea-459d1ac4b399-combined-ca-bundle\") pod \"keystone-bootstrap-7rxp4\" (UID: \"0a93b032-6107-4c47-afea-459d1ac4b399\") " pod="openstack/keystone-bootstrap-7rxp4" Dec 16 08:51:33 crc kubenswrapper[4823]: I1216 08:51:33.286109 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2a8c54db-60c1-4615-855d-33153bc4970d-dns-svc\") pod \"dnsmasq-dns-7855b89597-4wv7t\" (UID: \"2a8c54db-60c1-4615-855d-33153bc4970d\") " 
pod="openstack/dnsmasq-dns-7855b89597-4wv7t" Dec 16 08:51:33 crc kubenswrapper[4823]: I1216 08:51:33.286157 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2a8c54db-60c1-4615-855d-33153bc4970d-ovsdbserver-sb\") pod \"dnsmasq-dns-7855b89597-4wv7t\" (UID: \"2a8c54db-60c1-4615-855d-33153bc4970d\") " pod="openstack/dnsmasq-dns-7855b89597-4wv7t" Dec 16 08:51:33 crc kubenswrapper[4823]: I1216 08:51:33.286226 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a8c54db-60c1-4615-855d-33153bc4970d-config\") pod \"dnsmasq-dns-7855b89597-4wv7t\" (UID: \"2a8c54db-60c1-4615-855d-33153bc4970d\") " pod="openstack/dnsmasq-dns-7855b89597-4wv7t" Dec 16 08:51:33 crc kubenswrapper[4823]: I1216 08:51:33.286252 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0a93b032-6107-4c47-afea-459d1ac4b399-credential-keys\") pod \"keystone-bootstrap-7rxp4\" (UID: \"0a93b032-6107-4c47-afea-459d1ac4b399\") " pod="openstack/keystone-bootstrap-7rxp4" Dec 16 08:51:33 crc kubenswrapper[4823]: I1216 08:51:33.286271 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vl5tr\" (UniqueName: \"kubernetes.io/projected/2a8c54db-60c1-4615-855d-33153bc4970d-kube-api-access-vl5tr\") pod \"dnsmasq-dns-7855b89597-4wv7t\" (UID: \"2a8c54db-60c1-4615-855d-33153bc4970d\") " pod="openstack/dnsmasq-dns-7855b89597-4wv7t" Dec 16 08:51:33 crc kubenswrapper[4823]: I1216 08:51:33.286310 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0a93b032-6107-4c47-afea-459d1ac4b399-fernet-keys\") pod \"keystone-bootstrap-7rxp4\" (UID: 
\"0a93b032-6107-4c47-afea-459d1ac4b399\") " pod="openstack/keystone-bootstrap-7rxp4" Dec 16 08:51:33 crc kubenswrapper[4823]: I1216 08:51:33.286329 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a93b032-6107-4c47-afea-459d1ac4b399-config-data\") pod \"keystone-bootstrap-7rxp4\" (UID: \"0a93b032-6107-4c47-afea-459d1ac4b399\") " pod="openstack/keystone-bootstrap-7rxp4" Dec 16 08:51:33 crc kubenswrapper[4823]: I1216 08:51:33.286348 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2cwg\" (UniqueName: \"kubernetes.io/projected/0a93b032-6107-4c47-afea-459d1ac4b399-kube-api-access-b2cwg\") pod \"keystone-bootstrap-7rxp4\" (UID: \"0a93b032-6107-4c47-afea-459d1ac4b399\") " pod="openstack/keystone-bootstrap-7rxp4" Dec 16 08:51:33 crc kubenswrapper[4823]: I1216 08:51:33.286737 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a93b032-6107-4c47-afea-459d1ac4b399-scripts\") pod \"keystone-bootstrap-7rxp4\" (UID: \"0a93b032-6107-4c47-afea-459d1ac4b399\") " pod="openstack/keystone-bootstrap-7rxp4" Dec 16 08:51:33 crc kubenswrapper[4823]: I1216 08:51:33.286850 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2a8c54db-60c1-4615-855d-33153bc4970d-ovsdbserver-nb\") pod \"dnsmasq-dns-7855b89597-4wv7t\" (UID: \"2a8c54db-60c1-4615-855d-33153bc4970d\") " pod="openstack/dnsmasq-dns-7855b89597-4wv7t" Dec 16 08:51:33 crc kubenswrapper[4823]: I1216 08:51:33.388219 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a93b032-6107-4c47-afea-459d1ac4b399-scripts\") pod \"keystone-bootstrap-7rxp4\" (UID: 
\"0a93b032-6107-4c47-afea-459d1ac4b399\") " pod="openstack/keystone-bootstrap-7rxp4" Dec 16 08:51:33 crc kubenswrapper[4823]: I1216 08:51:33.388278 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2a8c54db-60c1-4615-855d-33153bc4970d-ovsdbserver-nb\") pod \"dnsmasq-dns-7855b89597-4wv7t\" (UID: \"2a8c54db-60c1-4615-855d-33153bc4970d\") " pod="openstack/dnsmasq-dns-7855b89597-4wv7t" Dec 16 08:51:33 crc kubenswrapper[4823]: I1216 08:51:33.388320 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a93b032-6107-4c47-afea-459d1ac4b399-combined-ca-bundle\") pod \"keystone-bootstrap-7rxp4\" (UID: \"0a93b032-6107-4c47-afea-459d1ac4b399\") " pod="openstack/keystone-bootstrap-7rxp4" Dec 16 08:51:33 crc kubenswrapper[4823]: I1216 08:51:33.388346 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2a8c54db-60c1-4615-855d-33153bc4970d-dns-svc\") pod \"dnsmasq-dns-7855b89597-4wv7t\" (UID: \"2a8c54db-60c1-4615-855d-33153bc4970d\") " pod="openstack/dnsmasq-dns-7855b89597-4wv7t" Dec 16 08:51:33 crc kubenswrapper[4823]: I1216 08:51:33.388415 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2a8c54db-60c1-4615-855d-33153bc4970d-ovsdbserver-sb\") pod \"dnsmasq-dns-7855b89597-4wv7t\" (UID: \"2a8c54db-60c1-4615-855d-33153bc4970d\") " pod="openstack/dnsmasq-dns-7855b89597-4wv7t" Dec 16 08:51:33 crc kubenswrapper[4823]: I1216 08:51:33.388570 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a8c54db-60c1-4615-855d-33153bc4970d-config\") pod \"dnsmasq-dns-7855b89597-4wv7t\" (UID: \"2a8c54db-60c1-4615-855d-33153bc4970d\") " pod="openstack/dnsmasq-dns-7855b89597-4wv7t" 
Dec 16 08:51:33 crc kubenswrapper[4823]: I1216 08:51:33.388597 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0a93b032-6107-4c47-afea-459d1ac4b399-credential-keys\") pod \"keystone-bootstrap-7rxp4\" (UID: \"0a93b032-6107-4c47-afea-459d1ac4b399\") " pod="openstack/keystone-bootstrap-7rxp4" Dec 16 08:51:33 crc kubenswrapper[4823]: I1216 08:51:33.388649 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vl5tr\" (UniqueName: \"kubernetes.io/projected/2a8c54db-60c1-4615-855d-33153bc4970d-kube-api-access-vl5tr\") pod \"dnsmasq-dns-7855b89597-4wv7t\" (UID: \"2a8c54db-60c1-4615-855d-33153bc4970d\") " pod="openstack/dnsmasq-dns-7855b89597-4wv7t" Dec 16 08:51:33 crc kubenswrapper[4823]: I1216 08:51:33.388783 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0a93b032-6107-4c47-afea-459d1ac4b399-fernet-keys\") pod \"keystone-bootstrap-7rxp4\" (UID: \"0a93b032-6107-4c47-afea-459d1ac4b399\") " pod="openstack/keystone-bootstrap-7rxp4" Dec 16 08:51:33 crc kubenswrapper[4823]: I1216 08:51:33.388810 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a93b032-6107-4c47-afea-459d1ac4b399-config-data\") pod \"keystone-bootstrap-7rxp4\" (UID: \"0a93b032-6107-4c47-afea-459d1ac4b399\") " pod="openstack/keystone-bootstrap-7rxp4" Dec 16 08:51:33 crc kubenswrapper[4823]: I1216 08:51:33.389377 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2a8c54db-60c1-4615-855d-33153bc4970d-ovsdbserver-nb\") pod \"dnsmasq-dns-7855b89597-4wv7t\" (UID: \"2a8c54db-60c1-4615-855d-33153bc4970d\") " pod="openstack/dnsmasq-dns-7855b89597-4wv7t" Dec 16 08:51:33 crc kubenswrapper[4823]: I1216 08:51:33.389391 4823 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2a8c54db-60c1-4615-855d-33153bc4970d-ovsdbserver-sb\") pod \"dnsmasq-dns-7855b89597-4wv7t\" (UID: \"2a8c54db-60c1-4615-855d-33153bc4970d\") " pod="openstack/dnsmasq-dns-7855b89597-4wv7t" Dec 16 08:51:33 crc kubenswrapper[4823]: I1216 08:51:33.389704 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2a8c54db-60c1-4615-855d-33153bc4970d-dns-svc\") pod \"dnsmasq-dns-7855b89597-4wv7t\" (UID: \"2a8c54db-60c1-4615-855d-33153bc4970d\") " pod="openstack/dnsmasq-dns-7855b89597-4wv7t" Dec 16 08:51:33 crc kubenswrapper[4823]: I1216 08:51:33.389757 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2cwg\" (UniqueName: \"kubernetes.io/projected/0a93b032-6107-4c47-afea-459d1ac4b399-kube-api-access-b2cwg\") pod \"keystone-bootstrap-7rxp4\" (UID: \"0a93b032-6107-4c47-afea-459d1ac4b399\") " pod="openstack/keystone-bootstrap-7rxp4" Dec 16 08:51:33 crc kubenswrapper[4823]: I1216 08:51:33.389942 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a8c54db-60c1-4615-855d-33153bc4970d-config\") pod \"dnsmasq-dns-7855b89597-4wv7t\" (UID: \"2a8c54db-60c1-4615-855d-33153bc4970d\") " pod="openstack/dnsmasq-dns-7855b89597-4wv7t" Dec 16 08:51:33 crc kubenswrapper[4823]: I1216 08:51:33.394403 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a93b032-6107-4c47-afea-459d1ac4b399-scripts\") pod \"keystone-bootstrap-7rxp4\" (UID: \"0a93b032-6107-4c47-afea-459d1ac4b399\") " pod="openstack/keystone-bootstrap-7rxp4" Dec 16 08:51:33 crc kubenswrapper[4823]: I1216 08:51:33.394584 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/0a93b032-6107-4c47-afea-459d1ac4b399-credential-keys\") pod \"keystone-bootstrap-7rxp4\" (UID: \"0a93b032-6107-4c47-afea-459d1ac4b399\") " pod="openstack/keystone-bootstrap-7rxp4" Dec 16 08:51:33 crc kubenswrapper[4823]: I1216 08:51:33.394716 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a93b032-6107-4c47-afea-459d1ac4b399-combined-ca-bundle\") pod \"keystone-bootstrap-7rxp4\" (UID: \"0a93b032-6107-4c47-afea-459d1ac4b399\") " pod="openstack/keystone-bootstrap-7rxp4" Dec 16 08:51:33 crc kubenswrapper[4823]: I1216 08:51:33.400417 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0a93b032-6107-4c47-afea-459d1ac4b399-fernet-keys\") pod \"keystone-bootstrap-7rxp4\" (UID: \"0a93b032-6107-4c47-afea-459d1ac4b399\") " pod="openstack/keystone-bootstrap-7rxp4" Dec 16 08:51:33 crc kubenswrapper[4823]: I1216 08:51:33.407700 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2cwg\" (UniqueName: \"kubernetes.io/projected/0a93b032-6107-4c47-afea-459d1ac4b399-kube-api-access-b2cwg\") pod \"keystone-bootstrap-7rxp4\" (UID: \"0a93b032-6107-4c47-afea-459d1ac4b399\") " pod="openstack/keystone-bootstrap-7rxp4" Dec 16 08:51:33 crc kubenswrapper[4823]: I1216 08:51:33.408053 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vl5tr\" (UniqueName: \"kubernetes.io/projected/2a8c54db-60c1-4615-855d-33153bc4970d-kube-api-access-vl5tr\") pod \"dnsmasq-dns-7855b89597-4wv7t\" (UID: \"2a8c54db-60c1-4615-855d-33153bc4970d\") " pod="openstack/dnsmasq-dns-7855b89597-4wv7t" Dec 16 08:51:33 crc kubenswrapper[4823]: I1216 08:51:33.408583 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a93b032-6107-4c47-afea-459d1ac4b399-config-data\") pod \"keystone-bootstrap-7rxp4\" 
(UID: \"0a93b032-6107-4c47-afea-459d1ac4b399\") " pod="openstack/keystone-bootstrap-7rxp4" Dec 16 08:51:33 crc kubenswrapper[4823]: I1216 08:51:33.491317 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7855b89597-4wv7t" Dec 16 08:51:33 crc kubenswrapper[4823]: I1216 08:51:33.497885 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-7rxp4" Dec 16 08:51:33 crc kubenswrapper[4823]: I1216 08:51:33.945353 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7855b89597-4wv7t"] Dec 16 08:51:34 crc kubenswrapper[4823]: W1216 08:51:34.049421 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a93b032_6107_4c47_afea_459d1ac4b399.slice/crio-028ff750b3857b914bcdc28ea57856626b7d5081fa586608552f2282903d0c81 WatchSource:0}: Error finding container 028ff750b3857b914bcdc28ea57856626b7d5081fa586608552f2282903d0c81: Status 404 returned error can't find the container with id 028ff750b3857b914bcdc28ea57856626b7d5081fa586608552f2282903d0c81 Dec 16 08:51:34 crc kubenswrapper[4823]: I1216 08:51:34.051798 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-7rxp4"] Dec 16 08:51:34 crc kubenswrapper[4823]: I1216 08:51:34.508280 4823 generic.go:334] "Generic (PLEG): container finished" podID="2a8c54db-60c1-4615-855d-33153bc4970d" containerID="1b60411fa736044173574aac51a7401327f9d71fdb73fa6ddff846144843b50a" exitCode=0 Dec 16 08:51:34 crc kubenswrapper[4823]: I1216 08:51:34.508332 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7855b89597-4wv7t" event={"ID":"2a8c54db-60c1-4615-855d-33153bc4970d","Type":"ContainerDied","Data":"1b60411fa736044173574aac51a7401327f9d71fdb73fa6ddff846144843b50a"} Dec 16 08:51:34 crc kubenswrapper[4823]: I1216 08:51:34.508382 4823 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/dnsmasq-dns-7855b89597-4wv7t" event={"ID":"2a8c54db-60c1-4615-855d-33153bc4970d","Type":"ContainerStarted","Data":"971237a37a276a8fac544db84ad4632992ebcc0d8a26d2189c5ca3ec044a4753"} Dec 16 08:51:34 crc kubenswrapper[4823]: I1216 08:51:34.509496 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-7rxp4" event={"ID":"0a93b032-6107-4c47-afea-459d1ac4b399","Type":"ContainerStarted","Data":"68ec7a465b0ad6b461f30e4849b6ddb746160aec1ea230537f172432c1a9d391"} Dec 16 08:51:34 crc kubenswrapper[4823]: I1216 08:51:34.509523 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-7rxp4" event={"ID":"0a93b032-6107-4c47-afea-459d1ac4b399","Type":"ContainerStarted","Data":"028ff750b3857b914bcdc28ea57856626b7d5081fa586608552f2282903d0c81"} Dec 16 08:51:34 crc kubenswrapper[4823]: I1216 08:51:34.563874 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-7rxp4" podStartSLOduration=1.563850221 podStartE2EDuration="1.563850221s" podCreationTimestamp="2025-12-16 08:51:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 08:51:34.551801334 +0000 UTC m=+6973.040367467" watchObservedRunningTime="2025-12-16 08:51:34.563850221 +0000 UTC m=+6973.052416354" Dec 16 08:51:35 crc kubenswrapper[4823]: I1216 08:51:35.525942 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7855b89597-4wv7t" event={"ID":"2a8c54db-60c1-4615-855d-33153bc4970d","Type":"ContainerStarted","Data":"29b42b9afc72e4c0bd5c3085cbb2f6431cb628d9a171429a96e9c632452b8f65"} Dec 16 08:51:35 crc kubenswrapper[4823]: I1216 08:51:35.558137 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7855b89597-4wv7t" podStartSLOduration=2.558119677 podStartE2EDuration="2.558119677s" podCreationTimestamp="2025-12-16 08:51:33 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 08:51:35.551434427 +0000 UTC m=+6974.040000550" watchObservedRunningTime="2025-12-16 08:51:35.558119677 +0000 UTC m=+6974.046685790" Dec 16 08:51:36 crc kubenswrapper[4823]: I1216 08:51:36.535431 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7855b89597-4wv7t" Dec 16 08:51:38 crc kubenswrapper[4823]: I1216 08:51:38.555621 4823 generic.go:334] "Generic (PLEG): container finished" podID="0a93b032-6107-4c47-afea-459d1ac4b399" containerID="68ec7a465b0ad6b461f30e4849b6ddb746160aec1ea230537f172432c1a9d391" exitCode=0 Dec 16 08:51:38 crc kubenswrapper[4823]: I1216 08:51:38.555707 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-7rxp4" event={"ID":"0a93b032-6107-4c47-afea-459d1ac4b399","Type":"ContainerDied","Data":"68ec7a465b0ad6b461f30e4849b6ddb746160aec1ea230537f172432c1a9d391"} Dec 16 08:51:39 crc kubenswrapper[4823]: I1216 08:51:39.961193 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-7rxp4" Dec 16 08:51:40 crc kubenswrapper[4823]: I1216 08:51:40.118076 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0a93b032-6107-4c47-afea-459d1ac4b399-credential-keys\") pod \"0a93b032-6107-4c47-afea-459d1ac4b399\" (UID: \"0a93b032-6107-4c47-afea-459d1ac4b399\") " Dec 16 08:51:40 crc kubenswrapper[4823]: I1216 08:51:40.118377 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a93b032-6107-4c47-afea-459d1ac4b399-config-data\") pod \"0a93b032-6107-4c47-afea-459d1ac4b399\" (UID: \"0a93b032-6107-4c47-afea-459d1ac4b399\") " Dec 16 08:51:40 crc kubenswrapper[4823]: I1216 08:51:40.118459 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b2cwg\" (UniqueName: \"kubernetes.io/projected/0a93b032-6107-4c47-afea-459d1ac4b399-kube-api-access-b2cwg\") pod \"0a93b032-6107-4c47-afea-459d1ac4b399\" (UID: \"0a93b032-6107-4c47-afea-459d1ac4b399\") " Dec 16 08:51:40 crc kubenswrapper[4823]: I1216 08:51:40.118505 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0a93b032-6107-4c47-afea-459d1ac4b399-fernet-keys\") pod \"0a93b032-6107-4c47-afea-459d1ac4b399\" (UID: \"0a93b032-6107-4c47-afea-459d1ac4b399\") " Dec 16 08:51:40 crc kubenswrapper[4823]: I1216 08:51:40.118570 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a93b032-6107-4c47-afea-459d1ac4b399-scripts\") pod \"0a93b032-6107-4c47-afea-459d1ac4b399\" (UID: \"0a93b032-6107-4c47-afea-459d1ac4b399\") " Dec 16 08:51:40 crc kubenswrapper[4823]: I1216 08:51:40.118601 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0a93b032-6107-4c47-afea-459d1ac4b399-combined-ca-bundle\") pod \"0a93b032-6107-4c47-afea-459d1ac4b399\" (UID: \"0a93b032-6107-4c47-afea-459d1ac4b399\") " Dec 16 08:51:40 crc kubenswrapper[4823]: I1216 08:51:40.125515 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a93b032-6107-4c47-afea-459d1ac4b399-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "0a93b032-6107-4c47-afea-459d1ac4b399" (UID: "0a93b032-6107-4c47-afea-459d1ac4b399"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:51:40 crc kubenswrapper[4823]: I1216 08:51:40.125600 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a93b032-6107-4c47-afea-459d1ac4b399-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "0a93b032-6107-4c47-afea-459d1ac4b399" (UID: "0a93b032-6107-4c47-afea-459d1ac4b399"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:51:40 crc kubenswrapper[4823]: I1216 08:51:40.125844 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a93b032-6107-4c47-afea-459d1ac4b399-scripts" (OuterVolumeSpecName: "scripts") pod "0a93b032-6107-4c47-afea-459d1ac4b399" (UID: "0a93b032-6107-4c47-afea-459d1ac4b399"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:51:40 crc kubenswrapper[4823]: I1216 08:51:40.127155 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a93b032-6107-4c47-afea-459d1ac4b399-kube-api-access-b2cwg" (OuterVolumeSpecName: "kube-api-access-b2cwg") pod "0a93b032-6107-4c47-afea-459d1ac4b399" (UID: "0a93b032-6107-4c47-afea-459d1ac4b399"). InnerVolumeSpecName "kube-api-access-b2cwg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:51:40 crc kubenswrapper[4823]: I1216 08:51:40.153132 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a93b032-6107-4c47-afea-459d1ac4b399-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0a93b032-6107-4c47-afea-459d1ac4b399" (UID: "0a93b032-6107-4c47-afea-459d1ac4b399"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:51:40 crc kubenswrapper[4823]: I1216 08:51:40.153570 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a93b032-6107-4c47-afea-459d1ac4b399-config-data" (OuterVolumeSpecName: "config-data") pod "0a93b032-6107-4c47-afea-459d1ac4b399" (UID: "0a93b032-6107-4c47-afea-459d1ac4b399"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:51:40 crc kubenswrapper[4823]: I1216 08:51:40.221374 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a93b032-6107-4c47-afea-459d1ac4b399-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 08:51:40 crc kubenswrapper[4823]: I1216 08:51:40.221445 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b2cwg\" (UniqueName: \"kubernetes.io/projected/0a93b032-6107-4c47-afea-459d1ac4b399-kube-api-access-b2cwg\") on node \"crc\" DevicePath \"\"" Dec 16 08:51:40 crc kubenswrapper[4823]: I1216 08:51:40.221461 4823 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0a93b032-6107-4c47-afea-459d1ac4b399-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 16 08:51:40 crc kubenswrapper[4823]: I1216 08:51:40.221473 4823 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a93b032-6107-4c47-afea-459d1ac4b399-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 08:51:40 
crc kubenswrapper[4823]: I1216 08:51:40.221485 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a93b032-6107-4c47-afea-459d1ac4b399-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 08:51:40 crc kubenswrapper[4823]: I1216 08:51:40.221500 4823 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0a93b032-6107-4c47-afea-459d1ac4b399-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 16 08:51:40 crc kubenswrapper[4823]: I1216 08:51:40.579358 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-7rxp4" event={"ID":"0a93b032-6107-4c47-afea-459d1ac4b399","Type":"ContainerDied","Data":"028ff750b3857b914bcdc28ea57856626b7d5081fa586608552f2282903d0c81"} Dec 16 08:51:40 crc kubenswrapper[4823]: I1216 08:51:40.579402 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="028ff750b3857b914bcdc28ea57856626b7d5081fa586608552f2282903d0c81" Dec 16 08:51:40 crc kubenswrapper[4823]: I1216 08:51:40.579532 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-7rxp4" Dec 16 08:51:40 crc kubenswrapper[4823]: I1216 08:51:40.675206 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-7rxp4"] Dec 16 08:51:40 crc kubenswrapper[4823]: I1216 08:51:40.681790 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-7rxp4"] Dec 16 08:51:40 crc kubenswrapper[4823]: I1216 08:51:40.757436 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-tjbg9"] Dec 16 08:51:40 crc kubenswrapper[4823]: E1216 08:51:40.761150 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a93b032-6107-4c47-afea-459d1ac4b399" containerName="keystone-bootstrap" Dec 16 08:51:40 crc kubenswrapper[4823]: I1216 08:51:40.761174 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a93b032-6107-4c47-afea-459d1ac4b399" containerName="keystone-bootstrap" Dec 16 08:51:40 crc kubenswrapper[4823]: I1216 08:51:40.761386 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a93b032-6107-4c47-afea-459d1ac4b399" containerName="keystone-bootstrap" Dec 16 08:51:40 crc kubenswrapper[4823]: I1216 08:51:40.762245 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-tjbg9" Dec 16 08:51:40 crc kubenswrapper[4823]: I1216 08:51:40.764480 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 16 08:51:40 crc kubenswrapper[4823]: I1216 08:51:40.764774 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 16 08:51:40 crc kubenswrapper[4823]: I1216 08:51:40.765103 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 16 08:51:40 crc kubenswrapper[4823]: I1216 08:51:40.765174 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 16 08:51:40 crc kubenswrapper[4823]: I1216 08:51:40.765285 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-xzvsn" Dec 16 08:51:40 crc kubenswrapper[4823]: I1216 08:51:40.769617 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-tjbg9"] Dec 16 08:51:40 crc kubenswrapper[4823]: I1216 08:51:40.832631 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb6dd2ee-cc9d-4aed-994a-59021ba71f47-scripts\") pod \"keystone-bootstrap-tjbg9\" (UID: \"fb6dd2ee-cc9d-4aed-994a-59021ba71f47\") " pod="openstack/keystone-bootstrap-tjbg9" Dec 16 08:51:40 crc kubenswrapper[4823]: I1216 08:51:40.832713 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24njh\" (UniqueName: \"kubernetes.io/projected/fb6dd2ee-cc9d-4aed-994a-59021ba71f47-kube-api-access-24njh\") pod \"keystone-bootstrap-tjbg9\" (UID: \"fb6dd2ee-cc9d-4aed-994a-59021ba71f47\") " pod="openstack/keystone-bootstrap-tjbg9" Dec 16 08:51:40 crc kubenswrapper[4823]: I1216 08:51:40.832754 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/fb6dd2ee-cc9d-4aed-994a-59021ba71f47-config-data\") pod \"keystone-bootstrap-tjbg9\" (UID: \"fb6dd2ee-cc9d-4aed-994a-59021ba71f47\") " pod="openstack/keystone-bootstrap-tjbg9" Dec 16 08:51:40 crc kubenswrapper[4823]: I1216 08:51:40.832782 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fb6dd2ee-cc9d-4aed-994a-59021ba71f47-credential-keys\") pod \"keystone-bootstrap-tjbg9\" (UID: \"fb6dd2ee-cc9d-4aed-994a-59021ba71f47\") " pod="openstack/keystone-bootstrap-tjbg9" Dec 16 08:51:40 crc kubenswrapper[4823]: I1216 08:51:40.832866 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fb6dd2ee-cc9d-4aed-994a-59021ba71f47-fernet-keys\") pod \"keystone-bootstrap-tjbg9\" (UID: \"fb6dd2ee-cc9d-4aed-994a-59021ba71f47\") " pod="openstack/keystone-bootstrap-tjbg9" Dec 16 08:51:40 crc kubenswrapper[4823]: I1216 08:51:40.832898 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb6dd2ee-cc9d-4aed-994a-59021ba71f47-combined-ca-bundle\") pod \"keystone-bootstrap-tjbg9\" (UID: \"fb6dd2ee-cc9d-4aed-994a-59021ba71f47\") " pod="openstack/keystone-bootstrap-tjbg9" Dec 16 08:51:40 crc kubenswrapper[4823]: I1216 08:51:40.934166 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24njh\" (UniqueName: \"kubernetes.io/projected/fb6dd2ee-cc9d-4aed-994a-59021ba71f47-kube-api-access-24njh\") pod \"keystone-bootstrap-tjbg9\" (UID: \"fb6dd2ee-cc9d-4aed-994a-59021ba71f47\") " pod="openstack/keystone-bootstrap-tjbg9" Dec 16 08:51:40 crc kubenswrapper[4823]: I1216 08:51:40.934527 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/fb6dd2ee-cc9d-4aed-994a-59021ba71f47-config-data\") pod \"keystone-bootstrap-tjbg9\" (UID: \"fb6dd2ee-cc9d-4aed-994a-59021ba71f47\") " pod="openstack/keystone-bootstrap-tjbg9" Dec 16 08:51:40 crc kubenswrapper[4823]: I1216 08:51:40.934651 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fb6dd2ee-cc9d-4aed-994a-59021ba71f47-credential-keys\") pod \"keystone-bootstrap-tjbg9\" (UID: \"fb6dd2ee-cc9d-4aed-994a-59021ba71f47\") " pod="openstack/keystone-bootstrap-tjbg9" Dec 16 08:51:40 crc kubenswrapper[4823]: I1216 08:51:40.934813 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fb6dd2ee-cc9d-4aed-994a-59021ba71f47-fernet-keys\") pod \"keystone-bootstrap-tjbg9\" (UID: \"fb6dd2ee-cc9d-4aed-994a-59021ba71f47\") " pod="openstack/keystone-bootstrap-tjbg9" Dec 16 08:51:40 crc kubenswrapper[4823]: I1216 08:51:40.934918 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb6dd2ee-cc9d-4aed-994a-59021ba71f47-combined-ca-bundle\") pod \"keystone-bootstrap-tjbg9\" (UID: \"fb6dd2ee-cc9d-4aed-994a-59021ba71f47\") " pod="openstack/keystone-bootstrap-tjbg9" Dec 16 08:51:40 crc kubenswrapper[4823]: I1216 08:51:40.935136 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb6dd2ee-cc9d-4aed-994a-59021ba71f47-scripts\") pod \"keystone-bootstrap-tjbg9\" (UID: \"fb6dd2ee-cc9d-4aed-994a-59021ba71f47\") " pod="openstack/keystone-bootstrap-tjbg9" Dec 16 08:51:40 crc kubenswrapper[4823]: I1216 08:51:40.942245 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb6dd2ee-cc9d-4aed-994a-59021ba71f47-combined-ca-bundle\") pod \"keystone-bootstrap-tjbg9\" (UID: 
\"fb6dd2ee-cc9d-4aed-994a-59021ba71f47\") " pod="openstack/keystone-bootstrap-tjbg9" Dec 16 08:51:40 crc kubenswrapper[4823]: I1216 08:51:40.942320 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb6dd2ee-cc9d-4aed-994a-59021ba71f47-config-data\") pod \"keystone-bootstrap-tjbg9\" (UID: \"fb6dd2ee-cc9d-4aed-994a-59021ba71f47\") " pod="openstack/keystone-bootstrap-tjbg9" Dec 16 08:51:40 crc kubenswrapper[4823]: I1216 08:51:40.942430 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb6dd2ee-cc9d-4aed-994a-59021ba71f47-scripts\") pod \"keystone-bootstrap-tjbg9\" (UID: \"fb6dd2ee-cc9d-4aed-994a-59021ba71f47\") " pod="openstack/keystone-bootstrap-tjbg9" Dec 16 08:51:40 crc kubenswrapper[4823]: I1216 08:51:40.943215 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fb6dd2ee-cc9d-4aed-994a-59021ba71f47-credential-keys\") pod \"keystone-bootstrap-tjbg9\" (UID: \"fb6dd2ee-cc9d-4aed-994a-59021ba71f47\") " pod="openstack/keystone-bootstrap-tjbg9" Dec 16 08:51:40 crc kubenswrapper[4823]: I1216 08:51:40.946142 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fb6dd2ee-cc9d-4aed-994a-59021ba71f47-fernet-keys\") pod \"keystone-bootstrap-tjbg9\" (UID: \"fb6dd2ee-cc9d-4aed-994a-59021ba71f47\") " pod="openstack/keystone-bootstrap-tjbg9" Dec 16 08:51:40 crc kubenswrapper[4823]: I1216 08:51:40.955100 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24njh\" (UniqueName: \"kubernetes.io/projected/fb6dd2ee-cc9d-4aed-994a-59021ba71f47-kube-api-access-24njh\") pod \"keystone-bootstrap-tjbg9\" (UID: \"fb6dd2ee-cc9d-4aed-994a-59021ba71f47\") " pod="openstack/keystone-bootstrap-tjbg9" Dec 16 08:51:41 crc kubenswrapper[4823]: I1216 08:51:41.091134 4823 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-tjbg9" Dec 16 08:51:41 crc kubenswrapper[4823]: I1216 08:51:41.540271 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-tjbg9"] Dec 16 08:51:41 crc kubenswrapper[4823]: I1216 08:51:41.589894 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-tjbg9" event={"ID":"fb6dd2ee-cc9d-4aed-994a-59021ba71f47","Type":"ContainerStarted","Data":"52a67e04341c8ae9b9db51409719d2665b49b8cd4927fa566097e07394bde095"} Dec 16 08:51:41 crc kubenswrapper[4823]: I1216 08:51:41.784359 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a93b032-6107-4c47-afea-459d1ac4b399" path="/var/lib/kubelet/pods/0a93b032-6107-4c47-afea-459d1ac4b399/volumes" Dec 16 08:51:42 crc kubenswrapper[4823]: I1216 08:51:42.600104 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-tjbg9" event={"ID":"fb6dd2ee-cc9d-4aed-994a-59021ba71f47","Type":"ContainerStarted","Data":"a76d0259f26ac7f214ff187762c1d93800cea4b144dae321378712aba87a7fd6"} Dec 16 08:51:42 crc kubenswrapper[4823]: I1216 08:51:42.628991 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-tjbg9" podStartSLOduration=2.628971419 podStartE2EDuration="2.628971419s" podCreationTimestamp="2025-12-16 08:51:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 08:51:42.624256292 +0000 UTC m=+6981.112822435" watchObservedRunningTime="2025-12-16 08:51:42.628971419 +0000 UTC m=+6981.117537542" Dec 16 08:51:43 crc kubenswrapper[4823]: I1216 08:51:43.493020 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7855b89597-4wv7t" Dec 16 08:51:43 crc kubenswrapper[4823]: I1216 08:51:43.569900 4823 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/dnsmasq-dns-7b59595b79-vnhhg"] Dec 16 08:51:43 crc kubenswrapper[4823]: I1216 08:51:43.570565 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7b59595b79-vnhhg" podUID="16894c6b-6fe2-41ae-a89f-67a0c0b3710c" containerName="dnsmasq-dns" containerID="cri-o://f51fce565aaf95b20db22b6d993baf2ce631cde0839d86c257c1600106bede31" gracePeriod=10 Dec 16 08:51:44 crc kubenswrapper[4823]: I1216 08:51:44.033850 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b59595b79-vnhhg" Dec 16 08:51:44 crc kubenswrapper[4823]: I1216 08:51:44.133654 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/16894c6b-6fe2-41ae-a89f-67a0c0b3710c-dns-svc\") pod \"16894c6b-6fe2-41ae-a89f-67a0c0b3710c\" (UID: \"16894c6b-6fe2-41ae-a89f-67a0c0b3710c\") " Dec 16 08:51:44 crc kubenswrapper[4823]: I1216 08:51:44.133760 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-47rgb\" (UniqueName: \"kubernetes.io/projected/16894c6b-6fe2-41ae-a89f-67a0c0b3710c-kube-api-access-47rgb\") pod \"16894c6b-6fe2-41ae-a89f-67a0c0b3710c\" (UID: \"16894c6b-6fe2-41ae-a89f-67a0c0b3710c\") " Dec 16 08:51:44 crc kubenswrapper[4823]: I1216 08:51:44.133795 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16894c6b-6fe2-41ae-a89f-67a0c0b3710c-config\") pod \"16894c6b-6fe2-41ae-a89f-67a0c0b3710c\" (UID: \"16894c6b-6fe2-41ae-a89f-67a0c0b3710c\") " Dec 16 08:51:44 crc kubenswrapper[4823]: I1216 08:51:44.133818 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/16894c6b-6fe2-41ae-a89f-67a0c0b3710c-ovsdbserver-nb\") pod \"16894c6b-6fe2-41ae-a89f-67a0c0b3710c\" (UID: 
\"16894c6b-6fe2-41ae-a89f-67a0c0b3710c\") " Dec 16 08:51:44 crc kubenswrapper[4823]: I1216 08:51:44.133853 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/16894c6b-6fe2-41ae-a89f-67a0c0b3710c-ovsdbserver-sb\") pod \"16894c6b-6fe2-41ae-a89f-67a0c0b3710c\" (UID: \"16894c6b-6fe2-41ae-a89f-67a0c0b3710c\") " Dec 16 08:51:44 crc kubenswrapper[4823]: I1216 08:51:44.139006 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16894c6b-6fe2-41ae-a89f-67a0c0b3710c-kube-api-access-47rgb" (OuterVolumeSpecName: "kube-api-access-47rgb") pod "16894c6b-6fe2-41ae-a89f-67a0c0b3710c" (UID: "16894c6b-6fe2-41ae-a89f-67a0c0b3710c"). InnerVolumeSpecName "kube-api-access-47rgb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:51:44 crc kubenswrapper[4823]: I1216 08:51:44.177623 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16894c6b-6fe2-41ae-a89f-67a0c0b3710c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "16894c6b-6fe2-41ae-a89f-67a0c0b3710c" (UID: "16894c6b-6fe2-41ae-a89f-67a0c0b3710c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:51:44 crc kubenswrapper[4823]: I1216 08:51:44.180895 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16894c6b-6fe2-41ae-a89f-67a0c0b3710c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "16894c6b-6fe2-41ae-a89f-67a0c0b3710c" (UID: "16894c6b-6fe2-41ae-a89f-67a0c0b3710c"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:51:44 crc kubenswrapper[4823]: I1216 08:51:44.188608 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16894c6b-6fe2-41ae-a89f-67a0c0b3710c-config" (OuterVolumeSpecName: "config") pod "16894c6b-6fe2-41ae-a89f-67a0c0b3710c" (UID: "16894c6b-6fe2-41ae-a89f-67a0c0b3710c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:51:44 crc kubenswrapper[4823]: I1216 08:51:44.193914 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16894c6b-6fe2-41ae-a89f-67a0c0b3710c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "16894c6b-6fe2-41ae-a89f-67a0c0b3710c" (UID: "16894c6b-6fe2-41ae-a89f-67a0c0b3710c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:51:44 crc kubenswrapper[4823]: I1216 08:51:44.235613 4823 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/16894c6b-6fe2-41ae-a89f-67a0c0b3710c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 16 08:51:44 crc kubenswrapper[4823]: I1216 08:51:44.235655 4823 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/16894c6b-6fe2-41ae-a89f-67a0c0b3710c-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 16 08:51:44 crc kubenswrapper[4823]: I1216 08:51:44.235670 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-47rgb\" (UniqueName: \"kubernetes.io/projected/16894c6b-6fe2-41ae-a89f-67a0c0b3710c-kube-api-access-47rgb\") on node \"crc\" DevicePath \"\"" Dec 16 08:51:44 crc kubenswrapper[4823]: I1216 08:51:44.235685 4823 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16894c6b-6fe2-41ae-a89f-67a0c0b3710c-config\") on node \"crc\" DevicePath \"\"" Dec 16 08:51:44 crc 
kubenswrapper[4823]: I1216 08:51:44.235698 4823 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/16894c6b-6fe2-41ae-a89f-67a0c0b3710c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 16 08:51:44 crc kubenswrapper[4823]: I1216 08:51:44.627598 4823 generic.go:334] "Generic (PLEG): container finished" podID="fb6dd2ee-cc9d-4aed-994a-59021ba71f47" containerID="a76d0259f26ac7f214ff187762c1d93800cea4b144dae321378712aba87a7fd6" exitCode=0 Dec 16 08:51:44 crc kubenswrapper[4823]: I1216 08:51:44.627704 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-tjbg9" event={"ID":"fb6dd2ee-cc9d-4aed-994a-59021ba71f47","Type":"ContainerDied","Data":"a76d0259f26ac7f214ff187762c1d93800cea4b144dae321378712aba87a7fd6"} Dec 16 08:51:44 crc kubenswrapper[4823]: I1216 08:51:44.631518 4823 generic.go:334] "Generic (PLEG): container finished" podID="16894c6b-6fe2-41ae-a89f-67a0c0b3710c" containerID="f51fce565aaf95b20db22b6d993baf2ce631cde0839d86c257c1600106bede31" exitCode=0 Dec 16 08:51:44 crc kubenswrapper[4823]: I1216 08:51:44.631613 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b59595b79-vnhhg" event={"ID":"16894c6b-6fe2-41ae-a89f-67a0c0b3710c","Type":"ContainerDied","Data":"f51fce565aaf95b20db22b6d993baf2ce631cde0839d86c257c1600106bede31"} Dec 16 08:51:44 crc kubenswrapper[4823]: I1216 08:51:44.631675 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b59595b79-vnhhg" event={"ID":"16894c6b-6fe2-41ae-a89f-67a0c0b3710c","Type":"ContainerDied","Data":"c00d47d7a3f834c3b133b663d103b7e1649997b2bc12e299d7fbfdb87f162d57"} Dec 16 08:51:44 crc kubenswrapper[4823]: I1216 08:51:44.631713 4823 scope.go:117] "RemoveContainer" containerID="f51fce565aaf95b20db22b6d993baf2ce631cde0839d86c257c1600106bede31" Dec 16 08:51:44 crc kubenswrapper[4823]: I1216 08:51:44.631631 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7b59595b79-vnhhg" Dec 16 08:51:44 crc kubenswrapper[4823]: I1216 08:51:44.671695 4823 scope.go:117] "RemoveContainer" containerID="c8203acb195c1d9e612d2e0aec7a7c8f859251f9dd746e26af05ac08486fdaf5" Dec 16 08:51:44 crc kubenswrapper[4823]: I1216 08:51:44.691718 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b59595b79-vnhhg"] Dec 16 08:51:44 crc kubenswrapper[4823]: I1216 08:51:44.699219 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7b59595b79-vnhhg"] Dec 16 08:51:44 crc kubenswrapper[4823]: I1216 08:51:44.754095 4823 scope.go:117] "RemoveContainer" containerID="f51fce565aaf95b20db22b6d993baf2ce631cde0839d86c257c1600106bede31" Dec 16 08:51:44 crc kubenswrapper[4823]: E1216 08:51:44.754832 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f51fce565aaf95b20db22b6d993baf2ce631cde0839d86c257c1600106bede31\": container with ID starting with f51fce565aaf95b20db22b6d993baf2ce631cde0839d86c257c1600106bede31 not found: ID does not exist" containerID="f51fce565aaf95b20db22b6d993baf2ce631cde0839d86c257c1600106bede31" Dec 16 08:51:44 crc kubenswrapper[4823]: I1216 08:51:44.754873 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f51fce565aaf95b20db22b6d993baf2ce631cde0839d86c257c1600106bede31"} err="failed to get container status \"f51fce565aaf95b20db22b6d993baf2ce631cde0839d86c257c1600106bede31\": rpc error: code = NotFound desc = could not find container \"f51fce565aaf95b20db22b6d993baf2ce631cde0839d86c257c1600106bede31\": container with ID starting with f51fce565aaf95b20db22b6d993baf2ce631cde0839d86c257c1600106bede31 not found: ID does not exist" Dec 16 08:51:44 crc kubenswrapper[4823]: I1216 08:51:44.754899 4823 scope.go:117] "RemoveContainer" containerID="c8203acb195c1d9e612d2e0aec7a7c8f859251f9dd746e26af05ac08486fdaf5" Dec 16 
08:51:44 crc kubenswrapper[4823]: E1216 08:51:44.755567 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c8203acb195c1d9e612d2e0aec7a7c8f859251f9dd746e26af05ac08486fdaf5\": container with ID starting with c8203acb195c1d9e612d2e0aec7a7c8f859251f9dd746e26af05ac08486fdaf5 not found: ID does not exist" containerID="c8203acb195c1d9e612d2e0aec7a7c8f859251f9dd746e26af05ac08486fdaf5" Dec 16 08:51:44 crc kubenswrapper[4823]: I1216 08:51:44.755594 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8203acb195c1d9e612d2e0aec7a7c8f859251f9dd746e26af05ac08486fdaf5"} err="failed to get container status \"c8203acb195c1d9e612d2e0aec7a7c8f859251f9dd746e26af05ac08486fdaf5\": rpc error: code = NotFound desc = could not find container \"c8203acb195c1d9e612d2e0aec7a7c8f859251f9dd746e26af05ac08486fdaf5\": container with ID starting with c8203acb195c1d9e612d2e0aec7a7c8f859251f9dd746e26af05ac08486fdaf5 not found: ID does not exist" Dec 16 08:51:45 crc kubenswrapper[4823]: I1216 08:51:45.791018 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16894c6b-6fe2-41ae-a89f-67a0c0b3710c" path="/var/lib/kubelet/pods/16894c6b-6fe2-41ae-a89f-67a0c0b3710c/volumes" Dec 16 08:51:45 crc kubenswrapper[4823]: I1216 08:51:45.969708 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-tjbg9" Dec 16 08:51:46 crc kubenswrapper[4823]: I1216 08:51:46.071066 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb6dd2ee-cc9d-4aed-994a-59021ba71f47-combined-ca-bundle\") pod \"fb6dd2ee-cc9d-4aed-994a-59021ba71f47\" (UID: \"fb6dd2ee-cc9d-4aed-994a-59021ba71f47\") " Dec 16 08:51:46 crc kubenswrapper[4823]: I1216 08:51:46.071211 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fb6dd2ee-cc9d-4aed-994a-59021ba71f47-fernet-keys\") pod \"fb6dd2ee-cc9d-4aed-994a-59021ba71f47\" (UID: \"fb6dd2ee-cc9d-4aed-994a-59021ba71f47\") " Dec 16 08:51:46 crc kubenswrapper[4823]: I1216 08:51:46.071250 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fb6dd2ee-cc9d-4aed-994a-59021ba71f47-credential-keys\") pod \"fb6dd2ee-cc9d-4aed-994a-59021ba71f47\" (UID: \"fb6dd2ee-cc9d-4aed-994a-59021ba71f47\") " Dec 16 08:51:46 crc kubenswrapper[4823]: I1216 08:51:46.071267 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb6dd2ee-cc9d-4aed-994a-59021ba71f47-scripts\") pod \"fb6dd2ee-cc9d-4aed-994a-59021ba71f47\" (UID: \"fb6dd2ee-cc9d-4aed-994a-59021ba71f47\") " Dec 16 08:51:46 crc kubenswrapper[4823]: I1216 08:51:46.071287 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-24njh\" (UniqueName: \"kubernetes.io/projected/fb6dd2ee-cc9d-4aed-994a-59021ba71f47-kube-api-access-24njh\") pod \"fb6dd2ee-cc9d-4aed-994a-59021ba71f47\" (UID: \"fb6dd2ee-cc9d-4aed-994a-59021ba71f47\") " Dec 16 08:51:46 crc kubenswrapper[4823]: I1216 08:51:46.071323 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/fb6dd2ee-cc9d-4aed-994a-59021ba71f47-config-data\") pod \"fb6dd2ee-cc9d-4aed-994a-59021ba71f47\" (UID: \"fb6dd2ee-cc9d-4aed-994a-59021ba71f47\") " Dec 16 08:51:46 crc kubenswrapper[4823]: I1216 08:51:46.077919 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb6dd2ee-cc9d-4aed-994a-59021ba71f47-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "fb6dd2ee-cc9d-4aed-994a-59021ba71f47" (UID: "fb6dd2ee-cc9d-4aed-994a-59021ba71f47"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:51:46 crc kubenswrapper[4823]: I1216 08:51:46.078054 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb6dd2ee-cc9d-4aed-994a-59021ba71f47-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "fb6dd2ee-cc9d-4aed-994a-59021ba71f47" (UID: "fb6dd2ee-cc9d-4aed-994a-59021ba71f47"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:51:46 crc kubenswrapper[4823]: I1216 08:51:46.078352 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb6dd2ee-cc9d-4aed-994a-59021ba71f47-kube-api-access-24njh" (OuterVolumeSpecName: "kube-api-access-24njh") pod "fb6dd2ee-cc9d-4aed-994a-59021ba71f47" (UID: "fb6dd2ee-cc9d-4aed-994a-59021ba71f47"). InnerVolumeSpecName "kube-api-access-24njh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:51:46 crc kubenswrapper[4823]: I1216 08:51:46.082938 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb6dd2ee-cc9d-4aed-994a-59021ba71f47-scripts" (OuterVolumeSpecName: "scripts") pod "fb6dd2ee-cc9d-4aed-994a-59021ba71f47" (UID: "fb6dd2ee-cc9d-4aed-994a-59021ba71f47"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:51:46 crc kubenswrapper[4823]: I1216 08:51:46.100629 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb6dd2ee-cc9d-4aed-994a-59021ba71f47-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fb6dd2ee-cc9d-4aed-994a-59021ba71f47" (UID: "fb6dd2ee-cc9d-4aed-994a-59021ba71f47"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:51:46 crc kubenswrapper[4823]: I1216 08:51:46.114313 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb6dd2ee-cc9d-4aed-994a-59021ba71f47-config-data" (OuterVolumeSpecName: "config-data") pod "fb6dd2ee-cc9d-4aed-994a-59021ba71f47" (UID: "fb6dd2ee-cc9d-4aed-994a-59021ba71f47"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:51:46 crc kubenswrapper[4823]: I1216 08:51:46.174224 4823 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fb6dd2ee-cc9d-4aed-994a-59021ba71f47-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 16 08:51:46 crc kubenswrapper[4823]: I1216 08:51:46.174293 4823 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fb6dd2ee-cc9d-4aed-994a-59021ba71f47-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 16 08:51:46 crc kubenswrapper[4823]: I1216 08:51:46.174321 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-24njh\" (UniqueName: \"kubernetes.io/projected/fb6dd2ee-cc9d-4aed-994a-59021ba71f47-kube-api-access-24njh\") on node \"crc\" DevicePath \"\"" Dec 16 08:51:46 crc kubenswrapper[4823]: I1216 08:51:46.174348 4823 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb6dd2ee-cc9d-4aed-994a-59021ba71f47-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 
08:51:46 crc kubenswrapper[4823]: I1216 08:51:46.174370 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb6dd2ee-cc9d-4aed-994a-59021ba71f47-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 08:51:46 crc kubenswrapper[4823]: I1216 08:51:46.174387 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb6dd2ee-cc9d-4aed-994a-59021ba71f47-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 08:51:46 crc kubenswrapper[4823]: I1216 08:51:46.653527 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-tjbg9" event={"ID":"fb6dd2ee-cc9d-4aed-994a-59021ba71f47","Type":"ContainerDied","Data":"52a67e04341c8ae9b9db51409719d2665b49b8cd4927fa566097e07394bde095"} Dec 16 08:51:46 crc kubenswrapper[4823]: I1216 08:51:46.653607 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="52a67e04341c8ae9b9db51409719d2665b49b8cd4927fa566097e07394bde095" Dec 16 08:51:46 crc kubenswrapper[4823]: I1216 08:51:46.653617 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-tjbg9" Dec 16 08:51:46 crc kubenswrapper[4823]: I1216 08:51:46.746266 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-b64c64d55-q7zxm"] Dec 16 08:51:46 crc kubenswrapper[4823]: E1216 08:51:46.746619 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb6dd2ee-cc9d-4aed-994a-59021ba71f47" containerName="keystone-bootstrap" Dec 16 08:51:46 crc kubenswrapper[4823]: I1216 08:51:46.746635 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb6dd2ee-cc9d-4aed-994a-59021ba71f47" containerName="keystone-bootstrap" Dec 16 08:51:46 crc kubenswrapper[4823]: E1216 08:51:46.746650 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16894c6b-6fe2-41ae-a89f-67a0c0b3710c" containerName="init" Dec 16 08:51:46 crc kubenswrapper[4823]: I1216 08:51:46.746656 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="16894c6b-6fe2-41ae-a89f-67a0c0b3710c" containerName="init" Dec 16 08:51:46 crc kubenswrapper[4823]: E1216 08:51:46.746666 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16894c6b-6fe2-41ae-a89f-67a0c0b3710c" containerName="dnsmasq-dns" Dec 16 08:51:46 crc kubenswrapper[4823]: I1216 08:51:46.746674 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="16894c6b-6fe2-41ae-a89f-67a0c0b3710c" containerName="dnsmasq-dns" Dec 16 08:51:46 crc kubenswrapper[4823]: I1216 08:51:46.746860 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="16894c6b-6fe2-41ae-a89f-67a0c0b3710c" containerName="dnsmasq-dns" Dec 16 08:51:46 crc kubenswrapper[4823]: I1216 08:51:46.746882 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb6dd2ee-cc9d-4aed-994a-59021ba71f47" containerName="keystone-bootstrap" Dec 16 08:51:46 crc kubenswrapper[4823]: I1216 08:51:46.747458 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-b64c64d55-q7zxm" Dec 16 08:51:46 crc kubenswrapper[4823]: I1216 08:51:46.750442 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Dec 16 08:51:46 crc kubenswrapper[4823]: I1216 08:51:46.750460 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 16 08:51:46 crc kubenswrapper[4823]: I1216 08:51:46.750717 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-xzvsn" Dec 16 08:51:46 crc kubenswrapper[4823]: I1216 08:51:46.750734 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 16 08:51:46 crc kubenswrapper[4823]: I1216 08:51:46.750866 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Dec 16 08:51:46 crc kubenswrapper[4823]: I1216 08:51:46.750924 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 16 08:51:46 crc kubenswrapper[4823]: I1216 08:51:46.779012 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-b64c64d55-q7zxm"] Dec 16 08:51:46 crc kubenswrapper[4823]: I1216 08:51:46.887814 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-465rw\" (UniqueName: \"kubernetes.io/projected/e5b08afc-bfe3-4938-ac42-3781d1290201-kube-api-access-465rw\") pod \"keystone-b64c64d55-q7zxm\" (UID: \"e5b08afc-bfe3-4938-ac42-3781d1290201\") " pod="openstack/keystone-b64c64d55-q7zxm" Dec 16 08:51:46 crc kubenswrapper[4823]: I1216 08:51:46.888007 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5b08afc-bfe3-4938-ac42-3781d1290201-public-tls-certs\") pod \"keystone-b64c64d55-q7zxm\" (UID: \"e5b08afc-bfe3-4938-ac42-3781d1290201\") " 
pod="openstack/keystone-b64c64d55-q7zxm" Dec 16 08:51:46 crc kubenswrapper[4823]: I1216 08:51:46.888218 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5b08afc-bfe3-4938-ac42-3781d1290201-scripts\") pod \"keystone-b64c64d55-q7zxm\" (UID: \"e5b08afc-bfe3-4938-ac42-3781d1290201\") " pod="openstack/keystone-b64c64d55-q7zxm" Dec 16 08:51:46 crc kubenswrapper[4823]: I1216 08:51:46.888351 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5b08afc-bfe3-4938-ac42-3781d1290201-combined-ca-bundle\") pod \"keystone-b64c64d55-q7zxm\" (UID: \"e5b08afc-bfe3-4938-ac42-3781d1290201\") " pod="openstack/keystone-b64c64d55-q7zxm" Dec 16 08:51:46 crc kubenswrapper[4823]: I1216 08:51:46.888439 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e5b08afc-bfe3-4938-ac42-3781d1290201-credential-keys\") pod \"keystone-b64c64d55-q7zxm\" (UID: \"e5b08afc-bfe3-4938-ac42-3781d1290201\") " pod="openstack/keystone-b64c64d55-q7zxm" Dec 16 08:51:46 crc kubenswrapper[4823]: I1216 08:51:46.888469 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5b08afc-bfe3-4938-ac42-3781d1290201-config-data\") pod \"keystone-b64c64d55-q7zxm\" (UID: \"e5b08afc-bfe3-4938-ac42-3781d1290201\") " pod="openstack/keystone-b64c64d55-q7zxm" Dec 16 08:51:46 crc kubenswrapper[4823]: I1216 08:51:46.888525 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5b08afc-bfe3-4938-ac42-3781d1290201-internal-tls-certs\") pod \"keystone-b64c64d55-q7zxm\" (UID: \"e5b08afc-bfe3-4938-ac42-3781d1290201\") " 
pod="openstack/keystone-b64c64d55-q7zxm" Dec 16 08:51:46 crc kubenswrapper[4823]: I1216 08:51:46.888924 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e5b08afc-bfe3-4938-ac42-3781d1290201-fernet-keys\") pod \"keystone-b64c64d55-q7zxm\" (UID: \"e5b08afc-bfe3-4938-ac42-3781d1290201\") " pod="openstack/keystone-b64c64d55-q7zxm" Dec 16 08:51:46 crc kubenswrapper[4823]: I1216 08:51:46.990444 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5b08afc-bfe3-4938-ac42-3781d1290201-scripts\") pod \"keystone-b64c64d55-q7zxm\" (UID: \"e5b08afc-bfe3-4938-ac42-3781d1290201\") " pod="openstack/keystone-b64c64d55-q7zxm" Dec 16 08:51:46 crc kubenswrapper[4823]: I1216 08:51:46.990493 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5b08afc-bfe3-4938-ac42-3781d1290201-combined-ca-bundle\") pod \"keystone-b64c64d55-q7zxm\" (UID: \"e5b08afc-bfe3-4938-ac42-3781d1290201\") " pod="openstack/keystone-b64c64d55-q7zxm" Dec 16 08:51:46 crc kubenswrapper[4823]: I1216 08:51:46.990537 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e5b08afc-bfe3-4938-ac42-3781d1290201-credential-keys\") pod \"keystone-b64c64d55-q7zxm\" (UID: \"e5b08afc-bfe3-4938-ac42-3781d1290201\") " pod="openstack/keystone-b64c64d55-q7zxm" Dec 16 08:51:46 crc kubenswrapper[4823]: I1216 08:51:46.990558 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5b08afc-bfe3-4938-ac42-3781d1290201-config-data\") pod \"keystone-b64c64d55-q7zxm\" (UID: \"e5b08afc-bfe3-4938-ac42-3781d1290201\") " pod="openstack/keystone-b64c64d55-q7zxm" Dec 16 08:51:46 crc kubenswrapper[4823]: I1216 08:51:46.990586 
4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5b08afc-bfe3-4938-ac42-3781d1290201-internal-tls-certs\") pod \"keystone-b64c64d55-q7zxm\" (UID: \"e5b08afc-bfe3-4938-ac42-3781d1290201\") " pod="openstack/keystone-b64c64d55-q7zxm" Dec 16 08:51:46 crc kubenswrapper[4823]: I1216 08:51:46.990610 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e5b08afc-bfe3-4938-ac42-3781d1290201-fernet-keys\") pod \"keystone-b64c64d55-q7zxm\" (UID: \"e5b08afc-bfe3-4938-ac42-3781d1290201\") " pod="openstack/keystone-b64c64d55-q7zxm" Dec 16 08:51:46 crc kubenswrapper[4823]: I1216 08:51:46.990658 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-465rw\" (UniqueName: \"kubernetes.io/projected/e5b08afc-bfe3-4938-ac42-3781d1290201-kube-api-access-465rw\") pod \"keystone-b64c64d55-q7zxm\" (UID: \"e5b08afc-bfe3-4938-ac42-3781d1290201\") " pod="openstack/keystone-b64c64d55-q7zxm" Dec 16 08:51:46 crc kubenswrapper[4823]: I1216 08:51:46.990679 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5b08afc-bfe3-4938-ac42-3781d1290201-public-tls-certs\") pod \"keystone-b64c64d55-q7zxm\" (UID: \"e5b08afc-bfe3-4938-ac42-3781d1290201\") " pod="openstack/keystone-b64c64d55-q7zxm" Dec 16 08:51:46 crc kubenswrapper[4823]: I1216 08:51:46.994500 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5b08afc-bfe3-4938-ac42-3781d1290201-scripts\") pod \"keystone-b64c64d55-q7zxm\" (UID: \"e5b08afc-bfe3-4938-ac42-3781d1290201\") " pod="openstack/keystone-b64c64d55-q7zxm" Dec 16 08:51:46 crc kubenswrapper[4823]: I1216 08:51:46.995492 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/e5b08afc-bfe3-4938-ac42-3781d1290201-public-tls-certs\") pod \"keystone-b64c64d55-q7zxm\" (UID: \"e5b08afc-bfe3-4938-ac42-3781d1290201\") " pod="openstack/keystone-b64c64d55-q7zxm" Dec 16 08:51:46 crc kubenswrapper[4823]: I1216 08:51:46.997190 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5b08afc-bfe3-4938-ac42-3781d1290201-config-data\") pod \"keystone-b64c64d55-q7zxm\" (UID: \"e5b08afc-bfe3-4938-ac42-3781d1290201\") " pod="openstack/keystone-b64c64d55-q7zxm" Dec 16 08:51:46 crc kubenswrapper[4823]: I1216 08:51:46.997593 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5b08afc-bfe3-4938-ac42-3781d1290201-internal-tls-certs\") pod \"keystone-b64c64d55-q7zxm\" (UID: \"e5b08afc-bfe3-4938-ac42-3781d1290201\") " pod="openstack/keystone-b64c64d55-q7zxm" Dec 16 08:51:46 crc kubenswrapper[4823]: I1216 08:51:46.997727 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e5b08afc-bfe3-4938-ac42-3781d1290201-credential-keys\") pod \"keystone-b64c64d55-q7zxm\" (UID: \"e5b08afc-bfe3-4938-ac42-3781d1290201\") " pod="openstack/keystone-b64c64d55-q7zxm" Dec 16 08:51:47 crc kubenswrapper[4823]: I1216 08:51:46.998699 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e5b08afc-bfe3-4938-ac42-3781d1290201-fernet-keys\") pod \"keystone-b64c64d55-q7zxm\" (UID: \"e5b08afc-bfe3-4938-ac42-3781d1290201\") " pod="openstack/keystone-b64c64d55-q7zxm" Dec 16 08:51:47 crc kubenswrapper[4823]: I1216 08:51:46.999245 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5b08afc-bfe3-4938-ac42-3781d1290201-combined-ca-bundle\") pod \"keystone-b64c64d55-q7zxm\" (UID: 
\"e5b08afc-bfe3-4938-ac42-3781d1290201\") " pod="openstack/keystone-b64c64d55-q7zxm" Dec 16 08:51:47 crc kubenswrapper[4823]: I1216 08:51:47.034733 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-465rw\" (UniqueName: \"kubernetes.io/projected/e5b08afc-bfe3-4938-ac42-3781d1290201-kube-api-access-465rw\") pod \"keystone-b64c64d55-q7zxm\" (UID: \"e5b08afc-bfe3-4938-ac42-3781d1290201\") " pod="openstack/keystone-b64c64d55-q7zxm" Dec 16 08:51:47 crc kubenswrapper[4823]: I1216 08:51:47.071860 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-b64c64d55-q7zxm" Dec 16 08:51:47 crc kubenswrapper[4823]: I1216 08:51:47.569276 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-b64c64d55-q7zxm"] Dec 16 08:51:47 crc kubenswrapper[4823]: I1216 08:51:47.671099 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-b64c64d55-q7zxm" event={"ID":"e5b08afc-bfe3-4938-ac42-3781d1290201","Type":"ContainerStarted","Data":"471946ead711d4cc0705475683dd476bb4290ee837b8b8eb8c47d83311087b06"} Dec 16 08:51:48 crc kubenswrapper[4823]: I1216 08:51:48.683249 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-b64c64d55-q7zxm" event={"ID":"e5b08afc-bfe3-4938-ac42-3781d1290201","Type":"ContainerStarted","Data":"5a71791d0d178cb3e2f0ca5f41f8f5be586775d58f1659933d5697c3e1b3e765"} Dec 16 08:51:48 crc kubenswrapper[4823]: I1216 08:51:48.685243 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-b64c64d55-q7zxm" Dec 16 08:51:48 crc kubenswrapper[4823]: I1216 08:51:48.725610 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-b64c64d55-q7zxm" podStartSLOduration=2.725574322 podStartE2EDuration="2.725574322s" podCreationTimestamp="2025-12-16 08:51:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 
+0000 UTC" observedRunningTime="2025-12-16 08:51:48.710332775 +0000 UTC m=+6987.198898938" watchObservedRunningTime="2025-12-16 08:51:48.725574322 +0000 UTC m=+6987.214140475" Dec 16 08:51:58 crc kubenswrapper[4823]: I1216 08:51:58.133584 4823 patch_prober.go:28] interesting pod/machine-config-daemon-fv56f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 08:51:58 crc kubenswrapper[4823]: I1216 08:51:58.134355 4823 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 08:52:18 crc kubenswrapper[4823]: I1216 08:52:18.838553 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-b64c64d55-q7zxm" Dec 16 08:52:22 crc kubenswrapper[4823]: I1216 08:52:22.145998 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Dec 16 08:52:22 crc kubenswrapper[4823]: I1216 08:52:22.150538 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 16 08:52:22 crc kubenswrapper[4823]: I1216 08:52:22.156521 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Dec 16 08:52:22 crc kubenswrapper[4823]: I1216 08:52:22.156683 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Dec 16 08:52:22 crc kubenswrapper[4823]: I1216 08:52:22.156801 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-5wggw" Dec 16 08:52:22 crc kubenswrapper[4823]: I1216 08:52:22.162265 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 16 08:52:22 crc kubenswrapper[4823]: I1216 08:52:22.220878 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de346601-4f73-4c1f-b1ce-900f0a74e925-combined-ca-bundle\") pod \"openstackclient\" (UID: \"de346601-4f73-4c1f-b1ce-900f0a74e925\") " pod="openstack/openstackclient" Dec 16 08:52:22 crc kubenswrapper[4823]: I1216 08:52:22.220964 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/de346601-4f73-4c1f-b1ce-900f0a74e925-openstack-config\") pod \"openstackclient\" (UID: \"de346601-4f73-4c1f-b1ce-900f0a74e925\") " pod="openstack/openstackclient" Dec 16 08:52:22 crc kubenswrapper[4823]: I1216 08:52:22.221078 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/de346601-4f73-4c1f-b1ce-900f0a74e925-openstack-config-secret\") pod \"openstackclient\" (UID: \"de346601-4f73-4c1f-b1ce-900f0a74e925\") " pod="openstack/openstackclient" Dec 16 08:52:22 crc kubenswrapper[4823]: I1216 08:52:22.221118 4823 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbz67\" (UniqueName: \"kubernetes.io/projected/de346601-4f73-4c1f-b1ce-900f0a74e925-kube-api-access-dbz67\") pod \"openstackclient\" (UID: \"de346601-4f73-4c1f-b1ce-900f0a74e925\") " pod="openstack/openstackclient" Dec 16 08:52:22 crc kubenswrapper[4823]: I1216 08:52:22.322416 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbz67\" (UniqueName: \"kubernetes.io/projected/de346601-4f73-4c1f-b1ce-900f0a74e925-kube-api-access-dbz67\") pod \"openstackclient\" (UID: \"de346601-4f73-4c1f-b1ce-900f0a74e925\") " pod="openstack/openstackclient" Dec 16 08:52:22 crc kubenswrapper[4823]: I1216 08:52:22.322525 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de346601-4f73-4c1f-b1ce-900f0a74e925-combined-ca-bundle\") pod \"openstackclient\" (UID: \"de346601-4f73-4c1f-b1ce-900f0a74e925\") " pod="openstack/openstackclient" Dec 16 08:52:22 crc kubenswrapper[4823]: I1216 08:52:22.322570 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/de346601-4f73-4c1f-b1ce-900f0a74e925-openstack-config\") pod \"openstackclient\" (UID: \"de346601-4f73-4c1f-b1ce-900f0a74e925\") " pod="openstack/openstackclient" Dec 16 08:52:22 crc kubenswrapper[4823]: I1216 08:52:22.322680 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/de346601-4f73-4c1f-b1ce-900f0a74e925-openstack-config-secret\") pod \"openstackclient\" (UID: \"de346601-4f73-4c1f-b1ce-900f0a74e925\") " pod="openstack/openstackclient" Dec 16 08:52:22 crc kubenswrapper[4823]: I1216 08:52:22.323625 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/de346601-4f73-4c1f-b1ce-900f0a74e925-openstack-config\") pod \"openstackclient\" (UID: \"de346601-4f73-4c1f-b1ce-900f0a74e925\") " pod="openstack/openstackclient" Dec 16 08:52:22 crc kubenswrapper[4823]: I1216 08:52:22.328633 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/de346601-4f73-4c1f-b1ce-900f0a74e925-openstack-config-secret\") pod \"openstackclient\" (UID: \"de346601-4f73-4c1f-b1ce-900f0a74e925\") " pod="openstack/openstackclient" Dec 16 08:52:22 crc kubenswrapper[4823]: I1216 08:52:22.329156 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de346601-4f73-4c1f-b1ce-900f0a74e925-combined-ca-bundle\") pod \"openstackclient\" (UID: \"de346601-4f73-4c1f-b1ce-900f0a74e925\") " pod="openstack/openstackclient" Dec 16 08:52:22 crc kubenswrapper[4823]: I1216 08:52:22.353410 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbz67\" (UniqueName: \"kubernetes.io/projected/de346601-4f73-4c1f-b1ce-900f0a74e925-kube-api-access-dbz67\") pod \"openstackclient\" (UID: \"de346601-4f73-4c1f-b1ce-900f0a74e925\") " pod="openstack/openstackclient" Dec 16 08:52:22 crc kubenswrapper[4823]: I1216 08:52:22.472539 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 16 08:52:22 crc kubenswrapper[4823]: I1216 08:52:22.912965 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 16 08:52:23 crc kubenswrapper[4823]: I1216 08:52:23.015591 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"de346601-4f73-4c1f-b1ce-900f0a74e925","Type":"ContainerStarted","Data":"80931f2bd8b372f7f9356062fafb0bb0cb06de601d0b305baeea7d02deddd298"} Dec 16 08:52:28 crc kubenswrapper[4823]: I1216 08:52:28.134452 4823 patch_prober.go:28] interesting pod/machine-config-daemon-fv56f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 08:52:28 crc kubenswrapper[4823]: I1216 08:52:28.135387 4823 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 08:52:34 crc kubenswrapper[4823]: I1216 08:52:34.102091 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"de346601-4f73-4c1f-b1ce-900f0a74e925","Type":"ContainerStarted","Data":"ba48e7ea9424f0cf0ee60830d8ced7eb24b70b2b28ac68f93e9f399544fc598c"} Dec 16 08:52:34 crc kubenswrapper[4823]: I1216 08:52:34.127055 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=1.202460925 podStartE2EDuration="12.127016947s" podCreationTimestamp="2025-12-16 08:52:22 +0000 UTC" firstStartedPulling="2025-12-16 08:52:22.91926127 +0000 UTC m=+7021.407827393" lastFinishedPulling="2025-12-16 08:52:33.843817292 +0000 UTC 
m=+7032.332383415" observedRunningTime="2025-12-16 08:52:34.119003037 +0000 UTC m=+7032.607569160" watchObservedRunningTime="2025-12-16 08:52:34.127016947 +0000 UTC m=+7032.615583070" Dec 16 08:52:58 crc kubenswrapper[4823]: I1216 08:52:58.133691 4823 patch_prober.go:28] interesting pod/machine-config-daemon-fv56f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 08:52:58 crc kubenswrapper[4823]: I1216 08:52:58.134320 4823 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 08:52:58 crc kubenswrapper[4823]: I1216 08:52:58.134373 4823 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" Dec 16 08:52:58 crc kubenswrapper[4823]: I1216 08:52:58.135207 4823 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9ce3e6cc66a3ba1f5a9f07614bbf78a449581b45707f8e1e5d9794f67e5c0428"} pod="openshift-machine-config-operator/machine-config-daemon-fv56f" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 16 08:52:58 crc kubenswrapper[4823]: I1216 08:52:58.135261 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" containerName="machine-config-daemon" containerID="cri-o://9ce3e6cc66a3ba1f5a9f07614bbf78a449581b45707f8e1e5d9794f67e5c0428" gracePeriod=600 Dec 16 08:52:58 crc 
kubenswrapper[4823]: E1216 08:52:58.256786 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 08:52:58 crc kubenswrapper[4823]: I1216 08:52:58.337142 4823 generic.go:334] "Generic (PLEG): container finished" podID="25dec47c-3043-486c-b371-2be103c214e3" containerID="9ce3e6cc66a3ba1f5a9f07614bbf78a449581b45707f8e1e5d9794f67e5c0428" exitCode=0 Dec 16 08:52:58 crc kubenswrapper[4823]: I1216 08:52:58.337183 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" event={"ID":"25dec47c-3043-486c-b371-2be103c214e3","Type":"ContainerDied","Data":"9ce3e6cc66a3ba1f5a9f07614bbf78a449581b45707f8e1e5d9794f67e5c0428"} Dec 16 08:52:58 crc kubenswrapper[4823]: I1216 08:52:58.337217 4823 scope.go:117] "RemoveContainer" containerID="e7aa677772e57f6515b9bb17d98c3eab3e0272b84b5b66258a7f5432c5f0835b" Dec 16 08:52:58 crc kubenswrapper[4823]: I1216 08:52:58.337813 4823 scope.go:117] "RemoveContainer" containerID="9ce3e6cc66a3ba1f5a9f07614bbf78a449581b45707f8e1e5d9794f67e5c0428" Dec 16 08:52:58 crc kubenswrapper[4823]: E1216 08:52:58.338094 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 08:53:09 crc kubenswrapper[4823]: I1216 08:53:09.772328 4823 scope.go:117] 
"RemoveContainer" containerID="9ce3e6cc66a3ba1f5a9f07614bbf78a449581b45707f8e1e5d9794f67e5c0428" Dec 16 08:53:09 crc kubenswrapper[4823]: E1216 08:53:09.773311 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 08:53:21 crc kubenswrapper[4823]: I1216 08:53:21.781595 4823 scope.go:117] "RemoveContainer" containerID="9ce3e6cc66a3ba1f5a9f07614bbf78a449581b45707f8e1e5d9794f67e5c0428" Dec 16 08:53:21 crc kubenswrapper[4823]: E1216 08:53:21.782505 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 08:53:33 crc kubenswrapper[4823]: I1216 08:53:33.776862 4823 scope.go:117] "RemoveContainer" containerID="9ce3e6cc66a3ba1f5a9f07614bbf78a449581b45707f8e1e5d9794f67e5c0428" Dec 16 08:53:33 crc kubenswrapper[4823]: E1216 08:53:33.778001 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 08:53:45 crc kubenswrapper[4823]: I1216 08:53:45.775353 
4823 scope.go:117] "RemoveContainer" containerID="9ce3e6cc66a3ba1f5a9f07614bbf78a449581b45707f8e1e5d9794f67e5c0428" Dec 16 08:53:45 crc kubenswrapper[4823]: E1216 08:53:45.777113 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 08:53:56 crc kubenswrapper[4823]: I1216 08:53:56.774801 4823 scope.go:117] "RemoveContainer" containerID="9ce3e6cc66a3ba1f5a9f07614bbf78a449581b45707f8e1e5d9794f67e5c0428" Dec 16 08:53:56 crc kubenswrapper[4823]: E1216 08:53:56.776801 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 08:53:57 crc kubenswrapper[4823]: I1216 08:53:57.770259 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-crw7r"] Dec 16 08:53:57 crc kubenswrapper[4823]: I1216 08:53:57.771974 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-crw7r" Dec 16 08:53:57 crc kubenswrapper[4823]: I1216 08:53:57.785473 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-crw7r"] Dec 16 08:53:57 crc kubenswrapper[4823]: I1216 08:53:57.873769 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-75d9-account-create-update-c2qxb"] Dec 16 08:53:57 crc kubenswrapper[4823]: I1216 08:53:57.874973 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-75d9-account-create-update-c2qxb" Dec 16 08:53:57 crc kubenswrapper[4823]: I1216 08:53:57.877329 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Dec 16 08:53:57 crc kubenswrapper[4823]: I1216 08:53:57.883149 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-75d9-account-create-update-c2qxb"] Dec 16 08:53:57 crc kubenswrapper[4823]: I1216 08:53:57.886102 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/37f2e0a9-8049-4cac-855c-7da22cf8c4fe-operator-scripts\") pod \"barbican-db-create-crw7r\" (UID: \"37f2e0a9-8049-4cac-855c-7da22cf8c4fe\") " pod="openstack/barbican-db-create-crw7r" Dec 16 08:53:57 crc kubenswrapper[4823]: I1216 08:53:57.886165 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phhs8\" (UniqueName: \"kubernetes.io/projected/37f2e0a9-8049-4cac-855c-7da22cf8c4fe-kube-api-access-phhs8\") pod \"barbican-db-create-crw7r\" (UID: \"37f2e0a9-8049-4cac-855c-7da22cf8c4fe\") " pod="openstack/barbican-db-create-crw7r" Dec 16 08:53:57 crc kubenswrapper[4823]: I1216 08:53:57.987625 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bdg4\" (UniqueName: 
\"kubernetes.io/projected/1e16ede9-a9c3-45f5-a5ff-beca730a92ff-kube-api-access-2bdg4\") pod \"barbican-75d9-account-create-update-c2qxb\" (UID: \"1e16ede9-a9c3-45f5-a5ff-beca730a92ff\") " pod="openstack/barbican-75d9-account-create-update-c2qxb" Dec 16 08:53:57 crc kubenswrapper[4823]: I1216 08:53:57.988097 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/37f2e0a9-8049-4cac-855c-7da22cf8c4fe-operator-scripts\") pod \"barbican-db-create-crw7r\" (UID: \"37f2e0a9-8049-4cac-855c-7da22cf8c4fe\") " pod="openstack/barbican-db-create-crw7r" Dec 16 08:53:57 crc kubenswrapper[4823]: I1216 08:53:57.988227 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phhs8\" (UniqueName: \"kubernetes.io/projected/37f2e0a9-8049-4cac-855c-7da22cf8c4fe-kube-api-access-phhs8\") pod \"barbican-db-create-crw7r\" (UID: \"37f2e0a9-8049-4cac-855c-7da22cf8c4fe\") " pod="openstack/barbican-db-create-crw7r" Dec 16 08:53:57 crc kubenswrapper[4823]: I1216 08:53:57.988340 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e16ede9-a9c3-45f5-a5ff-beca730a92ff-operator-scripts\") pod \"barbican-75d9-account-create-update-c2qxb\" (UID: \"1e16ede9-a9c3-45f5-a5ff-beca730a92ff\") " pod="openstack/barbican-75d9-account-create-update-c2qxb" Dec 16 08:53:57 crc kubenswrapper[4823]: I1216 08:53:57.988944 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/37f2e0a9-8049-4cac-855c-7da22cf8c4fe-operator-scripts\") pod \"barbican-db-create-crw7r\" (UID: \"37f2e0a9-8049-4cac-855c-7da22cf8c4fe\") " pod="openstack/barbican-db-create-crw7r" Dec 16 08:53:58 crc kubenswrapper[4823]: I1216 08:53:58.013949 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-phhs8\" (UniqueName: \"kubernetes.io/projected/37f2e0a9-8049-4cac-855c-7da22cf8c4fe-kube-api-access-phhs8\") pod \"barbican-db-create-crw7r\" (UID: \"37f2e0a9-8049-4cac-855c-7da22cf8c4fe\") " pod="openstack/barbican-db-create-crw7r" Dec 16 08:53:58 crc kubenswrapper[4823]: I1216 08:53:58.089598 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e16ede9-a9c3-45f5-a5ff-beca730a92ff-operator-scripts\") pod \"barbican-75d9-account-create-update-c2qxb\" (UID: \"1e16ede9-a9c3-45f5-a5ff-beca730a92ff\") " pod="openstack/barbican-75d9-account-create-update-c2qxb" Dec 16 08:53:58 crc kubenswrapper[4823]: I1216 08:53:58.089655 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2bdg4\" (UniqueName: \"kubernetes.io/projected/1e16ede9-a9c3-45f5-a5ff-beca730a92ff-kube-api-access-2bdg4\") pod \"barbican-75d9-account-create-update-c2qxb\" (UID: \"1e16ede9-a9c3-45f5-a5ff-beca730a92ff\") " pod="openstack/barbican-75d9-account-create-update-c2qxb" Dec 16 08:53:58 crc kubenswrapper[4823]: I1216 08:53:58.089813 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-crw7r" Dec 16 08:53:58 crc kubenswrapper[4823]: I1216 08:53:58.090944 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e16ede9-a9c3-45f5-a5ff-beca730a92ff-operator-scripts\") pod \"barbican-75d9-account-create-update-c2qxb\" (UID: \"1e16ede9-a9c3-45f5-a5ff-beca730a92ff\") " pod="openstack/barbican-75d9-account-create-update-c2qxb" Dec 16 08:53:58 crc kubenswrapper[4823]: I1216 08:53:58.124834 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bdg4\" (UniqueName: \"kubernetes.io/projected/1e16ede9-a9c3-45f5-a5ff-beca730a92ff-kube-api-access-2bdg4\") pod \"barbican-75d9-account-create-update-c2qxb\" (UID: \"1e16ede9-a9c3-45f5-a5ff-beca730a92ff\") " pod="openstack/barbican-75d9-account-create-update-c2qxb" Dec 16 08:53:58 crc kubenswrapper[4823]: I1216 08:53:58.196084 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-75d9-account-create-update-c2qxb" Dec 16 08:53:58 crc kubenswrapper[4823]: I1216 08:53:58.642304 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-crw7r"] Dec 16 08:53:58 crc kubenswrapper[4823]: I1216 08:53:58.689935 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-75d9-account-create-update-c2qxb"] Dec 16 08:53:58 crc kubenswrapper[4823]: W1216 08:53:58.698607 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1e16ede9_a9c3_45f5_a5ff_beca730a92ff.slice/crio-cacc56a6a9fc6e81afd361f2bf4e3fa87af04c9f488081e9fc6894be5aa029d1 WatchSource:0}: Error finding container cacc56a6a9fc6e81afd361f2bf4e3fa87af04c9f488081e9fc6894be5aa029d1: Status 404 returned error can't find the container with id cacc56a6a9fc6e81afd361f2bf4e3fa87af04c9f488081e9fc6894be5aa029d1 Dec 16 08:53:58 crc kubenswrapper[4823]: I1216 08:53:58.955803 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-crw7r" event={"ID":"37f2e0a9-8049-4cac-855c-7da22cf8c4fe","Type":"ContainerStarted","Data":"eb1881f55087cc05d1c5fc74b1ae9ab6d50cdac028c0ed81d2e80a4cb91eed7a"} Dec 16 08:53:58 crc kubenswrapper[4823]: I1216 08:53:58.955851 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-crw7r" event={"ID":"37f2e0a9-8049-4cac-855c-7da22cf8c4fe","Type":"ContainerStarted","Data":"07397ae087a4554549261e0b9f6c8745a4e13a4368b9edc795d960c9df1b5862"} Dec 16 08:53:58 crc kubenswrapper[4823]: I1216 08:53:58.958931 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-75d9-account-create-update-c2qxb" event={"ID":"1e16ede9-a9c3-45f5-a5ff-beca730a92ff","Type":"ContainerStarted","Data":"03172229973ebfa1778a194ddde03e0e3348d5dba5581e8e85812005e9dbde8a"} Dec 16 08:53:58 crc kubenswrapper[4823]: I1216 08:53:58.958960 4823 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/barbican-75d9-account-create-update-c2qxb" event={"ID":"1e16ede9-a9c3-45f5-a5ff-beca730a92ff","Type":"ContainerStarted","Data":"cacc56a6a9fc6e81afd361f2bf4e3fa87af04c9f488081e9fc6894be5aa029d1"} Dec 16 08:53:58 crc kubenswrapper[4823]: I1216 08:53:58.977137 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-create-crw7r" podStartSLOduration=1.9771089819999998 podStartE2EDuration="1.977108982s" podCreationTimestamp="2025-12-16 08:53:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 08:53:58.969088601 +0000 UTC m=+7117.457654734" watchObservedRunningTime="2025-12-16 08:53:58.977108982 +0000 UTC m=+7117.465675115" Dec 16 08:53:58 crc kubenswrapper[4823]: I1216 08:53:58.988863 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-75d9-account-create-update-c2qxb" podStartSLOduration=1.98884397 podStartE2EDuration="1.98884397s" podCreationTimestamp="2025-12-16 08:53:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 08:53:58.980987054 +0000 UTC m=+7117.469553187" watchObservedRunningTime="2025-12-16 08:53:58.98884397 +0000 UTC m=+7117.477410103" Dec 16 08:53:59 crc kubenswrapper[4823]: I1216 08:53:59.972551 4823 generic.go:334] "Generic (PLEG): container finished" podID="37f2e0a9-8049-4cac-855c-7da22cf8c4fe" containerID="eb1881f55087cc05d1c5fc74b1ae9ab6d50cdac028c0ed81d2e80a4cb91eed7a" exitCode=0 Dec 16 08:53:59 crc kubenswrapper[4823]: I1216 08:53:59.973018 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-crw7r" event={"ID":"37f2e0a9-8049-4cac-855c-7da22cf8c4fe","Type":"ContainerDied","Data":"eb1881f55087cc05d1c5fc74b1ae9ab6d50cdac028c0ed81d2e80a4cb91eed7a"} Dec 16 08:53:59 crc 
kubenswrapper[4823]: I1216 08:53:59.975532 4823 generic.go:334] "Generic (PLEG): container finished" podID="1e16ede9-a9c3-45f5-a5ff-beca730a92ff" containerID="03172229973ebfa1778a194ddde03e0e3348d5dba5581e8e85812005e9dbde8a" exitCode=0 Dec 16 08:53:59 crc kubenswrapper[4823]: I1216 08:53:59.975673 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-75d9-account-create-update-c2qxb" event={"ID":"1e16ede9-a9c3-45f5-a5ff-beca730a92ff","Type":"ContainerDied","Data":"03172229973ebfa1778a194ddde03e0e3348d5dba5581e8e85812005e9dbde8a"} Dec 16 08:54:01 crc kubenswrapper[4823]: I1216 08:54:01.399685 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-crw7r" Dec 16 08:54:01 crc kubenswrapper[4823]: I1216 08:54:01.409696 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-75d9-account-create-update-c2qxb" Dec 16 08:54:01 crc kubenswrapper[4823]: I1216 08:54:01.555264 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e16ede9-a9c3-45f5-a5ff-beca730a92ff-operator-scripts\") pod \"1e16ede9-a9c3-45f5-a5ff-beca730a92ff\" (UID: \"1e16ede9-a9c3-45f5-a5ff-beca730a92ff\") " Dec 16 08:54:01 crc kubenswrapper[4823]: I1216 08:54:01.555340 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2bdg4\" (UniqueName: \"kubernetes.io/projected/1e16ede9-a9c3-45f5-a5ff-beca730a92ff-kube-api-access-2bdg4\") pod \"1e16ede9-a9c3-45f5-a5ff-beca730a92ff\" (UID: \"1e16ede9-a9c3-45f5-a5ff-beca730a92ff\") " Dec 16 08:54:01 crc kubenswrapper[4823]: I1216 08:54:01.555466 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/37f2e0a9-8049-4cac-855c-7da22cf8c4fe-operator-scripts\") pod \"37f2e0a9-8049-4cac-855c-7da22cf8c4fe\" (UID: 
\"37f2e0a9-8049-4cac-855c-7da22cf8c4fe\") " Dec 16 08:54:01 crc kubenswrapper[4823]: I1216 08:54:01.555520 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-phhs8\" (UniqueName: \"kubernetes.io/projected/37f2e0a9-8049-4cac-855c-7da22cf8c4fe-kube-api-access-phhs8\") pod \"37f2e0a9-8049-4cac-855c-7da22cf8c4fe\" (UID: \"37f2e0a9-8049-4cac-855c-7da22cf8c4fe\") " Dec 16 08:54:01 crc kubenswrapper[4823]: I1216 08:54:01.555854 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e16ede9-a9c3-45f5-a5ff-beca730a92ff-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1e16ede9-a9c3-45f5-a5ff-beca730a92ff" (UID: "1e16ede9-a9c3-45f5-a5ff-beca730a92ff"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:54:01 crc kubenswrapper[4823]: I1216 08:54:01.556186 4823 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e16ede9-a9c3-45f5-a5ff-beca730a92ff-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 08:54:01 crc kubenswrapper[4823]: I1216 08:54:01.556552 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37f2e0a9-8049-4cac-855c-7da22cf8c4fe-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "37f2e0a9-8049-4cac-855c-7da22cf8c4fe" (UID: "37f2e0a9-8049-4cac-855c-7da22cf8c4fe"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:54:01 crc kubenswrapper[4823]: I1216 08:54:01.564331 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e16ede9-a9c3-45f5-a5ff-beca730a92ff-kube-api-access-2bdg4" (OuterVolumeSpecName: "kube-api-access-2bdg4") pod "1e16ede9-a9c3-45f5-a5ff-beca730a92ff" (UID: "1e16ede9-a9c3-45f5-a5ff-beca730a92ff"). InnerVolumeSpecName "kube-api-access-2bdg4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:54:01 crc kubenswrapper[4823]: I1216 08:54:01.564384 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37f2e0a9-8049-4cac-855c-7da22cf8c4fe-kube-api-access-phhs8" (OuterVolumeSpecName: "kube-api-access-phhs8") pod "37f2e0a9-8049-4cac-855c-7da22cf8c4fe" (UID: "37f2e0a9-8049-4cac-855c-7da22cf8c4fe"). InnerVolumeSpecName "kube-api-access-phhs8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:54:01 crc kubenswrapper[4823]: I1216 08:54:01.658250 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2bdg4\" (UniqueName: \"kubernetes.io/projected/1e16ede9-a9c3-45f5-a5ff-beca730a92ff-kube-api-access-2bdg4\") on node \"crc\" DevicePath \"\"" Dec 16 08:54:01 crc kubenswrapper[4823]: I1216 08:54:01.658590 4823 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/37f2e0a9-8049-4cac-855c-7da22cf8c4fe-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 08:54:01 crc kubenswrapper[4823]: I1216 08:54:01.658602 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-phhs8\" (UniqueName: \"kubernetes.io/projected/37f2e0a9-8049-4cac-855c-7da22cf8c4fe-kube-api-access-phhs8\") on node \"crc\" DevicePath \"\"" Dec 16 08:54:01 crc kubenswrapper[4823]: I1216 08:54:01.989489 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-crw7r" event={"ID":"37f2e0a9-8049-4cac-855c-7da22cf8c4fe","Type":"ContainerDied","Data":"07397ae087a4554549261e0b9f6c8745a4e13a4368b9edc795d960c9df1b5862"} Dec 16 08:54:01 crc kubenswrapper[4823]: I1216 08:54:01.989516 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-crw7r" Dec 16 08:54:01 crc kubenswrapper[4823]: I1216 08:54:01.989529 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="07397ae087a4554549261e0b9f6c8745a4e13a4368b9edc795d960c9df1b5862" Dec 16 08:54:01 crc kubenswrapper[4823]: I1216 08:54:01.996910 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-75d9-account-create-update-c2qxb" event={"ID":"1e16ede9-a9c3-45f5-a5ff-beca730a92ff","Type":"ContainerDied","Data":"cacc56a6a9fc6e81afd361f2bf4e3fa87af04c9f488081e9fc6894be5aa029d1"} Dec 16 08:54:01 crc kubenswrapper[4823]: I1216 08:54:01.996984 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cacc56a6a9fc6e81afd361f2bf4e3fa87af04c9f488081e9fc6894be5aa029d1" Dec 16 08:54:01 crc kubenswrapper[4823]: I1216 08:54:01.996981 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-75d9-account-create-update-c2qxb" Dec 16 08:54:03 crc kubenswrapper[4823]: I1216 08:54:03.158548 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-mcsnt"] Dec 16 08:54:03 crc kubenswrapper[4823]: E1216 08:54:03.158959 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e16ede9-a9c3-45f5-a5ff-beca730a92ff" containerName="mariadb-account-create-update" Dec 16 08:54:03 crc kubenswrapper[4823]: I1216 08:54:03.158975 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e16ede9-a9c3-45f5-a5ff-beca730a92ff" containerName="mariadb-account-create-update" Dec 16 08:54:03 crc kubenswrapper[4823]: E1216 08:54:03.158997 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37f2e0a9-8049-4cac-855c-7da22cf8c4fe" containerName="mariadb-database-create" Dec 16 08:54:03 crc kubenswrapper[4823]: I1216 08:54:03.159006 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="37f2e0a9-8049-4cac-855c-7da22cf8c4fe" 
containerName="mariadb-database-create" Dec 16 08:54:03 crc kubenswrapper[4823]: I1216 08:54:03.159231 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e16ede9-a9c3-45f5-a5ff-beca730a92ff" containerName="mariadb-account-create-update" Dec 16 08:54:03 crc kubenswrapper[4823]: I1216 08:54:03.159284 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="37f2e0a9-8049-4cac-855c-7da22cf8c4fe" containerName="mariadb-database-create" Dec 16 08:54:03 crc kubenswrapper[4823]: I1216 08:54:03.159904 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-mcsnt" Dec 16 08:54:03 crc kubenswrapper[4823]: I1216 08:54:03.167333 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 16 08:54:03 crc kubenswrapper[4823]: I1216 08:54:03.167679 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-8nlsx" Dec 16 08:54:03 crc kubenswrapper[4823]: I1216 08:54:03.186705 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-mcsnt"] Dec 16 08:54:03 crc kubenswrapper[4823]: I1216 08:54:03.286702 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a25b8be3-cf0e-4682-b7a2-b56f101a23e4-db-sync-config-data\") pod \"barbican-db-sync-mcsnt\" (UID: \"a25b8be3-cf0e-4682-b7a2-b56f101a23e4\") " pod="openstack/barbican-db-sync-mcsnt" Dec 16 08:54:03 crc kubenswrapper[4823]: I1216 08:54:03.287116 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a25b8be3-cf0e-4682-b7a2-b56f101a23e4-combined-ca-bundle\") pod \"barbican-db-sync-mcsnt\" (UID: \"a25b8be3-cf0e-4682-b7a2-b56f101a23e4\") " pod="openstack/barbican-db-sync-mcsnt" Dec 16 08:54:03 crc kubenswrapper[4823]: I1216 
08:54:03.287350 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbrz9\" (UniqueName: \"kubernetes.io/projected/a25b8be3-cf0e-4682-b7a2-b56f101a23e4-kube-api-access-xbrz9\") pod \"barbican-db-sync-mcsnt\" (UID: \"a25b8be3-cf0e-4682-b7a2-b56f101a23e4\") " pod="openstack/barbican-db-sync-mcsnt" Dec 16 08:54:03 crc kubenswrapper[4823]: I1216 08:54:03.389044 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a25b8be3-cf0e-4682-b7a2-b56f101a23e4-combined-ca-bundle\") pod \"barbican-db-sync-mcsnt\" (UID: \"a25b8be3-cf0e-4682-b7a2-b56f101a23e4\") " pod="openstack/barbican-db-sync-mcsnt" Dec 16 08:54:03 crc kubenswrapper[4823]: I1216 08:54:03.389112 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbrz9\" (UniqueName: \"kubernetes.io/projected/a25b8be3-cf0e-4682-b7a2-b56f101a23e4-kube-api-access-xbrz9\") pod \"barbican-db-sync-mcsnt\" (UID: \"a25b8be3-cf0e-4682-b7a2-b56f101a23e4\") " pod="openstack/barbican-db-sync-mcsnt" Dec 16 08:54:03 crc kubenswrapper[4823]: I1216 08:54:03.389141 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a25b8be3-cf0e-4682-b7a2-b56f101a23e4-db-sync-config-data\") pod \"barbican-db-sync-mcsnt\" (UID: \"a25b8be3-cf0e-4682-b7a2-b56f101a23e4\") " pod="openstack/barbican-db-sync-mcsnt" Dec 16 08:54:03 crc kubenswrapper[4823]: I1216 08:54:03.395061 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a25b8be3-cf0e-4682-b7a2-b56f101a23e4-db-sync-config-data\") pod \"barbican-db-sync-mcsnt\" (UID: \"a25b8be3-cf0e-4682-b7a2-b56f101a23e4\") " pod="openstack/barbican-db-sync-mcsnt" Dec 16 08:54:03 crc kubenswrapper[4823]: I1216 08:54:03.395382 4823 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a25b8be3-cf0e-4682-b7a2-b56f101a23e4-combined-ca-bundle\") pod \"barbican-db-sync-mcsnt\" (UID: \"a25b8be3-cf0e-4682-b7a2-b56f101a23e4\") " pod="openstack/barbican-db-sync-mcsnt" Dec 16 08:54:03 crc kubenswrapper[4823]: I1216 08:54:03.416055 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbrz9\" (UniqueName: \"kubernetes.io/projected/a25b8be3-cf0e-4682-b7a2-b56f101a23e4-kube-api-access-xbrz9\") pod \"barbican-db-sync-mcsnt\" (UID: \"a25b8be3-cf0e-4682-b7a2-b56f101a23e4\") " pod="openstack/barbican-db-sync-mcsnt" Dec 16 08:54:03 crc kubenswrapper[4823]: I1216 08:54:03.482129 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-mcsnt" Dec 16 08:54:03 crc kubenswrapper[4823]: I1216 08:54:03.947154 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-mcsnt"] Dec 16 08:54:04 crc kubenswrapper[4823]: I1216 08:54:04.013886 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-mcsnt" event={"ID":"a25b8be3-cf0e-4682-b7a2-b56f101a23e4","Type":"ContainerStarted","Data":"1af9b35fed6f7ab5b389b88538b97de69ffa94b520214b92eec0dd11f6644b7f"} Dec 16 08:54:09 crc kubenswrapper[4823]: I1216 08:54:09.069765 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-mcsnt" event={"ID":"a25b8be3-cf0e-4682-b7a2-b56f101a23e4","Type":"ContainerStarted","Data":"2f33ecc1c3af33c27544999c9f3531ef568874dfdbae7c32fb60eec269e16f5a"} Dec 16 08:54:09 crc kubenswrapper[4823]: I1216 08:54:09.087356 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-mcsnt" podStartSLOduration=1.394904827 podStartE2EDuration="6.087338902s" podCreationTimestamp="2025-12-16 08:54:03 +0000 UTC" firstStartedPulling="2025-12-16 08:54:03.950107742 +0000 UTC m=+7122.438673875" 
lastFinishedPulling="2025-12-16 08:54:08.642541827 +0000 UTC m=+7127.131107950" observedRunningTime="2025-12-16 08:54:09.086376312 +0000 UTC m=+7127.574942445" watchObservedRunningTime="2025-12-16 08:54:09.087338902 +0000 UTC m=+7127.575905035" Dec 16 08:54:10 crc kubenswrapper[4823]: I1216 08:54:10.771541 4823 scope.go:117] "RemoveContainer" containerID="9ce3e6cc66a3ba1f5a9f07614bbf78a449581b45707f8e1e5d9794f67e5c0428" Dec 16 08:54:10 crc kubenswrapper[4823]: E1216 08:54:10.772053 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 08:54:11 crc kubenswrapper[4823]: I1216 08:54:11.088085 4823 generic.go:334] "Generic (PLEG): container finished" podID="a25b8be3-cf0e-4682-b7a2-b56f101a23e4" containerID="2f33ecc1c3af33c27544999c9f3531ef568874dfdbae7c32fb60eec269e16f5a" exitCode=0 Dec 16 08:54:11 crc kubenswrapper[4823]: I1216 08:54:11.088128 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-mcsnt" event={"ID":"a25b8be3-cf0e-4682-b7a2-b56f101a23e4","Type":"ContainerDied","Data":"2f33ecc1c3af33c27544999c9f3531ef568874dfdbae7c32fb60eec269e16f5a"} Dec 16 08:54:12 crc kubenswrapper[4823]: I1216 08:54:12.540370 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-mcsnt" Dec 16 08:54:12 crc kubenswrapper[4823]: I1216 08:54:12.563670 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xbrz9\" (UniqueName: \"kubernetes.io/projected/a25b8be3-cf0e-4682-b7a2-b56f101a23e4-kube-api-access-xbrz9\") pod \"a25b8be3-cf0e-4682-b7a2-b56f101a23e4\" (UID: \"a25b8be3-cf0e-4682-b7a2-b56f101a23e4\") " Dec 16 08:54:12 crc kubenswrapper[4823]: I1216 08:54:12.563834 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a25b8be3-cf0e-4682-b7a2-b56f101a23e4-combined-ca-bundle\") pod \"a25b8be3-cf0e-4682-b7a2-b56f101a23e4\" (UID: \"a25b8be3-cf0e-4682-b7a2-b56f101a23e4\") " Dec 16 08:54:12 crc kubenswrapper[4823]: I1216 08:54:12.563885 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a25b8be3-cf0e-4682-b7a2-b56f101a23e4-db-sync-config-data\") pod \"a25b8be3-cf0e-4682-b7a2-b56f101a23e4\" (UID: \"a25b8be3-cf0e-4682-b7a2-b56f101a23e4\") " Dec 16 08:54:12 crc kubenswrapper[4823]: I1216 08:54:12.569381 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a25b8be3-cf0e-4682-b7a2-b56f101a23e4-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "a25b8be3-cf0e-4682-b7a2-b56f101a23e4" (UID: "a25b8be3-cf0e-4682-b7a2-b56f101a23e4"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:54:12 crc kubenswrapper[4823]: I1216 08:54:12.569561 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a25b8be3-cf0e-4682-b7a2-b56f101a23e4-kube-api-access-xbrz9" (OuterVolumeSpecName: "kube-api-access-xbrz9") pod "a25b8be3-cf0e-4682-b7a2-b56f101a23e4" (UID: "a25b8be3-cf0e-4682-b7a2-b56f101a23e4"). 
InnerVolumeSpecName "kube-api-access-xbrz9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:54:12 crc kubenswrapper[4823]: I1216 08:54:12.606618 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a25b8be3-cf0e-4682-b7a2-b56f101a23e4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a25b8be3-cf0e-4682-b7a2-b56f101a23e4" (UID: "a25b8be3-cf0e-4682-b7a2-b56f101a23e4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:54:12 crc kubenswrapper[4823]: I1216 08:54:12.666456 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xbrz9\" (UniqueName: \"kubernetes.io/projected/a25b8be3-cf0e-4682-b7a2-b56f101a23e4-kube-api-access-xbrz9\") on node \"crc\" DevicePath \"\"" Dec 16 08:54:12 crc kubenswrapper[4823]: I1216 08:54:12.666725 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a25b8be3-cf0e-4682-b7a2-b56f101a23e4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 08:54:12 crc kubenswrapper[4823]: I1216 08:54:12.666805 4823 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a25b8be3-cf0e-4682-b7a2-b56f101a23e4-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 08:54:13 crc kubenswrapper[4823]: I1216 08:54:13.112068 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-mcsnt" event={"ID":"a25b8be3-cf0e-4682-b7a2-b56f101a23e4","Type":"ContainerDied","Data":"1af9b35fed6f7ab5b389b88538b97de69ffa94b520214b92eec0dd11f6644b7f"} Dec 16 08:54:13 crc kubenswrapper[4823]: I1216 08:54:13.112129 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1af9b35fed6f7ab5b389b88538b97de69ffa94b520214b92eec0dd11f6644b7f" Dec 16 08:54:13 crc kubenswrapper[4823]: I1216 08:54:13.112603 4823 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-mcsnt" Dec 16 08:54:13 crc kubenswrapper[4823]: I1216 08:54:13.366755 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-fcf4dff7-84zz6"] Dec 16 08:54:13 crc kubenswrapper[4823]: E1216 08:54:13.367201 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a25b8be3-cf0e-4682-b7a2-b56f101a23e4" containerName="barbican-db-sync" Dec 16 08:54:13 crc kubenswrapper[4823]: I1216 08:54:13.367224 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="a25b8be3-cf0e-4682-b7a2-b56f101a23e4" containerName="barbican-db-sync" Dec 16 08:54:13 crc kubenswrapper[4823]: I1216 08:54:13.367439 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="a25b8be3-cf0e-4682-b7a2-b56f101a23e4" containerName="barbican-db-sync" Dec 16 08:54:13 crc kubenswrapper[4823]: I1216 08:54:13.368943 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-fcf4dff7-84zz6" Dec 16 08:54:13 crc kubenswrapper[4823]: I1216 08:54:13.372056 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Dec 16 08:54:13 crc kubenswrapper[4823]: I1216 08:54:13.372745 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 16 08:54:13 crc kubenswrapper[4823]: I1216 08:54:13.372750 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-8nlsx" Dec 16 08:54:13 crc kubenswrapper[4823]: I1216 08:54:13.381971 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/341f00a5-410a-4656-876e-a6b0cfe2a4df-config-data-custom\") pod \"barbican-worker-fcf4dff7-84zz6\" (UID: \"341f00a5-410a-4656-876e-a6b0cfe2a4df\") " pod="openstack/barbican-worker-fcf4dff7-84zz6" Dec 16 08:54:13 crc 
kubenswrapper[4823]: I1216 08:54:13.382156 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/341f00a5-410a-4656-876e-a6b0cfe2a4df-logs\") pod \"barbican-worker-fcf4dff7-84zz6\" (UID: \"341f00a5-410a-4656-876e-a6b0cfe2a4df\") " pod="openstack/barbican-worker-fcf4dff7-84zz6" Dec 16 08:54:13 crc kubenswrapper[4823]: I1216 08:54:13.382207 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngjlq\" (UniqueName: \"kubernetes.io/projected/341f00a5-410a-4656-876e-a6b0cfe2a4df-kube-api-access-ngjlq\") pod \"barbican-worker-fcf4dff7-84zz6\" (UID: \"341f00a5-410a-4656-876e-a6b0cfe2a4df\") " pod="openstack/barbican-worker-fcf4dff7-84zz6" Dec 16 08:54:13 crc kubenswrapper[4823]: I1216 08:54:13.382313 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/341f00a5-410a-4656-876e-a6b0cfe2a4df-config-data\") pod \"barbican-worker-fcf4dff7-84zz6\" (UID: \"341f00a5-410a-4656-876e-a6b0cfe2a4df\") " pod="openstack/barbican-worker-fcf4dff7-84zz6" Dec 16 08:54:13 crc kubenswrapper[4823]: I1216 08:54:13.382347 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/341f00a5-410a-4656-876e-a6b0cfe2a4df-combined-ca-bundle\") pod \"barbican-worker-fcf4dff7-84zz6\" (UID: \"341f00a5-410a-4656-876e-a6b0cfe2a4df\") " pod="openstack/barbican-worker-fcf4dff7-84zz6" Dec 16 08:54:13 crc kubenswrapper[4823]: I1216 08:54:13.446102 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-865d4cf8d6-bwj5n"] Dec 16 08:54:13 crc kubenswrapper[4823]: I1216 08:54:13.447922 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-865d4cf8d6-bwj5n" Dec 16 08:54:13 crc kubenswrapper[4823]: I1216 08:54:13.453308 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Dec 16 08:54:13 crc kubenswrapper[4823]: I1216 08:54:13.460060 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7f88b6949f-mhwpm"] Dec 16 08:54:13 crc kubenswrapper[4823]: I1216 08:54:13.461455 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f88b6949f-mhwpm" Dec 16 08:54:13 crc kubenswrapper[4823]: I1216 08:54:13.474085 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-fcf4dff7-84zz6"] Dec 16 08:54:13 crc kubenswrapper[4823]: I1216 08:54:13.484198 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/341f00a5-410a-4656-876e-a6b0cfe2a4df-config-data-custom\") pod \"barbican-worker-fcf4dff7-84zz6\" (UID: \"341f00a5-410a-4656-876e-a6b0cfe2a4df\") " pod="openstack/barbican-worker-fcf4dff7-84zz6" Dec 16 08:54:13 crc kubenswrapper[4823]: I1216 08:54:13.484303 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7a50033a-9a6e-42e3-ac23-de2a24654b0f-config-data-custom\") pod \"barbican-keystone-listener-865d4cf8d6-bwj5n\" (UID: \"7a50033a-9a6e-42e3-ac23-de2a24654b0f\") " pod="openstack/barbican-keystone-listener-865d4cf8d6-bwj5n" Dec 16 08:54:13 crc kubenswrapper[4823]: I1216 08:54:13.484346 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfk7d\" (UniqueName: \"kubernetes.io/projected/7a50033a-9a6e-42e3-ac23-de2a24654b0f-kube-api-access-nfk7d\") pod \"barbican-keystone-listener-865d4cf8d6-bwj5n\" (UID: 
\"7a50033a-9a6e-42e3-ac23-de2a24654b0f\") " pod="openstack/barbican-keystone-listener-865d4cf8d6-bwj5n" Dec 16 08:54:13 crc kubenswrapper[4823]: I1216 08:54:13.484377 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/341f00a5-410a-4656-876e-a6b0cfe2a4df-logs\") pod \"barbican-worker-fcf4dff7-84zz6\" (UID: \"341f00a5-410a-4656-876e-a6b0cfe2a4df\") " pod="openstack/barbican-worker-fcf4dff7-84zz6" Dec 16 08:54:13 crc kubenswrapper[4823]: I1216 08:54:13.484410 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4a97ec9a-cfd5-4824-bf42-5c29cafafb3d-ovsdbserver-sb\") pod \"dnsmasq-dns-7f88b6949f-mhwpm\" (UID: \"4a97ec9a-cfd5-4824-bf42-5c29cafafb3d\") " pod="openstack/dnsmasq-dns-7f88b6949f-mhwpm" Dec 16 08:54:13 crc kubenswrapper[4823]: I1216 08:54:13.484444 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngjlq\" (UniqueName: \"kubernetes.io/projected/341f00a5-410a-4656-876e-a6b0cfe2a4df-kube-api-access-ngjlq\") pod \"barbican-worker-fcf4dff7-84zz6\" (UID: \"341f00a5-410a-4656-876e-a6b0cfe2a4df\") " pod="openstack/barbican-worker-fcf4dff7-84zz6" Dec 16 08:54:13 crc kubenswrapper[4823]: I1216 08:54:13.484488 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4t2dw\" (UniqueName: \"kubernetes.io/projected/4a97ec9a-cfd5-4824-bf42-5c29cafafb3d-kube-api-access-4t2dw\") pod \"dnsmasq-dns-7f88b6949f-mhwpm\" (UID: \"4a97ec9a-cfd5-4824-bf42-5c29cafafb3d\") " pod="openstack/dnsmasq-dns-7f88b6949f-mhwpm" Dec 16 08:54:13 crc kubenswrapper[4823]: I1216 08:54:13.484529 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a50033a-9a6e-42e3-ac23-de2a24654b0f-logs\") pod 
\"barbican-keystone-listener-865d4cf8d6-bwj5n\" (UID: \"7a50033a-9a6e-42e3-ac23-de2a24654b0f\") " pod="openstack/barbican-keystone-listener-865d4cf8d6-bwj5n" Dec 16 08:54:13 crc kubenswrapper[4823]: I1216 08:54:13.484778 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a50033a-9a6e-42e3-ac23-de2a24654b0f-combined-ca-bundle\") pod \"barbican-keystone-listener-865d4cf8d6-bwj5n\" (UID: \"7a50033a-9a6e-42e3-ac23-de2a24654b0f\") " pod="openstack/barbican-keystone-listener-865d4cf8d6-bwj5n" Dec 16 08:54:13 crc kubenswrapper[4823]: I1216 08:54:13.484807 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/341f00a5-410a-4656-876e-a6b0cfe2a4df-config-data\") pod \"barbican-worker-fcf4dff7-84zz6\" (UID: \"341f00a5-410a-4656-876e-a6b0cfe2a4df\") " pod="openstack/barbican-worker-fcf4dff7-84zz6" Dec 16 08:54:13 crc kubenswrapper[4823]: I1216 08:54:13.484837 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/341f00a5-410a-4656-876e-a6b0cfe2a4df-combined-ca-bundle\") pod \"barbican-worker-fcf4dff7-84zz6\" (UID: \"341f00a5-410a-4656-876e-a6b0cfe2a4df\") " pod="openstack/barbican-worker-fcf4dff7-84zz6" Dec 16 08:54:13 crc kubenswrapper[4823]: I1216 08:54:13.484889 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a97ec9a-cfd5-4824-bf42-5c29cafafb3d-config\") pod \"dnsmasq-dns-7f88b6949f-mhwpm\" (UID: \"4a97ec9a-cfd5-4824-bf42-5c29cafafb3d\") " pod="openstack/dnsmasq-dns-7f88b6949f-mhwpm" Dec 16 08:54:13 crc kubenswrapper[4823]: I1216 08:54:13.484925 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/7a50033a-9a6e-42e3-ac23-de2a24654b0f-config-data\") pod \"barbican-keystone-listener-865d4cf8d6-bwj5n\" (UID: \"7a50033a-9a6e-42e3-ac23-de2a24654b0f\") " pod="openstack/barbican-keystone-listener-865d4cf8d6-bwj5n" Dec 16 08:54:13 crc kubenswrapper[4823]: I1216 08:54:13.484953 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4a97ec9a-cfd5-4824-bf42-5c29cafafb3d-dns-svc\") pod \"dnsmasq-dns-7f88b6949f-mhwpm\" (UID: \"4a97ec9a-cfd5-4824-bf42-5c29cafafb3d\") " pod="openstack/dnsmasq-dns-7f88b6949f-mhwpm" Dec 16 08:54:13 crc kubenswrapper[4823]: I1216 08:54:13.484977 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4a97ec9a-cfd5-4824-bf42-5c29cafafb3d-ovsdbserver-nb\") pod \"dnsmasq-dns-7f88b6949f-mhwpm\" (UID: \"4a97ec9a-cfd5-4824-bf42-5c29cafafb3d\") " pod="openstack/dnsmasq-dns-7f88b6949f-mhwpm" Dec 16 08:54:13 crc kubenswrapper[4823]: I1216 08:54:13.486894 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/341f00a5-410a-4656-876e-a6b0cfe2a4df-logs\") pod \"barbican-worker-fcf4dff7-84zz6\" (UID: \"341f00a5-410a-4656-876e-a6b0cfe2a4df\") " pod="openstack/barbican-worker-fcf4dff7-84zz6" Dec 16 08:54:13 crc kubenswrapper[4823]: I1216 08:54:13.489038 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f88b6949f-mhwpm"] Dec 16 08:54:13 crc kubenswrapper[4823]: I1216 08:54:13.503007 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/341f00a5-410a-4656-876e-a6b0cfe2a4df-config-data-custom\") pod \"barbican-worker-fcf4dff7-84zz6\" (UID: \"341f00a5-410a-4656-876e-a6b0cfe2a4df\") " pod="openstack/barbican-worker-fcf4dff7-84zz6" Dec 16 08:54:13 crc kubenswrapper[4823]: 
I1216 08:54:13.506044 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-865d4cf8d6-bwj5n"] Dec 16 08:54:13 crc kubenswrapper[4823]: I1216 08:54:13.516133 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngjlq\" (UniqueName: \"kubernetes.io/projected/341f00a5-410a-4656-876e-a6b0cfe2a4df-kube-api-access-ngjlq\") pod \"barbican-worker-fcf4dff7-84zz6\" (UID: \"341f00a5-410a-4656-876e-a6b0cfe2a4df\") " pod="openstack/barbican-worker-fcf4dff7-84zz6" Dec 16 08:54:13 crc kubenswrapper[4823]: I1216 08:54:13.533443 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/341f00a5-410a-4656-876e-a6b0cfe2a4df-combined-ca-bundle\") pod \"barbican-worker-fcf4dff7-84zz6\" (UID: \"341f00a5-410a-4656-876e-a6b0cfe2a4df\") " pod="openstack/barbican-worker-fcf4dff7-84zz6" Dec 16 08:54:13 crc kubenswrapper[4823]: I1216 08:54:13.535601 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/341f00a5-410a-4656-876e-a6b0cfe2a4df-config-data\") pod \"barbican-worker-fcf4dff7-84zz6\" (UID: \"341f00a5-410a-4656-876e-a6b0cfe2a4df\") " pod="openstack/barbican-worker-fcf4dff7-84zz6" Dec 16 08:54:13 crc kubenswrapper[4823]: I1216 08:54:13.591015 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4t2dw\" (UniqueName: \"kubernetes.io/projected/4a97ec9a-cfd5-4824-bf42-5c29cafafb3d-kube-api-access-4t2dw\") pod \"dnsmasq-dns-7f88b6949f-mhwpm\" (UID: \"4a97ec9a-cfd5-4824-bf42-5c29cafafb3d\") " pod="openstack/dnsmasq-dns-7f88b6949f-mhwpm" Dec 16 08:54:13 crc kubenswrapper[4823]: I1216 08:54:13.591540 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a50033a-9a6e-42e3-ac23-de2a24654b0f-logs\") pod \"barbican-keystone-listener-865d4cf8d6-bwj5n\" 
(UID: \"7a50033a-9a6e-42e3-ac23-de2a24654b0f\") " pod="openstack/barbican-keystone-listener-865d4cf8d6-bwj5n" Dec 16 08:54:13 crc kubenswrapper[4823]: I1216 08:54:13.591610 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a50033a-9a6e-42e3-ac23-de2a24654b0f-combined-ca-bundle\") pod \"barbican-keystone-listener-865d4cf8d6-bwj5n\" (UID: \"7a50033a-9a6e-42e3-ac23-de2a24654b0f\") " pod="openstack/barbican-keystone-listener-865d4cf8d6-bwj5n" Dec 16 08:54:13 crc kubenswrapper[4823]: I1216 08:54:13.591672 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a97ec9a-cfd5-4824-bf42-5c29cafafb3d-config\") pod \"dnsmasq-dns-7f88b6949f-mhwpm\" (UID: \"4a97ec9a-cfd5-4824-bf42-5c29cafafb3d\") " pod="openstack/dnsmasq-dns-7f88b6949f-mhwpm" Dec 16 08:54:13 crc kubenswrapper[4823]: I1216 08:54:13.591708 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a50033a-9a6e-42e3-ac23-de2a24654b0f-config-data\") pod \"barbican-keystone-listener-865d4cf8d6-bwj5n\" (UID: \"7a50033a-9a6e-42e3-ac23-de2a24654b0f\") " pod="openstack/barbican-keystone-listener-865d4cf8d6-bwj5n" Dec 16 08:54:13 crc kubenswrapper[4823]: I1216 08:54:13.591736 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4a97ec9a-cfd5-4824-bf42-5c29cafafb3d-dns-svc\") pod \"dnsmasq-dns-7f88b6949f-mhwpm\" (UID: \"4a97ec9a-cfd5-4824-bf42-5c29cafafb3d\") " pod="openstack/dnsmasq-dns-7f88b6949f-mhwpm" Dec 16 08:54:13 crc kubenswrapper[4823]: I1216 08:54:13.591755 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4a97ec9a-cfd5-4824-bf42-5c29cafafb3d-ovsdbserver-nb\") pod \"dnsmasq-dns-7f88b6949f-mhwpm\" (UID: 
\"4a97ec9a-cfd5-4824-bf42-5c29cafafb3d\") " pod="openstack/dnsmasq-dns-7f88b6949f-mhwpm" Dec 16 08:54:13 crc kubenswrapper[4823]: I1216 08:54:13.591835 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7a50033a-9a6e-42e3-ac23-de2a24654b0f-config-data-custom\") pod \"barbican-keystone-listener-865d4cf8d6-bwj5n\" (UID: \"7a50033a-9a6e-42e3-ac23-de2a24654b0f\") " pod="openstack/barbican-keystone-listener-865d4cf8d6-bwj5n" Dec 16 08:54:13 crc kubenswrapper[4823]: I1216 08:54:13.591867 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nfk7d\" (UniqueName: \"kubernetes.io/projected/7a50033a-9a6e-42e3-ac23-de2a24654b0f-kube-api-access-nfk7d\") pod \"barbican-keystone-listener-865d4cf8d6-bwj5n\" (UID: \"7a50033a-9a6e-42e3-ac23-de2a24654b0f\") " pod="openstack/barbican-keystone-listener-865d4cf8d6-bwj5n" Dec 16 08:54:13 crc kubenswrapper[4823]: I1216 08:54:13.591894 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4a97ec9a-cfd5-4824-bf42-5c29cafafb3d-ovsdbserver-sb\") pod \"dnsmasq-dns-7f88b6949f-mhwpm\" (UID: \"4a97ec9a-cfd5-4824-bf42-5c29cafafb3d\") " pod="openstack/dnsmasq-dns-7f88b6949f-mhwpm" Dec 16 08:54:13 crc kubenswrapper[4823]: I1216 08:54:13.592962 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4a97ec9a-cfd5-4824-bf42-5c29cafafb3d-ovsdbserver-sb\") pod \"dnsmasq-dns-7f88b6949f-mhwpm\" (UID: \"4a97ec9a-cfd5-4824-bf42-5c29cafafb3d\") " pod="openstack/dnsmasq-dns-7f88b6949f-mhwpm" Dec 16 08:54:13 crc kubenswrapper[4823]: I1216 08:54:13.593624 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a50033a-9a6e-42e3-ac23-de2a24654b0f-logs\") pod \"barbican-keystone-listener-865d4cf8d6-bwj5n\" 
(UID: \"7a50033a-9a6e-42e3-ac23-de2a24654b0f\") " pod="openstack/barbican-keystone-listener-865d4cf8d6-bwj5n" Dec 16 08:54:13 crc kubenswrapper[4823]: I1216 08:54:13.593982 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4a97ec9a-cfd5-4824-bf42-5c29cafafb3d-dns-svc\") pod \"dnsmasq-dns-7f88b6949f-mhwpm\" (UID: \"4a97ec9a-cfd5-4824-bf42-5c29cafafb3d\") " pod="openstack/dnsmasq-dns-7f88b6949f-mhwpm" Dec 16 08:54:13 crc kubenswrapper[4823]: I1216 08:54:13.594352 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a97ec9a-cfd5-4824-bf42-5c29cafafb3d-config\") pod \"dnsmasq-dns-7f88b6949f-mhwpm\" (UID: \"4a97ec9a-cfd5-4824-bf42-5c29cafafb3d\") " pod="openstack/dnsmasq-dns-7f88b6949f-mhwpm" Dec 16 08:54:13 crc kubenswrapper[4823]: I1216 08:54:13.595215 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4a97ec9a-cfd5-4824-bf42-5c29cafafb3d-ovsdbserver-nb\") pod \"dnsmasq-dns-7f88b6949f-mhwpm\" (UID: \"4a97ec9a-cfd5-4824-bf42-5c29cafafb3d\") " pod="openstack/dnsmasq-dns-7f88b6949f-mhwpm" Dec 16 08:54:13 crc kubenswrapper[4823]: I1216 08:54:13.598654 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a50033a-9a6e-42e3-ac23-de2a24654b0f-config-data\") pod \"barbican-keystone-listener-865d4cf8d6-bwj5n\" (UID: \"7a50033a-9a6e-42e3-ac23-de2a24654b0f\") " pod="openstack/barbican-keystone-listener-865d4cf8d6-bwj5n" Dec 16 08:54:13 crc kubenswrapper[4823]: I1216 08:54:13.615263 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a50033a-9a6e-42e3-ac23-de2a24654b0f-combined-ca-bundle\") pod \"barbican-keystone-listener-865d4cf8d6-bwj5n\" (UID: \"7a50033a-9a6e-42e3-ac23-de2a24654b0f\") " 
pod="openstack/barbican-keystone-listener-865d4cf8d6-bwj5n" Dec 16 08:54:13 crc kubenswrapper[4823]: I1216 08:54:13.630244 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7a50033a-9a6e-42e3-ac23-de2a24654b0f-config-data-custom\") pod \"barbican-keystone-listener-865d4cf8d6-bwj5n\" (UID: \"7a50033a-9a6e-42e3-ac23-de2a24654b0f\") " pod="openstack/barbican-keystone-listener-865d4cf8d6-bwj5n" Dec 16 08:54:13 crc kubenswrapper[4823]: I1216 08:54:13.636780 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4t2dw\" (UniqueName: \"kubernetes.io/projected/4a97ec9a-cfd5-4824-bf42-5c29cafafb3d-kube-api-access-4t2dw\") pod \"dnsmasq-dns-7f88b6949f-mhwpm\" (UID: \"4a97ec9a-cfd5-4824-bf42-5c29cafafb3d\") " pod="openstack/dnsmasq-dns-7f88b6949f-mhwpm" Dec 16 08:54:13 crc kubenswrapper[4823]: I1216 08:54:13.637336 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfk7d\" (UniqueName: \"kubernetes.io/projected/7a50033a-9a6e-42e3-ac23-de2a24654b0f-kube-api-access-nfk7d\") pod \"barbican-keystone-listener-865d4cf8d6-bwj5n\" (UID: \"7a50033a-9a6e-42e3-ac23-de2a24654b0f\") " pod="openstack/barbican-keystone-listener-865d4cf8d6-bwj5n" Dec 16 08:54:13 crc kubenswrapper[4823]: I1216 08:54:13.645685 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-68b49df968-2m2nd"] Dec 16 08:54:13 crc kubenswrapper[4823]: I1216 08:54:13.646997 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-68b49df968-2m2nd" Dec 16 08:54:13 crc kubenswrapper[4823]: I1216 08:54:13.651350 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Dec 16 08:54:13 crc kubenswrapper[4823]: I1216 08:54:13.693008 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5aaf506-cd69-4dd2-b380-e10b2a9ced6e-config-data\") pod \"barbican-api-68b49df968-2m2nd\" (UID: \"a5aaf506-cd69-4dd2-b380-e10b2a9ced6e\") " pod="openstack/barbican-api-68b49df968-2m2nd" Dec 16 08:54:13 crc kubenswrapper[4823]: I1216 08:54:13.693141 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5aaf506-cd69-4dd2-b380-e10b2a9ced6e-combined-ca-bundle\") pod \"barbican-api-68b49df968-2m2nd\" (UID: \"a5aaf506-cd69-4dd2-b380-e10b2a9ced6e\") " pod="openstack/barbican-api-68b49df968-2m2nd" Dec 16 08:54:13 crc kubenswrapper[4823]: I1216 08:54:13.693177 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xnlq\" (UniqueName: \"kubernetes.io/projected/a5aaf506-cd69-4dd2-b380-e10b2a9ced6e-kube-api-access-5xnlq\") pod \"barbican-api-68b49df968-2m2nd\" (UID: \"a5aaf506-cd69-4dd2-b380-e10b2a9ced6e\") " pod="openstack/barbican-api-68b49df968-2m2nd" Dec 16 08:54:13 crc kubenswrapper[4823]: I1216 08:54:13.693221 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a5aaf506-cd69-4dd2-b380-e10b2a9ced6e-logs\") pod \"barbican-api-68b49df968-2m2nd\" (UID: \"a5aaf506-cd69-4dd2-b380-e10b2a9ced6e\") " pod="openstack/barbican-api-68b49df968-2m2nd" Dec 16 08:54:13 crc kubenswrapper[4823]: I1216 08:54:13.693290 4823 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a5aaf506-cd69-4dd2-b380-e10b2a9ced6e-config-data-custom\") pod \"barbican-api-68b49df968-2m2nd\" (UID: \"a5aaf506-cd69-4dd2-b380-e10b2a9ced6e\") " pod="openstack/barbican-api-68b49df968-2m2nd" Dec 16 08:54:13 crc kubenswrapper[4823]: I1216 08:54:13.695525 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-fcf4dff7-84zz6" Dec 16 08:54:13 crc kubenswrapper[4823]: I1216 08:54:13.706357 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-68b49df968-2m2nd"] Dec 16 08:54:13 crc kubenswrapper[4823]: I1216 08:54:13.794881 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5aaf506-cd69-4dd2-b380-e10b2a9ced6e-config-data\") pod \"barbican-api-68b49df968-2m2nd\" (UID: \"a5aaf506-cd69-4dd2-b380-e10b2a9ced6e\") " pod="openstack/barbican-api-68b49df968-2m2nd" Dec 16 08:54:13 crc kubenswrapper[4823]: I1216 08:54:13.794952 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5aaf506-cd69-4dd2-b380-e10b2a9ced6e-combined-ca-bundle\") pod \"barbican-api-68b49df968-2m2nd\" (UID: \"a5aaf506-cd69-4dd2-b380-e10b2a9ced6e\") " pod="openstack/barbican-api-68b49df968-2m2nd" Dec 16 08:54:13 crc kubenswrapper[4823]: I1216 08:54:13.794977 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xnlq\" (UniqueName: \"kubernetes.io/projected/a5aaf506-cd69-4dd2-b380-e10b2a9ced6e-kube-api-access-5xnlq\") pod \"barbican-api-68b49df968-2m2nd\" (UID: \"a5aaf506-cd69-4dd2-b380-e10b2a9ced6e\") " pod="openstack/barbican-api-68b49df968-2m2nd" Dec 16 08:54:13 crc kubenswrapper[4823]: I1216 08:54:13.795002 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/a5aaf506-cd69-4dd2-b380-e10b2a9ced6e-logs\") pod \"barbican-api-68b49df968-2m2nd\" (UID: \"a5aaf506-cd69-4dd2-b380-e10b2a9ced6e\") " pod="openstack/barbican-api-68b49df968-2m2nd" Dec 16 08:54:13 crc kubenswrapper[4823]: I1216 08:54:13.795121 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a5aaf506-cd69-4dd2-b380-e10b2a9ced6e-config-data-custom\") pod \"barbican-api-68b49df968-2m2nd\" (UID: \"a5aaf506-cd69-4dd2-b380-e10b2a9ced6e\") " pod="openstack/barbican-api-68b49df968-2m2nd" Dec 16 08:54:13 crc kubenswrapper[4823]: I1216 08:54:13.797918 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a5aaf506-cd69-4dd2-b380-e10b2a9ced6e-logs\") pod \"barbican-api-68b49df968-2m2nd\" (UID: \"a5aaf506-cd69-4dd2-b380-e10b2a9ced6e\") " pod="openstack/barbican-api-68b49df968-2m2nd" Dec 16 08:54:13 crc kubenswrapper[4823]: I1216 08:54:13.806571 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a5aaf506-cd69-4dd2-b380-e10b2a9ced6e-config-data-custom\") pod \"barbican-api-68b49df968-2m2nd\" (UID: \"a5aaf506-cd69-4dd2-b380-e10b2a9ced6e\") " pod="openstack/barbican-api-68b49df968-2m2nd" Dec 16 08:54:13 crc kubenswrapper[4823]: I1216 08:54:13.806716 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5aaf506-cd69-4dd2-b380-e10b2a9ced6e-combined-ca-bundle\") pod \"barbican-api-68b49df968-2m2nd\" (UID: \"a5aaf506-cd69-4dd2-b380-e10b2a9ced6e\") " pod="openstack/barbican-api-68b49df968-2m2nd" Dec 16 08:54:13 crc kubenswrapper[4823]: I1216 08:54:13.807725 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/a5aaf506-cd69-4dd2-b380-e10b2a9ced6e-config-data\") pod \"barbican-api-68b49df968-2m2nd\" (UID: \"a5aaf506-cd69-4dd2-b380-e10b2a9ced6e\") " pod="openstack/barbican-api-68b49df968-2m2nd" Dec 16 08:54:13 crc kubenswrapper[4823]: I1216 08:54:13.815975 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xnlq\" (UniqueName: \"kubernetes.io/projected/a5aaf506-cd69-4dd2-b380-e10b2a9ced6e-kube-api-access-5xnlq\") pod \"barbican-api-68b49df968-2m2nd\" (UID: \"a5aaf506-cd69-4dd2-b380-e10b2a9ced6e\") " pod="openstack/barbican-api-68b49df968-2m2nd" Dec 16 08:54:13 crc kubenswrapper[4823]: I1216 08:54:13.873651 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-865d4cf8d6-bwj5n" Dec 16 08:54:13 crc kubenswrapper[4823]: I1216 08:54:13.885320 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f88b6949f-mhwpm" Dec 16 08:54:14 crc kubenswrapper[4823]: I1216 08:54:14.014619 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-68b49df968-2m2nd" Dec 16 08:54:14 crc kubenswrapper[4823]: I1216 08:54:14.180599 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-fcf4dff7-84zz6"] Dec 16 08:54:14 crc kubenswrapper[4823]: I1216 08:54:14.364359 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-865d4cf8d6-bwj5n"] Dec 16 08:54:14 crc kubenswrapper[4823]: W1216 08:54:14.372483 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7a50033a_9a6e_42e3_ac23_de2a24654b0f.slice/crio-ad028104b191f0524983431165c5f6dc3558f2981b9dea3f4cdb00404867668b WatchSource:0}: Error finding container ad028104b191f0524983431165c5f6dc3558f2981b9dea3f4cdb00404867668b: Status 404 returned error can't find the container with id ad028104b191f0524983431165c5f6dc3558f2981b9dea3f4cdb00404867668b Dec 16 08:54:14 crc kubenswrapper[4823]: I1216 08:54:14.495381 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f88b6949f-mhwpm"] Dec 16 08:54:14 crc kubenswrapper[4823]: W1216 08:54:14.511185 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4a97ec9a_cfd5_4824_bf42_5c29cafafb3d.slice/crio-01cb67d3598f41b1f0077f6e6ffddb26f86238c701930e2329f94ebbeb9fd627 WatchSource:0}: Error finding container 01cb67d3598f41b1f0077f6e6ffddb26f86238c701930e2329f94ebbeb9fd627: Status 404 returned error can't find the container with id 01cb67d3598f41b1f0077f6e6ffddb26f86238c701930e2329f94ebbeb9fd627 Dec 16 08:54:14 crc kubenswrapper[4823]: I1216 08:54:14.538390 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-68b49df968-2m2nd"] Dec 16 08:54:16 crc kubenswrapper[4823]: I1216 08:54:16.520741 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-68b49df968-2m2nd" 
event={"ID":"a5aaf506-cd69-4dd2-b380-e10b2a9ced6e","Type":"ContainerStarted","Data":"d0542466ab764e603a2bc04019d29da772ba052dd657feac88175b621290e70c"} Dec 16 08:54:16 crc kubenswrapper[4823]: I1216 08:54:16.521584 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-68b49df968-2m2nd" event={"ID":"a5aaf506-cd69-4dd2-b380-e10b2a9ced6e","Type":"ContainerStarted","Data":"2b9d3518202d07e3d786dc06eac4f3a92416086494068fec84b7b326fc0cbb70"} Dec 16 08:54:16 crc kubenswrapper[4823]: I1216 08:54:16.547664 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-865d4cf8d6-bwj5n" event={"ID":"7a50033a-9a6e-42e3-ac23-de2a24654b0f","Type":"ContainerStarted","Data":"ad028104b191f0524983431165c5f6dc3558f2981b9dea3f4cdb00404867668b"} Dec 16 08:54:16 crc kubenswrapper[4823]: I1216 08:54:16.598293 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-fcf4dff7-84zz6" event={"ID":"341f00a5-410a-4656-876e-a6b0cfe2a4df","Type":"ContainerStarted","Data":"953044d58368dd6dfc4f7bd3dae87ac1eba4376c14f5723e22ec576d8c8737ea"} Dec 16 08:54:16 crc kubenswrapper[4823]: I1216 08:54:16.638615 4823 generic.go:334] "Generic (PLEG): container finished" podID="4a97ec9a-cfd5-4824-bf42-5c29cafafb3d" containerID="65d5616a1a18f1328b5785f9a717e1f7ae833ab7c8e50472712c115dba352311" exitCode=0 Dec 16 08:54:16 crc kubenswrapper[4823]: I1216 08:54:16.638670 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f88b6949f-mhwpm" event={"ID":"4a97ec9a-cfd5-4824-bf42-5c29cafafb3d","Type":"ContainerDied","Data":"65d5616a1a18f1328b5785f9a717e1f7ae833ab7c8e50472712c115dba352311"} Dec 16 08:54:16 crc kubenswrapper[4823]: I1216 08:54:16.638713 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f88b6949f-mhwpm" event={"ID":"4a97ec9a-cfd5-4824-bf42-5c29cafafb3d","Type":"ContainerStarted","Data":"01cb67d3598f41b1f0077f6e6ffddb26f86238c701930e2329f94ebbeb9fd627"} 
Dec 16 08:54:17 crc kubenswrapper[4823]: I1216 08:54:17.162350 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-d656d958d-tmzmp"] Dec 16 08:54:17 crc kubenswrapper[4823]: I1216 08:54:17.164257 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-d656d958d-tmzmp" Dec 16 08:54:17 crc kubenswrapper[4823]: W1216 08:54:17.165903 4823 reflector.go:561] object-"openstack"/"cert-barbican-internal-svc": failed to list *v1.Secret: secrets "cert-barbican-internal-svc" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openstack": no relationship found between node 'crc' and this object Dec 16 08:54:17 crc kubenswrapper[4823]: E1216 08:54:17.165977 4823 reflector.go:158] "Unhandled Error" err="object-\"openstack\"/\"cert-barbican-internal-svc\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"cert-barbican-internal-svc\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openstack\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 16 08:54:17 crc kubenswrapper[4823]: W1216 08:54:17.166666 4823 reflector.go:561] object-"openstack"/"cert-barbican-public-svc": failed to list *v1.Secret: secrets "cert-barbican-public-svc" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openstack": no relationship found between node 'crc' and this object Dec 16 08:54:17 crc kubenswrapper[4823]: E1216 08:54:17.166706 4823 reflector.go:158] "Unhandled Error" err="object-\"openstack\"/\"cert-barbican-public-svc\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"cert-barbican-public-svc\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openstack\": no relationship found between node 'crc' and this object" logger="UnhandledError" 
Dec 16 08:54:17 crc kubenswrapper[4823]: I1216 08:54:17.181514 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-d656d958d-tmzmp"] Dec 16 08:54:17 crc kubenswrapper[4823]: I1216 08:54:17.288311 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0b8f2d6b-a3bc-4503-84f7-5d9eb1db17ec-public-tls-certs\") pod \"barbican-api-d656d958d-tmzmp\" (UID: \"0b8f2d6b-a3bc-4503-84f7-5d9eb1db17ec\") " pod="openstack/barbican-api-d656d958d-tmzmp" Dec 16 08:54:17 crc kubenswrapper[4823]: I1216 08:54:17.288365 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96vcg\" (UniqueName: \"kubernetes.io/projected/0b8f2d6b-a3bc-4503-84f7-5d9eb1db17ec-kube-api-access-96vcg\") pod \"barbican-api-d656d958d-tmzmp\" (UID: \"0b8f2d6b-a3bc-4503-84f7-5d9eb1db17ec\") " pod="openstack/barbican-api-d656d958d-tmzmp" Dec 16 08:54:17 crc kubenswrapper[4823]: I1216 08:54:17.288394 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0b8f2d6b-a3bc-4503-84f7-5d9eb1db17ec-internal-tls-certs\") pod \"barbican-api-d656d958d-tmzmp\" (UID: \"0b8f2d6b-a3bc-4503-84f7-5d9eb1db17ec\") " pod="openstack/barbican-api-d656d958d-tmzmp" Dec 16 08:54:17 crc kubenswrapper[4823]: I1216 08:54:17.288612 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b8f2d6b-a3bc-4503-84f7-5d9eb1db17ec-config-data\") pod \"barbican-api-d656d958d-tmzmp\" (UID: \"0b8f2d6b-a3bc-4503-84f7-5d9eb1db17ec\") " pod="openstack/barbican-api-d656d958d-tmzmp" Dec 16 08:54:17 crc kubenswrapper[4823]: I1216 08:54:17.288686 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/0b8f2d6b-a3bc-4503-84f7-5d9eb1db17ec-logs\") pod \"barbican-api-d656d958d-tmzmp\" (UID: \"0b8f2d6b-a3bc-4503-84f7-5d9eb1db17ec\") " pod="openstack/barbican-api-d656d958d-tmzmp" Dec 16 08:54:17 crc kubenswrapper[4823]: I1216 08:54:17.288773 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0b8f2d6b-a3bc-4503-84f7-5d9eb1db17ec-config-data-custom\") pod \"barbican-api-d656d958d-tmzmp\" (UID: \"0b8f2d6b-a3bc-4503-84f7-5d9eb1db17ec\") " pod="openstack/barbican-api-d656d958d-tmzmp" Dec 16 08:54:17 crc kubenswrapper[4823]: I1216 08:54:17.288820 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b8f2d6b-a3bc-4503-84f7-5d9eb1db17ec-combined-ca-bundle\") pod \"barbican-api-d656d958d-tmzmp\" (UID: \"0b8f2d6b-a3bc-4503-84f7-5d9eb1db17ec\") " pod="openstack/barbican-api-d656d958d-tmzmp" Dec 16 08:54:17 crc kubenswrapper[4823]: I1216 08:54:17.391132 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0b8f2d6b-a3bc-4503-84f7-5d9eb1db17ec-public-tls-certs\") pod \"barbican-api-d656d958d-tmzmp\" (UID: \"0b8f2d6b-a3bc-4503-84f7-5d9eb1db17ec\") " pod="openstack/barbican-api-d656d958d-tmzmp" Dec 16 08:54:17 crc kubenswrapper[4823]: I1216 08:54:17.391204 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96vcg\" (UniqueName: \"kubernetes.io/projected/0b8f2d6b-a3bc-4503-84f7-5d9eb1db17ec-kube-api-access-96vcg\") pod \"barbican-api-d656d958d-tmzmp\" (UID: \"0b8f2d6b-a3bc-4503-84f7-5d9eb1db17ec\") " pod="openstack/barbican-api-d656d958d-tmzmp" Dec 16 08:54:17 crc kubenswrapper[4823]: I1216 08:54:17.391232 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/0b8f2d6b-a3bc-4503-84f7-5d9eb1db17ec-internal-tls-certs\") pod \"barbican-api-d656d958d-tmzmp\" (UID: \"0b8f2d6b-a3bc-4503-84f7-5d9eb1db17ec\") " pod="openstack/barbican-api-d656d958d-tmzmp" Dec 16 08:54:17 crc kubenswrapper[4823]: I1216 08:54:17.391307 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b8f2d6b-a3bc-4503-84f7-5d9eb1db17ec-config-data\") pod \"barbican-api-d656d958d-tmzmp\" (UID: \"0b8f2d6b-a3bc-4503-84f7-5d9eb1db17ec\") " pod="openstack/barbican-api-d656d958d-tmzmp" Dec 16 08:54:17 crc kubenswrapper[4823]: I1216 08:54:17.391343 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0b8f2d6b-a3bc-4503-84f7-5d9eb1db17ec-logs\") pod \"barbican-api-d656d958d-tmzmp\" (UID: \"0b8f2d6b-a3bc-4503-84f7-5d9eb1db17ec\") " pod="openstack/barbican-api-d656d958d-tmzmp" Dec 16 08:54:17 crc kubenswrapper[4823]: I1216 08:54:17.391394 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0b8f2d6b-a3bc-4503-84f7-5d9eb1db17ec-config-data-custom\") pod \"barbican-api-d656d958d-tmzmp\" (UID: \"0b8f2d6b-a3bc-4503-84f7-5d9eb1db17ec\") " pod="openstack/barbican-api-d656d958d-tmzmp" Dec 16 08:54:17 crc kubenswrapper[4823]: I1216 08:54:17.391429 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b8f2d6b-a3bc-4503-84f7-5d9eb1db17ec-combined-ca-bundle\") pod \"barbican-api-d656d958d-tmzmp\" (UID: \"0b8f2d6b-a3bc-4503-84f7-5d9eb1db17ec\") " pod="openstack/barbican-api-d656d958d-tmzmp" Dec 16 08:54:17 crc kubenswrapper[4823]: I1216 08:54:17.392013 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0b8f2d6b-a3bc-4503-84f7-5d9eb1db17ec-logs\") pod 
\"barbican-api-d656d958d-tmzmp\" (UID: \"0b8f2d6b-a3bc-4503-84f7-5d9eb1db17ec\") " pod="openstack/barbican-api-d656d958d-tmzmp" Dec 16 08:54:17 crc kubenswrapper[4823]: I1216 08:54:17.398075 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b8f2d6b-a3bc-4503-84f7-5d9eb1db17ec-combined-ca-bundle\") pod \"barbican-api-d656d958d-tmzmp\" (UID: \"0b8f2d6b-a3bc-4503-84f7-5d9eb1db17ec\") " pod="openstack/barbican-api-d656d958d-tmzmp" Dec 16 08:54:17 crc kubenswrapper[4823]: I1216 08:54:17.398564 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0b8f2d6b-a3bc-4503-84f7-5d9eb1db17ec-config-data-custom\") pod \"barbican-api-d656d958d-tmzmp\" (UID: \"0b8f2d6b-a3bc-4503-84f7-5d9eb1db17ec\") " pod="openstack/barbican-api-d656d958d-tmzmp" Dec 16 08:54:17 crc kubenswrapper[4823]: I1216 08:54:17.401123 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b8f2d6b-a3bc-4503-84f7-5d9eb1db17ec-config-data\") pod \"barbican-api-d656d958d-tmzmp\" (UID: \"0b8f2d6b-a3bc-4503-84f7-5d9eb1db17ec\") " pod="openstack/barbican-api-d656d958d-tmzmp" Dec 16 08:54:17 crc kubenswrapper[4823]: I1216 08:54:17.410688 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96vcg\" (UniqueName: \"kubernetes.io/projected/0b8f2d6b-a3bc-4503-84f7-5d9eb1db17ec-kube-api-access-96vcg\") pod \"barbican-api-d656d958d-tmzmp\" (UID: \"0b8f2d6b-a3bc-4503-84f7-5d9eb1db17ec\") " pod="openstack/barbican-api-d656d958d-tmzmp" Dec 16 08:54:17 crc kubenswrapper[4823]: I1216 08:54:17.647471 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-68b49df968-2m2nd" event={"ID":"a5aaf506-cd69-4dd2-b380-e10b2a9ced6e","Type":"ContainerStarted","Data":"f8237626601dd2226317e496f0f9be869caf64243425acadebad454708d3b9c0"} Dec 16 08:54:17 
crc kubenswrapper[4823]: I1216 08:54:17.648202 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-68b49df968-2m2nd" Dec 16 08:54:17 crc kubenswrapper[4823]: I1216 08:54:17.984639 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Dec 16 08:54:17 crc kubenswrapper[4823]: I1216 08:54:17.999120 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0b8f2d6b-a3bc-4503-84f7-5d9eb1db17ec-internal-tls-certs\") pod \"barbican-api-d656d958d-tmzmp\" (UID: \"0b8f2d6b-a3bc-4503-84f7-5d9eb1db17ec\") " pod="openstack/barbican-api-d656d958d-tmzmp" Dec 16 08:54:18 crc kubenswrapper[4823]: E1216 08:54:18.392088 4823 secret.go:188] Couldn't get secret openstack/cert-barbican-public-svc: failed to sync secret cache: timed out waiting for the condition Dec 16 08:54:18 crc kubenswrapper[4823]: E1216 08:54:18.392798 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0b8f2d6b-a3bc-4503-84f7-5d9eb1db17ec-public-tls-certs podName:0b8f2d6b-a3bc-4503-84f7-5d9eb1db17ec nodeName:}" failed. No retries permitted until 2025-12-16 08:54:18.892744087 +0000 UTC m=+7137.381310210 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/0b8f2d6b-a3bc-4503-84f7-5d9eb1db17ec-public-tls-certs") pod "barbican-api-d656d958d-tmzmp" (UID: "0b8f2d6b-a3bc-4503-84f7-5d9eb1db17ec") : failed to sync secret cache: timed out waiting for the condition Dec 16 08:54:18 crc kubenswrapper[4823]: I1216 08:54:18.639661 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Dec 16 08:54:18 crc kubenswrapper[4823]: I1216 08:54:18.656165 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-865d4cf8d6-bwj5n" event={"ID":"7a50033a-9a6e-42e3-ac23-de2a24654b0f","Type":"ContainerStarted","Data":"721ddb3d721e21d50c0be2952ef296f0188553dcfea31e2f3a2d25c394c2d3b6"} Dec 16 08:54:18 crc kubenswrapper[4823]: I1216 08:54:18.656212 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-865d4cf8d6-bwj5n" event={"ID":"7a50033a-9a6e-42e3-ac23-de2a24654b0f","Type":"ContainerStarted","Data":"6153e939f78f8727294cd18b744eafe21fd2960278b20c36c46e082697f211e2"} Dec 16 08:54:18 crc kubenswrapper[4823]: I1216 08:54:18.660112 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-fcf4dff7-84zz6" event={"ID":"341f00a5-410a-4656-876e-a6b0cfe2a4df","Type":"ContainerStarted","Data":"43f2f25511680e01631c9fea0525d1784511e8f0fbc8bdc295206a3b91483591"} Dec 16 08:54:18 crc kubenswrapper[4823]: I1216 08:54:18.660167 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-fcf4dff7-84zz6" event={"ID":"341f00a5-410a-4656-876e-a6b0cfe2a4df","Type":"ContainerStarted","Data":"a872bdca1a55985b613b2e9d1e9a92fa37fb0ac195eba280c9f758c50de98937"} Dec 16 08:54:18 crc kubenswrapper[4823]: I1216 08:54:18.662313 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f88b6949f-mhwpm" 
event={"ID":"4a97ec9a-cfd5-4824-bf42-5c29cafafb3d","Type":"ContainerStarted","Data":"279887000a49b31ce5520a85a264f1dff04c4dbdf1c1ddc32ea6391952cfbe69"} Dec 16 08:54:18 crc kubenswrapper[4823]: I1216 08:54:18.662494 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-68b49df968-2m2nd" Dec 16 08:54:18 crc kubenswrapper[4823]: I1216 08:54:18.662623 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7f88b6949f-mhwpm" Dec 16 08:54:18 crc kubenswrapper[4823]: I1216 08:54:18.673590 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-68b49df968-2m2nd" podStartSLOduration=5.673574578 podStartE2EDuration="5.673574578s" podCreationTimestamp="2025-12-16 08:54:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 08:54:17.670414055 +0000 UTC m=+7136.158980168" watchObservedRunningTime="2025-12-16 08:54:18.673574578 +0000 UTC m=+7137.162140701" Dec 16 08:54:18 crc kubenswrapper[4823]: I1216 08:54:18.673834 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-865d4cf8d6-bwj5n" podStartSLOduration=2.181806289 podStartE2EDuration="5.673827916s" podCreationTimestamp="2025-12-16 08:54:13 +0000 UTC" firstStartedPulling="2025-12-16 08:54:14.375419985 +0000 UTC m=+7132.863986108" lastFinishedPulling="2025-12-16 08:54:17.867441612 +0000 UTC m=+7136.356007735" observedRunningTime="2025-12-16 08:54:18.673212056 +0000 UTC m=+7137.161778189" watchObservedRunningTime="2025-12-16 08:54:18.673827916 +0000 UTC m=+7137.162394039" Dec 16 08:54:18 crc kubenswrapper[4823]: I1216 08:54:18.701180 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7f88b6949f-mhwpm" podStartSLOduration=5.701160082 podStartE2EDuration="5.701160082s" podCreationTimestamp="2025-12-16 
08:54:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 08:54:18.695216386 +0000 UTC m=+7137.183782519" watchObservedRunningTime="2025-12-16 08:54:18.701160082 +0000 UTC m=+7137.189726205" Dec 16 08:54:18 crc kubenswrapper[4823]: I1216 08:54:18.734523 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-fcf4dff7-84zz6" podStartSLOduration=2.073765936 podStartE2EDuration="5.734502096s" podCreationTimestamp="2025-12-16 08:54:13 +0000 UTC" firstStartedPulling="2025-12-16 08:54:14.193998205 +0000 UTC m=+7132.682564328" lastFinishedPulling="2025-12-16 08:54:17.854734365 +0000 UTC m=+7136.343300488" observedRunningTime="2025-12-16 08:54:18.726541846 +0000 UTC m=+7137.215107969" watchObservedRunningTime="2025-12-16 08:54:18.734502096 +0000 UTC m=+7137.223068229" Dec 16 08:54:18 crc kubenswrapper[4823]: I1216 08:54:18.918601 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0b8f2d6b-a3bc-4503-84f7-5d9eb1db17ec-public-tls-certs\") pod \"barbican-api-d656d958d-tmzmp\" (UID: \"0b8f2d6b-a3bc-4503-84f7-5d9eb1db17ec\") " pod="openstack/barbican-api-d656d958d-tmzmp" Dec 16 08:54:18 crc kubenswrapper[4823]: I1216 08:54:18.934180 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0b8f2d6b-a3bc-4503-84f7-5d9eb1db17ec-public-tls-certs\") pod \"barbican-api-d656d958d-tmzmp\" (UID: \"0b8f2d6b-a3bc-4503-84f7-5d9eb1db17ec\") " pod="openstack/barbican-api-d656d958d-tmzmp" Dec 16 08:54:18 crc kubenswrapper[4823]: I1216 08:54:18.986088 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-d656d958d-tmzmp" Dec 16 08:54:19 crc kubenswrapper[4823]: I1216 08:54:19.666462 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-d656d958d-tmzmp"] Dec 16 08:54:20 crc kubenswrapper[4823]: I1216 08:54:20.700826 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-d656d958d-tmzmp" event={"ID":"0b8f2d6b-a3bc-4503-84f7-5d9eb1db17ec","Type":"ContainerStarted","Data":"9a306cbeecf35df7308d1553cc064c30c8abbe4a6a369ff751b3831a552d0f27"} Dec 16 08:54:20 crc kubenswrapper[4823]: I1216 08:54:20.701286 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-d656d958d-tmzmp" event={"ID":"0b8f2d6b-a3bc-4503-84f7-5d9eb1db17ec","Type":"ContainerStarted","Data":"c4f509080a9f88ea8968e728a43f48daa0e137e74d01c40c44bdd031bebe8a40"} Dec 16 08:54:20 crc kubenswrapper[4823]: I1216 08:54:20.701297 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-d656d958d-tmzmp" event={"ID":"0b8f2d6b-a3bc-4503-84f7-5d9eb1db17ec","Type":"ContainerStarted","Data":"a0d0509f348f194d87474c81cc58092e35c1372cd79eb70d60ee75b795c75b95"} Dec 16 08:54:20 crc kubenswrapper[4823]: I1216 08:54:20.701327 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-d656d958d-tmzmp" Dec 16 08:54:20 crc kubenswrapper[4823]: I1216 08:54:20.701346 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-d656d958d-tmzmp" Dec 16 08:54:20 crc kubenswrapper[4823]: I1216 08:54:20.733532 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-d656d958d-tmzmp" podStartSLOduration=3.731621505 podStartE2EDuration="3.731621505s" podCreationTimestamp="2025-12-16 08:54:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 08:54:20.72411295 +0000 UTC 
m=+7139.212679093" watchObservedRunningTime="2025-12-16 08:54:20.731621505 +0000 UTC m=+7139.220187628" Dec 16 08:54:20 crc kubenswrapper[4823]: I1216 08:54:20.767911 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-68b49df968-2m2nd" Dec 16 08:54:23 crc kubenswrapper[4823]: I1216 08:54:23.771600 4823 scope.go:117] "RemoveContainer" containerID="9ce3e6cc66a3ba1f5a9f07614bbf78a449581b45707f8e1e5d9794f67e5c0428" Dec 16 08:54:23 crc kubenswrapper[4823]: E1216 08:54:23.772315 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 08:54:23 crc kubenswrapper[4823]: I1216 08:54:23.893432 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7f88b6949f-mhwpm" Dec 16 08:54:23 crc kubenswrapper[4823]: I1216 08:54:23.959336 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7855b89597-4wv7t"] Dec 16 08:54:23 crc kubenswrapper[4823]: I1216 08:54:23.959616 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7855b89597-4wv7t" podUID="2a8c54db-60c1-4615-855d-33153bc4970d" containerName="dnsmasq-dns" containerID="cri-o://29b42b9afc72e4c0bd5c3085cbb2f6431cb628d9a171429a96e9c632452b8f65" gracePeriod=10 Dec 16 08:54:24 crc kubenswrapper[4823]: I1216 08:54:24.776577 4823 generic.go:334] "Generic (PLEG): container finished" podID="2a8c54db-60c1-4615-855d-33153bc4970d" containerID="29b42b9afc72e4c0bd5c3085cbb2f6431cb628d9a171429a96e9c632452b8f65" exitCode=0 Dec 16 08:54:24 crc kubenswrapper[4823]: I1216 08:54:24.776613 4823 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7855b89597-4wv7t" event={"ID":"2a8c54db-60c1-4615-855d-33153bc4970d","Type":"ContainerDied","Data":"29b42b9afc72e4c0bd5c3085cbb2f6431cb628d9a171429a96e9c632452b8f65"} Dec 16 08:54:24 crc kubenswrapper[4823]: I1216 08:54:24.947800 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7855b89597-4wv7t" Dec 16 08:54:24 crc kubenswrapper[4823]: I1216 08:54:24.993018 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vl5tr\" (UniqueName: \"kubernetes.io/projected/2a8c54db-60c1-4615-855d-33153bc4970d-kube-api-access-vl5tr\") pod \"2a8c54db-60c1-4615-855d-33153bc4970d\" (UID: \"2a8c54db-60c1-4615-855d-33153bc4970d\") " Dec 16 08:54:24 crc kubenswrapper[4823]: I1216 08:54:24.993115 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2a8c54db-60c1-4615-855d-33153bc4970d-ovsdbserver-sb\") pod \"2a8c54db-60c1-4615-855d-33153bc4970d\" (UID: \"2a8c54db-60c1-4615-855d-33153bc4970d\") " Dec 16 08:54:24 crc kubenswrapper[4823]: I1216 08:54:24.993185 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2a8c54db-60c1-4615-855d-33153bc4970d-ovsdbserver-nb\") pod \"2a8c54db-60c1-4615-855d-33153bc4970d\" (UID: \"2a8c54db-60c1-4615-855d-33153bc4970d\") " Dec 16 08:54:24 crc kubenswrapper[4823]: I1216 08:54:24.993225 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2a8c54db-60c1-4615-855d-33153bc4970d-dns-svc\") pod \"2a8c54db-60c1-4615-855d-33153bc4970d\" (UID: \"2a8c54db-60c1-4615-855d-33153bc4970d\") " Dec 16 08:54:24 crc kubenswrapper[4823]: I1216 08:54:24.993246 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/2a8c54db-60c1-4615-855d-33153bc4970d-config\") pod \"2a8c54db-60c1-4615-855d-33153bc4970d\" (UID: \"2a8c54db-60c1-4615-855d-33153bc4970d\") " Dec 16 08:54:25 crc kubenswrapper[4823]: I1216 08:54:25.013048 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a8c54db-60c1-4615-855d-33153bc4970d-kube-api-access-vl5tr" (OuterVolumeSpecName: "kube-api-access-vl5tr") pod "2a8c54db-60c1-4615-855d-33153bc4970d" (UID: "2a8c54db-60c1-4615-855d-33153bc4970d"). InnerVolumeSpecName "kube-api-access-vl5tr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:54:25 crc kubenswrapper[4823]: I1216 08:54:25.040468 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a8c54db-60c1-4615-855d-33153bc4970d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2a8c54db-60c1-4615-855d-33153bc4970d" (UID: "2a8c54db-60c1-4615-855d-33153bc4970d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:54:25 crc kubenswrapper[4823]: I1216 08:54:25.046413 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a8c54db-60c1-4615-855d-33153bc4970d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2a8c54db-60c1-4615-855d-33153bc4970d" (UID: "2a8c54db-60c1-4615-855d-33153bc4970d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:54:25 crc kubenswrapper[4823]: I1216 08:54:25.052732 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a8c54db-60c1-4615-855d-33153bc4970d-config" (OuterVolumeSpecName: "config") pod "2a8c54db-60c1-4615-855d-33153bc4970d" (UID: "2a8c54db-60c1-4615-855d-33153bc4970d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:54:25 crc kubenswrapper[4823]: I1216 08:54:25.070728 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a8c54db-60c1-4615-855d-33153bc4970d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2a8c54db-60c1-4615-855d-33153bc4970d" (UID: "2a8c54db-60c1-4615-855d-33153bc4970d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:54:25 crc kubenswrapper[4823]: I1216 08:54:25.095236 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vl5tr\" (UniqueName: \"kubernetes.io/projected/2a8c54db-60c1-4615-855d-33153bc4970d-kube-api-access-vl5tr\") on node \"crc\" DevicePath \"\"" Dec 16 08:54:25 crc kubenswrapper[4823]: I1216 08:54:25.095273 4823 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2a8c54db-60c1-4615-855d-33153bc4970d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 16 08:54:25 crc kubenswrapper[4823]: I1216 08:54:25.095286 4823 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2a8c54db-60c1-4615-855d-33153bc4970d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 16 08:54:25 crc kubenswrapper[4823]: I1216 08:54:25.095298 4823 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2a8c54db-60c1-4615-855d-33153bc4970d-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 16 08:54:25 crc kubenswrapper[4823]: I1216 08:54:25.095308 4823 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a8c54db-60c1-4615-855d-33153bc4970d-config\") on node \"crc\" DevicePath \"\"" Dec 16 08:54:25 crc kubenswrapper[4823]: I1216 08:54:25.758439 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/barbican-api-68b49df968-2m2nd" Dec 16 08:54:25 crc kubenswrapper[4823]: I1216 08:54:25.800810 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7855b89597-4wv7t" event={"ID":"2a8c54db-60c1-4615-855d-33153bc4970d","Type":"ContainerDied","Data":"971237a37a276a8fac544db84ad4632992ebcc0d8a26d2189c5ca3ec044a4753"} Dec 16 08:54:25 crc kubenswrapper[4823]: I1216 08:54:25.800883 4823 scope.go:117] "RemoveContainer" containerID="29b42b9afc72e4c0bd5c3085cbb2f6431cb628d9a171429a96e9c632452b8f65" Dec 16 08:54:25 crc kubenswrapper[4823]: I1216 08:54:25.800944 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7855b89597-4wv7t" Dec 16 08:54:25 crc kubenswrapper[4823]: I1216 08:54:25.839518 4823 scope.go:117] "RemoveContainer" containerID="1b60411fa736044173574aac51a7401327f9d71fdb73fa6ddff846144843b50a" Dec 16 08:54:25 crc kubenswrapper[4823]: I1216 08:54:25.913587 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7855b89597-4wv7t"] Dec 16 08:54:25 crc kubenswrapper[4823]: I1216 08:54:25.919082 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7855b89597-4wv7t"] Dec 16 08:54:27 crc kubenswrapper[4823]: I1216 08:54:27.787614 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a8c54db-60c1-4615-855d-33153bc4970d" path="/var/lib/kubelet/pods/2a8c54db-60c1-4615-855d-33153bc4970d/volumes" Dec 16 08:54:30 crc kubenswrapper[4823]: I1216 08:54:30.361877 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-d656d958d-tmzmp" Dec 16 08:54:30 crc kubenswrapper[4823]: I1216 08:54:30.464560 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-d656d958d-tmzmp" Dec 16 08:54:30 crc kubenswrapper[4823]: I1216 08:54:30.557198 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/barbican-api-68b49df968-2m2nd"] Dec 16 08:54:30 crc kubenswrapper[4823]: I1216 08:54:30.557506 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-68b49df968-2m2nd" podUID="a5aaf506-cd69-4dd2-b380-e10b2a9ced6e" containerName="barbican-api-log" containerID="cri-o://d0542466ab764e603a2bc04019d29da772ba052dd657feac88175b621290e70c" gracePeriod=30 Dec 16 08:54:30 crc kubenswrapper[4823]: I1216 08:54:30.557986 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-68b49df968-2m2nd" podUID="a5aaf506-cd69-4dd2-b380-e10b2a9ced6e" containerName="barbican-api" containerID="cri-o://f8237626601dd2226317e496f0f9be869caf64243425acadebad454708d3b9c0" gracePeriod=30 Dec 16 08:54:31 crc kubenswrapper[4823]: I1216 08:54:31.851756 4823 generic.go:334] "Generic (PLEG): container finished" podID="a5aaf506-cd69-4dd2-b380-e10b2a9ced6e" containerID="d0542466ab764e603a2bc04019d29da772ba052dd657feac88175b621290e70c" exitCode=143 Dec 16 08:54:31 crc kubenswrapper[4823]: I1216 08:54:31.851791 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-68b49df968-2m2nd" event={"ID":"a5aaf506-cd69-4dd2-b380-e10b2a9ced6e","Type":"ContainerDied","Data":"d0542466ab764e603a2bc04019d29da772ba052dd657feac88175b621290e70c"} Dec 16 08:54:33 crc kubenswrapper[4823]: I1216 08:54:33.889050 4823 generic.go:334] "Generic (PLEG): container finished" podID="a5aaf506-cd69-4dd2-b380-e10b2a9ced6e" containerID="f8237626601dd2226317e496f0f9be869caf64243425acadebad454708d3b9c0" exitCode=0 Dec 16 08:54:33 crc kubenswrapper[4823]: I1216 08:54:33.889100 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-68b49df968-2m2nd" event={"ID":"a5aaf506-cd69-4dd2-b380-e10b2a9ced6e","Type":"ContainerDied","Data":"f8237626601dd2226317e496f0f9be869caf64243425acadebad454708d3b9c0"} Dec 16 08:54:34 crc kubenswrapper[4823]: I1216 08:54:34.199072 4823 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-68b49df968-2m2nd" Dec 16 08:54:34 crc kubenswrapper[4823]: I1216 08:54:34.385509 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5xnlq\" (UniqueName: \"kubernetes.io/projected/a5aaf506-cd69-4dd2-b380-e10b2a9ced6e-kube-api-access-5xnlq\") pod \"a5aaf506-cd69-4dd2-b380-e10b2a9ced6e\" (UID: \"a5aaf506-cd69-4dd2-b380-e10b2a9ced6e\") " Dec 16 08:54:34 crc kubenswrapper[4823]: I1216 08:54:34.385665 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a5aaf506-cd69-4dd2-b380-e10b2a9ced6e-config-data-custom\") pod \"a5aaf506-cd69-4dd2-b380-e10b2a9ced6e\" (UID: \"a5aaf506-cd69-4dd2-b380-e10b2a9ced6e\") " Dec 16 08:54:34 crc kubenswrapper[4823]: I1216 08:54:34.385755 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5aaf506-cd69-4dd2-b380-e10b2a9ced6e-combined-ca-bundle\") pod \"a5aaf506-cd69-4dd2-b380-e10b2a9ced6e\" (UID: \"a5aaf506-cd69-4dd2-b380-e10b2a9ced6e\") " Dec 16 08:54:34 crc kubenswrapper[4823]: I1216 08:54:34.387047 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a5aaf506-cd69-4dd2-b380-e10b2a9ced6e-logs\") pod \"a5aaf506-cd69-4dd2-b380-e10b2a9ced6e\" (UID: \"a5aaf506-cd69-4dd2-b380-e10b2a9ced6e\") " Dec 16 08:54:34 crc kubenswrapper[4823]: I1216 08:54:34.387231 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5aaf506-cd69-4dd2-b380-e10b2a9ced6e-config-data\") pod \"a5aaf506-cd69-4dd2-b380-e10b2a9ced6e\" (UID: \"a5aaf506-cd69-4dd2-b380-e10b2a9ced6e\") " Dec 16 08:54:34 crc kubenswrapper[4823]: I1216 08:54:34.387624 4823 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/empty-dir/a5aaf506-cd69-4dd2-b380-e10b2a9ced6e-logs" (OuterVolumeSpecName: "logs") pod "a5aaf506-cd69-4dd2-b380-e10b2a9ced6e" (UID: "a5aaf506-cd69-4dd2-b380-e10b2a9ced6e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 08:54:34 crc kubenswrapper[4823]: I1216 08:54:34.388072 4823 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a5aaf506-cd69-4dd2-b380-e10b2a9ced6e-logs\") on node \"crc\" DevicePath \"\"" Dec 16 08:54:34 crc kubenswrapper[4823]: I1216 08:54:34.393602 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5aaf506-cd69-4dd2-b380-e10b2a9ced6e-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "a5aaf506-cd69-4dd2-b380-e10b2a9ced6e" (UID: "a5aaf506-cd69-4dd2-b380-e10b2a9ced6e"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:54:34 crc kubenswrapper[4823]: I1216 08:54:34.398332 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5aaf506-cd69-4dd2-b380-e10b2a9ced6e-kube-api-access-5xnlq" (OuterVolumeSpecName: "kube-api-access-5xnlq") pod "a5aaf506-cd69-4dd2-b380-e10b2a9ced6e" (UID: "a5aaf506-cd69-4dd2-b380-e10b2a9ced6e"). InnerVolumeSpecName "kube-api-access-5xnlq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:54:34 crc kubenswrapper[4823]: I1216 08:54:34.431225 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5aaf506-cd69-4dd2-b380-e10b2a9ced6e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a5aaf506-cd69-4dd2-b380-e10b2a9ced6e" (UID: "a5aaf506-cd69-4dd2-b380-e10b2a9ced6e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:54:34 crc kubenswrapper[4823]: I1216 08:54:34.466381 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5aaf506-cd69-4dd2-b380-e10b2a9ced6e-config-data" (OuterVolumeSpecName: "config-data") pod "a5aaf506-cd69-4dd2-b380-e10b2a9ced6e" (UID: "a5aaf506-cd69-4dd2-b380-e10b2a9ced6e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:54:34 crc kubenswrapper[4823]: I1216 08:54:34.490670 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5xnlq\" (UniqueName: \"kubernetes.io/projected/a5aaf506-cd69-4dd2-b380-e10b2a9ced6e-kube-api-access-5xnlq\") on node \"crc\" DevicePath \"\"" Dec 16 08:54:34 crc kubenswrapper[4823]: I1216 08:54:34.490757 4823 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a5aaf506-cd69-4dd2-b380-e10b2a9ced6e-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 16 08:54:34 crc kubenswrapper[4823]: I1216 08:54:34.490783 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5aaf506-cd69-4dd2-b380-e10b2a9ced6e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 08:54:34 crc kubenswrapper[4823]: I1216 08:54:34.490809 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5aaf506-cd69-4dd2-b380-e10b2a9ced6e-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 08:54:34 crc kubenswrapper[4823]: I1216 08:54:34.902771 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-68b49df968-2m2nd" event={"ID":"a5aaf506-cd69-4dd2-b380-e10b2a9ced6e","Type":"ContainerDied","Data":"2b9d3518202d07e3d786dc06eac4f3a92416086494068fec84b7b326fc0cbb70"} Dec 16 08:54:34 crc kubenswrapper[4823]: I1216 08:54:34.902920 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-68b49df968-2m2nd" Dec 16 08:54:34 crc kubenswrapper[4823]: I1216 08:54:34.902945 4823 scope.go:117] "RemoveContainer" containerID="f8237626601dd2226317e496f0f9be869caf64243425acadebad454708d3b9c0" Dec 16 08:54:34 crc kubenswrapper[4823]: I1216 08:54:34.932106 4823 scope.go:117] "RemoveContainer" containerID="d0542466ab764e603a2bc04019d29da772ba052dd657feac88175b621290e70c" Dec 16 08:54:34 crc kubenswrapper[4823]: I1216 08:54:34.961166 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-68b49df968-2m2nd"] Dec 16 08:54:34 crc kubenswrapper[4823]: I1216 08:54:34.973567 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-68b49df968-2m2nd"] Dec 16 08:54:35 crc kubenswrapper[4823]: I1216 08:54:35.793121 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5aaf506-cd69-4dd2-b380-e10b2a9ced6e" path="/var/lib/kubelet/pods/a5aaf506-cd69-4dd2-b380-e10b2a9ced6e/volumes" Dec 16 08:54:38 crc kubenswrapper[4823]: I1216 08:54:38.771205 4823 scope.go:117] "RemoveContainer" containerID="9ce3e6cc66a3ba1f5a9f07614bbf78a449581b45707f8e1e5d9794f67e5c0428" Dec 16 08:54:38 crc kubenswrapper[4823]: E1216 08:54:38.772183 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 08:54:39 crc kubenswrapper[4823]: I1216 08:54:39.016563 4823 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-68b49df968-2m2nd" podUID="a5aaf506-cd69-4dd2-b380-e10b2a9ced6e" containerName="barbican-api-log" probeResult="failure" output="Get 
\"http://10.217.1.39:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 16 08:54:39 crc kubenswrapper[4823]: I1216 08:54:39.016592 4823 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-68b49df968-2m2nd" podUID="a5aaf506-cd69-4dd2-b380-e10b2a9ced6e" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.1.39:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 16 08:54:47 crc kubenswrapper[4823]: I1216 08:54:47.450757 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-6pxbt"] Dec 16 08:54:47 crc kubenswrapper[4823]: E1216 08:54:47.451777 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5aaf506-cd69-4dd2-b380-e10b2a9ced6e" containerName="barbican-api" Dec 16 08:54:47 crc kubenswrapper[4823]: I1216 08:54:47.451797 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5aaf506-cd69-4dd2-b380-e10b2a9ced6e" containerName="barbican-api" Dec 16 08:54:47 crc kubenswrapper[4823]: E1216 08:54:47.451830 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5aaf506-cd69-4dd2-b380-e10b2a9ced6e" containerName="barbican-api-log" Dec 16 08:54:47 crc kubenswrapper[4823]: I1216 08:54:47.451838 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5aaf506-cd69-4dd2-b380-e10b2a9ced6e" containerName="barbican-api-log" Dec 16 08:54:47 crc kubenswrapper[4823]: E1216 08:54:47.451848 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a8c54db-60c1-4615-855d-33153bc4970d" containerName="dnsmasq-dns" Dec 16 08:54:47 crc kubenswrapper[4823]: I1216 08:54:47.451856 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a8c54db-60c1-4615-855d-33153bc4970d" containerName="dnsmasq-dns" Dec 16 08:54:47 crc kubenswrapper[4823]: E1216 08:54:47.451872 4823 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="2a8c54db-60c1-4615-855d-33153bc4970d" containerName="init" Dec 16 08:54:47 crc kubenswrapper[4823]: I1216 08:54:47.451879 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a8c54db-60c1-4615-855d-33153bc4970d" containerName="init" Dec 16 08:54:47 crc kubenswrapper[4823]: I1216 08:54:47.452119 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a8c54db-60c1-4615-855d-33153bc4970d" containerName="dnsmasq-dns" Dec 16 08:54:47 crc kubenswrapper[4823]: I1216 08:54:47.452136 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5aaf506-cd69-4dd2-b380-e10b2a9ced6e" containerName="barbican-api" Dec 16 08:54:47 crc kubenswrapper[4823]: I1216 08:54:47.452155 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5aaf506-cd69-4dd2-b380-e10b2a9ced6e" containerName="barbican-api-log" Dec 16 08:54:47 crc kubenswrapper[4823]: I1216 08:54:47.452804 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-6pxbt" Dec 16 08:54:47 crc kubenswrapper[4823]: I1216 08:54:47.459186 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-6pxbt"] Dec 16 08:54:47 crc kubenswrapper[4823]: I1216 08:54:47.559627 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-c394-account-create-update-2chx5"] Dec 16 08:54:47 crc kubenswrapper[4823]: I1216 08:54:47.560798 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-c394-account-create-update-2chx5" Dec 16 08:54:47 crc kubenswrapper[4823]: I1216 08:54:47.562745 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Dec 16 08:54:47 crc kubenswrapper[4823]: I1216 08:54:47.563179 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b2ee938f-c401-44b5-aaa8-40654d2217ea-operator-scripts\") pod \"neutron-db-create-6pxbt\" (UID: \"b2ee938f-c401-44b5-aaa8-40654d2217ea\") " pod="openstack/neutron-db-create-6pxbt" Dec 16 08:54:47 crc kubenswrapper[4823]: I1216 08:54:47.563223 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gxvf\" (UniqueName: \"kubernetes.io/projected/b2ee938f-c401-44b5-aaa8-40654d2217ea-kube-api-access-5gxvf\") pod \"neutron-db-create-6pxbt\" (UID: \"b2ee938f-c401-44b5-aaa8-40654d2217ea\") " pod="openstack/neutron-db-create-6pxbt" Dec 16 08:54:47 crc kubenswrapper[4823]: I1216 08:54:47.581300 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-c394-account-create-update-2chx5"] Dec 16 08:54:47 crc kubenswrapper[4823]: I1216 08:54:47.664641 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n72f2\" (UniqueName: \"kubernetes.io/projected/8a0cc5c4-4848-45e7-80b2-1f8bc385064b-kube-api-access-n72f2\") pod \"neutron-c394-account-create-update-2chx5\" (UID: \"8a0cc5c4-4848-45e7-80b2-1f8bc385064b\") " pod="openstack/neutron-c394-account-create-update-2chx5" Dec 16 08:54:47 crc kubenswrapper[4823]: I1216 08:54:47.664768 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b2ee938f-c401-44b5-aaa8-40654d2217ea-operator-scripts\") pod \"neutron-db-create-6pxbt\" (UID: 
\"b2ee938f-c401-44b5-aaa8-40654d2217ea\") " pod="openstack/neutron-db-create-6pxbt" Dec 16 08:54:47 crc kubenswrapper[4823]: I1216 08:54:47.664807 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5gxvf\" (UniqueName: \"kubernetes.io/projected/b2ee938f-c401-44b5-aaa8-40654d2217ea-kube-api-access-5gxvf\") pod \"neutron-db-create-6pxbt\" (UID: \"b2ee938f-c401-44b5-aaa8-40654d2217ea\") " pod="openstack/neutron-db-create-6pxbt" Dec 16 08:54:47 crc kubenswrapper[4823]: I1216 08:54:47.664882 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8a0cc5c4-4848-45e7-80b2-1f8bc385064b-operator-scripts\") pod \"neutron-c394-account-create-update-2chx5\" (UID: \"8a0cc5c4-4848-45e7-80b2-1f8bc385064b\") " pod="openstack/neutron-c394-account-create-update-2chx5" Dec 16 08:54:47 crc kubenswrapper[4823]: I1216 08:54:47.665403 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b2ee938f-c401-44b5-aaa8-40654d2217ea-operator-scripts\") pod \"neutron-db-create-6pxbt\" (UID: \"b2ee938f-c401-44b5-aaa8-40654d2217ea\") " pod="openstack/neutron-db-create-6pxbt" Dec 16 08:54:47 crc kubenswrapper[4823]: I1216 08:54:47.698504 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gxvf\" (UniqueName: \"kubernetes.io/projected/b2ee938f-c401-44b5-aaa8-40654d2217ea-kube-api-access-5gxvf\") pod \"neutron-db-create-6pxbt\" (UID: \"b2ee938f-c401-44b5-aaa8-40654d2217ea\") " pod="openstack/neutron-db-create-6pxbt" Dec 16 08:54:47 crc kubenswrapper[4823]: I1216 08:54:47.765959 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n72f2\" (UniqueName: \"kubernetes.io/projected/8a0cc5c4-4848-45e7-80b2-1f8bc385064b-kube-api-access-n72f2\") pod \"neutron-c394-account-create-update-2chx5\" (UID: 
\"8a0cc5c4-4848-45e7-80b2-1f8bc385064b\") " pod="openstack/neutron-c394-account-create-update-2chx5" Dec 16 08:54:47 crc kubenswrapper[4823]: I1216 08:54:47.766057 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8a0cc5c4-4848-45e7-80b2-1f8bc385064b-operator-scripts\") pod \"neutron-c394-account-create-update-2chx5\" (UID: \"8a0cc5c4-4848-45e7-80b2-1f8bc385064b\") " pod="openstack/neutron-c394-account-create-update-2chx5" Dec 16 08:54:47 crc kubenswrapper[4823]: I1216 08:54:47.766737 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8a0cc5c4-4848-45e7-80b2-1f8bc385064b-operator-scripts\") pod \"neutron-c394-account-create-update-2chx5\" (UID: \"8a0cc5c4-4848-45e7-80b2-1f8bc385064b\") " pod="openstack/neutron-c394-account-create-update-2chx5" Dec 16 08:54:47 crc kubenswrapper[4823]: I1216 08:54:47.783367 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-6pxbt" Dec 16 08:54:47 crc kubenswrapper[4823]: I1216 08:54:47.798962 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n72f2\" (UniqueName: \"kubernetes.io/projected/8a0cc5c4-4848-45e7-80b2-1f8bc385064b-kube-api-access-n72f2\") pod \"neutron-c394-account-create-update-2chx5\" (UID: \"8a0cc5c4-4848-45e7-80b2-1f8bc385064b\") " pod="openstack/neutron-c394-account-create-update-2chx5" Dec 16 08:54:47 crc kubenswrapper[4823]: I1216 08:54:47.874329 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-c394-account-create-update-2chx5" Dec 16 08:54:48 crc kubenswrapper[4823]: I1216 08:54:48.281307 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-6pxbt"] Dec 16 08:54:48 crc kubenswrapper[4823]: I1216 08:54:48.372768 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-c394-account-create-update-2chx5"] Dec 16 08:54:48 crc kubenswrapper[4823]: W1216 08:54:48.394682 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a0cc5c4_4848_45e7_80b2_1f8bc385064b.slice/crio-b5b93860231a29829a632079a1aa250809ca39412c3a243ef5e8d6fb63bea691 WatchSource:0}: Error finding container b5b93860231a29829a632079a1aa250809ca39412c3a243ef5e8d6fb63bea691: Status 404 returned error can't find the container with id b5b93860231a29829a632079a1aa250809ca39412c3a243ef5e8d6fb63bea691 Dec 16 08:54:49 crc kubenswrapper[4823]: I1216 08:54:49.035272 4823 generic.go:334] "Generic (PLEG): container finished" podID="8a0cc5c4-4848-45e7-80b2-1f8bc385064b" containerID="bf0a2dbbafb50fa3bc8becf59e754068301fe0b4d4c4db2712ebe2dd0cf00906" exitCode=0 Dec 16 08:54:49 crc kubenswrapper[4823]: I1216 08:54:49.035392 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-c394-account-create-update-2chx5" event={"ID":"8a0cc5c4-4848-45e7-80b2-1f8bc385064b","Type":"ContainerDied","Data":"bf0a2dbbafb50fa3bc8becf59e754068301fe0b4d4c4db2712ebe2dd0cf00906"} Dec 16 08:54:49 crc kubenswrapper[4823]: I1216 08:54:49.036928 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-c394-account-create-update-2chx5" event={"ID":"8a0cc5c4-4848-45e7-80b2-1f8bc385064b","Type":"ContainerStarted","Data":"b5b93860231a29829a632079a1aa250809ca39412c3a243ef5e8d6fb63bea691"} Dec 16 08:54:49 crc kubenswrapper[4823]: I1216 08:54:49.038743 4823 generic.go:334] "Generic (PLEG): container finished" 
podID="b2ee938f-c401-44b5-aaa8-40654d2217ea" containerID="384accd749147f19fd68cceb4a058774a1097c1202d7bd5ae02b4ddb42c1eecb" exitCode=0 Dec 16 08:54:49 crc kubenswrapper[4823]: I1216 08:54:49.038810 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-6pxbt" event={"ID":"b2ee938f-c401-44b5-aaa8-40654d2217ea","Type":"ContainerDied","Data":"384accd749147f19fd68cceb4a058774a1097c1202d7bd5ae02b4ddb42c1eecb"} Dec 16 08:54:49 crc kubenswrapper[4823]: I1216 08:54:49.038836 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-6pxbt" event={"ID":"b2ee938f-c401-44b5-aaa8-40654d2217ea","Type":"ContainerStarted","Data":"82ce7797d971f556264c8ad959186e68398af570c63d02ede93600f5ae0e3520"} Dec 16 08:54:50 crc kubenswrapper[4823]: I1216 08:54:50.407924 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-c394-account-create-update-2chx5" Dec 16 08:54:50 crc kubenswrapper[4823]: I1216 08:54:50.420709 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-6pxbt" Dec 16 08:54:50 crc kubenswrapper[4823]: I1216 08:54:50.530893 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b2ee938f-c401-44b5-aaa8-40654d2217ea-operator-scripts\") pod \"b2ee938f-c401-44b5-aaa8-40654d2217ea\" (UID: \"b2ee938f-c401-44b5-aaa8-40654d2217ea\") " Dec 16 08:54:50 crc kubenswrapper[4823]: I1216 08:54:50.530975 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n72f2\" (UniqueName: \"kubernetes.io/projected/8a0cc5c4-4848-45e7-80b2-1f8bc385064b-kube-api-access-n72f2\") pod \"8a0cc5c4-4848-45e7-80b2-1f8bc385064b\" (UID: \"8a0cc5c4-4848-45e7-80b2-1f8bc385064b\") " Dec 16 08:54:50 crc kubenswrapper[4823]: I1216 08:54:50.531080 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5gxvf\" (UniqueName: \"kubernetes.io/projected/b2ee938f-c401-44b5-aaa8-40654d2217ea-kube-api-access-5gxvf\") pod \"b2ee938f-c401-44b5-aaa8-40654d2217ea\" (UID: \"b2ee938f-c401-44b5-aaa8-40654d2217ea\") " Dec 16 08:54:50 crc kubenswrapper[4823]: I1216 08:54:50.531429 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8a0cc5c4-4848-45e7-80b2-1f8bc385064b-operator-scripts\") pod \"8a0cc5c4-4848-45e7-80b2-1f8bc385064b\" (UID: \"8a0cc5c4-4848-45e7-80b2-1f8bc385064b\") " Dec 16 08:54:50 crc kubenswrapper[4823]: I1216 08:54:50.531924 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b2ee938f-c401-44b5-aaa8-40654d2217ea-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b2ee938f-c401-44b5-aaa8-40654d2217ea" (UID: "b2ee938f-c401-44b5-aaa8-40654d2217ea"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:54:50 crc kubenswrapper[4823]: I1216 08:54:50.532417 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a0cc5c4-4848-45e7-80b2-1f8bc385064b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8a0cc5c4-4848-45e7-80b2-1f8bc385064b" (UID: "8a0cc5c4-4848-45e7-80b2-1f8bc385064b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:54:50 crc kubenswrapper[4823]: I1216 08:54:50.532546 4823 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b2ee938f-c401-44b5-aaa8-40654d2217ea-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 08:54:50 crc kubenswrapper[4823]: I1216 08:54:50.540732 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2ee938f-c401-44b5-aaa8-40654d2217ea-kube-api-access-5gxvf" (OuterVolumeSpecName: "kube-api-access-5gxvf") pod "b2ee938f-c401-44b5-aaa8-40654d2217ea" (UID: "b2ee938f-c401-44b5-aaa8-40654d2217ea"). InnerVolumeSpecName "kube-api-access-5gxvf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:54:50 crc kubenswrapper[4823]: I1216 08:54:50.541526 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a0cc5c4-4848-45e7-80b2-1f8bc385064b-kube-api-access-n72f2" (OuterVolumeSpecName: "kube-api-access-n72f2") pod "8a0cc5c4-4848-45e7-80b2-1f8bc385064b" (UID: "8a0cc5c4-4848-45e7-80b2-1f8bc385064b"). InnerVolumeSpecName "kube-api-access-n72f2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:54:50 crc kubenswrapper[4823]: I1216 08:54:50.633736 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n72f2\" (UniqueName: \"kubernetes.io/projected/8a0cc5c4-4848-45e7-80b2-1f8bc385064b-kube-api-access-n72f2\") on node \"crc\" DevicePath \"\"" Dec 16 08:54:50 crc kubenswrapper[4823]: I1216 08:54:50.633772 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5gxvf\" (UniqueName: \"kubernetes.io/projected/b2ee938f-c401-44b5-aaa8-40654d2217ea-kube-api-access-5gxvf\") on node \"crc\" DevicePath \"\"" Dec 16 08:54:50 crc kubenswrapper[4823]: I1216 08:54:50.633786 4823 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8a0cc5c4-4848-45e7-80b2-1f8bc385064b-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 08:54:50 crc kubenswrapper[4823]: I1216 08:54:50.772594 4823 scope.go:117] "RemoveContainer" containerID="9ce3e6cc66a3ba1f5a9f07614bbf78a449581b45707f8e1e5d9794f67e5c0428" Dec 16 08:54:50 crc kubenswrapper[4823]: E1216 08:54:50.772862 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 08:54:51 crc kubenswrapper[4823]: I1216 08:54:51.061249 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-6pxbt" event={"ID":"b2ee938f-c401-44b5-aaa8-40654d2217ea","Type":"ContainerDied","Data":"82ce7797d971f556264c8ad959186e68398af570c63d02ede93600f5ae0e3520"} Dec 16 08:54:51 crc kubenswrapper[4823]: I1216 08:54:51.061301 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-6pxbt" Dec 16 08:54:51 crc kubenswrapper[4823]: I1216 08:54:51.061317 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="82ce7797d971f556264c8ad959186e68398af570c63d02ede93600f5ae0e3520" Dec 16 08:54:51 crc kubenswrapper[4823]: I1216 08:54:51.064116 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-c394-account-create-update-2chx5" event={"ID":"8a0cc5c4-4848-45e7-80b2-1f8bc385064b","Type":"ContainerDied","Data":"b5b93860231a29829a632079a1aa250809ca39412c3a243ef5e8d6fb63bea691"} Dec 16 08:54:51 crc kubenswrapper[4823]: I1216 08:54:51.064185 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-c394-account-create-update-2chx5" Dec 16 08:54:51 crc kubenswrapper[4823]: I1216 08:54:51.064189 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b5b93860231a29829a632079a1aa250809ca39412c3a243ef5e8d6fb63bea691" Dec 16 08:54:52 crc kubenswrapper[4823]: I1216 08:54:52.869472 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-8wd5f"] Dec 16 08:54:52 crc kubenswrapper[4823]: E1216 08:54:52.870089 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a0cc5c4-4848-45e7-80b2-1f8bc385064b" containerName="mariadb-account-create-update" Dec 16 08:54:52 crc kubenswrapper[4823]: I1216 08:54:52.870113 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a0cc5c4-4848-45e7-80b2-1f8bc385064b" containerName="mariadb-account-create-update" Dec 16 08:54:52 crc kubenswrapper[4823]: E1216 08:54:52.870129 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2ee938f-c401-44b5-aaa8-40654d2217ea" containerName="mariadb-database-create" Dec 16 08:54:52 crc kubenswrapper[4823]: I1216 08:54:52.870135 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2ee938f-c401-44b5-aaa8-40654d2217ea" 
containerName="mariadb-database-create" Dec 16 08:54:52 crc kubenswrapper[4823]: I1216 08:54:52.870282 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2ee938f-c401-44b5-aaa8-40654d2217ea" containerName="mariadb-database-create" Dec 16 08:54:52 crc kubenswrapper[4823]: I1216 08:54:52.870299 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a0cc5c4-4848-45e7-80b2-1f8bc385064b" containerName="mariadb-account-create-update" Dec 16 08:54:52 crc kubenswrapper[4823]: I1216 08:54:52.870907 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-8wd5f" Dec 16 08:54:52 crc kubenswrapper[4823]: I1216 08:54:52.874961 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 16 08:54:52 crc kubenswrapper[4823]: I1216 08:54:52.875468 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 16 08:54:52 crc kubenswrapper[4823]: I1216 08:54:52.875588 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-25vch" Dec 16 08:54:52 crc kubenswrapper[4823]: I1216 08:54:52.880746 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-8wd5f"] Dec 16 08:54:52 crc kubenswrapper[4823]: I1216 08:54:52.974410 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chnwm\" (UniqueName: \"kubernetes.io/projected/891280f8-2910-413c-aab0-818e6ee7cc7c-kube-api-access-chnwm\") pod \"neutron-db-sync-8wd5f\" (UID: \"891280f8-2910-413c-aab0-818e6ee7cc7c\") " pod="openstack/neutron-db-sync-8wd5f" Dec 16 08:54:52 crc kubenswrapper[4823]: I1216 08:54:52.974487 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/891280f8-2910-413c-aab0-818e6ee7cc7c-combined-ca-bundle\") pod 
\"neutron-db-sync-8wd5f\" (UID: \"891280f8-2910-413c-aab0-818e6ee7cc7c\") " pod="openstack/neutron-db-sync-8wd5f" Dec 16 08:54:52 crc kubenswrapper[4823]: I1216 08:54:52.974826 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/891280f8-2910-413c-aab0-818e6ee7cc7c-config\") pod \"neutron-db-sync-8wd5f\" (UID: \"891280f8-2910-413c-aab0-818e6ee7cc7c\") " pod="openstack/neutron-db-sync-8wd5f" Dec 16 08:54:53 crc kubenswrapper[4823]: I1216 08:54:53.076682 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/891280f8-2910-413c-aab0-818e6ee7cc7c-config\") pod \"neutron-db-sync-8wd5f\" (UID: \"891280f8-2910-413c-aab0-818e6ee7cc7c\") " pod="openstack/neutron-db-sync-8wd5f" Dec 16 08:54:53 crc kubenswrapper[4823]: I1216 08:54:53.076755 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chnwm\" (UniqueName: \"kubernetes.io/projected/891280f8-2910-413c-aab0-818e6ee7cc7c-kube-api-access-chnwm\") pod \"neutron-db-sync-8wd5f\" (UID: \"891280f8-2910-413c-aab0-818e6ee7cc7c\") " pod="openstack/neutron-db-sync-8wd5f" Dec 16 08:54:53 crc kubenswrapper[4823]: I1216 08:54:53.076786 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/891280f8-2910-413c-aab0-818e6ee7cc7c-combined-ca-bundle\") pod \"neutron-db-sync-8wd5f\" (UID: \"891280f8-2910-413c-aab0-818e6ee7cc7c\") " pod="openstack/neutron-db-sync-8wd5f" Dec 16 08:54:53 crc kubenswrapper[4823]: I1216 08:54:53.083736 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/891280f8-2910-413c-aab0-818e6ee7cc7c-config\") pod \"neutron-db-sync-8wd5f\" (UID: \"891280f8-2910-413c-aab0-818e6ee7cc7c\") " pod="openstack/neutron-db-sync-8wd5f" Dec 16 08:54:53 crc 
kubenswrapper[4823]: I1216 08:54:53.083959 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/891280f8-2910-413c-aab0-818e6ee7cc7c-combined-ca-bundle\") pod \"neutron-db-sync-8wd5f\" (UID: \"891280f8-2910-413c-aab0-818e6ee7cc7c\") " pod="openstack/neutron-db-sync-8wd5f" Dec 16 08:54:53 crc kubenswrapper[4823]: I1216 08:54:53.107116 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chnwm\" (UniqueName: \"kubernetes.io/projected/891280f8-2910-413c-aab0-818e6ee7cc7c-kube-api-access-chnwm\") pod \"neutron-db-sync-8wd5f\" (UID: \"891280f8-2910-413c-aab0-818e6ee7cc7c\") " pod="openstack/neutron-db-sync-8wd5f" Dec 16 08:54:53 crc kubenswrapper[4823]: I1216 08:54:53.203664 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-8wd5f" Dec 16 08:54:53 crc kubenswrapper[4823]: I1216 08:54:53.818157 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-8wd5f"] Dec 16 08:54:54 crc kubenswrapper[4823]: I1216 08:54:54.090139 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-8wd5f" event={"ID":"891280f8-2910-413c-aab0-818e6ee7cc7c","Type":"ContainerStarted","Data":"df0ad64ecd92e4d3376daa89ce790d37811dc4d8cb6c83f872e6c2e122995705"} Dec 16 08:54:55 crc kubenswrapper[4823]: I1216 08:54:55.099187 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-8wd5f" event={"ID":"891280f8-2910-413c-aab0-818e6ee7cc7c","Type":"ContainerStarted","Data":"4f41f29f42bb5b92497e6cdca74dd5b4d9e606a6e5a1ae0e5dc9cb55b58b1a17"} Dec 16 08:54:55 crc kubenswrapper[4823]: I1216 08:54:55.122209 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-8wd5f" podStartSLOduration=3.122190839 podStartE2EDuration="3.122190839s" podCreationTimestamp="2025-12-16 08:54:52 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 08:54:55.114922621 +0000 UTC m=+7173.603488754" watchObservedRunningTime="2025-12-16 08:54:55.122190839 +0000 UTC m=+7173.610756952" Dec 16 08:54:58 crc kubenswrapper[4823]: I1216 08:54:58.131491 4823 generic.go:334] "Generic (PLEG): container finished" podID="891280f8-2910-413c-aab0-818e6ee7cc7c" containerID="4f41f29f42bb5b92497e6cdca74dd5b4d9e606a6e5a1ae0e5dc9cb55b58b1a17" exitCode=0 Dec 16 08:54:58 crc kubenswrapper[4823]: I1216 08:54:58.131597 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-8wd5f" event={"ID":"891280f8-2910-413c-aab0-818e6ee7cc7c","Type":"ContainerDied","Data":"4f41f29f42bb5b92497e6cdca74dd5b4d9e606a6e5a1ae0e5dc9cb55b58b1a17"} Dec 16 08:54:59 crc kubenswrapper[4823]: I1216 08:54:59.532916 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-8wd5f" Dec 16 08:54:59 crc kubenswrapper[4823]: I1216 08:54:59.699484 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/891280f8-2910-413c-aab0-818e6ee7cc7c-combined-ca-bundle\") pod \"891280f8-2910-413c-aab0-818e6ee7cc7c\" (UID: \"891280f8-2910-413c-aab0-818e6ee7cc7c\") " Dec 16 08:54:59 crc kubenswrapper[4823]: I1216 08:54:59.699562 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/891280f8-2910-413c-aab0-818e6ee7cc7c-config\") pod \"891280f8-2910-413c-aab0-818e6ee7cc7c\" (UID: \"891280f8-2910-413c-aab0-818e6ee7cc7c\") " Dec 16 08:54:59 crc kubenswrapper[4823]: I1216 08:54:59.699852 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-chnwm\" (UniqueName: \"kubernetes.io/projected/891280f8-2910-413c-aab0-818e6ee7cc7c-kube-api-access-chnwm\") pod 
\"891280f8-2910-413c-aab0-818e6ee7cc7c\" (UID: \"891280f8-2910-413c-aab0-818e6ee7cc7c\") " Dec 16 08:54:59 crc kubenswrapper[4823]: I1216 08:54:59.707291 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/891280f8-2910-413c-aab0-818e6ee7cc7c-kube-api-access-chnwm" (OuterVolumeSpecName: "kube-api-access-chnwm") pod "891280f8-2910-413c-aab0-818e6ee7cc7c" (UID: "891280f8-2910-413c-aab0-818e6ee7cc7c"). InnerVolumeSpecName "kube-api-access-chnwm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:54:59 crc kubenswrapper[4823]: I1216 08:54:59.732451 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/891280f8-2910-413c-aab0-818e6ee7cc7c-config" (OuterVolumeSpecName: "config") pod "891280f8-2910-413c-aab0-818e6ee7cc7c" (UID: "891280f8-2910-413c-aab0-818e6ee7cc7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:54:59 crc kubenswrapper[4823]: I1216 08:54:59.749210 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/891280f8-2910-413c-aab0-818e6ee7cc7c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "891280f8-2910-413c-aab0-818e6ee7cc7c" (UID: "891280f8-2910-413c-aab0-818e6ee7cc7c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:54:59 crc kubenswrapper[4823]: I1216 08:54:59.802159 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-chnwm\" (UniqueName: \"kubernetes.io/projected/891280f8-2910-413c-aab0-818e6ee7cc7c-kube-api-access-chnwm\") on node \"crc\" DevicePath \"\"" Dec 16 08:54:59 crc kubenswrapper[4823]: I1216 08:54:59.802211 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/891280f8-2910-413c-aab0-818e6ee7cc7c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 08:54:59 crc kubenswrapper[4823]: I1216 08:54:59.802230 4823 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/891280f8-2910-413c-aab0-818e6ee7cc7c-config\") on node \"crc\" DevicePath \"\"" Dec 16 08:55:00 crc kubenswrapper[4823]: I1216 08:55:00.154969 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-8wd5f" event={"ID":"891280f8-2910-413c-aab0-818e6ee7cc7c","Type":"ContainerDied","Data":"df0ad64ecd92e4d3376daa89ce790d37811dc4d8cb6c83f872e6c2e122995705"} Dec 16 08:55:00 crc kubenswrapper[4823]: I1216 08:55:00.155037 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-8wd5f" Dec 16 08:55:00 crc kubenswrapper[4823]: I1216 08:55:00.155055 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="df0ad64ecd92e4d3376daa89ce790d37811dc4d8cb6c83f872e6c2e122995705" Dec 16 08:55:00 crc kubenswrapper[4823]: I1216 08:55:00.321490 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-649fcfbf79-f4wz7"] Dec 16 08:55:00 crc kubenswrapper[4823]: E1216 08:55:00.321960 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="891280f8-2910-413c-aab0-818e6ee7cc7c" containerName="neutron-db-sync" Dec 16 08:55:00 crc kubenswrapper[4823]: I1216 08:55:00.321979 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="891280f8-2910-413c-aab0-818e6ee7cc7c" containerName="neutron-db-sync" Dec 16 08:55:00 crc kubenswrapper[4823]: I1216 08:55:00.322219 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="891280f8-2910-413c-aab0-818e6ee7cc7c" containerName="neutron-db-sync" Dec 16 08:55:00 crc kubenswrapper[4823]: I1216 08:55:00.323250 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-649fcfbf79-f4wz7" Dec 16 08:55:00 crc kubenswrapper[4823]: I1216 08:55:00.334793 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-649fcfbf79-f4wz7"] Dec 16 08:55:00 crc kubenswrapper[4823]: I1216 08:55:00.453390 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-7bfbd64d5b-8chch"] Dec 16 08:55:00 crc kubenswrapper[4823]: I1216 08:55:00.454661 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7bfbd64d5b-8chch" Dec 16 08:55:00 crc kubenswrapper[4823]: I1216 08:55:00.456662 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 16 08:55:00 crc kubenswrapper[4823]: I1216 08:55:00.456843 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Dec 16 08:55:00 crc kubenswrapper[4823]: I1216 08:55:00.456963 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-25vch" Dec 16 08:55:00 crc kubenswrapper[4823]: I1216 08:55:00.458097 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 16 08:55:00 crc kubenswrapper[4823]: I1216 08:55:00.473880 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7bfbd64d5b-8chch"] Dec 16 08:55:00 crc kubenswrapper[4823]: I1216 08:55:00.514733 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9368a023-0217-4537-9520-90b6b092997c-config\") pod \"dnsmasq-dns-649fcfbf79-f4wz7\" (UID: \"9368a023-0217-4537-9520-90b6b092997c\") " pod="openstack/dnsmasq-dns-649fcfbf79-f4wz7" Dec 16 08:55:00 crc kubenswrapper[4823]: I1216 08:55:00.514788 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9368a023-0217-4537-9520-90b6b092997c-ovsdbserver-sb\") pod \"dnsmasq-dns-649fcfbf79-f4wz7\" (UID: \"9368a023-0217-4537-9520-90b6b092997c\") " pod="openstack/dnsmasq-dns-649fcfbf79-f4wz7" Dec 16 08:55:00 crc kubenswrapper[4823]: I1216 08:55:00.515195 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmc46\" (UniqueName: \"kubernetes.io/projected/9368a023-0217-4537-9520-90b6b092997c-kube-api-access-dmc46\") pod 
\"dnsmasq-dns-649fcfbf79-f4wz7\" (UID: \"9368a023-0217-4537-9520-90b6b092997c\") " pod="openstack/dnsmasq-dns-649fcfbf79-f4wz7" Dec 16 08:55:00 crc kubenswrapper[4823]: I1216 08:55:00.515253 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9368a023-0217-4537-9520-90b6b092997c-ovsdbserver-nb\") pod \"dnsmasq-dns-649fcfbf79-f4wz7\" (UID: \"9368a023-0217-4537-9520-90b6b092997c\") " pod="openstack/dnsmasq-dns-649fcfbf79-f4wz7" Dec 16 08:55:00 crc kubenswrapper[4823]: I1216 08:55:00.515294 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9368a023-0217-4537-9520-90b6b092997c-dns-svc\") pod \"dnsmasq-dns-649fcfbf79-f4wz7\" (UID: \"9368a023-0217-4537-9520-90b6b092997c\") " pod="openstack/dnsmasq-dns-649fcfbf79-f4wz7" Dec 16 08:55:00 crc kubenswrapper[4823]: I1216 08:55:00.617102 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7a34646-6156-46b7-be55-3b645e543cb0-combined-ca-bundle\") pod \"neutron-7bfbd64d5b-8chch\" (UID: \"c7a34646-6156-46b7-be55-3b645e543cb0\") " pod="openstack/neutron-7bfbd64d5b-8chch" Dec 16 08:55:00 crc kubenswrapper[4823]: I1216 08:55:00.617147 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9368a023-0217-4537-9520-90b6b092997c-config\") pod \"dnsmasq-dns-649fcfbf79-f4wz7\" (UID: \"9368a023-0217-4537-9520-90b6b092997c\") " pod="openstack/dnsmasq-dns-649fcfbf79-f4wz7" Dec 16 08:55:00 crc kubenswrapper[4823]: I1216 08:55:00.617177 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9368a023-0217-4537-9520-90b6b092997c-ovsdbserver-sb\") pod 
\"dnsmasq-dns-649fcfbf79-f4wz7\" (UID: \"9368a023-0217-4537-9520-90b6b092997c\") " pod="openstack/dnsmasq-dns-649fcfbf79-f4wz7" Dec 16 08:55:00 crc kubenswrapper[4823]: I1216 08:55:00.617204 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xn7wr\" (UniqueName: \"kubernetes.io/projected/c7a34646-6156-46b7-be55-3b645e543cb0-kube-api-access-xn7wr\") pod \"neutron-7bfbd64d5b-8chch\" (UID: \"c7a34646-6156-46b7-be55-3b645e543cb0\") " pod="openstack/neutron-7bfbd64d5b-8chch" Dec 16 08:55:00 crc kubenswrapper[4823]: I1216 08:55:00.617225 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7a34646-6156-46b7-be55-3b645e543cb0-ovndb-tls-certs\") pod \"neutron-7bfbd64d5b-8chch\" (UID: \"c7a34646-6156-46b7-be55-3b645e543cb0\") " pod="openstack/neutron-7bfbd64d5b-8chch" Dec 16 08:55:00 crc kubenswrapper[4823]: I1216 08:55:00.617274 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c7a34646-6156-46b7-be55-3b645e543cb0-config\") pod \"neutron-7bfbd64d5b-8chch\" (UID: \"c7a34646-6156-46b7-be55-3b645e543cb0\") " pod="openstack/neutron-7bfbd64d5b-8chch" Dec 16 08:55:00 crc kubenswrapper[4823]: I1216 08:55:00.617338 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dmc46\" (UniqueName: \"kubernetes.io/projected/9368a023-0217-4537-9520-90b6b092997c-kube-api-access-dmc46\") pod \"dnsmasq-dns-649fcfbf79-f4wz7\" (UID: \"9368a023-0217-4537-9520-90b6b092997c\") " pod="openstack/dnsmasq-dns-649fcfbf79-f4wz7" Dec 16 08:55:00 crc kubenswrapper[4823]: I1216 08:55:00.617361 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9368a023-0217-4537-9520-90b6b092997c-ovsdbserver-nb\") pod 
\"dnsmasq-dns-649fcfbf79-f4wz7\" (UID: \"9368a023-0217-4537-9520-90b6b092997c\") " pod="openstack/dnsmasq-dns-649fcfbf79-f4wz7" Dec 16 08:55:00 crc kubenswrapper[4823]: I1216 08:55:00.617381 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9368a023-0217-4537-9520-90b6b092997c-dns-svc\") pod \"dnsmasq-dns-649fcfbf79-f4wz7\" (UID: \"9368a023-0217-4537-9520-90b6b092997c\") " pod="openstack/dnsmasq-dns-649fcfbf79-f4wz7" Dec 16 08:55:00 crc kubenswrapper[4823]: I1216 08:55:00.617400 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c7a34646-6156-46b7-be55-3b645e543cb0-httpd-config\") pod \"neutron-7bfbd64d5b-8chch\" (UID: \"c7a34646-6156-46b7-be55-3b645e543cb0\") " pod="openstack/neutron-7bfbd64d5b-8chch" Dec 16 08:55:00 crc kubenswrapper[4823]: I1216 08:55:00.618259 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9368a023-0217-4537-9520-90b6b092997c-config\") pod \"dnsmasq-dns-649fcfbf79-f4wz7\" (UID: \"9368a023-0217-4537-9520-90b6b092997c\") " pod="openstack/dnsmasq-dns-649fcfbf79-f4wz7" Dec 16 08:55:00 crc kubenswrapper[4823]: I1216 08:55:00.618762 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9368a023-0217-4537-9520-90b6b092997c-ovsdbserver-sb\") pod \"dnsmasq-dns-649fcfbf79-f4wz7\" (UID: \"9368a023-0217-4537-9520-90b6b092997c\") " pod="openstack/dnsmasq-dns-649fcfbf79-f4wz7" Dec 16 08:55:00 crc kubenswrapper[4823]: I1216 08:55:00.619525 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9368a023-0217-4537-9520-90b6b092997c-ovsdbserver-nb\") pod \"dnsmasq-dns-649fcfbf79-f4wz7\" (UID: \"9368a023-0217-4537-9520-90b6b092997c\") " 
pod="openstack/dnsmasq-dns-649fcfbf79-f4wz7" Dec 16 08:55:00 crc kubenswrapper[4823]: I1216 08:55:00.620014 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9368a023-0217-4537-9520-90b6b092997c-dns-svc\") pod \"dnsmasq-dns-649fcfbf79-f4wz7\" (UID: \"9368a023-0217-4537-9520-90b6b092997c\") " pod="openstack/dnsmasq-dns-649fcfbf79-f4wz7" Dec 16 08:55:00 crc kubenswrapper[4823]: I1216 08:55:00.637959 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmc46\" (UniqueName: \"kubernetes.io/projected/9368a023-0217-4537-9520-90b6b092997c-kube-api-access-dmc46\") pod \"dnsmasq-dns-649fcfbf79-f4wz7\" (UID: \"9368a023-0217-4537-9520-90b6b092997c\") " pod="openstack/dnsmasq-dns-649fcfbf79-f4wz7" Dec 16 08:55:00 crc kubenswrapper[4823]: I1216 08:55:00.649830 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-649fcfbf79-f4wz7" Dec 16 08:55:00 crc kubenswrapper[4823]: I1216 08:55:00.722128 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c7a34646-6156-46b7-be55-3b645e543cb0-httpd-config\") pod \"neutron-7bfbd64d5b-8chch\" (UID: \"c7a34646-6156-46b7-be55-3b645e543cb0\") " pod="openstack/neutron-7bfbd64d5b-8chch" Dec 16 08:55:00 crc kubenswrapper[4823]: I1216 08:55:00.722407 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7a34646-6156-46b7-be55-3b645e543cb0-combined-ca-bundle\") pod \"neutron-7bfbd64d5b-8chch\" (UID: \"c7a34646-6156-46b7-be55-3b645e543cb0\") " pod="openstack/neutron-7bfbd64d5b-8chch" Dec 16 08:55:00 crc kubenswrapper[4823]: I1216 08:55:00.722456 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xn7wr\" (UniqueName: 
\"kubernetes.io/projected/c7a34646-6156-46b7-be55-3b645e543cb0-kube-api-access-xn7wr\") pod \"neutron-7bfbd64d5b-8chch\" (UID: \"c7a34646-6156-46b7-be55-3b645e543cb0\") " pod="openstack/neutron-7bfbd64d5b-8chch" Dec 16 08:55:00 crc kubenswrapper[4823]: I1216 08:55:00.722481 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7a34646-6156-46b7-be55-3b645e543cb0-ovndb-tls-certs\") pod \"neutron-7bfbd64d5b-8chch\" (UID: \"c7a34646-6156-46b7-be55-3b645e543cb0\") " pod="openstack/neutron-7bfbd64d5b-8chch" Dec 16 08:55:00 crc kubenswrapper[4823]: I1216 08:55:00.722557 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c7a34646-6156-46b7-be55-3b645e543cb0-config\") pod \"neutron-7bfbd64d5b-8chch\" (UID: \"c7a34646-6156-46b7-be55-3b645e543cb0\") " pod="openstack/neutron-7bfbd64d5b-8chch" Dec 16 08:55:00 crc kubenswrapper[4823]: I1216 08:55:00.740698 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7a34646-6156-46b7-be55-3b645e543cb0-ovndb-tls-certs\") pod \"neutron-7bfbd64d5b-8chch\" (UID: \"c7a34646-6156-46b7-be55-3b645e543cb0\") " pod="openstack/neutron-7bfbd64d5b-8chch" Dec 16 08:55:00 crc kubenswrapper[4823]: I1216 08:55:00.740706 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7a34646-6156-46b7-be55-3b645e543cb0-combined-ca-bundle\") pod \"neutron-7bfbd64d5b-8chch\" (UID: \"c7a34646-6156-46b7-be55-3b645e543cb0\") " pod="openstack/neutron-7bfbd64d5b-8chch" Dec 16 08:55:00 crc kubenswrapper[4823]: I1216 08:55:00.740823 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c7a34646-6156-46b7-be55-3b645e543cb0-httpd-config\") pod \"neutron-7bfbd64d5b-8chch\" (UID: 
\"c7a34646-6156-46b7-be55-3b645e543cb0\") " pod="openstack/neutron-7bfbd64d5b-8chch" Dec 16 08:55:00 crc kubenswrapper[4823]: I1216 08:55:00.742896 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/c7a34646-6156-46b7-be55-3b645e543cb0-config\") pod \"neutron-7bfbd64d5b-8chch\" (UID: \"c7a34646-6156-46b7-be55-3b645e543cb0\") " pod="openstack/neutron-7bfbd64d5b-8chch" Dec 16 08:55:00 crc kubenswrapper[4823]: I1216 08:55:00.754254 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xn7wr\" (UniqueName: \"kubernetes.io/projected/c7a34646-6156-46b7-be55-3b645e543cb0-kube-api-access-xn7wr\") pod \"neutron-7bfbd64d5b-8chch\" (UID: \"c7a34646-6156-46b7-be55-3b645e543cb0\") " pod="openstack/neutron-7bfbd64d5b-8chch" Dec 16 08:55:00 crc kubenswrapper[4823]: I1216 08:55:00.767814 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7bfbd64d5b-8chch" Dec 16 08:55:00 crc kubenswrapper[4823]: I1216 08:55:00.967950 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-649fcfbf79-f4wz7"] Dec 16 08:55:01 crc kubenswrapper[4823]: I1216 08:55:01.161948 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-649fcfbf79-f4wz7" event={"ID":"9368a023-0217-4537-9520-90b6b092997c","Type":"ContainerStarted","Data":"a37500baf946182e6c82d510dd86dcb37bd694f213f06dad9d1d6d370eb1695f"} Dec 16 08:55:01 crc kubenswrapper[4823]: I1216 08:55:01.328482 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7bfbd64d5b-8chch"] Dec 16 08:55:02 crc kubenswrapper[4823]: I1216 08:55:02.171260 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7bfbd64d5b-8chch" event={"ID":"c7a34646-6156-46b7-be55-3b645e543cb0","Type":"ContainerStarted","Data":"7682758c0683a0037e7e77a9cfc2e4339b49134d14d2c2ee297be9cbf6d6f8c4"} Dec 16 08:55:02 crc kubenswrapper[4823]: 
I1216 08:55:02.171616 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7bfbd64d5b-8chch" event={"ID":"c7a34646-6156-46b7-be55-3b645e543cb0","Type":"ContainerStarted","Data":"b086b1249fc8dd3476e07462438e774476f680a429952a2d5cb148b8b8dcefe4"} Dec 16 08:55:02 crc kubenswrapper[4823]: I1216 08:55:02.171633 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7bfbd64d5b-8chch" event={"ID":"c7a34646-6156-46b7-be55-3b645e543cb0","Type":"ContainerStarted","Data":"550b2088035307d8b79ce2187335f5ebe09f6d59fb9b8c3ca548b2364c2d856f"} Dec 16 08:55:02 crc kubenswrapper[4823]: I1216 08:55:02.171651 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-7bfbd64d5b-8chch" Dec 16 08:55:02 crc kubenswrapper[4823]: I1216 08:55:02.175100 4823 generic.go:334] "Generic (PLEG): container finished" podID="9368a023-0217-4537-9520-90b6b092997c" containerID="1065e9229719b5b2371884dc0f665b54be0f3d80b2a25ff8b8c70fbd0f8de605" exitCode=0 Dec 16 08:55:02 crc kubenswrapper[4823]: I1216 08:55:02.175142 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-649fcfbf79-f4wz7" event={"ID":"9368a023-0217-4537-9520-90b6b092997c","Type":"ContainerDied","Data":"1065e9229719b5b2371884dc0f665b54be0f3d80b2a25ff8b8c70fbd0f8de605"} Dec 16 08:55:02 crc kubenswrapper[4823]: I1216 08:55:02.196595 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-7bfbd64d5b-8chch" podStartSLOduration=2.196577541 podStartE2EDuration="2.196577541s" podCreationTimestamp="2025-12-16 08:55:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 08:55:02.195676682 +0000 UTC m=+7180.684242835" watchObservedRunningTime="2025-12-16 08:55:02.196577541 +0000 UTC m=+7180.685143664" Dec 16 08:55:02 crc kubenswrapper[4823]: I1216 08:55:02.611312 4823 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/neutron-6ffc876c99-shbwd"] Dec 16 08:55:02 crc kubenswrapper[4823]: I1216 08:55:02.612839 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6ffc876c99-shbwd" Dec 16 08:55:02 crc kubenswrapper[4823]: I1216 08:55:02.616237 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Dec 16 08:55:02 crc kubenswrapper[4823]: I1216 08:55:02.617674 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Dec 16 08:55:02 crc kubenswrapper[4823]: I1216 08:55:02.622633 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6ffc876c99-shbwd"] Dec 16 08:55:02 crc kubenswrapper[4823]: I1216 08:55:02.767619 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/83abe53b-780a-4255-b2a8-22f3480c9358-ovndb-tls-certs\") pod \"neutron-6ffc876c99-shbwd\" (UID: \"83abe53b-780a-4255-b2a8-22f3480c9358\") " pod="openstack/neutron-6ffc876c99-shbwd" Dec 16 08:55:02 crc kubenswrapper[4823]: I1216 08:55:02.767765 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/83abe53b-780a-4255-b2a8-22f3480c9358-internal-tls-certs\") pod \"neutron-6ffc876c99-shbwd\" (UID: \"83abe53b-780a-4255-b2a8-22f3480c9358\") " pod="openstack/neutron-6ffc876c99-shbwd" Dec 16 08:55:02 crc kubenswrapper[4823]: I1216 08:55:02.767818 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/83abe53b-780a-4255-b2a8-22f3480c9358-public-tls-certs\") pod \"neutron-6ffc876c99-shbwd\" (UID: \"83abe53b-780a-4255-b2a8-22f3480c9358\") " pod="openstack/neutron-6ffc876c99-shbwd" Dec 16 08:55:02 crc kubenswrapper[4823]: I1216 
08:55:02.768034 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83abe53b-780a-4255-b2a8-22f3480c9358-combined-ca-bundle\") pod \"neutron-6ffc876c99-shbwd\" (UID: \"83abe53b-780a-4255-b2a8-22f3480c9358\") " pod="openstack/neutron-6ffc876c99-shbwd" Dec 16 08:55:02 crc kubenswrapper[4823]: I1216 08:55:02.768255 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xz52k\" (UniqueName: \"kubernetes.io/projected/83abe53b-780a-4255-b2a8-22f3480c9358-kube-api-access-xz52k\") pod \"neutron-6ffc876c99-shbwd\" (UID: \"83abe53b-780a-4255-b2a8-22f3480c9358\") " pod="openstack/neutron-6ffc876c99-shbwd" Dec 16 08:55:02 crc kubenswrapper[4823]: I1216 08:55:02.768289 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/83abe53b-780a-4255-b2a8-22f3480c9358-config\") pod \"neutron-6ffc876c99-shbwd\" (UID: \"83abe53b-780a-4255-b2a8-22f3480c9358\") " pod="openstack/neutron-6ffc876c99-shbwd" Dec 16 08:55:02 crc kubenswrapper[4823]: I1216 08:55:02.768354 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/83abe53b-780a-4255-b2a8-22f3480c9358-httpd-config\") pod \"neutron-6ffc876c99-shbwd\" (UID: \"83abe53b-780a-4255-b2a8-22f3480c9358\") " pod="openstack/neutron-6ffc876c99-shbwd" Dec 16 08:55:02 crc kubenswrapper[4823]: I1216 08:55:02.869578 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xz52k\" (UniqueName: \"kubernetes.io/projected/83abe53b-780a-4255-b2a8-22f3480c9358-kube-api-access-xz52k\") pod \"neutron-6ffc876c99-shbwd\" (UID: \"83abe53b-780a-4255-b2a8-22f3480c9358\") " pod="openstack/neutron-6ffc876c99-shbwd" Dec 16 08:55:02 crc kubenswrapper[4823]: I1216 
08:55:02.869644 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/83abe53b-780a-4255-b2a8-22f3480c9358-config\") pod \"neutron-6ffc876c99-shbwd\" (UID: \"83abe53b-780a-4255-b2a8-22f3480c9358\") " pod="openstack/neutron-6ffc876c99-shbwd" Dec 16 08:55:02 crc kubenswrapper[4823]: I1216 08:55:02.869699 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/83abe53b-780a-4255-b2a8-22f3480c9358-httpd-config\") pod \"neutron-6ffc876c99-shbwd\" (UID: \"83abe53b-780a-4255-b2a8-22f3480c9358\") " pod="openstack/neutron-6ffc876c99-shbwd" Dec 16 08:55:02 crc kubenswrapper[4823]: I1216 08:55:02.869758 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/83abe53b-780a-4255-b2a8-22f3480c9358-ovndb-tls-certs\") pod \"neutron-6ffc876c99-shbwd\" (UID: \"83abe53b-780a-4255-b2a8-22f3480c9358\") " pod="openstack/neutron-6ffc876c99-shbwd" Dec 16 08:55:02 crc kubenswrapper[4823]: I1216 08:55:02.869790 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/83abe53b-780a-4255-b2a8-22f3480c9358-internal-tls-certs\") pod \"neutron-6ffc876c99-shbwd\" (UID: \"83abe53b-780a-4255-b2a8-22f3480c9358\") " pod="openstack/neutron-6ffc876c99-shbwd" Dec 16 08:55:02 crc kubenswrapper[4823]: I1216 08:55:02.869816 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/83abe53b-780a-4255-b2a8-22f3480c9358-public-tls-certs\") pod \"neutron-6ffc876c99-shbwd\" (UID: \"83abe53b-780a-4255-b2a8-22f3480c9358\") " pod="openstack/neutron-6ffc876c99-shbwd" Dec 16 08:55:02 crc kubenswrapper[4823]: I1216 08:55:02.869855 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83abe53b-780a-4255-b2a8-22f3480c9358-combined-ca-bundle\") pod \"neutron-6ffc876c99-shbwd\" (UID: \"83abe53b-780a-4255-b2a8-22f3480c9358\") " pod="openstack/neutron-6ffc876c99-shbwd" Dec 16 08:55:02 crc kubenswrapper[4823]: I1216 08:55:02.874051 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/83abe53b-780a-4255-b2a8-22f3480c9358-config\") pod \"neutron-6ffc876c99-shbwd\" (UID: \"83abe53b-780a-4255-b2a8-22f3480c9358\") " pod="openstack/neutron-6ffc876c99-shbwd" Dec 16 08:55:02 crc kubenswrapper[4823]: I1216 08:55:02.874385 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83abe53b-780a-4255-b2a8-22f3480c9358-combined-ca-bundle\") pod \"neutron-6ffc876c99-shbwd\" (UID: \"83abe53b-780a-4255-b2a8-22f3480c9358\") " pod="openstack/neutron-6ffc876c99-shbwd" Dec 16 08:55:02 crc kubenswrapper[4823]: I1216 08:55:02.874760 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/83abe53b-780a-4255-b2a8-22f3480c9358-ovndb-tls-certs\") pod \"neutron-6ffc876c99-shbwd\" (UID: \"83abe53b-780a-4255-b2a8-22f3480c9358\") " pod="openstack/neutron-6ffc876c99-shbwd" Dec 16 08:55:02 crc kubenswrapper[4823]: I1216 08:55:02.875255 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/83abe53b-780a-4255-b2a8-22f3480c9358-internal-tls-certs\") pod \"neutron-6ffc876c99-shbwd\" (UID: \"83abe53b-780a-4255-b2a8-22f3480c9358\") " pod="openstack/neutron-6ffc876c99-shbwd" Dec 16 08:55:02 crc kubenswrapper[4823]: I1216 08:55:02.875730 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/83abe53b-780a-4255-b2a8-22f3480c9358-httpd-config\") pod \"neutron-6ffc876c99-shbwd\" 
(UID: \"83abe53b-780a-4255-b2a8-22f3480c9358\") " pod="openstack/neutron-6ffc876c99-shbwd" Dec 16 08:55:02 crc kubenswrapper[4823]: I1216 08:55:02.878220 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/83abe53b-780a-4255-b2a8-22f3480c9358-public-tls-certs\") pod \"neutron-6ffc876c99-shbwd\" (UID: \"83abe53b-780a-4255-b2a8-22f3480c9358\") " pod="openstack/neutron-6ffc876c99-shbwd" Dec 16 08:55:02 crc kubenswrapper[4823]: I1216 08:55:02.892107 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xz52k\" (UniqueName: \"kubernetes.io/projected/83abe53b-780a-4255-b2a8-22f3480c9358-kube-api-access-xz52k\") pod \"neutron-6ffc876c99-shbwd\" (UID: \"83abe53b-780a-4255-b2a8-22f3480c9358\") " pod="openstack/neutron-6ffc876c99-shbwd" Dec 16 08:55:02 crc kubenswrapper[4823]: I1216 08:55:02.943164 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6ffc876c99-shbwd" Dec 16 08:55:03 crc kubenswrapper[4823]: I1216 08:55:03.184768 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-649fcfbf79-f4wz7" event={"ID":"9368a023-0217-4537-9520-90b6b092997c","Type":"ContainerStarted","Data":"bac836dce097692d08e3d6ab425fd0457c24b4a508f21c3d86eee76f4eda79c9"} Dec 16 08:55:03 crc kubenswrapper[4823]: I1216 08:55:03.479895 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-649fcfbf79-f4wz7" podStartSLOduration=3.479872985 podStartE2EDuration="3.479872985s" podCreationTimestamp="2025-12-16 08:55:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 08:55:03.215526859 +0000 UTC m=+7181.704092992" watchObservedRunningTime="2025-12-16 08:55:03.479872985 +0000 UTC m=+7181.968439108" Dec 16 08:55:03 crc kubenswrapper[4823]: I1216 08:55:03.485971 4823 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6ffc876c99-shbwd"] Dec 16 08:55:03 crc kubenswrapper[4823]: W1216 08:55:03.495208 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod83abe53b_780a_4255_b2a8_22f3480c9358.slice/crio-fa22742fe41b55d6771e69ed637d578d45d778b96a01e75db53d42f9c43ea22d WatchSource:0}: Error finding container fa22742fe41b55d6771e69ed637d578d45d778b96a01e75db53d42f9c43ea22d: Status 404 returned error can't find the container with id fa22742fe41b55d6771e69ed637d578d45d778b96a01e75db53d42f9c43ea22d Dec 16 08:55:03 crc kubenswrapper[4823]: I1216 08:55:03.774718 4823 scope.go:117] "RemoveContainer" containerID="9ce3e6cc66a3ba1f5a9f07614bbf78a449581b45707f8e1e5d9794f67e5c0428" Dec 16 08:55:03 crc kubenswrapper[4823]: E1216 08:55:03.775312 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 08:55:04 crc kubenswrapper[4823]: I1216 08:55:04.193933 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6ffc876c99-shbwd" event={"ID":"83abe53b-780a-4255-b2a8-22f3480c9358","Type":"ContainerStarted","Data":"be736c54eb1998c9f331d4dd1c7970f56f4f491d0243de972bd6f9e630a78177"} Dec 16 08:55:04 crc kubenswrapper[4823]: I1216 08:55:04.193983 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6ffc876c99-shbwd" event={"ID":"83abe53b-780a-4255-b2a8-22f3480c9358","Type":"ContainerStarted","Data":"bc5f650dbf19a065a416224d2c46c8451ed1939c757afbeb47b34a826f25043b"} Dec 16 08:55:04 crc kubenswrapper[4823]: I1216 08:55:04.193994 4823 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6ffc876c99-shbwd" event={"ID":"83abe53b-780a-4255-b2a8-22f3480c9358","Type":"ContainerStarted","Data":"fa22742fe41b55d6771e69ed637d578d45d778b96a01e75db53d42f9c43ea22d"} Dec 16 08:55:04 crc kubenswrapper[4823]: I1216 08:55:04.194340 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-6ffc876c99-shbwd" Dec 16 08:55:04 crc kubenswrapper[4823]: I1216 08:55:04.194395 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-649fcfbf79-f4wz7" Dec 16 08:55:10 crc kubenswrapper[4823]: I1216 08:55:10.651329 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-649fcfbf79-f4wz7" Dec 16 08:55:10 crc kubenswrapper[4823]: I1216 08:55:10.678651 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6ffc876c99-shbwd" podStartSLOduration=8.6786159 podStartE2EDuration="8.6786159s" podCreationTimestamp="2025-12-16 08:55:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 08:55:04.214265094 +0000 UTC m=+7182.702831247" watchObservedRunningTime="2025-12-16 08:55:10.6786159 +0000 UTC m=+7189.167182063" Dec 16 08:55:10 crc kubenswrapper[4823]: I1216 08:55:10.756120 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f88b6949f-mhwpm"] Dec 16 08:55:10 crc kubenswrapper[4823]: I1216 08:55:10.756404 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7f88b6949f-mhwpm" podUID="4a97ec9a-cfd5-4824-bf42-5c29cafafb3d" containerName="dnsmasq-dns" containerID="cri-o://279887000a49b31ce5520a85a264f1dff04c4dbdf1c1ddc32ea6391952cfbe69" gracePeriod=10 Dec 16 08:55:11 crc kubenswrapper[4823]: I1216 08:55:11.261508 4823 generic.go:334] "Generic (PLEG): container finished" 
podID="4a97ec9a-cfd5-4824-bf42-5c29cafafb3d" containerID="279887000a49b31ce5520a85a264f1dff04c4dbdf1c1ddc32ea6391952cfbe69" exitCode=0 Dec 16 08:55:11 crc kubenswrapper[4823]: I1216 08:55:11.261578 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f88b6949f-mhwpm" event={"ID":"4a97ec9a-cfd5-4824-bf42-5c29cafafb3d","Type":"ContainerDied","Data":"279887000a49b31ce5520a85a264f1dff04c4dbdf1c1ddc32ea6391952cfbe69"} Dec 16 08:55:11 crc kubenswrapper[4823]: I1216 08:55:11.261866 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f88b6949f-mhwpm" event={"ID":"4a97ec9a-cfd5-4824-bf42-5c29cafafb3d","Type":"ContainerDied","Data":"01cb67d3598f41b1f0077f6e6ffddb26f86238c701930e2329f94ebbeb9fd627"} Dec 16 08:55:11 crc kubenswrapper[4823]: I1216 08:55:11.262084 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="01cb67d3598f41b1f0077f6e6ffddb26f86238c701930e2329f94ebbeb9fd627" Dec 16 08:55:11 crc kubenswrapper[4823]: I1216 08:55:11.284278 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f88b6949f-mhwpm" Dec 16 08:55:11 crc kubenswrapper[4823]: I1216 08:55:11.443489 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4a97ec9a-cfd5-4824-bf42-5c29cafafb3d-ovsdbserver-nb\") pod \"4a97ec9a-cfd5-4824-bf42-5c29cafafb3d\" (UID: \"4a97ec9a-cfd5-4824-bf42-5c29cafafb3d\") " Dec 16 08:55:11 crc kubenswrapper[4823]: I1216 08:55:11.443635 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4t2dw\" (UniqueName: \"kubernetes.io/projected/4a97ec9a-cfd5-4824-bf42-5c29cafafb3d-kube-api-access-4t2dw\") pod \"4a97ec9a-cfd5-4824-bf42-5c29cafafb3d\" (UID: \"4a97ec9a-cfd5-4824-bf42-5c29cafafb3d\") " Dec 16 08:55:11 crc kubenswrapper[4823]: I1216 08:55:11.443690 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4a97ec9a-cfd5-4824-bf42-5c29cafafb3d-ovsdbserver-sb\") pod \"4a97ec9a-cfd5-4824-bf42-5c29cafafb3d\" (UID: \"4a97ec9a-cfd5-4824-bf42-5c29cafafb3d\") " Dec 16 08:55:11 crc kubenswrapper[4823]: I1216 08:55:11.443831 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a97ec9a-cfd5-4824-bf42-5c29cafafb3d-config\") pod \"4a97ec9a-cfd5-4824-bf42-5c29cafafb3d\" (UID: \"4a97ec9a-cfd5-4824-bf42-5c29cafafb3d\") " Dec 16 08:55:11 crc kubenswrapper[4823]: I1216 08:55:11.443960 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4a97ec9a-cfd5-4824-bf42-5c29cafafb3d-dns-svc\") pod \"4a97ec9a-cfd5-4824-bf42-5c29cafafb3d\" (UID: \"4a97ec9a-cfd5-4824-bf42-5c29cafafb3d\") " Dec 16 08:55:11 crc kubenswrapper[4823]: I1216 08:55:11.456295 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/4a97ec9a-cfd5-4824-bf42-5c29cafafb3d-kube-api-access-4t2dw" (OuterVolumeSpecName: "kube-api-access-4t2dw") pod "4a97ec9a-cfd5-4824-bf42-5c29cafafb3d" (UID: "4a97ec9a-cfd5-4824-bf42-5c29cafafb3d"). InnerVolumeSpecName "kube-api-access-4t2dw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:55:11 crc kubenswrapper[4823]: I1216 08:55:11.493265 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a97ec9a-cfd5-4824-bf42-5c29cafafb3d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4a97ec9a-cfd5-4824-bf42-5c29cafafb3d" (UID: "4a97ec9a-cfd5-4824-bf42-5c29cafafb3d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:55:11 crc kubenswrapper[4823]: I1216 08:55:11.493683 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a97ec9a-cfd5-4824-bf42-5c29cafafb3d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4a97ec9a-cfd5-4824-bf42-5c29cafafb3d" (UID: "4a97ec9a-cfd5-4824-bf42-5c29cafafb3d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:55:11 crc kubenswrapper[4823]: I1216 08:55:11.498784 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a97ec9a-cfd5-4824-bf42-5c29cafafb3d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4a97ec9a-cfd5-4824-bf42-5c29cafafb3d" (UID: "4a97ec9a-cfd5-4824-bf42-5c29cafafb3d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:55:11 crc kubenswrapper[4823]: I1216 08:55:11.512189 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a97ec9a-cfd5-4824-bf42-5c29cafafb3d-config" (OuterVolumeSpecName: "config") pod "4a97ec9a-cfd5-4824-bf42-5c29cafafb3d" (UID: "4a97ec9a-cfd5-4824-bf42-5c29cafafb3d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:55:11 crc kubenswrapper[4823]: I1216 08:55:11.546129 4823 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4a97ec9a-cfd5-4824-bf42-5c29cafafb3d-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 16 08:55:11 crc kubenswrapper[4823]: I1216 08:55:11.546170 4823 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4a97ec9a-cfd5-4824-bf42-5c29cafafb3d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 16 08:55:11 crc kubenswrapper[4823]: I1216 08:55:11.546182 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4t2dw\" (UniqueName: \"kubernetes.io/projected/4a97ec9a-cfd5-4824-bf42-5c29cafafb3d-kube-api-access-4t2dw\") on node \"crc\" DevicePath \"\"" Dec 16 08:55:11 crc kubenswrapper[4823]: I1216 08:55:11.546191 4823 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4a97ec9a-cfd5-4824-bf42-5c29cafafb3d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 16 08:55:11 crc kubenswrapper[4823]: I1216 08:55:11.546205 4823 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a97ec9a-cfd5-4824-bf42-5c29cafafb3d-config\") on node \"crc\" DevicePath \"\"" Dec 16 08:55:12 crc kubenswrapper[4823]: I1216 08:55:12.274168 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f88b6949f-mhwpm" Dec 16 08:55:12 crc kubenswrapper[4823]: I1216 08:55:12.310836 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f88b6949f-mhwpm"] Dec 16 08:55:12 crc kubenswrapper[4823]: I1216 08:55:12.322646 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7f88b6949f-mhwpm"] Dec 16 08:55:13 crc kubenswrapper[4823]: I1216 08:55:13.785564 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a97ec9a-cfd5-4824-bf42-5c29cafafb3d" path="/var/lib/kubelet/pods/4a97ec9a-cfd5-4824-bf42-5c29cafafb3d/volumes" Dec 16 08:55:18 crc kubenswrapper[4823]: I1216 08:55:18.771773 4823 scope.go:117] "RemoveContainer" containerID="9ce3e6cc66a3ba1f5a9f07614bbf78a449581b45707f8e1e5d9794f67e5c0428" Dec 16 08:55:18 crc kubenswrapper[4823]: E1216 08:55:18.772592 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 08:55:26 crc kubenswrapper[4823]: I1216 08:55:26.581912 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4jkvj"] Dec 16 08:55:26 crc kubenswrapper[4823]: E1216 08:55:26.584591 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a97ec9a-cfd5-4824-bf42-5c29cafafb3d" containerName="init" Dec 16 08:55:26 crc kubenswrapper[4823]: I1216 08:55:26.584620 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a97ec9a-cfd5-4824-bf42-5c29cafafb3d" containerName="init" Dec 16 08:55:26 crc kubenswrapper[4823]: E1216 08:55:26.584642 4823 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="4a97ec9a-cfd5-4824-bf42-5c29cafafb3d" containerName="dnsmasq-dns" Dec 16 08:55:26 crc kubenswrapper[4823]: I1216 08:55:26.584657 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a97ec9a-cfd5-4824-bf42-5c29cafafb3d" containerName="dnsmasq-dns" Dec 16 08:55:26 crc kubenswrapper[4823]: I1216 08:55:26.584931 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a97ec9a-cfd5-4824-bf42-5c29cafafb3d" containerName="dnsmasq-dns" Dec 16 08:55:26 crc kubenswrapper[4823]: I1216 08:55:26.588069 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4jkvj" Dec 16 08:55:26 crc kubenswrapper[4823]: I1216 08:55:26.603181 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4jkvj"] Dec 16 08:55:26 crc kubenswrapper[4823]: I1216 08:55:26.718988 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3e810be-b9e6-41ce-af9d-e9ca45c7f429-utilities\") pod \"redhat-marketplace-4jkvj\" (UID: \"e3e810be-b9e6-41ce-af9d-e9ca45c7f429\") " pod="openshift-marketplace/redhat-marketplace-4jkvj" Dec 16 08:55:26 crc kubenswrapper[4823]: I1216 08:55:26.719096 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3e810be-b9e6-41ce-af9d-e9ca45c7f429-catalog-content\") pod \"redhat-marketplace-4jkvj\" (UID: \"e3e810be-b9e6-41ce-af9d-e9ca45c7f429\") " pod="openshift-marketplace/redhat-marketplace-4jkvj" Dec 16 08:55:26 crc kubenswrapper[4823]: I1216 08:55:26.719217 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7htd\" (UniqueName: \"kubernetes.io/projected/e3e810be-b9e6-41ce-af9d-e9ca45c7f429-kube-api-access-p7htd\") pod \"redhat-marketplace-4jkvj\" (UID: 
\"e3e810be-b9e6-41ce-af9d-e9ca45c7f429\") " pod="openshift-marketplace/redhat-marketplace-4jkvj" Dec 16 08:55:26 crc kubenswrapper[4823]: I1216 08:55:26.821291 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3e810be-b9e6-41ce-af9d-e9ca45c7f429-catalog-content\") pod \"redhat-marketplace-4jkvj\" (UID: \"e3e810be-b9e6-41ce-af9d-e9ca45c7f429\") " pod="openshift-marketplace/redhat-marketplace-4jkvj" Dec 16 08:55:26 crc kubenswrapper[4823]: I1216 08:55:26.821407 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7htd\" (UniqueName: \"kubernetes.io/projected/e3e810be-b9e6-41ce-af9d-e9ca45c7f429-kube-api-access-p7htd\") pod \"redhat-marketplace-4jkvj\" (UID: \"e3e810be-b9e6-41ce-af9d-e9ca45c7f429\") " pod="openshift-marketplace/redhat-marketplace-4jkvj" Dec 16 08:55:26 crc kubenswrapper[4823]: I1216 08:55:26.821495 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3e810be-b9e6-41ce-af9d-e9ca45c7f429-utilities\") pod \"redhat-marketplace-4jkvj\" (UID: \"e3e810be-b9e6-41ce-af9d-e9ca45c7f429\") " pod="openshift-marketplace/redhat-marketplace-4jkvj" Dec 16 08:55:26 crc kubenswrapper[4823]: I1216 08:55:26.821849 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3e810be-b9e6-41ce-af9d-e9ca45c7f429-catalog-content\") pod \"redhat-marketplace-4jkvj\" (UID: \"e3e810be-b9e6-41ce-af9d-e9ca45c7f429\") " pod="openshift-marketplace/redhat-marketplace-4jkvj" Dec 16 08:55:26 crc kubenswrapper[4823]: I1216 08:55:26.821909 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3e810be-b9e6-41ce-af9d-e9ca45c7f429-utilities\") pod \"redhat-marketplace-4jkvj\" (UID: \"e3e810be-b9e6-41ce-af9d-e9ca45c7f429\") " 
pod="openshift-marketplace/redhat-marketplace-4jkvj" Dec 16 08:55:26 crc kubenswrapper[4823]: I1216 08:55:26.844597 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7htd\" (UniqueName: \"kubernetes.io/projected/e3e810be-b9e6-41ce-af9d-e9ca45c7f429-kube-api-access-p7htd\") pod \"redhat-marketplace-4jkvj\" (UID: \"e3e810be-b9e6-41ce-af9d-e9ca45c7f429\") " pod="openshift-marketplace/redhat-marketplace-4jkvj" Dec 16 08:55:26 crc kubenswrapper[4823]: I1216 08:55:26.910784 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4jkvj" Dec 16 08:55:27 crc kubenswrapper[4823]: I1216 08:55:27.420304 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4jkvj"] Dec 16 08:55:27 crc kubenswrapper[4823]: I1216 08:55:27.533099 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4jkvj" event={"ID":"e3e810be-b9e6-41ce-af9d-e9ca45c7f429","Type":"ContainerStarted","Data":"1d24e320fc57b32d3184c4968890fb01570710a66953ea89f8fe94540103705d"} Dec 16 08:55:28 crc kubenswrapper[4823]: I1216 08:55:28.542348 4823 generic.go:334] "Generic (PLEG): container finished" podID="e3e810be-b9e6-41ce-af9d-e9ca45c7f429" containerID="0918be923247b8aff53cd2144bbd2b5767a0e3d08642a7ab9513b101635a1291" exitCode=0 Dec 16 08:55:28 crc kubenswrapper[4823]: I1216 08:55:28.542411 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4jkvj" event={"ID":"e3e810be-b9e6-41ce-af9d-e9ca45c7f429","Type":"ContainerDied","Data":"0918be923247b8aff53cd2144bbd2b5767a0e3d08642a7ab9513b101635a1291"} Dec 16 08:55:30 crc kubenswrapper[4823]: I1216 08:55:30.562375 4823 generic.go:334] "Generic (PLEG): container finished" podID="e3e810be-b9e6-41ce-af9d-e9ca45c7f429" containerID="0fc76d91fbb96ac3988d3b3579b78c99811a89e1ea14ccee415e423f175c4843" exitCode=0 Dec 16 08:55:30 crc 
kubenswrapper[4823]: I1216 08:55:30.562475 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4jkvj" event={"ID":"e3e810be-b9e6-41ce-af9d-e9ca45c7f429","Type":"ContainerDied","Data":"0fc76d91fbb96ac3988d3b3579b78c99811a89e1ea14ccee415e423f175c4843"} Dec 16 08:55:30 crc kubenswrapper[4823]: I1216 08:55:30.782180 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-7bfbd64d5b-8chch" Dec 16 08:55:31 crc kubenswrapper[4823]: I1216 08:55:31.161387 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-nwbdc"] Dec 16 08:55:31 crc kubenswrapper[4823]: I1216 08:55:31.163717 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nwbdc" Dec 16 08:55:31 crc kubenswrapper[4823]: I1216 08:55:31.171477 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nwbdc"] Dec 16 08:55:31 crc kubenswrapper[4823]: I1216 08:55:31.303558 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2efa33c-f9b1-44d2-9925-5417a4391430-utilities\") pod \"redhat-operators-nwbdc\" (UID: \"b2efa33c-f9b1-44d2-9925-5417a4391430\") " pod="openshift-marketplace/redhat-operators-nwbdc" Dec 16 08:55:31 crc kubenswrapper[4823]: I1216 08:55:31.303902 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6477q\" (UniqueName: \"kubernetes.io/projected/b2efa33c-f9b1-44d2-9925-5417a4391430-kube-api-access-6477q\") pod \"redhat-operators-nwbdc\" (UID: \"b2efa33c-f9b1-44d2-9925-5417a4391430\") " pod="openshift-marketplace/redhat-operators-nwbdc" Dec 16 08:55:31 crc kubenswrapper[4823]: I1216 08:55:31.304002 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2efa33c-f9b1-44d2-9925-5417a4391430-catalog-content\") pod \"redhat-operators-nwbdc\" (UID: \"b2efa33c-f9b1-44d2-9925-5417a4391430\") " pod="openshift-marketplace/redhat-operators-nwbdc" Dec 16 08:55:31 crc kubenswrapper[4823]: I1216 08:55:31.405625 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2efa33c-f9b1-44d2-9925-5417a4391430-utilities\") pod \"redhat-operators-nwbdc\" (UID: \"b2efa33c-f9b1-44d2-9925-5417a4391430\") " pod="openshift-marketplace/redhat-operators-nwbdc" Dec 16 08:55:31 crc kubenswrapper[4823]: I1216 08:55:31.405703 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6477q\" (UniqueName: \"kubernetes.io/projected/b2efa33c-f9b1-44d2-9925-5417a4391430-kube-api-access-6477q\") pod \"redhat-operators-nwbdc\" (UID: \"b2efa33c-f9b1-44d2-9925-5417a4391430\") " pod="openshift-marketplace/redhat-operators-nwbdc" Dec 16 08:55:31 crc kubenswrapper[4823]: I1216 08:55:31.405793 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2efa33c-f9b1-44d2-9925-5417a4391430-catalog-content\") pod \"redhat-operators-nwbdc\" (UID: \"b2efa33c-f9b1-44d2-9925-5417a4391430\") " pod="openshift-marketplace/redhat-operators-nwbdc" Dec 16 08:55:31 crc kubenswrapper[4823]: I1216 08:55:31.406374 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2efa33c-f9b1-44d2-9925-5417a4391430-catalog-content\") pod \"redhat-operators-nwbdc\" (UID: \"b2efa33c-f9b1-44d2-9925-5417a4391430\") " pod="openshift-marketplace/redhat-operators-nwbdc" Dec 16 08:55:31 crc kubenswrapper[4823]: I1216 08:55:31.406677 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/b2efa33c-f9b1-44d2-9925-5417a4391430-utilities\") pod \"redhat-operators-nwbdc\" (UID: \"b2efa33c-f9b1-44d2-9925-5417a4391430\") " pod="openshift-marketplace/redhat-operators-nwbdc" Dec 16 08:55:31 crc kubenswrapper[4823]: I1216 08:55:31.432705 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6477q\" (UniqueName: \"kubernetes.io/projected/b2efa33c-f9b1-44d2-9925-5417a4391430-kube-api-access-6477q\") pod \"redhat-operators-nwbdc\" (UID: \"b2efa33c-f9b1-44d2-9925-5417a4391430\") " pod="openshift-marketplace/redhat-operators-nwbdc" Dec 16 08:55:31 crc kubenswrapper[4823]: I1216 08:55:31.494400 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nwbdc" Dec 16 08:55:31 crc kubenswrapper[4823]: I1216 08:55:31.573514 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4jkvj" event={"ID":"e3e810be-b9e6-41ce-af9d-e9ca45c7f429","Type":"ContainerStarted","Data":"90b0086309a783d92377a40ad51db12e52429af5d0af33e698523f20e3c29ec8"} Dec 16 08:55:31 crc kubenswrapper[4823]: I1216 08:55:31.610384 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4jkvj" podStartSLOduration=3.000523375 podStartE2EDuration="5.610366116s" podCreationTimestamp="2025-12-16 08:55:26 +0000 UTC" firstStartedPulling="2025-12-16 08:55:28.545083687 +0000 UTC m=+7207.033649840" lastFinishedPulling="2025-12-16 08:55:31.154926458 +0000 UTC m=+7209.643492581" observedRunningTime="2025-12-16 08:55:31.605529164 +0000 UTC m=+7210.094095287" watchObservedRunningTime="2025-12-16 08:55:31.610366116 +0000 UTC m=+7210.098932239" Dec 16 08:55:31 crc kubenswrapper[4823]: I1216 08:55:31.990204 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nwbdc"] Dec 16 08:55:32 crc kubenswrapper[4823]: I1216 08:55:32.583309 4823 
generic.go:334] "Generic (PLEG): container finished" podID="b2efa33c-f9b1-44d2-9925-5417a4391430" containerID="8903d087787358356731b57ac754ef4eb82740d4584c402d49de18d99e5ba759" exitCode=0 Dec 16 08:55:32 crc kubenswrapper[4823]: I1216 08:55:32.583404 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nwbdc" event={"ID":"b2efa33c-f9b1-44d2-9925-5417a4391430","Type":"ContainerDied","Data":"8903d087787358356731b57ac754ef4eb82740d4584c402d49de18d99e5ba759"} Dec 16 08:55:32 crc kubenswrapper[4823]: I1216 08:55:32.583667 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nwbdc" event={"ID":"b2efa33c-f9b1-44d2-9925-5417a4391430","Type":"ContainerStarted","Data":"de9a6e5be7f19930d110db13f092b21ab75a90059c8fa637d19a61c772b7ad17"} Dec 16 08:55:32 crc kubenswrapper[4823]: I1216 08:55:32.772245 4823 scope.go:117] "RemoveContainer" containerID="9ce3e6cc66a3ba1f5a9f07614bbf78a449581b45707f8e1e5d9794f67e5c0428" Dec 16 08:55:32 crc kubenswrapper[4823]: E1216 08:55:32.772564 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 08:55:32 crc kubenswrapper[4823]: I1216 08:55:32.956840 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-6ffc876c99-shbwd" Dec 16 08:55:33 crc kubenswrapper[4823]: I1216 08:55:33.025056 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7bfbd64d5b-8chch"] Dec 16 08:55:33 crc kubenswrapper[4823]: I1216 08:55:33.025504 4823 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/neutron-7bfbd64d5b-8chch" podUID="c7a34646-6156-46b7-be55-3b645e543cb0" containerName="neutron-api" containerID="cri-o://b086b1249fc8dd3476e07462438e774476f680a429952a2d5cb148b8b8dcefe4" gracePeriod=30 Dec 16 08:55:33 crc kubenswrapper[4823]: I1216 08:55:33.025687 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7bfbd64d5b-8chch" podUID="c7a34646-6156-46b7-be55-3b645e543cb0" containerName="neutron-httpd" containerID="cri-o://7682758c0683a0037e7e77a9cfc2e4339b49134d14d2c2ee297be9cbf6d6f8c4" gracePeriod=30 Dec 16 08:55:33 crc kubenswrapper[4823]: I1216 08:55:33.593882 4823 generic.go:334] "Generic (PLEG): container finished" podID="c7a34646-6156-46b7-be55-3b645e543cb0" containerID="7682758c0683a0037e7e77a9cfc2e4339b49134d14d2c2ee297be9cbf6d6f8c4" exitCode=0 Dec 16 08:55:33 crc kubenswrapper[4823]: I1216 08:55:33.593949 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7bfbd64d5b-8chch" event={"ID":"c7a34646-6156-46b7-be55-3b645e543cb0","Type":"ContainerDied","Data":"7682758c0683a0037e7e77a9cfc2e4339b49134d14d2c2ee297be9cbf6d6f8c4"} Dec 16 08:55:34 crc kubenswrapper[4823]: I1216 08:55:34.613417 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nwbdc" event={"ID":"b2efa33c-f9b1-44d2-9925-5417a4391430","Type":"ContainerStarted","Data":"f93d5d9c1cf02a0db8e6550c377e282c4daf2a1536dfb7f591618b44aa9080f7"} Dec 16 08:55:35 crc kubenswrapper[4823]: I1216 08:55:35.623177 4823 generic.go:334] "Generic (PLEG): container finished" podID="b2efa33c-f9b1-44d2-9925-5417a4391430" containerID="f93d5d9c1cf02a0db8e6550c377e282c4daf2a1536dfb7f591618b44aa9080f7" exitCode=0 Dec 16 08:55:35 crc kubenswrapper[4823]: I1216 08:55:35.623238 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nwbdc" 
event={"ID":"b2efa33c-f9b1-44d2-9925-5417a4391430","Type":"ContainerDied","Data":"f93d5d9c1cf02a0db8e6550c377e282c4daf2a1536dfb7f591618b44aa9080f7"} Dec 16 08:55:36 crc kubenswrapper[4823]: I1216 08:55:36.913872 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4jkvj" Dec 16 08:55:36 crc kubenswrapper[4823]: I1216 08:55:36.914208 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4jkvj" Dec 16 08:55:36 crc kubenswrapper[4823]: I1216 08:55:36.977464 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4jkvj" Dec 16 08:55:37 crc kubenswrapper[4823]: I1216 08:55:37.641564 4823 generic.go:334] "Generic (PLEG): container finished" podID="c7a34646-6156-46b7-be55-3b645e543cb0" containerID="b086b1249fc8dd3476e07462438e774476f680a429952a2d5cb148b8b8dcefe4" exitCode=0 Dec 16 08:55:37 crc kubenswrapper[4823]: I1216 08:55:37.641655 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7bfbd64d5b-8chch" event={"ID":"c7a34646-6156-46b7-be55-3b645e543cb0","Type":"ContainerDied","Data":"b086b1249fc8dd3476e07462438e774476f680a429952a2d5cb148b8b8dcefe4"} Dec 16 08:55:37 crc kubenswrapper[4823]: I1216 08:55:37.688783 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4jkvj" Dec 16 08:55:37 crc kubenswrapper[4823]: I1216 08:55:37.962057 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7bfbd64d5b-8chch" Dec 16 08:55:38 crc kubenswrapper[4823]: I1216 08:55:38.026251 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xn7wr\" (UniqueName: \"kubernetes.io/projected/c7a34646-6156-46b7-be55-3b645e543cb0-kube-api-access-xn7wr\") pod \"c7a34646-6156-46b7-be55-3b645e543cb0\" (UID: \"c7a34646-6156-46b7-be55-3b645e543cb0\") " Dec 16 08:55:38 crc kubenswrapper[4823]: I1216 08:55:38.026353 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7a34646-6156-46b7-be55-3b645e543cb0-ovndb-tls-certs\") pod \"c7a34646-6156-46b7-be55-3b645e543cb0\" (UID: \"c7a34646-6156-46b7-be55-3b645e543cb0\") " Dec 16 08:55:38 crc kubenswrapper[4823]: I1216 08:55:38.026407 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7a34646-6156-46b7-be55-3b645e543cb0-combined-ca-bundle\") pod \"c7a34646-6156-46b7-be55-3b645e543cb0\" (UID: \"c7a34646-6156-46b7-be55-3b645e543cb0\") " Dec 16 08:55:38 crc kubenswrapper[4823]: I1216 08:55:38.026505 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c7a34646-6156-46b7-be55-3b645e543cb0-httpd-config\") pod \"c7a34646-6156-46b7-be55-3b645e543cb0\" (UID: \"c7a34646-6156-46b7-be55-3b645e543cb0\") " Dec 16 08:55:38 crc kubenswrapper[4823]: I1216 08:55:38.026542 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c7a34646-6156-46b7-be55-3b645e543cb0-config\") pod \"c7a34646-6156-46b7-be55-3b645e543cb0\" (UID: \"c7a34646-6156-46b7-be55-3b645e543cb0\") " Dec 16 08:55:38 crc kubenswrapper[4823]: I1216 08:55:38.034231 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/c7a34646-6156-46b7-be55-3b645e543cb0-kube-api-access-xn7wr" (OuterVolumeSpecName: "kube-api-access-xn7wr") pod "c7a34646-6156-46b7-be55-3b645e543cb0" (UID: "c7a34646-6156-46b7-be55-3b645e543cb0"). InnerVolumeSpecName "kube-api-access-xn7wr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:55:38 crc kubenswrapper[4823]: I1216 08:55:38.041597 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7a34646-6156-46b7-be55-3b645e543cb0-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "c7a34646-6156-46b7-be55-3b645e543cb0" (UID: "c7a34646-6156-46b7-be55-3b645e543cb0"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:55:38 crc kubenswrapper[4823]: I1216 08:55:38.071595 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7a34646-6156-46b7-be55-3b645e543cb0-config" (OuterVolumeSpecName: "config") pod "c7a34646-6156-46b7-be55-3b645e543cb0" (UID: "c7a34646-6156-46b7-be55-3b645e543cb0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:55:38 crc kubenswrapper[4823]: I1216 08:55:38.075617 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7a34646-6156-46b7-be55-3b645e543cb0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c7a34646-6156-46b7-be55-3b645e543cb0" (UID: "c7a34646-6156-46b7-be55-3b645e543cb0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:55:38 crc kubenswrapper[4823]: I1216 08:55:38.100826 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7a34646-6156-46b7-be55-3b645e543cb0-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "c7a34646-6156-46b7-be55-3b645e543cb0" (UID: "c7a34646-6156-46b7-be55-3b645e543cb0"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:55:38 crc kubenswrapper[4823]: I1216 08:55:38.129469 4823 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/c7a34646-6156-46b7-be55-3b645e543cb0-config\") on node \"crc\" DevicePath \"\"" Dec 16 08:55:38 crc kubenswrapper[4823]: I1216 08:55:38.129512 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xn7wr\" (UniqueName: \"kubernetes.io/projected/c7a34646-6156-46b7-be55-3b645e543cb0-kube-api-access-xn7wr\") on node \"crc\" DevicePath \"\"" Dec 16 08:55:38 crc kubenswrapper[4823]: I1216 08:55:38.129527 4823 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7a34646-6156-46b7-be55-3b645e543cb0-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 08:55:38 crc kubenswrapper[4823]: I1216 08:55:38.129538 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7a34646-6156-46b7-be55-3b645e543cb0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 08:55:38 crc kubenswrapper[4823]: I1216 08:55:38.129550 4823 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c7a34646-6156-46b7-be55-3b645e543cb0-httpd-config\") on node \"crc\" DevicePath \"\"" Dec 16 08:55:38 crc kubenswrapper[4823]: I1216 08:55:38.147693 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4jkvj"] Dec 16 08:55:38 crc kubenswrapper[4823]: I1216 08:55:38.651477 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nwbdc" event={"ID":"b2efa33c-f9b1-44d2-9925-5417a4391430","Type":"ContainerStarted","Data":"97a8e9ff643bd93e9e109391ecf7ebf132a57b077a8bc22026ae555867d22ede"} Dec 16 08:55:38 crc kubenswrapper[4823]: I1216 08:55:38.653699 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-7bfbd64d5b-8chch" event={"ID":"c7a34646-6156-46b7-be55-3b645e543cb0","Type":"ContainerDied","Data":"550b2088035307d8b79ce2187335f5ebe09f6d59fb9b8c3ca548b2364c2d856f"} Dec 16 08:55:38 crc kubenswrapper[4823]: I1216 08:55:38.653781 4823 scope.go:117] "RemoveContainer" containerID="7682758c0683a0037e7e77a9cfc2e4339b49134d14d2c2ee297be9cbf6d6f8c4" Dec 16 08:55:38 crc kubenswrapper[4823]: I1216 08:55:38.653715 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7bfbd64d5b-8chch" Dec 16 08:55:38 crc kubenswrapper[4823]: I1216 08:55:38.675794 4823 scope.go:117] "RemoveContainer" containerID="b086b1249fc8dd3476e07462438e774476f680a429952a2d5cb148b8b8dcefe4" Dec 16 08:55:38 crc kubenswrapper[4823]: I1216 08:55:38.684143 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-nwbdc" podStartSLOduration=2.174087638 podStartE2EDuration="7.684117699s" podCreationTimestamp="2025-12-16 08:55:31 +0000 UTC" firstStartedPulling="2025-12-16 08:55:32.585212583 +0000 UTC m=+7211.073778706" lastFinishedPulling="2025-12-16 08:55:38.095242644 +0000 UTC m=+7216.583808767" observedRunningTime="2025-12-16 08:55:38.675758887 +0000 UTC m=+7217.164325020" watchObservedRunningTime="2025-12-16 08:55:38.684117699 +0000 UTC m=+7217.172683842" Dec 16 08:55:38 crc kubenswrapper[4823]: I1216 08:55:38.700814 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7bfbd64d5b-8chch"] Dec 16 08:55:38 crc kubenswrapper[4823]: I1216 08:55:38.710473 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-7bfbd64d5b-8chch"] Dec 16 08:55:39 crc kubenswrapper[4823]: I1216 08:55:39.662818 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-4jkvj" podUID="e3e810be-b9e6-41ce-af9d-e9ca45c7f429" containerName="registry-server" 
containerID="cri-o://90b0086309a783d92377a40ad51db12e52429af5d0af33e698523f20e3c29ec8" gracePeriod=2 Dec 16 08:55:39 crc kubenswrapper[4823]: I1216 08:55:39.785098 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7a34646-6156-46b7-be55-3b645e543cb0" path="/var/lib/kubelet/pods/c7a34646-6156-46b7-be55-3b645e543cb0/volumes" Dec 16 08:55:41 crc kubenswrapper[4823]: I1216 08:55:40.653393 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4jkvj" Dec 16 08:55:41 crc kubenswrapper[4823]: I1216 08:55:40.688416 4823 generic.go:334] "Generic (PLEG): container finished" podID="e3e810be-b9e6-41ce-af9d-e9ca45c7f429" containerID="90b0086309a783d92377a40ad51db12e52429af5d0af33e698523f20e3c29ec8" exitCode=0 Dec 16 08:55:41 crc kubenswrapper[4823]: I1216 08:55:40.688460 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4jkvj" event={"ID":"e3e810be-b9e6-41ce-af9d-e9ca45c7f429","Type":"ContainerDied","Data":"90b0086309a783d92377a40ad51db12e52429af5d0af33e698523f20e3c29ec8"} Dec 16 08:55:41 crc kubenswrapper[4823]: I1216 08:55:40.688491 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4jkvj" event={"ID":"e3e810be-b9e6-41ce-af9d-e9ca45c7f429","Type":"ContainerDied","Data":"1d24e320fc57b32d3184c4968890fb01570710a66953ea89f8fe94540103705d"} Dec 16 08:55:41 crc kubenswrapper[4823]: I1216 08:55:40.688511 4823 scope.go:117] "RemoveContainer" containerID="90b0086309a783d92377a40ad51db12e52429af5d0af33e698523f20e3c29ec8" Dec 16 08:55:41 crc kubenswrapper[4823]: I1216 08:55:40.688694 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4jkvj" Dec 16 08:55:41 crc kubenswrapper[4823]: I1216 08:55:40.707413 4823 scope.go:117] "RemoveContainer" containerID="0fc76d91fbb96ac3988d3b3579b78c99811a89e1ea14ccee415e423f175c4843" Dec 16 08:55:41 crc kubenswrapper[4823]: I1216 08:55:40.726242 4823 scope.go:117] "RemoveContainer" containerID="0918be923247b8aff53cd2144bbd2b5767a0e3d08642a7ab9513b101635a1291" Dec 16 08:55:41 crc kubenswrapper[4823]: I1216 08:55:40.767297 4823 scope.go:117] "RemoveContainer" containerID="90b0086309a783d92377a40ad51db12e52429af5d0af33e698523f20e3c29ec8" Dec 16 08:55:41 crc kubenswrapper[4823]: E1216 08:55:40.767705 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90b0086309a783d92377a40ad51db12e52429af5d0af33e698523f20e3c29ec8\": container with ID starting with 90b0086309a783d92377a40ad51db12e52429af5d0af33e698523f20e3c29ec8 not found: ID does not exist" containerID="90b0086309a783d92377a40ad51db12e52429af5d0af33e698523f20e3c29ec8" Dec 16 08:55:41 crc kubenswrapper[4823]: I1216 08:55:40.767740 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90b0086309a783d92377a40ad51db12e52429af5d0af33e698523f20e3c29ec8"} err="failed to get container status \"90b0086309a783d92377a40ad51db12e52429af5d0af33e698523f20e3c29ec8\": rpc error: code = NotFound desc = could not find container \"90b0086309a783d92377a40ad51db12e52429af5d0af33e698523f20e3c29ec8\": container with ID starting with 90b0086309a783d92377a40ad51db12e52429af5d0af33e698523f20e3c29ec8 not found: ID does not exist" Dec 16 08:55:41 crc kubenswrapper[4823]: I1216 08:55:40.767769 4823 scope.go:117] "RemoveContainer" containerID="0fc76d91fbb96ac3988d3b3579b78c99811a89e1ea14ccee415e423f175c4843" Dec 16 08:55:41 crc kubenswrapper[4823]: E1216 08:55:40.768002 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"0fc76d91fbb96ac3988d3b3579b78c99811a89e1ea14ccee415e423f175c4843\": container with ID starting with 0fc76d91fbb96ac3988d3b3579b78c99811a89e1ea14ccee415e423f175c4843 not found: ID does not exist" containerID="0fc76d91fbb96ac3988d3b3579b78c99811a89e1ea14ccee415e423f175c4843" Dec 16 08:55:41 crc kubenswrapper[4823]: I1216 08:55:40.768037 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0fc76d91fbb96ac3988d3b3579b78c99811a89e1ea14ccee415e423f175c4843"} err="failed to get container status \"0fc76d91fbb96ac3988d3b3579b78c99811a89e1ea14ccee415e423f175c4843\": rpc error: code = NotFound desc = could not find container \"0fc76d91fbb96ac3988d3b3579b78c99811a89e1ea14ccee415e423f175c4843\": container with ID starting with 0fc76d91fbb96ac3988d3b3579b78c99811a89e1ea14ccee415e423f175c4843 not found: ID does not exist" Dec 16 08:55:41 crc kubenswrapper[4823]: I1216 08:55:40.768052 4823 scope.go:117] "RemoveContainer" containerID="0918be923247b8aff53cd2144bbd2b5767a0e3d08642a7ab9513b101635a1291" Dec 16 08:55:41 crc kubenswrapper[4823]: E1216 08:55:40.768250 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0918be923247b8aff53cd2144bbd2b5767a0e3d08642a7ab9513b101635a1291\": container with ID starting with 0918be923247b8aff53cd2144bbd2b5767a0e3d08642a7ab9513b101635a1291 not found: ID does not exist" containerID="0918be923247b8aff53cd2144bbd2b5767a0e3d08642a7ab9513b101635a1291" Dec 16 08:55:41 crc kubenswrapper[4823]: I1216 08:55:40.768268 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0918be923247b8aff53cd2144bbd2b5767a0e3d08642a7ab9513b101635a1291"} err="failed to get container status \"0918be923247b8aff53cd2144bbd2b5767a0e3d08642a7ab9513b101635a1291\": rpc error: code = NotFound desc = could not find container 
\"0918be923247b8aff53cd2144bbd2b5767a0e3d08642a7ab9513b101635a1291\": container with ID starting with 0918be923247b8aff53cd2144bbd2b5767a0e3d08642a7ab9513b101635a1291 not found: ID does not exist" Dec 16 08:55:41 crc kubenswrapper[4823]: I1216 08:55:40.779566 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3e810be-b9e6-41ce-af9d-e9ca45c7f429-catalog-content\") pod \"e3e810be-b9e6-41ce-af9d-e9ca45c7f429\" (UID: \"e3e810be-b9e6-41ce-af9d-e9ca45c7f429\") " Dec 16 08:55:41 crc kubenswrapper[4823]: I1216 08:55:40.779650 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p7htd\" (UniqueName: \"kubernetes.io/projected/e3e810be-b9e6-41ce-af9d-e9ca45c7f429-kube-api-access-p7htd\") pod \"e3e810be-b9e6-41ce-af9d-e9ca45c7f429\" (UID: \"e3e810be-b9e6-41ce-af9d-e9ca45c7f429\") " Dec 16 08:55:41 crc kubenswrapper[4823]: I1216 08:55:40.779757 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3e810be-b9e6-41ce-af9d-e9ca45c7f429-utilities\") pod \"e3e810be-b9e6-41ce-af9d-e9ca45c7f429\" (UID: \"e3e810be-b9e6-41ce-af9d-e9ca45c7f429\") " Dec 16 08:55:41 crc kubenswrapper[4823]: I1216 08:55:40.784598 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3e810be-b9e6-41ce-af9d-e9ca45c7f429-utilities" (OuterVolumeSpecName: "utilities") pod "e3e810be-b9e6-41ce-af9d-e9ca45c7f429" (UID: "e3e810be-b9e6-41ce-af9d-e9ca45c7f429"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 08:55:41 crc kubenswrapper[4823]: I1216 08:55:40.790167 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3e810be-b9e6-41ce-af9d-e9ca45c7f429-kube-api-access-p7htd" (OuterVolumeSpecName: "kube-api-access-p7htd") pod "e3e810be-b9e6-41ce-af9d-e9ca45c7f429" (UID: "e3e810be-b9e6-41ce-af9d-e9ca45c7f429"). InnerVolumeSpecName "kube-api-access-p7htd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:55:41 crc kubenswrapper[4823]: I1216 08:55:40.809041 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3e810be-b9e6-41ce-af9d-e9ca45c7f429-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e3e810be-b9e6-41ce-af9d-e9ca45c7f429" (UID: "e3e810be-b9e6-41ce-af9d-e9ca45c7f429"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 08:55:41 crc kubenswrapper[4823]: I1216 08:55:40.882156 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p7htd\" (UniqueName: \"kubernetes.io/projected/e3e810be-b9e6-41ce-af9d-e9ca45c7f429-kube-api-access-p7htd\") on node \"crc\" DevicePath \"\"" Dec 16 08:55:41 crc kubenswrapper[4823]: I1216 08:55:40.882200 4823 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3e810be-b9e6-41ce-af9d-e9ca45c7f429-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 08:55:41 crc kubenswrapper[4823]: I1216 08:55:40.882211 4823 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3e810be-b9e6-41ce-af9d-e9ca45c7f429-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 08:55:41 crc kubenswrapper[4823]: I1216 08:55:41.029067 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4jkvj"] Dec 16 08:55:41 crc kubenswrapper[4823]: I1216 
08:55:41.036587 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-4jkvj"] Dec 16 08:55:41 crc kubenswrapper[4823]: I1216 08:55:41.494970 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-nwbdc" Dec 16 08:55:41 crc kubenswrapper[4823]: I1216 08:55:41.495333 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-nwbdc" Dec 16 08:55:41 crc kubenswrapper[4823]: I1216 08:55:41.782404 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3e810be-b9e6-41ce-af9d-e9ca45c7f429" path="/var/lib/kubelet/pods/e3e810be-b9e6-41ce-af9d-e9ca45c7f429/volumes" Dec 16 08:55:42 crc kubenswrapper[4823]: I1216 08:55:42.533646 4823 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-nwbdc" podUID="b2efa33c-f9b1-44d2-9925-5417a4391430" containerName="registry-server" probeResult="failure" output=< Dec 16 08:55:42 crc kubenswrapper[4823]: timeout: failed to connect service ":50051" within 1s Dec 16 08:55:42 crc kubenswrapper[4823]: > Dec 16 08:55:45 crc kubenswrapper[4823]: I1216 08:55:45.772118 4823 scope.go:117] "RemoveContainer" containerID="9ce3e6cc66a3ba1f5a9f07614bbf78a449581b45707f8e1e5d9794f67e5c0428" Dec 16 08:55:45 crc kubenswrapper[4823]: E1216 08:55:45.773740 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 08:55:51 crc kubenswrapper[4823]: I1216 08:55:51.574708 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-operators-nwbdc" Dec 16 08:55:51 crc kubenswrapper[4823]: I1216 08:55:51.668508 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-nwbdc" Dec 16 08:55:51 crc kubenswrapper[4823]: I1216 08:55:51.821578 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nwbdc"] Dec 16 08:55:52 crc kubenswrapper[4823]: I1216 08:55:52.799923 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-nwbdc" podUID="b2efa33c-f9b1-44d2-9925-5417a4391430" containerName="registry-server" containerID="cri-o://97a8e9ff643bd93e9e109391ecf7ebf132a57b077a8bc22026ae555867d22ede" gracePeriod=2 Dec 16 08:55:52 crc kubenswrapper[4823]: E1216 08:55:52.958643 4823 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb2efa33c_f9b1_44d2_9925_5417a4391430.slice/crio-97a8e9ff643bd93e9e109391ecf7ebf132a57b077a8bc22026ae555867d22ede.scope\": RecentStats: unable to find data in memory cache]" Dec 16 08:55:53 crc kubenswrapper[4823]: I1216 08:55:53.270204 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nwbdc" Dec 16 08:55:53 crc kubenswrapper[4823]: I1216 08:55:53.349163 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6477q\" (UniqueName: \"kubernetes.io/projected/b2efa33c-f9b1-44d2-9925-5417a4391430-kube-api-access-6477q\") pod \"b2efa33c-f9b1-44d2-9925-5417a4391430\" (UID: \"b2efa33c-f9b1-44d2-9925-5417a4391430\") " Dec 16 08:55:53 crc kubenswrapper[4823]: I1216 08:55:53.349241 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2efa33c-f9b1-44d2-9925-5417a4391430-catalog-content\") pod \"b2efa33c-f9b1-44d2-9925-5417a4391430\" (UID: \"b2efa33c-f9b1-44d2-9925-5417a4391430\") " Dec 16 08:55:53 crc kubenswrapper[4823]: I1216 08:55:53.349260 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2efa33c-f9b1-44d2-9925-5417a4391430-utilities\") pod \"b2efa33c-f9b1-44d2-9925-5417a4391430\" (UID: \"b2efa33c-f9b1-44d2-9925-5417a4391430\") " Dec 16 08:55:53 crc kubenswrapper[4823]: I1216 08:55:53.350413 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2efa33c-f9b1-44d2-9925-5417a4391430-utilities" (OuterVolumeSpecName: "utilities") pod "b2efa33c-f9b1-44d2-9925-5417a4391430" (UID: "b2efa33c-f9b1-44d2-9925-5417a4391430"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 08:55:53 crc kubenswrapper[4823]: I1216 08:55:53.357537 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2efa33c-f9b1-44d2-9925-5417a4391430-kube-api-access-6477q" (OuterVolumeSpecName: "kube-api-access-6477q") pod "b2efa33c-f9b1-44d2-9925-5417a4391430" (UID: "b2efa33c-f9b1-44d2-9925-5417a4391430"). InnerVolumeSpecName "kube-api-access-6477q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:55:53 crc kubenswrapper[4823]: I1216 08:55:53.450661 4823 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2efa33c-f9b1-44d2-9925-5417a4391430-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 08:55:53 crc kubenswrapper[4823]: I1216 08:55:53.450871 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6477q\" (UniqueName: \"kubernetes.io/projected/b2efa33c-f9b1-44d2-9925-5417a4391430-kube-api-access-6477q\") on node \"crc\" DevicePath \"\"" Dec 16 08:55:53 crc kubenswrapper[4823]: I1216 08:55:53.473277 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2efa33c-f9b1-44d2-9925-5417a4391430-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b2efa33c-f9b1-44d2-9925-5417a4391430" (UID: "b2efa33c-f9b1-44d2-9925-5417a4391430"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 08:55:53 crc kubenswrapper[4823]: I1216 08:55:53.552992 4823 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2efa33c-f9b1-44d2-9925-5417a4391430-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 08:55:53 crc kubenswrapper[4823]: I1216 08:55:53.808803 4823 generic.go:334] "Generic (PLEG): container finished" podID="b2efa33c-f9b1-44d2-9925-5417a4391430" containerID="97a8e9ff643bd93e9e109391ecf7ebf132a57b077a8bc22026ae555867d22ede" exitCode=0 Dec 16 08:55:53 crc kubenswrapper[4823]: I1216 08:55:53.808864 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nwbdc" event={"ID":"b2efa33c-f9b1-44d2-9925-5417a4391430","Type":"ContainerDied","Data":"97a8e9ff643bd93e9e109391ecf7ebf132a57b077a8bc22026ae555867d22ede"} Dec 16 08:55:53 crc kubenswrapper[4823]: I1216 08:55:53.808905 4823 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-nwbdc" event={"ID":"b2efa33c-f9b1-44d2-9925-5417a4391430","Type":"ContainerDied","Data":"de9a6e5be7f19930d110db13f092b21ab75a90059c8fa637d19a61c772b7ad17"} Dec 16 08:55:53 crc kubenswrapper[4823]: I1216 08:55:53.808928 4823 scope.go:117] "RemoveContainer" containerID="97a8e9ff643bd93e9e109391ecf7ebf132a57b077a8bc22026ae555867d22ede" Dec 16 08:55:53 crc kubenswrapper[4823]: I1216 08:55:53.810539 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nwbdc" Dec 16 08:55:53 crc kubenswrapper[4823]: I1216 08:55:53.839552 4823 scope.go:117] "RemoveContainer" containerID="f93d5d9c1cf02a0db8e6550c377e282c4daf2a1536dfb7f591618b44aa9080f7" Dec 16 08:55:53 crc kubenswrapper[4823]: I1216 08:55:53.849395 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nwbdc"] Dec 16 08:55:53 crc kubenswrapper[4823]: I1216 08:55:53.858890 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-nwbdc"] Dec 16 08:55:53 crc kubenswrapper[4823]: I1216 08:55:53.859743 4823 scope.go:117] "RemoveContainer" containerID="8903d087787358356731b57ac754ef4eb82740d4584c402d49de18d99e5ba759" Dec 16 08:55:53 crc kubenswrapper[4823]: I1216 08:55:53.893881 4823 scope.go:117] "RemoveContainer" containerID="97a8e9ff643bd93e9e109391ecf7ebf132a57b077a8bc22026ae555867d22ede" Dec 16 08:55:53 crc kubenswrapper[4823]: E1216 08:55:53.894623 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97a8e9ff643bd93e9e109391ecf7ebf132a57b077a8bc22026ae555867d22ede\": container with ID starting with 97a8e9ff643bd93e9e109391ecf7ebf132a57b077a8bc22026ae555867d22ede not found: ID does not exist" containerID="97a8e9ff643bd93e9e109391ecf7ebf132a57b077a8bc22026ae555867d22ede" Dec 16 08:55:53 crc kubenswrapper[4823]: I1216 08:55:53.894674 4823 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97a8e9ff643bd93e9e109391ecf7ebf132a57b077a8bc22026ae555867d22ede"} err="failed to get container status \"97a8e9ff643bd93e9e109391ecf7ebf132a57b077a8bc22026ae555867d22ede\": rpc error: code = NotFound desc = could not find container \"97a8e9ff643bd93e9e109391ecf7ebf132a57b077a8bc22026ae555867d22ede\": container with ID starting with 97a8e9ff643bd93e9e109391ecf7ebf132a57b077a8bc22026ae555867d22ede not found: ID does not exist" Dec 16 08:55:53 crc kubenswrapper[4823]: I1216 08:55:53.894704 4823 scope.go:117] "RemoveContainer" containerID="f93d5d9c1cf02a0db8e6550c377e282c4daf2a1536dfb7f591618b44aa9080f7" Dec 16 08:55:53 crc kubenswrapper[4823]: E1216 08:55:53.895144 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f93d5d9c1cf02a0db8e6550c377e282c4daf2a1536dfb7f591618b44aa9080f7\": container with ID starting with f93d5d9c1cf02a0db8e6550c377e282c4daf2a1536dfb7f591618b44aa9080f7 not found: ID does not exist" containerID="f93d5d9c1cf02a0db8e6550c377e282c4daf2a1536dfb7f591618b44aa9080f7" Dec 16 08:55:53 crc kubenswrapper[4823]: I1216 08:55:53.895198 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f93d5d9c1cf02a0db8e6550c377e282c4daf2a1536dfb7f591618b44aa9080f7"} err="failed to get container status \"f93d5d9c1cf02a0db8e6550c377e282c4daf2a1536dfb7f591618b44aa9080f7\": rpc error: code = NotFound desc = could not find container \"f93d5d9c1cf02a0db8e6550c377e282c4daf2a1536dfb7f591618b44aa9080f7\": container with ID starting with f93d5d9c1cf02a0db8e6550c377e282c4daf2a1536dfb7f591618b44aa9080f7 not found: ID does not exist" Dec 16 08:55:53 crc kubenswrapper[4823]: I1216 08:55:53.895229 4823 scope.go:117] "RemoveContainer" containerID="8903d087787358356731b57ac754ef4eb82740d4584c402d49de18d99e5ba759" Dec 16 08:55:53 crc kubenswrapper[4823]: E1216 
08:55:53.895626 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8903d087787358356731b57ac754ef4eb82740d4584c402d49de18d99e5ba759\": container with ID starting with 8903d087787358356731b57ac754ef4eb82740d4584c402d49de18d99e5ba759 not found: ID does not exist" containerID="8903d087787358356731b57ac754ef4eb82740d4584c402d49de18d99e5ba759" Dec 16 08:55:53 crc kubenswrapper[4823]: I1216 08:55:53.895658 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8903d087787358356731b57ac754ef4eb82740d4584c402d49de18d99e5ba759"} err="failed to get container status \"8903d087787358356731b57ac754ef4eb82740d4584c402d49de18d99e5ba759\": rpc error: code = NotFound desc = could not find container \"8903d087787358356731b57ac754ef4eb82740d4584c402d49de18d99e5ba759\": container with ID starting with 8903d087787358356731b57ac754ef4eb82740d4584c402d49de18d99e5ba759 not found: ID does not exist" Dec 16 08:55:55 crc kubenswrapper[4823]: I1216 08:55:55.780943 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2efa33c-f9b1-44d2-9925-5417a4391430" path="/var/lib/kubelet/pods/b2efa33c-f9b1-44d2-9925-5417a4391430/volumes" Dec 16 08:55:57 crc kubenswrapper[4823]: I1216 08:55:57.741595 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-4bvq6"] Dec 16 08:55:57 crc kubenswrapper[4823]: E1216 08:55:57.742310 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2efa33c-f9b1-44d2-9925-5417a4391430" containerName="registry-server" Dec 16 08:55:57 crc kubenswrapper[4823]: I1216 08:55:57.742328 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2efa33c-f9b1-44d2-9925-5417a4391430" containerName="registry-server" Dec 16 08:55:57 crc kubenswrapper[4823]: E1216 08:55:57.742347 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2efa33c-f9b1-44d2-9925-5417a4391430" 
containerName="extract-content" Dec 16 08:55:57 crc kubenswrapper[4823]: I1216 08:55:57.742355 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2efa33c-f9b1-44d2-9925-5417a4391430" containerName="extract-content" Dec 16 08:55:57 crc kubenswrapper[4823]: E1216 08:55:57.742365 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2efa33c-f9b1-44d2-9925-5417a4391430" containerName="extract-utilities" Dec 16 08:55:57 crc kubenswrapper[4823]: I1216 08:55:57.742373 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2efa33c-f9b1-44d2-9925-5417a4391430" containerName="extract-utilities" Dec 16 08:55:57 crc kubenswrapper[4823]: E1216 08:55:57.742392 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3e810be-b9e6-41ce-af9d-e9ca45c7f429" containerName="extract-utilities" Dec 16 08:55:57 crc kubenswrapper[4823]: I1216 08:55:57.742400 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3e810be-b9e6-41ce-af9d-e9ca45c7f429" containerName="extract-utilities" Dec 16 08:55:57 crc kubenswrapper[4823]: E1216 08:55:57.742432 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3e810be-b9e6-41ce-af9d-e9ca45c7f429" containerName="extract-content" Dec 16 08:55:57 crc kubenswrapper[4823]: I1216 08:55:57.742440 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3e810be-b9e6-41ce-af9d-e9ca45c7f429" containerName="extract-content" Dec 16 08:55:57 crc kubenswrapper[4823]: E1216 08:55:57.742448 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7a34646-6156-46b7-be55-3b645e543cb0" containerName="neutron-httpd" Dec 16 08:55:57 crc kubenswrapper[4823]: I1216 08:55:57.742455 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7a34646-6156-46b7-be55-3b645e543cb0" containerName="neutron-httpd" Dec 16 08:55:57 crc kubenswrapper[4823]: E1216 08:55:57.742470 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3e810be-b9e6-41ce-af9d-e9ca45c7f429" 
containerName="registry-server" Dec 16 08:55:57 crc kubenswrapper[4823]: I1216 08:55:57.742477 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3e810be-b9e6-41ce-af9d-e9ca45c7f429" containerName="registry-server" Dec 16 08:55:57 crc kubenswrapper[4823]: E1216 08:55:57.742489 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7a34646-6156-46b7-be55-3b645e543cb0" containerName="neutron-api" Dec 16 08:55:57 crc kubenswrapper[4823]: I1216 08:55:57.742497 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7a34646-6156-46b7-be55-3b645e543cb0" containerName="neutron-api" Dec 16 08:55:57 crc kubenswrapper[4823]: I1216 08:55:57.742709 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7a34646-6156-46b7-be55-3b645e543cb0" containerName="neutron-httpd" Dec 16 08:55:57 crc kubenswrapper[4823]: I1216 08:55:57.742734 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2efa33c-f9b1-44d2-9925-5417a4391430" containerName="registry-server" Dec 16 08:55:57 crc kubenswrapper[4823]: I1216 08:55:57.742745 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7a34646-6156-46b7-be55-3b645e543cb0" containerName="neutron-api" Dec 16 08:55:57 crc kubenswrapper[4823]: I1216 08:55:57.742759 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3e810be-b9e6-41ce-af9d-e9ca45c7f429" containerName="registry-server" Dec 16 08:55:57 crc kubenswrapper[4823]: I1216 08:55:57.744705 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-4bvq6" Dec 16 08:55:57 crc kubenswrapper[4823]: I1216 08:55:57.747208 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-5ph8n" Dec 16 08:55:57 crc kubenswrapper[4823]: I1216 08:55:57.747227 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Dec 16 08:55:57 crc kubenswrapper[4823]: I1216 08:55:57.747248 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Dec 16 08:55:57 crc kubenswrapper[4823]: I1216 08:55:57.747264 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Dec 16 08:55:57 crc kubenswrapper[4823]: I1216 08:55:57.747230 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Dec 16 08:55:57 crc kubenswrapper[4823]: I1216 08:55:57.787261 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-4bvq6"] Dec 16 08:55:57 crc kubenswrapper[4823]: I1216 08:55:57.795080 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-4bvq6"] Dec 16 08:55:57 crc kubenswrapper[4823]: E1216 08:55:57.795745 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-n5pjf ring-data-devices scripts swiftconf], unattached volumes=[], failed to process volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-n5pjf ring-data-devices scripts swiftconf]: context canceled" pod="openstack/swift-ring-rebalance-4bvq6" podUID="b15b075d-ecdf-431b-aa6d-cfa09258d2e5" Dec 16 08:55:57 crc kubenswrapper[4823]: I1216 08:55:57.842498 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-4bvq6" Dec 16 08:55:57 crc kubenswrapper[4823]: I1216 08:55:57.858868 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-4bvq6" Dec 16 08:55:57 crc kubenswrapper[4823]: I1216 08:55:57.865604 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7785588999-7gvll"] Dec 16 08:55:57 crc kubenswrapper[4823]: I1216 08:55:57.867277 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7785588999-7gvll" Dec 16 08:55:57 crc kubenswrapper[4823]: I1216 08:55:57.880725 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7785588999-7gvll"] Dec 16 08:55:57 crc kubenswrapper[4823]: I1216 08:55:57.938160 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b15b075d-ecdf-431b-aa6d-cfa09258d2e5-scripts\") pod \"swift-ring-rebalance-4bvq6\" (UID: \"b15b075d-ecdf-431b-aa6d-cfa09258d2e5\") " pod="openstack/swift-ring-rebalance-4bvq6" Dec 16 08:55:57 crc kubenswrapper[4823]: I1216 08:55:57.938243 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b15b075d-ecdf-431b-aa6d-cfa09258d2e5-dispersionconf\") pod \"swift-ring-rebalance-4bvq6\" (UID: \"b15b075d-ecdf-431b-aa6d-cfa09258d2e5\") " pod="openstack/swift-ring-rebalance-4bvq6" Dec 16 08:55:57 crc kubenswrapper[4823]: I1216 08:55:57.938326 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b15b075d-ecdf-431b-aa6d-cfa09258d2e5-ring-data-devices\") pod \"swift-ring-rebalance-4bvq6\" (UID: \"b15b075d-ecdf-431b-aa6d-cfa09258d2e5\") " pod="openstack/swift-ring-rebalance-4bvq6" Dec 16 08:55:57 crc 
kubenswrapper[4823]: I1216 08:55:57.938372 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b15b075d-ecdf-431b-aa6d-cfa09258d2e5-swiftconf\") pod \"swift-ring-rebalance-4bvq6\" (UID: \"b15b075d-ecdf-431b-aa6d-cfa09258d2e5\") " pod="openstack/swift-ring-rebalance-4bvq6" Dec 16 08:55:57 crc kubenswrapper[4823]: I1216 08:55:57.938443 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b15b075d-ecdf-431b-aa6d-cfa09258d2e5-combined-ca-bundle\") pod \"swift-ring-rebalance-4bvq6\" (UID: \"b15b075d-ecdf-431b-aa6d-cfa09258d2e5\") " pod="openstack/swift-ring-rebalance-4bvq6" Dec 16 08:55:57 crc kubenswrapper[4823]: I1216 08:55:57.938481 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b15b075d-ecdf-431b-aa6d-cfa09258d2e5-etc-swift\") pod \"swift-ring-rebalance-4bvq6\" (UID: \"b15b075d-ecdf-431b-aa6d-cfa09258d2e5\") " pod="openstack/swift-ring-rebalance-4bvq6" Dec 16 08:55:57 crc kubenswrapper[4823]: I1216 08:55:57.938524 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5pjf\" (UniqueName: \"kubernetes.io/projected/b15b075d-ecdf-431b-aa6d-cfa09258d2e5-kube-api-access-n5pjf\") pod \"swift-ring-rebalance-4bvq6\" (UID: \"b15b075d-ecdf-431b-aa6d-cfa09258d2e5\") " pod="openstack/swift-ring-rebalance-4bvq6" Dec 16 08:55:58 crc kubenswrapper[4823]: I1216 08:55:58.040836 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3a9974ca-c280-4c37-9ca8-1f70128d8ea0-ovsdbserver-sb\") pod \"dnsmasq-dns-7785588999-7gvll\" (UID: \"3a9974ca-c280-4c37-9ca8-1f70128d8ea0\") " pod="openstack/dnsmasq-dns-7785588999-7gvll" 
Dec 16 08:55:58 crc kubenswrapper[4823]: I1216 08:55:58.040906 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b15b075d-ecdf-431b-aa6d-cfa09258d2e5-dispersionconf\") pod \"swift-ring-rebalance-4bvq6\" (UID: \"b15b075d-ecdf-431b-aa6d-cfa09258d2e5\") " pod="openstack/swift-ring-rebalance-4bvq6" Dec 16 08:55:58 crc kubenswrapper[4823]: I1216 08:55:58.040965 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4r27c\" (UniqueName: \"kubernetes.io/projected/3a9974ca-c280-4c37-9ca8-1f70128d8ea0-kube-api-access-4r27c\") pod \"dnsmasq-dns-7785588999-7gvll\" (UID: \"3a9974ca-c280-4c37-9ca8-1f70128d8ea0\") " pod="openstack/dnsmasq-dns-7785588999-7gvll" Dec 16 08:55:58 crc kubenswrapper[4823]: I1216 08:55:58.041013 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3a9974ca-c280-4c37-9ca8-1f70128d8ea0-ovsdbserver-nb\") pod \"dnsmasq-dns-7785588999-7gvll\" (UID: \"3a9974ca-c280-4c37-9ca8-1f70128d8ea0\") " pod="openstack/dnsmasq-dns-7785588999-7gvll" Dec 16 08:55:58 crc kubenswrapper[4823]: I1216 08:55:58.041269 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b15b075d-ecdf-431b-aa6d-cfa09258d2e5-ring-data-devices\") pod \"swift-ring-rebalance-4bvq6\" (UID: \"b15b075d-ecdf-431b-aa6d-cfa09258d2e5\") " pod="openstack/swift-ring-rebalance-4bvq6" Dec 16 08:55:58 crc kubenswrapper[4823]: I1216 08:55:58.041369 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b15b075d-ecdf-431b-aa6d-cfa09258d2e5-swiftconf\") pod \"swift-ring-rebalance-4bvq6\" (UID: \"b15b075d-ecdf-431b-aa6d-cfa09258d2e5\") " pod="openstack/swift-ring-rebalance-4bvq6" Dec 16 08:55:58 crc 
kubenswrapper[4823]: I1216 08:55:58.041436 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b15b075d-ecdf-431b-aa6d-cfa09258d2e5-combined-ca-bundle\") pod \"swift-ring-rebalance-4bvq6\" (UID: \"b15b075d-ecdf-431b-aa6d-cfa09258d2e5\") " pod="openstack/swift-ring-rebalance-4bvq6" Dec 16 08:55:58 crc kubenswrapper[4823]: I1216 08:55:58.041470 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b15b075d-ecdf-431b-aa6d-cfa09258d2e5-etc-swift\") pod \"swift-ring-rebalance-4bvq6\" (UID: \"b15b075d-ecdf-431b-aa6d-cfa09258d2e5\") " pod="openstack/swift-ring-rebalance-4bvq6" Dec 16 08:55:58 crc kubenswrapper[4823]: I1216 08:55:58.041500 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a9974ca-c280-4c37-9ca8-1f70128d8ea0-config\") pod \"dnsmasq-dns-7785588999-7gvll\" (UID: \"3a9974ca-c280-4c37-9ca8-1f70128d8ea0\") " pod="openstack/dnsmasq-dns-7785588999-7gvll" Dec 16 08:55:58 crc kubenswrapper[4823]: I1216 08:55:58.041533 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3a9974ca-c280-4c37-9ca8-1f70128d8ea0-dns-svc\") pod \"dnsmasq-dns-7785588999-7gvll\" (UID: \"3a9974ca-c280-4c37-9ca8-1f70128d8ea0\") " pod="openstack/dnsmasq-dns-7785588999-7gvll" Dec 16 08:55:58 crc kubenswrapper[4823]: I1216 08:55:58.041565 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5pjf\" (UniqueName: \"kubernetes.io/projected/b15b075d-ecdf-431b-aa6d-cfa09258d2e5-kube-api-access-n5pjf\") pod \"swift-ring-rebalance-4bvq6\" (UID: \"b15b075d-ecdf-431b-aa6d-cfa09258d2e5\") " pod="openstack/swift-ring-rebalance-4bvq6" Dec 16 08:55:58 crc kubenswrapper[4823]: I1216 08:55:58.041604 4823 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b15b075d-ecdf-431b-aa6d-cfa09258d2e5-scripts\") pod \"swift-ring-rebalance-4bvq6\" (UID: \"b15b075d-ecdf-431b-aa6d-cfa09258d2e5\") " pod="openstack/swift-ring-rebalance-4bvq6" Dec 16 08:55:58 crc kubenswrapper[4823]: I1216 08:55:58.042483 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b15b075d-ecdf-431b-aa6d-cfa09258d2e5-ring-data-devices\") pod \"swift-ring-rebalance-4bvq6\" (UID: \"b15b075d-ecdf-431b-aa6d-cfa09258d2e5\") " pod="openstack/swift-ring-rebalance-4bvq6" Dec 16 08:55:58 crc kubenswrapper[4823]: I1216 08:55:58.042825 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b15b075d-ecdf-431b-aa6d-cfa09258d2e5-etc-swift\") pod \"swift-ring-rebalance-4bvq6\" (UID: \"b15b075d-ecdf-431b-aa6d-cfa09258d2e5\") " pod="openstack/swift-ring-rebalance-4bvq6" Dec 16 08:55:58 crc kubenswrapper[4823]: I1216 08:55:58.051628 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b15b075d-ecdf-431b-aa6d-cfa09258d2e5-swiftconf\") pod \"swift-ring-rebalance-4bvq6\" (UID: \"b15b075d-ecdf-431b-aa6d-cfa09258d2e5\") " pod="openstack/swift-ring-rebalance-4bvq6" Dec 16 08:55:58 crc kubenswrapper[4823]: I1216 08:55:58.054483 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b15b075d-ecdf-431b-aa6d-cfa09258d2e5-combined-ca-bundle\") pod \"swift-ring-rebalance-4bvq6\" (UID: \"b15b075d-ecdf-431b-aa6d-cfa09258d2e5\") " pod="openstack/swift-ring-rebalance-4bvq6" Dec 16 08:55:58 crc kubenswrapper[4823]: I1216 08:55:58.067000 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5pjf\" (UniqueName: 
\"kubernetes.io/projected/b15b075d-ecdf-431b-aa6d-cfa09258d2e5-kube-api-access-n5pjf\") pod \"swift-ring-rebalance-4bvq6\" (UID: \"b15b075d-ecdf-431b-aa6d-cfa09258d2e5\") " pod="openstack/swift-ring-rebalance-4bvq6" Dec 16 08:55:58 crc kubenswrapper[4823]: I1216 08:55:58.053413 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b15b075d-ecdf-431b-aa6d-cfa09258d2e5-dispersionconf\") pod \"swift-ring-rebalance-4bvq6\" (UID: \"b15b075d-ecdf-431b-aa6d-cfa09258d2e5\") " pod="openstack/swift-ring-rebalance-4bvq6" Dec 16 08:55:58 crc kubenswrapper[4823]: I1216 08:55:58.090783 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b15b075d-ecdf-431b-aa6d-cfa09258d2e5-scripts\") pod \"swift-ring-rebalance-4bvq6\" (UID: \"b15b075d-ecdf-431b-aa6d-cfa09258d2e5\") " pod="openstack/swift-ring-rebalance-4bvq6" Dec 16 08:55:58 crc kubenswrapper[4823]: I1216 08:55:58.143159 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b15b075d-ecdf-431b-aa6d-cfa09258d2e5-etc-swift\") pod \"b15b075d-ecdf-431b-aa6d-cfa09258d2e5\" (UID: \"b15b075d-ecdf-431b-aa6d-cfa09258d2e5\") " Dec 16 08:55:58 crc kubenswrapper[4823]: I1216 08:55:58.143352 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b15b075d-ecdf-431b-aa6d-cfa09258d2e5-ring-data-devices\") pod \"b15b075d-ecdf-431b-aa6d-cfa09258d2e5\" (UID: \"b15b075d-ecdf-431b-aa6d-cfa09258d2e5\") " Dec 16 08:55:58 crc kubenswrapper[4823]: I1216 08:55:58.143556 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b15b075d-ecdf-431b-aa6d-cfa09258d2e5-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "b15b075d-ecdf-431b-aa6d-cfa09258d2e5" (UID: "b15b075d-ecdf-431b-aa6d-cfa09258d2e5"). 
InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 08:55:58 crc kubenswrapper[4823]: I1216 08:55:58.143824 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a9974ca-c280-4c37-9ca8-1f70128d8ea0-config\") pod \"dnsmasq-dns-7785588999-7gvll\" (UID: \"3a9974ca-c280-4c37-9ca8-1f70128d8ea0\") " pod="openstack/dnsmasq-dns-7785588999-7gvll" Dec 16 08:55:58 crc kubenswrapper[4823]: I1216 08:55:58.143865 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3a9974ca-c280-4c37-9ca8-1f70128d8ea0-dns-svc\") pod \"dnsmasq-dns-7785588999-7gvll\" (UID: \"3a9974ca-c280-4c37-9ca8-1f70128d8ea0\") " pod="openstack/dnsmasq-dns-7785588999-7gvll" Dec 16 08:55:58 crc kubenswrapper[4823]: I1216 08:55:58.143817 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b15b075d-ecdf-431b-aa6d-cfa09258d2e5-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "b15b075d-ecdf-431b-aa6d-cfa09258d2e5" (UID: "b15b075d-ecdf-431b-aa6d-cfa09258d2e5"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:55:58 crc kubenswrapper[4823]: I1216 08:55:58.144097 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3a9974ca-c280-4c37-9ca8-1f70128d8ea0-ovsdbserver-sb\") pod \"dnsmasq-dns-7785588999-7gvll\" (UID: \"3a9974ca-c280-4c37-9ca8-1f70128d8ea0\") " pod="openstack/dnsmasq-dns-7785588999-7gvll" Dec 16 08:55:58 crc kubenswrapper[4823]: I1216 08:55:58.144180 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4r27c\" (UniqueName: \"kubernetes.io/projected/3a9974ca-c280-4c37-9ca8-1f70128d8ea0-kube-api-access-4r27c\") pod \"dnsmasq-dns-7785588999-7gvll\" (UID: \"3a9974ca-c280-4c37-9ca8-1f70128d8ea0\") " pod="openstack/dnsmasq-dns-7785588999-7gvll" Dec 16 08:55:58 crc kubenswrapper[4823]: I1216 08:55:58.144243 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3a9974ca-c280-4c37-9ca8-1f70128d8ea0-ovsdbserver-nb\") pod \"dnsmasq-dns-7785588999-7gvll\" (UID: \"3a9974ca-c280-4c37-9ca8-1f70128d8ea0\") " pod="openstack/dnsmasq-dns-7785588999-7gvll" Dec 16 08:55:58 crc kubenswrapper[4823]: I1216 08:55:58.144299 4823 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b15b075d-ecdf-431b-aa6d-cfa09258d2e5-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 16 08:55:58 crc kubenswrapper[4823]: I1216 08:55:58.144317 4823 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b15b075d-ecdf-431b-aa6d-cfa09258d2e5-ring-data-devices\") on node \"crc\" DevicePath \"\"" Dec 16 08:55:58 crc kubenswrapper[4823]: I1216 08:55:58.144942 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/3a9974ca-c280-4c37-9ca8-1f70128d8ea0-ovsdbserver-sb\") pod \"dnsmasq-dns-7785588999-7gvll\" (UID: \"3a9974ca-c280-4c37-9ca8-1f70128d8ea0\") " pod="openstack/dnsmasq-dns-7785588999-7gvll" Dec 16 08:55:58 crc kubenswrapper[4823]: I1216 08:55:58.144973 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3a9974ca-c280-4c37-9ca8-1f70128d8ea0-dns-svc\") pod \"dnsmasq-dns-7785588999-7gvll\" (UID: \"3a9974ca-c280-4c37-9ca8-1f70128d8ea0\") " pod="openstack/dnsmasq-dns-7785588999-7gvll" Dec 16 08:55:58 crc kubenswrapper[4823]: I1216 08:55:58.144999 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a9974ca-c280-4c37-9ca8-1f70128d8ea0-config\") pod \"dnsmasq-dns-7785588999-7gvll\" (UID: \"3a9974ca-c280-4c37-9ca8-1f70128d8ea0\") " pod="openstack/dnsmasq-dns-7785588999-7gvll" Dec 16 08:55:58 crc kubenswrapper[4823]: I1216 08:55:58.145063 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3a9974ca-c280-4c37-9ca8-1f70128d8ea0-ovsdbserver-nb\") pod \"dnsmasq-dns-7785588999-7gvll\" (UID: \"3a9974ca-c280-4c37-9ca8-1f70128d8ea0\") " pod="openstack/dnsmasq-dns-7785588999-7gvll" Dec 16 08:55:58 crc kubenswrapper[4823]: I1216 08:55:58.178395 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4r27c\" (UniqueName: \"kubernetes.io/projected/3a9974ca-c280-4c37-9ca8-1f70128d8ea0-kube-api-access-4r27c\") pod \"dnsmasq-dns-7785588999-7gvll\" (UID: \"3a9974ca-c280-4c37-9ca8-1f70128d8ea0\") " pod="openstack/dnsmasq-dns-7785588999-7gvll" Dec 16 08:55:58 crc kubenswrapper[4823]: I1216 08:55:58.189785 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7785588999-7gvll" Dec 16 08:55:58 crc kubenswrapper[4823]: I1216 08:55:58.252221 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b15b075d-ecdf-431b-aa6d-cfa09258d2e5-dispersionconf\") pod \"b15b075d-ecdf-431b-aa6d-cfa09258d2e5\" (UID: \"b15b075d-ecdf-431b-aa6d-cfa09258d2e5\") " Dec 16 08:55:58 crc kubenswrapper[4823]: I1216 08:55:58.252273 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b15b075d-ecdf-431b-aa6d-cfa09258d2e5-scripts\") pod \"b15b075d-ecdf-431b-aa6d-cfa09258d2e5\" (UID: \"b15b075d-ecdf-431b-aa6d-cfa09258d2e5\") " Dec 16 08:55:58 crc kubenswrapper[4823]: I1216 08:55:58.252300 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n5pjf\" (UniqueName: \"kubernetes.io/projected/b15b075d-ecdf-431b-aa6d-cfa09258d2e5-kube-api-access-n5pjf\") pod \"b15b075d-ecdf-431b-aa6d-cfa09258d2e5\" (UID: \"b15b075d-ecdf-431b-aa6d-cfa09258d2e5\") " Dec 16 08:55:58 crc kubenswrapper[4823]: I1216 08:55:58.252345 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b15b075d-ecdf-431b-aa6d-cfa09258d2e5-swiftconf\") pod \"b15b075d-ecdf-431b-aa6d-cfa09258d2e5\" (UID: \"b15b075d-ecdf-431b-aa6d-cfa09258d2e5\") " Dec 16 08:55:58 crc kubenswrapper[4823]: I1216 08:55:58.252378 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b15b075d-ecdf-431b-aa6d-cfa09258d2e5-combined-ca-bundle\") pod \"b15b075d-ecdf-431b-aa6d-cfa09258d2e5\" (UID: \"b15b075d-ecdf-431b-aa6d-cfa09258d2e5\") " Dec 16 08:55:58 crc kubenswrapper[4823]: I1216 08:55:58.255553 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/b15b075d-ecdf-431b-aa6d-cfa09258d2e5-scripts" (OuterVolumeSpecName: "scripts") pod "b15b075d-ecdf-431b-aa6d-cfa09258d2e5" (UID: "b15b075d-ecdf-431b-aa6d-cfa09258d2e5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:55:58 crc kubenswrapper[4823]: I1216 08:55:58.258124 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b15b075d-ecdf-431b-aa6d-cfa09258d2e5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b15b075d-ecdf-431b-aa6d-cfa09258d2e5" (UID: "b15b075d-ecdf-431b-aa6d-cfa09258d2e5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:55:58 crc kubenswrapper[4823]: I1216 08:55:58.258273 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b15b075d-ecdf-431b-aa6d-cfa09258d2e5-kube-api-access-n5pjf" (OuterVolumeSpecName: "kube-api-access-n5pjf") pod "b15b075d-ecdf-431b-aa6d-cfa09258d2e5" (UID: "b15b075d-ecdf-431b-aa6d-cfa09258d2e5"). InnerVolumeSpecName "kube-api-access-n5pjf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:55:58 crc kubenswrapper[4823]: I1216 08:55:58.261257 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b15b075d-ecdf-431b-aa6d-cfa09258d2e5-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "b15b075d-ecdf-431b-aa6d-cfa09258d2e5" (UID: "b15b075d-ecdf-431b-aa6d-cfa09258d2e5"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:55:58 crc kubenswrapper[4823]: I1216 08:55:58.266319 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b15b075d-ecdf-431b-aa6d-cfa09258d2e5-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "b15b075d-ecdf-431b-aa6d-cfa09258d2e5" (UID: "b15b075d-ecdf-431b-aa6d-cfa09258d2e5"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:55:58 crc kubenswrapper[4823]: I1216 08:55:58.366950 4823 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b15b075d-ecdf-431b-aa6d-cfa09258d2e5-dispersionconf\") on node \"crc\" DevicePath \"\"" Dec 16 08:55:58 crc kubenswrapper[4823]: I1216 08:55:58.366985 4823 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b15b075d-ecdf-431b-aa6d-cfa09258d2e5-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 08:55:58 crc kubenswrapper[4823]: I1216 08:55:58.367000 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n5pjf\" (UniqueName: \"kubernetes.io/projected/b15b075d-ecdf-431b-aa6d-cfa09258d2e5-kube-api-access-n5pjf\") on node \"crc\" DevicePath \"\"" Dec 16 08:55:58 crc kubenswrapper[4823]: I1216 08:55:58.367017 4823 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b15b075d-ecdf-431b-aa6d-cfa09258d2e5-swiftconf\") on node \"crc\" DevicePath \"\"" Dec 16 08:55:58 crc kubenswrapper[4823]: I1216 08:55:58.367044 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b15b075d-ecdf-431b-aa6d-cfa09258d2e5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 08:55:58 crc kubenswrapper[4823]: I1216 08:55:58.730629 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7785588999-7gvll"] Dec 16 08:55:58 crc kubenswrapper[4823]: I1216 08:55:58.902186 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-4bvq6" Dec 16 08:55:58 crc kubenswrapper[4823]: I1216 08:55:58.903545 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7785588999-7gvll" event={"ID":"3a9974ca-c280-4c37-9ca8-1f70128d8ea0","Type":"ContainerStarted","Data":"71ac8a177178141e90ccd769af19825d487e308f6aa32117a9c04b942152cbe1"} Dec 16 08:55:58 crc kubenswrapper[4823]: I1216 08:55:58.987826 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-4bvq6"] Dec 16 08:55:58 crc kubenswrapper[4823]: I1216 08:55:58.994918 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-4bvq6"] Dec 16 08:55:59 crc kubenswrapper[4823]: I1216 08:55:59.771279 4823 scope.go:117] "RemoveContainer" containerID="9ce3e6cc66a3ba1f5a9f07614bbf78a449581b45707f8e1e5d9794f67e5c0428" Dec 16 08:55:59 crc kubenswrapper[4823]: E1216 08:55:59.771832 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 08:55:59 crc kubenswrapper[4823]: I1216 08:55:59.786283 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b15b075d-ecdf-431b-aa6d-cfa09258d2e5" path="/var/lib/kubelet/pods/b15b075d-ecdf-431b-aa6d-cfa09258d2e5/volumes" Dec 16 08:55:59 crc kubenswrapper[4823]: I1216 08:55:59.911088 4823 generic.go:334] "Generic (PLEG): container finished" podID="3a9974ca-c280-4c37-9ca8-1f70128d8ea0" containerID="4e32be7c834f94d9e93508e5a9ab9f3e7ab42901c1a546af535206c8d8a41a9f" exitCode=0 Dec 16 08:55:59 crc kubenswrapper[4823]: I1216 08:55:59.911134 4823 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/dnsmasq-dns-7785588999-7gvll" event={"ID":"3a9974ca-c280-4c37-9ca8-1f70128d8ea0","Type":"ContainerDied","Data":"4e32be7c834f94d9e93508e5a9ab9f3e7ab42901c1a546af535206c8d8a41a9f"} Dec 16 08:56:00 crc kubenswrapper[4823]: I1216 08:56:00.238622 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-85f6cd6f48-dkjbs"] Dec 16 08:56:00 crc kubenswrapper[4823]: I1216 08:56:00.243212 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-85f6cd6f48-dkjbs" Dec 16 08:56:00 crc kubenswrapper[4823]: I1216 08:56:00.245894 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Dec 16 08:56:00 crc kubenswrapper[4823]: I1216 08:56:00.246000 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Dec 16 08:56:00 crc kubenswrapper[4823]: I1216 08:56:00.246152 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Dec 16 08:56:00 crc kubenswrapper[4823]: I1216 08:56:00.246267 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-5ph8n" Dec 16 08:56:00 crc kubenswrapper[4823]: I1216 08:56:00.259063 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-85f6cd6f48-dkjbs"] Dec 16 08:56:00 crc kubenswrapper[4823]: I1216 08:56:00.331211 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/093a7b05-310d-4b00-9c92-eee4fd6cacf1-etc-swift\") pod \"swift-proxy-85f6cd6f48-dkjbs\" (UID: \"093a7b05-310d-4b00-9c92-eee4fd6cacf1\") " pod="openstack/swift-proxy-85f6cd6f48-dkjbs" Dec 16 08:56:00 crc kubenswrapper[4823]: I1216 08:56:00.331541 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/093a7b05-310d-4b00-9c92-eee4fd6cacf1-log-httpd\") pod \"swift-proxy-85f6cd6f48-dkjbs\" (UID: \"093a7b05-310d-4b00-9c92-eee4fd6cacf1\") " pod="openstack/swift-proxy-85f6cd6f48-dkjbs" Dec 16 08:56:00 crc kubenswrapper[4823]: I1216 08:56:00.331592 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/093a7b05-310d-4b00-9c92-eee4fd6cacf1-combined-ca-bundle\") pod \"swift-proxy-85f6cd6f48-dkjbs\" (UID: \"093a7b05-310d-4b00-9c92-eee4fd6cacf1\") " pod="openstack/swift-proxy-85f6cd6f48-dkjbs" Dec 16 08:56:00 crc kubenswrapper[4823]: I1216 08:56:00.331679 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/093a7b05-310d-4b00-9c92-eee4fd6cacf1-config-data\") pod \"swift-proxy-85f6cd6f48-dkjbs\" (UID: \"093a7b05-310d-4b00-9c92-eee4fd6cacf1\") " pod="openstack/swift-proxy-85f6cd6f48-dkjbs" Dec 16 08:56:00 crc kubenswrapper[4823]: I1216 08:56:00.331729 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkd97\" (UniqueName: \"kubernetes.io/projected/093a7b05-310d-4b00-9c92-eee4fd6cacf1-kube-api-access-jkd97\") pod \"swift-proxy-85f6cd6f48-dkjbs\" (UID: \"093a7b05-310d-4b00-9c92-eee4fd6cacf1\") " pod="openstack/swift-proxy-85f6cd6f48-dkjbs" Dec 16 08:56:00 crc kubenswrapper[4823]: I1216 08:56:00.331775 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/093a7b05-310d-4b00-9c92-eee4fd6cacf1-run-httpd\") pod \"swift-proxy-85f6cd6f48-dkjbs\" (UID: \"093a7b05-310d-4b00-9c92-eee4fd6cacf1\") " pod="openstack/swift-proxy-85f6cd6f48-dkjbs" Dec 16 08:56:00 crc kubenswrapper[4823]: I1216 08:56:00.432916 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/093a7b05-310d-4b00-9c92-eee4fd6cacf1-run-httpd\") pod \"swift-proxy-85f6cd6f48-dkjbs\" (UID: \"093a7b05-310d-4b00-9c92-eee4fd6cacf1\") " pod="openstack/swift-proxy-85f6cd6f48-dkjbs" Dec 16 08:56:00 crc kubenswrapper[4823]: I1216 08:56:00.433045 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/093a7b05-310d-4b00-9c92-eee4fd6cacf1-etc-swift\") pod \"swift-proxy-85f6cd6f48-dkjbs\" (UID: \"093a7b05-310d-4b00-9c92-eee4fd6cacf1\") " pod="openstack/swift-proxy-85f6cd6f48-dkjbs" Dec 16 08:56:00 crc kubenswrapper[4823]: I1216 08:56:00.433086 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/093a7b05-310d-4b00-9c92-eee4fd6cacf1-log-httpd\") pod \"swift-proxy-85f6cd6f48-dkjbs\" (UID: \"093a7b05-310d-4b00-9c92-eee4fd6cacf1\") " pod="openstack/swift-proxy-85f6cd6f48-dkjbs" Dec 16 08:56:00 crc kubenswrapper[4823]: I1216 08:56:00.433144 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/093a7b05-310d-4b00-9c92-eee4fd6cacf1-combined-ca-bundle\") pod \"swift-proxy-85f6cd6f48-dkjbs\" (UID: \"093a7b05-310d-4b00-9c92-eee4fd6cacf1\") " pod="openstack/swift-proxy-85f6cd6f48-dkjbs" Dec 16 08:56:00 crc kubenswrapper[4823]: I1216 08:56:00.433177 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/093a7b05-310d-4b00-9c92-eee4fd6cacf1-config-data\") pod \"swift-proxy-85f6cd6f48-dkjbs\" (UID: \"093a7b05-310d-4b00-9c92-eee4fd6cacf1\") " pod="openstack/swift-proxy-85f6cd6f48-dkjbs" Dec 16 08:56:00 crc kubenswrapper[4823]: I1216 08:56:00.433202 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkd97\" (UniqueName: 
\"kubernetes.io/projected/093a7b05-310d-4b00-9c92-eee4fd6cacf1-kube-api-access-jkd97\") pod \"swift-proxy-85f6cd6f48-dkjbs\" (UID: \"093a7b05-310d-4b00-9c92-eee4fd6cacf1\") " pod="openstack/swift-proxy-85f6cd6f48-dkjbs" Dec 16 08:56:00 crc kubenswrapper[4823]: I1216 08:56:00.433355 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/093a7b05-310d-4b00-9c92-eee4fd6cacf1-run-httpd\") pod \"swift-proxy-85f6cd6f48-dkjbs\" (UID: \"093a7b05-310d-4b00-9c92-eee4fd6cacf1\") " pod="openstack/swift-proxy-85f6cd6f48-dkjbs" Dec 16 08:56:00 crc kubenswrapper[4823]: I1216 08:56:00.434156 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/093a7b05-310d-4b00-9c92-eee4fd6cacf1-log-httpd\") pod \"swift-proxy-85f6cd6f48-dkjbs\" (UID: \"093a7b05-310d-4b00-9c92-eee4fd6cacf1\") " pod="openstack/swift-proxy-85f6cd6f48-dkjbs" Dec 16 08:56:00 crc kubenswrapper[4823]: I1216 08:56:00.437968 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/093a7b05-310d-4b00-9c92-eee4fd6cacf1-etc-swift\") pod \"swift-proxy-85f6cd6f48-dkjbs\" (UID: \"093a7b05-310d-4b00-9c92-eee4fd6cacf1\") " pod="openstack/swift-proxy-85f6cd6f48-dkjbs" Dec 16 08:56:00 crc kubenswrapper[4823]: I1216 08:56:00.437971 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/093a7b05-310d-4b00-9c92-eee4fd6cacf1-combined-ca-bundle\") pod \"swift-proxy-85f6cd6f48-dkjbs\" (UID: \"093a7b05-310d-4b00-9c92-eee4fd6cacf1\") " pod="openstack/swift-proxy-85f6cd6f48-dkjbs" Dec 16 08:56:00 crc kubenswrapper[4823]: I1216 08:56:00.438272 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/093a7b05-310d-4b00-9c92-eee4fd6cacf1-config-data\") pod \"swift-proxy-85f6cd6f48-dkjbs\" 
(UID: \"093a7b05-310d-4b00-9c92-eee4fd6cacf1\") " pod="openstack/swift-proxy-85f6cd6f48-dkjbs" Dec 16 08:56:00 crc kubenswrapper[4823]: I1216 08:56:00.451563 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkd97\" (UniqueName: \"kubernetes.io/projected/093a7b05-310d-4b00-9c92-eee4fd6cacf1-kube-api-access-jkd97\") pod \"swift-proxy-85f6cd6f48-dkjbs\" (UID: \"093a7b05-310d-4b00-9c92-eee4fd6cacf1\") " pod="openstack/swift-proxy-85f6cd6f48-dkjbs" Dec 16 08:56:00 crc kubenswrapper[4823]: I1216 08:56:00.568951 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-85f6cd6f48-dkjbs" Dec 16 08:56:00 crc kubenswrapper[4823]: I1216 08:56:00.927781 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7785588999-7gvll" event={"ID":"3a9974ca-c280-4c37-9ca8-1f70128d8ea0","Type":"ContainerStarted","Data":"d0f0770707ebd8a48231bd6e6dcfe5ed733a71c7d4e2e684e8feb19c9fa04b76"} Dec 16 08:56:00 crc kubenswrapper[4823]: I1216 08:56:00.927947 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7785588999-7gvll" Dec 16 08:56:00 crc kubenswrapper[4823]: I1216 08:56:00.954851 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7785588999-7gvll" podStartSLOduration=3.954827941 podStartE2EDuration="3.954827941s" podCreationTimestamp="2025-12-16 08:55:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 08:56:00.947352827 +0000 UTC m=+7239.435918950" watchObservedRunningTime="2025-12-16 08:56:00.954827941 +0000 UTC m=+7239.443394054" Dec 16 08:56:01 crc kubenswrapper[4823]: W1216 08:56:01.299881 4823 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod093a7b05_310d_4b00_9c92_eee4fd6cacf1.slice/crio-6a61db8cc7b33ce005a8b8191c337f93c2254dcfba2bdf88a2c6ab305764020f WatchSource:0}: Error finding container 6a61db8cc7b33ce005a8b8191c337f93c2254dcfba2bdf88a2c6ab305764020f: Status 404 returned error can't find the container with id 6a61db8cc7b33ce005a8b8191c337f93c2254dcfba2bdf88a2c6ab305764020f Dec 16 08:56:01 crc kubenswrapper[4823]: I1216 08:56:01.301208 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-85f6cd6f48-dkjbs"] Dec 16 08:56:01 crc kubenswrapper[4823]: I1216 08:56:01.302289 4823 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 16 08:56:01 crc kubenswrapper[4823]: I1216 08:56:01.945221 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-85f6cd6f48-dkjbs" event={"ID":"093a7b05-310d-4b00-9c92-eee4fd6cacf1","Type":"ContainerStarted","Data":"6a61db8cc7b33ce005a8b8191c337f93c2254dcfba2bdf88a2c6ab305764020f"} Dec 16 08:56:02 crc kubenswrapper[4823]: I1216 08:56:02.973159 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-5bbf6c4b7b-7qpq6"] Dec 16 08:56:02 crc kubenswrapper[4823]: I1216 08:56:02.975060 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-5bbf6c4b7b-7qpq6" Dec 16 08:56:02 crc kubenswrapper[4823]: I1216 08:56:02.977932 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Dec 16 08:56:02 crc kubenswrapper[4823]: I1216 08:56:02.978227 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Dec 16 08:56:03 crc kubenswrapper[4823]: I1216 08:56:03.006374 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-5bbf6c4b7b-7qpq6"] Dec 16 08:56:03 crc kubenswrapper[4823]: I1216 08:56:03.083477 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2d676c2b-8cf1-4933-8f2b-641733d096fc-etc-swift\") pod \"swift-proxy-5bbf6c4b7b-7qpq6\" (UID: \"2d676c2b-8cf1-4933-8f2b-641733d096fc\") " pod="openstack/swift-proxy-5bbf6c4b7b-7qpq6" Dec 16 08:56:03 crc kubenswrapper[4823]: I1216 08:56:03.083633 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d676c2b-8cf1-4933-8f2b-641733d096fc-internal-tls-certs\") pod \"swift-proxy-5bbf6c4b7b-7qpq6\" (UID: \"2d676c2b-8cf1-4933-8f2b-641733d096fc\") " pod="openstack/swift-proxy-5bbf6c4b7b-7qpq6" Dec 16 08:56:03 crc kubenswrapper[4823]: I1216 08:56:03.083659 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mqn5\" (UniqueName: \"kubernetes.io/projected/2d676c2b-8cf1-4933-8f2b-641733d096fc-kube-api-access-8mqn5\") pod \"swift-proxy-5bbf6c4b7b-7qpq6\" (UID: \"2d676c2b-8cf1-4933-8f2b-641733d096fc\") " pod="openstack/swift-proxy-5bbf6c4b7b-7qpq6" Dec 16 08:56:03 crc kubenswrapper[4823]: I1216 08:56:03.083676 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/2d676c2b-8cf1-4933-8f2b-641733d096fc-public-tls-certs\") pod \"swift-proxy-5bbf6c4b7b-7qpq6\" (UID: \"2d676c2b-8cf1-4933-8f2b-641733d096fc\") " pod="openstack/swift-proxy-5bbf6c4b7b-7qpq6" Dec 16 08:56:03 crc kubenswrapper[4823]: I1216 08:56:03.083708 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d676c2b-8cf1-4933-8f2b-641733d096fc-config-data\") pod \"swift-proxy-5bbf6c4b7b-7qpq6\" (UID: \"2d676c2b-8cf1-4933-8f2b-641733d096fc\") " pod="openstack/swift-proxy-5bbf6c4b7b-7qpq6" Dec 16 08:56:03 crc kubenswrapper[4823]: I1216 08:56:03.083888 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2d676c2b-8cf1-4933-8f2b-641733d096fc-run-httpd\") pod \"swift-proxy-5bbf6c4b7b-7qpq6\" (UID: \"2d676c2b-8cf1-4933-8f2b-641733d096fc\") " pod="openstack/swift-proxy-5bbf6c4b7b-7qpq6" Dec 16 08:56:03 crc kubenswrapper[4823]: I1216 08:56:03.083989 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d676c2b-8cf1-4933-8f2b-641733d096fc-combined-ca-bundle\") pod \"swift-proxy-5bbf6c4b7b-7qpq6\" (UID: \"2d676c2b-8cf1-4933-8f2b-641733d096fc\") " pod="openstack/swift-proxy-5bbf6c4b7b-7qpq6" Dec 16 08:56:03 crc kubenswrapper[4823]: I1216 08:56:03.084069 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2d676c2b-8cf1-4933-8f2b-641733d096fc-log-httpd\") pod \"swift-proxy-5bbf6c4b7b-7qpq6\" (UID: \"2d676c2b-8cf1-4933-8f2b-641733d096fc\") " pod="openstack/swift-proxy-5bbf6c4b7b-7qpq6" Dec 16 08:56:03 crc kubenswrapper[4823]: I1216 08:56:03.185900 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mqn5\" 
(UniqueName: \"kubernetes.io/projected/2d676c2b-8cf1-4933-8f2b-641733d096fc-kube-api-access-8mqn5\") pod \"swift-proxy-5bbf6c4b7b-7qpq6\" (UID: \"2d676c2b-8cf1-4933-8f2b-641733d096fc\") " pod="openstack/swift-proxy-5bbf6c4b7b-7qpq6" Dec 16 08:56:03 crc kubenswrapper[4823]: I1216 08:56:03.186268 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d676c2b-8cf1-4933-8f2b-641733d096fc-public-tls-certs\") pod \"swift-proxy-5bbf6c4b7b-7qpq6\" (UID: \"2d676c2b-8cf1-4933-8f2b-641733d096fc\") " pod="openstack/swift-proxy-5bbf6c4b7b-7qpq6" Dec 16 08:56:03 crc kubenswrapper[4823]: I1216 08:56:03.186341 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d676c2b-8cf1-4933-8f2b-641733d096fc-config-data\") pod \"swift-proxy-5bbf6c4b7b-7qpq6\" (UID: \"2d676c2b-8cf1-4933-8f2b-641733d096fc\") " pod="openstack/swift-proxy-5bbf6c4b7b-7qpq6" Dec 16 08:56:03 crc kubenswrapper[4823]: I1216 08:56:03.186395 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2d676c2b-8cf1-4933-8f2b-641733d096fc-run-httpd\") pod \"swift-proxy-5bbf6c4b7b-7qpq6\" (UID: \"2d676c2b-8cf1-4933-8f2b-641733d096fc\") " pod="openstack/swift-proxy-5bbf6c4b7b-7qpq6" Dec 16 08:56:03 crc kubenswrapper[4823]: I1216 08:56:03.186429 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d676c2b-8cf1-4933-8f2b-641733d096fc-combined-ca-bundle\") pod \"swift-proxy-5bbf6c4b7b-7qpq6\" (UID: \"2d676c2b-8cf1-4933-8f2b-641733d096fc\") " pod="openstack/swift-proxy-5bbf6c4b7b-7qpq6" Dec 16 08:56:03 crc kubenswrapper[4823]: I1216 08:56:03.186478 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/2d676c2b-8cf1-4933-8f2b-641733d096fc-log-httpd\") pod \"swift-proxy-5bbf6c4b7b-7qpq6\" (UID: \"2d676c2b-8cf1-4933-8f2b-641733d096fc\") " pod="openstack/swift-proxy-5bbf6c4b7b-7qpq6" Dec 16 08:56:03 crc kubenswrapper[4823]: I1216 08:56:03.186582 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2d676c2b-8cf1-4933-8f2b-641733d096fc-etc-swift\") pod \"swift-proxy-5bbf6c4b7b-7qpq6\" (UID: \"2d676c2b-8cf1-4933-8f2b-641733d096fc\") " pod="openstack/swift-proxy-5bbf6c4b7b-7qpq6" Dec 16 08:56:03 crc kubenswrapper[4823]: I1216 08:56:03.186692 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d676c2b-8cf1-4933-8f2b-641733d096fc-internal-tls-certs\") pod \"swift-proxy-5bbf6c4b7b-7qpq6\" (UID: \"2d676c2b-8cf1-4933-8f2b-641733d096fc\") " pod="openstack/swift-proxy-5bbf6c4b7b-7qpq6" Dec 16 08:56:03 crc kubenswrapper[4823]: I1216 08:56:03.186971 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2d676c2b-8cf1-4933-8f2b-641733d096fc-run-httpd\") pod \"swift-proxy-5bbf6c4b7b-7qpq6\" (UID: \"2d676c2b-8cf1-4933-8f2b-641733d096fc\") " pod="openstack/swift-proxy-5bbf6c4b7b-7qpq6" Dec 16 08:56:03 crc kubenswrapper[4823]: I1216 08:56:03.187340 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2d676c2b-8cf1-4933-8f2b-641733d096fc-log-httpd\") pod \"swift-proxy-5bbf6c4b7b-7qpq6\" (UID: \"2d676c2b-8cf1-4933-8f2b-641733d096fc\") " pod="openstack/swift-proxy-5bbf6c4b7b-7qpq6" Dec 16 08:56:03 crc kubenswrapper[4823]: I1216 08:56:03.192763 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d676c2b-8cf1-4933-8f2b-641733d096fc-internal-tls-certs\") pod 
\"swift-proxy-5bbf6c4b7b-7qpq6\" (UID: \"2d676c2b-8cf1-4933-8f2b-641733d096fc\") " pod="openstack/swift-proxy-5bbf6c4b7b-7qpq6" Dec 16 08:56:03 crc kubenswrapper[4823]: I1216 08:56:03.192858 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d676c2b-8cf1-4933-8f2b-641733d096fc-config-data\") pod \"swift-proxy-5bbf6c4b7b-7qpq6\" (UID: \"2d676c2b-8cf1-4933-8f2b-641733d096fc\") " pod="openstack/swift-proxy-5bbf6c4b7b-7qpq6" Dec 16 08:56:03 crc kubenswrapper[4823]: I1216 08:56:03.192856 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2d676c2b-8cf1-4933-8f2b-641733d096fc-etc-swift\") pod \"swift-proxy-5bbf6c4b7b-7qpq6\" (UID: \"2d676c2b-8cf1-4933-8f2b-641733d096fc\") " pod="openstack/swift-proxy-5bbf6c4b7b-7qpq6" Dec 16 08:56:03 crc kubenswrapper[4823]: I1216 08:56:03.202936 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d676c2b-8cf1-4933-8f2b-641733d096fc-combined-ca-bundle\") pod \"swift-proxy-5bbf6c4b7b-7qpq6\" (UID: \"2d676c2b-8cf1-4933-8f2b-641733d096fc\") " pod="openstack/swift-proxy-5bbf6c4b7b-7qpq6" Dec 16 08:56:03 crc kubenswrapper[4823]: I1216 08:56:03.203579 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mqn5\" (UniqueName: \"kubernetes.io/projected/2d676c2b-8cf1-4933-8f2b-641733d096fc-kube-api-access-8mqn5\") pod \"swift-proxy-5bbf6c4b7b-7qpq6\" (UID: \"2d676c2b-8cf1-4933-8f2b-641733d096fc\") " pod="openstack/swift-proxy-5bbf6c4b7b-7qpq6" Dec 16 08:56:03 crc kubenswrapper[4823]: I1216 08:56:03.203828 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d676c2b-8cf1-4933-8f2b-641733d096fc-public-tls-certs\") pod \"swift-proxy-5bbf6c4b7b-7qpq6\" (UID: \"2d676c2b-8cf1-4933-8f2b-641733d096fc\") " 
pod="openstack/swift-proxy-5bbf6c4b7b-7qpq6" Dec 16 08:56:03 crc kubenswrapper[4823]: I1216 08:56:03.306748 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-5bbf6c4b7b-7qpq6" Dec 16 08:56:05 crc kubenswrapper[4823]: I1216 08:56:05.378963 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-5bbf6c4b7b-7qpq6"] Dec 16 08:56:05 crc kubenswrapper[4823]: W1216 08:56:05.383495 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2d676c2b_8cf1_4933_8f2b_641733d096fc.slice/crio-29dd8432f55325fd637e50b97c0469c645dd8b2b0ca6199304a41985de194782 WatchSource:0}: Error finding container 29dd8432f55325fd637e50b97c0469c645dd8b2b0ca6199304a41985de194782: Status 404 returned error can't find the container with id 29dd8432f55325fd637e50b97c0469c645dd8b2b0ca6199304a41985de194782 Dec 16 08:56:06 crc kubenswrapper[4823]: I1216 08:56:06.001159 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-85f6cd6f48-dkjbs" event={"ID":"093a7b05-310d-4b00-9c92-eee4fd6cacf1","Type":"ContainerStarted","Data":"23873dcfdd6b3fa1220ed8fbf104be3c701c6ee001182c857a516811f76ef156"} Dec 16 08:56:06 crc kubenswrapper[4823]: I1216 08:56:06.001851 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-85f6cd6f48-dkjbs" Dec 16 08:56:06 crc kubenswrapper[4823]: I1216 08:56:06.001870 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-85f6cd6f48-dkjbs" event={"ID":"093a7b05-310d-4b00-9c92-eee4fd6cacf1","Type":"ContainerStarted","Data":"722089b91b5cd3125a0e717032170c55163a011a7e1d78f1f4be2cc5e8caa8e0"} Dec 16 08:56:06 crc kubenswrapper[4823]: I1216 08:56:06.001886 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-85f6cd6f48-dkjbs" Dec 16 08:56:06 crc kubenswrapper[4823]: I1216 08:56:06.004882 4823 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5bbf6c4b7b-7qpq6" event={"ID":"2d676c2b-8cf1-4933-8f2b-641733d096fc","Type":"ContainerStarted","Data":"57d0046d03d662cbe48ec832a46a5f0d40cf1316633fc87176e963b49f32c392"} Dec 16 08:56:06 crc kubenswrapper[4823]: I1216 08:56:06.004929 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5bbf6c4b7b-7qpq6" event={"ID":"2d676c2b-8cf1-4933-8f2b-641733d096fc","Type":"ContainerStarted","Data":"d1a4fe707f62e7d60a2461aca9f820aa5ffe503be99824ae07a55f13abe97b6a"} Dec 16 08:56:06 crc kubenswrapper[4823]: I1216 08:56:06.004948 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5bbf6c4b7b-7qpq6" event={"ID":"2d676c2b-8cf1-4933-8f2b-641733d096fc","Type":"ContainerStarted","Data":"29dd8432f55325fd637e50b97c0469c645dd8b2b0ca6199304a41985de194782"} Dec 16 08:56:06 crc kubenswrapper[4823]: I1216 08:56:06.005068 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-5bbf6c4b7b-7qpq6" Dec 16 08:56:06 crc kubenswrapper[4823]: I1216 08:56:06.051943 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-5bbf6c4b7b-7qpq6" podStartSLOduration=4.051915075 podStartE2EDuration="4.051915075s" podCreationTimestamp="2025-12-16 08:56:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 08:56:06.046700132 +0000 UTC m=+7244.535266275" watchObservedRunningTime="2025-12-16 08:56:06.051915075 +0000 UTC m=+7244.540481208" Dec 16 08:56:06 crc kubenswrapper[4823]: I1216 08:56:06.054137 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-85f6cd6f48-dkjbs" podStartSLOduration=2.328157352 podStartE2EDuration="6.054116153s" podCreationTimestamp="2025-12-16 08:56:00 +0000 UTC" firstStartedPulling="2025-12-16 08:56:01.302054341 +0000 UTC 
m=+7239.790620464" lastFinishedPulling="2025-12-16 08:56:05.028013122 +0000 UTC m=+7243.516579265" observedRunningTime="2025-12-16 08:56:06.025827518 +0000 UTC m=+7244.514393641" watchObservedRunningTime="2025-12-16 08:56:06.054116153 +0000 UTC m=+7244.542682286" Dec 16 08:56:07 crc kubenswrapper[4823]: I1216 08:56:07.011759 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-5bbf6c4b7b-7qpq6" Dec 16 08:56:08 crc kubenswrapper[4823]: I1216 08:56:08.192846 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7785588999-7gvll" Dec 16 08:56:08 crc kubenswrapper[4823]: I1216 08:56:08.264654 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-649fcfbf79-f4wz7"] Dec 16 08:56:08 crc kubenswrapper[4823]: I1216 08:56:08.265262 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-649fcfbf79-f4wz7" podUID="9368a023-0217-4537-9520-90b6b092997c" containerName="dnsmasq-dns" containerID="cri-o://bac836dce097692d08e3d6ab425fd0457c24b4a508f21c3d86eee76f4eda79c9" gracePeriod=10 Dec 16 08:56:08 crc kubenswrapper[4823]: I1216 08:56:08.773818 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-649fcfbf79-f4wz7" Dec 16 08:56:08 crc kubenswrapper[4823]: I1216 08:56:08.821845 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9368a023-0217-4537-9520-90b6b092997c-ovsdbserver-sb\") pod \"9368a023-0217-4537-9520-90b6b092997c\" (UID: \"9368a023-0217-4537-9520-90b6b092997c\") " Dec 16 08:56:08 crc kubenswrapper[4823]: I1216 08:56:08.821902 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dmc46\" (UniqueName: \"kubernetes.io/projected/9368a023-0217-4537-9520-90b6b092997c-kube-api-access-dmc46\") pod \"9368a023-0217-4537-9520-90b6b092997c\" (UID: \"9368a023-0217-4537-9520-90b6b092997c\") " Dec 16 08:56:08 crc kubenswrapper[4823]: I1216 08:56:08.822003 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9368a023-0217-4537-9520-90b6b092997c-config\") pod \"9368a023-0217-4537-9520-90b6b092997c\" (UID: \"9368a023-0217-4537-9520-90b6b092997c\") " Dec 16 08:56:08 crc kubenswrapper[4823]: I1216 08:56:08.822254 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9368a023-0217-4537-9520-90b6b092997c-ovsdbserver-nb\") pod \"9368a023-0217-4537-9520-90b6b092997c\" (UID: \"9368a023-0217-4537-9520-90b6b092997c\") " Dec 16 08:56:08 crc kubenswrapper[4823]: I1216 08:56:08.822505 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9368a023-0217-4537-9520-90b6b092997c-dns-svc\") pod \"9368a023-0217-4537-9520-90b6b092997c\" (UID: \"9368a023-0217-4537-9520-90b6b092997c\") " Dec 16 08:56:08 crc kubenswrapper[4823]: I1216 08:56:08.831911 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/9368a023-0217-4537-9520-90b6b092997c-kube-api-access-dmc46" (OuterVolumeSpecName: "kube-api-access-dmc46") pod "9368a023-0217-4537-9520-90b6b092997c" (UID: "9368a023-0217-4537-9520-90b6b092997c"). InnerVolumeSpecName "kube-api-access-dmc46". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:56:08 crc kubenswrapper[4823]: I1216 08:56:08.890932 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9368a023-0217-4537-9520-90b6b092997c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9368a023-0217-4537-9520-90b6b092997c" (UID: "9368a023-0217-4537-9520-90b6b092997c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:56:08 crc kubenswrapper[4823]: I1216 08:56:08.909544 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9368a023-0217-4537-9520-90b6b092997c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9368a023-0217-4537-9520-90b6b092997c" (UID: "9368a023-0217-4537-9520-90b6b092997c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:56:08 crc kubenswrapper[4823]: I1216 08:56:08.909628 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9368a023-0217-4537-9520-90b6b092997c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9368a023-0217-4537-9520-90b6b092997c" (UID: "9368a023-0217-4537-9520-90b6b092997c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:56:08 crc kubenswrapper[4823]: I1216 08:56:08.924202 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9368a023-0217-4537-9520-90b6b092997c-config" (OuterVolumeSpecName: "config") pod "9368a023-0217-4537-9520-90b6b092997c" (UID: "9368a023-0217-4537-9520-90b6b092997c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:56:08 crc kubenswrapper[4823]: I1216 08:56:08.924738 4823 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9368a023-0217-4537-9520-90b6b092997c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 16 08:56:08 crc kubenswrapper[4823]: I1216 08:56:08.924769 4823 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9368a023-0217-4537-9520-90b6b092997c-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 16 08:56:08 crc kubenswrapper[4823]: I1216 08:56:08.924778 4823 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9368a023-0217-4537-9520-90b6b092997c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 16 08:56:08 crc kubenswrapper[4823]: I1216 08:56:08.924789 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dmc46\" (UniqueName: \"kubernetes.io/projected/9368a023-0217-4537-9520-90b6b092997c-kube-api-access-dmc46\") on node \"crc\" DevicePath \"\"" Dec 16 08:56:08 crc kubenswrapper[4823]: I1216 08:56:08.924800 4823 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9368a023-0217-4537-9520-90b6b092997c-config\") on node \"crc\" DevicePath \"\"" Dec 16 08:56:09 crc kubenswrapper[4823]: I1216 08:56:09.028839 4823 generic.go:334] "Generic (PLEG): container finished" podID="9368a023-0217-4537-9520-90b6b092997c" containerID="bac836dce097692d08e3d6ab425fd0457c24b4a508f21c3d86eee76f4eda79c9" exitCode=0 Dec 16 08:56:09 crc kubenswrapper[4823]: I1216 08:56:09.028885 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-649fcfbf79-f4wz7" event={"ID":"9368a023-0217-4537-9520-90b6b092997c","Type":"ContainerDied","Data":"bac836dce097692d08e3d6ab425fd0457c24b4a508f21c3d86eee76f4eda79c9"} Dec 16 08:56:09 crc kubenswrapper[4823]: I1216 
08:56:09.028912 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-649fcfbf79-f4wz7" event={"ID":"9368a023-0217-4537-9520-90b6b092997c","Type":"ContainerDied","Data":"a37500baf946182e6c82d510dd86dcb37bd694f213f06dad9d1d6d370eb1695f"} Dec 16 08:56:09 crc kubenswrapper[4823]: I1216 08:56:09.028922 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-649fcfbf79-f4wz7" Dec 16 08:56:09 crc kubenswrapper[4823]: I1216 08:56:09.028928 4823 scope.go:117] "RemoveContainer" containerID="bac836dce097692d08e3d6ab425fd0457c24b4a508f21c3d86eee76f4eda79c9" Dec 16 08:56:09 crc kubenswrapper[4823]: I1216 08:56:09.049783 4823 scope.go:117] "RemoveContainer" containerID="1065e9229719b5b2371884dc0f665b54be0f3d80b2a25ff8b8c70fbd0f8de605" Dec 16 08:56:09 crc kubenswrapper[4823]: I1216 08:56:09.058940 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-649fcfbf79-f4wz7"] Dec 16 08:56:09 crc kubenswrapper[4823]: I1216 08:56:09.065172 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-649fcfbf79-f4wz7"] Dec 16 08:56:09 crc kubenswrapper[4823]: I1216 08:56:09.082215 4823 scope.go:117] "RemoveContainer" containerID="bac836dce097692d08e3d6ab425fd0457c24b4a508f21c3d86eee76f4eda79c9" Dec 16 08:56:09 crc kubenswrapper[4823]: E1216 08:56:09.082657 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bac836dce097692d08e3d6ab425fd0457c24b4a508f21c3d86eee76f4eda79c9\": container with ID starting with bac836dce097692d08e3d6ab425fd0457c24b4a508f21c3d86eee76f4eda79c9 not found: ID does not exist" containerID="bac836dce097692d08e3d6ab425fd0457c24b4a508f21c3d86eee76f4eda79c9" Dec 16 08:56:09 crc kubenswrapper[4823]: I1216 08:56:09.082694 4823 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"bac836dce097692d08e3d6ab425fd0457c24b4a508f21c3d86eee76f4eda79c9"} err="failed to get container status \"bac836dce097692d08e3d6ab425fd0457c24b4a508f21c3d86eee76f4eda79c9\": rpc error: code = NotFound desc = could not find container \"bac836dce097692d08e3d6ab425fd0457c24b4a508f21c3d86eee76f4eda79c9\": container with ID starting with bac836dce097692d08e3d6ab425fd0457c24b4a508f21c3d86eee76f4eda79c9 not found: ID does not exist" Dec 16 08:56:09 crc kubenswrapper[4823]: I1216 08:56:09.082723 4823 scope.go:117] "RemoveContainer" containerID="1065e9229719b5b2371884dc0f665b54be0f3d80b2a25ff8b8c70fbd0f8de605" Dec 16 08:56:09 crc kubenswrapper[4823]: E1216 08:56:09.083248 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1065e9229719b5b2371884dc0f665b54be0f3d80b2a25ff8b8c70fbd0f8de605\": container with ID starting with 1065e9229719b5b2371884dc0f665b54be0f3d80b2a25ff8b8c70fbd0f8de605 not found: ID does not exist" containerID="1065e9229719b5b2371884dc0f665b54be0f3d80b2a25ff8b8c70fbd0f8de605" Dec 16 08:56:09 crc kubenswrapper[4823]: I1216 08:56:09.083277 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1065e9229719b5b2371884dc0f665b54be0f3d80b2a25ff8b8c70fbd0f8de605"} err="failed to get container status \"1065e9229719b5b2371884dc0f665b54be0f3d80b2a25ff8b8c70fbd0f8de605\": rpc error: code = NotFound desc = could not find container \"1065e9229719b5b2371884dc0f665b54be0f3d80b2a25ff8b8c70fbd0f8de605\": container with ID starting with 1065e9229719b5b2371884dc0f665b54be0f3d80b2a25ff8b8c70fbd0f8de605 not found: ID does not exist" Dec 16 08:56:09 crc kubenswrapper[4823]: I1216 08:56:09.784768 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9368a023-0217-4537-9520-90b6b092997c" path="/var/lib/kubelet/pods/9368a023-0217-4537-9520-90b6b092997c/volumes" Dec 16 08:56:10 crc kubenswrapper[4823]: I1216 
08:56:10.579303 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-85f6cd6f48-dkjbs" Dec 16 08:56:10 crc kubenswrapper[4823]: I1216 08:56:10.580816 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-85f6cd6f48-dkjbs" Dec 16 08:56:13 crc kubenswrapper[4823]: I1216 08:56:13.315141 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-5bbf6c4b7b-7qpq6" Dec 16 08:56:13 crc kubenswrapper[4823]: I1216 08:56:13.320287 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-5bbf6c4b7b-7qpq6" Dec 16 08:56:13 crc kubenswrapper[4823]: I1216 08:56:13.408567 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-85f6cd6f48-dkjbs"] Dec 16 08:56:13 crc kubenswrapper[4823]: I1216 08:56:13.408956 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-85f6cd6f48-dkjbs" podUID="093a7b05-310d-4b00-9c92-eee4fd6cacf1" containerName="proxy-httpd" containerID="cri-o://722089b91b5cd3125a0e717032170c55163a011a7e1d78f1f4be2cc5e8caa8e0" gracePeriod=30 Dec 16 08:56:13 crc kubenswrapper[4823]: I1216 08:56:13.409072 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-85f6cd6f48-dkjbs" podUID="093a7b05-310d-4b00-9c92-eee4fd6cacf1" containerName="proxy-server" containerID="cri-o://23873dcfdd6b3fa1220ed8fbf104be3c701c6ee001182c857a516811f76ef156" gracePeriod=30 Dec 16 08:56:13 crc kubenswrapper[4823]: I1216 08:56:13.771957 4823 scope.go:117] "RemoveContainer" containerID="9ce3e6cc66a3ba1f5a9f07614bbf78a449581b45707f8e1e5d9794f67e5c0428" Dec 16 08:56:13 crc kubenswrapper[4823]: E1216 08:56:13.772467 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 08:56:14 crc kubenswrapper[4823]: I1216 08:56:14.078753 4823 generic.go:334] "Generic (PLEG): container finished" podID="093a7b05-310d-4b00-9c92-eee4fd6cacf1" containerID="23873dcfdd6b3fa1220ed8fbf104be3c701c6ee001182c857a516811f76ef156" exitCode=0 Dec 16 08:56:14 crc kubenswrapper[4823]: I1216 08:56:14.078790 4823 generic.go:334] "Generic (PLEG): container finished" podID="093a7b05-310d-4b00-9c92-eee4fd6cacf1" containerID="722089b91b5cd3125a0e717032170c55163a011a7e1d78f1f4be2cc5e8caa8e0" exitCode=0 Dec 16 08:56:14 crc kubenswrapper[4823]: I1216 08:56:14.078895 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-85f6cd6f48-dkjbs" event={"ID":"093a7b05-310d-4b00-9c92-eee4fd6cacf1","Type":"ContainerDied","Data":"23873dcfdd6b3fa1220ed8fbf104be3c701c6ee001182c857a516811f76ef156"} Dec 16 08:56:14 crc kubenswrapper[4823]: I1216 08:56:14.078978 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-85f6cd6f48-dkjbs" event={"ID":"093a7b05-310d-4b00-9c92-eee4fd6cacf1","Type":"ContainerDied","Data":"722089b91b5cd3125a0e717032170c55163a011a7e1d78f1f4be2cc5e8caa8e0"} Dec 16 08:56:14 crc kubenswrapper[4823]: I1216 08:56:14.672189 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-85f6cd6f48-dkjbs" Dec 16 08:56:14 crc kubenswrapper[4823]: I1216 08:56:14.774468 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/093a7b05-310d-4b00-9c92-eee4fd6cacf1-config-data\") pod \"093a7b05-310d-4b00-9c92-eee4fd6cacf1\" (UID: \"093a7b05-310d-4b00-9c92-eee4fd6cacf1\") " Dec 16 08:56:14 crc kubenswrapper[4823]: I1216 08:56:14.774550 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/093a7b05-310d-4b00-9c92-eee4fd6cacf1-run-httpd\") pod \"093a7b05-310d-4b00-9c92-eee4fd6cacf1\" (UID: \"093a7b05-310d-4b00-9c92-eee4fd6cacf1\") " Dec 16 08:56:14 crc kubenswrapper[4823]: I1216 08:56:14.774599 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/093a7b05-310d-4b00-9c92-eee4fd6cacf1-combined-ca-bundle\") pod \"093a7b05-310d-4b00-9c92-eee4fd6cacf1\" (UID: \"093a7b05-310d-4b00-9c92-eee4fd6cacf1\") " Dec 16 08:56:14 crc kubenswrapper[4823]: I1216 08:56:14.774736 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkd97\" (UniqueName: \"kubernetes.io/projected/093a7b05-310d-4b00-9c92-eee4fd6cacf1-kube-api-access-jkd97\") pod \"093a7b05-310d-4b00-9c92-eee4fd6cacf1\" (UID: \"093a7b05-310d-4b00-9c92-eee4fd6cacf1\") " Dec 16 08:56:14 crc kubenswrapper[4823]: I1216 08:56:14.774804 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/093a7b05-310d-4b00-9c92-eee4fd6cacf1-etc-swift\") pod \"093a7b05-310d-4b00-9c92-eee4fd6cacf1\" (UID: \"093a7b05-310d-4b00-9c92-eee4fd6cacf1\") " Dec 16 08:56:14 crc kubenswrapper[4823]: I1216 08:56:14.774886 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/093a7b05-310d-4b00-9c92-eee4fd6cacf1-log-httpd\") pod \"093a7b05-310d-4b00-9c92-eee4fd6cacf1\" (UID: \"093a7b05-310d-4b00-9c92-eee4fd6cacf1\") " Dec 16 08:56:14 crc kubenswrapper[4823]: I1216 08:56:14.775008 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/093a7b05-310d-4b00-9c92-eee4fd6cacf1-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "093a7b05-310d-4b00-9c92-eee4fd6cacf1" (UID: "093a7b05-310d-4b00-9c92-eee4fd6cacf1"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 08:56:14 crc kubenswrapper[4823]: I1216 08:56:14.775244 4823 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/093a7b05-310d-4b00-9c92-eee4fd6cacf1-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 16 08:56:14 crc kubenswrapper[4823]: I1216 08:56:14.775605 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/093a7b05-310d-4b00-9c92-eee4fd6cacf1-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "093a7b05-310d-4b00-9c92-eee4fd6cacf1" (UID: "093a7b05-310d-4b00-9c92-eee4fd6cacf1"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 08:56:14 crc kubenswrapper[4823]: I1216 08:56:14.781203 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/093a7b05-310d-4b00-9c92-eee4fd6cacf1-kube-api-access-jkd97" (OuterVolumeSpecName: "kube-api-access-jkd97") pod "093a7b05-310d-4b00-9c92-eee4fd6cacf1" (UID: "093a7b05-310d-4b00-9c92-eee4fd6cacf1"). InnerVolumeSpecName "kube-api-access-jkd97". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:56:14 crc kubenswrapper[4823]: I1216 08:56:14.783187 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/093a7b05-310d-4b00-9c92-eee4fd6cacf1-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "093a7b05-310d-4b00-9c92-eee4fd6cacf1" (UID: "093a7b05-310d-4b00-9c92-eee4fd6cacf1"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:56:14 crc kubenswrapper[4823]: I1216 08:56:14.817705 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/093a7b05-310d-4b00-9c92-eee4fd6cacf1-config-data" (OuterVolumeSpecName: "config-data") pod "093a7b05-310d-4b00-9c92-eee4fd6cacf1" (UID: "093a7b05-310d-4b00-9c92-eee4fd6cacf1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:56:14 crc kubenswrapper[4823]: I1216 08:56:14.825849 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/093a7b05-310d-4b00-9c92-eee4fd6cacf1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "093a7b05-310d-4b00-9c92-eee4fd6cacf1" (UID: "093a7b05-310d-4b00-9c92-eee4fd6cacf1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:56:14 crc kubenswrapper[4823]: I1216 08:56:14.876714 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/093a7b05-310d-4b00-9c92-eee4fd6cacf1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 08:56:14 crc kubenswrapper[4823]: I1216 08:56:14.876759 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkd97\" (UniqueName: \"kubernetes.io/projected/093a7b05-310d-4b00-9c92-eee4fd6cacf1-kube-api-access-jkd97\") on node \"crc\" DevicePath \"\"" Dec 16 08:56:14 crc kubenswrapper[4823]: I1216 08:56:14.876774 4823 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/093a7b05-310d-4b00-9c92-eee4fd6cacf1-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 16 08:56:14 crc kubenswrapper[4823]: I1216 08:56:14.876784 4823 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/093a7b05-310d-4b00-9c92-eee4fd6cacf1-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 16 08:56:14 crc kubenswrapper[4823]: I1216 08:56:14.876795 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/093a7b05-310d-4b00-9c92-eee4fd6cacf1-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 08:56:15 crc kubenswrapper[4823]: I1216 08:56:15.087357 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-85f6cd6f48-dkjbs" event={"ID":"093a7b05-310d-4b00-9c92-eee4fd6cacf1","Type":"ContainerDied","Data":"6a61db8cc7b33ce005a8b8191c337f93c2254dcfba2bdf88a2c6ab305764020f"} Dec 16 08:56:15 crc kubenswrapper[4823]: I1216 08:56:15.087407 4823 scope.go:117] "RemoveContainer" containerID="23873dcfdd6b3fa1220ed8fbf104be3c701c6ee001182c857a516811f76ef156" Dec 16 08:56:15 crc kubenswrapper[4823]: I1216 08:56:15.087513 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-85f6cd6f48-dkjbs" Dec 16 08:56:15 crc kubenswrapper[4823]: I1216 08:56:15.125241 4823 scope.go:117] "RemoveContainer" containerID="722089b91b5cd3125a0e717032170c55163a011a7e1d78f1f4be2cc5e8caa8e0" Dec 16 08:56:15 crc kubenswrapper[4823]: I1216 08:56:15.134053 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-85f6cd6f48-dkjbs"] Dec 16 08:56:15 crc kubenswrapper[4823]: I1216 08:56:15.143516 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-proxy-85f6cd6f48-dkjbs"] Dec 16 08:56:15 crc kubenswrapper[4823]: I1216 08:56:15.780423 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="093a7b05-310d-4b00-9c92-eee4fd6cacf1" path="/var/lib/kubelet/pods/093a7b05-310d-4b00-9c92-eee4fd6cacf1/volumes" Dec 16 08:56:19 crc kubenswrapper[4823]: I1216 08:56:19.756732 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-82cl4"] Dec 16 08:56:19 crc kubenswrapper[4823]: E1216 08:56:19.757499 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="093a7b05-310d-4b00-9c92-eee4fd6cacf1" containerName="proxy-httpd" Dec 16 08:56:19 crc kubenswrapper[4823]: I1216 08:56:19.757517 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="093a7b05-310d-4b00-9c92-eee4fd6cacf1" containerName="proxy-httpd" Dec 16 08:56:19 crc kubenswrapper[4823]: E1216 08:56:19.757537 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="093a7b05-310d-4b00-9c92-eee4fd6cacf1" containerName="proxy-server" Dec 16 08:56:19 crc kubenswrapper[4823]: I1216 08:56:19.757545 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="093a7b05-310d-4b00-9c92-eee4fd6cacf1" containerName="proxy-server" Dec 16 08:56:19 crc kubenswrapper[4823]: E1216 08:56:19.757554 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9368a023-0217-4537-9520-90b6b092997c" containerName="init" Dec 16 08:56:19 crc kubenswrapper[4823]: 
I1216 08:56:19.757562 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="9368a023-0217-4537-9520-90b6b092997c" containerName="init" Dec 16 08:56:19 crc kubenswrapper[4823]: E1216 08:56:19.757577 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9368a023-0217-4537-9520-90b6b092997c" containerName="dnsmasq-dns" Dec 16 08:56:19 crc kubenswrapper[4823]: I1216 08:56:19.757583 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="9368a023-0217-4537-9520-90b6b092997c" containerName="dnsmasq-dns" Dec 16 08:56:19 crc kubenswrapper[4823]: I1216 08:56:19.757797 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="9368a023-0217-4537-9520-90b6b092997c" containerName="dnsmasq-dns" Dec 16 08:56:19 crc kubenswrapper[4823]: I1216 08:56:19.757814 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="093a7b05-310d-4b00-9c92-eee4fd6cacf1" containerName="proxy-httpd" Dec 16 08:56:19 crc kubenswrapper[4823]: I1216 08:56:19.757839 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="093a7b05-310d-4b00-9c92-eee4fd6cacf1" containerName="proxy-server" Dec 16 08:56:19 crc kubenswrapper[4823]: I1216 08:56:19.758559 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-82cl4" Dec 16 08:56:19 crc kubenswrapper[4823]: I1216 08:56:19.763926 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-82cl4"] Dec 16 08:56:19 crc kubenswrapper[4823]: I1216 08:56:19.851582 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-ea34-account-create-update-l6sms"] Dec 16 08:56:19 crc kubenswrapper[4823]: I1216 08:56:19.853046 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-ea34-account-create-update-l6sms" Dec 16 08:56:19 crc kubenswrapper[4823]: I1216 08:56:19.855845 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Dec 16 08:56:19 crc kubenswrapper[4823]: I1216 08:56:19.860712 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flppd\" (UniqueName: \"kubernetes.io/projected/0bcd2929-eefc-4b29-829b-e565910486bb-kube-api-access-flppd\") pod \"cinder-db-create-82cl4\" (UID: \"0bcd2929-eefc-4b29-829b-e565910486bb\") " pod="openstack/cinder-db-create-82cl4" Dec 16 08:56:19 crc kubenswrapper[4823]: I1216 08:56:19.860797 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0bcd2929-eefc-4b29-829b-e565910486bb-operator-scripts\") pod \"cinder-db-create-82cl4\" (UID: \"0bcd2929-eefc-4b29-829b-e565910486bb\") " pod="openstack/cinder-db-create-82cl4" Dec 16 08:56:19 crc kubenswrapper[4823]: I1216 08:56:19.861318 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-ea34-account-create-update-l6sms"] Dec 16 08:56:19 crc kubenswrapper[4823]: I1216 08:56:19.962103 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flppd\" (UniqueName: \"kubernetes.io/projected/0bcd2929-eefc-4b29-829b-e565910486bb-kube-api-access-flppd\") pod \"cinder-db-create-82cl4\" (UID: \"0bcd2929-eefc-4b29-829b-e565910486bb\") " pod="openstack/cinder-db-create-82cl4" Dec 16 08:56:19 crc kubenswrapper[4823]: I1216 08:56:19.962152 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lm6s\" (UniqueName: \"kubernetes.io/projected/4776592c-7509-4140-b012-6e506b95806d-kube-api-access-5lm6s\") pod \"cinder-ea34-account-create-update-l6sms\" (UID: 
\"4776592c-7509-4140-b012-6e506b95806d\") " pod="openstack/cinder-ea34-account-create-update-l6sms" Dec 16 08:56:19 crc kubenswrapper[4823]: I1216 08:56:19.962220 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0bcd2929-eefc-4b29-829b-e565910486bb-operator-scripts\") pod \"cinder-db-create-82cl4\" (UID: \"0bcd2929-eefc-4b29-829b-e565910486bb\") " pod="openstack/cinder-db-create-82cl4" Dec 16 08:56:19 crc kubenswrapper[4823]: I1216 08:56:19.962253 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4776592c-7509-4140-b012-6e506b95806d-operator-scripts\") pod \"cinder-ea34-account-create-update-l6sms\" (UID: \"4776592c-7509-4140-b012-6e506b95806d\") " pod="openstack/cinder-ea34-account-create-update-l6sms" Dec 16 08:56:19 crc kubenswrapper[4823]: I1216 08:56:19.963007 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0bcd2929-eefc-4b29-829b-e565910486bb-operator-scripts\") pod \"cinder-db-create-82cl4\" (UID: \"0bcd2929-eefc-4b29-829b-e565910486bb\") " pod="openstack/cinder-db-create-82cl4" Dec 16 08:56:19 crc kubenswrapper[4823]: I1216 08:56:19.980266 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flppd\" (UniqueName: \"kubernetes.io/projected/0bcd2929-eefc-4b29-829b-e565910486bb-kube-api-access-flppd\") pod \"cinder-db-create-82cl4\" (UID: \"0bcd2929-eefc-4b29-829b-e565910486bb\") " pod="openstack/cinder-db-create-82cl4" Dec 16 08:56:20 crc kubenswrapper[4823]: I1216 08:56:20.064247 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5lm6s\" (UniqueName: \"kubernetes.io/projected/4776592c-7509-4140-b012-6e506b95806d-kube-api-access-5lm6s\") pod \"cinder-ea34-account-create-update-l6sms\" (UID: 
\"4776592c-7509-4140-b012-6e506b95806d\") " pod="openstack/cinder-ea34-account-create-update-l6sms" Dec 16 08:56:20 crc kubenswrapper[4823]: I1216 08:56:20.064374 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4776592c-7509-4140-b012-6e506b95806d-operator-scripts\") pod \"cinder-ea34-account-create-update-l6sms\" (UID: \"4776592c-7509-4140-b012-6e506b95806d\") " pod="openstack/cinder-ea34-account-create-update-l6sms" Dec 16 08:56:20 crc kubenswrapper[4823]: I1216 08:56:20.065317 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4776592c-7509-4140-b012-6e506b95806d-operator-scripts\") pod \"cinder-ea34-account-create-update-l6sms\" (UID: \"4776592c-7509-4140-b012-6e506b95806d\") " pod="openstack/cinder-ea34-account-create-update-l6sms" Dec 16 08:56:20 crc kubenswrapper[4823]: I1216 08:56:20.082068 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-82cl4" Dec 16 08:56:20 crc kubenswrapper[4823]: I1216 08:56:20.083746 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lm6s\" (UniqueName: \"kubernetes.io/projected/4776592c-7509-4140-b012-6e506b95806d-kube-api-access-5lm6s\") pod \"cinder-ea34-account-create-update-l6sms\" (UID: \"4776592c-7509-4140-b012-6e506b95806d\") " pod="openstack/cinder-ea34-account-create-update-l6sms" Dec 16 08:56:20 crc kubenswrapper[4823]: I1216 08:56:20.170133 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-ea34-account-create-update-l6sms" Dec 16 08:56:20 crc kubenswrapper[4823]: I1216 08:56:20.527700 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-82cl4"] Dec 16 08:56:20 crc kubenswrapper[4823]: W1216 08:56:20.531206 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0bcd2929_eefc_4b29_829b_e565910486bb.slice/crio-cea756a2409edcec0bc924fa626169842943f6c7fbe6ae352e9769a73c7de923 WatchSource:0}: Error finding container cea756a2409edcec0bc924fa626169842943f6c7fbe6ae352e9769a73c7de923: Status 404 returned error can't find the container with id cea756a2409edcec0bc924fa626169842943f6c7fbe6ae352e9769a73c7de923 Dec 16 08:56:20 crc kubenswrapper[4823]: I1216 08:56:20.637652 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-ea34-account-create-update-l6sms"] Dec 16 08:56:21 crc kubenswrapper[4823]: I1216 08:56:21.154211 4823 generic.go:334] "Generic (PLEG): container finished" podID="4776592c-7509-4140-b012-6e506b95806d" containerID="c94a186f9ff0a8616322e194208655bbfeccfa8661b4b7871a9b562424b73faa" exitCode=0 Dec 16 08:56:21 crc kubenswrapper[4823]: I1216 08:56:21.154451 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-ea34-account-create-update-l6sms" event={"ID":"4776592c-7509-4140-b012-6e506b95806d","Type":"ContainerDied","Data":"c94a186f9ff0a8616322e194208655bbfeccfa8661b4b7871a9b562424b73faa"} Dec 16 08:56:21 crc kubenswrapper[4823]: I1216 08:56:21.154682 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-ea34-account-create-update-l6sms" event={"ID":"4776592c-7509-4140-b012-6e506b95806d","Type":"ContainerStarted","Data":"9923ffe90c215f89ce55e6e43868c4fd47fdfe3d57184a63f9fca8dc415b4322"} Dec 16 08:56:21 crc kubenswrapper[4823]: I1216 08:56:21.159001 4823 generic.go:334] "Generic (PLEG): container finished" 
podID="0bcd2929-eefc-4b29-829b-e565910486bb" containerID="4fa8fdd48414303b01bcd9b870436b73c85f7817ac4a6e64bc68d5926bad3e05" exitCode=0 Dec 16 08:56:21 crc kubenswrapper[4823]: I1216 08:56:21.159119 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-82cl4" event={"ID":"0bcd2929-eefc-4b29-829b-e565910486bb","Type":"ContainerDied","Data":"4fa8fdd48414303b01bcd9b870436b73c85f7817ac4a6e64bc68d5926bad3e05"} Dec 16 08:56:21 crc kubenswrapper[4823]: I1216 08:56:21.159173 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-82cl4" event={"ID":"0bcd2929-eefc-4b29-829b-e565910486bb","Type":"ContainerStarted","Data":"cea756a2409edcec0bc924fa626169842943f6c7fbe6ae352e9769a73c7de923"} Dec 16 08:56:22 crc kubenswrapper[4823]: I1216 08:56:22.621200 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-82cl4" Dec 16 08:56:22 crc kubenswrapper[4823]: I1216 08:56:22.630782 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-ea34-account-create-update-l6sms" Dec 16 08:56:22 crc kubenswrapper[4823]: I1216 08:56:22.715771 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4776592c-7509-4140-b012-6e506b95806d-operator-scripts\") pod \"4776592c-7509-4140-b012-6e506b95806d\" (UID: \"4776592c-7509-4140-b012-6e506b95806d\") " Dec 16 08:56:22 crc kubenswrapper[4823]: I1216 08:56:22.715897 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-flppd\" (UniqueName: \"kubernetes.io/projected/0bcd2929-eefc-4b29-829b-e565910486bb-kube-api-access-flppd\") pod \"0bcd2929-eefc-4b29-829b-e565910486bb\" (UID: \"0bcd2929-eefc-4b29-829b-e565910486bb\") " Dec 16 08:56:22 crc kubenswrapper[4823]: I1216 08:56:22.715952 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5lm6s\" (UniqueName: \"kubernetes.io/projected/4776592c-7509-4140-b012-6e506b95806d-kube-api-access-5lm6s\") pod \"4776592c-7509-4140-b012-6e506b95806d\" (UID: \"4776592c-7509-4140-b012-6e506b95806d\") " Dec 16 08:56:22 crc kubenswrapper[4823]: I1216 08:56:22.716242 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4776592c-7509-4140-b012-6e506b95806d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4776592c-7509-4140-b012-6e506b95806d" (UID: "4776592c-7509-4140-b012-6e506b95806d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:56:22 crc kubenswrapper[4823]: I1216 08:56:22.716546 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0bcd2929-eefc-4b29-829b-e565910486bb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0bcd2929-eefc-4b29-829b-e565910486bb" (UID: "0bcd2929-eefc-4b29-829b-e565910486bb"). 
InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:56:22 crc kubenswrapper[4823]: I1216 08:56:22.716673 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0bcd2929-eefc-4b29-829b-e565910486bb-operator-scripts\") pod \"0bcd2929-eefc-4b29-829b-e565910486bb\" (UID: \"0bcd2929-eefc-4b29-829b-e565910486bb\") " Dec 16 08:56:22 crc kubenswrapper[4823]: I1216 08:56:22.717162 4823 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0bcd2929-eefc-4b29-829b-e565910486bb-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 08:56:22 crc kubenswrapper[4823]: I1216 08:56:22.717184 4823 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4776592c-7509-4140-b012-6e506b95806d-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 08:56:22 crc kubenswrapper[4823]: I1216 08:56:22.722721 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4776592c-7509-4140-b012-6e506b95806d-kube-api-access-5lm6s" (OuterVolumeSpecName: "kube-api-access-5lm6s") pod "4776592c-7509-4140-b012-6e506b95806d" (UID: "4776592c-7509-4140-b012-6e506b95806d"). InnerVolumeSpecName "kube-api-access-5lm6s". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:56:22 crc kubenswrapper[4823]: I1216 08:56:22.723241 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0bcd2929-eefc-4b29-829b-e565910486bb-kube-api-access-flppd" (OuterVolumeSpecName: "kube-api-access-flppd") pod "0bcd2929-eefc-4b29-829b-e565910486bb" (UID: "0bcd2929-eefc-4b29-829b-e565910486bb"). InnerVolumeSpecName "kube-api-access-flppd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:56:22 crc kubenswrapper[4823]: I1216 08:56:22.818800 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-flppd\" (UniqueName: \"kubernetes.io/projected/0bcd2929-eefc-4b29-829b-e565910486bb-kube-api-access-flppd\") on node \"crc\" DevicePath \"\"" Dec 16 08:56:22 crc kubenswrapper[4823]: I1216 08:56:22.818836 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5lm6s\" (UniqueName: \"kubernetes.io/projected/4776592c-7509-4140-b012-6e506b95806d-kube-api-access-5lm6s\") on node \"crc\" DevicePath \"\"" Dec 16 08:56:23 crc kubenswrapper[4823]: I1216 08:56:23.179715 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-ea34-account-create-update-l6sms" Dec 16 08:56:23 crc kubenswrapper[4823]: I1216 08:56:23.179732 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-ea34-account-create-update-l6sms" event={"ID":"4776592c-7509-4140-b012-6e506b95806d","Type":"ContainerDied","Data":"9923ffe90c215f89ce55e6e43868c4fd47fdfe3d57184a63f9fca8dc415b4322"} Dec 16 08:56:23 crc kubenswrapper[4823]: I1216 08:56:23.179789 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9923ffe90c215f89ce55e6e43868c4fd47fdfe3d57184a63f9fca8dc415b4322" Dec 16 08:56:23 crc kubenswrapper[4823]: I1216 08:56:23.181657 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-82cl4" event={"ID":"0bcd2929-eefc-4b29-829b-e565910486bb","Type":"ContainerDied","Data":"cea756a2409edcec0bc924fa626169842943f6c7fbe6ae352e9769a73c7de923"} Dec 16 08:56:23 crc kubenswrapper[4823]: I1216 08:56:23.181705 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cea756a2409edcec0bc924fa626169842943f6c7fbe6ae352e9769a73c7de923" Dec 16 08:56:23 crc kubenswrapper[4823]: I1216 08:56:23.181719 4823 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/cinder-db-create-82cl4" Dec 16 08:56:24 crc kubenswrapper[4823]: I1216 08:56:24.771611 4823 scope.go:117] "RemoveContainer" containerID="9ce3e6cc66a3ba1f5a9f07614bbf78a449581b45707f8e1e5d9794f67e5c0428" Dec 16 08:56:24 crc kubenswrapper[4823]: E1216 08:56:24.772556 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 08:56:25 crc kubenswrapper[4823]: I1216 08:56:25.158477 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-fjw2z"] Dec 16 08:56:25 crc kubenswrapper[4823]: E1216 08:56:25.158881 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4776592c-7509-4140-b012-6e506b95806d" containerName="mariadb-account-create-update" Dec 16 08:56:25 crc kubenswrapper[4823]: I1216 08:56:25.158904 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="4776592c-7509-4140-b012-6e506b95806d" containerName="mariadb-account-create-update" Dec 16 08:56:25 crc kubenswrapper[4823]: E1216 08:56:25.158930 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bcd2929-eefc-4b29-829b-e565910486bb" containerName="mariadb-database-create" Dec 16 08:56:25 crc kubenswrapper[4823]: I1216 08:56:25.158939 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bcd2929-eefc-4b29-829b-e565910486bb" containerName="mariadb-database-create" Dec 16 08:56:25 crc kubenswrapper[4823]: I1216 08:56:25.159140 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="0bcd2929-eefc-4b29-829b-e565910486bb" containerName="mariadb-database-create" Dec 16 08:56:25 crc kubenswrapper[4823]: I1216 
08:56:25.159176 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="4776592c-7509-4140-b012-6e506b95806d" containerName="mariadb-account-create-update" Dec 16 08:56:25 crc kubenswrapper[4823]: I1216 08:56:25.159869 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-fjw2z" Dec 16 08:56:25 crc kubenswrapper[4823]: I1216 08:56:25.165473 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-rmcw7" Dec 16 08:56:25 crc kubenswrapper[4823]: I1216 08:56:25.165634 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 16 08:56:25 crc kubenswrapper[4823]: I1216 08:56:25.166270 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 16 08:56:25 crc kubenswrapper[4823]: I1216 08:56:25.185202 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-fjw2z"] Dec 16 08:56:25 crc kubenswrapper[4823]: I1216 08:56:25.270330 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/07a89ba3-ef70-4a41-b6d4-47d8575ccbbb-scripts\") pod \"cinder-db-sync-fjw2z\" (UID: \"07a89ba3-ef70-4a41-b6d4-47d8575ccbbb\") " pod="openstack/cinder-db-sync-fjw2z" Dec 16 08:56:25 crc kubenswrapper[4823]: I1216 08:56:25.270410 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/07a89ba3-ef70-4a41-b6d4-47d8575ccbbb-db-sync-config-data\") pod \"cinder-db-sync-fjw2z\" (UID: \"07a89ba3-ef70-4a41-b6d4-47d8575ccbbb\") " pod="openstack/cinder-db-sync-fjw2z" Dec 16 08:56:25 crc kubenswrapper[4823]: I1216 08:56:25.270489 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6p89\" (UniqueName: 
\"kubernetes.io/projected/07a89ba3-ef70-4a41-b6d4-47d8575ccbbb-kube-api-access-b6p89\") pod \"cinder-db-sync-fjw2z\" (UID: \"07a89ba3-ef70-4a41-b6d4-47d8575ccbbb\") " pod="openstack/cinder-db-sync-fjw2z" Dec 16 08:56:25 crc kubenswrapper[4823]: I1216 08:56:25.270517 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07a89ba3-ef70-4a41-b6d4-47d8575ccbbb-config-data\") pod \"cinder-db-sync-fjw2z\" (UID: \"07a89ba3-ef70-4a41-b6d4-47d8575ccbbb\") " pod="openstack/cinder-db-sync-fjw2z" Dec 16 08:56:25 crc kubenswrapper[4823]: I1216 08:56:25.270548 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07a89ba3-ef70-4a41-b6d4-47d8575ccbbb-combined-ca-bundle\") pod \"cinder-db-sync-fjw2z\" (UID: \"07a89ba3-ef70-4a41-b6d4-47d8575ccbbb\") " pod="openstack/cinder-db-sync-fjw2z" Dec 16 08:56:25 crc kubenswrapper[4823]: I1216 08:56:25.270612 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/07a89ba3-ef70-4a41-b6d4-47d8575ccbbb-etc-machine-id\") pod \"cinder-db-sync-fjw2z\" (UID: \"07a89ba3-ef70-4a41-b6d4-47d8575ccbbb\") " pod="openstack/cinder-db-sync-fjw2z" Dec 16 08:56:25 crc kubenswrapper[4823]: I1216 08:56:25.372216 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/07a89ba3-ef70-4a41-b6d4-47d8575ccbbb-scripts\") pod \"cinder-db-sync-fjw2z\" (UID: \"07a89ba3-ef70-4a41-b6d4-47d8575ccbbb\") " pod="openstack/cinder-db-sync-fjw2z" Dec 16 08:56:25 crc kubenswrapper[4823]: I1216 08:56:25.372285 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/07a89ba3-ef70-4a41-b6d4-47d8575ccbbb-db-sync-config-data\") pod 
\"cinder-db-sync-fjw2z\" (UID: \"07a89ba3-ef70-4a41-b6d4-47d8575ccbbb\") " pod="openstack/cinder-db-sync-fjw2z" Dec 16 08:56:25 crc kubenswrapper[4823]: I1216 08:56:25.372364 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b6p89\" (UniqueName: \"kubernetes.io/projected/07a89ba3-ef70-4a41-b6d4-47d8575ccbbb-kube-api-access-b6p89\") pod \"cinder-db-sync-fjw2z\" (UID: \"07a89ba3-ef70-4a41-b6d4-47d8575ccbbb\") " pod="openstack/cinder-db-sync-fjw2z" Dec 16 08:56:25 crc kubenswrapper[4823]: I1216 08:56:25.372394 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07a89ba3-ef70-4a41-b6d4-47d8575ccbbb-config-data\") pod \"cinder-db-sync-fjw2z\" (UID: \"07a89ba3-ef70-4a41-b6d4-47d8575ccbbb\") " pod="openstack/cinder-db-sync-fjw2z" Dec 16 08:56:25 crc kubenswrapper[4823]: I1216 08:56:25.372426 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07a89ba3-ef70-4a41-b6d4-47d8575ccbbb-combined-ca-bundle\") pod \"cinder-db-sync-fjw2z\" (UID: \"07a89ba3-ef70-4a41-b6d4-47d8575ccbbb\") " pod="openstack/cinder-db-sync-fjw2z" Dec 16 08:56:25 crc kubenswrapper[4823]: I1216 08:56:25.372504 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/07a89ba3-ef70-4a41-b6d4-47d8575ccbbb-etc-machine-id\") pod \"cinder-db-sync-fjw2z\" (UID: \"07a89ba3-ef70-4a41-b6d4-47d8575ccbbb\") " pod="openstack/cinder-db-sync-fjw2z" Dec 16 08:56:25 crc kubenswrapper[4823]: I1216 08:56:25.372619 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/07a89ba3-ef70-4a41-b6d4-47d8575ccbbb-etc-machine-id\") pod \"cinder-db-sync-fjw2z\" (UID: \"07a89ba3-ef70-4a41-b6d4-47d8575ccbbb\") " pod="openstack/cinder-db-sync-fjw2z" Dec 16 08:56:25 crc 
kubenswrapper[4823]: I1216 08:56:25.378787 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/07a89ba3-ef70-4a41-b6d4-47d8575ccbbb-db-sync-config-data\") pod \"cinder-db-sync-fjw2z\" (UID: \"07a89ba3-ef70-4a41-b6d4-47d8575ccbbb\") " pod="openstack/cinder-db-sync-fjw2z" Dec 16 08:56:25 crc kubenswrapper[4823]: I1216 08:56:25.379272 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07a89ba3-ef70-4a41-b6d4-47d8575ccbbb-config-data\") pod \"cinder-db-sync-fjw2z\" (UID: \"07a89ba3-ef70-4a41-b6d4-47d8575ccbbb\") " pod="openstack/cinder-db-sync-fjw2z" Dec 16 08:56:25 crc kubenswrapper[4823]: I1216 08:56:25.384559 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/07a89ba3-ef70-4a41-b6d4-47d8575ccbbb-scripts\") pod \"cinder-db-sync-fjw2z\" (UID: \"07a89ba3-ef70-4a41-b6d4-47d8575ccbbb\") " pod="openstack/cinder-db-sync-fjw2z" Dec 16 08:56:25 crc kubenswrapper[4823]: I1216 08:56:25.384854 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07a89ba3-ef70-4a41-b6d4-47d8575ccbbb-combined-ca-bundle\") pod \"cinder-db-sync-fjw2z\" (UID: \"07a89ba3-ef70-4a41-b6d4-47d8575ccbbb\") " pod="openstack/cinder-db-sync-fjw2z" Dec 16 08:56:25 crc kubenswrapper[4823]: I1216 08:56:25.390341 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6p89\" (UniqueName: \"kubernetes.io/projected/07a89ba3-ef70-4a41-b6d4-47d8575ccbbb-kube-api-access-b6p89\") pod \"cinder-db-sync-fjw2z\" (UID: \"07a89ba3-ef70-4a41-b6d4-47d8575ccbbb\") " pod="openstack/cinder-db-sync-fjw2z" Dec 16 08:56:25 crc kubenswrapper[4823]: I1216 08:56:25.482000 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-fjw2z" Dec 16 08:56:25 crc kubenswrapper[4823]: I1216 08:56:25.928972 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-fjw2z"] Dec 16 08:56:25 crc kubenswrapper[4823]: W1216 08:56:25.938048 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod07a89ba3_ef70_4a41_b6d4_47d8575ccbbb.slice/crio-b29333d3c79c5e900db0dde477ee3ba53692783cc11cf7284d3b25f5621aa85d WatchSource:0}: Error finding container b29333d3c79c5e900db0dde477ee3ba53692783cc11cf7284d3b25f5621aa85d: Status 404 returned error can't find the container with id b29333d3c79c5e900db0dde477ee3ba53692783cc11cf7284d3b25f5621aa85d Dec 16 08:56:26 crc kubenswrapper[4823]: I1216 08:56:26.212736 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-fjw2z" event={"ID":"07a89ba3-ef70-4a41-b6d4-47d8575ccbbb","Type":"ContainerStarted","Data":"b29333d3c79c5e900db0dde477ee3ba53692783cc11cf7284d3b25f5621aa85d"} Dec 16 08:56:37 crc kubenswrapper[4823]: I1216 08:56:37.772174 4823 scope.go:117] "RemoveContainer" containerID="9ce3e6cc66a3ba1f5a9f07614bbf78a449581b45707f8e1e5d9794f67e5c0428" Dec 16 08:56:37 crc kubenswrapper[4823]: E1216 08:56:37.772898 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 08:56:43 crc kubenswrapper[4823]: I1216 08:56:43.107450 4823 scope.go:117] "RemoveContainer" containerID="03659f99df8607265e3bc3e2c161dd22c3a561cec8494c87016a1df74b164e10" Dec 16 08:56:49 crc kubenswrapper[4823]: I1216 08:56:49.433200 4823 scope.go:117] 
"RemoveContainer" containerID="0249968b43bd996ba4c490b1b172d3964d514a2718cf78c3c4b9728773e2dc5f" Dec 16 08:56:49 crc kubenswrapper[4823]: E1216 08:56:49.463961 4823 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-cinder-api:c3a837a7c939c44c9106d2b2c7c72015" Dec 16 08:56:49 crc kubenswrapper[4823]: E1216 08:56:49.464092 4823 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-cinder-api:c3a837a7c939c44c9106d2b2c7c72015" Dec 16 08:56:49 crc kubenswrapper[4823]: E1216 08:56:49.464311 4823 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-cinder-api:c3a837a7c939c44c9106d2b2c7c72015,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-b6p89,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin
:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-fjw2z_openstack(07a89ba3-ef70-4a41-b6d4-47d8575ccbbb): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 16 08:56:49 crc kubenswrapper[4823]: E1216 08:56:49.465533 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-fjw2z" podUID="07a89ba3-ef70-4a41-b6d4-47d8575ccbbb" Dec 16 08:56:50 crc kubenswrapper[4823]: E1216 08:56:50.445648 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-antelope-centos9/openstack-cinder-api:c3a837a7c939c44c9106d2b2c7c72015\\\"\"" pod="openstack/cinder-db-sync-fjw2z" podUID="07a89ba3-ef70-4a41-b6d4-47d8575ccbbb" Dec 16 08:56:52 crc kubenswrapper[4823]: I1216 08:56:52.772924 4823 scope.go:117] "RemoveContainer" containerID="9ce3e6cc66a3ba1f5a9f07614bbf78a449581b45707f8e1e5d9794f67e5c0428" Dec 16 08:56:52 crc kubenswrapper[4823]: E1216 08:56:52.773479 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 08:57:05 crc kubenswrapper[4823]: I1216 08:57:05.771817 4823 scope.go:117] "RemoveContainer" 
containerID="9ce3e6cc66a3ba1f5a9f07614bbf78a449581b45707f8e1e5d9794f67e5c0428" Dec 16 08:57:05 crc kubenswrapper[4823]: E1216 08:57:05.772457 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 08:57:06 crc kubenswrapper[4823]: I1216 08:57:06.604389 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-fjw2z" event={"ID":"07a89ba3-ef70-4a41-b6d4-47d8575ccbbb","Type":"ContainerStarted","Data":"e700452af7238afdeafb9a420ed28e550e0314d8d8bb24657ea8760e0a2e188a"} Dec 16 08:57:06 crc kubenswrapper[4823]: I1216 08:57:06.630554 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-fjw2z" podStartSLOduration=2.59179535 podStartE2EDuration="41.630524319s" podCreationTimestamp="2025-12-16 08:56:25 +0000 UTC" firstStartedPulling="2025-12-16 08:56:25.94034573 +0000 UTC m=+7264.428911853" lastFinishedPulling="2025-12-16 08:57:04.979074689 +0000 UTC m=+7303.467640822" observedRunningTime="2025-12-16 08:57:06.626638167 +0000 UTC m=+7305.115204330" watchObservedRunningTime="2025-12-16 08:57:06.630524319 +0000 UTC m=+7305.119090472" Dec 16 08:57:09 crc kubenswrapper[4823]: I1216 08:57:09.629403 4823 generic.go:334] "Generic (PLEG): container finished" podID="07a89ba3-ef70-4a41-b6d4-47d8575ccbbb" containerID="e700452af7238afdeafb9a420ed28e550e0314d8d8bb24657ea8760e0a2e188a" exitCode=0 Dec 16 08:57:09 crc kubenswrapper[4823]: I1216 08:57:09.629525 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-fjw2z" 
event={"ID":"07a89ba3-ef70-4a41-b6d4-47d8575ccbbb","Type":"ContainerDied","Data":"e700452af7238afdeafb9a420ed28e550e0314d8d8bb24657ea8760e0a2e188a"} Dec 16 08:57:11 crc kubenswrapper[4823]: I1216 08:57:11.026956 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-fjw2z" Dec 16 08:57:11 crc kubenswrapper[4823]: I1216 08:57:11.094987 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/07a89ba3-ef70-4a41-b6d4-47d8575ccbbb-scripts\") pod \"07a89ba3-ef70-4a41-b6d4-47d8575ccbbb\" (UID: \"07a89ba3-ef70-4a41-b6d4-47d8575ccbbb\") " Dec 16 08:57:11 crc kubenswrapper[4823]: I1216 08:57:11.095293 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b6p89\" (UniqueName: \"kubernetes.io/projected/07a89ba3-ef70-4a41-b6d4-47d8575ccbbb-kube-api-access-b6p89\") pod \"07a89ba3-ef70-4a41-b6d4-47d8575ccbbb\" (UID: \"07a89ba3-ef70-4a41-b6d4-47d8575ccbbb\") " Dec 16 08:57:11 crc kubenswrapper[4823]: I1216 08:57:11.095403 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07a89ba3-ef70-4a41-b6d4-47d8575ccbbb-combined-ca-bundle\") pod \"07a89ba3-ef70-4a41-b6d4-47d8575ccbbb\" (UID: \"07a89ba3-ef70-4a41-b6d4-47d8575ccbbb\") " Dec 16 08:57:11 crc kubenswrapper[4823]: I1216 08:57:11.095445 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07a89ba3-ef70-4a41-b6d4-47d8575ccbbb-config-data\") pod \"07a89ba3-ef70-4a41-b6d4-47d8575ccbbb\" (UID: \"07a89ba3-ef70-4a41-b6d4-47d8575ccbbb\") " Dec 16 08:57:11 crc kubenswrapper[4823]: I1216 08:57:11.095485 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/07a89ba3-ef70-4a41-b6d4-47d8575ccbbb-db-sync-config-data\") pod \"07a89ba3-ef70-4a41-b6d4-47d8575ccbbb\" (UID: \"07a89ba3-ef70-4a41-b6d4-47d8575ccbbb\") " Dec 16 08:57:11 crc kubenswrapper[4823]: I1216 08:57:11.095524 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/07a89ba3-ef70-4a41-b6d4-47d8575ccbbb-etc-machine-id\") pod \"07a89ba3-ef70-4a41-b6d4-47d8575ccbbb\" (UID: \"07a89ba3-ef70-4a41-b6d4-47d8575ccbbb\") " Dec 16 08:57:11 crc kubenswrapper[4823]: I1216 08:57:11.096006 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/07a89ba3-ef70-4a41-b6d4-47d8575ccbbb-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "07a89ba3-ef70-4a41-b6d4-47d8575ccbbb" (UID: "07a89ba3-ef70-4a41-b6d4-47d8575ccbbb"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 08:57:11 crc kubenswrapper[4823]: I1216 08:57:11.096297 4823 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/07a89ba3-ef70-4a41-b6d4-47d8575ccbbb-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 16 08:57:11 crc kubenswrapper[4823]: I1216 08:57:11.099981 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07a89ba3-ef70-4a41-b6d4-47d8575ccbbb-scripts" (OuterVolumeSpecName: "scripts") pod "07a89ba3-ef70-4a41-b6d4-47d8575ccbbb" (UID: "07a89ba3-ef70-4a41-b6d4-47d8575ccbbb"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:57:11 crc kubenswrapper[4823]: I1216 08:57:11.100267 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07a89ba3-ef70-4a41-b6d4-47d8575ccbbb-kube-api-access-b6p89" (OuterVolumeSpecName: "kube-api-access-b6p89") pod "07a89ba3-ef70-4a41-b6d4-47d8575ccbbb" (UID: "07a89ba3-ef70-4a41-b6d4-47d8575ccbbb"). InnerVolumeSpecName "kube-api-access-b6p89". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:57:11 crc kubenswrapper[4823]: I1216 08:57:11.103138 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07a89ba3-ef70-4a41-b6d4-47d8575ccbbb-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "07a89ba3-ef70-4a41-b6d4-47d8575ccbbb" (UID: "07a89ba3-ef70-4a41-b6d4-47d8575ccbbb"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:57:11 crc kubenswrapper[4823]: I1216 08:57:11.131268 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07a89ba3-ef70-4a41-b6d4-47d8575ccbbb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "07a89ba3-ef70-4a41-b6d4-47d8575ccbbb" (UID: "07a89ba3-ef70-4a41-b6d4-47d8575ccbbb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:57:11 crc kubenswrapper[4823]: I1216 08:57:11.152174 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07a89ba3-ef70-4a41-b6d4-47d8575ccbbb-config-data" (OuterVolumeSpecName: "config-data") pod "07a89ba3-ef70-4a41-b6d4-47d8575ccbbb" (UID: "07a89ba3-ef70-4a41-b6d4-47d8575ccbbb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:57:11 crc kubenswrapper[4823]: I1216 08:57:11.198873 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b6p89\" (UniqueName: \"kubernetes.io/projected/07a89ba3-ef70-4a41-b6d4-47d8575ccbbb-kube-api-access-b6p89\") on node \"crc\" DevicePath \"\"" Dec 16 08:57:11 crc kubenswrapper[4823]: I1216 08:57:11.198940 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07a89ba3-ef70-4a41-b6d4-47d8575ccbbb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 08:57:11 crc kubenswrapper[4823]: I1216 08:57:11.198966 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07a89ba3-ef70-4a41-b6d4-47d8575ccbbb-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 08:57:11 crc kubenswrapper[4823]: I1216 08:57:11.198991 4823 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/07a89ba3-ef70-4a41-b6d4-47d8575ccbbb-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 08:57:11 crc kubenswrapper[4823]: I1216 08:57:11.199016 4823 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/07a89ba3-ef70-4a41-b6d4-47d8575ccbbb-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 08:57:11 crc kubenswrapper[4823]: I1216 08:57:11.662575 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-fjw2z" event={"ID":"07a89ba3-ef70-4a41-b6d4-47d8575ccbbb","Type":"ContainerDied","Data":"b29333d3c79c5e900db0dde477ee3ba53692783cc11cf7284d3b25f5621aa85d"} Dec 16 08:57:11 crc kubenswrapper[4823]: I1216 08:57:11.662622 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b29333d3c79c5e900db0dde477ee3ba53692783cc11cf7284d3b25f5621aa85d" Dec 16 08:57:11 crc kubenswrapper[4823]: I1216 08:57:11.662693 4823 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-fjw2z" Dec 16 08:57:12 crc kubenswrapper[4823]: I1216 08:57:12.004531 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-bf66f947-6bpwj"] Dec 16 08:57:12 crc kubenswrapper[4823]: E1216 08:57:12.005341 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07a89ba3-ef70-4a41-b6d4-47d8575ccbbb" containerName="cinder-db-sync" Dec 16 08:57:12 crc kubenswrapper[4823]: I1216 08:57:12.005367 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="07a89ba3-ef70-4a41-b6d4-47d8575ccbbb" containerName="cinder-db-sync" Dec 16 08:57:12 crc kubenswrapper[4823]: I1216 08:57:12.005979 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="07a89ba3-ef70-4a41-b6d4-47d8575ccbbb" containerName="cinder-db-sync" Dec 16 08:57:12 crc kubenswrapper[4823]: I1216 08:57:12.038429 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bf66f947-6bpwj" Dec 16 08:57:12 crc kubenswrapper[4823]: I1216 08:57:12.067972 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bf66f947-6bpwj"] Dec 16 08:57:12 crc kubenswrapper[4823]: I1216 08:57:12.115867 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23e23966-cfec-4725-b1aa-d799892ffec8-config\") pod \"dnsmasq-dns-bf66f947-6bpwj\" (UID: \"23e23966-cfec-4725-b1aa-d799892ffec8\") " pod="openstack/dnsmasq-dns-bf66f947-6bpwj" Dec 16 08:57:12 crc kubenswrapper[4823]: I1216 08:57:12.115961 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/23e23966-cfec-4725-b1aa-d799892ffec8-ovsdbserver-sb\") pod \"dnsmasq-dns-bf66f947-6bpwj\" (UID: \"23e23966-cfec-4725-b1aa-d799892ffec8\") " pod="openstack/dnsmasq-dns-bf66f947-6bpwj" Dec 16 
08:57:12 crc kubenswrapper[4823]: I1216 08:57:12.116089 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/23e23966-cfec-4725-b1aa-d799892ffec8-dns-svc\") pod \"dnsmasq-dns-bf66f947-6bpwj\" (UID: \"23e23966-cfec-4725-b1aa-d799892ffec8\") " pod="openstack/dnsmasq-dns-bf66f947-6bpwj" Dec 16 08:57:12 crc kubenswrapper[4823]: I1216 08:57:12.116104 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/23e23966-cfec-4725-b1aa-d799892ffec8-ovsdbserver-nb\") pod \"dnsmasq-dns-bf66f947-6bpwj\" (UID: \"23e23966-cfec-4725-b1aa-d799892ffec8\") " pod="openstack/dnsmasq-dns-bf66f947-6bpwj" Dec 16 08:57:12 crc kubenswrapper[4823]: I1216 08:57:12.116129 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzjvc\" (UniqueName: \"kubernetes.io/projected/23e23966-cfec-4725-b1aa-d799892ffec8-kube-api-access-fzjvc\") pod \"dnsmasq-dns-bf66f947-6bpwj\" (UID: \"23e23966-cfec-4725-b1aa-d799892ffec8\") " pod="openstack/dnsmasq-dns-bf66f947-6bpwj" Dec 16 08:57:12 crc kubenswrapper[4823]: I1216 08:57:12.162192 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 16 08:57:12 crc kubenswrapper[4823]: I1216 08:57:12.164052 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 16 08:57:12 crc kubenswrapper[4823]: I1216 08:57:12.166846 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-rmcw7" Dec 16 08:57:12 crc kubenswrapper[4823]: I1216 08:57:12.168182 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 16 08:57:12 crc kubenswrapper[4823]: I1216 08:57:12.168437 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 16 08:57:12 crc kubenswrapper[4823]: I1216 08:57:12.168572 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 16 08:57:12 crc kubenswrapper[4823]: I1216 08:57:12.170559 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 16 08:57:12 crc kubenswrapper[4823]: I1216 08:57:12.221165 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/23e23966-cfec-4725-b1aa-d799892ffec8-dns-svc\") pod \"dnsmasq-dns-bf66f947-6bpwj\" (UID: \"23e23966-cfec-4725-b1aa-d799892ffec8\") " pod="openstack/dnsmasq-dns-bf66f947-6bpwj" Dec 16 08:57:12 crc kubenswrapper[4823]: I1216 08:57:12.221223 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/23e23966-cfec-4725-b1aa-d799892ffec8-ovsdbserver-nb\") pod \"dnsmasq-dns-bf66f947-6bpwj\" (UID: \"23e23966-cfec-4725-b1aa-d799892ffec8\") " pod="openstack/dnsmasq-dns-bf66f947-6bpwj" Dec 16 08:57:12 crc kubenswrapper[4823]: I1216 08:57:12.221258 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzjvc\" (UniqueName: \"kubernetes.io/projected/23e23966-cfec-4725-b1aa-d799892ffec8-kube-api-access-fzjvc\") pod \"dnsmasq-dns-bf66f947-6bpwj\" (UID: \"23e23966-cfec-4725-b1aa-d799892ffec8\") " 
pod="openstack/dnsmasq-dns-bf66f947-6bpwj" Dec 16 08:57:12 crc kubenswrapper[4823]: I1216 08:57:12.221363 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad3f6275-71bd-4923-bd4d-b1f102f6569f-scripts\") pod \"cinder-api-0\" (UID: \"ad3f6275-71bd-4923-bd4d-b1f102f6569f\") " pod="openstack/cinder-api-0" Dec 16 08:57:12 crc kubenswrapper[4823]: I1216 08:57:12.221387 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ad3f6275-71bd-4923-bd4d-b1f102f6569f-logs\") pod \"cinder-api-0\" (UID: \"ad3f6275-71bd-4923-bd4d-b1f102f6569f\") " pod="openstack/cinder-api-0" Dec 16 08:57:12 crc kubenswrapper[4823]: I1216 08:57:12.221432 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23e23966-cfec-4725-b1aa-d799892ffec8-config\") pod \"dnsmasq-dns-bf66f947-6bpwj\" (UID: \"23e23966-cfec-4725-b1aa-d799892ffec8\") " pod="openstack/dnsmasq-dns-bf66f947-6bpwj" Dec 16 08:57:12 crc kubenswrapper[4823]: I1216 08:57:12.221470 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad3f6275-71bd-4923-bd4d-b1f102f6569f-config-data\") pod \"cinder-api-0\" (UID: \"ad3f6275-71bd-4923-bd4d-b1f102f6569f\") " pod="openstack/cinder-api-0" Dec 16 08:57:12 crc kubenswrapper[4823]: I1216 08:57:12.221497 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpwvc\" (UniqueName: \"kubernetes.io/projected/ad3f6275-71bd-4923-bd4d-b1f102f6569f-kube-api-access-wpwvc\") pod \"cinder-api-0\" (UID: \"ad3f6275-71bd-4923-bd4d-b1f102f6569f\") " pod="openstack/cinder-api-0" Dec 16 08:57:12 crc kubenswrapper[4823]: I1216 08:57:12.221549 4823 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/23e23966-cfec-4725-b1aa-d799892ffec8-ovsdbserver-sb\") pod \"dnsmasq-dns-bf66f947-6bpwj\" (UID: \"23e23966-cfec-4725-b1aa-d799892ffec8\") " pod="openstack/dnsmasq-dns-bf66f947-6bpwj" Dec 16 08:57:12 crc kubenswrapper[4823]: I1216 08:57:12.221571 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad3f6275-71bd-4923-bd4d-b1f102f6569f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"ad3f6275-71bd-4923-bd4d-b1f102f6569f\") " pod="openstack/cinder-api-0" Dec 16 08:57:12 crc kubenswrapper[4823]: I1216 08:57:12.221596 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ad3f6275-71bd-4923-bd4d-b1f102f6569f-config-data-custom\") pod \"cinder-api-0\" (UID: \"ad3f6275-71bd-4923-bd4d-b1f102f6569f\") " pod="openstack/cinder-api-0" Dec 16 08:57:12 crc kubenswrapper[4823]: I1216 08:57:12.221627 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ad3f6275-71bd-4923-bd4d-b1f102f6569f-etc-machine-id\") pod \"cinder-api-0\" (UID: \"ad3f6275-71bd-4923-bd4d-b1f102f6569f\") " pod="openstack/cinder-api-0" Dec 16 08:57:12 crc kubenswrapper[4823]: I1216 08:57:12.222578 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/23e23966-cfec-4725-b1aa-d799892ffec8-dns-svc\") pod \"dnsmasq-dns-bf66f947-6bpwj\" (UID: \"23e23966-cfec-4725-b1aa-d799892ffec8\") " pod="openstack/dnsmasq-dns-bf66f947-6bpwj" Dec 16 08:57:12 crc kubenswrapper[4823]: I1216 08:57:12.223254 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/23e23966-cfec-4725-b1aa-d799892ffec8-ovsdbserver-nb\") pod \"dnsmasq-dns-bf66f947-6bpwj\" (UID: \"23e23966-cfec-4725-b1aa-d799892ffec8\") " pod="openstack/dnsmasq-dns-bf66f947-6bpwj" Dec 16 08:57:12 crc kubenswrapper[4823]: I1216 08:57:12.223866 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/23e23966-cfec-4725-b1aa-d799892ffec8-ovsdbserver-sb\") pod \"dnsmasq-dns-bf66f947-6bpwj\" (UID: \"23e23966-cfec-4725-b1aa-d799892ffec8\") " pod="openstack/dnsmasq-dns-bf66f947-6bpwj" Dec 16 08:57:12 crc kubenswrapper[4823]: I1216 08:57:12.224226 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23e23966-cfec-4725-b1aa-d799892ffec8-config\") pod \"dnsmasq-dns-bf66f947-6bpwj\" (UID: \"23e23966-cfec-4725-b1aa-d799892ffec8\") " pod="openstack/dnsmasq-dns-bf66f947-6bpwj" Dec 16 08:57:12 crc kubenswrapper[4823]: I1216 08:57:12.245856 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzjvc\" (UniqueName: \"kubernetes.io/projected/23e23966-cfec-4725-b1aa-d799892ffec8-kube-api-access-fzjvc\") pod \"dnsmasq-dns-bf66f947-6bpwj\" (UID: \"23e23966-cfec-4725-b1aa-d799892ffec8\") " pod="openstack/dnsmasq-dns-bf66f947-6bpwj" Dec 16 08:57:12 crc kubenswrapper[4823]: I1216 08:57:12.324773 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ad3f6275-71bd-4923-bd4d-b1f102f6569f-logs\") pod \"cinder-api-0\" (UID: \"ad3f6275-71bd-4923-bd4d-b1f102f6569f\") " pod="openstack/cinder-api-0" Dec 16 08:57:12 crc kubenswrapper[4823]: I1216 08:57:12.324964 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad3f6275-71bd-4923-bd4d-b1f102f6569f-config-data\") pod \"cinder-api-0\" (UID: \"ad3f6275-71bd-4923-bd4d-b1f102f6569f\") " 
pod="openstack/cinder-api-0" Dec 16 08:57:12 crc kubenswrapper[4823]: I1216 08:57:12.325007 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpwvc\" (UniqueName: \"kubernetes.io/projected/ad3f6275-71bd-4923-bd4d-b1f102f6569f-kube-api-access-wpwvc\") pod \"cinder-api-0\" (UID: \"ad3f6275-71bd-4923-bd4d-b1f102f6569f\") " pod="openstack/cinder-api-0" Dec 16 08:57:12 crc kubenswrapper[4823]: I1216 08:57:12.325101 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad3f6275-71bd-4923-bd4d-b1f102f6569f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"ad3f6275-71bd-4923-bd4d-b1f102f6569f\") " pod="openstack/cinder-api-0" Dec 16 08:57:12 crc kubenswrapper[4823]: I1216 08:57:12.325123 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ad3f6275-71bd-4923-bd4d-b1f102f6569f-config-data-custom\") pod \"cinder-api-0\" (UID: \"ad3f6275-71bd-4923-bd4d-b1f102f6569f\") " pod="openstack/cinder-api-0" Dec 16 08:57:12 crc kubenswrapper[4823]: I1216 08:57:12.325153 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ad3f6275-71bd-4923-bd4d-b1f102f6569f-etc-machine-id\") pod \"cinder-api-0\" (UID: \"ad3f6275-71bd-4923-bd4d-b1f102f6569f\") " pod="openstack/cinder-api-0" Dec 16 08:57:12 crc kubenswrapper[4823]: I1216 08:57:12.325382 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad3f6275-71bd-4923-bd4d-b1f102f6569f-scripts\") pod \"cinder-api-0\" (UID: \"ad3f6275-71bd-4923-bd4d-b1f102f6569f\") " pod="openstack/cinder-api-0" Dec 16 08:57:12 crc kubenswrapper[4823]: I1216 08:57:12.325520 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/ad3f6275-71bd-4923-bd4d-b1f102f6569f-logs\") pod \"cinder-api-0\" (UID: \"ad3f6275-71bd-4923-bd4d-b1f102f6569f\") " pod="openstack/cinder-api-0" Dec 16 08:57:12 crc kubenswrapper[4823]: I1216 08:57:12.325903 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ad3f6275-71bd-4923-bd4d-b1f102f6569f-etc-machine-id\") pod \"cinder-api-0\" (UID: \"ad3f6275-71bd-4923-bd4d-b1f102f6569f\") " pod="openstack/cinder-api-0" Dec 16 08:57:12 crc kubenswrapper[4823]: I1216 08:57:12.329131 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ad3f6275-71bd-4923-bd4d-b1f102f6569f-config-data-custom\") pod \"cinder-api-0\" (UID: \"ad3f6275-71bd-4923-bd4d-b1f102f6569f\") " pod="openstack/cinder-api-0" Dec 16 08:57:12 crc kubenswrapper[4823]: I1216 08:57:12.329180 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad3f6275-71bd-4923-bd4d-b1f102f6569f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"ad3f6275-71bd-4923-bd4d-b1f102f6569f\") " pod="openstack/cinder-api-0" Dec 16 08:57:12 crc kubenswrapper[4823]: I1216 08:57:12.329450 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad3f6275-71bd-4923-bd4d-b1f102f6569f-scripts\") pod \"cinder-api-0\" (UID: \"ad3f6275-71bd-4923-bd4d-b1f102f6569f\") " pod="openstack/cinder-api-0" Dec 16 08:57:12 crc kubenswrapper[4823]: I1216 08:57:12.330637 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad3f6275-71bd-4923-bd4d-b1f102f6569f-config-data\") pod \"cinder-api-0\" (UID: \"ad3f6275-71bd-4923-bd4d-b1f102f6569f\") " pod="openstack/cinder-api-0" Dec 16 08:57:12 crc kubenswrapper[4823]: I1216 08:57:12.350835 4823 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-wpwvc\" (UniqueName: \"kubernetes.io/projected/ad3f6275-71bd-4923-bd4d-b1f102f6569f-kube-api-access-wpwvc\") pod \"cinder-api-0\" (UID: \"ad3f6275-71bd-4923-bd4d-b1f102f6569f\") " pod="openstack/cinder-api-0" Dec 16 08:57:12 crc kubenswrapper[4823]: I1216 08:57:12.377906 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bf66f947-6bpwj" Dec 16 08:57:12 crc kubenswrapper[4823]: I1216 08:57:12.490353 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 16 08:57:12 crc kubenswrapper[4823]: I1216 08:57:12.854681 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bf66f947-6bpwj"] Dec 16 08:57:12 crc kubenswrapper[4823]: I1216 08:57:12.975696 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 16 08:57:12 crc kubenswrapper[4823]: W1216 08:57:12.985163 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podad3f6275_71bd_4923_bd4d_b1f102f6569f.slice/crio-32c306df1a30105a0533a4561ac60432eec1dee0615d71f0755117f74fb141d6 WatchSource:0}: Error finding container 32c306df1a30105a0533a4561ac60432eec1dee0615d71f0755117f74fb141d6: Status 404 returned error can't find the container with id 32c306df1a30105a0533a4561ac60432eec1dee0615d71f0755117f74fb141d6 Dec 16 08:57:13 crc kubenswrapper[4823]: I1216 08:57:13.677510 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ad3f6275-71bd-4923-bd4d-b1f102f6569f","Type":"ContainerStarted","Data":"1b2b822b3bdbf16fc11848232ac0714e68bf564566dcb604a2c490d56710dc22"} Dec 16 08:57:13 crc kubenswrapper[4823]: I1216 08:57:13.677902 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"ad3f6275-71bd-4923-bd4d-b1f102f6569f","Type":"ContainerStarted","Data":"32c306df1a30105a0533a4561ac60432eec1dee0615d71f0755117f74fb141d6"} Dec 16 08:57:13 crc kubenswrapper[4823]: I1216 08:57:13.678843 4823 generic.go:334] "Generic (PLEG): container finished" podID="23e23966-cfec-4725-b1aa-d799892ffec8" containerID="ae817d2376b2a74027a93b051da528811300f67817e188f14c57bf5c32630c20" exitCode=0 Dec 16 08:57:13 crc kubenswrapper[4823]: I1216 08:57:13.678864 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bf66f947-6bpwj" event={"ID":"23e23966-cfec-4725-b1aa-d799892ffec8","Type":"ContainerDied","Data":"ae817d2376b2a74027a93b051da528811300f67817e188f14c57bf5c32630c20"} Dec 16 08:57:13 crc kubenswrapper[4823]: I1216 08:57:13.678880 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bf66f947-6bpwj" event={"ID":"23e23966-cfec-4725-b1aa-d799892ffec8","Type":"ContainerStarted","Data":"eed53cf5b3a88f01275a2364ad1f3a12b4a67641c79fd11340f2e9afc124393e"} Dec 16 08:57:14 crc kubenswrapper[4823]: I1216 08:57:14.550863 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 16 08:57:14 crc kubenswrapper[4823]: I1216 08:57:14.693439 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bf66f947-6bpwj" event={"ID":"23e23966-cfec-4725-b1aa-d799892ffec8","Type":"ContainerStarted","Data":"eaeb0688225ae3b5197b031737e061585daffa5a177d8f52564775c8054fc846"} Dec 16 08:57:14 crc kubenswrapper[4823]: I1216 08:57:14.693617 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-bf66f947-6bpwj" Dec 16 08:57:14 crc kubenswrapper[4823]: I1216 08:57:14.700051 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ad3f6275-71bd-4923-bd4d-b1f102f6569f","Type":"ContainerStarted","Data":"6a4c0372e9d2da0c1828bbb60b99735133fefafef74d36b56240b8133d0eb343"} Dec 16 08:57:14 crc 
kubenswrapper[4823]: I1216 08:57:14.700227 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 16 08:57:14 crc kubenswrapper[4823]: I1216 08:57:14.713261 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-bf66f947-6bpwj" podStartSLOduration=3.7132409969999998 podStartE2EDuration="3.713240997s" podCreationTimestamp="2025-12-16 08:57:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 08:57:14.711141491 +0000 UTC m=+7313.199707634" watchObservedRunningTime="2025-12-16 08:57:14.713240997 +0000 UTC m=+7313.201807120" Dec 16 08:57:14 crc kubenswrapper[4823]: I1216 08:57:14.739076 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=2.739056895 podStartE2EDuration="2.739056895s" podCreationTimestamp="2025-12-16 08:57:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 08:57:14.737097744 +0000 UTC m=+7313.225663877" watchObservedRunningTime="2025-12-16 08:57:14.739056895 +0000 UTC m=+7313.227623028" Dec 16 08:57:16 crc kubenswrapper[4823]: I1216 08:57:16.277692 4823 scope.go:117] "RemoveContainer" containerID="9ce3e6cc66a3ba1f5a9f07614bbf78a449581b45707f8e1e5d9794f67e5c0428" Dec 16 08:57:16 crc kubenswrapper[4823]: E1216 08:57:16.278176 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 08:57:16 crc kubenswrapper[4823]: I1216 
08:57:16.304682 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="ad3f6275-71bd-4923-bd4d-b1f102f6569f" containerName="cinder-api-log" containerID="cri-o://1b2b822b3bdbf16fc11848232ac0714e68bf564566dcb604a2c490d56710dc22" gracePeriod=30 Dec 16 08:57:16 crc kubenswrapper[4823]: I1216 08:57:16.304883 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="ad3f6275-71bd-4923-bd4d-b1f102f6569f" containerName="cinder-api" containerID="cri-o://6a4c0372e9d2da0c1828bbb60b99735133fefafef74d36b56240b8133d0eb343" gracePeriod=30 Dec 16 08:57:16 crc kubenswrapper[4823]: I1216 08:57:16.997203 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 16 08:57:17 crc kubenswrapper[4823]: I1216 08:57:17.123607 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad3f6275-71bd-4923-bd4d-b1f102f6569f-config-data\") pod \"ad3f6275-71bd-4923-bd4d-b1f102f6569f\" (UID: \"ad3f6275-71bd-4923-bd4d-b1f102f6569f\") " Dec 16 08:57:17 crc kubenswrapper[4823]: I1216 08:57:17.123683 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wpwvc\" (UniqueName: \"kubernetes.io/projected/ad3f6275-71bd-4923-bd4d-b1f102f6569f-kube-api-access-wpwvc\") pod \"ad3f6275-71bd-4923-bd4d-b1f102f6569f\" (UID: \"ad3f6275-71bd-4923-bd4d-b1f102f6569f\") " Dec 16 08:57:17 crc kubenswrapper[4823]: I1216 08:57:17.123750 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad3f6275-71bd-4923-bd4d-b1f102f6569f-combined-ca-bundle\") pod \"ad3f6275-71bd-4923-bd4d-b1f102f6569f\" (UID: \"ad3f6275-71bd-4923-bd4d-b1f102f6569f\") " Dec 16 08:57:17 crc kubenswrapper[4823]: I1216 08:57:17.123840 4823 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ad3f6275-71bd-4923-bd4d-b1f102f6569f-config-data-custom\") pod \"ad3f6275-71bd-4923-bd4d-b1f102f6569f\" (UID: \"ad3f6275-71bd-4923-bd4d-b1f102f6569f\") " Dec 16 08:57:17 crc kubenswrapper[4823]: I1216 08:57:17.123949 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ad3f6275-71bd-4923-bd4d-b1f102f6569f-logs\") pod \"ad3f6275-71bd-4923-bd4d-b1f102f6569f\" (UID: \"ad3f6275-71bd-4923-bd4d-b1f102f6569f\") " Dec 16 08:57:17 crc kubenswrapper[4823]: I1216 08:57:17.123988 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ad3f6275-71bd-4923-bd4d-b1f102f6569f-etc-machine-id\") pod \"ad3f6275-71bd-4923-bd4d-b1f102f6569f\" (UID: \"ad3f6275-71bd-4923-bd4d-b1f102f6569f\") " Dec 16 08:57:17 crc kubenswrapper[4823]: I1216 08:57:17.124038 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad3f6275-71bd-4923-bd4d-b1f102f6569f-scripts\") pod \"ad3f6275-71bd-4923-bd4d-b1f102f6569f\" (UID: \"ad3f6275-71bd-4923-bd4d-b1f102f6569f\") " Dec 16 08:57:17 crc kubenswrapper[4823]: I1216 08:57:17.124138 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ad3f6275-71bd-4923-bd4d-b1f102f6569f-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "ad3f6275-71bd-4923-bd4d-b1f102f6569f" (UID: "ad3f6275-71bd-4923-bd4d-b1f102f6569f"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 08:57:17 crc kubenswrapper[4823]: I1216 08:57:17.124361 4823 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ad3f6275-71bd-4923-bd4d-b1f102f6569f-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 16 08:57:17 crc kubenswrapper[4823]: I1216 08:57:17.124373 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad3f6275-71bd-4923-bd4d-b1f102f6569f-logs" (OuterVolumeSpecName: "logs") pod "ad3f6275-71bd-4923-bd4d-b1f102f6569f" (UID: "ad3f6275-71bd-4923-bd4d-b1f102f6569f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 08:57:17 crc kubenswrapper[4823]: I1216 08:57:17.128887 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad3f6275-71bd-4923-bd4d-b1f102f6569f-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "ad3f6275-71bd-4923-bd4d-b1f102f6569f" (UID: "ad3f6275-71bd-4923-bd4d-b1f102f6569f"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:57:17 crc kubenswrapper[4823]: I1216 08:57:17.130172 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad3f6275-71bd-4923-bd4d-b1f102f6569f-kube-api-access-wpwvc" (OuterVolumeSpecName: "kube-api-access-wpwvc") pod "ad3f6275-71bd-4923-bd4d-b1f102f6569f" (UID: "ad3f6275-71bd-4923-bd4d-b1f102f6569f"). InnerVolumeSpecName "kube-api-access-wpwvc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:57:17 crc kubenswrapper[4823]: I1216 08:57:17.130185 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad3f6275-71bd-4923-bd4d-b1f102f6569f-scripts" (OuterVolumeSpecName: "scripts") pod "ad3f6275-71bd-4923-bd4d-b1f102f6569f" (UID: "ad3f6275-71bd-4923-bd4d-b1f102f6569f"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:57:17 crc kubenswrapper[4823]: I1216 08:57:17.148450 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad3f6275-71bd-4923-bd4d-b1f102f6569f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ad3f6275-71bd-4923-bd4d-b1f102f6569f" (UID: "ad3f6275-71bd-4923-bd4d-b1f102f6569f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:57:17 crc kubenswrapper[4823]: I1216 08:57:17.178274 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad3f6275-71bd-4923-bd4d-b1f102f6569f-config-data" (OuterVolumeSpecName: "config-data") pod "ad3f6275-71bd-4923-bd4d-b1f102f6569f" (UID: "ad3f6275-71bd-4923-bd4d-b1f102f6569f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:57:17 crc kubenswrapper[4823]: I1216 08:57:17.226398 4823 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad3f6275-71bd-4923-bd4d-b1f102f6569f-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 08:57:17 crc kubenswrapper[4823]: I1216 08:57:17.226437 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad3f6275-71bd-4923-bd4d-b1f102f6569f-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 08:57:17 crc kubenswrapper[4823]: I1216 08:57:17.226453 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wpwvc\" (UniqueName: \"kubernetes.io/projected/ad3f6275-71bd-4923-bd4d-b1f102f6569f-kube-api-access-wpwvc\") on node \"crc\" DevicePath \"\"" Dec 16 08:57:17 crc kubenswrapper[4823]: I1216 08:57:17.226467 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad3f6275-71bd-4923-bd4d-b1f102f6569f-combined-ca-bundle\") on node 
\"crc\" DevicePath \"\"" Dec 16 08:57:17 crc kubenswrapper[4823]: I1216 08:57:17.226479 4823 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ad3f6275-71bd-4923-bd4d-b1f102f6569f-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 16 08:57:17 crc kubenswrapper[4823]: I1216 08:57:17.226490 4823 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ad3f6275-71bd-4923-bd4d-b1f102f6569f-logs\") on node \"crc\" DevicePath \"\"" Dec 16 08:57:17 crc kubenswrapper[4823]: I1216 08:57:17.315820 4823 generic.go:334] "Generic (PLEG): container finished" podID="ad3f6275-71bd-4923-bd4d-b1f102f6569f" containerID="6a4c0372e9d2da0c1828bbb60b99735133fefafef74d36b56240b8133d0eb343" exitCode=0 Dec 16 08:57:17 crc kubenswrapper[4823]: I1216 08:57:17.316165 4823 generic.go:334] "Generic (PLEG): container finished" podID="ad3f6275-71bd-4923-bd4d-b1f102f6569f" containerID="1b2b822b3bdbf16fc11848232ac0714e68bf564566dcb604a2c490d56710dc22" exitCode=143 Dec 16 08:57:17 crc kubenswrapper[4823]: I1216 08:57:17.315907 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ad3f6275-71bd-4923-bd4d-b1f102f6569f","Type":"ContainerDied","Data":"6a4c0372e9d2da0c1828bbb60b99735133fefafef74d36b56240b8133d0eb343"} Dec 16 08:57:17 crc kubenswrapper[4823]: I1216 08:57:17.315906 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 16 08:57:17 crc kubenswrapper[4823]: I1216 08:57:17.316209 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ad3f6275-71bd-4923-bd4d-b1f102f6569f","Type":"ContainerDied","Data":"1b2b822b3bdbf16fc11848232ac0714e68bf564566dcb604a2c490d56710dc22"} Dec 16 08:57:17 crc kubenswrapper[4823]: I1216 08:57:17.316223 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ad3f6275-71bd-4923-bd4d-b1f102f6569f","Type":"ContainerDied","Data":"32c306df1a30105a0533a4561ac60432eec1dee0615d71f0755117f74fb141d6"} Dec 16 08:57:17 crc kubenswrapper[4823]: I1216 08:57:17.316244 4823 scope.go:117] "RemoveContainer" containerID="6a4c0372e9d2da0c1828bbb60b99735133fefafef74d36b56240b8133d0eb343" Dec 16 08:57:17 crc kubenswrapper[4823]: I1216 08:57:17.362684 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 16 08:57:17 crc kubenswrapper[4823]: I1216 08:57:17.369804 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Dec 16 08:57:17 crc kubenswrapper[4823]: I1216 08:57:17.374919 4823 scope.go:117] "RemoveContainer" containerID="1b2b822b3bdbf16fc11848232ac0714e68bf564566dcb604a2c490d56710dc22" Dec 16 08:57:17 crc kubenswrapper[4823]: I1216 08:57:17.376419 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 16 08:57:17 crc kubenswrapper[4823]: E1216 08:57:17.376765 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad3f6275-71bd-4923-bd4d-b1f102f6569f" containerName="cinder-api" Dec 16 08:57:17 crc kubenswrapper[4823]: I1216 08:57:17.376783 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad3f6275-71bd-4923-bd4d-b1f102f6569f" containerName="cinder-api" Dec 16 08:57:17 crc kubenswrapper[4823]: E1216 08:57:17.376808 4823 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="ad3f6275-71bd-4923-bd4d-b1f102f6569f" containerName="cinder-api-log" Dec 16 08:57:17 crc kubenswrapper[4823]: I1216 08:57:17.376815 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad3f6275-71bd-4923-bd4d-b1f102f6569f" containerName="cinder-api-log" Dec 16 08:57:17 crc kubenswrapper[4823]: I1216 08:57:17.376985 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad3f6275-71bd-4923-bd4d-b1f102f6569f" containerName="cinder-api" Dec 16 08:57:17 crc kubenswrapper[4823]: I1216 08:57:17.377079 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad3f6275-71bd-4923-bd4d-b1f102f6569f" containerName="cinder-api-log" Dec 16 08:57:17 crc kubenswrapper[4823]: I1216 08:57:17.378299 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 16 08:57:17 crc kubenswrapper[4823]: I1216 08:57:17.386887 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 16 08:57:17 crc kubenswrapper[4823]: I1216 08:57:17.425185 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Dec 16 08:57:17 crc kubenswrapper[4823]: I1216 08:57:17.425451 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-rmcw7" Dec 16 08:57:17 crc kubenswrapper[4823]: I1216 08:57:17.425579 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Dec 16 08:57:17 crc kubenswrapper[4823]: I1216 08:57:17.425686 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 16 08:57:17 crc kubenswrapper[4823]: I1216 08:57:17.425824 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 16 08:57:17 crc kubenswrapper[4823]: I1216 08:57:17.426238 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 16 08:57:17 
crc kubenswrapper[4823]: I1216 08:57:17.428993 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ec64cc79-d75f-4f52-8ec4-22de2801736b-logs\") pod \"cinder-api-0\" (UID: \"ec64cc79-d75f-4f52-8ec4-22de2801736b\") " pod="openstack/cinder-api-0" Dec 16 08:57:17 crc kubenswrapper[4823]: I1216 08:57:17.429372 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec64cc79-d75f-4f52-8ec4-22de2801736b-config-data\") pod \"cinder-api-0\" (UID: \"ec64cc79-d75f-4f52-8ec4-22de2801736b\") " pod="openstack/cinder-api-0" Dec 16 08:57:17 crc kubenswrapper[4823]: I1216 08:57:17.429604 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec64cc79-d75f-4f52-8ec4-22de2801736b-public-tls-certs\") pod \"cinder-api-0\" (UID: \"ec64cc79-d75f-4f52-8ec4-22de2801736b\") " pod="openstack/cinder-api-0" Dec 16 08:57:17 crc kubenswrapper[4823]: I1216 08:57:17.429638 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec64cc79-d75f-4f52-8ec4-22de2801736b-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"ec64cc79-d75f-4f52-8ec4-22de2801736b\") " pod="openstack/cinder-api-0" Dec 16 08:57:17 crc kubenswrapper[4823]: I1216 08:57:17.429686 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec64cc79-d75f-4f52-8ec4-22de2801736b-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"ec64cc79-d75f-4f52-8ec4-22de2801736b\") " pod="openstack/cinder-api-0" Dec 16 08:57:17 crc kubenswrapper[4823]: I1216 08:57:17.429731 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-pv6qs\" (UniqueName: \"kubernetes.io/projected/ec64cc79-d75f-4f52-8ec4-22de2801736b-kube-api-access-pv6qs\") pod \"cinder-api-0\" (UID: \"ec64cc79-d75f-4f52-8ec4-22de2801736b\") " pod="openstack/cinder-api-0" Dec 16 08:57:17 crc kubenswrapper[4823]: I1216 08:57:17.429760 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ec64cc79-d75f-4f52-8ec4-22de2801736b-etc-machine-id\") pod \"cinder-api-0\" (UID: \"ec64cc79-d75f-4f52-8ec4-22de2801736b\") " pod="openstack/cinder-api-0" Dec 16 08:57:17 crc kubenswrapper[4823]: I1216 08:57:17.429807 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec64cc79-d75f-4f52-8ec4-22de2801736b-scripts\") pod \"cinder-api-0\" (UID: \"ec64cc79-d75f-4f52-8ec4-22de2801736b\") " pod="openstack/cinder-api-0" Dec 16 08:57:17 crc kubenswrapper[4823]: I1216 08:57:17.429827 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ec64cc79-d75f-4f52-8ec4-22de2801736b-config-data-custom\") pod \"cinder-api-0\" (UID: \"ec64cc79-d75f-4f52-8ec4-22de2801736b\") " pod="openstack/cinder-api-0" Dec 16 08:57:17 crc kubenswrapper[4823]: I1216 08:57:17.482788 4823 scope.go:117] "RemoveContainer" containerID="6a4c0372e9d2da0c1828bbb60b99735133fefafef74d36b56240b8133d0eb343" Dec 16 08:57:17 crc kubenswrapper[4823]: E1216 08:57:17.483464 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a4c0372e9d2da0c1828bbb60b99735133fefafef74d36b56240b8133d0eb343\": container with ID starting with 6a4c0372e9d2da0c1828bbb60b99735133fefafef74d36b56240b8133d0eb343 not found: ID does not exist" containerID="6a4c0372e9d2da0c1828bbb60b99735133fefafef74d36b56240b8133d0eb343" 
Dec 16 08:57:17 crc kubenswrapper[4823]: I1216 08:57:17.483517 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a4c0372e9d2da0c1828bbb60b99735133fefafef74d36b56240b8133d0eb343"} err="failed to get container status \"6a4c0372e9d2da0c1828bbb60b99735133fefafef74d36b56240b8133d0eb343\": rpc error: code = NotFound desc = could not find container \"6a4c0372e9d2da0c1828bbb60b99735133fefafef74d36b56240b8133d0eb343\": container with ID starting with 6a4c0372e9d2da0c1828bbb60b99735133fefafef74d36b56240b8133d0eb343 not found: ID does not exist" Dec 16 08:57:17 crc kubenswrapper[4823]: I1216 08:57:17.483551 4823 scope.go:117] "RemoveContainer" containerID="1b2b822b3bdbf16fc11848232ac0714e68bf564566dcb604a2c490d56710dc22" Dec 16 08:57:17 crc kubenswrapper[4823]: E1216 08:57:17.484260 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b2b822b3bdbf16fc11848232ac0714e68bf564566dcb604a2c490d56710dc22\": container with ID starting with 1b2b822b3bdbf16fc11848232ac0714e68bf564566dcb604a2c490d56710dc22 not found: ID does not exist" containerID="1b2b822b3bdbf16fc11848232ac0714e68bf564566dcb604a2c490d56710dc22" Dec 16 08:57:17 crc kubenswrapper[4823]: I1216 08:57:17.484282 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b2b822b3bdbf16fc11848232ac0714e68bf564566dcb604a2c490d56710dc22"} err="failed to get container status \"1b2b822b3bdbf16fc11848232ac0714e68bf564566dcb604a2c490d56710dc22\": rpc error: code = NotFound desc = could not find container \"1b2b822b3bdbf16fc11848232ac0714e68bf564566dcb604a2c490d56710dc22\": container with ID starting with 1b2b822b3bdbf16fc11848232ac0714e68bf564566dcb604a2c490d56710dc22 not found: ID does not exist" Dec 16 08:57:17 crc kubenswrapper[4823]: I1216 08:57:17.484303 4823 scope.go:117] "RemoveContainer" 
containerID="6a4c0372e9d2da0c1828bbb60b99735133fefafef74d36b56240b8133d0eb343" Dec 16 08:57:17 crc kubenswrapper[4823]: I1216 08:57:17.484623 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a4c0372e9d2da0c1828bbb60b99735133fefafef74d36b56240b8133d0eb343"} err="failed to get container status \"6a4c0372e9d2da0c1828bbb60b99735133fefafef74d36b56240b8133d0eb343\": rpc error: code = NotFound desc = could not find container \"6a4c0372e9d2da0c1828bbb60b99735133fefafef74d36b56240b8133d0eb343\": container with ID starting with 6a4c0372e9d2da0c1828bbb60b99735133fefafef74d36b56240b8133d0eb343 not found: ID does not exist" Dec 16 08:57:17 crc kubenswrapper[4823]: I1216 08:57:17.484645 4823 scope.go:117] "RemoveContainer" containerID="1b2b822b3bdbf16fc11848232ac0714e68bf564566dcb604a2c490d56710dc22" Dec 16 08:57:17 crc kubenswrapper[4823]: I1216 08:57:17.485165 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b2b822b3bdbf16fc11848232ac0714e68bf564566dcb604a2c490d56710dc22"} err="failed to get container status \"1b2b822b3bdbf16fc11848232ac0714e68bf564566dcb604a2c490d56710dc22\": rpc error: code = NotFound desc = could not find container \"1b2b822b3bdbf16fc11848232ac0714e68bf564566dcb604a2c490d56710dc22\": container with ID starting with 1b2b822b3bdbf16fc11848232ac0714e68bf564566dcb604a2c490d56710dc22 not found: ID does not exist" Dec 16 08:57:17 crc kubenswrapper[4823]: I1216 08:57:17.531018 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec64cc79-d75f-4f52-8ec4-22de2801736b-scripts\") pod \"cinder-api-0\" (UID: \"ec64cc79-d75f-4f52-8ec4-22de2801736b\") " pod="openstack/cinder-api-0" Dec 16 08:57:17 crc kubenswrapper[4823]: I1216 08:57:17.531083 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/ec64cc79-d75f-4f52-8ec4-22de2801736b-config-data-custom\") pod \"cinder-api-0\" (UID: \"ec64cc79-d75f-4f52-8ec4-22de2801736b\") " pod="openstack/cinder-api-0" Dec 16 08:57:17 crc kubenswrapper[4823]: I1216 08:57:17.531124 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ec64cc79-d75f-4f52-8ec4-22de2801736b-logs\") pod \"cinder-api-0\" (UID: \"ec64cc79-d75f-4f52-8ec4-22de2801736b\") " pod="openstack/cinder-api-0" Dec 16 08:57:17 crc kubenswrapper[4823]: I1216 08:57:17.531185 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec64cc79-d75f-4f52-8ec4-22de2801736b-config-data\") pod \"cinder-api-0\" (UID: \"ec64cc79-d75f-4f52-8ec4-22de2801736b\") " pod="openstack/cinder-api-0" Dec 16 08:57:17 crc kubenswrapper[4823]: I1216 08:57:17.531262 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec64cc79-d75f-4f52-8ec4-22de2801736b-public-tls-certs\") pod \"cinder-api-0\" (UID: \"ec64cc79-d75f-4f52-8ec4-22de2801736b\") " pod="openstack/cinder-api-0" Dec 16 08:57:17 crc kubenswrapper[4823]: I1216 08:57:17.531283 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec64cc79-d75f-4f52-8ec4-22de2801736b-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"ec64cc79-d75f-4f52-8ec4-22de2801736b\") " pod="openstack/cinder-api-0" Dec 16 08:57:17 crc kubenswrapper[4823]: I1216 08:57:17.531326 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec64cc79-d75f-4f52-8ec4-22de2801736b-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"ec64cc79-d75f-4f52-8ec4-22de2801736b\") " pod="openstack/cinder-api-0" Dec 16 08:57:17 crc kubenswrapper[4823]: I1216 
08:57:17.531365 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ec64cc79-d75f-4f52-8ec4-22de2801736b-etc-machine-id\") pod \"cinder-api-0\" (UID: \"ec64cc79-d75f-4f52-8ec4-22de2801736b\") " pod="openstack/cinder-api-0" Dec 16 08:57:17 crc kubenswrapper[4823]: I1216 08:57:17.531387 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pv6qs\" (UniqueName: \"kubernetes.io/projected/ec64cc79-d75f-4f52-8ec4-22de2801736b-kube-api-access-pv6qs\") pod \"cinder-api-0\" (UID: \"ec64cc79-d75f-4f52-8ec4-22de2801736b\") " pod="openstack/cinder-api-0" Dec 16 08:57:17 crc kubenswrapper[4823]: I1216 08:57:17.531647 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ec64cc79-d75f-4f52-8ec4-22de2801736b-etc-machine-id\") pod \"cinder-api-0\" (UID: \"ec64cc79-d75f-4f52-8ec4-22de2801736b\") " pod="openstack/cinder-api-0" Dec 16 08:57:17 crc kubenswrapper[4823]: I1216 08:57:17.531886 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ec64cc79-d75f-4f52-8ec4-22de2801736b-logs\") pod \"cinder-api-0\" (UID: \"ec64cc79-d75f-4f52-8ec4-22de2801736b\") " pod="openstack/cinder-api-0" Dec 16 08:57:17 crc kubenswrapper[4823]: I1216 08:57:17.535326 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec64cc79-d75f-4f52-8ec4-22de2801736b-scripts\") pod \"cinder-api-0\" (UID: \"ec64cc79-d75f-4f52-8ec4-22de2801736b\") " pod="openstack/cinder-api-0" Dec 16 08:57:17 crc kubenswrapper[4823]: I1216 08:57:17.535608 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec64cc79-d75f-4f52-8ec4-22de2801736b-public-tls-certs\") pod \"cinder-api-0\" (UID: 
\"ec64cc79-d75f-4f52-8ec4-22de2801736b\") " pod="openstack/cinder-api-0" Dec 16 08:57:17 crc kubenswrapper[4823]: I1216 08:57:17.536386 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec64cc79-d75f-4f52-8ec4-22de2801736b-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"ec64cc79-d75f-4f52-8ec4-22de2801736b\") " pod="openstack/cinder-api-0" Dec 16 08:57:17 crc kubenswrapper[4823]: I1216 08:57:17.537301 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec64cc79-d75f-4f52-8ec4-22de2801736b-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"ec64cc79-d75f-4f52-8ec4-22de2801736b\") " pod="openstack/cinder-api-0" Dec 16 08:57:17 crc kubenswrapper[4823]: I1216 08:57:17.537502 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec64cc79-d75f-4f52-8ec4-22de2801736b-config-data\") pod \"cinder-api-0\" (UID: \"ec64cc79-d75f-4f52-8ec4-22de2801736b\") " pod="openstack/cinder-api-0" Dec 16 08:57:17 crc kubenswrapper[4823]: I1216 08:57:17.538640 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ec64cc79-d75f-4f52-8ec4-22de2801736b-config-data-custom\") pod \"cinder-api-0\" (UID: \"ec64cc79-d75f-4f52-8ec4-22de2801736b\") " pod="openstack/cinder-api-0" Dec 16 08:57:17 crc kubenswrapper[4823]: I1216 08:57:17.548769 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pv6qs\" (UniqueName: \"kubernetes.io/projected/ec64cc79-d75f-4f52-8ec4-22de2801736b-kube-api-access-pv6qs\") pod \"cinder-api-0\" (UID: \"ec64cc79-d75f-4f52-8ec4-22de2801736b\") " pod="openstack/cinder-api-0" Dec 16 08:57:17 crc kubenswrapper[4823]: I1216 08:57:17.779602 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 16 08:57:17 crc kubenswrapper[4823]: I1216 08:57:17.782222 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad3f6275-71bd-4923-bd4d-b1f102f6569f" path="/var/lib/kubelet/pods/ad3f6275-71bd-4923-bd4d-b1f102f6569f/volumes" Dec 16 08:57:18 crc kubenswrapper[4823]: I1216 08:57:18.279085 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 16 08:57:18 crc kubenswrapper[4823]: W1216 08:57:18.281884 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podec64cc79_d75f_4f52_8ec4_22de2801736b.slice/crio-e1b4f64b72edb20892dc4152eae71b4419fb662959f16c94eb6283fce43e5f48 WatchSource:0}: Error finding container e1b4f64b72edb20892dc4152eae71b4419fb662959f16c94eb6283fce43e5f48: Status 404 returned error can't find the container with id e1b4f64b72edb20892dc4152eae71b4419fb662959f16c94eb6283fce43e5f48 Dec 16 08:57:18 crc kubenswrapper[4823]: I1216 08:57:18.328073 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ec64cc79-d75f-4f52-8ec4-22de2801736b","Type":"ContainerStarted","Data":"e1b4f64b72edb20892dc4152eae71b4419fb662959f16c94eb6283fce43e5f48"} Dec 16 08:57:19 crc kubenswrapper[4823]: I1216 08:57:19.340243 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ec64cc79-d75f-4f52-8ec4-22de2801736b","Type":"ContainerStarted","Data":"17dc5e4ad2ca29e77d42bc10be3131bbf5a9ee3e0ed5cf04677040d204bf384e"} Dec 16 08:57:20 crc kubenswrapper[4823]: I1216 08:57:20.349442 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ec64cc79-d75f-4f52-8ec4-22de2801736b","Type":"ContainerStarted","Data":"cf5839209453ac08804915e33ce6d089caafeddb0fc0615fadf3d8453a1a6749"} Dec 16 08:57:20 crc kubenswrapper[4823]: I1216 08:57:20.350602 4823 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/cinder-api-0" Dec 16 08:57:20 crc kubenswrapper[4823]: I1216 08:57:20.381228 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.381208193 podStartE2EDuration="3.381208193s" podCreationTimestamp="2025-12-16 08:57:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 08:57:20.367621317 +0000 UTC m=+7318.856187450" watchObservedRunningTime="2025-12-16 08:57:20.381208193 +0000 UTC m=+7318.869774316" Dec 16 08:57:22 crc kubenswrapper[4823]: I1216 08:57:22.379220 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-bf66f947-6bpwj" Dec 16 08:57:22 crc kubenswrapper[4823]: I1216 08:57:22.436994 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7785588999-7gvll"] Dec 16 08:57:22 crc kubenswrapper[4823]: I1216 08:57:22.437421 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7785588999-7gvll" podUID="3a9974ca-c280-4c37-9ca8-1f70128d8ea0" containerName="dnsmasq-dns" containerID="cri-o://d0f0770707ebd8a48231bd6e6dcfe5ed733a71c7d4e2e684e8feb19c9fa04b76" gracePeriod=10 Dec 16 08:57:23 crc kubenswrapper[4823]: I1216 08:57:23.078406 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7785588999-7gvll" Dec 16 08:57:23 crc kubenswrapper[4823]: I1216 08:57:23.244162 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3a9974ca-c280-4c37-9ca8-1f70128d8ea0-dns-svc\") pod \"3a9974ca-c280-4c37-9ca8-1f70128d8ea0\" (UID: \"3a9974ca-c280-4c37-9ca8-1f70128d8ea0\") " Dec 16 08:57:23 crc kubenswrapper[4823]: I1216 08:57:23.244304 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4r27c\" (UniqueName: \"kubernetes.io/projected/3a9974ca-c280-4c37-9ca8-1f70128d8ea0-kube-api-access-4r27c\") pod \"3a9974ca-c280-4c37-9ca8-1f70128d8ea0\" (UID: \"3a9974ca-c280-4c37-9ca8-1f70128d8ea0\") " Dec 16 08:57:23 crc kubenswrapper[4823]: I1216 08:57:23.244369 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3a9974ca-c280-4c37-9ca8-1f70128d8ea0-ovsdbserver-nb\") pod \"3a9974ca-c280-4c37-9ca8-1f70128d8ea0\" (UID: \"3a9974ca-c280-4c37-9ca8-1f70128d8ea0\") " Dec 16 08:57:23 crc kubenswrapper[4823]: I1216 08:57:23.244459 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3a9974ca-c280-4c37-9ca8-1f70128d8ea0-ovsdbserver-sb\") pod \"3a9974ca-c280-4c37-9ca8-1f70128d8ea0\" (UID: \"3a9974ca-c280-4c37-9ca8-1f70128d8ea0\") " Dec 16 08:57:23 crc kubenswrapper[4823]: I1216 08:57:23.244609 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a9974ca-c280-4c37-9ca8-1f70128d8ea0-config\") pod \"3a9974ca-c280-4c37-9ca8-1f70128d8ea0\" (UID: \"3a9974ca-c280-4c37-9ca8-1f70128d8ea0\") " Dec 16 08:57:23 crc kubenswrapper[4823]: I1216 08:57:23.251265 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/3a9974ca-c280-4c37-9ca8-1f70128d8ea0-kube-api-access-4r27c" (OuterVolumeSpecName: "kube-api-access-4r27c") pod "3a9974ca-c280-4c37-9ca8-1f70128d8ea0" (UID: "3a9974ca-c280-4c37-9ca8-1f70128d8ea0"). InnerVolumeSpecName "kube-api-access-4r27c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:57:23 crc kubenswrapper[4823]: I1216 08:57:23.290147 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a9974ca-c280-4c37-9ca8-1f70128d8ea0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3a9974ca-c280-4c37-9ca8-1f70128d8ea0" (UID: "3a9974ca-c280-4c37-9ca8-1f70128d8ea0"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:57:23 crc kubenswrapper[4823]: I1216 08:57:23.292217 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a9974ca-c280-4c37-9ca8-1f70128d8ea0-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3a9974ca-c280-4c37-9ca8-1f70128d8ea0" (UID: "3a9974ca-c280-4c37-9ca8-1f70128d8ea0"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:57:23 crc kubenswrapper[4823]: I1216 08:57:23.295317 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a9974ca-c280-4c37-9ca8-1f70128d8ea0-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3a9974ca-c280-4c37-9ca8-1f70128d8ea0" (UID: "3a9974ca-c280-4c37-9ca8-1f70128d8ea0"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:57:23 crc kubenswrapper[4823]: I1216 08:57:23.296314 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a9974ca-c280-4c37-9ca8-1f70128d8ea0-config" (OuterVolumeSpecName: "config") pod "3a9974ca-c280-4c37-9ca8-1f70128d8ea0" (UID: "3a9974ca-c280-4c37-9ca8-1f70128d8ea0"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:57:23 crc kubenswrapper[4823]: I1216 08:57:23.346962 4823 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a9974ca-c280-4c37-9ca8-1f70128d8ea0-config\") on node \"crc\" DevicePath \"\"" Dec 16 08:57:23 crc kubenswrapper[4823]: I1216 08:57:23.347002 4823 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3a9974ca-c280-4c37-9ca8-1f70128d8ea0-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 16 08:57:23 crc kubenswrapper[4823]: I1216 08:57:23.347062 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4r27c\" (UniqueName: \"kubernetes.io/projected/3a9974ca-c280-4c37-9ca8-1f70128d8ea0-kube-api-access-4r27c\") on node \"crc\" DevicePath \"\"" Dec 16 08:57:23 crc kubenswrapper[4823]: I1216 08:57:23.347076 4823 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3a9974ca-c280-4c37-9ca8-1f70128d8ea0-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 16 08:57:23 crc kubenswrapper[4823]: I1216 08:57:23.347084 4823 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3a9974ca-c280-4c37-9ca8-1f70128d8ea0-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 16 08:57:23 crc kubenswrapper[4823]: I1216 08:57:23.374841 4823 generic.go:334] "Generic (PLEG): container finished" podID="3a9974ca-c280-4c37-9ca8-1f70128d8ea0" containerID="d0f0770707ebd8a48231bd6e6dcfe5ed733a71c7d4e2e684e8feb19c9fa04b76" exitCode=0 Dec 16 08:57:23 crc kubenswrapper[4823]: I1216 08:57:23.374889 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7785588999-7gvll" Dec 16 08:57:23 crc kubenswrapper[4823]: I1216 08:57:23.374890 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7785588999-7gvll" event={"ID":"3a9974ca-c280-4c37-9ca8-1f70128d8ea0","Type":"ContainerDied","Data":"d0f0770707ebd8a48231bd6e6dcfe5ed733a71c7d4e2e684e8feb19c9fa04b76"} Dec 16 08:57:23 crc kubenswrapper[4823]: I1216 08:57:23.375014 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7785588999-7gvll" event={"ID":"3a9974ca-c280-4c37-9ca8-1f70128d8ea0","Type":"ContainerDied","Data":"71ac8a177178141e90ccd769af19825d487e308f6aa32117a9c04b942152cbe1"} Dec 16 08:57:23 crc kubenswrapper[4823]: I1216 08:57:23.375073 4823 scope.go:117] "RemoveContainer" containerID="d0f0770707ebd8a48231bd6e6dcfe5ed733a71c7d4e2e684e8feb19c9fa04b76" Dec 16 08:57:23 crc kubenswrapper[4823]: I1216 08:57:23.402477 4823 scope.go:117] "RemoveContainer" containerID="4e32be7c834f94d9e93508e5a9ab9f3e7ab42901c1a546af535206c8d8a41a9f" Dec 16 08:57:23 crc kubenswrapper[4823]: I1216 08:57:23.435223 4823 scope.go:117] "RemoveContainer" containerID="d0f0770707ebd8a48231bd6e6dcfe5ed733a71c7d4e2e684e8feb19c9fa04b76" Dec 16 08:57:23 crc kubenswrapper[4823]: E1216 08:57:23.436217 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d0f0770707ebd8a48231bd6e6dcfe5ed733a71c7d4e2e684e8feb19c9fa04b76\": container with ID starting with d0f0770707ebd8a48231bd6e6dcfe5ed733a71c7d4e2e684e8feb19c9fa04b76 not found: ID does not exist" containerID="d0f0770707ebd8a48231bd6e6dcfe5ed733a71c7d4e2e684e8feb19c9fa04b76" Dec 16 08:57:23 crc kubenswrapper[4823]: I1216 08:57:23.436265 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0f0770707ebd8a48231bd6e6dcfe5ed733a71c7d4e2e684e8feb19c9fa04b76"} err="failed to get container status 
\"d0f0770707ebd8a48231bd6e6dcfe5ed733a71c7d4e2e684e8feb19c9fa04b76\": rpc error: code = NotFound desc = could not find container \"d0f0770707ebd8a48231bd6e6dcfe5ed733a71c7d4e2e684e8feb19c9fa04b76\": container with ID starting with d0f0770707ebd8a48231bd6e6dcfe5ed733a71c7d4e2e684e8feb19c9fa04b76 not found: ID does not exist" Dec 16 08:57:23 crc kubenswrapper[4823]: I1216 08:57:23.436294 4823 scope.go:117] "RemoveContainer" containerID="4e32be7c834f94d9e93508e5a9ab9f3e7ab42901c1a546af535206c8d8a41a9f" Dec 16 08:57:23 crc kubenswrapper[4823]: E1216 08:57:23.440296 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e32be7c834f94d9e93508e5a9ab9f3e7ab42901c1a546af535206c8d8a41a9f\": container with ID starting with 4e32be7c834f94d9e93508e5a9ab9f3e7ab42901c1a546af535206c8d8a41a9f not found: ID does not exist" containerID="4e32be7c834f94d9e93508e5a9ab9f3e7ab42901c1a546af535206c8d8a41a9f" Dec 16 08:57:23 crc kubenswrapper[4823]: I1216 08:57:23.440348 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e32be7c834f94d9e93508e5a9ab9f3e7ab42901c1a546af535206c8d8a41a9f"} err="failed to get container status \"4e32be7c834f94d9e93508e5a9ab9f3e7ab42901c1a546af535206c8d8a41a9f\": rpc error: code = NotFound desc = could not find container \"4e32be7c834f94d9e93508e5a9ab9f3e7ab42901c1a546af535206c8d8a41a9f\": container with ID starting with 4e32be7c834f94d9e93508e5a9ab9f3e7ab42901c1a546af535206c8d8a41a9f not found: ID does not exist" Dec 16 08:57:23 crc kubenswrapper[4823]: I1216 08:57:23.442656 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7785588999-7gvll"] Dec 16 08:57:23 crc kubenswrapper[4823]: I1216 08:57:23.451490 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7785588999-7gvll"] Dec 16 08:57:23 crc kubenswrapper[4823]: I1216 08:57:23.781511 4823 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="3a9974ca-c280-4c37-9ca8-1f70128d8ea0" path="/var/lib/kubelet/pods/3a9974ca-c280-4c37-9ca8-1f70128d8ea0/volumes" Dec 16 08:57:29 crc kubenswrapper[4823]: I1216 08:57:29.683644 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Dec 16 08:57:29 crc kubenswrapper[4823]: I1216 08:57:29.774292 4823 scope.go:117] "RemoveContainer" containerID="9ce3e6cc66a3ba1f5a9f07614bbf78a449581b45707f8e1e5d9794f67e5c0428" Dec 16 08:57:29 crc kubenswrapper[4823]: E1216 08:57:29.774647 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 08:57:44 crc kubenswrapper[4823]: I1216 08:57:44.771966 4823 scope.go:117] "RemoveContainer" containerID="9ce3e6cc66a3ba1f5a9f07614bbf78a449581b45707f8e1e5d9794f67e5c0428" Dec 16 08:57:44 crc kubenswrapper[4823]: E1216 08:57:44.772946 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 08:57:49 crc kubenswrapper[4823]: I1216 08:57:49.562143 4823 scope.go:117] "RemoveContainer" containerID="68ec7a465b0ad6b461f30e4849b6ddb746160aec1ea230537f172432c1a9d391" Dec 16 08:57:50 crc kubenswrapper[4823]: I1216 08:57:50.803934 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 16 
08:57:50 crc kubenswrapper[4823]: E1216 08:57:50.804372 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a9974ca-c280-4c37-9ca8-1f70128d8ea0" containerName="dnsmasq-dns" Dec 16 08:57:50 crc kubenswrapper[4823]: I1216 08:57:50.804385 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a9974ca-c280-4c37-9ca8-1f70128d8ea0" containerName="dnsmasq-dns" Dec 16 08:57:50 crc kubenswrapper[4823]: E1216 08:57:50.804401 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a9974ca-c280-4c37-9ca8-1f70128d8ea0" containerName="init" Dec 16 08:57:50 crc kubenswrapper[4823]: I1216 08:57:50.804409 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a9974ca-c280-4c37-9ca8-1f70128d8ea0" containerName="init" Dec 16 08:57:50 crc kubenswrapper[4823]: I1216 08:57:50.804575 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a9974ca-c280-4c37-9ca8-1f70128d8ea0" containerName="dnsmasq-dns" Dec 16 08:57:50 crc kubenswrapper[4823]: I1216 08:57:50.805502 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 16 08:57:50 crc kubenswrapper[4823]: I1216 08:57:50.808402 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 16 08:57:50 crc kubenswrapper[4823]: I1216 08:57:50.815814 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 16 08:57:50 crc kubenswrapper[4823]: I1216 08:57:50.864376 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/503d32be-ac63-4469-ae84-4803a0e6b9fc-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"503d32be-ac63-4469-ae84-4803a0e6b9fc\") " pod="openstack/cinder-scheduler-0" Dec 16 08:57:50 crc kubenswrapper[4823]: I1216 08:57:50.864469 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vw8d\" (UniqueName: \"kubernetes.io/projected/503d32be-ac63-4469-ae84-4803a0e6b9fc-kube-api-access-5vw8d\") pod \"cinder-scheduler-0\" (UID: \"503d32be-ac63-4469-ae84-4803a0e6b9fc\") " pod="openstack/cinder-scheduler-0" Dec 16 08:57:50 crc kubenswrapper[4823]: I1216 08:57:50.864567 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/503d32be-ac63-4469-ae84-4803a0e6b9fc-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"503d32be-ac63-4469-ae84-4803a0e6b9fc\") " pod="openstack/cinder-scheduler-0" Dec 16 08:57:50 crc kubenswrapper[4823]: I1216 08:57:50.864620 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/503d32be-ac63-4469-ae84-4803a0e6b9fc-config-data\") pod \"cinder-scheduler-0\" (UID: \"503d32be-ac63-4469-ae84-4803a0e6b9fc\") " pod="openstack/cinder-scheduler-0" Dec 16 08:57:50 crc kubenswrapper[4823]: 
I1216 08:57:50.864691 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/503d32be-ac63-4469-ae84-4803a0e6b9fc-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"503d32be-ac63-4469-ae84-4803a0e6b9fc\") " pod="openstack/cinder-scheduler-0" Dec 16 08:57:50 crc kubenswrapper[4823]: I1216 08:57:50.864758 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/503d32be-ac63-4469-ae84-4803a0e6b9fc-scripts\") pod \"cinder-scheduler-0\" (UID: \"503d32be-ac63-4469-ae84-4803a0e6b9fc\") " pod="openstack/cinder-scheduler-0" Dec 16 08:57:50 crc kubenswrapper[4823]: I1216 08:57:50.966221 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/503d32be-ac63-4469-ae84-4803a0e6b9fc-scripts\") pod \"cinder-scheduler-0\" (UID: \"503d32be-ac63-4469-ae84-4803a0e6b9fc\") " pod="openstack/cinder-scheduler-0" Dec 16 08:57:50 crc kubenswrapper[4823]: I1216 08:57:50.966311 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/503d32be-ac63-4469-ae84-4803a0e6b9fc-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"503d32be-ac63-4469-ae84-4803a0e6b9fc\") " pod="openstack/cinder-scheduler-0" Dec 16 08:57:50 crc kubenswrapper[4823]: I1216 08:57:50.966383 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vw8d\" (UniqueName: \"kubernetes.io/projected/503d32be-ac63-4469-ae84-4803a0e6b9fc-kube-api-access-5vw8d\") pod \"cinder-scheduler-0\" (UID: \"503d32be-ac63-4469-ae84-4803a0e6b9fc\") " pod="openstack/cinder-scheduler-0" Dec 16 08:57:50 crc kubenswrapper[4823]: I1216 08:57:50.966432 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" 
(UniqueName: \"kubernetes.io/secret/503d32be-ac63-4469-ae84-4803a0e6b9fc-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"503d32be-ac63-4469-ae84-4803a0e6b9fc\") " pod="openstack/cinder-scheduler-0" Dec 16 08:57:50 crc kubenswrapper[4823]: I1216 08:57:50.966477 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/503d32be-ac63-4469-ae84-4803a0e6b9fc-config-data\") pod \"cinder-scheduler-0\" (UID: \"503d32be-ac63-4469-ae84-4803a0e6b9fc\") " pod="openstack/cinder-scheduler-0" Dec 16 08:57:50 crc kubenswrapper[4823]: I1216 08:57:50.966526 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/503d32be-ac63-4469-ae84-4803a0e6b9fc-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"503d32be-ac63-4469-ae84-4803a0e6b9fc\") " pod="openstack/cinder-scheduler-0" Dec 16 08:57:50 crc kubenswrapper[4823]: I1216 08:57:50.966648 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/503d32be-ac63-4469-ae84-4803a0e6b9fc-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"503d32be-ac63-4469-ae84-4803a0e6b9fc\") " pod="openstack/cinder-scheduler-0" Dec 16 08:57:50 crc kubenswrapper[4823]: I1216 08:57:50.972510 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/503d32be-ac63-4469-ae84-4803a0e6b9fc-scripts\") pod \"cinder-scheduler-0\" (UID: \"503d32be-ac63-4469-ae84-4803a0e6b9fc\") " pod="openstack/cinder-scheduler-0" Dec 16 08:57:50 crc kubenswrapper[4823]: I1216 08:57:50.972887 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/503d32be-ac63-4469-ae84-4803a0e6b9fc-config-data\") pod \"cinder-scheduler-0\" (UID: \"503d32be-ac63-4469-ae84-4803a0e6b9fc\") " pod="openstack/cinder-scheduler-0" Dec 16 
08:57:50 crc kubenswrapper[4823]: I1216 08:57:50.973431 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/503d32be-ac63-4469-ae84-4803a0e6b9fc-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"503d32be-ac63-4469-ae84-4803a0e6b9fc\") " pod="openstack/cinder-scheduler-0" Dec 16 08:57:50 crc kubenswrapper[4823]: I1216 08:57:50.974078 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/503d32be-ac63-4469-ae84-4803a0e6b9fc-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"503d32be-ac63-4469-ae84-4803a0e6b9fc\") " pod="openstack/cinder-scheduler-0" Dec 16 08:57:50 crc kubenswrapper[4823]: I1216 08:57:50.986241 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vw8d\" (UniqueName: \"kubernetes.io/projected/503d32be-ac63-4469-ae84-4803a0e6b9fc-kube-api-access-5vw8d\") pod \"cinder-scheduler-0\" (UID: \"503d32be-ac63-4469-ae84-4803a0e6b9fc\") " pod="openstack/cinder-scheduler-0" Dec 16 08:57:51 crc kubenswrapper[4823]: I1216 08:57:51.127828 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 16 08:57:51 crc kubenswrapper[4823]: I1216 08:57:51.602900 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 16 08:57:51 crc kubenswrapper[4823]: I1216 08:57:51.643110 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"503d32be-ac63-4469-ae84-4803a0e6b9fc","Type":"ContainerStarted","Data":"d76cbf9dee88365aaa006bf1d69fd2031400a5abc3fcc843fbe11206f58163d0"} Dec 16 08:57:52 crc kubenswrapper[4823]: I1216 08:57:52.416560 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 16 08:57:52 crc kubenswrapper[4823]: I1216 08:57:52.416821 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="ec64cc79-d75f-4f52-8ec4-22de2801736b" containerName="cinder-api-log" containerID="cri-o://17dc5e4ad2ca29e77d42bc10be3131bbf5a9ee3e0ed5cf04677040d204bf384e" gracePeriod=30 Dec 16 08:57:52 crc kubenswrapper[4823]: I1216 08:57:52.416857 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="ec64cc79-d75f-4f52-8ec4-22de2801736b" containerName="cinder-api" containerID="cri-o://cf5839209453ac08804915e33ce6d089caafeddb0fc0615fadf3d8453a1a6749" gracePeriod=30 Dec 16 08:57:52 crc kubenswrapper[4823]: I1216 08:57:52.667652 4823 generic.go:334] "Generic (PLEG): container finished" podID="ec64cc79-d75f-4f52-8ec4-22de2801736b" containerID="17dc5e4ad2ca29e77d42bc10be3131bbf5a9ee3e0ed5cf04677040d204bf384e" exitCode=143 Dec 16 08:57:52 crc kubenswrapper[4823]: I1216 08:57:52.667743 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ec64cc79-d75f-4f52-8ec4-22de2801736b","Type":"ContainerDied","Data":"17dc5e4ad2ca29e77d42bc10be3131bbf5a9ee3e0ed5cf04677040d204bf384e"} Dec 16 08:57:53 crc kubenswrapper[4823]: I1216 08:57:53.677081 4823 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"503d32be-ac63-4469-ae84-4803a0e6b9fc","Type":"ContainerStarted","Data":"39dba9ada4c571c4754f4ea9c1d613cc0c6f78e74689ca4e74bb70eeacf19e65"} Dec 16 08:57:53 crc kubenswrapper[4823]: I1216 08:57:53.677407 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"503d32be-ac63-4469-ae84-4803a0e6b9fc","Type":"ContainerStarted","Data":"4d8b8a826aae25bead70daf7c5adce54529e6f05b20874c9f77b8d98d9e28f3d"} Dec 16 08:57:53 crc kubenswrapper[4823]: I1216 08:57:53.714443 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.328925549 podStartE2EDuration="3.714421376s" podCreationTimestamp="2025-12-16 08:57:50 +0000 UTC" firstStartedPulling="2025-12-16 08:57:51.606923221 +0000 UTC m=+7350.095489344" lastFinishedPulling="2025-12-16 08:57:51.992419018 +0000 UTC m=+7350.480985171" observedRunningTime="2025-12-16 08:57:53.70847885 +0000 UTC m=+7352.197044973" watchObservedRunningTime="2025-12-16 08:57:53.714421376 +0000 UTC m=+7352.202987499" Dec 16 08:57:55 crc kubenswrapper[4823]: I1216 08:57:55.585542 4823 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="ec64cc79-d75f-4f52-8ec4-22de2801736b" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.1.58:8776/healthcheck\": read tcp 10.217.0.2:45084->10.217.1.58:8776: read: connection reset by peer" Dec 16 08:57:55 crc kubenswrapper[4823]: I1216 08:57:55.697561 4823 generic.go:334] "Generic (PLEG): container finished" podID="ec64cc79-d75f-4f52-8ec4-22de2801736b" containerID="cf5839209453ac08804915e33ce6d089caafeddb0fc0615fadf3d8453a1a6749" exitCode=0 Dec 16 08:57:55 crc kubenswrapper[4823]: I1216 08:57:55.697609 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"ec64cc79-d75f-4f52-8ec4-22de2801736b","Type":"ContainerDied","Data":"cf5839209453ac08804915e33ce6d089caafeddb0fc0615fadf3d8453a1a6749"} Dec 16 08:57:56 crc kubenswrapper[4823]: I1216 08:57:56.034405 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 16 08:57:56 crc kubenswrapper[4823]: I1216 08:57:56.128504 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 16 08:57:56 crc kubenswrapper[4823]: I1216 08:57:56.174194 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec64cc79-d75f-4f52-8ec4-22de2801736b-scripts\") pod \"ec64cc79-d75f-4f52-8ec4-22de2801736b\" (UID: \"ec64cc79-d75f-4f52-8ec4-22de2801736b\") " Dec 16 08:57:56 crc kubenswrapper[4823]: I1216 08:57:56.174285 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec64cc79-d75f-4f52-8ec4-22de2801736b-internal-tls-certs\") pod \"ec64cc79-d75f-4f52-8ec4-22de2801736b\" (UID: \"ec64cc79-d75f-4f52-8ec4-22de2801736b\") " Dec 16 08:57:56 crc kubenswrapper[4823]: I1216 08:57:56.174363 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec64cc79-d75f-4f52-8ec4-22de2801736b-public-tls-certs\") pod \"ec64cc79-d75f-4f52-8ec4-22de2801736b\" (UID: \"ec64cc79-d75f-4f52-8ec4-22de2801736b\") " Dec 16 08:57:56 crc kubenswrapper[4823]: I1216 08:57:56.174399 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ec64cc79-d75f-4f52-8ec4-22de2801736b-logs\") pod \"ec64cc79-d75f-4f52-8ec4-22de2801736b\" (UID: \"ec64cc79-d75f-4f52-8ec4-22de2801736b\") " Dec 16 08:57:56 crc kubenswrapper[4823]: I1216 08:57:56.174459 4823 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ec64cc79-d75f-4f52-8ec4-22de2801736b-etc-machine-id\") pod \"ec64cc79-d75f-4f52-8ec4-22de2801736b\" (UID: \"ec64cc79-d75f-4f52-8ec4-22de2801736b\") " Dec 16 08:57:56 crc kubenswrapper[4823]: I1216 08:57:56.174482 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pv6qs\" (UniqueName: \"kubernetes.io/projected/ec64cc79-d75f-4f52-8ec4-22de2801736b-kube-api-access-pv6qs\") pod \"ec64cc79-d75f-4f52-8ec4-22de2801736b\" (UID: \"ec64cc79-d75f-4f52-8ec4-22de2801736b\") " Dec 16 08:57:56 crc kubenswrapper[4823]: I1216 08:57:56.174561 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ec64cc79-d75f-4f52-8ec4-22de2801736b-config-data-custom\") pod \"ec64cc79-d75f-4f52-8ec4-22de2801736b\" (UID: \"ec64cc79-d75f-4f52-8ec4-22de2801736b\") " Dec 16 08:57:56 crc kubenswrapper[4823]: I1216 08:57:56.174598 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec64cc79-d75f-4f52-8ec4-22de2801736b-config-data\") pod \"ec64cc79-d75f-4f52-8ec4-22de2801736b\" (UID: \"ec64cc79-d75f-4f52-8ec4-22de2801736b\") " Dec 16 08:57:56 crc kubenswrapper[4823]: I1216 08:57:56.174621 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec64cc79-d75f-4f52-8ec4-22de2801736b-combined-ca-bundle\") pod \"ec64cc79-d75f-4f52-8ec4-22de2801736b\" (UID: \"ec64cc79-d75f-4f52-8ec4-22de2801736b\") " Dec 16 08:57:56 crc kubenswrapper[4823]: I1216 08:57:56.174927 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ec64cc79-d75f-4f52-8ec4-22de2801736b-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "ec64cc79-d75f-4f52-8ec4-22de2801736b" (UID: 
"ec64cc79-d75f-4f52-8ec4-22de2801736b"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 08:57:56 crc kubenswrapper[4823]: I1216 08:57:56.175578 4823 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ec64cc79-d75f-4f52-8ec4-22de2801736b-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 16 08:57:56 crc kubenswrapper[4823]: I1216 08:57:56.178374 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ec64cc79-d75f-4f52-8ec4-22de2801736b-logs" (OuterVolumeSpecName: "logs") pod "ec64cc79-d75f-4f52-8ec4-22de2801736b" (UID: "ec64cc79-d75f-4f52-8ec4-22de2801736b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 08:57:56 crc kubenswrapper[4823]: I1216 08:57:56.182187 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec64cc79-d75f-4f52-8ec4-22de2801736b-kube-api-access-pv6qs" (OuterVolumeSpecName: "kube-api-access-pv6qs") pod "ec64cc79-d75f-4f52-8ec4-22de2801736b" (UID: "ec64cc79-d75f-4f52-8ec4-22de2801736b"). InnerVolumeSpecName "kube-api-access-pv6qs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:57:56 crc kubenswrapper[4823]: I1216 08:57:56.184550 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec64cc79-d75f-4f52-8ec4-22de2801736b-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "ec64cc79-d75f-4f52-8ec4-22de2801736b" (UID: "ec64cc79-d75f-4f52-8ec4-22de2801736b"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:57:56 crc kubenswrapper[4823]: I1216 08:57:56.187449 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec64cc79-d75f-4f52-8ec4-22de2801736b-scripts" (OuterVolumeSpecName: "scripts") pod "ec64cc79-d75f-4f52-8ec4-22de2801736b" (UID: "ec64cc79-d75f-4f52-8ec4-22de2801736b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:57:56 crc kubenswrapper[4823]: I1216 08:57:56.230468 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec64cc79-d75f-4f52-8ec4-22de2801736b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ec64cc79-d75f-4f52-8ec4-22de2801736b" (UID: "ec64cc79-d75f-4f52-8ec4-22de2801736b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:57:56 crc kubenswrapper[4823]: I1216 08:57:56.234359 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec64cc79-d75f-4f52-8ec4-22de2801736b-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "ec64cc79-d75f-4f52-8ec4-22de2801736b" (UID: "ec64cc79-d75f-4f52-8ec4-22de2801736b"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:57:56 crc kubenswrapper[4823]: I1216 08:57:56.238490 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec64cc79-d75f-4f52-8ec4-22de2801736b-config-data" (OuterVolumeSpecName: "config-data") pod "ec64cc79-d75f-4f52-8ec4-22de2801736b" (UID: "ec64cc79-d75f-4f52-8ec4-22de2801736b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:57:56 crc kubenswrapper[4823]: I1216 08:57:56.238645 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec64cc79-d75f-4f52-8ec4-22de2801736b-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "ec64cc79-d75f-4f52-8ec4-22de2801736b" (UID: "ec64cc79-d75f-4f52-8ec4-22de2801736b"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:57:56 crc kubenswrapper[4823]: I1216 08:57:56.277252 4823 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec64cc79-d75f-4f52-8ec4-22de2801736b-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 08:57:56 crc kubenswrapper[4823]: I1216 08:57:56.277284 4823 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec64cc79-d75f-4f52-8ec4-22de2801736b-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 08:57:56 crc kubenswrapper[4823]: I1216 08:57:56.277296 4823 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec64cc79-d75f-4f52-8ec4-22de2801736b-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 08:57:56 crc kubenswrapper[4823]: I1216 08:57:56.277307 4823 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ec64cc79-d75f-4f52-8ec4-22de2801736b-logs\") on node \"crc\" DevicePath \"\"" Dec 16 08:57:56 crc kubenswrapper[4823]: I1216 08:57:56.277316 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pv6qs\" (UniqueName: \"kubernetes.io/projected/ec64cc79-d75f-4f52-8ec4-22de2801736b-kube-api-access-pv6qs\") on node \"crc\" DevicePath \"\"" Dec 16 08:57:56 crc kubenswrapper[4823]: I1216 08:57:56.277324 4823 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/ec64cc79-d75f-4f52-8ec4-22de2801736b-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 16 08:57:56 crc kubenswrapper[4823]: I1216 08:57:56.277333 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec64cc79-d75f-4f52-8ec4-22de2801736b-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 08:57:56 crc kubenswrapper[4823]: I1216 08:57:56.277341 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec64cc79-d75f-4f52-8ec4-22de2801736b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 08:57:56 crc kubenswrapper[4823]: I1216 08:57:56.705587 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ec64cc79-d75f-4f52-8ec4-22de2801736b","Type":"ContainerDied","Data":"e1b4f64b72edb20892dc4152eae71b4419fb662959f16c94eb6283fce43e5f48"} Dec 16 08:57:56 crc kubenswrapper[4823]: I1216 08:57:56.705635 4823 scope.go:117] "RemoveContainer" containerID="cf5839209453ac08804915e33ce6d089caafeddb0fc0615fadf3d8453a1a6749" Dec 16 08:57:56 crc kubenswrapper[4823]: I1216 08:57:56.705723 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 16 08:57:56 crc kubenswrapper[4823]: I1216 08:57:56.748287 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 16 08:57:56 crc kubenswrapper[4823]: I1216 08:57:56.748656 4823 scope.go:117] "RemoveContainer" containerID="17dc5e4ad2ca29e77d42bc10be3131bbf5a9ee3e0ed5cf04677040d204bf384e" Dec 16 08:57:56 crc kubenswrapper[4823]: I1216 08:57:56.768710 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Dec 16 08:57:56 crc kubenswrapper[4823]: I1216 08:57:56.772133 4823 scope.go:117] "RemoveContainer" containerID="9ce3e6cc66a3ba1f5a9f07614bbf78a449581b45707f8e1e5d9794f67e5c0428" Dec 16 08:57:56 crc kubenswrapper[4823]: E1216 08:57:56.772414 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 08:57:56 crc kubenswrapper[4823]: I1216 08:57:56.782715 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 16 08:57:56 crc kubenswrapper[4823]: E1216 08:57:56.783189 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec64cc79-d75f-4f52-8ec4-22de2801736b" containerName="cinder-api-log" Dec 16 08:57:56 crc kubenswrapper[4823]: I1216 08:57:56.783210 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec64cc79-d75f-4f52-8ec4-22de2801736b" containerName="cinder-api-log" Dec 16 08:57:56 crc kubenswrapper[4823]: E1216 08:57:56.783233 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec64cc79-d75f-4f52-8ec4-22de2801736b" containerName="cinder-api" Dec 16 08:57:56 crc kubenswrapper[4823]: I1216 
08:57:56.783240 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec64cc79-d75f-4f52-8ec4-22de2801736b" containerName="cinder-api" Dec 16 08:57:56 crc kubenswrapper[4823]: I1216 08:57:56.783477 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec64cc79-d75f-4f52-8ec4-22de2801736b" containerName="cinder-api" Dec 16 08:57:56 crc kubenswrapper[4823]: I1216 08:57:56.783498 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec64cc79-d75f-4f52-8ec4-22de2801736b" containerName="cinder-api-log" Dec 16 08:57:56 crc kubenswrapper[4823]: I1216 08:57:56.784614 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 16 08:57:56 crc kubenswrapper[4823]: I1216 08:57:56.791665 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 16 08:57:56 crc kubenswrapper[4823]: I1216 08:57:56.811975 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Dec 16 08:57:56 crc kubenswrapper[4823]: I1216 08:57:56.812529 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 16 08:57:56 crc kubenswrapper[4823]: I1216 08:57:56.812674 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Dec 16 08:57:56 crc kubenswrapper[4823]: I1216 08:57:56.887250 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91f5097e-d643-4598-9d06-39f14f913291-logs\") pod \"cinder-api-0\" (UID: \"91f5097e-d643-4598-9d06-39f14f913291\") " pod="openstack/cinder-api-0" Dec 16 08:57:56 crc kubenswrapper[4823]: I1216 08:57:56.887473 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/91f5097e-d643-4598-9d06-39f14f913291-public-tls-certs\") pod 
\"cinder-api-0\" (UID: \"91f5097e-d643-4598-9d06-39f14f913291\") " pod="openstack/cinder-api-0" Dec 16 08:57:56 crc kubenswrapper[4823]: I1216 08:57:56.887534 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4mvb\" (UniqueName: \"kubernetes.io/projected/91f5097e-d643-4598-9d06-39f14f913291-kube-api-access-w4mvb\") pod \"cinder-api-0\" (UID: \"91f5097e-d643-4598-9d06-39f14f913291\") " pod="openstack/cinder-api-0" Dec 16 08:57:56 crc kubenswrapper[4823]: I1216 08:57:56.887579 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/91f5097e-d643-4598-9d06-39f14f913291-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"91f5097e-d643-4598-9d06-39f14f913291\") " pod="openstack/cinder-api-0" Dec 16 08:57:56 crc kubenswrapper[4823]: I1216 08:57:56.887607 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91f5097e-d643-4598-9d06-39f14f913291-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"91f5097e-d643-4598-9d06-39f14f913291\") " pod="openstack/cinder-api-0" Dec 16 08:57:56 crc kubenswrapper[4823]: I1216 08:57:56.887637 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/91f5097e-d643-4598-9d06-39f14f913291-config-data-custom\") pod \"cinder-api-0\" (UID: \"91f5097e-d643-4598-9d06-39f14f913291\") " pod="openstack/cinder-api-0" Dec 16 08:57:56 crc kubenswrapper[4823]: I1216 08:57:56.887680 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/91f5097e-d643-4598-9d06-39f14f913291-etc-machine-id\") pod \"cinder-api-0\" (UID: \"91f5097e-d643-4598-9d06-39f14f913291\") " 
pod="openstack/cinder-api-0" Dec 16 08:57:56 crc kubenswrapper[4823]: I1216 08:57:56.887748 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91f5097e-d643-4598-9d06-39f14f913291-config-data\") pod \"cinder-api-0\" (UID: \"91f5097e-d643-4598-9d06-39f14f913291\") " pod="openstack/cinder-api-0" Dec 16 08:57:56 crc kubenswrapper[4823]: I1216 08:57:56.887794 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/91f5097e-d643-4598-9d06-39f14f913291-scripts\") pod \"cinder-api-0\" (UID: \"91f5097e-d643-4598-9d06-39f14f913291\") " pod="openstack/cinder-api-0" Dec 16 08:57:56 crc kubenswrapper[4823]: I1216 08:57:56.990109 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/91f5097e-d643-4598-9d06-39f14f913291-public-tls-certs\") pod \"cinder-api-0\" (UID: \"91f5097e-d643-4598-9d06-39f14f913291\") " pod="openstack/cinder-api-0" Dec 16 08:57:56 crc kubenswrapper[4823]: I1216 08:57:56.990192 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4mvb\" (UniqueName: \"kubernetes.io/projected/91f5097e-d643-4598-9d06-39f14f913291-kube-api-access-w4mvb\") pod \"cinder-api-0\" (UID: \"91f5097e-d643-4598-9d06-39f14f913291\") " pod="openstack/cinder-api-0" Dec 16 08:57:56 crc kubenswrapper[4823]: I1216 08:57:56.990232 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/91f5097e-d643-4598-9d06-39f14f913291-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"91f5097e-d643-4598-9d06-39f14f913291\") " pod="openstack/cinder-api-0" Dec 16 08:57:56 crc kubenswrapper[4823]: I1216 08:57:56.990264 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91f5097e-d643-4598-9d06-39f14f913291-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"91f5097e-d643-4598-9d06-39f14f913291\") " pod="openstack/cinder-api-0" Dec 16 08:57:56 crc kubenswrapper[4823]: I1216 08:57:56.990300 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/91f5097e-d643-4598-9d06-39f14f913291-config-data-custom\") pod \"cinder-api-0\" (UID: \"91f5097e-d643-4598-9d06-39f14f913291\") " pod="openstack/cinder-api-0" Dec 16 08:57:56 crc kubenswrapper[4823]: I1216 08:57:56.990335 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/91f5097e-d643-4598-9d06-39f14f913291-etc-machine-id\") pod \"cinder-api-0\" (UID: \"91f5097e-d643-4598-9d06-39f14f913291\") " pod="openstack/cinder-api-0" Dec 16 08:57:56 crc kubenswrapper[4823]: I1216 08:57:56.990384 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91f5097e-d643-4598-9d06-39f14f913291-config-data\") pod \"cinder-api-0\" (UID: \"91f5097e-d643-4598-9d06-39f14f913291\") " pod="openstack/cinder-api-0" Dec 16 08:57:56 crc kubenswrapper[4823]: I1216 08:57:56.990415 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/91f5097e-d643-4598-9d06-39f14f913291-scripts\") pod \"cinder-api-0\" (UID: \"91f5097e-d643-4598-9d06-39f14f913291\") " pod="openstack/cinder-api-0" Dec 16 08:57:56 crc kubenswrapper[4823]: I1216 08:57:56.990446 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91f5097e-d643-4598-9d06-39f14f913291-logs\") pod \"cinder-api-0\" (UID: \"91f5097e-d643-4598-9d06-39f14f913291\") " pod="openstack/cinder-api-0" Dec 16 08:57:56 crc 
kubenswrapper[4823]: I1216 08:57:56.990756 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/91f5097e-d643-4598-9d06-39f14f913291-etc-machine-id\") pod \"cinder-api-0\" (UID: \"91f5097e-d643-4598-9d06-39f14f913291\") " pod="openstack/cinder-api-0" Dec 16 08:57:56 crc kubenswrapper[4823]: I1216 08:57:56.990956 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91f5097e-d643-4598-9d06-39f14f913291-logs\") pod \"cinder-api-0\" (UID: \"91f5097e-d643-4598-9d06-39f14f913291\") " pod="openstack/cinder-api-0" Dec 16 08:57:56 crc kubenswrapper[4823]: I1216 08:57:56.994094 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91f5097e-d643-4598-9d06-39f14f913291-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"91f5097e-d643-4598-9d06-39f14f913291\") " pod="openstack/cinder-api-0" Dec 16 08:57:56 crc kubenswrapper[4823]: I1216 08:57:56.994705 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/91f5097e-d643-4598-9d06-39f14f913291-public-tls-certs\") pod \"cinder-api-0\" (UID: \"91f5097e-d643-4598-9d06-39f14f913291\") " pod="openstack/cinder-api-0" Dec 16 08:57:57 crc kubenswrapper[4823]: I1216 08:57:56.997161 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91f5097e-d643-4598-9d06-39f14f913291-config-data\") pod \"cinder-api-0\" (UID: \"91f5097e-d643-4598-9d06-39f14f913291\") " pod="openstack/cinder-api-0" Dec 16 08:57:57 crc kubenswrapper[4823]: I1216 08:57:56.997902 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/91f5097e-d643-4598-9d06-39f14f913291-scripts\") pod \"cinder-api-0\" (UID: 
\"91f5097e-d643-4598-9d06-39f14f913291\") " pod="openstack/cinder-api-0" Dec 16 08:57:57 crc kubenswrapper[4823]: I1216 08:57:56.997949 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/91f5097e-d643-4598-9d06-39f14f913291-config-data-custom\") pod \"cinder-api-0\" (UID: \"91f5097e-d643-4598-9d06-39f14f913291\") " pod="openstack/cinder-api-0" Dec 16 08:57:57 crc kubenswrapper[4823]: I1216 08:57:57.002115 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/91f5097e-d643-4598-9d06-39f14f913291-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"91f5097e-d643-4598-9d06-39f14f913291\") " pod="openstack/cinder-api-0" Dec 16 08:57:57 crc kubenswrapper[4823]: I1216 08:57:57.010721 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4mvb\" (UniqueName: \"kubernetes.io/projected/91f5097e-d643-4598-9d06-39f14f913291-kube-api-access-w4mvb\") pod \"cinder-api-0\" (UID: \"91f5097e-d643-4598-9d06-39f14f913291\") " pod="openstack/cinder-api-0" Dec 16 08:57:57 crc kubenswrapper[4823]: I1216 08:57:57.143423 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 16 08:57:57 crc kubenswrapper[4823]: I1216 08:57:57.600438 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 16 08:57:57 crc kubenswrapper[4823]: W1216 08:57:57.609049 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91f5097e_d643_4598_9d06_39f14f913291.slice/crio-20086cec8f2868089f12b793bf9b74ce7e74dea7026f0198636653acaf62d2c2 WatchSource:0}: Error finding container 20086cec8f2868089f12b793bf9b74ce7e74dea7026f0198636653acaf62d2c2: Status 404 returned error can't find the container with id 20086cec8f2868089f12b793bf9b74ce7e74dea7026f0198636653acaf62d2c2 Dec 16 08:57:57 crc kubenswrapper[4823]: I1216 08:57:57.752928 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"91f5097e-d643-4598-9d06-39f14f913291","Type":"ContainerStarted","Data":"20086cec8f2868089f12b793bf9b74ce7e74dea7026f0198636653acaf62d2c2"} Dec 16 08:57:57 crc kubenswrapper[4823]: I1216 08:57:57.793275 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec64cc79-d75f-4f52-8ec4-22de2801736b" path="/var/lib/kubelet/pods/ec64cc79-d75f-4f52-8ec4-22de2801736b/volumes" Dec 16 08:57:58 crc kubenswrapper[4823]: I1216 08:57:58.764403 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"91f5097e-d643-4598-9d06-39f14f913291","Type":"ContainerStarted","Data":"30fdfe487e62f583768c370c7e485f339f7b652c87130a0e97f5217998c63ec1"} Dec 16 08:57:59 crc kubenswrapper[4823]: I1216 08:57:59.787839 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"91f5097e-d643-4598-9d06-39f14f913291","Type":"ContainerStarted","Data":"77f7563a34f0a287066ce2a8f04f92118c66c8f3bccb6cfd97b6587b049219b8"} Dec 16 08:57:59 crc kubenswrapper[4823]: I1216 08:57:59.788262 4823 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/cinder-api-0" Dec 16 08:57:59 crc kubenswrapper[4823]: I1216 08:57:59.829534 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.829145527 podStartE2EDuration="3.829145527s" podCreationTimestamp="2025-12-16 08:57:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 08:57:59.824149421 +0000 UTC m=+7358.312715564" watchObservedRunningTime="2025-12-16 08:57:59.829145527 +0000 UTC m=+7358.317711660" Dec 16 08:58:01 crc kubenswrapper[4823]: I1216 08:58:01.332446 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 16 08:58:01 crc kubenswrapper[4823]: I1216 08:58:01.386767 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 16 08:58:01 crc kubenswrapper[4823]: I1216 08:58:01.801690 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="503d32be-ac63-4469-ae84-4803a0e6b9fc" containerName="cinder-scheduler" containerID="cri-o://4d8b8a826aae25bead70daf7c5adce54529e6f05b20874c9f77b8d98d9e28f3d" gracePeriod=30 Dec 16 08:58:01 crc kubenswrapper[4823]: I1216 08:58:01.801740 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="503d32be-ac63-4469-ae84-4803a0e6b9fc" containerName="probe" containerID="cri-o://39dba9ada4c571c4754f4ea9c1d613cc0c6f78e74689ca4e74bb70eeacf19e65" gracePeriod=30 Dec 16 08:58:02 crc kubenswrapper[4823]: I1216 08:58:02.811431 4823 generic.go:334] "Generic (PLEG): container finished" podID="503d32be-ac63-4469-ae84-4803a0e6b9fc" containerID="39dba9ada4c571c4754f4ea9c1d613cc0c6f78e74689ca4e74bb70eeacf19e65" exitCode=0 Dec 16 08:58:02 crc kubenswrapper[4823]: I1216 08:58:02.811524 4823 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/cinder-scheduler-0" event={"ID":"503d32be-ac63-4469-ae84-4803a0e6b9fc","Type":"ContainerDied","Data":"39dba9ada4c571c4754f4ea9c1d613cc0c6f78e74689ca4e74bb70eeacf19e65"} Dec 16 08:58:03 crc kubenswrapper[4823]: I1216 08:58:03.823497 4823 generic.go:334] "Generic (PLEG): container finished" podID="503d32be-ac63-4469-ae84-4803a0e6b9fc" containerID="4d8b8a826aae25bead70daf7c5adce54529e6f05b20874c9f77b8d98d9e28f3d" exitCode=0 Dec 16 08:58:03 crc kubenswrapper[4823]: I1216 08:58:03.823829 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"503d32be-ac63-4469-ae84-4803a0e6b9fc","Type":"ContainerDied","Data":"4d8b8a826aae25bead70daf7c5adce54529e6f05b20874c9f77b8d98d9e28f3d"} Dec 16 08:58:03 crc kubenswrapper[4823]: I1216 08:58:03.948718 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 16 08:58:04 crc kubenswrapper[4823]: I1216 08:58:04.020638 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/503d32be-ac63-4469-ae84-4803a0e6b9fc-config-data\") pod \"503d32be-ac63-4469-ae84-4803a0e6b9fc\" (UID: \"503d32be-ac63-4469-ae84-4803a0e6b9fc\") " Dec 16 08:58:04 crc kubenswrapper[4823]: I1216 08:58:04.020687 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/503d32be-ac63-4469-ae84-4803a0e6b9fc-scripts\") pod \"503d32be-ac63-4469-ae84-4803a0e6b9fc\" (UID: \"503d32be-ac63-4469-ae84-4803a0e6b9fc\") " Dec 16 08:58:04 crc kubenswrapper[4823]: I1216 08:58:04.020716 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/503d32be-ac63-4469-ae84-4803a0e6b9fc-etc-machine-id\") pod \"503d32be-ac63-4469-ae84-4803a0e6b9fc\" (UID: \"503d32be-ac63-4469-ae84-4803a0e6b9fc\") " Dec 16 08:58:04 crc 
kubenswrapper[4823]: I1216 08:58:04.020774 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/503d32be-ac63-4469-ae84-4803a0e6b9fc-combined-ca-bundle\") pod \"503d32be-ac63-4469-ae84-4803a0e6b9fc\" (UID: \"503d32be-ac63-4469-ae84-4803a0e6b9fc\") " Dec 16 08:58:04 crc kubenswrapper[4823]: I1216 08:58:04.020915 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5vw8d\" (UniqueName: \"kubernetes.io/projected/503d32be-ac63-4469-ae84-4803a0e6b9fc-kube-api-access-5vw8d\") pod \"503d32be-ac63-4469-ae84-4803a0e6b9fc\" (UID: \"503d32be-ac63-4469-ae84-4803a0e6b9fc\") " Dec 16 08:58:04 crc kubenswrapper[4823]: I1216 08:58:04.020959 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/503d32be-ac63-4469-ae84-4803a0e6b9fc-config-data-custom\") pod \"503d32be-ac63-4469-ae84-4803a0e6b9fc\" (UID: \"503d32be-ac63-4469-ae84-4803a0e6b9fc\") " Dec 16 08:58:04 crc kubenswrapper[4823]: I1216 08:58:04.021614 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/503d32be-ac63-4469-ae84-4803a0e6b9fc-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "503d32be-ac63-4469-ae84-4803a0e6b9fc" (UID: "503d32be-ac63-4469-ae84-4803a0e6b9fc"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 08:58:04 crc kubenswrapper[4823]: I1216 08:58:04.029386 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/503d32be-ac63-4469-ae84-4803a0e6b9fc-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "503d32be-ac63-4469-ae84-4803a0e6b9fc" (UID: "503d32be-ac63-4469-ae84-4803a0e6b9fc"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:58:04 crc kubenswrapper[4823]: I1216 08:58:04.029943 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/503d32be-ac63-4469-ae84-4803a0e6b9fc-kube-api-access-5vw8d" (OuterVolumeSpecName: "kube-api-access-5vw8d") pod "503d32be-ac63-4469-ae84-4803a0e6b9fc" (UID: "503d32be-ac63-4469-ae84-4803a0e6b9fc"). InnerVolumeSpecName "kube-api-access-5vw8d". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:58:04 crc kubenswrapper[4823]: I1216 08:58:04.038152 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/503d32be-ac63-4469-ae84-4803a0e6b9fc-scripts" (OuterVolumeSpecName: "scripts") pod "503d32be-ac63-4469-ae84-4803a0e6b9fc" (UID: "503d32be-ac63-4469-ae84-4803a0e6b9fc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:58:04 crc kubenswrapper[4823]: I1216 08:58:04.105308 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/503d32be-ac63-4469-ae84-4803a0e6b9fc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "503d32be-ac63-4469-ae84-4803a0e6b9fc" (UID: "503d32be-ac63-4469-ae84-4803a0e6b9fc"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:58:04 crc kubenswrapper[4823]: I1216 08:58:04.125382 4823 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/503d32be-ac63-4469-ae84-4803a0e6b9fc-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 08:58:04 crc kubenswrapper[4823]: I1216 08:58:04.125430 4823 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/503d32be-ac63-4469-ae84-4803a0e6b9fc-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 16 08:58:04 crc kubenswrapper[4823]: I1216 08:58:04.125443 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/503d32be-ac63-4469-ae84-4803a0e6b9fc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 08:58:04 crc kubenswrapper[4823]: I1216 08:58:04.125457 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5vw8d\" (UniqueName: \"kubernetes.io/projected/503d32be-ac63-4469-ae84-4803a0e6b9fc-kube-api-access-5vw8d\") on node \"crc\" DevicePath \"\"" Dec 16 08:58:04 crc kubenswrapper[4823]: I1216 08:58:04.125468 4823 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/503d32be-ac63-4469-ae84-4803a0e6b9fc-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 16 08:58:04 crc kubenswrapper[4823]: I1216 08:58:04.125905 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/503d32be-ac63-4469-ae84-4803a0e6b9fc-config-data" (OuterVolumeSpecName: "config-data") pod "503d32be-ac63-4469-ae84-4803a0e6b9fc" (UID: "503d32be-ac63-4469-ae84-4803a0e6b9fc"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:58:04 crc kubenswrapper[4823]: I1216 08:58:04.231129 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/503d32be-ac63-4469-ae84-4803a0e6b9fc-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 08:58:04 crc kubenswrapper[4823]: I1216 08:58:04.833623 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"503d32be-ac63-4469-ae84-4803a0e6b9fc","Type":"ContainerDied","Data":"d76cbf9dee88365aaa006bf1d69fd2031400a5abc3fcc843fbe11206f58163d0"} Dec 16 08:58:04 crc kubenswrapper[4823]: I1216 08:58:04.833719 4823 scope.go:117] "RemoveContainer" containerID="39dba9ada4c571c4754f4ea9c1d613cc0c6f78e74689ca4e74bb70eeacf19e65" Dec 16 08:58:04 crc kubenswrapper[4823]: I1216 08:58:04.833737 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 16 08:58:04 crc kubenswrapper[4823]: I1216 08:58:04.855779 4823 scope.go:117] "RemoveContainer" containerID="4d8b8a826aae25bead70daf7c5adce54529e6f05b20874c9f77b8d98d9e28f3d" Dec 16 08:58:04 crc kubenswrapper[4823]: I1216 08:58:04.874963 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 16 08:58:04 crc kubenswrapper[4823]: I1216 08:58:04.893017 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 16 08:58:04 crc kubenswrapper[4823]: I1216 08:58:04.904730 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 16 08:58:04 crc kubenswrapper[4823]: E1216 08:58:04.905247 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="503d32be-ac63-4469-ae84-4803a0e6b9fc" containerName="probe" Dec 16 08:58:04 crc kubenswrapper[4823]: I1216 08:58:04.905275 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="503d32be-ac63-4469-ae84-4803a0e6b9fc" containerName="probe" Dec 16 
08:58:04 crc kubenswrapper[4823]: E1216 08:58:04.905311 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="503d32be-ac63-4469-ae84-4803a0e6b9fc" containerName="cinder-scheduler" Dec 16 08:58:04 crc kubenswrapper[4823]: I1216 08:58:04.905320 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="503d32be-ac63-4469-ae84-4803a0e6b9fc" containerName="cinder-scheduler" Dec 16 08:58:04 crc kubenswrapper[4823]: I1216 08:58:04.905535 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="503d32be-ac63-4469-ae84-4803a0e6b9fc" containerName="cinder-scheduler" Dec 16 08:58:04 crc kubenswrapper[4823]: I1216 08:58:04.905559 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="503d32be-ac63-4469-ae84-4803a0e6b9fc" containerName="probe" Dec 16 08:58:04 crc kubenswrapper[4823]: I1216 08:58:04.906723 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 16 08:58:04 crc kubenswrapper[4823]: I1216 08:58:04.909387 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 16 08:58:04 crc kubenswrapper[4823]: I1216 08:58:04.922841 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 16 08:58:05 crc kubenswrapper[4823]: I1216 08:58:05.048283 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cffdbd32-0155-4dd0-897d-9e406fd5e2ee-config-data\") pod \"cinder-scheduler-0\" (UID: \"cffdbd32-0155-4dd0-897d-9e406fd5e2ee\") " pod="openstack/cinder-scheduler-0" Dec 16 08:58:05 crc kubenswrapper[4823]: I1216 08:58:05.048757 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cffdbd32-0155-4dd0-897d-9e406fd5e2ee-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: 
\"cffdbd32-0155-4dd0-897d-9e406fd5e2ee\") " pod="openstack/cinder-scheduler-0" Dec 16 08:58:05 crc kubenswrapper[4823]: I1216 08:58:05.048999 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cffdbd32-0155-4dd0-897d-9e406fd5e2ee-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"cffdbd32-0155-4dd0-897d-9e406fd5e2ee\") " pod="openstack/cinder-scheduler-0" Dec 16 08:58:05 crc kubenswrapper[4823]: I1216 08:58:05.049363 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cffdbd32-0155-4dd0-897d-9e406fd5e2ee-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"cffdbd32-0155-4dd0-897d-9e406fd5e2ee\") " pod="openstack/cinder-scheduler-0" Dec 16 08:58:05 crc kubenswrapper[4823]: I1216 08:58:05.049495 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cffdbd32-0155-4dd0-897d-9e406fd5e2ee-scripts\") pod \"cinder-scheduler-0\" (UID: \"cffdbd32-0155-4dd0-897d-9e406fd5e2ee\") " pod="openstack/cinder-scheduler-0" Dec 16 08:58:05 crc kubenswrapper[4823]: I1216 08:58:05.049613 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqpfd\" (UniqueName: \"kubernetes.io/projected/cffdbd32-0155-4dd0-897d-9e406fd5e2ee-kube-api-access-dqpfd\") pod \"cinder-scheduler-0\" (UID: \"cffdbd32-0155-4dd0-897d-9e406fd5e2ee\") " pod="openstack/cinder-scheduler-0" Dec 16 08:58:05 crc kubenswrapper[4823]: I1216 08:58:05.151790 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cffdbd32-0155-4dd0-897d-9e406fd5e2ee-config-data\") pod \"cinder-scheduler-0\" (UID: \"cffdbd32-0155-4dd0-897d-9e406fd5e2ee\") " pod="openstack/cinder-scheduler-0" 
Dec 16 08:58:05 crc kubenswrapper[4823]: I1216 08:58:05.151866 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cffdbd32-0155-4dd0-897d-9e406fd5e2ee-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"cffdbd32-0155-4dd0-897d-9e406fd5e2ee\") " pod="openstack/cinder-scheduler-0" Dec 16 08:58:05 crc kubenswrapper[4823]: I1216 08:58:05.151937 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cffdbd32-0155-4dd0-897d-9e406fd5e2ee-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"cffdbd32-0155-4dd0-897d-9e406fd5e2ee\") " pod="openstack/cinder-scheduler-0" Dec 16 08:58:05 crc kubenswrapper[4823]: I1216 08:58:05.151952 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cffdbd32-0155-4dd0-897d-9e406fd5e2ee-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"cffdbd32-0155-4dd0-897d-9e406fd5e2ee\") " pod="openstack/cinder-scheduler-0" Dec 16 08:58:05 crc kubenswrapper[4823]: I1216 08:58:05.151979 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cffdbd32-0155-4dd0-897d-9e406fd5e2ee-scripts\") pod \"cinder-scheduler-0\" (UID: \"cffdbd32-0155-4dd0-897d-9e406fd5e2ee\") " pod="openstack/cinder-scheduler-0" Dec 16 08:58:05 crc kubenswrapper[4823]: I1216 08:58:05.152007 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqpfd\" (UniqueName: \"kubernetes.io/projected/cffdbd32-0155-4dd0-897d-9e406fd5e2ee-kube-api-access-dqpfd\") pod \"cinder-scheduler-0\" (UID: \"cffdbd32-0155-4dd0-897d-9e406fd5e2ee\") " pod="openstack/cinder-scheduler-0" Dec 16 08:58:05 crc kubenswrapper[4823]: I1216 08:58:05.152049 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cffdbd32-0155-4dd0-897d-9e406fd5e2ee-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"cffdbd32-0155-4dd0-897d-9e406fd5e2ee\") " pod="openstack/cinder-scheduler-0" Dec 16 08:58:05 crc kubenswrapper[4823]: I1216 08:58:05.157363 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cffdbd32-0155-4dd0-897d-9e406fd5e2ee-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"cffdbd32-0155-4dd0-897d-9e406fd5e2ee\") " pod="openstack/cinder-scheduler-0" Dec 16 08:58:05 crc kubenswrapper[4823]: I1216 08:58:05.157930 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cffdbd32-0155-4dd0-897d-9e406fd5e2ee-config-data\") pod \"cinder-scheduler-0\" (UID: \"cffdbd32-0155-4dd0-897d-9e406fd5e2ee\") " pod="openstack/cinder-scheduler-0" Dec 16 08:58:05 crc kubenswrapper[4823]: I1216 08:58:05.159437 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cffdbd32-0155-4dd0-897d-9e406fd5e2ee-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"cffdbd32-0155-4dd0-897d-9e406fd5e2ee\") " pod="openstack/cinder-scheduler-0" Dec 16 08:58:05 crc kubenswrapper[4823]: I1216 08:58:05.160517 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cffdbd32-0155-4dd0-897d-9e406fd5e2ee-scripts\") pod \"cinder-scheduler-0\" (UID: \"cffdbd32-0155-4dd0-897d-9e406fd5e2ee\") " pod="openstack/cinder-scheduler-0" Dec 16 08:58:05 crc kubenswrapper[4823]: I1216 08:58:05.171077 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqpfd\" (UniqueName: \"kubernetes.io/projected/cffdbd32-0155-4dd0-897d-9e406fd5e2ee-kube-api-access-dqpfd\") pod \"cinder-scheduler-0\" (UID: \"cffdbd32-0155-4dd0-897d-9e406fd5e2ee\") " 
pod="openstack/cinder-scheduler-0" Dec 16 08:58:05 crc kubenswrapper[4823]: I1216 08:58:05.225712 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 16 08:58:05 crc kubenswrapper[4823]: I1216 08:58:05.739070 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 16 08:58:05 crc kubenswrapper[4823]: W1216 08:58:05.747633 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcffdbd32_0155_4dd0_897d_9e406fd5e2ee.slice/crio-33a6413e072e3077978a096fa163d1cdb4054acf5bfd1877efc9c4bde951a128 WatchSource:0}: Error finding container 33a6413e072e3077978a096fa163d1cdb4054acf5bfd1877efc9c4bde951a128: Status 404 returned error can't find the container with id 33a6413e072e3077978a096fa163d1cdb4054acf5bfd1877efc9c4bde951a128 Dec 16 08:58:05 crc kubenswrapper[4823]: I1216 08:58:05.784984 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="503d32be-ac63-4469-ae84-4803a0e6b9fc" path="/var/lib/kubelet/pods/503d32be-ac63-4469-ae84-4803a0e6b9fc/volumes" Dec 16 08:58:05 crc kubenswrapper[4823]: I1216 08:58:05.849386 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"cffdbd32-0155-4dd0-897d-9e406fd5e2ee","Type":"ContainerStarted","Data":"33a6413e072e3077978a096fa163d1cdb4054acf5bfd1877efc9c4bde951a128"} Dec 16 08:58:06 crc kubenswrapper[4823]: I1216 08:58:06.891414 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"cffdbd32-0155-4dd0-897d-9e406fd5e2ee","Type":"ContainerStarted","Data":"03aaea60579a32dbd22e959a4c109e38b799c758c8b0d9ef37082c0af8297906"} Dec 16 08:58:07 crc kubenswrapper[4823]: I1216 08:58:07.905014 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"cffdbd32-0155-4dd0-897d-9e406fd5e2ee","Type":"ContainerStarted","Data":"25018dc3f33bf4fbcc5228605d9875ddf4805fa913e97070453b8841fe915d79"} Dec 16 08:58:07 crc kubenswrapper[4823]: I1216 08:58:07.932717 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.932690408 podStartE2EDuration="3.932690408s" podCreationTimestamp="2025-12-16 08:58:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 08:58:07.928944331 +0000 UTC m=+7366.417510454" watchObservedRunningTime="2025-12-16 08:58:07.932690408 +0000 UTC m=+7366.421256541" Dec 16 08:58:08 crc kubenswrapper[4823]: I1216 08:58:08.976297 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Dec 16 08:58:10 crc kubenswrapper[4823]: I1216 08:58:10.225838 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 16 08:58:10 crc kubenswrapper[4823]: I1216 08:58:10.775914 4823 scope.go:117] "RemoveContainer" containerID="9ce3e6cc66a3ba1f5a9f07614bbf78a449581b45707f8e1e5d9794f67e5c0428" Dec 16 08:58:11 crc kubenswrapper[4823]: I1216 08:58:11.944347 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" event={"ID":"25dec47c-3043-486c-b371-2be103c214e3","Type":"ContainerStarted","Data":"c4d9ea4299c018a902750aabeef9dea06ce13b6e55f03c5913f1f492b4b19163"} Dec 16 08:58:15 crc kubenswrapper[4823]: I1216 08:58:15.471202 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 16 08:58:17 crc kubenswrapper[4823]: I1216 08:58:17.494824 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-n2f4r"] Dec 16 08:58:17 crc kubenswrapper[4823]: I1216 08:58:17.496335 4823 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openstack/glance-db-create-n2f4r" Dec 16 08:58:17 crc kubenswrapper[4823]: I1216 08:58:17.508412 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-n2f4r"] Dec 16 08:58:17 crc kubenswrapper[4823]: I1216 08:58:17.595741 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-4d28-account-create-update-zfk4f"] Dec 16 08:58:17 crc kubenswrapper[4823]: I1216 08:58:17.596996 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-4d28-account-create-update-zfk4f" Dec 16 08:58:17 crc kubenswrapper[4823]: I1216 08:58:17.602199 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Dec 16 08:58:17 crc kubenswrapper[4823]: I1216 08:58:17.616998 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-4d28-account-create-update-zfk4f"] Dec 16 08:58:17 crc kubenswrapper[4823]: I1216 08:58:17.647932 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbtbw\" (UniqueName: \"kubernetes.io/projected/eacf4276-abf0-43b1-b50b-31b9a98fd977-kube-api-access-dbtbw\") pod \"glance-db-create-n2f4r\" (UID: \"eacf4276-abf0-43b1-b50b-31b9a98fd977\") " pod="openstack/glance-db-create-n2f4r" Dec 16 08:58:17 crc kubenswrapper[4823]: I1216 08:58:17.648011 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eacf4276-abf0-43b1-b50b-31b9a98fd977-operator-scripts\") pod \"glance-db-create-n2f4r\" (UID: \"eacf4276-abf0-43b1-b50b-31b9a98fd977\") " pod="openstack/glance-db-create-n2f4r" Dec 16 08:58:17 crc kubenswrapper[4823]: I1216 08:58:17.750468 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbtbw\" (UniqueName: 
\"kubernetes.io/projected/eacf4276-abf0-43b1-b50b-31b9a98fd977-kube-api-access-dbtbw\") pod \"glance-db-create-n2f4r\" (UID: \"eacf4276-abf0-43b1-b50b-31b9a98fd977\") " pod="openstack/glance-db-create-n2f4r" Dec 16 08:58:17 crc kubenswrapper[4823]: I1216 08:58:17.750621 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eacf4276-abf0-43b1-b50b-31b9a98fd977-operator-scripts\") pod \"glance-db-create-n2f4r\" (UID: \"eacf4276-abf0-43b1-b50b-31b9a98fd977\") " pod="openstack/glance-db-create-n2f4r" Dec 16 08:58:17 crc kubenswrapper[4823]: I1216 08:58:17.750674 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4e289ad3-49d6-41ac-aa1f-36e839cb4dfe-operator-scripts\") pod \"glance-4d28-account-create-update-zfk4f\" (UID: \"4e289ad3-49d6-41ac-aa1f-36e839cb4dfe\") " pod="openstack/glance-4d28-account-create-update-zfk4f" Dec 16 08:58:17 crc kubenswrapper[4823]: I1216 08:58:17.750727 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvnxp\" (UniqueName: \"kubernetes.io/projected/4e289ad3-49d6-41ac-aa1f-36e839cb4dfe-kube-api-access-lvnxp\") pod \"glance-4d28-account-create-update-zfk4f\" (UID: \"4e289ad3-49d6-41ac-aa1f-36e839cb4dfe\") " pod="openstack/glance-4d28-account-create-update-zfk4f" Dec 16 08:58:17 crc kubenswrapper[4823]: I1216 08:58:17.751436 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eacf4276-abf0-43b1-b50b-31b9a98fd977-operator-scripts\") pod \"glance-db-create-n2f4r\" (UID: \"eacf4276-abf0-43b1-b50b-31b9a98fd977\") " pod="openstack/glance-db-create-n2f4r" Dec 16 08:58:17 crc kubenswrapper[4823]: I1216 08:58:17.780506 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-dbtbw\" (UniqueName: \"kubernetes.io/projected/eacf4276-abf0-43b1-b50b-31b9a98fd977-kube-api-access-dbtbw\") pod \"glance-db-create-n2f4r\" (UID: \"eacf4276-abf0-43b1-b50b-31b9a98fd977\") " pod="openstack/glance-db-create-n2f4r" Dec 16 08:58:17 crc kubenswrapper[4823]: I1216 08:58:17.834794 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-n2f4r" Dec 16 08:58:17 crc kubenswrapper[4823]: I1216 08:58:17.853264 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4e289ad3-49d6-41ac-aa1f-36e839cb4dfe-operator-scripts\") pod \"glance-4d28-account-create-update-zfk4f\" (UID: \"4e289ad3-49d6-41ac-aa1f-36e839cb4dfe\") " pod="openstack/glance-4d28-account-create-update-zfk4f" Dec 16 08:58:17 crc kubenswrapper[4823]: I1216 08:58:17.853369 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvnxp\" (UniqueName: \"kubernetes.io/projected/4e289ad3-49d6-41ac-aa1f-36e839cb4dfe-kube-api-access-lvnxp\") pod \"glance-4d28-account-create-update-zfk4f\" (UID: \"4e289ad3-49d6-41ac-aa1f-36e839cb4dfe\") " pod="openstack/glance-4d28-account-create-update-zfk4f" Dec 16 08:58:17 crc kubenswrapper[4823]: I1216 08:58:17.854267 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4e289ad3-49d6-41ac-aa1f-36e839cb4dfe-operator-scripts\") pod \"glance-4d28-account-create-update-zfk4f\" (UID: \"4e289ad3-49d6-41ac-aa1f-36e839cb4dfe\") " pod="openstack/glance-4d28-account-create-update-zfk4f" Dec 16 08:58:17 crc kubenswrapper[4823]: I1216 08:58:17.873660 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvnxp\" (UniqueName: \"kubernetes.io/projected/4e289ad3-49d6-41ac-aa1f-36e839cb4dfe-kube-api-access-lvnxp\") pod \"glance-4d28-account-create-update-zfk4f\" (UID: 
\"4e289ad3-49d6-41ac-aa1f-36e839cb4dfe\") " pod="openstack/glance-4d28-account-create-update-zfk4f" Dec 16 08:58:17 crc kubenswrapper[4823]: I1216 08:58:17.920842 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-4d28-account-create-update-zfk4f" Dec 16 08:58:18 crc kubenswrapper[4823]: W1216 08:58:18.339922 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeacf4276_abf0_43b1_b50b_31b9a98fd977.slice/crio-d5382960d1099a23a4fc348d0ca6a2214e9059f3c7a49f75f4c2fb60a2a504bc WatchSource:0}: Error finding container d5382960d1099a23a4fc348d0ca6a2214e9059f3c7a49f75f4c2fb60a2a504bc: Status 404 returned error can't find the container with id d5382960d1099a23a4fc348d0ca6a2214e9059f3c7a49f75f4c2fb60a2a504bc Dec 16 08:58:18 crc kubenswrapper[4823]: I1216 08:58:18.344846 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-n2f4r"] Dec 16 08:58:18 crc kubenswrapper[4823]: W1216 08:58:18.438280 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4e289ad3_49d6_41ac_aa1f_36e839cb4dfe.slice/crio-895c2f91ba7ae419057b21f219b0e732b551a2570a13e3931611e9deaae0543f WatchSource:0}: Error finding container 895c2f91ba7ae419057b21f219b0e732b551a2570a13e3931611e9deaae0543f: Status 404 returned error can't find the container with id 895c2f91ba7ae419057b21f219b0e732b551a2570a13e3931611e9deaae0543f Dec 16 08:58:18 crc kubenswrapper[4823]: I1216 08:58:18.443599 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-4d28-account-create-update-zfk4f"] Dec 16 08:58:19 crc kubenswrapper[4823]: I1216 08:58:19.031531 4823 generic.go:334] "Generic (PLEG): container finished" podID="4e289ad3-49d6-41ac-aa1f-36e839cb4dfe" containerID="68397253796aa62e916c2caa7a1ff23231549f5a3cf2a3f3169458d72a61f5fd" exitCode=0 Dec 16 08:58:19 crc 
kubenswrapper[4823]: I1216 08:58:19.031612 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-4d28-account-create-update-zfk4f" event={"ID":"4e289ad3-49d6-41ac-aa1f-36e839cb4dfe","Type":"ContainerDied","Data":"68397253796aa62e916c2caa7a1ff23231549f5a3cf2a3f3169458d72a61f5fd"} Dec 16 08:58:19 crc kubenswrapper[4823]: I1216 08:58:19.031645 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-4d28-account-create-update-zfk4f" event={"ID":"4e289ad3-49d6-41ac-aa1f-36e839cb4dfe","Type":"ContainerStarted","Data":"895c2f91ba7ae419057b21f219b0e732b551a2570a13e3931611e9deaae0543f"} Dec 16 08:58:19 crc kubenswrapper[4823]: I1216 08:58:19.033696 4823 generic.go:334] "Generic (PLEG): container finished" podID="eacf4276-abf0-43b1-b50b-31b9a98fd977" containerID="24328ddc822f8342c18736f115585ba59a7d68d44674cfb10d0d532a099ae439" exitCode=0 Dec 16 08:58:19 crc kubenswrapper[4823]: I1216 08:58:19.033747 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-n2f4r" event={"ID":"eacf4276-abf0-43b1-b50b-31b9a98fd977","Type":"ContainerDied","Data":"24328ddc822f8342c18736f115585ba59a7d68d44674cfb10d0d532a099ae439"} Dec 16 08:58:19 crc kubenswrapper[4823]: I1216 08:58:19.033779 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-n2f4r" event={"ID":"eacf4276-abf0-43b1-b50b-31b9a98fd977","Type":"ContainerStarted","Data":"d5382960d1099a23a4fc348d0ca6a2214e9059f3c7a49f75f4c2fb60a2a504bc"} Dec 16 08:58:20 crc kubenswrapper[4823]: I1216 08:58:20.425795 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-n2f4r" Dec 16 08:58:20 crc kubenswrapper[4823]: I1216 08:58:20.433327 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-4d28-account-create-update-zfk4f" Dec 16 08:58:20 crc kubenswrapper[4823]: I1216 08:58:20.504004 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eacf4276-abf0-43b1-b50b-31b9a98fd977-operator-scripts\") pod \"eacf4276-abf0-43b1-b50b-31b9a98fd977\" (UID: \"eacf4276-abf0-43b1-b50b-31b9a98fd977\") " Dec 16 08:58:20 crc kubenswrapper[4823]: I1216 08:58:20.504221 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbtbw\" (UniqueName: \"kubernetes.io/projected/eacf4276-abf0-43b1-b50b-31b9a98fd977-kube-api-access-dbtbw\") pod \"eacf4276-abf0-43b1-b50b-31b9a98fd977\" (UID: \"eacf4276-abf0-43b1-b50b-31b9a98fd977\") " Dec 16 08:58:20 crc kubenswrapper[4823]: I1216 08:58:20.504515 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eacf4276-abf0-43b1-b50b-31b9a98fd977-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "eacf4276-abf0-43b1-b50b-31b9a98fd977" (UID: "eacf4276-abf0-43b1-b50b-31b9a98fd977"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:58:20 crc kubenswrapper[4823]: I1216 08:58:20.504657 4823 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eacf4276-abf0-43b1-b50b-31b9a98fd977-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 08:58:20 crc kubenswrapper[4823]: I1216 08:58:20.513240 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eacf4276-abf0-43b1-b50b-31b9a98fd977-kube-api-access-dbtbw" (OuterVolumeSpecName: "kube-api-access-dbtbw") pod "eacf4276-abf0-43b1-b50b-31b9a98fd977" (UID: "eacf4276-abf0-43b1-b50b-31b9a98fd977"). InnerVolumeSpecName "kube-api-access-dbtbw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:58:20 crc kubenswrapper[4823]: I1216 08:58:20.606801 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lvnxp\" (UniqueName: \"kubernetes.io/projected/4e289ad3-49d6-41ac-aa1f-36e839cb4dfe-kube-api-access-lvnxp\") pod \"4e289ad3-49d6-41ac-aa1f-36e839cb4dfe\" (UID: \"4e289ad3-49d6-41ac-aa1f-36e839cb4dfe\") " Dec 16 08:58:20 crc kubenswrapper[4823]: I1216 08:58:20.606892 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4e289ad3-49d6-41ac-aa1f-36e839cb4dfe-operator-scripts\") pod \"4e289ad3-49d6-41ac-aa1f-36e839cb4dfe\" (UID: \"4e289ad3-49d6-41ac-aa1f-36e839cb4dfe\") " Dec 16 08:58:20 crc kubenswrapper[4823]: I1216 08:58:20.607368 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbtbw\" (UniqueName: \"kubernetes.io/projected/eacf4276-abf0-43b1-b50b-31b9a98fd977-kube-api-access-dbtbw\") on node \"crc\" DevicePath \"\"" Dec 16 08:58:20 crc kubenswrapper[4823]: I1216 08:58:20.607469 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e289ad3-49d6-41ac-aa1f-36e839cb4dfe-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4e289ad3-49d6-41ac-aa1f-36e839cb4dfe" (UID: "4e289ad3-49d6-41ac-aa1f-36e839cb4dfe"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:58:20 crc kubenswrapper[4823]: I1216 08:58:20.610996 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e289ad3-49d6-41ac-aa1f-36e839cb4dfe-kube-api-access-lvnxp" (OuterVolumeSpecName: "kube-api-access-lvnxp") pod "4e289ad3-49d6-41ac-aa1f-36e839cb4dfe" (UID: "4e289ad3-49d6-41ac-aa1f-36e839cb4dfe"). InnerVolumeSpecName "kube-api-access-lvnxp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:58:20 crc kubenswrapper[4823]: I1216 08:58:20.708849 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lvnxp\" (UniqueName: \"kubernetes.io/projected/4e289ad3-49d6-41ac-aa1f-36e839cb4dfe-kube-api-access-lvnxp\") on node \"crc\" DevicePath \"\"" Dec 16 08:58:20 crc kubenswrapper[4823]: I1216 08:58:20.708888 4823 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4e289ad3-49d6-41ac-aa1f-36e839cb4dfe-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 08:58:21 crc kubenswrapper[4823]: I1216 08:58:21.051614 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-n2f4r" event={"ID":"eacf4276-abf0-43b1-b50b-31b9a98fd977","Type":"ContainerDied","Data":"d5382960d1099a23a4fc348d0ca6a2214e9059f3c7a49f75f4c2fb60a2a504bc"} Dec 16 08:58:21 crc kubenswrapper[4823]: I1216 08:58:21.051924 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d5382960d1099a23a4fc348d0ca6a2214e9059f3c7a49f75f4c2fb60a2a504bc" Dec 16 08:58:21 crc kubenswrapper[4823]: I1216 08:58:21.051889 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-n2f4r" Dec 16 08:58:21 crc kubenswrapper[4823]: I1216 08:58:21.053851 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-4d28-account-create-update-zfk4f" event={"ID":"4e289ad3-49d6-41ac-aa1f-36e839cb4dfe","Type":"ContainerDied","Data":"895c2f91ba7ae419057b21f219b0e732b551a2570a13e3931611e9deaae0543f"} Dec 16 08:58:21 crc kubenswrapper[4823]: I1216 08:58:21.053912 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="895c2f91ba7ae419057b21f219b0e732b551a2570a13e3931611e9deaae0543f" Dec 16 08:58:21 crc kubenswrapper[4823]: I1216 08:58:21.053942 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-4d28-account-create-update-zfk4f" Dec 16 08:58:22 crc kubenswrapper[4823]: I1216 08:58:22.754708 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-52nj6"] Dec 16 08:58:22 crc kubenswrapper[4823]: E1216 08:58:22.755385 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eacf4276-abf0-43b1-b50b-31b9a98fd977" containerName="mariadb-database-create" Dec 16 08:58:22 crc kubenswrapper[4823]: I1216 08:58:22.755403 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="eacf4276-abf0-43b1-b50b-31b9a98fd977" containerName="mariadb-database-create" Dec 16 08:58:22 crc kubenswrapper[4823]: E1216 08:58:22.755429 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e289ad3-49d6-41ac-aa1f-36e839cb4dfe" containerName="mariadb-account-create-update" Dec 16 08:58:22 crc kubenswrapper[4823]: I1216 08:58:22.755437 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e289ad3-49d6-41ac-aa1f-36e839cb4dfe" containerName="mariadb-account-create-update" Dec 16 08:58:22 crc kubenswrapper[4823]: I1216 08:58:22.755645 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e289ad3-49d6-41ac-aa1f-36e839cb4dfe" containerName="mariadb-account-create-update" Dec 16 08:58:22 crc kubenswrapper[4823]: I1216 08:58:22.755670 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="eacf4276-abf0-43b1-b50b-31b9a98fd977" containerName="mariadb-database-create" Dec 16 08:58:22 crc kubenswrapper[4823]: I1216 08:58:22.756329 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-52nj6" Dec 16 08:58:22 crc kubenswrapper[4823]: I1216 08:58:22.759418 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-kvvk8" Dec 16 08:58:22 crc kubenswrapper[4823]: I1216 08:58:22.759719 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Dec 16 08:58:22 crc kubenswrapper[4823]: I1216 08:58:22.769279 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-52nj6"] Dec 16 08:58:22 crc kubenswrapper[4823]: I1216 08:58:22.844153 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d15835b7-f5d1-4b98-aeff-14eea3529691-db-sync-config-data\") pod \"glance-db-sync-52nj6\" (UID: \"d15835b7-f5d1-4b98-aeff-14eea3529691\") " pod="openstack/glance-db-sync-52nj6" Dec 16 08:58:22 crc kubenswrapper[4823]: I1216 08:58:22.844281 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d15835b7-f5d1-4b98-aeff-14eea3529691-combined-ca-bundle\") pod \"glance-db-sync-52nj6\" (UID: \"d15835b7-f5d1-4b98-aeff-14eea3529691\") " pod="openstack/glance-db-sync-52nj6" Dec 16 08:58:22 crc kubenswrapper[4823]: I1216 08:58:22.844429 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49vxh\" (UniqueName: \"kubernetes.io/projected/d15835b7-f5d1-4b98-aeff-14eea3529691-kube-api-access-49vxh\") pod \"glance-db-sync-52nj6\" (UID: \"d15835b7-f5d1-4b98-aeff-14eea3529691\") " pod="openstack/glance-db-sync-52nj6" Dec 16 08:58:22 crc kubenswrapper[4823]: I1216 08:58:22.845079 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/d15835b7-f5d1-4b98-aeff-14eea3529691-config-data\") pod \"glance-db-sync-52nj6\" (UID: \"d15835b7-f5d1-4b98-aeff-14eea3529691\") " pod="openstack/glance-db-sync-52nj6" Dec 16 08:58:22 crc kubenswrapper[4823]: I1216 08:58:22.946796 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d15835b7-f5d1-4b98-aeff-14eea3529691-db-sync-config-data\") pod \"glance-db-sync-52nj6\" (UID: \"d15835b7-f5d1-4b98-aeff-14eea3529691\") " pod="openstack/glance-db-sync-52nj6" Dec 16 08:58:22 crc kubenswrapper[4823]: I1216 08:58:22.946879 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d15835b7-f5d1-4b98-aeff-14eea3529691-combined-ca-bundle\") pod \"glance-db-sync-52nj6\" (UID: \"d15835b7-f5d1-4b98-aeff-14eea3529691\") " pod="openstack/glance-db-sync-52nj6" Dec 16 08:58:22 crc kubenswrapper[4823]: I1216 08:58:22.946970 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49vxh\" (UniqueName: \"kubernetes.io/projected/d15835b7-f5d1-4b98-aeff-14eea3529691-kube-api-access-49vxh\") pod \"glance-db-sync-52nj6\" (UID: \"d15835b7-f5d1-4b98-aeff-14eea3529691\") " pod="openstack/glance-db-sync-52nj6" Dec 16 08:58:22 crc kubenswrapper[4823]: I1216 08:58:22.947015 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d15835b7-f5d1-4b98-aeff-14eea3529691-config-data\") pod \"glance-db-sync-52nj6\" (UID: \"d15835b7-f5d1-4b98-aeff-14eea3529691\") " pod="openstack/glance-db-sync-52nj6" Dec 16 08:58:22 crc kubenswrapper[4823]: I1216 08:58:22.952247 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d15835b7-f5d1-4b98-aeff-14eea3529691-db-sync-config-data\") pod \"glance-db-sync-52nj6\" (UID: 
\"d15835b7-f5d1-4b98-aeff-14eea3529691\") " pod="openstack/glance-db-sync-52nj6" Dec 16 08:58:22 crc kubenswrapper[4823]: I1216 08:58:22.952911 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d15835b7-f5d1-4b98-aeff-14eea3529691-combined-ca-bundle\") pod \"glance-db-sync-52nj6\" (UID: \"d15835b7-f5d1-4b98-aeff-14eea3529691\") " pod="openstack/glance-db-sync-52nj6" Dec 16 08:58:22 crc kubenswrapper[4823]: I1216 08:58:22.968608 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d15835b7-f5d1-4b98-aeff-14eea3529691-config-data\") pod \"glance-db-sync-52nj6\" (UID: \"d15835b7-f5d1-4b98-aeff-14eea3529691\") " pod="openstack/glance-db-sync-52nj6" Dec 16 08:58:22 crc kubenswrapper[4823]: I1216 08:58:22.969010 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49vxh\" (UniqueName: \"kubernetes.io/projected/d15835b7-f5d1-4b98-aeff-14eea3529691-kube-api-access-49vxh\") pod \"glance-db-sync-52nj6\" (UID: \"d15835b7-f5d1-4b98-aeff-14eea3529691\") " pod="openstack/glance-db-sync-52nj6" Dec 16 08:58:23 crc kubenswrapper[4823]: I1216 08:58:23.076993 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-52nj6" Dec 16 08:58:23 crc kubenswrapper[4823]: I1216 08:58:23.517449 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-52nj6"] Dec 16 08:58:24 crc kubenswrapper[4823]: I1216 08:58:24.080218 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-52nj6" event={"ID":"d15835b7-f5d1-4b98-aeff-14eea3529691","Type":"ContainerStarted","Data":"d03a48486e0f62c020cb035940cb986d2b2084622a159233a140e2bd30c9dab2"} Dec 16 08:58:41 crc kubenswrapper[4823]: E1216 08:58:41.930703 4823 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-glance-api:c3a837a7c939c44c9106d2b2c7c72015" Dec 16 08:58:41 crc kubenswrapper[4823]: E1216 08:58:41.931318 4823 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-glance-api:c3a837a7c939c44c9106d2b2c7c72015" Dec 16 08:58:41 crc kubenswrapper[4823]: E1216 08:58:41.931459 4823 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:glance-db-sync,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-glance-api:c3a837a7c939c44c9106d2b2c7c72015,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/glance/glance.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-49vxh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42415,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42415,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-db-sync-52nj6_openstack(d15835b7-f5d1-4b98-aeff-14eea3529691): ErrImagePull: rpc error: code = Canceled desc = 
copying config: context canceled" logger="UnhandledError" Dec 16 08:58:41 crc kubenswrapper[4823]: E1216 08:58:41.932667 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/glance-db-sync-52nj6" podUID="d15835b7-f5d1-4b98-aeff-14eea3529691" Dec 16 08:58:42 crc kubenswrapper[4823]: E1216 08:58:42.260905 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-antelope-centos9/openstack-glance-api:c3a837a7c939c44c9106d2b2c7c72015\\\"\"" pod="openstack/glance-db-sync-52nj6" podUID="d15835b7-f5d1-4b98-aeff-14eea3529691" Dec 16 08:58:55 crc kubenswrapper[4823]: I1216 08:58:55.375850 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-52nj6" event={"ID":"d15835b7-f5d1-4b98-aeff-14eea3529691","Type":"ContainerStarted","Data":"e2ef74bd152894e3c34c3052ee9a50d3a4d09fc64d0cd690fa05f9c1d68b0731"} Dec 16 08:58:55 crc kubenswrapper[4823]: I1216 08:58:55.394388 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-52nj6" podStartSLOduration=2.941388304 podStartE2EDuration="33.394367792s" podCreationTimestamp="2025-12-16 08:58:22 +0000 UTC" firstStartedPulling="2025-12-16 08:58:23.52900049 +0000 UTC m=+7382.017566613" lastFinishedPulling="2025-12-16 08:58:53.981979978 +0000 UTC m=+7412.470546101" observedRunningTime="2025-12-16 08:58:55.394038921 +0000 UTC m=+7413.882605044" watchObservedRunningTime="2025-12-16 08:58:55.394367792 +0000 UTC m=+7413.882933905" Dec 16 08:58:58 crc kubenswrapper[4823]: I1216 08:58:58.402123 4823 generic.go:334] "Generic (PLEG): container finished" podID="d15835b7-f5d1-4b98-aeff-14eea3529691" containerID="e2ef74bd152894e3c34c3052ee9a50d3a4d09fc64d0cd690fa05f9c1d68b0731" 
exitCode=0 Dec 16 08:58:58 crc kubenswrapper[4823]: I1216 08:58:58.402223 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-52nj6" event={"ID":"d15835b7-f5d1-4b98-aeff-14eea3529691","Type":"ContainerDied","Data":"e2ef74bd152894e3c34c3052ee9a50d3a4d09fc64d0cd690fa05f9c1d68b0731"} Dec 16 08:58:59 crc kubenswrapper[4823]: I1216 08:58:59.874229 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-52nj6" Dec 16 08:58:59 crc kubenswrapper[4823]: I1216 08:58:59.948421 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-49vxh\" (UniqueName: \"kubernetes.io/projected/d15835b7-f5d1-4b98-aeff-14eea3529691-kube-api-access-49vxh\") pod \"d15835b7-f5d1-4b98-aeff-14eea3529691\" (UID: \"d15835b7-f5d1-4b98-aeff-14eea3529691\") " Dec 16 08:58:59 crc kubenswrapper[4823]: I1216 08:58:59.948595 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d15835b7-f5d1-4b98-aeff-14eea3529691-db-sync-config-data\") pod \"d15835b7-f5d1-4b98-aeff-14eea3529691\" (UID: \"d15835b7-f5d1-4b98-aeff-14eea3529691\") " Dec 16 08:58:59 crc kubenswrapper[4823]: I1216 08:58:59.948776 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d15835b7-f5d1-4b98-aeff-14eea3529691-config-data\") pod \"d15835b7-f5d1-4b98-aeff-14eea3529691\" (UID: \"d15835b7-f5d1-4b98-aeff-14eea3529691\") " Dec 16 08:58:59 crc kubenswrapper[4823]: I1216 08:58:59.948923 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d15835b7-f5d1-4b98-aeff-14eea3529691-combined-ca-bundle\") pod \"d15835b7-f5d1-4b98-aeff-14eea3529691\" (UID: \"d15835b7-f5d1-4b98-aeff-14eea3529691\") " Dec 16 08:58:59 crc kubenswrapper[4823]: I1216 08:58:59.954887 4823 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d15835b7-f5d1-4b98-aeff-14eea3529691-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "d15835b7-f5d1-4b98-aeff-14eea3529691" (UID: "d15835b7-f5d1-4b98-aeff-14eea3529691"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:58:59 crc kubenswrapper[4823]: I1216 08:58:59.956267 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d15835b7-f5d1-4b98-aeff-14eea3529691-kube-api-access-49vxh" (OuterVolumeSpecName: "kube-api-access-49vxh") pod "d15835b7-f5d1-4b98-aeff-14eea3529691" (UID: "d15835b7-f5d1-4b98-aeff-14eea3529691"). InnerVolumeSpecName "kube-api-access-49vxh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:58:59 crc kubenswrapper[4823]: I1216 08:58:59.989239 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d15835b7-f5d1-4b98-aeff-14eea3529691-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d15835b7-f5d1-4b98-aeff-14eea3529691" (UID: "d15835b7-f5d1-4b98-aeff-14eea3529691"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:59:00 crc kubenswrapper[4823]: I1216 08:59:00.008073 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d15835b7-f5d1-4b98-aeff-14eea3529691-config-data" (OuterVolumeSpecName: "config-data") pod "d15835b7-f5d1-4b98-aeff-14eea3529691" (UID: "d15835b7-f5d1-4b98-aeff-14eea3529691"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:59:00 crc kubenswrapper[4823]: I1216 08:59:00.050835 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d15835b7-f5d1-4b98-aeff-14eea3529691-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 08:59:00 crc kubenswrapper[4823]: I1216 08:59:00.050875 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-49vxh\" (UniqueName: \"kubernetes.io/projected/d15835b7-f5d1-4b98-aeff-14eea3529691-kube-api-access-49vxh\") on node \"crc\" DevicePath \"\"" Dec 16 08:59:00 crc kubenswrapper[4823]: I1216 08:59:00.050884 4823 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d15835b7-f5d1-4b98-aeff-14eea3529691-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 08:59:00 crc kubenswrapper[4823]: I1216 08:59:00.050896 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d15835b7-f5d1-4b98-aeff-14eea3529691-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 08:59:00 crc kubenswrapper[4823]: I1216 08:59:00.425413 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-52nj6" event={"ID":"d15835b7-f5d1-4b98-aeff-14eea3529691","Type":"ContainerDied","Data":"d03a48486e0f62c020cb035940cb986d2b2084622a159233a140e2bd30c9dab2"} Dec 16 08:59:00 crc kubenswrapper[4823]: I1216 08:59:00.425468 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d03a48486e0f62c020cb035940cb986d2b2084622a159233a140e2bd30c9dab2" Dec 16 08:59:00 crc kubenswrapper[4823]: I1216 08:59:00.425469 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-52nj6" Dec 16 08:59:00 crc kubenswrapper[4823]: I1216 08:59:00.803245 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 16 08:59:00 crc kubenswrapper[4823]: E1216 08:59:00.810262 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d15835b7-f5d1-4b98-aeff-14eea3529691" containerName="glance-db-sync" Dec 16 08:59:00 crc kubenswrapper[4823]: I1216 08:59:00.810322 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="d15835b7-f5d1-4b98-aeff-14eea3529691" containerName="glance-db-sync" Dec 16 08:59:00 crc kubenswrapper[4823]: I1216 08:59:00.810716 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="d15835b7-f5d1-4b98-aeff-14eea3529691" containerName="glance-db-sync" Dec 16 08:59:00 crc kubenswrapper[4823]: I1216 08:59:00.812495 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 16 08:59:00 crc kubenswrapper[4823]: I1216 08:59:00.816794 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 16 08:59:00 crc kubenswrapper[4823]: I1216 08:59:00.817390 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-kvvk8" Dec 16 08:59:00 crc kubenswrapper[4823]: I1216 08:59:00.825806 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Dec 16 08:59:00 crc kubenswrapper[4823]: I1216 08:59:00.851452 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 16 08:59:00 crc kubenswrapper[4823]: I1216 08:59:00.928561 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-98894d689-8clmb"] Dec 16 08:59:00 crc kubenswrapper[4823]: I1216 08:59:00.932517 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-98894d689-8clmb" Dec 16 08:59:00 crc kubenswrapper[4823]: I1216 08:59:00.944876 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-98894d689-8clmb"] Dec 16 08:59:00 crc kubenswrapper[4823]: I1216 08:59:00.983989 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/561df03c-9fa7-42d8-a071-e4972b688509-config-data\") pod \"glance-default-external-api-0\" (UID: \"561df03c-9fa7-42d8-a071-e4972b688509\") " pod="openstack/glance-default-external-api-0" Dec 16 08:59:00 crc kubenswrapper[4823]: I1216 08:59:00.984106 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/561df03c-9fa7-42d8-a071-e4972b688509-scripts\") pod \"glance-default-external-api-0\" (UID: \"561df03c-9fa7-42d8-a071-e4972b688509\") " pod="openstack/glance-default-external-api-0" Dec 16 08:59:00 crc kubenswrapper[4823]: I1216 08:59:00.985064 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/561df03c-9fa7-42d8-a071-e4972b688509-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"561df03c-9fa7-42d8-a071-e4972b688509\") " pod="openstack/glance-default-external-api-0" Dec 16 08:59:00 crc kubenswrapper[4823]: I1216 08:59:00.985336 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/561df03c-9fa7-42d8-a071-e4972b688509-logs\") pod \"glance-default-external-api-0\" (UID: \"561df03c-9fa7-42d8-a071-e4972b688509\") " pod="openstack/glance-default-external-api-0" Dec 16 08:59:00 crc kubenswrapper[4823]: I1216 08:59:00.985422 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-mdt4j\" (UniqueName: \"kubernetes.io/projected/561df03c-9fa7-42d8-a071-e4972b688509-kube-api-access-mdt4j\") pod \"glance-default-external-api-0\" (UID: \"561df03c-9fa7-42d8-a071-e4972b688509\") " pod="openstack/glance-default-external-api-0" Dec 16 08:59:00 crc kubenswrapper[4823]: I1216 08:59:00.985479 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/561df03c-9fa7-42d8-a071-e4972b688509-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"561df03c-9fa7-42d8-a071-e4972b688509\") " pod="openstack/glance-default-external-api-0" Dec 16 08:59:01 crc kubenswrapper[4823]: I1216 08:59:01.086782 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/561df03c-9fa7-42d8-a071-e4972b688509-logs\") pod \"glance-default-external-api-0\" (UID: \"561df03c-9fa7-42d8-a071-e4972b688509\") " pod="openstack/glance-default-external-api-0" Dec 16 08:59:01 crc kubenswrapper[4823]: I1216 08:59:01.086846 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdt4j\" (UniqueName: \"kubernetes.io/projected/561df03c-9fa7-42d8-a071-e4972b688509-kube-api-access-mdt4j\") pod \"glance-default-external-api-0\" (UID: \"561df03c-9fa7-42d8-a071-e4972b688509\") " pod="openstack/glance-default-external-api-0" Dec 16 08:59:01 crc kubenswrapper[4823]: I1216 08:59:01.086872 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmgzb\" (UniqueName: \"kubernetes.io/projected/4b6a99be-d903-4ce7-9832-6a085da5277e-kube-api-access-pmgzb\") pod \"dnsmasq-dns-98894d689-8clmb\" (UID: \"4b6a99be-d903-4ce7-9832-6a085da5277e\") " pod="openstack/dnsmasq-dns-98894d689-8clmb" Dec 16 08:59:01 crc kubenswrapper[4823]: I1216 08:59:01.086897 4823 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/561df03c-9fa7-42d8-a071-e4972b688509-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"561df03c-9fa7-42d8-a071-e4972b688509\") " pod="openstack/glance-default-external-api-0" Dec 16 08:59:01 crc kubenswrapper[4823]: I1216 08:59:01.086921 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b6a99be-d903-4ce7-9832-6a085da5277e-config\") pod \"dnsmasq-dns-98894d689-8clmb\" (UID: \"4b6a99be-d903-4ce7-9832-6a085da5277e\") " pod="openstack/dnsmasq-dns-98894d689-8clmb" Dec 16 08:59:01 crc kubenswrapper[4823]: I1216 08:59:01.086943 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4b6a99be-d903-4ce7-9832-6a085da5277e-dns-svc\") pod \"dnsmasq-dns-98894d689-8clmb\" (UID: \"4b6a99be-d903-4ce7-9832-6a085da5277e\") " pod="openstack/dnsmasq-dns-98894d689-8clmb" Dec 16 08:59:01 crc kubenswrapper[4823]: I1216 08:59:01.086979 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/561df03c-9fa7-42d8-a071-e4972b688509-config-data\") pod \"glance-default-external-api-0\" (UID: \"561df03c-9fa7-42d8-a071-e4972b688509\") " pod="openstack/glance-default-external-api-0" Dec 16 08:59:01 crc kubenswrapper[4823]: I1216 08:59:01.087000 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4b6a99be-d903-4ce7-9832-6a085da5277e-ovsdbserver-sb\") pod \"dnsmasq-dns-98894d689-8clmb\" (UID: \"4b6a99be-d903-4ce7-9832-6a085da5277e\") " pod="openstack/dnsmasq-dns-98894d689-8clmb" Dec 16 08:59:01 crc kubenswrapper[4823]: I1216 08:59:01.087044 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/561df03c-9fa7-42d8-a071-e4972b688509-scripts\") pod \"glance-default-external-api-0\" (UID: \"561df03c-9fa7-42d8-a071-e4972b688509\") " pod="openstack/glance-default-external-api-0" Dec 16 08:59:01 crc kubenswrapper[4823]: I1216 08:59:01.087080 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/561df03c-9fa7-42d8-a071-e4972b688509-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"561df03c-9fa7-42d8-a071-e4972b688509\") " pod="openstack/glance-default-external-api-0" Dec 16 08:59:01 crc kubenswrapper[4823]: I1216 08:59:01.087119 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4b6a99be-d903-4ce7-9832-6a085da5277e-ovsdbserver-nb\") pod \"dnsmasq-dns-98894d689-8clmb\" (UID: \"4b6a99be-d903-4ce7-9832-6a085da5277e\") " pod="openstack/dnsmasq-dns-98894d689-8clmb" Dec 16 08:59:01 crc kubenswrapper[4823]: I1216 08:59:01.088075 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/561df03c-9fa7-42d8-a071-e4972b688509-logs\") pod \"glance-default-external-api-0\" (UID: \"561df03c-9fa7-42d8-a071-e4972b688509\") " pod="openstack/glance-default-external-api-0" Dec 16 08:59:01 crc kubenswrapper[4823]: I1216 08:59:01.089758 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/561df03c-9fa7-42d8-a071-e4972b688509-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"561df03c-9fa7-42d8-a071-e4972b688509\") " pod="openstack/glance-default-external-api-0" Dec 16 08:59:01 crc kubenswrapper[4823]: I1216 08:59:01.094990 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/561df03c-9fa7-42d8-a071-e4972b688509-scripts\") pod 
\"glance-default-external-api-0\" (UID: \"561df03c-9fa7-42d8-a071-e4972b688509\") " pod="openstack/glance-default-external-api-0" Dec 16 08:59:01 crc kubenswrapper[4823]: I1216 08:59:01.096182 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/561df03c-9fa7-42d8-a071-e4972b688509-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"561df03c-9fa7-42d8-a071-e4972b688509\") " pod="openstack/glance-default-external-api-0" Dec 16 08:59:01 crc kubenswrapper[4823]: I1216 08:59:01.114634 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/561df03c-9fa7-42d8-a071-e4972b688509-config-data\") pod \"glance-default-external-api-0\" (UID: \"561df03c-9fa7-42d8-a071-e4972b688509\") " pod="openstack/glance-default-external-api-0" Dec 16 08:59:01 crc kubenswrapper[4823]: I1216 08:59:01.123758 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdt4j\" (UniqueName: \"kubernetes.io/projected/561df03c-9fa7-42d8-a071-e4972b688509-kube-api-access-mdt4j\") pod \"glance-default-external-api-0\" (UID: \"561df03c-9fa7-42d8-a071-e4972b688509\") " pod="openstack/glance-default-external-api-0" Dec 16 08:59:01 crc kubenswrapper[4823]: I1216 08:59:01.147250 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 16 08:59:01 crc kubenswrapper[4823]: I1216 08:59:01.167116 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 16 08:59:01 crc kubenswrapper[4823]: I1216 08:59:01.177574 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 16 08:59:01 crc kubenswrapper[4823]: I1216 08:59:01.181766 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 16 08:59:01 crc kubenswrapper[4823]: I1216 08:59:01.190360 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81379994-dabf-40db-b2fa-ad5e35e443e0-config-data\") pod \"glance-default-internal-api-0\" (UID: \"81379994-dabf-40db-b2fa-ad5e35e443e0\") " pod="openstack/glance-default-internal-api-0" Dec 16 08:59:01 crc kubenswrapper[4823]: I1216 08:59:01.190430 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmgzb\" (UniqueName: \"kubernetes.io/projected/4b6a99be-d903-4ce7-9832-6a085da5277e-kube-api-access-pmgzb\") pod \"dnsmasq-dns-98894d689-8clmb\" (UID: \"4b6a99be-d903-4ce7-9832-6a085da5277e\") " pod="openstack/dnsmasq-dns-98894d689-8clmb" Dec 16 08:59:01 crc kubenswrapper[4823]: I1216 08:59:01.190464 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptwft\" (UniqueName: \"kubernetes.io/projected/81379994-dabf-40db-b2fa-ad5e35e443e0-kube-api-access-ptwft\") pod \"glance-default-internal-api-0\" (UID: \"81379994-dabf-40db-b2fa-ad5e35e443e0\") " pod="openstack/glance-default-internal-api-0" Dec 16 08:59:01 crc kubenswrapper[4823]: I1216 08:59:01.190490 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b6a99be-d903-4ce7-9832-6a085da5277e-config\") pod \"dnsmasq-dns-98894d689-8clmb\" (UID: \"4b6a99be-d903-4ce7-9832-6a085da5277e\") " pod="openstack/dnsmasq-dns-98894d689-8clmb" Dec 16 08:59:01 crc kubenswrapper[4823]: I1216 08:59:01.190517 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4b6a99be-d903-4ce7-9832-6a085da5277e-dns-svc\") pod \"dnsmasq-dns-98894d689-8clmb\" (UID: \"4b6a99be-d903-4ce7-9832-6a085da5277e\") " pod="openstack/dnsmasq-dns-98894d689-8clmb" Dec 16 08:59:01 crc kubenswrapper[4823]: I1216 08:59:01.190571 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4b6a99be-d903-4ce7-9832-6a085da5277e-ovsdbserver-sb\") pod \"dnsmasq-dns-98894d689-8clmb\" (UID: \"4b6a99be-d903-4ce7-9832-6a085da5277e\") " pod="openstack/dnsmasq-dns-98894d689-8clmb" Dec 16 08:59:01 crc kubenswrapper[4823]: I1216 08:59:01.190607 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81379994-dabf-40db-b2fa-ad5e35e443e0-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"81379994-dabf-40db-b2fa-ad5e35e443e0\") " pod="openstack/glance-default-internal-api-0" Dec 16 08:59:01 crc kubenswrapper[4823]: I1216 08:59:01.190638 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/81379994-dabf-40db-b2fa-ad5e35e443e0-logs\") pod \"glance-default-internal-api-0\" (UID: \"81379994-dabf-40db-b2fa-ad5e35e443e0\") " pod="openstack/glance-default-internal-api-0" Dec 16 08:59:01 crc kubenswrapper[4823]: I1216 08:59:01.190673 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/81379994-dabf-40db-b2fa-ad5e35e443e0-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"81379994-dabf-40db-b2fa-ad5e35e443e0\") " pod="openstack/glance-default-internal-api-0" Dec 16 08:59:01 crc kubenswrapper[4823]: I1216 08:59:01.190727 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/81379994-dabf-40db-b2fa-ad5e35e443e0-scripts\") pod \"glance-default-internal-api-0\" (UID: \"81379994-dabf-40db-b2fa-ad5e35e443e0\") " pod="openstack/glance-default-internal-api-0" Dec 16 08:59:01 crc kubenswrapper[4823]: I1216 08:59:01.190761 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4b6a99be-d903-4ce7-9832-6a085da5277e-ovsdbserver-nb\") pod \"dnsmasq-dns-98894d689-8clmb\" (UID: \"4b6a99be-d903-4ce7-9832-6a085da5277e\") " pod="openstack/dnsmasq-dns-98894d689-8clmb" Dec 16 08:59:01 crc kubenswrapper[4823]: I1216 08:59:01.191905 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4b6a99be-d903-4ce7-9832-6a085da5277e-ovsdbserver-nb\") pod \"dnsmasq-dns-98894d689-8clmb\" (UID: \"4b6a99be-d903-4ce7-9832-6a085da5277e\") " pod="openstack/dnsmasq-dns-98894d689-8clmb" Dec 16 08:59:01 crc kubenswrapper[4823]: I1216 08:59:01.193113 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4b6a99be-d903-4ce7-9832-6a085da5277e-ovsdbserver-sb\") pod \"dnsmasq-dns-98894d689-8clmb\" (UID: \"4b6a99be-d903-4ce7-9832-6a085da5277e\") " pod="openstack/dnsmasq-dns-98894d689-8clmb" Dec 16 08:59:01 crc kubenswrapper[4823]: I1216 08:59:01.193652 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b6a99be-d903-4ce7-9832-6a085da5277e-config\") pod \"dnsmasq-dns-98894d689-8clmb\" (UID: \"4b6a99be-d903-4ce7-9832-6a085da5277e\") " pod="openstack/dnsmasq-dns-98894d689-8clmb" Dec 16 08:59:01 crc kubenswrapper[4823]: I1216 08:59:01.193783 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4b6a99be-d903-4ce7-9832-6a085da5277e-dns-svc\") pod \"dnsmasq-dns-98894d689-8clmb\" 
(UID: \"4b6a99be-d903-4ce7-9832-6a085da5277e\") " pod="openstack/dnsmasq-dns-98894d689-8clmb" Dec 16 08:59:01 crc kubenswrapper[4823]: I1216 08:59:01.223295 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 16 08:59:01 crc kubenswrapper[4823]: I1216 08:59:01.231624 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmgzb\" (UniqueName: \"kubernetes.io/projected/4b6a99be-d903-4ce7-9832-6a085da5277e-kube-api-access-pmgzb\") pod \"dnsmasq-dns-98894d689-8clmb\" (UID: \"4b6a99be-d903-4ce7-9832-6a085da5277e\") " pod="openstack/dnsmasq-dns-98894d689-8clmb" Dec 16 08:59:01 crc kubenswrapper[4823]: I1216 08:59:01.287009 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-98894d689-8clmb" Dec 16 08:59:01 crc kubenswrapper[4823]: I1216 08:59:01.292381 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ptwft\" (UniqueName: \"kubernetes.io/projected/81379994-dabf-40db-b2fa-ad5e35e443e0-kube-api-access-ptwft\") pod \"glance-default-internal-api-0\" (UID: \"81379994-dabf-40db-b2fa-ad5e35e443e0\") " pod="openstack/glance-default-internal-api-0" Dec 16 08:59:01 crc kubenswrapper[4823]: I1216 08:59:01.292527 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81379994-dabf-40db-b2fa-ad5e35e443e0-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"81379994-dabf-40db-b2fa-ad5e35e443e0\") " pod="openstack/glance-default-internal-api-0" Dec 16 08:59:01 crc kubenswrapper[4823]: I1216 08:59:01.292564 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/81379994-dabf-40db-b2fa-ad5e35e443e0-logs\") pod \"glance-default-internal-api-0\" (UID: \"81379994-dabf-40db-b2fa-ad5e35e443e0\") " 
pod="openstack/glance-default-internal-api-0" Dec 16 08:59:01 crc kubenswrapper[4823]: I1216 08:59:01.292608 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/81379994-dabf-40db-b2fa-ad5e35e443e0-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"81379994-dabf-40db-b2fa-ad5e35e443e0\") " pod="openstack/glance-default-internal-api-0" Dec 16 08:59:01 crc kubenswrapper[4823]: I1216 08:59:01.292668 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/81379994-dabf-40db-b2fa-ad5e35e443e0-scripts\") pod \"glance-default-internal-api-0\" (UID: \"81379994-dabf-40db-b2fa-ad5e35e443e0\") " pod="openstack/glance-default-internal-api-0" Dec 16 08:59:01 crc kubenswrapper[4823]: I1216 08:59:01.292738 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81379994-dabf-40db-b2fa-ad5e35e443e0-config-data\") pod \"glance-default-internal-api-0\" (UID: \"81379994-dabf-40db-b2fa-ad5e35e443e0\") " pod="openstack/glance-default-internal-api-0" Dec 16 08:59:01 crc kubenswrapper[4823]: I1216 08:59:01.293854 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/81379994-dabf-40db-b2fa-ad5e35e443e0-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"81379994-dabf-40db-b2fa-ad5e35e443e0\") " pod="openstack/glance-default-internal-api-0" Dec 16 08:59:01 crc kubenswrapper[4823]: I1216 08:59:01.294319 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/81379994-dabf-40db-b2fa-ad5e35e443e0-logs\") pod \"glance-default-internal-api-0\" (UID: \"81379994-dabf-40db-b2fa-ad5e35e443e0\") " pod="openstack/glance-default-internal-api-0" Dec 16 08:59:01 crc kubenswrapper[4823]: I1216 08:59:01.301611 4823 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81379994-dabf-40db-b2fa-ad5e35e443e0-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"81379994-dabf-40db-b2fa-ad5e35e443e0\") " pod="openstack/glance-default-internal-api-0" Dec 16 08:59:01 crc kubenswrapper[4823]: I1216 08:59:01.315730 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81379994-dabf-40db-b2fa-ad5e35e443e0-config-data\") pod \"glance-default-internal-api-0\" (UID: \"81379994-dabf-40db-b2fa-ad5e35e443e0\") " pod="openstack/glance-default-internal-api-0" Dec 16 08:59:01 crc kubenswrapper[4823]: I1216 08:59:01.316772 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptwft\" (UniqueName: \"kubernetes.io/projected/81379994-dabf-40db-b2fa-ad5e35e443e0-kube-api-access-ptwft\") pod \"glance-default-internal-api-0\" (UID: \"81379994-dabf-40db-b2fa-ad5e35e443e0\") " pod="openstack/glance-default-internal-api-0" Dec 16 08:59:01 crc kubenswrapper[4823]: I1216 08:59:01.339210 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/81379994-dabf-40db-b2fa-ad5e35e443e0-scripts\") pod \"glance-default-internal-api-0\" (UID: \"81379994-dabf-40db-b2fa-ad5e35e443e0\") " pod="openstack/glance-default-internal-api-0" Dec 16 08:59:01 crc kubenswrapper[4823]: I1216 08:59:01.613007 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 16 08:59:01 crc kubenswrapper[4823]: I1216 08:59:01.876905 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 16 08:59:01 crc kubenswrapper[4823]: I1216 08:59:01.912654 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-98894d689-8clmb"] Dec 16 08:59:02 crc kubenswrapper[4823]: I1216 08:59:02.293200 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 16 08:59:02 crc kubenswrapper[4823]: I1216 08:59:02.301917 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 16 08:59:02 crc kubenswrapper[4823]: W1216 08:59:02.333254 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod81379994_dabf_40db_b2fa_ad5e35e443e0.slice/crio-91c910172b3d33928d9d57f46fa3ece3c8ac46de682513ce3fa677b297bb5494 WatchSource:0}: Error finding container 91c910172b3d33928d9d57f46fa3ece3c8ac46de682513ce3fa677b297bb5494: Status 404 returned error can't find the container with id 91c910172b3d33928d9d57f46fa3ece3c8ac46de682513ce3fa677b297bb5494 Dec 16 08:59:02 crc kubenswrapper[4823]: I1216 08:59:02.470064 4823 generic.go:334] "Generic (PLEG): container finished" podID="4b6a99be-d903-4ce7-9832-6a085da5277e" containerID="f6c541da306bf170f745b7e97ee82095ec9cb19fd75fca5f56caad85ef0cc10c" exitCode=0 Dec 16 08:59:02 crc kubenswrapper[4823]: I1216 08:59:02.470184 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-98894d689-8clmb" event={"ID":"4b6a99be-d903-4ce7-9832-6a085da5277e","Type":"ContainerDied","Data":"f6c541da306bf170f745b7e97ee82095ec9cb19fd75fca5f56caad85ef0cc10c"} Dec 16 08:59:02 crc kubenswrapper[4823]: I1216 08:59:02.470259 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-98894d689-8clmb" 
event={"ID":"4b6a99be-d903-4ce7-9832-6a085da5277e","Type":"ContainerStarted","Data":"160a60b436a004b1393bf0b5f79e01488966598514f641ca2a602b1440de9233"} Dec 16 08:59:02 crc kubenswrapper[4823]: I1216 08:59:02.472730 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"81379994-dabf-40db-b2fa-ad5e35e443e0","Type":"ContainerStarted","Data":"91c910172b3d33928d9d57f46fa3ece3c8ac46de682513ce3fa677b297bb5494"} Dec 16 08:59:02 crc kubenswrapper[4823]: I1216 08:59:02.476270 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"561df03c-9fa7-42d8-a071-e4972b688509","Type":"ContainerStarted","Data":"422ed5fdf0b595921f061ae4052ab3d858d5a35d0d6b920223679aaf16d25cd1"} Dec 16 08:59:03 crc kubenswrapper[4823]: I1216 08:59:03.505471 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"561df03c-9fa7-42d8-a071-e4972b688509","Type":"ContainerStarted","Data":"4fb982a19d823df914719b0290e62307ba0da136eda1b0a8b3cc4b3b3efd0e37"} Dec 16 08:59:03 crc kubenswrapper[4823]: I1216 08:59:03.506012 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"561df03c-9fa7-42d8-a071-e4972b688509","Type":"ContainerStarted","Data":"71ab826d41ebe4dfeb45d0a33dcd24aa010f14b1210c8d22c4e730afe9904cce"} Dec 16 08:59:03 crc kubenswrapper[4823]: I1216 08:59:03.506349 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="561df03c-9fa7-42d8-a071-e4972b688509" containerName="glance-log" containerID="cri-o://71ab826d41ebe4dfeb45d0a33dcd24aa010f14b1210c8d22c4e730afe9904cce" gracePeriod=30 Dec 16 08:59:03 crc kubenswrapper[4823]: I1216 08:59:03.506767 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" 
podUID="561df03c-9fa7-42d8-a071-e4972b688509" containerName="glance-httpd" containerID="cri-o://4fb982a19d823df914719b0290e62307ba0da136eda1b0a8b3cc4b3b3efd0e37" gracePeriod=30 Dec 16 08:59:03 crc kubenswrapper[4823]: I1216 08:59:03.515490 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-98894d689-8clmb" event={"ID":"4b6a99be-d903-4ce7-9832-6a085da5277e","Type":"ContainerStarted","Data":"537d03da510a65be1e8420aea5d88eef7274799ab03e9f947a722efe4d26528b"} Dec 16 08:59:03 crc kubenswrapper[4823]: I1216 08:59:03.516618 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-98894d689-8clmb" Dec 16 08:59:03 crc kubenswrapper[4823]: I1216 08:59:03.522442 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"81379994-dabf-40db-b2fa-ad5e35e443e0","Type":"ContainerStarted","Data":"1b774abf0f7d0746a1cc108cb030ace25512338014e0334bf7e97f3e23e77dfc"} Dec 16 08:59:03 crc kubenswrapper[4823]: I1216 08:59:03.547840 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.547817465 podStartE2EDuration="3.547817465s" podCreationTimestamp="2025-12-16 08:59:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 08:59:03.541638582 +0000 UTC m=+7422.030204715" watchObservedRunningTime="2025-12-16 08:59:03.547817465 +0000 UTC m=+7422.036383588" Dec 16 08:59:03 crc kubenswrapper[4823]: I1216 08:59:03.582327 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-98894d689-8clmb" podStartSLOduration=3.582302125 podStartE2EDuration="3.582302125s" podCreationTimestamp="2025-12-16 08:59:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 
08:59:03.579974091 +0000 UTC m=+7422.068540214" watchObservedRunningTime="2025-12-16 08:59:03.582302125 +0000 UTC m=+7422.070868248" Dec 16 08:59:03 crc kubenswrapper[4823]: I1216 08:59:03.699264 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 16 08:59:04 crc kubenswrapper[4823]: I1216 08:59:04.257214 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 16 08:59:04 crc kubenswrapper[4823]: I1216 08:59:04.389705 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/561df03c-9fa7-42d8-a071-e4972b688509-combined-ca-bundle\") pod \"561df03c-9fa7-42d8-a071-e4972b688509\" (UID: \"561df03c-9fa7-42d8-a071-e4972b688509\") " Dec 16 08:59:04 crc kubenswrapper[4823]: I1216 08:59:04.389775 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/561df03c-9fa7-42d8-a071-e4972b688509-config-data\") pod \"561df03c-9fa7-42d8-a071-e4972b688509\" (UID: \"561df03c-9fa7-42d8-a071-e4972b688509\") " Dec 16 08:59:04 crc kubenswrapper[4823]: I1216 08:59:04.389861 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/561df03c-9fa7-42d8-a071-e4972b688509-httpd-run\") pod \"561df03c-9fa7-42d8-a071-e4972b688509\" (UID: \"561df03c-9fa7-42d8-a071-e4972b688509\") " Dec 16 08:59:04 crc kubenswrapper[4823]: I1216 08:59:04.389972 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mdt4j\" (UniqueName: \"kubernetes.io/projected/561df03c-9fa7-42d8-a071-e4972b688509-kube-api-access-mdt4j\") pod \"561df03c-9fa7-42d8-a071-e4972b688509\" (UID: \"561df03c-9fa7-42d8-a071-e4972b688509\") " Dec 16 08:59:04 crc kubenswrapper[4823]: I1216 08:59:04.390054 4823 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/561df03c-9fa7-42d8-a071-e4972b688509-logs\") pod \"561df03c-9fa7-42d8-a071-e4972b688509\" (UID: \"561df03c-9fa7-42d8-a071-e4972b688509\") " Dec 16 08:59:04 crc kubenswrapper[4823]: I1216 08:59:04.390088 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/561df03c-9fa7-42d8-a071-e4972b688509-scripts\") pod \"561df03c-9fa7-42d8-a071-e4972b688509\" (UID: \"561df03c-9fa7-42d8-a071-e4972b688509\") " Dec 16 08:59:04 crc kubenswrapper[4823]: I1216 08:59:04.390624 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/561df03c-9fa7-42d8-a071-e4972b688509-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "561df03c-9fa7-42d8-a071-e4972b688509" (UID: "561df03c-9fa7-42d8-a071-e4972b688509"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 08:59:04 crc kubenswrapper[4823]: I1216 08:59:04.390884 4823 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/561df03c-9fa7-42d8-a071-e4972b688509-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 16 08:59:04 crc kubenswrapper[4823]: I1216 08:59:04.391283 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/561df03c-9fa7-42d8-a071-e4972b688509-logs" (OuterVolumeSpecName: "logs") pod "561df03c-9fa7-42d8-a071-e4972b688509" (UID: "561df03c-9fa7-42d8-a071-e4972b688509"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 08:59:04 crc kubenswrapper[4823]: I1216 08:59:04.396917 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/561df03c-9fa7-42d8-a071-e4972b688509-kube-api-access-mdt4j" (OuterVolumeSpecName: "kube-api-access-mdt4j") pod "561df03c-9fa7-42d8-a071-e4972b688509" (UID: "561df03c-9fa7-42d8-a071-e4972b688509"). InnerVolumeSpecName "kube-api-access-mdt4j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:59:04 crc kubenswrapper[4823]: I1216 08:59:04.399367 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/561df03c-9fa7-42d8-a071-e4972b688509-scripts" (OuterVolumeSpecName: "scripts") pod "561df03c-9fa7-42d8-a071-e4972b688509" (UID: "561df03c-9fa7-42d8-a071-e4972b688509"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:59:04 crc kubenswrapper[4823]: I1216 08:59:04.422247 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/561df03c-9fa7-42d8-a071-e4972b688509-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "561df03c-9fa7-42d8-a071-e4972b688509" (UID: "561df03c-9fa7-42d8-a071-e4972b688509"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:59:04 crc kubenswrapper[4823]: I1216 08:59:04.447253 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/561df03c-9fa7-42d8-a071-e4972b688509-config-data" (OuterVolumeSpecName: "config-data") pod "561df03c-9fa7-42d8-a071-e4972b688509" (UID: "561df03c-9fa7-42d8-a071-e4972b688509"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:59:04 crc kubenswrapper[4823]: I1216 08:59:04.493961 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mdt4j\" (UniqueName: \"kubernetes.io/projected/561df03c-9fa7-42d8-a071-e4972b688509-kube-api-access-mdt4j\") on node \"crc\" DevicePath \"\"" Dec 16 08:59:04 crc kubenswrapper[4823]: I1216 08:59:04.494015 4823 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/561df03c-9fa7-42d8-a071-e4972b688509-logs\") on node \"crc\" DevicePath \"\"" Dec 16 08:59:04 crc kubenswrapper[4823]: I1216 08:59:04.494053 4823 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/561df03c-9fa7-42d8-a071-e4972b688509-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 08:59:04 crc kubenswrapper[4823]: I1216 08:59:04.494070 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/561df03c-9fa7-42d8-a071-e4972b688509-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 08:59:04 crc kubenswrapper[4823]: I1216 08:59:04.494081 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/561df03c-9fa7-42d8-a071-e4972b688509-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 08:59:04 crc kubenswrapper[4823]: I1216 08:59:04.535572 4823 generic.go:334] "Generic (PLEG): container finished" podID="561df03c-9fa7-42d8-a071-e4972b688509" containerID="4fb982a19d823df914719b0290e62307ba0da136eda1b0a8b3cc4b3b3efd0e37" exitCode=143 Dec 16 08:59:04 crc kubenswrapper[4823]: I1216 08:59:04.535605 4823 generic.go:334] "Generic (PLEG): container finished" podID="561df03c-9fa7-42d8-a071-e4972b688509" containerID="71ab826d41ebe4dfeb45d0a33dcd24aa010f14b1210c8d22c4e730afe9904cce" exitCode=143 Dec 16 08:59:04 crc kubenswrapper[4823]: I1216 08:59:04.535661 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-external-api-0" event={"ID":"561df03c-9fa7-42d8-a071-e4972b688509","Type":"ContainerDied","Data":"4fb982a19d823df914719b0290e62307ba0da136eda1b0a8b3cc4b3b3efd0e37"} Dec 16 08:59:04 crc kubenswrapper[4823]: I1216 08:59:04.535693 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"561df03c-9fa7-42d8-a071-e4972b688509","Type":"ContainerDied","Data":"71ab826d41ebe4dfeb45d0a33dcd24aa010f14b1210c8d22c4e730afe9904cce"} Dec 16 08:59:04 crc kubenswrapper[4823]: I1216 08:59:04.535703 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"561df03c-9fa7-42d8-a071-e4972b688509","Type":"ContainerDied","Data":"422ed5fdf0b595921f061ae4052ab3d858d5a35d0d6b920223679aaf16d25cd1"} Dec 16 08:59:04 crc kubenswrapper[4823]: I1216 08:59:04.535703 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 16 08:59:04 crc kubenswrapper[4823]: I1216 08:59:04.535772 4823 scope.go:117] "RemoveContainer" containerID="4fb982a19d823df914719b0290e62307ba0da136eda1b0a8b3cc4b3b3efd0e37" Dec 16 08:59:04 crc kubenswrapper[4823]: I1216 08:59:04.542465 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="81379994-dabf-40db-b2fa-ad5e35e443e0" containerName="glance-log" containerID="cri-o://1b774abf0f7d0746a1cc108cb030ace25512338014e0334bf7e97f3e23e77dfc" gracePeriod=30 Dec 16 08:59:04 crc kubenswrapper[4823]: I1216 08:59:04.542718 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"81379994-dabf-40db-b2fa-ad5e35e443e0","Type":"ContainerStarted","Data":"4b58bec35aeca38d4840551a8e504bbd8fa6ccb64c588af17847059c35cd36da"} Dec 16 08:59:04 crc kubenswrapper[4823]: I1216 08:59:04.542828 4823 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openstack/glance-default-internal-api-0" podUID="81379994-dabf-40db-b2fa-ad5e35e443e0" containerName="glance-httpd" containerID="cri-o://4b58bec35aeca38d4840551a8e504bbd8fa6ccb64c588af17847059c35cd36da" gracePeriod=30 Dec 16 08:59:04 crc kubenswrapper[4823]: I1216 08:59:04.574812 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.574792224 podStartE2EDuration="3.574792224s" podCreationTimestamp="2025-12-16 08:59:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 08:59:04.563965356 +0000 UTC m=+7423.052531499" watchObservedRunningTime="2025-12-16 08:59:04.574792224 +0000 UTC m=+7423.063358347" Dec 16 08:59:04 crc kubenswrapper[4823]: I1216 08:59:04.581395 4823 scope.go:117] "RemoveContainer" containerID="71ab826d41ebe4dfeb45d0a33dcd24aa010f14b1210c8d22c4e730afe9904cce" Dec 16 08:59:04 crc kubenswrapper[4823]: I1216 08:59:04.593341 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 16 08:59:04 crc kubenswrapper[4823]: I1216 08:59:04.609100 4823 scope.go:117] "RemoveContainer" containerID="4fb982a19d823df914719b0290e62307ba0da136eda1b0a8b3cc4b3b3efd0e37" Dec 16 08:59:04 crc kubenswrapper[4823]: E1216 08:59:04.610169 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4fb982a19d823df914719b0290e62307ba0da136eda1b0a8b3cc4b3b3efd0e37\": container with ID starting with 4fb982a19d823df914719b0290e62307ba0da136eda1b0a8b3cc4b3b3efd0e37 not found: ID does not exist" containerID="4fb982a19d823df914719b0290e62307ba0da136eda1b0a8b3cc4b3b3efd0e37" Dec 16 08:59:04 crc kubenswrapper[4823]: I1216 08:59:04.610311 4823 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"4fb982a19d823df914719b0290e62307ba0da136eda1b0a8b3cc4b3b3efd0e37"} err="failed to get container status \"4fb982a19d823df914719b0290e62307ba0da136eda1b0a8b3cc4b3b3efd0e37\": rpc error: code = NotFound desc = could not find container \"4fb982a19d823df914719b0290e62307ba0da136eda1b0a8b3cc4b3b3efd0e37\": container with ID starting with 4fb982a19d823df914719b0290e62307ba0da136eda1b0a8b3cc4b3b3efd0e37 not found: ID does not exist" Dec 16 08:59:04 crc kubenswrapper[4823]: I1216 08:59:04.610407 4823 scope.go:117] "RemoveContainer" containerID="71ab826d41ebe4dfeb45d0a33dcd24aa010f14b1210c8d22c4e730afe9904cce" Dec 16 08:59:04 crc kubenswrapper[4823]: E1216 08:59:04.611333 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71ab826d41ebe4dfeb45d0a33dcd24aa010f14b1210c8d22c4e730afe9904cce\": container with ID starting with 71ab826d41ebe4dfeb45d0a33dcd24aa010f14b1210c8d22c4e730afe9904cce not found: ID does not exist" containerID="71ab826d41ebe4dfeb45d0a33dcd24aa010f14b1210c8d22c4e730afe9904cce" Dec 16 08:59:04 crc kubenswrapper[4823]: I1216 08:59:04.611590 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71ab826d41ebe4dfeb45d0a33dcd24aa010f14b1210c8d22c4e730afe9904cce"} err="failed to get container status \"71ab826d41ebe4dfeb45d0a33dcd24aa010f14b1210c8d22c4e730afe9904cce\": rpc error: code = NotFound desc = could not find container \"71ab826d41ebe4dfeb45d0a33dcd24aa010f14b1210c8d22c4e730afe9904cce\": container with ID starting with 71ab826d41ebe4dfeb45d0a33dcd24aa010f14b1210c8d22c4e730afe9904cce not found: ID does not exist" Dec 16 08:59:04 crc kubenswrapper[4823]: I1216 08:59:04.611659 4823 scope.go:117] "RemoveContainer" containerID="4fb982a19d823df914719b0290e62307ba0da136eda1b0a8b3cc4b3b3efd0e37" Dec 16 08:59:04 crc kubenswrapper[4823]: I1216 08:59:04.611756 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/glance-default-external-api-0"] Dec 16 08:59:04 crc kubenswrapper[4823]: I1216 08:59:04.613452 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4fb982a19d823df914719b0290e62307ba0da136eda1b0a8b3cc4b3b3efd0e37"} err="failed to get container status \"4fb982a19d823df914719b0290e62307ba0da136eda1b0a8b3cc4b3b3efd0e37\": rpc error: code = NotFound desc = could not find container \"4fb982a19d823df914719b0290e62307ba0da136eda1b0a8b3cc4b3b3efd0e37\": container with ID starting with 4fb982a19d823df914719b0290e62307ba0da136eda1b0a8b3cc4b3b3efd0e37 not found: ID does not exist" Dec 16 08:59:04 crc kubenswrapper[4823]: I1216 08:59:04.613560 4823 scope.go:117] "RemoveContainer" containerID="71ab826d41ebe4dfeb45d0a33dcd24aa010f14b1210c8d22c4e730afe9904cce" Dec 16 08:59:04 crc kubenswrapper[4823]: I1216 08:59:04.619215 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71ab826d41ebe4dfeb45d0a33dcd24aa010f14b1210c8d22c4e730afe9904cce"} err="failed to get container status \"71ab826d41ebe4dfeb45d0a33dcd24aa010f14b1210c8d22c4e730afe9904cce\": rpc error: code = NotFound desc = could not find container \"71ab826d41ebe4dfeb45d0a33dcd24aa010f14b1210c8d22c4e730afe9904cce\": container with ID starting with 71ab826d41ebe4dfeb45d0a33dcd24aa010f14b1210c8d22c4e730afe9904cce not found: ID does not exist" Dec 16 08:59:04 crc kubenswrapper[4823]: I1216 08:59:04.631145 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 16 08:59:04 crc kubenswrapper[4823]: E1216 08:59:04.631558 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="561df03c-9fa7-42d8-a071-e4972b688509" containerName="glance-httpd" Dec 16 08:59:04 crc kubenswrapper[4823]: I1216 08:59:04.631578 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="561df03c-9fa7-42d8-a071-e4972b688509" containerName="glance-httpd" Dec 16 08:59:04 crc 
kubenswrapper[4823]: E1216 08:59:04.631592 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="561df03c-9fa7-42d8-a071-e4972b688509" containerName="glance-log" Dec 16 08:59:04 crc kubenswrapper[4823]: I1216 08:59:04.631599 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="561df03c-9fa7-42d8-a071-e4972b688509" containerName="glance-log" Dec 16 08:59:04 crc kubenswrapper[4823]: I1216 08:59:04.631836 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="561df03c-9fa7-42d8-a071-e4972b688509" containerName="glance-log" Dec 16 08:59:04 crc kubenswrapper[4823]: I1216 08:59:04.631863 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="561df03c-9fa7-42d8-a071-e4972b688509" containerName="glance-httpd" Dec 16 08:59:04 crc kubenswrapper[4823]: I1216 08:59:04.632854 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 16 08:59:04 crc kubenswrapper[4823]: I1216 08:59:04.635371 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 16 08:59:04 crc kubenswrapper[4823]: I1216 08:59:04.635649 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 16 08:59:04 crc kubenswrapper[4823]: I1216 08:59:04.674642 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 16 08:59:04 crc kubenswrapper[4823]: I1216 08:59:04.805038 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7cd6f222-5425-4965-ae37-6225b9a87af0-logs\") pod \"glance-default-external-api-0\" (UID: \"7cd6f222-5425-4965-ae37-6225b9a87af0\") " pod="openstack/glance-default-external-api-0" Dec 16 08:59:04 crc kubenswrapper[4823]: I1216 08:59:04.805158 4823 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7c7z\" (UniqueName: \"kubernetes.io/projected/7cd6f222-5425-4965-ae37-6225b9a87af0-kube-api-access-n7c7z\") pod \"glance-default-external-api-0\" (UID: \"7cd6f222-5425-4965-ae37-6225b9a87af0\") " pod="openstack/glance-default-external-api-0" Dec 16 08:59:04 crc kubenswrapper[4823]: I1216 08:59:04.805234 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7cd6f222-5425-4965-ae37-6225b9a87af0-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7cd6f222-5425-4965-ae37-6225b9a87af0\") " pod="openstack/glance-default-external-api-0" Dec 16 08:59:04 crc kubenswrapper[4823]: I1216 08:59:04.805262 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7cd6f222-5425-4965-ae37-6225b9a87af0-scripts\") pod \"glance-default-external-api-0\" (UID: \"7cd6f222-5425-4965-ae37-6225b9a87af0\") " pod="openstack/glance-default-external-api-0" Dec 16 08:59:04 crc kubenswrapper[4823]: I1216 08:59:04.805311 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cd6f222-5425-4965-ae37-6225b9a87af0-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7cd6f222-5425-4965-ae37-6225b9a87af0\") " pod="openstack/glance-default-external-api-0" Dec 16 08:59:04 crc kubenswrapper[4823]: I1216 08:59:04.805341 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cd6f222-5425-4965-ae37-6225b9a87af0-config-data\") pod \"glance-default-external-api-0\" (UID: \"7cd6f222-5425-4965-ae37-6225b9a87af0\") " pod="openstack/glance-default-external-api-0" Dec 16 08:59:04 crc kubenswrapper[4823]: I1216 
08:59:04.805391 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7cd6f222-5425-4965-ae37-6225b9a87af0-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"7cd6f222-5425-4965-ae37-6225b9a87af0\") " pod="openstack/glance-default-external-api-0" Dec 16 08:59:04 crc kubenswrapper[4823]: I1216 08:59:04.907823 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7cd6f222-5425-4965-ae37-6225b9a87af0-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7cd6f222-5425-4965-ae37-6225b9a87af0\") " pod="openstack/glance-default-external-api-0" Dec 16 08:59:04 crc kubenswrapper[4823]: I1216 08:59:04.908453 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7cd6f222-5425-4965-ae37-6225b9a87af0-scripts\") pod \"glance-default-external-api-0\" (UID: \"7cd6f222-5425-4965-ae37-6225b9a87af0\") " pod="openstack/glance-default-external-api-0" Dec 16 08:59:04 crc kubenswrapper[4823]: I1216 08:59:04.908501 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cd6f222-5425-4965-ae37-6225b9a87af0-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7cd6f222-5425-4965-ae37-6225b9a87af0\") " pod="openstack/glance-default-external-api-0" Dec 16 08:59:04 crc kubenswrapper[4823]: I1216 08:59:04.908527 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cd6f222-5425-4965-ae37-6225b9a87af0-config-data\") pod \"glance-default-external-api-0\" (UID: \"7cd6f222-5425-4965-ae37-6225b9a87af0\") " pod="openstack/glance-default-external-api-0" Dec 16 08:59:04 crc kubenswrapper[4823]: I1216 08:59:04.908565 4823 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7cd6f222-5425-4965-ae37-6225b9a87af0-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"7cd6f222-5425-4965-ae37-6225b9a87af0\") " pod="openstack/glance-default-external-api-0" Dec 16 08:59:04 crc kubenswrapper[4823]: I1216 08:59:04.908634 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7cd6f222-5425-4965-ae37-6225b9a87af0-logs\") pod \"glance-default-external-api-0\" (UID: \"7cd6f222-5425-4965-ae37-6225b9a87af0\") " pod="openstack/glance-default-external-api-0" Dec 16 08:59:04 crc kubenswrapper[4823]: I1216 08:59:04.908666 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7c7z\" (UniqueName: \"kubernetes.io/projected/7cd6f222-5425-4965-ae37-6225b9a87af0-kube-api-access-n7c7z\") pod \"glance-default-external-api-0\" (UID: \"7cd6f222-5425-4965-ae37-6225b9a87af0\") " pod="openstack/glance-default-external-api-0" Dec 16 08:59:04 crc kubenswrapper[4823]: I1216 08:59:04.908411 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7cd6f222-5425-4965-ae37-6225b9a87af0-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7cd6f222-5425-4965-ae37-6225b9a87af0\") " pod="openstack/glance-default-external-api-0" Dec 16 08:59:04 crc kubenswrapper[4823]: I1216 08:59:04.910015 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7cd6f222-5425-4965-ae37-6225b9a87af0-logs\") pod \"glance-default-external-api-0\" (UID: \"7cd6f222-5425-4965-ae37-6225b9a87af0\") " pod="openstack/glance-default-external-api-0" Dec 16 08:59:04 crc kubenswrapper[4823]: I1216 08:59:04.913982 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/7cd6f222-5425-4965-ae37-6225b9a87af0-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"7cd6f222-5425-4965-ae37-6225b9a87af0\") " pod="openstack/glance-default-external-api-0" Dec 16 08:59:04 crc kubenswrapper[4823]: I1216 08:59:04.914955 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cd6f222-5425-4965-ae37-6225b9a87af0-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7cd6f222-5425-4965-ae37-6225b9a87af0\") " pod="openstack/glance-default-external-api-0" Dec 16 08:59:04 crc kubenswrapper[4823]: I1216 08:59:04.915584 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cd6f222-5425-4965-ae37-6225b9a87af0-config-data\") pod \"glance-default-external-api-0\" (UID: \"7cd6f222-5425-4965-ae37-6225b9a87af0\") " pod="openstack/glance-default-external-api-0" Dec 16 08:59:04 crc kubenswrapper[4823]: I1216 08:59:04.923542 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7cd6f222-5425-4965-ae37-6225b9a87af0-scripts\") pod \"glance-default-external-api-0\" (UID: \"7cd6f222-5425-4965-ae37-6225b9a87af0\") " pod="openstack/glance-default-external-api-0" Dec 16 08:59:04 crc kubenswrapper[4823]: I1216 08:59:04.927845 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7c7z\" (UniqueName: \"kubernetes.io/projected/7cd6f222-5425-4965-ae37-6225b9a87af0-kube-api-access-n7c7z\") pod \"glance-default-external-api-0\" (UID: \"7cd6f222-5425-4965-ae37-6225b9a87af0\") " pod="openstack/glance-default-external-api-0" Dec 16 08:59:04 crc kubenswrapper[4823]: I1216 08:59:04.958613 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 16 08:59:05 crc kubenswrapper[4823]: I1216 08:59:05.146420 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 16 08:59:05 crc kubenswrapper[4823]: I1216 08:59:05.315396 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81379994-dabf-40db-b2fa-ad5e35e443e0-combined-ca-bundle\") pod \"81379994-dabf-40db-b2fa-ad5e35e443e0\" (UID: \"81379994-dabf-40db-b2fa-ad5e35e443e0\") " Dec 16 08:59:05 crc kubenswrapper[4823]: I1216 08:59:05.315648 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/81379994-dabf-40db-b2fa-ad5e35e443e0-logs\") pod \"81379994-dabf-40db-b2fa-ad5e35e443e0\" (UID: \"81379994-dabf-40db-b2fa-ad5e35e443e0\") " Dec 16 08:59:05 crc kubenswrapper[4823]: I1216 08:59:05.315670 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ptwft\" (UniqueName: \"kubernetes.io/projected/81379994-dabf-40db-b2fa-ad5e35e443e0-kube-api-access-ptwft\") pod \"81379994-dabf-40db-b2fa-ad5e35e443e0\" (UID: \"81379994-dabf-40db-b2fa-ad5e35e443e0\") " Dec 16 08:59:05 crc kubenswrapper[4823]: I1216 08:59:05.315730 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81379994-dabf-40db-b2fa-ad5e35e443e0-config-data\") pod \"81379994-dabf-40db-b2fa-ad5e35e443e0\" (UID: \"81379994-dabf-40db-b2fa-ad5e35e443e0\") " Dec 16 08:59:05 crc kubenswrapper[4823]: I1216 08:59:05.315799 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/81379994-dabf-40db-b2fa-ad5e35e443e0-scripts\") pod \"81379994-dabf-40db-b2fa-ad5e35e443e0\" (UID: 
\"81379994-dabf-40db-b2fa-ad5e35e443e0\") " Dec 16 08:59:05 crc kubenswrapper[4823]: I1216 08:59:05.315833 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/81379994-dabf-40db-b2fa-ad5e35e443e0-httpd-run\") pod \"81379994-dabf-40db-b2fa-ad5e35e443e0\" (UID: \"81379994-dabf-40db-b2fa-ad5e35e443e0\") " Dec 16 08:59:05 crc kubenswrapper[4823]: I1216 08:59:05.316558 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81379994-dabf-40db-b2fa-ad5e35e443e0-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "81379994-dabf-40db-b2fa-ad5e35e443e0" (UID: "81379994-dabf-40db-b2fa-ad5e35e443e0"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 08:59:05 crc kubenswrapper[4823]: I1216 08:59:05.316594 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81379994-dabf-40db-b2fa-ad5e35e443e0-logs" (OuterVolumeSpecName: "logs") pod "81379994-dabf-40db-b2fa-ad5e35e443e0" (UID: "81379994-dabf-40db-b2fa-ad5e35e443e0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 08:59:05 crc kubenswrapper[4823]: I1216 08:59:05.323528 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81379994-dabf-40db-b2fa-ad5e35e443e0-kube-api-access-ptwft" (OuterVolumeSpecName: "kube-api-access-ptwft") pod "81379994-dabf-40db-b2fa-ad5e35e443e0" (UID: "81379994-dabf-40db-b2fa-ad5e35e443e0"). InnerVolumeSpecName "kube-api-access-ptwft". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:59:05 crc kubenswrapper[4823]: I1216 08:59:05.331538 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81379994-dabf-40db-b2fa-ad5e35e443e0-scripts" (OuterVolumeSpecName: "scripts") pod "81379994-dabf-40db-b2fa-ad5e35e443e0" (UID: "81379994-dabf-40db-b2fa-ad5e35e443e0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:59:05 crc kubenswrapper[4823]: I1216 08:59:05.351854 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81379994-dabf-40db-b2fa-ad5e35e443e0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "81379994-dabf-40db-b2fa-ad5e35e443e0" (UID: "81379994-dabf-40db-b2fa-ad5e35e443e0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:59:05 crc kubenswrapper[4823]: I1216 08:59:05.369879 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81379994-dabf-40db-b2fa-ad5e35e443e0-config-data" (OuterVolumeSpecName: "config-data") pod "81379994-dabf-40db-b2fa-ad5e35e443e0" (UID: "81379994-dabf-40db-b2fa-ad5e35e443e0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:59:05 crc kubenswrapper[4823]: I1216 08:59:05.417876 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81379994-dabf-40db-b2fa-ad5e35e443e0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 08:59:05 crc kubenswrapper[4823]: I1216 08:59:05.417916 4823 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/81379994-dabf-40db-b2fa-ad5e35e443e0-logs\") on node \"crc\" DevicePath \"\"" Dec 16 08:59:05 crc kubenswrapper[4823]: I1216 08:59:05.417931 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ptwft\" (UniqueName: \"kubernetes.io/projected/81379994-dabf-40db-b2fa-ad5e35e443e0-kube-api-access-ptwft\") on node \"crc\" DevicePath \"\"" Dec 16 08:59:05 crc kubenswrapper[4823]: I1216 08:59:05.417943 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81379994-dabf-40db-b2fa-ad5e35e443e0-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 08:59:05 crc kubenswrapper[4823]: I1216 08:59:05.417954 4823 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/81379994-dabf-40db-b2fa-ad5e35e443e0-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 08:59:05 crc kubenswrapper[4823]: I1216 08:59:05.417962 4823 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/81379994-dabf-40db-b2fa-ad5e35e443e0-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 16 08:59:05 crc kubenswrapper[4823]: I1216 08:59:05.551221 4823 generic.go:334] "Generic (PLEG): container finished" podID="81379994-dabf-40db-b2fa-ad5e35e443e0" containerID="4b58bec35aeca38d4840551a8e504bbd8fa6ccb64c588af17847059c35cd36da" exitCode=0 Dec 16 08:59:05 crc kubenswrapper[4823]: I1216 08:59:05.551250 4823 generic.go:334] "Generic (PLEG): container 
finished" podID="81379994-dabf-40db-b2fa-ad5e35e443e0" containerID="1b774abf0f7d0746a1cc108cb030ace25512338014e0334bf7e97f3e23e77dfc" exitCode=143
Dec 16 08:59:05 crc kubenswrapper[4823]: I1216 08:59:05.551299 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"81379994-dabf-40db-b2fa-ad5e35e443e0","Type":"ContainerDied","Data":"4b58bec35aeca38d4840551a8e504bbd8fa6ccb64c588af17847059c35cd36da"}
Dec 16 08:59:05 crc kubenswrapper[4823]: I1216 08:59:05.551326 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Dec 16 08:59:05 crc kubenswrapper[4823]: I1216 08:59:05.551351 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"81379994-dabf-40db-b2fa-ad5e35e443e0","Type":"ContainerDied","Data":"1b774abf0f7d0746a1cc108cb030ace25512338014e0334bf7e97f3e23e77dfc"}
Dec 16 08:59:05 crc kubenswrapper[4823]: I1216 08:59:05.551433 4823 scope.go:117] "RemoveContainer" containerID="4b58bec35aeca38d4840551a8e504bbd8fa6ccb64c588af17847059c35cd36da"
Dec 16 08:59:05 crc kubenswrapper[4823]: I1216 08:59:05.551366 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"81379994-dabf-40db-b2fa-ad5e35e443e0","Type":"ContainerDied","Data":"91c910172b3d33928d9d57f46fa3ece3c8ac46de682513ce3fa677b297bb5494"}
Dec 16 08:59:05 crc kubenswrapper[4823]: I1216 08:59:05.589949 4823 scope.go:117] "RemoveContainer" containerID="1b774abf0f7d0746a1cc108cb030ace25512338014e0334bf7e97f3e23e77dfc"
Dec 16 08:59:05 crc kubenswrapper[4823]: I1216 08:59:05.621124 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Dec 16 08:59:05 crc kubenswrapper[4823]: I1216 08:59:05.632159 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"]
Dec 16 08:59:05 crc kubenswrapper[4823]: I1216 08:59:05.642038 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Dec 16 08:59:05 crc kubenswrapper[4823]: I1216 08:59:05.642535 4823 scope.go:117] "RemoveContainer" containerID="4b58bec35aeca38d4840551a8e504bbd8fa6ccb64c588af17847059c35cd36da"
Dec 16 08:59:05 crc kubenswrapper[4823]: E1216 08:59:05.645409 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b58bec35aeca38d4840551a8e504bbd8fa6ccb64c588af17847059c35cd36da\": container with ID starting with 4b58bec35aeca38d4840551a8e504bbd8fa6ccb64c588af17847059c35cd36da not found: ID does not exist" containerID="4b58bec35aeca38d4840551a8e504bbd8fa6ccb64c588af17847059c35cd36da"
Dec 16 08:59:05 crc kubenswrapper[4823]: I1216 08:59:05.645473 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b58bec35aeca38d4840551a8e504bbd8fa6ccb64c588af17847059c35cd36da"} err="failed to get container status \"4b58bec35aeca38d4840551a8e504bbd8fa6ccb64c588af17847059c35cd36da\": rpc error: code = NotFound desc = could not find container \"4b58bec35aeca38d4840551a8e504bbd8fa6ccb64c588af17847059c35cd36da\": container with ID starting with 4b58bec35aeca38d4840551a8e504bbd8fa6ccb64c588af17847059c35cd36da not found: ID does not exist"
Dec 16 08:59:05 crc kubenswrapper[4823]: I1216 08:59:05.645503 4823 scope.go:117] "RemoveContainer" containerID="1b774abf0f7d0746a1cc108cb030ace25512338014e0334bf7e97f3e23e77dfc"
Dec 16 08:59:05 crc kubenswrapper[4823]: E1216 08:59:05.648681 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b774abf0f7d0746a1cc108cb030ace25512338014e0334bf7e97f3e23e77dfc\": container with ID starting with 1b774abf0f7d0746a1cc108cb030ace25512338014e0334bf7e97f3e23e77dfc not found: ID does not exist" containerID="1b774abf0f7d0746a1cc108cb030ace25512338014e0334bf7e97f3e23e77dfc"
Dec 16 08:59:05 crc kubenswrapper[4823]: I1216 08:59:05.648772 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b774abf0f7d0746a1cc108cb030ace25512338014e0334bf7e97f3e23e77dfc"} err="failed to get container status \"1b774abf0f7d0746a1cc108cb030ace25512338014e0334bf7e97f3e23e77dfc\": rpc error: code = NotFound desc = could not find container \"1b774abf0f7d0746a1cc108cb030ace25512338014e0334bf7e97f3e23e77dfc\": container with ID starting with 1b774abf0f7d0746a1cc108cb030ace25512338014e0334bf7e97f3e23e77dfc not found: ID does not exist"
Dec 16 08:59:05 crc kubenswrapper[4823]: I1216 08:59:05.648848 4823 scope.go:117] "RemoveContainer" containerID="4b58bec35aeca38d4840551a8e504bbd8fa6ccb64c588af17847059c35cd36da"
Dec 16 08:59:05 crc kubenswrapper[4823]: I1216 08:59:05.654228 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b58bec35aeca38d4840551a8e504bbd8fa6ccb64c588af17847059c35cd36da"} err="failed to get container status \"4b58bec35aeca38d4840551a8e504bbd8fa6ccb64c588af17847059c35cd36da\": rpc error: code = NotFound desc = could not find container \"4b58bec35aeca38d4840551a8e504bbd8fa6ccb64c588af17847059c35cd36da\": container with ID starting with 4b58bec35aeca38d4840551a8e504bbd8fa6ccb64c588af17847059c35cd36da not found: ID does not exist"
Dec 16 08:59:05 crc kubenswrapper[4823]: I1216 08:59:05.654292 4823 scope.go:117] "RemoveContainer" containerID="1b774abf0f7d0746a1cc108cb030ace25512338014e0334bf7e97f3e23e77dfc"
Dec 16 08:59:05 crc kubenswrapper[4823]: W1216 08:59:05.654638 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7cd6f222_5425_4965_ae37_6225b9a87af0.slice/crio-a333247072dbe13b9b4e7ca655d4b849033eddd0dc6cff8ee4e32a9579e029e6 WatchSource:0}: Error finding container a333247072dbe13b9b4e7ca655d4b849033eddd0dc6cff8ee4e32a9579e029e6: Status 404 returned error can't find the container with id a333247072dbe13b9b4e7ca655d4b849033eddd0dc6cff8ee4e32a9579e029e6
Dec 16 08:59:05 crc kubenswrapper[4823]: I1216 08:59:05.654910 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b774abf0f7d0746a1cc108cb030ace25512338014e0334bf7e97f3e23e77dfc"} err="failed to get container status \"1b774abf0f7d0746a1cc108cb030ace25512338014e0334bf7e97f3e23e77dfc\": rpc error: code = NotFound desc = could not find container \"1b774abf0f7d0746a1cc108cb030ace25512338014e0334bf7e97f3e23e77dfc\": container with ID starting with 1b774abf0f7d0746a1cc108cb030ace25512338014e0334bf7e97f3e23e77dfc not found: ID does not exist"
Dec 16 08:59:05 crc kubenswrapper[4823]: I1216 08:59:05.671355 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Dec 16 08:59:05 crc kubenswrapper[4823]: E1216 08:59:05.671918 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81379994-dabf-40db-b2fa-ad5e35e443e0" containerName="glance-httpd"
Dec 16 08:59:05 crc kubenswrapper[4823]: I1216 08:59:05.671941 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="81379994-dabf-40db-b2fa-ad5e35e443e0" containerName="glance-httpd"
Dec 16 08:59:05 crc kubenswrapper[4823]: E1216 08:59:05.671971 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81379994-dabf-40db-b2fa-ad5e35e443e0" containerName="glance-log"
Dec 16 08:59:05 crc kubenswrapper[4823]: I1216 08:59:05.671981 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="81379994-dabf-40db-b2fa-ad5e35e443e0" containerName="glance-log"
Dec 16 08:59:05 crc kubenswrapper[4823]: I1216 08:59:05.677447 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="81379994-dabf-40db-b2fa-ad5e35e443e0" containerName="glance-httpd"
Dec 16 08:59:05 crc kubenswrapper[4823]: I1216 08:59:05.677505 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="81379994-dabf-40db-b2fa-ad5e35e443e0" containerName="glance-log"
Dec 16 08:59:05 crc kubenswrapper[4823]: I1216 08:59:05.679098 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Dec 16 08:59:05 crc kubenswrapper[4823]: I1216 08:59:05.685608 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc"
Dec 16 08:59:05 crc kubenswrapper[4823]: I1216 08:59:05.686019 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Dec 16 08:59:05 crc kubenswrapper[4823]: I1216 08:59:05.689236 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Dec 16 08:59:05 crc kubenswrapper[4823]: I1216 08:59:05.787818 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="561df03c-9fa7-42d8-a071-e4972b688509" path="/var/lib/kubelet/pods/561df03c-9fa7-42d8-a071-e4972b688509/volumes"
Dec 16 08:59:05 crc kubenswrapper[4823]: I1216 08:59:05.788773 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81379994-dabf-40db-b2fa-ad5e35e443e0" path="/var/lib/kubelet/pods/81379994-dabf-40db-b2fa-ad5e35e443e0/volumes"
Dec 16 08:59:05 crc kubenswrapper[4823]: I1216 08:59:05.831939 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwfl2\" (UniqueName: \"kubernetes.io/projected/15fb6d60-bfc3-40af-b514-9cca55e1034f-kube-api-access-nwfl2\") pod \"glance-default-internal-api-0\" (UID: \"15fb6d60-bfc3-40af-b514-9cca55e1034f\") " pod="openstack/glance-default-internal-api-0"
Dec 16 08:59:05 crc kubenswrapper[4823]: I1216 08:59:05.832074 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15fb6d60-bfc3-40af-b514-9cca55e1034f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"15fb6d60-bfc3-40af-b514-9cca55e1034f\") " pod="openstack/glance-default-internal-api-0"
Dec 16 08:59:05 crc kubenswrapper[4823]: I1216 08:59:05.832113 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/15fb6d60-bfc3-40af-b514-9cca55e1034f-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"15fb6d60-bfc3-40af-b514-9cca55e1034f\") " pod="openstack/glance-default-internal-api-0"
Dec 16 08:59:05 crc kubenswrapper[4823]: I1216 08:59:05.832156 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/15fb6d60-bfc3-40af-b514-9cca55e1034f-logs\") pod \"glance-default-internal-api-0\" (UID: \"15fb6d60-bfc3-40af-b514-9cca55e1034f\") " pod="openstack/glance-default-internal-api-0"
Dec 16 08:59:05 crc kubenswrapper[4823]: I1216 08:59:05.832199 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15fb6d60-bfc3-40af-b514-9cca55e1034f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"15fb6d60-bfc3-40af-b514-9cca55e1034f\") " pod="openstack/glance-default-internal-api-0"
Dec 16 08:59:05 crc kubenswrapper[4823]: I1216 08:59:05.832508 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15fb6d60-bfc3-40af-b514-9cca55e1034f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"15fb6d60-bfc3-40af-b514-9cca55e1034f\") " pod="openstack/glance-default-internal-api-0"
Dec 16 08:59:05 crc kubenswrapper[4823]: I1216 08:59:05.833073 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/15fb6d60-bfc3-40af-b514-9cca55e1034f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"15fb6d60-bfc3-40af-b514-9cca55e1034f\") " pod="openstack/glance-default-internal-api-0"
Dec 16 08:59:05 crc kubenswrapper[4823]: I1216 08:59:05.934454 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15fb6d60-bfc3-40af-b514-9cca55e1034f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"15fb6d60-bfc3-40af-b514-9cca55e1034f\") " pod="openstack/glance-default-internal-api-0"
Dec 16 08:59:05 crc kubenswrapper[4823]: I1216 08:59:05.934562 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/15fb6d60-bfc3-40af-b514-9cca55e1034f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"15fb6d60-bfc3-40af-b514-9cca55e1034f\") " pod="openstack/glance-default-internal-api-0"
Dec 16 08:59:05 crc kubenswrapper[4823]: I1216 08:59:05.934615 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwfl2\" (UniqueName: \"kubernetes.io/projected/15fb6d60-bfc3-40af-b514-9cca55e1034f-kube-api-access-nwfl2\") pod \"glance-default-internal-api-0\" (UID: \"15fb6d60-bfc3-40af-b514-9cca55e1034f\") " pod="openstack/glance-default-internal-api-0"
Dec 16 08:59:05 crc kubenswrapper[4823]: I1216 08:59:05.934656 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15fb6d60-bfc3-40af-b514-9cca55e1034f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"15fb6d60-bfc3-40af-b514-9cca55e1034f\") " pod="openstack/glance-default-internal-api-0"
Dec 16 08:59:05 crc kubenswrapper[4823]: I1216 08:59:05.934680 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/15fb6d60-bfc3-40af-b514-9cca55e1034f-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"15fb6d60-bfc3-40af-b514-9cca55e1034f\") " pod="openstack/glance-default-internal-api-0"
Dec 16 08:59:05 crc kubenswrapper[4823]: I1216 08:59:05.934700 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/15fb6d60-bfc3-40af-b514-9cca55e1034f-logs\") pod \"glance-default-internal-api-0\" (UID: \"15fb6d60-bfc3-40af-b514-9cca55e1034f\") " pod="openstack/glance-default-internal-api-0"
Dec 16 08:59:05 crc kubenswrapper[4823]: I1216 08:59:05.934724 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15fb6d60-bfc3-40af-b514-9cca55e1034f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"15fb6d60-bfc3-40af-b514-9cca55e1034f\") " pod="openstack/glance-default-internal-api-0"
Dec 16 08:59:05 crc kubenswrapper[4823]: I1216 08:59:05.935232 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/15fb6d60-bfc3-40af-b514-9cca55e1034f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"15fb6d60-bfc3-40af-b514-9cca55e1034f\") " pod="openstack/glance-default-internal-api-0"
Dec 16 08:59:05 crc kubenswrapper[4823]: I1216 08:59:05.936203 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/15fb6d60-bfc3-40af-b514-9cca55e1034f-logs\") pod \"glance-default-internal-api-0\" (UID: \"15fb6d60-bfc3-40af-b514-9cca55e1034f\") " pod="openstack/glance-default-internal-api-0"
Dec 16 08:59:05 crc kubenswrapper[4823]: I1216 08:59:05.943949 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15fb6d60-bfc3-40af-b514-9cca55e1034f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"15fb6d60-bfc3-40af-b514-9cca55e1034f\") " pod="openstack/glance-default-internal-api-0"
Dec 16 08:59:05 crc kubenswrapper[4823]: I1216 08:59:05.944055 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/15fb6d60-bfc3-40af-b514-9cca55e1034f-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"15fb6d60-bfc3-40af-b514-9cca55e1034f\") " pod="openstack/glance-default-internal-api-0"
Dec 16 08:59:05 crc kubenswrapper[4823]: I1216 08:59:05.944532 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15fb6d60-bfc3-40af-b514-9cca55e1034f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"15fb6d60-bfc3-40af-b514-9cca55e1034f\") " pod="openstack/glance-default-internal-api-0"
Dec 16 08:59:05 crc kubenswrapper[4823]: I1216 08:59:05.944654 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15fb6d60-bfc3-40af-b514-9cca55e1034f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"15fb6d60-bfc3-40af-b514-9cca55e1034f\") " pod="openstack/glance-default-internal-api-0"
Dec 16 08:59:05 crc kubenswrapper[4823]: I1216 08:59:05.952995 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwfl2\" (UniqueName: \"kubernetes.io/projected/15fb6d60-bfc3-40af-b514-9cca55e1034f-kube-api-access-nwfl2\") pod \"glance-default-internal-api-0\" (UID: \"15fb6d60-bfc3-40af-b514-9cca55e1034f\") " pod="openstack/glance-default-internal-api-0"
Dec 16 08:59:06 crc kubenswrapper[4823]: I1216 08:59:06.054145 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Dec 16 08:59:06 crc kubenswrapper[4823]: I1216 08:59:06.564265 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7cd6f222-5425-4965-ae37-6225b9a87af0","Type":"ContainerStarted","Data":"05d1315267b8387ccc39e7df784990aaac16fc7471cf33a43d40495a1621dc86"}
Dec 16 08:59:06 crc kubenswrapper[4823]: I1216 08:59:06.564831 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7cd6f222-5425-4965-ae37-6225b9a87af0","Type":"ContainerStarted","Data":"a333247072dbe13b9b4e7ca655d4b849033eddd0dc6cff8ee4e32a9579e029e6"}
Dec 16 08:59:06 crc kubenswrapper[4823]: I1216 08:59:06.612918 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Dec 16 08:59:06 crc kubenswrapper[4823]: W1216 08:59:06.623571 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod15fb6d60_bfc3_40af_b514_9cca55e1034f.slice/crio-44b39aa4e66009c032e963634812228f8538d38b49def7ad11beb5eac0f7efd4 WatchSource:0}: Error finding container 44b39aa4e66009c032e963634812228f8538d38b49def7ad11beb5eac0f7efd4: Status 404 returned error can't find the container with id 44b39aa4e66009c032e963634812228f8538d38b49def7ad11beb5eac0f7efd4
Dec 16 08:59:07 crc kubenswrapper[4823]: I1216 08:59:07.578512 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7cd6f222-5425-4965-ae37-6225b9a87af0","Type":"ContainerStarted","Data":"0e6e20806c08f1b0889aa8362a17ef1b5ffe809d46c93f3c54ad12f33a345cd9"}
Dec 16 08:59:07 crc kubenswrapper[4823]: I1216 08:59:07.582209 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"15fb6d60-bfc3-40af-b514-9cca55e1034f","Type":"ContainerStarted","Data":"afeb2bc4d654f2ce24095a64b2ca9f2aa84f48dcb968b966f6c9b662c708863d"}
Dec 16 08:59:07 crc kubenswrapper[4823]: I1216 08:59:07.582250 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"15fb6d60-bfc3-40af-b514-9cca55e1034f","Type":"ContainerStarted","Data":"44b39aa4e66009c032e963634812228f8538d38b49def7ad11beb5eac0f7efd4"}
Dec 16 08:59:07 crc kubenswrapper[4823]: I1216 08:59:07.603402 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.603375924 podStartE2EDuration="3.603375924s" podCreationTimestamp="2025-12-16 08:59:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 08:59:07.597687656 +0000 UTC m=+7426.086253779" watchObservedRunningTime="2025-12-16 08:59:07.603375924 +0000 UTC m=+7426.091942047"
Dec 16 08:59:08 crc kubenswrapper[4823]: I1216 08:59:08.592743 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"15fb6d60-bfc3-40af-b514-9cca55e1034f","Type":"ContainerStarted","Data":"d795b79a5a844c87a5782ea4b792e37ae5a545d6199ec26fe830a9788c4e32cf"}
Dec 16 08:59:08 crc kubenswrapper[4823]: I1216 08:59:08.620247 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.620210374 podStartE2EDuration="3.620210374s" podCreationTimestamp="2025-12-16 08:59:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 08:59:08.610783518 +0000 UTC m=+7427.099349651" watchObservedRunningTime="2025-12-16 08:59:08.620210374 +0000 UTC m=+7427.108776507"
Dec 16 08:59:11 crc kubenswrapper[4823]: I1216 08:59:11.290180 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-98894d689-8clmb"
Dec 16 08:59:11 crc kubenswrapper[4823]: I1216 08:59:11.343589 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bf66f947-6bpwj"]
Dec 16 08:59:11 crc kubenswrapper[4823]: I1216 08:59:11.355396 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-bf66f947-6bpwj" podUID="23e23966-cfec-4725-b1aa-d799892ffec8" containerName="dnsmasq-dns" containerID="cri-o://eaeb0688225ae3b5197b031737e061585daffa5a177d8f52564775c8054fc846" gracePeriod=10
Dec 16 08:59:12 crc kubenswrapper[4823]: I1216 08:59:12.339212 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bf66f947-6bpwj"
Dec 16 08:59:12 crc kubenswrapper[4823]: I1216 08:59:12.462021 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/23e23966-cfec-4725-b1aa-d799892ffec8-dns-svc\") pod \"23e23966-cfec-4725-b1aa-d799892ffec8\" (UID: \"23e23966-cfec-4725-b1aa-d799892ffec8\") "
Dec 16 08:59:12 crc kubenswrapper[4823]: I1216 08:59:12.462235 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fzjvc\" (UniqueName: \"kubernetes.io/projected/23e23966-cfec-4725-b1aa-d799892ffec8-kube-api-access-fzjvc\") pod \"23e23966-cfec-4725-b1aa-d799892ffec8\" (UID: \"23e23966-cfec-4725-b1aa-d799892ffec8\") "
Dec 16 08:59:12 crc kubenswrapper[4823]: I1216 08:59:12.462272 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/23e23966-cfec-4725-b1aa-d799892ffec8-ovsdbserver-sb\") pod \"23e23966-cfec-4725-b1aa-d799892ffec8\" (UID: \"23e23966-cfec-4725-b1aa-d799892ffec8\") "
Dec 16 08:59:12 crc kubenswrapper[4823]: I1216 08:59:12.462300 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/23e23966-cfec-4725-b1aa-d799892ffec8-ovsdbserver-nb\") pod \"23e23966-cfec-4725-b1aa-d799892ffec8\" (UID: \"23e23966-cfec-4725-b1aa-d799892ffec8\") "
Dec 16 08:59:12 crc kubenswrapper[4823]: I1216 08:59:12.462343 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23e23966-cfec-4725-b1aa-d799892ffec8-config\") pod \"23e23966-cfec-4725-b1aa-d799892ffec8\" (UID: \"23e23966-cfec-4725-b1aa-d799892ffec8\") "
Dec 16 08:59:12 crc kubenswrapper[4823]: I1216 08:59:12.468693 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23e23966-cfec-4725-b1aa-d799892ffec8-kube-api-access-fzjvc" (OuterVolumeSpecName: "kube-api-access-fzjvc") pod "23e23966-cfec-4725-b1aa-d799892ffec8" (UID: "23e23966-cfec-4725-b1aa-d799892ffec8"). InnerVolumeSpecName "kube-api-access-fzjvc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 08:59:12 crc kubenswrapper[4823]: I1216 08:59:12.511709 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23e23966-cfec-4725-b1aa-d799892ffec8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "23e23966-cfec-4725-b1aa-d799892ffec8" (UID: "23e23966-cfec-4725-b1aa-d799892ffec8"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 16 08:59:12 crc kubenswrapper[4823]: I1216 08:59:12.512725 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23e23966-cfec-4725-b1aa-d799892ffec8-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "23e23966-cfec-4725-b1aa-d799892ffec8" (UID: "23e23966-cfec-4725-b1aa-d799892ffec8"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 16 08:59:12 crc kubenswrapper[4823]: I1216 08:59:12.517500 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23e23966-cfec-4725-b1aa-d799892ffec8-config" (OuterVolumeSpecName: "config") pod "23e23966-cfec-4725-b1aa-d799892ffec8" (UID: "23e23966-cfec-4725-b1aa-d799892ffec8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 16 08:59:12 crc kubenswrapper[4823]: I1216 08:59:12.526974 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23e23966-cfec-4725-b1aa-d799892ffec8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "23e23966-cfec-4725-b1aa-d799892ffec8" (UID: "23e23966-cfec-4725-b1aa-d799892ffec8"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 16 08:59:12 crc kubenswrapper[4823]: I1216 08:59:12.564700 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fzjvc\" (UniqueName: \"kubernetes.io/projected/23e23966-cfec-4725-b1aa-d799892ffec8-kube-api-access-fzjvc\") on node \"crc\" DevicePath \"\""
Dec 16 08:59:12 crc kubenswrapper[4823]: I1216 08:59:12.564741 4823 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/23e23966-cfec-4725-b1aa-d799892ffec8-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Dec 16 08:59:12 crc kubenswrapper[4823]: I1216 08:59:12.564768 4823 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/23e23966-cfec-4725-b1aa-d799892ffec8-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Dec 16 08:59:12 crc kubenswrapper[4823]: I1216 08:59:12.564784 4823 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23e23966-cfec-4725-b1aa-d799892ffec8-config\") on node \"crc\" DevicePath \"\""
Dec 16 08:59:12 crc kubenswrapper[4823]: I1216 08:59:12.564795 4823 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/23e23966-cfec-4725-b1aa-d799892ffec8-dns-svc\") on node \"crc\" DevicePath \"\""
Dec 16 08:59:12 crc kubenswrapper[4823]: I1216 08:59:12.628508 4823 generic.go:334] "Generic (PLEG): container finished" podID="23e23966-cfec-4725-b1aa-d799892ffec8" containerID="eaeb0688225ae3b5197b031737e061585daffa5a177d8f52564775c8054fc846" exitCode=0
Dec 16 08:59:12 crc kubenswrapper[4823]: I1216 08:59:12.628734 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bf66f947-6bpwj" event={"ID":"23e23966-cfec-4725-b1aa-d799892ffec8","Type":"ContainerDied","Data":"eaeb0688225ae3b5197b031737e061585daffa5a177d8f52564775c8054fc846"}
Dec 16 08:59:12 crc kubenswrapper[4823]: I1216 08:59:12.628852 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bf66f947-6bpwj" event={"ID":"23e23966-cfec-4725-b1aa-d799892ffec8","Type":"ContainerDied","Data":"eed53cf5b3a88f01275a2364ad1f3a12b4a67641c79fd11340f2e9afc124393e"}
Dec 16 08:59:12 crc kubenswrapper[4823]: I1216 08:59:12.628938 4823 scope.go:117] "RemoveContainer" containerID="eaeb0688225ae3b5197b031737e061585daffa5a177d8f52564775c8054fc846"
Dec 16 08:59:12 crc kubenswrapper[4823]: I1216 08:59:12.629178 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bf66f947-6bpwj"
Dec 16 08:59:12 crc kubenswrapper[4823]: I1216 08:59:12.650213 4823 scope.go:117] "RemoveContainer" containerID="ae817d2376b2a74027a93b051da528811300f67817e188f14c57bf5c32630c20"
Dec 16 08:59:12 crc kubenswrapper[4823]: I1216 08:59:12.676747 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bf66f947-6bpwj"]
Dec 16 08:59:12 crc kubenswrapper[4823]: I1216 08:59:12.692555 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-bf66f947-6bpwj"]
Dec 16 08:59:12 crc kubenswrapper[4823]: I1216 08:59:12.698245 4823 scope.go:117] "RemoveContainer" containerID="eaeb0688225ae3b5197b031737e061585daffa5a177d8f52564775c8054fc846"
Dec 16 08:59:12 crc kubenswrapper[4823]: E1216 08:59:12.699135 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eaeb0688225ae3b5197b031737e061585daffa5a177d8f52564775c8054fc846\": container with ID starting with eaeb0688225ae3b5197b031737e061585daffa5a177d8f52564775c8054fc846 not found: ID does not exist" containerID="eaeb0688225ae3b5197b031737e061585daffa5a177d8f52564775c8054fc846"
Dec 16 08:59:12 crc kubenswrapper[4823]: I1216 08:59:12.699192 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eaeb0688225ae3b5197b031737e061585daffa5a177d8f52564775c8054fc846"} err="failed to get container status \"eaeb0688225ae3b5197b031737e061585daffa5a177d8f52564775c8054fc846\": rpc error: code = NotFound desc = could not find container \"eaeb0688225ae3b5197b031737e061585daffa5a177d8f52564775c8054fc846\": container with ID starting with eaeb0688225ae3b5197b031737e061585daffa5a177d8f52564775c8054fc846 not found: ID does not exist"
Dec 16 08:59:12 crc kubenswrapper[4823]: I1216 08:59:12.699230 4823 scope.go:117] "RemoveContainer" containerID="ae817d2376b2a74027a93b051da528811300f67817e188f14c57bf5c32630c20"
Dec 16 08:59:12 crc kubenswrapper[4823]: E1216 08:59:12.699793 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae817d2376b2a74027a93b051da528811300f67817e188f14c57bf5c32630c20\": container with ID starting with ae817d2376b2a74027a93b051da528811300f67817e188f14c57bf5c32630c20 not found: ID does not exist" containerID="ae817d2376b2a74027a93b051da528811300f67817e188f14c57bf5c32630c20"
Dec 16 08:59:12 crc kubenswrapper[4823]: I1216 08:59:12.699845 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae817d2376b2a74027a93b051da528811300f67817e188f14c57bf5c32630c20"} err="failed to get container status \"ae817d2376b2a74027a93b051da528811300f67817e188f14c57bf5c32630c20\": rpc error: code = NotFound desc = could not find container \"ae817d2376b2a74027a93b051da528811300f67817e188f14c57bf5c32630c20\": container with ID starting with ae817d2376b2a74027a93b051da528811300f67817e188f14c57bf5c32630c20 not found: ID does not exist"
Dec 16 08:59:13 crc kubenswrapper[4823]: I1216 08:59:13.785231 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23e23966-cfec-4725-b1aa-d799892ffec8" path="/var/lib/kubelet/pods/23e23966-cfec-4725-b1aa-d799892ffec8/volumes"
Dec 16 08:59:14 crc kubenswrapper[4823]: I1216 08:59:14.958899 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Dec 16 08:59:14 crc kubenswrapper[4823]: I1216 08:59:14.958967 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Dec 16 08:59:15 crc kubenswrapper[4823]: I1216 08:59:15.000296 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Dec 16 08:59:15 crc kubenswrapper[4823]: I1216 08:59:15.005170 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Dec 16 08:59:15 crc kubenswrapper[4823]: I1216 08:59:15.661604 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Dec 16 08:59:15 crc kubenswrapper[4823]: I1216 08:59:15.661670 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Dec 16 08:59:16 crc kubenswrapper[4823]: I1216 08:59:16.085529 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Dec 16 08:59:16 crc kubenswrapper[4823]: I1216 08:59:16.093213 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Dec 16 08:59:16 crc kubenswrapper[4823]: I1216 08:59:16.122842 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Dec 16 08:59:16 crc kubenswrapper[4823]: I1216 08:59:16.148841 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Dec 16 08:59:16 crc kubenswrapper[4823]: I1216 08:59:16.676825 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Dec 16 08:59:16 crc kubenswrapper[4823]: I1216 08:59:16.677160 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Dec 16 08:59:17 crc kubenswrapper[4823]: I1216 08:59:17.673953 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Dec 16 08:59:17 crc kubenswrapper[4823]: I1216 08:59:17.684419 4823 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Dec 16 08:59:17 crc kubenswrapper[4823]: I1216 08:59:17.711127 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Dec 16 08:59:18 crc kubenswrapper[4823]: I1216 08:59:18.644729 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Dec 16 08:59:18 crc kubenswrapper[4823]: I1216 08:59:18.694982 4823 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Dec 16 08:59:18 crc kubenswrapper[4823]: I1216 08:59:18.710813 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Dec 16 08:59:26 crc kubenswrapper[4823]: I1216 08:59:26.320868 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-njwbc"]
Dec 16 08:59:26 crc kubenswrapper[4823]: E1216 08:59:26.321942 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23e23966-cfec-4725-b1aa-d799892ffec8" containerName="init"
Dec 16 08:59:26 crc kubenswrapper[4823]: I1216 08:59:26.321961 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="23e23966-cfec-4725-b1aa-d799892ffec8" containerName="init"
Dec 16 08:59:26 crc kubenswrapper[4823]: E1216 08:59:26.322000 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23e23966-cfec-4725-b1aa-d799892ffec8" containerName="dnsmasq-dns"
Dec 16 08:59:26 crc kubenswrapper[4823]: I1216 08:59:26.322011 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="23e23966-cfec-4725-b1aa-d799892ffec8" containerName="dnsmasq-dns"
Dec 16 08:59:26 crc kubenswrapper[4823]: I1216 08:59:26.322345 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="23e23966-cfec-4725-b1aa-d799892ffec8" containerName="dnsmasq-dns"
Dec 16 08:59:26 crc kubenswrapper[4823]: I1216 08:59:26.323081 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-njwbc"
Dec 16 08:59:26 crc kubenswrapper[4823]: I1216 08:59:26.332415 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-njwbc"]
Dec 16 08:59:26 crc kubenswrapper[4823]: I1216 08:59:26.425381 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-2f37-account-create-update-27jn8"]
Dec 16 08:59:26 crc kubenswrapper[4823]: I1216 08:59:26.426686 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-2f37-account-create-update-27jn8"
Dec 16 08:59:26 crc kubenswrapper[4823]: I1216 08:59:26.429201 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret"
Dec 16 08:59:26 crc kubenswrapper[4823]: I1216 08:59:26.431902 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcj9w\" (UniqueName: \"kubernetes.io/projected/b3f5794c-ab92-40a6-8e97-34a6cbda2f1c-kube-api-access-dcj9w\") pod \"placement-db-create-njwbc\" (UID: \"b3f5794c-ab92-40a6-8e97-34a6cbda2f1c\") " pod="openstack/placement-db-create-njwbc"
Dec 16 08:59:26 crc kubenswrapper[4823]: I1216 08:59:26.432169 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b3f5794c-ab92-40a6-8e97-34a6cbda2f1c-operator-scripts\") pod \"placement-db-create-njwbc\" (UID: \"b3f5794c-ab92-40a6-8e97-34a6cbda2f1c\") " pod="openstack/placement-db-create-njwbc"
Dec 16 08:59:26 crc kubenswrapper[4823]: I1216 08:59:26.436659 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-2f37-account-create-update-27jn8"]
Dec 16 08:59:26 crc kubenswrapper[4823]: I1216 08:59:26.534356 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b3f5794c-ab92-40a6-8e97-34a6cbda2f1c-operator-scripts\") pod \"placement-db-create-njwbc\" (UID: \"b3f5794c-ab92-40a6-8e97-34a6cbda2f1c\") " pod="openstack/placement-db-create-njwbc"
Dec 16 08:59:26 crc kubenswrapper[4823]: I1216 08:59:26.534451 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clvj2\" (UniqueName: \"kubernetes.io/projected/bf332295-2a27-4f19-bea3-51ca1596e5c0-kube-api-access-clvj2\") pod \"placement-2f37-account-create-update-27jn8\" (UID: \"bf332295-2a27-4f19-bea3-51ca1596e5c0\") " pod="openstack/placement-2f37-account-create-update-27jn8"
Dec 16 08:59:26 crc kubenswrapper[4823]: I1216 08:59:26.534521 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf332295-2a27-4f19-bea3-51ca1596e5c0-operator-scripts\") pod \"placement-2f37-account-create-update-27jn8\" (UID: \"bf332295-2a27-4f19-bea3-51ca1596e5c0\") " pod="openstack/placement-2f37-account-create-update-27jn8"
Dec 16 08:59:26 crc kubenswrapper[4823]: I1216 08:59:26.534575 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dcj9w\" (UniqueName: \"kubernetes.io/projected/b3f5794c-ab92-40a6-8e97-34a6cbda2f1c-kube-api-access-dcj9w\") pod \"placement-db-create-njwbc\" (UID: \"b3f5794c-ab92-40a6-8e97-34a6cbda2f1c\") " pod="openstack/placement-db-create-njwbc"
Dec 16 08:59:26 crc kubenswrapper[4823]: I1216 08:59:26.535588 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b3f5794c-ab92-40a6-8e97-34a6cbda2f1c-operator-scripts\") pod \"placement-db-create-njwbc\" (UID: \"b3f5794c-ab92-40a6-8e97-34a6cbda2f1c\") " pod="openstack/placement-db-create-njwbc"
Dec 16 08:59:26 crc kubenswrapper[4823]: I1216 08:59:26.555396 4823 operation_generator.go:637] "MountVolume.SetUp succeeded
for volume \"kube-api-access-dcj9w\" (UniqueName: \"kubernetes.io/projected/b3f5794c-ab92-40a6-8e97-34a6cbda2f1c-kube-api-access-dcj9w\") pod \"placement-db-create-njwbc\" (UID: \"b3f5794c-ab92-40a6-8e97-34a6cbda2f1c\") " pod="openstack/placement-db-create-njwbc" Dec 16 08:59:26 crc kubenswrapper[4823]: I1216 08:59:26.636725 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clvj2\" (UniqueName: \"kubernetes.io/projected/bf332295-2a27-4f19-bea3-51ca1596e5c0-kube-api-access-clvj2\") pod \"placement-2f37-account-create-update-27jn8\" (UID: \"bf332295-2a27-4f19-bea3-51ca1596e5c0\") " pod="openstack/placement-2f37-account-create-update-27jn8" Dec 16 08:59:26 crc kubenswrapper[4823]: I1216 08:59:26.637105 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf332295-2a27-4f19-bea3-51ca1596e5c0-operator-scripts\") pod \"placement-2f37-account-create-update-27jn8\" (UID: \"bf332295-2a27-4f19-bea3-51ca1596e5c0\") " pod="openstack/placement-2f37-account-create-update-27jn8" Dec 16 08:59:26 crc kubenswrapper[4823]: I1216 08:59:26.637883 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf332295-2a27-4f19-bea3-51ca1596e5c0-operator-scripts\") pod \"placement-2f37-account-create-update-27jn8\" (UID: \"bf332295-2a27-4f19-bea3-51ca1596e5c0\") " pod="openstack/placement-2f37-account-create-update-27jn8" Dec 16 08:59:26 crc kubenswrapper[4823]: I1216 08:59:26.642345 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-njwbc" Dec 16 08:59:26 crc kubenswrapper[4823]: I1216 08:59:26.662423 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-clvj2\" (UniqueName: \"kubernetes.io/projected/bf332295-2a27-4f19-bea3-51ca1596e5c0-kube-api-access-clvj2\") pod \"placement-2f37-account-create-update-27jn8\" (UID: \"bf332295-2a27-4f19-bea3-51ca1596e5c0\") " pod="openstack/placement-2f37-account-create-update-27jn8" Dec 16 08:59:26 crc kubenswrapper[4823]: I1216 08:59:26.744841 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-2f37-account-create-update-27jn8" Dec 16 08:59:27 crc kubenswrapper[4823]: I1216 08:59:27.118382 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-njwbc"] Dec 16 08:59:27 crc kubenswrapper[4823]: W1216 08:59:27.122990 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb3f5794c_ab92_40a6_8e97_34a6cbda2f1c.slice/crio-98689414af5f734e47ea525f38a3f2fe757430258000ddb612e9ec050a81c2cd WatchSource:0}: Error finding container 98689414af5f734e47ea525f38a3f2fe757430258000ddb612e9ec050a81c2cd: Status 404 returned error can't find the container with id 98689414af5f734e47ea525f38a3f2fe757430258000ddb612e9ec050a81c2cd Dec 16 08:59:27 crc kubenswrapper[4823]: I1216 08:59:27.227564 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-2f37-account-create-update-27jn8"] Dec 16 08:59:27 crc kubenswrapper[4823]: W1216 08:59:27.229698 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbf332295_2a27_4f19_bea3_51ca1596e5c0.slice/crio-c4581f3ad3329c3ddb2dcf36905de03829dca544bc2886911e4dcc7bfc4b0088 WatchSource:0}: Error finding container c4581f3ad3329c3ddb2dcf36905de03829dca544bc2886911e4dcc7bfc4b0088: Status 404 
returned error can't find the container with id c4581f3ad3329c3ddb2dcf36905de03829dca544bc2886911e4dcc7bfc4b0088 Dec 16 08:59:27 crc kubenswrapper[4823]: I1216 08:59:27.808495 4823 generic.go:334] "Generic (PLEG): container finished" podID="b3f5794c-ab92-40a6-8e97-34a6cbda2f1c" containerID="4ac3dd81ea923b63025aca3825b2fb8c65b13d2c45a0c4c97d745de3e54dad1d" exitCode=0 Dec 16 08:59:27 crc kubenswrapper[4823]: I1216 08:59:27.808557 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-njwbc" event={"ID":"b3f5794c-ab92-40a6-8e97-34a6cbda2f1c","Type":"ContainerDied","Data":"4ac3dd81ea923b63025aca3825b2fb8c65b13d2c45a0c4c97d745de3e54dad1d"} Dec 16 08:59:27 crc kubenswrapper[4823]: I1216 08:59:27.810086 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-njwbc" event={"ID":"b3f5794c-ab92-40a6-8e97-34a6cbda2f1c","Type":"ContainerStarted","Data":"98689414af5f734e47ea525f38a3f2fe757430258000ddb612e9ec050a81c2cd"} Dec 16 08:59:27 crc kubenswrapper[4823]: I1216 08:59:27.816632 4823 generic.go:334] "Generic (PLEG): container finished" podID="bf332295-2a27-4f19-bea3-51ca1596e5c0" containerID="be1087a76e2e8a3ff26a604d9bff13fbda716ff8f0816b5e30d225f77d136a04" exitCode=0 Dec 16 08:59:27 crc kubenswrapper[4823]: I1216 08:59:27.816679 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-2f37-account-create-update-27jn8" event={"ID":"bf332295-2a27-4f19-bea3-51ca1596e5c0","Type":"ContainerDied","Data":"be1087a76e2e8a3ff26a604d9bff13fbda716ff8f0816b5e30d225f77d136a04"} Dec 16 08:59:27 crc kubenswrapper[4823]: I1216 08:59:27.816705 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-2f37-account-create-update-27jn8" event={"ID":"bf332295-2a27-4f19-bea3-51ca1596e5c0","Type":"ContainerStarted","Data":"c4581f3ad3329c3ddb2dcf36905de03829dca544bc2886911e4dcc7bfc4b0088"} Dec 16 08:59:29 crc kubenswrapper[4823]: I1216 08:59:29.250354 4823 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openstack/placement-db-create-njwbc" Dec 16 08:59:29 crc kubenswrapper[4823]: I1216 08:59:29.257399 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-2f37-account-create-update-27jn8" Dec 16 08:59:29 crc kubenswrapper[4823]: I1216 08:59:29.397424 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf332295-2a27-4f19-bea3-51ca1596e5c0-operator-scripts\") pod \"bf332295-2a27-4f19-bea3-51ca1596e5c0\" (UID: \"bf332295-2a27-4f19-bea3-51ca1596e5c0\") " Dec 16 08:59:29 crc kubenswrapper[4823]: I1216 08:59:29.397567 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-clvj2\" (UniqueName: \"kubernetes.io/projected/bf332295-2a27-4f19-bea3-51ca1596e5c0-kube-api-access-clvj2\") pod \"bf332295-2a27-4f19-bea3-51ca1596e5c0\" (UID: \"bf332295-2a27-4f19-bea3-51ca1596e5c0\") " Dec 16 08:59:29 crc kubenswrapper[4823]: I1216 08:59:29.397665 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dcj9w\" (UniqueName: \"kubernetes.io/projected/b3f5794c-ab92-40a6-8e97-34a6cbda2f1c-kube-api-access-dcj9w\") pod \"b3f5794c-ab92-40a6-8e97-34a6cbda2f1c\" (UID: \"b3f5794c-ab92-40a6-8e97-34a6cbda2f1c\") " Dec 16 08:59:29 crc kubenswrapper[4823]: I1216 08:59:29.397713 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b3f5794c-ab92-40a6-8e97-34a6cbda2f1c-operator-scripts\") pod \"b3f5794c-ab92-40a6-8e97-34a6cbda2f1c\" (UID: \"b3f5794c-ab92-40a6-8e97-34a6cbda2f1c\") " Dec 16 08:59:29 crc kubenswrapper[4823]: I1216 08:59:29.398300 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3f5794c-ab92-40a6-8e97-34a6cbda2f1c-operator-scripts" (OuterVolumeSpecName: 
"operator-scripts") pod "b3f5794c-ab92-40a6-8e97-34a6cbda2f1c" (UID: "b3f5794c-ab92-40a6-8e97-34a6cbda2f1c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:59:29 crc kubenswrapper[4823]: I1216 08:59:29.398352 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf332295-2a27-4f19-bea3-51ca1596e5c0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bf332295-2a27-4f19-bea3-51ca1596e5c0" (UID: "bf332295-2a27-4f19-bea3-51ca1596e5c0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:59:29 crc kubenswrapper[4823]: I1216 08:59:29.404182 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3f5794c-ab92-40a6-8e97-34a6cbda2f1c-kube-api-access-dcj9w" (OuterVolumeSpecName: "kube-api-access-dcj9w") pod "b3f5794c-ab92-40a6-8e97-34a6cbda2f1c" (UID: "b3f5794c-ab92-40a6-8e97-34a6cbda2f1c"). InnerVolumeSpecName "kube-api-access-dcj9w". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:59:29 crc kubenswrapper[4823]: I1216 08:59:29.404343 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf332295-2a27-4f19-bea3-51ca1596e5c0-kube-api-access-clvj2" (OuterVolumeSpecName: "kube-api-access-clvj2") pod "bf332295-2a27-4f19-bea3-51ca1596e5c0" (UID: "bf332295-2a27-4f19-bea3-51ca1596e5c0"). InnerVolumeSpecName "kube-api-access-clvj2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:59:29 crc kubenswrapper[4823]: I1216 08:59:29.499733 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-clvj2\" (UniqueName: \"kubernetes.io/projected/bf332295-2a27-4f19-bea3-51ca1596e5c0-kube-api-access-clvj2\") on node \"crc\" DevicePath \"\"" Dec 16 08:59:29 crc kubenswrapper[4823]: I1216 08:59:29.499791 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dcj9w\" (UniqueName: \"kubernetes.io/projected/b3f5794c-ab92-40a6-8e97-34a6cbda2f1c-kube-api-access-dcj9w\") on node \"crc\" DevicePath \"\"" Dec 16 08:59:29 crc kubenswrapper[4823]: I1216 08:59:29.499805 4823 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b3f5794c-ab92-40a6-8e97-34a6cbda2f1c-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 08:59:29 crc kubenswrapper[4823]: I1216 08:59:29.499817 4823 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf332295-2a27-4f19-bea3-51ca1596e5c0-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 08:59:29 crc kubenswrapper[4823]: I1216 08:59:29.836768 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-2f37-account-create-update-27jn8" event={"ID":"bf332295-2a27-4f19-bea3-51ca1596e5c0","Type":"ContainerDied","Data":"c4581f3ad3329c3ddb2dcf36905de03829dca544bc2886911e4dcc7bfc4b0088"} Dec 16 08:59:29 crc kubenswrapper[4823]: I1216 08:59:29.836808 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c4581f3ad3329c3ddb2dcf36905de03829dca544bc2886911e4dcc7bfc4b0088" Dec 16 08:59:29 crc kubenswrapper[4823]: I1216 08:59:29.836905 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-2f37-account-create-update-27jn8" Dec 16 08:59:29 crc kubenswrapper[4823]: I1216 08:59:29.838176 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-njwbc" event={"ID":"b3f5794c-ab92-40a6-8e97-34a6cbda2f1c","Type":"ContainerDied","Data":"98689414af5f734e47ea525f38a3f2fe757430258000ddb612e9ec050a81c2cd"} Dec 16 08:59:29 crc kubenswrapper[4823]: I1216 08:59:29.838286 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="98689414af5f734e47ea525f38a3f2fe757430258000ddb612e9ec050a81c2cd" Dec 16 08:59:29 crc kubenswrapper[4823]: I1216 08:59:29.838230 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-njwbc" Dec 16 08:59:31 crc kubenswrapper[4823]: I1216 08:59:31.646665 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-585657f749-s2nbz"] Dec 16 08:59:31 crc kubenswrapper[4823]: E1216 08:59:31.647398 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf332295-2a27-4f19-bea3-51ca1596e5c0" containerName="mariadb-account-create-update" Dec 16 08:59:31 crc kubenswrapper[4823]: I1216 08:59:31.647416 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf332295-2a27-4f19-bea3-51ca1596e5c0" containerName="mariadb-account-create-update" Dec 16 08:59:31 crc kubenswrapper[4823]: E1216 08:59:31.647453 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3f5794c-ab92-40a6-8e97-34a6cbda2f1c" containerName="mariadb-database-create" Dec 16 08:59:31 crc kubenswrapper[4823]: I1216 08:59:31.647461 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3f5794c-ab92-40a6-8e97-34a6cbda2f1c" containerName="mariadb-database-create" Dec 16 08:59:31 crc kubenswrapper[4823]: I1216 08:59:31.647674 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3f5794c-ab92-40a6-8e97-34a6cbda2f1c" 
containerName="mariadb-database-create" Dec 16 08:59:31 crc kubenswrapper[4823]: I1216 08:59:31.647697 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf332295-2a27-4f19-bea3-51ca1596e5c0" containerName="mariadb-account-create-update" Dec 16 08:59:31 crc kubenswrapper[4823]: I1216 08:59:31.648860 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-585657f749-s2nbz" Dec 16 08:59:31 crc kubenswrapper[4823]: I1216 08:59:31.667993 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-585657f749-s2nbz"] Dec 16 08:59:31 crc kubenswrapper[4823]: I1216 08:59:31.717481 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-2gq2s"] Dec 16 08:59:31 crc kubenswrapper[4823]: I1216 08:59:31.719793 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-2gq2s" Dec 16 08:59:31 crc kubenswrapper[4823]: I1216 08:59:31.725290 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 16 08:59:31 crc kubenswrapper[4823]: I1216 08:59:31.725630 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 16 08:59:31 crc kubenswrapper[4823]: I1216 08:59:31.725846 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-9mbcj" Dec 16 08:59:31 crc kubenswrapper[4823]: I1216 08:59:31.730771 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-2gq2s"] Dec 16 08:59:31 crc kubenswrapper[4823]: I1216 08:59:31.849530 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1feea07f-4c8b-4c90-8f3f-63e810a7a525-ovsdbserver-sb\") pod \"dnsmasq-dns-585657f749-s2nbz\" (UID: \"1feea07f-4c8b-4c90-8f3f-63e810a7a525\") " 
pod="openstack/dnsmasq-dns-585657f749-s2nbz" Dec 16 08:59:31 crc kubenswrapper[4823]: I1216 08:59:31.849949 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkzjb\" (UniqueName: \"kubernetes.io/projected/1feea07f-4c8b-4c90-8f3f-63e810a7a525-kube-api-access-mkzjb\") pod \"dnsmasq-dns-585657f749-s2nbz\" (UID: \"1feea07f-4c8b-4c90-8f3f-63e810a7a525\") " pod="openstack/dnsmasq-dns-585657f749-s2nbz" Dec 16 08:59:31 crc kubenswrapper[4823]: I1216 08:59:31.850094 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbc2db84-0d65-4fef-90a7-3051d568430b-config-data\") pod \"placement-db-sync-2gq2s\" (UID: \"dbc2db84-0d65-4fef-90a7-3051d568430b\") " pod="openstack/placement-db-sync-2gq2s" Dec 16 08:59:31 crc kubenswrapper[4823]: I1216 08:59:31.850219 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1feea07f-4c8b-4c90-8f3f-63e810a7a525-dns-svc\") pod \"dnsmasq-dns-585657f749-s2nbz\" (UID: \"1feea07f-4c8b-4c90-8f3f-63e810a7a525\") " pod="openstack/dnsmasq-dns-585657f749-s2nbz" Dec 16 08:59:31 crc kubenswrapper[4823]: I1216 08:59:31.850387 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1feea07f-4c8b-4c90-8f3f-63e810a7a525-config\") pod \"dnsmasq-dns-585657f749-s2nbz\" (UID: \"1feea07f-4c8b-4c90-8f3f-63e810a7a525\") " pod="openstack/dnsmasq-dns-585657f749-s2nbz" Dec 16 08:59:31 crc kubenswrapper[4823]: I1216 08:59:31.850536 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbc2db84-0d65-4fef-90a7-3051d568430b-combined-ca-bundle\") pod \"placement-db-sync-2gq2s\" (UID: 
\"dbc2db84-0d65-4fef-90a7-3051d568430b\") " pod="openstack/placement-db-sync-2gq2s" Dec 16 08:59:31 crc kubenswrapper[4823]: I1216 08:59:31.850656 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1feea07f-4c8b-4c90-8f3f-63e810a7a525-ovsdbserver-nb\") pod \"dnsmasq-dns-585657f749-s2nbz\" (UID: \"1feea07f-4c8b-4c90-8f3f-63e810a7a525\") " pod="openstack/dnsmasq-dns-585657f749-s2nbz" Dec 16 08:59:31 crc kubenswrapper[4823]: I1216 08:59:31.850776 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ssd7\" (UniqueName: \"kubernetes.io/projected/dbc2db84-0d65-4fef-90a7-3051d568430b-kube-api-access-8ssd7\") pod \"placement-db-sync-2gq2s\" (UID: \"dbc2db84-0d65-4fef-90a7-3051d568430b\") " pod="openstack/placement-db-sync-2gq2s" Dec 16 08:59:31 crc kubenswrapper[4823]: I1216 08:59:31.850936 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dbc2db84-0d65-4fef-90a7-3051d568430b-scripts\") pod \"placement-db-sync-2gq2s\" (UID: \"dbc2db84-0d65-4fef-90a7-3051d568430b\") " pod="openstack/placement-db-sync-2gq2s" Dec 16 08:59:31 crc kubenswrapper[4823]: I1216 08:59:31.851080 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dbc2db84-0d65-4fef-90a7-3051d568430b-logs\") pod \"placement-db-sync-2gq2s\" (UID: \"dbc2db84-0d65-4fef-90a7-3051d568430b\") " pod="openstack/placement-db-sync-2gq2s" Dec 16 08:59:31 crc kubenswrapper[4823]: I1216 08:59:31.952494 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dbc2db84-0d65-4fef-90a7-3051d568430b-scripts\") pod \"placement-db-sync-2gq2s\" (UID: \"dbc2db84-0d65-4fef-90a7-3051d568430b\") " 
pod="openstack/placement-db-sync-2gq2s" Dec 16 08:59:31 crc kubenswrapper[4823]: I1216 08:59:31.952570 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dbc2db84-0d65-4fef-90a7-3051d568430b-logs\") pod \"placement-db-sync-2gq2s\" (UID: \"dbc2db84-0d65-4fef-90a7-3051d568430b\") " pod="openstack/placement-db-sync-2gq2s" Dec 16 08:59:31 crc kubenswrapper[4823]: I1216 08:59:31.952664 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1feea07f-4c8b-4c90-8f3f-63e810a7a525-ovsdbserver-sb\") pod \"dnsmasq-dns-585657f749-s2nbz\" (UID: \"1feea07f-4c8b-4c90-8f3f-63e810a7a525\") " pod="openstack/dnsmasq-dns-585657f749-s2nbz" Dec 16 08:59:31 crc kubenswrapper[4823]: I1216 08:59:31.952699 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mkzjb\" (UniqueName: \"kubernetes.io/projected/1feea07f-4c8b-4c90-8f3f-63e810a7a525-kube-api-access-mkzjb\") pod \"dnsmasq-dns-585657f749-s2nbz\" (UID: \"1feea07f-4c8b-4c90-8f3f-63e810a7a525\") " pod="openstack/dnsmasq-dns-585657f749-s2nbz" Dec 16 08:59:31 crc kubenswrapper[4823]: I1216 08:59:31.952714 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbc2db84-0d65-4fef-90a7-3051d568430b-config-data\") pod \"placement-db-sync-2gq2s\" (UID: \"dbc2db84-0d65-4fef-90a7-3051d568430b\") " pod="openstack/placement-db-sync-2gq2s" Dec 16 08:59:31 crc kubenswrapper[4823]: I1216 08:59:31.952735 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1feea07f-4c8b-4c90-8f3f-63e810a7a525-dns-svc\") pod \"dnsmasq-dns-585657f749-s2nbz\" (UID: \"1feea07f-4c8b-4c90-8f3f-63e810a7a525\") " pod="openstack/dnsmasq-dns-585657f749-s2nbz" Dec 16 08:59:31 crc kubenswrapper[4823]: I1216 08:59:31.952787 
4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1feea07f-4c8b-4c90-8f3f-63e810a7a525-config\") pod \"dnsmasq-dns-585657f749-s2nbz\" (UID: \"1feea07f-4c8b-4c90-8f3f-63e810a7a525\") " pod="openstack/dnsmasq-dns-585657f749-s2nbz" Dec 16 08:59:31 crc kubenswrapper[4823]: I1216 08:59:31.952830 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbc2db84-0d65-4fef-90a7-3051d568430b-combined-ca-bundle\") pod \"placement-db-sync-2gq2s\" (UID: \"dbc2db84-0d65-4fef-90a7-3051d568430b\") " pod="openstack/placement-db-sync-2gq2s" Dec 16 08:59:31 crc kubenswrapper[4823]: I1216 08:59:31.952846 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1feea07f-4c8b-4c90-8f3f-63e810a7a525-ovsdbserver-nb\") pod \"dnsmasq-dns-585657f749-s2nbz\" (UID: \"1feea07f-4c8b-4c90-8f3f-63e810a7a525\") " pod="openstack/dnsmasq-dns-585657f749-s2nbz" Dec 16 08:59:31 crc kubenswrapper[4823]: I1216 08:59:31.952874 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ssd7\" (UniqueName: \"kubernetes.io/projected/dbc2db84-0d65-4fef-90a7-3051d568430b-kube-api-access-8ssd7\") pod \"placement-db-sync-2gq2s\" (UID: \"dbc2db84-0d65-4fef-90a7-3051d568430b\") " pod="openstack/placement-db-sync-2gq2s" Dec 16 08:59:31 crc kubenswrapper[4823]: I1216 08:59:31.953885 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dbc2db84-0d65-4fef-90a7-3051d568430b-logs\") pod \"placement-db-sync-2gq2s\" (UID: \"dbc2db84-0d65-4fef-90a7-3051d568430b\") " pod="openstack/placement-db-sync-2gq2s" Dec 16 08:59:31 crc kubenswrapper[4823]: I1216 08:59:31.956528 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/1feea07f-4c8b-4c90-8f3f-63e810a7a525-dns-svc\") pod \"dnsmasq-dns-585657f749-s2nbz\" (UID: \"1feea07f-4c8b-4c90-8f3f-63e810a7a525\") " pod="openstack/dnsmasq-dns-585657f749-s2nbz" Dec 16 08:59:31 crc kubenswrapper[4823]: I1216 08:59:31.956683 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1feea07f-4c8b-4c90-8f3f-63e810a7a525-ovsdbserver-sb\") pod \"dnsmasq-dns-585657f749-s2nbz\" (UID: \"1feea07f-4c8b-4c90-8f3f-63e810a7a525\") " pod="openstack/dnsmasq-dns-585657f749-s2nbz" Dec 16 08:59:31 crc kubenswrapper[4823]: I1216 08:59:31.956815 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1feea07f-4c8b-4c90-8f3f-63e810a7a525-ovsdbserver-nb\") pod \"dnsmasq-dns-585657f749-s2nbz\" (UID: \"1feea07f-4c8b-4c90-8f3f-63e810a7a525\") " pod="openstack/dnsmasq-dns-585657f749-s2nbz" Dec 16 08:59:31 crc kubenswrapper[4823]: I1216 08:59:31.957091 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1feea07f-4c8b-4c90-8f3f-63e810a7a525-config\") pod \"dnsmasq-dns-585657f749-s2nbz\" (UID: \"1feea07f-4c8b-4c90-8f3f-63e810a7a525\") " pod="openstack/dnsmasq-dns-585657f749-s2nbz" Dec 16 08:59:31 crc kubenswrapper[4823]: I1216 08:59:31.957697 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbc2db84-0d65-4fef-90a7-3051d568430b-combined-ca-bundle\") pod \"placement-db-sync-2gq2s\" (UID: \"dbc2db84-0d65-4fef-90a7-3051d568430b\") " pod="openstack/placement-db-sync-2gq2s" Dec 16 08:59:31 crc kubenswrapper[4823]: I1216 08:59:31.957945 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbc2db84-0d65-4fef-90a7-3051d568430b-config-data\") pod \"placement-db-sync-2gq2s\" (UID: 
\"dbc2db84-0d65-4fef-90a7-3051d568430b\") " pod="openstack/placement-db-sync-2gq2s" Dec 16 08:59:31 crc kubenswrapper[4823]: I1216 08:59:31.963324 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dbc2db84-0d65-4fef-90a7-3051d568430b-scripts\") pod \"placement-db-sync-2gq2s\" (UID: \"dbc2db84-0d65-4fef-90a7-3051d568430b\") " pod="openstack/placement-db-sync-2gq2s" Dec 16 08:59:31 crc kubenswrapper[4823]: I1216 08:59:31.973794 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ssd7\" (UniqueName: \"kubernetes.io/projected/dbc2db84-0d65-4fef-90a7-3051d568430b-kube-api-access-8ssd7\") pod \"placement-db-sync-2gq2s\" (UID: \"dbc2db84-0d65-4fef-90a7-3051d568430b\") " pod="openstack/placement-db-sync-2gq2s" Dec 16 08:59:31 crc kubenswrapper[4823]: I1216 08:59:31.984302 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkzjb\" (UniqueName: \"kubernetes.io/projected/1feea07f-4c8b-4c90-8f3f-63e810a7a525-kube-api-access-mkzjb\") pod \"dnsmasq-dns-585657f749-s2nbz\" (UID: \"1feea07f-4c8b-4c90-8f3f-63e810a7a525\") " pod="openstack/dnsmasq-dns-585657f749-s2nbz" Dec 16 08:59:32 crc kubenswrapper[4823]: I1216 08:59:32.036854 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-2gq2s" Dec 16 08:59:32 crc kubenswrapper[4823]: I1216 08:59:32.282639 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-585657f749-s2nbz" Dec 16 08:59:32 crc kubenswrapper[4823]: I1216 08:59:32.497055 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-2gq2s"] Dec 16 08:59:32 crc kubenswrapper[4823]: I1216 08:59:32.734084 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-585657f749-s2nbz"] Dec 16 08:59:32 crc kubenswrapper[4823]: W1216 08:59:32.741045 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1feea07f_4c8b_4c90_8f3f_63e810a7a525.slice/crio-85edd95dd769f7e49d143954f68c8d5f9163d250f1893d298cc52bcb61a678f5 WatchSource:0}: Error finding container 85edd95dd769f7e49d143954f68c8d5f9163d250f1893d298cc52bcb61a678f5: Status 404 returned error can't find the container with id 85edd95dd769f7e49d143954f68c8d5f9163d250f1893d298cc52bcb61a678f5 Dec 16 08:59:32 crc kubenswrapper[4823]: I1216 08:59:32.868606 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-2gq2s" event={"ID":"dbc2db84-0d65-4fef-90a7-3051d568430b","Type":"ContainerStarted","Data":"c81fb328f314fb2e6861e6fe0be608b3300c9c1992c5c54e180d6270915ca5be"} Dec 16 08:59:32 crc kubenswrapper[4823]: I1216 08:59:32.870536 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-585657f749-s2nbz" event={"ID":"1feea07f-4c8b-4c90-8f3f-63e810a7a525","Type":"ContainerStarted","Data":"85edd95dd769f7e49d143954f68c8d5f9163d250f1893d298cc52bcb61a678f5"} Dec 16 08:59:33 crc kubenswrapper[4823]: I1216 08:59:33.882002 4823 generic.go:334] "Generic (PLEG): container finished" podID="1feea07f-4c8b-4c90-8f3f-63e810a7a525" containerID="5a2c0d9c13987eee26eedb4fa88a845b7ae855811ef7bff152ebec3d6e640b0c" exitCode=0 Dec 16 08:59:33 crc kubenswrapper[4823]: I1216 08:59:33.882073 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-585657f749-s2nbz" 
event={"ID":"1feea07f-4c8b-4c90-8f3f-63e810a7a525","Type":"ContainerDied","Data":"5a2c0d9c13987eee26eedb4fa88a845b7ae855811ef7bff152ebec3d6e640b0c"} Dec 16 08:59:34 crc kubenswrapper[4823]: I1216 08:59:34.894970 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-585657f749-s2nbz" event={"ID":"1feea07f-4c8b-4c90-8f3f-63e810a7a525","Type":"ContainerStarted","Data":"41af04c1b1d3d7fba526b58071e84a55f4d77a41c033341e492f0f1febfa391f"} Dec 16 08:59:34 crc kubenswrapper[4823]: I1216 08:59:34.895829 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-585657f749-s2nbz" Dec 16 08:59:34 crc kubenswrapper[4823]: I1216 08:59:34.927597 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-585657f749-s2nbz" podStartSLOduration=3.927571611 podStartE2EDuration="3.927571611s" podCreationTimestamp="2025-12-16 08:59:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 08:59:34.919937392 +0000 UTC m=+7453.408503555" watchObservedRunningTime="2025-12-16 08:59:34.927571611 +0000 UTC m=+7453.416137724" Dec 16 08:59:36 crc kubenswrapper[4823]: I1216 08:59:36.931654 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-2gq2s" event={"ID":"dbc2db84-0d65-4fef-90a7-3051d568430b","Type":"ContainerStarted","Data":"0b933de89f5d1527a81f1500b2cabbf12c3dcbf034af50a55cc3f912ac36c32a"} Dec 16 08:59:36 crc kubenswrapper[4823]: I1216 08:59:36.953772 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-2gq2s" podStartSLOduration=2.491197425 podStartE2EDuration="5.95375285s" podCreationTimestamp="2025-12-16 08:59:31 +0000 UTC" firstStartedPulling="2025-12-16 08:59:32.504103955 +0000 UTC m=+7450.992670078" lastFinishedPulling="2025-12-16 08:59:35.96665938 +0000 UTC m=+7454.455225503" 
observedRunningTime="2025-12-16 08:59:36.950491819 +0000 UTC m=+7455.439057942" watchObservedRunningTime="2025-12-16 08:59:36.95375285 +0000 UTC m=+7455.442318973" Dec 16 08:59:37 crc kubenswrapper[4823]: I1216 08:59:37.941599 4823 generic.go:334] "Generic (PLEG): container finished" podID="dbc2db84-0d65-4fef-90a7-3051d568430b" containerID="0b933de89f5d1527a81f1500b2cabbf12c3dcbf034af50a55cc3f912ac36c32a" exitCode=0 Dec 16 08:59:37 crc kubenswrapper[4823]: I1216 08:59:37.941660 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-2gq2s" event={"ID":"dbc2db84-0d65-4fef-90a7-3051d568430b","Type":"ContainerDied","Data":"0b933de89f5d1527a81f1500b2cabbf12c3dcbf034af50a55cc3f912ac36c32a"} Dec 16 08:59:39 crc kubenswrapper[4823]: I1216 08:59:39.279625 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-2gq2s" Dec 16 08:59:39 crc kubenswrapper[4823]: I1216 08:59:39.398823 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8ssd7\" (UniqueName: \"kubernetes.io/projected/dbc2db84-0d65-4fef-90a7-3051d568430b-kube-api-access-8ssd7\") pod \"dbc2db84-0d65-4fef-90a7-3051d568430b\" (UID: \"dbc2db84-0d65-4fef-90a7-3051d568430b\") " Dec 16 08:59:39 crc kubenswrapper[4823]: I1216 08:59:39.399129 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dbc2db84-0d65-4fef-90a7-3051d568430b-logs\") pod \"dbc2db84-0d65-4fef-90a7-3051d568430b\" (UID: \"dbc2db84-0d65-4fef-90a7-3051d568430b\") " Dec 16 08:59:39 crc kubenswrapper[4823]: I1216 08:59:39.399223 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbc2db84-0d65-4fef-90a7-3051d568430b-combined-ca-bundle\") pod \"dbc2db84-0d65-4fef-90a7-3051d568430b\" (UID: \"dbc2db84-0d65-4fef-90a7-3051d568430b\") " Dec 16 08:59:39 crc 
kubenswrapper[4823]: I1216 08:59:39.399261 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dbc2db84-0d65-4fef-90a7-3051d568430b-scripts\") pod \"dbc2db84-0d65-4fef-90a7-3051d568430b\" (UID: \"dbc2db84-0d65-4fef-90a7-3051d568430b\") " Dec 16 08:59:39 crc kubenswrapper[4823]: I1216 08:59:39.399281 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbc2db84-0d65-4fef-90a7-3051d568430b-config-data\") pod \"dbc2db84-0d65-4fef-90a7-3051d568430b\" (UID: \"dbc2db84-0d65-4fef-90a7-3051d568430b\") " Dec 16 08:59:39 crc kubenswrapper[4823]: I1216 08:59:39.400265 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dbc2db84-0d65-4fef-90a7-3051d568430b-logs" (OuterVolumeSpecName: "logs") pod "dbc2db84-0d65-4fef-90a7-3051d568430b" (UID: "dbc2db84-0d65-4fef-90a7-3051d568430b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 08:59:39 crc kubenswrapper[4823]: I1216 08:59:39.405128 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbc2db84-0d65-4fef-90a7-3051d568430b-scripts" (OuterVolumeSpecName: "scripts") pod "dbc2db84-0d65-4fef-90a7-3051d568430b" (UID: "dbc2db84-0d65-4fef-90a7-3051d568430b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:59:39 crc kubenswrapper[4823]: I1216 08:59:39.415376 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbc2db84-0d65-4fef-90a7-3051d568430b-kube-api-access-8ssd7" (OuterVolumeSpecName: "kube-api-access-8ssd7") pod "dbc2db84-0d65-4fef-90a7-3051d568430b" (UID: "dbc2db84-0d65-4fef-90a7-3051d568430b"). InnerVolumeSpecName "kube-api-access-8ssd7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:59:39 crc kubenswrapper[4823]: I1216 08:59:39.438534 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbc2db84-0d65-4fef-90a7-3051d568430b-config-data" (OuterVolumeSpecName: "config-data") pod "dbc2db84-0d65-4fef-90a7-3051d568430b" (UID: "dbc2db84-0d65-4fef-90a7-3051d568430b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:59:39 crc kubenswrapper[4823]: I1216 08:59:39.443751 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbc2db84-0d65-4fef-90a7-3051d568430b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dbc2db84-0d65-4fef-90a7-3051d568430b" (UID: "dbc2db84-0d65-4fef-90a7-3051d568430b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:59:39 crc kubenswrapper[4823]: I1216 08:59:39.501341 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbc2db84-0d65-4fef-90a7-3051d568430b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 08:59:39 crc kubenswrapper[4823]: I1216 08:59:39.501389 4823 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dbc2db84-0d65-4fef-90a7-3051d568430b-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 08:59:39 crc kubenswrapper[4823]: I1216 08:59:39.501405 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbc2db84-0d65-4fef-90a7-3051d568430b-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 08:59:39 crc kubenswrapper[4823]: I1216 08:59:39.501418 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8ssd7\" (UniqueName: \"kubernetes.io/projected/dbc2db84-0d65-4fef-90a7-3051d568430b-kube-api-access-8ssd7\") on node \"crc\" DevicePath \"\"" Dec 
16 08:59:39 crc kubenswrapper[4823]: I1216 08:59:39.501434 4823 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dbc2db84-0d65-4fef-90a7-3051d568430b-logs\") on node \"crc\" DevicePath \"\"" Dec 16 08:59:39 crc kubenswrapper[4823]: I1216 08:59:39.961535 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-2gq2s" event={"ID":"dbc2db84-0d65-4fef-90a7-3051d568430b","Type":"ContainerDied","Data":"c81fb328f314fb2e6861e6fe0be608b3300c9c1992c5c54e180d6270915ca5be"} Dec 16 08:59:39 crc kubenswrapper[4823]: I1216 08:59:39.961574 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c81fb328f314fb2e6861e6fe0be608b3300c9c1992c5c54e180d6270915ca5be" Dec 16 08:59:39 crc kubenswrapper[4823]: I1216 08:59:39.961627 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-2gq2s" Dec 16 08:59:40 crc kubenswrapper[4823]: I1216 08:59:40.063509 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-7454ff977b-h6fwh"] Dec 16 08:59:40 crc kubenswrapper[4823]: E1216 08:59:40.063889 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbc2db84-0d65-4fef-90a7-3051d568430b" containerName="placement-db-sync" Dec 16 08:59:40 crc kubenswrapper[4823]: I1216 08:59:40.063905 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbc2db84-0d65-4fef-90a7-3051d568430b" containerName="placement-db-sync" Dec 16 08:59:40 crc kubenswrapper[4823]: I1216 08:59:40.064106 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbc2db84-0d65-4fef-90a7-3051d568430b" containerName="placement-db-sync" Dec 16 08:59:40 crc kubenswrapper[4823]: I1216 08:59:40.065062 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-7454ff977b-h6fwh" Dec 16 08:59:40 crc kubenswrapper[4823]: I1216 08:59:40.067832 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Dec 16 08:59:40 crc kubenswrapper[4823]: I1216 08:59:40.068108 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 16 08:59:40 crc kubenswrapper[4823]: I1216 08:59:40.068108 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 16 08:59:40 crc kubenswrapper[4823]: I1216 08:59:40.068143 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Dec 16 08:59:40 crc kubenswrapper[4823]: I1216 08:59:40.068325 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-9mbcj" Dec 16 08:59:40 crc kubenswrapper[4823]: I1216 08:59:40.081551 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7454ff977b-h6fwh"] Dec 16 08:59:40 crc kubenswrapper[4823]: I1216 08:59:40.213512 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/44c54ba6-36e8-4608-ab54-965ab4bdcef2-public-tls-certs\") pod \"placement-7454ff977b-h6fwh\" (UID: \"44c54ba6-36e8-4608-ab54-965ab4bdcef2\") " pod="openstack/placement-7454ff977b-h6fwh" Dec 16 08:59:40 crc kubenswrapper[4823]: I1216 08:59:40.213621 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/44c54ba6-36e8-4608-ab54-965ab4bdcef2-scripts\") pod \"placement-7454ff977b-h6fwh\" (UID: \"44c54ba6-36e8-4608-ab54-965ab4bdcef2\") " pod="openstack/placement-7454ff977b-h6fwh" Dec 16 08:59:40 crc kubenswrapper[4823]: I1216 08:59:40.213654 4823 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/44c54ba6-36e8-4608-ab54-965ab4bdcef2-internal-tls-certs\") pod \"placement-7454ff977b-h6fwh\" (UID: \"44c54ba6-36e8-4608-ab54-965ab4bdcef2\") " pod="openstack/placement-7454ff977b-h6fwh" Dec 16 08:59:40 crc kubenswrapper[4823]: I1216 08:59:40.213760 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/44c54ba6-36e8-4608-ab54-965ab4bdcef2-logs\") pod \"placement-7454ff977b-h6fwh\" (UID: \"44c54ba6-36e8-4608-ab54-965ab4bdcef2\") " pod="openstack/placement-7454ff977b-h6fwh" Dec 16 08:59:40 crc kubenswrapper[4823]: I1216 08:59:40.213791 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9vvx\" (UniqueName: \"kubernetes.io/projected/44c54ba6-36e8-4608-ab54-965ab4bdcef2-kube-api-access-x9vvx\") pod \"placement-7454ff977b-h6fwh\" (UID: \"44c54ba6-36e8-4608-ab54-965ab4bdcef2\") " pod="openstack/placement-7454ff977b-h6fwh" Dec 16 08:59:40 crc kubenswrapper[4823]: I1216 08:59:40.213843 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44c54ba6-36e8-4608-ab54-965ab4bdcef2-combined-ca-bundle\") pod \"placement-7454ff977b-h6fwh\" (UID: \"44c54ba6-36e8-4608-ab54-965ab4bdcef2\") " pod="openstack/placement-7454ff977b-h6fwh" Dec 16 08:59:40 crc kubenswrapper[4823]: I1216 08:59:40.213869 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44c54ba6-36e8-4608-ab54-965ab4bdcef2-config-data\") pod \"placement-7454ff977b-h6fwh\" (UID: \"44c54ba6-36e8-4608-ab54-965ab4bdcef2\") " pod="openstack/placement-7454ff977b-h6fwh" Dec 16 08:59:40 crc kubenswrapper[4823]: I1216 08:59:40.315832 4823 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/44c54ba6-36e8-4608-ab54-965ab4bdcef2-logs\") pod \"placement-7454ff977b-h6fwh\" (UID: \"44c54ba6-36e8-4608-ab54-965ab4bdcef2\") " pod="openstack/placement-7454ff977b-h6fwh" Dec 16 08:59:40 crc kubenswrapper[4823]: I1216 08:59:40.315932 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9vvx\" (UniqueName: \"kubernetes.io/projected/44c54ba6-36e8-4608-ab54-965ab4bdcef2-kube-api-access-x9vvx\") pod \"placement-7454ff977b-h6fwh\" (UID: \"44c54ba6-36e8-4608-ab54-965ab4bdcef2\") " pod="openstack/placement-7454ff977b-h6fwh" Dec 16 08:59:40 crc kubenswrapper[4823]: I1216 08:59:40.316290 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44c54ba6-36e8-4608-ab54-965ab4bdcef2-combined-ca-bundle\") pod \"placement-7454ff977b-h6fwh\" (UID: \"44c54ba6-36e8-4608-ab54-965ab4bdcef2\") " pod="openstack/placement-7454ff977b-h6fwh" Dec 16 08:59:40 crc kubenswrapper[4823]: I1216 08:59:40.316350 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44c54ba6-36e8-4608-ab54-965ab4bdcef2-config-data\") pod \"placement-7454ff977b-h6fwh\" (UID: \"44c54ba6-36e8-4608-ab54-965ab4bdcef2\") " pod="openstack/placement-7454ff977b-h6fwh" Dec 16 08:59:40 crc kubenswrapper[4823]: I1216 08:59:40.316475 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/44c54ba6-36e8-4608-ab54-965ab4bdcef2-public-tls-certs\") pod \"placement-7454ff977b-h6fwh\" (UID: \"44c54ba6-36e8-4608-ab54-965ab4bdcef2\") " pod="openstack/placement-7454ff977b-h6fwh" Dec 16 08:59:40 crc kubenswrapper[4823]: I1216 08:59:40.316612 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/44c54ba6-36e8-4608-ab54-965ab4bdcef2-scripts\") pod \"placement-7454ff977b-h6fwh\" (UID: \"44c54ba6-36e8-4608-ab54-965ab4bdcef2\") " pod="openstack/placement-7454ff977b-h6fwh" Dec 16 08:59:40 crc kubenswrapper[4823]: I1216 08:59:40.316641 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/44c54ba6-36e8-4608-ab54-965ab4bdcef2-logs\") pod \"placement-7454ff977b-h6fwh\" (UID: \"44c54ba6-36e8-4608-ab54-965ab4bdcef2\") " pod="openstack/placement-7454ff977b-h6fwh" Dec 16 08:59:40 crc kubenswrapper[4823]: I1216 08:59:40.316667 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/44c54ba6-36e8-4608-ab54-965ab4bdcef2-internal-tls-certs\") pod \"placement-7454ff977b-h6fwh\" (UID: \"44c54ba6-36e8-4608-ab54-965ab4bdcef2\") " pod="openstack/placement-7454ff977b-h6fwh" Dec 16 08:59:40 crc kubenswrapper[4823]: I1216 08:59:40.320729 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44c54ba6-36e8-4608-ab54-965ab4bdcef2-config-data\") pod \"placement-7454ff977b-h6fwh\" (UID: \"44c54ba6-36e8-4608-ab54-965ab4bdcef2\") " pod="openstack/placement-7454ff977b-h6fwh" Dec 16 08:59:40 crc kubenswrapper[4823]: I1216 08:59:40.320828 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/44c54ba6-36e8-4608-ab54-965ab4bdcef2-public-tls-certs\") pod \"placement-7454ff977b-h6fwh\" (UID: \"44c54ba6-36e8-4608-ab54-965ab4bdcef2\") " pod="openstack/placement-7454ff977b-h6fwh" Dec 16 08:59:40 crc kubenswrapper[4823]: I1216 08:59:40.320841 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44c54ba6-36e8-4608-ab54-965ab4bdcef2-combined-ca-bundle\") pod 
\"placement-7454ff977b-h6fwh\" (UID: \"44c54ba6-36e8-4608-ab54-965ab4bdcef2\") " pod="openstack/placement-7454ff977b-h6fwh" Dec 16 08:59:40 crc kubenswrapper[4823]: I1216 08:59:40.321156 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/44c54ba6-36e8-4608-ab54-965ab4bdcef2-scripts\") pod \"placement-7454ff977b-h6fwh\" (UID: \"44c54ba6-36e8-4608-ab54-965ab4bdcef2\") " pod="openstack/placement-7454ff977b-h6fwh" Dec 16 08:59:40 crc kubenswrapper[4823]: I1216 08:59:40.323481 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/44c54ba6-36e8-4608-ab54-965ab4bdcef2-internal-tls-certs\") pod \"placement-7454ff977b-h6fwh\" (UID: \"44c54ba6-36e8-4608-ab54-965ab4bdcef2\") " pod="openstack/placement-7454ff977b-h6fwh" Dec 16 08:59:40 crc kubenswrapper[4823]: I1216 08:59:40.338188 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9vvx\" (UniqueName: \"kubernetes.io/projected/44c54ba6-36e8-4608-ab54-965ab4bdcef2-kube-api-access-x9vvx\") pod \"placement-7454ff977b-h6fwh\" (UID: \"44c54ba6-36e8-4608-ab54-965ab4bdcef2\") " pod="openstack/placement-7454ff977b-h6fwh" Dec 16 08:59:40 crc kubenswrapper[4823]: I1216 08:59:40.397987 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-7454ff977b-h6fwh" Dec 16 08:59:40 crc kubenswrapper[4823]: I1216 08:59:40.848768 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7454ff977b-h6fwh"] Dec 16 08:59:40 crc kubenswrapper[4823]: W1216 08:59:40.854539 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod44c54ba6_36e8_4608_ab54_965ab4bdcef2.slice/crio-f799eb9828876f8d4b4204046816d340c29dff9c3e1fb6d7bfc6a096535e7ee0 WatchSource:0}: Error finding container f799eb9828876f8d4b4204046816d340c29dff9c3e1fb6d7bfc6a096535e7ee0: Status 404 returned error can't find the container with id f799eb9828876f8d4b4204046816d340c29dff9c3e1fb6d7bfc6a096535e7ee0 Dec 16 08:59:40 crc kubenswrapper[4823]: I1216 08:59:40.970095 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7454ff977b-h6fwh" event={"ID":"44c54ba6-36e8-4608-ab54-965ab4bdcef2","Type":"ContainerStarted","Data":"f799eb9828876f8d4b4204046816d340c29dff9c3e1fb6d7bfc6a096535e7ee0"} Dec 16 08:59:41 crc kubenswrapper[4823]: I1216 08:59:41.983914 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7454ff977b-h6fwh" event={"ID":"44c54ba6-36e8-4608-ab54-965ab4bdcef2","Type":"ContainerStarted","Data":"651f28a2c721b5b4308bee72f9032a131e2c7f4a064121891960b81b54b65133"} Dec 16 08:59:41 crc kubenswrapper[4823]: I1216 08:59:41.984383 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7454ff977b-h6fwh" event={"ID":"44c54ba6-36e8-4608-ab54-965ab4bdcef2","Type":"ContainerStarted","Data":"4e552140d312fcdfa52ae99bb54947c323a559bd5e4b943aed566e48f1890450"} Dec 16 08:59:41 crc kubenswrapper[4823]: I1216 08:59:41.984412 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-7454ff977b-h6fwh" Dec 16 08:59:42 crc kubenswrapper[4823]: I1216 08:59:42.015292 4823 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openstack/placement-7454ff977b-h6fwh" podStartSLOduration=2.015270239 podStartE2EDuration="2.015270239s" podCreationTimestamp="2025-12-16 08:59:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 08:59:42.009414447 +0000 UTC m=+7460.497980570" watchObservedRunningTime="2025-12-16 08:59:42.015270239 +0000 UTC m=+7460.503836362" Dec 16 08:59:42 crc kubenswrapper[4823]: I1216 08:59:42.285815 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-585657f749-s2nbz" Dec 16 08:59:42 crc kubenswrapper[4823]: I1216 08:59:42.380074 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-98894d689-8clmb"] Dec 16 08:59:42 crc kubenswrapper[4823]: I1216 08:59:42.380376 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-98894d689-8clmb" podUID="4b6a99be-d903-4ce7-9832-6a085da5277e" containerName="dnsmasq-dns" containerID="cri-o://537d03da510a65be1e8420aea5d88eef7274799ab03e9f947a722efe4d26528b" gracePeriod=10 Dec 16 08:59:42 crc kubenswrapper[4823]: I1216 08:59:42.911276 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-98894d689-8clmb" Dec 16 08:59:42 crc kubenswrapper[4823]: I1216 08:59:42.997445 4823 generic.go:334] "Generic (PLEG): container finished" podID="4b6a99be-d903-4ce7-9832-6a085da5277e" containerID="537d03da510a65be1e8420aea5d88eef7274799ab03e9f947a722efe4d26528b" exitCode=0 Dec 16 08:59:42 crc kubenswrapper[4823]: I1216 08:59:42.997503 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-98894d689-8clmb" event={"ID":"4b6a99be-d903-4ce7-9832-6a085da5277e","Type":"ContainerDied","Data":"537d03da510a65be1e8420aea5d88eef7274799ab03e9f947a722efe4d26528b"} Dec 16 08:59:42 crc kubenswrapper[4823]: I1216 08:59:42.997625 4823 scope.go:117] "RemoveContainer" containerID="537d03da510a65be1e8420aea5d88eef7274799ab03e9f947a722efe4d26528b" Dec 16 08:59:42 crc kubenswrapper[4823]: I1216 08:59:42.997530 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-98894d689-8clmb" Dec 16 08:59:42 crc kubenswrapper[4823]: I1216 08:59:42.998235 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-7454ff977b-h6fwh" Dec 16 08:59:42 crc kubenswrapper[4823]: I1216 08:59:42.998266 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-98894d689-8clmb" event={"ID":"4b6a99be-d903-4ce7-9832-6a085da5277e","Type":"ContainerDied","Data":"160a60b436a004b1393bf0b5f79e01488966598514f641ca2a602b1440de9233"} Dec 16 08:59:43 crc kubenswrapper[4823]: I1216 08:59:43.030850 4823 scope.go:117] "RemoveContainer" containerID="f6c541da306bf170f745b7e97ee82095ec9cb19fd75fca5f56caad85ef0cc10c" Dec 16 08:59:43 crc kubenswrapper[4823]: I1216 08:59:43.058140 4823 scope.go:117] "RemoveContainer" containerID="537d03da510a65be1e8420aea5d88eef7274799ab03e9f947a722efe4d26528b" Dec 16 08:59:43 crc kubenswrapper[4823]: E1216 08:59:43.058693 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"537d03da510a65be1e8420aea5d88eef7274799ab03e9f947a722efe4d26528b\": container with ID starting with 537d03da510a65be1e8420aea5d88eef7274799ab03e9f947a722efe4d26528b not found: ID does not exist" containerID="537d03da510a65be1e8420aea5d88eef7274799ab03e9f947a722efe4d26528b" Dec 16 08:59:43 crc kubenswrapper[4823]: I1216 08:59:43.058740 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"537d03da510a65be1e8420aea5d88eef7274799ab03e9f947a722efe4d26528b"} err="failed to get container status \"537d03da510a65be1e8420aea5d88eef7274799ab03e9f947a722efe4d26528b\": rpc error: code = NotFound desc = could not find container \"537d03da510a65be1e8420aea5d88eef7274799ab03e9f947a722efe4d26528b\": container with ID starting with 537d03da510a65be1e8420aea5d88eef7274799ab03e9f947a722efe4d26528b not found: ID does not exist" Dec 16 08:59:43 crc kubenswrapper[4823]: I1216 08:59:43.058769 4823 scope.go:117] "RemoveContainer" containerID="f6c541da306bf170f745b7e97ee82095ec9cb19fd75fca5f56caad85ef0cc10c" Dec 16 08:59:43 crc kubenswrapper[4823]: E1216 08:59:43.059371 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6c541da306bf170f745b7e97ee82095ec9cb19fd75fca5f56caad85ef0cc10c\": container with ID starting with f6c541da306bf170f745b7e97ee82095ec9cb19fd75fca5f56caad85ef0cc10c not found: ID does not exist" containerID="f6c541da306bf170f745b7e97ee82095ec9cb19fd75fca5f56caad85ef0cc10c" Dec 16 08:59:43 crc kubenswrapper[4823]: I1216 08:59:43.059441 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6c541da306bf170f745b7e97ee82095ec9cb19fd75fca5f56caad85ef0cc10c"} err="failed to get container status \"f6c541da306bf170f745b7e97ee82095ec9cb19fd75fca5f56caad85ef0cc10c\": rpc error: code = NotFound desc = could not find container 
\"f6c541da306bf170f745b7e97ee82095ec9cb19fd75fca5f56caad85ef0cc10c\": container with ID starting with f6c541da306bf170f745b7e97ee82095ec9cb19fd75fca5f56caad85ef0cc10c not found: ID does not exist" Dec 16 08:59:43 crc kubenswrapper[4823]: I1216 08:59:43.090240 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4b6a99be-d903-4ce7-9832-6a085da5277e-dns-svc\") pod \"4b6a99be-d903-4ce7-9832-6a085da5277e\" (UID: \"4b6a99be-d903-4ce7-9832-6a085da5277e\") " Dec 16 08:59:43 crc kubenswrapper[4823]: I1216 08:59:43.090297 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b6a99be-d903-4ce7-9832-6a085da5277e-config\") pod \"4b6a99be-d903-4ce7-9832-6a085da5277e\" (UID: \"4b6a99be-d903-4ce7-9832-6a085da5277e\") " Dec 16 08:59:43 crc kubenswrapper[4823]: I1216 08:59:43.090415 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4b6a99be-d903-4ce7-9832-6a085da5277e-ovsdbserver-sb\") pod \"4b6a99be-d903-4ce7-9832-6a085da5277e\" (UID: \"4b6a99be-d903-4ce7-9832-6a085da5277e\") " Dec 16 08:59:43 crc kubenswrapper[4823]: I1216 08:59:43.090495 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4b6a99be-d903-4ce7-9832-6a085da5277e-ovsdbserver-nb\") pod \"4b6a99be-d903-4ce7-9832-6a085da5277e\" (UID: \"4b6a99be-d903-4ce7-9832-6a085da5277e\") " Dec 16 08:59:43 crc kubenswrapper[4823]: I1216 08:59:43.090534 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pmgzb\" (UniqueName: \"kubernetes.io/projected/4b6a99be-d903-4ce7-9832-6a085da5277e-kube-api-access-pmgzb\") pod \"4b6a99be-d903-4ce7-9832-6a085da5277e\" (UID: \"4b6a99be-d903-4ce7-9832-6a085da5277e\") " Dec 16 08:59:43 crc kubenswrapper[4823]: I1216 
08:59:43.095736 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b6a99be-d903-4ce7-9832-6a085da5277e-kube-api-access-pmgzb" (OuterVolumeSpecName: "kube-api-access-pmgzb") pod "4b6a99be-d903-4ce7-9832-6a085da5277e" (UID: "4b6a99be-d903-4ce7-9832-6a085da5277e"). InnerVolumeSpecName "kube-api-access-pmgzb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:59:43 crc kubenswrapper[4823]: I1216 08:59:43.135922 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b6a99be-d903-4ce7-9832-6a085da5277e-config" (OuterVolumeSpecName: "config") pod "4b6a99be-d903-4ce7-9832-6a085da5277e" (UID: "4b6a99be-d903-4ce7-9832-6a085da5277e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:59:43 crc kubenswrapper[4823]: I1216 08:59:43.138410 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b6a99be-d903-4ce7-9832-6a085da5277e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4b6a99be-d903-4ce7-9832-6a085da5277e" (UID: "4b6a99be-d903-4ce7-9832-6a085da5277e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:59:43 crc kubenswrapper[4823]: I1216 08:59:43.145848 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b6a99be-d903-4ce7-9832-6a085da5277e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4b6a99be-d903-4ce7-9832-6a085da5277e" (UID: "4b6a99be-d903-4ce7-9832-6a085da5277e"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:59:43 crc kubenswrapper[4823]: I1216 08:59:43.149534 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b6a99be-d903-4ce7-9832-6a085da5277e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4b6a99be-d903-4ce7-9832-6a085da5277e" (UID: "4b6a99be-d903-4ce7-9832-6a085da5277e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:59:43 crc kubenswrapper[4823]: I1216 08:59:43.193526 4823 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4b6a99be-d903-4ce7-9832-6a085da5277e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 16 08:59:43 crc kubenswrapper[4823]: I1216 08:59:43.193564 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pmgzb\" (UniqueName: \"kubernetes.io/projected/4b6a99be-d903-4ce7-9832-6a085da5277e-kube-api-access-pmgzb\") on node \"crc\" DevicePath \"\"" Dec 16 08:59:43 crc kubenswrapper[4823]: I1216 08:59:43.193576 4823 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4b6a99be-d903-4ce7-9832-6a085da5277e-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 16 08:59:43 crc kubenswrapper[4823]: I1216 08:59:43.193584 4823 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b6a99be-d903-4ce7-9832-6a085da5277e-config\") on node \"crc\" DevicePath \"\"" Dec 16 08:59:43 crc kubenswrapper[4823]: I1216 08:59:43.193595 4823 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4b6a99be-d903-4ce7-9832-6a085da5277e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 16 08:59:43 crc kubenswrapper[4823]: I1216 08:59:43.342767 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-98894d689-8clmb"] Dec 16 
08:59:43 crc kubenswrapper[4823]: I1216 08:59:43.355459 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-98894d689-8clmb"] Dec 16 08:59:43 crc kubenswrapper[4823]: I1216 08:59:43.781289 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b6a99be-d903-4ce7-9832-6a085da5277e" path="/var/lib/kubelet/pods/4b6a99be-d903-4ce7-9832-6a085da5277e/volumes" Dec 16 09:00:00 crc kubenswrapper[4823]: I1216 09:00:00.174775 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431260-c76bt"] Dec 16 09:00:00 crc kubenswrapper[4823]: E1216 09:00:00.175858 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b6a99be-d903-4ce7-9832-6a085da5277e" containerName="init" Dec 16 09:00:00 crc kubenswrapper[4823]: I1216 09:00:00.175880 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b6a99be-d903-4ce7-9832-6a085da5277e" containerName="init" Dec 16 09:00:00 crc kubenswrapper[4823]: E1216 09:00:00.175923 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b6a99be-d903-4ce7-9832-6a085da5277e" containerName="dnsmasq-dns" Dec 16 09:00:00 crc kubenswrapper[4823]: I1216 09:00:00.175935 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b6a99be-d903-4ce7-9832-6a085da5277e" containerName="dnsmasq-dns" Dec 16 09:00:00 crc kubenswrapper[4823]: I1216 09:00:00.176279 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b6a99be-d903-4ce7-9832-6a085da5277e" containerName="dnsmasq-dns" Dec 16 09:00:00 crc kubenswrapper[4823]: I1216 09:00:00.177259 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431260-c76bt" Dec 16 09:00:00 crc kubenswrapper[4823]: I1216 09:00:00.179406 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 16 09:00:00 crc kubenswrapper[4823]: I1216 09:00:00.180075 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 16 09:00:00 crc kubenswrapper[4823]: I1216 09:00:00.199234 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431260-c76bt"] Dec 16 09:00:00 crc kubenswrapper[4823]: I1216 09:00:00.242674 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6a543059-79b1-4ea7-8df7-4ae34bbb3fcc-secret-volume\") pod \"collect-profiles-29431260-c76bt\" (UID: \"6a543059-79b1-4ea7-8df7-4ae34bbb3fcc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431260-c76bt" Dec 16 09:00:00 crc kubenswrapper[4823]: I1216 09:00:00.242722 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6a543059-79b1-4ea7-8df7-4ae34bbb3fcc-config-volume\") pod \"collect-profiles-29431260-c76bt\" (UID: \"6a543059-79b1-4ea7-8df7-4ae34bbb3fcc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431260-c76bt" Dec 16 09:00:00 crc kubenswrapper[4823]: I1216 09:00:00.242827 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnd6j\" (UniqueName: \"kubernetes.io/projected/6a543059-79b1-4ea7-8df7-4ae34bbb3fcc-kube-api-access-vnd6j\") pod \"collect-profiles-29431260-c76bt\" (UID: \"6a543059-79b1-4ea7-8df7-4ae34bbb3fcc\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29431260-c76bt" Dec 16 09:00:00 crc kubenswrapper[4823]: I1216 09:00:00.344540 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnd6j\" (UniqueName: \"kubernetes.io/projected/6a543059-79b1-4ea7-8df7-4ae34bbb3fcc-kube-api-access-vnd6j\") pod \"collect-profiles-29431260-c76bt\" (UID: \"6a543059-79b1-4ea7-8df7-4ae34bbb3fcc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431260-c76bt" Dec 16 09:00:00 crc kubenswrapper[4823]: I1216 09:00:00.344787 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6a543059-79b1-4ea7-8df7-4ae34bbb3fcc-secret-volume\") pod \"collect-profiles-29431260-c76bt\" (UID: \"6a543059-79b1-4ea7-8df7-4ae34bbb3fcc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431260-c76bt" Dec 16 09:00:00 crc kubenswrapper[4823]: I1216 09:00:00.344830 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6a543059-79b1-4ea7-8df7-4ae34bbb3fcc-config-volume\") pod \"collect-profiles-29431260-c76bt\" (UID: \"6a543059-79b1-4ea7-8df7-4ae34bbb3fcc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431260-c76bt" Dec 16 09:00:00 crc kubenswrapper[4823]: I1216 09:00:00.346536 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6a543059-79b1-4ea7-8df7-4ae34bbb3fcc-config-volume\") pod \"collect-profiles-29431260-c76bt\" (UID: \"6a543059-79b1-4ea7-8df7-4ae34bbb3fcc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431260-c76bt" Dec 16 09:00:00 crc kubenswrapper[4823]: I1216 09:00:00.353843 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/6a543059-79b1-4ea7-8df7-4ae34bbb3fcc-secret-volume\") pod \"collect-profiles-29431260-c76bt\" (UID: \"6a543059-79b1-4ea7-8df7-4ae34bbb3fcc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431260-c76bt" Dec 16 09:00:00 crc kubenswrapper[4823]: I1216 09:00:00.359258 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnd6j\" (UniqueName: \"kubernetes.io/projected/6a543059-79b1-4ea7-8df7-4ae34bbb3fcc-kube-api-access-vnd6j\") pod \"collect-profiles-29431260-c76bt\" (UID: \"6a543059-79b1-4ea7-8df7-4ae34bbb3fcc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431260-c76bt" Dec 16 09:00:00 crc kubenswrapper[4823]: I1216 09:00:00.508304 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431260-c76bt" Dec 16 09:00:01 crc kubenswrapper[4823]: I1216 09:00:01.030236 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431260-c76bt"] Dec 16 09:00:01 crc kubenswrapper[4823]: I1216 09:00:01.152666 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431260-c76bt" event={"ID":"6a543059-79b1-4ea7-8df7-4ae34bbb3fcc","Type":"ContainerStarted","Data":"65fa35476b73eebb8fde285d7d53208bb707ec7333b91896d66710ac5251af42"} Dec 16 09:00:02 crc kubenswrapper[4823]: I1216 09:00:02.161248 4823 generic.go:334] "Generic (PLEG): container finished" podID="6a543059-79b1-4ea7-8df7-4ae34bbb3fcc" containerID="7015afe1684cfa883cef993ad039463f7e907a5dec61065f6dfd9f45ec97dcce" exitCode=0 Dec 16 09:00:02 crc kubenswrapper[4823]: I1216 09:00:02.161456 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431260-c76bt" 
event={"ID":"6a543059-79b1-4ea7-8df7-4ae34bbb3fcc","Type":"ContainerDied","Data":"7015afe1684cfa883cef993ad039463f7e907a5dec61065f6dfd9f45ec97dcce"} Dec 16 09:00:03 crc kubenswrapper[4823]: I1216 09:00:03.585451 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431260-c76bt" Dec 16 09:00:03 crc kubenswrapper[4823]: I1216 09:00:03.709710 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vnd6j\" (UniqueName: \"kubernetes.io/projected/6a543059-79b1-4ea7-8df7-4ae34bbb3fcc-kube-api-access-vnd6j\") pod \"6a543059-79b1-4ea7-8df7-4ae34bbb3fcc\" (UID: \"6a543059-79b1-4ea7-8df7-4ae34bbb3fcc\") " Dec 16 09:00:03 crc kubenswrapper[4823]: I1216 09:00:03.709921 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6a543059-79b1-4ea7-8df7-4ae34bbb3fcc-secret-volume\") pod \"6a543059-79b1-4ea7-8df7-4ae34bbb3fcc\" (UID: \"6a543059-79b1-4ea7-8df7-4ae34bbb3fcc\") " Dec 16 09:00:03 crc kubenswrapper[4823]: I1216 09:00:03.710150 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6a543059-79b1-4ea7-8df7-4ae34bbb3fcc-config-volume\") pod \"6a543059-79b1-4ea7-8df7-4ae34bbb3fcc\" (UID: \"6a543059-79b1-4ea7-8df7-4ae34bbb3fcc\") " Dec 16 09:00:03 crc kubenswrapper[4823]: I1216 09:00:03.711533 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a543059-79b1-4ea7-8df7-4ae34bbb3fcc-config-volume" (OuterVolumeSpecName: "config-volume") pod "6a543059-79b1-4ea7-8df7-4ae34bbb3fcc" (UID: "6a543059-79b1-4ea7-8df7-4ae34bbb3fcc"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 09:00:03 crc kubenswrapper[4823]: I1216 09:00:03.715901 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a543059-79b1-4ea7-8df7-4ae34bbb3fcc-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "6a543059-79b1-4ea7-8df7-4ae34bbb3fcc" (UID: "6a543059-79b1-4ea7-8df7-4ae34bbb3fcc"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:00:03 crc kubenswrapper[4823]: I1216 09:00:03.716644 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a543059-79b1-4ea7-8df7-4ae34bbb3fcc-kube-api-access-vnd6j" (OuterVolumeSpecName: "kube-api-access-vnd6j") pod "6a543059-79b1-4ea7-8df7-4ae34bbb3fcc" (UID: "6a543059-79b1-4ea7-8df7-4ae34bbb3fcc"). InnerVolumeSpecName "kube-api-access-vnd6j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:00:03 crc kubenswrapper[4823]: I1216 09:00:03.812332 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vnd6j\" (UniqueName: \"kubernetes.io/projected/6a543059-79b1-4ea7-8df7-4ae34bbb3fcc-kube-api-access-vnd6j\") on node \"crc\" DevicePath \"\"" Dec 16 09:00:03 crc kubenswrapper[4823]: I1216 09:00:03.812382 4823 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6a543059-79b1-4ea7-8df7-4ae34bbb3fcc-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 16 09:00:03 crc kubenswrapper[4823]: I1216 09:00:03.812408 4823 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6a543059-79b1-4ea7-8df7-4ae34bbb3fcc-config-volume\") on node \"crc\" DevicePath \"\"" Dec 16 09:00:04 crc kubenswrapper[4823]: I1216 09:00:04.189451 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431260-c76bt" 
event={"ID":"6a543059-79b1-4ea7-8df7-4ae34bbb3fcc","Type":"ContainerDied","Data":"65fa35476b73eebb8fde285d7d53208bb707ec7333b91896d66710ac5251af42"} Dec 16 09:00:04 crc kubenswrapper[4823]: I1216 09:00:04.189486 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="65fa35476b73eebb8fde285d7d53208bb707ec7333b91896d66710ac5251af42" Dec 16 09:00:04 crc kubenswrapper[4823]: I1216 09:00:04.189518 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431260-c76bt" Dec 16 09:00:04 crc kubenswrapper[4823]: I1216 09:00:04.681685 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431215-45r5m"] Dec 16 09:00:04 crc kubenswrapper[4823]: I1216 09:00:04.689070 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431215-45r5m"] Dec 16 09:00:05 crc kubenswrapper[4823]: I1216 09:00:05.781148 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67e6e2f2-898a-4a31-aa9a-b95c8dcfd0bb" path="/var/lib/kubelet/pods/67e6e2f2-898a-4a31-aa9a-b95c8dcfd0bb/volumes" Dec 16 09:00:11 crc kubenswrapper[4823]: I1216 09:00:11.515254 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-7454ff977b-h6fwh" Dec 16 09:00:11 crc kubenswrapper[4823]: I1216 09:00:11.517840 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-7454ff977b-h6fwh" Dec 16 09:00:28 crc kubenswrapper[4823]: I1216 09:00:28.134273 4823 patch_prober.go:28] interesting pod/machine-config-daemon-fv56f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 09:00:28 crc kubenswrapper[4823]: I1216 
09:00:28.134823 4823 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 09:00:35 crc kubenswrapper[4823]: I1216 09:00:35.831229 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-xzsbb"] Dec 16 09:00:35 crc kubenswrapper[4823]: E1216 09:00:35.832320 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a543059-79b1-4ea7-8df7-4ae34bbb3fcc" containerName="collect-profiles" Dec 16 09:00:35 crc kubenswrapper[4823]: I1216 09:00:35.832345 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a543059-79b1-4ea7-8df7-4ae34bbb3fcc" containerName="collect-profiles" Dec 16 09:00:35 crc kubenswrapper[4823]: I1216 09:00:35.832566 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a543059-79b1-4ea7-8df7-4ae34bbb3fcc" containerName="collect-profiles" Dec 16 09:00:35 crc kubenswrapper[4823]: I1216 09:00:35.833310 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-xzsbb" Dec 16 09:00:35 crc kubenswrapper[4823]: I1216 09:00:35.841172 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-xzsbb"] Dec 16 09:00:35 crc kubenswrapper[4823]: I1216 09:00:35.928781 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-mg742"] Dec 16 09:00:35 crc kubenswrapper[4823]: I1216 09:00:35.933543 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-mg742" Dec 16 09:00:35 crc kubenswrapper[4823]: I1216 09:00:35.946680 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4de9c6b0-50af-4139-b54c-eed47d4d804b-operator-scripts\") pod \"nova-api-db-create-xzsbb\" (UID: \"4de9c6b0-50af-4139-b54c-eed47d4d804b\") " pod="openstack/nova-api-db-create-xzsbb" Dec 16 09:00:35 crc kubenswrapper[4823]: I1216 09:00:35.946724 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4k6ct\" (UniqueName: \"kubernetes.io/projected/4de9c6b0-50af-4139-b54c-eed47d4d804b-kube-api-access-4k6ct\") pod \"nova-api-db-create-xzsbb\" (UID: \"4de9c6b0-50af-4139-b54c-eed47d4d804b\") " pod="openstack/nova-api-db-create-xzsbb" Dec 16 09:00:35 crc kubenswrapper[4823]: I1216 09:00:35.961264 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-mg742"] Dec 16 09:00:36 crc kubenswrapper[4823]: I1216 09:00:36.064283 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69ttx\" (UniqueName: \"kubernetes.io/projected/532985ed-811b-4627-80e7-4278a094a3df-kube-api-access-69ttx\") pod \"nova-cell0-db-create-mg742\" (UID: \"532985ed-811b-4627-80e7-4278a094a3df\") " pod="openstack/nova-cell0-db-create-mg742" Dec 16 09:00:36 crc kubenswrapper[4823]: I1216 09:00:36.064617 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/532985ed-811b-4627-80e7-4278a094a3df-operator-scripts\") pod \"nova-cell0-db-create-mg742\" (UID: \"532985ed-811b-4627-80e7-4278a094a3df\") " pod="openstack/nova-cell0-db-create-mg742" Dec 16 09:00:36 crc kubenswrapper[4823]: I1216 09:00:36.064804 4823 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4de9c6b0-50af-4139-b54c-eed47d4d804b-operator-scripts\") pod \"nova-api-db-create-xzsbb\" (UID: \"4de9c6b0-50af-4139-b54c-eed47d4d804b\") " pod="openstack/nova-api-db-create-xzsbb" Dec 16 09:00:36 crc kubenswrapper[4823]: I1216 09:00:36.064861 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4k6ct\" (UniqueName: \"kubernetes.io/projected/4de9c6b0-50af-4139-b54c-eed47d4d804b-kube-api-access-4k6ct\") pod \"nova-api-db-create-xzsbb\" (UID: \"4de9c6b0-50af-4139-b54c-eed47d4d804b\") " pod="openstack/nova-api-db-create-xzsbb" Dec 16 09:00:36 crc kubenswrapper[4823]: I1216 09:00:36.066064 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4de9c6b0-50af-4139-b54c-eed47d4d804b-operator-scripts\") pod \"nova-api-db-create-xzsbb\" (UID: \"4de9c6b0-50af-4139-b54c-eed47d4d804b\") " pod="openstack/nova-api-db-create-xzsbb" Dec 16 09:00:36 crc kubenswrapper[4823]: I1216 09:00:36.074860 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-f251-account-create-update-gmqx7"] Dec 16 09:00:36 crc kubenswrapper[4823]: I1216 09:00:36.076700 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-f251-account-create-update-gmqx7" Dec 16 09:00:36 crc kubenswrapper[4823]: I1216 09:00:36.088219 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Dec 16 09:00:36 crc kubenswrapper[4823]: I1216 09:00:36.115923 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-f251-account-create-update-gmqx7"] Dec 16 09:00:36 crc kubenswrapper[4823]: I1216 09:00:36.146876 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4k6ct\" (UniqueName: \"kubernetes.io/projected/4de9c6b0-50af-4139-b54c-eed47d4d804b-kube-api-access-4k6ct\") pod \"nova-api-db-create-xzsbb\" (UID: \"4de9c6b0-50af-4139-b54c-eed47d4d804b\") " pod="openstack/nova-api-db-create-xzsbb" Dec 16 09:00:36 crc kubenswrapper[4823]: I1216 09:00:36.182288 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-xzsbb" Dec 16 09:00:36 crc kubenswrapper[4823]: I1216 09:00:36.183854 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pddl\" (UniqueName: \"kubernetes.io/projected/43a0e7a5-3a19-4ec3-9e34-be161590ae54-kube-api-access-9pddl\") pod \"nova-api-f251-account-create-update-gmqx7\" (UID: \"43a0e7a5-3a19-4ec3-9e34-be161590ae54\") " pod="openstack/nova-api-f251-account-create-update-gmqx7" Dec 16 09:00:36 crc kubenswrapper[4823]: I1216 09:00:36.183942 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/532985ed-811b-4627-80e7-4278a094a3df-operator-scripts\") pod \"nova-cell0-db-create-mg742\" (UID: \"532985ed-811b-4627-80e7-4278a094a3df\") " pod="openstack/nova-cell0-db-create-mg742" Dec 16 09:00:36 crc kubenswrapper[4823]: I1216 09:00:36.184273 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-69ttx\" 
(UniqueName: \"kubernetes.io/projected/532985ed-811b-4627-80e7-4278a094a3df-kube-api-access-69ttx\") pod \"nova-cell0-db-create-mg742\" (UID: \"532985ed-811b-4627-80e7-4278a094a3df\") " pod="openstack/nova-cell0-db-create-mg742" Dec 16 09:00:36 crc kubenswrapper[4823]: I1216 09:00:36.184314 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/43a0e7a5-3a19-4ec3-9e34-be161590ae54-operator-scripts\") pod \"nova-api-f251-account-create-update-gmqx7\" (UID: \"43a0e7a5-3a19-4ec3-9e34-be161590ae54\") " pod="openstack/nova-api-f251-account-create-update-gmqx7" Dec 16 09:00:36 crc kubenswrapper[4823]: I1216 09:00:36.185306 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/532985ed-811b-4627-80e7-4278a094a3df-operator-scripts\") pod \"nova-cell0-db-create-mg742\" (UID: \"532985ed-811b-4627-80e7-4278a094a3df\") " pod="openstack/nova-cell0-db-create-mg742" Dec 16 09:00:36 crc kubenswrapper[4823]: I1216 09:00:36.198641 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-m8qt2"] Dec 16 09:00:36 crc kubenswrapper[4823]: I1216 09:00:36.200172 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-m8qt2" Dec 16 09:00:36 crc kubenswrapper[4823]: I1216 09:00:36.243112 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-m8qt2"] Dec 16 09:00:36 crc kubenswrapper[4823]: I1216 09:00:36.263737 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-69ttx\" (UniqueName: \"kubernetes.io/projected/532985ed-811b-4627-80e7-4278a094a3df-kube-api-access-69ttx\") pod \"nova-cell0-db-create-mg742\" (UID: \"532985ed-811b-4627-80e7-4278a094a3df\") " pod="openstack/nova-cell0-db-create-mg742" Dec 16 09:00:36 crc kubenswrapper[4823]: I1216 09:00:36.285744 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/43a0e7a5-3a19-4ec3-9e34-be161590ae54-operator-scripts\") pod \"nova-api-f251-account-create-update-gmqx7\" (UID: \"43a0e7a5-3a19-4ec3-9e34-be161590ae54\") " pod="openstack/nova-api-f251-account-create-update-gmqx7" Dec 16 09:00:36 crc kubenswrapper[4823]: I1216 09:00:36.285796 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfhq9\" (UniqueName: \"kubernetes.io/projected/8305f2ac-41bf-438a-a905-22723822520b-kube-api-access-xfhq9\") pod \"nova-cell1-db-create-m8qt2\" (UID: \"8305f2ac-41bf-438a-a905-22723822520b\") " pod="openstack/nova-cell1-db-create-m8qt2" Dec 16 09:00:36 crc kubenswrapper[4823]: I1216 09:00:36.286397 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9pddl\" (UniqueName: \"kubernetes.io/projected/43a0e7a5-3a19-4ec3-9e34-be161590ae54-kube-api-access-9pddl\") pod \"nova-api-f251-account-create-update-gmqx7\" (UID: \"43a0e7a5-3a19-4ec3-9e34-be161590ae54\") " pod="openstack/nova-api-f251-account-create-update-gmqx7" Dec 16 09:00:36 crc kubenswrapper[4823]: I1216 09:00:36.286463 4823 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8305f2ac-41bf-438a-a905-22723822520b-operator-scripts\") pod \"nova-cell1-db-create-m8qt2\" (UID: \"8305f2ac-41bf-438a-a905-22723822520b\") " pod="openstack/nova-cell1-db-create-m8qt2" Dec 16 09:00:36 crc kubenswrapper[4823]: I1216 09:00:36.286495 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/43a0e7a5-3a19-4ec3-9e34-be161590ae54-operator-scripts\") pod \"nova-api-f251-account-create-update-gmqx7\" (UID: \"43a0e7a5-3a19-4ec3-9e34-be161590ae54\") " pod="openstack/nova-api-f251-account-create-update-gmqx7" Dec 16 09:00:36 crc kubenswrapper[4823]: I1216 09:00:36.307981 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9pddl\" (UniqueName: \"kubernetes.io/projected/43a0e7a5-3a19-4ec3-9e34-be161590ae54-kube-api-access-9pddl\") pod \"nova-api-f251-account-create-update-gmqx7\" (UID: \"43a0e7a5-3a19-4ec3-9e34-be161590ae54\") " pod="openstack/nova-api-f251-account-create-update-gmqx7" Dec 16 09:00:36 crc kubenswrapper[4823]: I1216 09:00:36.377373 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-da3b-account-create-update-lmt62"] Dec 16 09:00:36 crc kubenswrapper[4823]: I1216 09:00:36.383165 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-da3b-account-create-update-lmt62" Dec 16 09:00:36 crc kubenswrapper[4823]: I1216 09:00:36.385161 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Dec 16 09:00:36 crc kubenswrapper[4823]: I1216 09:00:36.388226 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xfhq9\" (UniqueName: \"kubernetes.io/projected/8305f2ac-41bf-438a-a905-22723822520b-kube-api-access-xfhq9\") pod \"nova-cell1-db-create-m8qt2\" (UID: \"8305f2ac-41bf-438a-a905-22723822520b\") " pod="openstack/nova-cell1-db-create-m8qt2" Dec 16 09:00:36 crc kubenswrapper[4823]: I1216 09:00:36.388390 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8305f2ac-41bf-438a-a905-22723822520b-operator-scripts\") pod \"nova-cell1-db-create-m8qt2\" (UID: \"8305f2ac-41bf-438a-a905-22723822520b\") " pod="openstack/nova-cell1-db-create-m8qt2" Dec 16 09:00:36 crc kubenswrapper[4823]: I1216 09:00:36.389528 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8305f2ac-41bf-438a-a905-22723822520b-operator-scripts\") pod \"nova-cell1-db-create-m8qt2\" (UID: \"8305f2ac-41bf-438a-a905-22723822520b\") " pod="openstack/nova-cell1-db-create-m8qt2" Dec 16 09:00:36 crc kubenswrapper[4823]: I1216 09:00:36.405909 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-da3b-account-create-update-lmt62"] Dec 16 09:00:36 crc kubenswrapper[4823]: I1216 09:00:36.429169 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfhq9\" (UniqueName: \"kubernetes.io/projected/8305f2ac-41bf-438a-a905-22723822520b-kube-api-access-xfhq9\") pod \"nova-cell1-db-create-m8qt2\" (UID: \"8305f2ac-41bf-438a-a905-22723822520b\") " pod="openstack/nova-cell1-db-create-m8qt2" Dec 16 
09:00:36 crc kubenswrapper[4823]: I1216 09:00:36.434773 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-f251-account-create-update-gmqx7" Dec 16 09:00:36 crc kubenswrapper[4823]: I1216 09:00:36.488361 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-2d63-account-create-update-xv7fc"] Dec 16 09:00:36 crc kubenswrapper[4823]: I1216 09:00:36.489825 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-2d63-account-create-update-xv7fc" Dec 16 09:00:36 crc kubenswrapper[4823]: I1216 09:00:36.489827 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tl5r7\" (UniqueName: \"kubernetes.io/projected/372f1ad6-80bd-4662-bf84-5c4bcc44bf05-kube-api-access-tl5r7\") pod \"nova-cell0-da3b-account-create-update-lmt62\" (UID: \"372f1ad6-80bd-4662-bf84-5c4bcc44bf05\") " pod="openstack/nova-cell0-da3b-account-create-update-lmt62" Dec 16 09:00:36 crc kubenswrapper[4823]: I1216 09:00:36.490087 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/372f1ad6-80bd-4662-bf84-5c4bcc44bf05-operator-scripts\") pod \"nova-cell0-da3b-account-create-update-lmt62\" (UID: \"372f1ad6-80bd-4662-bf84-5c4bcc44bf05\") " pod="openstack/nova-cell0-da3b-account-create-update-lmt62" Dec 16 09:00:36 crc kubenswrapper[4823]: I1216 09:00:36.492350 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Dec 16 09:00:36 crc kubenswrapper[4823]: I1216 09:00:36.496638 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-2d63-account-create-update-xv7fc"] Dec 16 09:00:36 crc kubenswrapper[4823]: I1216 09:00:36.562787 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-mg742" Dec 16 09:00:36 crc kubenswrapper[4823]: I1216 09:00:36.592285 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/372f1ad6-80bd-4662-bf84-5c4bcc44bf05-operator-scripts\") pod \"nova-cell0-da3b-account-create-update-lmt62\" (UID: \"372f1ad6-80bd-4662-bf84-5c4bcc44bf05\") " pod="openstack/nova-cell0-da3b-account-create-update-lmt62" Dec 16 09:00:36 crc kubenswrapper[4823]: I1216 09:00:36.592363 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tl5r7\" (UniqueName: \"kubernetes.io/projected/372f1ad6-80bd-4662-bf84-5c4bcc44bf05-kube-api-access-tl5r7\") pod \"nova-cell0-da3b-account-create-update-lmt62\" (UID: \"372f1ad6-80bd-4662-bf84-5c4bcc44bf05\") " pod="openstack/nova-cell0-da3b-account-create-update-lmt62" Dec 16 09:00:36 crc kubenswrapper[4823]: I1216 09:00:36.592404 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c28ab01-359f-4508-b5f3-a3847760477b-operator-scripts\") pod \"nova-cell1-2d63-account-create-update-xv7fc\" (UID: \"0c28ab01-359f-4508-b5f3-a3847760477b\") " pod="openstack/nova-cell1-2d63-account-create-update-xv7fc" Dec 16 09:00:36 crc kubenswrapper[4823]: I1216 09:00:36.592420 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8qsm\" (UniqueName: \"kubernetes.io/projected/0c28ab01-359f-4508-b5f3-a3847760477b-kube-api-access-m8qsm\") pod \"nova-cell1-2d63-account-create-update-xv7fc\" (UID: \"0c28ab01-359f-4508-b5f3-a3847760477b\") " pod="openstack/nova-cell1-2d63-account-create-update-xv7fc" Dec 16 09:00:36 crc kubenswrapper[4823]: I1216 09:00:36.593225 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/372f1ad6-80bd-4662-bf84-5c4bcc44bf05-operator-scripts\") pod \"nova-cell0-da3b-account-create-update-lmt62\" (UID: \"372f1ad6-80bd-4662-bf84-5c4bcc44bf05\") " pod="openstack/nova-cell0-da3b-account-create-update-lmt62"
Dec 16 09:00:36 crc kubenswrapper[4823]: I1216 09:00:36.608303 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tl5r7\" (UniqueName: \"kubernetes.io/projected/372f1ad6-80bd-4662-bf84-5c4bcc44bf05-kube-api-access-tl5r7\") pod \"nova-cell0-da3b-account-create-update-lmt62\" (UID: \"372f1ad6-80bd-4662-bf84-5c4bcc44bf05\") " pod="openstack/nova-cell0-da3b-account-create-update-lmt62"
Dec 16 09:00:36 crc kubenswrapper[4823]: I1216 09:00:36.685053 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-m8qt2"
Dec 16 09:00:36 crc kubenswrapper[4823]: I1216 09:00:36.694294 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c28ab01-359f-4508-b5f3-a3847760477b-operator-scripts\") pod \"nova-cell1-2d63-account-create-update-xv7fc\" (UID: \"0c28ab01-359f-4508-b5f3-a3847760477b\") " pod="openstack/nova-cell1-2d63-account-create-update-xv7fc"
Dec 16 09:00:36 crc kubenswrapper[4823]: I1216 09:00:36.694341 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m8qsm\" (UniqueName: \"kubernetes.io/projected/0c28ab01-359f-4508-b5f3-a3847760477b-kube-api-access-m8qsm\") pod \"nova-cell1-2d63-account-create-update-xv7fc\" (UID: \"0c28ab01-359f-4508-b5f3-a3847760477b\") " pod="openstack/nova-cell1-2d63-account-create-update-xv7fc"
Dec 16 09:00:36 crc kubenswrapper[4823]: I1216 09:00:36.695274 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c28ab01-359f-4508-b5f3-a3847760477b-operator-scripts\") pod \"nova-cell1-2d63-account-create-update-xv7fc\" (UID: \"0c28ab01-359f-4508-b5f3-a3847760477b\") " pod="openstack/nova-cell1-2d63-account-create-update-xv7fc"
Dec 16 09:00:36 crc kubenswrapper[4823]: I1216 09:00:36.715540 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8qsm\" (UniqueName: \"kubernetes.io/projected/0c28ab01-359f-4508-b5f3-a3847760477b-kube-api-access-m8qsm\") pod \"nova-cell1-2d63-account-create-update-xv7fc\" (UID: \"0c28ab01-359f-4508-b5f3-a3847760477b\") " pod="openstack/nova-cell1-2d63-account-create-update-xv7fc"
Dec 16 09:00:36 crc kubenswrapper[4823]: I1216 09:00:36.755114 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-da3b-account-create-update-lmt62"
Dec 16 09:00:36 crc kubenswrapper[4823]: I1216 09:00:36.826792 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-2d63-account-create-update-xv7fc"
Dec 16 09:00:36 crc kubenswrapper[4823]: I1216 09:00:36.849062 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-xzsbb"]
Dec 16 09:00:36 crc kubenswrapper[4823]: I1216 09:00:36.971249 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-f251-account-create-update-gmqx7"]
Dec 16 09:00:37 crc kubenswrapper[4823]: I1216 09:00:37.184288 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-mg742"]
Dec 16 09:00:37 crc kubenswrapper[4823]: I1216 09:00:37.344119 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-m8qt2"]
Dec 16 09:00:37 crc kubenswrapper[4823]: I1216 09:00:37.435501 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-da3b-account-create-update-lmt62"]
Dec 16 09:00:37 crc kubenswrapper[4823]: W1216 09:00:37.447076 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod372f1ad6_80bd_4662_bf84_5c4bcc44bf05.slice/crio-b02ffe6c893fe3dca1d02202f1052a5c4621f3ce25ddc4cb926eee0ecc680fc0 WatchSource:0}: Error finding container b02ffe6c893fe3dca1d02202f1052a5c4621f3ce25ddc4cb926eee0ecc680fc0: Status 404 returned error can't find the container with id b02ffe6c893fe3dca1d02202f1052a5c4621f3ce25ddc4cb926eee0ecc680fc0
Dec 16 09:00:37 crc kubenswrapper[4823]: I1216 09:00:37.570564 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-mg742" event={"ID":"532985ed-811b-4627-80e7-4278a094a3df","Type":"ContainerStarted","Data":"76c17465710373e755060ebffa570f5b02663669f8fc7f0238f16aa0ae8f7b64"}
Dec 16 09:00:37 crc kubenswrapper[4823]: I1216 09:00:37.573826 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-m8qt2" event={"ID":"8305f2ac-41bf-438a-a905-22723822520b","Type":"ContainerStarted","Data":"e12fbb2b3d2be5226e8359516edeb4f1b6aef4e94650375c02b2367963933278"}
Dec 16 09:00:37 crc kubenswrapper[4823]: I1216 09:00:37.577389 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-xzsbb" event={"ID":"4de9c6b0-50af-4139-b54c-eed47d4d804b","Type":"ContainerStarted","Data":"060a0b09006700be9528649380beeefb74fc9d2c640efce061eb39010f961500"}
Dec 16 09:00:37 crc kubenswrapper[4823]: I1216 09:00:37.577635 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-xzsbb" event={"ID":"4de9c6b0-50af-4139-b54c-eed47d4d804b","Type":"ContainerStarted","Data":"45e702a843fd46e98e878a05e5726d16aac7c5fea524000c92ed9f971b832c84"}
Dec 16 09:00:37 crc kubenswrapper[4823]: I1216 09:00:37.579989 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-f251-account-create-update-gmqx7" event={"ID":"43a0e7a5-3a19-4ec3-9e34-be161590ae54","Type":"ContainerStarted","Data":"08dfa15970f1413699d595c2abe355ad1397b3b689d9e7b7983f87b5510ff431"}
Dec 16 09:00:37 crc kubenswrapper[4823]: I1216 09:00:37.580077 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-f251-account-create-update-gmqx7" event={"ID":"43a0e7a5-3a19-4ec3-9e34-be161590ae54","Type":"ContainerStarted","Data":"5f3edb0b1d412788a213693a9bd8361a74b39a251a6889fc94f7962a2aeada8f"}
Dec 16 09:00:37 crc kubenswrapper[4823]: I1216 09:00:37.586585 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-da3b-account-create-update-lmt62" event={"ID":"372f1ad6-80bd-4662-bf84-5c4bcc44bf05","Type":"ContainerStarted","Data":"b02ffe6c893fe3dca1d02202f1052a5c4621f3ce25ddc4cb926eee0ecc680fc0"}
Dec 16 09:00:37 crc kubenswrapper[4823]: I1216 09:00:37.595459 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-2d63-account-create-update-xv7fc"]
Dec 16 09:00:37 crc kubenswrapper[4823]: I1216 09:00:37.608639 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-db-create-xzsbb" podStartSLOduration=2.608619202 podStartE2EDuration="2.608619202s" podCreationTimestamp="2025-12-16 09:00:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 09:00:37.599762005 +0000 UTC m=+7516.088328128" watchObservedRunningTime="2025-12-16 09:00:37.608619202 +0000 UTC m=+7516.097185325"
Dec 16 09:00:37 crc kubenswrapper[4823]: I1216 09:00:37.616253 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-f251-account-create-update-gmqx7" podStartSLOduration=1.61623122 podStartE2EDuration="1.61623122s" podCreationTimestamp="2025-12-16 09:00:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 09:00:37.613730142 +0000 UTC m=+7516.102296265" watchObservedRunningTime="2025-12-16 09:00:37.61623122 +0000 UTC m=+7516.104797343"
Dec 16 09:00:38 crc kubenswrapper[4823]: I1216 09:00:38.597674 4823 generic.go:334] "Generic (PLEG): container finished" podID="0c28ab01-359f-4508-b5f3-a3847760477b" containerID="a3bbf522e30ba3200c6502976e8dcf2d7d8705e3ff26e042131a180e49743735" exitCode=0
Dec 16 09:00:38 crc kubenswrapper[4823]: I1216 09:00:38.597747 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-2d63-account-create-update-xv7fc" event={"ID":"0c28ab01-359f-4508-b5f3-a3847760477b","Type":"ContainerDied","Data":"a3bbf522e30ba3200c6502976e8dcf2d7d8705e3ff26e042131a180e49743735"}
Dec 16 09:00:38 crc kubenswrapper[4823]: I1216 09:00:38.597926 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-2d63-account-create-update-xv7fc" event={"ID":"0c28ab01-359f-4508-b5f3-a3847760477b","Type":"ContainerStarted","Data":"8e0714bbe429a6f355dd77d1023f30d14bd5bd018dd424c404e86afec449d9b3"}
Dec 16 09:00:38 crc kubenswrapper[4823]: I1216 09:00:38.601487 4823 generic.go:334] "Generic (PLEG): container finished" podID="372f1ad6-80bd-4662-bf84-5c4bcc44bf05" containerID="655e9221caef9a480d953c8348a9e0a12e5b4531ae895118829d538e97651328" exitCode=0
Dec 16 09:00:38 crc kubenswrapper[4823]: I1216 09:00:38.601636 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-da3b-account-create-update-lmt62" event={"ID":"372f1ad6-80bd-4662-bf84-5c4bcc44bf05","Type":"ContainerDied","Data":"655e9221caef9a480d953c8348a9e0a12e5b4531ae895118829d538e97651328"}
Dec 16 09:00:38 crc kubenswrapper[4823]: I1216 09:00:38.604799 4823 generic.go:334] "Generic (PLEG): container finished" podID="532985ed-811b-4627-80e7-4278a094a3df" containerID="8274757ab3baaadb78d1a08566ebe5b72a16a191a1a65c3e49305283e97c88d9" exitCode=0
Dec 16 09:00:38 crc kubenswrapper[4823]: I1216 09:00:38.605097 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-mg742" event={"ID":"532985ed-811b-4627-80e7-4278a094a3df","Type":"ContainerDied","Data":"8274757ab3baaadb78d1a08566ebe5b72a16a191a1a65c3e49305283e97c88d9"}
Dec 16 09:00:38 crc kubenswrapper[4823]: I1216 09:00:38.606811 4823 generic.go:334] "Generic (PLEG): container finished" podID="8305f2ac-41bf-438a-a905-22723822520b" containerID="b5a273eb68382c7731e4aadb66e3a17bbaf04776d1f2ead198cb3317ce05d386" exitCode=0
Dec 16 09:00:38 crc kubenswrapper[4823]: I1216 09:00:38.606880 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-m8qt2" event={"ID":"8305f2ac-41bf-438a-a905-22723822520b","Type":"ContainerDied","Data":"b5a273eb68382c7731e4aadb66e3a17bbaf04776d1f2ead198cb3317ce05d386"}
Dec 16 09:00:38 crc kubenswrapper[4823]: I1216 09:00:38.613560 4823 generic.go:334] "Generic (PLEG): container finished" podID="4de9c6b0-50af-4139-b54c-eed47d4d804b" containerID="060a0b09006700be9528649380beeefb74fc9d2c640efce061eb39010f961500" exitCode=0
Dec 16 09:00:38 crc kubenswrapper[4823]: I1216 09:00:38.613599 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-xzsbb" event={"ID":"4de9c6b0-50af-4139-b54c-eed47d4d804b","Type":"ContainerDied","Data":"060a0b09006700be9528649380beeefb74fc9d2c640efce061eb39010f961500"}
Dec 16 09:00:38 crc kubenswrapper[4823]: I1216 09:00:38.621513 4823 generic.go:334] "Generic (PLEG): container finished" podID="43a0e7a5-3a19-4ec3-9e34-be161590ae54" containerID="08dfa15970f1413699d595c2abe355ad1397b3b689d9e7b7983f87b5510ff431" exitCode=0
Dec 16 09:00:38 crc kubenswrapper[4823]: I1216 09:00:38.621560 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-f251-account-create-update-gmqx7" event={"ID":"43a0e7a5-3a19-4ec3-9e34-be161590ae54","Type":"ContainerDied","Data":"08dfa15970f1413699d595c2abe355ad1397b3b689d9e7b7983f87b5510ff431"}
Dec 16 09:00:40 crc kubenswrapper[4823]: I1216 09:00:40.293717 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-da3b-account-create-update-lmt62"
Dec 16 09:00:40 crc kubenswrapper[4823]: I1216 09:00:40.383612 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tl5r7\" (UniqueName: \"kubernetes.io/projected/372f1ad6-80bd-4662-bf84-5c4bcc44bf05-kube-api-access-tl5r7\") pod \"372f1ad6-80bd-4662-bf84-5c4bcc44bf05\" (UID: \"372f1ad6-80bd-4662-bf84-5c4bcc44bf05\") "
Dec 16 09:00:40 crc kubenswrapper[4823]: I1216 09:00:40.383741 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/372f1ad6-80bd-4662-bf84-5c4bcc44bf05-operator-scripts\") pod \"372f1ad6-80bd-4662-bf84-5c4bcc44bf05\" (UID: \"372f1ad6-80bd-4662-bf84-5c4bcc44bf05\") "
Dec 16 09:00:40 crc kubenswrapper[4823]: I1216 09:00:40.384721 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/372f1ad6-80bd-4662-bf84-5c4bcc44bf05-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "372f1ad6-80bd-4662-bf84-5c4bcc44bf05" (UID: "372f1ad6-80bd-4662-bf84-5c4bcc44bf05"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 16 09:00:40 crc kubenswrapper[4823]: I1216 09:00:40.391220 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/372f1ad6-80bd-4662-bf84-5c4bcc44bf05-kube-api-access-tl5r7" (OuterVolumeSpecName: "kube-api-access-tl5r7") pod "372f1ad6-80bd-4662-bf84-5c4bcc44bf05" (UID: "372f1ad6-80bd-4662-bf84-5c4bcc44bf05"). InnerVolumeSpecName "kube-api-access-tl5r7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 09:00:40 crc kubenswrapper[4823]: I1216 09:00:40.487265 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tl5r7\" (UniqueName: \"kubernetes.io/projected/372f1ad6-80bd-4662-bf84-5c4bcc44bf05-kube-api-access-tl5r7\") on node \"crc\" DevicePath \"\""
Dec 16 09:00:40 crc kubenswrapper[4823]: I1216 09:00:40.487318 4823 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/372f1ad6-80bd-4662-bf84-5c4bcc44bf05-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 16 09:00:40 crc kubenswrapper[4823]: I1216 09:00:40.502330 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-m8qt2"
Dec 16 09:00:40 crc kubenswrapper[4823]: I1216 09:00:40.509704 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-xzsbb"
Dec 16 09:00:40 crc kubenswrapper[4823]: I1216 09:00:40.515602 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-f251-account-create-update-gmqx7"
Dec 16 09:00:40 crc kubenswrapper[4823]: I1216 09:00:40.523587 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-2d63-account-create-update-xv7fc"
Dec 16 09:00:40 crc kubenswrapper[4823]: I1216 09:00:40.532481 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-mg742"
Dec 16 09:00:40 crc kubenswrapper[4823]: I1216 09:00:40.588818 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m8qsm\" (UniqueName: \"kubernetes.io/projected/0c28ab01-359f-4508-b5f3-a3847760477b-kube-api-access-m8qsm\") pod \"0c28ab01-359f-4508-b5f3-a3847760477b\" (UID: \"0c28ab01-359f-4508-b5f3-a3847760477b\") "
Dec 16 09:00:40 crc kubenswrapper[4823]: I1216 09:00:40.588871 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8305f2ac-41bf-438a-a905-22723822520b-operator-scripts\") pod \"8305f2ac-41bf-438a-a905-22723822520b\" (UID: \"8305f2ac-41bf-438a-a905-22723822520b\") "
Dec 16 09:00:40 crc kubenswrapper[4823]: I1216 09:00:40.588941 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/43a0e7a5-3a19-4ec3-9e34-be161590ae54-operator-scripts\") pod \"43a0e7a5-3a19-4ec3-9e34-be161590ae54\" (UID: \"43a0e7a5-3a19-4ec3-9e34-be161590ae54\") "
Dec 16 09:00:40 crc kubenswrapper[4823]: I1216 09:00:40.588981 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-69ttx\" (UniqueName: \"kubernetes.io/projected/532985ed-811b-4627-80e7-4278a094a3df-kube-api-access-69ttx\") pod \"532985ed-811b-4627-80e7-4278a094a3df\" (UID: \"532985ed-811b-4627-80e7-4278a094a3df\") "
Dec 16 09:00:40 crc kubenswrapper[4823]: I1216 09:00:40.589084 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c28ab01-359f-4508-b5f3-a3847760477b-operator-scripts\") pod \"0c28ab01-359f-4508-b5f3-a3847760477b\" (UID: \"0c28ab01-359f-4508-b5f3-a3847760477b\") "
Dec 16 09:00:40 crc kubenswrapper[4823]: I1216 09:00:40.589186 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xfhq9\" (UniqueName: \"kubernetes.io/projected/8305f2ac-41bf-438a-a905-22723822520b-kube-api-access-xfhq9\") pod \"8305f2ac-41bf-438a-a905-22723822520b\" (UID: \"8305f2ac-41bf-438a-a905-22723822520b\") "
Dec 16 09:00:40 crc kubenswrapper[4823]: I1216 09:00:40.589208 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4de9c6b0-50af-4139-b54c-eed47d4d804b-operator-scripts\") pod \"4de9c6b0-50af-4139-b54c-eed47d4d804b\" (UID: \"4de9c6b0-50af-4139-b54c-eed47d4d804b\") "
Dec 16 09:00:40 crc kubenswrapper[4823]: I1216 09:00:40.589237 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9pddl\" (UniqueName: \"kubernetes.io/projected/43a0e7a5-3a19-4ec3-9e34-be161590ae54-kube-api-access-9pddl\") pod \"43a0e7a5-3a19-4ec3-9e34-be161590ae54\" (UID: \"43a0e7a5-3a19-4ec3-9e34-be161590ae54\") "
Dec 16 09:00:40 crc kubenswrapper[4823]: I1216 09:00:40.589265 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/532985ed-811b-4627-80e7-4278a094a3df-operator-scripts\") pod \"532985ed-811b-4627-80e7-4278a094a3df\" (UID: \"532985ed-811b-4627-80e7-4278a094a3df\") "
Dec 16 09:00:40 crc kubenswrapper[4823]: I1216 09:00:40.589301 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4k6ct\" (UniqueName: \"kubernetes.io/projected/4de9c6b0-50af-4139-b54c-eed47d4d804b-kube-api-access-4k6ct\") pod \"4de9c6b0-50af-4139-b54c-eed47d4d804b\" (UID: \"4de9c6b0-50af-4139-b54c-eed47d4d804b\") "
Dec 16 09:00:40 crc kubenswrapper[4823]: I1216 09:00:40.590769 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4de9c6b0-50af-4139-b54c-eed47d4d804b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4de9c6b0-50af-4139-b54c-eed47d4d804b" (UID: "4de9c6b0-50af-4139-b54c-eed47d4d804b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 16 09:00:40 crc kubenswrapper[4823]: I1216 09:00:40.591058 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0c28ab01-359f-4508-b5f3-a3847760477b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0c28ab01-359f-4508-b5f3-a3847760477b" (UID: "0c28ab01-359f-4508-b5f3-a3847760477b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 16 09:00:40 crc kubenswrapper[4823]: I1216 09:00:40.591494 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8305f2ac-41bf-438a-a905-22723822520b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8305f2ac-41bf-438a-a905-22723822520b" (UID: "8305f2ac-41bf-438a-a905-22723822520b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 16 09:00:40 crc kubenswrapper[4823]: I1216 09:00:40.591552 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/532985ed-811b-4627-80e7-4278a094a3df-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "532985ed-811b-4627-80e7-4278a094a3df" (UID: "532985ed-811b-4627-80e7-4278a094a3df"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 16 09:00:40 crc kubenswrapper[4823]: I1216 09:00:40.591675 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43a0e7a5-3a19-4ec3-9e34-be161590ae54-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "43a0e7a5-3a19-4ec3-9e34-be161590ae54" (UID: "43a0e7a5-3a19-4ec3-9e34-be161590ae54"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 16 09:00:40 crc kubenswrapper[4823]: I1216 09:00:40.592844 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c28ab01-359f-4508-b5f3-a3847760477b-kube-api-access-m8qsm" (OuterVolumeSpecName: "kube-api-access-m8qsm") pod "0c28ab01-359f-4508-b5f3-a3847760477b" (UID: "0c28ab01-359f-4508-b5f3-a3847760477b"). InnerVolumeSpecName "kube-api-access-m8qsm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 09:00:40 crc kubenswrapper[4823]: I1216 09:00:40.594186 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/532985ed-811b-4627-80e7-4278a094a3df-kube-api-access-69ttx" (OuterVolumeSpecName: "kube-api-access-69ttx") pod "532985ed-811b-4627-80e7-4278a094a3df" (UID: "532985ed-811b-4627-80e7-4278a094a3df"). InnerVolumeSpecName "kube-api-access-69ttx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 09:00:40 crc kubenswrapper[4823]: I1216 09:00:40.596335 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4de9c6b0-50af-4139-b54c-eed47d4d804b-kube-api-access-4k6ct" (OuterVolumeSpecName: "kube-api-access-4k6ct") pod "4de9c6b0-50af-4139-b54c-eed47d4d804b" (UID: "4de9c6b0-50af-4139-b54c-eed47d4d804b"). InnerVolumeSpecName "kube-api-access-4k6ct". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 09:00:40 crc kubenswrapper[4823]: I1216 09:00:40.605134 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8305f2ac-41bf-438a-a905-22723822520b-kube-api-access-xfhq9" (OuterVolumeSpecName: "kube-api-access-xfhq9") pod "8305f2ac-41bf-438a-a905-22723822520b" (UID: "8305f2ac-41bf-438a-a905-22723822520b"). InnerVolumeSpecName "kube-api-access-xfhq9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 09:00:40 crc kubenswrapper[4823]: I1216 09:00:40.605463 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43a0e7a5-3a19-4ec3-9e34-be161590ae54-kube-api-access-9pddl" (OuterVolumeSpecName: "kube-api-access-9pddl") pod "43a0e7a5-3a19-4ec3-9e34-be161590ae54" (UID: "43a0e7a5-3a19-4ec3-9e34-be161590ae54"). InnerVolumeSpecName "kube-api-access-9pddl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 09:00:40 crc kubenswrapper[4823]: I1216 09:00:40.652851 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-xzsbb" event={"ID":"4de9c6b0-50af-4139-b54c-eed47d4d804b","Type":"ContainerDied","Data":"45e702a843fd46e98e878a05e5726d16aac7c5fea524000c92ed9f971b832c84"}
Dec 16 09:00:40 crc kubenswrapper[4823]: I1216 09:00:40.653288 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="45e702a843fd46e98e878a05e5726d16aac7c5fea524000c92ed9f971b832c84"
Dec 16 09:00:40 crc kubenswrapper[4823]: I1216 09:00:40.653373 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-xzsbb"
Dec 16 09:00:40 crc kubenswrapper[4823]: I1216 09:00:40.664319 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-f251-account-create-update-gmqx7"
Dec 16 09:00:40 crc kubenswrapper[4823]: I1216 09:00:40.664327 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-f251-account-create-update-gmqx7" event={"ID":"43a0e7a5-3a19-4ec3-9e34-be161590ae54","Type":"ContainerDied","Data":"5f3edb0b1d412788a213693a9bd8361a74b39a251a6889fc94f7962a2aeada8f"}
Dec 16 09:00:40 crc kubenswrapper[4823]: I1216 09:00:40.664825 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5f3edb0b1d412788a213693a9bd8361a74b39a251a6889fc94f7962a2aeada8f"
Dec 16 09:00:40 crc kubenswrapper[4823]: I1216 09:00:40.669144 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-2d63-account-create-update-xv7fc" event={"ID":"0c28ab01-359f-4508-b5f3-a3847760477b","Type":"ContainerDied","Data":"8e0714bbe429a6f355dd77d1023f30d14bd5bd018dd424c404e86afec449d9b3"}
Dec 16 09:00:40 crc kubenswrapper[4823]: I1216 09:00:40.669188 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8e0714bbe429a6f355dd77d1023f30d14bd5bd018dd424c404e86afec449d9b3"
Dec 16 09:00:40 crc kubenswrapper[4823]: I1216 09:00:40.669182 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-2d63-account-create-update-xv7fc"
Dec 16 09:00:40 crc kubenswrapper[4823]: I1216 09:00:40.673599 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-da3b-account-create-update-lmt62" event={"ID":"372f1ad6-80bd-4662-bf84-5c4bcc44bf05","Type":"ContainerDied","Data":"b02ffe6c893fe3dca1d02202f1052a5c4621f3ce25ddc4cb926eee0ecc680fc0"}
Dec 16 09:00:40 crc kubenswrapper[4823]: I1216 09:00:40.673636 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b02ffe6c893fe3dca1d02202f1052a5c4621f3ce25ddc4cb926eee0ecc680fc0"
Dec 16 09:00:40 crc kubenswrapper[4823]: I1216 09:00:40.673680 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-da3b-account-create-update-lmt62"
Dec 16 09:00:40 crc kubenswrapper[4823]: I1216 09:00:40.688059 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-mg742" event={"ID":"532985ed-811b-4627-80e7-4278a094a3df","Type":"ContainerDied","Data":"76c17465710373e755060ebffa570f5b02663669f8fc7f0238f16aa0ae8f7b64"}
Dec 16 09:00:40 crc kubenswrapper[4823]: I1216 09:00:40.688253 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="76c17465710373e755060ebffa570f5b02663669f8fc7f0238f16aa0ae8f7b64"
Dec 16 09:00:40 crc kubenswrapper[4823]: I1216 09:00:40.688460 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-mg742"
Dec 16 09:00:40 crc kubenswrapper[4823]: I1216 09:00:40.692988 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xfhq9\" (UniqueName: \"kubernetes.io/projected/8305f2ac-41bf-438a-a905-22723822520b-kube-api-access-xfhq9\") on node \"crc\" DevicePath \"\""
Dec 16 09:00:40 crc kubenswrapper[4823]: I1216 09:00:40.694697 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-m8qt2" event={"ID":"8305f2ac-41bf-438a-a905-22723822520b","Type":"ContainerDied","Data":"e12fbb2b3d2be5226e8359516edeb4f1b6aef4e94650375c02b2367963933278"}
Dec 16 09:00:40 crc kubenswrapper[4823]: I1216 09:00:40.694739 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-m8qt2"
Dec 16 09:00:40 crc kubenswrapper[4823]: I1216 09:00:40.694744 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e12fbb2b3d2be5226e8359516edeb4f1b6aef4e94650375c02b2367963933278"
Dec 16 09:00:40 crc kubenswrapper[4823]: I1216 09:00:40.695233 4823 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4de9c6b0-50af-4139-b54c-eed47d4d804b-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 16 09:00:40 crc kubenswrapper[4823]: I1216 09:00:40.695473 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9pddl\" (UniqueName: \"kubernetes.io/projected/43a0e7a5-3a19-4ec3-9e34-be161590ae54-kube-api-access-9pddl\") on node \"crc\" DevicePath \"\""
Dec 16 09:00:40 crc kubenswrapper[4823]: I1216 09:00:40.695509 4823 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/532985ed-811b-4627-80e7-4278a094a3df-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 16 09:00:40 crc kubenswrapper[4823]: I1216 09:00:40.695522 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4k6ct\" (UniqueName: \"kubernetes.io/projected/4de9c6b0-50af-4139-b54c-eed47d4d804b-kube-api-access-4k6ct\") on node \"crc\" DevicePath \"\""
Dec 16 09:00:40 crc kubenswrapper[4823]: I1216 09:00:40.695537 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m8qsm\" (UniqueName: \"kubernetes.io/projected/0c28ab01-359f-4508-b5f3-a3847760477b-kube-api-access-m8qsm\") on node \"crc\" DevicePath \"\""
Dec 16 09:00:40 crc kubenswrapper[4823]: I1216 09:00:40.695550 4823 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8305f2ac-41bf-438a-a905-22723822520b-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 16 09:00:40 crc kubenswrapper[4823]: I1216 09:00:40.695565 4823 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/43a0e7a5-3a19-4ec3-9e34-be161590ae54-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 16 09:00:40 crc kubenswrapper[4823]: I1216 09:00:40.695578 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-69ttx\" (UniqueName: \"kubernetes.io/projected/532985ed-811b-4627-80e7-4278a094a3df-kube-api-access-69ttx\") on node \"crc\" DevicePath \"\""
Dec 16 09:00:40 crc kubenswrapper[4823]: I1216 09:00:40.695590 4823 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c28ab01-359f-4508-b5f3-a3847760477b-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 16 09:00:46 crc kubenswrapper[4823]: I1216 09:00:46.144501 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-hfdqc"]
Dec 16 09:00:46 crc kubenswrapper[4823]: E1216 09:00:46.145415 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4de9c6b0-50af-4139-b54c-eed47d4d804b" containerName="mariadb-database-create"
Dec 16 09:00:46 crc kubenswrapper[4823]: I1216 09:00:46.145433 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="4de9c6b0-50af-4139-b54c-eed47d4d804b" containerName="mariadb-database-create"
Dec 16 09:00:46 crc kubenswrapper[4823]: E1216 09:00:46.145474 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8305f2ac-41bf-438a-a905-22723822520b" containerName="mariadb-database-create"
Dec 16 09:00:46 crc kubenswrapper[4823]: I1216 09:00:46.145483 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="8305f2ac-41bf-438a-a905-22723822520b" containerName="mariadb-database-create"
Dec 16 09:00:46 crc kubenswrapper[4823]: E1216 09:00:46.145502 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="372f1ad6-80bd-4662-bf84-5c4bcc44bf05" containerName="mariadb-account-create-update"
Dec 16 09:00:46 crc kubenswrapper[4823]: I1216 09:00:46.145511 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="372f1ad6-80bd-4662-bf84-5c4bcc44bf05" containerName="mariadb-account-create-update"
Dec 16 09:00:46 crc kubenswrapper[4823]: E1216 09:00:46.145521 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43a0e7a5-3a19-4ec3-9e34-be161590ae54" containerName="mariadb-account-create-update"
Dec 16 09:00:46 crc kubenswrapper[4823]: I1216 09:00:46.145529 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="43a0e7a5-3a19-4ec3-9e34-be161590ae54" containerName="mariadb-account-create-update"
Dec 16 09:00:46 crc kubenswrapper[4823]: E1216 09:00:46.145538 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c28ab01-359f-4508-b5f3-a3847760477b" containerName="mariadb-account-create-update"
Dec 16 09:00:46 crc kubenswrapper[4823]: I1216 09:00:46.145546 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c28ab01-359f-4508-b5f3-a3847760477b" containerName="mariadb-account-create-update"
Dec 16 09:00:46 crc kubenswrapper[4823]: E1216 09:00:46.145561 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="532985ed-811b-4627-80e7-4278a094a3df" containerName="mariadb-database-create"
Dec 16 09:00:46 crc kubenswrapper[4823]: I1216 09:00:46.145568 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="532985ed-811b-4627-80e7-4278a094a3df" containerName="mariadb-database-create"
Dec 16 09:00:46 crc kubenswrapper[4823]: I1216 09:00:46.145768 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c28ab01-359f-4508-b5f3-a3847760477b" containerName="mariadb-account-create-update"
Dec 16 09:00:46 crc kubenswrapper[4823]: I1216 09:00:46.145784 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="532985ed-811b-4627-80e7-4278a094a3df" containerName="mariadb-database-create"
Dec 16 09:00:46 crc kubenswrapper[4823]: I1216 09:00:46.145803 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="4de9c6b0-50af-4139-b54c-eed47d4d804b" containerName="mariadb-database-create"
Dec 16 09:00:46 crc kubenswrapper[4823]: I1216 09:00:46.145814 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="8305f2ac-41bf-438a-a905-22723822520b" containerName="mariadb-database-create"
Dec 16 09:00:46 crc kubenswrapper[4823]: I1216 09:00:46.145828 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="372f1ad6-80bd-4662-bf84-5c4bcc44bf05" containerName="mariadb-account-create-update"
Dec 16 09:00:46 crc kubenswrapper[4823]: I1216 09:00:46.145844 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="43a0e7a5-3a19-4ec3-9e34-be161590ae54" containerName="mariadb-account-create-update"
Dec 16 09:00:46 crc kubenswrapper[4823]: I1216 09:00:46.146641 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-hfdqc"
Dec 16 09:00:46 crc kubenswrapper[4823]: I1216 09:00:46.149460 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-vfr64"
Dec 16 09:00:46 crc kubenswrapper[4823]: I1216 09:00:46.149498 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts"
Dec 16 09:00:46 crc kubenswrapper[4823]: I1216 09:00:46.149825 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Dec 16 09:00:46 crc kubenswrapper[4823]: I1216 09:00:46.166287 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-hfdqc"]
Dec 16 09:00:46 crc kubenswrapper[4823]: I1216 09:00:46.319891 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2c20fd5-7afe-45c9-aced-a02954774c3a-config-data\") pod \"nova-cell0-conductor-db-sync-hfdqc\" (UID: \"b2c20fd5-7afe-45c9-aced-a02954774c3a\") " pod="openstack/nova-cell0-conductor-db-sync-hfdqc"
Dec 16 09:00:46 crc kubenswrapper[4823]: I1216 09:00:46.319972 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgrdt\" (UniqueName: \"kubernetes.io/projected/b2c20fd5-7afe-45c9-aced-a02954774c3a-kube-api-access-tgrdt\") pod \"nova-cell0-conductor-db-sync-hfdqc\" (UID: \"b2c20fd5-7afe-45c9-aced-a02954774c3a\") " pod="openstack/nova-cell0-conductor-db-sync-hfdqc"
Dec 16 09:00:46 crc kubenswrapper[4823]: I1216 09:00:46.320002 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2c20fd5-7afe-45c9-aced-a02954774c3a-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-hfdqc\" (UID: \"b2c20fd5-7afe-45c9-aced-a02954774c3a\") " pod="openstack/nova-cell0-conductor-db-sync-hfdqc"
Dec 16 09:00:46 crc kubenswrapper[4823]: I1216 09:00:46.320435 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b2c20fd5-7afe-45c9-aced-a02954774c3a-scripts\") pod \"nova-cell0-conductor-db-sync-hfdqc\" (UID: \"b2c20fd5-7afe-45c9-aced-a02954774c3a\") " pod="openstack/nova-cell0-conductor-db-sync-hfdqc"
Dec 16 09:00:46 crc kubenswrapper[4823]: I1216 09:00:46.421659 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2c20fd5-7afe-45c9-aced-a02954774c3a-config-data\") pod \"nova-cell0-conductor-db-sync-hfdqc\" (UID: \"b2c20fd5-7afe-45c9-aced-a02954774c3a\") " pod="openstack/nova-cell0-conductor-db-sync-hfdqc"
Dec 16 09:00:46 crc kubenswrapper[4823]: I1216 09:00:46.421747 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgrdt\" (UniqueName: \"kubernetes.io/projected/b2c20fd5-7afe-45c9-aced-a02954774c3a-kube-api-access-tgrdt\") pod \"nova-cell0-conductor-db-sync-hfdqc\" (UID: \"b2c20fd5-7afe-45c9-aced-a02954774c3a\") " pod="openstack/nova-cell0-conductor-db-sync-hfdqc"
Dec 16 09:00:46 crc kubenswrapper[4823]: I1216 09:00:46.421781 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2c20fd5-7afe-45c9-aced-a02954774c3a-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-hfdqc\" (UID: \"b2c20fd5-7afe-45c9-aced-a02954774c3a\") " pod="openstack/nova-cell0-conductor-db-sync-hfdqc"
Dec 16 09:00:46 crc kubenswrapper[4823]: I1216 09:00:46.421866 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b2c20fd5-7afe-45c9-aced-a02954774c3a-scripts\") pod \"nova-cell0-conductor-db-sync-hfdqc\" (UID: \"b2c20fd5-7afe-45c9-aced-a02954774c3a\") "
pod="openstack/nova-cell0-conductor-db-sync-hfdqc" Dec 16 09:00:46 crc kubenswrapper[4823]: I1216 09:00:46.427376 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2c20fd5-7afe-45c9-aced-a02954774c3a-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-hfdqc\" (UID: \"b2c20fd5-7afe-45c9-aced-a02954774c3a\") " pod="openstack/nova-cell0-conductor-db-sync-hfdqc" Dec 16 09:00:46 crc kubenswrapper[4823]: I1216 09:00:46.429328 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2c20fd5-7afe-45c9-aced-a02954774c3a-config-data\") pod \"nova-cell0-conductor-db-sync-hfdqc\" (UID: \"b2c20fd5-7afe-45c9-aced-a02954774c3a\") " pod="openstack/nova-cell0-conductor-db-sync-hfdqc" Dec 16 09:00:46 crc kubenswrapper[4823]: I1216 09:00:46.429902 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b2c20fd5-7afe-45c9-aced-a02954774c3a-scripts\") pod \"nova-cell0-conductor-db-sync-hfdqc\" (UID: \"b2c20fd5-7afe-45c9-aced-a02954774c3a\") " pod="openstack/nova-cell0-conductor-db-sync-hfdqc" Dec 16 09:00:46 crc kubenswrapper[4823]: I1216 09:00:46.440709 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgrdt\" (UniqueName: \"kubernetes.io/projected/b2c20fd5-7afe-45c9-aced-a02954774c3a-kube-api-access-tgrdt\") pod \"nova-cell0-conductor-db-sync-hfdqc\" (UID: \"b2c20fd5-7afe-45c9-aced-a02954774c3a\") " pod="openstack/nova-cell0-conductor-db-sync-hfdqc" Dec 16 09:00:46 crc kubenswrapper[4823]: I1216 09:00:46.468527 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-hfdqc" Dec 16 09:00:46 crc kubenswrapper[4823]: I1216 09:00:46.956214 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-hfdqc"] Dec 16 09:00:47 crc kubenswrapper[4823]: I1216 09:00:47.795273 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-hfdqc" event={"ID":"b2c20fd5-7afe-45c9-aced-a02954774c3a","Type":"ContainerStarted","Data":"815e64e1724a964f6a6b960df08c0d1497ded3d7e2e4626e6e8522ff4a2bfd78"} Dec 16 09:00:49 crc kubenswrapper[4823]: I1216 09:00:49.801977 4823 scope.go:117] "RemoveContainer" containerID="888b38634465b11546f4c380d25cb294453abb12e8470b429e8c3ade6a42b4cf" Dec 16 09:00:49 crc kubenswrapper[4823]: I1216 09:00:49.831927 4823 scope.go:117] "RemoveContainer" containerID="279887000a49b31ce5520a85a264f1dff04c4dbdf1c1ddc32ea6391952cfbe69" Dec 16 09:00:49 crc kubenswrapper[4823]: I1216 09:00:49.886681 4823 scope.go:117] "RemoveContainer" containerID="65d5616a1a18f1328b5785f9a717e1f7ae833ab7c8e50472712c115dba352311" Dec 16 09:00:57 crc kubenswrapper[4823]: I1216 09:00:57.886670 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-hfdqc" event={"ID":"b2c20fd5-7afe-45c9-aced-a02954774c3a","Type":"ContainerStarted","Data":"3a90c47a8cf666d0191a548e0fbb974d94f6605f8a15aeb046469781c6299ff1"} Dec 16 09:00:57 crc kubenswrapper[4823]: I1216 09:00:57.913423 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-hfdqc" podStartSLOduration=1.623465796 podStartE2EDuration="11.913403562s" podCreationTimestamp="2025-12-16 09:00:46 +0000 UTC" firstStartedPulling="2025-12-16 09:00:46.980182418 +0000 UTC m=+7525.468748541" lastFinishedPulling="2025-12-16 09:00:57.270120184 +0000 UTC m=+7535.758686307" observedRunningTime="2025-12-16 09:00:57.903615525 +0000 UTC m=+7536.392181648" 
watchObservedRunningTime="2025-12-16 09:00:57.913403562 +0000 UTC m=+7536.401969685" Dec 16 09:00:58 crc kubenswrapper[4823]: I1216 09:00:58.133979 4823 patch_prober.go:28] interesting pod/machine-config-daemon-fv56f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 09:00:58 crc kubenswrapper[4823]: I1216 09:00:58.134349 4823 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 09:01:00 crc kubenswrapper[4823]: I1216 09:01:00.182046 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29431261-swvz9"] Dec 16 09:01:00 crc kubenswrapper[4823]: I1216 09:01:00.183791 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29431261-swvz9" Dec 16 09:01:00 crc kubenswrapper[4823]: I1216 09:01:00.204068 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29431261-swvz9"] Dec 16 09:01:00 crc kubenswrapper[4823]: I1216 09:01:00.345699 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dd89a4f-2cd4-4458-bb77-a56113b28c38-combined-ca-bundle\") pod \"keystone-cron-29431261-swvz9\" (UID: \"0dd89a4f-2cd4-4458-bb77-a56113b28c38\") " pod="openstack/keystone-cron-29431261-swvz9" Dec 16 09:01:00 crc kubenswrapper[4823]: I1216 09:01:00.346230 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rd9qh\" (UniqueName: \"kubernetes.io/projected/0dd89a4f-2cd4-4458-bb77-a56113b28c38-kube-api-access-rd9qh\") pod \"keystone-cron-29431261-swvz9\" (UID: \"0dd89a4f-2cd4-4458-bb77-a56113b28c38\") " pod="openstack/keystone-cron-29431261-swvz9" Dec 16 09:01:00 crc kubenswrapper[4823]: I1216 09:01:00.346352 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0dd89a4f-2cd4-4458-bb77-a56113b28c38-config-data\") pod \"keystone-cron-29431261-swvz9\" (UID: \"0dd89a4f-2cd4-4458-bb77-a56113b28c38\") " pod="openstack/keystone-cron-29431261-swvz9" Dec 16 09:01:00 crc kubenswrapper[4823]: I1216 09:01:00.346396 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0dd89a4f-2cd4-4458-bb77-a56113b28c38-fernet-keys\") pod \"keystone-cron-29431261-swvz9\" (UID: \"0dd89a4f-2cd4-4458-bb77-a56113b28c38\") " pod="openstack/keystone-cron-29431261-swvz9" Dec 16 09:01:00 crc kubenswrapper[4823]: I1216 09:01:00.449108 4823 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dd89a4f-2cd4-4458-bb77-a56113b28c38-combined-ca-bundle\") pod \"keystone-cron-29431261-swvz9\" (UID: \"0dd89a4f-2cd4-4458-bb77-a56113b28c38\") " pod="openstack/keystone-cron-29431261-swvz9" Dec 16 09:01:00 crc kubenswrapper[4823]: I1216 09:01:00.449161 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rd9qh\" (UniqueName: \"kubernetes.io/projected/0dd89a4f-2cd4-4458-bb77-a56113b28c38-kube-api-access-rd9qh\") pod \"keystone-cron-29431261-swvz9\" (UID: \"0dd89a4f-2cd4-4458-bb77-a56113b28c38\") " pod="openstack/keystone-cron-29431261-swvz9" Dec 16 09:01:00 crc kubenswrapper[4823]: I1216 09:01:00.449199 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0dd89a4f-2cd4-4458-bb77-a56113b28c38-config-data\") pod \"keystone-cron-29431261-swvz9\" (UID: \"0dd89a4f-2cd4-4458-bb77-a56113b28c38\") " pod="openstack/keystone-cron-29431261-swvz9" Dec 16 09:01:00 crc kubenswrapper[4823]: I1216 09:01:00.449220 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0dd89a4f-2cd4-4458-bb77-a56113b28c38-fernet-keys\") pod \"keystone-cron-29431261-swvz9\" (UID: \"0dd89a4f-2cd4-4458-bb77-a56113b28c38\") " pod="openstack/keystone-cron-29431261-swvz9" Dec 16 09:01:00 crc kubenswrapper[4823]: I1216 09:01:00.455968 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0dd89a4f-2cd4-4458-bb77-a56113b28c38-config-data\") pod \"keystone-cron-29431261-swvz9\" (UID: \"0dd89a4f-2cd4-4458-bb77-a56113b28c38\") " pod="openstack/keystone-cron-29431261-swvz9" Dec 16 09:01:00 crc kubenswrapper[4823]: I1216 09:01:00.457302 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/0dd89a4f-2cd4-4458-bb77-a56113b28c38-fernet-keys\") pod \"keystone-cron-29431261-swvz9\" (UID: \"0dd89a4f-2cd4-4458-bb77-a56113b28c38\") " pod="openstack/keystone-cron-29431261-swvz9" Dec 16 09:01:00 crc kubenswrapper[4823]: I1216 09:01:00.458211 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dd89a4f-2cd4-4458-bb77-a56113b28c38-combined-ca-bundle\") pod \"keystone-cron-29431261-swvz9\" (UID: \"0dd89a4f-2cd4-4458-bb77-a56113b28c38\") " pod="openstack/keystone-cron-29431261-swvz9" Dec 16 09:01:00 crc kubenswrapper[4823]: I1216 09:01:00.468934 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rd9qh\" (UniqueName: \"kubernetes.io/projected/0dd89a4f-2cd4-4458-bb77-a56113b28c38-kube-api-access-rd9qh\") pod \"keystone-cron-29431261-swvz9\" (UID: \"0dd89a4f-2cd4-4458-bb77-a56113b28c38\") " pod="openstack/keystone-cron-29431261-swvz9" Dec 16 09:01:00 crc kubenswrapper[4823]: I1216 09:01:00.514435 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29431261-swvz9" Dec 16 09:01:00 crc kubenswrapper[4823]: I1216 09:01:00.985946 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29431261-swvz9"] Dec 16 09:01:01 crc kubenswrapper[4823]: I1216 09:01:01.933434 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29431261-swvz9" event={"ID":"0dd89a4f-2cd4-4458-bb77-a56113b28c38","Type":"ContainerStarted","Data":"6c09a5c99fb275c16db9a38bca9fa510e6e4a0d570871a34f72960096724361b"} Dec 16 09:01:01 crc kubenswrapper[4823]: I1216 09:01:01.933482 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29431261-swvz9" event={"ID":"0dd89a4f-2cd4-4458-bb77-a56113b28c38","Type":"ContainerStarted","Data":"d53da2d17323dedbe8d08b5bf9b72d6fb7758fa34dac5f0ecf1a7d6d7ce548a3"} Dec 16 09:01:01 crc kubenswrapper[4823]: I1216 09:01:01.960042 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29431261-swvz9" podStartSLOduration=1.96000518 podStartE2EDuration="1.96000518s" podCreationTimestamp="2025-12-16 09:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 09:01:01.956377307 +0000 UTC m=+7540.444943430" watchObservedRunningTime="2025-12-16 09:01:01.96000518 +0000 UTC m=+7540.448571303" Dec 16 09:01:04 crc kubenswrapper[4823]: I1216 09:01:04.961236 4823 generic.go:334] "Generic (PLEG): container finished" podID="b2c20fd5-7afe-45c9-aced-a02954774c3a" containerID="3a90c47a8cf666d0191a548e0fbb974d94f6605f8a15aeb046469781c6299ff1" exitCode=0 Dec 16 09:01:04 crc kubenswrapper[4823]: I1216 09:01:04.961509 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-hfdqc" 
event={"ID":"b2c20fd5-7afe-45c9-aced-a02954774c3a","Type":"ContainerDied","Data":"3a90c47a8cf666d0191a548e0fbb974d94f6605f8a15aeb046469781c6299ff1"} Dec 16 09:01:04 crc kubenswrapper[4823]: I1216 09:01:04.964651 4823 generic.go:334] "Generic (PLEG): container finished" podID="0dd89a4f-2cd4-4458-bb77-a56113b28c38" containerID="6c09a5c99fb275c16db9a38bca9fa510e6e4a0d570871a34f72960096724361b" exitCode=0 Dec 16 09:01:04 crc kubenswrapper[4823]: I1216 09:01:04.964686 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29431261-swvz9" event={"ID":"0dd89a4f-2cd4-4458-bb77-a56113b28c38","Type":"ContainerDied","Data":"6c09a5c99fb275c16db9a38bca9fa510e6e4a0d570871a34f72960096724361b"} Dec 16 09:01:06 crc kubenswrapper[4823]: I1216 09:01:06.432295 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-hfdqc" Dec 16 09:01:06 crc kubenswrapper[4823]: I1216 09:01:06.439622 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29431261-swvz9" Dec 16 09:01:06 crc kubenswrapper[4823]: I1216 09:01:06.701826 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2c20fd5-7afe-45c9-aced-a02954774c3a-combined-ca-bundle\") pod \"b2c20fd5-7afe-45c9-aced-a02954774c3a\" (UID: \"b2c20fd5-7afe-45c9-aced-a02954774c3a\") " Dec 16 09:01:06 crc kubenswrapper[4823]: I1216 09:01:06.701920 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dd89a4f-2cd4-4458-bb77-a56113b28c38-combined-ca-bundle\") pod \"0dd89a4f-2cd4-4458-bb77-a56113b28c38\" (UID: \"0dd89a4f-2cd4-4458-bb77-a56113b28c38\") " Dec 16 09:01:06 crc kubenswrapper[4823]: I1216 09:01:06.701951 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0dd89a4f-2cd4-4458-bb77-a56113b28c38-config-data\") pod \"0dd89a4f-2cd4-4458-bb77-a56113b28c38\" (UID: \"0dd89a4f-2cd4-4458-bb77-a56113b28c38\") " Dec 16 09:01:06 crc kubenswrapper[4823]: I1216 09:01:06.701983 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rd9qh\" (UniqueName: \"kubernetes.io/projected/0dd89a4f-2cd4-4458-bb77-a56113b28c38-kube-api-access-rd9qh\") pod \"0dd89a4f-2cd4-4458-bb77-a56113b28c38\" (UID: \"0dd89a4f-2cd4-4458-bb77-a56113b28c38\") " Dec 16 09:01:06 crc kubenswrapper[4823]: I1216 09:01:06.702036 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2c20fd5-7afe-45c9-aced-a02954774c3a-config-data\") pod \"b2c20fd5-7afe-45c9-aced-a02954774c3a\" (UID: \"b2c20fd5-7afe-45c9-aced-a02954774c3a\") " Dec 16 09:01:06 crc kubenswrapper[4823]: I1216 09:01:06.702072 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/b2c20fd5-7afe-45c9-aced-a02954774c3a-scripts\") pod \"b2c20fd5-7afe-45c9-aced-a02954774c3a\" (UID: \"b2c20fd5-7afe-45c9-aced-a02954774c3a\") " Dec 16 09:01:06 crc kubenswrapper[4823]: I1216 09:01:06.702143 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tgrdt\" (UniqueName: \"kubernetes.io/projected/b2c20fd5-7afe-45c9-aced-a02954774c3a-kube-api-access-tgrdt\") pod \"b2c20fd5-7afe-45c9-aced-a02954774c3a\" (UID: \"b2c20fd5-7afe-45c9-aced-a02954774c3a\") " Dec 16 09:01:06 crc kubenswrapper[4823]: I1216 09:01:06.702228 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0dd89a4f-2cd4-4458-bb77-a56113b28c38-fernet-keys\") pod \"0dd89a4f-2cd4-4458-bb77-a56113b28c38\" (UID: \"0dd89a4f-2cd4-4458-bb77-a56113b28c38\") " Dec 16 09:01:06 crc kubenswrapper[4823]: I1216 09:01:06.707148 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2c20fd5-7afe-45c9-aced-a02954774c3a-kube-api-access-tgrdt" (OuterVolumeSpecName: "kube-api-access-tgrdt") pod "b2c20fd5-7afe-45c9-aced-a02954774c3a" (UID: "b2c20fd5-7afe-45c9-aced-a02954774c3a"). InnerVolumeSpecName "kube-api-access-tgrdt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:01:06 crc kubenswrapper[4823]: I1216 09:01:06.709012 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2c20fd5-7afe-45c9-aced-a02954774c3a-scripts" (OuterVolumeSpecName: "scripts") pod "b2c20fd5-7afe-45c9-aced-a02954774c3a" (UID: "b2c20fd5-7afe-45c9-aced-a02954774c3a"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:01:06 crc kubenswrapper[4823]: I1216 09:01:06.711019 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0dd89a4f-2cd4-4458-bb77-a56113b28c38-kube-api-access-rd9qh" (OuterVolumeSpecName: "kube-api-access-rd9qh") pod "0dd89a4f-2cd4-4458-bb77-a56113b28c38" (UID: "0dd89a4f-2cd4-4458-bb77-a56113b28c38"). InnerVolumeSpecName "kube-api-access-rd9qh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:01:06 crc kubenswrapper[4823]: I1216 09:01:06.714150 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0dd89a4f-2cd4-4458-bb77-a56113b28c38-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "0dd89a4f-2cd4-4458-bb77-a56113b28c38" (UID: "0dd89a4f-2cd4-4458-bb77-a56113b28c38"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:01:06 crc kubenswrapper[4823]: I1216 09:01:06.734790 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2c20fd5-7afe-45c9-aced-a02954774c3a-config-data" (OuterVolumeSpecName: "config-data") pod "b2c20fd5-7afe-45c9-aced-a02954774c3a" (UID: "b2c20fd5-7afe-45c9-aced-a02954774c3a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:01:06 crc kubenswrapper[4823]: I1216 09:01:06.749255 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0dd89a4f-2cd4-4458-bb77-a56113b28c38-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0dd89a4f-2cd4-4458-bb77-a56113b28c38" (UID: "0dd89a4f-2cd4-4458-bb77-a56113b28c38"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:01:06 crc kubenswrapper[4823]: I1216 09:01:06.755064 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2c20fd5-7afe-45c9-aced-a02954774c3a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b2c20fd5-7afe-45c9-aced-a02954774c3a" (UID: "b2c20fd5-7afe-45c9-aced-a02954774c3a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:01:06 crc kubenswrapper[4823]: I1216 09:01:06.762314 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0dd89a4f-2cd4-4458-bb77-a56113b28c38-config-data" (OuterVolumeSpecName: "config-data") pod "0dd89a4f-2cd4-4458-bb77-a56113b28c38" (UID: "0dd89a4f-2cd4-4458-bb77-a56113b28c38"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:01:06 crc kubenswrapper[4823]: I1216 09:01:06.805390 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dd89a4f-2cd4-4458-bb77-a56113b28c38-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 09:01:06 crc kubenswrapper[4823]: I1216 09:01:06.805552 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0dd89a4f-2cd4-4458-bb77-a56113b28c38-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 09:01:06 crc kubenswrapper[4823]: I1216 09:01:06.805581 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rd9qh\" (UniqueName: \"kubernetes.io/projected/0dd89a4f-2cd4-4458-bb77-a56113b28c38-kube-api-access-rd9qh\") on node \"crc\" DevicePath \"\"" Dec 16 09:01:06 crc kubenswrapper[4823]: I1216 09:01:06.805595 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2c20fd5-7afe-45c9-aced-a02954774c3a-config-data\") on node \"crc\" DevicePath 
\"\"" Dec 16 09:01:06 crc kubenswrapper[4823]: I1216 09:01:06.805604 4823 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b2c20fd5-7afe-45c9-aced-a02954774c3a-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 09:01:06 crc kubenswrapper[4823]: I1216 09:01:06.805615 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tgrdt\" (UniqueName: \"kubernetes.io/projected/b2c20fd5-7afe-45c9-aced-a02954774c3a-kube-api-access-tgrdt\") on node \"crc\" DevicePath \"\"" Dec 16 09:01:06 crc kubenswrapper[4823]: I1216 09:01:06.805625 4823 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0dd89a4f-2cd4-4458-bb77-a56113b28c38-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 16 09:01:06 crc kubenswrapper[4823]: I1216 09:01:06.805635 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2c20fd5-7afe-45c9-aced-a02954774c3a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 09:01:06 crc kubenswrapper[4823]: I1216 09:01:06.986369 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-hfdqc" Dec 16 09:01:06 crc kubenswrapper[4823]: I1216 09:01:06.986519 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-hfdqc" event={"ID":"b2c20fd5-7afe-45c9-aced-a02954774c3a","Type":"ContainerDied","Data":"815e64e1724a964f6a6b960df08c0d1497ded3d7e2e4626e6e8522ff4a2bfd78"} Dec 16 09:01:06 crc kubenswrapper[4823]: I1216 09:01:06.986551 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="815e64e1724a964f6a6b960df08c0d1497ded3d7e2e4626e6e8522ff4a2bfd78" Dec 16 09:01:06 crc kubenswrapper[4823]: I1216 09:01:06.991200 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29431261-swvz9" event={"ID":"0dd89a4f-2cd4-4458-bb77-a56113b28c38","Type":"ContainerDied","Data":"d53da2d17323dedbe8d08b5bf9b72d6fb7758fa34dac5f0ecf1a7d6d7ce548a3"} Dec 16 09:01:06 crc kubenswrapper[4823]: I1216 09:01:06.991250 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d53da2d17323dedbe8d08b5bf9b72d6fb7758fa34dac5f0ecf1a7d6d7ce548a3" Dec 16 09:01:06 crc kubenswrapper[4823]: I1216 09:01:06.991306 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29431261-swvz9" Dec 16 09:01:07 crc kubenswrapper[4823]: I1216 09:01:07.074966 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 16 09:01:07 crc kubenswrapper[4823]: E1216 09:01:07.075377 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0dd89a4f-2cd4-4458-bb77-a56113b28c38" containerName="keystone-cron" Dec 16 09:01:07 crc kubenswrapper[4823]: I1216 09:01:07.075395 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dd89a4f-2cd4-4458-bb77-a56113b28c38" containerName="keystone-cron" Dec 16 09:01:07 crc kubenswrapper[4823]: E1216 09:01:07.075408 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2c20fd5-7afe-45c9-aced-a02954774c3a" containerName="nova-cell0-conductor-db-sync" Dec 16 09:01:07 crc kubenswrapper[4823]: I1216 09:01:07.075428 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2c20fd5-7afe-45c9-aced-a02954774c3a" containerName="nova-cell0-conductor-db-sync" Dec 16 09:01:07 crc kubenswrapper[4823]: I1216 09:01:07.075601 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2c20fd5-7afe-45c9-aced-a02954774c3a" containerName="nova-cell0-conductor-db-sync" Dec 16 09:01:07 crc kubenswrapper[4823]: I1216 09:01:07.075627 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="0dd89a4f-2cd4-4458-bb77-a56113b28c38" containerName="keystone-cron" Dec 16 09:01:07 crc kubenswrapper[4823]: I1216 09:01:07.076243 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 16 09:01:07 crc kubenswrapper[4823]: I1216 09:01:07.079545 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-vfr64" Dec 16 09:01:07 crc kubenswrapper[4823]: I1216 09:01:07.079720 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 16 09:01:07 crc kubenswrapper[4823]: I1216 09:01:07.084823 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 16 09:01:07 crc kubenswrapper[4823]: I1216 09:01:07.211215 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtgv4\" (UniqueName: \"kubernetes.io/projected/a3bed8d9-6a2c-458b-a7d5-d6d28ac021e7-kube-api-access-jtgv4\") pod \"nova-cell0-conductor-0\" (UID: \"a3bed8d9-6a2c-458b-a7d5-d6d28ac021e7\") " pod="openstack/nova-cell0-conductor-0" Dec 16 09:01:07 crc kubenswrapper[4823]: I1216 09:01:07.211566 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3bed8d9-6a2c-458b-a7d5-d6d28ac021e7-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"a3bed8d9-6a2c-458b-a7d5-d6d28ac021e7\") " pod="openstack/nova-cell0-conductor-0" Dec 16 09:01:07 crc kubenswrapper[4823]: I1216 09:01:07.212305 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3bed8d9-6a2c-458b-a7d5-d6d28ac021e7-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"a3bed8d9-6a2c-458b-a7d5-d6d28ac021e7\") " pod="openstack/nova-cell0-conductor-0" Dec 16 09:01:07 crc kubenswrapper[4823]: I1216 09:01:07.313928 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtgv4\" (UniqueName: 
\"kubernetes.io/projected/a3bed8d9-6a2c-458b-a7d5-d6d28ac021e7-kube-api-access-jtgv4\") pod \"nova-cell0-conductor-0\" (UID: \"a3bed8d9-6a2c-458b-a7d5-d6d28ac021e7\") " pod="openstack/nova-cell0-conductor-0" Dec 16 09:01:07 crc kubenswrapper[4823]: I1216 09:01:07.314436 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3bed8d9-6a2c-458b-a7d5-d6d28ac021e7-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"a3bed8d9-6a2c-458b-a7d5-d6d28ac021e7\") " pod="openstack/nova-cell0-conductor-0" Dec 16 09:01:07 crc kubenswrapper[4823]: I1216 09:01:07.314658 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3bed8d9-6a2c-458b-a7d5-d6d28ac021e7-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"a3bed8d9-6a2c-458b-a7d5-d6d28ac021e7\") " pod="openstack/nova-cell0-conductor-0" Dec 16 09:01:07 crc kubenswrapper[4823]: I1216 09:01:07.320972 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3bed8d9-6a2c-458b-a7d5-d6d28ac021e7-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"a3bed8d9-6a2c-458b-a7d5-d6d28ac021e7\") " pod="openstack/nova-cell0-conductor-0" Dec 16 09:01:07 crc kubenswrapper[4823]: I1216 09:01:07.321387 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3bed8d9-6a2c-458b-a7d5-d6d28ac021e7-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"a3bed8d9-6a2c-458b-a7d5-d6d28ac021e7\") " pod="openstack/nova-cell0-conductor-0" Dec 16 09:01:07 crc kubenswrapper[4823]: I1216 09:01:07.332361 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtgv4\" (UniqueName: \"kubernetes.io/projected/a3bed8d9-6a2c-458b-a7d5-d6d28ac021e7-kube-api-access-jtgv4\") pod \"nova-cell0-conductor-0\" (UID: 
\"a3bed8d9-6a2c-458b-a7d5-d6d28ac021e7\") " pod="openstack/nova-cell0-conductor-0" Dec 16 09:01:07 crc kubenswrapper[4823]: I1216 09:01:07.450256 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 16 09:01:07 crc kubenswrapper[4823]: I1216 09:01:07.877474 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 16 09:01:08 crc kubenswrapper[4823]: I1216 09:01:08.008651 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"a3bed8d9-6a2c-458b-a7d5-d6d28ac021e7","Type":"ContainerStarted","Data":"671fa1c6a0651c345a57242308c25f95f53cda61ca020b343e7c592be1c3a231"} Dec 16 09:01:09 crc kubenswrapper[4823]: I1216 09:01:09.020271 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"a3bed8d9-6a2c-458b-a7d5-d6d28ac021e7","Type":"ContainerStarted","Data":"0fc4d511ab4123720484006b298f65a27f5b85e3777d75783c5ffe138cedc8aa"} Dec 16 09:01:09 crc kubenswrapper[4823]: I1216 09:01:09.020662 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Dec 16 09:01:15 crc kubenswrapper[4823]: I1216 09:01:15.302476 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=8.302450675 podStartE2EDuration="8.302450675s" podCreationTimestamp="2025-12-16 09:01:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 09:01:09.039491893 +0000 UTC m=+7547.528058016" watchObservedRunningTime="2025-12-16 09:01:15.302450675 +0000 UTC m=+7553.791016828" Dec 16 09:01:15 crc kubenswrapper[4823]: I1216 09:01:15.309721 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-k9mtc"] Dec 16 09:01:15 crc kubenswrapper[4823]: I1216 
09:01:15.312383 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-k9mtc" Dec 16 09:01:15 crc kubenswrapper[4823]: I1216 09:01:15.318132 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-k9mtc"] Dec 16 09:01:15 crc kubenswrapper[4823]: I1216 09:01:15.480755 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1989b143-cd57-41c6-9174-e8067cbc491f-catalog-content\") pod \"community-operators-k9mtc\" (UID: \"1989b143-cd57-41c6-9174-e8067cbc491f\") " pod="openshift-marketplace/community-operators-k9mtc" Dec 16 09:01:15 crc kubenswrapper[4823]: I1216 09:01:15.480839 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9nxz\" (UniqueName: \"kubernetes.io/projected/1989b143-cd57-41c6-9174-e8067cbc491f-kube-api-access-w9nxz\") pod \"community-operators-k9mtc\" (UID: \"1989b143-cd57-41c6-9174-e8067cbc491f\") " pod="openshift-marketplace/community-operators-k9mtc" Dec 16 09:01:15 crc kubenswrapper[4823]: I1216 09:01:15.480867 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1989b143-cd57-41c6-9174-e8067cbc491f-utilities\") pod \"community-operators-k9mtc\" (UID: \"1989b143-cd57-41c6-9174-e8067cbc491f\") " pod="openshift-marketplace/community-operators-k9mtc" Dec 16 09:01:15 crc kubenswrapper[4823]: I1216 09:01:15.582736 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1989b143-cd57-41c6-9174-e8067cbc491f-catalog-content\") pod \"community-operators-k9mtc\" (UID: \"1989b143-cd57-41c6-9174-e8067cbc491f\") " pod="openshift-marketplace/community-operators-k9mtc" Dec 16 09:01:15 crc 
kubenswrapper[4823]: I1216 09:01:15.582841 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9nxz\" (UniqueName: \"kubernetes.io/projected/1989b143-cd57-41c6-9174-e8067cbc491f-kube-api-access-w9nxz\") pod \"community-operators-k9mtc\" (UID: \"1989b143-cd57-41c6-9174-e8067cbc491f\") " pod="openshift-marketplace/community-operators-k9mtc" Dec 16 09:01:15 crc kubenswrapper[4823]: I1216 09:01:15.582878 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1989b143-cd57-41c6-9174-e8067cbc491f-utilities\") pod \"community-operators-k9mtc\" (UID: \"1989b143-cd57-41c6-9174-e8067cbc491f\") " pod="openshift-marketplace/community-operators-k9mtc" Dec 16 09:01:15 crc kubenswrapper[4823]: I1216 09:01:15.583367 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1989b143-cd57-41c6-9174-e8067cbc491f-catalog-content\") pod \"community-operators-k9mtc\" (UID: \"1989b143-cd57-41c6-9174-e8067cbc491f\") " pod="openshift-marketplace/community-operators-k9mtc" Dec 16 09:01:15 crc kubenswrapper[4823]: I1216 09:01:15.584283 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1989b143-cd57-41c6-9174-e8067cbc491f-utilities\") pod \"community-operators-k9mtc\" (UID: \"1989b143-cd57-41c6-9174-e8067cbc491f\") " pod="openshift-marketplace/community-operators-k9mtc" Dec 16 09:01:15 crc kubenswrapper[4823]: I1216 09:01:15.607851 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9nxz\" (UniqueName: \"kubernetes.io/projected/1989b143-cd57-41c6-9174-e8067cbc491f-kube-api-access-w9nxz\") pod \"community-operators-k9mtc\" (UID: \"1989b143-cd57-41c6-9174-e8067cbc491f\") " pod="openshift-marketplace/community-operators-k9mtc" Dec 16 09:01:15 crc kubenswrapper[4823]: I1216 
09:01:15.659140 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-k9mtc" Dec 16 09:01:16 crc kubenswrapper[4823]: I1216 09:01:16.271801 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-k9mtc"] Dec 16 09:01:17 crc kubenswrapper[4823]: I1216 09:01:17.052410 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-b1da-account-create-update-qx8k4"] Dec 16 09:01:17 crc kubenswrapper[4823]: I1216 09:01:17.060950 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-jthq4"] Dec 16 09:01:17 crc kubenswrapper[4823]: I1216 09:01:17.069323 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-jthq4"] Dec 16 09:01:17 crc kubenswrapper[4823]: I1216 09:01:17.077677 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-b1da-account-create-update-qx8k4"] Dec 16 09:01:17 crc kubenswrapper[4823]: I1216 09:01:17.112966 4823 generic.go:334] "Generic (PLEG): container finished" podID="1989b143-cd57-41c6-9174-e8067cbc491f" containerID="3316eb77edd9ac9ecbe03a940e26d7211ec411fe27b5e5614a92680b1439b37f" exitCode=0 Dec 16 09:01:17 crc kubenswrapper[4823]: I1216 09:01:17.113015 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k9mtc" event={"ID":"1989b143-cd57-41c6-9174-e8067cbc491f","Type":"ContainerDied","Data":"3316eb77edd9ac9ecbe03a940e26d7211ec411fe27b5e5614a92680b1439b37f"} Dec 16 09:01:17 crc kubenswrapper[4823]: I1216 09:01:17.113098 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k9mtc" event={"ID":"1989b143-cd57-41c6-9174-e8067cbc491f","Type":"ContainerStarted","Data":"73e86d437fa925e89e08a7e70533b34054a6848460c20f8bd456bd549c0e8ed5"} Dec 16 09:01:17 crc kubenswrapper[4823]: I1216 09:01:17.116339 4823 provider.go:102] Refreshing cache for 
provider: *credentialprovider.defaultDockerConfigProvider Dec 16 09:01:17 crc kubenswrapper[4823]: I1216 09:01:17.480527 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Dec 16 09:01:17 crc kubenswrapper[4823]: I1216 09:01:17.785376 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17b9ddf1-8b90-4e7d-8edf-09b5d2b2cece" path="/var/lib/kubelet/pods/17b9ddf1-8b90-4e7d-8edf-09b5d2b2cece/volumes" Dec 16 09:01:17 crc kubenswrapper[4823]: I1216 09:01:17.786147 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1954da4f-6645-49fa-89b4-1e3f2c0284b2" path="/var/lib/kubelet/pods/1954da4f-6645-49fa-89b4-1e3f2c0284b2/volumes" Dec 16 09:01:17 crc kubenswrapper[4823]: I1216 09:01:17.957187 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-4d28k"] Dec 16 09:01:17 crc kubenswrapper[4823]: I1216 09:01:17.958844 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-4d28k" Dec 16 09:01:17 crc kubenswrapper[4823]: I1216 09:01:17.960549 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Dec 16 09:01:17 crc kubenswrapper[4823]: I1216 09:01:17.960734 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Dec 16 09:01:17 crc kubenswrapper[4823]: I1216 09:01:17.965592 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-4d28k"] Dec 16 09:01:18 crc kubenswrapper[4823]: I1216 09:01:18.133865 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc99ed3d-7cdb-4152-b23e-05096c7dd4cb-scripts\") pod \"nova-cell0-cell-mapping-4d28k\" (UID: \"cc99ed3d-7cdb-4152-b23e-05096c7dd4cb\") " pod="openstack/nova-cell0-cell-mapping-4d28k" Dec 16 09:01:18 crc kubenswrapper[4823]: I1216 09:01:18.134374 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc99ed3d-7cdb-4152-b23e-05096c7dd4cb-config-data\") pod \"nova-cell0-cell-mapping-4d28k\" (UID: \"cc99ed3d-7cdb-4152-b23e-05096c7dd4cb\") " pod="openstack/nova-cell0-cell-mapping-4d28k" Dec 16 09:01:18 crc kubenswrapper[4823]: I1216 09:01:18.134629 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc99ed3d-7cdb-4152-b23e-05096c7dd4cb-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-4d28k\" (UID: \"cc99ed3d-7cdb-4152-b23e-05096c7dd4cb\") " pod="openstack/nova-cell0-cell-mapping-4d28k" Dec 16 09:01:18 crc kubenswrapper[4823]: I1216 09:01:18.134666 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pr87\" (UniqueName: 
\"kubernetes.io/projected/cc99ed3d-7cdb-4152-b23e-05096c7dd4cb-kube-api-access-2pr87\") pod \"nova-cell0-cell-mapping-4d28k\" (UID: \"cc99ed3d-7cdb-4152-b23e-05096c7dd4cb\") " pod="openstack/nova-cell0-cell-mapping-4d28k" Dec 16 09:01:18 crc kubenswrapper[4823]: I1216 09:01:18.144579 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k9mtc" event={"ID":"1989b143-cd57-41c6-9174-e8067cbc491f","Type":"ContainerStarted","Data":"343390cb1b9dbe46a6225176839bbd7fbb8d99803c2693939b4e74634fac34cc"} Dec 16 09:01:18 crc kubenswrapper[4823]: I1216 09:01:18.171235 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 16 09:01:18 crc kubenswrapper[4823]: I1216 09:01:18.175527 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 16 09:01:18 crc kubenswrapper[4823]: I1216 09:01:18.179342 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 16 09:01:18 crc kubenswrapper[4823]: I1216 09:01:18.183160 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 16 09:01:18 crc kubenswrapper[4823]: I1216 09:01:18.184479 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 16 09:01:18 crc kubenswrapper[4823]: I1216 09:01:18.192743 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 16 09:01:18 crc kubenswrapper[4823]: I1216 09:01:18.199913 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 16 09:01:18 crc kubenswrapper[4823]: I1216 09:01:18.210271 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 16 09:01:18 crc kubenswrapper[4823]: I1216 09:01:18.241769 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc99ed3d-7cdb-4152-b23e-05096c7dd4cb-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-4d28k\" (UID: \"cc99ed3d-7cdb-4152-b23e-05096c7dd4cb\") " pod="openstack/nova-cell0-cell-mapping-4d28k" Dec 16 09:01:18 crc kubenswrapper[4823]: I1216 09:01:18.242113 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2pr87\" (UniqueName: \"kubernetes.io/projected/cc99ed3d-7cdb-4152-b23e-05096c7dd4cb-kube-api-access-2pr87\") pod \"nova-cell0-cell-mapping-4d28k\" (UID: \"cc99ed3d-7cdb-4152-b23e-05096c7dd4cb\") " pod="openstack/nova-cell0-cell-mapping-4d28k" Dec 16 09:01:18 crc kubenswrapper[4823]: I1216 09:01:18.242148 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2vpr\" (UniqueName: \"kubernetes.io/projected/63e56d7b-61f5-4bd7-bac6-48f147bf1b2e-kube-api-access-n2vpr\") pod \"nova-scheduler-0\" (UID: \"63e56d7b-61f5-4bd7-bac6-48f147bf1b2e\") " pod="openstack/nova-scheduler-0" Dec 16 09:01:18 crc kubenswrapper[4823]: I1216 09:01:18.242180 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc99ed3d-7cdb-4152-b23e-05096c7dd4cb-scripts\") pod 
\"nova-cell0-cell-mapping-4d28k\" (UID: \"cc99ed3d-7cdb-4152-b23e-05096c7dd4cb\") " pod="openstack/nova-cell0-cell-mapping-4d28k" Dec 16 09:01:18 crc kubenswrapper[4823]: I1216 09:01:18.242216 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35a08b79-3be2-4b2b-b41e-bb87c1f2e0d5-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"35a08b79-3be2-4b2b-b41e-bb87c1f2e0d5\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 09:01:18 crc kubenswrapper[4823]: I1216 09:01:18.242238 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63e56d7b-61f5-4bd7-bac6-48f147bf1b2e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"63e56d7b-61f5-4bd7-bac6-48f147bf1b2e\") " pod="openstack/nova-scheduler-0" Dec 16 09:01:18 crc kubenswrapper[4823]: I1216 09:01:18.242259 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63e56d7b-61f5-4bd7-bac6-48f147bf1b2e-config-data\") pod \"nova-scheduler-0\" (UID: \"63e56d7b-61f5-4bd7-bac6-48f147bf1b2e\") " pod="openstack/nova-scheduler-0" Dec 16 09:01:18 crc kubenswrapper[4823]: I1216 09:01:18.242305 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35a08b79-3be2-4b2b-b41e-bb87c1f2e0d5-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"35a08b79-3be2-4b2b-b41e-bb87c1f2e0d5\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 09:01:18 crc kubenswrapper[4823]: I1216 09:01:18.242334 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2x72q\" (UniqueName: \"kubernetes.io/projected/35a08b79-3be2-4b2b-b41e-bb87c1f2e0d5-kube-api-access-2x72q\") pod 
\"nova-cell1-novncproxy-0\" (UID: \"35a08b79-3be2-4b2b-b41e-bb87c1f2e0d5\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 09:01:18 crc kubenswrapper[4823]: I1216 09:01:18.242386 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc99ed3d-7cdb-4152-b23e-05096c7dd4cb-config-data\") pod \"nova-cell0-cell-mapping-4d28k\" (UID: \"cc99ed3d-7cdb-4152-b23e-05096c7dd4cb\") " pod="openstack/nova-cell0-cell-mapping-4d28k" Dec 16 09:01:18 crc kubenswrapper[4823]: I1216 09:01:18.254607 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc99ed3d-7cdb-4152-b23e-05096c7dd4cb-config-data\") pod \"nova-cell0-cell-mapping-4d28k\" (UID: \"cc99ed3d-7cdb-4152-b23e-05096c7dd4cb\") " pod="openstack/nova-cell0-cell-mapping-4d28k" Dec 16 09:01:18 crc kubenswrapper[4823]: I1216 09:01:18.255903 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc99ed3d-7cdb-4152-b23e-05096c7dd4cb-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-4d28k\" (UID: \"cc99ed3d-7cdb-4152-b23e-05096c7dd4cb\") " pod="openstack/nova-cell0-cell-mapping-4d28k" Dec 16 09:01:18 crc kubenswrapper[4823]: I1216 09:01:18.291833 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 16 09:01:18 crc kubenswrapper[4823]: I1216 09:01:18.293605 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 16 09:01:18 crc kubenswrapper[4823]: I1216 09:01:18.298580 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 16 09:01:18 crc kubenswrapper[4823]: I1216 09:01:18.304546 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc99ed3d-7cdb-4152-b23e-05096c7dd4cb-scripts\") pod \"nova-cell0-cell-mapping-4d28k\" (UID: \"cc99ed3d-7cdb-4152-b23e-05096c7dd4cb\") " pod="openstack/nova-cell0-cell-mapping-4d28k" Dec 16 09:01:18 crc kubenswrapper[4823]: I1216 09:01:18.328518 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pr87\" (UniqueName: \"kubernetes.io/projected/cc99ed3d-7cdb-4152-b23e-05096c7dd4cb-kube-api-access-2pr87\") pod \"nova-cell0-cell-mapping-4d28k\" (UID: \"cc99ed3d-7cdb-4152-b23e-05096c7dd4cb\") " pod="openstack/nova-cell0-cell-mapping-4d28k" Dec 16 09:01:18 crc kubenswrapper[4823]: I1216 09:01:18.344145 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-768rv\" (UniqueName: \"kubernetes.io/projected/61e3b1a8-261a-4e24-a70d-7c460c4505bf-kube-api-access-768rv\") pod \"nova-metadata-0\" (UID: \"61e3b1a8-261a-4e24-a70d-7c460c4505bf\") " pod="openstack/nova-metadata-0" Dec 16 09:01:18 crc kubenswrapper[4823]: I1216 09:01:18.344207 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35a08b79-3be2-4b2b-b41e-bb87c1f2e0d5-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"35a08b79-3be2-4b2b-b41e-bb87c1f2e0d5\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 09:01:18 crc kubenswrapper[4823]: I1216 09:01:18.344240 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2x72q\" (UniqueName: 
\"kubernetes.io/projected/35a08b79-3be2-4b2b-b41e-bb87c1f2e0d5-kube-api-access-2x72q\") pod \"nova-cell1-novncproxy-0\" (UID: \"35a08b79-3be2-4b2b-b41e-bb87c1f2e0d5\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 09:01:18 crc kubenswrapper[4823]: I1216 09:01:18.344349 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61e3b1a8-261a-4e24-a70d-7c460c4505bf-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"61e3b1a8-261a-4e24-a70d-7c460c4505bf\") " pod="openstack/nova-metadata-0" Dec 16 09:01:18 crc kubenswrapper[4823]: I1216 09:01:18.344415 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61e3b1a8-261a-4e24-a70d-7c460c4505bf-config-data\") pod \"nova-metadata-0\" (UID: \"61e3b1a8-261a-4e24-a70d-7c460c4505bf\") " pod="openstack/nova-metadata-0" Dec 16 09:01:18 crc kubenswrapper[4823]: I1216 09:01:18.344491 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n2vpr\" (UniqueName: \"kubernetes.io/projected/63e56d7b-61f5-4bd7-bac6-48f147bf1b2e-kube-api-access-n2vpr\") pod \"nova-scheduler-0\" (UID: \"63e56d7b-61f5-4bd7-bac6-48f147bf1b2e\") " pod="openstack/nova-scheduler-0" Dec 16 09:01:18 crc kubenswrapper[4823]: I1216 09:01:18.344546 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/61e3b1a8-261a-4e24-a70d-7c460c4505bf-logs\") pod \"nova-metadata-0\" (UID: \"61e3b1a8-261a-4e24-a70d-7c460c4505bf\") " pod="openstack/nova-metadata-0" Dec 16 09:01:18 crc kubenswrapper[4823]: I1216 09:01:18.344579 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35a08b79-3be2-4b2b-b41e-bb87c1f2e0d5-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: 
\"35a08b79-3be2-4b2b-b41e-bb87c1f2e0d5\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 09:01:18 crc kubenswrapper[4823]: I1216 09:01:18.344609 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63e56d7b-61f5-4bd7-bac6-48f147bf1b2e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"63e56d7b-61f5-4bd7-bac6-48f147bf1b2e\") " pod="openstack/nova-scheduler-0" Dec 16 09:01:18 crc kubenswrapper[4823]: I1216 09:01:18.344636 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63e56d7b-61f5-4bd7-bac6-48f147bf1b2e-config-data\") pod \"nova-scheduler-0\" (UID: \"63e56d7b-61f5-4bd7-bac6-48f147bf1b2e\") " pod="openstack/nova-scheduler-0" Dec 16 09:01:18 crc kubenswrapper[4823]: I1216 09:01:18.350355 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 16 09:01:18 crc kubenswrapper[4823]: I1216 09:01:18.352849 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63e56d7b-61f5-4bd7-bac6-48f147bf1b2e-config-data\") pod \"nova-scheduler-0\" (UID: \"63e56d7b-61f5-4bd7-bac6-48f147bf1b2e\") " pod="openstack/nova-scheduler-0" Dec 16 09:01:18 crc kubenswrapper[4823]: I1216 09:01:18.362235 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63e56d7b-61f5-4bd7-bac6-48f147bf1b2e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"63e56d7b-61f5-4bd7-bac6-48f147bf1b2e\") " pod="openstack/nova-scheduler-0" Dec 16 09:01:18 crc kubenswrapper[4823]: I1216 09:01:18.363175 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35a08b79-3be2-4b2b-b41e-bb87c1f2e0d5-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: 
\"35a08b79-3be2-4b2b-b41e-bb87c1f2e0d5\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 09:01:18 crc kubenswrapper[4823]: I1216 09:01:18.367701 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35a08b79-3be2-4b2b-b41e-bb87c1f2e0d5-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"35a08b79-3be2-4b2b-b41e-bb87c1f2e0d5\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 09:01:18 crc kubenswrapper[4823]: I1216 09:01:18.401847 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2vpr\" (UniqueName: \"kubernetes.io/projected/63e56d7b-61f5-4bd7-bac6-48f147bf1b2e-kube-api-access-n2vpr\") pod \"nova-scheduler-0\" (UID: \"63e56d7b-61f5-4bd7-bac6-48f147bf1b2e\") " pod="openstack/nova-scheduler-0" Dec 16 09:01:18 crc kubenswrapper[4823]: I1216 09:01:18.412690 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2x72q\" (UniqueName: \"kubernetes.io/projected/35a08b79-3be2-4b2b-b41e-bb87c1f2e0d5-kube-api-access-2x72q\") pod \"nova-cell1-novncproxy-0\" (UID: \"35a08b79-3be2-4b2b-b41e-bb87c1f2e0d5\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 09:01:18 crc kubenswrapper[4823]: I1216 09:01:18.451426 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61e3b1a8-261a-4e24-a70d-7c460c4505bf-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"61e3b1a8-261a-4e24-a70d-7c460c4505bf\") " pod="openstack/nova-metadata-0" Dec 16 09:01:18 crc kubenswrapper[4823]: I1216 09:01:18.451485 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61e3b1a8-261a-4e24-a70d-7c460c4505bf-config-data\") pod \"nova-metadata-0\" (UID: \"61e3b1a8-261a-4e24-a70d-7c460c4505bf\") " pod="openstack/nova-metadata-0" Dec 16 09:01:18 crc kubenswrapper[4823]: I1216 09:01:18.451564 4823 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/61e3b1a8-261a-4e24-a70d-7c460c4505bf-logs\") pod \"nova-metadata-0\" (UID: \"61e3b1a8-261a-4e24-a70d-7c460c4505bf\") " pod="openstack/nova-metadata-0" Dec 16 09:01:18 crc kubenswrapper[4823]: I1216 09:01:18.451613 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-768rv\" (UniqueName: \"kubernetes.io/projected/61e3b1a8-261a-4e24-a70d-7c460c4505bf-kube-api-access-768rv\") pod \"nova-metadata-0\" (UID: \"61e3b1a8-261a-4e24-a70d-7c460c4505bf\") " pod="openstack/nova-metadata-0" Dec 16 09:01:18 crc kubenswrapper[4823]: I1216 09:01:18.456467 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/61e3b1a8-261a-4e24-a70d-7c460c4505bf-logs\") pod \"nova-metadata-0\" (UID: \"61e3b1a8-261a-4e24-a70d-7c460c4505bf\") " pod="openstack/nova-metadata-0" Dec 16 09:01:18 crc kubenswrapper[4823]: I1216 09:01:18.463003 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61e3b1a8-261a-4e24-a70d-7c460c4505bf-config-data\") pod \"nova-metadata-0\" (UID: \"61e3b1a8-261a-4e24-a70d-7c460c4505bf\") " pod="openstack/nova-metadata-0" Dec 16 09:01:18 crc kubenswrapper[4823]: I1216 09:01:18.468719 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61e3b1a8-261a-4e24-a70d-7c460c4505bf-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"61e3b1a8-261a-4e24-a70d-7c460c4505bf\") " pod="openstack/nova-metadata-0" Dec 16 09:01:18 crc kubenswrapper[4823]: I1216 09:01:18.484695 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-768rv\" (UniqueName: \"kubernetes.io/projected/61e3b1a8-261a-4e24-a70d-7c460c4505bf-kube-api-access-768rv\") pod \"nova-metadata-0\" (UID: 
\"61e3b1a8-261a-4e24-a70d-7c460c4505bf\") " pod="openstack/nova-metadata-0" Dec 16 09:01:18 crc kubenswrapper[4823]: I1216 09:01:18.498447 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 16 09:01:18 crc kubenswrapper[4823]: I1216 09:01:18.501942 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 16 09:01:18 crc kubenswrapper[4823]: I1216 09:01:18.503886 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 16 09:01:18 crc kubenswrapper[4823]: I1216 09:01:18.510675 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 16 09:01:18 crc kubenswrapper[4823]: I1216 09:01:18.511091 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-555fc99bf5-94c6p"] Dec 16 09:01:18 crc kubenswrapper[4823]: I1216 09:01:18.512982 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-555fc99bf5-94c6p" Dec 16 09:01:18 crc kubenswrapper[4823]: I1216 09:01:18.524502 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 16 09:01:18 crc kubenswrapper[4823]: I1216 09:01:18.525355 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-555fc99bf5-94c6p"] Dec 16 09:01:18 crc kubenswrapper[4823]: I1216 09:01:18.549428 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 16 09:01:18 crc kubenswrapper[4823]: I1216 09:01:18.557254 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/587d1019-58f8-48c1-98f9-6f1bc724d7f1-config-data\") pod \"nova-api-0\" (UID: \"587d1019-58f8-48c1-98f9-6f1bc724d7f1\") " pod="openstack/nova-api-0" Dec 16 09:01:18 crc kubenswrapper[4823]: I1216 09:01:18.557311 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/587d1019-58f8-48c1-98f9-6f1bc724d7f1-logs\") pod \"nova-api-0\" (UID: \"587d1019-58f8-48c1-98f9-6f1bc724d7f1\") " pod="openstack/nova-api-0" Dec 16 09:01:18 crc kubenswrapper[4823]: I1216 09:01:18.557459 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fxgq\" (UniqueName: \"kubernetes.io/projected/587d1019-58f8-48c1-98f9-6f1bc724d7f1-kube-api-access-8fxgq\") pod \"nova-api-0\" (UID: \"587d1019-58f8-48c1-98f9-6f1bc724d7f1\") " pod="openstack/nova-api-0" Dec 16 09:01:18 crc kubenswrapper[4823]: I1216 09:01:18.557535 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/587d1019-58f8-48c1-98f9-6f1bc724d7f1-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"587d1019-58f8-48c1-98f9-6f1bc724d7f1\") " pod="openstack/nova-api-0" Dec 16 09:01:18 crc kubenswrapper[4823]: I1216 09:01:18.575088 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 16 09:01:18 crc kubenswrapper[4823]: I1216 09:01:18.616315 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-4d28k" Dec 16 09:01:18 crc kubenswrapper[4823]: I1216 09:01:18.662125 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fxgq\" (UniqueName: \"kubernetes.io/projected/587d1019-58f8-48c1-98f9-6f1bc724d7f1-kube-api-access-8fxgq\") pod \"nova-api-0\" (UID: \"587d1019-58f8-48c1-98f9-6f1bc724d7f1\") " pod="openstack/nova-api-0" Dec 16 09:01:18 crc kubenswrapper[4823]: I1216 09:01:18.662210 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nckxs\" (UniqueName: \"kubernetes.io/projected/a56e00f5-9eaa-40ff-8e1c-4b2926e8d95d-kube-api-access-nckxs\") pod \"dnsmasq-dns-555fc99bf5-94c6p\" (UID: \"a56e00f5-9eaa-40ff-8e1c-4b2926e8d95d\") " pod="openstack/dnsmasq-dns-555fc99bf5-94c6p" Dec 16 09:01:18 crc kubenswrapper[4823]: I1216 09:01:18.662250 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a56e00f5-9eaa-40ff-8e1c-4b2926e8d95d-dns-svc\") pod \"dnsmasq-dns-555fc99bf5-94c6p\" (UID: \"a56e00f5-9eaa-40ff-8e1c-4b2926e8d95d\") " pod="openstack/dnsmasq-dns-555fc99bf5-94c6p" Dec 16 09:01:18 crc kubenswrapper[4823]: I1216 09:01:18.662294 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/587d1019-58f8-48c1-98f9-6f1bc724d7f1-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"587d1019-58f8-48c1-98f9-6f1bc724d7f1\") " pod="openstack/nova-api-0" Dec 16 09:01:18 crc kubenswrapper[4823]: I1216 09:01:18.662324 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a56e00f5-9eaa-40ff-8e1c-4b2926e8d95d-ovsdbserver-nb\") pod \"dnsmasq-dns-555fc99bf5-94c6p\" (UID: \"a56e00f5-9eaa-40ff-8e1c-4b2926e8d95d\") " 
pod="openstack/dnsmasq-dns-555fc99bf5-94c6p" Dec 16 09:01:18 crc kubenswrapper[4823]: I1216 09:01:18.662393 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a56e00f5-9eaa-40ff-8e1c-4b2926e8d95d-config\") pod \"dnsmasq-dns-555fc99bf5-94c6p\" (UID: \"a56e00f5-9eaa-40ff-8e1c-4b2926e8d95d\") " pod="openstack/dnsmasq-dns-555fc99bf5-94c6p" Dec 16 09:01:18 crc kubenswrapper[4823]: I1216 09:01:18.662428 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/587d1019-58f8-48c1-98f9-6f1bc724d7f1-config-data\") pod \"nova-api-0\" (UID: \"587d1019-58f8-48c1-98f9-6f1bc724d7f1\") " pod="openstack/nova-api-0" Dec 16 09:01:18 crc kubenswrapper[4823]: I1216 09:01:18.662452 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/587d1019-58f8-48c1-98f9-6f1bc724d7f1-logs\") pod \"nova-api-0\" (UID: \"587d1019-58f8-48c1-98f9-6f1bc724d7f1\") " pod="openstack/nova-api-0" Dec 16 09:01:18 crc kubenswrapper[4823]: I1216 09:01:18.662480 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a56e00f5-9eaa-40ff-8e1c-4b2926e8d95d-ovsdbserver-sb\") pod \"dnsmasq-dns-555fc99bf5-94c6p\" (UID: \"a56e00f5-9eaa-40ff-8e1c-4b2926e8d95d\") " pod="openstack/dnsmasq-dns-555fc99bf5-94c6p" Dec 16 09:01:18 crc kubenswrapper[4823]: I1216 09:01:18.667291 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/587d1019-58f8-48c1-98f9-6f1bc724d7f1-logs\") pod \"nova-api-0\" (UID: \"587d1019-58f8-48c1-98f9-6f1bc724d7f1\") " pod="openstack/nova-api-0" Dec 16 09:01:18 crc kubenswrapper[4823]: I1216 09:01:18.680643 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/587d1019-58f8-48c1-98f9-6f1bc724d7f1-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"587d1019-58f8-48c1-98f9-6f1bc724d7f1\") " pod="openstack/nova-api-0" Dec 16 09:01:18 crc kubenswrapper[4823]: I1216 09:01:18.682020 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/587d1019-58f8-48c1-98f9-6f1bc724d7f1-config-data\") pod \"nova-api-0\" (UID: \"587d1019-58f8-48c1-98f9-6f1bc724d7f1\") " pod="openstack/nova-api-0" Dec 16 09:01:18 crc kubenswrapper[4823]: I1216 09:01:18.684233 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fxgq\" (UniqueName: \"kubernetes.io/projected/587d1019-58f8-48c1-98f9-6f1bc724d7f1-kube-api-access-8fxgq\") pod \"nova-api-0\" (UID: \"587d1019-58f8-48c1-98f9-6f1bc724d7f1\") " pod="openstack/nova-api-0" Dec 16 09:01:18 crc kubenswrapper[4823]: I1216 09:01:18.764078 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nckxs\" (UniqueName: \"kubernetes.io/projected/a56e00f5-9eaa-40ff-8e1c-4b2926e8d95d-kube-api-access-nckxs\") pod \"dnsmasq-dns-555fc99bf5-94c6p\" (UID: \"a56e00f5-9eaa-40ff-8e1c-4b2926e8d95d\") " pod="openstack/dnsmasq-dns-555fc99bf5-94c6p" Dec 16 09:01:18 crc kubenswrapper[4823]: I1216 09:01:18.764137 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a56e00f5-9eaa-40ff-8e1c-4b2926e8d95d-dns-svc\") pod \"dnsmasq-dns-555fc99bf5-94c6p\" (UID: \"a56e00f5-9eaa-40ff-8e1c-4b2926e8d95d\") " pod="openstack/dnsmasq-dns-555fc99bf5-94c6p" Dec 16 09:01:18 crc kubenswrapper[4823]: I1216 09:01:18.764184 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a56e00f5-9eaa-40ff-8e1c-4b2926e8d95d-ovsdbserver-nb\") pod \"dnsmasq-dns-555fc99bf5-94c6p\" (UID: 
\"a56e00f5-9eaa-40ff-8e1c-4b2926e8d95d\") " pod="openstack/dnsmasq-dns-555fc99bf5-94c6p" Dec 16 09:01:18 crc kubenswrapper[4823]: I1216 09:01:18.764243 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a56e00f5-9eaa-40ff-8e1c-4b2926e8d95d-config\") pod \"dnsmasq-dns-555fc99bf5-94c6p\" (UID: \"a56e00f5-9eaa-40ff-8e1c-4b2926e8d95d\") " pod="openstack/dnsmasq-dns-555fc99bf5-94c6p" Dec 16 09:01:18 crc kubenswrapper[4823]: I1216 09:01:18.764269 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a56e00f5-9eaa-40ff-8e1c-4b2926e8d95d-ovsdbserver-sb\") pod \"dnsmasq-dns-555fc99bf5-94c6p\" (UID: \"a56e00f5-9eaa-40ff-8e1c-4b2926e8d95d\") " pod="openstack/dnsmasq-dns-555fc99bf5-94c6p" Dec 16 09:01:18 crc kubenswrapper[4823]: I1216 09:01:18.828768 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 16 09:01:18 crc kubenswrapper[4823]: I1216 09:01:18.923306 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a56e00f5-9eaa-40ff-8e1c-4b2926e8d95d-ovsdbserver-nb\") pod \"dnsmasq-dns-555fc99bf5-94c6p\" (UID: \"a56e00f5-9eaa-40ff-8e1c-4b2926e8d95d\") " pod="openstack/dnsmasq-dns-555fc99bf5-94c6p" Dec 16 09:01:18 crc kubenswrapper[4823]: I1216 09:01:18.923306 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a56e00f5-9eaa-40ff-8e1c-4b2926e8d95d-config\") pod \"dnsmasq-dns-555fc99bf5-94c6p\" (UID: \"a56e00f5-9eaa-40ff-8e1c-4b2926e8d95d\") " pod="openstack/dnsmasq-dns-555fc99bf5-94c6p" Dec 16 09:01:18 crc kubenswrapper[4823]: I1216 09:01:18.925918 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a56e00f5-9eaa-40ff-8e1c-4b2926e8d95d-dns-svc\") 
pod \"dnsmasq-dns-555fc99bf5-94c6p\" (UID: \"a56e00f5-9eaa-40ff-8e1c-4b2926e8d95d\") " pod="openstack/dnsmasq-dns-555fc99bf5-94c6p" Dec 16 09:01:18 crc kubenswrapper[4823]: I1216 09:01:18.927646 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a56e00f5-9eaa-40ff-8e1c-4b2926e8d95d-ovsdbserver-sb\") pod \"dnsmasq-dns-555fc99bf5-94c6p\" (UID: \"a56e00f5-9eaa-40ff-8e1c-4b2926e8d95d\") " pod="openstack/dnsmasq-dns-555fc99bf5-94c6p" Dec 16 09:01:18 crc kubenswrapper[4823]: I1216 09:01:18.933821 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nckxs\" (UniqueName: \"kubernetes.io/projected/a56e00f5-9eaa-40ff-8e1c-4b2926e8d95d-kube-api-access-nckxs\") pod \"dnsmasq-dns-555fc99bf5-94c6p\" (UID: \"a56e00f5-9eaa-40ff-8e1c-4b2926e8d95d\") " pod="openstack/dnsmasq-dns-555fc99bf5-94c6p" Dec 16 09:01:19 crc kubenswrapper[4823]: I1216 09:01:19.170270 4823 generic.go:334] "Generic (PLEG): container finished" podID="1989b143-cd57-41c6-9174-e8067cbc491f" containerID="343390cb1b9dbe46a6225176839bbd7fbb8d99803c2693939b4e74634fac34cc" exitCode=0 Dec 16 09:01:19 crc kubenswrapper[4823]: I1216 09:01:19.170315 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k9mtc" event={"ID":"1989b143-cd57-41c6-9174-e8067cbc491f","Type":"ContainerDied","Data":"343390cb1b9dbe46a6225176839bbd7fbb8d99803c2693939b4e74634fac34cc"} Dec 16 09:01:19 crc kubenswrapper[4823]: I1216 09:01:19.177911 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-555fc99bf5-94c6p" Dec 16 09:01:19 crc kubenswrapper[4823]: I1216 09:01:19.401120 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-vkmfp"] Dec 16 09:01:19 crc kubenswrapper[4823]: I1216 09:01:19.402709 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-vkmfp" Dec 16 09:01:19 crc kubenswrapper[4823]: I1216 09:01:19.404766 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 16 09:01:19 crc kubenswrapper[4823]: I1216 09:01:19.405217 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Dec 16 09:01:19 crc kubenswrapper[4823]: I1216 09:01:19.416865 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 16 09:01:19 crc kubenswrapper[4823]: W1216 09:01:19.419865 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod35a08b79_3be2_4b2b_b41e_bb87c1f2e0d5.slice/crio-5eafa09d6164dfaf5d1a6e135e7c5069c60ebe1743b6df880612ebf7d3f807e6 WatchSource:0}: Error finding container 5eafa09d6164dfaf5d1a6e135e7c5069c60ebe1743b6df880612ebf7d3f807e6: Status 404 returned error can't find the container with id 5eafa09d6164dfaf5d1a6e135e7c5069c60ebe1743b6df880612ebf7d3f807e6 Dec 16 09:01:19 crc kubenswrapper[4823]: I1216 09:01:19.429059 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-vkmfp"] Dec 16 09:01:19 crc kubenswrapper[4823]: I1216 09:01:19.482536 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b887758-8834-4b33-8569-44799548f791-config-data\") pod \"nova-cell1-conductor-db-sync-vkmfp\" (UID: \"4b887758-8834-4b33-8569-44799548f791\") " pod="openstack/nova-cell1-conductor-db-sync-vkmfp" Dec 16 09:01:19 crc kubenswrapper[4823]: I1216 09:01:19.482708 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b887758-8834-4b33-8569-44799548f791-scripts\") pod 
\"nova-cell1-conductor-db-sync-vkmfp\" (UID: \"4b887758-8834-4b33-8569-44799548f791\") " pod="openstack/nova-cell1-conductor-db-sync-vkmfp" Dec 16 09:01:19 crc kubenswrapper[4823]: I1216 09:01:19.482809 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2vqv\" (UniqueName: \"kubernetes.io/projected/4b887758-8834-4b33-8569-44799548f791-kube-api-access-f2vqv\") pod \"nova-cell1-conductor-db-sync-vkmfp\" (UID: \"4b887758-8834-4b33-8569-44799548f791\") " pod="openstack/nova-cell1-conductor-db-sync-vkmfp" Dec 16 09:01:19 crc kubenswrapper[4823]: I1216 09:01:19.482847 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b887758-8834-4b33-8569-44799548f791-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-vkmfp\" (UID: \"4b887758-8834-4b33-8569-44799548f791\") " pod="openstack/nova-cell1-conductor-db-sync-vkmfp" Dec 16 09:01:19 crc kubenswrapper[4823]: I1216 09:01:19.586371 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b887758-8834-4b33-8569-44799548f791-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-vkmfp\" (UID: \"4b887758-8834-4b33-8569-44799548f791\") " pod="openstack/nova-cell1-conductor-db-sync-vkmfp" Dec 16 09:01:19 crc kubenswrapper[4823]: I1216 09:01:19.586450 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b887758-8834-4b33-8569-44799548f791-config-data\") pod \"nova-cell1-conductor-db-sync-vkmfp\" (UID: \"4b887758-8834-4b33-8569-44799548f791\") " pod="openstack/nova-cell1-conductor-db-sync-vkmfp" Dec 16 09:01:19 crc kubenswrapper[4823]: I1216 09:01:19.586573 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/4b887758-8834-4b33-8569-44799548f791-scripts\") pod \"nova-cell1-conductor-db-sync-vkmfp\" (UID: \"4b887758-8834-4b33-8569-44799548f791\") " pod="openstack/nova-cell1-conductor-db-sync-vkmfp" Dec 16 09:01:19 crc kubenswrapper[4823]: I1216 09:01:19.586653 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f2vqv\" (UniqueName: \"kubernetes.io/projected/4b887758-8834-4b33-8569-44799548f791-kube-api-access-f2vqv\") pod \"nova-cell1-conductor-db-sync-vkmfp\" (UID: \"4b887758-8834-4b33-8569-44799548f791\") " pod="openstack/nova-cell1-conductor-db-sync-vkmfp" Dec 16 09:01:19 crc kubenswrapper[4823]: I1216 09:01:19.593945 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 16 09:01:19 crc kubenswrapper[4823]: I1216 09:01:19.599599 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b887758-8834-4b33-8569-44799548f791-scripts\") pod \"nova-cell1-conductor-db-sync-vkmfp\" (UID: \"4b887758-8834-4b33-8569-44799548f791\") " pod="openstack/nova-cell1-conductor-db-sync-vkmfp" Dec 16 09:01:19 crc kubenswrapper[4823]: I1216 09:01:19.608447 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b887758-8834-4b33-8569-44799548f791-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-vkmfp\" (UID: \"4b887758-8834-4b33-8569-44799548f791\") " pod="openstack/nova-cell1-conductor-db-sync-vkmfp" Dec 16 09:01:19 crc kubenswrapper[4823]: I1216 09:01:19.610390 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b887758-8834-4b33-8569-44799548f791-config-data\") pod \"nova-cell1-conductor-db-sync-vkmfp\" (UID: \"4b887758-8834-4b33-8569-44799548f791\") " pod="openstack/nova-cell1-conductor-db-sync-vkmfp" Dec 16 09:01:19 crc kubenswrapper[4823]: I1216 
09:01:19.614012 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2vqv\" (UniqueName: \"kubernetes.io/projected/4b887758-8834-4b33-8569-44799548f791-kube-api-access-f2vqv\") pod \"nova-cell1-conductor-db-sync-vkmfp\" (UID: \"4b887758-8834-4b33-8569-44799548f791\") " pod="openstack/nova-cell1-conductor-db-sync-vkmfp" Dec 16 09:01:19 crc kubenswrapper[4823]: I1216 09:01:19.631811 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 16 09:01:19 crc kubenswrapper[4823]: I1216 09:01:19.733784 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-vkmfp" Dec 16 09:01:19 crc kubenswrapper[4823]: I1216 09:01:19.757927 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-4d28k"] Dec 16 09:01:19 crc kubenswrapper[4823]: W1216 09:01:19.779853 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcc99ed3d_7cdb_4152_b23e_05096c7dd4cb.slice/crio-d94647bda8db134befb342c2d0b9c716ce38a1908cc3abdec99d70616c634739 WatchSource:0}: Error finding container d94647bda8db134befb342c2d0b9c716ce38a1908cc3abdec99d70616c634739: Status 404 returned error can't find the container with id d94647bda8db134befb342c2d0b9c716ce38a1908cc3abdec99d70616c634739 Dec 16 09:01:19 crc kubenswrapper[4823]: I1216 09:01:19.946173 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 16 09:01:20 crc kubenswrapper[4823]: I1216 09:01:20.039246 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-555fc99bf5-94c6p"] Dec 16 09:01:20 crc kubenswrapper[4823]: I1216 09:01:20.179907 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"587d1019-58f8-48c1-98f9-6f1bc724d7f1","Type":"ContainerStarted","Data":"86ac600122427487d4470a19585cc096b322271c4bbfbee34be3466ddbf69ddb"} Dec 16 09:01:20 crc kubenswrapper[4823]: I1216 09:01:20.181974 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"63e56d7b-61f5-4bd7-bac6-48f147bf1b2e","Type":"ContainerStarted","Data":"7d6d85777ccf63c7aa3179a7d444a8d776eb5409749bf97b72bb553740b1ef24"} Dec 16 09:01:20 crc kubenswrapper[4823]: I1216 09:01:20.182987 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-555fc99bf5-94c6p" event={"ID":"a56e00f5-9eaa-40ff-8e1c-4b2926e8d95d","Type":"ContainerStarted","Data":"f2a6b7c8059278d095efb06abff798aca06ec0cbdb3fc30fcf899baebf63f117"} Dec 16 09:01:20 crc kubenswrapper[4823]: I1216 09:01:20.184556 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"35a08b79-3be2-4b2b-b41e-bb87c1f2e0d5","Type":"ContainerStarted","Data":"5eafa09d6164dfaf5d1a6e135e7c5069c60ebe1743b6df880612ebf7d3f807e6"} Dec 16 09:01:20 crc kubenswrapper[4823]: I1216 09:01:20.186789 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-4d28k" event={"ID":"cc99ed3d-7cdb-4152-b23e-05096c7dd4cb","Type":"ContainerStarted","Data":"d94647bda8db134befb342c2d0b9c716ce38a1908cc3abdec99d70616c634739"} Dec 16 09:01:20 crc kubenswrapper[4823]: I1216 09:01:20.187770 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"61e3b1a8-261a-4e24-a70d-7c460c4505bf","Type":"ContainerStarted","Data":"197991325a20db40d51ef9e64d5ba1ac2d9eee74ed199d42fc020be136c92d35"} Dec 16 09:01:20 crc kubenswrapper[4823]: I1216 09:01:20.241833 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-vkmfp"] Dec 16 09:01:20 crc kubenswrapper[4823]: W1216 09:01:20.246432 4823 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4b887758_8834_4b33_8569_44799548f791.slice/crio-fffab2769bd2beed8fd51c9d6b202eccc037daf0558fc043f9065e6fc530b727 WatchSource:0}: Error finding container fffab2769bd2beed8fd51c9d6b202eccc037daf0558fc043f9065e6fc530b727: Status 404 returned error can't find the container with id fffab2769bd2beed8fd51c9d6b202eccc037daf0558fc043f9065e6fc530b727 Dec 16 09:01:21 crc kubenswrapper[4823]: I1216 09:01:21.197669 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-vkmfp" event={"ID":"4b887758-8834-4b33-8569-44799548f791","Type":"ContainerStarted","Data":"fffab2769bd2beed8fd51c9d6b202eccc037daf0558fc043f9065e6fc530b727"} Dec 16 09:01:23 crc kubenswrapper[4823]: I1216 09:01:23.028856 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 16 09:01:23 crc kubenswrapper[4823]: I1216 09:01:23.039735 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 16 09:01:23 crc kubenswrapper[4823]: I1216 09:01:23.216738 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-4d28k" event={"ID":"cc99ed3d-7cdb-4152-b23e-05096c7dd4cb","Type":"ContainerStarted","Data":"4abb0fa0b08dc3eb1d6b6ce1c34e2a3bc2f4171afda40aa20a25497cce168b3b"} Dec 16 09:01:23 crc kubenswrapper[4823]: I1216 09:01:23.220267 4823 generic.go:334] "Generic (PLEG): container finished" podID="a56e00f5-9eaa-40ff-8e1c-4b2926e8d95d" containerID="73400d7d563bca3c61b25393df436b90fe562fbe218d8961cdd98801e7ce799a" exitCode=0 Dec 16 09:01:23 crc kubenswrapper[4823]: I1216 09:01:23.220341 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-555fc99bf5-94c6p" event={"ID":"a56e00f5-9eaa-40ff-8e1c-4b2926e8d95d","Type":"ContainerDied","Data":"73400d7d563bca3c61b25393df436b90fe562fbe218d8961cdd98801e7ce799a"} Dec 16 09:01:23 crc kubenswrapper[4823]: I1216 09:01:23.225905 4823 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-vkmfp" event={"ID":"4b887758-8834-4b33-8569-44799548f791","Type":"ContainerStarted","Data":"86be94bdbff4c07beea3917efb385bd5395bed9cd2e2647743ab02d6da764784"} Dec 16 09:01:23 crc kubenswrapper[4823]: I1216 09:01:23.233383 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-4d28k" podStartSLOduration=6.233363568 podStartE2EDuration="6.233363568s" podCreationTimestamp="2025-12-16 09:01:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 09:01:23.232912433 +0000 UTC m=+7561.721478566" watchObservedRunningTime="2025-12-16 09:01:23.233363568 +0000 UTC m=+7561.721929691" Dec 16 09:01:23 crc kubenswrapper[4823]: I1216 09:01:23.257233 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-vkmfp" podStartSLOduration=4.257210764 podStartE2EDuration="4.257210764s" podCreationTimestamp="2025-12-16 09:01:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 09:01:23.250167824 +0000 UTC m=+7561.738733957" watchObservedRunningTime="2025-12-16 09:01:23.257210764 +0000 UTC m=+7561.745776887" Dec 16 09:01:24 crc kubenswrapper[4823]: I1216 09:01:24.239282 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k9mtc" event={"ID":"1989b143-cd57-41c6-9174-e8067cbc491f","Type":"ContainerStarted","Data":"eb437857393757b20d324f42c526c2b0733370c1b923df4e4f83ff28567a0a32"} Dec 16 09:01:25 crc kubenswrapper[4823]: I1216 09:01:25.301072 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-555fc99bf5-94c6p" 
event={"ID":"a56e00f5-9eaa-40ff-8e1c-4b2926e8d95d","Type":"ContainerStarted","Data":"d6a67520b906cd092b8ff3036115df66ef64c7e9b56f9b1dbfdb436a3abf8c03"} Dec 16 09:01:25 crc kubenswrapper[4823]: I1216 09:01:25.301444 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-555fc99bf5-94c6p" Dec 16 09:01:25 crc kubenswrapper[4823]: I1216 09:01:25.333244 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-555fc99bf5-94c6p" podStartSLOduration=7.333218748 podStartE2EDuration="7.333218748s" podCreationTimestamp="2025-12-16 09:01:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 09:01:25.327517299 +0000 UTC m=+7563.816083422" watchObservedRunningTime="2025-12-16 09:01:25.333218748 +0000 UTC m=+7563.821784871" Dec 16 09:01:25 crc kubenswrapper[4823]: I1216 09:01:25.351305 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="35a08b79-3be2-4b2b-b41e-bb87c1f2e0d5" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://dbe718cd84e6bce36c7fc414bebbd801211bd5d70b48892ad3e8b383788cc6e2" gracePeriod=30 Dec 16 09:01:25 crc kubenswrapper[4823]: I1216 09:01:25.351971 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"35a08b79-3be2-4b2b-b41e-bb87c1f2e0d5","Type":"ContainerStarted","Data":"dbe718cd84e6bce36c7fc414bebbd801211bd5d70b48892ad3e8b383788cc6e2"} Dec 16 09:01:25 crc kubenswrapper[4823]: I1216 09:01:25.403960 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-k9mtc" podStartSLOduration=4.075936208 podStartE2EDuration="10.40393356s" podCreationTimestamp="2025-12-16 09:01:15 +0000 UTC" firstStartedPulling="2025-12-16 09:01:17.115899904 +0000 UTC m=+7555.604466037" 
lastFinishedPulling="2025-12-16 09:01:23.443897266 +0000 UTC m=+7561.932463389" observedRunningTime="2025-12-16 09:01:25.401321499 +0000 UTC m=+7563.889887622" watchObservedRunningTime="2025-12-16 09:01:25.40393356 +0000 UTC m=+7563.892499683" Dec 16 09:01:25 crc kubenswrapper[4823]: I1216 09:01:25.660821 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-k9mtc" Dec 16 09:01:25 crc kubenswrapper[4823]: I1216 09:01:25.661488 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-k9mtc" Dec 16 09:01:26 crc kubenswrapper[4823]: I1216 09:01:26.361769 4823 generic.go:334] "Generic (PLEG): container finished" podID="4b887758-8834-4b33-8569-44799548f791" containerID="86be94bdbff4c07beea3917efb385bd5395bed9cd2e2647743ab02d6da764784" exitCode=0 Dec 16 09:01:26 crc kubenswrapper[4823]: I1216 09:01:26.362610 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-vkmfp" event={"ID":"4b887758-8834-4b33-8569-44799548f791","Type":"ContainerDied","Data":"86be94bdbff4c07beea3917efb385bd5395bed9cd2e2647743ab02d6da764784"} Dec 16 09:01:26 crc kubenswrapper[4823]: I1216 09:01:26.383278 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=4.281893248 podStartE2EDuration="8.38325782s" podCreationTimestamp="2025-12-16 09:01:18 +0000 UTC" firstStartedPulling="2025-12-16 09:01:19.421539127 +0000 UTC m=+7557.910105250" lastFinishedPulling="2025-12-16 09:01:23.522903699 +0000 UTC m=+7562.011469822" observedRunningTime="2025-12-16 09:01:25.46047035 +0000 UTC m=+7563.949036473" watchObservedRunningTime="2025-12-16 09:01:26.38325782 +0000 UTC m=+7564.871823943" Dec 16 09:01:26 crc kubenswrapper[4823]: I1216 09:01:26.707660 4823 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-k9mtc" 
podUID="1989b143-cd57-41c6-9174-e8067cbc491f" containerName="registry-server" probeResult="failure" output=< Dec 16 09:01:26 crc kubenswrapper[4823]: timeout: failed to connect service ":50051" within 1s Dec 16 09:01:26 crc kubenswrapper[4823]: > Dec 16 09:01:28 crc kubenswrapper[4823]: I1216 09:01:28.133910 4823 patch_prober.go:28] interesting pod/machine-config-daemon-fv56f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 09:01:28 crc kubenswrapper[4823]: I1216 09:01:28.134521 4823 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 09:01:28 crc kubenswrapper[4823]: I1216 09:01:28.134582 4823 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" Dec 16 09:01:28 crc kubenswrapper[4823]: I1216 09:01:28.135407 4823 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c4d9ea4299c018a902750aabeef9dea06ce13b6e55f03c5913f1f492b4b19163"} pod="openshift-machine-config-operator/machine-config-daemon-fv56f" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 16 09:01:28 crc kubenswrapper[4823]: I1216 09:01:28.135488 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" containerName="machine-config-daemon" 
containerID="cri-o://c4d9ea4299c018a902750aabeef9dea06ce13b6e55f03c5913f1f492b4b19163" gracePeriod=600 Dec 16 09:01:28 crc kubenswrapper[4823]: I1216 09:01:28.147592 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-vkmfp" Dec 16 09:01:28 crc kubenswrapper[4823]: I1216 09:01:28.186328 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b887758-8834-4b33-8569-44799548f791-config-data\") pod \"4b887758-8834-4b33-8569-44799548f791\" (UID: \"4b887758-8834-4b33-8569-44799548f791\") " Dec 16 09:01:28 crc kubenswrapper[4823]: I1216 09:01:28.186462 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b887758-8834-4b33-8569-44799548f791-scripts\") pod \"4b887758-8834-4b33-8569-44799548f791\" (UID: \"4b887758-8834-4b33-8569-44799548f791\") " Dec 16 09:01:28 crc kubenswrapper[4823]: I1216 09:01:28.186587 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f2vqv\" (UniqueName: \"kubernetes.io/projected/4b887758-8834-4b33-8569-44799548f791-kube-api-access-f2vqv\") pod \"4b887758-8834-4b33-8569-44799548f791\" (UID: \"4b887758-8834-4b33-8569-44799548f791\") " Dec 16 09:01:28 crc kubenswrapper[4823]: I1216 09:01:28.186644 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b887758-8834-4b33-8569-44799548f791-combined-ca-bundle\") pod \"4b887758-8834-4b33-8569-44799548f791\" (UID: \"4b887758-8834-4b33-8569-44799548f791\") " Dec 16 09:01:28 crc kubenswrapper[4823]: I1216 09:01:28.195119 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b887758-8834-4b33-8569-44799548f791-kube-api-access-f2vqv" (OuterVolumeSpecName: "kube-api-access-f2vqv") pod 
"4b887758-8834-4b33-8569-44799548f791" (UID: "4b887758-8834-4b33-8569-44799548f791"). InnerVolumeSpecName "kube-api-access-f2vqv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:01:28 crc kubenswrapper[4823]: I1216 09:01:28.245786 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b887758-8834-4b33-8569-44799548f791-scripts" (OuterVolumeSpecName: "scripts") pod "4b887758-8834-4b33-8569-44799548f791" (UID: "4b887758-8834-4b33-8569-44799548f791"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:01:28 crc kubenswrapper[4823]: I1216 09:01:28.289175 4823 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b887758-8834-4b33-8569-44799548f791-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 09:01:28 crc kubenswrapper[4823]: I1216 09:01:28.289212 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f2vqv\" (UniqueName: \"kubernetes.io/projected/4b887758-8834-4b33-8569-44799548f791-kube-api-access-f2vqv\") on node \"crc\" DevicePath \"\"" Dec 16 09:01:28 crc kubenswrapper[4823]: I1216 09:01:28.315691 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b887758-8834-4b33-8569-44799548f791-config-data" (OuterVolumeSpecName: "config-data") pod "4b887758-8834-4b33-8569-44799548f791" (UID: "4b887758-8834-4b33-8569-44799548f791"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:01:28 crc kubenswrapper[4823]: I1216 09:01:28.323537 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b887758-8834-4b33-8569-44799548f791-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4b887758-8834-4b33-8569-44799548f791" (UID: "4b887758-8834-4b33-8569-44799548f791"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 09:01:28 crc kubenswrapper[4823]: I1216 09:01:28.392571 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b887758-8834-4b33-8569-44799548f791-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 16 09:01:28 crc kubenswrapper[4823]: I1216 09:01:28.392610 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b887758-8834-4b33-8569-44799548f791-config-data\") on node \"crc\" DevicePath \"\""
Dec 16 09:01:28 crc kubenswrapper[4823]: I1216 09:01:28.395555 4823 generic.go:334] "Generic (PLEG): container finished" podID="25dec47c-3043-486c-b371-2be103c214e3" containerID="c4d9ea4299c018a902750aabeef9dea06ce13b6e55f03c5913f1f492b4b19163" exitCode=0
Dec 16 09:01:28 crc kubenswrapper[4823]: I1216 09:01:28.395642 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" event={"ID":"25dec47c-3043-486c-b371-2be103c214e3","Type":"ContainerDied","Data":"c4d9ea4299c018a902750aabeef9dea06ce13b6e55f03c5913f1f492b4b19163"}
Dec 16 09:01:28 crc kubenswrapper[4823]: I1216 09:01:28.395711 4823 scope.go:117] "RemoveContainer" containerID="9ce3e6cc66a3ba1f5a9f07614bbf78a449581b45707f8e1e5d9794f67e5c0428"
Dec 16 09:01:28 crc kubenswrapper[4823]: I1216 09:01:28.398013 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-vkmfp"
Dec 16 09:01:28 crc kubenswrapper[4823]: I1216 09:01:28.401074 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-vkmfp" event={"ID":"4b887758-8834-4b33-8569-44799548f791","Type":"ContainerDied","Data":"fffab2769bd2beed8fd51c9d6b202eccc037daf0558fc043f9065e6fc530b727"}
Dec 16 09:01:28 crc kubenswrapper[4823]: I1216 09:01:28.401155 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fffab2769bd2beed8fd51c9d6b202eccc037daf0558fc043f9065e6fc530b727"
Dec 16 09:01:28 crc kubenswrapper[4823]: I1216 09:01:28.404760 4823 generic.go:334] "Generic (PLEG): container finished" podID="cc99ed3d-7cdb-4152-b23e-05096c7dd4cb" containerID="4abb0fa0b08dc3eb1d6b6ce1c34e2a3bc2f4171afda40aa20a25497cce168b3b" exitCode=0
Dec 16 09:01:28 crc kubenswrapper[4823]: I1216 09:01:28.404805 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-4d28k" event={"ID":"cc99ed3d-7cdb-4152-b23e-05096c7dd4cb","Type":"ContainerDied","Data":"4abb0fa0b08dc3eb1d6b6ce1c34e2a3bc2f4171afda40aa20a25497cce168b3b"}
Dec 16 09:01:28 crc kubenswrapper[4823]: I1216 09:01:28.455997 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"]
Dec 16 09:01:28 crc kubenswrapper[4823]: E1216 09:01:28.456423 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b887758-8834-4b33-8569-44799548f791" containerName="nova-cell1-conductor-db-sync"
Dec 16 09:01:28 crc kubenswrapper[4823]: I1216 09:01:28.456440 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b887758-8834-4b33-8569-44799548f791" containerName="nova-cell1-conductor-db-sync"
Dec 16 09:01:28 crc kubenswrapper[4823]: I1216 09:01:28.456615 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b887758-8834-4b33-8569-44799548f791" containerName="nova-cell1-conductor-db-sync"
Dec 16 09:01:28 crc kubenswrapper[4823]: I1216 09:01:28.457255 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Dec 16 09:01:28 crc kubenswrapper[4823]: I1216 09:01:28.460825 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data"
Dec 16 09:01:28 crc kubenswrapper[4823]: I1216 09:01:28.466792 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Dec 16 09:01:28 crc kubenswrapper[4823]: I1216 09:01:28.493953 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fd92bc3-eaf0-4217-bcac-dd8f41db9edf-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"9fd92bc3-eaf0-4217-bcac-dd8f41db9edf\") " pod="openstack/nova-cell1-conductor-0"
Dec 16 09:01:28 crc kubenswrapper[4823]: I1216 09:01:28.494106 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4n9wz\" (UniqueName: \"kubernetes.io/projected/9fd92bc3-eaf0-4217-bcac-dd8f41db9edf-kube-api-access-4n9wz\") pod \"nova-cell1-conductor-0\" (UID: \"9fd92bc3-eaf0-4217-bcac-dd8f41db9edf\") " pod="openstack/nova-cell1-conductor-0"
Dec 16 09:01:28 crc kubenswrapper[4823]: I1216 09:01:28.494279 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9fd92bc3-eaf0-4217-bcac-dd8f41db9edf-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"9fd92bc3-eaf0-4217-bcac-dd8f41db9edf\") " pod="openstack/nova-cell1-conductor-0"
Dec 16 09:01:28 crc kubenswrapper[4823]: I1216 09:01:28.550158 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0"
Dec 16 09:01:28 crc kubenswrapper[4823]: I1216 09:01:28.596156 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fd92bc3-eaf0-4217-bcac-dd8f41db9edf-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"9fd92bc3-eaf0-4217-bcac-dd8f41db9edf\") " pod="openstack/nova-cell1-conductor-0"
Dec 16 09:01:28 crc kubenswrapper[4823]: I1216 09:01:28.596267 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4n9wz\" (UniqueName: \"kubernetes.io/projected/9fd92bc3-eaf0-4217-bcac-dd8f41db9edf-kube-api-access-4n9wz\") pod \"nova-cell1-conductor-0\" (UID: \"9fd92bc3-eaf0-4217-bcac-dd8f41db9edf\") " pod="openstack/nova-cell1-conductor-0"
Dec 16 09:01:28 crc kubenswrapper[4823]: I1216 09:01:28.596409 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9fd92bc3-eaf0-4217-bcac-dd8f41db9edf-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"9fd92bc3-eaf0-4217-bcac-dd8f41db9edf\") " pod="openstack/nova-cell1-conductor-0"
Dec 16 09:01:28 crc kubenswrapper[4823]: I1216 09:01:28.600234 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fd92bc3-eaf0-4217-bcac-dd8f41db9edf-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"9fd92bc3-eaf0-4217-bcac-dd8f41db9edf\") " pod="openstack/nova-cell1-conductor-0"
Dec 16 09:01:28 crc kubenswrapper[4823]: I1216 09:01:28.600644 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9fd92bc3-eaf0-4217-bcac-dd8f41db9edf-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"9fd92bc3-eaf0-4217-bcac-dd8f41db9edf\") " pod="openstack/nova-cell1-conductor-0"
Dec 16 09:01:28 crc kubenswrapper[4823]: I1216 09:01:28.612825 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4n9wz\" (UniqueName: \"kubernetes.io/projected/9fd92bc3-eaf0-4217-bcac-dd8f41db9edf-kube-api-access-4n9wz\") pod \"nova-cell1-conductor-0\" (UID: \"9fd92bc3-eaf0-4217-bcac-dd8f41db9edf\") " pod="openstack/nova-cell1-conductor-0"
Dec 16 09:01:28 crc kubenswrapper[4823]: I1216 09:01:28.786167 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Dec 16 09:01:29 crc kubenswrapper[4823]: I1216 09:01:29.180187 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-555fc99bf5-94c6p"
Dec 16 09:01:29 crc kubenswrapper[4823]: I1216 09:01:29.255672 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-585657f749-s2nbz"]
Dec 16 09:01:29 crc kubenswrapper[4823]: I1216 09:01:29.256002 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-585657f749-s2nbz" podUID="1feea07f-4c8b-4c90-8f3f-63e810a7a525" containerName="dnsmasq-dns" containerID="cri-o://41af04c1b1d3d7fba526b58071e84a55f4d77a41c033341e492f0f1febfa391f" gracePeriod=10
Dec 16 09:01:29 crc kubenswrapper[4823]: I1216 09:01:29.349571 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Dec 16 09:01:29 crc kubenswrapper[4823]: W1216 09:01:29.352826 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9fd92bc3_eaf0_4217_bcac_dd8f41db9edf.slice/crio-8a269b71857456478d8798472fe05b6a0f92634ee338462d971404821b95ca84 WatchSource:0}: Error finding container 8a269b71857456478d8798472fe05b6a0f92634ee338462d971404821b95ca84: Status 404 returned error can't find the container with id 8a269b71857456478d8798472fe05b6a0f92634ee338462d971404821b95ca84
Dec 16 09:01:29 crc kubenswrapper[4823]: I1216 09:01:29.416590 4823 generic.go:334] "Generic (PLEG): container finished" podID="1feea07f-4c8b-4c90-8f3f-63e810a7a525" containerID="41af04c1b1d3d7fba526b58071e84a55f4d77a41c033341e492f0f1febfa391f" exitCode=0
Dec 16 09:01:29 crc kubenswrapper[4823]: I1216 09:01:29.416662 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-585657f749-s2nbz" event={"ID":"1feea07f-4c8b-4c90-8f3f-63e810a7a525","Type":"ContainerDied","Data":"41af04c1b1d3d7fba526b58071e84a55f4d77a41c033341e492f0f1febfa391f"}
Dec 16 09:01:29 crc kubenswrapper[4823]: I1216 09:01:29.433054 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"61e3b1a8-261a-4e24-a70d-7c460c4505bf","Type":"ContainerStarted","Data":"1bfc22d7e8a1015ad1b5dc7e7de2a51088055779abb44d0d583af2d6aaf0a09f"}
Dec 16 09:01:29 crc kubenswrapper[4823]: I1216 09:01:29.433105 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"61e3b1a8-261a-4e24-a70d-7c460c4505bf","Type":"ContainerStarted","Data":"a9e35598aef459f13a13ee32eda7bda626f30e551305863abc31b00b4f9577a0"}
Dec 16 09:01:29 crc kubenswrapper[4823]: I1216 09:01:29.433240 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="61e3b1a8-261a-4e24-a70d-7c460c4505bf" containerName="nova-metadata-log" containerID="cri-o://a9e35598aef459f13a13ee32eda7bda626f30e551305863abc31b00b4f9577a0" gracePeriod=30
Dec 16 09:01:29 crc kubenswrapper[4823]: I1216 09:01:29.433615 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="61e3b1a8-261a-4e24-a70d-7c460c4505bf" containerName="nova-metadata-metadata" containerID="cri-o://1bfc22d7e8a1015ad1b5dc7e7de2a51088055779abb44d0d583af2d6aaf0a09f" gracePeriod=30
Dec 16 09:01:29 crc kubenswrapper[4823]: I1216 09:01:29.442410 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"587d1019-58f8-48c1-98f9-6f1bc724d7f1","Type":"ContainerStarted","Data":"8c4cbe96de67497960588a59d2f5eea4de88e347ad2807a5c9da35faf47c518f"}
Dec 16 09:01:29 crc kubenswrapper[4823]: I1216 09:01:29.442459 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"587d1019-58f8-48c1-98f9-6f1bc724d7f1","Type":"ContainerStarted","Data":"adb433c72de1ccca5e8236745430a59be1b1c3ff5fccc23861d70b9f47214807"}
Dec 16 09:01:29 crc kubenswrapper[4823]: I1216 09:01:29.444679 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" event={"ID":"25dec47c-3043-486c-b371-2be103c214e3","Type":"ContainerStarted","Data":"14e51af7fb5c2d7b7fdc9e1989841225a65614d883db6f8d5aea8aeb819bd04d"}
Dec 16 09:01:29 crc kubenswrapper[4823]: I1216 09:01:29.455391 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"9fd92bc3-eaf0-4217-bcac-dd8f41db9edf","Type":"ContainerStarted","Data":"8a269b71857456478d8798472fe05b6a0f92634ee338462d971404821b95ca84"}
Dec 16 09:01:29 crc kubenswrapper[4823]: I1216 09:01:29.463086 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"63e56d7b-61f5-4bd7-bac6-48f147bf1b2e","Type":"ContainerStarted","Data":"4758f43e184672ce4a603509d692a538d609cb8b49c36f6effcc5226089bcbd2"}
Dec 16 09:01:29 crc kubenswrapper[4823]: I1216 09:01:29.472928 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.193708982 podStartE2EDuration="11.472901189s" podCreationTimestamp="2025-12-16 09:01:18 +0000 UTC" firstStartedPulling="2025-12-16 09:01:19.627120381 +0000 UTC m=+7558.115686504" lastFinishedPulling="2025-12-16 09:01:27.906312588 +0000 UTC m=+7566.394878711" observedRunningTime="2025-12-16 09:01:29.457433394 +0000 UTC m=+7567.945999527" watchObservedRunningTime="2025-12-16 09:01:29.472901189 +0000 UTC m=+7567.961467322"
Dec 16 09:01:29 crc kubenswrapper[4823]: I1216 09:01:29.516098 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.57193924 podStartE2EDuration="11.51607734s" podCreationTimestamp="2025-12-16 09:01:18 +0000 UTC" firstStartedPulling="2025-12-16 09:01:19.960386282 +0000 UTC m=+7558.448952405" lastFinishedPulling="2025-12-16 09:01:27.904524372 +0000 UTC m=+7566.393090505" observedRunningTime="2025-12-16 09:01:29.494299108 +0000 UTC m=+7567.982865231" watchObservedRunningTime="2025-12-16 09:01:29.51607734 +0000 UTC m=+7568.004643463"
Dec 16 09:01:29 crc kubenswrapper[4823]: I1216 09:01:29.519553 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.256236629 podStartE2EDuration="11.519535728s" podCreationTimestamp="2025-12-16 09:01:18 +0000 UTC" firstStartedPulling="2025-12-16 09:01:19.641056838 +0000 UTC m=+7558.129622961" lastFinishedPulling="2025-12-16 09:01:27.904355927 +0000 UTC m=+7566.392922060" observedRunningTime="2025-12-16 09:01:29.514107128 +0000 UTC m=+7568.002673261" watchObservedRunningTime="2025-12-16 09:01:29.519535728 +0000 UTC m=+7568.008101851"
Dec 16 09:01:30 crc kubenswrapper[4823]: I1216 09:01:30.073929 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-585657f749-s2nbz"
Dec 16 09:01:30 crc kubenswrapper[4823]: I1216 09:01:30.234721 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-4d28k"
Dec 16 09:01:30 crc kubenswrapper[4823]: I1216 09:01:30.255376 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1feea07f-4c8b-4c90-8f3f-63e810a7a525-ovsdbserver-sb\") pod \"1feea07f-4c8b-4c90-8f3f-63e810a7a525\" (UID: \"1feea07f-4c8b-4c90-8f3f-63e810a7a525\") "
Dec 16 09:01:30 crc kubenswrapper[4823]: I1216 09:01:30.255998 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mkzjb\" (UniqueName: \"kubernetes.io/projected/1feea07f-4c8b-4c90-8f3f-63e810a7a525-kube-api-access-mkzjb\") pod \"1feea07f-4c8b-4c90-8f3f-63e810a7a525\" (UID: \"1feea07f-4c8b-4c90-8f3f-63e810a7a525\") "
Dec 16 09:01:30 crc kubenswrapper[4823]: I1216 09:01:30.256872 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2pr87\" (UniqueName: \"kubernetes.io/projected/cc99ed3d-7cdb-4152-b23e-05096c7dd4cb-kube-api-access-2pr87\") pod \"cc99ed3d-7cdb-4152-b23e-05096c7dd4cb\" (UID: \"cc99ed3d-7cdb-4152-b23e-05096c7dd4cb\") "
Dec 16 09:01:30 crc kubenswrapper[4823]: I1216 09:01:30.256936 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc99ed3d-7cdb-4152-b23e-05096c7dd4cb-scripts\") pod \"cc99ed3d-7cdb-4152-b23e-05096c7dd4cb\" (UID: \"cc99ed3d-7cdb-4152-b23e-05096c7dd4cb\") "
Dec 16 09:01:30 crc kubenswrapper[4823]: I1216 09:01:30.256973 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1feea07f-4c8b-4c90-8f3f-63e810a7a525-dns-svc\") pod \"1feea07f-4c8b-4c90-8f3f-63e810a7a525\" (UID: \"1feea07f-4c8b-4c90-8f3f-63e810a7a525\") "
Dec 16 09:01:30 crc kubenswrapper[4823]: I1216 09:01:30.256997 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc99ed3d-7cdb-4152-b23e-05096c7dd4cb-combined-ca-bundle\") pod \"cc99ed3d-7cdb-4152-b23e-05096c7dd4cb\" (UID: \"cc99ed3d-7cdb-4152-b23e-05096c7dd4cb\") "
Dec 16 09:01:30 crc kubenswrapper[4823]: I1216 09:01:30.257037 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc99ed3d-7cdb-4152-b23e-05096c7dd4cb-config-data\") pod \"cc99ed3d-7cdb-4152-b23e-05096c7dd4cb\" (UID: \"cc99ed3d-7cdb-4152-b23e-05096c7dd4cb\") "
Dec 16 09:01:30 crc kubenswrapper[4823]: I1216 09:01:30.257071 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1feea07f-4c8b-4c90-8f3f-63e810a7a525-ovsdbserver-nb\") pod \"1feea07f-4c8b-4c90-8f3f-63e810a7a525\" (UID: \"1feea07f-4c8b-4c90-8f3f-63e810a7a525\") "
Dec 16 09:01:30 crc kubenswrapper[4823]: I1216 09:01:30.257113 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1feea07f-4c8b-4c90-8f3f-63e810a7a525-config\") pod \"1feea07f-4c8b-4c90-8f3f-63e810a7a525\" (UID: \"1feea07f-4c8b-4c90-8f3f-63e810a7a525\") "
Dec 16 09:01:30 crc kubenswrapper[4823]: I1216 09:01:30.262597 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc99ed3d-7cdb-4152-b23e-05096c7dd4cb-kube-api-access-2pr87" (OuterVolumeSpecName: "kube-api-access-2pr87") pod "cc99ed3d-7cdb-4152-b23e-05096c7dd4cb" (UID: "cc99ed3d-7cdb-4152-b23e-05096c7dd4cb"). InnerVolumeSpecName "kube-api-access-2pr87". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 09:01:30 crc kubenswrapper[4823]: I1216 09:01:30.262772 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc99ed3d-7cdb-4152-b23e-05096c7dd4cb-scripts" (OuterVolumeSpecName: "scripts") pod "cc99ed3d-7cdb-4152-b23e-05096c7dd4cb" (UID: "cc99ed3d-7cdb-4152-b23e-05096c7dd4cb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 09:01:30 crc kubenswrapper[4823]: I1216 09:01:30.266415 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1feea07f-4c8b-4c90-8f3f-63e810a7a525-kube-api-access-mkzjb" (OuterVolumeSpecName: "kube-api-access-mkzjb") pod "1feea07f-4c8b-4c90-8f3f-63e810a7a525" (UID: "1feea07f-4c8b-4c90-8f3f-63e810a7a525"). InnerVolumeSpecName "kube-api-access-mkzjb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 09:01:30 crc kubenswrapper[4823]: I1216 09:01:30.312034 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc99ed3d-7cdb-4152-b23e-05096c7dd4cb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cc99ed3d-7cdb-4152-b23e-05096c7dd4cb" (UID: "cc99ed3d-7cdb-4152-b23e-05096c7dd4cb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 09:01:30 crc kubenswrapper[4823]: I1216 09:01:30.316205 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc99ed3d-7cdb-4152-b23e-05096c7dd4cb-config-data" (OuterVolumeSpecName: "config-data") pod "cc99ed3d-7cdb-4152-b23e-05096c7dd4cb" (UID: "cc99ed3d-7cdb-4152-b23e-05096c7dd4cb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 09:01:30 crc kubenswrapper[4823]: I1216 09:01:30.336862 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Dec 16 09:01:30 crc kubenswrapper[4823]: I1216 09:01:30.365251 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/61e3b1a8-261a-4e24-a70d-7c460c4505bf-logs\") pod \"61e3b1a8-261a-4e24-a70d-7c460c4505bf\" (UID: \"61e3b1a8-261a-4e24-a70d-7c460c4505bf\") "
Dec 16 09:01:30 crc kubenswrapper[4823]: I1216 09:01:30.366682 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1feea07f-4c8b-4c90-8f3f-63e810a7a525-config" (OuterVolumeSpecName: "config") pod "1feea07f-4c8b-4c90-8f3f-63e810a7a525" (UID: "1feea07f-4c8b-4c90-8f3f-63e810a7a525"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 16 09:01:30 crc kubenswrapper[4823]: I1216 09:01:30.366816 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/61e3b1a8-261a-4e24-a70d-7c460c4505bf-logs" (OuterVolumeSpecName: "logs") pod "61e3b1a8-261a-4e24-a70d-7c460c4505bf" (UID: "61e3b1a8-261a-4e24-a70d-7c460c4505bf"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 16 09:01:30 crc kubenswrapper[4823]: I1216 09:01:30.380520 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61e3b1a8-261a-4e24-a70d-7c460c4505bf-combined-ca-bundle\") pod \"61e3b1a8-261a-4e24-a70d-7c460c4505bf\" (UID: \"61e3b1a8-261a-4e24-a70d-7c460c4505bf\") "
Dec 16 09:01:30 crc kubenswrapper[4823]: I1216 09:01:30.380565 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1feea07f-4c8b-4c90-8f3f-63e810a7a525-config\") pod \"1feea07f-4c8b-4c90-8f3f-63e810a7a525\" (UID: \"1feea07f-4c8b-4c90-8f3f-63e810a7a525\") "
Dec 16 09:01:30 crc kubenswrapper[4823]: I1216 09:01:30.380610 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-768rv\" (UniqueName: \"kubernetes.io/projected/61e3b1a8-261a-4e24-a70d-7c460c4505bf-kube-api-access-768rv\") pod \"61e3b1a8-261a-4e24-a70d-7c460c4505bf\" (UID: \"61e3b1a8-261a-4e24-a70d-7c460c4505bf\") "
Dec 16 09:01:30 crc kubenswrapper[4823]: I1216 09:01:30.380649 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61e3b1a8-261a-4e24-a70d-7c460c4505bf-config-data\") pod \"61e3b1a8-261a-4e24-a70d-7c460c4505bf\" (UID: \"61e3b1a8-261a-4e24-a70d-7c460c4505bf\") "
Dec 16 09:01:30 crc kubenswrapper[4823]: W1216 09:01:30.381006 4823 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/1feea07f-4c8b-4c90-8f3f-63e810a7a525/volumes/kubernetes.io~configmap/config
Dec 16 09:01:30 crc kubenswrapper[4823]: I1216 09:01:30.381042 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1feea07f-4c8b-4c90-8f3f-63e810a7a525-config" (OuterVolumeSpecName: "config") pod "1feea07f-4c8b-4c90-8f3f-63e810a7a525" (UID: "1feea07f-4c8b-4c90-8f3f-63e810a7a525"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 16 09:01:30 crc kubenswrapper[4823]: I1216 09:01:30.381589 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mkzjb\" (UniqueName: \"kubernetes.io/projected/1feea07f-4c8b-4c90-8f3f-63e810a7a525-kube-api-access-mkzjb\") on node \"crc\" DevicePath \"\""
Dec 16 09:01:30 crc kubenswrapper[4823]: I1216 09:01:30.381611 4823 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/61e3b1a8-261a-4e24-a70d-7c460c4505bf-logs\") on node \"crc\" DevicePath \"\""
Dec 16 09:01:30 crc kubenswrapper[4823]: I1216 09:01:30.381639 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2pr87\" (UniqueName: \"kubernetes.io/projected/cc99ed3d-7cdb-4152-b23e-05096c7dd4cb-kube-api-access-2pr87\") on node \"crc\" DevicePath \"\""
Dec 16 09:01:30 crc kubenswrapper[4823]: I1216 09:01:30.381650 4823 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc99ed3d-7cdb-4152-b23e-05096c7dd4cb-scripts\") on node \"crc\" DevicePath \"\""
Dec 16 09:01:30 crc kubenswrapper[4823]: I1216 09:01:30.381690 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc99ed3d-7cdb-4152-b23e-05096c7dd4cb-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 16 09:01:30 crc kubenswrapper[4823]: I1216 09:01:30.381700 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc99ed3d-7cdb-4152-b23e-05096c7dd4cb-config-data\") on node \"crc\" DevicePath \"\""
Dec 16 09:01:30 crc kubenswrapper[4823]: I1216 09:01:30.381711 4823 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1feea07f-4c8b-4c90-8f3f-63e810a7a525-config\") on node \"crc\" DevicePath \"\""
Dec 16 09:01:30 crc kubenswrapper[4823]: I1216 09:01:30.388590 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1feea07f-4c8b-4c90-8f3f-63e810a7a525-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1feea07f-4c8b-4c90-8f3f-63e810a7a525" (UID: "1feea07f-4c8b-4c90-8f3f-63e810a7a525"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 16 09:01:30 crc kubenswrapper[4823]: I1216 09:01:30.393339 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61e3b1a8-261a-4e24-a70d-7c460c4505bf-kube-api-access-768rv" (OuterVolumeSpecName: "kube-api-access-768rv") pod "61e3b1a8-261a-4e24-a70d-7c460c4505bf" (UID: "61e3b1a8-261a-4e24-a70d-7c460c4505bf"). InnerVolumeSpecName "kube-api-access-768rv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 09:01:30 crc kubenswrapper[4823]: I1216 09:01:30.393577 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1feea07f-4c8b-4c90-8f3f-63e810a7a525-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1feea07f-4c8b-4c90-8f3f-63e810a7a525" (UID: "1feea07f-4c8b-4c90-8f3f-63e810a7a525"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 16 09:01:30 crc kubenswrapper[4823]: I1216 09:01:30.416431 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1feea07f-4c8b-4c90-8f3f-63e810a7a525-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1feea07f-4c8b-4c90-8f3f-63e810a7a525" (UID: "1feea07f-4c8b-4c90-8f3f-63e810a7a525"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 16 09:01:30 crc kubenswrapper[4823]: I1216 09:01:30.432924 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61e3b1a8-261a-4e24-a70d-7c460c4505bf-config-data" (OuterVolumeSpecName: "config-data") pod "61e3b1a8-261a-4e24-a70d-7c460c4505bf" (UID: "61e3b1a8-261a-4e24-a70d-7c460c4505bf"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 09:01:30 crc kubenswrapper[4823]: I1216 09:01:30.444165 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61e3b1a8-261a-4e24-a70d-7c460c4505bf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "61e3b1a8-261a-4e24-a70d-7c460c4505bf" (UID: "61e3b1a8-261a-4e24-a70d-7c460c4505bf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 09:01:30 crc kubenswrapper[4823]: I1216 09:01:30.472975 4823 generic.go:334] "Generic (PLEG): container finished" podID="61e3b1a8-261a-4e24-a70d-7c460c4505bf" containerID="1bfc22d7e8a1015ad1b5dc7e7de2a51088055779abb44d0d583af2d6aaf0a09f" exitCode=0
Dec 16 09:01:30 crc kubenswrapper[4823]: I1216 09:01:30.473009 4823 generic.go:334] "Generic (PLEG): container finished" podID="61e3b1a8-261a-4e24-a70d-7c460c4505bf" containerID="a9e35598aef459f13a13ee32eda7bda626f30e551305863abc31b00b4f9577a0" exitCode=143
Dec 16 09:01:30 crc kubenswrapper[4823]: I1216 09:01:30.473042 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Dec 16 09:01:30 crc kubenswrapper[4823]: I1216 09:01:30.473058 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"61e3b1a8-261a-4e24-a70d-7c460c4505bf","Type":"ContainerDied","Data":"1bfc22d7e8a1015ad1b5dc7e7de2a51088055779abb44d0d583af2d6aaf0a09f"}
Dec 16 09:01:30 crc kubenswrapper[4823]: I1216 09:01:30.473108 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"61e3b1a8-261a-4e24-a70d-7c460c4505bf","Type":"ContainerDied","Data":"a9e35598aef459f13a13ee32eda7bda626f30e551305863abc31b00b4f9577a0"}
Dec 16 09:01:30 crc kubenswrapper[4823]: I1216 09:01:30.473118 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"61e3b1a8-261a-4e24-a70d-7c460c4505bf","Type":"ContainerDied","Data":"197991325a20db40d51ef9e64d5ba1ac2d9eee74ed199d42fc020be136c92d35"}
Dec 16 09:01:30 crc kubenswrapper[4823]: I1216 09:01:30.473135 4823 scope.go:117] "RemoveContainer" containerID="1bfc22d7e8a1015ad1b5dc7e7de2a51088055779abb44d0d583af2d6aaf0a09f"
Dec 16 09:01:30 crc kubenswrapper[4823]: I1216 09:01:30.480656 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"9fd92bc3-eaf0-4217-bcac-dd8f41db9edf","Type":"ContainerStarted","Data":"f0c094f39df35eb8a22e0462a3cdfe3e11e037d6479ef37e2519d6111a7eadb9"}
Dec 16 09:01:30 crc kubenswrapper[4823]: I1216 09:01:30.483775 4823 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1feea07f-4c8b-4c90-8f3f-63e810a7a525-dns-svc\") on node \"crc\" DevicePath \"\""
Dec 16 09:01:30 crc kubenswrapper[4823]: I1216 09:01:30.483819 4823 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1feea07f-4c8b-4c90-8f3f-63e810a7a525-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Dec 16 09:01:30 crc kubenswrapper[4823]: I1216 09:01:30.483834 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61e3b1a8-261a-4e24-a70d-7c460c4505bf-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 16 09:01:30 crc kubenswrapper[4823]: I1216 09:01:30.483848 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-768rv\" (UniqueName: \"kubernetes.io/projected/61e3b1a8-261a-4e24-a70d-7c460c4505bf-kube-api-access-768rv\") on node \"crc\" DevicePath \"\""
Dec 16 09:01:30 crc kubenswrapper[4823]: I1216 09:01:30.483857 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61e3b1a8-261a-4e24-a70d-7c460c4505bf-config-data\") on node \"crc\" DevicePath \"\""
Dec 16 09:01:30 crc kubenswrapper[4823]: I1216 09:01:30.483866 4823 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1feea07f-4c8b-4c90-8f3f-63e810a7a525-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Dec 16 09:01:30 crc kubenswrapper[4823]: I1216 09:01:30.483933 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0"
Dec 16 09:01:30 crc kubenswrapper[4823]: I1216 09:01:30.486302 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-585657f749-s2nbz" event={"ID":"1feea07f-4c8b-4c90-8f3f-63e810a7a525","Type":"ContainerDied","Data":"85edd95dd769f7e49d143954f68c8d5f9163d250f1893d298cc52bcb61a678f5"}
Dec 16 09:01:30 crc kubenswrapper[4823]: I1216 09:01:30.486326 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-585657f749-s2nbz"
Dec 16 09:01:30 crc kubenswrapper[4823]: I1216 09:01:30.487549 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-4d28k" event={"ID":"cc99ed3d-7cdb-4152-b23e-05096c7dd4cb","Type":"ContainerDied","Data":"d94647bda8db134befb342c2d0b9c716ce38a1908cc3abdec99d70616c634739"}
Dec 16 09:01:30 crc kubenswrapper[4823]: I1216 09:01:30.487574 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-4d28k"
Dec 16 09:01:30 crc kubenswrapper[4823]: I1216 09:01:30.487669 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d94647bda8db134befb342c2d0b9c716ce38a1908cc3abdec99d70616c634739"
Dec 16 09:01:30 crc kubenswrapper[4823]: I1216 09:01:30.499867 4823 scope.go:117] "RemoveContainer" containerID="a9e35598aef459f13a13ee32eda7bda626f30e551305863abc31b00b4f9577a0"
Dec 16 09:01:30 crc kubenswrapper[4823]: I1216 09:01:30.518745 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.51872213 podStartE2EDuration="2.51872213s" podCreationTimestamp="2025-12-16 09:01:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 09:01:30.50623749 +0000 UTC m=+7568.994803613" watchObservedRunningTime="2025-12-16 09:01:30.51872213 +0000 UTC m=+7569.007288253"
Dec 16 09:01:30 crc kubenswrapper[4823]: I1216 09:01:30.562582 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Dec 16 09:01:30 crc kubenswrapper[4823]: I1216 09:01:30.567730 4823 scope.go:117] "RemoveContainer" containerID="1bfc22d7e8a1015ad1b5dc7e7de2a51088055779abb44d0d583af2d6aaf0a09f"
Dec 16 09:01:30 crc kubenswrapper[4823]: E1216 09:01:30.568125 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1bfc22d7e8a1015ad1b5dc7e7de2a51088055779abb44d0d583af2d6aaf0a09f\": container with ID starting with 1bfc22d7e8a1015ad1b5dc7e7de2a51088055779abb44d0d583af2d6aaf0a09f not found: ID does not exist" containerID="1bfc22d7e8a1015ad1b5dc7e7de2a51088055779abb44d0d583af2d6aaf0a09f"
Dec 16 09:01:30 crc kubenswrapper[4823]: I1216 09:01:30.568167 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1bfc22d7e8a1015ad1b5dc7e7de2a51088055779abb44d0d583af2d6aaf0a09f"} err="failed to get container status \"1bfc22d7e8a1015ad1b5dc7e7de2a51088055779abb44d0d583af2d6aaf0a09f\": rpc error: code = NotFound desc = could not find container \"1bfc22d7e8a1015ad1b5dc7e7de2a51088055779abb44d0d583af2d6aaf0a09f\": container with ID starting with 1bfc22d7e8a1015ad1b5dc7e7de2a51088055779abb44d0d583af2d6aaf0a09f not found: ID does not exist"
Dec 16 09:01:30 crc kubenswrapper[4823]: I1216 09:01:30.568195 4823 scope.go:117] "RemoveContainer" containerID="a9e35598aef459f13a13ee32eda7bda626f30e551305863abc31b00b4f9577a0"
Dec 16 09:01:30 crc kubenswrapper[4823]: E1216 09:01:30.568475 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9e35598aef459f13a13ee32eda7bda626f30e551305863abc31b00b4f9577a0\": container with ID starting with a9e35598aef459f13a13ee32eda7bda626f30e551305863abc31b00b4f9577a0 not found: ID does not exist" containerID="a9e35598aef459f13a13ee32eda7bda626f30e551305863abc31b00b4f9577a0"
Dec 16 09:01:30 crc kubenswrapper[4823]: I1216 09:01:30.568499 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9e35598aef459f13a13ee32eda7bda626f30e551305863abc31b00b4f9577a0"} err="failed to get container status \"a9e35598aef459f13a13ee32eda7bda626f30e551305863abc31b00b4f9577a0\": rpc error: code = NotFound desc = could not find container \"a9e35598aef459f13a13ee32eda7bda626f30e551305863abc31b00b4f9577a0\": container with ID starting with a9e35598aef459f13a13ee32eda7bda626f30e551305863abc31b00b4f9577a0 not found: ID does not exist"
Dec 16 09:01:30 crc kubenswrapper[4823]: I1216 09:01:30.568515 4823 scope.go:117] "RemoveContainer" containerID="1bfc22d7e8a1015ad1b5dc7e7de2a51088055779abb44d0d583af2d6aaf0a09f"
Dec 16 09:01:30 crc kubenswrapper[4823]: I1216 09:01:30.568974 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1bfc22d7e8a1015ad1b5dc7e7de2a51088055779abb44d0d583af2d6aaf0a09f"} err="failed to get container status \"1bfc22d7e8a1015ad1b5dc7e7de2a51088055779abb44d0d583af2d6aaf0a09f\": rpc error: code = NotFound desc = could not find container \"1bfc22d7e8a1015ad1b5dc7e7de2a51088055779abb44d0d583af2d6aaf0a09f\": container with ID starting with 1bfc22d7e8a1015ad1b5dc7e7de2a51088055779abb44d0d583af2d6aaf0a09f not found: ID does not exist"
Dec 16 09:01:30 crc kubenswrapper[4823]: I1216 09:01:30.569012 4823 scope.go:117] "RemoveContainer" containerID="a9e35598aef459f13a13ee32eda7bda626f30e551305863abc31b00b4f9577a0"
Dec 16 09:01:30 crc kubenswrapper[4823]: I1216 09:01:30.569412 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9e35598aef459f13a13ee32eda7bda626f30e551305863abc31b00b4f9577a0"} err="failed to get container status \"a9e35598aef459f13a13ee32eda7bda626f30e551305863abc31b00b4f9577a0\": rpc error: code = NotFound desc = could not find container \"a9e35598aef459f13a13ee32eda7bda626f30e551305863abc31b00b4f9577a0\": container with ID starting with a9e35598aef459f13a13ee32eda7bda626f30e551305863abc31b00b4f9577a0 not found: ID does not exist"
Dec 16 09:01:30 crc kubenswrapper[4823]: I1216 09:01:30.569579 4823 scope.go:117] "RemoveContainer" containerID="41af04c1b1d3d7fba526b58071e84a55f4d77a41c033341e492f0f1febfa391f"
Dec 16 09:01:30 crc kubenswrapper[4823]: I1216 09:01:30.599112 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Dec 16 09:01:30 crc kubenswrapper[4823]: I1216 09:01:30.610474 4823 scope.go:117] "RemoveContainer" containerID="5a2c0d9c13987eee26eedb4fa88a845b7ae855811ef7bff152ebec3d6e640b0c"
Dec 16 09:01:30 crc kubenswrapper[4823]: I1216 09:01:30.627627 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Dec 16 09:01:30 crc kubenswrapper[4823]: I1216 09:01:30.649297 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-585657f749-s2nbz"]
Dec 16 09:01:30 crc kubenswrapper[4823]: I1216 09:01:30.681279 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Dec 16 09:01:30 crc kubenswrapper[4823]: I1216 09:01:30.700519 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Dec 16 09:01:30 crc kubenswrapper[4823]: E1216 09:01:30.700997 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc99ed3d-7cdb-4152-b23e-05096c7dd4cb" containerName="nova-manage"
Dec 16 09:01:30 crc kubenswrapper[4823]: I1216 09:01:30.701045 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc99ed3d-7cdb-4152-b23e-05096c7dd4cb" containerName="nova-manage"
Dec 16 09:01:30 crc kubenswrapper[4823]: E1216 09:01:30.701066 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1feea07f-4c8b-4c90-8f3f-63e810a7a525" containerName="init"
Dec 16 09:01:30 crc kubenswrapper[4823]: I1216 09:01:30.701075 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="1feea07f-4c8b-4c90-8f3f-63e810a7a525" containerName="init"
Dec 16 09:01:30 crc kubenswrapper[4823]: E1216 09:01:30.701095 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1feea07f-4c8b-4c90-8f3f-63e810a7a525" containerName="dnsmasq-dns"
Dec 16 09:01:30 crc kubenswrapper[4823]: I1216 09:01:30.701103 4823 state_mem.go:107] "Deleted CPUSet assignment"
podUID="1feea07f-4c8b-4c90-8f3f-63e810a7a525" containerName="dnsmasq-dns" Dec 16 09:01:30 crc kubenswrapper[4823]: E1216 09:01:30.701118 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61e3b1a8-261a-4e24-a70d-7c460c4505bf" containerName="nova-metadata-log" Dec 16 09:01:30 crc kubenswrapper[4823]: I1216 09:01:30.701125 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="61e3b1a8-261a-4e24-a70d-7c460c4505bf" containerName="nova-metadata-log" Dec 16 09:01:30 crc kubenswrapper[4823]: E1216 09:01:30.701143 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61e3b1a8-261a-4e24-a70d-7c460c4505bf" containerName="nova-metadata-metadata" Dec 16 09:01:30 crc kubenswrapper[4823]: I1216 09:01:30.701149 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="61e3b1a8-261a-4e24-a70d-7c460c4505bf" containerName="nova-metadata-metadata" Dec 16 09:01:30 crc kubenswrapper[4823]: I1216 09:01:30.701388 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="61e3b1a8-261a-4e24-a70d-7c460c4505bf" containerName="nova-metadata-log" Dec 16 09:01:30 crc kubenswrapper[4823]: I1216 09:01:30.701406 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="1feea07f-4c8b-4c90-8f3f-63e810a7a525" containerName="dnsmasq-dns" Dec 16 09:01:30 crc kubenswrapper[4823]: I1216 09:01:30.701418 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="61e3b1a8-261a-4e24-a70d-7c460c4505bf" containerName="nova-metadata-metadata" Dec 16 09:01:30 crc kubenswrapper[4823]: I1216 09:01:30.701439 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc99ed3d-7cdb-4152-b23e-05096c7dd4cb" containerName="nova-manage" Dec 16 09:01:30 crc kubenswrapper[4823]: I1216 09:01:30.702610 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 16 09:01:30 crc kubenswrapper[4823]: I1216 09:01:30.727488 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-585657f749-s2nbz"] Dec 16 09:01:30 crc kubenswrapper[4823]: I1216 09:01:30.732904 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 16 09:01:30 crc kubenswrapper[4823]: I1216 09:01:30.735099 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 16 09:01:30 crc kubenswrapper[4823]: I1216 09:01:30.767901 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 16 09:01:30 crc kubenswrapper[4823]: E1216 09:01:30.785942 4823 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod61e3b1a8_261a_4e24_a70d_7c460c4505bf.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod61e3b1a8_261a_4e24_a70d_7c460c4505bf.slice/crio-197991325a20db40d51ef9e64d5ba1ac2d9eee74ed199d42fc020be136c92d35\": RecentStats: unable to find data in memory cache]" Dec 16 09:01:30 crc kubenswrapper[4823]: I1216 09:01:30.786441 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 16 09:01:30 crc kubenswrapper[4823]: E1216 09:01:30.787212 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle config-data kube-api-access-lsjgt logs nova-metadata-tls-certs], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/nova-metadata-0" podUID="61faa911-30a8-410f-932f-0d06980df3c9" Dec 16 09:01:30 crc kubenswrapper[4823]: I1216 09:01:30.792436 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/61faa911-30a8-410f-932f-0d06980df3c9-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"61faa911-30a8-410f-932f-0d06980df3c9\") " pod="openstack/nova-metadata-0" Dec 16 09:01:30 crc kubenswrapper[4823]: I1216 09:01:30.793390 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/61faa911-30a8-410f-932f-0d06980df3c9-logs\") pod \"nova-metadata-0\" (UID: \"61faa911-30a8-410f-932f-0d06980df3c9\") " pod="openstack/nova-metadata-0" Dec 16 09:01:30 crc kubenswrapper[4823]: I1216 09:01:30.793639 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lsjgt\" (UniqueName: \"kubernetes.io/projected/61faa911-30a8-410f-932f-0d06980df3c9-kube-api-access-lsjgt\") pod \"nova-metadata-0\" (UID: \"61faa911-30a8-410f-932f-0d06980df3c9\") " pod="openstack/nova-metadata-0" Dec 16 09:01:30 crc kubenswrapper[4823]: I1216 09:01:30.793757 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61faa911-30a8-410f-932f-0d06980df3c9-config-data\") pod \"nova-metadata-0\" (UID: \"61faa911-30a8-410f-932f-0d06980df3c9\") " pod="openstack/nova-metadata-0" Dec 16 09:01:30 crc kubenswrapper[4823]: I1216 09:01:30.793865 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61faa911-30a8-410f-932f-0d06980df3c9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"61faa911-30a8-410f-932f-0d06980df3c9\") " pod="openstack/nova-metadata-0" Dec 16 09:01:30 crc kubenswrapper[4823]: I1216 09:01:30.895724 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lsjgt\" (UniqueName: 
\"kubernetes.io/projected/61faa911-30a8-410f-932f-0d06980df3c9-kube-api-access-lsjgt\") pod \"nova-metadata-0\" (UID: \"61faa911-30a8-410f-932f-0d06980df3c9\") " pod="openstack/nova-metadata-0" Dec 16 09:01:30 crc kubenswrapper[4823]: I1216 09:01:30.896354 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61faa911-30a8-410f-932f-0d06980df3c9-config-data\") pod \"nova-metadata-0\" (UID: \"61faa911-30a8-410f-932f-0d06980df3c9\") " pod="openstack/nova-metadata-0" Dec 16 09:01:30 crc kubenswrapper[4823]: I1216 09:01:30.896432 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61faa911-30a8-410f-932f-0d06980df3c9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"61faa911-30a8-410f-932f-0d06980df3c9\") " pod="openstack/nova-metadata-0" Dec 16 09:01:30 crc kubenswrapper[4823]: I1216 09:01:30.896614 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/61faa911-30a8-410f-932f-0d06980df3c9-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"61faa911-30a8-410f-932f-0d06980df3c9\") " pod="openstack/nova-metadata-0" Dec 16 09:01:30 crc kubenswrapper[4823]: I1216 09:01:30.896952 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/61faa911-30a8-410f-932f-0d06980df3c9-logs\") pod \"nova-metadata-0\" (UID: \"61faa911-30a8-410f-932f-0d06980df3c9\") " pod="openstack/nova-metadata-0" Dec 16 09:01:30 crc kubenswrapper[4823]: I1216 09:01:30.897962 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/61faa911-30a8-410f-932f-0d06980df3c9-logs\") pod \"nova-metadata-0\" (UID: \"61faa911-30a8-410f-932f-0d06980df3c9\") " pod="openstack/nova-metadata-0" Dec 16 09:01:30 crc 
kubenswrapper[4823]: I1216 09:01:30.903665 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61faa911-30a8-410f-932f-0d06980df3c9-config-data\") pod \"nova-metadata-0\" (UID: \"61faa911-30a8-410f-932f-0d06980df3c9\") " pod="openstack/nova-metadata-0" Dec 16 09:01:30 crc kubenswrapper[4823]: I1216 09:01:30.904467 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/61faa911-30a8-410f-932f-0d06980df3c9-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"61faa911-30a8-410f-932f-0d06980df3c9\") " pod="openstack/nova-metadata-0" Dec 16 09:01:30 crc kubenswrapper[4823]: I1216 09:01:30.905122 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61faa911-30a8-410f-932f-0d06980df3c9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"61faa911-30a8-410f-932f-0d06980df3c9\") " pod="openstack/nova-metadata-0" Dec 16 09:01:30 crc kubenswrapper[4823]: I1216 09:01:30.919931 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lsjgt\" (UniqueName: \"kubernetes.io/projected/61faa911-30a8-410f-932f-0d06980df3c9-kube-api-access-lsjgt\") pod \"nova-metadata-0\" (UID: \"61faa911-30a8-410f-932f-0d06980df3c9\") " pod="openstack/nova-metadata-0" Dec 16 09:01:31 crc kubenswrapper[4823]: I1216 09:01:31.498759 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 16 09:01:31 crc kubenswrapper[4823]: I1216 09:01:31.498780 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="63e56d7b-61f5-4bd7-bac6-48f147bf1b2e" containerName="nova-scheduler-scheduler" containerID="cri-o://4758f43e184672ce4a603509d692a538d609cb8b49c36f6effcc5226089bcbd2" gracePeriod=30 Dec 16 09:01:31 crc kubenswrapper[4823]: I1216 09:01:31.498986 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="587d1019-58f8-48c1-98f9-6f1bc724d7f1" containerName="nova-api-log" containerID="cri-o://adb433c72de1ccca5e8236745430a59be1b1c3ff5fccc23861d70b9f47214807" gracePeriod=30 Dec 16 09:01:31 crc kubenswrapper[4823]: I1216 09:01:31.498986 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="587d1019-58f8-48c1-98f9-6f1bc724d7f1" containerName="nova-api-api" containerID="cri-o://8c4cbe96de67497960588a59d2f5eea4de88e347ad2807a5c9da35faf47c518f" gracePeriod=30 Dec 16 09:01:31 crc kubenswrapper[4823]: I1216 09:01:31.513326 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 16 09:01:31 crc kubenswrapper[4823]: I1216 09:01:31.609608 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61faa911-30a8-410f-932f-0d06980df3c9-combined-ca-bundle\") pod \"61faa911-30a8-410f-932f-0d06980df3c9\" (UID: \"61faa911-30a8-410f-932f-0d06980df3c9\") " Dec 16 09:01:31 crc kubenswrapper[4823]: I1216 09:01:31.609671 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61faa911-30a8-410f-932f-0d06980df3c9-config-data\") pod \"61faa911-30a8-410f-932f-0d06980df3c9\" (UID: \"61faa911-30a8-410f-932f-0d06980df3c9\") " Dec 16 09:01:31 crc kubenswrapper[4823]: I1216 09:01:31.609791 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/61faa911-30a8-410f-932f-0d06980df3c9-nova-metadata-tls-certs\") pod \"61faa911-30a8-410f-932f-0d06980df3c9\" (UID: \"61faa911-30a8-410f-932f-0d06980df3c9\") " Dec 16 09:01:31 crc kubenswrapper[4823]: I1216 09:01:31.609839 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/61faa911-30a8-410f-932f-0d06980df3c9-logs\") pod \"61faa911-30a8-410f-932f-0d06980df3c9\" (UID: \"61faa911-30a8-410f-932f-0d06980df3c9\") " Dec 16 09:01:31 crc kubenswrapper[4823]: I1216 09:01:31.609975 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lsjgt\" (UniqueName: \"kubernetes.io/projected/61faa911-30a8-410f-932f-0d06980df3c9-kube-api-access-lsjgt\") pod \"61faa911-30a8-410f-932f-0d06980df3c9\" (UID: \"61faa911-30a8-410f-932f-0d06980df3c9\") " Dec 16 09:01:31 crc kubenswrapper[4823]: I1216 09:01:31.611998 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/61faa911-30a8-410f-932f-0d06980df3c9-logs" (OuterVolumeSpecName: "logs") pod "61faa911-30a8-410f-932f-0d06980df3c9" (UID: "61faa911-30a8-410f-932f-0d06980df3c9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 09:01:31 crc kubenswrapper[4823]: I1216 09:01:31.615176 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61faa911-30a8-410f-932f-0d06980df3c9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "61faa911-30a8-410f-932f-0d06980df3c9" (UID: "61faa911-30a8-410f-932f-0d06980df3c9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:01:31 crc kubenswrapper[4823]: I1216 09:01:31.616297 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61faa911-30a8-410f-932f-0d06980df3c9-config-data" (OuterVolumeSpecName: "config-data") pod "61faa911-30a8-410f-932f-0d06980df3c9" (UID: "61faa911-30a8-410f-932f-0d06980df3c9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:01:31 crc kubenswrapper[4823]: I1216 09:01:31.619225 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61faa911-30a8-410f-932f-0d06980df3c9-kube-api-access-lsjgt" (OuterVolumeSpecName: "kube-api-access-lsjgt") pod "61faa911-30a8-410f-932f-0d06980df3c9" (UID: "61faa911-30a8-410f-932f-0d06980df3c9"). InnerVolumeSpecName "kube-api-access-lsjgt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:01:31 crc kubenswrapper[4823]: I1216 09:01:31.620405 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61faa911-30a8-410f-932f-0d06980df3c9-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "61faa911-30a8-410f-932f-0d06980df3c9" (UID: "61faa911-30a8-410f-932f-0d06980df3c9"). 
InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:01:31 crc kubenswrapper[4823]: I1216 09:01:31.712538 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lsjgt\" (UniqueName: \"kubernetes.io/projected/61faa911-30a8-410f-932f-0d06980df3c9-kube-api-access-lsjgt\") on node \"crc\" DevicePath \"\"" Dec 16 09:01:31 crc kubenswrapper[4823]: I1216 09:01:31.712588 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61faa911-30a8-410f-932f-0d06980df3c9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 09:01:31 crc kubenswrapper[4823]: I1216 09:01:31.712603 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61faa911-30a8-410f-932f-0d06980df3c9-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 09:01:31 crc kubenswrapper[4823]: I1216 09:01:31.712617 4823 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/61faa911-30a8-410f-932f-0d06980df3c9-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 09:01:31 crc kubenswrapper[4823]: I1216 09:01:31.712629 4823 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/61faa911-30a8-410f-932f-0d06980df3c9-logs\") on node \"crc\" DevicePath \"\"" Dec 16 09:01:31 crc kubenswrapper[4823]: I1216 09:01:31.783892 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1feea07f-4c8b-4c90-8f3f-63e810a7a525" path="/var/lib/kubelet/pods/1feea07f-4c8b-4c90-8f3f-63e810a7a525/volumes" Dec 16 09:01:31 crc kubenswrapper[4823]: I1216 09:01:31.784772 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61e3b1a8-261a-4e24-a70d-7c460c4505bf" path="/var/lib/kubelet/pods/61e3b1a8-261a-4e24-a70d-7c460c4505bf/volumes" Dec 16 09:01:32 crc kubenswrapper[4823]: I1216 
09:01:32.039649 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-xqn74"] Dec 16 09:01:32 crc kubenswrapper[4823]: I1216 09:01:32.054876 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-xqn74"] Dec 16 09:01:32 crc kubenswrapper[4823]: I1216 09:01:32.175150 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 16 09:01:32 crc kubenswrapper[4823]: I1216 09:01:32.233302 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/587d1019-58f8-48c1-98f9-6f1bc724d7f1-config-data\") pod \"587d1019-58f8-48c1-98f9-6f1bc724d7f1\" (UID: \"587d1019-58f8-48c1-98f9-6f1bc724d7f1\") " Dec 16 09:01:32 crc kubenswrapper[4823]: I1216 09:01:32.233357 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8fxgq\" (UniqueName: \"kubernetes.io/projected/587d1019-58f8-48c1-98f9-6f1bc724d7f1-kube-api-access-8fxgq\") pod \"587d1019-58f8-48c1-98f9-6f1bc724d7f1\" (UID: \"587d1019-58f8-48c1-98f9-6f1bc724d7f1\") " Dec 16 09:01:32 crc kubenswrapper[4823]: I1216 09:01:32.233473 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/587d1019-58f8-48c1-98f9-6f1bc724d7f1-logs\") pod \"587d1019-58f8-48c1-98f9-6f1bc724d7f1\" (UID: \"587d1019-58f8-48c1-98f9-6f1bc724d7f1\") " Dec 16 09:01:32 crc kubenswrapper[4823]: I1216 09:01:32.233529 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/587d1019-58f8-48c1-98f9-6f1bc724d7f1-combined-ca-bundle\") pod \"587d1019-58f8-48c1-98f9-6f1bc724d7f1\" (UID: \"587d1019-58f8-48c1-98f9-6f1bc724d7f1\") " Dec 16 09:01:32 crc kubenswrapper[4823]: I1216 09:01:32.234210 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/587d1019-58f8-48c1-98f9-6f1bc724d7f1-logs" (OuterVolumeSpecName: "logs") pod "587d1019-58f8-48c1-98f9-6f1bc724d7f1" (UID: "587d1019-58f8-48c1-98f9-6f1bc724d7f1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 09:01:32 crc kubenswrapper[4823]: I1216 09:01:32.238327 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/587d1019-58f8-48c1-98f9-6f1bc724d7f1-kube-api-access-8fxgq" (OuterVolumeSpecName: "kube-api-access-8fxgq") pod "587d1019-58f8-48c1-98f9-6f1bc724d7f1" (UID: "587d1019-58f8-48c1-98f9-6f1bc724d7f1"). InnerVolumeSpecName "kube-api-access-8fxgq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:01:32 crc kubenswrapper[4823]: I1216 09:01:32.259087 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/587d1019-58f8-48c1-98f9-6f1bc724d7f1-config-data" (OuterVolumeSpecName: "config-data") pod "587d1019-58f8-48c1-98f9-6f1bc724d7f1" (UID: "587d1019-58f8-48c1-98f9-6f1bc724d7f1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:01:32 crc kubenswrapper[4823]: I1216 09:01:32.264748 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/587d1019-58f8-48c1-98f9-6f1bc724d7f1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "587d1019-58f8-48c1-98f9-6f1bc724d7f1" (UID: "587d1019-58f8-48c1-98f9-6f1bc724d7f1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:01:32 crc kubenswrapper[4823]: I1216 09:01:32.336068 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/587d1019-58f8-48c1-98f9-6f1bc724d7f1-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 09:01:32 crc kubenswrapper[4823]: I1216 09:01:32.336112 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8fxgq\" (UniqueName: \"kubernetes.io/projected/587d1019-58f8-48c1-98f9-6f1bc724d7f1-kube-api-access-8fxgq\") on node \"crc\" DevicePath \"\"" Dec 16 09:01:32 crc kubenswrapper[4823]: I1216 09:01:32.336122 4823 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/587d1019-58f8-48c1-98f9-6f1bc724d7f1-logs\") on node \"crc\" DevicePath \"\"" Dec 16 09:01:32 crc kubenswrapper[4823]: I1216 09:01:32.336131 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/587d1019-58f8-48c1-98f9-6f1bc724d7f1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 09:01:32 crc kubenswrapper[4823]: I1216 09:01:32.507738 4823 generic.go:334] "Generic (PLEG): container finished" podID="587d1019-58f8-48c1-98f9-6f1bc724d7f1" containerID="8c4cbe96de67497960588a59d2f5eea4de88e347ad2807a5c9da35faf47c518f" exitCode=0 Dec 16 09:01:32 crc kubenswrapper[4823]: I1216 09:01:32.507767 4823 generic.go:334] "Generic (PLEG): container finished" podID="587d1019-58f8-48c1-98f9-6f1bc724d7f1" containerID="adb433c72de1ccca5e8236745430a59be1b1c3ff5fccc23861d70b9f47214807" exitCode=143 Dec 16 09:01:32 crc kubenswrapper[4823]: I1216 09:01:32.507813 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 16 09:01:32 crc kubenswrapper[4823]: I1216 09:01:32.507813 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 16 09:01:32 crc kubenswrapper[4823]: I1216 09:01:32.507801 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"587d1019-58f8-48c1-98f9-6f1bc724d7f1","Type":"ContainerDied","Data":"8c4cbe96de67497960588a59d2f5eea4de88e347ad2807a5c9da35faf47c518f"} Dec 16 09:01:32 crc kubenswrapper[4823]: I1216 09:01:32.508715 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"587d1019-58f8-48c1-98f9-6f1bc724d7f1","Type":"ContainerDied","Data":"adb433c72de1ccca5e8236745430a59be1b1c3ff5fccc23861d70b9f47214807"} Dec 16 09:01:32 crc kubenswrapper[4823]: I1216 09:01:32.508727 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"587d1019-58f8-48c1-98f9-6f1bc724d7f1","Type":"ContainerDied","Data":"86ac600122427487d4470a19585cc096b322271c4bbfbee34be3466ddbf69ddb"} Dec 16 09:01:32 crc kubenswrapper[4823]: I1216 09:01:32.508745 4823 scope.go:117] "RemoveContainer" containerID="8c4cbe96de67497960588a59d2f5eea4de88e347ad2807a5c9da35faf47c518f" Dec 16 09:01:32 crc kubenswrapper[4823]: I1216 09:01:32.527943 4823 scope.go:117] "RemoveContainer" containerID="adb433c72de1ccca5e8236745430a59be1b1c3ff5fccc23861d70b9f47214807" Dec 16 09:01:32 crc kubenswrapper[4823]: I1216 09:01:32.565187 4823 scope.go:117] "RemoveContainer" containerID="8c4cbe96de67497960588a59d2f5eea4de88e347ad2807a5c9da35faf47c518f" Dec 16 09:01:32 crc kubenswrapper[4823]: E1216 09:01:32.567429 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c4cbe96de67497960588a59d2f5eea4de88e347ad2807a5c9da35faf47c518f\": container with ID starting with 8c4cbe96de67497960588a59d2f5eea4de88e347ad2807a5c9da35faf47c518f not found: ID does not exist" containerID="8c4cbe96de67497960588a59d2f5eea4de88e347ad2807a5c9da35faf47c518f" Dec 16 09:01:32 crc kubenswrapper[4823]: I1216 09:01:32.567475 
4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c4cbe96de67497960588a59d2f5eea4de88e347ad2807a5c9da35faf47c518f"} err="failed to get container status \"8c4cbe96de67497960588a59d2f5eea4de88e347ad2807a5c9da35faf47c518f\": rpc error: code = NotFound desc = could not find container \"8c4cbe96de67497960588a59d2f5eea4de88e347ad2807a5c9da35faf47c518f\": container with ID starting with 8c4cbe96de67497960588a59d2f5eea4de88e347ad2807a5c9da35faf47c518f not found: ID does not exist" Dec 16 09:01:32 crc kubenswrapper[4823]: I1216 09:01:32.567508 4823 scope.go:117] "RemoveContainer" containerID="adb433c72de1ccca5e8236745430a59be1b1c3ff5fccc23861d70b9f47214807" Dec 16 09:01:32 crc kubenswrapper[4823]: E1216 09:01:32.568402 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"adb433c72de1ccca5e8236745430a59be1b1c3ff5fccc23861d70b9f47214807\": container with ID starting with adb433c72de1ccca5e8236745430a59be1b1c3ff5fccc23861d70b9f47214807 not found: ID does not exist" containerID="adb433c72de1ccca5e8236745430a59be1b1c3ff5fccc23861d70b9f47214807" Dec 16 09:01:32 crc kubenswrapper[4823]: I1216 09:01:32.568430 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"adb433c72de1ccca5e8236745430a59be1b1c3ff5fccc23861d70b9f47214807"} err="failed to get container status \"adb433c72de1ccca5e8236745430a59be1b1c3ff5fccc23861d70b9f47214807\": rpc error: code = NotFound desc = could not find container \"adb433c72de1ccca5e8236745430a59be1b1c3ff5fccc23861d70b9f47214807\": container with ID starting with adb433c72de1ccca5e8236745430a59be1b1c3ff5fccc23861d70b9f47214807 not found: ID does not exist" Dec 16 09:01:32 crc kubenswrapper[4823]: I1216 09:01:32.568459 4823 scope.go:117] "RemoveContainer" containerID="8c4cbe96de67497960588a59d2f5eea4de88e347ad2807a5c9da35faf47c518f" Dec 16 09:01:32 crc kubenswrapper[4823]: I1216 
09:01:32.569333 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c4cbe96de67497960588a59d2f5eea4de88e347ad2807a5c9da35faf47c518f"} err="failed to get container status \"8c4cbe96de67497960588a59d2f5eea4de88e347ad2807a5c9da35faf47c518f\": rpc error: code = NotFound desc = could not find container \"8c4cbe96de67497960588a59d2f5eea4de88e347ad2807a5c9da35faf47c518f\": container with ID starting with 8c4cbe96de67497960588a59d2f5eea4de88e347ad2807a5c9da35faf47c518f not found: ID does not exist"
Dec 16 09:01:32 crc kubenswrapper[4823]: I1216 09:01:32.569359 4823 scope.go:117] "RemoveContainer" containerID="adb433c72de1ccca5e8236745430a59be1b1c3ff5fccc23861d70b9f47214807"
Dec 16 09:01:32 crc kubenswrapper[4823]: I1216 09:01:32.572244 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"adb433c72de1ccca5e8236745430a59be1b1c3ff5fccc23861d70b9f47214807"} err="failed to get container status \"adb433c72de1ccca5e8236745430a59be1b1c3ff5fccc23861d70b9f47214807\": rpc error: code = NotFound desc = could not find container \"adb433c72de1ccca5e8236745430a59be1b1c3ff5fccc23861d70b9f47214807\": container with ID starting with adb433c72de1ccca5e8236745430a59be1b1c3ff5fccc23861d70b9f47214807 not found: ID does not exist"
Dec 16 09:01:32 crc kubenswrapper[4823]: I1216 09:01:32.576336 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Dec 16 09:01:32 crc kubenswrapper[4823]: I1216 09:01:32.587204 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Dec 16 09:01:32 crc kubenswrapper[4823]: I1216 09:01:32.599785 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Dec 16 09:01:32 crc kubenswrapper[4823]: E1216 09:01:32.600228 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="587d1019-58f8-48c1-98f9-6f1bc724d7f1" containerName="nova-api-api"
Dec 16 09:01:32 crc kubenswrapper[4823]: I1216 09:01:32.600240 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="587d1019-58f8-48c1-98f9-6f1bc724d7f1" containerName="nova-api-api"
Dec 16 09:01:32 crc kubenswrapper[4823]: E1216 09:01:32.600266 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="587d1019-58f8-48c1-98f9-6f1bc724d7f1" containerName="nova-api-log"
Dec 16 09:01:32 crc kubenswrapper[4823]: I1216 09:01:32.600271 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="587d1019-58f8-48c1-98f9-6f1bc724d7f1" containerName="nova-api-log"
Dec 16 09:01:32 crc kubenswrapper[4823]: I1216 09:01:32.600450 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="587d1019-58f8-48c1-98f9-6f1bc724d7f1" containerName="nova-api-api"
Dec 16 09:01:32 crc kubenswrapper[4823]: I1216 09:01:32.600464 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="587d1019-58f8-48c1-98f9-6f1bc724d7f1" containerName="nova-api-log"
Dec 16 09:01:32 crc kubenswrapper[4823]: I1216 09:01:32.601373 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Dec 16 09:01:32 crc kubenswrapper[4823]: I1216 09:01:32.621644 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Dec 16 09:01:32 crc kubenswrapper[4823]: I1216 09:01:32.621694 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Dec 16 09:01:32 crc kubenswrapper[4823]: I1216 09:01:32.629534 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Dec 16 09:01:32 crc kubenswrapper[4823]: I1216 09:01:32.629796 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Dec 16 09:01:32 crc kubenswrapper[4823]: I1216 09:01:32.634203 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Dec 16 09:01:32 crc kubenswrapper[4823]: I1216 09:01:32.641450 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/aabc53f0-5705-4d7f-bc7c-0b1fcfd75026-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"aabc53f0-5705-4d7f-bc7c-0b1fcfd75026\") " pod="openstack/nova-metadata-0"
Dec 16 09:01:32 crc kubenswrapper[4823]: I1216 09:01:32.641550 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aabc53f0-5705-4d7f-bc7c-0b1fcfd75026-logs\") pod \"nova-metadata-0\" (UID: \"aabc53f0-5705-4d7f-bc7c-0b1fcfd75026\") " pod="openstack/nova-metadata-0"
Dec 16 09:01:32 crc kubenswrapper[4823]: I1216 09:01:32.641581 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhccp\" (UniqueName: \"kubernetes.io/projected/aabc53f0-5705-4d7f-bc7c-0b1fcfd75026-kube-api-access-vhccp\") pod \"nova-metadata-0\" (UID: \"aabc53f0-5705-4d7f-bc7c-0b1fcfd75026\") " pod="openstack/nova-metadata-0"
Dec 16 09:01:32 crc kubenswrapper[4823]: I1216 09:01:32.641603 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aabc53f0-5705-4d7f-bc7c-0b1fcfd75026-config-data\") pod \"nova-metadata-0\" (UID: \"aabc53f0-5705-4d7f-bc7c-0b1fcfd75026\") " pod="openstack/nova-metadata-0"
Dec 16 09:01:32 crc kubenswrapper[4823]: I1216 09:01:32.641624 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aabc53f0-5705-4d7f-bc7c-0b1fcfd75026-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"aabc53f0-5705-4d7f-bc7c-0b1fcfd75026\") " pod="openstack/nova-metadata-0"
Dec 16 09:01:32 crc kubenswrapper[4823]: I1216 09:01:32.652333 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Dec 16 09:01:32 crc kubenswrapper[4823]: I1216 09:01:32.654141 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Dec 16 09:01:32 crc kubenswrapper[4823]: I1216 09:01:32.656789 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Dec 16 09:01:32 crc kubenswrapper[4823]: I1216 09:01:32.668317 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Dec 16 09:01:32 crc kubenswrapper[4823]: I1216 09:01:32.742582 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aabc53f0-5705-4d7f-bc7c-0b1fcfd75026-logs\") pod \"nova-metadata-0\" (UID: \"aabc53f0-5705-4d7f-bc7c-0b1fcfd75026\") " pod="openstack/nova-metadata-0"
Dec 16 09:01:32 crc kubenswrapper[4823]: I1216 09:01:32.742633 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dj4xt\" (UniqueName: \"kubernetes.io/projected/860c91ee-4dcd-4a5e-abcb-325b31341951-kube-api-access-dj4xt\") pod \"nova-api-0\" (UID: \"860c91ee-4dcd-4a5e-abcb-325b31341951\") " pod="openstack/nova-api-0"
Dec 16 09:01:32 crc kubenswrapper[4823]: I1216 09:01:32.742670 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhccp\" (UniqueName: \"kubernetes.io/projected/aabc53f0-5705-4d7f-bc7c-0b1fcfd75026-kube-api-access-vhccp\") pod \"nova-metadata-0\" (UID: \"aabc53f0-5705-4d7f-bc7c-0b1fcfd75026\") " pod="openstack/nova-metadata-0"
Dec 16 09:01:32 crc kubenswrapper[4823]: I1216 09:01:32.742696 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aabc53f0-5705-4d7f-bc7c-0b1fcfd75026-config-data\") pod \"nova-metadata-0\" (UID: \"aabc53f0-5705-4d7f-bc7c-0b1fcfd75026\") " pod="openstack/nova-metadata-0"
Dec 16 09:01:32 crc kubenswrapper[4823]: I1216 09:01:32.742720 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aabc53f0-5705-4d7f-bc7c-0b1fcfd75026-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"aabc53f0-5705-4d7f-bc7c-0b1fcfd75026\") " pod="openstack/nova-metadata-0"
Dec 16 09:01:32 crc kubenswrapper[4823]: I1216 09:01:32.742751 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/860c91ee-4dcd-4a5e-abcb-325b31341951-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"860c91ee-4dcd-4a5e-abcb-325b31341951\") " pod="openstack/nova-api-0"
Dec 16 09:01:32 crc kubenswrapper[4823]: I1216 09:01:32.742851 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/aabc53f0-5705-4d7f-bc7c-0b1fcfd75026-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"aabc53f0-5705-4d7f-bc7c-0b1fcfd75026\") " pod="openstack/nova-metadata-0"
Dec 16 09:01:32 crc kubenswrapper[4823]: I1216 09:01:32.742884 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/860c91ee-4dcd-4a5e-abcb-325b31341951-logs\") pod \"nova-api-0\" (UID: \"860c91ee-4dcd-4a5e-abcb-325b31341951\") " pod="openstack/nova-api-0"
Dec 16 09:01:32 crc kubenswrapper[4823]: I1216 09:01:32.742930 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/860c91ee-4dcd-4a5e-abcb-325b31341951-config-data\") pod \"nova-api-0\" (UID: \"860c91ee-4dcd-4a5e-abcb-325b31341951\") " pod="openstack/nova-api-0"
Dec 16 09:01:32 crc kubenswrapper[4823]: I1216 09:01:32.743162 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aabc53f0-5705-4d7f-bc7c-0b1fcfd75026-logs\") pod \"nova-metadata-0\" (UID: \"aabc53f0-5705-4d7f-bc7c-0b1fcfd75026\") " pod="openstack/nova-metadata-0"
Dec 16 09:01:32 crc kubenswrapper[4823]: I1216 09:01:32.747429 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aabc53f0-5705-4d7f-bc7c-0b1fcfd75026-config-data\") pod \"nova-metadata-0\" (UID: \"aabc53f0-5705-4d7f-bc7c-0b1fcfd75026\") " pod="openstack/nova-metadata-0"
Dec 16 09:01:32 crc kubenswrapper[4823]: I1216 09:01:32.747456 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/aabc53f0-5705-4d7f-bc7c-0b1fcfd75026-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"aabc53f0-5705-4d7f-bc7c-0b1fcfd75026\") " pod="openstack/nova-metadata-0"
Dec 16 09:01:32 crc kubenswrapper[4823]: I1216 09:01:32.748462 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aabc53f0-5705-4d7f-bc7c-0b1fcfd75026-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"aabc53f0-5705-4d7f-bc7c-0b1fcfd75026\") " pod="openstack/nova-metadata-0"
Dec 16 09:01:32 crc kubenswrapper[4823]: I1216 09:01:32.758077 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhccp\" (UniqueName: \"kubernetes.io/projected/aabc53f0-5705-4d7f-bc7c-0b1fcfd75026-kube-api-access-vhccp\") pod \"nova-metadata-0\" (UID: \"aabc53f0-5705-4d7f-bc7c-0b1fcfd75026\") " pod="openstack/nova-metadata-0"
Dec 16 09:01:32 crc kubenswrapper[4823]: I1216 09:01:32.845183 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/860c91ee-4dcd-4a5e-abcb-325b31341951-logs\") pod \"nova-api-0\" (UID: \"860c91ee-4dcd-4a5e-abcb-325b31341951\") " pod="openstack/nova-api-0"
Dec 16 09:01:32 crc kubenswrapper[4823]: I1216 09:01:32.845286 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/860c91ee-4dcd-4a5e-abcb-325b31341951-config-data\") pod \"nova-api-0\" (UID: \"860c91ee-4dcd-4a5e-abcb-325b31341951\") " pod="openstack/nova-api-0"
Dec 16 09:01:32 crc kubenswrapper[4823]: I1216 09:01:32.845931 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dj4xt\" (UniqueName: \"kubernetes.io/projected/860c91ee-4dcd-4a5e-abcb-325b31341951-kube-api-access-dj4xt\") pod \"nova-api-0\" (UID: \"860c91ee-4dcd-4a5e-abcb-325b31341951\") " pod="openstack/nova-api-0"
Dec 16 09:01:32 crc kubenswrapper[4823]: I1216 09:01:32.845983 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/860c91ee-4dcd-4a5e-abcb-325b31341951-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"860c91ee-4dcd-4a5e-abcb-325b31341951\") " pod="openstack/nova-api-0"
Dec 16 09:01:32 crc kubenswrapper[4823]: I1216 09:01:32.846393 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/860c91ee-4dcd-4a5e-abcb-325b31341951-logs\") pod \"nova-api-0\" (UID: \"860c91ee-4dcd-4a5e-abcb-325b31341951\") " pod="openstack/nova-api-0"
Dec 16 09:01:32 crc kubenswrapper[4823]: I1216 09:01:32.850833 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/860c91ee-4dcd-4a5e-abcb-325b31341951-config-data\") pod \"nova-api-0\" (UID: \"860c91ee-4dcd-4a5e-abcb-325b31341951\") " pod="openstack/nova-api-0"
Dec 16 09:01:32 crc kubenswrapper[4823]: I1216 09:01:32.850908 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/860c91ee-4dcd-4a5e-abcb-325b31341951-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"860c91ee-4dcd-4a5e-abcb-325b31341951\") " pod="openstack/nova-api-0"
Dec 16 09:01:32 crc kubenswrapper[4823]: I1216 09:01:32.867581 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dj4xt\" (UniqueName: \"kubernetes.io/projected/860c91ee-4dcd-4a5e-abcb-325b31341951-kube-api-access-dj4xt\") pod \"nova-api-0\" (UID: \"860c91ee-4dcd-4a5e-abcb-325b31341951\") " pod="openstack/nova-api-0"
Dec 16 09:01:32 crc kubenswrapper[4823]: I1216 09:01:32.947648 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Dec 16 09:01:32 crc kubenswrapper[4823]: I1216 09:01:32.972566 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Dec 16 09:01:33 crc kubenswrapper[4823]: I1216 09:01:33.428444 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Dec 16 09:01:33 crc kubenswrapper[4823]: I1216 09:01:33.531342 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Dec 16 09:01:33 crc kubenswrapper[4823]: I1216 09:01:33.543783 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"aabc53f0-5705-4d7f-bc7c-0b1fcfd75026","Type":"ContainerStarted","Data":"4757fbb94fc41a5ffdf50d4116680bfecf9daa4921765d1f27f823c8f70b0ef8"}
Dec 16 09:01:33 crc kubenswrapper[4823]: W1216 09:01:33.545863 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod860c91ee_4dcd_4a5e_abcb_325b31341951.slice/crio-a0ebf7a4388868fd466095fb0cce13791ad858ebde2c6e17ebf714239860e69c WatchSource:0}: Error finding container a0ebf7a4388868fd466095fb0cce13791ad858ebde2c6e17ebf714239860e69c: Status 404 returned error can't find the container with id a0ebf7a4388868fd466095fb0cce13791ad858ebde2c6e17ebf714239860e69c
Dec 16 09:01:33 crc kubenswrapper[4823]: I1216 09:01:33.550046 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Dec 16 09:01:33 crc kubenswrapper[4823]: I1216 09:01:33.786908 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43e373e7-9d6c-44b2-86c5-3ed6e2c5b6d8" path="/var/lib/kubelet/pods/43e373e7-9d6c-44b2-86c5-3ed6e2c5b6d8/volumes"
Dec 16 09:01:33 crc kubenswrapper[4823]: I1216 09:01:33.788630 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="587d1019-58f8-48c1-98f9-6f1bc724d7f1" path="/var/lib/kubelet/pods/587d1019-58f8-48c1-98f9-6f1bc724d7f1/volumes"
Dec 16 09:01:33 crc kubenswrapper[4823]: I1216 09:01:33.789738 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61faa911-30a8-410f-932f-0d06980df3c9" path="/var/lib/kubelet/pods/61faa911-30a8-410f-932f-0d06980df3c9/volumes"
Dec 16 09:01:34 crc kubenswrapper[4823]: I1216 09:01:34.557473 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"aabc53f0-5705-4d7f-bc7c-0b1fcfd75026","Type":"ContainerStarted","Data":"76cfd7427b391b74b49e284f41e6deeef50c653e65e3743f36a051a6d001ab71"}
Dec 16 09:01:34 crc kubenswrapper[4823]: I1216 09:01:34.560070 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"860c91ee-4dcd-4a5e-abcb-325b31341951","Type":"ContainerStarted","Data":"9db6a3d18d3fd021fa6186a76ddd9003796fbf7d1d269a9473a51744b3077d51"}
Dec 16 09:01:34 crc kubenswrapper[4823]: I1216 09:01:34.560121 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"860c91ee-4dcd-4a5e-abcb-325b31341951","Type":"ContainerStarted","Data":"a0ebf7a4388868fd466095fb0cce13791ad858ebde2c6e17ebf714239860e69c"}
Dec 16 09:01:35 crc kubenswrapper[4823]: I1216 09:01:35.570777 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"aabc53f0-5705-4d7f-bc7c-0b1fcfd75026","Type":"ContainerStarted","Data":"3059c456084fc689951e3eb5aee2dfc53406f37d52d93a1078de1666042d7578"}
Dec 16 09:01:35 crc kubenswrapper[4823]: I1216 09:01:35.576364 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"860c91ee-4dcd-4a5e-abcb-325b31341951","Type":"ContainerStarted","Data":"3973cc71f5ebd11807529940ab4ed991048dc0d85b38c7bc4a76cdcb2ea7c145"}
Dec 16 09:01:35 crc kubenswrapper[4823]: I1216 09:01:35.603397 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.6033797869999997 podStartE2EDuration="3.603379787s" podCreationTimestamp="2025-12-16 09:01:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 09:01:35.588616504 +0000 UTC m=+7574.077182647" watchObservedRunningTime="2025-12-16 09:01:35.603379787 +0000 UTC m=+7574.091945910"
Dec 16 09:01:35 crc kubenswrapper[4823]: I1216 09:01:35.621688 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.62166743 podStartE2EDuration="3.62166743s" podCreationTimestamp="2025-12-16 09:01:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 09:01:35.617707745 +0000 UTC m=+7574.106273878" watchObservedRunningTime="2025-12-16 09:01:35.62166743 +0000 UTC m=+7574.110233553"
Dec 16 09:01:35 crc kubenswrapper[4823]: I1216 09:01:35.728838 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-k9mtc"
Dec 16 09:01:35 crc kubenswrapper[4823]: I1216 09:01:35.796185 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-k9mtc"
Dec 16 09:01:35 crc kubenswrapper[4823]: I1216 09:01:35.969355 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-k9mtc"]
Dec 16 09:01:37 crc kubenswrapper[4823]: I1216 09:01:37.601075 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-k9mtc" podUID="1989b143-cd57-41c6-9174-e8067cbc491f" containerName="registry-server" containerID="cri-o://eb437857393757b20d324f42c526c2b0733370c1b923df4e4f83ff28567a0a32" gracePeriod=2
Dec 16 09:01:37 crc kubenswrapper[4823]: I1216 09:01:37.948298 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Dec 16 09:01:37 crc kubenswrapper[4823]: I1216 09:01:37.950054 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Dec 16 09:01:38 crc kubenswrapper[4823]: I1216 09:01:38.137790 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-k9mtc"
Dec 16 09:01:38 crc kubenswrapper[4823]: I1216 09:01:38.182057 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1989b143-cd57-41c6-9174-e8067cbc491f-utilities\") pod \"1989b143-cd57-41c6-9174-e8067cbc491f\" (UID: \"1989b143-cd57-41c6-9174-e8067cbc491f\") "
Dec 16 09:01:38 crc kubenswrapper[4823]: I1216 09:01:38.182114 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1989b143-cd57-41c6-9174-e8067cbc491f-catalog-content\") pod \"1989b143-cd57-41c6-9174-e8067cbc491f\" (UID: \"1989b143-cd57-41c6-9174-e8067cbc491f\") "
Dec 16 09:01:38 crc kubenswrapper[4823]: I1216 09:01:38.182144 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9nxz\" (UniqueName: \"kubernetes.io/projected/1989b143-cd57-41c6-9174-e8067cbc491f-kube-api-access-w9nxz\") pod \"1989b143-cd57-41c6-9174-e8067cbc491f\" (UID: \"1989b143-cd57-41c6-9174-e8067cbc491f\") "
Dec 16 09:01:38 crc kubenswrapper[4823]: I1216 09:01:38.189976 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1989b143-cd57-41c6-9174-e8067cbc491f-utilities" (OuterVolumeSpecName: "utilities") pod "1989b143-cd57-41c6-9174-e8067cbc491f" (UID: "1989b143-cd57-41c6-9174-e8067cbc491f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 16 09:01:38 crc kubenswrapper[4823]: I1216 09:01:38.200302 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1989b143-cd57-41c6-9174-e8067cbc491f-kube-api-access-w9nxz" (OuterVolumeSpecName: "kube-api-access-w9nxz") pod "1989b143-cd57-41c6-9174-e8067cbc491f" (UID: "1989b143-cd57-41c6-9174-e8067cbc491f"). InnerVolumeSpecName "kube-api-access-w9nxz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 09:01:38 crc kubenswrapper[4823]: I1216 09:01:38.237860 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1989b143-cd57-41c6-9174-e8067cbc491f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1989b143-cd57-41c6-9174-e8067cbc491f" (UID: "1989b143-cd57-41c6-9174-e8067cbc491f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 16 09:01:38 crc kubenswrapper[4823]: I1216 09:01:38.285036 4823 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1989b143-cd57-41c6-9174-e8067cbc491f-utilities\") on node \"crc\" DevicePath \"\""
Dec 16 09:01:38 crc kubenswrapper[4823]: I1216 09:01:38.285086 4823 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1989b143-cd57-41c6-9174-e8067cbc491f-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 16 09:01:38 crc kubenswrapper[4823]: I1216 09:01:38.285108 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9nxz\" (UniqueName: \"kubernetes.io/projected/1989b143-cd57-41c6-9174-e8067cbc491f-kube-api-access-w9nxz\") on node \"crc\" DevicePath \"\""
Dec 16 09:01:38 crc kubenswrapper[4823]: I1216 09:01:38.611744 4823 generic.go:334] "Generic (PLEG): container finished" podID="1989b143-cd57-41c6-9174-e8067cbc491f" containerID="eb437857393757b20d324f42c526c2b0733370c1b923df4e4f83ff28567a0a32" exitCode=0
Dec 16 09:01:38 crc kubenswrapper[4823]: I1216 09:01:38.611807 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k9mtc" event={"ID":"1989b143-cd57-41c6-9174-e8067cbc491f","Type":"ContainerDied","Data":"eb437857393757b20d324f42c526c2b0733370c1b923df4e4f83ff28567a0a32"}
Dec 16 09:01:38 crc kubenswrapper[4823]: I1216 09:01:38.611832 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-k9mtc"
Dec 16 09:01:38 crc kubenswrapper[4823]: I1216 09:01:38.611859 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k9mtc" event={"ID":"1989b143-cd57-41c6-9174-e8067cbc491f","Type":"ContainerDied","Data":"73e86d437fa925e89e08a7e70533b34054a6848460c20f8bd456bd549c0e8ed5"}
Dec 16 09:01:38 crc kubenswrapper[4823]: I1216 09:01:38.611886 4823 scope.go:117] "RemoveContainer" containerID="eb437857393757b20d324f42c526c2b0733370c1b923df4e4f83ff28567a0a32"
Dec 16 09:01:38 crc kubenswrapper[4823]: I1216 09:01:38.636692 4823 scope.go:117] "RemoveContainer" containerID="343390cb1b9dbe46a6225176839bbd7fbb8d99803c2693939b4e74634fac34cc"
Dec 16 09:01:38 crc kubenswrapper[4823]: I1216 09:01:38.654862 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-k9mtc"]
Dec 16 09:01:38 crc kubenswrapper[4823]: I1216 09:01:38.664673 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-k9mtc"]
Dec 16 09:01:38 crc kubenswrapper[4823]: I1216 09:01:38.679563 4823 scope.go:117] "RemoveContainer" containerID="3316eb77edd9ac9ecbe03a940e26d7211ec411fe27b5e5614a92680b1439b37f"
Dec 16 09:01:38 crc kubenswrapper[4823]: I1216 09:01:38.717243 4823 scope.go:117] "RemoveContainer" containerID="eb437857393757b20d324f42c526c2b0733370c1b923df4e4f83ff28567a0a32"
Dec 16 09:01:38 crc kubenswrapper[4823]: E1216 09:01:38.717776 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb437857393757b20d324f42c526c2b0733370c1b923df4e4f83ff28567a0a32\": container with ID starting with eb437857393757b20d324f42c526c2b0733370c1b923df4e4f83ff28567a0a32 not found: ID does not exist" containerID="eb437857393757b20d324f42c526c2b0733370c1b923df4e4f83ff28567a0a32"
Dec 16 09:01:38 crc kubenswrapper[4823]: I1216 09:01:38.717813 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb437857393757b20d324f42c526c2b0733370c1b923df4e4f83ff28567a0a32"} err="failed to get container status \"eb437857393757b20d324f42c526c2b0733370c1b923df4e4f83ff28567a0a32\": rpc error: code = NotFound desc = could not find container \"eb437857393757b20d324f42c526c2b0733370c1b923df4e4f83ff28567a0a32\": container with ID starting with eb437857393757b20d324f42c526c2b0733370c1b923df4e4f83ff28567a0a32 not found: ID does not exist"
Dec 16 09:01:38 crc kubenswrapper[4823]: I1216 09:01:38.717838 4823 scope.go:117] "RemoveContainer" containerID="343390cb1b9dbe46a6225176839bbd7fbb8d99803c2693939b4e74634fac34cc"
Dec 16 09:01:38 crc kubenswrapper[4823]: E1216 09:01:38.718272 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"343390cb1b9dbe46a6225176839bbd7fbb8d99803c2693939b4e74634fac34cc\": container with ID starting with 343390cb1b9dbe46a6225176839bbd7fbb8d99803c2693939b4e74634fac34cc not found: ID does not exist" containerID="343390cb1b9dbe46a6225176839bbd7fbb8d99803c2693939b4e74634fac34cc"
Dec 16 09:01:38 crc kubenswrapper[4823]: I1216 09:01:38.718343 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"343390cb1b9dbe46a6225176839bbd7fbb8d99803c2693939b4e74634fac34cc"} err="failed to get container status \"343390cb1b9dbe46a6225176839bbd7fbb8d99803c2693939b4e74634fac34cc\": rpc error: code = NotFound desc = could not find container \"343390cb1b9dbe46a6225176839bbd7fbb8d99803c2693939b4e74634fac34cc\": container with ID starting with 343390cb1b9dbe46a6225176839bbd7fbb8d99803c2693939b4e74634fac34cc not found: ID does not exist"
Dec 16 09:01:38 crc kubenswrapper[4823]: I1216 09:01:38.718378 4823 scope.go:117] "RemoveContainer" containerID="3316eb77edd9ac9ecbe03a940e26d7211ec411fe27b5e5614a92680b1439b37f"
Dec 16 09:01:38 crc kubenswrapper[4823]: E1216 09:01:38.718729 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3316eb77edd9ac9ecbe03a940e26d7211ec411fe27b5e5614a92680b1439b37f\": container with ID starting with 3316eb77edd9ac9ecbe03a940e26d7211ec411fe27b5e5614a92680b1439b37f not found: ID does not exist" containerID="3316eb77edd9ac9ecbe03a940e26d7211ec411fe27b5e5614a92680b1439b37f"
Dec 16 09:01:38 crc kubenswrapper[4823]: I1216 09:01:38.718763 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3316eb77edd9ac9ecbe03a940e26d7211ec411fe27b5e5614a92680b1439b37f"} err="failed to get container status \"3316eb77edd9ac9ecbe03a940e26d7211ec411fe27b5e5614a92680b1439b37f\": rpc error: code = NotFound desc = could not find container \"3316eb77edd9ac9ecbe03a940e26d7211ec411fe27b5e5614a92680b1439b37f\": container with ID starting with 3316eb77edd9ac9ecbe03a940e26d7211ec411fe27b5e5614a92680b1439b37f not found: ID does not exist"
Dec 16 09:01:38 crc kubenswrapper[4823]: I1216 09:01:38.821324 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0"
Dec 16 09:01:39 crc kubenswrapper[4823]: I1216 09:01:39.785742 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1989b143-cd57-41c6-9174-e8067cbc491f" path="/var/lib/kubelet/pods/1989b143-cd57-41c6-9174-e8067cbc491f/volumes"
Dec 16 09:01:42 crc kubenswrapper[4823]: I1216 09:01:42.948155 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Dec 16 09:01:42 crc kubenswrapper[4823]: I1216 09:01:42.948212 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Dec 16 09:01:42 crc kubenswrapper[4823]: I1216 09:01:42.973710 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Dec 16 09:01:42 crc kubenswrapper[4823]: I1216 09:01:42.973806 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Dec 16 09:01:43 crc kubenswrapper[4823]: I1216 09:01:43.965380 4823 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="aabc53f0-5705-4d7f-bc7c-0b1fcfd75026" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.95:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 16 09:01:43 crc kubenswrapper[4823]: I1216 09:01:43.965390 4823 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="aabc53f0-5705-4d7f-bc7c-0b1fcfd75026" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.95:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Dec 16 09:01:44 crc kubenswrapper[4823]: I1216 09:01:44.055333 4823 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="860c91ee-4dcd-4a5e-abcb-325b31341951" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.96:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 16 09:01:44 crc kubenswrapper[4823]: I1216 09:01:44.055596 4823 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="860c91ee-4dcd-4a5e-abcb-325b31341951" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.96:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 16 09:01:46 crc kubenswrapper[4823]: I1216 09:01:46.033313 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-tjbg9"]
Dec 16 09:01:46 crc kubenswrapper[4823]: I1216 09:01:46.075106 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-tjbg9"]
Dec 16 09:01:47 crc kubenswrapper[4823]: I1216 09:01:47.785327 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb6dd2ee-cc9d-4aed-994a-59021ba71f47" path="/var/lib/kubelet/pods/fb6dd2ee-cc9d-4aed-994a-59021ba71f47/volumes"
Dec 16 09:01:49 crc kubenswrapper[4823]: I1216 09:01:49.977380 4823 scope.go:117] "RemoveContainer" containerID="a76d0259f26ac7f214ff187762c1d93800cea4b144dae321378712aba87a7fd6"
Dec 16 09:01:50 crc kubenswrapper[4823]: I1216 09:01:50.026687 4823 scope.go:117] "RemoveContainer" containerID="4ec0f7c2e134442b675d08bc3e4fc7f0885c233878469d1da39df7e1df12e44f"
Dec 16 09:01:50 crc kubenswrapper[4823]: I1216 09:01:50.054113 4823 scope.go:117] "RemoveContainer" containerID="26c7db70f6316543ff12cfb682c93e0642226e2bffa92bcddd444bcecc4ea3f0"
Dec 16 09:01:50 crc kubenswrapper[4823]: I1216 09:01:50.098706 4823 scope.go:117] "RemoveContainer" containerID="5414e83a96e344b0a96036db40557c25edcf33693ffa4f01634717a9f4f1781f"
Dec 16 09:01:53 crc kubenswrapper[4823]: I1216 09:01:53.957174 4823 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="aabc53f0-5705-4d7f-bc7c-0b1fcfd75026" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.95:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Dec 16 09:01:53 crc kubenswrapper[4823]: I1216 09:01:53.957246 4823 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="aabc53f0-5705-4d7f-bc7c-0b1fcfd75026" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.95:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Dec 16 09:01:54 crc kubenswrapper[4823]: I1216 09:01:54.057228 4823 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="860c91ee-4dcd-4a5e-abcb-325b31341951" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.96:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 16 09:01:54 crc kubenswrapper[4823]: I1216 09:01:54.057396 4823 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="860c91ee-4dcd-4a5e-abcb-325b31341951" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.96:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 16 09:01:55 crc kubenswrapper[4823]: I1216 09:01:55.779845 4823 generic.go:334] "Generic (PLEG): container finished" podID="35a08b79-3be2-4b2b-b41e-bb87c1f2e0d5" containerID="dbe718cd84e6bce36c7fc414bebbd801211bd5d70b48892ad3e8b383788cc6e2" exitCode=137
Dec 16 09:01:55 crc kubenswrapper[4823]: I1216 09:01:55.781876 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"35a08b79-3be2-4b2b-b41e-bb87c1f2e0d5","Type":"ContainerDied","Data":"dbe718cd84e6bce36c7fc414bebbd801211bd5d70b48892ad3e8b383788cc6e2"}
Dec 16 09:01:56 crc kubenswrapper[4823]: I1216 09:01:56.097946 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Dec 16 09:01:56 crc kubenswrapper[4823]: I1216 09:01:56.179140 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35a08b79-3be2-4b2b-b41e-bb87c1f2e0d5-combined-ca-bundle\") pod \"35a08b79-3be2-4b2b-b41e-bb87c1f2e0d5\" (UID: \"35a08b79-3be2-4b2b-b41e-bb87c1f2e0d5\") "
Dec 16 09:01:56 crc kubenswrapper[4823]: I1216 09:01:56.179221 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35a08b79-3be2-4b2b-b41e-bb87c1f2e0d5-config-data\") pod \"35a08b79-3be2-4b2b-b41e-bb87c1f2e0d5\" (UID: \"35a08b79-3be2-4b2b-b41e-bb87c1f2e0d5\") "
Dec 16 09:01:56 crc kubenswrapper[4823]: I1216 09:01:56.179265 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2x72q\" (UniqueName: \"kubernetes.io/projected/35a08b79-3be2-4b2b-b41e-bb87c1f2e0d5-kube-api-access-2x72q\") pod \"35a08b79-3be2-4b2b-b41e-bb87c1f2e0d5\" (UID: \"35a08b79-3be2-4b2b-b41e-bb87c1f2e0d5\") "
Dec 16 09:01:56 crc kubenswrapper[4823]: I1216 09:01:56.187060 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35a08b79-3be2-4b2b-b41e-bb87c1f2e0d5-kube-api-access-2x72q" (OuterVolumeSpecName: "kube-api-access-2x72q") pod "35a08b79-3be2-4b2b-b41e-bb87c1f2e0d5" (UID: "35a08b79-3be2-4b2b-b41e-bb87c1f2e0d5"). InnerVolumeSpecName "kube-api-access-2x72q". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 09:01:56 crc kubenswrapper[4823]: I1216 09:01:56.210201 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35a08b79-3be2-4b2b-b41e-bb87c1f2e0d5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "35a08b79-3be2-4b2b-b41e-bb87c1f2e0d5" (UID: "35a08b79-3be2-4b2b-b41e-bb87c1f2e0d5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 09:01:56 crc kubenswrapper[4823]: I1216 09:01:56.211142 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35a08b79-3be2-4b2b-b41e-bb87c1f2e0d5-config-data" (OuterVolumeSpecName: "config-data") pod "35a08b79-3be2-4b2b-b41e-bb87c1f2e0d5" (UID: "35a08b79-3be2-4b2b-b41e-bb87c1f2e0d5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 09:01:56 crc kubenswrapper[4823]: I1216 09:01:56.281920 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35a08b79-3be2-4b2b-b41e-bb87c1f2e0d5-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 16 09:01:56 crc kubenswrapper[4823]: I1216 09:01:56.281967 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35a08b79-3be2-4b2b-b41e-bb87c1f2e0d5-config-data\") on node \"crc\" DevicePath \"\""
Dec 16 09:01:56 crc kubenswrapper[4823]: I1216 09:01:56.281983 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2x72q\" (UniqueName: \"kubernetes.io/projected/35a08b79-3be2-4b2b-b41e-bb87c1f2e0d5-kube-api-access-2x72q\") on node \"crc\" DevicePath \"\""
Dec 16 09:01:56 crc kubenswrapper[4823]: I1216 09:01:56.791968 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"35a08b79-3be2-4b2b-b41e-bb87c1f2e0d5","Type":"ContainerDied","Data":"5eafa09d6164dfaf5d1a6e135e7c5069c60ebe1743b6df880612ebf7d3f807e6"}
Dec 16 09:01:56 crc kubenswrapper[4823]: I1216 09:01:56.792033 4823 scope.go:117] "RemoveContainer" containerID="dbe718cd84e6bce36c7fc414bebbd801211bd5d70b48892ad3e8b383788cc6e2"
Dec 16 09:01:56 crc kubenswrapper[4823]: I1216 09:01:56.792093 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 16 09:01:56 crc kubenswrapper[4823]: I1216 09:01:56.834516 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 16 09:01:56 crc kubenswrapper[4823]: I1216 09:01:56.842583 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 16 09:01:56 crc kubenswrapper[4823]: I1216 09:01:56.867632 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 16 09:01:56 crc kubenswrapper[4823]: E1216 09:01:56.868189 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1989b143-cd57-41c6-9174-e8067cbc491f" containerName="registry-server" Dec 16 09:01:56 crc kubenswrapper[4823]: I1216 09:01:56.868210 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="1989b143-cd57-41c6-9174-e8067cbc491f" containerName="registry-server" Dec 16 09:01:56 crc kubenswrapper[4823]: E1216 09:01:56.868219 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35a08b79-3be2-4b2b-b41e-bb87c1f2e0d5" containerName="nova-cell1-novncproxy-novncproxy" Dec 16 09:01:56 crc kubenswrapper[4823]: I1216 09:01:56.868230 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="35a08b79-3be2-4b2b-b41e-bb87c1f2e0d5" containerName="nova-cell1-novncproxy-novncproxy" Dec 16 09:01:56 crc kubenswrapper[4823]: E1216 09:01:56.868247 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1989b143-cd57-41c6-9174-e8067cbc491f" containerName="extract-content" Dec 16 09:01:56 crc kubenswrapper[4823]: I1216 09:01:56.868256 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="1989b143-cd57-41c6-9174-e8067cbc491f" containerName="extract-content" Dec 16 09:01:56 crc kubenswrapper[4823]: E1216 09:01:56.868269 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1989b143-cd57-41c6-9174-e8067cbc491f" containerName="extract-utilities" Dec 16 09:01:56 crc 
kubenswrapper[4823]: I1216 09:01:56.868277 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="1989b143-cd57-41c6-9174-e8067cbc491f" containerName="extract-utilities" Dec 16 09:01:56 crc kubenswrapper[4823]: I1216 09:01:56.868477 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="1989b143-cd57-41c6-9174-e8067cbc491f" containerName="registry-server" Dec 16 09:01:56 crc kubenswrapper[4823]: I1216 09:01:56.868497 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="35a08b79-3be2-4b2b-b41e-bb87c1f2e0d5" containerName="nova-cell1-novncproxy-novncproxy" Dec 16 09:01:56 crc kubenswrapper[4823]: I1216 09:01:56.869316 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 16 09:01:56 crc kubenswrapper[4823]: I1216 09:01:56.871452 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 16 09:01:56 crc kubenswrapper[4823]: I1216 09:01:56.872042 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Dec 16 09:01:56 crc kubenswrapper[4823]: I1216 09:01:56.872274 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Dec 16 09:01:56 crc kubenswrapper[4823]: I1216 09:01:56.877055 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 16 09:01:56 crc kubenswrapper[4823]: I1216 09:01:56.996745 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/868b7d1a-5039-4d72-9a41-d8e57b1df5d4-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"868b7d1a-5039-4d72-9a41-d8e57b1df5d4\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 09:01:56 crc kubenswrapper[4823]: I1216 09:01:56.996827 4823 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/868b7d1a-5039-4d72-9a41-d8e57b1df5d4-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"868b7d1a-5039-4d72-9a41-d8e57b1df5d4\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 09:01:56 crc kubenswrapper[4823]: I1216 09:01:56.996884 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/868b7d1a-5039-4d72-9a41-d8e57b1df5d4-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"868b7d1a-5039-4d72-9a41-d8e57b1df5d4\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 09:01:56 crc kubenswrapper[4823]: I1216 09:01:56.996909 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fj2jc\" (UniqueName: \"kubernetes.io/projected/868b7d1a-5039-4d72-9a41-d8e57b1df5d4-kube-api-access-fj2jc\") pod \"nova-cell1-novncproxy-0\" (UID: \"868b7d1a-5039-4d72-9a41-d8e57b1df5d4\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 09:01:56 crc kubenswrapper[4823]: I1216 09:01:56.997013 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/868b7d1a-5039-4d72-9a41-d8e57b1df5d4-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"868b7d1a-5039-4d72-9a41-d8e57b1df5d4\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 09:01:57 crc kubenswrapper[4823]: I1216 09:01:57.098884 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/868b7d1a-5039-4d72-9a41-d8e57b1df5d4-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"868b7d1a-5039-4d72-9a41-d8e57b1df5d4\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 09:01:57 crc kubenswrapper[4823]: I1216 09:01:57.099005 4823 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/868b7d1a-5039-4d72-9a41-d8e57b1df5d4-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"868b7d1a-5039-4d72-9a41-d8e57b1df5d4\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 09:01:57 crc kubenswrapper[4823]: I1216 09:01:57.099112 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/868b7d1a-5039-4d72-9a41-d8e57b1df5d4-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"868b7d1a-5039-4d72-9a41-d8e57b1df5d4\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 09:01:57 crc kubenswrapper[4823]: I1216 09:01:57.099173 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/868b7d1a-5039-4d72-9a41-d8e57b1df5d4-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"868b7d1a-5039-4d72-9a41-d8e57b1df5d4\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 09:01:57 crc kubenswrapper[4823]: I1216 09:01:57.099199 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fj2jc\" (UniqueName: \"kubernetes.io/projected/868b7d1a-5039-4d72-9a41-d8e57b1df5d4-kube-api-access-fj2jc\") pod \"nova-cell1-novncproxy-0\" (UID: \"868b7d1a-5039-4d72-9a41-d8e57b1df5d4\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 09:01:57 crc kubenswrapper[4823]: I1216 09:01:57.102950 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/868b7d1a-5039-4d72-9a41-d8e57b1df5d4-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"868b7d1a-5039-4d72-9a41-d8e57b1df5d4\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 09:01:57 crc kubenswrapper[4823]: I1216 09:01:57.103111 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/868b7d1a-5039-4d72-9a41-d8e57b1df5d4-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"868b7d1a-5039-4d72-9a41-d8e57b1df5d4\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 09:01:57 crc kubenswrapper[4823]: I1216 09:01:57.103842 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/868b7d1a-5039-4d72-9a41-d8e57b1df5d4-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"868b7d1a-5039-4d72-9a41-d8e57b1df5d4\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 09:01:57 crc kubenswrapper[4823]: I1216 09:01:57.110140 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/868b7d1a-5039-4d72-9a41-d8e57b1df5d4-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"868b7d1a-5039-4d72-9a41-d8e57b1df5d4\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 09:01:57 crc kubenswrapper[4823]: I1216 09:01:57.117819 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fj2jc\" (UniqueName: \"kubernetes.io/projected/868b7d1a-5039-4d72-9a41-d8e57b1df5d4-kube-api-access-fj2jc\") pod \"nova-cell1-novncproxy-0\" (UID: \"868b7d1a-5039-4d72-9a41-d8e57b1df5d4\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 09:01:57 crc kubenswrapper[4823]: I1216 09:01:57.195402 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 16 09:01:57 crc kubenswrapper[4823]: I1216 09:01:57.661740 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 16 09:01:57 crc kubenswrapper[4823]: I1216 09:01:57.783670 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35a08b79-3be2-4b2b-b41e-bb87c1f2e0d5" path="/var/lib/kubelet/pods/35a08b79-3be2-4b2b-b41e-bb87c1f2e0d5/volumes" Dec 16 09:01:57 crc kubenswrapper[4823]: I1216 09:01:57.806096 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"868b7d1a-5039-4d72-9a41-d8e57b1df5d4","Type":"ContainerStarted","Data":"28f41dbbcfe435b105f27ca14226ca13d0adc0ff95bbc1d708807284cd33f631"} Dec 16 09:01:58 crc kubenswrapper[4823]: I1216 09:01:58.857344 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"868b7d1a-5039-4d72-9a41-d8e57b1df5d4","Type":"ContainerStarted","Data":"e6f71a18226db71c5f87b17ab03484664718f29d8f13ce063bf0655bdf85162f"} Dec 16 09:01:58 crc kubenswrapper[4823]: I1216 09:01:58.892159 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.892133843 podStartE2EDuration="2.892133843s" podCreationTimestamp="2025-12-16 09:01:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 09:01:58.888105948 +0000 UTC m=+7597.376672091" watchObservedRunningTime="2025-12-16 09:01:58.892133843 +0000 UTC m=+7597.380699966" Dec 16 09:02:01 crc kubenswrapper[4823]: I1216 09:02:01.870621 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 16 09:02:01 crc kubenswrapper[4823]: I1216 09:02:01.885426 4823 generic.go:334] "Generic (PLEG): container finished" podID="63e56d7b-61f5-4bd7-bac6-48f147bf1b2e" containerID="4758f43e184672ce4a603509d692a538d609cb8b49c36f6effcc5226089bcbd2" exitCode=137 Dec 16 09:02:01 crc kubenswrapper[4823]: I1216 09:02:01.885755 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"63e56d7b-61f5-4bd7-bac6-48f147bf1b2e","Type":"ContainerDied","Data":"4758f43e184672ce4a603509d692a538d609cb8b49c36f6effcc5226089bcbd2"} Dec 16 09:02:01 crc kubenswrapper[4823]: I1216 09:02:01.885817 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"63e56d7b-61f5-4bd7-bac6-48f147bf1b2e","Type":"ContainerDied","Data":"7d6d85777ccf63c7aa3179a7d444a8d776eb5409749bf97b72bb553740b1ef24"} Dec 16 09:02:01 crc kubenswrapper[4823]: I1216 09:02:01.885845 4823 scope.go:117] "RemoveContainer" containerID="4758f43e184672ce4a603509d692a538d609cb8b49c36f6effcc5226089bcbd2" Dec 16 09:02:01 crc kubenswrapper[4823]: I1216 09:02:01.885865 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 16 09:02:01 crc kubenswrapper[4823]: I1216 09:02:01.929239 4823 scope.go:117] "RemoveContainer" containerID="4758f43e184672ce4a603509d692a538d609cb8b49c36f6effcc5226089bcbd2" Dec 16 09:02:01 crc kubenswrapper[4823]: E1216 09:02:01.935543 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4758f43e184672ce4a603509d692a538d609cb8b49c36f6effcc5226089bcbd2\": container with ID starting with 4758f43e184672ce4a603509d692a538d609cb8b49c36f6effcc5226089bcbd2 not found: ID does not exist" containerID="4758f43e184672ce4a603509d692a538d609cb8b49c36f6effcc5226089bcbd2" Dec 16 09:02:01 crc kubenswrapper[4823]: I1216 09:02:01.935615 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4758f43e184672ce4a603509d692a538d609cb8b49c36f6effcc5226089bcbd2"} err="failed to get container status \"4758f43e184672ce4a603509d692a538d609cb8b49c36f6effcc5226089bcbd2\": rpc error: code = NotFound desc = could not find container \"4758f43e184672ce4a603509d692a538d609cb8b49c36f6effcc5226089bcbd2\": container with ID starting with 4758f43e184672ce4a603509d692a538d609cb8b49c36f6effcc5226089bcbd2 not found: ID does not exist" Dec 16 09:02:01 crc kubenswrapper[4823]: I1216 09:02:01.997806 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63e56d7b-61f5-4bd7-bac6-48f147bf1b2e-config-data\") pod \"63e56d7b-61f5-4bd7-bac6-48f147bf1b2e\" (UID: \"63e56d7b-61f5-4bd7-bac6-48f147bf1b2e\") " Dec 16 09:02:01 crc kubenswrapper[4823]: I1216 09:02:01.997881 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n2vpr\" (UniqueName: \"kubernetes.io/projected/63e56d7b-61f5-4bd7-bac6-48f147bf1b2e-kube-api-access-n2vpr\") pod \"63e56d7b-61f5-4bd7-bac6-48f147bf1b2e\" (UID: 
\"63e56d7b-61f5-4bd7-bac6-48f147bf1b2e\") " Dec 16 09:02:01 crc kubenswrapper[4823]: I1216 09:02:01.997979 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63e56d7b-61f5-4bd7-bac6-48f147bf1b2e-combined-ca-bundle\") pod \"63e56d7b-61f5-4bd7-bac6-48f147bf1b2e\" (UID: \"63e56d7b-61f5-4bd7-bac6-48f147bf1b2e\") " Dec 16 09:02:02 crc kubenswrapper[4823]: I1216 09:02:02.003109 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63e56d7b-61f5-4bd7-bac6-48f147bf1b2e-kube-api-access-n2vpr" (OuterVolumeSpecName: "kube-api-access-n2vpr") pod "63e56d7b-61f5-4bd7-bac6-48f147bf1b2e" (UID: "63e56d7b-61f5-4bd7-bac6-48f147bf1b2e"). InnerVolumeSpecName "kube-api-access-n2vpr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:02:02 crc kubenswrapper[4823]: I1216 09:02:02.023798 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63e56d7b-61f5-4bd7-bac6-48f147bf1b2e-config-data" (OuterVolumeSpecName: "config-data") pod "63e56d7b-61f5-4bd7-bac6-48f147bf1b2e" (UID: "63e56d7b-61f5-4bd7-bac6-48f147bf1b2e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:02:02 crc kubenswrapper[4823]: I1216 09:02:02.024894 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63e56d7b-61f5-4bd7-bac6-48f147bf1b2e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "63e56d7b-61f5-4bd7-bac6-48f147bf1b2e" (UID: "63e56d7b-61f5-4bd7-bac6-48f147bf1b2e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:02:02 crc kubenswrapper[4823]: I1216 09:02:02.100215 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63e56d7b-61f5-4bd7-bac6-48f147bf1b2e-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 09:02:02 crc kubenswrapper[4823]: I1216 09:02:02.100255 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n2vpr\" (UniqueName: \"kubernetes.io/projected/63e56d7b-61f5-4bd7-bac6-48f147bf1b2e-kube-api-access-n2vpr\") on node \"crc\" DevicePath \"\"" Dec 16 09:02:02 crc kubenswrapper[4823]: I1216 09:02:02.100270 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63e56d7b-61f5-4bd7-bac6-48f147bf1b2e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 09:02:02 crc kubenswrapper[4823]: I1216 09:02:02.195610 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 16 09:02:02 crc kubenswrapper[4823]: I1216 09:02:02.238583 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 16 09:02:02 crc kubenswrapper[4823]: I1216 09:02:02.253820 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 16 09:02:02 crc kubenswrapper[4823]: I1216 09:02:02.262470 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 16 09:02:02 crc kubenswrapper[4823]: E1216 09:02:02.263075 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63e56d7b-61f5-4bd7-bac6-48f147bf1b2e" containerName="nova-scheduler-scheduler" Dec 16 09:02:02 crc kubenswrapper[4823]: I1216 09:02:02.263101 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="63e56d7b-61f5-4bd7-bac6-48f147bf1b2e" containerName="nova-scheduler-scheduler" Dec 16 09:02:02 crc kubenswrapper[4823]: I1216 09:02:02.263357 4823 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="63e56d7b-61f5-4bd7-bac6-48f147bf1b2e" containerName="nova-scheduler-scheduler" Dec 16 09:02:02 crc kubenswrapper[4823]: I1216 09:02:02.264262 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 16 09:02:02 crc kubenswrapper[4823]: I1216 09:02:02.267268 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 16 09:02:02 crc kubenswrapper[4823]: I1216 09:02:02.271754 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 16 09:02:02 crc kubenswrapper[4823]: I1216 09:02:02.302490 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dcc60af-b15a-4362-8b2a-378fbdff02e0-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"5dcc60af-b15a-4362-8b2a-378fbdff02e0\") " pod="openstack/nova-scheduler-0" Dec 16 09:02:02 crc kubenswrapper[4823]: I1216 09:02:02.302582 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dcc60af-b15a-4362-8b2a-378fbdff02e0-config-data\") pod \"nova-scheduler-0\" (UID: \"5dcc60af-b15a-4362-8b2a-378fbdff02e0\") " pod="openstack/nova-scheduler-0" Dec 16 09:02:02 crc kubenswrapper[4823]: I1216 09:02:02.302908 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4sgn\" (UniqueName: \"kubernetes.io/projected/5dcc60af-b15a-4362-8b2a-378fbdff02e0-kube-api-access-d4sgn\") pod \"nova-scheduler-0\" (UID: \"5dcc60af-b15a-4362-8b2a-378fbdff02e0\") " pod="openstack/nova-scheduler-0" Dec 16 09:02:02 crc kubenswrapper[4823]: I1216 09:02:02.404671 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/5dcc60af-b15a-4362-8b2a-378fbdff02e0-config-data\") pod \"nova-scheduler-0\" (UID: \"5dcc60af-b15a-4362-8b2a-378fbdff02e0\") " pod="openstack/nova-scheduler-0" Dec 16 09:02:02 crc kubenswrapper[4823]: I1216 09:02:02.404865 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4sgn\" (UniqueName: \"kubernetes.io/projected/5dcc60af-b15a-4362-8b2a-378fbdff02e0-kube-api-access-d4sgn\") pod \"nova-scheduler-0\" (UID: \"5dcc60af-b15a-4362-8b2a-378fbdff02e0\") " pod="openstack/nova-scheduler-0" Dec 16 09:02:02 crc kubenswrapper[4823]: I1216 09:02:02.404968 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dcc60af-b15a-4362-8b2a-378fbdff02e0-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"5dcc60af-b15a-4362-8b2a-378fbdff02e0\") " pod="openstack/nova-scheduler-0" Dec 16 09:02:02 crc kubenswrapper[4823]: I1216 09:02:02.408617 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dcc60af-b15a-4362-8b2a-378fbdff02e0-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"5dcc60af-b15a-4362-8b2a-378fbdff02e0\") " pod="openstack/nova-scheduler-0" Dec 16 09:02:02 crc kubenswrapper[4823]: I1216 09:02:02.408676 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dcc60af-b15a-4362-8b2a-378fbdff02e0-config-data\") pod \"nova-scheduler-0\" (UID: \"5dcc60af-b15a-4362-8b2a-378fbdff02e0\") " pod="openstack/nova-scheduler-0" Dec 16 09:02:02 crc kubenswrapper[4823]: I1216 09:02:02.420592 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4sgn\" (UniqueName: \"kubernetes.io/projected/5dcc60af-b15a-4362-8b2a-378fbdff02e0-kube-api-access-d4sgn\") pod \"nova-scheduler-0\" (UID: \"5dcc60af-b15a-4362-8b2a-378fbdff02e0\") " 
pod="openstack/nova-scheduler-0" Dec 16 09:02:02 crc kubenswrapper[4823]: I1216 09:02:02.619594 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 16 09:02:02 crc kubenswrapper[4823]: I1216 09:02:02.973554 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 16 09:02:02 crc kubenswrapper[4823]: I1216 09:02:02.974561 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 16 09:02:03 crc kubenswrapper[4823]: I1216 09:02:03.073003 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 16 09:02:03 crc kubenswrapper[4823]: I1216 09:02:03.782579 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63e56d7b-61f5-4bd7-bac6-48f147bf1b2e" path="/var/lib/kubelet/pods/63e56d7b-61f5-4bd7-bac6-48f147bf1b2e/volumes" Dec 16 09:02:03 crc kubenswrapper[4823]: I1216 09:02:03.906975 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5dcc60af-b15a-4362-8b2a-378fbdff02e0","Type":"ContainerStarted","Data":"93433386da1b5be2eb0555f8de99c00c097f30dfa7cd1412b2ae555d684fa1ea"} Dec 16 09:02:03 crc kubenswrapper[4823]: I1216 09:02:03.907041 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5dcc60af-b15a-4362-8b2a-378fbdff02e0","Type":"ContainerStarted","Data":"e4bca8c4650f70bd11608528bb6fc9e32073c57e176d666ada681d06cbe0c928"} Dec 16 09:02:03 crc kubenswrapper[4823]: I1216 09:02:03.925964 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=1.925889687 podStartE2EDuration="1.925889687s" podCreationTimestamp="2025-12-16 09:02:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 09:02:03.923583615 +0000 UTC 
m=+7602.412149748" watchObservedRunningTime="2025-12-16 09:02:03.925889687 +0000 UTC m=+7602.414455810" Dec 16 09:02:03 crc kubenswrapper[4823]: I1216 09:02:03.953278 4823 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="aabc53f0-5705-4d7f-bc7c-0b1fcfd75026" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.95:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 16 09:02:03 crc kubenswrapper[4823]: I1216 09:02:03.961227 4823 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="aabc53f0-5705-4d7f-bc7c-0b1fcfd75026" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.95:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 16 09:02:04 crc kubenswrapper[4823]: I1216 09:02:04.056231 4823 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="860c91ee-4dcd-4a5e-abcb-325b31341951" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.96:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 16 09:02:04 crc kubenswrapper[4823]: I1216 09:02:04.056552 4823 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="860c91ee-4dcd-4a5e-abcb-325b31341951" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.96:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 16 09:02:07 crc kubenswrapper[4823]: I1216 09:02:07.196640 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Dec 16 09:02:07 crc kubenswrapper[4823]: I1216 09:02:07.228178 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Dec 16 09:02:07 crc kubenswrapper[4823]: I1216 09:02:07.620667 4823 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 16 09:02:07 crc kubenswrapper[4823]: I1216 09:02:07.963548 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Dec 16 09:02:08 crc kubenswrapper[4823]: I1216 09:02:08.145866 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-vn5cc"] Dec 16 09:02:08 crc kubenswrapper[4823]: I1216 09:02:08.148184 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-vn5cc" Dec 16 09:02:08 crc kubenswrapper[4823]: I1216 09:02:08.157772 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Dec 16 09:02:08 crc kubenswrapper[4823]: I1216 09:02:08.167484 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Dec 16 09:02:08 crc kubenswrapper[4823]: I1216 09:02:08.169956 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-vn5cc"] Dec 16 09:02:08 crc kubenswrapper[4823]: I1216 09:02:08.242585 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzqz8\" (UniqueName: \"kubernetes.io/projected/5e34f974-7d80-436a-b8e6-68faa3b7db70-kube-api-access-kzqz8\") pod \"nova-cell1-cell-mapping-vn5cc\" (UID: \"5e34f974-7d80-436a-b8e6-68faa3b7db70\") " pod="openstack/nova-cell1-cell-mapping-vn5cc" Dec 16 09:02:08 crc kubenswrapper[4823]: I1216 09:02:08.242635 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e34f974-7d80-436a-b8e6-68faa3b7db70-scripts\") pod \"nova-cell1-cell-mapping-vn5cc\" (UID: \"5e34f974-7d80-436a-b8e6-68faa3b7db70\") " pod="openstack/nova-cell1-cell-mapping-vn5cc" Dec 16 09:02:08 crc kubenswrapper[4823]: I1216 
09:02:08.242659 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e34f974-7d80-436a-b8e6-68faa3b7db70-config-data\") pod \"nova-cell1-cell-mapping-vn5cc\" (UID: \"5e34f974-7d80-436a-b8e6-68faa3b7db70\") " pod="openstack/nova-cell1-cell-mapping-vn5cc" Dec 16 09:02:08 crc kubenswrapper[4823]: I1216 09:02:08.242741 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e34f974-7d80-436a-b8e6-68faa3b7db70-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-vn5cc\" (UID: \"5e34f974-7d80-436a-b8e6-68faa3b7db70\") " pod="openstack/nova-cell1-cell-mapping-vn5cc" Dec 16 09:02:08 crc kubenswrapper[4823]: I1216 09:02:08.344448 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e34f974-7d80-436a-b8e6-68faa3b7db70-config-data\") pod \"nova-cell1-cell-mapping-vn5cc\" (UID: \"5e34f974-7d80-436a-b8e6-68faa3b7db70\") " pod="openstack/nova-cell1-cell-mapping-vn5cc" Dec 16 09:02:08 crc kubenswrapper[4823]: I1216 09:02:08.344600 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e34f974-7d80-436a-b8e6-68faa3b7db70-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-vn5cc\" (UID: \"5e34f974-7d80-436a-b8e6-68faa3b7db70\") " pod="openstack/nova-cell1-cell-mapping-vn5cc" Dec 16 09:02:08 crc kubenswrapper[4823]: I1216 09:02:08.344720 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kzqz8\" (UniqueName: \"kubernetes.io/projected/5e34f974-7d80-436a-b8e6-68faa3b7db70-kube-api-access-kzqz8\") pod \"nova-cell1-cell-mapping-vn5cc\" (UID: \"5e34f974-7d80-436a-b8e6-68faa3b7db70\") " pod="openstack/nova-cell1-cell-mapping-vn5cc" Dec 16 09:02:08 crc kubenswrapper[4823]: I1216 
09:02:08.344756 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e34f974-7d80-436a-b8e6-68faa3b7db70-scripts\") pod \"nova-cell1-cell-mapping-vn5cc\" (UID: \"5e34f974-7d80-436a-b8e6-68faa3b7db70\") " pod="openstack/nova-cell1-cell-mapping-vn5cc" Dec 16 09:02:08 crc kubenswrapper[4823]: I1216 09:02:08.356809 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e34f974-7d80-436a-b8e6-68faa3b7db70-scripts\") pod \"nova-cell1-cell-mapping-vn5cc\" (UID: \"5e34f974-7d80-436a-b8e6-68faa3b7db70\") " pod="openstack/nova-cell1-cell-mapping-vn5cc" Dec 16 09:02:08 crc kubenswrapper[4823]: I1216 09:02:08.357259 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e34f974-7d80-436a-b8e6-68faa3b7db70-config-data\") pod \"nova-cell1-cell-mapping-vn5cc\" (UID: \"5e34f974-7d80-436a-b8e6-68faa3b7db70\") " pod="openstack/nova-cell1-cell-mapping-vn5cc" Dec 16 09:02:08 crc kubenswrapper[4823]: I1216 09:02:08.358808 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e34f974-7d80-436a-b8e6-68faa3b7db70-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-vn5cc\" (UID: \"5e34f974-7d80-436a-b8e6-68faa3b7db70\") " pod="openstack/nova-cell1-cell-mapping-vn5cc" Dec 16 09:02:08 crc kubenswrapper[4823]: I1216 09:02:08.372652 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzqz8\" (UniqueName: \"kubernetes.io/projected/5e34f974-7d80-436a-b8e6-68faa3b7db70-kube-api-access-kzqz8\") pod \"nova-cell1-cell-mapping-vn5cc\" (UID: \"5e34f974-7d80-436a-b8e6-68faa3b7db70\") " pod="openstack/nova-cell1-cell-mapping-vn5cc" Dec 16 09:02:08 crc kubenswrapper[4823]: I1216 09:02:08.474790 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-vn5cc" Dec 16 09:02:08 crc kubenswrapper[4823]: I1216 09:02:08.933078 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-vn5cc"] Dec 16 09:02:08 crc kubenswrapper[4823]: I1216 09:02:08.956944 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-vn5cc" event={"ID":"5e34f974-7d80-436a-b8e6-68faa3b7db70","Type":"ContainerStarted","Data":"9516d5b494a78e3de2ad001a8ea3c7137fc63e5a6b34cb540894f3d424bc9f99"} Dec 16 09:02:09 crc kubenswrapper[4823]: I1216 09:02:09.966788 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-vn5cc" event={"ID":"5e34f974-7d80-436a-b8e6-68faa3b7db70","Type":"ContainerStarted","Data":"f591fd2607a4fd35339882b147b37bb251be764b4a5f7303532620e154218301"} Dec 16 09:02:12 crc kubenswrapper[4823]: I1216 09:02:12.620423 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 16 09:02:12 crc kubenswrapper[4823]: I1216 09:02:12.658706 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 16 09:02:12 crc kubenswrapper[4823]: I1216 09:02:12.697420 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-vn5cc" podStartSLOduration=4.697399033 podStartE2EDuration="4.697399033s" podCreationTimestamp="2025-12-16 09:02:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 09:02:09.992679792 +0000 UTC m=+7608.481245915" watchObservedRunningTime="2025-12-16 09:02:12.697399033 +0000 UTC m=+7611.185965156" Dec 16 09:02:13 crc kubenswrapper[4823]: I1216 09:02:13.026318 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 16 09:02:13 crc kubenswrapper[4823]: I1216 
09:02:13.955169 4823 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="aabc53f0-5705-4d7f-bc7c-0b1fcfd75026" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.95:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 16 09:02:13 crc kubenswrapper[4823]: I1216 09:02:13.955190 4823 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="aabc53f0-5705-4d7f-bc7c-0b1fcfd75026" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.95:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 16 09:02:14 crc kubenswrapper[4823]: I1216 09:02:14.055307 4823 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="860c91ee-4dcd-4a5e-abcb-325b31341951" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.96:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 16 09:02:14 crc kubenswrapper[4823]: I1216 09:02:14.055653 4823 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="860c91ee-4dcd-4a5e-abcb-325b31341951" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.96:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 16 09:02:15 crc kubenswrapper[4823]: I1216 09:02:15.017696 4823 generic.go:334] "Generic (PLEG): container finished" podID="5e34f974-7d80-436a-b8e6-68faa3b7db70" containerID="f591fd2607a4fd35339882b147b37bb251be764b4a5f7303532620e154218301" exitCode=0 Dec 16 09:02:15 crc kubenswrapper[4823]: I1216 09:02:15.017741 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-vn5cc" event={"ID":"5e34f974-7d80-436a-b8e6-68faa3b7db70","Type":"ContainerDied","Data":"f591fd2607a4fd35339882b147b37bb251be764b4a5f7303532620e154218301"} Dec 16 
09:02:16 crc kubenswrapper[4823]: I1216 09:02:16.390904 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-vn5cc" Dec 16 09:02:16 crc kubenswrapper[4823]: I1216 09:02:16.553215 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e34f974-7d80-436a-b8e6-68faa3b7db70-scripts\") pod \"5e34f974-7d80-436a-b8e6-68faa3b7db70\" (UID: \"5e34f974-7d80-436a-b8e6-68faa3b7db70\") " Dec 16 09:02:16 crc kubenswrapper[4823]: I1216 09:02:16.553377 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e34f974-7d80-436a-b8e6-68faa3b7db70-combined-ca-bundle\") pod \"5e34f974-7d80-436a-b8e6-68faa3b7db70\" (UID: \"5e34f974-7d80-436a-b8e6-68faa3b7db70\") " Dec 16 09:02:16 crc kubenswrapper[4823]: I1216 09:02:16.553515 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e34f974-7d80-436a-b8e6-68faa3b7db70-config-data\") pod \"5e34f974-7d80-436a-b8e6-68faa3b7db70\" (UID: \"5e34f974-7d80-436a-b8e6-68faa3b7db70\") " Dec 16 09:02:16 crc kubenswrapper[4823]: I1216 09:02:16.553576 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kzqz8\" (UniqueName: \"kubernetes.io/projected/5e34f974-7d80-436a-b8e6-68faa3b7db70-kube-api-access-kzqz8\") pod \"5e34f974-7d80-436a-b8e6-68faa3b7db70\" (UID: \"5e34f974-7d80-436a-b8e6-68faa3b7db70\") " Dec 16 09:02:16 crc kubenswrapper[4823]: I1216 09:02:16.559255 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e34f974-7d80-436a-b8e6-68faa3b7db70-scripts" (OuterVolumeSpecName: "scripts") pod "5e34f974-7d80-436a-b8e6-68faa3b7db70" (UID: "5e34f974-7d80-436a-b8e6-68faa3b7db70"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:02:16 crc kubenswrapper[4823]: I1216 09:02:16.562286 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e34f974-7d80-436a-b8e6-68faa3b7db70-kube-api-access-kzqz8" (OuterVolumeSpecName: "kube-api-access-kzqz8") pod "5e34f974-7d80-436a-b8e6-68faa3b7db70" (UID: "5e34f974-7d80-436a-b8e6-68faa3b7db70"). InnerVolumeSpecName "kube-api-access-kzqz8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:02:16 crc kubenswrapper[4823]: I1216 09:02:16.579209 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e34f974-7d80-436a-b8e6-68faa3b7db70-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5e34f974-7d80-436a-b8e6-68faa3b7db70" (UID: "5e34f974-7d80-436a-b8e6-68faa3b7db70"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:02:16 crc kubenswrapper[4823]: I1216 09:02:16.583676 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e34f974-7d80-436a-b8e6-68faa3b7db70-config-data" (OuterVolumeSpecName: "config-data") pod "5e34f974-7d80-436a-b8e6-68faa3b7db70" (UID: "5e34f974-7d80-436a-b8e6-68faa3b7db70"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:02:16 crc kubenswrapper[4823]: I1216 09:02:16.655738 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kzqz8\" (UniqueName: \"kubernetes.io/projected/5e34f974-7d80-436a-b8e6-68faa3b7db70-kube-api-access-kzqz8\") on node \"crc\" DevicePath \"\"" Dec 16 09:02:16 crc kubenswrapper[4823]: I1216 09:02:16.655773 4823 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e34f974-7d80-436a-b8e6-68faa3b7db70-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 09:02:16 crc kubenswrapper[4823]: I1216 09:02:16.655805 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e34f974-7d80-436a-b8e6-68faa3b7db70-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 09:02:16 crc kubenswrapper[4823]: I1216 09:02:16.655815 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e34f974-7d80-436a-b8e6-68faa3b7db70-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 09:02:17 crc kubenswrapper[4823]: I1216 09:02:17.039130 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-vn5cc" event={"ID":"5e34f974-7d80-436a-b8e6-68faa3b7db70","Type":"ContainerDied","Data":"9516d5b494a78e3de2ad001a8ea3c7137fc63e5a6b34cb540894f3d424bc9f99"} Dec 16 09:02:17 crc kubenswrapper[4823]: I1216 09:02:17.039426 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9516d5b494a78e3de2ad001a8ea3c7137fc63e5a6b34cb540894f3d424bc9f99" Dec 16 09:02:17 crc kubenswrapper[4823]: I1216 09:02:17.039221 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-vn5cc" Dec 16 09:02:17 crc kubenswrapper[4823]: I1216 09:02:17.302687 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 16 09:02:17 crc kubenswrapper[4823]: I1216 09:02:17.302940 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="5dcc60af-b15a-4362-8b2a-378fbdff02e0" containerName="nova-scheduler-scheduler" containerID="cri-o://93433386da1b5be2eb0555f8de99c00c097f30dfa7cd1412b2ae555d684fa1ea" gracePeriod=30 Dec 16 09:02:17 crc kubenswrapper[4823]: I1216 09:02:17.357676 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 16 09:02:17 crc kubenswrapper[4823]: I1216 09:02:17.357984 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="aabc53f0-5705-4d7f-bc7c-0b1fcfd75026" containerName="nova-metadata-log" containerID="cri-o://76cfd7427b391b74b49e284f41e6deeef50c653e65e3743f36a051a6d001ab71" gracePeriod=30 Dec 16 09:02:17 crc kubenswrapper[4823]: I1216 09:02:17.358082 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="aabc53f0-5705-4d7f-bc7c-0b1fcfd75026" containerName="nova-metadata-metadata" containerID="cri-o://3059c456084fc689951e3eb5aee2dfc53406f37d52d93a1078de1666042d7578" gracePeriod=30 Dec 16 09:02:17 crc kubenswrapper[4823]: I1216 09:02:17.373850 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 16 09:02:17 crc kubenswrapper[4823]: I1216 09:02:17.374180 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="860c91ee-4dcd-4a5e-abcb-325b31341951" containerName="nova-api-log" containerID="cri-o://9db6a3d18d3fd021fa6186a76ddd9003796fbf7d1d269a9473a51744b3077d51" gracePeriod=30 Dec 16 09:02:17 crc kubenswrapper[4823]: I1216 09:02:17.374268 4823 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="860c91ee-4dcd-4a5e-abcb-325b31341951" containerName="nova-api-api" containerID="cri-o://3973cc71f5ebd11807529940ab4ed991048dc0d85b38c7bc4a76cdcb2ea7c145" gracePeriod=30 Dec 16 09:02:17 crc kubenswrapper[4823]: E1216 09:02:17.622322 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="93433386da1b5be2eb0555f8de99c00c097f30dfa7cd1412b2ae555d684fa1ea" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 16 09:02:17 crc kubenswrapper[4823]: E1216 09:02:17.624867 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="93433386da1b5be2eb0555f8de99c00c097f30dfa7cd1412b2ae555d684fa1ea" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 16 09:02:17 crc kubenswrapper[4823]: E1216 09:02:17.629329 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="93433386da1b5be2eb0555f8de99c00c097f30dfa7cd1412b2ae555d684fa1ea" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 16 09:02:17 crc kubenswrapper[4823]: E1216 09:02:17.629383 4823 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="5dcc60af-b15a-4362-8b2a-378fbdff02e0" containerName="nova-scheduler-scheduler" Dec 16 09:02:18 crc kubenswrapper[4823]: I1216 09:02:18.056669 4823 generic.go:334] "Generic (PLEG): container finished" podID="aabc53f0-5705-4d7f-bc7c-0b1fcfd75026" 
containerID="76cfd7427b391b74b49e284f41e6deeef50c653e65e3743f36a051a6d001ab71" exitCode=143 Dec 16 09:02:18 crc kubenswrapper[4823]: I1216 09:02:18.056736 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"aabc53f0-5705-4d7f-bc7c-0b1fcfd75026","Type":"ContainerDied","Data":"76cfd7427b391b74b49e284f41e6deeef50c653e65e3743f36a051a6d001ab71"} Dec 16 09:02:18 crc kubenswrapper[4823]: I1216 09:02:18.058943 4823 generic.go:334] "Generic (PLEG): container finished" podID="860c91ee-4dcd-4a5e-abcb-325b31341951" containerID="9db6a3d18d3fd021fa6186a76ddd9003796fbf7d1d269a9473a51744b3077d51" exitCode=143 Dec 16 09:02:18 crc kubenswrapper[4823]: I1216 09:02:18.058969 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"860c91ee-4dcd-4a5e-abcb-325b31341951","Type":"ContainerDied","Data":"9db6a3d18d3fd021fa6186a76ddd9003796fbf7d1d269a9473a51744b3077d51"} Dec 16 09:02:22 crc kubenswrapper[4823]: E1216 09:02:22.622307 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="93433386da1b5be2eb0555f8de99c00c097f30dfa7cd1412b2ae555d684fa1ea" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 16 09:02:22 crc kubenswrapper[4823]: E1216 09:02:22.625773 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="93433386da1b5be2eb0555f8de99c00c097f30dfa7cd1412b2ae555d684fa1ea" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 16 09:02:22 crc kubenswrapper[4823]: E1216 09:02:22.627473 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="93433386da1b5be2eb0555f8de99c00c097f30dfa7cd1412b2ae555d684fa1ea" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 16 09:02:22 crc kubenswrapper[4823]: E1216 09:02:22.627535 4823 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="5dcc60af-b15a-4362-8b2a-378fbdff02e0" containerName="nova-scheduler-scheduler" Dec 16 09:02:27 crc kubenswrapper[4823]: E1216 09:02:27.621780 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="93433386da1b5be2eb0555f8de99c00c097f30dfa7cd1412b2ae555d684fa1ea" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 16 09:02:27 crc kubenswrapper[4823]: E1216 09:02:27.623453 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="93433386da1b5be2eb0555f8de99c00c097f30dfa7cd1412b2ae555d684fa1ea" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 16 09:02:27 crc kubenswrapper[4823]: E1216 09:02:27.624521 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="93433386da1b5be2eb0555f8de99c00c097f30dfa7cd1412b2ae555d684fa1ea" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 16 09:02:27 crc kubenswrapper[4823]: E1216 09:02:27.624560 4823 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" 
podUID="5dcc60af-b15a-4362-8b2a-378fbdff02e0" containerName="nova-scheduler-scheduler" Dec 16 09:02:31 crc kubenswrapper[4823]: I1216 09:02:31.320493 4823 generic.go:334] "Generic (PLEG): container finished" podID="860c91ee-4dcd-4a5e-abcb-325b31341951" containerID="3973cc71f5ebd11807529940ab4ed991048dc0d85b38c7bc4a76cdcb2ea7c145" exitCode=0 Dec 16 09:02:31 crc kubenswrapper[4823]: I1216 09:02:31.320568 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"860c91ee-4dcd-4a5e-abcb-325b31341951","Type":"ContainerDied","Data":"3973cc71f5ebd11807529940ab4ed991048dc0d85b38c7bc4a76cdcb2ea7c145"} Dec 16 09:02:31 crc kubenswrapper[4823]: I1216 09:02:31.323166 4823 generic.go:334] "Generic (PLEG): container finished" podID="aabc53f0-5705-4d7f-bc7c-0b1fcfd75026" containerID="3059c456084fc689951e3eb5aee2dfc53406f37d52d93a1078de1666042d7578" exitCode=0 Dec 16 09:02:31 crc kubenswrapper[4823]: I1216 09:02:31.323191 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"aabc53f0-5705-4d7f-bc7c-0b1fcfd75026","Type":"ContainerDied","Data":"3059c456084fc689951e3eb5aee2dfc53406f37d52d93a1078de1666042d7578"} Dec 16 09:02:31 crc kubenswrapper[4823]: I1216 09:02:31.323208 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"aabc53f0-5705-4d7f-bc7c-0b1fcfd75026","Type":"ContainerDied","Data":"4757fbb94fc41a5ffdf50d4116680bfecf9daa4921765d1f27f823c8f70b0ef8"} Dec 16 09:02:31 crc kubenswrapper[4823]: I1216 09:02:31.323219 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4757fbb94fc41a5ffdf50d4116680bfecf9daa4921765d1f27f823c8f70b0ef8" Dec 16 09:02:31 crc kubenswrapper[4823]: I1216 09:02:31.359742 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 16 09:02:31 crc kubenswrapper[4823]: I1216 09:02:31.463569 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 16 09:02:31 crc kubenswrapper[4823]: I1216 09:02:31.543138 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aabc53f0-5705-4d7f-bc7c-0b1fcfd75026-config-data\") pod \"aabc53f0-5705-4d7f-bc7c-0b1fcfd75026\" (UID: \"aabc53f0-5705-4d7f-bc7c-0b1fcfd75026\") " Dec 16 09:02:31 crc kubenswrapper[4823]: I1216 09:02:31.543942 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/aabc53f0-5705-4d7f-bc7c-0b1fcfd75026-nova-metadata-tls-certs\") pod \"aabc53f0-5705-4d7f-bc7c-0b1fcfd75026\" (UID: \"aabc53f0-5705-4d7f-bc7c-0b1fcfd75026\") " Dec 16 09:02:31 crc kubenswrapper[4823]: I1216 09:02:31.543990 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vhccp\" (UniqueName: \"kubernetes.io/projected/aabc53f0-5705-4d7f-bc7c-0b1fcfd75026-kube-api-access-vhccp\") pod \"aabc53f0-5705-4d7f-bc7c-0b1fcfd75026\" (UID: \"aabc53f0-5705-4d7f-bc7c-0b1fcfd75026\") " Dec 16 09:02:31 crc kubenswrapper[4823]: I1216 09:02:31.544052 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aabc53f0-5705-4d7f-bc7c-0b1fcfd75026-logs\") pod \"aabc53f0-5705-4d7f-bc7c-0b1fcfd75026\" (UID: \"aabc53f0-5705-4d7f-bc7c-0b1fcfd75026\") " Dec 16 09:02:31 crc kubenswrapper[4823]: I1216 09:02:31.544178 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aabc53f0-5705-4d7f-bc7c-0b1fcfd75026-combined-ca-bundle\") pod \"aabc53f0-5705-4d7f-bc7c-0b1fcfd75026\" (UID: 
\"aabc53f0-5705-4d7f-bc7c-0b1fcfd75026\") " Dec 16 09:02:31 crc kubenswrapper[4823]: I1216 09:02:31.544895 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aabc53f0-5705-4d7f-bc7c-0b1fcfd75026-logs" (OuterVolumeSpecName: "logs") pod "aabc53f0-5705-4d7f-bc7c-0b1fcfd75026" (UID: "aabc53f0-5705-4d7f-bc7c-0b1fcfd75026"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 09:02:31 crc kubenswrapper[4823]: I1216 09:02:31.552015 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aabc53f0-5705-4d7f-bc7c-0b1fcfd75026-kube-api-access-vhccp" (OuterVolumeSpecName: "kube-api-access-vhccp") pod "aabc53f0-5705-4d7f-bc7c-0b1fcfd75026" (UID: "aabc53f0-5705-4d7f-bc7c-0b1fcfd75026"). InnerVolumeSpecName "kube-api-access-vhccp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:02:31 crc kubenswrapper[4823]: I1216 09:02:31.572324 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aabc53f0-5705-4d7f-bc7c-0b1fcfd75026-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aabc53f0-5705-4d7f-bc7c-0b1fcfd75026" (UID: "aabc53f0-5705-4d7f-bc7c-0b1fcfd75026"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:02:31 crc kubenswrapper[4823]: I1216 09:02:31.575357 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aabc53f0-5705-4d7f-bc7c-0b1fcfd75026-config-data" (OuterVolumeSpecName: "config-data") pod "aabc53f0-5705-4d7f-bc7c-0b1fcfd75026" (UID: "aabc53f0-5705-4d7f-bc7c-0b1fcfd75026"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:02:31 crc kubenswrapper[4823]: I1216 09:02:31.604321 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aabc53f0-5705-4d7f-bc7c-0b1fcfd75026-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "aabc53f0-5705-4d7f-bc7c-0b1fcfd75026" (UID: "aabc53f0-5705-4d7f-bc7c-0b1fcfd75026"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:02:31 crc kubenswrapper[4823]: I1216 09:02:31.647174 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/860c91ee-4dcd-4a5e-abcb-325b31341951-logs\") pod \"860c91ee-4dcd-4a5e-abcb-325b31341951\" (UID: \"860c91ee-4dcd-4a5e-abcb-325b31341951\") " Dec 16 09:02:31 crc kubenswrapper[4823]: I1216 09:02:31.647502 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/860c91ee-4dcd-4a5e-abcb-325b31341951-config-data\") pod \"860c91ee-4dcd-4a5e-abcb-325b31341951\" (UID: \"860c91ee-4dcd-4a5e-abcb-325b31341951\") " Dec 16 09:02:31 crc kubenswrapper[4823]: I1216 09:02:31.647549 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dj4xt\" (UniqueName: \"kubernetes.io/projected/860c91ee-4dcd-4a5e-abcb-325b31341951-kube-api-access-dj4xt\") pod \"860c91ee-4dcd-4a5e-abcb-325b31341951\" (UID: \"860c91ee-4dcd-4a5e-abcb-325b31341951\") " Dec 16 09:02:31 crc kubenswrapper[4823]: I1216 09:02:31.647575 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/860c91ee-4dcd-4a5e-abcb-325b31341951-combined-ca-bundle\") pod \"860c91ee-4dcd-4a5e-abcb-325b31341951\" (UID: \"860c91ee-4dcd-4a5e-abcb-325b31341951\") " Dec 16 09:02:31 crc kubenswrapper[4823]: I1216 09:02:31.647652 4823 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/860c91ee-4dcd-4a5e-abcb-325b31341951-logs" (OuterVolumeSpecName: "logs") pod "860c91ee-4dcd-4a5e-abcb-325b31341951" (UID: "860c91ee-4dcd-4a5e-abcb-325b31341951"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 09:02:31 crc kubenswrapper[4823]: I1216 09:02:31.648408 4823 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aabc53f0-5705-4d7f-bc7c-0b1fcfd75026-logs\") on node \"crc\" DevicePath \"\"" Dec 16 09:02:31 crc kubenswrapper[4823]: I1216 09:02:31.648436 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aabc53f0-5705-4d7f-bc7c-0b1fcfd75026-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 09:02:31 crc kubenswrapper[4823]: I1216 09:02:31.648452 4823 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/860c91ee-4dcd-4a5e-abcb-325b31341951-logs\") on node \"crc\" DevicePath \"\"" Dec 16 09:02:31 crc kubenswrapper[4823]: I1216 09:02:31.648464 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aabc53f0-5705-4d7f-bc7c-0b1fcfd75026-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 09:02:31 crc kubenswrapper[4823]: I1216 09:02:31.648476 4823 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/aabc53f0-5705-4d7f-bc7c-0b1fcfd75026-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 09:02:31 crc kubenswrapper[4823]: I1216 09:02:31.648488 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vhccp\" (UniqueName: \"kubernetes.io/projected/aabc53f0-5705-4d7f-bc7c-0b1fcfd75026-kube-api-access-vhccp\") on node \"crc\" DevicePath \"\"" Dec 16 09:02:31 crc kubenswrapper[4823]: I1216 09:02:31.650382 
4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/860c91ee-4dcd-4a5e-abcb-325b31341951-kube-api-access-dj4xt" (OuterVolumeSpecName: "kube-api-access-dj4xt") pod "860c91ee-4dcd-4a5e-abcb-325b31341951" (UID: "860c91ee-4dcd-4a5e-abcb-325b31341951"). InnerVolumeSpecName "kube-api-access-dj4xt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:02:31 crc kubenswrapper[4823]: E1216 09:02:31.672967 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/860c91ee-4dcd-4a5e-abcb-325b31341951-combined-ca-bundle podName:860c91ee-4dcd-4a5e-abcb-325b31341951 nodeName:}" failed. No retries permitted until 2025-12-16 09:02:32.172939207 +0000 UTC m=+7630.661505330 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/860c91ee-4dcd-4a5e-abcb-325b31341951-combined-ca-bundle") pod "860c91ee-4dcd-4a5e-abcb-325b31341951" (UID: "860c91ee-4dcd-4a5e-abcb-325b31341951") : error deleting /var/lib/kubelet/pods/860c91ee-4dcd-4a5e-abcb-325b31341951/volume-subpaths: remove /var/lib/kubelet/pods/860c91ee-4dcd-4a5e-abcb-325b31341951/volume-subpaths: no such file or directory Dec 16 09:02:31 crc kubenswrapper[4823]: I1216 09:02:31.675661 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/860c91ee-4dcd-4a5e-abcb-325b31341951-config-data" (OuterVolumeSpecName: "config-data") pod "860c91ee-4dcd-4a5e-abcb-325b31341951" (UID: "860c91ee-4dcd-4a5e-abcb-325b31341951"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:02:31 crc kubenswrapper[4823]: I1216 09:02:31.749623 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/860c91ee-4dcd-4a5e-abcb-325b31341951-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 09:02:31 crc kubenswrapper[4823]: I1216 09:02:31.749660 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dj4xt\" (UniqueName: \"kubernetes.io/projected/860c91ee-4dcd-4a5e-abcb-325b31341951-kube-api-access-dj4xt\") on node \"crc\" DevicePath \"\"" Dec 16 09:02:32 crc kubenswrapper[4823]: I1216 09:02:32.264055 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/860c91ee-4dcd-4a5e-abcb-325b31341951-combined-ca-bundle\") pod \"860c91ee-4dcd-4a5e-abcb-325b31341951\" (UID: \"860c91ee-4dcd-4a5e-abcb-325b31341951\") " Dec 16 09:02:32 crc kubenswrapper[4823]: I1216 09:02:32.267171 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/860c91ee-4dcd-4a5e-abcb-325b31341951-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "860c91ee-4dcd-4a5e-abcb-325b31341951" (UID: "860c91ee-4dcd-4a5e-abcb-325b31341951"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:02:32 crc kubenswrapper[4823]: I1216 09:02:32.334755 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 16 09:02:32 crc kubenswrapper[4823]: I1216 09:02:32.334813 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"860c91ee-4dcd-4a5e-abcb-325b31341951","Type":"ContainerDied","Data":"a0ebf7a4388868fd466095fb0cce13791ad858ebde2c6e17ebf714239860e69c"} Dec 16 09:02:32 crc kubenswrapper[4823]: I1216 09:02:32.334847 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 16 09:02:32 crc kubenswrapper[4823]: I1216 09:02:32.334945 4823 scope.go:117] "RemoveContainer" containerID="3973cc71f5ebd11807529940ab4ed991048dc0d85b38c7bc4a76cdcb2ea7c145" Dec 16 09:02:32 crc kubenswrapper[4823]: I1216 09:02:32.366160 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 16 09:02:32 crc kubenswrapper[4823]: I1216 09:02:32.366559 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/860c91ee-4dcd-4a5e-abcb-325b31341951-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 09:02:32 crc kubenswrapper[4823]: I1216 09:02:32.373408 4823 scope.go:117] "RemoveContainer" containerID="9db6a3d18d3fd021fa6186a76ddd9003796fbf7d1d269a9473a51744b3077d51" Dec 16 09:02:32 crc kubenswrapper[4823]: I1216 09:02:32.374098 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 16 09:02:32 crc kubenswrapper[4823]: I1216 09:02:32.402882 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 16 09:02:32 crc kubenswrapper[4823]: I1216 09:02:32.415295 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 16 09:02:32 crc kubenswrapper[4823]: I1216 09:02:32.425707 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 16 09:02:32 crc kubenswrapper[4823]: E1216 09:02:32.426252 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aabc53f0-5705-4d7f-bc7c-0b1fcfd75026" containerName="nova-metadata-metadata" Dec 16 09:02:32 crc kubenswrapper[4823]: I1216 09:02:32.426277 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="aabc53f0-5705-4d7f-bc7c-0b1fcfd75026" containerName="nova-metadata-metadata" Dec 16 09:02:32 crc kubenswrapper[4823]: E1216 09:02:32.426296 4823 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="5e34f974-7d80-436a-b8e6-68faa3b7db70" containerName="nova-manage" Dec 16 09:02:32 crc kubenswrapper[4823]: I1216 09:02:32.426304 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e34f974-7d80-436a-b8e6-68faa3b7db70" containerName="nova-manage" Dec 16 09:02:32 crc kubenswrapper[4823]: E1216 09:02:32.426348 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="860c91ee-4dcd-4a5e-abcb-325b31341951" containerName="nova-api-api" Dec 16 09:02:32 crc kubenswrapper[4823]: I1216 09:02:32.426356 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="860c91ee-4dcd-4a5e-abcb-325b31341951" containerName="nova-api-api" Dec 16 09:02:32 crc kubenswrapper[4823]: E1216 09:02:32.426367 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aabc53f0-5705-4d7f-bc7c-0b1fcfd75026" containerName="nova-metadata-log" Dec 16 09:02:32 crc kubenswrapper[4823]: I1216 09:02:32.426375 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="aabc53f0-5705-4d7f-bc7c-0b1fcfd75026" containerName="nova-metadata-log" Dec 16 09:02:32 crc kubenswrapper[4823]: E1216 09:02:32.426387 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="860c91ee-4dcd-4a5e-abcb-325b31341951" containerName="nova-api-log" Dec 16 09:02:32 crc kubenswrapper[4823]: I1216 09:02:32.426410 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="860c91ee-4dcd-4a5e-abcb-325b31341951" containerName="nova-api-log" Dec 16 09:02:32 crc kubenswrapper[4823]: I1216 09:02:32.426642 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="aabc53f0-5705-4d7f-bc7c-0b1fcfd75026" containerName="nova-metadata-log" Dec 16 09:02:32 crc kubenswrapper[4823]: I1216 09:02:32.426667 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="860c91ee-4dcd-4a5e-abcb-325b31341951" containerName="nova-api-api" Dec 16 09:02:32 crc kubenswrapper[4823]: I1216 09:02:32.426681 4823 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="860c91ee-4dcd-4a5e-abcb-325b31341951" containerName="nova-api-log" Dec 16 09:02:32 crc kubenswrapper[4823]: I1216 09:02:32.426693 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="aabc53f0-5705-4d7f-bc7c-0b1fcfd75026" containerName="nova-metadata-metadata" Dec 16 09:02:32 crc kubenswrapper[4823]: I1216 09:02:32.426719 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e34f974-7d80-436a-b8e6-68faa3b7db70" containerName="nova-manage" Dec 16 09:02:32 crc kubenswrapper[4823]: I1216 09:02:32.428001 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 16 09:02:32 crc kubenswrapper[4823]: I1216 09:02:32.435400 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 16 09:02:32 crc kubenswrapper[4823]: I1216 09:02:32.437580 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 16 09:02:32 crc kubenswrapper[4823]: I1216 09:02:32.441621 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 16 09:02:32 crc kubenswrapper[4823]: I1216 09:02:32.441958 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 16 09:02:32 crc kubenswrapper[4823]: I1216 09:02:32.442159 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 16 09:02:32 crc kubenswrapper[4823]: I1216 09:02:32.445984 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 16 09:02:32 crc kubenswrapper[4823]: I1216 09:02:32.464105 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 16 09:02:32 crc kubenswrapper[4823]: I1216 09:02:32.570173 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/3c01b8d9-eaf3-4415-a1b7-c53cbfaa707a-logs\") pod \"nova-metadata-0\" (UID: \"3c01b8d9-eaf3-4415-a1b7-c53cbfaa707a\") " pod="openstack/nova-metadata-0" Dec 16 09:02:32 crc kubenswrapper[4823]: I1216 09:02:32.570257 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5dadf1e-5652-4840-a8f8-985860981c4f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c5dadf1e-5652-4840-a8f8-985860981c4f\") " pod="openstack/nova-api-0" Dec 16 09:02:32 crc kubenswrapper[4823]: I1216 09:02:32.570294 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7zqk\" (UniqueName: \"kubernetes.io/projected/c5dadf1e-5652-4840-a8f8-985860981c4f-kube-api-access-s7zqk\") pod \"nova-api-0\" (UID: \"c5dadf1e-5652-4840-a8f8-985860981c4f\") " pod="openstack/nova-api-0" Dec 16 09:02:32 crc kubenswrapper[4823]: I1216 09:02:32.570338 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c5dadf1e-5652-4840-a8f8-985860981c4f-logs\") pod \"nova-api-0\" (UID: \"c5dadf1e-5652-4840-a8f8-985860981c4f\") " pod="openstack/nova-api-0" Dec 16 09:02:32 crc kubenswrapper[4823]: I1216 09:02:32.570525 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c01b8d9-eaf3-4415-a1b7-c53cbfaa707a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3c01b8d9-eaf3-4415-a1b7-c53cbfaa707a\") " pod="openstack/nova-metadata-0" Dec 16 09:02:32 crc kubenswrapper[4823]: I1216 09:02:32.570563 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c01b8d9-eaf3-4415-a1b7-c53cbfaa707a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: 
\"3c01b8d9-eaf3-4415-a1b7-c53cbfaa707a\") " pod="openstack/nova-metadata-0" Dec 16 09:02:32 crc kubenswrapper[4823]: I1216 09:02:32.570725 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5dadf1e-5652-4840-a8f8-985860981c4f-config-data\") pod \"nova-api-0\" (UID: \"c5dadf1e-5652-4840-a8f8-985860981c4f\") " pod="openstack/nova-api-0" Dec 16 09:02:32 crc kubenswrapper[4823]: I1216 09:02:32.570806 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rf9l\" (UniqueName: \"kubernetes.io/projected/3c01b8d9-eaf3-4415-a1b7-c53cbfaa707a-kube-api-access-8rf9l\") pod \"nova-metadata-0\" (UID: \"3c01b8d9-eaf3-4415-a1b7-c53cbfaa707a\") " pod="openstack/nova-metadata-0" Dec 16 09:02:32 crc kubenswrapper[4823]: I1216 09:02:32.570865 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c01b8d9-eaf3-4415-a1b7-c53cbfaa707a-config-data\") pod \"nova-metadata-0\" (UID: \"3c01b8d9-eaf3-4415-a1b7-c53cbfaa707a\") " pod="openstack/nova-metadata-0" Dec 16 09:02:32 crc kubenswrapper[4823]: E1216 09:02:32.622401 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="93433386da1b5be2eb0555f8de99c00c097f30dfa7cd1412b2ae555d684fa1ea" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 16 09:02:32 crc kubenswrapper[4823]: E1216 09:02:32.623678 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="93433386da1b5be2eb0555f8de99c00c097f30dfa7cd1412b2ae555d684fa1ea" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 16 
09:02:32 crc kubenswrapper[4823]: E1216 09:02:32.624971 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="93433386da1b5be2eb0555f8de99c00c097f30dfa7cd1412b2ae555d684fa1ea" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 16 09:02:32 crc kubenswrapper[4823]: E1216 09:02:32.625014 4823 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="5dcc60af-b15a-4362-8b2a-378fbdff02e0" containerName="nova-scheduler-scheduler" Dec 16 09:02:32 crc kubenswrapper[4823]: I1216 09:02:32.672411 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5dadf1e-5652-4840-a8f8-985860981c4f-config-data\") pod \"nova-api-0\" (UID: \"c5dadf1e-5652-4840-a8f8-985860981c4f\") " pod="openstack/nova-api-0" Dec 16 09:02:32 crc kubenswrapper[4823]: I1216 09:02:32.672488 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rf9l\" (UniqueName: \"kubernetes.io/projected/3c01b8d9-eaf3-4415-a1b7-c53cbfaa707a-kube-api-access-8rf9l\") pod \"nova-metadata-0\" (UID: \"3c01b8d9-eaf3-4415-a1b7-c53cbfaa707a\") " pod="openstack/nova-metadata-0" Dec 16 09:02:32 crc kubenswrapper[4823]: I1216 09:02:32.672561 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c01b8d9-eaf3-4415-a1b7-c53cbfaa707a-config-data\") pod \"nova-metadata-0\" (UID: \"3c01b8d9-eaf3-4415-a1b7-c53cbfaa707a\") " pod="openstack/nova-metadata-0" Dec 16 09:02:32 crc kubenswrapper[4823]: I1216 09:02:32.672612 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/3c01b8d9-eaf3-4415-a1b7-c53cbfaa707a-logs\") pod \"nova-metadata-0\" (UID: \"3c01b8d9-eaf3-4415-a1b7-c53cbfaa707a\") " pod="openstack/nova-metadata-0" Dec 16 09:02:32 crc kubenswrapper[4823]: I1216 09:02:32.672641 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5dadf1e-5652-4840-a8f8-985860981c4f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c5dadf1e-5652-4840-a8f8-985860981c4f\") " pod="openstack/nova-api-0" Dec 16 09:02:32 crc kubenswrapper[4823]: I1216 09:02:32.672669 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7zqk\" (UniqueName: \"kubernetes.io/projected/c5dadf1e-5652-4840-a8f8-985860981c4f-kube-api-access-s7zqk\") pod \"nova-api-0\" (UID: \"c5dadf1e-5652-4840-a8f8-985860981c4f\") " pod="openstack/nova-api-0" Dec 16 09:02:32 crc kubenswrapper[4823]: I1216 09:02:32.672724 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c5dadf1e-5652-4840-a8f8-985860981c4f-logs\") pod \"nova-api-0\" (UID: \"c5dadf1e-5652-4840-a8f8-985860981c4f\") " pod="openstack/nova-api-0" Dec 16 09:02:32 crc kubenswrapper[4823]: I1216 09:02:32.672790 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c01b8d9-eaf3-4415-a1b7-c53cbfaa707a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3c01b8d9-eaf3-4415-a1b7-c53cbfaa707a\") " pod="openstack/nova-metadata-0" Dec 16 09:02:32 crc kubenswrapper[4823]: I1216 09:02:32.672814 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c01b8d9-eaf3-4415-a1b7-c53cbfaa707a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"3c01b8d9-eaf3-4415-a1b7-c53cbfaa707a\") " pod="openstack/nova-metadata-0" Dec 
16 09:02:32 crc kubenswrapper[4823]: I1216 09:02:32.673153 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c01b8d9-eaf3-4415-a1b7-c53cbfaa707a-logs\") pod \"nova-metadata-0\" (UID: \"3c01b8d9-eaf3-4415-a1b7-c53cbfaa707a\") " pod="openstack/nova-metadata-0" Dec 16 09:02:32 crc kubenswrapper[4823]: I1216 09:02:32.673476 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c5dadf1e-5652-4840-a8f8-985860981c4f-logs\") pod \"nova-api-0\" (UID: \"c5dadf1e-5652-4840-a8f8-985860981c4f\") " pod="openstack/nova-api-0" Dec 16 09:02:32 crc kubenswrapper[4823]: I1216 09:02:32.677112 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c01b8d9-eaf3-4415-a1b7-c53cbfaa707a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"3c01b8d9-eaf3-4415-a1b7-c53cbfaa707a\") " pod="openstack/nova-metadata-0" Dec 16 09:02:32 crc kubenswrapper[4823]: I1216 09:02:32.677169 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5dadf1e-5652-4840-a8f8-985860981c4f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c5dadf1e-5652-4840-a8f8-985860981c4f\") " pod="openstack/nova-api-0" Dec 16 09:02:32 crc kubenswrapper[4823]: I1216 09:02:32.677257 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c01b8d9-eaf3-4415-a1b7-c53cbfaa707a-config-data\") pod \"nova-metadata-0\" (UID: \"3c01b8d9-eaf3-4415-a1b7-c53cbfaa707a\") " pod="openstack/nova-metadata-0" Dec 16 09:02:32 crc kubenswrapper[4823]: I1216 09:02:32.677724 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5dadf1e-5652-4840-a8f8-985860981c4f-config-data\") pod \"nova-api-0\" (UID: 
\"c5dadf1e-5652-4840-a8f8-985860981c4f\") " pod="openstack/nova-api-0" Dec 16 09:02:32 crc kubenswrapper[4823]: I1216 09:02:32.686601 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c01b8d9-eaf3-4415-a1b7-c53cbfaa707a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3c01b8d9-eaf3-4415-a1b7-c53cbfaa707a\") " pod="openstack/nova-metadata-0" Dec 16 09:02:32 crc kubenswrapper[4823]: I1216 09:02:32.694609 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rf9l\" (UniqueName: \"kubernetes.io/projected/3c01b8d9-eaf3-4415-a1b7-c53cbfaa707a-kube-api-access-8rf9l\") pod \"nova-metadata-0\" (UID: \"3c01b8d9-eaf3-4415-a1b7-c53cbfaa707a\") " pod="openstack/nova-metadata-0" Dec 16 09:02:32 crc kubenswrapper[4823]: I1216 09:02:32.700618 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7zqk\" (UniqueName: \"kubernetes.io/projected/c5dadf1e-5652-4840-a8f8-985860981c4f-kube-api-access-s7zqk\") pod \"nova-api-0\" (UID: \"c5dadf1e-5652-4840-a8f8-985860981c4f\") " pod="openstack/nova-api-0" Dec 16 09:02:32 crc kubenswrapper[4823]: I1216 09:02:32.755985 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 16 09:02:32 crc kubenswrapper[4823]: I1216 09:02:32.766602 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 16 09:02:33 crc kubenswrapper[4823]: I1216 09:02:33.236611 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 16 09:02:33 crc kubenswrapper[4823]: I1216 09:02:33.351582 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3c01b8d9-eaf3-4415-a1b7-c53cbfaa707a","Type":"ContainerStarted","Data":"e97823a6994d76dcbaa45e655367561d28af036bedcffbbf1fdcd8c984ca3f0d"} Dec 16 09:02:33 crc kubenswrapper[4823]: I1216 09:02:33.384964 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 16 09:02:33 crc kubenswrapper[4823]: W1216 09:02:33.387585 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc5dadf1e_5652_4840_a8f8_985860981c4f.slice/crio-d56c1596f7d4bdcf7e3eec92e619264bfc736c9ff2444ddcb8eb5cc6cc8ddeb2 WatchSource:0}: Error finding container d56c1596f7d4bdcf7e3eec92e619264bfc736c9ff2444ddcb8eb5cc6cc8ddeb2: Status 404 returned error can't find the container with id d56c1596f7d4bdcf7e3eec92e619264bfc736c9ff2444ddcb8eb5cc6cc8ddeb2 Dec 16 09:02:33 crc kubenswrapper[4823]: I1216 09:02:33.783279 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="860c91ee-4dcd-4a5e-abcb-325b31341951" path="/var/lib/kubelet/pods/860c91ee-4dcd-4a5e-abcb-325b31341951/volumes" Dec 16 09:02:33 crc kubenswrapper[4823]: I1216 09:02:33.784839 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aabc53f0-5705-4d7f-bc7c-0b1fcfd75026" path="/var/lib/kubelet/pods/aabc53f0-5705-4d7f-bc7c-0b1fcfd75026/volumes" Dec 16 09:02:34 crc kubenswrapper[4823]: I1216 09:02:34.363326 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c5dadf1e-5652-4840-a8f8-985860981c4f","Type":"ContainerStarted","Data":"62a600d54a9409c5850db7acf56f24348373fa5bb6eaccf88276ffc9c5ed3df4"} Dec 16 
09:02:34 crc kubenswrapper[4823]: I1216 09:02:34.363379 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c5dadf1e-5652-4840-a8f8-985860981c4f","Type":"ContainerStarted","Data":"d1761bc3aa19e0128ff7a7ad8a21dbdb4460bc85cabdf8c98269a38af50e8285"} Dec 16 09:02:34 crc kubenswrapper[4823]: I1216 09:02:34.363391 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c5dadf1e-5652-4840-a8f8-985860981c4f","Type":"ContainerStarted","Data":"d56c1596f7d4bdcf7e3eec92e619264bfc736c9ff2444ddcb8eb5cc6cc8ddeb2"} Dec 16 09:02:34 crc kubenswrapper[4823]: I1216 09:02:34.366534 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3c01b8d9-eaf3-4415-a1b7-c53cbfaa707a","Type":"ContainerStarted","Data":"7ac81f3d4f6c3b985e28f223f6d2d8ddf14deedc2445d7c0e81bfa4724f713b5"} Dec 16 09:02:34 crc kubenswrapper[4823]: I1216 09:02:34.366586 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3c01b8d9-eaf3-4415-a1b7-c53cbfaa707a","Type":"ContainerStarted","Data":"ab5e8a8e527a7f55ae463d150528bea80ab575a8ef40cde1febdcfd7069b95a0"} Dec 16 09:02:34 crc kubenswrapper[4823]: I1216 09:02:34.384339 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.384318957 podStartE2EDuration="2.384318957s" podCreationTimestamp="2025-12-16 09:02:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 09:02:34.380647222 +0000 UTC m=+7632.869213345" watchObservedRunningTime="2025-12-16 09:02:34.384318957 +0000 UTC m=+7632.872885100" Dec 16 09:02:34 crc kubenswrapper[4823]: I1216 09:02:34.402274 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.402254448 podStartE2EDuration="2.402254448s" 
podCreationTimestamp="2025-12-16 09:02:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 09:02:34.400744431 +0000 UTC m=+7632.889310554" watchObservedRunningTime="2025-12-16 09:02:34.402254448 +0000 UTC m=+7632.890820571" Dec 16 09:02:37 crc kubenswrapper[4823]: E1216 09:02:37.622743 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="93433386da1b5be2eb0555f8de99c00c097f30dfa7cd1412b2ae555d684fa1ea" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 16 09:02:37 crc kubenswrapper[4823]: E1216 09:02:37.624572 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="93433386da1b5be2eb0555f8de99c00c097f30dfa7cd1412b2ae555d684fa1ea" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 16 09:02:37 crc kubenswrapper[4823]: E1216 09:02:37.629262 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="93433386da1b5be2eb0555f8de99c00c097f30dfa7cd1412b2ae555d684fa1ea" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 16 09:02:37 crc kubenswrapper[4823]: E1216 09:02:37.629329 4823 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="5dcc60af-b15a-4362-8b2a-378fbdff02e0" containerName="nova-scheduler-scheduler" Dec 16 09:02:37 crc kubenswrapper[4823]: I1216 09:02:37.757078 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/nova-metadata-0" Dec 16 09:02:37 crc kubenswrapper[4823]: I1216 09:02:37.757167 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 16 09:02:42 crc kubenswrapper[4823]: E1216 09:02:42.623261 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="93433386da1b5be2eb0555f8de99c00c097f30dfa7cd1412b2ae555d684fa1ea" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 16 09:02:42 crc kubenswrapper[4823]: E1216 09:02:42.625641 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="93433386da1b5be2eb0555f8de99c00c097f30dfa7cd1412b2ae555d684fa1ea" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 16 09:02:42 crc kubenswrapper[4823]: E1216 09:02:42.627374 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="93433386da1b5be2eb0555f8de99c00c097f30dfa7cd1412b2ae555d684fa1ea" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 16 09:02:42 crc kubenswrapper[4823]: E1216 09:02:42.627423 4823 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="5dcc60af-b15a-4362-8b2a-378fbdff02e0" containerName="nova-scheduler-scheduler" Dec 16 09:02:42 crc kubenswrapper[4823]: I1216 09:02:42.757302 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 16 09:02:42 crc kubenswrapper[4823]: I1216 09:02:42.757360 
4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 16 09:02:42 crc kubenswrapper[4823]: I1216 09:02:42.767499 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 16 09:02:42 crc kubenswrapper[4823]: I1216 09:02:42.767560 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 16 09:02:43 crc kubenswrapper[4823]: I1216 09:02:43.770229 4823 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="3c01b8d9-eaf3-4415-a1b7-c53cbfaa707a" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.100:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 16 09:02:43 crc kubenswrapper[4823]: I1216 09:02:43.770220 4823 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="3c01b8d9-eaf3-4415-a1b7-c53cbfaa707a" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.100:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 16 09:02:43 crc kubenswrapper[4823]: I1216 09:02:43.854225 4823 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="c5dadf1e-5652-4840-a8f8-985860981c4f" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.101:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 16 09:02:43 crc kubenswrapper[4823]: I1216 09:02:43.854354 4823 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="c5dadf1e-5652-4840-a8f8-985860981c4f" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.101:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 16 09:02:47 crc kubenswrapper[4823]: I1216 09:02:47.484142 4823 generic.go:334] 
"Generic (PLEG): container finished" podID="5dcc60af-b15a-4362-8b2a-378fbdff02e0" containerID="93433386da1b5be2eb0555f8de99c00c097f30dfa7cd1412b2ae555d684fa1ea" exitCode=137 Dec 16 09:02:47 crc kubenswrapper[4823]: I1216 09:02:47.484461 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5dcc60af-b15a-4362-8b2a-378fbdff02e0","Type":"ContainerDied","Data":"93433386da1b5be2eb0555f8de99c00c097f30dfa7cd1412b2ae555d684fa1ea"} Dec 16 09:02:47 crc kubenswrapper[4823]: E1216 09:02:47.620996 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 93433386da1b5be2eb0555f8de99c00c097f30dfa7cd1412b2ae555d684fa1ea is running failed: container process not found" containerID="93433386da1b5be2eb0555f8de99c00c097f30dfa7cd1412b2ae555d684fa1ea" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 16 09:02:47 crc kubenswrapper[4823]: E1216 09:02:47.623360 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 93433386da1b5be2eb0555f8de99c00c097f30dfa7cd1412b2ae555d684fa1ea is running failed: container process not found" containerID="93433386da1b5be2eb0555f8de99c00c097f30dfa7cd1412b2ae555d684fa1ea" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 16 09:02:47 crc kubenswrapper[4823]: E1216 09:02:47.624853 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 93433386da1b5be2eb0555f8de99c00c097f30dfa7cd1412b2ae555d684fa1ea is running failed: container process not found" containerID="93433386da1b5be2eb0555f8de99c00c097f30dfa7cd1412b2ae555d684fa1ea" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 16 09:02:47 crc kubenswrapper[4823]: E1216 09:02:47.625076 4823 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = 
container is not created or running: checking if PID of 93433386da1b5be2eb0555f8de99c00c097f30dfa7cd1412b2ae555d684fa1ea is running failed: container process not found" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="5dcc60af-b15a-4362-8b2a-378fbdff02e0" containerName="nova-scheduler-scheduler" Dec 16 09:02:48 crc kubenswrapper[4823]: I1216 09:02:48.276147 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 16 09:02:48 crc kubenswrapper[4823]: I1216 09:02:48.294833 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dcc60af-b15a-4362-8b2a-378fbdff02e0-combined-ca-bundle\") pod \"5dcc60af-b15a-4362-8b2a-378fbdff02e0\" (UID: \"5dcc60af-b15a-4362-8b2a-378fbdff02e0\") " Dec 16 09:02:48 crc kubenswrapper[4823]: I1216 09:02:48.295070 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dcc60af-b15a-4362-8b2a-378fbdff02e0-config-data\") pod \"5dcc60af-b15a-4362-8b2a-378fbdff02e0\" (UID: \"5dcc60af-b15a-4362-8b2a-378fbdff02e0\") " Dec 16 09:02:48 crc kubenswrapper[4823]: I1216 09:02:48.295318 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4sgn\" (UniqueName: \"kubernetes.io/projected/5dcc60af-b15a-4362-8b2a-378fbdff02e0-kube-api-access-d4sgn\") pod \"5dcc60af-b15a-4362-8b2a-378fbdff02e0\" (UID: \"5dcc60af-b15a-4362-8b2a-378fbdff02e0\") " Dec 16 09:02:48 crc kubenswrapper[4823]: I1216 09:02:48.300436 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5dcc60af-b15a-4362-8b2a-378fbdff02e0-kube-api-access-d4sgn" (OuterVolumeSpecName: "kube-api-access-d4sgn") pod "5dcc60af-b15a-4362-8b2a-378fbdff02e0" (UID: "5dcc60af-b15a-4362-8b2a-378fbdff02e0"). InnerVolumeSpecName "kube-api-access-d4sgn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:02:48 crc kubenswrapper[4823]: I1216 09:02:48.335343 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dcc60af-b15a-4362-8b2a-378fbdff02e0-config-data" (OuterVolumeSpecName: "config-data") pod "5dcc60af-b15a-4362-8b2a-378fbdff02e0" (UID: "5dcc60af-b15a-4362-8b2a-378fbdff02e0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:02:48 crc kubenswrapper[4823]: I1216 09:02:48.338302 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dcc60af-b15a-4362-8b2a-378fbdff02e0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5dcc60af-b15a-4362-8b2a-378fbdff02e0" (UID: "5dcc60af-b15a-4362-8b2a-378fbdff02e0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:02:48 crc kubenswrapper[4823]: I1216 09:02:48.397148 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dcc60af-b15a-4362-8b2a-378fbdff02e0-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 09:02:48 crc kubenswrapper[4823]: I1216 09:02:48.397192 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4sgn\" (UniqueName: \"kubernetes.io/projected/5dcc60af-b15a-4362-8b2a-378fbdff02e0-kube-api-access-d4sgn\") on node \"crc\" DevicePath \"\"" Dec 16 09:02:48 crc kubenswrapper[4823]: I1216 09:02:48.397208 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dcc60af-b15a-4362-8b2a-378fbdff02e0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 09:02:48 crc kubenswrapper[4823]: I1216 09:02:48.494900 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"5dcc60af-b15a-4362-8b2a-378fbdff02e0","Type":"ContainerDied","Data":"e4bca8c4650f70bd11608528bb6fc9e32073c57e176d666ada681d06cbe0c928"} Dec 16 09:02:48 crc kubenswrapper[4823]: I1216 09:02:48.494975 4823 scope.go:117] "RemoveContainer" containerID="93433386da1b5be2eb0555f8de99c00c097f30dfa7cd1412b2ae555d684fa1ea" Dec 16 09:02:48 crc kubenswrapper[4823]: I1216 09:02:48.495140 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 16 09:02:48 crc kubenswrapper[4823]: I1216 09:02:48.530519 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 16 09:02:48 crc kubenswrapper[4823]: I1216 09:02:48.540644 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 16 09:02:48 crc kubenswrapper[4823]: I1216 09:02:48.564322 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 16 09:02:48 crc kubenswrapper[4823]: E1216 09:02:48.564808 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dcc60af-b15a-4362-8b2a-378fbdff02e0" containerName="nova-scheduler-scheduler" Dec 16 09:02:48 crc kubenswrapper[4823]: I1216 09:02:48.564844 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dcc60af-b15a-4362-8b2a-378fbdff02e0" containerName="nova-scheduler-scheduler" Dec 16 09:02:48 crc kubenswrapper[4823]: I1216 09:02:48.565103 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="5dcc60af-b15a-4362-8b2a-378fbdff02e0" containerName="nova-scheduler-scheduler" Dec 16 09:02:48 crc kubenswrapper[4823]: I1216 09:02:48.565762 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 16 09:02:48 crc kubenswrapper[4823]: I1216 09:02:48.569737 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 16 09:02:48 crc kubenswrapper[4823]: I1216 09:02:48.597459 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 16 09:02:48 crc kubenswrapper[4823]: I1216 09:02:48.600265 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f60cf52-47f0-4efd-8479-64bcc13848cf-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"6f60cf52-47f0-4efd-8479-64bcc13848cf\") " pod="openstack/nova-scheduler-0" Dec 16 09:02:48 crc kubenswrapper[4823]: I1216 09:02:48.600364 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f60cf52-47f0-4efd-8479-64bcc13848cf-config-data\") pod \"nova-scheduler-0\" (UID: \"6f60cf52-47f0-4efd-8479-64bcc13848cf\") " pod="openstack/nova-scheduler-0" Dec 16 09:02:48 crc kubenswrapper[4823]: I1216 09:02:48.600728 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxglw\" (UniqueName: \"kubernetes.io/projected/6f60cf52-47f0-4efd-8479-64bcc13848cf-kube-api-access-nxglw\") pod \"nova-scheduler-0\" (UID: \"6f60cf52-47f0-4efd-8479-64bcc13848cf\") " pod="openstack/nova-scheduler-0" Dec 16 09:02:48 crc kubenswrapper[4823]: I1216 09:02:48.703082 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f60cf52-47f0-4efd-8479-64bcc13848cf-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"6f60cf52-47f0-4efd-8479-64bcc13848cf\") " pod="openstack/nova-scheduler-0" Dec 16 09:02:48 crc kubenswrapper[4823]: I1216 09:02:48.703239 4823 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f60cf52-47f0-4efd-8479-64bcc13848cf-config-data\") pod \"nova-scheduler-0\" (UID: \"6f60cf52-47f0-4efd-8479-64bcc13848cf\") " pod="openstack/nova-scheduler-0" Dec 16 09:02:48 crc kubenswrapper[4823]: I1216 09:02:48.703336 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxglw\" (UniqueName: \"kubernetes.io/projected/6f60cf52-47f0-4efd-8479-64bcc13848cf-kube-api-access-nxglw\") pod \"nova-scheduler-0\" (UID: \"6f60cf52-47f0-4efd-8479-64bcc13848cf\") " pod="openstack/nova-scheduler-0" Dec 16 09:02:48 crc kubenswrapper[4823]: I1216 09:02:48.710776 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f60cf52-47f0-4efd-8479-64bcc13848cf-config-data\") pod \"nova-scheduler-0\" (UID: \"6f60cf52-47f0-4efd-8479-64bcc13848cf\") " pod="openstack/nova-scheduler-0" Dec 16 09:02:48 crc kubenswrapper[4823]: I1216 09:02:48.714946 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f60cf52-47f0-4efd-8479-64bcc13848cf-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"6f60cf52-47f0-4efd-8479-64bcc13848cf\") " pod="openstack/nova-scheduler-0" Dec 16 09:02:48 crc kubenswrapper[4823]: I1216 09:02:48.723720 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxglw\" (UniqueName: \"kubernetes.io/projected/6f60cf52-47f0-4efd-8479-64bcc13848cf-kube-api-access-nxglw\") pod \"nova-scheduler-0\" (UID: \"6f60cf52-47f0-4efd-8479-64bcc13848cf\") " pod="openstack/nova-scheduler-0" Dec 16 09:02:48 crc kubenswrapper[4823]: I1216 09:02:48.887251 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 16 09:02:49 crc kubenswrapper[4823]: I1216 09:02:49.459849 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 16 09:02:49 crc kubenswrapper[4823]: I1216 09:02:49.504907 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"6f60cf52-47f0-4efd-8479-64bcc13848cf","Type":"ContainerStarted","Data":"90392d82387d45dbffa8a8d7ce481dd9c2592ac9876306fb265f6ddad2ddece8"} Dec 16 09:02:49 crc kubenswrapper[4823]: I1216 09:02:49.782273 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5dcc60af-b15a-4362-8b2a-378fbdff02e0" path="/var/lib/kubelet/pods/5dcc60af-b15a-4362-8b2a-378fbdff02e0/volumes" Dec 16 09:02:50 crc kubenswrapper[4823]: I1216 09:02:50.516057 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"6f60cf52-47f0-4efd-8479-64bcc13848cf","Type":"ContainerStarted","Data":"417dd7302ac11767038b48587c8e4bda9ac67055e65321822c980a8f887c974d"} Dec 16 09:02:50 crc kubenswrapper[4823]: I1216 09:02:50.540666 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.540642749 podStartE2EDuration="2.540642749s" podCreationTimestamp="2025-12-16 09:02:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 09:02:50.533191875 +0000 UTC m=+7649.021757998" watchObservedRunningTime="2025-12-16 09:02:50.540642749 +0000 UTC m=+7649.029208872" Dec 16 09:02:52 crc kubenswrapper[4823]: I1216 09:02:52.761972 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 16 09:02:52 crc kubenswrapper[4823]: I1216 09:02:52.763643 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 16 09:02:52 crc 
kubenswrapper[4823]: I1216 09:02:52.766744 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 16 09:02:52 crc kubenswrapper[4823]: I1216 09:02:52.779963 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 16 09:02:52 crc kubenswrapper[4823]: I1216 09:02:52.780343 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 16 09:02:52 crc kubenswrapper[4823]: I1216 09:02:52.787829 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 16 09:02:52 crc kubenswrapper[4823]: I1216 09:02:52.790726 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 16 09:02:53 crc kubenswrapper[4823]: I1216 09:02:53.542880 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 16 09:02:53 crc kubenswrapper[4823]: I1216 09:02:53.546774 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 16 09:02:53 crc kubenswrapper[4823]: I1216 09:02:53.548156 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 16 09:02:53 crc kubenswrapper[4823]: I1216 09:02:53.736878 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-df55b6677-dqvsm"] Dec 16 09:02:53 crc kubenswrapper[4823]: I1216 09:02:53.738350 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-df55b6677-dqvsm" Dec 16 09:02:53 crc kubenswrapper[4823]: I1216 09:02:53.767896 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-df55b6677-dqvsm"] Dec 16 09:02:53 crc kubenswrapper[4823]: I1216 09:02:53.807386 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/99ce1c86-eccc-4f3c-b999-18774e823763-dns-svc\") pod \"dnsmasq-dns-df55b6677-dqvsm\" (UID: \"99ce1c86-eccc-4f3c-b999-18774e823763\") " pod="openstack/dnsmasq-dns-df55b6677-dqvsm" Dec 16 09:02:53 crc kubenswrapper[4823]: I1216 09:02:53.807554 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/99ce1c86-eccc-4f3c-b999-18774e823763-ovsdbserver-sb\") pod \"dnsmasq-dns-df55b6677-dqvsm\" (UID: \"99ce1c86-eccc-4f3c-b999-18774e823763\") " pod="openstack/dnsmasq-dns-df55b6677-dqvsm" Dec 16 09:02:53 crc kubenswrapper[4823]: I1216 09:02:53.807617 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmjx5\" (UniqueName: \"kubernetes.io/projected/99ce1c86-eccc-4f3c-b999-18774e823763-kube-api-access-gmjx5\") pod \"dnsmasq-dns-df55b6677-dqvsm\" (UID: \"99ce1c86-eccc-4f3c-b999-18774e823763\") " pod="openstack/dnsmasq-dns-df55b6677-dqvsm" Dec 16 09:02:53 crc kubenswrapper[4823]: I1216 09:02:53.807687 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99ce1c86-eccc-4f3c-b999-18774e823763-config\") pod \"dnsmasq-dns-df55b6677-dqvsm\" (UID: \"99ce1c86-eccc-4f3c-b999-18774e823763\") " pod="openstack/dnsmasq-dns-df55b6677-dqvsm" Dec 16 09:02:53 crc kubenswrapper[4823]: I1216 09:02:53.808640 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/99ce1c86-eccc-4f3c-b999-18774e823763-ovsdbserver-nb\") pod \"dnsmasq-dns-df55b6677-dqvsm\" (UID: \"99ce1c86-eccc-4f3c-b999-18774e823763\") " pod="openstack/dnsmasq-dns-df55b6677-dqvsm" Dec 16 09:02:53 crc kubenswrapper[4823]: I1216 09:02:53.888149 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 16 09:02:53 crc kubenswrapper[4823]: I1216 09:02:53.909795 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/99ce1c86-eccc-4f3c-b999-18774e823763-dns-svc\") pod \"dnsmasq-dns-df55b6677-dqvsm\" (UID: \"99ce1c86-eccc-4f3c-b999-18774e823763\") " pod="openstack/dnsmasq-dns-df55b6677-dqvsm" Dec 16 09:02:53 crc kubenswrapper[4823]: I1216 09:02:53.909852 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/99ce1c86-eccc-4f3c-b999-18774e823763-ovsdbserver-sb\") pod \"dnsmasq-dns-df55b6677-dqvsm\" (UID: \"99ce1c86-eccc-4f3c-b999-18774e823763\") " pod="openstack/dnsmasq-dns-df55b6677-dqvsm" Dec 16 09:02:53 crc kubenswrapper[4823]: I1216 09:02:53.909881 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gmjx5\" (UniqueName: \"kubernetes.io/projected/99ce1c86-eccc-4f3c-b999-18774e823763-kube-api-access-gmjx5\") pod \"dnsmasq-dns-df55b6677-dqvsm\" (UID: \"99ce1c86-eccc-4f3c-b999-18774e823763\") " pod="openstack/dnsmasq-dns-df55b6677-dqvsm" Dec 16 09:02:53 crc kubenswrapper[4823]: I1216 09:02:53.909904 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99ce1c86-eccc-4f3c-b999-18774e823763-config\") pod \"dnsmasq-dns-df55b6677-dqvsm\" (UID: \"99ce1c86-eccc-4f3c-b999-18774e823763\") " pod="openstack/dnsmasq-dns-df55b6677-dqvsm" Dec 16 09:02:53 crc kubenswrapper[4823]: I1216 
09:02:53.909932 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/99ce1c86-eccc-4f3c-b999-18774e823763-ovsdbserver-nb\") pod \"dnsmasq-dns-df55b6677-dqvsm\" (UID: \"99ce1c86-eccc-4f3c-b999-18774e823763\") " pod="openstack/dnsmasq-dns-df55b6677-dqvsm" Dec 16 09:02:53 crc kubenswrapper[4823]: I1216 09:02:53.913085 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/99ce1c86-eccc-4f3c-b999-18774e823763-ovsdbserver-nb\") pod \"dnsmasq-dns-df55b6677-dqvsm\" (UID: \"99ce1c86-eccc-4f3c-b999-18774e823763\") " pod="openstack/dnsmasq-dns-df55b6677-dqvsm" Dec 16 09:02:53 crc kubenswrapper[4823]: I1216 09:02:53.913411 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/99ce1c86-eccc-4f3c-b999-18774e823763-ovsdbserver-sb\") pod \"dnsmasq-dns-df55b6677-dqvsm\" (UID: \"99ce1c86-eccc-4f3c-b999-18774e823763\") " pod="openstack/dnsmasq-dns-df55b6677-dqvsm" Dec 16 09:02:53 crc kubenswrapper[4823]: I1216 09:02:53.913752 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/99ce1c86-eccc-4f3c-b999-18774e823763-dns-svc\") pod \"dnsmasq-dns-df55b6677-dqvsm\" (UID: \"99ce1c86-eccc-4f3c-b999-18774e823763\") " pod="openstack/dnsmasq-dns-df55b6677-dqvsm" Dec 16 09:02:53 crc kubenswrapper[4823]: I1216 09:02:53.914154 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99ce1c86-eccc-4f3c-b999-18774e823763-config\") pod \"dnsmasq-dns-df55b6677-dqvsm\" (UID: \"99ce1c86-eccc-4f3c-b999-18774e823763\") " pod="openstack/dnsmasq-dns-df55b6677-dqvsm" Dec 16 09:02:53 crc kubenswrapper[4823]: I1216 09:02:53.935305 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmjx5\" 
(UniqueName: \"kubernetes.io/projected/99ce1c86-eccc-4f3c-b999-18774e823763-kube-api-access-gmjx5\") pod \"dnsmasq-dns-df55b6677-dqvsm\" (UID: \"99ce1c86-eccc-4f3c-b999-18774e823763\") " pod="openstack/dnsmasq-dns-df55b6677-dqvsm" Dec 16 09:02:54 crc kubenswrapper[4823]: I1216 09:02:54.070406 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-df55b6677-dqvsm" Dec 16 09:02:54 crc kubenswrapper[4823]: I1216 09:02:54.558856 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-df55b6677-dqvsm"] Dec 16 09:02:55 crc kubenswrapper[4823]: I1216 09:02:55.570086 4823 generic.go:334] "Generic (PLEG): container finished" podID="99ce1c86-eccc-4f3c-b999-18774e823763" containerID="e3714187f4fd4a54fbc4a0c088bda82d8215a431696595f42b4fd6b49fdb78e3" exitCode=0 Dec 16 09:02:55 crc kubenswrapper[4823]: I1216 09:02:55.570402 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-df55b6677-dqvsm" event={"ID":"99ce1c86-eccc-4f3c-b999-18774e823763","Type":"ContainerDied","Data":"e3714187f4fd4a54fbc4a0c088bda82d8215a431696595f42b4fd6b49fdb78e3"} Dec 16 09:02:55 crc kubenswrapper[4823]: I1216 09:02:55.570570 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-df55b6677-dqvsm" event={"ID":"99ce1c86-eccc-4f3c-b999-18774e823763","Type":"ContainerStarted","Data":"c2a7d0f644a83a4facecf3e0bca015ccbeff1cf0c10e71185053b15f2c260972"} Dec 16 09:02:56 crc kubenswrapper[4823]: I1216 09:02:56.484522 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 16 09:02:56 crc kubenswrapper[4823]: I1216 09:02:56.579704 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-df55b6677-dqvsm" event={"ID":"99ce1c86-eccc-4f3c-b999-18774e823763","Type":"ContainerStarted","Data":"ffe2b7b0714c510ef6a3fa2e5f42cfeebfd6329c0126d803409a205d66f2ddec"} Dec 16 09:02:56 crc kubenswrapper[4823]: I1216 09:02:56.579794 4823 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="c5dadf1e-5652-4840-a8f8-985860981c4f" containerName="nova-api-log" containerID="cri-o://d1761bc3aa19e0128ff7a7ad8a21dbdb4460bc85cabdf8c98269a38af50e8285" gracePeriod=30 Dec 16 09:02:56 crc kubenswrapper[4823]: I1216 09:02:56.579880 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="c5dadf1e-5652-4840-a8f8-985860981c4f" containerName="nova-api-api" containerID="cri-o://62a600d54a9409c5850db7acf56f24348373fa5bb6eaccf88276ffc9c5ed3df4" gracePeriod=30 Dec 16 09:02:56 crc kubenswrapper[4823]: I1216 09:02:56.580077 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-df55b6677-dqvsm" Dec 16 09:02:56 crc kubenswrapper[4823]: I1216 09:02:56.606188 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-df55b6677-dqvsm" podStartSLOduration=3.606169914 podStartE2EDuration="3.606169914s" podCreationTimestamp="2025-12-16 09:02:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 09:02:56.604054978 +0000 UTC m=+7655.092621111" watchObservedRunningTime="2025-12-16 09:02:56.606169914 +0000 UTC m=+7655.094736037" Dec 16 09:02:57 crc kubenswrapper[4823]: I1216 09:02:57.590357 4823 generic.go:334] "Generic (PLEG): container finished" podID="c5dadf1e-5652-4840-a8f8-985860981c4f" containerID="d1761bc3aa19e0128ff7a7ad8a21dbdb4460bc85cabdf8c98269a38af50e8285" exitCode=143 Dec 16 09:02:57 crc kubenswrapper[4823]: I1216 09:02:57.590450 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c5dadf1e-5652-4840-a8f8-985860981c4f","Type":"ContainerDied","Data":"d1761bc3aa19e0128ff7a7ad8a21dbdb4460bc85cabdf8c98269a38af50e8285"} Dec 16 09:02:58 crc kubenswrapper[4823]: I1216 09:02:58.890581 4823 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 16 09:02:58 crc kubenswrapper[4823]: I1216 09:02:58.928118 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 16 09:02:59 crc kubenswrapper[4823]: I1216 09:02:59.636523 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 16 09:03:00 crc kubenswrapper[4823]: I1216 09:03:00.379708 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 16 09:03:00 crc kubenswrapper[4823]: I1216 09:03:00.534467 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c5dadf1e-5652-4840-a8f8-985860981c4f-logs\") pod \"c5dadf1e-5652-4840-a8f8-985860981c4f\" (UID: \"c5dadf1e-5652-4840-a8f8-985860981c4f\") " Dec 16 09:03:00 crc kubenswrapper[4823]: I1216 09:03:00.534663 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5dadf1e-5652-4840-a8f8-985860981c4f-combined-ca-bundle\") pod \"c5dadf1e-5652-4840-a8f8-985860981c4f\" (UID: \"c5dadf1e-5652-4840-a8f8-985860981c4f\") " Dec 16 09:03:00 crc kubenswrapper[4823]: I1216 09:03:00.534795 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s7zqk\" (UniqueName: \"kubernetes.io/projected/c5dadf1e-5652-4840-a8f8-985860981c4f-kube-api-access-s7zqk\") pod \"c5dadf1e-5652-4840-a8f8-985860981c4f\" (UID: \"c5dadf1e-5652-4840-a8f8-985860981c4f\") " Dec 16 09:03:00 crc kubenswrapper[4823]: I1216 09:03:00.534824 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5dadf1e-5652-4840-a8f8-985860981c4f-config-data\") pod \"c5dadf1e-5652-4840-a8f8-985860981c4f\" (UID: 
\"c5dadf1e-5652-4840-a8f8-985860981c4f\") " Dec 16 09:03:00 crc kubenswrapper[4823]: I1216 09:03:00.534950 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5dadf1e-5652-4840-a8f8-985860981c4f-logs" (OuterVolumeSpecName: "logs") pod "c5dadf1e-5652-4840-a8f8-985860981c4f" (UID: "c5dadf1e-5652-4840-a8f8-985860981c4f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 09:03:00 crc kubenswrapper[4823]: I1216 09:03:00.535355 4823 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c5dadf1e-5652-4840-a8f8-985860981c4f-logs\") on node \"crc\" DevicePath \"\"" Dec 16 09:03:00 crc kubenswrapper[4823]: I1216 09:03:00.545210 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5dadf1e-5652-4840-a8f8-985860981c4f-kube-api-access-s7zqk" (OuterVolumeSpecName: "kube-api-access-s7zqk") pod "c5dadf1e-5652-4840-a8f8-985860981c4f" (UID: "c5dadf1e-5652-4840-a8f8-985860981c4f"). InnerVolumeSpecName "kube-api-access-s7zqk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:03:00 crc kubenswrapper[4823]: I1216 09:03:00.570870 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5dadf1e-5652-4840-a8f8-985860981c4f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c5dadf1e-5652-4840-a8f8-985860981c4f" (UID: "c5dadf1e-5652-4840-a8f8-985860981c4f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:03:00 crc kubenswrapper[4823]: I1216 09:03:00.587270 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5dadf1e-5652-4840-a8f8-985860981c4f-config-data" (OuterVolumeSpecName: "config-data") pod "c5dadf1e-5652-4840-a8f8-985860981c4f" (UID: "c5dadf1e-5652-4840-a8f8-985860981c4f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:03:00 crc kubenswrapper[4823]: I1216 09:03:00.622703 4823 generic.go:334] "Generic (PLEG): container finished" podID="c5dadf1e-5652-4840-a8f8-985860981c4f" containerID="62a600d54a9409c5850db7acf56f24348373fa5bb6eaccf88276ffc9c5ed3df4" exitCode=0 Dec 16 09:03:00 crc kubenswrapper[4823]: I1216 09:03:00.622980 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 16 09:03:00 crc kubenswrapper[4823]: I1216 09:03:00.628850 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c5dadf1e-5652-4840-a8f8-985860981c4f","Type":"ContainerDied","Data":"62a600d54a9409c5850db7acf56f24348373fa5bb6eaccf88276ffc9c5ed3df4"} Dec 16 09:03:00 crc kubenswrapper[4823]: I1216 09:03:00.628902 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c5dadf1e-5652-4840-a8f8-985860981c4f","Type":"ContainerDied","Data":"d56c1596f7d4bdcf7e3eec92e619264bfc736c9ff2444ddcb8eb5cc6cc8ddeb2"} Dec 16 09:03:00 crc kubenswrapper[4823]: I1216 09:03:00.628923 4823 scope.go:117] "RemoveContainer" containerID="62a600d54a9409c5850db7acf56f24348373fa5bb6eaccf88276ffc9c5ed3df4" Dec 16 09:03:00 crc kubenswrapper[4823]: I1216 09:03:00.637413 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s7zqk\" (UniqueName: \"kubernetes.io/projected/c5dadf1e-5652-4840-a8f8-985860981c4f-kube-api-access-s7zqk\") on node \"crc\" DevicePath \"\"" Dec 16 09:03:00 crc kubenswrapper[4823]: I1216 09:03:00.637444 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5dadf1e-5652-4840-a8f8-985860981c4f-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 09:03:00 crc kubenswrapper[4823]: I1216 09:03:00.637457 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c5dadf1e-5652-4840-a8f8-985860981c4f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 09:03:00 crc kubenswrapper[4823]: I1216 09:03:00.663608 4823 scope.go:117] "RemoveContainer" containerID="d1761bc3aa19e0128ff7a7ad8a21dbdb4460bc85cabdf8c98269a38af50e8285" Dec 16 09:03:00 crc kubenswrapper[4823]: I1216 09:03:00.666072 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 16 09:03:00 crc kubenswrapper[4823]: I1216 09:03:00.684211 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 16 09:03:00 crc kubenswrapper[4823]: I1216 09:03:00.689924 4823 scope.go:117] "RemoveContainer" containerID="62a600d54a9409c5850db7acf56f24348373fa5bb6eaccf88276ffc9c5ed3df4" Dec 16 09:03:00 crc kubenswrapper[4823]: E1216 09:03:00.690989 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62a600d54a9409c5850db7acf56f24348373fa5bb6eaccf88276ffc9c5ed3df4\": container with ID starting with 62a600d54a9409c5850db7acf56f24348373fa5bb6eaccf88276ffc9c5ed3df4 not found: ID does not exist" containerID="62a600d54a9409c5850db7acf56f24348373fa5bb6eaccf88276ffc9c5ed3df4" Dec 16 09:03:00 crc kubenswrapper[4823]: I1216 09:03:00.691088 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62a600d54a9409c5850db7acf56f24348373fa5bb6eaccf88276ffc9c5ed3df4"} err="failed to get container status \"62a600d54a9409c5850db7acf56f24348373fa5bb6eaccf88276ffc9c5ed3df4\": rpc error: code = NotFound desc = could not find container \"62a600d54a9409c5850db7acf56f24348373fa5bb6eaccf88276ffc9c5ed3df4\": container with ID starting with 62a600d54a9409c5850db7acf56f24348373fa5bb6eaccf88276ffc9c5ed3df4 not found: ID does not exist" Dec 16 09:03:00 crc kubenswrapper[4823]: I1216 09:03:00.691116 4823 scope.go:117] "RemoveContainer" 
containerID="d1761bc3aa19e0128ff7a7ad8a21dbdb4460bc85cabdf8c98269a38af50e8285" Dec 16 09:03:00 crc kubenswrapper[4823]: E1216 09:03:00.694490 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1761bc3aa19e0128ff7a7ad8a21dbdb4460bc85cabdf8c98269a38af50e8285\": container with ID starting with d1761bc3aa19e0128ff7a7ad8a21dbdb4460bc85cabdf8c98269a38af50e8285 not found: ID does not exist" containerID="d1761bc3aa19e0128ff7a7ad8a21dbdb4460bc85cabdf8c98269a38af50e8285" Dec 16 09:03:00 crc kubenswrapper[4823]: I1216 09:03:00.694536 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1761bc3aa19e0128ff7a7ad8a21dbdb4460bc85cabdf8c98269a38af50e8285"} err="failed to get container status \"d1761bc3aa19e0128ff7a7ad8a21dbdb4460bc85cabdf8c98269a38af50e8285\": rpc error: code = NotFound desc = could not find container \"d1761bc3aa19e0128ff7a7ad8a21dbdb4460bc85cabdf8c98269a38af50e8285\": container with ID starting with d1761bc3aa19e0128ff7a7ad8a21dbdb4460bc85cabdf8c98269a38af50e8285 not found: ID does not exist" Dec 16 09:03:00 crc kubenswrapper[4823]: I1216 09:03:00.698081 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 16 09:03:00 crc kubenswrapper[4823]: E1216 09:03:00.698519 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5dadf1e-5652-4840-a8f8-985860981c4f" containerName="nova-api-log" Dec 16 09:03:00 crc kubenswrapper[4823]: I1216 09:03:00.698536 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5dadf1e-5652-4840-a8f8-985860981c4f" containerName="nova-api-log" Dec 16 09:03:00 crc kubenswrapper[4823]: E1216 09:03:00.698550 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5dadf1e-5652-4840-a8f8-985860981c4f" containerName="nova-api-api" Dec 16 09:03:00 crc kubenswrapper[4823]: I1216 09:03:00.698558 4823 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="c5dadf1e-5652-4840-a8f8-985860981c4f" containerName="nova-api-api" Dec 16 09:03:00 crc kubenswrapper[4823]: I1216 09:03:00.698735 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5dadf1e-5652-4840-a8f8-985860981c4f" containerName="nova-api-api" Dec 16 09:03:00 crc kubenswrapper[4823]: I1216 09:03:00.698747 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5dadf1e-5652-4840-a8f8-985860981c4f" containerName="nova-api-log" Dec 16 09:03:00 crc kubenswrapper[4823]: I1216 09:03:00.699774 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 16 09:03:00 crc kubenswrapper[4823]: I1216 09:03:00.710747 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 16 09:03:00 crc kubenswrapper[4823]: I1216 09:03:00.727734 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 16 09:03:00 crc kubenswrapper[4823]: I1216 09:03:00.727948 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 16 09:03:00 crc kubenswrapper[4823]: I1216 09:03:00.728116 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 16 09:03:00 crc kubenswrapper[4823]: I1216 09:03:00.841663 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a40068b-87bc-4af6-862d-ad33696041b3-internal-tls-certs\") pod \"nova-api-0\" (UID: \"2a40068b-87bc-4af6-862d-ad33696041b3\") " pod="openstack/nova-api-0" Dec 16 09:03:00 crc kubenswrapper[4823]: I1216 09:03:00.841755 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a40068b-87bc-4af6-862d-ad33696041b3-logs\") pod \"nova-api-0\" (UID: \"2a40068b-87bc-4af6-862d-ad33696041b3\") " 
pod="openstack/nova-api-0" Dec 16 09:03:00 crc kubenswrapper[4823]: I1216 09:03:00.841783 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a40068b-87bc-4af6-862d-ad33696041b3-config-data\") pod \"nova-api-0\" (UID: \"2a40068b-87bc-4af6-862d-ad33696041b3\") " pod="openstack/nova-api-0" Dec 16 09:03:00 crc kubenswrapper[4823]: I1216 09:03:00.841802 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a40068b-87bc-4af6-862d-ad33696041b3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2a40068b-87bc-4af6-862d-ad33696041b3\") " pod="openstack/nova-api-0" Dec 16 09:03:00 crc kubenswrapper[4823]: I1216 09:03:00.841853 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jcsf\" (UniqueName: \"kubernetes.io/projected/2a40068b-87bc-4af6-862d-ad33696041b3-kube-api-access-6jcsf\") pod \"nova-api-0\" (UID: \"2a40068b-87bc-4af6-862d-ad33696041b3\") " pod="openstack/nova-api-0" Dec 16 09:03:00 crc kubenswrapper[4823]: I1216 09:03:00.841891 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a40068b-87bc-4af6-862d-ad33696041b3-public-tls-certs\") pod \"nova-api-0\" (UID: \"2a40068b-87bc-4af6-862d-ad33696041b3\") " pod="openstack/nova-api-0" Dec 16 09:03:00 crc kubenswrapper[4823]: I1216 09:03:00.943833 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jcsf\" (UniqueName: \"kubernetes.io/projected/2a40068b-87bc-4af6-862d-ad33696041b3-kube-api-access-6jcsf\") pod \"nova-api-0\" (UID: \"2a40068b-87bc-4af6-862d-ad33696041b3\") " pod="openstack/nova-api-0" Dec 16 09:03:00 crc kubenswrapper[4823]: I1216 09:03:00.943902 4823 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a40068b-87bc-4af6-862d-ad33696041b3-public-tls-certs\") pod \"nova-api-0\" (UID: \"2a40068b-87bc-4af6-862d-ad33696041b3\") " pod="openstack/nova-api-0" Dec 16 09:03:00 crc kubenswrapper[4823]: I1216 09:03:00.944052 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a40068b-87bc-4af6-862d-ad33696041b3-internal-tls-certs\") pod \"nova-api-0\" (UID: \"2a40068b-87bc-4af6-862d-ad33696041b3\") " pod="openstack/nova-api-0" Dec 16 09:03:00 crc kubenswrapper[4823]: I1216 09:03:00.944117 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a40068b-87bc-4af6-862d-ad33696041b3-logs\") pod \"nova-api-0\" (UID: \"2a40068b-87bc-4af6-862d-ad33696041b3\") " pod="openstack/nova-api-0" Dec 16 09:03:00 crc kubenswrapper[4823]: I1216 09:03:00.944140 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a40068b-87bc-4af6-862d-ad33696041b3-config-data\") pod \"nova-api-0\" (UID: \"2a40068b-87bc-4af6-862d-ad33696041b3\") " pod="openstack/nova-api-0" Dec 16 09:03:00 crc kubenswrapper[4823]: I1216 09:03:00.944157 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a40068b-87bc-4af6-862d-ad33696041b3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2a40068b-87bc-4af6-862d-ad33696041b3\") " pod="openstack/nova-api-0" Dec 16 09:03:00 crc kubenswrapper[4823]: I1216 09:03:00.945349 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a40068b-87bc-4af6-862d-ad33696041b3-logs\") pod \"nova-api-0\" (UID: \"2a40068b-87bc-4af6-862d-ad33696041b3\") " pod="openstack/nova-api-0" Dec 16 09:03:00 crc 
kubenswrapper[4823]: I1216 09:03:00.948480 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a40068b-87bc-4af6-862d-ad33696041b3-public-tls-certs\") pod \"nova-api-0\" (UID: \"2a40068b-87bc-4af6-862d-ad33696041b3\") " pod="openstack/nova-api-0" Dec 16 09:03:00 crc kubenswrapper[4823]: I1216 09:03:00.948516 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a40068b-87bc-4af6-862d-ad33696041b3-config-data\") pod \"nova-api-0\" (UID: \"2a40068b-87bc-4af6-862d-ad33696041b3\") " pod="openstack/nova-api-0" Dec 16 09:03:00 crc kubenswrapper[4823]: I1216 09:03:00.958685 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a40068b-87bc-4af6-862d-ad33696041b3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2a40068b-87bc-4af6-862d-ad33696041b3\") " pod="openstack/nova-api-0" Dec 16 09:03:00 crc kubenswrapper[4823]: I1216 09:03:00.961798 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a40068b-87bc-4af6-862d-ad33696041b3-internal-tls-certs\") pod \"nova-api-0\" (UID: \"2a40068b-87bc-4af6-862d-ad33696041b3\") " pod="openstack/nova-api-0" Dec 16 09:03:00 crc kubenswrapper[4823]: I1216 09:03:00.962254 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jcsf\" (UniqueName: \"kubernetes.io/projected/2a40068b-87bc-4af6-862d-ad33696041b3-kube-api-access-6jcsf\") pod \"nova-api-0\" (UID: \"2a40068b-87bc-4af6-862d-ad33696041b3\") " pod="openstack/nova-api-0" Dec 16 09:03:01 crc kubenswrapper[4823]: I1216 09:03:01.067372 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 16 09:03:01 crc kubenswrapper[4823]: I1216 09:03:01.585922 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 16 09:03:01 crc kubenswrapper[4823]: W1216 09:03:01.588578 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2a40068b_87bc_4af6_862d_ad33696041b3.slice/crio-77efb9a2ed0b266ccfd4ab4761eb7b41df522e2be97e7a859ec7a551cd547d40 WatchSource:0}: Error finding container 77efb9a2ed0b266ccfd4ab4761eb7b41df522e2be97e7a859ec7a551cd547d40: Status 404 returned error can't find the container with id 77efb9a2ed0b266ccfd4ab4761eb7b41df522e2be97e7a859ec7a551cd547d40 Dec 16 09:03:01 crc kubenswrapper[4823]: I1216 09:03:01.636306 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2a40068b-87bc-4af6-862d-ad33696041b3","Type":"ContainerStarted","Data":"77efb9a2ed0b266ccfd4ab4761eb7b41df522e2be97e7a859ec7a551cd547d40"} Dec 16 09:03:01 crc kubenswrapper[4823]: I1216 09:03:01.785217 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5dadf1e-5652-4840-a8f8-985860981c4f" path="/var/lib/kubelet/pods/c5dadf1e-5652-4840-a8f8-985860981c4f/volumes" Dec 16 09:03:02 crc kubenswrapper[4823]: I1216 09:03:02.647379 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2a40068b-87bc-4af6-862d-ad33696041b3","Type":"ContainerStarted","Data":"cba74e4e324808c756477e9c3bf47e48dc1558ff0c47c4d8b8cd61d64d6ad973"} Dec 16 09:03:02 crc kubenswrapper[4823]: I1216 09:03:02.648862 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2a40068b-87bc-4af6-862d-ad33696041b3","Type":"ContainerStarted","Data":"d0ed8363af4a48c1ad2fe42fbd2a98b00aaad8af5f9c4a1438b6a7c118165062"} Dec 16 09:03:02 crc kubenswrapper[4823]: I1216 09:03:02.682157 4823 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.682121755 podStartE2EDuration="2.682121755s" podCreationTimestamp="2025-12-16 09:03:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 09:03:02.668264922 +0000 UTC m=+7661.156831065" watchObservedRunningTime="2025-12-16 09:03:02.682121755 +0000 UTC m=+7661.170687908" Dec 16 09:03:04 crc kubenswrapper[4823]: I1216 09:03:04.073203 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-df55b6677-dqvsm" Dec 16 09:03:04 crc kubenswrapper[4823]: I1216 09:03:04.143590 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-555fc99bf5-94c6p"] Dec 16 09:03:04 crc kubenswrapper[4823]: I1216 09:03:04.143841 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-555fc99bf5-94c6p" podUID="a56e00f5-9eaa-40ff-8e1c-4b2926e8d95d" containerName="dnsmasq-dns" containerID="cri-o://d6a67520b906cd092b8ff3036115df66ef64c7e9b56f9b1dbfdb436a3abf8c03" gracePeriod=10 Dec 16 09:03:04 crc kubenswrapper[4823]: I1216 09:03:04.179352 4823 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-555fc99bf5-94c6p" podUID="a56e00f5-9eaa-40ff-8e1c-4b2926e8d95d" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.1.91:5353: connect: connection refused" Dec 16 09:03:04 crc kubenswrapper[4823]: I1216 09:03:04.651369 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-555fc99bf5-94c6p" Dec 16 09:03:04 crc kubenswrapper[4823]: I1216 09:03:04.672724 4823 generic.go:334] "Generic (PLEG): container finished" podID="a56e00f5-9eaa-40ff-8e1c-4b2926e8d95d" containerID="d6a67520b906cd092b8ff3036115df66ef64c7e9b56f9b1dbfdb436a3abf8c03" exitCode=0 Dec 16 09:03:04 crc kubenswrapper[4823]: I1216 09:03:04.672775 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-555fc99bf5-94c6p" event={"ID":"a56e00f5-9eaa-40ff-8e1c-4b2926e8d95d","Type":"ContainerDied","Data":"d6a67520b906cd092b8ff3036115df66ef64c7e9b56f9b1dbfdb436a3abf8c03"} Dec 16 09:03:04 crc kubenswrapper[4823]: I1216 09:03:04.672873 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-555fc99bf5-94c6p" event={"ID":"a56e00f5-9eaa-40ff-8e1c-4b2926e8d95d","Type":"ContainerDied","Data":"f2a6b7c8059278d095efb06abff798aca06ec0cbdb3fc30fcf899baebf63f117"} Dec 16 09:03:04 crc kubenswrapper[4823]: I1216 09:03:04.672914 4823 scope.go:117] "RemoveContainer" containerID="d6a67520b906cd092b8ff3036115df66ef64c7e9b56f9b1dbfdb436a3abf8c03" Dec 16 09:03:04 crc kubenswrapper[4823]: I1216 09:03:04.673477 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-555fc99bf5-94c6p" Dec 16 09:03:04 crc kubenswrapper[4823]: I1216 09:03:04.701313 4823 scope.go:117] "RemoveContainer" containerID="73400d7d563bca3c61b25393df436b90fe562fbe218d8961cdd98801e7ce799a" Dec 16 09:03:04 crc kubenswrapper[4823]: I1216 09:03:04.742826 4823 scope.go:117] "RemoveContainer" containerID="d6a67520b906cd092b8ff3036115df66ef64c7e9b56f9b1dbfdb436a3abf8c03" Dec 16 09:03:04 crc kubenswrapper[4823]: E1216 09:03:04.743387 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6a67520b906cd092b8ff3036115df66ef64c7e9b56f9b1dbfdb436a3abf8c03\": container with ID starting with d6a67520b906cd092b8ff3036115df66ef64c7e9b56f9b1dbfdb436a3abf8c03 not found: ID does not exist" containerID="d6a67520b906cd092b8ff3036115df66ef64c7e9b56f9b1dbfdb436a3abf8c03" Dec 16 09:03:04 crc kubenswrapper[4823]: I1216 09:03:04.743419 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6a67520b906cd092b8ff3036115df66ef64c7e9b56f9b1dbfdb436a3abf8c03"} err="failed to get container status \"d6a67520b906cd092b8ff3036115df66ef64c7e9b56f9b1dbfdb436a3abf8c03\": rpc error: code = NotFound desc = could not find container \"d6a67520b906cd092b8ff3036115df66ef64c7e9b56f9b1dbfdb436a3abf8c03\": container with ID starting with d6a67520b906cd092b8ff3036115df66ef64c7e9b56f9b1dbfdb436a3abf8c03 not found: ID does not exist" Dec 16 09:03:04 crc kubenswrapper[4823]: I1216 09:03:04.743440 4823 scope.go:117] "RemoveContainer" containerID="73400d7d563bca3c61b25393df436b90fe562fbe218d8961cdd98801e7ce799a" Dec 16 09:03:04 crc kubenswrapper[4823]: E1216 09:03:04.743665 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73400d7d563bca3c61b25393df436b90fe562fbe218d8961cdd98801e7ce799a\": container with ID starting with 
73400d7d563bca3c61b25393df436b90fe562fbe218d8961cdd98801e7ce799a not found: ID does not exist" containerID="73400d7d563bca3c61b25393df436b90fe562fbe218d8961cdd98801e7ce799a" Dec 16 09:03:04 crc kubenswrapper[4823]: I1216 09:03:04.743686 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73400d7d563bca3c61b25393df436b90fe562fbe218d8961cdd98801e7ce799a"} err="failed to get container status \"73400d7d563bca3c61b25393df436b90fe562fbe218d8961cdd98801e7ce799a\": rpc error: code = NotFound desc = could not find container \"73400d7d563bca3c61b25393df436b90fe562fbe218d8961cdd98801e7ce799a\": container with ID starting with 73400d7d563bca3c61b25393df436b90fe562fbe218d8961cdd98801e7ce799a not found: ID does not exist" Dec 16 09:03:04 crc kubenswrapper[4823]: I1216 09:03:04.821342 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a56e00f5-9eaa-40ff-8e1c-4b2926e8d95d-dns-svc\") pod \"a56e00f5-9eaa-40ff-8e1c-4b2926e8d95d\" (UID: \"a56e00f5-9eaa-40ff-8e1c-4b2926e8d95d\") " Dec 16 09:03:04 crc kubenswrapper[4823]: I1216 09:03:04.822470 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a56e00f5-9eaa-40ff-8e1c-4b2926e8d95d-ovsdbserver-sb\") pod \"a56e00f5-9eaa-40ff-8e1c-4b2926e8d95d\" (UID: \"a56e00f5-9eaa-40ff-8e1c-4b2926e8d95d\") " Dec 16 09:03:04 crc kubenswrapper[4823]: I1216 09:03:04.822604 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nckxs\" (UniqueName: \"kubernetes.io/projected/a56e00f5-9eaa-40ff-8e1c-4b2926e8d95d-kube-api-access-nckxs\") pod \"a56e00f5-9eaa-40ff-8e1c-4b2926e8d95d\" (UID: \"a56e00f5-9eaa-40ff-8e1c-4b2926e8d95d\") " Dec 16 09:03:04 crc kubenswrapper[4823]: I1216 09:03:04.822731 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/a56e00f5-9eaa-40ff-8e1c-4b2926e8d95d-config\") pod \"a56e00f5-9eaa-40ff-8e1c-4b2926e8d95d\" (UID: \"a56e00f5-9eaa-40ff-8e1c-4b2926e8d95d\") " Dec 16 09:03:04 crc kubenswrapper[4823]: I1216 09:03:04.823802 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a56e00f5-9eaa-40ff-8e1c-4b2926e8d95d-ovsdbserver-nb\") pod \"a56e00f5-9eaa-40ff-8e1c-4b2926e8d95d\" (UID: \"a56e00f5-9eaa-40ff-8e1c-4b2926e8d95d\") " Dec 16 09:03:04 crc kubenswrapper[4823]: I1216 09:03:04.835562 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a56e00f5-9eaa-40ff-8e1c-4b2926e8d95d-kube-api-access-nckxs" (OuterVolumeSpecName: "kube-api-access-nckxs") pod "a56e00f5-9eaa-40ff-8e1c-4b2926e8d95d" (UID: "a56e00f5-9eaa-40ff-8e1c-4b2926e8d95d"). InnerVolumeSpecName "kube-api-access-nckxs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:03:04 crc kubenswrapper[4823]: I1216 09:03:04.874098 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a56e00f5-9eaa-40ff-8e1c-4b2926e8d95d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a56e00f5-9eaa-40ff-8e1c-4b2926e8d95d" (UID: "a56e00f5-9eaa-40ff-8e1c-4b2926e8d95d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 09:03:04 crc kubenswrapper[4823]: I1216 09:03:04.875247 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a56e00f5-9eaa-40ff-8e1c-4b2926e8d95d-config" (OuterVolumeSpecName: "config") pod "a56e00f5-9eaa-40ff-8e1c-4b2926e8d95d" (UID: "a56e00f5-9eaa-40ff-8e1c-4b2926e8d95d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 09:03:04 crc kubenswrapper[4823]: I1216 09:03:04.879357 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a56e00f5-9eaa-40ff-8e1c-4b2926e8d95d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a56e00f5-9eaa-40ff-8e1c-4b2926e8d95d" (UID: "a56e00f5-9eaa-40ff-8e1c-4b2926e8d95d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 09:03:04 crc kubenswrapper[4823]: I1216 09:03:04.880777 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a56e00f5-9eaa-40ff-8e1c-4b2926e8d95d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a56e00f5-9eaa-40ff-8e1c-4b2926e8d95d" (UID: "a56e00f5-9eaa-40ff-8e1c-4b2926e8d95d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 09:03:04 crc kubenswrapper[4823]: I1216 09:03:04.929663 4823 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a56e00f5-9eaa-40ff-8e1c-4b2926e8d95d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 16 09:03:04 crc kubenswrapper[4823]: I1216 09:03:04.929702 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nckxs\" (UniqueName: \"kubernetes.io/projected/a56e00f5-9eaa-40ff-8e1c-4b2926e8d95d-kube-api-access-nckxs\") on node \"crc\" DevicePath \"\"" Dec 16 09:03:04 crc kubenswrapper[4823]: I1216 09:03:04.929717 4823 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a56e00f5-9eaa-40ff-8e1c-4b2926e8d95d-config\") on node \"crc\" DevicePath \"\"" Dec 16 09:03:04 crc kubenswrapper[4823]: I1216 09:03:04.929732 4823 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a56e00f5-9eaa-40ff-8e1c-4b2926e8d95d-ovsdbserver-nb\") on node \"crc\" 
DevicePath \"\"" Dec 16 09:03:04 crc kubenswrapper[4823]: I1216 09:03:04.929743 4823 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a56e00f5-9eaa-40ff-8e1c-4b2926e8d95d-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 16 09:03:05 crc kubenswrapper[4823]: I1216 09:03:05.011899 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-555fc99bf5-94c6p"] Dec 16 09:03:05 crc kubenswrapper[4823]: I1216 09:03:05.022092 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-555fc99bf5-94c6p"] Dec 16 09:03:05 crc kubenswrapper[4823]: I1216 09:03:05.782696 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a56e00f5-9eaa-40ff-8e1c-4b2926e8d95d" path="/var/lib/kubelet/pods/a56e00f5-9eaa-40ff-8e1c-4b2926e8d95d/volumes" Dec 16 09:03:11 crc kubenswrapper[4823]: I1216 09:03:11.068364 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 16 09:03:11 crc kubenswrapper[4823]: I1216 09:03:11.068923 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 16 09:03:12 crc kubenswrapper[4823]: I1216 09:03:12.078309 4823 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="2a40068b-87bc-4af6-862d-ad33696041b3" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.1.104:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 16 09:03:12 crc kubenswrapper[4823]: I1216 09:03:12.078352 4823 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="2a40068b-87bc-4af6-862d-ad33696041b3" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.1.104:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 16 09:03:21 crc kubenswrapper[4823]: I1216 09:03:21.076379 4823 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 16 09:03:21 crc kubenswrapper[4823]: I1216 09:03:21.077019 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 16 09:03:21 crc kubenswrapper[4823]: I1216 09:03:21.077674 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 16 09:03:21 crc kubenswrapper[4823]: I1216 09:03:21.077728 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 16 09:03:21 crc kubenswrapper[4823]: I1216 09:03:21.084724 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 16 09:03:21 crc kubenswrapper[4823]: I1216 09:03:21.085477 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 16 09:03:28 crc kubenswrapper[4823]: I1216 09:03:28.134440 4823 patch_prober.go:28] interesting pod/machine-config-daemon-fv56f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 09:03:28 crc kubenswrapper[4823]: I1216 09:03:28.134904 4823 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 09:03:33 crc kubenswrapper[4823]: I1216 09:03:33.095790 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-846b9d8c47-mwnqz"] Dec 16 09:03:33 crc kubenswrapper[4823]: E1216 09:03:33.097171 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a56e00f5-9eaa-40ff-8e1c-4b2926e8d95d" containerName="dnsmasq-dns" Dec 
16 09:03:33 crc kubenswrapper[4823]: I1216 09:03:33.097193 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="a56e00f5-9eaa-40ff-8e1c-4b2926e8d95d" containerName="dnsmasq-dns" Dec 16 09:03:33 crc kubenswrapper[4823]: E1216 09:03:33.097238 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a56e00f5-9eaa-40ff-8e1c-4b2926e8d95d" containerName="init" Dec 16 09:03:33 crc kubenswrapper[4823]: I1216 09:03:33.097248 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="a56e00f5-9eaa-40ff-8e1c-4b2926e8d95d" containerName="init" Dec 16 09:03:33 crc kubenswrapper[4823]: I1216 09:03:33.097651 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="a56e00f5-9eaa-40ff-8e1c-4b2926e8d95d" containerName="dnsmasq-dns" Dec 16 09:03:33 crc kubenswrapper[4823]: I1216 09:03:33.099586 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-846b9d8c47-mwnqz" Dec 16 09:03:33 crc kubenswrapper[4823]: I1216 09:03:33.106410 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Dec 16 09:03:33 crc kubenswrapper[4823]: I1216 09:03:33.106776 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-sg7gg" Dec 16 09:03:33 crc kubenswrapper[4823]: I1216 09:03:33.106990 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Dec 16 09:03:33 crc kubenswrapper[4823]: I1216 09:03:33.107303 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Dec 16 09:03:33 crc kubenswrapper[4823]: I1216 09:03:33.122872 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-846b9d8c47-mwnqz"] Dec 16 09:03:33 crc kubenswrapper[4823]: I1216 09:03:33.164391 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 16 09:03:33 crc kubenswrapper[4823]: I1216 09:03:33.165000 4823 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="7cd6f222-5425-4965-ae37-6225b9a87af0" containerName="glance-log" containerID="cri-o://05d1315267b8387ccc39e7df784990aaac16fc7471cf33a43d40495a1621dc86" gracePeriod=30 Dec 16 09:03:33 crc kubenswrapper[4823]: I1216 09:03:33.165679 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="7cd6f222-5425-4965-ae37-6225b9a87af0" containerName="glance-httpd" containerID="cri-o://0e6e20806c08f1b0889aa8362a17ef1b5ffe809d46c93f3c54ad12f33a345cd9" gracePeriod=30 Dec 16 09:03:33 crc kubenswrapper[4823]: I1216 09:03:33.199978 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e94041cd-e923-45c7-b6d3-b44b9266507b-config-data\") pod \"horizon-846b9d8c47-mwnqz\" (UID: \"e94041cd-e923-45c7-b6d3-b44b9266507b\") " pod="openstack/horizon-846b9d8c47-mwnqz" Dec 16 09:03:33 crc kubenswrapper[4823]: I1216 09:03:33.204446 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e94041cd-e923-45c7-b6d3-b44b9266507b-scripts\") pod \"horizon-846b9d8c47-mwnqz\" (UID: \"e94041cd-e923-45c7-b6d3-b44b9266507b\") " pod="openstack/horizon-846b9d8c47-mwnqz" Dec 16 09:03:33 crc kubenswrapper[4823]: I1216 09:03:33.204548 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e94041cd-e923-45c7-b6d3-b44b9266507b-horizon-secret-key\") pod \"horizon-846b9d8c47-mwnqz\" (UID: \"e94041cd-e923-45c7-b6d3-b44b9266507b\") " pod="openstack/horizon-846b9d8c47-mwnqz" Dec 16 09:03:33 crc kubenswrapper[4823]: I1216 09:03:33.204709 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-s7dw8\" (UniqueName: \"kubernetes.io/projected/e94041cd-e923-45c7-b6d3-b44b9266507b-kube-api-access-s7dw8\") pod \"horizon-846b9d8c47-mwnqz\" (UID: \"e94041cd-e923-45c7-b6d3-b44b9266507b\") " pod="openstack/horizon-846b9d8c47-mwnqz" Dec 16 09:03:33 crc kubenswrapper[4823]: I1216 09:03:33.205005 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e94041cd-e923-45c7-b6d3-b44b9266507b-logs\") pod \"horizon-846b9d8c47-mwnqz\" (UID: \"e94041cd-e923-45c7-b6d3-b44b9266507b\") " pod="openstack/horizon-846b9d8c47-mwnqz" Dec 16 09:03:33 crc kubenswrapper[4823]: I1216 09:03:33.208194 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6656574c5c-kbgz2"] Dec 16 09:03:33 crc kubenswrapper[4823]: I1216 09:03:33.210349 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6656574c5c-kbgz2" Dec 16 09:03:33 crc kubenswrapper[4823]: I1216 09:03:33.226787 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6656574c5c-kbgz2"] Dec 16 09:03:33 crc kubenswrapper[4823]: I1216 09:03:33.270526 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 16 09:03:33 crc kubenswrapper[4823]: I1216 09:03:33.270788 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="15fb6d60-bfc3-40af-b514-9cca55e1034f" containerName="glance-log" containerID="cri-o://afeb2bc4d654f2ce24095a64b2ca9f2aa84f48dcb968b966f6c9b662c708863d" gracePeriod=30 Dec 16 09:03:33 crc kubenswrapper[4823]: I1216 09:03:33.270912 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="15fb6d60-bfc3-40af-b514-9cca55e1034f" containerName="glance-httpd" containerID="cri-o://d795b79a5a844c87a5782ea4b792e37ae5a545d6199ec26fe830a9788c4e32cf" 
gracePeriod=30 Dec 16 09:03:33 crc kubenswrapper[4823]: I1216 09:03:33.308451 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e94041cd-e923-45c7-b6d3-b44b9266507b-scripts\") pod \"horizon-846b9d8c47-mwnqz\" (UID: \"e94041cd-e923-45c7-b6d3-b44b9266507b\") " pod="openstack/horizon-846b9d8c47-mwnqz" Dec 16 09:03:33 crc kubenswrapper[4823]: I1216 09:03:33.308515 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e94041cd-e923-45c7-b6d3-b44b9266507b-horizon-secret-key\") pod \"horizon-846b9d8c47-mwnqz\" (UID: \"e94041cd-e923-45c7-b6d3-b44b9266507b\") " pod="openstack/horizon-846b9d8c47-mwnqz" Dec 16 09:03:33 crc kubenswrapper[4823]: I1216 09:03:33.308553 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/70f395b1-ad4e-40f8-91ac-025773e74846-config-data\") pod \"horizon-6656574c5c-kbgz2\" (UID: \"70f395b1-ad4e-40f8-91ac-025773e74846\") " pod="openstack/horizon-6656574c5c-kbgz2" Dec 16 09:03:33 crc kubenswrapper[4823]: I1216 09:03:33.308632 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/70f395b1-ad4e-40f8-91ac-025773e74846-scripts\") pod \"horizon-6656574c5c-kbgz2\" (UID: \"70f395b1-ad4e-40f8-91ac-025773e74846\") " pod="openstack/horizon-6656574c5c-kbgz2" Dec 16 09:03:33 crc kubenswrapper[4823]: I1216 09:03:33.309000 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7dw8\" (UniqueName: \"kubernetes.io/projected/e94041cd-e923-45c7-b6d3-b44b9266507b-kube-api-access-s7dw8\") pod \"horizon-846b9d8c47-mwnqz\" (UID: \"e94041cd-e923-45c7-b6d3-b44b9266507b\") " pod="openstack/horizon-846b9d8c47-mwnqz" Dec 16 09:03:33 crc kubenswrapper[4823]: I1216 
09:03:33.309420 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e94041cd-e923-45c7-b6d3-b44b9266507b-scripts\") pod \"horizon-846b9d8c47-mwnqz\" (UID: \"e94041cd-e923-45c7-b6d3-b44b9266507b\") " pod="openstack/horizon-846b9d8c47-mwnqz" Dec 16 09:03:33 crc kubenswrapper[4823]: I1216 09:03:33.309583 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e94041cd-e923-45c7-b6d3-b44b9266507b-logs\") pod \"horizon-846b9d8c47-mwnqz\" (UID: \"e94041cd-e923-45c7-b6d3-b44b9266507b\") " pod="openstack/horizon-846b9d8c47-mwnqz" Dec 16 09:03:33 crc kubenswrapper[4823]: I1216 09:03:33.309655 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/70f395b1-ad4e-40f8-91ac-025773e74846-horizon-secret-key\") pod \"horizon-6656574c5c-kbgz2\" (UID: \"70f395b1-ad4e-40f8-91ac-025773e74846\") " pod="openstack/horizon-6656574c5c-kbgz2" Dec 16 09:03:33 crc kubenswrapper[4823]: I1216 09:03:33.309682 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ksfh\" (UniqueName: \"kubernetes.io/projected/70f395b1-ad4e-40f8-91ac-025773e74846-kube-api-access-4ksfh\") pod \"horizon-6656574c5c-kbgz2\" (UID: \"70f395b1-ad4e-40f8-91ac-025773e74846\") " pod="openstack/horizon-6656574c5c-kbgz2" Dec 16 09:03:33 crc kubenswrapper[4823]: I1216 09:03:33.309822 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/70f395b1-ad4e-40f8-91ac-025773e74846-logs\") pod \"horizon-6656574c5c-kbgz2\" (UID: \"70f395b1-ad4e-40f8-91ac-025773e74846\") " pod="openstack/horizon-6656574c5c-kbgz2" Dec 16 09:03:33 crc kubenswrapper[4823]: I1216 09:03:33.310074 4823 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e94041cd-e923-45c7-b6d3-b44b9266507b-config-data\") pod \"horizon-846b9d8c47-mwnqz\" (UID: \"e94041cd-e923-45c7-b6d3-b44b9266507b\") " pod="openstack/horizon-846b9d8c47-mwnqz" Dec 16 09:03:33 crc kubenswrapper[4823]: I1216 09:03:33.310179 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e94041cd-e923-45c7-b6d3-b44b9266507b-logs\") pod \"horizon-846b9d8c47-mwnqz\" (UID: \"e94041cd-e923-45c7-b6d3-b44b9266507b\") " pod="openstack/horizon-846b9d8c47-mwnqz" Dec 16 09:03:33 crc kubenswrapper[4823]: I1216 09:03:33.311394 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e94041cd-e923-45c7-b6d3-b44b9266507b-config-data\") pod \"horizon-846b9d8c47-mwnqz\" (UID: \"e94041cd-e923-45c7-b6d3-b44b9266507b\") " pod="openstack/horizon-846b9d8c47-mwnqz" Dec 16 09:03:33 crc kubenswrapper[4823]: I1216 09:03:33.317561 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e94041cd-e923-45c7-b6d3-b44b9266507b-horizon-secret-key\") pod \"horizon-846b9d8c47-mwnqz\" (UID: \"e94041cd-e923-45c7-b6d3-b44b9266507b\") " pod="openstack/horizon-846b9d8c47-mwnqz" Dec 16 09:03:33 crc kubenswrapper[4823]: I1216 09:03:33.331170 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7dw8\" (UniqueName: \"kubernetes.io/projected/e94041cd-e923-45c7-b6d3-b44b9266507b-kube-api-access-s7dw8\") pod \"horizon-846b9d8c47-mwnqz\" (UID: \"e94041cd-e923-45c7-b6d3-b44b9266507b\") " pod="openstack/horizon-846b9d8c47-mwnqz" Dec 16 09:03:33 crc kubenswrapper[4823]: I1216 09:03:33.412683 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/70f395b1-ad4e-40f8-91ac-025773e74846-horizon-secret-key\") 
pod \"horizon-6656574c5c-kbgz2\" (UID: \"70f395b1-ad4e-40f8-91ac-025773e74846\") " pod="openstack/horizon-6656574c5c-kbgz2" Dec 16 09:03:33 crc kubenswrapper[4823]: I1216 09:03:33.412738 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ksfh\" (UniqueName: \"kubernetes.io/projected/70f395b1-ad4e-40f8-91ac-025773e74846-kube-api-access-4ksfh\") pod \"horizon-6656574c5c-kbgz2\" (UID: \"70f395b1-ad4e-40f8-91ac-025773e74846\") " pod="openstack/horizon-6656574c5c-kbgz2" Dec 16 09:03:33 crc kubenswrapper[4823]: I1216 09:03:33.412792 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/70f395b1-ad4e-40f8-91ac-025773e74846-logs\") pod \"horizon-6656574c5c-kbgz2\" (UID: \"70f395b1-ad4e-40f8-91ac-025773e74846\") " pod="openstack/horizon-6656574c5c-kbgz2" Dec 16 09:03:33 crc kubenswrapper[4823]: I1216 09:03:33.412918 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/70f395b1-ad4e-40f8-91ac-025773e74846-config-data\") pod \"horizon-6656574c5c-kbgz2\" (UID: \"70f395b1-ad4e-40f8-91ac-025773e74846\") " pod="openstack/horizon-6656574c5c-kbgz2" Dec 16 09:03:33 crc kubenswrapper[4823]: I1216 09:03:33.412957 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/70f395b1-ad4e-40f8-91ac-025773e74846-scripts\") pod \"horizon-6656574c5c-kbgz2\" (UID: \"70f395b1-ad4e-40f8-91ac-025773e74846\") " pod="openstack/horizon-6656574c5c-kbgz2" Dec 16 09:03:33 crc kubenswrapper[4823]: I1216 09:03:33.413718 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/70f395b1-ad4e-40f8-91ac-025773e74846-scripts\") pod \"horizon-6656574c5c-kbgz2\" (UID: \"70f395b1-ad4e-40f8-91ac-025773e74846\") " pod="openstack/horizon-6656574c5c-kbgz2" Dec 16 09:03:33 
crc kubenswrapper[4823]: I1216 09:03:33.414272 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/70f395b1-ad4e-40f8-91ac-025773e74846-logs\") pod \"horizon-6656574c5c-kbgz2\" (UID: \"70f395b1-ad4e-40f8-91ac-025773e74846\") " pod="openstack/horizon-6656574c5c-kbgz2" Dec 16 09:03:33 crc kubenswrapper[4823]: I1216 09:03:33.414463 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/70f395b1-ad4e-40f8-91ac-025773e74846-config-data\") pod \"horizon-6656574c5c-kbgz2\" (UID: \"70f395b1-ad4e-40f8-91ac-025773e74846\") " pod="openstack/horizon-6656574c5c-kbgz2" Dec 16 09:03:33 crc kubenswrapper[4823]: I1216 09:03:33.418631 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/70f395b1-ad4e-40f8-91ac-025773e74846-horizon-secret-key\") pod \"horizon-6656574c5c-kbgz2\" (UID: \"70f395b1-ad4e-40f8-91ac-025773e74846\") " pod="openstack/horizon-6656574c5c-kbgz2" Dec 16 09:03:33 crc kubenswrapper[4823]: I1216 09:03:33.433482 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-846b9d8c47-mwnqz" Dec 16 09:03:33 crc kubenswrapper[4823]: I1216 09:03:33.436339 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ksfh\" (UniqueName: \"kubernetes.io/projected/70f395b1-ad4e-40f8-91ac-025773e74846-kube-api-access-4ksfh\") pod \"horizon-6656574c5c-kbgz2\" (UID: \"70f395b1-ad4e-40f8-91ac-025773e74846\") " pod="openstack/horizon-6656574c5c-kbgz2" Dec 16 09:03:33 crc kubenswrapper[4823]: I1216 09:03:33.548977 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6656574c5c-kbgz2" Dec 16 09:03:33 crc kubenswrapper[4823]: I1216 09:03:33.927040 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-846b9d8c47-mwnqz"] Dec 16 09:03:34 crc kubenswrapper[4823]: I1216 09:03:34.012410 4823 generic.go:334] "Generic (PLEG): container finished" podID="7cd6f222-5425-4965-ae37-6225b9a87af0" containerID="05d1315267b8387ccc39e7df784990aaac16fc7471cf33a43d40495a1621dc86" exitCode=143 Dec 16 09:03:34 crc kubenswrapper[4823]: I1216 09:03:34.012892 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7cd6f222-5425-4965-ae37-6225b9a87af0","Type":"ContainerDied","Data":"05d1315267b8387ccc39e7df784990aaac16fc7471cf33a43d40495a1621dc86"} Dec 16 09:03:34 crc kubenswrapper[4823]: I1216 09:03:34.015338 4823 generic.go:334] "Generic (PLEG): container finished" podID="15fb6d60-bfc3-40af-b514-9cca55e1034f" containerID="afeb2bc4d654f2ce24095a64b2ca9f2aa84f48dcb968b966f6c9b662c708863d" exitCode=143 Dec 16 09:03:34 crc kubenswrapper[4823]: I1216 09:03:34.015469 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"15fb6d60-bfc3-40af-b514-9cca55e1034f","Type":"ContainerDied","Data":"afeb2bc4d654f2ce24095a64b2ca9f2aa84f48dcb968b966f6c9b662c708863d"} Dec 16 09:03:34 crc kubenswrapper[4823]: I1216 09:03:34.016682 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-846b9d8c47-mwnqz" event={"ID":"e94041cd-e923-45c7-b6d3-b44b9266507b","Type":"ContainerStarted","Data":"449a88ae4b689ea26586e91c0805a09a5c51de37b97fd99b22d986b9f23e8822"} Dec 16 09:03:34 crc kubenswrapper[4823]: I1216 09:03:34.108503 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6656574c5c-kbgz2"] Dec 16 09:03:35 crc kubenswrapper[4823]: I1216 09:03:35.028130 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6656574c5c-kbgz2" 
event={"ID":"70f395b1-ad4e-40f8-91ac-025773e74846","Type":"ContainerStarted","Data":"19ec408066cd146c034f3e2a10f6b59c4a37a0fa5b3fa69e00d2aca4afb572b9"} Dec 16 09:03:35 crc kubenswrapper[4823]: I1216 09:03:35.310991 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6656574c5c-kbgz2"] Dec 16 09:03:35 crc kubenswrapper[4823]: I1216 09:03:35.357463 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-597945959b-4wxf8"] Dec 16 09:03:35 crc kubenswrapper[4823]: I1216 09:03:35.359174 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-597945959b-4wxf8" Dec 16 09:03:35 crc kubenswrapper[4823]: I1216 09:03:35.362415 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Dec 16 09:03:35 crc kubenswrapper[4823]: I1216 09:03:35.382808 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-597945959b-4wxf8"] Dec 16 09:03:35 crc kubenswrapper[4823]: I1216 09:03:35.436221 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-846b9d8c47-mwnqz"] Dec 16 09:03:35 crc kubenswrapper[4823]: I1216 09:03:35.458920 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5fc95447c4-jfpp8"] Dec 16 09:03:35 crc kubenswrapper[4823]: I1216 09:03:35.460641 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5fc95447c4-jfpp8" Dec 16 09:03:35 crc kubenswrapper[4823]: I1216 09:03:35.469754 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a863b977-cf8b-4e6a-833e-2da9cf17dc24-config-data\") pod \"horizon-597945959b-4wxf8\" (UID: \"a863b977-cf8b-4e6a-833e-2da9cf17dc24\") " pod="openstack/horizon-597945959b-4wxf8" Dec 16 09:03:35 crc kubenswrapper[4823]: I1216 09:03:35.469868 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a863b977-cf8b-4e6a-833e-2da9cf17dc24-horizon-secret-key\") pod \"horizon-597945959b-4wxf8\" (UID: \"a863b977-cf8b-4e6a-833e-2da9cf17dc24\") " pod="openstack/horizon-597945959b-4wxf8" Dec 16 09:03:35 crc kubenswrapper[4823]: I1216 09:03:35.470074 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62zdp\" (UniqueName: \"kubernetes.io/projected/a863b977-cf8b-4e6a-833e-2da9cf17dc24-kube-api-access-62zdp\") pod \"horizon-597945959b-4wxf8\" (UID: \"a863b977-cf8b-4e6a-833e-2da9cf17dc24\") " pod="openstack/horizon-597945959b-4wxf8" Dec 16 09:03:35 crc kubenswrapper[4823]: I1216 09:03:35.470114 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a863b977-cf8b-4e6a-833e-2da9cf17dc24-combined-ca-bundle\") pod \"horizon-597945959b-4wxf8\" (UID: \"a863b977-cf8b-4e6a-833e-2da9cf17dc24\") " pod="openstack/horizon-597945959b-4wxf8" Dec 16 09:03:35 crc kubenswrapper[4823]: I1216 09:03:35.470156 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a863b977-cf8b-4e6a-833e-2da9cf17dc24-logs\") pod \"horizon-597945959b-4wxf8\" (UID: 
\"a863b977-cf8b-4e6a-833e-2da9cf17dc24\") " pod="openstack/horizon-597945959b-4wxf8" Dec 16 09:03:35 crc kubenswrapper[4823]: I1216 09:03:35.470197 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a863b977-cf8b-4e6a-833e-2da9cf17dc24-scripts\") pod \"horizon-597945959b-4wxf8\" (UID: \"a863b977-cf8b-4e6a-833e-2da9cf17dc24\") " pod="openstack/horizon-597945959b-4wxf8" Dec 16 09:03:35 crc kubenswrapper[4823]: I1216 09:03:35.470393 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/a863b977-cf8b-4e6a-833e-2da9cf17dc24-horizon-tls-certs\") pod \"horizon-597945959b-4wxf8\" (UID: \"a863b977-cf8b-4e6a-833e-2da9cf17dc24\") " pod="openstack/horizon-597945959b-4wxf8" Dec 16 09:03:35 crc kubenswrapper[4823]: I1216 09:03:35.483739 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5fc95447c4-jfpp8"] Dec 16 09:03:35 crc kubenswrapper[4823]: I1216 09:03:35.572615 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28373e9d-544d-40d4-8517-51e6718b9493-combined-ca-bundle\") pod \"horizon-5fc95447c4-jfpp8\" (UID: \"28373e9d-544d-40d4-8517-51e6718b9493\") " pod="openstack/horizon-5fc95447c4-jfpp8" Dec 16 09:03:35 crc kubenswrapper[4823]: I1216 09:03:35.572694 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a863b977-cf8b-4e6a-833e-2da9cf17dc24-config-data\") pod \"horizon-597945959b-4wxf8\" (UID: \"a863b977-cf8b-4e6a-833e-2da9cf17dc24\") " pod="openstack/horizon-597945959b-4wxf8" Dec 16 09:03:35 crc kubenswrapper[4823]: I1216 09:03:35.572720 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/a863b977-cf8b-4e6a-833e-2da9cf17dc24-horizon-secret-key\") pod \"horizon-597945959b-4wxf8\" (UID: \"a863b977-cf8b-4e6a-833e-2da9cf17dc24\") " pod="openstack/horizon-597945959b-4wxf8" Dec 16 09:03:35 crc kubenswrapper[4823]: I1216 09:03:35.572921 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bx2zf\" (UniqueName: \"kubernetes.io/projected/28373e9d-544d-40d4-8517-51e6718b9493-kube-api-access-bx2zf\") pod \"horizon-5fc95447c4-jfpp8\" (UID: \"28373e9d-544d-40d4-8517-51e6718b9493\") " pod="openstack/horizon-5fc95447c4-jfpp8" Dec 16 09:03:35 crc kubenswrapper[4823]: I1216 09:03:35.573034 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-62zdp\" (UniqueName: \"kubernetes.io/projected/a863b977-cf8b-4e6a-833e-2da9cf17dc24-kube-api-access-62zdp\") pod \"horizon-597945959b-4wxf8\" (UID: \"a863b977-cf8b-4e6a-833e-2da9cf17dc24\") " pod="openstack/horizon-597945959b-4wxf8" Dec 16 09:03:35 crc kubenswrapper[4823]: I1216 09:03:35.573167 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a863b977-cf8b-4e6a-833e-2da9cf17dc24-combined-ca-bundle\") pod \"horizon-597945959b-4wxf8\" (UID: \"a863b977-cf8b-4e6a-833e-2da9cf17dc24\") " pod="openstack/horizon-597945959b-4wxf8" Dec 16 09:03:35 crc kubenswrapper[4823]: I1216 09:03:35.573207 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/28373e9d-544d-40d4-8517-51e6718b9493-config-data\") pod \"horizon-5fc95447c4-jfpp8\" (UID: \"28373e9d-544d-40d4-8517-51e6718b9493\") " pod="openstack/horizon-5fc95447c4-jfpp8" Dec 16 09:03:35 crc kubenswrapper[4823]: I1216 09:03:35.573396 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/a863b977-cf8b-4e6a-833e-2da9cf17dc24-logs\") pod \"horizon-597945959b-4wxf8\" (UID: \"a863b977-cf8b-4e6a-833e-2da9cf17dc24\") " pod="openstack/horizon-597945959b-4wxf8" Dec 16 09:03:35 crc kubenswrapper[4823]: I1216 09:03:35.573500 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/28373e9d-544d-40d4-8517-51e6718b9493-logs\") pod \"horizon-5fc95447c4-jfpp8\" (UID: \"28373e9d-544d-40d4-8517-51e6718b9493\") " pod="openstack/horizon-5fc95447c4-jfpp8" Dec 16 09:03:35 crc kubenswrapper[4823]: I1216 09:03:35.573602 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a863b977-cf8b-4e6a-833e-2da9cf17dc24-scripts\") pod \"horizon-597945959b-4wxf8\" (UID: \"a863b977-cf8b-4e6a-833e-2da9cf17dc24\") " pod="openstack/horizon-597945959b-4wxf8" Dec 16 09:03:35 crc kubenswrapper[4823]: I1216 09:03:35.573659 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/28373e9d-544d-40d4-8517-51e6718b9493-horizon-secret-key\") pod \"horizon-5fc95447c4-jfpp8\" (UID: \"28373e9d-544d-40d4-8517-51e6718b9493\") " pod="openstack/horizon-5fc95447c4-jfpp8" Dec 16 09:03:35 crc kubenswrapper[4823]: I1216 09:03:35.573875 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a863b977-cf8b-4e6a-833e-2da9cf17dc24-logs\") pod \"horizon-597945959b-4wxf8\" (UID: \"a863b977-cf8b-4e6a-833e-2da9cf17dc24\") " pod="openstack/horizon-597945959b-4wxf8" Dec 16 09:03:35 crc kubenswrapper[4823]: I1216 09:03:35.575011 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a863b977-cf8b-4e6a-833e-2da9cf17dc24-config-data\") pod \"horizon-597945959b-4wxf8\" (UID: 
\"a863b977-cf8b-4e6a-833e-2da9cf17dc24\") " pod="openstack/horizon-597945959b-4wxf8" Dec 16 09:03:35 crc kubenswrapper[4823]: I1216 09:03:35.575110 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a863b977-cf8b-4e6a-833e-2da9cf17dc24-scripts\") pod \"horizon-597945959b-4wxf8\" (UID: \"a863b977-cf8b-4e6a-833e-2da9cf17dc24\") " pod="openstack/horizon-597945959b-4wxf8" Dec 16 09:03:35 crc kubenswrapper[4823]: I1216 09:03:35.575313 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/a863b977-cf8b-4e6a-833e-2da9cf17dc24-horizon-tls-certs\") pod \"horizon-597945959b-4wxf8\" (UID: \"a863b977-cf8b-4e6a-833e-2da9cf17dc24\") " pod="openstack/horizon-597945959b-4wxf8" Dec 16 09:03:35 crc kubenswrapper[4823]: I1216 09:03:35.575362 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/28373e9d-544d-40d4-8517-51e6718b9493-horizon-tls-certs\") pod \"horizon-5fc95447c4-jfpp8\" (UID: \"28373e9d-544d-40d4-8517-51e6718b9493\") " pod="openstack/horizon-5fc95447c4-jfpp8" Dec 16 09:03:35 crc kubenswrapper[4823]: I1216 09:03:35.575393 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/28373e9d-544d-40d4-8517-51e6718b9493-scripts\") pod \"horizon-5fc95447c4-jfpp8\" (UID: \"28373e9d-544d-40d4-8517-51e6718b9493\") " pod="openstack/horizon-5fc95447c4-jfpp8" Dec 16 09:03:35 crc kubenswrapper[4823]: I1216 09:03:35.580572 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a863b977-cf8b-4e6a-833e-2da9cf17dc24-combined-ca-bundle\") pod \"horizon-597945959b-4wxf8\" (UID: \"a863b977-cf8b-4e6a-833e-2da9cf17dc24\") " pod="openstack/horizon-597945959b-4wxf8" Dec 
16 09:03:35 crc kubenswrapper[4823]: I1216 09:03:35.580945 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a863b977-cf8b-4e6a-833e-2da9cf17dc24-horizon-secret-key\") pod \"horizon-597945959b-4wxf8\" (UID: \"a863b977-cf8b-4e6a-833e-2da9cf17dc24\") " pod="openstack/horizon-597945959b-4wxf8" Dec 16 09:03:35 crc kubenswrapper[4823]: I1216 09:03:35.582139 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/a863b977-cf8b-4e6a-833e-2da9cf17dc24-horizon-tls-certs\") pod \"horizon-597945959b-4wxf8\" (UID: \"a863b977-cf8b-4e6a-833e-2da9cf17dc24\") " pod="openstack/horizon-597945959b-4wxf8" Dec 16 09:03:35 crc kubenswrapper[4823]: I1216 09:03:35.592448 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62zdp\" (UniqueName: \"kubernetes.io/projected/a863b977-cf8b-4e6a-833e-2da9cf17dc24-kube-api-access-62zdp\") pod \"horizon-597945959b-4wxf8\" (UID: \"a863b977-cf8b-4e6a-833e-2da9cf17dc24\") " pod="openstack/horizon-597945959b-4wxf8" Dec 16 09:03:35 crc kubenswrapper[4823]: I1216 09:03:35.696054 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/28373e9d-544d-40d4-8517-51e6718b9493-logs\") pod \"horizon-5fc95447c4-jfpp8\" (UID: \"28373e9d-544d-40d4-8517-51e6718b9493\") " pod="openstack/horizon-5fc95447c4-jfpp8" Dec 16 09:03:35 crc kubenswrapper[4823]: I1216 09:03:35.696249 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/28373e9d-544d-40d4-8517-51e6718b9493-horizon-secret-key\") pod \"horizon-5fc95447c4-jfpp8\" (UID: \"28373e9d-544d-40d4-8517-51e6718b9493\") " pod="openstack/horizon-5fc95447c4-jfpp8" Dec 16 09:03:35 crc kubenswrapper[4823]: I1216 09:03:35.696362 4823 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/28373e9d-544d-40d4-8517-51e6718b9493-horizon-tls-certs\") pod \"horizon-5fc95447c4-jfpp8\" (UID: \"28373e9d-544d-40d4-8517-51e6718b9493\") " pod="openstack/horizon-5fc95447c4-jfpp8" Dec 16 09:03:35 crc kubenswrapper[4823]: I1216 09:03:35.696392 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/28373e9d-544d-40d4-8517-51e6718b9493-scripts\") pod \"horizon-5fc95447c4-jfpp8\" (UID: \"28373e9d-544d-40d4-8517-51e6718b9493\") " pod="openstack/horizon-5fc95447c4-jfpp8" Dec 16 09:03:35 crc kubenswrapper[4823]: I1216 09:03:35.696488 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28373e9d-544d-40d4-8517-51e6718b9493-combined-ca-bundle\") pod \"horizon-5fc95447c4-jfpp8\" (UID: \"28373e9d-544d-40d4-8517-51e6718b9493\") " pod="openstack/horizon-5fc95447c4-jfpp8" Dec 16 09:03:35 crc kubenswrapper[4823]: I1216 09:03:35.696546 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/28373e9d-544d-40d4-8517-51e6718b9493-logs\") pod \"horizon-5fc95447c4-jfpp8\" (UID: \"28373e9d-544d-40d4-8517-51e6718b9493\") " pod="openstack/horizon-5fc95447c4-jfpp8" Dec 16 09:03:35 crc kubenswrapper[4823]: I1216 09:03:35.696809 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bx2zf\" (UniqueName: \"kubernetes.io/projected/28373e9d-544d-40d4-8517-51e6718b9493-kube-api-access-bx2zf\") pod \"horizon-5fc95447c4-jfpp8\" (UID: \"28373e9d-544d-40d4-8517-51e6718b9493\") " pod="openstack/horizon-5fc95447c4-jfpp8" Dec 16 09:03:35 crc kubenswrapper[4823]: I1216 09:03:35.696984 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/28373e9d-544d-40d4-8517-51e6718b9493-config-data\") pod \"horizon-5fc95447c4-jfpp8\" (UID: \"28373e9d-544d-40d4-8517-51e6718b9493\") " pod="openstack/horizon-5fc95447c4-jfpp8" Dec 16 09:03:35 crc kubenswrapper[4823]: I1216 09:03:35.698716 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/28373e9d-544d-40d4-8517-51e6718b9493-scripts\") pod \"horizon-5fc95447c4-jfpp8\" (UID: \"28373e9d-544d-40d4-8517-51e6718b9493\") " pod="openstack/horizon-5fc95447c4-jfpp8" Dec 16 09:03:35 crc kubenswrapper[4823]: I1216 09:03:35.699039 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/28373e9d-544d-40d4-8517-51e6718b9493-config-data\") pod \"horizon-5fc95447c4-jfpp8\" (UID: \"28373e9d-544d-40d4-8517-51e6718b9493\") " pod="openstack/horizon-5fc95447c4-jfpp8" Dec 16 09:03:35 crc kubenswrapper[4823]: I1216 09:03:35.700433 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28373e9d-544d-40d4-8517-51e6718b9493-combined-ca-bundle\") pod \"horizon-5fc95447c4-jfpp8\" (UID: \"28373e9d-544d-40d4-8517-51e6718b9493\") " pod="openstack/horizon-5fc95447c4-jfpp8" Dec 16 09:03:35 crc kubenswrapper[4823]: I1216 09:03:35.712369 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-597945959b-4wxf8" Dec 16 09:03:35 crc kubenswrapper[4823]: I1216 09:03:35.715478 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/28373e9d-544d-40d4-8517-51e6718b9493-horizon-tls-certs\") pod \"horizon-5fc95447c4-jfpp8\" (UID: \"28373e9d-544d-40d4-8517-51e6718b9493\") " pod="openstack/horizon-5fc95447c4-jfpp8" Dec 16 09:03:35 crc kubenswrapper[4823]: I1216 09:03:35.716155 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/28373e9d-544d-40d4-8517-51e6718b9493-horizon-secret-key\") pod \"horizon-5fc95447c4-jfpp8\" (UID: \"28373e9d-544d-40d4-8517-51e6718b9493\") " pod="openstack/horizon-5fc95447c4-jfpp8" Dec 16 09:03:35 crc kubenswrapper[4823]: I1216 09:03:35.723304 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bx2zf\" (UniqueName: \"kubernetes.io/projected/28373e9d-544d-40d4-8517-51e6718b9493-kube-api-access-bx2zf\") pod \"horizon-5fc95447c4-jfpp8\" (UID: \"28373e9d-544d-40d4-8517-51e6718b9493\") " pod="openstack/horizon-5fc95447c4-jfpp8" Dec 16 09:03:35 crc kubenswrapper[4823]: I1216 09:03:35.791092 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5fc95447c4-jfpp8" Dec 16 09:03:36 crc kubenswrapper[4823]: I1216 09:03:36.211063 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-597945959b-4wxf8"] Dec 16 09:03:36 crc kubenswrapper[4823]: W1216 09:03:36.220158 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda863b977_cf8b_4e6a_833e_2da9cf17dc24.slice/crio-1bd0efb887db41b9d444adfa3a4a3c13afb4fad1ae3df9b3b3d23f5051874d9d WatchSource:0}: Error finding container 1bd0efb887db41b9d444adfa3a4a3c13afb4fad1ae3df9b3b3d23f5051874d9d: Status 404 returned error can't find the container with id 1bd0efb887db41b9d444adfa3a4a3c13afb4fad1ae3df9b3b3d23f5051874d9d Dec 16 09:03:36 crc kubenswrapper[4823]: I1216 09:03:36.379059 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5fc95447c4-jfpp8"] Dec 16 09:03:36 crc kubenswrapper[4823]: I1216 09:03:36.413350 4823 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="15fb6d60-bfc3-40af-b514-9cca55e1034f" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.1.69:9292/healthcheck\": read tcp 10.217.0.2:50350->10.217.1.69:9292: read: connection reset by peer" Dec 16 09:03:36 crc kubenswrapper[4823]: I1216 09:03:36.413370 4823 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="15fb6d60-bfc3-40af-b514-9cca55e1034f" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.1.69:9292/healthcheck\": read tcp 10.217.0.2:50354->10.217.1.69:9292: read: connection reset by peer" Dec 16 09:03:36 crc kubenswrapper[4823]: I1216 09:03:36.588372 4823 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="7cd6f222-5425-4965-ae37-6225b9a87af0" containerName="glance-log" probeResult="failure" output="Get 
\"https://10.217.1.68:9292/healthcheck\": read tcp 10.217.0.2:46226->10.217.1.68:9292: read: connection reset by peer" Dec 16 09:03:36 crc kubenswrapper[4823]: I1216 09:03:36.589377 4823 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="7cd6f222-5425-4965-ae37-6225b9a87af0" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.1.68:9292/healthcheck\": read tcp 10.217.0.2:46222->10.217.1.68:9292: read: connection reset by peer" Dec 16 09:03:37 crc kubenswrapper[4823]: I1216 09:03:37.130043 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5fc95447c4-jfpp8" event={"ID":"28373e9d-544d-40d4-8517-51e6718b9493","Type":"ContainerStarted","Data":"25898d18ef62d6635eb51cf6f19ca53fd4dab16358f4d2518602301e735c70cd"} Dec 16 09:03:37 crc kubenswrapper[4823]: I1216 09:03:37.133698 4823 generic.go:334] "Generic (PLEG): container finished" podID="7cd6f222-5425-4965-ae37-6225b9a87af0" containerID="0e6e20806c08f1b0889aa8362a17ef1b5ffe809d46c93f3c54ad12f33a345cd9" exitCode=0 Dec 16 09:03:37 crc kubenswrapper[4823]: I1216 09:03:37.133754 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7cd6f222-5425-4965-ae37-6225b9a87af0","Type":"ContainerDied","Data":"0e6e20806c08f1b0889aa8362a17ef1b5ffe809d46c93f3c54ad12f33a345cd9"} Dec 16 09:03:37 crc kubenswrapper[4823]: I1216 09:03:37.134976 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-597945959b-4wxf8" event={"ID":"a863b977-cf8b-4e6a-833e-2da9cf17dc24","Type":"ContainerStarted","Data":"1bd0efb887db41b9d444adfa3a4a3c13afb4fad1ae3df9b3b3d23f5051874d9d"} Dec 16 09:03:37 crc kubenswrapper[4823]: I1216 09:03:37.138596 4823 generic.go:334] "Generic (PLEG): container finished" podID="15fb6d60-bfc3-40af-b514-9cca55e1034f" containerID="d795b79a5a844c87a5782ea4b792e37ae5a545d6199ec26fe830a9788c4e32cf" exitCode=0 Dec 16 09:03:37 crc 
kubenswrapper[4823]: I1216 09:03:37.138628 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"15fb6d60-bfc3-40af-b514-9cca55e1034f","Type":"ContainerDied","Data":"d795b79a5a844c87a5782ea4b792e37ae5a545d6199ec26fe830a9788c4e32cf"} Dec 16 09:03:37 crc kubenswrapper[4823]: I1216 09:03:37.190263 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 16 09:03:37 crc kubenswrapper[4823]: I1216 09:03:37.288789 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/15fb6d60-bfc3-40af-b514-9cca55e1034f-internal-tls-certs\") pod \"15fb6d60-bfc3-40af-b514-9cca55e1034f\" (UID: \"15fb6d60-bfc3-40af-b514-9cca55e1034f\") " Dec 16 09:03:37 crc kubenswrapper[4823]: I1216 09:03:37.288848 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nwfl2\" (UniqueName: \"kubernetes.io/projected/15fb6d60-bfc3-40af-b514-9cca55e1034f-kube-api-access-nwfl2\") pod \"15fb6d60-bfc3-40af-b514-9cca55e1034f\" (UID: \"15fb6d60-bfc3-40af-b514-9cca55e1034f\") " Dec 16 09:03:37 crc kubenswrapper[4823]: I1216 09:03:37.288886 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15fb6d60-bfc3-40af-b514-9cca55e1034f-combined-ca-bundle\") pod \"15fb6d60-bfc3-40af-b514-9cca55e1034f\" (UID: \"15fb6d60-bfc3-40af-b514-9cca55e1034f\") " Dec 16 09:03:37 crc kubenswrapper[4823]: I1216 09:03:37.288921 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/15fb6d60-bfc3-40af-b514-9cca55e1034f-logs\") pod \"15fb6d60-bfc3-40af-b514-9cca55e1034f\" (UID: \"15fb6d60-bfc3-40af-b514-9cca55e1034f\") " Dec 16 09:03:37 crc kubenswrapper[4823]: I1216 09:03:37.289066 4823 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/15fb6d60-bfc3-40af-b514-9cca55e1034f-httpd-run\") pod \"15fb6d60-bfc3-40af-b514-9cca55e1034f\" (UID: \"15fb6d60-bfc3-40af-b514-9cca55e1034f\") " Dec 16 09:03:37 crc kubenswrapper[4823]: I1216 09:03:37.289108 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15fb6d60-bfc3-40af-b514-9cca55e1034f-scripts\") pod \"15fb6d60-bfc3-40af-b514-9cca55e1034f\" (UID: \"15fb6d60-bfc3-40af-b514-9cca55e1034f\") " Dec 16 09:03:37 crc kubenswrapper[4823]: I1216 09:03:37.289159 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15fb6d60-bfc3-40af-b514-9cca55e1034f-config-data\") pod \"15fb6d60-bfc3-40af-b514-9cca55e1034f\" (UID: \"15fb6d60-bfc3-40af-b514-9cca55e1034f\") " Dec 16 09:03:37 crc kubenswrapper[4823]: I1216 09:03:37.290505 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15fb6d60-bfc3-40af-b514-9cca55e1034f-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "15fb6d60-bfc3-40af-b514-9cca55e1034f" (UID: "15fb6d60-bfc3-40af-b514-9cca55e1034f"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 09:03:37 crc kubenswrapper[4823]: I1216 09:03:37.291385 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15fb6d60-bfc3-40af-b514-9cca55e1034f-logs" (OuterVolumeSpecName: "logs") pod "15fb6d60-bfc3-40af-b514-9cca55e1034f" (UID: "15fb6d60-bfc3-40af-b514-9cca55e1034f"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 09:03:37 crc kubenswrapper[4823]: I1216 09:03:37.298551 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15fb6d60-bfc3-40af-b514-9cca55e1034f-kube-api-access-nwfl2" (OuterVolumeSpecName: "kube-api-access-nwfl2") pod "15fb6d60-bfc3-40af-b514-9cca55e1034f" (UID: "15fb6d60-bfc3-40af-b514-9cca55e1034f"). InnerVolumeSpecName "kube-api-access-nwfl2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:03:37 crc kubenswrapper[4823]: I1216 09:03:37.300301 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15fb6d60-bfc3-40af-b514-9cca55e1034f-scripts" (OuterVolumeSpecName: "scripts") pod "15fb6d60-bfc3-40af-b514-9cca55e1034f" (UID: "15fb6d60-bfc3-40af-b514-9cca55e1034f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:03:37 crc kubenswrapper[4823]: I1216 09:03:37.362354 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15fb6d60-bfc3-40af-b514-9cca55e1034f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "15fb6d60-bfc3-40af-b514-9cca55e1034f" (UID: "15fb6d60-bfc3-40af-b514-9cca55e1034f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:03:37 crc kubenswrapper[4823]: I1216 09:03:37.373451 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15fb6d60-bfc3-40af-b514-9cca55e1034f-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "15fb6d60-bfc3-40af-b514-9cca55e1034f" (UID: "15fb6d60-bfc3-40af-b514-9cca55e1034f"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:03:37 crc kubenswrapper[4823]: I1216 09:03:37.392063 4823 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/15fb6d60-bfc3-40af-b514-9cca55e1034f-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 09:03:37 crc kubenswrapper[4823]: I1216 09:03:37.392114 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nwfl2\" (UniqueName: \"kubernetes.io/projected/15fb6d60-bfc3-40af-b514-9cca55e1034f-kube-api-access-nwfl2\") on node \"crc\" DevicePath \"\"" Dec 16 09:03:37 crc kubenswrapper[4823]: I1216 09:03:37.392127 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15fb6d60-bfc3-40af-b514-9cca55e1034f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 09:03:37 crc kubenswrapper[4823]: I1216 09:03:37.392136 4823 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/15fb6d60-bfc3-40af-b514-9cca55e1034f-logs\") on node \"crc\" DevicePath \"\"" Dec 16 09:03:37 crc kubenswrapper[4823]: I1216 09:03:37.392144 4823 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/15fb6d60-bfc3-40af-b514-9cca55e1034f-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 16 09:03:37 crc kubenswrapper[4823]: I1216 09:03:37.392155 4823 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15fb6d60-bfc3-40af-b514-9cca55e1034f-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 09:03:37 crc kubenswrapper[4823]: I1216 09:03:37.409474 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15fb6d60-bfc3-40af-b514-9cca55e1034f-config-data" (OuterVolumeSpecName: "config-data") pod "15fb6d60-bfc3-40af-b514-9cca55e1034f" (UID: "15fb6d60-bfc3-40af-b514-9cca55e1034f"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:03:37 crc kubenswrapper[4823]: I1216 09:03:37.493783 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15fb6d60-bfc3-40af-b514-9cca55e1034f-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 09:03:37 crc kubenswrapper[4823]: I1216 09:03:37.964733 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 16 09:03:38 crc kubenswrapper[4823]: I1216 09:03:38.112498 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7cd6f222-5425-4965-ae37-6225b9a87af0-httpd-run\") pod \"7cd6f222-5425-4965-ae37-6225b9a87af0\" (UID: \"7cd6f222-5425-4965-ae37-6225b9a87af0\") " Dec 16 09:03:38 crc kubenswrapper[4823]: I1216 09:03:38.112699 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n7c7z\" (UniqueName: \"kubernetes.io/projected/7cd6f222-5425-4965-ae37-6225b9a87af0-kube-api-access-n7c7z\") pod \"7cd6f222-5425-4965-ae37-6225b9a87af0\" (UID: \"7cd6f222-5425-4965-ae37-6225b9a87af0\") " Dec 16 09:03:38 crc kubenswrapper[4823]: I1216 09:03:38.112812 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7cd6f222-5425-4965-ae37-6225b9a87af0-logs\") pod \"7cd6f222-5425-4965-ae37-6225b9a87af0\" (UID: \"7cd6f222-5425-4965-ae37-6225b9a87af0\") " Dec 16 09:03:38 crc kubenswrapper[4823]: I1216 09:03:38.112846 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7cd6f222-5425-4965-ae37-6225b9a87af0-scripts\") pod \"7cd6f222-5425-4965-ae37-6225b9a87af0\" (UID: \"7cd6f222-5425-4965-ae37-6225b9a87af0\") " Dec 16 09:03:38 crc kubenswrapper[4823]: I1216 09:03:38.112889 4823 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cd6f222-5425-4965-ae37-6225b9a87af0-combined-ca-bundle\") pod \"7cd6f222-5425-4965-ae37-6225b9a87af0\" (UID: \"7cd6f222-5425-4965-ae37-6225b9a87af0\") " Dec 16 09:03:38 crc kubenswrapper[4823]: I1216 09:03:38.112907 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cd6f222-5425-4965-ae37-6225b9a87af0-config-data\") pod \"7cd6f222-5425-4965-ae37-6225b9a87af0\" (UID: \"7cd6f222-5425-4965-ae37-6225b9a87af0\") " Dec 16 09:03:38 crc kubenswrapper[4823]: I1216 09:03:38.112925 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7cd6f222-5425-4965-ae37-6225b9a87af0-public-tls-certs\") pod \"7cd6f222-5425-4965-ae37-6225b9a87af0\" (UID: \"7cd6f222-5425-4965-ae37-6225b9a87af0\") " Dec 16 09:03:38 crc kubenswrapper[4823]: I1216 09:03:38.113828 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7cd6f222-5425-4965-ae37-6225b9a87af0-logs" (OuterVolumeSpecName: "logs") pod "7cd6f222-5425-4965-ae37-6225b9a87af0" (UID: "7cd6f222-5425-4965-ae37-6225b9a87af0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 09:03:38 crc kubenswrapper[4823]: I1216 09:03:38.114505 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7cd6f222-5425-4965-ae37-6225b9a87af0-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "7cd6f222-5425-4965-ae37-6225b9a87af0" (UID: "7cd6f222-5425-4965-ae37-6225b9a87af0"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 09:03:38 crc kubenswrapper[4823]: I1216 09:03:38.126844 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cd6f222-5425-4965-ae37-6225b9a87af0-scripts" (OuterVolumeSpecName: "scripts") pod "7cd6f222-5425-4965-ae37-6225b9a87af0" (UID: "7cd6f222-5425-4965-ae37-6225b9a87af0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:03:38 crc kubenswrapper[4823]: I1216 09:03:38.131333 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7cd6f222-5425-4965-ae37-6225b9a87af0-kube-api-access-n7c7z" (OuterVolumeSpecName: "kube-api-access-n7c7z") pod "7cd6f222-5425-4965-ae37-6225b9a87af0" (UID: "7cd6f222-5425-4965-ae37-6225b9a87af0"). InnerVolumeSpecName "kube-api-access-n7c7z". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:03:38 crc kubenswrapper[4823]: I1216 09:03:38.148265 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cd6f222-5425-4965-ae37-6225b9a87af0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7cd6f222-5425-4965-ae37-6225b9a87af0" (UID: "7cd6f222-5425-4965-ae37-6225b9a87af0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:03:38 crc kubenswrapper[4823]: I1216 09:03:38.163192 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7cd6f222-5425-4965-ae37-6225b9a87af0","Type":"ContainerDied","Data":"a333247072dbe13b9b4e7ca655d4b849033eddd0dc6cff8ee4e32a9579e029e6"} Dec 16 09:03:38 crc kubenswrapper[4823]: I1216 09:03:38.163256 4823 scope.go:117] "RemoveContainer" containerID="0e6e20806c08f1b0889aa8362a17ef1b5ffe809d46c93f3c54ad12f33a345cd9" Dec 16 09:03:38 crc kubenswrapper[4823]: I1216 09:03:38.163441 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 16 09:03:38 crc kubenswrapper[4823]: I1216 09:03:38.168267 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"15fb6d60-bfc3-40af-b514-9cca55e1034f","Type":"ContainerDied","Data":"44b39aa4e66009c032e963634812228f8538d38b49def7ad11beb5eac0f7efd4"} Dec 16 09:03:38 crc kubenswrapper[4823]: I1216 09:03:38.168470 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 16 09:03:38 crc kubenswrapper[4823]: I1216 09:03:38.188552 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cd6f222-5425-4965-ae37-6225b9a87af0-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "7cd6f222-5425-4965-ae37-6225b9a87af0" (UID: "7cd6f222-5425-4965-ae37-6225b9a87af0"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:03:38 crc kubenswrapper[4823]: I1216 09:03:38.192326 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cd6f222-5425-4965-ae37-6225b9a87af0-config-data" (OuterVolumeSpecName: "config-data") pod "7cd6f222-5425-4965-ae37-6225b9a87af0" (UID: "7cd6f222-5425-4965-ae37-6225b9a87af0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:03:38 crc kubenswrapper[4823]: I1216 09:03:38.214368 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n7c7z\" (UniqueName: \"kubernetes.io/projected/7cd6f222-5425-4965-ae37-6225b9a87af0-kube-api-access-n7c7z\") on node \"crc\" DevicePath \"\"" Dec 16 09:03:38 crc kubenswrapper[4823]: I1216 09:03:38.214704 4823 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7cd6f222-5425-4965-ae37-6225b9a87af0-logs\") on node \"crc\" DevicePath \"\"" Dec 16 09:03:38 crc kubenswrapper[4823]: I1216 09:03:38.214715 4823 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7cd6f222-5425-4965-ae37-6225b9a87af0-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 09:03:38 crc kubenswrapper[4823]: I1216 09:03:38.214725 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cd6f222-5425-4965-ae37-6225b9a87af0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 09:03:38 crc kubenswrapper[4823]: I1216 09:03:38.214734 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cd6f222-5425-4965-ae37-6225b9a87af0-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 09:03:38 crc kubenswrapper[4823]: I1216 09:03:38.214744 4823 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7cd6f222-5425-4965-ae37-6225b9a87af0-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 09:03:38 crc kubenswrapper[4823]: I1216 09:03:38.214772 4823 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7cd6f222-5425-4965-ae37-6225b9a87af0-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 16 09:03:38 crc kubenswrapper[4823]: I1216 09:03:38.256080 4823 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/glance-default-internal-api-0"] Dec 16 09:03:38 crc kubenswrapper[4823]: I1216 09:03:38.279589 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 16 09:03:38 crc kubenswrapper[4823]: I1216 09:03:38.286712 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 16 09:03:38 crc kubenswrapper[4823]: E1216 09:03:38.287202 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15fb6d60-bfc3-40af-b514-9cca55e1034f" containerName="glance-log" Dec 16 09:03:38 crc kubenswrapper[4823]: I1216 09:03:38.287217 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="15fb6d60-bfc3-40af-b514-9cca55e1034f" containerName="glance-log" Dec 16 09:03:38 crc kubenswrapper[4823]: E1216 09:03:38.287250 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cd6f222-5425-4965-ae37-6225b9a87af0" containerName="glance-log" Dec 16 09:03:38 crc kubenswrapper[4823]: I1216 09:03:38.287258 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cd6f222-5425-4965-ae37-6225b9a87af0" containerName="glance-log" Dec 16 09:03:38 crc kubenswrapper[4823]: E1216 09:03:38.287277 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15fb6d60-bfc3-40af-b514-9cca55e1034f" containerName="glance-httpd" Dec 16 09:03:38 crc kubenswrapper[4823]: I1216 09:03:38.287284 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="15fb6d60-bfc3-40af-b514-9cca55e1034f" containerName="glance-httpd" Dec 16 09:03:38 crc kubenswrapper[4823]: E1216 09:03:38.287299 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cd6f222-5425-4965-ae37-6225b9a87af0" containerName="glance-httpd" Dec 16 09:03:38 crc kubenswrapper[4823]: I1216 09:03:38.287307 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cd6f222-5425-4965-ae37-6225b9a87af0" containerName="glance-httpd" Dec 16 09:03:38 crc kubenswrapper[4823]: I1216 
09:03:38.287500 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cd6f222-5425-4965-ae37-6225b9a87af0" containerName="glance-log" Dec 16 09:03:38 crc kubenswrapper[4823]: I1216 09:03:38.287523 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cd6f222-5425-4965-ae37-6225b9a87af0" containerName="glance-httpd" Dec 16 09:03:38 crc kubenswrapper[4823]: I1216 09:03:38.287531 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="15fb6d60-bfc3-40af-b514-9cca55e1034f" containerName="glance-httpd" Dec 16 09:03:38 crc kubenswrapper[4823]: I1216 09:03:38.287543 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="15fb6d60-bfc3-40af-b514-9cca55e1034f" containerName="glance-log" Dec 16 09:03:38 crc kubenswrapper[4823]: I1216 09:03:38.289388 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 16 09:03:38 crc kubenswrapper[4823]: I1216 09:03:38.295721 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 16 09:03:38 crc kubenswrapper[4823]: I1216 09:03:38.296119 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 16 09:03:38 crc kubenswrapper[4823]: I1216 09:03:38.315692 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 16 09:03:38 crc kubenswrapper[4823]: I1216 09:03:38.427958 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/60956cfa-c484-445d-af87-52713ccf4d09-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"60956cfa-c484-445d-af87-52713ccf4d09\") " pod="openstack/glance-default-internal-api-0" Dec 16 09:03:38 crc kubenswrapper[4823]: I1216 09:03:38.428039 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/60956cfa-c484-445d-af87-52713ccf4d09-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"60956cfa-c484-445d-af87-52713ccf4d09\") " pod="openstack/glance-default-internal-api-0" Dec 16 09:03:38 crc kubenswrapper[4823]: I1216 09:03:38.428111 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60956cfa-c484-445d-af87-52713ccf4d09-scripts\") pod \"glance-default-internal-api-0\" (UID: \"60956cfa-c484-445d-af87-52713ccf4d09\") " pod="openstack/glance-default-internal-api-0" Dec 16 09:03:38 crc kubenswrapper[4823]: I1216 09:03:38.428140 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60956cfa-c484-445d-af87-52713ccf4d09-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"60956cfa-c484-445d-af87-52713ccf4d09\") " pod="openstack/glance-default-internal-api-0" Dec 16 09:03:38 crc kubenswrapper[4823]: I1216 09:03:38.428357 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgh9p\" (UniqueName: \"kubernetes.io/projected/60956cfa-c484-445d-af87-52713ccf4d09-kube-api-access-tgh9p\") pod \"glance-default-internal-api-0\" (UID: \"60956cfa-c484-445d-af87-52713ccf4d09\") " pod="openstack/glance-default-internal-api-0" Dec 16 09:03:38 crc kubenswrapper[4823]: I1216 09:03:38.428991 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60956cfa-c484-445d-af87-52713ccf4d09-config-data\") pod \"glance-default-internal-api-0\" (UID: \"60956cfa-c484-445d-af87-52713ccf4d09\") " pod="openstack/glance-default-internal-api-0" Dec 16 09:03:38 crc kubenswrapper[4823]: I1216 09:03:38.429104 4823 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60956cfa-c484-445d-af87-52713ccf4d09-logs\") pod \"glance-default-internal-api-0\" (UID: \"60956cfa-c484-445d-af87-52713ccf4d09\") " pod="openstack/glance-default-internal-api-0" Dec 16 09:03:38 crc kubenswrapper[4823]: I1216 09:03:38.516339 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 16 09:03:38 crc kubenswrapper[4823]: I1216 09:03:38.531644 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/60956cfa-c484-445d-af87-52713ccf4d09-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"60956cfa-c484-445d-af87-52713ccf4d09\") " pod="openstack/glance-default-internal-api-0" Dec 16 09:03:38 crc kubenswrapper[4823]: I1216 09:03:38.531734 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60956cfa-c484-445d-af87-52713ccf4d09-scripts\") pod \"glance-default-internal-api-0\" (UID: \"60956cfa-c484-445d-af87-52713ccf4d09\") " pod="openstack/glance-default-internal-api-0" Dec 16 09:03:38 crc kubenswrapper[4823]: I1216 09:03:38.531774 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60956cfa-c484-445d-af87-52713ccf4d09-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"60956cfa-c484-445d-af87-52713ccf4d09\") " pod="openstack/glance-default-internal-api-0" Dec 16 09:03:38 crc kubenswrapper[4823]: I1216 09:03:38.531841 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgh9p\" (UniqueName: \"kubernetes.io/projected/60956cfa-c484-445d-af87-52713ccf4d09-kube-api-access-tgh9p\") pod \"glance-default-internal-api-0\" (UID: \"60956cfa-c484-445d-af87-52713ccf4d09\") " 
pod="openstack/glance-default-internal-api-0" Dec 16 09:03:38 crc kubenswrapper[4823]: I1216 09:03:38.531911 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60956cfa-c484-445d-af87-52713ccf4d09-config-data\") pod \"glance-default-internal-api-0\" (UID: \"60956cfa-c484-445d-af87-52713ccf4d09\") " pod="openstack/glance-default-internal-api-0" Dec 16 09:03:38 crc kubenswrapper[4823]: I1216 09:03:38.531947 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60956cfa-c484-445d-af87-52713ccf4d09-logs\") pod \"glance-default-internal-api-0\" (UID: \"60956cfa-c484-445d-af87-52713ccf4d09\") " pod="openstack/glance-default-internal-api-0" Dec 16 09:03:38 crc kubenswrapper[4823]: I1216 09:03:38.532026 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/60956cfa-c484-445d-af87-52713ccf4d09-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"60956cfa-c484-445d-af87-52713ccf4d09\") " pod="openstack/glance-default-internal-api-0" Dec 16 09:03:38 crc kubenswrapper[4823]: I1216 09:03:38.532614 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/60956cfa-c484-445d-af87-52713ccf4d09-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"60956cfa-c484-445d-af87-52713ccf4d09\") " pod="openstack/glance-default-internal-api-0" Dec 16 09:03:38 crc kubenswrapper[4823]: I1216 09:03:38.535648 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60956cfa-c484-445d-af87-52713ccf4d09-logs\") pod \"glance-default-internal-api-0\" (UID: \"60956cfa-c484-445d-af87-52713ccf4d09\") " pod="openstack/glance-default-internal-api-0" Dec 16 09:03:38 crc kubenswrapper[4823]: I1216 09:03:38.541711 4823 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60956cfa-c484-445d-af87-52713ccf4d09-scripts\") pod \"glance-default-internal-api-0\" (UID: \"60956cfa-c484-445d-af87-52713ccf4d09\") " pod="openstack/glance-default-internal-api-0" Dec 16 09:03:38 crc kubenswrapper[4823]: I1216 09:03:38.542197 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/60956cfa-c484-445d-af87-52713ccf4d09-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"60956cfa-c484-445d-af87-52713ccf4d09\") " pod="openstack/glance-default-internal-api-0" Dec 16 09:03:38 crc kubenswrapper[4823]: I1216 09:03:38.542570 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60956cfa-c484-445d-af87-52713ccf4d09-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"60956cfa-c484-445d-af87-52713ccf4d09\") " pod="openstack/glance-default-internal-api-0" Dec 16 09:03:38 crc kubenswrapper[4823]: I1216 09:03:38.544966 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60956cfa-c484-445d-af87-52713ccf4d09-config-data\") pod \"glance-default-internal-api-0\" (UID: \"60956cfa-c484-445d-af87-52713ccf4d09\") " pod="openstack/glance-default-internal-api-0" Dec 16 09:03:38 crc kubenswrapper[4823]: I1216 09:03:38.547862 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 16 09:03:38 crc kubenswrapper[4823]: I1216 09:03:38.563764 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 16 09:03:38 crc kubenswrapper[4823]: I1216 09:03:38.566543 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 16 09:03:38 crc kubenswrapper[4823]: I1216 09:03:38.573986 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 16 09:03:38 crc kubenswrapper[4823]: I1216 09:03:38.574082 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 16 09:03:38 crc kubenswrapper[4823]: I1216 09:03:38.576815 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgh9p\" (UniqueName: \"kubernetes.io/projected/60956cfa-c484-445d-af87-52713ccf4d09-kube-api-access-tgh9p\") pod \"glance-default-internal-api-0\" (UID: \"60956cfa-c484-445d-af87-52713ccf4d09\") " pod="openstack/glance-default-internal-api-0" Dec 16 09:03:38 crc kubenswrapper[4823]: I1216 09:03:38.578569 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 16 09:03:38 crc kubenswrapper[4823]: I1216 09:03:38.625322 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 16 09:03:38 crc kubenswrapper[4823]: I1216 09:03:38.739077 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d06b91f8-1fcd-40fe-b712-0549d99258c6-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d06b91f8-1fcd-40fe-b712-0549d99258c6\") " pod="openstack/glance-default-external-api-0" Dec 16 09:03:38 crc kubenswrapper[4823]: I1216 09:03:38.739121 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wr68m\" (UniqueName: \"kubernetes.io/projected/d06b91f8-1fcd-40fe-b712-0549d99258c6-kube-api-access-wr68m\") pod \"glance-default-external-api-0\" (UID: \"d06b91f8-1fcd-40fe-b712-0549d99258c6\") " pod="openstack/glance-default-external-api-0" Dec 16 09:03:38 crc kubenswrapper[4823]: I1216 09:03:38.739196 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d06b91f8-1fcd-40fe-b712-0549d99258c6-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"d06b91f8-1fcd-40fe-b712-0549d99258c6\") " pod="openstack/glance-default-external-api-0" Dec 16 09:03:38 crc kubenswrapper[4823]: I1216 09:03:38.739219 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d06b91f8-1fcd-40fe-b712-0549d99258c6-config-data\") pod \"glance-default-external-api-0\" (UID: \"d06b91f8-1fcd-40fe-b712-0549d99258c6\") " pod="openstack/glance-default-external-api-0" Dec 16 09:03:38 crc kubenswrapper[4823]: I1216 09:03:38.739239 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d06b91f8-1fcd-40fe-b712-0549d99258c6-scripts\") pod 
\"glance-default-external-api-0\" (UID: \"d06b91f8-1fcd-40fe-b712-0549d99258c6\") " pod="openstack/glance-default-external-api-0" Dec 16 09:03:38 crc kubenswrapper[4823]: I1216 09:03:38.739474 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d06b91f8-1fcd-40fe-b712-0549d99258c6-logs\") pod \"glance-default-external-api-0\" (UID: \"d06b91f8-1fcd-40fe-b712-0549d99258c6\") " pod="openstack/glance-default-external-api-0" Dec 16 09:03:38 crc kubenswrapper[4823]: I1216 09:03:38.739927 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d06b91f8-1fcd-40fe-b712-0549d99258c6-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"d06b91f8-1fcd-40fe-b712-0549d99258c6\") " pod="openstack/glance-default-external-api-0" Dec 16 09:03:38 crc kubenswrapper[4823]: I1216 09:03:38.842290 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d06b91f8-1fcd-40fe-b712-0549d99258c6-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"d06b91f8-1fcd-40fe-b712-0549d99258c6\") " pod="openstack/glance-default-external-api-0" Dec 16 09:03:38 crc kubenswrapper[4823]: I1216 09:03:38.842396 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d06b91f8-1fcd-40fe-b712-0549d99258c6-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d06b91f8-1fcd-40fe-b712-0549d99258c6\") " pod="openstack/glance-default-external-api-0" Dec 16 09:03:38 crc kubenswrapper[4823]: I1216 09:03:38.842419 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wr68m\" (UniqueName: \"kubernetes.io/projected/d06b91f8-1fcd-40fe-b712-0549d99258c6-kube-api-access-wr68m\") pod 
\"glance-default-external-api-0\" (UID: \"d06b91f8-1fcd-40fe-b712-0549d99258c6\") " pod="openstack/glance-default-external-api-0" Dec 16 09:03:38 crc kubenswrapper[4823]: I1216 09:03:38.842468 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d06b91f8-1fcd-40fe-b712-0549d99258c6-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"d06b91f8-1fcd-40fe-b712-0549d99258c6\") " pod="openstack/glance-default-external-api-0" Dec 16 09:03:38 crc kubenswrapper[4823]: I1216 09:03:38.842489 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d06b91f8-1fcd-40fe-b712-0549d99258c6-config-data\") pod \"glance-default-external-api-0\" (UID: \"d06b91f8-1fcd-40fe-b712-0549d99258c6\") " pod="openstack/glance-default-external-api-0" Dec 16 09:03:38 crc kubenswrapper[4823]: I1216 09:03:38.842514 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d06b91f8-1fcd-40fe-b712-0549d99258c6-scripts\") pod \"glance-default-external-api-0\" (UID: \"d06b91f8-1fcd-40fe-b712-0549d99258c6\") " pod="openstack/glance-default-external-api-0" Dec 16 09:03:38 crc kubenswrapper[4823]: I1216 09:03:38.842567 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d06b91f8-1fcd-40fe-b712-0549d99258c6-logs\") pod \"glance-default-external-api-0\" (UID: \"d06b91f8-1fcd-40fe-b712-0549d99258c6\") " pod="openstack/glance-default-external-api-0" Dec 16 09:03:38 crc kubenswrapper[4823]: I1216 09:03:38.843232 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d06b91f8-1fcd-40fe-b712-0549d99258c6-logs\") pod \"glance-default-external-api-0\" (UID: \"d06b91f8-1fcd-40fe-b712-0549d99258c6\") " 
pod="openstack/glance-default-external-api-0" Dec 16 09:03:38 crc kubenswrapper[4823]: I1216 09:03:38.844582 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d06b91f8-1fcd-40fe-b712-0549d99258c6-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d06b91f8-1fcd-40fe-b712-0549d99258c6\") " pod="openstack/glance-default-external-api-0" Dec 16 09:03:38 crc kubenswrapper[4823]: I1216 09:03:38.850161 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d06b91f8-1fcd-40fe-b712-0549d99258c6-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"d06b91f8-1fcd-40fe-b712-0549d99258c6\") " pod="openstack/glance-default-external-api-0" Dec 16 09:03:38 crc kubenswrapper[4823]: I1216 09:03:38.861858 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d06b91f8-1fcd-40fe-b712-0549d99258c6-scripts\") pod \"glance-default-external-api-0\" (UID: \"d06b91f8-1fcd-40fe-b712-0549d99258c6\") " pod="openstack/glance-default-external-api-0" Dec 16 09:03:38 crc kubenswrapper[4823]: I1216 09:03:38.865689 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d06b91f8-1fcd-40fe-b712-0549d99258c6-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"d06b91f8-1fcd-40fe-b712-0549d99258c6\") " pod="openstack/glance-default-external-api-0" Dec 16 09:03:38 crc kubenswrapper[4823]: I1216 09:03:38.867399 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wr68m\" (UniqueName: \"kubernetes.io/projected/d06b91f8-1fcd-40fe-b712-0549d99258c6-kube-api-access-wr68m\") pod \"glance-default-external-api-0\" (UID: \"d06b91f8-1fcd-40fe-b712-0549d99258c6\") " pod="openstack/glance-default-external-api-0" Dec 16 09:03:38 crc kubenswrapper[4823]: 
I1216 09:03:38.871539 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d06b91f8-1fcd-40fe-b712-0549d99258c6-config-data\") pod \"glance-default-external-api-0\" (UID: \"d06b91f8-1fcd-40fe-b712-0549d99258c6\") " pod="openstack/glance-default-external-api-0" Dec 16 09:03:38 crc kubenswrapper[4823]: I1216 09:03:38.959201 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 16 09:03:39 crc kubenswrapper[4823]: I1216 09:03:39.786959 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15fb6d60-bfc3-40af-b514-9cca55e1034f" path="/var/lib/kubelet/pods/15fb6d60-bfc3-40af-b514-9cca55e1034f/volumes" Dec 16 09:03:39 crc kubenswrapper[4823]: I1216 09:03:39.788366 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7cd6f222-5425-4965-ae37-6225b9a87af0" path="/var/lib/kubelet/pods/7cd6f222-5425-4965-ae37-6225b9a87af0/volumes" Dec 16 09:03:43 crc kubenswrapper[4823]: I1216 09:03:43.298639 4823 scope.go:117] "RemoveContainer" containerID="05d1315267b8387ccc39e7df784990aaac16fc7471cf33a43d40495a1621dc86" Dec 16 09:03:43 crc kubenswrapper[4823]: I1216 09:03:43.407405 4823 scope.go:117] "RemoveContainer" containerID="d795b79a5a844c87a5782ea4b792e37ae5a545d6199ec26fe830a9788c4e32cf" Dec 16 09:03:43 crc kubenswrapper[4823]: I1216 09:03:43.463352 4823 scope.go:117] "RemoveContainer" containerID="afeb2bc4d654f2ce24095a64b2ca9f2aa84f48dcb968b966f6c9b662c708863d" Dec 16 09:03:43 crc kubenswrapper[4823]: I1216 09:03:43.831255 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 16 09:03:43 crc kubenswrapper[4823]: W1216 09:03:43.837609 4823 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd06b91f8_1fcd_40fe_b712_0549d99258c6.slice/crio-be4ced60375d9806a581283d0c18ec61e575bf49cc10b2db539a8ef146283427 WatchSource:0}: Error finding container be4ced60375d9806a581283d0c18ec61e575bf49cc10b2db539a8ef146283427: Status 404 returned error can't find the container with id be4ced60375d9806a581283d0c18ec61e575bf49cc10b2db539a8ef146283427 Dec 16 09:03:43 crc kubenswrapper[4823]: I1216 09:03:43.935301 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 16 09:03:43 crc kubenswrapper[4823]: W1216 09:03:43.944766 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod60956cfa_c484_445d_af87_52713ccf4d09.slice/crio-11521121ebfde6521f17a69f42825d1def922a1e72b4028195b3d3d4fd35c7ff WatchSource:0}: Error finding container 11521121ebfde6521f17a69f42825d1def922a1e72b4028195b3d3d4fd35c7ff: Status 404 returned error can't find the container with id 11521121ebfde6521f17a69f42825d1def922a1e72b4028195b3d3d4fd35c7ff Dec 16 09:03:44 crc kubenswrapper[4823]: I1216 09:03:44.234385 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d06b91f8-1fcd-40fe-b712-0549d99258c6","Type":"ContainerStarted","Data":"be4ced60375d9806a581283d0c18ec61e575bf49cc10b2db539a8ef146283427"} Dec 16 09:03:44 crc kubenswrapper[4823]: I1216 09:03:44.236420 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-846b9d8c47-mwnqz" event={"ID":"e94041cd-e923-45c7-b6d3-b44b9266507b","Type":"ContainerStarted","Data":"a6ae3c2e8449570c44aa8c094963244e0478c8157a0a686c49273ba56629ec5d"} Dec 16 09:03:44 crc kubenswrapper[4823]: I1216 09:03:44.239548 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"60956cfa-c484-445d-af87-52713ccf4d09","Type":"ContainerStarted","Data":"11521121ebfde6521f17a69f42825d1def922a1e72b4028195b3d3d4fd35c7ff"} Dec 16 09:03:44 crc kubenswrapper[4823]: I1216 09:03:44.244809 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-597945959b-4wxf8" event={"ID":"a863b977-cf8b-4e6a-833e-2da9cf17dc24","Type":"ContainerStarted","Data":"f2812b29890ac395c2f5f98caf282d473111e60b65cb29c67bd2f7d1b3674107"} Dec 16 09:03:44 crc kubenswrapper[4823]: I1216 09:03:44.247661 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6656574c5c-kbgz2" event={"ID":"70f395b1-ad4e-40f8-91ac-025773e74846","Type":"ContainerStarted","Data":"515b7dae94576581dc6b4fa8a87ba11a3e0269fe830f102f212d2e658fb47587"} Dec 16 09:03:44 crc kubenswrapper[4823]: I1216 09:03:44.249215 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5fc95447c4-jfpp8" event={"ID":"28373e9d-544d-40d4-8517-51e6718b9493","Type":"ContainerStarted","Data":"304daf66f20219103734d58ce5cff3122507318b7f01170a8de3b259a5a31f50"} Dec 16 09:03:45 crc kubenswrapper[4823]: I1216 09:03:45.260113 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-597945959b-4wxf8" event={"ID":"a863b977-cf8b-4e6a-833e-2da9cf17dc24","Type":"ContainerStarted","Data":"3fb367ad0a1ebccafffe7fff5169857da05007085dea8c2c6f82effda11cacfc"} Dec 16 09:03:45 crc kubenswrapper[4823]: I1216 09:03:45.264256 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6656574c5c-kbgz2" event={"ID":"70f395b1-ad4e-40f8-91ac-025773e74846","Type":"ContainerStarted","Data":"956587a12b7f02bfc10c24526dc3cd4f6c5447c1f84ddb1b84876857d275ce2f"} Dec 16 09:03:45 crc kubenswrapper[4823]: I1216 09:03:45.264308 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6656574c5c-kbgz2" podUID="70f395b1-ad4e-40f8-91ac-025773e74846" containerName="horizon-log" 
containerID="cri-o://515b7dae94576581dc6b4fa8a87ba11a3e0269fe830f102f212d2e658fb47587" gracePeriod=30 Dec 16 09:03:45 crc kubenswrapper[4823]: I1216 09:03:45.264315 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6656574c5c-kbgz2" podUID="70f395b1-ad4e-40f8-91ac-025773e74846" containerName="horizon" containerID="cri-o://956587a12b7f02bfc10c24526dc3cd4f6c5447c1f84ddb1b84876857d275ce2f" gracePeriod=30 Dec 16 09:03:45 crc kubenswrapper[4823]: I1216 09:03:45.274270 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5fc95447c4-jfpp8" event={"ID":"28373e9d-544d-40d4-8517-51e6718b9493","Type":"ContainerStarted","Data":"ea9aa1f918811c0b1fc9ff20658bcd5bde81f67878b0d287ad886928e5de1fba"} Dec 16 09:03:45 crc kubenswrapper[4823]: I1216 09:03:45.276186 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d06b91f8-1fcd-40fe-b712-0549d99258c6","Type":"ContainerStarted","Data":"9a93aa1b4c75390a6ef3fa58db9fd87ea0983cf80d04ef79278da7ac5a212dbe"} Dec 16 09:03:45 crc kubenswrapper[4823]: I1216 09:03:45.278129 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-846b9d8c47-mwnqz" event={"ID":"e94041cd-e923-45c7-b6d3-b44b9266507b","Type":"ContainerStarted","Data":"685597b4387828d19d248d61f579ab1a8a1c4ebf32badbf5a4e53463f540dac8"} Dec 16 09:03:45 crc kubenswrapper[4823]: I1216 09:03:45.278182 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-846b9d8c47-mwnqz" podUID="e94041cd-e923-45c7-b6d3-b44b9266507b" containerName="horizon-log" containerID="cri-o://a6ae3c2e8449570c44aa8c094963244e0478c8157a0a686c49273ba56629ec5d" gracePeriod=30 Dec 16 09:03:45 crc kubenswrapper[4823]: I1216 09:03:45.278201 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-846b9d8c47-mwnqz" podUID="e94041cd-e923-45c7-b6d3-b44b9266507b" containerName="horizon" 
containerID="cri-o://685597b4387828d19d248d61f579ab1a8a1c4ebf32badbf5a4e53463f540dac8" gracePeriod=30 Dec 16 09:03:45 crc kubenswrapper[4823]: I1216 09:03:45.284729 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"60956cfa-c484-445d-af87-52713ccf4d09","Type":"ContainerStarted","Data":"ae7cf328f2dddbc80841007ae8ef6edc83650ff4a0d553d7b2dca17acae597a1"} Dec 16 09:03:45 crc kubenswrapper[4823]: I1216 09:03:45.291671 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-597945959b-4wxf8" podStartSLOduration=3.047636654 podStartE2EDuration="10.291646233s" podCreationTimestamp="2025-12-16 09:03:35 +0000 UTC" firstStartedPulling="2025-12-16 09:03:36.22374428 +0000 UTC m=+7694.712310403" lastFinishedPulling="2025-12-16 09:03:43.467753849 +0000 UTC m=+7701.956319982" observedRunningTime="2025-12-16 09:03:45.284322923 +0000 UTC m=+7703.772889066" watchObservedRunningTime="2025-12-16 09:03:45.291646233 +0000 UTC m=+7703.780212356" Dec 16 09:03:45 crc kubenswrapper[4823]: I1216 09:03:45.320908 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-5fc95447c4-jfpp8" podStartSLOduration=3.200187367 podStartE2EDuration="10.320886917s" podCreationTimestamp="2025-12-16 09:03:35 +0000 UTC" firstStartedPulling="2025-12-16 09:03:36.389963672 +0000 UTC m=+7694.878529805" lastFinishedPulling="2025-12-16 09:03:43.510663232 +0000 UTC m=+7701.999229355" observedRunningTime="2025-12-16 09:03:45.308836831 +0000 UTC m=+7703.797402954" watchObservedRunningTime="2025-12-16 09:03:45.320886917 +0000 UTC m=+7703.809453040" Dec 16 09:03:45 crc kubenswrapper[4823]: I1216 09:03:45.338084 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-846b9d8c47-mwnqz" podStartSLOduration=2.808196774 podStartE2EDuration="12.338062245s" podCreationTimestamp="2025-12-16 09:03:33 +0000 UTC" firstStartedPulling="2025-12-16 
09:03:33.937651071 +0000 UTC m=+7692.426217194" lastFinishedPulling="2025-12-16 09:03:43.467516542 +0000 UTC m=+7701.956082665" observedRunningTime="2025-12-16 09:03:45.328644211 +0000 UTC m=+7703.817210334" watchObservedRunningTime="2025-12-16 09:03:45.338062245 +0000 UTC m=+7703.826628368" Dec 16 09:03:45 crc kubenswrapper[4823]: I1216 09:03:45.359864 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-6656574c5c-kbgz2" podStartSLOduration=2.943368985 podStartE2EDuration="12.359837736s" podCreationTimestamp="2025-12-16 09:03:33 +0000 UTC" firstStartedPulling="2025-12-16 09:03:34.120746502 +0000 UTC m=+7692.609312625" lastFinishedPulling="2025-12-16 09:03:43.537215253 +0000 UTC m=+7702.025781376" observedRunningTime="2025-12-16 09:03:45.346075256 +0000 UTC m=+7703.834641389" watchObservedRunningTime="2025-12-16 09:03:45.359837736 +0000 UTC m=+7703.848403859" Dec 16 09:03:45 crc kubenswrapper[4823]: I1216 09:03:45.712662 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-597945959b-4wxf8" Dec 16 09:03:45 crc kubenswrapper[4823]: I1216 09:03:45.712704 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-597945959b-4wxf8" Dec 16 09:03:45 crc kubenswrapper[4823]: I1216 09:03:45.791643 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5fc95447c4-jfpp8" Dec 16 09:03:45 crc kubenswrapper[4823]: I1216 09:03:45.791694 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-5fc95447c4-jfpp8" Dec 16 09:03:46 crc kubenswrapper[4823]: I1216 09:03:46.311746 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d06b91f8-1fcd-40fe-b712-0549d99258c6","Type":"ContainerStarted","Data":"5d6cc389cc0a251a9367e2e3b78544eb67ee1e7cee3e16ec15b4a605c6de77ee"} Dec 16 09:03:46 crc kubenswrapper[4823]: I1216 
09:03:46.321728 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"60956cfa-c484-445d-af87-52713ccf4d09","Type":"ContainerStarted","Data":"4b43c6a9df3e6ee0304d2e089fa50a0bfce77767a4afdf5ac55c13501c52cd9d"} Dec 16 09:03:46 crc kubenswrapper[4823]: I1216 09:03:46.351581 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=8.351557345 podStartE2EDuration="8.351557345s" podCreationTimestamp="2025-12-16 09:03:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 09:03:46.340739307 +0000 UTC m=+7704.829305480" watchObservedRunningTime="2025-12-16 09:03:46.351557345 +0000 UTC m=+7704.840123478" Dec 16 09:03:46 crc kubenswrapper[4823]: I1216 09:03:46.402139 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=8.402108147 podStartE2EDuration="8.402108147s" podCreationTimestamp="2025-12-16 09:03:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 09:03:46.386912672 +0000 UTC m=+7704.875478815" watchObservedRunningTime="2025-12-16 09:03:46.402108147 +0000 UTC m=+7704.890674300" Dec 16 09:03:48 crc kubenswrapper[4823]: I1216 09:03:48.625905 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 16 09:03:48 crc kubenswrapper[4823]: I1216 09:03:48.626419 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 16 09:03:48 crc kubenswrapper[4823]: I1216 09:03:48.666836 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 16 09:03:48 crc 
kubenswrapper[4823]: I1216 09:03:48.676294 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 16 09:03:48 crc kubenswrapper[4823]: I1216 09:03:48.960170 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 16 09:03:48 crc kubenswrapper[4823]: I1216 09:03:48.960324 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 16 09:03:49 crc kubenswrapper[4823]: I1216 09:03:49.010654 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 16 09:03:49 crc kubenswrapper[4823]: I1216 09:03:49.011575 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 16 09:03:49 crc kubenswrapper[4823]: I1216 09:03:49.342794 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 16 09:03:49 crc kubenswrapper[4823]: I1216 09:03:49.342837 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 16 09:03:49 crc kubenswrapper[4823]: I1216 09:03:49.342848 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 16 09:03:49 crc kubenswrapper[4823]: I1216 09:03:49.342857 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 16 09:03:51 crc kubenswrapper[4823]: I1216 09:03:51.482552 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 16 09:03:51 crc kubenswrapper[4823]: I1216 09:03:51.511628 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 16 09:03:51 crc 
kubenswrapper[4823]: I1216 09:03:51.528249 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 16 09:03:52 crc kubenswrapper[4823]: I1216 09:03:52.418754 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 16 09:03:53 crc kubenswrapper[4823]: I1216 09:03:53.434764 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-846b9d8c47-mwnqz" Dec 16 09:03:53 crc kubenswrapper[4823]: I1216 09:03:53.549670 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6656574c5c-kbgz2" Dec 16 09:03:55 crc kubenswrapper[4823]: I1216 09:03:55.715764 4823 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-597945959b-4wxf8" podUID="a863b977-cf8b-4e6a-833e-2da9cf17dc24" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.107:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.107:8443: connect: connection refused" Dec 16 09:03:55 crc kubenswrapper[4823]: I1216 09:03:55.793391 4823 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-5fc95447c4-jfpp8" podUID="28373e9d-544d-40d4-8517-51e6718b9493" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.108:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.108:8443: connect: connection refused" Dec 16 09:03:58 crc kubenswrapper[4823]: I1216 09:03:58.133707 4823 patch_prober.go:28] interesting pod/machine-config-daemon-fv56f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 09:03:58 crc kubenswrapper[4823]: I1216 09:03:58.134124 4823 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 09:04:02 crc kubenswrapper[4823]: I1216 09:04:02.063207 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-75d9-account-create-update-c2qxb"] Dec 16 09:04:02 crc kubenswrapper[4823]: I1216 09:04:02.073973 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-crw7r"] Dec 16 09:04:02 crc kubenswrapper[4823]: I1216 09:04:02.087618 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-75d9-account-create-update-c2qxb"] Dec 16 09:04:02 crc kubenswrapper[4823]: I1216 09:04:02.098052 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-crw7r"] Dec 16 09:04:03 crc kubenswrapper[4823]: I1216 09:04:03.786689 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e16ede9-a9c3-45f5-a5ff-beca730a92ff" path="/var/lib/kubelet/pods/1e16ede9-a9c3-45f5-a5ff-beca730a92ff/volumes" Dec 16 09:04:03 crc kubenswrapper[4823]: I1216 09:04:03.788062 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37f2e0a9-8049-4cac-855c-7da22cf8c4fe" path="/var/lib/kubelet/pods/37f2e0a9-8049-4cac-855c-7da22cf8c4fe/volumes" Dec 16 09:04:07 crc kubenswrapper[4823]: I1216 09:04:07.652382 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-597945959b-4wxf8" Dec 16 09:04:07 crc kubenswrapper[4823]: I1216 09:04:07.703348 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-5fc95447c4-jfpp8" Dec 16 09:04:09 crc kubenswrapper[4823]: I1216 09:04:09.439352 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-597945959b-4wxf8" Dec 16 09:04:09 
crc kubenswrapper[4823]: I1216 09:04:09.504577 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-5fc95447c4-jfpp8" Dec 16 09:04:09 crc kubenswrapper[4823]: I1216 09:04:09.576332 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-597945959b-4wxf8"] Dec 16 09:04:09 crc kubenswrapper[4823]: I1216 09:04:09.576622 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-597945959b-4wxf8" podUID="a863b977-cf8b-4e6a-833e-2da9cf17dc24" containerName="horizon-log" containerID="cri-o://f2812b29890ac395c2f5f98caf282d473111e60b65cb29c67bd2f7d1b3674107" gracePeriod=30 Dec 16 09:04:09 crc kubenswrapper[4823]: I1216 09:04:09.576801 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-597945959b-4wxf8" podUID="a863b977-cf8b-4e6a-833e-2da9cf17dc24" containerName="horizon" containerID="cri-o://3fb367ad0a1ebccafffe7fff5169857da05007085dea8c2c6f82effda11cacfc" gracePeriod=30 Dec 16 09:04:13 crc kubenswrapper[4823]: I1216 09:04:13.041473 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-mcsnt"] Dec 16 09:04:13 crc kubenswrapper[4823]: I1216 09:04:13.077597 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-mcsnt"] Dec 16 09:04:13 crc kubenswrapper[4823]: I1216 09:04:13.588610 4823 generic.go:334] "Generic (PLEG): container finished" podID="a863b977-cf8b-4e6a-833e-2da9cf17dc24" containerID="3fb367ad0a1ebccafffe7fff5169857da05007085dea8c2c6f82effda11cacfc" exitCode=0 Dec 16 09:04:13 crc kubenswrapper[4823]: I1216 09:04:13.588679 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-597945959b-4wxf8" event={"ID":"a863b977-cf8b-4e6a-833e-2da9cf17dc24","Type":"ContainerDied","Data":"3fb367ad0a1ebccafffe7fff5169857da05007085dea8c2c6f82effda11cacfc"} Dec 16 09:04:13 crc kubenswrapper[4823]: I1216 09:04:13.785689 4823 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="a25b8be3-cf0e-4682-b7a2-b56f101a23e4" path="/var/lib/kubelet/pods/a25b8be3-cf0e-4682-b7a2-b56f101a23e4/volumes" Dec 16 09:04:15 crc kubenswrapper[4823]: I1216 09:04:15.612445 4823 generic.go:334] "Generic (PLEG): container finished" podID="70f395b1-ad4e-40f8-91ac-025773e74846" containerID="515b7dae94576581dc6b4fa8a87ba11a3e0269fe830f102f212d2e658fb47587" exitCode=137 Dec 16 09:04:15 crc kubenswrapper[4823]: I1216 09:04:15.612517 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6656574c5c-kbgz2" event={"ID":"70f395b1-ad4e-40f8-91ac-025773e74846","Type":"ContainerDied","Data":"515b7dae94576581dc6b4fa8a87ba11a3e0269fe830f102f212d2e658fb47587"} Dec 16 09:04:15 crc kubenswrapper[4823]: I1216 09:04:15.713694 4823 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-597945959b-4wxf8" podUID="a863b977-cf8b-4e6a-833e-2da9cf17dc24" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.107:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.107:8443: connect: connection refused" Dec 16 09:04:16 crc kubenswrapper[4823]: I1216 09:04:16.216493 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6656574c5c-kbgz2" Dec 16 09:04:16 crc kubenswrapper[4823]: I1216 09:04:16.335872 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4ksfh\" (UniqueName: \"kubernetes.io/projected/70f395b1-ad4e-40f8-91ac-025773e74846-kube-api-access-4ksfh\") pod \"70f395b1-ad4e-40f8-91ac-025773e74846\" (UID: \"70f395b1-ad4e-40f8-91ac-025773e74846\") " Dec 16 09:04:16 crc kubenswrapper[4823]: I1216 09:04:16.335938 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/70f395b1-ad4e-40f8-91ac-025773e74846-horizon-secret-key\") pod \"70f395b1-ad4e-40f8-91ac-025773e74846\" (UID: \"70f395b1-ad4e-40f8-91ac-025773e74846\") " Dec 16 09:04:16 crc kubenswrapper[4823]: I1216 09:04:16.335985 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/70f395b1-ad4e-40f8-91ac-025773e74846-logs\") pod \"70f395b1-ad4e-40f8-91ac-025773e74846\" (UID: \"70f395b1-ad4e-40f8-91ac-025773e74846\") " Dec 16 09:04:16 crc kubenswrapper[4823]: I1216 09:04:16.336133 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/70f395b1-ad4e-40f8-91ac-025773e74846-scripts\") pod \"70f395b1-ad4e-40f8-91ac-025773e74846\" (UID: \"70f395b1-ad4e-40f8-91ac-025773e74846\") " Dec 16 09:04:16 crc kubenswrapper[4823]: I1216 09:04:16.336242 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/70f395b1-ad4e-40f8-91ac-025773e74846-config-data\") pod \"70f395b1-ad4e-40f8-91ac-025773e74846\" (UID: \"70f395b1-ad4e-40f8-91ac-025773e74846\") " Dec 16 09:04:16 crc kubenswrapper[4823]: I1216 09:04:16.336816 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/70f395b1-ad4e-40f8-91ac-025773e74846-logs" (OuterVolumeSpecName: "logs") pod "70f395b1-ad4e-40f8-91ac-025773e74846" (UID: "70f395b1-ad4e-40f8-91ac-025773e74846"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 09:04:16 crc kubenswrapper[4823]: I1216 09:04:16.337141 4823 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/70f395b1-ad4e-40f8-91ac-025773e74846-logs\") on node \"crc\" DevicePath \"\"" Dec 16 09:04:16 crc kubenswrapper[4823]: I1216 09:04:16.343210 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70f395b1-ad4e-40f8-91ac-025773e74846-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "70f395b1-ad4e-40f8-91ac-025773e74846" (UID: "70f395b1-ad4e-40f8-91ac-025773e74846"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:04:16 crc kubenswrapper[4823]: I1216 09:04:16.343755 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70f395b1-ad4e-40f8-91ac-025773e74846-kube-api-access-4ksfh" (OuterVolumeSpecName: "kube-api-access-4ksfh") pod "70f395b1-ad4e-40f8-91ac-025773e74846" (UID: "70f395b1-ad4e-40f8-91ac-025773e74846"). InnerVolumeSpecName "kube-api-access-4ksfh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:04:16 crc kubenswrapper[4823]: I1216 09:04:16.344740 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-846b9d8c47-mwnqz" Dec 16 09:04:16 crc kubenswrapper[4823]: I1216 09:04:16.373682 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70f395b1-ad4e-40f8-91ac-025773e74846-config-data" (OuterVolumeSpecName: "config-data") pod "70f395b1-ad4e-40f8-91ac-025773e74846" (UID: "70f395b1-ad4e-40f8-91ac-025773e74846"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 09:04:16 crc kubenswrapper[4823]: I1216 09:04:16.384255 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70f395b1-ad4e-40f8-91ac-025773e74846-scripts" (OuterVolumeSpecName: "scripts") pod "70f395b1-ad4e-40f8-91ac-025773e74846" (UID: "70f395b1-ad4e-40f8-91ac-025773e74846"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 09:04:16 crc kubenswrapper[4823]: I1216 09:04:16.438372 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s7dw8\" (UniqueName: \"kubernetes.io/projected/e94041cd-e923-45c7-b6d3-b44b9266507b-kube-api-access-s7dw8\") pod \"e94041cd-e923-45c7-b6d3-b44b9266507b\" (UID: \"e94041cd-e923-45c7-b6d3-b44b9266507b\") " Dec 16 09:04:16 crc kubenswrapper[4823]: I1216 09:04:16.438676 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e94041cd-e923-45c7-b6d3-b44b9266507b-logs\") pod \"e94041cd-e923-45c7-b6d3-b44b9266507b\" (UID: \"e94041cd-e923-45c7-b6d3-b44b9266507b\") " Dec 16 09:04:16 crc kubenswrapper[4823]: I1216 09:04:16.438734 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e94041cd-e923-45c7-b6d3-b44b9266507b-config-data\") pod \"e94041cd-e923-45c7-b6d3-b44b9266507b\" (UID: \"e94041cd-e923-45c7-b6d3-b44b9266507b\") " Dec 16 09:04:16 crc kubenswrapper[4823]: I1216 09:04:16.438762 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e94041cd-e923-45c7-b6d3-b44b9266507b-scripts\") pod \"e94041cd-e923-45c7-b6d3-b44b9266507b\" (UID: \"e94041cd-e923-45c7-b6d3-b44b9266507b\") " Dec 16 09:04:16 crc kubenswrapper[4823]: I1216 09:04:16.438820 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e94041cd-e923-45c7-b6d3-b44b9266507b-horizon-secret-key\") pod \"e94041cd-e923-45c7-b6d3-b44b9266507b\" (UID: \"e94041cd-e923-45c7-b6d3-b44b9266507b\") " Dec 16 09:04:16 crc kubenswrapper[4823]: I1216 09:04:16.439422 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e94041cd-e923-45c7-b6d3-b44b9266507b-logs" (OuterVolumeSpecName: "logs") pod "e94041cd-e923-45c7-b6d3-b44b9266507b" (UID: "e94041cd-e923-45c7-b6d3-b44b9266507b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 09:04:16 crc kubenswrapper[4823]: I1216 09:04:16.439585 4823 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/70f395b1-ad4e-40f8-91ac-025773e74846-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 09:04:16 crc kubenswrapper[4823]: I1216 09:04:16.439612 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/70f395b1-ad4e-40f8-91ac-025773e74846-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 09:04:16 crc kubenswrapper[4823]: I1216 09:04:16.439626 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4ksfh\" (UniqueName: \"kubernetes.io/projected/70f395b1-ad4e-40f8-91ac-025773e74846-kube-api-access-4ksfh\") on node \"crc\" DevicePath \"\"" Dec 16 09:04:16 crc kubenswrapper[4823]: I1216 09:04:16.439640 4823 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/70f395b1-ad4e-40f8-91ac-025773e74846-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 16 09:04:16 crc kubenswrapper[4823]: I1216 09:04:16.442955 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e94041cd-e923-45c7-b6d3-b44b9266507b-kube-api-access-s7dw8" (OuterVolumeSpecName: "kube-api-access-s7dw8") pod 
"e94041cd-e923-45c7-b6d3-b44b9266507b" (UID: "e94041cd-e923-45c7-b6d3-b44b9266507b"). InnerVolumeSpecName "kube-api-access-s7dw8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:04:16 crc kubenswrapper[4823]: I1216 09:04:16.443254 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e94041cd-e923-45c7-b6d3-b44b9266507b-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "e94041cd-e923-45c7-b6d3-b44b9266507b" (UID: "e94041cd-e923-45c7-b6d3-b44b9266507b"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:04:16 crc kubenswrapper[4823]: I1216 09:04:16.470453 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e94041cd-e923-45c7-b6d3-b44b9266507b-config-data" (OuterVolumeSpecName: "config-data") pod "e94041cd-e923-45c7-b6d3-b44b9266507b" (UID: "e94041cd-e923-45c7-b6d3-b44b9266507b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 09:04:16 crc kubenswrapper[4823]: I1216 09:04:16.470588 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e94041cd-e923-45c7-b6d3-b44b9266507b-scripts" (OuterVolumeSpecName: "scripts") pod "e94041cd-e923-45c7-b6d3-b44b9266507b" (UID: "e94041cd-e923-45c7-b6d3-b44b9266507b"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 09:04:16 crc kubenswrapper[4823]: I1216 09:04:16.541618 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s7dw8\" (UniqueName: \"kubernetes.io/projected/e94041cd-e923-45c7-b6d3-b44b9266507b-kube-api-access-s7dw8\") on node \"crc\" DevicePath \"\"" Dec 16 09:04:16 crc kubenswrapper[4823]: I1216 09:04:16.541659 4823 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e94041cd-e923-45c7-b6d3-b44b9266507b-logs\") on node \"crc\" DevicePath \"\"" Dec 16 09:04:16 crc kubenswrapper[4823]: I1216 09:04:16.541674 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e94041cd-e923-45c7-b6d3-b44b9266507b-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 09:04:16 crc kubenswrapper[4823]: I1216 09:04:16.541684 4823 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e94041cd-e923-45c7-b6d3-b44b9266507b-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 09:04:16 crc kubenswrapper[4823]: I1216 09:04:16.541693 4823 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e94041cd-e923-45c7-b6d3-b44b9266507b-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 16 09:04:16 crc kubenswrapper[4823]: I1216 09:04:16.626534 4823 generic.go:334] "Generic (PLEG): container finished" podID="e94041cd-e923-45c7-b6d3-b44b9266507b" containerID="685597b4387828d19d248d61f579ab1a8a1c4ebf32badbf5a4e53463f540dac8" exitCode=137 Dec 16 09:04:16 crc kubenswrapper[4823]: I1216 09:04:16.626565 4823 generic.go:334] "Generic (PLEG): container finished" podID="e94041cd-e923-45c7-b6d3-b44b9266507b" containerID="a6ae3c2e8449570c44aa8c094963244e0478c8157a0a686c49273ba56629ec5d" exitCode=137 Dec 16 09:04:16 crc kubenswrapper[4823]: I1216 09:04:16.626579 4823 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openstack/horizon-846b9d8c47-mwnqz" Dec 16 09:04:16 crc kubenswrapper[4823]: I1216 09:04:16.626637 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-846b9d8c47-mwnqz" event={"ID":"e94041cd-e923-45c7-b6d3-b44b9266507b","Type":"ContainerDied","Data":"685597b4387828d19d248d61f579ab1a8a1c4ebf32badbf5a4e53463f540dac8"} Dec 16 09:04:16 crc kubenswrapper[4823]: I1216 09:04:16.626698 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-846b9d8c47-mwnqz" event={"ID":"e94041cd-e923-45c7-b6d3-b44b9266507b","Type":"ContainerDied","Data":"a6ae3c2e8449570c44aa8c094963244e0478c8157a0a686c49273ba56629ec5d"} Dec 16 09:04:16 crc kubenswrapper[4823]: I1216 09:04:16.626713 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-846b9d8c47-mwnqz" event={"ID":"e94041cd-e923-45c7-b6d3-b44b9266507b","Type":"ContainerDied","Data":"449a88ae4b689ea26586e91c0805a09a5c51de37b97fd99b22d986b9f23e8822"} Dec 16 09:04:16 crc kubenswrapper[4823]: I1216 09:04:16.626733 4823 scope.go:117] "RemoveContainer" containerID="685597b4387828d19d248d61f579ab1a8a1c4ebf32badbf5a4e53463f540dac8" Dec 16 09:04:16 crc kubenswrapper[4823]: I1216 09:04:16.629972 4823 generic.go:334] "Generic (PLEG): container finished" podID="70f395b1-ad4e-40f8-91ac-025773e74846" containerID="956587a12b7f02bfc10c24526dc3cd4f6c5447c1f84ddb1b84876857d275ce2f" exitCode=137 Dec 16 09:04:16 crc kubenswrapper[4823]: I1216 09:04:16.630003 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6656574c5c-kbgz2" event={"ID":"70f395b1-ad4e-40f8-91ac-025773e74846","Type":"ContainerDied","Data":"956587a12b7f02bfc10c24526dc3cd4f6c5447c1f84ddb1b84876857d275ce2f"} Dec 16 09:04:16 crc kubenswrapper[4823]: I1216 09:04:16.630036 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6656574c5c-kbgz2" Dec 16 09:04:16 crc kubenswrapper[4823]: I1216 09:04:16.630059 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6656574c5c-kbgz2" event={"ID":"70f395b1-ad4e-40f8-91ac-025773e74846","Type":"ContainerDied","Data":"19ec408066cd146c034f3e2a10f6b59c4a37a0fa5b3fa69e00d2aca4afb572b9"} Dec 16 09:04:16 crc kubenswrapper[4823]: I1216 09:04:16.681908 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6656574c5c-kbgz2"] Dec 16 09:04:16 crc kubenswrapper[4823]: I1216 09:04:16.690169 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-6656574c5c-kbgz2"] Dec 16 09:04:16 crc kubenswrapper[4823]: I1216 09:04:16.700507 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-846b9d8c47-mwnqz"] Dec 16 09:04:16 crc kubenswrapper[4823]: I1216 09:04:16.709660 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-846b9d8c47-mwnqz"] Dec 16 09:04:16 crc kubenswrapper[4823]: I1216 09:04:16.810394 4823 scope.go:117] "RemoveContainer" containerID="a6ae3c2e8449570c44aa8c094963244e0478c8157a0a686c49273ba56629ec5d" Dec 16 09:04:16 crc kubenswrapper[4823]: I1216 09:04:16.856227 4823 scope.go:117] "RemoveContainer" containerID="685597b4387828d19d248d61f579ab1a8a1c4ebf32badbf5a4e53463f540dac8" Dec 16 09:04:16 crc kubenswrapper[4823]: E1216 09:04:16.856712 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"685597b4387828d19d248d61f579ab1a8a1c4ebf32badbf5a4e53463f540dac8\": container with ID starting with 685597b4387828d19d248d61f579ab1a8a1c4ebf32badbf5a4e53463f540dac8 not found: ID does not exist" containerID="685597b4387828d19d248d61f579ab1a8a1c4ebf32badbf5a4e53463f540dac8" Dec 16 09:04:16 crc kubenswrapper[4823]: I1216 09:04:16.856748 4823 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"685597b4387828d19d248d61f579ab1a8a1c4ebf32badbf5a4e53463f540dac8"} err="failed to get container status \"685597b4387828d19d248d61f579ab1a8a1c4ebf32badbf5a4e53463f540dac8\": rpc error: code = NotFound desc = could not find container \"685597b4387828d19d248d61f579ab1a8a1c4ebf32badbf5a4e53463f540dac8\": container with ID starting with 685597b4387828d19d248d61f579ab1a8a1c4ebf32badbf5a4e53463f540dac8 not found: ID does not exist" Dec 16 09:04:16 crc kubenswrapper[4823]: I1216 09:04:16.856770 4823 scope.go:117] "RemoveContainer" containerID="a6ae3c2e8449570c44aa8c094963244e0478c8157a0a686c49273ba56629ec5d" Dec 16 09:04:16 crc kubenswrapper[4823]: E1216 09:04:16.857904 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6ae3c2e8449570c44aa8c094963244e0478c8157a0a686c49273ba56629ec5d\": container with ID starting with a6ae3c2e8449570c44aa8c094963244e0478c8157a0a686c49273ba56629ec5d not found: ID does not exist" containerID="a6ae3c2e8449570c44aa8c094963244e0478c8157a0a686c49273ba56629ec5d" Dec 16 09:04:16 crc kubenswrapper[4823]: I1216 09:04:16.857939 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6ae3c2e8449570c44aa8c094963244e0478c8157a0a686c49273ba56629ec5d"} err="failed to get container status \"a6ae3c2e8449570c44aa8c094963244e0478c8157a0a686c49273ba56629ec5d\": rpc error: code = NotFound desc = could not find container \"a6ae3c2e8449570c44aa8c094963244e0478c8157a0a686c49273ba56629ec5d\": container with ID starting with a6ae3c2e8449570c44aa8c094963244e0478c8157a0a686c49273ba56629ec5d not found: ID does not exist" Dec 16 09:04:16 crc kubenswrapper[4823]: I1216 09:04:16.857953 4823 scope.go:117] "RemoveContainer" containerID="685597b4387828d19d248d61f579ab1a8a1c4ebf32badbf5a4e53463f540dac8" Dec 16 09:04:16 crc kubenswrapper[4823]: I1216 09:04:16.858218 4823 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"685597b4387828d19d248d61f579ab1a8a1c4ebf32badbf5a4e53463f540dac8"} err="failed to get container status \"685597b4387828d19d248d61f579ab1a8a1c4ebf32badbf5a4e53463f540dac8\": rpc error: code = NotFound desc = could not find container \"685597b4387828d19d248d61f579ab1a8a1c4ebf32badbf5a4e53463f540dac8\": container with ID starting with 685597b4387828d19d248d61f579ab1a8a1c4ebf32badbf5a4e53463f540dac8 not found: ID does not exist" Dec 16 09:04:16 crc kubenswrapper[4823]: I1216 09:04:16.858245 4823 scope.go:117] "RemoveContainer" containerID="a6ae3c2e8449570c44aa8c094963244e0478c8157a0a686c49273ba56629ec5d" Dec 16 09:04:16 crc kubenswrapper[4823]: I1216 09:04:16.858508 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6ae3c2e8449570c44aa8c094963244e0478c8157a0a686c49273ba56629ec5d"} err="failed to get container status \"a6ae3c2e8449570c44aa8c094963244e0478c8157a0a686c49273ba56629ec5d\": rpc error: code = NotFound desc = could not find container \"a6ae3c2e8449570c44aa8c094963244e0478c8157a0a686c49273ba56629ec5d\": container with ID starting with a6ae3c2e8449570c44aa8c094963244e0478c8157a0a686c49273ba56629ec5d not found: ID does not exist" Dec 16 09:04:16 crc kubenswrapper[4823]: I1216 09:04:16.858533 4823 scope.go:117] "RemoveContainer" containerID="956587a12b7f02bfc10c24526dc3cd4f6c5447c1f84ddb1b84876857d275ce2f" Dec 16 09:04:17 crc kubenswrapper[4823]: I1216 09:04:17.017815 4823 scope.go:117] "RemoveContainer" containerID="515b7dae94576581dc6b4fa8a87ba11a3e0269fe830f102f212d2e658fb47587" Dec 16 09:04:17 crc kubenswrapper[4823]: I1216 09:04:17.035207 4823 scope.go:117] "RemoveContainer" containerID="956587a12b7f02bfc10c24526dc3cd4f6c5447c1f84ddb1b84876857d275ce2f" Dec 16 09:04:17 crc kubenswrapper[4823]: E1216 09:04:17.035644 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"956587a12b7f02bfc10c24526dc3cd4f6c5447c1f84ddb1b84876857d275ce2f\": container with ID starting with 956587a12b7f02bfc10c24526dc3cd4f6c5447c1f84ddb1b84876857d275ce2f not found: ID does not exist" containerID="956587a12b7f02bfc10c24526dc3cd4f6c5447c1f84ddb1b84876857d275ce2f" Dec 16 09:04:17 crc kubenswrapper[4823]: I1216 09:04:17.035680 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"956587a12b7f02bfc10c24526dc3cd4f6c5447c1f84ddb1b84876857d275ce2f"} err="failed to get container status \"956587a12b7f02bfc10c24526dc3cd4f6c5447c1f84ddb1b84876857d275ce2f\": rpc error: code = NotFound desc = could not find container \"956587a12b7f02bfc10c24526dc3cd4f6c5447c1f84ddb1b84876857d275ce2f\": container with ID starting with 956587a12b7f02bfc10c24526dc3cd4f6c5447c1f84ddb1b84876857d275ce2f not found: ID does not exist" Dec 16 09:04:17 crc kubenswrapper[4823]: I1216 09:04:17.035702 4823 scope.go:117] "RemoveContainer" containerID="515b7dae94576581dc6b4fa8a87ba11a3e0269fe830f102f212d2e658fb47587" Dec 16 09:04:17 crc kubenswrapper[4823]: E1216 09:04:17.035940 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"515b7dae94576581dc6b4fa8a87ba11a3e0269fe830f102f212d2e658fb47587\": container with ID starting with 515b7dae94576581dc6b4fa8a87ba11a3e0269fe830f102f212d2e658fb47587 not found: ID does not exist" containerID="515b7dae94576581dc6b4fa8a87ba11a3e0269fe830f102f212d2e658fb47587" Dec 16 09:04:17 crc kubenswrapper[4823]: I1216 09:04:17.035966 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"515b7dae94576581dc6b4fa8a87ba11a3e0269fe830f102f212d2e658fb47587"} err="failed to get container status \"515b7dae94576581dc6b4fa8a87ba11a3e0269fe830f102f212d2e658fb47587\": rpc error: code = NotFound desc = could not find container \"515b7dae94576581dc6b4fa8a87ba11a3e0269fe830f102f212d2e658fb47587\": container with ID 
starting with 515b7dae94576581dc6b4fa8a87ba11a3e0269fe830f102f212d2e658fb47587 not found: ID does not exist" Dec 16 09:04:17 crc kubenswrapper[4823]: I1216 09:04:17.786672 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70f395b1-ad4e-40f8-91ac-025773e74846" path="/var/lib/kubelet/pods/70f395b1-ad4e-40f8-91ac-025773e74846/volumes" Dec 16 09:04:17 crc kubenswrapper[4823]: I1216 09:04:17.788383 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e94041cd-e923-45c7-b6d3-b44b9266507b" path="/var/lib/kubelet/pods/e94041cd-e923-45c7-b6d3-b44b9266507b/volumes" Dec 16 09:04:25 crc kubenswrapper[4823]: I1216 09:04:25.713302 4823 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-597945959b-4wxf8" podUID="a863b977-cf8b-4e6a-833e-2da9cf17dc24" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.107:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.107:8443: connect: connection refused" Dec 16 09:04:28 crc kubenswrapper[4823]: I1216 09:04:28.133728 4823 patch_prober.go:28] interesting pod/machine-config-daemon-fv56f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 09:04:28 crc kubenswrapper[4823]: I1216 09:04:28.134148 4823 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 09:04:28 crc kubenswrapper[4823]: I1216 09:04:28.134215 4823 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" Dec 16 09:04:28 crc kubenswrapper[4823]: 
I1216 09:04:28.135427 4823 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"14e51af7fb5c2d7b7fdc9e1989841225a65614d883db6f8d5aea8aeb819bd04d"} pod="openshift-machine-config-operator/machine-config-daemon-fv56f" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 16 09:04:28 crc kubenswrapper[4823]: I1216 09:04:28.135534 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" containerName="machine-config-daemon" containerID="cri-o://14e51af7fb5c2d7b7fdc9e1989841225a65614d883db6f8d5aea8aeb819bd04d" gracePeriod=600 Dec 16 09:04:28 crc kubenswrapper[4823]: I1216 09:04:28.761532 4823 generic.go:334] "Generic (PLEG): container finished" podID="25dec47c-3043-486c-b371-2be103c214e3" containerID="14e51af7fb5c2d7b7fdc9e1989841225a65614d883db6f8d5aea8aeb819bd04d" exitCode=0 Dec 16 09:04:28 crc kubenswrapper[4823]: I1216 09:04:28.761587 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" event={"ID":"25dec47c-3043-486c-b371-2be103c214e3","Type":"ContainerDied","Data":"14e51af7fb5c2d7b7fdc9e1989841225a65614d883db6f8d5aea8aeb819bd04d"} Dec 16 09:04:28 crc kubenswrapper[4823]: I1216 09:04:28.761676 4823 scope.go:117] "RemoveContainer" containerID="c4d9ea4299c018a902750aabeef9dea06ce13b6e55f03c5913f1f492b4b19163" Dec 16 09:04:28 crc kubenswrapper[4823]: E1216 09:04:28.956487 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 09:04:29 crc kubenswrapper[4823]: I1216 09:04:29.775873 4823 scope.go:117] "RemoveContainer" containerID="14e51af7fb5c2d7b7fdc9e1989841225a65614d883db6f8d5aea8aeb819bd04d" Dec 16 09:04:29 crc kubenswrapper[4823]: E1216 09:04:29.776918 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 09:04:35 crc kubenswrapper[4823]: I1216 09:04:35.714475 4823 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-597945959b-4wxf8" podUID="a863b977-cf8b-4e6a-833e-2da9cf17dc24" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.107:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.107:8443: connect: connection refused" Dec 16 09:04:35 crc kubenswrapper[4823]: I1216 09:04:35.715459 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-597945959b-4wxf8" Dec 16 09:04:39 crc kubenswrapper[4823]: I1216 09:04:39.870612 4823 generic.go:334] "Generic (PLEG): container finished" podID="a863b977-cf8b-4e6a-833e-2da9cf17dc24" containerID="f2812b29890ac395c2f5f98caf282d473111e60b65cb29c67bd2f7d1b3674107" exitCode=137 Dec 16 09:04:39 crc kubenswrapper[4823]: I1216 09:04:39.870681 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-597945959b-4wxf8" event={"ID":"a863b977-cf8b-4e6a-833e-2da9cf17dc24","Type":"ContainerDied","Data":"f2812b29890ac395c2f5f98caf282d473111e60b65cb29c67bd2f7d1b3674107"} Dec 16 09:04:40 crc kubenswrapper[4823]: I1216 09:04:40.106149 4823 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-597945959b-4wxf8" Dec 16 09:04:40 crc kubenswrapper[4823]: I1216 09:04:40.164406 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/a863b977-cf8b-4e6a-833e-2da9cf17dc24-horizon-tls-certs\") pod \"a863b977-cf8b-4e6a-833e-2da9cf17dc24\" (UID: \"a863b977-cf8b-4e6a-833e-2da9cf17dc24\") " Dec 16 09:04:40 crc kubenswrapper[4823]: I1216 09:04:40.164487 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a863b977-cf8b-4e6a-833e-2da9cf17dc24-logs\") pod \"a863b977-cf8b-4e6a-833e-2da9cf17dc24\" (UID: \"a863b977-cf8b-4e6a-833e-2da9cf17dc24\") " Dec 16 09:04:40 crc kubenswrapper[4823]: I1216 09:04:40.164522 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a863b977-cf8b-4e6a-833e-2da9cf17dc24-horizon-secret-key\") pod \"a863b977-cf8b-4e6a-833e-2da9cf17dc24\" (UID: \"a863b977-cf8b-4e6a-833e-2da9cf17dc24\") " Dec 16 09:04:40 crc kubenswrapper[4823]: I1216 09:04:40.164548 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a863b977-cf8b-4e6a-833e-2da9cf17dc24-combined-ca-bundle\") pod \"a863b977-cf8b-4e6a-833e-2da9cf17dc24\" (UID: \"a863b977-cf8b-4e6a-833e-2da9cf17dc24\") " Dec 16 09:04:40 crc kubenswrapper[4823]: I1216 09:04:40.164583 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a863b977-cf8b-4e6a-833e-2da9cf17dc24-config-data\") pod \"a863b977-cf8b-4e6a-833e-2da9cf17dc24\" (UID: \"a863b977-cf8b-4e6a-833e-2da9cf17dc24\") " Dec 16 09:04:40 crc kubenswrapper[4823]: I1216 09:04:40.164749 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a863b977-cf8b-4e6a-833e-2da9cf17dc24-scripts\") pod \"a863b977-cf8b-4e6a-833e-2da9cf17dc24\" (UID: \"a863b977-cf8b-4e6a-833e-2da9cf17dc24\") " Dec 16 09:04:40 crc kubenswrapper[4823]: I1216 09:04:40.164846 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-62zdp\" (UniqueName: \"kubernetes.io/projected/a863b977-cf8b-4e6a-833e-2da9cf17dc24-kube-api-access-62zdp\") pod \"a863b977-cf8b-4e6a-833e-2da9cf17dc24\" (UID: \"a863b977-cf8b-4e6a-833e-2da9cf17dc24\") " Dec 16 09:04:40 crc kubenswrapper[4823]: I1216 09:04:40.165558 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a863b977-cf8b-4e6a-833e-2da9cf17dc24-logs" (OuterVolumeSpecName: "logs") pod "a863b977-cf8b-4e6a-833e-2da9cf17dc24" (UID: "a863b977-cf8b-4e6a-833e-2da9cf17dc24"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 09:04:40 crc kubenswrapper[4823]: I1216 09:04:40.171009 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a863b977-cf8b-4e6a-833e-2da9cf17dc24-kube-api-access-62zdp" (OuterVolumeSpecName: "kube-api-access-62zdp") pod "a863b977-cf8b-4e6a-833e-2da9cf17dc24" (UID: "a863b977-cf8b-4e6a-833e-2da9cf17dc24"). InnerVolumeSpecName "kube-api-access-62zdp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:04:40 crc kubenswrapper[4823]: I1216 09:04:40.171073 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a863b977-cf8b-4e6a-833e-2da9cf17dc24-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "a863b977-cf8b-4e6a-833e-2da9cf17dc24" (UID: "a863b977-cf8b-4e6a-833e-2da9cf17dc24"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:04:40 crc kubenswrapper[4823]: I1216 09:04:40.189800 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a863b977-cf8b-4e6a-833e-2da9cf17dc24-scripts" (OuterVolumeSpecName: "scripts") pod "a863b977-cf8b-4e6a-833e-2da9cf17dc24" (UID: "a863b977-cf8b-4e6a-833e-2da9cf17dc24"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 09:04:40 crc kubenswrapper[4823]: I1216 09:04:40.193557 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a863b977-cf8b-4e6a-833e-2da9cf17dc24-config-data" (OuterVolumeSpecName: "config-data") pod "a863b977-cf8b-4e6a-833e-2da9cf17dc24" (UID: "a863b977-cf8b-4e6a-833e-2da9cf17dc24"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 09:04:40 crc kubenswrapper[4823]: I1216 09:04:40.199913 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a863b977-cf8b-4e6a-833e-2da9cf17dc24-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a863b977-cf8b-4e6a-833e-2da9cf17dc24" (UID: "a863b977-cf8b-4e6a-833e-2da9cf17dc24"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:04:40 crc kubenswrapper[4823]: I1216 09:04:40.210106 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a863b977-cf8b-4e6a-833e-2da9cf17dc24-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "a863b977-cf8b-4e6a-833e-2da9cf17dc24" (UID: "a863b977-cf8b-4e6a-833e-2da9cf17dc24"). InnerVolumeSpecName "horizon-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:04:40 crc kubenswrapper[4823]: I1216 09:04:40.268061 4823 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a863b977-cf8b-4e6a-833e-2da9cf17dc24-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 09:04:40 crc kubenswrapper[4823]: I1216 09:04:40.268115 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-62zdp\" (UniqueName: \"kubernetes.io/projected/a863b977-cf8b-4e6a-833e-2da9cf17dc24-kube-api-access-62zdp\") on node \"crc\" DevicePath \"\"" Dec 16 09:04:40 crc kubenswrapper[4823]: I1216 09:04:40.268135 4823 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/a863b977-cf8b-4e6a-833e-2da9cf17dc24-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 09:04:40 crc kubenswrapper[4823]: I1216 09:04:40.268145 4823 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a863b977-cf8b-4e6a-833e-2da9cf17dc24-logs\") on node \"crc\" DevicePath \"\"" Dec 16 09:04:40 crc kubenswrapper[4823]: I1216 09:04:40.268153 4823 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a863b977-cf8b-4e6a-833e-2da9cf17dc24-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 16 09:04:40 crc kubenswrapper[4823]: I1216 09:04:40.268161 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a863b977-cf8b-4e6a-833e-2da9cf17dc24-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 09:04:40 crc kubenswrapper[4823]: I1216 09:04:40.268169 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a863b977-cf8b-4e6a-833e-2da9cf17dc24-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 09:04:40 crc kubenswrapper[4823]: I1216 09:04:40.884439 4823 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-597945959b-4wxf8" event={"ID":"a863b977-cf8b-4e6a-833e-2da9cf17dc24","Type":"ContainerDied","Data":"1bd0efb887db41b9d444adfa3a4a3c13afb4fad1ae3df9b3b3d23f5051874d9d"} Dec 16 09:04:40 crc kubenswrapper[4823]: I1216 09:04:40.884536 4823 scope.go:117] "RemoveContainer" containerID="3fb367ad0a1ebccafffe7fff5169857da05007085dea8c2c6f82effda11cacfc" Dec 16 09:04:40 crc kubenswrapper[4823]: I1216 09:04:40.884719 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-597945959b-4wxf8" Dec 16 09:04:40 crc kubenswrapper[4823]: I1216 09:04:40.933893 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-597945959b-4wxf8"] Dec 16 09:04:40 crc kubenswrapper[4823]: I1216 09:04:40.942132 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-597945959b-4wxf8"] Dec 16 09:04:41 crc kubenswrapper[4823]: I1216 09:04:41.084731 4823 scope.go:117] "RemoveContainer" containerID="f2812b29890ac395c2f5f98caf282d473111e60b65cb29c67bd2f7d1b3674107" Dec 16 09:04:41 crc kubenswrapper[4823]: I1216 09:04:41.782660 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a863b977-cf8b-4e6a-833e-2da9cf17dc24" path="/var/lib/kubelet/pods/a863b977-cf8b-4e6a-833e-2da9cf17dc24/volumes" Dec 16 09:04:42 crc kubenswrapper[4823]: I1216 09:04:42.772557 4823 scope.go:117] "RemoveContainer" containerID="14e51af7fb5c2d7b7fdc9e1989841225a65614d883db6f8d5aea8aeb819bd04d" Dec 16 09:04:42 crc kubenswrapper[4823]: E1216 09:04:42.773260 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" 
podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 09:04:50 crc kubenswrapper[4823]: I1216 09:04:50.354177 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5948ddcb4-f5qgv"] Dec 16 09:04:50 crc kubenswrapper[4823]: E1216 09:04:50.354853 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a863b977-cf8b-4e6a-833e-2da9cf17dc24" containerName="horizon" Dec 16 09:04:50 crc kubenswrapper[4823]: I1216 09:04:50.354871 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="a863b977-cf8b-4e6a-833e-2da9cf17dc24" containerName="horizon" Dec 16 09:04:50 crc kubenswrapper[4823]: E1216 09:04:50.354898 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a863b977-cf8b-4e6a-833e-2da9cf17dc24" containerName="horizon-log" Dec 16 09:04:50 crc kubenswrapper[4823]: I1216 09:04:50.354904 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="a863b977-cf8b-4e6a-833e-2da9cf17dc24" containerName="horizon-log" Dec 16 09:04:50 crc kubenswrapper[4823]: E1216 09:04:50.354915 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70f395b1-ad4e-40f8-91ac-025773e74846" containerName="horizon-log" Dec 16 09:04:50 crc kubenswrapper[4823]: I1216 09:04:50.354921 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="70f395b1-ad4e-40f8-91ac-025773e74846" containerName="horizon-log" Dec 16 09:04:50 crc kubenswrapper[4823]: E1216 09:04:50.354938 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e94041cd-e923-45c7-b6d3-b44b9266507b" containerName="horizon-log" Dec 16 09:04:50 crc kubenswrapper[4823]: I1216 09:04:50.354943 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="e94041cd-e923-45c7-b6d3-b44b9266507b" containerName="horizon-log" Dec 16 09:04:50 crc kubenswrapper[4823]: E1216 09:04:50.354957 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e94041cd-e923-45c7-b6d3-b44b9266507b" containerName="horizon" Dec 16 09:04:50 crc kubenswrapper[4823]: I1216 
09:04:50.354962 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="e94041cd-e923-45c7-b6d3-b44b9266507b" containerName="horizon" Dec 16 09:04:50 crc kubenswrapper[4823]: E1216 09:04:50.354968 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70f395b1-ad4e-40f8-91ac-025773e74846" containerName="horizon" Dec 16 09:04:50 crc kubenswrapper[4823]: I1216 09:04:50.354974 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="70f395b1-ad4e-40f8-91ac-025773e74846" containerName="horizon" Dec 16 09:04:50 crc kubenswrapper[4823]: I1216 09:04:50.355146 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="e94041cd-e923-45c7-b6d3-b44b9266507b" containerName="horizon" Dec 16 09:04:50 crc kubenswrapper[4823]: I1216 09:04:50.355160 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="70f395b1-ad4e-40f8-91ac-025773e74846" containerName="horizon-log" Dec 16 09:04:50 crc kubenswrapper[4823]: I1216 09:04:50.355170 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="70f395b1-ad4e-40f8-91ac-025773e74846" containerName="horizon" Dec 16 09:04:50 crc kubenswrapper[4823]: I1216 09:04:50.355182 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="e94041cd-e923-45c7-b6d3-b44b9266507b" containerName="horizon-log" Dec 16 09:04:50 crc kubenswrapper[4823]: I1216 09:04:50.355189 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="a863b977-cf8b-4e6a-833e-2da9cf17dc24" containerName="horizon" Dec 16 09:04:50 crc kubenswrapper[4823]: I1216 09:04:50.355203 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="a863b977-cf8b-4e6a-833e-2da9cf17dc24" containerName="horizon-log" Dec 16 09:04:50 crc kubenswrapper[4823]: I1216 09:04:50.356138 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5948ddcb4-f5qgv" Dec 16 09:04:50 crc kubenswrapper[4823]: I1216 09:04:50.367150 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5948ddcb4-f5qgv"] Dec 16 09:04:50 crc kubenswrapper[4823]: I1216 09:04:50.466436 4823 scope.go:117] "RemoveContainer" containerID="eb1881f55087cc05d1c5fc74b1ae9ab6d50cdac028c0ed81d2e80a4cb91eed7a" Dec 16 09:04:50 crc kubenswrapper[4823]: I1216 09:04:50.497192 4823 scope.go:117] "RemoveContainer" containerID="2f33ecc1c3af33c27544999c9f3531ef568874dfdbae7c32fb60eec269e16f5a" Dec 16 09:04:50 crc kubenswrapper[4823]: I1216 09:04:50.499412 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d650b48-8848-4495-9b48-fdf7472cc19e-horizon-tls-certs\") pod \"horizon-5948ddcb4-f5qgv\" (UID: \"6d650b48-8848-4495-9b48-fdf7472cc19e\") " pod="openstack/horizon-5948ddcb4-f5qgv" Dec 16 09:04:50 crc kubenswrapper[4823]: I1216 09:04:50.499528 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6d650b48-8848-4495-9b48-fdf7472cc19e-logs\") pod \"horizon-5948ddcb4-f5qgv\" (UID: \"6d650b48-8848-4495-9b48-fdf7472cc19e\") " pod="openstack/horizon-5948ddcb4-f5qgv" Dec 16 09:04:50 crc kubenswrapper[4823]: I1216 09:04:50.499593 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6d650b48-8848-4495-9b48-fdf7472cc19e-scripts\") pod \"horizon-5948ddcb4-f5qgv\" (UID: \"6d650b48-8848-4495-9b48-fdf7472cc19e\") " pod="openstack/horizon-5948ddcb4-f5qgv" Dec 16 09:04:50 crc kubenswrapper[4823]: I1216 09:04:50.499670 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6d650b48-8848-4495-9b48-fdf7472cc19e-combined-ca-bundle\") pod \"horizon-5948ddcb4-f5qgv\" (UID: \"6d650b48-8848-4495-9b48-fdf7472cc19e\") " pod="openstack/horizon-5948ddcb4-f5qgv" Dec 16 09:04:50 crc kubenswrapper[4823]: I1216 09:04:50.499708 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zlrc2\" (UniqueName: \"kubernetes.io/projected/6d650b48-8848-4495-9b48-fdf7472cc19e-kube-api-access-zlrc2\") pod \"horizon-5948ddcb4-f5qgv\" (UID: \"6d650b48-8848-4495-9b48-fdf7472cc19e\") " pod="openstack/horizon-5948ddcb4-f5qgv" Dec 16 09:04:50 crc kubenswrapper[4823]: I1216 09:04:50.499761 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6d650b48-8848-4495-9b48-fdf7472cc19e-horizon-secret-key\") pod \"horizon-5948ddcb4-f5qgv\" (UID: \"6d650b48-8848-4495-9b48-fdf7472cc19e\") " pod="openstack/horizon-5948ddcb4-f5qgv" Dec 16 09:04:50 crc kubenswrapper[4823]: I1216 09:04:50.499817 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6d650b48-8848-4495-9b48-fdf7472cc19e-config-data\") pod \"horizon-5948ddcb4-f5qgv\" (UID: \"6d650b48-8848-4495-9b48-fdf7472cc19e\") " pod="openstack/horizon-5948ddcb4-f5qgv" Dec 16 09:04:50 crc kubenswrapper[4823]: I1216 09:04:50.535345 4823 scope.go:117] "RemoveContainer" containerID="03172229973ebfa1778a194ddde03e0e3348d5dba5581e8e85812005e9dbde8a" Dec 16 09:04:50 crc kubenswrapper[4823]: I1216 09:04:50.602263 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d650b48-8848-4495-9b48-fdf7472cc19e-horizon-tls-certs\") pod \"horizon-5948ddcb4-f5qgv\" (UID: \"6d650b48-8848-4495-9b48-fdf7472cc19e\") " pod="openstack/horizon-5948ddcb4-f5qgv" Dec 16 09:04:50 crc 
kubenswrapper[4823]: I1216 09:04:50.602367 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6d650b48-8848-4495-9b48-fdf7472cc19e-logs\") pod \"horizon-5948ddcb4-f5qgv\" (UID: \"6d650b48-8848-4495-9b48-fdf7472cc19e\") " pod="openstack/horizon-5948ddcb4-f5qgv" Dec 16 09:04:50 crc kubenswrapper[4823]: I1216 09:04:50.602422 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6d650b48-8848-4495-9b48-fdf7472cc19e-scripts\") pod \"horizon-5948ddcb4-f5qgv\" (UID: \"6d650b48-8848-4495-9b48-fdf7472cc19e\") " pod="openstack/horizon-5948ddcb4-f5qgv" Dec 16 09:04:50 crc kubenswrapper[4823]: I1216 09:04:50.602472 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d650b48-8848-4495-9b48-fdf7472cc19e-combined-ca-bundle\") pod \"horizon-5948ddcb4-f5qgv\" (UID: \"6d650b48-8848-4495-9b48-fdf7472cc19e\") " pod="openstack/horizon-5948ddcb4-f5qgv" Dec 16 09:04:50 crc kubenswrapper[4823]: I1216 09:04:50.602498 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zlrc2\" (UniqueName: \"kubernetes.io/projected/6d650b48-8848-4495-9b48-fdf7472cc19e-kube-api-access-zlrc2\") pod \"horizon-5948ddcb4-f5qgv\" (UID: \"6d650b48-8848-4495-9b48-fdf7472cc19e\") " pod="openstack/horizon-5948ddcb4-f5qgv" Dec 16 09:04:50 crc kubenswrapper[4823]: I1216 09:04:50.602541 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6d650b48-8848-4495-9b48-fdf7472cc19e-horizon-secret-key\") pod \"horizon-5948ddcb4-f5qgv\" (UID: \"6d650b48-8848-4495-9b48-fdf7472cc19e\") " pod="openstack/horizon-5948ddcb4-f5qgv" Dec 16 09:04:50 crc kubenswrapper[4823]: I1216 09:04:50.602575 4823 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6d650b48-8848-4495-9b48-fdf7472cc19e-config-data\") pod \"horizon-5948ddcb4-f5qgv\" (UID: \"6d650b48-8848-4495-9b48-fdf7472cc19e\") " pod="openstack/horizon-5948ddcb4-f5qgv" Dec 16 09:04:50 crc kubenswrapper[4823]: I1216 09:04:50.603521 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6d650b48-8848-4495-9b48-fdf7472cc19e-logs\") pod \"horizon-5948ddcb4-f5qgv\" (UID: \"6d650b48-8848-4495-9b48-fdf7472cc19e\") " pod="openstack/horizon-5948ddcb4-f5qgv" Dec 16 09:04:50 crc kubenswrapper[4823]: I1216 09:04:50.605024 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6d650b48-8848-4495-9b48-fdf7472cc19e-scripts\") pod \"horizon-5948ddcb4-f5qgv\" (UID: \"6d650b48-8848-4495-9b48-fdf7472cc19e\") " pod="openstack/horizon-5948ddcb4-f5qgv" Dec 16 09:04:50 crc kubenswrapper[4823]: I1216 09:04:50.605484 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6d650b48-8848-4495-9b48-fdf7472cc19e-config-data\") pod \"horizon-5948ddcb4-f5qgv\" (UID: \"6d650b48-8848-4495-9b48-fdf7472cc19e\") " pod="openstack/horizon-5948ddcb4-f5qgv" Dec 16 09:04:50 crc kubenswrapper[4823]: I1216 09:04:50.613754 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d650b48-8848-4495-9b48-fdf7472cc19e-horizon-tls-certs\") pod \"horizon-5948ddcb4-f5qgv\" (UID: \"6d650b48-8848-4495-9b48-fdf7472cc19e\") " pod="openstack/horizon-5948ddcb4-f5qgv" Dec 16 09:04:50 crc kubenswrapper[4823]: I1216 09:04:50.615598 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6d650b48-8848-4495-9b48-fdf7472cc19e-horizon-secret-key\") pod \"horizon-5948ddcb4-f5qgv\" (UID: 
\"6d650b48-8848-4495-9b48-fdf7472cc19e\") " pod="openstack/horizon-5948ddcb4-f5qgv" Dec 16 09:04:50 crc kubenswrapper[4823]: I1216 09:04:50.617253 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d650b48-8848-4495-9b48-fdf7472cc19e-combined-ca-bundle\") pod \"horizon-5948ddcb4-f5qgv\" (UID: \"6d650b48-8848-4495-9b48-fdf7472cc19e\") " pod="openstack/horizon-5948ddcb4-f5qgv" Dec 16 09:04:50 crc kubenswrapper[4823]: I1216 09:04:50.623199 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zlrc2\" (UniqueName: \"kubernetes.io/projected/6d650b48-8848-4495-9b48-fdf7472cc19e-kube-api-access-zlrc2\") pod \"horizon-5948ddcb4-f5qgv\" (UID: \"6d650b48-8848-4495-9b48-fdf7472cc19e\") " pod="openstack/horizon-5948ddcb4-f5qgv" Dec 16 09:04:50 crc kubenswrapper[4823]: I1216 09:04:50.683721 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5948ddcb4-f5qgv" Dec 16 09:04:51 crc kubenswrapper[4823]: I1216 09:04:51.037501 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-c394-account-create-update-2chx5"] Dec 16 09:04:51 crc kubenswrapper[4823]: I1216 09:04:51.045893 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-6pxbt"] Dec 16 09:04:51 crc kubenswrapper[4823]: I1216 09:04:51.054434 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-c394-account-create-update-2chx5"] Dec 16 09:04:51 crc kubenswrapper[4823]: I1216 09:04:51.062049 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-6pxbt"] Dec 16 09:04:51 crc kubenswrapper[4823]: I1216 09:04:51.161678 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5948ddcb4-f5qgv"] Dec 16 09:04:51 crc kubenswrapper[4823]: W1216 09:04:51.166179 4823 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6d650b48_8848_4495_9b48_fdf7472cc19e.slice/crio-036926b2eef0c1d25968fb54eb6d3a5e3bcd7cd8a742b903f4e70af78fcfec2f WatchSource:0}: Error finding container 036926b2eef0c1d25968fb54eb6d3a5e3bcd7cd8a742b903f4e70af78fcfec2f: Status 404 returned error can't find the container with id 036926b2eef0c1d25968fb54eb6d3a5e3bcd7cd8a742b903f4e70af78fcfec2f Dec 16 09:04:51 crc kubenswrapper[4823]: I1216 09:04:51.586881 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-create-q8qt2"] Dec 16 09:04:51 crc kubenswrapper[4823]: I1216 09:04:51.588817 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-q8qt2" Dec 16 09:04:51 crc kubenswrapper[4823]: I1216 09:04:51.603746 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-q8qt2"] Dec 16 09:04:51 crc kubenswrapper[4823]: I1216 09:04:51.682824 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-f1cb-account-create-update-ppr8d"] Dec 16 09:04:51 crc kubenswrapper[4823]: I1216 09:04:51.684343 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-f1cb-account-create-update-ppr8d" Dec 16 09:04:51 crc kubenswrapper[4823]: I1216 09:04:51.687986 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-db-secret" Dec 16 09:04:51 crc kubenswrapper[4823]: I1216 09:04:51.693958 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-f1cb-account-create-update-ppr8d"] Dec 16 09:04:51 crc kubenswrapper[4823]: I1216 09:04:51.729742 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddfm2\" (UniqueName: \"kubernetes.io/projected/04351c7d-aa0c-480c-8aba-86825423a27f-kube-api-access-ddfm2\") pod \"heat-db-create-q8qt2\" (UID: \"04351c7d-aa0c-480c-8aba-86825423a27f\") " pod="openstack/heat-db-create-q8qt2" Dec 16 09:04:51 crc kubenswrapper[4823]: I1216 09:04:51.729834 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/04351c7d-aa0c-480c-8aba-86825423a27f-operator-scripts\") pod \"heat-db-create-q8qt2\" (UID: \"04351c7d-aa0c-480c-8aba-86825423a27f\") " pod="openstack/heat-db-create-q8qt2" Dec 16 09:04:51 crc kubenswrapper[4823]: I1216 09:04:51.783998 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a0cc5c4-4848-45e7-80b2-1f8bc385064b" path="/var/lib/kubelet/pods/8a0cc5c4-4848-45e7-80b2-1f8bc385064b/volumes" Dec 16 09:04:51 crc kubenswrapper[4823]: I1216 09:04:51.784585 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2ee938f-c401-44b5-aaa8-40654d2217ea" path="/var/lib/kubelet/pods/b2ee938f-c401-44b5-aaa8-40654d2217ea/volumes" Dec 16 09:04:51 crc kubenswrapper[4823]: I1216 09:04:51.831238 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddfm2\" (UniqueName: \"kubernetes.io/projected/04351c7d-aa0c-480c-8aba-86825423a27f-kube-api-access-ddfm2\") pod 
\"heat-db-create-q8qt2\" (UID: \"04351c7d-aa0c-480c-8aba-86825423a27f\") " pod="openstack/heat-db-create-q8qt2" Dec 16 09:04:51 crc kubenswrapper[4823]: I1216 09:04:51.831568 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wbj5\" (UniqueName: \"kubernetes.io/projected/76c19921-64a0-4b2b-ad81-bac464f2f54a-kube-api-access-2wbj5\") pod \"heat-f1cb-account-create-update-ppr8d\" (UID: \"76c19921-64a0-4b2b-ad81-bac464f2f54a\") " pod="openstack/heat-f1cb-account-create-update-ppr8d" Dec 16 09:04:51 crc kubenswrapper[4823]: I1216 09:04:51.831677 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/04351c7d-aa0c-480c-8aba-86825423a27f-operator-scripts\") pod \"heat-db-create-q8qt2\" (UID: \"04351c7d-aa0c-480c-8aba-86825423a27f\") " pod="openstack/heat-db-create-q8qt2" Dec 16 09:04:51 crc kubenswrapper[4823]: I1216 09:04:51.831776 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/76c19921-64a0-4b2b-ad81-bac464f2f54a-operator-scripts\") pod \"heat-f1cb-account-create-update-ppr8d\" (UID: \"76c19921-64a0-4b2b-ad81-bac464f2f54a\") " pod="openstack/heat-f1cb-account-create-update-ppr8d" Dec 16 09:04:51 crc kubenswrapper[4823]: I1216 09:04:51.832427 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/04351c7d-aa0c-480c-8aba-86825423a27f-operator-scripts\") pod \"heat-db-create-q8qt2\" (UID: \"04351c7d-aa0c-480c-8aba-86825423a27f\") " pod="openstack/heat-db-create-q8qt2" Dec 16 09:04:51 crc kubenswrapper[4823]: I1216 09:04:51.866740 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddfm2\" (UniqueName: \"kubernetes.io/projected/04351c7d-aa0c-480c-8aba-86825423a27f-kube-api-access-ddfm2\") pod 
\"heat-db-create-q8qt2\" (UID: \"04351c7d-aa0c-480c-8aba-86825423a27f\") " pod="openstack/heat-db-create-q8qt2" Dec 16 09:04:51 crc kubenswrapper[4823]: I1216 09:04:51.911421 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-q8qt2" Dec 16 09:04:51 crc kubenswrapper[4823]: I1216 09:04:51.933346 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wbj5\" (UniqueName: \"kubernetes.io/projected/76c19921-64a0-4b2b-ad81-bac464f2f54a-kube-api-access-2wbj5\") pod \"heat-f1cb-account-create-update-ppr8d\" (UID: \"76c19921-64a0-4b2b-ad81-bac464f2f54a\") " pod="openstack/heat-f1cb-account-create-update-ppr8d" Dec 16 09:04:51 crc kubenswrapper[4823]: I1216 09:04:51.933460 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/76c19921-64a0-4b2b-ad81-bac464f2f54a-operator-scripts\") pod \"heat-f1cb-account-create-update-ppr8d\" (UID: \"76c19921-64a0-4b2b-ad81-bac464f2f54a\") " pod="openstack/heat-f1cb-account-create-update-ppr8d" Dec 16 09:04:51 crc kubenswrapper[4823]: I1216 09:04:51.934414 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/76c19921-64a0-4b2b-ad81-bac464f2f54a-operator-scripts\") pod \"heat-f1cb-account-create-update-ppr8d\" (UID: \"76c19921-64a0-4b2b-ad81-bac464f2f54a\") " pod="openstack/heat-f1cb-account-create-update-ppr8d" Dec 16 09:04:51 crc kubenswrapper[4823]: I1216 09:04:51.951847 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wbj5\" (UniqueName: \"kubernetes.io/projected/76c19921-64a0-4b2b-ad81-bac464f2f54a-kube-api-access-2wbj5\") pod \"heat-f1cb-account-create-update-ppr8d\" (UID: \"76c19921-64a0-4b2b-ad81-bac464f2f54a\") " pod="openstack/heat-f1cb-account-create-update-ppr8d" Dec 16 09:04:52 crc kubenswrapper[4823]: I1216 09:04:52.003508 
4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-f1cb-account-create-update-ppr8d" Dec 16 09:04:52 crc kubenswrapper[4823]: I1216 09:04:52.007872 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5948ddcb4-f5qgv" event={"ID":"6d650b48-8848-4495-9b48-fdf7472cc19e","Type":"ContainerStarted","Data":"609b29305d1b8337e75912b2d68079a181e3de1bec30ac81db18f27ffacc478c"} Dec 16 09:04:52 crc kubenswrapper[4823]: I1216 09:04:52.007926 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5948ddcb4-f5qgv" event={"ID":"6d650b48-8848-4495-9b48-fdf7472cc19e","Type":"ContainerStarted","Data":"946406cfb56bff7c3a03092d515e629d97b3a1863e6c9e9d93aa6db0d70bbf52"} Dec 16 09:04:52 crc kubenswrapper[4823]: I1216 09:04:52.007938 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5948ddcb4-f5qgv" event={"ID":"6d650b48-8848-4495-9b48-fdf7472cc19e","Type":"ContainerStarted","Data":"036926b2eef0c1d25968fb54eb6d3a5e3bcd7cd8a742b903f4e70af78fcfec2f"} Dec 16 09:04:52 crc kubenswrapper[4823]: I1216 09:04:52.032228 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-5948ddcb4-f5qgv" podStartSLOduration=2.03220239 podStartE2EDuration="2.03220239s" podCreationTimestamp="2025-12-16 09:04:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 09:04:52.027792322 +0000 UTC m=+7770.516358445" watchObservedRunningTime="2025-12-16 09:04:52.03220239 +0000 UTC m=+7770.520768513" Dec 16 09:04:52 crc kubenswrapper[4823]: I1216 09:04:52.396228 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-q8qt2"] Dec 16 09:04:52 crc kubenswrapper[4823]: W1216 09:04:52.399147 4823 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod04351c7d_aa0c_480c_8aba_86825423a27f.slice/crio-9ace75f66650d740625915e02435736abe9ad5df20966cd7d5ae64cb27707f0f WatchSource:0}: Error finding container 9ace75f66650d740625915e02435736abe9ad5df20966cd7d5ae64cb27707f0f: Status 404 returned error can't find the container with id 9ace75f66650d740625915e02435736abe9ad5df20966cd7d5ae64cb27707f0f Dec 16 09:04:52 crc kubenswrapper[4823]: I1216 09:04:52.541692 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-f1cb-account-create-update-ppr8d"] Dec 16 09:04:53 crc kubenswrapper[4823]: I1216 09:04:53.018937 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-q8qt2" event={"ID":"04351c7d-aa0c-480c-8aba-86825423a27f","Type":"ContainerStarted","Data":"d221cc2bc27f5d1770e4a4ef7820239cba5fdb3b7ce7ba7a7f1241ea613caf68"} Dec 16 09:04:53 crc kubenswrapper[4823]: I1216 09:04:53.019006 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-q8qt2" event={"ID":"04351c7d-aa0c-480c-8aba-86825423a27f","Type":"ContainerStarted","Data":"9ace75f66650d740625915e02435736abe9ad5df20966cd7d5ae64cb27707f0f"} Dec 16 09:04:53 crc kubenswrapper[4823]: I1216 09:04:53.022337 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-f1cb-account-create-update-ppr8d" event={"ID":"76c19921-64a0-4b2b-ad81-bac464f2f54a","Type":"ContainerStarted","Data":"dfe482af5241a8d7cd82765bd505dc004d8d92cf3d15a4ae3901d385f9eebdab"} Dec 16 09:04:53 crc kubenswrapper[4823]: I1216 09:04:53.041512 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-create-q8qt2" podStartSLOduration=2.041489909 podStartE2EDuration="2.041489909s" podCreationTimestamp="2025-12-16 09:04:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 09:04:53.035311165 +0000 UTC 
m=+7771.523877288" watchObservedRunningTime="2025-12-16 09:04:53.041489909 +0000 UTC m=+7771.530056052" Dec 16 09:04:54 crc kubenswrapper[4823]: I1216 09:04:54.035507 4823 generic.go:334] "Generic (PLEG): container finished" podID="04351c7d-aa0c-480c-8aba-86825423a27f" containerID="d221cc2bc27f5d1770e4a4ef7820239cba5fdb3b7ce7ba7a7f1241ea613caf68" exitCode=0 Dec 16 09:04:54 crc kubenswrapper[4823]: I1216 09:04:54.035845 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-q8qt2" event={"ID":"04351c7d-aa0c-480c-8aba-86825423a27f","Type":"ContainerDied","Data":"d221cc2bc27f5d1770e4a4ef7820239cba5fdb3b7ce7ba7a7f1241ea613caf68"} Dec 16 09:04:54 crc kubenswrapper[4823]: I1216 09:04:54.040368 4823 generic.go:334] "Generic (PLEG): container finished" podID="76c19921-64a0-4b2b-ad81-bac464f2f54a" containerID="ba1b7898d20fd45107f404b20c4708776c2d7d02569bb8372b7f1258b422904d" exitCode=0 Dec 16 09:04:54 crc kubenswrapper[4823]: I1216 09:04:54.040441 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-f1cb-account-create-update-ppr8d" event={"ID":"76c19921-64a0-4b2b-ad81-bac464f2f54a","Type":"ContainerDied","Data":"ba1b7898d20fd45107f404b20c4708776c2d7d02569bb8372b7f1258b422904d"} Dec 16 09:04:55 crc kubenswrapper[4823]: I1216 09:04:55.431788 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-f1cb-account-create-update-ppr8d" Dec 16 09:04:55 crc kubenswrapper[4823]: I1216 09:04:55.438099 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-create-q8qt2" Dec 16 09:04:55 crc kubenswrapper[4823]: I1216 09:04:55.534183 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2wbj5\" (UniqueName: \"kubernetes.io/projected/76c19921-64a0-4b2b-ad81-bac464f2f54a-kube-api-access-2wbj5\") pod \"76c19921-64a0-4b2b-ad81-bac464f2f54a\" (UID: \"76c19921-64a0-4b2b-ad81-bac464f2f54a\") " Dec 16 09:04:55 crc kubenswrapper[4823]: I1216 09:04:55.534275 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/76c19921-64a0-4b2b-ad81-bac464f2f54a-operator-scripts\") pod \"76c19921-64a0-4b2b-ad81-bac464f2f54a\" (UID: \"76c19921-64a0-4b2b-ad81-bac464f2f54a\") " Dec 16 09:04:55 crc kubenswrapper[4823]: I1216 09:04:55.535124 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76c19921-64a0-4b2b-ad81-bac464f2f54a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "76c19921-64a0-4b2b-ad81-bac464f2f54a" (UID: "76c19921-64a0-4b2b-ad81-bac464f2f54a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 09:04:55 crc kubenswrapper[4823]: I1216 09:04:55.540273 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76c19921-64a0-4b2b-ad81-bac464f2f54a-kube-api-access-2wbj5" (OuterVolumeSpecName: "kube-api-access-2wbj5") pod "76c19921-64a0-4b2b-ad81-bac464f2f54a" (UID: "76c19921-64a0-4b2b-ad81-bac464f2f54a"). InnerVolumeSpecName "kube-api-access-2wbj5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:04:55 crc kubenswrapper[4823]: I1216 09:04:55.635706 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ddfm2\" (UniqueName: \"kubernetes.io/projected/04351c7d-aa0c-480c-8aba-86825423a27f-kube-api-access-ddfm2\") pod \"04351c7d-aa0c-480c-8aba-86825423a27f\" (UID: \"04351c7d-aa0c-480c-8aba-86825423a27f\") " Dec 16 09:04:55 crc kubenswrapper[4823]: I1216 09:04:55.635999 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/04351c7d-aa0c-480c-8aba-86825423a27f-operator-scripts\") pod \"04351c7d-aa0c-480c-8aba-86825423a27f\" (UID: \"04351c7d-aa0c-480c-8aba-86825423a27f\") " Dec 16 09:04:55 crc kubenswrapper[4823]: I1216 09:04:55.636394 4823 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/76c19921-64a0-4b2b-ad81-bac464f2f54a-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 09:04:55 crc kubenswrapper[4823]: I1216 09:04:55.636411 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2wbj5\" (UniqueName: \"kubernetes.io/projected/76c19921-64a0-4b2b-ad81-bac464f2f54a-kube-api-access-2wbj5\") on node \"crc\" DevicePath \"\"" Dec 16 09:04:55 crc kubenswrapper[4823]: I1216 09:04:55.636744 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04351c7d-aa0c-480c-8aba-86825423a27f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "04351c7d-aa0c-480c-8aba-86825423a27f" (UID: "04351c7d-aa0c-480c-8aba-86825423a27f"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 09:04:55 crc kubenswrapper[4823]: I1216 09:04:55.641196 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04351c7d-aa0c-480c-8aba-86825423a27f-kube-api-access-ddfm2" (OuterVolumeSpecName: "kube-api-access-ddfm2") pod "04351c7d-aa0c-480c-8aba-86825423a27f" (UID: "04351c7d-aa0c-480c-8aba-86825423a27f"). InnerVolumeSpecName "kube-api-access-ddfm2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:04:55 crc kubenswrapper[4823]: I1216 09:04:55.738435 4823 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/04351c7d-aa0c-480c-8aba-86825423a27f-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 09:04:55 crc kubenswrapper[4823]: I1216 09:04:55.738468 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ddfm2\" (UniqueName: \"kubernetes.io/projected/04351c7d-aa0c-480c-8aba-86825423a27f-kube-api-access-ddfm2\") on node \"crc\" DevicePath \"\"" Dec 16 09:04:55 crc kubenswrapper[4823]: I1216 09:04:55.771862 4823 scope.go:117] "RemoveContainer" containerID="14e51af7fb5c2d7b7fdc9e1989841225a65614d883db6f8d5aea8aeb819bd04d" Dec 16 09:04:55 crc kubenswrapper[4823]: E1216 09:04:55.772313 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 09:04:56 crc kubenswrapper[4823]: I1216 09:04:56.064344 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-f1cb-account-create-update-ppr8d" 
event={"ID":"76c19921-64a0-4b2b-ad81-bac464f2f54a","Type":"ContainerDied","Data":"dfe482af5241a8d7cd82765bd505dc004d8d92cf3d15a4ae3901d385f9eebdab"} Dec 16 09:04:56 crc kubenswrapper[4823]: I1216 09:04:56.064599 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dfe482af5241a8d7cd82765bd505dc004d8d92cf3d15a4ae3901d385f9eebdab" Dec 16 09:04:56 crc kubenswrapper[4823]: I1216 09:04:56.064382 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-f1cb-account-create-update-ppr8d" Dec 16 09:04:56 crc kubenswrapper[4823]: I1216 09:04:56.067940 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-q8qt2" event={"ID":"04351c7d-aa0c-480c-8aba-86825423a27f","Type":"ContainerDied","Data":"9ace75f66650d740625915e02435736abe9ad5df20966cd7d5ae64cb27707f0f"} Dec 16 09:04:56 crc kubenswrapper[4823]: I1216 09:04:56.067988 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9ace75f66650d740625915e02435736abe9ad5df20966cd7d5ae64cb27707f0f" Dec 16 09:04:56 crc kubenswrapper[4823]: I1216 09:04:56.068104 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-create-q8qt2" Dec 16 09:04:56 crc kubenswrapper[4823]: I1216 09:04:56.804392 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-sync-f9bwg"] Dec 16 09:04:56 crc kubenswrapper[4823]: E1216 09:04:56.805673 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76c19921-64a0-4b2b-ad81-bac464f2f54a" containerName="mariadb-account-create-update" Dec 16 09:04:56 crc kubenswrapper[4823]: I1216 09:04:56.805701 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="76c19921-64a0-4b2b-ad81-bac464f2f54a" containerName="mariadb-account-create-update" Dec 16 09:04:56 crc kubenswrapper[4823]: E1216 09:04:56.805746 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04351c7d-aa0c-480c-8aba-86825423a27f" containerName="mariadb-database-create" Dec 16 09:04:56 crc kubenswrapper[4823]: I1216 09:04:56.805756 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="04351c7d-aa0c-480c-8aba-86825423a27f" containerName="mariadb-database-create" Dec 16 09:04:56 crc kubenswrapper[4823]: I1216 09:04:56.806056 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="76c19921-64a0-4b2b-ad81-bac464f2f54a" containerName="mariadb-account-create-update" Dec 16 09:04:56 crc kubenswrapper[4823]: I1216 09:04:56.806086 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="04351c7d-aa0c-480c-8aba-86825423a27f" containerName="mariadb-database-create" Dec 16 09:04:56 crc kubenswrapper[4823]: I1216 09:04:56.807351 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-f9bwg" Dec 16 09:04:56 crc kubenswrapper[4823]: I1216 09:04:56.815867 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-f9bwg"] Dec 16 09:04:56 crc kubenswrapper[4823]: I1216 09:04:56.818279 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-s74vf" Dec 16 09:04:56 crc kubenswrapper[4823]: I1216 09:04:56.818460 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Dec 16 09:04:56 crc kubenswrapper[4823]: I1216 09:04:56.958867 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02583141-ec44-4216-b06a-43b990053509-config-data\") pod \"heat-db-sync-f9bwg\" (UID: \"02583141-ec44-4216-b06a-43b990053509\") " pod="openstack/heat-db-sync-f9bwg" Dec 16 09:04:56 crc kubenswrapper[4823]: I1216 09:04:56.958934 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02583141-ec44-4216-b06a-43b990053509-combined-ca-bundle\") pod \"heat-db-sync-f9bwg\" (UID: \"02583141-ec44-4216-b06a-43b990053509\") " pod="openstack/heat-db-sync-f9bwg" Dec 16 09:04:56 crc kubenswrapper[4823]: I1216 09:04:56.958959 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4w2s\" (UniqueName: \"kubernetes.io/projected/02583141-ec44-4216-b06a-43b990053509-kube-api-access-n4w2s\") pod \"heat-db-sync-f9bwg\" (UID: \"02583141-ec44-4216-b06a-43b990053509\") " pod="openstack/heat-db-sync-f9bwg" Dec 16 09:04:57 crc kubenswrapper[4823]: I1216 09:04:57.061202 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02583141-ec44-4216-b06a-43b990053509-config-data\") pod \"heat-db-sync-f9bwg\" (UID: 
\"02583141-ec44-4216-b06a-43b990053509\") " pod="openstack/heat-db-sync-f9bwg" Dec 16 09:04:57 crc kubenswrapper[4823]: I1216 09:04:57.061547 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02583141-ec44-4216-b06a-43b990053509-combined-ca-bundle\") pod \"heat-db-sync-f9bwg\" (UID: \"02583141-ec44-4216-b06a-43b990053509\") " pod="openstack/heat-db-sync-f9bwg" Dec 16 09:04:57 crc kubenswrapper[4823]: I1216 09:04:57.061662 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4w2s\" (UniqueName: \"kubernetes.io/projected/02583141-ec44-4216-b06a-43b990053509-kube-api-access-n4w2s\") pod \"heat-db-sync-f9bwg\" (UID: \"02583141-ec44-4216-b06a-43b990053509\") " pod="openstack/heat-db-sync-f9bwg" Dec 16 09:04:57 crc kubenswrapper[4823]: I1216 09:04:57.065749 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02583141-ec44-4216-b06a-43b990053509-combined-ca-bundle\") pod \"heat-db-sync-f9bwg\" (UID: \"02583141-ec44-4216-b06a-43b990053509\") " pod="openstack/heat-db-sync-f9bwg" Dec 16 09:04:57 crc kubenswrapper[4823]: I1216 09:04:57.074356 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02583141-ec44-4216-b06a-43b990053509-config-data\") pod \"heat-db-sync-f9bwg\" (UID: \"02583141-ec44-4216-b06a-43b990053509\") " pod="openstack/heat-db-sync-f9bwg" Dec 16 09:04:57 crc kubenswrapper[4823]: I1216 09:04:57.078874 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4w2s\" (UniqueName: \"kubernetes.io/projected/02583141-ec44-4216-b06a-43b990053509-kube-api-access-n4w2s\") pod \"heat-db-sync-f9bwg\" (UID: \"02583141-ec44-4216-b06a-43b990053509\") " pod="openstack/heat-db-sync-f9bwg" Dec 16 09:04:57 crc kubenswrapper[4823]: I1216 09:04:57.139348 4823 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-f9bwg" Dec 16 09:04:57 crc kubenswrapper[4823]: I1216 09:04:57.601216 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-f9bwg"] Dec 16 09:04:57 crc kubenswrapper[4823]: W1216 09:04:57.603931 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod02583141_ec44_4216_b06a_43b990053509.slice/crio-424159d99101ba67a3a9a57b8701bf53863cc3ef31688093ff04919213adaeaf WatchSource:0}: Error finding container 424159d99101ba67a3a9a57b8701bf53863cc3ef31688093ff04919213adaeaf: Status 404 returned error can't find the container with id 424159d99101ba67a3a9a57b8701bf53863cc3ef31688093ff04919213adaeaf Dec 16 09:04:58 crc kubenswrapper[4823]: I1216 09:04:58.090689 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-f9bwg" event={"ID":"02583141-ec44-4216-b06a-43b990053509","Type":"ContainerStarted","Data":"424159d99101ba67a3a9a57b8701bf53863cc3ef31688093ff04919213adaeaf"} Dec 16 09:05:00 crc kubenswrapper[4823]: I1216 09:05:00.048557 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-8wd5f"] Dec 16 09:05:00 crc kubenswrapper[4823]: I1216 09:05:00.070364 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-8wd5f"] Dec 16 09:05:00 crc kubenswrapper[4823]: I1216 09:05:00.684164 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-5948ddcb4-f5qgv" Dec 16 09:05:00 crc kubenswrapper[4823]: I1216 09:05:00.685336 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5948ddcb4-f5qgv" Dec 16 09:05:01 crc kubenswrapper[4823]: I1216 09:05:01.793209 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="891280f8-2910-413c-aab0-818e6ee7cc7c" 
path="/var/lib/kubelet/pods/891280f8-2910-413c-aab0-818e6ee7cc7c/volumes" Dec 16 09:05:06 crc kubenswrapper[4823]: I1216 09:05:06.772556 4823 scope.go:117] "RemoveContainer" containerID="14e51af7fb5c2d7b7fdc9e1989841225a65614d883db6f8d5aea8aeb819bd04d" Dec 16 09:05:06 crc kubenswrapper[4823]: E1216 09:05:06.773227 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 09:05:07 crc kubenswrapper[4823]: I1216 09:05:07.192283 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-f9bwg" event={"ID":"02583141-ec44-4216-b06a-43b990053509","Type":"ContainerStarted","Data":"00c32270da7e5bc5abc76b1cc8b234f94d52f813f0d0567012796ddcc39edf32"} Dec 16 09:05:07 crc kubenswrapper[4823]: I1216 09:05:07.227904 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-sync-f9bwg" podStartSLOduration=2.8367526549999997 podStartE2EDuration="11.227880866s" podCreationTimestamp="2025-12-16 09:04:56 +0000 UTC" firstStartedPulling="2025-12-16 09:04:57.606198352 +0000 UTC m=+7776.094764465" lastFinishedPulling="2025-12-16 09:05:05.997326553 +0000 UTC m=+7784.485892676" observedRunningTime="2025-12-16 09:05:07.222553199 +0000 UTC m=+7785.711119332" watchObservedRunningTime="2025-12-16 09:05:07.227880866 +0000 UTC m=+7785.716446989" Dec 16 09:05:09 crc kubenswrapper[4823]: I1216 09:05:09.218548 4823 generic.go:334] "Generic (PLEG): container finished" podID="02583141-ec44-4216-b06a-43b990053509" containerID="00c32270da7e5bc5abc76b1cc8b234f94d52f813f0d0567012796ddcc39edf32" exitCode=0 Dec 16 09:05:09 crc kubenswrapper[4823]: I1216 09:05:09.218651 4823 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-f9bwg" event={"ID":"02583141-ec44-4216-b06a-43b990053509","Type":"ContainerDied","Data":"00c32270da7e5bc5abc76b1cc8b234f94d52f813f0d0567012796ddcc39edf32"} Dec 16 09:05:10 crc kubenswrapper[4823]: I1216 09:05:10.570591 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-f9bwg" Dec 16 09:05:10 crc kubenswrapper[4823]: I1216 09:05:10.663690 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n4w2s\" (UniqueName: \"kubernetes.io/projected/02583141-ec44-4216-b06a-43b990053509-kube-api-access-n4w2s\") pod \"02583141-ec44-4216-b06a-43b990053509\" (UID: \"02583141-ec44-4216-b06a-43b990053509\") " Dec 16 09:05:10 crc kubenswrapper[4823]: I1216 09:05:10.663750 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02583141-ec44-4216-b06a-43b990053509-config-data\") pod \"02583141-ec44-4216-b06a-43b990053509\" (UID: \"02583141-ec44-4216-b06a-43b990053509\") " Dec 16 09:05:10 crc kubenswrapper[4823]: I1216 09:05:10.663840 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02583141-ec44-4216-b06a-43b990053509-combined-ca-bundle\") pod \"02583141-ec44-4216-b06a-43b990053509\" (UID: \"02583141-ec44-4216-b06a-43b990053509\") " Dec 16 09:05:10 crc kubenswrapper[4823]: I1216 09:05:10.670681 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02583141-ec44-4216-b06a-43b990053509-kube-api-access-n4w2s" (OuterVolumeSpecName: "kube-api-access-n4w2s") pod "02583141-ec44-4216-b06a-43b990053509" (UID: "02583141-ec44-4216-b06a-43b990053509"). InnerVolumeSpecName "kube-api-access-n4w2s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:05:10 crc kubenswrapper[4823]: I1216 09:05:10.686102 4823 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-5948ddcb4-f5qgv" podUID="6d650b48-8848-4495-9b48-fdf7472cc19e" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.111:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.111:8443: connect: connection refused" Dec 16 09:05:10 crc kubenswrapper[4823]: I1216 09:05:10.703053 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02583141-ec44-4216-b06a-43b990053509-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "02583141-ec44-4216-b06a-43b990053509" (UID: "02583141-ec44-4216-b06a-43b990053509"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:05:10 crc kubenswrapper[4823]: I1216 09:05:10.762338 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02583141-ec44-4216-b06a-43b990053509-config-data" (OuterVolumeSpecName: "config-data") pod "02583141-ec44-4216-b06a-43b990053509" (UID: "02583141-ec44-4216-b06a-43b990053509"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:05:10 crc kubenswrapper[4823]: I1216 09:05:10.766579 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n4w2s\" (UniqueName: \"kubernetes.io/projected/02583141-ec44-4216-b06a-43b990053509-kube-api-access-n4w2s\") on node \"crc\" DevicePath \"\"" Dec 16 09:05:10 crc kubenswrapper[4823]: I1216 09:05:10.766850 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02583141-ec44-4216-b06a-43b990053509-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 09:05:10 crc kubenswrapper[4823]: I1216 09:05:10.766927 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02583141-ec44-4216-b06a-43b990053509-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 09:05:11 crc kubenswrapper[4823]: I1216 09:05:11.241442 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-f9bwg" event={"ID":"02583141-ec44-4216-b06a-43b990053509","Type":"ContainerDied","Data":"424159d99101ba67a3a9a57b8701bf53863cc3ef31688093ff04919213adaeaf"} Dec 16 09:05:11 crc kubenswrapper[4823]: I1216 09:05:11.241496 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="424159d99101ba67a3a9a57b8701bf53863cc3ef31688093ff04919213adaeaf" Dec 16 09:05:11 crc kubenswrapper[4823]: I1216 09:05:11.241568 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-f9bwg" Dec 16 09:05:12 crc kubenswrapper[4823]: I1216 09:05:12.643709 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-7b85d8c859-rzzxk"] Dec 16 09:05:12 crc kubenswrapper[4823]: E1216 09:05:12.644243 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02583141-ec44-4216-b06a-43b990053509" containerName="heat-db-sync" Dec 16 09:05:12 crc kubenswrapper[4823]: I1216 09:05:12.644259 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="02583141-ec44-4216-b06a-43b990053509" containerName="heat-db-sync" Dec 16 09:05:12 crc kubenswrapper[4823]: I1216 09:05:12.644530 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="02583141-ec44-4216-b06a-43b990053509" containerName="heat-db-sync" Dec 16 09:05:12 crc kubenswrapper[4823]: I1216 09:05:12.645319 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-7b85d8c859-rzzxk" Dec 16 09:05:12 crc kubenswrapper[4823]: I1216 09:05:12.659096 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-engine-config-data" Dec 16 09:05:12 crc kubenswrapper[4823]: I1216 09:05:12.659081 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-s74vf" Dec 16 09:05:12 crc kubenswrapper[4823]: I1216 09:05:12.666049 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Dec 16 09:05:12 crc kubenswrapper[4823]: I1216 09:05:12.685514 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-7b85d8c859-rzzxk"] Dec 16 09:05:12 crc kubenswrapper[4823]: I1216 09:05:12.714535 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qr87p\" (UniqueName: \"kubernetes.io/projected/7c5e1025-1368-4594-88d1-16ef8ccadccb-kube-api-access-qr87p\") pod \"heat-engine-7b85d8c859-rzzxk\" (UID: 
\"7c5e1025-1368-4594-88d1-16ef8ccadccb\") " pod="openstack/heat-engine-7b85d8c859-rzzxk" Dec 16 09:05:12 crc kubenswrapper[4823]: I1216 09:05:12.714717 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c5e1025-1368-4594-88d1-16ef8ccadccb-config-data\") pod \"heat-engine-7b85d8c859-rzzxk\" (UID: \"7c5e1025-1368-4594-88d1-16ef8ccadccb\") " pod="openstack/heat-engine-7b85d8c859-rzzxk" Dec 16 09:05:12 crc kubenswrapper[4823]: I1216 09:05:12.714870 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7c5e1025-1368-4594-88d1-16ef8ccadccb-config-data-custom\") pod \"heat-engine-7b85d8c859-rzzxk\" (UID: \"7c5e1025-1368-4594-88d1-16ef8ccadccb\") " pod="openstack/heat-engine-7b85d8c859-rzzxk" Dec 16 09:05:12 crc kubenswrapper[4823]: I1216 09:05:12.715011 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c5e1025-1368-4594-88d1-16ef8ccadccb-combined-ca-bundle\") pod \"heat-engine-7b85d8c859-rzzxk\" (UID: \"7c5e1025-1368-4594-88d1-16ef8ccadccb\") " pod="openstack/heat-engine-7b85d8c859-rzzxk" Dec 16 09:05:12 crc kubenswrapper[4823]: I1216 09:05:12.829778 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qr87p\" (UniqueName: \"kubernetes.io/projected/7c5e1025-1368-4594-88d1-16ef8ccadccb-kube-api-access-qr87p\") pod \"heat-engine-7b85d8c859-rzzxk\" (UID: \"7c5e1025-1368-4594-88d1-16ef8ccadccb\") " pod="openstack/heat-engine-7b85d8c859-rzzxk" Dec 16 09:05:12 crc kubenswrapper[4823]: I1216 09:05:12.829925 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c5e1025-1368-4594-88d1-16ef8ccadccb-config-data\") pod 
\"heat-engine-7b85d8c859-rzzxk\" (UID: \"7c5e1025-1368-4594-88d1-16ef8ccadccb\") " pod="openstack/heat-engine-7b85d8c859-rzzxk" Dec 16 09:05:12 crc kubenswrapper[4823]: I1216 09:05:12.830007 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7c5e1025-1368-4594-88d1-16ef8ccadccb-config-data-custom\") pod \"heat-engine-7b85d8c859-rzzxk\" (UID: \"7c5e1025-1368-4594-88d1-16ef8ccadccb\") " pod="openstack/heat-engine-7b85d8c859-rzzxk" Dec 16 09:05:12 crc kubenswrapper[4823]: I1216 09:05:12.830636 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c5e1025-1368-4594-88d1-16ef8ccadccb-combined-ca-bundle\") pod \"heat-engine-7b85d8c859-rzzxk\" (UID: \"7c5e1025-1368-4594-88d1-16ef8ccadccb\") " pod="openstack/heat-engine-7b85d8c859-rzzxk" Dec 16 09:05:12 crc kubenswrapper[4823]: I1216 09:05:12.852811 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qr87p\" (UniqueName: \"kubernetes.io/projected/7c5e1025-1368-4594-88d1-16ef8ccadccb-kube-api-access-qr87p\") pod \"heat-engine-7b85d8c859-rzzxk\" (UID: \"7c5e1025-1368-4594-88d1-16ef8ccadccb\") " pod="openstack/heat-engine-7b85d8c859-rzzxk" Dec 16 09:05:12 crc kubenswrapper[4823]: I1216 09:05:12.856992 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-755687f5b7-sp4tf"] Dec 16 09:05:12 crc kubenswrapper[4823]: I1216 09:05:12.857861 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7c5e1025-1368-4594-88d1-16ef8ccadccb-config-data-custom\") pod \"heat-engine-7b85d8c859-rzzxk\" (UID: \"7c5e1025-1368-4594-88d1-16ef8ccadccb\") " pod="openstack/heat-engine-7b85d8c859-rzzxk" Dec 16 09:05:12 crc kubenswrapper[4823]: I1216 09:05:12.864527 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/7c5e1025-1368-4594-88d1-16ef8ccadccb-config-data\") pod \"heat-engine-7b85d8c859-rzzxk\" (UID: \"7c5e1025-1368-4594-88d1-16ef8ccadccb\") " pod="openstack/heat-engine-7b85d8c859-rzzxk" Dec 16 09:05:12 crc kubenswrapper[4823]: I1216 09:05:12.867476 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-755687f5b7-sp4tf" Dec 16 09:05:12 crc kubenswrapper[4823]: I1216 09:05:12.874697 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c5e1025-1368-4594-88d1-16ef8ccadccb-combined-ca-bundle\") pod \"heat-engine-7b85d8c859-rzzxk\" (UID: \"7c5e1025-1368-4594-88d1-16ef8ccadccb\") " pod="openstack/heat-engine-7b85d8c859-rzzxk" Dec 16 09:05:12 crc kubenswrapper[4823]: I1216 09:05:12.886006 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-755687f5b7-sp4tf"] Dec 16 09:05:12 crc kubenswrapper[4823]: I1216 09:05:12.899388 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-api-config-data" Dec 16 09:05:12 crc kubenswrapper[4823]: I1216 09:05:12.916553 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-cbf489c5c-kdft6"] Dec 16 09:05:12 crc kubenswrapper[4823]: I1216 09:05:12.918601 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-cbf489c5c-kdft6" Dec 16 09:05:12 crc kubenswrapper[4823]: I1216 09:05:12.931700 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-cfnapi-config-data" Dec 16 09:05:12 crc kubenswrapper[4823]: I1216 09:05:12.932791 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b3ac823-aaa1-4209-8e6b-88350ffd7519-config-data\") pod \"heat-api-755687f5b7-sp4tf\" (UID: \"1b3ac823-aaa1-4209-8e6b-88350ffd7519\") " pod="openstack/heat-api-755687f5b7-sp4tf" Dec 16 09:05:12 crc kubenswrapper[4823]: I1216 09:05:12.932847 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1b3ac823-aaa1-4209-8e6b-88350ffd7519-config-data-custom\") pod \"heat-api-755687f5b7-sp4tf\" (UID: \"1b3ac823-aaa1-4209-8e6b-88350ffd7519\") " pod="openstack/heat-api-755687f5b7-sp4tf" Dec 16 09:05:12 crc kubenswrapper[4823]: I1216 09:05:12.932879 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gpgb\" (UniqueName: \"kubernetes.io/projected/1b3ac823-aaa1-4209-8e6b-88350ffd7519-kube-api-access-7gpgb\") pod \"heat-api-755687f5b7-sp4tf\" (UID: \"1b3ac823-aaa1-4209-8e6b-88350ffd7519\") " pod="openstack/heat-api-755687f5b7-sp4tf" Dec 16 09:05:12 crc kubenswrapper[4823]: I1216 09:05:12.932967 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b3ac823-aaa1-4209-8e6b-88350ffd7519-combined-ca-bundle\") pod \"heat-api-755687f5b7-sp4tf\" (UID: \"1b3ac823-aaa1-4209-8e6b-88350ffd7519\") " pod="openstack/heat-api-755687f5b7-sp4tf" Dec 16 09:05:12 crc kubenswrapper[4823]: I1216 09:05:12.933175 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/heat-cfnapi-cbf489c5c-kdft6"] Dec 16 09:05:12 crc kubenswrapper[4823]: I1216 09:05:12.979131 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-7b85d8c859-rzzxk" Dec 16 09:05:13 crc kubenswrapper[4823]: I1216 09:05:13.049322 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/db45071e-c05e-4a4e-8a88-1d6a4a8bd198-config-data-custom\") pod \"heat-cfnapi-cbf489c5c-kdft6\" (UID: \"db45071e-c05e-4a4e-8a88-1d6a4a8bd198\") " pod="openstack/heat-cfnapi-cbf489c5c-kdft6" Dec 16 09:05:13 crc kubenswrapper[4823]: I1216 09:05:13.049625 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b3ac823-aaa1-4209-8e6b-88350ffd7519-combined-ca-bundle\") pod \"heat-api-755687f5b7-sp4tf\" (UID: \"1b3ac823-aaa1-4209-8e6b-88350ffd7519\") " pod="openstack/heat-api-755687f5b7-sp4tf" Dec 16 09:05:13 crc kubenswrapper[4823]: I1216 09:05:13.049666 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g78ls\" (UniqueName: \"kubernetes.io/projected/db45071e-c05e-4a4e-8a88-1d6a4a8bd198-kube-api-access-g78ls\") pod \"heat-cfnapi-cbf489c5c-kdft6\" (UID: \"db45071e-c05e-4a4e-8a88-1d6a4a8bd198\") " pod="openstack/heat-cfnapi-cbf489c5c-kdft6" Dec 16 09:05:13 crc kubenswrapper[4823]: I1216 09:05:13.049784 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db45071e-c05e-4a4e-8a88-1d6a4a8bd198-combined-ca-bundle\") pod \"heat-cfnapi-cbf489c5c-kdft6\" (UID: \"db45071e-c05e-4a4e-8a88-1d6a4a8bd198\") " pod="openstack/heat-cfnapi-cbf489c5c-kdft6" Dec 16 09:05:13 crc kubenswrapper[4823]: I1216 09:05:13.050120 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b3ac823-aaa1-4209-8e6b-88350ffd7519-config-data\") pod \"heat-api-755687f5b7-sp4tf\" (UID: \"1b3ac823-aaa1-4209-8e6b-88350ffd7519\") " pod="openstack/heat-api-755687f5b7-sp4tf" Dec 16 09:05:13 crc kubenswrapper[4823]: I1216 09:05:13.050215 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1b3ac823-aaa1-4209-8e6b-88350ffd7519-config-data-custom\") pod \"heat-api-755687f5b7-sp4tf\" (UID: \"1b3ac823-aaa1-4209-8e6b-88350ffd7519\") " pod="openstack/heat-api-755687f5b7-sp4tf" Dec 16 09:05:13 crc kubenswrapper[4823]: I1216 09:05:13.050241 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db45071e-c05e-4a4e-8a88-1d6a4a8bd198-config-data\") pod \"heat-cfnapi-cbf489c5c-kdft6\" (UID: \"db45071e-c05e-4a4e-8a88-1d6a4a8bd198\") " pod="openstack/heat-cfnapi-cbf489c5c-kdft6" Dec 16 09:05:13 crc kubenswrapper[4823]: I1216 09:05:13.050287 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gpgb\" (UniqueName: \"kubernetes.io/projected/1b3ac823-aaa1-4209-8e6b-88350ffd7519-kube-api-access-7gpgb\") pod \"heat-api-755687f5b7-sp4tf\" (UID: \"1b3ac823-aaa1-4209-8e6b-88350ffd7519\") " pod="openstack/heat-api-755687f5b7-sp4tf" Dec 16 09:05:13 crc kubenswrapper[4823]: I1216 09:05:13.056213 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1b3ac823-aaa1-4209-8e6b-88350ffd7519-config-data-custom\") pod \"heat-api-755687f5b7-sp4tf\" (UID: \"1b3ac823-aaa1-4209-8e6b-88350ffd7519\") " pod="openstack/heat-api-755687f5b7-sp4tf" Dec 16 09:05:13 crc kubenswrapper[4823]: I1216 09:05:13.056679 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1b3ac823-aaa1-4209-8e6b-88350ffd7519-combined-ca-bundle\") pod \"heat-api-755687f5b7-sp4tf\" (UID: \"1b3ac823-aaa1-4209-8e6b-88350ffd7519\") " pod="openstack/heat-api-755687f5b7-sp4tf" Dec 16 09:05:13 crc kubenswrapper[4823]: I1216 09:05:13.068583 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b3ac823-aaa1-4209-8e6b-88350ffd7519-config-data\") pod \"heat-api-755687f5b7-sp4tf\" (UID: \"1b3ac823-aaa1-4209-8e6b-88350ffd7519\") " pod="openstack/heat-api-755687f5b7-sp4tf" Dec 16 09:05:13 crc kubenswrapper[4823]: I1216 09:05:13.081068 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gpgb\" (UniqueName: \"kubernetes.io/projected/1b3ac823-aaa1-4209-8e6b-88350ffd7519-kube-api-access-7gpgb\") pod \"heat-api-755687f5b7-sp4tf\" (UID: \"1b3ac823-aaa1-4209-8e6b-88350ffd7519\") " pod="openstack/heat-api-755687f5b7-sp4tf" Dec 16 09:05:13 crc kubenswrapper[4823]: I1216 09:05:13.155000 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db45071e-c05e-4a4e-8a88-1d6a4a8bd198-combined-ca-bundle\") pod \"heat-cfnapi-cbf489c5c-kdft6\" (UID: \"db45071e-c05e-4a4e-8a88-1d6a4a8bd198\") " pod="openstack/heat-cfnapi-cbf489c5c-kdft6" Dec 16 09:05:13 crc kubenswrapper[4823]: I1216 09:05:13.155161 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db45071e-c05e-4a4e-8a88-1d6a4a8bd198-config-data\") pod \"heat-cfnapi-cbf489c5c-kdft6\" (UID: \"db45071e-c05e-4a4e-8a88-1d6a4a8bd198\") " pod="openstack/heat-cfnapi-cbf489c5c-kdft6" Dec 16 09:05:13 crc kubenswrapper[4823]: I1216 09:05:13.155223 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/db45071e-c05e-4a4e-8a88-1d6a4a8bd198-config-data-custom\") pod 
\"heat-cfnapi-cbf489c5c-kdft6\" (UID: \"db45071e-c05e-4a4e-8a88-1d6a4a8bd198\") " pod="openstack/heat-cfnapi-cbf489c5c-kdft6" Dec 16 09:05:13 crc kubenswrapper[4823]: I1216 09:05:13.155264 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g78ls\" (UniqueName: \"kubernetes.io/projected/db45071e-c05e-4a4e-8a88-1d6a4a8bd198-kube-api-access-g78ls\") pod \"heat-cfnapi-cbf489c5c-kdft6\" (UID: \"db45071e-c05e-4a4e-8a88-1d6a4a8bd198\") " pod="openstack/heat-cfnapi-cbf489c5c-kdft6" Dec 16 09:05:13 crc kubenswrapper[4823]: I1216 09:05:13.160371 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/db45071e-c05e-4a4e-8a88-1d6a4a8bd198-config-data-custom\") pod \"heat-cfnapi-cbf489c5c-kdft6\" (UID: \"db45071e-c05e-4a4e-8a88-1d6a4a8bd198\") " pod="openstack/heat-cfnapi-cbf489c5c-kdft6" Dec 16 09:05:13 crc kubenswrapper[4823]: I1216 09:05:13.160966 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db45071e-c05e-4a4e-8a88-1d6a4a8bd198-combined-ca-bundle\") pod \"heat-cfnapi-cbf489c5c-kdft6\" (UID: \"db45071e-c05e-4a4e-8a88-1d6a4a8bd198\") " pod="openstack/heat-cfnapi-cbf489c5c-kdft6" Dec 16 09:05:13 crc kubenswrapper[4823]: I1216 09:05:13.161673 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db45071e-c05e-4a4e-8a88-1d6a4a8bd198-config-data\") pod \"heat-cfnapi-cbf489c5c-kdft6\" (UID: \"db45071e-c05e-4a4e-8a88-1d6a4a8bd198\") " pod="openstack/heat-cfnapi-cbf489c5c-kdft6" Dec 16 09:05:13 crc kubenswrapper[4823]: I1216 09:05:13.177839 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g78ls\" (UniqueName: \"kubernetes.io/projected/db45071e-c05e-4a4e-8a88-1d6a4a8bd198-kube-api-access-g78ls\") pod \"heat-cfnapi-cbf489c5c-kdft6\" (UID: 
\"db45071e-c05e-4a4e-8a88-1d6a4a8bd198\") " pod="openstack/heat-cfnapi-cbf489c5c-kdft6" Dec 16 09:05:14 crc kubenswrapper[4823]: I1216 09:05:13.265016 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-755687f5b7-sp4tf" Dec 16 09:05:14 crc kubenswrapper[4823]: I1216 09:05:13.274479 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-cbf489c5c-kdft6" Dec 16 09:05:14 crc kubenswrapper[4823]: I1216 09:05:13.506519 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-7b85d8c859-rzzxk"] Dec 16 09:05:14 crc kubenswrapper[4823]: I1216 09:05:14.270557 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-7b85d8c859-rzzxk" event={"ID":"7c5e1025-1368-4594-88d1-16ef8ccadccb","Type":"ContainerStarted","Data":"615a16406b9429b165234ee72e932cf0475f7803064246d04f5e66b621b3395b"} Dec 16 09:05:14 crc kubenswrapper[4823]: I1216 09:05:14.270926 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-7b85d8c859-rzzxk" event={"ID":"7c5e1025-1368-4594-88d1-16ef8ccadccb","Type":"ContainerStarted","Data":"efe37f699fa57f7695daa66dccba36b2bae74a08e7bc6dc19a25665a085055d3"} Dec 16 09:05:14 crc kubenswrapper[4823]: I1216 09:05:14.272155 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-7b85d8c859-rzzxk" Dec 16 09:05:14 crc kubenswrapper[4823]: I1216 09:05:14.292748 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-7b85d8c859-rzzxk" podStartSLOduration=2.292727168 podStartE2EDuration="2.292727168s" podCreationTimestamp="2025-12-16 09:05:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 09:05:14.285949705 +0000 UTC m=+7792.774515828" watchObservedRunningTime="2025-12-16 09:05:14.292727168 +0000 UTC m=+7792.781293291" Dec 16 
09:05:14 crc kubenswrapper[4823]: I1216 09:05:14.456711 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-cbf489c5c-kdft6"] Dec 16 09:05:14 crc kubenswrapper[4823]: I1216 09:05:14.465622 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-755687f5b7-sp4tf"] Dec 16 09:05:15 crc kubenswrapper[4823]: I1216 09:05:15.286376 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-cbf489c5c-kdft6" event={"ID":"db45071e-c05e-4a4e-8a88-1d6a4a8bd198","Type":"ContainerStarted","Data":"8bb46b61265175370a06d12339f4bb9fde10bbef6aed9c6e01dde56b02078808"} Dec 16 09:05:15 crc kubenswrapper[4823]: I1216 09:05:15.287829 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-755687f5b7-sp4tf" event={"ID":"1b3ac823-aaa1-4209-8e6b-88350ffd7519","Type":"ContainerStarted","Data":"4e8cc30328dc729ab0f1d44ca803a0a92fb69cd282e934d0214a715037a3aa1d"} Dec 16 09:05:18 crc kubenswrapper[4823]: I1216 09:05:18.317309 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-cbf489c5c-kdft6" event={"ID":"db45071e-c05e-4a4e-8a88-1d6a4a8bd198","Type":"ContainerStarted","Data":"b50d813e87ca2d06f0dfabdd81e817f993d4fe9ec57071a05a3bbe50e7089491"} Dec 16 09:05:18 crc kubenswrapper[4823]: I1216 09:05:18.317730 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-cbf489c5c-kdft6" Dec 16 09:05:18 crc kubenswrapper[4823]: I1216 09:05:18.319841 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-755687f5b7-sp4tf" event={"ID":"1b3ac823-aaa1-4209-8e6b-88350ffd7519","Type":"ContainerStarted","Data":"23dfe5272aafae3411bcc9e20ef3bec3896246f4a96109e136626c13b00596f0"} Dec 16 09:05:18 crc kubenswrapper[4823]: I1216 09:05:18.319982 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-755687f5b7-sp4tf" Dec 16 09:05:18 crc kubenswrapper[4823]: I1216 09:05:18.338770 4823 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-cbf489c5c-kdft6" podStartSLOduration=3.490753912 podStartE2EDuration="6.338749718s" podCreationTimestamp="2025-12-16 09:05:12 +0000 UTC" firstStartedPulling="2025-12-16 09:05:14.482866018 +0000 UTC m=+7792.971432141" lastFinishedPulling="2025-12-16 09:05:17.330861814 +0000 UTC m=+7795.819427947" observedRunningTime="2025-12-16 09:05:18.336394814 +0000 UTC m=+7796.824960937" watchObservedRunningTime="2025-12-16 09:05:18.338749718 +0000 UTC m=+7796.827315841" Dec 16 09:05:18 crc kubenswrapper[4823]: I1216 09:05:18.364000 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-755687f5b7-sp4tf" podStartSLOduration=3.51396315 podStartE2EDuration="6.363981428s" podCreationTimestamp="2025-12-16 09:05:12 +0000 UTC" firstStartedPulling="2025-12-16 09:05:14.482869519 +0000 UTC m=+7792.971435632" lastFinishedPulling="2025-12-16 09:05:17.332887787 +0000 UTC m=+7795.821453910" observedRunningTime="2025-12-16 09:05:18.363908426 +0000 UTC m=+7796.852474549" watchObservedRunningTime="2025-12-16 09:05:18.363981428 +0000 UTC m=+7796.852547541" Dec 16 09:05:18 crc kubenswrapper[4823]: I1216 09:05:18.771996 4823 scope.go:117] "RemoveContainer" containerID="14e51af7fb5c2d7b7fdc9e1989841225a65614d883db6f8d5aea8aeb819bd04d" Dec 16 09:05:18 crc kubenswrapper[4823]: E1216 09:05:18.772564 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 09:05:20 crc kubenswrapper[4823]: I1216 09:05:20.206103 4823 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/heat-engine-64f85d9856-wwkd5"] Dec 16 09:05:20 crc kubenswrapper[4823]: I1216 09:05:20.207377 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-64f85d9856-wwkd5" Dec 16 09:05:20 crc kubenswrapper[4823]: I1216 09:05:20.233693 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-64f85d9856-wwkd5"] Dec 16 09:05:20 crc kubenswrapper[4823]: I1216 09:05:20.254441 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-7dff5b4649-wzl8g"] Dec 16 09:05:20 crc kubenswrapper[4823]: I1216 09:05:20.255676 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-7dff5b4649-wzl8g" Dec 16 09:05:20 crc kubenswrapper[4823]: I1216 09:05:20.266071 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-69cb598cbc-ht4ws"] Dec 16 09:05:20 crc kubenswrapper[4823]: I1216 09:05:20.267297 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-69cb598cbc-ht4ws" Dec 16 09:05:20 crc kubenswrapper[4823]: I1216 09:05:20.307330 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-7dff5b4649-wzl8g"] Dec 16 09:05:20 crc kubenswrapper[4823]: I1216 09:05:20.322087 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-69cb598cbc-ht4ws"] Dec 16 09:05:20 crc kubenswrapper[4823]: I1216 09:05:20.375666 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zspx\" (UniqueName: \"kubernetes.io/projected/7a613891-fc01-4f69-97a8-63cccc00f4a5-kube-api-access-6zspx\") pod \"heat-engine-64f85d9856-wwkd5\" (UID: \"7a613891-fc01-4f69-97a8-63cccc00f4a5\") " pod="openstack/heat-engine-64f85d9856-wwkd5" Dec 16 09:05:20 crc kubenswrapper[4823]: I1216 09:05:20.375857 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kf8g\" (UniqueName: \"kubernetes.io/projected/8e5d9a4f-577d-4ecb-9950-bbf018edbd04-kube-api-access-5kf8g\") pod \"heat-cfnapi-69cb598cbc-ht4ws\" (UID: \"8e5d9a4f-577d-4ecb-9950-bbf018edbd04\") " pod="openstack/heat-cfnapi-69cb598cbc-ht4ws" Dec 16 09:05:20 crc kubenswrapper[4823]: I1216 09:05:20.375999 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8e5d9a4f-577d-4ecb-9950-bbf018edbd04-config-data-custom\") pod \"heat-cfnapi-69cb598cbc-ht4ws\" (UID: \"8e5d9a4f-577d-4ecb-9950-bbf018edbd04\") " pod="openstack/heat-cfnapi-69cb598cbc-ht4ws" Dec 16 09:05:20 crc kubenswrapper[4823]: I1216 09:05:20.376074 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e5d9a4f-577d-4ecb-9950-bbf018edbd04-combined-ca-bundle\") pod \"heat-cfnapi-69cb598cbc-ht4ws\" (UID: 
\"8e5d9a4f-577d-4ecb-9950-bbf018edbd04\") " pod="openstack/heat-cfnapi-69cb598cbc-ht4ws" Dec 16 09:05:20 crc kubenswrapper[4823]: I1216 09:05:20.376147 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a613891-fc01-4f69-97a8-63cccc00f4a5-config-data\") pod \"heat-engine-64f85d9856-wwkd5\" (UID: \"7a613891-fc01-4f69-97a8-63cccc00f4a5\") " pod="openstack/heat-engine-64f85d9856-wwkd5" Dec 16 09:05:20 crc kubenswrapper[4823]: I1216 09:05:20.376201 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a613891-fc01-4f69-97a8-63cccc00f4a5-combined-ca-bundle\") pod \"heat-engine-64f85d9856-wwkd5\" (UID: \"7a613891-fc01-4f69-97a8-63cccc00f4a5\") " pod="openstack/heat-engine-64f85d9856-wwkd5" Dec 16 09:05:20 crc kubenswrapper[4823]: I1216 09:05:20.376497 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e5d9a4f-577d-4ecb-9950-bbf018edbd04-config-data\") pod \"heat-cfnapi-69cb598cbc-ht4ws\" (UID: \"8e5d9a4f-577d-4ecb-9950-bbf018edbd04\") " pod="openstack/heat-cfnapi-69cb598cbc-ht4ws" Dec 16 09:05:20 crc kubenswrapper[4823]: I1216 09:05:20.376809 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7a613891-fc01-4f69-97a8-63cccc00f4a5-config-data-custom\") pod \"heat-engine-64f85d9856-wwkd5\" (UID: \"7a613891-fc01-4f69-97a8-63cccc00f4a5\") " pod="openstack/heat-engine-64f85d9856-wwkd5" Dec 16 09:05:20 crc kubenswrapper[4823]: I1216 09:05:20.478771 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfdf0d43-8dd3-4c5d-a7a3-244ba1262194-config-data\") pod 
\"heat-api-7dff5b4649-wzl8g\" (UID: \"dfdf0d43-8dd3-4c5d-a7a3-244ba1262194\") " pod="openstack/heat-api-7dff5b4649-wzl8g" Dec 16 09:05:20 crc kubenswrapper[4823]: I1216 09:05:20.479130 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7a613891-fc01-4f69-97a8-63cccc00f4a5-config-data-custom\") pod \"heat-engine-64f85d9856-wwkd5\" (UID: \"7a613891-fc01-4f69-97a8-63cccc00f4a5\") " pod="openstack/heat-engine-64f85d9856-wwkd5" Dec 16 09:05:20 crc kubenswrapper[4823]: I1216 09:05:20.479276 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dfdf0d43-8dd3-4c5d-a7a3-244ba1262194-config-data-custom\") pod \"heat-api-7dff5b4649-wzl8g\" (UID: \"dfdf0d43-8dd3-4c5d-a7a3-244ba1262194\") " pod="openstack/heat-api-7dff5b4649-wzl8g" Dec 16 09:05:20 crc kubenswrapper[4823]: I1216 09:05:20.479426 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6zspx\" (UniqueName: \"kubernetes.io/projected/7a613891-fc01-4f69-97a8-63cccc00f4a5-kube-api-access-6zspx\") pod \"heat-engine-64f85d9856-wwkd5\" (UID: \"7a613891-fc01-4f69-97a8-63cccc00f4a5\") " pod="openstack/heat-engine-64f85d9856-wwkd5" Dec 16 09:05:20 crc kubenswrapper[4823]: I1216 09:05:20.479554 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5kf8g\" (UniqueName: \"kubernetes.io/projected/8e5d9a4f-577d-4ecb-9950-bbf018edbd04-kube-api-access-5kf8g\") pod \"heat-cfnapi-69cb598cbc-ht4ws\" (UID: \"8e5d9a4f-577d-4ecb-9950-bbf018edbd04\") " pod="openstack/heat-cfnapi-69cb598cbc-ht4ws" Dec 16 09:05:20 crc kubenswrapper[4823]: I1216 09:05:20.479690 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8e5d9a4f-577d-4ecb-9950-bbf018edbd04-config-data-custom\") pod 
\"heat-cfnapi-69cb598cbc-ht4ws\" (UID: \"8e5d9a4f-577d-4ecb-9950-bbf018edbd04\") " pod="openstack/heat-cfnapi-69cb598cbc-ht4ws" Dec 16 09:05:20 crc kubenswrapper[4823]: I1216 09:05:20.480774 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e5d9a4f-577d-4ecb-9950-bbf018edbd04-combined-ca-bundle\") pod \"heat-cfnapi-69cb598cbc-ht4ws\" (UID: \"8e5d9a4f-577d-4ecb-9950-bbf018edbd04\") " pod="openstack/heat-cfnapi-69cb598cbc-ht4ws" Dec 16 09:05:20 crc kubenswrapper[4823]: I1216 09:05:20.480917 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a613891-fc01-4f69-97a8-63cccc00f4a5-config-data\") pod \"heat-engine-64f85d9856-wwkd5\" (UID: \"7a613891-fc01-4f69-97a8-63cccc00f4a5\") " pod="openstack/heat-engine-64f85d9856-wwkd5" Dec 16 09:05:20 crc kubenswrapper[4823]: I1216 09:05:20.481066 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a613891-fc01-4f69-97a8-63cccc00f4a5-combined-ca-bundle\") pod \"heat-engine-64f85d9856-wwkd5\" (UID: \"7a613891-fc01-4f69-97a8-63cccc00f4a5\") " pod="openstack/heat-engine-64f85d9856-wwkd5" Dec 16 09:05:20 crc kubenswrapper[4823]: I1216 09:05:20.481553 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfdf0d43-8dd3-4c5d-a7a3-244ba1262194-combined-ca-bundle\") pod \"heat-api-7dff5b4649-wzl8g\" (UID: \"dfdf0d43-8dd3-4c5d-a7a3-244ba1262194\") " pod="openstack/heat-api-7dff5b4649-wzl8g" Dec 16 09:05:20 crc kubenswrapper[4823]: I1216 09:05:20.481913 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e5d9a4f-577d-4ecb-9950-bbf018edbd04-config-data\") pod \"heat-cfnapi-69cb598cbc-ht4ws\" (UID: 
\"8e5d9a4f-577d-4ecb-9950-bbf018edbd04\") " pod="openstack/heat-cfnapi-69cb598cbc-ht4ws" Dec 16 09:05:20 crc kubenswrapper[4823]: I1216 09:05:20.482069 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jsq2\" (UniqueName: \"kubernetes.io/projected/dfdf0d43-8dd3-4c5d-a7a3-244ba1262194-kube-api-access-2jsq2\") pod \"heat-api-7dff5b4649-wzl8g\" (UID: \"dfdf0d43-8dd3-4c5d-a7a3-244ba1262194\") " pod="openstack/heat-api-7dff5b4649-wzl8g" Dec 16 09:05:20 crc kubenswrapper[4823]: I1216 09:05:20.485892 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e5d9a4f-577d-4ecb-9950-bbf018edbd04-combined-ca-bundle\") pod \"heat-cfnapi-69cb598cbc-ht4ws\" (UID: \"8e5d9a4f-577d-4ecb-9950-bbf018edbd04\") " pod="openstack/heat-cfnapi-69cb598cbc-ht4ws" Dec 16 09:05:20 crc kubenswrapper[4823]: I1216 09:05:20.486704 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7a613891-fc01-4f69-97a8-63cccc00f4a5-config-data-custom\") pod \"heat-engine-64f85d9856-wwkd5\" (UID: \"7a613891-fc01-4f69-97a8-63cccc00f4a5\") " pod="openstack/heat-engine-64f85d9856-wwkd5" Dec 16 09:05:20 crc kubenswrapper[4823]: I1216 09:05:20.488876 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a613891-fc01-4f69-97a8-63cccc00f4a5-combined-ca-bundle\") pod \"heat-engine-64f85d9856-wwkd5\" (UID: \"7a613891-fc01-4f69-97a8-63cccc00f4a5\") " pod="openstack/heat-engine-64f85d9856-wwkd5" Dec 16 09:05:20 crc kubenswrapper[4823]: I1216 09:05:20.489218 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e5d9a4f-577d-4ecb-9950-bbf018edbd04-config-data\") pod \"heat-cfnapi-69cb598cbc-ht4ws\" (UID: \"8e5d9a4f-577d-4ecb-9950-bbf018edbd04\") " 
pod="openstack/heat-cfnapi-69cb598cbc-ht4ws" Dec 16 09:05:20 crc kubenswrapper[4823]: I1216 09:05:20.489235 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a613891-fc01-4f69-97a8-63cccc00f4a5-config-data\") pod \"heat-engine-64f85d9856-wwkd5\" (UID: \"7a613891-fc01-4f69-97a8-63cccc00f4a5\") " pod="openstack/heat-engine-64f85d9856-wwkd5" Dec 16 09:05:20 crc kubenswrapper[4823]: I1216 09:05:20.492002 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8e5d9a4f-577d-4ecb-9950-bbf018edbd04-config-data-custom\") pod \"heat-cfnapi-69cb598cbc-ht4ws\" (UID: \"8e5d9a4f-577d-4ecb-9950-bbf018edbd04\") " pod="openstack/heat-cfnapi-69cb598cbc-ht4ws" Dec 16 09:05:20 crc kubenswrapper[4823]: I1216 09:05:20.501136 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kf8g\" (UniqueName: \"kubernetes.io/projected/8e5d9a4f-577d-4ecb-9950-bbf018edbd04-kube-api-access-5kf8g\") pod \"heat-cfnapi-69cb598cbc-ht4ws\" (UID: \"8e5d9a4f-577d-4ecb-9950-bbf018edbd04\") " pod="openstack/heat-cfnapi-69cb598cbc-ht4ws" Dec 16 09:05:20 crc kubenswrapper[4823]: I1216 09:05:20.504574 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zspx\" (UniqueName: \"kubernetes.io/projected/7a613891-fc01-4f69-97a8-63cccc00f4a5-kube-api-access-6zspx\") pod \"heat-engine-64f85d9856-wwkd5\" (UID: \"7a613891-fc01-4f69-97a8-63cccc00f4a5\") " pod="openstack/heat-engine-64f85d9856-wwkd5" Dec 16 09:05:20 crc kubenswrapper[4823]: I1216 09:05:20.530057 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-64f85d9856-wwkd5" Dec 16 09:05:20 crc kubenswrapper[4823]: I1216 09:05:20.583994 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jsq2\" (UniqueName: \"kubernetes.io/projected/dfdf0d43-8dd3-4c5d-a7a3-244ba1262194-kube-api-access-2jsq2\") pod \"heat-api-7dff5b4649-wzl8g\" (UID: \"dfdf0d43-8dd3-4c5d-a7a3-244ba1262194\") " pod="openstack/heat-api-7dff5b4649-wzl8g" Dec 16 09:05:20 crc kubenswrapper[4823]: I1216 09:05:20.584431 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfdf0d43-8dd3-4c5d-a7a3-244ba1262194-config-data\") pod \"heat-api-7dff5b4649-wzl8g\" (UID: \"dfdf0d43-8dd3-4c5d-a7a3-244ba1262194\") " pod="openstack/heat-api-7dff5b4649-wzl8g" Dec 16 09:05:20 crc kubenswrapper[4823]: I1216 09:05:20.584523 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dfdf0d43-8dd3-4c5d-a7a3-244ba1262194-config-data-custom\") pod \"heat-api-7dff5b4649-wzl8g\" (UID: \"dfdf0d43-8dd3-4c5d-a7a3-244ba1262194\") " pod="openstack/heat-api-7dff5b4649-wzl8g" Dec 16 09:05:20 crc kubenswrapper[4823]: I1216 09:05:20.584667 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfdf0d43-8dd3-4c5d-a7a3-244ba1262194-combined-ca-bundle\") pod \"heat-api-7dff5b4649-wzl8g\" (UID: \"dfdf0d43-8dd3-4c5d-a7a3-244ba1262194\") " pod="openstack/heat-api-7dff5b4649-wzl8g" Dec 16 09:05:20 crc kubenswrapper[4823]: I1216 09:05:20.590369 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfdf0d43-8dd3-4c5d-a7a3-244ba1262194-combined-ca-bundle\") pod \"heat-api-7dff5b4649-wzl8g\" (UID: \"dfdf0d43-8dd3-4c5d-a7a3-244ba1262194\") " pod="openstack/heat-api-7dff5b4649-wzl8g" 
Dec 16 09:05:20 crc kubenswrapper[4823]: I1216 09:05:20.591107 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dfdf0d43-8dd3-4c5d-a7a3-244ba1262194-config-data-custom\") pod \"heat-api-7dff5b4649-wzl8g\" (UID: \"dfdf0d43-8dd3-4c5d-a7a3-244ba1262194\") " pod="openstack/heat-api-7dff5b4649-wzl8g" Dec 16 09:05:20 crc kubenswrapper[4823]: I1216 09:05:20.591628 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-69cb598cbc-ht4ws" Dec 16 09:05:20 crc kubenswrapper[4823]: I1216 09:05:20.592194 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfdf0d43-8dd3-4c5d-a7a3-244ba1262194-config-data\") pod \"heat-api-7dff5b4649-wzl8g\" (UID: \"dfdf0d43-8dd3-4c5d-a7a3-244ba1262194\") " pod="openstack/heat-api-7dff5b4649-wzl8g" Dec 16 09:05:20 crc kubenswrapper[4823]: I1216 09:05:20.608383 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jsq2\" (UniqueName: \"kubernetes.io/projected/dfdf0d43-8dd3-4c5d-a7a3-244ba1262194-kube-api-access-2jsq2\") pod \"heat-api-7dff5b4649-wzl8g\" (UID: \"dfdf0d43-8dd3-4c5d-a7a3-244ba1262194\") " pod="openstack/heat-api-7dff5b4649-wzl8g" Dec 16 09:05:20 crc kubenswrapper[4823]: I1216 09:05:20.878693 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-7dff5b4649-wzl8g" Dec 16 09:05:21 crc kubenswrapper[4823]: I1216 09:05:21.139677 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-64f85d9856-wwkd5"] Dec 16 09:05:21 crc kubenswrapper[4823]: I1216 09:05:21.263865 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-69cb598cbc-ht4ws"] Dec 16 09:05:21 crc kubenswrapper[4823]: I1216 09:05:21.350669 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-64f85d9856-wwkd5" event={"ID":"7a613891-fc01-4f69-97a8-63cccc00f4a5","Type":"ContainerStarted","Data":"22af3c3c8c408d46a4187edafef634d4aaa3ad82741ae61f288fdc5e4fedee2b"} Dec 16 09:05:21 crc kubenswrapper[4823]: I1216 09:05:21.431354 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-7dff5b4649-wzl8g"] Dec 16 09:05:21 crc kubenswrapper[4823]: W1216 09:05:21.472763 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddfdf0d43_8dd3_4c5d_a7a3_244ba1262194.slice/crio-70ceef6a42370249c2380c802a13a20f5b2daf769deb7a35af59a59f3bcb21ae WatchSource:0}: Error finding container 70ceef6a42370249c2380c802a13a20f5b2daf769deb7a35af59a59f3bcb21ae: Status 404 returned error can't find the container with id 70ceef6a42370249c2380c802a13a20f5b2daf769deb7a35af59a59f3bcb21ae Dec 16 09:05:22 crc kubenswrapper[4823]: I1216 09:05:22.144799 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-755687f5b7-sp4tf"] Dec 16 09:05:22 crc kubenswrapper[4823]: I1216 09:05:22.146180 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-api-755687f5b7-sp4tf" podUID="1b3ac823-aaa1-4209-8e6b-88350ffd7519" containerName="heat-api" containerID="cri-o://23dfe5272aafae3411bcc9e20ef3bec3896246f4a96109e136626c13b00596f0" gracePeriod=60 Dec 16 09:05:22 crc kubenswrapper[4823]: I1216 09:05:22.158415 4823 
prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-api-755687f5b7-sp4tf" podUID="1b3ac823-aaa1-4209-8e6b-88350ffd7519" containerName="heat-api" probeResult="failure" output="Get \"http://10.217.1.116:8004/healthcheck\": EOF" Dec 16 09:05:22 crc kubenswrapper[4823]: I1216 09:05:22.178744 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-cbf489c5c-kdft6"] Dec 16 09:05:22 crc kubenswrapper[4823]: I1216 09:05:22.178992 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-cfnapi-cbf489c5c-kdft6" podUID="db45071e-c05e-4a4e-8a88-1d6a4a8bd198" containerName="heat-cfnapi" containerID="cri-o://b50d813e87ca2d06f0dfabdd81e817f993d4fe9ec57071a05a3bbe50e7089491" gracePeriod=60 Dec 16 09:05:22 crc kubenswrapper[4823]: I1216 09:05:22.214066 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-ddfd865c7-nhsh6"] Dec 16 09:05:22 crc kubenswrapper[4823]: I1216 09:05:22.220351 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-ddfd865c7-nhsh6" Dec 16 09:05:22 crc kubenswrapper[4823]: I1216 09:05:22.224440 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-api-public-svc" Dec 16 09:05:22 crc kubenswrapper[4823]: I1216 09:05:22.224680 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-api-internal-svc" Dec 16 09:05:22 crc kubenswrapper[4823]: I1216 09:05:22.237071 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-8589448fc-qj569"] Dec 16 09:05:22 crc kubenswrapper[4823]: I1216 09:05:22.238747 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-8589448fc-qj569" Dec 16 09:05:22 crc kubenswrapper[4823]: I1216 09:05:22.246285 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-cfnapi-public-svc" Dec 16 09:05:22 crc kubenswrapper[4823]: I1216 09:05:22.255494 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-cfnapi-internal-svc" Dec 16 09:05:22 crc kubenswrapper[4823]: I1216 09:05:22.306622 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-ddfd865c7-nhsh6"] Dec 16 09:05:22 crc kubenswrapper[4823]: I1216 09:05:22.326182 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xr2z4\" (UniqueName: \"kubernetes.io/projected/f8b8d93d-24db-4382-9077-7404605c7cf1-kube-api-access-xr2z4\") pod \"heat-api-ddfd865c7-nhsh6\" (UID: \"f8b8d93d-24db-4382-9077-7404605c7cf1\") " pod="openstack/heat-api-ddfd865c7-nhsh6" Dec 16 09:05:22 crc kubenswrapper[4823]: I1216 09:05:22.326329 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7f35ecc1-21e4-461e-91d3-3da96745fed6-config-data-custom\") pod \"heat-cfnapi-8589448fc-qj569\" (UID: \"7f35ecc1-21e4-461e-91d3-3da96745fed6\") " pod="openstack/heat-cfnapi-8589448fc-qj569" Dec 16 09:05:22 crc kubenswrapper[4823]: I1216 09:05:22.326483 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4x9qj\" (UniqueName: \"kubernetes.io/projected/7f35ecc1-21e4-461e-91d3-3da96745fed6-kube-api-access-4x9qj\") pod \"heat-cfnapi-8589448fc-qj569\" (UID: \"7f35ecc1-21e4-461e-91d3-3da96745fed6\") " pod="openstack/heat-cfnapi-8589448fc-qj569" Dec 16 09:05:22 crc kubenswrapper[4823]: I1216 09:05:22.326550 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/f8b8d93d-24db-4382-9077-7404605c7cf1-config-data\") pod \"heat-api-ddfd865c7-nhsh6\" (UID: \"f8b8d93d-24db-4382-9077-7404605c7cf1\") " pod="openstack/heat-api-ddfd865c7-nhsh6" Dec 16 09:05:22 crc kubenswrapper[4823]: I1216 09:05:22.326637 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8b8d93d-24db-4382-9077-7404605c7cf1-public-tls-certs\") pod \"heat-api-ddfd865c7-nhsh6\" (UID: \"f8b8d93d-24db-4382-9077-7404605c7cf1\") " pod="openstack/heat-api-ddfd865c7-nhsh6" Dec 16 09:05:22 crc kubenswrapper[4823]: I1216 09:05:22.326663 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f35ecc1-21e4-461e-91d3-3da96745fed6-config-data\") pod \"heat-cfnapi-8589448fc-qj569\" (UID: \"7f35ecc1-21e4-461e-91d3-3da96745fed6\") " pod="openstack/heat-cfnapi-8589448fc-qj569" Dec 16 09:05:22 crc kubenswrapper[4823]: I1216 09:05:22.326683 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f35ecc1-21e4-461e-91d3-3da96745fed6-internal-tls-certs\") pod \"heat-cfnapi-8589448fc-qj569\" (UID: \"7f35ecc1-21e4-461e-91d3-3da96745fed6\") " pod="openstack/heat-cfnapi-8589448fc-qj569" Dec 16 09:05:22 crc kubenswrapper[4823]: I1216 09:05:22.326745 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f35ecc1-21e4-461e-91d3-3da96745fed6-public-tls-certs\") pod \"heat-cfnapi-8589448fc-qj569\" (UID: \"7f35ecc1-21e4-461e-91d3-3da96745fed6\") " pod="openstack/heat-cfnapi-8589448fc-qj569" Dec 16 09:05:22 crc kubenswrapper[4823]: I1216 09:05:22.326785 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8b8d93d-24db-4382-9077-7404605c7cf1-combined-ca-bundle\") pod \"heat-api-ddfd865c7-nhsh6\" (UID: \"f8b8d93d-24db-4382-9077-7404605c7cf1\") " pod="openstack/heat-api-ddfd865c7-nhsh6" Dec 16 09:05:22 crc kubenswrapper[4823]: I1216 09:05:22.326804 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f35ecc1-21e4-461e-91d3-3da96745fed6-combined-ca-bundle\") pod \"heat-cfnapi-8589448fc-qj569\" (UID: \"7f35ecc1-21e4-461e-91d3-3da96745fed6\") " pod="openstack/heat-cfnapi-8589448fc-qj569" Dec 16 09:05:22 crc kubenswrapper[4823]: I1216 09:05:22.326940 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f8b8d93d-24db-4382-9077-7404605c7cf1-config-data-custom\") pod \"heat-api-ddfd865c7-nhsh6\" (UID: \"f8b8d93d-24db-4382-9077-7404605c7cf1\") " pod="openstack/heat-api-ddfd865c7-nhsh6" Dec 16 09:05:22 crc kubenswrapper[4823]: I1216 09:05:22.326989 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8b8d93d-24db-4382-9077-7404605c7cf1-internal-tls-certs\") pod \"heat-api-ddfd865c7-nhsh6\" (UID: \"f8b8d93d-24db-4382-9077-7404605c7cf1\") " pod="openstack/heat-api-ddfd865c7-nhsh6" Dec 16 09:05:22 crc kubenswrapper[4823]: I1216 09:05:22.342507 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-8589448fc-qj569"] Dec 16 09:05:22 crc kubenswrapper[4823]: I1216 09:05:22.367903 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-64f85d9856-wwkd5" event={"ID":"7a613891-fc01-4f69-97a8-63cccc00f4a5","Type":"ContainerStarted","Data":"6325c5677c7baf607bd7e00984420b3b254a2b5f3dd777e252669862730b0092"} Dec 16 09:05:22 crc kubenswrapper[4823]: I1216 
09:05:22.369133 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-64f85d9856-wwkd5" Dec 16 09:05:22 crc kubenswrapper[4823]: I1216 09:05:22.372721 4823 generic.go:334] "Generic (PLEG): container finished" podID="8e5d9a4f-577d-4ecb-9950-bbf018edbd04" containerID="8b5f5c4ca59c97661fcec7550f8fb38179880139bbed6df699e442ea202d609e" exitCode=1 Dec 16 09:05:22 crc kubenswrapper[4823]: I1216 09:05:22.372777 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-69cb598cbc-ht4ws" event={"ID":"8e5d9a4f-577d-4ecb-9950-bbf018edbd04","Type":"ContainerDied","Data":"8b5f5c4ca59c97661fcec7550f8fb38179880139bbed6df699e442ea202d609e"} Dec 16 09:05:22 crc kubenswrapper[4823]: I1216 09:05:22.372800 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-69cb598cbc-ht4ws" event={"ID":"8e5d9a4f-577d-4ecb-9950-bbf018edbd04","Type":"ContainerStarted","Data":"b6eaad087b32522a1ed9ea70ca5ad094e814259cda7fc6081e348c4386454200"} Dec 16 09:05:22 crc kubenswrapper[4823]: I1216 09:05:22.373410 4823 scope.go:117] "RemoveContainer" containerID="8b5f5c4ca59c97661fcec7550f8fb38179880139bbed6df699e442ea202d609e" Dec 16 09:05:22 crc kubenswrapper[4823]: I1216 09:05:22.376258 4823 generic.go:334] "Generic (PLEG): container finished" podID="dfdf0d43-8dd3-4c5d-a7a3-244ba1262194" containerID="e5ac258b5417bb514321b96e48951b0c92e9ceb7e9c1573718380fd62dc4588c" exitCode=1 Dec 16 09:05:22 crc kubenswrapper[4823]: I1216 09:05:22.376288 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-7dff5b4649-wzl8g" event={"ID":"dfdf0d43-8dd3-4c5d-a7a3-244ba1262194","Type":"ContainerDied","Data":"e5ac258b5417bb514321b96e48951b0c92e9ceb7e9c1573718380fd62dc4588c"} Dec 16 09:05:22 crc kubenswrapper[4823]: I1216 09:05:22.376309 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-7dff5b4649-wzl8g" 
event={"ID":"dfdf0d43-8dd3-4c5d-a7a3-244ba1262194","Type":"ContainerStarted","Data":"70ceef6a42370249c2380c802a13a20f5b2daf769deb7a35af59a59f3bcb21ae"} Dec 16 09:05:22 crc kubenswrapper[4823]: I1216 09:05:22.376626 4823 scope.go:117] "RemoveContainer" containerID="e5ac258b5417bb514321b96e48951b0c92e9ceb7e9c1573718380fd62dc4588c" Dec 16 09:05:22 crc kubenswrapper[4823]: I1216 09:05:22.402794 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-64f85d9856-wwkd5" podStartSLOduration=2.402778291 podStartE2EDuration="2.402778291s" podCreationTimestamp="2025-12-16 09:05:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 09:05:22.396934998 +0000 UTC m=+7800.885501121" watchObservedRunningTime="2025-12-16 09:05:22.402778291 +0000 UTC m=+7800.891344404" Dec 16 09:05:22 crc kubenswrapper[4823]: I1216 09:05:22.429187 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8b8d93d-24db-4382-9077-7404605c7cf1-combined-ca-bundle\") pod \"heat-api-ddfd865c7-nhsh6\" (UID: \"f8b8d93d-24db-4382-9077-7404605c7cf1\") " pod="openstack/heat-api-ddfd865c7-nhsh6" Dec 16 09:05:22 crc kubenswrapper[4823]: I1216 09:05:22.429246 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f35ecc1-21e4-461e-91d3-3da96745fed6-combined-ca-bundle\") pod \"heat-cfnapi-8589448fc-qj569\" (UID: \"7f35ecc1-21e4-461e-91d3-3da96745fed6\") " pod="openstack/heat-cfnapi-8589448fc-qj569" Dec 16 09:05:22 crc kubenswrapper[4823]: I1216 09:05:22.429332 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f8b8d93d-24db-4382-9077-7404605c7cf1-config-data-custom\") pod \"heat-api-ddfd865c7-nhsh6\" (UID: 
\"f8b8d93d-24db-4382-9077-7404605c7cf1\") " pod="openstack/heat-api-ddfd865c7-nhsh6" Dec 16 09:05:22 crc kubenswrapper[4823]: I1216 09:05:22.429394 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8b8d93d-24db-4382-9077-7404605c7cf1-internal-tls-certs\") pod \"heat-api-ddfd865c7-nhsh6\" (UID: \"f8b8d93d-24db-4382-9077-7404605c7cf1\") " pod="openstack/heat-api-ddfd865c7-nhsh6" Dec 16 09:05:22 crc kubenswrapper[4823]: I1216 09:05:22.429499 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xr2z4\" (UniqueName: \"kubernetes.io/projected/f8b8d93d-24db-4382-9077-7404605c7cf1-kube-api-access-xr2z4\") pod \"heat-api-ddfd865c7-nhsh6\" (UID: \"f8b8d93d-24db-4382-9077-7404605c7cf1\") " pod="openstack/heat-api-ddfd865c7-nhsh6" Dec 16 09:05:22 crc kubenswrapper[4823]: I1216 09:05:22.429525 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7f35ecc1-21e4-461e-91d3-3da96745fed6-config-data-custom\") pod \"heat-cfnapi-8589448fc-qj569\" (UID: \"7f35ecc1-21e4-461e-91d3-3da96745fed6\") " pod="openstack/heat-cfnapi-8589448fc-qj569" Dec 16 09:05:22 crc kubenswrapper[4823]: I1216 09:05:22.429588 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4x9qj\" (UniqueName: \"kubernetes.io/projected/7f35ecc1-21e4-461e-91d3-3da96745fed6-kube-api-access-4x9qj\") pod \"heat-cfnapi-8589448fc-qj569\" (UID: \"7f35ecc1-21e4-461e-91d3-3da96745fed6\") " pod="openstack/heat-cfnapi-8589448fc-qj569" Dec 16 09:05:22 crc kubenswrapper[4823]: I1216 09:05:22.429623 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8b8d93d-24db-4382-9077-7404605c7cf1-config-data\") pod \"heat-api-ddfd865c7-nhsh6\" (UID: \"f8b8d93d-24db-4382-9077-7404605c7cf1\") " 
pod="openstack/heat-api-ddfd865c7-nhsh6" Dec 16 09:05:22 crc kubenswrapper[4823]: I1216 09:05:22.429718 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8b8d93d-24db-4382-9077-7404605c7cf1-public-tls-certs\") pod \"heat-api-ddfd865c7-nhsh6\" (UID: \"f8b8d93d-24db-4382-9077-7404605c7cf1\") " pod="openstack/heat-api-ddfd865c7-nhsh6" Dec 16 09:05:22 crc kubenswrapper[4823]: I1216 09:05:22.429763 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f35ecc1-21e4-461e-91d3-3da96745fed6-config-data\") pod \"heat-cfnapi-8589448fc-qj569\" (UID: \"7f35ecc1-21e4-461e-91d3-3da96745fed6\") " pod="openstack/heat-cfnapi-8589448fc-qj569" Dec 16 09:05:22 crc kubenswrapper[4823]: I1216 09:05:22.429787 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f35ecc1-21e4-461e-91d3-3da96745fed6-internal-tls-certs\") pod \"heat-cfnapi-8589448fc-qj569\" (UID: \"7f35ecc1-21e4-461e-91d3-3da96745fed6\") " pod="openstack/heat-cfnapi-8589448fc-qj569" Dec 16 09:05:22 crc kubenswrapper[4823]: I1216 09:05:22.429832 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f35ecc1-21e4-461e-91d3-3da96745fed6-public-tls-certs\") pod \"heat-cfnapi-8589448fc-qj569\" (UID: \"7f35ecc1-21e4-461e-91d3-3da96745fed6\") " pod="openstack/heat-cfnapi-8589448fc-qj569" Dec 16 09:05:22 crc kubenswrapper[4823]: I1216 09:05:22.441442 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f35ecc1-21e4-461e-91d3-3da96745fed6-config-data\") pod \"heat-cfnapi-8589448fc-qj569\" (UID: \"7f35ecc1-21e4-461e-91d3-3da96745fed6\") " pod="openstack/heat-cfnapi-8589448fc-qj569" Dec 16 09:05:22 crc kubenswrapper[4823]: I1216 
09:05:22.447956 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8b8d93d-24db-4382-9077-7404605c7cf1-public-tls-certs\") pod \"heat-api-ddfd865c7-nhsh6\" (UID: \"f8b8d93d-24db-4382-9077-7404605c7cf1\") " pod="openstack/heat-api-ddfd865c7-nhsh6" Dec 16 09:05:22 crc kubenswrapper[4823]: I1216 09:05:22.448971 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f35ecc1-21e4-461e-91d3-3da96745fed6-combined-ca-bundle\") pod \"heat-cfnapi-8589448fc-qj569\" (UID: \"7f35ecc1-21e4-461e-91d3-3da96745fed6\") " pod="openstack/heat-cfnapi-8589448fc-qj569" Dec 16 09:05:22 crc kubenswrapper[4823]: I1216 09:05:22.451608 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8b8d93d-24db-4382-9077-7404605c7cf1-internal-tls-certs\") pod \"heat-api-ddfd865c7-nhsh6\" (UID: \"f8b8d93d-24db-4382-9077-7404605c7cf1\") " pod="openstack/heat-api-ddfd865c7-nhsh6" Dec 16 09:05:22 crc kubenswrapper[4823]: I1216 09:05:22.455567 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8b8d93d-24db-4382-9077-7404605c7cf1-config-data\") pod \"heat-api-ddfd865c7-nhsh6\" (UID: \"f8b8d93d-24db-4382-9077-7404605c7cf1\") " pod="openstack/heat-api-ddfd865c7-nhsh6" Dec 16 09:05:22 crc kubenswrapper[4823]: I1216 09:05:22.456106 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7f35ecc1-21e4-461e-91d3-3da96745fed6-config-data-custom\") pod \"heat-cfnapi-8589448fc-qj569\" (UID: \"7f35ecc1-21e4-461e-91d3-3da96745fed6\") " pod="openstack/heat-cfnapi-8589448fc-qj569" Dec 16 09:05:22 crc kubenswrapper[4823]: I1216 09:05:22.456912 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" 
(UniqueName: \"kubernetes.io/secret/f8b8d93d-24db-4382-9077-7404605c7cf1-config-data-custom\") pod \"heat-api-ddfd865c7-nhsh6\" (UID: \"f8b8d93d-24db-4382-9077-7404605c7cf1\") " pod="openstack/heat-api-ddfd865c7-nhsh6" Dec 16 09:05:22 crc kubenswrapper[4823]: I1216 09:05:22.463863 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f35ecc1-21e4-461e-91d3-3da96745fed6-public-tls-certs\") pod \"heat-cfnapi-8589448fc-qj569\" (UID: \"7f35ecc1-21e4-461e-91d3-3da96745fed6\") " pod="openstack/heat-cfnapi-8589448fc-qj569" Dec 16 09:05:22 crc kubenswrapper[4823]: I1216 09:05:22.466661 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8b8d93d-24db-4382-9077-7404605c7cf1-combined-ca-bundle\") pod \"heat-api-ddfd865c7-nhsh6\" (UID: \"f8b8d93d-24db-4382-9077-7404605c7cf1\") " pod="openstack/heat-api-ddfd865c7-nhsh6" Dec 16 09:05:22 crc kubenswrapper[4823]: I1216 09:05:22.467796 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4x9qj\" (UniqueName: \"kubernetes.io/projected/7f35ecc1-21e4-461e-91d3-3da96745fed6-kube-api-access-4x9qj\") pod \"heat-cfnapi-8589448fc-qj569\" (UID: \"7f35ecc1-21e4-461e-91d3-3da96745fed6\") " pod="openstack/heat-cfnapi-8589448fc-qj569" Dec 16 09:05:22 crc kubenswrapper[4823]: I1216 09:05:22.473624 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f35ecc1-21e4-461e-91d3-3da96745fed6-internal-tls-certs\") pod \"heat-cfnapi-8589448fc-qj569\" (UID: \"7f35ecc1-21e4-461e-91d3-3da96745fed6\") " pod="openstack/heat-cfnapi-8589448fc-qj569" Dec 16 09:05:22 crc kubenswrapper[4823]: I1216 09:05:22.479846 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xr2z4\" (UniqueName: 
\"kubernetes.io/projected/f8b8d93d-24db-4382-9077-7404605c7cf1-kube-api-access-xr2z4\") pod \"heat-api-ddfd865c7-nhsh6\" (UID: \"f8b8d93d-24db-4382-9077-7404605c7cf1\") " pod="openstack/heat-api-ddfd865c7-nhsh6" Dec 16 09:05:22 crc kubenswrapper[4823]: I1216 09:05:22.619770 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-ddfd865c7-nhsh6" Dec 16 09:05:22 crc kubenswrapper[4823]: I1216 09:05:22.648882 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-8589448fc-qj569" Dec 16 09:05:23 crc kubenswrapper[4823]: I1216 09:05:23.045283 4823 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-cfnapi-cbf489c5c-kdft6" podUID="db45071e-c05e-4a4e-8a88-1d6a4a8bd198" containerName="heat-cfnapi" probeResult="failure" output="Get \"http://10.217.1.117:8000/healthcheck\": read tcp 10.217.0.2:36802->10.217.1.117:8000: read: connection reset by peer" Dec 16 09:05:23 crc kubenswrapper[4823]: I1216 09:05:23.278682 4823 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-cfnapi-cbf489c5c-kdft6" podUID="db45071e-c05e-4a4e-8a88-1d6a4a8bd198" containerName="heat-cfnapi" probeResult="failure" output="Get \"http://10.217.1.117:8000/healthcheck\": dial tcp 10.217.1.117:8000: connect: connection refused" Dec 16 09:05:23 crc kubenswrapper[4823]: I1216 09:05:23.401938 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-ddfd865c7-nhsh6"] Dec 16 09:05:23 crc kubenswrapper[4823]: I1216 09:05:23.407648 4823 generic.go:334] "Generic (PLEG): container finished" podID="db45071e-c05e-4a4e-8a88-1d6a4a8bd198" containerID="b50d813e87ca2d06f0dfabdd81e817f993d4fe9ec57071a05a3bbe50e7089491" exitCode=0 Dec 16 09:05:23 crc kubenswrapper[4823]: I1216 09:05:23.407769 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-cbf489c5c-kdft6" 
event={"ID":"db45071e-c05e-4a4e-8a88-1d6a4a8bd198","Type":"ContainerDied","Data":"b50d813e87ca2d06f0dfabdd81e817f993d4fe9ec57071a05a3bbe50e7089491"} Dec 16 09:05:23 crc kubenswrapper[4823]: I1216 09:05:23.445502 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-69cb598cbc-ht4ws" event={"ID":"8e5d9a4f-577d-4ecb-9950-bbf018edbd04","Type":"ContainerStarted","Data":"d5d0a5367608ca12ad2cdc7265c39c5cf0d8d651f757751f94e87eed288e2029"} Dec 16 09:05:23 crc kubenswrapper[4823]: I1216 09:05:23.446501 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-69cb598cbc-ht4ws" Dec 16 09:05:23 crc kubenswrapper[4823]: I1216 09:05:23.450144 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-7dff5b4649-wzl8g" event={"ID":"dfdf0d43-8dd3-4c5d-a7a3-244ba1262194","Type":"ContainerStarted","Data":"f2649af54fbf7bcceac97ad5fdbae7a2088fe288f127187568b90a587d9dcc08"} Dec 16 09:05:23 crc kubenswrapper[4823]: I1216 09:05:23.450353 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-7dff5b4649-wzl8g" Dec 16 09:05:23 crc kubenswrapper[4823]: I1216 09:05:23.506644 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-69cb598cbc-ht4ws" podStartSLOduration=3.5066118790000003 podStartE2EDuration="3.506611879s" podCreationTimestamp="2025-12-16 09:05:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 09:05:23.482476253 +0000 UTC m=+7801.971042376" watchObservedRunningTime="2025-12-16 09:05:23.506611879 +0000 UTC m=+7801.995177992" Dec 16 09:05:23 crc kubenswrapper[4823]: I1216 09:05:23.532203 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-7dff5b4649-wzl8g" podStartSLOduration=3.532176358 podStartE2EDuration="3.532176358s" podCreationTimestamp="2025-12-16 09:05:20 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 09:05:23.507123174 +0000 UTC m=+7801.995689307" watchObservedRunningTime="2025-12-16 09:05:23.532176358 +0000 UTC m=+7802.020742491" Dec 16 09:05:23 crc kubenswrapper[4823]: I1216 09:05:23.624336 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-8589448fc-qj569"] Dec 16 09:05:24 crc kubenswrapper[4823]: I1216 09:05:24.045074 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-cbf489c5c-kdft6" Dec 16 09:05:24 crc kubenswrapper[4823]: I1216 09:05:24.193205 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db45071e-c05e-4a4e-8a88-1d6a4a8bd198-config-data\") pod \"db45071e-c05e-4a4e-8a88-1d6a4a8bd198\" (UID: \"db45071e-c05e-4a4e-8a88-1d6a4a8bd198\") " Dec 16 09:05:24 crc kubenswrapper[4823]: I1216 09:05:24.193296 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db45071e-c05e-4a4e-8a88-1d6a4a8bd198-combined-ca-bundle\") pod \"db45071e-c05e-4a4e-8a88-1d6a4a8bd198\" (UID: \"db45071e-c05e-4a4e-8a88-1d6a4a8bd198\") " Dec 16 09:05:24 crc kubenswrapper[4823]: I1216 09:05:24.193325 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/db45071e-c05e-4a4e-8a88-1d6a4a8bd198-config-data-custom\") pod \"db45071e-c05e-4a4e-8a88-1d6a4a8bd198\" (UID: \"db45071e-c05e-4a4e-8a88-1d6a4a8bd198\") " Dec 16 09:05:24 crc kubenswrapper[4823]: I1216 09:05:24.193395 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g78ls\" (UniqueName: \"kubernetes.io/projected/db45071e-c05e-4a4e-8a88-1d6a4a8bd198-kube-api-access-g78ls\") pod 
\"db45071e-c05e-4a4e-8a88-1d6a4a8bd198\" (UID: \"db45071e-c05e-4a4e-8a88-1d6a4a8bd198\") " Dec 16 09:05:24 crc kubenswrapper[4823]: I1216 09:05:24.198674 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db45071e-c05e-4a4e-8a88-1d6a4a8bd198-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "db45071e-c05e-4a4e-8a88-1d6a4a8bd198" (UID: "db45071e-c05e-4a4e-8a88-1d6a4a8bd198"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:05:24 crc kubenswrapper[4823]: I1216 09:05:24.202293 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db45071e-c05e-4a4e-8a88-1d6a4a8bd198-kube-api-access-g78ls" (OuterVolumeSpecName: "kube-api-access-g78ls") pod "db45071e-c05e-4a4e-8a88-1d6a4a8bd198" (UID: "db45071e-c05e-4a4e-8a88-1d6a4a8bd198"). InnerVolumeSpecName "kube-api-access-g78ls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:05:24 crc kubenswrapper[4823]: I1216 09:05:24.234735 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-5948ddcb4-f5qgv" Dec 16 09:05:24 crc kubenswrapper[4823]: I1216 09:05:24.239297 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db45071e-c05e-4a4e-8a88-1d6a4a8bd198-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "db45071e-c05e-4a4e-8a88-1d6a4a8bd198" (UID: "db45071e-c05e-4a4e-8a88-1d6a4a8bd198"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:05:24 crc kubenswrapper[4823]: I1216 09:05:24.296872 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db45071e-c05e-4a4e-8a88-1d6a4a8bd198-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 09:05:24 crc kubenswrapper[4823]: I1216 09:05:24.297330 4823 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/db45071e-c05e-4a4e-8a88-1d6a4a8bd198-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 16 09:05:24 crc kubenswrapper[4823]: I1216 09:05:24.297418 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g78ls\" (UniqueName: \"kubernetes.io/projected/db45071e-c05e-4a4e-8a88-1d6a4a8bd198-kube-api-access-g78ls\") on node \"crc\" DevicePath \"\"" Dec 16 09:05:24 crc kubenswrapper[4823]: I1216 09:05:24.308244 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db45071e-c05e-4a4e-8a88-1d6a4a8bd198-config-data" (OuterVolumeSpecName: "config-data") pod "db45071e-c05e-4a4e-8a88-1d6a4a8bd198" (UID: "db45071e-c05e-4a4e-8a88-1d6a4a8bd198"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:05:24 crc kubenswrapper[4823]: I1216 09:05:24.399948 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db45071e-c05e-4a4e-8a88-1d6a4a8bd198-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 09:05:24 crc kubenswrapper[4823]: I1216 09:05:24.497507 4823 generic.go:334] "Generic (PLEG): container finished" podID="8e5d9a4f-577d-4ecb-9950-bbf018edbd04" containerID="d5d0a5367608ca12ad2cdc7265c39c5cf0d8d651f757751f94e87eed288e2029" exitCode=1 Dec 16 09:05:24 crc kubenswrapper[4823]: I1216 09:05:24.497563 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-69cb598cbc-ht4ws" event={"ID":"8e5d9a4f-577d-4ecb-9950-bbf018edbd04","Type":"ContainerDied","Data":"d5d0a5367608ca12ad2cdc7265c39c5cf0d8d651f757751f94e87eed288e2029"} Dec 16 09:05:24 crc kubenswrapper[4823]: I1216 09:05:24.497598 4823 scope.go:117] "RemoveContainer" containerID="8b5f5c4ca59c97661fcec7550f8fb38179880139bbed6df699e442ea202d609e" Dec 16 09:05:24 crc kubenswrapper[4823]: I1216 09:05:24.498265 4823 scope.go:117] "RemoveContainer" containerID="d5d0a5367608ca12ad2cdc7265c39c5cf0d8d651f757751f94e87eed288e2029" Dec 16 09:05:24 crc kubenswrapper[4823]: E1216 09:05:24.498477 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-69cb598cbc-ht4ws_openstack(8e5d9a4f-577d-4ecb-9950-bbf018edbd04)\"" pod="openstack/heat-cfnapi-69cb598cbc-ht4ws" podUID="8e5d9a4f-577d-4ecb-9950-bbf018edbd04" Dec 16 09:05:24 crc kubenswrapper[4823]: I1216 09:05:24.507829 4823 generic.go:334] "Generic (PLEG): container finished" podID="dfdf0d43-8dd3-4c5d-a7a3-244ba1262194" containerID="f2649af54fbf7bcceac97ad5fdbae7a2088fe288f127187568b90a587d9dcc08" exitCode=1 Dec 16 09:05:24 crc kubenswrapper[4823]: I1216 09:05:24.507923 4823 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-7dff5b4649-wzl8g" event={"ID":"dfdf0d43-8dd3-4c5d-a7a3-244ba1262194","Type":"ContainerDied","Data":"f2649af54fbf7bcceac97ad5fdbae7a2088fe288f127187568b90a587d9dcc08"} Dec 16 09:05:24 crc kubenswrapper[4823]: I1216 09:05:24.508473 4823 scope.go:117] "RemoveContainer" containerID="f2649af54fbf7bcceac97ad5fdbae7a2088fe288f127187568b90a587d9dcc08" Dec 16 09:05:24 crc kubenswrapper[4823]: E1216 09:05:24.508725 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-7dff5b4649-wzl8g_openstack(dfdf0d43-8dd3-4c5d-a7a3-244ba1262194)\"" pod="openstack/heat-api-7dff5b4649-wzl8g" podUID="dfdf0d43-8dd3-4c5d-a7a3-244ba1262194" Dec 16 09:05:24 crc kubenswrapper[4823]: I1216 09:05:24.515619 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-8589448fc-qj569" event={"ID":"7f35ecc1-21e4-461e-91d3-3da96745fed6","Type":"ContainerStarted","Data":"44afd549f065376806e3735489d6257a4793e59063b189217a6eecd50e0f1af0"} Dec 16 09:05:24 crc kubenswrapper[4823]: I1216 09:05:24.515667 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-8589448fc-qj569" Dec 16 09:05:24 crc kubenswrapper[4823]: I1216 09:05:24.515680 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-8589448fc-qj569" event={"ID":"7f35ecc1-21e4-461e-91d3-3da96745fed6","Type":"ContainerStarted","Data":"a02a72e5bc1993985df5b00f6e12bac6bfccaa68ebd6fc0301f5b7f0313e55db"} Dec 16 09:05:24 crc kubenswrapper[4823]: I1216 09:05:24.545281 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-cbf489c5c-kdft6" event={"ID":"db45071e-c05e-4a4e-8a88-1d6a4a8bd198","Type":"ContainerDied","Data":"8bb46b61265175370a06d12339f4bb9fde10bbef6aed9c6e01dde56b02078808"} Dec 16 09:05:24 crc kubenswrapper[4823]: I1216 
09:05:24.545362 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-cbf489c5c-kdft6" Dec 16 09:05:24 crc kubenswrapper[4823]: I1216 09:05:24.558766 4823 scope.go:117] "RemoveContainer" containerID="e5ac258b5417bb514321b96e48951b0c92e9ceb7e9c1573718380fd62dc4588c" Dec 16 09:05:24 crc kubenswrapper[4823]: I1216 09:05:24.558879 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-ddfd865c7-nhsh6" event={"ID":"f8b8d93d-24db-4382-9077-7404605c7cf1","Type":"ContainerStarted","Data":"9e972ac6360c9fcb45fd20ef40bf7e8972136fa235df75bc1dfcacbfb25e23ed"} Dec 16 09:05:24 crc kubenswrapper[4823]: I1216 09:05:24.558921 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-ddfd865c7-nhsh6" Dec 16 09:05:24 crc kubenswrapper[4823]: I1216 09:05:24.558932 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-ddfd865c7-nhsh6" event={"ID":"f8b8d93d-24db-4382-9077-7404605c7cf1","Type":"ContainerStarted","Data":"7ef8314d95f6bf55c632d1c16580e1cd82d6d9038b9cb26fdf59a1511cbce48b"} Dec 16 09:05:24 crc kubenswrapper[4823]: I1216 09:05:24.593462 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-8589448fc-qj569" podStartSLOduration=2.5934418729999997 podStartE2EDuration="2.593441873s" podCreationTimestamp="2025-12-16 09:05:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 09:05:24.582335805 +0000 UTC m=+7803.070901928" watchObservedRunningTime="2025-12-16 09:05:24.593441873 +0000 UTC m=+7803.082007996" Dec 16 09:05:24 crc kubenswrapper[4823]: I1216 09:05:24.678786 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-ddfd865c7-nhsh6" podStartSLOduration=2.6787622840000003 podStartE2EDuration="2.678762284s" podCreationTimestamp="2025-12-16 09:05:22 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 09:05:24.667070668 +0000 UTC m=+7803.155636811" watchObservedRunningTime="2025-12-16 09:05:24.678762284 +0000 UTC m=+7803.167328407" Dec 16 09:05:24 crc kubenswrapper[4823]: I1216 09:05:24.724287 4823 scope.go:117] "RemoveContainer" containerID="b50d813e87ca2d06f0dfabdd81e817f993d4fe9ec57071a05a3bbe50e7089491" Dec 16 09:05:24 crc kubenswrapper[4823]: I1216 09:05:24.762435 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-cbf489c5c-kdft6"] Dec 16 09:05:24 crc kubenswrapper[4823]: I1216 09:05:24.786001 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-cbf489c5c-kdft6"] Dec 16 09:05:25 crc kubenswrapper[4823]: I1216 09:05:25.575406 4823 scope.go:117] "RemoveContainer" containerID="d5d0a5367608ca12ad2cdc7265c39c5cf0d8d651f757751f94e87eed288e2029" Dec 16 09:05:25 crc kubenswrapper[4823]: E1216 09:05:25.575634 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-69cb598cbc-ht4ws_openstack(8e5d9a4f-577d-4ecb-9950-bbf018edbd04)\"" pod="openstack/heat-cfnapi-69cb598cbc-ht4ws" podUID="8e5d9a4f-577d-4ecb-9950-bbf018edbd04" Dec 16 09:05:25 crc kubenswrapper[4823]: I1216 09:05:25.580903 4823 scope.go:117] "RemoveContainer" containerID="f2649af54fbf7bcceac97ad5fdbae7a2088fe288f127187568b90a587d9dcc08" Dec 16 09:05:25 crc kubenswrapper[4823]: E1216 09:05:25.581234 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-7dff5b4649-wzl8g_openstack(dfdf0d43-8dd3-4c5d-a7a3-244ba1262194)\"" pod="openstack/heat-api-7dff5b4649-wzl8g" podUID="dfdf0d43-8dd3-4c5d-a7a3-244ba1262194" Dec 16 09:05:25 crc 
kubenswrapper[4823]: I1216 09:05:25.601167 4823 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-cfnapi-69cb598cbc-ht4ws" Dec 16 09:05:25 crc kubenswrapper[4823]: I1216 09:05:25.783990 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db45071e-c05e-4a4e-8a88-1d6a4a8bd198" path="/var/lib/kubelet/pods/db45071e-c05e-4a4e-8a88-1d6a4a8bd198/volumes" Dec 16 09:05:25 crc kubenswrapper[4823]: I1216 09:05:25.880183 4823 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-api-7dff5b4649-wzl8g" Dec 16 09:05:26 crc kubenswrapper[4823]: I1216 09:05:26.591361 4823 scope.go:117] "RemoveContainer" containerID="d5d0a5367608ca12ad2cdc7265c39c5cf0d8d651f757751f94e87eed288e2029" Dec 16 09:05:26 crc kubenswrapper[4823]: E1216 09:05:26.591801 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-69cb598cbc-ht4ws_openstack(8e5d9a4f-577d-4ecb-9950-bbf018edbd04)\"" pod="openstack/heat-cfnapi-69cb598cbc-ht4ws" podUID="8e5d9a4f-577d-4ecb-9950-bbf018edbd04" Dec 16 09:05:26 crc kubenswrapper[4823]: I1216 09:05:26.592688 4823 scope.go:117] "RemoveContainer" containerID="f2649af54fbf7bcceac97ad5fdbae7a2088fe288f127187568b90a587d9dcc08" Dec 16 09:05:26 crc kubenswrapper[4823]: E1216 09:05:26.592897 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-7dff5b4649-wzl8g_openstack(dfdf0d43-8dd3-4c5d-a7a3-244ba1262194)\"" pod="openstack/heat-api-7dff5b4649-wzl8g" podUID="dfdf0d43-8dd3-4c5d-a7a3-244ba1262194" Dec 16 09:05:26 crc kubenswrapper[4823]: I1216 09:05:26.772801 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-5948ddcb4-f5qgv" Dec 16 09:05:26 crc 
kubenswrapper[4823]: I1216 09:05:26.876199 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5fc95447c4-jfpp8"] Dec 16 09:05:26 crc kubenswrapper[4823]: I1216 09:05:26.876794 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5fc95447c4-jfpp8" podUID="28373e9d-544d-40d4-8517-51e6718b9493" containerName="horizon-log" containerID="cri-o://304daf66f20219103734d58ce5cff3122507318b7f01170a8de3b259a5a31f50" gracePeriod=30 Dec 16 09:05:26 crc kubenswrapper[4823]: I1216 09:05:26.877330 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5fc95447c4-jfpp8" podUID="28373e9d-544d-40d4-8517-51e6718b9493" containerName="horizon" containerID="cri-o://ea9aa1f918811c0b1fc9ff20658bcd5bde81f67878b0d287ad886928e5de1fba" gracePeriod=30 Dec 16 09:05:28 crc kubenswrapper[4823]: I1216 09:05:28.593590 4823 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-api-755687f5b7-sp4tf" podUID="1b3ac823-aaa1-4209-8e6b-88350ffd7519" containerName="heat-api" probeResult="failure" output="Get \"http://10.217.1.116:8004/healthcheck\": read tcp 10.217.0.2:57600->10.217.1.116:8004: read: connection reset by peer" Dec 16 09:05:28 crc kubenswrapper[4823]: I1216 09:05:28.594464 4823 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-api-755687f5b7-sp4tf" podUID="1b3ac823-aaa1-4209-8e6b-88350ffd7519" containerName="heat-api" probeResult="failure" output="Get \"http://10.217.1.116:8004/healthcheck\": dial tcp 10.217.1.116:8004: connect: connection refused" Dec 16 09:05:29 crc kubenswrapper[4823]: I1216 09:05:29.136166 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-755687f5b7-sp4tf" Dec 16 09:05:29 crc kubenswrapper[4823]: I1216 09:05:29.221150 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1b3ac823-aaa1-4209-8e6b-88350ffd7519-config-data-custom\") pod \"1b3ac823-aaa1-4209-8e6b-88350ffd7519\" (UID: \"1b3ac823-aaa1-4209-8e6b-88350ffd7519\") " Dec 16 09:05:29 crc kubenswrapper[4823]: I1216 09:05:29.221289 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b3ac823-aaa1-4209-8e6b-88350ffd7519-combined-ca-bundle\") pod \"1b3ac823-aaa1-4209-8e6b-88350ffd7519\" (UID: \"1b3ac823-aaa1-4209-8e6b-88350ffd7519\") " Dec 16 09:05:29 crc kubenswrapper[4823]: I1216 09:05:29.221377 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7gpgb\" (UniqueName: \"kubernetes.io/projected/1b3ac823-aaa1-4209-8e6b-88350ffd7519-kube-api-access-7gpgb\") pod \"1b3ac823-aaa1-4209-8e6b-88350ffd7519\" (UID: \"1b3ac823-aaa1-4209-8e6b-88350ffd7519\") " Dec 16 09:05:29 crc kubenswrapper[4823]: I1216 09:05:29.221556 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b3ac823-aaa1-4209-8e6b-88350ffd7519-config-data\") pod \"1b3ac823-aaa1-4209-8e6b-88350ffd7519\" (UID: \"1b3ac823-aaa1-4209-8e6b-88350ffd7519\") " Dec 16 09:05:29 crc kubenswrapper[4823]: I1216 09:05:29.226576 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b3ac823-aaa1-4209-8e6b-88350ffd7519-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "1b3ac823-aaa1-4209-8e6b-88350ffd7519" (UID: "1b3ac823-aaa1-4209-8e6b-88350ffd7519"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:05:29 crc kubenswrapper[4823]: I1216 09:05:29.227009 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b3ac823-aaa1-4209-8e6b-88350ffd7519-kube-api-access-7gpgb" (OuterVolumeSpecName: "kube-api-access-7gpgb") pod "1b3ac823-aaa1-4209-8e6b-88350ffd7519" (UID: "1b3ac823-aaa1-4209-8e6b-88350ffd7519"). InnerVolumeSpecName "kube-api-access-7gpgb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:05:29 crc kubenswrapper[4823]: I1216 09:05:29.249270 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b3ac823-aaa1-4209-8e6b-88350ffd7519-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1b3ac823-aaa1-4209-8e6b-88350ffd7519" (UID: "1b3ac823-aaa1-4209-8e6b-88350ffd7519"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:05:29 crc kubenswrapper[4823]: I1216 09:05:29.273189 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b3ac823-aaa1-4209-8e6b-88350ffd7519-config-data" (OuterVolumeSpecName: "config-data") pod "1b3ac823-aaa1-4209-8e6b-88350ffd7519" (UID: "1b3ac823-aaa1-4209-8e6b-88350ffd7519"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:05:29 crc kubenswrapper[4823]: I1216 09:05:29.323953 4823 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1b3ac823-aaa1-4209-8e6b-88350ffd7519-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 16 09:05:29 crc kubenswrapper[4823]: I1216 09:05:29.323992 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b3ac823-aaa1-4209-8e6b-88350ffd7519-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 09:05:29 crc kubenswrapper[4823]: I1216 09:05:29.324003 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7gpgb\" (UniqueName: \"kubernetes.io/projected/1b3ac823-aaa1-4209-8e6b-88350ffd7519-kube-api-access-7gpgb\") on node \"crc\" DevicePath \"\"" Dec 16 09:05:29 crc kubenswrapper[4823]: I1216 09:05:29.324019 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b3ac823-aaa1-4209-8e6b-88350ffd7519-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 09:05:29 crc kubenswrapper[4823]: I1216 09:05:29.618557 4823 generic.go:334] "Generic (PLEG): container finished" podID="1b3ac823-aaa1-4209-8e6b-88350ffd7519" containerID="23dfe5272aafae3411bcc9e20ef3bec3896246f4a96109e136626c13b00596f0" exitCode=0 Dec 16 09:05:29 crc kubenswrapper[4823]: I1216 09:05:29.618610 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-755687f5b7-sp4tf" event={"ID":"1b3ac823-aaa1-4209-8e6b-88350ffd7519","Type":"ContainerDied","Data":"23dfe5272aafae3411bcc9e20ef3bec3896246f4a96109e136626c13b00596f0"} Dec 16 09:05:29 crc kubenswrapper[4823]: I1216 09:05:29.618634 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-755687f5b7-sp4tf" Dec 16 09:05:29 crc kubenswrapper[4823]: I1216 09:05:29.618660 4823 scope.go:117] "RemoveContainer" containerID="23dfe5272aafae3411bcc9e20ef3bec3896246f4a96109e136626c13b00596f0" Dec 16 09:05:29 crc kubenswrapper[4823]: I1216 09:05:29.618646 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-755687f5b7-sp4tf" event={"ID":"1b3ac823-aaa1-4209-8e6b-88350ffd7519","Type":"ContainerDied","Data":"4e8cc30328dc729ab0f1d44ca803a0a92fb69cd282e934d0214a715037a3aa1d"} Dec 16 09:05:29 crc kubenswrapper[4823]: I1216 09:05:29.648367 4823 scope.go:117] "RemoveContainer" containerID="23dfe5272aafae3411bcc9e20ef3bec3896246f4a96109e136626c13b00596f0" Dec 16 09:05:29 crc kubenswrapper[4823]: E1216 09:05:29.648833 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23dfe5272aafae3411bcc9e20ef3bec3896246f4a96109e136626c13b00596f0\": container with ID starting with 23dfe5272aafae3411bcc9e20ef3bec3896246f4a96109e136626c13b00596f0 not found: ID does not exist" containerID="23dfe5272aafae3411bcc9e20ef3bec3896246f4a96109e136626c13b00596f0" Dec 16 09:05:29 crc kubenswrapper[4823]: I1216 09:05:29.648864 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23dfe5272aafae3411bcc9e20ef3bec3896246f4a96109e136626c13b00596f0"} err="failed to get container status \"23dfe5272aafae3411bcc9e20ef3bec3896246f4a96109e136626c13b00596f0\": rpc error: code = NotFound desc = could not find container \"23dfe5272aafae3411bcc9e20ef3bec3896246f4a96109e136626c13b00596f0\": container with ID starting with 23dfe5272aafae3411bcc9e20ef3bec3896246f4a96109e136626c13b00596f0 not found: ID does not exist" Dec 16 09:05:29 crc kubenswrapper[4823]: I1216 09:05:29.717529 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-755687f5b7-sp4tf"] Dec 16 09:05:29 crc kubenswrapper[4823]: I1216 
09:05:29.729396 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-755687f5b7-sp4tf"] Dec 16 09:05:29 crc kubenswrapper[4823]: I1216 09:05:29.786075 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b3ac823-aaa1-4209-8e6b-88350ffd7519" path="/var/lib/kubelet/pods/1b3ac823-aaa1-4209-8e6b-88350ffd7519/volumes" Dec 16 09:05:30 crc kubenswrapper[4823]: I1216 09:05:30.637277 4823 generic.go:334] "Generic (PLEG): container finished" podID="28373e9d-544d-40d4-8517-51e6718b9493" containerID="ea9aa1f918811c0b1fc9ff20658bcd5bde81f67878b0d287ad886928e5de1fba" exitCode=0 Dec 16 09:05:30 crc kubenswrapper[4823]: I1216 09:05:30.637351 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5fc95447c4-jfpp8" event={"ID":"28373e9d-544d-40d4-8517-51e6718b9493","Type":"ContainerDied","Data":"ea9aa1f918811c0b1fc9ff20658bcd5bde81f67878b0d287ad886928e5de1fba"} Dec 16 09:05:31 crc kubenswrapper[4823]: I1216 09:05:31.791781 4823 scope.go:117] "RemoveContainer" containerID="14e51af7fb5c2d7b7fdc9e1989841225a65614d883db6f8d5aea8aeb819bd04d" Dec 16 09:05:31 crc kubenswrapper[4823]: E1216 09:05:31.792156 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 09:05:33 crc kubenswrapper[4823]: I1216 09:05:33.021809 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-7b85d8c859-rzzxk" Dec 16 09:05:34 crc kubenswrapper[4823]: I1216 09:05:34.012672 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-api-ddfd865c7-nhsh6" Dec 16 09:05:34 crc 
kubenswrapper[4823]: I1216 09:05:34.083634 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-7dff5b4649-wzl8g"] Dec 16 09:05:34 crc kubenswrapper[4823]: I1216 09:05:34.184097 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-8589448fc-qj569" Dec 16 09:05:34 crc kubenswrapper[4823]: I1216 09:05:34.256160 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-69cb598cbc-ht4ws"] Dec 16 09:05:34 crc kubenswrapper[4823]: I1216 09:05:34.545966 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-7dff5b4649-wzl8g" Dec 16 09:05:34 crc kubenswrapper[4823]: I1216 09:05:34.632627 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dfdf0d43-8dd3-4c5d-a7a3-244ba1262194-config-data-custom\") pod \"dfdf0d43-8dd3-4c5d-a7a3-244ba1262194\" (UID: \"dfdf0d43-8dd3-4c5d-a7a3-244ba1262194\") " Dec 16 09:05:34 crc kubenswrapper[4823]: I1216 09:05:34.632795 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfdf0d43-8dd3-4c5d-a7a3-244ba1262194-config-data\") pod \"dfdf0d43-8dd3-4c5d-a7a3-244ba1262194\" (UID: \"dfdf0d43-8dd3-4c5d-a7a3-244ba1262194\") " Dec 16 09:05:34 crc kubenswrapper[4823]: I1216 09:05:34.632891 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfdf0d43-8dd3-4c5d-a7a3-244ba1262194-combined-ca-bundle\") pod \"dfdf0d43-8dd3-4c5d-a7a3-244ba1262194\" (UID: \"dfdf0d43-8dd3-4c5d-a7a3-244ba1262194\") " Dec 16 09:05:34 crc kubenswrapper[4823]: I1216 09:05:34.632946 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2jsq2\" (UniqueName: 
\"kubernetes.io/projected/dfdf0d43-8dd3-4c5d-a7a3-244ba1262194-kube-api-access-2jsq2\") pod \"dfdf0d43-8dd3-4c5d-a7a3-244ba1262194\" (UID: \"dfdf0d43-8dd3-4c5d-a7a3-244ba1262194\") " Dec 16 09:05:34 crc kubenswrapper[4823]: I1216 09:05:34.638977 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfdf0d43-8dd3-4c5d-a7a3-244ba1262194-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "dfdf0d43-8dd3-4c5d-a7a3-244ba1262194" (UID: "dfdf0d43-8dd3-4c5d-a7a3-244ba1262194"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:05:34 crc kubenswrapper[4823]: I1216 09:05:34.639307 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dfdf0d43-8dd3-4c5d-a7a3-244ba1262194-kube-api-access-2jsq2" (OuterVolumeSpecName: "kube-api-access-2jsq2") pod "dfdf0d43-8dd3-4c5d-a7a3-244ba1262194" (UID: "dfdf0d43-8dd3-4c5d-a7a3-244ba1262194"). InnerVolumeSpecName "kube-api-access-2jsq2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:05:34 crc kubenswrapper[4823]: I1216 09:05:34.706444 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfdf0d43-8dd3-4c5d-a7a3-244ba1262194-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dfdf0d43-8dd3-4c5d-a7a3-244ba1262194" (UID: "dfdf0d43-8dd3-4c5d-a7a3-244ba1262194"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:05:34 crc kubenswrapper[4823]: I1216 09:05:34.713516 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-7dff5b4649-wzl8g" event={"ID":"dfdf0d43-8dd3-4c5d-a7a3-244ba1262194","Type":"ContainerDied","Data":"70ceef6a42370249c2380c802a13a20f5b2daf769deb7a35af59a59f3bcb21ae"} Dec 16 09:05:34 crc kubenswrapper[4823]: I1216 09:05:34.713670 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-7dff5b4649-wzl8g" Dec 16 09:05:34 crc kubenswrapper[4823]: I1216 09:05:34.713815 4823 scope.go:117] "RemoveContainer" containerID="f2649af54fbf7bcceac97ad5fdbae7a2088fe288f127187568b90a587d9dcc08" Dec 16 09:05:34 crc kubenswrapper[4823]: I1216 09:05:34.730905 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-69cb598cbc-ht4ws" event={"ID":"8e5d9a4f-577d-4ecb-9950-bbf018edbd04","Type":"ContainerDied","Data":"b6eaad087b32522a1ed9ea70ca5ad094e814259cda7fc6081e348c4386454200"} Dec 16 09:05:34 crc kubenswrapper[4823]: I1216 09:05:34.730948 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b6eaad087b32522a1ed9ea70ca5ad094e814259cda7fc6081e348c4386454200" Dec 16 09:05:34 crc kubenswrapper[4823]: I1216 09:05:34.734993 4823 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dfdf0d43-8dd3-4c5d-a7a3-244ba1262194-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 16 09:05:34 crc kubenswrapper[4823]: I1216 09:05:34.735050 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfdf0d43-8dd3-4c5d-a7a3-244ba1262194-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 09:05:34 crc kubenswrapper[4823]: I1216 09:05:34.735062 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2jsq2\" (UniqueName: \"kubernetes.io/projected/dfdf0d43-8dd3-4c5d-a7a3-244ba1262194-kube-api-access-2jsq2\") on node \"crc\" DevicePath \"\"" Dec 16 09:05:34 crc kubenswrapper[4823]: I1216 09:05:34.738657 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfdf0d43-8dd3-4c5d-a7a3-244ba1262194-config-data" (OuterVolumeSpecName: "config-data") pod "dfdf0d43-8dd3-4c5d-a7a3-244ba1262194" (UID: "dfdf0d43-8dd3-4c5d-a7a3-244ba1262194"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:05:34 crc kubenswrapper[4823]: I1216 09:05:34.761607 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-69cb598cbc-ht4ws" Dec 16 09:05:34 crc kubenswrapper[4823]: I1216 09:05:34.836994 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfdf0d43-8dd3-4c5d-a7a3-244ba1262194-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 09:05:34 crc kubenswrapper[4823]: I1216 09:05:34.938051 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5kf8g\" (UniqueName: \"kubernetes.io/projected/8e5d9a4f-577d-4ecb-9950-bbf018edbd04-kube-api-access-5kf8g\") pod \"8e5d9a4f-577d-4ecb-9950-bbf018edbd04\" (UID: \"8e5d9a4f-577d-4ecb-9950-bbf018edbd04\") " Dec 16 09:05:34 crc kubenswrapper[4823]: I1216 09:05:34.938459 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e5d9a4f-577d-4ecb-9950-bbf018edbd04-combined-ca-bundle\") pod \"8e5d9a4f-577d-4ecb-9950-bbf018edbd04\" (UID: \"8e5d9a4f-577d-4ecb-9950-bbf018edbd04\") " Dec 16 09:05:34 crc kubenswrapper[4823]: I1216 09:05:34.938615 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e5d9a4f-577d-4ecb-9950-bbf018edbd04-config-data\") pod \"8e5d9a4f-577d-4ecb-9950-bbf018edbd04\" (UID: \"8e5d9a4f-577d-4ecb-9950-bbf018edbd04\") " Dec 16 09:05:34 crc kubenswrapper[4823]: I1216 09:05:34.938645 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8e5d9a4f-577d-4ecb-9950-bbf018edbd04-config-data-custom\") pod \"8e5d9a4f-577d-4ecb-9950-bbf018edbd04\" (UID: \"8e5d9a4f-577d-4ecb-9950-bbf018edbd04\") " Dec 16 09:05:34 crc kubenswrapper[4823]: I1216 09:05:34.943137 
4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e5d9a4f-577d-4ecb-9950-bbf018edbd04-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "8e5d9a4f-577d-4ecb-9950-bbf018edbd04" (UID: "8e5d9a4f-577d-4ecb-9950-bbf018edbd04"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:05:34 crc kubenswrapper[4823]: I1216 09:05:34.943900 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e5d9a4f-577d-4ecb-9950-bbf018edbd04-kube-api-access-5kf8g" (OuterVolumeSpecName: "kube-api-access-5kf8g") pod "8e5d9a4f-577d-4ecb-9950-bbf018edbd04" (UID: "8e5d9a4f-577d-4ecb-9950-bbf018edbd04"). InnerVolumeSpecName "kube-api-access-5kf8g". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:05:34 crc kubenswrapper[4823]: I1216 09:05:34.975349 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e5d9a4f-577d-4ecb-9950-bbf018edbd04-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8e5d9a4f-577d-4ecb-9950-bbf018edbd04" (UID: "8e5d9a4f-577d-4ecb-9950-bbf018edbd04"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:05:35 crc kubenswrapper[4823]: I1216 09:05:35.016296 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e5d9a4f-577d-4ecb-9950-bbf018edbd04-config-data" (OuterVolumeSpecName: "config-data") pod "8e5d9a4f-577d-4ecb-9950-bbf018edbd04" (UID: "8e5d9a4f-577d-4ecb-9950-bbf018edbd04"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:05:35 crc kubenswrapper[4823]: I1216 09:05:35.045824 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e5d9a4f-577d-4ecb-9950-bbf018edbd04-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 09:05:35 crc kubenswrapper[4823]: I1216 09:05:35.045865 4823 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8e5d9a4f-577d-4ecb-9950-bbf018edbd04-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 16 09:05:35 crc kubenswrapper[4823]: I1216 09:05:35.045879 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5kf8g\" (UniqueName: \"kubernetes.io/projected/8e5d9a4f-577d-4ecb-9950-bbf018edbd04-kube-api-access-5kf8g\") on node \"crc\" DevicePath \"\"" Dec 16 09:05:35 crc kubenswrapper[4823]: I1216 09:05:35.045893 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e5d9a4f-577d-4ecb-9950-bbf018edbd04-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 09:05:35 crc kubenswrapper[4823]: I1216 09:05:35.050063 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-7dff5b4649-wzl8g"] Dec 16 09:05:35 crc kubenswrapper[4823]: I1216 09:05:35.059535 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-7dff5b4649-wzl8g"] Dec 16 09:05:35 crc kubenswrapper[4823]: I1216 09:05:35.742174 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-69cb598cbc-ht4ws" Dec 16 09:05:35 crc kubenswrapper[4823]: I1216 09:05:35.806321 4823 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-5fc95447c4-jfpp8" podUID="28373e9d-544d-40d4-8517-51e6718b9493" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.108:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.108:8443: connect: connection refused" Dec 16 09:05:35 crc kubenswrapper[4823]: I1216 09:05:35.820725 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dfdf0d43-8dd3-4c5d-a7a3-244ba1262194" path="/var/lib/kubelet/pods/dfdf0d43-8dd3-4c5d-a7a3-244ba1262194/volumes" Dec 16 09:05:35 crc kubenswrapper[4823]: I1216 09:05:35.827347 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-69cb598cbc-ht4ws"] Dec 16 09:05:35 crc kubenswrapper[4823]: I1216 09:05:35.836579 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-69cb598cbc-ht4ws"] Dec 16 09:05:37 crc kubenswrapper[4823]: I1216 09:05:37.047830 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-rgmsv"] Dec 16 09:05:37 crc kubenswrapper[4823]: E1216 09:05:37.048681 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db45071e-c05e-4a4e-8a88-1d6a4a8bd198" containerName="heat-cfnapi" Dec 16 09:05:37 crc kubenswrapper[4823]: I1216 09:05:37.048700 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="db45071e-c05e-4a4e-8a88-1d6a4a8bd198" containerName="heat-cfnapi" Dec 16 09:05:37 crc kubenswrapper[4823]: E1216 09:05:37.048719 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e5d9a4f-577d-4ecb-9950-bbf018edbd04" containerName="heat-cfnapi" Dec 16 09:05:37 crc kubenswrapper[4823]: I1216 09:05:37.048727 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e5d9a4f-577d-4ecb-9950-bbf018edbd04" containerName="heat-cfnapi" Dec 16 
09:05:37 crc kubenswrapper[4823]: E1216 09:05:37.048738 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b3ac823-aaa1-4209-8e6b-88350ffd7519" containerName="heat-api" Dec 16 09:05:37 crc kubenswrapper[4823]: I1216 09:05:37.048747 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b3ac823-aaa1-4209-8e6b-88350ffd7519" containerName="heat-api" Dec 16 09:05:37 crc kubenswrapper[4823]: E1216 09:05:37.048775 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfdf0d43-8dd3-4c5d-a7a3-244ba1262194" containerName="heat-api" Dec 16 09:05:37 crc kubenswrapper[4823]: I1216 09:05:37.048784 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfdf0d43-8dd3-4c5d-a7a3-244ba1262194" containerName="heat-api" Dec 16 09:05:37 crc kubenswrapper[4823]: E1216 09:05:37.048802 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfdf0d43-8dd3-4c5d-a7a3-244ba1262194" containerName="heat-api" Dec 16 09:05:37 crc kubenswrapper[4823]: I1216 09:05:37.048810 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfdf0d43-8dd3-4c5d-a7a3-244ba1262194" containerName="heat-api" Dec 16 09:05:37 crc kubenswrapper[4823]: I1216 09:05:37.049049 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b3ac823-aaa1-4209-8e6b-88350ffd7519" containerName="heat-api" Dec 16 09:05:37 crc kubenswrapper[4823]: I1216 09:05:37.049080 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e5d9a4f-577d-4ecb-9950-bbf018edbd04" containerName="heat-cfnapi" Dec 16 09:05:37 crc kubenswrapper[4823]: I1216 09:05:37.049094 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfdf0d43-8dd3-4c5d-a7a3-244ba1262194" containerName="heat-api" Dec 16 09:05:37 crc kubenswrapper[4823]: I1216 09:05:37.049106 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfdf0d43-8dd3-4c5d-a7a3-244ba1262194" containerName="heat-api" Dec 16 09:05:37 crc kubenswrapper[4823]: I1216 09:05:37.049122 4823 
memory_manager.go:354] "RemoveStaleState removing state" podUID="8e5d9a4f-577d-4ecb-9950-bbf018edbd04" containerName="heat-cfnapi" Dec 16 09:05:37 crc kubenswrapper[4823]: I1216 09:05:37.049138 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="db45071e-c05e-4a4e-8a88-1d6a4a8bd198" containerName="heat-cfnapi" Dec 16 09:05:37 crc kubenswrapper[4823]: E1216 09:05:37.049400 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e5d9a4f-577d-4ecb-9950-bbf018edbd04" containerName="heat-cfnapi" Dec 16 09:05:37 crc kubenswrapper[4823]: I1216 09:05:37.049421 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e5d9a4f-577d-4ecb-9950-bbf018edbd04" containerName="heat-cfnapi" Dec 16 09:05:37 crc kubenswrapper[4823]: I1216 09:05:37.051077 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rgmsv" Dec 16 09:05:37 crc kubenswrapper[4823]: I1216 09:05:37.059235 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rgmsv"] Dec 16 09:05:37 crc kubenswrapper[4823]: I1216 09:05:37.192300 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7948501f-8639-4c51-993a-c829939e6148-catalog-content\") pod \"redhat-marketplace-rgmsv\" (UID: \"7948501f-8639-4c51-993a-c829939e6148\") " pod="openshift-marketplace/redhat-marketplace-rgmsv" Dec 16 09:05:37 crc kubenswrapper[4823]: I1216 09:05:37.192390 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7948501f-8639-4c51-993a-c829939e6148-utilities\") pod \"redhat-marketplace-rgmsv\" (UID: \"7948501f-8639-4c51-993a-c829939e6148\") " pod="openshift-marketplace/redhat-marketplace-rgmsv" Dec 16 09:05:37 crc kubenswrapper[4823]: I1216 09:05:37.192443 4823 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzkbj\" (UniqueName: \"kubernetes.io/projected/7948501f-8639-4c51-993a-c829939e6148-kube-api-access-qzkbj\") pod \"redhat-marketplace-rgmsv\" (UID: \"7948501f-8639-4c51-993a-c829939e6148\") " pod="openshift-marketplace/redhat-marketplace-rgmsv" Dec 16 09:05:37 crc kubenswrapper[4823]: I1216 09:05:37.294095 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7948501f-8639-4c51-993a-c829939e6148-utilities\") pod \"redhat-marketplace-rgmsv\" (UID: \"7948501f-8639-4c51-993a-c829939e6148\") " pod="openshift-marketplace/redhat-marketplace-rgmsv" Dec 16 09:05:37 crc kubenswrapper[4823]: I1216 09:05:37.294182 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qzkbj\" (UniqueName: \"kubernetes.io/projected/7948501f-8639-4c51-993a-c829939e6148-kube-api-access-qzkbj\") pod \"redhat-marketplace-rgmsv\" (UID: \"7948501f-8639-4c51-993a-c829939e6148\") " pod="openshift-marketplace/redhat-marketplace-rgmsv" Dec 16 09:05:37 crc kubenswrapper[4823]: I1216 09:05:37.294368 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7948501f-8639-4c51-993a-c829939e6148-catalog-content\") pod \"redhat-marketplace-rgmsv\" (UID: \"7948501f-8639-4c51-993a-c829939e6148\") " pod="openshift-marketplace/redhat-marketplace-rgmsv" Dec 16 09:05:37 crc kubenswrapper[4823]: I1216 09:05:37.294643 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7948501f-8639-4c51-993a-c829939e6148-utilities\") pod \"redhat-marketplace-rgmsv\" (UID: \"7948501f-8639-4c51-993a-c829939e6148\") " pod="openshift-marketplace/redhat-marketplace-rgmsv" Dec 16 09:05:37 crc kubenswrapper[4823]: I1216 09:05:37.294712 4823 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7948501f-8639-4c51-993a-c829939e6148-catalog-content\") pod \"redhat-marketplace-rgmsv\" (UID: \"7948501f-8639-4c51-993a-c829939e6148\") " pod="openshift-marketplace/redhat-marketplace-rgmsv" Dec 16 09:05:37 crc kubenswrapper[4823]: I1216 09:05:37.315828 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzkbj\" (UniqueName: \"kubernetes.io/projected/7948501f-8639-4c51-993a-c829939e6148-kube-api-access-qzkbj\") pod \"redhat-marketplace-rgmsv\" (UID: \"7948501f-8639-4c51-993a-c829939e6148\") " pod="openshift-marketplace/redhat-marketplace-rgmsv" Dec 16 09:05:37 crc kubenswrapper[4823]: I1216 09:05:37.378293 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rgmsv" Dec 16 09:05:37 crc kubenswrapper[4823]: I1216 09:05:37.788405 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e5d9a4f-577d-4ecb-9950-bbf018edbd04" path="/var/lib/kubelet/pods/8e5d9a4f-577d-4ecb-9950-bbf018edbd04/volumes" Dec 16 09:05:37 crc kubenswrapper[4823]: I1216 09:05:37.880310 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rgmsv"] Dec 16 09:05:37 crc kubenswrapper[4823]: W1216 09:05:37.888929 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7948501f_8639_4c51_993a_c829939e6148.slice/crio-9a2fe3704b19686badd4662b6be48d8004412975ef0c92e47a63a1f51ec162f9 WatchSource:0}: Error finding container 9a2fe3704b19686badd4662b6be48d8004412975ef0c92e47a63a1f51ec162f9: Status 404 returned error can't find the container with id 9a2fe3704b19686badd4662b6be48d8004412975ef0c92e47a63a1f51ec162f9 Dec 16 09:05:38 crc kubenswrapper[4823]: I1216 09:05:38.784703 4823 generic.go:334] "Generic (PLEG): container finished" 
podID="7948501f-8639-4c51-993a-c829939e6148" containerID="7ea88ea620b05667758b46f48a77742bd2c287c8cd4f6c2a60c12edadac33c0b" exitCode=0 Dec 16 09:05:38 crc kubenswrapper[4823]: I1216 09:05:38.784807 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rgmsv" event={"ID":"7948501f-8639-4c51-993a-c829939e6148","Type":"ContainerDied","Data":"7ea88ea620b05667758b46f48a77742bd2c287c8cd4f6c2a60c12edadac33c0b"} Dec 16 09:05:38 crc kubenswrapper[4823]: I1216 09:05:38.785148 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rgmsv" event={"ID":"7948501f-8639-4c51-993a-c829939e6148","Type":"ContainerStarted","Data":"9a2fe3704b19686badd4662b6be48d8004412975ef0c92e47a63a1f51ec162f9"} Dec 16 09:05:40 crc kubenswrapper[4823]: I1216 09:05:40.559842 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-64f85d9856-wwkd5" Dec 16 09:05:40 crc kubenswrapper[4823]: I1216 09:05:40.615874 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-7b85d8c859-rzzxk"] Dec 16 09:05:40 crc kubenswrapper[4823]: I1216 09:05:40.616544 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-engine-7b85d8c859-rzzxk" podUID="7c5e1025-1368-4594-88d1-16ef8ccadccb" containerName="heat-engine" containerID="cri-o://615a16406b9429b165234ee72e932cf0475f7803064246d04f5e66b621b3395b" gracePeriod=60 Dec 16 09:05:40 crc kubenswrapper[4823]: I1216 09:05:40.811157 4823 generic.go:334] "Generic (PLEG): container finished" podID="7948501f-8639-4c51-993a-c829939e6148" containerID="54e8d62c31b365abfccf276af2fa066c647ab0db67840682b14239b389ce81fd" exitCode=0 Dec 16 09:05:40 crc kubenswrapper[4823]: I1216 09:05:40.811198 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rgmsv" 
event={"ID":"7948501f-8639-4c51-993a-c829939e6148","Type":"ContainerDied","Data":"54e8d62c31b365abfccf276af2fa066c647ab0db67840682b14239b389ce81fd"} Dec 16 09:05:42 crc kubenswrapper[4823]: E1216 09:05:42.987160 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="615a16406b9429b165234ee72e932cf0475f7803064246d04f5e66b621b3395b" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Dec 16 09:05:42 crc kubenswrapper[4823]: E1216 09:05:42.989568 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="615a16406b9429b165234ee72e932cf0475f7803064246d04f5e66b621b3395b" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Dec 16 09:05:42 crc kubenswrapper[4823]: E1216 09:05:42.992920 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="615a16406b9429b165234ee72e932cf0475f7803064246d04f5e66b621b3395b" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Dec 16 09:05:42 crc kubenswrapper[4823]: E1216 09:05:42.992971 4823 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/heat-engine-7b85d8c859-rzzxk" podUID="7c5e1025-1368-4594-88d1-16ef8ccadccb" containerName="heat-engine" Dec 16 09:05:44 crc kubenswrapper[4823]: I1216 09:05:44.877866 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rgmsv" 
event={"ID":"7948501f-8639-4c51-993a-c829939e6148","Type":"ContainerStarted","Data":"daa319172469456d90f79e0576618cf7956978b04f0ac6451e2b9a7c304073fa"} Dec 16 09:05:44 crc kubenswrapper[4823]: I1216 09:05:44.907616 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-rgmsv" podStartSLOduration=2.070643874 podStartE2EDuration="7.907589825s" podCreationTimestamp="2025-12-16 09:05:37 +0000 UTC" firstStartedPulling="2025-12-16 09:05:38.786360907 +0000 UTC m=+7817.274927030" lastFinishedPulling="2025-12-16 09:05:44.623306858 +0000 UTC m=+7823.111872981" observedRunningTime="2025-12-16 09:05:44.900206064 +0000 UTC m=+7823.388772187" watchObservedRunningTime="2025-12-16 09:05:44.907589825 +0000 UTC m=+7823.396155948" Dec 16 09:05:45 crc kubenswrapper[4823]: I1216 09:05:45.771543 4823 scope.go:117] "RemoveContainer" containerID="14e51af7fb5c2d7b7fdc9e1989841225a65614d883db6f8d5aea8aeb819bd04d" Dec 16 09:05:45 crc kubenswrapper[4823]: E1216 09:05:45.772125 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 09:05:45 crc kubenswrapper[4823]: I1216 09:05:45.792492 4823 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-5fc95447c4-jfpp8" podUID="28373e9d-544d-40d4-8517-51e6718b9493" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.108:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.108:8443: connect: connection refused" Dec 16 09:05:46 crc kubenswrapper[4823]: I1216 09:05:46.899671 4823 generic.go:334] "Generic (PLEG): container finished" 
podID="7c5e1025-1368-4594-88d1-16ef8ccadccb" containerID="615a16406b9429b165234ee72e932cf0475f7803064246d04f5e66b621b3395b" exitCode=0 Dec 16 09:05:46 crc kubenswrapper[4823]: I1216 09:05:46.899762 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-7b85d8c859-rzzxk" event={"ID":"7c5e1025-1368-4594-88d1-16ef8ccadccb","Type":"ContainerDied","Data":"615a16406b9429b165234ee72e932cf0475f7803064246d04f5e66b621b3395b"} Dec 16 09:05:46 crc kubenswrapper[4823]: I1216 09:05:46.899915 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-7b85d8c859-rzzxk" event={"ID":"7c5e1025-1368-4594-88d1-16ef8ccadccb","Type":"ContainerDied","Data":"efe37f699fa57f7695daa66dccba36b2bae74a08e7bc6dc19a25665a085055d3"} Dec 16 09:05:46 crc kubenswrapper[4823]: I1216 09:05:46.899929 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="efe37f699fa57f7695daa66dccba36b2bae74a08e7bc6dc19a25665a085055d3" Dec 16 09:05:46 crc kubenswrapper[4823]: I1216 09:05:46.906628 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-7b85d8c859-rzzxk" Dec 16 09:05:46 crc kubenswrapper[4823]: I1216 09:05:46.998929 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c5e1025-1368-4594-88d1-16ef8ccadccb-combined-ca-bundle\") pod \"7c5e1025-1368-4594-88d1-16ef8ccadccb\" (UID: \"7c5e1025-1368-4594-88d1-16ef8ccadccb\") " Dec 16 09:05:46 crc kubenswrapper[4823]: I1216 09:05:46.999076 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c5e1025-1368-4594-88d1-16ef8ccadccb-config-data\") pod \"7c5e1025-1368-4594-88d1-16ef8ccadccb\" (UID: \"7c5e1025-1368-4594-88d1-16ef8ccadccb\") " Dec 16 09:05:46 crc kubenswrapper[4823]: I1216 09:05:46.999177 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7c5e1025-1368-4594-88d1-16ef8ccadccb-config-data-custom\") pod \"7c5e1025-1368-4594-88d1-16ef8ccadccb\" (UID: \"7c5e1025-1368-4594-88d1-16ef8ccadccb\") " Dec 16 09:05:46 crc kubenswrapper[4823]: I1216 09:05:46.999234 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qr87p\" (UniqueName: \"kubernetes.io/projected/7c5e1025-1368-4594-88d1-16ef8ccadccb-kube-api-access-qr87p\") pod \"7c5e1025-1368-4594-88d1-16ef8ccadccb\" (UID: \"7c5e1025-1368-4594-88d1-16ef8ccadccb\") " Dec 16 09:05:47 crc kubenswrapper[4823]: I1216 09:05:47.006562 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c5e1025-1368-4594-88d1-16ef8ccadccb-kube-api-access-qr87p" (OuterVolumeSpecName: "kube-api-access-qr87p") pod "7c5e1025-1368-4594-88d1-16ef8ccadccb" (UID: "7c5e1025-1368-4594-88d1-16ef8ccadccb"). InnerVolumeSpecName "kube-api-access-qr87p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:05:47 crc kubenswrapper[4823]: I1216 09:05:47.007576 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c5e1025-1368-4594-88d1-16ef8ccadccb-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "7c5e1025-1368-4594-88d1-16ef8ccadccb" (UID: "7c5e1025-1368-4594-88d1-16ef8ccadccb"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:05:47 crc kubenswrapper[4823]: I1216 09:05:47.035264 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c5e1025-1368-4594-88d1-16ef8ccadccb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7c5e1025-1368-4594-88d1-16ef8ccadccb" (UID: "7c5e1025-1368-4594-88d1-16ef8ccadccb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:05:47 crc kubenswrapper[4823]: I1216 09:05:47.053050 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c5e1025-1368-4594-88d1-16ef8ccadccb-config-data" (OuterVolumeSpecName: "config-data") pod "7c5e1025-1368-4594-88d1-16ef8ccadccb" (UID: "7c5e1025-1368-4594-88d1-16ef8ccadccb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:05:47 crc kubenswrapper[4823]: I1216 09:05:47.101911 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c5e1025-1368-4594-88d1-16ef8ccadccb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 09:05:47 crc kubenswrapper[4823]: I1216 09:05:47.102187 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c5e1025-1368-4594-88d1-16ef8ccadccb-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 09:05:47 crc kubenswrapper[4823]: I1216 09:05:47.102344 4823 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7c5e1025-1368-4594-88d1-16ef8ccadccb-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 16 09:05:47 crc kubenswrapper[4823]: I1216 09:05:47.102410 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qr87p\" (UniqueName: \"kubernetes.io/projected/7c5e1025-1368-4594-88d1-16ef8ccadccb-kube-api-access-qr87p\") on node \"crc\" DevicePath \"\"" Dec 16 09:05:47 crc kubenswrapper[4823]: I1216 09:05:47.378841 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-rgmsv" Dec 16 09:05:47 crc kubenswrapper[4823]: I1216 09:05:47.379870 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-rgmsv" Dec 16 09:05:47 crc kubenswrapper[4823]: I1216 09:05:47.436627 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-rgmsv" Dec 16 09:05:47 crc kubenswrapper[4823]: I1216 09:05:47.908744 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-7b85d8c859-rzzxk" Dec 16 09:05:47 crc kubenswrapper[4823]: I1216 09:05:47.937014 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-7b85d8c859-rzzxk"] Dec 16 09:05:47 crc kubenswrapper[4823]: I1216 09:05:47.944862 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-engine-7b85d8c859-rzzxk"] Dec 16 09:05:49 crc kubenswrapper[4823]: I1216 09:05:49.785686 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c5e1025-1368-4594-88d1-16ef8ccadccb" path="/var/lib/kubelet/pods/7c5e1025-1368-4594-88d1-16ef8ccadccb/volumes" Dec 16 09:05:50 crc kubenswrapper[4823]: I1216 09:05:50.693198 4823 scope.go:117] "RemoveContainer" containerID="4f41f29f42bb5b92497e6cdca74dd5b4d9e606a6e5a1ae0e5dc9cb55b58b1a17" Dec 16 09:05:50 crc kubenswrapper[4823]: I1216 09:05:50.737886 4823 scope.go:117] "RemoveContainer" containerID="384accd749147f19fd68cceb4a058774a1097c1202d7bd5ae02b4ddb42c1eecb" Dec 16 09:05:50 crc kubenswrapper[4823]: I1216 09:05:50.766420 4823 scope.go:117] "RemoveContainer" containerID="bf0a2dbbafb50fa3bc8becf59e754068301fe0b4d4c4db2712ebe2dd0cf00906" Dec 16 09:05:55 crc kubenswrapper[4823]: I1216 09:05:55.793519 4823 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-5fc95447c4-jfpp8" podUID="28373e9d-544d-40d4-8517-51e6718b9493" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.108:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.108:8443: connect: connection refused" Dec 16 09:05:55 crc kubenswrapper[4823]: I1216 09:05:55.794129 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5fc95447c4-jfpp8" Dec 16 09:05:57 crc kubenswrapper[4823]: I1216 09:05:57.001892 4823 generic.go:334] "Generic (PLEG): container finished" podID="28373e9d-544d-40d4-8517-51e6718b9493" containerID="304daf66f20219103734d58ce5cff3122507318b7f01170a8de3b259a5a31f50" 
exitCode=137 Dec 16 09:05:57 crc kubenswrapper[4823]: I1216 09:05:57.001999 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5fc95447c4-jfpp8" event={"ID":"28373e9d-544d-40d4-8517-51e6718b9493","Type":"ContainerDied","Data":"304daf66f20219103734d58ce5cff3122507318b7f01170a8de3b259a5a31f50"} Dec 16 09:05:57 crc kubenswrapper[4823]: I1216 09:05:57.278700 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5fc95447c4-jfpp8" Dec 16 09:05:57 crc kubenswrapper[4823]: I1216 09:05:57.387182 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/28373e9d-544d-40d4-8517-51e6718b9493-logs\") pod \"28373e9d-544d-40d4-8517-51e6718b9493\" (UID: \"28373e9d-544d-40d4-8517-51e6718b9493\") " Dec 16 09:05:57 crc kubenswrapper[4823]: I1216 09:05:57.387247 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/28373e9d-544d-40d4-8517-51e6718b9493-horizon-secret-key\") pod \"28373e9d-544d-40d4-8517-51e6718b9493\" (UID: \"28373e9d-544d-40d4-8517-51e6718b9493\") " Dec 16 09:05:57 crc kubenswrapper[4823]: I1216 09:05:57.387279 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/28373e9d-544d-40d4-8517-51e6718b9493-horizon-tls-certs\") pod \"28373e9d-544d-40d4-8517-51e6718b9493\" (UID: \"28373e9d-544d-40d4-8517-51e6718b9493\") " Dec 16 09:05:57 crc kubenswrapper[4823]: I1216 09:05:57.387364 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28373e9d-544d-40d4-8517-51e6718b9493-combined-ca-bundle\") pod \"28373e9d-544d-40d4-8517-51e6718b9493\" (UID: \"28373e9d-544d-40d4-8517-51e6718b9493\") " Dec 16 09:05:57 crc kubenswrapper[4823]: I1216 09:05:57.387405 4823 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bx2zf\" (UniqueName: \"kubernetes.io/projected/28373e9d-544d-40d4-8517-51e6718b9493-kube-api-access-bx2zf\") pod \"28373e9d-544d-40d4-8517-51e6718b9493\" (UID: \"28373e9d-544d-40d4-8517-51e6718b9493\") " Dec 16 09:05:57 crc kubenswrapper[4823]: I1216 09:05:57.387464 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/28373e9d-544d-40d4-8517-51e6718b9493-config-data\") pod \"28373e9d-544d-40d4-8517-51e6718b9493\" (UID: \"28373e9d-544d-40d4-8517-51e6718b9493\") " Dec 16 09:05:57 crc kubenswrapper[4823]: I1216 09:05:57.387585 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/28373e9d-544d-40d4-8517-51e6718b9493-scripts\") pod \"28373e9d-544d-40d4-8517-51e6718b9493\" (UID: \"28373e9d-544d-40d4-8517-51e6718b9493\") " Dec 16 09:05:57 crc kubenswrapper[4823]: I1216 09:05:57.389159 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28373e9d-544d-40d4-8517-51e6718b9493-logs" (OuterVolumeSpecName: "logs") pod "28373e9d-544d-40d4-8517-51e6718b9493" (UID: "28373e9d-544d-40d4-8517-51e6718b9493"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 09:05:57 crc kubenswrapper[4823]: I1216 09:05:57.393372 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28373e9d-544d-40d4-8517-51e6718b9493-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "28373e9d-544d-40d4-8517-51e6718b9493" (UID: "28373e9d-544d-40d4-8517-51e6718b9493"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:05:57 crc kubenswrapper[4823]: I1216 09:05:57.393534 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28373e9d-544d-40d4-8517-51e6718b9493-kube-api-access-bx2zf" (OuterVolumeSpecName: "kube-api-access-bx2zf") pod "28373e9d-544d-40d4-8517-51e6718b9493" (UID: "28373e9d-544d-40d4-8517-51e6718b9493"). InnerVolumeSpecName "kube-api-access-bx2zf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:05:57 crc kubenswrapper[4823]: I1216 09:05:57.415642 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28373e9d-544d-40d4-8517-51e6718b9493-config-data" (OuterVolumeSpecName: "config-data") pod "28373e9d-544d-40d4-8517-51e6718b9493" (UID: "28373e9d-544d-40d4-8517-51e6718b9493"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 09:05:57 crc kubenswrapper[4823]: I1216 09:05:57.431767 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28373e9d-544d-40d4-8517-51e6718b9493-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "28373e9d-544d-40d4-8517-51e6718b9493" (UID: "28373e9d-544d-40d4-8517-51e6718b9493"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:05:57 crc kubenswrapper[4823]: I1216 09:05:57.431803 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-rgmsv" Dec 16 09:05:57 crc kubenswrapper[4823]: I1216 09:05:57.444260 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28373e9d-544d-40d4-8517-51e6718b9493-scripts" (OuterVolumeSpecName: "scripts") pod "28373e9d-544d-40d4-8517-51e6718b9493" (UID: "28373e9d-544d-40d4-8517-51e6718b9493"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 09:05:57 crc kubenswrapper[4823]: I1216 09:05:57.461334 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28373e9d-544d-40d4-8517-51e6718b9493-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "28373e9d-544d-40d4-8517-51e6718b9493" (UID: "28373e9d-544d-40d4-8517-51e6718b9493"). InnerVolumeSpecName "horizon-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:05:57 crc kubenswrapper[4823]: I1216 09:05:57.490523 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rgmsv"] Dec 16 09:05:57 crc kubenswrapper[4823]: I1216 09:05:57.495422 4823 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/28373e9d-544d-40d4-8517-51e6718b9493-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 09:05:57 crc kubenswrapper[4823]: I1216 09:05:57.495471 4823 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/28373e9d-544d-40d4-8517-51e6718b9493-logs\") on node \"crc\" DevicePath \"\"" Dec 16 09:05:57 crc kubenswrapper[4823]: I1216 09:05:57.495492 4823 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/28373e9d-544d-40d4-8517-51e6718b9493-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 16 09:05:57 crc kubenswrapper[4823]: I1216 09:05:57.495521 4823 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/28373e9d-544d-40d4-8517-51e6718b9493-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 09:05:57 crc kubenswrapper[4823]: I1216 09:05:57.495534 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28373e9d-544d-40d4-8517-51e6718b9493-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 
09:05:57 crc kubenswrapper[4823]: I1216 09:05:57.495549 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bx2zf\" (UniqueName: \"kubernetes.io/projected/28373e9d-544d-40d4-8517-51e6718b9493-kube-api-access-bx2zf\") on node \"crc\" DevicePath \"\"" Dec 16 09:05:57 crc kubenswrapper[4823]: I1216 09:05:57.495565 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/28373e9d-544d-40d4-8517-51e6718b9493-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 09:05:58 crc kubenswrapper[4823]: I1216 09:05:58.029591 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5fc95447c4-jfpp8" event={"ID":"28373e9d-544d-40d4-8517-51e6718b9493","Type":"ContainerDied","Data":"25898d18ef62d6635eb51cf6f19ca53fd4dab16358f4d2518602301e735c70cd"} Dec 16 09:05:58 crc kubenswrapper[4823]: I1216 09:05:58.029627 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5fc95447c4-jfpp8" Dec 16 09:05:58 crc kubenswrapper[4823]: I1216 09:05:58.029660 4823 scope.go:117] "RemoveContainer" containerID="ea9aa1f918811c0b1fc9ff20658bcd5bde81f67878b0d287ad886928e5de1fba" Dec 16 09:05:58 crc kubenswrapper[4823]: I1216 09:05:58.029962 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-rgmsv" podUID="7948501f-8639-4c51-993a-c829939e6148" containerName="registry-server" containerID="cri-o://daa319172469456d90f79e0576618cf7956978b04f0ac6451e2b9a7c304073fa" gracePeriod=2 Dec 16 09:05:58 crc kubenswrapper[4823]: I1216 09:05:58.078126 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5fc95447c4-jfpp8"] Dec 16 09:05:58 crc kubenswrapper[4823]: I1216 09:05:58.087759 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-5fc95447c4-jfpp8"] Dec 16 09:05:58 crc kubenswrapper[4823]: I1216 09:05:58.216476 4823 scope.go:117] "RemoveContainer" 
containerID="304daf66f20219103734d58ce5cff3122507318b7f01170a8de3b259a5a31f50" Dec 16 09:05:58 crc kubenswrapper[4823]: I1216 09:05:58.572918 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rgmsv" Dec 16 09:05:58 crc kubenswrapper[4823]: I1216 09:05:58.718800 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7948501f-8639-4c51-993a-c829939e6148-utilities\") pod \"7948501f-8639-4c51-993a-c829939e6148\" (UID: \"7948501f-8639-4c51-993a-c829939e6148\") " Dec 16 09:05:58 crc kubenswrapper[4823]: I1216 09:05:58.718866 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7948501f-8639-4c51-993a-c829939e6148-catalog-content\") pod \"7948501f-8639-4c51-993a-c829939e6148\" (UID: \"7948501f-8639-4c51-993a-c829939e6148\") " Dec 16 09:05:58 crc kubenswrapper[4823]: I1216 09:05:58.718946 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qzkbj\" (UniqueName: \"kubernetes.io/projected/7948501f-8639-4c51-993a-c829939e6148-kube-api-access-qzkbj\") pod \"7948501f-8639-4c51-993a-c829939e6148\" (UID: \"7948501f-8639-4c51-993a-c829939e6148\") " Dec 16 09:05:58 crc kubenswrapper[4823]: I1216 09:05:58.719530 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7948501f-8639-4c51-993a-c829939e6148-utilities" (OuterVolumeSpecName: "utilities") pod "7948501f-8639-4c51-993a-c829939e6148" (UID: "7948501f-8639-4c51-993a-c829939e6148"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 09:05:58 crc kubenswrapper[4823]: I1216 09:05:58.724471 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7948501f-8639-4c51-993a-c829939e6148-kube-api-access-qzkbj" (OuterVolumeSpecName: "kube-api-access-qzkbj") pod "7948501f-8639-4c51-993a-c829939e6148" (UID: "7948501f-8639-4c51-993a-c829939e6148"). InnerVolumeSpecName "kube-api-access-qzkbj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:05:58 crc kubenswrapper[4823]: I1216 09:05:58.740887 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7948501f-8639-4c51-993a-c829939e6148-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7948501f-8639-4c51-993a-c829939e6148" (UID: "7948501f-8639-4c51-993a-c829939e6148"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 09:05:58 crc kubenswrapper[4823]: I1216 09:05:58.821452 4823 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7948501f-8639-4c51-993a-c829939e6148-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 09:05:58 crc kubenswrapper[4823]: I1216 09:05:58.821491 4823 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7948501f-8639-4c51-993a-c829939e6148-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 09:05:58 crc kubenswrapper[4823]: I1216 09:05:58.821537 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qzkbj\" (UniqueName: \"kubernetes.io/projected/7948501f-8639-4c51-993a-c829939e6148-kube-api-access-qzkbj\") on node \"crc\" DevicePath \"\"" Dec 16 09:05:59 crc kubenswrapper[4823]: I1216 09:05:59.042097 4823 generic.go:334] "Generic (PLEG): container finished" podID="7948501f-8639-4c51-993a-c829939e6148" 
containerID="daa319172469456d90f79e0576618cf7956978b04f0ac6451e2b9a7c304073fa" exitCode=0 Dec 16 09:05:59 crc kubenswrapper[4823]: I1216 09:05:59.042138 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rgmsv" event={"ID":"7948501f-8639-4c51-993a-c829939e6148","Type":"ContainerDied","Data":"daa319172469456d90f79e0576618cf7956978b04f0ac6451e2b9a7c304073fa"} Dec 16 09:05:59 crc kubenswrapper[4823]: I1216 09:05:59.042162 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rgmsv" event={"ID":"7948501f-8639-4c51-993a-c829939e6148","Type":"ContainerDied","Data":"9a2fe3704b19686badd4662b6be48d8004412975ef0c92e47a63a1f51ec162f9"} Dec 16 09:05:59 crc kubenswrapper[4823]: I1216 09:05:59.042181 4823 scope.go:117] "RemoveContainer" containerID="daa319172469456d90f79e0576618cf7956978b04f0ac6451e2b9a7c304073fa" Dec 16 09:05:59 crc kubenswrapper[4823]: I1216 09:05:59.042251 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rgmsv" Dec 16 09:05:59 crc kubenswrapper[4823]: I1216 09:05:59.059910 4823 scope.go:117] "RemoveContainer" containerID="54e8d62c31b365abfccf276af2fa066c647ab0db67840682b14239b389ce81fd" Dec 16 09:05:59 crc kubenswrapper[4823]: I1216 09:05:59.079486 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rgmsv"] Dec 16 09:05:59 crc kubenswrapper[4823]: I1216 09:05:59.087221 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-rgmsv"] Dec 16 09:05:59 crc kubenswrapper[4823]: I1216 09:05:59.091446 4823 scope.go:117] "RemoveContainer" containerID="7ea88ea620b05667758b46f48a77742bd2c287c8cd4f6c2a60c12edadac33c0b" Dec 16 09:05:59 crc kubenswrapper[4823]: I1216 09:05:59.125046 4823 scope.go:117] "RemoveContainer" containerID="daa319172469456d90f79e0576618cf7956978b04f0ac6451e2b9a7c304073fa" Dec 16 09:05:59 crc kubenswrapper[4823]: E1216 09:05:59.125436 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"daa319172469456d90f79e0576618cf7956978b04f0ac6451e2b9a7c304073fa\": container with ID starting with daa319172469456d90f79e0576618cf7956978b04f0ac6451e2b9a7c304073fa not found: ID does not exist" containerID="daa319172469456d90f79e0576618cf7956978b04f0ac6451e2b9a7c304073fa" Dec 16 09:05:59 crc kubenswrapper[4823]: I1216 09:05:59.125483 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"daa319172469456d90f79e0576618cf7956978b04f0ac6451e2b9a7c304073fa"} err="failed to get container status \"daa319172469456d90f79e0576618cf7956978b04f0ac6451e2b9a7c304073fa\": rpc error: code = NotFound desc = could not find container \"daa319172469456d90f79e0576618cf7956978b04f0ac6451e2b9a7c304073fa\": container with ID starting with daa319172469456d90f79e0576618cf7956978b04f0ac6451e2b9a7c304073fa not found: 
ID does not exist" Dec 16 09:05:59 crc kubenswrapper[4823]: I1216 09:05:59.125509 4823 scope.go:117] "RemoveContainer" containerID="54e8d62c31b365abfccf276af2fa066c647ab0db67840682b14239b389ce81fd" Dec 16 09:05:59 crc kubenswrapper[4823]: E1216 09:05:59.125810 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54e8d62c31b365abfccf276af2fa066c647ab0db67840682b14239b389ce81fd\": container with ID starting with 54e8d62c31b365abfccf276af2fa066c647ab0db67840682b14239b389ce81fd not found: ID does not exist" containerID="54e8d62c31b365abfccf276af2fa066c647ab0db67840682b14239b389ce81fd" Dec 16 09:05:59 crc kubenswrapper[4823]: I1216 09:05:59.125863 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54e8d62c31b365abfccf276af2fa066c647ab0db67840682b14239b389ce81fd"} err="failed to get container status \"54e8d62c31b365abfccf276af2fa066c647ab0db67840682b14239b389ce81fd\": rpc error: code = NotFound desc = could not find container \"54e8d62c31b365abfccf276af2fa066c647ab0db67840682b14239b389ce81fd\": container with ID starting with 54e8d62c31b365abfccf276af2fa066c647ab0db67840682b14239b389ce81fd not found: ID does not exist" Dec 16 09:05:59 crc kubenswrapper[4823]: I1216 09:05:59.125897 4823 scope.go:117] "RemoveContainer" containerID="7ea88ea620b05667758b46f48a77742bd2c287c8cd4f6c2a60c12edadac33c0b" Dec 16 09:05:59 crc kubenswrapper[4823]: E1216 09:05:59.126232 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ea88ea620b05667758b46f48a77742bd2c287c8cd4f6c2a60c12edadac33c0b\": container with ID starting with 7ea88ea620b05667758b46f48a77742bd2c287c8cd4f6c2a60c12edadac33c0b not found: ID does not exist" containerID="7ea88ea620b05667758b46f48a77742bd2c287c8cd4f6c2a60c12edadac33c0b" Dec 16 09:05:59 crc kubenswrapper[4823]: I1216 09:05:59.126262 4823 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ea88ea620b05667758b46f48a77742bd2c287c8cd4f6c2a60c12edadac33c0b"} err="failed to get container status \"7ea88ea620b05667758b46f48a77742bd2c287c8cd4f6c2a60c12edadac33c0b\": rpc error: code = NotFound desc = could not find container \"7ea88ea620b05667758b46f48a77742bd2c287c8cd4f6c2a60c12edadac33c0b\": container with ID starting with 7ea88ea620b05667758b46f48a77742bd2c287c8cd4f6c2a60c12edadac33c0b not found: ID does not exist" Dec 16 09:05:59 crc kubenswrapper[4823]: I1216 09:05:59.792429 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28373e9d-544d-40d4-8517-51e6718b9493" path="/var/lib/kubelet/pods/28373e9d-544d-40d4-8517-51e6718b9493/volumes" Dec 16 09:05:59 crc kubenswrapper[4823]: I1216 09:05:59.794211 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7948501f-8639-4c51-993a-c829939e6148" path="/var/lib/kubelet/pods/7948501f-8639-4c51-993a-c829939e6148/volumes" Dec 16 09:06:00 crc kubenswrapper[4823]: I1216 09:06:00.771720 4823 scope.go:117] "RemoveContainer" containerID="14e51af7fb5c2d7b7fdc9e1989841225a65614d883db6f8d5aea8aeb819bd04d" Dec 16 09:06:00 crc kubenswrapper[4823]: E1216 09:06:00.771987 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 09:06:05 crc kubenswrapper[4823]: I1216 09:06:05.912035 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92102khnm"] Dec 16 09:06:05 crc kubenswrapper[4823]: E1216 09:06:05.912890 4823 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="28373e9d-544d-40d4-8517-51e6718b9493" containerName="horizon-log" Dec 16 09:06:05 crc kubenswrapper[4823]: I1216 09:06:05.912907 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="28373e9d-544d-40d4-8517-51e6718b9493" containerName="horizon-log" Dec 16 09:06:05 crc kubenswrapper[4823]: E1216 09:06:05.912928 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7948501f-8639-4c51-993a-c829939e6148" containerName="registry-server" Dec 16 09:06:05 crc kubenswrapper[4823]: I1216 09:06:05.912937 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="7948501f-8639-4c51-993a-c829939e6148" containerName="registry-server" Dec 16 09:06:05 crc kubenswrapper[4823]: E1216 09:06:05.912950 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7948501f-8639-4c51-993a-c829939e6148" containerName="extract-content" Dec 16 09:06:05 crc kubenswrapper[4823]: I1216 09:06:05.912960 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="7948501f-8639-4c51-993a-c829939e6148" containerName="extract-content" Dec 16 09:06:05 crc kubenswrapper[4823]: E1216 09:06:05.912986 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28373e9d-544d-40d4-8517-51e6718b9493" containerName="horizon" Dec 16 09:06:05 crc kubenswrapper[4823]: I1216 09:06:05.912996 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="28373e9d-544d-40d4-8517-51e6718b9493" containerName="horizon" Dec 16 09:06:05 crc kubenswrapper[4823]: E1216 09:06:05.913010 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7948501f-8639-4c51-993a-c829939e6148" containerName="extract-utilities" Dec 16 09:06:05 crc kubenswrapper[4823]: I1216 09:06:05.913018 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="7948501f-8639-4c51-993a-c829939e6148" containerName="extract-utilities" Dec 16 09:06:05 crc kubenswrapper[4823]: E1216 09:06:05.913052 4823 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="7c5e1025-1368-4594-88d1-16ef8ccadccb" containerName="heat-engine" Dec 16 09:06:05 crc kubenswrapper[4823]: I1216 09:06:05.913060 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c5e1025-1368-4594-88d1-16ef8ccadccb" containerName="heat-engine" Dec 16 09:06:05 crc kubenswrapper[4823]: I1216 09:06:05.916464 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="28373e9d-544d-40d4-8517-51e6718b9493" containerName="horizon" Dec 16 09:06:05 crc kubenswrapper[4823]: I1216 09:06:05.916601 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="7948501f-8639-4c51-993a-c829939e6148" containerName="registry-server" Dec 16 09:06:05 crc kubenswrapper[4823]: I1216 09:06:05.916634 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="28373e9d-544d-40d4-8517-51e6718b9493" containerName="horizon-log" Dec 16 09:06:05 crc kubenswrapper[4823]: I1216 09:06:05.916678 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c5e1025-1368-4594-88d1-16ef8ccadccb" containerName="heat-engine" Dec 16 09:06:05 crc kubenswrapper[4823]: I1216 09:06:05.919711 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92102khnm" Dec 16 09:06:05 crc kubenswrapper[4823]: I1216 09:06:05.921646 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 16 09:06:05 crc kubenswrapper[4823]: I1216 09:06:05.926495 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92102khnm"] Dec 16 09:06:05 crc kubenswrapper[4823]: I1216 09:06:05.963646 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/66ba22e0-25ea-4ff8-8114-642abebbca90-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92102khnm\" (UID: \"66ba22e0-25ea-4ff8-8114-642abebbca90\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92102khnm" Dec 16 09:06:05 crc kubenswrapper[4823]: I1216 09:06:05.963730 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24jbg\" (UniqueName: \"kubernetes.io/projected/66ba22e0-25ea-4ff8-8114-642abebbca90-kube-api-access-24jbg\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92102khnm\" (UID: \"66ba22e0-25ea-4ff8-8114-642abebbca90\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92102khnm" Dec 16 09:06:05 crc kubenswrapper[4823]: I1216 09:06:05.964234 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/66ba22e0-25ea-4ff8-8114-642abebbca90-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92102khnm\" (UID: \"66ba22e0-25ea-4ff8-8114-642abebbca90\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92102khnm" Dec 16 09:06:06 crc kubenswrapper[4823]: 
I1216 09:06:06.067309 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/66ba22e0-25ea-4ff8-8114-642abebbca90-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92102khnm\" (UID: \"66ba22e0-25ea-4ff8-8114-642abebbca90\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92102khnm" Dec 16 09:06:06 crc kubenswrapper[4823]: I1216 09:06:06.067365 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/66ba22e0-25ea-4ff8-8114-642abebbca90-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92102khnm\" (UID: \"66ba22e0-25ea-4ff8-8114-642abebbca90\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92102khnm" Dec 16 09:06:06 crc kubenswrapper[4823]: I1216 09:06:06.067388 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24jbg\" (UniqueName: \"kubernetes.io/projected/66ba22e0-25ea-4ff8-8114-642abebbca90-kube-api-access-24jbg\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92102khnm\" (UID: \"66ba22e0-25ea-4ff8-8114-642abebbca90\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92102khnm" Dec 16 09:06:06 crc kubenswrapper[4823]: I1216 09:06:06.068172 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/66ba22e0-25ea-4ff8-8114-642abebbca90-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92102khnm\" (UID: \"66ba22e0-25ea-4ff8-8114-642abebbca90\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92102khnm" Dec 16 09:06:06 crc kubenswrapper[4823]: I1216 09:06:06.068284 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/66ba22e0-25ea-4ff8-8114-642abebbca90-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92102khnm\" (UID: \"66ba22e0-25ea-4ff8-8114-642abebbca90\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92102khnm" Dec 16 09:06:06 crc kubenswrapper[4823]: I1216 09:06:06.091988 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24jbg\" (UniqueName: \"kubernetes.io/projected/66ba22e0-25ea-4ff8-8114-642abebbca90-kube-api-access-24jbg\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92102khnm\" (UID: \"66ba22e0-25ea-4ff8-8114-642abebbca90\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92102khnm" Dec 16 09:06:06 crc kubenswrapper[4823]: I1216 09:06:06.250617 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92102khnm" Dec 16 09:06:06 crc kubenswrapper[4823]: I1216 09:06:06.692847 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92102khnm"] Dec 16 09:06:07 crc kubenswrapper[4823]: I1216 09:06:07.116347 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92102khnm" event={"ID":"66ba22e0-25ea-4ff8-8114-642abebbca90","Type":"ContainerStarted","Data":"49d36e85bfc0a68d4d3634211a26774d1600ad9c5f74d80f8127312a02e32103"} Dec 16 09:06:07 crc kubenswrapper[4823]: I1216 09:06:07.116402 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92102khnm" event={"ID":"66ba22e0-25ea-4ff8-8114-642abebbca90","Type":"ContainerStarted","Data":"577bbe3b68ab78903f58bc6029f09a32f6cfd60817fec8d43cad8dacb0f7231b"} Dec 16 09:06:08 crc kubenswrapper[4823]: I1216 09:06:08.127879 4823 
generic.go:334] "Generic (PLEG): container finished" podID="66ba22e0-25ea-4ff8-8114-642abebbca90" containerID="49d36e85bfc0a68d4d3634211a26774d1600ad9c5f74d80f8127312a02e32103" exitCode=0
Dec 16 09:06:08 crc kubenswrapper[4823]: I1216 09:06:08.127923 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92102khnm" event={"ID":"66ba22e0-25ea-4ff8-8114-642abebbca90","Type":"ContainerDied","Data":"49d36e85bfc0a68d4d3634211a26774d1600ad9c5f74d80f8127312a02e32103"}
Dec 16 09:06:08 crc kubenswrapper[4823]: I1216 09:06:08.882250 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-2nvfp"]
Dec 16 09:06:08 crc kubenswrapper[4823]: I1216 09:06:08.887053 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2nvfp"
Dec 16 09:06:08 crc kubenswrapper[4823]: I1216 09:06:08.894636 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2nvfp"]
Dec 16 09:06:08 crc kubenswrapper[4823]: I1216 09:06:08.921100 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbcss\" (UniqueName: \"kubernetes.io/projected/d4688585-bbc3-4739-854f-a17c034eda73-kube-api-access-kbcss\") pod \"redhat-operators-2nvfp\" (UID: \"d4688585-bbc3-4739-854f-a17c034eda73\") " pod="openshift-marketplace/redhat-operators-2nvfp"
Dec 16 09:06:08 crc kubenswrapper[4823]: I1216 09:06:08.921199 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4688585-bbc3-4739-854f-a17c034eda73-catalog-content\") pod \"redhat-operators-2nvfp\" (UID: \"d4688585-bbc3-4739-854f-a17c034eda73\") " pod="openshift-marketplace/redhat-operators-2nvfp"
Dec 16 09:06:08 crc kubenswrapper[4823]: I1216 09:06:08.921320 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4688585-bbc3-4739-854f-a17c034eda73-utilities\") pod \"redhat-operators-2nvfp\" (UID: \"d4688585-bbc3-4739-854f-a17c034eda73\") " pod="openshift-marketplace/redhat-operators-2nvfp"
Dec 16 09:06:09 crc kubenswrapper[4823]: I1216 09:06:09.023605 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4688585-bbc3-4739-854f-a17c034eda73-utilities\") pod \"redhat-operators-2nvfp\" (UID: \"d4688585-bbc3-4739-854f-a17c034eda73\") " pod="openshift-marketplace/redhat-operators-2nvfp"
Dec 16 09:06:09 crc kubenswrapper[4823]: I1216 09:06:09.023902 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kbcss\" (UniqueName: \"kubernetes.io/projected/d4688585-bbc3-4739-854f-a17c034eda73-kube-api-access-kbcss\") pod \"redhat-operators-2nvfp\" (UID: \"d4688585-bbc3-4739-854f-a17c034eda73\") " pod="openshift-marketplace/redhat-operators-2nvfp"
Dec 16 09:06:09 crc kubenswrapper[4823]: I1216 09:06:09.023971 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4688585-bbc3-4739-854f-a17c034eda73-catalog-content\") pod \"redhat-operators-2nvfp\" (UID: \"d4688585-bbc3-4739-854f-a17c034eda73\") " pod="openshift-marketplace/redhat-operators-2nvfp"
Dec 16 09:06:09 crc kubenswrapper[4823]: I1216 09:06:09.024401 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4688585-bbc3-4739-854f-a17c034eda73-utilities\") pod \"redhat-operators-2nvfp\" (UID: \"d4688585-bbc3-4739-854f-a17c034eda73\") " pod="openshift-marketplace/redhat-operators-2nvfp"
Dec 16 09:06:09 crc kubenswrapper[4823]: I1216 09:06:09.024785 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4688585-bbc3-4739-854f-a17c034eda73-catalog-content\") pod \"redhat-operators-2nvfp\" (UID: \"d4688585-bbc3-4739-854f-a17c034eda73\") " pod="openshift-marketplace/redhat-operators-2nvfp"
Dec 16 09:06:09 crc kubenswrapper[4823]: I1216 09:06:09.045540 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbcss\" (UniqueName: \"kubernetes.io/projected/d4688585-bbc3-4739-854f-a17c034eda73-kube-api-access-kbcss\") pod \"redhat-operators-2nvfp\" (UID: \"d4688585-bbc3-4739-854f-a17c034eda73\") " pod="openshift-marketplace/redhat-operators-2nvfp"
Dec 16 09:06:09 crc kubenswrapper[4823]: I1216 09:06:09.220842 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2nvfp"
Dec 16 09:06:09 crc kubenswrapper[4823]: I1216 09:06:09.576849 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2nvfp"]
Dec 16 09:06:10 crc kubenswrapper[4823]: I1216 09:06:10.148722 4823 generic.go:334] "Generic (PLEG): container finished" podID="d4688585-bbc3-4739-854f-a17c034eda73" containerID="b55baad15540130c641c83961311a83bb6a05d8e7ff4b31caf88ac39c0c1eb4a" exitCode=0
Dec 16 09:06:10 crc kubenswrapper[4823]: I1216 09:06:10.148776 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2nvfp" event={"ID":"d4688585-bbc3-4739-854f-a17c034eda73","Type":"ContainerDied","Data":"b55baad15540130c641c83961311a83bb6a05d8e7ff4b31caf88ac39c0c1eb4a"}
Dec 16 09:06:10 crc kubenswrapper[4823]: I1216 09:06:10.148813 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2nvfp" event={"ID":"d4688585-bbc3-4739-854f-a17c034eda73","Type":"ContainerStarted","Data":"361983f487ea0140be83b6098d9f42ada81783d94c7762d44a8ad829a824faa1"}
Dec 16 09:06:11 crc kubenswrapper[4823]: I1216 09:06:11.159135 4823 generic.go:334] "Generic (PLEG): container finished" podID="66ba22e0-25ea-4ff8-8114-642abebbca90" containerID="7fc4402892c4ccec1dffc32092a3cd38c27cf715058cb39894a498c5e198d020" exitCode=0
Dec 16 09:06:11 crc kubenswrapper[4823]: I1216 09:06:11.159257 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92102khnm" event={"ID":"66ba22e0-25ea-4ff8-8114-642abebbca90","Type":"ContainerDied","Data":"7fc4402892c4ccec1dffc32092a3cd38c27cf715058cb39894a498c5e198d020"}
Dec 16 09:06:12 crc kubenswrapper[4823]: I1216 09:06:12.217335 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2nvfp" event={"ID":"d4688585-bbc3-4739-854f-a17c034eda73","Type":"ContainerStarted","Data":"96fc380364993e481a4a5ea522add6b0cfa50bf3fd3e9eda8b30c8ac765b4ce2"}
Dec 16 09:06:12 crc kubenswrapper[4823]: I1216 09:06:12.219627 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92102khnm" event={"ID":"66ba22e0-25ea-4ff8-8114-642abebbca90","Type":"ContainerStarted","Data":"7acfa0e5b830db602ff01a35203a38e670f38fa18e08b54c58b765871d6465b3"}
Dec 16 09:06:15 crc kubenswrapper[4823]: I1216 09:06:15.729187 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92102khnm" podStartSLOduration=8.742946848999999 podStartE2EDuration="10.729169263s" podCreationTimestamp="2025-12-16 09:06:05 +0000 UTC" firstStartedPulling="2025-12-16 09:06:08.130466333 +0000 UTC m=+7846.619032466" lastFinishedPulling="2025-12-16 09:06:10.116688757 +0000 UTC m=+7848.605254880" observedRunningTime="2025-12-16 09:06:15.727114839 +0000 UTC m=+7854.215680962" watchObservedRunningTime="2025-12-16 09:06:15.729169263 +0000 UTC m=+7854.217735386"
Dec 16 09:06:15 crc kubenswrapper[4823]: I1216 09:06:15.772060 4823 scope.go:117] "RemoveContainer" containerID="14e51af7fb5c2d7b7fdc9e1989841225a65614d883db6f8d5aea8aeb819bd04d"
Dec 16 09:06:15 crc kubenswrapper[4823]: E1216 09:06:15.772477 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3"
Dec 16 09:06:17 crc kubenswrapper[4823]: I1216 09:06:17.715348 4823 patch_prober.go:28] interesting pod/downloads-7954f5f757-k2ljf container/download-server namespace/openshift-console: Liveness probe status=failure output="" start-of-body=
Dec 16 09:06:17 crc kubenswrapper[4823]: I1216 09:06:17.757264 4823 patch_prober.go:28] interesting pod/downloads-7954f5f757-k2ljf container/download-server namespace/openshift-console: Readiness probe status=failure output="" start-of-body=
Dec 16 09:06:19 crc kubenswrapper[4823]: I1216 09:06:19.305477 4823 generic.go:334] "Generic (PLEG): container finished" podID="66ba22e0-25ea-4ff8-8114-642abebbca90" containerID="7acfa0e5b830db602ff01a35203a38e670f38fa18e08b54c58b765871d6465b3" exitCode=0
Dec 16 09:06:19 crc kubenswrapper[4823]: I1216 09:06:19.305905 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92102khnm" event={"ID":"66ba22e0-25ea-4ff8-8114-642abebbca90","Type":"ContainerDied","Data":"7acfa0e5b830db602ff01a35203a38e670f38fa18e08b54c58b765871d6465b3"}
Dec 16 09:06:20 crc kubenswrapper[4823]: I1216 09:06:20.316929 4823 generic.go:334] "Generic (PLEG): container finished" podID="d4688585-bbc3-4739-854f-a17c034eda73" containerID="96fc380364993e481a4a5ea522add6b0cfa50bf3fd3e9eda8b30c8ac765b4ce2" exitCode=0
Dec 16 09:06:20 crc kubenswrapper[4823]: I1216 09:06:20.317210 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2nvfp" event={"ID":"d4688585-bbc3-4739-854f-a17c034eda73","Type":"ContainerDied","Data":"96fc380364993e481a4a5ea522add6b0cfa50bf3fd3e9eda8b30c8ac765b4ce2"}
Dec 16 09:06:20 crc kubenswrapper[4823]: I1216 09:06:20.319300 4823 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 16 09:06:20 crc kubenswrapper[4823]: I1216 09:06:20.635949 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92102khnm"
Dec 16 09:06:20 crc kubenswrapper[4823]: I1216 09:06:20.792517 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-24jbg\" (UniqueName: \"kubernetes.io/projected/66ba22e0-25ea-4ff8-8114-642abebbca90-kube-api-access-24jbg\") pod \"66ba22e0-25ea-4ff8-8114-642abebbca90\" (UID: \"66ba22e0-25ea-4ff8-8114-642abebbca90\") "
Dec 16 09:06:20 crc kubenswrapper[4823]: I1216 09:06:20.792757 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/66ba22e0-25ea-4ff8-8114-642abebbca90-bundle\") pod \"66ba22e0-25ea-4ff8-8114-642abebbca90\" (UID: \"66ba22e0-25ea-4ff8-8114-642abebbca90\") "
Dec 16 09:06:20 crc kubenswrapper[4823]: I1216 09:06:20.792864 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/66ba22e0-25ea-4ff8-8114-642abebbca90-util\") pod \"66ba22e0-25ea-4ff8-8114-642abebbca90\" (UID: \"66ba22e0-25ea-4ff8-8114-642abebbca90\") "
Dec 16 09:06:20 crc kubenswrapper[4823]: I1216 09:06:20.794217 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66ba22e0-25ea-4ff8-8114-642abebbca90-bundle" (OuterVolumeSpecName: "bundle") pod "66ba22e0-25ea-4ff8-8114-642abebbca90" (UID: "66ba22e0-25ea-4ff8-8114-642abebbca90"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 16 09:06:20 crc kubenswrapper[4823]: I1216 09:06:20.800352 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66ba22e0-25ea-4ff8-8114-642abebbca90-kube-api-access-24jbg" (OuterVolumeSpecName: "kube-api-access-24jbg") pod "66ba22e0-25ea-4ff8-8114-642abebbca90" (UID: "66ba22e0-25ea-4ff8-8114-642abebbca90"). InnerVolumeSpecName "kube-api-access-24jbg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 09:06:20 crc kubenswrapper[4823]: I1216 09:06:20.801682 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66ba22e0-25ea-4ff8-8114-642abebbca90-util" (OuterVolumeSpecName: "util") pod "66ba22e0-25ea-4ff8-8114-642abebbca90" (UID: "66ba22e0-25ea-4ff8-8114-642abebbca90"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 16 09:06:20 crc kubenswrapper[4823]: I1216 09:06:20.898095 4823 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/66ba22e0-25ea-4ff8-8114-642abebbca90-util\") on node \"crc\" DevicePath \"\""
Dec 16 09:06:20 crc kubenswrapper[4823]: I1216 09:06:20.898133 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-24jbg\" (UniqueName: \"kubernetes.io/projected/66ba22e0-25ea-4ff8-8114-642abebbca90-kube-api-access-24jbg\") on node \"crc\" DevicePath \"\""
Dec 16 09:06:20 crc kubenswrapper[4823]: I1216 09:06:20.898146 4823 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/66ba22e0-25ea-4ff8-8114-642abebbca90-bundle\") on node \"crc\" DevicePath \"\""
Dec 16 09:06:21 crc kubenswrapper[4823]: I1216 09:06:21.327060 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92102khnm" event={"ID":"66ba22e0-25ea-4ff8-8114-642abebbca90","Type":"ContainerDied","Data":"577bbe3b68ab78903f58bc6029f09a32f6cfd60817fec8d43cad8dacb0f7231b"}
Dec 16 09:06:21 crc kubenswrapper[4823]: I1216 09:06:21.327353 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="577bbe3b68ab78903f58bc6029f09a32f6cfd60817fec8d43cad8dacb0f7231b"
Dec 16 09:06:21 crc kubenswrapper[4823]: I1216 09:06:21.327092 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92102khnm"
Dec 16 09:06:23 crc kubenswrapper[4823]: I1216 09:06:23.061731 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-82cl4"]
Dec 16 09:06:23 crc kubenswrapper[4823]: I1216 09:06:23.073055 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-ea34-account-create-update-l6sms"]
Dec 16 09:06:23 crc kubenswrapper[4823]: I1216 09:06:23.085118 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-82cl4"]
Dec 16 09:06:23 crc kubenswrapper[4823]: I1216 09:06:23.093815 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-ea34-account-create-update-l6sms"]
Dec 16 09:06:23 crc kubenswrapper[4823]: I1216 09:06:23.785771 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0bcd2929-eefc-4b29-829b-e565910486bb" path="/var/lib/kubelet/pods/0bcd2929-eefc-4b29-829b-e565910486bb/volumes"
Dec 16 09:06:23 crc kubenswrapper[4823]: I1216 09:06:23.786412 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4776592c-7509-4140-b012-6e506b95806d" path="/var/lib/kubelet/pods/4776592c-7509-4140-b012-6e506b95806d/volumes"
Dec 16 09:06:28 crc kubenswrapper[4823]: I1216 09:06:28.771342 4823 scope.go:117] "RemoveContainer" containerID="14e51af7fb5c2d7b7fdc9e1989841225a65614d883db6f8d5aea8aeb819bd04d"
Dec 16 09:06:28 crc kubenswrapper[4823]: E1216 09:06:28.771872 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3"
Dec 16 09:06:29 crc kubenswrapper[4823]: I1216 09:06:29.550759 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2nvfp" event={"ID":"d4688585-bbc3-4739-854f-a17c034eda73","Type":"ContainerStarted","Data":"56b324c199707ae0bea733f9e59c7225b9eac92816bcca8da05a54c2e6c15c77"}
Dec 16 09:06:30 crc kubenswrapper[4823]: I1216 09:06:30.597100 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-2nvfp" podStartSLOduration=4.48949816 podStartE2EDuration="22.597077191s" podCreationTimestamp="2025-12-16 09:06:08 +0000 UTC" firstStartedPulling="2025-12-16 09:06:10.150970689 +0000 UTC m=+7848.639536812" lastFinishedPulling="2025-12-16 09:06:28.25854971 +0000 UTC m=+7866.747115843" observedRunningTime="2025-12-16 09:06:30.589522844 +0000 UTC m=+7869.078088967" watchObservedRunningTime="2025-12-16 09:06:30.597077191 +0000 UTC m=+7869.085643324"
Dec 16 09:06:34 crc kubenswrapper[4823]: I1216 09:06:34.936342 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-flrxh"]
Dec 16 09:06:34 crc kubenswrapper[4823]: E1216 09:06:34.937369 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66ba22e0-25ea-4ff8-8114-642abebbca90" containerName="pull"
Dec 16 09:06:34 crc kubenswrapper[4823]: I1216 09:06:34.937389 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="66ba22e0-25ea-4ff8-8114-642abebbca90" containerName="pull"
Dec 16 09:06:34 crc kubenswrapper[4823]: E1216 09:06:34.937413 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66ba22e0-25ea-4ff8-8114-642abebbca90" containerName="util"
Dec 16 09:06:34 crc kubenswrapper[4823]: I1216 09:06:34.937423 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="66ba22e0-25ea-4ff8-8114-642abebbca90" containerName="util"
Dec 16 09:06:34 crc kubenswrapper[4823]: E1216 09:06:34.937445 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66ba22e0-25ea-4ff8-8114-642abebbca90" containerName="extract"
Dec 16 09:06:34 crc kubenswrapper[4823]: I1216 09:06:34.937454 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="66ba22e0-25ea-4ff8-8114-642abebbca90" containerName="extract"
Dec 16 09:06:34 crc kubenswrapper[4823]: I1216 09:06:34.937688 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="66ba22e0-25ea-4ff8-8114-642abebbca90" containerName="extract"
Dec 16 09:06:34 crc kubenswrapper[4823]: I1216 09:06:34.938529 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-flrxh"
Dec 16 09:06:34 crc kubenswrapper[4823]: I1216 09:06:34.942189 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt"
Dec 16 09:06:34 crc kubenswrapper[4823]: I1216 09:06:34.942401 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-tlmnx"
Dec 16 09:06:34 crc kubenswrapper[4823]: I1216 09:06:34.942575 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt"
Dec 16 09:06:34 crc kubenswrapper[4823]: I1216 09:06:34.960788 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-flrxh"]
Dec 16 09:06:35 crc kubenswrapper[4823]: I1216 09:06:35.050792 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqs9q\" (UniqueName: \"kubernetes.io/projected/07fa5706-2a14-40f7-ac5a-cc229d35055d-kube-api-access-mqs9q\") pod \"obo-prometheus-operator-668cf9dfbb-flrxh\" (UID: \"07fa5706-2a14-40f7-ac5a-cc229d35055d\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-flrxh"
Dec 16 09:06:35 crc kubenswrapper[4823]: I1216 09:06:35.067326 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-84dcbb7858-k6gks"]
Dec 16 09:06:35 crc kubenswrapper[4823]: I1216 09:06:35.068493 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-84dcbb7858-k6gks"
Dec 16 09:06:35 crc kubenswrapper[4823]: I1216 09:06:35.070887 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-4fg7b"
Dec 16 09:06:35 crc kubenswrapper[4823]: I1216 09:06:35.072358 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert"
Dec 16 09:06:35 crc kubenswrapper[4823]: I1216 09:06:35.082542 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-84dcbb7858-k6gks"]
Dec 16 09:06:35 crc kubenswrapper[4823]: I1216 09:06:35.098546 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-84dcbb7858-ln8wq"]
Dec 16 09:06:35 crc kubenswrapper[4823]: I1216 09:06:35.100199 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-84dcbb7858-ln8wq"
Dec 16 09:06:35 crc kubenswrapper[4823]: I1216 09:06:35.155130 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/742e0cec-7370-4a35-90b8-64b2da24c464-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-84dcbb7858-k6gks\" (UID: \"742e0cec-7370-4a35-90b8-64b2da24c464\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-84dcbb7858-k6gks"
Dec 16 09:06:35 crc kubenswrapper[4823]: I1216 09:06:35.155190 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqs9q\" (UniqueName: \"kubernetes.io/projected/07fa5706-2a14-40f7-ac5a-cc229d35055d-kube-api-access-mqs9q\") pod \"obo-prometheus-operator-668cf9dfbb-flrxh\" (UID: \"07fa5706-2a14-40f7-ac5a-cc229d35055d\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-flrxh"
Dec 16 09:06:35 crc kubenswrapper[4823]: I1216 09:06:35.155230 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1ac408b5-8185-47a6-bdd4-33cc8d6906f3-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-84dcbb7858-ln8wq\" (UID: \"1ac408b5-8185-47a6-bdd4-33cc8d6906f3\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-84dcbb7858-ln8wq"
Dec 16 09:06:35 crc kubenswrapper[4823]: I1216 09:06:35.155266 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/742e0cec-7370-4a35-90b8-64b2da24c464-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-84dcbb7858-k6gks\" (UID: \"742e0cec-7370-4a35-90b8-64b2da24c464\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-84dcbb7858-k6gks"
Dec 16 09:06:35 crc kubenswrapper[4823]: I1216 09:06:35.155321 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1ac408b5-8185-47a6-bdd4-33cc8d6906f3-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-84dcbb7858-ln8wq\" (UID: \"1ac408b5-8185-47a6-bdd4-33cc8d6906f3\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-84dcbb7858-ln8wq"
Dec 16 09:06:35 crc kubenswrapper[4823]: I1216 09:06:35.205855 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqs9q\" (UniqueName: \"kubernetes.io/projected/07fa5706-2a14-40f7-ac5a-cc229d35055d-kube-api-access-mqs9q\") pod \"obo-prometheus-operator-668cf9dfbb-flrxh\" (UID: \"07fa5706-2a14-40f7-ac5a-cc229d35055d\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-flrxh"
Dec 16 09:06:35 crc kubenswrapper[4823]: I1216 09:06:35.254187 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-84dcbb7858-ln8wq"]
Dec 16 09:06:35 crc kubenswrapper[4823]: I1216 09:06:35.259564 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/742e0cec-7370-4a35-90b8-64b2da24c464-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-84dcbb7858-k6gks\" (UID: \"742e0cec-7370-4a35-90b8-64b2da24c464\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-84dcbb7858-k6gks"
Dec 16 09:06:35 crc kubenswrapper[4823]: I1216 09:06:35.259678 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1ac408b5-8185-47a6-bdd4-33cc8d6906f3-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-84dcbb7858-ln8wq\" (UID: \"1ac408b5-8185-47a6-bdd4-33cc8d6906f3\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-84dcbb7858-ln8wq"
Dec 16 09:06:35 crc kubenswrapper[4823]: I1216 09:06:35.259834 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/742e0cec-7370-4a35-90b8-64b2da24c464-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-84dcbb7858-k6gks\" (UID: \"742e0cec-7370-4a35-90b8-64b2da24c464\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-84dcbb7858-k6gks"
Dec 16 09:06:35 crc kubenswrapper[4823]: I1216 09:06:35.259900 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1ac408b5-8185-47a6-bdd4-33cc8d6906f3-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-84dcbb7858-ln8wq\" (UID: \"1ac408b5-8185-47a6-bdd4-33cc8d6906f3\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-84dcbb7858-ln8wq"
Dec 16 09:06:35 crc kubenswrapper[4823]: I1216 09:06:35.267492 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-flrxh"
Dec 16 09:06:35 crc kubenswrapper[4823]: I1216 09:06:35.273252 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1ac408b5-8185-47a6-bdd4-33cc8d6906f3-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-84dcbb7858-ln8wq\" (UID: \"1ac408b5-8185-47a6-bdd4-33cc8d6906f3\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-84dcbb7858-ln8wq"
Dec 16 09:06:35 crc kubenswrapper[4823]: I1216 09:06:35.273718 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1ac408b5-8185-47a6-bdd4-33cc8d6906f3-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-84dcbb7858-ln8wq\" (UID: \"1ac408b5-8185-47a6-bdd4-33cc8d6906f3\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-84dcbb7858-ln8wq"
Dec 16 09:06:35 crc kubenswrapper[4823]: I1216 09:06:35.274254 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/742e0cec-7370-4a35-90b8-64b2da24c464-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-84dcbb7858-k6gks\" (UID: \"742e0cec-7370-4a35-90b8-64b2da24c464\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-84dcbb7858-k6gks"
Dec 16 09:06:35 crc kubenswrapper[4823]: I1216 09:06:35.280554 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/742e0cec-7370-4a35-90b8-64b2da24c464-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-84dcbb7858-k6gks\" (UID: \"742e0cec-7370-4a35-90b8-64b2da24c464\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-84dcbb7858-k6gks"
Dec 16 09:06:35 crc kubenswrapper[4823]: I1216 09:06:35.337876 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-pkkpr"]
Dec 16 09:06:35 crc kubenswrapper[4823]: I1216 09:06:35.339471 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-pkkpr"
Dec 16 09:06:35 crc kubenswrapper[4823]: I1216 09:06:35.343599 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-85m5q"
Dec 16 09:06:35 crc kubenswrapper[4823]: I1216 09:06:35.343871 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls"
Dec 16 09:06:35 crc kubenswrapper[4823]: I1216 09:06:35.383673 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-pkkpr"]
Dec 16 09:06:35 crc kubenswrapper[4823]: I1216 09:06:35.394664 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-84dcbb7858-k6gks"
Dec 16 09:06:35 crc kubenswrapper[4823]: I1216 09:06:35.468294 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/e554ec4f-2a7c-419b-9346-294c8026d503-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-pkkpr\" (UID: \"e554ec4f-2a7c-419b-9346-294c8026d503\") " pod="openshift-operators/observability-operator-d8bb48f5d-pkkpr"
Dec 16 09:06:35 crc kubenswrapper[4823]: I1216 09:06:35.468359 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdbr7\" (UniqueName: \"kubernetes.io/projected/e554ec4f-2a7c-419b-9346-294c8026d503-kube-api-access-qdbr7\") pod \"observability-operator-d8bb48f5d-pkkpr\" (UID: \"e554ec4f-2a7c-419b-9346-294c8026d503\") " pod="openshift-operators/observability-operator-d8bb48f5d-pkkpr"
Dec 16 09:06:35 crc kubenswrapper[4823]: I1216 09:06:35.485128 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-84dcbb7858-ln8wq"
Dec 16 09:06:35 crc kubenswrapper[4823]: I1216 09:06:35.573480 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/e554ec4f-2a7c-419b-9346-294c8026d503-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-pkkpr\" (UID: \"e554ec4f-2a7c-419b-9346-294c8026d503\") " pod="openshift-operators/observability-operator-d8bb48f5d-pkkpr"
Dec 16 09:06:35 crc kubenswrapper[4823]: I1216 09:06:35.573539 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdbr7\" (UniqueName: \"kubernetes.io/projected/e554ec4f-2a7c-419b-9346-294c8026d503-kube-api-access-qdbr7\") pod \"observability-operator-d8bb48f5d-pkkpr\" (UID: \"e554ec4f-2a7c-419b-9346-294c8026d503\") " pod="openshift-operators/observability-operator-d8bb48f5d-pkkpr"
Dec 16 09:06:35 crc kubenswrapper[4823]: I1216 09:06:35.578875 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/e554ec4f-2a7c-419b-9346-294c8026d503-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-pkkpr\" (UID: \"e554ec4f-2a7c-419b-9346-294c8026d503\") " pod="openshift-operators/observability-operator-d8bb48f5d-pkkpr"
Dec 16 09:06:35 crc kubenswrapper[4823]: I1216 09:06:35.579358 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5446b9c989-26zgb"]
Dec 16 09:06:35 crc kubenswrapper[4823]: I1216 09:06:35.591761 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-26zgb"
Dec 16 09:06:35 crc kubenswrapper[4823]: I1216 09:06:35.608366 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-9px5x"
Dec 16 09:06:35 crc kubenswrapper[4823]: I1216 09:06:35.613039 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdbr7\" (UniqueName: \"kubernetes.io/projected/e554ec4f-2a7c-419b-9346-294c8026d503-kube-api-access-qdbr7\") pod \"observability-operator-d8bb48f5d-pkkpr\" (UID: \"e554ec4f-2a7c-419b-9346-294c8026d503\") " pod="openshift-operators/observability-operator-d8bb48f5d-pkkpr"
Dec 16 09:06:35 crc kubenswrapper[4823]: I1216 09:06:35.631748 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5446b9c989-26zgb"]
Dec 16 09:06:35 crc kubenswrapper[4823]: I1216 09:06:35.675728 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6kd4\" (UniqueName: \"kubernetes.io/projected/0a30c48a-65a8-4a6d-bf5d-106cb7ce567d-kube-api-access-h6kd4\") pod \"perses-operator-5446b9c989-26zgb\" (UID: \"0a30c48a-65a8-4a6d-bf5d-106cb7ce567d\") " pod="openshift-operators/perses-operator-5446b9c989-26zgb"
Dec 16 09:06:35 crc kubenswrapper[4823]: I1216 09:06:35.676319 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/0a30c48a-65a8-4a6d-bf5d-106cb7ce567d-openshift-service-ca\") pod \"perses-operator-5446b9c989-26zgb\" (UID: \"0a30c48a-65a8-4a6d-bf5d-106cb7ce567d\") " pod="openshift-operators/perses-operator-5446b9c989-26zgb"
Dec 16 09:06:35 crc kubenswrapper[4823]: I1216 09:06:35.740557 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-pkkpr"
Dec 16 09:06:35 crc kubenswrapper[4823]: I1216 09:06:35.778062 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/0a30c48a-65a8-4a6d-bf5d-106cb7ce567d-openshift-service-ca\") pod \"perses-operator-5446b9c989-26zgb\" (UID: \"0a30c48a-65a8-4a6d-bf5d-106cb7ce567d\") " pod="openshift-operators/perses-operator-5446b9c989-26zgb"
Dec 16 09:06:35 crc kubenswrapper[4823]: I1216 09:06:35.778172 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6kd4\" (UniqueName: \"kubernetes.io/projected/0a30c48a-65a8-4a6d-bf5d-106cb7ce567d-kube-api-access-h6kd4\") pod \"perses-operator-5446b9c989-26zgb\" (UID: \"0a30c48a-65a8-4a6d-bf5d-106cb7ce567d\") " pod="openshift-operators/perses-operator-5446b9c989-26zgb"
Dec 16 09:06:35 crc kubenswrapper[4823]: I1216 09:06:35.779652 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/0a30c48a-65a8-4a6d-bf5d-106cb7ce567d-openshift-service-ca\") pod \"perses-operator-5446b9c989-26zgb\" (UID: \"0a30c48a-65a8-4a6d-bf5d-106cb7ce567d\") " pod="openshift-operators/perses-operator-5446b9c989-26zgb"
Dec 16 09:06:35 crc kubenswrapper[4823]: I1216 09:06:35.799153 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6kd4\" (UniqueName: \"kubernetes.io/projected/0a30c48a-65a8-4a6d-bf5d-106cb7ce567d-kube-api-access-h6kd4\") pod \"perses-operator-5446b9c989-26zgb\" (UID: \"0a30c48a-65a8-4a6d-bf5d-106cb7ce567d\") " pod="openshift-operators/perses-operator-5446b9c989-26zgb"
Dec 16 09:06:35 crc kubenswrapper[4823]: I1216 09:06:35.936765 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-26zgb"
Dec 16 09:06:36 crc kubenswrapper[4823]: I1216 09:06:36.552595 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-pkkpr"]
Dec 16 09:06:36 crc kubenswrapper[4823]: W1216 09:06:36.580517 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod742e0cec_7370_4a35_90b8_64b2da24c464.slice/crio-121d371217163bb8e80ef70caa318565292e6a53b09ff9a1c2f826d0f68783b0 WatchSource:0}: Error finding container 121d371217163bb8e80ef70caa318565292e6a53b09ff9a1c2f826d0f68783b0: Status 404 returned error can't find the container with id 121d371217163bb8e80ef70caa318565292e6a53b09ff9a1c2f826d0f68783b0
Dec 16 09:06:36 crc kubenswrapper[4823]: I1216 09:06:36.581830 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-84dcbb7858-k6gks"]
Dec 16 09:06:36 crc kubenswrapper[4823]: W1216 09:06:36.589882 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ac408b5_8185_47a6_bdd4_33cc8d6906f3.slice/crio-384c730b86ce5926233a5cbbc9dcffcffe1d7565ce9bb4e8a736cad9877b555d WatchSource:0}: Error finding container 384c730b86ce5926233a5cbbc9dcffcffe1d7565ce9bb4e8a736cad9877b555d: Status 404 returned error can't find the container with id 384c730b86ce5926233a5cbbc9dcffcffe1d7565ce9bb4e8a736cad9877b555d
Dec 16 09:06:36 crc kubenswrapper[4823]: I1216 09:06:36.602542 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-84dcbb7858-ln8wq"]
Dec 16 09:06:36 crc kubenswrapper[4823]: I1216 09:06:36.635208 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-flrxh"]
Dec 16 09:06:36 crc kubenswrapper[4823]: I1216 
09:06:36.666827 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-flrxh" event={"ID":"07fa5706-2a14-40f7-ac5a-cc229d35055d","Type":"ContainerStarted","Data":"cf15ef02d4447439fda7761db6676c3292be873dca6b60877d957609c879bb2d"} Dec 16 09:06:36 crc kubenswrapper[4823]: I1216 09:06:36.668966 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-84dcbb7858-ln8wq" event={"ID":"1ac408b5-8185-47a6-bdd4-33cc8d6906f3","Type":"ContainerStarted","Data":"384c730b86ce5926233a5cbbc9dcffcffe1d7565ce9bb4e8a736cad9877b555d"} Dec 16 09:06:36 crc kubenswrapper[4823]: I1216 09:06:36.703560 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-d8bb48f5d-pkkpr" event={"ID":"e554ec4f-2a7c-419b-9346-294c8026d503","Type":"ContainerStarted","Data":"e4e9f87cec359796a2b63395df8be6585f4da292a5c7a629f80b8edc00d1c1e7"} Dec 16 09:06:36 crc kubenswrapper[4823]: I1216 09:06:36.712634 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-84dcbb7858-k6gks" event={"ID":"742e0cec-7370-4a35-90b8-64b2da24c464","Type":"ContainerStarted","Data":"121d371217163bb8e80ef70caa318565292e6a53b09ff9a1c2f826d0f68783b0"} Dec 16 09:06:36 crc kubenswrapper[4823]: I1216 09:06:36.754093 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5446b9c989-26zgb"] Dec 16 09:06:36 crc kubenswrapper[4823]: W1216 09:06:36.766709 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a30c48a_65a8_4a6d_bf5d_106cb7ce567d.slice/crio-0d34bc4462c3b2761a90eb0556540d2fc72c8c1b76f911ec9e041876e8071da4 WatchSource:0}: Error finding container 0d34bc4462c3b2761a90eb0556540d2fc72c8c1b76f911ec9e041876e8071da4: Status 404 returned error can't find the container with id 
0d34bc4462c3b2761a90eb0556540d2fc72c8c1b76f911ec9e041876e8071da4 Dec 16 09:06:37 crc kubenswrapper[4823]: I1216 09:06:37.832531 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5446b9c989-26zgb" event={"ID":"0a30c48a-65a8-4a6d-bf5d-106cb7ce567d","Type":"ContainerStarted","Data":"0d34bc4462c3b2761a90eb0556540d2fc72c8c1b76f911ec9e041876e8071da4"} Dec 16 09:06:39 crc kubenswrapper[4823]: I1216 09:06:39.220985 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-2nvfp" Dec 16 09:06:39 crc kubenswrapper[4823]: I1216 09:06:39.221270 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-2nvfp" Dec 16 09:06:40 crc kubenswrapper[4823]: I1216 09:06:40.293957 4823 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-2nvfp" podUID="d4688585-bbc3-4739-854f-a17c034eda73" containerName="registry-server" probeResult="failure" output=< Dec 16 09:06:40 crc kubenswrapper[4823]: timeout: failed to connect service ":50051" within 1s Dec 16 09:06:40 crc kubenswrapper[4823]: > Dec 16 09:06:40 crc kubenswrapper[4823]: I1216 09:06:40.772311 4823 scope.go:117] "RemoveContainer" containerID="14e51af7fb5c2d7b7fdc9e1989841225a65614d883db6f8d5aea8aeb819bd04d" Dec 16 09:06:40 crc kubenswrapper[4823]: E1216 09:06:40.772993 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 09:06:49 crc kubenswrapper[4823]: I1216 09:06:49.337557 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-operators-2nvfp" Dec 16 09:06:49 crc kubenswrapper[4823]: I1216 09:06:49.401176 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-2nvfp" Dec 16 09:06:50 crc kubenswrapper[4823]: I1216 09:06:50.946902 4823 scope.go:117] "RemoveContainer" containerID="4fa8fdd48414303b01bcd9b870436b73c85f7817ac4a6e64bc68d5926bad3e05" Dec 16 09:06:51 crc kubenswrapper[4823]: I1216 09:06:51.138235 4823 scope.go:117] "RemoveContainer" containerID="c94a186f9ff0a8616322e194208655bbfeccfa8661b4b7871a9b562424b73faa" Dec 16 09:06:51 crc kubenswrapper[4823]: I1216 09:06:51.276590 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2nvfp"] Dec 16 09:06:51 crc kubenswrapper[4823]: I1216 09:06:51.276947 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-2nvfp" podUID="d4688585-bbc3-4739-854f-a17c034eda73" containerName="registry-server" containerID="cri-o://56b324c199707ae0bea733f9e59c7225b9eac92816bcca8da05a54c2e6c15c77" gracePeriod=2 Dec 16 09:06:51 crc kubenswrapper[4823]: I1216 09:06:51.754172 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2nvfp" Dec 16 09:06:51 crc kubenswrapper[4823]: I1216 09:06:51.774460 4823 scope.go:117] "RemoveContainer" containerID="14e51af7fb5c2d7b7fdc9e1989841225a65614d883db6f8d5aea8aeb819bd04d" Dec 16 09:06:51 crc kubenswrapper[4823]: E1216 09:06:51.774723 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 09:06:51 crc kubenswrapper[4823]: I1216 09:06:51.860311 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kbcss\" (UniqueName: \"kubernetes.io/projected/d4688585-bbc3-4739-854f-a17c034eda73-kube-api-access-kbcss\") pod \"d4688585-bbc3-4739-854f-a17c034eda73\" (UID: \"d4688585-bbc3-4739-854f-a17c034eda73\") " Dec 16 09:06:51 crc kubenswrapper[4823]: I1216 09:06:51.860480 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4688585-bbc3-4739-854f-a17c034eda73-utilities\") pod \"d4688585-bbc3-4739-854f-a17c034eda73\" (UID: \"d4688585-bbc3-4739-854f-a17c034eda73\") " Dec 16 09:06:51 crc kubenswrapper[4823]: I1216 09:06:51.860601 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4688585-bbc3-4739-854f-a17c034eda73-catalog-content\") pod \"d4688585-bbc3-4739-854f-a17c034eda73\" (UID: \"d4688585-bbc3-4739-854f-a17c034eda73\") " Dec 16 09:06:51 crc kubenswrapper[4823]: I1216 09:06:51.861316 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/d4688585-bbc3-4739-854f-a17c034eda73-utilities" (OuterVolumeSpecName: "utilities") pod "d4688585-bbc3-4739-854f-a17c034eda73" (UID: "d4688585-bbc3-4739-854f-a17c034eda73"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 09:06:51 crc kubenswrapper[4823]: I1216 09:06:51.866759 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4688585-bbc3-4739-854f-a17c034eda73-kube-api-access-kbcss" (OuterVolumeSpecName: "kube-api-access-kbcss") pod "d4688585-bbc3-4739-854f-a17c034eda73" (UID: "d4688585-bbc3-4739-854f-a17c034eda73"). InnerVolumeSpecName "kube-api-access-kbcss". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:06:51 crc kubenswrapper[4823]: I1216 09:06:51.963466 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kbcss\" (UniqueName: \"kubernetes.io/projected/d4688585-bbc3-4739-854f-a17c034eda73-kube-api-access-kbcss\") on node \"crc\" DevicePath \"\"" Dec 16 09:06:51 crc kubenswrapper[4823]: I1216 09:06:51.963499 4823 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4688585-bbc3-4739-854f-a17c034eda73-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 09:06:51 crc kubenswrapper[4823]: I1216 09:06:51.990441 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4688585-bbc3-4739-854f-a17c034eda73-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d4688585-bbc3-4739-854f-a17c034eda73" (UID: "d4688585-bbc3-4739-854f-a17c034eda73"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 09:06:52 crc kubenswrapper[4823]: I1216 09:06:52.021775 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-flrxh" event={"ID":"07fa5706-2a14-40f7-ac5a-cc229d35055d","Type":"ContainerStarted","Data":"5d0548037af591a904bac45f2f67a386b70ac59736a4e50bab9394f4e6bcf308"} Dec 16 09:06:52 crc kubenswrapper[4823]: I1216 09:06:52.024962 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-84dcbb7858-ln8wq" event={"ID":"1ac408b5-8185-47a6-bdd4-33cc8d6906f3","Type":"ContainerStarted","Data":"50c1bb0a0fb293f2c568d908a1ed71a96fd2524c15c546a1b3a3775cf4047e1d"} Dec 16 09:06:52 crc kubenswrapper[4823]: I1216 09:06:52.026615 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-84dcbb7858-k6gks" event={"ID":"742e0cec-7370-4a35-90b8-64b2da24c464","Type":"ContainerStarted","Data":"2473625ba9c79a6335b791e495cca397e7c5a2330fc730d135706633db7b29bb"} Dec 16 09:06:52 crc kubenswrapper[4823]: I1216 09:06:52.028509 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5446b9c989-26zgb" event={"ID":"0a30c48a-65a8-4a6d-bf5d-106cb7ce567d","Type":"ContainerStarted","Data":"ddf5fb0400316c081b2ea0277e9b0124fec7113506ef0b5785c9074780c01d40"} Dec 16 09:06:52 crc kubenswrapper[4823]: I1216 09:06:52.029201 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5446b9c989-26zgb" Dec 16 09:06:52 crc kubenswrapper[4823]: I1216 09:06:52.037074 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-d8bb48f5d-pkkpr" event={"ID":"e554ec4f-2a7c-419b-9346-294c8026d503","Type":"ContainerStarted","Data":"7e0943b6f50ad5b8f24df4ec4fed7326fa6adbaff4459da26e587880163c7fb9"} Dec 16 09:06:52 crc kubenswrapper[4823]: 
I1216 09:06:52.038261 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-d8bb48f5d-pkkpr" Dec 16 09:06:52 crc kubenswrapper[4823]: I1216 09:06:52.044512 4823 generic.go:334] "Generic (PLEG): container finished" podID="d4688585-bbc3-4739-854f-a17c034eda73" containerID="56b324c199707ae0bea733f9e59c7225b9eac92816bcca8da05a54c2e6c15c77" exitCode=0 Dec 16 09:06:52 crc kubenswrapper[4823]: I1216 09:06:52.044563 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2nvfp" event={"ID":"d4688585-bbc3-4739-854f-a17c034eda73","Type":"ContainerDied","Data":"56b324c199707ae0bea733f9e59c7225b9eac92816bcca8da05a54c2e6c15c77"} Dec 16 09:06:52 crc kubenswrapper[4823]: I1216 09:06:52.044590 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2nvfp" event={"ID":"d4688585-bbc3-4739-854f-a17c034eda73","Type":"ContainerDied","Data":"361983f487ea0140be83b6098d9f42ada81783d94c7762d44a8ad829a824faa1"} Dec 16 09:06:52 crc kubenswrapper[4823]: I1216 09:06:52.044609 4823 scope.go:117] "RemoveContainer" containerID="56b324c199707ae0bea733f9e59c7225b9eac92816bcca8da05a54c2e6c15c77" Dec 16 09:06:52 crc kubenswrapper[4823]: I1216 09:06:52.044741 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2nvfp" Dec 16 09:06:52 crc kubenswrapper[4823]: I1216 09:06:52.047296 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-d8bb48f5d-pkkpr" Dec 16 09:06:52 crc kubenswrapper[4823]: I1216 09:06:52.065410 4823 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4688585-bbc3-4739-854f-a17c034eda73-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 09:06:52 crc kubenswrapper[4823]: I1216 09:06:52.068888 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-flrxh" podStartSLOduration=3.887279827 podStartE2EDuration="18.068861093s" podCreationTimestamp="2025-12-16 09:06:34 +0000 UTC" firstStartedPulling="2025-12-16 09:06:36.602718413 +0000 UTC m=+7875.091284536" lastFinishedPulling="2025-12-16 09:06:50.784299659 +0000 UTC m=+7889.272865802" observedRunningTime="2025-12-16 09:06:52.063453793 +0000 UTC m=+7890.552019926" watchObservedRunningTime="2025-12-16 09:06:52.068861093 +0000 UTC m=+7890.557427206" Dec 16 09:06:52 crc kubenswrapper[4823]: I1216 09:06:52.085801 4823 scope.go:117] "RemoveContainer" containerID="96fc380364993e481a4a5ea522add6b0cfa50bf3fd3e9eda8b30c8ac765b4ce2" Dec 16 09:06:52 crc kubenswrapper[4823]: I1216 09:06:52.102633 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-d8bb48f5d-pkkpr" podStartSLOduration=2.8828189159999997 podStartE2EDuration="17.102607429s" podCreationTimestamp="2025-12-16 09:06:35 +0000 UTC" firstStartedPulling="2025-12-16 09:06:36.564808026 +0000 UTC m=+7875.053374159" lastFinishedPulling="2025-12-16 09:06:50.784596549 +0000 UTC m=+7889.273162672" observedRunningTime="2025-12-16 09:06:52.088775306 +0000 UTC m=+7890.577341429" watchObservedRunningTime="2025-12-16 
09:06:52.102607429 +0000 UTC m=+7890.591173552" Dec 16 09:06:52 crc kubenswrapper[4823]: I1216 09:06:52.130234 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-84dcbb7858-ln8wq" podStartSLOduration=2.94492527 podStartE2EDuration="17.130208103s" podCreationTimestamp="2025-12-16 09:06:35 +0000 UTC" firstStartedPulling="2025-12-16 09:06:36.601981929 +0000 UTC m=+7875.090548072" lastFinishedPulling="2025-12-16 09:06:50.787264782 +0000 UTC m=+7889.275830905" observedRunningTime="2025-12-16 09:06:52.112661364 +0000 UTC m=+7890.601227487" watchObservedRunningTime="2025-12-16 09:06:52.130208103 +0000 UTC m=+7890.618774226" Dec 16 09:06:52 crc kubenswrapper[4823]: I1216 09:06:52.168971 4823 scope.go:117] "RemoveContainer" containerID="b55baad15540130c641c83961311a83bb6a05d8e7ff4b31caf88ac39c0c1eb4a" Dec 16 09:06:52 crc kubenswrapper[4823]: I1216 09:06:52.195557 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5446b9c989-26zgb" podStartSLOduration=3.190196167 podStartE2EDuration="17.195535198s" podCreationTimestamp="2025-12-16 09:06:35 +0000 UTC" firstStartedPulling="2025-12-16 09:06:36.778973509 +0000 UTC m=+7875.267539632" lastFinishedPulling="2025-12-16 09:06:50.78431254 +0000 UTC m=+7889.272878663" observedRunningTime="2025-12-16 09:06:52.153957656 +0000 UTC m=+7890.642523789" watchObservedRunningTime="2025-12-16 09:06:52.195535198 +0000 UTC m=+7890.684101321" Dec 16 09:06:52 crc kubenswrapper[4823]: I1216 09:06:52.200947 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-84dcbb7858-k6gks" podStartSLOduration=3.005536237 podStartE2EDuration="17.200932996s" podCreationTimestamp="2025-12-16 09:06:35 +0000 UTC" firstStartedPulling="2025-12-16 09:06:36.583402588 +0000 UTC m=+7875.071968711" lastFinishedPulling="2025-12-16 
09:06:50.778799347 +0000 UTC m=+7889.267365470" observedRunningTime="2025-12-16 09:06:52.176838273 +0000 UTC m=+7890.665404396" watchObservedRunningTime="2025-12-16 09:06:52.200932996 +0000 UTC m=+7890.689499119" Dec 16 09:06:52 crc kubenswrapper[4823]: I1216 09:06:52.215635 4823 scope.go:117] "RemoveContainer" containerID="56b324c199707ae0bea733f9e59c7225b9eac92816bcca8da05a54c2e6c15c77" Dec 16 09:06:52 crc kubenswrapper[4823]: E1216 09:06:52.220554 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56b324c199707ae0bea733f9e59c7225b9eac92816bcca8da05a54c2e6c15c77\": container with ID starting with 56b324c199707ae0bea733f9e59c7225b9eac92816bcca8da05a54c2e6c15c77 not found: ID does not exist" containerID="56b324c199707ae0bea733f9e59c7225b9eac92816bcca8da05a54c2e6c15c77" Dec 16 09:06:52 crc kubenswrapper[4823]: I1216 09:06:52.220607 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56b324c199707ae0bea733f9e59c7225b9eac92816bcca8da05a54c2e6c15c77"} err="failed to get container status \"56b324c199707ae0bea733f9e59c7225b9eac92816bcca8da05a54c2e6c15c77\": rpc error: code = NotFound desc = could not find container \"56b324c199707ae0bea733f9e59c7225b9eac92816bcca8da05a54c2e6c15c77\": container with ID starting with 56b324c199707ae0bea733f9e59c7225b9eac92816bcca8da05a54c2e6c15c77 not found: ID does not exist" Dec 16 09:06:52 crc kubenswrapper[4823]: I1216 09:06:52.220636 4823 scope.go:117] "RemoveContainer" containerID="96fc380364993e481a4a5ea522add6b0cfa50bf3fd3e9eda8b30c8ac765b4ce2" Dec 16 09:06:52 crc kubenswrapper[4823]: E1216 09:06:52.223958 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96fc380364993e481a4a5ea522add6b0cfa50bf3fd3e9eda8b30c8ac765b4ce2\": container with ID starting with 96fc380364993e481a4a5ea522add6b0cfa50bf3fd3e9eda8b30c8ac765b4ce2 not found: ID 
does not exist" containerID="96fc380364993e481a4a5ea522add6b0cfa50bf3fd3e9eda8b30c8ac765b4ce2" Dec 16 09:06:52 crc kubenswrapper[4823]: I1216 09:06:52.224004 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96fc380364993e481a4a5ea522add6b0cfa50bf3fd3e9eda8b30c8ac765b4ce2"} err="failed to get container status \"96fc380364993e481a4a5ea522add6b0cfa50bf3fd3e9eda8b30c8ac765b4ce2\": rpc error: code = NotFound desc = could not find container \"96fc380364993e481a4a5ea522add6b0cfa50bf3fd3e9eda8b30c8ac765b4ce2\": container with ID starting with 96fc380364993e481a4a5ea522add6b0cfa50bf3fd3e9eda8b30c8ac765b4ce2 not found: ID does not exist" Dec 16 09:06:52 crc kubenswrapper[4823]: I1216 09:06:52.224051 4823 scope.go:117] "RemoveContainer" containerID="b55baad15540130c641c83961311a83bb6a05d8e7ff4b31caf88ac39c0c1eb4a" Dec 16 09:06:52 crc kubenswrapper[4823]: I1216 09:06:52.224097 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2nvfp"] Dec 16 09:06:52 crc kubenswrapper[4823]: E1216 09:06:52.227351 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b55baad15540130c641c83961311a83bb6a05d8e7ff4b31caf88ac39c0c1eb4a\": container with ID starting with b55baad15540130c641c83961311a83bb6a05d8e7ff4b31caf88ac39c0c1eb4a not found: ID does not exist" containerID="b55baad15540130c641c83961311a83bb6a05d8e7ff4b31caf88ac39c0c1eb4a" Dec 16 09:06:52 crc kubenswrapper[4823]: I1216 09:06:52.227408 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b55baad15540130c641c83961311a83bb6a05d8e7ff4b31caf88ac39c0c1eb4a"} err="failed to get container status \"b55baad15540130c641c83961311a83bb6a05d8e7ff4b31caf88ac39c0c1eb4a\": rpc error: code = NotFound desc = could not find container \"b55baad15540130c641c83961311a83bb6a05d8e7ff4b31caf88ac39c0c1eb4a\": container with ID starting with 
b55baad15540130c641c83961311a83bb6a05d8e7ff4b31caf88ac39c0c1eb4a not found: ID does not exist" Dec 16 09:06:52 crc kubenswrapper[4823]: I1216 09:06:52.230302 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-2nvfp"] Dec 16 09:06:53 crc kubenswrapper[4823]: I1216 09:06:53.783244 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4688585-bbc3-4739-854f-a17c034eda73" path="/var/lib/kubelet/pods/d4688585-bbc3-4739-854f-a17c034eda73/volumes" Dec 16 09:07:02 crc kubenswrapper[4823]: I1216 09:07:02.772001 4823 scope.go:117] "RemoveContainer" containerID="14e51af7fb5c2d7b7fdc9e1989841225a65614d883db6f8d5aea8aeb819bd04d" Dec 16 09:07:02 crc kubenswrapper[4823]: E1216 09:07:02.772743 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 09:07:05 crc kubenswrapper[4823]: I1216 09:07:05.939761 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5446b9c989-26zgb" Dec 16 09:07:08 crc kubenswrapper[4823]: I1216 09:07:08.300936 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Dec 16 09:07:08 crc kubenswrapper[4823]: I1216 09:07:08.302202 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="de346601-4f73-4c1f-b1ce-900f0a74e925" containerName="openstackclient" containerID="cri-o://ba48e7ea9424f0cf0ee60830d8ced7eb24b70b2b28ac68f93e9f399544fc598c" gracePeriod=2 Dec 16 09:07:08 crc kubenswrapper[4823]: I1216 09:07:08.315095 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/openstackclient"] Dec 16 09:07:08 crc kubenswrapper[4823]: I1216 09:07:08.423296 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Dec 16 09:07:08 crc kubenswrapper[4823]: E1216 09:07:08.424197 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4688585-bbc3-4739-854f-a17c034eda73" containerName="extract-content" Dec 16 09:07:08 crc kubenswrapper[4823]: I1216 09:07:08.424221 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4688585-bbc3-4739-854f-a17c034eda73" containerName="extract-content" Dec 16 09:07:08 crc kubenswrapper[4823]: E1216 09:07:08.424245 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4688585-bbc3-4739-854f-a17c034eda73" containerName="extract-utilities" Dec 16 09:07:08 crc kubenswrapper[4823]: I1216 09:07:08.424261 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4688585-bbc3-4739-854f-a17c034eda73" containerName="extract-utilities" Dec 16 09:07:08 crc kubenswrapper[4823]: E1216 09:07:08.424326 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4688585-bbc3-4739-854f-a17c034eda73" containerName="registry-server" Dec 16 09:07:08 crc kubenswrapper[4823]: I1216 09:07:08.424339 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4688585-bbc3-4739-854f-a17c034eda73" containerName="registry-server" Dec 16 09:07:08 crc kubenswrapper[4823]: E1216 09:07:08.424383 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de346601-4f73-4c1f-b1ce-900f0a74e925" containerName="openstackclient" Dec 16 09:07:08 crc kubenswrapper[4823]: I1216 09:07:08.424399 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="de346601-4f73-4c1f-b1ce-900f0a74e925" containerName="openstackclient" Dec 16 09:07:08 crc kubenswrapper[4823]: I1216 09:07:08.425070 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4688585-bbc3-4739-854f-a17c034eda73" containerName="registry-server" Dec 16 09:07:08 crc 
kubenswrapper[4823]: I1216 09:07:08.425118 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="de346601-4f73-4c1f-b1ce-900f0a74e925" containerName="openstackclient" Dec 16 09:07:08 crc kubenswrapper[4823]: I1216 09:07:08.440357 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 16 09:07:08 crc kubenswrapper[4823]: I1216 09:07:08.504585 4823 status_manager.go:875] "Failed to update status for pod" pod="openstack/openstackclient" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c315d787-1569-4aa4-8329-4749e972bd7b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T09:07:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T09:07:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T09:07:08Z\\\",\\\"message\\\":\\\"containers with unready status: [openstackclient]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T09:07:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[openstackclient]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.rdoproject.org/podified-antelope-centos9/openstack-openstackclient:c3a837a7c939c44c9106d2b2c7c72015\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"openstackclient\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/home/cloud-admin/.config/openstack/clouds.yaml\\\",\\\"name\\\":\\\"openstack-config\\\"},{\\\"mountPath\\\":\\\"/home/cloud-admin/.config/openstack/secure.yaml\\\",\\\"name\\\":\\\"openstack-config-secret\\\"},{\\\"mountPath\\\":\\\"/home/cloud-admin/cloudrc\\\",\\\"name\\\":\\\"openstack-config-secret\\\"},{\\\"mountPath\\\":\\\"/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem\\\",\\\"name\\\":\\\"combined-ca-bundle\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zr94t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T09:07:08Z\\\"}}\" for pod \"openstack\"/\"openstackclient\": pods \"openstackclient\" not found" Dec 16 09:07:08 crc kubenswrapper[4823]: I1216 09:07:08.507377 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 16 09:07:08 crc kubenswrapper[4823]: I1216 09:07:08.526387 4823 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="de346601-4f73-4c1f-b1ce-900f0a74e925" podUID="e602b9c3-9c85-4461-9534-e76d4ad4e929" Dec 16 09:07:08 crc kubenswrapper[4823]: I1216 09:07:08.530776 4823 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zr94t\" (UniqueName: \"kubernetes.io/projected/c315d787-1569-4aa4-8329-4749e972bd7b-kube-api-access-zr94t\") pod \"openstackclient\" (UID: \"c315d787-1569-4aa4-8329-4749e972bd7b\") " pod="openstack/openstackclient" Dec 16 09:07:08 crc kubenswrapper[4823]: I1216 09:07:08.530954 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c315d787-1569-4aa4-8329-4749e972bd7b-combined-ca-bundle\") pod \"openstackclient\" (UID: \"c315d787-1569-4aa4-8329-4749e972bd7b\") " pod="openstack/openstackclient" Dec 16 09:07:08 crc kubenswrapper[4823]: I1216 09:07:08.531005 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c315d787-1569-4aa4-8329-4749e972bd7b-openstack-config\") pod \"openstackclient\" (UID: \"c315d787-1569-4aa4-8329-4749e972bd7b\") " pod="openstack/openstackclient" Dec 16 09:07:08 crc kubenswrapper[4823]: I1216 09:07:08.531094 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c315d787-1569-4aa4-8329-4749e972bd7b-openstack-config-secret\") pod \"openstackclient\" (UID: \"c315d787-1569-4aa4-8329-4749e972bd7b\") " pod="openstack/openstackclient" Dec 16 09:07:08 crc kubenswrapper[4823]: I1216 09:07:08.545108 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Dec 16 09:07:08 crc kubenswrapper[4823]: E1216 09:07:08.546054 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle kube-api-access-zr94t openstack-config openstack-config-secret], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/openstackclient" podUID="c315d787-1569-4aa4-8329-4749e972bd7b" 
Dec 16 09:07:08 crc kubenswrapper[4823]: I1216 09:07:08.571765 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Dec 16 09:07:08 crc kubenswrapper[4823]: I1216 09:07:08.588003 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Dec 16 09:07:08 crc kubenswrapper[4823]: I1216 09:07:08.589915 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 16 09:07:08 crc kubenswrapper[4823]: I1216 09:07:08.600870 4823 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="c315d787-1569-4aa4-8329-4749e972bd7b" podUID="e602b9c3-9c85-4461-9534-e76d4ad4e929" Dec 16 09:07:08 crc kubenswrapper[4823]: I1216 09:07:08.616568 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 16 09:07:08 crc kubenswrapper[4823]: I1216 09:07:08.641430 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c315d787-1569-4aa4-8329-4749e972bd7b-combined-ca-bundle\") pod \"openstackclient\" (UID: \"c315d787-1569-4aa4-8329-4749e972bd7b\") " pod="openstack/openstackclient" Dec 16 09:07:08 crc kubenswrapper[4823]: I1216 09:07:08.642005 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c315d787-1569-4aa4-8329-4749e972bd7b-openstack-config\") pod \"openstackclient\" (UID: \"c315d787-1569-4aa4-8329-4749e972bd7b\") " pod="openstack/openstackclient" Dec 16 09:07:08 crc kubenswrapper[4823]: I1216 09:07:08.642204 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c315d787-1569-4aa4-8329-4749e972bd7b-openstack-config-secret\") pod \"openstackclient\" (UID: \"c315d787-1569-4aa4-8329-4749e972bd7b\") " 
pod="openstack/openstackclient" Dec 16 09:07:08 crc kubenswrapper[4823]: I1216 09:07:08.642369 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zr94t\" (UniqueName: \"kubernetes.io/projected/c315d787-1569-4aa4-8329-4749e972bd7b-kube-api-access-zr94t\") pod \"openstackclient\" (UID: \"c315d787-1569-4aa4-8329-4749e972bd7b\") " pod="openstack/openstackclient" Dec 16 09:07:08 crc kubenswrapper[4823]: I1216 09:07:08.648534 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c315d787-1569-4aa4-8329-4749e972bd7b-openstack-config\") pod \"openstackclient\" (UID: \"c315d787-1569-4aa4-8329-4749e972bd7b\") " pod="openstack/openstackclient" Dec 16 09:07:08 crc kubenswrapper[4823]: E1216 09:07:08.651856 4823 projected.go:194] Error preparing data for projected volume kube-api-access-zr94t for pod openstack/openstackclient: failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: the UID in the bound object reference (c315d787-1569-4aa4-8329-4749e972bd7b) does not match the UID in record. The object might have been deleted and then recreated Dec 16 09:07:08 crc kubenswrapper[4823]: E1216 09:07:08.651975 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c315d787-1569-4aa4-8329-4749e972bd7b-kube-api-access-zr94t podName:c315d787-1569-4aa4-8329-4749e972bd7b nodeName:}" failed. No retries permitted until 2025-12-16 09:07:09.151934829 +0000 UTC m=+7907.640500952 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-zr94t" (UniqueName: "kubernetes.io/projected/c315d787-1569-4aa4-8329-4749e972bd7b-kube-api-access-zr94t") pod "openstackclient" (UID: "c315d787-1569-4aa4-8329-4749e972bd7b") : failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: the UID in the bound object reference (c315d787-1569-4aa4-8329-4749e972bd7b) does not match the UID in record. The object might have been deleted and then recreated Dec 16 09:07:08 crc kubenswrapper[4823]: I1216 09:07:08.654953 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c315d787-1569-4aa4-8329-4749e972bd7b-combined-ca-bundle\") pod \"openstackclient\" (UID: \"c315d787-1569-4aa4-8329-4749e972bd7b\") " pod="openstack/openstackclient" Dec 16 09:07:08 crc kubenswrapper[4823]: I1216 09:07:08.666116 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c315d787-1569-4aa4-8329-4749e972bd7b-openstack-config-secret\") pod \"openstackclient\" (UID: \"c315d787-1569-4aa4-8329-4749e972bd7b\") " pod="openstack/openstackclient" Dec 16 09:07:08 crc kubenswrapper[4823]: I1216 09:07:08.695510 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 16 09:07:08 crc kubenswrapper[4823]: I1216 09:07:08.697281 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 16 09:07:08 crc kubenswrapper[4823]: I1216 09:07:08.705084 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-69spg" Dec 16 09:07:08 crc kubenswrapper[4823]: I1216 09:07:08.708389 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 16 09:07:08 crc kubenswrapper[4823]: I1216 09:07:08.745940 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/e602b9c3-9c85-4461-9534-e76d4ad4e929-openstack-config-secret\") pod \"openstackclient\" (UID: \"e602b9c3-9c85-4461-9534-e76d4ad4e929\") " pod="openstack/openstackclient" Dec 16 09:07:08 crc kubenswrapper[4823]: I1216 09:07:08.745987 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e602b9c3-9c85-4461-9534-e76d4ad4e929-combined-ca-bundle\") pod \"openstackclient\" (UID: \"e602b9c3-9c85-4461-9534-e76d4ad4e929\") " pod="openstack/openstackclient" Dec 16 09:07:08 crc kubenswrapper[4823]: I1216 09:07:08.746093 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bdt8\" (UniqueName: \"kubernetes.io/projected/e602b9c3-9c85-4461-9534-e76d4ad4e929-kube-api-access-2bdt8\") pod \"openstackclient\" (UID: \"e602b9c3-9c85-4461-9534-e76d4ad4e929\") " pod="openstack/openstackclient" Dec 16 09:07:08 crc kubenswrapper[4823]: I1216 09:07:08.746145 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/e602b9c3-9c85-4461-9534-e76d4ad4e929-openstack-config\") pod \"openstackclient\" (UID: \"e602b9c3-9c85-4461-9534-e76d4ad4e929\") " pod="openstack/openstackclient" Dec 16 09:07:08 crc 
kubenswrapper[4823]: I1216 09:07:08.847935 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2bdt8\" (UniqueName: \"kubernetes.io/projected/e602b9c3-9c85-4461-9534-e76d4ad4e929-kube-api-access-2bdt8\") pod \"openstackclient\" (UID: \"e602b9c3-9c85-4461-9534-e76d4ad4e929\") " pod="openstack/openstackclient" Dec 16 09:07:08 crc kubenswrapper[4823]: I1216 09:07:08.848610 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8plm2\" (UniqueName: \"kubernetes.io/projected/931c8fa8-3d33-42d2-a505-9320bd5d3695-kube-api-access-8plm2\") pod \"kube-state-metrics-0\" (UID: \"931c8fa8-3d33-42d2-a505-9320bd5d3695\") " pod="openstack/kube-state-metrics-0" Dec 16 09:07:08 crc kubenswrapper[4823]: I1216 09:07:08.848777 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/e602b9c3-9c85-4461-9534-e76d4ad4e929-openstack-config\") pod \"openstackclient\" (UID: \"e602b9c3-9c85-4461-9534-e76d4ad4e929\") " pod="openstack/openstackclient" Dec 16 09:07:08 crc kubenswrapper[4823]: I1216 09:07:08.848982 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/e602b9c3-9c85-4461-9534-e76d4ad4e929-openstack-config-secret\") pod \"openstackclient\" (UID: \"e602b9c3-9c85-4461-9534-e76d4ad4e929\") " pod="openstack/openstackclient" Dec 16 09:07:08 crc kubenswrapper[4823]: I1216 09:07:08.849114 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e602b9c3-9c85-4461-9534-e76d4ad4e929-combined-ca-bundle\") pod \"openstackclient\" (UID: \"e602b9c3-9c85-4461-9534-e76d4ad4e929\") " pod="openstack/openstackclient" Dec 16 09:07:08 crc kubenswrapper[4823]: I1216 09:07:08.853057 4823 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/e602b9c3-9c85-4461-9534-e76d4ad4e929-openstack-config\") pod \"openstackclient\" (UID: \"e602b9c3-9c85-4461-9534-e76d4ad4e929\") " pod="openstack/openstackclient" Dec 16 09:07:08 crc kubenswrapper[4823]: I1216 09:07:08.860742 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/e602b9c3-9c85-4461-9534-e76d4ad4e929-openstack-config-secret\") pod \"openstackclient\" (UID: \"e602b9c3-9c85-4461-9534-e76d4ad4e929\") " pod="openstack/openstackclient" Dec 16 09:07:08 crc kubenswrapper[4823]: I1216 09:07:08.861598 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e602b9c3-9c85-4461-9534-e76d4ad4e929-combined-ca-bundle\") pod \"openstackclient\" (UID: \"e602b9c3-9c85-4461-9534-e76d4ad4e929\") " pod="openstack/openstackclient" Dec 16 09:07:08 crc kubenswrapper[4823]: I1216 09:07:08.879775 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bdt8\" (UniqueName: \"kubernetes.io/projected/e602b9c3-9c85-4461-9534-e76d4ad4e929-kube-api-access-2bdt8\") pod \"openstackclient\" (UID: \"e602b9c3-9c85-4461-9534-e76d4ad4e929\") " pod="openstack/openstackclient" Dec 16 09:07:08 crc kubenswrapper[4823]: I1216 09:07:08.927878 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 16 09:07:08 crc kubenswrapper[4823]: I1216 09:07:08.951275 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8plm2\" (UniqueName: \"kubernetes.io/projected/931c8fa8-3d33-42d2-a505-9320bd5d3695-kube-api-access-8plm2\") pod \"kube-state-metrics-0\" (UID: \"931c8fa8-3d33-42d2-a505-9320bd5d3695\") " pod="openstack/kube-state-metrics-0" Dec 16 09:07:09 crc kubenswrapper[4823]: I1216 09:07:08.998180 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8plm2\" (UniqueName: \"kubernetes.io/projected/931c8fa8-3d33-42d2-a505-9320bd5d3695-kube-api-access-8plm2\") pod \"kube-state-metrics-0\" (UID: \"931c8fa8-3d33-42d2-a505-9320bd5d3695\") " pod="openstack/kube-state-metrics-0" Dec 16 09:07:09 crc kubenswrapper[4823]: I1216 09:07:09.074898 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 16 09:07:09 crc kubenswrapper[4823]: I1216 09:07:09.156709 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zr94t\" (UniqueName: \"kubernetes.io/projected/c315d787-1569-4aa4-8329-4749e972bd7b-kube-api-access-zr94t\") pod \"openstackclient\" (UID: \"c315d787-1569-4aa4-8329-4749e972bd7b\") " pod="openstack/openstackclient" Dec 16 09:07:09 crc kubenswrapper[4823]: E1216 09:07:09.164969 4823 projected.go:194] Error preparing data for projected volume kube-api-access-zr94t for pod openstack/openstackclient: failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: the UID in the bound object reference (c315d787-1569-4aa4-8329-4749e972bd7b) does not match the UID in record. 
The object might have been deleted and then recreated Dec 16 09:07:09 crc kubenswrapper[4823]: E1216 09:07:09.165064 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c315d787-1569-4aa4-8329-4749e972bd7b-kube-api-access-zr94t podName:c315d787-1569-4aa4-8329-4749e972bd7b nodeName:}" failed. No retries permitted until 2025-12-16 09:07:10.165045258 +0000 UTC m=+7908.653611381 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-zr94t" (UniqueName: "kubernetes.io/projected/c315d787-1569-4aa4-8329-4749e972bd7b-kube-api-access-zr94t") pod "openstackclient" (UID: "c315d787-1569-4aa4-8329-4749e972bd7b") : failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: the UID in the bound object reference (c315d787-1569-4aa4-8329-4749e972bd7b) does not match the UID in record. The object might have been deleted and then recreated Dec 16 09:07:09 crc kubenswrapper[4823]: I1216 09:07:09.226748 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 16 09:07:09 crc kubenswrapper[4823]: I1216 09:07:09.242637 4823 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="c315d787-1569-4aa4-8329-4749e972bd7b" podUID="e602b9c3-9c85-4461-9534-e76d4ad4e929" Dec 16 09:07:09 crc kubenswrapper[4823]: I1216 09:07:09.246306 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 16 09:07:09 crc kubenswrapper[4823]: I1216 09:07:09.292574 4823 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="c315d787-1569-4aa4-8329-4749e972bd7b" podUID="e602b9c3-9c85-4461-9534-e76d4ad4e929" Dec 16 09:07:09 crc kubenswrapper[4823]: I1216 09:07:09.367727 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c315d787-1569-4aa4-8329-4749e972bd7b-openstack-config-secret\") pod \"c315d787-1569-4aa4-8329-4749e972bd7b\" (UID: \"c315d787-1569-4aa4-8329-4749e972bd7b\") " Dec 16 09:07:09 crc kubenswrapper[4823]: I1216 09:07:09.367862 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c315d787-1569-4aa4-8329-4749e972bd7b-openstack-config\") pod \"c315d787-1569-4aa4-8329-4749e972bd7b\" (UID: \"c315d787-1569-4aa4-8329-4749e972bd7b\") " Dec 16 09:07:09 crc kubenswrapper[4823]: I1216 09:07:09.367962 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c315d787-1569-4aa4-8329-4749e972bd7b-combined-ca-bundle\") pod \"c315d787-1569-4aa4-8329-4749e972bd7b\" (UID: \"c315d787-1569-4aa4-8329-4749e972bd7b\") " Dec 16 09:07:09 crc kubenswrapper[4823]: I1216 09:07:09.368709 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zr94t\" (UniqueName: \"kubernetes.io/projected/c315d787-1569-4aa4-8329-4749e972bd7b-kube-api-access-zr94t\") on node \"crc\" DevicePath \"\"" Dec 16 09:07:09 crc kubenswrapper[4823]: I1216 09:07:09.369963 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c315d787-1569-4aa4-8329-4749e972bd7b-openstack-config" (OuterVolumeSpecName: "openstack-config") pod 
"c315d787-1569-4aa4-8329-4749e972bd7b" (UID: "c315d787-1569-4aa4-8329-4749e972bd7b"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 09:07:09 crc kubenswrapper[4823]: I1216 09:07:09.411696 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c315d787-1569-4aa4-8329-4749e972bd7b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c315d787-1569-4aa4-8329-4749e972bd7b" (UID: "c315d787-1569-4aa4-8329-4749e972bd7b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:07:09 crc kubenswrapper[4823]: I1216 09:07:09.412576 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c315d787-1569-4aa4-8329-4749e972bd7b-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "c315d787-1569-4aa4-8329-4749e972bd7b" (UID: "c315d787-1569-4aa4-8329-4749e972bd7b"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:07:09 crc kubenswrapper[4823]: I1216 09:07:09.470569 4823 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c315d787-1569-4aa4-8329-4749e972bd7b-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Dec 16 09:07:09 crc kubenswrapper[4823]: I1216 09:07:09.470632 4823 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c315d787-1569-4aa4-8329-4749e972bd7b-openstack-config\") on node \"crc\" DevicePath \"\"" Dec 16 09:07:09 crc kubenswrapper[4823]: I1216 09:07:09.470646 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c315d787-1569-4aa4-8329-4749e972bd7b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 09:07:09 crc kubenswrapper[4823]: I1216 09:07:09.795241 4823 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-cell1-galera-0" podUID="4496b25e-2f39-453a-aa60-ffa74e9913c8" containerName="galera" probeResult="failure" output="command timed out" Dec 16 09:07:09 crc kubenswrapper[4823]: I1216 09:07:09.813407 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c315d787-1569-4aa4-8329-4749e972bd7b" path="/var/lib/kubelet/pods/c315d787-1569-4aa4-8329-4749e972bd7b/volumes" Dec 16 09:07:09 crc kubenswrapper[4823]: I1216 09:07:09.814606 4823 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-cell1-galera-0" podUID="4496b25e-2f39-453a-aa60-ffa74e9913c8" containerName="galera" probeResult="failure" output="command timed out" Dec 16 09:07:09 crc kubenswrapper[4823]: I1216 09:07:09.817722 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/alertmanager-metric-storage-0"] Dec 16 09:07:09 crc kubenswrapper[4823]: I1216 09:07:09.835932 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/alertmanager-metric-storage-0"] Dec 16 09:07:09 crc kubenswrapper[4823]: I1216 09:07:09.836072 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Dec 16 09:07:09 crc kubenswrapper[4823]: I1216 09:07:09.848819 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-alertmanager-dockercfg-z6s8h" Dec 16 09:07:09 crc kubenswrapper[4823]: I1216 09:07:09.849010 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-cluster-tls-config" Dec 16 09:07:09 crc kubenswrapper[4823]: I1216 09:07:09.849135 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-web-config" Dec 16 09:07:09 crc kubenswrapper[4823]: I1216 09:07:09.849236 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-tls-assets-0" Dec 16 09:07:09 crc kubenswrapper[4823]: I1216 09:07:09.849342 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-generated" Dec 16 09:07:10 crc kubenswrapper[4823]: I1216 09:07:10.018755 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqzsw\" (UniqueName: \"kubernetes.io/projected/6ebc4a0e-1b85-400b-bc10-5d216d7431fb-kube-api-access-fqzsw\") pod \"alertmanager-metric-storage-0\" (UID: \"6ebc4a0e-1b85-400b-bc10-5d216d7431fb\") " pod="openstack/alertmanager-metric-storage-0" Dec 16 09:07:10 crc kubenswrapper[4823]: I1216 09:07:10.019138 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/6ebc4a0e-1b85-400b-bc10-5d216d7431fb-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"6ebc4a0e-1b85-400b-bc10-5d216d7431fb\") " 
pod="openstack/alertmanager-metric-storage-0" Dec 16 09:07:10 crc kubenswrapper[4823]: I1216 09:07:10.019184 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6ebc4a0e-1b85-400b-bc10-5d216d7431fb-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"6ebc4a0e-1b85-400b-bc10-5d216d7431fb\") " pod="openstack/alertmanager-metric-storage-0" Dec 16 09:07:10 crc kubenswrapper[4823]: I1216 09:07:10.019218 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6ebc4a0e-1b85-400b-bc10-5d216d7431fb-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"6ebc4a0e-1b85-400b-bc10-5d216d7431fb\") " pod="openstack/alertmanager-metric-storage-0" Dec 16 09:07:10 crc kubenswrapper[4823]: I1216 09:07:10.019281 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6ebc4a0e-1b85-400b-bc10-5d216d7431fb-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"6ebc4a0e-1b85-400b-bc10-5d216d7431fb\") " pod="openstack/alertmanager-metric-storage-0" Dec 16 09:07:10 crc kubenswrapper[4823]: I1216 09:07:10.019310 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/6ebc4a0e-1b85-400b-bc10-5d216d7431fb-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"6ebc4a0e-1b85-400b-bc10-5d216d7431fb\") " pod="openstack/alertmanager-metric-storage-0" Dec 16 09:07:10 crc kubenswrapper[4823]: I1216 09:07:10.019353 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/6ebc4a0e-1b85-400b-bc10-5d216d7431fb-config-volume\") pod \"alertmanager-metric-storage-0\" 
(UID: \"6ebc4a0e-1b85-400b-bc10-5d216d7431fb\") " pod="openstack/alertmanager-metric-storage-0" Dec 16 09:07:10 crc kubenswrapper[4823]: I1216 09:07:10.067534 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 16 09:07:10 crc kubenswrapper[4823]: I1216 09:07:10.078303 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 16 09:07:10 crc kubenswrapper[4823]: I1216 09:07:10.083196 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Dec 16 09:07:10 crc kubenswrapper[4823]: I1216 09:07:10.083493 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Dec 16 09:07:10 crc kubenswrapper[4823]: I1216 09:07:10.083672 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Dec 16 09:07:10 crc kubenswrapper[4823]: I1216 09:07:10.083846 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Dec 16 09:07:10 crc kubenswrapper[4823]: I1216 09:07:10.089449 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Dec 16 09:07:10 crc kubenswrapper[4823]: I1216 09:07:10.089840 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-bwmvq" Dec 16 09:07:10 crc kubenswrapper[4823]: I1216 09:07:10.096850 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 16 09:07:10 crc kubenswrapper[4823]: I1216 09:07:10.121287 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6ebc4a0e-1b85-400b-bc10-5d216d7431fb-config-out\") pod \"alertmanager-metric-storage-0\" 
(UID: \"6ebc4a0e-1b85-400b-bc10-5d216d7431fb\") " pod="openstack/alertmanager-metric-storage-0" Dec 16 09:07:10 crc kubenswrapper[4823]: I1216 09:07:10.121359 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/6ebc4a0e-1b85-400b-bc10-5d216d7431fb-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"6ebc4a0e-1b85-400b-bc10-5d216d7431fb\") " pod="openstack/alertmanager-metric-storage-0" Dec 16 09:07:10 crc kubenswrapper[4823]: I1216 09:07:10.121423 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/6ebc4a0e-1b85-400b-bc10-5d216d7431fb-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"6ebc4a0e-1b85-400b-bc10-5d216d7431fb\") " pod="openstack/alertmanager-metric-storage-0" Dec 16 09:07:10 crc kubenswrapper[4823]: I1216 09:07:10.121475 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqzsw\" (UniqueName: \"kubernetes.io/projected/6ebc4a0e-1b85-400b-bc10-5d216d7431fb-kube-api-access-fqzsw\") pod \"alertmanager-metric-storage-0\" (UID: \"6ebc4a0e-1b85-400b-bc10-5d216d7431fb\") " pod="openstack/alertmanager-metric-storage-0" Dec 16 09:07:10 crc kubenswrapper[4823]: I1216 09:07:10.121553 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/6ebc4a0e-1b85-400b-bc10-5d216d7431fb-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"6ebc4a0e-1b85-400b-bc10-5d216d7431fb\") " pod="openstack/alertmanager-metric-storage-0" Dec 16 09:07:10 crc kubenswrapper[4823]: I1216 09:07:10.121590 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6ebc4a0e-1b85-400b-bc10-5d216d7431fb-web-config\") pod \"alertmanager-metric-storage-0\" (UID: 
\"6ebc4a0e-1b85-400b-bc10-5d216d7431fb\") " pod="openstack/alertmanager-metric-storage-0" Dec 16 09:07:10 crc kubenswrapper[4823]: I1216 09:07:10.121614 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6ebc4a0e-1b85-400b-bc10-5d216d7431fb-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"6ebc4a0e-1b85-400b-bc10-5d216d7431fb\") " pod="openstack/alertmanager-metric-storage-0" Dec 16 09:07:10 crc kubenswrapper[4823]: I1216 09:07:10.125394 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/6ebc4a0e-1b85-400b-bc10-5d216d7431fb-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"6ebc4a0e-1b85-400b-bc10-5d216d7431fb\") " pod="openstack/alertmanager-metric-storage-0" Dec 16 09:07:10 crc kubenswrapper[4823]: I1216 09:07:10.127496 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6ebc4a0e-1b85-400b-bc10-5d216d7431fb-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"6ebc4a0e-1b85-400b-bc10-5d216d7431fb\") " pod="openstack/alertmanager-metric-storage-0" Dec 16 09:07:10 crc kubenswrapper[4823]: I1216 09:07:10.128431 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6ebc4a0e-1b85-400b-bc10-5d216d7431fb-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"6ebc4a0e-1b85-400b-bc10-5d216d7431fb\") " pod="openstack/alertmanager-metric-storage-0" Dec 16 09:07:10 crc kubenswrapper[4823]: I1216 09:07:10.130558 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/6ebc4a0e-1b85-400b-bc10-5d216d7431fb-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"6ebc4a0e-1b85-400b-bc10-5d216d7431fb\") " 
pod="openstack/alertmanager-metric-storage-0" Dec 16 09:07:10 crc kubenswrapper[4823]: I1216 09:07:10.136171 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6ebc4a0e-1b85-400b-bc10-5d216d7431fb-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"6ebc4a0e-1b85-400b-bc10-5d216d7431fb\") " pod="openstack/alertmanager-metric-storage-0" Dec 16 09:07:10 crc kubenswrapper[4823]: I1216 09:07:10.145666 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/6ebc4a0e-1b85-400b-bc10-5d216d7431fb-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"6ebc4a0e-1b85-400b-bc10-5d216d7431fb\") " pod="openstack/alertmanager-metric-storage-0" Dec 16 09:07:10 crc kubenswrapper[4823]: I1216 09:07:10.172972 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqzsw\" (UniqueName: \"kubernetes.io/projected/6ebc4a0e-1b85-400b-bc10-5d216d7431fb-kube-api-access-fqzsw\") pod \"alertmanager-metric-storage-0\" (UID: \"6ebc4a0e-1b85-400b-bc10-5d216d7431fb\") " pod="openstack/alertmanager-metric-storage-0" Dec 16 09:07:10 crc kubenswrapper[4823]: I1216 09:07:10.176735 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Dec 16 09:07:10 crc kubenswrapper[4823]: I1216 09:07:10.212278 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 16 09:07:10 crc kubenswrapper[4823]: I1216 09:07:10.224450 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/5e99e87c-fc98-42e4-86db-045e839bc56c-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"5e99e87c-fc98-42e4-86db-045e839bc56c\") " pod="openstack/prometheus-metric-storage-0" Dec 16 09:07:10 crc kubenswrapper[4823]: I1216 09:07:10.224502 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5e99e87c-fc98-42e4-86db-045e839bc56c-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"5e99e87c-fc98-42e4-86db-045e839bc56c\") " pod="openstack/prometheus-metric-storage-0" Dec 16 09:07:10 crc kubenswrapper[4823]: I1216 09:07:10.224526 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5e99e87c-fc98-42e4-86db-045e839bc56c-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"5e99e87c-fc98-42e4-86db-045e839bc56c\") " pod="openstack/prometheus-metric-storage-0" Dec 16 09:07:10 crc kubenswrapper[4823]: I1216 09:07:10.224577 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5e99e87c-fc98-42e4-86db-045e839bc56c-config\") pod \"prometheus-metric-storage-0\" (UID: \"5e99e87c-fc98-42e4-86db-045e839bc56c\") " pod="openstack/prometheus-metric-storage-0" Dec 16 09:07:10 crc kubenswrapper[4823]: I1216 09:07:10.224609 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/5e99e87c-fc98-42e4-86db-045e839bc56c-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"5e99e87c-fc98-42e4-86db-045e839bc56c\") " pod="openstack/prometheus-metric-storage-0" Dec 16 09:07:10 crc kubenswrapper[4823]: I1216 09:07:10.224634 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4btc5\" (UniqueName: \"kubernetes.io/projected/5e99e87c-fc98-42e4-86db-045e839bc56c-kube-api-access-4btc5\") pod \"prometheus-metric-storage-0\" (UID: \"5e99e87c-fc98-42e4-86db-045e839bc56c\") " pod="openstack/prometheus-metric-storage-0" Dec 16 09:07:10 crc kubenswrapper[4823]: I1216 09:07:10.224682 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5e99e87c-fc98-42e4-86db-045e839bc56c-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"5e99e87c-fc98-42e4-86db-045e839bc56c\") " pod="openstack/prometheus-metric-storage-0" Dec 16 09:07:10 crc kubenswrapper[4823]: I1216 09:07:10.224842 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-b03e7a37-2b40-4cde-917a-642392a8adbd\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b03e7a37-2b40-4cde-917a-642392a8adbd\") pod \"prometheus-metric-storage-0\" (UID: \"5e99e87c-fc98-42e4-86db-045e839bc56c\") " pod="openstack/prometheus-metric-storage-0" Dec 16 09:07:10 crc kubenswrapper[4823]: I1216 09:07:10.248170 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 16 09:07:10 crc kubenswrapper[4823]: I1216 09:07:10.249354 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"e602b9c3-9c85-4461-9534-e76d4ad4e929","Type":"ContainerStarted","Data":"4ed667e9c58559123e0508ad1464ec788c6a92a6f93a3ee2e0e3ff2c733fdbf9"} Dec 16 09:07:10 crc kubenswrapper[4823]: I1216 09:07:10.256333 4823 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="c315d787-1569-4aa4-8329-4749e972bd7b" podUID="e602b9c3-9c85-4461-9534-e76d4ad4e929" Dec 16 09:07:10 crc kubenswrapper[4823]: W1216 09:07:10.304619 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod931c8fa8_3d33_42d2_a505_9320bd5d3695.slice/crio-1989210c092526f4d739b81ccc4b78a0d5216e98e7da4f328f13316f72f8b80a WatchSource:0}: Error finding container 1989210c092526f4d739b81ccc4b78a0d5216e98e7da4f328f13316f72f8b80a: Status 404 returned error can't find the container with id 1989210c092526f4d739b81ccc4b78a0d5216e98e7da4f328f13316f72f8b80a Dec 16 09:07:10 crc kubenswrapper[4823]: I1216 09:07:10.325244 4823 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="c315d787-1569-4aa4-8329-4749e972bd7b" podUID="e602b9c3-9c85-4461-9534-e76d4ad4e929" Dec 16 09:07:10 crc kubenswrapper[4823]: I1216 09:07:10.326403 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5e99e87c-fc98-42e4-86db-045e839bc56c-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"5e99e87c-fc98-42e4-86db-045e839bc56c\") " pod="openstack/prometheus-metric-storage-0" Dec 16 09:07:10 crc kubenswrapper[4823]: I1216 09:07:10.326483 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-b03e7a37-2b40-4cde-917a-642392a8adbd\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b03e7a37-2b40-4cde-917a-642392a8adbd\") pod \"prometheus-metric-storage-0\" (UID: \"5e99e87c-fc98-42e4-86db-045e839bc56c\") " pod="openstack/prometheus-metric-storage-0" Dec 16 09:07:10 crc kubenswrapper[4823]: I1216 09:07:10.326572 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/5e99e87c-fc98-42e4-86db-045e839bc56c-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"5e99e87c-fc98-42e4-86db-045e839bc56c\") " pod="openstack/prometheus-metric-storage-0" Dec 16 09:07:10 crc kubenswrapper[4823]: I1216 09:07:10.326594 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5e99e87c-fc98-42e4-86db-045e839bc56c-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"5e99e87c-fc98-42e4-86db-045e839bc56c\") " pod="openstack/prometheus-metric-storage-0" Dec 16 09:07:10 crc kubenswrapper[4823]: I1216 09:07:10.326612 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5e99e87c-fc98-42e4-86db-045e839bc56c-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"5e99e87c-fc98-42e4-86db-045e839bc56c\") " pod="openstack/prometheus-metric-storage-0" Dec 16 09:07:10 crc kubenswrapper[4823]: I1216 09:07:10.326644 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5e99e87c-fc98-42e4-86db-045e839bc56c-config\") pod \"prometheus-metric-storage-0\" (UID: \"5e99e87c-fc98-42e4-86db-045e839bc56c\") " pod="openstack/prometheus-metric-storage-0" Dec 16 09:07:10 crc kubenswrapper[4823]: I1216 09:07:10.326667 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/5e99e87c-fc98-42e4-86db-045e839bc56c-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"5e99e87c-fc98-42e4-86db-045e839bc56c\") " pod="openstack/prometheus-metric-storage-0" Dec 16 09:07:10 crc kubenswrapper[4823]: I1216 09:07:10.326688 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4btc5\" (UniqueName: \"kubernetes.io/projected/5e99e87c-fc98-42e4-86db-045e839bc56c-kube-api-access-4btc5\") pod \"prometheus-metric-storage-0\" (UID: \"5e99e87c-fc98-42e4-86db-045e839bc56c\") " pod="openstack/prometheus-metric-storage-0" Dec 16 09:07:10 crc kubenswrapper[4823]: I1216 09:07:10.328828 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/5e99e87c-fc98-42e4-86db-045e839bc56c-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"5e99e87c-fc98-42e4-86db-045e839bc56c\") " pod="openstack/prometheus-metric-storage-0" Dec 16 09:07:10 crc kubenswrapper[4823]: I1216 09:07:10.329648 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 16 09:07:10 crc kubenswrapper[4823]: I1216 09:07:10.344582 4823 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 16 09:07:10 crc kubenswrapper[4823]: I1216 09:07:10.344638 4823 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-b03e7a37-2b40-4cde-917a-642392a8adbd\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b03e7a37-2b40-4cde-917a-642392a8adbd\") pod \"prometheus-metric-storage-0\" (UID: \"5e99e87c-fc98-42e4-86db-045e839bc56c\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/cf701248a9eb9649176716528363fdeb229d40b796f898005d6f1efdc6d91879/globalmount\"" pod="openstack/prometheus-metric-storage-0" Dec 16 09:07:10 crc kubenswrapper[4823]: I1216 09:07:10.345414 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5e99e87c-fc98-42e4-86db-045e839bc56c-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"5e99e87c-fc98-42e4-86db-045e839bc56c\") " pod="openstack/prometheus-metric-storage-0" Dec 16 09:07:10 crc kubenswrapper[4823]: I1216 09:07:10.345749 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5e99e87c-fc98-42e4-86db-045e839bc56c-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"5e99e87c-fc98-42e4-86db-045e839bc56c\") " pod="openstack/prometheus-metric-storage-0" Dec 16 09:07:10 crc kubenswrapper[4823]: I1216 09:07:10.347049 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4btc5\" (UniqueName: \"kubernetes.io/projected/5e99e87c-fc98-42e4-86db-045e839bc56c-kube-api-access-4btc5\") pod \"prometheus-metric-storage-0\" (UID: \"5e99e87c-fc98-42e4-86db-045e839bc56c\") " pod="openstack/prometheus-metric-storage-0" Dec 16 09:07:10 crc kubenswrapper[4823]: I1216 09:07:10.348285 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/5e99e87c-fc98-42e4-86db-045e839bc56c-config\") pod 
\"prometheus-metric-storage-0\" (UID: \"5e99e87c-fc98-42e4-86db-045e839bc56c\") " pod="openstack/prometheus-metric-storage-0" Dec 16 09:07:10 crc kubenswrapper[4823]: I1216 09:07:10.358655 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5e99e87c-fc98-42e4-86db-045e839bc56c-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"5e99e87c-fc98-42e4-86db-045e839bc56c\") " pod="openstack/prometheus-metric-storage-0" Dec 16 09:07:10 crc kubenswrapper[4823]: I1216 09:07:10.389725 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/5e99e87c-fc98-42e4-86db-045e839bc56c-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"5e99e87c-fc98-42e4-86db-045e839bc56c\") " pod="openstack/prometheus-metric-storage-0" Dec 16 09:07:10 crc kubenswrapper[4823]: I1216 09:07:10.443588 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-b03e7a37-2b40-4cde-917a-642392a8adbd\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b03e7a37-2b40-4cde-917a-642392a8adbd\") pod \"prometheus-metric-storage-0\" (UID: \"5e99e87c-fc98-42e4-86db-045e839bc56c\") " pod="openstack/prometheus-metric-storage-0" Dec 16 09:07:10 crc kubenswrapper[4823]: I1216 09:07:10.701327 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 16 09:07:10 crc kubenswrapper[4823]: I1216 09:07:10.872474 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 16 09:07:10 crc kubenswrapper[4823]: I1216 09:07:10.950868 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de346601-4f73-4c1f-b1ce-900f0a74e925-combined-ca-bundle\") pod \"de346601-4f73-4c1f-b1ce-900f0a74e925\" (UID: \"de346601-4f73-4c1f-b1ce-900f0a74e925\") " Dec 16 09:07:10 crc kubenswrapper[4823]: I1216 09:07:10.951148 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/de346601-4f73-4c1f-b1ce-900f0a74e925-openstack-config-secret\") pod \"de346601-4f73-4c1f-b1ce-900f0a74e925\" (UID: \"de346601-4f73-4c1f-b1ce-900f0a74e925\") " Dec 16 09:07:10 crc kubenswrapper[4823]: I1216 09:07:10.951206 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbz67\" (UniqueName: \"kubernetes.io/projected/de346601-4f73-4c1f-b1ce-900f0a74e925-kube-api-access-dbz67\") pod \"de346601-4f73-4c1f-b1ce-900f0a74e925\" (UID: \"de346601-4f73-4c1f-b1ce-900f0a74e925\") " Dec 16 09:07:10 crc kubenswrapper[4823]: I1216 09:07:10.951247 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/de346601-4f73-4c1f-b1ce-900f0a74e925-openstack-config\") pod \"de346601-4f73-4c1f-b1ce-900f0a74e925\" (UID: \"de346601-4f73-4c1f-b1ce-900f0a74e925\") " Dec 16 09:07:10 crc kubenswrapper[4823]: I1216 09:07:10.966392 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de346601-4f73-4c1f-b1ce-900f0a74e925-kube-api-access-dbz67" (OuterVolumeSpecName: "kube-api-access-dbz67") pod "de346601-4f73-4c1f-b1ce-900f0a74e925" (UID: "de346601-4f73-4c1f-b1ce-900f0a74e925"). InnerVolumeSpecName "kube-api-access-dbz67". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:07:11 crc kubenswrapper[4823]: I1216 09:07:11.023572 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de346601-4f73-4c1f-b1ce-900f0a74e925-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "de346601-4f73-4c1f-b1ce-900f0a74e925" (UID: "de346601-4f73-4c1f-b1ce-900f0a74e925"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:07:11 crc kubenswrapper[4823]: I1216 09:07:11.032043 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de346601-4f73-4c1f-b1ce-900f0a74e925-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "de346601-4f73-4c1f-b1ce-900f0a74e925" (UID: "de346601-4f73-4c1f-b1ce-900f0a74e925"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 09:07:11 crc kubenswrapper[4823]: I1216 09:07:11.060419 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbz67\" (UniqueName: \"kubernetes.io/projected/de346601-4f73-4c1f-b1ce-900f0a74e925-kube-api-access-dbz67\") on node \"crc\" DevicePath \"\"" Dec 16 09:07:11 crc kubenswrapper[4823]: I1216 09:07:11.060472 4823 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/de346601-4f73-4c1f-b1ce-900f0a74e925-openstack-config\") on node \"crc\" DevicePath \"\"" Dec 16 09:07:11 crc kubenswrapper[4823]: I1216 09:07:11.060489 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de346601-4f73-4c1f-b1ce-900f0a74e925-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 09:07:11 crc kubenswrapper[4823]: I1216 09:07:11.069379 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Dec 16 09:07:11 crc kubenswrapper[4823]: I1216 
09:07:11.098230 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de346601-4f73-4c1f-b1ce-900f0a74e925-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "de346601-4f73-4c1f-b1ce-900f0a74e925" (UID: "de346601-4f73-4c1f-b1ce-900f0a74e925"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:07:11 crc kubenswrapper[4823]: I1216 09:07:11.098307 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-fjw2z"] Dec 16 09:07:11 crc kubenswrapper[4823]: I1216 09:07:11.115057 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-fjw2z"] Dec 16 09:07:11 crc kubenswrapper[4823]: I1216 09:07:11.162219 4823 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/de346601-4f73-4c1f-b1ce-900f0a74e925-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Dec 16 09:07:11 crc kubenswrapper[4823]: I1216 09:07:11.260788 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"931c8fa8-3d33-42d2-a505-9320bd5d3695","Type":"ContainerStarted","Data":"1989210c092526f4d739b81ccc4b78a0d5216e98e7da4f328f13316f72f8b80a"} Dec 16 09:07:11 crc kubenswrapper[4823]: I1216 09:07:11.264280 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"e602b9c3-9c85-4461-9534-e76d4ad4e929","Type":"ContainerStarted","Data":"ef1bad3dffac77432fb7aadbd412fd8f615c484cc6427eb2140c76aac2f3ab09"} Dec 16 09:07:11 crc kubenswrapper[4823]: I1216 09:07:11.266463 4823 generic.go:334] "Generic (PLEG): container finished" podID="de346601-4f73-4c1f-b1ce-900f0a74e925" containerID="ba48e7ea9424f0cf0ee60830d8ced7eb24b70b2b28ac68f93e9f399544fc598c" exitCode=137 Dec 16 09:07:11 crc kubenswrapper[4823]: I1216 09:07:11.266559 4823 scope.go:117] "RemoveContainer" 
containerID="ba48e7ea9424f0cf0ee60830d8ced7eb24b70b2b28ac68f93e9f399544fc598c" Dec 16 09:07:11 crc kubenswrapper[4823]: I1216 09:07:11.266596 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 16 09:07:11 crc kubenswrapper[4823]: I1216 09:07:11.268536 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"6ebc4a0e-1b85-400b-bc10-5d216d7431fb","Type":"ContainerStarted","Data":"39a463dafb4c1ac5e51aa691068f37f8448d522c184cdf99e0ef84345df7c8f0"} Dec 16 09:07:11 crc kubenswrapper[4823]: I1216 09:07:11.286350 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=3.286327939 podStartE2EDuration="3.286327939s" podCreationTimestamp="2025-12-16 09:07:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 09:07:11.281458826 +0000 UTC m=+7909.770024949" watchObservedRunningTime="2025-12-16 09:07:11.286327939 +0000 UTC m=+7909.774894062" Dec 16 09:07:11 crc kubenswrapper[4823]: I1216 09:07:11.292637 4823 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="de346601-4f73-4c1f-b1ce-900f0a74e925" podUID="e602b9c3-9c85-4461-9534-e76d4ad4e929" Dec 16 09:07:11 crc kubenswrapper[4823]: I1216 09:07:11.350386 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 16 09:07:11 crc kubenswrapper[4823]: I1216 09:07:11.437302 4823 scope.go:117] "RemoveContainer" containerID="ba48e7ea9424f0cf0ee60830d8ced7eb24b70b2b28ac68f93e9f399544fc598c" Dec 16 09:07:11 crc kubenswrapper[4823]: E1216 09:07:11.438136 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"ba48e7ea9424f0cf0ee60830d8ced7eb24b70b2b28ac68f93e9f399544fc598c\": container with ID starting with ba48e7ea9424f0cf0ee60830d8ced7eb24b70b2b28ac68f93e9f399544fc598c not found: ID does not exist" containerID="ba48e7ea9424f0cf0ee60830d8ced7eb24b70b2b28ac68f93e9f399544fc598c" Dec 16 09:07:11 crc kubenswrapper[4823]: I1216 09:07:11.438200 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba48e7ea9424f0cf0ee60830d8ced7eb24b70b2b28ac68f93e9f399544fc598c"} err="failed to get container status \"ba48e7ea9424f0cf0ee60830d8ced7eb24b70b2b28ac68f93e9f399544fc598c\": rpc error: code = NotFound desc = could not find container \"ba48e7ea9424f0cf0ee60830d8ced7eb24b70b2b28ac68f93e9f399544fc598c\": container with ID starting with ba48e7ea9424f0cf0ee60830d8ced7eb24b70b2b28ac68f93e9f399544fc598c not found: ID does not exist" Dec 16 09:07:11 crc kubenswrapper[4823]: W1216 09:07:11.443426 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e99e87c_fc98_42e4_86db_045e839bc56c.slice/crio-9c721b48f432562b88561f89175d61da1b499e43db0edc4dfce130fe93f07025 WatchSource:0}: Error finding container 9c721b48f432562b88561f89175d61da1b499e43db0edc4dfce130fe93f07025: Status 404 returned error can't find the container with id 9c721b48f432562b88561f89175d61da1b499e43db0edc4dfce130fe93f07025 Dec 16 09:07:11 crc kubenswrapper[4823]: I1216 09:07:11.787188 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07a89ba3-ef70-4a41-b6d4-47d8575ccbbb" path="/var/lib/kubelet/pods/07a89ba3-ef70-4a41-b6d4-47d8575ccbbb/volumes" Dec 16 09:07:11 crc kubenswrapper[4823]: I1216 09:07:11.788286 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de346601-4f73-4c1f-b1ce-900f0a74e925" path="/var/lib/kubelet/pods/de346601-4f73-4c1f-b1ce-900f0a74e925/volumes" Dec 16 09:07:12 crc kubenswrapper[4823]: I1216 09:07:12.282345 4823 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"5e99e87c-fc98-42e4-86db-045e839bc56c","Type":"ContainerStarted","Data":"9c721b48f432562b88561f89175d61da1b499e43db0edc4dfce130fe93f07025"} Dec 16 09:07:12 crc kubenswrapper[4823]: I1216 09:07:12.283677 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"931c8fa8-3d33-42d2-a505-9320bd5d3695","Type":"ContainerStarted","Data":"854537105ff5e271e8f51a7f717ce4928b3958183fae6adeab4fb337061e9d37"} Dec 16 09:07:12 crc kubenswrapper[4823]: I1216 09:07:12.310523 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=3.105827429 podStartE2EDuration="4.310496582s" podCreationTimestamp="2025-12-16 09:07:08 +0000 UTC" firstStartedPulling="2025-12-16 09:07:10.330201584 +0000 UTC m=+7908.818767707" lastFinishedPulling="2025-12-16 09:07:11.534870737 +0000 UTC m=+7910.023436860" observedRunningTime="2025-12-16 09:07:12.304936858 +0000 UTC m=+7910.793502981" watchObservedRunningTime="2025-12-16 09:07:12.310496582 +0000 UTC m=+7910.799062705" Dec 16 09:07:13 crc kubenswrapper[4823]: I1216 09:07:13.296792 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 16 09:07:16 crc kubenswrapper[4823]: I1216 09:07:16.771849 4823 scope.go:117] "RemoveContainer" containerID="14e51af7fb5c2d7b7fdc9e1989841225a65614d883db6f8d5aea8aeb819bd04d" Dec 16 09:07:16 crc kubenswrapper[4823]: E1216 09:07:16.772531 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 
16 09:07:17 crc kubenswrapper[4823]: I1216 09:07:17.336930 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"6ebc4a0e-1b85-400b-bc10-5d216d7431fb","Type":"ContainerStarted","Data":"6841fcbf9dc6db729a33a5b8eec379954b362ffe6e98be98570e1c486a7e1947"} Dec 16 09:07:17 crc kubenswrapper[4823]: I1216 09:07:17.339754 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"5e99e87c-fc98-42e4-86db-045e839bc56c","Type":"ContainerStarted","Data":"3c5539c8d3d73e21f17c90c61aa208171bb12cef7e68027753b1974b213655b6"} Dec 16 09:07:19 crc kubenswrapper[4823]: I1216 09:07:19.082013 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 16 09:07:23 crc kubenswrapper[4823]: I1216 09:07:23.420338 4823 generic.go:334] "Generic (PLEG): container finished" podID="5e99e87c-fc98-42e4-86db-045e839bc56c" containerID="3c5539c8d3d73e21f17c90c61aa208171bb12cef7e68027753b1974b213655b6" exitCode=0 Dec 16 09:07:23 crc kubenswrapper[4823]: I1216 09:07:23.421743 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"5e99e87c-fc98-42e4-86db-045e839bc56c","Type":"ContainerDied","Data":"3c5539c8d3d73e21f17c90c61aa208171bb12cef7e68027753b1974b213655b6"} Dec 16 09:07:24 crc kubenswrapper[4823]: I1216 09:07:24.431062 4823 generic.go:334] "Generic (PLEG): container finished" podID="6ebc4a0e-1b85-400b-bc10-5d216d7431fb" containerID="6841fcbf9dc6db729a33a5b8eec379954b362ffe6e98be98570e1c486a7e1947" exitCode=0 Dec 16 09:07:24 crc kubenswrapper[4823]: I1216 09:07:24.431184 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"6ebc4a0e-1b85-400b-bc10-5d216d7431fb","Type":"ContainerDied","Data":"6841fcbf9dc6db729a33a5b8eec379954b362ffe6e98be98570e1c486a7e1947"} Dec 16 09:07:30 crc kubenswrapper[4823]: I1216 09:07:30.771858 
4823 scope.go:117] "RemoveContainer" containerID="14e51af7fb5c2d7b7fdc9e1989841225a65614d883db6f8d5aea8aeb819bd04d" Dec 16 09:07:30 crc kubenswrapper[4823]: E1216 09:07:30.772542 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 09:07:31 crc kubenswrapper[4823]: I1216 09:07:31.509601 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"6ebc4a0e-1b85-400b-bc10-5d216d7431fb","Type":"ContainerStarted","Data":"6dcfd48a9401ea537616a16a619f9cf8397493228a632419e0b26b39624f0619"} Dec 16 09:07:31 crc kubenswrapper[4823]: I1216 09:07:31.513903 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"5e99e87c-fc98-42e4-86db-045e839bc56c","Type":"ContainerStarted","Data":"a631e0e9e3de34c173d5764af319e268563242630a3966480f70655a3f9520fc"} Dec 16 09:07:36 crc kubenswrapper[4823]: I1216 09:07:36.568800 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"5e99e87c-fc98-42e4-86db-045e839bc56c","Type":"ContainerStarted","Data":"6290fa9dd47ddcbe6af848dea0cd674ac46958cb32b97bcc1992c523f7b7398e"} Dec 16 09:07:36 crc kubenswrapper[4823]: I1216 09:07:36.572998 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"6ebc4a0e-1b85-400b-bc10-5d216d7431fb","Type":"ContainerStarted","Data":"96fcd04f838b04b2f8450fa0fddd75849e66aca34b0de130673596b80ab9b02e"} Dec 16 09:07:36 crc kubenswrapper[4823]: I1216 09:07:36.573352 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/alertmanager-metric-storage-0" Dec 16 09:07:36 crc kubenswrapper[4823]: I1216 09:07:36.576422 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/alertmanager-metric-storage-0" Dec 16 09:07:36 crc kubenswrapper[4823]: I1216 09:07:36.623679 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/alertmanager-metric-storage-0" podStartSLOduration=8.09511321 podStartE2EDuration="27.623646231s" podCreationTimestamp="2025-12-16 09:07:09 +0000 UTC" firstStartedPulling="2025-12-16 09:07:11.038317237 +0000 UTC m=+7909.526883360" lastFinishedPulling="2025-12-16 09:07:30.566850248 +0000 UTC m=+7929.055416381" observedRunningTime="2025-12-16 09:07:36.597018197 +0000 UTC m=+7935.085584360" watchObservedRunningTime="2025-12-16 09:07:36.623646231 +0000 UTC m=+7935.112212364" Dec 16 09:07:40 crc kubenswrapper[4823]: I1216 09:07:40.620864 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"5e99e87c-fc98-42e4-86db-045e839bc56c","Type":"ContainerStarted","Data":"ec6d9d169edfc2eef0447928d7201ecbc01c12bebdcc79558d83ae75bddc641d"} Dec 16 09:07:40 crc kubenswrapper[4823]: I1216 09:07:40.650077 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=4.7340450579999995 podStartE2EDuration="32.650054856s" podCreationTimestamp="2025-12-16 09:07:08 +0000 UTC" firstStartedPulling="2025-12-16 09:07:11.533619748 +0000 UTC m=+7910.022185871" lastFinishedPulling="2025-12-16 09:07:39.449629546 +0000 UTC m=+7937.938195669" observedRunningTime="2025-12-16 09:07:40.644536823 +0000 UTC m=+7939.133102946" watchObservedRunningTime="2025-12-16 09:07:40.650054856 +0000 UTC m=+7939.138620979" Dec 16 09:07:40 crc kubenswrapper[4823]: I1216 09:07:40.702403 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Dec 
16 09:07:40 crc kubenswrapper[4823]: I1216 09:07:40.702448 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Dec 16 09:07:40 crc kubenswrapper[4823]: I1216 09:07:40.705044 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Dec 16 09:07:41 crc kubenswrapper[4823]: I1216 09:07:41.633754 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Dec 16 09:07:43 crc kubenswrapper[4823]: I1216 09:07:43.073474 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Dec 16 09:07:43 crc kubenswrapper[4823]: I1216 09:07:43.074140 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="e602b9c3-9c85-4461-9534-e76d4ad4e929" containerName="openstackclient" containerID="cri-o://ef1bad3dffac77432fb7aadbd412fd8f615c484cc6427eb2140c76aac2f3ab09" gracePeriod=2 Dec 16 09:07:43 crc kubenswrapper[4823]: I1216 09:07:43.084227 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Dec 16 09:07:43 crc kubenswrapper[4823]: I1216 09:07:43.107583 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Dec 16 09:07:43 crc kubenswrapper[4823]: E1216 09:07:43.108322 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e602b9c3-9c85-4461-9534-e76d4ad4e929" containerName="openstackclient" Dec 16 09:07:43 crc kubenswrapper[4823]: I1216 09:07:43.108357 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="e602b9c3-9c85-4461-9534-e76d4ad4e929" containerName="openstackclient" Dec 16 09:07:43 crc kubenswrapper[4823]: I1216 09:07:43.108708 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="e602b9c3-9c85-4461-9534-e76d4ad4e929" containerName="openstackclient" Dec 16 09:07:43 crc kubenswrapper[4823]: I1216 09:07:43.109897 
4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 16 09:07:43 crc kubenswrapper[4823]: I1216 09:07:43.114343 4823 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="e602b9c3-9c85-4461-9534-e76d4ad4e929" podUID="6aac7bd9-5925-4c54-b747-57320a350ab9" Dec 16 09:07:43 crc kubenswrapper[4823]: I1216 09:07:43.125247 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 16 09:07:43 crc kubenswrapper[4823]: I1216 09:07:43.204997 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6aac7bd9-5925-4c54-b747-57320a350ab9-combined-ca-bundle\") pod \"openstackclient\" (UID: \"6aac7bd9-5925-4c54-b747-57320a350ab9\") " pod="openstack/openstackclient" Dec 16 09:07:43 crc kubenswrapper[4823]: I1216 09:07:43.205351 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6aac7bd9-5925-4c54-b747-57320a350ab9-openstack-config-secret\") pod \"openstackclient\" (UID: \"6aac7bd9-5925-4c54-b747-57320a350ab9\") " pod="openstack/openstackclient" Dec 16 09:07:43 crc kubenswrapper[4823]: I1216 09:07:43.205493 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpfd9\" (UniqueName: \"kubernetes.io/projected/6aac7bd9-5925-4c54-b747-57320a350ab9-kube-api-access-fpfd9\") pod \"openstackclient\" (UID: \"6aac7bd9-5925-4c54-b747-57320a350ab9\") " pod="openstack/openstackclient" Dec 16 09:07:43 crc kubenswrapper[4823]: I1216 09:07:43.205569 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/6aac7bd9-5925-4c54-b747-57320a350ab9-openstack-config\") 
pod \"openstackclient\" (UID: \"6aac7bd9-5925-4c54-b747-57320a350ab9\") " pod="openstack/openstackclient" Dec 16 09:07:43 crc kubenswrapper[4823]: I1216 09:07:43.307346 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6aac7bd9-5925-4c54-b747-57320a350ab9-openstack-config-secret\") pod \"openstackclient\" (UID: \"6aac7bd9-5925-4c54-b747-57320a350ab9\") " pod="openstack/openstackclient" Dec 16 09:07:43 crc kubenswrapper[4823]: I1216 09:07:43.307612 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fpfd9\" (UniqueName: \"kubernetes.io/projected/6aac7bd9-5925-4c54-b747-57320a350ab9-kube-api-access-fpfd9\") pod \"openstackclient\" (UID: \"6aac7bd9-5925-4c54-b747-57320a350ab9\") " pod="openstack/openstackclient" Dec 16 09:07:43 crc kubenswrapper[4823]: I1216 09:07:43.307729 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/6aac7bd9-5925-4c54-b747-57320a350ab9-openstack-config\") pod \"openstackclient\" (UID: \"6aac7bd9-5925-4c54-b747-57320a350ab9\") " pod="openstack/openstackclient" Dec 16 09:07:43 crc kubenswrapper[4823]: I1216 09:07:43.307861 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6aac7bd9-5925-4c54-b747-57320a350ab9-combined-ca-bundle\") pod \"openstackclient\" (UID: \"6aac7bd9-5925-4c54-b747-57320a350ab9\") " pod="openstack/openstackclient" Dec 16 09:07:43 crc kubenswrapper[4823]: I1216 09:07:43.308513 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/6aac7bd9-5925-4c54-b747-57320a350ab9-openstack-config\") pod \"openstackclient\" (UID: \"6aac7bd9-5925-4c54-b747-57320a350ab9\") " pod="openstack/openstackclient" Dec 16 09:07:43 crc kubenswrapper[4823]: 
I1216 09:07:43.313789 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6aac7bd9-5925-4c54-b747-57320a350ab9-openstack-config-secret\") pod \"openstackclient\" (UID: \"6aac7bd9-5925-4c54-b747-57320a350ab9\") " pod="openstack/openstackclient" Dec 16 09:07:43 crc kubenswrapper[4823]: I1216 09:07:43.326683 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6aac7bd9-5925-4c54-b747-57320a350ab9-combined-ca-bundle\") pod \"openstackclient\" (UID: \"6aac7bd9-5925-4c54-b747-57320a350ab9\") " pod="openstack/openstackclient" Dec 16 09:07:43 crc kubenswrapper[4823]: I1216 09:07:43.326832 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpfd9\" (UniqueName: \"kubernetes.io/projected/6aac7bd9-5925-4c54-b747-57320a350ab9-kube-api-access-fpfd9\") pod \"openstackclient\" (UID: \"6aac7bd9-5925-4c54-b747-57320a350ab9\") " pod="openstack/openstackclient" Dec 16 09:07:43 crc kubenswrapper[4823]: I1216 09:07:43.438737 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 16 09:07:43 crc kubenswrapper[4823]: I1216 09:07:43.952061 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 16 09:07:43 crc kubenswrapper[4823]: W1216 09:07:43.953859 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6aac7bd9_5925_4c54_b747_57320a350ab9.slice/crio-1bf8857f0b81a91b3ab8f5cdca2203c46b574d91288afd07acbe33e9665a280d WatchSource:0}: Error finding container 1bf8857f0b81a91b3ab8f5cdca2203c46b574d91288afd07acbe33e9665a280d: Status 404 returned error can't find the container with id 1bf8857f0b81a91b3ab8f5cdca2203c46b574d91288afd07acbe33e9665a280d Dec 16 09:07:44 crc kubenswrapper[4823]: I1216 09:07:44.661872 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"6aac7bd9-5925-4c54-b747-57320a350ab9","Type":"ContainerStarted","Data":"06a0d06ef2a52866e48ef5baf829653c9eb89594b03bea4aa951dea2c39e0a1c"} Dec 16 09:07:44 crc kubenswrapper[4823]: I1216 09:07:44.662203 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"6aac7bd9-5925-4c54-b747-57320a350ab9","Type":"ContainerStarted","Data":"1bf8857f0b81a91b3ab8f5cdca2203c46b574d91288afd07acbe33e9665a280d"} Dec 16 09:07:44 crc kubenswrapper[4823]: I1216 09:07:44.696372 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=1.6963376540000001 podStartE2EDuration="1.696337654s" podCreationTimestamp="2025-12-16 09:07:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 09:07:44.678770924 +0000 UTC m=+7943.167337057" watchObservedRunningTime="2025-12-16 09:07:44.696337654 +0000 UTC m=+7943.184903777" Dec 16 09:07:44 crc kubenswrapper[4823]: I1216 09:07:44.751814 4823 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 16 09:07:44 crc kubenswrapper[4823]: I1216 09:07:44.752095 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="5e99e87c-fc98-42e4-86db-045e839bc56c" containerName="prometheus" containerID="cri-o://a631e0e9e3de34c173d5764af319e268563242630a3966480f70655a3f9520fc" gracePeriod=600 Dec 16 09:07:44 crc kubenswrapper[4823]: I1216 09:07:44.752536 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="5e99e87c-fc98-42e4-86db-045e839bc56c" containerName="thanos-sidecar" containerID="cri-o://ec6d9d169edfc2eef0447928d7201ecbc01c12bebdcc79558d83ae75bddc641d" gracePeriod=600 Dec 16 09:07:44 crc kubenswrapper[4823]: I1216 09:07:44.752583 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="5e99e87c-fc98-42e4-86db-045e839bc56c" containerName="config-reloader" containerID="cri-o://6290fa9dd47ddcbe6af848dea0cd674ac46958cb32b97bcc1992c523f7b7398e" gracePeriod=600 Dec 16 09:07:44 crc kubenswrapper[4823]: I1216 09:07:44.773043 4823 scope.go:117] "RemoveContainer" containerID="14e51af7fb5c2d7b7fdc9e1989841225a65614d883db6f8d5aea8aeb819bd04d" Dec 16 09:07:44 crc kubenswrapper[4823]: E1216 09:07:44.776218 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 09:07:45 crc kubenswrapper[4823]: I1216 09:07:45.297236 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 16 09:07:45 crc kubenswrapper[4823]: I1216 09:07:45.353777 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/e602b9c3-9c85-4461-9534-e76d4ad4e929-openstack-config-secret\") pod \"e602b9c3-9c85-4461-9534-e76d4ad4e929\" (UID: \"e602b9c3-9c85-4461-9534-e76d4ad4e929\") " Dec 16 09:07:45 crc kubenswrapper[4823]: I1216 09:07:45.353863 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2bdt8\" (UniqueName: \"kubernetes.io/projected/e602b9c3-9c85-4461-9534-e76d4ad4e929-kube-api-access-2bdt8\") pod \"e602b9c3-9c85-4461-9534-e76d4ad4e929\" (UID: \"e602b9c3-9c85-4461-9534-e76d4ad4e929\") " Dec 16 09:07:45 crc kubenswrapper[4823]: I1216 09:07:45.353904 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/e602b9c3-9c85-4461-9534-e76d4ad4e929-openstack-config\") pod \"e602b9c3-9c85-4461-9534-e76d4ad4e929\" (UID: \"e602b9c3-9c85-4461-9534-e76d4ad4e929\") " Dec 16 09:07:45 crc kubenswrapper[4823]: I1216 09:07:45.353959 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e602b9c3-9c85-4461-9534-e76d4ad4e929-combined-ca-bundle\") pod \"e602b9c3-9c85-4461-9534-e76d4ad4e929\" (UID: \"e602b9c3-9c85-4461-9534-e76d4ad4e929\") " Dec 16 09:07:45 crc kubenswrapper[4823]: I1216 09:07:45.361301 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e602b9c3-9c85-4461-9534-e76d4ad4e929-kube-api-access-2bdt8" (OuterVolumeSpecName: "kube-api-access-2bdt8") pod "e602b9c3-9c85-4461-9534-e76d4ad4e929" (UID: "e602b9c3-9c85-4461-9534-e76d4ad4e929"). InnerVolumeSpecName "kube-api-access-2bdt8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:07:45 crc kubenswrapper[4823]: I1216 09:07:45.390939 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e602b9c3-9c85-4461-9534-e76d4ad4e929-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "e602b9c3-9c85-4461-9534-e76d4ad4e929" (UID: "e602b9c3-9c85-4461-9534-e76d4ad4e929"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 09:07:45 crc kubenswrapper[4823]: I1216 09:07:45.392135 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e602b9c3-9c85-4461-9534-e76d4ad4e929-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e602b9c3-9c85-4461-9534-e76d4ad4e929" (UID: "e602b9c3-9c85-4461-9534-e76d4ad4e929"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:07:45 crc kubenswrapper[4823]: I1216 09:07:45.414779 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e602b9c3-9c85-4461-9534-e76d4ad4e929-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "e602b9c3-9c85-4461-9534-e76d4ad4e929" (UID: "e602b9c3-9c85-4461-9534-e76d4ad4e929"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:07:45 crc kubenswrapper[4823]: I1216 09:07:45.456284 4823 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/e602b9c3-9c85-4461-9534-e76d4ad4e929-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Dec 16 09:07:45 crc kubenswrapper[4823]: I1216 09:07:45.456320 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2bdt8\" (UniqueName: \"kubernetes.io/projected/e602b9c3-9c85-4461-9534-e76d4ad4e929-kube-api-access-2bdt8\") on node \"crc\" DevicePath \"\"" Dec 16 09:07:45 crc kubenswrapper[4823]: I1216 09:07:45.456334 4823 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/e602b9c3-9c85-4461-9534-e76d4ad4e929-openstack-config\") on node \"crc\" DevicePath \"\"" Dec 16 09:07:45 crc kubenswrapper[4823]: I1216 09:07:45.456344 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e602b9c3-9c85-4461-9534-e76d4ad4e929-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 09:07:45 crc kubenswrapper[4823]: I1216 09:07:45.687218 4823 generic.go:334] "Generic (PLEG): container finished" podID="5e99e87c-fc98-42e4-86db-045e839bc56c" containerID="ec6d9d169edfc2eef0447928d7201ecbc01c12bebdcc79558d83ae75bddc641d" exitCode=0 Dec 16 09:07:45 crc kubenswrapper[4823]: I1216 09:07:45.687600 4823 generic.go:334] "Generic (PLEG): container finished" podID="5e99e87c-fc98-42e4-86db-045e839bc56c" containerID="6290fa9dd47ddcbe6af848dea0cd674ac46958cb32b97bcc1992c523f7b7398e" exitCode=0 Dec 16 09:07:45 crc kubenswrapper[4823]: I1216 09:07:45.687638 4823 generic.go:334] "Generic (PLEG): container finished" podID="5e99e87c-fc98-42e4-86db-045e839bc56c" containerID="a631e0e9e3de34c173d5764af319e268563242630a3966480f70655a3f9520fc" exitCode=0 Dec 16 09:07:45 crc kubenswrapper[4823]: I1216 09:07:45.687737 4823 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"5e99e87c-fc98-42e4-86db-045e839bc56c","Type":"ContainerDied","Data":"ec6d9d169edfc2eef0447928d7201ecbc01c12bebdcc79558d83ae75bddc641d"} Dec 16 09:07:45 crc kubenswrapper[4823]: I1216 09:07:45.687763 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"5e99e87c-fc98-42e4-86db-045e839bc56c","Type":"ContainerDied","Data":"6290fa9dd47ddcbe6af848dea0cd674ac46958cb32b97bcc1992c523f7b7398e"} Dec 16 09:07:45 crc kubenswrapper[4823]: I1216 09:07:45.687794 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"5e99e87c-fc98-42e4-86db-045e839bc56c","Type":"ContainerDied","Data":"a631e0e9e3de34c173d5764af319e268563242630a3966480f70655a3f9520fc"} Dec 16 09:07:45 crc kubenswrapper[4823]: I1216 09:07:45.692574 4823 generic.go:334] "Generic (PLEG): container finished" podID="e602b9c3-9c85-4461-9534-e76d4ad4e929" containerID="ef1bad3dffac77432fb7aadbd412fd8f615c484cc6427eb2140c76aac2f3ab09" exitCode=137 Dec 16 09:07:45 crc kubenswrapper[4823]: I1216 09:07:45.692612 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 16 09:07:45 crc kubenswrapper[4823]: I1216 09:07:45.692701 4823 scope.go:117] "RemoveContainer" containerID="ef1bad3dffac77432fb7aadbd412fd8f615c484cc6427eb2140c76aac2f3ab09" Dec 16 09:07:45 crc kubenswrapper[4823]: I1216 09:07:45.714882 4823 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="e602b9c3-9c85-4461-9534-e76d4ad4e929" podUID="6aac7bd9-5925-4c54-b747-57320a350ab9" Dec 16 09:07:45 crc kubenswrapper[4823]: I1216 09:07:45.721673 4823 scope.go:117] "RemoveContainer" containerID="ef1bad3dffac77432fb7aadbd412fd8f615c484cc6427eb2140c76aac2f3ab09" Dec 16 09:07:45 crc kubenswrapper[4823]: E1216 09:07:45.722293 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef1bad3dffac77432fb7aadbd412fd8f615c484cc6427eb2140c76aac2f3ab09\": container with ID starting with ef1bad3dffac77432fb7aadbd412fd8f615c484cc6427eb2140c76aac2f3ab09 not found: ID does not exist" containerID="ef1bad3dffac77432fb7aadbd412fd8f615c484cc6427eb2140c76aac2f3ab09" Dec 16 09:07:45 crc kubenswrapper[4823]: I1216 09:07:45.722326 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef1bad3dffac77432fb7aadbd412fd8f615c484cc6427eb2140c76aac2f3ab09"} err="failed to get container status \"ef1bad3dffac77432fb7aadbd412fd8f615c484cc6427eb2140c76aac2f3ab09\": rpc error: code = NotFound desc = could not find container \"ef1bad3dffac77432fb7aadbd412fd8f615c484cc6427eb2140c76aac2f3ab09\": container with ID starting with ef1bad3dffac77432fb7aadbd412fd8f615c484cc6427eb2140c76aac2f3ab09 not found: ID does not exist" Dec 16 09:07:45 crc kubenswrapper[4823]: I1216 09:07:45.791477 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e602b9c3-9c85-4461-9534-e76d4ad4e929" 
path="/var/lib/kubelet/pods/e602b9c3-9c85-4461-9534-e76d4ad4e929/volumes" Dec 16 09:07:45 crc kubenswrapper[4823]: I1216 09:07:45.838798 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 16 09:07:45 crc kubenswrapper[4823]: I1216 09:07:45.876418 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5e99e87c-fc98-42e4-86db-045e839bc56c-web-config\") pod \"5e99e87c-fc98-42e4-86db-045e839bc56c\" (UID: \"5e99e87c-fc98-42e4-86db-045e839bc56c\") " Dec 16 09:07:45 crc kubenswrapper[4823]: I1216 09:07:45.876565 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/5e99e87c-fc98-42e4-86db-045e839bc56c-thanos-prometheus-http-client-file\") pod \"5e99e87c-fc98-42e4-86db-045e839bc56c\" (UID: \"5e99e87c-fc98-42e4-86db-045e839bc56c\") " Dec 16 09:07:45 crc kubenswrapper[4823]: I1216 09:07:45.876666 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/5e99e87c-fc98-42e4-86db-045e839bc56c-prometheus-metric-storage-rulefiles-0\") pod \"5e99e87c-fc98-42e4-86db-045e839bc56c\" (UID: \"5e99e87c-fc98-42e4-86db-045e839bc56c\") " Dec 16 09:07:45 crc kubenswrapper[4823]: I1216 09:07:45.876822 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4btc5\" (UniqueName: \"kubernetes.io/projected/5e99e87c-fc98-42e4-86db-045e839bc56c-kube-api-access-4btc5\") pod \"5e99e87c-fc98-42e4-86db-045e839bc56c\" (UID: \"5e99e87c-fc98-42e4-86db-045e839bc56c\") " Dec 16 09:07:45 crc kubenswrapper[4823]: I1216 09:07:45.876965 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/5e99e87c-fc98-42e4-86db-045e839bc56c-tls-assets\") pod \"5e99e87c-fc98-42e4-86db-045e839bc56c\" (UID: \"5e99e87c-fc98-42e4-86db-045e839bc56c\") " Dec 16 09:07:45 crc kubenswrapper[4823]: I1216 09:07:45.877066 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5e99e87c-fc98-42e4-86db-045e839bc56c-config-out\") pod \"5e99e87c-fc98-42e4-86db-045e839bc56c\" (UID: \"5e99e87c-fc98-42e4-86db-045e839bc56c\") " Dec 16 09:07:45 crc kubenswrapper[4823]: I1216 09:07:45.877146 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5e99e87c-fc98-42e4-86db-045e839bc56c-config\") pod \"5e99e87c-fc98-42e4-86db-045e839bc56c\" (UID: \"5e99e87c-fc98-42e4-86db-045e839bc56c\") " Dec 16 09:07:45 crc kubenswrapper[4823]: I1216 09:07:45.877326 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b03e7a37-2b40-4cde-917a-642392a8adbd\") pod \"5e99e87c-fc98-42e4-86db-045e839bc56c\" (UID: \"5e99e87c-fc98-42e4-86db-045e839bc56c\") " Dec 16 09:07:45 crc kubenswrapper[4823]: I1216 09:07:45.885304 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e99e87c-fc98-42e4-86db-045e839bc56c-kube-api-access-4btc5" (OuterVolumeSpecName: "kube-api-access-4btc5") pod "5e99e87c-fc98-42e4-86db-045e839bc56c" (UID: "5e99e87c-fc98-42e4-86db-045e839bc56c"). InnerVolumeSpecName "kube-api-access-4btc5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:07:45 crc kubenswrapper[4823]: I1216 09:07:45.885376 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e99e87c-fc98-42e4-86db-045e839bc56c-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "5e99e87c-fc98-42e4-86db-045e839bc56c" (UID: "5e99e87c-fc98-42e4-86db-045e839bc56c"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:07:45 crc kubenswrapper[4823]: I1216 09:07:45.885765 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e99e87c-fc98-42e4-86db-045e839bc56c-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "5e99e87c-fc98-42e4-86db-045e839bc56c" (UID: "5e99e87c-fc98-42e4-86db-045e839bc56c"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 09:07:45 crc kubenswrapper[4823]: I1216 09:07:45.888736 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e99e87c-fc98-42e4-86db-045e839bc56c-config" (OuterVolumeSpecName: "config") pod "5e99e87c-fc98-42e4-86db-045e839bc56c" (UID: "5e99e87c-fc98-42e4-86db-045e839bc56c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:07:45 crc kubenswrapper[4823]: I1216 09:07:45.889273 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e99e87c-fc98-42e4-86db-045e839bc56c-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "5e99e87c-fc98-42e4-86db-045e839bc56c" (UID: "5e99e87c-fc98-42e4-86db-045e839bc56c"). InnerVolumeSpecName "tls-assets". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:07:45 crc kubenswrapper[4823]: I1216 09:07:45.891526 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e99e87c-fc98-42e4-86db-045e839bc56c-config-out" (OuterVolumeSpecName: "config-out") pod "5e99e87c-fc98-42e4-86db-045e839bc56c" (UID: "5e99e87c-fc98-42e4-86db-045e839bc56c"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 09:07:45 crc kubenswrapper[4823]: I1216 09:07:45.914948 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e99e87c-fc98-42e4-86db-045e839bc56c-web-config" (OuterVolumeSpecName: "web-config") pod "5e99e87c-fc98-42e4-86db-045e839bc56c" (UID: "5e99e87c-fc98-42e4-86db-045e839bc56c"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:07:45 crc kubenswrapper[4823]: I1216 09:07:45.915171 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b03e7a37-2b40-4cde-917a-642392a8adbd" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "5e99e87c-fc98-42e4-86db-045e839bc56c" (UID: "5e99e87c-fc98-42e4-86db-045e839bc56c"). InnerVolumeSpecName "pvc-b03e7a37-2b40-4cde-917a-642392a8adbd". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 16 09:07:45 crc kubenswrapper[4823]: I1216 09:07:45.980291 4823 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5e99e87c-fc98-42e4-86db-045e839bc56c-web-config\") on node \"crc\" DevicePath \"\"" Dec 16 09:07:45 crc kubenswrapper[4823]: I1216 09:07:45.980330 4823 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/5e99e87c-fc98-42e4-86db-045e839bc56c-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Dec 16 09:07:45 crc kubenswrapper[4823]: I1216 09:07:45.980340 4823 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/5e99e87c-fc98-42e4-86db-045e839bc56c-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Dec 16 09:07:45 crc kubenswrapper[4823]: I1216 09:07:45.980355 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4btc5\" (UniqueName: \"kubernetes.io/projected/5e99e87c-fc98-42e4-86db-045e839bc56c-kube-api-access-4btc5\") on node \"crc\" DevicePath \"\"" Dec 16 09:07:45 crc kubenswrapper[4823]: I1216 09:07:45.980367 4823 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5e99e87c-fc98-42e4-86db-045e839bc56c-tls-assets\") on node \"crc\" DevicePath \"\"" Dec 16 09:07:45 crc kubenswrapper[4823]: I1216 09:07:45.980374 4823 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5e99e87c-fc98-42e4-86db-045e839bc56c-config-out\") on node \"crc\" DevicePath \"\"" Dec 16 09:07:45 crc kubenswrapper[4823]: I1216 09:07:45.980384 4823 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/5e99e87c-fc98-42e4-86db-045e839bc56c-config\") on node \"crc\" DevicePath \"\"" Dec 16 09:07:45 crc 
kubenswrapper[4823]: I1216 09:07:45.980450 4823 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-b03e7a37-2b40-4cde-917a-642392a8adbd\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b03e7a37-2b40-4cde-917a-642392a8adbd\") on node \"crc\" " Dec 16 09:07:46 crc kubenswrapper[4823]: I1216 09:07:46.002969 4823 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Dec 16 09:07:46 crc kubenswrapper[4823]: I1216 09:07:46.003155 4823 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-b03e7a37-2b40-4cde-917a-642392a8adbd" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b03e7a37-2b40-4cde-917a-642392a8adbd") on node "crc" Dec 16 09:07:46 crc kubenswrapper[4823]: I1216 09:07:46.083141 4823 reconciler_common.go:293] "Volume detached for volume \"pvc-b03e7a37-2b40-4cde-917a-642392a8adbd\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b03e7a37-2b40-4cde-917a-642392a8adbd\") on node \"crc\" DevicePath \"\"" Dec 16 09:07:46 crc kubenswrapper[4823]: I1216 09:07:46.705434 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"5e99e87c-fc98-42e4-86db-045e839bc56c","Type":"ContainerDied","Data":"9c721b48f432562b88561f89175d61da1b499e43db0edc4dfce130fe93f07025"} Dec 16 09:07:46 crc kubenswrapper[4823]: I1216 09:07:46.705799 4823 scope.go:117] "RemoveContainer" containerID="ec6d9d169edfc2eef0447928d7201ecbc01c12bebdcc79558d83ae75bddc641d" Dec 16 09:07:46 crc kubenswrapper[4823]: I1216 09:07:46.705465 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 16 09:07:46 crc kubenswrapper[4823]: I1216 09:07:46.735479 4823 scope.go:117] "RemoveContainer" containerID="6290fa9dd47ddcbe6af848dea0cd674ac46958cb32b97bcc1992c523f7b7398e" Dec 16 09:07:46 crc kubenswrapper[4823]: I1216 09:07:46.746449 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 16 09:07:46 crc kubenswrapper[4823]: I1216 09:07:46.759767 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 16 09:07:46 crc kubenswrapper[4823]: I1216 09:07:46.767921 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 16 09:07:46 crc kubenswrapper[4823]: E1216 09:07:46.768325 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e99e87c-fc98-42e4-86db-045e839bc56c" containerName="config-reloader" Dec 16 09:07:46 crc kubenswrapper[4823]: I1216 09:07:46.768340 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e99e87c-fc98-42e4-86db-045e839bc56c" containerName="config-reloader" Dec 16 09:07:46 crc kubenswrapper[4823]: E1216 09:07:46.768358 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e99e87c-fc98-42e4-86db-045e839bc56c" containerName="init-config-reloader" Dec 16 09:07:46 crc kubenswrapper[4823]: I1216 09:07:46.768364 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e99e87c-fc98-42e4-86db-045e839bc56c" containerName="init-config-reloader" Dec 16 09:07:46 crc kubenswrapper[4823]: E1216 09:07:46.768377 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e99e87c-fc98-42e4-86db-045e839bc56c" containerName="thanos-sidecar" Dec 16 09:07:46 crc kubenswrapper[4823]: I1216 09:07:46.768383 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e99e87c-fc98-42e4-86db-045e839bc56c" containerName="thanos-sidecar" Dec 16 09:07:46 crc kubenswrapper[4823]: E1216 09:07:46.768403 
4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e99e87c-fc98-42e4-86db-045e839bc56c" containerName="prometheus" Dec 16 09:07:46 crc kubenswrapper[4823]: I1216 09:07:46.768408 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e99e87c-fc98-42e4-86db-045e839bc56c" containerName="prometheus" Dec 16 09:07:46 crc kubenswrapper[4823]: I1216 09:07:46.768580 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e99e87c-fc98-42e4-86db-045e839bc56c" containerName="prometheus" Dec 16 09:07:46 crc kubenswrapper[4823]: I1216 09:07:46.768595 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e99e87c-fc98-42e4-86db-045e839bc56c" containerName="config-reloader" Dec 16 09:07:46 crc kubenswrapper[4823]: I1216 09:07:46.768610 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e99e87c-fc98-42e4-86db-045e839bc56c" containerName="thanos-sidecar" Dec 16 09:07:46 crc kubenswrapper[4823]: I1216 09:07:46.770372 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 16 09:07:46 crc kubenswrapper[4823]: I1216 09:07:46.773832 4823 scope.go:117] "RemoveContainer" containerID="a631e0e9e3de34c173d5764af319e268563242630a3966480f70655a3f9520fc" Dec 16 09:07:46 crc kubenswrapper[4823]: I1216 09:07:46.774202 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-bwmvq" Dec 16 09:07:46 crc kubenswrapper[4823]: I1216 09:07:46.774428 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Dec 16 09:07:46 crc kubenswrapper[4823]: I1216 09:07:46.774553 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-metric-storage-prometheus-svc" Dec 16 09:07:46 crc kubenswrapper[4823]: I1216 09:07:46.774599 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Dec 16 09:07:46 crc kubenswrapper[4823]: I1216 09:07:46.774749 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Dec 16 09:07:46 crc kubenswrapper[4823]: I1216 09:07:46.775207 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Dec 16 09:07:46 crc kubenswrapper[4823]: I1216 09:07:46.788048 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Dec 16 09:07:46 crc kubenswrapper[4823]: I1216 09:07:46.803050 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 16 09:07:46 crc kubenswrapper[4823]: I1216 09:07:46.817297 4823 scope.go:117] "RemoveContainer" containerID="3c5539c8d3d73e21f17c90c61aa208171bb12cef7e68027753b1974b213655b6" Dec 16 09:07:46 crc kubenswrapper[4823]: I1216 09:07:46.902089 4823 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f058bf18-c31d-4b48-a183-bb9ae9223fbe-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"f058bf18-c31d-4b48-a183-bb9ae9223fbe\") " pod="openstack/prometheus-metric-storage-0" Dec 16 09:07:46 crc kubenswrapper[4823]: I1216 09:07:46.902139 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f058bf18-c31d-4b48-a183-bb9ae9223fbe-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"f058bf18-c31d-4b48-a183-bb9ae9223fbe\") " pod="openstack/prometheus-metric-storage-0" Dec 16 09:07:46 crc kubenswrapper[4823]: I1216 09:07:46.902196 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/f058bf18-c31d-4b48-a183-bb9ae9223fbe-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"f058bf18-c31d-4b48-a183-bb9ae9223fbe\") " pod="openstack/prometheus-metric-storage-0" Dec 16 09:07:46 crc kubenswrapper[4823]: I1216 09:07:46.902258 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-b03e7a37-2b40-4cde-917a-642392a8adbd\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b03e7a37-2b40-4cde-917a-642392a8adbd\") pod \"prometheus-metric-storage-0\" (UID: \"f058bf18-c31d-4b48-a183-bb9ae9223fbe\") " pod="openstack/prometheus-metric-storage-0" Dec 16 09:07:46 crc kubenswrapper[4823]: I1216 09:07:46.902390 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: 
\"kubernetes.io/secret/f058bf18-c31d-4b48-a183-bb9ae9223fbe-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"f058bf18-c31d-4b48-a183-bb9ae9223fbe\") " pod="openstack/prometheus-metric-storage-0" Dec 16 09:07:46 crc kubenswrapper[4823]: I1216 09:07:46.902427 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f058bf18-c31d-4b48-a183-bb9ae9223fbe-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"f058bf18-c31d-4b48-a183-bb9ae9223fbe\") " pod="openstack/prometheus-metric-storage-0" Dec 16 09:07:46 crc kubenswrapper[4823]: I1216 09:07:46.902454 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f058bf18-c31d-4b48-a183-bb9ae9223fbe-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"f058bf18-c31d-4b48-a183-bb9ae9223fbe\") " pod="openstack/prometheus-metric-storage-0" Dec 16 09:07:46 crc kubenswrapper[4823]: I1216 09:07:46.902531 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f058bf18-c31d-4b48-a183-bb9ae9223fbe-config\") pod \"prometheus-metric-storage-0\" (UID: \"f058bf18-c31d-4b48-a183-bb9ae9223fbe\") " pod="openstack/prometheus-metric-storage-0" Dec 16 09:07:46 crc kubenswrapper[4823]: I1216 09:07:46.902586 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f058bf18-c31d-4b48-a183-bb9ae9223fbe-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"f058bf18-c31d-4b48-a183-bb9ae9223fbe\") " pod="openstack/prometheus-metric-storage-0" Dec 16 09:07:46 crc kubenswrapper[4823]: I1216 09:07:46.902658 4823 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4v5zd\" (UniqueName: \"kubernetes.io/projected/f058bf18-c31d-4b48-a183-bb9ae9223fbe-kube-api-access-4v5zd\") pod \"prometheus-metric-storage-0\" (UID: \"f058bf18-c31d-4b48-a183-bb9ae9223fbe\") " pod="openstack/prometheus-metric-storage-0" Dec 16 09:07:46 crc kubenswrapper[4823]: I1216 09:07:46.902687 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f058bf18-c31d-4b48-a183-bb9ae9223fbe-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"f058bf18-c31d-4b48-a183-bb9ae9223fbe\") " pod="openstack/prometheus-metric-storage-0" Dec 16 09:07:47 crc kubenswrapper[4823]: I1216 09:07:47.003834 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/f058bf18-c31d-4b48-a183-bb9ae9223fbe-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"f058bf18-c31d-4b48-a183-bb9ae9223fbe\") " pod="openstack/prometheus-metric-storage-0" Dec 16 09:07:47 crc kubenswrapper[4823]: I1216 09:07:47.003881 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f058bf18-c31d-4b48-a183-bb9ae9223fbe-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"f058bf18-c31d-4b48-a183-bb9ae9223fbe\") " pod="openstack/prometheus-metric-storage-0" Dec 16 09:07:47 crc kubenswrapper[4823]: I1216 09:07:47.003905 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f058bf18-c31d-4b48-a183-bb9ae9223fbe-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"f058bf18-c31d-4b48-a183-bb9ae9223fbe\") " 
pod="openstack/prometheus-metric-storage-0" Dec 16 09:07:47 crc kubenswrapper[4823]: I1216 09:07:47.003946 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f058bf18-c31d-4b48-a183-bb9ae9223fbe-config\") pod \"prometheus-metric-storage-0\" (UID: \"f058bf18-c31d-4b48-a183-bb9ae9223fbe\") " pod="openstack/prometheus-metric-storage-0" Dec 16 09:07:47 crc kubenswrapper[4823]: I1216 09:07:47.003976 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f058bf18-c31d-4b48-a183-bb9ae9223fbe-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"f058bf18-c31d-4b48-a183-bb9ae9223fbe\") " pod="openstack/prometheus-metric-storage-0" Dec 16 09:07:47 crc kubenswrapper[4823]: I1216 09:07:47.004047 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4v5zd\" (UniqueName: \"kubernetes.io/projected/f058bf18-c31d-4b48-a183-bb9ae9223fbe-kube-api-access-4v5zd\") pod \"prometheus-metric-storage-0\" (UID: \"f058bf18-c31d-4b48-a183-bb9ae9223fbe\") " pod="openstack/prometheus-metric-storage-0" Dec 16 09:07:47 crc kubenswrapper[4823]: I1216 09:07:47.004075 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f058bf18-c31d-4b48-a183-bb9ae9223fbe-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"f058bf18-c31d-4b48-a183-bb9ae9223fbe\") " pod="openstack/prometheus-metric-storage-0" Dec 16 09:07:47 crc kubenswrapper[4823]: I1216 09:07:47.004110 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f058bf18-c31d-4b48-a183-bb9ae9223fbe-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"f058bf18-c31d-4b48-a183-bb9ae9223fbe\") " 
pod="openstack/prometheus-metric-storage-0" Dec 16 09:07:47 crc kubenswrapper[4823]: I1216 09:07:47.004136 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f058bf18-c31d-4b48-a183-bb9ae9223fbe-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"f058bf18-c31d-4b48-a183-bb9ae9223fbe\") " pod="openstack/prometheus-metric-storage-0" Dec 16 09:07:47 crc kubenswrapper[4823]: I1216 09:07:47.004177 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/f058bf18-c31d-4b48-a183-bb9ae9223fbe-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"f058bf18-c31d-4b48-a183-bb9ae9223fbe\") " pod="openstack/prometheus-metric-storage-0" Dec 16 09:07:47 crc kubenswrapper[4823]: I1216 09:07:47.004209 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-b03e7a37-2b40-4cde-917a-642392a8adbd\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b03e7a37-2b40-4cde-917a-642392a8adbd\") pod \"prometheus-metric-storage-0\" (UID: \"f058bf18-c31d-4b48-a183-bb9ae9223fbe\") " pod="openstack/prometheus-metric-storage-0" Dec 16 09:07:47 crc kubenswrapper[4823]: I1216 09:07:47.005447 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f058bf18-c31d-4b48-a183-bb9ae9223fbe-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"f058bf18-c31d-4b48-a183-bb9ae9223fbe\") " pod="openstack/prometheus-metric-storage-0" Dec 16 09:07:47 crc kubenswrapper[4823]: I1216 09:07:47.010137 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: 
\"kubernetes.io/secret/f058bf18-c31d-4b48-a183-bb9ae9223fbe-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"f058bf18-c31d-4b48-a183-bb9ae9223fbe\") " pod="openstack/prometheus-metric-storage-0" Dec 16 09:07:47 crc kubenswrapper[4823]: I1216 09:07:47.010720 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f058bf18-c31d-4b48-a183-bb9ae9223fbe-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"f058bf18-c31d-4b48-a183-bb9ae9223fbe\") " pod="openstack/prometheus-metric-storage-0" Dec 16 09:07:47 crc kubenswrapper[4823]: I1216 09:07:47.012241 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/f058bf18-c31d-4b48-a183-bb9ae9223fbe-config\") pod \"prometheus-metric-storage-0\" (UID: \"f058bf18-c31d-4b48-a183-bb9ae9223fbe\") " pod="openstack/prometheus-metric-storage-0" Dec 16 09:07:47 crc kubenswrapper[4823]: I1216 09:07:47.012530 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/f058bf18-c31d-4b48-a183-bb9ae9223fbe-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"f058bf18-c31d-4b48-a183-bb9ae9223fbe\") " pod="openstack/prometheus-metric-storage-0" Dec 16 09:07:47 crc kubenswrapper[4823]: I1216 09:07:47.013208 4823 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 16 09:07:47 crc kubenswrapper[4823]: I1216 09:07:47.013239 4823 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-b03e7a37-2b40-4cde-917a-642392a8adbd\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b03e7a37-2b40-4cde-917a-642392a8adbd\") pod \"prometheus-metric-storage-0\" (UID: \"f058bf18-c31d-4b48-a183-bb9ae9223fbe\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/cf701248a9eb9649176716528363fdeb229d40b796f898005d6f1efdc6d91879/globalmount\"" pod="openstack/prometheus-metric-storage-0" Dec 16 09:07:47 crc kubenswrapper[4823]: I1216 09:07:47.017268 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f058bf18-c31d-4b48-a183-bb9ae9223fbe-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"f058bf18-c31d-4b48-a183-bb9ae9223fbe\") " pod="openstack/prometheus-metric-storage-0" Dec 16 09:07:47 crc kubenswrapper[4823]: I1216 09:07:47.017733 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f058bf18-c31d-4b48-a183-bb9ae9223fbe-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"f058bf18-c31d-4b48-a183-bb9ae9223fbe\") " pod="openstack/prometheus-metric-storage-0" Dec 16 09:07:47 crc kubenswrapper[4823]: I1216 09:07:47.018685 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/f058bf18-c31d-4b48-a183-bb9ae9223fbe-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"f058bf18-c31d-4b48-a183-bb9ae9223fbe\") " pod="openstack/prometheus-metric-storage-0" Dec 16 09:07:47 crc kubenswrapper[4823]: I1216 09:07:47.019869 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f058bf18-c31d-4b48-a183-bb9ae9223fbe-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"f058bf18-c31d-4b48-a183-bb9ae9223fbe\") " pod="openstack/prometheus-metric-storage-0" Dec 16 09:07:47 crc kubenswrapper[4823]: I1216 09:07:47.021534 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4v5zd\" (UniqueName: \"kubernetes.io/projected/f058bf18-c31d-4b48-a183-bb9ae9223fbe-kube-api-access-4v5zd\") pod \"prometheus-metric-storage-0\" (UID: \"f058bf18-c31d-4b48-a183-bb9ae9223fbe\") " pod="openstack/prometheus-metric-storage-0" Dec 16 09:07:47 crc kubenswrapper[4823]: I1216 09:07:47.064738 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-b03e7a37-2b40-4cde-917a-642392a8adbd\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b03e7a37-2b40-4cde-917a-642392a8adbd\") pod \"prometheus-metric-storage-0\" (UID: \"f058bf18-c31d-4b48-a183-bb9ae9223fbe\") " pod="openstack/prometheus-metric-storage-0" Dec 16 09:07:47 crc kubenswrapper[4823]: I1216 09:07:47.101678 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 16 09:07:47 crc kubenswrapper[4823]: W1216 09:07:47.702371 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf058bf18_c31d_4b48_a183_bb9ae9223fbe.slice/crio-04fbed9140fb34f53a53f73139b7550de89510410eb5bf68ae0e0477bd549fb9 WatchSource:0}: Error finding container 04fbed9140fb34f53a53f73139b7550de89510410eb5bf68ae0e0477bd549fb9: Status 404 returned error can't find the container with id 04fbed9140fb34f53a53f73139b7550de89510410eb5bf68ae0e0477bd549fb9 Dec 16 09:07:47 crc kubenswrapper[4823]: I1216 09:07:47.707306 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 16 09:07:47 crc kubenswrapper[4823]: I1216 09:07:47.759150 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"f058bf18-c31d-4b48-a183-bb9ae9223fbe","Type":"ContainerStarted","Data":"04fbed9140fb34f53a53f73139b7550de89510410eb5bf68ae0e0477bd549fb9"} Dec 16 09:07:47 crc kubenswrapper[4823]: I1216 09:07:47.785218 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e99e87c-fc98-42e4-86db-045e839bc56c" path="/var/lib/kubelet/pods/5e99e87c-fc98-42e4-86db-045e839bc56c/volumes" Dec 16 09:07:48 crc kubenswrapper[4823]: I1216 09:07:48.702802 4823 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="5e99e87c-fc98-42e4-86db-045e839bc56c" containerName="prometheus" probeResult="failure" output="Get \"http://10.217.1.135:9090/-/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 16 09:07:50 crc kubenswrapper[4823]: I1216 09:07:50.660479 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 16 09:07:50 crc kubenswrapper[4823]: I1216 09:07:50.663392 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 16 09:07:50 crc kubenswrapper[4823]: I1216 09:07:50.665709 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 16 09:07:50 crc kubenswrapper[4823]: I1216 09:07:50.668458 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 16 09:07:50 crc kubenswrapper[4823]: I1216 09:07:50.676857 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 16 09:07:50 crc kubenswrapper[4823]: I1216 09:07:50.693863 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f2d2bd26-e3e1-42e8-b946-8dac79b49e51-scripts\") pod \"ceilometer-0\" (UID: \"f2d2bd26-e3e1-42e8-b946-8dac79b49e51\") " pod="openstack/ceilometer-0" Dec 16 09:07:50 crc kubenswrapper[4823]: I1216 09:07:50.693949 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2d2bd26-e3e1-42e8-b946-8dac79b49e51-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f2d2bd26-e3e1-42e8-b946-8dac79b49e51\") " pod="openstack/ceilometer-0" Dec 16 09:07:50 crc kubenswrapper[4823]: I1216 09:07:50.693983 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f2d2bd26-e3e1-42e8-b946-8dac79b49e51-log-httpd\") pod \"ceilometer-0\" (UID: \"f2d2bd26-e3e1-42e8-b946-8dac79b49e51\") " pod="openstack/ceilometer-0" Dec 16 09:07:50 crc kubenswrapper[4823]: I1216 09:07:50.694039 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f2d2bd26-e3e1-42e8-b946-8dac79b49e51-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f2d2bd26-e3e1-42e8-b946-8dac79b49e51\") " 
pod="openstack/ceilometer-0" Dec 16 09:07:50 crc kubenswrapper[4823]: I1216 09:07:50.694162 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2d2bd26-e3e1-42e8-b946-8dac79b49e51-config-data\") pod \"ceilometer-0\" (UID: \"f2d2bd26-e3e1-42e8-b946-8dac79b49e51\") " pod="openstack/ceilometer-0" Dec 16 09:07:50 crc kubenswrapper[4823]: I1216 09:07:50.694204 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jb7fz\" (UniqueName: \"kubernetes.io/projected/f2d2bd26-e3e1-42e8-b946-8dac79b49e51-kube-api-access-jb7fz\") pod \"ceilometer-0\" (UID: \"f2d2bd26-e3e1-42e8-b946-8dac79b49e51\") " pod="openstack/ceilometer-0" Dec 16 09:07:50 crc kubenswrapper[4823]: I1216 09:07:50.694310 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f2d2bd26-e3e1-42e8-b946-8dac79b49e51-run-httpd\") pod \"ceilometer-0\" (UID: \"f2d2bd26-e3e1-42e8-b946-8dac79b49e51\") " pod="openstack/ceilometer-0" Dec 16 09:07:50 crc kubenswrapper[4823]: I1216 09:07:50.796654 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f2d2bd26-e3e1-42e8-b946-8dac79b49e51-run-httpd\") pod \"ceilometer-0\" (UID: \"f2d2bd26-e3e1-42e8-b946-8dac79b49e51\") " pod="openstack/ceilometer-0" Dec 16 09:07:50 crc kubenswrapper[4823]: I1216 09:07:50.796736 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f2d2bd26-e3e1-42e8-b946-8dac79b49e51-scripts\") pod \"ceilometer-0\" (UID: \"f2d2bd26-e3e1-42e8-b946-8dac79b49e51\") " pod="openstack/ceilometer-0" Dec 16 09:07:50 crc kubenswrapper[4823]: I1216 09:07:50.796800 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2d2bd26-e3e1-42e8-b946-8dac79b49e51-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f2d2bd26-e3e1-42e8-b946-8dac79b49e51\") " pod="openstack/ceilometer-0" Dec 16 09:07:50 crc kubenswrapper[4823]: I1216 09:07:50.796826 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f2d2bd26-e3e1-42e8-b946-8dac79b49e51-log-httpd\") pod \"ceilometer-0\" (UID: \"f2d2bd26-e3e1-42e8-b946-8dac79b49e51\") " pod="openstack/ceilometer-0" Dec 16 09:07:50 crc kubenswrapper[4823]: I1216 09:07:50.796858 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f2d2bd26-e3e1-42e8-b946-8dac79b49e51-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f2d2bd26-e3e1-42e8-b946-8dac79b49e51\") " pod="openstack/ceilometer-0" Dec 16 09:07:50 crc kubenswrapper[4823]: I1216 09:07:50.796969 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2d2bd26-e3e1-42e8-b946-8dac79b49e51-config-data\") pod \"ceilometer-0\" (UID: \"f2d2bd26-e3e1-42e8-b946-8dac79b49e51\") " pod="openstack/ceilometer-0" Dec 16 09:07:50 crc kubenswrapper[4823]: I1216 09:07:50.797006 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jb7fz\" (UniqueName: \"kubernetes.io/projected/f2d2bd26-e3e1-42e8-b946-8dac79b49e51-kube-api-access-jb7fz\") pod \"ceilometer-0\" (UID: \"f2d2bd26-e3e1-42e8-b946-8dac79b49e51\") " pod="openstack/ceilometer-0" Dec 16 09:07:50 crc kubenswrapper[4823]: I1216 09:07:50.797871 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f2d2bd26-e3e1-42e8-b946-8dac79b49e51-run-httpd\") pod \"ceilometer-0\" (UID: \"f2d2bd26-e3e1-42e8-b946-8dac79b49e51\") " pod="openstack/ceilometer-0" Dec 16 09:07:50 
crc kubenswrapper[4823]: I1216 09:07:50.798890 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f2d2bd26-e3e1-42e8-b946-8dac79b49e51-log-httpd\") pod \"ceilometer-0\" (UID: \"f2d2bd26-e3e1-42e8-b946-8dac79b49e51\") " pod="openstack/ceilometer-0" Dec 16 09:07:50 crc kubenswrapper[4823]: I1216 09:07:50.803467 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2d2bd26-e3e1-42e8-b946-8dac79b49e51-config-data\") pod \"ceilometer-0\" (UID: \"f2d2bd26-e3e1-42e8-b946-8dac79b49e51\") " pod="openstack/ceilometer-0" Dec 16 09:07:50 crc kubenswrapper[4823]: I1216 09:07:50.804241 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f2d2bd26-e3e1-42e8-b946-8dac79b49e51-scripts\") pod \"ceilometer-0\" (UID: \"f2d2bd26-e3e1-42e8-b946-8dac79b49e51\") " pod="openstack/ceilometer-0" Dec 16 09:07:50 crc kubenswrapper[4823]: I1216 09:07:50.804570 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f2d2bd26-e3e1-42e8-b946-8dac79b49e51-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f2d2bd26-e3e1-42e8-b946-8dac79b49e51\") " pod="openstack/ceilometer-0" Dec 16 09:07:50 crc kubenswrapper[4823]: I1216 09:07:50.805329 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2d2bd26-e3e1-42e8-b946-8dac79b49e51-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f2d2bd26-e3e1-42e8-b946-8dac79b49e51\") " pod="openstack/ceilometer-0" Dec 16 09:07:50 crc kubenswrapper[4823]: I1216 09:07:50.818386 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jb7fz\" (UniqueName: \"kubernetes.io/projected/f2d2bd26-e3e1-42e8-b946-8dac79b49e51-kube-api-access-jb7fz\") pod \"ceilometer-0\" (UID: 
\"f2d2bd26-e3e1-42e8-b946-8dac79b49e51\") " pod="openstack/ceilometer-0" Dec 16 09:07:50 crc kubenswrapper[4823]: I1216 09:07:50.999824 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 16 09:07:51 crc kubenswrapper[4823]: I1216 09:07:51.299498 4823 scope.go:117] "RemoveContainer" containerID="3059c456084fc689951e3eb5aee2dfc53406f37d52d93a1078de1666042d7578" Dec 16 09:07:51 crc kubenswrapper[4823]: I1216 09:07:51.328640 4823 scope.go:117] "RemoveContainer" containerID="e700452af7238afdeafb9a420ed28e550e0314d8d8bb24657ea8760e0a2e188a" Dec 16 09:07:51 crc kubenswrapper[4823]: I1216 09:07:51.366548 4823 scope.go:117] "RemoveContainer" containerID="76cfd7427b391b74b49e284f41e6deeef50c653e65e3743f36a051a6d001ab71" Dec 16 09:07:51 crc kubenswrapper[4823]: I1216 09:07:51.469417 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 16 09:07:51 crc kubenswrapper[4823]: W1216 09:07:51.476733 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf2d2bd26_e3e1_42e8_b946_8dac79b49e51.slice/crio-fc5ab1515683761623d3c563f76764e1c437d98554edb65cabcf24ff58e05ea4 WatchSource:0}: Error finding container fc5ab1515683761623d3c563f76764e1c437d98554edb65cabcf24ff58e05ea4: Status 404 returned error can't find the container with id fc5ab1515683761623d3c563f76764e1c437d98554edb65cabcf24ff58e05ea4 Dec 16 09:07:51 crc kubenswrapper[4823]: I1216 09:07:51.803692 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"f058bf18-c31d-4b48-a183-bb9ae9223fbe","Type":"ContainerStarted","Data":"f36a353c66bec2426a6543a7c9be7673d22af695566b4903267ba28b9ec1e76a"} Dec 16 09:07:51 crc kubenswrapper[4823]: I1216 09:07:51.808135 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"f2d2bd26-e3e1-42e8-b946-8dac79b49e51","Type":"ContainerStarted","Data":"fc5ab1515683761623d3c563f76764e1c437d98554edb65cabcf24ff58e05ea4"} Dec 16 09:07:55 crc kubenswrapper[4823]: I1216 09:07:55.771830 4823 scope.go:117] "RemoveContainer" containerID="14e51af7fb5c2d7b7fdc9e1989841225a65614d883db6f8d5aea8aeb819bd04d" Dec 16 09:07:55 crc kubenswrapper[4823]: E1216 09:07:55.772677 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 09:07:56 crc kubenswrapper[4823]: I1216 09:07:56.866477 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f2d2bd26-e3e1-42e8-b946-8dac79b49e51","Type":"ContainerStarted","Data":"e4d3357dadc5bf821fe74af974375318994589e62a109936cb435cd8dc3db23d"} Dec 16 09:07:57 crc kubenswrapper[4823]: I1216 09:07:57.883466 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f2d2bd26-e3e1-42e8-b946-8dac79b49e51","Type":"ContainerStarted","Data":"08400715c044e89aa8efc8a75cc7c1f9be395ad5c84aeac9db3852913f85966a"} Dec 16 09:07:58 crc kubenswrapper[4823]: I1216 09:07:58.897105 4823 generic.go:334] "Generic (PLEG): container finished" podID="f058bf18-c31d-4b48-a183-bb9ae9223fbe" containerID="f36a353c66bec2426a6543a7c9be7673d22af695566b4903267ba28b9ec1e76a" exitCode=0 Dec 16 09:07:58 crc kubenswrapper[4823]: I1216 09:07:58.897232 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"f058bf18-c31d-4b48-a183-bb9ae9223fbe","Type":"ContainerDied","Data":"f36a353c66bec2426a6543a7c9be7673d22af695566b4903267ba28b9ec1e76a"} Dec 16 
09:07:59 crc kubenswrapper[4823]: I1216 09:07:59.921328 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"f058bf18-c31d-4b48-a183-bb9ae9223fbe","Type":"ContainerStarted","Data":"89a2670cf718ab31b266c6424bfc66b8bfcac5fd22e626dc8ba58e5549ee3781"} Dec 16 09:07:59 crc kubenswrapper[4823]: I1216 09:07:59.926770 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f2d2bd26-e3e1-42e8-b946-8dac79b49e51","Type":"ContainerStarted","Data":"957f1733e3745c4bf160f023bc9a01f5bc95dbaca647e4576466831de86fe54b"} Dec 16 09:08:01 crc kubenswrapper[4823]: I1216 09:08:01.949859 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f2d2bd26-e3e1-42e8-b946-8dac79b49e51","Type":"ContainerStarted","Data":"3bf365a0fe620090601c8a14825e58042cc998b33052e1c6b7445f097370230d"} Dec 16 09:08:01 crc kubenswrapper[4823]: I1216 09:08:01.950563 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 16 09:08:01 crc kubenswrapper[4823]: I1216 09:08:01.971530 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.434136758 podStartE2EDuration="11.971507602s" podCreationTimestamp="2025-12-16 09:07:50 +0000 UTC" firstStartedPulling="2025-12-16 09:07:51.479081437 +0000 UTC m=+7949.967647560" lastFinishedPulling="2025-12-16 09:08:01.016452281 +0000 UTC m=+7959.505018404" observedRunningTime="2025-12-16 09:08:01.967493707 +0000 UTC m=+7960.456059830" watchObservedRunningTime="2025-12-16 09:08:01.971507602 +0000 UTC m=+7960.460073725" Dec 16 09:08:02 crc kubenswrapper[4823]: I1216 09:08:02.969182 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"f058bf18-c31d-4b48-a183-bb9ae9223fbe","Type":"ContainerStarted","Data":"b67bbe0ab1f251a9ee41d54b3f9217494a588f2c6e81c715f6504ef1a69b0fb0"} Dec 16 
09:08:02 crc kubenswrapper[4823]: I1216 09:08:02.969513 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"f058bf18-c31d-4b48-a183-bb9ae9223fbe","Type":"ContainerStarted","Data":"be73fd31aef3c2646ca8ea8d7a1185806913357be9eccd871881753164e9eaff"} Dec 16 09:08:02 crc kubenswrapper[4823]: I1216 09:08:02.998187 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=16.998167295000002 podStartE2EDuration="16.998167295s" podCreationTimestamp="2025-12-16 09:07:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 09:08:02.992638532 +0000 UTC m=+7961.481204655" watchObservedRunningTime="2025-12-16 09:08:02.998167295 +0000 UTC m=+7961.486733408" Dec 16 09:08:04 crc kubenswrapper[4823]: I1216 09:08:04.343733 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-create-gr9jj"] Dec 16 09:08:04 crc kubenswrapper[4823]: I1216 09:08:04.345090 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-gr9jj" Dec 16 09:08:04 crc kubenswrapper[4823]: I1216 09:08:04.358847 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-gr9jj"] Dec 16 09:08:04 crc kubenswrapper[4823]: I1216 09:08:04.440400 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-f38e-account-create-update-kgfvh"] Dec 16 09:08:04 crc kubenswrapper[4823]: I1216 09:08:04.441699 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-f38e-account-create-update-kgfvh" Dec 16 09:08:04 crc kubenswrapper[4823]: I1216 09:08:04.444080 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-db-secret" Dec 16 09:08:04 crc kubenswrapper[4823]: I1216 09:08:04.476614 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-f38e-account-create-update-kgfvh"] Dec 16 09:08:04 crc kubenswrapper[4823]: I1216 09:08:04.501178 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9d47387-c96f-4154-be1c-eda89c0e2a77-operator-scripts\") pod \"aodh-db-create-gr9jj\" (UID: \"f9d47387-c96f-4154-be1c-eda89c0e2a77\") " pod="openstack/aodh-db-create-gr9jj" Dec 16 09:08:04 crc kubenswrapper[4823]: I1216 09:08:04.502087 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4vdl\" (UniqueName: \"kubernetes.io/projected/64eb539a-acff-4e76-bdaa-db24b9abed39-kube-api-access-r4vdl\") pod \"aodh-f38e-account-create-update-kgfvh\" (UID: \"64eb539a-acff-4e76-bdaa-db24b9abed39\") " pod="openstack/aodh-f38e-account-create-update-kgfvh" Dec 16 09:08:04 crc kubenswrapper[4823]: I1216 09:08:04.502177 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/64eb539a-acff-4e76-bdaa-db24b9abed39-operator-scripts\") pod \"aodh-f38e-account-create-update-kgfvh\" (UID: \"64eb539a-acff-4e76-bdaa-db24b9abed39\") " pod="openstack/aodh-f38e-account-create-update-kgfvh" Dec 16 09:08:04 crc kubenswrapper[4823]: I1216 09:08:04.502233 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9q7gr\" (UniqueName: \"kubernetes.io/projected/f9d47387-c96f-4154-be1c-eda89c0e2a77-kube-api-access-9q7gr\") pod \"aodh-db-create-gr9jj\" 
(UID: \"f9d47387-c96f-4154-be1c-eda89c0e2a77\") " pod="openstack/aodh-db-create-gr9jj" Dec 16 09:08:04 crc kubenswrapper[4823]: I1216 09:08:04.604167 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4vdl\" (UniqueName: \"kubernetes.io/projected/64eb539a-acff-4e76-bdaa-db24b9abed39-kube-api-access-r4vdl\") pod \"aodh-f38e-account-create-update-kgfvh\" (UID: \"64eb539a-acff-4e76-bdaa-db24b9abed39\") " pod="openstack/aodh-f38e-account-create-update-kgfvh" Dec 16 09:08:04 crc kubenswrapper[4823]: I1216 09:08:04.604495 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/64eb539a-acff-4e76-bdaa-db24b9abed39-operator-scripts\") pod \"aodh-f38e-account-create-update-kgfvh\" (UID: \"64eb539a-acff-4e76-bdaa-db24b9abed39\") " pod="openstack/aodh-f38e-account-create-update-kgfvh" Dec 16 09:08:04 crc kubenswrapper[4823]: I1216 09:08:04.604581 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9q7gr\" (UniqueName: \"kubernetes.io/projected/f9d47387-c96f-4154-be1c-eda89c0e2a77-kube-api-access-9q7gr\") pod \"aodh-db-create-gr9jj\" (UID: \"f9d47387-c96f-4154-be1c-eda89c0e2a77\") " pod="openstack/aodh-db-create-gr9jj" Dec 16 09:08:04 crc kubenswrapper[4823]: I1216 09:08:04.604666 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9d47387-c96f-4154-be1c-eda89c0e2a77-operator-scripts\") pod \"aodh-db-create-gr9jj\" (UID: \"f9d47387-c96f-4154-be1c-eda89c0e2a77\") " pod="openstack/aodh-db-create-gr9jj" Dec 16 09:08:04 crc kubenswrapper[4823]: I1216 09:08:04.605573 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/64eb539a-acff-4e76-bdaa-db24b9abed39-operator-scripts\") pod \"aodh-f38e-account-create-update-kgfvh\" (UID: 
\"64eb539a-acff-4e76-bdaa-db24b9abed39\") " pod="openstack/aodh-f38e-account-create-update-kgfvh" Dec 16 09:08:04 crc kubenswrapper[4823]: I1216 09:08:04.605799 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9d47387-c96f-4154-be1c-eda89c0e2a77-operator-scripts\") pod \"aodh-db-create-gr9jj\" (UID: \"f9d47387-c96f-4154-be1c-eda89c0e2a77\") " pod="openstack/aodh-db-create-gr9jj" Dec 16 09:08:04 crc kubenswrapper[4823]: I1216 09:08:04.627553 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4vdl\" (UniqueName: \"kubernetes.io/projected/64eb539a-acff-4e76-bdaa-db24b9abed39-kube-api-access-r4vdl\") pod \"aodh-f38e-account-create-update-kgfvh\" (UID: \"64eb539a-acff-4e76-bdaa-db24b9abed39\") " pod="openstack/aodh-f38e-account-create-update-kgfvh" Dec 16 09:08:04 crc kubenswrapper[4823]: I1216 09:08:04.635490 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9q7gr\" (UniqueName: \"kubernetes.io/projected/f9d47387-c96f-4154-be1c-eda89c0e2a77-kube-api-access-9q7gr\") pod \"aodh-db-create-gr9jj\" (UID: \"f9d47387-c96f-4154-be1c-eda89c0e2a77\") " pod="openstack/aodh-db-create-gr9jj" Dec 16 09:08:04 crc kubenswrapper[4823]: I1216 09:08:04.668861 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-gr9jj" Dec 16 09:08:04 crc kubenswrapper[4823]: I1216 09:08:04.772493 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-f38e-account-create-update-kgfvh" Dec 16 09:08:05 crc kubenswrapper[4823]: I1216 09:08:05.251973 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-gr9jj"] Dec 16 09:08:05 crc kubenswrapper[4823]: W1216 09:08:05.260785 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf9d47387_c96f_4154_be1c_eda89c0e2a77.slice/crio-daf2e07e5831cce78e46c8ea3f2e38d1d97e2d8c2368315d5ae6f831ed35fd3d WatchSource:0}: Error finding container daf2e07e5831cce78e46c8ea3f2e38d1d97e2d8c2368315d5ae6f831ed35fd3d: Status 404 returned error can't find the container with id daf2e07e5831cce78e46c8ea3f2e38d1d97e2d8c2368315d5ae6f831ed35fd3d Dec 16 09:08:05 crc kubenswrapper[4823]: I1216 09:08:05.432865 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-f38e-account-create-update-kgfvh"] Dec 16 09:08:05 crc kubenswrapper[4823]: I1216 09:08:05.995478 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-f38e-account-create-update-kgfvh" event={"ID":"64eb539a-acff-4e76-bdaa-db24b9abed39","Type":"ContainerStarted","Data":"60f7cef83d7054fd046b15fe4e391677f350fc91768e0d0ca253e20b4eb08a06"} Dec 16 09:08:05 crc kubenswrapper[4823]: I1216 09:08:05.995835 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-f38e-account-create-update-kgfvh" event={"ID":"64eb539a-acff-4e76-bdaa-db24b9abed39","Type":"ContainerStarted","Data":"4b9aca162215b43e97fac5f33b2fabb31fc6cb06f412d6d05ebe1e1f5ceadc90"} Dec 16 09:08:06 crc kubenswrapper[4823]: I1216 09:08:06.001078 4823 generic.go:334] "Generic (PLEG): container finished" podID="f9d47387-c96f-4154-be1c-eda89c0e2a77" containerID="e7c4b08b0b98afba7de5e397e6bee5b5f06c847966a35dc613331d6dafcf3b4f" exitCode=0 Dec 16 09:08:06 crc kubenswrapper[4823]: I1216 09:08:06.001185 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/aodh-db-create-gr9jj" event={"ID":"f9d47387-c96f-4154-be1c-eda89c0e2a77","Type":"ContainerDied","Data":"e7c4b08b0b98afba7de5e397e6bee5b5f06c847966a35dc613331d6dafcf3b4f"} Dec 16 09:08:06 crc kubenswrapper[4823]: I1216 09:08:06.001522 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-gr9jj" event={"ID":"f9d47387-c96f-4154-be1c-eda89c0e2a77","Type":"ContainerStarted","Data":"daf2e07e5831cce78e46c8ea3f2e38d1d97e2d8c2368315d5ae6f831ed35fd3d"} Dec 16 09:08:06 crc kubenswrapper[4823]: I1216 09:08:06.012752 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-f38e-account-create-update-kgfvh" podStartSLOduration=2.012728981 podStartE2EDuration="2.012728981s" podCreationTimestamp="2025-12-16 09:08:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 09:08:06.011219263 +0000 UTC m=+7964.499785386" watchObservedRunningTime="2025-12-16 09:08:06.012728981 +0000 UTC m=+7964.501295104" Dec 16 09:08:07 crc kubenswrapper[4823]: I1216 09:08:07.012534 4823 generic.go:334] "Generic (PLEG): container finished" podID="64eb539a-acff-4e76-bdaa-db24b9abed39" containerID="60f7cef83d7054fd046b15fe4e391677f350fc91768e0d0ca253e20b4eb08a06" exitCode=0 Dec 16 09:08:07 crc kubenswrapper[4823]: I1216 09:08:07.012609 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-f38e-account-create-update-kgfvh" event={"ID":"64eb539a-acff-4e76-bdaa-db24b9abed39","Type":"ContainerDied","Data":"60f7cef83d7054fd046b15fe4e391677f350fc91768e0d0ca253e20b4eb08a06"} Dec 16 09:08:07 crc kubenswrapper[4823]: I1216 09:08:07.102927 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Dec 16 09:08:07 crc kubenswrapper[4823]: I1216 09:08:07.476116 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-create-gr9jj" Dec 16 09:08:07 crc kubenswrapper[4823]: I1216 09:08:07.571529 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9d47387-c96f-4154-be1c-eda89c0e2a77-operator-scripts\") pod \"f9d47387-c96f-4154-be1c-eda89c0e2a77\" (UID: \"f9d47387-c96f-4154-be1c-eda89c0e2a77\") " Dec 16 09:08:07 crc kubenswrapper[4823]: I1216 09:08:07.571644 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9q7gr\" (UniqueName: \"kubernetes.io/projected/f9d47387-c96f-4154-be1c-eda89c0e2a77-kube-api-access-9q7gr\") pod \"f9d47387-c96f-4154-be1c-eda89c0e2a77\" (UID: \"f9d47387-c96f-4154-be1c-eda89c0e2a77\") " Dec 16 09:08:07 crc kubenswrapper[4823]: I1216 09:08:07.572390 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9d47387-c96f-4154-be1c-eda89c0e2a77-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f9d47387-c96f-4154-be1c-eda89c0e2a77" (UID: "f9d47387-c96f-4154-be1c-eda89c0e2a77"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 09:08:07 crc kubenswrapper[4823]: I1216 09:08:07.580092 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9d47387-c96f-4154-be1c-eda89c0e2a77-kube-api-access-9q7gr" (OuterVolumeSpecName: "kube-api-access-9q7gr") pod "f9d47387-c96f-4154-be1c-eda89c0e2a77" (UID: "f9d47387-c96f-4154-be1c-eda89c0e2a77"). InnerVolumeSpecName "kube-api-access-9q7gr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:08:07 crc kubenswrapper[4823]: I1216 09:08:07.674814 4823 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9d47387-c96f-4154-be1c-eda89c0e2a77-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 09:08:07 crc kubenswrapper[4823]: I1216 09:08:07.674860 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9q7gr\" (UniqueName: \"kubernetes.io/projected/f9d47387-c96f-4154-be1c-eda89c0e2a77-kube-api-access-9q7gr\") on node \"crc\" DevicePath \"\"" Dec 16 09:08:08 crc kubenswrapper[4823]: I1216 09:08:08.024845 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-gr9jj" Dec 16 09:08:08 crc kubenswrapper[4823]: I1216 09:08:08.024853 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-gr9jj" event={"ID":"f9d47387-c96f-4154-be1c-eda89c0e2a77","Type":"ContainerDied","Data":"daf2e07e5831cce78e46c8ea3f2e38d1d97e2d8c2368315d5ae6f831ed35fd3d"} Dec 16 09:08:08 crc kubenswrapper[4823]: I1216 09:08:08.025556 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="daf2e07e5831cce78e46c8ea3f2e38d1d97e2d8c2368315d5ae6f831ed35fd3d" Dec 16 09:08:08 crc kubenswrapper[4823]: I1216 09:08:08.458321 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-f38e-account-create-update-kgfvh" Dec 16 09:08:08 crc kubenswrapper[4823]: I1216 09:08:08.515840 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r4vdl\" (UniqueName: \"kubernetes.io/projected/64eb539a-acff-4e76-bdaa-db24b9abed39-kube-api-access-r4vdl\") pod \"64eb539a-acff-4e76-bdaa-db24b9abed39\" (UID: \"64eb539a-acff-4e76-bdaa-db24b9abed39\") " Dec 16 09:08:08 crc kubenswrapper[4823]: I1216 09:08:08.516012 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/64eb539a-acff-4e76-bdaa-db24b9abed39-operator-scripts\") pod \"64eb539a-acff-4e76-bdaa-db24b9abed39\" (UID: \"64eb539a-acff-4e76-bdaa-db24b9abed39\") " Dec 16 09:08:08 crc kubenswrapper[4823]: I1216 09:08:08.517077 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/64eb539a-acff-4e76-bdaa-db24b9abed39-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "64eb539a-acff-4e76-bdaa-db24b9abed39" (UID: "64eb539a-acff-4e76-bdaa-db24b9abed39"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 09:08:08 crc kubenswrapper[4823]: I1216 09:08:08.517547 4823 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/64eb539a-acff-4e76-bdaa-db24b9abed39-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 09:08:08 crc kubenswrapper[4823]: I1216 09:08:08.521311 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64eb539a-acff-4e76-bdaa-db24b9abed39-kube-api-access-r4vdl" (OuterVolumeSpecName: "kube-api-access-r4vdl") pod "64eb539a-acff-4e76-bdaa-db24b9abed39" (UID: "64eb539a-acff-4e76-bdaa-db24b9abed39"). InnerVolumeSpecName "kube-api-access-r4vdl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:08:08 crc kubenswrapper[4823]: I1216 09:08:08.620283 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r4vdl\" (UniqueName: \"kubernetes.io/projected/64eb539a-acff-4e76-bdaa-db24b9abed39-kube-api-access-r4vdl\") on node \"crc\" DevicePath \"\"" Dec 16 09:08:09 crc kubenswrapper[4823]: I1216 09:08:09.041808 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-f38e-account-create-update-kgfvh" event={"ID":"64eb539a-acff-4e76-bdaa-db24b9abed39","Type":"ContainerDied","Data":"4b9aca162215b43e97fac5f33b2fabb31fc6cb06f412d6d05ebe1e1f5ceadc90"} Dec 16 09:08:09 crc kubenswrapper[4823]: I1216 09:08:09.041869 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4b9aca162215b43e97fac5f33b2fabb31fc6cb06f412d6d05ebe1e1f5ceadc90" Dec 16 09:08:09 crc kubenswrapper[4823]: I1216 09:08:09.041938 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-f38e-account-create-update-kgfvh" Dec 16 09:08:09 crc kubenswrapper[4823]: I1216 09:08:09.728432 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-sync-hm4lm"] Dec 16 09:08:09 crc kubenswrapper[4823]: E1216 09:08:09.729281 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64eb539a-acff-4e76-bdaa-db24b9abed39" containerName="mariadb-account-create-update" Dec 16 09:08:09 crc kubenswrapper[4823]: I1216 09:08:09.729307 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="64eb539a-acff-4e76-bdaa-db24b9abed39" containerName="mariadb-account-create-update" Dec 16 09:08:09 crc kubenswrapper[4823]: E1216 09:08:09.729351 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9d47387-c96f-4154-be1c-eda89c0e2a77" containerName="mariadb-database-create" Dec 16 09:08:09 crc kubenswrapper[4823]: I1216 09:08:09.729359 4823 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="f9d47387-c96f-4154-be1c-eda89c0e2a77" containerName="mariadb-database-create" Dec 16 09:08:09 crc kubenswrapper[4823]: I1216 09:08:09.729591 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9d47387-c96f-4154-be1c-eda89c0e2a77" containerName="mariadb-database-create" Dec 16 09:08:09 crc kubenswrapper[4823]: I1216 09:08:09.729627 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="64eb539a-acff-4e76-bdaa-db24b9abed39" containerName="mariadb-account-create-update" Dec 16 09:08:09 crc kubenswrapper[4823]: I1216 09:08:09.730864 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-hm4lm" Dec 16 09:08:09 crc kubenswrapper[4823]: I1216 09:08:09.742252 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Dec 16 09:08:09 crc kubenswrapper[4823]: I1216 09:08:09.742331 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Dec 16 09:08:09 crc kubenswrapper[4823]: I1216 09:08:09.742425 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-c8hwr" Dec 16 09:08:09 crc kubenswrapper[4823]: I1216 09:08:09.742735 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 16 09:08:09 crc kubenswrapper[4823]: I1216 09:08:09.748285 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-hm4lm"] Dec 16 09:08:09 crc kubenswrapper[4823]: I1216 09:08:09.750328 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d686a815-6ed0-4dbc-bccb-eae76386b548-config-data\") pod \"aodh-db-sync-hm4lm\" (UID: \"d686a815-6ed0-4dbc-bccb-eae76386b548\") " pod="openstack/aodh-db-sync-hm4lm" Dec 16 09:08:09 crc kubenswrapper[4823]: I1216 09:08:09.750370 4823 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d686a815-6ed0-4dbc-bccb-eae76386b548-scripts\") pod \"aodh-db-sync-hm4lm\" (UID: \"d686a815-6ed0-4dbc-bccb-eae76386b548\") " pod="openstack/aodh-db-sync-hm4lm" Dec 16 09:08:09 crc kubenswrapper[4823]: I1216 09:08:09.750537 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d686a815-6ed0-4dbc-bccb-eae76386b548-combined-ca-bundle\") pod \"aodh-db-sync-hm4lm\" (UID: \"d686a815-6ed0-4dbc-bccb-eae76386b548\") " pod="openstack/aodh-db-sync-hm4lm" Dec 16 09:08:09 crc kubenswrapper[4823]: I1216 09:08:09.750608 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lpmw\" (UniqueName: \"kubernetes.io/projected/d686a815-6ed0-4dbc-bccb-eae76386b548-kube-api-access-4lpmw\") pod \"aodh-db-sync-hm4lm\" (UID: \"d686a815-6ed0-4dbc-bccb-eae76386b548\") " pod="openstack/aodh-db-sync-hm4lm" Dec 16 09:08:09 crc kubenswrapper[4823]: I1216 09:08:09.787806 4823 scope.go:117] "RemoveContainer" containerID="14e51af7fb5c2d7b7fdc9e1989841225a65614d883db6f8d5aea8aeb819bd04d" Dec 16 09:08:09 crc kubenswrapper[4823]: E1216 09:08:09.788138 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 09:08:09 crc kubenswrapper[4823]: I1216 09:08:09.852175 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d686a815-6ed0-4dbc-bccb-eae76386b548-config-data\") pod 
\"aodh-db-sync-hm4lm\" (UID: \"d686a815-6ed0-4dbc-bccb-eae76386b548\") " pod="openstack/aodh-db-sync-hm4lm" Dec 16 09:08:09 crc kubenswrapper[4823]: I1216 09:08:09.852243 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d686a815-6ed0-4dbc-bccb-eae76386b548-scripts\") pod \"aodh-db-sync-hm4lm\" (UID: \"d686a815-6ed0-4dbc-bccb-eae76386b548\") " pod="openstack/aodh-db-sync-hm4lm" Dec 16 09:08:09 crc kubenswrapper[4823]: I1216 09:08:09.853964 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d686a815-6ed0-4dbc-bccb-eae76386b548-combined-ca-bundle\") pod \"aodh-db-sync-hm4lm\" (UID: \"d686a815-6ed0-4dbc-bccb-eae76386b548\") " pod="openstack/aodh-db-sync-hm4lm" Dec 16 09:08:09 crc kubenswrapper[4823]: I1216 09:08:09.854364 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4lpmw\" (UniqueName: \"kubernetes.io/projected/d686a815-6ed0-4dbc-bccb-eae76386b548-kube-api-access-4lpmw\") pod \"aodh-db-sync-hm4lm\" (UID: \"d686a815-6ed0-4dbc-bccb-eae76386b548\") " pod="openstack/aodh-db-sync-hm4lm" Dec 16 09:08:09 crc kubenswrapper[4823]: I1216 09:08:09.858695 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d686a815-6ed0-4dbc-bccb-eae76386b548-config-data\") pod \"aodh-db-sync-hm4lm\" (UID: \"d686a815-6ed0-4dbc-bccb-eae76386b548\") " pod="openstack/aodh-db-sync-hm4lm" Dec 16 09:08:09 crc kubenswrapper[4823]: I1216 09:08:09.859829 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d686a815-6ed0-4dbc-bccb-eae76386b548-combined-ca-bundle\") pod \"aodh-db-sync-hm4lm\" (UID: \"d686a815-6ed0-4dbc-bccb-eae76386b548\") " pod="openstack/aodh-db-sync-hm4lm" Dec 16 09:08:09 crc kubenswrapper[4823]: I1216 09:08:09.869503 
4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d686a815-6ed0-4dbc-bccb-eae76386b548-scripts\") pod \"aodh-db-sync-hm4lm\" (UID: \"d686a815-6ed0-4dbc-bccb-eae76386b548\") " pod="openstack/aodh-db-sync-hm4lm" Dec 16 09:08:09 crc kubenswrapper[4823]: I1216 09:08:09.875780 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lpmw\" (UniqueName: \"kubernetes.io/projected/d686a815-6ed0-4dbc-bccb-eae76386b548-kube-api-access-4lpmw\") pod \"aodh-db-sync-hm4lm\" (UID: \"d686a815-6ed0-4dbc-bccb-eae76386b548\") " pod="openstack/aodh-db-sync-hm4lm" Dec 16 09:08:10 crc kubenswrapper[4823]: I1216 09:08:10.067870 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-hm4lm" Dec 16 09:08:10 crc kubenswrapper[4823]: I1216 09:08:10.597505 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-hm4lm"] Dec 16 09:08:11 crc kubenswrapper[4823]: I1216 09:08:11.062495 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-hm4lm" event={"ID":"d686a815-6ed0-4dbc-bccb-eae76386b548","Type":"ContainerStarted","Data":"45643971ceffd740c88d832427c270959e10df4fe612b8f5b971c58ad9f5c982"} Dec 16 09:08:16 crc kubenswrapper[4823]: I1216 09:08:16.115774 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-hm4lm" event={"ID":"d686a815-6ed0-4dbc-bccb-eae76386b548","Type":"ContainerStarted","Data":"bfe81cfc2db040563bbadf5d2c6d9c8824ce12477bde8907e686f8f1ec7a24d6"} Dec 16 09:08:16 crc kubenswrapper[4823]: I1216 09:08:16.132520 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-sync-hm4lm" podStartSLOduration=2.446047459 podStartE2EDuration="7.132503783s" podCreationTimestamp="2025-12-16 09:08:09 +0000 UTC" firstStartedPulling="2025-12-16 09:08:10.623800185 +0000 UTC m=+7969.112366308" lastFinishedPulling="2025-12-16 
09:08:15.310256489 +0000 UTC m=+7973.798822632" observedRunningTime="2025-12-16 09:08:16.131518232 +0000 UTC m=+7974.620084355" watchObservedRunningTime="2025-12-16 09:08:16.132503783 +0000 UTC m=+7974.621069906" Dec 16 09:08:17 crc kubenswrapper[4823]: I1216 09:08:17.103923 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Dec 16 09:08:17 crc kubenswrapper[4823]: I1216 09:08:17.109349 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Dec 16 09:08:17 crc kubenswrapper[4823]: I1216 09:08:17.129138 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Dec 16 09:08:18 crc kubenswrapper[4823]: I1216 09:08:18.135818 4823 generic.go:334] "Generic (PLEG): container finished" podID="d686a815-6ed0-4dbc-bccb-eae76386b548" containerID="bfe81cfc2db040563bbadf5d2c6d9c8824ce12477bde8907e686f8f1ec7a24d6" exitCode=0 Dec 16 09:08:18 crc kubenswrapper[4823]: I1216 09:08:18.135954 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-hm4lm" event={"ID":"d686a815-6ed0-4dbc-bccb-eae76386b548","Type":"ContainerDied","Data":"bfe81cfc2db040563bbadf5d2c6d9c8824ce12477bde8907e686f8f1ec7a24d6"} Dec 16 09:08:19 crc kubenswrapper[4823]: I1216 09:08:19.559971 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-hm4lm" Dec 16 09:08:19 crc kubenswrapper[4823]: I1216 09:08:19.683616 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d686a815-6ed0-4dbc-bccb-eae76386b548-combined-ca-bundle\") pod \"d686a815-6ed0-4dbc-bccb-eae76386b548\" (UID: \"d686a815-6ed0-4dbc-bccb-eae76386b548\") " Dec 16 09:08:19 crc kubenswrapper[4823]: I1216 09:08:19.683787 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d686a815-6ed0-4dbc-bccb-eae76386b548-scripts\") pod \"d686a815-6ed0-4dbc-bccb-eae76386b548\" (UID: \"d686a815-6ed0-4dbc-bccb-eae76386b548\") " Dec 16 09:08:19 crc kubenswrapper[4823]: I1216 09:08:19.683945 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4lpmw\" (UniqueName: \"kubernetes.io/projected/d686a815-6ed0-4dbc-bccb-eae76386b548-kube-api-access-4lpmw\") pod \"d686a815-6ed0-4dbc-bccb-eae76386b548\" (UID: \"d686a815-6ed0-4dbc-bccb-eae76386b548\") " Dec 16 09:08:19 crc kubenswrapper[4823]: I1216 09:08:19.683973 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d686a815-6ed0-4dbc-bccb-eae76386b548-config-data\") pod \"d686a815-6ed0-4dbc-bccb-eae76386b548\" (UID: \"d686a815-6ed0-4dbc-bccb-eae76386b548\") " Dec 16 09:08:19 crc kubenswrapper[4823]: I1216 09:08:19.689780 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d686a815-6ed0-4dbc-bccb-eae76386b548-scripts" (OuterVolumeSpecName: "scripts") pod "d686a815-6ed0-4dbc-bccb-eae76386b548" (UID: "d686a815-6ed0-4dbc-bccb-eae76386b548"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:08:19 crc kubenswrapper[4823]: I1216 09:08:19.691281 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d686a815-6ed0-4dbc-bccb-eae76386b548-kube-api-access-4lpmw" (OuterVolumeSpecName: "kube-api-access-4lpmw") pod "d686a815-6ed0-4dbc-bccb-eae76386b548" (UID: "d686a815-6ed0-4dbc-bccb-eae76386b548"). InnerVolumeSpecName "kube-api-access-4lpmw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:08:19 crc kubenswrapper[4823]: I1216 09:08:19.716169 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d686a815-6ed0-4dbc-bccb-eae76386b548-config-data" (OuterVolumeSpecName: "config-data") pod "d686a815-6ed0-4dbc-bccb-eae76386b548" (UID: "d686a815-6ed0-4dbc-bccb-eae76386b548"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:08:19 crc kubenswrapper[4823]: I1216 09:08:19.721961 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d686a815-6ed0-4dbc-bccb-eae76386b548-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d686a815-6ed0-4dbc-bccb-eae76386b548" (UID: "d686a815-6ed0-4dbc-bccb-eae76386b548"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:08:19 crc kubenswrapper[4823]: I1216 09:08:19.786052 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d686a815-6ed0-4dbc-bccb-eae76386b548-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 09:08:19 crc kubenswrapper[4823]: I1216 09:08:19.786325 4823 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d686a815-6ed0-4dbc-bccb-eae76386b548-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 09:08:19 crc kubenswrapper[4823]: I1216 09:08:19.786395 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4lpmw\" (UniqueName: \"kubernetes.io/projected/d686a815-6ed0-4dbc-bccb-eae76386b548-kube-api-access-4lpmw\") on node \"crc\" DevicePath \"\"" Dec 16 09:08:19 crc kubenswrapper[4823]: I1216 09:08:19.786454 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d686a815-6ed0-4dbc-bccb-eae76386b548-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 09:08:20 crc kubenswrapper[4823]: I1216 09:08:20.157911 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-hm4lm" event={"ID":"d686a815-6ed0-4dbc-bccb-eae76386b548","Type":"ContainerDied","Data":"45643971ceffd740c88d832427c270959e10df4fe612b8f5b971c58ad9f5c982"} Dec 16 09:08:20 crc kubenswrapper[4823]: I1216 09:08:20.158313 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="45643971ceffd740c88d832427c270959e10df4fe612b8f5b971c58ad9f5c982" Dec 16 09:08:20 crc kubenswrapper[4823]: I1216 09:08:20.157991 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-hm4lm" Dec 16 09:08:21 crc kubenswrapper[4823]: I1216 09:08:21.020149 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 16 09:08:21 crc kubenswrapper[4823]: I1216 09:08:21.047387 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-4d28-account-create-update-zfk4f"] Dec 16 09:08:21 crc kubenswrapper[4823]: I1216 09:08:21.063682 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-n2f4r"] Dec 16 09:08:21 crc kubenswrapper[4823]: I1216 09:08:21.077162 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-4d28-account-create-update-zfk4f"] Dec 16 09:08:21 crc kubenswrapper[4823]: I1216 09:08:21.088215 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-n2f4r"] Dec 16 09:08:21 crc kubenswrapper[4823]: I1216 09:08:21.788876 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e289ad3-49d6-41ac-aa1f-36e839cb4dfe" path="/var/lib/kubelet/pods/4e289ad3-49d6-41ac-aa1f-36e839cb4dfe/volumes" Dec 16 09:08:21 crc kubenswrapper[4823]: I1216 09:08:21.790500 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eacf4276-abf0-43b1-b50b-31b9a98fd977" path="/var/lib/kubelet/pods/eacf4276-abf0-43b1-b50b-31b9a98fd977/volumes" Dec 16 09:08:22 crc kubenswrapper[4823]: I1216 09:08:22.771992 4823 scope.go:117] "RemoveContainer" containerID="14e51af7fb5c2d7b7fdc9e1989841225a65614d883db6f8d5aea8aeb819bd04d" Dec 16 09:08:22 crc kubenswrapper[4823]: E1216 09:08:22.772612 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3"
Dec 16 09:08:24 crc kubenswrapper[4823]: I1216 09:08:24.817731 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"]
Dec 16 09:08:24 crc kubenswrapper[4823]: E1216 09:08:24.818756 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d686a815-6ed0-4dbc-bccb-eae76386b548" containerName="aodh-db-sync"
Dec 16 09:08:24 crc kubenswrapper[4823]: I1216 09:08:24.818778 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="d686a815-6ed0-4dbc-bccb-eae76386b548" containerName="aodh-db-sync"
Dec 16 09:08:24 crc kubenswrapper[4823]: I1216 09:08:24.819012 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="d686a815-6ed0-4dbc-bccb-eae76386b548" containerName="aodh-db-sync"
Dec 16 09:08:24 crc kubenswrapper[4823]: I1216 09:08:24.821758 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0"
Dec 16 09:08:24 crc kubenswrapper[4823]: I1216 09:08:24.826282 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts"
Dec 16 09:08:24 crc kubenswrapper[4823]: I1216 09:08:24.826502 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data"
Dec 16 09:08:24 crc kubenswrapper[4823]: I1216 09:08:24.826621 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-c8hwr"
Dec 16 09:08:24 crc kubenswrapper[4823]: I1216 09:08:24.849442 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"]
Dec 16 09:08:24 crc kubenswrapper[4823]: I1216 09:08:24.906322 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2q6nx\" (UniqueName: \"kubernetes.io/projected/e2679fa2-b047-41db-9cd1-8d5b860402f6-kube-api-access-2q6nx\") pod \"aodh-0\" (UID: \"e2679fa2-b047-41db-9cd1-8d5b860402f6\") " pod="openstack/aodh-0"
Dec 16 09:08:24 crc kubenswrapper[4823]: I1216 09:08:24.906414 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2679fa2-b047-41db-9cd1-8d5b860402f6-combined-ca-bundle\") pod \"aodh-0\" (UID: \"e2679fa2-b047-41db-9cd1-8d5b860402f6\") " pod="openstack/aodh-0"
Dec 16 09:08:24 crc kubenswrapper[4823]: I1216 09:08:24.906444 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2679fa2-b047-41db-9cd1-8d5b860402f6-config-data\") pod \"aodh-0\" (UID: \"e2679fa2-b047-41db-9cd1-8d5b860402f6\") " pod="openstack/aodh-0"
Dec 16 09:08:24 crc kubenswrapper[4823]: I1216 09:08:24.906609 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2679fa2-b047-41db-9cd1-8d5b860402f6-scripts\") pod \"aodh-0\" (UID: \"e2679fa2-b047-41db-9cd1-8d5b860402f6\") " pod="openstack/aodh-0"
Dec 16 09:08:25 crc kubenswrapper[4823]: I1216 09:08:25.008185 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2679fa2-b047-41db-9cd1-8d5b860402f6-scripts\") pod \"aodh-0\" (UID: \"e2679fa2-b047-41db-9cd1-8d5b860402f6\") " pod="openstack/aodh-0"
Dec 16 09:08:25 crc kubenswrapper[4823]: I1216 09:08:25.008311 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2q6nx\" (UniqueName: \"kubernetes.io/projected/e2679fa2-b047-41db-9cd1-8d5b860402f6-kube-api-access-2q6nx\") pod \"aodh-0\" (UID: \"e2679fa2-b047-41db-9cd1-8d5b860402f6\") " pod="openstack/aodh-0"
Dec 16 09:08:25 crc kubenswrapper[4823]: I1216 09:08:25.008365 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2679fa2-b047-41db-9cd1-8d5b860402f6-combined-ca-bundle\") pod \"aodh-0\" (UID: \"e2679fa2-b047-41db-9cd1-8d5b860402f6\") " pod="openstack/aodh-0"
Dec 16 09:08:25 crc kubenswrapper[4823]: I1216 09:08:25.008393 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2679fa2-b047-41db-9cd1-8d5b860402f6-config-data\") pod \"aodh-0\" (UID: \"e2679fa2-b047-41db-9cd1-8d5b860402f6\") " pod="openstack/aodh-0"
Dec 16 09:08:25 crc kubenswrapper[4823]: I1216 09:08:25.016827 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2679fa2-b047-41db-9cd1-8d5b860402f6-combined-ca-bundle\") pod \"aodh-0\" (UID: \"e2679fa2-b047-41db-9cd1-8d5b860402f6\") " pod="openstack/aodh-0"
Dec 16 09:08:25 crc kubenswrapper[4823]: I1216 09:08:25.020918 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2679fa2-b047-41db-9cd1-8d5b860402f6-scripts\") pod \"aodh-0\" (UID: \"e2679fa2-b047-41db-9cd1-8d5b860402f6\") " pod="openstack/aodh-0"
Dec 16 09:08:25 crc kubenswrapper[4823]: I1216 09:08:25.023435 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2679fa2-b047-41db-9cd1-8d5b860402f6-config-data\") pod \"aodh-0\" (UID: \"e2679fa2-b047-41db-9cd1-8d5b860402f6\") " pod="openstack/aodh-0"
Dec 16 09:08:25 crc kubenswrapper[4823]: I1216 09:08:25.041137 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2q6nx\" (UniqueName: \"kubernetes.io/projected/e2679fa2-b047-41db-9cd1-8d5b860402f6-kube-api-access-2q6nx\") pod \"aodh-0\" (UID: \"e2679fa2-b047-41db-9cd1-8d5b860402f6\") " pod="openstack/aodh-0"
Dec 16 09:08:25 crc kubenswrapper[4823]: I1216 09:08:25.169517 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0"
Dec 16 09:08:25 crc kubenswrapper[4823]: I1216 09:08:25.642728 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"]
Dec 16 09:08:25 crc kubenswrapper[4823]: W1216 09:08:25.660665 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode2679fa2_b047_41db_9cd1_8d5b860402f6.slice/crio-b9e08acf766f4bc65963c60c70c87789201704b78b192b94ae52efa419a5152f WatchSource:0}: Error finding container b9e08acf766f4bc65963c60c70c87789201704b78b192b94ae52efa419a5152f: Status 404 returned error can't find the container with id b9e08acf766f4bc65963c60c70c87789201704b78b192b94ae52efa419a5152f
Dec 16 09:08:26 crc kubenswrapper[4823]: I1216 09:08:26.217778 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"e2679fa2-b047-41db-9cd1-8d5b860402f6","Type":"ContainerStarted","Data":"b9e08acf766f4bc65963c60c70c87789201704b78b192b94ae52efa419a5152f"}
Dec 16 09:08:27 crc kubenswrapper[4823]: I1216 09:08:27.243602 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"e2679fa2-b047-41db-9cd1-8d5b860402f6","Type":"ContainerStarted","Data":"c8e75d4b99a8c660cc95ace3135bd4c4c77013222edc8ed7574091892902bc00"}
Dec 16 09:08:27 crc kubenswrapper[4823]: I1216 09:08:27.603071 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Dec 16 09:08:27 crc kubenswrapper[4823]: I1216 09:08:27.603391 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f2d2bd26-e3e1-42e8-b946-8dac79b49e51" containerName="ceilometer-central-agent" containerID="cri-o://e4d3357dadc5bf821fe74af974375318994589e62a109936cb435cd8dc3db23d" gracePeriod=30
Dec 16 09:08:27 crc kubenswrapper[4823]: I1216 09:08:27.603525 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f2d2bd26-e3e1-42e8-b946-8dac79b49e51" containerName="proxy-httpd" containerID="cri-o://3bf365a0fe620090601c8a14825e58042cc998b33052e1c6b7445f097370230d" gracePeriod=30
Dec 16 09:08:27 crc kubenswrapper[4823]: I1216 09:08:27.603571 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f2d2bd26-e3e1-42e8-b946-8dac79b49e51" containerName="sg-core" containerID="cri-o://957f1733e3745c4bf160f023bc9a01f5bc95dbaca647e4576466831de86fe54b" gracePeriod=30
Dec 16 09:08:27 crc kubenswrapper[4823]: I1216 09:08:27.603599 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f2d2bd26-e3e1-42e8-b946-8dac79b49e51" containerName="ceilometer-notification-agent" containerID="cri-o://08400715c044e89aa8efc8a75cc7c1f9be395ad5c84aeac9db3852913f85966a" gracePeriod=30
Dec 16 09:08:28 crc kubenswrapper[4823]: I1216 09:08:28.261281 4823 generic.go:334] "Generic (PLEG): container finished" podID="f2d2bd26-e3e1-42e8-b946-8dac79b49e51" containerID="3bf365a0fe620090601c8a14825e58042cc998b33052e1c6b7445f097370230d" exitCode=0
Dec 16 09:08:28 crc kubenswrapper[4823]: I1216 09:08:28.261591 4823 generic.go:334] "Generic (PLEG): container finished" podID="f2d2bd26-e3e1-42e8-b946-8dac79b49e51" containerID="957f1733e3745c4bf160f023bc9a01f5bc95dbaca647e4576466831de86fe54b" exitCode=2
Dec 16 09:08:28 crc kubenswrapper[4823]: I1216 09:08:28.261600 4823 generic.go:334] "Generic (PLEG): container finished" podID="f2d2bd26-e3e1-42e8-b946-8dac79b49e51" containerID="e4d3357dadc5bf821fe74af974375318994589e62a109936cb435cd8dc3db23d" exitCode=0
Dec 16 09:08:28 crc kubenswrapper[4823]: I1216 09:08:28.261346 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f2d2bd26-e3e1-42e8-b946-8dac79b49e51","Type":"ContainerDied","Data":"3bf365a0fe620090601c8a14825e58042cc998b33052e1c6b7445f097370230d"}
Dec 16 09:08:28 crc kubenswrapper[4823]: I1216 09:08:28.261634 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f2d2bd26-e3e1-42e8-b946-8dac79b49e51","Type":"ContainerDied","Data":"957f1733e3745c4bf160f023bc9a01f5bc95dbaca647e4576466831de86fe54b"}
Dec 16 09:08:28 crc kubenswrapper[4823]: I1216 09:08:28.261666 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f2d2bd26-e3e1-42e8-b946-8dac79b49e51","Type":"ContainerDied","Data":"e4d3357dadc5bf821fe74af974375318994589e62a109936cb435cd8dc3db23d"}
Dec 16 09:08:28 crc kubenswrapper[4823]: I1216 09:08:28.685344 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"]
Dec 16 09:08:29 crc kubenswrapper[4823]: I1216 09:08:29.276494 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"e2679fa2-b047-41db-9cd1-8d5b860402f6","Type":"ContainerStarted","Data":"61073e9397ad6ff5fd78dd108e79cb990e08952612aefe1cb324550b15f782da"}
Dec 16 09:08:30 crc kubenswrapper[4823]: I1216 09:08:30.290457 4823 generic.go:334] "Generic (PLEG): container finished" podID="f2d2bd26-e3e1-42e8-b946-8dac79b49e51" containerID="08400715c044e89aa8efc8a75cc7c1f9be395ad5c84aeac9db3852913f85966a" exitCode=0
Dec 16 09:08:30 crc kubenswrapper[4823]: I1216 09:08:30.290554 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f2d2bd26-e3e1-42e8-b946-8dac79b49e51","Type":"ContainerDied","Data":"08400715c044e89aa8efc8a75cc7c1f9be395ad5c84aeac9db3852913f85966a"}
Dec 16 09:08:31 crc kubenswrapper[4823]: I1216 09:08:31.112426 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 16 09:08:31 crc kubenswrapper[4823]: I1216 09:08:31.255645 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f2d2bd26-e3e1-42e8-b946-8dac79b49e51-run-httpd\") pod \"f2d2bd26-e3e1-42e8-b946-8dac79b49e51\" (UID: \"f2d2bd26-e3e1-42e8-b946-8dac79b49e51\") "
Dec 16 09:08:31 crc kubenswrapper[4823]: I1216 09:08:31.256008 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jb7fz\" (UniqueName: \"kubernetes.io/projected/f2d2bd26-e3e1-42e8-b946-8dac79b49e51-kube-api-access-jb7fz\") pod \"f2d2bd26-e3e1-42e8-b946-8dac79b49e51\" (UID: \"f2d2bd26-e3e1-42e8-b946-8dac79b49e51\") "
Dec 16 09:08:31 crc kubenswrapper[4823]: I1216 09:08:31.256280 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f2d2bd26-e3e1-42e8-b946-8dac79b49e51-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "f2d2bd26-e3e1-42e8-b946-8dac79b49e51" (UID: "f2d2bd26-e3e1-42e8-b946-8dac79b49e51"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 16 09:08:31 crc kubenswrapper[4823]: I1216 09:08:31.256800 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f2d2bd26-e3e1-42e8-b946-8dac79b49e51-log-httpd\") pod \"f2d2bd26-e3e1-42e8-b946-8dac79b49e51\" (UID: \"f2d2bd26-e3e1-42e8-b946-8dac79b49e51\") "
Dec 16 09:08:31 crc kubenswrapper[4823]: I1216 09:08:31.256878 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2d2bd26-e3e1-42e8-b946-8dac79b49e51-combined-ca-bundle\") pod \"f2d2bd26-e3e1-42e8-b946-8dac79b49e51\" (UID: \"f2d2bd26-e3e1-42e8-b946-8dac79b49e51\") "
Dec 16 09:08:31 crc kubenswrapper[4823]: I1216 09:08:31.256947 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2d2bd26-e3e1-42e8-b946-8dac79b49e51-config-data\") pod \"f2d2bd26-e3e1-42e8-b946-8dac79b49e51\" (UID: \"f2d2bd26-e3e1-42e8-b946-8dac79b49e51\") "
Dec 16 09:08:31 crc kubenswrapper[4823]: I1216 09:08:31.256990 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f2d2bd26-e3e1-42e8-b946-8dac79b49e51-scripts\") pod \"f2d2bd26-e3e1-42e8-b946-8dac79b49e51\" (UID: \"f2d2bd26-e3e1-42e8-b946-8dac79b49e51\") "
Dec 16 09:08:31 crc kubenswrapper[4823]: I1216 09:08:31.257016 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f2d2bd26-e3e1-42e8-b946-8dac79b49e51-sg-core-conf-yaml\") pod \"f2d2bd26-e3e1-42e8-b946-8dac79b49e51\" (UID: \"f2d2bd26-e3e1-42e8-b946-8dac79b49e51\") "
Dec 16 09:08:31 crc kubenswrapper[4823]: I1216 09:08:31.257499 4823 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f2d2bd26-e3e1-42e8-b946-8dac79b49e51-run-httpd\") on node \"crc\" DevicePath \"\""
Dec 16 09:08:31 crc kubenswrapper[4823]: I1216 09:08:31.257962 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f2d2bd26-e3e1-42e8-b946-8dac79b49e51-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "f2d2bd26-e3e1-42e8-b946-8dac79b49e51" (UID: "f2d2bd26-e3e1-42e8-b946-8dac79b49e51"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 16 09:08:31 crc kubenswrapper[4823]: I1216 09:08:31.263937 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2d2bd26-e3e1-42e8-b946-8dac79b49e51-kube-api-access-jb7fz" (OuterVolumeSpecName: "kube-api-access-jb7fz") pod "f2d2bd26-e3e1-42e8-b946-8dac79b49e51" (UID: "f2d2bd26-e3e1-42e8-b946-8dac79b49e51"). InnerVolumeSpecName "kube-api-access-jb7fz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 09:08:31 crc kubenswrapper[4823]: I1216 09:08:31.268531 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2d2bd26-e3e1-42e8-b946-8dac79b49e51-scripts" (OuterVolumeSpecName: "scripts") pod "f2d2bd26-e3e1-42e8-b946-8dac79b49e51" (UID: "f2d2bd26-e3e1-42e8-b946-8dac79b49e51"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 09:08:31 crc kubenswrapper[4823]: I1216 09:08:31.298281 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2d2bd26-e3e1-42e8-b946-8dac79b49e51-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "f2d2bd26-e3e1-42e8-b946-8dac79b49e51" (UID: "f2d2bd26-e3e1-42e8-b946-8dac79b49e51"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 09:08:31 crc kubenswrapper[4823]: I1216 09:08:31.313020 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"e2679fa2-b047-41db-9cd1-8d5b860402f6","Type":"ContainerStarted","Data":"f4c73394752c0a82c15cd061ab029c7c21483a8ef491eb6768de605ade7f8c28"}
Dec 16 09:08:31 crc kubenswrapper[4823]: I1216 09:08:31.317510 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f2d2bd26-e3e1-42e8-b946-8dac79b49e51","Type":"ContainerDied","Data":"fc5ab1515683761623d3c563f76764e1c437d98554edb65cabcf24ff58e05ea4"}
Dec 16 09:08:31 crc kubenswrapper[4823]: I1216 09:08:31.317565 4823 scope.go:117] "RemoveContainer" containerID="3bf365a0fe620090601c8a14825e58042cc998b33052e1c6b7445f097370230d"
Dec 16 09:08:31 crc kubenswrapper[4823]: I1216 09:08:31.317772 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 16 09:08:31 crc kubenswrapper[4823]: I1216 09:08:31.355588 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2d2bd26-e3e1-42e8-b946-8dac79b49e51-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f2d2bd26-e3e1-42e8-b946-8dac79b49e51" (UID: "f2d2bd26-e3e1-42e8-b946-8dac79b49e51"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 09:08:31 crc kubenswrapper[4823]: I1216 09:08:31.359949 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jb7fz\" (UniqueName: \"kubernetes.io/projected/f2d2bd26-e3e1-42e8-b946-8dac79b49e51-kube-api-access-jb7fz\") on node \"crc\" DevicePath \"\""
Dec 16 09:08:31 crc kubenswrapper[4823]: I1216 09:08:31.359982 4823 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f2d2bd26-e3e1-42e8-b946-8dac79b49e51-log-httpd\") on node \"crc\" DevicePath \"\""
Dec 16 09:08:31 crc kubenswrapper[4823]: I1216 09:08:31.359992 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2d2bd26-e3e1-42e8-b946-8dac79b49e51-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 16 09:08:31 crc kubenswrapper[4823]: I1216 09:08:31.360000 4823 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f2d2bd26-e3e1-42e8-b946-8dac79b49e51-scripts\") on node \"crc\" DevicePath \"\""
Dec 16 09:08:31 crc kubenswrapper[4823]: I1216 09:08:31.360008 4823 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f2d2bd26-e3e1-42e8-b946-8dac79b49e51-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Dec 16 09:08:31 crc kubenswrapper[4823]: I1216 09:08:31.381961 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2d2bd26-e3e1-42e8-b946-8dac79b49e51-config-data" (OuterVolumeSpecName: "config-data") pod "f2d2bd26-e3e1-42e8-b946-8dac79b49e51" (UID: "f2d2bd26-e3e1-42e8-b946-8dac79b49e51"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 09:08:31 crc kubenswrapper[4823]: I1216 09:08:31.461626 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2d2bd26-e3e1-42e8-b946-8dac79b49e51-config-data\") on node \"crc\" DevicePath \"\""
Dec 16 09:08:31 crc kubenswrapper[4823]: I1216 09:08:31.469660 4823 scope.go:117] "RemoveContainer" containerID="957f1733e3745c4bf160f023bc9a01f5bc95dbaca647e4576466831de86fe54b"
Dec 16 09:08:31 crc kubenswrapper[4823]: I1216 09:08:31.508946 4823 scope.go:117] "RemoveContainer" containerID="08400715c044e89aa8efc8a75cc7c1f9be395ad5c84aeac9db3852913f85966a"
Dec 16 09:08:31 crc kubenswrapper[4823]: I1216 09:08:31.534521 4823 scope.go:117] "RemoveContainer" containerID="e4d3357dadc5bf821fe74af974375318994589e62a109936cb435cd8dc3db23d"
Dec 16 09:08:31 crc kubenswrapper[4823]: I1216 09:08:31.668579 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Dec 16 09:08:31 crc kubenswrapper[4823]: I1216 09:08:31.681050 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Dec 16 09:08:31 crc kubenswrapper[4823]: I1216 09:08:31.693264 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Dec 16 09:08:31 crc kubenswrapper[4823]: E1216 09:08:31.694018 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2d2bd26-e3e1-42e8-b946-8dac79b49e51" containerName="ceilometer-central-agent"
Dec 16 09:08:31 crc kubenswrapper[4823]: I1216 09:08:31.694079 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2d2bd26-e3e1-42e8-b946-8dac79b49e51" containerName="ceilometer-central-agent"
Dec 16 09:08:31 crc kubenswrapper[4823]: E1216 09:08:31.694103 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2d2bd26-e3e1-42e8-b946-8dac79b49e51" containerName="ceilometer-notification-agent"
Dec 16 09:08:31 crc kubenswrapper[4823]: I1216 09:08:31.694115 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2d2bd26-e3e1-42e8-b946-8dac79b49e51" containerName="ceilometer-notification-agent"
Dec 16 09:08:31 crc kubenswrapper[4823]: E1216 09:08:31.694148 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2d2bd26-e3e1-42e8-b946-8dac79b49e51" containerName="proxy-httpd"
Dec 16 09:08:31 crc kubenswrapper[4823]: I1216 09:08:31.694156 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2d2bd26-e3e1-42e8-b946-8dac79b49e51" containerName="proxy-httpd"
Dec 16 09:08:31 crc kubenswrapper[4823]: E1216 09:08:31.694196 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2d2bd26-e3e1-42e8-b946-8dac79b49e51" containerName="sg-core"
Dec 16 09:08:31 crc kubenswrapper[4823]: I1216 09:08:31.694211 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2d2bd26-e3e1-42e8-b946-8dac79b49e51" containerName="sg-core"
Dec 16 09:08:31 crc kubenswrapper[4823]: I1216 09:08:31.694476 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2d2bd26-e3e1-42e8-b946-8dac79b49e51" containerName="ceilometer-central-agent"
Dec 16 09:08:31 crc kubenswrapper[4823]: I1216 09:08:31.694504 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2d2bd26-e3e1-42e8-b946-8dac79b49e51" containerName="ceilometer-notification-agent"
Dec 16 09:08:31 crc kubenswrapper[4823]: I1216 09:08:31.694522 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2d2bd26-e3e1-42e8-b946-8dac79b49e51" containerName="proxy-httpd"
Dec 16 09:08:31 crc kubenswrapper[4823]: I1216 09:08:31.694552 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2d2bd26-e3e1-42e8-b946-8dac79b49e51" containerName="sg-core"
Dec 16 09:08:31 crc kubenswrapper[4823]: I1216 09:08:31.697509 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 16 09:08:31 crc kubenswrapper[4823]: I1216 09:08:31.699684 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Dec 16 09:08:31 crc kubenswrapper[4823]: I1216 09:08:31.702596 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Dec 16 09:08:31 crc kubenswrapper[4823]: I1216 09:08:31.716182 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Dec 16 09:08:31 crc kubenswrapper[4823]: I1216 09:08:31.822593 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2d2bd26-e3e1-42e8-b946-8dac79b49e51" path="/var/lib/kubelet/pods/f2d2bd26-e3e1-42e8-b946-8dac79b49e51/volumes"
Dec 16 09:08:31 crc kubenswrapper[4823]: I1216 09:08:31.869534 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df7298f6-3f8e-4b06-b97b-d017aa0f3208-config-data\") pod \"ceilometer-0\" (UID: \"df7298f6-3f8e-4b06-b97b-d017aa0f3208\") " pod="openstack/ceilometer-0"
Dec 16 09:08:31 crc kubenswrapper[4823]: I1216 09:08:31.869584 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df7298f6-3f8e-4b06-b97b-d017aa0f3208-scripts\") pod \"ceilometer-0\" (UID: \"df7298f6-3f8e-4b06-b97b-d017aa0f3208\") " pod="openstack/ceilometer-0"
Dec 16 09:08:31 crc kubenswrapper[4823]: I1216 09:08:31.869720 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df7298f6-3f8e-4b06-b97b-d017aa0f3208-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"df7298f6-3f8e-4b06-b97b-d017aa0f3208\") " pod="openstack/ceilometer-0"
Dec 16 09:08:31 crc kubenswrapper[4823]: I1216 09:08:31.869744 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/df7298f6-3f8e-4b06-b97b-d017aa0f3208-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"df7298f6-3f8e-4b06-b97b-d017aa0f3208\") " pod="openstack/ceilometer-0"
Dec 16 09:08:31 crc kubenswrapper[4823]: I1216 09:08:31.869758 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/df7298f6-3f8e-4b06-b97b-d017aa0f3208-run-httpd\") pod \"ceilometer-0\" (UID: \"df7298f6-3f8e-4b06-b97b-d017aa0f3208\") " pod="openstack/ceilometer-0"
Dec 16 09:08:31 crc kubenswrapper[4823]: I1216 09:08:31.869942 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/df7298f6-3f8e-4b06-b97b-d017aa0f3208-log-httpd\") pod \"ceilometer-0\" (UID: \"df7298f6-3f8e-4b06-b97b-d017aa0f3208\") " pod="openstack/ceilometer-0"
Dec 16 09:08:31 crc kubenswrapper[4823]: I1216 09:08:31.870010 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2mfc\" (UniqueName: \"kubernetes.io/projected/df7298f6-3f8e-4b06-b97b-d017aa0f3208-kube-api-access-q2mfc\") pod \"ceilometer-0\" (UID: \"df7298f6-3f8e-4b06-b97b-d017aa0f3208\") " pod="openstack/ceilometer-0"
Dec 16 09:08:31 crc kubenswrapper[4823]: I1216 09:08:31.969848 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Dec 16 09:08:31 crc kubenswrapper[4823]: I1216 09:08:31.971688 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df7298f6-3f8e-4b06-b97b-d017aa0f3208-config-data\") pod \"ceilometer-0\" (UID: \"df7298f6-3f8e-4b06-b97b-d017aa0f3208\") " pod="openstack/ceilometer-0"
Dec 16 09:08:31 crc kubenswrapper[4823]: I1216 09:08:31.971853 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df7298f6-3f8e-4b06-b97b-d017aa0f3208-scripts\") pod \"ceilometer-0\" (UID: \"df7298f6-3f8e-4b06-b97b-d017aa0f3208\") " pod="openstack/ceilometer-0"
Dec 16 09:08:31 crc kubenswrapper[4823]: I1216 09:08:31.972079 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df7298f6-3f8e-4b06-b97b-d017aa0f3208-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"df7298f6-3f8e-4b06-b97b-d017aa0f3208\") " pod="openstack/ceilometer-0"
Dec 16 09:08:31 crc kubenswrapper[4823]: I1216 09:08:31.972180 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/df7298f6-3f8e-4b06-b97b-d017aa0f3208-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"df7298f6-3f8e-4b06-b97b-d017aa0f3208\") " pod="openstack/ceilometer-0"
Dec 16 09:08:31 crc kubenswrapper[4823]: I1216 09:08:31.972258 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/df7298f6-3f8e-4b06-b97b-d017aa0f3208-run-httpd\") pod \"ceilometer-0\" (UID: \"df7298f6-3f8e-4b06-b97b-d017aa0f3208\") " pod="openstack/ceilometer-0"
Dec 16 09:08:31 crc kubenswrapper[4823]: I1216 09:08:31.972396 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/df7298f6-3f8e-4b06-b97b-d017aa0f3208-log-httpd\") pod \"ceilometer-0\" (UID: \"df7298f6-3f8e-4b06-b97b-d017aa0f3208\") " pod="openstack/ceilometer-0"
Dec 16 09:08:31 crc kubenswrapper[4823]: I1216 09:08:31.972518 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2mfc\" (UniqueName: \"kubernetes.io/projected/df7298f6-3f8e-4b06-b97b-d017aa0f3208-kube-api-access-q2mfc\") pod \"ceilometer-0\" (UID: \"df7298f6-3f8e-4b06-b97b-d017aa0f3208\") " pod="openstack/ceilometer-0"
Dec 16 09:08:31 crc kubenswrapper[4823]: I1216 09:08:31.974842 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/df7298f6-3f8e-4b06-b97b-d017aa0f3208-run-httpd\") pod \"ceilometer-0\" (UID: \"df7298f6-3f8e-4b06-b97b-d017aa0f3208\") " pod="openstack/ceilometer-0"
Dec 16 09:08:31 crc kubenswrapper[4823]: I1216 09:08:31.975244 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/df7298f6-3f8e-4b06-b97b-d017aa0f3208-log-httpd\") pod \"ceilometer-0\" (UID: \"df7298f6-3f8e-4b06-b97b-d017aa0f3208\") " pod="openstack/ceilometer-0"
Dec 16 09:08:31 crc kubenswrapper[4823]: E1216 09:08:31.976349 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle config-data kube-api-access-q2mfc log-httpd run-httpd scripts sg-core-conf-yaml], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/ceilometer-0" podUID="df7298f6-3f8e-4b06-b97b-d017aa0f3208"
Dec 16 09:08:31 crc kubenswrapper[4823]: I1216 09:08:31.991124 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df7298f6-3f8e-4b06-b97b-d017aa0f3208-scripts\") pod \"ceilometer-0\" (UID: \"df7298f6-3f8e-4b06-b97b-d017aa0f3208\") " pod="openstack/ceilometer-0"
Dec 16 09:08:31 crc kubenswrapper[4823]: I1216 09:08:31.991283 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df7298f6-3f8e-4b06-b97b-d017aa0f3208-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"df7298f6-3f8e-4b06-b97b-d017aa0f3208\") " pod="openstack/ceilometer-0"
Dec 16 09:08:31 crc kubenswrapper[4823]: I1216 09:08:31.992652 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df7298f6-3f8e-4b06-b97b-d017aa0f3208-config-data\") pod \"ceilometer-0\" (UID: \"df7298f6-3f8e-4b06-b97b-d017aa0f3208\") " pod="openstack/ceilometer-0"
Dec 16 09:08:32 crc kubenswrapper[4823]: I1216 09:08:31.997522 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/df7298f6-3f8e-4b06-b97b-d017aa0f3208-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"df7298f6-3f8e-4b06-b97b-d017aa0f3208\") " pod="openstack/ceilometer-0"
Dec 16 09:08:32 crc kubenswrapper[4823]: I1216 09:08:32.005136 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2mfc\" (UniqueName: \"kubernetes.io/projected/df7298f6-3f8e-4b06-b97b-d017aa0f3208-kube-api-access-q2mfc\") pod \"ceilometer-0\" (UID: \"df7298f6-3f8e-4b06-b97b-d017aa0f3208\") " pod="openstack/ceilometer-0"
Dec 16 09:08:32 crc kubenswrapper[4823]: I1216 09:08:32.020502 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"]
Dec 16 09:08:32 crc kubenswrapper[4823]: I1216 09:08:32.020822 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="931c8fa8-3d33-42d2-a505-9320bd5d3695" containerName="kube-state-metrics" containerID="cri-o://854537105ff5e271e8f51a7f717ce4928b3958183fae6adeab4fb337061e9d37" gracePeriod=30
Dec 16 09:08:32 crc kubenswrapper[4823]: I1216 09:08:32.329887 4823 generic.go:334] "Generic (PLEG): container finished" podID="931c8fa8-3d33-42d2-a505-9320bd5d3695" containerID="854537105ff5e271e8f51a7f717ce4928b3958183fae6adeab4fb337061e9d37" exitCode=2
Dec 16 09:08:32 crc kubenswrapper[4823]: I1216 09:08:32.329989 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 16 09:08:32 crc kubenswrapper[4823]: I1216 09:08:32.329995 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"931c8fa8-3d33-42d2-a505-9320bd5d3695","Type":"ContainerDied","Data":"854537105ff5e271e8f51a7f717ce4928b3958183fae6adeab4fb337061e9d37"}
Dec 16 09:08:32 crc kubenswrapper[4823]: I1216 09:08:32.373454 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 16 09:08:32 crc kubenswrapper[4823]: I1216 09:08:32.482429 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df7298f6-3f8e-4b06-b97b-d017aa0f3208-config-data\") pod \"df7298f6-3f8e-4b06-b97b-d017aa0f3208\" (UID: \"df7298f6-3f8e-4b06-b97b-d017aa0f3208\") "
Dec 16 09:08:32 crc kubenswrapper[4823]: I1216 09:08:32.482580 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/df7298f6-3f8e-4b06-b97b-d017aa0f3208-run-httpd\") pod \"df7298f6-3f8e-4b06-b97b-d017aa0f3208\" (UID: \"df7298f6-3f8e-4b06-b97b-d017aa0f3208\") "
Dec 16 09:08:32 crc kubenswrapper[4823]: I1216 09:08:32.482617 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df7298f6-3f8e-4b06-b97b-d017aa0f3208-combined-ca-bundle\") pod \"df7298f6-3f8e-4b06-b97b-d017aa0f3208\" (UID: \"df7298f6-3f8e-4b06-b97b-d017aa0f3208\") "
Dec 16 09:08:32 crc kubenswrapper[4823]: I1216 09:08:32.482680 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/df7298f6-3f8e-4b06-b97b-d017aa0f3208-log-httpd\") pod \"df7298f6-3f8e-4b06-b97b-d017aa0f3208\" (UID: \"df7298f6-3f8e-4b06-b97b-d017aa0f3208\") "
Dec 16 09:08:32 crc kubenswrapper[4823]: I1216 09:08:32.482737 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df7298f6-3f8e-4b06-b97b-d017aa0f3208-scripts\") pod \"df7298f6-3f8e-4b06-b97b-d017aa0f3208\" (UID: \"df7298f6-3f8e-4b06-b97b-d017aa0f3208\") "
Dec 16 09:08:32 crc kubenswrapper[4823]: I1216 09:08:32.482883 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/df7298f6-3f8e-4b06-b97b-d017aa0f3208-sg-core-conf-yaml\") pod \"df7298f6-3f8e-4b06-b97b-d017aa0f3208\" (UID: \"df7298f6-3f8e-4b06-b97b-d017aa0f3208\") "
Dec 16 09:08:32 crc kubenswrapper[4823]: I1216 09:08:32.482964 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df7298f6-3f8e-4b06-b97b-d017aa0f3208-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "df7298f6-3f8e-4b06-b97b-d017aa0f3208" (UID: "df7298f6-3f8e-4b06-b97b-d017aa0f3208"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 16 09:08:32 crc kubenswrapper[4823]: I1216 09:08:32.483016 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q2mfc\" (UniqueName: \"kubernetes.io/projected/df7298f6-3f8e-4b06-b97b-d017aa0f3208-kube-api-access-q2mfc\") pod \"df7298f6-3f8e-4b06-b97b-d017aa0f3208\" (UID: \"df7298f6-3f8e-4b06-b97b-d017aa0f3208\") "
Dec 16 09:08:32 crc kubenswrapper[4823]: I1216 09:08:32.483150 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df7298f6-3f8e-4b06-b97b-d017aa0f3208-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "df7298f6-3f8e-4b06-b97b-d017aa0f3208" (UID: "df7298f6-3f8e-4b06-b97b-d017aa0f3208"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 16 09:08:32 crc kubenswrapper[4823]: I1216 09:08:32.493242 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df7298f6-3f8e-4b06-b97b-d017aa0f3208-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "df7298f6-3f8e-4b06-b97b-d017aa0f3208" (UID: "df7298f6-3f8e-4b06-b97b-d017aa0f3208"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 09:08:32 crc kubenswrapper[4823]: I1216 09:08:32.498302 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df7298f6-3f8e-4b06-b97b-d017aa0f3208-scripts" (OuterVolumeSpecName: "scripts") pod "df7298f6-3f8e-4b06-b97b-d017aa0f3208" (UID: "df7298f6-3f8e-4b06-b97b-d017aa0f3208"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 09:08:32 crc kubenswrapper[4823]: I1216 09:08:32.498505 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df7298f6-3f8e-4b06-b97b-d017aa0f3208-kube-api-access-q2mfc" (OuterVolumeSpecName: "kube-api-access-q2mfc") pod "df7298f6-3f8e-4b06-b97b-d017aa0f3208" (UID: "df7298f6-3f8e-4b06-b97b-d017aa0f3208"). InnerVolumeSpecName "kube-api-access-q2mfc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 09:08:32 crc kubenswrapper[4823]: I1216 09:08:32.500869 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df7298f6-3f8e-4b06-b97b-d017aa0f3208-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "df7298f6-3f8e-4b06-b97b-d017aa0f3208" (UID: "df7298f6-3f8e-4b06-b97b-d017aa0f3208"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 09:08:32 crc kubenswrapper[4823]: I1216 09:08:32.501492 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q2mfc\" (UniqueName: \"kubernetes.io/projected/df7298f6-3f8e-4b06-b97b-d017aa0f3208-kube-api-access-q2mfc\") on node \"crc\" DevicePath \"\""
Dec 16 09:08:32 crc kubenswrapper[4823]: I1216 09:08:32.501530 4823 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/df7298f6-3f8e-4b06-b97b-d017aa0f3208-run-httpd\") on node \"crc\" DevicePath \"\""
Dec 16 09:08:32 crc kubenswrapper[4823]: I1216 09:08:32.501543 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df7298f6-3f8e-4b06-b97b-d017aa0f3208-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 16 09:08:32 crc kubenswrapper[4823]: I1216 09:08:32.501556 4823 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/df7298f6-3f8e-4b06-b97b-d017aa0f3208-log-httpd\") on node \"crc\" DevicePath \"\""
Dec 16 09:08:32 crc kubenswrapper[4823]: I1216 09:08:32.501568 4823 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df7298f6-3f8e-4b06-b97b-d017aa0f3208-scripts\") on node \"crc\" DevicePath \"\""
Dec 16 09:08:32 crc kubenswrapper[4823]: I1216 09:08:32.501579 4823 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/df7298f6-3f8e-4b06-b97b-d017aa0f3208-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Dec 16 09:08:32 crc kubenswrapper[4823]: I1216 09:08:32.502357 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df7298f6-3f8e-4b06-b97b-d017aa0f3208-config-data" (OuterVolumeSpecName: "config-data") pod "df7298f6-3f8e-4b06-b97b-d017aa0f3208" (UID: "df7298f6-3f8e-4b06-b97b-d017aa0f3208").
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:08:32 crc kubenswrapper[4823]: I1216 09:08:32.603379 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df7298f6-3f8e-4b06-b97b-d017aa0f3208-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 09:08:33 crc kubenswrapper[4823]: I1216 09:08:33.338132 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 16 09:08:33 crc kubenswrapper[4823]: I1216 09:08:33.389771 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 16 09:08:33 crc kubenswrapper[4823]: I1216 09:08:33.405307 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 16 09:08:33 crc kubenswrapper[4823]: I1216 09:08:33.422697 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 16 09:08:33 crc kubenswrapper[4823]: I1216 09:08:33.428517 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 16 09:08:33 crc kubenswrapper[4823]: I1216 09:08:33.430708 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 16 09:08:33 crc kubenswrapper[4823]: I1216 09:08:33.431260 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 16 09:08:33 crc kubenswrapper[4823]: I1216 09:08:33.436020 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 16 09:08:33 crc kubenswrapper[4823]: I1216 09:08:33.623109 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/370e1f30-0d9b-4159-9b46-7ea71ab8dab8-log-httpd\") pod \"ceilometer-0\" (UID: \"370e1f30-0d9b-4159-9b46-7ea71ab8dab8\") " pod="openstack/ceilometer-0" Dec 16 09:08:33 crc kubenswrapper[4823]: I1216 09:08:33.623165 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/370e1f30-0d9b-4159-9b46-7ea71ab8dab8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"370e1f30-0d9b-4159-9b46-7ea71ab8dab8\") " pod="openstack/ceilometer-0" Dec 16 09:08:33 crc kubenswrapper[4823]: I1216 09:08:33.623241 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/370e1f30-0d9b-4159-9b46-7ea71ab8dab8-config-data\") pod \"ceilometer-0\" (UID: \"370e1f30-0d9b-4159-9b46-7ea71ab8dab8\") " pod="openstack/ceilometer-0" Dec 16 09:08:33 crc kubenswrapper[4823]: I1216 09:08:33.623297 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/370e1f30-0d9b-4159-9b46-7ea71ab8dab8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"370e1f30-0d9b-4159-9b46-7ea71ab8dab8\") " 
pod="openstack/ceilometer-0" Dec 16 09:08:33 crc kubenswrapper[4823]: I1216 09:08:33.623367 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/370e1f30-0d9b-4159-9b46-7ea71ab8dab8-scripts\") pod \"ceilometer-0\" (UID: \"370e1f30-0d9b-4159-9b46-7ea71ab8dab8\") " pod="openstack/ceilometer-0" Dec 16 09:08:33 crc kubenswrapper[4823]: I1216 09:08:33.623502 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g99bh\" (UniqueName: \"kubernetes.io/projected/370e1f30-0d9b-4159-9b46-7ea71ab8dab8-kube-api-access-g99bh\") pod \"ceilometer-0\" (UID: \"370e1f30-0d9b-4159-9b46-7ea71ab8dab8\") " pod="openstack/ceilometer-0" Dec 16 09:08:33 crc kubenswrapper[4823]: I1216 09:08:33.623536 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/370e1f30-0d9b-4159-9b46-7ea71ab8dab8-run-httpd\") pod \"ceilometer-0\" (UID: \"370e1f30-0d9b-4159-9b46-7ea71ab8dab8\") " pod="openstack/ceilometer-0" Dec 16 09:08:33 crc kubenswrapper[4823]: I1216 09:08:33.726175 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g99bh\" (UniqueName: \"kubernetes.io/projected/370e1f30-0d9b-4159-9b46-7ea71ab8dab8-kube-api-access-g99bh\") pod \"ceilometer-0\" (UID: \"370e1f30-0d9b-4159-9b46-7ea71ab8dab8\") " pod="openstack/ceilometer-0" Dec 16 09:08:33 crc kubenswrapper[4823]: I1216 09:08:33.726476 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/370e1f30-0d9b-4159-9b46-7ea71ab8dab8-run-httpd\") pod \"ceilometer-0\" (UID: \"370e1f30-0d9b-4159-9b46-7ea71ab8dab8\") " pod="openstack/ceilometer-0" Dec 16 09:08:33 crc kubenswrapper[4823]: I1216 09:08:33.726641 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/370e1f30-0d9b-4159-9b46-7ea71ab8dab8-log-httpd\") pod \"ceilometer-0\" (UID: \"370e1f30-0d9b-4159-9b46-7ea71ab8dab8\") " pod="openstack/ceilometer-0" Dec 16 09:08:33 crc kubenswrapper[4823]: I1216 09:08:33.726731 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/370e1f30-0d9b-4159-9b46-7ea71ab8dab8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"370e1f30-0d9b-4159-9b46-7ea71ab8dab8\") " pod="openstack/ceilometer-0" Dec 16 09:08:33 crc kubenswrapper[4823]: I1216 09:08:33.726898 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/370e1f30-0d9b-4159-9b46-7ea71ab8dab8-config-data\") pod \"ceilometer-0\" (UID: \"370e1f30-0d9b-4159-9b46-7ea71ab8dab8\") " pod="openstack/ceilometer-0" Dec 16 09:08:33 crc kubenswrapper[4823]: I1216 09:08:33.727094 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/370e1f30-0d9b-4159-9b46-7ea71ab8dab8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"370e1f30-0d9b-4159-9b46-7ea71ab8dab8\") " pod="openstack/ceilometer-0" Dec 16 09:08:33 crc kubenswrapper[4823]: I1216 09:08:33.727278 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/370e1f30-0d9b-4159-9b46-7ea71ab8dab8-scripts\") pod \"ceilometer-0\" (UID: \"370e1f30-0d9b-4159-9b46-7ea71ab8dab8\") " pod="openstack/ceilometer-0" Dec 16 09:08:33 crc kubenswrapper[4823]: I1216 09:08:33.727660 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/370e1f30-0d9b-4159-9b46-7ea71ab8dab8-run-httpd\") pod \"ceilometer-0\" (UID: \"370e1f30-0d9b-4159-9b46-7ea71ab8dab8\") " pod="openstack/ceilometer-0" Dec 16 09:08:33 crc kubenswrapper[4823]: 
I1216 09:08:33.727699 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/370e1f30-0d9b-4159-9b46-7ea71ab8dab8-log-httpd\") pod \"ceilometer-0\" (UID: \"370e1f30-0d9b-4159-9b46-7ea71ab8dab8\") " pod="openstack/ceilometer-0" Dec 16 09:08:33 crc kubenswrapper[4823]: I1216 09:08:33.736661 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/370e1f30-0d9b-4159-9b46-7ea71ab8dab8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"370e1f30-0d9b-4159-9b46-7ea71ab8dab8\") " pod="openstack/ceilometer-0" Dec 16 09:08:33 crc kubenswrapper[4823]: I1216 09:08:33.737661 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/370e1f30-0d9b-4159-9b46-7ea71ab8dab8-config-data\") pod \"ceilometer-0\" (UID: \"370e1f30-0d9b-4159-9b46-7ea71ab8dab8\") " pod="openstack/ceilometer-0" Dec 16 09:08:33 crc kubenswrapper[4823]: I1216 09:08:33.738827 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/370e1f30-0d9b-4159-9b46-7ea71ab8dab8-scripts\") pod \"ceilometer-0\" (UID: \"370e1f30-0d9b-4159-9b46-7ea71ab8dab8\") " pod="openstack/ceilometer-0" Dec 16 09:08:33 crc kubenswrapper[4823]: I1216 09:08:33.743868 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/370e1f30-0d9b-4159-9b46-7ea71ab8dab8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"370e1f30-0d9b-4159-9b46-7ea71ab8dab8\") " pod="openstack/ceilometer-0" Dec 16 09:08:33 crc kubenswrapper[4823]: I1216 09:08:33.751003 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g99bh\" (UniqueName: \"kubernetes.io/projected/370e1f30-0d9b-4159-9b46-7ea71ab8dab8-kube-api-access-g99bh\") pod \"ceilometer-0\" (UID: 
\"370e1f30-0d9b-4159-9b46-7ea71ab8dab8\") " pod="openstack/ceilometer-0" Dec 16 09:08:33 crc kubenswrapper[4823]: I1216 09:08:33.789617 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df7298f6-3f8e-4b06-b97b-d017aa0f3208" path="/var/lib/kubelet/pods/df7298f6-3f8e-4b06-b97b-d017aa0f3208/volumes" Dec 16 09:08:34 crc kubenswrapper[4823]: I1216 09:08:34.045337 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 16 09:08:34 crc kubenswrapper[4823]: I1216 09:08:34.091045 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 16 09:08:34 crc kubenswrapper[4823]: I1216 09:08:34.223515 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 16 09:08:34 crc kubenswrapper[4823]: I1216 09:08:34.238560 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8plm2\" (UniqueName: \"kubernetes.io/projected/931c8fa8-3d33-42d2-a505-9320bd5d3695-kube-api-access-8plm2\") pod \"931c8fa8-3d33-42d2-a505-9320bd5d3695\" (UID: \"931c8fa8-3d33-42d2-a505-9320bd5d3695\") " Dec 16 09:08:34 crc kubenswrapper[4823]: I1216 09:08:34.245156 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/931c8fa8-3d33-42d2-a505-9320bd5d3695-kube-api-access-8plm2" (OuterVolumeSpecName: "kube-api-access-8plm2") pod "931c8fa8-3d33-42d2-a505-9320bd5d3695" (UID: "931c8fa8-3d33-42d2-a505-9320bd5d3695"). InnerVolumeSpecName "kube-api-access-8plm2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:08:34 crc kubenswrapper[4823]: I1216 09:08:34.342007 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8plm2\" (UniqueName: \"kubernetes.io/projected/931c8fa8-3d33-42d2-a505-9320bd5d3695-kube-api-access-8plm2\") on node \"crc\" DevicePath \"\"" Dec 16 09:08:34 crc kubenswrapper[4823]: I1216 09:08:34.368905 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"931c8fa8-3d33-42d2-a505-9320bd5d3695","Type":"ContainerDied","Data":"1989210c092526f4d739b81ccc4b78a0d5216e98e7da4f328f13316f72f8b80a"} Dec 16 09:08:34 crc kubenswrapper[4823]: I1216 09:08:34.368966 4823 scope.go:117] "RemoveContainer" containerID="854537105ff5e271e8f51a7f717ce4928b3958183fae6adeab4fb337061e9d37" Dec 16 09:08:34 crc kubenswrapper[4823]: I1216 09:08:34.369141 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 16 09:08:34 crc kubenswrapper[4823]: I1216 09:08:34.418868 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 16 09:08:34 crc kubenswrapper[4823]: I1216 09:08:34.435679 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 16 09:08:34 crc kubenswrapper[4823]: I1216 09:08:34.445626 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 16 09:08:34 crc kubenswrapper[4823]: E1216 09:08:34.446116 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="931c8fa8-3d33-42d2-a505-9320bd5d3695" containerName="kube-state-metrics" Dec 16 09:08:34 crc kubenswrapper[4823]: I1216 09:08:34.446140 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="931c8fa8-3d33-42d2-a505-9320bd5d3695" containerName="kube-state-metrics" Dec 16 09:08:34 crc kubenswrapper[4823]: I1216 09:08:34.446417 4823 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="931c8fa8-3d33-42d2-a505-9320bd5d3695" containerName="kube-state-metrics" Dec 16 09:08:34 crc kubenswrapper[4823]: I1216 09:08:34.447275 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 16 09:08:34 crc kubenswrapper[4823]: I1216 09:08:34.449278 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Dec 16 09:08:34 crc kubenswrapper[4823]: I1216 09:08:34.450464 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Dec 16 09:08:34 crc kubenswrapper[4823]: I1216 09:08:34.471207 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 16 09:08:34 crc kubenswrapper[4823]: W1216 09:08:34.522562 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod370e1f30_0d9b_4159_9b46_7ea71ab8dab8.slice/crio-7f8ef0124e650d93486b0d6767e203d5e835ee1ee216b982ea0544251c5e769b WatchSource:0}: Error finding container 7f8ef0124e650d93486b0d6767e203d5e835ee1ee216b982ea0544251c5e769b: Status 404 returned error can't find the container with id 7f8ef0124e650d93486b0d6767e203d5e835ee1ee216b982ea0544251c5e769b Dec 16 09:08:34 crc kubenswrapper[4823]: I1216 09:08:34.524832 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 16 09:08:34 crc kubenswrapper[4823]: I1216 09:08:34.648211 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ee97b1f-ce61-45ef-97e1-642cc13ef521-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"3ee97b1f-ce61-45ef-97e1-642cc13ef521\") " pod="openstack/kube-state-metrics-0" Dec 16 09:08:34 crc kubenswrapper[4823]: I1216 09:08:34.648375 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ee97b1f-ce61-45ef-97e1-642cc13ef521-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"3ee97b1f-ce61-45ef-97e1-642cc13ef521\") " pod="openstack/kube-state-metrics-0" Dec 16 09:08:34 crc kubenswrapper[4823]: I1216 09:08:34.648421 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxvv6\" (UniqueName: \"kubernetes.io/projected/3ee97b1f-ce61-45ef-97e1-642cc13ef521-kube-api-access-dxvv6\") pod \"kube-state-metrics-0\" (UID: \"3ee97b1f-ce61-45ef-97e1-642cc13ef521\") " pod="openstack/kube-state-metrics-0" Dec 16 09:08:34 crc kubenswrapper[4823]: I1216 09:08:34.648550 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/3ee97b1f-ce61-45ef-97e1-642cc13ef521-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"3ee97b1f-ce61-45ef-97e1-642cc13ef521\") " pod="openstack/kube-state-metrics-0" Dec 16 09:08:34 crc kubenswrapper[4823]: I1216 09:08:34.750755 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ee97b1f-ce61-45ef-97e1-642cc13ef521-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"3ee97b1f-ce61-45ef-97e1-642cc13ef521\") " pod="openstack/kube-state-metrics-0" Dec 16 09:08:34 crc kubenswrapper[4823]: I1216 09:08:34.750908 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ee97b1f-ce61-45ef-97e1-642cc13ef521-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"3ee97b1f-ce61-45ef-97e1-642cc13ef521\") " pod="openstack/kube-state-metrics-0" Dec 16 09:08:34 crc kubenswrapper[4823]: I1216 09:08:34.750938 4823 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-dxvv6\" (UniqueName: \"kubernetes.io/projected/3ee97b1f-ce61-45ef-97e1-642cc13ef521-kube-api-access-dxvv6\") pod \"kube-state-metrics-0\" (UID: \"3ee97b1f-ce61-45ef-97e1-642cc13ef521\") " pod="openstack/kube-state-metrics-0" Dec 16 09:08:34 crc kubenswrapper[4823]: I1216 09:08:34.750984 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/3ee97b1f-ce61-45ef-97e1-642cc13ef521-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"3ee97b1f-ce61-45ef-97e1-642cc13ef521\") " pod="openstack/kube-state-metrics-0" Dec 16 09:08:34 crc kubenswrapper[4823]: I1216 09:08:34.756348 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ee97b1f-ce61-45ef-97e1-642cc13ef521-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"3ee97b1f-ce61-45ef-97e1-642cc13ef521\") " pod="openstack/kube-state-metrics-0" Dec 16 09:08:34 crc kubenswrapper[4823]: I1216 09:08:34.757318 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ee97b1f-ce61-45ef-97e1-642cc13ef521-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"3ee97b1f-ce61-45ef-97e1-642cc13ef521\") " pod="openstack/kube-state-metrics-0" Dec 16 09:08:34 crc kubenswrapper[4823]: I1216 09:08:34.757737 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/3ee97b1f-ce61-45ef-97e1-642cc13ef521-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"3ee97b1f-ce61-45ef-97e1-642cc13ef521\") " pod="openstack/kube-state-metrics-0" Dec 16 09:08:34 crc kubenswrapper[4823]: I1216 09:08:34.772247 4823 scope.go:117] "RemoveContainer" containerID="14e51af7fb5c2d7b7fdc9e1989841225a65614d883db6f8d5aea8aeb819bd04d" Dec 16 
09:08:34 crc kubenswrapper[4823]: E1216 09:08:34.772665 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 09:08:34 crc kubenswrapper[4823]: I1216 09:08:34.781880 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxvv6\" (UniqueName: \"kubernetes.io/projected/3ee97b1f-ce61-45ef-97e1-642cc13ef521-kube-api-access-dxvv6\") pod \"kube-state-metrics-0\" (UID: \"3ee97b1f-ce61-45ef-97e1-642cc13ef521\") " pod="openstack/kube-state-metrics-0" Dec 16 09:08:34 crc kubenswrapper[4823]: I1216 09:08:34.874898 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 16 09:08:35 crc kubenswrapper[4823]: I1216 09:08:35.385016 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"370e1f30-0d9b-4159-9b46-7ea71ab8dab8","Type":"ContainerStarted","Data":"7f8ef0124e650d93486b0d6767e203d5e835ee1ee216b982ea0544251c5e769b"} Dec 16 09:08:35 crc kubenswrapper[4823]: I1216 09:08:35.802254 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="931c8fa8-3d33-42d2-a505-9320bd5d3695" path="/var/lib/kubelet/pods/931c8fa8-3d33-42d2-a505-9320bd5d3695/volumes" Dec 16 09:08:36 crc kubenswrapper[4823]: I1216 09:08:36.397743 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="e2679fa2-b047-41db-9cd1-8d5b860402f6" containerName="aodh-api" containerID="cri-o://c8e75d4b99a8c660cc95ace3135bd4c4c77013222edc8ed7574091892902bc00" gracePeriod=30 Dec 16 09:08:36 crc kubenswrapper[4823]: I1216 09:08:36.397816 4823 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="e2679fa2-b047-41db-9cd1-8d5b860402f6" containerName="aodh-notifier" containerID="cri-o://f4c73394752c0a82c15cd061ab029c7c21483a8ef491eb6768de605ade7f8c28" gracePeriod=30 Dec 16 09:08:36 crc kubenswrapper[4823]: I1216 09:08:36.397856 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="e2679fa2-b047-41db-9cd1-8d5b860402f6" containerName="aodh-listener" containerID="cri-o://80382f935a3e2fa6c3f1d2bbeb77879dd358990ba59d2489678a09c91900940c" gracePeriod=30 Dec 16 09:08:36 crc kubenswrapper[4823]: I1216 09:08:36.397623 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"e2679fa2-b047-41db-9cd1-8d5b860402f6","Type":"ContainerStarted","Data":"80382f935a3e2fa6c3f1d2bbeb77879dd358990ba59d2489678a09c91900940c"} Dec 16 09:08:36 crc kubenswrapper[4823]: I1216 09:08:36.397997 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="e2679fa2-b047-41db-9cd1-8d5b860402f6" containerName="aodh-evaluator" containerID="cri-o://61073e9397ad6ff5fd78dd108e79cb990e08952612aefe1cb324550b15f782da" gracePeriod=30 Dec 16 09:08:36 crc kubenswrapper[4823]: I1216 09:08:36.399991 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"370e1f30-0d9b-4159-9b46-7ea71ab8dab8","Type":"ContainerStarted","Data":"24789f257585b701d6c4c49fa7898eb588c41d3b730e43181119d9a418b06ac1"} Dec 16 09:08:36 crc kubenswrapper[4823]: I1216 09:08:36.436120 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=2.154244958 podStartE2EDuration="12.436011901s" podCreationTimestamp="2025-12-16 09:08:24 +0000 UTC" firstStartedPulling="2025-12-16 09:08:25.669162526 +0000 UTC m=+7984.157728649" lastFinishedPulling="2025-12-16 09:08:35.950929459 +0000 UTC m=+7994.439495592" 
observedRunningTime="2025-12-16 09:08:36.424701867 +0000 UTC m=+7994.913267990" watchObservedRunningTime="2025-12-16 09:08:36.436011901 +0000 UTC m=+7994.924578024" Dec 16 09:08:36 crc kubenswrapper[4823]: I1216 09:08:36.473519 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 16 09:08:37 crc kubenswrapper[4823]: I1216 09:08:37.418313 4823 generic.go:334] "Generic (PLEG): container finished" podID="e2679fa2-b047-41db-9cd1-8d5b860402f6" containerID="61073e9397ad6ff5fd78dd108e79cb990e08952612aefe1cb324550b15f782da" exitCode=0 Dec 16 09:08:37 crc kubenswrapper[4823]: I1216 09:08:37.418601 4823 generic.go:334] "Generic (PLEG): container finished" podID="e2679fa2-b047-41db-9cd1-8d5b860402f6" containerID="c8e75d4b99a8c660cc95ace3135bd4c4c77013222edc8ed7574091892902bc00" exitCode=0 Dec 16 09:08:37 crc kubenswrapper[4823]: I1216 09:08:37.418653 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"e2679fa2-b047-41db-9cd1-8d5b860402f6","Type":"ContainerDied","Data":"61073e9397ad6ff5fd78dd108e79cb990e08952612aefe1cb324550b15f782da"} Dec 16 09:08:37 crc kubenswrapper[4823]: I1216 09:08:37.418684 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"e2679fa2-b047-41db-9cd1-8d5b860402f6","Type":"ContainerDied","Data":"c8e75d4b99a8c660cc95ace3135bd4c4c77013222edc8ed7574091892902bc00"} Dec 16 09:08:37 crc kubenswrapper[4823]: I1216 09:08:37.427215 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"3ee97b1f-ce61-45ef-97e1-642cc13ef521","Type":"ContainerStarted","Data":"50fa1656455665eb191592ec56027f7b91faa2b2f26b2a004c1a13b50e4fa632"} Dec 16 09:08:38 crc kubenswrapper[4823]: I1216 09:08:38.455569 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"370e1f30-0d9b-4159-9b46-7ea71ab8dab8","Type":"ContainerStarted","Data":"0f6ef5dc44af3437d88db4814b9862c82d377eca5ec7d243c3b3b626762fbb9a"} Dec 16 09:08:38 crc kubenswrapper[4823]: I1216 09:08:38.456122 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"370e1f30-0d9b-4159-9b46-7ea71ab8dab8","Type":"ContainerStarted","Data":"aed236dd50ece8462019c89a489d6ac0a2f1fc838a376f4e3b62f39ef136c071"} Dec 16 09:08:38 crc kubenswrapper[4823]: I1216 09:08:38.459766 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"3ee97b1f-ce61-45ef-97e1-642cc13ef521","Type":"ContainerStarted","Data":"9c2fb0e25d5692eb7a90933e0f8cf60671619d23ad83cd29dd61e8449d4f5dfe"} Dec 16 09:08:38 crc kubenswrapper[4823]: I1216 09:08:38.459992 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 16 09:08:38 crc kubenswrapper[4823]: I1216 09:08:38.489948 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=3.597935506 podStartE2EDuration="4.489925613s" podCreationTimestamp="2025-12-16 09:08:34 +0000 UTC" firstStartedPulling="2025-12-16 09:08:36.51745214 +0000 UTC m=+7995.006018263" lastFinishedPulling="2025-12-16 09:08:37.409442247 +0000 UTC m=+7995.898008370" observedRunningTime="2025-12-16 09:08:38.484787642 +0000 UTC m=+7996.973353765" watchObservedRunningTime="2025-12-16 09:08:38.489925613 +0000 UTC m=+7996.978491746" Dec 16 09:08:41 crc kubenswrapper[4823]: I1216 09:08:41.495249 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"370e1f30-0d9b-4159-9b46-7ea71ab8dab8","Type":"ContainerStarted","Data":"386e50e5744c1289e0e03578c5b3e0459f876009e0b86f537d9698aa4cb87d54"} Dec 16 09:08:41 crc kubenswrapper[4823]: I1216 09:08:41.496005 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/ceilometer-0" Dec 16 09:08:41 crc kubenswrapper[4823]: I1216 09:08:41.495530 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="370e1f30-0d9b-4159-9b46-7ea71ab8dab8" containerName="proxy-httpd" containerID="cri-o://386e50e5744c1289e0e03578c5b3e0459f876009e0b86f537d9698aa4cb87d54" gracePeriod=30 Dec 16 09:08:41 crc kubenswrapper[4823]: I1216 09:08:41.495617 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="370e1f30-0d9b-4159-9b46-7ea71ab8dab8" containerName="ceilometer-notification-agent" containerID="cri-o://aed236dd50ece8462019c89a489d6ac0a2f1fc838a376f4e3b62f39ef136c071" gracePeriod=30 Dec 16 09:08:41 crc kubenswrapper[4823]: I1216 09:08:41.495639 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="370e1f30-0d9b-4159-9b46-7ea71ab8dab8" containerName="sg-core" containerID="cri-o://0f6ef5dc44af3437d88db4814b9862c82d377eca5ec7d243c3b3b626762fbb9a" gracePeriod=30 Dec 16 09:08:41 crc kubenswrapper[4823]: I1216 09:08:41.495507 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="370e1f30-0d9b-4159-9b46-7ea71ab8dab8" containerName="ceilometer-central-agent" containerID="cri-o://24789f257585b701d6c4c49fa7898eb588c41d3b730e43181119d9a418b06ac1" gracePeriod=30 Dec 16 09:08:41 crc kubenswrapper[4823]: I1216 09:08:41.527289 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.897078255 podStartE2EDuration="8.527264573s" podCreationTimestamp="2025-12-16 09:08:33 +0000 UTC" firstStartedPulling="2025-12-16 09:08:34.525719533 +0000 UTC m=+7993.014285656" lastFinishedPulling="2025-12-16 09:08:41.155905851 +0000 UTC m=+7999.644471974" observedRunningTime="2025-12-16 09:08:41.520211733 +0000 UTC m=+8000.008777856" watchObservedRunningTime="2025-12-16 
09:08:41.527264573 +0000 UTC m=+8000.015830696" Dec 16 09:08:42 crc kubenswrapper[4823]: I1216 09:08:42.507740 4823 generic.go:334] "Generic (PLEG): container finished" podID="370e1f30-0d9b-4159-9b46-7ea71ab8dab8" containerID="386e50e5744c1289e0e03578c5b3e0459f876009e0b86f537d9698aa4cb87d54" exitCode=1 Dec 16 09:08:42 crc kubenswrapper[4823]: I1216 09:08:42.507994 4823 generic.go:334] "Generic (PLEG): container finished" podID="370e1f30-0d9b-4159-9b46-7ea71ab8dab8" containerID="0f6ef5dc44af3437d88db4814b9862c82d377eca5ec7d243c3b3b626762fbb9a" exitCode=2 Dec 16 09:08:42 crc kubenswrapper[4823]: I1216 09:08:42.507816 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"370e1f30-0d9b-4159-9b46-7ea71ab8dab8","Type":"ContainerDied","Data":"386e50e5744c1289e0e03578c5b3e0459f876009e0b86f537d9698aa4cb87d54"} Dec 16 09:08:42 crc kubenswrapper[4823]: I1216 09:08:42.508035 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"370e1f30-0d9b-4159-9b46-7ea71ab8dab8","Type":"ContainerDied","Data":"0f6ef5dc44af3437d88db4814b9862c82d377eca5ec7d243c3b3b626762fbb9a"} Dec 16 09:08:44 crc kubenswrapper[4823]: I1216 09:08:44.534347 4823 generic.go:334] "Generic (PLEG): container finished" podID="370e1f30-0d9b-4159-9b46-7ea71ab8dab8" containerID="aed236dd50ece8462019c89a489d6ac0a2f1fc838a376f4e3b62f39ef136c071" exitCode=0 Dec 16 09:08:44 crc kubenswrapper[4823]: I1216 09:08:44.534428 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"370e1f30-0d9b-4159-9b46-7ea71ab8dab8","Type":"ContainerDied","Data":"aed236dd50ece8462019c89a489d6ac0a2f1fc838a376f4e3b62f39ef136c071"} Dec 16 09:08:44 crc kubenswrapper[4823]: I1216 09:08:44.901968 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 16 09:08:47 crc kubenswrapper[4823]: I1216 09:08:47.545867 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 16 09:08:47 crc kubenswrapper[4823]: I1216 09:08:47.562963 4823 generic.go:334] "Generic (PLEG): container finished" podID="370e1f30-0d9b-4159-9b46-7ea71ab8dab8" containerID="24789f257585b701d6c4c49fa7898eb588c41d3b730e43181119d9a418b06ac1" exitCode=0 Dec 16 09:08:47 crc kubenswrapper[4823]: I1216 09:08:47.563008 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"370e1f30-0d9b-4159-9b46-7ea71ab8dab8","Type":"ContainerDied","Data":"24789f257585b701d6c4c49fa7898eb588c41d3b730e43181119d9a418b06ac1"} Dec 16 09:08:47 crc kubenswrapper[4823]: I1216 09:08:47.563055 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 16 09:08:47 crc kubenswrapper[4823]: I1216 09:08:47.563071 4823 scope.go:117] "RemoveContainer" containerID="386e50e5744c1289e0e03578c5b3e0459f876009e0b86f537d9698aa4cb87d54" Dec 16 09:08:47 crc kubenswrapper[4823]: I1216 09:08:47.563058 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"370e1f30-0d9b-4159-9b46-7ea71ab8dab8","Type":"ContainerDied","Data":"7f8ef0124e650d93486b0d6767e203d5e835ee1ee216b982ea0544251c5e769b"} Dec 16 09:08:47 crc kubenswrapper[4823]: I1216 09:08:47.616604 4823 scope.go:117] "RemoveContainer" containerID="0f6ef5dc44af3437d88db4814b9862c82d377eca5ec7d243c3b3b626762fbb9a" Dec 16 09:08:47 crc kubenswrapper[4823]: I1216 09:08:47.637866 4823 scope.go:117] "RemoveContainer" containerID="aed236dd50ece8462019c89a489d6ac0a2f1fc838a376f4e3b62f39ef136c071" Dec 16 09:08:47 crc kubenswrapper[4823]: I1216 09:08:47.649347 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/370e1f30-0d9b-4159-9b46-7ea71ab8dab8-sg-core-conf-yaml\") pod \"370e1f30-0d9b-4159-9b46-7ea71ab8dab8\" (UID: \"370e1f30-0d9b-4159-9b46-7ea71ab8dab8\") " Dec 16 09:08:47 crc 
kubenswrapper[4823]: I1216 09:08:47.649417 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/370e1f30-0d9b-4159-9b46-7ea71ab8dab8-config-data\") pod \"370e1f30-0d9b-4159-9b46-7ea71ab8dab8\" (UID: \"370e1f30-0d9b-4159-9b46-7ea71ab8dab8\") " Dec 16 09:08:47 crc kubenswrapper[4823]: I1216 09:08:47.649455 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/370e1f30-0d9b-4159-9b46-7ea71ab8dab8-run-httpd\") pod \"370e1f30-0d9b-4159-9b46-7ea71ab8dab8\" (UID: \"370e1f30-0d9b-4159-9b46-7ea71ab8dab8\") " Dec 16 09:08:47 crc kubenswrapper[4823]: I1216 09:08:47.649634 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/370e1f30-0d9b-4159-9b46-7ea71ab8dab8-combined-ca-bundle\") pod \"370e1f30-0d9b-4159-9b46-7ea71ab8dab8\" (UID: \"370e1f30-0d9b-4159-9b46-7ea71ab8dab8\") " Dec 16 09:08:47 crc kubenswrapper[4823]: I1216 09:08:47.649689 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g99bh\" (UniqueName: \"kubernetes.io/projected/370e1f30-0d9b-4159-9b46-7ea71ab8dab8-kube-api-access-g99bh\") pod \"370e1f30-0d9b-4159-9b46-7ea71ab8dab8\" (UID: \"370e1f30-0d9b-4159-9b46-7ea71ab8dab8\") " Dec 16 09:08:47 crc kubenswrapper[4823]: I1216 09:08:47.649871 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/370e1f30-0d9b-4159-9b46-7ea71ab8dab8-log-httpd\") pod \"370e1f30-0d9b-4159-9b46-7ea71ab8dab8\" (UID: \"370e1f30-0d9b-4159-9b46-7ea71ab8dab8\") " Dec 16 09:08:47 crc kubenswrapper[4823]: I1216 09:08:47.649937 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/370e1f30-0d9b-4159-9b46-7ea71ab8dab8-scripts\") pod 
\"370e1f30-0d9b-4159-9b46-7ea71ab8dab8\" (UID: \"370e1f30-0d9b-4159-9b46-7ea71ab8dab8\") " Dec 16 09:08:47 crc kubenswrapper[4823]: I1216 09:08:47.650851 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/370e1f30-0d9b-4159-9b46-7ea71ab8dab8-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "370e1f30-0d9b-4159-9b46-7ea71ab8dab8" (UID: "370e1f30-0d9b-4159-9b46-7ea71ab8dab8"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 09:08:47 crc kubenswrapper[4823]: I1216 09:08:47.651647 4823 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/370e1f30-0d9b-4159-9b46-7ea71ab8dab8-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 16 09:08:47 crc kubenswrapper[4823]: I1216 09:08:47.651873 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/370e1f30-0d9b-4159-9b46-7ea71ab8dab8-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "370e1f30-0d9b-4159-9b46-7ea71ab8dab8" (UID: "370e1f30-0d9b-4159-9b46-7ea71ab8dab8"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 09:08:47 crc kubenswrapper[4823]: I1216 09:08:47.655730 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/370e1f30-0d9b-4159-9b46-7ea71ab8dab8-kube-api-access-g99bh" (OuterVolumeSpecName: "kube-api-access-g99bh") pod "370e1f30-0d9b-4159-9b46-7ea71ab8dab8" (UID: "370e1f30-0d9b-4159-9b46-7ea71ab8dab8"). InnerVolumeSpecName "kube-api-access-g99bh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:08:47 crc kubenswrapper[4823]: I1216 09:08:47.656669 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/370e1f30-0d9b-4159-9b46-7ea71ab8dab8-scripts" (OuterVolumeSpecName: "scripts") pod "370e1f30-0d9b-4159-9b46-7ea71ab8dab8" (UID: "370e1f30-0d9b-4159-9b46-7ea71ab8dab8"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:08:47 crc kubenswrapper[4823]: I1216 09:08:47.661960 4823 scope.go:117] "RemoveContainer" containerID="24789f257585b701d6c4c49fa7898eb588c41d3b730e43181119d9a418b06ac1" Dec 16 09:08:47 crc kubenswrapper[4823]: I1216 09:08:47.685677 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/370e1f30-0d9b-4159-9b46-7ea71ab8dab8-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "370e1f30-0d9b-4159-9b46-7ea71ab8dab8" (UID: "370e1f30-0d9b-4159-9b46-7ea71ab8dab8"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:08:47 crc kubenswrapper[4823]: I1216 09:08:47.731320 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/370e1f30-0d9b-4159-9b46-7ea71ab8dab8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "370e1f30-0d9b-4159-9b46-7ea71ab8dab8" (UID: "370e1f30-0d9b-4159-9b46-7ea71ab8dab8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:08:47 crc kubenswrapper[4823]: I1216 09:08:47.753348 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/370e1f30-0d9b-4159-9b46-7ea71ab8dab8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 09:08:47 crc kubenswrapper[4823]: I1216 09:08:47.753500 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g99bh\" (UniqueName: \"kubernetes.io/projected/370e1f30-0d9b-4159-9b46-7ea71ab8dab8-kube-api-access-g99bh\") on node \"crc\" DevicePath \"\"" Dec 16 09:08:47 crc kubenswrapper[4823]: I1216 09:08:47.753598 4823 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/370e1f30-0d9b-4159-9b46-7ea71ab8dab8-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 16 09:08:47 crc kubenswrapper[4823]: I1216 09:08:47.753674 4823 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/370e1f30-0d9b-4159-9b46-7ea71ab8dab8-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 09:08:47 crc kubenswrapper[4823]: I1216 09:08:47.753758 4823 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/370e1f30-0d9b-4159-9b46-7ea71ab8dab8-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 16 09:08:47 crc kubenswrapper[4823]: I1216 09:08:47.757709 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/370e1f30-0d9b-4159-9b46-7ea71ab8dab8-config-data" (OuterVolumeSpecName: "config-data") pod "370e1f30-0d9b-4159-9b46-7ea71ab8dab8" (UID: "370e1f30-0d9b-4159-9b46-7ea71ab8dab8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:08:47 crc kubenswrapper[4823]: I1216 09:08:47.778143 4823 scope.go:117] "RemoveContainer" containerID="386e50e5744c1289e0e03578c5b3e0459f876009e0b86f537d9698aa4cb87d54" Dec 16 09:08:47 crc kubenswrapper[4823]: E1216 09:08:47.780370 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"386e50e5744c1289e0e03578c5b3e0459f876009e0b86f537d9698aa4cb87d54\": container with ID starting with 386e50e5744c1289e0e03578c5b3e0459f876009e0b86f537d9698aa4cb87d54 not found: ID does not exist" containerID="386e50e5744c1289e0e03578c5b3e0459f876009e0b86f537d9698aa4cb87d54" Dec 16 09:08:47 crc kubenswrapper[4823]: I1216 09:08:47.780422 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"386e50e5744c1289e0e03578c5b3e0459f876009e0b86f537d9698aa4cb87d54"} err="failed to get container status \"386e50e5744c1289e0e03578c5b3e0459f876009e0b86f537d9698aa4cb87d54\": rpc error: code = NotFound desc = could not find container \"386e50e5744c1289e0e03578c5b3e0459f876009e0b86f537d9698aa4cb87d54\": container with ID starting with 386e50e5744c1289e0e03578c5b3e0459f876009e0b86f537d9698aa4cb87d54 not found: ID does not exist" Dec 16 09:08:47 crc kubenswrapper[4823]: I1216 09:08:47.780447 4823 scope.go:117] "RemoveContainer" containerID="0f6ef5dc44af3437d88db4814b9862c82d377eca5ec7d243c3b3b626762fbb9a" Dec 16 09:08:47 crc kubenswrapper[4823]: E1216 09:08:47.780784 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f6ef5dc44af3437d88db4814b9862c82d377eca5ec7d243c3b3b626762fbb9a\": container with ID starting with 0f6ef5dc44af3437d88db4814b9862c82d377eca5ec7d243c3b3b626762fbb9a not found: ID does not exist" containerID="0f6ef5dc44af3437d88db4814b9862c82d377eca5ec7d243c3b3b626762fbb9a" Dec 16 09:08:47 crc kubenswrapper[4823]: I1216 09:08:47.780821 
4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f6ef5dc44af3437d88db4814b9862c82d377eca5ec7d243c3b3b626762fbb9a"} err="failed to get container status \"0f6ef5dc44af3437d88db4814b9862c82d377eca5ec7d243c3b3b626762fbb9a\": rpc error: code = NotFound desc = could not find container \"0f6ef5dc44af3437d88db4814b9862c82d377eca5ec7d243c3b3b626762fbb9a\": container with ID starting with 0f6ef5dc44af3437d88db4814b9862c82d377eca5ec7d243c3b3b626762fbb9a not found: ID does not exist" Dec 16 09:08:47 crc kubenswrapper[4823]: I1216 09:08:47.780841 4823 scope.go:117] "RemoveContainer" containerID="aed236dd50ece8462019c89a489d6ac0a2f1fc838a376f4e3b62f39ef136c071" Dec 16 09:08:47 crc kubenswrapper[4823]: E1216 09:08:47.781087 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aed236dd50ece8462019c89a489d6ac0a2f1fc838a376f4e3b62f39ef136c071\": container with ID starting with aed236dd50ece8462019c89a489d6ac0a2f1fc838a376f4e3b62f39ef136c071 not found: ID does not exist" containerID="aed236dd50ece8462019c89a489d6ac0a2f1fc838a376f4e3b62f39ef136c071" Dec 16 09:08:47 crc kubenswrapper[4823]: I1216 09:08:47.781106 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aed236dd50ece8462019c89a489d6ac0a2f1fc838a376f4e3b62f39ef136c071"} err="failed to get container status \"aed236dd50ece8462019c89a489d6ac0a2f1fc838a376f4e3b62f39ef136c071\": rpc error: code = NotFound desc = could not find container \"aed236dd50ece8462019c89a489d6ac0a2f1fc838a376f4e3b62f39ef136c071\": container with ID starting with aed236dd50ece8462019c89a489d6ac0a2f1fc838a376f4e3b62f39ef136c071 not found: ID does not exist" Dec 16 09:08:47 crc kubenswrapper[4823]: I1216 09:08:47.781118 4823 scope.go:117] "RemoveContainer" containerID="24789f257585b701d6c4c49fa7898eb588c41d3b730e43181119d9a418b06ac1" Dec 16 09:08:47 crc kubenswrapper[4823]: E1216 
09:08:47.781409 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24789f257585b701d6c4c49fa7898eb588c41d3b730e43181119d9a418b06ac1\": container with ID starting with 24789f257585b701d6c4c49fa7898eb588c41d3b730e43181119d9a418b06ac1 not found: ID does not exist" containerID="24789f257585b701d6c4c49fa7898eb588c41d3b730e43181119d9a418b06ac1" Dec 16 09:08:47 crc kubenswrapper[4823]: I1216 09:08:47.781465 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24789f257585b701d6c4c49fa7898eb588c41d3b730e43181119d9a418b06ac1"} err="failed to get container status \"24789f257585b701d6c4c49fa7898eb588c41d3b730e43181119d9a418b06ac1\": rpc error: code = NotFound desc = could not find container \"24789f257585b701d6c4c49fa7898eb588c41d3b730e43181119d9a418b06ac1\": container with ID starting with 24789f257585b701d6c4c49fa7898eb588c41d3b730e43181119d9a418b06ac1 not found: ID does not exist" Dec 16 09:08:47 crc kubenswrapper[4823]: I1216 09:08:47.856722 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/370e1f30-0d9b-4159-9b46-7ea71ab8dab8-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 09:08:47 crc kubenswrapper[4823]: I1216 09:08:47.888611 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 16 09:08:47 crc kubenswrapper[4823]: I1216 09:08:47.907392 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 16 09:08:47 crc kubenswrapper[4823]: I1216 09:08:47.921984 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 16 09:08:47 crc kubenswrapper[4823]: E1216 09:08:47.922518 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="370e1f30-0d9b-4159-9b46-7ea71ab8dab8" containerName="sg-core" Dec 16 09:08:47 crc kubenswrapper[4823]: I1216 09:08:47.922537 4823 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="370e1f30-0d9b-4159-9b46-7ea71ab8dab8" containerName="sg-core" Dec 16 09:08:47 crc kubenswrapper[4823]: E1216 09:08:47.922552 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="370e1f30-0d9b-4159-9b46-7ea71ab8dab8" containerName="ceilometer-notification-agent" Dec 16 09:08:47 crc kubenswrapper[4823]: I1216 09:08:47.922559 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="370e1f30-0d9b-4159-9b46-7ea71ab8dab8" containerName="ceilometer-notification-agent" Dec 16 09:08:47 crc kubenswrapper[4823]: E1216 09:08:47.922574 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="370e1f30-0d9b-4159-9b46-7ea71ab8dab8" containerName="ceilometer-central-agent" Dec 16 09:08:47 crc kubenswrapper[4823]: I1216 09:08:47.922580 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="370e1f30-0d9b-4159-9b46-7ea71ab8dab8" containerName="ceilometer-central-agent" Dec 16 09:08:47 crc kubenswrapper[4823]: E1216 09:08:47.922606 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="370e1f30-0d9b-4159-9b46-7ea71ab8dab8" containerName="proxy-httpd" Dec 16 09:08:47 crc kubenswrapper[4823]: I1216 09:08:47.922613 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="370e1f30-0d9b-4159-9b46-7ea71ab8dab8" containerName="proxy-httpd" Dec 16 09:08:47 crc kubenswrapper[4823]: I1216 09:08:47.922817 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="370e1f30-0d9b-4159-9b46-7ea71ab8dab8" containerName="sg-core" Dec 16 09:08:47 crc kubenswrapper[4823]: I1216 09:08:47.922838 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="370e1f30-0d9b-4159-9b46-7ea71ab8dab8" containerName="proxy-httpd" Dec 16 09:08:47 crc kubenswrapper[4823]: I1216 09:08:47.922855 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="370e1f30-0d9b-4159-9b46-7ea71ab8dab8" containerName="ceilometer-notification-agent" Dec 16 09:08:47 crc kubenswrapper[4823]: I1216 
09:08:47.922862 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="370e1f30-0d9b-4159-9b46-7ea71ab8dab8" containerName="ceilometer-central-agent" Dec 16 09:08:47 crc kubenswrapper[4823]: I1216 09:08:47.924735 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 16 09:08:47 crc kubenswrapper[4823]: I1216 09:08:47.927165 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 16 09:08:47 crc kubenswrapper[4823]: I1216 09:08:47.927430 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 16 09:08:47 crc kubenswrapper[4823]: I1216 09:08:47.930101 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 16 09:08:47 crc kubenswrapper[4823]: I1216 09:08:47.936403 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 16 09:08:47 crc kubenswrapper[4823]: I1216 09:08:47.958675 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/91080e73-6479-4c8b-bb2f-decdc0ade67e-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"91080e73-6479-4c8b-bb2f-decdc0ade67e\") " pod="openstack/ceilometer-0" Dec 16 09:08:47 crc kubenswrapper[4823]: I1216 09:08:47.958732 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/91080e73-6479-4c8b-bb2f-decdc0ade67e-scripts\") pod \"ceilometer-0\" (UID: \"91080e73-6479-4c8b-bb2f-decdc0ade67e\") " pod="openstack/ceilometer-0" Dec 16 09:08:47 crc kubenswrapper[4823]: I1216 09:08:47.958786 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/91080e73-6479-4c8b-bb2f-decdc0ade67e-log-httpd\") pod 
\"ceilometer-0\" (UID: \"91080e73-6479-4c8b-bb2f-decdc0ade67e\") " pod="openstack/ceilometer-0" Dec 16 09:08:47 crc kubenswrapper[4823]: I1216 09:08:47.958809 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91080e73-6479-4c8b-bb2f-decdc0ade67e-config-data\") pod \"ceilometer-0\" (UID: \"91080e73-6479-4c8b-bb2f-decdc0ade67e\") " pod="openstack/ceilometer-0" Dec 16 09:08:47 crc kubenswrapper[4823]: I1216 09:08:47.958904 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/91080e73-6479-4c8b-bb2f-decdc0ade67e-run-httpd\") pod \"ceilometer-0\" (UID: \"91080e73-6479-4c8b-bb2f-decdc0ade67e\") " pod="openstack/ceilometer-0" Dec 16 09:08:47 crc kubenswrapper[4823]: I1216 09:08:47.958962 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91080e73-6479-4c8b-bb2f-decdc0ade67e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"91080e73-6479-4c8b-bb2f-decdc0ade67e\") " pod="openstack/ceilometer-0" Dec 16 09:08:47 crc kubenswrapper[4823]: I1216 09:08:47.959084 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/91080e73-6479-4c8b-bb2f-decdc0ade67e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"91080e73-6479-4c8b-bb2f-decdc0ade67e\") " pod="openstack/ceilometer-0" Dec 16 09:08:47 crc kubenswrapper[4823]: I1216 09:08:47.959299 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9q46t\" (UniqueName: \"kubernetes.io/projected/91080e73-6479-4c8b-bb2f-decdc0ade67e-kube-api-access-9q46t\") pod \"ceilometer-0\" (UID: \"91080e73-6479-4c8b-bb2f-decdc0ade67e\") " pod="openstack/ceilometer-0" Dec 16 09:08:48 crc 
kubenswrapper[4823]: I1216 09:08:48.060583 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9q46t\" (UniqueName: \"kubernetes.io/projected/91080e73-6479-4c8b-bb2f-decdc0ade67e-kube-api-access-9q46t\") pod \"ceilometer-0\" (UID: \"91080e73-6479-4c8b-bb2f-decdc0ade67e\") " pod="openstack/ceilometer-0" Dec 16 09:08:48 crc kubenswrapper[4823]: I1216 09:08:48.060706 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/91080e73-6479-4c8b-bb2f-decdc0ade67e-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"91080e73-6479-4c8b-bb2f-decdc0ade67e\") " pod="openstack/ceilometer-0" Dec 16 09:08:48 crc kubenswrapper[4823]: I1216 09:08:48.060734 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/91080e73-6479-4c8b-bb2f-decdc0ade67e-scripts\") pod \"ceilometer-0\" (UID: \"91080e73-6479-4c8b-bb2f-decdc0ade67e\") " pod="openstack/ceilometer-0" Dec 16 09:08:48 crc kubenswrapper[4823]: I1216 09:08:48.060772 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/91080e73-6479-4c8b-bb2f-decdc0ade67e-log-httpd\") pod \"ceilometer-0\" (UID: \"91080e73-6479-4c8b-bb2f-decdc0ade67e\") " pod="openstack/ceilometer-0" Dec 16 09:08:48 crc kubenswrapper[4823]: I1216 09:08:48.060797 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91080e73-6479-4c8b-bb2f-decdc0ade67e-config-data\") pod \"ceilometer-0\" (UID: \"91080e73-6479-4c8b-bb2f-decdc0ade67e\") " pod="openstack/ceilometer-0" Dec 16 09:08:48 crc kubenswrapper[4823]: I1216 09:08:48.060852 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/91080e73-6479-4c8b-bb2f-decdc0ade67e-run-httpd\") 
pod \"ceilometer-0\" (UID: \"91080e73-6479-4c8b-bb2f-decdc0ade67e\") " pod="openstack/ceilometer-0" Dec 16 09:08:48 crc kubenswrapper[4823]: I1216 09:08:48.060893 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91080e73-6479-4c8b-bb2f-decdc0ade67e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"91080e73-6479-4c8b-bb2f-decdc0ade67e\") " pod="openstack/ceilometer-0" Dec 16 09:08:48 crc kubenswrapper[4823]: I1216 09:08:48.060963 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/91080e73-6479-4c8b-bb2f-decdc0ade67e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"91080e73-6479-4c8b-bb2f-decdc0ade67e\") " pod="openstack/ceilometer-0" Dec 16 09:08:48 crc kubenswrapper[4823]: I1216 09:08:48.061822 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/91080e73-6479-4c8b-bb2f-decdc0ade67e-run-httpd\") pod \"ceilometer-0\" (UID: \"91080e73-6479-4c8b-bb2f-decdc0ade67e\") " pod="openstack/ceilometer-0" Dec 16 09:08:48 crc kubenswrapper[4823]: I1216 09:08:48.061842 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/91080e73-6479-4c8b-bb2f-decdc0ade67e-log-httpd\") pod \"ceilometer-0\" (UID: \"91080e73-6479-4c8b-bb2f-decdc0ade67e\") " pod="openstack/ceilometer-0" Dec 16 09:08:48 crc kubenswrapper[4823]: I1216 09:08:48.064585 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/91080e73-6479-4c8b-bb2f-decdc0ade67e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"91080e73-6479-4c8b-bb2f-decdc0ade67e\") " pod="openstack/ceilometer-0" Dec 16 09:08:48 crc kubenswrapper[4823]: I1216 09:08:48.065075 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/91080e73-6479-4c8b-bb2f-decdc0ade67e-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"91080e73-6479-4c8b-bb2f-decdc0ade67e\") " pod="openstack/ceilometer-0" Dec 16 09:08:48 crc kubenswrapper[4823]: I1216 09:08:48.065694 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91080e73-6479-4c8b-bb2f-decdc0ade67e-config-data\") pod \"ceilometer-0\" (UID: \"91080e73-6479-4c8b-bb2f-decdc0ade67e\") " pod="openstack/ceilometer-0" Dec 16 09:08:48 crc kubenswrapper[4823]: I1216 09:08:48.066252 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91080e73-6479-4c8b-bb2f-decdc0ade67e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"91080e73-6479-4c8b-bb2f-decdc0ade67e\") " pod="openstack/ceilometer-0" Dec 16 09:08:48 crc kubenswrapper[4823]: I1216 09:08:48.067876 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/91080e73-6479-4c8b-bb2f-decdc0ade67e-scripts\") pod \"ceilometer-0\" (UID: \"91080e73-6479-4c8b-bb2f-decdc0ade67e\") " pod="openstack/ceilometer-0" Dec 16 09:08:48 crc kubenswrapper[4823]: I1216 09:08:48.076901 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9q46t\" (UniqueName: \"kubernetes.io/projected/91080e73-6479-4c8b-bb2f-decdc0ade67e-kube-api-access-9q46t\") pod \"ceilometer-0\" (UID: \"91080e73-6479-4c8b-bb2f-decdc0ade67e\") " pod="openstack/ceilometer-0" Dec 16 09:08:48 crc kubenswrapper[4823]: I1216 09:08:48.255389 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 16 09:08:48 crc kubenswrapper[4823]: I1216 09:08:48.707729 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 16 09:08:48 crc kubenswrapper[4823]: W1216 09:08:48.709673 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91080e73_6479_4c8b_bb2f_decdc0ade67e.slice/crio-380e1a79f4dcc3d646b1cd42e7195b74422fd4276d78617d7f6a8702db85cfa1 WatchSource:0}: Error finding container 380e1a79f4dcc3d646b1cd42e7195b74422fd4276d78617d7f6a8702db85cfa1: Status 404 returned error can't find the container with id 380e1a79f4dcc3d646b1cd42e7195b74422fd4276d78617d7f6a8702db85cfa1 Dec 16 09:08:49 crc kubenswrapper[4823]: I1216 09:08:49.589900 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"91080e73-6479-4c8b-bb2f-decdc0ade67e","Type":"ContainerStarted","Data":"18f3ebcfecc58f7fe77b00fc4dfa8cfa84e702063a21d50902de680197bc2807"} Dec 16 09:08:49 crc kubenswrapper[4823]: I1216 09:08:49.590294 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"91080e73-6479-4c8b-bb2f-decdc0ade67e","Type":"ContainerStarted","Data":"380e1a79f4dcc3d646b1cd42e7195b74422fd4276d78617d7f6a8702db85cfa1"} Dec 16 09:08:49 crc kubenswrapper[4823]: I1216 09:08:49.772351 4823 scope.go:117] "RemoveContainer" containerID="14e51af7fb5c2d7b7fdc9e1989841225a65614d883db6f8d5aea8aeb819bd04d" Dec 16 09:08:49 crc kubenswrapper[4823]: E1216 09:08:49.772636 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" 
podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 09:08:49 crc kubenswrapper[4823]: I1216 09:08:49.788274 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="370e1f30-0d9b-4159-9b46-7ea71ab8dab8" path="/var/lib/kubelet/pods/370e1f30-0d9b-4159-9b46-7ea71ab8dab8/volumes" Dec 16 09:08:51 crc kubenswrapper[4823]: I1216 09:08:51.524060 4823 scope.go:117] "RemoveContainer" containerID="24328ddc822f8342c18736f115585ba59a7d68d44674cfb10d0d532a099ae439" Dec 16 09:08:51 crc kubenswrapper[4823]: I1216 09:08:51.545970 4823 scope.go:117] "RemoveContainer" containerID="68397253796aa62e916c2caa7a1ff23231549f5a3cf2a3f3169458d72a61f5fd" Dec 16 09:08:52 crc kubenswrapper[4823]: I1216 09:08:52.645776 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"91080e73-6479-4c8b-bb2f-decdc0ade67e","Type":"ContainerStarted","Data":"79866c53b9104a855a31ec7d65de5986af187b4f14c67e1c06cf3e5d3a928a66"} Dec 16 09:08:56 crc kubenswrapper[4823]: I1216 09:08:56.681635 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"91080e73-6479-4c8b-bb2f-decdc0ade67e","Type":"ContainerStarted","Data":"d6a5d1a1c5e1f237851cc347fa709cbcd9a412a8a2d03acbfbedbaeae74c63c3"} Dec 16 09:09:00 crc kubenswrapper[4823]: I1216 09:09:00.082587 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-52nj6"] Dec 16 09:09:00 crc kubenswrapper[4823]: I1216 09:09:00.099264 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-52nj6"] Dec 16 09:09:01 crc kubenswrapper[4823]: I1216 09:09:01.787532 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d15835b7-f5d1-4b98-aeff-14eea3529691" path="/var/lib/kubelet/pods/d15835b7-f5d1-4b98-aeff-14eea3529691/volumes" Dec 16 09:09:02 crc kubenswrapper[4823]: I1216 09:09:02.772391 4823 scope.go:117] "RemoveContainer" containerID="14e51af7fb5c2d7b7fdc9e1989841225a65614d883db6f8d5aea8aeb819bd04d" Dec 16 
09:09:02 crc kubenswrapper[4823]: E1216 09:09:02.772647 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 09:09:05 crc kubenswrapper[4823]: I1216 09:09:05.785314 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 16 09:09:05 crc kubenswrapper[4823]: I1216 09:09:05.785646 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"91080e73-6479-4c8b-bb2f-decdc0ade67e","Type":"ContainerStarted","Data":"dda05eb58431d1ff8385982543bd81d7679e91490c736039436d4fcfce053345"} Dec 16 09:09:05 crc kubenswrapper[4823]: I1216 09:09:05.817836 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.696546403 podStartE2EDuration="18.817816747s" podCreationTimestamp="2025-12-16 09:08:47 +0000 UTC" firstStartedPulling="2025-12-16 09:08:48.71174473 +0000 UTC m=+8007.200310853" lastFinishedPulling="2025-12-16 09:09:04.833015074 +0000 UTC m=+8023.321581197" observedRunningTime="2025-12-16 09:09:05.815199625 +0000 UTC m=+8024.303765758" watchObservedRunningTime="2025-12-16 09:09:05.817816747 +0000 UTC m=+8024.306382880" Dec 16 09:09:07 crc kubenswrapper[4823]: I1216 09:09:07.809157 4823 generic.go:334] "Generic (PLEG): container finished" podID="e2679fa2-b047-41db-9cd1-8d5b860402f6" containerID="80382f935a3e2fa6c3f1d2bbeb77879dd358990ba59d2489678a09c91900940c" exitCode=137 Dec 16 09:09:07 crc kubenswrapper[4823]: I1216 09:09:07.809425 4823 generic.go:334] "Generic (PLEG): container finished" podID="e2679fa2-b047-41db-9cd1-8d5b860402f6" 
containerID="f4c73394752c0a82c15cd061ab029c7c21483a8ef491eb6768de605ade7f8c28" exitCode=137 Dec 16 09:09:07 crc kubenswrapper[4823]: I1216 09:09:07.809251 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"e2679fa2-b047-41db-9cd1-8d5b860402f6","Type":"ContainerDied","Data":"80382f935a3e2fa6c3f1d2bbeb77879dd358990ba59d2489678a09c91900940c"} Dec 16 09:09:07 crc kubenswrapper[4823]: I1216 09:09:07.809650 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"e2679fa2-b047-41db-9cd1-8d5b860402f6","Type":"ContainerDied","Data":"f4c73394752c0a82c15cd061ab029c7c21483a8ef491eb6768de605ade7f8c28"} Dec 16 09:09:08 crc kubenswrapper[4823]: I1216 09:09:08.489170 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Dec 16 09:09:08 crc kubenswrapper[4823]: I1216 09:09:08.619780 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2679fa2-b047-41db-9cd1-8d5b860402f6-combined-ca-bundle\") pod \"e2679fa2-b047-41db-9cd1-8d5b860402f6\" (UID: \"e2679fa2-b047-41db-9cd1-8d5b860402f6\") " Dec 16 09:09:08 crc kubenswrapper[4823]: I1216 09:09:08.620187 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2679fa2-b047-41db-9cd1-8d5b860402f6-config-data\") pod \"e2679fa2-b047-41db-9cd1-8d5b860402f6\" (UID: \"e2679fa2-b047-41db-9cd1-8d5b860402f6\") " Dec 16 09:09:08 crc kubenswrapper[4823]: I1216 09:09:08.620214 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2679fa2-b047-41db-9cd1-8d5b860402f6-scripts\") pod \"e2679fa2-b047-41db-9cd1-8d5b860402f6\" (UID: \"e2679fa2-b047-41db-9cd1-8d5b860402f6\") " Dec 16 09:09:08 crc kubenswrapper[4823]: I1216 09:09:08.620278 4823 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-2q6nx\" (UniqueName: \"kubernetes.io/projected/e2679fa2-b047-41db-9cd1-8d5b860402f6-kube-api-access-2q6nx\") pod \"e2679fa2-b047-41db-9cd1-8d5b860402f6\" (UID: \"e2679fa2-b047-41db-9cd1-8d5b860402f6\") " Dec 16 09:09:08 crc kubenswrapper[4823]: I1216 09:09:08.625566 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2679fa2-b047-41db-9cd1-8d5b860402f6-scripts" (OuterVolumeSpecName: "scripts") pod "e2679fa2-b047-41db-9cd1-8d5b860402f6" (UID: "e2679fa2-b047-41db-9cd1-8d5b860402f6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:09:08 crc kubenswrapper[4823]: I1216 09:09:08.627202 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2679fa2-b047-41db-9cd1-8d5b860402f6-kube-api-access-2q6nx" (OuterVolumeSpecName: "kube-api-access-2q6nx") pod "e2679fa2-b047-41db-9cd1-8d5b860402f6" (UID: "e2679fa2-b047-41db-9cd1-8d5b860402f6"). InnerVolumeSpecName "kube-api-access-2q6nx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:09:08 crc kubenswrapper[4823]: I1216 09:09:08.723112 4823 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2679fa2-b047-41db-9cd1-8d5b860402f6-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 09:09:08 crc kubenswrapper[4823]: I1216 09:09:08.723142 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2q6nx\" (UniqueName: \"kubernetes.io/projected/e2679fa2-b047-41db-9cd1-8d5b860402f6-kube-api-access-2q6nx\") on node \"crc\" DevicePath \"\"" Dec 16 09:09:08 crc kubenswrapper[4823]: I1216 09:09:08.725211 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2679fa2-b047-41db-9cd1-8d5b860402f6-config-data" (OuterVolumeSpecName: "config-data") pod "e2679fa2-b047-41db-9cd1-8d5b860402f6" (UID: "e2679fa2-b047-41db-9cd1-8d5b860402f6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:09:08 crc kubenswrapper[4823]: I1216 09:09:08.741522 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2679fa2-b047-41db-9cd1-8d5b860402f6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e2679fa2-b047-41db-9cd1-8d5b860402f6" (UID: "e2679fa2-b047-41db-9cd1-8d5b860402f6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:09:08 crc kubenswrapper[4823]: I1216 09:09:08.823969 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Dec 16 09:09:08 crc kubenswrapper[4823]: I1216 09:09:08.824559 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2679fa2-b047-41db-9cd1-8d5b860402f6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 09:09:08 crc kubenswrapper[4823]: I1216 09:09:08.824597 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2679fa2-b047-41db-9cd1-8d5b860402f6-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 09:09:08 crc kubenswrapper[4823]: I1216 09:09:08.824897 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"e2679fa2-b047-41db-9cd1-8d5b860402f6","Type":"ContainerDied","Data":"b9e08acf766f4bc65963c60c70c87789201704b78b192b94ae52efa419a5152f"} Dec 16 09:09:08 crc kubenswrapper[4823]: I1216 09:09:08.825018 4823 scope.go:117] "RemoveContainer" containerID="80382f935a3e2fa6c3f1d2bbeb77879dd358990ba59d2489678a09c91900940c" Dec 16 09:09:08 crc kubenswrapper[4823]: I1216 09:09:08.858780 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Dec 16 09:09:08 crc kubenswrapper[4823]: I1216 09:09:08.865135 4823 scope.go:117] "RemoveContainer" containerID="f4c73394752c0a82c15cd061ab029c7c21483a8ef491eb6768de605ade7f8c28" Dec 16 09:09:08 crc kubenswrapper[4823]: I1216 09:09:08.866728 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-0"] Dec 16 09:09:08 crc kubenswrapper[4823]: I1216 09:09:08.896243 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Dec 16 09:09:08 crc kubenswrapper[4823]: E1216 09:09:08.896831 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2679fa2-b047-41db-9cd1-8d5b860402f6" containerName="aodh-notifier" Dec 16 09:09:08 crc kubenswrapper[4823]: I1216 09:09:08.896918 4823 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="e2679fa2-b047-41db-9cd1-8d5b860402f6" containerName="aodh-notifier" Dec 16 09:09:08 crc kubenswrapper[4823]: E1216 09:09:08.896987 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2679fa2-b047-41db-9cd1-8d5b860402f6" containerName="aodh-listener" Dec 16 09:09:08 crc kubenswrapper[4823]: I1216 09:09:08.897052 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2679fa2-b047-41db-9cd1-8d5b860402f6" containerName="aodh-listener" Dec 16 09:09:08 crc kubenswrapper[4823]: E1216 09:09:08.897109 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2679fa2-b047-41db-9cd1-8d5b860402f6" containerName="aodh-evaluator" Dec 16 09:09:08 crc kubenswrapper[4823]: I1216 09:09:08.897159 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2679fa2-b047-41db-9cd1-8d5b860402f6" containerName="aodh-evaluator" Dec 16 09:09:08 crc kubenswrapper[4823]: E1216 09:09:08.897215 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2679fa2-b047-41db-9cd1-8d5b860402f6" containerName="aodh-api" Dec 16 09:09:08 crc kubenswrapper[4823]: I1216 09:09:08.897298 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2679fa2-b047-41db-9cd1-8d5b860402f6" containerName="aodh-api" Dec 16 09:09:08 crc kubenswrapper[4823]: I1216 09:09:08.897558 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2679fa2-b047-41db-9cd1-8d5b860402f6" containerName="aodh-notifier" Dec 16 09:09:08 crc kubenswrapper[4823]: I1216 09:09:08.897617 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2679fa2-b047-41db-9cd1-8d5b860402f6" containerName="aodh-api" Dec 16 09:09:08 crc kubenswrapper[4823]: I1216 09:09:08.897679 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2679fa2-b047-41db-9cd1-8d5b860402f6" containerName="aodh-listener" Dec 16 09:09:08 crc kubenswrapper[4823]: I1216 09:09:08.897746 4823 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="e2679fa2-b047-41db-9cd1-8d5b860402f6" containerName="aodh-evaluator" Dec 16 09:09:08 crc kubenswrapper[4823]: I1216 09:09:08.899629 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Dec 16 09:09:08 crc kubenswrapper[4823]: I1216 09:09:08.903321 4823 scope.go:117] "RemoveContainer" containerID="61073e9397ad6ff5fd78dd108e79cb990e08952612aefe1cb324550b15f782da" Dec 16 09:09:08 crc kubenswrapper[4823]: I1216 09:09:08.905721 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-internal-svc" Dec 16 09:09:08 crc kubenswrapper[4823]: I1216 09:09:08.905986 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-c8hwr" Dec 16 09:09:08 crc kubenswrapper[4823]: I1216 09:09:08.906411 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Dec 16 09:09:08 crc kubenswrapper[4823]: I1216 09:09:08.906997 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-public-svc" Dec 16 09:09:08 crc kubenswrapper[4823]: I1216 09:09:08.909492 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Dec 16 09:09:08 crc kubenswrapper[4823]: I1216 09:09:08.917613 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Dec 16 09:09:08 crc kubenswrapper[4823]: I1216 09:09:08.940940 4823 scope.go:117] "RemoveContainer" containerID="c8e75d4b99a8c660cc95ace3135bd4c4c77013222edc8ed7574091892902bc00" Dec 16 09:09:09 crc kubenswrapper[4823]: I1216 09:09:09.028729 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34b62d72-52bc-4a7d-806c-52784476a695-scripts\") pod \"aodh-0\" (UID: \"34b62d72-52bc-4a7d-806c-52784476a695\") " pod="openstack/aodh-0" Dec 16 09:09:09 crc kubenswrapper[4823]: I1216 09:09:09.028810 4823 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/34b62d72-52bc-4a7d-806c-52784476a695-public-tls-certs\") pod \"aodh-0\" (UID: \"34b62d72-52bc-4a7d-806c-52784476a695\") " pod="openstack/aodh-0" Dec 16 09:09:09 crc kubenswrapper[4823]: I1216 09:09:09.029002 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fxxf\" (UniqueName: \"kubernetes.io/projected/34b62d72-52bc-4a7d-806c-52784476a695-kube-api-access-2fxxf\") pod \"aodh-0\" (UID: \"34b62d72-52bc-4a7d-806c-52784476a695\") " pod="openstack/aodh-0" Dec 16 09:09:09 crc kubenswrapper[4823]: I1216 09:09:09.029053 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34b62d72-52bc-4a7d-806c-52784476a695-combined-ca-bundle\") pod \"aodh-0\" (UID: \"34b62d72-52bc-4a7d-806c-52784476a695\") " pod="openstack/aodh-0" Dec 16 09:09:09 crc kubenswrapper[4823]: I1216 09:09:09.029373 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34b62d72-52bc-4a7d-806c-52784476a695-config-data\") pod \"aodh-0\" (UID: \"34b62d72-52bc-4a7d-806c-52784476a695\") " pod="openstack/aodh-0" Dec 16 09:09:09 crc kubenswrapper[4823]: I1216 09:09:09.029442 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/34b62d72-52bc-4a7d-806c-52784476a695-internal-tls-certs\") pod \"aodh-0\" (UID: \"34b62d72-52bc-4a7d-806c-52784476a695\") " pod="openstack/aodh-0" Dec 16 09:09:09 crc kubenswrapper[4823]: I1216 09:09:09.132043 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2fxxf\" (UniqueName: 
\"kubernetes.io/projected/34b62d72-52bc-4a7d-806c-52784476a695-kube-api-access-2fxxf\") pod \"aodh-0\" (UID: \"34b62d72-52bc-4a7d-806c-52784476a695\") " pod="openstack/aodh-0" Dec 16 09:09:09 crc kubenswrapper[4823]: I1216 09:09:09.132100 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34b62d72-52bc-4a7d-806c-52784476a695-combined-ca-bundle\") pod \"aodh-0\" (UID: \"34b62d72-52bc-4a7d-806c-52784476a695\") " pod="openstack/aodh-0" Dec 16 09:09:09 crc kubenswrapper[4823]: I1216 09:09:09.132201 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34b62d72-52bc-4a7d-806c-52784476a695-config-data\") pod \"aodh-0\" (UID: \"34b62d72-52bc-4a7d-806c-52784476a695\") " pod="openstack/aodh-0" Dec 16 09:09:09 crc kubenswrapper[4823]: I1216 09:09:09.132229 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/34b62d72-52bc-4a7d-806c-52784476a695-internal-tls-certs\") pod \"aodh-0\" (UID: \"34b62d72-52bc-4a7d-806c-52784476a695\") " pod="openstack/aodh-0" Dec 16 09:09:09 crc kubenswrapper[4823]: I1216 09:09:09.132254 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34b62d72-52bc-4a7d-806c-52784476a695-scripts\") pod \"aodh-0\" (UID: \"34b62d72-52bc-4a7d-806c-52784476a695\") " pod="openstack/aodh-0" Dec 16 09:09:09 crc kubenswrapper[4823]: I1216 09:09:09.132292 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/34b62d72-52bc-4a7d-806c-52784476a695-public-tls-certs\") pod \"aodh-0\" (UID: \"34b62d72-52bc-4a7d-806c-52784476a695\") " pod="openstack/aodh-0" Dec 16 09:09:09 crc kubenswrapper[4823]: I1216 09:09:09.135911 4823 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/34b62d72-52bc-4a7d-806c-52784476a695-public-tls-certs\") pod \"aodh-0\" (UID: \"34b62d72-52bc-4a7d-806c-52784476a695\") " pod="openstack/aodh-0" Dec 16 09:09:09 crc kubenswrapper[4823]: I1216 09:09:09.136196 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34b62d72-52bc-4a7d-806c-52784476a695-scripts\") pod \"aodh-0\" (UID: \"34b62d72-52bc-4a7d-806c-52784476a695\") " pod="openstack/aodh-0" Dec 16 09:09:09 crc kubenswrapper[4823]: I1216 09:09:09.137542 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34b62d72-52bc-4a7d-806c-52784476a695-config-data\") pod \"aodh-0\" (UID: \"34b62d72-52bc-4a7d-806c-52784476a695\") " pod="openstack/aodh-0" Dec 16 09:09:09 crc kubenswrapper[4823]: I1216 09:09:09.140398 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/34b62d72-52bc-4a7d-806c-52784476a695-internal-tls-certs\") pod \"aodh-0\" (UID: \"34b62d72-52bc-4a7d-806c-52784476a695\") " pod="openstack/aodh-0" Dec 16 09:09:09 crc kubenswrapper[4823]: I1216 09:09:09.140771 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34b62d72-52bc-4a7d-806c-52784476a695-combined-ca-bundle\") pod \"aodh-0\" (UID: \"34b62d72-52bc-4a7d-806c-52784476a695\") " pod="openstack/aodh-0" Dec 16 09:09:09 crc kubenswrapper[4823]: I1216 09:09:09.151709 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fxxf\" (UniqueName: \"kubernetes.io/projected/34b62d72-52bc-4a7d-806c-52784476a695-kube-api-access-2fxxf\") pod \"aodh-0\" (UID: \"34b62d72-52bc-4a7d-806c-52784476a695\") " pod="openstack/aodh-0" Dec 16 09:09:09 crc kubenswrapper[4823]: I1216 09:09:09.227842 4823 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Dec 16 09:09:09 crc kubenswrapper[4823]: I1216 09:09:09.699627 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Dec 16 09:09:09 crc kubenswrapper[4823]: W1216 09:09:09.706194 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod34b62d72_52bc_4a7d_806c_52784476a695.slice/crio-1ae969fa531745e02c6a805f42f3e16e6ef6cb8548d55c4daca838d7619357b6 WatchSource:0}: Error finding container 1ae969fa531745e02c6a805f42f3e16e6ef6cb8548d55c4daca838d7619357b6: Status 404 returned error can't find the container with id 1ae969fa531745e02c6a805f42f3e16e6ef6cb8548d55c4daca838d7619357b6 Dec 16 09:09:09 crc kubenswrapper[4823]: I1216 09:09:09.786996 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2679fa2-b047-41db-9cd1-8d5b860402f6" path="/var/lib/kubelet/pods/e2679fa2-b047-41db-9cd1-8d5b860402f6/volumes" Dec 16 09:09:09 crc kubenswrapper[4823]: I1216 09:09:09.838734 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"34b62d72-52bc-4a7d-806c-52784476a695","Type":"ContainerStarted","Data":"1ae969fa531745e02c6a805f42f3e16e6ef6cb8548d55c4daca838d7619357b6"} Dec 16 09:09:10 crc kubenswrapper[4823]: I1216 09:09:10.853197 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"34b62d72-52bc-4a7d-806c-52784476a695","Type":"ContainerStarted","Data":"edfff901e422667feb8df9942487dc99f3781c67ee58c02cc9599524e02a462f"} Dec 16 09:09:14 crc kubenswrapper[4823]: I1216 09:09:14.771943 4823 scope.go:117] "RemoveContainer" containerID="14e51af7fb5c2d7b7fdc9e1989841225a65614d883db6f8d5aea8aeb819bd04d" Dec 16 09:09:14 crc kubenswrapper[4823]: E1216 09:09:14.773719 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 09:09:15 crc kubenswrapper[4823]: I1216 09:09:15.904617 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"34b62d72-52bc-4a7d-806c-52784476a695","Type":"ContainerStarted","Data":"69a1928930a4c71ac0b712b8fdbc30fc7cd0ea0a10daa4d5d941a2057c2caf2e"} Dec 16 09:09:17 crc kubenswrapper[4823]: I1216 09:09:17.933148 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"34b62d72-52bc-4a7d-806c-52784476a695","Type":"ContainerStarted","Data":"34e5dedd3f4eeec3f9dfff67c9bc1eb3a9095430a483155bc57be31d63b3460b"} Dec 16 09:09:18 crc kubenswrapper[4823]: I1216 09:09:18.283444 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 16 09:09:19 crc kubenswrapper[4823]: I1216 09:09:19.955650 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"34b62d72-52bc-4a7d-806c-52784476a695","Type":"ContainerStarted","Data":"8d06e3290f385b94ea1f27ba2ac87da8630c1786706af2c1115d30b9f2ec0dd7"} Dec 16 09:09:19 crc kubenswrapper[4823]: I1216 09:09:19.991826 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=2.688722355 podStartE2EDuration="11.991466937s" podCreationTimestamp="2025-12-16 09:09:08 +0000 UTC" firstStartedPulling="2025-12-16 09:09:09.713772271 +0000 UTC m=+8028.202338394" lastFinishedPulling="2025-12-16 09:09:19.016516853 +0000 UTC m=+8037.505082976" observedRunningTime="2025-12-16 09:09:19.98868836 +0000 UTC m=+8038.477254503" watchObservedRunningTime="2025-12-16 09:09:19.991466937 +0000 UTC m=+8038.480033070" Dec 16 09:09:29 crc kubenswrapper[4823]: I1216 09:09:29.773390 
4823 scope.go:117] "RemoveContainer" containerID="14e51af7fb5c2d7b7fdc9e1989841225a65614d883db6f8d5aea8aeb819bd04d" Dec 16 09:09:30 crc kubenswrapper[4823]: I1216 09:09:30.062682 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-2f37-account-create-update-27jn8"] Dec 16 09:09:30 crc kubenswrapper[4823]: I1216 09:09:30.063572 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" event={"ID":"25dec47c-3043-486c-b371-2be103c214e3","Type":"ContainerStarted","Data":"f15e2940e276ab8f86caed82382b79b3b000ee860b1e320554c03e2678c0b4b5"} Dec 16 09:09:30 crc kubenswrapper[4823]: I1216 09:09:30.074213 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-njwbc"] Dec 16 09:09:30 crc kubenswrapper[4823]: I1216 09:09:30.086218 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-2f37-account-create-update-27jn8"] Dec 16 09:09:30 crc kubenswrapper[4823]: I1216 09:09:30.104128 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-njwbc"] Dec 16 09:09:31 crc kubenswrapper[4823]: I1216 09:09:31.786140 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3f5794c-ab92-40a6-8e97-34a6cbda2f1c" path="/var/lib/kubelet/pods/b3f5794c-ab92-40a6-8e97-34a6cbda2f1c/volumes" Dec 16 09:09:31 crc kubenswrapper[4823]: I1216 09:09:31.787547 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf332295-2a27-4f19-bea3-51ca1596e5c0" path="/var/lib/kubelet/pods/bf332295-2a27-4f19-bea3-51ca1596e5c0/volumes" Dec 16 09:09:39 crc kubenswrapper[4823]: I1216 09:09:39.035887 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-2gq2s"] Dec 16 09:09:39 crc kubenswrapper[4823]: I1216 09:09:39.048755 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-2gq2s"] Dec 16 09:09:39 crc kubenswrapper[4823]: I1216 
09:09:39.786289 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dbc2db84-0d65-4fef-90a7-3051d568430b" path="/var/lib/kubelet/pods/dbc2db84-0d65-4fef-90a7-3051d568430b/volumes" Dec 16 09:09:47 crc kubenswrapper[4823]: I1216 09:09:47.299150 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-25bwx"] Dec 16 09:09:47 crc kubenswrapper[4823]: I1216 09:09:47.300842 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-25bwx" Dec 16 09:09:47 crc kubenswrapper[4823]: I1216 09:09:47.303422 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Dec 16 09:09:47 crc kubenswrapper[4823]: I1216 09:09:47.303425 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Dec 16 09:09:47 crc kubenswrapper[4823]: I1216 09:09:47.312615 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b99d99db-98d9-45ca-ab9e-f8a56b5e936b-ring-data-devices\") pod \"swift-ring-rebalance-25bwx\" (UID: \"b99d99db-98d9-45ca-ab9e-f8a56b5e936b\") " pod="openstack/swift-ring-rebalance-25bwx" Dec 16 09:09:47 crc kubenswrapper[4823]: I1216 09:09:47.312707 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b99d99db-98d9-45ca-ab9e-f8a56b5e936b-scripts\") pod \"swift-ring-rebalance-25bwx\" (UID: \"b99d99db-98d9-45ca-ab9e-f8a56b5e936b\") " pod="openstack/swift-ring-rebalance-25bwx" Dec 16 09:09:47 crc kubenswrapper[4823]: I1216 09:09:47.312747 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b99d99db-98d9-45ca-ab9e-f8a56b5e936b-etc-swift\") pod \"swift-ring-rebalance-25bwx\" (UID: 
\"b99d99db-98d9-45ca-ab9e-f8a56b5e936b\") " pod="openstack/swift-ring-rebalance-25bwx" Dec 16 09:09:47 crc kubenswrapper[4823]: I1216 09:09:47.312851 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b99d99db-98d9-45ca-ab9e-f8a56b5e936b-combined-ca-bundle\") pod \"swift-ring-rebalance-25bwx\" (UID: \"b99d99db-98d9-45ca-ab9e-f8a56b5e936b\") " pod="openstack/swift-ring-rebalance-25bwx" Dec 16 09:09:47 crc kubenswrapper[4823]: I1216 09:09:47.312880 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqddq\" (UniqueName: \"kubernetes.io/projected/b99d99db-98d9-45ca-ab9e-f8a56b5e936b-kube-api-access-bqddq\") pod \"swift-ring-rebalance-25bwx\" (UID: \"b99d99db-98d9-45ca-ab9e-f8a56b5e936b\") " pod="openstack/swift-ring-rebalance-25bwx" Dec 16 09:09:47 crc kubenswrapper[4823]: I1216 09:09:47.312902 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b99d99db-98d9-45ca-ab9e-f8a56b5e936b-dispersionconf\") pod \"swift-ring-rebalance-25bwx\" (UID: \"b99d99db-98d9-45ca-ab9e-f8a56b5e936b\") " pod="openstack/swift-ring-rebalance-25bwx" Dec 16 09:09:47 crc kubenswrapper[4823]: I1216 09:09:47.312940 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b99d99db-98d9-45ca-ab9e-f8a56b5e936b-swiftconf\") pod \"swift-ring-rebalance-25bwx\" (UID: \"b99d99db-98d9-45ca-ab9e-f8a56b5e936b\") " pod="openstack/swift-ring-rebalance-25bwx" Dec 16 09:09:47 crc kubenswrapper[4823]: I1216 09:09:47.321439 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-25bwx"] Dec 16 09:09:47 crc kubenswrapper[4823]: I1216 09:09:47.415309 4823 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b99d99db-98d9-45ca-ab9e-f8a56b5e936b-scripts\") pod \"swift-ring-rebalance-25bwx\" (UID: \"b99d99db-98d9-45ca-ab9e-f8a56b5e936b\") " pod="openstack/swift-ring-rebalance-25bwx" Dec 16 09:09:47 crc kubenswrapper[4823]: I1216 09:09:47.415376 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b99d99db-98d9-45ca-ab9e-f8a56b5e936b-etc-swift\") pod \"swift-ring-rebalance-25bwx\" (UID: \"b99d99db-98d9-45ca-ab9e-f8a56b5e936b\") " pod="openstack/swift-ring-rebalance-25bwx" Dec 16 09:09:47 crc kubenswrapper[4823]: I1216 09:09:47.415486 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b99d99db-98d9-45ca-ab9e-f8a56b5e936b-combined-ca-bundle\") pod \"swift-ring-rebalance-25bwx\" (UID: \"b99d99db-98d9-45ca-ab9e-f8a56b5e936b\") " pod="openstack/swift-ring-rebalance-25bwx" Dec 16 09:09:47 crc kubenswrapper[4823]: I1216 09:09:47.415517 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqddq\" (UniqueName: \"kubernetes.io/projected/b99d99db-98d9-45ca-ab9e-f8a56b5e936b-kube-api-access-bqddq\") pod \"swift-ring-rebalance-25bwx\" (UID: \"b99d99db-98d9-45ca-ab9e-f8a56b5e936b\") " pod="openstack/swift-ring-rebalance-25bwx" Dec 16 09:09:47 crc kubenswrapper[4823]: I1216 09:09:47.415542 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b99d99db-98d9-45ca-ab9e-f8a56b5e936b-dispersionconf\") pod \"swift-ring-rebalance-25bwx\" (UID: \"b99d99db-98d9-45ca-ab9e-f8a56b5e936b\") " pod="openstack/swift-ring-rebalance-25bwx" Dec 16 09:09:47 crc kubenswrapper[4823]: I1216 09:09:47.415579 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/b99d99db-98d9-45ca-ab9e-f8a56b5e936b-swiftconf\") pod \"swift-ring-rebalance-25bwx\" (UID: \"b99d99db-98d9-45ca-ab9e-f8a56b5e936b\") " pod="openstack/swift-ring-rebalance-25bwx" Dec 16 09:09:47 crc kubenswrapper[4823]: I1216 09:09:47.415664 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b99d99db-98d9-45ca-ab9e-f8a56b5e936b-ring-data-devices\") pod \"swift-ring-rebalance-25bwx\" (UID: \"b99d99db-98d9-45ca-ab9e-f8a56b5e936b\") " pod="openstack/swift-ring-rebalance-25bwx" Dec 16 09:09:47 crc kubenswrapper[4823]: I1216 09:09:47.415932 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b99d99db-98d9-45ca-ab9e-f8a56b5e936b-etc-swift\") pod \"swift-ring-rebalance-25bwx\" (UID: \"b99d99db-98d9-45ca-ab9e-f8a56b5e936b\") " pod="openstack/swift-ring-rebalance-25bwx" Dec 16 09:09:47 crc kubenswrapper[4823]: I1216 09:09:47.416285 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b99d99db-98d9-45ca-ab9e-f8a56b5e936b-scripts\") pod \"swift-ring-rebalance-25bwx\" (UID: \"b99d99db-98d9-45ca-ab9e-f8a56b5e936b\") " pod="openstack/swift-ring-rebalance-25bwx" Dec 16 09:09:47 crc kubenswrapper[4823]: I1216 09:09:47.416369 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b99d99db-98d9-45ca-ab9e-f8a56b5e936b-ring-data-devices\") pod \"swift-ring-rebalance-25bwx\" (UID: \"b99d99db-98d9-45ca-ab9e-f8a56b5e936b\") " pod="openstack/swift-ring-rebalance-25bwx" Dec 16 09:09:47 crc kubenswrapper[4823]: I1216 09:09:47.422919 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b99d99db-98d9-45ca-ab9e-f8a56b5e936b-dispersionconf\") pod \"swift-ring-rebalance-25bwx\" (UID: 
\"b99d99db-98d9-45ca-ab9e-f8a56b5e936b\") " pod="openstack/swift-ring-rebalance-25bwx" Dec 16 09:09:47 crc kubenswrapper[4823]: I1216 09:09:47.423456 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b99d99db-98d9-45ca-ab9e-f8a56b5e936b-swiftconf\") pod \"swift-ring-rebalance-25bwx\" (UID: \"b99d99db-98d9-45ca-ab9e-f8a56b5e936b\") " pod="openstack/swift-ring-rebalance-25bwx" Dec 16 09:09:47 crc kubenswrapper[4823]: I1216 09:09:47.423978 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b99d99db-98d9-45ca-ab9e-f8a56b5e936b-combined-ca-bundle\") pod \"swift-ring-rebalance-25bwx\" (UID: \"b99d99db-98d9-45ca-ab9e-f8a56b5e936b\") " pod="openstack/swift-ring-rebalance-25bwx" Dec 16 09:09:47 crc kubenswrapper[4823]: I1216 09:09:47.437431 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqddq\" (UniqueName: \"kubernetes.io/projected/b99d99db-98d9-45ca-ab9e-f8a56b5e936b-kube-api-access-bqddq\") pod \"swift-ring-rebalance-25bwx\" (UID: \"b99d99db-98d9-45ca-ab9e-f8a56b5e936b\") " pod="openstack/swift-ring-rebalance-25bwx" Dec 16 09:09:47 crc kubenswrapper[4823]: I1216 09:09:47.624792 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-25bwx" Dec 16 09:09:48 crc kubenswrapper[4823]: I1216 09:09:48.143814 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-25bwx"] Dec 16 09:09:48 crc kubenswrapper[4823]: I1216 09:09:48.252059 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-25bwx" event={"ID":"b99d99db-98d9-45ca-ab9e-f8a56b5e936b","Type":"ContainerStarted","Data":"2a7c6b2c65fb60cffda28c465b8307344c51fb11fe2132dccb18ad5587916299"} Dec 16 09:09:49 crc kubenswrapper[4823]: I1216 09:09:49.263004 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-25bwx" event={"ID":"b99d99db-98d9-45ca-ab9e-f8a56b5e936b","Type":"ContainerStarted","Data":"5495bb782a9171cd86d5cb16369710c1a6d2b47a6f3e5031d1a6ab602dacd3d3"} Dec 16 09:09:49 crc kubenswrapper[4823]: I1216 09:09:49.286118 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-25bwx" podStartSLOduration=2.286098932 podStartE2EDuration="2.286098932s" podCreationTimestamp="2025-12-16 09:09:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 09:09:49.279800825 +0000 UTC m=+8067.768366958" watchObservedRunningTime="2025-12-16 09:09:49.286098932 +0000 UTC m=+8067.774665055" Dec 16 09:09:51 crc kubenswrapper[4823]: I1216 09:09:51.680562 4823 scope.go:117] "RemoveContainer" containerID="4ac3dd81ea923b63025aca3825b2fb8c65b13d2c45a0c4c97d745de3e54dad1d" Dec 16 09:09:51 crc kubenswrapper[4823]: I1216 09:09:51.709007 4823 scope.go:117] "RemoveContainer" containerID="0b933de89f5d1527a81f1500b2cabbf12c3dcbf034af50a55cc3f912ac36c32a" Dec 16 09:09:51 crc kubenswrapper[4823]: I1216 09:09:51.774652 4823 scope.go:117] "RemoveContainer" containerID="e2ef74bd152894e3c34c3052ee9a50d3a4d09fc64d0cd690fa05f9c1d68b0731" Dec 16 09:09:51 crc 
kubenswrapper[4823]: I1216 09:09:51.837916 4823 scope.go:117] "RemoveContainer" containerID="be1087a76e2e8a3ff26a604d9bff13fbda716ff8f0816b5e30d225f77d136a04" Dec 16 09:09:54 crc kubenswrapper[4823]: I1216 09:09:54.316519 4823 generic.go:334] "Generic (PLEG): container finished" podID="b99d99db-98d9-45ca-ab9e-f8a56b5e936b" containerID="5495bb782a9171cd86d5cb16369710c1a6d2b47a6f3e5031d1a6ab602dacd3d3" exitCode=0 Dec 16 09:09:54 crc kubenswrapper[4823]: I1216 09:09:54.316615 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-25bwx" event={"ID":"b99d99db-98d9-45ca-ab9e-f8a56b5e936b","Type":"ContainerDied","Data":"5495bb782a9171cd86d5cb16369710c1a6d2b47a6f3e5031d1a6ab602dacd3d3"} Dec 16 09:09:55 crc kubenswrapper[4823]: I1216 09:09:55.707914 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-25bwx" Dec 16 09:09:55 crc kubenswrapper[4823]: I1216 09:09:55.721905 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b99d99db-98d9-45ca-ab9e-f8a56b5e936b-ring-data-devices\") pod \"b99d99db-98d9-45ca-ab9e-f8a56b5e936b\" (UID: \"b99d99db-98d9-45ca-ab9e-f8a56b5e936b\") " Dec 16 09:09:55 crc kubenswrapper[4823]: I1216 09:09:55.722081 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bqddq\" (UniqueName: \"kubernetes.io/projected/b99d99db-98d9-45ca-ab9e-f8a56b5e936b-kube-api-access-bqddq\") pod \"b99d99db-98d9-45ca-ab9e-f8a56b5e936b\" (UID: \"b99d99db-98d9-45ca-ab9e-f8a56b5e936b\") " Dec 16 09:09:55 crc kubenswrapper[4823]: I1216 09:09:55.722139 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b99d99db-98d9-45ca-ab9e-f8a56b5e936b-dispersionconf\") pod \"b99d99db-98d9-45ca-ab9e-f8a56b5e936b\" (UID: \"b99d99db-98d9-45ca-ab9e-f8a56b5e936b\") " Dec 16 
09:09:55 crc kubenswrapper[4823]: I1216 09:09:55.722258 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b99d99db-98d9-45ca-ab9e-f8a56b5e936b-etc-swift\") pod \"b99d99db-98d9-45ca-ab9e-f8a56b5e936b\" (UID: \"b99d99db-98d9-45ca-ab9e-f8a56b5e936b\") " Dec 16 09:09:55 crc kubenswrapper[4823]: I1216 09:09:55.722361 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b99d99db-98d9-45ca-ab9e-f8a56b5e936b-swiftconf\") pod \"b99d99db-98d9-45ca-ab9e-f8a56b5e936b\" (UID: \"b99d99db-98d9-45ca-ab9e-f8a56b5e936b\") " Dec 16 09:09:55 crc kubenswrapper[4823]: I1216 09:09:55.722414 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b99d99db-98d9-45ca-ab9e-f8a56b5e936b-scripts\") pod \"b99d99db-98d9-45ca-ab9e-f8a56b5e936b\" (UID: \"b99d99db-98d9-45ca-ab9e-f8a56b5e936b\") " Dec 16 09:09:55 crc kubenswrapper[4823]: I1216 09:09:55.722455 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b99d99db-98d9-45ca-ab9e-f8a56b5e936b-combined-ca-bundle\") pod \"b99d99db-98d9-45ca-ab9e-f8a56b5e936b\" (UID: \"b99d99db-98d9-45ca-ab9e-f8a56b5e936b\") " Dec 16 09:09:55 crc kubenswrapper[4823]: I1216 09:09:55.722520 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b99d99db-98d9-45ca-ab9e-f8a56b5e936b-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "b99d99db-98d9-45ca-ab9e-f8a56b5e936b" (UID: "b99d99db-98d9-45ca-ab9e-f8a56b5e936b"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 09:09:55 crc kubenswrapper[4823]: I1216 09:09:55.723145 4823 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b99d99db-98d9-45ca-ab9e-f8a56b5e936b-ring-data-devices\") on node \"crc\" DevicePath \"\"" Dec 16 09:09:55 crc kubenswrapper[4823]: I1216 09:09:55.723309 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b99d99db-98d9-45ca-ab9e-f8a56b5e936b-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "b99d99db-98d9-45ca-ab9e-f8a56b5e936b" (UID: "b99d99db-98d9-45ca-ab9e-f8a56b5e936b"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 09:09:55 crc kubenswrapper[4823]: I1216 09:09:55.732131 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b99d99db-98d9-45ca-ab9e-f8a56b5e936b-kube-api-access-bqddq" (OuterVolumeSpecName: "kube-api-access-bqddq") pod "b99d99db-98d9-45ca-ab9e-f8a56b5e936b" (UID: "b99d99db-98d9-45ca-ab9e-f8a56b5e936b"). InnerVolumeSpecName "kube-api-access-bqddq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:09:55 crc kubenswrapper[4823]: I1216 09:09:55.757016 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b99d99db-98d9-45ca-ab9e-f8a56b5e936b-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "b99d99db-98d9-45ca-ab9e-f8a56b5e936b" (UID: "b99d99db-98d9-45ca-ab9e-f8a56b5e936b"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:09:55 crc kubenswrapper[4823]: I1216 09:09:55.763690 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b99d99db-98d9-45ca-ab9e-f8a56b5e936b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b99d99db-98d9-45ca-ab9e-f8a56b5e936b" (UID: "b99d99db-98d9-45ca-ab9e-f8a56b5e936b"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:09:55 crc kubenswrapper[4823]: I1216 09:09:55.780266 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b99d99db-98d9-45ca-ab9e-f8a56b5e936b-scripts" (OuterVolumeSpecName: "scripts") pod "b99d99db-98d9-45ca-ab9e-f8a56b5e936b" (UID: "b99d99db-98d9-45ca-ab9e-f8a56b5e936b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 09:09:55 crc kubenswrapper[4823]: I1216 09:09:55.784100 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b99d99db-98d9-45ca-ab9e-f8a56b5e936b-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "b99d99db-98d9-45ca-ab9e-f8a56b5e936b" (UID: "b99d99db-98d9-45ca-ab9e-f8a56b5e936b"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:09:55 crc kubenswrapper[4823]: I1216 09:09:55.825089 4823 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b99d99db-98d9-45ca-ab9e-f8a56b5e936b-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 16 09:09:55 crc kubenswrapper[4823]: I1216 09:09:55.825123 4823 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b99d99db-98d9-45ca-ab9e-f8a56b5e936b-swiftconf\") on node \"crc\" DevicePath \"\"" Dec 16 09:09:55 crc kubenswrapper[4823]: I1216 09:09:55.825132 4823 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b99d99db-98d9-45ca-ab9e-f8a56b5e936b-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 09:09:55 crc kubenswrapper[4823]: I1216 09:09:55.825140 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b99d99db-98d9-45ca-ab9e-f8a56b5e936b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 
09:09:55 crc kubenswrapper[4823]: I1216 09:09:55.825151 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bqddq\" (UniqueName: \"kubernetes.io/projected/b99d99db-98d9-45ca-ab9e-f8a56b5e936b-kube-api-access-bqddq\") on node \"crc\" DevicePath \"\"" Dec 16 09:09:55 crc kubenswrapper[4823]: I1216 09:09:55.825159 4823 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b99d99db-98d9-45ca-ab9e-f8a56b5e936b-dispersionconf\") on node \"crc\" DevicePath \"\"" Dec 16 09:09:56 crc kubenswrapper[4823]: I1216 09:09:56.334347 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-25bwx" event={"ID":"b99d99db-98d9-45ca-ab9e-f8a56b5e936b","Type":"ContainerDied","Data":"2a7c6b2c65fb60cffda28c465b8307344c51fb11fe2132dccb18ad5587916299"} Dec 16 09:09:56 crc kubenswrapper[4823]: I1216 09:09:56.334390 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a7c6b2c65fb60cffda28c465b8307344c51fb11fe2132dccb18ad5587916299" Dec 16 09:09:56 crc kubenswrapper[4823]: I1216 09:09:56.334460 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-25bwx" Dec 16 09:10:39 crc kubenswrapper[4823]: I1216 09:10:39.458120 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-kpwfq"] Dec 16 09:10:39 crc kubenswrapper[4823]: E1216 09:10:39.459205 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b99d99db-98d9-45ca-ab9e-f8a56b5e936b" containerName="swift-ring-rebalance" Dec 16 09:10:39 crc kubenswrapper[4823]: I1216 09:10:39.459221 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="b99d99db-98d9-45ca-ab9e-f8a56b5e936b" containerName="swift-ring-rebalance" Dec 16 09:10:39 crc kubenswrapper[4823]: I1216 09:10:39.459422 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="b99d99db-98d9-45ca-ab9e-f8a56b5e936b" containerName="swift-ring-rebalance" Dec 16 09:10:39 crc kubenswrapper[4823]: I1216 09:10:39.460826 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kpwfq" Dec 16 09:10:39 crc kubenswrapper[4823]: I1216 09:10:39.485537 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kpwfq"] Dec 16 09:10:39 crc kubenswrapper[4823]: I1216 09:10:39.490965 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5734bd38-41f2-4294-936b-9a42a1f42b15-catalog-content\") pod \"certified-operators-kpwfq\" (UID: \"5734bd38-41f2-4294-936b-9a42a1f42b15\") " pod="openshift-marketplace/certified-operators-kpwfq" Dec 16 09:10:39 crc kubenswrapper[4823]: I1216 09:10:39.491282 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzxb5\" (UniqueName: \"kubernetes.io/projected/5734bd38-41f2-4294-936b-9a42a1f42b15-kube-api-access-gzxb5\") pod \"certified-operators-kpwfq\" (UID: 
\"5734bd38-41f2-4294-936b-9a42a1f42b15\") " pod="openshift-marketplace/certified-operators-kpwfq" Dec 16 09:10:39 crc kubenswrapper[4823]: I1216 09:10:39.491397 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5734bd38-41f2-4294-936b-9a42a1f42b15-utilities\") pod \"certified-operators-kpwfq\" (UID: \"5734bd38-41f2-4294-936b-9a42a1f42b15\") " pod="openshift-marketplace/certified-operators-kpwfq" Dec 16 09:10:39 crc kubenswrapper[4823]: I1216 09:10:39.593404 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5734bd38-41f2-4294-936b-9a42a1f42b15-catalog-content\") pod \"certified-operators-kpwfq\" (UID: \"5734bd38-41f2-4294-936b-9a42a1f42b15\") " pod="openshift-marketplace/certified-operators-kpwfq" Dec 16 09:10:39 crc kubenswrapper[4823]: I1216 09:10:39.593723 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzxb5\" (UniqueName: \"kubernetes.io/projected/5734bd38-41f2-4294-936b-9a42a1f42b15-kube-api-access-gzxb5\") pod \"certified-operators-kpwfq\" (UID: \"5734bd38-41f2-4294-936b-9a42a1f42b15\") " pod="openshift-marketplace/certified-operators-kpwfq" Dec 16 09:10:39 crc kubenswrapper[4823]: I1216 09:10:39.593761 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5734bd38-41f2-4294-936b-9a42a1f42b15-utilities\") pod \"certified-operators-kpwfq\" (UID: \"5734bd38-41f2-4294-936b-9a42a1f42b15\") " pod="openshift-marketplace/certified-operators-kpwfq" Dec 16 09:10:39 crc kubenswrapper[4823]: I1216 09:10:39.593989 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5734bd38-41f2-4294-936b-9a42a1f42b15-catalog-content\") pod \"certified-operators-kpwfq\" (UID: 
\"5734bd38-41f2-4294-936b-9a42a1f42b15\") " pod="openshift-marketplace/certified-operators-kpwfq" Dec 16 09:10:39 crc kubenswrapper[4823]: I1216 09:10:39.594200 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5734bd38-41f2-4294-936b-9a42a1f42b15-utilities\") pod \"certified-operators-kpwfq\" (UID: \"5734bd38-41f2-4294-936b-9a42a1f42b15\") " pod="openshift-marketplace/certified-operators-kpwfq" Dec 16 09:10:39 crc kubenswrapper[4823]: I1216 09:10:39.617312 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzxb5\" (UniqueName: \"kubernetes.io/projected/5734bd38-41f2-4294-936b-9a42a1f42b15-kube-api-access-gzxb5\") pod \"certified-operators-kpwfq\" (UID: \"5734bd38-41f2-4294-936b-9a42a1f42b15\") " pod="openshift-marketplace/certified-operators-kpwfq" Dec 16 09:10:39 crc kubenswrapper[4823]: I1216 09:10:39.793383 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kpwfq" Dec 16 09:10:40 crc kubenswrapper[4823]: I1216 09:10:40.080386 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-f251-account-create-update-gmqx7"] Dec 16 09:10:40 crc kubenswrapper[4823]: I1216 09:10:40.120569 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-xzsbb"] Dec 16 09:10:40 crc kubenswrapper[4823]: I1216 09:10:40.131176 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-f251-account-create-update-gmqx7"] Dec 16 09:10:40 crc kubenswrapper[4823]: I1216 09:10:40.137101 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-xzsbb"] Dec 16 09:10:40 crc kubenswrapper[4823]: I1216 09:10:40.384395 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kpwfq"] Dec 16 09:10:40 crc kubenswrapper[4823]: I1216 09:10:40.807241 4823 generic.go:334] 
"Generic (PLEG): container finished" podID="5734bd38-41f2-4294-936b-9a42a1f42b15" containerID="5fec874649ef4b83bdde07f6a8ad349a25c528884a4c3b45acd7cf88ec42393b" exitCode=0 Dec 16 09:10:40 crc kubenswrapper[4823]: I1216 09:10:40.807296 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kpwfq" event={"ID":"5734bd38-41f2-4294-936b-9a42a1f42b15","Type":"ContainerDied","Data":"5fec874649ef4b83bdde07f6a8ad349a25c528884a4c3b45acd7cf88ec42393b"} Dec 16 09:10:40 crc kubenswrapper[4823]: I1216 09:10:40.808505 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kpwfq" event={"ID":"5734bd38-41f2-4294-936b-9a42a1f42b15","Type":"ContainerStarted","Data":"2d114f068f15195d104e82659fa0009a2deb4ba3396855559a9a055c7706d147"} Dec 16 09:10:41 crc kubenswrapper[4823]: I1216 09:10:41.035214 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-m8qt2"] Dec 16 09:10:41 crc kubenswrapper[4823]: I1216 09:10:41.046159 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-mg742"] Dec 16 09:10:41 crc kubenswrapper[4823]: I1216 09:10:41.057653 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-2d63-account-create-update-xv7fc"] Dec 16 09:10:41 crc kubenswrapper[4823]: I1216 09:10:41.068113 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-mg742"] Dec 16 09:10:41 crc kubenswrapper[4823]: I1216 09:10:41.079072 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-m8qt2"] Dec 16 09:10:41 crc kubenswrapper[4823]: I1216 09:10:41.090621 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-2d63-account-create-update-xv7fc"] Dec 16 09:10:41 crc kubenswrapper[4823]: I1216 09:10:41.104243 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/nova-cell0-da3b-account-create-update-lmt62"] Dec 16 09:10:41 crc kubenswrapper[4823]: I1216 09:10:41.114401 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-da3b-account-create-update-lmt62"] Dec 16 09:10:41 crc kubenswrapper[4823]: I1216 09:10:41.800114 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c28ab01-359f-4508-b5f3-a3847760477b" path="/var/lib/kubelet/pods/0c28ab01-359f-4508-b5f3-a3847760477b/volumes" Dec 16 09:10:41 crc kubenswrapper[4823]: I1216 09:10:41.801693 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="372f1ad6-80bd-4662-bf84-5c4bcc44bf05" path="/var/lib/kubelet/pods/372f1ad6-80bd-4662-bf84-5c4bcc44bf05/volumes" Dec 16 09:10:41 crc kubenswrapper[4823]: I1216 09:10:41.802574 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43a0e7a5-3a19-4ec3-9e34-be161590ae54" path="/var/lib/kubelet/pods/43a0e7a5-3a19-4ec3-9e34-be161590ae54/volumes" Dec 16 09:10:41 crc kubenswrapper[4823]: I1216 09:10:41.804611 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4de9c6b0-50af-4139-b54c-eed47d4d804b" path="/var/lib/kubelet/pods/4de9c6b0-50af-4139-b54c-eed47d4d804b/volumes" Dec 16 09:10:41 crc kubenswrapper[4823]: I1216 09:10:41.806114 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="532985ed-811b-4627-80e7-4278a094a3df" path="/var/lib/kubelet/pods/532985ed-811b-4627-80e7-4278a094a3df/volumes" Dec 16 09:10:41 crc kubenswrapper[4823]: I1216 09:10:41.807626 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8305f2ac-41bf-438a-a905-22723822520b" path="/var/lib/kubelet/pods/8305f2ac-41bf-438a-a905-22723822520b/volumes" Dec 16 09:10:42 crc kubenswrapper[4823]: I1216 09:10:42.828205 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kpwfq" 
event={"ID":"5734bd38-41f2-4294-936b-9a42a1f42b15","Type":"ContainerStarted","Data":"9620c856f330327afafc0bb98f8b401a0781058d899170850f9ed9c04f9d6d4b"} Dec 16 09:10:45 crc kubenswrapper[4823]: I1216 09:10:45.855346 4823 generic.go:334] "Generic (PLEG): container finished" podID="5734bd38-41f2-4294-936b-9a42a1f42b15" containerID="9620c856f330327afafc0bb98f8b401a0781058d899170850f9ed9c04f9d6d4b" exitCode=0 Dec 16 09:10:45 crc kubenswrapper[4823]: I1216 09:10:45.855431 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kpwfq" event={"ID":"5734bd38-41f2-4294-936b-9a42a1f42b15","Type":"ContainerDied","Data":"9620c856f330327afafc0bb98f8b401a0781058d899170850f9ed9c04f9d6d4b"} Dec 16 09:10:47 crc kubenswrapper[4823]: I1216 09:10:47.879345 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kpwfq" event={"ID":"5734bd38-41f2-4294-936b-9a42a1f42b15","Type":"ContainerStarted","Data":"0a25f0423de67df5156010b6e06926e4f3c0ee2ab828bcfa2a75638aea48c1b9"} Dec 16 09:10:47 crc kubenswrapper[4823]: I1216 09:10:47.919456 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-kpwfq" podStartSLOduration=2.673391439 podStartE2EDuration="8.919436654s" podCreationTimestamp="2025-12-16 09:10:39 +0000 UTC" firstStartedPulling="2025-12-16 09:10:40.809557883 +0000 UTC m=+8119.298124016" lastFinishedPulling="2025-12-16 09:10:47.055603108 +0000 UTC m=+8125.544169231" observedRunningTime="2025-12-16 09:10:47.911344031 +0000 UTC m=+8126.399910154" watchObservedRunningTime="2025-12-16 09:10:47.919436654 +0000 UTC m=+8126.408002777" Dec 16 09:10:49 crc kubenswrapper[4823]: I1216 09:10:49.794089 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-kpwfq" Dec 16 09:10:49 crc kubenswrapper[4823]: I1216 09:10:49.794167 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/certified-operators-kpwfq" Dec 16 09:10:50 crc kubenswrapper[4823]: I1216 09:10:50.846049 4823 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-kpwfq" podUID="5734bd38-41f2-4294-936b-9a42a1f42b15" containerName="registry-server" probeResult="failure" output=< Dec 16 09:10:50 crc kubenswrapper[4823]: timeout: failed to connect service ":50051" within 1s Dec 16 09:10:50 crc kubenswrapper[4823]: > Dec 16 09:10:51 crc kubenswrapper[4823]: I1216 09:10:51.995363 4823 scope.go:117] "RemoveContainer" containerID="060a0b09006700be9528649380beeefb74fc9d2c640efce061eb39010f961500" Dec 16 09:10:52 crc kubenswrapper[4823]: I1216 09:10:52.025145 4823 scope.go:117] "RemoveContainer" containerID="655e9221caef9a480d953c8348a9e0a12e5b4531ae895118829d538e97651328" Dec 16 09:10:52 crc kubenswrapper[4823]: I1216 09:10:52.102914 4823 scope.go:117] "RemoveContainer" containerID="b5a273eb68382c7731e4aadb66e3a17bbaf04776d1f2ead198cb3317ce05d386" Dec 16 09:10:52 crc kubenswrapper[4823]: I1216 09:10:52.157710 4823 scope.go:117] "RemoveContainer" containerID="8274757ab3baaadb78d1a08566ebe5b72a16a191a1a65c3e49305283e97c88d9" Dec 16 09:10:52 crc kubenswrapper[4823]: I1216 09:10:52.202584 4823 scope.go:117] "RemoveContainer" containerID="a3bbf522e30ba3200c6502976e8dcf2d7d8705e3ff26e042131a180e49743735" Dec 16 09:10:52 crc kubenswrapper[4823]: I1216 09:10:52.244835 4823 scope.go:117] "RemoveContainer" containerID="08dfa15970f1413699d595c2abe355ad1397b3b689d9e7b7983f87b5510ff431" Dec 16 09:10:59 crc kubenswrapper[4823]: I1216 09:10:59.843976 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-kpwfq" Dec 16 09:10:59 crc kubenswrapper[4823]: I1216 09:10:59.896835 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-kpwfq" Dec 16 09:11:00 crc kubenswrapper[4823]: I1216 
09:11:00.103568 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kpwfq"] Dec 16 09:11:01 crc kubenswrapper[4823]: I1216 09:11:01.005296 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-kpwfq" podUID="5734bd38-41f2-4294-936b-9a42a1f42b15" containerName="registry-server" containerID="cri-o://0a25f0423de67df5156010b6e06926e4f3c0ee2ab828bcfa2a75638aea48c1b9" gracePeriod=2 Dec 16 09:11:02 crc kubenswrapper[4823]: I1216 09:11:02.026259 4823 generic.go:334] "Generic (PLEG): container finished" podID="5734bd38-41f2-4294-936b-9a42a1f42b15" containerID="0a25f0423de67df5156010b6e06926e4f3c0ee2ab828bcfa2a75638aea48c1b9" exitCode=0 Dec 16 09:11:02 crc kubenswrapper[4823]: I1216 09:11:02.026309 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kpwfq" event={"ID":"5734bd38-41f2-4294-936b-9a42a1f42b15","Type":"ContainerDied","Data":"0a25f0423de67df5156010b6e06926e4f3c0ee2ab828bcfa2a75638aea48c1b9"} Dec 16 09:11:02 crc kubenswrapper[4823]: I1216 09:11:02.193388 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kpwfq" Dec 16 09:11:02 crc kubenswrapper[4823]: I1216 09:11:02.390831 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5734bd38-41f2-4294-936b-9a42a1f42b15-catalog-content\") pod \"5734bd38-41f2-4294-936b-9a42a1f42b15\" (UID: \"5734bd38-41f2-4294-936b-9a42a1f42b15\") " Dec 16 09:11:02 crc kubenswrapper[4823]: I1216 09:11:02.391013 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5734bd38-41f2-4294-936b-9a42a1f42b15-utilities\") pod \"5734bd38-41f2-4294-936b-9a42a1f42b15\" (UID: \"5734bd38-41f2-4294-936b-9a42a1f42b15\") " Dec 16 09:11:02 crc kubenswrapper[4823]: I1216 09:11:02.391125 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gzxb5\" (UniqueName: \"kubernetes.io/projected/5734bd38-41f2-4294-936b-9a42a1f42b15-kube-api-access-gzxb5\") pod \"5734bd38-41f2-4294-936b-9a42a1f42b15\" (UID: \"5734bd38-41f2-4294-936b-9a42a1f42b15\") " Dec 16 09:11:02 crc kubenswrapper[4823]: I1216 09:11:02.392782 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5734bd38-41f2-4294-936b-9a42a1f42b15-utilities" (OuterVolumeSpecName: "utilities") pod "5734bd38-41f2-4294-936b-9a42a1f42b15" (UID: "5734bd38-41f2-4294-936b-9a42a1f42b15"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 09:11:02 crc kubenswrapper[4823]: I1216 09:11:02.396816 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5734bd38-41f2-4294-936b-9a42a1f42b15-kube-api-access-gzxb5" (OuterVolumeSpecName: "kube-api-access-gzxb5") pod "5734bd38-41f2-4294-936b-9a42a1f42b15" (UID: "5734bd38-41f2-4294-936b-9a42a1f42b15"). InnerVolumeSpecName "kube-api-access-gzxb5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:11:02 crc kubenswrapper[4823]: I1216 09:11:02.448441 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5734bd38-41f2-4294-936b-9a42a1f42b15-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5734bd38-41f2-4294-936b-9a42a1f42b15" (UID: "5734bd38-41f2-4294-936b-9a42a1f42b15"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 09:11:02 crc kubenswrapper[4823]: I1216 09:11:02.493908 4823 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5734bd38-41f2-4294-936b-9a42a1f42b15-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:02 crc kubenswrapper[4823]: I1216 09:11:02.493963 4823 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5734bd38-41f2-4294-936b-9a42a1f42b15-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:02 crc kubenswrapper[4823]: I1216 09:11:02.493978 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gzxb5\" (UniqueName: \"kubernetes.io/projected/5734bd38-41f2-4294-936b-9a42a1f42b15-kube-api-access-gzxb5\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:03 crc kubenswrapper[4823]: I1216 09:11:03.037543 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kpwfq" event={"ID":"5734bd38-41f2-4294-936b-9a42a1f42b15","Type":"ContainerDied","Data":"2d114f068f15195d104e82659fa0009a2deb4ba3396855559a9a055c7706d147"} Dec 16 09:11:03 crc kubenswrapper[4823]: I1216 09:11:03.037610 4823 scope.go:117] "RemoveContainer" containerID="0a25f0423de67df5156010b6e06926e4f3c0ee2ab828bcfa2a75638aea48c1b9" Dec 16 09:11:03 crc kubenswrapper[4823]: I1216 09:11:03.037746 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kpwfq" Dec 16 09:11:03 crc kubenswrapper[4823]: I1216 09:11:03.056953 4823 scope.go:117] "RemoveContainer" containerID="9620c856f330327afafc0bb98f8b401a0781058d899170850f9ed9c04f9d6d4b" Dec 16 09:11:03 crc kubenswrapper[4823]: I1216 09:11:03.089000 4823 scope.go:117] "RemoveContainer" containerID="5fec874649ef4b83bdde07f6a8ad349a25c528884a4c3b45acd7cf88ec42393b" Dec 16 09:11:03 crc kubenswrapper[4823]: I1216 09:11:03.136379 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kpwfq"] Dec 16 09:11:03 crc kubenswrapper[4823]: I1216 09:11:03.172015 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-kpwfq"] Dec 16 09:11:03 crc kubenswrapper[4823]: I1216 09:11:03.782855 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5734bd38-41f2-4294-936b-9a42a1f42b15" path="/var/lib/kubelet/pods/5734bd38-41f2-4294-936b-9a42a1f42b15/volumes" Dec 16 09:11:07 crc kubenswrapper[4823]: I1216 09:11:07.061197 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-hfdqc"] Dec 16 09:11:07 crc kubenswrapper[4823]: I1216 09:11:07.077642 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-hfdqc"] Dec 16 09:11:07 crc kubenswrapper[4823]: I1216 09:11:07.784633 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2c20fd5-7afe-45c9-aced-a02954774c3a" path="/var/lib/kubelet/pods/b2c20fd5-7afe-45c9-aced-a02954774c3a/volumes" Dec 16 09:11:19 crc kubenswrapper[4823]: I1216 09:11:19.188612 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-x9g4f"] Dec 16 09:11:19 crc kubenswrapper[4823]: E1216 09:11:19.189685 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5734bd38-41f2-4294-936b-9a42a1f42b15" containerName="registry-server" 
Dec 16 09:11:19 crc kubenswrapper[4823]: I1216 09:11:19.189702 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="5734bd38-41f2-4294-936b-9a42a1f42b15" containerName="registry-server" Dec 16 09:11:19 crc kubenswrapper[4823]: E1216 09:11:19.189735 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5734bd38-41f2-4294-936b-9a42a1f42b15" containerName="extract-content" Dec 16 09:11:19 crc kubenswrapper[4823]: I1216 09:11:19.189747 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="5734bd38-41f2-4294-936b-9a42a1f42b15" containerName="extract-content" Dec 16 09:11:19 crc kubenswrapper[4823]: E1216 09:11:19.189770 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5734bd38-41f2-4294-936b-9a42a1f42b15" containerName="extract-utilities" Dec 16 09:11:19 crc kubenswrapper[4823]: I1216 09:11:19.189783 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="5734bd38-41f2-4294-936b-9a42a1f42b15" containerName="extract-utilities" Dec 16 09:11:19 crc kubenswrapper[4823]: I1216 09:11:19.190101 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="5734bd38-41f2-4294-936b-9a42a1f42b15" containerName="registry-server" Dec 16 09:11:19 crc kubenswrapper[4823]: I1216 09:11:19.192013 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-x9g4f" Dec 16 09:11:19 crc kubenswrapper[4823]: I1216 09:11:19.201464 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-x9g4f"] Dec 16 09:11:19 crc kubenswrapper[4823]: I1216 09:11:19.377445 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbmzc\" (UniqueName: \"kubernetes.io/projected/2c8d8f4e-6bbc-41c5-b939-3c55ed16464b-kube-api-access-cbmzc\") pod \"community-operators-x9g4f\" (UID: \"2c8d8f4e-6bbc-41c5-b939-3c55ed16464b\") " pod="openshift-marketplace/community-operators-x9g4f" Dec 16 09:11:19 crc kubenswrapper[4823]: I1216 09:11:19.377589 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c8d8f4e-6bbc-41c5-b939-3c55ed16464b-utilities\") pod \"community-operators-x9g4f\" (UID: \"2c8d8f4e-6bbc-41c5-b939-3c55ed16464b\") " pod="openshift-marketplace/community-operators-x9g4f" Dec 16 09:11:19 crc kubenswrapper[4823]: I1216 09:11:19.377690 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c8d8f4e-6bbc-41c5-b939-3c55ed16464b-catalog-content\") pod \"community-operators-x9g4f\" (UID: \"2c8d8f4e-6bbc-41c5-b939-3c55ed16464b\") " pod="openshift-marketplace/community-operators-x9g4f" Dec 16 09:11:19 crc kubenswrapper[4823]: I1216 09:11:19.479667 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cbmzc\" (UniqueName: \"kubernetes.io/projected/2c8d8f4e-6bbc-41c5-b939-3c55ed16464b-kube-api-access-cbmzc\") pod \"community-operators-x9g4f\" (UID: \"2c8d8f4e-6bbc-41c5-b939-3c55ed16464b\") " pod="openshift-marketplace/community-operators-x9g4f" Dec 16 09:11:19 crc kubenswrapper[4823]: I1216 09:11:19.479741 4823 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c8d8f4e-6bbc-41c5-b939-3c55ed16464b-utilities\") pod \"community-operators-x9g4f\" (UID: \"2c8d8f4e-6bbc-41c5-b939-3c55ed16464b\") " pod="openshift-marketplace/community-operators-x9g4f" Dec 16 09:11:19 crc kubenswrapper[4823]: I1216 09:11:19.479781 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c8d8f4e-6bbc-41c5-b939-3c55ed16464b-catalog-content\") pod \"community-operators-x9g4f\" (UID: \"2c8d8f4e-6bbc-41c5-b939-3c55ed16464b\") " pod="openshift-marketplace/community-operators-x9g4f" Dec 16 09:11:19 crc kubenswrapper[4823]: I1216 09:11:19.480420 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c8d8f4e-6bbc-41c5-b939-3c55ed16464b-utilities\") pod \"community-operators-x9g4f\" (UID: \"2c8d8f4e-6bbc-41c5-b939-3c55ed16464b\") " pod="openshift-marketplace/community-operators-x9g4f" Dec 16 09:11:19 crc kubenswrapper[4823]: I1216 09:11:19.480448 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c8d8f4e-6bbc-41c5-b939-3c55ed16464b-catalog-content\") pod \"community-operators-x9g4f\" (UID: \"2c8d8f4e-6bbc-41c5-b939-3c55ed16464b\") " pod="openshift-marketplace/community-operators-x9g4f" Dec 16 09:11:19 crc kubenswrapper[4823]: I1216 09:11:19.497869 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbmzc\" (UniqueName: \"kubernetes.io/projected/2c8d8f4e-6bbc-41c5-b939-3c55ed16464b-kube-api-access-cbmzc\") pod \"community-operators-x9g4f\" (UID: \"2c8d8f4e-6bbc-41c5-b939-3c55ed16464b\") " pod="openshift-marketplace/community-operators-x9g4f" Dec 16 09:11:19 crc kubenswrapper[4823]: I1216 09:11:19.535920 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-x9g4f" Dec 16 09:11:20 crc kubenswrapper[4823]: I1216 09:11:20.094432 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-x9g4f"] Dec 16 09:11:20 crc kubenswrapper[4823]: I1216 09:11:20.197730 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x9g4f" event={"ID":"2c8d8f4e-6bbc-41c5-b939-3c55ed16464b","Type":"ContainerStarted","Data":"2e8ba77190cf7ea20907797392e4c4197fce0ecdb4305d2c21c1ceb43dffdd41"} Dec 16 09:11:21 crc kubenswrapper[4823]: I1216 09:11:21.214010 4823 generic.go:334] "Generic (PLEG): container finished" podID="2c8d8f4e-6bbc-41c5-b939-3c55ed16464b" containerID="29a5accd760d403e61c54a4d0e6d34dd2100ed34ef41bfae18ba097f468e768b" exitCode=0 Dec 16 09:11:21 crc kubenswrapper[4823]: I1216 09:11:21.214228 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x9g4f" event={"ID":"2c8d8f4e-6bbc-41c5-b939-3c55ed16464b","Type":"ContainerDied","Data":"29a5accd760d403e61c54a4d0e6d34dd2100ed34ef41bfae18ba097f468e768b"} Dec 16 09:11:21 crc kubenswrapper[4823]: I1216 09:11:21.216917 4823 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 16 09:11:23 crc kubenswrapper[4823]: I1216 09:11:23.234568 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x9g4f" event={"ID":"2c8d8f4e-6bbc-41c5-b939-3c55ed16464b","Type":"ContainerStarted","Data":"c308a7d76029c6c6dec8d0b1486956af1a7c10256352a1dda19d5b66d099397c"} Dec 16 09:11:26 crc kubenswrapper[4823]: I1216 09:11:26.263953 4823 generic.go:334] "Generic (PLEG): container finished" podID="2c8d8f4e-6bbc-41c5-b939-3c55ed16464b" containerID="c308a7d76029c6c6dec8d0b1486956af1a7c10256352a1dda19d5b66d099397c" exitCode=0 Dec 16 09:11:26 crc kubenswrapper[4823]: I1216 09:11:26.264101 4823 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-x9g4f" event={"ID":"2c8d8f4e-6bbc-41c5-b939-3c55ed16464b","Type":"ContainerDied","Data":"c308a7d76029c6c6dec8d0b1486956af1a7c10256352a1dda19d5b66d099397c"} Dec 16 09:11:28 crc kubenswrapper[4823]: I1216 09:11:28.048130 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-vkmfp"] Dec 16 09:11:28 crc kubenswrapper[4823]: I1216 09:11:28.060550 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-vkmfp"] Dec 16 09:11:28 crc kubenswrapper[4823]: I1216 09:11:28.292241 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x9g4f" event={"ID":"2c8d8f4e-6bbc-41c5-b939-3c55ed16464b","Type":"ContainerStarted","Data":"2d71f3fd684008b2ce901749cdc11e0ffc93a6771dcb557b736d11a2335e931c"} Dec 16 09:11:28 crc kubenswrapper[4823]: I1216 09:11:28.319993 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-x9g4f" podStartSLOduration=3.442126406 podStartE2EDuration="9.319973898s" podCreationTimestamp="2025-12-16 09:11:19 +0000 UTC" firstStartedPulling="2025-12-16 09:11:21.21658744 +0000 UTC m=+8159.705153563" lastFinishedPulling="2025-12-16 09:11:27.094434932 +0000 UTC m=+8165.583001055" observedRunningTime="2025-12-16 09:11:28.314843868 +0000 UTC m=+8166.803410001" watchObservedRunningTime="2025-12-16 09:11:28.319973898 +0000 UTC m=+8166.808540021" Dec 16 09:11:29 crc kubenswrapper[4823]: I1216 09:11:29.536857 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-x9g4f" Dec 16 09:11:29 crc kubenswrapper[4823]: I1216 09:11:29.536955 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-x9g4f" Dec 16 09:11:29 crc kubenswrapper[4823]: I1216 09:11:29.579599 4823 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-marketplace/community-operators-x9g4f" Dec 16 09:11:29 crc kubenswrapper[4823]: I1216 09:11:29.792451 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b887758-8834-4b33-8569-44799548f791" path="/var/lib/kubelet/pods/4b887758-8834-4b33-8569-44799548f791/volumes" Dec 16 09:11:30 crc kubenswrapper[4823]: I1216 09:11:30.027144 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-4d28k"] Dec 16 09:11:30 crc kubenswrapper[4823]: I1216 09:11:30.036714 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-4d28k"] Dec 16 09:11:31 crc kubenswrapper[4823]: I1216 09:11:31.785816 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc99ed3d-7cdb-4152-b23e-05096c7dd4cb" path="/var/lib/kubelet/pods/cc99ed3d-7cdb-4152-b23e-05096c7dd4cb/volumes" Dec 16 09:11:39 crc kubenswrapper[4823]: I1216 09:11:39.601309 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-x9g4f" Dec 16 09:11:39 crc kubenswrapper[4823]: I1216 09:11:39.659661 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-x9g4f"] Dec 16 09:11:40 crc kubenswrapper[4823]: I1216 09:11:40.420528 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-x9g4f" podUID="2c8d8f4e-6bbc-41c5-b939-3c55ed16464b" containerName="registry-server" containerID="cri-o://2d71f3fd684008b2ce901749cdc11e0ffc93a6771dcb557b736d11a2335e931c" gracePeriod=2 Dec 16 09:11:40 crc kubenswrapper[4823]: I1216 09:11:40.912280 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-x9g4f" Dec 16 09:11:41 crc kubenswrapper[4823]: I1216 09:11:41.057500 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c8d8f4e-6bbc-41c5-b939-3c55ed16464b-utilities\") pod \"2c8d8f4e-6bbc-41c5-b939-3c55ed16464b\" (UID: \"2c8d8f4e-6bbc-41c5-b939-3c55ed16464b\") " Dec 16 09:11:41 crc kubenswrapper[4823]: I1216 09:11:41.057902 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cbmzc\" (UniqueName: \"kubernetes.io/projected/2c8d8f4e-6bbc-41c5-b939-3c55ed16464b-kube-api-access-cbmzc\") pod \"2c8d8f4e-6bbc-41c5-b939-3c55ed16464b\" (UID: \"2c8d8f4e-6bbc-41c5-b939-3c55ed16464b\") " Dec 16 09:11:41 crc kubenswrapper[4823]: I1216 09:11:41.058275 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c8d8f4e-6bbc-41c5-b939-3c55ed16464b-catalog-content\") pod \"2c8d8f4e-6bbc-41c5-b939-3c55ed16464b\" (UID: \"2c8d8f4e-6bbc-41c5-b939-3c55ed16464b\") " Dec 16 09:11:41 crc kubenswrapper[4823]: I1216 09:11:41.058734 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c8d8f4e-6bbc-41c5-b939-3c55ed16464b-utilities" (OuterVolumeSpecName: "utilities") pod "2c8d8f4e-6bbc-41c5-b939-3c55ed16464b" (UID: "2c8d8f4e-6bbc-41c5-b939-3c55ed16464b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 09:11:41 crc kubenswrapper[4823]: I1216 09:11:41.059021 4823 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c8d8f4e-6bbc-41c5-b939-3c55ed16464b-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:41 crc kubenswrapper[4823]: I1216 09:11:41.064262 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c8d8f4e-6bbc-41c5-b939-3c55ed16464b-kube-api-access-cbmzc" (OuterVolumeSpecName: "kube-api-access-cbmzc") pod "2c8d8f4e-6bbc-41c5-b939-3c55ed16464b" (UID: "2c8d8f4e-6bbc-41c5-b939-3c55ed16464b"). InnerVolumeSpecName "kube-api-access-cbmzc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:11:41 crc kubenswrapper[4823]: I1216 09:11:41.115075 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c8d8f4e-6bbc-41c5-b939-3c55ed16464b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2c8d8f4e-6bbc-41c5-b939-3c55ed16464b" (UID: "2c8d8f4e-6bbc-41c5-b939-3c55ed16464b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 09:11:41 crc kubenswrapper[4823]: I1216 09:11:41.164609 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cbmzc\" (UniqueName: \"kubernetes.io/projected/2c8d8f4e-6bbc-41c5-b939-3c55ed16464b-kube-api-access-cbmzc\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:41 crc kubenswrapper[4823]: I1216 09:11:41.164657 4823 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c8d8f4e-6bbc-41c5-b939-3c55ed16464b-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:41 crc kubenswrapper[4823]: I1216 09:11:41.430759 4823 generic.go:334] "Generic (PLEG): container finished" podID="2c8d8f4e-6bbc-41c5-b939-3c55ed16464b" containerID="2d71f3fd684008b2ce901749cdc11e0ffc93a6771dcb557b736d11a2335e931c" exitCode=0 Dec 16 09:11:41 crc kubenswrapper[4823]: I1216 09:11:41.430833 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-x9g4f" Dec 16 09:11:41 crc kubenswrapper[4823]: I1216 09:11:41.430855 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x9g4f" event={"ID":"2c8d8f4e-6bbc-41c5-b939-3c55ed16464b","Type":"ContainerDied","Data":"2d71f3fd684008b2ce901749cdc11e0ffc93a6771dcb557b736d11a2335e931c"} Dec 16 09:11:41 crc kubenswrapper[4823]: I1216 09:11:41.431534 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x9g4f" event={"ID":"2c8d8f4e-6bbc-41c5-b939-3c55ed16464b","Type":"ContainerDied","Data":"2e8ba77190cf7ea20907797392e4c4197fce0ecdb4305d2c21c1ceb43dffdd41"} Dec 16 09:11:41 crc kubenswrapper[4823]: I1216 09:11:41.431665 4823 scope.go:117] "RemoveContainer" containerID="2d71f3fd684008b2ce901749cdc11e0ffc93a6771dcb557b736d11a2335e931c" Dec 16 09:11:41 crc kubenswrapper[4823]: I1216 09:11:41.502622 4823 scope.go:117] "RemoveContainer" 
containerID="c308a7d76029c6c6dec8d0b1486956af1a7c10256352a1dda19d5b66d099397c" Dec 16 09:11:41 crc kubenswrapper[4823]: I1216 09:11:41.508812 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-x9g4f"] Dec 16 09:11:41 crc kubenswrapper[4823]: I1216 09:11:41.520353 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-x9g4f"] Dec 16 09:11:41 crc kubenswrapper[4823]: I1216 09:11:41.526511 4823 scope.go:117] "RemoveContainer" containerID="29a5accd760d403e61c54a4d0e6d34dd2100ed34ef41bfae18ba097f468e768b" Dec 16 09:11:41 crc kubenswrapper[4823]: I1216 09:11:41.569557 4823 scope.go:117] "RemoveContainer" containerID="2d71f3fd684008b2ce901749cdc11e0ffc93a6771dcb557b736d11a2335e931c" Dec 16 09:11:41 crc kubenswrapper[4823]: E1216 09:11:41.570209 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d71f3fd684008b2ce901749cdc11e0ffc93a6771dcb557b736d11a2335e931c\": container with ID starting with 2d71f3fd684008b2ce901749cdc11e0ffc93a6771dcb557b736d11a2335e931c not found: ID does not exist" containerID="2d71f3fd684008b2ce901749cdc11e0ffc93a6771dcb557b736d11a2335e931c" Dec 16 09:11:41 crc kubenswrapper[4823]: I1216 09:11:41.570248 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d71f3fd684008b2ce901749cdc11e0ffc93a6771dcb557b736d11a2335e931c"} err="failed to get container status \"2d71f3fd684008b2ce901749cdc11e0ffc93a6771dcb557b736d11a2335e931c\": rpc error: code = NotFound desc = could not find container \"2d71f3fd684008b2ce901749cdc11e0ffc93a6771dcb557b736d11a2335e931c\": container with ID starting with 2d71f3fd684008b2ce901749cdc11e0ffc93a6771dcb557b736d11a2335e931c not found: ID does not exist" Dec 16 09:11:41 crc kubenswrapper[4823]: I1216 09:11:41.570274 4823 scope.go:117] "RemoveContainer" 
containerID="c308a7d76029c6c6dec8d0b1486956af1a7c10256352a1dda19d5b66d099397c" Dec 16 09:11:41 crc kubenswrapper[4823]: E1216 09:11:41.571905 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c308a7d76029c6c6dec8d0b1486956af1a7c10256352a1dda19d5b66d099397c\": container with ID starting with c308a7d76029c6c6dec8d0b1486956af1a7c10256352a1dda19d5b66d099397c not found: ID does not exist" containerID="c308a7d76029c6c6dec8d0b1486956af1a7c10256352a1dda19d5b66d099397c" Dec 16 09:11:41 crc kubenswrapper[4823]: I1216 09:11:41.571946 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c308a7d76029c6c6dec8d0b1486956af1a7c10256352a1dda19d5b66d099397c"} err="failed to get container status \"c308a7d76029c6c6dec8d0b1486956af1a7c10256352a1dda19d5b66d099397c\": rpc error: code = NotFound desc = could not find container \"c308a7d76029c6c6dec8d0b1486956af1a7c10256352a1dda19d5b66d099397c\": container with ID starting with c308a7d76029c6c6dec8d0b1486956af1a7c10256352a1dda19d5b66d099397c not found: ID does not exist" Dec 16 09:11:41 crc kubenswrapper[4823]: I1216 09:11:41.571972 4823 scope.go:117] "RemoveContainer" containerID="29a5accd760d403e61c54a4d0e6d34dd2100ed34ef41bfae18ba097f468e768b" Dec 16 09:11:41 crc kubenswrapper[4823]: E1216 09:11:41.572508 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29a5accd760d403e61c54a4d0e6d34dd2100ed34ef41bfae18ba097f468e768b\": container with ID starting with 29a5accd760d403e61c54a4d0e6d34dd2100ed34ef41bfae18ba097f468e768b not found: ID does not exist" containerID="29a5accd760d403e61c54a4d0e6d34dd2100ed34ef41bfae18ba097f468e768b" Dec 16 09:11:41 crc kubenswrapper[4823]: I1216 09:11:41.572634 4823 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"29a5accd760d403e61c54a4d0e6d34dd2100ed34ef41bfae18ba097f468e768b"} err="failed to get container status \"29a5accd760d403e61c54a4d0e6d34dd2100ed34ef41bfae18ba097f468e768b\": rpc error: code = NotFound desc = could not find container \"29a5accd760d403e61c54a4d0e6d34dd2100ed34ef41bfae18ba097f468e768b\": container with ID starting with 29a5accd760d403e61c54a4d0e6d34dd2100ed34ef41bfae18ba097f468e768b not found: ID does not exist" Dec 16 09:11:41 crc kubenswrapper[4823]: I1216 09:11:41.784103 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c8d8f4e-6bbc-41c5-b939-3c55ed16464b" path="/var/lib/kubelet/pods/2c8d8f4e-6bbc-41c5-b939-3c55ed16464b/volumes" Dec 16 09:11:43 crc kubenswrapper[4823]: I1216 09:11:43.338701 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Dec 16 09:11:43 crc kubenswrapper[4823]: I1216 09:11:43.339236 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="6aac7bd9-5925-4c54-b747-57320a350ab9" containerName="openstackclient" containerID="cri-o://06a0d06ef2a52866e48ef5baf829653c9eb89594b03bea4aa951dea2c39e0a1c" gracePeriod=2 Dec 16 09:11:43 crc kubenswrapper[4823]: I1216 09:11:43.356211 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Dec 16 09:11:43 crc kubenswrapper[4823]: I1216 09:11:43.381830 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 16 09:11:43 crc kubenswrapper[4823]: I1216 09:11:43.382074 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="d06b91f8-1fcd-40fe-b712-0549d99258c6" containerName="glance-log" containerID="cri-o://9a93aa1b4c75390a6ef3fa58db9fd87ea0983cf80d04ef79278da7ac5a212dbe" gracePeriod=30 Dec 16 09:11:43 crc kubenswrapper[4823]: I1216 09:11:43.382486 4823 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/glance-default-external-api-0" podUID="d06b91f8-1fcd-40fe-b712-0549d99258c6" containerName="glance-httpd" containerID="cri-o://5d6cc389cc0a251a9367e2e3b78544eb67ee1e7cee3e16ec15b4a605c6de77ee" gracePeriod=30 Dec 16 09:11:43 crc kubenswrapper[4823]: I1216 09:11:43.492340 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 16 09:11:43 crc kubenswrapper[4823]: E1216 09:11:43.616432 4823 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Dec 16 09:11:43 crc kubenswrapper[4823]: E1216 09:11:43.616490 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/bf14ab2c-212b-406f-b102-2a4b8a7a29f5-config-data podName:bf14ab2c-212b-406f-b102-2a4b8a7a29f5 nodeName:}" failed. No retries permitted until 2025-12-16 09:11:44.116472599 +0000 UTC m=+8182.605038722 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/bf14ab2c-212b-406f-b102-2a4b8a7a29f5-config-data") pod "rabbitmq-cell1-server-0" (UID: "bf14ab2c-212b-406f-b102-2a4b8a7a29f5") : configmap "rabbitmq-cell1-config-data" not found Dec 16 09:11:43 crc kubenswrapper[4823]: I1216 09:11:43.856536 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 16 09:11:43 crc kubenswrapper[4823]: I1216 09:11:43.919213 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance4d28-account-delete-cb8qx"] Dec 16 09:11:43 crc kubenswrapper[4823]: E1216 09:11:43.919811 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c8d8f4e-6bbc-41c5-b939-3c55ed16464b" containerName="extract-content" Dec 16 09:11:43 crc kubenswrapper[4823]: I1216 09:11:43.919828 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c8d8f4e-6bbc-41c5-b939-3c55ed16464b" containerName="extract-content" Dec 16 09:11:43 crc 
kubenswrapper[4823]: E1216 09:11:43.919842 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c8d8f4e-6bbc-41c5-b939-3c55ed16464b" containerName="registry-server" Dec 16 09:11:43 crc kubenswrapper[4823]: I1216 09:11:43.919851 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c8d8f4e-6bbc-41c5-b939-3c55ed16464b" containerName="registry-server" Dec 16 09:11:43 crc kubenswrapper[4823]: E1216 09:11:43.919869 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6aac7bd9-5925-4c54-b747-57320a350ab9" containerName="openstackclient" Dec 16 09:11:43 crc kubenswrapper[4823]: I1216 09:11:43.919879 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="6aac7bd9-5925-4c54-b747-57320a350ab9" containerName="openstackclient" Dec 16 09:11:43 crc kubenswrapper[4823]: E1216 09:11:43.919895 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c8d8f4e-6bbc-41c5-b939-3c55ed16464b" containerName="extract-utilities" Dec 16 09:11:43 crc kubenswrapper[4823]: I1216 09:11:43.919904 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c8d8f4e-6bbc-41c5-b939-3c55ed16464b" containerName="extract-utilities" Dec 16 09:11:43 crc kubenswrapper[4823]: I1216 09:11:43.920202 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c8d8f4e-6bbc-41c5-b939-3c55ed16464b" containerName="registry-server" Dec 16 09:11:43 crc kubenswrapper[4823]: I1216 09:11:43.920226 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="6aac7bd9-5925-4c54-b747-57320a350ab9" containerName="openstackclient" Dec 16 09:11:43 crc kubenswrapper[4823]: I1216 09:11:43.921183 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance4d28-account-delete-cb8qx" Dec 16 09:11:43 crc kubenswrapper[4823]: E1216 09:11:43.932619 4823 secret.go:188] Couldn't get secret openstack/heat-config-data: secret "heat-config-data" not found Dec 16 09:11:43 crc kubenswrapper[4823]: E1216 09:11:43.932693 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8b8d93d-24db-4382-9077-7404605c7cf1-config-data podName:f8b8d93d-24db-4382-9077-7404605c7cf1 nodeName:}" failed. No retries permitted until 2025-12-16 09:11:44.432677995 +0000 UTC m=+8182.921244118 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/f8b8d93d-24db-4382-9077-7404605c7cf1-config-data") pod "heat-api-ddfd865c7-nhsh6" (UID: "f8b8d93d-24db-4382-9077-7404605c7cf1") : secret "heat-config-data" not found Dec 16 09:11:43 crc kubenswrapper[4823]: I1216 09:11:43.953629 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement2f37-account-delete-t22qt"] Dec 16 09:11:43 crc kubenswrapper[4823]: I1216 09:11:43.955122 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement2f37-account-delete-t22qt" Dec 16 09:11:44 crc kubenswrapper[4823]: I1216 09:11:43.995087 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement2f37-account-delete-t22qt"] Dec 16 09:11:44 crc kubenswrapper[4823]: I1216 09:11:44.029075 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance4d28-account-delete-cb8qx"] Dec 16 09:11:44 crc kubenswrapper[4823]: I1216 09:11:44.032903 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wt5ts\" (UniqueName: \"kubernetes.io/projected/3be6f063-aed2-4468-9cd3-f7f03bd28211-kube-api-access-wt5ts\") pod \"placement2f37-account-delete-t22qt\" (UID: \"3be6f063-aed2-4468-9cd3-f7f03bd28211\") " pod="openstack/placement2f37-account-delete-t22qt" Dec 16 09:11:44 crc kubenswrapper[4823]: I1216 09:11:44.032945 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddblq\" (UniqueName: \"kubernetes.io/projected/5729be98-e3b4-42bd-92a6-913d63da1de3-kube-api-access-ddblq\") pod \"glance4d28-account-delete-cb8qx\" (UID: \"5729be98-e3b4-42bd-92a6-913d63da1de3\") " pod="openstack/glance4d28-account-delete-cb8qx" Dec 16 09:11:44 crc kubenswrapper[4823]: I1216 09:11:44.033080 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5729be98-e3b4-42bd-92a6-913d63da1de3-operator-scripts\") pod \"glance4d28-account-delete-cb8qx\" (UID: \"5729be98-e3b4-42bd-92a6-913d63da1de3\") " pod="openstack/glance4d28-account-delete-cb8qx" Dec 16 09:11:44 crc kubenswrapper[4823]: I1216 09:11:44.033120 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3be6f063-aed2-4468-9cd3-f7f03bd28211-operator-scripts\") pod 
\"placement2f37-account-delete-t22qt\" (UID: \"3be6f063-aed2-4468-9cd3-f7f03bd28211\") " pod="openstack/placement2f37-account-delete-t22qt" Dec 16 09:11:44 crc kubenswrapper[4823]: E1216 09:11:44.035016 4823 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Dec 16 09:11:44 crc kubenswrapper[4823]: E1216 09:11:44.035087 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7-config-data podName:cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7 nodeName:}" failed. No retries permitted until 2025-12-16 09:11:44.53506824 +0000 UTC m=+8183.023634363 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7-config-data") pod "rabbitmq-server-0" (UID: "cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7") : configmap "rabbitmq-config-data" not found Dec 16 09:11:44 crc kubenswrapper[4823]: I1216 09:11:44.073578 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heatf1cb-account-delete-2mdnl"] Dec 16 09:11:44 crc kubenswrapper[4823]: I1216 09:11:44.075496 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heatf1cb-account-delete-2mdnl" Dec 16 09:11:44 crc kubenswrapper[4823]: I1216 09:11:44.119674 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heatf1cb-account-delete-2mdnl"] Dec 16 09:11:44 crc kubenswrapper[4823]: I1216 09:11:44.135139 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d6361c12-5d54-4919-aafe-4ac9b88e8c20-operator-scripts\") pod \"heatf1cb-account-delete-2mdnl\" (UID: \"d6361c12-5d54-4919-aafe-4ac9b88e8c20\") " pod="openstack/heatf1cb-account-delete-2mdnl" Dec 16 09:11:44 crc kubenswrapper[4823]: I1216 09:11:44.135217 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wt5ts\" (UniqueName: \"kubernetes.io/projected/3be6f063-aed2-4468-9cd3-f7f03bd28211-kube-api-access-wt5ts\") pod \"placement2f37-account-delete-t22qt\" (UID: \"3be6f063-aed2-4468-9cd3-f7f03bd28211\") " pod="openstack/placement2f37-account-delete-t22qt" Dec 16 09:11:44 crc kubenswrapper[4823]: I1216 09:11:44.135247 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddblq\" (UniqueName: \"kubernetes.io/projected/5729be98-e3b4-42bd-92a6-913d63da1de3-kube-api-access-ddblq\") pod \"glance4d28-account-delete-cb8qx\" (UID: \"5729be98-e3b4-42bd-92a6-913d63da1de3\") " pod="openstack/glance4d28-account-delete-cb8qx" Dec 16 09:11:44 crc kubenswrapper[4823]: I1216 09:11:44.135295 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4d4t\" (UniqueName: \"kubernetes.io/projected/d6361c12-5d54-4919-aafe-4ac9b88e8c20-kube-api-access-m4d4t\") pod \"heatf1cb-account-delete-2mdnl\" (UID: \"d6361c12-5d54-4919-aafe-4ac9b88e8c20\") " pod="openstack/heatf1cb-account-delete-2mdnl" Dec 16 09:11:44 crc kubenswrapper[4823]: I1216 09:11:44.135398 4823 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5729be98-e3b4-42bd-92a6-913d63da1de3-operator-scripts\") pod \"glance4d28-account-delete-cb8qx\" (UID: \"5729be98-e3b4-42bd-92a6-913d63da1de3\") " pod="openstack/glance4d28-account-delete-cb8qx" Dec 16 09:11:44 crc kubenswrapper[4823]: I1216 09:11:44.135457 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3be6f063-aed2-4468-9cd3-f7f03bd28211-operator-scripts\") pod \"placement2f37-account-delete-t22qt\" (UID: \"3be6f063-aed2-4468-9cd3-f7f03bd28211\") " pod="openstack/placement2f37-account-delete-t22qt" Dec 16 09:11:44 crc kubenswrapper[4823]: I1216 09:11:44.136545 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3be6f063-aed2-4468-9cd3-f7f03bd28211-operator-scripts\") pod \"placement2f37-account-delete-t22qt\" (UID: \"3be6f063-aed2-4468-9cd3-f7f03bd28211\") " pod="openstack/placement2f37-account-delete-t22qt" Dec 16 09:11:44 crc kubenswrapper[4823]: E1216 09:11:44.137072 4823 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Dec 16 09:11:44 crc kubenswrapper[4823]: E1216 09:11:44.137125 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/bf14ab2c-212b-406f-b102-2a4b8a7a29f5-config-data podName:bf14ab2c-212b-406f-b102-2a4b8a7a29f5 nodeName:}" failed. No retries permitted until 2025-12-16 09:11:45.137108483 +0000 UTC m=+8183.625674616 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/bf14ab2c-212b-406f-b102-2a4b8a7a29f5-config-data") pod "rabbitmq-cell1-server-0" (UID: "bf14ab2c-212b-406f-b102-2a4b8a7a29f5") : configmap "rabbitmq-cell1-config-data" not found Dec 16 09:11:44 crc kubenswrapper[4823]: I1216 09:11:44.137943 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5729be98-e3b4-42bd-92a6-913d63da1de3-operator-scripts\") pod \"glance4d28-account-delete-cb8qx\" (UID: \"5729be98-e3b4-42bd-92a6-913d63da1de3\") " pod="openstack/glance4d28-account-delete-cb8qx" Dec 16 09:11:44 crc kubenswrapper[4823]: I1216 09:11:44.160197 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Dec 16 09:11:44 crc kubenswrapper[4823]: I1216 09:11:44.160516 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="64445002-15b9-4ec6-8c95-7c2bd33e0ecd" containerName="ovn-northd" containerID="cri-o://d14f2961c04bfe412ae181946bae7fa89e83576b7b38570699ef5a396fa77523" gracePeriod=30 Dec 16 09:11:44 crc kubenswrapper[4823]: I1216 09:11:44.161077 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="64445002-15b9-4ec6-8c95-7c2bd33e0ecd" containerName="openstack-network-exporter" containerID="cri-o://c7585de23c8702a670fdde1b698b632bfc5040ed1281eb2e9f42a9174e0f40ca" gracePeriod=30 Dec 16 09:11:44 crc kubenswrapper[4823]: I1216 09:11:44.213783 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wt5ts\" (UniqueName: \"kubernetes.io/projected/3be6f063-aed2-4468-9cd3-f7f03bd28211-kube-api-access-wt5ts\") pod \"placement2f37-account-delete-t22qt\" (UID: \"3be6f063-aed2-4468-9cd3-f7f03bd28211\") " pod="openstack/placement2f37-account-delete-t22qt" Dec 16 09:11:44 crc kubenswrapper[4823]: I1216 09:11:44.216250 4823 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddblq\" (UniqueName: \"kubernetes.io/projected/5729be98-e3b4-42bd-92a6-913d63da1de3-kube-api-access-ddblq\") pod \"glance4d28-account-delete-cb8qx\" (UID: \"5729be98-e3b4-42bd-92a6-913d63da1de3\") " pod="openstack/glance4d28-account-delete-cb8qx" Dec 16 09:11:44 crc kubenswrapper[4823]: I1216 09:11:44.237965 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d6361c12-5d54-4919-aafe-4ac9b88e8c20-operator-scripts\") pod \"heatf1cb-account-delete-2mdnl\" (UID: \"d6361c12-5d54-4919-aafe-4ac9b88e8c20\") " pod="openstack/heatf1cb-account-delete-2mdnl" Dec 16 09:11:44 crc kubenswrapper[4823]: I1216 09:11:44.238061 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4d4t\" (UniqueName: \"kubernetes.io/projected/d6361c12-5d54-4919-aafe-4ac9b88e8c20-kube-api-access-m4d4t\") pod \"heatf1cb-account-delete-2mdnl\" (UID: \"d6361c12-5d54-4919-aafe-4ac9b88e8c20\") " pod="openstack/heatf1cb-account-delete-2mdnl" Dec 16 09:11:44 crc kubenswrapper[4823]: I1216 09:11:44.238661 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d6361c12-5d54-4919-aafe-4ac9b88e8c20-operator-scripts\") pod \"heatf1cb-account-delete-2mdnl\" (UID: \"d6361c12-5d54-4919-aafe-4ac9b88e8c20\") " pod="openstack/heatf1cb-account-delete-2mdnl" Dec 16 09:11:44 crc kubenswrapper[4823]: I1216 09:11:44.252076 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutronc394-account-delete-lkcfh"] Dec 16 09:11:44 crc kubenswrapper[4823]: I1216 09:11:44.254707 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutronc394-account-delete-lkcfh" Dec 16 09:11:44 crc kubenswrapper[4823]: I1216 09:11:44.274578 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4d4t\" (UniqueName: \"kubernetes.io/projected/d6361c12-5d54-4919-aafe-4ac9b88e8c20-kube-api-access-m4d4t\") pod \"heatf1cb-account-delete-2mdnl\" (UID: \"d6361c12-5d54-4919-aafe-4ac9b88e8c20\") " pod="openstack/heatf1cb-account-delete-2mdnl" Dec 16 09:11:44 crc kubenswrapper[4823]: I1216 09:11:44.297262 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance4d28-account-delete-cb8qx" Dec 16 09:11:44 crc kubenswrapper[4823]: I1216 09:11:44.318475 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement2f37-account-delete-t22qt" Dec 16 09:11:44 crc kubenswrapper[4823]: I1216 09:11:44.345350 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9b75df4d-61d8-4913-bea9-018339e8e2a8-operator-scripts\") pod \"neutronc394-account-delete-lkcfh\" (UID: \"9b75df4d-61d8-4913-bea9-018339e8e2a8\") " pod="openstack/neutronc394-account-delete-lkcfh" Dec 16 09:11:44 crc kubenswrapper[4823]: I1216 09:11:44.345408 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvklf\" (UniqueName: \"kubernetes.io/projected/9b75df4d-61d8-4913-bea9-018339e8e2a8-kube-api-access-kvklf\") pod \"neutronc394-account-delete-lkcfh\" (UID: \"9b75df4d-61d8-4913-bea9-018339e8e2a8\") " pod="openstack/neutronc394-account-delete-lkcfh" Dec 16 09:11:44 crc kubenswrapper[4823]: I1216 09:11:44.353601 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutronc394-account-delete-lkcfh"] Dec 16 09:11:44 crc kubenswrapper[4823]: I1216 09:11:44.402583 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heatf1cb-account-delete-2mdnl" Dec 16 09:11:44 crc kubenswrapper[4823]: I1216 09:11:44.459248 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9b75df4d-61d8-4913-bea9-018339e8e2a8-operator-scripts\") pod \"neutronc394-account-delete-lkcfh\" (UID: \"9b75df4d-61d8-4913-bea9-018339e8e2a8\") " pod="openstack/neutronc394-account-delete-lkcfh" Dec 16 09:11:44 crc kubenswrapper[4823]: I1216 09:11:44.459643 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvklf\" (UniqueName: \"kubernetes.io/projected/9b75df4d-61d8-4913-bea9-018339e8e2a8-kube-api-access-kvklf\") pod \"neutronc394-account-delete-lkcfh\" (UID: \"9b75df4d-61d8-4913-bea9-018339e8e2a8\") " pod="openstack/neutronc394-account-delete-lkcfh" Dec 16 09:11:44 crc kubenswrapper[4823]: E1216 09:11:44.460215 4823 secret.go:188] Couldn't get secret openstack/heat-config-data: secret "heat-config-data" not found Dec 16 09:11:44 crc kubenswrapper[4823]: E1216 09:11:44.460271 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8b8d93d-24db-4382-9077-7404605c7cf1-config-data podName:f8b8d93d-24db-4382-9077-7404605c7cf1 nodeName:}" failed. No retries permitted until 2025-12-16 09:11:45.460250357 +0000 UTC m=+8183.948816470 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/f8b8d93d-24db-4382-9077-7404605c7cf1-config-data") pod "heat-api-ddfd865c7-nhsh6" (UID: "f8b8d93d-24db-4382-9077-7404605c7cf1") : secret "heat-config-data" not found Dec 16 09:11:44 crc kubenswrapper[4823]: I1216 09:11:44.467959 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9b75df4d-61d8-4913-bea9-018339e8e2a8-operator-scripts\") pod \"neutronc394-account-delete-lkcfh\" (UID: \"9b75df4d-61d8-4913-bea9-018339e8e2a8\") " pod="openstack/neutronc394-account-delete-lkcfh" Dec 16 09:11:44 crc kubenswrapper[4823]: I1216 09:11:44.520575 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-sync-f9bwg"] Dec 16 09:11:44 crc kubenswrapper[4823]: I1216 09:11:44.522004 4823 generic.go:334] "Generic (PLEG): container finished" podID="d06b91f8-1fcd-40fe-b712-0549d99258c6" containerID="9a93aa1b4c75390a6ef3fa58db9fd87ea0983cf80d04ef79278da7ac5a212dbe" exitCode=143 Dec 16 09:11:44 crc kubenswrapper[4823]: I1216 09:11:44.522077 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d06b91f8-1fcd-40fe-b712-0549d99258c6","Type":"ContainerDied","Data":"9a93aa1b4c75390a6ef3fa58db9fd87ea0983cf80d04ef79278da7ac5a212dbe"} Dec 16 09:11:44 crc kubenswrapper[4823]: I1216 09:11:44.523377 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvklf\" (UniqueName: \"kubernetes.io/projected/9b75df4d-61d8-4913-bea9-018339e8e2a8-kube-api-access-kvklf\") pod \"neutronc394-account-delete-lkcfh\" (UID: \"9b75df4d-61d8-4913-bea9-018339e8e2a8\") " pod="openstack/neutronc394-account-delete-lkcfh" Dec 16 09:11:44 crc kubenswrapper[4823]: I1216 09:11:44.537754 4823 generic.go:334] "Generic (PLEG): container finished" podID="64445002-15b9-4ec6-8c95-7c2bd33e0ecd" 
containerID="c7585de23c8702a670fdde1b698b632bfc5040ed1281eb2e9f42a9174e0f40ca" exitCode=2 Dec 16 09:11:44 crc kubenswrapper[4823]: I1216 09:11:44.537798 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"64445002-15b9-4ec6-8c95-7c2bd33e0ecd","Type":"ContainerDied","Data":"c7585de23c8702a670fdde1b698b632bfc5040ed1281eb2e9f42a9174e0f40ca"} Dec 16 09:11:44 crc kubenswrapper[4823]: I1216 09:11:44.554092 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-sync-f9bwg"] Dec 16 09:11:44 crc kubenswrapper[4823]: I1216 09:11:44.627828 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-1"] Dec 16 09:11:44 crc kubenswrapper[4823]: I1216 09:11:44.628569 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-1" podUID="6c99b5e4-de24-426d-9a97-05fdcbe37141" containerName="openstack-network-exporter" containerID="cri-o://94d634a132c1bc025be0422b3756b78f98c01350c4a640621b04c9ead2558605" gracePeriod=300 Dec 16 09:11:44 crc kubenswrapper[4823]: E1216 09:11:44.629692 4823 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Dec 16 09:11:44 crc kubenswrapper[4823]: E1216 09:11:44.629761 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7-config-data podName:cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7 nodeName:}" failed. No retries permitted until 2025-12-16 09:11:45.629744582 +0000 UTC m=+8184.118310705 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7-config-data") pod "rabbitmq-server-0" (UID: "cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7") : configmap "rabbitmq-config-data" not found Dec 16 09:11:44 crc kubenswrapper[4823]: I1216 09:11:44.673140 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 16 09:11:44 crc kubenswrapper[4823]: I1216 09:11:44.673787 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8" containerName="openstack-network-exporter" containerID="cri-o://0c5789344ac78b72b59aa20ebab8bdaa03ec7af24691842901db5f6dd86d3f14" gracePeriod=300 Dec 16 09:11:44 crc kubenswrapper[4823]: I1216 09:11:44.711115 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-2"] Dec 16 09:11:44 crc kubenswrapper[4823]: I1216 09:11:44.711588 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-2" podUID="dc75b889-6dc5-462d-a589-50f705ffd78f" containerName="openstack-network-exporter" containerID="cri-o://6ffc44659c4af61247f8ff482db564b84d825032d9d7789049d8414a5dd9a687" gracePeriod=300 Dec 16 09:11:44 crc kubenswrapper[4823]: E1216 09:11:44.767670 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d14f2961c04bfe412ae181946bae7fa89e83576b7b38570699ef5a396fa77523" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Dec 16 09:11:44 crc kubenswrapper[4823]: I1216 09:11:44.769064 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutronc394-account-delete-lkcfh" Dec 16 09:11:44 crc kubenswrapper[4823]: I1216 09:11:44.772951 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinderea34-account-delete-pjmhc"] Dec 16 09:11:44 crc kubenswrapper[4823]: I1216 09:11:44.782759 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinderea34-account-delete-pjmhc" Dec 16 09:11:44 crc kubenswrapper[4823]: I1216 09:11:44.817071 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-1"] Dec 16 09:11:44 crc kubenswrapper[4823]: I1216 09:11:44.817509 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-1" podUID="7e1d3682-8130-4fa4-aab4-ade2ac069d2e" containerName="openstack-network-exporter" containerID="cri-o://bf3c177aa7f060a204b18065f9ace154c320ecfa12044e459e50aad48defa022" gracePeriod=300 Dec 16 09:11:44 crc kubenswrapper[4823]: I1216 09:11:44.862304 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 16 09:11:44 crc kubenswrapper[4823]: I1216 09:11:44.862976 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="05dfc2e3-71af-4150-a4ca-02b5629083ae" containerName="openstack-network-exporter" containerID="cri-o://fa3277346b7569acd2a70b5d918fc5eff0d9ac222f0e8c16aa4ecaef31ed032b" gracePeriod=300 Dec 16 09:11:44 crc kubenswrapper[4823]: E1216 09:11:44.866043 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d14f2961c04bfe412ae181946bae7fa89e83576b7b38570699ef5a396fa77523" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Dec 16 09:11:44 crc kubenswrapper[4823]: I1216 09:11:44.881717 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/cinderea34-account-delete-pjmhc"] Dec 16 09:11:44 crc kubenswrapper[4823]: I1216 09:11:44.887640 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d9880fe3-977f-473b-84c9-2cb6f65d588d-operator-scripts\") pod \"cinderea34-account-delete-pjmhc\" (UID: \"d9880fe3-977f-473b-84c9-2cb6f65d588d\") " pod="openstack/cinderea34-account-delete-pjmhc" Dec 16 09:11:44 crc kubenswrapper[4823]: I1216 09:11:44.887882 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2nn9r\" (UniqueName: \"kubernetes.io/projected/d9880fe3-977f-473b-84c9-2cb6f65d588d-kube-api-access-2nn9r\") pod \"cinderea34-account-delete-pjmhc\" (UID: \"d9880fe3-977f-473b-84c9-2cb6f65d588d\") " pod="openstack/cinderea34-account-delete-pjmhc" Dec 16 09:11:44 crc kubenswrapper[4823]: E1216 09:11:44.898402 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d14f2961c04bfe412ae181946bae7fa89e83576b7b38570699ef5a396fa77523" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Dec 16 09:11:44 crc kubenswrapper[4823]: E1216 09:11:44.898467 4823 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="64445002-15b9-4ec6-8c95-7c2bd33e0ecd" containerName="ovn-northd" Dec 16 09:11:44 crc kubenswrapper[4823]: I1216 09:11:44.907085 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-2"] Dec 16 09:11:44 crc kubenswrapper[4823]: I1216 09:11:44.907928 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-2" 
podUID="6353e69a-5c31-41c9-9d05-2b958aa6a79f" containerName="openstack-network-exporter" containerID="cri-o://b324eb19678c78b1a0e6949df42fbbb0f0364093edcb56f5eafa24f8062aff0e" gracePeriod=300 Dec 16 09:11:44 crc kubenswrapper[4823]: I1216 09:11:44.942720 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican75d9-account-delete-x7xds"] Dec 16 09:11:44 crc kubenswrapper[4823]: I1216 09:11:44.944308 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican75d9-account-delete-x7xds" Dec 16 09:11:44 crc kubenswrapper[4823]: I1216 09:11:44.976075 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/novacell12d63-account-delete-8c88v"] Dec 16 09:11:44 crc kubenswrapper[4823]: I1216 09:11:44.993410 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2nps6\" (UniqueName: \"kubernetes.io/projected/1422dc66-68e5-403d-9e01-657d83772587-kube-api-access-2nps6\") pod \"barbican75d9-account-delete-x7xds\" (UID: \"1422dc66-68e5-403d-9e01-657d83772587\") " pod="openstack/barbican75d9-account-delete-x7xds" Dec 16 09:11:44 crc kubenswrapper[4823]: I1216 09:11:44.993600 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2nn9r\" (UniqueName: \"kubernetes.io/projected/d9880fe3-977f-473b-84c9-2cb6f65d588d-kube-api-access-2nn9r\") pod \"cinderea34-account-delete-pjmhc\" (UID: \"d9880fe3-977f-473b-84c9-2cb6f65d588d\") " pod="openstack/cinderea34-account-delete-pjmhc" Dec 16 09:11:45 crc kubenswrapper[4823]: I1216 09:11:44.995185 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1422dc66-68e5-403d-9e01-657d83772587-operator-scripts\") pod \"barbican75d9-account-delete-x7xds\" (UID: \"1422dc66-68e5-403d-9e01-657d83772587\") " pod="openstack/barbican75d9-account-delete-x7xds" Dec 16 09:11:45 crc 
kubenswrapper[4823]: I1216 09:11:44.995222 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d9880fe3-977f-473b-84c9-2cb6f65d588d-operator-scripts\") pod \"cinderea34-account-delete-pjmhc\" (UID: \"d9880fe3-977f-473b-84c9-2cb6f65d588d\") " pod="openstack/cinderea34-account-delete-pjmhc" Dec 16 09:11:45 crc kubenswrapper[4823]: I1216 09:11:45.001667 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d9880fe3-977f-473b-84c9-2cb6f65d588d-operator-scripts\") pod \"cinderea34-account-delete-pjmhc\" (UID: \"d9880fe3-977f-473b-84c9-2cb6f65d588d\") " pod="openstack/cinderea34-account-delete-pjmhc" Dec 16 09:11:45 crc kubenswrapper[4823]: I1216 09:11:45.019939 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican75d9-account-delete-x7xds"] Dec 16 09:11:45 crc kubenswrapper[4823]: I1216 09:11:45.020058 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/novacell12d63-account-delete-8c88v" Dec 16 09:11:45 crc kubenswrapper[4823]: I1216 09:11:45.065196 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/novacell0da3b-account-delete-2w9zh"] Dec 16 09:11:45 crc kubenswrapper[4823]: I1216 09:11:45.067525 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/novacell0da3b-account-delete-2w9zh" Dec 16 09:11:45 crc kubenswrapper[4823]: I1216 09:11:45.096958 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jshbq\" (UniqueName: \"kubernetes.io/projected/4da7ae09-cc1d-4f42-b1be-7045236d12e9-kube-api-access-jshbq\") pod \"novacell12d63-account-delete-8c88v\" (UID: \"4da7ae09-cc1d-4f42-b1be-7045236d12e9\") " pod="openstack/novacell12d63-account-delete-8c88v" Dec 16 09:11:45 crc kubenswrapper[4823]: I1216 09:11:45.097420 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2nps6\" (UniqueName: \"kubernetes.io/projected/1422dc66-68e5-403d-9e01-657d83772587-kube-api-access-2nps6\") pod \"barbican75d9-account-delete-x7xds\" (UID: \"1422dc66-68e5-403d-9e01-657d83772587\") " pod="openstack/barbican75d9-account-delete-x7xds" Dec 16 09:11:45 crc kubenswrapper[4823]: I1216 09:11:45.097490 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4da7ae09-cc1d-4f42-b1be-7045236d12e9-operator-scripts\") pod \"novacell12d63-account-delete-8c88v\" (UID: \"4da7ae09-cc1d-4f42-b1be-7045236d12e9\") " pod="openstack/novacell12d63-account-delete-8c88v" Dec 16 09:11:45 crc kubenswrapper[4823]: I1216 09:11:45.097736 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1422dc66-68e5-403d-9e01-657d83772587-operator-scripts\") pod \"barbican75d9-account-delete-x7xds\" (UID: \"1422dc66-68e5-403d-9e01-657d83772587\") " pod="openstack/barbican75d9-account-delete-x7xds" Dec 16 09:11:45 crc kubenswrapper[4823]: I1216 09:11:45.098686 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/1422dc66-68e5-403d-9e01-657d83772587-operator-scripts\") pod \"barbican75d9-account-delete-x7xds\" (UID: \"1422dc66-68e5-403d-9e01-657d83772587\") " pod="openstack/barbican75d9-account-delete-x7xds" Dec 16 09:11:45 crc kubenswrapper[4823]: I1216 09:11:45.116166 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/novacell12d63-account-delete-8c88v"] Dec 16 09:11:45 crc kubenswrapper[4823]: I1216 09:11:45.149529 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2nps6\" (UniqueName: \"kubernetes.io/projected/1422dc66-68e5-403d-9e01-657d83772587-kube-api-access-2nps6\") pod \"barbican75d9-account-delete-x7xds\" (UID: \"1422dc66-68e5-403d-9e01-657d83772587\") " pod="openstack/barbican75d9-account-delete-x7xds" Dec 16 09:11:45 crc kubenswrapper[4823]: I1216 09:11:45.150206 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5948ddcb4-f5qgv"] Dec 16 09:11:45 crc kubenswrapper[4823]: I1216 09:11:45.150481 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5948ddcb4-f5qgv" podUID="6d650b48-8848-4495-9b48-fdf7472cc19e" containerName="horizon-log" containerID="cri-o://946406cfb56bff7c3a03092d515e629d97b3a1863e6c9e9d93aa6db0d70bbf52" gracePeriod=30 Dec 16 09:11:45 crc kubenswrapper[4823]: I1216 09:11:45.150953 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5948ddcb4-f5qgv" podUID="6d650b48-8848-4495-9b48-fdf7472cc19e" containerName="horizon" containerID="cri-o://609b29305d1b8337e75912b2d68079a181e3de1bec30ac81db18f27ffacc478c" gracePeriod=30 Dec 16 09:11:45 crc kubenswrapper[4823]: I1216 09:11:45.152393 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2nn9r\" (UniqueName: \"kubernetes.io/projected/d9880fe3-977f-473b-84c9-2cb6f65d588d-kube-api-access-2nn9r\") pod \"cinderea34-account-delete-pjmhc\" (UID: 
\"d9880fe3-977f-473b-84c9-2cb6f65d588d\") " pod="openstack/cinderea34-account-delete-pjmhc" Dec 16 09:11:45 crc kubenswrapper[4823]: I1216 09:11:45.180372 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/novacell0da3b-account-delete-2w9zh"] Dec 16 09:11:45 crc kubenswrapper[4823]: I1216 09:11:45.190507 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8" containerName="ovsdbserver-sb" containerID="cri-o://81905a10f112a2c628e83243c3b5f4a8905df8cf52465b30f43adc5b1087cc6c" gracePeriod=300 Dec 16 09:11:45 crc kubenswrapper[4823]: I1216 09:11:45.199193 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-25bwx"] Dec 16 09:11:45 crc kubenswrapper[4823]: I1216 09:11:45.200391 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22z6t\" (UniqueName: \"kubernetes.io/projected/7835251f-9e66-445c-9581-0422195cdc2b-kube-api-access-22z6t\") pod \"novacell0da3b-account-delete-2w9zh\" (UID: \"7835251f-9e66-445c-9581-0422195cdc2b\") " pod="openstack/novacell0da3b-account-delete-2w9zh" Dec 16 09:11:45 crc kubenswrapper[4823]: I1216 09:11:45.200437 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7835251f-9e66-445c-9581-0422195cdc2b-operator-scripts\") pod \"novacell0da3b-account-delete-2w9zh\" (UID: \"7835251f-9e66-445c-9581-0422195cdc2b\") " pod="openstack/novacell0da3b-account-delete-2w9zh" Dec 16 09:11:45 crc kubenswrapper[4823]: I1216 09:11:45.200557 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jshbq\" (UniqueName: \"kubernetes.io/projected/4da7ae09-cc1d-4f42-b1be-7045236d12e9-kube-api-access-jshbq\") pod \"novacell12d63-account-delete-8c88v\" (UID: \"4da7ae09-cc1d-4f42-b1be-7045236d12e9\") " 
pod="openstack/novacell12d63-account-delete-8c88v" Dec 16 09:11:45 crc kubenswrapper[4823]: I1216 09:11:45.200617 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4da7ae09-cc1d-4f42-b1be-7045236d12e9-operator-scripts\") pod \"novacell12d63-account-delete-8c88v\" (UID: \"4da7ae09-cc1d-4f42-b1be-7045236d12e9\") " pod="openstack/novacell12d63-account-delete-8c88v" Dec 16 09:11:45 crc kubenswrapper[4823]: I1216 09:11:45.201388 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4da7ae09-cc1d-4f42-b1be-7045236d12e9-operator-scripts\") pod \"novacell12d63-account-delete-8c88v\" (UID: \"4da7ae09-cc1d-4f42-b1be-7045236d12e9\") " pod="openstack/novacell12d63-account-delete-8c88v" Dec 16 09:11:45 crc kubenswrapper[4823]: E1216 09:11:45.201455 4823 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Dec 16 09:11:45 crc kubenswrapper[4823]: E1216 09:11:45.201491 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/bf14ab2c-212b-406f-b102-2a4b8a7a29f5-config-data podName:bf14ab2c-212b-406f-b102-2a4b8a7a29f5 nodeName:}" failed. No retries permitted until 2025-12-16 09:11:47.201478835 +0000 UTC m=+8185.690044958 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/bf14ab2c-212b-406f-b102-2a4b8a7a29f5-config-data") pod "rabbitmq-cell1-server-0" (UID: "bf14ab2c-212b-406f-b102-2a4b8a7a29f5") : configmap "rabbitmq-cell1-config-data" not found Dec 16 09:11:45 crc kubenswrapper[4823]: I1216 09:11:45.280492 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jshbq\" (UniqueName: \"kubernetes.io/projected/4da7ae09-cc1d-4f42-b1be-7045236d12e9-kube-api-access-jshbq\") pod \"novacell12d63-account-delete-8c88v\" (UID: \"4da7ae09-cc1d-4f42-b1be-7045236d12e9\") " pod="openstack/novacell12d63-account-delete-8c88v" Dec 16 09:11:45 crc kubenswrapper[4823]: I1216 09:11:45.357618 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22z6t\" (UniqueName: \"kubernetes.io/projected/7835251f-9e66-445c-9581-0422195cdc2b-kube-api-access-22z6t\") pod \"novacell0da3b-account-delete-2w9zh\" (UID: \"7835251f-9e66-445c-9581-0422195cdc2b\") " pod="openstack/novacell0da3b-account-delete-2w9zh" Dec 16 09:11:45 crc kubenswrapper[4823]: I1216 09:11:45.357917 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7835251f-9e66-445c-9581-0422195cdc2b-operator-scripts\") pod \"novacell0da3b-account-delete-2w9zh\" (UID: \"7835251f-9e66-445c-9581-0422195cdc2b\") " pod="openstack/novacell0da3b-account-delete-2w9zh" Dec 16 09:11:45 crc kubenswrapper[4823]: I1216 09:11:45.364068 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinderea34-account-delete-pjmhc" Dec 16 09:11:45 crc kubenswrapper[4823]: I1216 09:11:45.366437 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-25bwx"] Dec 16 09:11:45 crc kubenswrapper[4823]: I1216 09:11:45.409957 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7835251f-9e66-445c-9581-0422195cdc2b-operator-scripts\") pod \"novacell0da3b-account-delete-2w9zh\" (UID: \"7835251f-9e66-445c-9581-0422195cdc2b\") " pod="openstack/novacell0da3b-account-delete-2w9zh" Dec 16 09:11:45 crc kubenswrapper[4823]: I1216 09:11:45.410730 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican75d9-account-delete-x7xds" Dec 16 09:11:45 crc kubenswrapper[4823]: I1216 09:11:45.423608 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="05dfc2e3-71af-4150-a4ca-02b5629083ae" containerName="ovsdbserver-nb" containerID="cri-o://066b7853bc0b7a71c51ec7870fb1b2dfe729cc3a82e349d2bb9dcc2c91a68ff6" gracePeriod=300 Dec 16 09:11:45 crc kubenswrapper[4823]: I1216 09:11:45.443817 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/novacell12d63-account-delete-8c88v" Dec 16 09:11:45 crc kubenswrapper[4823]: E1216 09:11:45.511573 4823 secret.go:188] Couldn't get secret openstack/heat-config-data: secret "heat-config-data" not found Dec 16 09:11:45 crc kubenswrapper[4823]: E1216 09:11:45.511862 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8b8d93d-24db-4382-9077-7404605c7cf1-config-data podName:f8b8d93d-24db-4382-9077-7404605c7cf1 nodeName:}" failed. No retries permitted until 2025-12-16 09:11:47.511843709 +0000 UTC m=+8186.000409832 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/f8b8d93d-24db-4382-9077-7404605c7cf1-config-data") pod "heat-api-ddfd865c7-nhsh6" (UID: "f8b8d93d-24db-4382-9077-7404605c7cf1") : secret "heat-config-data" not found Dec 16 09:11:45 crc kubenswrapper[4823]: I1216 09:11:45.553698 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-22z6t\" (UniqueName: \"kubernetes.io/projected/7835251f-9e66-445c-9581-0422195cdc2b-kube-api-access-22z6t\") pod \"novacell0da3b-account-delete-2w9zh\" (UID: \"7835251f-9e66-445c-9581-0422195cdc2b\") " pod="openstack/novacell0da3b-account-delete-2w9zh" Dec 16 09:11:45 crc kubenswrapper[4823]: E1216 09:11:45.630553 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="81905a10f112a2c628e83243c3b5f4a8905df8cf52465b30f43adc5b1087cc6c" cmd=["/usr/bin/pidof","ovsdb-server"] Dec 16 09:11:45 crc kubenswrapper[4823]: E1216 09:11:45.644534 4823 configmap.go:193] Couldn't get configMap openstack/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Dec 16 09:11:45 crc kubenswrapper[4823]: E1216 09:11:45.644697 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4da7ae09-cc1d-4f42-b1be-7045236d12e9-operator-scripts podName:4da7ae09-cc1d-4f42-b1be-7045236d12e9 nodeName:}" failed. No retries permitted until 2025-12-16 09:11:46.144677327 +0000 UTC m=+8184.633243450 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/4da7ae09-cc1d-4f42-b1be-7045236d12e9-operator-scripts") pod "novacell12d63-account-delete-8c88v" (UID: "4da7ae09-cc1d-4f42-b1be-7045236d12e9") : configmap "openstack-cell1-scripts" not found Dec 16 09:11:45 crc kubenswrapper[4823]: I1216 09:11:45.649063 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/novaapif251-account-delete-dhnjg"] Dec 16 09:11:45 crc kubenswrapper[4823]: I1216 09:11:45.659687 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/novaapif251-account-delete-dhnjg" Dec 16 09:11:45 crc kubenswrapper[4823]: E1216 09:11:45.716598 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="81905a10f112a2c628e83243c3b5f4a8905df8cf52465b30f43adc5b1087cc6c" cmd=["/usr/bin/pidof","ovsdb-server"] Dec 16 09:11:45 crc kubenswrapper[4823]: I1216 09:11:45.716636 4823 generic.go:334] "Generic (PLEG): container finished" podID="05dfc2e3-71af-4150-a4ca-02b5629083ae" containerID="fa3277346b7569acd2a70b5d918fc5eff0d9ac222f0e8c16aa4ecaef31ed032b" exitCode=2 Dec 16 09:11:45 crc kubenswrapper[4823]: I1216 09:11:45.716664 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"05dfc2e3-71af-4150-a4ca-02b5629083ae","Type":"ContainerDied","Data":"fa3277346b7569acd2a70b5d918fc5eff0d9ac222f0e8c16aa4ecaef31ed032b"} Dec 16 09:11:45 crc kubenswrapper[4823]: I1216 09:11:45.737553 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-df55b6677-dqvsm"] Dec 16 09:11:45 crc kubenswrapper[4823]: I1216 09:11:45.737969 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-df55b6677-dqvsm" podUID="99ce1c86-eccc-4f3c-b999-18774e823763" containerName="dnsmasq-dns" 
containerID="cri-o://ffe2b7b0714c510ef6a3fa2e5f42cfeebfd6329c0126d803409a205d66f2ddec" gracePeriod=10 Dec 16 09:11:45 crc kubenswrapper[4823]: I1216 09:11:45.749229 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxkgf\" (UniqueName: \"kubernetes.io/projected/25a0f697-45ab-48cd-b4e2-d5e8bcd3b725-kube-api-access-lxkgf\") pod \"novaapif251-account-delete-dhnjg\" (UID: \"25a0f697-45ab-48cd-b4e2-d5e8bcd3b725\") " pod="openstack/novaapif251-account-delete-dhnjg" Dec 16 09:11:45 crc kubenswrapper[4823]: I1216 09:11:45.749364 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/25a0f697-45ab-48cd-b4e2-d5e8bcd3b725-operator-scripts\") pod \"novaapif251-account-delete-dhnjg\" (UID: \"25a0f697-45ab-48cd-b4e2-d5e8bcd3b725\") " pod="openstack/novaapif251-account-delete-dhnjg" Dec 16 09:11:45 crc kubenswrapper[4823]: E1216 09:11:45.750044 4823 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Dec 16 09:11:45 crc kubenswrapper[4823]: E1216 09:11:45.750094 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7-config-data podName:cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7 nodeName:}" failed. No retries permitted until 2025-12-16 09:11:47.750079345 +0000 UTC m=+8186.238645468 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7-config-data") pod "rabbitmq-server-0" (UID: "cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7") : configmap "rabbitmq-config-data" not found Dec 16 09:11:45 crc kubenswrapper[4823]: I1216 09:11:45.788745 4823 generic.go:334] "Generic (PLEG): container finished" podID="6c99b5e4-de24-426d-9a97-05fdcbe37141" containerID="94d634a132c1bc025be0422b3756b78f98c01350c4a640621b04c9ead2558605" exitCode=2 Dec 16 09:11:45 crc kubenswrapper[4823]: I1216 09:11:45.801775 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/novacell0da3b-account-delete-2w9zh" Dec 16 09:11:45 crc kubenswrapper[4823]: E1216 09:11:45.847399 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="81905a10f112a2c628e83243c3b5f4a8905df8cf52465b30f43adc5b1087cc6c" cmd=["/usr/bin/pidof","ovsdb-server"] Dec 16 09:11:45 crc kubenswrapper[4823]: E1216 09:11:45.847468 4823 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovsdbserver-sb-0" podUID="6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8" containerName="ovsdbserver-sb" Dec 16 09:11:45 crc kubenswrapper[4823]: I1216 09:11:45.853870 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02583141-ec44-4216-b06a-43b990053509" path="/var/lib/kubelet/pods/02583141-ec44-4216-b06a-43b990053509/volumes" Dec 16 09:11:45 crc kubenswrapper[4823]: I1216 09:11:45.854586 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b99d99db-98d9-45ca-ab9e-f8a56b5e936b" path="/var/lib/kubelet/pods/b99d99db-98d9-45ca-ab9e-f8a56b5e936b/volumes" Dec 16 09:11:45 crc kubenswrapper[4823]: 
I1216 09:11:45.855188 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"6c99b5e4-de24-426d-9a97-05fdcbe37141","Type":"ContainerDied","Data":"94d634a132c1bc025be0422b3756b78f98c01350c4a640621b04c9ead2558605"} Dec 16 09:11:45 crc kubenswrapper[4823]: I1216 09:11:45.856661 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxkgf\" (UniqueName: \"kubernetes.io/projected/25a0f697-45ab-48cd-b4e2-d5e8bcd3b725-kube-api-access-lxkgf\") pod \"novaapif251-account-delete-dhnjg\" (UID: \"25a0f697-45ab-48cd-b4e2-d5e8bcd3b725\") " pod="openstack/novaapif251-account-delete-dhnjg" Dec 16 09:11:45 crc kubenswrapper[4823]: I1216 09:11:45.856779 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/25a0f697-45ab-48cd-b4e2-d5e8bcd3b725-operator-scripts\") pod \"novaapif251-account-delete-dhnjg\" (UID: \"25a0f697-45ab-48cd-b4e2-d5e8bcd3b725\") " pod="openstack/novaapif251-account-delete-dhnjg" Dec 16 09:11:45 crc kubenswrapper[4823]: I1216 09:11:45.859839 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/25a0f697-45ab-48cd-b4e2-d5e8bcd3b725-operator-scripts\") pod \"novaapif251-account-delete-dhnjg\" (UID: \"25a0f697-45ab-48cd-b4e2-d5e8bcd3b725\") " pod="openstack/novaapif251-account-delete-dhnjg" Dec 16 09:11:45 crc kubenswrapper[4823]: I1216 09:11:45.893894 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxkgf\" (UniqueName: \"kubernetes.io/projected/25a0f697-45ab-48cd-b4e2-d5e8bcd3b725-kube-api-access-lxkgf\") pod \"novaapif251-account-delete-dhnjg\" (UID: \"25a0f697-45ab-48cd-b4e2-d5e8bcd3b725\") " pod="openstack/novaapif251-account-delete-dhnjg" Dec 16 09:11:45 crc kubenswrapper[4823]: I1216 09:11:45.894491 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/novaapif251-account-delete-dhnjg"] Dec 16 09:11:45 crc kubenswrapper[4823]: I1216 09:11:45.937332 4823 generic.go:334] "Generic (PLEG): container finished" podID="dc75b889-6dc5-462d-a589-50f705ffd78f" containerID="6ffc44659c4af61247f8ff482db564b84d825032d9d7789049d8414a5dd9a687" exitCode=2 Dec 16 09:11:45 crc kubenswrapper[4823]: I1216 09:11:45.937460 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"dc75b889-6dc5-462d-a589-50f705ffd78f","Type":"ContainerDied","Data":"6ffc44659c4af61247f8ff482db564b84d825032d9d7789049d8414a5dd9a687"} Dec 16 09:11:46 crc kubenswrapper[4823]: I1216 09:11:46.003771 4823 generic.go:334] "Generic (PLEG): container finished" podID="6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8" containerID="0c5789344ac78b72b59aa20ebab8bdaa03ec7af24691842901db5f6dd86d3f14" exitCode=2 Dec 16 09:11:46 crc kubenswrapper[4823]: I1216 09:11:46.003917 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8","Type":"ContainerDied","Data":"0c5789344ac78b72b59aa20ebab8bdaa03ec7af24691842901db5f6dd86d3f14"} Dec 16 09:11:46 crc kubenswrapper[4823]: I1216 09:11:46.006601 4823 generic.go:334] "Generic (PLEG): container finished" podID="7e1d3682-8130-4fa4-aab4-ade2ac069d2e" containerID="bf3c177aa7f060a204b18065f9ace154c320ecfa12044e459e50aad48defa022" exitCode=2 Dec 16 09:11:46 crc kubenswrapper[4823]: I1216 09:11:46.006631 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"7e1d3682-8130-4fa4-aab4-ade2ac069d2e","Type":"ContainerDied","Data":"bf3c177aa7f060a204b18065f9ace154c320ecfa12044e459e50aad48defa022"} Dec 16 09:11:46 crc kubenswrapper[4823]: I1216 09:11:46.049067 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-vn5cc"] Dec 16 09:11:46 crc kubenswrapper[4823]: I1216 09:11:46.052101 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/novaapif251-account-delete-dhnjg" Dec 16 09:11:46 crc kubenswrapper[4823]: I1216 09:11:46.073814 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-vn5cc"] Dec 16 09:11:46 crc kubenswrapper[4823]: I1216 09:11:46.104295 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodhf38e-account-delete-hxrkv"] Dec 16 09:11:46 crc kubenswrapper[4823]: I1216 09:11:46.105770 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodhf38e-account-delete-hxrkv" Dec 16 09:11:46 crc kubenswrapper[4823]: I1216 09:11:46.143764 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-7454ff977b-h6fwh"] Dec 16 09:11:46 crc kubenswrapper[4823]: I1216 09:11:46.144186 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-7454ff977b-h6fwh" podUID="44c54ba6-36e8-4608-ab54-965ab4bdcef2" containerName="placement-log" containerID="cri-o://4e552140d312fcdfa52ae99bb54947c323a559bd5e4b943aed566e48f1890450" gracePeriod=30 Dec 16 09:11:46 crc kubenswrapper[4823]: I1216 09:11:46.144529 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-7454ff977b-h6fwh" podUID="44c54ba6-36e8-4608-ab54-965ab4bdcef2" containerName="placement-api" containerID="cri-o://651f28a2c721b5b4308bee72f9032a131e2c7f4a064121891960b81b54b65133" gracePeriod=30 Dec 16 09:11:46 crc kubenswrapper[4823]: I1216 09:11:46.185688 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6ecabaef-9422-4e5c-bf83-df3b523b8fa7-operator-scripts\") pod \"aodhf38e-account-delete-hxrkv\" (UID: \"6ecabaef-9422-4e5c-bf83-df3b523b8fa7\") " pod="openstack/aodhf38e-account-delete-hxrkv" Dec 16 09:11:46 crc kubenswrapper[4823]: I1216 09:11:46.185848 4823 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d568s\" (UniqueName: \"kubernetes.io/projected/6ecabaef-9422-4e5c-bf83-df3b523b8fa7-kube-api-access-d568s\") pod \"aodhf38e-account-delete-hxrkv\" (UID: \"6ecabaef-9422-4e5c-bf83-df3b523b8fa7\") " pod="openstack/aodhf38e-account-delete-hxrkv" Dec 16 09:11:46 crc kubenswrapper[4823]: E1216 09:11:46.186853 4823 configmap.go:193] Couldn't get configMap openstack/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Dec 16 09:11:46 crc kubenswrapper[4823]: E1216 09:11:46.186938 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4da7ae09-cc1d-4f42-b1be-7045236d12e9-operator-scripts podName:4da7ae09-cc1d-4f42-b1be-7045236d12e9 nodeName:}" failed. No retries permitted until 2025-12-16 09:11:47.186919547 +0000 UTC m=+8185.675485670 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/4da7ae09-cc1d-4f42-b1be-7045236d12e9-operator-scripts") pod "novacell12d63-account-delete-8c88v" (UID: "4da7ae09-cc1d-4f42-b1be-7045236d12e9") : configmap "openstack-cell1-scripts" not found Dec 16 09:11:46 crc kubenswrapper[4823]: I1216 09:11:46.231169 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodhf38e-account-delete-hxrkv"] Dec 16 09:11:46 crc kubenswrapper[4823]: I1216 09:11:46.288800 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d568s\" (UniqueName: \"kubernetes.io/projected/6ecabaef-9422-4e5c-bf83-df3b523b8fa7-kube-api-access-d568s\") pod \"aodhf38e-account-delete-hxrkv\" (UID: \"6ecabaef-9422-4e5c-bf83-df3b523b8fa7\") " pod="openstack/aodhf38e-account-delete-hxrkv" Dec 16 09:11:46 crc kubenswrapper[4823]: I1216 09:11:46.289200 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/6ecabaef-9422-4e5c-bf83-df3b523b8fa7-operator-scripts\") pod \"aodhf38e-account-delete-hxrkv\" (UID: \"6ecabaef-9422-4e5c-bf83-df3b523b8fa7\") " pod="openstack/aodhf38e-account-delete-hxrkv" Dec 16 09:11:46 crc kubenswrapper[4823]: I1216 09:11:46.289899 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6ecabaef-9422-4e5c-bf83-df3b523b8fa7-operator-scripts\") pod \"aodhf38e-account-delete-hxrkv\" (UID: \"6ecabaef-9422-4e5c-bf83-df3b523b8fa7\") " pod="openstack/aodhf38e-account-delete-hxrkv" Dec 16 09:11:46 crc kubenswrapper[4823]: I1216 09:11:46.299352 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 16 09:11:46 crc kubenswrapper[4823]: I1216 09:11:46.318922 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 16 09:11:46 crc kubenswrapper[4823]: I1216 09:11:46.319268 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="60956cfa-c484-445d-af87-52713ccf4d09" containerName="glance-log" containerID="cri-o://ae7cf328f2dddbc80841007ae8ef6edc83650ff4a0d553d7b2dca17acae597a1" gracePeriod=30 Dec 16 09:11:46 crc kubenswrapper[4823]: I1216 09:11:46.319444 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="60956cfa-c484-445d-af87-52713ccf4d09" containerName="glance-httpd" containerID="cri-o://4b43c6a9df3e6ee0304d2e089fa50a0bfce77767a4afdf5ac55c13501c52cd9d" gracePeriod=30 Dec 16 09:11:46 crc kubenswrapper[4823]: I1216 09:11:46.343302 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6ffc876c99-shbwd"] Dec 16 09:11:46 crc kubenswrapper[4823]: I1216 09:11:46.343528 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6ffc876c99-shbwd" 
podUID="83abe53b-780a-4255-b2a8-22f3480c9358" containerName="neutron-api" containerID="cri-o://bc5f650dbf19a065a416224d2c46c8451ed1939c757afbeb47b34a826f25043b" gracePeriod=30 Dec 16 09:11:46 crc kubenswrapper[4823]: I1216 09:11:46.343860 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6ffc876c99-shbwd" podUID="83abe53b-780a-4255-b2a8-22f3480c9358" containerName="neutron-httpd" containerID="cri-o://be736c54eb1998c9f331d4dd1c7970f56f4f491d0243de972bd6f9e630a78177" gracePeriod=30 Dec 16 09:11:46 crc kubenswrapper[4823]: I1216 09:11:46.369325 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-5bbf6c4b7b-7qpq6"] Dec 16 09:11:46 crc kubenswrapper[4823]: I1216 09:11:46.369645 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-5bbf6c4b7b-7qpq6" podUID="2d676c2b-8cf1-4933-8f2b-641733d096fc" containerName="proxy-httpd" containerID="cri-o://d1a4fe707f62e7d60a2461aca9f820aa5ffe503be99824ae07a55f13abe97b6a" gracePeriod=30 Dec 16 09:11:46 crc kubenswrapper[4823]: I1216 09:11:46.370122 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-5bbf6c4b7b-7qpq6" podUID="2d676c2b-8cf1-4933-8f2b-641733d096fc" containerName="proxy-server" containerID="cri-o://57d0046d03d662cbe48ec832a46a5f0d40cf1316633fc87176e963b49f32c392" gracePeriod=30 Dec 16 09:11:46 crc kubenswrapper[4823]: I1216 09:11:46.378201 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d568s\" (UniqueName: \"kubernetes.io/projected/6ecabaef-9422-4e5c-bf83-df3b523b8fa7-kube-api-access-d568s\") pod \"aodhf38e-account-delete-hxrkv\" (UID: \"6ecabaef-9422-4e5c-bf83-df3b523b8fa7\") " pod="openstack/aodhf38e-account-delete-hxrkv" Dec 16 09:11:46 crc kubenswrapper[4823]: I1216 09:11:46.379149 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" 
podUID="bf14ab2c-212b-406f-b102-2a4b8a7a29f5" containerName="rabbitmq" containerID="cri-o://aeb46928562b9e0657f49ff73daa201ffbf7d9ddbfda61724d79baa281b28aab" gracePeriod=604800 Dec 16 09:11:46 crc kubenswrapper[4823]: I1216 09:11:46.387667 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 16 09:11:46 crc kubenswrapper[4823]: I1216 09:11:46.387980 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="cffdbd32-0155-4dd0-897d-9e406fd5e2ee" containerName="cinder-scheduler" containerID="cri-o://03aaea60579a32dbd22e959a4c109e38b799c758c8b0d9ef37082c0af8297906" gracePeriod=30 Dec 16 09:11:46 crc kubenswrapper[4823]: I1216 09:11:46.388136 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="cffdbd32-0155-4dd0-897d-9e406fd5e2ee" containerName="probe" containerID="cri-o://25018dc3f33bf4fbcc5228605d9875ddf4805fa913e97070453b8841fe915d79" gracePeriod=30 Dec 16 09:11:46 crc kubenswrapper[4823]: I1216 09:11:46.408210 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-sync-hm4lm"] Dec 16 09:11:46 crc kubenswrapper[4823]: I1216 09:11:46.419754 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Dec 16 09:11:46 crc kubenswrapper[4823]: I1216 09:11:46.420132 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="34b62d72-52bc-4a7d-806c-52784476a695" containerName="aodh-api" containerID="cri-o://edfff901e422667feb8df9942487dc99f3781c67ee58c02cc9599524e02a462f" gracePeriod=30 Dec 16 09:11:46 crc kubenswrapper[4823]: I1216 09:11:46.420535 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="34b62d72-52bc-4a7d-806c-52784476a695" containerName="aodh-listener" containerID="cri-o://8d06e3290f385b94ea1f27ba2ac87da8630c1786706af2c1115d30b9f2ec0dd7" gracePeriod=30 
Dec 16 09:11:46 crc kubenswrapper[4823]: I1216 09:11:46.420583 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="34b62d72-52bc-4a7d-806c-52784476a695" containerName="aodh-notifier" containerID="cri-o://34e5dedd3f4eeec3f9dfff67c9bc1eb3a9095430a483155bc57be31d63b3460b" gracePeriod=30 Dec 16 09:11:46 crc kubenswrapper[4823]: I1216 09:11:46.420617 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="34b62d72-52bc-4a7d-806c-52784476a695" containerName="aodh-evaluator" containerID="cri-o://69a1928930a4c71ac0b712b8fdbc30fc7cd0ea0a10daa4d5d941a2057c2caf2e" gracePeriod=30 Dec 16 09:11:46 crc kubenswrapper[4823]: I1216 09:11:46.444923 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodhf38e-account-delete-hxrkv" Dec 16 09:11:46 crc kubenswrapper[4823]: I1216 09:11:46.496496 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-sync-hm4lm"] Dec 16 09:11:46 crc kubenswrapper[4823]: I1216 09:11:46.525002 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 16 09:11:46 crc kubenswrapper[4823]: I1216 09:11:46.525288 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="91f5097e-d643-4598-9d06-39f14f913291" containerName="cinder-api-log" containerID="cri-o://30fdfe487e62f583768c370c7e485f339f7b652c87130a0e97f5217998c63ec1" gracePeriod=30 Dec 16 09:11:46 crc kubenswrapper[4823]: I1216 09:11:46.525416 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="91f5097e-d643-4598-9d06-39f14f913291" containerName="cinder-api" containerID="cri-o://77f7563a34f0a287066ce2a8f04f92118c66c8f3bccb6cfd97b6587b049219b8" gracePeriod=30 Dec 16 09:11:46 crc kubenswrapper[4823]: I1216 09:11:46.541181 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/rabbitmq-server-0"] Dec 16 09:11:46 crc kubenswrapper[4823]: I1216 09:11:46.550809 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-64f85d9856-wwkd5"] Dec 16 09:11:46 crc kubenswrapper[4823]: I1216 09:11:46.551075 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-engine-64f85d9856-wwkd5" podUID="7a613891-fc01-4f69-97a8-63cccc00f4a5" containerName="heat-engine" containerID="cri-o://6325c5677c7baf607bd7e00984420b3b254a2b5f3dd777e252669862730b0092" gracePeriod=60 Dec 16 09:11:46 crc kubenswrapper[4823]: I1216 09:11:46.559116 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 16 09:11:46 crc kubenswrapper[4823]: I1216 09:11:46.559520 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="3c01b8d9-eaf3-4415-a1b7-c53cbfaa707a" containerName="nova-metadata-log" containerID="cri-o://ab5e8a8e527a7f55ae463d150528bea80ab575a8ef40cde1febdcfd7069b95a0" gracePeriod=30 Dec 16 09:11:46 crc kubenswrapper[4823]: I1216 09:11:46.559708 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="3c01b8d9-eaf3-4415-a1b7-c53cbfaa707a" containerName="nova-metadata-metadata" containerID="cri-o://7ac81f3d4f6c3b985e28f223f6d2d8ddf14deedc2445d7c0e81bfa4724f713b5" gracePeriod=30 Dec 16 09:11:46 crc kubenswrapper[4823]: I1216 09:11:46.581121 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 16 09:11:46 crc kubenswrapper[4823]: I1216 09:11:46.604277 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-ddfd865c7-nhsh6"] Dec 16 09:11:46 crc kubenswrapper[4823]: I1216 09:11:46.604567 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-api-ddfd865c7-nhsh6" podUID="f8b8d93d-24db-4382-9077-7404605c7cf1" containerName="heat-api" 
containerID="cri-o://9e972ac6360c9fcb45fd20ef40bf7e8972136fa235df75bc1dfcacbfb25e23ed" gracePeriod=60 Dec 16 09:11:46 crc kubenswrapper[4823]: I1216 09:11:46.621532 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-8589448fc-qj569"] Dec 16 09:11:46 crc kubenswrapper[4823]: I1216 09:11:46.621843 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-cfnapi-8589448fc-qj569" podUID="7f35ecc1-21e4-461e-91d3-3da96745fed6" containerName="heat-cfnapi" containerID="cri-o://44afd549f065376806e3735489d6257a4793e59063b189217a6eecd50e0f1af0" gracePeriod=60 Dec 16 09:11:46 crc kubenswrapper[4823]: I1216 09:11:46.658261 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7" containerName="rabbitmq" containerID="cri-o://49741b7980cd55e4afdfdbd68688aadb6380c0d69a35239bfa62de2454502776" gracePeriod=604800 Dec 16 09:11:46 crc kubenswrapper[4823]: I1216 09:11:46.683446 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 16 09:11:46 crc kubenswrapper[4823]: I1216 09:11:46.683736 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="2a40068b-87bc-4af6-862d-ad33696041b3" containerName="nova-api-log" containerID="cri-o://d0ed8363af4a48c1ad2fe42fbd2a98b00aaad8af5f9c4a1438b6a7c118165062" gracePeriod=30 Dec 16 09:11:46 crc kubenswrapper[4823]: I1216 09:11:46.684300 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="2a40068b-87bc-4af6-862d-ad33696041b3" containerName="nova-api-api" containerID="cri-o://cba74e4e324808c756477e9c3bf47e48dc1558ff0c47c4d8b8cd61d64d6ad973" gracePeriod=30 Dec 16 09:11:46 crc kubenswrapper[4823]: I1216 09:11:46.691565 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novacell12d63-account-delete-8c88v"] Dec 16 09:11:46 crc 
kubenswrapper[4823]: I1216 09:11:46.704461 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-865d4cf8d6-bwj5n"] Dec 16 09:11:46 crc kubenswrapper[4823]: I1216 09:11:46.704728 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-865d4cf8d6-bwj5n" podUID="7a50033a-9a6e-42e3-ac23-de2a24654b0f" containerName="barbican-keystone-listener-log" containerID="cri-o://6153e939f78f8727294cd18b744eafe21fd2960278b20c36c46e082697f211e2" gracePeriod=30 Dec 16 09:11:46 crc kubenswrapper[4823]: I1216 09:11:46.704806 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-865d4cf8d6-bwj5n" podUID="7a50033a-9a6e-42e3-ac23-de2a24654b0f" containerName="barbican-keystone-listener" containerID="cri-o://721ddb3d721e21d50c0be2952ef296f0188553dcfea31e2f3a2d25c394c2d3b6" gracePeriod=30 Dec 16 09:11:46 crc kubenswrapper[4823]: I1216 09:11:46.763883 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-d656d958d-tmzmp"] Dec 16 09:11:46 crc kubenswrapper[4823]: I1216 09:11:46.764232 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-d656d958d-tmzmp" podUID="0b8f2d6b-a3bc-4503-84f7-5d9eb1db17ec" containerName="barbican-api-log" containerID="cri-o://c4f509080a9f88ea8968e728a43f48daa0e137e74d01c40c44bdd031bebe8a40" gracePeriod=30 Dec 16 09:11:46 crc kubenswrapper[4823]: I1216 09:11:46.764902 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-d656d958d-tmzmp" podUID="0b8f2d6b-a3bc-4503-84f7-5d9eb1db17ec" containerName="barbican-api" containerID="cri-o://9a306cbeecf35df7308d1553cc064c30c8abbe4a6a369ff751b3831a552d0f27" gracePeriod=30 Dec 16 09:11:46 crc kubenswrapper[4823]: I1216 09:11:46.794164 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 16 
09:11:46 crc kubenswrapper[4823]: I1216 09:11:46.794411 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="868b7d1a-5039-4d72-9a41-d8e57b1df5d4" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://e6f71a18226db71c5f87b17ab03484664718f29d8f13ce063bf0655bdf85162f" gracePeriod=30 Dec 16 09:11:46 crc kubenswrapper[4823]: I1216 09:11:46.819450 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-fcf4dff7-84zz6"] Dec 16 09:11:46 crc kubenswrapper[4823]: I1216 09:11:46.819737 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-fcf4dff7-84zz6" podUID="341f00a5-410a-4656-876e-a6b0cfe2a4df" containerName="barbican-worker-log" containerID="cri-o://a872bdca1a55985b613b2e9d1e9a92fa37fb0ac195eba280c9f758c50de98937" gracePeriod=30 Dec 16 09:11:46 crc kubenswrapper[4823]: I1216 09:11:46.820190 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-fcf4dff7-84zz6" podUID="341f00a5-410a-4656-876e-a6b0cfe2a4df" containerName="barbican-worker" containerID="cri-o://43f2f25511680e01631c9fea0525d1784511e8f0fbc8bdc295206a3b91483591" gracePeriod=30 Dec 16 09:11:46 crc kubenswrapper[4823]: I1216 09:11:46.835608 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Dec 16 09:11:46 crc kubenswrapper[4823]: I1216 09:11:46.835967 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/alertmanager-metric-storage-0" podUID="6ebc4a0e-1b85-400b-bc10-5d216d7431fb" containerName="alertmanager" containerID="cri-o://6dcfd48a9401ea537616a16a619f9cf8397493228a632419e0b26b39624f0619" gracePeriod=120 Dec 16 09:11:46 crc kubenswrapper[4823]: I1216 09:11:46.836135 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/alertmanager-metric-storage-0" 
podUID="6ebc4a0e-1b85-400b-bc10-5d216d7431fb" containerName="config-reloader" containerID="cri-o://96fcd04f838b04b2f8450fa0fddd75849e66aca34b0de130673596b80ab9b02e" gracePeriod=120 Dec 16 09:11:46 crc kubenswrapper[4823]: I1216 09:11:46.955201 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 16 09:11:46 crc kubenswrapper[4823]: I1216 09:11:46.955940 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="f058bf18-c31d-4b48-a183-bb9ae9223fbe" containerName="prometheus" containerID="cri-o://89a2670cf718ab31b266c6424bfc66b8bfcac5fd22e626dc8ba58e5549ee3781" gracePeriod=600 Dec 16 09:11:46 crc kubenswrapper[4823]: I1216 09:11:46.956630 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="f058bf18-c31d-4b48-a183-bb9ae9223fbe" containerName="thanos-sidecar" containerID="cri-o://b67bbe0ab1f251a9ee41d54b3f9217494a588f2c6e81c715f6504ef1a69b0fb0" gracePeriod=600 Dec 16 09:11:46 crc kubenswrapper[4823]: I1216 09:11:46.956701 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="f058bf18-c31d-4b48-a183-bb9ae9223fbe" containerName="config-reloader" containerID="cri-o://be73fd31aef3c2646ca8ea8d7a1185806913357be9eccd871881753164e9eaff" gracePeriod=600 Dec 16 09:11:46 crc kubenswrapper[4823]: I1216 09:11:46.992648 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 16 09:11:46 crc kubenswrapper[4823]: I1216 09:11:46.992981 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="6f60cf52-47f0-4efd-8479-64bcc13848cf" containerName="nova-scheduler-scheduler" containerID="cri-o://417dd7302ac11767038b48587c8e4bda9ac67055e65321822c980a8f887c974d" gracePeriod=30 Dec 16 09:11:47 crc kubenswrapper[4823]: I1216 
09:11:47.049319 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 16 09:11:47 crc kubenswrapper[4823]: I1216 09:11:47.049717 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="a3bed8d9-6a2c-458b-a7d5-d6d28ac021e7" containerName="nova-cell0-conductor-conductor" containerID="cri-o://0fc4d511ab4123720484006b298f65a27f5b85e3777d75783c5ffe138cedc8aa" gracePeriod=30 Dec 16 09:11:47 crc kubenswrapper[4823]: I1216 09:11:47.079153 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 16 09:11:47 crc kubenswrapper[4823]: I1216 09:11:47.079515 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="9fd92bc3-eaf0-4217-bcac-dd8f41db9edf" containerName="nova-cell1-conductor-conductor" containerID="cri-o://f0c094f39df35eb8a22e0462a3cdfe3e11e037d6479ef37e2519d6111a7eadb9" gracePeriod=30 Dec 16 09:11:47 crc kubenswrapper[4823]: I1216 09:11:47.089075 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement2f37-account-delete-t22qt" event={"ID":"3be6f063-aed2-4468-9cd3-f7f03bd28211","Type":"ContainerStarted","Data":"8105dba2683062040a896c190887f2594a024cde0b9aa17865595c93ede931ce"} Dec 16 09:11:47 crc kubenswrapper[4823]: I1216 09:11:47.109325 4823 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="f058bf18-c31d-4b48-a183-bb9ae9223fbe" containerName="prometheus" probeResult="failure" output="Get \"https://10.217.1.137:9090/-/ready\": dial tcp 10.217.1.137:9090: connect: connection refused" Dec 16 09:11:47 crc kubenswrapper[4823]: I1216 09:11:47.109836 4823 generic.go:334] "Generic (PLEG): container finished" podID="83abe53b-780a-4255-b2a8-22f3480c9358" containerID="be736c54eb1998c9f331d4dd1c7970f56f4f491d0243de972bd6f9e630a78177" exitCode=0 Dec 16 09:11:47 crc kubenswrapper[4823]: 
I1216 09:11:47.109942 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6ffc876c99-shbwd" event={"ID":"83abe53b-780a-4255-b2a8-22f3480c9358","Type":"ContainerDied","Data":"be736c54eb1998c9f331d4dd1c7970f56f4f491d0243de972bd6f9e630a78177"} Dec 16 09:11:47 crc kubenswrapper[4823]: I1216 09:11:47.117164 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement2f37-account-delete-t22qt"] Dec 16 09:11:47 crc kubenswrapper[4823]: I1216 09:11:47.136895 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heatf1cb-account-delete-2mdnl"] Dec 16 09:11:47 crc kubenswrapper[4823]: I1216 09:11:47.170059 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-cell1-galera-0" podUID="4496b25e-2f39-453a-aa60-ffa74e9913c8" containerName="galera" containerID="cri-o://8de49ec4ecd014f0c63b6ec26daddcc666881ad384dbde0da558498c271e033a" gracePeriod=30 Dec 16 09:11:47 crc kubenswrapper[4823]: I1216 09:11:47.218724 4823 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-cell1-novncproxy-0" podUID="868b7d1a-5039-4d72-9a41-d8e57b1df5d4" containerName="nova-cell1-novncproxy-novncproxy" probeResult="failure" output="Get \"https://10.217.1.97:6080/vnc_lite.html\": dial tcp 10.217.1.97:6080: connect: connection refused" Dec 16 09:11:47 crc kubenswrapper[4823]: I1216 09:11:47.219101 4823 generic.go:334] "Generic (PLEG): container finished" podID="99ce1c86-eccc-4f3c-b999-18774e823763" containerID="ffe2b7b0714c510ef6a3fa2e5f42cfeebfd6329c0126d803409a205d66f2ddec" exitCode=0 Dec 16 09:11:47 crc kubenswrapper[4823]: I1216 09:11:47.219221 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-df55b6677-dqvsm" event={"ID":"99ce1c86-eccc-4f3c-b999-18774e823763","Type":"ContainerDied","Data":"ffe2b7b0714c510ef6a3fa2e5f42cfeebfd6329c0126d803409a205d66f2ddec"} Dec 16 09:11:47 crc kubenswrapper[4823]: I1216 09:11:47.255711 4823 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/glance4d28-account-delete-cb8qx"] Dec 16 09:11:47 crc kubenswrapper[4823]: I1216 09:11:47.281685 4823 generic.go:334] "Generic (PLEG): container finished" podID="0b8f2d6b-a3bc-4503-84f7-5d9eb1db17ec" containerID="c4f509080a9f88ea8968e728a43f48daa0e137e74d01c40c44bdd031bebe8a40" exitCode=143 Dec 16 09:11:47 crc kubenswrapper[4823]: I1216 09:11:47.281803 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-d656d958d-tmzmp" event={"ID":"0b8f2d6b-a3bc-4503-84f7-5d9eb1db17ec","Type":"ContainerDied","Data":"c4f509080a9f88ea8968e728a43f48daa0e137e74d01c40c44bdd031bebe8a40"} Dec 16 09:11:47 crc kubenswrapper[4823]: E1216 09:11:47.287549 4823 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Dec 16 09:11:47 crc kubenswrapper[4823]: E1216 09:11:47.287710 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/bf14ab2c-212b-406f-b102-2a4b8a7a29f5-config-data podName:bf14ab2c-212b-406f-b102-2a4b8a7a29f5 nodeName:}" failed. No retries permitted until 2025-12-16 09:11:51.287622827 +0000 UTC m=+8189.776188950 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/bf14ab2c-212b-406f-b102-2a4b8a7a29f5-config-data") pod "rabbitmq-cell1-server-0" (UID: "bf14ab2c-212b-406f-b102-2a4b8a7a29f5") : configmap "rabbitmq-cell1-config-data" not found Dec 16 09:11:47 crc kubenswrapper[4823]: E1216 09:11:47.288020 4823 configmap.go:193] Couldn't get configMap openstack/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Dec 16 09:11:47 crc kubenswrapper[4823]: E1216 09:11:47.288058 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4da7ae09-cc1d-4f42-b1be-7045236d12e9-operator-scripts podName:4da7ae09-cc1d-4f42-b1be-7045236d12e9 nodeName:}" failed. 
No retries permitted until 2025-12-16 09:11:49.288049961 +0000 UTC m=+8187.776616084 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/4da7ae09-cc1d-4f42-b1be-7045236d12e9-operator-scripts") pod "novacell12d63-account-delete-8c88v" (UID: "4da7ae09-cc1d-4f42-b1be-7045236d12e9") : configmap "openstack-cell1-scripts" not found Dec 16 09:11:47 crc kubenswrapper[4823]: I1216 09:11:47.299539 4823 generic.go:334] "Generic (PLEG): container finished" podID="60956cfa-c484-445d-af87-52713ccf4d09" containerID="ae7cf328f2dddbc80841007ae8ef6edc83650ff4a0d553d7b2dca17acae597a1" exitCode=143 Dec 16 09:11:47 crc kubenswrapper[4823]: I1216 09:11:47.299608 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"60956cfa-c484-445d-af87-52713ccf4d09","Type":"ContainerDied","Data":"ae7cf328f2dddbc80841007ae8ef6edc83650ff4a0d553d7b2dca17acae597a1"} Dec 16 09:11:47 crc kubenswrapper[4823]: I1216 09:11:47.338058 4823 generic.go:334] "Generic (PLEG): container finished" podID="44c54ba6-36e8-4608-ab54-965ab4bdcef2" containerID="4e552140d312fcdfa52ae99bb54947c323a559bd5e4b943aed566e48f1890450" exitCode=143 Dec 16 09:11:47 crc kubenswrapper[4823]: I1216 09:11:47.338145 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7454ff977b-h6fwh" event={"ID":"44c54ba6-36e8-4608-ab54-965ab4bdcef2","Type":"ContainerDied","Data":"4e552140d312fcdfa52ae99bb54947c323a559bd5e4b943aed566e48f1890450"} Dec 16 09:11:47 crc kubenswrapper[4823]: I1216 09:11:47.373289 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8/ovsdbserver-sb/0.log" Dec 16 09:11:47 crc kubenswrapper[4823]: I1216 09:11:47.373349 4823 generic.go:334] "Generic (PLEG): container finished" podID="6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8" containerID="81905a10f112a2c628e83243c3b5f4a8905df8cf52465b30f43adc5b1087cc6c" 
exitCode=143 Dec 16 09:11:47 crc kubenswrapper[4823]: I1216 09:11:47.373404 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8","Type":"ContainerDied","Data":"81905a10f112a2c628e83243c3b5f4a8905df8cf52465b30f43adc5b1087cc6c"} Dec 16 09:11:47 crc kubenswrapper[4823]: I1216 09:11:47.384689 4823 generic.go:334] "Generic (PLEG): container finished" podID="7a50033a-9a6e-42e3-ac23-de2a24654b0f" containerID="6153e939f78f8727294cd18b744eafe21fd2960278b20c36c46e082697f211e2" exitCode=143 Dec 16 09:11:47 crc kubenswrapper[4823]: I1216 09:11:47.384758 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-865d4cf8d6-bwj5n" event={"ID":"7a50033a-9a6e-42e3-ac23-de2a24654b0f","Type":"ContainerDied","Data":"6153e939f78f8727294cd18b744eafe21fd2960278b20c36c46e082697f211e2"} Dec 16 09:11:47 crc kubenswrapper[4823]: I1216 09:11:47.400500 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_05dfc2e3-71af-4150-a4ca-02b5629083ae/ovsdbserver-nb/0.log" Dec 16 09:11:47 crc kubenswrapper[4823]: I1216 09:11:47.400540 4823 generic.go:334] "Generic (PLEG): container finished" podID="05dfc2e3-71af-4150-a4ca-02b5629083ae" containerID="066b7853bc0b7a71c51ec7870fb1b2dfe729cc3a82e349d2bb9dcc2c91a68ff6" exitCode=143 Dec 16 09:11:47 crc kubenswrapper[4823]: I1216 09:11:47.400610 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"05dfc2e3-71af-4150-a4ca-02b5629083ae","Type":"ContainerDied","Data":"066b7853bc0b7a71c51ec7870fb1b2dfe729cc3a82e349d2bb9dcc2c91a68ff6"} Dec 16 09:11:47 crc kubenswrapper[4823]: I1216 09:11:47.437226 4823 generic.go:334] "Generic (PLEG): container finished" podID="3c01b8d9-eaf3-4415-a1b7-c53cbfaa707a" containerID="ab5e8a8e527a7f55ae463d150528bea80ab575a8ef40cde1febdcfd7069b95a0" exitCode=143 Dec 16 09:11:47 crc kubenswrapper[4823]: I1216 09:11:47.437285 4823 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3c01b8d9-eaf3-4415-a1b7-c53cbfaa707a","Type":"ContainerDied","Data":"ab5e8a8e527a7f55ae463d150528bea80ab575a8ef40cde1febdcfd7069b95a0"} Dec 16 09:11:47 crc kubenswrapper[4823]: I1216 09:11:47.448529 4823 generic.go:334] "Generic (PLEG): container finished" podID="6aac7bd9-5925-4c54-b747-57320a350ab9" containerID="06a0d06ef2a52866e48ef5baf829653c9eb89594b03bea4aa951dea2c39e0a1c" exitCode=137 Dec 16 09:11:47 crc kubenswrapper[4823]: I1216 09:11:47.456553 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heatf1cb-account-delete-2mdnl" event={"ID":"d6361c12-5d54-4919-aafe-4ac9b88e8c20","Type":"ContainerStarted","Data":"366cc0daa1c0f0f30d057976cd5ee4c1587a201ce4e55778ce68198186eb5825"} Dec 16 09:11:47 crc kubenswrapper[4823]: E1216 09:11:47.462260 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0fc4d511ab4123720484006b298f65a27f5b85e3777d75783c5ffe138cedc8aa" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 16 09:11:47 crc kubenswrapper[4823]: I1216 09:11:47.476811 4823 generic.go:334] "Generic (PLEG): container finished" podID="91f5097e-d643-4598-9d06-39f14f913291" containerID="30fdfe487e62f583768c370c7e485f339f7b652c87130a0e97f5217998c63ec1" exitCode=143 Dec 16 09:11:47 crc kubenswrapper[4823]: I1216 09:11:47.477180 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"91f5097e-d643-4598-9d06-39f14f913291","Type":"ContainerDied","Data":"30fdfe487e62f583768c370c7e485f339f7b652c87130a0e97f5217998c63ec1"} Dec 16 09:11:47 crc kubenswrapper[4823]: I1216 09:11:47.484510 4823 generic.go:334] "Generic (PLEG): container finished" podID="2a40068b-87bc-4af6-862d-ad33696041b3" containerID="d0ed8363af4a48c1ad2fe42fbd2a98b00aaad8af5f9c4a1438b6a7c118165062" 
exitCode=143 Dec 16 09:11:47 crc kubenswrapper[4823]: I1216 09:11:47.484723 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2a40068b-87bc-4af6-862d-ad33696041b3","Type":"ContainerDied","Data":"d0ed8363af4a48c1ad2fe42fbd2a98b00aaad8af5f9c4a1438b6a7c118165062"} Dec 16 09:11:47 crc kubenswrapper[4823]: I1216 09:11:47.488564 4823 generic.go:334] "Generic (PLEG): container finished" podID="6353e69a-5c31-41c9-9d05-2b958aa6a79f" containerID="b324eb19678c78b1a0e6949df42fbbb0f0364093edcb56f5eafa24f8062aff0e" exitCode=2 Dec 16 09:11:47 crc kubenswrapper[4823]: I1216 09:11:47.488774 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"6353e69a-5c31-41c9-9d05-2b958aa6a79f","Type":"ContainerDied","Data":"b324eb19678c78b1a0e6949df42fbbb0f0364093edcb56f5eafa24f8062aff0e"} Dec 16 09:11:47 crc kubenswrapper[4823]: E1216 09:11:47.491440 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0fc4d511ab4123720484006b298f65a27f5b85e3777d75783c5ffe138cedc8aa" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 16 09:11:47 crc kubenswrapper[4823]: E1216 09:11:47.497186 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0fc4d511ab4123720484006b298f65a27f5b85e3777d75783c5ffe138cedc8aa" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 16 09:11:47 crc kubenswrapper[4823]: E1216 09:11:47.497260 4823 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="a3bed8d9-6a2c-458b-a7d5-d6d28ac021e7" 
containerName="nova-cell0-conductor-conductor" Dec 16 09:11:47 crc kubenswrapper[4823]: E1216 09:11:47.610810 4823 secret.go:188] Couldn't get secret openstack/heat-config-data: secret "heat-config-data" not found Dec 16 09:11:47 crc kubenswrapper[4823]: E1216 09:11:47.611129 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8b8d93d-24db-4382-9077-7404605c7cf1-config-data podName:f8b8d93d-24db-4382-9077-7404605c7cf1 nodeName:}" failed. No retries permitted until 2025-12-16 09:11:51.611104601 +0000 UTC m=+8190.099670724 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/f8b8d93d-24db-4382-9077-7404605c7cf1-config-data") pod "heat-api-ddfd865c7-nhsh6" (UID: "f8b8d93d-24db-4382-9077-7404605c7cf1") : secret "heat-config-data" not found Dec 16 09:11:47 crc kubenswrapper[4823]: E1216 09:11:47.682138 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 066b7853bc0b7a71c51ec7870fb1b2dfe729cc3a82e349d2bb9dcc2c91a68ff6 is running failed: container process not found" containerID="066b7853bc0b7a71c51ec7870fb1b2dfe729cc3a82e349d2bb9dcc2c91a68ff6" cmd=["/usr/bin/pidof","ovsdb-server"] Dec 16 09:11:47 crc kubenswrapper[4823]: E1216 09:11:47.685260 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 066b7853bc0b7a71c51ec7870fb1b2dfe729cc3a82e349d2bb9dcc2c91a68ff6 is running failed: container process not found" containerID="066b7853bc0b7a71c51ec7870fb1b2dfe729cc3a82e349d2bb9dcc2c91a68ff6" cmd=["/usr/bin/pidof","ovsdb-server"] Dec 16 09:11:47 crc kubenswrapper[4823]: E1216 09:11:47.688977 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 
066b7853bc0b7a71c51ec7870fb1b2dfe729cc3a82e349d2bb9dcc2c91a68ff6 is running failed: container process not found" containerID="066b7853bc0b7a71c51ec7870fb1b2dfe729cc3a82e349d2bb9dcc2c91a68ff6" cmd=["/usr/bin/pidof","ovsdb-server"] Dec 16 09:11:47 crc kubenswrapper[4823]: E1216 09:11:47.689200 4823 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 066b7853bc0b7a71c51ec7870fb1b2dfe729cc3a82e349d2bb9dcc2c91a68ff6 is running failed: container process not found" probeType="Readiness" pod="openstack/ovsdbserver-nb-0" podUID="05dfc2e3-71af-4150-a4ca-02b5629083ae" containerName="ovsdbserver-nb" Dec 16 09:11:47 crc kubenswrapper[4823]: I1216 09:11:47.740330 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutronc394-account-delete-lkcfh"] Dec 16 09:11:47 crc kubenswrapper[4823]: E1216 09:11:47.816142 4823 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Dec 16 09:11:47 crc kubenswrapper[4823]: E1216 09:11:47.816204 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7-config-data podName:cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7 nodeName:}" failed. No retries permitted until 2025-12-16 09:11:51.81618734 +0000 UTC m=+8190.304753463 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7-config-data") pod "rabbitmq-server-0" (UID: "cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7") : configmap "rabbitmq-config-data" not found Dec 16 09:11:47 crc kubenswrapper[4823]: I1216 09:11:47.828464 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e34f974-7d80-436a-b8e6-68faa3b7db70" path="/var/lib/kubelet/pods/5e34f974-7d80-436a-b8e6-68faa3b7db70/volumes" Dec 16 09:11:47 crc kubenswrapper[4823]: I1216 09:11:47.829499 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d686a815-6ed0-4dbc-bccb-eae76386b548" path="/var/lib/kubelet/pods/d686a815-6ed0-4dbc-bccb-eae76386b548/volumes" Dec 16 09:11:47 crc kubenswrapper[4823]: I1216 09:11:47.830191 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinderea34-account-delete-pjmhc"] Dec 16 09:11:47 crc kubenswrapper[4823]: I1216 09:11:47.871773 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novacell12d63-account-delete-8c88v"] Dec 16 09:11:47 crc kubenswrapper[4823]: I1216 09:11:47.920160 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/novacell0da3b-account-delete-2w9zh"] Dec 16 09:11:48 crc kubenswrapper[4823]: I1216 09:11:48.101926 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 16 09:11:48 crc kubenswrapper[4823]: I1216 09:11:48.123017 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6aac7bd9-5925-4c54-b747-57320a350ab9-openstack-config-secret\") pod \"6aac7bd9-5925-4c54-b747-57320a350ab9\" (UID: \"6aac7bd9-5925-4c54-b747-57320a350ab9\") " Dec 16 09:11:48 crc kubenswrapper[4823]: I1216 09:11:48.123066 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/6aac7bd9-5925-4c54-b747-57320a350ab9-openstack-config\") pod \"6aac7bd9-5925-4c54-b747-57320a350ab9\" (UID: \"6aac7bd9-5925-4c54-b747-57320a350ab9\") " Dec 16 09:11:48 crc kubenswrapper[4823]: I1216 09:11:48.123186 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6aac7bd9-5925-4c54-b747-57320a350ab9-combined-ca-bundle\") pod \"6aac7bd9-5925-4c54-b747-57320a350ab9\" (UID: \"6aac7bd9-5925-4c54-b747-57320a350ab9\") " Dec 16 09:11:48 crc kubenswrapper[4823]: I1216 09:11:48.123331 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fpfd9\" (UniqueName: \"kubernetes.io/projected/6aac7bd9-5925-4c54-b747-57320a350ab9-kube-api-access-fpfd9\") pod \"6aac7bd9-5925-4c54-b747-57320a350ab9\" (UID: \"6aac7bd9-5925-4c54-b747-57320a350ab9\") " Dec 16 09:11:48 crc kubenswrapper[4823]: I1216 09:11:48.123440 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-df55b6677-dqvsm" Dec 16 09:11:48 crc kubenswrapper[4823]: I1216 09:11:48.190673 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6aac7bd9-5925-4c54-b747-57320a350ab9-kube-api-access-fpfd9" (OuterVolumeSpecName: "kube-api-access-fpfd9") pod "6aac7bd9-5925-4c54-b747-57320a350ab9" (UID: "6aac7bd9-5925-4c54-b747-57320a350ab9"). InnerVolumeSpecName "kube-api-access-fpfd9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:11:48 crc kubenswrapper[4823]: I1216 09:11:48.225866 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/99ce1c86-eccc-4f3c-b999-18774e823763-ovsdbserver-sb\") pod \"99ce1c86-eccc-4f3c-b999-18774e823763\" (UID: \"99ce1c86-eccc-4f3c-b999-18774e823763\") " Dec 16 09:11:48 crc kubenswrapper[4823]: I1216 09:11:48.225928 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gmjx5\" (UniqueName: \"kubernetes.io/projected/99ce1c86-eccc-4f3c-b999-18774e823763-kube-api-access-gmjx5\") pod \"99ce1c86-eccc-4f3c-b999-18774e823763\" (UID: \"99ce1c86-eccc-4f3c-b999-18774e823763\") " Dec 16 09:11:48 crc kubenswrapper[4823]: I1216 09:11:48.226085 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/99ce1c86-eccc-4f3c-b999-18774e823763-dns-svc\") pod \"99ce1c86-eccc-4f3c-b999-18774e823763\" (UID: \"99ce1c86-eccc-4f3c-b999-18774e823763\") " Dec 16 09:11:48 crc kubenswrapper[4823]: I1216 09:11:48.226285 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/99ce1c86-eccc-4f3c-b999-18774e823763-ovsdbserver-nb\") pod \"99ce1c86-eccc-4f3c-b999-18774e823763\" (UID: \"99ce1c86-eccc-4f3c-b999-18774e823763\") " Dec 16 09:11:48 crc kubenswrapper[4823]: I1216 
09:11:48.226321 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99ce1c86-eccc-4f3c-b999-18774e823763-config\") pod \"99ce1c86-eccc-4f3c-b999-18774e823763\" (UID: \"99ce1c86-eccc-4f3c-b999-18774e823763\") " Dec 16 09:11:48 crc kubenswrapper[4823]: I1216 09:11:48.227077 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fpfd9\" (UniqueName: \"kubernetes.io/projected/6aac7bd9-5925-4c54-b747-57320a350ab9-kube-api-access-fpfd9\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:48 crc kubenswrapper[4823]: I1216 09:11:48.251527 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_05dfc2e3-71af-4150-a4ca-02b5629083ae/ovsdbserver-nb/0.log" Dec 16 09:11:48 crc kubenswrapper[4823]: I1216 09:11:48.251622 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 16 09:11:48 crc kubenswrapper[4823]: I1216 09:11:48.269217 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99ce1c86-eccc-4f3c-b999-18774e823763-kube-api-access-gmjx5" (OuterVolumeSpecName: "kube-api-access-gmjx5") pod "99ce1c86-eccc-4f3c-b999-18774e823763" (UID: "99ce1c86-eccc-4f3c-b999-18774e823763"). InnerVolumeSpecName "kube-api-access-gmjx5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:11:48 crc kubenswrapper[4823]: I1216 09:11:48.293558 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican75d9-account-delete-x7xds"] Dec 16 09:11:48 crc kubenswrapper[4823]: I1216 09:11:48.299222 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/novaapif251-account-delete-dhnjg"] Dec 16 09:11:48 crc kubenswrapper[4823]: I1216 09:11:48.331418 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-nb-etc-ovn\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a528e881-7b8b-4172-b241-4b700d70fcaf\") pod \"05dfc2e3-71af-4150-a4ca-02b5629083ae\" (UID: \"05dfc2e3-71af-4150-a4ca-02b5629083ae\") " Dec 16 09:11:48 crc kubenswrapper[4823]: I1216 09:11:48.331475 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/05dfc2e3-71af-4150-a4ca-02b5629083ae-ovsdb-rundir\") pod \"05dfc2e3-71af-4150-a4ca-02b5629083ae\" (UID: \"05dfc2e3-71af-4150-a4ca-02b5629083ae\") " Dec 16 09:11:48 crc kubenswrapper[4823]: I1216 09:11:48.331569 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05dfc2e3-71af-4150-a4ca-02b5629083ae-config\") pod \"05dfc2e3-71af-4150-a4ca-02b5629083ae\" (UID: \"05dfc2e3-71af-4150-a4ca-02b5629083ae\") " Dec 16 09:11:48 crc kubenswrapper[4823]: I1216 09:11:48.331676 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/05dfc2e3-71af-4150-a4ca-02b5629083ae-scripts\") pod \"05dfc2e3-71af-4150-a4ca-02b5629083ae\" (UID: \"05dfc2e3-71af-4150-a4ca-02b5629083ae\") " Dec 16 09:11:48 crc kubenswrapper[4823]: I1216 09:11:48.332354 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/05dfc2e3-71af-4150-a4ca-02b5629083ae-metrics-certs-tls-certs\") pod \"05dfc2e3-71af-4150-a4ca-02b5629083ae\" (UID: \"05dfc2e3-71af-4150-a4ca-02b5629083ae\") " Dec 16 09:11:48 crc kubenswrapper[4823]: I1216 09:11:48.332448 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/05dfc2e3-71af-4150-a4ca-02b5629083ae-ovsdbserver-nb-tls-certs\") pod \"05dfc2e3-71af-4150-a4ca-02b5629083ae\" (UID: \"05dfc2e3-71af-4150-a4ca-02b5629083ae\") " Dec 16 09:11:48 crc kubenswrapper[4823]: I1216 09:11:48.332525 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05dfc2e3-71af-4150-a4ca-02b5629083ae-combined-ca-bundle\") pod \"05dfc2e3-71af-4150-a4ca-02b5629083ae\" (UID: \"05dfc2e3-71af-4150-a4ca-02b5629083ae\") " Dec 16 09:11:48 crc kubenswrapper[4823]: I1216 09:11:48.332638 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rslzb\" (UniqueName: \"kubernetes.io/projected/05dfc2e3-71af-4150-a4ca-02b5629083ae-kube-api-access-rslzb\") pod \"05dfc2e3-71af-4150-a4ca-02b5629083ae\" (UID: \"05dfc2e3-71af-4150-a4ca-02b5629083ae\") " Dec 16 09:11:48 crc kubenswrapper[4823]: I1216 09:11:48.333702 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gmjx5\" (UniqueName: \"kubernetes.io/projected/99ce1c86-eccc-4f3c-b999-18774e823763-kube-api-access-gmjx5\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:48 crc kubenswrapper[4823]: I1216 09:11:48.332769 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05dfc2e3-71af-4150-a4ca-02b5629083ae-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "05dfc2e3-71af-4150-a4ca-02b5629083ae" (UID: "05dfc2e3-71af-4150-a4ca-02b5629083ae"). InnerVolumeSpecName "ovsdb-rundir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 09:11:48 crc kubenswrapper[4823]: I1216 09:11:48.333318 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/05dfc2e3-71af-4150-a4ca-02b5629083ae-config" (OuterVolumeSpecName: "config") pod "05dfc2e3-71af-4150-a4ca-02b5629083ae" (UID: "05dfc2e3-71af-4150-a4ca-02b5629083ae"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 09:11:48 crc kubenswrapper[4823]: I1216 09:11:48.334762 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/05dfc2e3-71af-4150-a4ca-02b5629083ae-scripts" (OuterVolumeSpecName: "scripts") pod "05dfc2e3-71af-4150-a4ca-02b5629083ae" (UID: "05dfc2e3-71af-4150-a4ca-02b5629083ae"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 09:11:48 crc kubenswrapper[4823]: I1216 09:11:48.413947 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05dfc2e3-71af-4150-a4ca-02b5629083ae-kube-api-access-rslzb" (OuterVolumeSpecName: "kube-api-access-rslzb") pod "05dfc2e3-71af-4150-a4ca-02b5629083ae" (UID: "05dfc2e3-71af-4150-a4ca-02b5629083ae"). InnerVolumeSpecName "kube-api-access-rslzb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:11:48 crc kubenswrapper[4823]: I1216 09:11:48.436981 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rslzb\" (UniqueName: \"kubernetes.io/projected/05dfc2e3-71af-4150-a4ca-02b5629083ae-kube-api-access-rslzb\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:48 crc kubenswrapper[4823]: I1216 09:11:48.437009 4823 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/05dfc2e3-71af-4150-a4ca-02b5629083ae-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:48 crc kubenswrapper[4823]: I1216 09:11:48.437019 4823 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05dfc2e3-71af-4150-a4ca-02b5629083ae-config\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:48 crc kubenswrapper[4823]: I1216 09:11:48.437040 4823 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/05dfc2e3-71af-4150-a4ca-02b5629083ae-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:48 crc kubenswrapper[4823]: I1216 09:11:48.505009 4823 generic.go:334] "Generic (PLEG): container finished" podID="6f60cf52-47f0-4efd-8479-64bcc13848cf" containerID="417dd7302ac11767038b48587c8e4bda9ac67055e65321822c980a8f887c974d" exitCode=0 Dec 16 09:11:48 crc kubenswrapper[4823]: I1216 09:11:48.505076 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"6f60cf52-47f0-4efd-8479-64bcc13848cf","Type":"ContainerDied","Data":"417dd7302ac11767038b48587c8e4bda9ac67055e65321822c980a8f887c974d"} Dec 16 09:11:48 crc kubenswrapper[4823]: I1216 09:11:48.507560 4823 generic.go:334] "Generic (PLEG): container finished" podID="d06b91f8-1fcd-40fe-b712-0549d99258c6" containerID="5d6cc389cc0a251a9367e2e3b78544eb67ee1e7cee3e16ec15b4a605c6de77ee" exitCode=0 Dec 16 09:11:48 crc kubenswrapper[4823]: I1216 09:11:48.507603 4823 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d06b91f8-1fcd-40fe-b712-0549d99258c6","Type":"ContainerDied","Data":"5d6cc389cc0a251a9367e2e3b78544eb67ee1e7cee3e16ec15b4a605c6de77ee"} Dec 16 09:11:48 crc kubenswrapper[4823]: I1216 09:11:48.509728 4823 generic.go:334] "Generic (PLEG): container finished" podID="2d676c2b-8cf1-4933-8f2b-641733d096fc" containerID="57d0046d03d662cbe48ec832a46a5f0d40cf1316633fc87176e963b49f32c392" exitCode=0 Dec 16 09:11:48 crc kubenswrapper[4823]: I1216 09:11:48.509762 4823 generic.go:334] "Generic (PLEG): container finished" podID="2d676c2b-8cf1-4933-8f2b-641733d096fc" containerID="d1a4fe707f62e7d60a2461aca9f820aa5ffe503be99824ae07a55f13abe97b6a" exitCode=0 Dec 16 09:11:48 crc kubenswrapper[4823]: I1216 09:11:48.509801 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5bbf6c4b7b-7qpq6" event={"ID":"2d676c2b-8cf1-4933-8f2b-641733d096fc","Type":"ContainerDied","Data":"57d0046d03d662cbe48ec832a46a5f0d40cf1316633fc87176e963b49f32c392"} Dec 16 09:11:48 crc kubenswrapper[4823]: I1216 09:11:48.509833 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5bbf6c4b7b-7qpq6" event={"ID":"2d676c2b-8cf1-4933-8f2b-641733d096fc","Type":"ContainerDied","Data":"d1a4fe707f62e7d60a2461aca9f820aa5ffe503be99824ae07a55f13abe97b6a"} Dec 16 09:11:48 crc kubenswrapper[4823]: I1216 09:11:48.516477 4823 generic.go:334] "Generic (PLEG): container finished" podID="868b7d1a-5039-4d72-9a41-d8e57b1df5d4" containerID="e6f71a18226db71c5f87b17ab03484664718f29d8f13ce063bf0655bdf85162f" exitCode=0 Dec 16 09:11:48 crc kubenswrapper[4823]: I1216 09:11:48.516537 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"868b7d1a-5039-4d72-9a41-d8e57b1df5d4","Type":"ContainerDied","Data":"e6f71a18226db71c5f87b17ab03484664718f29d8f13ce063bf0655bdf85162f"} Dec 16 09:11:48 crc kubenswrapper[4823]: I1216 
09:11:48.522274 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell12d63-account-delete-8c88v" event={"ID":"4da7ae09-cc1d-4f42-b1be-7045236d12e9","Type":"ContainerStarted","Data":"4c0915f59df3483bd25df0c2cabf92b6f536576bc00a55bf3ef588b9efc4843a"} Dec 16 09:11:48 crc kubenswrapper[4823]: W1216 09:11:48.526334 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1422dc66_68e5_403d_9e01_657d83772587.slice/crio-a743c22038bc13e5b952d64ec76e34d2c84b9885987049e0b038a31e63335500 WatchSource:0}: Error finding container a743c22038bc13e5b952d64ec76e34d2c84b9885987049e0b038a31e63335500: Status 404 returned error can't find the container with id a743c22038bc13e5b952d64ec76e34d2c84b9885987049e0b038a31e63335500 Dec 16 09:11:48 crc kubenswrapper[4823]: I1216 09:11:48.526847 4823 generic.go:334] "Generic (PLEG): container finished" podID="f058bf18-c31d-4b48-a183-bb9ae9223fbe" containerID="b67bbe0ab1f251a9ee41d54b3f9217494a588f2c6e81c715f6504ef1a69b0fb0" exitCode=0 Dec 16 09:11:48 crc kubenswrapper[4823]: I1216 09:11:48.526865 4823 generic.go:334] "Generic (PLEG): container finished" podID="f058bf18-c31d-4b48-a183-bb9ae9223fbe" containerID="be73fd31aef3c2646ca8ea8d7a1185806913357be9eccd871881753164e9eaff" exitCode=0 Dec 16 09:11:48 crc kubenswrapper[4823]: I1216 09:11:48.526873 4823 generic.go:334] "Generic (PLEG): container finished" podID="f058bf18-c31d-4b48-a183-bb9ae9223fbe" containerID="89a2670cf718ab31b266c6424bfc66b8bfcac5fd22e626dc8ba58e5549ee3781" exitCode=0 Dec 16 09:11:48 crc kubenswrapper[4823]: I1216 09:11:48.526911 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"f058bf18-c31d-4b48-a183-bb9ae9223fbe","Type":"ContainerDied","Data":"b67bbe0ab1f251a9ee41d54b3f9217494a588f2c6e81c715f6504ef1a69b0fb0"} Dec 16 09:11:48 crc kubenswrapper[4823]: I1216 09:11:48.526931 4823 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"f058bf18-c31d-4b48-a183-bb9ae9223fbe","Type":"ContainerDied","Data":"be73fd31aef3c2646ca8ea8d7a1185806913357be9eccd871881753164e9eaff"} Dec 16 09:11:48 crc kubenswrapper[4823]: I1216 09:11:48.526943 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"f058bf18-c31d-4b48-a183-bb9ae9223fbe","Type":"ContainerDied","Data":"89a2670cf718ab31b266c6424bfc66b8bfcac5fd22e626dc8ba58e5549ee3781"} Dec 16 09:11:48 crc kubenswrapper[4823]: I1216 09:11:48.532378 4823 generic.go:334] "Generic (PLEG): container finished" podID="3be6f063-aed2-4468-9cd3-f7f03bd28211" containerID="bd3081ad4ea308f342c6b6e58cfb9d12ea558645b16ba2e2d7f87a7c18127a34" exitCode=0 Dec 16 09:11:48 crc kubenswrapper[4823]: I1216 09:11:48.532434 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement2f37-account-delete-t22qt" event={"ID":"3be6f063-aed2-4468-9cd3-f7f03bd28211","Type":"ContainerDied","Data":"bd3081ad4ea308f342c6b6e58cfb9d12ea558645b16ba2e2d7f87a7c18127a34"} Dec 16 09:11:48 crc kubenswrapper[4823]: W1216 09:11:48.532671 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod25a0f697_45ab_48cd_b4e2_d5e8bcd3b725.slice/crio-c69cb34cb1b02d5fa8a3c6c2a83bc20184104220c1e0d57d6f0b8c7b9d0cf1d9 WatchSource:0}: Error finding container c69cb34cb1b02d5fa8a3c6c2a83bc20184104220c1e0d57d6f0b8c7b9d0cf1d9: Status 404 returned error can't find the container with id c69cb34cb1b02d5fa8a3c6c2a83bc20184104220c1e0d57d6f0b8c7b9d0cf1d9 Dec 16 09:11:48 crc kubenswrapper[4823]: I1216 09:11:48.540398 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell0da3b-account-delete-2w9zh" event={"ID":"7835251f-9e66-445c-9581-0422195cdc2b","Type":"ContainerStarted","Data":"f890c4c2314b798dc974513df5541e8952b1552ebd64dd80204df16fbde9f3c3"} Dec 16 09:11:48 crc kubenswrapper[4823]: 
I1216 09:11:48.544914 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_05dfc2e3-71af-4150-a4ca-02b5629083ae/ovsdbserver-nb/0.log" Dec 16 09:11:48 crc kubenswrapper[4823]: I1216 09:11:48.545181 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 16 09:11:48 crc kubenswrapper[4823]: I1216 09:11:48.545234 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"05dfc2e3-71af-4150-a4ca-02b5629083ae","Type":"ContainerDied","Data":"dc880561e1a9fedf478eafe6742eadbdd3dc73c314f10d410548e6de4c546ea9"} Dec 16 09:11:48 crc kubenswrapper[4823]: I1216 09:11:48.545307 4823 scope.go:117] "RemoveContainer" containerID="fa3277346b7569acd2a70b5d918fc5eff0d9ac222f0e8c16aa4ecaef31ed032b" Dec 16 09:11:48 crc kubenswrapper[4823]: I1216 09:11:48.560914 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a528e881-7b8b-4172-b241-4b700d70fcaf" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod "05dfc2e3-71af-4150-a4ca-02b5629083ae" (UID: "05dfc2e3-71af-4150-a4ca-02b5629083ae"). InnerVolumeSpecName "pvc-a528e881-7b8b-4172-b241-4b700d70fcaf". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 16 09:11:48 crc kubenswrapper[4823]: I1216 09:11:48.569558 4823 generic.go:334] "Generic (PLEG): container finished" podID="34b62d72-52bc-4a7d-806c-52784476a695" containerID="69a1928930a4c71ac0b712b8fdbc30fc7cd0ea0a10daa4d5d941a2057c2caf2e" exitCode=0 Dec 16 09:11:48 crc kubenswrapper[4823]: I1216 09:11:48.569594 4823 generic.go:334] "Generic (PLEG): container finished" podID="34b62d72-52bc-4a7d-806c-52784476a695" containerID="edfff901e422667feb8df9942487dc99f3781c67ee58c02cc9599524e02a462f" exitCode=0 Dec 16 09:11:48 crc kubenswrapper[4823]: I1216 09:11:48.569646 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"34b62d72-52bc-4a7d-806c-52784476a695","Type":"ContainerDied","Data":"69a1928930a4c71ac0b712b8fdbc30fc7cd0ea0a10daa4d5d941a2057c2caf2e"} Dec 16 09:11:48 crc kubenswrapper[4823]: I1216 09:11:48.569670 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"34b62d72-52bc-4a7d-806c-52784476a695","Type":"ContainerDied","Data":"edfff901e422667feb8df9942487dc99f3781c67ee58c02cc9599524e02a462f"} Dec 16 09:11:48 crc kubenswrapper[4823]: I1216 09:11:48.580855 4823 generic.go:334] "Generic (PLEG): container finished" podID="cffdbd32-0155-4dd0-897d-9e406fd5e2ee" containerID="25018dc3f33bf4fbcc5228605d9875ddf4805fa913e97070453b8841fe915d79" exitCode=0 Dec 16 09:11:48 crc kubenswrapper[4823]: I1216 09:11:48.580946 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"cffdbd32-0155-4dd0-897d-9e406fd5e2ee","Type":"ContainerDied","Data":"25018dc3f33bf4fbcc5228605d9875ddf4805fa913e97070453b8841fe915d79"} Dec 16 09:11:48 crc kubenswrapper[4823]: I1216 09:11:48.592757 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutronc394-account-delete-lkcfh" 
event={"ID":"9b75df4d-61d8-4913-bea9-018339e8e2a8","Type":"ContainerStarted","Data":"8a0efda60bcc4f8c2ac4c1cbe3bed62a6fa15174b36b9100746470ae259f0204"} Dec 16 09:11:48 crc kubenswrapper[4823]: I1216 09:11:48.607576 4823 generic.go:334] "Generic (PLEG): container finished" podID="6ebc4a0e-1b85-400b-bc10-5d216d7431fb" containerID="96fcd04f838b04b2f8450fa0fddd75849e66aca34b0de130673596b80ab9b02e" exitCode=0 Dec 16 09:11:48 crc kubenswrapper[4823]: I1216 09:11:48.607603 4823 generic.go:334] "Generic (PLEG): container finished" podID="6ebc4a0e-1b85-400b-bc10-5d216d7431fb" containerID="6dcfd48a9401ea537616a16a619f9cf8397493228a632419e0b26b39624f0619" exitCode=0 Dec 16 09:11:48 crc kubenswrapper[4823]: I1216 09:11:48.607660 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"6ebc4a0e-1b85-400b-bc10-5d216d7431fb","Type":"ContainerDied","Data":"96fcd04f838b04b2f8450fa0fddd75849e66aca34b0de130673596b80ab9b02e"} Dec 16 09:11:48 crc kubenswrapper[4823]: I1216 09:11:48.607685 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"6ebc4a0e-1b85-400b-bc10-5d216d7431fb","Type":"ContainerDied","Data":"6dcfd48a9401ea537616a16a619f9cf8397493228a632419e0b26b39624f0619"} Dec 16 09:11:48 crc kubenswrapper[4823]: I1216 09:11:48.623443 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutronc394-account-delete-lkcfh" podStartSLOduration=4.6234258950000005 podStartE2EDuration="4.623425895s" podCreationTimestamp="2025-12-16 09:11:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 09:11:48.614365981 +0000 UTC m=+8187.102932104" watchObservedRunningTime="2025-12-16 09:11:48.623425895 +0000 UTC m=+8187.111992018" Dec 16 09:11:48 crc kubenswrapper[4823]: I1216 09:11:48.645504 4823 reconciler_common.go:286] "operationExecutor.UnmountDevice 
started for volume \"pvc-a528e881-7b8b-4172-b241-4b700d70fcaf\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a528e881-7b8b-4172-b241-4b700d70fcaf\") on node \"crc\" " Dec 16 09:11:48 crc kubenswrapper[4823]: I1216 09:11:48.646793 4823 generic.go:334] "Generic (PLEG): container finished" podID="341f00a5-410a-4656-876e-a6b0cfe2a4df" containerID="a872bdca1a55985b613b2e9d1e9a92fa37fb0ac195eba280c9f758c50de98937" exitCode=143 Dec 16 09:11:48 crc kubenswrapper[4823]: I1216 09:11:48.647259 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-fcf4dff7-84zz6" event={"ID":"341f00a5-410a-4656-876e-a6b0cfe2a4df","Type":"ContainerDied","Data":"a872bdca1a55985b613b2e9d1e9a92fa37fb0ac195eba280c9f758c50de98937"} Dec 16 09:11:48 crc kubenswrapper[4823]: I1216 09:11:48.663329 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinderea34-account-delete-pjmhc" event={"ID":"d9880fe3-977f-473b-84c9-2cb6f65d588d","Type":"ContainerStarted","Data":"654c4d8ede8217ced8f3e0dfc87eb2327a0e63180f5fd8d72925ccda0f665bb4"} Dec 16 09:11:48 crc kubenswrapper[4823]: I1216 09:11:48.667404 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 16 09:11:48 crc kubenswrapper[4823]: I1216 09:11:48.687316 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-df55b6677-dqvsm" event={"ID":"99ce1c86-eccc-4f3c-b999-18774e823763","Type":"ContainerDied","Data":"c2a7d0f644a83a4facecf3e0bca015ccbeff1cf0c10e71185053b15f2c260972"} Dec 16 09:11:48 crc kubenswrapper[4823]: I1216 09:11:48.687337 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-df55b6677-dqvsm" Dec 16 09:11:48 crc kubenswrapper[4823]: I1216 09:11:48.705977 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinderea34-account-delete-pjmhc" podStartSLOduration=4.705938806 podStartE2EDuration="4.705938806s" podCreationTimestamp="2025-12-16 09:11:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 09:11:48.687370116 +0000 UTC m=+8187.175936239" watchObservedRunningTime="2025-12-16 09:11:48.705938806 +0000 UTC m=+8187.194504929" Dec 16 09:11:48 crc kubenswrapper[4823]: I1216 09:11:48.712331 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance4d28-account-delete-cb8qx" event={"ID":"5729be98-e3b4-42bd-92a6-913d63da1de3","Type":"ContainerStarted","Data":"1c350b1494002b5a30d77ad31609f784399a826474e865f7bd12ac6cc3e1aa4c"} Dec 16 09:11:48 crc kubenswrapper[4823]: E1216 09:11:48.790551 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f0c094f39df35eb8a22e0462a3cdfe3e11e037d6479ef37e2519d6111a7eadb9" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 16 09:11:48 crc kubenswrapper[4823]: E1216 09:11:48.797368 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f0c094f39df35eb8a22e0462a3cdfe3e11e037d6479ef37e2519d6111a7eadb9" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 16 09:11:48 crc kubenswrapper[4823]: I1216 09:11:48.798048 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6aac7bd9-5925-4c54-b747-57320a350ab9-openstack-config" (OuterVolumeSpecName: 
"openstack-config") pod "6aac7bd9-5925-4c54-b747-57320a350ab9" (UID: "6aac7bd9-5925-4c54-b747-57320a350ab9"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 09:11:48 crc kubenswrapper[4823]: E1216 09:11:48.810828 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8de49ec4ecd014f0c63b6ec26daddcc666881ad384dbde0da558498c271e033a is running failed: container process not found" containerID="8de49ec4ecd014f0c63b6ec26daddcc666881ad384dbde0da558498c271e033a" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Dec 16 09:11:48 crc kubenswrapper[4823]: E1216 09:11:48.811218 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8de49ec4ecd014f0c63b6ec26daddcc666881ad384dbde0da558498c271e033a is running failed: container process not found" containerID="8de49ec4ecd014f0c63b6ec26daddcc666881ad384dbde0da558498c271e033a" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Dec 16 09:11:48 crc kubenswrapper[4823]: E1216 09:11:48.815314 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8de49ec4ecd014f0c63b6ec26daddcc666881ad384dbde0da558498c271e033a is running failed: container process not found" containerID="8de49ec4ecd014f0c63b6ec26daddcc666881ad384dbde0da558498c271e033a" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Dec 16 09:11:48 crc kubenswrapper[4823]: E1216 09:11:48.815347 4823 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8de49ec4ecd014f0c63b6ec26daddcc666881ad384dbde0da558498c271e033a is running failed: container process not found" probeType="Readiness" 
pod="openstack/openstack-cell1-galera-0" podUID="4496b25e-2f39-453a-aa60-ffa74e9913c8" containerName="galera" Dec 16 09:11:48 crc kubenswrapper[4823]: E1216 09:11:48.817451 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f0c094f39df35eb8a22e0462a3cdfe3e11e037d6479ef37e2519d6111a7eadb9" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 16 09:11:48 crc kubenswrapper[4823]: E1216 09:11:48.817488 4823 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="9fd92bc3-eaf0-4217-bcac-dd8f41db9edf" containerName="nova-cell1-conductor-conductor" Dec 16 09:11:48 crc kubenswrapper[4823]: I1216 09:11:48.853274 4823 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/6aac7bd9-5925-4c54-b747-57320a350ab9-openstack-config\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:48 crc kubenswrapper[4823]: E1216 09:11:48.891497 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 417dd7302ac11767038b48587c8e4bda9ac67055e65321822c980a8f887c974d is running failed: container process not found" containerID="417dd7302ac11767038b48587c8e4bda9ac67055e65321822c980a8f887c974d" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 16 09:11:48 crc kubenswrapper[4823]: E1216 09:11:48.892504 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 417dd7302ac11767038b48587c8e4bda9ac67055e65321822c980a8f887c974d is running failed: container process not found" 
containerID="417dd7302ac11767038b48587c8e4bda9ac67055e65321822c980a8f887c974d" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 16 09:11:48 crc kubenswrapper[4823]: E1216 09:11:48.892952 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 417dd7302ac11767038b48587c8e4bda9ac67055e65321822c980a8f887c974d is running failed: container process not found" containerID="417dd7302ac11767038b48587c8e4bda9ac67055e65321822c980a8f887c974d" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 16 09:11:48 crc kubenswrapper[4823]: E1216 09:11:48.893045 4823 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 417dd7302ac11767038b48587c8e4bda9ac67055e65321822c980a8f887c974d is running failed: container process not found" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="6f60cf52-47f0-4efd-8479-64bcc13848cf" containerName="nova-scheduler-scheduler" Dec 16 09:11:48 crc kubenswrapper[4823]: I1216 09:11:48.933574 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6aac7bd9-5925-4c54-b747-57320a350ab9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6aac7bd9-5925-4c54-b747-57320a350ab9" (UID: "6aac7bd9-5925-4c54-b747-57320a350ab9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:11:48 crc kubenswrapper[4823]: I1216 09:11:48.955559 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6aac7bd9-5925-4c54-b747-57320a350ab9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:48 crc kubenswrapper[4823]: I1216 09:11:48.990807 4823 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Dec 16 09:11:48 crc kubenswrapper[4823]: I1216 09:11:48.991046 4823 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-a528e881-7b8b-4172-b241-4b700d70fcaf" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a528e881-7b8b-4172-b241-4b700d70fcaf") on node "crc" Dec 16 09:11:49 crc kubenswrapper[4823]: I1216 09:11:49.058314 4823 reconciler_common.go:293] "Volume detached for volume \"pvc-a528e881-7b8b-4172-b241-4b700d70fcaf\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a528e881-7b8b-4172-b241-4b700d70fcaf\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:49 crc kubenswrapper[4823]: I1216 09:11:49.111291 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99ce1c86-eccc-4f3c-b999-18774e823763-config" (OuterVolumeSpecName: "config") pod "99ce1c86-eccc-4f3c-b999-18774e823763" (UID: "99ce1c86-eccc-4f3c-b999-18774e823763"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 09:11:49 crc kubenswrapper[4823]: I1216 09:11:49.142429 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05dfc2e3-71af-4150-a4ca-02b5629083ae-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "05dfc2e3-71af-4150-a4ca-02b5629083ae" (UID: "05dfc2e3-71af-4150-a4ca-02b5629083ae"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:11:49 crc kubenswrapper[4823]: I1216 09:11:49.142536 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6aac7bd9-5925-4c54-b747-57320a350ab9-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "6aac7bd9-5925-4c54-b747-57320a350ab9" (UID: "6aac7bd9-5925-4c54-b747-57320a350ab9"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:11:49 crc kubenswrapper[4823]: I1216 09:11:49.160477 4823 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99ce1c86-eccc-4f3c-b999-18774e823763-config\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:49 crc kubenswrapper[4823]: I1216 09:11:49.160509 4823 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6aac7bd9-5925-4c54-b747-57320a350ab9-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:49 crc kubenswrapper[4823]: I1216 09:11:49.160522 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05dfc2e3-71af-4150-a4ca-02b5629083ae-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:49 crc kubenswrapper[4823]: I1216 09:11:49.168725 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99ce1c86-eccc-4f3c-b999-18774e823763-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "99ce1c86-eccc-4f3c-b999-18774e823763" (UID: "99ce1c86-eccc-4f3c-b999-18774e823763"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 09:11:49 crc kubenswrapper[4823]: I1216 09:11:49.187722 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99ce1c86-eccc-4f3c-b999-18774e823763-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "99ce1c86-eccc-4f3c-b999-18774e823763" (UID: "99ce1c86-eccc-4f3c-b999-18774e823763"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 09:11:49 crc kubenswrapper[4823]: I1216 09:11:49.238835 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99ce1c86-eccc-4f3c-b999-18774e823763-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "99ce1c86-eccc-4f3c-b999-18774e823763" (UID: "99ce1c86-eccc-4f3c-b999-18774e823763"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 09:11:49 crc kubenswrapper[4823]: I1216 09:11:49.265201 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05dfc2e3-71af-4150-a4ca-02b5629083ae-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "05dfc2e3-71af-4150-a4ca-02b5629083ae" (UID: "05dfc2e3-71af-4150-a4ca-02b5629083ae"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:11:49 crc kubenswrapper[4823]: I1216 09:11:49.266294 4823 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/05dfc2e3-71af-4150-a4ca-02b5629083ae-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:49 crc kubenswrapper[4823]: I1216 09:11:49.266327 4823 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/99ce1c86-eccc-4f3c-b999-18774e823763-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:49 crc kubenswrapper[4823]: I1216 09:11:49.266337 4823 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/99ce1c86-eccc-4f3c-b999-18774e823763-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:49 crc kubenswrapper[4823]: I1216 09:11:49.266354 4823 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/99ce1c86-eccc-4f3c-b999-18774e823763-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:49 crc kubenswrapper[4823]: I1216 09:11:49.337202 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05dfc2e3-71af-4150-a4ca-02b5629083ae-ovsdbserver-nb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-nb-tls-certs") pod "05dfc2e3-71af-4150-a4ca-02b5629083ae" (UID: "05dfc2e3-71af-4150-a4ca-02b5629083ae"). InnerVolumeSpecName "ovsdbserver-nb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:11:49 crc kubenswrapper[4823]: E1216 09:11:49.368406 4823 configmap.go:193] Couldn't get configMap openstack/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Dec 16 09:11:49 crc kubenswrapper[4823]: E1216 09:11:49.368486 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4da7ae09-cc1d-4f42-b1be-7045236d12e9-operator-scripts podName:4da7ae09-cc1d-4f42-b1be-7045236d12e9 nodeName:}" failed. No retries permitted until 2025-12-16 09:11:53.368468203 +0000 UTC m=+8191.857034326 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/4da7ae09-cc1d-4f42-b1be-7045236d12e9-operator-scripts") pod "novacell12d63-account-delete-8c88v" (UID: "4da7ae09-cc1d-4f42-b1be-7045236d12e9") : configmap "openstack-cell1-scripts" not found Dec 16 09:11:49 crc kubenswrapper[4823]: I1216 09:11:49.369074 4823 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/05dfc2e3-71af-4150-a4ca-02b5629083ae-ovsdbserver-nb-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:49 crc kubenswrapper[4823]: I1216 09:11:49.465087 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodhf38e-account-delete-hxrkv"] Dec 16 09:11:49 crc kubenswrapper[4823]: I1216 09:11:49.566907 4823 scope.go:117] "RemoveContainer" containerID="066b7853bc0b7a71c51ec7870fb1b2dfe729cc3a82e349d2bb9dcc2c91a68ff6" Dec 16 09:11:49 crc kubenswrapper[4823]: E1216 09:11:49.682080 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d14f2961c04bfe412ae181946bae7fa89e83576b7b38570699ef5a396fa77523" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Dec 16 09:11:49 crc kubenswrapper[4823]: E1216 09:11:49.684909 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d14f2961c04bfe412ae181946bae7fa89e83576b7b38570699ef5a396fa77523" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Dec 16 09:11:49 crc kubenswrapper[4823]: E1216 09:11:49.691209 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="d14f2961c04bfe412ae181946bae7fa89e83576b7b38570699ef5a396fa77523" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Dec 16 09:11:49 crc kubenswrapper[4823]: E1216 09:11:49.691274 4823 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="64445002-15b9-4ec6-8c95-7c2bd33e0ecd" containerName="ovn-northd" Dec 16 09:11:49 crc kubenswrapper[4823]: I1216 09:11:49.713359 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-5bbf6c4b7b-7qpq6" Dec 16 09:11:49 crc kubenswrapper[4823]: I1216 09:11:49.774106 4823 scope.go:117] "RemoveContainer" containerID="06a0d06ef2a52866e48ef5baf829653c9eb89594b03bea4aa951dea2c39e0a1c" Dec 16 09:11:49 crc kubenswrapper[4823]: I1216 09:11:49.783069 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2d676c2b-8cf1-4933-8f2b-641733d096fc-etc-swift\") pod \"2d676c2b-8cf1-4933-8f2b-641733d096fc\" (UID: \"2d676c2b-8cf1-4933-8f2b-641733d096fc\") " Dec 16 09:11:49 crc kubenswrapper[4823]: I1216 09:11:49.783308 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d676c2b-8cf1-4933-8f2b-641733d096fc-internal-tls-certs\") pod \"2d676c2b-8cf1-4933-8f2b-641733d096fc\" (UID: \"2d676c2b-8cf1-4933-8f2b-641733d096fc\") " Dec 16 09:11:49 crc kubenswrapper[4823]: I1216 09:11:49.783462 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d676c2b-8cf1-4933-8f2b-641733d096fc-public-tls-certs\") pod \"2d676c2b-8cf1-4933-8f2b-641733d096fc\" (UID: \"2d676c2b-8cf1-4933-8f2b-641733d096fc\") " Dec 16 09:11:49 crc kubenswrapper[4823]: I1216 09:11:49.783580 4823 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8mqn5\" (UniqueName: \"kubernetes.io/projected/2d676c2b-8cf1-4933-8f2b-641733d096fc-kube-api-access-8mqn5\") pod \"2d676c2b-8cf1-4933-8f2b-641733d096fc\" (UID: \"2d676c2b-8cf1-4933-8f2b-641733d096fc\") " Dec 16 09:11:49 crc kubenswrapper[4823]: I1216 09:11:49.783660 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d676c2b-8cf1-4933-8f2b-641733d096fc-config-data\") pod \"2d676c2b-8cf1-4933-8f2b-641733d096fc\" (UID: \"2d676c2b-8cf1-4933-8f2b-641733d096fc\") " Dec 16 09:11:49 crc kubenswrapper[4823]: I1216 09:11:49.783749 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2d676c2b-8cf1-4933-8f2b-641733d096fc-run-httpd\") pod \"2d676c2b-8cf1-4933-8f2b-641733d096fc\" (UID: \"2d676c2b-8cf1-4933-8f2b-641733d096fc\") " Dec 16 09:11:49 crc kubenswrapper[4823]: I1216 09:11:49.783940 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d676c2b-8cf1-4933-8f2b-641733d096fc-combined-ca-bundle\") pod \"2d676c2b-8cf1-4933-8f2b-641733d096fc\" (UID: \"2d676c2b-8cf1-4933-8f2b-641733d096fc\") " Dec 16 09:11:49 crc kubenswrapper[4823]: I1216 09:11:49.784101 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2d676c2b-8cf1-4933-8f2b-641733d096fc-log-httpd\") pod \"2d676c2b-8cf1-4933-8f2b-641733d096fc\" (UID: \"2d676c2b-8cf1-4933-8f2b-641733d096fc\") " Dec 16 09:11:49 crc kubenswrapper[4823]: I1216 09:11:49.785604 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d676c2b-8cf1-4933-8f2b-641733d096fc-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "2d676c2b-8cf1-4933-8f2b-641733d096fc" (UID: 
"2d676c2b-8cf1-4933-8f2b-641733d096fc"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 09:11:49 crc kubenswrapper[4823]: I1216 09:11:49.824058 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d676c2b-8cf1-4933-8f2b-641733d096fc-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "2d676c2b-8cf1-4933-8f2b-641733d096fc" (UID: "2d676c2b-8cf1-4933-8f2b-641733d096fc"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 09:11:49 crc kubenswrapper[4823]: I1216 09:11:49.832259 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6aac7bd9-5925-4c54-b747-57320a350ab9" path="/var/lib/kubelet/pods/6aac7bd9-5925-4c54-b747-57320a350ab9/volumes" Dec 16 09:11:49 crc kubenswrapper[4823]: I1216 09:11:49.853444 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d676c2b-8cf1-4933-8f2b-641733d096fc-kube-api-access-8mqn5" (OuterVolumeSpecName: "kube-api-access-8mqn5") pod "2d676c2b-8cf1-4933-8f2b-641733d096fc" (UID: "2d676c2b-8cf1-4933-8f2b-641733d096fc"). InnerVolumeSpecName "kube-api-access-8mqn5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:11:49 crc kubenswrapper[4823]: I1216 09:11:49.861066 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"6f60cf52-47f0-4efd-8479-64bcc13848cf","Type":"ContainerDied","Data":"90392d82387d45dbffa8a8d7ce481dd9c2592ac9876306fb265f6ddad2ddece8"} Dec 16 09:11:49 crc kubenswrapper[4823]: I1216 09:11:49.861110 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="90392d82387d45dbffa8a8d7ce481dd9c2592ac9876306fb265f6ddad2ddece8" Dec 16 09:11:49 crc kubenswrapper[4823]: I1216 09:11:49.871802 4823 scope.go:117] "RemoveContainer" containerID="ffe2b7b0714c510ef6a3fa2e5f42cfeebfd6329c0126d803409a205d66f2ddec" Dec 16 09:11:49 crc kubenswrapper[4823]: I1216 09:11:49.872368 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d676c2b-8cf1-4933-8f2b-641733d096fc-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "2d676c2b-8cf1-4933-8f2b-641733d096fc" (UID: "2d676c2b-8cf1-4933-8f2b-641733d096fc"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:11:49 crc kubenswrapper[4823]: I1216 09:11:49.872642 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 16 09:11:49 crc kubenswrapper[4823]: I1216 09:11:49.873560 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-df55b6677-dqvsm"] Dec 16 09:11:49 crc kubenswrapper[4823]: I1216 09:11:49.881526 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodhf38e-account-delete-hxrkv" event={"ID":"6ecabaef-9422-4e5c-bf83-df3b523b8fa7","Type":"ContainerStarted","Data":"bc402249c7f1d2470a20c55e31d2f62c03e4b834ca222fda192967a13581d2f8"} Dec 16 09:11:49 crc kubenswrapper[4823]: I1216 09:11:49.893324 4823 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2d676c2b-8cf1-4933-8f2b-641733d096fc-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:49 crc kubenswrapper[4823]: I1216 09:11:49.893357 4823 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2d676c2b-8cf1-4933-8f2b-641733d096fc-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:49 crc kubenswrapper[4823]: I1216 09:11:49.893369 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8mqn5\" (UniqueName: \"kubernetes.io/projected/2d676c2b-8cf1-4933-8f2b-641733d096fc-kube-api-access-8mqn5\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:49 crc kubenswrapper[4823]: I1216 09:11:49.893382 4823 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2d676c2b-8cf1-4933-8f2b-641733d096fc-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:49 crc kubenswrapper[4823]: I1216 09:11:49.903170 4823 generic.go:334] "Generic (PLEG): container finished" podID="d6361c12-5d54-4919-aafe-4ac9b88e8c20" containerID="015ab879863fa955f1176357517ee4e9240bedcaf7dac1c3006bdf9a8fa13743" exitCode=0 Dec 16 09:11:49 crc kubenswrapper[4823]: I1216 09:11:49.903464 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/heatf1cb-account-delete-2mdnl" event={"ID":"d6361c12-5d54-4919-aafe-4ac9b88e8c20","Type":"ContainerDied","Data":"015ab879863fa955f1176357517ee4e9240bedcaf7dac1c3006bdf9a8fa13743"} Dec 16 09:11:49 crc kubenswrapper[4823]: I1216 09:11:49.905360 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-df55b6677-dqvsm"] Dec 16 09:11:49 crc kubenswrapper[4823]: I1216 09:11:49.917778 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican75d9-account-delete-x7xds" event={"ID":"1422dc66-68e5-403d-9e01-657d83772587","Type":"ContainerStarted","Data":"a743c22038bc13e5b952d64ec76e34d2c84b9885987049e0b038a31e63335500"} Dec 16 09:11:49 crc kubenswrapper[4823]: I1216 09:11:49.928654 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novaapif251-account-delete-dhnjg" event={"ID":"25a0f697-45ab-48cd-b4e2-d5e8bcd3b725","Type":"ContainerStarted","Data":"c69cb34cb1b02d5fa8a3c6c2a83bc20184104220c1e0d57d6f0b8c7b9d0cf1d9"} Dec 16 09:11:49 crc kubenswrapper[4823]: I1216 09:11:49.932432 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 16 09:11:49 crc kubenswrapper[4823]: I1216 09:11:49.935557 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"6ebc4a0e-1b85-400b-bc10-5d216d7431fb","Type":"ContainerDied","Data":"39a463dafb4c1ac5e51aa691068f37f8448d522c184cdf99e0ef84345df7c8f0"} Dec 16 09:11:49 crc kubenswrapper[4823]: I1216 09:11:49.935666 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="39a463dafb4c1ac5e51aa691068f37f8448d522c184cdf99e0ef84345df7c8f0" Dec 16 09:11:49 crc kubenswrapper[4823]: I1216 09:11:49.940899 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 16 09:11:49 crc kubenswrapper[4823]: I1216 09:11:49.944666 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" 
event={"ID":"f058bf18-c31d-4b48-a183-bb9ae9223fbe","Type":"ContainerDied","Data":"04fbed9140fb34f53a53f73139b7550de89510410eb5bf68ae0e0477bd549fb9"} Dec 16 09:11:49 crc kubenswrapper[4823]: I1216 09:11:49.944709 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="04fbed9140fb34f53a53f73139b7550de89510410eb5bf68ae0e0477bd549fb9" Dec 16 09:11:49 crc kubenswrapper[4823]: I1216 09:11:49.947911 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5bbf6c4b7b-7qpq6" event={"ID":"2d676c2b-8cf1-4933-8f2b-641733d096fc","Type":"ContainerDied","Data":"29dd8432f55325fd637e50b97c0469c645dd8b2b0ca6199304a41985de194782"} Dec 16 09:11:49 crc kubenswrapper[4823]: I1216 09:11:49.948013 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-5bbf6c4b7b-7qpq6" Dec 16 09:11:49 crc kubenswrapper[4823]: I1216 09:11:49.953774 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8/ovsdbserver-sb/0.log" Dec 16 09:11:49 crc kubenswrapper[4823]: I1216 09:11:49.953868 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8","Type":"ContainerDied","Data":"ec6bb6a11a5c2013352dc44d193fe800db2ae48b152fedad7595d139852bee58"} Dec 16 09:11:49 crc kubenswrapper[4823]: I1216 09:11:49.953891 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ec6bb6a11a5c2013352dc44d193fe800db2ae48b152fedad7595d139852bee58" Dec 16 09:11:49 crc kubenswrapper[4823]: I1216 09:11:49.959288 4823 generic.go:334] "Generic (PLEG): container finished" podID="4496b25e-2f39-453a-aa60-ffa74e9913c8" containerID="8de49ec4ecd014f0c63b6ec26daddcc666881ad384dbde0da558498c271e033a" exitCode=0 Dec 16 09:11:49 crc kubenswrapper[4823]: I1216 09:11:49.959376 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/openstack-cell1-galera-0" event={"ID":"4496b25e-2f39-453a-aa60-ffa74e9913c8","Type":"ContainerDied","Data":"8de49ec4ecd014f0c63b6ec26daddcc666881ad384dbde0da558498c271e033a"} Dec 16 09:11:49 crc kubenswrapper[4823]: I1216 09:11:49.959411 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"4496b25e-2f39-453a-aa60-ffa74e9913c8","Type":"ContainerDied","Data":"e767172d18a6ab67f2f2cb7e2024f66f103a1e4e96f08e1c6664458e1ff73f1b"} Dec 16 09:11:49 crc kubenswrapper[4823]: I1216 09:11:49.959431 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e767172d18a6ab67f2f2cb7e2024f66f103a1e4e96f08e1c6664458e1ff73f1b" Dec 16 09:11:49 crc kubenswrapper[4823]: I1216 09:11:49.961361 4823 generic.go:334] "Generic (PLEG): container finished" podID="9b75df4d-61d8-4913-bea9-018339e8e2a8" containerID="c445ecf1f626f495730c517c5ab9b7d408c11b0d5a67ecbb9944fb24923e46ae" exitCode=0 Dec 16 09:11:49 crc kubenswrapper[4823]: I1216 09:11:49.961416 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutronc394-account-delete-lkcfh" event={"ID":"9b75df4d-61d8-4913-bea9-018339e8e2a8","Type":"ContainerDied","Data":"c445ecf1f626f495730c517c5ab9b7d408c11b0d5a67ecbb9944fb24923e46ae"} Dec 16 09:11:49 crc kubenswrapper[4823]: I1216 09:11:49.963778 4823 generic.go:334] "Generic (PLEG): container finished" podID="5729be98-e3b4-42bd-92a6-913d63da1de3" containerID="dbd25f06ad8df284cbf23ab9bcd9065fe87cd340eea4b9ed12b23c03d32a3218" exitCode=0 Dec 16 09:11:49 crc kubenswrapper[4823]: I1216 09:11:49.963836 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance4d28-account-delete-cb8qx" event={"ID":"5729be98-e3b4-42bd-92a6-913d63da1de3","Type":"ContainerDied","Data":"dbd25f06ad8df284cbf23ab9bcd9065fe87cd340eea4b9ed12b23c03d32a3218"} Dec 16 09:11:49 crc kubenswrapper[4823]: I1216 09:11:49.972102 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell1-novncproxy-0" event={"ID":"868b7d1a-5039-4d72-9a41-d8e57b1df5d4","Type":"ContainerDied","Data":"28f41dbbcfe435b105f27ca14226ca13d0adc0ff95bbc1d708807284cd33f631"} Dec 16 09:11:49 crc kubenswrapper[4823]: I1216 09:11:49.972192 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 16 09:11:49 crc kubenswrapper[4823]: I1216 09:11:49.982628 4823 generic.go:334] "Generic (PLEG): container finished" podID="d9880fe3-977f-473b-84c9-2cb6f65d588d" containerID="160a378f4cf5cfefa1529992f12007f187f5675511006a75337c4a426237781d" exitCode=0 Dec 16 09:11:49 crc kubenswrapper[4823]: I1216 09:11:49.982868 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinderea34-account-delete-pjmhc" event={"ID":"d9880fe3-977f-473b-84c9-2cb6f65d588d","Type":"ContainerDied","Data":"160a378f4cf5cfefa1529992f12007f187f5675511006a75337c4a426237781d"} Dec 16 09:11:50 crc kubenswrapper[4823]: I1216 09:11:50.000622 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fj2jc\" (UniqueName: \"kubernetes.io/projected/868b7d1a-5039-4d72-9a41-d8e57b1df5d4-kube-api-access-fj2jc\") pod \"868b7d1a-5039-4d72-9a41-d8e57b1df5d4\" (UID: \"868b7d1a-5039-4d72-9a41-d8e57b1df5d4\") " Dec 16 09:11:50 crc kubenswrapper[4823]: I1216 09:11:50.000666 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/868b7d1a-5039-4d72-9a41-d8e57b1df5d4-vencrypt-tls-certs\") pod \"868b7d1a-5039-4d72-9a41-d8e57b1df5d4\" (UID: \"868b7d1a-5039-4d72-9a41-d8e57b1df5d4\") " Dec 16 09:11:50 crc kubenswrapper[4823]: I1216 09:11:50.000734 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/868b7d1a-5039-4d72-9a41-d8e57b1df5d4-combined-ca-bundle\") pod \"868b7d1a-5039-4d72-9a41-d8e57b1df5d4\" (UID: 
\"868b7d1a-5039-4d72-9a41-d8e57b1df5d4\") " Dec 16 09:11:50 crc kubenswrapper[4823]: I1216 09:11:50.000829 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/868b7d1a-5039-4d72-9a41-d8e57b1df5d4-nova-novncproxy-tls-certs\") pod \"868b7d1a-5039-4d72-9a41-d8e57b1df5d4\" (UID: \"868b7d1a-5039-4d72-9a41-d8e57b1df5d4\") " Dec 16 09:11:50 crc kubenswrapper[4823]: I1216 09:11:50.000864 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/868b7d1a-5039-4d72-9a41-d8e57b1df5d4-config-data\") pod \"868b7d1a-5039-4d72-9a41-d8e57b1df5d4\" (UID: \"868b7d1a-5039-4d72-9a41-d8e57b1df5d4\") " Dec 16 09:11:50 crc kubenswrapper[4823]: I1216 09:11:50.015173 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d06b91f8-1fcd-40fe-b712-0549d99258c6","Type":"ContainerDied","Data":"be4ced60375d9806a581283d0c18ec61e575bf49cc10b2db539a8ef146283427"} Dec 16 09:11:50 crc kubenswrapper[4823]: I1216 09:11:50.015225 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="be4ced60375d9806a581283d0c18ec61e575bf49cc10b2db539a8ef146283427" Dec 16 09:11:50 crc kubenswrapper[4823]: I1216 09:11:50.039004 4823 generic.go:334] "Generic (PLEG): container finished" podID="4da7ae09-cc1d-4f42-b1be-7045236d12e9" containerID="87f4e34f6b0d49eb81682e2a70ec5c56fbde9854947c3dc4f46ed10392cd9692" exitCode=1 Dec 16 09:11:50 crc kubenswrapper[4823]: I1216 09:11:50.039360 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell12d63-account-delete-8c88v" event={"ID":"4da7ae09-cc1d-4f42-b1be-7045236d12e9","Type":"ContainerDied","Data":"87f4e34f6b0d49eb81682e2a70ec5c56fbde9854947c3dc4f46ed10392cd9692"} Dec 16 09:11:50 crc kubenswrapper[4823]: I1216 09:11:50.043235 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/ceilometer-0"] Dec 16 09:11:50 crc kubenswrapper[4823]: I1216 09:11:50.043622 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="91080e73-6479-4c8b-bb2f-decdc0ade67e" containerName="ceilometer-central-agent" containerID="cri-o://18f3ebcfecc58f7fe77b00fc4dfa8cfa84e702063a21d50902de680197bc2807" gracePeriod=30 Dec 16 09:11:50 crc kubenswrapper[4823]: I1216 09:11:50.043795 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="91080e73-6479-4c8b-bb2f-decdc0ade67e" containerName="proxy-httpd" containerID="cri-o://dda05eb58431d1ff8385982543bd81d7679e91490c736039436d4fcfce053345" gracePeriod=30 Dec 16 09:11:50 crc kubenswrapper[4823]: I1216 09:11:50.043849 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="91080e73-6479-4c8b-bb2f-decdc0ade67e" containerName="sg-core" containerID="cri-o://d6a5d1a1c5e1f237851cc347fa709cbcd9a412a8a2d03acbfbedbaeae74c63c3" gracePeriod=30 Dec 16 09:11:50 crc kubenswrapper[4823]: I1216 09:11:50.043897 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="91080e73-6479-4c8b-bb2f-decdc0ade67e" containerName="ceilometer-notification-agent" containerID="cri-o://79866c53b9104a855a31ec7d65de5986af187b4f14c67e1c06cf3e5d3a928a66" gracePeriod=30 Dec 16 09:11:50 crc kubenswrapper[4823]: I1216 09:11:50.102442 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 16 09:11:50 crc kubenswrapper[4823]: I1216 09:11:50.102760 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="3ee97b1f-ce61-45ef-97e1-642cc13ef521" containerName="kube-state-metrics" containerID="cri-o://9c2fb0e25d5692eb7a90933e0f8cf60671619d23ad83cd29dd61e8449d4f5dfe" gracePeriod=30 Dec 16 09:11:50 crc kubenswrapper[4823]: I1216 
09:11:50.109834 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/868b7d1a-5039-4d72-9a41-d8e57b1df5d4-kube-api-access-fj2jc" (OuterVolumeSpecName: "kube-api-access-fj2jc") pod "868b7d1a-5039-4d72-9a41-d8e57b1df5d4" (UID: "868b7d1a-5039-4d72-9a41-d8e57b1df5d4"). InnerVolumeSpecName "kube-api-access-fj2jc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:11:50 crc kubenswrapper[4823]: I1216 09:11:50.141740 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fj2jc\" (UniqueName: \"kubernetes.io/projected/868b7d1a-5039-4d72-9a41-d8e57b1df5d4-kube-api-access-fj2jc\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:50 crc kubenswrapper[4823]: I1216 09:11:50.267429 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Dec 16 09:11:50 crc kubenswrapper[4823]: I1216 09:11:50.267920 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/memcached-0" podUID="c90aab28-60fa-4cdc-a89a-bd041351015d" containerName="memcached" containerID="cri-o://f09c828395889bafb8967722bc9fe10e42c34bbfc0a893f1c82cb91b42750c4b" gracePeriod=30 Dec 16 09:11:50 crc kubenswrapper[4823]: I1216 09:11:50.388317 4823 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-cfnapi-8589448fc-qj569" podUID="7f35ecc1-21e4-461e-91d3-3da96745fed6" containerName="heat-cfnapi" probeResult="failure" output="Get \"https://10.217.1.122:8000/healthcheck\": read tcp 10.217.0.2:54942->10.217.1.122:8000: read: connection reset by peer" Dec 16 09:11:50 crc kubenswrapper[4823]: I1216 09:11:50.395238 4823 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-api-ddfd865c7-nhsh6" podUID="f8b8d93d-24db-4382-9077-7404605c7cf1" containerName="heat-api" probeResult="failure" output="Get \"https://10.217.1.121:8004/healthcheck\": read tcp 10.217.0.2:51498->10.217.1.121:8004: read: connection reset by peer" Dec 16 09:11:50 crc 
kubenswrapper[4823]: I1216 09:11:50.402935 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-b64c64d55-q7zxm"] Dec 16 09:11:50 crc kubenswrapper[4823]: I1216 09:11:50.403213 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/keystone-b64c64d55-q7zxm" podUID="e5b08afc-bfe3-4938-ac42-3781d1290201" containerName="keystone-api" containerID="cri-o://5a71791d0d178cb3e2f0ca5f41f8f5be586775d58f1659933d5697c3e1b3e765" gracePeriod=30 Dec 16 09:11:50 crc kubenswrapper[4823]: I1216 09:11:50.417413 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/868b7d1a-5039-4d72-9a41-d8e57b1df5d4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "868b7d1a-5039-4d72-9a41-d8e57b1df5d4" (UID: "868b7d1a-5039-4d72-9a41-d8e57b1df5d4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:11:50 crc kubenswrapper[4823]: I1216 09:11:50.425373 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-cron-29431261-swvz9"] Dec 16 09:11:50 crc kubenswrapper[4823]: I1216 09:11:50.446096 4823 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="91f5097e-d643-4598-9d06-39f14f913291" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.1.60:8776/healthcheck\": read tcp 10.217.0.2:34866->10.217.1.60:8776: read: connection reset by peer" Dec 16 09:11:50 crc kubenswrapper[4823]: I1216 09:11:50.459604 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/868b7d1a-5039-4d72-9a41-d8e57b1df5d4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:50 crc kubenswrapper[4823]: I1216 09:11:50.488009 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-cron-29431261-swvz9"] Dec 16 09:11:50 crc kubenswrapper[4823]: I1216 09:11:50.492363 4823 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Dec 16 09:11:50 crc kubenswrapper[4823]: I1216 09:11:50.503542 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d676c2b-8cf1-4933-8f2b-641733d096fc-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "2d676c2b-8cf1-4933-8f2b-641733d096fc" (UID: "2d676c2b-8cf1-4933-8f2b-641733d096fc"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:11:50 crc kubenswrapper[4823]: I1216 09:11:50.503614 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/868b7d1a-5039-4d72-9a41-d8e57b1df5d4-vencrypt-tls-certs" (OuterVolumeSpecName: "vencrypt-tls-certs") pod "868b7d1a-5039-4d72-9a41-d8e57b1df5d4" (UID: "868b7d1a-5039-4d72-9a41-d8e57b1df5d4"). InnerVolumeSpecName "vencrypt-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:11:50 crc kubenswrapper[4823]: I1216 09:11:50.535714 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-create-gr9jj"] Dec 16 09:11:50 crc kubenswrapper[4823]: I1216 09:11:50.535904 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/868b7d1a-5039-4d72-9a41-d8e57b1df5d4-config-data" (OuterVolumeSpecName: "config-data") pod "868b7d1a-5039-4d72-9a41-d8e57b1df5d4" (UID: "868b7d1a-5039-4d72-9a41-d8e57b1df5d4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:11:50 crc kubenswrapper[4823]: I1216 09:11:50.540638 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d676c2b-8cf1-4933-8f2b-641733d096fc-config-data" (OuterVolumeSpecName: "config-data") pod "2d676c2b-8cf1-4933-8f2b-641733d096fc" (UID: "2d676c2b-8cf1-4933-8f2b-641733d096fc"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:11:50 crc kubenswrapper[4823]: I1216 09:11:50.544239 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-create-gr9jj"] Dec 16 09:11:50 crc kubenswrapper[4823]: E1216 09:11:50.560705 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6325c5677c7baf607bd7e00984420b3b254a2b5f3dd777e252669862730b0092" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Dec 16 09:11:50 crc kubenswrapper[4823]: I1216 09:11:50.560929 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-f38e-account-create-update-kgfvh"] Dec 16 09:11:50 crc kubenswrapper[4823]: I1216 09:11:50.562720 4823 reconciler_common.go:293] "Volume detached for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/868b7d1a-5039-4d72-9a41-d8e57b1df5d4-vencrypt-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:50 crc kubenswrapper[4823]: I1216 09:11:50.562747 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/868b7d1a-5039-4d72-9a41-d8e57b1df5d4-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:50 crc kubenswrapper[4823]: I1216 09:11:50.562761 4823 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d676c2b-8cf1-4933-8f2b-641733d096fc-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:50 crc kubenswrapper[4823]: I1216 09:11:50.562772 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d676c2b-8cf1-4933-8f2b-641733d096fc-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:50 crc kubenswrapper[4823]: E1216 09:11:50.566418 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command 
error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6325c5677c7baf607bd7e00984420b3b254a2b5f3dd777e252669862730b0092" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Dec 16 09:11:50 crc kubenswrapper[4823]: I1216 09:11:50.571461 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-f38e-account-create-update-kgfvh"] Dec 16 09:11:50 crc kubenswrapper[4823]: E1216 09:11:50.577586 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 81905a10f112a2c628e83243c3b5f4a8905df8cf52465b30f43adc5b1087cc6c is running failed: container process not found" containerID="81905a10f112a2c628e83243c3b5f4a8905df8cf52465b30f43adc5b1087cc6c" cmd=["/usr/bin/pidof","ovsdb-server"] Dec 16 09:11:50 crc kubenswrapper[4823]: I1216 09:11:50.578671 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodhf38e-account-delete-hxrkv"] Dec 16 09:11:50 crc kubenswrapper[4823]: E1216 09:11:50.579105 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 81905a10f112a2c628e83243c3b5f4a8905df8cf52465b30f43adc5b1087cc6c is running failed: container process not found" containerID="81905a10f112a2c628e83243c3b5f4a8905df8cf52465b30f43adc5b1087cc6c" cmd=["/usr/bin/pidof","ovsdb-server"] Dec 16 09:11:50 crc kubenswrapper[4823]: E1216 09:11:50.585158 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 81905a10f112a2c628e83243c3b5f4a8905df8cf52465b30f43adc5b1087cc6c is running failed: container process not found" containerID="81905a10f112a2c628e83243c3b5f4a8905df8cf52465b30f43adc5b1087cc6c" cmd=["/usr/bin/pidof","ovsdb-server"] Dec 16 09:11:50 crc kubenswrapper[4823]: E1216 09:11:50.585223 4823 prober.go:104] "Probe errored" 
err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 81905a10f112a2c628e83243c3b5f4a8905df8cf52465b30f43adc5b1087cc6c is running failed: container process not found" probeType="Readiness" pod="openstack/ovsdbserver-sb-0" podUID="6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8" containerName="ovsdbserver-sb" Dec 16 09:11:50 crc kubenswrapper[4823]: I1216 09:11:50.598089 4823 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-d656d958d-tmzmp" podUID="0b8f2d6b-a3bc-4503-84f7-5d9eb1db17ec" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.1.40:9311/healthcheck\": read tcp 10.217.0.2:47694->10.217.1.40:9311: read: connection reset by peer" Dec 16 09:11:50 crc kubenswrapper[4823]: I1216 09:11:50.598308 4823 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-d656d958d-tmzmp" podUID="0b8f2d6b-a3bc-4503-84f7-5d9eb1db17ec" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.1.40:9311/healthcheck\": read tcp 10.217.0.2:47678->10.217.1.40:9311: read: connection reset by peer" Dec 16 09:11:50 crc kubenswrapper[4823]: E1216 09:11:50.598388 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6325c5677c7baf607bd7e00984420b3b254a2b5f3dd777e252669862730b0092" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Dec 16 09:11:50 crc kubenswrapper[4823]: E1216 09:11:50.598424 4823 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/heat-engine-64f85d9856-wwkd5" podUID="7a613891-fc01-4f69-97a8-63cccc00f4a5" containerName="heat-engine" Dec 16 09:11:50 crc kubenswrapper[4823]: I1216 09:11:50.623162 4823 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d676c2b-8cf1-4933-8f2b-641733d096fc-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "2d676c2b-8cf1-4933-8f2b-641733d096fc" (UID: "2d676c2b-8cf1-4933-8f2b-641733d096fc"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:11:50 crc kubenswrapper[4823]: I1216 09:11:50.623163 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/868b7d1a-5039-4d72-9a41-d8e57b1df5d4-nova-novncproxy-tls-certs" (OuterVolumeSpecName: "nova-novncproxy-tls-certs") pod "868b7d1a-5039-4d72-9a41-d8e57b1df5d4" (UID: "868b7d1a-5039-4d72-9a41-d8e57b1df5d4"). InnerVolumeSpecName "nova-novncproxy-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:11:50 crc kubenswrapper[4823]: I1216 09:11:50.660435 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d676c2b-8cf1-4933-8f2b-641733d096fc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2d676c2b-8cf1-4933-8f2b-641733d096fc" (UID: "2d676c2b-8cf1-4933-8f2b-641733d096fc"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:11:50 crc kubenswrapper[4823]: I1216 09:11:50.665682 4823 reconciler_common.go:293] "Volume detached for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/868b7d1a-5039-4d72-9a41-d8e57b1df5d4-nova-novncproxy-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:50 crc kubenswrapper[4823]: I1216 09:11:50.665710 4823 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d676c2b-8cf1-4933-8f2b-641733d096fc-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:50 crc kubenswrapper[4823]: I1216 09:11:50.665720 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d676c2b-8cf1-4933-8f2b-641733d096fc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:50 crc kubenswrapper[4823]: I1216 09:11:50.686642 4823 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-5948ddcb4-f5qgv" podUID="6d650b48-8848-4495-9b48-fdf7472cc19e" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.111:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.111:8443: connect: connection refused" Dec 16 09:11:50 crc kubenswrapper[4823]: I1216 09:11:50.961040 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-galera-0" podUID="0e511eaa-334a-4fe3-ab41-e66d4a53a931" containerName="galera" containerID="cri-o://721c47558f9af7bfce05a40ae981abe655f7282de605a46f281cb1818605ab71" gracePeriod=30 Dec 16 09:11:50 crc kubenswrapper[4823]: I1216 09:11:50.983420 4823 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="3c01b8d9-eaf3-4415-a1b7-c53cbfaa707a" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.100:8775/\": read tcp 10.217.0.2:35260->10.217.1.100:8775: read: connection reset by 
peer" Dec 16 09:11:50 crc kubenswrapper[4823]: I1216 09:11:50.983846 4823 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="3c01b8d9-eaf3-4415-a1b7-c53cbfaa707a" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.100:8775/\": read tcp 10.217.0.2:35262->10.217.1.100:8775: read: connection reset by peer" Dec 16 09:11:51 crc kubenswrapper[4823]: I1216 09:11:51.059452 4823 generic.go:334] "Generic (PLEG): container finished" podID="0b8f2d6b-a3bc-4503-84f7-5d9eb1db17ec" containerID="9a306cbeecf35df7308d1553cc064c30c8abbe4a6a369ff751b3831a552d0f27" exitCode=0 Dec 16 09:11:51 crc kubenswrapper[4823]: I1216 09:11:51.059514 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-d656d958d-tmzmp" event={"ID":"0b8f2d6b-a3bc-4503-84f7-5d9eb1db17ec","Type":"ContainerDied","Data":"9a306cbeecf35df7308d1553cc064c30c8abbe4a6a369ff751b3831a552d0f27"} Dec 16 09:11:51 crc kubenswrapper[4823]: I1216 09:11:51.065199 4823 generic.go:334] "Generic (PLEG): container finished" podID="341f00a5-410a-4656-876e-a6b0cfe2a4df" containerID="43f2f25511680e01631c9fea0525d1784511e8f0fbc8bdc295206a3b91483591" exitCode=0 Dec 16 09:11:51 crc kubenswrapper[4823]: I1216 09:11:51.065333 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-fcf4dff7-84zz6" event={"ID":"341f00a5-410a-4656-876e-a6b0cfe2a4df","Type":"ContainerDied","Data":"43f2f25511680e01631c9fea0525d1784511e8f0fbc8bdc295206a3b91483591"} Dec 16 09:11:51 crc kubenswrapper[4823]: I1216 09:11:51.075448 4823 generic.go:334] "Generic (PLEG): container finished" podID="7f35ecc1-21e4-461e-91d3-3da96745fed6" containerID="44afd549f065376806e3735489d6257a4793e59063b189217a6eecd50e0f1af0" exitCode=0 Dec 16 09:11:51 crc kubenswrapper[4823]: I1216 09:11:51.075529 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-8589448fc-qj569" 
event={"ID":"7f35ecc1-21e4-461e-91d3-3da96745fed6","Type":"ContainerDied","Data":"44afd549f065376806e3735489d6257a4793e59063b189217a6eecd50e0f1af0"} Dec 16 09:11:51 crc kubenswrapper[4823]: I1216 09:11:51.085820 4823 generic.go:334] "Generic (PLEG): container finished" podID="60956cfa-c484-445d-af87-52713ccf4d09" containerID="4b43c6a9df3e6ee0304d2e089fa50a0bfce77767a4afdf5ac55c13501c52cd9d" exitCode=0 Dec 16 09:11:51 crc kubenswrapper[4823]: I1216 09:11:51.085857 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"60956cfa-c484-445d-af87-52713ccf4d09","Type":"ContainerDied","Data":"4b43c6a9df3e6ee0304d2e089fa50a0bfce77767a4afdf5ac55c13501c52cd9d"} Dec 16 09:11:51 crc kubenswrapper[4823]: I1216 09:11:51.088597 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement2f37-account-delete-t22qt" event={"ID":"3be6f063-aed2-4468-9cd3-f7f03bd28211","Type":"ContainerDied","Data":"8105dba2683062040a896c190887f2594a024cde0b9aa17865595c93ede931ce"} Dec 16 09:11:51 crc kubenswrapper[4823]: I1216 09:11:51.088631 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8105dba2683062040a896c190887f2594a024cde0b9aa17865595c93ede931ce" Dec 16 09:11:51 crc kubenswrapper[4823]: I1216 09:11:51.091882 4823 generic.go:334] "Generic (PLEG): container finished" podID="7a50033a-9a6e-42e3-ac23-de2a24654b0f" containerID="721ddb3d721e21d50c0be2952ef296f0188553dcfea31e2f3a2d25c394c2d3b6" exitCode=0 Dec 16 09:11:51 crc kubenswrapper[4823]: I1216 09:11:51.091984 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-865d4cf8d6-bwj5n" event={"ID":"7a50033a-9a6e-42e3-ac23-de2a24654b0f","Type":"ContainerDied","Data":"721ddb3d721e21d50c0be2952ef296f0188553dcfea31e2f3a2d25c394c2d3b6"} Dec 16 09:11:51 crc kubenswrapper[4823]: I1216 09:11:51.094983 4823 generic.go:334] "Generic (PLEG): container finished" 
podID="3ee97b1f-ce61-45ef-97e1-642cc13ef521" containerID="9c2fb0e25d5692eb7a90933e0f8cf60671619d23ad83cd29dd61e8449d4f5dfe" exitCode=2 Dec 16 09:11:51 crc kubenswrapper[4823]: I1216 09:11:51.095123 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"3ee97b1f-ce61-45ef-97e1-642cc13ef521","Type":"ContainerDied","Data":"9c2fb0e25d5692eb7a90933e0f8cf60671619d23ad83cd29dd61e8449d4f5dfe"} Dec 16 09:11:51 crc kubenswrapper[4823]: I1216 09:11:51.112387 4823 generic.go:334] "Generic (PLEG): container finished" podID="f8b8d93d-24db-4382-9077-7404605c7cf1" containerID="9e972ac6360c9fcb45fd20ef40bf7e8972136fa235df75bc1dfcacbfb25e23ed" exitCode=0 Dec 16 09:11:51 crc kubenswrapper[4823]: I1216 09:11:51.112448 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-ddfd865c7-nhsh6" event={"ID":"f8b8d93d-24db-4382-9077-7404605c7cf1","Type":"ContainerDied","Data":"9e972ac6360c9fcb45fd20ef40bf7e8972136fa235df75bc1dfcacbfb25e23ed"} Dec 16 09:11:51 crc kubenswrapper[4823]: I1216 09:11:51.119782 4823 generic.go:334] "Generic (PLEG): container finished" podID="91f5097e-d643-4598-9d06-39f14f913291" containerID="77f7563a34f0a287066ce2a8f04f92118c66c8f3bccb6cfd97b6587b049219b8" exitCode=0 Dec 16 09:11:51 crc kubenswrapper[4823]: I1216 09:11:51.119902 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"91f5097e-d643-4598-9d06-39f14f913291","Type":"ContainerDied","Data":"77f7563a34f0a287066ce2a8f04f92118c66c8f3bccb6cfd97b6587b049219b8"} Dec 16 09:11:51 crc kubenswrapper[4823]: I1216 09:11:51.123121 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2a40068b-87bc-4af6-862d-ad33696041b3","Type":"ContainerDied","Data":"cba74e4e324808c756477e9c3bf47e48dc1558ff0c47c4d8b8cd61d64d6ad973"} Dec 16 09:11:51 crc kubenswrapper[4823]: I1216 09:11:51.127458 4823 generic.go:334] "Generic (PLEG): container finished" 
podID="2a40068b-87bc-4af6-862d-ad33696041b3" containerID="cba74e4e324808c756477e9c3bf47e48dc1558ff0c47c4d8b8cd61d64d6ad973" exitCode=0 Dec 16 09:11:51 crc kubenswrapper[4823]: I1216 09:11:51.130193 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodhf38e-account-delete-hxrkv" event={"ID":"6ecabaef-9422-4e5c-bf83-df3b523b8fa7","Type":"ContainerStarted","Data":"31913fa281143c1606422328777000ca5e5453f2293c31a874dc34f60925d2e3"} Dec 16 09:11:51 crc kubenswrapper[4823]: I1216 09:11:51.130762 4823 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack/aodhf38e-account-delete-hxrkv" secret="" err="secret \"galera-openstack-dockercfg-8t746\" not found" Dec 16 09:11:51 crc kubenswrapper[4823]: I1216 09:11:51.140435 4823 generic.go:334] "Generic (PLEG): container finished" podID="44c54ba6-36e8-4608-ab54-965ab4bdcef2" containerID="651f28a2c721b5b4308bee72f9032a131e2c7f4a064121891960b81b54b65133" exitCode=0 Dec 16 09:11:51 crc kubenswrapper[4823]: I1216 09:11:51.140503 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7454ff977b-h6fwh" event={"ID":"44c54ba6-36e8-4608-ab54-965ab4bdcef2","Type":"ContainerDied","Data":"651f28a2c721b5b4308bee72f9032a131e2c7f4a064121891960b81b54b65133"} Dec 16 09:11:51 crc kubenswrapper[4823]: I1216 09:11:51.154140 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodhf38e-account-delete-hxrkv" podStartSLOduration=6.154120999 podStartE2EDuration="6.154120999s" podCreationTimestamp="2025-12-16 09:11:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 09:11:51.146092898 +0000 UTC m=+8189.634659031" watchObservedRunningTime="2025-12-16 09:11:51.154120999 +0000 UTC m=+8189.642687122" Dec 16 09:11:51 crc kubenswrapper[4823]: I1216 09:11:51.154921 4823 generic.go:334] "Generic (PLEG): container finished" 
podID="6d650b48-8848-4495-9b48-fdf7472cc19e" containerID="609b29305d1b8337e75912b2d68079a181e3de1bec30ac81db18f27ffacc478c" exitCode=0 Dec 16 09:11:51 crc kubenswrapper[4823]: I1216 09:11:51.154995 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5948ddcb4-f5qgv" event={"ID":"6d650b48-8848-4495-9b48-fdf7472cc19e","Type":"ContainerDied","Data":"609b29305d1b8337e75912b2d68079a181e3de1bec30ac81db18f27ffacc478c"} Dec 16 09:11:51 crc kubenswrapper[4823]: I1216 09:11:51.156933 4823 generic.go:334] "Generic (PLEG): container finished" podID="9fd92bc3-eaf0-4217-bcac-dd8f41db9edf" containerID="f0c094f39df35eb8a22e0462a3cdfe3e11e037d6479ef37e2519d6111a7eadb9" exitCode=0 Dec 16 09:11:51 crc kubenswrapper[4823]: I1216 09:11:51.157047 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"9fd92bc3-eaf0-4217-bcac-dd8f41db9edf","Type":"ContainerDied","Data":"f0c094f39df35eb8a22e0462a3cdfe3e11e037d6479ef37e2519d6111a7eadb9"} Dec 16 09:11:51 crc kubenswrapper[4823]: I1216 09:11:51.159008 4823 generic.go:334] "Generic (PLEG): container finished" podID="91080e73-6479-4c8b-bb2f-decdc0ade67e" containerID="dda05eb58431d1ff8385982543bd81d7679e91490c736039436d4fcfce053345" exitCode=0 Dec 16 09:11:51 crc kubenswrapper[4823]: I1216 09:11:51.159045 4823 generic.go:334] "Generic (PLEG): container finished" podID="91080e73-6479-4c8b-bb2f-decdc0ade67e" containerID="d6a5d1a1c5e1f237851cc347fa709cbcd9a412a8a2d03acbfbedbaeae74c63c3" exitCode=2 Dec 16 09:11:51 crc kubenswrapper[4823]: I1216 09:11:51.159223 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"91080e73-6479-4c8b-bb2f-decdc0ade67e","Type":"ContainerDied","Data":"dda05eb58431d1ff8385982543bd81d7679e91490c736039436d4fcfce053345"} Dec 16 09:11:51 crc kubenswrapper[4823]: I1216 09:11:51.159250 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"91080e73-6479-4c8b-bb2f-decdc0ade67e","Type":"ContainerDied","Data":"d6a5d1a1c5e1f237851cc347fa709cbcd9a412a8a2d03acbfbedbaeae74c63c3"} Dec 16 09:11:51 crc kubenswrapper[4823]: E1216 09:11:51.181880 4823 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 16 09:11:51 crc kubenswrapper[4823]: E1216 09:11:51.181983 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6ecabaef-9422-4e5c-bf83-df3b523b8fa7-operator-scripts podName:6ecabaef-9422-4e5c-bf83-df3b523b8fa7 nodeName:}" failed. No retries permitted until 2025-12-16 09:11:51.68195631 +0000 UTC m=+8190.170522503 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/6ecabaef-9422-4e5c-bf83-df3b523b8fa7-operator-scripts") pod "aodhf38e-account-delete-hxrkv" (UID: "6ecabaef-9422-4e5c-bf83-df3b523b8fa7") : configmap "openstack-scripts" not found Dec 16 09:11:51 crc kubenswrapper[4823]: E1216 09:11:51.386363 4823 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Dec 16 09:11:51 crc kubenswrapper[4823]: E1216 09:11:51.386615 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/bf14ab2c-212b-406f-b102-2a4b8a7a29f5-config-data podName:bf14ab2c-212b-406f-b102-2a4b8a7a29f5 nodeName:}" failed. No retries permitted until 2025-12-16 09:11:59.386600645 +0000 UTC m=+8197.875166768 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/bf14ab2c-212b-406f-b102-2a4b8a7a29f5-config-data") pod "rabbitmq-cell1-server-0" (UID: "bf14ab2c-212b-406f-b102-2a4b8a7a29f5") : configmap "rabbitmq-cell1-config-data" not found Dec 16 09:11:51 crc kubenswrapper[4823]: I1216 09:11:51.598172 4823 scope.go:117] "RemoveContainer" containerID="e3714187f4fd4a54fbc4a0c088bda82d8215a431696595f42b4fd6b49fdb78e3" Dec 16 09:11:51 crc kubenswrapper[4823]: I1216 09:11:51.632078 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Dec 16 09:11:51 crc kubenswrapper[4823]: E1216 09:11:51.633949 4823 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3ee97b1f_ce61_45ef_97e1_642cc13ef521.slice/crio-9c2fb0e25d5692eb7a90933e0f8cf60671619d23ad83cd29dd61e8449d4f5dfe.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91080e73_6479_4c8b_bb2f_decdc0ade67e.slice/crio-conmon-d6a5d1a1c5e1f237851cc347fa709cbcd9a412a8a2d03acbfbedbaeae74c63c3.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2a40068b_87bc_4af6_862d_ad33696041b3.slice/crio-cba74e4e324808c756477e9c3bf47e48dc1558ff0c47c4d8b8cd61d64d6ad973.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod60956cfa_c484_445d_af87_52713ccf4d09.slice/crio-4b43c6a9df3e6ee0304d2e089fa50a0bfce77767a4afdf5ac55c13501c52cd9d.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2a40068b_87bc_4af6_862d_ad33696041b3.slice/crio-conmon-cba74e4e324808c756477e9c3bf47e48dc1558ff0c47c4d8b8cd61d64d6ad973.scope\": RecentStats: 
unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod44c54ba6_36e8_4608_ab54_965ab4bdcef2.slice/crio-conmon-651f28a2c721b5b4308bee72f9032a131e2c7f4a064121891960b81b54b65133.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91080e73_6479_4c8b_bb2f_decdc0ade67e.slice/crio-d6a5d1a1c5e1f237851cc347fa709cbcd9a412a8a2d03acbfbedbaeae74c63c3.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod60956cfa_c484_445d_af87_52713ccf4d09.slice/crio-conmon-4b43c6a9df3e6ee0304d2e089fa50a0bfce77767a4afdf5ac55c13501c52cd9d.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2d676c2b_8cf1_4933_8f2b_641733d096fc.slice\": RecentStats: unable to find data in memory cache]" Dec 16 09:11:51 crc kubenswrapper[4823]: I1216 09:11:51.649514 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-5bbf6c4b7b-7qpq6"] Dec 16 09:11:51 crc kubenswrapper[4823]: I1216 09:11:51.656542 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-proxy-5bbf6c4b7b-7qpq6"] Dec 16 09:11:51 crc kubenswrapper[4823]: I1216 09:11:51.695475 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/6ebc4a0e-1b85-400b-bc10-5d216d7431fb-cluster-tls-config\") pod \"6ebc4a0e-1b85-400b-bc10-5d216d7431fb\" (UID: \"6ebc4a0e-1b85-400b-bc10-5d216d7431fb\") " Dec 16 09:11:51 crc kubenswrapper[4823]: I1216 09:11:51.696004 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6ebc4a0e-1b85-400b-bc10-5d216d7431fb-web-config\") pod \"6ebc4a0e-1b85-400b-bc10-5d216d7431fb\" (UID: \"6ebc4a0e-1b85-400b-bc10-5d216d7431fb\") " Dec 16 09:11:51 crc 
kubenswrapper[4823]: I1216 09:11:51.696092 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6ebc4a0e-1b85-400b-bc10-5d216d7431fb-config-out\") pod \"6ebc4a0e-1b85-400b-bc10-5d216d7431fb\" (UID: \"6ebc4a0e-1b85-400b-bc10-5d216d7431fb\") " Dec 16 09:11:51 crc kubenswrapper[4823]: I1216 09:11:51.696169 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6ebc4a0e-1b85-400b-bc10-5d216d7431fb-tls-assets\") pod \"6ebc4a0e-1b85-400b-bc10-5d216d7431fb\" (UID: \"6ebc4a0e-1b85-400b-bc10-5d216d7431fb\") " Dec 16 09:11:51 crc kubenswrapper[4823]: I1216 09:11:51.696204 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqzsw\" (UniqueName: \"kubernetes.io/projected/6ebc4a0e-1b85-400b-bc10-5d216d7431fb-kube-api-access-fqzsw\") pod \"6ebc4a0e-1b85-400b-bc10-5d216d7431fb\" (UID: \"6ebc4a0e-1b85-400b-bc10-5d216d7431fb\") " Dec 16 09:11:51 crc kubenswrapper[4823]: I1216 09:11:51.696261 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/6ebc4a0e-1b85-400b-bc10-5d216d7431fb-alertmanager-metric-storage-db\") pod \"6ebc4a0e-1b85-400b-bc10-5d216d7431fb\" (UID: \"6ebc4a0e-1b85-400b-bc10-5d216d7431fb\") " Dec 16 09:11:51 crc kubenswrapper[4823]: I1216 09:11:51.696307 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/6ebc4a0e-1b85-400b-bc10-5d216d7431fb-config-volume\") pod \"6ebc4a0e-1b85-400b-bc10-5d216d7431fb\" (UID: \"6ebc4a0e-1b85-400b-bc10-5d216d7431fb\") " Dec 16 09:11:51 crc kubenswrapper[4823]: I1216 09:11:51.697074 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/6ebc4a0e-1b85-400b-bc10-5d216d7431fb-alertmanager-metric-storage-db" (OuterVolumeSpecName: "alertmanager-metric-storage-db") pod "6ebc4a0e-1b85-400b-bc10-5d216d7431fb" (UID: "6ebc4a0e-1b85-400b-bc10-5d216d7431fb"). InnerVolumeSpecName "alertmanager-metric-storage-db". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 09:11:51 crc kubenswrapper[4823]: E1216 09:11:51.697102 4823 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 16 09:11:51 crc kubenswrapper[4823]: E1216 09:11:51.697210 4823 secret.go:188] Couldn't get secret openstack/heat-config-data: secret "heat-config-data" not found Dec 16 09:11:51 crc kubenswrapper[4823]: E1216 09:11:51.697200 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6ecabaef-9422-4e5c-bf83-df3b523b8fa7-operator-scripts podName:6ecabaef-9422-4e5c-bf83-df3b523b8fa7 nodeName:}" failed. No retries permitted until 2025-12-16 09:11:52.697180155 +0000 UTC m=+8191.185746358 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/6ecabaef-9422-4e5c-bf83-df3b523b8fa7-operator-scripts") pod "aodhf38e-account-delete-hxrkv" (UID: "6ecabaef-9422-4e5c-bf83-df3b523b8fa7") : configmap "openstack-scripts" not found Dec 16 09:11:51 crc kubenswrapper[4823]: E1216 09:11:51.697320 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8b8d93d-24db-4382-9077-7404605c7cf1-config-data podName:f8b8d93d-24db-4382-9077-7404605c7cf1 nodeName:}" failed. No retries permitted until 2025-12-16 09:11:59.697297919 +0000 UTC m=+8198.185864112 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/f8b8d93d-24db-4382-9077-7404605c7cf1-config-data") pod "heat-api-ddfd865c7-nhsh6" (UID: "f8b8d93d-24db-4382-9077-7404605c7cf1") : secret "heat-config-data" not found Dec 16 09:11:51 crc kubenswrapper[4823]: I1216 09:11:51.702386 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ebc4a0e-1b85-400b-bc10-5d216d7431fb-config-volume" (OuterVolumeSpecName: "config-volume") pod "6ebc4a0e-1b85-400b-bc10-5d216d7431fb" (UID: "6ebc4a0e-1b85-400b-bc10-5d216d7431fb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:11:51 crc kubenswrapper[4823]: I1216 09:11:51.728870 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ebc4a0e-1b85-400b-bc10-5d216d7431fb-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "6ebc4a0e-1b85-400b-bc10-5d216d7431fb" (UID: "6ebc4a0e-1b85-400b-bc10-5d216d7431fb"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:11:51 crc kubenswrapper[4823]: I1216 09:11:51.746167 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ebc4a0e-1b85-400b-bc10-5d216d7431fb-kube-api-access-fqzsw" (OuterVolumeSpecName: "kube-api-access-fqzsw") pod "6ebc4a0e-1b85-400b-bc10-5d216d7431fb" (UID: "6ebc4a0e-1b85-400b-bc10-5d216d7431fb"). InnerVolumeSpecName "kube-api-access-fqzsw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:11:51 crc kubenswrapper[4823]: I1216 09:11:51.751287 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ebc4a0e-1b85-400b-bc10-5d216d7431fb-config-out" (OuterVolumeSpecName: "config-out") pod "6ebc4a0e-1b85-400b-bc10-5d216d7431fb" (UID: "6ebc4a0e-1b85-400b-bc10-5d216d7431fb"). InnerVolumeSpecName "config-out". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 09:11:51 crc kubenswrapper[4823]: I1216 09:11:51.798900 4823 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6ebc4a0e-1b85-400b-bc10-5d216d7431fb-tls-assets\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:51 crc kubenswrapper[4823]: I1216 09:11:51.798930 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqzsw\" (UniqueName: \"kubernetes.io/projected/6ebc4a0e-1b85-400b-bc10-5d216d7431fb-kube-api-access-fqzsw\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:51 crc kubenswrapper[4823]: I1216 09:11:51.798941 4823 reconciler_common.go:293] "Volume detached for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/6ebc4a0e-1b85-400b-bc10-5d216d7431fb-alertmanager-metric-storage-db\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:51 crc kubenswrapper[4823]: I1216 09:11:51.798971 4823 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/6ebc4a0e-1b85-400b-bc10-5d216d7431fb-config-volume\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:51 crc kubenswrapper[4823]: I1216 09:11:51.798982 4823 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6ebc4a0e-1b85-400b-bc10-5d216d7431fb-config-out\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:51 crc kubenswrapper[4823]: I1216 09:11:51.816429 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05dfc2e3-71af-4150-a4ca-02b5629083ae" path="/var/lib/kubelet/pods/05dfc2e3-71af-4150-a4ca-02b5629083ae/volumes" Dec 16 09:11:51 crc kubenswrapper[4823]: I1216 09:11:51.817796 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0dd89a4f-2cd4-4458-bb77-a56113b28c38" path="/var/lib/kubelet/pods/0dd89a4f-2cd4-4458-bb77-a56113b28c38/volumes" Dec 16 09:11:51 crc kubenswrapper[4823]: I1216 09:11:51.818647 4823 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d676c2b-8cf1-4933-8f2b-641733d096fc" path="/var/lib/kubelet/pods/2d676c2b-8cf1-4933-8f2b-641733d096fc/volumes" Dec 16 09:11:51 crc kubenswrapper[4823]: I1216 09:11:51.820793 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64eb539a-acff-4e76-bdaa-db24b9abed39" path="/var/lib/kubelet/pods/64eb539a-acff-4e76-bdaa-db24b9abed39/volumes" Dec 16 09:11:51 crc kubenswrapper[4823]: I1216 09:11:51.821322 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99ce1c86-eccc-4f3c-b999-18774e823763" path="/var/lib/kubelet/pods/99ce1c86-eccc-4f3c-b999-18774e823763/volumes" Dec 16 09:11:51 crc kubenswrapper[4823]: I1216 09:11:51.821912 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9d47387-c96f-4154-be1c-eda89c0e2a77" path="/var/lib/kubelet/pods/f9d47387-c96f-4154-be1c-eda89c0e2a77/volumes" Dec 16 09:11:51 crc kubenswrapper[4823]: E1216 09:11:51.910159 4823 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Dec 16 09:11:51 crc kubenswrapper[4823]: E1216 09:11:51.910259 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7-config-data podName:cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7 nodeName:}" failed. No retries permitted until 2025-12-16 09:11:59.910236783 +0000 UTC m=+8198.398802906 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7-config-data") pod "rabbitmq-server-0" (UID: "cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7") : configmap "rabbitmq-config-data" not found Dec 16 09:11:51 crc kubenswrapper[4823]: I1216 09:11:51.918583 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ebc4a0e-1b85-400b-bc10-5d216d7431fb-web-config" (OuterVolumeSpecName: "web-config") pod "6ebc4a0e-1b85-400b-bc10-5d216d7431fb" (UID: "6ebc4a0e-1b85-400b-bc10-5d216d7431fb"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:11:51 crc kubenswrapper[4823]: I1216 09:11:51.925795 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ebc4a0e-1b85-400b-bc10-5d216d7431fb-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "6ebc4a0e-1b85-400b-bc10-5d216d7431fb" (UID: "6ebc4a0e-1b85-400b-bc10-5d216d7431fb"). InnerVolumeSpecName "cluster-tls-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.017248 4823 reconciler_common.go:293] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/6ebc4a0e-1b85-400b-bc10-5d216d7431fb-cluster-tls-config\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.017285 4823 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6ebc4a0e-1b85-400b-bc10-5d216d7431fb-web-config\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.176514 4823 generic.go:334] "Generic (PLEG): container finished" podID="c90aab28-60fa-4cdc-a89a-bd041351015d" containerID="f09c828395889bafb8967722bc9fe10e42c34bbfc0a893f1c82cb91b42750c4b" exitCode=0 Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.190179 4823 generic.go:334] "Generic (PLEG): container finished" podID="7835251f-9e66-445c-9581-0422195cdc2b" containerID="a5d1441362a86b2c2d9982c87441fe14ef4f629cfb562ad9b5a4d225170c8fcd" exitCode=1 Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.190891 4823 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openstack/novacell0da3b-account-delete-2w9zh" secret="" err="secret \"galera-openstack-dockercfg-8t746\" not found" Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.190938 4823 scope.go:117] "RemoveContainer" containerID="a5d1441362a86b2c2d9982c87441fe14ef4f629cfb562ad9b5a4d225170c8fcd" Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.195198 4823 generic.go:334] "Generic (PLEG): container finished" podID="3c01b8d9-eaf3-4415-a1b7-c53cbfaa707a" containerID="7ac81f3d4f6c3b985e28f223f6d2d8ddf14deedc2445d7c0e81bfa4724f713b5" exitCode=0 Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.199540 4823 generic.go:334] "Generic (PLEG): container finished" podID="25a0f697-45ab-48cd-b4e2-d5e8bcd3b725" containerID="5212d6dbaaa8dbff11343b20aba28ce9285acc49da6ca4a23b65acdd32b790c4" exitCode=1 Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.199896 4823 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack/novaapif251-account-delete-dhnjg" secret="" err="secret \"galera-openstack-dockercfg-8t746\" not found" Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.199925 4823 scope.go:117] "RemoveContainer" containerID="5212d6dbaaa8dbff11343b20aba28ce9285acc49da6ca4a23b65acdd32b790c4" Dec 16 09:11:52 crc kubenswrapper[4823]: E1216 09:11:52.221833 4823 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 16 09:11:52 crc kubenswrapper[4823]: E1216 09:11:52.221900 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7835251f-9e66-445c-9581-0422195cdc2b-operator-scripts podName:7835251f-9e66-445c-9581-0422195cdc2b nodeName:}" failed. No retries permitted until 2025-12-16 09:11:52.721882626 +0000 UTC m=+8191.210448749 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/7835251f-9e66-445c-9581-0422195cdc2b-operator-scripts") pod "novacell0da3b-account-delete-2w9zh" (UID: "7835251f-9e66-445c-9581-0422195cdc2b") : configmap "openstack-scripts" not found Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.230370 4823 generic.go:334] "Generic (PLEG): container finished" podID="1422dc66-68e5-403d-9e01-657d83772587" containerID="f84baa8a3ccd8709b5e794391f08e7c7bb93715323032cfa80e8f8e30a77c27c" exitCode=1 Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.231463 4823 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack/barbican75d9-account-delete-x7xds" secret="" err="secret \"galera-openstack-dockercfg-8t746\" not found" Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.231530 4823 scope.go:117] "RemoveContainer" containerID="f84baa8a3ccd8709b5e794391f08e7c7bb93715323032cfa80e8f8e30a77c27c" Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.268657 4823 generic.go:334] "Generic (PLEG): container finished" podID="91080e73-6479-4c8b-bb2f-decdc0ade67e" containerID="18f3ebcfecc58f7fe77b00fc4dfa8cfa84e702063a21d50902de680197bc2807" exitCode=0 Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.273776 4823 generic.go:334] "Generic (PLEG): container finished" podID="cffdbd32-0155-4dd0-897d-9e406fd5e2ee" containerID="03aaea60579a32dbd22e959a4c109e38b799c758c8b0d9ef37082c0af8297906" exitCode=0 Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.278563 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodhf38e-account-delete-hxrkv" podUID="6ecabaef-9422-4e5c-bf83-df3b523b8fa7" containerName="mariadb-account-delete" containerID="cri-o://31913fa281143c1606422328777000ca5e5453f2293c31a874dc34f60925d2e3" gracePeriod=30 Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.278902 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Dec 16 09:11:52 crc kubenswrapper[4823]: E1216 09:11:52.324330 4823 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 16 09:11:52 crc kubenswrapper[4823]: E1216 09:11:52.324407 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1422dc66-68e5-403d-9e01-657d83772587-operator-scripts podName:1422dc66-68e5-403d-9e01-657d83772587 nodeName:}" failed. No retries permitted until 2025-12-16 09:11:52.824388835 +0000 UTC m=+8191.312954958 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/1422dc66-68e5-403d-9e01-657d83772587-operator-scripts") pod "barbican75d9-account-delete-x7xds" (UID: "1422dc66-68e5-403d-9e01-657d83772587") : configmap "openstack-scripts" not found Dec 16 09:11:52 crc kubenswrapper[4823]: E1216 09:11:52.324662 4823 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 16 09:11:52 crc kubenswrapper[4823]: E1216 09:11:52.324692 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/25a0f697-45ab-48cd-b4e2-d5e8bcd3b725-operator-scripts podName:25a0f697-45ab-48cd-b4e2-d5e8bcd3b725 nodeName:}" failed. No retries permitted until 2025-12-16 09:11:52.824683064 +0000 UTC m=+8191.313249277 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/25a0f697-45ab-48cd-b4e2-d5e8bcd3b725-operator-scripts") pod "novaapif251-account-delete-dhnjg" (UID: "25a0f697-45ab-48cd-b4e2-d5e8bcd3b725") : configmap "openstack-scripts" not found Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.412149 4823 scope.go:117] "RemoveContainer" containerID="f591fd2607a4fd35339882b147b37bb251be764b4a5f7303532620e154218301" Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.422657 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2a40068b-87bc-4af6-862d-ad33696041b3","Type":"ContainerDied","Data":"77efb9a2ed0b266ccfd4ab4761eb7b41df522e2be97e7a859ec7a551cd547d40"} Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.422707 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="77efb9a2ed0b266ccfd4ab4761eb7b41df522e2be97e7a859ec7a551cd547d40" Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.422738 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"c90aab28-60fa-4cdc-a89a-bd041351015d","Type":"ContainerDied","Data":"f09c828395889bafb8967722bc9fe10e42c34bbfc0a893f1c82cb91b42750c4b"} Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.422764 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell12d63-account-delete-8c88v" event={"ID":"4da7ae09-cc1d-4f42-b1be-7045236d12e9","Type":"ContainerDied","Data":"4c0915f59df3483bd25df0c2cabf92b6f536576bc00a55bf3ef588b9efc4843a"} Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.422782 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4c0915f59df3483bd25df0c2cabf92b6f536576bc00a55bf3ef588b9efc4843a" Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.422803 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"60956cfa-c484-445d-af87-52713ccf4d09","Type":"ContainerDied","Data":"11521121ebfde6521f17a69f42825d1def922a1e72b4028195b3d3d4fd35c7ff"} Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.422819 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="11521121ebfde6521f17a69f42825d1def922a1e72b4028195b3d3d4fd35c7ff" Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.422834 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutronc394-account-delete-lkcfh" event={"ID":"9b75df4d-61d8-4913-bea9-018339e8e2a8","Type":"ContainerDied","Data":"8a0efda60bcc4f8c2ac4c1cbe3bed62a6fa15174b36b9100746470ae259f0204"} Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.422851 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8a0efda60bcc4f8c2ac4c1cbe3bed62a6fa15174b36b9100746470ae259f0204" Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.422862 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"91f5097e-d643-4598-9d06-39f14f913291","Type":"ContainerDied","Data":"20086cec8f2868089f12b793bf9b74ce7e74dea7026f0198636653acaf62d2c2"} Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.422879 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="20086cec8f2868089f12b793bf9b74ce7e74dea7026f0198636653acaf62d2c2" Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.422891 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"9fd92bc3-eaf0-4217-bcac-dd8f41db9edf","Type":"ContainerDied","Data":"8a269b71857456478d8798472fe05b6a0f92634ee338462d971404821b95ca84"} Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.422906 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8a269b71857456478d8798472fe05b6a0f92634ee338462d971404821b95ca84" Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 
09:11:52.422916 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell0da3b-account-delete-2w9zh" event={"ID":"7835251f-9e66-445c-9581-0422195cdc2b","Type":"ContainerDied","Data":"a5d1441362a86b2c2d9982c87441fe14ef4f629cfb562ad9b5a4d225170c8fcd"} Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.422931 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3c01b8d9-eaf3-4415-a1b7-c53cbfaa707a","Type":"ContainerDied","Data":"7ac81f3d4f6c3b985e28f223f6d2d8ddf14deedc2445d7c0e81bfa4724f713b5"} Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.422948 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3c01b8d9-eaf3-4415-a1b7-c53cbfaa707a","Type":"ContainerDied","Data":"e97823a6994d76dcbaa45e655367561d28af036bedcffbbf1fdcd8c984ca3f0d"} Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.422959 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e97823a6994d76dcbaa45e655367561d28af036bedcffbbf1fdcd8c984ca3f0d" Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.422969 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novaapif251-account-delete-dhnjg" event={"ID":"25a0f697-45ab-48cd-b4e2-d5e8bcd3b725","Type":"ContainerDied","Data":"5212d6dbaaa8dbff11343b20aba28ce9285acc49da6ca4a23b65acdd32b790c4"} Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.422983 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-865d4cf8d6-bwj5n" event={"ID":"7a50033a-9a6e-42e3-ac23-de2a24654b0f","Type":"ContainerDied","Data":"ad028104b191f0524983431165c5f6dc3558f2981b9dea3f4cdb00404867668b"} Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.422997 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ad028104b191f0524983431165c5f6dc3558f2981b9dea3f4cdb00404867668b" Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 
09:11:52.423008 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"3ee97b1f-ce61-45ef-97e1-642cc13ef521","Type":"ContainerDied","Data":"50fa1656455665eb191592ec56027f7b91faa2b2f26b2a004c1a13b50e4fa632"} Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.423040 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="50fa1656455665eb191592ec56027f7b91faa2b2f26b2a004c1a13b50e4fa632" Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.423052 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-fcf4dff7-84zz6" event={"ID":"341f00a5-410a-4656-876e-a6b0cfe2a4df","Type":"ContainerDied","Data":"953044d58368dd6dfc4f7bd3dae87ac1eba4376c14f5723e22ec576d8c8737ea"} Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.423066 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="953044d58368dd6dfc4f7bd3dae87ac1eba4376c14f5723e22ec576d8c8737ea" Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.423078 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican75d9-account-delete-x7xds" event={"ID":"1422dc66-68e5-403d-9e01-657d83772587","Type":"ContainerDied","Data":"f84baa8a3ccd8709b5e794391f08e7c7bb93715323032cfa80e8f8e30a77c27c"} Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.423092 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-8589448fc-qj569" event={"ID":"7f35ecc1-21e4-461e-91d3-3da96745fed6","Type":"ContainerDied","Data":"a02a72e5bc1993985df5b00f6e12bac6bfccaa68ebd6fc0301f5b7f0313e55db"} Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.423107 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a02a72e5bc1993985df5b00f6e12bac6bfccaa68ebd6fc0301f5b7f0313e55db" Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.423118 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/heatf1cb-account-delete-2mdnl" event={"ID":"d6361c12-5d54-4919-aafe-4ac9b88e8c20","Type":"ContainerDied","Data":"366cc0daa1c0f0f30d057976cd5ee4c1587a201ce4e55778ce68198186eb5825"} Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.423131 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="366cc0daa1c0f0f30d057976cd5ee4c1587a201ce4e55778ce68198186eb5825" Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.423142 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7454ff977b-h6fwh" event={"ID":"44c54ba6-36e8-4608-ab54-965ab4bdcef2","Type":"ContainerDied","Data":"f799eb9828876f8d4b4204046816d340c29dff9c3e1fb6d7bfc6a096535e7ee0"} Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.423154 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f799eb9828876f8d4b4204046816d340c29dff9c3e1fb6d7bfc6a096535e7ee0" Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.423166 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"91080e73-6479-4c8b-bb2f-decdc0ade67e","Type":"ContainerDied","Data":"18f3ebcfecc58f7fe77b00fc4dfa8cfa84e702063a21d50902de680197bc2807"} Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.423215 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-d656d958d-tmzmp" event={"ID":"0b8f2d6b-a3bc-4503-84f7-5d9eb1db17ec","Type":"ContainerDied","Data":"a0d0509f348f194d87474c81cc58092e35c1372cd79eb70d60ee75b795c75b95"} Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.423230 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a0d0509f348f194d87474c81cc58092e35c1372cd79eb70d60ee75b795c75b95" Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.423242 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"cffdbd32-0155-4dd0-897d-9e406fd5e2ee","Type":"ContainerDied","Data":"03aaea60579a32dbd22e959a4c109e38b799c758c8b0d9ef37082c0af8297906"} Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.423257 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"cffdbd32-0155-4dd0-897d-9e406fd5e2ee","Type":"ContainerDied","Data":"33a6413e072e3077978a096fa163d1cdb4054acf5bfd1877efc9c4bde951a128"} Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.423268 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="33a6413e072e3077978a096fa163d1cdb4054acf5bfd1877efc9c4bde951a128" Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.423279 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-ddfd865c7-nhsh6" event={"ID":"f8b8d93d-24db-4382-9077-7404605c7cf1","Type":"ContainerDied","Data":"7ef8314d95f6bf55c632d1c16580e1cd82d6d9038b9cb26fdf59a1511cbce48b"} Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.423293 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7ef8314d95f6bf55c632d1c16580e1cd82d6d9038b9cb26fdf59a1511cbce48b" Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.423305 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance4d28-account-delete-cb8qx" event={"ID":"5729be98-e3b4-42bd-92a6-913d63da1de3","Type":"ContainerDied","Data":"1c350b1494002b5a30d77ad31609f784399a826474e865f7bd12ac6cc3e1aa4c"} Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.423317 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1c350b1494002b5a30d77ad31609f784399a826474e865f7bd12ac6cc3e1aa4c" Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.423328 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinderea34-account-delete-pjmhc" 
event={"ID":"d9880fe3-977f-473b-84c9-2cb6f65d588d","Type":"ContainerDied","Data":"654c4d8ede8217ced8f3e0dfc87eb2327a0e63180f5fd8d72925ccda0f665bb4"} Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.423345 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="654c4d8ede8217ced8f3e0dfc87eb2327a0e63180f5fd8d72925ccda0f665bb4" Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.427658 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.428904 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8/ovsdbserver-sb/0.log" Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.429442 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.440229 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 16 09:11:52 crc kubenswrapper[4823]: E1216 09:11:52.459482 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0fc4d511ab4123720484006b298f65a27f5b85e3777d75783c5ffe138cedc8aa" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 16 09:11:52 crc kubenswrapper[4823]: E1216 09:11:52.462500 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0fc4d511ab4123720484006b298f65a27f5b85e3777d75783c5ffe138cedc8aa" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.468761 4823 scope.go:117] "RemoveContainer" 
containerID="57d0046d03d662cbe48ec832a46a5f0d40cf1316633fc87176e963b49f32c392" Dec 16 09:11:52 crc kubenswrapper[4823]: E1216 09:11:52.480325 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0fc4d511ab4123720484006b298f65a27f5b85e3777d75783c5ffe138cedc8aa" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 16 09:11:52 crc kubenswrapper[4823]: E1216 09:11:52.480392 4823 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="a3bed8d9-6a2c-458b-a7d5-d6d28ac021e7" containerName="nova-cell0-conductor-conductor" Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.486825 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.490676 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.494368 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.496324 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.502282 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.529185 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8-config\") pod \"6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8\" (UID: \"6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8\") " Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.529232 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f058bf18-c31d-4b48-a183-bb9ae9223fbe-tls-assets\") pod \"f058bf18-c31d-4b48-a183-bb9ae9223fbe\" (UID: \"f058bf18-c31d-4b48-a183-bb9ae9223fbe\") " Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.529274 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d06b91f8-1fcd-40fe-b712-0549d99258c6-public-tls-certs\") pod \"d06b91f8-1fcd-40fe-b712-0549d99258c6\" (UID: \"d06b91f8-1fcd-40fe-b712-0549d99258c6\") " Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.529312 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8-scripts\") pod \"6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8\" (UID: \"6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8\") " Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.529334 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/f058bf18-c31d-4b48-a183-bb9ae9223fbe-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"f058bf18-c31d-4b48-a183-bb9ae9223fbe\" (UID: \"f058bf18-c31d-4b48-a183-bb9ae9223fbe\") " Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.529358 4823 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f058bf18-c31d-4b48-a183-bb9ae9223fbe-web-config\") pod \"f058bf18-c31d-4b48-a183-bb9ae9223fbe\" (UID: \"f058bf18-c31d-4b48-a183-bb9ae9223fbe\") " Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.529381 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8-ovsdbserver-sb-tls-certs\") pod \"6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8\" (UID: \"6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8\") " Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.529404 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d06b91f8-1fcd-40fe-b712-0549d99258c6-logs\") pod \"d06b91f8-1fcd-40fe-b712-0549d99258c6\" (UID: \"d06b91f8-1fcd-40fe-b712-0549d99258c6\") " Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.529428 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8njnm\" (UniqueName: \"kubernetes.io/projected/6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8-kube-api-access-8njnm\") pod \"6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8\" (UID: \"6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8\") " Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.529463 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f60cf52-47f0-4efd-8479-64bcc13848cf-config-data\") pod \"6f60cf52-47f0-4efd-8479-64bcc13848cf\" (UID: \"6f60cf52-47f0-4efd-8479-64bcc13848cf\") " Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.529479 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f60cf52-47f0-4efd-8479-64bcc13848cf-combined-ca-bundle\") pod \"6f60cf52-47f0-4efd-8479-64bcc13848cf\" (UID: 
\"6f60cf52-47f0-4efd-8479-64bcc13848cf\") " Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.530942 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3de6a60f-97e5-4309-9362-0b3562144f2b\") pod \"6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8\" (UID: \"6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8\") " Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.531036 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8-ovsdb-rundir\") pod \"6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8\" (UID: \"6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8\") " Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.531078 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8-combined-ca-bundle\") pod \"6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8\" (UID: \"6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8\") " Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.531097 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d06b91f8-1fcd-40fe-b712-0549d99258c6-httpd-run\") pod \"d06b91f8-1fcd-40fe-b712-0549d99258c6\" (UID: \"d06b91f8-1fcd-40fe-b712-0549d99258c6\") " Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.531115 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4v5zd\" (UniqueName: \"kubernetes.io/projected/f058bf18-c31d-4b48-a183-bb9ae9223fbe-kube-api-access-4v5zd\") pod \"f058bf18-c31d-4b48-a183-bb9ae9223fbe\" (UID: \"f058bf18-c31d-4b48-a183-bb9ae9223fbe\") " Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.531148 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/secret/f058bf18-c31d-4b48-a183-bb9ae9223fbe-config\") pod \"f058bf18-c31d-4b48-a183-bb9ae9223fbe\" (UID: \"f058bf18-c31d-4b48-a183-bb9ae9223fbe\") " Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.531174 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d06b91f8-1fcd-40fe-b712-0549d99258c6-combined-ca-bundle\") pod \"d06b91f8-1fcd-40fe-b712-0549d99258c6\" (UID: \"d06b91f8-1fcd-40fe-b712-0549d99258c6\") " Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.531200 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wr68m\" (UniqueName: \"kubernetes.io/projected/d06b91f8-1fcd-40fe-b712-0549d99258c6-kube-api-access-wr68m\") pod \"d06b91f8-1fcd-40fe-b712-0549d99258c6\" (UID: \"d06b91f8-1fcd-40fe-b712-0549d99258c6\") " Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.531236 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f058bf18-c31d-4b48-a183-bb9ae9223fbe-thanos-prometheus-http-client-file\") pod \"f058bf18-c31d-4b48-a183-bb9ae9223fbe\" (UID: \"f058bf18-c31d-4b48-a183-bb9ae9223fbe\") " Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.531282 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8-metrics-certs-tls-certs\") pod \"6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8\" (UID: \"6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8\") " Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.531313 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: 
\"kubernetes.io/secret/f058bf18-c31d-4b48-a183-bb9ae9223fbe-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"f058bf18-c31d-4b48-a183-bb9ae9223fbe\" (UID: \"f058bf18-c31d-4b48-a183-bb9ae9223fbe\") " Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.531394 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b03e7a37-2b40-4cde-917a-642392a8adbd\") pod \"f058bf18-c31d-4b48-a183-bb9ae9223fbe\" (UID: \"f058bf18-c31d-4b48-a183-bb9ae9223fbe\") " Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.531421 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d06b91f8-1fcd-40fe-b712-0549d99258c6-config-data\") pod \"d06b91f8-1fcd-40fe-b712-0549d99258c6\" (UID: \"d06b91f8-1fcd-40fe-b712-0549d99258c6\") " Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.531463 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f058bf18-c31d-4b48-a183-bb9ae9223fbe-secret-combined-ca-bundle\") pod \"f058bf18-c31d-4b48-a183-bb9ae9223fbe\" (UID: \"f058bf18-c31d-4b48-a183-bb9ae9223fbe\") " Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.531540 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f058bf18-c31d-4b48-a183-bb9ae9223fbe-config-out\") pod \"f058bf18-c31d-4b48-a183-bb9ae9223fbe\" (UID: \"f058bf18-c31d-4b48-a183-bb9ae9223fbe\") " Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.531574 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f058bf18-c31d-4b48-a183-bb9ae9223fbe-prometheus-metric-storage-rulefiles-0\") pod 
\"f058bf18-c31d-4b48-a183-bb9ae9223fbe\" (UID: \"f058bf18-c31d-4b48-a183-bb9ae9223fbe\") " Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.531596 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d06b91f8-1fcd-40fe-b712-0549d99258c6-scripts\") pod \"d06b91f8-1fcd-40fe-b712-0549d99258c6\" (UID: \"d06b91f8-1fcd-40fe-b712-0549d99258c6\") " Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.531624 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nxglw\" (UniqueName: \"kubernetes.io/projected/6f60cf52-47f0-4efd-8479-64bcc13848cf-kube-api-access-nxglw\") pod \"6f60cf52-47f0-4efd-8479-64bcc13848cf\" (UID: \"6f60cf52-47f0-4efd-8479-64bcc13848cf\") " Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.533641 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8-config" (OuterVolumeSpecName: "config") pod "6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8" (UID: "6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.544080 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f058bf18-c31d-4b48-a183-bb9ae9223fbe-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d" (OuterVolumeSpecName: "web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d") pod "f058bf18-c31d-4b48-a183-bb9ae9223fbe" (UID: "f058bf18-c31d-4b48-a183-bb9ae9223fbe"). InnerVolumeSpecName "web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.546370 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8-scripts" (OuterVolumeSpecName: "scripts") pod "6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8" (UID: "6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.548117 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f058bf18-c31d-4b48-a183-bb9ae9223fbe-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "f058bf18-c31d-4b48-a183-bb9ae9223fbe" (UID: "f058bf18-c31d-4b48-a183-bb9ae9223fbe"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.550805 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d06b91f8-1fcd-40fe-b712-0549d99258c6-logs" (OuterVolumeSpecName: "logs") pod "d06b91f8-1fcd-40fe-b712-0549d99258c6" (UID: "d06b91f8-1fcd-40fe-b712-0549d99258c6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.560275 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d06b91f8-1fcd-40fe-b712-0549d99258c6-kube-api-access-wr68m" (OuterVolumeSpecName: "kube-api-access-wr68m") pod "d06b91f8-1fcd-40fe-b712-0549d99258c6" (UID: "d06b91f8-1fcd-40fe-b712-0549d99258c6"). InnerVolumeSpecName "kube-api-access-wr68m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.561228 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d06b91f8-1fcd-40fe-b712-0549d99258c6-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "d06b91f8-1fcd-40fe-b712-0549d99258c6" (UID: "d06b91f8-1fcd-40fe-b712-0549d99258c6"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.561809 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f058bf18-c31d-4b48-a183-bb9ae9223fbe-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "f058bf18-c31d-4b48-a183-bb9ae9223fbe" (UID: "f058bf18-c31d-4b48-a183-bb9ae9223fbe"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.561952 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d06b91f8-1fcd-40fe-b712-0549d99258c6-scripts" (OuterVolumeSpecName: "scripts") pod "d06b91f8-1fcd-40fe-b712-0549d99258c6" (UID: "d06b91f8-1fcd-40fe-b712-0549d99258c6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.561998 4823 scope.go:117] "RemoveContainer" containerID="57d0046d03d662cbe48ec832a46a5f0d40cf1316633fc87176e963b49f32c392" Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.562073 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f058bf18-c31d-4b48-a183-bb9ae9223fbe-config" (OuterVolumeSpecName: "config") pod "f058bf18-c31d-4b48-a183-bb9ae9223fbe" (UID: "f058bf18-c31d-4b48-a183-bb9ae9223fbe"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.562260 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f058bf18-c31d-4b48-a183-bb9ae9223fbe-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "f058bf18-c31d-4b48-a183-bb9ae9223fbe" (UID: "f058bf18-c31d-4b48-a183-bb9ae9223fbe"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.562514 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f058bf18-c31d-4b48-a183-bb9ae9223fbe-kube-api-access-4v5zd" (OuterVolumeSpecName: "kube-api-access-4v5zd") pod "f058bf18-c31d-4b48-a183-bb9ae9223fbe" (UID: "f058bf18-c31d-4b48-a183-bb9ae9223fbe"). InnerVolumeSpecName "kube-api-access-4v5zd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.551115 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8" (UID: "6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.562695 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f60cf52-47f0-4efd-8479-64bcc13848cf-kube-api-access-nxglw" (OuterVolumeSpecName: "kube-api-access-nxglw") pod "6f60cf52-47f0-4efd-8479-64bcc13848cf" (UID: "6f60cf52-47f0-4efd-8479-64bcc13848cf"). InnerVolumeSpecName "kube-api-access-nxglw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.573157 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.587107 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f058bf18-c31d-4b48-a183-bb9ae9223fbe-secret-combined-ca-bundle" (OuterVolumeSpecName: "secret-combined-ca-bundle") pod "f058bf18-c31d-4b48-a183-bb9ae9223fbe" (UID: "f058bf18-c31d-4b48-a183-bb9ae9223fbe"). InnerVolumeSpecName "secret-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.589541 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement2f37-account-delete-t22qt" Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.590166 4823 scope.go:117] "RemoveContainer" containerID="d1a4fe707f62e7d60a2461aca9f820aa5ffe503be99824ae07a55f13abe97b6a" Dec 16 09:11:52 crc kubenswrapper[4823]: E1216 09:11:52.590289 4823 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_proxy-server_swift-proxy-5bbf6c4b7b-7qpq6_openstack_2d676c2b-8cf1-4933-8f2b-641733d096fc_0 in pod sandbox 29dd8432f55325fd637e50b97c0469c645dd8b2b0ca6199304a41985de194782 from index: no such id: '57d0046d03d662cbe48ec832a46a5f0d40cf1316633fc87176e963b49f32c392'" containerID="57d0046d03d662cbe48ec832a46a5f0d40cf1316633fc87176e963b49f32c392" Dec 16 09:11:52 crc kubenswrapper[4823]: E1216 09:11:52.590313 4823 kuberuntime_gc.go:150] "Failed to remove container" err="rpc error: code = Unknown desc = failed to delete container k8s_proxy-server_swift-proxy-5bbf6c4b7b-7qpq6_openstack_2d676c2b-8cf1-4933-8f2b-641733d096fc_0 in pod sandbox 29dd8432f55325fd637e50b97c0469c645dd8b2b0ca6199304a41985de194782 from index: no 
such id: '57d0046d03d662cbe48ec832a46a5f0d40cf1316633fc87176e963b49f32c392'" containerID="57d0046d03d662cbe48ec832a46a5f0d40cf1316633fc87176e963b49f32c392" Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.590331 4823 scope.go:117] "RemoveContainer" containerID="5d6cc389cc0a251a9367e2e3b78544eb67ee1e7cee3e16ec15b4a605c6de77ee" Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.618573 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8-kube-api-access-8njnm" (OuterVolumeSpecName: "kube-api-access-8njnm") pod "6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8" (UID: "6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8"). InnerVolumeSpecName "kube-api-access-8njnm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.618642 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f058bf18-c31d-4b48-a183-bb9ae9223fbe-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d" (OuterVolumeSpecName: "web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d") pod "f058bf18-c31d-4b48-a183-bb9ae9223fbe" (UID: "f058bf18-c31d-4b48-a183-bb9ae9223fbe"). InnerVolumeSpecName "web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.618695 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f058bf18-c31d-4b48-a183-bb9ae9223fbe-config-out" (OuterVolumeSpecName: "config-out") pod "f058bf18-c31d-4b48-a183-bb9ae9223fbe" (UID: "f058bf18-c31d-4b48-a183-bb9ae9223fbe"). InnerVolumeSpecName "config-out". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.634829 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/4496b25e-2f39-453a-aa60-ffa74e9913c8-config-data-default\") pod \"4496b25e-2f39-453a-aa60-ffa74e9913c8\" (UID: \"4496b25e-2f39-453a-aa60-ffa74e9913c8\") " Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.634918 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4496b25e-2f39-453a-aa60-ffa74e9913c8-kolla-config\") pod \"4496b25e-2f39-453a-aa60-ffa74e9913c8\" (UID: \"4496b25e-2f39-453a-aa60-ffa74e9913c8\") " Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.635044 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jk2td\" (UniqueName: \"kubernetes.io/projected/4496b25e-2f39-453a-aa60-ffa74e9913c8-kube-api-access-jk2td\") pod \"4496b25e-2f39-453a-aa60-ffa74e9913c8\" (UID: \"4496b25e-2f39-453a-aa60-ffa74e9913c8\") " Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.635118 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4496b25e-2f39-453a-aa60-ffa74e9913c8-operator-scripts\") pod \"4496b25e-2f39-453a-aa60-ffa74e9913c8\" (UID: \"4496b25e-2f39-453a-aa60-ffa74e9913c8\") " Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.635589 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ddfca108-9270-4a16-a7bc-614b69ca9a86\") pod \"4496b25e-2f39-453a-aa60-ffa74e9913c8\" (UID: \"4496b25e-2f39-453a-aa60-ffa74e9913c8\") " Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.635654 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/4496b25e-2f39-453a-aa60-ffa74e9913c8-galera-tls-certs\") pod \"4496b25e-2f39-453a-aa60-ffa74e9913c8\" (UID: \"4496b25e-2f39-453a-aa60-ffa74e9913c8\") " Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.635703 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/4496b25e-2f39-453a-aa60-ffa74e9913c8-config-data-generated\") pod \"4496b25e-2f39-453a-aa60-ffa74e9913c8\" (UID: \"4496b25e-2f39-453a-aa60-ffa74e9913c8\") " Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.635730 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4496b25e-2f39-453a-aa60-ffa74e9913c8-combined-ca-bundle\") pod \"4496b25e-2f39-453a-aa60-ffa74e9913c8\" (UID: \"4496b25e-2f39-453a-aa60-ffa74e9913c8\") " Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.635765 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wt5ts\" (UniqueName: \"kubernetes.io/projected/3be6f063-aed2-4468-9cd3-f7f03bd28211-kube-api-access-wt5ts\") pod \"3be6f063-aed2-4468-9cd3-f7f03bd28211\" (UID: \"3be6f063-aed2-4468-9cd3-f7f03bd28211\") " Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.635870 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3be6f063-aed2-4468-9cd3-f7f03bd28211-operator-scripts\") pod \"3be6f063-aed2-4468-9cd3-f7f03bd28211\" (UID: \"3be6f063-aed2-4468-9cd3-f7f03bd28211\") " Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.637163 4823 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.637184 
4823 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d06b91f8-1fcd-40fe-b712-0549d99258c6-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.637199 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4v5zd\" (UniqueName: \"kubernetes.io/projected/f058bf18-c31d-4b48-a183-bb9ae9223fbe-kube-api-access-4v5zd\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.637210 4823 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/f058bf18-c31d-4b48-a183-bb9ae9223fbe-config\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.637220 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wr68m\" (UniqueName: \"kubernetes.io/projected/d06b91f8-1fcd-40fe-b712-0549d99258c6-kube-api-access-wr68m\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.637229 4823 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f058bf18-c31d-4b48-a183-bb9ae9223fbe-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.637241 4823 reconciler_common.go:293] "Volume detached for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/f058bf18-c31d-4b48-a183-bb9ae9223fbe-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.637252 4823 reconciler_common.go:293] "Volume detached for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f058bf18-c31d-4b48-a183-bb9ae9223fbe-secret-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:52 crc 
kubenswrapper[4823]: I1216 09:11:52.637263 4823 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f058bf18-c31d-4b48-a183-bb9ae9223fbe-config-out\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.637271 4823 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d06b91f8-1fcd-40fe-b712-0549d99258c6-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.637285 4823 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f058bf18-c31d-4b48-a183-bb9ae9223fbe-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.637296 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nxglw\" (UniqueName: \"kubernetes.io/projected/6f60cf52-47f0-4efd-8479-64bcc13848cf-kube-api-access-nxglw\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.637330 4823 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8-config\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.637343 4823 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f058bf18-c31d-4b48-a183-bb9ae9223fbe-tls-assets\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.637926 4823 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.637940 4823 reconciler_common.go:293] "Volume detached for volume 
\"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/f058bf18-c31d-4b48-a183-bb9ae9223fbe-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.637953 4823 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d06b91f8-1fcd-40fe-b712-0549d99258c6-logs\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.637967 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8njnm\" (UniqueName: \"kubernetes.io/projected/6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8-kube-api-access-8njnm\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.638589 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3be6f063-aed2-4468-9cd3-f7f03bd28211-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3be6f063-aed2-4468-9cd3-f7f03bd28211" (UID: "3be6f063-aed2-4468-9cd3-f7f03bd28211"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.645737 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4496b25e-2f39-453a-aa60-ffa74e9913c8-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "4496b25e-2f39-453a-aa60-ffa74e9913c8" (UID: "4496b25e-2f39-453a-aa60-ffa74e9913c8"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.649122 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/novacell12d63-account-delete-8c88v" Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.649141 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4496b25e-2f39-453a-aa60-ffa74e9913c8-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "4496b25e-2f39-453a-aa60-ffa74e9913c8" (UID: "4496b25e-2f39-453a-aa60-ffa74e9913c8"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.649187 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4496b25e-2f39-453a-aa60-ffa74e9913c8-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "4496b25e-2f39-453a-aa60-ffa74e9913c8" (UID: "4496b25e-2f39-453a-aa60-ffa74e9913c8"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.650817 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4496b25e-2f39-453a-aa60-ffa74e9913c8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4496b25e-2f39-453a-aa60-ffa74e9913c8" (UID: "4496b25e-2f39-453a-aa60-ffa74e9913c8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.670595 4823 scope.go:117] "RemoveContainer" containerID="e6f71a18226db71c5f87b17ab03484664718f29d8f13ce063bf0655bdf85162f" Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.683290 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3be6f063-aed2-4468-9cd3-f7f03bd28211-kube-api-access-wt5ts" (OuterVolumeSpecName: "kube-api-access-wt5ts") pod "3be6f063-aed2-4468-9cd3-f7f03bd28211" (UID: "3be6f063-aed2-4468-9cd3-f7f03bd28211"). 
InnerVolumeSpecName "kube-api-access-wt5ts". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.691720 4823 scope.go:117] "RemoveContainer" containerID="615a16406b9429b165234ee72e932cf0475f7803064246d04f5e66b621b3395b" Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.694217 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4496b25e-2f39-453a-aa60-ffa74e9913c8-kube-api-access-jk2td" (OuterVolumeSpecName: "kube-api-access-jk2td") pod "4496b25e-2f39-453a-aa60-ffa74e9913c8" (UID: "4496b25e-2f39-453a-aa60-ffa74e9913c8"). InnerVolumeSpecName "kube-api-access-jk2td". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.738892 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jshbq\" (UniqueName: \"kubernetes.io/projected/4da7ae09-cc1d-4f42-b1be-7045236d12e9-kube-api-access-jshbq\") pod \"4da7ae09-cc1d-4f42-b1be-7045236d12e9\" (UID: \"4da7ae09-cc1d-4f42-b1be-7045236d12e9\") " Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.739055 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4da7ae09-cc1d-4f42-b1be-7045236d12e9-operator-scripts\") pod \"4da7ae09-cc1d-4f42-b1be-7045236d12e9\" (UID: \"4da7ae09-cc1d-4f42-b1be-7045236d12e9\") " Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.739507 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4da7ae09-cc1d-4f42-b1be-7045236d12e9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4da7ae09-cc1d-4f42-b1be-7045236d12e9" (UID: "4da7ae09-cc1d-4f42-b1be-7045236d12e9"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 09:11:52 crc kubenswrapper[4823]: E1216 09:11:52.740330 4823 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 16 09:11:52 crc kubenswrapper[4823]: E1216 09:11:52.740428 4823 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 16 09:11:52 crc kubenswrapper[4823]: E1216 09:11:52.740510 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6ecabaef-9422-4e5c-bf83-df3b523b8fa7-operator-scripts podName:6ecabaef-9422-4e5c-bf83-df3b523b8fa7 nodeName:}" failed. No retries permitted until 2025-12-16 09:11:54.740487777 +0000 UTC m=+8193.229053940 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/6ecabaef-9422-4e5c-bf83-df3b523b8fa7-operator-scripts") pod "aodhf38e-account-delete-hxrkv" (UID: "6ecabaef-9422-4e5c-bf83-df3b523b8fa7") : configmap "openstack-scripts" not found Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.740430 4823 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4496b25e-2f39-453a-aa60-ffa74e9913c8-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.740729 4823 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4da7ae09-cc1d-4f42-b1be-7045236d12e9-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.740740 4823 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/4496b25e-2f39-453a-aa60-ffa74e9913c8-config-data-generated\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.740752 4823 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-wt5ts\" (UniqueName: \"kubernetes.io/projected/3be6f063-aed2-4468-9cd3-f7f03bd28211-kube-api-access-wt5ts\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.740760 4823 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3be6f063-aed2-4468-9cd3-f7f03bd28211-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.740768 4823 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/4496b25e-2f39-453a-aa60-ffa74e9913c8-config-data-default\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.740784 4823 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4496b25e-2f39-453a-aa60-ffa74e9913c8-kolla-config\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.740793 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jk2td\" (UniqueName: \"kubernetes.io/projected/4496b25e-2f39-453a-aa60-ffa74e9913c8-kube-api-access-jk2td\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:52 crc kubenswrapper[4823]: E1216 09:11:52.741304 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7835251f-9e66-445c-9581-0422195cdc2b-operator-scripts podName:7835251f-9e66-445c-9581-0422195cdc2b nodeName:}" failed. No retries permitted until 2025-12-16 09:11:53.741286083 +0000 UTC m=+8192.229852246 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/7835251f-9e66-445c-9581-0422195cdc2b-operator-scripts") pod "novacell0da3b-account-delete-2w9zh" (UID: "7835251f-9e66-445c-9581-0422195cdc2b") : configmap "openstack-scripts" not found Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.754180 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4da7ae09-cc1d-4f42-b1be-7045236d12e9-kube-api-access-jshbq" (OuterVolumeSpecName: "kube-api-access-jshbq") pod "4da7ae09-cc1d-4f42-b1be-7045236d12e9" (UID: "4da7ae09-cc1d-4f42-b1be-7045236d12e9"). InnerVolumeSpecName "kube-api-access-jshbq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:11:52 crc kubenswrapper[4823]: E1216 09:11:52.769014 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3de6a60f-97e5-4309-9362-0b3562144f2b podName:6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8 nodeName:}" failed. No retries permitted until 2025-12-16 09:11:53.26898764 +0000 UTC m=+8191.757553763 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "ovndbcluster-sb-etc-ovn" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3de6a60f-97e5-4309-9362-0b3562144f2b") pod "6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8" (UID: "6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8") : kubernetes.io/csi: Unmounter.TearDownAt failed: rpc error: code = Unknown desc = check target path: could not get consistent content of /proc/mounts after 3 attempts Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.844349 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jshbq\" (UniqueName: \"kubernetes.io/projected/4da7ae09-cc1d-4f42-b1be-7045236d12e9-kube-api-access-jshbq\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:52 crc kubenswrapper[4823]: E1216 09:11:52.844510 4823 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 16 09:11:52 crc kubenswrapper[4823]: E1216 09:11:52.844568 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/25a0f697-45ab-48cd-b4e2-d5e8bcd3b725-operator-scripts podName:25a0f697-45ab-48cd-b4e2-d5e8bcd3b725 nodeName:}" failed. No retries permitted until 2025-12-16 09:11:53.844549735 +0000 UTC m=+8192.333115858 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/25a0f697-45ab-48cd-b4e2-d5e8bcd3b725-operator-scripts") pod "novaapif251-account-delete-dhnjg" (UID: "25a0f697-45ab-48cd-b4e2-d5e8bcd3b725") : configmap "openstack-scripts" not found Dec 16 09:11:52 crc kubenswrapper[4823]: E1216 09:11:52.845006 4823 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 16 09:11:52 crc kubenswrapper[4823]: E1216 09:11:52.845182 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1422dc66-68e5-403d-9e01-657d83772587-operator-scripts podName:1422dc66-68e5-403d-9e01-657d83772587 nodeName:}" failed. No retries permitted until 2025-12-16 09:11:53.845147623 +0000 UTC m=+8192.333713917 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/1422dc66-68e5-403d-9e01-657d83772587-operator-scripts") pod "barbican75d9-account-delete-x7xds" (UID: "1422dc66-68e5-403d-9e01-657d83772587") : configmap "openstack-scripts" not found Dec 16 09:11:52 crc kubenswrapper[4823]: E1216 09:11:52.906963 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b03e7a37-2b40-4cde-917a-642392a8adbd podName:f058bf18-c31d-4b48-a183-bb9ae9223fbe nodeName:}" failed. No retries permitted until 2025-12-16 09:11:53.406935407 +0000 UTC m=+8191.895501530 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "prometheus-metric-storage-db" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b03e7a37-2b40-4cde-917a-642392a8adbd") pod "f058bf18-c31d-4b48-a183-bb9ae9223fbe" (UID: "f058bf18-c31d-4b48-a183-bb9ae9223fbe") : kubernetes.io/csi: Unmounter.TearDownAt failed: rpc error: code = Unknown desc = check target path: could not get consistent content of /proc/mounts after 3 attempts Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.907179 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ddfca108-9270-4a16-a7bc-614b69ca9a86" (OuterVolumeSpecName: "mysql-db") pod "4496b25e-2f39-453a-aa60-ffa74e9913c8" (UID: "4496b25e-2f39-453a-aa60-ffa74e9913c8"). InnerVolumeSpecName "pvc-ddfca108-9270-4a16-a7bc-614b69ca9a86". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.914180 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d06b91f8-1fcd-40fe-b712-0549d99258c6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d06b91f8-1fcd-40fe-b712-0549d99258c6" (UID: "d06b91f8-1fcd-40fe-b712-0549d99258c6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.920471 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f60cf52-47f0-4efd-8479-64bcc13848cf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6f60cf52-47f0-4efd-8479-64bcc13848cf" (UID: "6f60cf52-47f0-4efd-8479-64bcc13848cf"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.957995 4823 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-ddfca108-9270-4a16-a7bc-614b69ca9a86\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ddfca108-9270-4a16-a7bc-614b69ca9a86\") on node \"crc\" " Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.958046 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f60cf52-47f0-4efd-8479-64bcc13848cf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:52 crc kubenswrapper[4823]: I1216 09:11:52.958061 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d06b91f8-1fcd-40fe-b712-0549d99258c6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:53 crc kubenswrapper[4823]: I1216 09:11:53.066166 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8" (UID: "6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:11:53 crc kubenswrapper[4823]: I1216 09:11:53.167035 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:53 crc kubenswrapper[4823]: I1216 09:11:53.171414 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d06b91f8-1fcd-40fe-b712-0549d99258c6-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "d06b91f8-1fcd-40fe-b712-0549d99258c6" (UID: "d06b91f8-1fcd-40fe-b712-0549d99258c6"). 
InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:11:53 crc kubenswrapper[4823]: I1216 09:11:53.171973 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f058bf18-c31d-4b48-a183-bb9ae9223fbe-web-config" (OuterVolumeSpecName: "web-config") pod "f058bf18-c31d-4b48-a183-bb9ae9223fbe" (UID: "f058bf18-c31d-4b48-a183-bb9ae9223fbe"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:11:53 crc kubenswrapper[4823]: I1216 09:11:53.177696 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d06b91f8-1fcd-40fe-b712-0549d99258c6-config-data" (OuterVolumeSpecName: "config-data") pod "d06b91f8-1fcd-40fe-b712-0549d99258c6" (UID: "d06b91f8-1fcd-40fe-b712-0549d99258c6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:11:53 crc kubenswrapper[4823]: I1216 09:11:53.178745 4823 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/alertmanager-metric-storage-0" podUID="6ebc4a0e-1b85-400b-bc10-5d216d7431fb" containerName="alertmanager" probeResult="failure" output="Get \"http://10.217.1.134:9093/-/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 16 09:11:53 crc kubenswrapper[4823]: I1216 09:11:53.178773 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f60cf52-47f0-4efd-8479-64bcc13848cf-config-data" (OuterVolumeSpecName: "config-data") pod "6f60cf52-47f0-4efd-8479-64bcc13848cf" (UID: "6f60cf52-47f0-4efd-8479-64bcc13848cf"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:11:53 crc kubenswrapper[4823]: I1216 09:11:53.225158 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8-ovsdbserver-sb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-sb-tls-certs") pod "6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8" (UID: "6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8"). InnerVolumeSpecName "ovsdbserver-sb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:11:53 crc kubenswrapper[4823]: I1216 09:11:53.225187 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8" (UID: "6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:11:53 crc kubenswrapper[4823]: I1216 09:11:53.225453 4823 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Dec 16 09:11:53 crc kubenswrapper[4823]: I1216 09:11:53.225586 4823 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-ddfca108-9270-4a16-a7bc-614b69ca9a86" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ddfca108-9270-4a16-a7bc-614b69ca9a86") on node "crc" Dec 16 09:11:53 crc kubenswrapper[4823]: I1216 09:11:53.242357 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4496b25e-2f39-453a-aa60-ffa74e9913c8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4496b25e-2f39-453a-aa60-ffa74e9913c8" (UID: "4496b25e-2f39-453a-aa60-ffa74e9913c8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:11:53 crc kubenswrapper[4823]: I1216 09:11:53.269061 4823 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:53 crc kubenswrapper[4823]: I1216 09:11:53.269393 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d06b91f8-1fcd-40fe-b712-0549d99258c6-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:53 crc kubenswrapper[4823]: I1216 09:11:53.269407 4823 reconciler_common.go:293] "Volume detached for volume \"pvc-ddfca108-9270-4a16-a7bc-614b69ca9a86\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ddfca108-9270-4a16-a7bc-614b69ca9a86\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:53 crc kubenswrapper[4823]: I1216 09:11:53.269420 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4496b25e-2f39-453a-aa60-ffa74e9913c8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:53 crc kubenswrapper[4823]: I1216 09:11:53.269432 4823 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d06b91f8-1fcd-40fe-b712-0549d99258c6-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:53 crc kubenswrapper[4823]: I1216 09:11:53.269443 4823 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f058bf18-c31d-4b48-a183-bb9ae9223fbe-web-config\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:53 crc kubenswrapper[4823]: I1216 09:11:53.269454 4823 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 
09:11:53 crc kubenswrapper[4823]: I1216 09:11:53.269466 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f60cf52-47f0-4efd-8479-64bcc13848cf-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:53 crc kubenswrapper[4823]: I1216 09:11:53.280843 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4496b25e-2f39-453a-aa60-ffa74e9913c8-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "4496b25e-2f39-453a-aa60-ffa74e9913c8" (UID: "4496b25e-2f39-453a-aa60-ffa74e9913c8"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:11:53 crc kubenswrapper[4823]: I1216 09:11:53.292990 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 16 09:11:53 crc kubenswrapper[4823]: I1216 09:11:53.309229 4823 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-5bbf6c4b7b-7qpq6" podUID="2d676c2b-8cf1-4933-8f2b-641733d096fc" containerName="proxy-server" probeResult="failure" output="Get \"https://10.217.1.52:8080/healthcheck\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 16 09:11:53 crc kubenswrapper[4823]: I1216 09:11:53.309260 4823 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-5bbf6c4b7b-7qpq6" podUID="2d676c2b-8cf1-4933-8f2b-641733d096fc" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.1.52:8080/healthcheck\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 16 09:11:53 crc kubenswrapper[4823]: I1216 09:11:53.317677 4823 generic.go:334] "Generic (PLEG): container finished" podID="bf14ab2c-212b-406f-b102-2a4b8a7a29f5" containerID="aeb46928562b9e0657f49ff73daa201ffbf7d9ddbfda61724d79baa281b28aab" exitCode=0 Dec 16 09:11:53 
crc kubenswrapper[4823]: I1216 09:11:53.317761 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"bf14ab2c-212b-406f-b102-2a4b8a7a29f5","Type":"ContainerDied","Data":"aeb46928562b9e0657f49ff73daa201ffbf7d9ddbfda61724d79baa281b28aab"}
Dec 16 09:11:53 crc kubenswrapper[4823]: I1216 09:11:53.320401 4823 generic.go:334] "Generic (PLEG): container finished" podID="1422dc66-68e5-403d-9e01-657d83772587" containerID="7ad1800fb9f46bfb6f404756ddd7cf764c4bcb0ef59c32098fe6e1ad4f76221f" exitCode=1
Dec 16 09:11:53 crc kubenswrapper[4823]: I1216 09:11:53.320445 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican75d9-account-delete-x7xds" event={"ID":"1422dc66-68e5-403d-9e01-657d83772587","Type":"ContainerDied","Data":"7ad1800fb9f46bfb6f404756ddd7cf764c4bcb0ef59c32098fe6e1ad4f76221f"}
Dec 16 09:11:53 crc kubenswrapper[4823]: I1216 09:11:53.321135 4823 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack/barbican75d9-account-delete-x7xds" secret="" err="secret \"galera-openstack-dockercfg-8t746\" not found"
Dec 16 09:11:53 crc kubenswrapper[4823]: I1216 09:11:53.321171 4823 scope.go:117] "RemoveContainer" containerID="7ad1800fb9f46bfb6f404756ddd7cf764c4bcb0ef59c32098fe6e1ad4f76221f"
Dec 16 09:11:53 crc kubenswrapper[4823]: E1216 09:11:53.321471 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-delete\" with CrashLoopBackOff: \"back-off 10s restarting failed container=mariadb-account-delete pod=barbican75d9-account-delete-x7xds_openstack(1422dc66-68e5-403d-9e01-657d83772587)\"" pod="openstack/barbican75d9-account-delete-x7xds" podUID="1422dc66-68e5-403d-9e01-657d83772587"
Dec 16 09:11:53 crc kubenswrapper[4823]: I1216 09:11:53.329873 4823 generic.go:334] "Generic (PLEG): container finished" podID="cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7" containerID="49741b7980cd55e4afdfdbd68688aadb6380c0d69a35239bfa62de2454502776" exitCode=0
Dec 16 09:11:53 crc kubenswrapper[4823]: I1216 09:11:53.329945 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7","Type":"ContainerDied","Data":"49741b7980cd55e4afdfdbd68688aadb6380c0d69a35239bfa62de2454502776"}
Dec 16 09:11:53 crc kubenswrapper[4823]: I1216 09:11:53.344327 4823 generic.go:334] "Generic (PLEG): container finished" podID="7835251f-9e66-445c-9581-0422195cdc2b" containerID="9a424663c04d2d1f4d56826ed6ce633b0f8821a510cac5fcd0653e0828e82b8e" exitCode=1
Dec 16 09:11:53 crc kubenswrapper[4823]: I1216 09:11:53.344436 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell0da3b-account-delete-2w9zh" event={"ID":"7835251f-9e66-445c-9581-0422195cdc2b","Type":"ContainerDied","Data":"9a424663c04d2d1f4d56826ed6ce633b0f8821a510cac5fcd0653e0828e82b8e"}
Dec 16 09:11:53 crc kubenswrapper[4823]: I1216 09:11:53.346625 4823 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack/novacell0da3b-account-delete-2w9zh" secret="" err="secret \"galera-openstack-dockercfg-8t746\" not found"
Dec 16 09:11:53 crc kubenswrapper[4823]: I1216 09:11:53.346690 4823 scope.go:117] "RemoveContainer" containerID="9a424663c04d2d1f4d56826ed6ce633b0f8821a510cac5fcd0653e0828e82b8e"
Dec 16 09:11:53 crc kubenswrapper[4823]: E1216 09:11:53.346955 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-delete\" with CrashLoopBackOff: \"back-off 10s restarting failed container=mariadb-account-delete pod=novacell0da3b-account-delete-2w9zh_openstack(7835251f-9e66-445c-9581-0422195cdc2b)\"" pod="openstack/novacell0da3b-account-delete-2w9zh" podUID="7835251f-9e66-445c-9581-0422195cdc2b"
Dec 16 09:11:53 crc kubenswrapper[4823]: I1216 09:11:53.347818 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"c90aab28-60fa-4cdc-a89a-bd041351015d","Type":"ContainerDied","Data":"fe735a54def89feb2b97130a9f912fa092f9f8ec78518b1aa8976785e19034f8"}
Dec 16 09:11:53 crc kubenswrapper[4823]: I1216 09:11:53.347857 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fe735a54def89feb2b97130a9f912fa092f9f8ec78518b1aa8976785e19034f8"
Dec 16 09:11:53 crc kubenswrapper[4823]: I1216 09:11:53.358167 4823 generic.go:334] "Generic (PLEG): container finished" podID="25a0f697-45ab-48cd-b4e2-d5e8bcd3b725" containerID="9cf75a92220918199d53c40a114d1e917e97cd49891dc0fbb874a085d7c6fba4" exitCode=1
Dec 16 09:11:53 crc kubenswrapper[4823]: I1216 09:11:53.358358 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/novacell12d63-account-delete-8c88v"
Dec 16 09:11:53 crc kubenswrapper[4823]: I1216 09:11:53.359051 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Dec 16 09:11:53 crc kubenswrapper[4823]: I1216 09:11:53.359169 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novaapif251-account-delete-dhnjg" event={"ID":"25a0f697-45ab-48cd-b4e2-d5e8bcd3b725","Type":"ContainerDied","Data":"9cf75a92220918199d53c40a114d1e917e97cd49891dc0fbb874a085d7c6fba4"}
Dec 16 09:11:53 crc kubenswrapper[4823]: I1216 09:11:53.359256 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0"
Dec 16 09:11:53 crc kubenswrapper[4823]: I1216 09:11:53.359311 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0"
Dec 16 09:11:53 crc kubenswrapper[4823]: I1216 09:11:53.359379 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Dec 16 09:11:53 crc kubenswrapper[4823]: I1216 09:11:53.359643 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement2f37-account-delete-t22qt"
Dec 16 09:11:53 crc kubenswrapper[4823]: I1216 09:11:53.359871 4823 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack/novaapif251-account-delete-dhnjg" secret="" err="secret \"galera-openstack-dockercfg-8t746\" not found"
Dec 16 09:11:53 crc kubenswrapper[4823]: I1216 09:11:53.359924 4823 scope.go:117] "RemoveContainer" containerID="9cf75a92220918199d53c40a114d1e917e97cd49891dc0fbb874a085d7c6fba4"
Dec 16 09:11:53 crc kubenswrapper[4823]: E1216 09:11:53.360157 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-delete\" with CrashLoopBackOff: \"back-off 10s restarting failed container=mariadb-account-delete pod=novaapif251-account-delete-dhnjg_openstack(25a0f697-45ab-48cd-b4e2-d5e8bcd3b725)\"" pod="openstack/novaapif251-account-delete-dhnjg" podUID="25a0f697-45ab-48cd-b4e2-d5e8bcd3b725"
Dec 16 09:11:53 crc kubenswrapper[4823]: I1216 09:11:53.370053 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3de6a60f-97e5-4309-9362-0b3562144f2b\") pod \"6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8\" (UID: \"6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8\") "
Dec 16 09:11:53 crc kubenswrapper[4823]: I1216 09:11:53.370765 4823 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/4496b25e-2f39-453a-aa60-ffa74e9913c8-galera-tls-certs\") on node \"crc\" DevicePath \"\""
Dec 16 09:11:53 crc kubenswrapper[4823]: I1216 09:11:53.425351 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3de6a60f-97e5-4309-9362-0b3562144f2b" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8" (UID: "6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8"). InnerVolumeSpecName "pvc-3de6a60f-97e5-4309-9362-0b3562144f2b". PluginName "kubernetes.io/csi", VolumeGidValue ""
Dec 16 09:11:53 crc kubenswrapper[4823]: I1216 09:11:53.472405 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b03e7a37-2b40-4cde-917a-642392a8adbd\") pod \"f058bf18-c31d-4b48-a183-bb9ae9223fbe\" (UID: \"f058bf18-c31d-4b48-a183-bb9ae9223fbe\") "
Dec 16 09:11:53 crc kubenswrapper[4823]: I1216 09:11:53.473136 4823 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-3de6a60f-97e5-4309-9362-0b3562144f2b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3de6a60f-97e5-4309-9362-0b3562144f2b\") on node \"crc\" "
Dec 16 09:11:53 crc kubenswrapper[4823]: I1216 09:11:53.485205 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b03e7a37-2b40-4cde-917a-642392a8adbd" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "f058bf18-c31d-4b48-a183-bb9ae9223fbe" (UID: "f058bf18-c31d-4b48-a183-bb9ae9223fbe"). InnerVolumeSpecName "pvc-b03e7a37-2b40-4cde-917a-642392a8adbd". PluginName "kubernetes.io/csi", VolumeGidValue ""
Dec 16 09:11:53 crc kubenswrapper[4823]: I1216 09:11:53.496812 4823 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice...
Dec 16 09:11:53 crc kubenswrapper[4823]: I1216 09:11:53.496991 4823 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-3de6a60f-97e5-4309-9362-0b3562144f2b" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3de6a60f-97e5-4309-9362-0b3562144f2b") on node "crc"
Dec 16 09:11:53 crc kubenswrapper[4823]: I1216 09:11:53.574471 4823 reconciler_common.go:293] "Volume detached for volume \"pvc-3de6a60f-97e5-4309-9362-0b3562144f2b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3de6a60f-97e5-4309-9362-0b3562144f2b\") on node \"crc\" DevicePath \"\""
Dec 16 09:11:53 crc kubenswrapper[4823]: I1216 09:11:53.574523 4823 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-b03e7a37-2b40-4cde-917a-642392a8adbd\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b03e7a37-2b40-4cde-917a-642392a8adbd\") on node \"crc\" "
Dec 16 09:11:53 crc kubenswrapper[4823]: I1216 09:11:53.596970 4823 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice...
Dec 16 09:11:53 crc kubenswrapper[4823]: I1216 09:11:53.597116 4823 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-b03e7a37-2b40-4cde-917a-642392a8adbd" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b03e7a37-2b40-4cde-917a-642392a8adbd") on node "crc"
Dec 16 09:11:53 crc kubenswrapper[4823]: I1216 09:11:53.622821 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Dec 16 09:11:53 crc kubenswrapper[4823]: I1216 09:11:53.662384 4823 scope.go:117] "RemoveContainer" containerID="f84baa8a3ccd8709b5e794391f08e7c7bb93715323032cfa80e8f8e30a77c27c"
Dec 16 09:11:53 crc kubenswrapper[4823]: I1216 09:11:53.675925 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/60956cfa-c484-445d-af87-52713ccf4d09-httpd-run\") pod \"60956cfa-c484-445d-af87-52713ccf4d09\" (UID: \"60956cfa-c484-445d-af87-52713ccf4d09\") "
Dec 16 09:11:53 crc kubenswrapper[4823]: I1216 09:11:53.675976 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60956cfa-c484-445d-af87-52713ccf4d09-config-data\") pod \"60956cfa-c484-445d-af87-52713ccf4d09\" (UID: \"60956cfa-c484-445d-af87-52713ccf4d09\") "
Dec 16 09:11:53 crc kubenswrapper[4823]: I1216 09:11:53.676074 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60956cfa-c484-445d-af87-52713ccf4d09-combined-ca-bundle\") pod \"60956cfa-c484-445d-af87-52713ccf4d09\" (UID: \"60956cfa-c484-445d-af87-52713ccf4d09\") "
Dec 16 09:11:53 crc kubenswrapper[4823]: I1216 09:11:53.676146 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60956cfa-c484-445d-af87-52713ccf4d09-scripts\") pod \"60956cfa-c484-445d-af87-52713ccf4d09\" (UID: \"60956cfa-c484-445d-af87-52713ccf4d09\") "
Dec 16 09:11:53 crc kubenswrapper[4823]: I1216 09:11:53.676192 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60956cfa-c484-445d-af87-52713ccf4d09-logs\") pod \"60956cfa-c484-445d-af87-52713ccf4d09\" (UID: \"60956cfa-c484-445d-af87-52713ccf4d09\") "
Dec 16 09:11:53 crc kubenswrapper[4823]: I1216 09:11:53.676227 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tgh9p\" (UniqueName: \"kubernetes.io/projected/60956cfa-c484-445d-af87-52713ccf4d09-kube-api-access-tgh9p\") pod \"60956cfa-c484-445d-af87-52713ccf4d09\" (UID: \"60956cfa-c484-445d-af87-52713ccf4d09\") "
Dec 16 09:11:53 crc kubenswrapper[4823]: I1216 09:11:53.676269 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/60956cfa-c484-445d-af87-52713ccf4d09-internal-tls-certs\") pod \"60956cfa-c484-445d-af87-52713ccf4d09\" (UID: \"60956cfa-c484-445d-af87-52713ccf4d09\") "
Dec 16 09:11:53 crc kubenswrapper[4823]: I1216 09:11:53.676650 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60956cfa-c484-445d-af87-52713ccf4d09-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "60956cfa-c484-445d-af87-52713ccf4d09" (UID: "60956cfa-c484-445d-af87-52713ccf4d09"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 16 09:11:53 crc kubenswrapper[4823]: I1216 09:11:53.677360 4823 reconciler_common.go:293] "Volume detached for volume \"pvc-b03e7a37-2b40-4cde-917a-642392a8adbd\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b03e7a37-2b40-4cde-917a-642392a8adbd\") on node \"crc\" DevicePath \"\""
Dec 16 09:11:53 crc kubenswrapper[4823]: I1216 09:11:53.677380 4823 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/60956cfa-c484-445d-af87-52713ccf4d09-httpd-run\") on node \"crc\" DevicePath \"\""
Dec 16 09:11:53 crc kubenswrapper[4823]: I1216 09:11:53.677727 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60956cfa-c484-445d-af87-52713ccf4d09-logs" (OuterVolumeSpecName: "logs") pod "60956cfa-c484-445d-af87-52713ccf4d09" (UID: "60956cfa-c484-445d-af87-52713ccf4d09"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 16 09:11:53 crc kubenswrapper[4823]: I1216 09:11:53.681863 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60956cfa-c484-445d-af87-52713ccf4d09-kube-api-access-tgh9p" (OuterVolumeSpecName: "kube-api-access-tgh9p") pod "60956cfa-c484-445d-af87-52713ccf4d09" (UID: "60956cfa-c484-445d-af87-52713ccf4d09"). InnerVolumeSpecName "kube-api-access-tgh9p". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 09:11:53 crc kubenswrapper[4823]: I1216 09:11:53.690914 4823 scope.go:117] "RemoveContainer" containerID="4b43c6a9df3e6ee0304d2e089fa50a0bfce77767a4afdf5ac55c13501c52cd9d"
Dec 16 09:11:53 crc kubenswrapper[4823]: I1216 09:11:53.712339 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60956cfa-c484-445d-af87-52713ccf4d09-scripts" (OuterVolumeSpecName: "scripts") pod "60956cfa-c484-445d-af87-52713ccf4d09" (UID: "60956cfa-c484-445d-af87-52713ccf4d09"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 09:11:53 crc kubenswrapper[4823]: I1216 09:11:53.726432 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60956cfa-c484-445d-af87-52713ccf4d09-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "60956cfa-c484-445d-af87-52713ccf4d09" (UID: "60956cfa-c484-445d-af87-52713ccf4d09"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 09:11:53 crc kubenswrapper[4823]: E1216 09:11:53.759704 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/60956cfa-c484-445d-af87-52713ccf4d09-internal-tls-certs podName:60956cfa-c484-445d-af87-52713ccf4d09 nodeName:}" failed. No retries permitted until 2025-12-16 09:11:54.259667005 +0000 UTC m=+8192.748233218 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/60956cfa-c484-445d-af87-52713ccf4d09-internal-tls-certs") pod "60956cfa-c484-445d-af87-52713ccf4d09" (UID: "60956cfa-c484-445d-af87-52713ccf4d09") : error deleting /var/lib/kubelet/pods/60956cfa-c484-445d-af87-52713ccf4d09/volume-subpaths: remove /var/lib/kubelet/pods/60956cfa-c484-445d-af87-52713ccf4d09/volume-subpaths: no such file or directory
Dec 16 09:11:53 crc kubenswrapper[4823]: I1216 09:11:53.765464 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60956cfa-c484-445d-af87-52713ccf4d09-config-data" (OuterVolumeSpecName: "config-data") pod "60956cfa-c484-445d-af87-52713ccf4d09" (UID: "60956cfa-c484-445d-af87-52713ccf4d09"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 09:11:53 crc kubenswrapper[4823]: I1216 09:11:53.778973 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60956cfa-c484-445d-af87-52713ccf4d09-config-data\") on node \"crc\" DevicePath \"\""
Dec 16 09:11:53 crc kubenswrapper[4823]: I1216 09:11:53.779002 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60956cfa-c484-445d-af87-52713ccf4d09-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 16 09:11:53 crc kubenswrapper[4823]: I1216 09:11:53.779012 4823 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60956cfa-c484-445d-af87-52713ccf4d09-scripts\") on node \"crc\" DevicePath \"\""
Dec 16 09:11:53 crc kubenswrapper[4823]: I1216 09:11:53.779020 4823 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60956cfa-c484-445d-af87-52713ccf4d09-logs\") on node \"crc\" DevicePath \"\""
Dec 16 09:11:53 crc kubenswrapper[4823]: I1216 09:11:53.779355 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tgh9p\" (UniqueName: \"kubernetes.io/projected/60956cfa-c484-445d-af87-52713ccf4d09-kube-api-access-tgh9p\") on node \"crc\" DevicePath \"\""
Dec 16 09:11:53 crc kubenswrapper[4823]: E1216 09:11:53.779314 4823 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found
Dec 16 09:11:53 crc kubenswrapper[4823]: E1216 09:11:53.779415 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7835251f-9e66-445c-9581-0422195cdc2b-operator-scripts podName:7835251f-9e66-445c-9581-0422195cdc2b nodeName:}" failed. No retries permitted until 2025-12-16 09:11:55.779391883 +0000 UTC m=+8194.267958006 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/7835251f-9e66-445c-9581-0422195cdc2b-operator-scripts") pod "novacell0da3b-account-delete-2w9zh" (UID: "7835251f-9e66-445c-9581-0422195cdc2b") : configmap "openstack-scripts" not found
Dec 16 09:11:53 crc kubenswrapper[4823]: E1216 09:11:53.788354 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f0c094f39df35eb8a22e0462a3cdfe3e11e037d6479ef37e2519d6111a7eadb9 is running failed: container process not found" containerID="f0c094f39df35eb8a22e0462a3cdfe3e11e037d6479ef37e2519d6111a7eadb9" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Dec 16 09:11:53 crc kubenswrapper[4823]: E1216 09:11:53.788977 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f0c094f39df35eb8a22e0462a3cdfe3e11e037d6479ef37e2519d6111a7eadb9 is running failed: container process not found" containerID="f0c094f39df35eb8a22e0462a3cdfe3e11e037d6479ef37e2519d6111a7eadb9" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Dec 16 09:11:53 crc kubenswrapper[4823]: E1216 09:11:53.789411 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f0c094f39df35eb8a22e0462a3cdfe3e11e037d6479ef37e2519d6111a7eadb9 is running failed: container process not found" containerID="f0c094f39df35eb8a22e0462a3cdfe3e11e037d6479ef37e2519d6111a7eadb9" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Dec 16 09:11:53 crc kubenswrapper[4823]: E1216 09:11:53.789466 4823 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f0c094f39df35eb8a22e0462a3cdfe3e11e037d6479ef37e2519d6111a7eadb9 is running failed: container process not found" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="9fd92bc3-eaf0-4217-bcac-dd8f41db9edf" containerName="nova-cell1-conductor-conductor"
Dec 16 09:11:53 crc kubenswrapper[4823]: I1216 09:11:53.790181 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ebc4a0e-1b85-400b-bc10-5d216d7431fb" path="/var/lib/kubelet/pods/6ebc4a0e-1b85-400b-bc10-5d216d7431fb/volumes"
Dec 16 09:11:53 crc kubenswrapper[4823]: I1216 09:11:53.793636 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="868b7d1a-5039-4d72-9a41-d8e57b1df5d4" path="/var/lib/kubelet/pods/868b7d1a-5039-4d72-9a41-d8e57b1df5d4/volumes"
Dec 16 09:11:53 crc kubenswrapper[4823]: I1216 09:11:53.852163 4823 scope.go:117] "RemoveContainer" containerID="9a93aa1b4c75390a6ef3fa58db9fd87ea0983cf80d04ef79278da7ac5a212dbe"
Dec 16 09:11:53 crc kubenswrapper[4823]: E1216 09:11:53.887107 4823 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found
Dec 16 09:11:53 crc kubenswrapper[4823]: E1216 09:11:53.887178 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1422dc66-68e5-403d-9e01-657d83772587-operator-scripts podName:1422dc66-68e5-403d-9e01-657d83772587 nodeName:}" failed. No retries permitted until 2025-12-16 09:11:55.887160046 +0000 UTC m=+8194.375726169 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/1422dc66-68e5-403d-9e01-657d83772587-operator-scripts") pod "barbican75d9-account-delete-x7xds" (UID: "1422dc66-68e5-403d-9e01-657d83772587") : configmap "openstack-scripts" not found
Dec 16 09:11:53 crc kubenswrapper[4823]: E1216 09:11:53.887555 4823 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found
Dec 16 09:11:53 crc kubenswrapper[4823]: E1216 09:11:53.887594 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/25a0f697-45ab-48cd-b4e2-d5e8bcd3b725-operator-scripts podName:25a0f697-45ab-48cd-b4e2-d5e8bcd3b725 nodeName:}" failed. No retries permitted until 2025-12-16 09:11:55.887582848 +0000 UTC m=+8194.376148971 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/25a0f697-45ab-48cd-b4e2-d5e8bcd3b725-operator-scripts") pod "novaapif251-account-delete-dhnjg" (UID: "25a0f697-45ab-48cd-b4e2-d5e8bcd3b725") : configmap "openstack-scripts" not found
Dec 16 09:11:53 crc kubenswrapper[4823]: I1216 09:11:53.972670 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance4d28-account-delete-cb8qx"]
Dec 16 09:11:53 crc kubenswrapper[4823]: I1216 09:11:53.974845 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7454ff977b-h6fwh"
Dec 16 09:11:53 crc kubenswrapper[4823]: I1216 09:11:53.993100 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement2f37-account-delete-t22qt"]
Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.011772 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement2f37-account-delete-t22qt"]
Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.015660 4823 scope.go:117] "RemoveContainer" containerID="a5d1441362a86b2c2d9982c87441fe14ef4f629cfb562ad9b5a4d225170c8fcd"
Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.016064 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-865d4cf8d6-bwj5n"
Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.023965 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-create-q8qt2"]
Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.043073 4823 scope.go:117] "RemoveContainer" containerID="e6f71a18226db71c5f87b17ab03484664718f29d8f13ce063bf0655bdf85162f"
Dec 16 09:11:54 crc kubenswrapper[4823]: E1216 09:11:54.045631 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6f71a18226db71c5f87b17ab03484664718f29d8f13ce063bf0655bdf85162f\": container with ID starting with e6f71a18226db71c5f87b17ab03484664718f29d8f13ce063bf0655bdf85162f not found: ID does not exist" containerID="e6f71a18226db71c5f87b17ab03484664718f29d8f13ce063bf0655bdf85162f"
Dec 16 09:11:54 crc kubenswrapper[4823]: E1216 09:11:54.045709 4823 kuberuntime_gc.go:150] "Failed to remove container" err="failed to get container status \"e6f71a18226db71c5f87b17ab03484664718f29d8f13ce063bf0655bdf85162f\": rpc error: code = NotFound desc = could not find container \"e6f71a18226db71c5f87b17ab03484664718f29d8f13ce063bf0655bdf85162f\": container with ID starting with e6f71a18226db71c5f87b17ab03484664718f29d8f13ce063bf0655bdf85162f not found: ID does not exist" containerID="e6f71a18226db71c5f87b17ab03484664718f29d8f13ce063bf0655bdf85162f"
Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.045745 4823 scope.go:117] "RemoveContainer" containerID="3a90c47a8cf666d0191a548e0fbb974d94f6605f8a15aeb046469781c6299ff1"
Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.056695 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-create-q8qt2"]
Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.057421 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-ddfd865c7-nhsh6"
Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.095681 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-f1cb-account-create-update-ppr8d"]
Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.099181 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-fcf4dff7-84zz6"
Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.099788 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.104742 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nfk7d\" (UniqueName: \"kubernetes.io/projected/7a50033a-9a6e-42e3-ac23-de2a24654b0f-kube-api-access-nfk7d\") pod \"7a50033a-9a6e-42e3-ac23-de2a24654b0f\" (UID: \"7a50033a-9a6e-42e3-ac23-de2a24654b0f\") "
Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.104807 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a50033a-9a6e-42e3-ac23-de2a24654b0f-combined-ca-bundle\") pod \"7a50033a-9a6e-42e3-ac23-de2a24654b0f\" (UID: \"7a50033a-9a6e-42e3-ac23-de2a24654b0f\") "
Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.104853 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7a50033a-9a6e-42e3-ac23-de2a24654b0f-config-data-custom\") pod \"7a50033a-9a6e-42e3-ac23-de2a24654b0f\" (UID: \"7a50033a-9a6e-42e3-ac23-de2a24654b0f\") "
Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.104909 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8b8d93d-24db-4382-9077-7404605c7cf1-config-data\") pod \"f8b8d93d-24db-4382-9077-7404605c7cf1\" (UID: \"f8b8d93d-24db-4382-9077-7404605c7cf1\") "
Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.105019 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8b8d93d-24db-4382-9077-7404605c7cf1-internal-tls-certs\") pod \"f8b8d93d-24db-4382-9077-7404605c7cf1\" (UID: \"f8b8d93d-24db-4382-9077-7404605c7cf1\") "
Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.105113 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/44c54ba6-36e8-4608-ab54-965ab4bdcef2-logs\") pod \"44c54ba6-36e8-4608-ab54-965ab4bdcef2\" (UID: \"44c54ba6-36e8-4608-ab54-965ab4bdcef2\") "
Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.105139 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8b8d93d-24db-4382-9077-7404605c7cf1-public-tls-certs\") pod \"f8b8d93d-24db-4382-9077-7404605c7cf1\" (UID: \"f8b8d93d-24db-4382-9077-7404605c7cf1\") "
Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.105181 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44c54ba6-36e8-4608-ab54-965ab4bdcef2-config-data\") pod \"44c54ba6-36e8-4608-ab54-965ab4bdcef2\" (UID: \"44c54ba6-36e8-4608-ab54-965ab4bdcef2\") "
Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.105210 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8b8d93d-24db-4382-9077-7404605c7cf1-combined-ca-bundle\") pod \"f8b8d93d-24db-4382-9077-7404605c7cf1\" (UID: \"f8b8d93d-24db-4382-9077-7404605c7cf1\") "
Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.105282 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44c54ba6-36e8-4608-ab54-965ab4bdcef2-combined-ca-bundle\") pod \"44c54ba6-36e8-4608-ab54-965ab4bdcef2\" (UID: \"44c54ba6-36e8-4608-ab54-965ab4bdcef2\") "
Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.105318 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xr2z4\" (UniqueName: \"kubernetes.io/projected/f8b8d93d-24db-4382-9077-7404605c7cf1-kube-api-access-xr2z4\") pod \"f8b8d93d-24db-4382-9077-7404605c7cf1\" (UID: \"f8b8d93d-24db-4382-9077-7404605c7cf1\") "
Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.105349 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a50033a-9a6e-42e3-ac23-de2a24654b0f-logs\") pod \"7a50033a-9a6e-42e3-ac23-de2a24654b0f\" (UID: \"7a50033a-9a6e-42e3-ac23-de2a24654b0f\") "
Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.105389 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/44c54ba6-36e8-4608-ab54-965ab4bdcef2-internal-tls-certs\") pod \"44c54ba6-36e8-4608-ab54-965ab4bdcef2\" (UID: \"44c54ba6-36e8-4608-ab54-965ab4bdcef2\") "
Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.105454 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/44c54ba6-36e8-4608-ab54-965ab4bdcef2-scripts\") pod \"44c54ba6-36e8-4608-ab54-965ab4bdcef2\" (UID: \"44c54ba6-36e8-4608-ab54-965ab4bdcef2\") "
Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.105473 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a50033a-9a6e-42e3-ac23-de2a24654b0f-config-data\") pod \"7a50033a-9a6e-42e3-ac23-de2a24654b0f\" (UID: \"7a50033a-9a6e-42e3-ac23-de2a24654b0f\") "
Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.105513 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x9vvx\" (UniqueName: \"kubernetes.io/projected/44c54ba6-36e8-4608-ab54-965ab4bdcef2-kube-api-access-x9vvx\") pod \"44c54ba6-36e8-4608-ab54-965ab4bdcef2\" (UID: \"44c54ba6-36e8-4608-ab54-965ab4bdcef2\") "
Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.105537 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/44c54ba6-36e8-4608-ab54-965ab4bdcef2-public-tls-certs\") pod \"44c54ba6-36e8-4608-ab54-965ab4bdcef2\" (UID: \"44c54ba6-36e8-4608-ab54-965ab4bdcef2\") "
Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.105567 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f8b8d93d-24db-4382-9077-7404605c7cf1-config-data-custom\") pod \"f8b8d93d-24db-4382-9077-7404605c7cf1\" (UID: \"f8b8d93d-24db-4382-9077-7404605c7cf1\") "
Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.123665 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a50033a-9a6e-42e3-ac23-de2a24654b0f-logs" (OuterVolumeSpecName: "logs") pod "7a50033a-9a6e-42e3-ac23-de2a24654b0f" (UID: "7a50033a-9a6e-42e3-ac23-de2a24654b0f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.127378 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44c54ba6-36e8-4608-ab54-965ab4bdcef2-logs" (OuterVolumeSpecName: "logs") pod "44c54ba6-36e8-4608-ab54-965ab4bdcef2" (UID: "44c54ba6-36e8-4608-ab54-965ab4bdcef2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.128438 4823 scope.go:117] "RemoveContainer" containerID="0c5789344ac78b72b59aa20ebab8bdaa03ec7af24691842901db5f6dd86d3f14"
Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.135012 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44c54ba6-36e8-4608-ab54-965ab4bdcef2-scripts" (OuterVolumeSpecName: "scripts") pod "44c54ba6-36e8-4608-ab54-965ab4bdcef2" (UID: "44c54ba6-36e8-4608-ab54-965ab4bdcef2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.138850 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8b8d93d-24db-4382-9077-7404605c7cf1-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "f8b8d93d-24db-4382-9077-7404605c7cf1" (UID: "f8b8d93d-24db-4382-9077-7404605c7cf1"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.142865 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.151619 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-8589448fc-qj569"
Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.152651 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-f1cb-account-create-update-ppr8d"]
Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.153106 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.165164 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a50033a-9a6e-42e3-ac23-de2a24654b0f-kube-api-access-nfk7d" (OuterVolumeSpecName: "kube-api-access-nfk7d") pod "7a50033a-9a6e-42e3-ac23-de2a24654b0f" (UID: "7a50033a-9a6e-42e3-ac23-de2a24654b0f"). InnerVolumeSpecName "kube-api-access-nfk7d". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.169963 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heatf1cb-account-delete-2mdnl"]
Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.174289 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.174672 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance4d28-account-delete-cb8qx"
Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.176561 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44c54ba6-36e8-4608-ab54-965ab4bdcef2-kube-api-access-x9vvx" (OuterVolumeSpecName: "kube-api-access-x9vvx") pod "44c54ba6-36e8-4608-ab54-965ab4bdcef2" (UID: "44c54ba6-36e8-4608-ab54-965ab4bdcef2"). InnerVolumeSpecName "kube-api-access-x9vvx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.178212 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8b8d93d-24db-4382-9077-7404605c7cf1-kube-api-access-xr2z4" (OuterVolumeSpecName: "kube-api-access-xr2z4") pod "f8b8d93d-24db-4382-9077-7404605c7cf1" (UID: "f8b8d93d-24db-4382-9077-7404605c7cf1"). InnerVolumeSpecName "kube-api-access-xr2z4".
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.207573 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91f5097e-d643-4598-9d06-39f14f913291-config-data\") pod \"91f5097e-d643-4598-9d06-39f14f913291\" (UID: \"91f5097e-d643-4598-9d06-39f14f913291\") " Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.207624 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4x9qj\" (UniqueName: \"kubernetes.io/projected/7f35ecc1-21e4-461e-91d3-3da96745fed6-kube-api-access-4x9qj\") pod \"7f35ecc1-21e4-461e-91d3-3da96745fed6\" (UID: \"7f35ecc1-21e4-461e-91d3-3da96745fed6\") " Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.207658 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a40068b-87bc-4af6-862d-ad33696041b3-config-data\") pod \"2a40068b-87bc-4af6-862d-ad33696041b3\" (UID: \"2a40068b-87bc-4af6-862d-ad33696041b3\") " Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.207716 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ddblq\" (UniqueName: \"kubernetes.io/projected/5729be98-e3b4-42bd-92a6-913d63da1de3-kube-api-access-ddblq\") pod \"5729be98-e3b4-42bd-92a6-913d63da1de3\" (UID: \"5729be98-e3b4-42bd-92a6-913d63da1de3\") " Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.207737 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4n9wz\" (UniqueName: \"kubernetes.io/projected/9fd92bc3-eaf0-4217-bcac-dd8f41db9edf-kube-api-access-4n9wz\") pod \"9fd92bc3-eaf0-4217-bcac-dd8f41db9edf\" (UID: \"9fd92bc3-eaf0-4217-bcac-dd8f41db9edf\") " Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.207762 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/341f00a5-410a-4656-876e-a6b0cfe2a4df-config-data\") pod \"341f00a5-410a-4656-876e-a6b0cfe2a4df\" (UID: \"341f00a5-410a-4656-876e-a6b0cfe2a4df\") " Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.207787 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/91f5097e-d643-4598-9d06-39f14f913291-config-data-custom\") pod \"91f5097e-d643-4598-9d06-39f14f913291\" (UID: \"91f5097e-d643-4598-9d06-39f14f913291\") " Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.207806 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/91f5097e-d643-4598-9d06-39f14f913291-scripts\") pod \"91f5097e-d643-4598-9d06-39f14f913291\" (UID: \"91f5097e-d643-4598-9d06-39f14f913291\") " Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.207829 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5729be98-e3b4-42bd-92a6-913d63da1de3-operator-scripts\") pod \"5729be98-e3b4-42bd-92a6-913d63da1de3\" (UID: \"5729be98-e3b4-42bd-92a6-913d63da1de3\") " Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.207987 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c01b8d9-eaf3-4415-a1b7-c53cbfaa707a-nova-metadata-tls-certs\") pod \"3c01b8d9-eaf3-4415-a1b7-c53cbfaa707a\" (UID: \"3c01b8d9-eaf3-4415-a1b7-c53cbfaa707a\") " Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.208019 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/341f00a5-410a-4656-876e-a6b0cfe2a4df-logs\") pod \"341f00a5-410a-4656-876e-a6b0cfe2a4df\" (UID: \"341f00a5-410a-4656-876e-a6b0cfe2a4df\") " Dec 16 09:11:54 crc kubenswrapper[4823]: 
I1216 09:11:54.208071 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngjlq\" (UniqueName: \"kubernetes.io/projected/341f00a5-410a-4656-876e-a6b0cfe2a4df-kube-api-access-ngjlq\") pod \"341f00a5-410a-4656-876e-a6b0cfe2a4df\" (UID: \"341f00a5-410a-4656-876e-a6b0cfe2a4df\") " Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.208109 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91f5097e-d643-4598-9d06-39f14f913291-logs\") pod \"91f5097e-d643-4598-9d06-39f14f913291\" (UID: \"91f5097e-d643-4598-9d06-39f14f913291\") " Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.208137 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/341f00a5-410a-4656-876e-a6b0cfe2a4df-config-data-custom\") pod \"341f00a5-410a-4656-876e-a6b0cfe2a4df\" (UID: \"341f00a5-410a-4656-876e-a6b0cfe2a4df\") " Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.208160 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f35ecc1-21e4-461e-91d3-3da96745fed6-public-tls-certs\") pod \"7f35ecc1-21e4-461e-91d3-3da96745fed6\" (UID: \"7f35ecc1-21e4-461e-91d3-3da96745fed6\") " Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.208174 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/91f5097e-d643-4598-9d06-39f14f913291-etc-machine-id\") pod \"91f5097e-d643-4598-9d06-39f14f913291\" (UID: \"91f5097e-d643-4598-9d06-39f14f913291\") " Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.208188 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a40068b-87bc-4af6-862d-ad33696041b3-combined-ca-bundle\") pod 
\"2a40068b-87bc-4af6-862d-ad33696041b3\" (UID: \"2a40068b-87bc-4af6-862d-ad33696041b3\") " Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.208210 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8rf9l\" (UniqueName: \"kubernetes.io/projected/3c01b8d9-eaf3-4415-a1b7-c53cbfaa707a-kube-api-access-8rf9l\") pod \"3c01b8d9-eaf3-4415-a1b7-c53cbfaa707a\" (UID: \"3c01b8d9-eaf3-4415-a1b7-c53cbfaa707a\") " Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.208235 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9fd92bc3-eaf0-4217-bcac-dd8f41db9edf-config-data\") pod \"9fd92bc3-eaf0-4217-bcac-dd8f41db9edf\" (UID: \"9fd92bc3-eaf0-4217-bcac-dd8f41db9edf\") " Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.208284 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fd92bc3-eaf0-4217-bcac-dd8f41db9edf-combined-ca-bundle\") pod \"9fd92bc3-eaf0-4217-bcac-dd8f41db9edf\" (UID: \"9fd92bc3-eaf0-4217-bcac-dd8f41db9edf\") " Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.208301 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/91f5097e-d643-4598-9d06-39f14f913291-public-tls-certs\") pod \"91f5097e-d643-4598-9d06-39f14f913291\" (UID: \"91f5097e-d643-4598-9d06-39f14f913291\") " Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.208328 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6jcsf\" (UniqueName: \"kubernetes.io/projected/2a40068b-87bc-4af6-862d-ad33696041b3-kube-api-access-6jcsf\") pod \"2a40068b-87bc-4af6-862d-ad33696041b3\" (UID: \"2a40068b-87bc-4af6-862d-ad33696041b3\") " Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.208343 4823 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/91f5097e-d643-4598-9d06-39f14f913291-internal-tls-certs\") pod \"91f5097e-d643-4598-9d06-39f14f913291\" (UID: \"91f5097e-d643-4598-9d06-39f14f913291\") " Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.208358 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7f35ecc1-21e4-461e-91d3-3da96745fed6-config-data-custom\") pod \"7f35ecc1-21e4-461e-91d3-3da96745fed6\" (UID: \"7f35ecc1-21e4-461e-91d3-3da96745fed6\") " Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.208459 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a40068b-87bc-4af6-862d-ad33696041b3-internal-tls-certs\") pod \"2a40068b-87bc-4af6-862d-ad33696041b3\" (UID: \"2a40068b-87bc-4af6-862d-ad33696041b3\") " Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.208475 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a40068b-87bc-4af6-862d-ad33696041b3-logs\") pod \"2a40068b-87bc-4af6-862d-ad33696041b3\" (UID: \"2a40068b-87bc-4af6-862d-ad33696041b3\") " Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.208501 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c01b8d9-eaf3-4415-a1b7-c53cbfaa707a-combined-ca-bundle\") pod \"3c01b8d9-eaf3-4415-a1b7-c53cbfaa707a\" (UID: \"3c01b8d9-eaf3-4415-a1b7-c53cbfaa707a\") " Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.208541 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c01b8d9-eaf3-4415-a1b7-c53cbfaa707a-config-data\") pod \"3c01b8d9-eaf3-4415-a1b7-c53cbfaa707a\" (UID: 
\"3c01b8d9-eaf3-4415-a1b7-c53cbfaa707a\") " Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.208562 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f35ecc1-21e4-461e-91d3-3da96745fed6-combined-ca-bundle\") pod \"7f35ecc1-21e4-461e-91d3-3da96745fed6\" (UID: \"7f35ecc1-21e4-461e-91d3-3da96745fed6\") " Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.208582 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/341f00a5-410a-4656-876e-a6b0cfe2a4df-combined-ca-bundle\") pod \"341f00a5-410a-4656-876e-a6b0cfe2a4df\" (UID: \"341f00a5-410a-4656-876e-a6b0cfe2a4df\") " Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.208603 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91f5097e-d643-4598-9d06-39f14f913291-combined-ca-bundle\") pod \"91f5097e-d643-4598-9d06-39f14f913291\" (UID: \"91f5097e-d643-4598-9d06-39f14f913291\") " Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.208689 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f35ecc1-21e4-461e-91d3-3da96745fed6-internal-tls-certs\") pod \"7f35ecc1-21e4-461e-91d3-3da96745fed6\" (UID: \"7f35ecc1-21e4-461e-91d3-3da96745fed6\") " Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.208712 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a40068b-87bc-4af6-862d-ad33696041b3-public-tls-certs\") pod \"2a40068b-87bc-4af6-862d-ad33696041b3\" (UID: \"2a40068b-87bc-4af6-862d-ad33696041b3\") " Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.209437 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/3c01b8d9-eaf3-4415-a1b7-c53cbfaa707a-logs\") pod \"3c01b8d9-eaf3-4415-a1b7-c53cbfaa707a\" (UID: \"3c01b8d9-eaf3-4415-a1b7-c53cbfaa707a\") " Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.210222 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4mvb\" (UniqueName: \"kubernetes.io/projected/91f5097e-d643-4598-9d06-39f14f913291-kube-api-access-w4mvb\") pod \"91f5097e-d643-4598-9d06-39f14f913291\" (UID: \"91f5097e-d643-4598-9d06-39f14f913291\") " Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.210298 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f35ecc1-21e4-461e-91d3-3da96745fed6-config-data\") pod \"7f35ecc1-21e4-461e-91d3-3da96745fed6\" (UID: \"7f35ecc1-21e4-461e-91d3-3da96745fed6\") " Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.210989 4823 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/44c54ba6-36e8-4608-ab54-965ab4bdcef2-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.211009 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x9vvx\" (UniqueName: \"kubernetes.io/projected/44c54ba6-36e8-4608-ab54-965ab4bdcef2-kube-api-access-x9vvx\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.211036 4823 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f8b8d93d-24db-4382-9077-7404605c7cf1-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.211046 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nfk7d\" (UniqueName: \"kubernetes.io/projected/7a50033a-9a6e-42e3-ac23-de2a24654b0f-kube-api-access-nfk7d\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:54 crc 
kubenswrapper[4823]: I1216 09:11:54.211055 4823 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/44c54ba6-36e8-4608-ab54-965ab4bdcef2-logs\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.211063 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xr2z4\" (UniqueName: \"kubernetes.io/projected/f8b8d93d-24db-4382-9077-7404605c7cf1-kube-api-access-xr2z4\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.211072 4823 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a50033a-9a6e-42e3-ac23-de2a24654b0f-logs\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.213530 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5729be98-e3b4-42bd-92a6-913d63da1de3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5729be98-e3b4-42bd-92a6-913d63da1de3" (UID: "5729be98-e3b4-42bd-92a6-913d63da1de3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.216090 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a50033a-9a6e-42e3-ac23-de2a24654b0f-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "7a50033a-9a6e-42e3-ac23-de2a24654b0f" (UID: "7a50033a-9a6e-42e3-ac23-de2a24654b0f"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.216872 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/91f5097e-d643-4598-9d06-39f14f913291-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "91f5097e-d643-4598-9d06-39f14f913291" (UID: "91f5097e-d643-4598-9d06-39f14f913291"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.217278 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/341f00a5-410a-4656-876e-a6b0cfe2a4df-logs" (OuterVolumeSpecName: "logs") pod "341f00a5-410a-4656-876e-a6b0cfe2a4df" (UID: "341f00a5-410a-4656-876e-a6b0cfe2a4df"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.218582 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91f5097e-d643-4598-9d06-39f14f913291-logs" (OuterVolumeSpecName: "logs") pod "91f5097e-d643-4598-9d06-39f14f913291" (UID: "91f5097e-d643-4598-9d06-39f14f913291"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.222554 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c01b8d9-eaf3-4415-a1b7-c53cbfaa707a-logs" (OuterVolumeSpecName: "logs") pod "3c01b8d9-eaf3-4415-a1b7-c53cbfaa707a" (UID: "3c01b8d9-eaf3-4415-a1b7-c53cbfaa707a"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.228591 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a40068b-87bc-4af6-862d-ad33696041b3-logs" (OuterVolumeSpecName: "logs") pod "2a40068b-87bc-4af6-862d-ad33696041b3" (UID: "2a40068b-87bc-4af6-862d-ad33696041b3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.230201 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.234567 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8b8d93d-24db-4382-9077-7404605c7cf1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f8b8d93d-24db-4382-9077-7404605c7cf1" (UID: "f8b8d93d-24db-4382-9077-7404605c7cf1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.234689 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c01b8d9-eaf3-4415-a1b7-c53cbfaa707a-kube-api-access-8rf9l" (OuterVolumeSpecName: "kube-api-access-8rf9l") pod "3c01b8d9-eaf3-4415-a1b7-c53cbfaa707a" (UID: "3c01b8d9-eaf3-4415-a1b7-c53cbfaa707a"). InnerVolumeSpecName "kube-api-access-8rf9l". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.239274 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9fd92bc3-eaf0-4217-bcac-dd8f41db9edf-kube-api-access-4n9wz" (OuterVolumeSpecName: "kube-api-access-4n9wz") pod "9fd92bc3-eaf0-4217-bcac-dd8f41db9edf" (UID: "9fd92bc3-eaf0-4217-bcac-dd8f41db9edf"). InnerVolumeSpecName "kube-api-access-4n9wz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.240259 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/341f00a5-410a-4656-876e-a6b0cfe2a4df-kube-api-access-ngjlq" (OuterVolumeSpecName: "kube-api-access-ngjlq") pod "341f00a5-410a-4656-876e-a6b0cfe2a4df" (UID: "341f00a5-410a-4656-876e-a6b0cfe2a4df"). InnerVolumeSpecName "kube-api-access-ngjlq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.240294 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91f5097e-d643-4598-9d06-39f14f913291-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "91f5097e-d643-4598-9d06-39f14f913291" (UID: "91f5097e-d643-4598-9d06-39f14f913291"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.241226 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a40068b-87bc-4af6-862d-ad33696041b3-kube-api-access-6jcsf" (OuterVolumeSpecName: "kube-api-access-6jcsf") pod "2a40068b-87bc-4af6-862d-ad33696041b3" (UID: "2a40068b-87bc-4af6-862d-ad33696041b3"). InnerVolumeSpecName "kube-api-access-6jcsf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.245516 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.276245 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91f5097e-d643-4598-9d06-39f14f913291-scripts" (OuterVolumeSpecName: "scripts") pod "91f5097e-d643-4598-9d06-39f14f913291" (UID: "91f5097e-d643-4598-9d06-39f14f913291"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.276261 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/341f00a5-410a-4656-876e-a6b0cfe2a4df-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "341f00a5-410a-4656-876e-a6b0cfe2a4df" (UID: "341f00a5-410a-4656-876e-a6b0cfe2a4df"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.276365 4823 scope.go:117] "RemoveContainer" containerID="9e972ac6360c9fcb45fd20ef40bf7e8972136fa235df75bc1dfcacbfb25e23ed" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.276369 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91f5097e-d643-4598-9d06-39f14f913291-kube-api-access-w4mvb" (OuterVolumeSpecName: "kube-api-access-w4mvb") pod "91f5097e-d643-4598-9d06-39f14f913291" (UID: "91f5097e-d643-4598-9d06-39f14f913291"). InnerVolumeSpecName "kube-api-access-w4mvb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.276405 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f35ecc1-21e4-461e-91d3-3da96745fed6-kube-api-access-4x9qj" (OuterVolumeSpecName: "kube-api-access-4x9qj") pod "7f35ecc1-21e4-461e-91d3-3da96745fed6" (UID: "7f35ecc1-21e4-461e-91d3-3da96745fed6"). InnerVolumeSpecName "kube-api-access-4x9qj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.276431 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5729be98-e3b4-42bd-92a6-913d63da1de3-kube-api-access-ddblq" (OuterVolumeSpecName: "kube-api-access-ddblq") pod "5729be98-e3b4-42bd-92a6-913d63da1de3" (UID: "5729be98-e3b4-42bd-92a6-913d63da1de3"). 
InnerVolumeSpecName "kube-api-access-ddblq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.276514 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f35ecc1-21e4-461e-91d3-3da96745fed6-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "7f35ecc1-21e4-461e-91d3-3da96745fed6" (UID: "7f35ecc1-21e4-461e-91d3-3da96745fed6"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.291252 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.303466 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.304702 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutronc394-account-delete-lkcfh" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.314009 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kvklf\" (UniqueName: \"kubernetes.io/projected/9b75df4d-61d8-4913-bea9-018339e8e2a8-kube-api-access-kvklf\") pod \"9b75df4d-61d8-4913-bea9-018339e8e2a8\" (UID: \"9b75df4d-61d8-4913-bea9-018339e8e2a8\") " Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.314220 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9b75df4d-61d8-4913-bea9-018339e8e2a8-operator-scripts\") pod \"9b75df4d-61d8-4913-bea9-018339e8e2a8\" (UID: \"9b75df4d-61d8-4913-bea9-018339e8e2a8\") " Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.314261 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/60956cfa-c484-445d-af87-52713ccf4d09-internal-tls-certs\") pod \"60956cfa-c484-445d-af87-52713ccf4d09\" (UID: \"60956cfa-c484-445d-af87-52713ccf4d09\") " Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.314998 4823 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c01b8d9-eaf3-4415-a1b7-c53cbfaa707a-logs\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.315038 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4mvb\" (UniqueName: \"kubernetes.io/projected/91f5097e-d643-4598-9d06-39f14f913291-kube-api-access-w4mvb\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.315049 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4x9qj\" (UniqueName: \"kubernetes.io/projected/7f35ecc1-21e4-461e-91d3-3da96745fed6-kube-api-access-4x9qj\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.315058 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ddblq\" (UniqueName: \"kubernetes.io/projected/5729be98-e3b4-42bd-92a6-913d63da1de3-kube-api-access-ddblq\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.315068 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4n9wz\" (UniqueName: \"kubernetes.io/projected/9fd92bc3-eaf0-4217-bcac-dd8f41db9edf-kube-api-access-4n9wz\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.315077 4823 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/91f5097e-d643-4598-9d06-39f14f913291-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.315086 4823 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/91f5097e-d643-4598-9d06-39f14f913291-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.315095 4823 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5729be98-e3b4-42bd-92a6-913d63da1de3-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.315103 4823 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7a50033a-9a6e-42e3-ac23-de2a24654b0f-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.315111 4823 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/341f00a5-410a-4656-876e-a6b0cfe2a4df-logs\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.315122 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngjlq\" (UniqueName: \"kubernetes.io/projected/341f00a5-410a-4656-876e-a6b0cfe2a4df-kube-api-access-ngjlq\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.315130 4823 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91f5097e-d643-4598-9d06-39f14f913291-logs\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.315138 4823 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/341f00a5-410a-4656-876e-a6b0cfe2a4df-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.315146 4823 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/91f5097e-d643-4598-9d06-39f14f913291-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:54 crc 
kubenswrapper[4823]: I1216 09:11:54.315155 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8rf9l\" (UniqueName: \"kubernetes.io/projected/3c01b8d9-eaf3-4415-a1b7-c53cbfaa707a-kube-api-access-8rf9l\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.315164 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8b8d93d-24db-4382-9077-7404605c7cf1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.315172 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6jcsf\" (UniqueName: \"kubernetes.io/projected/2a40068b-87bc-4af6-862d-ad33696041b3-kube-api-access-6jcsf\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.315181 4823 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7f35ecc1-21e4-461e-91d3-3da96745fed6-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.315189 4823 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a40068b-87bc-4af6-862d-ad33696041b3-logs\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.316066 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.316312 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b75df4d-61d8-4913-bea9-018339e8e2a8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9b75df4d-61d8-4913-bea9-018339e8e2a8" (UID: "9b75df4d-61d8-4913-bea9-018339e8e2a8"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.318160 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.334786 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinderea34-account-delete-pjmhc" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.336898 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.354173 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a50033a-9a6e-42e3-ac23-de2a24654b0f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7a50033a-9a6e-42e3-ac23-de2a24654b0f" (UID: "7a50033a-9a6e-42e3-ac23-de2a24654b0f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.359117 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novacell12d63-account-delete-8c88v"] Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.368340 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60956cfa-c484-445d-af87-52713ccf4d09-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "60956cfa-c484-445d-af87-52713ccf4d09" (UID: "60956cfa-c484-445d-af87-52713ccf4d09"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.368581 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b75df4d-61d8-4913-bea9-018339e8e2a8-kube-api-access-kvklf" (OuterVolumeSpecName: "kube-api-access-kvklf") pod "9b75df4d-61d8-4913-bea9-018339e8e2a8" (UID: "9b75df4d-61d8-4913-bea9-018339e8e2a8"). InnerVolumeSpecName "kube-api-access-kvklf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.372107 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/novacell12d63-account-delete-8c88v"] Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.382685 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heatf1cb-account-delete-2mdnl" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.414478 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.416854 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m4d4t\" (UniqueName: \"kubernetes.io/projected/d6361c12-5d54-4919-aafe-4ac9b88e8c20-kube-api-access-m4d4t\") pod \"d6361c12-5d54-4919-aafe-4ac9b88e8c20\" (UID: \"d6361c12-5d54-4919-aafe-4ac9b88e8c20\") " Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.416893 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2nn9r\" (UniqueName: \"kubernetes.io/projected/d9880fe3-977f-473b-84c9-2cb6f65d588d-kube-api-access-2nn9r\") pod \"d9880fe3-977f-473b-84c9-2cb6f65d588d\" (UID: \"d9880fe3-977f-473b-84c9-2cb6f65d588d\") " Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.416931 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/3ee97b1f-ce61-45ef-97e1-642cc13ef521-kube-state-metrics-tls-certs\") pod \"3ee97b1f-ce61-45ef-97e1-642cc13ef521\" (UID: \"3ee97b1f-ce61-45ef-97e1-642cc13ef521\") " Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.416993 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d9880fe3-977f-473b-84c9-2cb6f65d588d-operator-scripts\") pod \"d9880fe3-977f-473b-84c9-2cb6f65d588d\" (UID: \"d9880fe3-977f-473b-84c9-2cb6f65d588d\") " Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.417037 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d6361c12-5d54-4919-aafe-4ac9b88e8c20-operator-scripts\") pod \"d6361c12-5d54-4919-aafe-4ac9b88e8c20\" (UID: \"d6361c12-5d54-4919-aafe-4ac9b88e8c20\") " Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.417435 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ee97b1f-ce61-45ef-97e1-642cc13ef521-combined-ca-bundle\") pod \"3ee97b1f-ce61-45ef-97e1-642cc13ef521\" (UID: \"3ee97b1f-ce61-45ef-97e1-642cc13ef521\") " Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.417544 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/3ee97b1f-ce61-45ef-97e1-642cc13ef521-kube-state-metrics-tls-config\") pod \"3ee97b1f-ce61-45ef-97e1-642cc13ef521\" (UID: \"3ee97b1f-ce61-45ef-97e1-642cc13ef521\") " Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.417612 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dxvv6\" (UniqueName: \"kubernetes.io/projected/3ee97b1f-ce61-45ef-97e1-642cc13ef521-kube-api-access-dxvv6\") pod \"3ee97b1f-ce61-45ef-97e1-642cc13ef521\" (UID: 
\"3ee97b1f-ce61-45ef-97e1-642cc13ef521\") " Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.417918 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6361c12-5d54-4919-aafe-4ac9b88e8c20-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d6361c12-5d54-4919-aafe-4ac9b88e8c20" (UID: "d6361c12-5d54-4919-aafe-4ac9b88e8c20"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.418321 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9880fe3-977f-473b-84c9-2cb6f65d588d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d9880fe3-977f-473b-84c9-2cb6f65d588d" (UID: "d9880fe3-977f-473b-84c9-2cb6f65d588d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.418785 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kvklf\" (UniqueName: \"kubernetes.io/projected/9b75df4d-61d8-4913-bea9-018339e8e2a8-kube-api-access-kvklf\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.418810 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a50033a-9a6e-42e3-ac23-de2a24654b0f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.418824 4823 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9b75df4d-61d8-4913-bea9-018339e8e2a8-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.418835 4823 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/d9880fe3-977f-473b-84c9-2cb6f65d588d-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.418867 4823 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/60956cfa-c484-445d-af87-52713ccf4d09-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.418880 4823 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d6361c12-5d54-4919-aafe-4ac9b88e8c20-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.420985 4823 scope.go:117] "RemoveContainer" containerID="f0c094f39df35eb8a22e0462a3cdfe3e11e037d6479ef37e2519d6111a7eadb9" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.439155 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.440445 4823 scope.go:117] "RemoveContainer" containerID="5212d6dbaaa8dbff11343b20aba28ce9285acc49da6ca4a23b65acdd32b790c4" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.441139 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8/ovsdbserver-sb/0.log" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.443259 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.448266 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ee97b1f-ce61-45ef-97e1-642cc13ef521-kube-api-access-dxvv6" (OuterVolumeSpecName: "kube-api-access-dxvv6") pod "3ee97b1f-ce61-45ef-97e1-642cc13ef521" (UID: "3ee97b1f-ce61-45ef-97e1-642cc13ef521"). InnerVolumeSpecName "kube-api-access-dxvv6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.450002 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-d656d958d-tmzmp" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.450976 4823 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack/barbican75d9-account-delete-x7xds" secret="" err="secret \"galera-openstack-dockercfg-8t746\" not found" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.451485 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.451500 4823 scope.go:117] "RemoveContainer" containerID="7ad1800fb9f46bfb6f404756ddd7cf764c4bcb0ef59c32098fe6e1ad4f76221f" Dec 16 09:11:54 crc kubenswrapper[4823]: E1216 09:11:54.452107 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-delete\" with CrashLoopBackOff: \"back-off 10s restarting failed container=mariadb-account-delete pod=barbican75d9-account-delete-x7xds_openstack(1422dc66-68e5-403d-9e01-657d83772587)\"" pod="openstack/barbican75d9-account-delete-x7xds" podUID="1422dc66-68e5-403d-9e01-657d83772587" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.452492 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9880fe3-977f-473b-84c9-2cb6f65d588d-kube-api-access-2nn9r" (OuterVolumeSpecName: "kube-api-access-2nn9r") pod "d9880fe3-977f-473b-84c9-2cb6f65d588d" (UID: "d9880fe3-977f-473b-84c9-2cb6f65d588d"). InnerVolumeSpecName "kube-api-access-2nn9r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.454685 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.478400 4823 generic.go:334] "Generic (PLEG): container finished" podID="34b62d72-52bc-4a7d-806c-52784476a695" containerID="8d06e3290f385b94ea1f27ba2ac87da8630c1786706af2c1115d30b9f2ec0dd7" exitCode=0 Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.479149 4823 generic.go:334] "Generic (PLEG): container finished" podID="34b62d72-52bc-4a7d-806c-52784476a695" containerID="34e5dedd3f4eeec3f9dfff67c9bc1eb3a9095430a483155bc57be31d63b3460b" exitCode=0 Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.478755 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"34b62d72-52bc-4a7d-806c-52784476a695","Type":"ContainerDied","Data":"8d06e3290f385b94ea1f27ba2ac87da8630c1786706af2c1115d30b9f2ec0dd7"} Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.479253 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"34b62d72-52bc-4a7d-806c-52784476a695","Type":"ContainerDied","Data":"34e5dedd3f4eeec3f9dfff67c9bc1eb3a9095430a483155bc57be31d63b3460b"} Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.483354 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6361c12-5d54-4919-aafe-4ac9b88e8c20-kube-api-access-m4d4t" (OuterVolumeSpecName: "kube-api-access-m4d4t") pod "d6361c12-5d54-4919-aafe-4ac9b88e8c20" (UID: "d6361c12-5d54-4919-aafe-4ac9b88e8c20"). InnerVolumeSpecName "kube-api-access-m4d4t". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.487449 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.494510 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.495749 4823 scope.go:117] "RemoveContainer" containerID="d5d0a5367608ca12ad2cdc7265c39c5cf0d8d651f757751f94e87eed288e2029" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.496004 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.504773 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.507786 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.515960 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7","Type":"ContainerDied","Data":"86a94d8cfb482dfbcf39e835cfac93099d2b5eec84a5a12e62c12d196edc07db"} Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.516077 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.519973 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e511eaa-334a-4fe3-ab41-e66d4a53a931-galera-tls-certs\") pod \"0e511eaa-334a-4fe3-ab41-e66d4a53a931\" (UID: \"0e511eaa-334a-4fe3-ab41-e66d4a53a931\") " Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.520056 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bf14ab2c-212b-406f-b102-2a4b8a7a29f5-config-data\") pod \"bf14ab2c-212b-406f-b102-2a4b8a7a29f5\" (UID: \"bf14ab2c-212b-406f-b102-2a4b8a7a29f5\") " Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.520091 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0e511eaa-334a-4fe3-ab41-e66d4a53a931-kolla-config\") pod \"0e511eaa-334a-4fe3-ab41-e66d4a53a931\" (UID: \"0e511eaa-334a-4fe3-ab41-e66d4a53a931\") " Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.520118 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7-erlang-cookie-secret\") pod \"cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7\" (UID: \"cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7\") " Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.520133 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bf14ab2c-212b-406f-b102-2a4b8a7a29f5-server-conf\") pod \"bf14ab2c-212b-406f-b102-2a4b8a7a29f5\" (UID: \"bf14ab2c-212b-406f-b102-2a4b8a7a29f5\") " Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.520150 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/bf14ab2c-212b-406f-b102-2a4b8a7a29f5-plugins-conf\") pod \"bf14ab2c-212b-406f-b102-2a4b8a7a29f5\" (UID: \"bf14ab2c-212b-406f-b102-2a4b8a7a29f5\") " Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.520194 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cffdbd32-0155-4dd0-897d-9e406fd5e2ee-etc-machine-id\") pod \"cffdbd32-0155-4dd0-897d-9e406fd5e2ee\" (UID: \"cffdbd32-0155-4dd0-897d-9e406fd5e2ee\") " Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.520223 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dqpfd\" (UniqueName: \"kubernetes.io/projected/cffdbd32-0155-4dd0-897d-9e406fd5e2ee-kube-api-access-dqpfd\") pod \"cffdbd32-0155-4dd0-897d-9e406fd5e2ee\" (UID: \"cffdbd32-0155-4dd0-897d-9e406fd5e2ee\") " Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.520260 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0b8f2d6b-a3bc-4503-84f7-5d9eb1db17ec-logs\") pod \"0b8f2d6b-a3bc-4503-84f7-5d9eb1db17ec\" (UID: \"0b8f2d6b-a3bc-4503-84f7-5d9eb1db17ec\") " Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.520668 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0c2f342f-45ca-4f81-be5a-cc9b87688928\") pod \"bf14ab2c-212b-406f-b102-2a4b8a7a29f5\" (UID: \"bf14ab2c-212b-406f-b102-2a4b8a7a29f5\") " Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.520703 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b8f2d6b-a3bc-4503-84f7-5d9eb1db17ec-config-data\") pod \"0b8f2d6b-a3bc-4503-84f7-5d9eb1db17ec\" (UID: \"0b8f2d6b-a3bc-4503-84f7-5d9eb1db17ec\") " Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 
09:11:54.520733 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-glmmw\" (UniqueName: \"kubernetes.io/projected/c90aab28-60fa-4cdc-a89a-bd041351015d-kube-api-access-glmmw\") pod \"c90aab28-60fa-4cdc-a89a-bd041351015d\" (UID: \"c90aab28-60fa-4cdc-a89a-bd041351015d\") " Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.520765 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/0e511eaa-334a-4fe3-ab41-e66d4a53a931-config-data-generated\") pod \"0e511eaa-334a-4fe3-ab41-e66d4a53a931\" (UID: \"0e511eaa-334a-4fe3-ab41-e66d4a53a931\") " Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.520781 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7-plugins-conf\") pod \"cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7\" (UID: \"cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7\") " Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.520802 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bf14ab2c-212b-406f-b102-2a4b8a7a29f5-rabbitmq-plugins\") pod \"bf14ab2c-212b-406f-b102-2a4b8a7a29f5\" (UID: \"bf14ab2c-212b-406f-b102-2a4b8a7a29f5\") " Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.520827 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-96vcg\" (UniqueName: \"kubernetes.io/projected/0b8f2d6b-a3bc-4503-84f7-5d9eb1db17ec-kube-api-access-96vcg\") pod \"0b8f2d6b-a3bc-4503-84f7-5d9eb1db17ec\" (UID: \"0b8f2d6b-a3bc-4503-84f7-5d9eb1db17ec\") " Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.522550 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b8f2d6b-a3bc-4503-84f7-5d9eb1db17ec-logs" (OuterVolumeSpecName: 
"logs") pod "0b8f2d6b-a3bc-4503-84f7-5d9eb1db17ec" (UID: "0b8f2d6b-a3bc-4503-84f7-5d9eb1db17ec"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.522959 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0e511eaa-334a-4fe3-ab41-e66d4a53a931-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "0e511eaa-334a-4fe3-ab41-e66d4a53a931" (UID: "0e511eaa-334a-4fe3-ab41-e66d4a53a931"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.523196 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf14ab2c-212b-406f-b102-2a4b8a7a29f5-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "bf14ab2c-212b-406f-b102-2a4b8a7a29f5" (UID: "bf14ab2c-212b-406f-b102-2a4b8a7a29f5"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.523229 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cffdbd32-0155-4dd0-897d-9e406fd5e2ee-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "cffdbd32-0155-4dd0-897d-9e406fd5e2ee" (UID: "cffdbd32-0155-4dd0-897d-9e406fd5e2ee"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.523770 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-43028e97-28b0-43cc-9122-0ff68d03ac47\") pod \"cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7\" (UID: \"cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7\") " Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.523808 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7-pod-info\") pod \"cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7\" (UID: \"cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7\") " Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.523838 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k6bpx\" (UniqueName: \"kubernetes.io/projected/cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7-kube-api-access-k6bpx\") pod \"cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7\" (UID: \"cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7\") " Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.523858 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bf14ab2c-212b-406f-b102-2a4b8a7a29f5-rabbitmq-confd\") pod \"bf14ab2c-212b-406f-b102-2a4b8a7a29f5\" (UID: \"bf14ab2c-212b-406f-b102-2a4b8a7a29f5\") " Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.523874 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0e511eaa-334a-4fe3-ab41-e66d4a53a931-operator-scripts\") pod \"0e511eaa-334a-4fe3-ab41-e66d4a53a931\" (UID: \"0e511eaa-334a-4fe3-ab41-e66d4a53a931\") " Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.523892 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-dg96c\" (UniqueName: \"kubernetes.io/projected/0e511eaa-334a-4fe3-ab41-e66d4a53a931-kube-api-access-dg96c\") pod \"0e511eaa-334a-4fe3-ab41-e66d4a53a931\" (UID: \"0e511eaa-334a-4fe3-ab41-e66d4a53a931\") " Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.523909 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jc7hd\" (UniqueName: \"kubernetes.io/projected/bf14ab2c-212b-406f-b102-2a4b8a7a29f5-kube-api-access-jc7hd\") pod \"bf14ab2c-212b-406f-b102-2a4b8a7a29f5\" (UID: \"bf14ab2c-212b-406f-b102-2a4b8a7a29f5\") " Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.523950 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bf14ab2c-212b-406f-b102-2a4b8a7a29f5-rabbitmq-erlang-cookie\") pod \"bf14ab2c-212b-406f-b102-2a4b8a7a29f5\" (UID: \"bf14ab2c-212b-406f-b102-2a4b8a7a29f5\") " Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.523995 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cffdbd32-0155-4dd0-897d-9e406fd5e2ee-scripts\") pod \"cffdbd32-0155-4dd0-897d-9e406fd5e2ee\" (UID: \"cffdbd32-0155-4dd0-897d-9e406fd5e2ee\") " Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.524074 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c90aab28-60fa-4cdc-a89a-bd041351015d-kolla-config\") pod \"c90aab28-60fa-4cdc-a89a-bd041351015d\" (UID: \"c90aab28-60fa-4cdc-a89a-bd041351015d\") " Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.524092 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/c90aab28-60fa-4cdc-a89a-bd041351015d-memcached-tls-certs\") pod \"c90aab28-60fa-4cdc-a89a-bd041351015d\" (UID: 
\"c90aab28-60fa-4cdc-a89a-bd041351015d\") " Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.524123 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e511eaa-334a-4fe3-ab41-e66d4a53a931-combined-ca-bundle\") pod \"0e511eaa-334a-4fe3-ab41-e66d4a53a931\" (UID: \"0e511eaa-334a-4fe3-ab41-e66d4a53a931\") " Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.527458 4823 generic.go:334] "Generic (PLEG): container finished" podID="0e511eaa-334a-4fe3-ab41-e66d4a53a931" containerID="721c47558f9af7bfce05a40ae981abe655f7282de605a46f281cb1818605ab71" exitCode=0 Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.527540 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"0e511eaa-334a-4fe3-ab41-e66d4a53a931","Type":"ContainerDied","Data":"721c47558f9af7bfce05a40ae981abe655f7282de605a46f281cb1818605ab71"} Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.527565 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"0e511eaa-334a-4fe3-ab41-e66d4a53a931","Type":"ContainerDied","Data":"d917a2b413c2c2bec575fd904721d3c821e8a3e96601e3f4ffbbefdd622d935e"} Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.527575 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e511eaa-334a-4fe3-ab41-e66d4a53a931-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "0e511eaa-334a-4fe3-ab41-e66d4a53a931" (UID: "0e511eaa-334a-4fe3-ab41-e66d4a53a931"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.527641 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.528526 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf14ab2c-212b-406f-b102-2a4b8a7a29f5-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "bf14ab2c-212b-406f-b102-2a4b8a7a29f5" (UID: "bf14ab2c-212b-406f-b102-2a4b8a7a29f5"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.528936 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7" (UID: "cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.532954 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1fdf5bdd-a5ac-4ec5-9a41-b051d291361d\") pod \"0e511eaa-334a-4fe3-ab41-e66d4a53a931\" (UID: \"0e511eaa-334a-4fe3-ab41-e66d4a53a931\") " Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.533011 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bf14ab2c-212b-406f-b102-2a4b8a7a29f5-erlang-cookie-secret\") pod \"bf14ab2c-212b-406f-b102-2a4b8a7a29f5\" (UID: \"bf14ab2c-212b-406f-b102-2a4b8a7a29f5\") " Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.533050 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7-config-data\") pod \"cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7\" (UID: 
\"cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7\") " Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.533082 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7-rabbitmq-erlang-cookie\") pod \"cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7\" (UID: \"cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7\") " Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.533111 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cffdbd32-0155-4dd0-897d-9e406fd5e2ee-config-data\") pod \"cffdbd32-0155-4dd0-897d-9e406fd5e2ee\" (UID: \"cffdbd32-0155-4dd0-897d-9e406fd5e2ee\") " Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.533130 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cffdbd32-0155-4dd0-897d-9e406fd5e2ee-combined-ca-bundle\") pod \"cffdbd32-0155-4dd0-897d-9e406fd5e2ee\" (UID: \"cffdbd32-0155-4dd0-897d-9e406fd5e2ee\") " Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.533157 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7-rabbitmq-confd\") pod \"cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7\" (UID: \"cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7\") " Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.533203 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7-rabbitmq-plugins\") pod \"cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7\" (UID: \"cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7\") " Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.533226 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/0b8f2d6b-a3bc-4503-84f7-5d9eb1db17ec-combined-ca-bundle\") pod \"0b8f2d6b-a3bc-4503-84f7-5d9eb1db17ec\" (UID: \"0b8f2d6b-a3bc-4503-84f7-5d9eb1db17ec\") " Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.533248 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c90aab28-60fa-4cdc-a89a-bd041351015d-config-data\") pod \"c90aab28-60fa-4cdc-a89a-bd041351015d\" (UID: \"c90aab28-60fa-4cdc-a89a-bd041351015d\") " Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.533264 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bf14ab2c-212b-406f-b102-2a4b8a7a29f5-pod-info\") pod \"bf14ab2c-212b-406f-b102-2a4b8a7a29f5\" (UID: \"bf14ab2c-212b-406f-b102-2a4b8a7a29f5\") " Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.533293 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0b8f2d6b-a3bc-4503-84f7-5d9eb1db17ec-internal-tls-certs\") pod \"0b8f2d6b-a3bc-4503-84f7-5d9eb1db17ec\" (UID: \"0b8f2d6b-a3bc-4503-84f7-5d9eb1db17ec\") " Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.533312 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0b8f2d6b-a3bc-4503-84f7-5d9eb1db17ec-public-tls-certs\") pod \"0b8f2d6b-a3bc-4503-84f7-5d9eb1db17ec\" (UID: \"0b8f2d6b-a3bc-4503-84f7-5d9eb1db17ec\") " Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.533332 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7-server-conf\") pod \"cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7\" (UID: \"cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7\") " Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 
09:11:54.533362 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c90aab28-60fa-4cdc-a89a-bd041351015d-combined-ca-bundle\") pod \"c90aab28-60fa-4cdc-a89a-bd041351015d\" (UID: \"c90aab28-60fa-4cdc-a89a-bd041351015d\") " Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.533381 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cffdbd32-0155-4dd0-897d-9e406fd5e2ee-config-data-custom\") pod \"cffdbd32-0155-4dd0-897d-9e406fd5e2ee\" (UID: \"cffdbd32-0155-4dd0-897d-9e406fd5e2ee\") " Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.533412 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/0e511eaa-334a-4fe3-ab41-e66d4a53a931-config-data-default\") pod \"0e511eaa-334a-4fe3-ab41-e66d4a53a931\" (UID: \"0e511eaa-334a-4fe3-ab41-e66d4a53a931\") " Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.533432 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/bf14ab2c-212b-406f-b102-2a4b8a7a29f5-rabbitmq-tls\") pod \"bf14ab2c-212b-406f-b102-2a4b8a7a29f5\" (UID: \"bf14ab2c-212b-406f-b102-2a4b8a7a29f5\") " Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.533454 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0b8f2d6b-a3bc-4503-84f7-5d9eb1db17ec-config-data-custom\") pod \"0b8f2d6b-a3bc-4503-84f7-5d9eb1db17ec\" (UID: \"0b8f2d6b-a3bc-4503-84f7-5d9eb1db17ec\") " Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.533500 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7-rabbitmq-tls\") 
pod \"cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7\" (UID: \"cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7\") " Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.534185 4823 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cffdbd32-0155-4dd0-897d-9e406fd5e2ee-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.534203 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dxvv6\" (UniqueName: \"kubernetes.io/projected/3ee97b1f-ce61-45ef-97e1-642cc13ef521-kube-api-access-dxvv6\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.534213 4823 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0b8f2d6b-a3bc-4503-84f7-5d9eb1db17ec-logs\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.534224 4823 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/0e511eaa-334a-4fe3-ab41-e66d4a53a931-config-data-generated\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.534237 4823 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.534245 4823 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bf14ab2c-212b-406f-b102-2a4b8a7a29f5-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.534254 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m4d4t\" (UniqueName: \"kubernetes.io/projected/d6361c12-5d54-4919-aafe-4ac9b88e8c20-kube-api-access-m4d4t\") on node \"crc\" DevicePath \"\"" Dec 16 
09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.534265 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2nn9r\" (UniqueName: \"kubernetes.io/projected/d9880fe3-977f-473b-84c9-2cb6f65d588d-kube-api-access-2nn9r\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.534273 4823 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0e511eaa-334a-4fe3-ab41-e66d4a53a931-kolla-config\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.534283 4823 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bf14ab2c-212b-406f-b102-2a4b8a7a29f5-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.540844 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c90aab28-60fa-4cdc-a89a-bd041351015d-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "c90aab28-60fa-4cdc-a89a-bd041351015d" (UID: "c90aab28-60fa-4cdc-a89a-bd041351015d"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.562003 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutronc394-account-delete-lkcfh"] Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.588645 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf14ab2c-212b-406f-b102-2a4b8a7a29f5-kube-api-access-jc7hd" (OuterVolumeSpecName: "kube-api-access-jc7hd") pod "bf14ab2c-212b-406f-b102-2a4b8a7a29f5" (UID: "bf14ab2c-212b-406f-b102-2a4b8a7a29f5"). InnerVolumeSpecName "kube-api-access-jc7hd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.588859 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0e511eaa-334a-4fe3-ab41-e66d4a53a931-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0e511eaa-334a-4fe3-ab41-e66d4a53a931" (UID: "0e511eaa-334a-4fe3-ab41-e66d4a53a931"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.590659 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf14ab2c-212b-406f-b102-2a4b8a7a29f5-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "bf14ab2c-212b-406f-b102-2a4b8a7a29f5" (UID: "bf14ab2c-212b-406f-b102-2a4b8a7a29f5"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.593315 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b8f2d6b-a3bc-4503-84f7-5d9eb1db17ec-kube-api-access-96vcg" (OuterVolumeSpecName: "kube-api-access-96vcg") pod "0b8f2d6b-a3bc-4503-84f7-5d9eb1db17ec" (UID: "0b8f2d6b-a3bc-4503-84f7-5d9eb1db17ec"). InnerVolumeSpecName "kube-api-access-96vcg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.599084 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a40068b-87bc-4af6-862d-ad33696041b3-config-data" (OuterVolumeSpecName: "config-data") pod "2a40068b-87bc-4af6-862d-ad33696041b3" (UID: "2a40068b-87bc-4af6-862d-ad33696041b3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.599305 4823 scope.go:117] "RemoveContainer" containerID="ab5e8a8e527a7f55ae463d150528bea80ab575a8ef40cde1febdcfd7069b95a0" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.604420 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c90aab28-60fa-4cdc-a89a-bd041351015d-config-data" (OuterVolumeSpecName: "config-data") pod "c90aab28-60fa-4cdc-a89a-bd041351015d" (UID: "c90aab28-60fa-4cdc-a89a-bd041351015d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.612736 4823 generic.go:334] "Generic (PLEG): container finished" podID="e5b08afc-bfe3-4938-ac42-3781d1290201" containerID="5a71791d0d178cb3e2f0ca5f41f8f5be586775d58f1659933d5697c3e1b3e765" exitCode=0 Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.612817 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-b64c64d55-q7zxm" event={"ID":"e5b08afc-bfe3-4938-ac42-3781d1290201","Type":"ContainerDied","Data":"5a71791d0d178cb3e2f0ca5f41f8f5be586775d58f1659933d5697c3e1b3e765"} Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.613198 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7-pod-info" (OuterVolumeSpecName: "pod-info") pod "cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7" (UID: "cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.613288 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf14ab2c-212b-406f-b102-2a4b8a7a29f5-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "bf14ab2c-212b-406f-b102-2a4b8a7a29f5" (UID: "bf14ab2c-212b-406f-b102-2a4b8a7a29f5"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.613520 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7-kube-api-access-k6bpx" (OuterVolumeSpecName: "kube-api-access-k6bpx") pod "cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7" (UID: "cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7"). InnerVolumeSpecName "kube-api-access-k6bpx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.613720 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7" (UID: "cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.613779 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cffdbd32-0155-4dd0-897d-9e406fd5e2ee-kube-api-access-dqpfd" (OuterVolumeSpecName: "kube-api-access-dqpfd") pod "cffdbd32-0155-4dd0-897d-9e406fd5e2ee" (UID: "cffdbd32-0155-4dd0-897d-9e406fd5e2ee"). InnerVolumeSpecName "kube-api-access-dqpfd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.613859 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c90aab28-60fa-4cdc-a89a-bd041351015d-kube-api-access-glmmw" (OuterVolumeSpecName: "kube-api-access-glmmw") pod "c90aab28-60fa-4cdc-a89a-bd041351015d" (UID: "c90aab28-60fa-4cdc-a89a-bd041351015d"). InnerVolumeSpecName "kube-api-access-glmmw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.617903 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7" (UID: "cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.624701 4823 scope.go:117] "RemoveContainer" containerID="49741b7980cd55e4afdfdbd68688aadb6380c0d69a35239bfa62de2454502776" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.626957 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7" (UID: "cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.634130 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0e511eaa-334a-4fe3-ab41-e66d4a53a931-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "0e511eaa-334a-4fe3-ab41-e66d4a53a931" (UID: "0e511eaa-334a-4fe3-ab41-e66d4a53a931"). 
InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.647519 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0c2f342f-45ca-4f81-be5a-cc9b87688928" (OuterVolumeSpecName: "persistence") pod "bf14ab2c-212b-406f-b102-2a4b8a7a29f5" (UID: "bf14ab2c-212b-406f-b102-2a4b8a7a29f5"). InnerVolumeSpecName "pvc-0c2f342f-45ca-4f81-be5a-cc9b87688928". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.653338 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cffdbd32-0155-4dd0-897d-9e406fd5e2ee-scripts" (OuterVolumeSpecName: "scripts") pod "cffdbd32-0155-4dd0-897d-9e406fd5e2ee" (UID: "cffdbd32-0155-4dd0-897d-9e406fd5e2ee"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.656913 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8b8d93d-24db-4382-9077-7404605c7cf1-config-data" (OuterVolumeSpecName: "config-data") pod "f8b8d93d-24db-4382-9077-7404605c7cf1" (UID: "f8b8d93d-24db-4382-9077-7404605c7cf1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.657532 4823 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bf14ab2c-212b-406f-b102-2a4b8a7a29f5-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.657552 4823 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.657562 4823 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.657572 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c90aab28-60fa-4cdc-a89a-bd041351015d-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.657581 4823 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/0e511eaa-334a-4fe3-ab41-e66d4a53a931-config-data-default\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.657589 4823 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.657598 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dqpfd\" (UniqueName: \"kubernetes.io/projected/cffdbd32-0155-4dd0-897d-9e406fd5e2ee-kube-api-access-dqpfd\") on node \"crc\" DevicePath \"\"" Dec 16 
09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.657626 4823 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-0c2f342f-45ca-4f81-be5a-cc9b87688928\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0c2f342f-45ca-4f81-be5a-cc9b87688928\") on node \"crc\" " Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.657636 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-glmmw\" (UniqueName: \"kubernetes.io/projected/c90aab28-60fa-4cdc-a89a-bd041351015d-kube-api-access-glmmw\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.657645 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a40068b-87bc-4af6-862d-ad33696041b3-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.657653 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-96vcg\" (UniqueName: \"kubernetes.io/projected/0b8f2d6b-a3bc-4503-84f7-5d9eb1db17ec-kube-api-access-96vcg\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.657661 4823 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7-pod-info\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.657669 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k6bpx\" (UniqueName: \"kubernetes.io/projected/cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7-kube-api-access-k6bpx\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.657678 4823 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0e511eaa-334a-4fe3-ab41-e66d4a53a931-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 
09:11:54.657687 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jc7hd\" (UniqueName: \"kubernetes.io/projected/bf14ab2c-212b-406f-b102-2a4b8a7a29f5-kube-api-access-jc7hd\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.657695 4823 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bf14ab2c-212b-406f-b102-2a4b8a7a29f5-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.657705 4823 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cffdbd32-0155-4dd0-897d-9e406fd5e2ee-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.657717 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8b8d93d-24db-4382-9077-7404605c7cf1-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.657727 4823 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c90aab28-60fa-4cdc-a89a-bd041351015d-kolla-config\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.658673 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e511eaa-334a-4fe3-ab41-e66d4a53a931-kube-api-access-dg96c" (OuterVolumeSpecName: "kube-api-access-dg96c") pod "0e511eaa-334a-4fe3-ab41-e66d4a53a931" (UID: "0e511eaa-334a-4fe3-ab41-e66d4a53a931"). InnerVolumeSpecName "kube-api-access-dg96c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.669477 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinderea34-account-delete-pjmhc"] Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.674882 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.675326 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.675487 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"bf14ab2c-212b-406f-b102-2a4b8a7a29f5","Type":"ContainerDied","Data":"c294eafbea9c890227fd9ce45cfa0707928e1be74231bf94729d59d733b88a4a"} Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.675548 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-865d4cf8d6-bwj5n" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.675715 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.675863 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutronc394-account-delete-lkcfh" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.677836 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f35ecc1-21e4-461e-91d3-3da96745fed6-config-data" (OuterVolumeSpecName: "config-data") pod "7f35ecc1-21e4-461e-91d3-3da96745fed6" (UID: "7f35ecc1-21e4-461e-91d3-3da96745fed6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.678012 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinderea34-account-delete-pjmhc" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.680570 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-ddfd865c7-nhsh6" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.681124 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-fcf4dff7-84zz6" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.681337 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.681551 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance4d28-account-delete-cb8qx" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.682799 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cffdbd32-0155-4dd0-897d-9e406fd5e2ee-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "cffdbd32-0155-4dd0-897d-9e406fd5e2ee" (UID: "cffdbd32-0155-4dd0-897d-9e406fd5e2ee"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.682837 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b8f2d6b-a3bc-4503-84f7-5d9eb1db17ec-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "0b8f2d6b-a3bc-4503-84f7-5d9eb1db17ec" (UID: "0b8f2d6b-a3bc-4503-84f7-5d9eb1db17ec"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.682892 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7454ff977b-h6fwh" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.682929 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.682937 4823 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack/novaapif251-account-delete-dhnjg" secret="" err="secret \"galera-openstack-dockercfg-8t746\" not found" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.682977 4823 scope.go:117] "RemoveContainer" containerID="9cf75a92220918199d53c40a114d1e917e97cd49891dc0fbb874a085d7c6fba4" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.683133 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 16 09:11:54 crc kubenswrapper[4823]: E1216 09:11:54.683197 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-delete\" with CrashLoopBackOff: \"back-off 10s restarting failed container=mariadb-account-delete pod=novaapif251-account-delete-dhnjg_openstack(25a0f697-45ab-48cd-b4e2-d5e8bcd3b725)\"" pod="openstack/novaapif251-account-delete-dhnjg" podUID="25a0f697-45ab-48cd-b4e2-d5e8bcd3b725" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.682898 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-8589448fc-qj569" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.683390 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/bf14ab2c-212b-406f-b102-2a4b8a7a29f5-pod-info" (OuterVolumeSpecName: "pod-info") pod "bf14ab2c-212b-406f-b102-2a4b8a7a29f5" (UID: "bf14ab2c-212b-406f-b102-2a4b8a7a29f5"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 16 09:11:54 crc kubenswrapper[4823]: E1216 09:11:54.683602 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d14f2961c04bfe412ae181946bae7fa89e83576b7b38570699ef5a396fa77523 is running failed: container process not found" containerID="d14f2961c04bfe412ae181946bae7fa89e83576b7b38570699ef5a396fa77523" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.684096 4823 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openstack/novacell0da3b-account-delete-2w9zh" secret="" err="secret \"galera-openstack-dockercfg-8t746\" not found" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.684154 4823 scope.go:117] "RemoveContainer" containerID="9a424663c04d2d1f4d56826ed6ce633b0f8821a510cac5fcd0653e0828e82b8e" Dec 16 09:11:54 crc kubenswrapper[4823]: E1216 09:11:54.684398 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-delete\" with CrashLoopBackOff: \"back-off 10s restarting failed container=mariadb-account-delete pod=novacell0da3b-account-delete-2w9zh_openstack(7835251f-9e66-445c-9581-0422195cdc2b)\"" pod="openstack/novacell0da3b-account-delete-2w9zh" podUID="7835251f-9e66-445c-9581-0422195cdc2b" Dec 16 09:11:54 crc kubenswrapper[4823]: E1216 09:11:54.685191 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d14f2961c04bfe412ae181946bae7fa89e83576b7b38570699ef5a396fa77523 is running failed: container process not found" containerID="d14f2961c04bfe412ae181946bae7fa89e83576b7b38570699ef5a396fa77523" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Dec 16 09:11:54 crc kubenswrapper[4823]: E1216 09:11:54.686197 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d14f2961c04bfe412ae181946bae7fa89e83576b7b38570699ef5a396fa77523 is running failed: container process not found" containerID="d14f2961c04bfe412ae181946bae7fa89e83576b7b38570699ef5a396fa77523" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Dec 16 09:11:54 crc kubenswrapper[4823]: E1216 09:11:54.686243 4823 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d14f2961c04bfe412ae181946bae7fa89e83576b7b38570699ef5a396fa77523 is running failed: container process not found" 
probeType="Readiness" pod="openstack/ovn-northd-0" podUID="64445002-15b9-4ec6-8c95-7c2bd33e0ecd" containerName="ovn-northd" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.714100 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7" (UID: "cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.714188 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf14ab2c-212b-406f-b102-2a4b8a7a29f5-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "bf14ab2c-212b-406f-b102-2a4b8a7a29f5" (UID: "bf14ab2c-212b-406f-b102-2a4b8a7a29f5"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.747774 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinderea34-account-delete-pjmhc"] Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.775751 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f35ecc1-21e4-461e-91d3-3da96745fed6-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.775800 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dg96c\" (UniqueName: \"kubernetes.io/projected/0e511eaa-334a-4fe3-ab41-e66d4a53a931-kube-api-access-dg96c\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.775816 4823 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bf14ab2c-212b-406f-b102-2a4b8a7a29f5-pod-info\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:54 crc 
kubenswrapper[4823]: I1216 09:11:54.775830 4823 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cffdbd32-0155-4dd0-897d-9e406fd5e2ee-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.775851 4823 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/bf14ab2c-212b-406f-b102-2a4b8a7a29f5-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.775867 4823 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0b8f2d6b-a3bc-4503-84f7-5d9eb1db17ec-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.775879 4823 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:54 crc kubenswrapper[4823]: E1216 09:11:54.775975 4823 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 16 09:11:54 crc kubenswrapper[4823]: E1216 09:11:54.776067 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6ecabaef-9422-4e5c-bf83-df3b523b8fa7-operator-scripts podName:6ecabaef-9422-4e5c-bf83-df3b523b8fa7 nodeName:}" failed. No retries permitted until 2025-12-16 09:11:58.776035415 +0000 UTC m=+8197.264601558 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/6ecabaef-9422-4e5c-bf83-df3b523b8fa7-operator-scripts") pod "aodhf38e-account-delete-hxrkv" (UID: "6ecabaef-9422-4e5c-bf83-df3b523b8fa7") : configmap "openstack-scripts" not found Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.784137 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1fdf5bdd-a5ac-4ec5-9a41-b051d291361d" (OuterVolumeSpecName: "mysql-db") pod "0e511eaa-334a-4fe3-ab41-e66d4a53a931" (UID: "0e511eaa-334a-4fe3-ab41-e66d4a53a931"). InnerVolumeSpecName "pvc-1fdf5bdd-a5ac-4ec5-9a41-b051d291361d". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 16 09:11:54 crc kubenswrapper[4823]: E1216 09:11:54.851595 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-43028e97-28b0-43cc-9122-0ff68d03ac47 podName:cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7 nodeName:}" failed. No retries permitted until 2025-12-16 09:11:55.351560979 +0000 UTC m=+8193.840127102 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "persistence" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-43028e97-28b0-43cc-9122-0ff68d03ac47") pod "cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7" (UID: "cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7") : kubernetes.io/csi: Unmounter.TearDownAt failed: rpc error: code = Unknown desc = check target path: could not get consistent content of /proc/mounts after 3 attempts Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.884231 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44c54ba6-36e8-4608-ab54-965ab4bdcef2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "44c54ba6-36e8-4608-ab54-965ab4bdcef2" (UID: "44c54ba6-36e8-4608-ab54-965ab4bdcef2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.895170 4823 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.895341 4823 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-0c2f342f-45ca-4f81-be5a-cc9b87688928" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0c2f342f-45ca-4f81-be5a-cc9b87688928") on node "crc" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.896438 4823 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-1fdf5bdd-a5ac-4ec5-9a41-b051d291361d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1fdf5bdd-a5ac-4ec5-9a41-b051d291361d\") on node \"crc\" " Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.896468 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44c54ba6-36e8-4608-ab54-965ab4bdcef2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.896484 4823 reconciler_common.go:293] "Volume detached for volume \"pvc-0c2f342f-45ca-4f81-be5a-cc9b87688928\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0c2f342f-45ca-4f81-be5a-cc9b87688928\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.897499 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ee97b1f-ce61-45ef-97e1-642cc13ef521-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3ee97b1f-ce61-45ef-97e1-642cc13ef521" (UID: "3ee97b1f-ce61-45ef-97e1-642cc13ef521"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.914139 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican75d9-account-delete-x7xds"] Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.922552 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f35ecc1-21e4-461e-91d3-3da96745fed6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7f35ecc1-21e4-461e-91d3-3da96745fed6" (UID: "7f35ecc1-21e4-461e-91d3-3da96745fed6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.926713 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7-config-data" (OuterVolumeSpecName: "config-data") pod "cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7" (UID: "cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.956434 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b8f2d6b-a3bc-4503-84f7-5d9eb1db17ec-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0b8f2d6b-a3bc-4503-84f7-5d9eb1db17ec" (UID: "0b8f2d6b-a3bc-4503-84f7-5d9eb1db17ec"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:11:54 crc kubenswrapper[4823]: I1216 09:11:54.969328 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a40068b-87bc-4af6-862d-ad33696041b3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2a40068b-87bc-4af6-862d-ad33696041b3" (UID: "2a40068b-87bc-4af6-862d-ad33696041b3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.002572 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novacell0da3b-account-delete-2w9zh"] Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.013550 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a40068b-87bc-4af6-862d-ad33696041b3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.013581 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b8f2d6b-a3bc-4503-84f7-5d9eb1db17ec-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.013591 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ee97b1f-ce61-45ef-97e1-642cc13ef521-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.013599 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f35ecc1-21e4-461e-91d3-3da96745fed6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.013608 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.026013 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9fd92bc3-eaf0-4217-bcac-dd8f41db9edf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9fd92bc3-eaf0-4217-bcac-dd8f41db9edf" (UID: "9fd92bc3-eaf0-4217-bcac-dd8f41db9edf"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.027946 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c01b8d9-eaf3-4415-a1b7-c53cbfaa707a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3c01b8d9-eaf3-4415-a1b7-c53cbfaa707a" (UID: "3c01b8d9-eaf3-4415-a1b7-c53cbfaa707a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.072393 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novaapif251-account-delete-dhnjg"] Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.089208 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a40068b-87bc-4af6-862d-ad33696041b3-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "2a40068b-87bc-4af6-862d-ad33696041b3" (UID: "2a40068b-87bc-4af6-862d-ad33696041b3"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.096322 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91f5097e-d643-4598-9d06-39f14f913291-config-data" (OuterVolumeSpecName: "config-data") pod "91f5097e-d643-4598-9d06-39f14f913291" (UID: "91f5097e-d643-4598-9d06-39f14f913291"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.104317 4823 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="f058bf18-c31d-4b48-a183-bb9ae9223fbe" containerName="prometheus" probeResult="failure" output="Get \"https://10.217.1.137:9090/-/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.110797 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a50033a-9a6e-42e3-ac23-de2a24654b0f-config-data" (OuterVolumeSpecName: "config-data") pod "7a50033a-9a6e-42e3-ac23-de2a24654b0f" (UID: "7a50033a-9a6e-42e3-ac23-de2a24654b0f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.112540 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8b8d93d-24db-4382-9077-7404605c7cf1-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "f8b8d93d-24db-4382-9077-7404605c7cf1" (UID: "f8b8d93d-24db-4382-9077-7404605c7cf1"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.112878 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.115890 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91f5097e-d643-4598-9d06-39f14f913291-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.115918 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a50033a-9a6e-42e3-ac23-de2a24654b0f-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.115931 4823 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8b8d93d-24db-4382-9077-7404605c7cf1-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.115945 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fd92bc3-eaf0-4217-bcac-dd8f41db9edf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.115957 4823 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a40068b-87bc-4af6-862d-ad33696041b3-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.115968 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c01b8d9-eaf3-4415-a1b7-c53cbfaa707a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.125080 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 16 
09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.142178 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b8f2d6b-a3bc-4503-84f7-5d9eb1db17ec-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "0b8f2d6b-a3bc-4503-84f7-5d9eb1db17ec" (UID: "0b8f2d6b-a3bc-4503-84f7-5d9eb1db17ec"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.143981 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9fd92bc3-eaf0-4217-bcac-dd8f41db9edf-config-data" (OuterVolumeSpecName: "config-data") pod "9fd92bc3-eaf0-4217-bcac-dd8f41db9edf" (UID: "9fd92bc3-eaf0-4217-bcac-dd8f41db9edf"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.151333 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91f5097e-d643-4598-9d06-39f14f913291-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "91f5097e-d643-4598-9d06-39f14f913291" (UID: "91f5097e-d643-4598-9d06-39f14f913291"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.181674 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a40068b-87bc-4af6-862d-ad33696041b3-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "2a40068b-87bc-4af6-862d-ad33696041b3" (UID: "2a40068b-87bc-4af6-862d-ad33696041b3"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.186586 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c01b8d9-eaf3-4415-a1b7-c53cbfaa707a-config-data" (OuterVolumeSpecName: "config-data") pod "3c01b8d9-eaf3-4415-a1b7-c53cbfaa707a" (UID: "3c01b8d9-eaf3-4415-a1b7-c53cbfaa707a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.208420 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91f5097e-d643-4598-9d06-39f14f913291-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "91f5097e-d643-4598-9d06-39f14f913291" (UID: "91f5097e-d643-4598-9d06-39f14f913291"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.217529 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9fd92bc3-eaf0-4217-bcac-dd8f41db9edf-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.217554 4823 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0b8f2d6b-a3bc-4503-84f7-5d9eb1db17ec-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.217565 4823 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/91f5097e-d643-4598-9d06-39f14f913291-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.217576 4823 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/91f5097e-d643-4598-9d06-39f14f913291-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 
09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.217584 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c01b8d9-eaf3-4415-a1b7-c53cbfaa707a-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.217592 4823 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a40068b-87bc-4af6-862d-ad33696041b3-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.220183 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/341f00a5-410a-4656-876e-a6b0cfe2a4df-config-data" (OuterVolumeSpecName: "config-data") pod "341f00a5-410a-4656-876e-a6b0cfe2a4df" (UID: "341f00a5-410a-4656-876e-a6b0cfe2a4df"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.228696 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf14ab2c-212b-406f-b102-2a4b8a7a29f5-config-data" (OuterVolumeSpecName: "config-data") pod "bf14ab2c-212b-406f-b102-2a4b8a7a29f5" (UID: "bf14ab2c-212b-406f-b102-2a4b8a7a29f5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.242479 4823 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.242678 4823 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-1fdf5bdd-a5ac-4ec5-9a41-b051d291361d" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1fdf5bdd-a5ac-4ec5-9a41-b051d291361d") on node "crc" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.247548 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e511eaa-334a-4fe3-ab41-e66d4a53a931-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "0e511eaa-334a-4fe3-ab41-e66d4a53a931" (UID: "0e511eaa-334a-4fe3-ab41-e66d4a53a931"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.253574 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8b8d93d-24db-4382-9077-7404605c7cf1-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "f8b8d93d-24db-4382-9077-7404605c7cf1" (UID: "f8b8d93d-24db-4382-9077-7404605c7cf1"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.255228 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/341f00a5-410a-4656-876e-a6b0cfe2a4df-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "341f00a5-410a-4656-876e-a6b0cfe2a4df" (UID: "341f00a5-410a-4656-876e-a6b0cfe2a4df"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.257962 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ee97b1f-ce61-45ef-97e1-642cc13ef521-kube-state-metrics-tls-config" (OuterVolumeSpecName: "kube-state-metrics-tls-config") pod "3ee97b1f-ce61-45ef-97e1-642cc13ef521" (UID: "3ee97b1f-ce61-45ef-97e1-642cc13ef521"). InnerVolumeSpecName "kube-state-metrics-tls-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.319643 4823 reconciler_common.go:293] "Volume detached for volume \"pvc-1fdf5bdd-a5ac-4ec5-9a41-b051d291361d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1fdf5bdd-a5ac-4ec5-9a41-b051d291361d\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.319678 4823 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8b8d93d-24db-4382-9077-7404605c7cf1-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.319690 4823 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e511eaa-334a-4fe3-ab41-e66d4a53a931-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.319701 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bf14ab2c-212b-406f-b102-2a4b8a7a29f5-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.319709 4823 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/3ee97b1f-ce61-45ef-97e1-642cc13ef521-kube-state-metrics-tls-config\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 
09:11:55.319720 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/341f00a5-410a-4656-876e-a6b0cfe2a4df-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.319729 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/341f00a5-410a-4656-876e-a6b0cfe2a4df-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.322726 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf14ab2c-212b-406f-b102-2a4b8a7a29f5-server-conf" (OuterVolumeSpecName: "server-conf") pod "bf14ab2c-212b-406f-b102-2a4b8a7a29f5" (UID: "bf14ab2c-212b-406f-b102-2a4b8a7a29f5"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.323727 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b8f2d6b-a3bc-4503-84f7-5d9eb1db17ec-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "0b8f2d6b-a3bc-4503-84f7-5d9eb1db17ec" (UID: "0b8f2d6b-a3bc-4503-84f7-5d9eb1db17ec"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.324183 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91f5097e-d643-4598-9d06-39f14f913291-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "91f5097e-d643-4598-9d06-39f14f913291" (UID: "91f5097e-d643-4598-9d06-39f14f913291"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.334723 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e511eaa-334a-4fe3-ab41-e66d4a53a931-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0e511eaa-334a-4fe3-ab41-e66d4a53a931" (UID: "0e511eaa-334a-4fe3-ab41-e66d4a53a931"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.337059 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cffdbd32-0155-4dd0-897d-9e406fd5e2ee-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cffdbd32-0155-4dd0-897d-9e406fd5e2ee" (UID: "cffdbd32-0155-4dd0-897d-9e406fd5e2ee"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.345966 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c90aab28-60fa-4cdc-a89a-bd041351015d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c90aab28-60fa-4cdc-a89a-bd041351015d" (UID: "c90aab28-60fa-4cdc-a89a-bd041351015d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.368696 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44c54ba6-36e8-4608-ab54-965ab4bdcef2-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "44c54ba6-36e8-4608-ab54-965ab4bdcef2" (UID: "44c54ba6-36e8-4608-ab54-965ab4bdcef2"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.376143 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c90aab28-60fa-4cdc-a89a-bd041351015d-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "c90aab28-60fa-4cdc-a89a-bd041351015d" (UID: "c90aab28-60fa-4cdc-a89a-bd041351015d"). InnerVolumeSpecName "memcached-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.377175 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c01b8d9-eaf3-4415-a1b7-c53cbfaa707a-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "3c01b8d9-eaf3-4415-a1b7-c53cbfaa707a" (UID: "3c01b8d9-eaf3-4415-a1b7-c53cbfaa707a"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.377756 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ee97b1f-ce61-45ef-97e1-642cc13ef521-kube-state-metrics-tls-certs" (OuterVolumeSpecName: "kube-state-metrics-tls-certs") pod "3ee97b1f-ce61-45ef-97e1-642cc13ef521" (UID: "3ee97b1f-ce61-45ef-97e1-642cc13ef521"). InnerVolumeSpecName "kube-state-metrics-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.386374 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7-server-conf" (OuterVolumeSpecName: "server-conf") pod "cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7" (UID: "cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.396875 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f35ecc1-21e4-461e-91d3-3da96745fed6-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "7f35ecc1-21e4-461e-91d3-3da96745fed6" (UID: "7f35ecc1-21e4-461e-91d3-3da96745fed6"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.421748 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-43028e97-28b0-43cc-9122-0ff68d03ac47\") pod \"cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7\" (UID: \"cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7\") " Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.421862 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44c54ba6-36e8-4608-ab54-965ab4bdcef2-config-data" (OuterVolumeSpecName: "config-data") pod "44c54ba6-36e8-4608-ab54-965ab4bdcef2" (UID: "44c54ba6-36e8-4608-ab54-965ab4bdcef2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.422168 4823 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/44c54ba6-36e8-4608-ab54-965ab4bdcef2-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.422190 4823 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ee97b1f-ce61-45ef-97e1-642cc13ef521-kube-state-metrics-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.422200 4823 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c01b8d9-eaf3-4415-a1b7-c53cbfaa707a-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.422208 4823 reconciler_common.go:293] "Volume detached for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/c90aab28-60fa-4cdc-a89a-bd041351015d-memcached-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.422217 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e511eaa-334a-4fe3-ab41-e66d4a53a931-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.422225 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cffdbd32-0155-4dd0-897d-9e406fd5e2ee-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.422233 4823 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f35ecc1-21e4-461e-91d3-3da96745fed6-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 
09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.422241 4823 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0b8f2d6b-a3bc-4503-84f7-5d9eb1db17ec-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.422254 4823 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7-server-conf\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.422261 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44c54ba6-36e8-4608-ab54-965ab4bdcef2-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.422272 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c90aab28-60fa-4cdc-a89a-bd041351015d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.422280 4823 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bf14ab2c-212b-406f-b102-2a4b8a7a29f5-server-conf\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.422289 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91f5097e-d643-4598-9d06-39f14f913291-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.429307 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44c54ba6-36e8-4608-ab54-965ab4bdcef2-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "44c54ba6-36e8-4608-ab54-965ab4bdcef2" (UID: "44c54ba6-36e8-4608-ab54-965ab4bdcef2"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.430975 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b8f2d6b-a3bc-4503-84f7-5d9eb1db17ec-config-data" (OuterVolumeSpecName: "config-data") pod "0b8f2d6b-a3bc-4503-84f7-5d9eb1db17ec" (UID: "0b8f2d6b-a3bc-4503-84f7-5d9eb1db17ec"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.436258 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f35ecc1-21e4-461e-91d3-3da96745fed6-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "7f35ecc1-21e4-461e-91d3-3da96745fed6" (UID: "7f35ecc1-21e4-461e-91d3-3da96745fed6"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.439792 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-43028e97-28b0-43cc-9122-0ff68d03ac47" (OuterVolumeSpecName: "persistence") pod "cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7" (UID: "cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7"). InnerVolumeSpecName "pvc-43028e97-28b0-43cc-9122-0ff68d03ac47". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.453691 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7" (UID: "cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.460097 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf14ab2c-212b-406f-b102-2a4b8a7a29f5-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "bf14ab2c-212b-406f-b102-2a4b8a7a29f5" (UID: "bf14ab2c-212b-406f-b102-2a4b8a7a29f5"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.471754 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cffdbd32-0155-4dd0-897d-9e406fd5e2ee-config-data" (OuterVolumeSpecName: "config-data") pod "cffdbd32-0155-4dd0-897d-9e406fd5e2ee" (UID: "cffdbd32-0155-4dd0-897d-9e406fd5e2ee"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.501579 4823 scope.go:117] "RemoveContainer" containerID="7ac81f3d4f6c3b985e28f223f6d2d8ddf14deedc2445d7c0e81bfa4724f713b5" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.502538 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-b64c64d55-q7zxm" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.528437 4823 scope.go:117] "RemoveContainer" containerID="9021e9fbfdca4e8c3bc783918228e38655a4e6f41df2ee2221badc7c3cbd77ae" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.530937 4823 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bf14ab2c-212b-406f-b102-2a4b8a7a29f5-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.530969 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cffdbd32-0155-4dd0-897d-9e406fd5e2ee-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.530984 4823 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.530995 4823 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f35ecc1-21e4-461e-91d3-3da96745fed6-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.531007 4823 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/44c54ba6-36e8-4608-ab54-965ab4bdcef2-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.531018 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b8f2d6b-a3bc-4503-84f7-5d9eb1db17ec-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.531063 4823 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume 
\"pvc-43028e97-28b0-43cc-9122-0ff68d03ac47\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-43028e97-28b0-43cc-9122-0ff68d03ac47\") on node \"crc\" " Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.538679 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_64445002-15b9-4ec6-8c95-7c2bd33e0ecd/ovn-northd/0.log" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.539208 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.568510 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance4d28-account-delete-cb8qx"] Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.583825 4823 scope.go:117] "RemoveContainer" containerID="4e552140d312fcdfa52ae99bb54947c323a559bd5e4b943aed566e48f1890450" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.585108 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance4d28-account-delete-cb8qx"] Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.591814 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.592221 4823 scope.go:117] "RemoveContainer" containerID="721c47558f9af7bfce05a40ae981abe655f7282de605a46f281cb1818605ab71" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.592689 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.605337 4823 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.605621 4823 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-43028e97-28b0-43cc-9122-0ff68d03ac47" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-43028e97-28b0-43cc-9122-0ff68d03ac47") on node "crc" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.619276 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.632237 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/64445002-15b9-4ec6-8c95-7c2bd33e0ecd-ovn-rundir\") pod \"64445002-15b9-4ec6-8c95-7c2bd33e0ecd\" (UID: \"64445002-15b9-4ec6-8c95-7c2bd33e0ecd\") " Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.632321 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/64445002-15b9-4ec6-8c95-7c2bd33e0ecd-scripts\") pod \"64445002-15b9-4ec6-8c95-7c2bd33e0ecd\" (UID: \"64445002-15b9-4ec6-8c95-7c2bd33e0ecd\") " Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.632367 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5b08afc-bfe3-4938-ac42-3781d1290201-public-tls-certs\") pod \"e5b08afc-bfe3-4938-ac42-3781d1290201\" (UID: \"e5b08afc-bfe3-4938-ac42-3781d1290201\") " Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.632427 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64445002-15b9-4ec6-8c95-7c2bd33e0ecd-combined-ca-bundle\") pod \"64445002-15b9-4ec6-8c95-7c2bd33e0ecd\" (UID: \"64445002-15b9-4ec6-8c95-7c2bd33e0ecd\") " Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.632464 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e5b08afc-bfe3-4938-ac42-3781d1290201-fernet-keys\") pod \"e5b08afc-bfe3-4938-ac42-3781d1290201\" (UID: \"e5b08afc-bfe3-4938-ac42-3781d1290201\") " Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.632531 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5b08afc-bfe3-4938-ac42-3781d1290201-internal-tls-certs\") pod \"e5b08afc-bfe3-4938-ac42-3781d1290201\" (UID: \"e5b08afc-bfe3-4938-ac42-3781d1290201\") " Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.632548 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5b08afc-bfe3-4938-ac42-3781d1290201-scripts\") pod \"e5b08afc-bfe3-4938-ac42-3781d1290201\" (UID: \"e5b08afc-bfe3-4938-ac42-3781d1290201\") " Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.632568 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5b08afc-bfe3-4938-ac42-3781d1290201-config-data\") pod \"e5b08afc-bfe3-4938-ac42-3781d1290201\" (UID: \"e5b08afc-bfe3-4938-ac42-3781d1290201\") " Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.632609 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/64445002-15b9-4ec6-8c95-7c2bd33e0ecd-ovn-northd-tls-certs\") pod \"64445002-15b9-4ec6-8c95-7c2bd33e0ecd\" (UID: \"64445002-15b9-4ec6-8c95-7c2bd33e0ecd\") " Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.632655 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-465rw\" (UniqueName: \"kubernetes.io/projected/e5b08afc-bfe3-4938-ac42-3781d1290201-kube-api-access-465rw\") pod \"e5b08afc-bfe3-4938-ac42-3781d1290201\" (UID: \"e5b08afc-bfe3-4938-ac42-3781d1290201\") " Dec 16 09:11:55 crc 
kubenswrapper[4823]: I1216 09:11:55.632674 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e5b08afc-bfe3-4938-ac42-3781d1290201-credential-keys\") pod \"e5b08afc-bfe3-4938-ac42-3781d1290201\" (UID: \"e5b08afc-bfe3-4938-ac42-3781d1290201\") " Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.632820 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/64445002-15b9-4ec6-8c95-7c2bd33e0ecd-metrics-certs-tls-certs\") pod \"64445002-15b9-4ec6-8c95-7c2bd33e0ecd\" (UID: \"64445002-15b9-4ec6-8c95-7c2bd33e0ecd\") " Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.632844 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8xhmz\" (UniqueName: \"kubernetes.io/projected/64445002-15b9-4ec6-8c95-7c2bd33e0ecd-kube-api-access-8xhmz\") pod \"64445002-15b9-4ec6-8c95-7c2bd33e0ecd\" (UID: \"64445002-15b9-4ec6-8c95-7c2bd33e0ecd\") " Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.632862 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5b08afc-bfe3-4938-ac42-3781d1290201-combined-ca-bundle\") pod \"e5b08afc-bfe3-4938-ac42-3781d1290201\" (UID: \"e5b08afc-bfe3-4938-ac42-3781d1290201\") " Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.632890 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64445002-15b9-4ec6-8c95-7c2bd33e0ecd-config\") pod \"64445002-15b9-4ec6-8c95-7c2bd33e0ecd\" (UID: \"64445002-15b9-4ec6-8c95-7c2bd33e0ecd\") " Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.633281 4823 reconciler_common.go:293] "Volume detached for volume \"pvc-43028e97-28b0-43cc-9122-0ff68d03ac47\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-43028e97-28b0-43cc-9122-0ff68d03ac47\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.637367 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.638041 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/64445002-15b9-4ec6-8c95-7c2bd33e0ecd-scripts" (OuterVolumeSpecName: "scripts") pod "64445002-15b9-4ec6-8c95-7c2bd33e0ecd" (UID: "64445002-15b9-4ec6-8c95-7c2bd33e0ecd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.638184 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5b08afc-bfe3-4938-ac42-3781d1290201-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "e5b08afc-bfe3-4938-ac42-3781d1290201" (UID: "e5b08afc-bfe3-4938-ac42-3781d1290201"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.638644 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/64445002-15b9-4ec6-8c95-7c2bd33e0ecd-config" (OuterVolumeSpecName: "config") pod "64445002-15b9-4ec6-8c95-7c2bd33e0ecd" (UID: "64445002-15b9-4ec6-8c95-7c2bd33e0ecd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.640244 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64445002-15b9-4ec6-8c95-7c2bd33e0ecd-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "64445002-15b9-4ec6-8c95-7c2bd33e0ecd" (UID: "64445002-15b9-4ec6-8c95-7c2bd33e0ecd"). InnerVolumeSpecName "ovn-rundir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.643681 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64445002-15b9-4ec6-8c95-7c2bd33e0ecd-kube-api-access-8xhmz" (OuterVolumeSpecName: "kube-api-access-8xhmz") pod "64445002-15b9-4ec6-8c95-7c2bd33e0ecd" (UID: "64445002-15b9-4ec6-8c95-7c2bd33e0ecd"). InnerVolumeSpecName "kube-api-access-8xhmz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.644986 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5b08afc-bfe3-4938-ac42-3781d1290201-scripts" (OuterVolumeSpecName: "scripts") pod "e5b08afc-bfe3-4938-ac42-3781d1290201" (UID: "e5b08afc-bfe3-4938-ac42-3781d1290201"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.646555 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5b08afc-bfe3-4938-ac42-3781d1290201-kube-api-access-465rw" (OuterVolumeSpecName: "kube-api-access-465rw") pod "e5b08afc-bfe3-4938-ac42-3781d1290201" (UID: "e5b08afc-bfe3-4938-ac42-3781d1290201"). InnerVolumeSpecName "kube-api-access-465rw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.658508 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5b08afc-bfe3-4938-ac42-3781d1290201-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "e5b08afc-bfe3-4938-ac42-3781d1290201" (UID: "e5b08afc-bfe3-4938-ac42-3781d1290201"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.681443 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutronc394-account-delete-lkcfh"] Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.694952 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutronc394-account-delete-lkcfh"] Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.695487 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5b08afc-bfe3-4938-ac42-3781d1290201-config-data" (OuterVolumeSpecName: "config-data") pod "e5b08afc-bfe3-4938-ac42-3781d1290201" (UID: "e5b08afc-bfe3-4938-ac42-3781d1290201"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.696836 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5b08afc-bfe3-4938-ac42-3781d1290201-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e5b08afc-bfe3-4938-ac42-3781d1290201" (UID: "e5b08afc-bfe3-4938-ac42-3781d1290201"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.713342 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-865d4cf8d6-bwj5n"] Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.726158 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64445002-15b9-4ec6-8c95-7c2bd33e0ecd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "64445002-15b9-4ec6-8c95-7c2bd33e0ecd" (UID: "64445002-15b9-4ec6-8c95-7c2bd33e0ecd"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.733880 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2fxxf\" (UniqueName: \"kubernetes.io/projected/34b62d72-52bc-4a7d-806c-52784476a695-kube-api-access-2fxxf\") pod \"34b62d72-52bc-4a7d-806c-52784476a695\" (UID: \"34b62d72-52bc-4a7d-806c-52784476a695\") " Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.736098 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34b62d72-52bc-4a7d-806c-52784476a695-combined-ca-bundle\") pod \"34b62d72-52bc-4a7d-806c-52784476a695\" (UID: \"34b62d72-52bc-4a7d-806c-52784476a695\") " Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.736229 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3bed8d9-6a2c-458b-a7d5-d6d28ac021e7-config-data\") pod \"a3bed8d9-6a2c-458b-a7d5-d6d28ac021e7\" (UID: \"a3bed8d9-6a2c-458b-a7d5-d6d28ac021e7\") " Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.736356 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34b62d72-52bc-4a7d-806c-52784476a695-config-data\") pod \"34b62d72-52bc-4a7d-806c-52784476a695\" (UID: \"34b62d72-52bc-4a7d-806c-52784476a695\") " Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.736384 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/34b62d72-52bc-4a7d-806c-52784476a695-public-tls-certs\") pod \"34b62d72-52bc-4a7d-806c-52784476a695\" (UID: \"34b62d72-52bc-4a7d-806c-52784476a695\") " Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.736417 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jtgv4\" 
(UniqueName: \"kubernetes.io/projected/a3bed8d9-6a2c-458b-a7d5-d6d28ac021e7-kube-api-access-jtgv4\") pod \"a3bed8d9-6a2c-458b-a7d5-d6d28ac021e7\" (UID: \"a3bed8d9-6a2c-458b-a7d5-d6d28ac021e7\") " Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.736494 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34b62d72-52bc-4a7d-806c-52784476a695-scripts\") pod \"34b62d72-52bc-4a7d-806c-52784476a695\" (UID: \"34b62d72-52bc-4a7d-806c-52784476a695\") " Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.736540 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/34b62d72-52bc-4a7d-806c-52784476a695-internal-tls-certs\") pod \"34b62d72-52bc-4a7d-806c-52784476a695\" (UID: \"34b62d72-52bc-4a7d-806c-52784476a695\") " Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.736576 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3bed8d9-6a2c-458b-a7d5-d6d28ac021e7-combined-ca-bundle\") pod \"a3bed8d9-6a2c-458b-a7d5-d6d28ac021e7\" (UID: \"a3bed8d9-6a2c-458b-a7d5-d6d28ac021e7\") " Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.737137 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64445002-15b9-4ec6-8c95-7c2bd33e0ecd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.737158 4823 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e5b08afc-bfe3-4938-ac42-3781d1290201-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.737170 4823 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/e5b08afc-bfe3-4938-ac42-3781d1290201-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.737183 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5b08afc-bfe3-4938-ac42-3781d1290201-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.737194 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-465rw\" (UniqueName: \"kubernetes.io/projected/e5b08afc-bfe3-4938-ac42-3781d1290201-kube-api-access-465rw\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.737206 4823 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e5b08afc-bfe3-4938-ac42-3781d1290201-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.737216 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8xhmz\" (UniqueName: \"kubernetes.io/projected/64445002-15b9-4ec6-8c95-7c2bd33e0ecd-kube-api-access-8xhmz\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.737229 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5b08afc-bfe3-4938-ac42-3781d1290201-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.737241 4823 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64445002-15b9-4ec6-8c95-7c2bd33e0ecd-config\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.737252 4823 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/64445002-15b9-4ec6-8c95-7c2bd33e0ecd-ovn-rundir\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:55 crc 
kubenswrapper[4823]: I1216 09:11:55.737262 4823 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/64445002-15b9-4ec6-8c95-7c2bd33e0ecd-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.747920 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-865d4cf8d6-bwj5n"] Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.751961 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34b62d72-52bc-4a7d-806c-52784476a695-kube-api-access-2fxxf" (OuterVolumeSpecName: "kube-api-access-2fxxf") pod "34b62d72-52bc-4a7d-806c-52784476a695" (UID: "34b62d72-52bc-4a7d-806c-52784476a695"). InnerVolumeSpecName "kube-api-access-2fxxf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.768294 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-fcf4dff7-84zz6"] Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.789189 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34b62d72-52bc-4a7d-806c-52784476a695-scripts" (OuterVolumeSpecName: "scripts") pod "34b62d72-52bc-4a7d-806c-52784476a695" (UID: "34b62d72-52bc-4a7d-806c-52784476a695"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.813831 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5b08afc-bfe3-4938-ac42-3781d1290201-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "e5b08afc-bfe3-4938-ac42-3781d1290201" (UID: "e5b08afc-bfe3-4938-ac42-3781d1290201"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.818365 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3bed8d9-6a2c-458b-a7d5-d6d28ac021e7-kube-api-access-jtgv4" (OuterVolumeSpecName: "kube-api-access-jtgv4") pod "a3bed8d9-6a2c-458b-a7d5-d6d28ac021e7" (UID: "a3bed8d9-6a2c-458b-a7d5-d6d28ac021e7"). InnerVolumeSpecName "kube-api-access-jtgv4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.822339 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-b64c64d55-q7zxm" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.835392 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64445002-15b9-4ec6-8c95-7c2bd33e0ecd-ovn-northd-tls-certs" (OuterVolumeSpecName: "ovn-northd-tls-certs") pod "64445002-15b9-4ec6-8c95-7c2bd33e0ecd" (UID: "64445002-15b9-4ec6-8c95-7c2bd33e0ecd"). InnerVolumeSpecName "ovn-northd-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.844622 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2fxxf\" (UniqueName: \"kubernetes.io/projected/34b62d72-52bc-4a7d-806c-52784476a695-kube-api-access-2fxxf\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.844658 4823 reconciler_common.go:293] "Volume detached for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/64445002-15b9-4ec6-8c95-7c2bd33e0ecd-ovn-northd-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.844669 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jtgv4\" (UniqueName: \"kubernetes.io/projected/a3bed8d9-6a2c-458b-a7d5-d6d28ac021e7-kube-api-access-jtgv4\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.844683 4823 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5b08afc-bfe3-4938-ac42-3781d1290201-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.844722 4823 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34b62d72-52bc-4a7d-806c-52784476a695-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:55 crc kubenswrapper[4823]: E1216 09:11:55.844796 4823 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 16 09:11:55 crc kubenswrapper[4823]: E1216 09:11:55.844848 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7835251f-9e66-445c-9581-0422195cdc2b-operator-scripts podName:7835251f-9e66-445c-9581-0422195cdc2b nodeName:}" failed. No retries permitted until 2025-12-16 09:11:59.844832905 +0000 UTC m=+8198.333399028 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/7835251f-9e66-445c-9581-0422195cdc2b-operator-scripts") pod "novacell0da3b-account-delete-2w9zh" (UID: "7835251f-9e66-445c-9581-0422195cdc2b") : configmap "openstack-scripts" not found Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.850847 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5b08afc-bfe3-4938-ac42-3781d1290201-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "e5b08afc-bfe3-4938-ac42-3781d1290201" (UID: "e5b08afc-bfe3-4938-ac42-3781d1290201"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.851176 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3bed8d9-6a2c-458b-a7d5-d6d28ac021e7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a3bed8d9-6a2c-458b-a7d5-d6d28ac021e7" (UID: "a3bed8d9-6a2c-458b-a7d5-d6d28ac021e7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.856454 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3bed8d9-6a2c-458b-a7d5-d6d28ac021e7-config-data" (OuterVolumeSpecName: "config-data") pod "a3bed8d9-6a2c-458b-a7d5-d6d28ac021e7" (UID: "a3bed8d9-6a2c-458b-a7d5-d6d28ac021e7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.859080 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.867171 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64445002-15b9-4ec6-8c95-7c2bd33e0ecd-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "64445002-15b9-4ec6-8c95-7c2bd33e0ecd" (UID: "64445002-15b9-4ec6-8c95-7c2bd33e0ecd"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.876086 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34b62d72-52bc-4a7d-806c-52784476a695-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "34b62d72-52bc-4a7d-806c-52784476a695" (UID: "34b62d72-52bc-4a7d-806c-52784476a695"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.886790 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04351c7d-aa0c-480c-8aba-86825423a27f" path="/var/lib/kubelet/pods/04351c7d-aa0c-480c-8aba-86825423a27f/volumes" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.887744 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a40068b-87bc-4af6-862d-ad33696041b3" path="/var/lib/kubelet/pods/2a40068b-87bc-4af6-862d-ad33696041b3/volumes" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.888632 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3be6f063-aed2-4468-9cd3-f7f03bd28211" path="/var/lib/kubelet/pods/3be6f063-aed2-4468-9cd3-f7f03bd28211/volumes" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.889958 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_64445002-15b9-4ec6-8c95-7c2bd33e0ecd/ovn-northd/0.log" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.890021 4823 generic.go:334] 
"Generic (PLEG): container finished" podID="64445002-15b9-4ec6-8c95-7c2bd33e0ecd" containerID="d14f2961c04bfe412ae181946bae7fa89e83576b7b38570699ef5a396fa77523" exitCode=139 Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.890220 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.890720 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4496b25e-2f39-453a-aa60-ffa74e9913c8" path="/var/lib/kubelet/pods/4496b25e-2f39-453a-aa60-ffa74e9913c8/volumes" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.892641 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4da7ae09-cc1d-4f42-b1be-7045236d12e9" path="/var/lib/kubelet/pods/4da7ae09-cc1d-4f42-b1be-7045236d12e9/volumes" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.893487 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5729be98-e3b4-42bd-92a6-913d63da1de3" path="/var/lib/kubelet/pods/5729be98-e3b4-42bd-92a6-913d63da1de3/volumes" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.895058 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60956cfa-c484-445d-af87-52713ccf4d09" path="/var/lib/kubelet/pods/60956cfa-c484-445d-af87-52713ccf4d09/volumes" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.896721 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8" path="/var/lib/kubelet/pods/6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8/volumes" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.897631 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f60cf52-47f0-4efd-8479-64bcc13848cf" path="/var/lib/kubelet/pods/6f60cf52-47f0-4efd-8479-64bcc13848cf/volumes" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.899112 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="76c19921-64a0-4b2b-ad81-bac464f2f54a" path="/var/lib/kubelet/pods/76c19921-64a0-4b2b-ad81-bac464f2f54a/volumes" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.900110 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a50033a-9a6e-42e3-ac23-de2a24654b0f" path="/var/lib/kubelet/pods/7a50033a-9a6e-42e3-ac23-de2a24654b0f/volumes" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.900931 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b75df4d-61d8-4913-bea9-018339e8e2a8" path="/var/lib/kubelet/pods/9b75df4d-61d8-4913-bea9-018339e8e2a8/volumes" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.903013 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d06b91f8-1fcd-40fe-b712-0549d99258c6" path="/var/lib/kubelet/pods/d06b91f8-1fcd-40fe-b712-0549d99258c6/volumes" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.914453 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9880fe3-977f-473b-84c9-2cb6f65d588d" path="/var/lib/kubelet/pods/d9880fe3-977f-473b-84c9-2cb6f65d588d/volumes" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.915175 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f058bf18-c31d-4b48-a183-bb9ae9223fbe" path="/var/lib/kubelet/pods/f058bf18-c31d-4b48-a183-bb9ae9223fbe/volumes" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.936914 4823 generic.go:334] "Generic (PLEG): container finished" podID="a3bed8d9-6a2c-458b-a7d5-d6d28ac021e7" containerID="0fc4d511ab4123720484006b298f65a27f5b85e3777d75783c5ffe138cedc8aa" exitCode=0 Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.937088 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.942401 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34b62d72-52bc-4a7d-806c-52784476a695-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "34b62d72-52bc-4a7d-806c-52784476a695" (UID: "34b62d72-52bc-4a7d-806c-52784476a695"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.943684 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.943689 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.943712 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heatf1cb-account-delete-2mdnl" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.943746 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-d656d958d-tmzmp" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.946082 4823 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5b08afc-bfe3-4938-ac42-3781d1290201-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.946108 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3bed8d9-6a2c-458b-a7d5-d6d28ac021e7-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.946121 4823 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/64445002-15b9-4ec6-8c95-7c2bd33e0ecd-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.946135 4823 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/34b62d72-52bc-4a7d-806c-52784476a695-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.946148 4823 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/34b62d72-52bc-4a7d-806c-52784476a695-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.946160 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3bed8d9-6a2c-458b-a7d5-d6d28ac021e7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:55 crc kubenswrapper[4823]: E1216 09:11:55.947018 4823 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 16 09:11:55 crc kubenswrapper[4823]: E1216 09:11:55.947087 4823 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/1422dc66-68e5-403d-9e01-657d83772587-operator-scripts podName:1422dc66-68e5-403d-9e01-657d83772587 nodeName:}" failed. No retries permitted until 2025-12-16 09:11:59.947071076 +0000 UTC m=+8198.435637199 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/1422dc66-68e5-403d-9e01-657d83772587-operator-scripts") pod "barbican75d9-account-delete-x7xds" (UID: "1422dc66-68e5-403d-9e01-657d83772587") : configmap "openstack-scripts" not found Dec 16 09:11:55 crc kubenswrapper[4823]: E1216 09:11:55.947386 4823 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 16 09:11:55 crc kubenswrapper[4823]: E1216 09:11:55.947417 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/25a0f697-45ab-48cd-b4e2-d5e8bcd3b725-operator-scripts podName:25a0f697-45ab-48cd-b4e2-d5e8bcd3b725 nodeName:}" failed. No retries permitted until 2025-12-16 09:11:59.947407886 +0000 UTC m=+8198.435974049 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/25a0f697-45ab-48cd-b4e2-d5e8bcd3b725-operator-scripts") pod "novaapif251-account-delete-dhnjg" (UID: "25a0f697-45ab-48cd-b4e2-d5e8bcd3b725") : configmap "openstack-scripts" not found Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.981214 4823 scope.go:117] "RemoveContainer" containerID="721ddb3d721e21d50c0be2952ef296f0188553dcfea31e2f3a2d25c394c2d3b6" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.985200 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34b62d72-52bc-4a7d-806c-52784476a695-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "34b62d72-52bc-4a7d-806c-52784476a695" (UID: "34b62d72-52bc-4a7d-806c-52784476a695"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.985634 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34b62d72-52bc-4a7d-806c-52784476a695-config-data" (OuterVolumeSpecName: "config-data") pod "34b62d72-52bc-4a7d-806c-52784476a695" (UID: "34b62d72-52bc-4a7d-806c-52784476a695"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.987801 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-fcf4dff7-84zz6"] Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.987872 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-ddfd865c7-nhsh6"] Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.987890 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-ddfd865c7-nhsh6"] Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.987904 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.987919 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-galera-0"] Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.987933 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-b64c64d55-q7zxm" event={"ID":"e5b08afc-bfe3-4938-ac42-3781d1290201","Type":"ContainerDied","Data":"471946ead711d4cc0705475683dd476bb4290ee837b8b8eb8c47d83311087b06"} Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.987962 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.987980 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" 
event={"ID":"34b62d72-52bc-4a7d-806c-52784476a695","Type":"ContainerDied","Data":"1ae969fa531745e02c6a805f42f3e16e6ef6cb8548d55c4daca838d7619357b6"} Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.988004 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"64445002-15b9-4ec6-8c95-7c2bd33e0ecd","Type":"ContainerDied","Data":"d14f2961c04bfe412ae181946bae7fa89e83576b7b38570699ef5a396fa77523"} Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.988019 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"64445002-15b9-4ec6-8c95-7c2bd33e0ecd","Type":"ContainerDied","Data":"e71013427b8ec529c188d0f556be5db5037f0ea0074d98f08688944c40eeec32"} Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.988054 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"a3bed8d9-6a2c-458b-a7d5-d6d28ac021e7","Type":"ContainerDied","Data":"0fc4d511ab4123720484006b298f65a27f5b85e3777d75783c5ffe138cedc8aa"} Dec 16 09:11:55 crc kubenswrapper[4823]: I1216 09:11:55.988069 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"a3bed8d9-6a2c-458b-a7d5-d6d28ac021e7","Type":"ContainerDied","Data":"671fa1c6a0651c345a57242308c25f95f53cda61ca020b343e7c592be1c3a231"} Dec 16 09:11:56 crc kubenswrapper[4823]: I1216 09:11:56.001107 4823 scope.go:117] "RemoveContainer" containerID="84d302ca04075fd211e3074ac4840cb034044f62733fd5920a6f6817538aea46" Dec 16 09:11:56 crc kubenswrapper[4823]: I1216 09:11:56.004890 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 16 09:11:56 crc kubenswrapper[4823]: I1216 09:11:56.048796 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34b62d72-52bc-4a7d-806c-52784476a695-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:56 crc kubenswrapper[4823]: I1216 
09:11:56.049082 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34b62d72-52bc-4a7d-806c-52784476a695-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:56 crc kubenswrapper[4823]: I1216 09:11:56.114063 4823 scope.go:117] "RemoveContainer" containerID="a872bdca1a55985b613b2e9d1e9a92fa37fb0ac195eba280c9f758c50de98937" Dec 16 09:11:56 crc kubenswrapper[4823]: I1216 09:11:56.116216 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-7454ff977b-h6fwh"] Dec 16 09:11:56 crc kubenswrapper[4823]: I1216 09:11:56.139799 4823 scope.go:117] "RemoveContainer" containerID="721c47558f9af7bfce05a40ae981abe655f7282de605a46f281cb1818605ab71" Dec 16 09:11:56 crc kubenswrapper[4823]: E1216 09:11:56.141350 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"721c47558f9af7bfce05a40ae981abe655f7282de605a46f281cb1818605ab71\": container with ID starting with 721c47558f9af7bfce05a40ae981abe655f7282de605a46f281cb1818605ab71 not found: ID does not exist" containerID="721c47558f9af7bfce05a40ae981abe655f7282de605a46f281cb1818605ab71" Dec 16 09:11:56 crc kubenswrapper[4823]: I1216 09:11:56.141397 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"721c47558f9af7bfce05a40ae981abe655f7282de605a46f281cb1818605ab71"} err="failed to get container status \"721c47558f9af7bfce05a40ae981abe655f7282de605a46f281cb1818605ab71\": rpc error: code = NotFound desc = could not find container \"721c47558f9af7bfce05a40ae981abe655f7282de605a46f281cb1818605ab71\": container with ID starting with 721c47558f9af7bfce05a40ae981abe655f7282de605a46f281cb1818605ab71 not found: ID does not exist" Dec 16 09:11:56 crc kubenswrapper[4823]: I1216 09:11:56.141426 4823 scope.go:117] "RemoveContainer" containerID="84d302ca04075fd211e3074ac4840cb034044f62733fd5920a6f6817538aea46" Dec 16 09:11:56 
crc kubenswrapper[4823]: I1216 09:11:56.144425 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-7454ff977b-h6fwh"] Dec 16 09:11:56 crc kubenswrapper[4823]: E1216 09:11:56.145292 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84d302ca04075fd211e3074ac4840cb034044f62733fd5920a6f6817538aea46\": container with ID starting with 84d302ca04075fd211e3074ac4840cb034044f62733fd5920a6f6817538aea46 not found: ID does not exist" containerID="84d302ca04075fd211e3074ac4840cb034044f62733fd5920a6f6817538aea46" Dec 16 09:11:56 crc kubenswrapper[4823]: I1216 09:11:56.145322 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84d302ca04075fd211e3074ac4840cb034044f62733fd5920a6f6817538aea46"} err="failed to get container status \"84d302ca04075fd211e3074ac4840cb034044f62733fd5920a6f6817538aea46\": rpc error: code = NotFound desc = could not find container \"84d302ca04075fd211e3074ac4840cb034044f62733fd5920a6f6817538aea46\": container with ID starting with 84d302ca04075fd211e3074ac4840cb034044f62733fd5920a6f6817538aea46 not found: ID does not exist" Dec 16 09:11:56 crc kubenswrapper[4823]: I1216 09:11:56.145338 4823 scope.go:117] "RemoveContainer" containerID="aeb46928562b9e0657f49ff73daa201ffbf7d9ddbfda61724d79baa281b28aab" Dec 16 09:11:56 crc kubenswrapper[4823]: I1216 09:11:56.147174 4823 scope.go:117] "RemoveContainer" containerID="44afd549f065376806e3735489d6257a4793e59063b189217a6eecd50e0f1af0" Dec 16 09:11:56 crc kubenswrapper[4823]: I1216 09:11:56.209808 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 16 09:11:56 crc kubenswrapper[4823]: I1216 09:11:56.211914 4823 scope.go:117] "RemoveContainer" containerID="651f28a2c721b5b4308bee72f9032a131e2c7f4a064121891960b81b54b65133" Dec 16 09:11:56 crc kubenswrapper[4823]: I1216 09:11:56.220418 4823 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack/rabbitmq-server-0"] Dec 16 09:11:56 crc kubenswrapper[4823]: I1216 09:11:56.231205 4823 scope.go:117] "RemoveContainer" containerID="4429a4325a93631f06371dd2afacb4eb2d4fb8581157516905df32e1ec8033bf" Dec 16 09:11:56 crc kubenswrapper[4823]: I1216 09:11:56.236512 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 16 09:11:56 crc kubenswrapper[4823]: I1216 09:11:56.260118 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Dec 16 09:11:56 crc kubenswrapper[4823]: I1216 09:11:56.285120 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 16 09:11:56 crc kubenswrapper[4823]: I1216 09:11:56.294386 4823 scope.go:117] "RemoveContainer" containerID="4abb0fa0b08dc3eb1d6b6ce1c34e2a3bc2f4171afda40aa20a25497cce168b3b" Dec 16 09:11:56 crc kubenswrapper[4823]: I1216 09:11:56.298131 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 16 09:11:56 crc kubenswrapper[4823]: I1216 09:11:56.327176 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 16 09:11:56 crc kubenswrapper[4823]: I1216 09:11:56.343894 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 16 09:11:56 crc kubenswrapper[4823]: I1216 09:11:56.346058 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/novaapif251-account-delete-dhnjg" Dec 16 09:11:56 crc kubenswrapper[4823]: I1216 09:11:56.350387 4823 scope.go:117] "RemoveContainer" containerID="5a71791d0d178cb3e2f0ca5f41f8f5be586775d58f1659933d5697c3e1b3e765" Dec 16 09:11:56 crc kubenswrapper[4823]: I1216 09:11:56.356446 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-8589448fc-qj569"] Dec 16 09:11:56 crc kubenswrapper[4823]: I1216 09:11:56.373125 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-8589448fc-qj569"] Dec 16 09:11:56 crc kubenswrapper[4823]: I1216 09:11:56.390721 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 16 09:11:56 crc kubenswrapper[4823]: I1216 09:11:56.402039 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 16 09:11:56 crc kubenswrapper[4823]: I1216 09:11:56.416228 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 16 09:11:56 crc kubenswrapper[4823]: I1216 09:11:56.429395 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 16 09:11:56 crc kubenswrapper[4823]: I1216 09:11:56.434455 4823 scope.go:117] "RemoveContainer" containerID="417dd7302ac11767038b48587c8e4bda9ac67055e65321822c980a8f887c974d" Dec 16 09:11:56 crc kubenswrapper[4823]: I1216 09:11:56.447242 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Dec 16 09:11:56 crc kubenswrapper[4823]: I1216 09:11:56.453650 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/25a0f697-45ab-48cd-b4e2-d5e8bcd3b725-operator-scripts\") pod \"25a0f697-45ab-48cd-b4e2-d5e8bcd3b725\" (UID: \"25a0f697-45ab-48cd-b4e2-d5e8bcd3b725\") " Dec 16 09:11:56 crc kubenswrapper[4823]: I1216 09:11:56.453806 4823 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-lxkgf\" (UniqueName: \"kubernetes.io/projected/25a0f697-45ab-48cd-b4e2-d5e8bcd3b725-kube-api-access-lxkgf\") pod \"25a0f697-45ab-48cd-b4e2-d5e8bcd3b725\" (UID: \"25a0f697-45ab-48cd-b4e2-d5e8bcd3b725\") " Dec 16 09:11:56 crc kubenswrapper[4823]: I1216 09:11:56.455234 4823 scope.go:117] "RemoveContainer" containerID="8d06e3290f385b94ea1f27ba2ac87da8630c1786706af2c1115d30b9f2ec0dd7" Dec 16 09:11:56 crc kubenswrapper[4823]: I1216 09:11:56.457502 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25a0f697-45ab-48cd-b4e2-d5e8bcd3b725-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "25a0f697-45ab-48cd-b4e2-d5e8bcd3b725" (UID: "25a0f697-45ab-48cd-b4e2-d5e8bcd3b725"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 09:11:56 crc kubenswrapper[4823]: I1216 09:11:56.457551 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-northd-0"] Dec 16 09:11:56 crc kubenswrapper[4823]: I1216 09:11:56.460063 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25a0f697-45ab-48cd-b4e2-d5e8bcd3b725-kube-api-access-lxkgf" (OuterVolumeSpecName: "kube-api-access-lxkgf") pod "25a0f697-45ab-48cd-b4e2-d5e8bcd3b725" (UID: "25a0f697-45ab-48cd-b4e2-d5e8bcd3b725"). InnerVolumeSpecName "kube-api-access-lxkgf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:11:56 crc kubenswrapper[4823]: I1216 09:11:56.465279 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heatf1cb-account-delete-2mdnl"] Dec 16 09:11:56 crc kubenswrapper[4823]: I1216 09:11:56.471150 4823 scope.go:117] "RemoveContainer" containerID="25018dc3f33bf4fbcc5228605d9875ddf4805fa913e97070453b8841fe915d79" Dec 16 09:11:56 crc kubenswrapper[4823]: I1216 09:11:56.473622 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heatf1cb-account-delete-2mdnl"] Dec 16 09:11:56 crc kubenswrapper[4823]: I1216 09:11:56.473695 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican75d9-account-delete-x7xds" Dec 16 09:11:56 crc kubenswrapper[4823]: I1216 09:11:56.479580 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/novacell0da3b-account-delete-2w9zh" Dec 16 09:11:56 crc kubenswrapper[4823]: I1216 09:11:56.480412 4823 scope.go:117] "RemoveContainer" containerID="34e5dedd3f4eeec3f9dfff67c9bc1eb3a9095430a483155bc57be31d63b3460b" Dec 16 09:11:56 crc kubenswrapper[4823]: I1216 09:11:56.482218 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Dec 16 09:11:56 crc kubenswrapper[4823]: I1216 09:11:56.495188 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-0"] Dec 16 09:11:56 crc kubenswrapper[4823]: I1216 09:11:56.502689 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Dec 16 09:11:56 crc kubenswrapper[4823]: I1216 09:11:56.521475 4823 scope.go:117] "RemoveContainer" containerID="77f7563a34f0a287066ce2a8f04f92118c66c8f3bccb6cfd97b6587b049219b8" Dec 16 09:11:56 crc kubenswrapper[4823]: I1216 09:11:56.529674 4823 scope.go:117] "RemoveContainer" containerID="69a1928930a4c71ac0b712b8fdbc30fc7cd0ea0a10daa4d5d941a2057c2caf2e" Dec 16 09:11:56 crc kubenswrapper[4823]: I1216 09:11:56.538589 4823 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/memcached-0"] Dec 16 09:11:56 crc kubenswrapper[4823]: I1216 09:11:56.556192 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-d656d958d-tmzmp"] Dec 16 09:11:56 crc kubenswrapper[4823]: I1216 09:11:56.556216 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1422dc66-68e5-403d-9e01-657d83772587-operator-scripts\") pod \"1422dc66-68e5-403d-9e01-657d83772587\" (UID: \"1422dc66-68e5-403d-9e01-657d83772587\") " Dec 16 09:11:56 crc kubenswrapper[4823]: I1216 09:11:56.556388 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-22z6t\" (UniqueName: \"kubernetes.io/projected/7835251f-9e66-445c-9581-0422195cdc2b-kube-api-access-22z6t\") pod \"7835251f-9e66-445c-9581-0422195cdc2b\" (UID: \"7835251f-9e66-445c-9581-0422195cdc2b\") " Dec 16 09:11:56 crc kubenswrapper[4823]: I1216 09:11:56.556566 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2nps6\" (UniqueName: \"kubernetes.io/projected/1422dc66-68e5-403d-9e01-657d83772587-kube-api-access-2nps6\") pod \"1422dc66-68e5-403d-9e01-657d83772587\" (UID: \"1422dc66-68e5-403d-9e01-657d83772587\") " Dec 16 09:11:56 crc kubenswrapper[4823]: I1216 09:11:56.556665 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7835251f-9e66-445c-9581-0422195cdc2b-operator-scripts\") pod \"7835251f-9e66-445c-9581-0422195cdc2b\" (UID: \"7835251f-9e66-445c-9581-0422195cdc2b\") " Dec 16 09:11:56 crc kubenswrapper[4823]: I1216 09:11:56.556866 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1422dc66-68e5-403d-9e01-657d83772587-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1422dc66-68e5-403d-9e01-657d83772587" 
(UID: "1422dc66-68e5-403d-9e01-657d83772587"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 09:11:56 crc kubenswrapper[4823]: I1216 09:11:56.558077 4823 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/25a0f697-45ab-48cd-b4e2-d5e8bcd3b725-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:56 crc kubenswrapper[4823]: I1216 09:11:56.558109 4823 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1422dc66-68e5-403d-9e01-657d83772587-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:56 crc kubenswrapper[4823]: I1216 09:11:56.558142 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lxkgf\" (UniqueName: \"kubernetes.io/projected/25a0f697-45ab-48cd-b4e2-d5e8bcd3b725-kube-api-access-lxkgf\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:56 crc kubenswrapper[4823]: I1216 09:11:56.558574 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7835251f-9e66-445c-9581-0422195cdc2b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7835251f-9e66-445c-9581-0422195cdc2b" (UID: "7835251f-9e66-445c-9581-0422195cdc2b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 09:11:56 crc kubenswrapper[4823]: I1216 09:11:56.564902 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1422dc66-68e5-403d-9e01-657d83772587-kube-api-access-2nps6" (OuterVolumeSpecName: "kube-api-access-2nps6") pod "1422dc66-68e5-403d-9e01-657d83772587" (UID: "1422dc66-68e5-403d-9e01-657d83772587"). InnerVolumeSpecName "kube-api-access-2nps6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:11:56 crc kubenswrapper[4823]: I1216 09:11:56.578047 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7835251f-9e66-445c-9581-0422195cdc2b-kube-api-access-22z6t" (OuterVolumeSpecName: "kube-api-access-22z6t") pod "7835251f-9e66-445c-9581-0422195cdc2b" (UID: "7835251f-9e66-445c-9581-0422195cdc2b"). InnerVolumeSpecName "kube-api-access-22z6t". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:11:56 crc kubenswrapper[4823]: I1216 09:11:56.578364 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-d656d958d-tmzmp"] Dec 16 09:11:56 crc kubenswrapper[4823]: I1216 09:11:56.580296 4823 scope.go:117] "RemoveContainer" containerID="d0ed8363af4a48c1ad2fe42fbd2a98b00aaad8af5f9c4a1438b6a7c118165062" Dec 16 09:11:56 crc kubenswrapper[4823]: I1216 09:11:56.586825 4823 scope.go:117] "RemoveContainer" containerID="edfff901e422667feb8df9942487dc99f3781c67ee58c02cc9599524e02a462f" Dec 16 09:11:56 crc kubenswrapper[4823]: I1216 09:11:56.597002 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-b64c64d55-q7zxm"] Dec 16 09:11:56 crc kubenswrapper[4823]: I1216 09:11:56.598353 4823 scope.go:117] "RemoveContainer" containerID="d1a4fe707f62e7d60a2461aca9f820aa5ffe503be99824ae07a55f13abe97b6a" Dec 16 09:11:56 crc kubenswrapper[4823]: E1216 09:11:56.598825 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1a4fe707f62e7d60a2461aca9f820aa5ffe503be99824ae07a55f13abe97b6a\": container with ID starting with d1a4fe707f62e7d60a2461aca9f820aa5ffe503be99824ae07a55f13abe97b6a not found: ID does not exist" containerID="d1a4fe707f62e7d60a2461aca9f820aa5ffe503be99824ae07a55f13abe97b6a" Dec 16 09:11:56 crc kubenswrapper[4823]: E1216 09:11:56.598874 4823 kuberuntime_gc.go:150] "Failed to remove container" err="failed to get container 
status \"d1a4fe707f62e7d60a2461aca9f820aa5ffe503be99824ae07a55f13abe97b6a\": rpc error: code = NotFound desc = could not find container \"d1a4fe707f62e7d60a2461aca9f820aa5ffe503be99824ae07a55f13abe97b6a\": container with ID starting with d1a4fe707f62e7d60a2461aca9f820aa5ffe503be99824ae07a55f13abe97b6a not found: ID does not exist" containerID="d1a4fe707f62e7d60a2461aca9f820aa5ffe503be99824ae07a55f13abe97b6a" Dec 16 09:11:56 crc kubenswrapper[4823]: I1216 09:11:56.598898 4823 scope.go:117] "RemoveContainer" containerID="30fdfe487e62f583768c370c7e485f339f7b652c87130a0e97f5217998c63ec1" Dec 16 09:11:56 crc kubenswrapper[4823]: I1216 09:11:56.607210 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-b64c64d55-q7zxm"] Dec 16 09:11:56 crc kubenswrapper[4823]: I1216 09:11:56.609446 4823 scope.go:117] "RemoveContainer" containerID="c7585de23c8702a670fdde1b698b632bfc5040ed1281eb2e9f42a9174e0f40ca" Dec 16 09:11:56 crc kubenswrapper[4823]: I1216 09:11:56.618373 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 16 09:11:56 crc kubenswrapper[4823]: I1216 09:11:56.627733 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 16 09:11:56 crc kubenswrapper[4823]: I1216 09:11:56.632321 4823 scope.go:117] "RemoveContainer" containerID="c4f509080a9f88ea8968e728a43f48daa0e137e74d01c40c44bdd031bebe8a40" Dec 16 09:11:56 crc kubenswrapper[4823]: I1216 09:11:56.646274 4823 scope.go:117] "RemoveContainer" containerID="d14f2961c04bfe412ae181946bae7fa89e83576b7b38570699ef5a396fa77523" Dec 16 09:11:56 crc kubenswrapper[4823]: I1216 09:11:56.654892 4823 scope.go:117] "RemoveContainer" containerID="e3714187f4fd4a54fbc4a0c088bda82d8215a431696595f42b4fd6b49fdb78e3" Dec 16 09:11:56 crc kubenswrapper[4823]: E1216 09:11:56.655280 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"e3714187f4fd4a54fbc4a0c088bda82d8215a431696595f42b4fd6b49fdb78e3\": container with ID starting with e3714187f4fd4a54fbc4a0c088bda82d8215a431696595f42b4fd6b49fdb78e3 not found: ID does not exist" containerID="e3714187f4fd4a54fbc4a0c088bda82d8215a431696595f42b4fd6b49fdb78e3" Dec 16 09:11:56 crc kubenswrapper[4823]: E1216 09:11:56.655314 4823 kuberuntime_gc.go:150] "Failed to remove container" err="failed to get container status \"e3714187f4fd4a54fbc4a0c088bda82d8215a431696595f42b4fd6b49fdb78e3\": rpc error: code = NotFound desc = could not find container \"e3714187f4fd4a54fbc4a0c088bda82d8215a431696595f42b4fd6b49fdb78e3\": container with ID starting with e3714187f4fd4a54fbc4a0c088bda82d8215a431696595f42b4fd6b49fdb78e3 not found: ID does not exist" containerID="e3714187f4fd4a54fbc4a0c088bda82d8215a431696595f42b4fd6b49fdb78e3" Dec 16 09:11:56 crc kubenswrapper[4823]: I1216 09:11:56.655331 4823 scope.go:117] "RemoveContainer" containerID="6153e939f78f8727294cd18b744eafe21fd2960278b20c36c46e082697f211e2" Dec 16 09:11:56 crc kubenswrapper[4823]: I1216 09:11:56.659331 4823 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7835251f-9e66-445c-9581-0422195cdc2b-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:56 crc kubenswrapper[4823]: I1216 09:11:56.659357 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-22z6t\" (UniqueName: \"kubernetes.io/projected/7835251f-9e66-445c-9581-0422195cdc2b-kube-api-access-22z6t\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:56 crc kubenswrapper[4823]: I1216 09:11:56.659386 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2nps6\" (UniqueName: \"kubernetes.io/projected/1422dc66-68e5-403d-9e01-657d83772587-kube-api-access-2nps6\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:56 crc kubenswrapper[4823]: I1216 09:11:56.662826 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/mariadb-copy-data"] Dec 16 09:11:56 crc kubenswrapper[4823]: I1216 09:11:56.663078 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/mariadb-copy-data" podUID="a5a5b801-4e68-48cb-a50c-c1f7cc5bf2e7" containerName="adoption" containerID="cri-o://809020e57886a330ac0588ed75cd6801305a87d81cbb57b685c04b626655fd93" gracePeriod=30 Dec 16 09:11:56 crc kubenswrapper[4823]: I1216 09:11:56.673054 4823 scope.go:117] "RemoveContainer" containerID="c7585de23c8702a670fdde1b698b632bfc5040ed1281eb2e9f42a9174e0f40ca" Dec 16 09:11:56 crc kubenswrapper[4823]: E1216 09:11:56.673495 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c7585de23c8702a670fdde1b698b632bfc5040ed1281eb2e9f42a9174e0f40ca\": container with ID starting with c7585de23c8702a670fdde1b698b632bfc5040ed1281eb2e9f42a9174e0f40ca not found: ID does not exist" containerID="c7585de23c8702a670fdde1b698b632bfc5040ed1281eb2e9f42a9174e0f40ca" Dec 16 09:11:56 crc kubenswrapper[4823]: I1216 09:11:56.673538 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7585de23c8702a670fdde1b698b632bfc5040ed1281eb2e9f42a9174e0f40ca"} err="failed to get container status \"c7585de23c8702a670fdde1b698b632bfc5040ed1281eb2e9f42a9174e0f40ca\": rpc error: code = NotFound desc = could not find container \"c7585de23c8702a670fdde1b698b632bfc5040ed1281eb2e9f42a9174e0f40ca\": container with ID starting with c7585de23c8702a670fdde1b698b632bfc5040ed1281eb2e9f42a9174e0f40ca not found: ID does not exist" Dec 16 09:11:56 crc kubenswrapper[4823]: I1216 09:11:56.673560 4823 scope.go:117] "RemoveContainer" containerID="d14f2961c04bfe412ae181946bae7fa89e83576b7b38570699ef5a396fa77523" Dec 16 09:11:56 crc kubenswrapper[4823]: E1216 09:11:56.673924 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"d14f2961c04bfe412ae181946bae7fa89e83576b7b38570699ef5a396fa77523\": container with ID starting with d14f2961c04bfe412ae181946bae7fa89e83576b7b38570699ef5a396fa77523 not found: ID does not exist" containerID="d14f2961c04bfe412ae181946bae7fa89e83576b7b38570699ef5a396fa77523" Dec 16 09:11:56 crc kubenswrapper[4823]: I1216 09:11:56.673951 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d14f2961c04bfe412ae181946bae7fa89e83576b7b38570699ef5a396fa77523"} err="failed to get container status \"d14f2961c04bfe412ae181946bae7fa89e83576b7b38570699ef5a396fa77523\": rpc error: code = NotFound desc = could not find container \"d14f2961c04bfe412ae181946bae7fa89e83576b7b38570699ef5a396fa77523\": container with ID starting with d14f2961c04bfe412ae181946bae7fa89e83576b7b38570699ef5a396fa77523 not found: ID does not exist" Dec 16 09:11:56 crc kubenswrapper[4823]: I1216 09:11:56.673970 4823 scope.go:117] "RemoveContainer" containerID="0fc4d511ab4123720484006b298f65a27f5b85e3777d75783c5ffe138cedc8aa" Dec 16 09:11:56 crc kubenswrapper[4823]: I1216 09:11:56.731611 4823 scope.go:117] "RemoveContainer" containerID="0fc4d511ab4123720484006b298f65a27f5b85e3777d75783c5ffe138cedc8aa" Dec 16 09:11:56 crc kubenswrapper[4823]: E1216 09:11:56.732290 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0fc4d511ab4123720484006b298f65a27f5b85e3777d75783c5ffe138cedc8aa\": container with ID starting with 0fc4d511ab4123720484006b298f65a27f5b85e3777d75783c5ffe138cedc8aa not found: ID does not exist" containerID="0fc4d511ab4123720484006b298f65a27f5b85e3777d75783c5ffe138cedc8aa" Dec 16 09:11:56 crc kubenswrapper[4823]: I1216 09:11:56.732401 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0fc4d511ab4123720484006b298f65a27f5b85e3777d75783c5ffe138cedc8aa"} err="failed to get container status 
\"0fc4d511ab4123720484006b298f65a27f5b85e3777d75783c5ffe138cedc8aa\": rpc error: code = NotFound desc = could not find container \"0fc4d511ab4123720484006b298f65a27f5b85e3777d75783c5ffe138cedc8aa\": container with ID starting with 0fc4d511ab4123720484006b298f65a27f5b85e3777d75783c5ffe138cedc8aa not found: ID does not exist" Dec 16 09:11:56 crc kubenswrapper[4823]: I1216 09:11:56.756770 4823 scope.go:117] "RemoveContainer" containerID="81905a10f112a2c628e83243c3b5f4a8905df8cf52465b30f43adc5b1087cc6c" Dec 16 09:11:56 crc kubenswrapper[4823]: I1216 09:11:56.791272 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-copy-data"] Dec 16 09:11:56 crc kubenswrapper[4823]: I1216 09:11:56.791728 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-copy-data" podUID="b416f746-16ad-4f74-b315-f67ca3d0bb35" containerName="adoption" containerID="cri-o://01385a59cb3d569d4bca785b2dde8a6fb841f70317cca67d92c68009862c7ab6" gracePeriod=30 Dec 16 09:11:56 crc kubenswrapper[4823]: I1216 09:11:56.799631 4823 scope.go:117] "RemoveContainer" containerID="8de49ec4ecd014f0c63b6ec26daddcc666881ad384dbde0da558498c271e033a" Dec 16 09:11:56 crc kubenswrapper[4823]: I1216 09:11:56.820776 4823 scope.go:117] "RemoveContainer" containerID="86be94bdbff4c07beea3917efb385bd5395bed9cd2e2647743ab02d6da764784" Dec 16 09:11:56 crc kubenswrapper[4823]: I1216 09:11:56.852638 4823 scope.go:117] "RemoveContainer" containerID="f09c828395889bafb8967722bc9fe10e42c34bbfc0a893f1c82cb91b42750c4b" Dec 16 09:11:56 crc kubenswrapper[4823]: I1216 09:11:56.880198 4823 scope.go:117] "RemoveContainer" containerID="84d302ca04075fd211e3074ac4840cb034044f62733fd5920a6f6817538aea46" Dec 16 09:11:56 crc kubenswrapper[4823]: E1216 09:11:56.880973 4823 kuberuntime_gc.go:150] "Failed to remove container" err="failed to get container status \"84d302ca04075fd211e3074ac4840cb034044f62733fd5920a6f6817538aea46\": rpc error: code = NotFound desc = could not find container 
\"84d302ca04075fd211e3074ac4840cb034044f62733fd5920a6f6817538aea46\": container with ID starting with 84d302ca04075fd211e3074ac4840cb034044f62733fd5920a6f6817538aea46 not found: ID does not exist" containerID="84d302ca04075fd211e3074ac4840cb034044f62733fd5920a6f6817538aea46" Dec 16 09:11:56 crc kubenswrapper[4823]: I1216 09:11:56.881129 4823 scope.go:117] "RemoveContainer" containerID="6c09a5c99fb275c16db9a38bca9fa510e6e4a0d570871a34f72960096724361b" Dec 16 09:11:56 crc kubenswrapper[4823]: I1216 09:11:56.924533 4823 scope.go:117] "RemoveContainer" containerID="ba1b7898d20fd45107f404b20c4708776c2d7d02569bb8372b7f1258b422904d" Dec 16 09:11:56 crc kubenswrapper[4823]: I1216 09:11:56.949243 4823 scope.go:117] "RemoveContainer" containerID="c7585de23c8702a670fdde1b698b632bfc5040ed1281eb2e9f42a9174e0f40ca" Dec 16 09:11:56 crc kubenswrapper[4823]: E1216 09:11:56.949685 4823 kuberuntime_gc.go:150] "Failed to remove container" err="failed to get container status \"c7585de23c8702a670fdde1b698b632bfc5040ed1281eb2e9f42a9174e0f40ca\": rpc error: code = NotFound desc = could not find container \"c7585de23c8702a670fdde1b698b632bfc5040ed1281eb2e9f42a9174e0f40ca\": container with ID starting with c7585de23c8702a670fdde1b698b632bfc5040ed1281eb2e9f42a9174e0f40ca not found: ID does not exist" containerID="c7585de23c8702a670fdde1b698b632bfc5040ed1281eb2e9f42a9174e0f40ca" Dec 16 09:11:56 crc kubenswrapper[4823]: I1216 09:11:56.949748 4823 scope.go:117] "RemoveContainer" containerID="8ad69e3090748c1e799f5496828a84c316a66aa70069a402b82eec113ca2e82a" Dec 16 09:11:56 crc kubenswrapper[4823]: I1216 09:11:56.959089 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novaapif251-account-delete-dhnjg" event={"ID":"25a0f697-45ab-48cd-b4e2-d5e8bcd3b725","Type":"ContainerDied","Data":"c69cb34cb1b02d5fa8a3c6c2a83bc20184104220c1e0d57d6f0b8c7b9d0cf1d9"} Dec 16 09:11:56 crc kubenswrapper[4823]: I1216 09:11:56.959354 4823 scope.go:117] "RemoveContainer" 
containerID="9cf75a92220918199d53c40a114d1e917e97cd49891dc0fbb874a085d7c6fba4" Dec 16 09:11:56 crc kubenswrapper[4823]: I1216 09:11:56.959340 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/novaapif251-account-delete-dhnjg" Dec 16 09:11:56 crc kubenswrapper[4823]: I1216 09:11:56.969812 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican75d9-account-delete-x7xds" event={"ID":"1422dc66-68e5-403d-9e01-657d83772587","Type":"ContainerDied","Data":"a743c22038bc13e5b952d64ec76e34d2c84b9885987049e0b038a31e63335500"} Dec 16 09:11:56 crc kubenswrapper[4823]: I1216 09:11:56.969937 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican75d9-account-delete-x7xds" Dec 16 09:11:56 crc kubenswrapper[4823]: I1216 09:11:56.979367 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell0da3b-account-delete-2w9zh" event={"ID":"7835251f-9e66-445c-9581-0422195cdc2b","Type":"ContainerDied","Data":"f890c4c2314b798dc974513df5541e8952b1552ebd64dd80204df16fbde9f3c3"} Dec 16 09:11:56 crc kubenswrapper[4823]: I1216 09:11:56.979497 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/novacell0da3b-account-delete-2w9zh" Dec 16 09:11:56 crc kubenswrapper[4823]: I1216 09:11:56.990275 4823 scope.go:117] "RemoveContainer" containerID="00c32270da7e5bc5abc76b1cc8b234f94d52f813f0d0567012796ddcc39edf32" Dec 16 09:11:57 crc kubenswrapper[4823]: I1216 09:11:57.037357 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican75d9-account-delete-x7xds"] Dec 16 09:11:57 crc kubenswrapper[4823]: I1216 09:11:57.048069 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican75d9-account-delete-x7xds"] Dec 16 09:11:57 crc kubenswrapper[4823]: I1216 09:11:57.064599 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novacell0da3b-account-delete-2w9zh"] Dec 16 09:11:57 crc kubenswrapper[4823]: I1216 09:11:57.068199 4823 scope.go:117] "RemoveContainer" containerID="7ad1800fb9f46bfb6f404756ddd7cf764c4bcb0ef59c32098fe6e1ad4f76221f" Dec 16 09:11:57 crc kubenswrapper[4823]: I1216 09:11:57.073221 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/novacell0da3b-account-delete-2w9zh"] Dec 16 09:11:57 crc kubenswrapper[4823]: I1216 09:11:57.082670 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novaapif251-account-delete-dhnjg"] Dec 16 09:11:57 crc kubenswrapper[4823]: I1216 09:11:57.088353 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/novaapif251-account-delete-dhnjg"] Dec 16 09:11:57 crc kubenswrapper[4823]: I1216 09:11:57.103822 4823 scope.go:117] "RemoveContainer" containerID="9a424663c04d2d1f4d56826ed6ce633b0f8821a510cac5fcd0653e0828e82b8e" Dec 16 09:11:57 crc kubenswrapper[4823]: I1216 09:11:57.144769 4823 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="91f5097e-d643-4598-9d06-39f14f913291" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.1.60:8776/healthcheck\": net/http: request canceled while waiting for connection (Client.Timeout 
exceeded while awaiting headers)" Dec 16 09:11:57 crc kubenswrapper[4823]: I1216 09:11:57.580795 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 16 09:11:57 crc kubenswrapper[4823]: I1216 09:11:57.679275 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9q46t\" (UniqueName: \"kubernetes.io/projected/91080e73-6479-4c8b-bb2f-decdc0ade67e-kube-api-access-9q46t\") pod \"91080e73-6479-4c8b-bb2f-decdc0ade67e\" (UID: \"91080e73-6479-4c8b-bb2f-decdc0ade67e\") " Dec 16 09:11:57 crc kubenswrapper[4823]: I1216 09:11:57.679357 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/91080e73-6479-4c8b-bb2f-decdc0ade67e-run-httpd\") pod \"91080e73-6479-4c8b-bb2f-decdc0ade67e\" (UID: \"91080e73-6479-4c8b-bb2f-decdc0ade67e\") " Dec 16 09:11:57 crc kubenswrapper[4823]: I1216 09:11:57.679391 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/91080e73-6479-4c8b-bb2f-decdc0ade67e-scripts\") pod \"91080e73-6479-4c8b-bb2f-decdc0ade67e\" (UID: \"91080e73-6479-4c8b-bb2f-decdc0ade67e\") " Dec 16 09:11:57 crc kubenswrapper[4823]: I1216 09:11:57.679446 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91080e73-6479-4c8b-bb2f-decdc0ade67e-combined-ca-bundle\") pod \"91080e73-6479-4c8b-bb2f-decdc0ade67e\" (UID: \"91080e73-6479-4c8b-bb2f-decdc0ade67e\") " Dec 16 09:11:57 crc kubenswrapper[4823]: I1216 09:11:57.679471 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/91080e73-6479-4c8b-bb2f-decdc0ade67e-log-httpd\") pod \"91080e73-6479-4c8b-bb2f-decdc0ade67e\" (UID: \"91080e73-6479-4c8b-bb2f-decdc0ade67e\") " Dec 16 09:11:57 crc 
kubenswrapper[4823]: I1216 09:11:57.679486 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91080e73-6479-4c8b-bb2f-decdc0ade67e-config-data\") pod \"91080e73-6479-4c8b-bb2f-decdc0ade67e\" (UID: \"91080e73-6479-4c8b-bb2f-decdc0ade67e\") " Dec 16 09:11:57 crc kubenswrapper[4823]: I1216 09:11:57.679524 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/91080e73-6479-4c8b-bb2f-decdc0ade67e-sg-core-conf-yaml\") pod \"91080e73-6479-4c8b-bb2f-decdc0ade67e\" (UID: \"91080e73-6479-4c8b-bb2f-decdc0ade67e\") " Dec 16 09:11:57 crc kubenswrapper[4823]: I1216 09:11:57.680116 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91080e73-6479-4c8b-bb2f-decdc0ade67e-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "91080e73-6479-4c8b-bb2f-decdc0ade67e" (UID: "91080e73-6479-4c8b-bb2f-decdc0ade67e"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 09:11:57 crc kubenswrapper[4823]: I1216 09:11:57.680157 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91080e73-6479-4c8b-bb2f-decdc0ade67e-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "91080e73-6479-4c8b-bb2f-decdc0ade67e" (UID: "91080e73-6479-4c8b-bb2f-decdc0ade67e"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 09:11:57 crc kubenswrapper[4823]: I1216 09:11:57.680337 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/91080e73-6479-4c8b-bb2f-decdc0ade67e-ceilometer-tls-certs\") pod \"91080e73-6479-4c8b-bb2f-decdc0ade67e\" (UID: \"91080e73-6479-4c8b-bb2f-decdc0ade67e\") " Dec 16 09:11:57 crc kubenswrapper[4823]: I1216 09:11:57.680758 4823 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/91080e73-6479-4c8b-bb2f-decdc0ade67e-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:57 crc kubenswrapper[4823]: I1216 09:11:57.680777 4823 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/91080e73-6479-4c8b-bb2f-decdc0ade67e-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:57 crc kubenswrapper[4823]: I1216 09:11:57.688262 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91080e73-6479-4c8b-bb2f-decdc0ade67e-scripts" (OuterVolumeSpecName: "scripts") pod "91080e73-6479-4c8b-bb2f-decdc0ade67e" (UID: "91080e73-6479-4c8b-bb2f-decdc0ade67e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:11:57 crc kubenswrapper[4823]: I1216 09:11:57.716611 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91080e73-6479-4c8b-bb2f-decdc0ade67e-kube-api-access-9q46t" (OuterVolumeSpecName: "kube-api-access-9q46t") pod "91080e73-6479-4c8b-bb2f-decdc0ade67e" (UID: "91080e73-6479-4c8b-bb2f-decdc0ade67e"). InnerVolumeSpecName "kube-api-access-9q46t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:11:57 crc kubenswrapper[4823]: I1216 09:11:57.741303 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91080e73-6479-4c8b-bb2f-decdc0ade67e-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "91080e73-6479-4c8b-bb2f-decdc0ade67e" (UID: "91080e73-6479-4c8b-bb2f-decdc0ade67e"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:11:57 crc kubenswrapper[4823]: I1216 09:11:57.758161 4823 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="3c01b8d9-eaf3-4415-a1b7-c53cbfaa707a" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.100:8775/\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 16 09:11:57 crc kubenswrapper[4823]: I1216 09:11:57.758164 4823 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="3c01b8d9-eaf3-4415-a1b7-c53cbfaa707a" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.100:8775/\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 16 09:11:57 crc kubenswrapper[4823]: I1216 09:11:57.766076 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91080e73-6479-4c8b-bb2f-decdc0ade67e-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "91080e73-6479-4c8b-bb2f-decdc0ade67e" (UID: "91080e73-6479-4c8b-bb2f-decdc0ade67e"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:11:57 crc kubenswrapper[4823]: I1216 09:11:57.782508 4823 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/91080e73-6479-4c8b-bb2f-decdc0ade67e-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:57 crc kubenswrapper[4823]: I1216 09:11:57.782534 4823 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/91080e73-6479-4c8b-bb2f-decdc0ade67e-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:57 crc kubenswrapper[4823]: I1216 09:11:57.782542 4823 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/91080e73-6479-4c8b-bb2f-decdc0ade67e-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:57 crc kubenswrapper[4823]: I1216 09:11:57.782551 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9q46t\" (UniqueName: \"kubernetes.io/projected/91080e73-6479-4c8b-bb2f-decdc0ade67e-kube-api-access-9q46t\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:57 crc kubenswrapper[4823]: I1216 09:11:57.782622 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b8f2d6b-a3bc-4503-84f7-5d9eb1db17ec" path="/var/lib/kubelet/pods/0b8f2d6b-a3bc-4503-84f7-5d9eb1db17ec/volumes" Dec 16 09:11:57 crc kubenswrapper[4823]: I1216 09:11:57.783482 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e511eaa-334a-4fe3-ab41-e66d4a53a931" path="/var/lib/kubelet/pods/0e511eaa-334a-4fe3-ab41-e66d4a53a931/volumes" Dec 16 09:11:57 crc kubenswrapper[4823]: I1216 09:11:57.784186 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1422dc66-68e5-403d-9e01-657d83772587" path="/var/lib/kubelet/pods/1422dc66-68e5-403d-9e01-657d83772587/volumes" Dec 16 09:11:57 crc kubenswrapper[4823]: I1216 09:11:57.785486 4823 kubelet_volumes.go:163] "Cleaned up orphaned 
pod volumes dir" podUID="25a0f697-45ab-48cd-b4e2-d5e8bcd3b725" path="/var/lib/kubelet/pods/25a0f697-45ab-48cd-b4e2-d5e8bcd3b725/volumes" Dec 16 09:11:57 crc kubenswrapper[4823]: I1216 09:11:57.786102 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="341f00a5-410a-4656-876e-a6b0cfe2a4df" path="/var/lib/kubelet/pods/341f00a5-410a-4656-876e-a6b0cfe2a4df/volumes" Dec 16 09:11:57 crc kubenswrapper[4823]: I1216 09:11:57.786723 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34b62d72-52bc-4a7d-806c-52784476a695" path="/var/lib/kubelet/pods/34b62d72-52bc-4a7d-806c-52784476a695/volumes" Dec 16 09:11:57 crc kubenswrapper[4823]: I1216 09:11:57.787901 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c01b8d9-eaf3-4415-a1b7-c53cbfaa707a" path="/var/lib/kubelet/pods/3c01b8d9-eaf3-4415-a1b7-c53cbfaa707a/volumes" Dec 16 09:11:57 crc kubenswrapper[4823]: I1216 09:11:57.788460 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ee97b1f-ce61-45ef-97e1-642cc13ef521" path="/var/lib/kubelet/pods/3ee97b1f-ce61-45ef-97e1-642cc13ef521/volumes" Dec 16 09:11:57 crc kubenswrapper[4823]: I1216 09:11:57.789478 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44c54ba6-36e8-4608-ab54-965ab4bdcef2" path="/var/lib/kubelet/pods/44c54ba6-36e8-4608-ab54-965ab4bdcef2/volumes" Dec 16 09:11:57 crc kubenswrapper[4823]: I1216 09:11:57.790077 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64445002-15b9-4ec6-8c95-7c2bd33e0ecd" path="/var/lib/kubelet/pods/64445002-15b9-4ec6-8c95-7c2bd33e0ecd/volumes" Dec 16 09:11:57 crc kubenswrapper[4823]: I1216 09:11:57.790800 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7835251f-9e66-445c-9581-0422195cdc2b" path="/var/lib/kubelet/pods/7835251f-9e66-445c-9581-0422195cdc2b/volumes" Dec 16 09:11:57 crc kubenswrapper[4823]: I1216 09:11:57.791724 4823 kubelet_volumes.go:163] "Cleaned up orphaned 
pod volumes dir" podUID="7f35ecc1-21e4-461e-91d3-3da96745fed6" path="/var/lib/kubelet/pods/7f35ecc1-21e4-461e-91d3-3da96745fed6/volumes" Dec 16 09:11:57 crc kubenswrapper[4823]: I1216 09:11:57.792222 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91f5097e-d643-4598-9d06-39f14f913291" path="/var/lib/kubelet/pods/91f5097e-d643-4598-9d06-39f14f913291/volumes" Dec 16 09:11:57 crc kubenswrapper[4823]: I1216 09:11:57.792854 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9fd92bc3-eaf0-4217-bcac-dd8f41db9edf" path="/var/lib/kubelet/pods/9fd92bc3-eaf0-4217-bcac-dd8f41db9edf/volumes" Dec 16 09:11:57 crc kubenswrapper[4823]: I1216 09:11:57.793759 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3bed8d9-6a2c-458b-a7d5-d6d28ac021e7" path="/var/lib/kubelet/pods/a3bed8d9-6a2c-458b-a7d5-d6d28ac021e7/volumes" Dec 16 09:11:57 crc kubenswrapper[4823]: I1216 09:11:57.794428 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf14ab2c-212b-406f-b102-2a4b8a7a29f5" path="/var/lib/kubelet/pods/bf14ab2c-212b-406f-b102-2a4b8a7a29f5/volumes" Dec 16 09:11:57 crc kubenswrapper[4823]: I1216 09:11:57.794972 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c90aab28-60fa-4cdc-a89a-bd041351015d" path="/var/lib/kubelet/pods/c90aab28-60fa-4cdc-a89a-bd041351015d/volumes" Dec 16 09:11:57 crc kubenswrapper[4823]: I1216 09:11:57.796364 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91080e73-6479-4c8b-bb2f-decdc0ade67e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "91080e73-6479-4c8b-bb2f-decdc0ade67e" (UID: "91080e73-6479-4c8b-bb2f-decdc0ade67e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:11:57 crc kubenswrapper[4823]: I1216 09:11:57.796827 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7" path="/var/lib/kubelet/pods/cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7/volumes" Dec 16 09:11:57 crc kubenswrapper[4823]: I1216 09:11:57.797379 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cffdbd32-0155-4dd0-897d-9e406fd5e2ee" path="/var/lib/kubelet/pods/cffdbd32-0155-4dd0-897d-9e406fd5e2ee/volumes" Dec 16 09:11:57 crc kubenswrapper[4823]: I1216 09:11:57.798482 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6361c12-5d54-4919-aafe-4ac9b88e8c20" path="/var/lib/kubelet/pods/d6361c12-5d54-4919-aafe-4ac9b88e8c20/volumes" Dec 16 09:11:57 crc kubenswrapper[4823]: I1216 09:11:57.798934 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5b08afc-bfe3-4938-ac42-3781d1290201" path="/var/lib/kubelet/pods/e5b08afc-bfe3-4938-ac42-3781d1290201/volumes" Dec 16 09:11:57 crc kubenswrapper[4823]: I1216 09:11:57.799443 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8b8d93d-24db-4382-9077-7404605c7cf1" path="/var/lib/kubelet/pods/f8b8d93d-24db-4382-9077-7404605c7cf1/volumes" Dec 16 09:11:57 crc kubenswrapper[4823]: I1216 09:11:57.833969 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91080e73-6479-4c8b-bb2f-decdc0ade67e-config-data" (OuterVolumeSpecName: "config-data") pod "91080e73-6479-4c8b-bb2f-decdc0ade67e" (UID: "91080e73-6479-4c8b-bb2f-decdc0ade67e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:11:57 crc kubenswrapper[4823]: I1216 09:11:57.884035 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91080e73-6479-4c8b-bb2f-decdc0ade67e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:57 crc kubenswrapper[4823]: I1216 09:11:57.884071 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91080e73-6479-4c8b-bb2f-decdc0ade67e-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 09:11:58 crc kubenswrapper[4823]: I1216 09:11:58.023924 4823 generic.go:334] "Generic (PLEG): container finished" podID="91080e73-6479-4c8b-bb2f-decdc0ade67e" containerID="79866c53b9104a855a31ec7d65de5986af187b4f14c67e1c06cf3e5d3a928a66" exitCode=0 Dec 16 09:11:58 crc kubenswrapper[4823]: I1216 09:11:58.023977 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 16 09:11:58 crc kubenswrapper[4823]: I1216 09:11:58.024008 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"91080e73-6479-4c8b-bb2f-decdc0ade67e","Type":"ContainerDied","Data":"79866c53b9104a855a31ec7d65de5986af187b4f14c67e1c06cf3e5d3a928a66"} Dec 16 09:11:58 crc kubenswrapper[4823]: I1216 09:11:58.024142 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"91080e73-6479-4c8b-bb2f-decdc0ade67e","Type":"ContainerDied","Data":"380e1a79f4dcc3d646b1cd42e7195b74422fd4276d78617d7f6a8702db85cfa1"} Dec 16 09:11:58 crc kubenswrapper[4823]: I1216 09:11:58.024179 4823 scope.go:117] "RemoveContainer" containerID="dda05eb58431d1ff8385982543bd81d7679e91490c736039436d4fcfce053345" Dec 16 09:11:58 crc kubenswrapper[4823]: I1216 09:11:58.049249 4823 scope.go:117] "RemoveContainer" containerID="d6a5d1a1c5e1f237851cc347fa709cbcd9a412a8a2d03acbfbedbaeae74c63c3" Dec 16 09:11:58 crc 
kubenswrapper[4823]: I1216 09:11:58.064321 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 16 09:11:58 crc kubenswrapper[4823]: I1216 09:11:58.070445 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 16 09:11:58 crc kubenswrapper[4823]: I1216 09:11:58.075428 4823 scope.go:117] "RemoveContainer" containerID="79866c53b9104a855a31ec7d65de5986af187b4f14c67e1c06cf3e5d3a928a66" Dec 16 09:11:58 crc kubenswrapper[4823]: I1216 09:11:58.096668 4823 scope.go:117] "RemoveContainer" containerID="18f3ebcfecc58f7fe77b00fc4dfa8cfa84e702063a21d50902de680197bc2807" Dec 16 09:11:58 crc kubenswrapper[4823]: I1216 09:11:58.116980 4823 scope.go:117] "RemoveContainer" containerID="dda05eb58431d1ff8385982543bd81d7679e91490c736039436d4fcfce053345" Dec 16 09:11:58 crc kubenswrapper[4823]: E1216 09:11:58.117737 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dda05eb58431d1ff8385982543bd81d7679e91490c736039436d4fcfce053345\": container with ID starting with dda05eb58431d1ff8385982543bd81d7679e91490c736039436d4fcfce053345 not found: ID does not exist" containerID="dda05eb58431d1ff8385982543bd81d7679e91490c736039436d4fcfce053345" Dec 16 09:11:58 crc kubenswrapper[4823]: I1216 09:11:58.117774 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dda05eb58431d1ff8385982543bd81d7679e91490c736039436d4fcfce053345"} err="failed to get container status \"dda05eb58431d1ff8385982543bd81d7679e91490c736039436d4fcfce053345\": rpc error: code = NotFound desc = could not find container \"dda05eb58431d1ff8385982543bd81d7679e91490c736039436d4fcfce053345\": container with ID starting with dda05eb58431d1ff8385982543bd81d7679e91490c736039436d4fcfce053345 not found: ID does not exist" Dec 16 09:11:58 crc kubenswrapper[4823]: I1216 09:11:58.117899 4823 scope.go:117] "RemoveContainer" 
containerID="d6a5d1a1c5e1f237851cc347fa709cbcd9a412a8a2d03acbfbedbaeae74c63c3" Dec 16 09:11:58 crc kubenswrapper[4823]: E1216 09:11:58.118358 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6a5d1a1c5e1f237851cc347fa709cbcd9a412a8a2d03acbfbedbaeae74c63c3\": container with ID starting with d6a5d1a1c5e1f237851cc347fa709cbcd9a412a8a2d03acbfbedbaeae74c63c3 not found: ID does not exist" containerID="d6a5d1a1c5e1f237851cc347fa709cbcd9a412a8a2d03acbfbedbaeae74c63c3" Dec 16 09:11:58 crc kubenswrapper[4823]: I1216 09:11:58.118385 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6a5d1a1c5e1f237851cc347fa709cbcd9a412a8a2d03acbfbedbaeae74c63c3"} err="failed to get container status \"d6a5d1a1c5e1f237851cc347fa709cbcd9a412a8a2d03acbfbedbaeae74c63c3\": rpc error: code = NotFound desc = could not find container \"d6a5d1a1c5e1f237851cc347fa709cbcd9a412a8a2d03acbfbedbaeae74c63c3\": container with ID starting with d6a5d1a1c5e1f237851cc347fa709cbcd9a412a8a2d03acbfbedbaeae74c63c3 not found: ID does not exist" Dec 16 09:11:58 crc kubenswrapper[4823]: I1216 09:11:58.118400 4823 scope.go:117] "RemoveContainer" containerID="79866c53b9104a855a31ec7d65de5986af187b4f14c67e1c06cf3e5d3a928a66" Dec 16 09:11:58 crc kubenswrapper[4823]: E1216 09:11:58.118661 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79866c53b9104a855a31ec7d65de5986af187b4f14c67e1c06cf3e5d3a928a66\": container with ID starting with 79866c53b9104a855a31ec7d65de5986af187b4f14c67e1c06cf3e5d3a928a66 not found: ID does not exist" containerID="79866c53b9104a855a31ec7d65de5986af187b4f14c67e1c06cf3e5d3a928a66" Dec 16 09:11:58 crc kubenswrapper[4823]: I1216 09:11:58.118683 4823 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"79866c53b9104a855a31ec7d65de5986af187b4f14c67e1c06cf3e5d3a928a66"} err="failed to get container status \"79866c53b9104a855a31ec7d65de5986af187b4f14c67e1c06cf3e5d3a928a66\": rpc error: code = NotFound desc = could not find container \"79866c53b9104a855a31ec7d65de5986af187b4f14c67e1c06cf3e5d3a928a66\": container with ID starting with 79866c53b9104a855a31ec7d65de5986af187b4f14c67e1c06cf3e5d3a928a66 not found: ID does not exist" Dec 16 09:11:58 crc kubenswrapper[4823]: I1216 09:11:58.118695 4823 scope.go:117] "RemoveContainer" containerID="18f3ebcfecc58f7fe77b00fc4dfa8cfa84e702063a21d50902de680197bc2807" Dec 16 09:11:58 crc kubenswrapper[4823]: E1216 09:11:58.119075 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18f3ebcfecc58f7fe77b00fc4dfa8cfa84e702063a21d50902de680197bc2807\": container with ID starting with 18f3ebcfecc58f7fe77b00fc4dfa8cfa84e702063a21d50902de680197bc2807 not found: ID does not exist" containerID="18f3ebcfecc58f7fe77b00fc4dfa8cfa84e702063a21d50902de680197bc2807" Dec 16 09:11:58 crc kubenswrapper[4823]: I1216 09:11:58.119096 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18f3ebcfecc58f7fe77b00fc4dfa8cfa84e702063a21d50902de680197bc2807"} err="failed to get container status \"18f3ebcfecc58f7fe77b00fc4dfa8cfa84e702063a21d50902de680197bc2807\": rpc error: code = NotFound desc = could not find container \"18f3ebcfecc58f7fe77b00fc4dfa8cfa84e702063a21d50902de680197bc2807\": container with ID starting with 18f3ebcfecc58f7fe77b00fc4dfa8cfa84e702063a21d50902de680197bc2807 not found: ID does not exist" Dec 16 09:11:58 crc kubenswrapper[4823]: I1216 09:11:58.134187 4823 patch_prober.go:28] interesting pod/machine-config-daemon-fv56f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 09:11:58 crc kubenswrapper[4823]: I1216 09:11:58.134254 4823 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 09:11:58 crc kubenswrapper[4823]: E1216 09:11:58.800925 4823 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 16 09:11:58 crc kubenswrapper[4823]: E1216 09:11:58.801281 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6ecabaef-9422-4e5c-bf83-df3b523b8fa7-operator-scripts podName:6ecabaef-9422-4e5c-bf83-df3b523b8fa7 nodeName:}" failed. No retries permitted until 2025-12-16 09:12:06.801265524 +0000 UTC m=+8205.289831647 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/6ecabaef-9422-4e5c-bf83-df3b523b8fa7-operator-scripts") pod "aodhf38e-account-delete-hxrkv" (UID: "6ecabaef-9422-4e5c-bf83-df3b523b8fa7") : configmap "openstack-scripts" not found Dec 16 09:11:58 crc kubenswrapper[4823]: I1216 09:11:58.989594 4823 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-d656d958d-tmzmp" podUID="0b8f2d6b-a3bc-4503-84f7-5d9eb1db17ec" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.1.40:9311/healthcheck\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 16 09:11:58 crc kubenswrapper[4823]: I1216 09:11:58.989638 4823 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-d656d958d-tmzmp" podUID="0b8f2d6b-a3bc-4503-84f7-5d9eb1db17ec" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.1.40:9311/healthcheck\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 16 09:11:59 crc kubenswrapper[4823]: I1216 09:11:59.005281 4823 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/memcached-0" podUID="c90aab28-60fa-4cdc-a89a-bd041351015d" containerName="memcached" probeResult="failure" output="dial tcp 10.217.0.249:11211: i/o timeout" Dec 16 09:11:59 crc kubenswrapper[4823]: I1216 09:11:59.781747 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91080e73-6479-4c8b-bb2f-decdc0ade67e" path="/var/lib/kubelet/pods/91080e73-6479-4c8b-bb2f-decdc0ade67e/volumes" Dec 16 09:12:00 crc kubenswrapper[4823]: E1216 09:12:00.534489 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="6325c5677c7baf607bd7e00984420b3b254a2b5f3dd777e252669862730b0092" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Dec 16 09:12:00 crc kubenswrapper[4823]: E1216 09:12:00.535801 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6325c5677c7baf607bd7e00984420b3b254a2b5f3dd777e252669862730b0092" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Dec 16 09:12:00 crc kubenswrapper[4823]: E1216 09:12:00.537207 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6325c5677c7baf607bd7e00984420b3b254a2b5f3dd777e252669862730b0092" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Dec 16 09:12:00 crc kubenswrapper[4823]: E1216 09:12:00.537243 4823 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/heat-engine-64f85d9856-wwkd5" podUID="7a613891-fc01-4f69-97a8-63cccc00f4a5" containerName="heat-engine" Dec 16 09:12:00 crc kubenswrapper[4823]: I1216 09:12:00.684988 4823 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-5948ddcb4-f5qgv" podUID="6d650b48-8848-4495-9b48-fdf7472cc19e" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.111:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.111:8443: connect: connection refused" Dec 16 09:12:02 crc kubenswrapper[4823]: I1216 09:12:02.079636 4823 generic.go:334] "Generic (PLEG): container finished" podID="83abe53b-780a-4255-b2a8-22f3480c9358" containerID="bc5f650dbf19a065a416224d2c46c8451ed1939c757afbeb47b34a826f25043b" exitCode=0 Dec 16 09:12:02 crc kubenswrapper[4823]: I1216 09:12:02.080288 4823 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6ffc876c99-shbwd" event={"ID":"83abe53b-780a-4255-b2a8-22f3480c9358","Type":"ContainerDied","Data":"bc5f650dbf19a065a416224d2c46c8451ed1939c757afbeb47b34a826f25043b"} Dec 16 09:12:02 crc kubenswrapper[4823]: I1216 09:12:02.177813 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6ffc876c99-shbwd" Dec 16 09:12:02 crc kubenswrapper[4823]: I1216 09:12:02.255763 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83abe53b-780a-4255-b2a8-22f3480c9358-combined-ca-bundle\") pod \"83abe53b-780a-4255-b2a8-22f3480c9358\" (UID: \"83abe53b-780a-4255-b2a8-22f3480c9358\") " Dec 16 09:12:02 crc kubenswrapper[4823]: I1216 09:12:02.255826 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/83abe53b-780a-4255-b2a8-22f3480c9358-httpd-config\") pod \"83abe53b-780a-4255-b2a8-22f3480c9358\" (UID: \"83abe53b-780a-4255-b2a8-22f3480c9358\") " Dec 16 09:12:02 crc kubenswrapper[4823]: I1216 09:12:02.255892 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/83abe53b-780a-4255-b2a8-22f3480c9358-internal-tls-certs\") pod \"83abe53b-780a-4255-b2a8-22f3480c9358\" (UID: \"83abe53b-780a-4255-b2a8-22f3480c9358\") " Dec 16 09:12:02 crc kubenswrapper[4823]: I1216 09:12:02.255978 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/83abe53b-780a-4255-b2a8-22f3480c9358-config\") pod \"83abe53b-780a-4255-b2a8-22f3480c9358\" (UID: \"83abe53b-780a-4255-b2a8-22f3480c9358\") " Dec 16 09:12:02 crc kubenswrapper[4823]: I1216 09:12:02.256053 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xz52k\" 
(UniqueName: \"kubernetes.io/projected/83abe53b-780a-4255-b2a8-22f3480c9358-kube-api-access-xz52k\") pod \"83abe53b-780a-4255-b2a8-22f3480c9358\" (UID: \"83abe53b-780a-4255-b2a8-22f3480c9358\") " Dec 16 09:12:02 crc kubenswrapper[4823]: I1216 09:12:02.256109 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/83abe53b-780a-4255-b2a8-22f3480c9358-public-tls-certs\") pod \"83abe53b-780a-4255-b2a8-22f3480c9358\" (UID: \"83abe53b-780a-4255-b2a8-22f3480c9358\") " Dec 16 09:12:02 crc kubenswrapper[4823]: I1216 09:12:02.256214 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/83abe53b-780a-4255-b2a8-22f3480c9358-ovndb-tls-certs\") pod \"83abe53b-780a-4255-b2a8-22f3480c9358\" (UID: \"83abe53b-780a-4255-b2a8-22f3480c9358\") " Dec 16 09:12:02 crc kubenswrapper[4823]: I1216 09:12:02.262765 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83abe53b-780a-4255-b2a8-22f3480c9358-kube-api-access-xz52k" (OuterVolumeSpecName: "kube-api-access-xz52k") pod "83abe53b-780a-4255-b2a8-22f3480c9358" (UID: "83abe53b-780a-4255-b2a8-22f3480c9358"). InnerVolumeSpecName "kube-api-access-xz52k". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:12:02 crc kubenswrapper[4823]: I1216 09:12:02.262829 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83abe53b-780a-4255-b2a8-22f3480c9358-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "83abe53b-780a-4255-b2a8-22f3480c9358" (UID: "83abe53b-780a-4255-b2a8-22f3480c9358"). InnerVolumeSpecName "httpd-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:12:02 crc kubenswrapper[4823]: I1216 09:12:02.301106 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83abe53b-780a-4255-b2a8-22f3480c9358-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "83abe53b-780a-4255-b2a8-22f3480c9358" (UID: "83abe53b-780a-4255-b2a8-22f3480c9358"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:12:02 crc kubenswrapper[4823]: I1216 09:12:02.302149 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83abe53b-780a-4255-b2a8-22f3480c9358-config" (OuterVolumeSpecName: "config") pod "83abe53b-780a-4255-b2a8-22f3480c9358" (UID: "83abe53b-780a-4255-b2a8-22f3480c9358"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:12:02 crc kubenswrapper[4823]: I1216 09:12:02.304853 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83abe53b-780a-4255-b2a8-22f3480c9358-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "83abe53b-780a-4255-b2a8-22f3480c9358" (UID: "83abe53b-780a-4255-b2a8-22f3480c9358"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:12:02 crc kubenswrapper[4823]: I1216 09:12:02.318404 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83abe53b-780a-4255-b2a8-22f3480c9358-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "83abe53b-780a-4255-b2a8-22f3480c9358" (UID: "83abe53b-780a-4255-b2a8-22f3480c9358"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:12:02 crc kubenswrapper[4823]: I1216 09:12:02.322218 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83abe53b-780a-4255-b2a8-22f3480c9358-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "83abe53b-780a-4255-b2a8-22f3480c9358" (UID: "83abe53b-780a-4255-b2a8-22f3480c9358"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:12:02 crc kubenswrapper[4823]: I1216 09:12:02.357646 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83abe53b-780a-4255-b2a8-22f3480c9358-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 09:12:02 crc kubenswrapper[4823]: I1216 09:12:02.358016 4823 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/83abe53b-780a-4255-b2a8-22f3480c9358-httpd-config\") on node \"crc\" DevicePath \"\"" Dec 16 09:12:02 crc kubenswrapper[4823]: I1216 09:12:02.358040 4823 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/83abe53b-780a-4255-b2a8-22f3480c9358-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 09:12:02 crc kubenswrapper[4823]: I1216 09:12:02.358049 4823 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/83abe53b-780a-4255-b2a8-22f3480c9358-config\") on node \"crc\" DevicePath \"\"" Dec 16 09:12:02 crc kubenswrapper[4823]: I1216 09:12:02.358059 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xz52k\" (UniqueName: \"kubernetes.io/projected/83abe53b-780a-4255-b2a8-22f3480c9358-kube-api-access-xz52k\") on node \"crc\" DevicePath \"\"" Dec 16 09:12:02 crc kubenswrapper[4823]: I1216 09:12:02.358071 4823 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/83abe53b-780a-4255-b2a8-22f3480c9358-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 09:12:02 crc kubenswrapper[4823]: I1216 09:12:02.358079 4823 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/83abe53b-780a-4255-b2a8-22f3480c9358-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 09:12:02 crc kubenswrapper[4823]: I1216 09:12:02.621139 4823 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-api-ddfd865c7-nhsh6" podUID="f8b8d93d-24db-4382-9077-7404605c7cf1" containerName="heat-api" probeResult="failure" output="Get \"https://10.217.1.121:8004/healthcheck\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 16 09:12:02 crc kubenswrapper[4823]: I1216 09:12:02.650211 4823 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-cfnapi-8589448fc-qj569" podUID="7f35ecc1-21e4-461e-91d3-3da96745fed6" containerName="heat-cfnapi" probeResult="failure" output="Get \"https://10.217.1.122:8000/healthcheck\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 16 09:12:03 crc kubenswrapper[4823]: I1216 09:12:03.091907 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6ffc876c99-shbwd" event={"ID":"83abe53b-780a-4255-b2a8-22f3480c9358","Type":"ContainerDied","Data":"fa22742fe41b55d6771e69ed637d578d45d778b96a01e75db53d42f9c43ea22d"} Dec 16 09:12:03 crc kubenswrapper[4823]: I1216 09:12:03.091963 4823 scope.go:117] "RemoveContainer" containerID="be736c54eb1998c9f331d4dd1c7970f56f4f491d0243de972bd6f9e630a78177" Dec 16 09:12:03 crc kubenswrapper[4823]: I1216 09:12:03.092128 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6ffc876c99-shbwd" Dec 16 09:12:03 crc kubenswrapper[4823]: I1216 09:12:03.116460 4823 scope.go:117] "RemoveContainer" containerID="bc5f650dbf19a065a416224d2c46c8451ed1939c757afbeb47b34a826f25043b" Dec 16 09:12:03 crc kubenswrapper[4823]: I1216 09:12:03.133448 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6ffc876c99-shbwd"] Dec 16 09:12:03 crc kubenswrapper[4823]: I1216 09:12:03.139765 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-6ffc876c99-shbwd"] Dec 16 09:12:03 crc kubenswrapper[4823]: I1216 09:12:03.782811 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83abe53b-780a-4255-b2a8-22f3480c9358" path="/var/lib/kubelet/pods/83abe53b-780a-4255-b2a8-22f3480c9358/volumes" Dec 16 09:12:06 crc kubenswrapper[4823]: E1216 09:12:06.835236 4823 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 16 09:12:06 crc kubenswrapper[4823]: E1216 09:12:06.835491 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6ecabaef-9422-4e5c-bf83-df3b523b8fa7-operator-scripts podName:6ecabaef-9422-4e5c-bf83-df3b523b8fa7 nodeName:}" failed. No retries permitted until 2025-12-16 09:12:22.835474524 +0000 UTC m=+8221.324040647 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/6ecabaef-9422-4e5c-bf83-df3b523b8fa7-operator-scripts") pod "aodhf38e-account-delete-hxrkv" (UID: "6ecabaef-9422-4e5c-bf83-df3b523b8fa7") : configmap "openstack-scripts" not found Dec 16 09:12:10 crc kubenswrapper[4823]: E1216 09:12:10.533290 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6325c5677c7baf607bd7e00984420b3b254a2b5f3dd777e252669862730b0092" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Dec 16 09:12:10 crc kubenswrapper[4823]: E1216 09:12:10.535584 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6325c5677c7baf607bd7e00984420b3b254a2b5f3dd777e252669862730b0092" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Dec 16 09:12:10 crc kubenswrapper[4823]: E1216 09:12:10.538097 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6325c5677c7baf607bd7e00984420b3b254a2b5f3dd777e252669862730b0092" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Dec 16 09:12:10 crc kubenswrapper[4823]: E1216 09:12:10.538236 4823 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/heat-engine-64f85d9856-wwkd5" podUID="7a613891-fc01-4f69-97a8-63cccc00f4a5" containerName="heat-engine" Dec 16 09:12:10 crc kubenswrapper[4823]: I1216 09:12:10.685669 4823 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-5948ddcb4-f5qgv" 
podUID="6d650b48-8848-4495-9b48-fdf7472cc19e" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.111:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.111:8443: connect: connection refused" Dec 16 09:12:10 crc kubenswrapper[4823]: I1216 09:12:10.685790 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5948ddcb4-f5qgv" Dec 16 09:12:15 crc kubenswrapper[4823]: I1216 09:12:15.627150 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5948ddcb4-f5qgv" Dec 16 09:12:15 crc kubenswrapper[4823]: I1216 09:12:15.678254 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6d650b48-8848-4495-9b48-fdf7472cc19e-horizon-secret-key\") pod \"6d650b48-8848-4495-9b48-fdf7472cc19e\" (UID: \"6d650b48-8848-4495-9b48-fdf7472cc19e\") " Dec 16 09:12:15 crc kubenswrapper[4823]: I1216 09:12:15.678330 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d650b48-8848-4495-9b48-fdf7472cc19e-combined-ca-bundle\") pod \"6d650b48-8848-4495-9b48-fdf7472cc19e\" (UID: \"6d650b48-8848-4495-9b48-fdf7472cc19e\") " Dec 16 09:12:15 crc kubenswrapper[4823]: I1216 09:12:15.678376 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6d650b48-8848-4495-9b48-fdf7472cc19e-config-data\") pod \"6d650b48-8848-4495-9b48-fdf7472cc19e\" (UID: \"6d650b48-8848-4495-9b48-fdf7472cc19e\") " Dec 16 09:12:15 crc kubenswrapper[4823]: I1216 09:12:15.678421 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6d650b48-8848-4495-9b48-fdf7472cc19e-logs\") pod \"6d650b48-8848-4495-9b48-fdf7472cc19e\" (UID: \"6d650b48-8848-4495-9b48-fdf7472cc19e\") " Dec 16 
09:12:15 crc kubenswrapper[4823]: I1216 09:12:15.678552 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d650b48-8848-4495-9b48-fdf7472cc19e-horizon-tls-certs\") pod \"6d650b48-8848-4495-9b48-fdf7472cc19e\" (UID: \"6d650b48-8848-4495-9b48-fdf7472cc19e\") " Dec 16 09:12:15 crc kubenswrapper[4823]: I1216 09:12:15.678637 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zlrc2\" (UniqueName: \"kubernetes.io/projected/6d650b48-8848-4495-9b48-fdf7472cc19e-kube-api-access-zlrc2\") pod \"6d650b48-8848-4495-9b48-fdf7472cc19e\" (UID: \"6d650b48-8848-4495-9b48-fdf7472cc19e\") " Dec 16 09:12:15 crc kubenswrapper[4823]: I1216 09:12:15.678705 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6d650b48-8848-4495-9b48-fdf7472cc19e-scripts\") pod \"6d650b48-8848-4495-9b48-fdf7472cc19e\" (UID: \"6d650b48-8848-4495-9b48-fdf7472cc19e\") " Dec 16 09:12:15 crc kubenswrapper[4823]: I1216 09:12:15.681585 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d650b48-8848-4495-9b48-fdf7472cc19e-logs" (OuterVolumeSpecName: "logs") pod "6d650b48-8848-4495-9b48-fdf7472cc19e" (UID: "6d650b48-8848-4495-9b48-fdf7472cc19e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 09:12:15 crc kubenswrapper[4823]: I1216 09:12:15.685409 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d650b48-8848-4495-9b48-fdf7472cc19e-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "6d650b48-8848-4495-9b48-fdf7472cc19e" (UID: "6d650b48-8848-4495-9b48-fdf7472cc19e"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:12:15 crc kubenswrapper[4823]: I1216 09:12:15.690984 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d650b48-8848-4495-9b48-fdf7472cc19e-kube-api-access-zlrc2" (OuterVolumeSpecName: "kube-api-access-zlrc2") pod "6d650b48-8848-4495-9b48-fdf7472cc19e" (UID: "6d650b48-8848-4495-9b48-fdf7472cc19e"). InnerVolumeSpecName "kube-api-access-zlrc2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:12:15 crc kubenswrapper[4823]: I1216 09:12:15.703817 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d650b48-8848-4495-9b48-fdf7472cc19e-scripts" (OuterVolumeSpecName: "scripts") pod "6d650b48-8848-4495-9b48-fdf7472cc19e" (UID: "6d650b48-8848-4495-9b48-fdf7472cc19e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 09:12:15 crc kubenswrapper[4823]: I1216 09:12:15.712600 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d650b48-8848-4495-9b48-fdf7472cc19e-config-data" (OuterVolumeSpecName: "config-data") pod "6d650b48-8848-4495-9b48-fdf7472cc19e" (UID: "6d650b48-8848-4495-9b48-fdf7472cc19e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 09:12:15 crc kubenswrapper[4823]: I1216 09:12:15.715878 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d650b48-8848-4495-9b48-fdf7472cc19e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6d650b48-8848-4495-9b48-fdf7472cc19e" (UID: "6d650b48-8848-4495-9b48-fdf7472cc19e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:12:15 crc kubenswrapper[4823]: I1216 09:12:15.736581 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d650b48-8848-4495-9b48-fdf7472cc19e-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "6d650b48-8848-4495-9b48-fdf7472cc19e" (UID: "6d650b48-8848-4495-9b48-fdf7472cc19e"). InnerVolumeSpecName "horizon-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:12:15 crc kubenswrapper[4823]: I1216 09:12:15.781697 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zlrc2\" (UniqueName: \"kubernetes.io/projected/6d650b48-8848-4495-9b48-fdf7472cc19e-kube-api-access-zlrc2\") on node \"crc\" DevicePath \"\"" Dec 16 09:12:15 crc kubenswrapper[4823]: I1216 09:12:15.781726 4823 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6d650b48-8848-4495-9b48-fdf7472cc19e-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 09:12:15 crc kubenswrapper[4823]: I1216 09:12:15.781735 4823 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6d650b48-8848-4495-9b48-fdf7472cc19e-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 16 09:12:15 crc kubenswrapper[4823]: I1216 09:12:15.781743 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d650b48-8848-4495-9b48-fdf7472cc19e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 09:12:15 crc kubenswrapper[4823]: I1216 09:12:15.781751 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6d650b48-8848-4495-9b48-fdf7472cc19e-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 09:12:15 crc kubenswrapper[4823]: I1216 09:12:15.781759 4823 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/6d650b48-8848-4495-9b48-fdf7472cc19e-logs\") on node \"crc\" DevicePath \"\"" Dec 16 09:12:15 crc kubenswrapper[4823]: I1216 09:12:15.781769 4823 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d650b48-8848-4495-9b48-fdf7472cc19e-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 09:12:16 crc kubenswrapper[4823]: I1216 09:12:16.236339 4823 generic.go:334] "Generic (PLEG): container finished" podID="6d650b48-8848-4495-9b48-fdf7472cc19e" containerID="946406cfb56bff7c3a03092d515e629d97b3a1863e6c9e9d93aa6db0d70bbf52" exitCode=137 Dec 16 09:12:16 crc kubenswrapper[4823]: I1216 09:12:16.236503 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5948ddcb4-f5qgv" event={"ID":"6d650b48-8848-4495-9b48-fdf7472cc19e","Type":"ContainerDied","Data":"946406cfb56bff7c3a03092d515e629d97b3a1863e6c9e9d93aa6db0d70bbf52"} Dec 16 09:12:16 crc kubenswrapper[4823]: I1216 09:12:16.236569 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5948ddcb4-f5qgv" event={"ID":"6d650b48-8848-4495-9b48-fdf7472cc19e","Type":"ContainerDied","Data":"036926b2eef0c1d25968fb54eb6d3a5e3bcd7cd8a742b903f4e70af78fcfec2f"} Dec 16 09:12:16 crc kubenswrapper[4823]: I1216 09:12:16.236595 4823 scope.go:117] "RemoveContainer" containerID="609b29305d1b8337e75912b2d68079a181e3de1bec30ac81db18f27ffacc478c" Dec 16 09:12:16 crc kubenswrapper[4823]: I1216 09:12:16.236787 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5948ddcb4-f5qgv" Dec 16 09:12:16 crc kubenswrapper[4823]: I1216 09:12:16.263226 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5948ddcb4-f5qgv"] Dec 16 09:12:16 crc kubenswrapper[4823]: I1216 09:12:16.269481 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-5948ddcb4-f5qgv"] Dec 16 09:12:16 crc kubenswrapper[4823]: I1216 09:12:16.446722 4823 scope.go:117] "RemoveContainer" containerID="946406cfb56bff7c3a03092d515e629d97b3a1863e6c9e9d93aa6db0d70bbf52" Dec 16 09:12:16 crc kubenswrapper[4823]: I1216 09:12:16.463108 4823 scope.go:117] "RemoveContainer" containerID="609b29305d1b8337e75912b2d68079a181e3de1bec30ac81db18f27ffacc478c" Dec 16 09:12:16 crc kubenswrapper[4823]: E1216 09:12:16.463659 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"609b29305d1b8337e75912b2d68079a181e3de1bec30ac81db18f27ffacc478c\": container with ID starting with 609b29305d1b8337e75912b2d68079a181e3de1bec30ac81db18f27ffacc478c not found: ID does not exist" containerID="609b29305d1b8337e75912b2d68079a181e3de1bec30ac81db18f27ffacc478c" Dec 16 09:12:16 crc kubenswrapper[4823]: I1216 09:12:16.463738 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"609b29305d1b8337e75912b2d68079a181e3de1bec30ac81db18f27ffacc478c"} err="failed to get container status \"609b29305d1b8337e75912b2d68079a181e3de1bec30ac81db18f27ffacc478c\": rpc error: code = NotFound desc = could not find container \"609b29305d1b8337e75912b2d68079a181e3de1bec30ac81db18f27ffacc478c\": container with ID starting with 609b29305d1b8337e75912b2d68079a181e3de1bec30ac81db18f27ffacc478c not found: ID does not exist" Dec 16 09:12:16 crc kubenswrapper[4823]: I1216 09:12:16.463773 4823 scope.go:117] "RemoveContainer" containerID="946406cfb56bff7c3a03092d515e629d97b3a1863e6c9e9d93aa6db0d70bbf52" Dec 16 09:12:16 crc 
kubenswrapper[4823]: E1216 09:12:16.464332 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"946406cfb56bff7c3a03092d515e629d97b3a1863e6c9e9d93aa6db0d70bbf52\": container with ID starting with 946406cfb56bff7c3a03092d515e629d97b3a1863e6c9e9d93aa6db0d70bbf52 not found: ID does not exist" containerID="946406cfb56bff7c3a03092d515e629d97b3a1863e6c9e9d93aa6db0d70bbf52" Dec 16 09:12:16 crc kubenswrapper[4823]: I1216 09:12:16.464422 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"946406cfb56bff7c3a03092d515e629d97b3a1863e6c9e9d93aa6db0d70bbf52"} err="failed to get container status \"946406cfb56bff7c3a03092d515e629d97b3a1863e6c9e9d93aa6db0d70bbf52\": rpc error: code = NotFound desc = could not find container \"946406cfb56bff7c3a03092d515e629d97b3a1863e6c9e9d93aa6db0d70bbf52\": container with ID starting with 946406cfb56bff7c3a03092d515e629d97b3a1863e6c9e9d93aa6db0d70bbf52 not found: ID does not exist" Dec 16 09:12:17 crc kubenswrapper[4823]: I1216 09:12:17.780772 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d650b48-8848-4495-9b48-fdf7472cc19e" path="/var/lib/kubelet/pods/6d650b48-8848-4495-9b48-fdf7472cc19e/volumes" Dec 16 09:12:20 crc kubenswrapper[4823]: E1216 09:12:20.534542 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6325c5677c7baf607bd7e00984420b3b254a2b5f3dd777e252669862730b0092" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Dec 16 09:12:20 crc kubenswrapper[4823]: E1216 09:12:20.536749 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="6325c5677c7baf607bd7e00984420b3b254a2b5f3dd777e252669862730b0092" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Dec 16 09:12:20 crc kubenswrapper[4823]: E1216 09:12:20.539404 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6325c5677c7baf607bd7e00984420b3b254a2b5f3dd777e252669862730b0092" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Dec 16 09:12:20 crc kubenswrapper[4823]: E1216 09:12:20.539461 4823 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/heat-engine-64f85d9856-wwkd5" podUID="7a613891-fc01-4f69-97a8-63cccc00f4a5" containerName="heat-engine" Dec 16 09:12:22 crc kubenswrapper[4823]: E1216 09:12:22.838046 4823 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 16 09:12:22 crc kubenswrapper[4823]: E1216 09:12:22.838505 4823 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6ecabaef-9422-4e5c-bf83-df3b523b8fa7-operator-scripts podName:6ecabaef-9422-4e5c-bf83-df3b523b8fa7 nodeName:}" failed. No retries permitted until 2025-12-16 09:12:54.838475457 +0000 UTC m=+8253.327041620 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/6ecabaef-9422-4e5c-bf83-df3b523b8fa7-operator-scripts") pod "aodhf38e-account-delete-hxrkv" (UID: "6ecabaef-9422-4e5c-bf83-df3b523b8fa7") : configmap "openstack-scripts" not found Dec 16 09:12:23 crc kubenswrapper[4823]: I1216 09:12:23.299856 4823 generic.go:334] "Generic (PLEG): container finished" podID="6ecabaef-9422-4e5c-bf83-df3b523b8fa7" containerID="31913fa281143c1606422328777000ca5e5453f2293c31a874dc34f60925d2e3" exitCode=137 Dec 16 09:12:23 crc kubenswrapper[4823]: I1216 09:12:23.299944 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodhf38e-account-delete-hxrkv" event={"ID":"6ecabaef-9422-4e5c-bf83-df3b523b8fa7","Type":"ContainerDied","Data":"31913fa281143c1606422328777000ca5e5453f2293c31a874dc34f60925d2e3"} Dec 16 09:12:23 crc kubenswrapper[4823]: I1216 09:12:23.300265 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodhf38e-account-delete-hxrkv" event={"ID":"6ecabaef-9422-4e5c-bf83-df3b523b8fa7","Type":"ContainerDied","Data":"bc402249c7f1d2470a20c55e31d2f62c03e4b834ca222fda192967a13581d2f8"} Dec 16 09:12:23 crc kubenswrapper[4823]: I1216 09:12:23.300290 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bc402249c7f1d2470a20c55e31d2f62c03e4b834ca222fda192967a13581d2f8" Dec 16 09:12:23 crc kubenswrapper[4823]: I1216 09:12:23.315716 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodhf38e-account-delete-hxrkv" Dec 16 09:12:23 crc kubenswrapper[4823]: I1216 09:12:23.350242 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6ecabaef-9422-4e5c-bf83-df3b523b8fa7-operator-scripts\") pod \"6ecabaef-9422-4e5c-bf83-df3b523b8fa7\" (UID: \"6ecabaef-9422-4e5c-bf83-df3b523b8fa7\") " Dec 16 09:12:23 crc kubenswrapper[4823]: I1216 09:12:23.350739 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d568s\" (UniqueName: \"kubernetes.io/projected/6ecabaef-9422-4e5c-bf83-df3b523b8fa7-kube-api-access-d568s\") pod \"6ecabaef-9422-4e5c-bf83-df3b523b8fa7\" (UID: \"6ecabaef-9422-4e5c-bf83-df3b523b8fa7\") " Dec 16 09:12:23 crc kubenswrapper[4823]: I1216 09:12:23.350827 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ecabaef-9422-4e5c-bf83-df3b523b8fa7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6ecabaef-9422-4e5c-bf83-df3b523b8fa7" (UID: "6ecabaef-9422-4e5c-bf83-df3b523b8fa7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 09:12:23 crc kubenswrapper[4823]: I1216 09:12:23.351467 4823 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6ecabaef-9422-4e5c-bf83-df3b523b8fa7-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 09:12:23 crc kubenswrapper[4823]: I1216 09:12:23.356515 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ecabaef-9422-4e5c-bf83-df3b523b8fa7-kube-api-access-d568s" (OuterVolumeSpecName: "kube-api-access-d568s") pod "6ecabaef-9422-4e5c-bf83-df3b523b8fa7" (UID: "6ecabaef-9422-4e5c-bf83-df3b523b8fa7"). InnerVolumeSpecName "kube-api-access-d568s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:12:23 crc kubenswrapper[4823]: I1216 09:12:23.452547 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d568s\" (UniqueName: \"kubernetes.io/projected/6ecabaef-9422-4e5c-bf83-df3b523b8fa7-kube-api-access-d568s\") on node \"crc\" DevicePath \"\"" Dec 16 09:12:24 crc kubenswrapper[4823]: I1216 09:12:24.308608 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodhf38e-account-delete-hxrkv" Dec 16 09:12:24 crc kubenswrapper[4823]: I1216 09:12:24.326834 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodhf38e-account-delete-hxrkv"] Dec 16 09:12:24 crc kubenswrapper[4823]: I1216 09:12:24.333279 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodhf38e-account-delete-hxrkv"] Dec 16 09:12:25 crc kubenswrapper[4823]: I1216 09:12:25.784634 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ecabaef-9422-4e5c-bf83-df3b523b8fa7" path="/var/lib/kubelet/pods/6ecabaef-9422-4e5c-bf83-df3b523b8fa7/volumes" Dec 16 09:12:28 crc kubenswrapper[4823]: I1216 09:12:28.135977 4823 patch_prober.go:28] interesting pod/machine-config-daemon-fv56f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 09:12:28 crc kubenswrapper[4823]: I1216 09:12:28.136412 4823 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 09:12:28 crc kubenswrapper[4823]: I1216 09:12:28.325436 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-copy-data" Dec 16 09:12:28 crc kubenswrapper[4823]: I1216 09:12:28.366175 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mariadb-data\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-13347050-82ad-45f6-9369-708eb586bc9c\") pod \"a5a5b801-4e68-48cb-a50c-c1f7cc5bf2e7\" (UID: \"a5a5b801-4e68-48cb-a50c-c1f7cc5bf2e7\") " Dec 16 09:12:28 crc kubenswrapper[4823]: I1216 09:12:28.366287 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vc8f2\" (UniqueName: \"kubernetes.io/projected/a5a5b801-4e68-48cb-a50c-c1f7cc5bf2e7-kube-api-access-vc8f2\") pod \"a5a5b801-4e68-48cb-a50c-c1f7cc5bf2e7\" (UID: \"a5a5b801-4e68-48cb-a50c-c1f7cc5bf2e7\") " Dec 16 09:12:28 crc kubenswrapper[4823]: I1216 09:12:28.384348 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5a5b801-4e68-48cb-a50c-c1f7cc5bf2e7-kube-api-access-vc8f2" (OuterVolumeSpecName: "kube-api-access-vc8f2") pod "a5a5b801-4e68-48cb-a50c-c1f7cc5bf2e7" (UID: "a5a5b801-4e68-48cb-a50c-c1f7cc5bf2e7"). InnerVolumeSpecName "kube-api-access-vc8f2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:12:28 crc kubenswrapper[4823]: I1216 09:12:28.407289 4823 generic.go:334] "Generic (PLEG): container finished" podID="a5a5b801-4e68-48cb-a50c-c1f7cc5bf2e7" containerID="809020e57886a330ac0588ed75cd6801305a87d81cbb57b685c04b626655fd93" exitCode=137 Dec 16 09:12:28 crc kubenswrapper[4823]: I1216 09:12:28.407405 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"a5a5b801-4e68-48cb-a50c-c1f7cc5bf2e7","Type":"ContainerDied","Data":"809020e57886a330ac0588ed75cd6801305a87d81cbb57b685c04b626655fd93"} Dec 16 09:12:28 crc kubenswrapper[4823]: I1216 09:12:28.407438 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"a5a5b801-4e68-48cb-a50c-c1f7cc5bf2e7","Type":"ContainerDied","Data":"380427654f4fd9ca64240b353ce082dd27e8be7a481d76582b175c586f4c6e77"} Dec 16 09:12:28 crc kubenswrapper[4823]: I1216 09:12:28.407435 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-copy-data" Dec 16 09:12:28 crc kubenswrapper[4823]: I1216 09:12:28.407457 4823 scope.go:117] "RemoveContainer" containerID="809020e57886a330ac0588ed75cd6801305a87d81cbb57b685c04b626655fd93" Dec 16 09:12:28 crc kubenswrapper[4823]: I1216 09:12:28.431191 4823 generic.go:334] "Generic (PLEG): container finished" podID="b416f746-16ad-4f74-b315-f67ca3d0bb35" containerID="01385a59cb3d569d4bca785b2dde8a6fb841f70317cca67d92c68009862c7ab6" exitCode=137 Dec 16 09:12:28 crc kubenswrapper[4823]: I1216 09:12:28.431236 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"b416f746-16ad-4f74-b315-f67ca3d0bb35","Type":"ContainerDied","Data":"01385a59cb3d569d4bca785b2dde8a6fb841f70317cca67d92c68009862c7ab6"} Dec 16 09:12:28 crc kubenswrapper[4823]: I1216 09:12:28.431263 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"b416f746-16ad-4f74-b315-f67ca3d0bb35","Type":"ContainerDied","Data":"c61d96840ac48c4b80c9fe7848ce298b20b61c12c4aaf1281dc79fd3f2c264f0"} Dec 16 09:12:28 crc kubenswrapper[4823]: I1216 09:12:28.431277 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c61d96840ac48c4b80c9fe7848ce298b20b61c12c4aaf1281dc79fd3f2c264f0" Dec 16 09:12:28 crc kubenswrapper[4823]: I1216 09:12:28.433419 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-copy-data" Dec 16 09:12:28 crc kubenswrapper[4823]: I1216 09:12:28.448934 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-13347050-82ad-45f6-9369-708eb586bc9c" (OuterVolumeSpecName: "mariadb-data") pod "a5a5b801-4e68-48cb-a50c-c1f7cc5bf2e7" (UID: "a5a5b801-4e68-48cb-a50c-c1f7cc5bf2e7"). InnerVolumeSpecName "pvc-13347050-82ad-45f6-9369-708eb586bc9c". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 16 09:12:28 crc kubenswrapper[4823]: I1216 09:12:28.461343 4823 scope.go:117] "RemoveContainer" containerID="809020e57886a330ac0588ed75cd6801305a87d81cbb57b685c04b626655fd93" Dec 16 09:12:28 crc kubenswrapper[4823]: E1216 09:12:28.461866 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"809020e57886a330ac0588ed75cd6801305a87d81cbb57b685c04b626655fd93\": container with ID starting with 809020e57886a330ac0588ed75cd6801305a87d81cbb57b685c04b626655fd93 not found: ID does not exist" containerID="809020e57886a330ac0588ed75cd6801305a87d81cbb57b685c04b626655fd93" Dec 16 09:12:28 crc kubenswrapper[4823]: I1216 09:12:28.461907 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"809020e57886a330ac0588ed75cd6801305a87d81cbb57b685c04b626655fd93"} err="failed to get container status \"809020e57886a330ac0588ed75cd6801305a87d81cbb57b685c04b626655fd93\": rpc error: code = NotFound desc = could not find container \"809020e57886a330ac0588ed75cd6801305a87d81cbb57b685c04b626655fd93\": container with ID starting with 809020e57886a330ac0588ed75cd6801305a87d81cbb57b685c04b626655fd93 not found: ID does not exist" Dec 16 09:12:28 crc kubenswrapper[4823]: I1216 09:12:28.468172 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/b416f746-16ad-4f74-b315-f67ca3d0bb35-ovn-data-cert\") pod \"b416f746-16ad-4f74-b315-f67ca3d0bb35\" (UID: \"b416f746-16ad-4f74-b315-f67ca3d0bb35\") " Dec 16 09:12:28 crc kubenswrapper[4823]: I1216 09:12:28.468232 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fvbpg\" (UniqueName: \"kubernetes.io/projected/b416f746-16ad-4f74-b315-f67ca3d0bb35-kube-api-access-fvbpg\") pod \"b416f746-16ad-4f74-b315-f67ca3d0bb35\" (UID: 
\"b416f746-16ad-4f74-b315-f67ca3d0bb35\") " Dec 16 09:12:28 crc kubenswrapper[4823]: I1216 09:12:28.468651 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-data\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-60d75f57-91ba-4826-b913-2e68eb6e0abb\") pod \"b416f746-16ad-4f74-b315-f67ca3d0bb35\" (UID: \"b416f746-16ad-4f74-b315-f67ca3d0bb35\") " Dec 16 09:12:28 crc kubenswrapper[4823]: I1216 09:12:28.469051 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vc8f2\" (UniqueName: \"kubernetes.io/projected/a5a5b801-4e68-48cb-a50c-c1f7cc5bf2e7-kube-api-access-vc8f2\") on node \"crc\" DevicePath \"\"" Dec 16 09:12:28 crc kubenswrapper[4823]: I1216 09:12:28.469091 4823 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-13347050-82ad-45f6-9369-708eb586bc9c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-13347050-82ad-45f6-9369-708eb586bc9c\") on node \"crc\" " Dec 16 09:12:28 crc kubenswrapper[4823]: I1216 09:12:28.471681 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b416f746-16ad-4f74-b315-f67ca3d0bb35-kube-api-access-fvbpg" (OuterVolumeSpecName: "kube-api-access-fvbpg") pod "b416f746-16ad-4f74-b315-f67ca3d0bb35" (UID: "b416f746-16ad-4f74-b315-f67ca3d0bb35"). InnerVolumeSpecName "kube-api-access-fvbpg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:12:28 crc kubenswrapper[4823]: I1216 09:12:28.475340 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b416f746-16ad-4f74-b315-f67ca3d0bb35-ovn-data-cert" (OuterVolumeSpecName: "ovn-data-cert") pod "b416f746-16ad-4f74-b315-f67ca3d0bb35" (UID: "b416f746-16ad-4f74-b315-f67ca3d0bb35"). InnerVolumeSpecName "ovn-data-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:12:28 crc kubenswrapper[4823]: I1216 09:12:28.481492 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-60d75f57-91ba-4826-b913-2e68eb6e0abb" (OuterVolumeSpecName: "ovn-data") pod "b416f746-16ad-4f74-b315-f67ca3d0bb35" (UID: "b416f746-16ad-4f74-b315-f67ca3d0bb35"). InnerVolumeSpecName "pvc-60d75f57-91ba-4826-b913-2e68eb6e0abb". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 16 09:12:28 crc kubenswrapper[4823]: I1216 09:12:28.502936 4823 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Dec 16 09:12:28 crc kubenswrapper[4823]: I1216 09:12:28.503132 4823 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-13347050-82ad-45f6-9369-708eb586bc9c" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-13347050-82ad-45f6-9369-708eb586bc9c") on node "crc" Dec 16 09:12:28 crc kubenswrapper[4823]: I1216 09:12:28.570652 4823 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-60d75f57-91ba-4826-b913-2e68eb6e0abb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-60d75f57-91ba-4826-b913-2e68eb6e0abb\") on node \"crc\" " Dec 16 09:12:28 crc kubenswrapper[4823]: I1216 09:12:28.570722 4823 reconciler_common.go:293] "Volume detached for volume \"pvc-13347050-82ad-45f6-9369-708eb586bc9c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-13347050-82ad-45f6-9369-708eb586bc9c\") on node \"crc\" DevicePath \"\"" Dec 16 09:12:28 crc kubenswrapper[4823]: I1216 09:12:28.570771 4823 reconciler_common.go:293] "Volume detached for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/b416f746-16ad-4f74-b315-f67ca3d0bb35-ovn-data-cert\") on node \"crc\" DevicePath \"\"" Dec 16 09:12:28 crc kubenswrapper[4823]: I1216 09:12:28.570799 4823 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fvbpg\" (UniqueName: \"kubernetes.io/projected/b416f746-16ad-4f74-b315-f67ca3d0bb35-kube-api-access-fvbpg\") on node \"crc\" DevicePath \"\"" Dec 16 09:12:28 crc kubenswrapper[4823]: I1216 09:12:28.590669 4823 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Dec 16 09:12:28 crc kubenswrapper[4823]: I1216 09:12:28.590846 4823 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-60d75f57-91ba-4826-b913-2e68eb6e0abb" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-60d75f57-91ba-4826-b913-2e68eb6e0abb") on node "crc" Dec 16 09:12:28 crc kubenswrapper[4823]: I1216 09:12:28.672661 4823 reconciler_common.go:293] "Volume detached for volume \"pvc-60d75f57-91ba-4826-b913-2e68eb6e0abb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-60d75f57-91ba-4826-b913-2e68eb6e0abb\") on node \"crc\" DevicePath \"\"" Dec 16 09:12:28 crc kubenswrapper[4823]: I1216 09:12:28.746868 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-copy-data"] Dec 16 09:12:28 crc kubenswrapper[4823]: I1216 09:12:28.754793 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-copy-data"] Dec 16 09:12:29 crc kubenswrapper[4823]: I1216 09:12:29.443325 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-copy-data" Dec 16 09:12:29 crc kubenswrapper[4823]: I1216 09:12:29.482217 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-copy-data"] Dec 16 09:12:29 crc kubenswrapper[4823]: I1216 09:12:29.487962 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-copy-data"] Dec 16 09:12:29 crc kubenswrapper[4823]: I1216 09:12:29.783909 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5a5b801-4e68-48cb-a50c-c1f7cc5bf2e7" path="/var/lib/kubelet/pods/a5a5b801-4e68-48cb-a50c-c1f7cc5bf2e7/volumes" Dec 16 09:12:29 crc kubenswrapper[4823]: I1216 09:12:29.784577 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b416f746-16ad-4f74-b315-f67ca3d0bb35" path="/var/lib/kubelet/pods/b416f746-16ad-4f74-b315-f67ca3d0bb35/volumes" Dec 16 09:12:30 crc kubenswrapper[4823]: E1216 09:12:30.534431 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6325c5677c7baf607bd7e00984420b3b254a2b5f3dd777e252669862730b0092" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Dec 16 09:12:30 crc kubenswrapper[4823]: E1216 09:12:30.536262 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6325c5677c7baf607bd7e00984420b3b254a2b5f3dd777e252669862730b0092" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Dec 16 09:12:30 crc kubenswrapper[4823]: E1216 09:12:30.538549 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6325c5677c7baf607bd7e00984420b3b254a2b5f3dd777e252669862730b0092" 
cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Dec 16 09:12:30 crc kubenswrapper[4823]: E1216 09:12:30.538624 4823 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/heat-engine-64f85d9856-wwkd5" podUID="7a613891-fc01-4f69-97a8-63cccc00f4a5" containerName="heat-engine" Dec 16 09:12:40 crc kubenswrapper[4823]: E1216 09:12:40.533094 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6325c5677c7baf607bd7e00984420b3b254a2b5f3dd777e252669862730b0092" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Dec 16 09:12:40 crc kubenswrapper[4823]: E1216 09:12:40.534774 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6325c5677c7baf607bd7e00984420b3b254a2b5f3dd777e252669862730b0092" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Dec 16 09:12:40 crc kubenswrapper[4823]: E1216 09:12:40.536088 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6325c5677c7baf607bd7e00984420b3b254a2b5f3dd777e252669862730b0092" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Dec 16 09:12:40 crc kubenswrapper[4823]: E1216 09:12:40.536127 4823 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/heat-engine-64f85d9856-wwkd5" podUID="7a613891-fc01-4f69-97a8-63cccc00f4a5" containerName="heat-engine" Dec 16 09:12:47 crc 
kubenswrapper[4823]: I1216 09:12:47.486203 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-64f85d9856-wwkd5" Dec 16 09:12:47 crc kubenswrapper[4823]: I1216 09:12:47.615838 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a613891-fc01-4f69-97a8-63cccc00f4a5-config-data\") pod \"7a613891-fc01-4f69-97a8-63cccc00f4a5\" (UID: \"7a613891-fc01-4f69-97a8-63cccc00f4a5\") " Dec 16 09:12:47 crc kubenswrapper[4823]: I1216 09:12:47.615914 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6zspx\" (UniqueName: \"kubernetes.io/projected/7a613891-fc01-4f69-97a8-63cccc00f4a5-kube-api-access-6zspx\") pod \"7a613891-fc01-4f69-97a8-63cccc00f4a5\" (UID: \"7a613891-fc01-4f69-97a8-63cccc00f4a5\") " Dec 16 09:12:47 crc kubenswrapper[4823]: I1216 09:12:47.615987 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a613891-fc01-4f69-97a8-63cccc00f4a5-combined-ca-bundle\") pod \"7a613891-fc01-4f69-97a8-63cccc00f4a5\" (UID: \"7a613891-fc01-4f69-97a8-63cccc00f4a5\") " Dec 16 09:12:47 crc kubenswrapper[4823]: I1216 09:12:47.616018 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7a613891-fc01-4f69-97a8-63cccc00f4a5-config-data-custom\") pod \"7a613891-fc01-4f69-97a8-63cccc00f4a5\" (UID: \"7a613891-fc01-4f69-97a8-63cccc00f4a5\") " Dec 16 09:12:47 crc kubenswrapper[4823]: I1216 09:12:47.621960 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a613891-fc01-4f69-97a8-63cccc00f4a5-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "7a613891-fc01-4f69-97a8-63cccc00f4a5" (UID: "7a613891-fc01-4f69-97a8-63cccc00f4a5"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:12:47 crc kubenswrapper[4823]: I1216 09:12:47.622238 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a613891-fc01-4f69-97a8-63cccc00f4a5-kube-api-access-6zspx" (OuterVolumeSpecName: "kube-api-access-6zspx") pod "7a613891-fc01-4f69-97a8-63cccc00f4a5" (UID: "7a613891-fc01-4f69-97a8-63cccc00f4a5"). InnerVolumeSpecName "kube-api-access-6zspx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:12:47 crc kubenswrapper[4823]: I1216 09:12:47.626904 4823 generic.go:334] "Generic (PLEG): container finished" podID="7a613891-fc01-4f69-97a8-63cccc00f4a5" containerID="6325c5677c7baf607bd7e00984420b3b254a2b5f3dd777e252669862730b0092" exitCode=137 Dec 16 09:12:47 crc kubenswrapper[4823]: I1216 09:12:47.626967 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-64f85d9856-wwkd5" event={"ID":"7a613891-fc01-4f69-97a8-63cccc00f4a5","Type":"ContainerDied","Data":"6325c5677c7baf607bd7e00984420b3b254a2b5f3dd777e252669862730b0092"} Dec 16 09:12:47 crc kubenswrapper[4823]: I1216 09:12:47.627009 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-64f85d9856-wwkd5" event={"ID":"7a613891-fc01-4f69-97a8-63cccc00f4a5","Type":"ContainerDied","Data":"22af3c3c8c408d46a4187edafef634d4aaa3ad82741ae61f288fdc5e4fedee2b"} Dec 16 09:12:47 crc kubenswrapper[4823]: I1216 09:12:47.626969 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-64f85d9856-wwkd5" Dec 16 09:12:47 crc kubenswrapper[4823]: I1216 09:12:47.627061 4823 scope.go:117] "RemoveContainer" containerID="6325c5677c7baf607bd7e00984420b3b254a2b5f3dd777e252669862730b0092" Dec 16 09:12:47 crc kubenswrapper[4823]: I1216 09:12:47.642644 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a613891-fc01-4f69-97a8-63cccc00f4a5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7a613891-fc01-4f69-97a8-63cccc00f4a5" (UID: "7a613891-fc01-4f69-97a8-63cccc00f4a5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:12:47 crc kubenswrapper[4823]: I1216 09:12:47.675808 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a613891-fc01-4f69-97a8-63cccc00f4a5-config-data" (OuterVolumeSpecName: "config-data") pod "7a613891-fc01-4f69-97a8-63cccc00f4a5" (UID: "7a613891-fc01-4f69-97a8-63cccc00f4a5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:12:47 crc kubenswrapper[4823]: I1216 09:12:47.718861 4823 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a613891-fc01-4f69-97a8-63cccc00f4a5-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 09:12:47 crc kubenswrapper[4823]: I1216 09:12:47.718920 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6zspx\" (UniqueName: \"kubernetes.io/projected/7a613891-fc01-4f69-97a8-63cccc00f4a5-kube-api-access-6zspx\") on node \"crc\" DevicePath \"\"" Dec 16 09:12:47 crc kubenswrapper[4823]: I1216 09:12:47.718941 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a613891-fc01-4f69-97a8-63cccc00f4a5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 09:12:47 crc kubenswrapper[4823]: I1216 09:12:47.718960 4823 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7a613891-fc01-4f69-97a8-63cccc00f4a5-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 16 09:12:47 crc kubenswrapper[4823]: I1216 09:12:47.732845 4823 scope.go:117] "RemoveContainer" containerID="6325c5677c7baf607bd7e00984420b3b254a2b5f3dd777e252669862730b0092" Dec 16 09:12:47 crc kubenswrapper[4823]: E1216 09:12:47.733281 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6325c5677c7baf607bd7e00984420b3b254a2b5f3dd777e252669862730b0092\": container with ID starting with 6325c5677c7baf607bd7e00984420b3b254a2b5f3dd777e252669862730b0092 not found: ID does not exist" containerID="6325c5677c7baf607bd7e00984420b3b254a2b5f3dd777e252669862730b0092" Dec 16 09:12:47 crc kubenswrapper[4823]: I1216 09:12:47.733323 4823 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"6325c5677c7baf607bd7e00984420b3b254a2b5f3dd777e252669862730b0092"} err="failed to get container status \"6325c5677c7baf607bd7e00984420b3b254a2b5f3dd777e252669862730b0092\": rpc error: code = NotFound desc = could not find container \"6325c5677c7baf607bd7e00984420b3b254a2b5f3dd777e252669862730b0092\": container with ID starting with 6325c5677c7baf607bd7e00984420b3b254a2b5f3dd777e252669862730b0092 not found: ID does not exist" Dec 16 09:12:47 crc kubenswrapper[4823]: I1216 09:12:47.954949 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-64f85d9856-wwkd5"] Dec 16 09:12:47 crc kubenswrapper[4823]: I1216 09:12:47.959648 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-engine-64f85d9856-wwkd5"] Dec 16 09:12:49 crc kubenswrapper[4823]: I1216 09:12:49.783290 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a613891-fc01-4f69-97a8-63cccc00f4a5" path="/var/lib/kubelet/pods/7a613891-fc01-4f69-97a8-63cccc00f4a5/volumes" Dec 16 09:12:57 crc kubenswrapper[4823]: I1216 09:12:57.787613 4823 scope.go:117] "RemoveContainer" containerID="cba74e4e324808c756477e9c3bf47e48dc1558ff0c47c4d8b8cd61d64d6ad973" Dec 16 09:12:57 crc kubenswrapper[4823]: I1216 09:12:57.815363 4823 scope.go:117] "RemoveContainer" containerID="d221cc2bc27f5d1770e4a4ef7820239cba5fdb3b7ce7ba7a7f1241ea613caf68" Dec 16 09:12:57 crc kubenswrapper[4823]: I1216 09:12:57.834648 4823 scope.go:117] "RemoveContainer" containerID="03aaea60579a32dbd22e959a4c109e38b799c758c8b0d9ef37082c0af8297906" Dec 16 09:12:57 crc kubenswrapper[4823]: I1216 09:12:57.892325 4823 scope.go:117] "RemoveContainer" containerID="9a306cbeecf35df7308d1553cc064c30c8abbe4a6a369ff751b3831a552d0f27" Dec 16 09:12:57 crc kubenswrapper[4823]: I1216 09:12:57.915647 4823 scope.go:117] "RemoveContainer" containerID="01385a59cb3d569d4bca785b2dde8a6fb841f70317cca67d92c68009862c7ab6" Dec 16 09:12:57 crc kubenswrapper[4823]: I1216 09:12:57.933810 
4823 scope.go:117] "RemoveContainer" containerID="ae7cf328f2dddbc80841007ae8ef6edc83650ff4a0d553d7b2dca17acae597a1" Dec 16 09:12:57 crc kubenswrapper[4823]: I1216 09:12:57.951821 4823 scope.go:117] "RemoveContainer" containerID="43f2f25511680e01631c9fea0525d1784511e8f0fbc8bdc295206a3b91483591" Dec 16 09:12:58 crc kubenswrapper[4823]: I1216 09:12:58.134379 4823 patch_prober.go:28] interesting pod/machine-config-daemon-fv56f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 09:12:58 crc kubenswrapper[4823]: I1216 09:12:58.134452 4823 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 09:12:58 crc kubenswrapper[4823]: I1216 09:12:58.134515 4823 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" Dec 16 09:12:58 crc kubenswrapper[4823]: I1216 09:12:58.135399 4823 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f15e2940e276ab8f86caed82382b79b3b000ee860b1e320554c03e2678c0b4b5"} pod="openshift-machine-config-operator/machine-config-daemon-fv56f" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 16 09:12:58 crc kubenswrapper[4823]: I1216 09:12:58.135560 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" containerName="machine-config-daemon" 
containerID="cri-o://f15e2940e276ab8f86caed82382b79b3b000ee860b1e320554c03e2678c0b4b5" gracePeriod=600 Dec 16 09:12:58 crc kubenswrapper[4823]: I1216 09:12:58.735380 4823 generic.go:334] "Generic (PLEG): container finished" podID="25dec47c-3043-486c-b371-2be103c214e3" containerID="f15e2940e276ab8f86caed82382b79b3b000ee860b1e320554c03e2678c0b4b5" exitCode=0 Dec 16 09:12:58 crc kubenswrapper[4823]: I1216 09:12:58.735438 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" event={"ID":"25dec47c-3043-486c-b371-2be103c214e3","Type":"ContainerDied","Data":"f15e2940e276ab8f86caed82382b79b3b000ee860b1e320554c03e2678c0b4b5"} Dec 16 09:12:58 crc kubenswrapper[4823]: I1216 09:12:58.735769 4823 scope.go:117] "RemoveContainer" containerID="14e51af7fb5c2d7b7fdc9e1989841225a65614d883db6f8d5aea8aeb819bd04d" Dec 16 09:12:59 crc kubenswrapper[4823]: I1216 09:12:59.752494 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" event={"ID":"25dec47c-3043-486c-b371-2be103c214e3","Type":"ContainerStarted","Data":"2950c2235803f3753830afebab6d921b7124da1465f4af84f0944895c27c0722"} Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.473537 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-vlg4h/must-gather-fwxrv"] Dec 16 09:13:53 crc kubenswrapper[4823]: E1216 09:13:53.474518 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60956cfa-c484-445d-af87-52713ccf4d09" containerName="glance-httpd" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.474534 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="60956cfa-c484-445d-af87-52713ccf4d09" containerName="glance-httpd" Dec 16 09:13:53 crc kubenswrapper[4823]: E1216 09:13:53.474558 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a613891-fc01-4f69-97a8-63cccc00f4a5" containerName="heat-engine" Dec 16 09:13:53 crc 
kubenswrapper[4823]: I1216 09:13:53.474566 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a613891-fc01-4f69-97a8-63cccc00f4a5" containerName="heat-engine" Dec 16 09:13:53 crc kubenswrapper[4823]: E1216 09:13:53.474582 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60956cfa-c484-445d-af87-52713ccf4d09" containerName="glance-log" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.474589 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="60956cfa-c484-445d-af87-52713ccf4d09" containerName="glance-log" Dec 16 09:13:53 crc kubenswrapper[4823]: E1216 09:13:53.474601 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f058bf18-c31d-4b48-a183-bb9ae9223fbe" containerName="thanos-sidecar" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.474608 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="f058bf18-c31d-4b48-a183-bb9ae9223fbe" containerName="thanos-sidecar" Dec 16 09:13:53 crc kubenswrapper[4823]: E1216 09:13:53.474618 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf14ab2c-212b-406f-b102-2a4b8a7a29f5" containerName="setup-container" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.474627 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf14ab2c-212b-406f-b102-2a4b8a7a29f5" containerName="setup-container" Dec 16 09:13:53 crc kubenswrapper[4823]: E1216 09:13:53.474638 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05dfc2e3-71af-4150-a4ca-02b5629083ae" containerName="openstack-network-exporter" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.474646 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="05dfc2e3-71af-4150-a4ca-02b5629083ae" containerName="openstack-network-exporter" Dec 16 09:13:53 crc kubenswrapper[4823]: E1216 09:13:53.474660 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ebc4a0e-1b85-400b-bc10-5d216d7431fb" containerName="config-reloader" Dec 16 09:13:53 crc 
kubenswrapper[4823]: I1216 09:13:53.474667 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ebc4a0e-1b85-400b-bc10-5d216d7431fb" containerName="config-reloader" Dec 16 09:13:53 crc kubenswrapper[4823]: E1216 09:13:53.474680 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d676c2b-8cf1-4933-8f2b-641733d096fc" containerName="proxy-httpd" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.474687 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d676c2b-8cf1-4933-8f2b-641733d096fc" containerName="proxy-httpd" Dec 16 09:13:53 crc kubenswrapper[4823]: E1216 09:13:53.474701 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d650b48-8848-4495-9b48-fdf7472cc19e" containerName="horizon" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.474708 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d650b48-8848-4495-9b48-fdf7472cc19e" containerName="horizon" Dec 16 09:13:53 crc kubenswrapper[4823]: E1216 09:13:53.474719 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d676c2b-8cf1-4933-8f2b-641733d096fc" containerName="proxy-server" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.474726 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d676c2b-8cf1-4933-8f2b-641733d096fc" containerName="proxy-server" Dec 16 09:13:53 crc kubenswrapper[4823]: E1216 09:13:53.474742 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf14ab2c-212b-406f-b102-2a4b8a7a29f5" containerName="rabbitmq" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.474749 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf14ab2c-212b-406f-b102-2a4b8a7a29f5" containerName="rabbitmq" Dec 16 09:13:53 crc kubenswrapper[4823]: E1216 09:13:53.474762 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b8f2d6b-a3bc-4503-84f7-5d9eb1db17ec" containerName="barbican-api-log" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.474770 4823 
state_mem.go:107] "Deleted CPUSet assignment" podUID="0b8f2d6b-a3bc-4503-84f7-5d9eb1db17ec" containerName="barbican-api-log" Dec 16 09:13:53 crc kubenswrapper[4823]: E1216 09:13:53.474784 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b8f2d6b-a3bc-4503-84f7-5d9eb1db17ec" containerName="barbican-api" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.474791 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b8f2d6b-a3bc-4503-84f7-5d9eb1db17ec" containerName="barbican-api" Dec 16 09:13:53 crc kubenswrapper[4823]: E1216 09:13:53.474802 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f058bf18-c31d-4b48-a183-bb9ae9223fbe" containerName="prometheus" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.474810 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="f058bf18-c31d-4b48-a183-bb9ae9223fbe" containerName="prometheus" Dec 16 09:13:53 crc kubenswrapper[4823]: E1216 09:13:53.474822 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5a5b801-4e68-48cb-a50c-c1f7cc5bf2e7" containerName="adoption" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.474829 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5a5b801-4e68-48cb-a50c-c1f7cc5bf2e7" containerName="adoption" Dec 16 09:13:53 crc kubenswrapper[4823]: E1216 09:13:53.474842 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cffdbd32-0155-4dd0-897d-9e406fd5e2ee" containerName="probe" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.474849 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="cffdbd32-0155-4dd0-897d-9e406fd5e2ee" containerName="probe" Dec 16 09:13:53 crc kubenswrapper[4823]: E1216 09:13:53.474860 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6361c12-5d54-4919-aafe-4ac9b88e8c20" containerName="mariadb-account-delete" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.474867 4823 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="d6361c12-5d54-4919-aafe-4ac9b88e8c20" containerName="mariadb-account-delete" Dec 16 09:13:53 crc kubenswrapper[4823]: E1216 09:13:53.474876 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="341f00a5-410a-4656-876e-a6b0cfe2a4df" containerName="barbican-worker" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.474884 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="341f00a5-410a-4656-876e-a6b0cfe2a4df" containerName="barbican-worker" Dec 16 09:13:53 crc kubenswrapper[4823]: E1216 09:13:53.474918 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99ce1c86-eccc-4f3c-b999-18774e823763" containerName="init" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.474925 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="99ce1c86-eccc-4f3c-b999-18774e823763" containerName="init" Dec 16 09:13:53 crc kubenswrapper[4823]: E1216 09:13:53.474934 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4496b25e-2f39-453a-aa60-ffa74e9913c8" containerName="mysql-bootstrap" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.474942 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="4496b25e-2f39-453a-aa60-ffa74e9913c8" containerName="mysql-bootstrap" Dec 16 09:13:53 crc kubenswrapper[4823]: E1216 09:13:53.474950 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9880fe3-977f-473b-84c9-2cb6f65d588d" containerName="mariadb-account-delete" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.474958 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9880fe3-977f-473b-84c9-2cb6f65d588d" containerName="mariadb-account-delete" Dec 16 09:13:53 crc kubenswrapper[4823]: E1216 09:13:53.474969 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c90aab28-60fa-4cdc-a89a-bd041351015d" containerName="memcached" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.474977 4823 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="c90aab28-60fa-4cdc-a89a-bd041351015d" containerName="memcached" Dec 16 09:13:53 crc kubenswrapper[4823]: E1216 09:13:53.474990 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44c54ba6-36e8-4608-ab54-965ab4bdcef2" containerName="placement-api" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.474997 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="44c54ba6-36e8-4608-ab54-965ab4bdcef2" containerName="placement-api" Dec 16 09:13:53 crc kubenswrapper[4823]: E1216 09:13:53.475006 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c01b8d9-eaf3-4415-a1b7-c53cbfaa707a" containerName="nova-metadata-metadata" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.475012 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c01b8d9-eaf3-4415-a1b7-c53cbfaa707a" containerName="nova-metadata-metadata" Dec 16 09:13:53 crc kubenswrapper[4823]: E1216 09:13:53.475022 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4da7ae09-cc1d-4f42-b1be-7045236d12e9" containerName="mariadb-account-delete" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.475046 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="4da7ae09-cc1d-4f42-b1be-7045236d12e9" containerName="mariadb-account-delete" Dec 16 09:13:53 crc kubenswrapper[4823]: E1216 09:13:53.475057 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8" containerName="ovsdbserver-sb" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.475066 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8" containerName="ovsdbserver-sb" Dec 16 09:13:53 crc kubenswrapper[4823]: E1216 09:13:53.475075 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44c54ba6-36e8-4608-ab54-965ab4bdcef2" containerName="placement-log" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.475083 4823 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="44c54ba6-36e8-4608-ab54-965ab4bdcef2" containerName="placement-log" Dec 16 09:13:53 crc kubenswrapper[4823]: E1216 09:13:53.475092 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1422dc66-68e5-403d-9e01-657d83772587" containerName="mariadb-account-delete" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.475100 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="1422dc66-68e5-403d-9e01-657d83772587" containerName="mariadb-account-delete" Dec 16 09:13:53 crc kubenswrapper[4823]: E1216 09:13:53.475109 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91080e73-6479-4c8b-bb2f-decdc0ade67e" containerName="ceilometer-central-agent" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.475116 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="91080e73-6479-4c8b-bb2f-decdc0ade67e" containerName="ceilometer-central-agent" Dec 16 09:13:53 crc kubenswrapper[4823]: E1216 09:13:53.475131 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fd92bc3-eaf0-4217-bcac-dd8f41db9edf" containerName="nova-cell1-conductor-conductor" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.475138 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fd92bc3-eaf0-4217-bcac-dd8f41db9edf" containerName="nova-cell1-conductor-conductor" Dec 16 09:13:53 crc kubenswrapper[4823]: E1216 09:13:53.475151 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83abe53b-780a-4255-b2a8-22f3480c9358" containerName="neutron-api" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.475158 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="83abe53b-780a-4255-b2a8-22f3480c9358" containerName="neutron-api" Dec 16 09:13:53 crc kubenswrapper[4823]: E1216 09:13:53.475168 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a50033a-9a6e-42e3-ac23-de2a24654b0f" containerName="barbican-keystone-listener" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.475175 
4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a50033a-9a6e-42e3-ac23-de2a24654b0f" containerName="barbican-keystone-listener" Dec 16 09:13:53 crc kubenswrapper[4823]: E1216 09:13:53.475182 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cffdbd32-0155-4dd0-897d-9e406fd5e2ee" containerName="cinder-scheduler" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.475188 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="cffdbd32-0155-4dd0-897d-9e406fd5e2ee" containerName="cinder-scheduler" Dec 16 09:13:53 crc kubenswrapper[4823]: E1216 09:13:53.475200 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a40068b-87bc-4af6-862d-ad33696041b3" containerName="nova-api-api" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.475206 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a40068b-87bc-4af6-862d-ad33696041b3" containerName="nova-api-api" Dec 16 09:13:53 crc kubenswrapper[4823]: E1216 09:13:53.475216 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b75df4d-61d8-4913-bea9-018339e8e2a8" containerName="mariadb-account-delete" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.475222 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b75df4d-61d8-4913-bea9-018339e8e2a8" containerName="mariadb-account-delete" Dec 16 09:13:53 crc kubenswrapper[4823]: E1216 09:13:53.475231 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5b08afc-bfe3-4938-ac42-3781d1290201" containerName="keystone-api" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.475238 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5b08afc-bfe3-4938-ac42-3781d1290201" containerName="keystone-api" Dec 16 09:13:53 crc kubenswrapper[4823]: E1216 09:13:53.475249 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d06b91f8-1fcd-40fe-b712-0549d99258c6" containerName="glance-log" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.475257 4823 
state_mem.go:107] "Deleted CPUSet assignment" podUID="d06b91f8-1fcd-40fe-b712-0549d99258c6" containerName="glance-log" Dec 16 09:13:53 crc kubenswrapper[4823]: E1216 09:13:53.475264 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f35ecc1-21e4-461e-91d3-3da96745fed6" containerName="heat-cfnapi" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.475271 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f35ecc1-21e4-461e-91d3-3da96745fed6" containerName="heat-cfnapi" Dec 16 09:13:53 crc kubenswrapper[4823]: E1216 09:13:53.475287 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f058bf18-c31d-4b48-a183-bb9ae9223fbe" containerName="config-reloader" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.475294 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="f058bf18-c31d-4b48-a183-bb9ae9223fbe" containerName="config-reloader" Dec 16 09:13:53 crc kubenswrapper[4823]: E1216 09:13:53.475305 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91080e73-6479-4c8b-bb2f-decdc0ade67e" containerName="sg-core" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.475312 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="91080e73-6479-4c8b-bb2f-decdc0ade67e" containerName="sg-core" Dec 16 09:13:53 crc kubenswrapper[4823]: E1216 09:13:53.475320 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5729be98-e3b4-42bd-92a6-913d63da1de3" containerName="mariadb-account-delete" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.475327 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="5729be98-e3b4-42bd-92a6-913d63da1de3" containerName="mariadb-account-delete" Dec 16 09:13:53 crc kubenswrapper[4823]: E1216 09:13:53.475340 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7" containerName="setup-container" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.475348 4823 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7" containerName="setup-container" Dec 16 09:13:53 crc kubenswrapper[4823]: E1216 09:13:53.475357 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3be6f063-aed2-4468-9cd3-f7f03bd28211" containerName="mariadb-account-delete" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.475364 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="3be6f063-aed2-4468-9cd3-f7f03bd28211" containerName="mariadb-account-delete" Dec 16 09:13:53 crc kubenswrapper[4823]: E1216 09:13:53.475377 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64445002-15b9-4ec6-8c95-7c2bd33e0ecd" containerName="openstack-network-exporter" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.475384 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="64445002-15b9-4ec6-8c95-7c2bd33e0ecd" containerName="openstack-network-exporter" Dec 16 09:13:53 crc kubenswrapper[4823]: E1216 09:13:53.475399 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3bed8d9-6a2c-458b-a7d5-d6d28ac021e7" containerName="nova-cell0-conductor-conductor" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.475406 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3bed8d9-6a2c-458b-a7d5-d6d28ac021e7" containerName="nova-cell0-conductor-conductor" Dec 16 09:13:53 crc kubenswrapper[4823]: E1216 09:13:53.475420 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ee97b1f-ce61-45ef-97e1-642cc13ef521" containerName="kube-state-metrics" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.475428 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ee97b1f-ce61-45ef-97e1-642cc13ef521" containerName="kube-state-metrics" Dec 16 09:13:53 crc kubenswrapper[4823]: E1216 09:13:53.475437 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d06b91f8-1fcd-40fe-b712-0549d99258c6" containerName="glance-httpd" Dec 16 09:13:53 crc kubenswrapper[4823]: 
I1216 09:13:53.475444 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="d06b91f8-1fcd-40fe-b712-0549d99258c6" containerName="glance-httpd" Dec 16 09:13:53 crc kubenswrapper[4823]: E1216 09:13:53.475460 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="868b7d1a-5039-4d72-9a41-d8e57b1df5d4" containerName="nova-cell1-novncproxy-novncproxy" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.475468 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="868b7d1a-5039-4d72-9a41-d8e57b1df5d4" containerName="nova-cell1-novncproxy-novncproxy" Dec 16 09:13:53 crc kubenswrapper[4823]: E1216 09:13:53.475476 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64445002-15b9-4ec6-8c95-7c2bd33e0ecd" containerName="ovn-northd" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.475485 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="64445002-15b9-4ec6-8c95-7c2bd33e0ecd" containerName="ovn-northd" Dec 16 09:13:53 crc kubenswrapper[4823]: E1216 09:13:53.475497 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34b62d72-52bc-4a7d-806c-52784476a695" containerName="aodh-notifier" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.475504 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="34b62d72-52bc-4a7d-806c-52784476a695" containerName="aodh-notifier" Dec 16 09:13:53 crc kubenswrapper[4823]: E1216 09:13:53.475516 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7" containerName="rabbitmq" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.475523 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7" containerName="rabbitmq" Dec 16 09:13:53 crc kubenswrapper[4823]: E1216 09:13:53.475538 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d650b48-8848-4495-9b48-fdf7472cc19e" containerName="horizon-log" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 
09:13:53.475547 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d650b48-8848-4495-9b48-fdf7472cc19e" containerName="horizon-log" Dec 16 09:13:53 crc kubenswrapper[4823]: E1216 09:13:53.475556 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ebc4a0e-1b85-400b-bc10-5d216d7431fb" containerName="init-config-reloader" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.475565 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ebc4a0e-1b85-400b-bc10-5d216d7431fb" containerName="init-config-reloader" Dec 16 09:13:53 crc kubenswrapper[4823]: E1216 09:13:53.475573 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f60cf52-47f0-4efd-8479-64bcc13848cf" containerName="nova-scheduler-scheduler" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.475583 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f60cf52-47f0-4efd-8479-64bcc13848cf" containerName="nova-scheduler-scheduler" Dec 16 09:13:53 crc kubenswrapper[4823]: E1216 09:13:53.475597 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4496b25e-2f39-453a-aa60-ffa74e9913c8" containerName="galera" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.475604 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="4496b25e-2f39-453a-aa60-ffa74e9913c8" containerName="galera" Dec 16 09:13:53 crc kubenswrapper[4823]: E1216 09:13:53.475614 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c01b8d9-eaf3-4415-a1b7-c53cbfaa707a" containerName="nova-metadata-log" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.475621 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c01b8d9-eaf3-4415-a1b7-c53cbfaa707a" containerName="nova-metadata-log" Dec 16 09:13:53 crc kubenswrapper[4823]: E1216 09:13:53.475629 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8b8d93d-24db-4382-9077-7404605c7cf1" containerName="heat-api" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 
09:13:53.475637 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8b8d93d-24db-4382-9077-7404605c7cf1" containerName="heat-api" Dec 16 09:13:53 crc kubenswrapper[4823]: E1216 09:13:53.475651 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ebc4a0e-1b85-400b-bc10-5d216d7431fb" containerName="alertmanager" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.475659 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ebc4a0e-1b85-400b-bc10-5d216d7431fb" containerName="alertmanager" Dec 16 09:13:53 crc kubenswrapper[4823]: E1216 09:13:53.475668 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83abe53b-780a-4255-b2a8-22f3480c9358" containerName="neutron-httpd" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.475675 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="83abe53b-780a-4255-b2a8-22f3480c9358" containerName="neutron-httpd" Dec 16 09:13:53 crc kubenswrapper[4823]: E1216 09:13:53.475686 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="341f00a5-410a-4656-876e-a6b0cfe2a4df" containerName="barbican-worker-log" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.475693 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="341f00a5-410a-4656-876e-a6b0cfe2a4df" containerName="barbican-worker-log" Dec 16 09:13:53 crc kubenswrapper[4823]: E1216 09:13:53.475703 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99ce1c86-eccc-4f3c-b999-18774e823763" containerName="dnsmasq-dns" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.475711 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="99ce1c86-eccc-4f3c-b999-18774e823763" containerName="dnsmasq-dns" Dec 16 09:13:53 crc kubenswrapper[4823]: E1216 09:13:53.475721 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ecabaef-9422-4e5c-bf83-df3b523b8fa7" containerName="mariadb-account-delete" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.475728 4823 
state_mem.go:107] "Deleted CPUSet assignment" podUID="6ecabaef-9422-4e5c-bf83-df3b523b8fa7" containerName="mariadb-account-delete" Dec 16 09:13:53 crc kubenswrapper[4823]: E1216 09:13:53.475739 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e511eaa-334a-4fe3-ab41-e66d4a53a931" containerName="mysql-bootstrap" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.475747 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e511eaa-334a-4fe3-ab41-e66d4a53a931" containerName="mysql-bootstrap" Dec 16 09:13:53 crc kubenswrapper[4823]: E1216 09:13:53.475759 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34b62d72-52bc-4a7d-806c-52784476a695" containerName="aodh-evaluator" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.475765 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="34b62d72-52bc-4a7d-806c-52784476a695" containerName="aodh-evaluator" Dec 16 09:13:53 crc kubenswrapper[4823]: E1216 09:13:53.475777 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34b62d72-52bc-4a7d-806c-52784476a695" containerName="aodh-listener" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.475785 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="34b62d72-52bc-4a7d-806c-52784476a695" containerName="aodh-listener" Dec 16 09:13:53 crc kubenswrapper[4823]: E1216 09:13:53.475797 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91f5097e-d643-4598-9d06-39f14f913291" containerName="cinder-api-log" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.475804 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="91f5097e-d643-4598-9d06-39f14f913291" containerName="cinder-api-log" Dec 16 09:13:53 crc kubenswrapper[4823]: E1216 09:13:53.475817 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91080e73-6479-4c8b-bb2f-decdc0ade67e" containerName="ceilometer-notification-agent" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.475826 4823 
state_mem.go:107] "Deleted CPUSet assignment" podUID="91080e73-6479-4c8b-bb2f-decdc0ade67e" containerName="ceilometer-notification-agent" Dec 16 09:13:53 crc kubenswrapper[4823]: E1216 09:13:53.475836 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05dfc2e3-71af-4150-a4ca-02b5629083ae" containerName="ovsdbserver-nb" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.475844 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="05dfc2e3-71af-4150-a4ca-02b5629083ae" containerName="ovsdbserver-nb" Dec 16 09:13:53 crc kubenswrapper[4823]: E1216 09:13:53.475852 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25a0f697-45ab-48cd-b4e2-d5e8bcd3b725" containerName="mariadb-account-delete" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.475860 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="25a0f697-45ab-48cd-b4e2-d5e8bcd3b725" containerName="mariadb-account-delete" Dec 16 09:13:53 crc kubenswrapper[4823]: E1216 09:13:53.475872 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b416f746-16ad-4f74-b315-f67ca3d0bb35" containerName="adoption" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.475878 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="b416f746-16ad-4f74-b315-f67ca3d0bb35" containerName="adoption" Dec 16 09:13:53 crc kubenswrapper[4823]: E1216 09:13:53.475892 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34b62d72-52bc-4a7d-806c-52784476a695" containerName="aodh-api" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.475898 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="34b62d72-52bc-4a7d-806c-52784476a695" containerName="aodh-api" Dec 16 09:13:53 crc kubenswrapper[4823]: E1216 09:13:53.475908 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91080e73-6479-4c8b-bb2f-decdc0ade67e" containerName="proxy-httpd" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.475915 4823 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="91080e73-6479-4c8b-bb2f-decdc0ade67e" containerName="proxy-httpd" Dec 16 09:13:53 crc kubenswrapper[4823]: E1216 09:13:53.475925 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25a0f697-45ab-48cd-b4e2-d5e8bcd3b725" containerName="mariadb-account-delete" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.475932 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="25a0f697-45ab-48cd-b4e2-d5e8bcd3b725" containerName="mariadb-account-delete" Dec 16 09:13:53 crc kubenswrapper[4823]: E1216 09:13:53.475946 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7835251f-9e66-445c-9581-0422195cdc2b" containerName="mariadb-account-delete" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.475952 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="7835251f-9e66-445c-9581-0422195cdc2b" containerName="mariadb-account-delete" Dec 16 09:13:53 crc kubenswrapper[4823]: E1216 09:13:53.475959 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8" containerName="openstack-network-exporter" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.475966 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8" containerName="openstack-network-exporter" Dec 16 09:13:53 crc kubenswrapper[4823]: E1216 09:13:53.475978 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a50033a-9a6e-42e3-ac23-de2a24654b0f" containerName="barbican-keystone-listener-log" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.475984 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a50033a-9a6e-42e3-ac23-de2a24654b0f" containerName="barbican-keystone-listener-log" Dec 16 09:13:53 crc kubenswrapper[4823]: E1216 09:13:53.475996 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91f5097e-d643-4598-9d06-39f14f913291" containerName="cinder-api" Dec 16 09:13:53 crc 
kubenswrapper[4823]: I1216 09:13:53.476004 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="91f5097e-d643-4598-9d06-39f14f913291" containerName="cinder-api" Dec 16 09:13:53 crc kubenswrapper[4823]: E1216 09:13:53.476015 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f058bf18-c31d-4b48-a183-bb9ae9223fbe" containerName="init-config-reloader" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.476027 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="f058bf18-c31d-4b48-a183-bb9ae9223fbe" containerName="init-config-reloader" Dec 16 09:13:53 crc kubenswrapper[4823]: E1216 09:13:53.476053 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a40068b-87bc-4af6-862d-ad33696041b3" containerName="nova-api-log" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.476064 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a40068b-87bc-4af6-862d-ad33696041b3" containerName="nova-api-log" Dec 16 09:13:53 crc kubenswrapper[4823]: E1216 09:13:53.476077 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1422dc66-68e5-403d-9e01-657d83772587" containerName="mariadb-account-delete" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.476084 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="1422dc66-68e5-403d-9e01-657d83772587" containerName="mariadb-account-delete" Dec 16 09:13:53 crc kubenswrapper[4823]: E1216 09:13:53.476098 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e511eaa-334a-4fe3-ab41-e66d4a53a931" containerName="galera" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.476105 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e511eaa-334a-4fe3-ab41-e66d4a53a931" containerName="galera" Dec 16 09:13:53 crc kubenswrapper[4823]: E1216 09:13:53.476118 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7835251f-9e66-445c-9581-0422195cdc2b" containerName="mariadb-account-delete" Dec 16 09:13:53 crc 
kubenswrapper[4823]: I1216 09:13:53.476125 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="7835251f-9e66-445c-9581-0422195cdc2b" containerName="mariadb-account-delete" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.476332 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d676c2b-8cf1-4933-8f2b-641733d096fc" containerName="proxy-httpd" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.476351 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="1422dc66-68e5-403d-9e01-657d83772587" containerName="mariadb-account-delete" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.476361 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="91f5097e-d643-4598-9d06-39f14f913291" containerName="cinder-api" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.476373 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="cffdbd32-0155-4dd0-897d-9e406fd5e2ee" containerName="probe" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.476385 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="341f00a5-410a-4656-876e-a6b0cfe2a4df" containerName="barbican-worker" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.476392 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8b8d93d-24db-4382-9077-7404605c7cf1" containerName="heat-api" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.476405 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="44c54ba6-36e8-4608-ab54-965ab4bdcef2" containerName="placement-api" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.476412 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="25a0f697-45ab-48cd-b4e2-d5e8bcd3b725" containerName="mariadb-account-delete" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.476423 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="7835251f-9e66-445c-9581-0422195cdc2b" containerName="mariadb-account-delete" 
Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.476433 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d676c2b-8cf1-4933-8f2b-641733d096fc" containerName="proxy-server" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.476440 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="868b7d1a-5039-4d72-9a41-d8e57b1df5d4" containerName="nova-cell1-novncproxy-novncproxy" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.476451 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="64445002-15b9-4ec6-8c95-7c2bd33e0ecd" containerName="ovn-northd" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.476460 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="f058bf18-c31d-4b48-a183-bb9ae9223fbe" containerName="config-reloader" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.476470 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ecabaef-9422-4e5c-bf83-df3b523b8fa7" containerName="mariadb-account-delete" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.476478 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="83abe53b-780a-4255-b2a8-22f3480c9358" containerName="neutron-httpd" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.476490 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a50033a-9a6e-42e3-ac23-de2a24654b0f" containerName="barbican-keystone-listener-log" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.476498 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="91080e73-6479-4c8b-bb2f-decdc0ade67e" containerName="sg-core" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.476505 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="34b62d72-52bc-4a7d-806c-52784476a695" containerName="aodh-listener" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.476513 4823 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="3c01b8d9-eaf3-4415-a1b7-c53cbfaa707a" containerName="nova-metadata-log" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.476519 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f60cf52-47f0-4efd-8479-64bcc13848cf" containerName="nova-scheduler-scheduler" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.476529 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a40068b-87bc-4af6-862d-ad33696041b3" containerName="nova-api-log" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.476536 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ebc4a0e-1b85-400b-bc10-5d216d7431fb" containerName="config-reloader" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.476543 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b75df4d-61d8-4913-bea9-018339e8e2a8" containerName="mariadb-account-delete" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.476549 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a50033a-9a6e-42e3-ac23-de2a24654b0f" containerName="barbican-keystone-listener" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.476556 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5a5b801-4e68-48cb-a50c-c1f7cc5bf2e7" containerName="adoption" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.476565 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="91f5097e-d643-4598-9d06-39f14f913291" containerName="cinder-api-log" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.476575 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="60956cfa-c484-445d-af87-52713ccf4d09" containerName="glance-httpd" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.476583 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="34b62d72-52bc-4a7d-806c-52784476a695" containerName="aodh-evaluator" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.476590 4823 
memory_manager.go:354] "RemoveStaleState removing state" podUID="cffdbd32-0155-4dd0-897d-9e406fd5e2ee" containerName="cinder-scheduler" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.476600 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="9fd92bc3-eaf0-4217-bcac-dd8f41db9edf" containerName="nova-cell1-conductor-conductor" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.476606 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9880fe3-977f-473b-84c9-2cb6f65d588d" containerName="mariadb-account-delete" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.476618 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c01b8d9-eaf3-4415-a1b7-c53cbfaa707a" containerName="nova-metadata-metadata" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.476628 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="99ce1c86-eccc-4f3c-b999-18774e823763" containerName="dnsmasq-dns" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.476639 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f35ecc1-21e4-461e-91d3-3da96745fed6" containerName="heat-cfnapi" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.476648 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf14ab2c-212b-406f-b102-2a4b8a7a29f5" containerName="rabbitmq" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.476658 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="b416f746-16ad-4f74-b315-f67ca3d0bb35" containerName="adoption" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.476668 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="c90aab28-60fa-4cdc-a89a-bd041351015d" containerName="memcached" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.476680 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6361c12-5d54-4919-aafe-4ac9b88e8c20" containerName="mariadb-account-delete" Dec 16 09:13:53 crc 
kubenswrapper[4823]: I1216 09:13:53.476689 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8" containerName="ovsdbserver-sb" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.476704 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b8f2d6b-a3bc-4503-84f7-5d9eb1db17ec" containerName="barbican-api" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.476712 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="91080e73-6479-4c8b-bb2f-decdc0ade67e" containerName="ceilometer-central-agent" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.476720 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="34b62d72-52bc-4a7d-806c-52784476a695" containerName="aodh-api" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.476731 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b8f2d6b-a3bc-4503-84f7-5d9eb1db17ec" containerName="barbican-api-log" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.476741 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a613891-fc01-4f69-97a8-63cccc00f4a5" containerName="heat-engine" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.476751 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc7c64bb-0eae-4a5f-af09-ea948a6bd3f7" containerName="rabbitmq" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.476758 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="64445002-15b9-4ec6-8c95-7c2bd33e0ecd" containerName="openstack-network-exporter" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.476769 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="f058bf18-c31d-4b48-a183-bb9ae9223fbe" containerName="prometheus" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.476775 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="34b62d72-52bc-4a7d-806c-52784476a695" 
containerName="aodh-notifier" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.476789 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="5729be98-e3b4-42bd-92a6-913d63da1de3" containerName="mariadb-account-delete" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.476801 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="6cd65ea3-90ff-4a09-9ae8-3406de2d5ad8" containerName="openstack-network-exporter" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.476814 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="83abe53b-780a-4255-b2a8-22f3480c9358" containerName="neutron-api" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.476826 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a40068b-87bc-4af6-862d-ad33696041b3" containerName="nova-api-api" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.476837 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3bed8d9-6a2c-458b-a7d5-d6d28ac021e7" containerName="nova-cell0-conductor-conductor" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.476846 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="d06b91f8-1fcd-40fe-b712-0549d99258c6" containerName="glance-log" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.476861 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="91080e73-6479-4c8b-bb2f-decdc0ade67e" containerName="proxy-httpd" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.476875 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d650b48-8848-4495-9b48-fdf7472cc19e" containerName="horizon-log" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.476885 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="60956cfa-c484-445d-af87-52713ccf4d09" containerName="glance-log" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.476893 4823 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="91080e73-6479-4c8b-bb2f-decdc0ade67e" containerName="ceilometer-notification-agent" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.476904 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="4da7ae09-cc1d-4f42-b1be-7045236d12e9" containerName="mariadb-account-delete" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.476918 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e511eaa-334a-4fe3-ab41-e66d4a53a931" containerName="galera" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.476928 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="05dfc2e3-71af-4150-a4ca-02b5629083ae" containerName="openstack-network-exporter" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.476943 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="44c54ba6-36e8-4608-ab54-965ab4bdcef2" containerName="placement-log" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.476955 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="4496b25e-2f39-453a-aa60-ffa74e9913c8" containerName="galera" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.476970 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d650b48-8848-4495-9b48-fdf7472cc19e" containerName="horizon" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.476980 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="3be6f063-aed2-4468-9cd3-f7f03bd28211" containerName="mariadb-account-delete" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.476989 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="d06b91f8-1fcd-40fe-b712-0549d99258c6" containerName="glance-httpd" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.477003 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5b08afc-bfe3-4938-ac42-3781d1290201" containerName="keystone-api" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.477016 4823 
memory_manager.go:354] "RemoveStaleState removing state" podUID="341f00a5-410a-4656-876e-a6b0cfe2a4df" containerName="barbican-worker-log" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.477046 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="f058bf18-c31d-4b48-a183-bb9ae9223fbe" containerName="thanos-sidecar" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.477062 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ebc4a0e-1b85-400b-bc10-5d216d7431fb" containerName="alertmanager" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.477076 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="05dfc2e3-71af-4150-a4ca-02b5629083ae" containerName="ovsdbserver-nb" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.477089 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ee97b1f-ce61-45ef-97e1-642cc13ef521" containerName="kube-state-metrics" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.477486 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="1422dc66-68e5-403d-9e01-657d83772587" containerName="mariadb-account-delete" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.477504 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="7835251f-9e66-445c-9581-0422195cdc2b" containerName="mariadb-account-delete" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.477516 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="25a0f697-45ab-48cd-b4e2-d5e8bcd3b725" containerName="mariadb-account-delete" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.478240 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vlg4h/must-gather-fwxrv" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.485000 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-vlg4h"/"kube-root-ca.crt" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.485095 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-vlg4h"/"openshift-service-ca.crt" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.485404 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-vlg4h"/"default-dockercfg-cbwdm" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.493325 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-vlg4h/must-gather-fwxrv"] Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.565822 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ee145ea2-ab11-4150-b064-795d83f416f4-must-gather-output\") pod \"must-gather-fwxrv\" (UID: \"ee145ea2-ab11-4150-b064-795d83f416f4\") " pod="openshift-must-gather-vlg4h/must-gather-fwxrv" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.565931 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79tk9\" (UniqueName: \"kubernetes.io/projected/ee145ea2-ab11-4150-b064-795d83f416f4-kube-api-access-79tk9\") pod \"must-gather-fwxrv\" (UID: \"ee145ea2-ab11-4150-b064-795d83f416f4\") " pod="openshift-must-gather-vlg4h/must-gather-fwxrv" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.667183 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ee145ea2-ab11-4150-b064-795d83f416f4-must-gather-output\") pod \"must-gather-fwxrv\" (UID: \"ee145ea2-ab11-4150-b064-795d83f416f4\") " 
pod="openshift-must-gather-vlg4h/must-gather-fwxrv" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.667326 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79tk9\" (UniqueName: \"kubernetes.io/projected/ee145ea2-ab11-4150-b064-795d83f416f4-kube-api-access-79tk9\") pod \"must-gather-fwxrv\" (UID: \"ee145ea2-ab11-4150-b064-795d83f416f4\") " pod="openshift-must-gather-vlg4h/must-gather-fwxrv" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.667994 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ee145ea2-ab11-4150-b064-795d83f416f4-must-gather-output\") pod \"must-gather-fwxrv\" (UID: \"ee145ea2-ab11-4150-b064-795d83f416f4\") " pod="openshift-must-gather-vlg4h/must-gather-fwxrv" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.690596 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79tk9\" (UniqueName: \"kubernetes.io/projected/ee145ea2-ab11-4150-b064-795d83f416f4-kube-api-access-79tk9\") pod \"must-gather-fwxrv\" (UID: \"ee145ea2-ab11-4150-b064-795d83f416f4\") " pod="openshift-must-gather-vlg4h/must-gather-fwxrv" Dec 16 09:13:53 crc kubenswrapper[4823]: I1216 09:13:53.811528 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vlg4h/must-gather-fwxrv" Dec 16 09:13:54 crc kubenswrapper[4823]: I1216 09:13:54.052376 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-vlg4h/must-gather-fwxrv"] Dec 16 09:13:54 crc kubenswrapper[4823]: W1216 09:13:54.058348 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podee145ea2_ab11_4150_b064_795d83f416f4.slice/crio-11fd32024f7bae7a15451bfd19f08534b7dfb9400dec059795a0548c1a334f29 WatchSource:0}: Error finding container 11fd32024f7bae7a15451bfd19f08534b7dfb9400dec059795a0548c1a334f29: Status 404 returned error can't find the container with id 11fd32024f7bae7a15451bfd19f08534b7dfb9400dec059795a0548c1a334f29 Dec 16 09:13:54 crc kubenswrapper[4823]: I1216 09:13:54.250090 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vlg4h/must-gather-fwxrv" event={"ID":"ee145ea2-ab11-4150-b064-795d83f416f4","Type":"ContainerStarted","Data":"11fd32024f7bae7a15451bfd19f08534b7dfb9400dec059795a0548c1a334f29"} Dec 16 09:13:58 crc kubenswrapper[4823]: I1216 09:13:58.629267 4823 scope.go:117] "RemoveContainer" containerID="6dcfd48a9401ea537616a16a619f9cf8397493228a632419e0b26b39624f0619" Dec 16 09:14:00 crc kubenswrapper[4823]: I1216 09:14:00.584328 4823 scope.go:117] "RemoveContainer" containerID="6841fcbf9dc6db729a33a5b8eec379954b362ffe6e98be98570e1c486a7e1947" Dec 16 09:14:00 crc kubenswrapper[4823]: I1216 09:14:00.644176 4823 scope.go:117] "RemoveContainer" containerID="f36a353c66bec2426a6543a7c9be7673d22af695566b4903267ba28b9ec1e76a" Dec 16 09:14:00 crc kubenswrapper[4823]: I1216 09:14:00.669539 4823 scope.go:117] "RemoveContainer" containerID="96fcd04f838b04b2f8450fa0fddd75849e66aca34b0de130673596b80ab9b02e" Dec 16 09:14:01 crc kubenswrapper[4823]: I1216 09:14:01.322291 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vlg4h/must-gather-fwxrv" 
event={"ID":"ee145ea2-ab11-4150-b064-795d83f416f4","Type":"ContainerStarted","Data":"5a3e99951950de22896958abf8332090de40f4e97d792a7902e4befcf3094dc0"} Dec 16 09:14:01 crc kubenswrapper[4823]: I1216 09:14:01.322905 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vlg4h/must-gather-fwxrv" event={"ID":"ee145ea2-ab11-4150-b064-795d83f416f4","Type":"ContainerStarted","Data":"b5fda3c4174361f11995c4c15e9f756265e6960dc5f08034b421e772e8164beb"} Dec 16 09:14:01 crc kubenswrapper[4823]: I1216 09:14:01.339562 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-vlg4h/must-gather-fwxrv" podStartSLOduration=1.730131743 podStartE2EDuration="8.33954314s" podCreationTimestamp="2025-12-16 09:13:53 +0000 UTC" firstStartedPulling="2025-12-16 09:13:54.060108682 +0000 UTC m=+8312.548674805" lastFinishedPulling="2025-12-16 09:14:00.669520069 +0000 UTC m=+8319.158086202" observedRunningTime="2025-12-16 09:14:01.338361482 +0000 UTC m=+8319.826927635" watchObservedRunningTime="2025-12-16 09:14:01.33954314 +0000 UTC m=+8319.828109263" Dec 16 09:14:01 crc kubenswrapper[4823]: I1216 09:14:01.449023 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-vlg4h/crc-debug-kk9ln"] Dec 16 09:14:01 crc kubenswrapper[4823]: I1216 09:14:01.450427 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vlg4h/crc-debug-kk9ln" Dec 16 09:14:01 crc kubenswrapper[4823]: I1216 09:14:01.504693 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cfcfce70-e894-4d19-8c93-112a96c14c05-host\") pod \"crc-debug-kk9ln\" (UID: \"cfcfce70-e894-4d19-8c93-112a96c14c05\") " pod="openshift-must-gather-vlg4h/crc-debug-kk9ln" Dec 16 09:14:01 crc kubenswrapper[4823]: I1216 09:14:01.505005 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9tjc2\" (UniqueName: \"kubernetes.io/projected/cfcfce70-e894-4d19-8c93-112a96c14c05-kube-api-access-9tjc2\") pod \"crc-debug-kk9ln\" (UID: \"cfcfce70-e894-4d19-8c93-112a96c14c05\") " pod="openshift-must-gather-vlg4h/crc-debug-kk9ln" Dec 16 09:14:01 crc kubenswrapper[4823]: I1216 09:14:01.607042 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cfcfce70-e894-4d19-8c93-112a96c14c05-host\") pod \"crc-debug-kk9ln\" (UID: \"cfcfce70-e894-4d19-8c93-112a96c14c05\") " pod="openshift-must-gather-vlg4h/crc-debug-kk9ln" Dec 16 09:14:01 crc kubenswrapper[4823]: I1216 09:14:01.607515 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9tjc2\" (UniqueName: \"kubernetes.io/projected/cfcfce70-e894-4d19-8c93-112a96c14c05-kube-api-access-9tjc2\") pod \"crc-debug-kk9ln\" (UID: \"cfcfce70-e894-4d19-8c93-112a96c14c05\") " pod="openshift-must-gather-vlg4h/crc-debug-kk9ln" Dec 16 09:14:01 crc kubenswrapper[4823]: I1216 09:14:01.607151 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cfcfce70-e894-4d19-8c93-112a96c14c05-host\") pod \"crc-debug-kk9ln\" (UID: \"cfcfce70-e894-4d19-8c93-112a96c14c05\") " pod="openshift-must-gather-vlg4h/crc-debug-kk9ln" Dec 16 09:14:01 crc 
kubenswrapper[4823]: I1216 09:14:01.640089 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9tjc2\" (UniqueName: \"kubernetes.io/projected/cfcfce70-e894-4d19-8c93-112a96c14c05-kube-api-access-9tjc2\") pod \"crc-debug-kk9ln\" (UID: \"cfcfce70-e894-4d19-8c93-112a96c14c05\") " pod="openshift-must-gather-vlg4h/crc-debug-kk9ln" Dec 16 09:14:01 crc kubenswrapper[4823]: I1216 09:14:01.809138 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vlg4h/crc-debug-kk9ln" Dec 16 09:14:01 crc kubenswrapper[4823]: W1216 09:14:01.833918 4823 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcfcfce70_e894_4d19_8c93_112a96c14c05.slice/crio-d4ab56f17d46ee5e0a3cd60c23eb6af58f7de30d1d5a4b6151f62b29a02d99b0 WatchSource:0}: Error finding container d4ab56f17d46ee5e0a3cd60c23eb6af58f7de30d1d5a4b6151f62b29a02d99b0: Status 404 returned error can't find the container with id d4ab56f17d46ee5e0a3cd60c23eb6af58f7de30d1d5a4b6151f62b29a02d99b0 Dec 16 09:14:02 crc kubenswrapper[4823]: I1216 09:14:02.331770 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vlg4h/crc-debug-kk9ln" event={"ID":"cfcfce70-e894-4d19-8c93-112a96c14c05","Type":"ContainerStarted","Data":"d4ab56f17d46ee5e0a3cd60c23eb6af58f7de30d1d5a4b6151f62b29a02d99b0"} Dec 16 09:14:13 crc kubenswrapper[4823]: I1216 09:14:13.460807 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vlg4h/crc-debug-kk9ln" event={"ID":"cfcfce70-e894-4d19-8c93-112a96c14c05","Type":"ContainerStarted","Data":"90a3999eaeab4b6ec6caa3bd3a2be96091e00d2c6ddf3179e431cd18d2a460ad"} Dec 16 09:14:13 crc kubenswrapper[4823]: I1216 09:14:13.480797 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-vlg4h/crc-debug-kk9ln" podStartSLOduration=1.492924788 podStartE2EDuration="12.480776029s" 
podCreationTimestamp="2025-12-16 09:14:01 +0000 UTC" firstStartedPulling="2025-12-16 09:14:01.835966097 +0000 UTC m=+8320.324532210" lastFinishedPulling="2025-12-16 09:14:12.823817318 +0000 UTC m=+8331.312383451" observedRunningTime="2025-12-16 09:14:13.476847926 +0000 UTC m=+8331.965414049" watchObservedRunningTime="2025-12-16 09:14:13.480776029 +0000 UTC m=+8331.969342152" Dec 16 09:14:39 crc kubenswrapper[4823]: I1216 09:14:39.666851 4823 generic.go:334] "Generic (PLEG): container finished" podID="cfcfce70-e894-4d19-8c93-112a96c14c05" containerID="90a3999eaeab4b6ec6caa3bd3a2be96091e00d2c6ddf3179e431cd18d2a460ad" exitCode=0 Dec 16 09:14:39 crc kubenswrapper[4823]: I1216 09:14:39.666954 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vlg4h/crc-debug-kk9ln" event={"ID":"cfcfce70-e894-4d19-8c93-112a96c14c05","Type":"ContainerDied","Data":"90a3999eaeab4b6ec6caa3bd3a2be96091e00d2c6ddf3179e431cd18d2a460ad"} Dec 16 09:14:40 crc kubenswrapper[4823]: I1216 09:14:40.764130 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vlg4h/crc-debug-kk9ln" Dec 16 09:14:40 crc kubenswrapper[4823]: I1216 09:14:40.797581 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-vlg4h/crc-debug-kk9ln"] Dec 16 09:14:40 crc kubenswrapper[4823]: I1216 09:14:40.802762 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-vlg4h/crc-debug-kk9ln"] Dec 16 09:14:40 crc kubenswrapper[4823]: I1216 09:14:40.923652 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cfcfce70-e894-4d19-8c93-112a96c14c05-host\") pod \"cfcfce70-e894-4d19-8c93-112a96c14c05\" (UID: \"cfcfce70-e894-4d19-8c93-112a96c14c05\") " Dec 16 09:14:40 crc kubenswrapper[4823]: I1216 09:14:40.924044 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9tjc2\" (UniqueName: \"kubernetes.io/projected/cfcfce70-e894-4d19-8c93-112a96c14c05-kube-api-access-9tjc2\") pod \"cfcfce70-e894-4d19-8c93-112a96c14c05\" (UID: \"cfcfce70-e894-4d19-8c93-112a96c14c05\") " Dec 16 09:14:40 crc kubenswrapper[4823]: I1216 09:14:40.923783 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cfcfce70-e894-4d19-8c93-112a96c14c05-host" (OuterVolumeSpecName: "host") pod "cfcfce70-e894-4d19-8c93-112a96c14c05" (UID: "cfcfce70-e894-4d19-8c93-112a96c14c05"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 09:14:40 crc kubenswrapper[4823]: I1216 09:14:40.924357 4823 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cfcfce70-e894-4d19-8c93-112a96c14c05-host\") on node \"crc\" DevicePath \"\"" Dec 16 09:14:40 crc kubenswrapper[4823]: I1216 09:14:40.932612 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfcfce70-e894-4d19-8c93-112a96c14c05-kube-api-access-9tjc2" (OuterVolumeSpecName: "kube-api-access-9tjc2") pod "cfcfce70-e894-4d19-8c93-112a96c14c05" (UID: "cfcfce70-e894-4d19-8c93-112a96c14c05"). InnerVolumeSpecName "kube-api-access-9tjc2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:14:41 crc kubenswrapper[4823]: I1216 09:14:41.025339 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9tjc2\" (UniqueName: \"kubernetes.io/projected/cfcfce70-e894-4d19-8c93-112a96c14c05-kube-api-access-9tjc2\") on node \"crc\" DevicePath \"\"" Dec 16 09:14:41 crc kubenswrapper[4823]: I1216 09:14:41.685557 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d4ab56f17d46ee5e0a3cd60c23eb6af58f7de30d1d5a4b6151f62b29a02d99b0" Dec 16 09:14:41 crc kubenswrapper[4823]: I1216 09:14:41.685636 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vlg4h/crc-debug-kk9ln" Dec 16 09:14:41 crc kubenswrapper[4823]: I1216 09:14:41.782258 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cfcfce70-e894-4d19-8c93-112a96c14c05" path="/var/lib/kubelet/pods/cfcfce70-e894-4d19-8c93-112a96c14c05/volumes" Dec 16 09:14:41 crc kubenswrapper[4823]: I1216 09:14:41.990062 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-vlg4h/crc-debug-g8jg2"] Dec 16 09:14:41 crc kubenswrapper[4823]: E1216 09:14:41.990425 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfcfce70-e894-4d19-8c93-112a96c14c05" containerName="container-00" Dec 16 09:14:41 crc kubenswrapper[4823]: I1216 09:14:41.990455 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfcfce70-e894-4d19-8c93-112a96c14c05" containerName="container-00" Dec 16 09:14:41 crc kubenswrapper[4823]: I1216 09:14:41.990600 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfcfce70-e894-4d19-8c93-112a96c14c05" containerName="container-00" Dec 16 09:14:41 crc kubenswrapper[4823]: I1216 09:14:41.991184 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vlg4h/crc-debug-g8jg2" Dec 16 09:14:42 crc kubenswrapper[4823]: I1216 09:14:42.038502 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3c165d71-dbd6-4aa1-941e-68a86cf8d672-host\") pod \"crc-debug-g8jg2\" (UID: \"3c165d71-dbd6-4aa1-941e-68a86cf8d672\") " pod="openshift-must-gather-vlg4h/crc-debug-g8jg2" Dec 16 09:14:42 crc kubenswrapper[4823]: I1216 09:14:42.038590 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npjxr\" (UniqueName: \"kubernetes.io/projected/3c165d71-dbd6-4aa1-941e-68a86cf8d672-kube-api-access-npjxr\") pod \"crc-debug-g8jg2\" (UID: \"3c165d71-dbd6-4aa1-941e-68a86cf8d672\") " pod="openshift-must-gather-vlg4h/crc-debug-g8jg2" Dec 16 09:14:42 crc kubenswrapper[4823]: I1216 09:14:42.139796 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3c165d71-dbd6-4aa1-941e-68a86cf8d672-host\") pod \"crc-debug-g8jg2\" (UID: \"3c165d71-dbd6-4aa1-941e-68a86cf8d672\") " pod="openshift-must-gather-vlg4h/crc-debug-g8jg2" Dec 16 09:14:42 crc kubenswrapper[4823]: I1216 09:14:42.139871 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-npjxr\" (UniqueName: \"kubernetes.io/projected/3c165d71-dbd6-4aa1-941e-68a86cf8d672-kube-api-access-npjxr\") pod \"crc-debug-g8jg2\" (UID: \"3c165d71-dbd6-4aa1-941e-68a86cf8d672\") " pod="openshift-must-gather-vlg4h/crc-debug-g8jg2" Dec 16 09:14:42 crc kubenswrapper[4823]: I1216 09:14:42.139920 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3c165d71-dbd6-4aa1-941e-68a86cf8d672-host\") pod \"crc-debug-g8jg2\" (UID: \"3c165d71-dbd6-4aa1-941e-68a86cf8d672\") " pod="openshift-must-gather-vlg4h/crc-debug-g8jg2" Dec 16 09:14:42 crc 
kubenswrapper[4823]: I1216 09:14:42.160312 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-npjxr\" (UniqueName: \"kubernetes.io/projected/3c165d71-dbd6-4aa1-941e-68a86cf8d672-kube-api-access-npjxr\") pod \"crc-debug-g8jg2\" (UID: \"3c165d71-dbd6-4aa1-941e-68a86cf8d672\") " pod="openshift-must-gather-vlg4h/crc-debug-g8jg2" Dec 16 09:14:42 crc kubenswrapper[4823]: I1216 09:14:42.316059 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vlg4h/crc-debug-g8jg2" Dec 16 09:14:42 crc kubenswrapper[4823]: I1216 09:14:42.693630 4823 generic.go:334] "Generic (PLEG): container finished" podID="3c165d71-dbd6-4aa1-941e-68a86cf8d672" containerID="415cd00dfe13c0b99004e4f790581212e138e2f1563be7b7eceb11fb925087ac" exitCode=1 Dec 16 09:14:42 crc kubenswrapper[4823]: I1216 09:14:42.693905 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vlg4h/crc-debug-g8jg2" event={"ID":"3c165d71-dbd6-4aa1-941e-68a86cf8d672","Type":"ContainerDied","Data":"415cd00dfe13c0b99004e4f790581212e138e2f1563be7b7eceb11fb925087ac"} Dec 16 09:14:42 crc kubenswrapper[4823]: I1216 09:14:42.693931 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vlg4h/crc-debug-g8jg2" event={"ID":"3c165d71-dbd6-4aa1-941e-68a86cf8d672","Type":"ContainerStarted","Data":"2e07ed48bdd8eb2b76647ab44f6f0e50e546cf6f8c6229b51adeebdc9e52b582"} Dec 16 09:14:42 crc kubenswrapper[4823]: I1216 09:14:42.726106 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-vlg4h/crc-debug-g8jg2"] Dec 16 09:14:42 crc kubenswrapper[4823]: I1216 09:14:42.733045 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-vlg4h/crc-debug-g8jg2"] Dec 16 09:14:43 crc kubenswrapper[4823]: I1216 09:14:43.802069 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vlg4h/crc-debug-g8jg2" Dec 16 09:14:43 crc kubenswrapper[4823]: I1216 09:14:43.963591 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-npjxr\" (UniqueName: \"kubernetes.io/projected/3c165d71-dbd6-4aa1-941e-68a86cf8d672-kube-api-access-npjxr\") pod \"3c165d71-dbd6-4aa1-941e-68a86cf8d672\" (UID: \"3c165d71-dbd6-4aa1-941e-68a86cf8d672\") " Dec 16 09:14:43 crc kubenswrapper[4823]: I1216 09:14:43.963771 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3c165d71-dbd6-4aa1-941e-68a86cf8d672-host\") pod \"3c165d71-dbd6-4aa1-941e-68a86cf8d672\" (UID: \"3c165d71-dbd6-4aa1-941e-68a86cf8d672\") " Dec 16 09:14:43 crc kubenswrapper[4823]: I1216 09:14:43.964041 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3c165d71-dbd6-4aa1-941e-68a86cf8d672-host" (OuterVolumeSpecName: "host") pod "3c165d71-dbd6-4aa1-941e-68a86cf8d672" (UID: "3c165d71-dbd6-4aa1-941e-68a86cf8d672"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 09:14:43 crc kubenswrapper[4823]: I1216 09:14:43.975262 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c165d71-dbd6-4aa1-941e-68a86cf8d672-kube-api-access-npjxr" (OuterVolumeSpecName: "kube-api-access-npjxr") pod "3c165d71-dbd6-4aa1-941e-68a86cf8d672" (UID: "3c165d71-dbd6-4aa1-941e-68a86cf8d672"). InnerVolumeSpecName "kube-api-access-npjxr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:14:44 crc kubenswrapper[4823]: I1216 09:14:44.065111 4823 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3c165d71-dbd6-4aa1-941e-68a86cf8d672-host\") on node \"crc\" DevicePath \"\"" Dec 16 09:14:44 crc kubenswrapper[4823]: I1216 09:14:44.065153 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-npjxr\" (UniqueName: \"kubernetes.io/projected/3c165d71-dbd6-4aa1-941e-68a86cf8d672-kube-api-access-npjxr\") on node \"crc\" DevicePath \"\"" Dec 16 09:14:44 crc kubenswrapper[4823]: I1216 09:14:44.710836 4823 scope.go:117] "RemoveContainer" containerID="415cd00dfe13c0b99004e4f790581212e138e2f1563be7b7eceb11fb925087ac" Dec 16 09:14:44 crc kubenswrapper[4823]: I1216 09:14:44.710941 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vlg4h/crc-debug-g8jg2" Dec 16 09:14:45 crc kubenswrapper[4823]: I1216 09:14:45.780911 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c165d71-dbd6-4aa1-941e-68a86cf8d672" path="/var/lib/kubelet/pods/3c165d71-dbd6-4aa1-941e-68a86cf8d672/volumes" Dec 16 09:14:51 crc kubenswrapper[4823]: I1216 09:14:51.450320 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_7e1d3682-8130-4fa4-aab4-ade2ac069d2e/openstack-network-exporter/0.log" Dec 16 09:14:51 crc kubenswrapper[4823]: I1216 09:14:51.652135 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_7e1d3682-8130-4fa4-aab4-ade2ac069d2e/ovsdbserver-nb/0.log" Dec 16 09:14:51 crc kubenswrapper[4823]: I1216 09:14:51.756465 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_6353e69a-5c31-41c9-9d05-2b958aa6a79f/openstack-network-exporter/0.log" Dec 16 09:14:51 crc kubenswrapper[4823]: I1216 09:14:51.881773 4823 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovsdbserver-nb-2_6353e69a-5c31-41c9-9d05-2b958aa6a79f/ovsdbserver-nb/0.log" Dec 16 09:14:51 crc kubenswrapper[4823]: I1216 09:14:51.955776 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-1_6c99b5e4-de24-426d-9a97-05fdcbe37141/openstack-network-exporter/0.log" Dec 16 09:14:52 crc kubenswrapper[4823]: I1216 09:14:52.079714 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-1_6c99b5e4-de24-426d-9a97-05fdcbe37141/ovsdbserver-sb/0.log" Dec 16 09:14:52 crc kubenswrapper[4823]: I1216 09:14:52.194660 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_dc75b889-6dc5-462d-a589-50f705ffd78f/openstack-network-exporter/0.log" Dec 16 09:14:52 crc kubenswrapper[4823]: I1216 09:14:52.288271 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_dc75b889-6dc5-462d-a589-50f705ffd78f/ovsdbserver-sb/0.log" Dec 16 09:14:58 crc kubenswrapper[4823]: I1216 09:14:58.134090 4823 patch_prober.go:28] interesting pod/machine-config-daemon-fv56f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 09:14:58 crc kubenswrapper[4823]: I1216 09:14:58.134686 4823 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 09:15:00 crc kubenswrapper[4823]: I1216 09:15:00.141917 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431275-kjtrp"] Dec 16 09:15:00 crc kubenswrapper[4823]: E1216 09:15:00.142487 4823 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c165d71-dbd6-4aa1-941e-68a86cf8d672" containerName="container-00" Dec 16 09:15:00 crc kubenswrapper[4823]: I1216 09:15:00.142499 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c165d71-dbd6-4aa1-941e-68a86cf8d672" containerName="container-00" Dec 16 09:15:00 crc kubenswrapper[4823]: I1216 09:15:00.142668 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c165d71-dbd6-4aa1-941e-68a86cf8d672" containerName="container-00" Dec 16 09:15:00 crc kubenswrapper[4823]: I1216 09:15:00.143179 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431275-kjtrp" Dec 16 09:15:00 crc kubenswrapper[4823]: I1216 09:15:00.146374 4823 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 16 09:15:00 crc kubenswrapper[4823]: I1216 09:15:00.146502 4823 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 16 09:15:00 crc kubenswrapper[4823]: I1216 09:15:00.151619 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431275-kjtrp"] Dec 16 09:15:00 crc kubenswrapper[4823]: I1216 09:15:00.203326 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/81a453e6-62fe-44b7-bcf8-7cebf1128f04-config-volume\") pod \"collect-profiles-29431275-kjtrp\" (UID: \"81a453e6-62fe-44b7-bcf8-7cebf1128f04\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431275-kjtrp" Dec 16 09:15:00 crc kubenswrapper[4823]: I1216 09:15:00.203410 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/81a453e6-62fe-44b7-bcf8-7cebf1128f04-secret-volume\") pod \"collect-profiles-29431275-kjtrp\" (UID: \"81a453e6-62fe-44b7-bcf8-7cebf1128f04\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431275-kjtrp" Dec 16 09:15:00 crc kubenswrapper[4823]: I1216 09:15:00.203543 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhv2j\" (UniqueName: \"kubernetes.io/projected/81a453e6-62fe-44b7-bcf8-7cebf1128f04-kube-api-access-vhv2j\") pod \"collect-profiles-29431275-kjtrp\" (UID: \"81a453e6-62fe-44b7-bcf8-7cebf1128f04\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431275-kjtrp" Dec 16 09:15:00 crc kubenswrapper[4823]: I1216 09:15:00.305221 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhv2j\" (UniqueName: \"kubernetes.io/projected/81a453e6-62fe-44b7-bcf8-7cebf1128f04-kube-api-access-vhv2j\") pod \"collect-profiles-29431275-kjtrp\" (UID: \"81a453e6-62fe-44b7-bcf8-7cebf1128f04\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431275-kjtrp" Dec 16 09:15:00 crc kubenswrapper[4823]: I1216 09:15:00.305306 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/81a453e6-62fe-44b7-bcf8-7cebf1128f04-config-volume\") pod \"collect-profiles-29431275-kjtrp\" (UID: \"81a453e6-62fe-44b7-bcf8-7cebf1128f04\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431275-kjtrp" Dec 16 09:15:00 crc kubenswrapper[4823]: I1216 09:15:00.305349 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/81a453e6-62fe-44b7-bcf8-7cebf1128f04-secret-volume\") pod \"collect-profiles-29431275-kjtrp\" (UID: \"81a453e6-62fe-44b7-bcf8-7cebf1128f04\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431275-kjtrp" Dec 16 09:15:00 crc 
kubenswrapper[4823]: I1216 09:15:00.306258 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/81a453e6-62fe-44b7-bcf8-7cebf1128f04-config-volume\") pod \"collect-profiles-29431275-kjtrp\" (UID: \"81a453e6-62fe-44b7-bcf8-7cebf1128f04\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431275-kjtrp" Dec 16 09:15:00 crc kubenswrapper[4823]: I1216 09:15:00.317029 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/81a453e6-62fe-44b7-bcf8-7cebf1128f04-secret-volume\") pod \"collect-profiles-29431275-kjtrp\" (UID: \"81a453e6-62fe-44b7-bcf8-7cebf1128f04\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431275-kjtrp" Dec 16 09:15:00 crc kubenswrapper[4823]: I1216 09:15:00.331471 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhv2j\" (UniqueName: \"kubernetes.io/projected/81a453e6-62fe-44b7-bcf8-7cebf1128f04-kube-api-access-vhv2j\") pod \"collect-profiles-29431275-kjtrp\" (UID: \"81a453e6-62fe-44b7-bcf8-7cebf1128f04\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431275-kjtrp" Dec 16 09:15:00 crc kubenswrapper[4823]: I1216 09:15:00.492440 4823 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431275-kjtrp" Dec 16 09:15:00 crc kubenswrapper[4823]: I1216 09:15:00.751379 4823 scope.go:117] "RemoveContainer" containerID="bfe81cfc2db040563bbadf5d2c6d9c8824ce12477bde8907e686f8f1ec7a24d6" Dec 16 09:15:00 crc kubenswrapper[4823]: I1216 09:15:00.805897 4823 scope.go:117] "RemoveContainer" containerID="9c2fb0e25d5692eb7a90933e0f8cf60671619d23ad83cd29dd61e8449d4f5dfe" Dec 16 09:15:00 crc kubenswrapper[4823]: I1216 09:15:00.826453 4823 scope.go:117] "RemoveContainer" containerID="89a2670cf718ab31b266c6424bfc66b8bfcac5fd22e626dc8ba58e5549ee3781" Dec 16 09:15:00 crc kubenswrapper[4823]: I1216 09:15:00.842546 4823 scope.go:117] "RemoveContainer" containerID="be73fd31aef3c2646ca8ea8d7a1185806913357be9eccd871881753164e9eaff" Dec 16 09:15:00 crc kubenswrapper[4823]: I1216 09:15:00.862203 4823 scope.go:117] "RemoveContainer" containerID="b67bbe0ab1f251a9ee41d54b3f9217494a588f2c6e81c715f6504ef1a69b0fb0" Dec 16 09:15:00 crc kubenswrapper[4823]: I1216 09:15:00.880581 4823 scope.go:117] "RemoveContainer" containerID="60f7cef83d7054fd046b15fe4e391677f350fc91768e0d0ca253e20b4eb08a06" Dec 16 09:15:00 crc kubenswrapper[4823]: I1216 09:15:00.904537 4823 scope.go:117] "RemoveContainer" containerID="e7c4b08b0b98afba7de5e397e6bee5b5f06c847966a35dc613331d6dafcf3b4f" Dec 16 09:15:00 crc kubenswrapper[4823]: I1216 09:15:00.979600 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431275-kjtrp"] Dec 16 09:15:01 crc kubenswrapper[4823]: I1216 09:15:01.851900 4823 generic.go:334] "Generic (PLEG): container finished" podID="81a453e6-62fe-44b7-bcf8-7cebf1128f04" containerID="2e223aca0f0f805497d54cae5141cc84917496def38f3054374922610060006a" exitCode=0 Dec 16 09:15:01 crc kubenswrapper[4823]: I1216 09:15:01.851945 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431275-kjtrp" 
event={"ID":"81a453e6-62fe-44b7-bcf8-7cebf1128f04","Type":"ContainerDied","Data":"2e223aca0f0f805497d54cae5141cc84917496def38f3054374922610060006a"} Dec 16 09:15:01 crc kubenswrapper[4823]: I1216 09:15:01.852202 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431275-kjtrp" event={"ID":"81a453e6-62fe-44b7-bcf8-7cebf1128f04","Type":"ContainerStarted","Data":"561af7b4c27407b7381b5d309f8b54d76782227b292f13aaaa20a940d3f0cb47"} Dec 16 09:15:03 crc kubenswrapper[4823]: I1216 09:15:03.241083 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431275-kjtrp" Dec 16 09:15:03 crc kubenswrapper[4823]: I1216 09:15:03.265189 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/81a453e6-62fe-44b7-bcf8-7cebf1128f04-secret-volume\") pod \"81a453e6-62fe-44b7-bcf8-7cebf1128f04\" (UID: \"81a453e6-62fe-44b7-bcf8-7cebf1128f04\") " Dec 16 09:15:03 crc kubenswrapper[4823]: I1216 09:15:03.265336 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/81a453e6-62fe-44b7-bcf8-7cebf1128f04-config-volume\") pod \"81a453e6-62fe-44b7-bcf8-7cebf1128f04\" (UID: \"81a453e6-62fe-44b7-bcf8-7cebf1128f04\") " Dec 16 09:15:03 crc kubenswrapper[4823]: I1216 09:15:03.265404 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vhv2j\" (UniqueName: \"kubernetes.io/projected/81a453e6-62fe-44b7-bcf8-7cebf1128f04-kube-api-access-vhv2j\") pod \"81a453e6-62fe-44b7-bcf8-7cebf1128f04\" (UID: \"81a453e6-62fe-44b7-bcf8-7cebf1128f04\") " Dec 16 09:15:03 crc kubenswrapper[4823]: I1216 09:15:03.270689 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81a453e6-62fe-44b7-bcf8-7cebf1128f04-config-volume" 
(OuterVolumeSpecName: "config-volume") pod "81a453e6-62fe-44b7-bcf8-7cebf1128f04" (UID: "81a453e6-62fe-44b7-bcf8-7cebf1128f04"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 09:15:03 crc kubenswrapper[4823]: I1216 09:15:03.275758 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81a453e6-62fe-44b7-bcf8-7cebf1128f04-kube-api-access-vhv2j" (OuterVolumeSpecName: "kube-api-access-vhv2j") pod "81a453e6-62fe-44b7-bcf8-7cebf1128f04" (UID: "81a453e6-62fe-44b7-bcf8-7cebf1128f04"). InnerVolumeSpecName "kube-api-access-vhv2j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:15:03 crc kubenswrapper[4823]: I1216 09:15:03.278208 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81a453e6-62fe-44b7-bcf8-7cebf1128f04-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "81a453e6-62fe-44b7-bcf8-7cebf1128f04" (UID: "81a453e6-62fe-44b7-bcf8-7cebf1128f04"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:15:03 crc kubenswrapper[4823]: I1216 09:15:03.367534 4823 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/81a453e6-62fe-44b7-bcf8-7cebf1128f04-config-volume\") on node \"crc\" DevicePath \"\"" Dec 16 09:15:03 crc kubenswrapper[4823]: I1216 09:15:03.367578 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vhv2j\" (UniqueName: \"kubernetes.io/projected/81a453e6-62fe-44b7-bcf8-7cebf1128f04-kube-api-access-vhv2j\") on node \"crc\" DevicePath \"\"" Dec 16 09:15:03 crc kubenswrapper[4823]: I1216 09:15:03.367594 4823 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/81a453e6-62fe-44b7-bcf8-7cebf1128f04-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 16 09:15:03 crc kubenswrapper[4823]: I1216 09:15:03.868906 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431275-kjtrp" event={"ID":"81a453e6-62fe-44b7-bcf8-7cebf1128f04","Type":"ContainerDied","Data":"561af7b4c27407b7381b5d309f8b54d76782227b292f13aaaa20a940d3f0cb47"} Dec 16 09:15:03 crc kubenswrapper[4823]: I1216 09:15:03.868946 4823 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="561af7b4c27407b7381b5d309f8b54d76782227b292f13aaaa20a940d3f0cb47" Dec 16 09:15:03 crc kubenswrapper[4823]: I1216 09:15:03.868963 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431275-kjtrp" Dec 16 09:15:04 crc kubenswrapper[4823]: I1216 09:15:04.325317 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431230-txxf6"] Dec 16 09:15:04 crc kubenswrapper[4823]: I1216 09:15:04.325383 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431230-txxf6"] Dec 16 09:15:05 crc kubenswrapper[4823]: I1216 09:15:05.782623 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49999519-6bb6-47bf-aa92-985e7d038b0c" path="/var/lib/kubelet/pods/49999519-6bb6-47bf-aa92-985e7d038b0c/volumes" Dec 16 09:15:06 crc kubenswrapper[4823]: I1216 09:15:06.671430 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-95949466-l2h76_e5260194-8fc8-4615-bfd5-98210220f074/manager/0.log" Dec 16 09:15:06 crc kubenswrapper[4823]: I1216 09:15:06.868955 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d0d3c6809aa44224aab298086e27935b9dda9ef79c520e38a6c6b641af9r57m_40af155f-2129-4d23-ab43-96419168bfc8/util/0.log" Dec 16 09:15:06 crc kubenswrapper[4823]: I1216 09:15:06.875548 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-5f98b4754f-l6tn9_e58b0dc3-aa85-4623-bc8b-d2e1dc73dca1/manager/0.log" Dec 16 09:15:07 crc kubenswrapper[4823]: I1216 09:15:07.051364 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d0d3c6809aa44224aab298086e27935b9dda9ef79c520e38a6c6b641af9r57m_40af155f-2129-4d23-ab43-96419168bfc8/util/0.log" Dec 16 09:15:07 crc kubenswrapper[4823]: I1216 09:15:07.052855 4823 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_d0d3c6809aa44224aab298086e27935b9dda9ef79c520e38a6c6b641af9r57m_40af155f-2129-4d23-ab43-96419168bfc8/pull/0.log" Dec 16 09:15:07 crc kubenswrapper[4823]: I1216 09:15:07.055809 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d0d3c6809aa44224aab298086e27935b9dda9ef79c520e38a6c6b641af9r57m_40af155f-2129-4d23-ab43-96419168bfc8/pull/0.log" Dec 16 09:15:07 crc kubenswrapper[4823]: I1216 09:15:07.254957 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d0d3c6809aa44224aab298086e27935b9dda9ef79c520e38a6c6b641af9r57m_40af155f-2129-4d23-ab43-96419168bfc8/util/0.log" Dec 16 09:15:07 crc kubenswrapper[4823]: I1216 09:15:07.281182 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d0d3c6809aa44224aab298086e27935b9dda9ef79c520e38a6c6b641af9r57m_40af155f-2129-4d23-ab43-96419168bfc8/pull/0.log" Dec 16 09:15:07 crc kubenswrapper[4823]: I1216 09:15:07.289108 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d0d3c6809aa44224aab298086e27935b9dda9ef79c520e38a6c6b641af9r57m_40af155f-2129-4d23-ab43-96419168bfc8/extract/0.log" Dec 16 09:15:07 crc kubenswrapper[4823]: I1216 09:15:07.687740 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-66f8b87655-fvsjg_37b11baa-1136-4fea-869d-e3d8f98bca83/manager/0.log" Dec 16 09:15:07 crc kubenswrapper[4823]: I1216 09:15:07.884283 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-767f9d7567-tg8ww_62d59368-9ca6-4327-a979-c4c31903630c/manager/0.log" Dec 16 09:15:07 crc kubenswrapper[4823]: I1216 09:15:07.943541 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-59b8dcb766-26qs6_ae33bf2e-0415-4ba8-9508-f7c36182aec8/manager/0.log" Dec 16 09:15:07 crc kubenswrapper[4823]: I1216 
09:15:07.965743 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6ccf486b9-8mh84_62b57d47-be40-449a-8503-b86187f19914/manager/0.log" Dec 16 09:15:08 crc kubenswrapper[4823]: I1216 09:15:08.194306 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-f458558d7-629tj_8a46fccc-0870-4aed-96db-064958d3f0c3/manager/0.log" Dec 16 09:15:08 crc kubenswrapper[4823]: I1216 09:15:08.615219 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-5c7cbf548f-ffflh_862263d5-cd38-4867-a8ce-6a82d3170f48/manager/0.log" Dec 16 09:15:08 crc kubenswrapper[4823]: I1216 09:15:08.615457 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-5fdd9786f7-57b67_e75f878b-9fb0-429b-8d6a-b30b98c1dba5/manager/0.log" Dec 16 09:15:08 crc kubenswrapper[4823]: I1216 09:15:08.691259 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-84b495f78-8rx8h_7f6072b1-7137-4564-9000-aa50b569ceac/manager/0.log" Dec 16 09:15:08 crc kubenswrapper[4823]: I1216 09:15:08.857482 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-f76f4954c-btpw8_9b601555-09ee-46de-8736-e28797436673/manager/0.log" Dec 16 09:15:08 crc kubenswrapper[4823]: I1216 09:15:08.902507 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-7cd87b778f-z7kmz_b3422264-49a2-4032-8906-b74358a9451d/manager/0.log" Dec 16 09:15:09 crc kubenswrapper[4823]: I1216 09:15:09.113468 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-68c649d9d-8n9sx_9de24328-0da5-4d0a-a34c-5cd820b35a23/manager/0.log" Dec 16 09:15:09 crc 
kubenswrapper[4823]: I1216 09:15:09.191321 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-5fbbf8b6cc-rkn7m_201bb612-805a-4516-b18e-41382e5e4c42/manager/0.log" Dec 16 09:15:09 crc kubenswrapper[4823]: I1216 09:15:09.279795 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-66fff4bf6b4m4mt_c69e81b5-99ea-4629-a61e-5d0e012bd472/manager/0.log" Dec 16 09:15:09 crc kubenswrapper[4823]: I1216 09:15:09.599332 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-69fc74c8bb-vqqz2_2f5d7edb-4be1-4e7b-bf43-3ceb6c77289d/operator/0.log" Dec 16 09:15:09 crc kubenswrapper[4823]: I1216 09:15:09.790956 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-5b7rt_bae4e226-369c-445b-96d3-267079b07732/registry-server/0.log" Dec 16 09:15:09 crc kubenswrapper[4823]: I1216 09:15:09.890629 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-bf6d4f946-7pmnl_73f9c317-3748-4e9d-a683-1d7fab3949b5/manager/0.log" Dec 16 09:15:10 crc kubenswrapper[4823]: I1216 09:15:10.173916 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-8665b56d78-47v54_f390c9c8-73bb-44e7-aa6f-4501691d8415/manager/0.log" Dec 16 09:15:10 crc kubenswrapper[4823]: I1216 09:15:10.232620 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-txmdd_3378ca15-f3fb-410e-a3fe-96b21dfce8d8/operator/0.log" Dec 16 09:15:10 crc kubenswrapper[4823]: I1216 09:15:10.457110 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5c6df8f9-f4294_3995bf1e-51ff-4543-ba13-cce941e6caab/manager/0.log" Dec 16 
09:15:10 crc kubenswrapper[4823]: I1216 09:15:10.743223 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-756ccf86c7-dr886_22746af9-8023-44ba-8377-e35e048923fe/manager/0.log" Dec 16 09:15:10 crc kubenswrapper[4823]: I1216 09:15:10.828645 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-97d456b9-wrzfd_f93451fb-312c-448f-a868-43c05626d74a/manager/0.log" Dec 16 09:15:10 crc kubenswrapper[4823]: I1216 09:15:10.904331 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-55f78b7c4c-sgp6f_ebfb35eb-0e15-454b-9f27-27c35373793b/manager/0.log" Dec 16 09:15:11 crc kubenswrapper[4823]: I1216 09:15:11.071641 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-678747d7fb-qbkkw_8fbf843f-253e-46b0-944e-7e4055e7ecdb/manager/0.log" Dec 16 09:15:28 crc kubenswrapper[4823]: I1216 09:15:28.133721 4823 patch_prober.go:28] interesting pod/machine-config-daemon-fv56f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 09:15:28 crc kubenswrapper[4823]: I1216 09:15:28.134305 4823 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 09:15:30 crc kubenswrapper[4823]: I1216 09:15:30.923389 4823 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-xkdq6_eff38aa7-d0b3-455b-b0ca-1034fc06a182/control-plane-machine-set-operator/0.log" Dec 16 09:15:31 crc kubenswrapper[4823]: I1216 09:15:31.101076 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-bh4xp_150075c3-d2eb-4c87-8b80-cd1d063e7d4c/kube-rbac-proxy/0.log" Dec 16 09:15:31 crc kubenswrapper[4823]: I1216 09:15:31.133862 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-bh4xp_150075c3-d2eb-4c87-8b80-cd1d063e7d4c/machine-api-operator/0.log" Dec 16 09:15:44 crc kubenswrapper[4823]: I1216 09:15:44.984679 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-86cb77c54b-knsxz_33843cca-2433-4a8e-8835-46959d61e521/cert-manager-controller/0.log" Dec 16 09:15:45 crc kubenswrapper[4823]: I1216 09:15:45.170221 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-855d9ccff4-slqx8_080d4cb1-f1c1-4fe3-ab3b-2c2e621d5a59/cert-manager-cainjector/0.log" Dec 16 09:15:45 crc kubenswrapper[4823]: I1216 09:15:45.193677 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-f4fb5df64-ss7qt_35cdac9f-df23-48a6-93e5-83ff4cca639e/cert-manager-webhook/0.log" Dec 16 09:15:58 crc kubenswrapper[4823]: I1216 09:15:58.133610 4823 patch_prober.go:28] interesting pod/machine-config-daemon-fv56f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 09:15:58 crc kubenswrapper[4823]: I1216 09:15:58.134092 4823 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 09:15:58 crc kubenswrapper[4823]: I1216 09:15:58.134138 4823 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" Dec 16 09:15:58 crc kubenswrapper[4823]: I1216 09:15:58.134768 4823 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2950c2235803f3753830afebab6d921b7124da1465f4af84f0944895c27c0722"} pod="openshift-machine-config-operator/machine-config-daemon-fv56f" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 16 09:15:58 crc kubenswrapper[4823]: I1216 09:15:58.134820 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" containerName="machine-config-daemon" containerID="cri-o://2950c2235803f3753830afebab6d921b7124da1465f4af84f0944895c27c0722" gracePeriod=600 Dec 16 09:15:58 crc kubenswrapper[4823]: I1216 09:15:58.211606 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-6ff7998486-shs4v_fa79bc1b-809d-4838-b3b1-be4a70b7fefa/nmstate-console-plugin/0.log" Dec 16 09:15:58 crc kubenswrapper[4823]: E1216 09:15:58.258210 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 09:15:58 crc kubenswrapper[4823]: I1216 
09:15:58.299509 4823 generic.go:334] "Generic (PLEG): container finished" podID="25dec47c-3043-486c-b371-2be103c214e3" containerID="2950c2235803f3753830afebab6d921b7124da1465f4af84f0944895c27c0722" exitCode=0 Dec 16 09:15:58 crc kubenswrapper[4823]: I1216 09:15:58.299575 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" event={"ID":"25dec47c-3043-486c-b371-2be103c214e3","Type":"ContainerDied","Data":"2950c2235803f3753830afebab6d921b7124da1465f4af84f0944895c27c0722"} Dec 16 09:15:58 crc kubenswrapper[4823]: I1216 09:15:58.299620 4823 scope.go:117] "RemoveContainer" containerID="f15e2940e276ab8f86caed82382b79b3b000ee860b1e320554c03e2678c0b4b5" Dec 16 09:15:58 crc kubenswrapper[4823]: I1216 09:15:58.300140 4823 scope.go:117] "RemoveContainer" containerID="2950c2235803f3753830afebab6d921b7124da1465f4af84f0944895c27c0722" Dec 16 09:15:58 crc kubenswrapper[4823]: E1216 09:15:58.300398 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 09:15:58 crc kubenswrapper[4823]: I1216 09:15:58.451514 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-kpqll_3ce6c2b4-7bea-47c9-bb3e-dd48f14696d9/nmstate-handler/0.log" Dec 16 09:15:58 crc kubenswrapper[4823]: I1216 09:15:58.499242 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f7f7578db-prt24_55511fba-6d01-4f25-af58-a1ea0e39bb95/kube-rbac-proxy/0.log" Dec 16 09:15:58 crc kubenswrapper[4823]: I1216 09:15:58.602884 4823 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f7f7578db-prt24_55511fba-6d01-4f25-af58-a1ea0e39bb95/nmstate-metrics/0.log" Dec 16 09:15:58 crc kubenswrapper[4823]: I1216 09:15:58.706237 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-6769fb99d-npfvt_0b896898-046a-48fa-bf48-cab19132c8e2/nmstate-operator/0.log" Dec 16 09:15:58 crc kubenswrapper[4823]: I1216 09:15:58.823496 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-f8fb84555-4krvw_f13f0b02-300a-4220-9342-e1cae01493b3/nmstate-webhook/0.log" Dec 16 09:16:01 crc kubenswrapper[4823]: I1216 09:16:01.015996 4823 scope.go:117] "RemoveContainer" containerID="5495bb782a9171cd86d5cb16369710c1a6d2b47a6f3e5031d1a6ab602dacd3d3" Dec 16 09:16:01 crc kubenswrapper[4823]: I1216 09:16:01.035339 4823 scope.go:117] "RemoveContainer" containerID="883456fb5995de0292ae9252c5a92710d3629466700f6e999e7eecdce8b31a78" Dec 16 09:16:11 crc kubenswrapper[4823]: I1216 09:16:11.780609 4823 scope.go:117] "RemoveContainer" containerID="2950c2235803f3753830afebab6d921b7124da1465f4af84f0944895c27c0722" Dec 16 09:16:11 crc kubenswrapper[4823]: E1216 09:16:11.781536 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 09:16:13 crc kubenswrapper[4823]: I1216 09:16:13.639872 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-5bddd4b946-cn66r_2620ac46-bf4c-4672-aee2-17d87685b2b9/kube-rbac-proxy/0.log" Dec 16 09:16:13 crc kubenswrapper[4823]: I1216 09:16:13.869163 4823 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-sdgmf_69276924-096b-4e93-9397-08095f966062/cp-frr-files/0.log" Dec 16 09:16:14 crc kubenswrapper[4823]: I1216 09:16:14.070319 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-sdgmf_69276924-096b-4e93-9397-08095f966062/cp-reloader/0.log" Dec 16 09:16:14 crc kubenswrapper[4823]: I1216 09:16:14.122879 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-5bddd4b946-cn66r_2620ac46-bf4c-4672-aee2-17d87685b2b9/controller/0.log" Dec 16 09:16:14 crc kubenswrapper[4823]: I1216 09:16:14.135226 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-sdgmf_69276924-096b-4e93-9397-08095f966062/cp-frr-files/0.log" Dec 16 09:16:14 crc kubenswrapper[4823]: I1216 09:16:14.157828 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-sdgmf_69276924-096b-4e93-9397-08095f966062/cp-metrics/0.log" Dec 16 09:16:14 crc kubenswrapper[4823]: I1216 09:16:14.292717 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-sdgmf_69276924-096b-4e93-9397-08095f966062/cp-reloader/0.log" Dec 16 09:16:14 crc kubenswrapper[4823]: I1216 09:16:14.443732 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-sdgmf_69276924-096b-4e93-9397-08095f966062/cp-frr-files/0.log" Dec 16 09:16:14 crc kubenswrapper[4823]: I1216 09:16:14.448985 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-sdgmf_69276924-096b-4e93-9397-08095f966062/cp-metrics/0.log" Dec 16 09:16:14 crc kubenswrapper[4823]: I1216 09:16:14.454978 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-sdgmf_69276924-096b-4e93-9397-08095f966062/cp-reloader/0.log" Dec 16 09:16:14 crc kubenswrapper[4823]: I1216 09:16:14.487209 4823 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-sdgmf_69276924-096b-4e93-9397-08095f966062/cp-metrics/0.log" Dec 16 09:16:14 crc kubenswrapper[4823]: I1216 09:16:14.640685 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-sdgmf_69276924-096b-4e93-9397-08095f966062/cp-frr-files/0.log" Dec 16 09:16:14 crc kubenswrapper[4823]: I1216 09:16:14.662155 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-sdgmf_69276924-096b-4e93-9397-08095f966062/cp-metrics/0.log" Dec 16 09:16:14 crc kubenswrapper[4823]: I1216 09:16:14.662271 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-sdgmf_69276924-096b-4e93-9397-08095f966062/cp-reloader/0.log" Dec 16 09:16:14 crc kubenswrapper[4823]: I1216 09:16:14.676042 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-sdgmf_69276924-096b-4e93-9397-08095f966062/controller/0.log" Dec 16 09:16:14 crc kubenswrapper[4823]: I1216 09:16:14.849383 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-sdgmf_69276924-096b-4e93-9397-08095f966062/kube-rbac-proxy/0.log" Dec 16 09:16:14 crc kubenswrapper[4823]: I1216 09:16:14.903553 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-sdgmf_69276924-096b-4e93-9397-08095f966062/kube-rbac-proxy-frr/0.log" Dec 16 09:16:14 crc kubenswrapper[4823]: I1216 09:16:14.920860 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-sdgmf_69276924-096b-4e93-9397-08095f966062/frr-metrics/0.log" Dec 16 09:16:15 crc kubenswrapper[4823]: I1216 09:16:15.070686 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-sdgmf_69276924-096b-4e93-9397-08095f966062/reloader/0.log" Dec 16 09:16:15 crc kubenswrapper[4823]: I1216 09:16:15.115819 4823 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7784b6fcf-jm7hm_6f9ee161-6a5b-47a4-b15e-d3f9d1d7a068/frr-k8s-webhook-server/0.log" Dec 16 09:16:15 crc kubenswrapper[4823]: I1216 09:16:15.350959 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-5d4c58b6db-plllb_0c6a98e7-03f3-402e-ae6e-18c7b2a09ead/manager/0.log" Dec 16 09:16:15 crc kubenswrapper[4823]: I1216 09:16:15.491984 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-86d44cc785-ftsr4_7e3777ec-c803-4417-8381-86fb3ad02265/webhook-server/0.log" Dec 16 09:16:15 crc kubenswrapper[4823]: I1216 09:16:15.559577 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-dftnv_1b2d484a-d9e3-4272-a080-a0439423997a/kube-rbac-proxy/0.log" Dec 16 09:16:16 crc kubenswrapper[4823]: I1216 09:16:16.507508 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-dftnv_1b2d484a-d9e3-4272-a080-a0439423997a/speaker/0.log" Dec 16 09:16:18 crc kubenswrapper[4823]: I1216 09:16:18.281608 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-sdgmf_69276924-096b-4e93-9397-08095f966062/frr/0.log" Dec 16 09:16:24 crc kubenswrapper[4823]: I1216 09:16:24.772115 4823 scope.go:117] "RemoveContainer" containerID="2950c2235803f3753830afebab6d921b7124da1465f4af84f0944895c27c0722" Dec 16 09:16:24 crc kubenswrapper[4823]: E1216 09:16:24.772926 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 09:16:30 crc kubenswrapper[4823]: I1216 
09:16:30.203522 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a9dmv6_168f9fd9-a3ba-4664-8d70-2b46c6c66071/util/0.log" Dec 16 09:16:30 crc kubenswrapper[4823]: I1216 09:16:30.348556 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a9dmv6_168f9fd9-a3ba-4664-8d70-2b46c6c66071/util/0.log" Dec 16 09:16:30 crc kubenswrapper[4823]: I1216 09:16:30.409518 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a9dmv6_168f9fd9-a3ba-4664-8d70-2b46c6c66071/pull/0.log" Dec 16 09:16:30 crc kubenswrapper[4823]: I1216 09:16:30.412647 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a9dmv6_168f9fd9-a3ba-4664-8d70-2b46c6c66071/pull/0.log" Dec 16 09:16:30 crc kubenswrapper[4823]: I1216 09:16:30.559698 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a9dmv6_168f9fd9-a3ba-4664-8d70-2b46c6c66071/pull/0.log" Dec 16 09:16:30 crc kubenswrapper[4823]: I1216 09:16:30.561846 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a9dmv6_168f9fd9-a3ba-4664-8d70-2b46c6c66071/util/0.log" Dec 16 09:16:30 crc kubenswrapper[4823]: I1216 09:16:30.605596 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a9dmv6_168f9fd9-a3ba-4664-8d70-2b46c6c66071/extract/0.log" Dec 16 09:16:30 crc kubenswrapper[4823]: I1216 09:16:30.745009 4823 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4xv55w_3dd04c34-7f81-4773-a28b-0660e24aeb5d/util/0.log" Dec 16 09:16:30 crc kubenswrapper[4823]: I1216 09:16:30.908842 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4xv55w_3dd04c34-7f81-4773-a28b-0660e24aeb5d/pull/0.log" Dec 16 09:16:30 crc kubenswrapper[4823]: I1216 09:16:30.920635 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4xv55w_3dd04c34-7f81-4773-a28b-0660e24aeb5d/util/0.log" Dec 16 09:16:30 crc kubenswrapper[4823]: I1216 09:16:30.958304 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4xv55w_3dd04c34-7f81-4773-a28b-0660e24aeb5d/pull/0.log" Dec 16 09:16:31 crc kubenswrapper[4823]: I1216 09:16:31.111820 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4xv55w_3dd04c34-7f81-4773-a28b-0660e24aeb5d/util/0.log" Dec 16 09:16:31 crc kubenswrapper[4823]: I1216 09:16:31.130740 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4xv55w_3dd04c34-7f81-4773-a28b-0660e24aeb5d/extract/0.log" Dec 16 09:16:31 crc kubenswrapper[4823]: I1216 09:16:31.148736 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4xv55w_3dd04c34-7f81-4773-a28b-0660e24aeb5d/pull/0.log" Dec 16 09:16:31 crc kubenswrapper[4823]: I1216 09:16:31.285090 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92102khnm_66ba22e0-25ea-4ff8-8114-642abebbca90/util/0.log" Dec 16 
09:16:31 crc kubenswrapper[4823]: I1216 09:16:31.445469 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92102khnm_66ba22e0-25ea-4ff8-8114-642abebbca90/pull/0.log" Dec 16 09:16:31 crc kubenswrapper[4823]: I1216 09:16:31.454414 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92102khnm_66ba22e0-25ea-4ff8-8114-642abebbca90/pull/0.log" Dec 16 09:16:31 crc kubenswrapper[4823]: I1216 09:16:31.455243 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92102khnm_66ba22e0-25ea-4ff8-8114-642abebbca90/util/0.log" Dec 16 09:16:31 crc kubenswrapper[4823]: I1216 09:16:31.648617 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92102khnm_66ba22e0-25ea-4ff8-8114-642abebbca90/pull/0.log" Dec 16 09:16:31 crc kubenswrapper[4823]: I1216 09:16:31.668675 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92102khnm_66ba22e0-25ea-4ff8-8114-642abebbca90/extract/0.log" Dec 16 09:16:31 crc kubenswrapper[4823]: I1216 09:16:31.671311 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92102khnm_66ba22e0-25ea-4ff8-8114-642abebbca90/util/0.log" Dec 16 09:16:31 crc kubenswrapper[4823]: I1216 09:16:31.815645 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8czvjc_2a1962cd-dfaf-404b-8feb-44ee984181a7/util/0.log" Dec 16 09:16:31 crc kubenswrapper[4823]: I1216 09:16:31.968328 4823 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8czvjc_2a1962cd-dfaf-404b-8feb-44ee984181a7/util/0.log" Dec 16 09:16:31 crc kubenswrapper[4823]: I1216 09:16:31.989930 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8czvjc_2a1962cd-dfaf-404b-8feb-44ee984181a7/pull/0.log" Dec 16 09:16:32 crc kubenswrapper[4823]: I1216 09:16:32.016312 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8czvjc_2a1962cd-dfaf-404b-8feb-44ee984181a7/pull/0.log" Dec 16 09:16:32 crc kubenswrapper[4823]: I1216 09:16:32.168917 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8czvjc_2a1962cd-dfaf-404b-8feb-44ee984181a7/util/0.log" Dec 16 09:16:32 crc kubenswrapper[4823]: I1216 09:16:32.178699 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8czvjc_2a1962cd-dfaf-404b-8feb-44ee984181a7/extract/0.log" Dec 16 09:16:32 crc kubenswrapper[4823]: I1216 09:16:32.190960 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8czvjc_2a1962cd-dfaf-404b-8feb-44ee984181a7/pull/0.log" Dec 16 09:16:32 crc kubenswrapper[4823]: I1216 09:16:32.338548 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-46558_2582ab05-12e8-48c6-ac08-2673b110e34f/extract-utilities/0.log" Dec 16 09:16:32 crc kubenswrapper[4823]: I1216 09:16:32.480315 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-46558_2582ab05-12e8-48c6-ac08-2673b110e34f/extract-utilities/0.log" Dec 16 09:16:32 crc kubenswrapper[4823]: I1216 
09:16:32.490208 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-46558_2582ab05-12e8-48c6-ac08-2673b110e34f/extract-content/0.log" Dec 16 09:16:32 crc kubenswrapper[4823]: I1216 09:16:32.510862 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-46558_2582ab05-12e8-48c6-ac08-2673b110e34f/extract-content/0.log" Dec 16 09:16:32 crc kubenswrapper[4823]: I1216 09:16:32.664249 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-46558_2582ab05-12e8-48c6-ac08-2673b110e34f/extract-content/0.log" Dec 16 09:16:32 crc kubenswrapper[4823]: I1216 09:16:32.673763 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-46558_2582ab05-12e8-48c6-ac08-2673b110e34f/extract-utilities/0.log" Dec 16 09:16:32 crc kubenswrapper[4823]: I1216 09:16:32.871767 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fb624_639299af-31e8-4fbc-9b06-5b45178ab1e1/extract-utilities/0.log" Dec 16 09:16:33 crc kubenswrapper[4823]: I1216 09:16:33.120491 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fb624_639299af-31e8-4fbc-9b06-5b45178ab1e1/extract-utilities/0.log" Dec 16 09:16:33 crc kubenswrapper[4823]: I1216 09:16:33.138183 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fb624_639299af-31e8-4fbc-9b06-5b45178ab1e1/extract-content/0.log" Dec 16 09:16:33 crc kubenswrapper[4823]: I1216 09:16:33.153530 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fb624_639299af-31e8-4fbc-9b06-5b45178ab1e1/extract-content/0.log" Dec 16 09:16:33 crc kubenswrapper[4823]: I1216 09:16:33.468850 4823 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-fb624_639299af-31e8-4fbc-9b06-5b45178ab1e1/extract-utilities/0.log" Dec 16 09:16:33 crc kubenswrapper[4823]: I1216 09:16:33.469156 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fb624_639299af-31e8-4fbc-9b06-5b45178ab1e1/extract-content/0.log" Dec 16 09:16:33 crc kubenswrapper[4823]: I1216 09:16:33.649938 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-j5hk4_105c5da3-e305-4f41-968c-19466291e660/marketplace-operator/0.log" Dec 16 09:16:33 crc kubenswrapper[4823]: I1216 09:16:33.810774 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-c8h86_d1a83968-f624-48a1-a47a-3ad405b3b53c/extract-utilities/0.log" Dec 16 09:16:33 crc kubenswrapper[4823]: I1216 09:16:33.864460 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-46558_2582ab05-12e8-48c6-ac08-2673b110e34f/registry-server/0.log" Dec 16 09:16:34 crc kubenswrapper[4823]: I1216 09:16:34.066791 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-c8h86_d1a83968-f624-48a1-a47a-3ad405b3b53c/extract-content/0.log" Dec 16 09:16:34 crc kubenswrapper[4823]: I1216 09:16:34.086503 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-c8h86_d1a83968-f624-48a1-a47a-3ad405b3b53c/extract-utilities/0.log" Dec 16 09:16:34 crc kubenswrapper[4823]: I1216 09:16:34.109491 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-c8h86_d1a83968-f624-48a1-a47a-3ad405b3b53c/extract-content/0.log" Dec 16 09:16:34 crc kubenswrapper[4823]: I1216 09:16:34.227823 4823 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-c8h86_d1a83968-f624-48a1-a47a-3ad405b3b53c/extract-utilities/0.log" Dec 16 09:16:34 crc kubenswrapper[4823]: I1216 09:16:34.272414 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-c8h86_d1a83968-f624-48a1-a47a-3ad405b3b53c/extract-content/0.log" Dec 16 09:16:34 crc kubenswrapper[4823]: I1216 09:16:34.417352 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fb624_639299af-31e8-4fbc-9b06-5b45178ab1e1/registry-server/0.log" Dec 16 09:16:34 crc kubenswrapper[4823]: I1216 09:16:34.432552 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-wnh5n_7615bc49-d6d1-4933-8ef3-f7d871a8d4b8/extract-utilities/0.log" Dec 16 09:16:34 crc kubenswrapper[4823]: I1216 09:16:34.637690 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-c8h86_d1a83968-f624-48a1-a47a-3ad405b3b53c/registry-server/0.log" Dec 16 09:16:34 crc kubenswrapper[4823]: I1216 09:16:34.677170 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-wnh5n_7615bc49-d6d1-4933-8ef3-f7d871a8d4b8/extract-utilities/0.log" Dec 16 09:16:34 crc kubenswrapper[4823]: I1216 09:16:34.678391 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-wnh5n_7615bc49-d6d1-4933-8ef3-f7d871a8d4b8/extract-content/0.log" Dec 16 09:16:34 crc kubenswrapper[4823]: I1216 09:16:34.683862 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-wnh5n_7615bc49-d6d1-4933-8ef3-f7d871a8d4b8/extract-content/0.log" Dec 16 09:16:34 crc kubenswrapper[4823]: I1216 09:16:34.857049 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-wnh5n_7615bc49-d6d1-4933-8ef3-f7d871a8d4b8/extract-utilities/0.log" 
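The repeated "SyncLoop (probe) ... unhealthy" and CrashLoopBackOff entries above are the kubelet reacting to a failed HTTP liveness probe on the machine-config-daemon container. A hedged sketch of the kind of probe declaration that produces this behavior; only the target (port 8798, path /health, host 127.0.0.1, per the probe output in the log) comes from the log itself, while the timing fields are assumptions, not the real manifest:

```yaml
# Sketch only: the httpGet target is taken from the log line
#   Get "http://127.0.0.1:8798/health": dial tcp 127.0.0.1:8798: connect: connection refused
# periodSeconds/failureThreshold are placeholder assumptions.
livenessProbe:
  httpGet:
    host: 127.0.0.1    # host-network daemon; probe dials localhost
    path: /health
    port: 8798
  periodSeconds: 10      # assumed
  failureThreshold: 3    # assumed
```

On failure the kubelet kills the container (here with gracePeriod=600) and restarts it under exponential backoff; once the backoff reaches its 5m cap, every sync attempt logs the "back-off 5m0s restarting failed container" error seen throughout this section.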
Dec 16 09:16:34 crc kubenswrapper[4823]: I1216 09:16:34.871924 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-wnh5n_7615bc49-d6d1-4933-8ef3-f7d871a8d4b8/extract-content/0.log"
Dec 16 09:16:35 crc kubenswrapper[4823]: I1216 09:16:35.821130 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-wnh5n_7615bc49-d6d1-4933-8ef3-f7d871a8d4b8/registry-server/0.log"
Dec 16 09:16:38 crc kubenswrapper[4823]: I1216 09:16:38.772353 4823 scope.go:117] "RemoveContainer" containerID="2950c2235803f3753830afebab6d921b7124da1465f4af84f0944895c27c0722"
Dec 16 09:16:38 crc kubenswrapper[4823]: E1216 09:16:38.772906 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3"
Dec 16 09:16:44 crc kubenswrapper[4823]: I1216 09:16:44.628640 4823 kuberuntime_container.go:700] "PreStop hook not completed in grace period" pod="openstack/ovsdbserver-sb-1" podUID="6c99b5e4-de24-426d-9a97-05fdcbe37141" containerName="ovsdbserver-sb" containerID="cri-o://d06360657e9e0d4e61ed0bba6b0ba1b231c8900c0ee15d84fd10f3823299aa4b" gracePeriod=300
Dec 16 09:16:44 crc kubenswrapper[4823]: I1216 09:16:44.629080 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-1" podUID="6c99b5e4-de24-426d-9a97-05fdcbe37141" containerName="ovsdbserver-sb" containerID="cri-o://d06360657e9e0d4e61ed0bba6b0ba1b231c8900c0ee15d84fd10f3823299aa4b" gracePeriod=2
Dec 16 09:16:44 crc kubenswrapper[4823]: E1216 09:16:44.637893 4823 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=<
Dec 16 09:16:44
crc kubenswrapper[4823]: command '/usr/local/bin/container-scripts/cleanup.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/cleanup.sh Dec 16 09:16:44 crc kubenswrapper[4823]: + source /usr/local/bin/container-scripts/functions Dec 16 09:16:44 crc kubenswrapper[4823]: ++ DB_TYPE=sb Dec 16 09:16:44 crc kubenswrapper[4823]: ++ DB_FILE=/etc/ovn/ovnsb_db.db Dec 16 09:16:44 crc kubenswrapper[4823]: + DB_NAME=OVN_Northbound Dec 16 09:16:44 crc kubenswrapper[4823]: + [[ sb == \s\b ]] Dec 16 09:16:44 crc kubenswrapper[4823]: + DB_NAME=OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ hostname Dec 16 09:16:44 crc kubenswrapper[4823]: + [[ ovsdbserver-sb-1 != \o\v\s\d\b\s\e\r\v\e\r\-\s\b\-\0 ]] Dec 16 09:16:44 crc kubenswrapper[4823]: + ovs-appctl -t /tmp/ovnsb_db.ctl cluster/leave OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 
09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc 
kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving 
-o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 
09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft
cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc 
kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t 
/tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ 
grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 
crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z 
leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 
Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ 
ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc 
kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print 
$2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc 
kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc 
kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 
crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 
16 09:16:44 crc kubenswrapper[4823]: ++ grep Status:
Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}'
Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving
Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']'
Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1
Dec 16 09:16:44 crc kubenswrapper[4823]: + true
Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound
[the seven-entry trace cycle above repeats verbatim many more times, all timestamped Dec 16 09:16:44; between iterations only the interleaving order of the three pipeline stages (ovs-appctl / grep Status: / awk) varies, and STATUS remains "leaving" throughout]
Dec 16 09:16:44
crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: ++ 
ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: 
++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 
09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 
09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 
16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status 
OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc 
kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + 
STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft 
cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc 
kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t 
/tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ 
grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 
crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: > execCommand=["/usr/local/bin/container-scripts/cleanup.sh"] containerName="ovsdbserver-sb" pod="openstack/ovsdbserver-sb-1" message=< Dec 16 09:16:44 crc kubenswrapper[4823]: ++ dirname /usr/local/bin/container-scripts/cleanup.sh Dec 16 09:16:44 crc kubenswrapper[4823]: + source /usr/local/bin/container-scripts/functions Dec 16 09:16:44 crc kubenswrapper[4823]: ++ DB_TYPE=sb Dec 16 09:16:44 crc kubenswrapper[4823]: ++ DB_FILE=/etc/ovn/ovnsb_db.db Dec 16 09:16:44 crc kubenswrapper[4823]: + DB_NAME=OVN_Northbound Dec 16 09:16:44 crc kubenswrapper[4823]: + [[ sb == \s\b ]] Dec 16 09:16:44 crc kubenswrapper[4823]: + DB_NAME=OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ hostname Dec 16 09:16:44 crc kubenswrapper[4823]: + [[ ovsdbserver-sb-1 != \o\v\s\d\b\s\e\r\v\e\r\-\s\b\-\0 ]] Dec 16 09:16:44 crc kubenswrapper[4823]: + ovs-appctl -t /tmp/ovnsb_db.ctl cluster/leave OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 
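The trace above shows the loop body of /usr/local/bin/container-scripts/cleanup.sh: issue `cluster/leave`, then poll `cluster/status` once per second until the member has left. A minimal runnable sketch of that parsing and exit logic follows; the function names (`cluster_status`, `parse_status`, `has_left`) and the sample status text are illustrative assumptions, with a stub standing in for the real `ovs-appctl` call.

```shell
# Sketch of the polling loop traced in the log. Assumption: cluster_status
# is a stub standing in for
#   ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound
# so the parsing and loop-exit logic can run anywhere.
cluster_status() {
    printf 'Cluster ID: abcd\nStatus: %s\n' "$1"
}

# The traced pipeline: keep the "Status:" line, print its second field.
parse_status() {
    cluster_status "$1" | grep Status: | awk '{print $2}'
}

# The traced exit test: stop polling once STATUS is empty or "left cluster".
# Note that awk's '{print $2}' captures only "left" from a
# "Status: left cluster" line, so the 'xleft cluster' comparison in the
# trace appears unable to match as written.
has_left() {
    STATUS=$(parse_status "$1")
    [ -z "$STATUS" ] || [ "x$STATUS" = "xleft cluster" ]
}

if has_left leaving; then
    echo "left cluster"
else
    echo "still leaving; script would sleep 1 and poll again"
fi
```

Under these assumptions the check never succeeds while the status line reads `Status: leaving`, which is consistent with the loop repeating every second in the trace.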
09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving 
-o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 
09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl 
-t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ 
grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 
crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z 
leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 
Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ 
grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc 
kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print 
$2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc 
kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc 
kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 
crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 
16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: 
++ awk -e '{print $2}'
Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving
Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']'
Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1
Dec 16 09:16:44 crc kubenswrapper[4823]: + true
Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound
Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status:
Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}'
Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving
Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']'
Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1
Dec 16 09:16:44 crc kubenswrapper[4823]: + true
Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound
Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status:
Dec 16 09:16:44 crc kubenswrapper[4823]:
++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 
09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 
09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 
16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status 
OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 
crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + 
STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft 
cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc 
kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t 
/tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ 
grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 
crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z 
leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 
Dec 16 09:16:44 crc kubenswrapper[4823]: + true
Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound
Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status:
Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}'
Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving
Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']'
Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1
crc kubenswrapper[4823]: >
Dec 16 09:16:44 crc kubenswrapper[4823]: E1216 09:16:44.638232 4823 kuberuntime_container.go:691] "PreStop hook failed" err=<
Dec 16 09:16:44 crc kubenswrapper[4823]: command '/usr/local/bin/container-scripts/cleanup.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/cleanup.sh
Dec 16 09:16:44 crc kubenswrapper[4823]: + source /usr/local/bin/container-scripts/functions
Dec 16 09:16:44 crc kubenswrapper[4823]: ++ DB_TYPE=sb
Dec 16 09:16:44 crc kubenswrapper[4823]: ++ DB_FILE=/etc/ovn/ovnsb_db.db
Dec 16 09:16:44 crc kubenswrapper[4823]: + DB_NAME=OVN_Northbound
Dec 16 09:16:44 crc kubenswrapper[4823]: + [[ sb == \s\b ]]
Dec 16 09:16:44 crc kubenswrapper[4823]: + DB_NAME=OVN_Southbound
Dec 16 09:16:44 crc kubenswrapper[4823]: ++ hostname
Dec 16 09:16:44 crc kubenswrapper[4823]: + [[ ovsdbserver-sb-1 != \o\v\s\d\b\s\e\r\v\e\r\-\s\b\-\0 ]]
Dec 16 09:16:44 crc kubenswrapper[4823]: + ovs-appctl -t /tmp/ovnsb_db.ctl cluster/leave OVN_Southbound
Dec 16 09:16:44 crc kubenswrapper[4823]: + true
Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}'
Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound
Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status:
Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving
Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']'
Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1
Dec 16 09:16:44 crc kubenswrapper[4823]: + true
Dec 16
++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 
09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 
09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 
16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status 
OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc 
kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + 
STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft 
cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc 
kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t 
/tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ 
grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 
crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z 
leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 
Dec 16 09:16:44 crc kubenswrapper[4823]: + true
Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound
Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status:
Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}'
Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving
Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']'
Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1
[the poll iteration above repeats throughout 09:16:44 with STATUS=leaving every time; only the relative order of the grep/awk/ovs-appctl trace lines varies between repeats]
Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving
-o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 
09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl 
-t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ 
awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 
crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z 
leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 
Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ 
ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc 
kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print 
$2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc 
kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc 
kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 
crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 
16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: 
++ awk -e '{print $2}'
Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving
Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']'
Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1
Dec 16 09:16:44 crc kubenswrapper[4823]: + true
Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound
Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status:
Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}'
Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving
Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']'
Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1
Dec 16 09:16:44 crc kubenswrapper[4823]: > pod="openstack/ovsdbserver-sb-1" podUID="6c99b5e4-de24-426d-9a97-05fdcbe37141" containerName="ovsdbserver-sb" containerID="cri-o://d06360657e9e0d4e61ed0bba6b0ba1b231c8900c0ee15d84fd10f3823299aa4b"
Dec 16 09:16:44 crc kubenswrapper[4823]: I1216 09:16:44.711512 4823 kuberuntime_container.go:700] "PreStop hook not completed in grace period" pod="openstack/ovsdbserver-sb-2" podUID="dc75b889-6dc5-462d-a589-50f705ffd78f" containerName="ovsdbserver-sb" containerID="cri-o://b83530bf76668f8bf22e0d206703a2c7ec87906449e3b0e5fc0196514eab70ab" gracePeriod=300
Dec 16 09:16:44 crc kubenswrapper[4823]: I1216 09:16:44.711566 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-2" podUID="dc75b889-6dc5-462d-a589-50f705ffd78f" containerName="ovsdbserver-sb" containerID="cri-o://b83530bf76668f8bf22e0d206703a2c7ec87906449e3b0e5fc0196514eab70ab" gracePeriod=2
Dec 16 09:16:44 crc kubenswrapper[4823]: E1216 09:16:44.726078 4823 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=<
Dec 16 09:16:44 crc kubenswrapper[4823]: command '/usr/local/bin/container-scripts/cleanup.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/cleanup.sh
Dec 16 09:16:44 crc kubenswrapper[4823]: + source /usr/local/bin/container-scripts/functions
Dec 16 09:16:44 crc kubenswrapper[4823]: ++ DB_TYPE=sb
Dec 16 09:16:44 crc kubenswrapper[4823]: ++ DB_FILE=/etc/ovn/ovnsb_db.db
Dec 16 09:16:44 crc kubenswrapper[4823]: + DB_NAME=OVN_Northbound
Dec 16 09:16:44 crc kubenswrapper[4823]: + [[ sb == \s\b ]]
Dec 16 09:16:44 crc kubenswrapper[4823]: + DB_NAME=OVN_Southbound
Dec 16 09:16:44 crc kubenswrapper[4823]: ++ hostname
Dec 16 09:16:44 crc kubenswrapper[4823]: + [[ ovsdbserver-sb-2 != \o\v\s\d\b\s\e\r\v\e\r\-\s\b\-\0 ]]
Dec 16 09:16:44 crc kubenswrapper[4823]: + ovs-appctl -t /tmp/ovnsb_db.ctl cluster/leave OVN_Southbound
Dec 16 09:16:44 crc kubenswrapper[4823]: + true
Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound
Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status:
Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}'
Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving
Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']'
Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1
$2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc 
kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc 
kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 
crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 
16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: 
++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 
09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 
09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 
16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status 
OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc 
kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + 
STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft 
cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc 
kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t 
/tmp/ovnsb_db.ctl cluster/status OVN_Southbound
Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status:
Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}'
Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving
Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']'
Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1
Dec 16 09:16:44 crc kubenswrapper[4823]: + true
[the seven-line status-check trace above repeats verbatim for the remaining 09:16:44 entries while STATUS stays "leaving"; duplicate iterations elided]
Dec
16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status 
OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc 
kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + 
STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft 
cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc 
kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t 
/tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ 
grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 
crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z 
leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 
Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ 
ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc 
kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print 
$2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc 
kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc 
kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: [... identical status-poll iterations (STATUS=leaving) repeated ...] Dec 16 09:16:44 crc kubenswrapper[4823]: > execCommand=["/usr/local/bin/container-scripts/cleanup.sh"] containerName="ovsdbserver-sb" pod="openstack/ovsdbserver-sb-2" message=< Dec 16 09:16:44 crc kubenswrapper[4823]: ++ dirname /usr/local/bin/container-scripts/cleanup.sh Dec 16 09:16:44 crc kubenswrapper[4823]: + source /usr/local/bin/container-scripts/functions Dec 16 09:16:44 crc kubenswrapper[4823]: ++ DB_TYPE=sb Dec 16 09:16:44 crc kubenswrapper[4823]: ++ DB_FILE=/etc/ovn/ovnsb_db.db Dec 16 09:16:44 crc kubenswrapper[4823]: + DB_NAME=OVN_Northbound Dec 16 09:16:44 crc kubenswrapper[4823]: + [[ sb == \s\b ]] Dec
16 09:16:44 crc kubenswrapper[4823]: + DB_NAME=OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ hostname Dec 16 09:16:44 crc kubenswrapper[4823]: + [[ ovsdbserver-sb-2 != \o\v\s\d\b\s\e\r\v\e\r\-\s\b\-\0 ]] Dec 16 09:16:44 crc kubenswrapper[4823]: + ovs-appctl -t /tmp/ovnsb_db.ctl cluster/leave OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: [... identical status-poll iterations (STATUS=leaving) repeated; log truncated ...]
$2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc 
kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc 
kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 
crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl 
-t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ 
awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 
crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc 
kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 
crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 
16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: 
++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 
09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 
09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 
16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status 
OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 
16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status 
OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc 
kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + 
STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft 
cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc 
kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t 
/tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ 
grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 
crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z 
leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 
Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ 
ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc 
kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep 
Status: Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc 
kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc 
kubenswrapper[4823]: + sleep 1
Dec 16 09:16:44 crc kubenswrapper[4823]: + true
Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound
Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status:
Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}'
Dec 16 09:16:44 crc
kubenswrapper[4823]: + STATUS=leaving
Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']'
Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1
Dec 16 09:16:44 crc kubenswrapper[4823]: >
Dec 16 09:16:44 crc kubenswrapper[4823]: E1216 09:16:44.726590 4823 kuberuntime_container.go:691] "PreStop hook failed" err=<
Dec 16 09:16:44 crc kubenswrapper[4823]: command '/usr/local/bin/container-scripts/cleanup.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/cleanup.sh
Dec 16 09:16:44 crc kubenswrapper[4823]: + source /usr/local/bin/container-scripts/functions
Dec 16 09:16:44 crc kubenswrapper[4823]: ++ DB_TYPE=sb
Dec 16 09:16:44 crc kubenswrapper[4823]: ++ DB_FILE=/etc/ovn/ovnsb_db.db
Dec 16 09:16:44 crc kubenswrapper[4823]: + DB_NAME=OVN_Northbound
Dec 16 09:16:44 crc kubenswrapper[4823]: + [[ sb == \s\b ]]
Dec 16 09:16:44 crc kubenswrapper[4823]: + DB_NAME=OVN_Southbound
Dec 16 09:16:44 crc kubenswrapper[4823]: ++ hostname
Dec 16 09:16:44 crc kubenswrapper[4823]: + [[ ovsdbserver-sb-2 != \o\v\s\d\b\s\e\r\v\e\r\-\s\b\-\0 ]]
Dec 16 09:16:44 crc kubenswrapper[4823]: + ovs-appctl -t /tmp/ovnsb_db.ctl cluster/leave OVN_Southbound
Dec 16 09:16:44 crc kubenswrapper[4823]: + true
Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound
Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status:
Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}'
Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving
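Pieced together from the xtrace lines in the error above, the PreStop hook's wait loop appears to work roughly as the following sketch; this is a hypothetical reconstruction, not the actual container-scripts/cleanup.sh, and the function name wait_for_left and its appctl override parameter are illustrative additions:

```shell
#!/bin/sh
# Values mirrored from the trace above.
CTL_SOCKET=/tmp/ovnsb_db.ctl
DB_NAME=OVN_Southbound

# Poll cluster/status once per second until this member reports it has
# left the Raft cluster. $1 optionally overrides the appctl command
# (added here so the sketch can be exercised without a live ovsdb-server).
wait_for_left() {
    appctl=${1:-ovs-appctl}
    while true; do
        STATUS=$("$appctl" -t "$CTL_SOCKET" cluster/status "$DB_NAME" |
                 grep Status: | awk '{print $2}')
        # NOTE: awk '{print $2}' keeps only the second field, so a
        # "Status: left cluster" line would yield STATUS=left, which can
        # never equal 'left cluster' below -- consistent with the hook in
        # the log spinning on STATUS=leaving until killed (exit 137).
        [ -z "$STATUS" -o "x$STATUS" = "xleft cluster" ] && return 0
        sleep 1
    done
}
```

Because the comparison can only succeed when the status line disappears entirely (the -z branch), the loop in the log runs until the kubelet's terminationGracePeriod expires and the container is SIGKILLed, which is what the "exited with 137" in the PreStop error indicates.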
Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']'
Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1
Dec 16 09:16:44 crc kubenswrapper[4823]: + true
Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound
Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}'
Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status:
Dec 16 09:16:44
crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z 
leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 
Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ 
ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc 
kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print 
$2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc 
kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc 
kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 
crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl 
-t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ 
awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 
crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc 
kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 
crc kubenswrapper[4823]: ++ grep Status:
Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound
Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}'
Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving
Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']'
Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1
Dec 16 09:16:44 crc kubenswrapper[4823]: + true
Dec 16 09:16:44
crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 
16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: 
++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 
09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 
09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 
16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status 
OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc 
kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + 
STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft 
cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc 
kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t 
/tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ 
grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 
crc kubenswrapper[4823]: + STATUS=leaving
Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']'
Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1
Dec 16 09:16:44 crc kubenswrapper[4823]: + true
Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound
Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status:
Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}'
Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving
Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']'
Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1
Dec 16 09:16:44 crc kubenswrapper[4823]: >
pod="openstack/ovsdbserver-sb-2" podUID="dc75b889-6dc5-462d-a589-50f705ffd78f" containerName="ovsdbserver-sb" containerID="cri-o://b83530bf76668f8bf22e0d206703a2c7ec87906449e3b0e5fc0196514eab70ab"
Dec 16 09:16:44 crc kubenswrapper[4823]: I1216 09:16:44.817709 4823 kuberuntime_container.go:700] "PreStop hook not completed in grace period" pod="openstack/ovsdbserver-nb-1" podUID="7e1d3682-8130-4fa4-aab4-ade2ac069d2e" containerName="ovsdbserver-nb" containerID="cri-o://2430c406e51c479baf9de2cdddd7f35d4cb9b052428eb9894c3134ee623601b5" gracePeriod=300
Dec 16 09:16:44 crc kubenswrapper[4823]: I1216 09:16:44.817825 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-1" podUID="7e1d3682-8130-4fa4-aab4-ade2ac069d2e" containerName="ovsdbserver-nb" containerID="cri-o://2430c406e51c479baf9de2cdddd7f35d4cb9b052428eb9894c3134ee623601b5" gracePeriod=2
Dec 16 09:16:44 crc kubenswrapper[4823]: E1216 09:16:44.828947 4823 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=<
Dec 16 09:16:44 crc kubenswrapper[4823]: command '/usr/local/bin/container-scripts/cleanup.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/cleanup.sh
Dec 16 09:16:44 crc kubenswrapper[4823]: + source /usr/local/bin/container-scripts/functions
Dec 16 09:16:44 crc kubenswrapper[4823]: ++ DB_TYPE=nb
Dec 16 09:16:44 crc kubenswrapper[4823]: ++ DB_FILE=/etc/ovn/ovnnb_db.db
Dec 16 09:16:44 crc kubenswrapper[4823]: + DB_NAME=OVN_Northbound
Dec 16 09:16:44 crc kubenswrapper[4823]: + [[ nb == \s\b ]]
Dec 16 09:16:44 crc kubenswrapper[4823]: ++ hostname
Dec 16 09:16:44 crc kubenswrapper[4823]: + [[ ovsdbserver-nb-1 != \o\v\s\d\b\s\e\r\v\e\r\-\n\b\-\0 ]]
Dec 16 09:16:44 crc kubenswrapper[4823]: + ovs-appctl -t /tmp/ovnnb_db.ctl cluster/leave OVN_Northbound
Dec 16 09:16:44 crc kubenswrapper[4823]: + true
Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound
Dec 16 09:16:44 crc
kubenswrapper[4823]: ++ grep Status:
Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}'
Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving
Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']'
Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1
Dec 16 09:16:44 crc kubenswrapper[4823]: + true
Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound
Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status:
Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}'
Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving
Dec 16
09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 
09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 
16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status 
OVN_Northbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc 
kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + 
STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft 
cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc 
kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t 
/tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ 
grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 
crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z 
leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 
Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ 
ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:44 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:44 crc kubenswrapper[4823]: + true Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:44 crc 
kubenswrapper[4823]: ++ awk -e '{print $2}'
Dec 16 09:16:44 crc kubenswrapper[4823]: ++ grep Status:
Dec 16 09:16:44 crc kubenswrapper[4823]: + STATUS=leaving
Dec 16 09:16:44 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']'
Dec 16 09:16:44 crc kubenswrapper[4823]: + sleep 1
Dec 16 09:16:44 crc kubenswrapper[4823]: + true
Dec 16 09:16:44 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound
[... the same seven trace entries repeat verbatim many times per second, with STATUS=leaving on every pass (the three pipeline stages occasionally logged in a different order), timestamps advancing from Dec 16 09:16:44 to Dec 16 09:16:45 ...]
Dec 16 09:16:45 crc 
kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 
09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ 
awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 
crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc 
kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 
crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 
16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: 
++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 
09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 
09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 
16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status 
OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc 
kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + 
STATUS=leaving
Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']'
Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1
Dec 16 09:16:45 crc kubenswrapper[4823]: + true
Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound
Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status:
Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}'
Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving
Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']'
Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1
Dec 16 09:16:45 crc kubenswrapper[4823]: > execCommand=["/usr/local/bin/container-scripts/cleanup.sh"] containerName="ovsdbserver-nb" pod="openstack/ovsdbserver-nb-1" message=<
Dec 16 09:16:45 crc kubenswrapper[4823]: ++ dirname /usr/local/bin/container-scripts/cleanup.sh
Dec 16 09:16:45 crc kubenswrapper[4823]: + source /usr/local/bin/container-scripts/functions
Dec 16 09:16:45 crc kubenswrapper[4823]: ++ DB_TYPE=nb
Dec 16 09:16:45 crc kubenswrapper[4823]: ++ DB_FILE=/etc/ovn/ovnnb_db.db
Dec 16 09:16:45 crc kubenswrapper[4823]: + DB_NAME=OVN_Northbound
Dec 16 09:16:45 crc kubenswrapper[4823]: + [[ nb == \s\b ]]
Dec 16 09:16:45 crc kubenswrapper[4823]: ++ hostname
Dec 16 09:16:45 crc kubenswrapper[4823]: + [[ ovsdbserver-nb-1 != \o\v\s\d\b\s\e\r\v\e\r\-\n\b\-\0 ]]
Dec 16 09:16:45 crc kubenswrapper[4823]: + ovs-appctl -t /tmp/ovnnb_db.ctl cluster/leave OVN_Northbound
Dec 16 09:16:45 crc kubenswrapper[4823]: + true
Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound
Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status:
Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}'
Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving
Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']'
Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1
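The trace above shows cleanup.sh on ovsdbserver-nb-1 issuing `cluster/leave` and then polling `cluster/status` once a second; every iteration reports `Status: leaving`, so the pod never observes "left cluster" during this window. A hedged reconstruction of that loop, inferred only from the `set -x` output (the real cleanup.sh is not in the log; `cluster_status` and `wait_until_left` are names chosen here for illustration):

```shell
#!/bin/bash
# Sketch reconstructed from the kubenswrapper trace above; not the
# authoritative cleanup.sh. Paths and DB_NAME are taken from the trace.
DB_NAME=OVN_Northbound
CTL_SOCK=/tmp/ovnnb_db.ctl

# Extract the "Status:" field, mirroring the traced pipeline
# (ovs-appctl ... | grep Status: | awk '{print $2}').
cluster_status() {
    ovs-appctl -t "$CTL_SOCK" cluster/status "$DB_NAME" \
        | grep Status: | awk '{print $2}'
}

# Poll once a second until the raft member reports it has left the
# cluster (an empty status is also treated as done, per the test
# "[ -z $STATUS -o x$STATUS = 'xleft cluster' ]" in the trace).
wait_until_left() {
    while true; do
        STATUS=$(cluster_status)
        if [ -z "$STATUS" -o "x$STATUS" = "xleft cluster" ]; then
            break
        fi
        sleep 1
    done
}
```

Because the traced status stays at `leaving` on every poll, this loop accounts for the long run of identical `sleep 1` / `cluster/status` entries surrounding the `execCommand` message.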
leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 
Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ 
ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc 
kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print 
$2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc 
kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc 
kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 
crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 
16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: 
++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 
09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 
09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 
16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status 
OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 
16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status 
OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc 
kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + 
STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft 
cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc 
kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t 
/tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ 
grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 
crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z 
leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 
Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ 
ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc 
kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print 
$2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc 
kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc 
kubenswrapper[4823]: + sleep 1
Dec 16 09:16:45 crc kubenswrapper[4823]: + true
Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound
Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status:
Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}'
Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving
Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']'
Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1
[... the polling iteration above (true / cluster/status / grep / awk / STATUS=leaving / test / sleep 1) repeats verbatim many more times; only the relative order of the grep and awk trace lines varies between iterations ...]
ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound
Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status:
Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}'
Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving
Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']'
Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1
Dec 16 09:16:45 crc kubenswrapper[4823]: >
Dec 16 09:16:45 crc kubenswrapper[4823]: E1216 09:16:44.829556 4823 kuberuntime_container.go:691] "PreStop hook failed" err=<
Dec 16 09:16:45 crc kubenswrapper[4823]: command '/usr/local/bin/container-scripts/cleanup.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/cleanup.sh
Dec 16 09:16:45 crc kubenswrapper[4823]: + source /usr/local/bin/container-scripts/functions
Dec 16 09:16:45 crc kubenswrapper[4823]: ++ DB_TYPE=nb
Dec 16 09:16:45 crc kubenswrapper[4823]: ++ DB_FILE=/etc/ovn/ovnnb_db.db
Dec 16 09:16:45 crc kubenswrapper[4823]: + DB_NAME=OVN_Northbound
Dec 16 09:16:45 crc kubenswrapper[4823]: + [[ nb == \s\b ]]
Dec 16 09:16:45 crc kubenswrapper[4823]: ++ hostname
Dec 16 09:16:45 crc kubenswrapper[4823]: + [[ ovsdbserver-nb-1 != \o\v\s\d\b\s\e\r\v\e\r\-\n\b\-\0 ]]
Dec 16 09:16:45 crc kubenswrapper[4823]: + ovs-appctl -t /tmp/ovnnb_db.ctl cluster/leave OVN_Northbound
Dec 16 09:16:45 crc kubenswrapper[4823]: + true
Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound
Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status:
Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}'
Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving
Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']'
Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1
[... the same polling iteration repeats verbatim until the captured log is truncated mid-entry ...]
leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 
Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ 
ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc 
kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print 
$2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc 
kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc 
kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 
crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 
16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc 
kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving 
Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 
16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true 
Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status 
OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc 
kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + 
STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft 
cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc 
kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t 
/tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ 
grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 
crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z 
leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 
Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ 
ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc 
kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print 
$2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc 
kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc 
kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 
crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 
16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: 
++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++
awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 
crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc 
kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 
crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 
16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: 
++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 
09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 
09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 
16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: 
++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc 
kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + 
STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft 
cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc 
kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: > pod="openstack/ovsdbserver-nb-1" podUID="7e1d3682-8130-4fa4-aab4-ade2ac069d2e" containerName="ovsdbserver-nb" containerID="cri-o://2430c406e51c479baf9de2cdddd7f35d4cb9b052428eb9894c3134ee623601b5" Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:44.907724 4823 kuberuntime_container.go:700] "PreStop hook not completed in grace period" pod="openstack/ovsdbserver-nb-2" podUID="6353e69a-5c31-41c9-9d05-2b958aa6a79f" containerName="ovsdbserver-nb" containerID="cri-o://10d23fcf40d14824e8bcbbb27688c846b6ace21845beeab00416051c6361ffe7" gracePeriod=300 Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:44.907799 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-2" podUID="6353e69a-5c31-41c9-9d05-2b958aa6a79f" containerName="ovsdbserver-nb" containerID="cri-o://10d23fcf40d14824e8bcbbb27688c846b6ace21845beeab00416051c6361ffe7" gracePeriod=2 Dec 16 09:16:45 crc kubenswrapper[4823]: E1216 09:16:44.916571 4823 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=< Dec 16 09:16:45 crc kubenswrapper[4823]: command '/usr/local/bin/container-scripts/cleanup.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/cleanup.sh Dec 16 09:16:45 crc kubenswrapper[4823]: + source /usr/local/bin/container-scripts/functions Dec 16 09:16:45 crc kubenswrapper[4823]: ++ DB_TYPE=nb Dec 16 09:16:45 crc kubenswrapper[4823]: ++ DB_FILE=/etc/ovn/ovnnb_db.db Dec 16 09:16:45 crc kubenswrapper[4823]: + DB_NAME=OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: + [[ nb == \s\b ]] Dec 16 09:16:45 crc kubenswrapper[4823]: ++ hostname Dec 16 09:16:45 crc kubenswrapper[4823]: + [[ ovsdbserver-nb-2 != \o\v\s\d\b\s\e\r\v\e\r\-\n\b\-\0 ]] Dec 16 09:16:45 crc kubenswrapper[4823]: + ovs-appctl -t /tmp/ovnnb_db.ctl cluster/leave OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec
16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status 
OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc 
kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + 
STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft 
cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc 
kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t 
/tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ 
grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 
crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z 
leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 
Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ 
ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl 
cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' 
Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc 
kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc 
kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 
crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 
16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: 
++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 
09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 
09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 
16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status 
OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc 
kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + 
STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft 
cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc 
kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t 
/tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status 
OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 
crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z 
leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 
STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft 
cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc 
kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t 
/tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: > execCommand=["/usr/local/bin/container-scripts/cleanup.sh"] containerName="ovsdbserver-nb" pod="openstack/ovsdbserver-nb-2" message=< Dec 16 09:16:45 crc kubenswrapper[4823]: ++ dirname /usr/local/bin/container-scripts/cleanup.sh Dec 16 09:16:45 crc kubenswrapper[4823]: + source /usr/local/bin/container-scripts/functions Dec 16 09:16:45 crc kubenswrapper[4823]: ++ DB_TYPE=nb Dec 16 09:16:45 crc kubenswrapper[4823]: ++ DB_FILE=/etc/ovn/ovnnb_db.db Dec 16 09:16:45 crc kubenswrapper[4823]: + DB_NAME=OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: + [[ nb == \s\b ]] Dec 16 09:16:45 crc kubenswrapper[4823]: ++ hostname Dec 16 09:16:45 crc kubenswrapper[4823]: + [[ ovsdbserver-nb-2 != \o\v\s\d\b\s\e\r\v\e\r\-\n\b\-\0 ]] Dec 16 09:16:45 crc kubenswrapper[4823]: + ovs-appctl -t /tmp/ovnnb_db.ctl cluster/leave OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: 
Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc 
kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc 
kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 
crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 
16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: 
++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 
09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 
09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 
16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status 
OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc 
kubenswrapper[4823]: ++ awk -e '{print $2}'
Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving
Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']'
Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1
Dec 16 09:16:45 crc kubenswrapper[4823]: + true
Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound
Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status:
Dec 16 09:16:45 crc
kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + 
STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft 
cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc 
kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t 
/tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ 
grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 
crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z 
leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 
Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ 
ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc 
kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print 
$2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc 
kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc 
kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 
kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 
crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 
16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: 
++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 
09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: > Dec 16 09:16:45 crc kubenswrapper[4823]: E1216 09:16:44.917045 4823 kuberuntime_container.go:691] "PreStop hook failed" err=< Dec 16 09:16:45 crc kubenswrapper[4823]: command '/usr/local/bin/container-scripts/cleanup.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/cleanup.sh Dec 16 09:16:45 crc kubenswrapper[4823]: + source /usr/local/bin/container-scripts/functions Dec 16 09:16:45 crc kubenswrapper[4823]: ++ DB_TYPE=nb Dec 16 09:16:45 crc kubenswrapper[4823]: ++ DB_FILE=/etc/ovn/ovnnb_db.db Dec 16 09:16:45 crc kubenswrapper[4823]: + DB_NAME=OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: + [[ nb == \s\b ]] Dec 16 09:16:45 crc kubenswrapper[4823]: ++ hostname Dec 16 09:16:45 crc kubenswrapper[4823]: + [[ ovsdbserver-nb-2 != \o\v\s\d\b\s\e\r\v\e\r\-\n\b\-\0 ]] Dec 16 09:16:45 crc kubenswrapper[4823]: + ovs-appctl -t /tmp/ovnnb_db.ctl cluster/leave OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc 
kubenswrapper[4823]: + sleep 1
/tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ 
grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 
crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z 
leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 
Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ 
ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc 
kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print 
$2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc 
kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc 
kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 
crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 
16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: 
++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 
09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 
09:16:45 crc kubenswrapper[4823]: + sleep 1
Dec 16 09:16:45 crc kubenswrapper[4823]: + true
Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound
Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status:
Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}'
Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving
Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']'
Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1
Dec 16 09:16:45 crc kubenswrapper[4823]: + true
Dec 16
09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 
09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 
16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status 
OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc 
kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: + 
STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft 
cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc 
kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t 
/tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ 
grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 
crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z 
leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 
Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ 
ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}' Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status: Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1 Dec 16 09:16:45 crc kubenswrapper[4823]: + true Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Dec 16 09:16:45 crc 
kubenswrapper[4823]: ++ grep Status:
Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}'
Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving
Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']'
Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1
Dec 16 09:16:45 crc kubenswrapper[4823]: + true
Dec 16 09:16:45 crc kubenswrapper[4823]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound
Dec 16 09:16:45 crc kubenswrapper[4823]: ++ grep Status:
Dec 16 09:16:45 crc kubenswrapper[4823]: ++ awk -e '{print $2}'
Dec 16 09:16:45 crc kubenswrapper[4823]: + STATUS=leaving
Dec 16 09:16:45 crc kubenswrapper[4823]: + '[' -z leaving -o xleaving = 'xleft cluster' ']'
Dec 16 09:16:45 crc kubenswrapper[4823]: + sleep 1
Dec 16 09:16:45 crc kubenswrapper[4823]: > pod="openstack/ovsdbserver-nb-2" podUID="6353e69a-5c31-41c9-9d05-2b958aa6a79f" containerName="ovsdbserver-nb" containerID="cri-o://10d23fcf40d14824e8bcbbb27688c846b6ace21845beeab00416051c6361ffe7"
Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.034522 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-1_6c99b5e4-de24-426d-9a97-05fdcbe37141/ovsdbserver-sb/0.log"
Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.034577 4823 generic.go:334] "Generic (PLEG): container finished" podID="6c99b5e4-de24-426d-9a97-05fdcbe37141" containerID="d06360657e9e0d4e61ed0bba6b0ba1b231c8900c0ee15d84fd10f3823299aa4b" exitCode=143
Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.034643 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"6c99b5e4-de24-426d-9a97-05fdcbe37141","Type":"ContainerDied","Data":"d06360657e9e0d4e61ed0bba6b0ba1b231c8900c0ee15d84fd10f3823299aa4b"}
Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.041750 4823 log.go:25] "Finished
parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_dc75b889-6dc5-462d-a589-50f705ffd78f/ovsdbserver-sb/0.log" Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.041800 4823 generic.go:334] "Generic (PLEG): container finished" podID="dc75b889-6dc5-462d-a589-50f705ffd78f" containerID="b83530bf76668f8bf22e0d206703a2c7ec87906449e3b0e5fc0196514eab70ab" exitCode=143 Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.041845 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"dc75b889-6dc5-462d-a589-50f705ffd78f","Type":"ContainerDied","Data":"b83530bf76668f8bf22e0d206703a2c7ec87906449e3b0e5fc0196514eab70ab"} Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.044115 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_7e1d3682-8130-4fa4-aab4-ade2ac069d2e/ovsdbserver-nb/0.log" Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.044144 4823 generic.go:334] "Generic (PLEG): container finished" podID="7e1d3682-8130-4fa4-aab4-ade2ac069d2e" containerID="2430c406e51c479baf9de2cdddd7f35d4cb9b052428eb9894c3134ee623601b5" exitCode=143 Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.044181 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"7e1d3682-8130-4fa4-aab4-ade2ac069d2e","Type":"ContainerDied","Data":"2430c406e51c479baf9de2cdddd7f35d4cb9b052428eb9894c3134ee623601b5"} Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.047355 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_6353e69a-5c31-41c9-9d05-2b958aa6a79f/ovsdbserver-nb/0.log" Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.047377 4823 generic.go:334] "Generic (PLEG): container finished" podID="6353e69a-5c31-41c9-9d05-2b958aa6a79f" containerID="10d23fcf40d14824e8bcbbb27688c846b6ace21845beeab00416051c6361ffe7" exitCode=143 Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.047393 4823 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"6353e69a-5c31-41c9-9d05-2b958aa6a79f","Type":"ContainerDied","Data":"10d23fcf40d14824e8bcbbb27688c846b6ace21845beeab00416051c6361ffe7"} Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.604174 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-1_6c99b5e4-de24-426d-9a97-05fdcbe37141/ovsdbserver-sb/0.log" Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.604254 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-1" Dec 16 09:16:45 crc kubenswrapper[4823]: E1216 09:16:45.606043 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b83530bf76668f8bf22e0d206703a2c7ec87906449e3b0e5fc0196514eab70ab is running failed: container process not found" containerID="b83530bf76668f8bf22e0d206703a2c7ec87906449e3b0e5fc0196514eab70ab" cmd=["/usr/bin/pidof","ovsdb-server"] Dec 16 09:16:45 crc kubenswrapper[4823]: E1216 09:16:45.606378 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b83530bf76668f8bf22e0d206703a2c7ec87906449e3b0e5fc0196514eab70ab is running failed: container process not found" containerID="b83530bf76668f8bf22e0d206703a2c7ec87906449e3b0e5fc0196514eab70ab" cmd=["/usr/bin/pidof","ovsdb-server"] Dec 16 09:16:45 crc kubenswrapper[4823]: E1216 09:16:45.606589 4823 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b83530bf76668f8bf22e0d206703a2c7ec87906449e3b0e5fc0196514eab70ab is running failed: container process not found" containerID="b83530bf76668f8bf22e0d206703a2c7ec87906449e3b0e5fc0196514eab70ab" cmd=["/usr/bin/pidof","ovsdb-server"] Dec 16 09:16:45 crc 
kubenswrapper[4823]: E1216 09:16:45.606611 4823 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b83530bf76668f8bf22e0d206703a2c7ec87906449e3b0e5fc0196514eab70ab is running failed: container process not found" probeType="Readiness" pod="openstack/ovsdbserver-sb-2" podUID="dc75b889-6dc5-462d-a589-50f705ffd78f" containerName="ovsdbserver-sb" Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.645677 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_6353e69a-5c31-41c9-9d05-2b958aa6a79f/ovsdbserver-nb/0.log" Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.645944 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-2" Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.662297 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_dc75b889-6dc5-462d-a589-50f705ffd78f/ovsdbserver-sb/0.log" Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.662370 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-2" Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.683384 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xqfh6\" (UniqueName: \"kubernetes.io/projected/6353e69a-5c31-41c9-9d05-2b958aa6a79f-kube-api-access-xqfh6\") pod \"6353e69a-5c31-41c9-9d05-2b958aa6a79f\" (UID: \"6353e69a-5c31-41c9-9d05-2b958aa6a79f\") " Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.683428 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-49rnz\" (UniqueName: \"kubernetes.io/projected/6c99b5e4-de24-426d-9a97-05fdcbe37141-kube-api-access-49rnz\") pod \"6c99b5e4-de24-426d-9a97-05fdcbe37141\" (UID: \"6c99b5e4-de24-426d-9a97-05fdcbe37141\") " Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.683480 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c99b5e4-de24-426d-9a97-05fdcbe37141-config\") pod \"6c99b5e4-de24-426d-9a97-05fdcbe37141\" (UID: \"6c99b5e4-de24-426d-9a97-05fdcbe37141\") " Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.684667 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc75b889-6dc5-462d-a589-50f705ffd78f-ovsdbserver-sb-tls-certs\") pod \"dc75b889-6dc5-462d-a589-50f705ffd78f\" (UID: \"dc75b889-6dc5-462d-a589-50f705ffd78f\") " Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.684707 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6353e69a-5c31-41c9-9d05-2b958aa6a79f-config\") pod \"6353e69a-5c31-41c9-9d05-2b958aa6a79f\" (UID: \"6353e69a-5c31-41c9-9d05-2b958aa6a79f\") " Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.684729 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc75b889-6dc5-462d-a589-50f705ffd78f-combined-ca-bundle\") pod \"dc75b889-6dc5-462d-a589-50f705ffd78f\" (UID: \"dc75b889-6dc5-462d-a589-50f705ffd78f\") " Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.684754 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc75b889-6dc5-462d-a589-50f705ffd78f-metrics-certs-tls-certs\") pod \"dc75b889-6dc5-462d-a589-50f705ffd78f\" (UID: \"dc75b889-6dc5-462d-a589-50f705ffd78f\") " Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.684771 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6353e69a-5c31-41c9-9d05-2b958aa6a79f-metrics-certs-tls-certs\") pod \"6353e69a-5c31-41c9-9d05-2b958aa6a79f\" (UID: \"6353e69a-5c31-41c9-9d05-2b958aa6a79f\") " Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.684789 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c99b5e4-de24-426d-9a97-05fdcbe37141-ovsdbserver-sb-tls-certs\") pod \"6c99b5e4-de24-426d-9a97-05fdcbe37141\" (UID: \"6c99b5e4-de24-426d-9a97-05fdcbe37141\") " Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.684805 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f9b2q\" (UniqueName: \"kubernetes.io/projected/dc75b889-6dc5-462d-a589-50f705ffd78f-kube-api-access-f9b2q\") pod \"dc75b889-6dc5-462d-a589-50f705ffd78f\" (UID: \"dc75b889-6dc5-462d-a589-50f705ffd78f\") " Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.685213 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c99b5e4-de24-426d-9a97-05fdcbe37141-config" (OuterVolumeSpecName: "config") pod "6c99b5e4-de24-426d-9a97-05fdcbe37141" (UID: 
"6c99b5e4-de24-426d-9a97-05fdcbe37141"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.685522 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-464a6eca-8ad0-46e3-8e7b-4ed9c1986afb\") pod \"dc75b889-6dc5-462d-a589-50f705ffd78f\" (UID: \"dc75b889-6dc5-462d-a589-50f705ffd78f\") "
Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.685556 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c99b5e4-de24-426d-9a97-05fdcbe37141-combined-ca-bundle\") pod \"6c99b5e4-de24-426d-9a97-05fdcbe37141\" (UID: \"6c99b5e4-de24-426d-9a97-05fdcbe37141\") "
Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.685583 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6c99b5e4-de24-426d-9a97-05fdcbe37141-scripts\") pod \"6c99b5e4-de24-426d-9a97-05fdcbe37141\" (UID: \"6c99b5e4-de24-426d-9a97-05fdcbe37141\") "
Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.685603 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6353e69a-5c31-41c9-9d05-2b958aa6a79f-ovsdb-rundir\") pod \"6353e69a-5c31-41c9-9d05-2b958aa6a79f\" (UID: \"6353e69a-5c31-41c9-9d05-2b958aa6a79f\") "
Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.685619 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6353e69a-5c31-41c9-9d05-2b958aa6a79f-ovsdbserver-nb-tls-certs\") pod \"6353e69a-5c31-41c9-9d05-2b958aa6a79f\" (UID: \"6353e69a-5c31-41c9-9d05-2b958aa6a79f\") "
Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.685640 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/dc75b889-6dc5-462d-a589-50f705ffd78f-ovsdb-rundir\") pod \"dc75b889-6dc5-462d-a589-50f705ffd78f\" (UID: \"dc75b889-6dc5-462d-a589-50f705ffd78f\") "
Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.686319 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0a26bacc-8a4e-496b-bde2-446a41ec7f03\") pod \"6c99b5e4-de24-426d-9a97-05fdcbe37141\" (UID: \"6c99b5e4-de24-426d-9a97-05fdcbe37141\") "
Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.687199 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6353e69a-5c31-41c9-9d05-2b958aa6a79f-config" (OuterVolumeSpecName: "config") pod "6353e69a-5c31-41c9-9d05-2b958aa6a79f" (UID: "6353e69a-5c31-41c9-9d05-2b958aa6a79f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.689482 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-nb-etc-ovn\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-90d11ead-6d5b-4d27-8b2b-082a68583c3d\") pod \"6353e69a-5c31-41c9-9d05-2b958aa6a79f\" (UID: \"6353e69a-5c31-41c9-9d05-2b958aa6a79f\") "
Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.689521 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6c99b5e4-de24-426d-9a97-05fdcbe37141-ovsdb-rundir\") pod \"6c99b5e4-de24-426d-9a97-05fdcbe37141\" (UID: \"6c99b5e4-de24-426d-9a97-05fdcbe37141\") "
Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.689542 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dc75b889-6dc5-462d-a589-50f705ffd78f-scripts\") pod \"dc75b889-6dc5-462d-a589-50f705ffd78f\" (UID: \"dc75b889-6dc5-462d-a589-50f705ffd78f\") "
Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.689857 4823 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c99b5e4-de24-426d-9a97-05fdcbe37141-config\") on node \"crc\" DevicePath \"\""
Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.689874 4823 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6353e69a-5c31-41c9-9d05-2b958aa6a79f-config\") on node \"crc\" DevicePath \"\""
Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.690244 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc75b889-6dc5-462d-a589-50f705ffd78f-scripts" (OuterVolumeSpecName: "scripts") pod "dc75b889-6dc5-462d-a589-50f705ffd78f" (UID: "dc75b889-6dc5-462d-a589-50f705ffd78f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.696262 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c99b5e4-de24-426d-9a97-05fdcbe37141-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "6c99b5e4-de24-426d-9a97-05fdcbe37141" (UID: "6c99b5e4-de24-426d-9a97-05fdcbe37141"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.696621 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6353e69a-5c31-41c9-9d05-2b958aa6a79f-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "6353e69a-5c31-41c9-9d05-2b958aa6a79f" (UID: "6353e69a-5c31-41c9-9d05-2b958aa6a79f"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.696953 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc75b889-6dc5-462d-a589-50f705ffd78f-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "dc75b889-6dc5-462d-a589-50f705ffd78f" (UID: "dc75b889-6dc5-462d-a589-50f705ffd78f"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.697198 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c99b5e4-de24-426d-9a97-05fdcbe37141-scripts" (OuterVolumeSpecName: "scripts") pod "6c99b5e4-de24-426d-9a97-05fdcbe37141" (UID: "6c99b5e4-de24-426d-9a97-05fdcbe37141"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.703178 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc75b889-6dc5-462d-a589-50f705ffd78f-kube-api-access-f9b2q" (OuterVolumeSpecName: "kube-api-access-f9b2q") pod "dc75b889-6dc5-462d-a589-50f705ffd78f" (UID: "dc75b889-6dc5-462d-a589-50f705ffd78f"). InnerVolumeSpecName "kube-api-access-f9b2q". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.703550 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6353e69a-5c31-41c9-9d05-2b958aa6a79f-kube-api-access-xqfh6" (OuterVolumeSpecName: "kube-api-access-xqfh6") pod "6353e69a-5c31-41c9-9d05-2b958aa6a79f" (UID: "6353e69a-5c31-41c9-9d05-2b958aa6a79f"). InnerVolumeSpecName "kube-api-access-xqfh6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.707996 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c99b5e4-de24-426d-9a97-05fdcbe37141-kube-api-access-49rnz" (OuterVolumeSpecName: "kube-api-access-49rnz") pod "6c99b5e4-de24-426d-9a97-05fdcbe37141" (UID: "6c99b5e4-de24-426d-9a97-05fdcbe37141"). InnerVolumeSpecName "kube-api-access-49rnz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.724373 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0a26bacc-8a4e-496b-bde2-446a41ec7f03" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "6c99b5e4-de24-426d-9a97-05fdcbe37141" (UID: "6c99b5e4-de24-426d-9a97-05fdcbe37141"). InnerVolumeSpecName "pvc-0a26bacc-8a4e-496b-bde2-446a41ec7f03".
PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.725708 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-464a6eca-8ad0-46e3-8e7b-4ed9c1986afb" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "dc75b889-6dc5-462d-a589-50f705ffd78f" (UID: "dc75b889-6dc5-462d-a589-50f705ffd78f"). InnerVolumeSpecName "pvc-464a6eca-8ad0-46e3-8e7b-4ed9c1986afb". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.734986 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc75b889-6dc5-462d-a589-50f705ffd78f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dc75b889-6dc5-462d-a589-50f705ffd78f" (UID: "dc75b889-6dc5-462d-a589-50f705ffd78f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.735557 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-90d11ead-6d5b-4d27-8b2b-082a68583c3d" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod "6353e69a-5c31-41c9-9d05-2b958aa6a79f" (UID: "6353e69a-5c31-41c9-9d05-2b958aa6a79f"). InnerVolumeSpecName "pvc-90d11ead-6d5b-4d27-8b2b-082a68583c3d". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.740073 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_7e1d3682-8130-4fa4-aab4-ade2ac069d2e/ovsdbserver-nb/0.log" Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.740170 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-1" Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.746127 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c99b5e4-de24-426d-9a97-05fdcbe37141-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6c99b5e4-de24-426d-9a97-05fdcbe37141" (UID: "6c99b5e4-de24-426d-9a97-05fdcbe37141"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.761201 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc75b889-6dc5-462d-a589-50f705ffd78f-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "dc75b889-6dc5-462d-a589-50f705ffd78f" (UID: "dc75b889-6dc5-462d-a589-50f705ffd78f"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.792323 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7pjtt\" (UniqueName: \"kubernetes.io/projected/7e1d3682-8130-4fa4-aab4-ade2ac069d2e-kube-api-access-7pjtt\") pod \"7e1d3682-8130-4fa4-aab4-ade2ac069d2e\" (UID: \"7e1d3682-8130-4fa4-aab4-ade2ac069d2e\") " Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.792398 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e1d3682-8130-4fa4-aab4-ade2ac069d2e-metrics-certs-tls-certs\") pod \"7e1d3682-8130-4fa4-aab4-ade2ac069d2e\" (UID: \"7e1d3682-8130-4fa4-aab4-ade2ac069d2e\") " Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.792506 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e1d3682-8130-4fa4-aab4-ade2ac069d2e-ovsdbserver-nb-tls-certs\") pod 
\"7e1d3682-8130-4fa4-aab4-ade2ac069d2e\" (UID: \"7e1d3682-8130-4fa4-aab4-ade2ac069d2e\") " Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.793906 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7e1d3682-8130-4fa4-aab4-ade2ac069d2e-scripts\") pod \"7e1d3682-8130-4fa4-aab4-ade2ac069d2e\" (UID: \"7e1d3682-8130-4fa4-aab4-ade2ac069d2e\") " Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.793947 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e1d3682-8130-4fa4-aab4-ade2ac069d2e-config\") pod \"7e1d3682-8130-4fa4-aab4-ade2ac069d2e\" (UID: \"7e1d3682-8130-4fa4-aab4-ade2ac069d2e\") " Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.793997 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6353e69a-5c31-41c9-9d05-2b958aa6a79f-scripts\") pod \"6353e69a-5c31-41c9-9d05-2b958aa6a79f\" (UID: \"6353e69a-5c31-41c9-9d05-2b958aa6a79f\") " Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.794039 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e1d3682-8130-4fa4-aab4-ade2ac069d2e-combined-ca-bundle\") pod \"7e1d3682-8130-4fa4-aab4-ade2ac069d2e\" (UID: \"7e1d3682-8130-4fa4-aab4-ade2ac069d2e\") " Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.794064 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6353e69a-5c31-41c9-9d05-2b958aa6a79f-combined-ca-bundle\") pod \"6353e69a-5c31-41c9-9d05-2b958aa6a79f\" (UID: \"6353e69a-5c31-41c9-9d05-2b958aa6a79f\") " Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.794093 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/dc75b889-6dc5-462d-a589-50f705ffd78f-config\") pod \"dc75b889-6dc5-462d-a589-50f705ffd78f\" (UID: \"dc75b889-6dc5-462d-a589-50f705ffd78f\") " Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.794245 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-nb-etc-ovn\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-20321045-c94b-408e-81f6-d22070c77447\") pod \"7e1d3682-8130-4fa4-aab4-ade2ac069d2e\" (UID: \"7e1d3682-8130-4fa4-aab4-ade2ac069d2e\") " Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.794276 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7e1d3682-8130-4fa4-aab4-ade2ac069d2e-ovsdb-rundir\") pod \"7e1d3682-8130-4fa4-aab4-ade2ac069d2e\" (UID: \"7e1d3682-8130-4fa4-aab4-ade2ac069d2e\") " Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.794298 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c99b5e4-de24-426d-9a97-05fdcbe37141-metrics-certs-tls-certs\") pod \"6c99b5e4-de24-426d-9a97-05fdcbe37141\" (UID: \"6c99b5e4-de24-426d-9a97-05fdcbe37141\") " Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.795152 4823 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6c99b5e4-de24-426d-9a97-05fdcbe37141-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.795180 4823 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6353e69a-5c31-41c9-9d05-2b958aa6a79f-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.795191 4823 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/dc75b889-6dc5-462d-a589-50f705ffd78f-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.795216 4823 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-0a26bacc-8a4e-496b-bde2-446a41ec7f03\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0a26bacc-8a4e-496b-bde2-446a41ec7f03\") on node \"crc\" " Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.795237 4823 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-90d11ead-6d5b-4d27-8b2b-082a68583c3d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-90d11ead-6d5b-4d27-8b2b-082a68583c3d\") on node \"crc\" " Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.795247 4823 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6c99b5e4-de24-426d-9a97-05fdcbe37141-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.795261 4823 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dc75b889-6dc5-462d-a589-50f705ffd78f-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.796072 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xqfh6\" (UniqueName: \"kubernetes.io/projected/6353e69a-5c31-41c9-9d05-2b958aa6a79f-kube-api-access-xqfh6\") on node \"crc\" DevicePath \"\"" Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.796086 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-49rnz\" (UniqueName: \"kubernetes.io/projected/6c99b5e4-de24-426d-9a97-05fdcbe37141-kube-api-access-49rnz\") on node \"crc\" DevicePath \"\"" Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.796097 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/dc75b889-6dc5-462d-a589-50f705ffd78f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.796111 4823 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc75b889-6dc5-462d-a589-50f705ffd78f-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.796120 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f9b2q\" (UniqueName: \"kubernetes.io/projected/dc75b889-6dc5-462d-a589-50f705ffd78f-kube-api-access-f9b2q\") on node \"crc\" DevicePath \"\"" Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.796141 4823 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-464a6eca-8ad0-46e3-8e7b-4ed9c1986afb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-464a6eca-8ad0-46e3-8e7b-4ed9c1986afb\") on node \"crc\" " Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.796157 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c99b5e4-de24-426d-9a97-05fdcbe37141-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.795803 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6353e69a-5c31-41c9-9d05-2b958aa6a79f-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "6353e69a-5c31-41c9-9d05-2b958aa6a79f" (UID: "6353e69a-5c31-41c9-9d05-2b958aa6a79f"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.797806 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e1d3682-8130-4fa4-aab4-ade2ac069d2e-kube-api-access-7pjtt" (OuterVolumeSpecName: "kube-api-access-7pjtt") pod "7e1d3682-8130-4fa4-aab4-ade2ac069d2e" (UID: "7e1d3682-8130-4fa4-aab4-ade2ac069d2e"). InnerVolumeSpecName "kube-api-access-7pjtt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.799378 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e1d3682-8130-4fa4-aab4-ade2ac069d2e-config" (OuterVolumeSpecName: "config") pod "7e1d3682-8130-4fa4-aab4-ade2ac069d2e" (UID: "7e1d3682-8130-4fa4-aab4-ade2ac069d2e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.799781 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e1d3682-8130-4fa4-aab4-ade2ac069d2e-scripts" (OuterVolumeSpecName: "scripts") pod "7e1d3682-8130-4fa4-aab4-ade2ac069d2e" (UID: "7e1d3682-8130-4fa4-aab4-ade2ac069d2e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.800979 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e1d3682-8130-4fa4-aab4-ade2ac069d2e-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "7e1d3682-8130-4fa4-aab4-ade2ac069d2e" (UID: "7e1d3682-8130-4fa4-aab4-ade2ac069d2e"). InnerVolumeSpecName "ovsdb-rundir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.801473 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc75b889-6dc5-462d-a589-50f705ffd78f-config" (OuterVolumeSpecName: "config") pod "dc75b889-6dc5-462d-a589-50f705ffd78f" (UID: "dc75b889-6dc5-462d-a589-50f705ffd78f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.801236 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6353e69a-5c31-41c9-9d05-2b958aa6a79f-scripts" (OuterVolumeSpecName: "scripts") pod "6353e69a-5c31-41c9-9d05-2b958aa6a79f" (UID: "6353e69a-5c31-41c9-9d05-2b958aa6a79f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.827007 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-20321045-c94b-408e-81f6-d22070c77447" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod "7e1d3682-8130-4fa4-aab4-ade2ac069d2e" (UID: "7e1d3682-8130-4fa4-aab4-ade2ac069d2e"). InnerVolumeSpecName "pvc-20321045-c94b-408e-81f6-d22070c77447". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.836716 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c99b5e4-de24-426d-9a97-05fdcbe37141-ovsdbserver-sb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-sb-tls-certs") pod "6c99b5e4-de24-426d-9a97-05fdcbe37141" (UID: "6c99b5e4-de24-426d-9a97-05fdcbe37141"). InnerVolumeSpecName "ovsdbserver-sb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.840648 4823 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.840661 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6353e69a-5c31-41c9-9d05-2b958aa6a79f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6353e69a-5c31-41c9-9d05-2b958aa6a79f" (UID: "6353e69a-5c31-41c9-9d05-2b958aa6a79f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.840768 4823 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-90d11ead-6d5b-4d27-8b2b-082a68583c3d" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-90d11ead-6d5b-4d27-8b2b-082a68583c3d") on node "crc" Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.840941 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e1d3682-8130-4fa4-aab4-ade2ac069d2e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7e1d3682-8130-4fa4-aab4-ade2ac069d2e" (UID: "7e1d3682-8130-4fa4-aab4-ade2ac069d2e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.848163 4823 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.848309 4823 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-0a26bacc-8a4e-496b-bde2-446a41ec7f03" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0a26bacc-8a4e-496b-bde2-446a41ec7f03") on node "crc" Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.852779 4823 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.852925 4823 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-464a6eca-8ad0-46e3-8e7b-4ed9c1986afb" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-464a6eca-8ad0-46e3-8e7b-4ed9c1986afb") on node "crc" Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.852941 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6353e69a-5c31-41c9-9d05-2b958aa6a79f-ovsdbserver-nb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-nb-tls-certs") pod "6353e69a-5c31-41c9-9d05-2b958aa6a79f" (UID: "6353e69a-5c31-41c9-9d05-2b958aa6a79f"). InnerVolumeSpecName "ovsdbserver-nb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.865874 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e1d3682-8130-4fa4-aab4-ade2ac069d2e-ovsdbserver-nb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-nb-tls-certs") pod "7e1d3682-8130-4fa4-aab4-ade2ac069d2e" (UID: "7e1d3682-8130-4fa4-aab4-ade2ac069d2e"). InnerVolumeSpecName "ovsdbserver-nb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.868965 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc75b889-6dc5-462d-a589-50f705ffd78f-ovsdbserver-sb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-sb-tls-certs") pod "dc75b889-6dc5-462d-a589-50f705ffd78f" (UID: "dc75b889-6dc5-462d-a589-50f705ffd78f"). InnerVolumeSpecName "ovsdbserver-sb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.876267 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e1d3682-8130-4fa4-aab4-ade2ac069d2e-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "7e1d3682-8130-4fa4-aab4-ade2ac069d2e" (UID: "7e1d3682-8130-4fa4-aab4-ade2ac069d2e"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.897368 4823 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e1d3682-8130-4fa4-aab4-ade2ac069d2e-ovsdbserver-nb-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.897393 4823 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6353e69a-5c31-41c9-9d05-2b958aa6a79f-ovsdbserver-nb-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.897403 4823 reconciler_common.go:293] "Volume detached for volume \"pvc-0a26bacc-8a4e-496b-bde2-446a41ec7f03\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0a26bacc-8a4e-496b-bde2-446a41ec7f03\") on node \"crc\" DevicePath \"\"" Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.897414 4823 reconciler_common.go:293] "Volume detached for volume 
\"pvc-90d11ead-6d5b-4d27-8b2b-082a68583c3d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-90d11ead-6d5b-4d27-8b2b-082a68583c3d\") on node \"crc\" DevicePath \"\"" Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.897425 4823 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7e1d3682-8130-4fa4-aab4-ade2ac069d2e-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.897434 4823 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e1d3682-8130-4fa4-aab4-ade2ac069d2e-config\") on node \"crc\" DevicePath \"\"" Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.897442 4823 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6353e69a-5c31-41c9-9d05-2b958aa6a79f-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.897451 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e1d3682-8130-4fa4-aab4-ade2ac069d2e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.897460 4823 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6353e69a-5c31-41c9-9d05-2b958aa6a79f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.897469 4823 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc75b889-6dc5-462d-a589-50f705ffd78f-config\") on node \"crc\" DevicePath \"\"" Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.897502 4823 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-20321045-c94b-408e-81f6-d22070c77447\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-20321045-c94b-408e-81f6-d22070c77447\") on node \"crc\" " Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.897513 4823 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7e1d3682-8130-4fa4-aab4-ade2ac069d2e-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.897526 4823 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc75b889-6dc5-462d-a589-50f705ffd78f-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.897537 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7pjtt\" (UniqueName: \"kubernetes.io/projected/7e1d3682-8130-4fa4-aab4-ade2ac069d2e-kube-api-access-7pjtt\") on node \"crc\" DevicePath \"\"" Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.897545 4823 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e1d3682-8130-4fa4-aab4-ade2ac069d2e-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.897554 4823 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6353e69a-5c31-41c9-9d05-2b958aa6a79f-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.897567 4823 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c99b5e4-de24-426d-9a97-05fdcbe37141-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.897576 4823 reconciler_common.go:293] "Volume detached for volume \"pvc-464a6eca-8ad0-46e3-8e7b-4ed9c1986afb\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-464a6eca-8ad0-46e3-8e7b-4ed9c1986afb\") on node \"crc\" DevicePath \"\"" Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.901863 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c99b5e4-de24-426d-9a97-05fdcbe37141-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "6c99b5e4-de24-426d-9a97-05fdcbe37141" (UID: "6c99b5e4-de24-426d-9a97-05fdcbe37141"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.914384 4823 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.914532 4823 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-20321045-c94b-408e-81f6-d22070c77447" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-20321045-c94b-408e-81f6-d22070c77447") on node "crc" Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.999018 4823 reconciler_common.go:293] "Volume detached for volume \"pvc-20321045-c94b-408e-81f6-d22070c77447\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-20321045-c94b-408e-81f6-d22070c77447\") on node \"crc\" DevicePath \"\"" Dec 16 09:16:45 crc kubenswrapper[4823]: I1216 09:16:45.999380 4823 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c99b5e4-de24-426d-9a97-05fdcbe37141-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 09:16:46 crc kubenswrapper[4823]: I1216 09:16:46.058106 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_dc75b889-6dc5-462d-a589-50f705ffd78f/ovsdbserver-sb/0.log" Dec 16 09:16:46 crc kubenswrapper[4823]: I1216 09:16:46.058238 4823 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-2" Dec 16 09:16:46 crc kubenswrapper[4823]: I1216 09:16:46.058486 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"dc75b889-6dc5-462d-a589-50f705ffd78f","Type":"ContainerDied","Data":"6fe56092f831488b7e2b9906b1e4c85664acf1c1917e2265341a547fd64e4cdc"} Dec 16 09:16:46 crc kubenswrapper[4823]: I1216 09:16:46.058647 4823 scope.go:117] "RemoveContainer" containerID="6ffc44659c4af61247f8ff482db564b84d825032d9d7789049d8414a5dd9a687" Dec 16 09:16:46 crc kubenswrapper[4823]: I1216 09:16:46.060738 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_7e1d3682-8130-4fa4-aab4-ade2ac069d2e/ovsdbserver-nb/0.log" Dec 16 09:16:46 crc kubenswrapper[4823]: I1216 09:16:46.060812 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"7e1d3682-8130-4fa4-aab4-ade2ac069d2e","Type":"ContainerDied","Data":"58a1ed64c1ec667db0652f3f8c47b70f0df19a28f9ba119a3a4a5c12c49c63d1"} Dec 16 09:16:46 crc kubenswrapper[4823]: I1216 09:16:46.060854 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-1" Dec 16 09:16:46 crc kubenswrapper[4823]: I1216 09:16:46.064668 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_6353e69a-5c31-41c9-9d05-2b958aa6a79f/ovsdbserver-nb/0.log" Dec 16 09:16:46 crc kubenswrapper[4823]: I1216 09:16:46.064764 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"6353e69a-5c31-41c9-9d05-2b958aa6a79f","Type":"ContainerDied","Data":"dcc665d2286cee23a5b5de32e11127b5d6045c3da4ce322ae2938df6e68db2af"} Dec 16 09:16:46 crc kubenswrapper[4823]: I1216 09:16:46.064791 4823 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-2" Dec 16 09:16:46 crc kubenswrapper[4823]: I1216 09:16:46.069449 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-1_6c99b5e4-de24-426d-9a97-05fdcbe37141/ovsdbserver-sb/0.log" Dec 16 09:16:46 crc kubenswrapper[4823]: I1216 09:16:46.069598 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"6c99b5e4-de24-426d-9a97-05fdcbe37141","Type":"ContainerDied","Data":"5bed026074a251bc904029bd61e5b94db39ebac6155e0235d43832673edcc33b"} Dec 16 09:16:46 crc kubenswrapper[4823]: I1216 09:16:46.069751 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-1" Dec 16 09:16:46 crc kubenswrapper[4823]: I1216 09:16:46.091988 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-2"] Dec 16 09:16:46 crc kubenswrapper[4823]: I1216 09:16:46.096216 4823 scope.go:117] "RemoveContainer" containerID="b83530bf76668f8bf22e0d206703a2c7ec87906449e3b0e5fc0196514eab70ab" Dec 16 09:16:46 crc kubenswrapper[4823]: I1216 09:16:46.099158 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-sb-2"] Dec 16 09:16:46 crc kubenswrapper[4823]: I1216 09:16:46.112586 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-1"] Dec 16 09:16:46 crc kubenswrapper[4823]: I1216 09:16:46.121142 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-nb-1"] Dec 16 09:16:46 crc kubenswrapper[4823]: I1216 09:16:46.123362 4823 scope.go:117] "RemoveContainer" containerID="bf3c177aa7f060a204b18065f9ace154c320ecfa12044e459e50aad48defa022" Dec 16 09:16:46 crc kubenswrapper[4823]: I1216 09:16:46.128525 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-2"] Dec 16 09:16:46 crc kubenswrapper[4823]: I1216 09:16:46.136271 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/ovsdbserver-nb-2"] Dec 16 09:16:46 crc kubenswrapper[4823]: I1216 09:16:46.143895 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-1"] Dec 16 09:16:46 crc kubenswrapper[4823]: I1216 09:16:46.149315 4823 scope.go:117] "RemoveContainer" containerID="2430c406e51c479baf9de2cdddd7f35d4cb9b052428eb9894c3134ee623601b5" Dec 16 09:16:46 crc kubenswrapper[4823]: I1216 09:16:46.150705 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-sb-1"] Dec 16 09:16:46 crc kubenswrapper[4823]: I1216 09:16:46.173194 4823 scope.go:117] "RemoveContainer" containerID="b324eb19678c78b1a0e6949df42fbbb0f0364093edcb56f5eafa24f8062aff0e" Dec 16 09:16:46 crc kubenswrapper[4823]: I1216 09:16:46.199325 4823 scope.go:117] "RemoveContainer" containerID="10d23fcf40d14824e8bcbbb27688c846b6ace21845beeab00416051c6361ffe7" Dec 16 09:16:46 crc kubenswrapper[4823]: I1216 09:16:46.216518 4823 scope.go:117] "RemoveContainer" containerID="94d634a132c1bc025be0422b3756b78f98c01350c4a640621b04c9ead2558605" Dec 16 09:16:46 crc kubenswrapper[4823]: I1216 09:16:46.239607 4823 scope.go:117] "RemoveContainer" containerID="d06360657e9e0d4e61ed0bba6b0ba1b231c8900c0ee15d84fd10f3823299aa4b" Dec 16 09:16:47 crc kubenswrapper[4823]: I1216 09:16:47.782646 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6353e69a-5c31-41c9-9d05-2b958aa6a79f" path="/var/lib/kubelet/pods/6353e69a-5c31-41c9-9d05-2b958aa6a79f/volumes" Dec 16 09:16:47 crc kubenswrapper[4823]: I1216 09:16:47.783803 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c99b5e4-de24-426d-9a97-05fdcbe37141" path="/var/lib/kubelet/pods/6c99b5e4-de24-426d-9a97-05fdcbe37141/volumes" Dec 16 09:16:47 crc kubenswrapper[4823]: I1216 09:16:47.800359 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e1d3682-8130-4fa4-aab4-ade2ac069d2e" path="/var/lib/kubelet/pods/7e1d3682-8130-4fa4-aab4-ade2ac069d2e/volumes" Dec 16 
09:16:47 crc kubenswrapper[4823]: I1216 09:16:47.801155 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc75b889-6dc5-462d-a589-50f705ffd78f" path="/var/lib/kubelet/pods/dc75b889-6dc5-462d-a589-50f705ffd78f/volumes" Dec 16 09:16:47 crc kubenswrapper[4823]: I1216 09:16:47.898431 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-668cf9dfbb-flrxh_07fa5706-2a14-40f7-ac5a-cc229d35055d/prometheus-operator/0.log" Dec 16 09:16:48 crc kubenswrapper[4823]: I1216 09:16:48.017325 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-84dcbb7858-k6gks_742e0cec-7370-4a35-90b8-64b2da24c464/prometheus-operator-admission-webhook/0.log" Dec 16 09:16:48 crc kubenswrapper[4823]: I1216 09:16:48.103202 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-84dcbb7858-ln8wq_1ac408b5-8185-47a6-bdd4-33cc8d6906f3/prometheus-operator-admission-webhook/0.log" Dec 16 09:16:48 crc kubenswrapper[4823]: I1216 09:16:48.180308 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-d8bb48f5d-pkkpr_e554ec4f-2a7c-419b-9346-294c8026d503/operator/0.log" Dec 16 09:16:48 crc kubenswrapper[4823]: I1216 09:16:48.263879 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5446b9c989-26zgb_0a30c48a-65a8-4a6d-bf5d-106cb7ce567d/perses-operator/0.log" Dec 16 09:16:50 crc kubenswrapper[4823]: I1216 09:16:50.771675 4823 scope.go:117] "RemoveContainer" containerID="2950c2235803f3753830afebab6d921b7124da1465f4af84f0944895c27c0722" Dec 16 09:16:50 crc kubenswrapper[4823]: E1216 09:16:50.772080 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 09:16:52 crc kubenswrapper[4823]: I1216 09:16:52.675304 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-wtcj2"] Dec 16 09:16:52 crc kubenswrapper[4823]: E1216 09:16:52.675961 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e1d3682-8130-4fa4-aab4-ade2ac069d2e" containerName="openstack-network-exporter" Dec 16 09:16:52 crc kubenswrapper[4823]: I1216 09:16:52.675974 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e1d3682-8130-4fa4-aab4-ade2ac069d2e" containerName="openstack-network-exporter" Dec 16 09:16:52 crc kubenswrapper[4823]: E1216 09:16:52.675991 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc75b889-6dc5-462d-a589-50f705ffd78f" containerName="ovsdbserver-sb" Dec 16 09:16:52 crc kubenswrapper[4823]: I1216 09:16:52.675998 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc75b889-6dc5-462d-a589-50f705ffd78f" containerName="ovsdbserver-sb" Dec 16 09:16:52 crc kubenswrapper[4823]: E1216 09:16:52.676007 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc75b889-6dc5-462d-a589-50f705ffd78f" containerName="openstack-network-exporter" Dec 16 09:16:52 crc kubenswrapper[4823]: I1216 09:16:52.676013 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc75b889-6dc5-462d-a589-50f705ffd78f" containerName="openstack-network-exporter" Dec 16 09:16:52 crc kubenswrapper[4823]: E1216 09:16:52.676057 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c99b5e4-de24-426d-9a97-05fdcbe37141" containerName="openstack-network-exporter" Dec 16 09:16:52 crc kubenswrapper[4823]: I1216 09:16:52.676064 4823 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="6c99b5e4-de24-426d-9a97-05fdcbe37141" containerName="openstack-network-exporter" Dec 16 09:16:52 crc kubenswrapper[4823]: E1216 09:16:52.676074 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c99b5e4-de24-426d-9a97-05fdcbe37141" containerName="ovsdbserver-sb" Dec 16 09:16:52 crc kubenswrapper[4823]: I1216 09:16:52.676079 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c99b5e4-de24-426d-9a97-05fdcbe37141" containerName="ovsdbserver-sb" Dec 16 09:16:52 crc kubenswrapper[4823]: E1216 09:16:52.676088 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6353e69a-5c31-41c9-9d05-2b958aa6a79f" containerName="openstack-network-exporter" Dec 16 09:16:52 crc kubenswrapper[4823]: I1216 09:16:52.676094 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="6353e69a-5c31-41c9-9d05-2b958aa6a79f" containerName="openstack-network-exporter" Dec 16 09:16:52 crc kubenswrapper[4823]: E1216 09:16:52.676105 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e1d3682-8130-4fa4-aab4-ade2ac069d2e" containerName="ovsdbserver-nb" Dec 16 09:16:52 crc kubenswrapper[4823]: I1216 09:16:52.676111 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e1d3682-8130-4fa4-aab4-ade2ac069d2e" containerName="ovsdbserver-nb" Dec 16 09:16:52 crc kubenswrapper[4823]: E1216 09:16:52.676125 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81a453e6-62fe-44b7-bcf8-7cebf1128f04" containerName="collect-profiles" Dec 16 09:16:52 crc kubenswrapper[4823]: I1216 09:16:52.676130 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="81a453e6-62fe-44b7-bcf8-7cebf1128f04" containerName="collect-profiles" Dec 16 09:16:52 crc kubenswrapper[4823]: E1216 09:16:52.676139 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6353e69a-5c31-41c9-9d05-2b958aa6a79f" containerName="ovsdbserver-nb" Dec 16 09:16:52 crc kubenswrapper[4823]: I1216 09:16:52.676144 4823 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="6353e69a-5c31-41c9-9d05-2b958aa6a79f" containerName="ovsdbserver-nb" Dec 16 09:16:52 crc kubenswrapper[4823]: I1216 09:16:52.676295 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc75b889-6dc5-462d-a589-50f705ffd78f" containerName="openstack-network-exporter" Dec 16 09:16:52 crc kubenswrapper[4823]: I1216 09:16:52.676304 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="6353e69a-5c31-41c9-9d05-2b958aa6a79f" containerName="openstack-network-exporter" Dec 16 09:16:52 crc kubenswrapper[4823]: I1216 09:16:52.676318 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="6353e69a-5c31-41c9-9d05-2b958aa6a79f" containerName="ovsdbserver-nb" Dec 16 09:16:52 crc kubenswrapper[4823]: I1216 09:16:52.676328 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c99b5e4-de24-426d-9a97-05fdcbe37141" containerName="ovsdbserver-sb" Dec 16 09:16:52 crc kubenswrapper[4823]: I1216 09:16:52.676336 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c99b5e4-de24-426d-9a97-05fdcbe37141" containerName="openstack-network-exporter" Dec 16 09:16:52 crc kubenswrapper[4823]: I1216 09:16:52.676348 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc75b889-6dc5-462d-a589-50f705ffd78f" containerName="ovsdbserver-sb" Dec 16 09:16:52 crc kubenswrapper[4823]: I1216 09:16:52.676356 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e1d3682-8130-4fa4-aab4-ade2ac069d2e" containerName="openstack-network-exporter" Dec 16 09:16:52 crc kubenswrapper[4823]: I1216 09:16:52.676365 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e1d3682-8130-4fa4-aab4-ade2ac069d2e" containerName="ovsdbserver-nb" Dec 16 09:16:52 crc kubenswrapper[4823]: I1216 09:16:52.676376 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="81a453e6-62fe-44b7-bcf8-7cebf1128f04" containerName="collect-profiles" Dec 16 09:16:52 crc 
kubenswrapper[4823]: I1216 09:16:52.677347 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wtcj2" Dec 16 09:16:52 crc kubenswrapper[4823]: I1216 09:16:52.696284 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/786abce2-7486-464a-9d32-c519de82dfbc-catalog-content\") pod \"redhat-marketplace-wtcj2\" (UID: \"786abce2-7486-464a-9d32-c519de82dfbc\") " pod="openshift-marketplace/redhat-marketplace-wtcj2" Dec 16 09:16:52 crc kubenswrapper[4823]: I1216 09:16:52.696507 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rlkd\" (UniqueName: \"kubernetes.io/projected/786abce2-7486-464a-9d32-c519de82dfbc-kube-api-access-8rlkd\") pod \"redhat-marketplace-wtcj2\" (UID: \"786abce2-7486-464a-9d32-c519de82dfbc\") " pod="openshift-marketplace/redhat-marketplace-wtcj2" Dec 16 09:16:52 crc kubenswrapper[4823]: I1216 09:16:52.696612 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/786abce2-7486-464a-9d32-c519de82dfbc-utilities\") pod \"redhat-marketplace-wtcj2\" (UID: \"786abce2-7486-464a-9d32-c519de82dfbc\") " pod="openshift-marketplace/redhat-marketplace-wtcj2" Dec 16 09:16:52 crc kubenswrapper[4823]: I1216 09:16:52.701313 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wtcj2"] Dec 16 09:16:52 crc kubenswrapper[4823]: I1216 09:16:52.798344 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rlkd\" (UniqueName: \"kubernetes.io/projected/786abce2-7486-464a-9d32-c519de82dfbc-kube-api-access-8rlkd\") pod \"redhat-marketplace-wtcj2\" (UID: \"786abce2-7486-464a-9d32-c519de82dfbc\") " pod="openshift-marketplace/redhat-marketplace-wtcj2" Dec 16 
09:16:52 crc kubenswrapper[4823]: I1216 09:16:52.798404 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/786abce2-7486-464a-9d32-c519de82dfbc-utilities\") pod \"redhat-marketplace-wtcj2\" (UID: \"786abce2-7486-464a-9d32-c519de82dfbc\") " pod="openshift-marketplace/redhat-marketplace-wtcj2" Dec 16 09:16:52 crc kubenswrapper[4823]: I1216 09:16:52.798983 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/786abce2-7486-464a-9d32-c519de82dfbc-utilities\") pod \"redhat-marketplace-wtcj2\" (UID: \"786abce2-7486-464a-9d32-c519de82dfbc\") " pod="openshift-marketplace/redhat-marketplace-wtcj2" Dec 16 09:16:52 crc kubenswrapper[4823]: I1216 09:16:52.799050 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/786abce2-7486-464a-9d32-c519de82dfbc-catalog-content\") pod \"redhat-marketplace-wtcj2\" (UID: \"786abce2-7486-464a-9d32-c519de82dfbc\") " pod="openshift-marketplace/redhat-marketplace-wtcj2" Dec 16 09:16:52 crc kubenswrapper[4823]: I1216 09:16:52.799058 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/786abce2-7486-464a-9d32-c519de82dfbc-catalog-content\") pod \"redhat-marketplace-wtcj2\" (UID: \"786abce2-7486-464a-9d32-c519de82dfbc\") " pod="openshift-marketplace/redhat-marketplace-wtcj2" Dec 16 09:16:52 crc kubenswrapper[4823]: I1216 09:16:52.833531 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rlkd\" (UniqueName: \"kubernetes.io/projected/786abce2-7486-464a-9d32-c519de82dfbc-kube-api-access-8rlkd\") pod \"redhat-marketplace-wtcj2\" (UID: \"786abce2-7486-464a-9d32-c519de82dfbc\") " pod="openshift-marketplace/redhat-marketplace-wtcj2" Dec 16 09:16:52 crc kubenswrapper[4823]: I1216 09:16:52.997581 4823 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wtcj2" Dec 16 09:16:53 crc kubenswrapper[4823]: I1216 09:16:53.489440 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wtcj2"] Dec 16 09:16:54 crc kubenswrapper[4823]: I1216 09:16:54.167110 4823 generic.go:334] "Generic (PLEG): container finished" podID="786abce2-7486-464a-9d32-c519de82dfbc" containerID="8ebe642a8460ae1af9aee44aee44776e4919b9a52cea0fadcaac0bd7189352a1" exitCode=0 Dec 16 09:16:54 crc kubenswrapper[4823]: I1216 09:16:54.167260 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wtcj2" event={"ID":"786abce2-7486-464a-9d32-c519de82dfbc","Type":"ContainerDied","Data":"8ebe642a8460ae1af9aee44aee44776e4919b9a52cea0fadcaac0bd7189352a1"} Dec 16 09:16:54 crc kubenswrapper[4823]: I1216 09:16:54.167508 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wtcj2" event={"ID":"786abce2-7486-464a-9d32-c519de82dfbc","Type":"ContainerStarted","Data":"aba774fbaa85cf4d75f9d90f5738cbd76f321e8fd8e259729e9852b9e82ba945"} Dec 16 09:16:54 crc kubenswrapper[4823]: I1216 09:16:54.171100 4823 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 16 09:16:56 crc kubenswrapper[4823]: I1216 09:16:56.189709 4823 generic.go:334] "Generic (PLEG): container finished" podID="786abce2-7486-464a-9d32-c519de82dfbc" containerID="f7861f64003001d17103aefeff8bf97fa6b0ef0bfab7041b878cce32d875e402" exitCode=0 Dec 16 09:16:56 crc kubenswrapper[4823]: I1216 09:16:56.189998 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wtcj2" event={"ID":"786abce2-7486-464a-9d32-c519de82dfbc","Type":"ContainerDied","Data":"f7861f64003001d17103aefeff8bf97fa6b0ef0bfab7041b878cce32d875e402"} Dec 16 09:16:58 crc kubenswrapper[4823]: I1216 09:16:58.212683 
4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wtcj2" event={"ID":"786abce2-7486-464a-9d32-c519de82dfbc","Type":"ContainerStarted","Data":"c1ce8c4369f82b940965bc7f7b182e1c1ec011579b33643ae678794c0f9571f1"} Dec 16 09:16:58 crc kubenswrapper[4823]: I1216 09:16:58.239966 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-wtcj2" podStartSLOduration=3.160517774 podStartE2EDuration="6.239944752s" podCreationTimestamp="2025-12-16 09:16:52 +0000 UTC" firstStartedPulling="2025-12-16 09:16:54.170480017 +0000 UTC m=+8492.659046180" lastFinishedPulling="2025-12-16 09:16:57.249906995 +0000 UTC m=+8495.738473158" observedRunningTime="2025-12-16 09:16:58.232900021 +0000 UTC m=+8496.721466184" watchObservedRunningTime="2025-12-16 09:16:58.239944752 +0000 UTC m=+8496.728510875" Dec 16 09:17:01 crc kubenswrapper[4823]: I1216 09:17:01.778210 4823 scope.go:117] "RemoveContainer" containerID="2950c2235803f3753830afebab6d921b7124da1465f4af84f0944895c27c0722" Dec 16 09:17:01 crc kubenswrapper[4823]: E1216 09:17:01.778825 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3" Dec 16 09:17:02 crc kubenswrapper[4823]: I1216 09:17:02.999300 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-wtcj2" Dec 16 09:17:03 crc kubenswrapper[4823]: I1216 09:17:03.000269 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-wtcj2" Dec 16 09:17:03 crc kubenswrapper[4823]: I1216 
09:17:03.058529 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-wtcj2" Dec 16 09:17:03 crc kubenswrapper[4823]: I1216 09:17:03.329681 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-wtcj2" Dec 16 09:17:03 crc kubenswrapper[4823]: I1216 09:17:03.403406 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wtcj2"] Dec 16 09:17:05 crc kubenswrapper[4823]: I1216 09:17:05.281839 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-wtcj2" podUID="786abce2-7486-464a-9d32-c519de82dfbc" containerName="registry-server" containerID="cri-o://c1ce8c4369f82b940965bc7f7b182e1c1ec011579b33643ae678794c0f9571f1" gracePeriod=2 Dec 16 09:17:05 crc kubenswrapper[4823]: I1216 09:17:05.807754 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wtcj2" Dec 16 09:17:05 crc kubenswrapper[4823]: I1216 09:17:05.841344 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/786abce2-7486-464a-9d32-c519de82dfbc-catalog-content\") pod \"786abce2-7486-464a-9d32-c519de82dfbc\" (UID: \"786abce2-7486-464a-9d32-c519de82dfbc\") " Dec 16 09:17:05 crc kubenswrapper[4823]: I1216 09:17:05.841419 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8rlkd\" (UniqueName: \"kubernetes.io/projected/786abce2-7486-464a-9d32-c519de82dfbc-kube-api-access-8rlkd\") pod \"786abce2-7486-464a-9d32-c519de82dfbc\" (UID: \"786abce2-7486-464a-9d32-c519de82dfbc\") " Dec 16 09:17:05 crc kubenswrapper[4823]: I1216 09:17:05.841450 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/786abce2-7486-464a-9d32-c519de82dfbc-utilities\") pod \"786abce2-7486-464a-9d32-c519de82dfbc\" (UID: \"786abce2-7486-464a-9d32-c519de82dfbc\") " Dec 16 09:17:05 crc kubenswrapper[4823]: I1216 09:17:05.842562 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/786abce2-7486-464a-9d32-c519de82dfbc-utilities" (OuterVolumeSpecName: "utilities") pod "786abce2-7486-464a-9d32-c519de82dfbc" (UID: "786abce2-7486-464a-9d32-c519de82dfbc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 09:17:05 crc kubenswrapper[4823]: I1216 09:17:05.854268 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/786abce2-7486-464a-9d32-c519de82dfbc-kube-api-access-8rlkd" (OuterVolumeSpecName: "kube-api-access-8rlkd") pod "786abce2-7486-464a-9d32-c519de82dfbc" (UID: "786abce2-7486-464a-9d32-c519de82dfbc"). InnerVolumeSpecName "kube-api-access-8rlkd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:17:05 crc kubenswrapper[4823]: I1216 09:17:05.881047 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/786abce2-7486-464a-9d32-c519de82dfbc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "786abce2-7486-464a-9d32-c519de82dfbc" (UID: "786abce2-7486-464a-9d32-c519de82dfbc"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 09:17:05 crc kubenswrapper[4823]: I1216 09:17:05.943176 4823 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/786abce2-7486-464a-9d32-c519de82dfbc-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 09:17:05 crc kubenswrapper[4823]: I1216 09:17:05.943214 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8rlkd\" (UniqueName: \"kubernetes.io/projected/786abce2-7486-464a-9d32-c519de82dfbc-kube-api-access-8rlkd\") on node \"crc\" DevicePath \"\"" Dec 16 09:17:05 crc kubenswrapper[4823]: I1216 09:17:05.943231 4823 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/786abce2-7486-464a-9d32-c519de82dfbc-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 09:17:06 crc kubenswrapper[4823]: I1216 09:17:06.292671 4823 generic.go:334] "Generic (PLEG): container finished" podID="786abce2-7486-464a-9d32-c519de82dfbc" containerID="c1ce8c4369f82b940965bc7f7b182e1c1ec011579b33643ae678794c0f9571f1" exitCode=0 Dec 16 09:17:06 crc kubenswrapper[4823]: I1216 09:17:06.292732 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wtcj2" event={"ID":"786abce2-7486-464a-9d32-c519de82dfbc","Type":"ContainerDied","Data":"c1ce8c4369f82b940965bc7f7b182e1c1ec011579b33643ae678794c0f9571f1"} Dec 16 09:17:06 crc kubenswrapper[4823]: I1216 09:17:06.292771 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wtcj2" event={"ID":"786abce2-7486-464a-9d32-c519de82dfbc","Type":"ContainerDied","Data":"aba774fbaa85cf4d75f9d90f5738cbd76f321e8fd8e259729e9852b9e82ba945"} Dec 16 09:17:06 crc kubenswrapper[4823]: I1216 09:17:06.292799 4823 scope.go:117] "RemoveContainer" containerID="c1ce8c4369f82b940965bc7f7b182e1c1ec011579b33643ae678794c0f9571f1" Dec 16 09:17:06 crc kubenswrapper[4823]: I1216 
09:17:06.292977 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wtcj2" Dec 16 09:17:06 crc kubenswrapper[4823]: I1216 09:17:06.313142 4823 scope.go:117] "RemoveContainer" containerID="f7861f64003001d17103aefeff8bf97fa6b0ef0bfab7041b878cce32d875e402" Dec 16 09:17:06 crc kubenswrapper[4823]: I1216 09:17:06.339624 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wtcj2"] Dec 16 09:17:06 crc kubenswrapper[4823]: I1216 09:17:06.351127 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-wtcj2"] Dec 16 09:17:06 crc kubenswrapper[4823]: I1216 09:17:06.361520 4823 scope.go:117] "RemoveContainer" containerID="8ebe642a8460ae1af9aee44aee44776e4919b9a52cea0fadcaac0bd7189352a1" Dec 16 09:17:06 crc kubenswrapper[4823]: I1216 09:17:06.377704 4823 scope.go:117] "RemoveContainer" containerID="c1ce8c4369f82b940965bc7f7b182e1c1ec011579b33643ae678794c0f9571f1" Dec 16 09:17:06 crc kubenswrapper[4823]: E1216 09:17:06.378182 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1ce8c4369f82b940965bc7f7b182e1c1ec011579b33643ae678794c0f9571f1\": container with ID starting with c1ce8c4369f82b940965bc7f7b182e1c1ec011579b33643ae678794c0f9571f1 not found: ID does not exist" containerID="c1ce8c4369f82b940965bc7f7b182e1c1ec011579b33643ae678794c0f9571f1" Dec 16 09:17:06 crc kubenswrapper[4823]: I1216 09:17:06.378224 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1ce8c4369f82b940965bc7f7b182e1c1ec011579b33643ae678794c0f9571f1"} err="failed to get container status \"c1ce8c4369f82b940965bc7f7b182e1c1ec011579b33643ae678794c0f9571f1\": rpc error: code = NotFound desc = could not find container \"c1ce8c4369f82b940965bc7f7b182e1c1ec011579b33643ae678794c0f9571f1\": container with ID starting with 
c1ce8c4369f82b940965bc7f7b182e1c1ec011579b33643ae678794c0f9571f1 not found: ID does not exist" Dec 16 09:17:06 crc kubenswrapper[4823]: I1216 09:17:06.378249 4823 scope.go:117] "RemoveContainer" containerID="f7861f64003001d17103aefeff8bf97fa6b0ef0bfab7041b878cce32d875e402" Dec 16 09:17:06 crc kubenswrapper[4823]: E1216 09:17:06.378721 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7861f64003001d17103aefeff8bf97fa6b0ef0bfab7041b878cce32d875e402\": container with ID starting with f7861f64003001d17103aefeff8bf97fa6b0ef0bfab7041b878cce32d875e402 not found: ID does not exist" containerID="f7861f64003001d17103aefeff8bf97fa6b0ef0bfab7041b878cce32d875e402" Dec 16 09:17:06 crc kubenswrapper[4823]: I1216 09:17:06.378746 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7861f64003001d17103aefeff8bf97fa6b0ef0bfab7041b878cce32d875e402"} err="failed to get container status \"f7861f64003001d17103aefeff8bf97fa6b0ef0bfab7041b878cce32d875e402\": rpc error: code = NotFound desc = could not find container \"f7861f64003001d17103aefeff8bf97fa6b0ef0bfab7041b878cce32d875e402\": container with ID starting with f7861f64003001d17103aefeff8bf97fa6b0ef0bfab7041b878cce32d875e402 not found: ID does not exist" Dec 16 09:17:06 crc kubenswrapper[4823]: I1216 09:17:06.378764 4823 scope.go:117] "RemoveContainer" containerID="8ebe642a8460ae1af9aee44aee44776e4919b9a52cea0fadcaac0bd7189352a1" Dec 16 09:17:06 crc kubenswrapper[4823]: E1216 09:17:06.379167 4823 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ebe642a8460ae1af9aee44aee44776e4919b9a52cea0fadcaac0bd7189352a1\": container with ID starting with 8ebe642a8460ae1af9aee44aee44776e4919b9a52cea0fadcaac0bd7189352a1 not found: ID does not exist" containerID="8ebe642a8460ae1af9aee44aee44776e4919b9a52cea0fadcaac0bd7189352a1" Dec 16 09:17:06 crc 
kubenswrapper[4823]: I1216 09:17:06.379209 4823 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ebe642a8460ae1af9aee44aee44776e4919b9a52cea0fadcaac0bd7189352a1"} err="failed to get container status \"8ebe642a8460ae1af9aee44aee44776e4919b9a52cea0fadcaac0bd7189352a1\": rpc error: code = NotFound desc = could not find container \"8ebe642a8460ae1af9aee44aee44776e4919b9a52cea0fadcaac0bd7189352a1\": container with ID starting with 8ebe642a8460ae1af9aee44aee44776e4919b9a52cea0fadcaac0bd7189352a1 not found: ID does not exist" Dec 16 09:17:07 crc kubenswrapper[4823]: I1216 09:17:07.791562 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="786abce2-7486-464a-9d32-c519de82dfbc" path="/var/lib/kubelet/pods/786abce2-7486-464a-9d32-c519de82dfbc/volumes" Dec 16 09:17:13 crc kubenswrapper[4823]: I1216 09:17:13.633751 4823 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-pjmmh"] Dec 16 09:17:13 crc kubenswrapper[4823]: E1216 09:17:13.634840 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="786abce2-7486-464a-9d32-c519de82dfbc" containerName="extract-utilities" Dec 16 09:17:13 crc kubenswrapper[4823]: I1216 09:17:13.634862 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="786abce2-7486-464a-9d32-c519de82dfbc" containerName="extract-utilities" Dec 16 09:17:13 crc kubenswrapper[4823]: E1216 09:17:13.634901 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="786abce2-7486-464a-9d32-c519de82dfbc" containerName="registry-server" Dec 16 09:17:13 crc kubenswrapper[4823]: I1216 09:17:13.634913 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="786abce2-7486-464a-9d32-c519de82dfbc" containerName="registry-server" Dec 16 09:17:13 crc kubenswrapper[4823]: E1216 09:17:13.634933 4823 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="786abce2-7486-464a-9d32-c519de82dfbc" containerName="extract-content" Dec 16 
09:17:13 crc kubenswrapper[4823]: I1216 09:17:13.634943 4823 state_mem.go:107] "Deleted CPUSet assignment" podUID="786abce2-7486-464a-9d32-c519de82dfbc" containerName="extract-content"
Dec 16 09:17:13 crc kubenswrapper[4823]: I1216 09:17:13.635192 4823 memory_manager.go:354] "RemoveStaleState removing state" podUID="786abce2-7486-464a-9d32-c519de82dfbc" containerName="registry-server"
Dec 16 09:17:13 crc kubenswrapper[4823]: I1216 09:17:13.637901 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pjmmh"
Dec 16 09:17:13 crc kubenswrapper[4823]: I1216 09:17:13.650638 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pjmmh"]
Dec 16 09:17:13 crc kubenswrapper[4823]: I1216 09:17:13.680635 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t45b5\" (UniqueName: \"kubernetes.io/projected/8905c971-c3ea-4619-83b2-0aa95bca4bcd-kube-api-access-t45b5\") pod \"redhat-operators-pjmmh\" (UID: \"8905c971-c3ea-4619-83b2-0aa95bca4bcd\") " pod="openshift-marketplace/redhat-operators-pjmmh"
Dec 16 09:17:13 crc kubenswrapper[4823]: I1216 09:17:13.680758 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8905c971-c3ea-4619-83b2-0aa95bca4bcd-catalog-content\") pod \"redhat-operators-pjmmh\" (UID: \"8905c971-c3ea-4619-83b2-0aa95bca4bcd\") " pod="openshift-marketplace/redhat-operators-pjmmh"
Dec 16 09:17:13 crc kubenswrapper[4823]: I1216 09:17:13.680834 4823 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8905c971-c3ea-4619-83b2-0aa95bca4bcd-utilities\") pod \"redhat-operators-pjmmh\" (UID: \"8905c971-c3ea-4619-83b2-0aa95bca4bcd\") " pod="openshift-marketplace/redhat-operators-pjmmh"
Dec 16 09:17:13 crc kubenswrapper[4823]: I1216 09:17:13.782212 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8905c971-c3ea-4619-83b2-0aa95bca4bcd-utilities\") pod \"redhat-operators-pjmmh\" (UID: \"8905c971-c3ea-4619-83b2-0aa95bca4bcd\") " pod="openshift-marketplace/redhat-operators-pjmmh"
Dec 16 09:17:13 crc kubenswrapper[4823]: I1216 09:17:13.782736 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8905c971-c3ea-4619-83b2-0aa95bca4bcd-utilities\") pod \"redhat-operators-pjmmh\" (UID: \"8905c971-c3ea-4619-83b2-0aa95bca4bcd\") " pod="openshift-marketplace/redhat-operators-pjmmh"
Dec 16 09:17:13 crc kubenswrapper[4823]: I1216 09:17:13.783712 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t45b5\" (UniqueName: \"kubernetes.io/projected/8905c971-c3ea-4619-83b2-0aa95bca4bcd-kube-api-access-t45b5\") pod \"redhat-operators-pjmmh\" (UID: \"8905c971-c3ea-4619-83b2-0aa95bca4bcd\") " pod="openshift-marketplace/redhat-operators-pjmmh"
Dec 16 09:17:13 crc kubenswrapper[4823]: I1216 09:17:13.783817 4823 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8905c971-c3ea-4619-83b2-0aa95bca4bcd-catalog-content\") pod \"redhat-operators-pjmmh\" (UID: \"8905c971-c3ea-4619-83b2-0aa95bca4bcd\") " pod="openshift-marketplace/redhat-operators-pjmmh"
Dec 16 09:17:13 crc kubenswrapper[4823]: I1216 09:17:13.784287 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8905c971-c3ea-4619-83b2-0aa95bca4bcd-catalog-content\") pod \"redhat-operators-pjmmh\" (UID: \"8905c971-c3ea-4619-83b2-0aa95bca4bcd\") " pod="openshift-marketplace/redhat-operators-pjmmh"
Dec 16 09:17:13 crc kubenswrapper[4823]: I1216 09:17:13.809646 4823 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t45b5\" (UniqueName: \"kubernetes.io/projected/8905c971-c3ea-4619-83b2-0aa95bca4bcd-kube-api-access-t45b5\") pod \"redhat-operators-pjmmh\" (UID: \"8905c971-c3ea-4619-83b2-0aa95bca4bcd\") " pod="openshift-marketplace/redhat-operators-pjmmh"
Dec 16 09:17:13 crc kubenswrapper[4823]: I1216 09:17:13.970725 4823 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pjmmh"
Dec 16 09:17:14 crc kubenswrapper[4823]: I1216 09:17:14.419905 4823 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pjmmh"]
Dec 16 09:17:15 crc kubenswrapper[4823]: I1216 09:17:15.387658 4823 generic.go:334] "Generic (PLEG): container finished" podID="8905c971-c3ea-4619-83b2-0aa95bca4bcd" containerID="aca1bfd490cd8202c7e6bdadc26cf4db110150abe32089693be015f4a97aef23" exitCode=0
Dec 16 09:17:15 crc kubenswrapper[4823]: I1216 09:17:15.389533 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pjmmh" event={"ID":"8905c971-c3ea-4619-83b2-0aa95bca4bcd","Type":"ContainerDied","Data":"aca1bfd490cd8202c7e6bdadc26cf4db110150abe32089693be015f4a97aef23"}
Dec 16 09:17:15 crc kubenswrapper[4823]: I1216 09:17:15.391074 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pjmmh" event={"ID":"8905c971-c3ea-4619-83b2-0aa95bca4bcd","Type":"ContainerStarted","Data":"c0aec82a43eafdf10a5bb0f1636a0ff1a8d468dd4b191f24a2e1df143d3b9f20"}
Dec 16 09:17:15 crc kubenswrapper[4823]: I1216 09:17:15.773492 4823 scope.go:117] "RemoveContainer" containerID="2950c2235803f3753830afebab6d921b7124da1465f4af84f0944895c27c0722"
Dec 16 09:17:15 crc kubenswrapper[4823]: E1216 09:17:15.774715 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3"
Dec 16 09:17:17 crc kubenswrapper[4823]: I1216 09:17:17.415821 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pjmmh" event={"ID":"8905c971-c3ea-4619-83b2-0aa95bca4bcd","Type":"ContainerStarted","Data":"cc5e8e3876cba8b6c6535a075d7578005bafa4c2333a399e8a3bd527e948f61e"}
Dec 16 09:17:20 crc kubenswrapper[4823]: I1216 09:17:20.447208 4823 generic.go:334] "Generic (PLEG): container finished" podID="8905c971-c3ea-4619-83b2-0aa95bca4bcd" containerID="cc5e8e3876cba8b6c6535a075d7578005bafa4c2333a399e8a3bd527e948f61e" exitCode=0
Dec 16 09:17:20 crc kubenswrapper[4823]: I1216 09:17:20.447304 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pjmmh" event={"ID":"8905c971-c3ea-4619-83b2-0aa95bca4bcd","Type":"ContainerDied","Data":"cc5e8e3876cba8b6c6535a075d7578005bafa4c2333a399e8a3bd527e948f61e"}
Dec 16 09:17:21 crc kubenswrapper[4823]: I1216 09:17:21.458094 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pjmmh" event={"ID":"8905c971-c3ea-4619-83b2-0aa95bca4bcd","Type":"ContainerStarted","Data":"d3ab389c00b2bfa120a86269a9da97722eeb199970c3024cbd9be9234b3febd8"}
Dec 16 09:17:21 crc kubenswrapper[4823]: I1216 09:17:21.485038 4823 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-pjmmh" podStartSLOduration=2.872280488 podStartE2EDuration="8.485002161s" podCreationTimestamp="2025-12-16 09:17:13 +0000 UTC" firstStartedPulling="2025-12-16 09:17:15.391877563 +0000 UTC m=+8513.880443686" lastFinishedPulling="2025-12-16 09:17:21.004599226 +0000 UTC m=+8519.493165359" observedRunningTime="2025-12-16 09:17:21.480953635 +0000 UTC m=+8519.969519778" watchObservedRunningTime="2025-12-16 09:17:21.485002161 +0000 UTC m=+8519.973568294"
Dec 16 09:17:23 crc kubenswrapper[4823]: I1216 09:17:23.971095 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-pjmmh"
Dec 16 09:17:23 crc kubenswrapper[4823]: I1216 09:17:23.971487 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-pjmmh"
Dec 16 09:17:25 crc kubenswrapper[4823]: I1216 09:17:25.016906 4823 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-pjmmh" podUID="8905c971-c3ea-4619-83b2-0aa95bca4bcd" containerName="registry-server" probeResult="failure" output=<
Dec 16 09:17:25 crc kubenswrapper[4823]: timeout: failed to connect service ":50051" within 1s
Dec 16 09:17:25 crc kubenswrapper[4823]: >
Dec 16 09:17:28 crc kubenswrapper[4823]: I1216 09:17:28.772589 4823 scope.go:117] "RemoveContainer" containerID="2950c2235803f3753830afebab6d921b7124da1465f4af84f0944895c27c0722"
Dec 16 09:17:28 crc kubenswrapper[4823]: E1216 09:17:28.773388 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3"
Dec 16 09:17:34 crc kubenswrapper[4823]: I1216 09:17:34.050867 4823 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-pjmmh"
Dec 16 09:17:34 crc kubenswrapper[4823]: I1216 09:17:34.120095 4823 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-pjmmh"
Dec 16 09:17:34 crc kubenswrapper[4823]: I1216 09:17:34.306072 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pjmmh"]
Dec 16 09:17:35 crc kubenswrapper[4823]: I1216 09:17:35.586554 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-pjmmh" podUID="8905c971-c3ea-4619-83b2-0aa95bca4bcd" containerName="registry-server" containerID="cri-o://d3ab389c00b2bfa120a86269a9da97722eeb199970c3024cbd9be9234b3febd8" gracePeriod=2
Dec 16 09:17:37 crc kubenswrapper[4823]: I1216 09:17:37.607495 4823 generic.go:334] "Generic (PLEG): container finished" podID="8905c971-c3ea-4619-83b2-0aa95bca4bcd" containerID="d3ab389c00b2bfa120a86269a9da97722eeb199970c3024cbd9be9234b3febd8" exitCode=0
Dec 16 09:17:37 crc kubenswrapper[4823]: I1216 09:17:37.607542 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pjmmh" event={"ID":"8905c971-c3ea-4619-83b2-0aa95bca4bcd","Type":"ContainerDied","Data":"d3ab389c00b2bfa120a86269a9da97722eeb199970c3024cbd9be9234b3febd8"}
Dec 16 09:17:37 crc kubenswrapper[4823]: I1216 09:17:37.846724 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pjmmh"
Dec 16 09:17:37 crc kubenswrapper[4823]: I1216 09:17:37.990646 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8905c971-c3ea-4619-83b2-0aa95bca4bcd-catalog-content\") pod \"8905c971-c3ea-4619-83b2-0aa95bca4bcd\" (UID: \"8905c971-c3ea-4619-83b2-0aa95bca4bcd\") "
Dec 16 09:17:37 crc kubenswrapper[4823]: I1216 09:17:37.990754 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t45b5\" (UniqueName: \"kubernetes.io/projected/8905c971-c3ea-4619-83b2-0aa95bca4bcd-kube-api-access-t45b5\") pod \"8905c971-c3ea-4619-83b2-0aa95bca4bcd\" (UID: \"8905c971-c3ea-4619-83b2-0aa95bca4bcd\") "
Dec 16 09:17:37 crc kubenswrapper[4823]: I1216 09:17:37.990818 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8905c971-c3ea-4619-83b2-0aa95bca4bcd-utilities\") pod \"8905c971-c3ea-4619-83b2-0aa95bca4bcd\" (UID: \"8905c971-c3ea-4619-83b2-0aa95bca4bcd\") "
Dec 16 09:17:37 crc kubenswrapper[4823]: I1216 09:17:37.991785 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8905c971-c3ea-4619-83b2-0aa95bca4bcd-utilities" (OuterVolumeSpecName: "utilities") pod "8905c971-c3ea-4619-83b2-0aa95bca4bcd" (UID: "8905c971-c3ea-4619-83b2-0aa95bca4bcd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 16 09:17:37 crc kubenswrapper[4823]: I1216 09:17:37.999234 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8905c971-c3ea-4619-83b2-0aa95bca4bcd-kube-api-access-t45b5" (OuterVolumeSpecName: "kube-api-access-t45b5") pod "8905c971-c3ea-4619-83b2-0aa95bca4bcd" (UID: "8905c971-c3ea-4619-83b2-0aa95bca4bcd"). InnerVolumeSpecName "kube-api-access-t45b5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 09:17:38 crc kubenswrapper[4823]: I1216 09:17:38.093184 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t45b5\" (UniqueName: \"kubernetes.io/projected/8905c971-c3ea-4619-83b2-0aa95bca4bcd-kube-api-access-t45b5\") on node \"crc\" DevicePath \"\""
Dec 16 09:17:38 crc kubenswrapper[4823]: I1216 09:17:38.093236 4823 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8905c971-c3ea-4619-83b2-0aa95bca4bcd-utilities\") on node \"crc\" DevicePath \"\""
Dec 16 09:17:38 crc kubenswrapper[4823]: I1216 09:17:38.123538 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8905c971-c3ea-4619-83b2-0aa95bca4bcd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8905c971-c3ea-4619-83b2-0aa95bca4bcd" (UID: "8905c971-c3ea-4619-83b2-0aa95bca4bcd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 16 09:17:38 crc kubenswrapper[4823]: I1216 09:17:38.195349 4823 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8905c971-c3ea-4619-83b2-0aa95bca4bcd-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 16 09:17:38 crc kubenswrapper[4823]: I1216 09:17:38.616558 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pjmmh" event={"ID":"8905c971-c3ea-4619-83b2-0aa95bca4bcd","Type":"ContainerDied","Data":"c0aec82a43eafdf10a5bb0f1636a0ff1a8d468dd4b191f24a2e1df143d3b9f20"}
Dec 16 09:17:38 crc kubenswrapper[4823]: I1216 09:17:38.616615 4823 scope.go:117] "RemoveContainer" containerID="d3ab389c00b2bfa120a86269a9da97722eeb199970c3024cbd9be9234b3febd8"
Dec 16 09:17:38 crc kubenswrapper[4823]: I1216 09:17:38.616618 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pjmmh"
Dec 16 09:17:38 crc kubenswrapper[4823]: I1216 09:17:38.646656 4823 scope.go:117] "RemoveContainer" containerID="cc5e8e3876cba8b6c6535a075d7578005bafa4c2333a399e8a3bd527e948f61e"
Dec 16 09:17:38 crc kubenswrapper[4823]: I1216 09:17:38.664303 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pjmmh"]
Dec 16 09:17:38 crc kubenswrapper[4823]: I1216 09:17:38.669527 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-pjmmh"]
Dec 16 09:17:38 crc kubenswrapper[4823]: I1216 09:17:38.682738 4823 scope.go:117] "RemoveContainer" containerID="aca1bfd490cd8202c7e6bdadc26cf4db110150abe32089693be015f4a97aef23"
Dec 16 09:17:38 crc kubenswrapper[4823]: E1216 09:17:38.746655 4823 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8905c971_c3ea_4619_83b2_0aa95bca4bcd.slice\": RecentStats: unable to find data in memory cache]"
Dec 16 09:17:39 crc kubenswrapper[4823]: I1216 09:17:39.785188 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8905c971-c3ea-4619-83b2-0aa95bca4bcd" path="/var/lib/kubelet/pods/8905c971-c3ea-4619-83b2-0aa95bca4bcd/volumes"
Dec 16 09:17:43 crc kubenswrapper[4823]: I1216 09:17:43.771974 4823 scope.go:117] "RemoveContainer" containerID="2950c2235803f3753830afebab6d921b7124da1465f4af84f0944895c27c0722"
Dec 16 09:17:43 crc kubenswrapper[4823]: E1216 09:17:43.774240 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3"
Dec 16 09:17:45 crc kubenswrapper[4823]: I1216 09:17:45.686751 4823 generic.go:334] "Generic (PLEG): container finished" podID="ee145ea2-ab11-4150-b064-795d83f416f4" containerID="b5fda3c4174361f11995c4c15e9f756265e6960dc5f08034b421e772e8164beb" exitCode=0
Dec 16 09:17:45 crc kubenswrapper[4823]: I1216 09:17:45.686867 4823 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vlg4h/must-gather-fwxrv" event={"ID":"ee145ea2-ab11-4150-b064-795d83f416f4","Type":"ContainerDied","Data":"b5fda3c4174361f11995c4c15e9f756265e6960dc5f08034b421e772e8164beb"}
Dec 16 09:17:45 crc kubenswrapper[4823]: I1216 09:17:45.687465 4823 scope.go:117] "RemoveContainer" containerID="b5fda3c4174361f11995c4c15e9f756265e6960dc5f08034b421e772e8164beb"
Dec 16 09:17:45 crc kubenswrapper[4823]: I1216 09:17:45.839225 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-vlg4h_must-gather-fwxrv_ee145ea2-ab11-4150-b064-795d83f416f4/gather/0.log"
Dec 16 09:17:52 crc kubenswrapper[4823]: I1216 09:17:52.631856 4823 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-vlg4h/must-gather-fwxrv"]
Dec 16 09:17:52 crc kubenswrapper[4823]: I1216 09:17:52.632581 4823 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-vlg4h/must-gather-fwxrv" podUID="ee145ea2-ab11-4150-b064-795d83f416f4" containerName="copy" containerID="cri-o://5a3e99951950de22896958abf8332090de40f4e97d792a7902e4befcf3094dc0" gracePeriod=2
Dec 16 09:17:52 crc kubenswrapper[4823]: I1216 09:17:52.648148 4823 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-vlg4h/must-gather-fwxrv"]
Dec 16 09:17:52 crc kubenswrapper[4823]: I1216 09:17:52.756696 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-vlg4h_must-gather-fwxrv_ee145ea2-ab11-4150-b064-795d83f416f4/copy/0.log"
Dec 16 09:17:52 crc kubenswrapper[4823]: I1216 09:17:52.758056 4823 generic.go:334] "Generic (PLEG): container finished" podID="ee145ea2-ab11-4150-b064-795d83f416f4" containerID="5a3e99951950de22896958abf8332090de40f4e97d792a7902e4befcf3094dc0" exitCode=143
Dec 16 09:17:52 crc kubenswrapper[4823]: I1216 09:17:52.975653 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-vlg4h_must-gather-fwxrv_ee145ea2-ab11-4150-b064-795d83f416f4/copy/0.log"
Dec 16 09:17:52 crc kubenswrapper[4823]: I1216 09:17:52.976117 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vlg4h/must-gather-fwxrv"
Dec 16 09:17:53 crc kubenswrapper[4823]: I1216 09:17:53.056172 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ee145ea2-ab11-4150-b064-795d83f416f4-must-gather-output\") pod \"ee145ea2-ab11-4150-b064-795d83f416f4\" (UID: \"ee145ea2-ab11-4150-b064-795d83f416f4\") "
Dec 16 09:17:53 crc kubenswrapper[4823]: I1216 09:17:53.056579 4823 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-79tk9\" (UniqueName: \"kubernetes.io/projected/ee145ea2-ab11-4150-b064-795d83f416f4-kube-api-access-79tk9\") pod \"ee145ea2-ab11-4150-b064-795d83f416f4\" (UID: \"ee145ea2-ab11-4150-b064-795d83f416f4\") "
Dec 16 09:17:53 crc kubenswrapper[4823]: I1216 09:17:53.062167 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee145ea2-ab11-4150-b064-795d83f416f4-kube-api-access-79tk9" (OuterVolumeSpecName: "kube-api-access-79tk9") pod "ee145ea2-ab11-4150-b064-795d83f416f4" (UID: "ee145ea2-ab11-4150-b064-795d83f416f4"). InnerVolumeSpecName "kube-api-access-79tk9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 09:17:53 crc kubenswrapper[4823]: I1216 09:17:53.154034 4823 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee145ea2-ab11-4150-b064-795d83f416f4-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "ee145ea2-ab11-4150-b064-795d83f416f4" (UID: "ee145ea2-ab11-4150-b064-795d83f416f4"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 16 09:17:53 crc kubenswrapper[4823]: I1216 09:17:53.158300 4823 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ee145ea2-ab11-4150-b064-795d83f416f4-must-gather-output\") on node \"crc\" DevicePath \"\""
Dec 16 09:17:53 crc kubenswrapper[4823]: I1216 09:17:53.158347 4823 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-79tk9\" (UniqueName: \"kubernetes.io/projected/ee145ea2-ab11-4150-b064-795d83f416f4-kube-api-access-79tk9\") on node \"crc\" DevicePath \"\""
Dec 16 09:17:53 crc kubenswrapper[4823]: I1216 09:17:53.766291 4823 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-vlg4h_must-gather-fwxrv_ee145ea2-ab11-4150-b064-795d83f416f4/copy/0.log"
Dec 16 09:17:53 crc kubenswrapper[4823]: I1216 09:17:53.767601 4823 scope.go:117] "RemoveContainer" containerID="5a3e99951950de22896958abf8332090de40f4e97d792a7902e4befcf3094dc0"
Dec 16 09:17:53 crc kubenswrapper[4823]: I1216 09:17:53.767659 4823 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vlg4h/must-gather-fwxrv"
Dec 16 09:17:53 crc kubenswrapper[4823]: I1216 09:17:53.781401 4823 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee145ea2-ab11-4150-b064-795d83f416f4" path="/var/lib/kubelet/pods/ee145ea2-ab11-4150-b064-795d83f416f4/volumes"
Dec 16 09:17:53 crc kubenswrapper[4823]: I1216 09:17:53.783888 4823 scope.go:117] "RemoveContainer" containerID="b5fda3c4174361f11995c4c15e9f756265e6960dc5f08034b421e772e8164beb"
Dec 16 09:17:56 crc kubenswrapper[4823]: I1216 09:17:56.771301 4823 scope.go:117] "RemoveContainer" containerID="2950c2235803f3753830afebab6d921b7124da1465f4af84f0944895c27c0722"
Dec 16 09:17:56 crc kubenswrapper[4823]: E1216 09:17:56.771848 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3"
Dec 16 09:18:01 crc kubenswrapper[4823]: I1216 09:18:01.151787 4823 scope.go:117] "RemoveContainer" containerID="87f4e34f6b0d49eb81682e2a70ec5c56fbde9854947c3dc4f46ed10392cd9692"
Dec 16 09:18:01 crc kubenswrapper[4823]: I1216 09:18:01.184892 4823 scope.go:117] "RemoveContainer" containerID="c445ecf1f626f495730c517c5ab9b7d408c11b0d5a67ecbb9944fb24923e46ae"
Dec 16 09:18:01 crc kubenswrapper[4823]: I1216 09:18:01.243776 4823 scope.go:117] "RemoveContainer" containerID="dbd25f06ad8df284cbf23ab9bcd9065fe87cd340eea4b9ed12b23c03d32a3218"
Dec 16 09:18:01 crc kubenswrapper[4823]: I1216 09:18:01.263609 4823 scope.go:117] "RemoveContainer" containerID="015ab879863fa955f1176357517ee4e9240bedcaf7dac1c3006bdf9a8fa13743"
Dec 16 09:18:01 crc kubenswrapper[4823]: I1216 09:18:01.292729 4823 scope.go:117] "RemoveContainer" containerID="160a378f4cf5cfefa1529992f12007f187f5675511006a75337c4a426237781d"
Dec 16 09:18:01 crc kubenswrapper[4823]: I1216 09:18:01.322279 4823 scope.go:117] "RemoveContainer" containerID="31913fa281143c1606422328777000ca5e5453f2293c31a874dc34f60925d2e3"
Dec 16 09:18:01 crc kubenswrapper[4823]: I1216 09:18:01.352799 4823 scope.go:117] "RemoveContainer" containerID="bd3081ad4ea308f342c6b6e58cfb9d12ea558645b16ba2e2d7f87a7c18127a34"
Dec 16 09:18:10 crc kubenswrapper[4823]: I1216 09:18:10.771993 4823 scope.go:117] "RemoveContainer" containerID="2950c2235803f3753830afebab6d921b7124da1465f4af84f0944895c27c0722"
Dec 16 09:18:10 crc kubenswrapper[4823]: E1216 09:18:10.772986 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3"
Dec 16 09:18:24 crc kubenswrapper[4823]: I1216 09:18:24.772151 4823 scope.go:117] "RemoveContainer" containerID="2950c2235803f3753830afebab6d921b7124da1465f4af84f0944895c27c0722"
Dec 16 09:18:24 crc kubenswrapper[4823]: E1216 09:18:24.773498 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3"
Dec 16 09:18:39 crc kubenswrapper[4823]: I1216 09:18:39.771561 4823 scope.go:117] "RemoveContainer" containerID="2950c2235803f3753830afebab6d921b7124da1465f4af84f0944895c27c0722"
Dec 16 09:18:39 crc kubenswrapper[4823]: E1216 09:18:39.773374 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3"
Dec 16 09:18:54 crc kubenswrapper[4823]: I1216 09:18:54.771717 4823 scope.go:117] "RemoveContainer" containerID="2950c2235803f3753830afebab6d921b7124da1465f4af84f0944895c27c0722"
Dec 16 09:18:54 crc kubenswrapper[4823]: E1216 09:18:54.772369 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3"
Dec 16 09:19:06 crc kubenswrapper[4823]: I1216 09:19:06.772343 4823 scope.go:117] "RemoveContainer" containerID="2950c2235803f3753830afebab6d921b7124da1465f4af84f0944895c27c0722"
Dec 16 09:19:06 crc kubenswrapper[4823]: E1216 09:19:06.773395 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3"
Dec 16 09:19:17 crc kubenswrapper[4823]: I1216 09:19:17.771941 4823 scope.go:117] "RemoveContainer" containerID="2950c2235803f3753830afebab6d921b7124da1465f4af84f0944895c27c0722"
Dec 16 09:19:17 crc kubenswrapper[4823]: E1216 09:19:17.772758 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3"
Dec 16 09:19:29 crc kubenswrapper[4823]: I1216 09:19:29.772183 4823 scope.go:117] "RemoveContainer" containerID="2950c2235803f3753830afebab6d921b7124da1465f4af84f0944895c27c0722"
Dec 16 09:19:29 crc kubenswrapper[4823]: E1216 09:19:29.773260 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3"
Dec 16 09:19:43 crc kubenswrapper[4823]: I1216 09:19:43.772293 4823 scope.go:117] "RemoveContainer" containerID="2950c2235803f3753830afebab6d921b7124da1465f4af84f0944895c27c0722"
Dec 16 09:19:43 crc kubenswrapper[4823]: E1216 09:19:43.773083 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3"
Dec 16 09:19:55 crc kubenswrapper[4823]: I1216 09:19:55.771695 4823 scope.go:117] "RemoveContainer" containerID="2950c2235803f3753830afebab6d921b7124da1465f4af84f0944895c27c0722"
Dec 16 09:19:55 crc kubenswrapper[4823]: E1216 09:19:55.772654 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3"
Dec 16 09:20:10 crc kubenswrapper[4823]: I1216 09:20:10.780509 4823 scope.go:117] "RemoveContainer" containerID="2950c2235803f3753830afebab6d921b7124da1465f4af84f0944895c27c0722"
Dec 16 09:20:10 crc kubenswrapper[4823]: E1216 09:20:10.781082 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3"
Dec 16 09:20:23 crc kubenswrapper[4823]: I1216 09:20:23.771647 4823 scope.go:117] "RemoveContainer" containerID="2950c2235803f3753830afebab6d921b7124da1465f4af84f0944895c27c0722"
Dec 16 09:20:23 crc kubenswrapper[4823]: E1216 09:20:23.772370 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3"
Dec 16 09:20:34 crc kubenswrapper[4823]: I1216 09:20:34.771579 4823 scope.go:117] "RemoveContainer" containerID="2950c2235803f3753830afebab6d921b7124da1465f4af84f0944895c27c0722"
Dec 16 09:20:34 crc kubenswrapper[4823]: E1216 09:20:34.772391 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3"
Dec 16 09:20:47 crc kubenswrapper[4823]: I1216 09:20:47.771804 4823 scope.go:117] "RemoveContainer" containerID="2950c2235803f3753830afebab6d921b7124da1465f4af84f0944895c27c0722"
Dec 16 09:20:47 crc kubenswrapper[4823]: E1216 09:20:47.772560 4823 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fv56f_openshift-machine-config-operator(25dec47c-3043-486c-b371-2be103c214e3)\"" pod="openshift-machine-config-operator/machine-config-daemon-fv56f" podUID="25dec47c-3043-486c-b371-2be103c214e3"
Dec 16 09:21:01 crc kubenswrapper[4823]: I1216 09:21:01.531347 4823 scope.go:117] "RemoveContainer" containerID="90a3999eaeab4b6ec6caa3bd3a2be96091e00d2c6ddf3179e431cd18d2a460ad"